## 30 Jan 2019

### Planet Maemo

It took a few days, but I've finally migrated my site to Nikola. I used to have blog.mardy.it served by Google's Blogger, the main sections of www.mardy.it generated with Jekyll, the image gallery served by the old and glorious Gallery2, plus a few leftovers from the old Drupal site.

("discussion" by Nicolas Alejandro, on Flickr)

While Jekyll is cool, I was immediately captivated by Nikola's ease of use and by its developers' promptness in answering questions in the forum. Also, one nice thing about Nikola (and Pelican, too) which I forgot to mention in my previous post is its support for multilingual sites. I guess I'll have to translate this post into Interlingua too, to give you a demonstration. :-)

Anyway, while I've fallen in love with static site generators, I still would like to give people the chance of leaving comments. Services like Disqus are easy to integrate, but given the way they can be (ab)used to track users, I preferred to go for something self-hosted. So, enter Isso.

Isso is a Python server to handle comments; it's simple to install and configure, and offers some nice features like e-mail notifications on new replies.
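For reference, the server-side setup lives in a single INI file; a minimal sketch could look like the following (paths, host and addresses are placeholders, and the [smtp] section is only needed if you want the e-mail notifications):

```ini
[general]
; where Isso stores its SQLite database of comments
dbpath = /home/user/isso/comments.db
; the website the comments belong to
host = https://www.example.org/
notify = smtp

[server]
; address the WSGI/FastCGI server listens on
listen = http://localhost:8080/

[smtp]
to = you@example.org
from = isso@example.org
```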

### My Isso setup

Integrating Isso with Nikola was relatively easy, but the desire to keep a multilingual site and some hosting limitations made the process worth describing in a couple of words.

##### FastCGI

First, my site is hosted by Dreamhost with a very basic subscription that doesn't allow me to keep long-running processes. After reading Isso's quickstart guide I was left quite disappointed, because it seemed that the only way to use Isso is to have it running all the time, or to have an nginx server (Dreamhost offers Apache). Luckily, that's not quite the case, and more deployment approaches are described in a separate page, including one for FastCGI (which is supported by Dreamhost). Those instructions are a bit wrong, but yours truly submitted some amendments to the documentation which will hopefully go live soon.
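Since Isso is an ordinary WSGI application, the FastCGI deployment boils down to a tiny dispatcher script served through flup. Something along these lines (a sketch only: the exact make_app/config.load invocation is an assumption and may differ between Isso versions, so double-check the deployment docs):

```python
#!/usr/bin/env python
# Sketch of an Isso FastCGI dispatcher, run by the web server via flup.
# The make_app()/config.load() calls are assumptions; check Isso's docs.
from flup.server.fcgi import WSGIServer
from isso import make_app, config

application = make_app(config.load("/path/to/isso.cfg"))
WSGIServer(application).run()
```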

Isso can import comments from other sites, but an importer for Blogger (a.k.a. blogspot.com) was missing. So I wrote a quick and dirty tool for that job, and shared it in case it could be useful to someone else, too.

##### Multilingual sites

The default configuration of Nikola + Isso binds the comments to the exact URL they were entered on. What I mean is that if your site supports multiple languages, and a user has entered a comment on an entry while visiting the English version of the site, users visiting the Italian version of the site would see the same blog entry, but without that comment. That happens regardless of whether the blog entry has been translated into multiple languages or not: it's enough that the site has been configured for multiple languages.

My solution to fix the issue could not be accepted into Nikola as it would break old comments in existing sites, but if you are starting a new multilingual site you should definitely consider it.
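If you cannot (or don't want to) patch the generator, a similar effect can be obtained on the client side: Isso lets you override the thread identifier with the data-isso-id attribute on the comment section, so all the language variants of a post can point to one shared thread. A sketch (the id value is just an example):

```html
<!-- make /posts/my-entry/ and /it/posts/my-entry/ share one comment thread -->
<section id="isso-thread" data-isso-id="/posts/my-entry/"></section>
```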

### Testers welcome

Given that I've deployed Isso as a CGI, it's understandable that it's not the fastest thing ever: it takes some time to start up, so comments don't appear immediately when you open a page. However, once it's started it stays alive for several seconds, and that seems to help with performance when commenting.

But really, should the commenting system be completely broken, I'm sure you'll find a way to contact me, if you need to. :-)


30 Jan 2019 8:21pm GMT

## 20 Jan 2019

### Planet Maemo

#### Choosing a static site generator

In the last few days I've been writing a simple website for Imaginario. I'm a terrible site designer, and I can't really say that I enjoy writing websites, but it's something that from time to time people might need to do. While the PhotoTeleport website is built with Jekyll, this time I decided to try some other static site generator, in order to figure out if Jekyll is indeed the best for me, or if there are better alternatives for my (rather basic) needs.

I set out trying a couple of Python-based generators, Pelican and Nikola. Here is a brief review of them (and of Jekyll), in case it helps someone else make their own choice.

### Jekyll

I've been using it for several months for the PhotoTeleport website, which features a news section and a handful of static pages. It does the job very well and I don't have any major complaints. It's very popular and there are plenty of plugins to customize its behaviour or add new functionality. The documentation is sufficient for basic usage of the site, and information on how to solve more specific issues can easily be found on the internet.

My only issue is that it's not totally intuitive to use, and in order to customize the interactions for your own needs you have to write your own scripts - at least, I didn't find a ready-made solution to create a new post, or to deploy the generated content to my site.

### Pelican

My first impression of Pelican has been extremely positive: it's very easy to set up and start a blog. It's also quite popular, even though not as much as Jekyll, and there are many themes for it. By looking at the themes, though, I quickly realized that Pelican is meant to be used for blogs, and not for simple static sites. I'm almost sure that there must be a way to use it to create a static site, maybe with some tweaking, but I couldn't find information about this in its documentation. A quick search on the internet didn't help either, so I gave up and moved to the next one.

If I had to write a blog I'd certainly consider it, though.

### Nikola

Nikola is definitely less popular than Jekyll or Pelican, at least if we trust the number of stars and forks on GitHub, but it's still a popular and maintained project, with many plugins. Like Jekyll, it can handle both blogs and sites, or a combination of the two. It's well documented, the people in the forum are helpful, and its command line interface is simpler and more intuitive than Jekyll's. Also, the live preview functionality seems to be more advanced than Jekyll's, in that the browser is told to automatically reload the page whenever the site is rebuilt.
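To give an idea of those command line interactions, the day-to-day Nikola workflow looks roughly like this (a sketch from memory; nikola help gives the authoritative list):

```
$ nikola init mysite     # create a new site, answering a few questions
$ cd mysite
$ nikola new_post        # create a new blog post
$ nikola build           # generate the static pages into output/
$ nikola auto            # rebuild and live-reload the browser while editing
$ nikola deploy          # run the deploy commands configured in conf.py
```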

You can see my progress with the Imaginario website by inspecting the commits in its repository; you'll see how easy it was to set it up, and hopefully following my steps you'll save some time should you decide to create your own site with Nikola.

Overall, I'd rate Jekyll and Nikola on the same level: Jekyll wins for the wider community and amount of available plugins, while Nikola wins for the better command line interactions, and the fact that it's in Python gives me better confidence should I ever need to modify it deeply (though, admittedly, the latter is just a personal preference - Ruby developers will say the opposite).


20 Jan 2019 11:44am GMT

## 11 Dec 2018

### Planet Maemo

#### A Pathetic Human Being

A Venetian gondoliere thought it a good idea to decorate his gondola with fascist symbols, yet he can't handle that others don't think it a good "joke"

The post A Pathetic Human Being appeared first on René Seindal.


11 Dec 2018 3:40pm GMT

## 06 Dec 2018

### Planet Maemo

#### Venice Kayak

Kayaking in Venice is a unique experience. Venice Kayak offers guided kayak tours in the city of Venice and in the lagoon.

The post Venice Kayak appeared first on René Seindal.


06 Dec 2018 4:34pm GMT

#### Venice Street Photography

I have put up a separate site with my street photography from Venice

The post Venice Street Photography appeared first on René Seindal.


06 Dec 2018 4:29pm GMT

#### Photo walks in Venice

The locals know Venice

The post Photo walks in Venice appeared first on René Seindal.


06 Dec 2018 4:18pm GMT

## 19 Nov 2018

### Planet Maemo

#### Brexit from a distance

Brexit doesn't influence me directly, but being a Dane living in Italy means my existence relies on freedom of movement. Brexit attacks that freedom.

The post Brexit from a distance appeared first on René Seindal.


19 Nov 2018 6:54pm GMT

## 08 Nov 2018

### Planet Maemo

#### From Blender to OpenCV Camera and back

In case you want to employ Blender for Computer Vision, e.g. for generating synthetic data, you will need to map the parameters of a calibrated camera to Blender, as well as map the Blender camera parameters to those of a calibrated camera.

Calibrated cameras are typically based on the pinhole camera model, which at its core is the camera matrix and the image size in pixels:

$$K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}, \quad (w, h)$$

But if we look at the Blender camera, we find lots of non-standard and duplicate parameters, some with arbitrary units and some without any, like

• unitless shift_x
• duplicate angle, angle_x, angle_y, lens

Doing some research on their meaning and fixing various bugs in the proposed conversion formula, I could however come up with the following Python code to do the conversion from Blender to OpenCV:


```python
import bpy

# get the relevant data
cam = bpy.data.objects["cameraName"].data
scene = bpy.context.scene
# assume image is not scaled
assert scene.render.resolution_percentage == 100
# assume angles describe the horizontal field of view
assert cam.sensor_fit != 'VERTICAL'

f_in_mm = cam.lens
sensor_width_in_mm = cam.sensor_width

w = scene.render.resolution_x
h = scene.render.resolution_y

pixel_aspect = scene.render.pixel_aspect_y / scene.render.pixel_aspect_x

f_x = f_in_mm / sensor_width_in_mm * w
f_y = f_x * pixel_aspect

# yes, shift_x is inverted. WTF blender?
c_x = w * (0.5 - cam.shift_x)
c_y = h * (0.5 + cam.shift_y)

K = [[f_x, 0, c_x],
     [0, f_y, c_y],
     [0,   0,   1]]
```



So, to summarize the above code:

• Note that f_x/f_y encode the pixel aspect ratio and not the image aspect ratio w/h.
• Blender enforces identical sensor and image aspect ratios. Therefore we do not have to consider them explicitly. Non-square pixels are instead handled via pixel_aspect_x/pixel_aspect_y.
• We left out the skew factor s (non-rectangular pixels) because neither OpenCV nor Blender supports it.
• Blender allows us to scale the output, resulting in a different resolution, but this can be easily handled post-projection. So we explicitly do not handle that.
• Blender has the peculiarity of converting the focal length to either a horizontal or a vertical field of view (sensor_fit). Going the vertical branch is left as an exercise to the reader (one possible reading is sketched right below).
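For reference, a rough sketch of what that vertical branch could look like; this is my own reading of Blender's sensor_fit behaviour, not code from the snippet above, and it ignores the 'AUTO' fit mode, which picks the fitting direction from the image aspect ratio:

```python
# Sketch (assumption): with sensor_fit == 'VERTICAL' the lens/angle refer to
# the sensor height, so derive f_y first and get f_x via the pixel aspect.
if cam.sensor_fit == 'VERTICAL':
    f_y = f_in_mm / cam.sensor_height * h
    f_x = f_y / pixel_aspect
```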

The reverse transform can now be derived trivially as


```python
cam.shift_x = -(c_x / w - 0.5)
cam.shift_y = c_y / h - 0.5

cam.lens = f_x / w * sensor_width_in_mm

pixel_aspect = f_y / f_x
scene.render.pixel_aspect_x = 1.0
scene.render.pixel_aspect_y = pixel_aspect
```
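As a quick sanity check of the forward direction, the computed matrix can be fed straight into OpenCV; a small sketch (the 3D point and the identity pose are made up for illustration, and f_x, f_y, c_x, c_y are the values computed above):

```python
import numpy as np
import cv2

K = np.array([[f_x, 0, c_x],
              [0, f_y, c_y],
              [0,   0,   1]], dtype=np.float64)

# project a made-up 3D point with zero rotation/translation and no distortion
object_points = np.array([[0.0, 0.0, 5.0]])
rvec = np.zeros((3, 1))
tvec = np.zeros((3, 1))
image_points, _ = cv2.projectPoints(object_points, rvec, tvec, K, None)
print(image_points)  # a point on the optical axis lands at (c_x, c_y)
```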




08 Nov 2018 5:12pm GMT

## 19 Oct 2018

### Planet Maemo

#### Ubports at the Linux Piter conference

I'm happy (and thankful) for having been invited to speak at the Linux Piter conference in Saint Petersburg on November 2nd. I'll be talking about the Ubports project, which is the community-driven continuation of the Ubuntu Touch effort, driven by Canonical until April 7th, when the project was cancelled.

(Demo of Ubuntu convergence in action)

The conference talks will be in English and Russian, with simultaneous translation into the other language. The videos will appear a couple of weeks after the conference on the organization's YouTube channel, but in any case I will write a post here - unless, of course, something goes terribly wrong and I feel ashamed of my performance ;-). In order to minimize this risk, I won't be giving a live demo (at least, not before I've finished going through my slides), but I'll take a couple of Ubports devices with me, and people are very welcome to come to me and check them out.

As far as I've understood, most of the audience will not be very familiar with Linux-based mobile devices, but I guess that could play to my advantage: no difficult questions, yay! ;-)
And I really hope that some member of the audience gets interested in the project and decides to become part of it. We'll see. :-)


19 Oct 2018 12:20pm GMT

## 07 Aug 2018

### Planet Maemo

#### Doing It Right examples on autotools, qmake, cmake and meson

I finished my earlier work on build environment examples, illustrating how to do versioning of shared object files right with autotools, qmake, cmake and meson. You can find it here.

The DIR examples are examples for various build environments on how to create a good project structure that will build libraries that are versioned with libtool, or that have versioning equivalent to what libtool would deliver, that have a pkg-config file, and that have a so-called API version in the library's name.

## What is right?

Information on this can be found in the autotools mythbuster docs, the libtool docs on versioning and FreeBSD's chapter on shared libraries. I tried to ensure that what is written here works with all of the build environments in the examples.

##### libpackage-4.3.so.2.1.0, what is what?

You'll notice that a library called 'package' will in your LIBDIR often be called something like libpackage-4.3.so.2.1.0

We call the 4.3 part the APIVERSION, and the 2.1.0 part the VERSION (the ABI version).

I will explain these examples using semantic versioning as APIVERSION and either libtool's current:revision:age or a semantic versioning alternative as field for VERSION (like in FreeBSD and for build environments where compatibility with libtool's -version-info feature ain't a requirement).

Note that with libtool's -version-info feature the values that you fill in for current, age and revision will not necessarily be identical to what ends up as the suffix of the soname in LIBDIR. The formula to form the filename's suffix is, for libtool, "(current - age).age.revision". This means that for soname libpackage-APIVERSION.so.2.1.0, you would need current=3, revision=0 and age=1.

##### The VERSION part

In case you want compatibility with or use libtool's -version-info feature, the document libtool/version.html on autotools.io states:

The rules of thumb, when dealing with these values are:

• Increase the current value whenever an interface has been added, removed or changed.
• Always increase the revision value.
• Increase the age value only if the changes made to the ABI are backward compatible.

The libtool manual itself puts the update rules as follows:

1. Start with version information of '0:0:0' for each libtool library.
2. Update the version information only immediately before a public release of your software. More frequent updates are unnecessary, and only guarantee that the current interface number gets larger faster.
3. If the library source code has changed at all since the last update, then increment revision ('c:r:a' becomes 'c:r+1:a').
4. If any interfaces have been added, removed, or changed since the last update, increment current, and set revision to 0.
5. If any interfaces have been added since the last public release, then increment age.
6. If any interfaces have been removed or changed since the last public release, then set age to 0.

When you don't care about compatibility with libtool's -version-info feature, then you can take the following simplified rules for VERSION:

• SOVERSION = Major version
• Major version: increase it if you break ABI compatibility
• Minor version: increase it if you add ABI compatible features
• Patch version: increase it for bug fix releases.

Examples of build environments where these simplified rules are applicable are cmake, meson and qmake. When you use autotools you will be using libtool, and then they ain't applicable.

##### The APIVERSION part

For the API version I will use the rules from semver.org. You can also use the semver rules for your package's version:

Given a version number MAJOR.MINOR.PATCH, increment the:

1. MAJOR version when you make incompatible API changes,
2. MINOR version when you add functionality in a backwards-compatible manner, and
3. PATCH version when you make backwards-compatible bug fixes.

When you have an API, that API can change over time. You typically want to version those API changes so that the users of your library can adapt to newer versions of the API while at the same time other users still use older versions of your API. For this we can follow section 4.3, called "multiple libraries versions", of the autotools mythbuster documentation. It states:

In this situation, the best option is to append part of the library's version information to the library's name, which is exemplified by Glib's libglib-2.0.so.0 soname. To do so, the declaration in the Makefile.am has to be like this:

```
lib_LTLIBRARIES = libtest-1.0.la

libtest_1_0_la_LDFLAGS = -version-info 0:0:0
```


##### The pkg-config file

Many people use many build environments (autotools, qmake, cmake, meson, you name it). Nowadays almost all of those build environments support pkg-config out of the box, both for generating the file and for consuming it to get information about dependencies.

I consider it a necessity to ship a useful and correct pkg-config .pc file. The filename should be /usr/lib/pkgconfig/package-APIVERSION.pc for soname libpackage-APIVERSION.so.VERSION. In our example that means /usr/lib/pkgconfig/package-4.3.pc. We'd use the command pkg-config package-4.3 --cflags --libs, for example.

An example is GLib's pkg-config file, located at /usr/lib/pkgconfig/glib-2.0.pc
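For the hypothetical package-4.3 used throughout this post, a minimal .pc file could look roughly like this (a sketch for illustration, not a file taken from the examples repository):

```
prefix=/usr
libdir=${prefix}/lib
includedir=${prefix}/include

Name: package-4.3
Description: An example library
Version: 4.3.0
Libs: -L${libdir} -lpackage-4.3
Cflags: -I${includedir}/package-4.3
```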

##### The include path

I consider it a necessity to ship API headers in a per API-version different location (like for example GLib's, at /usr/include/glib-2.0). This means that your API version number must be part of the include-path.

For example, using the earlier mentioned API version 4.3: /usr/include/package-4.3 for /usr/lib/libpackage-4.3.so(.2.1.0), having /usr/lib/pkgconfig/package-4.3.pc.

The linker will for -lpackage-4.3 typically link with /usr/lib/libpackage-4.3.so.2 or with libpackage-APIVERSION.so.(current - age). Noting that the part that is calculated as (current - age) in this example is often, for example in cmake and meson, referred to as the SOVERSION. With SOVERSION the soname template in LIBDIR is libpackage-APIVERSION.so.SOVERSION.
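If you want to check which soname a built library actually advertises, the dynamic section is easy to inspect (illustrative output, not taken from the examples):

```
$ readelf -d libpackage-4.3.so.2.1.0 | grep SONAME
 0x000000000000000e (SONAME)             Library soname: [libpackage-4.3.so.2]
```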

## What is wrong?

##### Not doing any versioning

Without versioning you can't make any API or ABI changes that won't break all your users' code in a way that could be manageable for them. If you do decide not to do any versioning, then at least also don't put anything behind the .so part of your so's filename. That way, at least you won't break things in spectacular ways.

##### Coming up with your own versioning scheme

Knowing it better than the rest of the world will, in spectacular ways, make everything you do break with what the entire rest of the world does. You shouldn't congratulate yourself for that. The only thing that can be said about it is that it probably makes little sense, and that others will probably start ignoring your work. Your mileage may vary. Keep in mind that without a correct SOVERSION, certain things will simply not work correctly.

##### In case of libtool: using your package's (semver) release numbering for current, revision, age

This is similarly wrong to 'Coming up with your own versioning scheme'.

Never try to set the interface numbers so that they correspond to the release number of your package. This is an abuse that only fosters misunderstanding of the purpose of library versions.

This basically means that once you are using libtool, also use libtool's versioning rules.

##### Refusing or forgetting to increase the current and/or SOVERSION on breaking ABI changes

The current part of the VERSION (current, revision and age) minus age, or, SOVERSION is/are the most significant field(s). The current and age are usually involved in forming the so called SOVERSION, which in turn is used by the linker to know with which ABI version to link. That makes it … damn important.

Some people think 'all this is just too complicated for me', 'I will just refuse to do anything and always release using the same version numbers'. That goes spectacularly wrong whenever you made ABI incompatible changes. It's similarly wrong to 'Coming up with your own versioning scheme'.

That way, after your shared library gets updated, all programs that link with it can easily crash, can corrupt data, and might or might not work.

By updating the current and age, or, SOVERSION you will basically trigger people who manage packages and their tooling to rebuild programs that link with your shared library. You actually want that the moment you made breaking ABI changes in a newer version of it.

When you don't want to care about libtool's -version-info feature, then there is also a simpler set of rules to follow. Those rules are for VERSION:

• SOVERSION = Major version (with these simplified set of rules, no subtracting of current with age is needed)
• Major version: increase it if you break ABI compatibility
• Minor version: increase it if you add ABI compatible features
• Patch version: increase it for bug fix releases.

## What isn't wrong?

##### Not using libtool (but nonetheless doing ABI versioning right)

GNU libtool was made to make certain things easier. Nowadays many popular build environments also make things easier. Meanwhile, GNU libtool has been around for a long time, and its versioning rules, commonly known as the current:revision:age field as parameter for -version-info, got widely adopted.

What GNU libtool did was, however, not really a standard. It is one interpretation of how to do it. And a rather complicated one, at that.

Please let it be crystal clear that not using libtool does not mean that you can do ABI versioning wrong. Because very often people seem to think that they can, and think they'll still get out safely while doing ABI versioning completely wrong. This is not the case.

##### Not having an APIVERSION at all

It isn't wrong not to have an APIVERSION in the soname. It however means that you promise to not ever break API. Because the moment you break API, you prevent your users from staying on the old API for a little longer. They might have programs that use the old API and programs that use the new one. Now what?

When you have an APIVERSION then you can allow the introduction of a new version of the API while simultaneously the old API remains available on a user's system.

##### Using a different naming-scheme for APIVERSION

I used the MAJOR.MINOR version numbers from semver to form the APIVERSION. I did this because only the MAJOR and the MINOR are technically involved in API changes (unless you are doing semantic versioning wrong - in which case see 'Coming up with your own versioning scheme').

Some projects only use MAJOR. An example is Qt, which puts the MAJOR number behind the Qt part. For example libQt5Core.so.VERSION (so that's "Qt" + MAJOR + Module). The GLib world, however, uses "g" + Module + "-" + MAJOR + ".0" as they have releases like 2.2, 2.3, 2.4 that are all called libglib-2.0.so.VERSION. I guess they figured that maybe someday in their 2.x series, they could use that MINOR field?

DBus seems to be using a similar thing to GLib, but then without the MINOR suffix: libdbus-1.so.VERSION. For their GLib integration they also use it as libdbus-glib-1.so.VERSION.

Who is right, who is wrong? It doesn't matter too much for your APIVERSION naming scheme. As long as there is a way to differentiate the API in a) the include path, b) the pkg-config filename and c) the library that will be linked with (the -l parameter during linking/compiling). Maybe someday a standard will be defined? Let's hope so.

## FreeBSD

The three principles of shared library building are:

1. Start from 1.0
2. If there is a change that is backwards compatible, bump minor number (note that ELF systems ignore the minor number)
3. If there is an incompatible change, bump major number

For instance, added functions and bugfixes result in the minor version number being bumped, while deleted functions, changed function call syntax, etc. will force the major version number to change.

I think that when using libtool on FreeBSD (when you use autotools), the platform will provide a variant of libtool's scripts that will convert the earlier mentioned current, revision and age rules to FreeBSD's. The same goes for the VERSION variable in cmake and qmake. Meaning that with those three build environments, you can just use the rules for GNU libtool's -version-info.

I could be wrong on this, but I did find mailing list E-mails from ~ 2011 stating that this SNAFU is dealt with. Besides, the *BSD porters otherwise know what to do and you could of course always ask them about it.

Note that FreeBSD's rules are or seem to be compatible with the rules for VERSION when you don't want to care about libtool's -version-info compatibility. However, when you are porting from a libtoolized project, then of course you don't want to let newer releases break against releases that have already happened.

## Modern Linux distributions

Nowadays you sometimes see things like /usr/lib/$ARCH/libpackage-APIVERSION.so linking to /lib/$ARCH/libpackage-APIVERSION.so.VERSION. I have no idea how this mechanism works. I suppose this is being done by packagers of various Linux distributions? I also don't know if there is a standard for this.

I will update the examples and this document the moment I know more and/or if upstream developers need to worry about it. I think that using GNUInstallDirs in cmake, for example, makes everything go right. I have not found much for this in qmake, meson seems to be doing this by default and in autotools you always use platform variables for such paths.

As usual, I hope standards will be made and that the build environment and packaging community gets to their senses and stops leaving this into the hands of developers. I especially think about qmake, which seems to not have much at all to state that standardized installation paths must be used (not even a proper way to define a prefix).

## Questions that I can imagine already exist

##### Why is there a difference between APIVERSION and VERSION?

The API version is the version of your programmable interfaces. This means the version of your header files (if your programming language has such header files), the version of your pkgconfig file, the version of your documentation. The API is what software developers need to utilize your library.

The ABI version can definitely be different and it is what programs that are compiled and installable need to utilize your library.

An API breaks when recompiling a program that consumes libpackage-4.3.so.2, without any changes to that program, is not going to succeed at compile time. The API got broken the moment any possible way the package's API was used won't compile any more. Yes, any way. It means that a libpackage-5.0.so.0 should be started.

An ABI breaks when, without recompiling the program, replacing a libpackage-4.3.so.2.1.0 with a libpackage-4.3.so.2.2.0 or a libpackage-4.3.so.2.1.1 (or later) as libpackage-4.3.so.2 is not going to succeed at runtime. For example because it would crash, or because the results would be wrong (in any way). It implies that libpackage-4.3.so.2 shouldn't be overwritten, but that a libpackage-4.3.so.3 should be started.

For example, when you change a parameter of a function in C from an integer to a floating point type (or the other way around), then that's an ABI change but not necessarily an API change.
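A tiny illustration of that case (a hypothetical function, not something from the examples):

```c
/* libpackage 4.3, old ABI */
double package_scale(int factor);

/* later release: most callers still compile unchanged thanks to the implicit
 * int -> double conversion (API kept), but already-compiled programs still
 * pass an int where a double is now expected, so the ABI is broken and the
 * SOVERSION has to be bumped. */
double package_scale(double factor);
```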

In most projects that got ported from an environment that uses GNU libtool (for example autotools) to, for example, cmake or meson (and in the rare cases that they did anything at all in a qmake based project), I saw people converting the current, revision and age parameters that they passed to the -version-info option of libtool to a string concatenated together using (current - age), age, revision as VERSION, and (current - age) as SOVERSION.

I wanted to use the exact same rules for versioning for all these examples, including autotools and GNU libtool. When you don't have to (or want to) care about libtool's set of (for some people, needlessly complicated) -version-info rules, then it should be fine using just SOVERSION and VERSION using these rules:

• SOVERSION = Major version
• Major version: increase it if you break ABI compatibility
• Minor version: increase it if you add ABI compatible features
• Patch version: increase it for bug fix releases.

I, however, also sometimes saw variations that are incomprehensible with little explanation and magic foo invented on the spot. Those variations are probably wrong.

In the examples I made it so that you can change the numbers, and the calculation of the numbers, in the root build file of the project. However, do follow the rules for those correctly, as this versioning is about ABI compatibility. Doing this wrong can make things blow up in spectacular ways.

## The examples

##### qmake in the qmake-example

Note that the VERSION variable must be filled in as "(current - age).age.revision" for qmake (to get 2.1.0 at the end, you need VERSION=2.1.0 when current=3, revision=0 and age=1)

To try this example out, go to the qmake-example directory and type

```
$ cd qmake-example
$ mkdir _test
$ qmake PREFIX=$PWD/_test
$ make
$ make install
```



This should give you this:

```
$ find _test/
_test/
├── include
│   └── qmake-example-4.3
│       └── qmake-example.h
└── lib
    ├── libqmake-example-4.3.so -> libqmake-example-4.3.so.2.1.0
    ├── libqmake-example-4.3.so.2 -> libqmake-example-4.3.so.2.1.0
    ├── libqmake-example-4.3.so.2.1 -> libqmake-example-4.3.so.2.1.0
    ├── libqmake-example-4.3.so.2.1.0
    ├── libqmake-example-4.la
    └── pkgconfig
        └── qmake-example-4.3.pc
```

When you now use pkg-config, you get a nice CFLAGS and LIBS line back (I'm replacing the current path with $PWD in the output each time):

```
$ export PKG_CONFIG_PATH=$PWD/_test/lib/pkgconfig
$ pkg-config qmake-example-4.3 --cflags
-I$PWD/_test/include/qmake-example-4.3
$ pkg-config qmake-example-4.3 --libs
-L$PWD/_test/lib -lqmake-example-4.3
```



And it means that you can do things like this now (and people who know about pkg-config will now be happy to know that they can use your library in their own favorite build environment).

```
$ export LD_LIBRARY_PATH=$PWD/_test/lib
$ echo -en "#include <qmake-example.h>\nmain() {} " > test.cpp
$ g++ -fPIC test.cpp -o test.o `pkg-config qmake-example-4.3 --libs --cflags`
```



You can see that it got linked to libqmake-example-4.3.so.2, where that 2 at the end is (current - age).

```
$ ldd test.o
	linux-gate.so.1 (0xb77b0000)
	libqmake-example-4.3.so.2 => $PWD/_test/lib/libqmake-example-4.3.so.2 (0xb77a6000)
	libstdc++.so.6 => /usr/lib/i386-linux-gnu/libstdc++.so.6 (0xb75f5000)
	libm.so.6 => /lib/i386-linux-gnu/libm.so.6 (0xb759e000)
	libgcc_s.so.1 => /lib/i386-linux-gnu/libgcc_s.so.1 (0xb7580000)
	libc.so.6 => /lib/i386-linux-gnu/libc.so.6 (0xb73c9000)
	/lib/ld-linux.so.2 (0xb77b2000)
```



##### cmake in the cmake-example

Note that the VERSION property on your library target must be filled in with "(current - age).age.revision" for cmake (to get 2.1.0 at the end, you need VERSION=2.1.0 when current=3, revision=0 and age=1. Note that in cmake you must also fill in the SOVERSION property as (current - age), so SOVERSION=2 when current=3 and age=1).

To try this example out, go to the cmake-example directory and do

```
$ cd cmake-example
$ mkdir _test
$ cmake -DCMAKE_INSTALL_PREFIX:PATH=$PWD/_test
-- Configuring done
-- Generating done
-- Build files have been written to: .
$ make
[ 50%] Building CXX object src/libs/cmake-example/CMakeFiles/cmake-example.dir/cmake-example.cpp.o
[100%] Linking CXX shared library libcmake-example-4.3.so
[100%] Built target cmake-example
$ make install
[100%] Built target cmake-example
Install the project...
-- Install configuration: ""
-- Installing: $PWD/_test/lib/libcmake-example-4.3.so.2.1.0
-- Up-to-date: $PWD/_test/lib/libcmake-example-4.3.so.2
-- Up-to-date: $PWD/_test/lib/libcmake-example-4.3.so
-- Up-to-date: $PWD/_test/include/cmake-example-4.3/cmake-example.h
-- Up-to-date: $PWD/_test/lib/pkgconfig/cmake-example-4.3.pc
```

This should give you this:

```
$ tree _test/
_test/
├── include
│   └── cmake-example-4.3
│       └── cmake-example.h
└── lib
    ├── libcmake-example-4.3.so -> libcmake-example-4.3.so.2
    ├── libcmake-example-4.3.so.2 -> libcmake-example-4.3.so.2.1.0
    ├── libcmake-example-4.3.so.2.1.0
    └── pkgconfig
        └── cmake-example-4.3.pc
```



When you now use pkg-config, you get a nice CFLAGS and LIBS line back (I'm replacing the current path with $PWD in the output each time):

```
$ pkg-config cmake-example-4.3 --cflags
-I$PWD/_test/include/cmake-example-4.3
$ pkg-config cmake-example-4.3 --libs
-L$PWD/_test/lib -lcmake-example-4.3
```

And it means that you can do things like this now (and people who know about pkg-config will now be happy to know that they can use your library in their own favorite build environment):

```
$ echo -en "#include <cmake-example.h>\nmain() {} " > test.cpp
$ g++ -fPIC test.cpp -o test.o `pkg-config cmake-example-4.3 --libs --cflags`
```

You can see that it got linked to libcmake-example-4.3.so.2, where that 2 at the end is the SOVERSION. This is (current - age).

```
$ ldd test.o
	linux-gate.so.1 (0xb7729000)
	libcmake-example-4.3.so.2 => $PWD/_test/lib/libcmake-example-4.3.so.2 (0xb771f000)
	libstdc++.so.6 => /usr/lib/i386-linux-gnu/libstdc++.so.6 (0xb756e000)
	libm.so.6 => /lib/i386-linux-gnu/libm.so.6 (0xb7517000)
	libgcc_s.so.1 => /lib/i386-linux-gnu/libgcc_s.so.1 (0xb74f9000)
	libc.so.6 => /lib/i386-linux-gnu/libc.so.6 (0xb7342000)
	/lib/ld-linux.so.2 (0xb772b000)
```

##### autotools in the autotools-example

Note that you pass -version-info current:revision:age directly with autotools. The libtool will translate that to (current - age).age.revision to form the so's filename (to get 2.1.0 at the end, you need current=3, revision=0, age=1).

To try this example out, go to the autotools-example directory and do

```
$ cd autotools-example
$ mkdir _test
$ libtoolize
$ aclocal
$ autoheader
$ autoconf
$ automake --add-missing
$ ./configure --prefix=$PWD/_test
$ make
$ make install
```



This should give you this:

```
$ tree _test/
_test/
├── include
│   └── autotools-example-4.3
│       └── autotools-example.h
└── lib
    ├── libautotools-example-4.3.a
    ├── libautotools-example-4.3.la
    ├── libautotools-example-4.3.so -> libautotools-example-4.3.so.2.1.0
    ├── libautotools-example-4.3.so.2 -> libautotools-example-4.3.so.2.1.0
    ├── libautotools-example-4.3.so.2.1.0
    └── pkgconfig
        └── autotools-example-4.3.pc
```

When you now use pkg-config, you get a nice CFLAGS and LIBS line back (I'm replacing the current path with $PWD in the output each time):

```
$ export PKG_CONFIG_PATH=$PWD/_test/lib/pkgconfig
$ pkg-config autotools-example-4.3 --cflags
-I$PWD/_test/include/autotools-example-4.3
$ pkg-config autotools-example-4.3 --libs
-L$PWD/_test/lib -lautotools-example-4.3
```



And it means that you can do things like this now (and people who know about pkg-config will now be happy to know that they can use your library in their own favorite build environment):

$echo -en "#include <autotools-example.h>\nmain() {} " > test.cpp$ export LD_LIBRARY_PATH=$PWD/_test/lib$ g++ -fPIC test.cpp -o test.o pkg-config autotools-example-4.3 --libs --cflags



You can see that it got linked to libautotools-example-4.3.so.2, where that 2 at the end is (current - age).

```
$ ldd test.o
	linux-gate.so.1 (0xb778d000)
	libautotools-example-4.3.so.2 => $PWD/_test/lib/libautotools-example-4.3.so.2 (0xb7783000)
	libstdc++.so.6 => /usr/lib/i386-linux-gnu/libstdc++.so.6 (0xb75d2000)
	libm.so.6 => /lib/i386-linux-gnu/libm.so.6 (0xb757b000)
	libgcc_s.so.1 => /lib/i386-linux-gnu/libgcc_s.so.1 (0xb755d000)
	libc.so.6 => /lib/i386-linux-gnu/libc.so.6 (0xb73a6000)
	/lib/ld-linux.so.2 (0xb778f000)
```



##### meson in the meson-example

Note that the version property on your library target must be filled in with "(current - age).age.revision" for meson (to get 2.1.0 at the end, you need version=2.1.0 when current=3, revision=0 and age=1. Note that in meson you must also fill in the soversion property as (current - age), so soversion=2 when current=3 and age=1).

To try this example out, go to the meson-example directory and do

```
$ cd meson-example
$ mkdir -p _build/_test
$ cd _build
$ meson .. --prefix=$PWD/_test
$ ninja
$ ninja install
```

This should give you this:

```
$ tree _test/
_test/
├── include
│   └── meson-example-4.3
│       └── meson-example.h
└── lib
    └── i386-linux-gnu
        ├── libmeson-example-4.3.so -> libmeson-example-4.3.so.2.1.0
        ├── libmeson-example-4.3.so.2 -> libmeson-example-4.3.so.2.1.0
        ├── libmeson-example-4.3.so.2.1.0
        └── pkgconfig
            └── meson-example-4.3.pc
```



When you now use pkg-config, you get a nice CFLAGS and LIBS line back (I'm replacing the current path with $PWD in the output each time):

```
$ export PKG_CONFIG_PATH=$PWD/_test/lib/i386-linux-gnu/pkgconfig
$ pkg-config meson-example-4.3 --cflags
-I$PWD/_test/include/meson-example-4.3
$ pkg-config meson-example-4.3 --libs
-L$PWD/_test/lib -lmeson-example-4.3
```

And it means that you can do things like this now (and people who know about pkg-config will now be happy to know that they can use your library in their own favorite build environment):

```
$ echo -en "#include <meson-example.h>\nmain() {} " > test.cpp
$ export LD_LIBRARY_PATH=$PWD/_test/lib/i386-linux-gnu
$ g++ -fPIC test.cpp -o test.o `pkg-config meson-example-4.3 --libs --cflags`
```

You can see that it got linked to libmeson-example-4.3.so.2, where that 2 at the end is the soversion. This is (current - age).

```
$ ldd test.o
	linux-gate.so.1 (0xb772e000)
	libmeson-example-4.3.so.2 => $PWD/_test/lib/i386-linux-gnu/libmeson-example-4.3.so.2 (0xb7724000)
	libstdc++.so.6 => /usr/lib/i386-linux-gnu/libstdc++.so.6 (0xb7573000)
	libm.so.6 => /lib/i386-linux-gnu/libm.so.6 (0xb751c000)
	libgcc_s.so.1 => /lib/i386-linux-gnu/libgcc_s.so.1 (0xb74fe000)
	libc.so.6 => /lib/i386-linux-gnu/libc.so.6 (0xb7347000)
	/lib/ld-linux.so.2 (0xb7730000)
```

07 Aug 2018 2:30pm GMT

## 11 Jul 2018

### Planet Maemo

#### Doing it right, making libraries using popular build environments

Enough with the political posts! Making libraries that are both API and libtool versioned with qmake, how do they do it?

I started a project on github that will collect what I will call "doing it right" project structures for various build environments. With right I mean that the library will have an API version in its library name, that the library will be libtoolized and that a pkg-config .pc file gets installed for it. I have in mind, for example, autotools, cmake, meson, qmake and plain make. The first example that I have finished is the one for qmake.

Let's get started working on a libqmake-example-3.2.so.3.2.1.

We get the PREFIX, MAJOR_VERSION, MINOR_VERSION and PATCH_VERSION from a project-wide include:

```
include(../../../qmake-example.pri)
```

We will use the standard lib template of qmake:

```
TEMPLATE = lib
```

We need to set VERSION to a semver.org version for compile_libtool (in reality it should use what is called current, revision and age to form an API and ABI version number. In the actual example it's explained in the comments, as this is too much for a small blog post):

```
VERSION = $${MAJOR_VERSION}"."$${MINOR_VERSION}"."$${PATCH_VERSION}
```

According to section 4.3 of Autotools' mythbusters we should have the API version in the library's name as target-name:

```
TARGET = qmake-example-$${MAJOR_VERSION}"."$${MINOR_VERSION}
```

We will write a define in config.h for access to the semver.org version as a double quoted string:

```
QMAKE_SUBSTITUTES += config.h.in
```

Our example happens to use QDebug, so we need QtCore here:

```
QT = core
```

This is of course optional:

```
CONFIG += c++14
```

We will be using libtool style libraries:

```
CONFIG += compile_libtool
CONFIG += create_libtool
```

These will create a pkg-config .pc file for us:

```
CONFIG += create_pc create_prl no_install_prl
```

Project sources:

```
SOURCES = qmake-example.cpp
```

Project's public and private headers:

```
HEADERS = qmake-example.h
```

We will install the headers in an API specific include path:

```
headers.path = $${PREFIX}/include/qmake-example-$${MAJOR_VERSION}"."$${MINOR_VERSION}
```

Here put only the publicly installed headers:

```
headers.files = $${HEADERS}
```

Here we will install the library to:

```
target.path = $${PREFIX}/lib
```

This is the configuration for generating the pkg-config file:

```
QMAKE_PKGCONFIG_NAME = $${TARGET}
QMAKE_PKGCONFIG_DESCRIPTION = An example that illustrates how to do it right with qmake
# This is our libdir
QMAKE_PKGCONFIG_LIBDIR = $$target.path
# This is where our API specific headers are
QMAKE_PKGCONFIG_INCDIR = $$headers.path
QMAKE_PKGCONFIG_DESTDIR = pkgconfig
QMAKE_PKGCONFIG_PREFIX = $${PREFIX}
QMAKE_PKGCONFIG_VERSION = $$VERSION
# These are dependencies that our library needs
QMAKE_PKGCONFIG_REQUIRES = Qt5Core
```

Installation targets (the pkg-config file seems to install automatically):

```
INSTALLS += headers target
```

This will be the result after make install:

```
├── include
│   └── qmake-example-3.2
│       └── qmake-example.h
└── lib
    ├── libqmake-example-3.2.so -> libqmake-example-3.2.so.3.2.1
    ├── libqmake-example-3.2.so.3 -> libqmake-example-3.2.so.3.2.1
    ├── libqmake-example-3.2.so.3.2 -> libqmake-example-3.2.so.3.2.1
    ├── libqmake-example-3.2.so.3.2.1
    ├── libqmake-example-3.la
    └── pkgconfig
        └── qmake-example-3.pc
```

ps. Dear friends working at their own customers: when I visit your customer, I no longer want to see that you produced completely stupid wrong qmake based projects for them. Libtoolize it all, get an API version in your library's so-name and do distribute a pkg-config .pc file. That's the very least to pass your exam. Also read this document (and stop pretending that you don't need to know this when at the same time you charge them real money pretending that you know something about modern UNIX software development).

11 Jul 2018 10:25pm GMT

## 26 Jun 2018

### Planet Maemo

#### Switching back from Chrome to Firefox

One major grief for me when surfing on Android are ads. They not only increase page size and loading time, but also take away precious screen real estate.

Unfortunately the native Android browser, which nowadays is Chrome, does not support extensions and hence there is no ad-blocker. Therefore I was quite optimistic when Google announced they would be enforcing the betterads standards with Chrome - aka ad-blocking light. However, after having used Chrome only showing "betterads", I must say that they are far away from what is tolerable to me. I am more in line with the Acceptable Ads criteria. (My site also keeps to them - if you choose to disable ad-blocking here.)

As someone who has to pay for hosting I fully understand that ads are part of the game - but let's face it; as long as annoying ads get you more money, there will be annoying ads. Ad-blockers are a very effective way to let money speak here.

So I needed an adblock-capable browser on Android. Fortunately Mozilla greatly improved Firefox performance with their Quantum initiative. Or maybe modern smartphones just got a lot faster. Anyway, a recent Firefox virtually performs the same as Chrome on Android and thus is a viable alternative. As of recently there is also Microsoft Edge for Android, but actually it does not gain an edge over anything. So let's stick with open source software.

With switching to Firefox on Android one should switch to Firefox on Desktop as well, so you get sync across devices.

## On Linux

Unfortunately Firefox has bad default settings on Linux. For one - unlike Chrome - it does not use client side decorations by default, and thus wastes space in the title bar. But this is easy to fix.

Then it still uses the slow software rendering path. To make it use the GPU, visit about:config and set the following properties to true:

• layers.acceleration.force-enabled enables OpenGL based compositing, which makes for smooth scrolling. (enabled by default on OSX, Windows)
• layers.omtp.enabled (OMTP) further improves performance when scrolling. (enabled by default on OSX, Windows)

They default to false due to issues on some obscure Mesa/Xorg combinations, but generally work well for me with Nvidia drivers.

Additionally, if you use a touch-pad or touch-screen, you should add the following environment variable:

• MOZ_USE_XINPUT2=1 this will make Firefox correctly handle touch events instead of translating them to mouse wheel scrolling. This way you get pixel perfect scrolling on touch-pads and it is a prerequisite for drag to scroll on touch-screens.

## On Android

On Android Firefox generally has sane defaults. The only setting missing here to bring it on par with Chrome are the Encrypted Media Extensions. For this again visit about:config, create the following property and set it to true:

• media.eme.enabled

Still you will need some time to adapt to Firefox; e.g. there is no pull to refresh. However there are other bonus points besides adblocking; for me the synchronized tabs sidebar (on desktop) has proven to be an invaluable usability improvement.

26 Jun 2018 12:32pm GMT

## 29 Apr 2018

### Planet Maemo

#### Teatime & Sensors Unity updated for Ubuntu 18.04

I updated my two little apps, Teatime and Sensors Unity, to integrate with Ubuntu 18.04 and consequently with Gnome 3. For this I ported them to the GtkApplication API, which makes sure they integrate into Unity7 as well as Gnome Shell. Additionally it ensures that only one instance of each app is active at the same time.

As Dash-to-Dock implements the Unity7 D-Bus API and snaps are available everywhere, this drastically widens the target audience.

To make the projects themselves more accessible, I also moved development from Launchpad to GitHub, where you can now easily create pull-requests and report issues. Furthermore the translations are managed at POEditor, where you can help translating the apps to your language. Especially Sensors Unity could use some help.

29 Apr 2018 2:00pm GMT

## 25 Apr 2018

### Planet Maemo

#### Metaclasses, generative C++

This is awesome:

25 Apr 2018 7:20am GMT

## 19 Apr 2018

### Planet Maemo

#### Managing a developer shell with Docker

When I'm not in Flowhub-land, I'm used to developing software in a quite customized command line based development environment. Like for many, the cornerstones of this for me are vim and tmux. As customization increases, it becomes important to have a way to manage that and distribute it across the different computers.

For years, I've used a dotfiles repository on GitHub together with GNU Stow for this. However, this still means I have to install all the software and tools before I can have my environment up and running.

## Using Docker

Docker is a tool for building and running software in a containerized fashion. Recently Tiago gave me the inspiration to use Docker not only for distributing production software, but also for actually running my development environment.

Taking ideas from his setup, I built upon my existing dotfiles and built a reusable developer shell container. With this, I only need Docker installed on a machine, and then I'm two commands away from having my normal development environment:

```
$ docker volume create workstation
$ docker run -v ~/Projects:/projects -v workstation:/root -v ~/.ssh:/keys --name workstation --rm -it bergie/shell
```

Here's how it looks in action:

Once I update my Docker setup (for example to install or upgrade some tool), I can get the latest version on a machine with:

```
$ docker pull bergie/shell
```



At least in theory this should give me a fully identical working environment regardless of the host machine. Linux VPS, a MacBook, or a Windows machine should all be able to run this. And soon, this should also work out of the box on Chromebooks.

## Setting this up

The basics are pretty simple. I already had a repository for my dotfiles, so I only needed to write a Dockerfile to install and set up all my software.
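That Dockerfile is essentially "install the tools, then pull in the dotfiles". A rough sketch of the idea (the base image, package list and repository URL here are just placeholders, not the actual contents):

```dockerfile
FROM debian:stable-slim

# tools the shell environment needs
RUN apt-get update \
 && apt-get install -y git vim tmux stow openssh-client \
 && rm -rf /var/lib/apt/lists/*

# fetch the dotfiles and stow them into root's home directory
RUN git clone https://github.com/<user>/dotfiles.git /opt/dotfiles \
 && cd /opt/dotfiles \
 && stow --target=/root vim tmux

WORKDIR /projects
CMD ["tmux"]
```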

To make things even easier, I configured Travis so that every time I push some change to the dotfiles repository, it will create and publish a new container image.
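The Travis side is just a few lines of configuration; a sketch of the general shape (image name and secret variable names are placeholders):

```yaml
language: generic
services:
  - docker

script:
  - docker build -t <user>/shell .

deploy:
  provider: script
  script: echo "$DOCKER_PASSWORD" | docker login -u "$DOCKER_USERNAME" --password-stdin && docker push <user>/shell
  on:
    branch: master
```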

## Further development ideas

So far this setup seems to work pretty well. However, here are some ideas for further improvements:

• ARM build: Sometimes I need to work on Raspberry Pis. It might be nice to cross-compile an ARM version of the same setup
• Key management: Currently I create new SSH keys for each host machine, and then upload them to the relevant places. With this setup I could use a USB stick, or maybe even a Yubikey to manage them
• Application authentication: Since the Docker image is public, it doesn't come with any secrets built in. This means I still need to authenticate with tools like NPM and Travis. It might be interesting to manage these together with my SSH keys
• SSH host: with some tweaking it might be possible to run the same container on cloud services. Then I'd need a way to get my SSH public keys there and start an SSH server

If you have ideas on how to best implement the above, please get in touch.


19 Apr 2018 12:00am GMT

## 14 Jan 2018

### Planet Maemo

#### With sufficient thrust, pigs fly just fine


14 Jan 2018 11:34pm GMT