23 Jul 2014
It seems a fairly common, straightforward question. You've probably been asked it before. We all have reasons why we hack, why we code, why we write or draw. If you ask somebody this question, you'll hear things like "scratching an itch" or "making something beautiful" or "learning something new". These are all excellent reasons for creating or improving something. But contributing isn't just about creating, it's about giving that creation away. Usually giving it away for free, with no or very few strings attached. When I ask "Why do you contribute to open source", I'm asking why you give it away.
This question is harder to answer, and the answers are often far more complex than the ones given for why people simply create something. What makes it worthwhile to spend your time, effort, and often money working on something, and then turn around and give it away? People often have different intentions or goals in mind when they contribute, from benevolent giving to a community they care about, to personal pride in knowing that something they did is being used in something important or by somebody important. But when you strip away the details of the situation, these all hinge on one thing: Recognition.
If you read books or articles about community, one consistent theme you will find in almost all of them is the importance of recognizing the contributions that people make. In fact, if you look at a wide variety of successful communities, you will find that one common thing they all offer in exchange for contribution is recognition. It is the fuel that communities run on. It's what connects the contributor to their goal, both selfish and selfless. With open source, the only way a contribution can actually be stolen is by not allowing that recognition to happen. Even the most permissive licenses require attribution, something that tells everybody who made it.
Now let's flip that question around: Why do people contribute to your project? If their contribution hinges on recognition, are you prepared to give it? I don't mean your intent, I'll assume that you want to recognize contributions, I mean do you have the processes and people in place to give it?
We've gotten very good at building tools to make contribution easier, faster, and more efficient, often by removing the human bottlenecks from the process. But human recognition is still what matters most. Silently merging someone's patch or branch, even if their name is in the commit log, isn't the same as thanking them for it yourself or posting about their contribution on social media. Letting them know you appreciate their work is important; letting other people know you appreciate it is even more important.
If you are the owner or a leader of a project with a community, you need to be aware of how recognition is flowing out just as much as how contributions are flowing in. Too often communities are successful almost by accident, because the people in them are good at making sure contributions are recognized, and that people know it, simply because that's their nature. But it's just as possible for communities to fail because the personalities involved didn't have this natural tendency: not because of any lack of appreciation for the contributions, just a quirk of their personality. It doesn't have to be this way. If we are aware of the importance of recognition in a community, we can be deliberate in our approaches to making sure it flows freely in exchange for contributions.
23 Jul 2014 12:00pm GMT
With electricity prices in Australia seemingly only going up, and solar being surprisingly cheap, I decided it was a no-brainer to invest in a solar installation to reduce my ongoing electricity bills. It also paves the way for getting an electric car in the future. I'm also a greenie, so having some renewable energy happening gives me the warm and fuzzies.
So today I got solar installed. I've gone for a 2 kW system, consisting of eight 250 watt Seraphim panels (I'm not entirely sure which model) and an Aurora UNO-2.0-I-OUTD inverter.
Shopping around was a total case of decision fatigue. Everyone claims the particular panels they want to sell are the best, and it's pretty much impossible to make a decent assessment of their claims. That said, I've had other solar companies tell me the PHOTON tests aren't indicative of Australian conditions, so it's hard to know who to believe. In the end, I chose Seraphim because of their PHOTON test results, and because they're apparently one of the few panels that pass the Thresher test, which tests for durability.
The harder choice was the inverter. I'm told that yield varies wildly by inverter, and I narrowed it down to Aurora or SunnyBoy. Jason's got a SunnyBoy, and its appeal was that it supported Bluetooth for data gathering, although I don't much care for its aesthetics. Then I learned that a WiFi card was coming out soon for the Aurora inverter, and that struck me as better than Bluetooth, so I went with the Aurora. I discovered at the eleventh hour that the model of Aurora inverter that was going to be supplied wasn't supported by the WiFi card, but I was able to switch to the model that was. I'm glad I did, because the newer model looks really nice on the wall.
The whole system was up and running just in time to catch the setting sun, so I'm looking forward to seeing it in action tomorrow.
Apparently the next step is for Energex to come out and replace my analog power meter with a digital one.
I'm grateful that I was able to get Body Corporate approval to use some of the roof. Being on the top floor helped make the installation more feasible too, I think.
23 Jul 2014 5:36am GMT
The problem: some time ago, I had a server "in the wild" from which I wanted some data backed up to my rsync.net account. I didn't want to put sensitive credentials on this server in case it got compromised.
The awesome admins at rsync.net pointed out their subuid feature. For no extra charge, they'll give you another uid, which can have its own ssh keys, and whose home directory is symbolically linked under your main uid's home directory. So the server can rsync backups to the subuid, and if it is compromised, attackers cannot get at any info which didn't originate from that server anyway.
23 Jul 2014 4:02am GMT
22 Jul 2014
Those of you who have followed this blog for some time know that its preferred language is English (a small number of posts in the early stages are an exception). Things are changing though.
It's not that difficult to understand: if you go to it.deshack.net you can see this website in Italian. I've been thinking about making a big change to this little place on the web for a while, as I want it to become more than a simple blog. I am working on a new theme for business websites, but I'll let you know when it's time. In the meantime, don't be surprised if you see some small changes here.
Now it's time for me to ask something of you: do you think this is an interesting change? Let me know with a comment!
22 Jul 2014 5:21pm GMT
Release Metrics and Incoming Bugs
Release metrics and incoming bug data can be reviewed at the following link:
Status: Utopic Development Kernel
The Utopic kernel has been rebased to v3.16-rc6 and officially uploaded
to the archive. We (as in apw) have also completed a herculean config
review for Utopic and applied the appropriate changes. Please test
and let us know your results.
Important upcoming dates:
Thurs Jul 24 - 14.04.1 (~2 days away)
Thurs Aug 07 - 12.04.5 (~2 weeks away)
Thurs Aug 21 - Utopic Feature Freeze (~4 weeks away)
The current CVE status can be reviewed at the following link:
Status: Stable, Security, and Bugfix Kernel Updates - Trusty/Saucy/Precise/Lucid
Status for the main kernels, until today (Jul. 22):
- Lucid - Released
- Precise - Released
- Saucy - Released
- Trusty - Released
Details of currently open tracking bugs:
For SRUs, the SRU report is a good source of information:
14.04.1 cycle: 29-Jun through 07-Aug
27-Jun Last day for kernel commits for this cycle
29-Jun - 05-Jul Kernel prep week.
06-Jul - 12-Jul Bug verification & Regression testing.
13-Jul - 19-Jul Regression testing & Release to -updates.
20-Jul - 24-Jul Release prep
24-Jul 14.04.1 Release 
07-Aug 12.04.5 Release 
cycle: 08-Aug through 29-Aug
08-Aug Last day for kernel commits for this cycle
10-Aug - 16-Aug Kernel prep week.
17-Aug - 23-Aug Bug verification & Regression testing.
24-Aug - 29-Aug Regression testing & Release to -updates.
 These will be the very last kernels for lts-backport-quantal, lts-backport-raring,
 and lts-backport-saucy.
 The lts-backport-trusty kernel will be the default in the precise point release.
Open Discussion or Questions? Raise your hand to be recognized
No open discussions.
22 Jul 2014 5:12pm GMT
Box's evolution continues. Due to the Qt development, the main theme for Lubuntu must grow a bit more to cover more apps, devices and, of course, environments. Now it's Qt's turn, the sub-system for the next Lubuntu desktop, but this will also allow its use with KDE5 and Plasma Next. For now it's just a project, but the Dolphin file manager looks fine! Note: this is under heavy development.
22 Jul 2014 4:11pm GMT
What Has Stayed the Same?
- Continue to create and run innovative programs to facilitate ever more community contributions and grow the community.
- Continue to provide good advice to me and the rest of Canonical regarding how to be the best community members we can be, given our privileged positions of being paid to work within that community.
- Continue to assist with outward communication from Canonical to the community regarding plans, project status, and changes to those.
What Has Changed?
Secondly, while individuals on the team had been hired to have specific roles in the community, every one of them had branched out to tackle new challenges as needed.
22 Jul 2014 12:56pm GMT
The Box theme support continues growing, covering more and more environments. Now we're celebrating that the MATE desktop environment, a fork of the traditional GNOME 2, will have its own Ubuntu flavour, named Ubuntu MATE Remix. Once I tested it, I noticed something familiar was missing: our beloved Lubuntu spirit. So here begins the (experimental) theme support. It'll be available to download
22 Jul 2014 10:24am GMT
Yesterday's autopkgtest 3.2 release brings several changes and improvements that developers should be aware of.
Cleanup of CLI options, and config files
Previous adt-run versions had rather complex, confusing, and rarely (if ever?) used options for filtering binaries and building sources without testing them. All of those (--binaries-fortests) have now gone away. Only --no-built-binaries is left, which disables building/using binaries for the subsequent unbuilt tree or dsc arguments (by default they get built and their binaries used for tests), and I added its opposite --built-binaries for completeness (although you most probably never need this).
The --help output is now a lot easier to read, both due to the above cleanup and because it now shows several paragraphs for each group of related options, sorted in descending importance. The manpage was updated accordingly.
Another new feature is that you can now put arbitrary parts of the command line into a file (thanks to porting to Python's argparse), with one option/argument per line. So you could e.g. create config files for options and runners which you use often:
$ cat adt_sid
--output-dir=/tmp/out
-s
---
schroot
sid

$ adt-run libpng @adt_sid
Shell command tests
If your test only contains a shell command or two, or you want to re-use an existing upstream test executable and just need to wrap it with some command like env, you can use the new Test-Command: field instead of Tests: to specify the shell command directly:
Test-Command: xvfb-run -a src/tests/run
Depends: @, xvfb, [...]
This avoids having to write lots of tiny wrappers in debian/tests/. This was already possible for click manifests; this release brings it to deb packages as well.
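As a sketch, a complete debian/tests/control stanza using the new field might look like this (the test command and package names are illustrative assumptions, not from the release notes):

```
Test-Command: python3 -m pytest tests/
Depends: @, python3-pytest
Restrictions: allow-stderr
```

Here Depends: @ pulls in the binaries built from the source package itself, and the allow-stderr restriction keeps stray output on stderr from failing the test.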
It is now very easy to define an autopilot test with extra package dependencies or restrictions, without having to specify the full command, using the new autopilot_module test definition. See /usr/share/doc/autopkgtest/README.click-tests.html for details.
If your test fails and you just want to run it with additional dependencies or changed restrictions, you can now avoid having to rebuild the .click by pointing --override-control (which previously only worked for deb packages) to the locally modified manifest. You can also (ab)use this to e.g. add the autopilot -v option to
Unpacking of test dependencies was made more efficient by not downloading Python 2 module packages (which cannot be handled in "unpack into temp dir" mode anyway).
Finally, I made the adb setup script more robust and also faster.
As usual, every change in control formats, CLI, etc. has been documented in the manpages and the various READMEs. Enjoy!
22 Jul 2014 6:16am GMT
I picked up Zoe from Sarah this morning and dropped her at Kindergarten. Traffic seemed particularly bad this morning, or I'm just out of practice.
I spent the day powering through the last two parts of the registration block of my real estate licence training. I've got one more piece of assessment to do, and then it should be done. The rest is all dead-tree written stuff that I have to mail off to get marked.
Zoe's doing tennis this term as her extra-curricular activity, and it's on a Tuesday afternoon after Kindergarten at the tennis court next door.
I'm not sure what proportion of the class is continuing on from previous terms, and so how far behind the eight ball Zoe will be, but she seemed to do okay today, and she seemed to enjoy it. Megan's in the class too, and that didn't seem to result in too much cross-distraction.
After that, we came home and just pottered around for a bit and then Zoe watched some TV until Sarah came to pick her up.
22 Jul 2014 1:23am GMT
21 Jul 2014
Welcome to the Ubuntu Weekly Newsletter. This is issue #375 for the weeks of July 7 - 20, 2014, and the full version is available here.
In this issue we cover:
- 12.04.x HWE Stack EOL Notification
- Ubuntu 13.10 (Saucy Salamander) End of Life reached on July 17 2014
- Ubuntu Stats
- Ubucon LatinAmerica Speakers! #1
- Juju 1.20 is out the door!
- Brightbox now offering official Ubuntu images
- Jonathan Riddell: Frameworks 5 and Plasma 5 announcements
- The Fridge: New Ubuntu Membership Board Members
- The Fridge: [Proposed] Ubuntu Online Summit dates: 4-6 Nov 2014
- Lubuntu Blog: PCManFM 1.2.1
- Ubuntu App Developer Blog: Content Hub to replace Friends API
- Lubuntu Blog: Box support for MATE
- Nicholas Skaggs: A new test runner approaches
- Upstart 1.13 released
- Unity8 & Mir update July 8 & 15, 2014
- Canonical News
- This Feisty Linux Company Has An Interesting Plan To Topple Android
- In The Blogosphere
- Featured Audio and Video
- Weekly Ubuntu Development Team Meetings
- Upcoming Meetings and Events
- Updates and Security for 10.04, 12.04, 13.10 and 14.04
- And much more!
This issue of the Ubuntu Weekly Newsletter is brought to you by:
- Elizabeth K. Joseph
- Jose Antonio Rey
- And many others
Except where otherwise noted, content in this issue is licensed under a Creative Commons Attribution ShareAlike 3.0 License.
21 Jul 2014 11:42pm GMT
When life gives you a sunny beach to live on, make a mojito and go for a swim. Since KDE has an office in Barcelona that all KDE developers are welcome to use, I decided to move to Barcelona until I get bored. So far there's an interesting language or two, hot weather to help my fragile head, and water polo in the sky. Do drop by next time you're in town.
Plasma 5 Release Party Drinks
Also new poll for Plasma 5. What's your favourite feature?
21 Jul 2014 7:22pm GMT
This past spring I had the great opportunity to work with Matthew Helmke, José Antonio Rey and Debra Williams of Pearson on the 8th edition of The Official Ubuntu Book.
In addition to the obvious task of updating content, one of our most important tasks was working to "future proof" the book by doing rewrites in a way that would make sure its content remained useful until the next Long Term Support release, in 2016. This meant a fair amount of content refactoring, fewer specifics when it came to members of teams, and lots of goodies for folks looking to become power users of Unity.
Quoting the product page from Pearson:
The Official Ubuntu Book, Eighth Edition, has been extensively updated with a single goal: to make running today's Ubuntu even more pleasant and productive for you. It's the ideal one-stop knowledge source for Ubuntu novices, those upgrading from older versions or other Linux distributions, and anyone moving toward power-user status.
Its expert authors focus on what you need to know most about installation, applications, media, administration, software applications, and much more. You'll discover powerful Unity desktop improvements that make Ubuntu even friendlier and more convenient. You'll also connect with the amazing Ubuntu community and the incredible resources it offers you.
Huge thanks to all my collaborators on this project. It was a lot of fun to work with them, and I already have plans to work with all three of them on other projects in the future.
So go pick up a copy! As my first published book, I'd be thrilled to sign it for you if you bring it to an event I'm at, upcoming events include:
- August 9, Philadelphia: Fosscon
- August 27-31, Portland, OR: Debconf 14
- September 11-13, Orlando: Fossetcon
- September 23, San Francisco: PuppetConf
- October 22-23, Raleigh: All Things Open
And of course, monthly Ubuntu Hours and Debian Dinners in San Francisco.
21 Jul 2014 4:21pm GMT
Technically a fork is any instance of a codebase being copied and developed independently of its parent. But when we use the word it usually encompasses far more than that. Usually when we talk about a fork we mean splitting the community around a project, just as much as splitting the code itself. Communities are not like code, however; they don't always split in consistent or predictable ways. Nor are all forks the same, and both the reasons behind a fork and the way it is done will have an effect on whether and how the community around it will split.
There are, by my observation, three different kinds of forks that can be distinguished by their intent and method. These can be neatly labeled as Convergent, Divergent and Emergent forks.
Most often when we talk about forks in open source, we're talking about convergent forks. A convergent fork is one that shares the same goals as its parent, seeks to recruit the same developers, and wants to be used by the same users. Convergent forks tend to happen when a significant portion of the parent project's developers are dissatisfied with the management or processes around the project, but otherwise happy with the direction of its development. The ultimate goal of a convergent fork is to take the place of the parent project.
Because they aim to take the place of the parent project, convergent forks must split the community in order to be successful. The community they need already exists, both the developers and the users, around the parent project, so that is their natural source when starting their own community.
Less common than convergent forks, but still well known by everybody in open source, are the divergent forks. These forks are made by developers who are not happy with the direction of a project's development, even if they are generally satisfied with its management. The purpose of a divergent fork is to create something different from the parent, with different goals and most often different communities as well. Because they are creating a different product, they will usually be targeting a different group of users, one that was not well served by the parent project. They will, however, quite often target many of the same developers as the parent project, because most of the technology and many of the features will remain the same, as a result of their shared code history.
Divergent forks will usually split a community, but to a much smaller extent than a convergent fork, because they do not aim to replace the parent for the entire community. Instead they often focus more on recruiting those users who were not served well, or not served at all, by the existing project, and will grow a new community largely from sources other than the parent community.
Emergent forks are not technically forks in the code sense, but rather new projects with new code that share the same goals and target the same users as an existing project. Most of us know these as NIH, or "Not Invented Here", projects. They come into being on their own, instead of splitting from an existing source, but with the intention of replacing an existing project for all or part of an existing user community. Emergent forks are not the result of dissatisfaction with either the management or direction of an existing project, but most often a dissatisfaction with the technology being used, or fundamental design decisions that can't be easily undone with the existing code.
Because they share the same goals as an existing project, these forks will usually result in a split of the user community around an existing project, unless they differ enough in features that they can target users not already being served by those projects. However, because they do not share much code or technology with the existing project, they most often grow their own community of developers, rather than splitting them from the existing project as well.
All of these kinds of forks are common enough that we in the open source community can easily name several examples of them. But they are all quite different in important ways. Some, while forks in the literal sense, can almost be considered new projects in a community sense. Others are not forks of code at all, yet result in splitting an existing community none the less. Many of these forks will fail to gain traction, in fact most of them will, but some will succeed and surpass those that came before them. All of them play a role in keeping the wider open source economy flourishing, even though we may not like them when they affect a community we've been involved in building.
21 Jul 2014 8:00am GMT
20 Jul 2014
Why oh why are they so hard to write?
Even using the built-in modules, it is insanely hard to debug. Playing a bootsplash in X sucks, and my machine boots too fast to test it on reboot.
Basically, euch. All I wanted was a hacker's zebra on boot :(
20 Jul 2014 9:02pm GMT
19 Jul 2014
Friday was my last day at Collabora, the awesome Open Source consultancy in Cambridge. I'd been there more than three years, and it was time for a change.
As luck would have it, that change came in the form of a job offer 3 months ago from my long-time friend in Open Source, Miguel de Icaza. Monday morning, I fly out to Xamarin's main office in Boston, for just over a week of induction and face time with my new co-workers, as I take on the title of Release Engineer.
My job is to make sure Mono on Linux is a first-class citizen, rather than the best-effort it's been since Xamarin was formed from the ashes of the Attachmate/Novell deal. I'm thrilled to work full-time on what I do already as community work - including making Mono great on Debian/Ubuntu - and hope to form new links with the packaging communities in other major distributions. And I'm delighted that Xamarin has chosen to put its money where its mouth is and fund continued Open Source development surrounding Mono.
If you're in the Boston area next week or the week after, ping me via the usual methods!
19 Jul 2014 7:35pm GMT