12 Nov 2018

Planet KDE

Second Edition of “Dessin et peinture numérique avec Krita” published!

Last month French publisher D-Booker released the 2nd edition of Timothée Giet's book "Dessin et peinture numérique avec Krita".

The first edition was written for Krita 2.9.11, almost three years ago. A lot of things have changed since then! So Timothée has completely updated this new edition for Krita version 4.1. There are also a number of notes about the new features in Krita 4.

Moreover, D-Booker has again worked on updating and improving the French translation of Krita! Thanks again to D-Booker for their contribution.

You can order this book directly from the publisher's website. There is both a digital edition (PDF or EPUB) and a paper edition.

12 Nov 2018 2:22pm GMT

Interview with HoldXtoRevive

Could you tell us something about yourself?

I'm Brigette, but I mainly go by my online handle of HoldXtoRevive. I'm from the UK and mostly known as a fanartist.

Do you paint professionally, as a hobby artist, or both?

I have had a few commissions but outside that I would call myself a hobbyist. I would love to work professionally at some point.

What genre(s) do you work in?

I do semi-realistic sci-fi art. Most recently I have been drawing character portraits inspired by the Art Nouveau style; the majority of it has been fanart of a few different sci-fi games.

Whose work inspires you most - who are your role models as an artist?

It's hard to list them all really. Top of the list would be my other half, RedSkittlez, who is an amazing concept and character artist, as well as my friends Blazbaros, SilverBones and many more who would cause this to go on for too long.

Outside of my friends I would say Charles Walton, Pete Mohrbacher and Valentina Remenar to name a few.

How and when did you get to try digital painting for the first time?

About 4 years ago I downloaded GIMP as I wanted to get back into art after not drawing for about 15 years. I got a simple drawing tablet soon after and things just progressed from there.

What makes you choose digital over traditional painting?

The flexibility and practicality of it. Whilst I would love to try traditional, acquiring, maintaining and storing supplies is not easy for me.

How did you find out about Krita?

My partner was looking at alternatives to Photoshop and came across it via a YouTube video. He recommended that I try it out.

What was your first impression?

How clean the UI is and how all of the tools were easy to find, and the fun I had messing with the brushes.

What do you love about Krita?

The fact it was really easy to get to grips with, yet I can tell there is more I can get from it. Also the autosave.

What do you think needs improvement in Krita? Is there anything that really annoys you?

I would like a brightness/contrast slider alongside the curve for ease of use. It would also be nice if the adjustment windows would not close when the autosave kicks in.

What sets Krita apart from the other tools that you use?

I had not used many programs before I came across Krita. But the thing that jumped out at me was the ease of use, and it had everything I wanted in an art program; I know that if I want to try animation I do not need to go and find another program.

If you had to pick one favourite of all your work done in Krita so far, what would it be, and why?

That is hard to say, but in a pinch I would say the one titled "Saladin's White Wolf". I was really happy with how the background came out, and it was also the one to be picked out and promoted by Bungie on their Twitter.

What techniques and brushes did you use in it?

For the most part I use a multiply layer over flats for shading. My main brushes are just the basic tip (gaussian), basic wet soft and the soft smudge brush.

Where can people see more of your work?

Over on my DeviantArt page: https://www.deviantart.com/holdxtorevive
And my twitter: https://twitter.com/HoldX2ReviveART

Anything else you'd like to share?

I'm bad with words but I want to show appreciation to the Krita crew for making this wonderful program and to everyone who has supported and encouraged me.

12 Nov 2018 8:00am GMT

11 Nov 2018

Planet KDE

Future Developments in clang-query

Getting started - clang-tidy AST Matchers

Over the last few weeks I published a series of blog posts on the Visual C++ blog about Clang AST Matchers.

I am not aware of any similar existing series which covers the creation of clang-tidy checks and the use of clang-query to inspect the Clang AST and assist in the construction of AST Matcher expressions. I hope the series is useful to anyone attempting to write clang-tidy checks. Several people have reported to me that they have previously tried and failed to create clang-tidy extensions due to various issues, including a lack of information tying it all together.

Other issues with clang-tidy include the fact that it relies on the "mental model" a compiler has of C++ source code, which might differ from the "mental model" of regular C++ developers. The compiler needs to have a very exact representation of the code, and needs to have a consistent design for the class hierarchy representing each standard-required feature. This leads to many classes and class hierarchies, and a difficulty in discovering what is relevant to a particular problem to be solved.

I noted several problems in those blog posts.

Last week at code::dive in Wroclaw, I demonstrated tooling solutions to all of these problems. I look forward to video of that talk (and videos from the rest of the conference!) becoming available.

Meanwhile, I'll publish some blog posts here showing the same new features in clang-query and clang-tidy.

clang-query in Compiler Explorer

Recent work by the Compiler Explorer maintainers makes it possible to use source code tooling with the website. Compiler Explorer now contains new menu entries to enable a clang-tidy pane.

clang-tidy in Compiler Explorer


I demonstrated using Compiler Explorer to run the clang-query tool at the code::dive conference, building upon the recent work by the Compiler Explorer developers. This feature will land upstream in time, but it can be used with my own AWS instance for now. It is suitable for exploring the effect that changing the source code has on match results and, orthogonally, the effect that changing the AST Matcher has on the match results. It is also accessible via cqe.steveire.com.

It is important to remember that Compiler Explorer runs clang-query in script mode, so it can process multiple let and match calls, for example. The new command set print-matcher true helps distinguish the output from the matcher which causes it. The help command is also available, listing the new features.
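
As a minimal sketch (the matcher expressions and the "main" name are only illustrative, not taken from the talk), a script fed to clang-query could look like this:

set print-matcher true
let mainFn functionDecl(hasName("main"))
match mainFn
match callExpr(callee(functionDecl()))

Each match statement runs against the current translation unit, and with print-matcher enabled each result is labelled with the matcher expression that produced it.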

The issue of clang-query not printing both diagnostic information and AST information at the same time means that users of the tool need to alternate between writing

set output diag

and

set output dump

to access the different content. Recently, I committed a change to make it possible to enable both dump and diag output from clang-query at the same time. The new commands follow the same structure as the set output command:

enable output dump
disable output dump

The set output <feature> command remains as an "exclusive" setting to enable only one output feature and disable all others.
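
As a sketch of how these commands compose (the matcher is again only illustrative), a script can select diag output exclusively and then additionally enable dump output, so that a single match prints both:

set output diag
enable output dump
match functionDecl(hasName("main"))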

Dumping possible AST Matchers

This command design also enables the possibility of extending the features which clang-query can output. Up to now, developers of clang-tidy extensions had to inspect the AST corresponding to their source code using clang-query and then use that understanding of the AST to create an AST Matcher expression.

That mapping to and from the AST "mental model" is not necessary. New features I am in the process of upstreaming to clang-query enable the output of AST Matchers which may be used with existing bound AST nodes. The command

enable output matcher

causes clang-query to print out all matcher expressions which can be combined with the bound node. This cuts out the requirement to dump the AST in such cases.

Inspecting the AST is still useful as a technique to discover possible AST Matchers and how they correspond to source code. For example, if the functionDecl() matcher is already known and understood, its result can be dumped to see that function calls are represented by CallExpr nodes in the Clang AST. Using the callExpr() AST Matcher and dumping possible matchers to use with it leads to the discovery that callee(functionDecl()) can be used to determine particulars of the function being called. Such discoveries are not possible by only reading the AST output of clang-query.
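
A short sketch of that workflow (the binding name "call" is an arbitrary choice):

enable output matcher
match callExpr().bind("call")

For each node bound to "call", clang-query then prints matcher expressions, such as callee(functionDecl()), which can be combined with callExpr() to narrow the match further.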

Dumping possible Source Locations

The other important discovery space in creation of clang-tidy extensions is that of Source Locations and Source Ranges. Developers creating extensions must currently rely on the documentation of the Clang AST to discover available source locations which might be relevant. Usually though, developers have the opposite problem. They have source code, and they want to know how to access a source location from the AST node which corresponds semantically to that line and column in the source.

It is important to make use of a semantically relevant source location in order to build reliable tools which refactor at scale and without human intervention. For example, a cursory inspection of the locations available from a FunctionDecl AST node might lead to the belief that the return type is available at the getBeginLoc() of the node.

However, this is immediately challenged by the C++11 trailing return type feature, where the actual return type is written at the end. For a semantically correct location, you must currently use

getTypeSourceInfo()->getTypeLoc().getAs<FunctionTypeLoc>().getReturnLoc().getBeginLoc()

It should be possible to use getReturnTypeSourceRange(), but a bug in Clang prevents that, as it does not account for the trailing return type feature.
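
As a minimal sketch of how the TypeLoc-based expression above might be wrapped, here is an assumed free-standing helper (not code from the post; the name and the defensive null checks are my own additions):

#include "clang/AST/Decl.h"
#include "clang/AST/TypeLoc.h"

// Returns the location of a function's written return type, which is also
// correct for C++11 trailing return types. Returns an invalid location if
// the declaration carries no written type information.
clang::SourceLocation writtenReturnTypeLoc(const clang::FunctionDecl *FD) {
  if (!FD || !FD->getTypeSourceInfo())
    return clang::SourceLocation(); // e.g. implicit declarations
  auto FTL =
      FD->getTypeSourceInfo()->getTypeLoc().getAs<clang::FunctionTypeLoc>();
  if (FTL.isNull())
    return clang::SourceLocation(); // the type was written in a form not unwrapped here
  return FTL.getReturnLoc().getBeginLoc();
}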

Once again, my new output feature of clang-query presents a solution to this discovery problem. The command

enable output srcloc

causes clang-query to output the source locations by accessor and caret corresponding to the source code for each of the bound nodes. By inspecting that output, developers of clang-tidy extensions can discover the correct expression (usually via the clang::TypeLoc hierarchy) corresponding to the source code location they are interested in refactoring.
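
A usage sketch, again with an arbitrary binding name:

enable output srcloc
match functionDecl().bind("f")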

Next Steps

I have made many more modifications to clang-query which I am in the process of upstreaming. My Compiler Explorer instance is listed as the 'clang-query-future' tool, while the 'clang-query-trunk' tool runs the current trunk version of clang-query. Both can be enabled for a side-by-side comparison of the future clang-query with the existing one.

11 Nov 2018 10:46pm GMT

31 Oct 2018

planet.freedesktop.org

Bastien Nocera: Pipewire Hackfest 2018

Good morning from Edinburgh, where the breakfast contains haggis, and the charity shops have some interesting finds.

My main goal in attending this hackfest was to discuss Pipewire integration in the desktop, and how it will eventually replace PulseAudio as the audio daemon.

The main problems GNOME has had over the years with PulseAudio relate mostly to how PulseAudio is a black box when it comes to its routing policy. What happens when you plug an HDMI cable into your laptop? Or turn on your Bluetooth headset? I've heard the stories of folks with highly mobile workstations having to constantly visit the Sound settings panel.

PulseAudio has policy scattered in a number of places (do a "git grep routing" inside the sources to see that): some of it is in the device manager, and the modules themselves can set priorities for their outputs and inputs. But there's nothing to take all the information in and make a decision based on the hardware that's plugged in and the applications currently in use.

For Pipewire, the policy decisions would be split off from the main daemon. Pipewire, as it gains PulseAudio compatibility layers, will grow a default/example policy engine that will try to replicate PulseAudio's behaviour. At the very least, that will mean that Pipewire won't regress compared to PulseAudio, and might even be able to take better decisions in the short term.

For GNOME, we still wanted to take control of that part of the experience, and make our own policy decisions. It's very possible that this engine will end up being featureful and generic enough that it will be used by more than just GNOME, or even become the default Pipewire one, but it's far too early to make that particular decision.

In the meantime, we did not want the GNOME policies to be written in C, which is difficult for power users to experiment with, especially for edge use cases. We could have started writing a configuration language, but it would have been too specific, and there are plenty of embeddable languages around. It was also a good opportunity for me to finally write the helper library I've been meaning to write for years, based on my favourite embedded language, Lua.

So I'm introducing Anatole. The goal of the project is to make it trivial to write chunks of programs in Lua, while the core of your project is written in C (we might even be able to embed it in Python or JavaScript, once introspection support is added).
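
To give a flavour of the kind of boilerplate such a helper library would wrap, here is a minimal sketch using the plain Lua C API directly (this is not Anatole's API; the policy.lua file and its route_for_device function are made up for illustration):

#include <cstdio>
#include <lua.hpp> // bundles lua.h, lualib.h and lauxlib.h for C++

int main() {
  lua_State *L = luaL_newstate();
  luaL_openlibs(L);

  // Load the hypothetical policy script, which defines route_for_device(device).
  if (luaL_dofile(L, "policy.lua") != LUA_OK) {
    std::fprintf(stderr, "policy error: %s\n", lua_tostring(L, -1));
    lua_close(L);
    return 1;
  }

  // Call route_for_device("hdmi-output") and read back the chosen sink name.
  lua_getglobal(L, "route_for_device");
  lua_pushstring(L, "hdmi-output");
  if (lua_pcall(L, 1, 1, 0) == LUA_OK)
    std::printf("route to: %s\n", lua_tostring(L, -1));
  else
    std::fprintf(stderr, "call failed: %s\n", lua_tostring(L, -1));
  lua_pop(L, 1);

  lua_close(L);
  return 0;
}

The stack pushing and popping here is exactly the detail the author mentions below not wanting to have to remember.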

It's still in the very early days, and unusable for anything as of yet, but progress should be pretty swift. The code is mostly based on Victor Toso's incredible "Lua factory" plugin in Grilo. (I'm hoping that, once finished, I won't have to remember on which end of the stack I need to push stuff for Lua to do something with it ;)

31 Oct 2018 11:44am GMT

Roman Gilg: Representing KDE at XDC 2018

Last month the X.Org Developer's Conference (XDC) was held in A Coruña, Spain. I took part as a Plasma/KWin developer. My main goal was to simply get into contact with developers from other projects and companies working on open source technology in order to show them that the KDE community aims at being a reliable partner to them now and in the future.

Instead of recounting chronologically what went down at the conference let us look at three key groups of attendees, who are relevant to KWin and Plasma: the graphics drivers and kernel developers, upstream userland and colleagues working on other compositor projects.

Graphics drivers and kernel

If you search on YouTube for videos of talks from previous XDC conferences, or for the videos from this year's XDC, you will notice that there are many talks by graphics driver developers, often directly employed by hardware vendors.

The reason is that hardware vendors have enough money to employ open source developers and send them to conferences, and that they benefit greatly from contributing directly to open source projects, something Nvidia management will also realize at some point.

On the other hand, I talked to the Nvidia engineers at the conference, who were very friendly and eager to talk about the technical solutions they are allowed to share with the community. Sadly, their primarily proprietary technology in general hinders them from taking a more active role in the community, and there is apparently no progress on their proposed open standard for a Wayland buffer sharing API.

At least we arranged that they would send some hardware for testing purposes. I won't be the recipient, since my work focus will be on other topics in the immediate future, but I was able to point them to another KWin contributor, who should receive some Nvidia hardware in the future so that he can better troubleshoot the problems our users on Nvidia experience.

The situation looks completely different for Intel and AMD. In particular, Intel has a longstanding track record of developing their own drivers in the open and of contributing to generic open source solutions that are also supported by other vendors. And AMD decided not too long ago to open source their most commonly used graphics drivers on Linux. In both cases it is bliss to target their latest hardware, and talking to their developers at XDC was as great as I had imagined, because they are not only interested in their own products but in boosting the whole ecosystem and finding suitable solutions for everyone. I want to explicitly mention Martin Peres from Intel and Harry Wentland from AMD, with whom I had long, interesting discussions and who showed great interest in improving the collaboration between low-level engineers and us in userland.

One vendor I haven't mentioned yet is ARM. Although they are, just like Nvidia, Intel and AMD, an XDC "Gold Sponsor", their contribution to the conference in terms of content was minimal, most likely for the same reason of being mostly closed source as in the Nvidia case. And that is equally sad, since we do have some interest in making ARM a well supported target for Plasma. An example is Plasma on the Pinebook. But the driver situation for ARM Mali GPUs is just ugly; developing for them is torture. I know because I did some of the integration work for the Pinebook. All the more I respect the efforts of several extremely talented hackers to provide open-source drivers for ARM Mali GPUs. Some of them presented their work at XDC.

X.Org and freedesktop.org upstream

Linux graphics drivers are cool and all, but without the XServer, Wayland and other auxiliary cross-vendor user space libraries there would not be much to show off to the user. And after all it is the X.Org Developer's Conference, most notably home to the XServer and maybe, governance-wise, in the future also to freedesktop.org. So after looking at low-level driver development, what role did these projects and their developers play at the conference?

First I have to say that the dichotomy established in the previous paragraph is of course not that distinct. Several graphics drivers are part of Mesa, which is in turn part of freedesktop.org, and many graphics driver developers also contribute to userland or are involved in organizational aspects of X.Org and freedesktop.org. One of the more prominent of these organizational aspects is the hosting of projects. There was a presentation by Daniel Stone about the freedesktop.org transition to GitLab, which was a rather huge project this year and is still ongoing.

But regarding technical topics there were not many presentations about the XServer, Wayland and other high level components. After seeing some lightning talks on the first day of the conference I decided to give a lightning talk myself about my Xwayland GSoC project from 2017. I got one of the last slots on Friday and you can watch a video of my presentation here. Also, Drew DeVault presented a demo of wlroots' layer shell.

So there were not so many talks about the higher level user space graphics stack, but some of us plan to increase the ratio of such talks in the future. After talking about graphics driver developers and upstream userland, this brings me directly to the last group of people:

Compositors developers

We were somewhat of a special crowd at XDC. We came from distinct projects, some of us from wlroots, Guido from Purism and me from KWin, but we were united in, to my knowledge, all of us attending XDC for the first time.

If you look at past conferences, the involvement of compositor developers was marginal. My proclaimed goal, and I believe also that of all the others, is to change this from now on, because from embedded to desktop we will all benefit from working together where possible and exchanging information with each other, with upstream and with hardware vendors. I believe X.Org and freedesktop.org can be a perfect platform for that.

Final remarks on organisation

The organisation of the conference was simply great. Huge thanks to Igalia for hosting XDC in their beautiful home town.

What I really liked about the conference schedule was that there were three long breaks every day and long pauses between the talks, allowing the attendees to talk to each other.

What I didn't like about the conference was that all the attendees were spread over the city in different hotels. I like the KDE Akademy approach better in this regard: everyone in one place, so you can drink a last beer together at the hotel bar before going to bed. That said, there were events on multiple evenings throughout the week, but recommending a reasonably priced default hotel for everyone not part of a large group might still be an idea for the next XDC.

31 Oct 2018 11:00am GMT

30 Oct 2018

planet.freedesktop.org

Christian Schaller: PipeWire Hackfest

So we kicked off the PipeWire hackfest in Edinburgh yesterday. We have 15 people attending, including Arun Raghavan, Tanu Kaskinen and Colin Guthrie from PulseAudio, PipeWire creator Wim Taymans, Bastien Nocera and Jan Grulich representing GNOME and KDE, Mark Brown from the ALSA kernel team, Olivier Crête, George Kiagiadakis and Nicolas Dufresne representing embedded use cases for PipeWire, and finally Thierry Bultel representing automotive.

The event kicked off with Wim Taymans presenting on the current state of PipeWire and outlining the remaining issues and current thoughts on how to resolve them. Most of the first day was spent on a roundtable discussion about what the goals of PipeWire are and should be, and what potential tradeoffs there would be going forward. PipeWire is probably a bit closer to Jack than to PulseAudio in design, so quite a bit of the discussion went into how that would affect the PulseAudio use cases and what is planned to ensure PipeWire works very well for consumer audio use cases.

Personally I ended up spending quite some time just testing and running various Jack apps to see what works already and what doesn't. In terms of outputting audio with Jack apps, I was positively surprised by how many I was able to make work (that is, output audio) using PipeWire instead of Jack, but of course we still have some gaps to cover before PipeWire is ready as a drop-in Jack replacement; for instance, the Jack session management protocol needs to be implemented first.

On the second day we outlined the areas that need work before we are ready to replace PulseAudio and came up with a list of remaining tasks.

It is still a bit hard to have a clear timeline for when we will be ready to drop in PipeWire support to replace PulseAudio and then Jack, but we feel the Wayland migration is a good example to follow, where we held off on doing the switch until we felt comfortable that the move would be transparent to most users. There will of course always be corner cases and bugs, but we hope that in general people agree that the Wayland transition was done in a responsible manner and thus could be a good example to follow for us here.

We would like to offer big thanks to the GNOME Foundation for sponsoring travel for some of the community attendees and to Collabora for sponsoring dinner for all attendees on the first night.

If you want to take a look at PipeWire, Wim has updated the wiki page with PipeWire build instructions so it is up to date. The hackfest attendees tested them out, so we are sure they work; just be aware that you want the 'Work' branch and not the Master branch, as that is the one where all the audio work is happening. The Master branch is the video-focused branch we use in Fedora for desktop remoting support in browsers and VNC under Wayland.

30 Oct 2018 2:15pm GMT