
Robert Collins: Distributed bugtracking – quick thoughts

Planet Ubuntu - Mon, 2014-05-05 06:11

Just saw a post on distributed bug tracking, and I feel compelled to note that distributed bug trackers are not new – the earliest I personally encountered was Aaron Bentley’s Bugs Everywhere – coming up on its 10th birthday. BE meets many of the criteria in the dbts post I read earlier today, but it hasn’t taken over the world – and I think this is in large part due to the propagation nature of bugs being very different to that of code – different solutions are needed.

With distributed code versioning we often see people going to some effort to avoid conflicts – semantic conflicts are common, and representation conflicts extremely common.

Take bug tracking for example. Here we can look at the nature of the content:

  1. Concurrent cannot-conflict content – e.g. the discussion about the bug. In general everyone should have this in their local bug database as soon as possible, and anyone can write to it.
  2. Observations of fact – e.g. ‘the code change that should fix the bug has landed in Ubuntu’ or ‘Commit C should fix the bug’.
  3. Reports of symptoms – e.g. ‘Foo does not work for me in Ubuntu with package versions X, Y and Z’.
  4. Collaboratively edited metadata – tags, title, description, and arguably even the fields like package, open/closed, importance.

Note that only one of these things – the commit to fix the bug – happens in the same code tree as the code; and that the commit fixing it may be delayed by many things before the fix is available to users. Also note that conceptually conflicts can happen in any of those fields except 1).

Anyhow – my humble suggestion for tackling the conflicts angle is to treat all changes to a bug as events in a timeline – e.g. adding a tag ‘foo’ is an event to add ‘foo’, rather than an event setting the tags list to ‘bar,foo’ – then multiple editors adding ‘foo’ do not conflict (or need special handling). Collaboratively edited fields would likely be unsatisfying with this approach though – last-writer-wins isn’t a great story. OTOH the number of people that edit the collaborative fields on any given bug tends to be quite low – so one could defer that to manual fixups.
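A minimal sketch of that event-timeline idea (my own illustration – the event encoding is invented, not taken from any existing tracker): each replica records tag changes as events, and merging replicas is a plain set union, so two editors adding ‘foo’ concurrently produce the same event and there is nothing to resolve.

```python
def add_tag(events, tag):
    """Record an 'add tag' event on this replica."""
    events.add(('add', tag))

def remove_tag(events, tag):
    """Record a 'remove tag' event on this replica."""
    events.add(('remove', tag))

def merge(*replicas):
    """Merging event timelines is a set union - never a conflict."""
    merged = set()
    for events in replicas:
        merged |= events
    return merged

def current_tags(events):
    """Derive the current tag set from the event history."""
    added = {tag for op, tag in events if op == 'add'}
    removed = {tag for op, tag in events if op == 'remove'}
    return added - removed

# Two editors add 'foo' concurrently on separate replicas:
a, b = set(), set()
add_tag(a, 'foo')
add_tag(b, 'foo')
add_tag(b, 'bar')
# The duplicate 'add foo' events coincide in the union, so the merged
# timeline needs no special handling: current_tags is {'foo', 'bar'}.
merged = merge(a, b)
```

Note the deliberate simplification: with bare (op, tag) events a tag removed once can never be re-added (the classic two-phase-set limitation), which is exactly where last-writer-wins or per-event timestamps would have to come in – the unsatisfying part conceded above.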

Further, as a developer wanting local access to my bug database, syncing all of these things is appealing – but if I’m dealing with a million-bug bug database, I may actually need the ability to filter what I sync or do not sync with some care. Even if I want everything, query performance on such a database is crucial for usability (something git demonstrated convincingly in the VCS space).

Lastly, I don’t think distributed bug tracking is needed – it doesn’t solve a deeply burning use case – offline access would be a 90% solution for most people. What does need rethinking is the hugely manual process most bug systems use today. Making tools like whoopsie-daisy widely available is much more interesting (and that may require distributed underpinnings to work well and securely). Automatic collation of distinct reports and surfacing the most commonly experienced faults to developers offers a path to evidence based assessment of quality – something I think we badly need.
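The collation idea can be sketched in a few lines – this is purely illustrative (the signature function and report fields are invented; whoopsie and real error trackers have their own schemas): bucket incoming crash reports by a stack-trace signature, then surface the biggest buckets.

```python
from collections import Counter

def signature(report):
    """A coarse duplicate key: the top two stack frames of the crash."""
    return tuple(report['stack'][:2])

def most_common_faults(reports, n=3):
    """Surface the n most frequently experienced faults."""
    counts = Counter(signature(r) for r in reports)
    return counts.most_common(n)

reports = [
    {'stack': ['gtk_widget_show', 'foo_draw', 'main']},
    {'stack': ['gtk_widget_show', 'foo_draw', 'bar_init']},
    {'stack': ['malloc', 'baz_alloc']},
]
# The two foo_draw crashes collate into one fault seen twice.
```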

Elizabeth Krumbach Joseph: LOPSA East 2014 wrap-up

Planet Ubuntu - Mon, 2014-05-05 03:08

On Friday and Saturday I had the opportunity to finally attend and participate in a conference I’ve had my eyes on for years: LOPSA-East. I first heard about this conference several years ago while living in Philadelphia, but could never gather up the time or funds to attend. This year I was delighted to see an invitation to submit a proposal land in my inbox and I submitted a talk on Code Review for Systems Administrators which was accepted. Even better, they also asked if I could give the closing keynote on attracting more women to IT!

One of the things I always admired from afar about this conference was its passion for systems administration/ops work: the people who voluntarily spend their time running this conference, and many of the speakers, spend vast amounts of off-work hours on the community. It syncs up well with my own passions and those of many of the local groups nearby, so I was really delighted when I saw PLUG represented on their supporters board near the entrance to registration (along with the great Franklin tux logo by Stephanie A. Fox!).

Friday was a tutorial day for the conference, where I chose to attend Jennifer Davis’ “Implementing Kanban to Improve your Workflow” in the morning and “How to Interview a System Administrator” by Adam Moskowitz in the afternoon.

Jennifer’s tutorial was a real treat. She had group activities throughout the tutorial that made it more engaging, and since they were about our work I was happy to engage, rather than being uncomfortable (group activities don’t tend to be my thing). Even better, she managed to sneak in a game of Fluxx as one of the activities to demonstrate the disruptive and interrupt-driven environment that systems administrators often find ourselves in. The Kanban scheduling system for work is something I’m seeing increasingly in the industry, including on a team I work with in OpenStack. I’ve also been reading The Phoenix Project, where it appears prominently, but it was great to sit down and have a tutorial that helped me better understand how other teams are using Kanban in production. We also got to make a demo board ourselves with post-its, which was a lot of fun, especially if you love office supplies like I do (doesn’t everyone?).

Adam’s session on interviewing systems administrators was really great too. The team I work on has been doing a fair amount of hiring lately, so I’ve been asked to help conduct interviews. The first good news out of this session is that I generally have the right idea with interviews, but there are always improvements to be made! He suggested an approach that centers around the key question of “Tell me about a time when you…”, which shows you how the applicant solved a problem and teaches you a lot about their skills in that area. The goal is to show that the applicant is a smart problem-solver who is able to learn and adapt to new applications as the job of systems administration often requires, not someone who is solidly attached to a single technology – “don’t ask them what the -z flag of ls does.” He also explained the process at his company where an applicant must give a presentation on a subject (typically a technology or problem they’ve solved) to the interview panel. This was quite the contentious suggestion, but he argued that communication skills are vital for an applicant and that they wouldn’t be judging anyone on their public speaking ability. Finally, one of my favorite things he mentioned was making the applicant feel comfortable. Interviews are stressful, and just by seeing how an applicant performs in an interview you get some idea of how they handle stress – there is no need to manufacture stress for them.

Friday night was the first keynote, by OpenStack guru Vishvananda Ishaya. He gave a talk on the history of OpenStack and shared some details about its current uses in the industry. I’ve heard a similar talk from him before, but this was the first time I’d seen it at an operations-focused conference, so that was pretty exciting. It was also notable that both the keynotes this year were by folks who work full time on OpenStack. First we took over open source conferences, now operations!

Saturday kicked off the talks of the conference. I had a chance to catch up with Kartik Subbarao, who recently published the book Enlightening Technical Leadership. I’ve recently jumped on the meditation bandwagon and have sought to bring mindfulness into standard practice in my life, so the timing of his book, and the related talk I went to first thing on Saturday, was great. He proposed changing our mental models for handling various situations, and brought up in-person vs email discussions as an example: body language and tone tell us a lot in person, while in email so many things are much less clear – a phrase like “Good luck” can be interpreted many ways. He implored the audience to take a mindful step back and seek to adjust their reactions to be more positive, constructive and rational.

The second talk of the day was mine. I’ve given my Code Review for Sys Admins talk at a few open source conferences, but this was my first time giving it to a sysadmin-ful audience at an ops-focused conference, so I was eager to hear feedback. I ended up having a lot of great chats after my talk with folks coming from various backgrounds who were interested in learning more about the tools and where the bottlenecks were in our workflow. But perhaps the most exciting part about my talk was during someone else’s – Adam Moskowitz gave a talk in the afternoon called “The Future of System Administration (and what you should do to prepare)” where he described an almost identical workflow he was using at his company, with automated developer testing and systems administration code being pushed through code review too! His premise was that sysadmins will increasingly need coding ability as we dive further into automation of everything. It sure was exciting to see the work we do in the OpenStack project being called the future.

The next talk I went to was “Git Hooks for Sys Admins: With Puppet Examples” by Thomas Uphill. I’ve been using git on a day-to-day basis for over a year now, and over the past few months have been thinking “I really should figure out these git hook things”. This presentation was the kick I needed, particularly since his `puppet parser validate` example is something I totally should be using instead of manually running my own script prior to commit. It was really nice to hear some of the details about what all the stuff in the hooks files was, so I’ll be more familiar once I start digging in. His slides, including the examples, are available here:
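For the record, the general shape of such a hook (my own sketch, not Thomas’ actual example from the slides) is: find the staged Puppet manifests, run `puppet parser validate` on each, and abort the commit if any fail.

```python
import subprocess

def manifest_paths(names):
    """Keep only Puppet manifests (.pp files) from a list of changed paths."""
    return [name for name in names if name.endswith('.pp')]

def staged_files():
    """Paths staged for commit (added/copied/modified), as git reports them."""
    out = subprocess.check_output(
        ['git', 'diff', '--cached', '--name-only', '--diff-filter=ACM'])
    return out.decode().splitlines()

def failing_manifests(files):
    """Run `puppet parser validate` on each staged manifest; return failures."""
    return [path for path in manifest_paths(files)
            if subprocess.call(['puppet', 'parser', 'validate', path]) != 0]
```

Saved as .git/hooks/pre-commit (and made executable), the script would call failing_manifests(staged_files()) and exit non-zero when the list is non-empty – a non-zero exit from a pre-commit hook is what makes git refuse the commit.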

Early in the morning I had been approached to participate as a panelist in the “Professional Topics Panel Discussion”, so that was my next stop of the day. The first topic, brought up by an audience member, was how people handle change review processes that end up really getting in the way of the work and goals of the team. After some discussion, the consensus was that it’s important to work with your manager and other teams to make sure efficiency goals are synced up with the needs of the change review process – and above all else, communication is key. Too many teams get too wrapped up in process and how things “have to be” when things could actually be vastly improved. The second topic was the position of an IT team in a small company running interference with the larger company that had just bought them, to make sure the small company’s employees could continue to do their work with their preferred workflow. Buy-in from management was another key thing here, but there were also comments about how the smaller company is valuable to the larger one when it comes to some of their IT innovations, and how honest communication between both IT teams was key.

The rest of my afternoon I spent in a series of talks, starting with “Don’t Be That Guy”, which went through some of the typical archetypes of technology types and offered advice on how to handle them, from the BOFH to the Downer to the person who seems to live in a cave and rarely collaborates with the team. The conference had a great series of lightning talks, and then I headed over to “Packing It In: Images, Containers, and Config Management” where Michael Goetz discussed the use of Packer, Docker and Chef to build an environment where virtualization, containerization and configuration management work well together. He also gave some tips about using containers, stressing that one should not overload a container the way you might be tempted to with a full virtual machine.

And with that, the talks came to an end! All that was left was my closing keynote.

My talk was titled “Universal Design for Tech: Improving Gender Diversity in our Industry” and it’s one I was more nervous about than any other talk I’ve given recently. This is a tough topic, and one that’s quite personal for me. While I’ve been on panels on the topic in the past couple of years, this was the first time I’d done a solo talk on it since 2009, when I did an improving-participation talk for a Linux Users Group. Since then I’ve either argued to adjust the topic or declined the invitation to speak on it because it’s too stressful. When the opportunity to give this keynote came up I was hesitant at first, but it’s important and I decided it was time to get back out there, even if it’s just for one talk. Over the years, and through the success I’ve seen in my career and that of women I work with, I felt I could add value to the discussion and that it would be worth the stress and risk one takes when giving this kind of talk. I also had valuable input from several women I know, without whom I don’t think I could have crafted such an effective presentation.

There were a lot of great questions from the audience as I wrapped up, and I ended up being late for dinner due to post-talk discussions (oops!). Thanks to everyone who was so engaged and interested in this topic, it was really great to have such a supportive audience. Slides here: LOPSA-East-2014-Keynote-ekjoseph.pdf

In all, this was a great conference and I will be encouraging others to attend. Audience members were regularly engaged with the speakers (agreements and disagreements!). Even though I’m shy, I was able to have a lot more discussions with folks I don’t know than I usually do, a sure sign that I was pretty comfortable. So thanks to everyone who took time to talk to me and be friendly, it makes all the difference. Also thanks to the organizers for crafting such a great environment that I am proud to have participated in.

Andrew Pollock: [debian] Toning it down a bit

Planet Ubuntu - Sun, 2014-05-04 15:53

I received a complaint about the frequency of my life category posts appearing on Planet Debian. It's the first such complaint I've received, whereas I've received more complimentary feedback, presumably from readers via Planet Debian.

It has made me self-conscious about my posts, though, and I don't want it to affect my blogging, so I've pulled the life category from what I feed to Planet Debian. If you want to keep up with the minutiae of my life, and you were doing that via Planet Debian, you'll have to follow my blog directly.

My apologies if anyone was annoyed.

Paul Tagliamonte: Ohanayou? Ohanami!

Planet Ubuntu - Sat, 2014-05-03 22:51

Ohanayou? Ohanami!

Diego Turcios: I have been selected in the GSOC 2014

Planet Ubuntu - Sat, 2014-05-03 21:43
I know this is not an Ubuntu-related post, but I really wanted to share with the Ubuntu community that I was selected to be part of the Google Summer of Code 2014. A special thanks to Paul Tagliamonte (@paultag), who told me to continue trying after I was not selected last year.

April 21st was a great day for me!
I received an email from Google informing me that my proposal was accepted.
I will be working this summer with the organization.
My project is to improve the on-board tutorial environment for BeagleBone with live running interactive examples.

If you want to read more about the 7 projects of BeagleBone, you can read them here

Stephan Adig: Dealing with Disrespect - a Review

Planet Ubuntu - Sat, 2014-05-03 10:29

Normally I don’t write book reviews, but this time I have to, because it hit me personally.

As most of you should already know, Jono Bacon released another book, titled ‘Dealing with Disrespect’. I don’t know if the review will be allowed by Amazon, so I’m publishing it here as well:


“Dealing with Disrespect”

by Jono Bacon

First of all a full disclosure:

The author, Jono Bacon, is a long-standing colleague of mine from working on the Ubuntu project. I am not, in any way, affiliated with his employer (Canonical), and sometimes (not all the time) I really don’t share his views and/or opinions.

Personally, I see him as a friend – not a close one, but more like ‘brothers in arms’. We share a passion for open source, and we both like the Ubuntu OS, heavy metal and pints of beer. And especially, we both like being the dad of the most adorable and awesome sons we could ever have wished for.

I owe him a lot, because he (and some other community members, but he in particular) pulled me back into the Ubuntu Business a couple of years ago, and I am very thankful for this.

When Jono revealed his new writing 2 days ago, I started reading it right away because, believe me or not, I was wondering if he was referring to me to some extent – I can be exactly the same guy he pictures in his latest book: the disrespectful, ranting and rambling guy, the angry ‘open source’ guy who sits too many hours per day in front of the computer and reads a lot of nonsense from people who think they are the smartest guys on this planet.

Someone who is passionate, angry and full of ramblings when it comes to certain positions in our technical world, and who sometimes speaks up too loud.

Thankfully, he chose other examples, but I found myself in his book, which is not really charming.

Well, honestly, Jono hit the bull’s eye with his detailed description of the various aspects of how to read the different comments, responses or posts in our technical world.

His statement

"The trick here is to determine the attributes of the sender and the context." (PDF, Page 8, 'Dealing with Disrespect')

is the essential message (he later extends this to the four important ‘ingredients’: sender, content, tone and context).

Old Internet people like me, who still know Usenet, know how hard this can be. How many times did we read Usenet posts that were, in our eyes and ears, unacceptable, bollocks or insane, hit the ‘Reply’ button in our newsreader, and flame some poor guy we didn’t even know personally?

In those days, we never thought about the other guy; we just flamed and insulted on a very personal level. But, believe me or not, it also came back like a boomerang, and it really escalated. Those were the days when we all had leather for skin and could swallow a lot.

Today, the world has changed. We don’t use Usenet so often anymore, and our ‘ramblings’ can be found on weblogs, in their comment sections and on web forums. What and how we say, write and comment nowadays is more publicly exposed than 20 years back. People have become softer; we try to be more friendly to each other, mostly using some conjugation of the word ‘good’ even to say that something was really bad.

What was missing all this time was a guide on how to deal with those who are not ‘nice’, who are not socially well conditioned – people who don’t speak the politically correct English (or language of choice).

Until now.

Now Jono has written exactly this missing guide on how to deal with those people. And he didn’t just write about it – he has the experience, having worked as ‘The Community Manager’ of Ubuntu. He has already dealt with them. He knows what he is writing about.

And he knows that not all of these people are anti-social, hateful or disrespectful.

Many of these people are smart and, in real life, really friendly. It just takes some experience to deal with them, and Jono has now given us the right guide to learn from.

I really beg you to read this little guide of Jono’s, because you can learn from it – whether you are a community manager, whether you have to deal with a very loud community, or even if you are the rambling guy. It’s worth a read. There is a lot to learn and to understand.

This book finally tries to solve issues which can’t be fixed technically.

And thanks to Jono, I hope it will make the technically messed-up world a little more enjoyable.

Alberto Milone: Hybrid Graphics in Ubuntu 14.04

Planet Ubuntu - Sat, 2014-05-03 10:24

Here’s a short list of the new features concerning hybrid graphics in Ubuntu 14.04:

  • External displays connected to the NVIDIA GPU can now be used through the “nvidia-settings” panel. We used to disable them but this is no longer the case (also there’s a fix pending for LP: #1296020, in case your BIOS provides a fake output)
  • We have a more robust system to detect and enable hybrid graphics, thanks to the new gpu-manager (I’ll write a more technical article with all the details soon).
  • We now fall back on the open Intel driver if any of the required components is missing (e.g. the kernel module was not built for the newly installed kernel, or a key package was accidentally removed).
  • Installing the nvidia or the fglrx driver should allow hybrid graphics to work with no further action required. Switching from one power profile to another can be done using the relevant control panels (either AMD’s or NVIDIA’s), as usual.
  • A direct benefit of using a recent kernel is that tearing on Intel/NVIDIA systems, while still an issue, should be a little reduced.

My special thanks go to Maarten Lankhorst (of Nouveau fame), who helped a lot by providing guidance, testing, and debugging X issues.

Known issues


Andrea Veri: Adding reCAPTCHA support to Mailman

Planet Ubuntu - Sat, 2014-05-03 09:37

The GNOME infrastructure and many others have recently been attacked by a huge amount of subscription-based spam against their Mailman instances. What the attackers were doing was simply launching a GET call against a specific REST API URL, passing all the parameters needed for a subscription request (and confirmation) to be sent out. Understanding it becomes very easy when you look at the following example taken from our apache.log:

May 3 04:14:38 restaurant apache: - - [03/May/2014:04:14:38 +0000] "GET /mailman/subscribe/banshee-list? HTTP/1.1" 403 313 "http://spam/index2.html" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/34.0.1847.131 Safari/537.36"

As you can see, the attackers were sending all the relevant details needed for the subscription to go forward (specifically the full name, the email, the digest option and the password for the target list). At first we tried to stop the spam by banning the subnets the requests were coming from; then, when it was obvious that more subnets were being used and manual intervention was needed, we tried banning their User-Agents. Again no luck: the spammers were smart enough to change it every now and then, making it match an existing browser User-Agent (with a good chance of hitting a lot of false positives).
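To make that triage concrete, here is a sketch of my own (not the tooling we actually used, and the log regex is deliberately simplified): counting subscribe hits per source address and User-Agent from the access log shows immediately when banning either dimension stops being effective.

```python
import re
from collections import Counter

# Toy apache combined-log parser - just enough fields for this analysis;
# real log formats vary, so this regex is illustrative only.
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"GET (?P<path>\S+)[^"]*" \d+ \d+ "[^"]*" "(?P<ua>[^"]*)"')

def subscribe_hits(lines):
    """Count /mailman/subscribe/ requests per (ip, user-agent) pair."""
    hits = Counter()
    for line in lines:
        m = LINE.match(line)
        if m and m.group('path').startswith('/mailman/subscribe/'):
            hits[(m.group('ip'), m.group('ua'))] += 1
    return hits
```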

Now you might be wondering why such an attack caused a lot of issues and pain. Well, the attackers made use of addresses found around the web for their malicious subscription requests. That means we received a lot of emails from people who had never heard about the GNOME mailing lists but received around 10k subscription requests that were seemingly being sent by themselves.

It was obvious we needed to look at a backup solution, and luckily someone on our support channel suggested the CAPTCHA support that sysadmins had recently added to Mailman. I’m now sharing the patch and providing a few more details on how to properly set it up on either DEB- or RPM-based distributions. Credit for the patch should go to Debian Developer Tollef Fog Heen, who has been so kind as to share it with us.

Before patching your installation make sure to install the python-recaptcha package (tested on Debian with Mailman 2.1.15) on DEB-based distributions and python-recaptcha-client on RPM-based distributions (I personally tested it against Mailman release 2.1.15 on RHEL 6).

The Patch

diff --git a/Mailman/Cgi/ b/Mailman/Cgi/
index 4a54517..d6417ca 100644
--- a/Mailman/Cgi/
+++ b/Mailman/Cgi/
@@ -22,6 +22,7 @@
 import os
 import cgi
+import sys
 
 from Mailman import mm_cfg
 from Mailman import Utils
@@ -30,6 +31,8 @@
 from Mailman import Errors
 from Mailman import i18n
 from Mailman.htmlformat import *
 from Mailman.Logging.Syslog import syslog
+sys.path.append("/usr/share/pyshared")
+from recaptcha.client import captcha
 
 # Set up i18n
 _ = i18n._
@@ -200,6 +203,9 @@ def list_listinfo(mlist, lang):
     replacements[''] = mlist.FormatFormStart('listinfo')
     replacements[''] = mlist.FormatBox('fullname', size=30)
+    # Captcha
+    replacements[''] = captcha.displayhtml(mm_cfg.RECAPTCHA_PUBLIC_KEY, use_ssl=False)
+
     # Do the expansion.
     doc.AddItem(mlist.ParseTags('listinfo.html', replacements, lang))
     print doc.Format()
diff --git a/Mailman/Cgi/ b/Mailman/Cgi/
index 7b0b0e4..c1c7b8c 100644
--- a/Mailman/Cgi/
+++ b/Mailman/Cgi/
@@ -21,6 +21,8 @@
 import sys
 import os
 import cgi
 import signal
+sys.path.append("/usr/share/pyshared")
+from recaptcha.client import captcha
 
 from Mailman import mm_cfg
 from Mailman import Utils
@@ -132,6 +130,17 @@ def process_form(mlist, doc, cgidata, lang):
     remote = os.environ.get('REMOTE_HOST',
                             os.environ.get('REMOTE_ADDR', 'unidentified origin'))
+
+    # recaptcha
+    captcha_response = captcha.submit(
+        cgidata.getvalue('recaptcha_challenge_field', ""),
+        cgidata.getvalue('recaptcha_response_field', ""),
+        mm_cfg.RECAPTCHA_PRIVATE_KEY,
+        remote,
+    )
+    if not captcha_response.is_valid:
+        results.append(_('Invalid captcha'))
+
     # Was an attempt made to subscribe the list to itself?
     if email == mlist.GetListEmail():
         syslog('mischief', 'Attempt to self subscribe %s: %s', email, remote)

Additional setup

Then in the /var/lib/mailman/templates/en/listinfo.html template (right below <mm-digest-question-end>) add:

<tr> <td>Please fill out the following captcha</td> <td><mm-recaptcha-javascript></td> </tr>

Also make sure to generate a public and private key at and add the following parameters to your file:


Loading reCAPTCHA images from a trusted HTTPS source can be done by changing the following line:

replacements[''] = captcha.displayhtml(mm_cfg.RECAPTCHA_PUBLIC_KEY, use_ssl=False)

to:

replacements[''] = captcha.displayhtml(mm_cfg.RECAPTCHA_PUBLIC_KEY, use_ssl=True)

EPEL 6 related details

A few additional details should be mentioned in case you are setting this up on a RHEL 6 host (or any other machine using the EPEL 6 package python-recaptcha-client-1.0.5-3.1.el6):

Importing the recaptcha.client module will fail for some strange reason; importing it correctly can be done this way:

ln -s /usr/lib/python2.6/site-packages/recaptcha/client /usr/lib/mailman/pythonlib/recaptcha

and then fix the imports, also making sure sys.path.append("/usr/share/pyshared") is not there:

from recaptcha import captcha

That’s not all: the package still won’t work as expected, given that the API_SSL_SERVER, API_SERVER and VERIFY_SERVER variables are outdated (filed as bug #1093855). Substitute them with the following ones:


That should be all! Enjoy!

Dimitri John Ledkov: X4D Icons released

Planet Ubuntu - Fri, 2014-05-02 23:38
I am releasing some icons under MIT license. They will be hosted at and available for development on GitHub.
I had to create my own icons, as I couldn't find icons of similar nature under a free license. Hopefully others will find these useful as well.

The icons below are all available in PNG, GIF, SVG and EPS. To link to a specific version directly, add .png, .gif, -v.svg or -v.eps to the generic URI (or browse the icons repository to see all versions).

Each icon comes in a light and a dark variant, covering the following document types: HTML 2.0, HTML 3.2, HTML 4.0, HTML 4.01, XHTML 1.0, XHTML 1.1, XHTML Basic 1.0, XHTML-Print 1.0, CSS, CSS 1, CSS 2, MathML 2.0, SVG 1.0, SVG 1.1, SVG 1.2, SVG Tiny 1.1, SVG Tiny 1.2, XML 1.0, XML 1.1.

