news aggregator

Forums Council: LoCo Forums

Planet Ubuntu - Fri, 2014-10-03 09:41

Up to now we have happily held areas for any LoCo who wishes us to do so.

Some of these are extremely active, notably the Catalan and Argentina team areas; many, though, are entirely unused, and others have shown no activity since the middle of 2013.

Following a discussion with the LoCo Council and in conjunction with other changes we have been making to the forum, many of these LoCo forums have been archived. All of the posts in the archive area are still readable, but no new posts will be allowed in these archived forums.

If the LoCo Contact for any of the closed areas wants to discuss the situation with regard to their own forum we would be happy to do so.


The Fridge: Nominations open for two positions on the Ubuntu IRC Council

Planet Ubuntu - Fri, 2014-10-03 03:13

We are opening nominations for two positions on the Ubuntu IRC Council. We are filling slots opened by resignations (IdleOne resigned last year, and AlanBell has just announced to us that he is resigning). We felt we could still perform with four council members, but not with just three.

Details of the IRC Council can be read on the wiki; a summary of the nomination requirements is below (but all candidates are encouraged to read the wiki in full):

Elections of new IRC Council members will be held in the following way:

  1. An open call for nominations should be announced in the IRC Community, and people can nominate themselves for a seat on the council. Everyone is welcome to apply.
  2. To apply for a seat the candidate creates a Wiki page outlining their work in the community, and inviting others to provide testimonials.
  3. When the application deadline has passed, the IRC Council will review the applications and provide feedback on the candidates for the Community Council to review.
  4. The Community Council will identify a shortlist for the board and circulate the list publicly for feedback from the community.
  5. The shortlist identified by the Community Council will be voted upon by team members as described at CommunityCouncil/Delegation. Members of the Ubuntu IRC Members Team are eligible to vote.
  6. The Community Council will then finalise the appointment of IRC Council members.

The deadline for the nominations will be announced later on.

Originally posted to the ubuntu-irc mailing list on Thu Oct 2 23:15:11 UTC 2014 by C de-Avillez

Raphaël Hertzog: My Free Software Activities in September 2014

Planet Ubuntu - Thu, 2014-10-02 16:20

This is my monthly summary of my free software related activities. If you’re among the people who made a donation to support my work (26.6 €, thanks everybody!), then you can learn how I spent your money. Otherwise it’s just an interesting status update on my various projects.

Django 1.7

Since Django 1.7 was released in early September, I updated the package in experimental and continued to push for its inclusion in unstable. I sent a few more patches to multiple reverse build dependencies whose maintainers had asked for help (python-django-bootstrap-form, horizon, lava-server) and then uploaded the package to unstable. At that point, I bumped the severity of all bugs filed against packages that no longer built with Django 1.7.

Later in the month, I made sure that the package migrated to testing; this only required a temporary removal of mumble-django (see #763087). Quite a few packages have been updated since then (remaining bugs here).

Debian Long Term Support

I have worked towards keeping Debian Squeeze secure, see the dedicated article: My Debian LTS report for September 2014.

Distro Tracker

The pace of development on tracker.debian.org slowed down a bit this month, with only 30 new commits in the repository, closing 6 bugs. Some of the changes are noteworthy though: news items now contain real links for bugs, CVEs and plain URLs (example here). I have also fixed a serious issue with the way users were identified when they logged in via sso.debian.org with their Alioth account credentials.
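As a rough illustration of the linkification feature described above (the regexes and URL templates below are hypothetical, not Distro Tracker's actual code):

```python
import re

# Hypothetical linkifier in the spirit of the feature described above;
# the URL templates mirror the public Debian trackers, but this is not
# Distro Tracker's real implementation.
def linkify(text):
    # Debian bug references like "#763087"
    text = re.sub(
        r'#(\d{4,})',
        r'<a href="https://bugs.debian.org/\1">#\1</a>',
        text)
    # CVE identifiers like "CVE-2014-0160"
    text = re.sub(
        r'\b(CVE-\d{4}-\d{4,})\b',
        r'<a href="https://security-tracker.debian.org/tracker/\1">\1</a>',
        text)
    return text

print(linkify("Fixes #763087 and CVE-2014-0160."))
```

The real feature presumably also handles plain URLs; the sketch keeps just the two patterns that are easiest to show.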

On the development side, we’re now able to generate the test suite code coverage which is quite helpful to identify parts of the code that are clearly missing some tests (see bin/gen-coverage.sh in the repository).

Misc packaging

Publican. I had fallen behind on packaging new upstream versions of Publican and, with the freeze approaching, I decided to take care of it. Unfortunately, it wasn’t as easy as I had hoped, and I found numerous issues that I have filed upstream (invalid public identifier, PDF build fails with noNumberLines function available, build of the manual requires the network). Most of those have been fixed upstream in the meantime, but the last issue seems to be a problem in the way we manage our DocBook XML catalogs in Debian. I have thus filed #763598 (docbook-xml: xmllint fails to identify local copy of docbook entities file), which is still awaiting an answer from the maintainer.

Package sponsorship. I have sponsored new uploads of dolibarr (RC bug fix), tcpdf (RC bug fix), tryton-server (security update) and django-ratelimit.

GNOME 3.14. With the arrival of GNOME 3.14 in unstable, I took care of updating gnome-shell-timer and also filed some tickets for extensions that I use: https://github.com/projecthamster/shell-extension/issues/79 and https://github.com/olebowle/gnome-shell-timer/issues/25

git-buildpackage. I filed multiple bugs on git-buildpackage for little issues that have been irking me since I started using this tool: #761160 (gbp pq export/switch should be smarter), #761161 (gbp pq import+export should preserve patch filenames), #761641 (gbp import-orig should be less fragile and more idempotent).

Thanks

See you next month for a new summary of my activities.


Julian Andres Klode: Acer Chromebook 13 (FHD): Initial impressions

Planet Ubuntu - Thu, 2014-10-02 16:10

Today, I received my Acer Chromebook 13, in the glorious FullHD variant with 4GB RAM. For those of you who don’t know it, the Acer Chromebook 13 is a 13.3 inch chromebook powered by a Tegra K1 CPU.

This version cannot currently be ordered; only pre-orders were shipped yesterday (at least here in Germany). I cannot even review it on Amazon (despite having bought it there), as they have not enabled reviews for it yet.

The device feels solidly built, and looks good. It comes in all-white matte plastic and is slightly reminiscent of the old white MacBooks. The keyboard is horrible; there’s no well-defined pressure point. It feels like you’re typing on a pillow. The display is OK, though an IPS panel would be a lot nicer to work with. Oh, and it could be brighter. I do not think that using it outside on a sunny day would be a good idea. The speakers are loud and clear compared to my ThinkPad X230.

The performance of the device is about acceptable (unfortunately, I do not have any comparison in this device class). Even when typing this blog post in the visual WordPress editor, I notice some sluggishness. Opening the app launcher or loading the new tab page while music is playing makes the music stop or skip for a few ms (20-50 ms if I had to guess). Running a benchmark in parallel or browsing does not usually cause this stuttering, though.

There are still some bugs in Chrome OS: loading the Play Books library for the first time resulted in some rendering issues. The “Browser” process always consumes at least 10% CPU, even when idling with no page open; this might cause some of the sluggishness I mentioned above. Watching Flash videos also used more CPU than I expected, given that playback is hardware accelerated.

Finally, Netflix did not work out of the box, despite the Chromebook shipping with a special Netflix plugin. I always get some unexpected issue-type page. Setting the user agent to Chrome 38 from Windows, thus forcing the use of the EME video player instead of the Netflix plugin, makes it work.

I reported these software issues to Google via Alt+Shift+I. The issues appeared on the current version of the stable channel, 37.0.2062.120.

What’s next? I don’t know.



Kubuntu: KDE Applications and Development Platform 4.14.1

Planet Ubuntu - Thu, 2014-10-02 13:10

Packages for the release of KDE SC 4.14.1 are available for Kubuntu 14.04 LTS and our development release. You can get them from the Kubuntu Backports PPA.

Bugs in the packaging should be reported to kubuntu-ppa on Launchpad; bugs in the software itself should be reported to KDE.

Scarlett Clark: Kubuntu: KDE 4.14.1 on Trusty released.

Planet Ubuntu - Thu, 2014-10-02 12:44

We are finally catching up! I am pleased to announce I finished backporting KDE 4.14.1 to Trusty Tahr and it is available in the Kubuntu backports PPA.

http://ppa.launchpad.net/kubuntu-ppa/backports/ubuntu trusty main
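The line above is the APT source location for the Kubuntu Backports PPA. One typical way to enable it, sketched here with the standard `add-apt-repository` tool rather than the announcement's own instructions, is:

```shell
# Enable the Kubuntu Backports PPA, refresh the package index,
# then pull in the backported KDE 4.14.1 packages.
sudo add-apt-repository ppa:kubuntu-ppa/backports
sudo apt-get update
sudo apt-get dist-upgrade
```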

Matt Zimmerman: Join me in supporting The Ada Initiative

Planet Ubuntu - Wed, 2014-10-01 16:36

When I first read that Linux kernel developer Valerie Aurora would be changing careers to work full-time on behalf of women in open source communities, I never imagined it would lead so far so fast. Today, The Ada Initiative is a non-profit organization with global reach, whose programs have helped create positive change for women in a wide range of communities beyond open source. Building on this foundation, imagine how much more they can do in the next four years! That’s why I’m pledging my continuing support, and asking you to join me.

For the next 7 days, I will personally match your donations up to $4,096. My employer, Heroku (Salesforce.com), will match my donations too, so every dollar you contribute will be tripled!

My goal is that together we will raise over $12,000 toward The Ada Initiative’s 2014 fundraising drive.

Since about 1999, I had been working in open source communities like Debian and Ubuntu, where women are vastly underrepresented even compared to the professional software industry. Like other men in these communities, I had struggled to learn what I could do to change this. Such a severe imbalance can only be addressed by systemic change, and I hardly knew where to begin. I worked to raise awareness by writing and speaking, and joined groups like Debian Women, Ubuntu Women and Geek Feminism. I worked on my own bias and behavior to avoid being part of the problem myself. But it never felt like enough, and sometimes felt completely hopeless.

Perhaps worst of all, I saw too many women burning out from trying to change the system. It was often taxing just to participate as a woman in a male-dominated community, and the extra burden of activism seemed overwhelming. They were all volunteers, doing this work in evenings and weekends around work or study, and it took a lot of time, energy and emotional reserve to deal with the backlash they faced for speaking out about sexism. Valerie Aurora and Mary Gardiner helped me to see that an activist organization with full-time staff could be part of the solution. I joined the Ada Initiative advisory board in February 2011, and the board of directors in April.

Today, The Ada Initiative is making a difference not only in my community, but in my workplace as well. When I joined Heroku in 2012, none of the engineers were women, and we clearly had a lot of work to do to change that. In 2013, I attended AdaCamp SF along with my colleague Peter van Hardenberg, joining the first “allies track”, open to participants of any gender, for people who wanted to learn the skills to support the women around them. We’ve gone on to host two ally skills workshops of our own for Heroku employees, one taught by Ada Initiative staff and another by a member of our team, security engineer Leigh Honeywell. These workshops taught interested employees simple, everyday ways to take positive action to challenge sexism and create a better workplace for women. The Ada Initiative also helped us establish a policy for conference sponsorship which supports our gender diversity efforts. Today, Heroku engineering includes about 10% women and growing. The Ada Initiative’s programs are helping us to become the kind of company we want to be.

I attended the workshop with a group of Heroku colleagues, and it was a powerful experience to see my co-workers learning tactics to support women and intervene in sexist situations. Hearing them discuss power and privilege in the workplace, and the various “a-ha!” moments people had, were very encouraging and made me feel heard and supported.
– Leigh Honeywell

If you want to see more of these programs from The Ada Initiative, please contribute now.


Mark Shuttleworth: Exchange controls in SA provide no economic guarantees of stability, but drive up the cost of cross-border relationships for everyone

Planet Ubuntu - Wed, 2014-10-01 13:48

The South African Supreme Court of Appeal today found in my favour in a case about exchange controls. I will put the returned funds of R250m plus interest into a trust, to underwrite constitutional court cases on behalf of those whose circumstances deny them the ability to be heard where the counterparty is the State. Here is a statement in full:

Exchange controls may appear to be targeted at a very small number of South Africans but their consequences are significant for all of us: especially those who are building relationships across Southern Africa such as migrant workers and small businesses seeking to participate in the growth of our continent. It is more expensive to work across South African borders than almost anywhere else on Earth, purely because the framework of exchange controls creates a cartel of banks authorized to act as the agents of the Reserve Bank in currency matters.

We all pay a very high price for that cartel, and derive no real benefit in currency stability or security for that cost.

Banks profit from exchange controls, but our economy is stifled, and the most vulnerable suffer most of all. Everything you buy is more expensive, South Africans are less globally competitive, and cross-border labourers, already vulnerable, pay the highest price of all – a shame we should work to address. The IMF found that “A study in South Africa found that the comparative cost of an international transfer of 250 rand was the lowest when it went through a friend or a taxi driver and the highest when it went through a bank.” The World Bank found that “remittance fees punish poor Africans”. South Africa scores worst of all, and according to the Payments Association of South Africa and the Reserve Bank, this is “…mostly related to the regulations that South African financial institutions needed to comply with, such as the Financial Intelligence Centre Act (Fica) and exchange-control regulations.”

Today’s ruling by the Supreme Court of Appeal found administrative and procedural fault with the Reserve Bank’s actions in regard to me, and returned the fees levied, for which I am grateful. This case, however, was not filed solely in pursuit of relief for me personally. We are now considering the continuation of the case in the Constitutional Court, to challenge exchange control on constitutional grounds and ensure that the benefits of today’s ruling accrue to all South Africans.

This is a time in our history when it will be increasingly important to defend constitutional rights. Historically, these are largely questions related to the balance of power between the state and the individual. For all the eloquence of our Constitution, it will be of little benefit to us all if it cannot be made binding on our government. It is expensive to litigate at the constitutional level, which means that such cases are imbalanced – the State has the resources to make its argument, but the individual often does not.

For that reason, I will commit the funds returned to me today by the SCA to a trust run by veteran and retired constitutional scholars, judges and lawyers, that will selectively fund cases on behalf of those unable to do so themselves, where the counterparty is the state. The mandate of this trust will extend beyond South African borders, to address constitutional rights for African citizens at large, on the grounds that our future in South Africa is in every way part of that great continent.

This case is largely thanks to the team of constitutional lawyers who framed their arguments long before meeting me; I have been happy to play the role of model plaintiff and to underwrite the work, but it is their determination to correct this glaring flaw in South African government policy which inspired me to support them.

For that reason I will ask them to lead the establishment of this new trust and would like to thank them for their commitment to the principles on which our democracy is founded.

This case also has a very strong personal element for me, because it is exchange controls which make it impossible for me to pursue the work I am most interested in from within South Africa and which thus forced me to emigrate years ago. I pursue this case in the hope that the next generation of South Africans who want to build small but global operations will be able to do so without leaving the country. In our modern, connected world, and our modern connected country, that is the right outcome for all South Africans.

Mark

Alan Pope: XDA Developer Conference 2014

Planet Ubuntu - Wed, 2014-10-01 10:09

The XDA Developer community had its second conference last weekend, this time in Manchester, UK. We were asked to sponsor the event and were happy to do so. I went along with Daniel Holbach from the Community Team and Ondrej Kubik from the Phone Delivery Team at Canonical.

This was my first non-Ubuntu conference for a while, so it was interesting for me to meet people from so many different projects. As well as us representing Ubuntu Phone, there were guys from the Jolla project showing off SailfishOS and their handset and ports. Asa Dotzler was also there to represent Mozilla & FirefoxOS.

Daniel ran a small Ubuntu app development workshop, which taught us a lot about our materials and process for App Dev Schools; we’ll feed that back into later sessions. Ondrej gave a talk to a packed room about hardware bring-up and porting Ubuntu to other devices. It was well received and explained the platform nicely. I talked about the history of Ubuntu phone and what the future might hold.

There were other sponsor booths too, including big names like nVidia showing off the Shield tablet and Sony demonstrating their rather bizarre Smart EyeGlass technology. Oppo and OnePlus had plenty of devices to lust after, including giant phones with beautiful displays. I enjoyed a bunch of the talks, including MediaTek making a big announcement and demonstrating their new LinkIt ONE platform.

The ~200 attendees were mostly pretty geeky guys whose ages ranged from 15 to 50. There were Android developers, ROM maintainers, hardware hackers and tech enthusiasts who all seemed very friendly and open to discuss all kinds of tech subjects at every opportunity.

One thing I’d not seen at other conferences, but which was big at XDA:DevCon, was the hardware give-aways. The organisers had obtained a lot of tech from the sponsors to give away, ranging from phone covers, bluetooth speakers, mobile printers and hardware hacking kits through to phones, smart watches & tablets, including an Oppo Find 7, a Pebble watch and an nVidia Shield & controller. These were often handed out as a ‘reward’ for attendees asking good questions, or as (free) raffle prizes. It certainly kept everyone on their toes and happy! I was delighted to see an Ubuntu community member get the Oppo Find 7. I was rewarded with an Anker MP141 Portable Bluetooth Speaker during one talk, for some reason.

On the whole I found the conference to be an incredibly friendly, well organised event. There was plenty of food and drink at break times and coffee and snacks in between with relaxing beers in the evening. A great conference which I’d certainly go to again.

Svetlana Belkin: Thoughts on Having a Meta Open Science Community

Planet Ubuntu - Wed, 2014-10-01 01:30

Over the last week, I started to think about how to improve collaboration between Open Science groups and researchers, and also between the groups themselves. One idea I had was to use the simple tools found in other Open * communities (mainly Open Source/Linux distros): forums (Discourse and others), Planet feeds, and wikis. Using these creates a meta community where members can start out and then get involved in one or more groups. Open Science seems to lack this meta community.

Even though this meta community does not yet exist, I do think there is one group that could maintain it: the Open Knowledge Foundation Network (OKFN). They have a working group for Open Science. If they invest the time and the resources, it could happen; otherwise some other group could be created for this.

What this meta community needs, tool-wise:

Planet Feeds

Since I’m an official Ubuntu Member, I’m allowed to add my blog’s feed to Planet Ubuntu. Planet Ubuntu lets anyone read blog posts from many Ubuntu Members because it’s one giant feed reader. Something like this is badly needed for Open Science, as Reddit doesn’t work for academia. I asked on the Open Science OKFN mailing list and five people e-mailed me saying that they are interested in seeing one. My next goal is to ask the folks of Open Science OKFN for help on building a Planet for Open Science.

Forums

I can only think of one forum, the Mozilla Science Lab one, which I wrote about a few hours ago. Having a general forum lets users discuss anything from various projects to job postings for their groups. I don’t know if Discourse would be the right platform for the forums; to me, its dynamic nature is a bit too much at times.

Wiki

I have no idea if a wiki would work for this meta Open Science community but at least having a guide that introduces newcomers to the groups is worthwhile to have.  There is a plan for a guide.

I hope these ideas can be used by some group within the Open Science community and allow it to grow.


Svetlana Belkin: Mozilla Science Lab Forums Now Open

Planet Ubuntu - Tue, 2014-09-30 22:08

I am pleased to announce that the Mozilla Science Lab now has a forum that anyone can use.  Anyone can introduce themselves in this topic or the category.


Ubuntu Server blog: Server team meeting minutes: 2014-09-30

Planet Ubuntu - Tue, 2014-09-30 19:24
Agenda
  • Review ACTION points from previous meeting
  • rbasak to review mysql-5.6 transition plans with ABI breaks with infinity
  • blueprint updating
  • U Development
  • Server & Cloud Bugs (caribou)
  • Weekly Updates & Questions for the QA Team (psivaa)
  • Weekly Updates & Questions for the Kernel Team (smb, sforshee)
  • Ubuntu Server Team Events
  • Open Discussion
  • Announce next meeting date, time and chair
  • ACTION: meeting chair (of this meeting, not the next one) to carry out post-meeting procedure (minutes, etc) documented at https://wiki.ubuntu.com/ServerTeam/KnowledgeBase
Minutes
  • REVIEW ACTION POINTS FROM PREVIOUS MEETING
    • rbasak noted that, regarding the mysql-5.6 transition / ABI-break action with infinity, we decided to defer the 5.6 move for this cycle, as we felt it was too late given the ABI concerns.
  • UTOPIC DEVELOPMENT
    • LINK: https://wiki.ubuntu.com/UtopicUnicorn/ReleaseSchedule
    • LINK: http://reqorts.qa.ubuntu.com/reports/rls-mgr/rls-u-tracking-bug-tasks.html#ubuntu-server
    • LINK: http://status.ubuntu.com/ubuntu-u/group/topic-u-server.html
    • LINK: https://blueprints.launchpad.net/ubuntu/+spec/topic-u-server
  • SERVER & CLOUD BUGS (CARIBOU)
    • Nothing to report.
  • WEEKLY UPDATES & QUESTIONS FOR THE QA TEAM (PSIVAA)
    • Nothing to report.
  • WEEKLY UPDATES & QUESTIONS FOR THE KERNEL TEAM (SMB, SFORSHEE)
    • smb reports that he is digging into a potential race between libvirt and xen init
  • UBUNTU SERVER TEAM EVENTS
    • None to report.
  • OPEN DISCUSSION
    • Pretty quiet. Not even any bad jokes. Back to crunch time!
  • ANNOUNCE NEXT MEETING DATE AND TIME
    • Next meeting: Tue Oct 7 16:00:00 UTC 2014; chair will be lutostag
  • MEETING ACTIONS
    • ACTION: all to review blueprint work items before next week's meeting.
People present (lines said)
  • beisner (54)
  • smb (8)
  • meetingology (4)
  • smoser (3)
  • rbasak (3)
  • kickinz1 (3)
  • caribou (2)
  • gnuoy (1)
  • matsubara (1)
  • jamespage (1)
  • arges (1)
  • hallyn (1)
IRC Log

Adam Stokes: sosreport (SoS) version 3.2 released

Planet Ubuntu - Tue, 2014-09-30 17:55

The sos team is pleased to announce the release of sos-3.2. This release includes a large number of enhancements and fixes, including:

  • Profiles for plugin selection
  • Improved log size limiting
  • File archiving enhancements and robustness improvements
  • Global plugin options:
    • --verify, --log-size, --all-logs
  • Better plugin descriptions
  • Improved journalctl log capture
  • PEP8 compliant code base
  • oVirt support improvements
  • New and updated plugins: hpasm, ctdb, dbus, oVirt engine hosted, MongoDB, ActiveMQ, OpenShift 2.0, MegaCLI, FCoEm, NUMA, Team network driver, Juju, MAAS, Openstack
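As a sketch of how the new profile support and global plugin options fit together (the `-p` profile flag and the "openstack" profile name are assumptions here; check `sosreport --help` on your version for the exact spelling):

```shell
# Hypothetical invocation: select an "openstack" profile of plugins,
# include all log files, and cap captured log size at 50 MiB per file.
sosreport -p openstack --all-logs --log-size=50
```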


Ubuntu Kernel Team: Kernel Team Meeting Minutes – September 30, 2014

Planet Ubuntu - Tue, 2014-09-30 17:15
Meeting Minutes

IRC Log of the meeting.

Meeting minutes.

Agenda

20140930 Meeting Agenda


Release Metrics and Incoming Bugs

Release metrics and incoming bug data can be reviewed at the following link:

  • http://people.canonical.com/~kernel/reports/kt-meeting.txt


Status: Utopic Development Kernel

The Utopic kernel remains rebased on the v3.16.3 upstream stable kernel. The latest upload to the archive is 3.16.0-19.26. Please test and let us know your results.
Also, Utopic Kernel Freeze is next week on Thurs Oct 9. Any patches submitted after kernel freeze are subject to our Ubuntu kernel SRU policy.
—–
Important upcoming dates:
Thurs Oct 9 – Utopic Kernel Freeze (~1 week away)
Thurs Oct 16 – Utopic Final Freeze (~2 weeks away)
Thurs Oct 23 – Utopic 14.10 Release (~3 weeks away)


Status: CVEs

The current CVE status can be reviewed at the following link:

http://people.canonical.com/~kernel/cve/pkg/ALL-linux.html


Status: Stable, Security, and Bugfix Kernel Updates – Trusty/Precise/Lucid

Status for the main kernels, until today (Sept. 30):

  • Lucid – Verification and Testing
  • Precise – Verification and Testing
  • Trusty – Verification and Testing

    Current opened tracking bugs details:

  • http://kernel.ubuntu.com/sru/kernel-sru-workflow.html

    For SRUs, the SRU report is a good source of information:

  • http://kernel.ubuntu.com/sru/sru-report.html

    Schedule:

    cycle: 19-Sep through 11-Oct
    ====================================================================
    19-Sep Last day for kernel commits for this cycle
    21-Sep – 27-Sep Kernel prep week.
    28-Sep – 04-Oct Bug verification & Regression testing.
    05-Oct – 11-Oct Regression testing & Release to -updates.


Open Discussion or Questions? Raise your hand to be recognized

No open discussion.

Mark Shuttleworth: Fixing the internet for confidentiality and security

Planet Ubuntu - Tue, 2014-09-30 14:24

“The Internet sees censorship as damage and routes around it” was a very motivating tagline during my early forays into the internet. Having grown up in Apartheid-era South Africa, where government control suppressed the free flow of ideas and information, I was inspired by the idea of connecting with people all over the world to explore the cutting edge of science and technology. Today, people connect with peers and fellow explorers all over the world not just for science but also for arts, culture, friendship, relationships and more. The Internet is the glue that is turning us into a super-organism, for better or worse. And yes, there are dark sides to that easy exchange – internet comments alone will make you cry. But we should remember that the brain is smart even if individual brain cells are dumb, and negative, nasty elements on the Internet are just part of a healthy whole. There’s no Department of Morals I would trust to weed ‘em out or protect me or mine from them.

Today, the pendulum is swinging back to government control of speech, most notably on the net. First, it became clear that total surveillance is the norm even amongst Western democratic governments (the “total information act” reborn).  Now we hear the UK government wants to be able to ban organisations without any evidence of involvement in illegal activities because they might “poison young minds”. Well, nonsense. Frustrated young minds will go off to Syria precisely BECAUSE they feel their avenues for discourse and debate are being shut down by an unfair and unrepresentative government – you couldn’t ask for a more compelling motivation for the next generation of home-grown anti-Western jihadists than to clamp down on discussion without recourse to due process. And yet, at the same time this is happening in the UK, protesters in Hong Kong are moving to peer-to-peer mechanisms to organise their protests precisely because of central control of the flow of information.

One of the reasons I picked the certificate and security business back in the 1990s was because I wanted to be part of letting people communicate privately and securely, for business and pleasure. I’m saddened now at the extent to which the promise of that security has been undermined by state pressure and bad actors in the business of trust.

So I think it’s time that those of us who invest time, effort and money in the underpinnings of technology focus attention on the defensibility of the core freedoms at the heart of the internet.

There are many efforts to fix this under way. The IETF is slowly becoming more conscious of the ways in which ideals can be undermined and the central role it can play in setting standards which are robust in the face of such inevitable pressure. But we can do more, and I’m writing now to invite applications for Fellowships at the Shuttleworth Foundation by leaders that are focused on these problems. TSF already has Fellows working on privacy in personal communications; we are interested in generalising that to the foundations of all communications. We already have a range of applications in this regard, and I would welcome more. I’d also like to call attention to the Edgenet effort (distributing network capabilities, based on zero-mq), which is holding a sprint in Brussels October 30-31.

20 years ago, “Clipper” (a proposed mandatory US government back door, supported by the NSA) died on the vine thanks to a concerted effort by industry to show the risks inherent to such schemes. For two decades we’ve had the tide on the side of those who believe it’s more important for individuals and companies to be able to protect information than it is for security agencies to be able to monitor it. I’m glad that today, you are more likely to get into trouble if you don’t encrypt sensitive information in transit on your laptop than if you do. I believe that’s the right side to fight for and the right side for all of our security in the long term, too. But with mandatory back doors back on the table we can take nothing for granted – regulatory regimes can and do change, as often for the worse as for the better. If you care about these issues, please take action of one form or another.

Law enforcement is important. There are huge dividends to a society in which people can make long-term plans, which depends on their confidence in security and safety as much as their confidence in economic fairness and opportunity. But the agencies in whom we place this authority are human and tend over time, like any institution, to be more forceful in defending their own existence and privileges than they are in providing for the needs of others. There has never been an institution in history which has managed to avoid this cycle. For that reason, it’s important to ensure that law enforcement is done by due process; there are no short cuts which will not be abused sooner rather than later. Checks and balances are more important than knee-jerk responses to the last attack. Every society, even today’s modern Western society, is prone to abusive governance. We should fear our own darknesses more than we fear others.

A fair society is one where laws are clear and crimes are punished in a way that is deemed fair. It is not one where thinking about crime is criminal, or one where talking about things that are unpalatable is criminal, or one where everybody is notionally protected from the arbitrary and the capricious. Over the past 20 years life has become safer, not more risky, for people living in an Internet-connected West. That’s no thanks to the listeners; it’s thanks to living in a period when the youth (the source of most trouble in the world) feel they have access to opportunity and ideas on a world-wide basis. We are pretty much certain to have hard challenges ahead in that regard. So for all the scaremongering about Chinese cyber-espionage and Russian cyber-warfare and criminal activity in darknets, we are better off keeping the Internet as a free-flowing and confidential medium than we are entrusting an agency with the job of monitoring us for inappropriate and dangerous ideas. And that’s something we’ll have to work for.

Dustin Kirkland: Apply updates to multiple systems simultaneously using Byobu and Shift-F9

Planet Ubuntu - Tue, 2014-09-30 13:44
A StackExchange question back in February of this year inspired a new feature in Byobu, one that I had been thinking about for quite some time:
Wouldn't it be nice to have a hot key in Byobu that would send a command to multiple splits (or windows)?

This feature was added and is available in Byobu 5.73 and newer (in Ubuntu 14.04 and newer, and in the Byobu PPA for older Ubuntu releases).

I actually use this feature all the time, to update packages across multiple computers.  Of course, Landscape is a fantastic way to do this as well.  But if you don't have access to Landscape, you can always do this very simply with Byobu!

Create some splits, using Ctrl-F2 and Shift-F2, and in each split, ssh into a target Ubuntu (or Debian) machine.

Now, use Shift-F9 to open up the purple prompt at the bottom of your screen.  Here, you enter the command you want to run on each split.  First, you might want to run:

sudo true

This will prompt you for your password in each split, if your sudo credentials are not already cached.  You might need to use Shift-Up, Shift-Down, Shift-Left, and Shift-Right to move around your splits and enter passwords.

Now, update your package lists:

sudo apt-get update

And now, apply your updates:

sudo apt-get dist-upgrade
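The three steps above can also be typed once and sent to every split together. This is just a sketch composing the same commands from this post, chained so that a failed update short-circuits the upgrade (the `-y` flag is my addition here, to answer the upgrade prompt non-interactively):

```shell
# Compose the one-liner: cache sudo credentials, refresh package lists,
# then apply updates. The `&&` chaining stops the run if any earlier
# step fails, so a broken `apt-get update` never reaches the upgrade.
cmd="sudo true && sudo apt-get update && sudo apt-get dist-upgrade -y"
echo "$cmd"   # paste this line into the Shift-F9 prompt
```

Paste the printed command into the Shift-F9 prompt and it runs in every split at once.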

Here's a video to demonstrate!


On a related note, another user-requested feature has been added: simultaneously synchronizing this behavior among all splits.  You'll need the latest version of Byobu, 5.87, which will be in Ubuntu 14.10 (Utopic).  Here, you'll press Alt-F9 and just start typing!  Another demonstration video here...




Cheers,
Dustin

Raphaël Hertzog: My Debian LTS report for September

Planet Ubuntu - Tue, 2014-09-30 13:24

Thanks to the sponsorship of multiple companies, I have been paid to work 11 hours on Debian LTS this month.

I started by doing lots of triage in the security tracker (if you want to help, instructions are here) because I noticed that the dla-needed.txt list (which contains the list of packages that must be taken care of via an LTS security update) was missing quite a few packages that had open vulnerabilities in oldstable.

In the end, I pushed 23 commits to the security tracker. I won’t list the details each time but for once, it’s interesting to let you know the kind of things that this work entailed:

  • I reviewed the patches for CVE-2014-0231, CVE-2014-0226, CVE-2014-0118, CVE-2013-5704 and confirmed that they all affected the version of apache2 that we have in Squeeze. I thus added apache2 to dla-needed.txt.
  • I reviewed CVE-2014-6610 concerning asterisk and marked the version in Squeeze as not affected since the file with the vulnerability doesn’t exist in that version (this entails some checking that the specific feature is not implemented in some other file due to file reorganization or similar internal changes).
  • I reviewed CVE-2014-3596 and corrected the entry that said that it was fixed in unstable. I confirmed that the version in squeeze was affected and added it to dla-needed.txt.
  • Same story for CVE-2012-6153 affecting commons-httpclient.
  • I reviewed CVE-2012-5351 and added a link to the upstream ticket.
  • I reviewed CVE-2014-4946 and CVE-2014-4945 for php-horde-imp/horde3, added links to upstream patches and marked the version in squeeze as unaffected since those concern javascript files that are not in the version in squeeze.
  • I reviewed CVE-2012-3155 affecting glassfish and was really annoyed by the lack of detailed information. I thus started a discussion on debian-lts to see whether this package should not be marked as unsupported security-wise. It looks like we’re going to mark a single binary package as unsupported… the one containing the application server with the vulnerabilities; the rest is still needed to build multiple java packages.
  • I reviewed many CVEs on dbus, drupal6, eglibc, kde4libs, libplack-perl, mysql-5.1, ppp, squid and fckeditor and added those packages to dla-needed.txt.
  • I reviewed CVE-2011-5244 and CVE-2011-0433 concerning evince and came to the conclusion that those had already been fixed in the upload 2.30.3-2+squeeze1. I marked them as fixed.
  • I dropped graphicsmagick from dla-needed.txt because the only CVE affecting it had been marked as no-dsa (meaning that we don’t estimate that a security update is needed, usually because the problem is minor and/or fixing it is more likely to introduce a regression than to help).
  • I filed a few bugs when those were missing: #762789 on ppp, #762444 on axis.
  • I marked a bunch of CVEs concerning qemu-kvm and xen as end-of-life in Squeeze since those packages are not currently supported in Debian LTS.
  • I reviewed CVE-2012-3541 and since the whole report is not very clear I mailed the upstream author. This discussion led me to mark the bug as no-dsa as the impact seems to be limited to some information disclosure. I invited the upstream author to continue the discussion on RedHat’s bugzilla entry.

And when I say “I reviewed” it’s a simplification for this kind of process:

    • Look for a clear explanation of the security issue, for a list of vulnerable versions, and for patches for the versions we have in Debian in the following places:
    • The Debian security tracker CVE page.
    • The associated Debian bug tracker entry (if any).
    • The description of the CVE on cve.mitre.org and the pages linked from there.
    • RedHat’s bugzilla entry for the CVE (which often implies downloading source RPM from CentOS to extract the patch they used).
    • The upstream git repository and sometimes the dedicated security pages on the upstream website.
  • When that was not enough to be conclusive for the version we have in Debian (and unfortunately, it’s often the case), download the Debian source package and look at the source code to verify if the problematic code (assuming that we can identify it based on the patch we have for newer versions) is also present in the old version that we are shipping.
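That last step — searching the old source for the code touched by the upstream patch — can be sketched with a toy example. The directory layout and function name below are placeholders for illustration, not taken from any of the CVEs above; in real triage the tree would come from `apt-get source <package>` and the function name from the patch for the newer version:

```shell
# Simulate an unpacked squeeze source tree, then search it for the
# function that the upstream security patch modifies. A hit means the
# old version likely contains the vulnerable code and needs an update;
# no hit suggests the version is unaffected.
mkdir -p /tmp/pkg-1.0/src
printf 'int parse_header(void) { return 0; }\n' > /tmp/pkg-1.0/src/proto.c
hits=$(grep -rl "parse_header" /tmp/pkg-1.0)
echo "$hits"
```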

CVE triaging is often almost half the work in the general process: once you know that you are affected and that you have a patch, the process to release an update is relatively straightforward (sometimes there’s still work to do to backport the patch).

Once I was over that first pass of triaging, I had already spent more than the 11 hours paid but I still took care of preparing the security update for python-django. Thorsten Alteholz had started the work but got stuck in the process of backporting the patches. Since I’m co-maintainer of the package, I took over and finished the work to release it as DLA-65-1.


Stuart Langridge: The next big thing is privacy

Planet Ubuntu - Mon, 2014-09-29 23:41

The way you beat an incumbent is by coming up with a thing that people want, that you do, and that your competitors can’t do.

Not won’t. Can’t.

How did Apple beat Microsoft? Not by making a better desktop OS. They did it by shifting the goalposts. By creating a whole new field of competition where Microsoft’s massive entrenched advantage didn’t exist: mobile. How did Microsoft beat Digital and the mainframe pushers? By inventing the idea that every desktop should have a real computer on it, not a terminal.

How do you beat Google and Facebook? By inventing a thing that they can’t compete against. By making privacy your core goal. Because companies who have built their whole business model on monetising your personal information cannot compete against that. They’d have to give up on everything that they are, which they can’t do. Facebook altering itself to ensure privacy for its users… wouldn’t exist. Can’t exist. That’s how you win.

If you ask actual people whether they want privacy, they say yes. Always. But if you then ask: are they, are we, prepared to give that privacy up to get things? They say yes again. They, we, want privacy, but not as much as we want stuff. Not as much as we want to talk to one another. Giving up our personal data to enable that, that’s a reasonable cost to pay, because we don’t value our personal data. Some of that’s because there’s no alternative, and some of that’s because nobody’s properly articulated the alternative.

Privacy will define the next major change in computing.

We saw the change to mobile. The change to social. These things fundamentally redefined the way technology looked to the mainstream. The next thing will be privacy. The issue here is that nobody has worked out a way of articulating the importance of privacy which convinces actual ordinary people. There are products and firms trying to do that right now. Look at Blackphone. Look at the recent fertile ground for instant messaging with privacy included from Telegram and Threema and Open Whisper Systems’ TextSecure. They’re all currently basically for geeks. They’re doing the right thing, but they haven’t worked out how to convince real people that they are the right thing.

The company who work out how to convince people that privacy is important will define the next five years of technology.

Privacy, historically the concern of super-geeks, is beginning to poke its head above the parapet. Tim Berners-Lee calls for a “digital Magna Carta”. The EFF tries to fix it and gets their app banned because it’s threatening Google’s business model to have people defend their own data. The desire for privacy is becoming mainstream enough that the Daily Mash are prepared to make jokes about it. Apple declare to the world that they can’t unlock your iPhone, and Google are at pains to insist that they’re the same. We’re seeing the birth of a movement; the early days before the concern of the geeks becomes the concern of the populace.

So what about the ind.ie project?

The ind.ie project will tell you that this is what they’re for, and so you need to get on board with them right now. That’s what they’ll tell you.

The ind.ie project is to open source as Brewdog are to CAMRA. Those of you who are not English may not follow this analogy.

CAMRA is the Campaign for Real Ale: a British society created in the 1970s and still existing today who fight to preserve traditionally made beer in the UK, which they name “real ale” and have a detailed description of what “real ale” is. Brewdog are a brewer of real ale who were founded in 2007. You’d think that Brewdog were exactly what CAMRA want, but it is not so. Brewdog, and a bunch of similar modern breweries, have discovered the same hatred that new approaches in other fields also discovered. In particular, Brewdog have done a superb job at bringing a formerly exclusive insular community into the mainstream. But that insular community feel resentful because people are making the right decisions, but not because they’ve embraced the insular community. That is: people drink Brewdog beer because they like it, and Brewdog themselves have put that beer into the market in such a way that it’s now trendy to drink real ale again. But those drinking it are not doing it because they’ve bought into CAMRA’s reasoning. They like real ale, but they don’t like it for the same reasons that CAMRA do. As Daniel Davies said, every subculture has this complicated relationship with its “trendy” element. From the point of view of CAMRA nerds, who believe that beer isn’t real unless it has moss floating in it, there is a risk that many new joiners are fair-weather friends just jumping on a trendy bandwagon and the Brewdog popularity may be a flash in the pan. The important point here is that the new people are honestly committed to the underlying goals of the old guard (real ale is good!) but not the old guard’s way of articulating that message. And while that should get applause, what it gets is resentment.

Ind.ie is the same. They have, rather excellently, found a way of describing the underlying message of open source software without bringing along the existing open source community. That is, they’ve articulated the value of being open, and of your data being yours without it being sold to others or kept as commercial advantage, but have not done so by pushing the existing open source message, which is full of people who start petty fights over precisely which OS you use and what distribution A did to distribution B back in the mists of prehistory. This is a deft and smart move; people in general tend to agree with the open source movement’s goals, but are hugely turned off by interacting with that existing open source movement, and ind.ie have found a way to have that cake and eat it.

Complaints from open source people about ind.ie are at least partially justified, though. It is not reasonable to sneer at existing open source projects for knowing nothing about users and at the same time take advantage of their work. It is not at all clear how ind.ie will handle a bunch of essential features — reading an SD card, reformatting a drive, categorising applications, storing images, sandboxing apps from one another, connecting to a computer, talking to the cloud — without using existing open source software. The ind.ie project seem confident that they can overlay a user experience on this essential substrate and make that user experience relevant to real people rather than techies; but it is at best disingenuous and at worst frankly offensive to simultaneously mock open source projects for knowing nothing about users and then also depend on their work to make your own project successful. Worse, it ignores the time and effort that companies such as Canonical have put in to user testing with actual people. It’s blackboard economics of the worst sort, and it will have serious repercussions down the line when the ind.ie project approaches one of its underlying open source projects and says “we need this change made because users care” and the project says “but you called us morons who don’t care about users” and so ignores the request. Canonical have suffered this problem with upstream projects, and they were nowhere near as smugly, sneeringly dismissive as ind.ie have been of the open source substrate on which they vitally depend.

However, they, ind.ie, are doing the right thing. The company who work out how to convince people that privacy is important will define the next five years of technology. This is not an idle prediction. The next big wave in technology will be privacy.

There are plenty of companies right now who would say that they’re already all over that. As mentioned above, there’s Blackphone and Threema and Telegram and ello and diaspora. All of them are contributors and that’s it. They’re not the herald who usher in the next big wave. They’re ICQ, or Friends Reunited: when someone writes the History Of Tech In The Late 2010s, Blackphone and ello and Diaspora will be footnotes, with the remark that they were early adopters of privacy-based technology. There were mp3 players before the iPod. There were social networks before Facebook. All the existing players who are pushing privacy as their raison d’etre and writing manifestos are creating an environment which is ripe for someone to do it right, but they aren’t themselves the agent of change; they’re the Diamond Rio who come before the iPod, the ICQ who come before WhatsApp. Privacy hasn’t yet found its Facebook. When it does, that Facebook of privacy will change the world so that we hardly understand that there was a time when we didn’t care about it. They’ll take over and destroy all the old business models and make a new tech universe which is better for us and better for them too.

I hope it comes soon.

Ben Howard: Cloud Images and Bash Vulnerabilities

Planet Ubuntu - Mon, 2014-09-29 17:45
The Ubuntu Cloud Image team has been monitoring the bash vulnerabilities. Due to the scope, impact and high-profile nature of these vulnerabilities, we have published new images. New cloud images addressing the latest bash USN-2364-1 [1, 8, 9] are being released with a build serial of 20140927. These images include code to address all prior CVEs, including CVE-2014-6271 [6] and CVE-2014-7169 [7], and supersede images published in the past week which addressed those CVEs.
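A widely circulated one-line check for the original CVE-2014-6271 issue passes a crafted function definition through the environment. This is the well-known community probe, not an official Canonical check: on a patched bash the trailing command in the variable is ignored, so only the probe's own string appears.

```shell
# On a vulnerable bash, the code after the function body in $x executes
# and "vulnerable" is printed first; a patched bash ignores it, so stdout
# carries only the probe string. Any warnings from a patched bash go to
# stderr and are discarded here.
result=$(env x='() { :;}; echo vulnerable' bash -c 'echo patched' 2>/dev/null)
echo "$result"
```

If the output includes "vulnerable", apply the updates from USN-2364-1 immediately.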

Please note: Securing Ubuntu Cloud Images requires users to regularly apply updates[5]; using the latest Cloud Images alone is insufficient.

Addressing the full scope of the bash vulnerability has been an iterative process. The security team has worked with the upstream bash community to address multiple aspects of the issue. As these fixes have become available, the Cloud Image team has published daily images[2]. New released images[3] have been made available at the request of the Ubuntu Security team.

Canonical has been in contact with our public Cloud Partners to make these new builds available as soon as possible.
Cloud image update timeline

Daily image builds are automatically triggered when new package versions become available in the public archives. New releases for Cloud Images are triggered automatically when a new kernel becomes available. The Cloud Image team will manually trigger new released images when requested by the Ubuntu Security team or when a significant defect requires it.

Please note: Securing Ubuntu cloud images requires that security updates be applied regularly [5]; using the latest available cloud image is not sufficient in itself. Cloud Images are built only after updated packages are made available in the public archives. Since it takes time to build the images, test/QA, and finally promote them, there is a delay (sometimes considerable) between public availability of a package and updated Cloud Images. Users should consider this timing in their update strategy.
[1] http://www.ubuntu.com/usn/usn-2364-1/
[2] http://cloud-images.ubuntu.com/daily/server/
[3] http://cloud-images.ubuntu.com/releases/
[4] https://help.ubuntu.com/community/Repositories/Ubuntu/
[5] https://wiki.ubuntu.com/Security/Upgrades/
[6] http://people.canonical.com/~ubuntu-security/cve/2014/CVE-2014-6271.html
[7] http://people.canonical.com/~ubuntu-security/cve/2014/CVE-2014-7169.html
[8] http://people.canonical.com/~ubuntu-security/cve/2014/CVE-2014-7187.html
[9] http://people.canonical.com/~ubuntu-security/cve/2014/CVE-2014-7186.html
