news aggregator

Valorie Zimmerman: Counting the days until Akademy!

Planet Ubuntu - Sat, 2014-08-23 02:49
It seems so soon after returning home from Randa and Geneva, but already the day of departure to Vienna and then Brno looms. So excited! For starters, both Scarlett and I got funding from Ubuntu so the e.V. is spared the cost of our travel! I've often felt guilty about how much airfare from Seattle cost for previous meetings. We're having a Kubuntu gathering on Thursday the 11th of September. Ping us if you have an issue you want discussed or worked on.

Also, Scarlett and I will be traveling together, which will be fun. And we're meeting Stefan Derkits in Vienna, to see some of his favorite places. Oh, a whole day in Vienna seems like heaven. We have a hostel booked; I hope it's nice. Now I need to figure out the bus or train from Vienna <> Brno.


Get your own banner at https://community.kde.org/Akademy/2014/badges
Then there is the e.V. annual meeting, which I have enjoyed since I was admitted to membership. It is great to hear the reports personally, and to meet people I usually only hear from in email or IRC.

Finally, there is Akademy, which is always a blur of excitement, learning, socializing, and interacting with the amazing speakers. My favorite part is always hearing from the GSoC students about their projects, and their experience in the KDE community. After Akademy proper, there are days of BOFs, and our Kubuntu meeting. This part is often the most energizing, as each meeting is like a small-scale sprint.

Of course we do take some time to walk through the city, and eat out, and party a bit. Face-to-face meetings are the BEST! Sometimes we return home exhausted and jetlagged, but it is always worth it. KDE is a community, and our annual gathering is one important way for us to nurture that community. This energizes the entire next year of creating amazing software.

An extra-special part of Akademy this year is that we are planning to release our new KDE Frameworks 5 Cookbook at Akademy. Get some while they're hot!

Ben Howard: Archive-triggered Cloud Image Builds

Planet Ubuntu - Fri, 2014-08-22 18:11
For years, the Ubuntu Cloud Images have been built on a timer (i.e. a cronjob or Jenkins). You can reasonably expect stable and LTS releases to be built twice a week, while our development release is built once a day. Each of these builds is given a serial in the form of YYYYMMDD.

While time-based building has proven to be reliable, different build serials may be functionally the same, just put together at a different point in time. Many of the builds that we do for stable and LTS releases are pointless.

When the whole Heartbleed fiasco hit, it put the Cloud Image team into overdrive, since it required manually triggering builds for the LTS releases. When we manually trigger builds, it takes roughly 12-16 hours to build, QA, test and release new Cloud Images. Sure, most of this is automated, but the process had to be manually started by a human. This got me thinking: there has to be a better way.

What if we build the Cloud Images when the package set changes?

With that, I changed the Ubuntu 14.10 (Utopic Unicorn) build process from time-based to archive-trigger-based. Now, instead of building every day at 00:30 UTC, a build starts when the archive has been updated and the packages in the prior cloud image build are older than the archive versions. In the last three days, there were eight builds for Utopic. For a development version of Ubuntu, this just means that developers don't have to wait 24 hours for the latest package changes to land in a Cloud Image.
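
As a rough illustration of the idea (this is not the actual build infrastructure, and the manifest URL, suite and tools below are only one hypothetical way to check), you could compare the package manifest published alongside the previous image against the current archive and rebuild only when something is out of date:

# hypothetical sketch: decide whether a rebuild is worthwhile (slow and purely illustrative)
wget -qO last-build.manifest http://cloud-images.ubuntu.com/utopic/current/utopic-server-cloudimg-amd64.manifest
while read pkg ver; do
    # ask the Ubuntu archive for the current utopic version of this package
    archive=$(rmadison -u ubuntu -s utopic "$pkg" | awk -F'|' 'NR==1 {gsub(/ /,"",$2); print $2}')
    if [ -n "$archive" ] && dpkg --compare-versions "$ver" lt "$archive"; then
        echo "$pkg: image has $ver, archive has $archive -- trigger a new build"
    fi
done < last-build.manifest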

Over the next few weeks, I will be moving the 10.04 LTS, 12.04 LTS and 14.04 LTS build processes from time-based to archive-trigger-based. While this might result in less frequent builds, the main advantage is that the builds that do happen will contain the latest package sets. And if you are trying to respond to the latest CVE, or waiting on a bug fix to land, it likely means that you'll have a fresh daily that you can use the following day.

Jonathan Riddell: Do you need to be brain damaged to care about desktop Linux? and Kubuntu day at Akademy

Planet Ubuntu - Fri, 2014-08-22 16:34
KDE Project:

After sold-out dates in Glasgow and Belgium, the tour of my dramatic talk "Do you need to be brain damaged to care about desktop Linux?" is making a stop in Brno for the KDE Conference Akademy. In it I'll talk about the struggles of recovering from a head injury, mixed with creating a beautiful and friendly Linux distro: Kubuntu. It'll have drama, it'll have emotion, it'll have a discussion of the relative merits of community against in-house development. Make sure you book your tickets now!

Also at Akademy is the Kubuntu day on Thursday, sign up now if you want to come and talk about your ideas or grumble about your problems with Kubuntu. Free hugs will be in store.

Paul Tagliamonte: On my way to DebConf 14

Planet Ubuntu - Fri, 2014-08-22 15:33

Slowly, but I’ll be in by Tonight, PST (early morning EST!)

Hope to see everyone soon!

Dustin Kirkland: Call for Testing: Docker 1.0.1 in Ubuntu 14.04 LTS (Trusty)

Planet Ubuntu - Fri, 2014-08-22 14:21

Docker 1.0.1 is available for testing in Ubuntu 14.04 LTS!

Docker 1.0.1 has landed in the trusty-proposed archive, which we hope to SRU to trusty-updates very soon. We would love to have your testing feedback, to ensure that both upgrades from Docker 0.9.1 and new installs of Docker 1.0.1 behave well, and are of the highest quality you have come to expect from Ubuntu's LTS (Long Term Support) releases! Please file any bugs or issues here.

Moreover, this new version of the Docker package now installs the Docker binary to /usr/bin/docker, rather than /usr/bin/docker.io in previous versions. This should help Ubuntu's Docker package more closely match the wealth of documentation and examples available from our friends upstream.

A big thanks to Paul Tagliamonte, James Page, Nick Stinemates, Tianon Gravi, and Ryan Harper for their help upstream in Debian and in Ubuntu to get this package updated in Trusty!  Also, it's probably worth mentioning that we're targeting Docker 1.1.2 (or perhaps 1.2.0) for Ubuntu 14.10 (Utopic), which will release on October 23, 2014.

Here are a few commands that might help your testing...
Check What Candidate Versions are Available

$ sudo apt-get update
$ apt-cache show docker.io | grep ^Version:

If that shows 0.9.1~dfsg1-2 (as it should), then you need to enable the trusty-proposed pocket.
$ echo "deb http://archive.ubuntu.com/ubuntu/ trusty-proposed universe" | sudo tee -a /etc/apt/sources.list
$ sudo apt-get update
$ apt-cache show docker.io | grep ^Version:

And now you should see the new version, 1.0.1~dfsg1-0ubuntu1~ubuntu0.14.04.1, available (probably in addition to 0.9.1~dfsg1-2).

Upgrades

Check if you already have Docker installed, using:

$ dpkg -l docker.io

If so, you can simply upgrade.

$ sudo apt-get upgrade

And now, you can check your Docker version:

$ sudo dpkg -l docker.io | grep -m1 ^ii | awk '{print $3}'
0.9.1~dfsg1-2

New Installations

You can simply install the new package with:
$ sudo apt-get install docker.io

And ensure that you're on the latest version with:
$ dpkg -l docker.io | grep -m1 ^ii | awk '{print $3}'
1.0.1~dfsg1-0ubuntu1~ubuntu0.14.04.1
Running Docker

If you're already a Docker user, you probably don't need these instructions.  But in case you're reading this, and trying Docker for the first time, here's the briefest of quick start guides :-)
$ sudo docker pull ubuntu
$ sudo docker run -i -t ubuntu /bin/bash

And now you're running a bash shell inside of an Ubuntu Docker container.  And only bash!
root@1728ffd1d47b:/# ps -ef
UID PID PPID C STIME TTY TIME CMD
root 1 0 0 13:42 ? 00:00:00 /bin/bash
root 8 1 0 13:43 ? 00:00:00 ps -ef

If you want to do something more interesting in Docker, well, that's a whole other post ;-)
:-Dustin

Zygmunt Krynicki: Live coding videos

Planet Ubuntu - Fri, 2014-08-22 12:01

Today I was experimenting with coding live, on air with Google Hangouts. It is an interesting idea IMHO, as it adds visibility to a process that is done in the open but rarely done transparently, in a way others can watch and learn from.

I've recorded two videos today: Live coding: adding a man page for the new category unit and Live coding: fixing bug https://bugs.launchpad.net/checkbox-ng/+bug/1360125. If you want to see how I work (including all the mistakes I make :-) do watch them and give me feedback so that I can learn and get better at it.

Mohamad Faizul Zulkifli: How To Adjust Mouse/Touchpad Scroll Speed On Ubuntu

Planet Ubuntu - Fri, 2014-08-22 11:30



Thanks to nicknorton, just follow his instructions. Watch his video for more guided info.


  1. Install imwheel using whatever package manager you use.
  2. Debian based distros: sudo apt-get install imwheel
  3. Download the script http://www.nicknorton.net/mousewheel.sh
  4. Save it into your home folder, make it executable. Run it and enjoy.
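
If you would rather see what the trick boils down to than run the downloaded script, the core of it is an ~/.imwheelrc that multiplies each wheel click. This is a minimal hand-written sketch, not the contents of nicknorton's script, and the multiplier of 3 is only an example:

cat > ~/.imwheelrc << 'EOF'
".*"
None, Up,   Button4, 3
None, Down, Button5, 3
EOF
imwheel --kill --buttons "4 5"

Raise or lower the last number on each line to taste, then re-run imwheel.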

Valorie Zimmerman: Learning to git

Planet Ubuntu - Fri, 2014-08-22 09:55
A few years ago, I learned from Myriam's fine blog how to build Amarok from source, which is kept in git. It sounds mysterious, but once all the dependencies are installed, PATH is defined and the environment is properly set up, it is extremely easy to refresh the source (git pull) and rebuild. In fact, I usually use the up-arrow in the konsole, which finds the previous commands, so I rarely have to even type anything! Just hit return when the proper command is in place.

Now we're using git for the KDE Frameworks book, so I learned not only how to pull new or changed source files, but also how to commit my own new or edited files locally, then push those commits to git, so others can see and use them.

To be able to write to the repository, an SSH key must be uploaded, in this case via the KDE Identity account. If the Identity account is not yet a developer account, that status must first be granted.

Just as in building Amarok, first the folders need to be created, and the repository cloned. Once cloned, I can see either in konsole or Dolphin the various files. It's interesting to me to poke around in most of them, but the ones I work in are markdown files, which is a type of text file. I can open them in kate (or your editor of choice) either from Dolphin or directly from the cli (for instance kate ki18n/ki18n.in.md).

Once edited, save the file, then it's time to commit. If there are a number of files to work on, they can be all committed at once. git commit -a is the command you need. Once you hit return, you will be immediately put into nano, a minimal text editor. Up at the top, you will see it is waiting for your commit message, which is a short description of the file or the changes you have made. Most of my commits have said something like "Edited for spelling and grammar." Once your message is complete, hit Control X, and y and return to save your changes.
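
Pulled together, the steps so far look roughly like this in konsole (the repository address is the kf5book one that appears in the push output below, and the file name is just the example from above):

git clone git@git.kde.org:kf5book      # clone the book repository; requires your SSH key on KDE Identity
cd kf5book
kate ki18n/ki18n.in.md                 # edit and save in your editor of choice
git commit -a                          # nano opens for the commit message; Control X, y, return to finish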

It's a good idea to do another git pull just to be sure no one else has pushed a conflicting file while the commit message was being crafted, then git push. At this point the passphrase for the ssh key is asked for; once that is typed and you hit return, you'll get something like the following:

Counting objects: 7, done.                                                                                                                                                                              
Delta compression using up to 8 threads.                                                                                                                                                                
Compressing objects: 100% (4/4), done.                                                                                                                                                                  
Writing objects: 100% (4/4), 462 bytes | 0 bytes/s, done.                                                                                                                                                
Total 4 (delta 2), reused 1 (delta 0)                                                                                                                                                                    
remote: This commit is available for viewing at:
remote: http://commits.kde.org/kf5book/90c863e4ee2f82e4d8945ca74ae144b70b9e9b7b
To git@git.kde.org:kf5book                                                                                                                                                                              
   1d078fe..90c863e  master -> master                                                                                                                                                                    
valorie@valorie-HP-Pavilion-dv7-Notebook-PC:~/kde/book/kf5book$
In this case, the new file is now part of the KDE Frameworks 5 book repository. Git is a really nifty way to keep files of any sort organized and backed up. I'm really happy that we decided to develop the book using this powerful tool.

Jorge Castro: The Ubuntu Steam box, one year later

Planet Ubuntu - Thu, 2014-08-21 23:02

It’s been about a year since I started building my own Steam console for my living room. A ton has changed since then. SteamOS has been released, In Home Streaming is out of beta and generally speaking the living room experience has gotten a ton better.

This blog post will be a summary of what’s changed in the past year, in the hopes that it will help someone who might be interested in building their own “next-gen console” for about the same price, and take advantage of nicer hardware and all the things that PC gaming has to offer.

Step 1: Choosing the hardware
  • I consider the NVIDIA GTX 750Ti to be the best thing to happen in hardware for this sort of project. It’s based on their newest Maxwell technology so it runs cool, it does not need a special power supply plug, and it’s pretty small. It’s also between $120-$150 – which means nearly any computer is now capable of becoming a game console. And a competent one at that.

  • I have settled on the Cooler Master 110 case, which is one of the least obnoxious PC cases you can find that won’t look too bad in the living room. Unfortunately Valve’s slick-looking case did not kick the case makers into making awesome-looking living room style cases. The closest you can find is the Silverstone RVZ01, which has the right insides, but they ruined the outside with crazy plastic ribs. The Digital Storm Bolt II looks great, but you can’t buy the case separately. Both cases have CD drives for some reason, boo!

  • Nvidia has a great guide on building a PC within the console-price range if you want to look around. I also recommend checking out r/buildapc, which has tons of Mini-ITX/750Ti builds.

  • Other alternatives are the excellent Intel NUC and Gigabyte Brix. These make for great portable machines, but for the upcoming AAA titles for Linux like Metro Redux, Star Citizen, and so on, I decided to go with a dedicated graphics card. Gigabyte makes a very interesting model that is the size of a NUC, but with a GTX 760(!). This looks to be ideal, but unfortunately when Linus reviewed it he found heat/throttling issues. When they make a Maxwell-based one of these it will likely be awesome.

  • Don’t forget the controller. The Xbox wireless ones will work out of the box. I recommend avoiding the off-brand dongles you see on Amazon, they can be hit or miss.

Step 2: Choosing the software

I’ve been using SteamOS since it came out. The genius of SteamOS is that fundamentally it does only 2 things: it boots, and then runs Steam Big Picture Mode (BPM). This means that for a dedicated console, the OS is really not important. I have 2 drives in the box, one with SteamOS, and one with Ubuntu running BPM. After running both I prefer Ubuntu/Steam to SteamOS:

  • Faster boot (Upstart v. SysV)
  • PPAs enable fresh access to new Nvidia drivers and Plex Home Theater
  • Newer kernels and access to HWE kernels over the next 5 years

I tend to alternate between the two, but since I am more familiar with Ubuntu, it is easier for me to use, so the rest of this post will cover how to build a dedicated Steam box using Ubuntu.

This isn’t to say SteamOS is bad, in fact, setting it up is actually easier than doing the next few steps; remember that the entire point is to not care about the OS underneath, and get you into Steam. So build whatever is most comfortable for you!

Step 3: Installation

These are the steps I am currently doing. It’s not for beginners, you should be comfortable admining an Ubuntu system.

  • Install Ubuntu 14.04.
  • (Optional) - Install openssh-server. I don’t know about you but lugging a keyboard/mouse back and forth to my living room is not my idea of a good time. I prefer to sit on the couch, and ssh into the box from my laptop.
  • Add the xorg-edgers PPA. You don’t need this per se, but let’s go all in!
  • Install the latest Nvidia drivers: As of this writing, nvidia-graphics-drivers-343.
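
The last two steps might look roughly like the following; the binary package name here is an assumption based on the driver series mentioned above, so list what the PPA actually provides before installing:

sudo add-apt-repository ppa:xorg-edgers/ppa
sudo apt-get update
sudo apt-get install nvidia-343    # package name is a guess; check the PPA for its current nvidia-* packages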

After you’ve installed the drivers and all the security updates you should reboot to get to your nice new clean desktop system. Now it’s time to make it a console:

  • Log in, and install Steam. Log into steam, make sure it works.
  • Add the Marc Deslaurier’s SteamOS packages PPA. These are rebuilt for Ubuntu and he does a great job keeping them up to date.
  • sudo apt-get install steamos-compositor steamos-modeswitch-inhibitor steamos-xpad-dkms
  • Log out, and in the login screen, click on the Ubuntu symbol by the username and select the Steam session. This will get you the dedicated Steam session. Make sure that works. Exit out of that and now let’s make it so we can boot into that new session by default
  • Enable autologin in LightDM after the fact so that when your machine boots it goes right into Steam’s Big Picture mode.
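
A minimal sketch of that autologin step, assuming your login is steam-user and that the steamos-compositor package installed a session named steam (both names are placeholders; check /usr/share/xsessions/ for the real session name):

sudo tee /etc/lightdm/lightdm.conf.d/50-steam-autologin.conf << 'EOF'
[SeatDefaults]
autologin-user=steam-user
autologin-user-timeout=0
user-session=steam
EOF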

We’re using Valve’s xpad module instead of xboxdrv because that’s what they use in SteamOS and I don’t want to deviate too much. But if you prefer xboxdrv, then follow this guide.

  • Steam updates itself at the client level, so there’s no need to worry about that; the final step for a console-like experience is to enable automatic updates. Remember you’re using PPAs, so if you’re not confident that you can fix things, just leave it and do maintenance by hand every once in a while.
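
One way to handle that last point, and only a sketch of one of several options, is Ubuntu's unattended-upgrades package; note that out of the box it only installs security updates, so packages from the PPAs above will still need an occasional manual apt-get upgrade:

sudo apt-get install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades    # answer "Yes" to turn on automatic updates
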
Step 4: Home Theater Bling

If you’re going to have a nice living room box, then let’s use it for other things. I have a dedicated server with media that I share out with Plex Media Server, so in this step I’ll install the client side.

Plex Home Theater:

sudo add-apt-repository ppa:plexapp/plexht
sudo add-apt-repository ppa:pulse-eight/libcec
sudo apt-get update && sudo apt-get install plexhometheater

In Steam you can then click on the + symbol, Add a non-steam game, and then add Plex. Use the gamepad (not the stick) to navigate the UI once you launch it. If you prefer XBMC/Kodi you can install that instead. I found that the controller also works out of the box there, so it’s a nice experience no matter which one you choose.

Step 5: In Home Streaming

This is a killer Steam feature that allows you to stream your Windows games to your new console. It’s very straightforward: just have both machines on and logged into Steam on the same network, they will autodiscover each other, your Windows games will show up in your Ubuntu/Steam UI, and you can stream them. Though it works surprisingly well over wireless, you’ll definitely want to ensure you’ve got gigabit ethernet if you want to stream games at 1080p and 60 frames per second.

Conclusion

And that’s basically it! There’s tons of stuff I’ve glossed over, but these are the basic steps. There are lots of little things you can do, like removing a bunch of desktop packages you won’t need (so you don’t have to download and update them) and other tips and tricks. I’ll try to keep everyone up to date on how it’s going.

Enjoy your new next-gen gaming console!

TODO:

  • You can change out the plymouth theme to use the one for SteamOS - but I have an SSD in the box and combined with the fast boot it never comes up for me anyway.
  • It’d be cool to make a prototype of Ubuntu Core and then provide Steam in an LXC container on top of that so we don’t have to use a full blown desktop ISO.

Ubuntu Podcast from the UK LoCo: S07E21 – The One with the Rumour

Planet Ubuntu - Thu, 2014-08-21 20:23

Laura Cowen, Alan Pope, and Mark Johnson are in Studio L for Season Seven, Episode Twenty-One of the Ubuntu Podcast!

 Download OGG | Download MP3

In this week’s show:-

We’ll be back next week, when we’ll be interviewing Daniel Holbach, and we’ll go through your feedback.

Please send your comments and suggestions to: podcast@ubuntu-uk.org
Join us on IRC in #uupc on Freenode
Leave a voicemail via phone: +44 (0) 203 298 1600, sip: podcast@sip.ubuntu-uk.org and skype: ubuntuukpodcast
Follow us on Twitter
Find our Facebook Fan Page
Follow us on Google+

Michael Hall: Communicating Recognition

Planet Ubuntu - Thu, 2014-08-21 13:00

Recognition is like money: it only really has value when it’s being passed between one person and another. Otherwise it’s just potential value, sitting idle. Communication gives life to recognition, turning its potential value into real value.

As I covered in my previous post, Who do you contribute to?, recognition doesn’t have a constant value.  In that article I illustrated how the value of recognition differs depending on who it’s coming from, but that’s not the whole story.  The value of recognition also differs depending on the medium of communication.

Over at the Community Leadership Knowledge Base I started documenting different forms of communication that a community might choose, and how each medium has a balance of three basic properties: Speed, Thoughtfulness and Discoverability. Let’s call this the communication triangle. Each of these also plays a part in the value of recognition.

Speed

Again, much like money, recognition is something that is circulated. Its usefulness is not simply created by the sender and consumed by the receiver, but rather passed from one person to another, and then another. The faster you can communicate recognition around your community, the more utility you can get out of even a small amount of it. Fast communications, like IRC, phone calls or in-person meetups, let you give and receive a higher volume of recognition than slower forms, like email or blog posts. But speed is only one part, and faster isn’t necessarily better.

Thoughtfulness

Where speed emphasizes quantity, thoughtfulness is a measure of the quality of communication, and that directly affects the value of recognition given. Thoughtful communications require consideration upon both receiving and replying. Messages are typically longer, more detailed, and better presented than those that emphasize speed. As a result, they are also usually a good bit slower too, both in the time it takes for a reply to be made, and also the speed at which a full conversation happens. An IRC meeting can be done in an hour, where an email exchange can last for weeks, even if both end up with the same word-count at the end.

Discoverability

The third point on our communication triangle, discoverability, is a measure of how likely it is that somebody not immediately involved in a conversation can find out about it. Because recognition is a social good, most of its value comes from other people knowing who has given it to whom. Discoverability acts as a multiplier (or divisor, if done poorly) to the original value of recognition.

There are two factors to the discoverability of communication. The first, accessibility, is about how hard it is to find the conversation. Blog posts, or social media posts, are usually very easy to discover, while IRC chats and email exchanges are not. The second factor, longevity, is about how far into the future that conversation can still be discovered. A social media post disappears (or at least becomes far less accessible) after a while, but an IRC log or mailing list archive can stick around for years. Unlike the three properties of communication, however, these factors to discoverability do not require a trade off, you can have something that is both very accessible and has high longevity.

Finding Balance

Most communities will have more than one method of communication, and a healthy one will have a combination of them that complement each other. This is important because sometimes one will offer a more productive use of your recognition than another. Some contributors will respond better to lots of immediate recognition, rather than a single eloquent one. Others will respond better to formal recognition than informal. In both cases, be mindful of the multiplier effect that discoverability gives you, and take full advantage of opportunities where that plays a larger than usual role, such as during an official meeting or when writing an article that will have higher than normal readership.

Jorge Castro: Free Official Ubuntu Books for Local Teams

Planet Ubuntu - Wed, 2014-08-20 17:16

Prentice Hall has just released the 8th Ed. of “The Official Ubuntu Book”, authored by Matthew Helmke and Elizabeth K. Joseph with José Antonio Rey, Philip Ballew and Benjamin Mako Hill.

This is the book’s first update in 2 years and as the authors state in their Preface, “…a large part of this book has been rewritten—not because the earlier editions were bad, but because so much has happened since the previous edition was published. This book chronicles the major changes that affect typical users and will help anyone learn the foundations, the history, and how to harness the potential of the free software in Ubuntu.”

As with prior editions, publisher Prentice Hall has kindly offered to ship approved LoCo teams each (1) free copy of this new edition. To keep this as simple as possible, you can request your book by following these steps. The team contact shown on our LoCo Team List (and only the team contact) should send an email to Heather Fox at heather.fox@pearson.com and include the following details:

  • Your full name
  • Which team you are from
  • If your team resides within North America, please provide: Your complete street address (the book will ship by UPS)
  • If your team resides outside North America, you will first be emailed a voucher code to download the complete eBook bundle from the publisher site, InformIT, which includes the ePub/mobi/pdf files.

If your team does reside outside North America and you wish to be considered for a print copy, please provide:

Your complete street address, region, country AND IMPORTANT: Your phone number, including country and area code. (Pearson will make its best effort to arrange shipment through its nearest corporate office.)

A few notes:

  • Only approved teams are eligible for a free copy of the book.
  • Only the team contact for each team can make the request for the book.
  • There is a limit of (1) copy of each book per approved team.
  • Prentice Hall will cover postage, but not any import tax or other shipping fees.
  • When you have the books, it is up to you what you do with them. We recommend you share them between members of the team. LoCo Leaders: please don’t hog them for yourselves!

If you have any questions or concerns, please directly contact Pearson/Prentice Hall’s Heather Fox at heather.fox@pearson.com. Also, for those teams that are not approved or are yet to be approved, you can still score a rather nice 35% discount on the books by registering your LoCo with the Pearson User Group Program.

Svetlana Belkin: The Certificate of Ubuntu Membership

Planet Ubuntu - Wed, 2014-08-20 17:10

Today, in the mail, came my Certificate of Ubuntu Membership that I requested back in February. The photo was taken with Ubuntu Touch on my Nexus 7 (2013).


Kubuntu: KDE Applications and Development Platform 4.14

Planet Ubuntu - Wed, 2014-08-20 15:11

Packages for the release of KDE SC 4.14 are available for Kubuntu 14.04 LTS and our development release. You can get them from the Kubuntu Backports PPA, which also includes an update of Plasma Desktop to 4.11.11.
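
If you have not used the Backports PPA before, the upgrade usually looks something like this (the PPA name below is the standard Kubuntu Backports location; verify it on Launchpad before adding it):

sudo add-apt-repository ppa:kubuntu-ppa/backports
sudo apt-get update
sudo apt-get dist-upgrade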

Bugs in the packaging should be reported to kubuntu-ppa on Launchpad. Bugs in the software to KDE.

Ubuntu Women: Calling For Testers for Orientation Quiz

Planet Ubuntu - Wed, 2014-08-20 11:56

The Ubuntu Women Project is pleased to present an orientation quiz aimed at helping newcomers to the Ubuntu community find their niche and get involved. The base quiz was taken from the Ubuntu Italian LoCo. Our plan is to put this quiz on community.ubuntu.com, but we are seeking testers for it first!

How to test it:

Go to the page where the quiz is and play around with answering the questions.  If you find an issue, please e-mail the Ubuntu Women Mailing-List at ubuntu-women@lists.ubuntu.com.  If you want to see the code, you may ask Lyz at lyz@ubuntu.com or me at belkinsa@ubuntu.com.

Jonathan Riddell: Qt Licence Update

Planet Ubuntu - Wed, 2014-08-20 09:27
KDE Project:

Today Qt announced some changes to their licence. The KDE Free Qt team have been working behind the scenes to make these happen and we should be very thankful for the work they put in. Qt code was LGPLv2.1 or GPLv3 (this also allows GPLv2). Existing modules will add LGPLv3 to that. This means I can get rid of the part of the KDE Licensing Policy which says "Note: code may not be copied from Qt into KDE Platform as Qt is LGPLv2.1 only which would prevent it being used under LGPL 3".

New modules, starting with the new web module QtWebEngine (which uses Blink), will be LGPLv3 or GPLv2. Getting rid of LGPLv2.1 means better preserving our freedoms (can't use patents to restrict, must allow reverse engineering, must allow replacing Qt, etc). It's not a problem for the new Qt modules to link to LGPLv2 or LGPLv2+ libraries or applications of any licence (as long as they allow the freedoms needed, such as those listed above). One problem with LGPLv3 is that you can't link a GPLv2-only application to it (not because LGPLv3 prevents it but because GPLv2 prevents it), but this is not a problem here because it will be dual licenced as GPLv2 alongside.

The main action this prevents is directly copying code from the new Qt modules into Frameworks, but as noted above we forbid doing that anyway.

With the news that Qt moved to Digia and that a new company is being spun out, I had been slightly worried that the new modules would be restricted further to encourage more commercial licences of Qt. This is indeed the case, but it's being done in the best possible way; thanks, Digia.

Benjamin Kerensa: Mozilla and Open Diversity Data

Planet Ubuntu - Wed, 2014-08-20 05:28

I have been aware of the Open Diversity Data project for a while. It is the work of the wonderful members of Double Union and their community of awesome contributors. Recently, a Mozillian tweeted that Mozilla should release its diversity data. It is also my understanding that a discussion happened internally and, for whatever reason, a full release of Mozilla’s diversity data did not result, although some numbers are available here.

Anyways, I’m now going to bring this suggestion up again and encourage that both Mozilla Corporation and Mozilla Foundation release individual diversity data reports in the form of some numbers, graphs and a blog post and perhaps a combined one of both orgs.

I would encourage other Mozillians to support the push for opening this data by sharing this blog post on social media as an indicator of support for Open Diversity Data publishing by Mozilla, or by retweeting this.

I really think our Manifesto encourages us to support initiatives like this; specifically principle number two of our manifesto. If other companies (Kudos!) that are less transparent than Mozilla can do it then I think we have to do this.

Finally, I would like to encourage Mozilla to consider creating a position of VP of Diversity and Inclusion to oversee our various diversity and inclusion efforts and to help plan and create a vision for future efforts at Mozilla. Sure we have already people who kind of do this but it is not their full-time role.

Anyways that’s all I have on this…

Svetlana Belkin: Ohio Team Wiki Update/Clean Up

Planet Ubuntu - Tue, 2014-08-19 19:23

As my recent Ubuntu-related project, and a very overdue task on my To Do list, I worked on updating the Ohio Team Wiki pages (I took some ideas from how the Doc Team Wiki pages look):

  • I removed the unneeded pages after I had approval from Stephen Michael Kellat
  • I lumped similar pages such as the agendas to the meetings and the various events into main pages (/Meetings and /Events) and linked those two main pages on the home page
  • I only have two things left to do, which are to fix the banner and find all of the missing records from the mailing list

Hopefully it is cleaner and easier to get the information that one needs.


Bodhi.Zazen: Music practice

Planet Ubuntu - Tue, 2014-08-19 18:26

With the advent of YouTube, there is a plethora of music “lessons” available on the internet. When learning new riffs, however, it is helpful to be able to alter the speed of playback and play selected sections of the lesson.

For some time I have been using audacity, which has the advantage of cross platform availability. However, audacity is a bit of overkill and I find it a bit slow at times.

In addition, when selecting a particular segment within the lesson, skipping dialog or parts already mastered, audacity is a bit “clunky” and somewhat time-consuming. Alternatively, one can splice the lessons with ffmpeg, again somewhat time-consuming.

Recently I came across a simple, no-frills, lightweight solution, “Play it slowly”.

Home page

Download (github)

Play it slowly is a lightweight application with a simple, clean interface. It is simple to use and has basic features such as:

  1. Slow the speed on playback without altering pitch.
  2. Easily mark, move, and reset sections of a track for playback.
  3. Easy to start/stop/restart playback.

Play it slowly is in the Debian and Ubuntu repositories:

sudo apt-get install playitslowly

For Fedora, first install the dependencies:

yum install gstreamer-python gstreamer-plugins-bad-extras

Download the source code from the above link (version 1.4.0 at the time of this writing)

Extract the tarball and install

tar xvzf playitslowly-1.4.0.tar.gz
cd playitslowly-1.4.0
sudo python setup.py install

For additional options see the README or run:

python setup.py --help

Ubuntu Kernel Team: Kernel Team Meeting Minutes – August 19, 2014

Planet Ubuntu - Tue, 2014-08-19 17:17
Meeting Minutes

IRC Log of the meeting.

Meeting minutes.

Agenda

20140819 Meeting Agenda


Release Metrics and Incoming Bugs

Release metrics and incoming bug data can be reviewed at the following link:

http://people.canonical.com/~kernel/reports/kt-meeting.txt


Status: Utopic Development Kernel

The Utopic kernel has been rebased to the first v3.16.1 upstream stable
kernel and uploaded to the archive, i.e. linux-3.16.0-9.14. Please test
and let us know your results.
—–
Important upcoming dates:
Thurs Aug 21 – Utopic Feature Freeze (~2 days away)
Mon Sep 22 – Utopic Final Beta Freeze (~5 weeks away)
Thurs Sep 25 – Utopic Final Beta (~5 weeks away)
Thurs Oct 9 – Utopic Kernel Freeze (~7 weeks away)
Thurs Oct 16 – Utopic Final Freeze (~8 weeks away)
Thurs Oct 23 – Utopic 14.10 Release (~9 weeks away)


Status: CVE’s

The current CVE status can be reviewed at the following link:

http://people.canonical.com/~kernel/cve/pkg/ALL-linux.html


Status: Stable, Security, and Bugfix Kernel Updates – Trusty/Saucy/Precise/Lucid

Status for the main kernels, until today (Aug. 19):

  • Lucid – verification & testing
  • Precise – verification & testing
  • Trusty – verification & testing

    Current opened tracking bugs details:

  • http://kernel.ubuntu.com/sru/kernel-sru-workflow.html

    For SRUs, SRU report is a good source of information:

  • http://kernel.ubuntu.com/sru/sru-report.html

    Schedule:

    cycle: 08-Aug through 29-Aug
    ====================================================================
    08-Aug Last day for kernel commits for this cycle
    10-Aug – 16-Aug Kernel prep week.
    17-Aug – 23-Aug Bug verification & Regression testing.
    24-Aug – 29-Aug Regression testing & Release to -updates.

    cycle: 29-Aug through 20-Sep
    ====================================================================
    29-Aug Last day for kernel commits for this cycle
    31-Aug – 06-Sep Kernel prep week.
    07-Sep – 13-Sep Bug verification & Regression testing.
    14-Sep – 20-Sep Regression testing & Release to -updates.


Open Discussion or Questions? Raise your hand to be recognized

No open discussion.
