
Elizabeth K. Joseph: Ubuntu at Fossetcon 2014

Planet Ubuntu - Tue, 2014-09-16 17:01

Last week I flew out to the east coast to attend the very first Fossetcon. The conference was on the smaller side, but I had a wonderful time meeting up with some old friends, meeting some new Ubuntu enthusiasts and finally meeting some folks I’ve only communicated with online. The room layout took some getting used to, but the conference staff was quick to put up signs and direct attendees the right way, which in general made for a pretty smooth conference experience.

On Thursday the conference hosted a “day zero” that had training and an Ubucon. I attended the Ubucon all day, which kicked off with Michael Hall doing an introduction to the Ubuntu on Phones ecosystem, including Mir, Unity8 and the Telephony features that needed to be added to support phones (voice calling, SMS/MMs, Cell data, SIM card management). He also talked about the improved developer portal with more resources aimed at app developers, including the Ubuntu SDK and simplified packaging with click packages.

He also addressed the concern of many about whether Ubuntu could break into the smartphone market at this point, arguing that it’s a rapidly developing and changing market, that every current market leader has only been there for a handful of years, and that new ideas are needed to win. Canonical feels that convergence between phone and desktop/laptop gives Ubuntu a unique selling point, and that users will like its intuitive design: lots of swiping and scrolling actions give apps the most screen space possible. It was interesting to hear that partners/OEMs can offer operator differentiation as a layer without fragmenting the actual operating system (something Android struggles with), leaving the core operating system independently maintained.

This was followed up by a more hands-on session on Creating your first Ubuntu SDK Application. Attendees downloaded the Ubuntu SDK and Michael walked through the creation of a demo app, using the App Dev School Workshop: Write your first app document.

After lunch, Nicholas Skaggs and I gave a presentation on 10 ways to get involved with Ubuntu today. I had given a “5 ways” talk earlier this year at SCaLE in Los Angeles, so it was fun to do a longer one with a co-speaker and have his five items added in, along with some other general tips for getting involved with the community. I really love giving this talk; the feedback from attendees throughout the rest of the conference was overwhelmingly positive, and I hope to get some follow-up emails from new contributors looking to get started. Slides from our presentation are available as a PDF here: contributingtoubuntu-fossetcon-2014.pdf


Ubuntu panel, thanks to Chris Crisafulli for the photo

The day wrapped up with an Ubuntu Q&A Panel, which had Michael Hall and Nicholas Skaggs from the Community team at Canonical, Aaron Honeycutt of Kubuntu and myself. Our quartet fielded questions from moderator Alexis Santos of Binpress and the audience, on everything from the Ubuntu phone to challenges of working with such a large community. I ended up drawing from my experience with the Xubuntu community a lot in the panel, especially as we drilled down into discussing how much success we’ve had coordinating the work of the flavors with the rest of Ubuntu.

The next couple of days brought Fossetcon proper, which I’ll write about later. The Ubuntu fun continued though! I was able to give away 4 copies of The Official Ubuntu Book, 8th Edition, which I signed, and got José Antonio Rey to sign as well, since he had joined us for the conference from Peru.

José ended up doing a talk on Automating your service with Juju during the conference, and Michael Hall had the opportunity to give a talk on Convergence and the Future of App Development on Ubuntu. The Ubuntu booth also looked great and was one of the most popular at the conference.

I really had a blast talking to Ubuntu community members from Florida, they’re a great and passionate crowd.

Ubuntu LoCo Council: New SubLoCo Policy

Planet Ubuntu - Tue, 2014-09-16 15:24

Hi, after a lot of work, thinking and talking about the problem of the LoCo Organization and the SubLoCos, we came up with the following policy:

  • Each team will be a country (or state in the United States). We will call this a ‘LoCo’.
  • Each LoCo can have sub-teams. These sub-teams will be created at the will and need of each LoCo.
  • A LoCo may have sub-teams or not have sub-teams.
  • In the event a LoCo does have sub-teams, a Team Council needs to be created.
  • A Team Council is made up of at least one member from each sub-team.
  • The members that will be part of the Team Council will be chosen by other current members of the team.
  • The Team Council will have the power to make decisions regarding the LoCo.
  • The Team Council will also have the power to request partner items, such as conference and DVD packs.
  • The LoCo Council will only recognize one team per country (or state in the United States). This is the team that will be in the ~locoteams team in Launchpad.
  • In the event a LoCo wants to go through the verification process, the LoCo will go through it, and not individual sub-teams.
  • LoCos not meeting the criteria of country/state teams will be denied verification.
  • In the event what is considered a sub-team wants to be considered a LoCo, it will need to present a request to the LoCo Council.
  • The LoCo Council will provide a response, which is, in no way, related to verification. The LoCo will still have to apply for verification if wanted.

We encourage LoCo teams to see if this new form of organization fits you; if so, please start forming sub-teams as you find useful. If a team needs help with this or anything else, contact us – we are here to help!

The Fridge: Ubuntu Weekly Newsletter Issue 383

Planet Ubuntu - Mon, 2014-09-15 23:51

Welcome to the Ubuntu Weekly Newsletter. This is issue #383 for the week September 8 – 14, 2014, and the full version is available here.

In this issue we cover:

This issue of the Ubuntu Weekly Newsletter is brought to you by:

  • Elizabeth K. Joseph
  • Jose Antonio Rey
  • And many others

If you have a story idea for the Weekly Newsletter, join the Ubuntu News Team mailing list and submit it. Ideas can also be added to the wiki!

Except where otherwise noted, content in this issue is licensed under a Creative Commons Attribution ShareAlike 3.0 License.

José Antonio Rey: 3 years and counting…

Planet Ubuntu - Mon, 2014-09-15 16:22

On September 15th, 3 years ago, I got my Ubuntu Membership.

There’s only one thing I can say about it: it’s been the most wonderful and awesome 3 years I could have had. I would never have thought that I would find such a welcoming and amazing community.

Even though I may not have worked with you directly, thank you. You all are what makes the community awesome – I couldn’t imagine it without any one of you. We are all building the future, so let’s continue!

As I said in the title, I hope that it’s not only 3 years. I’ll keep on counting!


Thomas Ward: NGINX in Ubuntu, PPAs, and Debian: naxsi packages to be dropped by the end of the month.

Planet Ubuntu - Mon, 2014-09-15 14:50

Back in April, I upstreamed (that is, reported to Debian) a bug regarding the `nginx-naxsi` packages. The initial bug was about the outdated naxsi version in those packages (see this bug in Ubuntu and the related bug in Debian).

The last update on the Debian bug, made by Christos Trochalakis, is from September 10, 2014, and says the following:

After discussing it with the fellow maintainers we have decided that it is
better to remove the nginx-naxsi package before jessie is freezed.

Packaging naxsi is not trivial and, unfortunately, none of the maintainers uses
it. That’s the reason nginx-naxsi is not in a good shape and we are not feeling
comfortable to release and support it.

We are sorry for any inconvenience caused.

I asked what the expected timeline was for the packages being dropped. Christos responded today, September 15, 2014:

It ‘ll get merged and released (1.6.1-3) by the end of the month.

In Ubuntu, these changes will likely not make it into 14.10, but future versions of Ubuntu beyond 14.10 (such as 15.04) will likely have this change.

In the PPAs, the naxsi packages will be dropped with the stable 1.6.1-3+precise0/+trusty0/+utopic0 and mainline 1.7.4-1+precise0/+trusty0/+utopic0 uploads, or in later versions if a new point release is made before then.

In Debian, these changes are likely to hit by the end of the month (with 1.6.1-3).

Michael Hall: Public speaking for introverts

Planet Ubuntu - Mon, 2014-09-15 09:00

Last week I attended FOSSETCON, a new open source convention here in central Florida, and I had the opportunity to give a couple of presentations on Ubuntu phones and app development. Anybody who knows me knows that I love talking about these things, but a lot fewer people know that doing it in front of a room of people I don’t know still makes me extremely nervous. I’m an introvert, and even though I have a public-facing job and work with the wider community all the time, I’m still an introvert.

I know there are a lot of other introverts out there who might find the idea of giving presentations to be overwhelming, but they don’t have to be.  Here I’m going to give my personal experiences and advice, in the hope that it’ll encourage some of you to step out of your comfort zones and share your knowledge and talent with the rest of us at meetups and conferences.

You will be bad at it…

Public speaking is like learning how to ride a bicycle: everybody falls their first time. Everybody falls a second time, and a third. You will fidget and stutter, you will lose your train of thought, and your voice will sound funny. It’s not just you; everybody starts off being bad at it. Don’t let that stop you, though: accept that you’ll have bruises and scrapes, and keep getting back on that bike. Incidentally, accepting that you’re going to be bad at the first ones makes going into them much less frightening.

… until you are good at it

I read a lot of things about how to be a good and confident public speaker; the advice was all over the map, and a lot of it felt like pure BS. I think a lot of people try different things, and when they finally feel confident in speaking, they attribute whatever their latest thing was with giving them that confidence. In reality, you just get more confident the more you do it. You’ll be better the second time than the first, and better the third time than the second. So keep at it, and you’ll keep getting better. No matter how good or bad you are now, you will keep getting better if you just keep doing it.

Don’t worry about your hands

You’ll find a lot of suggestions about how to use your hands (or not use them), how to walk around (or not walk around) or other suggestions about what to do with yourself while you’re giving your presentation. Ignore them all. It’s not that these things don’t affect your presentation, I’ll admit that they do, it’s that they don’t affect anything after your presentation. Think back about all of the presentations you’ve seen in your life, how much do you remember about how the presenter walked or waved their hands? Unless those movements were integral to the subject, you probably don’t remember much. The same will happen for you, nobody is going to remember whether you walked around or not, they’re going to remember the information you gave them.

It’s not about you

This is the one piece of advice I read that actually has helped me. The reason nobody remembers what you did with your hands is because they’re not there to watch you, they’re there for the information you’re giving them. Unless you’re an actual celebrity, people are there to get information for their own benefit, you’re just the medium which provides it to them.  So don’t make it about you (again, unless you’re an actual celebrity), focus on the topic and information you’re giving out and what it can do for the audience. If you do that, they’ll be thinking about what they’re going to do with it, not what you’re doing with your hands or how many times you’ve said “um”. Good information is a good distraction from the things you don’t want them paying attention to.

It’s all just practice

Practicing your presentation isn’t nearly as stressful as giving it, because you’re not worried about messing up. If you mess up during practice, you just correct it, make a note not to make the same mistake next time, and carry on. Well, if you plan on doing more public speaking there will always be a next time, which means this time is your practice for that one. Keep your eye on the presentation after this one; if you mess up now, you can correct it for the next one.

 

All of the above are really just different ways of saying the same thing: just keep doing it and worry about the content not you. You will get better, your content will get better, and other people will benefit from it, for which they will be appreciative and will gladly overlook any faults in the presentation. I guarantee that you will not be more nervous about it than I was when I started.

Martin Pitt: autopkgtest 3.5: Reboot support, Perl/Ruby implicit tests

Planet Ubuntu - Mon, 2014-09-15 08:23

Last week’s autopkgtest 3.5 release (in Debian sid and Ubuntu Utopic) brings several new features which I’d like to announce.

Tests that reboot

For testing low-level packages like init or the kernel it is sometimes desirable to reboot the testbed in the middle of a test. For example, I added a new boot_and_services systemd autopkgtest which configures grub to boot with systemd as pid 1, reboots, and then checks that the most important services like lightdm, D-BUS, NetworkManager, and cron come up as expected. (This test will be expanded a lot in the future to cover other areas like the journal, logind, etc.)

In a testbed which supports rebooting (currently only QEMU) your test will now find an “autopkgtest-reboot” command, which the test calls with an arbitrary “marker” string. autopkgtest will then reboot the testbed, save/restore any files it needs to (like the test’s file tree or previously created artifacts), and then re-run the test with ADT_REBOOT_MARK=mymarker.

The new “Reboot during a test” section in README.package-tests explains this in detail with an example.
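To make the flow concrete, here is a minimal sketch of the marker pattern (the marker name and the echoed checks are made up, and run_test is a local stand-in so the logic can be exercised outside a real testbed, where the actual autopkgtest-reboot command lives):

```shell
#!/bin/sh
# autopkgtest runs a rebooting test once with ADT_REBOOT_MARK unset;
# when the test calls "autopkgtest-reboot <marker>" it reboots the
# testbed and re-runs the test with ADT_REBOOT_MARK=<marker>.
run_test() {
    case "${ADT_REBOOT_MARK:-}" in
        "")
            echo "phase 1: configure grub to boot systemd as pid 1"
            # a real test would now call: autopkgtest-reboot after-systemd
            ;;
        after-systemd)
            echo "phase 2: check lightdm, D-BUS, NetworkManager, cron"
            ;;
    esac
}

# Simulate the two invocations the harness would make:
ADT_REBOOT_MARK=""            run_test
ADT_REBOOT_MARK=after-systemd run_test
```

The real command additionally snapshots and restores the test’s file tree across the reboot, as described above.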

Implicit test metadata for similar packages

The Debian pkg-perl team recently discussed how to add package tests to the ~3,000 Perl packages. For most of these the test metadata looks pretty much the same, so they created a new pkg-perl-autopkgtest package which centralizes the logic. autopkgtest 3.5 now supports an implicit debian/tests/control file, to avoid having to modify several thousand packages with exactly the same file.

An initial run already looked quite promising: 65% of the packages pass their tests. There will be a few iterations to identify common failures and fix those in pkg-perl-autopkgtest and autopkgtest itself.

There is still some discussion about how implicit test control files go together with the DEP-8 specification, as other runners like sadt do not support them yet. Most probably we’ll declare those packages XS-Testsuite: autopkgtest-pkg-perl instead of the usual autopkgtest.
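For illustration, a package opting into the shared machinery would then carry only a testsuite marker in its debian/control source stanza instead of shipping its own debian/tests/control (the package name below is hypothetical, and the exact field spelling was, as noted, still being discussed at the time):

```
Source: libfoo-bar-perl
XS-Testsuite: autopkgtest-pkg-perl
```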

In the same vein, Debian’s Ruby maintainer (Antonio Terceiro) added implicit test control support for Ruby packages. We haven’t done a mass test run with those yet, but their structure will probably look very similar.

Riccardo Padovani: Create your first QML game with Bacon2D

Planet Ubuntu - Mon, 2014-09-15 07:00

Hi all,
after a long time, I’m back to show you how to create a simple game for Ubuntu for Phones (but also for Android) with Bacon2D.

Bacon2D is a framework to ease 2D game development, providing ready-to-use QML elements representing the basic game entities needed by most games.

In this tutorial I’ll explain how I created my first QML game, 100balls, which you can find in the Ubuntu Store for phones. The source is available on GitHub.

Installation

So, first of all we need to install Bacon2D on our system. I assume you already have Qt installed, so we only need to grab the source and compile it:

git clone https://github.com/Bacon2D/Bacon2D.git
cd Bacon2D
mkdir build && cd build
qmake ..
make
sudo make install

Now you have Bacon2D on your system, and you can import it into any project you want.

A first look to Bacon2D

Bacon2D provides a good number of custom components for your app. Of course, I can’t describe them all in one article, so please read the documentation. We’ll use only a few of them, and I think the best way to introduce them is by writing the app.
So, let’s start!

First of all, we create our base file, called 100balls.qml:

import QtQuick 2.0
import Bacon2D 1.0

The first element we add is the Game element. Game is the top-level container, where all of the game will live. We set some basic properties and the name of the game with the gameName property:

import QtQuick 2.0
import Bacon2D 1.0

Game {
    id: game
    anchors.centerIn: parent

    height: 680
    width: 440

    gameName: "com.ubuntu.developer.rpadovani.100balls" // Ubuntu Touch name format, you can use whatever you want
}

But the Game itself is useless; we need to add one or more Scene elements to it. A scene is the place where all the Entity elements of the game will be placed.
Scene has a lot of properties; for now it’s important to set two of them: running indicates whether everything in the scene moves and the game engine works, and the second property is physics, which indicates whether Box2D should be used to simulate physics in the game. We want a game where some balls fall, so we need to set it to true.

import QtQuick 2.0
import Bacon2D 1.0

Game {
    id: game
    anchors.centerIn: parent

    height: 680
    width: 440

    gameName: "com.ubuntu.developer.rpadovani.100balls" // Ubuntu Touch name format, you can use whatever you want

    Scene {
        id: gameScene
        physics: true
        running: true
    }
}

Stuart Langridge: Brum Tech Scene interviews

Planet Ubuntu - Sun, 2014-09-14 23:56

Today I released the first of the Brum Tech Scene interviews, with me talking to Simon Jenner of Silicon Canal and Oxygen Startups. There’s a video on the site from me explaining why I’m doing this, but I figure that the more discerning audience of as days pass by might appreciate a more in-depth discussion.

I love this city. I love that we’re prepared to spend a hundred and ninety million quid on building the best library in the whole world. I love that there’s so much going on, tech-wise. But nobody talks to anybody else. If you look at, say, Brighton, the whole tech scene there all hang out together. They can put on a Digital Brighton week and have dConstruct be part of it and Seb do mad things with visualisations and that’s marvellous. We ought to have that. I want us to have that.

We don’t have a tech scene. We’ve got twenty separate tech scenes. What I want to do is knock down the walls a bit. So the designers talk to the SEO people and the Linux geeks talk to the designers. Because there is no way that this can be a bad thing.

I also want to learn a bit about videos. Now, let’s be clear here. I know from a decade of podcasting that with a mild expenditure of money on gear, and a great sound engineer (Jono Bacon, step forward) you can produce something as good as the professionals. Bad Voltage sounds as good, production-wise, as the BBC’s Today programme does. Video is not like that. There is a substantial difference between amateur and professional efforts; one bloke using mobile phones to record cannot make something that looks like Sherlock or Game of Thrones. I’m not trying to look professional here; I’m aiming for “competent amateur”. I’ve learned loads about how to record a video interview, how to mix it, how to do the editing. Sit far enough apart that your voice doesn’t sound on their mic. Apply video effects to the clip before you cut it up. Don’t speak over the interviewee. KDEnLive’s “set audio reference” is witchcraft brilliance. I knew none of this two months ago. And I’ve really enjoyed learning. I am in no wise good at this stuff, but I’m better than I was.

This has been a fun project to set up, and it will continue being fun as I record more interviews. My plan is to have a new one every Monday morning, indefinitely, as long as people like them and I’m still interested in doing them. I should give big love to Mike, my designer, who I fought with tooth and nail about the site design and the desaturated blue look to the videos, and to Dan Newns who sat and was interviewed as a test when I first came up with this idea, and has provided invaluable feedback throughout.

If you know something about video editing, I’d love to hear how I can do better. Ping me on twitter or by mail. Tell me as well who you want to hear interviewed; which cool projects are going on that I don’t know about. I’d also love to hear about cool venues in the city in which I can do interviews; one of my subsidiary goals here is to show off the city’s tech places. Annoyingly, I spoke to the Library and to the Birmingham Museums Trust and they were all “fill out our fifteen page form” because they’re oriented around the BBC coming in with a crew of twenty camera people, not one ginger guy with a mobile phone and a dream. Maybe I’ll do things with @HubBirmingham once they actually exist.

I should talk about the tech, here. I record the interviews on an iPhone 5, a Nexus 4, and a little HD camera I bought years ago. The audio is done with two Røde Smartlav lapel mics plugged into the two phones. None of this is expensive, which has a cost in terms of video and audio quality but critically doesn’t have much of a cost in terms of actual pounds sterling. And editing is done with KDEnLive (kdenlive?) which is a really powerful non-linear video editor for Ubuntu, and the team who make it should be quite proud. The big thing I’m missing (apart from a cameraman) is a tripod, which I can probably buy for about ten quid, and I will do once I find one that’s tall and yet still fits in my laptop bag.

Anyway, that’s the story of the Brum Tech Scene interviews. There’ll be one every Monday. I hope you like them. I hope they help, even in a small way, to make the Brum tech scene gel together even more than it has thus far. Let me know what you think. brumtechscene.co.uk.

Joel Leclerc: I’m quitting relinux

Planet Ubuntu - Sun, 2014-09-14 23:24

I will start this off by saying: I’m very (and honestly) sorry for, well, everything.

To give a bit of history, I started relinux as a side project for my CosmOS project (a cloud-based distribution … which failed), in order to build the ISOs. The only reasonable alternative at the time was remastersys, and I realized I would have to patch it anyway, so I thought I might as well make a reusable tool for other distributions to use too.

Then came a rather large amount of friction between me and the author of remastersys, which I will not go into any detail about. I acted very immaturely then, and wronged him several times. I defamed him, made quite a few people very angry at him, and even managed to turn some of his supporters against him. True, age and maturity had something to do with it (I was 12 at the time), but that still doesn’t excuse my actions at all.

So my first apology is to Tony Brijeski, the author of remastersys, for all the trouble and possible pain I had put him through. I’m truly sorry for all of this.

However, though the dynamics with Tony and remastersys are definitely a large part of why I’m quitting relinux, that is not all. The main reason, actually, is lack of interest. I have rewritten relinux a total of 7 times (including the original fork of remastersys), and I really hate the debugging process (takes 15-20 minutes to create an ISO, so that I can debug it). I have also lost interest in creating linux distributions, so not only am I very tired of working on it, I also don’t really care about what it does.

On this note, my second apologies (and thanks) go to those who have helped me so much through this process, especially those who tried to encourage me to finish relinux. Those listed are in no particular order, and if I forgot you, let me know (and I apologize for that!):

  • Ko Ko Ye
  • Raja Genupula
  • Navdeep Sidhu
  • Members of the TSS Web Dev Club
  • Ali Hallahi
  • Gert van Spijker
  • Aritra Das
  • Diptarka Das
  • Alejandro Fernandez
  • Kendall Weaver

Thank you very much for everything you’ve done!

Lastly, I would like to explain my plans for it, in case anyone wants to continue it (by no means do I want to enforce these, these are just ideas).

My plan for the next release of relinux was to actually make a very generic and scriptable CLI ISO creation tool, and then make relinux a specific set of “profiles” for that tool (plus an interface). The tool would basically contain a few libraries for the chosen scripting language, for things like storing the filesystem (SquashFS or other), ISO creation, and general utilities for editing files while keeping permissions, multi-threading/processing, etc… The “profiles” would then copy, edit, and delete files as needed, set up the tool wanted for running the live system (in Ubuntu’s case, this’d be casper), set up the installer/bootloader, and such.

I would like to apologize to you all, the people who have used relinux and have waited for a stable version for 3 years, for not doing this. Thank you very much for your support, and I’m very sorry for having constantly pushed releases back and having never made a stable or well working version of relinux. Though I do have some excuses as to why the releases didn’t work, or why I didn’t test them well enough, none of them can cover why I didn’t fix them or work on it more. And for that, I am very sorry.

I know that this is a very large post for something so simple, but I feel that it would not be right if I didn’t apologize to those I have done wrong to, and thanked those who have helped me along the way.

So to summarize, thank you, sorry, and relinux is now dead.

- Joel Leclerc (MiJyn)


David Tomaschik: Getting Started in CTFs

Planet Ubuntu - Sun, 2014-09-14 20:07

My last post was about getting started in a career in information security. This post is about the sport end of information security: Capture the Flag (CTFs).

I'd played around with some wargames (Smash the Stack, Over the Wire, and Hack this Site) before, but my first real CTF (timed, competitive, etc.) was the CTF run by Mad Security at BSides SF 2013. By some bizarre twist of fate, I ended up winning the CTF, and I was hooked. I've probably played in about 30 CTFs since, most of them online with the team Shadow Cats. It's been a bumpy ride, but I've learned a lot about a variety of topics by doing this.

If you're in the security industry and you've never tried a CTF, you really should. Personally, I love CTFs because they get me to exercise skills that I never get to use at work. They also inspire some of my research and learning. The only problem is making the time. :)

Here are some resources I've thought were interesting:

Luke Faraone: "Your release sucks."

Planet Ubuntu - Sat, 2014-09-13 20:43
I look forward to Ubuntu's semiannual release day, because it's the completion of 6ish months of work by Ubuntu (and by extension Debian) developers.

I also loathe it, because every single time we get people saying "This Ubuntu release is the worst release ever!".

Ubuntu releases are always rocky around release time, because the first time Ubuntu gets widespread testing is on or after release day.

We ship software to 12 million Ubuntu users with only 150 MOTUs who work directly on the platform. That's about 1 developer with upload rights to the archive for every 80,000 users. ((This number, like all other usage data, is dated, and probably wasn't even accurate when it was first calculated)) Compare that to Debian, which (at last estimate in 2010) had 1.5 million uniques on security.debian.org, yet has around 1000 Debian Developers.

Debian has a strong testing culture; someone once estimated that around ¾ of Debian users are running unstable or testing. In Ubuntu, we don't have good metrics that I'm aware of on how many people are using the development release (pointers welcome), but I'd guess that it's a very, very small percentage. A common thread in bug reports, if we get a response at all, goes as follows:
Triager: ((Developer, bugcontrol member, etc. Somebody who is not experiencing the problem but wants to help.)) "Is this a problem in $devel?"
User: "I'll let you know when it hits final."
Triager: "It's too late then. Then we'll want you to test in the next release. We have to fix it BEFORE it's final."
User: "Ok, I'll test at beta."
Triager: "That's 2 weeks before release, which will be too late. Please test ASAP if you want us to have time to fix it."
Of course, there are really important bugs with hardware support which keep on cropping up. But if they're just getting reported on or around release day, there are limits to what can be done about them this cycle.

We need to make it easier for people to run early development versions, and encourage more people to use them (as long as they're willing to deal with breakage). I'm not sure whether unstable/testing is appropriate for Ubuntu, and I'm fairly confident that we don't want to move to a rolling release (currently being discussed in Debian, summary). But we badly need more developers, and equally importantly, more testers to try it out earlier in the release process.

To users: please, please try out the development versions. Download a LiveCD and run a smoketest, or check if bugs you reported are in fact fixed in the later versions. And do it early and often.

David Tomaschik: Getting Started in Information Security

Planet Ubuntu - Sat, 2014-09-13 19:30

I've only been an information security practitioner for about a year now, but I've been doing things on my own for years before that. However, many people are just getting into security, and I've recently stumbled on a number of resources for newcomers, so I thought I'd put together a short list.

Stuart Langridge: Developers are users too

Planet Ubuntu - Sat, 2014-09-13 15:51

When you talk about the “user experience” of the thing you’re building, remember that developers who use your APIs are users too. And you need to think about their experience.

We seem to have created a world centred on github where everyone has to manage dependencies by hand, like we had to in 1997. This problem was completely solved by apt twenty years ago, but the new cool github world is, it seems, too cool to care about that. Go off to get some new project by git cloneing it and it’s quite likely to say “oh, and it depends on $SOME_OTHER_PROJECT (here’s a link to that project’s github repo)”. And then you have to go fetch both and set them up yourself. Which is really annoying.

Now, there are good reasons not to care about existing dependency package management systems such as apt. Getting stuff into Ubuntu is hard, laborious work and most projects don’t want to do it. PPAs make it easier, but not much easier; if you’re building a thing and not specifically targeting Ubuntu with it, you don’t want to have to learn about Launchpad and PPAs and build recipes and whatnot. This sort of problem is also solved neatly for packages in a specific language by that language’s own packaging system; Python stuff is installable with pip install whatever and a virtualenv; Node stuff is installable with npm install whatever; all these take care of fetching any dependent stuff. But this rush for each language to have its own “app store” for its apps and libraries means that combining things from different languages is still the same 20th century nightmare. Take, for example, Mozilla’s new Firefox Tools Adaptor. I’m not picking on Mozilla here; the FTA is new, and it’s pretty cool, and it’s not finished yet. This is just the latest in a long line of things which exhibit the problem. The FTA allows you to use the Firefox devtools to debug web things running in other browsers. Including, excitingly, debugging things running in iOS Safari on the iPhone. Now, doing that’s a pain in the ringpiece at the moment; you have to install Google’s ios-webkit-debug-proxy, which needs to be compiled, and Apple break compatibility with it all the time and so you have to fetch and build new versions of libimobiledevice or something. I was eager to see that the new Firefox Tools Adaptor promises to allow debugging on iOS Safari just by installing a Firefox extension.

And then I read about it, and it says, “The Adapter’s iOS support uses Google’s ios-webkit-debug-proxy. Until that support is built directly into the add-on, you’ll need to install and run the ios-webkit-debug-proxy binary yourself”. Sigh. That’s the hard part. And it’s not any easier here.

Again, I’m not blaming Mozilla here — they plan to fix this, but they’ll have to fix it by essentially bundling ios-webkit-debug-proxy with the FTA. That’ll work, and that’s an important thing for them to do in order to provide a slick user experience for developers using this tool (because “download and compile this other thing first” is not ever ever a nice user experience). This is sorta kinda solved by brew for Mac users, but there’s a lot of stuff not in brew either. Still, there is willingness to solve it that way by having a packaging system. But it’s annoying that Ubuntu already has one and people are loath to use it. Using it makes for a better developer user experience. That’s important.
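The contrast drawn above can be sketched in a few lines. The project names in the comments are hypothetical, and this assumes python3 with the venv module is available:

```shell
# The "github era" workflow: fetch and wire up each dependency by hand.
#   git clone https://github.com/example/someproject
#   (read its README, discover it needs some-other-project, clone and build that too...)

# The language-package-manager workflow: one tool resolves the whole chain.
python3 -m venv /tmp/demo-venv     # isolated environment, as the post describes
/tmp/demo-venv/bin/pip --version   # `pip install whatever` here would also fetch
                                   # whatever's own dependencies automatically
```

The catch, as the post notes, is that this only works within one language's ecosystem; a distro package manager like apt is what solves it across languages.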

John Baer: Get a free Chromebook from the Google Lending Library

Planet Ubuntu - Fri, 2014-09-12 22:52

Are you enrolled in college, in need of a laptop, and willing to accept a new Chromebook? If so, Google has a deal for you, and it’s called the Google Lending Library.

The Chromebook Lending Library is traveling to 12 college campuses across the U.S. loaded with the latest Chromebooks. The Lending Library is a bit like your traditional library, but instead of books, we’re letting students borrow Chromebooks (no library card needed). Students can use a Chromebook during the week for life on campus— whether it’s in class, during an all-nighter, or browsing the internet in their dorm.

Lindsay Rumer, Chrome Marketing


Assuming you attend one of the partner universities, here is how it works.

1. Request a Chromebook from the Library
2. Agree to the Terms of Use Agreement
3. Use the Chromebook as you like while you attend school
4. Return it when you want or when you leave

What happens if you don’t return it? Expect to receive a bill for the fair market value not to exceed $220.

Here’s the fine print.

“Evaluation Period” means the period of time specified to you at the time of checkout of a Device.

“Checkout Location” means the location specified by Google where Devices will be issued to you and collected from you.

1.1 Device Use. You may use the Device issued to you for your personal evaluation purposes. Upon your use of the Device, Google transfers title to the Device equipment to you, but retains all ownership rights, title and interest to any Google Devices and services and anything else that Google makes available to you, including without limitation any software on the Device.

1.2 Evaluation Period. You may use the Device during the Evaluation Period. Upon (i) expiration of the Evaluation Period, or (ii) termination of this Agreement, if this Agreement is terminated early in accordance Section 4, you agree to return the Device to the Checkout Location. If you fail to return the Device at the end of the Evaluation Period or upon termination of this Agreement, you agree Google may, to the extent allowed by applicable law, charge you up to the fair market value of the Device less normal wear and tear and any applicable taxes for an amount not to exceed Two Hundred Twenty ($220.00) Dollars USD.

1.3 Feedback. Google may ask you to provide feedback about the Device and related Google products optimized for Google Services. You are not required to provide feedback, but, if you do, it must only be from you, truthful, and accurate and you grant Google permission to use your name, logo and feedback in presentations and marketing materials regarding the Device. Your participation in providing feedback may be suspended at any time.

1.4 No Compensation. You will not be compensated for your use of the Devices or for your feedback.

2. Intellectual Property Rights. Nothing in this Agreement grants you intellectual property rights in the Devices or any other materials provided by Google. Except as provided in Section 1.1, Google will own all rights to anything you choose to submit under this Agreement. If that isn’t possible, then you agree to do whichever of the following that Google asks you to do: transfer all of your rights regarding your submissions to Google; give Google an exclusive, irrevocable, worldwide, royalty-free license to your submissions to Google; or grant Google any other reasonable rights. You will transfer your submissions to Google, and sign documents and provide support as requested by Google, and you appoint Google to act on your behalf to secure these rights from you. You waive any moral rights you have and agree not to exercise them, unless you notify Google and follow Google’s instructions.

3. Confidentiality. Your feedback and other submissions, is confidential subject to Google’s use of your feedback pursuant to Section 1.3.

4. Term. This Agreement becomes effective when you click the “I Agree” button and remains in force through the end of the Evaluation Period or earlier if either party gives written termination notice, which will be effective immediately. Upon expiration or termination, you will return the Device as set forth below. Additionally, Google will remove you from any related mailing lists within thirty (30) days of expiration or termination. Sections 1.3, 1.4, and Sections 2 through 5 survive any expiration or termination of this Agreement.

5. Device Returns. You will return the Device(s) to Google or its agents to the Checkout Location at the time specified to you at the time of checkout of the Device or if unavailable, to Google Chromebook Lending Library, 1600 Amphitheatre Parkway, Mountain View, CA 94043. Google may notify you during or after the term of this Agreement regarding return details or fees chargeable to you if you fail to return the Device.


Ayrton Araujo: CloudFlare as a ddclient provider under Debian/Ubuntu

Planet Ubuntu - Fri, 2014-09-12 18:47

Dyn's free dynamic DNS service closed on Wednesday, May 7th, 2014.

CloudFlare, however, has a little known feature that will allow you to update
your DNS records via API or a command line script called ddclient. This will
give you the same result, and it's also free.

Unfortunately, ddclient does not work with CloudFlare out of the box. There is
a patch available, and here is how to hack[1] it up on Debian or Ubuntu; it also works in Raspbian on the Raspberry Pi.

Requirements

Basic command line skills and a domain name that you own.

CloudFlare

Sign up to CloudFlare and add your domain name.
Follow the instructions, the default values it gives should be fine.

You'll be letting CloudFlare host your domain so you need to adjust the
settings at your registrar.

If you'd like to use a subdomain, add an 'A' record for it. Any IP address
will do for now.

Let's get to business...

Installation

$ sudo apt-get install ddclient

Patch

$ sudo apt-get install curl sendmail libjson-any-perl libio-socket-ssl-perl
$ curl -O http://blog.peter-r.co.uk/uploads/ddclient-3.8.0-cloudflare-22-6-2014.patch
$ sudo patch /usr/sbin/ddclient < ddclient-3.8.0-cloudflare-22-6-2014.patch

Config

$ sudo vi /etc/ddclient.conf

Add:

##
### CloudFlare (cloudflare.com)
###
ssl=yes
use=web, web=dyndns
protocol=cloudflare, \
server=www.cloudflare.com, \
zone=domain.com, \
login=you@email.com, \
password=api-key \
host.domain.com

Comment out:

#daemon=300

Your api-key comes from the account page.

ssl=yes might already be in that file.

use=web, web=dyndns uses the dyndns service to look up your public IP (useful behind NAT).

You're done. Log in to https://www.cloudflare.com and check that the IP listed for
your domain matches http://checkip.dyndns.com
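For reference, checkip.dyndns.com returns a tiny HTML page containing the caller’s address, which is what ddclient’s web check parses out. A quick sketch of that extraction, using a made-up sample body with a documentation IP:

```shell
# Sample body in the format checkip.dyndns.com returns (the IP is illustrative):
body='<html><head><title>Current IP Check</title></head><body>Current IP Address: 203.0.113.42</body></html>'

# Pull out just the address, roughly what ddclient does with use=web:
ip=$(printf '%s' "$body" | sed 's/.*Current IP Address: \([0-9.]*\).*/\1/')
echo "$ip"   # -> 203.0.113.42
```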

To verify your settings:

sudo ddclient -daemon=0 -debug -verbose -noquiet

Fork this:
https://gist.github.com/ayr-ton/f6db56f15ab083ab6b55

Ayrton Araujo: New blog with Ghost

Planet Ubuntu - Fri, 2014-09-12 18:42

Here I am again, moving once more, this time from Octopress to Ghost.
I'm moving because I want an easier way to update my blog: keeping it in a static site generator makes it a little harder to update and fix posts. But I will miss some things from Octopress, like the code snippets and the responsive videos plugin. I'm considering making some pull requests with the features I miss. I will continue using Haroopad for drafting markdown posts offline.

Well, for this migration I used an account at Wable (because it is really cheap) with 4 VPSes.

As Wable doesn't provide an API and there's no roadmap for one, I used juju's manual provisioning. Here is my environments.yaml:

environments:
  wable:
    type: manual
    default-series: precise
    bootstrap-host: example.com
    bootstrap-user: root

Before adding new units, I cleaned up each machine with:

apt-get update && apt-get install curl && curl https://dl.dropboxusercontent.com/u/X/juju-agent.sh | sh

Because of the X, it will not work for you as-is, but here's the script (remember to change the X):

#!/bin/bash
# curl https://dl.dropboxusercontent.com/u/x/juju-agent.sh | sh
locale-gen en_US.UTF-8
dpkg-reconfigure locales
apt-get purge apache2.2-common -y
apt-get dist-upgrade -y
apt-get autoremove -y
apt-get install dbus -y
mkdir $HOME/.ssh
echo 'ssh-rsa yourpubkey' > $HOME/.ssh/authorized_keys

And then I added new units with juju add-machine ssh:root@example.com with no error.

Then I deployed 2 mysql units and 4 ghost units, with haproxy, as follows (as we're using manual provisioning, we need to specify the machines; otherwise it will not work):

juju deploy mysql --to 0
juju add-unit mysql --to 1
juju deploy haproxy --to 2
juju deploy ghost --to 0
juju add-unit ghost --to 1
juju add-unit ghost --to 2
juju add-unit ghost --to 3
juju add-relation mysql ghost
juju add-relation haproxy ghost
juju expose haproxy

Wait for the units to finish deploying before adding the relations.
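One way to check readiness is to grep the status output for units that are still pending. This is a sketch with a hard-coded sample; in practice the text would come from running `juju status`:

```shell
# Hard-coded sample; a real check would capture the output of `juju status`.
status='mysql/0: started
ghost/0: pending'

# Only add the relations once nothing is still pending:
if printf '%s\n' "$status" | grep -q 'pending'; then
  ready=no    # some unit is still coming up
else
  ready=yes   # safe to run the add-relation commands
fi
echo "$ready"   # -> no, because ghost/0 is still pending
```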

And voilà:

All this power is just for testing; I will change this setup soon, as my blog will never have enough traffic to justify that scale. Ahaha

Here's my juju-gui canvas:

I would like to say thanks to hatch, the creator of the Ghost charm, who helped me a lot with some broken deploys in #juju on irc.freenode.org, and to quote a related post of his: http://fromanegg.com/post/97035773367/juju-explain-it-to-me-like-im-5

What I would like to have next:

Look for me in #juju on irc.freenode.org if you run into any problems.

Harald Sitter: My Family…

Planet Ubuntu - Fri, 2014-09-12 15:33

… is the best in the whole wide world!

Ubuntu Podcast from the UK LoCo: S07E24 – The One with the Holiday Armadillo

Planet Ubuntu - Fri, 2014-09-12 13:05

We’re back with Season Seven, Episode Twenty-Four of the Ubuntu Podcast! Alan Pope, Mark Johnson, and Laura Cowen are drinking tea and eating Battenburg cake in Studio L.


In this week’s show:

  • We discuss whether communities suck…

  • We also discuss:

    • Aurasma augmented reality
    • Upgrading to 14.10
    • Converting a family member to Ubuntu
  • We share some Command Line Lurve which does this (from Patrick Archibald on G+): curl -X POST -H "Content-Type: application/json" -d '{"jsonrpc":"2.0","method":"GUI.ShowNotification","params":{"title":"This is the title of the message","message":"This is the body of the message"},"id":1}' http://wopr.local:8080/jsonrpc
  • And we read your feedback. Thanks for sending it in!

We’ll be back next week, so please send your comments and suggestions to: podcast@ubuntu-uk.org
Join us on IRC in #uupc on Freenode
Leave a voicemail via phone: +44 (0) 203 298 1600, sip: podcast@sip.ubuntu-uk.org and skype: ubuntuukpodcast
Follow us on Twitter
Find our Facebook Fan Page
Follow us on Google+

Benjamin Kerensa: Off to Berlin

Planet Ubuntu - Thu, 2014-09-11 20:45
Right now, as this post is published, I’m probably settling into my seat for the next ten hours headed to Berlin, Germany as part of a group of leaders at Mozilla who will be meeting for ReMo Camp. This is my first transatlantic trip ever and perhaps my longest flight so far, so I’m both […]
