A couple of months ago Jono announced the dates for the Ubuntu Online Summit, June 10th – 12th, and those dates are almost upon us. The schedule is open, the track leads are on board; all we need now are sessions. And that’s where you come in.
Ubuntu Online Summit is a change for us: we’re trying to mix the previous online UDS events with our Open Week, Developer Week and User Days events, to bring people from every part of our community together to celebrate, educate, and improve Ubuntu. So in addition to the usual planning sessions we had at UDS, we’re also looking for presentations from our various community teams on the work they do, walk-throughs for new users learning how to use Ubuntu, and instructional sessions to help new distro developers, app developers, and cloud devops get the most out of Ubuntu as a platform.
What we need from you are sessions. It’s open to anybody, on any topic, any way you want to do it. The only requirement is that you can start and run a Google+ OnAir Hangout, since those are what provide the live video streaming and recording for the event. There are two ways you can propose a session: the first is to register a Blueprint in Launchpad, which is good for planning sessions that will result in work items; the second is to propose a session directly in Summit, which is good for any kind of session. Instructions for how to do both are available on the UDS Website.
There will be Track Leads available to help you get your session on the schedule, and to provide some technical support if you have trouble getting your session’s hangout set up. When you propose your session (or create your Blueprint), try to pick the most appropriate track for it; that will help it get approved and scheduled faster.

Ubuntu Development
Many of the development-oriented tracks from UDS have been rolled into the Ubuntu Development track. So anything that would previously have been in Client, Core/Foundations or Cloud and Server will be in this one track now. The track leads come from all parts of Ubuntu development, so whatever your session’s topic, there will be a lead who is familiar with it.

Track Leads:
- Łukasz Zemczak
- Steve Langasek
- Leann Ogasawara
- Antonio Rosales
- Marc Deslauriers
Introduced a few cycles back, the Application Development track will continue to have a focus on improving the Ubuntu SDK, tools and documentation we provide for app developers. We also want to introduce sessions focused on teaching app development using the SDK and the various platform services available, as well as taking a deeper dive into specific parts of the Ubuntu UI Toolkit.

Track Leads:
- Michael Hall
- David Planella
- Alan Pope
- Zsombor Egri
- Nekhelesh Ramananthan
This is the counterpart of the Application Development track for those with an interest in the cloud. This track will have a dual focus on planning improvements to the DevOps tools like Juju, as well as bringing DevOps up to speed with how to use them in their own cloud deployments. Learn how to write charms, create bundles, and manage everything in a variety of public and private clouds.

Track Leads:
- Jorge Castro
- Marco Ceppi
- Patricia Gaughen
- Jose Antonio Rey
The community track has been a staple of UDS for as long as I can remember, and it’s still here in the Ubuntu Online Summit. However, just like the other tracks, we’re looking beyond just planning ways to improve the community structure and processes. This time we also want to have sessions showing users how they can get involved in the Ubuntu community, what teams are available, and what tools they can use in the process.

Track Leads:
- Daniel Holbach
- Jose Antonio Rey
- Laura Czajkowski
- Svetlana Belkin
- Pablo Rubianes
This is a new track and one I’m very excited about. We are all users of Ubuntu, and whether we’ve been using it for a month or a decade, there are still things we can all learn about it. The focus of the Users track is to highlight ways to get the most out of Ubuntu, on your laptop, your phone or your server. From detailed how-to sessions, to tips and tricks, and more, this track can provide something for everybody, regardless of skill level.

Track Leads:
- Elizabeth Krumbach Joseph
- Nicholas Skaggs
- Valorie Zimmerman
So once again, it’s time to get those sessions in. Visit this page to learn how, then start thinking of what you want to talk about during those three days. Help the track leads out by finding more people to propose more sessions, and let’s get that schedule filled out. I look forward to seeing you all at our first ever Ubuntu Online Summit.
[...] start using the emulator for everything you usually would do on the phone. We really want to make the emulator a class A engineering platform for everyone
While the final emulator is still a work in progress, we’re continually seeing improvements as all the pieces come together to make it a first-class citizen for development, both for the platform itself and for app developers. However, the emulator is already functional today, so I’ve decided to prepare a quickstart guide to highlight the great work the Foundations and Phonedations teams (along with many other contributors) are producing to make it possible.
While you should consider this guide a preview, you can already use it to start getting familiar with the emulator for testing, platform development and writing apps.

Requirements
To install and run the Ubuntu emulator, you will need:
- Ubuntu 14.04 or later (see installation notes for older versions)
- 512MB of RAM dedicated to the emulator
- 4GB of disk space
- OpenGL-capable desktop drivers (most graphics drivers/cards are)
If you are using Ubuntu 14.04, installation is as easy as opening a terminal (press Ctrl+Alt+T) and running these commands:
sudo add-apt-repository ppa:ubuntu-sdk-team/ppa && sudo apt-get update
sudo apt-get install ubuntu-emulator
Alternatively, if you are running an older stable release such as Ubuntu 12.04, you can install the emulator by manually downloading its packages first:
- Create a folder in your home directory to hold the downloaded packages
- Go to the goget-ubuntu-touch packages page in Launchpad
- Scroll down to Trusty Tahr and click on the arrow to the left to expand it
- Scroll further to the bottom of the page and click on the ubuntu-emulator package corresponding to your architecture (i386 or amd64) to download it into the folder you created
- Now go to the Android packages page in Launchpad
- Scroll down to Trusty Tahr and click on the arrow to the left to expand it
- Scroll further to the bottom of the page and click on the package corresponding to your architecture to download it into the same folder
- Open a terminal with Ctrl+Alt+T
- Change to the directory where you downloaded the packages by typing the corresponding cd command in the terminal
- Then install the packages; running sudo dpkg -i *.deb in that folder installs them all
- Once the installation is successful, you can close the terminal and remove the folder and its contents
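The manual steps above can be sketched in the terminal as follows. Note that the folder name is a hypothetical placeholder, and you should substitute the package files you actually downloaded from Launchpad:

```shell
# Create a working folder and download the two .deb packages into it
mkdir ~/emulator-debs        # hypothetical folder name; pick your own
cd ~/emulator-debs

# Install the downloaded packages; if dpkg reports missing dependencies,
# `sudo apt-get -f install` will pull them in afterwards
sudo dpkg -i *.deb
```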
- Downloaded images are cached at ~/.cache/ubuntuimage (using the standard XDG_CACHE_DIR location).
- Instances are stored at ~/.local/share/ubuntu-emulator (using the standard XDG_DATA_DIR location).
- While an image upgrade feature is in the works, for now you can simply create an instance of a newer image over the previous one.
The ubuntu-emulator tool makes it really easy to manage instances and run the emulator. Typically, you’ll open a terminal and run these commands the first time you create an instance (where myinstance is the name you’ve chosen for it):
sudo ubuntu-emulator create myinstance --arch=i386
ubuntu-emulator run myinstance
You can create as many instances as you need for different purposes. Once an instance has been created, you’ll generally use the ubuntu-emulator run myinstance command to start an emulator session based on that instance.
Notice how in the command above the --arch parameter was specified to override the default architecture (armhf). Using the i386 arch will make the emulator run at a (much faster) native speed.
Other parameters you might want to experiment with are --scale=0.7 and --memory=720. In these examples, we’re scaling down the UI to 70% of the original size (useful for smaller screens) and specifying a maximum of 720MB of memory for the emulator to use (on systems with memory to spare).
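Putting those options together, a typical invocation on a machine with a smaller screen might look like this (myinstance is the instance name chosen at creation time):

```shell
# Run an existing instance with a 70% scaled UI and a 720MB memory cap
ubuntu-emulator run myinstance --scale=0.7 --memory=720
```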
There are 3 main elements you’ll be interacting with when running the emulator:
- The phone UI – this is the visual part of the emulator, where you can interact with the UI in the same way you would with a phone. You can use your mouse to simulate taps and slides. Bonus points if you can recognize the phone model the UI skin is based on ;)
- The remote session on the terminal – upon starting the emulator, a terminal will also be launched alongside it. Use phablet as both the username and the password to log in to an interactive ADB session on the emulator. You can also launch other terminal sessions using other communication protocols (see the link at the end of this guide for more details).
- The ubuntu-emulator tool – with this CLI tool, you can manage the lifetime and runtime of Ubuntu images. Common subcommands of ubuntu-emulator include create (to create new instances), destroy (to destroy existing instances), run (as we’ve already seen, to run instances), snapshot (to create and restore snapshots of a given point in time) and more. Use ubuntu-emulator --help to learn about these commands and ubuntu-emulator command --help to learn more about a particular command and its options.
- Make sure you’ve got enough space to install the emulator and create new instances, otherwise the operation will fail (or take a long time) without warning.
- At this time, the emulator takes a while to load for the first time. During that time, you’ll see a black screen inside the phone skin. Just wait a bit until it’s finished loading and the welcome screen appears.
- By default the latest built image from the devel-proposed channel is used. This can be changed during creation with the --channel and --revision options.
- If your host has a network connection, the emulator will use that transparently, even though the network indicator might show otherwise.
- To talk to the emulator, you can use standard adb. The emulator should appear under the list of the adb devices command.
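As a quick sketch, talking to a running emulator over standard adb (assuming the Android platform tools are installed on the host) looks like this:

```shell
# List attached devices; the running emulator instance should appear here
adb devices

# Open an interactive shell session on the emulator
adb shell
```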
I hope this guide has whetted your appetite to start testing the emulator! You can also contribute to making the emulator a first-class target for Ubuntu development. The easiest way is to install it and give it a go. If something is not working, you can then file a bug.
If you want to fix a bug yourself or contribute to code, the best thing is to ask the developers about how to get started by subscribing to the Ubuntu phone mailing list.
If you want to learn more about the emulator, including how to create instance snapshots and other cool features, head out to the Ubuntu Emulator wiki page.
And next… support for the tablet form factor and SDK integration. Can’t wait for those features to land!
- Review ACTION points from previous meeting
- T Development
- Server & Cloud Bugs (caribou)
- Weekly Updates & Questions for the QA Team (psivaa)
- Weekly Updates & Questions for the Kernel Team (smb, sforshee)
- Ubuntu Server Team Events
- Open Discussion
- Announce next meeting date, time and chair
- ACTION: meeting chair (of this meeting, not the next one) to carry out post-meeting procedure (minutes, etc) documented at https://wiki.ubuntu.com/ServerTeam/KnowledgeBase
Pretty straightforward meeting, given that the 14.04 release is still pretty fresh in our minds and ODS was last week. Great demo at UDS! Utopic Unicorn is underway (https://wiki.ubuntu.com/UtopicUnicorn/ReleaseSchedule), with the first Alpha release scheduled for June 26th and vUDS on June 12th. Server team blueprints are also in progress, with the topic blueprint (https://blueprints.launchpad.net/ubuntu/+spec/topic-u-server) already created and dependencies being posted to it.
The Bugs we covered were:
- Launchpad bug 1319555 in ec2-api-tools (Ubuntu Utopic) “update out-dated ec2-api-tools for 12.04” [High,New]
- Launchpad bug 1315052 in lxc (Ubuntu Utopic) “lxc-attach from a different login session fails” [High,Triaged]
- Launchpad bug 1317587 in clamav (Ubuntu Utopic) “ClamAV 0.98.1 is Outdated” [High,In progress]
- Launchpad bug 1317811 in linux (Ubuntu) “Dropped packets on EC2, ‘xen_netfront: xennet: skb rides the rocket: x slots’” [Medium,In progress]
Next meeting will be on Tuesday, May 27th at 16:00 UTC in #ubuntu-meeting.
Additional logs @ https://wiki.ubuntu.com/MeetingLogs/Server/20140520
Since it would briefly come back to life if allowed to rest for awhile unplugged, I bought a backup hard drive of the proper sort (with its own power supply), and plugged it in. However, the DVR died before backing up commenced.
This DVR is leased, so Dish sent along a new one, asking that I return the broken one within 10 days. Yesterday a technician showed up to be sure that everything was working, since the new machine had taken quite long to get a watchable image. I asked him if it was possible to move the old hard drive into the new machine and do the backup, before sending the old machine back? He said yes, although he couldn't do it for us.
So, new challenge: remove the old hard drive from the old DVR. I found a wikibook about the DVR here: en.wikibooks.org/wiki/VIP_922/Dish_Network. There the author says,
With no A/C power connected - from the old 922, pull the internal hard drive and set it aside. To do this remove 4 back cover screws (black), then slide the cover back about 1/2 inch and tilt upwards to remove.

Well now, here was my first problem. Screws, no biggie. But "slide the cover back"? It simply would not move for me. However, teamwork to the rescue. My husband Bob used the straight-slot screwdriver and pried with a bit more power than I would have used, and slide it did. From there on out, the problem was redimensionated (thank you genii for that beautiful term!) and it was only a matter of more screws, unhooking the power and motherboard connection, and sliding out.
After enlisting my husband again to help me slide out the TV cabinet, photograph, and then remove the DVR hookups, I used the same procedure again: I removed the cover, then the HD, and switched the old HD into its place. I left off the cover, Bob hooked up the new machine again, and we turned it on to wait. After the backup hard drive was plugged in, this slow beast restarted again, but we had a new option: backup. Just to be safe, we selected only the Doctor Who eps. If there is room once that's done, I'll select the rest of what I want. The backup is now proceeding, and the readout reports that it will take another 10 hours. OMG, USB is slow!
So, redimensionating is cool. I'm going to try to remember to do it more often. Also, many thanks to the dish tech and wikibook author who both shared their information freely, and my husband who supplied support, muscle, and didn't give up!
Welcome to the Ubuntu Weekly Newsletter. This is issue #369 for the week May 19 – 25, 2014, and the full version is available here.
In this issue we cover:
- Ubuntu is the leading OpenStack distribution
- Ubuntu Stats
- Ubuntu 14.04 Presentations at FeltonLUG and BALUG
- Ubuntu Ohio Team Update: May 22, 2014
- Cloudbase Solutions Partnership
- Ubuntu Cloud Documentation – 14.04LTS
- Xubuntu: Screen locking in Xubuntu 14.04
- Jono Bacon: Goodbye Canonical, Hello XPRIZE
- Sebastian Kugler: Locale changes in Plasma Next
- Michael Hall: App Developer Sprint
- Svetlana Belkin: Ubuntu Scientists Team Update: May 23, 2014
- Lubuntu Blog: LXQt-Admin: System admin tools for LXQt arrived
- Ubuntu Women: Phase 2 of ProjectHarvest
- Unity8 & Mir update May 20, 2014
- Canonical News
- In The Blogosphere
- Other Articles of Interest
- Ubuntu Podcast from the UK LoCo: S07E08 – The One with the Yeti
- Weekly Ubuntu Development Team Meetings
- Upcoming Meetings and Events
- Updates and Security for 10.04, 12.04, 13.10 and 14.04
- And much more!
This issue of The Ubuntu Weekly Newsletter is brought to you by:
- Elizabeth K. Joseph
- Paul White
- Diego Turcios
- And many others
Except where otherwise noted, content in this issue is licensed under a Creative Commons Attribution ShareAlike 3.0 License.
Pasta is a wonderful thing. However, many unwonderful things are done to pasta when it's being cooked. Consider this post a how-to for improving your pasta eating experience.
A sub-par pasta experience is one that involves overcooked, bland pasta with the sauce slopped on top. However, this is very easy to avoid.

How to Improve Your Pasta Experience
Whether you're cooking boxed or fresh pasta, following (and remembering) these few things will improve your pasta cooking abilities and your life, as well as impress your friends & lover(s).*
*these claims may be exaggerated.

1. Use enough salt
Something I see frequently is people boiling water for their pasta and either not salting the water at all or putting only a pinch in.
Salt is crucial to giving your pasta flavour & lowering the overall amount of salt needed for your dish.
Also, if you've heard that salted water cooks food faster (because of its higher boiling temperature), those claims are a bit exaggerated; the amount of salt you're adding is only enough to raise the temperature about 1 degree.

2. Don't add oil
There's a bad practice of adding oil to the pasta cooking liquid to keep it from sticking. This only achieves one thing: oily pasta. And oily pasta means sauce won't cling to it or be absorbed, which equals flavourless pasta.
Adding oil may also keep the pasta water from bubbling up and boiling over the rim, but this can also be achieved by using a large enough pot and by reducing the heat a little (while still maintaining a boil).

3. Stir
During the first minute or two of cooking, give the pasta a good stir to keep it from sticking together.
This is crucial, since during this time the pasta is coated with sticky starch. If you don't stir, pieces of pasta that are touching one another literally cook onto one another.

4. Avoid rinsing
Rinsing the pasta after cooking will cool the pasta and prevent the absorption of sauce. Not to mention it washes away any remaining surface starch, which is advantageous to your dish: the small amount of starch left on the pasta by the cooking water can thicken your sauce slightly when you do incorporate the pasta.

5. Cooking al dente
The term al dente is simply culinary-speak for pasta that is just slightly undercooked, which is considered by many to be the optimal mouthfeel for pasta.
As cooking times vary for different pasta shapes, the only way to truly know is to sample a piece of the cooking pasta and see if it has just a little bite to it when you chew it; this is al dente and considered cooked.

6. Finish cooking in sauce
As it cools, the starch in the pasta crystallizes and becomes insoluble, so the pasta won't absorb as much sauce. As such, I always prepare the sauce first in a large skillet, regardless of its simplicity, before cooking the pasta.
The moment the pasta is done, I scoop it out of the water with a spider and let it drain over the pot for a few seconds. Then I dump it into the hot sauce, stir well, & cover it to let the pasta absorb the sauce for a minute or two before serving.

Bonus: Quick Tomato Sauce Recipe
It would be appropriate of me to provide a sauce recipe after all of that, so here's a quick tomato sauce.
- 1 large (28 oz.) can tomatoes, diced or whole (uncooked)
- 1 can (12 oz.) tomato paste
- 1 bulb of garlic (10-12 cloves), minced or thinly sliced
- 1 onion, diced
- 1-2 tablespoons olive oil
- 2 bay leaves
- 2 tablespoons dry oregano
- 250-300mL red wine
- salt, to taste
- 1 bag baby spinach
- 1 box dry pasta, such as penne or farfalle (bowties)
- Preheat your skillet over med-high.
- Add olive oil and saute the garlic & onion for a few minutes.
- Add the wine, canned tomatoes & tomato paste. Stir.
- Add the bay leaves & oregano. Season with salt. Simmer for 8-10 minutes.
- Cook your pasta and incorporate it into the sauce. Cover & let it absorb the sauce for a few minutes.
- Place the bag of spinach atop the incorporated pasta-sauce mixture & cover (the remaining heat will be enough to wilt the spinach).
- When spinach is wilted, serve and garnish with grated Parmesan, if desired.
I had no idea that innocuous little blog post would result in a friendship with the author, Daniel Suarez, himself. Over the subsequent 6 years, Daniel and his publicist, Michelle, sent me an early preview print of the sequel to Daemon, Freedom™, as well as his next two books, Kill Decision and Influx.
I read Influx in December 2013, a couple of months before its official release, on a very long flight to Helsinki, Finland.
Predictably, I thoroughly enjoyed it as much as each of Daniel's previous 3 books. One particular story arc pays an overt homage to one of my favorite books of all time -- Alexandre Dumas' The Count of Monte Cristo. Influx succeeded in generating even more tension for me. While it's natural for me to know, intuitively, the line between science and fiction for the artificial intelligence, robotics, and computer technology pervasive in Daemon, Freedom™, and Kill Decision, Influx is in a different category entirely. There's an active, working element of newfound thrills and subconscious tension not found in the others, built on the biotechnology and particle physics where I have no expertise whatsoever. I found myself constantly asking, "Whoa shit man -- how much of that is real?!?" All in all, it makes for another fantastic techno-thriller.
After 5+ years of email correspondence, I actually had the good fortune to meet Daniel in person in Austin during SxSW. My friend, Josh (who was the person that originally gave me my first copy of Daemon back in 2008), and I had drinks and dinner with Daniel and his wife.
It was fun to learn that Daniel is actually quite a fan of Ubuntu (which made a brief cameo on the main character's computer in Kill Decision). In fact, Daniel shared that he wrote the majority of Influx on a laptop running Ubuntu!
Good libraries come with good documentation. It is therefore essential for KDE Frameworks to provide comprehensive online and offline documentation.
- kgenapidox: Generate documentation for a single framework
- kgenframeworksapidox: Generate documentations for all frameworks as well as the landing page which lets you list and filter frameworks by supported platforms
- depdiagram-prepare and depdiagram-generate: Generate dependency diagrams (requires CMake and Graphviz)
In this post I am going to talk about kgenapidox, which is the tool you are most likely to run by yourself. While it is often good enough to read documentation online through api.kde.org, it is also useful to be able to generate documentation offline, for example because your Internet access is slow or you are currently offline, or because you want to improve the existing documentation. kgenapidox is the tool you want to use for this.

Installing
The first thing to do is to install KApidox. The code is hosted in a Git repository on KDE infrastructure. Get it with:

git clone git://anongit.kde.org/kapidox
KApidox tools are written in Python. In addition to Doxygen, you need to have the pyyaml and Jinja2 Python modules installed. If your distribution does not provide packages for those modules, you can install them with:

pip install --user pyyaml jinja2
KApidox itself can be installed the standard Python way using python setup.py install. You can also run KApidox tools directly from the source directory.

Generating Documentation
You are now ready to generate documentation. Go into any checkout of a framework repository and run kgenapidox:

$ kgenapidox
19:08:48 INFO Running Doxygen
19:08:49 INFO Postprocessing
19:08:50 INFO Done
19:08:50 INFO Doxygen warnings are listed in apidocs/doxygen-warnings.log
19:08:50 INFO API documentation has been generated in apidocs/html/index.html
As you can see from the command output, the documentation is generated by default in the apidocs/html directory. You can now open the documentation with your preferred browser. kgenapidox can also tell Doxygen to generate man pages or Qt compressed help files. Run kgenapidox --help for more details.

Improving the Documentation
If you maintain a framework, contribute to the KDE Frameworks project or want to get involved, open the warning file generated in apidocs/doxygen-warnings.log and start fixing! Improving the documentation of a framework can make it much more useful, so it is a very welcome contribution.
Vim tip: The warnings file can be loaded in the quickfix list with :cfile apidocs/doxygen-warnings.log.
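Putting the steps above together, a complete offline-documentation session might look like the following sketch; the kcoreaddons checkout is just an illustrative example of a framework repository:

```shell
# Fetch KApidox and install its Python dependencies
git clone git://anongit.kde.org/kapidox
pip install --user pyyaml jinja2
(cd kapidox && python setup.py install --user)

# Generate the docs from inside a framework checkout
cd kcoreaddons            # any framework repository works; this one is an example
kgenapidox                # output lands in apidocs/html/index.html
```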
Wow. Just wow.
Last week I went to a Canonical Sprint in Malta. The track was about the Client, and we focused on how to make Ubuntu for Phones even better.
We also discussed some new features and new designs that we will introduce in the coming weeks. Stay tuned! During the week I fixed many bugs and wrote a lot of code.
But I think the Sprint isn’t (only) about code and features. IMO it’s much more about networking. You know, we work mainly via IRC and mailing lists, and however wonderful an online friendship can be, it will never be the same as drinking some beers together!
So, first of all, thanks to Canonical for the invite, and for the awesome adventure I have been living this past year.
But mainly, thanks to all the guys I met. This week helped me understand that the people I work with are not only cool developers, but also awesome people!
There were 6 of us from the community who joined the sprint: other than me, there were Nik, my roommate, who is making the Clock app rock; Victor and Andrew, who develop the Music app; Kunal, our Calendar wizard; and Adnane, who works on the HTML5 SDK. It’s an honour to code side by side with these guys.
Then there was the Canonical Community Team. Alan and David are my mentors, my points of contact inside Canonical; it was a pleasure to meet them and to have a beer (or 2, maybe 3…) together! Michael is the man who keeps developer.ubuntu.com updated (and he does a lot of other things), and when I report a bug I know it will be fixed in five minutes. Nicholas is a cool guy, but he wants me to use Autopilot. No one is perfect. There was also Daniel. I had worked with him only on a few patches in December, so I had no chance to speak with him before this week. And you know what? He’s so funny! Last but not least, Jono: we had some time to talk, and I’m very happy about that: thanks for all your work at Canonical, and good luck for your future!
All the other Canonical employees were very kind too, and I met some guys I want to see again soon; now I know whom to ping on IRC to get a bug in the SDK fixed ;-)
Also, other guys I used to know only on IRC (boiko, elopio, mardy, zsombi and others) now have a face! And new guys, from Design and QA and online account teams!!!
It was a wonderful week, and I have no words to say how happy I am, or to thank everyone enough!
So, thanks again to everyone. I hope to see you all soon, and let’s continue to make Ubuntu rock!
This work is licensed under Creative Commons Attribution 3.0 Unported
I managed to spend a few hours doing Debian stuff again today, which was great.
Today I learned about blhc, which is sadly not mentioned in the wiki page on hardening, which I always refer to. It turns out that it is mentioned in the walkthrough wiki page linked off it though. I'd not read that page until today. Many thanks to Samuel Bronson on IRC for pointing out the tool to me.
Initially I didn't think the tool told me anything I didn't already know, but then I realised it was saying that the upstream Makefile wasn't passing in $(CPPFLAGS) and $(LDFLAGS) when it invoked the compiler. Now that I know all of that, the build warning also mentioned in the PTS makes a whole lot more sense. Definitely a case of "today I learned..."
So I made a simple patch to the upstream Makefile.in and simpleproxy is now all appropriately hardened. I'm very happy about that, as it was annoying me that it wasn't Lintian-clean.
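For illustration, the kind of fix involved usually looks something like the following Makefile.in sketch; the target and file names here are hypothetical (not simpleproxy's actual build rules). The point is simply to make sure the hardening flags that debhelper exports via CPPFLAGS and LDFLAGS actually reach the compile and link invocations:

```make
# Before: only $(CFLAGS) reached the compiler, so the hardening
# preprocessor and linker flags were silently dropped:
#   $(CC) $(CFLAGS) -o prog prog.c
# After: all three flag variables are passed through
prog: prog.c
	$(CC) $(CPPFLAGS) $(CFLAGS) $(LDFLAGS) -o prog prog.c
```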
I was able to use the same technique to similarly fix up sma. It's somewhat entertaining when you maintain a package for almost 7 years, and the upstream homepage changes from being the software author's website to what appears to be erotic fiction advertising for London escorts... That made for some entertaining reading this morning.
I've now managed to give all my packages a spring clean. I might do another pass and convert them all to debhelper 9 as a way of procrastinating before I touch isc-dhcp.