2012-04-15: I found it easier to work with signed URLs instead of a
separate header so the blog post was updated to use these.

I like finding new ways to use Ubuntu One as easily as possible. I have to
admit that the approach described in the headless Ubuntu One article is not
quite reliable, since the file is read completely into memory before being
transferred; for large files this can be a problem. There are ways to make it
stream, but while investigating this I found out that we can use curl for
uploads and downloads, without any Python wrapper.

But in order to make it work this way, you need just a tiny bit of preparation.

First of all, you should know that Ubuntu One clients never use your password
for authentication. Instead, a mechanism called OAuth is used: every connected
device is assigned its own set of credentials, so you can revoke access
whenever you need to and keep the actually secret password to yourself.

curl does not support OAuth natively, but since OAuth is a standard there are
multiple tools you can use to generate a signed URL as in the examples; no
proprietary extensions or cookie storage are involved.
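The core primitive behind such a signed URL is an OAuth 1.0 HMAC-SHA1
signature, which plain shell can produce with openssl. A minimal sketch of the
signing step only; the base-string construction and percent-encoding that the
OAuth spec requires are glossed over here, and the secrets are placeholders:

```shell
# Sign an OAuth 1.0 base string with HMAC-SHA1.
# The signing key is "<consumer_secret>&<token_secret>" (both percent-encoded
# in a real client).
oauth_sign() {
  base_string="$1" consumer_secret="$2" token_secret="$3"
  printf '%s' "$base_string" \
    | openssl dgst -sha1 -hmac "${consumer_secret}&${token_secret}" -binary \
    | openssl base64
}

# The resulting value goes into the oauth_signature parameter of the URL.
# oauth_sign "GET&https%3A%2F%2Fexample.com%2Ffile&oauth_token%3Dabc" cs ts
```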

First you need to obtain the tokens. We are using pretty much the same
ubuntuone-sso-login.py script I mentioned in the article about the REST
client; however, to make it simpler to pass the credentials to other
applications, I added the --format (-f) option, which does the following:

Please note that I am specifying the Content-Type header too. When you ask
curl to upload with --data-binary, it sets the Content-Type to
application/x-www-form-urlencoded by default, while Ubuntu One servers require
the correct type (application/octet-stream for raw file content). If you omit
the header, you will receive an error.
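A hedged reconstruction of the two calls this describes; the signed URL and
file names are hypothetical placeholders, only the curl flags and the explicit
Content-Type header matter:

```shell
# Upload: --data-binary streams the file bytes as-is; the Content-Type
# must be set explicitly or the server rejects the request.
upload() {  # upload <file> <signed-url>
  curl --data-binary "@$1" \
       --header "Content-Type: application/octet-stream" \
       "$2"
}

# Download: a plain GET of the same kind of signed URL.
download() {  # download <signed-url> <file>
  curl -o "$2" "$1"
}

# upload backup.sql "$SIGNED_URL"
# download "$SIGNED_URL" backup.sql
```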

I have some SQL databases of text messages, expenses and a test Openfire
installation. I upload the backups to Ubuntu One regularly with the following
simple script. I put the OAuth exports into the ~/.ubuntuone-credentials file,
which gets sourced.
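A sketch of what such a script can look like; the database name, the dump
command and the signed-URL step are my assumptions, not the author's actual
script:

```shell
#!/bin/sh
# Source the OAuth exports if present.
[ -r "$HOME/.ubuntuone-credentials" ] && . "$HOME/.ubuntuone-credentials"

# Build a date-stamped backup file name, e.g. messages-20120415.sql.gz
backup_name() {  # backup_name <dbname>
  printf '%s-%s.sql.gz' "$1" "$(date +%Y%m%d)"
}

# Hypothetical flow:
# mysqldump messages | gzip > "/tmp/$(backup_name messages)"
# curl --data-binary "@/tmp/$(backup_name messages)" \
#      --header "Content-Type: application/octet-stream" "$SIGNED_URL"
```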

Once upon a time I had no stable internet connection, mult.ru distributed
their “Масяня” movies as .exe files, and I was not happy with those on a Linux
system.

To make a standalone movie, flashplayer.exe was simply concatenated with the
.SWF file, so to get the .SWF back out I only needed to find its offset and
end marker (if any).

I wrote a simple application in C that was doing exactly this.
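The same trick can be sketched in shell instead of C, assuming the
uncompressed SWF magic "FWS" (compressed files start with "CWS") and ignoring
end markers:

```shell
# Extract the embedded SWF from a flashplayer.exe+SWF concatenation by
# cutting everything from the first SWF magic onwards.
extract_swf() {  # extract_swf <input.exe> <output.swf>
  in="$1"; out="$2"
  # grep -abo prints the byte offset of each match; take the first one.
  off=$(grep -abo 'FWS' "$in" | head -n1 | cut -d: -f1)
  [ -n "$off" ] || return 1
  # tail -c +N is 1-based, so start at byte off+1 to keep the magic itself.
  tail -c +"$((off + 1))" "$in" > "$out"
}
```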

At some point my friend Alex asked me to see whether it was possible to
recover any data from his hard drive: its file system table had been
completely overwritten by something, and no data recovery software was able to
extract anything.

I dumped an image of the whole 20 GB drive and began thinking about how data
can be recovered when you don't know where to look.

And so the nxtractor project was born. Over the next couple of days I examined
the signatures and markers of the most common file types and wrote a scanner
that was sufficiently good and extensible.
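The core idea of such a scanner, reduced to a sketch: walk the raw image
looking for known file-type magic bytes and report their offsets (the "JFIF"
marker used here is just an illustrative signature):

```shell
# Report the byte offsets where a file-type signature occurs in a raw image.
scan_magic() {  # scan_magic <image> <signature>
  img="$1"; magic="$2"
  # -a: treat binary as text, -b: print byte offset, -o: one match per line
  grep -abo "$magic" "$img" | cut -d: -f1
}

# e.g. offsets of JFIF JPEG headers in a disk image:
# scan_magic disk.img "JFIF"
```

A real carver then cuts from each offset up to the next signature or an
end-of-file marker.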

I did not recover much information from that drive; the data was too
fragmented.

This was back in 2003.

Fast forward to today, when SD cards hold as much data as the hard drives of
the past, and storage devices are used by music players, phones and cameras.

Now imagine you have deleted all the photos from the camera. What would you
do?

Yes, that's the ideal outcome. In some cases you will need to adjust the
types.dat file: for example, to prevent false positives on TIFF files, comment
out the whole type=tiff block. To make sure you are not extracting images
embedded in JPEG files, set minsize in type=jpeg to 1000000, so that files
smaller than 1 MB are not considered real JPEGs.
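Sketched as a types.dat fragment; the exact file syntax is assumed from the
options mentioned above, not taken from nxtractor documentation:

```
# type=tiff        <- comment the whole block out to avoid false positives
# ...

type=jpeg
minsize=1000000    # skip JPEG signatures inside files smaller than ~1 MB
```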

For example, my recent removal of the photos resulted in the following:

I keep doing this over and over again, so along with writing the Puppet config
I am posting the recipe for my WC3119 configuration here in case somebody else
finds it useful. I will describe the setup of a Xerox WC3119 for remote
printing and scanning in Ubuntu Precise Pangolin (12.04).
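The printing half of such a recipe usually boils down to one lpadmin call. A
sketch only: the device URI and PPD path are hypothetical and depend on which
driver package you install for the WC3119:

```shell
# Register the WC3119 with CUPS and make it the default queue.
setup_printer() {
  lpadmin -p WC3119 -E \
    -v "usb://Xerox/WorkCentre%203119" \
    -P /usr/share/ppd/custom/wc3119.ppd
  lpoptions -d WC3119
}
```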

I played with the software a bit and then decided to cancel the free trial
after 10 days. I had my PayPal account attached to my Live/Xbox account, so
all charges originating there are processed automatically.

A good thing about Ukrainian banks is that when you sign up for SMS banking,
you get a notification about every transaction attempt. And, thanks to the
industry's saner state of mind, there are no NSF fees, unlike in some parts of
the world.

So at 2 AM today I got a notification that a charge had been attempted. My
PayPal account is connected to a card I top up with the exact amount I plan to
spend, nothing more. Had I left some extra cash there, I would have been
charged £8.99 (I created the account while I was in the UK).

As a confirmation of the attempt I got the message from MSFT ZUNE:

We are contacting you because we have been unable to charge your PayPal
Account for your 1-month subscription service(s) being billed to you through
Microsoft Online Services. The following PayPal Account is the current
payment method on your billing account:

The site itself tells a different story:

Try it free

The low risk way to try before you buy. Enjoy the full Zune Music Pass
benefits free for 14 days (Trial continues to paid subscription).**

...

** Free 14 day trial will automatically continue to a paid subscription
unless cancelled before the end of the trial period. Limit 1 trial per
person.

Once in a while I return to the idea of writing something in Vala. Vala's
integration with GLib and GNOME is outstanding: you can write a useful DBus
client in 10 lines, then compile and run it at native-code speed without any
scripting-language interpreter overhead.

Ubuntu One filesync service exports a ton of DBus methods that are then used
for the Control Panel, nautilus plugin, rhythmbox music store plugin and
others. All the communication between the clients and the service on Ubuntu
is based on DBus.

So, let's make a client that asks syncdaemon for a list of all published
files and prints it out. Before we start, we need to find out what
methods and signals are actually exported.
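One way to find that out from a terminal is plain DBus introspection. The bus
name and object path here (com.ubuntuone.SyncDaemon, /publicfiles) are
recalled from memory and may differ between releases:

```shell
# Ask a DBus service to describe its methods and signals.
introspect() {  # introspect <bus-name> <object-path>
  dbus-send --session --print-reply --dest="$1" "$2" \
    org.freedesktop.DBus.Introspectable.Introspect
}

# introspect com.ubuntuone.SyncDaemon /publicfiles
```

The reply is an XML blob listing every interface, method and signal the
object exports, which is exactly what we need before writing the Vala client.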

I run a few Ubuntu installations connected to the Internet. As you may know,
once a machine is connected to the Internet, it is subject to various hacking
attempts, both automated and manual.

The most widespread attack vector against *nix machines is SSH brute-forcing.
I once became a victim of such an attack, and now all my machines use SSH
public key authentication only. I was curious what passwords the attackers
were trying, so I came up with a simple idea for collecting them.
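Capturing the passwords themselves needs a patched sshd, since the default
logs never contain them; the attempted usernames, however, are already in
auth.log. A sketch of tallying those as a starting point:

```shell
# Count attacker usernames from sshd "Failed password" lines in auth.log.
top_users() {  # top_users <auth.log>
  sed -n 's/.*Failed password for \(invalid user \)\{0,1\}\([^ ]*\) from.*/\2/p' "$1" \
    | sort | uniq -c | sort -rn
}

# top_users /var/log/auth.log
```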

As the old joke goes, “you cannot lose an Android phone because it is always
near the power socket”. This is exactly my experience with my Acer Liquid E:
whenever I use it to browse the Internet over WiFi or 3G, I can drain it
completely in two hours.

So right before we went on a bus trip to Norway, I decided to find out what my
options were for extending my phone's battery life.

Of course, there are high-capacity batteries for all normal phones.
Unfortunately, that is not an option for mine.

My wife also uses an Android-based HTC Desire Z, so being able to charge
different devices is definitely a plus.

We also have cameras that take AA batteries, so the ultimate choice would be a
unit able to charge external batteries too.

So your capacity basically depends on which batteries you choose. The unit I
purchased came with four 2700 mAh AA batteries; I purchased another set as a
backup.

It turned out that one set of these batteries can fully charge a phone and a
half. We were also able to charge two phones simultaneously using a USB hub –
pure awesomeness!

It is worth noting that when the unit works as a USB charger from a wall
outlet, it does not charge the batteries. When working as an AA/AAA charger,
it tracks each battery's charge separately.

The only minor issue I have found so far is that when the batteries are almost
depleted, the converter starts producing a long squeaky sound until it shuts
down.