Blog

Adobe’s Portable Document Format (PDF) has reached great popularity in recent years and is the number one format for easy document exchange. It comes with great features such as embeddable images and multimedia, but also has some rather unpleasant properties. The so-called security features implement a simple Digital Rights Management (DRM) system and allow PDF authors to restrict how a file may be used. Using this DRM system, authors can allow or deny actions such as printing the file, commenting, or copying content.
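To see what restrictions a given PDF actually carries, you can ask pdfinfo from the poppler-utils package – a small sketch, assuming that tool is installed (the function name and sample file are mine):

```shell
# Sketch: inspect a PDF's DRM flags with pdfinfo (poppler-utils).
# pdfinfo prints a line like "Encrypted: yes (print:no copy:no ...)".
pdf_perms() {
    pdfinfo "$1" | grep -i '^Encrypted'
}
# Usage: pdf_perms thesis.pdf
```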

Even though this is a good idea in some situations, most of the time it’s just annoying: collecting ideas for a seminar paper or a thesis, for instance, is almost impossible without being able to copy and paste certain paragraphs from the PDF.

Nowadays, it appears to me as if almost everything in the big and fancy world of IT requires signing up and creating an account: every little online tool, every social networking site and, of course, every instant messenger. System administrators are hit even harder: setting up a server machine means creating lots of different users for every kind of service – be it Postfix, Sendmail, Courier, MySQL, PostgreSQL, etc. Most of them require some kind of super-user password or account.

This is where a password manager comes in handy: open the password vault by typing in the master password, put in all your secrets and crucial information, save it and be happy. As if!

Almost every password manager I found on the Web was so cluttered with features and details that it took minutes to add a single account. What I wanted was something like a password-protected text file – and that’s what I made: a simple command-line password safe.
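The core of the idea fits into two shell functions – a sketch assuming GnuPG is available; the file path and function names are my own invention, not part of any particular tool:

```shell
# Minimal command-line password safe: a gpg-encrypted text file.
# SAFE location and function names are illustrative assumptions.
SAFE="$HOME/.passwords.gpg"

# Decrypt the safe to stdout and grep it for an account name.
pass_get() {
    gpg --quiet --decrypt "$SAFE" | grep -i "$1"
}

# Re-encrypt an edited plain-text file as the new safe.
pass_save() {
    gpg --symmetric --cipher-algo AES256 --output "$SAFE" "$1"
}
```

gpg prompts for the master passphrase itself, so nothing secret ever has to appear on the command line.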

There are various peer-to-peer protocols out there. All of them focus on the decentralisation of storage and other system resources. Most implement a distributed hash table (DHT) to store information: each node of the network holds only a small part of the hash table, but is able to locate and retrieve any requested entry. Kademlia, a protocol designed at NYU in 2002, is one of them.
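What makes Kademlia elegant is its distance metric: the distance between two node IDs is simply their bitwise XOR, interpreted as an integer. Real IDs are 160 bits; a sketch with 8-bit toy values:

```shell
# Kademlia's notion of distance between two node IDs: bitwise XOR.
# The 8-bit toy IDs below stand in for real 160-bit identifiers.
kad_distance() {
    echo $(( $1 ^ $2 ))
}

kad_distance 0x36 0x2F   # 00110110 XOR 00101111 = 00011001 = 25
```

Because XOR is symmetric and satisfies the triangle inequality, a node can find ever-closer neighbours of a target ID by repeatedly asking the closest nodes it already knows.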

In the last few years, the Internet has become increasingly important in many areas of our lives. Not only have private households discovered the nearly endless possibilities of the Web; companies have also found many ways of generating revenue online. Most of the global players and many medium-sized IT companies have realized what opportunities the Web and its technologies provide, and have used them to build new services for consumers and businesses. To compete in this evolving market, companies from traditional business areas such as newspaper publishing or TV broadcasting had to diversify their product lines and are forced to react quickly, flexibly and cost-efficiently to ever-changing demands and technologies. In fact, every company has to adopt these technologies efficiently to stand a chance in the growing market.

Along with its benefits – cost savings and new customers – every new technology also comes with more or less well-known downsides. Even if IT managers carefully consider most of the details of how to use and implement them, new software, hardware or resources will – no matter what – always cause unpredicted problems. Because today’s companies depend on IT, every downtime, bug or overload of a production system directly results in declining profits and higher costs. For service providers in particular, any downtime is business-critical to many dependent companies and has to be prevented.

Therefore, companies spend a considerable amount of money and time on creating a stable, flexible and extensible IT environment that supports their business by minimizing risks, increasing availability and allowing them to provide better service levels to their customers.

Virtualization is a key technology for achieving these goals. It makes it possible to run multiple virtual computers on the same physical system: by creating an abstraction of the underlying hardware, a variety of virtual machines (VMs) can be executed on top of a single virtualized host.

This article will discuss how virtualization works, what advantages it offers and why it is an essential part of today’s data centers. The focus will be on the server virtualization solution VMware Infrastructure, the flagship product suite of VMware Inc.

As some of you might know, Unison is this great tool that allows bidirectional synchronisation between two hosts – no matter which operating system they’re running… well, at least the well-known ones are supported.

Since Unison can also be used to synchronise more than two hosts, it’s perfect for large amounts of data that have to be shared within a team.

A scenario like this is possible and works for me: UserA <-> Server <-> UserB.
Of course, other users can sync with the server as well. Unison rocks!
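In this star topology, each client only needs one Unison profile pointing at the central server. A sketch of such a profile – host name and paths are placeholders:

```
# ~/.unison/team.prf -- sync a local directory against the central server
root = /home/usera/shared
root = ssh://server.example.com//srv/shared

# Skip scratch files so they don't get propagated to the team
ignore = Name *.tmp
```

Running `unison team` then reconciles the local copy with the server; every other team member does the same against their own copy.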

Today, after reinstalling his OS, my friend got the following error message:

GCALDaemon is a great tool for synchronizing many of Google’s services, such as Google Calendar and Contacts, with your local PC. Unfortunately, installing it on Ubuntu/Kubuntu or any other Linux distribution is still not very comfortable. For this reason, I sat down for a few hours, packed the tool into a deb package and added a nice command-line tool to simplify some of the basics.

Since nearly everybody in the US and more and more Europeans have an iPod, and the whole world loves YouTube, wouldn’t it be nice to copy those Flash streaming videos (FLV files) to your iPod Video? Yes, it is possible – and I will tell you how.
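The conversion step itself comes down to one ffmpeg call. A sketch wrapped in a function – the exact flags vary by ffmpeg version, and the codec settings and 320x240 resolution are my assumptions for the iPod Video screen:

```shell
# Convert a downloaded FLV to an iPod-compatible MP4 with ffmpeg.
# Codec flags and the 320x240 target are illustrative assumptions.
flv2ipod() {
    ffmpeg -i "$1" -acodec aac -ab 128k \
           -vcodec mpeg4 -b 1200k -s 320x240 "${1%.flv}.mp4"
}
# Usage: flv2ipod funny_cat.flv   -> writes funny_cat.mp4
```

The resulting MP4 can then be dragged into iTunes and synced to the iPod like any other video.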

Working on the console is sometimes tiring, especially when you have to rename files; using Nautilus is much quicker for these kinds of actions. The problem is that if you’re working deep down in your file tree and your path is very long, it may take you a few extra seconds to open that path in the Nautilus browser. So wouldn’t it be much easier to simply type naut on the console to open Nautilus in the current working directory?
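A small shell function does the trick – drop it into your ~/.bashrc (the name naut is, of course, arbitrary):

```shell
# Open Nautilus in the current (or a given) directory,
# detached from the shell so the terminal stays usable.
naut() {
    nautilus "${1:-$PWD}" >/dev/null 2>&1 &
}
```

After re-opening the terminal (or sourcing ~/.bashrc), `naut` opens the working directory and `naut /some/path` opens any other one.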

The good thing about the file sync tool Unison is that it’s available for several operating systems. This is great for teams whose members work on different systems (Mac, Linux and Windows) but want to share and synchronize files via a remote server.

The bad thing about Unison, on the other hand, is that its backwards compatibility is anything but great, so you have to make sure that everybody in the team uses the same version – and that can be tricky, depending on which system you are using.

My home system is Ubuntu Hardy, the remote server runs Debian Etch. Both come with Unison 2.16.13, which would be fine if Apple’s new Leopard didn’t ship with the latest version, 2.27.57. Long story short: I needed the newest version on both Hardy and Etch.
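What worked for me was building from source. A sketch of the steps, wrapped in a function so they can be reviewed before running – the download URL is from memory and may have moved, and you need the OCaml compiler first (e.g. `apt-get install ocaml`):

```shell
# Build Unison 2.27.57 from source on Hardy/Etch.
# The download URL is an assumption; check the Unison homepage.
build_unison() {
    wget http://www.cis.upenn.edu/~bcpierce/unison/download/releases/unison-2.27.57/unison-2.27.57.tar.gz
    tar xzf unison-2.27.57.tar.gz
    cd unison-2.27.57 || return 1
    make UISTYLE=text      # text UI only, so no GTK dependencies needed
    sudo cp unison /usr/local/bin/
}
```

Afterwards, `unison -version` on both machines should report 2.27.57, matching the Leopard client.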