The Face of Venus

The beautifully named Openvenus platform aims to simplify an administrator's life; it runs commands against groups of servers and distributes files or software from a central location.

An administrator's daily work comprises many tasks, some of which are quite tedious: installing patches, setting up new servers, or rolling out changes to every computer in the enterprise. Virtualization and cloud computing do not improve the situation: because a new virtual server can be set up quickly from a master image, the temptation is to do so more often than necessary. Maintaining and managing the resulting crop of images, however, requires the same effort as caring for dedicated hardware.

To streamline these tasks, admins tend to write their own toolsets or customize existing management solutions to manage their computer pools efficiently. scVenus [1] by Science+Computing is one such commercial management solution.

Munich-based, open source developer Albert Flügel was quite impressed by scVenus when he was subcontracting as a system administrator in a large environment. The customer used scVenus to manage a four-digit number of clients. When the license for the framework expired, however, the team of administrators faced the problem of continuing to manage what, in some cases, were old operating system versions.

Switching to a fresher solution such as Puppet was not an option, because it would have meant deploying new software on those aging systems. This restriction, along with the features missing in scVenus, prompted Flügel to continue development on an implementation he had begun some time before: Openvenus [2]. The software is modeled on scVenus. Because Flügel's admin team had already invested a great deal of work in their scVenus scripts, his goal was to keep Openvenus compatible with scVenus at the API level.

What the System Offers

Administrators who install an Openvenus server (see the "Installation" box) have a platform that facilitates centralized system management. The software not only controls Linux machines (Red Hat, Fedora, CentOS, SUSE) but also Solaris and HP-UX; you can even integrate Windows systems if the Cygwin environment is installed.

Installation

The software is available online [3] as a source code package. To build and operate Openvenus, you need a multithreading-capable operating system such as Linux, plus Perl, OpenSSL, and the Afbackup [3] utility library; the library comes from the same author as Openvenus.

In the lab, my tests revealed that it is not a good idea simply to run ./configure && make && make install. Instead, replace the third step with make rpms and install the packages that this step creates. The reason is that the RPMs carry post-install scripts that, in particular, make provisioning the server much easier than the plain make install mechanism does.
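The recommended build sequence can be sketched as follows; note that the RPM output paths and package filenames are assumptions here, since they depend on the version built and the build host:

```shell
# Build Openvenus from source, producing RPMs instead of
# running "make install" directly:
./configure
make
make rpms        # replaces the usual "make install" step

# Install the generated packages; the paths and names below
# are hypothetical and will differ on your system:
rpm -ivh rpmbuild/RPMS/x86_64/openvenus-basic-*.rpm
rpm -ivh rpmbuild/RPMS/x86_64/openvenus-server-*.rpm
```

The post-install scripts in the RPMs then take care of provisioning steps that the plain make install target leaves to you.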

Openvenus runs commands simultaneously on many computers. To allow this to happen, admins group the devices; a computer may belong to multiple groups. Additionally, the framework distributes individual files or entire software packages to the managed systems.

Technically, Openvenus comprises a master server to which the managed clients connect (Figure 1). On the server, you install two packages, Basic and Server, whereas the clients only need the basic package. Large setups can also run more than one server. Authentication between client and server relies on SSL certificates issued by the master in the course of each client installation. Once the connection has been established, the server can remotely control its clients and transfer files or software packages to them.

Figure 1: Structure of a typical Openvenus setup.

Many Bosses

Openvenus is designed for multiple administrators, each of whom has their own account on the master and is assigned a separate set of tasks. To trigger actions, the administrator logs into the master. For each command, you can configure which administrator (or group) may run the command on which host (or in what group). Special rights can also be constructed; for example, John is given administrative access to all hosts in the Accounts group, except to the Banking host. The lists of administrators and host groups are stored either locally on the master or in a directory service such as NIS or LDAP; the latter is recommended.

On My Mark

When executing commands, Openvenus has two modes. Administrators can run simple commands as the superuser on a client with ovprdo, while ovrdo runs the command as a normal user. The output from the commands appears on the console where they were started and is written to the log; Openvenus also evaluates the exit code of each command it runs.
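A quick check across a host group might look like the following sketch; the group name webservers is hypothetical, and the exact argument syntax may differ in the version you install:

```shell
# Run a command as root on every host in the (hypothetical)
# "webservers" group; output comes back to this console and
# is also written to the log:
ovprdo webservers 'rpm -qa --last | head -5'

# The same idea as an unprivileged user, via ovrdo:
ovrdo webservers 'uptime'
```

Because Openvenus evaluates the exit code of each command, a failing command on one host shows up as an error without aborting the run on the others.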

If a host from the list is unreachable, Openvenus logs this as an error but, by default, does not queue the task. You can queue commands for unreachable hosts using the +N option; the non-privileged variant is not designed for batch operation. Figure 2 shows a command workflow.
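Queuing for unreachable hosts can be sketched as follows; the placement of +N relative to the other arguments is an assumption, as is the group name:

```shell
# With +N, the command is queued for hosts that are currently
# unreachable and delivered once they reconnect. This applies
# to the privileged variant only; ovrdo is not designed for
# batch operation.
ovprdo +N webservers 'yum -y update openssl'
```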

Figure 2: Here, Openvenus is controlling the execution of a command on venusclient.

The second mode, a variant for more complex operations, is methods, for which Openvenus has a separate API. Methods are implemented as Perl or Bash scripts. The admin checks them into a repository and can then use ovrapply to run them on the client side. The ovrapply command uses batching by default, but you can disable it with the -N option.
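Invoking a method might look like this sketch; the method name fix_resolv and the group name are hypothetical, and the argument order may differ from the installed version:

```shell
# Run a method (a Perl or Bash script previously checked into
# the repository) on all clients in a hypothetical group:
ovrapply webservers fix_resolv

# ovrapply batches by default; -N disables batching so the
# method runs only on hosts that are reachable right now:
ovrapply -N webservers fix_resolv
```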

On the clients, the ovpoll service queries the server to see whether tasks are pending. Alternatively, ovqpush triggers a queue run on the server. For the queues themselves, you can use commands to list outstanding jobs and to delete jobs. The server does not try on its own to restart a failed job, but you can do this with a push or pull cronjob.
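The push side of this can be sketched as follows; the cron schedule and the path to ovqpush are assumptions for illustration:

```shell
# Trigger a queue run on the server so that pending jobs are
# pushed out to clients immediately:
ovqpush

# A hypothetical crontab entry to retry queued/failed jobs
# every 15 minutes (path and schedule are assumptions):
# */15 * * * * /usr/bin/ovqpush
```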

