* Odin Authenticator - a cookie-based single sign-on system for Apache - [[User:japhy|japhy]]


* A short potpourri of tools which proved useful when working in a large & rather homogeneous infrastructure. Will include war stories. pssh, graylog2, puppet & facter, lldpd, pt-query-diag, etc. - [[User:Robe|Robe]]

** Runit, a sane reimplementation of Dan J. Bernstein's daemontools under a BSD-like license. A tool for managing long-running services (I'd say "daemons", but they don't daemonize – they run in the foreground and are supervised by runit): http://smarden.org/runit/


** LXC - Linux Containers, lightweight container-based virtualization for Linux (can be used for truly clean builds or for testing packages): http://lxc.sourceforge.net/
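A quick illustration of the runit model mentioned above: a service is a directory containing an executable "run" script, and the supervised process stays in the foreground. The service name and its flag below are hypothetical placeholders, not a real service.

```shell
#!/bin/sh
# Hypothetical /etc/sv/myservice/run script for runit.
# "myservice" and its --foreground flag are made-up placeholders;
# the key point is that the process must not daemonize.
exec 2>&1                    # merge stderr into the logged stdout
exec myservice --foreground  # replace the shell with the service itself
```

runsv then supervises the process (restarting it when it dies), and "sv up myservice" / "sv down myservice" start and stop it.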

FPM is a command-line tool that converts between different package formats (from: local directory, ruby gem, php pear, python module, node.js npm package, deb, rpm; to: local directory, deb, rpm, solaris, tar). The "local directory" pseudo-format allows for easy creation of quick-and-dirty packages that Just Work, short-circuiting everything from the Debian Policy Manual all the way up to dh_make with one quick script call. FPM doesn't actually build your software (as in: unpacking, configuring, compiling, etc.); it just takes the directory you point it at and turns it into a package. Preparing the compiled files and putting them in the right place first is your job. So is specifying the dependencies - there are no automatic dependencies on dynamically linked libraries.
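As a sketch of how that looks in practice (the package name, file, version, and dependency below are made-up examples, not from the talk):

```shell
# Stage files exactly where they should land on the target system.
# "myscript" is a hypothetical file you built or wrote beforehand.
mkdir -p staging/usr/local/bin
install -m 0755 myscript staging/usr/local/bin/myscript

# One fpm call turns the staged tree into a .deb: "-s dir" reads the
# "local directory" pseudo-format, "-C" makes "staging" the package
# root, and "-d" declares a dependency by hand (fpm adds none itself).
fpm -s dir -t deb -n myscript -v 1.0.0 -d python3 -C staging .
```

The resulting myscript_1.0.0_*.deb can then go straight into an apt repository.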


Workflow for continuous integration, using bits & pieces from the gist:


** On the developer's workstation, the "vagrant up" command brings up a new VirtualBox VM with a clean Debian install and sets up a clone of the "packages" repository with a local branch


** "vagrant ssh" into the VM, go to the packages repo, prepare a new package or modify an existing one, test it


** "git push" from the VM to the "develop" branch on the host server


** "git pull . develop" on the host server's master branch to merge


** "git push" to central git repo


** buildbot (continuous integration server) picks up the commit


** buildbot builds the changed package and adds it to the apt repo


** the new package is available to clients via apt-get within minutes of the push.
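The host-server steps in the list above can be sketched as a self-contained shell session, played out against throwaway local repositories. All paths, repo names, and the "central" remote here are illustrative stand-ins, not the meetup's actual setup.

```shell
#!/bin/sh
set -e
work=$(mktemp -d)

# Stand-in for the central repo that buildbot watches.
git init -q --bare --initial-branch=master "$work/central"

# Stand-in for the host server's clone of the "packages" repo.
# (stderr silenced: cloning an empty repo prints a harmless warning)
git -c init.defaultBranch=master clone -q "$work/central" "$work/packages" 2>/dev/null
cd "$work/packages"
git config user.email dev@example.com
git config user.name dev
git commit -q --allow-empty -m "initial state of master"
git push -q -u origin master

# A "develop" branch plays the part of work pushed up from the Vagrant VM.
git checkout -q -b develop
echo "dummy package source" > newpkg.txt
git add newpkg.txt
git commit -q -m "add newpkg"

# The two workflow commands: merge develop into master, push to central.
git checkout -q master
git pull -q . develop
git push -q origin master
git ls-tree --name-only origin/master   # prints "newpkg.txt"
```

From here, a buildbot watching the central repo would pick up the commit and rebuild the changed package.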

== WTF ==

A regular meetup of people interested in DevOps.

DevOps is kind of a new sysadmin buzzword for ways to close the gap between software development and operations; it's what happens when agile development spreads to the server room. We meet to discuss culture, processes, tools and trends that contribute to this field.

== Next meetup ==

=== Agenda ===

Lots of discussion and friendly banter, tales from productionland, The Developer Is Not Always Right, etc ;)

== Presentation/topic propositions for the future ==

If you have something to present, discuss, show, or ask around about, but don't want to commit to a date yet – here's the place to do it. If you want to hear about something specific, add it here too - maybe there's someone around who can tell you about it.
