
“Loadsnake” AKA the Novell Netware snake screensaver clone

For those that didn’t have the pleasure of seeing the old Novell Netware snake screensaver, I’ll say here that it was the default Netware screensaver, in console/text mode. It showed one snake for each CPU you had (99.9% of people had just one, really). The cool thing was that the snakes grew longer and longer as your server load increased. They also moved faster.

Anyway, this is one of those little time-wasting projects that usually go nowhere. I started working on a clone of this Netware screensaver in 2007. I remember I wanted to figure out how to write an xscreensaver "hack", so I spent a weekend looking at the source code for all the existing hacks, and I picked popsquares.c as a base and started tearing it apart, injecting dubious amounts of crappy C code until it did what I wanted.

Fast forward four years. Yesterday, for some reason, I got back to it, cleaned up the code a bit, and implemented a “fantastic” new feature I had always wanted: a different snake color for each CPU, instead of all snakes being red. The result is, well, see for yourself:

The source code (don’t take inspiration from it, please… :)) is up on GitHub at http://github.com/cosimo/xscreensaver-loadsnake. You can also download the xscreensaver binary module if you want (Linux x86_64 only), as compiling it requires a bit of fiddling with the xscreensaver source code.

I have to admit that it's cool to run your own screensaver :)

Upgrade of puppet from 0.24.5 to 2.6.2: “Got nil value for content”

Something I should probably have done a while ago: upgrading our puppet clients and master from 0.24.5 (current Debian stable) to 2.6.2 (current backports). I’ve been advised to go all the way to 2.6.4, which contains an important security fix. I’ll probably do that soon.

So, the first mistake I made was to upgrade one client before the puppetmaster. In my mental model it’s always OK to upgrade the server (the puppetmaster) first, but I remembered being surprised when I learned that puppet needs the clients upgraded first, so that’s the bit I thought I remembered well: first upgrade the clients, then the puppetmaster.

So I did, and that didn’t work at all: the client couldn’t retrieve the catalog. I discovered I had to upgrade the puppetmaster first. Another very common thing with puppet: if something doesn’t work, wipe out the SSL certificates. Almost every puppet user, maybe inexperienced like me, will tell you to remove the entire /var/lib/puppet/ssl directory and restart your client. Good. So I did.
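In practice, that certificate dance boils down to something like this (a sketch from memory; hostnames are made up):

# On the client: wipe the old certificates and request a new one
rm -rf /var/lib/puppet/ssl
puppetd --test --waitforcert 60 --server puppetmaster.example.com

# On the puppetmaster: check the pending request and sign it
puppetca --list
puppetca --sign client.example.com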

After fiddling for a while with client- and server-side SSL certificates, removals, --waitforcert options and puppetca invocations, I finally got the client and the puppetmaster talking to each other correctly, downloading the manifests and applying the changes. However, one problem remained…

Digression: custom facter plugins

I wrote a specific facter plugin that provides a datacenter fact. Just as an example, it works like this:

#
# Provide an additional 'datacenter' fact
# to use in generic modules to provide datacenter
# specific settings, such as resolv.conf
#
# Cosimo, 03/Aug/2010
#
# $Id: datacenter.rb 15240 2011-01-03 14:27:44Z cosimo $

Facter.add("datacenter") do
    setcode do

        datacenter = "unknown"

        # Get current ip address from Facter's own database
        ipaddr = Facter.value(:ipaddress)

        if ipaddr.match(/^12\.34\.56\./)
            datacenter = "dc1"
        elsif ipaddr.match(/^34\.56\.78\./)
            datacenter = "dc2"
        elsif ipaddr.match(/^56\.78\.12\./)
            datacenter = "dc3"
        # etc ...
        # etc ...
        end

        datacenter
    end
end
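
Once facter can find the plugin, the new fact can also be tested directly from the command line. Something like this, where the module path is just an example:

# Point facter at the directory that contains the custom fact, then query it
FACTERLIB=/etc/puppet/modules/mymodule/plugins/facter facter datacenter
dc1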

This allows for very cool things, like specific DNS resolver settings by datacenter:

# Data-center based settings
case $datacenter {
    "dc1" :  { include opera::datacenters::dc1 }
    "dc2" :  { include opera::datacenters::dc2 }
    "dc3" :  { include opera::datacenters::dc3 }
    # ...
    default: { include opera::datacenters::dc1 }
}

where each opera::datacenters::dcN class contains all the datacenter-dependent settings. In my case, just the resolver for now.

class opera::datacenters::dc1 {
    resolver::dns { "dc1-ns":
        domain => "opera.com",
        nameservers => [ "1.2.3.4", "5.6.7.8" ],
    }
}

resolver::dns is in turn a small define in a resolver module I wrote to generate the resolv.conf contents from a template. It’s small enough that I can copy/paste it here:

# "resolver/manifests/init.pp"
define resolver::dns ($nameservers, $search="", $domain="") {
    file { "/etc/resolv.conf":
        ensure => "present",
        owner  => "root",
        group  => "root",
        mode   => 644,
        content => template("resolver/resolv.conf.erb"),
    }
}

# "resolver/templates/resolv.conf.erb"
# File managed by puppet <%= puppetversion %> on <%= fqdn %>
# Data center: <%= name %>
<% if domain != "" %>domain <%= domain %>
<% end %>
<% if search != "" %>search <%= search %>
<% end %>
<% nameservers.each do |ns| %>nameserver <%= ns %>
<% end %>
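
Just to make the whole chain concrete: for the dc1 example above, that template should render an /etc/resolv.conf roughly like this (the puppet version and hostname in the header obviously depend on the actual client):

# File managed by puppet 2.6.2 on client.example.com
# Data center: dc1-ns
domain opera.com
nameserver 1.2.3.4
nameserver 5.6.7.8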

One problem remained…

Let’s get back to the remaining problem. Every client run was spitting out this error:

info: Retrieving plugin
err: /File[/var/lib/puppet/lib]: Could not evaluate: Got nil value for content

After a lot of searching and reading through similar (but not quite identical) messages, I found this thread where one guy was having a very similar problem, but with a different filename, /root/.gitconfig, which he had obviously specified in his manifests.

My problem happened with /var/lib/puppet/lib, which is never specified in any of my manifests. But the symlink bit got me thinking. At some point, to get the custom facter plugin working, and having read about the differences between the older and 2.6.x versions of puppet, I had put it into my module’s lib/facter folder, but I had also created a symlink called plugins (required by the new puppet). That way, I thought, I would prevent problems: the new puppet could read the file from the new folder, while the older version could still read it from “lib”.

Well, it turns out that puppet cannot read custom facter plugins from a plugins folder that is a symlink. Removing the symlink and making {module}/plugins/facter a real directory solved that problem too.
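
In practice, the fix was something along these lines (the module path is, again, just an example):

cd /etc/puppet/modules/mymodule
rm plugins                    # this was a symlink to lib
mkdir -p plugins/facter
cp lib/facter/datacenter.rb plugins/facter/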

I really hope to save someone else the time I spent on this. Cheers :)

Geo-IP lookup without installing Geo-IP libraries (everywhere)

We've been using GeoDNS to distribute client requests to different data centers around the world, first as a highly experimental project, then more and more as months passed.

Currently we’re using it as a simple global load balancer for files.myopera.com, help.opera.com, and some get.opera.com stuff.

However, there is another minor feature that we built into it, like a hacker’s backdoor :) Since it uses a full DNS server, and relies on having the GeoIP libraries installed and always up to date, we thought it was a nice and cool idea to have a quick way to perform geo-IP lookups from the command line.

It works in a similar way to DNS blacklists. Suppose you want to look up the IP address 80.90.100.110. You reverse the IP and look up a TXT record in a special geo.opera.com zone:

cosimo@cd01:~$ host -t TXT 110.100.90.80.lookup.geo.opera.com
110.100.90.80.lookup.geo.opera.com descriptive text "ip:80.90.100.110, country:de, continent:europe"

This uses the GeoDNS backends to resolve the country and continent of the given IP address, and gets the information back in a very basic string format. A simple shell or Perl script can then process that for you if you need it. In fact, I made a ~/bin/geolookup Perl script that I can use like this:

cosimo@cd01:~$ geolookup 80.90.100.110 90.100.110.120 100.110.120.130
80.90.100.110 => de, europe
90.100.110.120 => fr, europe
100.110.120.130 => unknown, unknown

Nothing special, but this way, no matter what machine I’m on, I can always quickly look up IPs if I need to, without having to download the Country or City GeoIP databases and keep them up to date. On the GeoDNS backends, this is of course done routinely with a set of simple cron jobs.
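
For the record, a minimal sketch of such a script could look like this, assuming Net::DNS is available (the zone name is the one shown above, and the parsing is deliberately naive):

#!/usr/bin/env perl
# Reverse each IP given on the command line and query the TXT record
# in the lookup.geo.opera.com zone, as shown above
use strict;
use warnings;
use Net::DNS;

my $resolver = Net::DNS::Resolver->new;

for my $ip (@ARGV) {
    my $reversed = join ".", reverse split /\./, $ip;
    my $reply    = $resolver->query("$reversed.lookup.geo.opera.com", "TXT");
    my ($country, $continent);

    if ($reply) {
        for my $rr (grep { $_->type eq "TXT" } $reply->answer) {
            # The TXT data comes back as "ip:..., country:de, continent:europe"
            ($country)   = $rr->txtdata =~ /country:(\w+)/;
            ($continent) = $rr->txtdata =~ /continent:(\w+)/;
        }
    }

    printf "%s => %s, %s\n", $ip, $country || "unknown", $continent || "unknown";
}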

Facter ported to Perl 6

A few days ago I wrote about my fun experiment of trying to port facter to Rakudo Perl 6. I said it was “almost completely functional”.

Well, I was a bit optimistic, it seems :) It took me a few more nights of hacking, but in the end it really is now almost completely functional :) It basically runs, and I have ported a few facts from Facter’s original Ruby code too, like the kernel fact, the physicalprocessorcount fact and other simple ones.

What’s missing to declare the experiment successful is the implementation of confines. A confine, in facter speak, is a restriction that applies to a fact. For example, the physicalprocessorcount fact reads some files from /proc, which is only available on Linux. So, in this case, the confine rule for physicalprocessorcount is that the kernel fact must have “Linux” as its value. In code that becomes:

Facter.add("physicalprocessorcount", sub ($f) {
    $f.confine("kernel" => "Linux");
    $f.setcode(block => sub {
        Facter::Util::Resolution.exec('grep "physical id" /proc/cpuinfo|cut -d: -f 2|sort -u|wc -l');
    });
});

which is pretty similar to the Ruby counterpart:

Facter.add("physicalprocessorcount") do
    confine :kernel => :linux

    setcode do
        ppcount = Facter::Util::Resolution.exec('grep "physical id" /proc/cpuinfo|cut -d: -f 2|sort -u|wc -l')
    end
end

The Ruby version is still more elegant, but all in all I'm very happy with the outcome so far. It could probably be improved a lot too. Perl 6 is awesome. Get the code from http://github.com/cosimo/perl6-facter/ and feel free to ping me or comment if you want to know more.

Another round of fixes for the varnish Accept-Language VCL extension

This very specific VCL extension to parse and normalize Accept-Language headers is becoming more robust as time goes by. Here's the latest round of fixes:

  • the original req.http.Accept-Language header could be overwritten when calling the vcl_rewrite_accept_language() function. This is now fixed by copying the original header string into a static buffer and doing the processing on the copy.
  • improved the style of the C code a bit in a few places. Nothing miraculous, really. It feels improved to me at least :)
  • fixed the use of a wrong define (string max length instead of language list max length)
  • used sizeof instead of repeating the same constants in strncpy calls, etc…
  • added a small intro to the generated file too. That way, if you find the code on some server, you can immediately understand what it’s supposed to do and where it came from :)

Enough blah blah, here’s the new code. If you weren’t already running this piece of VCL, I won’t tell you it’s experimental stuff, because at this point it’s not experimental anymore. And if you were already running it, then by all means upgrade. It is definitely better :)

http://github.com/cosimo/varnish-accept-language

Enjoy!

Another Ubiquity for Opera update, DuckDuckGo search

My small Ubiquity for Opera experiment gets another quick update.

This time I added one of my favorite search engines, DuckDuckGo. It’s a young project, but I think it’s really interesting, and its results are highly relevant and up to date. I like it! Plus, it’s a Perl project.

So that's it, I just added the duckduckgo command.

This new version also fixes an annoying problem with a couple of Google-related search commands, which were showing just 1 result instead of the default 20 search results. There’s so much more that could be improved, but I rarely find the time to work on it…

As always, the updated code is available on the Ubiquity for Opera project page, where you will also find the minified version (~40 kb instead of ~70).

Enjoy!

Have you ever heard “Bon Digi”?

It's a game. A crazy one. A really crazy one.
We were playing it on Friday evening, drinking like mad, and something popped up in my mind, and I thought:

We should code the algorithm for this game

In fact, it seemed really stupid and obvious. But thinking about it more and more, I discovered that it's not so simple at all…

And if you want to code it elegantly and concisely, you have to think about it for some time.
So I took this -Ofun opportunity and, as I promised Zoso, turned it into the Games::BonDigi CPAN module.

Enjoy!