Monthly Archives: January 2010

Goodbye Old Friend

As Giannis noted on Twitter earlier today, the Sun web site now redirects to Oracle’s.

Sun Microsystems was the last of the UNIX companies. Its brand and technological expertise have been with us since 1982. Now, after 28 years, it becomes part of Oracle.

My first contact with a computer was through a VT220 terminal, connected through a terminal server to a SunOS server. So it is with a bit of sadness that I see the name of “Sun” go away.

There is a lot in a name, but there is even more in the people behind it. Let’s hope that we will keep seeing technological advances like those Sun brought to the world of computing; that even though the name “Sun” may be gone, the spirit of quality, the technical and design excellence, the well-thought-out customer support, and the ever-present drive to push our overall computing experience to new frontiers will stay with us.

Useful Email Notifications for Online Orders

A few days ago I decided to buy a new battery for my Thinkpad X61s. The old battery has started losing a lot of its original capacity, as shown by the “last full capacity” field in the output of the acpiconf command:

keramida@kobe:~$ acpiconf -i0
Design capacity:        74880 mWh
Last full capacity:     41140 mWh
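
From these two numbers it is easy to estimate how much of the battery’s original capacity is left. A quick sketch, using the figures shown above:

```shell
# Remaining battery health: last full capacity as a percentage of the
# design capacity (numbers taken from the acpiconf output above).
awk 'BEGIN { printf "%.1f%%\n", 100 * 41140 / 74880 }'
# prints: 54.9%
```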


Parallel Downloads with Python and GNU Wget

GNU wget is a very useful utility that can download files over HTTP, HTTPS and FTP. Downloading a single file from a remote server is very easy. You just have to type:

$ wget -np -nd -c -r http://server/path/to/file

That’s all. The wget utility will start downloading the remote “file” and save it in the current directory with the same name.

When downloading a single file this works fine and will often be enough to do the job at hand easily, without a lot of fuss. Fetching multiple files is also easy with a tiny bit of shell plumbing. In a Bourne-compatible shell you can store the URLs of the remote files in a plain text file and then type:

$ while read file ; do \
    wget -np -nd -c -r "${file}" ; \
done < url-list.txt

This small shell snippet will download the files one after the other, but it is a linear process. The second file will start downloading only after the first one has finished. The utilization of your connection will probably be less than optimal.
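
One way to get several transfers in flight at once is to drive one wget process per URL from a bounded pool. This is only a sketch of the idea (the function names are invented, and the wget flags are the same ones used above), not necessarily how the full article does it:

```python
# Hypothetical sketch: run several wget processes in parallel, at most
# `workers` at a time, instead of one after the other.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def fetch(url, command=("wget", "-np", "-nd", "-c", "-r")):
    """Run the download command for a single URL; return its exit status."""
    return subprocess.call(list(command) + [url])

def fetch_all(urls, workers=4, command=("wget", "-np", "-nd", "-c", "-r")):
    """Download all URLs, keeping up to `workers` transfers running at once."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda u: fetch(u, command), urls))
```

Reading the lines of url-list.txt and passing them to fetch_all() reproduces the shell loop above, but with up to four downloads running concurrently.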

Distributed Development with Mercurial

Every clone of a Mercurial repository can act as a consistent, fully functional repository itself. This is a very useful property of a DVCS; one that you can take advantage of to make development a lot easier, especially for teams that are dispersed in multiple places of the globe.

Imagine for a moment that your team is not sharing the same physical space every day. In fact, half of your team works in an office on the West Coast of the United States and the other half is located somewhere in Europe.

Mercurial Clones without a Working Copy

Mercurial repository clones can have two parts:

  1. An .hg/ subdirectory, where all the repository metadata is stored
  2. A “working copy” area, where checked-out files may live

The .hg/ subdirectory stores the repository metadata of a specific clone, including the history of all changesets that clone contains, clone-specific hooks and scripts, information about local tags and bookmarks, and so on. This is the only part of a Mercurial repository that is actually mandatory for a functional repository.