
A good, fast how-to on installing an official Java 7 JVM / JRE on Ubuntu 11.10 (Oneiric).


I saw a report today that Google acquired a tranche of patents from IBM, presumably stocking its patent armoury to defend itself in its various deathmatches with Oracle, Microsoft and Apple. With these increasingly insane patent wars taking place, apparently all in the name of safeguarding 'innovation', I was vaguely reminded of John Carmack commenting on the issue back in the day. I managed to find the quote, and it's well worth reading.

Are software patents really protecting innovation? Or are they just protecting large businesses, increasing barriers to entry and reducing the ability of small, innovative competitors to survive?

It's something [software patents] that's really depressing because it's so horribly wrong for someone who's a creative engineering type to think about patenting a way of thinking. Even if you had something really, really clever, the idea that you're not going to allow someone else to follow that line of thought... All of science and technology is built standing on the shoulders of the people that come before you. Did Newton patent calculus and screw the hell out of the other guy working on it? It's just so wrong, but it's what the business world does to things, and certainly the world is controlled by business interests. And technical idealists are a minority.

Electric Playground interviews John Carmack, 1997

Carmack got bitten by patents himself when he discovered a shadow drawing technique that somewhat ironically came to be known as Carmack's Reverse. While Carmack arrived at the technique independently, Creative had sought a patent for something very similar a few months earlier. The issue raised its head again this year with the Doom 3 source code release. In order to get legal approval to release the code, Carmack had to change the implementation to avoid infringing Creative's patent.

Patenting software really doesn't make sense. In programming there is usually an optimal and natural way of solving any given problem. Making optimal solutions exclusive reduces the efficiency with which people can create robust software. That doesn't seem likely to encourage innovation to me; it seems much more likely to stifle it. And that's before we even consider the chilling effect that legal threats from gargantuan corporations can have on would-be technologists.


When using apt-get remove or aptitude remove, the configuration files of uninstalled packages are not deleted.

You can purge them manually, package by package, with apt-get purge. For a hypothetical packagename, that looks like this:
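
$ sudo apt-get purge packagename   # packagename is a placeholder

Or, you can use the dpkg command to catch every package left in the 'deinstall' state in one nice one-liner: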

dpkg --get-selections | grep deinstall | sed 's/deinstall/purge/' | dpkg --set-selections; dpkg -Pa
  

I have been working with VirtualBox and Vagrant a lot today and found myself having to double-check the VBoxManage CLI syntax a number of times. So I'm dumping a quick cheatsheet here to help me remember it, and to help anyone else wanting to efficiently manage VBox VMs.

Start a VM

VBoxManage startvm "Name of VM" --type headless|gui
  

Forcibly Shutdown a VM

VBoxManage controlvm "Name of VM" poweroff
  

List VMs

VBoxManage list vms
  

Unregister/delete a VM

VBoxManage unregistervm "Name of VM" --delete
  

Forward / Unforward Ports for SSH

modifyvm must only be used when the VM is powered off; use controlvm for running VMs.

VBoxManage modifyvm "VM name" --natpf1 "guestssh,tcp,,2222,,22"
  VBoxManage modifyvm "VM name" --natpf1 delete "guestssh"
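
If the VM is already running, the same rule can (as far as I recall) be added or removed on the fly with the controlvm natpf1 subcommand instead:

VBoxManage controlvm "VM name" natpf1 "guestssh,tcp,,2222,,22"
  VBoxManage controlvm "VM name" natpf1 delete "guestssh"

With the rule in place, ssh -p 2222 user@localhost will reach the guest's port 22.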
  

I don't know when it happened, but Skype has blown up on me sometime over the past two months.

$ skype
  > skype: error while loading shared libraries: libXss.so.1: cannot open shared object file: No such file or directory
  

Hmmm.

$ locate libXss.so.1
  > /usr/lib/x86_64-linux-gnu/libXss.so.1
  > /usr/lib/x86_64-linux-gnu/libXss.so.1.0.0
  

Okay, so it's moaning about the X screensaver library, libXss, not being there. But it is there; it's just that, specifically, it's a 64-bit library. I bet Skype isn't 64-bit...

$ file /usr/bin/skype
  > /usr/bin/skype: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.8, stripped
  

So that explains that then: we need compatible x86 shared libraries. But when you get these sorts of issues it's always best to see what else is missing, and the ldd tool comes to the rescue.

$ ldd /usr/bin/skype
  > linux-gate.so.1 =>  (0xf76f2000)
  > libasound.so.2 => /usr/lib32/libasound.so.2 (0xf75df000)
  > libXv.so.1 => /usr/lib32/libXv.so.1 (0xf75d9000)
  > libXss.so.1 => not found
  > librt.so.1 => /lib32/librt.so.1 (0xf75d0000)
  > libQtDBus.so.4 => not found
  > libQtGui.so.4 => not found
  > libQtNetwork.so.4 => not found
  > libQtCore.so.4 => not found
  > libpthread.so.0 => /lib32/libpthread.so.0 (0xf75b4000)
  > libstdc++.so.6 => /usr/lib32/libstdc++.so.6 (0xf74c9000)
  > libm.so.6 => /lib32/libm.so.6 (0xf749e000)
  > libgcc_s.so.1 => /usr/lib32/libgcc_s.so.1 (0xf7480000)
  > libc.so.6 => /lib32/libc.so.6 (0xf7306000)
  > libdl.so.2 => /lib32/libdl.so.2 (0xf7301000)
  > libX11.so.6 => /usr/lib32/libX11.so.6 (0xf71cb000)
  > libXext.so.6 => /usr/lib32/libXext.so.6 (0xf71b8000)
  > /lib/ld-linux.so.2 (0xf76f3000)
  > libxcb.so.1 => /usr/lib32/libxcb.so.1 (0xf7198000)
  > libXau.so.6 => /usr/lib32/libXau.so.6 (0xf7194000)
  > libXdmcp.so.6 => /usr/lib32/libXdmcp.so.6 (0xf718d000)
  

So looking at this, I am missing compatible versions of both libXss and the various Qt libraries.

With Ubuntu, to enable multiarch support, check the file /etc/dpkg/dpkg.cfg.d/multiarch. There should be a line like the one below (and if there isn't, add it):

# in file /etc/dpkg/dpkg.cfg.d/multiarch
  foreign-architecture i386
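
After adding that line, refresh the package lists so apt can see the i386 packages:

$ sudo apt-get update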
  

Now we need to satisfy Skype's i386 dependencies.

$ sudo apt-get install libxss1:i386 libqtcore4:i386 libqt4-dbus:i386 libqtgui4:i386
  

If we re-run ldd we can see Skype's shared library dependencies are now satisfied.

$ ldd /usr/bin/skype
  > linux-gate.so.1 =>  (0xf7749000)
  > libasound.so.2 => /usr/lib32/libasound.so.2 (0xf7634000)
  > libXv.so.1 => /usr/lib32/libXv.so.1 (0xf762e000)
  > libXss.so.1 => /usr/lib/i386-linux-gnu/libXss.so.1 (0xf7629000)
  > librt.so.1 => /lib32/librt.so.1 (0xf7620000)
  > libQtDBus.so.4 => /usr/lib/i386-linux-gnu/libQtDBus.so.4 (0xf75a6000)
  > libQtGui.so.4 => /usr/lib/i386-linux-gnu/libQtGui.so.4 (0xf6ae0000)
  > libQtNetwork.so.4 => /usr/lib/i386-linux-gnu/libQtNetwork.so.4 (0xf69a4000)
  > libQtCore.so.4 => /usr/lib/i386-linux-gnu/libQtCore.so.4 (0xf6702000)
  > libpthread.so.0 => /lib32/libpthread.so.0 (0xf66e7000)
  > libstdc++.so.6 => /usr/lib32/libstdc++.so.6 (0xf65fc000)
  > libm.so.6 => /lib32/libm.so.6 (0xf65d2000)
  > libgcc_s.so.1 => /usr/lib32/libgcc_s.so.1 (0xf65b4000)
  > libc.so.6 => /lib32/libc.so.6 (0xf6439000)
  > libdl.so.2 => /lib32/libdl.so.2 (0xf6434000)
  > libX11.so.6 => /usr/lib32/libX11.so.6 (0xf62fe000)
  > libXext.so.6 => /usr/lib32/libXext.so.6 (0xf62eb000)
  > libQtXml.so.4 => /usr/lib/i386-linux-gnu/libQtXml.so.4 (0xf62aa000)
  > libdbus-1.so.3 => /lib32/libdbus-1.so.3 (0xf6261000)
  > libfontconfig.so.1 => /usr/lib32/libfontconfig.so.1 (0xf622b000)
  > libaudio.so.2 => /usr/lib32/libaudio.so.2 (0xf6211000)
  > libglib-2.0.so.0 => /lib32/libglib-2.0.so.0 (0xf6118000)
  > libpng12.so.0 => /lib32/libpng12.so.0 (0xf60ee000)
  > libz.so.1 => /usr/lib32/libz.so.1 (0xf60d9000)
  > libfreetype.so.6 => /usr/lib32/libfreetype.so.6 (0xf6041000)
  > libgobject-2.0.so.0 => /usr/lib32/libgobject-2.0.so.0 (0xf5ff2000)
  > libSM.so.6 => /usr/lib32/libSM.so.6 (0xf5fe9000)
  > libICE.so.6 => /usr/lib32/libICE.so.6 (0xf5fcf000)
  > libXi.so.6 => /usr/lib32/libXi.so.6 (0xf5fbf000)
  > libXrender.so.1 => /usr/lib32/libXrender.so.1 (0xf5fb4000)
  > libgthread-2.0.so.0 => /usr/lib32/libgthread-2.0.so.0 (0xf5fad000)
  > /lib/ld-linux.so.2 (0xf774a000)
  > libxcb.so.1 => /usr/lib32/libxcb.so.1 (0xf5f8e000)
  > libexpat.so.1 => /lib32/libexpat.so.1 (0xf5f64000)
  > libXt.so.6 => /usr/lib32/libXt.so.6 (0xf5f08000)
  > libXau.so.6 => /usr/lib32/libXau.so.6 (0xf5f04000)
  > libpcre.so.3 => /lib32/libpcre.so.3 (0xf5ec4000)
  > libffi.so.6 => /usr/lib32/libffi.so.6 (0xf5ebd000)
  > libuuid.so.1 => /lib32/libuuid.so.1 (0xf5eb7000)
  > libXdmcp.so.6 => /usr/lib32/libXdmcp.so.6 (0xf5eb0000)
  

Skype now starts up and behaves as I would expect.


Normally, if you want to cross-compile something with gcc to produce a 32-bit executable, you have to pass in the -m32 argument.

$ gcc -m32 -o test32 test.c
  

After a new install of Ubuntu 11.10 I was left scratching my head for a little bit with the following error:

> In file included from /usr/include/stdio.h:28:0,
  >             from test.c:1:
  > /usr/include/features.h:323:26: fatal error: bits/predefs.h: No such file or directory
  > compilation terminated.
  

In order to build 32-bit executables you need to install the i386 libc dev package.

$ sudo apt-get install libc6-dev-i386
  

For a little bit of 'fun', if you pass the -S -masm=intel arguments to gcc, you can compile once with -m32 and once with -m64 to see how your program changes based on the target architecture. It can (perhaps it's just me :)) be interesting to see the subtle differences in how programs are assembled for x86_64 versus x86.

$ gcc -m32 -S -masm=intel test.c
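
To see the two side by side, something like the following (assuming a trivial test.c) works; each assembly listing goes to its own file, ready for diffing:

$ gcc -m32 -S -masm=intel -o test32.s test.c
  $ gcc -m64 -S -masm=intel -o test64.s test.c
  $ diff test32.s test64.s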
  

If you have a slightly dated PHPUnit test-case suite and, like me, have recently reinstalled your OS, you will likely be running a modern version of PHPUnit (3.5/3.6).

Keeping up with changes to PHPUnit's suite of extensions can occasionally be dizzying; things have been put into, and then ripped out of, the core quite frequently over the last few releases. A common problem I'm seeing lately, upgrading my test suites to be 3.6-compatible, is the extraction of the database testing classes into their own package on the pear.phpunit.de channel.

If you see an error like this:

> include(PHPUnit/Extensions/Database/TestCase.php): failed to open stream: No such file or directory
  

Odds are you need to do this:

$ sudo pear config-set auto_discover 1
  $ sudo pear install --alldeps pear.phpunit.de/DbUnit
  

A great, concise tutorial on how to write pipe-aware programs for the Unix shell in Ruby.


I'm seldom surprised by some of the horrors under the Magento hood, but today's little gem takes some beating.

On a setup I administer, there are over 200,000 address records. When you view an order in the backend and click 'edit address', the server grinds away and eventually dies, either because it hits the max_execution_time limit or because it runs out of RAM.

You might see an otherwise meaningless error like this:

Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 79 bytes) in /home/somewhere/public_html/lib/Zend/Db/Statement/Pdo.php on line 290
  

The cause of this is the strange manner in which Magento searches for addresses in app/code/core/Mage/Adminhtml/controllers/Sales/OrderController.

/**
   * Edit order address form
   */
  public function addressAction()
  {
      $addressId = $this->getRequest()->getParam('address_id');
      $address = Mage::getModel('sales/order_address')
          ->getCollection()
          ->getItemById($addressId);
      if ($address) {
          Mage::register('order_address', $address);
          $this->loadLayout();
          $this->renderLayout();
      } else {
          $this->_redirect('*/*/');
      }
  }
  

The problem is in the getCollection()->getItemById() chain. Magento creates and fully loads a collection of Address objects (which, when you have 200k of them, takes a while). The final call in the chain, getItemById(), takes the collection and iterates over it, assigning each address to an array keyed by its entityId. It then returns whichever value matches $addressId.

Now, there's another, really simple way to do the same thing. It doesn't involve instantiating 200,000 address objects, or iterating over them, or even using associative arrays. It's very familiar.

$address = Mage::getModel('sales/order_address')->load($addressId);
  

This one line does the same thing far more efficiently. The thing that worries me is that I can't see any reason why they aren't doing it this way already. The change works, nothing appears to break, and the speed boost is (obviously) immense.

So why is it not done this way?


As with Apache, with Varnish 3 you can test the syntactic correctness of your VCL files without having to roll the dice on restarting a running server.

$ varnishd -C -f /path/to/mysetup.vcl
  > ...
  

Varnish will compile the file and print its full configuration as output. If there's an error, Varnish handily gives you a compiler error to help trace what went wrong, which could look something like this:

> Message from VCC-compiler:
  > Expected an action, 'if', '{' or '}'
  > ('input' Line 82 Pos 6)
  >     vcl_hash(req.http.Cookie);
  > -----########------------------
  >
  > Running VCC-compiler failed, exit 1
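
For reference, the Apache equivalent mentioned at the start is apachectl's configtest action:

$ apachectl configtest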
  

I've been using Oneiric since its release and in many ways it is a giant leap forward from Natty Narwhal. Unity is still horrible, mind, but a lot of the rough edges (multiple / external monitor support) have been softened and the overall package is more stable.

Some things still grate, particularly the global menus and the overlay scrollbar, but they are manageable.

A great resource to check out for anyone using Oneiric, I think, is webupd8, where you can find a lot of tips to bend Oneiric to your will.

For me, that means getting an official JVM installed (Installing Oracle Java 7) and getting a sane desktop environment configured, with the global menu and overlay scrollbars zapped away (Things to Tweak After Installing Oneiric).

Linux Mint (now on release 12, Lisa) is increasingly becoming a viable alternative to Ubuntu. I've played around with it a fair bit, but I still think it is not quite there yet in terms of 'just working' when compared with Ubuntu.

For someone who needs to get work done with a minimum of fuss, Ubuntu is still the leader. I do hope, though, that someone can work out how to marry usability and power a little better than I feel Canonical is managing at the moment.


I first came across the ImageMagick library and toolkit when I was new to Linux, trying to satisfy Enlightenment and Eterm dependencies to get them up and running. A task (and this was the mid-90s) I do not recall fondly. (And thinking about that experience, despite all the controversy surrounding Gnome 3 and Unity, the Linux desktop has come a long way.)

Anyway, unlike either Enlightenment or Eterm, ImageMagick has been a firm friend ever since.

There are three main tools I find I fall back on, time and time again.

Display, Convert, and Identify.

Display, funnily enough, will open up an X window with the contents of an image.

Convert, perhaps the most useful tool of them all, lets you resize, transform, and re-format images.

And lastly, identify lets you get information about a file: size, format, colour depth, offset and so forth.
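
By way of a quick illustration (photo.jpg here is a stand-in for any image you have to hand):

$ display photo.jpg
  $ convert photo.jpg photo.png
  $ identify photo.jpg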

The power of these tools is best wielded with simple loops in bash, or with GNU find. For example, to convert a bunch of images from 1920x1080 to, say, 1280x720, you could do something like this:

$ for IMAGE in *.jpg; do
  >   # usage: convert <options> orgfile.jpg newfile.jpg
  >   convert -resize '1280x720' "$IMAGE" "$(echo "$IMAGE" | sed 's/\.jpg$/-resized.jpg/')"
  > done
  

Convert tries to be clever and maintain aspect ratios. In this case, 1920x1080 fits into 1280x720, maintaining the 16:9 aspect ratio. But what if we had a 1920x1200 input image? In that case, convert would actually resize the image to something like 1280x800. If we really want it to ignore common sense and squish things down to our requested 1280x720, we need to use the bang operator, e.g.

$ convert -resize '1280x720!' srcimg.jpg dstimg.jpg
  

I am just going to give a quick example of how to delete a remote branch in git.

$ git push origin :REMOTE_BRANCH_NAME
  

For a concrete example of this in action:

$ git branch -r
  > origin/develop
  > origin/master
  
  $ git push origin :develop
  > To /home/aaron/development/atestrepo.git
    - [deleted]         develop
  
  $ git branch -r
  > origin/master
  

So, why does this work, and why is the syntax so odd?

Well if we look at the push command in more detail it starts to make some sense.

If we want to push a local branch to a remote, we would do it like this

$ git push origin mybranch
  

Here, origin is the remote, and mybranch the local branch to push up. The result of this command is that we now have a remote branch, origin/mybranch. But what if we wanted to call the remote branch something else? We would do that like this:

$ git push origin mybranch:adiffnamefortheremotebranch
  

This syntax effectively translates to: push the contents of mybranch to a branch called adiffnamefortheremotebranch at the remote origin. Now, can you see where this is going with respect to our delete? Pushing with just a leading : (and no local branch name) is basically saying: push nothing into the remote branch called someremotebranch. Git takes this to mean delete the remote branch.

$ git push origin :someremotebranch
  

I find thinking of it in these terms makes the syntax easier to remember.
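
As an aside, newer versions of git (1.7.0 onwards, if memory serves) also accept a more explicit form that does the same thing:

$ git push origin --delete someremotebranch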

It's also worth reading the Pro Git chapter on remotes, which also covers (although, sadly, too briefly) deleting remote branches.


Sponge, part of the moreutils package, is a neat little utility which takes data from stdin and writes it to a file. This in itself is not remarkable; after all, you can use the '>' operator to do that. What is different is that sponge waits until it reaches end-of-file (EOF) before opening and writing to the output file. That is, it soaks up all the input data before it commences writing.
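
The ordering matters because, with a plain '>' redirection, the shell truncates the output file before the command ever gets to read it. A quick sketch of the difference, using sed on a hypothetical file.txt:

$ sed 's/foo/bar/' file.txt > file.txt     # bad: the shell empties file.txt first
  $ sed 's/foo/bar/' file.txt | sponge file.txt     # safe: sponge soaks up input, then writes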

This is very, very handy if you want to do in-place substitutions with something like sed or in this case, the GNU text encoding conversion utility iconv.

Typically to convert between two encodings, you call iconv like this:

$ iconv -f cp1252 -t utf-8 myfile.txt
  

This will convert myfile.txt from Windows Latin 1 (cp1252) to UTF-8 and dump the results to stdout. (Don't make the mistake of thinking iso-8859-1 and cp1252 are equivalent. It is safe to convert iso-8859-1 content as if it's cp1252, but the reverse is NOT true.)

If you're in any doubt as to the encoding of the source file, you can inspect it with the 'file' command.
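
For example (the -i flag, where supported, asks file to report the MIME type and charset):

$ file -i myfile.txt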

So if we have a directory of, say, C source files we want to convert, we can make use of bash, iconv and sponge to save us the tedium of converting each file manually to a new copy, then replacing the original file with the copy.

$ for FILE in *.c *.h; do iconv -f cp1252 -t utf-8 "$FILE" | sponge "$FILE"; done
  

Each file is filtered through iconv, the output of which is piped into sponge. Sponge soaks up the standard input until EOF, then writes it over the original file.


GNU Bash. Most *nix developers use it every day and, if they are like me, never cease to be amazed at the cool stuff you can do with it.

A simple thing I tried out today: downloading and extracting a web-hosted tarball in a one-liner.

It's really simple.

$ tar zxv < <(wget -q -O - http://www.somewhere.com/atarfile.tar.gz)
  > atarfile/
  > atarfile/file.txt
  > ...
  

So let's look at this statement in a little detail. We evaluate the right-hand side of the expression first: we're using wget to quietly (-q) download a tgz file from somewhere on the internet and write the file's contents to stdout (-O -).

We use Bash's process substitution operator '<(' to create a named pipe, which effectively gives us a temporary file descriptor, and then direct the contents of that descriptor into tar using the '<' file redirection operator.

Sounds complicated but looks simple.
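
For the record, an ordinary pipe achieves the same result; the process substitution version is mostly a fun excuse to use the operator:

$ wget -q -O - http://www.somewhere.com/atarfile.tar.gz | tar zxv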