Text

If you're trying to add a user to a group under OSX you might get stumped. This is straightforward enough on Linux, right? You go

$ usermod -a -G thegroup theuser
  

And job done. But OSX uses Open Directory, rather than traditional flat files like /etc/passwd and /etc/group, to store information about users and domains. So the typical unix commands we are used to don't work.

The dscl (directory service command line) utility lets you manipulate Open Directory values and, in our case, add a user to an additional group. This is handy if, for example, you want to add your user to the wheel group to make use of password-free sudo.

$ dscl localhost --append /Local/Default/Groups/<groupnamehere> GroupMembership <usernamehere>
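
You can check the change took effect by reading the group's GroupMembership attribute back, e.g.

$ dscl localhost -read /Local/Default/Groups/<groupnamehere> GroupMembership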
  
Text

Today I learned that each command in a bash pipeline executes in a separate subshell… this means variables set inside a pipeline can't be passed back out of it, as each subprocess gets a brand new environment.

For some workarounds check out http://mywiki.wooledge.org/BashFAQ/024
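
Here's a minimal sketch of the classic gotcha, plus one common workaround (process substitution, which keeps the loop in the current shell):

count=0
printf 'a\nb\nc\n' | while read -r line; do count=$((count + 1)); done
echo "$count"    # prints 0: the while loop ran in a subshell

count=0
while read -r line; do count=$((count + 1)); done < <(printf 'a\nb\nc\n')
echo "$count"    # prints 3: no subshell this time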

Tags: bash shell unix
Text

In the past I used to mess around with NFS over SSH, but these days the FUSE options are much easier, except when you want to use a key to authenticate with the remote host. In that case, do this:

$ sshfs -o ssh_command="ssh -i ~/ssh_keys/[email protected]" [email protected]:/var/www/ ~/Sites/awshost

Actually, right after I posted this, I realised there's a better way to do it. That is, use the 'IdentityFile' option instead (as per the format of .ssh/config).

$ sshfs -o "IdentityFile=~/ssh_keys/[email protected]" [email protected]:/var/www/ ~/Sites/awshost
  

If you have any problems, add '-o debug' to the above command to help track them down.
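
If you mount the share often, another option is to put the key in your ssh config instead, so the sshfs invocation stays short. A rough sketch (the host alias, hostname, user and key path here are made up):

# ~/.ssh/config
Host awshost
    HostName your.remote.host
    User youruser
    IdentityFile ~/ssh_keys/yourkey.pem

after which the mount is simply:

$ sshfs awshost:/var/www/ ~/Sites/awshost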

Tags: ssh cli unix
Text

It is 2013: we (still) don't have flying cars or hoverboards, AND, as developers, we still use terminals to interact with our operating systems. So every so often I like to browse through commandlinefu.com and try to pick up any little tidbits which improve my command line efficiency.

Here's a small selection I've picked up recently that I didn't already know.

sudo !! 
  

Run the previous command as sudo. This is great when you realise you needed to run something as root.

ctrl-x e
  

Open up $EDITOR to enter a long command. In my setup it fires up vim. This is great for some of the long rails commands you need to create controllers or models.

cat /etc/passwd | column -s':' -t 
  

column columnates its input: the -t argument formats standard input into a table, and -s lets you specify an arbitrary field delimiter. For unformatted input this is very handy.

These next few are specific to zsh, and while I do love bash, since switching to zsh I haven't really looked back. It's things like this that, when you work with a terminal every single day, you can't give up.

aaron@tempest ~ $ d                                 
  0    ~
  aaron@tempest ~ $ cd /etc
  aaron@tempest /etc $ d
  0    /etc
  1    ~
  aaron@tempest /etc $ 1
  ~
  aaron@tempest ~ $  
  

The 'd' command lists the directory stack, and entering an integer will switch you directly to that entry in the stack. It is a killer app.
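
Note that 'd' and the numeric shortcuts aren't vanilla zsh builtins; they come from a little configuration (frameworks like oh-my-zsh set this up for you). A rough sketch of the .zshrc lines that do the work, as an assumption about what's behind the session above:

setopt auto_pushd pushd_ignore_dups      # every cd pushes onto the directory stack
alias d='dirs -v'                        # list the stack with indices
for index in {1..9}; do alias "$index"="cd +$index"; done; unset index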

Moving between directories is also very pleasant in zsh. Use '..' to move up a directory, and simply type the name of a directory to move into it.

aaron@tempest ~ $ ..       
  aaron@tempest /Users $ aaron
  aaron@tempest ~ $ 
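
Both of those behaviours come from zsh's AUTO_CD option (again, frameworks like oh-my-zsh usually enable it; if yours doesn't, it's one line in .zshrc):

setopt auto_cd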
  

This last one is a trick I've known for a few years. I don't know how much time it has saved me exactly, but I use it every single day.

In vim, if you're editing a file that requires root (or any other user) permissions, you can write the file by doing

:w !sudo tee %
  

I use it so much that I've set up a leader key binding in my .vimrc

nnoremap <leader>sr :w !sudo tee %<CR>
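
One tweak you might want (an assumption about taste, not part of my mapping above) is to throw away tee's stdout so the file contents aren't echoed back:

nnoremap <leader>sr :w !sudo tee % > /dev/null<CR>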
  

There's nothing more annoying than making lengthy changes to a config file, going to write it, and getting permission denied...

I make all my configs available online at GitHub, if you're interested in seeing how I set up my environment.

Text

GNU Find never ceases to amaze me with its utility.

Yesterday I had to do an emergency restart of MySQL in production, and the resulting Magento report/xxxx files swamped everything else that I might have wanted to look at.

So specifically, I wanted to delete all the files that were created between a start and end date.

GNU find makes this easy

$ find . -type f -newer 655958024588 ! -newer 627812220533 -ls
  

This instructs find to list (-ls) all files (-type f) that are newer than a file called 655958024588 (-newer) and not newer than 627812220533 (! -newer).

If you do not have two files to act as date range boundaries, you can use touch to create them.

$ touch -t yyyymmddHHMM start_date_file
  $ touch -t yyyymmddHHMM end_date_file
  

Then supply these file names to -newer and ! -newer.
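
For example, to list everything modified during the first week of March 2013 (dates made up for illustration):

$ touch -t 201303010000 /tmp/start_date_file
  $ touch -t 201303080000 /tmp/end_date_file
  $ find . -type f -newer /tmp/start_date_file ! -newer /tmp/end_date_file -ls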

To delete the files we can use -exec.

$ find . -type f -newer 655958024588 ! -newer 627812220533 -exec rm -rf {} \;
  

Here it's the -exec argument that does the heavy lifting. {} is a placeholder for the file name (find substitutes '{}' with each found filename) and \; terminates the command to be executed; the semicolon plays the same role it does in regular bash, escaped so the shell doesn't consume it first.
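
GNU find also has a -delete action, which does the same job without spawning an rm process for every file:

$ find . -type f -newer 655958024588 ! -newer 627812220533 -delete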

Text

So, back from the Linux jungle and sitting in front of a MacBook once again.

My first real job has been to get a decent unix environment up and running. OSX's BSD utilities don't really cut it. MacPorts is far and away the best distribution out there.

Once you install coreutils and get GNU ls, find, etc., it makes sense that you will want to change your shell to a modern version of bash (or zsh if that's the way you roll).

Lion ships with bash 3.2, whereas MacPorts will give you a contemporary 4.2. Unfortunately it's not as simple as going

$ sudo port install bash
  $ chsh 
    <input /opt/local/bin/bash>
  

I had to do this before, but I'd forgotten there's a trick to changing your shell in OSX to a non-standard location. The file /etc/shells contains the list of shells chsh will permit. You need to edit this file (as root or via sudo) and add your MacPorts shells. Once that's done, chsh will let you change over, no problem.
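
In full, something like this (assuming the default MacPorts prefix of /opt/local):

$ sudo sh -c 'echo /opt/local/bin/bash >> /etc/shells'
  $ chsh -s /opt/local/bin/bash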

Tags: mac osx shell unix
Text

A quick bit of shell-fu.

To take a column from a MySQL database and quickly output it formatted as a JavaScript array literal (without any special escaping), do:

$ echo 'SELECT column FROM table WHERE some_column = "somevalue"' | mysql -uuser -ppass --silent yourdb | awk -v q="'" '{ print q $0 q }' | paste -s -d ',' | sed 's/\(.*\)/[\1];/'

The first part of the command is self-explanatory: you pipe a query to mysql and ask it to give you raw, unadorned output (--silent). It will return each row's value for column 'column' from table 'table' as a line of output.

You pipe that to awk and ask it to wrap each value in single quotes. Because the awk program itself is single-quoted, you can't easily embed a literal single quote in it, so you pass one in via the q variable instead. paste then joins all the output lines together, separated by commas.

Finally I use sed to wrap the resulting output in the JavaScript array literal '[' and ']' symbols. Awk, or any other concatenation approach, would do just fine here too.
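
So if the column held the values foo, bar and baz (made-up data), you'd end up with:

['foo','bar','baz'];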

Text

As with Apache, Varnish 3 lets you test the syntactic correctness of your VCL files without having to roll the dice on restarting a running server.

$ varnishd -C -f /path/to/mysetup.vcl
  > ...
  

Varnish will compile the file and print out its full configuration as output. If there's an error, varnish handily gives you a compiler error to help trace what went wrong, which could look something like this:

> Message from VCC-compiler:
  > Expected an action, 'if', '{' or '}'
  > ('input' Line 82 Pos 6)
  >     vcl_hash(req.http.Cookie);
  > -----########------------------
  >
  > Running VCC-compiler failed, exit 1
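
For reference, the Apache check alluded to above is the analogous:

$ apachectl configtest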
  
Text

I first came across the ImageMagick library and toolkit when I was new to Linux and trying to satisfy Enlightenment and E-term dependencies and get them up and running. A task, and this was the mid 90s, that I do not recall fondly. (And thinking about that experience, despite all the controversy surrounding Gnome 3 and Unity, the Linux desktop has come a long way.)

Anyway, unlike either Enlightenment or E-Term, ImageMagick has been a firm friend ever since.

There are three main tools I find I fall back on, time and time again.

Display, Convert, and Identify.

Display, funnily enough, will open up an X window with the contents of an image.

Convert, perhaps the most useful tool of them all, lets you resize, transform, and re-format images.

And lastly, identify lets you get information about a file: size, format, colour depth, offset and so forth.
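
identify also takes a -format string if you only want specific fields; for example (file name made up):

$ identify -format '%wx%h %m\n' photo.jpg

which prints something like 1920x1080 JPEG.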

The power of these tools is best wielded with simple loops in bash, or using GNU Find. For example, to convert a bunch of images from 1920x1080 to, say, 1280x720, you could do something like this:

$ for IMAGE in *.jpg; do
  >   # usage: convert <action> orgfile.jpg newfile.jpg
  >   convert -resize '1280x720' "$IMAGE" "$(echo "$IMAGE" | sed 's/\.jpg$/-resized.jpg/')"
  > done
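
mogrify (convert's batch-oriented sibling) can do the same job in one go; a sketch, assuming you first create a resized/ directory for the output so the originals aren't overwritten:

$ mkdir -p resized
  $ mogrify -path resized -resize '1280x720' *.jpg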
  

Convert tries to be clever and maintain aspect ratios. In this case, 1920x1080 fits into 1280x720 maintaining the 16:9 aspect ratio. But what if we had a 1920x1200 input image? In that case, convert would actually resize the image to 1152x720, fitting it inside the requested box while preserving its 16:10 aspect ratio. But if we really want it to ignore common sense and squish things down to exactly 1280x720, you need to use the bang operator, e.g.

$ convert -resize '1280x720!' srcimg.jpg dstimg.jpg
  
Text

If you want to add a unix user to a supplementary group (say, for example, user 'aaron' belongs to group 'aaron' but I want to add him to the 'wheel' group as well), you use the usermod command.

$ usermod -a -G wheel aaron
  

The -a argument is very important: it ensures the groups given to -G are appended to the user's existing list of supplementary groups. Otherwise the existing groups will be replaced with the ones supplied.
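
To confirm it worked, check the user's groups:

$ id aaron

The change shows up in id straight away, but any existing login sessions need to log out and back in to pick up the new group.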