I've been spending a lot of time fiddling with Code Golf, a game where you take relatively easy programming problems, solve them, and then torture and maim the code until it's as short as possible. Then you upload the resulting script and get scored on how short it is.
It's hard. Harder than it looks. For example, one of the contests is to take a list of numbers, such as '1 2 3 4 6 7 8 9 11 13 14', and output a series of ranges in the form of '1-4, 6-9, 11, 13-14.' I was able to do it with a Ruby script 89 bytes long:
$><<gets.split.inject{|s,i|$r=i.to_i-s.to_i>1?(print s,', '):(!$r&&$><<s<<'-';1);i}<<'.'

The best answer came in at 50 bytes of Perl, or 67 bytes of Ruby. My answer is already close to line noise... the code highlighter can't even make heads or tails of it. I have a hard time seeing how I would shave off another 22 bytes!
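For contrast, here's a readable, ungolfed sketch of the same idea. The variable names are my own, and the golfed version above folds all of this bookkeeping into a single inject:

# Read the numbers, then walk them, closing a range whenever the gap
# between neighbors is more than 1. Assumes the sample-style input above.
nums = gets.split.map(&:to_i)
ranges = []
start = prev = nums.first
nums.drop(1).each do |n|
  if n - prev > 1
    ranges << (start == prev ? prev.to_s : "#{start}-#{prev}")
    start = n
  end
  prev = n
end
ranges << (start == prev ? prev.to_s : "#{start}-#{prev}")
puts ranges.join(', ') + '.'

Fed '1 2 3 4 6 7 8 9 11 13 14', this prints '1-4, 6-9, 11, 13-14.', same as the one-liner, just at many times the byte count.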
It's a fascinating process. Figuring out when you can skip assigning something... whether to keep the input as strings and convert to numbers only where needed, or convert everything to numbers up front and build the output strings as needed. Which is shorter? Are there completely different algorithms that would be shorter? These are fun questions if you're a programmer.
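On that last question, one genuinely different approach comes to mind, though whether it golfs down smaller than the inject trick is another matter. This is just a sketch, and it assumes a Ruby recent enough to have Enumerable#slice_when:

# Split the list wherever the gap between neighbors exceeds 1,
# then format each run as either a single number or a 'first-last' range.
runs = gets.split.map(&:to_i).slice_when { |a, b| b - a > 1 }
parts = runs.map { |r| r.size > 1 ? "#{r.first}-#{r.last}" : r.first.to_s }
puts parts.join(', ') + '.'

The grouping logic disappears into slice_when, which leaves only the formatting to pay for in bytes.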