I find this little library indispensable. I use it almost every day for all sorts of tasks; an obvious example is scanning log files with grep or monitoring them with tail. Here's something I just did with it that's a good example of how I use it quite often.
The Problem: I have a lightbox on iStockPhoto, and I want to download the medium-resolution images to print out and have on my laptop to show a client at a meeting where there won't be an internet connection. iStock doesn't have an option to do this, so I need to rip the images myself. Here's how to do it in about 60 seconds without writing any code (if you're quick!).
Copy and paste the text off the page into Notepad; if you've got more than one page, just click through and paste each one on the end. It's usually best to copy the rough area where the data is. Save your file in a new folder.
Open a command prompt and grep the file for the data you want; in this case it's grep "#" input.txt.
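To make the grep step concrete, here's a minimal sketch in a shell; the contents of input.txt are invented stand-ins for the text you'd paste off the page:

```shell
# Create a sample input.txt (invented data) standing in for the pasted page text
cat > input.txt <<'EOF'
Beach sunset photo
File #12345678
Mountain lake at dawn
File #87654321
EOF

# Keep only the lines containing the file numbers
grep "#" input.txt
# prints:
# File #12345678
# File #87654321
```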
Copy the output into a column in Excel. Write a simple Excel formula that builds a shell command to run wget and grab the files you need, e.g. ="wget ""http://www.istockphoto.com/file/"&A1&"/2/"&A1&".jpg""".
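If you'd rather skip Excel, the same command lines can be built in the shell itself. This is just a sketch: the IDs in ids.txt are invented, and the URL pattern is the one from the formula above:

```shell
# ids.txt holds one file number per line (invented example IDs)
printf '12345678\n87654321\n' > ids.txt

# Build the same wget command the Excel formula produces, one per ID
while read -r id; do
  echo "wget \"http://www.istockphoto.com/file/$id/2/$id.jpg\""
done < ids.txt > fetch.txt

cat fetch.txt
# prints:
# wget "http://www.istockphoto.com/file/12345678/2/12345678.jpg"
# wget "http://www.istockphoto.com/file/87654321/2/87654321.jpg"
```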
Copy the resulting column, go back to your command prompt, and paste it; the commands start running. Watch all your images appear! If you do this often, you may find it's quicker to start Excel before you start your grepping, so it's ready for you when you need it ;)
I suggest having a quick look at the usage info for the various tools (tool --help) and getting to know how to use regular expressions with grep. Also have a look at what you can do with dir to get bare file listings. Try piping the output of one command into another, e.g. dir /B *.jpg | grep "DV0" > output.txt. If you have problems with line lengths (usually when looking through logs), use mode con cols=1000; you can use lines instead of cols to increase your scrollback.
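The piping idea works the same way on a Unix shell, with ls in place of dir /B. A quick sketch, with invented filenames standing in for a camera folder:

```shell
# Create a few dummy files (names invented) to stand in for a photo folder
touch DV0123.jpg DV0456.jpg IMG0001.jpg

# Unix-shell equivalent of: dir /B *.jpg | grep "DV0" > output.txt
# ls lists the bare filenames, grep keeps only the ones matching "DV0"
ls *.jpg | grep "DV0" > output.txt

cat output.txt
# prints:
# DV0123.jpg
# DV0456.jpg
```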
I promise you'll find uses for them everywhere, and you'll soon wonder how you ever lived without them!
Labels: commands, tools, unix