Having been living in Linux for a while (and to a lesser extent, Mac OS X), I’ve had to learn a bit about the ways of the command line. As someone whose only other experience with a command-line interface was DOS (with which I was about as adept as one could be), I was skeptical of the value of the command line, and reluctant to learn.
That said, I have come to learn that while a command line interface makes it difficult to discover what options are available to you, it does come with some inherent advantages. While I’m certainly not willing to give up my graphical user interface for a text-based console any time soon, I have learned to keep a terminal window handy while using both Gnome/Linux and Mac OS X for a few particularly handy tools.
While these tips are probably absurdly obvious to many people, I suspect there are also plenty of people like myself, web developers in particular, who were raised on Windows 3.1-through-95 or Mac OS Classic and never ventured near a command line (except maybe to play Commander Keen now and then).
- WGET

Usage: wget [-options] [location]
Example: wget http://domain.com/filename.ext
Occasionally I’ll come across a page with an embedded media file, like a QuickTime movie, that I would like to download. QuickTime doesn’t let me save things locally, so I view the HTML source of the page and sniff out the URL of the movie. However, with some file types, depending on how you have your browser and media applications set up, pasting the direct URL of a media file into the browser will load the file in the media player — again, sometimes without any “Save” functionality. If I haven’t become too annoyed and frustrated to give up on downloading that butt-finger-sniffing-monkey video by now, I would actually have to open a text editor and create a quick HTML file containing a text link to the URL of the item I want to download. Then, I would save the HTML file, open it in a browser, right-click on the link, and choose “Save As…”. Good lord!
With WGET, as long as you have the URL of an item, you can download it. Just type wget followed by the URL.
For bonus points, WGET can resume downloads and supports just about any download option you might need: gzip encoding, SSL, HTTP authentication.
- WHOIS

Usage: whois [domain]
Example: whois www.actsofvolition.com
Who is indeed. When you’re trying to figure out the human name or address behind some evil scheming website (or just checking DNS info for a domain), the WHOIS domain database has your info. I used to use, and suspect many people rely on, sites that offer a web-based interface to the WHOIS database (Register.com, Userland, etc.). It’s usually faster to open up a terminal and type whois domain.com. No annoying banner ads, and the results are in a simple text format — which you can even save into a file using…
- Piping | Rules
The real power of the command line starts to become apparent when you realize the power of piping. Piping is a simple means of passing the output of one program as input to another. For example, to list the contents of a directory, I would type ls. I can search the results of that directory by passing the list of contents into a search tool, grep. ls | grep spaceman will return any files in the directory containing “spaceman” in their name. Similarly, I can pass the results of a WHOIS query into my text editor (GEdit) by typing whois actsofvolition.com | gedit, or save the results directly into a file using the redirection feature “>” (whois actsofvolition.com > aovinfo.txt).
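To make the pipe and redirect examples above concrete, here’s a little sketch you can run anywhere — it builds a throwaway directory first, so the file names (spaceman.txt and friends) are made up purely for illustration:

```shell
# Set up a scratch directory with a few dummy files.
mkdir -p /tmp/pipedemo && cd /tmp/pipedemo
touch spaceman.txt rocket.txt spaceman_photo.jpg

# Pipe: send the output of ls into grep, which filters it.
ls | grep spaceman

# Redirect: save that same filtered output into a file instead of the screen.
ls | grep spaceman > matches.txt
cat matches.txt
```

Both runs print spaceman.txt and spaceman_photo.jpg but not rocket.txt — the second just lands in matches.txt instead of the terminal.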
This saved me a load of time last week when I wanted to create a chart displaying the sizes of the various components and packages included in the Fedora Linux distribution. First, I needed a list of the file sizes of the packages, so I ran ls *.rpm -l > filesizes.txt on the install CDs. This created a text file with the package names and their file sizes, which I could then open in a spreadsheet application, sort by size, and graph to my heart’s content.
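You can recreate that chart-prep step without the Fedora CDs — the .rpm names and byte sizes below are dummies I’m conjuring up for the demo:

```shell
# Scratch directory with two fake packages of known sizes.
mkdir -p /tmp/rpmdemo && cd /tmp/rpmdemo
head -c 1024 /dev/zero > alpha.rpm
head -c 4096 /dev/zero > beta.rpm

# -l gives the long listing (including size in bytes);
# > saves it into a file instead of printing it.
ls -l *.rpm > filesizes.txt
cat filesizes.txt

# Bonus: sort numerically on the size column (field 5 in ls -l output)
# right in the shell, before the spreadsheet ever gets involved.
sort -k5 -n filesizes.txt
```

The sorted listing puts alpha.rpm (1024 bytes) ahead of beta.rpm (4096 bytes), which is exactly the ordering a spreadsheet sort would give you.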
All of this said, I still love a good icon.


