Being a dialup-only user, I've been very frustrated with iTunes' lack of resumable downloads for podcasts, etc. My attempts at downloading with Firefox from sites such as MacObserver (MacGeekGab) were also unsuccessful, since Firefox would often find the connection closed when the file was only partly downloaded. (This is the same thing I'd see in iTunes, where I would never know what had happened, except that the file size was way too small.)

I'm sure there are shareware-type downloaders out there that would work, but money doesn't grow on trees for a lot of us folks. So I use the built-in curl command in Terminal. Running curl with the options -C - -O will work for most sites. However, it won't work for sites (like MacObserver) that use a service such as CacheFly, which relies on redirects. I was stumped until I contacted CacheFly, and they pointed out that "you can have curl follow redirects by passing it the -L flag."

I had read the help for curl, but did not understand what its description of -L, "Follow Location: hints (H)," meant. Finally! Now I can grab the podcast URL from the website, and then run this command in Terminal:

curl -L -C - -O http://url.of.podcast

Even if the connection gets killed before the download finishes (hours later), I can resume and complete it. I hope this helps at least one other dialup user out there. Hopefully Apple will eventually hook iTunes up to tools like curl that ship with OS X to improve its download capabilities.
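You can even automate the resuming. Here's a small sketch of a wrapper (the function name and the retry delay are my own invention, not part of curl) that simply re-runs the same resumable command until curl exits successfully:

```shell
# resume_download: keep re-running a resumable curl download until it
# succeeds. The function name and the 5-second delay are my choices.
resume_download() {
  url="$1"
  until curl -L -C - -O "$url"; do
    echo "Connection dropped; resuming in 5 seconds..." >&2
    sleep 5
  done
}
```

Then run something like resume_download http://url.of.podcast and walk away.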

I wanted to be able, from the Terminal, to select files in the Finder. That is, to use a command like this...

showinfinder /Developer/Tools/GetFileInfo /Developer/Tools/SetFile

...and have Terminal open the Finder, with the specified file(s) selected.

The open command wasn't useful in this case, because I wanted the files to be selected. And sometimes I have the complete file path in the clipboard, and it annoys me to have to strip off the file name and then hunt for the file in the Finder.

My solution is an AppleScript called via a function in the shell. This way, you can also use completions from the shell to specify one or several files. If you call it normally, you will have to give the arguments with an absolute path. But, if the first argument is Current_path:path, then the script will look in the folder path to find files.
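The original listing isn't reproduced here, but the basic idea can be sketched as a shell function that hands each path to the Finder via osascript. The body below is my own reconstruction, not the poster's script, and it skips the Current_path:path shortcut:

```shell
# showinfinder: reveal each argument in the Finder, selected.
# A reconstruction sketch only -- not the poster's original script.
showinfinder() {
  for f in "$@"; do
    # resolve to an absolute path so the Finder can locate the file
    abs="$(cd "$(dirname "$f")" && pwd)/$(basename "$f")"
    osascript -e "tell application \"Finder\" to reveal POSIX file \"$abs\""
  done
  # bring the Finder forward so the selection is visible
  osascript -e 'tell application "Finder" to activate'
}
```

With this in your shell startup file, showinfinder path1 path2 works with Tab completion, as described above.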

The first argument to the function is the directory to search, if not the current directory. After that, the architecture could be specified (but defaults to the system architecture). The code must be run as root to affect most system directories. Here's a list of likely directories to thin:

/bin /sbin /usr/bin /usr/sbin /usr/libexec /Applications

On an Intel Mac, do not strip any Frameworks folders if you want to leave Rosetta working; on PowerPC Macs that is not an issue. Any file which is not a universal binary is ignored. The -not -perm +7000 test skips any files with a set_id bit. If you want to save a few more megabytes for some reason, change the script to find only files with a specific set_id bit, then restore the bit after lipo clears it.

Alternatively, you could use ditto to accomplish the same thing, but I do not think ditto preserves set_id bits either.
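The script itself isn't reproduced above, but the description suggests a loop along these lines. Treat this as a hedged sketch only: the function name, the architecture default, and the use of test's -u/-g checks in place of find's -perm +7000 are my substitutions, and anything like it should only be run against a backed-up system.

```shell
# thin_dir: strip universal binaries under a directory down to one
# architecture. Sketch only -- not the original script. Back up first,
# and run as root for system directories.
thin_dir() {
  dir="$1"
  arch="${2:-$(arch 2>/dev/null || uname -m)}"
  find "$dir" -type f | while IFS= read -r f; do
    [ -u "$f" ] && continue            # skip setuid files
    [ -g "$f" ] && continue            # skip setgid files
    # only touch real universal (fat) binaries; lipo -info prints
    # "... are: i386 ppc" for fat files and "Non-fat file" otherwise
    lipo -info "$f" 2>/dev/null | grep -q 'are:' || continue
    lipo -thin "$arch" "$f" -output "$f.thin" && mv "$f.thin" "$f"
  done
}
```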

[robg adds: This script is potentially very dangerous. I have not tested it. As noted, if you are going to, please back up your system first. Proceed at your own risk. Look both ways before crossing the street. Always eat your vegetables...]

I was messing around in the /usr/bin folder and found a binary called tidy, which is installed by default on OS X. Immediately curious, I looked up the function of this program. Its purpose is to generate cleaned-up versions of HTML, XML, and XHTML files, and it can even convert them. This is useful if you code web pages by hand, as I do. It fixes your mistakes, like the following, and many more:

<h1>heading
<h2>subheading</h3>

tidy will tell you the errors and then spit out a fixed version, which you can optionally save to a new file. Here are some examples:
Convert HTML to well-formed XML and write the result to a new file:

tidy -asxml test.html -output fixed.xml

Just show the errors and quit:

tidy -errors test.html

Use upper case tags for output:

tidy -upper test.html

For more options, try man tidy or tidy -h. One use for tidy I can think of is cleaning up those horribly formatted Word webpages. tidy doesn't seem to like them either, but with the use of a configuration file and the word-2000 option, it cleans them up pretty nicely. A simple script could be written to feed a Word webpage through tidy, and then strip out the extra annoyances.
See this page for more details.
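Such a script might start out like this. The option names come from man tidy, but the file names and the exact option set are my guesses, so treat it as a sketch:

```shell
# clean_word_page: run a Word-generated page through tidy using a
# config file that enables the word-2000 cleanup. The file names and
# chosen options are illustrative assumptions.
clean_word_page() {
  cat > tidy.conf <<'EOF'
word-2000: yes
clean: yes
output-xhtml: yes
EOF
  tidy -config tidy.conf -output "$2" "$1"
}
# usage: clean_word_page wordpage.html cleaned.html
```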

Unfortunately, I've recently had reason to want to check the SMART status of my PowerBook's internal disk. Apple's Disk Utility will tell you a simple pass/fail summary of the SMART status of the disk, but it won't go into any detail, show you logs, or detail particular errors. What's more, I believe it only looks at certain classes of errors to determine the Verified or Failing status, so if the drive is experiencing errors, such as its firmware silently remapping sectors, it may still be shown as A-OK.

I decided to go hunting for some more useful utilities, and first discovered SMARTReporter, which displays an icon in the menu bar showing you a visual status of the disk. This utility, however, appears to use the same criteria as Disk Utility for determining a simple pass/fail status, so in this particular case it isn't that useful to me. It is handy in a more general sense, as it can monitor multiple disks, and upon a SMART failure it can do any combination of the following: pop up an alert dialog, execute another program, or send emails to multiple addresses.

So, digging a bit deeper, I then found the smartmontools, which consist of smartctl and smartd; together, they report a whole lot of low-level SMART information, including the drive's error log. smartmontools run at the command line only, so you do have to delve into Terminal to use them, but they provide a wealth of information that is difficult, or impossible, to get any other way. I've got more information on them in a recent blog posting.

What I'm going to do now is, with a bit of scripting, add a line to the daily cron script to check the SMART status of my internal hard disk, and let me know if there are any errors at all, not just something that Disk Utility thinks is worthy of mention.
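That check might look something like this. It's a sketch only: the device name and the idea of keying on smartctl's PASSED line are my assumptions, not a tested cron job.

```shell
# check_smart: print a warning (which cron will mail to you) unless
# smartctl reports a clean bill of health. /dev/disk0 is a guess at
# the internal drive's device name.
check_smart() {
  dev="${1:-/dev/disk0}"
  out=$(smartctl -H -l error "$dev")
  if ! printf '%s\n' "$out" | grep -q 'PASSED'; then
    echo "SMART problem on $dev:"
    printf '%s\n' "$out"
  fi
}
```

Since cron only mails when a job produces output, staying silent on success is all the "let me know" logic needed.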

[robg adds: I downloaded and installed the package (following the "install from source" instructions), and it worked as described -- ./configure, make, and sudo make install installed the package. It seems to work with the SATA drives in the dual G5, even though the site doesn't make it clear that it will do so.]

This recent hint describes how to pause and resume a process or application. The automatic version of the hint involved a long, but single-line, perl script that I said could be made into an alias. However, that's easier said than done, since the script contains so many special symbols you have to escape them properly. Therefore, this hint is actually a standalone helper hint. If you want to make an alias out of a perl command or other complex script, you can let perl do it for you like this:

perl -we '$s = <STDIN>; print quotemeta $s'

Run this, then type (or paste) in any command, and it will print out an escaped version of the entered code that can then be aliased.

Occasionally I find my computer tied up with some long-duration, resource-intensive application that one can't simply quit in the middle of. For example, iDVD can run for a couple of days. Other times, on my family's multi-user machine, I'd like to be able to pause other switched-out users' resource-hungry but idle apps (e.g. Word) that cause glitches when I'm doing something like using VLC Player or watching a QuickTime 480p movie.

Rather than quitting the app, I pause and later resume it by sending it Unix signals from the command line:

kill -s STOP 3328

The above command sends the STOP signal to, in this example, process 3328, which immediately suspends the process without aborting it. When I'm ready to resume, I send it another signal to continue:

kill -s CONT 3328

Now you might be wondering: why not simply use nice and renice? Two reasons. First, and primarily, even at nice 19 that is sometimes not forceful enough, especially when the process is consuming resources other than CPU time -- network- or disk-intensive operations, for example. Second, it's tricky to undo renicing and be sure you got it right.
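For convenience, the two kill commands can be wrapped in little shell functions (the names are my own, not part of the hint):

```shell
# pausepids / resumepids: thin wrappers around the kill commands
# shown above; each accepts one or more process IDs.
pausepids()  { kill -s STOP "$@"; }
resumepids() { kill -s CONT "$@"; }
# usage: pausepids 3328 ... then later: resumepids 3328
```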

[robg adds: We covered stopping processes in this earlier hint. This hint, however, provides an automated solution based on processor load -- read on for the details. Note that I have not tested this one.]

I've been leeching off MacOSXHints for years; this is such a great site. So I think it's about time I share some of my own work. This hint was written because I wasn't entirely happy with the output of previous hints, so I kept working at it until I came up with this solution.
I wanted a way to view upcoming entries on the various Unix calendar pages -- both the stock calendars, as well as one of my own. Here's what I eventually came up with:

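The listing itself didn't survive here, but from the description below it was along these lines. The specific calendar files, the 7-day look-ahead, and the temporary file path are my guesses, not the original:

```shell
# upcoming: a guessed reconstruction of the script described below.
# Calendar file names, the look-ahead value, and the temp file path
# are assumptions.
upcoming() {
  tmp=/tmp/upcoming.$$
  calendar -f /usr/share/calendar/calendar.history -l 7  > "$tmp"
  calendar -f /usr/share/calendar/calendar.music   -l 7 >> "$tmp"
  calendar -f "$HOME/calendar"                     -l 7 >> "$tmp"
  cat "$tmp" | sort
  rm "$tmp"
}
```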
Each line that begins with calendar looks up date information in a different calendar file. The -f switch precedes the full path to a calendar file (most are in /usr/share/calendar, but you may also have personal ones elsewhere). The -l switch precedes the number of days to look ahead; when the look-ahead is set to 0, only today's date information is returned. The tail end of each calendar line directs the output to a temporary file (the >> on all but the first line appends the output to the file created by the first line). The last calendar line looks in my personal file (in my home directory), which contains personal dates of importance.

The line which begins with cat lists the temporary file and directs the output to the sort utility. sort then uses the first portion of each line to sort by date (so if your information spans one or more months, it should sort properly -- I've done limited testing on this). The last line removes the temporary file.

[robg adds: I tested this, and it seems to work as described. Remember to make the script executable after saving (via chmod a+x scriptname).]

Terminal lets you set the window title with some interesting options, but alas not the current directory. I wanted to move the current directory from the prompt to the window title. The downside is that you cannot copy the current working directory without typing pwd, but there are several upsides.

First there is the settitle function, which someone else wrote:

function settitle() { echo -ne "\e]2;$@\a\e]1;$@\a"; }

This changes the title of the current window. In Terminal, it sets the string you can also set in Window Settings (the one that defaults to "Terminal"), not the whole window title.

Next there is a new cd function to replace the built in command:

function cd() { command cd "$@"; settitle `pwd -P`; }

Using command before cd forces bash to use the built-in instead of the function, so there is no infinite loop on cd. You could pass anything to settitle; this passes the current full path with symbolic links resolved.

To make this complete, change the prompt to remove the redundant path:

default: '\h:\w \u\$ '
export PS1='\h:\W \u\$ '
export PS1='\h:\u\$ '

The default is to have the full path after the hostname. The first option reduces the full path to the name of the current directory. The second option removes the path altogether.

Finally, if you want to be consistent, add the following line somewhere so that when the shell starts the title is set. Otherwise the title will be "Terminal" or whatever is set in the Window Settings until cd is called. Of course, you may prefer that.

settitle `pwd`

There are a lot of variations on how the path is derived and displayed, and on what other information would be useful in the window title. Here is one last example that more closely matches the information in the original prompt:

"${HOSTNAME%%.*}:${PWD/$HOME/~} $USER"

I assume that most people who would care about this also know where to place the above commands. But just in case, stick all the above code in either .bashrc or .profile in the home directory.
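Assembled, the whole addition looks like this. I've used printf rather than echo -ne (equivalent output, but more portable) and && so the title only changes when cd succeeds; otherwise it is the same code as above:

```shell
# Complete addition for ~/.bashrc or ~/.profile: title-setting
# function, cd override, shortened prompt, and an initial title.
settitle() { printf '\033]2;%s\a\033]1;%s\a' "$*" "$*"; }
cd() { command cd "$@" && settitle "$(pwd -P)"; }
export PS1='\h:\u\$ '
settitle "$(pwd)"
```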

[kirkmc adds: We've run other hints on changing Terminal window titles, including this hint and this hint, both of which give different methods for putting the current directory in the window title. This is another way, and may interest some readers just because it's different.]

I am Polish, and I have files whose names contain Polish characters. One day, I was horrified to find that I could not pass such a file to an AppleScript. After much work, I finally figured out a way to pass arguments containing international characters to an AppleScript, and along the way, I learned many other things as well. I will explain this starting with simple examples, then expanding on them. In this hint, you will learn the magic shebang for an AppleScript and some useful rules for Makefiles, in addition to the main hint itself, with an example script that opens files in Preview.

I wanted a way to run an AppleScript from the command line using the Unix shebang approach. An example of what I wanted is having the first line of a Bourne shell script be #!/bin/sh. In this way, you can simply type the name of the script in Terminal to launch it. I wanted a similar approach for an AppleScript.
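For reference, the shebang alluded to is presumably osascript's. Here's a minimal sketch (the function name and the script's contents are made up, and the details may differ from the poster's solution) that writes such a script and marks it executable:

```shell
# write_demo_script: create an executable AppleScript that Terminal
# can launch by name, via the /usr/bin/osascript shebang. The
# greeting script itself is an illustrative example only.
write_demo_script() {
  cat > "$1" <<'EOF'
#!/usr/bin/osascript
on run argv
	return "Hello, " & item 1 of argv
end run
EOF
  chmod +x "$1"
}
# usage (on OS X): write_demo_script hello.scpt && ./hello.scpt World
```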

[kirkmc adds: Wow, what a hint! I have to admit, my AppleScript skills are too limited to follow all of this one, but the poster went to great lengths to explain it in detail. What follows are nearly 4,000 words on the subject, so read on if you're interested in the very fine details.]