Sitemaps, Meta Data, and robots.txt Forum

I'm rather new to the world of robots.txt, and I'm wondering if someone can help me out. I run a wiki powered by UseMod, and we've been having issues with people spamming the sandbox to inflate their Google ranking. I'm wondering if I can disallow certain query strings on a Perl script while still allowing everything else.

For example, my wiki script runs at /cgi-bin/wiki.pl. To get to the sandbox page, the URL would be /cgi-bin/wiki.pl?Sandbox; to get to the home page, it would be /cgi-bin/wiki.pl?Home, and so on.

I still want Google to index all my other wiki.pl? pages, but I'd like it to skip indexing the sandbox (/cgi-bin/wiki.pl?Sandbox). If I do something like this, will it work?
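A minimal sketch of what such a rule could look like, assuming the crawler (Googlebot does this) treats the Disallow value as a prefix match against the full URL path plus query string:

```
# Block only the Sandbox page; every other wiki.pl?... URL stays crawlable
User-agent: *
Disallow: /cgi-bin/wiki.pl?Sandbox
```

One caveat: the original robots.txt convention matched only on the path, so some older crawlers may ignore the query-string portion. As a belt-and-braces measure, the Sandbox page itself could also emit a `<meta name="robots" content="noindex">` tag in its HTML head, which major search engines honor regardless of robots.txt.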