At the risk of breaching forum rules I will mention names, but the links are not forthcoming.

Step one: you need a news indexer. It serves the same function as torrent sites, essentially, i.e. it tells you WHAT is out there, and only that.

You want a program like Newsleecher. It performs the same function as the torrent client, i.e. it "gets" the downloadable data from the source.

Newsleecher is, by the way, offered free with Giganews (normally you pay a yearly licensing fee to use Newsleecher). Giganews is the actual source of the data (sort of; the process is naturally not that literal).

Giganews is a hosting service with varying plans, much the same as an ISP has varying plans. It's a monthly cost, and the full-out deluxe plan costs you 25 bucks a month. So your cost is going to be basically that, for the mega version.

In order to use a newsgroup painlessly, you will need to get a free program called QuickPar. QuickPar is a file fixer: downloaded files are occasionally slightly mangled, and it repairs them automatically. You need to, of course, become a comfortable user of more than just WinRAR, although that is about the most common packaging you will run into (a rough sketch of the repair-then-extract flow follows below).
You will likely need to learn how to use IMGburn too, as data is often an image file, and it is about the best tool out there.
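To make that concrete, here is a minimal sketch of the repair-then-extract step that QuickPar and WinRAR handle on Windows, done from Python with the par2cmdline and unrar command-line tools instead (both assumed to be installed; the file names are made up for illustration):

```python
# Verify/repair a download with its PAR2 set, then extract the RAR archive.
# Assumes the "par2" and "unrar" command-line tools are installed.
import subprocess

def repair_and_extract(par2_file: str, rar_file: str, dest: str = ".") -> None:
    # "par2 r" checks the download against the PAR2 recovery data and
    # rebuilds any damaged or missing blocks.
    subprocess.run(["par2", "r", par2_file], check=True)
    # Once the set verifies, extract the RAR archive into dest.
    subprocess.run(["unrar", "x", rar_file, dest], check=True)

# Placeholder file names, purely for illustration.
repair_and_extract("some.post.vol00+01.par2", "some.post.part01.rar")
```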

A great deal of stuff on a newsgroup is BIG in file size. If you are planning to download much of anything at all, expect the sudden spike in bandwidth usage to draw attention, eh. This is something that simply isn't invisible on someone ELSE'S service.

Oh and regarding 110 gig, I can burn through that EVERY month with no effort. Depends on what you expect to download eh.

Port usage. Everything gets assigned a port when communicating over the internet: http://www.iana.org/assignments/port-numbers
Long story short, plain HTTP is port 80 and plain usenet (NNTP) is port 119; a lot of usenet providers (and if you are going to this length then you are probably using a good one) also offer SSL connections on a separate port.
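If you want to check those standard assignments yourself, here is a minimal sketch that reads your system's local services database (some systems may not list every name, hence the fallback):

```python
# Look up well-known port assignments from the local services database.
# Typical answers: http -> 80, nntp -> 119, nntps (NNTP over SSL) -> 563,
# though an entry may simply be absent on some machines.
import socket

for name in ("http", "nntp", "nntps"):
    try:
        print(name, socket.getservbyname(name, "tcp"))
    except OSError:
        print(name, "not in this system's services database")
```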

Second is packet inspection. The point of a standard is that the traffic looks the same everywhere (and probably has a few recognisable quirks of its own). As changing ports is trivial (for the most part), this method is often employed instead.
Some tools have basic encryption (looking at you, bittorrent) which is either broken or not applied well enough (early torrent encryption just encrypted the header).
College networks will do this; they normally have serious hardware for it.
Usenet, however, comes in nice SSLv3 form with something like Giganews, and that is not cracked anywhere (it being the standard used for high-level stuff the world over), let alone at a college dorm/internet connection. It will be on a different port, but seeing as banking, secure email, shopping and more are done over it, there will be no blocks.
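For a sense of what that looks like from the client side, here is a minimal sketch of an SSL usenet session using Python's nntplib (it ships in the standard library up to Python 3.12; the hostname, credentials and group name below are placeholders, not a real account):

```python
# Connect to a usenet provider over SSL (port 563 by default) and list a group.
# Host, user, password and group are placeholders for illustration only.
import nntplib

server = nntplib.NNTP_SSL("news.example-provider.com", user="me", password="secret")
print(server.getwelcome())                 # the encrypted session is now up
resp, count, first, last, name = server.group("alt.binaries.example")
print(f"{name}: {count} articles ({first}..{last})")
server.quit()
```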

110 gigs. It seems Virgin got their act together long enough to give downloads at just over a megabyte a second. At that rate 110 gig is just over a day at full pelt; in practice though, a week is more likely for me. It does however fall under the "big" rule of computing (see any nostalgic/reflective computer person post anywhere on the internet).
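The back-of-an-envelope sum behind "just over a day", assuming a flat 1 megabyte a second:

```python
# 110 GB at a steady 1 MB/s, expressed in days.
cap_mb = 110 * 1024          # 110 GB in megabytes
rate_mb_per_s = 1.0          # "just over a megabyte a second", rounded down
days = cap_mb / rate_mb_per_s / 3600 / 24
print(f"{days:.1f} days")    # ~1.3 days of flat-out downloading
```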

Retention. Usenet is a bit like a forum: text posts last indefinitely these days, but binaries are not so lucky. The current maximum (only from Giganews as far as I am aware) is 240 days, and bear in mind that at any point between upload and deletion it will come down at full speed (or as fast as your connection will allow; one connection (of my 10) from the US can easily clock 700 kbytes/sec to here in the UK).

Everyone thus far has sage advice too. The reason you will see Giganews mentioned frequently is because they actually work (how many ISPs will provide truly unlimited downloads for $25 USD/month?) and charge a decent price for their services (relatively expensive perhaps, but fair for what they provide... I suppose the weak US dollar helps those not in the US, though).

Regarding usenet indexing sites: Binsearch, newzleech.com, binobins and, leaving the indexing sites aside, a handful of high-end torrent sites (they do a good job of serving as a TV guide/news/info on what is out there).

So can colleges actually know what you are downloading if you are using usenet (with any server)? I'll be starting school at a university in a couple of weeks and would like to know what my options are for downloading. I know that some schools provide their own usenet servers free to students (mine does too); however, I don't know the details on what they offer.

@bangbanger MAC address filtering is only useful (and even then that is debatable) for grouping people and for denying people access to the network. I suppose bans/putting you in the slow lane can occasionally occur, but I would sooner do that other ways. Basically, if you already have access to the network, you do not need to do anything about it.

@Seraph in the last 24 hours several thousand gigabytes (probably more, maybe hundreds of terabytes) of binaries have been uploaded. No college is going to provide that (see that 200 meg limit for each of a few thousand students; there's a quick sum at the end of this post) unless they outsource it (which is not unheard of, but not to be expected either), which means they will probably only carry discussion groups.
If they outsource like some ISPs do, then they will probably bounce it off one of their own servers, so logging is possible there.
The only way to know is to ask or find a spec sheet.
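For the quick sum mentioned above (the student count and daily volume are the rough figures from this post, not measurements):

```python
# Rough sanity check on why a college won't mirror binary groups:
# the whole campus quota pool vs. one day of new binary uploads.
students = 5000              # "a few thousand students", assumed figure
quota_mb = 200               # the 200 meg limit mentioned above
pool_tb = students * quota_mb / 1024 / 1024
daily_upload_tb = 5          # "several thousand gigabytes" per day, low end
print(f"whole campus quota pool: ~{pool_tb:.1f} TB")      # ~1.0 TB
print(f"one day of binary uploads: ~{daily_upload_tb} TB and rising")
```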