At long last, after years of debate and deliberation, the FTC finally released its recommendations for online privacy. Titled “Protecting Consumer Privacy in an Era of Rapid Change” [PDF], the 112-page report offered no big surprises. Bottom line: The FTC wimps out pretty badly in most cases.

Here are the highlights and lowlights:

1. Legislate this, mofo

The FTC report calls upon Congress to pass “baseline privacy legislation” ensuring our rights online and off. (To which I say, about friggin’ time.) Beyond that, though, the FTC was mum about what that legislation should actually do, other than be “technologically neutral and sufficiently flexible to allow companies to continue to innovate.”

Call me a cynic, but I think the best we can expect from this is some heavily watered-down law written by industry lobbyists that offers little to no real consumer protection.

2. Don’t Track Me, Bro

The FTC report was full of hugs and kisses for the online ad industry’s voluntary Do Not Track mechanisms and Ad Choices program, which I’ve covered more than a few times in this space, but it stopped short of calling for new Do Not Track legislation. Me, I’m not quite as enamored of the industry’s self-regulation efforts. For one thing, the ad industry isn’t really offering a “Do Not Track” opt-out; it’s offering a “Do Not Target” opt-out.

That’s great. The problem? Fewer than one-fifth of the 800-plus online tracking companies tallied by Evidon, keeper of the Ad Choices program, are members of the Digital Advertising Alliance (DAA). What will the rest do with your tracking info? Your guess is as good as mine, but I’m sure it won’t be to your benefit.
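For the technically curious: the browser side of Do Not Track is nothing more than a single HTTP request header, DNT: 1. A minimal sketch in Python (the URL is just a placeholder) shows how little is actually being sent on your behalf:

```python
import urllib.request

# Build a request carrying the Do Not Track preference header.
# "DNT: 1" means the user opts out of tracking -- but honoring it
# is entirely voluntary on the tracker's end, which is the rub.
req = urllib.request.Request("https://example.com/")
req.add_header("DNT", "1")

# urllib normalizes header names to "Dnt" internally.
print(req.get_header("Dnt"))  # → 1
```

That one bit of preference is the whole signal; whether a tracking company respects it, ignores it, or merely stops showing you targeted ads while still logging your visits is up to the company.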

3. Data brokers gone wild

The FTC was less sanguine about the alleged self-regulatory efforts of data brokers, calling for Congress to pass new legislation…

… that would provide consumers with access to information about them held by a data broker. To further increase transparency, the Commission calls on data brokers that compile data for marketing purposes to explore creating a centralized website where data brokers could (1) identify themselves to consumers and describe how they collect and use consumer data and (2) detail the access rights and other choices they provide with respect to the consumer data they maintain.

So, in other words, the FTC wants those 800-odd data brokers, along with massive information-collecting machines like LexisNexis and ChoicePoint, to tell you what dirt they’ve got on you. And if the data broker says you have zero rights to change or delete your data? Too friggin’ bad.

Actually, it’s worse: The FTC merely wants the brokers to “explore” creating a central database. (Data brokers to FTC: We followed your recommendation and explored building a database, then we decided it was too much of a hassle. See ya.)

Even if such legislation comes to pass, it could offer fewer rights than we currently have with credit reporting agencies. Highly flawed though they may be, the Fair Credit Reporting Act (FCRA) and the Fair and Accurate Credit Transactions Act (FACTA) do give consumers some useful benefits: the ability to obtain free annual reports from the three big reporting agencies, to correct inaccuracies in the data, and to put fraud flags on your account if your identity has been stolen. They also offer the ability to opt out of having your financial information sold for marketing purposes (though the agencies mostly manage to skirt that one).

It’s pretty clear you can’t opt out of credit reporting and expect to get a credit card, or a mortgage, or even rent an apartment in some cases. Many jobs require you to pass a background check, of which a credit report is usually a key element. If you want their stuff (credit, a job, insurance) you have to play by their rules (data collection).

Now imagine a world where the credit reporting agencies did more than just rate whether you’re qualified for a low-interest loan or a new store credit card. Imagine a credit bureau for everything, based on a profile concocted from your Web activity. (And not necessarily even from your Web activity so much as your browser’s, which could be generated by a number of different users.)

Visit the wrong Web sites, get placed into the wrong profile “bucket,” and suddenly your insurance rates skyrocket or your mortgage refinance gets nixed or you can’t get a security clearance for that job you desperately need. No one can tell you why, because the decisions were based on data that may have nothing to do with how big a health, credit, or security risk you actually are.

When these decisions are made by algorithms and driven by data over which we have no control, who’s got our backs?

Dan Tynan has been writing about technology since Mark Zuckerberg was in nappies. A prolific freelance writer whose work has appeared in more than 70 publications, he is the former editor in chief of Yahoo Tech and a longtime contributing editor for InfoWorld and PCWorld.