As yet there has been nothing definite about when we can expect this to happen, but the message is clear: although Google considers the impact of the original Penguin to have been relatively small (compared to Panda), it is going to hit more sites when it eventually runs again.

In the absence of the expected "links disavow" tool (which is the subject of another thread so please do not discuss it here), what can website owners realistically do in advance to try to prepare for what is to come?

I know of sites that are still doing very well despite seemingly doing things that, according to the most popular theories about what triggers Penguin, should have caused them problems.
Perhaps next time they will be hit as well.
What you are saying is fine, but simply stopping optimizing is not, in my opinion, going to be enough; on the evidence of the first Penguin, it is necessary to undo things that have already been done.
My question is: how do sites start to go about that?

I have been getting a lot of emails from people asking me to delete forum profiles. Not sure if you can see this link, but here it is anyway ( http://www.mbaguys.net/member.php?u=87810 ). That was a spam profile created for a link drop, and I was contacted to remove it by another SEO expert who had been hired to clean things up, so possibly this is how people are getting started on complying with Penguin.

I don't have any such emails for G4Ef, possibly because that site went nofollow pretty early, and that was not the case with MBAGuys.

I wanted to add that, to make life easier for people who are willing to remove their links now, member profiles are only visible to members on MBAGuys.

I think that a large part of the problem is that because Penguin hasn't been run again for four months, there is no real feedback from those hit by it as to whether whatever they did was successful, because they simply do not know yet.
That is coupled with the fact that even when supposedly "bad" links are removed, Google is not aware of it because it hasn't re-crawled the sites they came from.
I feel sure that there are webmasters who consider themselves lucky not to have been hit the first time, and who are now wondering (faced with all the problems experienced by those who were) what they can actually do to put things right before the next update.

As I check my stats, I feel the update is actively running, because one of the best traffic drivers, a page for one of my keywords, is no longer found in Google. To be honest, I never worked on that page for traffic or ranking, but now I see one of my other pages ranking instead of it and slowly moving up.

In the middle of this, I sometimes see my old page bounce back in the rankings as well.

This suggests that Google is taking the feedback and making changes, but possibly not making them public, to prevent misuse.