Predictors are often regarded as black boxes that treat all incoming records exactly the same, regardless of whether they resemble those from which the predictor was built. This is inappropriate, especially in adversarial settings, where rare or unusual records are of critical importance and some records may occur because of deliberate attempts to subvert the entire process. We suggest that any predictor can, and should, be hardened by including three extra functions that watch for different forms of anomaly: input records that are unlike those previously seen (novel records); records that imply the predictor is not accurately modelling reality (interesting records); and trends in predictor behaviour that imply reality is changing and the predictor should be updated. Detecting such anomalies prevents poor predictions from being made silently, and allows for responses such as human intervention, using a variant process for some records, or triggering a predictor update.
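The three watcher functions described above can be sketched as a thin wrapper around any black-box predictor. This is a minimal illustrative sketch, not a prescribed method: the centroid-distance novelty test, the confidence-based interest test, the flag-rate drift test, and all thresholds (`novelty_quantile`, `interest_threshold`, `drift_window`) are assumptions chosen for simplicity.

```python
import numpy as np


class HardenedPredictor:
    """Wraps a black-box predictor with three anomaly watchers:
    novelty (input unlike training records), interest (predictor is
    a poor fit for this record), and drift (trend suggesting the
    predictor should be updated). All tests here are illustrative."""

    def __init__(self, predict_fn, train_X, novelty_quantile=0.99,
                 interest_threshold=0.6, drift_window=200):
        self.predict_fn = predict_fn  # assumed to return class probabilities
        self.train_X = np.asarray(train_X, dtype=float)
        self.centroid = self.train_X.mean(axis=0)
        dists = np.linalg.norm(self.train_X - self.centroid, axis=1)
        # Novelty boundary: distance beyond what training records exhibited.
        self.novelty_radius = np.quantile(dists, novelty_quantile)
        self.interest_threshold = interest_threshold
        self.drift_window = drift_window
        self.recent_flags = []  # rolling record of per-record anomaly flags

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        proba = self.predict_fn(x)
        flags = set()
        # 1. Novel record: far outside the region the predictor was built on.
        if np.linalg.norm(x - self.centroid) > self.novelty_radius:
            flags.add("novel")
        # 2. Interesting record: predictor is unsure, i.e. near its boundary.
        if max(proba) < self.interest_threshold:
            flags.add("interesting")
        # 3. Drift: a sustained rise in anomaly rate over recent records.
        self.recent_flags.append(bool(flags))
        self.recent_flags = self.recent_flags[-self.drift_window:]
        if (len(self.recent_flags) == self.drift_window and
                sum(self.recent_flags) > 0.5 * self.drift_window):
            flags.add("drift")
        return int(np.argmax(proba)), flags
```

Each flag maps to one of the responses named above: a novel record might be routed to human intervention, an interesting one to a variant process, and a drift flag might trigger a predictor update.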