We recently showed (Smith & Goodale, 2011, VSS) that early visual and lateral occipital regions contain rich information about facial emotion categories. We observed reliable decoding that generalized across entirely different face sets, and found a reliable correlation with human behaviour that was independent of low-level features (V1 model), suggesting that top-down influences may be implicated in the implicit processing of facial expressions. In the present work we test more directly the role of top-down influences in modulating the processing of facial emotions in visual cortex. We reanalysed data that were previously partially published (Han et al., 2012), in which participants performed an explicit emotion categorization task (5AFC: happy, fear, disgust, anger and neutral) in a rapid event-related fMRI protocol. Faces were presented either as whole faces, as eyes only, or as the rest of the face with the eyes removed, thus providing two stimulus conditions with no overlap in the visual information actually presented (eyes only; rest of face minus eyes). We show that decoding of facial expressions is possible in early visual, face-selective (FFA & OFA) and frontal brain regions (prefrontal cortex and inferior frontal gyrus). Strikingly, we find that successful decoding of emotion generalizes across these entirely non-overlapping sets of presented visual information in occipital cortex (i.e. train on eyes only and test on rest of face, or vice versa). This suggests that feedback to early parts of visual cortex can be remarkably rich, even allowing reactivation of representations of non-presented parts of a stimulus to aid successful categorization (see Smith & Muckli, 2010; Petro, Smith et al., 2013).
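The cross-decoding logic described above (train a classifier on one stimulus condition, test it on a condition with no overlapping visual input) can be sketched with synthetic data. This is an illustrative toy example, not the authors' analysis pipeline: the voxel counts, trial counts, noise levels and the nearest-centroid classifier are all assumptions made for the sketch. Above-chance accuracy here arises only because the simulated conditions share an emotion-specific signal, mirroring the interpretation that a common emotion representation spans eyes-only and rest-of-face responses.

```python
import random
import math

random.seed(0)

EMOTIONS = ["happy", "fear", "disgust", "anger", "neutral"]  # 5AFC categories
N_VOXELS = 50   # hypothetical ROI size
N_TRIALS = 40   # hypothetical trials per emotion per condition

# Emotion "signatures" shared across conditions: cross-decoding can only
# succeed if some emotion information is common to both response patterns.
shared = {e: [random.gauss(0, 1) for _ in range(N_VOXELS)] for e in EMOTIONS}
# Condition-specific offsets stand in for the non-overlapping visual input.
cond_offset = {c: [random.gauss(0, 1) for _ in range(N_VOXELS)]
               for c in ("eyes_only", "rest_of_face")}

def simulate(condition):
    """Noisy voxel patterns: shared emotion signal + condition offset + noise."""
    data = []
    for e in EMOTIONS:
        for _ in range(N_TRIALS):
            pattern = [shared[e][v] + cond_offset[condition][v]
                       + random.gauss(0, 1.5) for v in range(N_VOXELS)]
            data.append((e, pattern))
    return data

def train_centroids(data):
    """Nearest-centroid classifier: store the mean pattern per emotion."""
    sums = {e: [0.0] * N_VOXELS for e in EMOTIONS}
    counts = {e: 0 for e in EMOTIONS}
    for e, pattern in data:
        counts[e] += 1
        for v in range(N_VOXELS):
            sums[e][v] += pattern[v]
    return {e: [s / counts[e] for s in sums[e]] for e in EMOTIONS}

def classify(centroids, pattern):
    def dist(c):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(c, pattern)))
    return min(EMOTIONS, key=lambda e: dist(centroids[e]))

# Train on eyes-only trials, test on rest-of-face trials.
centroids = train_centroids(simulate("eyes_only"))
test = simulate("rest_of_face")
correct = sum(classify(centroids, p) == e for e, p in test)
accuracy = correct / len(test)
print(f"cross-decoding accuracy: {accuracy:.2f} (chance = 0.20)")
```

Deleting the `shared` signal (so the two conditions have nothing in common beyond noise) drops accuracy to chance, which is the control logic behind interpreting cross-condition generalization as evidence for a shared representation.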