Although humans are experts in visual shape processing, several recent studies have demonstrated that haptics can also yield highly detailed shape representations. Here, we investigate how vision and touch are combined when the two are placed in a potentially noticeable conflict in a shape similarity judgment task. To gain full control over the parameters of the conflict, we used a novel, calibrated virtual-reality setup in which observers see their hand exploring an object while simultaneously touching a physical instantiation of it in the real world. In the experiment, N = 18 participants explored two objects in succession (one baseline and one test object) in each trial and indicated whether they were the same or different. Stimuli were taken from a parametrized morph space of novel, three-dimensional objects varying in perceptually equidistant steps. We used two randomly interleaved staircases to identify the morph-parameter difference at which a test object was perceived as the "same" as one of two baseline objects. Importantly, the staircases were run in two conditions: a congruent condition, in which the visually and haptically explored objects were identical, and an incongruent condition, in which the haptic information for the test object was much closer to the baseline object than the visual information. Paired t-tests on the final morph-parameter differences showed that participants perceived the test objects in the incongruent condition to be much closer to the baseline object than those in the congruent condition (t(17) = 6.72, p < .001), indicating that the haptic input influenced the overall judgment. Surprisingly, 15 of the 18 participants even showed "haptic capture" in this conflict condition, largely ignoring the visual information (which all participants judged to be highly realistic in the virtual-reality display).
Our results show that even for shape processing, haptic information can override visual input in a supra-threshold conflict task.
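For readers who want to reproduce the reported analysis on their own data, the paired t-test on per-participant thresholds can be sketched as follows. This is a minimal illustration only; the input values below are hypothetical placeholders, not the experimental data from this study.

```python
import math

def paired_t(congruent, incongruent):
    """Paired t-test statistic for two matched samples.

    Returns (t, degrees of freedom), where df = n - 1.
    """
    assert len(congruent) == len(incongruent)
    # Per-participant differences between the two conditions
    diffs = [a - b for a, b in zip(congruent, incongruent)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Unbiased sample variance of the differences
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    t = mean / math.sqrt(var / n)
    return t, n - 1

# Hypothetical final morph-parameter differences (one value per
# participant and condition); NOT the data reported above.
congruent_thresholds = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2]
incongruent_thresholds = [1.2, 1.5, 0.9, 1.3, 1.1, 1.4]
t, df = paired_t(congruent_thresholds, incongruent_thresholds)
print(f"t({df}) = {t:.2f}")
```

In practice one would use `scipy.stats.ttest_rel`, which additionally returns the p-value; the hand-rolled version here only makes the computation of the statistic explicit.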