It has frequently been reported that stereoscopically defined, curved surfaces have greater perceived curvature when the axis is horizontal (tilt = 90 deg) than when it is vertical (tilt = 0 deg). This difference, which has been called curvature anisotropy, would be unlikely to occur in haptics. In our experiments, observers made curvature discriminations from visual information alone, from haptic information alone, and from both. Visual stimuli were curved surfaces depicted by random-dot stereograms, partially occluded by an aperture. PHANToM force-feedback devices were used to create the haptic stimuli. The parabolic curved surfaces were oriented vertically (tilt = 0 deg) or horizontally (tilt = 90 deg). We found that most observers made finer curvature discriminations visually for tilt = 90 deg than for tilt = 0 deg, and that haptic discrimination thresholds did not vary with tilt. We used the within-modal discrimination thresholds to set the parameters of a maximum-likelihood estimation (MLE) model for inter-modal visual-haptic judgments. The MLE model predicts how the weights given to visual and haptic information should vary with tilt in an inter-modal experiment. The inter-modal results were quite consistent with the predictions: observers gave predictably more weight to vision when the surfaces were oriented horizontally than when they were oriented vertically. Thus, the relative reliability of visual and haptic information influences inter-modal perception of curvature in a sensible and predictable fashion.
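The MLE prediction described above can be sketched numerically. Under the standard assumption that each within-modal discrimination threshold is proportional to the standard deviation of Gaussian noise on that cue, the model weights each cue by its inverse variance, and the combined estimate has lower variance than either cue alone. The threshold values below are illustrative placeholders, not data from the experiments.

```python
import math

def mle_weights(sigma_v, sigma_h):
    """Inverse-variance (reliability) weights for the visual and haptic cues.

    Returns (w_visual, w_haptic), which sum to 1.
    """
    r_v = 1.0 / sigma_v ** 2  # visual reliability
    r_h = 1.0 / sigma_h ** 2  # haptic reliability
    w_v = r_v / (r_v + r_h)
    return w_v, 1.0 - w_v

def combined_sigma(sigma_v, sigma_h):
    """Predicted inter-modal threshold under the MLE model.

    The combined variance is the product of the single-cue variances
    divided by their sum, so it is always below the smaller variance.
    """
    var = (sigma_v ** 2 * sigma_h ** 2) / (sigma_v ** 2 + sigma_h ** 2)
    return math.sqrt(var)

# Hypothetical thresholds: vision finer than haptics at tilt = 90 deg,
# coarser at tilt = 0 deg; haptic threshold constant across tilt.
w_v90, w_h90 = mle_weights(sigma_v=0.5, sigma_h=1.0)  # vision weighted heavily
w_v0, w_h0 = mle_weights(sigma_v=1.5, sigma_h=1.0)    # vision weighted less
```

With these placeholder values the visual weight is 0.8 at tilt = 90 deg and about 0.31 at tilt = 0 deg, mirroring the predicted shift of weight toward vision for horizontally oriented surfaces.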