I tried scaling the domain and then applying DataRange inside ListContourPlot, but apparently ListContourPlot rescales the data before constructing its interpolation, which rather defeats the purpose.

My current workaround is to scale the domain, construct an Interpolation object of order 1, then call ContourPlot on that object using explicitly scaled input arguments.
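For concreteness, the workaround looks roughly like this (the data and the factor of 1000 are toy stand-ins for my actual setup):

```mathematica
(* toy stand-in for my data: {x, y, f} triples with x ~ O(10), y ~ O(10^4) *)
data = Flatten[Table[{x, y, Sin[x] Cos[y/1000.]},
    {x, 1., 10.}, {y, 1000., 10000., 500.}], 1];

(* scale y down into the same order of magnitude as x, then interpolate *)
if = Interpolation[{#[[1]], #[[2]]/1000., #[[3]]} & /@ data,
   InterpolationOrder -> 1];

(* plot over the original coordinates, feeding the object scaled arguments *)
ContourPlot[if[x, y/1000.], {x, 1.1, 9.9}, {y, 1100, 9900}]
```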

However, I am not happy with that solution, because the domain of my actual data is slightly ragged. To keep Mathematica from plotting the regions where the interpolation object extrapolates garbage, I have to restrict the plot region explicitly, either using

ContourPlot[..., {x, 1.1, 9.9}, {y, 1100, 9900}]

or using RegionFunction, which is slow and requires a hefty amount of manual twiddling to get the region right. One nice feature of ListContourPlot is that it usually seems to truncate the plot region to the Delaunay "hull" of the input data, a behavior I'd like to keep. (Can anyone confirm that this is indeed what it does?)
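To illustrate the kind of clipping I mean: for a convex point set, the boundary of the Delaunay triangulation is just the convex hull of the data sites, so the truncation I'd like can be approximated with RegionMember over a ConvexHullMesh (this is my guess at what ListContourPlot does, not anything documented; the data here is again a toy stand-in, and ConvexHullMesh/RegionMember need version 10 or later):

```mathematica
(* toy stand-in data and order-1 interpolation, y scaled by 1/1000 *)
data = Flatten[Table[{x, y, Sin[x] Cos[y/1000.]},
    {x, 1., 10.}, {y, 1000., 10000., 500.}], 1];
if = Interpolation[{#[[1]], #[[2]]/1000., #[[3]]} & /@ data,
   InterpolationOrder -> 1];

(* membership test for the convex hull of the (scaled) data sites *)
inQ = RegionMember[ConvexHullMesh[{#[[1]], #[[2]]/1000.} & /@ data]];

(* clip the plot to the hull instead of hand-tuned rectangular bounds *)
ContourPlot[if[x, y/1000.], {x, 1, 10}, {y, 1000, 10000},
 RegionFunction -> Function[{x, y, f}, inQ[{x, y/1000.}]]]
```

This works, but evaluating the membership function at every sample point is exactly the slowness I complained about above.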

Is there a way I can tell ListContourPlot what numerical precision to use when constructing the interpolation? I would have expected the algorithm to break down only for aspect ratios approaching the reciprocal of machine precision, not a mere 10^4. If this can be done, all the other problems seem to be handled elegantly inside ListContourPlot.

I am not really interested in trying to regularize or smooth my data, as some of the raggedness is crucial for interpretation.