Helpful Hints/D.I.Y.: A new resolution test target


pro member

Hi Folks,

Given the huge popularity of my Autofocus Microadjustment tool, I thought it might be interesting to also provide a tool to allow testing for resolution. It can be printed with any reasonable quality (inkjet) printer, and therefore offers a low cost solution for those who need to verify the resolution of their imaging chain (e.g. to test if a newly purchased lens performs well).

There are many types of resolution targets, but not all are suited to testing digital cameras. In particular, targets with parallel black-and-white bars (such as the well-known 1951 USAF resolution test chart) are not really suited to testing discrete sampling devices like our digital cameras. Not only are such targets sensitive to chance alignment with the sensel grid, the high spatial frequency edges of the bars also hamper reliable determination of the limiting resolution.

So what is needed is a type of target that is made from smooth, e.g. sinusoidal, cycles of brightness that have well understood characteristics and that are relatively easy to analyse, also numerically (e.g. with Fourier analysis). We can use a bar with increasingly higher spatial frequencies (see example) and observe that, as the spatial frequency increases, the recorded contrast will decrease, until we reach the absolute limiting resolution where contrast reaches zero. The drawback of such a bar test target is that we need to calibrate the image magnification (focal length and shooting distance) if we want to make a meaningful statement about the resolution in absolute terms, e.g. cycles per millimetre. It also only determines resolution in a single direction, so it would require several tests or targets at different angles to detect orientation specific issues.
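The single-direction frequency sweep described above can be sketched numerically. This is a hypothetical illustration (the function name and parameters are mine), not the actual target generator:

```python
import numpy as np

def sine_sweep(width_px, f_start, f_stop):
    """1-D sinusoidal frequency sweep (linear chirp), values in 0..1.

    f_start/f_stop are spatial frequencies in cycles/pixel at the left
    and right edges of the strip.
    """
    x = np.arange(width_px)
    # Instantaneous frequency rises linearly; the phase is its integral.
    f = f_start + (f_stop - f_start) * x / (width_px - 1)
    phase = 2 * np.pi * np.cumsum(f)
    return 0.5 + 0.5 * np.sin(phase)

strip = sine_sweep(1024, 0.01, 0.5)  # sweep up to the Nyquist limit
```

Rendering such a strip and photographing it shows exactly the contrast fall-off toward zero that the text describes.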

To solve that, and make the readout more accurate, I had already devised, many years ago, a modification of another existing type of target intended for analog image recording, i.e. on film or video tubes. The original target was, AFAIK, invented by the Americans Jewell and Nutting for lens resolution testing. It consisted of 72 alternating black and white tapered sectors arranged in an 8 inch circle, which made it possible to know the exact spacing between the segments at any given diameter. The Siemens company developed a test method based on the Jewell star, which is why it also became known as a Siemens star.

The sharp high spatial frequency edges in that original target make it unsuitable for reliable testing of discrete sampling systems (such as digicams), hence my modification to a sinusoidal type of radial grating. One of the very useful properties of such a star target is that it is insensitive to differences in image magnification caused by focal length and shooting distance. After all, all spatial frequencies are available at different diameters regardless of recording size. The only thing that matters is a measurement of diameter. All we need to make sure of is that the finest details are smaller than the resolution capabilities of the best component of our imaging chain.

This new test target avoids the error generating sharp edges, and records real resolution in many orientations with a single shot!

Anything smaller than the sampling density at the Nyquist frequency will be either blurred to zero contrast or aliased (also depending on things like anti-aliasing filters and/or defocus). Aliasing artifacts will stand out by their seemingly hyperbolic divergence from the expected radial direction.
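For illustration, the core of such a sinusoidal star pattern can be sketched in a few lines of NumPy. This is a simplified stand-in for the real target, which also uses super-sampling and carries extra calibration features; the function name is mine:

```python
import numpy as np

def sinusoidal_star(size_px=1440, cycles=144):
    """Sinusoidal radial star, values in 0..1.

    Brightness varies as a cosine of the polar angle, giving 'cycles'
    smooth periods around any circle centred on the pattern.
    """
    c = (size_px - 1) / 2.0
    y, x = np.mgrid[0:size_px, 0:size_px]
    theta = np.arctan2(y - c, x - c)
    return 0.5 + 0.5 * np.cos(cycles * theta)

star = sinusoidal_star()
```

Near the centre the angular cycles become finer than the pixel grid can represent, which is exactly where the aliasing behaviour described above shows up.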

What can it be used for?
The target can be used to test printers, and/or lenses plus cameras. It can serve to determine maximum resolution of a lens in its center and/or corners, and unveil asymmetry (e.g. due to decentering, or vibration). It can help to verify the acceptable DOF limits. It can visualize the trade-off by your camera or Raw converter between detail and artifacts when demosaicing the Bayer CFA filtered data. It also allows to determine differences in resolution from the same file when processed by different Raw converters, or Noise reduction algorithms. The resolution target allows to determine an absolute number for the limiting resolution, but also gives a visual impression, especially about potential artifacts (an insight which can be useful when comparing cameras to be used for certain tasks).

Where can you find it for download?
Now, the target itself. You can download a 16-bit/channel RGB file for printing (right mouse button click for Save-as);
for HP/Canon inkjet printers (10.8 MB)
for Epson inkjet printers (15.6 MB)
Warning: People prone to epileptic response when viewing alternating bright and dark image patterns are advised not to look/stare at the pattern, especially when zoomed in.

How do you print it?
Print it at the indicated PPI, without printer enhancements, on glossy photo paper. That PPI will usually be set by the printer driver when you select its maximum quality settings for glossy paper and set the correct size (130x130mm).

The target itself has no ICC colorspace profile embedded, so I suggest assigning the printpaper output profile to the file before you actually print it with that same profile. Doing so will probably keep the image brightnesses in the gradients distributed as evenly as intended. If you don't have a profiled print path, then just use the default printer driver's options for your choice of paper and let the printer manage the translation. The step-wedge grayscales allow you to recalibrate the image for numerical evaluation after regular gamma adjustment if necessary, but they also serve as a visual guide for evaluating neutral print quality.

How do you use it?
This should produce a 130x130mm test target, which can reveal issues with your printer (e.g. irregular paper feeding, or too much ink). You should shoot it with your (digi)cam from a (non-critical) distance of between 25-50x the focal length. Most good (inkjet) prints on glossy paper can achieve something like 0.1 mm (254 PPI) resolution or better (despite some ink bleed), so at a distance of 25x the focal length the optical resolution should be better than most regular lenses can resolve in air, and it certainly outresolves most (if not all) sensor arrays by the time we reach 50x the focal length.
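As a rough check on those distances, a thin-lens estimate says that at an object distance of k focal lengths the magnification is 1/(k - 1), so a 0.1 mm print detail lands on the sensor at roughly 0.1/(k - 1) mm. A sketch (the helper name is mine):

```python
def on_sensor_detail_mm(print_detail_mm, k):
    """Thin-lens estimate of how large a print detail lands on the sensor.

    At an object distance of k focal lengths the magnification is
    1 / (k - 1), so the detail shrinks by that factor.
    """
    return print_detail_mm / (k - 1)

# A 0.1 mm print detail (254 PPI) at 25x and 50x the focal length:
d25 = on_sensor_detail_mm(0.1, 25)  # ~0.0042 mm, close to a fine sensel pitch
d50 = on_sensor_detail_mm(0.1, 50)  # ~0.0020 mm, below typical sensel pitches
```

This is consistent with the advice above: by 50x the focal length, the printed detail is comfortably finer than the sensor can resolve.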

For tests of performance at very large shooting distances, the sensor image may become too small for practical evaluation. In that case you can print the image larger (thus at a lower PPI), which should also tell you something about your enlargement algorithm's quality. Again, the shooting distance is not critical; just experiment, and make sure you keep enough distance to get some low contrast blur in the center of the resulting image. The diameter of the blur center will only be affected by limiting resolution, not by the shooting distance!

The diameter of the resulting "blur" center is a measure of the "on-sensor resolution" of the whole optical chain (lens + AA-filter + sensor), and can be expressed in cy/mm by calculating "(144 / pi) / diameter". The diameter can be expressed as the number of pixels multiplied by the pixel pitch. So, for example, (144 / pi) / (100 pixels x 0.0064 millimetre) = 71.6 cycles/mm on a 6.4 micron sensel pitch camera. One can also express it as 0.46 cycles/pixel, which would be close to the maximum reliable resolution, the Nyquist frequency at 0.5 cycles/pixel.
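That conversion is easy to script; here is a sketch (function name mine) that reproduces the worked example:

```python
import math

def limiting_resolution(diameter_px, pitch_mm, cycles=144):
    """Convert a measured blur-disc diameter to limiting resolution.

    The star has 'cycles' periods per full circle, so at diameter d the
    circumferential frequency is cycles / (pi * d).
    """
    d_mm = diameter_px * pitch_mm
    cy_per_mm = (cycles / math.pi) / d_mm
    cy_per_px = cy_per_mm * pitch_mm
    return cy_per_mm, cy_per_px

# 100-pixel blur diameter on a 6.4 micron pitch sensor:
cy_mm, cy_px = limiting_resolution(100, 0.0064)
# cy_mm is about 71.6 cycles/mm, cy_px about 0.458 cycles/pixel
```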

If you want to compare differently sized sensor arrays, all you need to do is scale by the difference in physical sensor size. After all, physically larger sensor arrays require less output magnification to reach a given output size.

For digital images one can use e.g. the ruler tool in Photoshop to measure the number of pixels in the central blur diameter at various orientation angles.
For print or analog film evaluation you can use an optical microscope and a reticle, or if you scan your film you can use a scanner to evaluate the entire imaging chain (camera lens + film + scanner).

Using this target may reveal some shortcomings in your equipment or workflow, so you are warned. Do not blame me for allowing you to detect it.

Closing remarks.
I've added some other features to the test target, such as slanted edges for MTF determination and a simple detector for wrong gamma (which may yet require some tweaking). I also added a copyright notice to warn against unauthorized reproduction of the target, but obviously you can download and print the target for your personal use. Feel free to ask questions if anything is not clear.

P.P.S. I've added a few patches that can be used to quickly verify the print quality of the target. This is in reaction to recent feedback where sub-optimal print quality affected the ability to get good lens test results. The new features are explained here.

Active member

Hi, Bart,

Nice work. Thanks for this useful tool and the excellent discussion of its use.

Thanks so much for always using units of cycles per <whatever> for spatial frequency, rather than lines, line pairs, half line pairs, double lines, line pairs called lines, and so forth per <whatever>.

Users of this may be interested to experience a little-recognized property of sensor arrays with the rows and columns vertical and horizontal: the resolution along the 45°/135° meridians will typically be greater than for 0° and 90° (nothing supernatural; it's inherent in the geometry).

Fuji in fact in some of their cameras laid out the sensor with a 45° rotation of its axes to exploit this (since we typically test along, and perhaps are more interested in, resolution along the 0° and 90° meridians).

pro member

Thanks so much for always using units of cycles per <whatever> for spatial frequency, rather than lines, line pairs, half line pairs, double lines, line pairs called lines, and so forth per <whatever>.

I knew that it would be appreciated, by those who understand the (sometimes subtle, sometimes not so subtle) difference in actual practice. It's the robust performance under all circumstances that counts.

Users of this may be interested to experience a little-recognized property of sensor arrays with the rows and columns vertical and horizontal: the resolution along the 45°/135° meridians will typically be greater than for 0° and 90° (nothing supernatural; it's inherent in the geometry).

Fuji in fact in some of their cameras laid out the sensor with a 45° rotation of its axes to exploit this (since we typically test along, and perhaps are more interested in, resolution along the 0° and 90° meridians).

Indeed. The only drawback of the Fuji approach is that it requires a Raw file that is 4x larger in pixel dimensions to capture, rather than lose, most of the resolution advantage of such a 45° orientation.

My test target will unambiguously show any 45° resolution advantage of images in a regularly sampled grid pattern, be it in print or in capture of the print. In fact, I already prepared the target to have a Sqrt(2) higher resolution in the diagonal direction at its center (at the expense of some aliasing in the horizontal/vertical direction). I applied some super-sampling in the calculation of the 144 cycle pattern for the finite resolution PPI grid.

New member

Bart,

Nice work, thanks. My screen aliases like an M9 when I display this.

It may also be worth considering RGB versions for testing Bayer sensors - I suspect the performance in any particular colour is significantly worse. (Although I'm equally aware that printers cannot print a target that is pure enough in spectra to demonstrate this fully, and of course they do reflect most of the real world in that.)

Anyway, I'm pleased you found the time to think about and make this.

If I can find the time I might pull some sort of comparison together of different systems. How about 1Ds3 vs MF film vs Lumix LX5? (Actually the first pairing is a hiding to internet oblivion. MR did that years ago and got different results from my observations.)

pro member

LOL, but you need to display it at 100% zoom to view the subtleties on screen (I know that you know).

It may also be worth considering RGB versions for testing Bayer sensors - I suspect the performance in any particular colour is significantly worse. (Although I'm equally aware that printers cannot print a target that is pure enough in spectra to demonstrate this fully, and of course they do reflect most of the real world in that).

I have already considered it. My reasons for not doing it are twofold.
1. There are very few situations where a worst-case scenario of e.g. Red/Blue resolution on a Bayer CFA presents itself. Red and Blue sit at opposite ends of the visual spectrum, and are rarely seen side-by-side in practice.
2. It is also not the most important hue combination for Luminance resolution, the main driver of edge detection in the human visual system at medium spatial frequencies.

Nevertheless, for the rare case that it does matter, my test target is easily adaptable for such a worst-case scenario. For any Color resolution one can adjust the (color) gradient map in Photoshop (one of the reasons why I supplied a 16-b/ch version of the file, where printing requirements hardly justify it).

If I can find the time I might pull some sort of comparison together of different systems. How about 1Ds3 vs MF film vs Lumix lx5 (actually the first pairing is a hiding to internet oblivion. MR did that years ago and got different results from my observations)

There's nothing wrong with re-testing older assumptions based on new insight ... In fact, my test target should also function just fine with captures on film, scanned at various resolutions (the highest native resolution is best, to avoid grain-aliasing).

Active member

Indeed. The only drawback of the Fuji approach is that it requires a Raw file that is 4x larger in pixel dimensions to not lose, and capture, most of the resolution advantage of such a 45° orientation.

Imagine a sensor of dimensions 4x3 mm with a sensel pitch of 10 um, thus an array of 400 x 300 sensels, a total of 120 k sensels.

The "geometric" resolution (Kell=1 and assuming sensel resolution equates to pixel resolution) of such an array in the 0-90° meridians would be 100 cy/mm. On the diagonal meridians, it would be 141 cy/mm.

If we retain the sensel pitch but rotate the grid by 45°, we will still have (almost exactly) 120 k sensels. That would require the same number of elements in the raw file as before.

Now, the "geometric" resolution along the 0-90° meridians would be 141 cy/mm, and along the diagonal meridians 100 cy/mm.
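The comparison can be tabulated with a small helper (names mine), using the same Kell=1 convention as above (geometric resolution = 1/pitch):

```python
import math

def geometric_resolution(pitch_mm, rotated=False):
    """Geometric resolution of a square sensel grid, Kell factor = 1.

    Returns (horizontal/vertical, diagonal) resolution in cy/mm.
    Along the diagonal the effective pitch is pitch / sqrt(2), so the
    resolution there is sqrt(2) higher; rotating the grid by 45 degrees
    swaps the two figures.
    """
    base = 1.0 / pitch_mm
    diag = base * math.sqrt(2)
    return (diag, base) if rotated else (base, diag)

normal = geometric_resolution(0.010)           # (100.0, ~141.4) cy/mm
fuji_style = geometric_resolution(0.010, True)  # (~141.4, 100.0) cy/mm
```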

Although I don't know how it is exactly done, Fuji apparently reads out the data in horizontal/vertical fashion, forcing them to create intermediate grid positions to store the values. That apparently creates data files that are 2x as large (not 4x as I said) for a 41% increased hor/ver resolution and a reduced diagonal resolution. The clever thing is that most natural structures are oriented in horizontal/vertical directions (due to, or resisting, gravity), so the overall resolution gain is used effectively.

Maybe an interesting topic for another thread, to see what Fuji does in their Raw data files. I'd like to reserve this thread for questions and suggestions about the resolution test target, if you don't mind.

Someone with a Fuji camera can of course shoot my target and see what that brings in terms of resolution and % of Nyquist.

pro member

Here is an example of what the (300% zoomed in) center of the 'star' looks like of my EF 100mm f/2.8 L Macro IS at f/5.6, Raw conversions with Capture One Pro (left) and Adobe Camera Raw (right):

The camera was on tripod and the mirror was up 2 seconds before exposure. Focus was done manually with Live View. The Raw conversion on the left was with Capture One Pro 6.01 without sharpening, and the one on the right was done with Photoshop's ACR 6.3. The files were sharpened identically afterwards in Photoshop, just like I normally do.

I've added a circle at the 92 pixel diameter position of the Nyquist frequency (91.7 pixels) to make it easier to see that almost everything is resolved up to the Nyquist frequency (besides a little horizontal and vertical detail in the Capture One conversion and some false color artifacts in both), and beyond Nyquist (inside the circle) there are mostly aliasing artifacts.

The horizontal/vertical resolution of the Capture One conversion is roughly limited at a 102 pixel diameter, which would give a resolution of 0.449 cycles per pixel (90% of Nyquist), or 70.2 cycles/mm on the 6.4 micron sensor array. The ACR conversion has a slightly higher/cleaner resolution in the horizontal and vertical direction (fewer luminance artifacts), but also shows more false color artifacts, although they are low contrast. The diagonal resolution reaches Nyquist without much difficulty. We can say the lens is quite sharp.

Both Raw converters produce a high resolution conversion. The Capture One version has a slightly higher contrast near the limiting resolution which can in a worst case scenario (test targets ;-) also lead to some luminance artifacts. With the most recent versions, ACR has become much better than it was.

New member

Hi Bart

A nice example - indeed I would expect the 100 macro to be a sharp lens!

It's interesting that the AA filter is clearly not so strong as to prevent colour aliasing at frequencies approaching Nyquist, and allows some artifacting beyond that in the vertical and horizontal, although colour aliasing dominates otherwise.

If I get the chance in the next few weeks I'll run some system tests for comparison with my bits and bobs. (No, the Rolleicord is unlikely to resolve more cycles/picture height than the 5D2!)

Active member

A nice example - indeed I would expect the 100 macro to be a sharp lens!

It's interesting that the AA filter is clearly not so strong as to prevent colour aliasing at frequencies approaching Nyquist, and allows some artifacting beyond that in the vertical and horizontal, although colour aliasing dominates otherwise.

I have always thought that for an AA filter to really suppress CFA demosaicing artifacts it would have to have a really low cutoff, and that actual ones don't go very far in that regard (as the consequence would be severely degraded luminance resolution).

New member

I have always thought that for an AA filter to really suppress CFA demosaicing artifacts it would have to have a really low cutoff, and that actual ones don't go very far in that regard (as the consequence would be severely degraded luminance resolution).

Exactly. I read elsewhere that in real world (colour information etc) a good result is to reliably resolve around 70% to 75% of Nyquist frequency. This would be pretty impressive across the piece really and would explain why colour aliasing is relatively uncommon (albeit not non-existent) on AA filtered cameras.

On the M9 you see aliasing sometimes around the edges of high frequency detail, say twigs in a tree at a distance.

His work was actually not on antialiasing filters. Rather, it was on what was the effective perceptual resolution of a system with a certain geometric resolution (assuming that the subject was not so kind as to have all its transitions aligned with pixel boundaries). But the two go together.

pro member

I have always thought that for an AA filter to really suppress CFA demosaicing artifacts it would have to have a really low cutoff, and that actual ones don't go very far in that regard (as the consequence would be severely degraded luminance resolution).

And what's more, Green has a higher sampling density than Red and Blue. So an AA-filter for Red and Blue would hurt Green (the major contributor of Luminance data) too much. So a single filter can only be a compromise.

Now imagine what happens on most medium format backs, and on the Leica M8 and M9, which don't have AA-filters at all? There must be some really nifty software being used to cover up the artifacts after they have been created by those.

New member

And what's more, Green has a higher sampling density than Red and Blue. So an AA-filter for Red and Blue would hurt Green (the major contributor of Luminance data) too much. So a single filter can only be a compromise.

Now imagine what happens on most medium format backs, and on the Leica M8 and M9, which don't have AA-filters at all? There must be some really nifty software being used to cover up the artifacts after they have been created by those.

Any 'nifty' software is going to be taking a risk - I know you know this! - as it's not possible to correct for aliasing after the event, i.e. you can't separate signal from false signal, and certainly can't reconstruct it. I imagine that you could look for conditions where aliasing is likely to be present and apply local actions to cover up the effects - desaturate or average colour where colour aliasing is expected, or blur areas where false detail is likely. Of course, you run the risk of blurring real detail. Imagine a picture of a picket fence that halves in frequency at a distance where the higher frequency is at Nyquist. Such an algorithm might well blur the lower frequency detail away, even though it could be perfectly well rendered.

I have always taken the view, in spite of internet wisdom, that Canon and Nikon et al know what they're doing, and that there are good practical reasons the M8 and M9 don't have the filters. I suspect the latter is largely due to the difficulties of dealing with non-retrofocus wideangles as is. An AA filter is just adding too much to the sandwich. Of course, if the ruddy lenses weren't so good there would be no issue, but the ZM 2.8/25 has been measured (a production sample) as being able to put 400 cy/mm on to film (microfilm!) at f4.

I'll run the test and let you see the comedy aliasing and false detail some time soon.

pro member

It may also be worth considering RGB versions for testing Bayer sensors - I suspect the performance in any particular colour is significantly worse. (Although I'm equally aware that printers cannot print a target that is pure enough in spectra to demonstrate this fully, and of course they do reflect most of the real world in that)

I've just added a P.S. to my original message with links to a Red/Blue version of the targets. They look even more psychedelic because the human eye also has more difficulty resolving the differences. It's not just a Red/Blue version; I've also tried to reduce the luminance difference between the colors to make it even harder for Bayer CFA based sensor designs.

I may have to tweak them a bit more over time to improve the equal luminance difficulty level, but there are many variables involved that are beyond my control, most notably the transmission characteristics of the different CFA filter colors and how the Raw converters do their demosaicing. That's in addition to the printer limitations. Anyway, it should give a pretty good idea about the worst case performance for RGBG filter arrays. In practice one could take a weighted average between a luminance and an R/B score. The jury is still out on what that weighting should be.

Active member

I have recently bought an EF 70-200L f4 IS zoom lens for my Canon 5D MkII body. I had the option to exchange the lens within 8 days, so naturally I set out to test it as best I could. Bart's resolution target helped me achieve this quickly. Looking at the results, the 1st copy seemed to have an issue in the bottom right hand corner. I exchanged it for another copy and retested. This one was a better performer and I have kept it. Obviously, I have also taken real life pictures to test the lens in the field. But the aberration I discovered in the 1st lens would probably have gone unnoticed that way.

The pictures were taken with the camera on a tripod, using mirror lock-up and contrast focusing in live view. At first I used the 2 sec timer, but the resulting images indicated that this was not long enough to get rid of the residual shake. I then switched to using the remote control and the 10 sec timer.

I took pictures of the target at the center of the frame and also at the 4 corners. The target was taped to a window pane and the camera was set up at a distance of around 25x of the focal length used. The lens axis was set to be as perpendicular as possible to the target plane since I was also testing the corner performance. I wanted to be able to see whether the distortions in the corners would be symmetrical or not. For all pictures, I have left the camera fixed on the tripod and have moved the target around instead. This has been repeated using 4 main focal lengths (70mm, 100mm, 135mm and 200mm).

Besides using the resolution target for identifying the possible aberrations, I have also taken shots at various apertures from f4 to f22. This has helped me experimentally identify the aperture at which diffraction became an issue and how much of it I could normally tolerate.

Below is one of the center-of-the-frame test pictures. Exposure details are: 5D Mk II, EF 70-200L f4 IS, 200mm, f6.3, 1/13s, ISO 100. Let me point out that the target has a serious color shift, since the old Canon printer I used to print it would not play ball. But talking to Bart about this, we concluded that it would not be a problem for the test, and I have left it at that. The conversion from raw was done in LR3, using all-neutral settings. No sharpening, no noise reduction, no lens corrections, no clarity. The image was then taken into PS, where I added the 92 pixel Nyquist limit circle as a layer.

Full image at 100%:

Cropped image when zoomed in at 300% (not a real/permanent resizing, this was just screen captured from PS):

Cropped image when zoomed in at 300%, including the 92 pixel Nyquist limit circle (not a real/permanent resizing, this was just screen captured from PS):

Active member

Now as a comparison, here is the lower right hand corner of the 1st lens, which I swapped. As you can see, its performance was a bit "muddy". The aberration is not symmetrical and focusing is off, among other things.

Cropped image when zoomed in at 300% (not a real/permanent resizing, this was just screen captured from PS):

Cropped image when zoomed in at 300%, including the 92 pixel Nyquist limit circle (not a real/permanent resizing, this was just screen captured from PS):

pro member

Now as a comparison, here is the lower right hand corner of the 1st lens, which I swapped. As you can see, its performance was a bit "muddy". The aberration is not symmetrical and focusing is off, among other things.

Thanks for adding that as an illustration of what to look for. The lens aperture was stopped down, so it should have shown a more regularly shaped resolution limit, especially for a 200mm lens. The red flag was also raised due to the difference with the other corners (which were almost as good as the center), which proved that a better corner performance was possible.

I know we were engaged in extreme corner pixel peeping (on a full frame sensor, no less) but, since your 2nd copy performed better, it paid off in getting an even better performing lens.

Corner performance for this zoom lens, stopped down a bit, can come very close to the performance in the center of the image, and even wide open it is no slouch (although the pixel peeper in me tends to stop down 1/3rd of a stop to f/4.5 for improved corners).

pro member

Hi Folks,

Just a small update to let you know that I added a couple of features to the test chart. These features will allow you to judge the print quality of the target. The links to the targets in the first post of this thread will now download the updated targets.

1. At the top left of the medium-gray area surrounding the star feature, I've changed the earlier feature into a horizontal and vertical line test at 3 resolutions, and with 2 levels of contrast (high/low). When the resolution is optimal, then the 1, 2, and 3 pixel wide lines should be resolved. You may need a loupe to see the finest detail, especially on the lower contrast group.

Due to ink diffusion in the paper medium, the line patterns may become a bit darker than medium gray when viewed from a distance. When all of the line patterns deviate from medium gray, then there may be an issue with your profile or gamma settings. I generally assign the paper profile to the target and then print it with the same output profile, which should effectively neutralize the effect of color management (in case there is an issue with the profile).

It is also possible that the horizontal line patterns, or the vertical ones, deviate more in tone. That indicates differences between the horizontal and vertical resolution, or a limitation due to the paper used.

2. At the top right and the bottom left of the medium-gray area surrounding the star feature, the 3 resolution levels (horizontal/vertical) are displayed at all brightness levels. That may also reveal some compromises in the capability to print very accurate tonality for the highest detail level, due to having fewer pixels for dithering intermediate tones with a limited set of inks.

These additional features are not strictly required to use the target for resolution tests, but if print quality is compromised too much, then it might affect the resolution test accuracy as well. So these are more like early warning signals than features you need for the actual resolution tests themselves.

All that is required is a high quality (600 or 720 PPI) print, which will result in a 130mm square output. Note that on an Epson printer one must also set the "finest details" option, otherwise the printer driver falls back to 360 PPI. Glossy paper is most likely to have little ink diffusion but requires a bit more control when lighting it for the shoot, to avoid reflections. Some Matte papers also allow very sharp output, but many diffuse the fine detail which would make it a less preferred candidate for printing a test chart.

Then a shot at 25 - 50x the focal length is all that's needed, and a measurement in millimetres or in pixels of the central blurred disk diameter. For lens tests, I refocus on the corners to prevent field curvature from affecting the measurements, unless that is what needs to be measured (e.g. for flat surface repros).

Just shoot a correctly exposed image, with a lot of care to achieve good focus. The slightest defocus will affect resolution.


Active member

Here I will discuss some interesting aspects of the star figure that is the centerpiece of Bart's test target. I am sure Bart has discussed various of these matters before.

I will be working with Bart's "720 PPI" target. When used for printer resolution testing, this is intended to be printed at a scale of 720 px/in. And when I speak of properties that have "inch" in their names, I mean in the frame of reference of the target being at a scale of 720 px/in.

I will work with a central square portion of the target, dimensions 200 px × 200 px. I will present it here upsized to 4X (on a "dumb" basis) for ease in viewing. It bears some overlay markings I will discuss shortly.

The star consists of 144 "black" "radials". If we follow a circular path around the pattern, we will traverse 144 cycles of a nominally-sinusoidal variation in image luminance.

If the radius of that path is r, then the frequency of that "circumferential" variation will be 72/(pi r) cy/in.

Given that the pixel pitch is 720 px/in, it is tempting to say that the Nyquist frequency involved here is 360 cy/in. As we'll see shortly, it is not that simple.

The white circle overlaid on the pattern is at a radius where the circumferential frequency of the pattern is 360 cy/in. Outside that circle, the frequency is greater; inside, it is less.

But in fact with this conventional pattern of pixel locations (along vertical and horizontal lines), the Nyquist frequency is only 360 cy/in for a vertical or horizontal path. Our detail is along a vertical path when we are on a horizontal radial, and along a horizontal path when we are on a vertical radial.

But if we are on a 45° diagonal radial, the path is diagonal (yes, at right angles to the radial). It turns out that along such a diagonal path, the effective pixel spacing is less than 1/720 in, by a factor of 1/sqrt(2), or about 0.707. That means that, along a 45° diagonal path, the Nyquist frequency is 360/0.707 cy/in, or about 509 cy/in.

A similar matter occurs along other "diagonal" radials.

So in fact the line that shows where the circumferential frequency of the pattern is equal to the Nyquist frequency, when we are along any certain radial, is the one shown in yellow on the figure - a "diamond".
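The direction-dependent limit can be sketched numerically. This assumes the square Nyquist region described above (the function name is my own):

```python
import math

PPI = 720.0
BASE_NYQUIST = PPI / 2          # 360 cy/in along rows/columns

def nyquist_along(theta):
    """Nyquist limit of a square pixel grid for a sinusoid whose
    variation runs in direction theta (radians from horizontal).
    The grid's square Nyquist region has half-width 360 cy/in on the
    axes, so its boundary in direction theta lies at
    360 / max(|cos(theta)|, |sin(theta)|)."""
    return BASE_NYQUIST / max(abs(math.cos(theta)), abs(math.sin(theta)))

f_hv = nyquist_along(0.0)           # 360 cy/in on a horizontal path
f_diag = nyquist_along(math.pi / 4) # ~509 cy/in on a 45-degree path
```

Feeding these limits back through the 72/(pi r) relation gives the smaller radius on the diagonals, which is why the boundary pinches inward there into the "diamond".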

Well! But if that is in fact the line inside which the frequency of the pattern of radials is higher than the Nyquist frequency, we would expect to find no "detail" inside that line - the pattern of radials would not be conveyed by the pixel pattern, and we would see only boring gray.

But in fact we do see some detail, but not a very good portrayal of the radials. What is happening here?

Well, I forget exactly how Bart generates the underlying star pattern, but I think it is done in a vector-like way so that it, in effect, has infinite resolution. Then he "samples" that to form the pixel pattern in the delivered test target.

But once we get inside the yellow "Nyquist frequency line", the (circumferential) frequency of the radial pattern is greater than the Nyquist frequency. So the actual pattern is not represented (it can't be), but rather some other pattern - we have aliasing.

Now as we get near the center of the pattern, we find that the amplitude of the aliasing drops to the point where it can no longer be seen. Interestingly enough, the boundary at which that happens is roughly circular. I have shown that boundary (as I rather arbitrarily located it) with the red circle. The circumferential frequency to which it corresponds is about 1.66 times the Nyquist frequency, or about 599 cy/in.

Do the spurious components in the test pattern inside the Nyquist diamond cause any complications in the determination of printer resolution? I don't know. Hopefully not. But, when examining a test print, I find a similar pattern of (relatively faint) "swirls and swooshes" as we near the center, not a polite drop-off of the pattern of radials (still straight) into innocuous gray. This makes it difficult to decide what diameter of "gray spot" to use to adjudge the resolution exhibited by the printer. I will have to ponder that matter when I am fully awake.

It might be desirable in a star test target to "band-limit" the underlying theoretical pattern to the Nyquist frequency (mindful that this is different along the various radials), thus eliminating aliasing in the pixel representation of the pattern we start with.

I wouldn't be surprised if this has been discussed in the literature of star pattern testing.

pro member

It might be desirable in a star test target to "band-limit" the underlying theoretical pattern to the Nyquist frequency (mindful that this is different along the various radials), thus eliminating aliasing in the pixel representation of the pattern we start with.

It's always a dilemma whether to use a stricter band-limiting filter or not. Formally, one would need to eliminate everything within your white circle, but that would throw out some useful diagonal resolution (between the white circle and the yellow diamond). Also, band limiting cannot in practice be that abrupt, so it would already need to reduce amplitude outside of the white Nyquist circle, which would be a shame.

So I opted for an unaltered pattern, including some marginal aliasing. The amount of blur generated in lens+camera testing is usually severe enough to totally blur both central resolution and aliasing. It is only when testing print resolution that all pixels will be used one-to-one, without size reduction due to optical projection to a smaller size.

There can be some confusion as to where exactly the cutoff diameter is, but here the aliasing can even help! When the radials start their hyperbolic divergence from the expected direction, and they get fatter instead of narrower, we are obviously looking at the aliases that get mirrored around the Nyquist frequency of our pixel/sensel grid.
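That mirroring around Nyquist is easy to demonstrate numerically. A small NumPy sketch with made-up numbers: a 500 cy/in sinusoid (above the 360 cy/in limit) sampled at the 720 px/in grid pitch shows up in the spectrum at the mirrored frequency instead:

```python
import numpy as np

FS = 720.0                       # samples per inch (the pixel grid)
F_REAL = 500.0                   # cy/in, above the 360 cy/in Nyquist

# One inch of a pure sinusoid, sampled at the grid pitch.
n = np.arange(720)
signal = np.sin(2 * np.pi * F_REAL * n / FS)

# The sampled signal's spectrum peaks at the alias, 720 - 500 = 220,
# i.e. mirrored around Nyquist: 360 - (500 - 360) = 220 cy/in.
spectrum = np.abs(np.fft.rfft(signal))
f_alias = int(np.argmax(spectrum))
```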

Another effect that can sometimes be observed is that a radial near the horizontal or vertical direction suddenly jumps one pixel position: the radial suddenly goes from black to white, and the neighboring radial from white to black. It is a phase effect that's often seen with very high quality camera lenses, which can resolve detail with enough modulation to exceed the Nyquist frequency of a sensor (notably sensors without an optical low-pass filter (OLPF) in their filter stack).
It is caused by an ever so slight rotation of the pattern versus the sampling grid (the target itself is of course perfectly aligned with the pixel grid, so the only hyperbolic phase effects are visible just outside of the Nyquist limit, close to the horizontal/vertical radials).

With lens+camera testing, the quality of the Raw converter also plays a role. Cameras with a Bayer color filter array (CFA) have a different Nyquist frequency for the Green filtered sensels than for the Red and Blue filtered ones. That presents the Raw converter with the difficult task of reconstructing actual detail and color near Nyquist limits that differ due to sampling position and density.
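As a back-of-the-envelope illustration of those differing per-channel limits (the pitch value below is hypothetical, and real demosaicing is far more subtle than this geometry alone suggests):

```python
import math

def bayer_nyquist(pitch_mm):
    """Rough per-channel Nyquist limits (cy/mm) on a Bayer CFA with
    sensel pitch p. Red and blue sensels sit on a square grid of
    pitch 2p; green sensels form a quincunx (diagonal) grid whose
    Nyquist region is a diamond."""
    full = 1 / (2 * pitch_mm)               # full mono grid, h/v
    return {
        "luma_hv": full,                    # all sensels, horiz./vert.
        "green_hv": full,                   # quincunx matches h/v limit
        "green_diag": full / math.sqrt(2),  # but drops on diagonals
        "red_blue_hv": full / 2,            # 2p pitch halves the limit
    }

limits = bayer_nyquist(0.004)   # e.g. a (hypothetical) 4 micron pitch
```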

Active member

It's always a dilemma whether to use a stricter band-limiting filter or not. Formally, one would need to eliminate everything within your white circle, but that would throw out some useful diagonal resolution (between the white circle and the yellow diamond). Also, band limiting cannot in practice be that abrupt, so it would already need to reduce amplitude outside of the white Nyquist circle, which would be a shame.

So I opted for an unaltered pattern, including some marginal aliasing. The amount of blur generated in lens+camera testing is usually severe enough to totally blur both central resolution and aliasing. It is only when testing print resolution that all pixels will be used one-to-one, without size reduction due to optical projection to a smaller size.

There can be some confusion as to where exactly the cutoff diameter is, but here the aliasing can even help! When the radials start their hyperbolic divergence from the expected direction, and they get fatter instead of narrower, we are obviously looking at the aliases that get mirrored around the Nyquist frequency of our pixel/sensel grid.

Yes, seeing that structure clearly on my printer test prints, I essentially conclude that the exhibited resolution is very near the Nyquist limit.

Of course it is the fact that this printer exhibits a resolution so near the Nyquist limits that brings about this "dilemma" at all! It is what we may call a "happy problem".

Another effect that can sometimes be observed is that a radial near the horizontal or vertical direction suddenly jumps one pixel position: the radial suddenly goes from black to white, and the neighboring radial from white to black. It is a phase effect that's often seen with very high quality camera lenses, which can resolve detail with enough modulation to exceed the Nyquist frequency of a sensor (notably sensors without an optical low-pass filter (OLPF) in their filter stack).

It is caused by an ever so slight rotation of the pattern versus the sampling grid (the target itself is of course perfectly aligned with the pixel grid, so the only hyperbolic phase effects are visible just outside of the Nyquist limit, close to the horizontal/vertical radials).

With lens+camera testing, the quality of the Raw converter also plays a role. Cameras with a Bayer color filter array (CFA), have a different Nyquist frequency for the Green filtered sensels than for the Red and Blue filtered ones.

Yes. I think that situation is in part responsible for some of the misunderstanding about the OLPF.

By the way, I notice some "hyperbolic" art well outside even the white circle. I have conjectured that this might be due to the fact that the luminance pattern across a radial is not perfectly sinusoidal. (It couldn't be, as that would require infinite precision in the underlying numerical construction of the function.) So perhaps the fundamental frequency of the circumferential variation is accompanied (at low amplitude) by some harmonics (likely mostly odd). In the region of interest, those frequencies would fall outside the Nyquist limit and thus create their own hyperbolic artwork.

Is that a credible conjecture?

I wrote that piece after, at about 4:45 am, I awoke and couldn't go back to sleep. After it was done and posted, I went back to bed and slept until 9:30 am!

Reading it will perhaps have the same effect on our colleagues!

Thanks again for your wonderful work in bringing to us tools for measuring the performance of our toys, and for working with me as I try to reconcile basic and idealized theoretical concepts with actual behavior.