Coming soon: Lens Reviews to return to dpreview.com

We're incredibly pleased to announce the imminent return of lens reviews to dpreview.com thanks to a joint venture with DxO Labs, involving the establishment of a dedicated DxO lens and camera testing facility in Seattle, and the incorporation of DxO test data into dpreview.com’s class-leading lens reviews. We're currently ironing out the last few bugs in the system, and hope to publish the first review later this week.

We launched lens reviews back in 2008 and they were an immediate hit, gaining praise for the unique user-friendly presentation of complex data via a patented test data widget, supported by numerous real-world sample images and expert commentary. Logistical issues put the lens reviews on an extended hiatus in late 2010, and the new venture with DxO Labs not only ensures the return of lens reviews to dpreview.com, but allows more of them to be produced, more quickly.

We're currently putting the finishing touches to a completely re-written version of our lens widget, which will use DxO data and present it in a familiar format.

The new lab is now fully operational, and we expect to publish the first review within the next week or so. The popular lens review data widget has been entirely re-written to allow dpreview.com’s visitors to visualize the test results from the new lab and compare lenses just as they could before. Dpreview.com’s highly respected lens expert, Andy Westlake, will once again produce the lens reviews.

As part of the joint venture agreement the test results obtained will also be made available on the DxOMark website (www.dxomark.com). DxO Labs and dpreview.com will also be collaborating on the testing of digital cameras with a view to adding even more valuable image quality information to dpreview.com’s legendary in-depth reviews.

'We are very happy to provide dpreview with our measurement technology for testing cameras and lenses' said DxO Labs CEO Jerome Meniere. 'Dpreview’s articulate and creative writing style makes difficult photography concepts accessible to even the most novice of photographers – they are a perfect complement to DxOMark’s scientific measures.'

Simon: Producing the data for lens reviews is an incredibly long-winded process requiring a large, dedicated studio and hundreds, sometimes thousands of high precision exposures and measurements. The establishment of a dedicated DxO Labs lens and camera testing facility on our doorstep allows us to entrust the measurement and studio testing of lenses to an established world leader in image quality analysis, and to work with its team to offer our readers the perfect combination of accurate, consistent measurements with real-world shooting experience and expert analysis.

I think cooperation with DxO is a step back for this site. DxO's reviews of optics (above all the rankings) are often misleading for users and potential buyers. There are better alternatives on the market.

EW... and with this news, I stop paying any attention to DPR's lens reviews. Thanks for the heads up!

Btw, DxO's ratings exist purely to create buzz for their software. At one point, their licensing validation software amounted to a rootkit. Remember that? It's not good software. Maybe they've changed their ways and gone more honest, but it would take a lot to convince me to take a risk after being bitten by a company like that.

DPR, I appreciate that lens testing is hard work, but I would personally prefer to see you gather your own data, even if it is slow going. Or at least I'd prefer that to anything to do with DxO.

I always found that my lens performed very closely to what DXO reviews have shown. I'm certainly not going to give credibility to an Amazon user that wouldn't know a good image if it hit him in the head.

As your user name implies, "novice". Bought?? Well known by whom, you? Maybe you can post some real evidence to prove your slanderous claim that they are bought. What did they do, give one of your loved lenses a low rating?

@goblin - You are totally right. I am sure DxO didn't find any "surprising measures" when Nikon ISO 6400 is actually 4500, or Sony ISO 6400 is also not even 5000. Same with Canon too! Wow! Only Olympus "cheats!" This "cheat" was only brought up when the E-M5 was released. How convenient.

'It seems quite obvious from comments by Simon that DXO will be playing an ever increasing role at DPreview...unfortunately' - not true at all. We may use some of their raw data (emphatically not their scores) to add to areas we don't currently review (such as raw DR)

That is great news! I was referring to a reply you gave to a request to incorporate DXO into camera tests and I guess I read more into your response than there was. I have been reading DPReview since about 2004 (although only recently as a registered member) and consider it one of my favorite photography sites. I would hate to see it diminished as an information source.

I knew that no lens reviews had been posted in a while, but I never knew they were on indefinite hiatus. Good to see them back; there are quite a few lenses I'd like to see tested. I did notice that under the old testing regimen you picked one mount for a lens like a Sigma and didn't test it on any other mounts. Personally I'd like to see some results for multiple mounts with third-party lenses. Also, I think that a standardized camera level should be chosen, and kept over a period of time, for instance Canon's 5D Mk III and Nikon's D600 (I chose those two because their pixel counts are similar and therefore should return more closely related results than the D800) or a 1D X and D4, and then also a shorter test on current consumer cameras as well, so as to show the capabilities independent of a professional camera.

It would be nice, though, if DPreview would publish extensive details on how lenses are tested and how the ratings are calculated.

Another nice-to-have feature would be to group tests, so that several lenses from different brands but with similar specs are tested at one time, and readers could make up their minds about which lens is the most suitable for them.

@Simon, sorry to hear that you are selling out. DPreview previously was the benchmark for independent reviews. By tying in with the DxO lot I feel that the independence is lost - because DxO Tests are nothing more than a marketing vehicle for their software products. I have yet to see a single test by them that is not marred by extensive and exhaustive procedural mistakes and flaws that make them complete and utterly irrelevant for real life photography.

@ Karl Günter Wünsch - Wow, that's a bit harsh, don't you think? While you may be correct in your statement that DxO uses their tests as a sales vehicle for their software, they are nonetheless the de facto standard, are they not?

It seems to me that if someone's reviews/tests are to have meaning they need to be conducted in accordance with an established standard, or there would be no real means of comparison.

As for your comment: "I have yet to see a single test by them that is not marred by extensive and exhaustive procedural mistakes and flaws that make them complete and utterly irrelevant for real life photography." How did you come to this conclusion? What standards did you apply to whatever tests you ran to ascertain that theirs were faulty?

@Karl - You just need to understand what they mean by each factor. They are not biased or irrelevant, just a bit sophisticated for your simple mind. These are not computer benchmarks and are very relevant for real-life situations.

So you need to be "sophisticated" to understand how the D600 sensor is better than the 80 megapixel Phase One IQ180 since DXO rates it higher. I can think of several words other than sophisticated to describe people who believe that :-)

JKP already addressed it: We read the test reports in order to know more about a particular lens and to decide for or against it when we consider buying it. However, what we get to read is the test result measured on a lens copy which we'll not buy. Lenses are industrial products and their quality varies from sample to sample. This is absolutely normal and unavoidable; no manufacturing runs without tolerances, and lens production is no exception. The question is how strongly the quality fluctuates in the lens type of our interest. We need to know whether - when we buy this lens - there is a rather high probability that our copy will be as good as the tested one, or a high risk that we won't get what the tested sample promised.

It would be really great if dpReview & DxO would expand the test procedure in this direction. I guess that e.g. a fast assessment of the centering variation on at least 5 (better 10) lens samples would be appreciated by a lot of us.

50 isn't practicable. 5 is exactly five times more (i.e. better) than one, 10 would be still better and - I hope it - feasible.
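A rough sketch of how such a multi-copy assessment could be summarized (the function and the sample figures are hypothetical illustrations, not DPR's or DxO's actual method):

```python
import statistics

def copy_variation(copy_scores):
    """Summarize sample-to-sample variation for one lens model.

    copy_scores holds one figure per tested copy, e.g. the worst-corner
    sharpness of each of 5-10 samples. The coefficient of variation
    (stdev / mean) gives a single rough measure of how far a buyer's
    copy is likely to deviate from the reviewed one.
    """
    mean = statistics.mean(copy_scores)
    stdev = statistics.stdev(copy_scores)  # sample standard deviation
    return {"mean": mean, "stdev": stdev, "cv": stdev / mean}

# Hypothetical worst-corner scores for five copies of the same lens
scores = [31.0, 29.5, 33.0, 30.5, 26.0]
summary = copy_variation(scores)
```

A lens model with a low coefficient of variation across copies would give buyers much more confidence that their sample will match the reviewed one.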

Another important criterion is AF accuracy and repeatability. The best resolution measured in the lab using one lens sample after meticulous manual focusing is useless in real life if the AF doesn't work perfectly. A repeatability assessment is a question of statistical analysis. I know a German magazine publishing its test results concerning AF repeatability. This is a nearly unique service among the lens testing labs, but they only list the best and the worst result of ten measurements and measure the AF accuracy in percent. Nobody knows a) what 'percent' means in the assessment of AF accuracy, or b) whether the worst result (e.g. 20%) is an outlier or the rule, whereas the best result (e.g. 100%) is the outlier.
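The kind of statistical summary the comment asks for could look like this minimal sketch (the data, scoring convention, and 2-sigma outlier rule are all hypothetical choices for illustration):

```python
import statistics

def af_repeatability(runs):
    """Summarize repeated AF trials for one lens/body combination.

    Each entry in `runs` is a sharpness reading after a fresh autofocus
    cycle, relative to the best manually focused result (1.0 = perfect).
    Min/max alone (what the magazine reports) cannot distinguish a rare
    outlier from a lens that regularly misfocuses; mean, standard
    deviation, and an explicit outlier list can.
    """
    mean = statistics.mean(runs)
    stdev = statistics.stdev(runs)
    outliers = [r for r in runs if abs(r - mean) > 2 * stdev]  # 2-sigma rule
    return {"min": min(runs), "max": max(runs),
            "mean": mean, "stdev": stdev, "outliers": outliers}

# Hypothetical 10-run series: consistently good AF with one bad miss
trials = [0.97, 0.95, 0.98, 0.96, 0.94, 0.97, 0.95, 0.20, 0.96, 0.98]
report = af_repeatability(trials)
```

Here the 0.20 run shows up as a lone outlier against an otherwise tight cluster, which is exactly the distinction a bare best/worst listing hides.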

Excellent. With the sudden influx of large sensor cameras on the market or being announced, choosing lenses that get the best out of these sensors is going to become more important. So this move by DPR to restart lens reviews is really good news for consumers. 'If' this is done to the same standard as existing DPR reviews, the site should become an even more valuable resource.

Hello sxhortx - I looked at your link for the DxO lens test results, and I wonder... how can they compare the resolution of various lenses when the term "resolution" is an ill-defined measure (MTF at several spatial frequencies and field positions must be used to get objective results) and they test each lens on cameras with different pixel counts?

I really think that the only satisfactory way of testing lenses is independently of the camera, using equipment such as this: http://lenses.zeiss.com/camera-lenses/carl-zeiss-camera-lenses/industrial_lenses/products/lens-measuring-technologies.html#inpagetabs_41a6-0 I am sure that with the backing of Amazon DPReview could afford to purchase such equipment. The Zeiss lens testing equipment also has one important attribute that your methodology does not - it tests the lens at infinity focus; your tests cover near focus only.

Yes...let's see some lens tests for once, instead of these limited-usefulness system tests. Given the number of lenses, bodies and cross-platform adapters, systems tests are getting more and more useless by the minute.

Either use a lens projector, or use the same camera - with AA filter removed - for every lens test, using adapters to mount the lenses from other manufacturers. And use a method like Imatest that oversamples so pixel size doesn't matter much.

C'mon Joseph, he wrote "such as": the K8 is famous even among amateurs; it doesn't mean he's really suggesting they buy a K8... :)

BTW, K8 can reach 160 p/mm, for today's cameras with AA filters it's more than enough I think. Moreover we are actually using an even older Ealing and we have (almost) no problem with CSC cameras since we can test Sony E, Samsung NX, Fujifilm X and so on... tricky but not impossible ;)

The Zeiss system you recommend is a nice-to-have convenience, but is not a necessity. With proper knowledge and a calibrated target like http://www.aig-imaging.com/mm5/merchant.mvc?Screen=PROD&Store_Code=AIIPI&Product_Code=M-6&Category_Code=Sinusoidal-Precision-Sine-Test-Array

@cordellwillis - you need to get over this. The iPhone is a camera. We test cameras. We're not going to suddenly change overnight and only test cameras on phones, but we are going to respond to the very real demand for coverage of that segment of the market.

Please test another iPhone, errrr oops. iCamera so we can read about the DXO results. The iPhone is a great phone. Darn, I did it again. The iCamera is a great camera that just so happens to have a phone!

Great news, thanks for the announcement! Very useful part of the overall informative aspect of the DPR site. Speaking of optics, I wonder whatever happened with Origami Optics? It was announced a long time ago as soon-to-come, but I still haven't seen this excellent idea in everyday use. Maybe someone among your data gatherers could be bothered to get us up-to-date info about it? For those who want to know more, see http://www.jacobsschool.ucsd.edu/news/news_releases/release.sfe?id=617

One additional test that would be of immense interest to an (admittedly small, but growing) number of photographers would be to test lenses for infrared usability. It wouldn't have to be anything elaborate, but there are a bunch of us who would really like to know whether a lens has an IR hotspot before we spend good money for it, and currently there's no way to tell about a new lens until other IR folks start complaining about it.

First is the "Sony lesson". High profile entities come under a lot of fire for offering any sort of support for infrared. It's one thing for some guys in a basement like Lifepixel or Max Max to do it, but dpreview would have the same problems that Sony had, religious groups and rabid mothers screaming about perverts taking "x-ray pictures" of their precious children.

Second is that IR hotspots are highly dependent on the wavelength of IR used, and the characteristics of a particular sensor, and that of the filter used for the IR mod. So there's no one test. That's also why you see so many "independent" tests disagreeing about particular lenses.

For what it's worth - I wouldn't bother with DxO results, their testing procedures are so far off any reality that these values serve no one - as often as they contradict reality they should really refrain from publishing anything. Anyone trusting their data is IMHO up a creek without a paddle...

Perhaps DPReview should get a cat as a mascot and to photograph "real world" sample images when testing lenses and cameras. The eyes, fine hair and whiskers of a cat provide a wealth of detail and texture. It would also be much more fun for users to compare sample images of a cat than those boring test shots, and you could keep us up to date on his/her exploits.

Sounds good, although I'm not all that familiar with DxO lens testing.

Even better would be somehow incorporating collaboration with Roger Cicala at LensRentals. He makes what I believe are valid points about sample variations between lenses. There are "good copies" and "bad copies". They have several to dozens of duplicates of the same lens, and perhaps 50+ of the most popular lenses.

However, those lenses are sent back and forth as rentals, and not necessarily treated all that well. Another type of useful information would be their repair frequency statistics.

Also, I suppose DxO/DPR will micro-focus-adjust the lens to body? But I suppose that doesn't matter if you use LiveView at 10x magnification for critical focus?

A very useful test bench would involve a compact sensor positioned behind a universal lens mount with MF, so you could compare Canon vs. Nikon vs. Sony. u4/3 or NEX would work in the abstract, but a smaller, higher-resolution sensor would be able to tell you a lot more.

A person would be a fool NOT to know how a lens performs before buying. If you don't see reviews, how would you know its quality until you try it? I can tell from the first images if a lens is a dog, but I'm certainly not going to buy a lens that all the reviews show not to be good. If a new lens came on the market and you bought it, wouldn't you be a fool if it wasn't good and you hadn't looked at any of the "measurements"? I'd say that's being a fool.

This sounds very good!! On a side note, does this text have to be so... marketing'y? It would sound so much more natural and real if the following words were removed:
- Popular
- Highly respected
- legendary
- world leader

Great to hear of your new resolve to give this prime topic a shift in focus. That DxO is hand-holding should also lead you to some sharper content. The frequency of your output needs to be in contrast to the past, which was, at best, shaky.

I have looked far and wide for something along these lines; I hope this new venture puts a cap on it.

DxO reviews are the most comprehensive ones available, that's great news.

Just a comment.

In about every test methodology, lenses are tested at (an unknown) close focusing distance only, which is frustrating since lens performance might vary significantly across the range of focusing distances, from the closest up to infinity.

More and more we see lenses that test successfully but disappoint in the real world due to poor performance at infinity. So, imo, if you had plans to include an appreciation of lens performance at infinity (other than sample shots), that would be another great improvement.

The experience of using DxO Optics Pro for years and closely following DPR growing to excellence for many years made me jump! The combined expertise will no doubt bring joy to enthusiasts. Nice move DPR, congrats!

Please don't make any use of the DxOMark 'Scores'. They are so misleading as to give higher 'scores' to lenses that are clearly worse, optically.

If the presentation is similar to your old system, then OK.

Although, I will never understand the reluctance to determine and publish lens resolution on a scale that is independent of camera sensors. Buyers are better served by knowing the lens's absolute limit, not what it achieves on this sensor or that sensor.

I'm a little confused here. DxO pretty much slammed the Canon 70-200 2.8 IS II, while you guys gave it your 'gold' award. How exactly is the merging of your reviews going to rectify such a huge difference of opinion?

We're not merging reviews. We're using test data which we will be carefully checking and managing. If that data doesn't agree with our findings we won't just publish it (in fact this process will undoubtedly improve DxO's lens tests by more readily identifying sample variation outliers).

I for one disagree with DxO results very often. By the way, I agree with DPReview on the 70-200 F/2.8 IS II: it is THE BEST 70-200 lens across all manufacturers. Andrew was right and DxO, as their conclusions often are, was dead wrong.

By the way, photozone slammed the new 24-70 F/2.8L II while Roger Cicala said it was incredible.

Excellent decision, dpreview! The challenge will be to provide meaningful analysis of the data that even the technically challenged can find useful. Really looking forward to this! It's all about the glass!

@ljfinger: This only means we cannot compare lens performances across different platforms. But each lens test on its own can still reveal important and relevant clues such as edge vs center performance, vignetting, distortion...

Secondly, just like DXOMark sensor data, lens performance data per se without attachment to actual cameras is useless to real world users.

"This only means we cannot compare lens performances across different platforms."

And across different cameras on the same platform. Really it means you have to test every lens on every camera it could be used on. Given adapters and cross-platform use, this makes the number of permutations and combinations nearly endless.

The only valid lens resolution tests are the ones with infinite pixels, by your logic. A 15MP camera is still gonna get 15MP even if it loses its AA filter, only the ones with AA filters are slightly worse. A conventional 35MP camera is still going to beat a 15MP sensor without an AA filter in terms of resolution.

"A 15MP camera is still gonna get 15MP even if it loses its AA filter,..."

Not necessarily. The IMATEST approach uses an oversampling method, which can therefore examine spatial frequencies accurately far above the Nyquist frequency, if the AA filter is not present.

http://www.imatest.com/docs/sharpness/

"Briefly, the slanted edge method calculates MTF by finding the average edge (4X oversampled using a clever binning algorithm), differentiating it (this is the Line Spread Function (LSF)), then taking the absolute value of the fourier transform of the LSF. The edge is slanted so the average is derived from a distribution of sampling phases (relationships between the edge and pixel locations). The algorithm is described in detail here."

"The four bins are combined to calculate an averaged 4x oversampled edge. This allows analysis of spatial frequencies beyond the normal Nyquist frequency."