The highly configurable RED Epic Dragon is the latest version of the firm’s original Epic-M Digital Still and Motion Camera (DSMC). Although we are more used to analyzing still cameras at DxOMark, we’ve had a unique opportunity to assess RAW output from a prototype using all of our usual industry-standard tests. As the results were extraordinary, we thought we would share the findings. Read on to find out how this high-performance sensor measures up.

Further readings for the RED Epic Dragon review: First camera to break the 100-point DxOMark sensor score barrier!

To provide photographers with a broader perspective on mobiles, lenses, and cameras, here are links to articles, reviews, and analyses of photographic equipment produced by DxOMark, renowned websites, magazines, or blogs.

A cinema camera just showed it can outclass full-frame SLRs from Canon and Nikon that have sensors nearly twice as large. When it comes to image sensor quality, a camera company you probably haven't heard of just stole the show -- almost.

Over the past few years, DxOMark's tests and scores have provided a numerical measure for the Raw image quality potential of cameras. They've therefore been debated at some length by enthusiasts looking to see where their favourite model stands in the rankings. Now there's a new DxOMark Sensor score champion, but perhaps surprisingly, the first camera to break the 100-point barrier isn't a full frame model from Nikon or Sony, but the RED Epic Dragon movie camera. How did it manage that?

DxOMark has undertaken one of its detailed sensor-based tests on a very different camera than usual: it has put the RED Epic Dragon under the microscope, and what the site found was a camera sensor better than anything else it has tested, by a significant margin.

This just in: the DxOMark scores have been published, and the new 6K RED Epic Dragon camera beats the old king of the hill, the Nikon D800E. The first video sample for the Dragon surfaced in August 2013, and we've been excited to see how it would perform ever since. The RED press release surprised many by advertising the camera for both video and stills. At around 19MP, this camera system might just give high-end Nikon and Canon DSLRs a run for their money.

Clocking in at an impressive 101 points on its scale, DxOMark's new review of the RED Epic Dragon sets a new standard for color depth and dynamic range, all from an APS-H sensor. The high-end digital cinema camera retails for $29,000 for the sensor module alone, and despite being a video-oriented device, it does well enough with still images to outperform the likes of the Nikon D800E, or even medium-format Phase One backs.

Until today, the Nikon D800E stood at the top of the DxOMark totem pole with an amazing sensor score of 96. But the champion has been unseated, and not by some Canon or Nikon full-frame camera or one of the impressive new APS-C shooters we’ve seen recently. No, the D800E has been decisively knocked out of first place by the RED Epic Dragon video camera.

Comments

RED EPIC vs. Nikon D800e

Testing methods of any kind are always a hint as to which technology we should use for our purposes. As lifelong Nikon users, we have watched the development of digital cameras with great satisfaction, and the results from our D800E in studio work are simply mind-blowing. In combination with modern multi-shot techniques like stitching, DFF, and a raw workflow, I would consider the results state of the art. Just go ahead and compare them with a slide. We have 6 x 7 cm and even 4 x 5 inch slides, and still I prefer the digital output.

So a few years ago we bought the RED Epic, at the time with the Mysterium-X sensor. It has now been upgraded to the Dragon. Last week we went to Iceland to shoot the active volcano Bárðarbunga from the ground and the air. I was strictly using the RED, as under normal circumstances it is more equipment than one man can handle at a time. My partner was shooting a Nikon D800E and a Canon 5D Mark II.

Seeing the results makes the story very short. The RED is completely out of this world when it comes to motion pictures, especially in high-contrast environments like white steam clouds in the sun and black lava in the shade, shot from a helicopter. You would be completely lost with a normal video camera. But when you take a single frame out of the stream, it is miles away from the quality of the D800. We were shooting 250 ISO, 85 fps, and 1/160 sec exposure time on the RED, and 1000 ISO, 1/1000 sec on the Nikon/Canon.

So these cameras are simply made for different purposes, and that is the message of this posting. If you want a really serious video/movie camera, it would be the Epic or Scarlet Dragon. If you want the best possible image from a DSLR, it would be the D800. Seen from this strictly experience-based point of view, I do see some accordance with the DxO testing results.

Re: Functionality

Freedom

Each person is free to run his own tests (and start his own site). I don't say you can't criticize: you're free to do that too. I just say that if you think you know better, do it! For my part, I think DxOMark is the best source I've found.

RED Epic Dragon

-.-

Once again we hear a DxO proclamation that this or that provides the best image quality over this or that... yawn. Yet little has changed over the years: photographers continue to purchase, in greater numbers, the cameras that are nowhere near the top of the grand list. Why, you ask? Because the testing methods DxO uses do not mean anything to the majority of the public; they do not take into account the range of operation, the range of output, or the range of use of the product in question.

To give an example: the current publicly ranked 'king,' the D800E, can only produce its 'advertised' results at ~100-200 ISO (plenty of studio work in that range), and only in the form of a ~75% downsampled ~8MP image. While that may be fine for a percentage of D800/E users, the reality (and the reason for its corporate-revenue-killing sales) is that the majority of users want to output full-resolution 36MP images in some form: everything from full-resolution prints (landscape, macro, sports) to the 'croppers' (wildlife, sports, daily shooters, journalists, etc.), where IQ performance at 1:1 is what matters. In this area DxO is apparently under the impression that no one uses a digital camera for anything but 100 ISO, 8MP, 8x10 output, so there is no need to evaluate that area of performance in its ranking. Even that thinking would be fine if said camera produced its ranked results throughout its operating range.

**It is here we find the crux of the story.** It does not take much effort to click a few options in the comparison tool to see that the D800/E are the worst-performing Nikon FF cameras when using native resolution in any form, and at base ISO! Even if one were to speculate that the performance curve from native to a 75% downsample was in their favor, one would be wrong. It also does not take much effort to see that of the cameras that outperform the D800/E (D4, D4s, D600, D610, D3s, D700), their performance at their native resolutions surpasses the D800/E even when it is downsampled to those resolutions. Meaning: even when downsampled to 24MP, the D800/E cannot match the output of the D610 at its native 24MP; it cannot match the 16MP performance of the D4 even when downsampled to 16MP.

Understand this is not a bash on Nikon or the D800/E; they are merely the most identifiable examples for the two primary questions DxO fails, for unknown reasons, to satisfactorily answer.

1: What possible reason could there be that DxO does not 'rank' sensors based on their entire operating range? We all understand that a certain part of the photography world rarely has to go above 100 ISO, but these tools are not designed solely for 100 ISO. In actuality, the rest of the photography world shoots considerably higher than base ISO. Surely that means a sensor's testing, measurement, and ranking should reflect its performance across its ISO range, not just at one setting.

2: What possible reason could there be that DxO does not 'rank' sensors based on their entire output range? We all understand that there are certain areas of photography that operate in the 8x10 world, but there are exponentially more images printed in magazines and other media that are much smaller. So instead of focusing on what one camera can achieve at some arbitrary output setting, DxO could test, measure, and rank the performance of the device across all possible output ranges, allowing all equipment being tested to benefit accurately from the same percentage of downsampling, and then rank the device based on an average over that full scope of its performance.

It has been mentioned by DxO that the ~8MP base-ISO output fills some esoteric demand in the photography industry and therefore represents an established criterion for measurement. However, that niche was established back in the film days, when an 8x10 or 8x12 was essentially the maximum output of 35mm film, not to mention the largest size the average person could reliably get printed. One could print larger, but if limiting the print to the highest level of detail, sharpness, and color, whether scrutinized by eye or loupe, that was around 8x11. Today we have magazines that are A3/B3 in size, so focusing on the size of the print is not really the optimal measurement criterion. We are also in a time when digital media outstrips print media by a quantum measure, so an 8MP image sample becomes even more arbitrary when the typical 1080p monitor only needs 2MP of data to fill it. We understand that there is a threshold, a point of diminishing returns, in downsampling, so why not decide on a percentage as opposed to an output size? If a digital image from something like a D800/E no longer returns appreciable benefits beyond, say, a 75% downsample, then pick that as the ratio at which all images are measured. For a 16MP sensor, a 75% downsample is not going to result in a 240-300dpi 8x10, but it would still produce an image size used significantly across other printed media. More importantly, it represents the beneficial result of a specific downsample that any sensor can achieve and be fairly measured on. The higher the megapixel count of the sensor, the more assurance that there is nothing more to be gained and, most importantly, nothing lost, so no brand or model is disadvantaged due to megapixel count versus downsample ratio.

The most significant discrepancy, however, is in the sensitivity range of the equipment being tested, as the 'scoring' clearly does not include any accounting for the poor performance curve of some equipment. Here again, DxO has commented that there would be no clear range in which to measure and rank all devices equally, given the advancement of technology. That is simply a load of self-absorbed, empirically asserted horse manure. Just pick a range, any range, and go with it. If the majority of digital imaging devices today are designed to operate between 100 and 6400 ISO, then pick that range, or 100-3200 ISO. Older devices that cannot achieve the chosen ISO range have already been surpassed by modern devices in every way, so they would continue to be ranked below newer devices regardless. As newer devices are released, they should improve on that performance range as a mere consequence of reaching for higher ISO ranges. So if camera XX today achieves an average ISO score of 2900 over a designed 100-6400 ISO operating range, and two years from now camera YY extends the operating range to 100-51200 ISO without improving, or even while degrading, the previous performance, then the maker is not working in the right direction and would deserve a lower score; otherwise its score within the same range should increase, regardless of any higher operating ranges offered, and the ranking would continue to reflect that. See, it isn't that hard to really 'measure' and 'rank' the equipment without giving an advantage to one or an artificial disadvantage to another.

More than anything else, it would accurately test, measure, and 'rank' the devices on what they were designed to do, for any user, under all design conditions, instead of by some esoteric criteria. The measurement and ranking methods used by DxO are certainly unique to DxO, and perhaps that is why it feels so validated in its empirical proclamations. To give some perspective on the anomaly, though: if this testing, measurement, and ranking method were applied to anything else, we would have the best-performing automobile based on the speed it can achieve while coasting downhill with a tailwind, two of its four windows rolled down, and tires under-inflated by 15psi. Of course, one could apply those 'settings' to every product tested and therefore feel the results are quantified, but there are so many more variables of a product's design and operating range that determine its performance that any badge of accomplishment not based on, or averaged over, overall operation is meaningless.


Re: -.-

I understand your point, but DxO simplified many things so that ordinary consumers can make a sound judgment about which is better, albeit in a small but still relevant typical usage range. High-ISO performance can easily be achieved by tweaking and sacrificing low-ISO performance. This seems to be what Canon is doing now, which is why the performance of their sensors at high ISOs remains competitive with Nikon despite using far older sensors and technology.

In my opinion and personal preference, low/base ISO performance is still relevant and a top priority (unless or until smaller-format cameras can compete with FF/MF cameras of the same generation at low ISO).

On topic, this post of DxO's is the poorest of them all. Why include and hype a video camera in a sea of still cameras?

As an authority in sensor benchmarks, I would like them to investigate the image quality differences between CCD and the new CMOS implementation in MF cameras.

Re: -.-

You seem grossly misinformed about the image-size issue. DxO has to choose a standard image size in order to compare all of the images fairly. Yes, DxO could use a larger image size, but that would only shift all of the cameras' scores together; relative to each other, the scores would remain the same. Comparing a normalized 8MP picture and a normalized 16MP picture won't be very different in terms of relative performance.

And take note, 1 or 2 points in DxOMark scores are well within the margin of error. I often see people comparing one sensor with another that has about a 100 ISO difference, which is just 1/10th of a stop for larger sensors.

Yes, DxO could score the entire operating range of a camera, but that would really be unfair to cameras that offer a greater operating range, since, as we all know, image quality normally suffers at the extremes of that range. And in fact, they do measure the entire operating range of a camera: just look at the specific scores. For a fair comparison, you can't fold all of those into the overall score. The overall score is just a simplification of performance according to their set standards. For more detailed information about your use case, why not delve deeper into those specific scores? With so many variables, it's not possible to include all of them in the final score. The only way is to find a set of measurements that allows effective comparison between sensors.
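The normalization argument above can be sketched numerically. This is a minimal illustration, not DxO's actual pipeline: it assumes a uniform gray patch, independent Gaussian noise, and a simple block-average downsample, under which averaging n pixels reduces noise by sqrt(n), so sensors of different pixel counts converge toward comparable noise once rendered at a common output size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical sensors shooting the same uniform gray patch:
# identical per-pixel noise, different pixel counts (values illustrative).
signal, sigma = 100.0, 10.0
img_hi = signal + rng.normal(0, sigma, (600, 600))  # "high-resolution" sensor
img_lo = signal + rng.normal(0, sigma, (400, 400))  # "lower-resolution" sensor

def downsample(img, factor):
    """Average non-overlapping factor x factor blocks of pixels."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Normalize both to the same 200 x 200 output size.
out_hi = downsample(img_hi, 3)
out_lo = downsample(img_lo, 2)

# Averaging n pixels shrinks noise by sqrt(n):
print(out_hi.std())  # ~ sigma / 3  (9 pixels averaged)
print(out_lo.std())  # ~ sigma / 2  (4 pixels averaged)
```

The point of the sketch is only that normalization rescales every sensor's noise by its own downsample factor, so the comparison stays relative; choosing a different common output size would shift all results together.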

Anyway, my point is, DxO isn't wrong in its way of testing sensors or its method of scoring them. But for anyone other than a beginner, or for anything beyond a very general use case, you would go deeper and compare sensors in a more precise way. The detailed DxOMark measurements are there for exactly that.

Re: -.-

That is a very long post. You could have saved a bunch of time by just posting this instead: "I have no clue why DXO equalizes images in order to make comparisons fair, nor do I know how to read their charts showing the test results at all ISOs".

Maybe if you post questions next time, instead of a rant, then you could learn how to interpret DXO test results.

specs


Re: specs

Hi Nacho,

Sorry for the late answer. As for whether the RED Epic Dragon is comparable to a photography camera, it is particularly hard to say, but it should soon be possible to select the best frame in a video sequence. This allows us to think that, for some specific uses, it could be used instead of a still camera. The minimum shutter time is very short (far faster than 1/8000), and the maximum shutter time is limited by the frame rate you use.

Red Dragon Results

As I understand it, multi-sampling is similar to in-camera image stacking. So if two exposures are combined, the effective ISO is half that used for the individual exposures, and the combined sensel has an apparent FWC twice as large.
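The stacking arithmetic can be checked with a quick simulation. This is a simplified sketch under assumed numbers (a flat scene with independent Gaussian noise per frame), not a model of RED's actual processing: averaging two exposures cuts the noise by sqrt(2), i.e. about a 3 dB SNR gain, which is the same effect as doubling the effective full-well capacity or halving the effective ISO.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two captures of the same flat scene with independent noise
# (signal and sigma are illustrative values).
signal, sigma = 50.0, 8.0
frames = signal + rng.normal(0, sigma, (2, 500, 500))

# Two-frame average, as in-camera stacking would do.
stacked = frames.mean(axis=0)

print(frames[0].std())  # ~ sigma
print(stacked.std())    # ~ sigma / sqrt(2)

# SNR gain of the stack over a single frame, in dB (~3 dB expected).
gain_db = 20 * np.log10(frames[0].std() / stacked.std())
print(gain_db)
```

Each doubling of the number of averaged exposures adds roughly another 3 dB, or half a stop of dynamic range, which is why temporal averaging can make a sensor measure better than its single-exposure physics would suggest.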

Some questions:

1. Your noise correlation detection method picks up spatial NR. Can you introduce a method that will detect temporal NR?

2. Regarding the ISO 104 figure you measured: isn't this equivalent to the same rendered brightness level as a single ISO 104 shot? So a 2x multi-sampled image that measured ISO 104 would actually be combining individual exposures, each using approx. ISO 208? Isn't the measured ISO of 104 on a 19MP APS-H-sized sensor indicative that the achieved FWC is not abnormally large, multi-sampling or not?

3. How is motion blur prevented between the individual exposures in the multi-sample? Would this mean that fast pans or high-speed action would look unpleasant if recorded at 24fps?


Re: Red Dragon Results

The RED cameras have an HDR mode that takes two exposures per frame in camera and then combines them into a single frame in post. That mode has the motion-blur issue you mention in your post, but DxO did not measure the performance of the Dragon sensor in that mode. The DxO analysis is for the normal single-exposure mode.

Re: Red Dragon Results

Quote:

That mode has the motion blur issue you mention in your post. But DxO did not measure the performance of the Dragon sensor in that mode.

As I understand it, in the standard mode of operation, multiple exposures are also performed to make one video frame. This is not for HDR reasons per se (HDR mode, as I understand it, would involve combining shots taken at different exposure settings and then tone-mapping the result into a more usable DR), but rather for temporal NR. This temporal NR produces the extremely high measured DR, unexpected in an APS-H 19MP camera.

If I'm correct in this understanding, then motion blur could still be an issue.

Simply amazing!

Thank you, DxO for this outstanding review of this amazing camera! I hope that you expand your domain to include other video cameras. Really, there is a lack of solid information about video image quality, and you would do the world a service to provide it.

RED Dragon Results

As a cinematographer (35 years) and a still photographer (50 years), the results of this test both surprise and don't surprise me. The manufacturers of motion picture cameras have been putting a LOT of effort into duplicating the dynamic range of film, but still shooters are, of course, the masters of analyzing individual frames. Motion picture cameras need to be able to process and spit out files at 24 frames per second at a minimum, and cameras like the Dragon can hit up to 300 frames per second! You can tell that they are doing a LOT OF PROCESSING by how (very) hot they can run, but then they are designed to handle the heat. The Dragon has two extra fans compared to the Epic that preceded it, and its main fan was upgraded. Note that when rolling, the camera fans have to go into quiet mode so that the sound department doesn't hear them.

As a cinematographer ... I'm very proud that a little company like RED, compared to titans like Nikon/Canon/Sony, has beaten them. I haven't read the entire test, but NOTE that the Dragon can also shoot in HDR mode for even greater DR! AND the generally acknowledged KING OF DR in the movie biz is the ARRI Alexa camera. EVERY movie that was nominated for a BEST PICTURE or BEST CINEMATOGRAPHY award for 2013 (last night's awards) was shot on an ALEXA! Looking forward to your test of that masterpiece.

Recently it came to light that the original OLPF was flawed and is now being replaced. The flaw mainly concerned flares: http://www.motionvfx.com/mblog/red_dragons_awful_red_flare_getting_fixed,p3083.html


Re: RED Dragon Results

Yes, I would love to see the results from Arriraw from the Alexa as well. While it's not a camera that could be used for most stills applications, the test would be very beneficial to those of us in the motion world. Thanks for this test of the Dragon sensor. Very informative.

Re: RED Dragon Results

I wouldn't describe the winner (Gravity) as "Shot on Alexa," considering only about 5% of the frame was live action. If anything, for the second year in a row, the best cinematography award went to what was largely an animated film.