What's your Doh! Mark Score?

March 23, 2013  •  7 Comments

 

I'm rather a fan of DXOMark personally - and am often amused by others' reactions to the way it captures and presents camera sensor data. To be sure, the summaries are by their nature a 'boiling down' of a lot of information into an easily digestible and somewhat comparable format. But there is an awful lot of explanation available on their site as to how the information is measured and what it means - and, most importantly, what it does not mean. IMHO their sensor measurements are more useful than their lens measurements (ironically), and in any event it is certainly true that an individual photographer is always right to choose the rendition they prefer from any system. Nonetheless, I have never seen a DXO Sensor score for a camera with which I have experience that disagrees with my own 'unmeasured' feeling for the qualities of that sensor. I therefore consider it a very useful guide, for cameras I am thinking of buying, as to whether or not the imaging performance of their sensors will suit me.

But not everyone, it would appear, feels the same.

 

DXOMark on Camera Sensors

Regular as clockwork come the new threads: "DXO Mark Rates CaNiSon R2D3!" screams the OP, and then the hubbub grows. Suddenly, everyone's an expert - but many people's willingness to accept a DXO Mark score varies more or less in direct proportion to whether they own and like the camera in question and how well it was rated.

A classic example is the recent furore relating to DXO's scoring of Leica's M 240, M9, ME and M8 cameras. Here's an assortment of quotes from some well-known fora where these things are discussed...

"It just tests a few very specific properties of the cameras' sensors which however have no significant correlation with the cameras' image qualities whatsoever."

"Ken Rockwell in a laboratory"

"dxomark should be renamed "NikonPRMark". Bought and paid for."

"..how can anyone believe that such a scoring system can give any real appreciation of a camera and its usability?"

"As a man once said; 'There's lies, damned lies, statistics and then DxO benchmarks' "

"these kinds of tests are about as useful as dropping two cameras off a cliff and rating them by which one hits the ground first"

Doh!

So let's lay some ghosts to rest.

Firstly, DXOMark say quite explicitly: "DxOMark does not address such other important criteria as image signal processing, mechanical robustness, ease of use, flexibility, optics quality, value for money, etc."

In other words, it doesn't matter whether the battery lasts for three frames or if the ergonomics are worse than a brick: this is just about the sensor. And that is as it should be: there are plenty of other places to go if you want to know how the camera handles or if it has a reliable AF system.

Secondly, DXO are very clear that the 'main headline score' has three constituent parts, being Dynamic Range, Colour Depth and ISO performance, and that each of these metrics is of a differing level of interest for different kinds of photography: Landscape, Portrait and Sport. Sure, those categories aren't perfect, but for heaven's sake, it's pretty clear, isn't it?

Thirdly, many people resolutely refuse to understand that the scores are resolution independent. All cameras are normalised to an 8mp baseline and that is as it has to be: resolution depends on the number of pixels, the existence and strength of an AA filter and the quality of the optic attached. If you are measuring sensor performance only, you have to normalise to a baseline or you will automatically be measuring things other than the performance of the sensor. If you want to know about resolution, look at the lens tests, which are all specific to individual cameras and therefore to individual sensor/AA filter arrays.
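As a rough illustration of why that normalisation matters (this is my own sketch, not anything DXO publish): downsampling a high-megapixel file to a common 8mp output size averages neighbouring pixels together, which raises per-pixel signal-to-noise - and therefore the measured dynamic range - by a predictable amount, roughly half a stop per doubling of pixel count.

```python
# Illustrative sketch only (not DXO's published method or code): the approximate
# dynamic-range gain from normalising a sensor's output to an 8mp baseline.
# Averaging k pixels into one improves SNR by roughly sqrt(k), i.e. 0.5*log2(k) stops.
import math

def normalised_dr_gain_stops(sensor_megapixels: float, baseline_mp: float = 8.0) -> float:
    """Approximate extra stops of measured DR from downsampling to the baseline size."""
    k = sensor_megapixels / baseline_mp   # how many native pixels feed each output pixel
    return 0.5 * math.log2(k)             # sqrt(k) SNR gain, expressed in stops

# Example with a hypothetical 36mp sensor: about one stop of 'free' DR at print size,
# which is why per-pixel ('screen') and normalised ('print') figures differ.
print(round(normalised_dr_gain_stops(36.0), 2))  # ~1.08
```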

Fourthly, the way in which DXO captures, interprets and weighs the data is mostly pretty clear. The only exception as far as I can see is the question of how in-camera pre-processing of RAW data ("Part Baking") affects the measurability of SN ratios at high ISO. I've read all the stuff and I am still not 100% sure I understand it. It seems that they can detect when it is being done but it is not clear to me how they then take that knowledge into account.

But in general it is my contention that if you read the bumf, take the headline score for what it is, and have a good general understanding of photography, then a DXOMark score is incontrovertibly extremely useful. Especially if you understand that 'colour depth' (effectively the breadth of the sensor's output colour gamut) is not the same as 'colour rendition', which is the sensor's output gamut as interpreted by a RAW converter according to the profile it uses and how good it is at de-Bayering.

And back to those Leica scores: DXOMark said of the M9/M9P/ME sensor that "these cameras offer the worst image quality DxOMark have tested on a full frame sensor, with the exception of the 10-year-old Canon EOS 1Ds" and that seemed to annoy some people. It is, however, factually true, and whilst it certainly doesn't stop owners of those cameras from making amazing images, it seems plain wrong-headed to disagree with it as a finding.

Many complainants, though, seemed less churlish about the rather more flattering findings for the M-240. And in fact I think that they are even better than they look: at first glance, it looks as if the M-240 is the 11th highest ranking sensor they have ever tested. But if you ex-out the medium format cameras (because you're a 35mm kind of a guy/gal) then it is 8th. Now exclude the egregiously high-MP D800 and D800E, and the lower-MP D4, because you are looking for a 24mp-class camera, and you climb to 5th. Now look at the four cameras above you: three of them have, in effect, the same Sony 24.3mp sensor.

In other words Leica has managed, on its first outing with a CMOS sensor, using a manufacturer which is not widely known, and having to work around the very specific micro-lens requirements of the M mount, to come up with the third best sensor ever tested in its MP class. And to beat every Canon camera ever tested. Now that is pretty damned good.

Better still, if you are using those lovely F1.4 lenses and taking advantage of the low vibration shutter and no mirror slap, then it's really the colour depth and the DR that are of interest, and the M-240 scores very well on those metrics. So no wonder there were fewer howls of rage when DXO published their M-240 score.

 

DXOMark on Lenses (and this time the Doh's are on me)

Compared to lenses, sensors are quite easy to measure: they have fewer significant parameters, are subject to fewer permutations of use-cases and seem to be manufactured to a higher degree of uniformity. And so it is that while I have always found the DXO Camera Sensor Scores to be very useful and to chime closely with my own impressions, the lens measurements are, to me, far harder to understand and far easier to disagree with. This is for several reasons:

When it comes to DXOMark Scores, especially regarding zooms, some confusion reigns: in one place the rubric states:

"The DxOMark Score corresponds to an average of the optimal quantity of information that the camera can capture for each focal.The quantity of information is calculated for each focal length/aperture combination, and the highest values for each focal are weighted to compute the DxOMark Score."

Putting aside the potential confusion engendered by the poor translations (I assume a 'focal' is a 'focal length'?) which sometimes mar the site, there is the additional confusion of the apparent contradiction between the above and another statement on the site:

"The DxOMark Score shows the performance of a lens, mounted on a given camera body, for its optimal focal length/aperture combination and for defined exposure conditions."

As it turns out, both are true: it says elsewhere that 

"Each focal length/aperture combination provides a numerical value. The highest value is the DxOMark Score."

But just when you think you're catching on, you find another nugget:

"the highest values for each focal are weighted to compute the DxOMark Score."

I think I have an average ability to take in complex information, but boy do you have to work at it with DXO's lens scores -  and in this case I am still not sure I understand. There is further information available through yet more clickable links but at some point of delving into ambiguities and contradictions I gave up. Sorry.
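To make that ambiguity concrete, here is a purely hypothetical sketch - my own, with made-up numbers, since DXO publish neither their weights nor their code - of the two readings that their wording seems to allow.

```python
# Hypothetical illustration of the two possible readings of DXO's lens-score wording.
# The numbers below are invented; scores[focal_length][aperture] stands in for the
# "quantity of information" measured at each focal length/aperture combination.
scores = {
    24: {2.8: 17, 4: 19, 8: 18},
    50: {2.8: 20, 4: 22, 8: 21},
    70: {2.8: 16, 4: 18, 8: 17},
}

# Reading 1: "The highest value is the DxOMark Score" - a single best combination.
best_single = max(v for apertures in scores.values() for v in apertures.values())

# Reading 2: "the highest values for each focal are weighted" - take the best value
# at each focal length and average them (equal weights assumed, as none are published).
per_focal_best = [max(apertures.values()) for apertures in scores.values()]
weighted_average = sum(per_focal_best) / len(per_focal_best)

print(best_single, round(weighted_average, 1))  # 22 versus 19.7 - not the same number
```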

My working assumption is that the little graphic that expresses the DXOMark Score shows a black line for the average score within a blue 'zone' representing the range of scores over the entire focal length range - but I am unclear as to whether this also includes performance at the severely diffraction-limited smallest apertures. And if anyone from DXO is reading this: you have a link on your Testing Protocols page that promises more information about DXO Analyzer, but it leads to a page that... does not mention DXO Analyzer at all.

Moving on. There is no attempt made, as far as I can see (a prize of an Easter Egg to the first person who can show me DXO's protocol for determining shooting distances) to explain or elucidate the subject distances at which lens tests are made. This matters because some lenses are known to be better at close range than at landscape distances. Others have field curvature effects that are more or less significant at differing distances and apertures and in zooms, these can even change their type (curved "in" versus "out" versus "sombrero") at differing focal lengths. In fact field curvature is in my recent experience one of the main things that can severely compromise the performance of a lens.

Some lenses just can't 'do' sharp edges and some can, but not in the same plane as the subject - so though DXO's Acutance Mapping provides a very useful visualisation of flat-field sharpness across the frame as measured at one aperture and (presumably) one subject distance, it doesn't show you the 3D 'shape' of the field of focus, nor can it tell you how that shape varies with subject distance and with aperture. These are important things to know about a lens and in my experience can only be found by shooting 'in the wild' - though I must stress that I am merely a 'very interested party' rather than an expert in optical design and measurement, so maybe I'm missing something here?

Neither can I find any mention of focus shift, a phenomenon that can severely affect the focus performance of a lens. One might agree that lenses should all be tested via focus-bracketed series, preferably at the shooting aperture in magnified live view, but that is not always possible. And clearly Phase Detect AF systems are focussing wide open in any event. Focus shift is an optical property of a lens, not a feature of a focussing system, and should therefore IMHO be included. Nor is there a peep, for zooms, as to whether a particular model has a tendency to require widely differing AF Fine Tune values at either end of its zoom range. For Nikon owners in particular, this is very important information.

The frequency with which a given lens model suffers from decentering is another huge contributor to lens performance. With some current lenses, even ones which score very highly indeed on DXOMark, decentering seems almost endemic. And yet I can find no mention of it on DXO. And I have to add that if I were a manufacturer sending a copy to the DXO lab for review (or maybe they buy their own copies at random?) I'd make damned sure I sent them a 'good one'.

For these reasons and others (optical, such as bokeh, and non-optical, such as the performance of stabilisation systems) I find that though DXO lens scores are a rich and useful data point, I cannot read them with the degree of confidence that I can apply to their sensor scores. In addition, therefore, I also read results from my favourite lens review sites, such as Photozone, Cameralabs and DigLloyd, and I garner much useful information from the wonderfully experienced and generous readership of GetDPI.

Then, when I've waded through all that and finally purchased the damned lens, I go and shoot it. And that's when I find out what it's really like.

Coming next week: DXO has published a list of the best lenses for use on the D800 - next week I'll give my own, based on real-world considerations.

 

Comments

Tim Ashley Studio
Good point Nancy - and results from those cameras do look mouth-wateringly good!
NancyP(non-registered)
One thing that DxOMark sensor ratings don't include is consideration of the non-Bayer (non-RGBG) sensors. These include the Foveon sensors from Sigma (the DP and SD Merrills - of which I own a DP2M) and the X-Trans sensors of Fujifilm. So users of these non-Bayer sensors have to evaluate them in "real world" shooting conditions.
Tim Ashley Studio
Hi Leon,
I sold my M9 a while back because, having acquired a D800E, it rapidly became clear that there was simply no contest other than in certain shooting conditions. From my experience so far the M240 is a lot better than the M9 but, on file quality alone, still not up to the RX-1/D600 class of sensors, let alone to a downsampled D800 file. However, I will complete my work with the 240 over the next two or three weeks and will be reviewing it in some depth... and I expect to find that for many people, it could be 'just the ticket'...
Leon Roy(non-registered)
Tim, I've recently picked up an M9 as well, more as a portable alternative to the D800, and find the output fantastic provided the exposure is nailed. In that respect it reminds me a little of slide film.

Compared to the D800E however it's not even close when the ISO hits 1200 and above.

Have you had a chance to try the new Leica M240? Any feelings on how well it fares against the Nikon?
Tim Ashley Studio
I am with Leon on this one. Files from the M9 benefit from two real advantages (no AA filter, though this is matched by the D800E, and some of the very best lenses) and one illusory advantage, which is that lower DR files often look better 'out of the can', because very high DR files often look 'flat' when opened (but not further processed) in a RAW developer. An experienced photographer on a Phase One back, or a D800E (or indeed an RX-1 or D600 with appropriate sharpening), will quite easily be able to make a better file than can be had from an M9 under many shooting conditions. If you only shoot on bright but hazy days, you'll likely never see the advantage, but I gave up on the M9, despite having had a film M, an M8 and an M8.2 beforehand and being an avid rangefinder fan, precisely because the IQ was lagging what was available elsewhere by more of a margin than the lenses were making up for. And that was many, many months before DXO got anywhere near it. It was just clear from the files. Sorry!