James Photography
Scanner Bake-Off 2004 Results: Notes and Observations




The Results Are Obvious
First off, this is not a scientific test. Certain variables we have no control over will skew the results, so I'd hate to hear criticism that this comparison effort is fundamentally flawed; it's quite apparent the test has demonstrated wide variations in sharpness between different scanner makes and models, which was the entire point of this exercise.

I ran the images through the MTF analysing software a few times, and saw between 1 and 3 percent variation in each run with different samples of the scans, so these MTF ratings are a fair representation. The MTF50 stats confirm what you can see... but read on...

Why Do Some With a Lower MTF Look Sharper?
This is where the uncontrollable variables raise their ugly heads. It turns out that image gamma, and hence brightness/contrast values, influence the MTF rating. To do a more comprehensive test, each and every scan would also need a known output gamma, and black/white points of somewhere around 27 and 242 for the USAF target's features just adjacent to the edge being evaluated. (Thanks for that info, Bart...) But that's a bit beyond the scope of the original intent of this bake-off, so interpretation of both the measured and subjective results must be weighed with this knowledge in mind. Just how much gamma skews the results isn't exactly known, but as previously stated on another page, the more results we get for the same scanner, the more we can "average" the results.
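For the curious, here's a rough sketch of why gamma matters. This is my own illustration using a made-up synthetic edge - it is not the actual MTF software, and the 27/242 numbers are just the black/white targets Bart suggested. The same physical edge, encoded at different gammas, measures out to a different 10-90% rise:

    import numpy as np

    def scanned_edge(gamma):
        """A synthetic black-to-white edge as a scanner might record it at a given gamma."""
        x = np.linspace(-3.0, 3.0, 61)              # position across the edge, in pixels
        linear = 1.0 / (1.0 + np.exp(-2.0 * x))     # idealised smooth edge, 0..1
        encoded = linear ** (1.0 / gamma)           # the gamma-encoded pixel values
        return x, 27 + (242 - 27) * encoded         # scaled toward the suggested 27/242 points

    def rise_10_90(x, profile):
        """Distance, in pixels, for the profile to climb from 10% to 90% of its range."""
        lo, hi = profile.min(), profile.max()
        t10, t90 = lo + 0.1 * (hi - lo), lo + 0.9 * (hi - lo)
        return np.interp(t90, profile, x) - np.interp(t10, profile, x)

    for g in (1.0, 1.8, 2.2):
        x, prof = scanned_edge(g)
        print("gamma %.1f: 10-90%% rise = %.2f pixels" % (g, rise_10_90(x, prof)))

Same edge, different numbers - which is exactly why scans submitted at unknown gammas can't be compared down to the last percent.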

Also, the portion of the image I used to measure sharpness may not have been the same area participants used for focusing. I tend to think this is as much of a contributing factor as gamma, but that's just an educated guess. Again, getting dead-on results for a scientific study would mean a LOT more work and time; I've established enough of a baseline of data for practical comparisons between scanners.

The percentage difference among all the higher-end scanners isn't a lot, so there's no need for any pissing contests. Some common sense can be applied here. It is apparent that all the scans resolving much below 18 line pairs per millimetre are fuzzy, and those above are great. In the end, both the MTF stats and the clip images in the far right column tell enough of a story when tempered with knowledge of the uncontrollable variables.
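To put that 18 lp/mm figure in perspective, here's a quick back-of-the-envelope aside of my own (not part of the measured results): the theoretical sampling ceiling for a given scanner DPI at the film, assuming nothing else gets in the way. Real resolving power sits below these numbers because of the optics, which is rather the point:

    # Back-of-the-envelope only: the Nyquist sampling ceiling for a given scanner DPI,
    # i.e. the finest line pairs per millimetre the sampling itself could ever record.
    # Real resolving power sits well below this because of the scanner's optics.
    for dpi in (2800, 4000, 5400):
        lp_per_mm = dpi / 25.4 / 2.0
        print("%d dpi -> roughly %.0f lp/mm sampling limit" % (dpi, lp_per_mm))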

Not Everything Was Included
There were some submissions with Mac OS results, as well as wet vs. dry mounts and different scanning software... I included a couple of things, but not all. That'll come in the future; for now I only wanted to compare apples to apples, insofar as that is possible...

"...But the image wasn't sharp enough"
It was suggested in one person's email that the target image wasn't clear and sharp enough to differentiate the high-end scanners, and that I should have used a B+W neg of Kodak's Technical Pan because of its very fine grain. That *may* be true, as some higher-end scanners are pretty neck-and-neck. I've been able to determine that my target slide's resolution works out to about 3 pixels; that is, it takes roughly three pixels to transition from white to black.
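For anyone wondering how a figure like that gets estimated, here's a hypothetical illustration. The row of greyscale values below is invented for the sake of the example, not taken from the actual target scan; it stands in for one line of the scan crossing a white-to-black edge:

    import numpy as np

    # Made-up sample values representing one scan line crossing the edge.
    row = np.array([243, 241, 240, 205, 128, 55, 31, 29, 30], dtype=float)

    white, black = row.max(), row.min()
    hi_thresh = black + 0.9 * (white - black)   # still "white enough"
    lo_thresh = black + 0.1 * (white - black)   # already "black enough"
    in_transition = (row < hi_thresh) & (row > lo_thresh)
    print("transition width: about %d pixels" % in_transition.sum())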

It must be noted that this is a real-world study using real-world equipment, yielding real-world results. No one scans 25 ASA B+W negs to any degree that would make such a test realistic for the crowds of people who mostly scan colour slides. This is a "street-level" study of the practical qualities of equipment that most serious pros and amateurs can afford, using realistic parameters.

The "Blob" that Eat My Scanner
The unmistakable smudge near the centre of the image is my printer's version of "...hmmm, I don't know what to do with this number 6, so I'll just puke it wherever I want". Not to worry. The resolution of the line pairs it obliterates far exceeds my camera and film's resolving power anyway. And it turns out other printers do the same thing, depending on the printer driver.

Not Everything is Measurable...
What cannot be accounted for are the variables introduced by operator error, focusing, scanner software, image gamma, compression algorithms, and the resolution of the film/lens combination (though that is consistent in this run). So as more scanners of the same model are included, we can average the results and present a fairer comparison.

What I can tell you is that, looking at the original printed USAF chart, my slides of the chart don't resolve down to the same level as the original. That's to be expected; the slide also carries the limitations of my film and lens resolution, as previously mentioned. But at the same time, no scanner has been able to reproduce an edge as sharp as the one on the slide itself. Still, the magnified output at 4000 and 5400 DPI is pretty decent - these high-end scanners certainly do the job. The adage "you get what you pay for" comes to mind. It's too bad the same high-quality optics can't be included in the mid-range 2800 DPI scanners around the $500 price point. This test has shown that manufacturers DO have the option of placing high-end optics in lower DPI scanners, but they won't do it.

Just for the heck of it...
...I had a buddy enter an image taken with his Canon 10D DSLR and a 50mm prime lens. Incredible. The MTF was 62.49, almost 3 times that of the highest-rated scanner, and the pixel rise was 1.04. This tells us that a slide shot with a camera and lens simply lacks the sharpness needed to differentiate the higher-end scanners, as I suspect superior optics can define sharper lines than my target image contains. Then again, coming up with a slide image *that* sharp would be a challenge... which brings us back to the "street-level" results I was looking for in the first place. Something to chew on. Here's a link to the actual results.

Last Note
Not all participants have sent in their results yet, so this document is a work in progress.


The National Federation Against Cruelty to Scanners attests that no equipment was harmed in the making of this website.






Typical Disclaimer Stuff
There are no alliances or special considerations when it comes to brands, as this scanner bake-off is not an endorsement of any product or manufacturer. Any and all participation is voluntary. All results are impartial and fair, so if anyone takes offence, or even wants to litigate, they can bite me.