Originally Posted by RichC
What none of these web comparisons addresses is the need to consider bottlenecks in equipment that impact the theoretical resolution of film – i.e. lens resolution and scanner resolution (assuming images will be scanned). Consider a typical scan of 35 mm film at a scanner resolution of 2700 ppi: the resulting file has dimensions of about 3800 × 2400 pixels = 9 MP – but this is not equivalent to a 9 MP image from a digital camera, because both are theoretical values based solely on the number of pixels, and fail to account for differences in resolving power between the two mediums. Comparing, say, a 20 MP film scan with a 20 MP digital camera image is thus comparing apples with oranges. What we actually need to compare is the resolving power of a frame of film and a digital sensor.
Just comparing two different 20 MP digital cameras is a flawed concept in and of itself. You have a lot of factors there: lens, size of the photoreceptors, pixel density, method of color interpretation (Bayer, etc.), as well as others I am too lazy to look up. Either way, you really need to stick with resolved line pairs per millimeter on the sensor to make a correct comparison here.
In your case, we are comparing a 20 MP scanner image vs. a 20 MP camera image. That complicates matters further, in that a given scanner will interpret the scene differently than the camera will. You are greatly increasing the number of variables.
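To see why pixel counts alone mislead, here is a minimal sketch of the arithmetic. It converts a scanner's ppi into pixel dimensions for a 36 × 24 mm frame, and separately converts an assumed film-system resolving power (the 50 lp/mm figure below is a placeholder, not a measured value) into an "effective" megapixel count using the Nyquist rule of two pixels per line pair:

```python
MM_PER_INCH = 25.4

def scan_pixels(width_mm, height_mm, ppi):
    """Pixel dimensions a scanner produces at a given ppi."""
    return (round(width_mm * ppi / MM_PER_INCH),
            round(height_mm * ppi / MM_PER_INCH))

def effective_megapixels(width_mm, height_mm, lp_per_mm):
    """Megapixels actually resolved, assuming 2 pixels per line pair (Nyquist)."""
    px_per_mm = 2 * lp_per_mm
    return (width_mm * px_per_mm) * (height_mm * px_per_mm) / 1e6

# A 2700 ppi scan of a 35 mm frame yields roughly 3827 x 2551 pixels (~9.8 MP of file),
# but at an assumed 50 lp/mm system resolution only ~8.6 MP of real detail is present.
print(scan_pixels(36, 24, 2700))
print(effective_megapixels(36, 24, 50))
```

The point: the file size (scan_pixels) and the resolved detail (effective_megapixels) are independent numbers, so a "20 MP scan" tells you nothing by itself.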
In order to do an experiment correctly:
+ Scene must be photographed the exact same way.
+ Sensor must interpret the light the same way.
+ Film scanner must interpret the film the same way as the digital camera interprets the light from the scene.
+ Film must be developed so the result is as close to the theoretical limit of the film as possible.
+ Same lens must be used, with no in-camera post-processing.
Food for thought: how can you reduce the variability?