"Chris Bore" <
[email protected]> wrote in message
news:
[email protected] oups.com...
> Image 'spatial frequency resolution' seems to be used to mean:
> - the highest spatial frequency that can be resolved.
> Experimentally, an image of sinusoidal intensity is displayed and the
> contrast varied until the viewer can see the bar pattern.
>
> This is using 'spatial frequency resolution' to measure our ability to
> resolve adjacent light and dark features.
>
> I am interested in spatial frequency resolution as in, ability to
> resolve adjacent spatial frequencies (I think this is closer to a normal
> definition of 'frequency resolution').
>
> For example, if we have an image with two textures, each representing a
> spatial frequency, then what separation is required between those
> spatial frequencies so that we can resolve them (in the frequency
> spectrum)?
>
> I think this will be related to the 'size' of the image relative to the
> detail within it - similar to 1D DSP where the 'total sampling time'
> determines the frequency resolution.
>
> But I can't find any references to this topic. So will be grateful for
> any comments or pointers.
>
Hello Chris,
I haven't seen you post here in a while - welcome back.
It seems like what you are asking about actually spans several photographic
topics.
The first deals with raw resolution. With 35mm camera systems it is not
unusual for lenses to resolve anywhere from 75 to 120 line pairs per mm.
Leica had one that actually resolved over 200 lp/mm! The film's area (35mm
format) is 24 by 36 mm. However, extensive sampling of slides taken by
professional photographers shows their resolution tops out at 73 lp/mm, and
most were under 60 lp/mm - and these are the quality images. The practical
limitation stems mostly from camera handling - mirror slap and shutter
shake - and a little from dye bleed (diffusion) in the film.
Many of today's digital SLRs use APS-sized sensors - about 16 by 24 mm -
while a few models have "full frame" sensors with 24 by 36 mm areas. Some
of the lower-end models with, say, 6MP basically hit the Nyquist limit
around 40 or so lp/mm. The densest sensor is currently in the Nikon D2X,
which crams 12.4MP into a 16 by 24 mm sensor; it tops out at over 60 lp/mm.
Some amateurs have been disappointed with the D2X, since its high
resolution reveals any weaknesses in their lenses and camera handling
technique.
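As a rough back-of-the-envelope you can estimate these limits yourself.
Here is a minimal sketch in Python - the 0.65 "effective" factor for the
Bayer interpolation and anti-alias filter is my own fudge, not a standard
number, and the pixel counts and sensor widths are approximate:

def nyquist_lp_per_mm(pixels_wide, sensor_width_mm, effective=0.65):
    # one line pair needs at least two pixels, so the raw (geometric)
    # Nyquist limit is half the pixel density
    raw = pixels_wide / (2.0 * sensor_width_mm)
    return raw, raw * effective

# 6MP APS-sized sensor, roughly 3008 pixels across ~23.7 mm:
print(nyquist_lp_per_mm(3008, 23.7))   # ~63 raw, ~41 effective lp/mm
# 12.4MP D2X, roughly 4288 pixels across ~23.7 mm:
print(nyquist_lp_per_mm(4288, 23.7))   # ~90 raw, ~59 effective lp/mm

The effective numbers line up with the 40-ish and 60-ish lp/mm figures
above.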
The anti-alias filter is sometimes described as a blurry piece of glass.
Actually these are made of exotic materials, and on the newer DSLRs they
not only perform the anti-alias function but also IR and UV rejection. My
D100 sees IR from my studio strobes, so when they are used in combination
with sunlight I end up with mixed color lighting. With a D2X, however, the
strobes' color matches the sun's.
The sensor, as you are probably aware, uses a Bayer color filter array -
except for Foveon sensors. This has the property of sampling the green
channel at nearly twice the rate of the red and blue channels, so where a
camera manufacturer has used a weak anti-alias filter, the problem of
color moiré arises. Kodak's DCS-14N is a 14MP full-frame camera with no
anti-alias filter; its replacements - the Pro SLR 14/n and Pro SLR 14/c -
needless to say, did.
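To make the sampling-rate point concrete, here is a trivial count over one
Bayer tile (just the standard RGGB layout, nothing camera-specific):

# one 2x2 Bayer tile repeats across the whole sensor
tile = [["R", "G"],
        ["G", "B"]]
sites = [c for row in tile for c in row]
for ch in "RGB":
    print(ch, sites.count(ch) / len(sites))   # G: 0.5, R and B: 0.25 each

Detail beyond the red/blue Nyquist rate aliases in those sparser channels
first, which is exactly where the color moiré shows up.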
Also, unlike film, digital sensors really prefer that the light strike
them at a nearly normal angle, so today's sensors have micro lenses. When
light strikes the sensor at strongly oblique angles (wide-angle lenses,
especially on full-frame cameras), the image suffers from light falloff
and exaggerated chromatic aberration.
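To put numbers on the falloff, here is the textbook cos^4 illumination law
by itself - real sensors do worse, since deep photosites and the microlens
geometry add their own losses:

import math

for theta_deg in (0, 10, 20, 30, 40):
    rel = math.cos(math.radians(theta_deg)) ** 4   # relative illumination
    print(theta_deg, round(rel, 2))
# at 40 degrees off-axis cos^4 alone is ~0.34, about 1.5 stops down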
Currently I don't know of a single standard by which to measure today's
optical systems. And yes, photography is moving towards the system concept
and away from the idea of the sensors and lenses working independently.
Lateral chromatic aberration and vignetting are easily handled in post
processing now. Plus, both Canon and Nikon offer lenses with reduced image
coverage for cameras with APS-sized sensors. These newer wide-angle lenses
are being designed with nearly telecentric optics to ward off such
problems; with film, wide-angle lenses were simple retrofocus (the
opposite of telephoto) designs.
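For the vignetting part, a toy flat-field correction shows the general
idea (this is my sketch, not any particular raw converter's method):

import numpy as np

h, w = 4, 6
# gain map from an evenly lit frame: bright center, ~30% darker corners
flat = np.fromfunction(
    lambda y, x: 1.0 - 0.3 * ((y - (h - 1) / 2) ** 2 +
                              (x - (w - 1) / 2) ** 2) /
                       ((h - 1) ** 2 / 4 + (w - 1) ** 2 / 4),
    (h, w))
img = np.random.rand(h, w)             # stand-in for a captured image
corrected = img * (flat.max() / flat)  # boost the corners back up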
When you talk about comparing textures of two similar fabrics, one of the
things that comes to mind is the idea of local contrast as espoused in
Ansel Adams' Zone System. Typically fabric detail is low contrast, which
means it probably spans 1/2 to 1/3 of an f-stop in brightness, so your
optical system will have to be specified to a spatial frequency at a given
low contrast.
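If I do the arithmetic, a 1/3-stop span is a luminance ratio of 2^(1/3),
about 1.26, which works out to a Michelson contrast of (1.26 - 1)/(1.26 +
1), roughly 0.12 - so the system MTF at the texture's spatial frequency
has to stay above about 12% for the detail to survive.

As for your actual question, I think the 1-D intuition you describe
carries straight over: an image N samples across at pitch dx has FFT bins
spaced df = 1/(N*dx) cycles/mm, so the total extent sets how close two
texture frequencies can be and still appear as separate spectral peaks.
A quick sketch (the numbers and the 10% peak threshold are my own choices,
and in 2-D the same 1/extent spacing applies along each axis):

import numpy as np

N, dx = 512, 0.01              # 512 samples at 0.01 mm pitch = 5.12 mm
x = np.arange(N) * dx
df = 1.0 / (N * dx)            # FFT bin spacing, ~0.195 cycles/mm

def peak_count(sep, f1=10.0):
    # two equal-strength "textures" at f1 and f1 + sep cycles/mm
    sig = np.sin(2 * np.pi * f1 * x) + np.sin(2 * np.pi * (f1 + sep) * x)
    mag = np.abs(np.fft.rfft(sig * np.hanning(N)))
    big = mag > 0.1 * mag.max()          # ignore window sidelobes
    return sum(1 for i in range(1, len(mag) - 1)
               if big[i] and mag[i] >= mag[i - 1] and mag[i] >= mag[i + 1])

print(peak_count(5.0 * df))    # separated by five bins -> 2 peaks
print(peak_count(0.5 * df))    # closer than one bin -> 1 merged peak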
I hope I've given some food for thought.
Clay