Next: Dynamical Measurements Up: Five Stages of Colloidal Previous: Linking Locations into Trajectories

Error Estimates and Optimal Settings

A simple model suffices to gauge the performance of the brightness-weighted centroid estimation. Although scattering of light by submicron dielectric spheres is fairly complicated [15], a typical sphere's image is reasonably well modeled by a Gaussian surface of revolution,

  $A(x, y) = A_0 \exp\!\left[-\frac{(x - x_0)^2 + (y - y_0)^2}{2 s^2}\right]$,   (5)

with apparent radius s, centered at (x_0, y_0). We assume implicitly in eqn. (5) that the center coordinates (x_0, y_0) are registered with the camera's digitizing grid. This need not be the case. If the estimating mask is not much broader than the image, then uneven clipping at the edges skews the centroid estimate. Say the ideal image were offset along one of the grid's axes by a small amount ε which we wish to estimate. The corresponding error in the displacement estimate due to clipping is

  $\varepsilon_c \approx -\frac{2 w}{\sqrt{2\pi}\, s}\, \exp\!\left(-\frac{w^2}{2 s^2}\right) \varepsilon$.   (6)
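The first-order clipping bias can be checked numerically. The sketch below is an illustration, not code from the paper: it samples a one-dimensional Gaussian image of apparent radius s on a pixel grid, computes the brightness-weighted centroid over a mask of half-width w, and compares the resulting bias against the small-offset estimate −ε (2w/√(2π) s) exp(−w²/2s²); the parameter values are arbitrary choices.

```python
import numpy as np

# Assumed illustration parameters (not from the text): apparent radius s,
# mask half-width w (both in pixels), and true sub-pixel offset eps.
s, w, eps = 2.0, 5.0, 0.1

# Sample a 1-D Gaussian image A(x) = exp(-(x - eps)^2 / (2 s^2)) on the
# pixel grid, clipped by a centered mask of half-width w.
x = np.arange(-w, w + 1)
A = np.exp(-(x - eps) ** 2 / (2.0 * s ** 2))

# Brightness-weighted centroid over the clipped mask, and its bias.
centroid = np.sum(x * A) / np.sum(A)
bias = centroid - eps

# First-order analytic estimate of the clipping bias.
pred = -eps * 2.0 * w / (np.sqrt(2.0 * np.pi) * s) * np.exp(-w ** 2 / (2.0 * s ** 2))
print(f"measured bias = {bias:.5f} px, first-order estimate = {pred:.5f} px")
```

The discrete sum agrees with the continuum estimate only to within a factor of order unity, since the pixelated mask effectively extends half a pixel beyond w; the sign and magnitude of the bias are what matter here.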

The value at each pixel, furthermore, has an associated measurement error due to noise of rms magnitude A_n, which contributes

  $\varepsilon_n \approx \frac{A_n}{A_0}\, \frac{w^2}{4 \sqrt{\pi}\, s^2}$   (7)

to the error in estimating ε. The expected average displacement for an ensemble of spheres is ⟨ε⟩ = 1/4 pixel, and we estimate the noise-to-signal ratio A_n/A_0 by measuring the rms variation in the background brightness. The combined error for locating stationary particles, √(ε_c² + ε_n²), appears in Fig. 4 for typical values of s and A_n/A_0 and has a minimum value somewhat better than 0.05 pixel in each direction for the optimal choice of w. A conservative estimate for the position measurement error in our system therefore is 10 nm.
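The competition between the two error terms can be illustrated numerically. The sketch below is not from the paper: it assumes a per-coordinate clipping bias of magnitude ⟨ε⟩ (2w/√(2π) s) exp(−w²/2s²) and a pixel-noise error (A_n/A_0) w²/(4√π s²), then scans the mask radius w for the minimum combined error. The values s = 2 pixels and A_n/A_0 = 0.01 are representative assumptions, not measurements.

```python
import numpy as np

# Assumed values: apparent radius s = 2 pixels, noise-to-signal ratio
# A_n/A_0 = 0.01, and mean sub-pixel offset <eps> = 1/4 pixel.
s, noise, eps_mean = 2.0, 0.01, 0.25

w = np.linspace(s, 5.0 * s, 400)   # candidate mask radii, in pixels

# Clipping bias (first-order, per coordinate) and pixel-noise error.
eps_c = eps_mean * 2.0 * w / (np.sqrt(2.0 * np.pi) * s) * np.exp(-w ** 2 / (2.0 * s ** 2))
eps_n = noise * w ** 2 / (4.0 * np.sqrt(np.pi) * s ** 2)

total = np.sqrt(eps_c ** 2 + eps_n ** 2)   # combined locating error
i = int(np.argmin(total))
print(f"optimal mask radius ~ {w[i]:.2f} px, minimum error ~ {total[i]:.4f} px")
```

The clipping term falls off exponentially with w while the noise term grows as w², so the combined error passes through a shallow interior minimum; with these assumed parameters it sits well below 0.05 pixel, consistent with the figure described in the text.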

Using video images to make uncorrelated measurements of fluctuating particle locations requires the shutter interval, τ, to be considerably shorter than the 1/30 sec interval between consecutive images. Video cameras such as the NEC TI-324A have adjustable shutters with exposure times ranging from 1/60 sec down to 1/10000 sec. Image quality considerations dictate using the longest exposure time consistent with the largest acceptable particle motion between frames. Shortening the exposure time, however, reduces the contrast level A_0 and also may increase the noise level A_n in some cameras. The dependence of the relative noise level on the adjustable parameters is given by the rule of thumb

  $\frac{A_n}{A_0} \propto \frac{M^2}{\tau}$,   (8)

where M is the system magnification. The choice of magnification thus is constrained by two mutually incompatible considerations: increasing the apparent particle size s and reducing A_n/A_0. Studies of ordering in suspensions further require as many spheres as possible to be in the field of view and so place an additional constraint on M. Our system produces images of acceptable quality for τ ≥ 1 msec.
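As an illustration of this rule of thumb, the helper below rescales a noise-to-signal ratio from one operating point to another, assuming the ratio scales as M²/τ. The reference values (M = 100, τ = 1 msec, ratio 0.005) are made-up placeholders, not figures from the text.

```python
def relative_noise(M, tau, M_ref=100.0, tau_ref=1e-3, ratio_ref=0.005):
    """Noise-to-signal ratio A_n/A_0 scaled from a hypothetical reference
    operating point, assuming A_n/A_0 is proportional to M^2 / tau."""
    return ratio_ref * (M / M_ref) ** 2 * (tau_ref / tau)

# Doubling the magnification at fixed shutter quadruples the relative
# noise; halving the exposure time at fixed magnification doubles it.
print(relative_noise(200.0, 1e-3), relative_noise(100.0, 0.5e-3))
```

This makes the stated tension concrete: raising M enlarges the apparent particle size s but is quadratically expensive in relative noise, while lengthening τ buys noise performance at the cost of motion blur.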

Interlaced video images pose an additional problem for video microscopists studying rapidly moving particles. A single interlaced frame consists of two fields, one for the odd lines and one for the even. Usually, these two fields are not exposed simultaneously, but rather 1/60 sec apart regardless of the shutter speed. A particle which moves significantly in the period between the two field exposures will produce a jagged image such as that shown in Fig. 5. While some video cameras can be adjusted to produce non-interlaced images, not all video recorders and frame grabbers process such signals correctly. When interlacing poses problems, we analyze the even and odd fields separately, and thereby acquire data at 1/60 sec intervals. Since each field has only half as many lines as a full frame, the tracking accuracy is degraded and differs in the two directions. Whenever possible, we arrange our experiments so that interesting motion occurs along the row direction to exploit its higher spatial resolution.
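Separating the two fields of an interlaced frame is straightforward in practice. A minimal sketch follows; it is illustrative only, and which field was exposed first depends on the particular camera and frame grabber.

```python
import numpy as np

def split_fields(frame):
    """Split an interlaced frame into its two fields (rows 0, 2, 4, ...
    and rows 1, 3, 5, ...), which were exposed 1/60 sec apart."""
    return frame[0::2, :], frame[1::2, :]

# A toy 4x3 "frame": each field keeps the full horizontal (row-direction)
# resolution but only half of the vertical resolution.
frame = np.arange(12).reshape(4, 3)
field_a, field_b = split_fields(frame)
```

Each field is then located and tracked as an independent image, doubling the time resolution to 1/60 sec at the cost of halved vertical resolution, as described above.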



David G. Grier
Mon Mar 11 23:01:27 CST 1996