Fast feature identification for holographic tracking: The orientation alignment transform
Abstract.
The concentric fringe patterns created by features in holograms may be associated with a complex-valued orientational order field. Convolution with an orientational alignment operator then identifies centers of symmetry that correspond to the two-dimensional positions of the features. Feature identification through orientational alignment is reminiscent of voting algorithms such as Hough transforms, but may be implemented with fast convolution methods, and so can be orders of magnitude faster.
Holographic microscopy records information about the spatial distribution of illuminated objects through their influence on the phase and intensity distribution of the light they scatter. This information can be retrieved from a hologram, at least approximately, by reconstructing the three-dimensional light field responsible for the recorded intensity distribution (1); (2). Alternatively, features of interest in a hologram can be interpreted with predictions of the theory of light scattering to obtain exceedingly precise measurements of a scattering object's three-dimensional position, size and refractive index (3). The availability of so much high-quality information about the properties and motions of individual colloidal particles has proved a boon for applications as varied as product quality assessment (4), microrheology (5); (6), porosimetry (7), microrefractometry (8), and flow velocimetry (9); (10), as well as for molecular binding assays (9), and as a tool for fundamental research in statistical physics (11); (12); (13) and materials science (14).
Fitting measured holograms to theoretical predictions requires an initial estimate for each scatterer's position. This can pose challenges for conventional image analysis algorithms because the hologram of a small object consists of alternating bright and dark fringes covering a substantial area in the field of view (9). Here, we introduce a fast, robust and accurate feature-identification algorithm that not only meets the needs of holographic particle tracking, but also should be useful in other image analysis applications.
Figure 1(a) shows a typical hologram of a colloidal polystyrene sphere in water. This hologram was recorded with an in-line holographic video microscope (1); (2) using a collimated linearly polarized laser (Coherent Cube) for illumination. Light scattered by the sphere interferes with the rest of the beam in the focal plane of a microscope objective (Nikon Plan Apo, oil immersion, numerical aperture 1.4). The objective, in combination with a tube lens, relays the interference pattern to a video camera (NEC TI-324A II) with an effective magnification of 135 nm/pixel. The intensity distribution recorded by the video camera is normalized by a background image (3); (9) to suppress spurious interference fringes. Figure 1(a) shows a small region of the normalized intensity, b(r).
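The normalization step can be sketched in Python. The median-of-stack background estimate and the function name here are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def normalize_hologram(frames):
    """Normalize raw holograms by an estimated background image.

    Assumes `frames` is a stack of raw images in which the particle
    moves, so that the pixelwise median approximates the static
    background. Dividing by the background yields the normalized
    intensity b(r) and suppresses spurious interference fringes.
    """
    background = np.median(frames, axis=0)
    return frames / np.clip(background, 1e-6, None)
```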
The sphere's hologram features bright and dark circular fringes, all centered on a point in the focal plane that coincides with the sphere's center. This point could be identified by performing a circular Hough transform, which additionally would identify the radii of all the rings (15). Hough transforms, however, have a computational complexity of O(N³) in the number N of pixels on the side of an image (15), and thus are prohibitively costly. Variants of Hough transforms that identify centers but not radii can achieve a computational complexity of O(N² log N) (16).
More efficient searches for centers of rotational symmetry take advantage of the observation that gradients in the intensity of images such as Fig. 1(a) either point toward or away from the centers. Figure 1(b) shows the magnitude, |∇b(r)|, of the image's gradient. Each pixel in the gradient image, ∇b(r), is associated with a direction,
(1) \varphi(\mathbf{r}) = \tan^{-1}\left[\frac{\partial_y b(\mathbf{r})}{\partial_x b(\mathbf{r})}\right]
relative to the image's x axis. Figure 1(c) shows φ(r) for the image in Fig. 1(a). Each pixel therefore offers information that the center of a feature might lie somewhere along direction φ(r) relative to its position r. Voting algorithms (9) make use of this information by allowing each pixel to cast votes for pixels along its preferred direction, the votes of all pixels being tallied in an accumulator array. Hough transforms operate on a similar principle, but also incorporate distance information. Pixels in the transformed image that accumulate the most votes then are candidates for center positions, and may be located with sub-pixel accuracy using standard methods of particle tracking (17). Alternatively, each intersection between pixels' votes can be computed directly as the solution of a set of simultaneous equations (18). Voting algorithms typically identify the centers of features such as the example in Fig. 1(a) to within 1/10 pixel. Efficient implementations (9); (18) achieve a computational complexity that is linear in the number of contributing pixels. Achieving this efficiency involves first identifying pixels with the strongest gradients, typically by imposing a threshold on |∇b(r)|.
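The voting idea can be illustrated with a minimal Python sketch. This is a toy illustration, not the cited implementations; the threshold and reach parameters are arbitrary assumptions:

```python
import numpy as np

def vote_for_centers(b, threshold=0.5, reach=20):
    """Toy gradient-voting accumulator for centers of circular features.

    Pixels with strong gradients cast votes at pixels along their
    gradient direction; because gradients may point toward or away from
    a center, votes are cast in both directions. Centers of circular
    features collect the most votes.
    """
    gy, gx = np.gradient(b.astype(float))
    mag = np.hypot(gx, gy)
    acc = np.zeros_like(mag)
    ny, nx = b.shape
    ys, xs = np.nonzero(mag > threshold * mag.max())
    for y0, x0 in zip(ys, xs):
        c = gx[y0, x0] / mag[y0, x0]        # direction cosine along x
        s = gy[y0, x0] / mag[y0, x0]        # direction cosine along y
        for step in range(1, reach):
            for sign in (1, -1):            # bidirectional voting
                y = int(round(y0 + sign * step * s))
                x = int(round(x0 + sign * step * c))
                if 0 <= y < ny and 0 <= x < nx:
                    acc[y, x] += 1
    return acc
```

The sequential loop over above-threshold pixels is the structural feature that the convolution approach introduced below avoids.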
Here, we introduce an alternative to discrete voting algorithms that is based on a continuous transform of the local orientation field. This approach eliminates the need for threshold selection and further reduces the computational burden of localizing circular features in an image. The spatially varying orientation of gradients in b(r) may be described with the twofold orientational order parameter (19); (20)
(2) \psi(\mathbf{r}) = |\nabla b(\mathbf{r})| \, e^{2 i \varphi(\mathbf{r})}
The factor of two in the exponent accounts for the bidirectional nature of the orientation information obtained from gradients, as can be seen in Fig. 1(c). Weighting the order parameter by |∇b(r)| ensures that pixels in regions with stronger gradients contribute more to the estimate for the local orientation.
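Eqs. (1) and (2) amount to a few lines of numpy. This sketch uses a plain finite-difference gradient as a stand-in for a noise-suppressing derivative filter; the function name is illustrative:

```python
import numpy as np

def orientational_order(b):
    """Twofold orientational order parameter psi(r) of Eq. (2).

    phi(r) is the direction of the intensity gradient relative to the
    x axis (Eq. (1)); weighting by |grad b(r)| emphasizes pixels whose
    local orientation is well defined.
    """
    gy, gx = np.gradient(b.astype(float))       # finite-difference gradient
    phi = np.arctan2(gy, gx)                    # Eq. (1)
    return np.hypot(gx, gy) * np.exp(2j * phi)  # Eq. (2)
```

Note that psi is unchanged when a gradient reverses direction (phi to phi + pi), which is the bidirectionality encoded by the factor of two in the exponent.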
To identify symmetry-ordained coincidences in the orientation field, we convolve ψ(r) with the twofold symmetric transformation kernel,
(3) K(\mathbf{r}) = \frac{1}{r} \, e^{-2 i \theta}, where (r, \theta) are the polar coordinates of \mathbf{r},
to obtain the orientation alignment transform
(4) \Psi(\mathbf{r}) = \int K(\mathbf{r} - \mathbf{r}') \, \psi(\mathbf{r}') \, d^2 r' .
The phase of K(r) complements the phase of ψ(r), as can be seen in the inset to Fig. 1(c). The integrand of Eq. (4) therefore is real-valued and nonnegative along the line through r′ that is oriented along φ(r′), and is complex-valued along other directions. Real-valued contributions directed along gradients of b(r) accumulate at points in Ψ(r) that are centers of symmetry of the gradient field, as illustrated schematically in the inset to Fig. 1(d). Complex-valued contributions, by contrast, tend to cancel out. Centers of symmetry in b(r) therefore are transformed into centers of brightness in |Ψ(r)|, as can be seen in Fig. 1(d). The centroid of the peak then can be identified and located (17).
Circular features at larger radii from a center of symmetry subtend more pixels in b(r) and thus would tend to have more influence over the position of the center of brightness in |Ψ(r)|. The factor of 1/r in Eq. (3) ensures that all of the interference fringes comprising the hologram of a sphere contribute with equal weight to the estimate for the centroid.
The orientation alignment transform defined by Eqs. (2), (3) and (4) is related to the Fourier-Mellin transform, which is used to detect geometrically invariant features in images (21); (22). It can be computed efficiently using the Fourier convolution theorem,
(5) \Psi(\mathbf{r}) = \mathcal{F}^{-1}\left\{ \tilde{K}(\mathbf{k}) \, \tilde{\psi}(\mathbf{k}) \right\},
where ψ̃(k) is the Fourier transform of ψ(r), and where
(6) \tilde{K}(\mathbf{k}) = -\frac{2\pi}{k} \, e^{-2 i \theta_k}, with \theta_k the orientation of \mathbf{k} relative to the x axis,
is the Fourier transform of K(r). The orientation alignment transform therefore can be calculated by performing a fast Fourier transform (FFT) on ψ(r), multiplying by the precomputed kernel of Eq. (6), and then performing an inverse FFT. Computing the gradient image by convolution with a Savitzky-Golay filter (23) reduces sensitivity to noise in b(r) and can be performed in O(N²) operations. The transform's overall computational complexity is set by the O(N² log N) cost of the forward and inverse FFTs, and so is more efficient than voting algorithms. Rather than requiring sequential analysis of above-threshold pixels, moreover, the orientation alignment transform lends itself to implementation on parallel processors. Our implementation in the IDL programming language achieves real-time performance (30 frames/s) on a conventional desktop processor for holograms such as the example in Fig. 2.
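The full pipeline can be sketched as a pair of FFTs. This Python version is a stand-in for the authors' IDL implementation: it uses a plain finite-difference gradient in place of the Savitzky-Golay filter, builds the Fourier-space kernel directly on the frequency grid, and drops overall constants, which do not move the peaks:

```python
import numpy as np

def orientation_alignment_transform(b):
    """Orientation alignment transform |Psi(r)| of a normalized hologram b.

    Implements Eqs. (2)-(6): form the twofold orientational order
    parameter psi(r) = |grad b| exp(2i phi), then convolve with the
    kernel K(r) = exp(-2i theta)/r via the Fourier convolution theorem.
    The Fourier-space kernel is proportional to exp(-2i theta_k)/k.
    """
    gy, gx = np.gradient(b.astype(float))
    psi = np.hypot(gx, gy) * np.exp(2j * np.arctan2(gy, gx))

    ny, nx = b.shape
    ky = np.fft.fftfreq(ny)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    k = np.hypot(kx, ky)
    k[0, 0] = 1.0                       # avoid dividing by zero at k = 0
    kernel = np.exp(-2j * np.arctan2(ky, kx)) / k
    kernel[0, 0] = 0.0                  # discard the undefined dc term

    return np.abs(np.fft.ifft2(np.fft.fft2(psi) * kernel))
```

Locating the brightest pixel of the returned array and refining with a centroid fit (17) then yields the in-plane particle position.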
Figure 2 illustrates the orientation alignment transform's performance for identifying and locating multiple particles in a single image simultaneously. This hologram records twelve 3 μm diameter colloidal silica spheres that were arranged in four different planes using holographic optical tweezers (24). Despite interference between the spheres' scattering patterns and uncorrected motion artifacts in the hologram, each sphere's contribution to the hologram is resolved into a single peak by the orientation alignment transform. The peaks' locations are identified by crosses superimposed on |Ψ(r)|, and constitute estimates for the particles' two-dimensional positions in that hologram.
The widths and heights of peaks in |Ψ(r)| depend on the particles' axial positions, as can be seen in Fig. 2. Particles closer to the focal plane are transformed into sharper, brighter peaks. This dependence can be calibrated on a particle-by-particle basis to facilitate real-time three-dimensional tracking with minimal additional computational burden. Two-dimensional tracking requires no separate calibration.
Unambiguously identifying and locating individual spheres becomes increasingly difficult as their scattering patterns overlap and centers of symmetry become obscured. Spurious features can be introduced, moreover, by superposition of symmetrically arranged scattering patterns. Symmetry considerations therefore should provide little information on individual particles' positions for dense samples in the multiple-scattering limit. Results such as those in Fig. 2 confirm reliable detection of micrometer-scale spheres down to separations of two or three wavelengths.
Applying the same analysis to each snapshot in a holographic video sequence yields the in-plane trajectory of each sphere in the field of view. Figure 3(a) shows the trajectory of the sphere from Fig. 1 obtained in this way from 16,500 consecutive video frames. Each frame, moreover, yields two measurements of the particle's position because the even and odd scan lines are recorded separately. Given the recording rate of 29.97 frames/s, the time interval between interleaved video fields is 16.7 ms. The camera's exposure time, 0.1 ms, is fast enough that the sphere will not have diffused appreciably during image acquisition (25); (26). The 33,000 position measurements plotted in Fig. 3(a) record the particle's Brownian motion over more than 9 min.
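Extracting the two fields from an interlaced frame can be sketched as follows; the row-duplication upsampling is one common choice (an assumption here, not necessarily the authors' exact procedure) and is the step that sacrifices vertical resolution:

```python
import numpy as np

def deinterlace(frame):
    """Split an interlaced video frame into its two fields.

    The even and odd scan lines (rows) are exposed half a frame period
    apart, so each frame yields two snapshots. Each field is stretched
    back to full height by duplicating rows, which preserves horizontal
    resolution but halves vertical resolution.
    """
    even = np.repeat(frame[0::2, :], 2, axis=0)[: frame.shape[0]]
    odd = np.repeat(frame[1::2, :], 2, axis=0)[: frame.shape[0]]
    return even, odd
```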
Assuming that the sphere diffuses freely without significant hydrodynamic coupling to surrounding surfaces, the mean-squared displacement,
(7) \Delta r_j^2(\tau) = \left\langle \left[ r_j(t + \tau) - r_j(t) \right]^2 \right\rangle,
should satisfy the Einstein-Smoluchowski equation
(8) \Delta r_j^2(\tau) = 2 D_j \tau + 2 \epsilon_j^2,
where r_j(t) is the sphere's position along one of the Cartesian coordinates, with j = x or y, where D_j is the diffusion coefficient along that direction, and where ε_j is the error in the associated position measurement. Analyzing measured trajectories with Eq. (8) therefore provides a method to measure the precision with which a particle's position can be determined (17); (25); (26).
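Estimating the diffusion coefficient and localization error from a measured trajectory reduces to a linear least-squares fit of Eq. (8); the helper names and lag range below are illustrative:

```python
import numpy as np

def msd(x, lags):
    """Mean-squared displacement of a 1D trajectory at the given integer lags."""
    return np.array([np.mean((x[m:] - x[:-m]) ** 2) for m in lags])

def fit_einstein_smoluchowski(x, dt, max_lag=10):
    """Fit Delta r^2(tau) = 2 D tau + 2 eps^2 (Eq. (8)) by least squares.

    Returns the diffusion coefficient D and localization error eps.
    The intercept measures the static tracking error; a slightly
    negative fitted intercept is clipped to zero.
    """
    lags = np.arange(1, max_lag + 1)
    tau = lags * dt
    slope, intercept = np.polyfit(tau, msd(x, lags), 1)
    return slope / 2.0, np.sqrt(max(intercept, 0.0) / 2.0)
```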
The data in Fig. 3(b) show the mean-squared displacements along the x and y directions computed from the trajectories in Fig. 3(a) using Eq. (7). Error bars in Fig. 3 reflect statistical uncertainties in the computed values. Although results along the two directions appear to agree with each other to within these uncertainties, least-squares fits to the Einstein-Smoluchowski prediction in Eq. (8) yield slightly different values for the particle's diffusion coefficient along the two directions. This discrepancy may be attributed to blurring along the y direction that arises when the even and odd scan lines are extracted from each interlaced video frame. The resulting loss of spatial resolution along y tends to suppress the apparent diffusivity along that direction (25); (26). This artifact may be avoided by using a progressive-scan camera. The larger of the measured diffusion coefficients is consistent with the Stokes-Einstein prediction (27) for the sphere diffusing through water at the ambient temperature.
Fits to Eq. (8) also yield estimates for the errors in the particle's position of roughly 0.06 pixel in each direction. This performance is comparable to the precision obtained with voting algorithms (9); (18). Because of its speed advantage, the orientation alignment transform in conjunction with centroid-locating algorithms (17) should be immediately useful for in-plane particle-tracking applications. Its results also can be used to bootstrap more detailed analyses (9) for applications that require greater precision or simultaneous tracking and characterization.
The orientation alignment transform performs well for identifying features composed of large numbers of closely spaced concentric fringes. It does not fare so well with simple disk-like features whose few alignment coincidences occur at comparatively large ranges. Such images are better analyzed with Hough transforms, voting algorithms, or related morphological methods. The orientation alignment transform, by contrast, is better suited to holographic images whose gradient-rich structure is computationally burdensome for conventional methods.
Open-source software implementing the orientation alignment transform is available online at http://physics.nyu.edu/grierlab/software/. This work was supported primarily by a grant from Procter & Gamble and in part by the MRSEC program of the National Science Foundation through Grant Number DMR-0820341.
References

(1) J. Sheng, E. Malkiel and J. Katz. "Digital holographic microscope for measuring three-dimensional particle distributions and motions." Appl. Opt. 45, 3893–3901 (2006).

(2) S.-H. Lee and D. G. Grier. "Holographic microscopy of holographically trapped three-dimensional structures." Opt. Express 15, 1505–1512 (2007).

(3) S.-H. Lee, Y. Roichman, G.-R. Yi, S.-H. Kim, S.-M. Yang, A. van Blaaderen, P. van Oostrum and D. G. Grier. "Characterizing and tracking single colloidal particles with video holographic microscopy." Opt. Express 15, 18275–18282 (2007).

(4) F. C. Cheong, K. Xiao and D. G. Grier. "Characterization of individual milk fat globules with holographic video microscopy." J. Dairy Sci. 92, 95–99 (2009).

(5) F. C. Cheong, S. Duarte, S.-H. Lee and D. G. Grier. "Holographic microrheology of polysaccharides from Streptococcus mutans biofilms." Rheol. Acta 48, 109–115 (2009).

(6) G. Bolognesi, S. Bianchi and R. Di Leonardo. "Digital holographic tracking of microprobes for multipoint viscosity measurements." Opt. Express 19, 19245–19254 (2011).

(7) F. C. Cheong, K. Xiao, D. J. Pine and D. G. Grier. "Holographic characterization of individual colloidal spheres' porosities." Soft Matter 7, 6816–6819 (2011).

(8) H. Shpaisman, B. J. Krishnatreya and D. G. Grier. "Holographic microrefractometer." Appl. Phys. Lett. 101, 091102 (2012).

(9) F. C. Cheong, B. Sun, R. Dreyfus, J. Amato-Grill, K. Xiao, L. Dixon and D. G. Grier. "Flow visualization and flow cytometry with holographic video microscopy." Opt. Express 17, 13071–13079 (2009).

(10) L. Dixon, F. C. Cheong and D. G. Grier. "Holographic particle-streak velocimetry." Opt. Express 19, 4393–4398 (2011).

(11) Y. Roichman, B. Sun, A. Stolarski and D. G. Grier. "Influence of nonconservative optical forces on the dynamics of optically trapped colloidal spheres: The fountain of probability." Phys. Rev. Lett. 101, 128301 (2008).

(12) K. Xiao and D. G. Grier. "Multidimensional optical fractionation with holographic verification." Phys. Rev. Lett. 104, 028302 (2010).

(13) J. Fung and V. N. Manoharan. "Holographic measurements of anisotropic three-dimensional diffusion of colloidal clusters." Phys. Rev. E 88, 020302 (2013).

(14) J. Fung, K. E. Martin, R. W. Perry, D. M. Kaz, R. McGorty and V. N. Manoharan. "Measuring translational, rotational, and vibrational dynamics in colloids with digital holographic microscopy." Opt. Express 19, 8051–8065 (2011).

(15) D. H. Ballard. "Generalizing the Hough transform to detect arbitrary shapes." Pattern Recognition 13, 111–122 (1981).

(16) C. Hollitt. "A convolution approach to the circle Hough transform for arbitrary radius." Machine Vision and Applications 24, 683–694 (2013).

(17) J. C. Crocker and D. G. Grier. "Methods of digital video microscopy for colloidal studies." J. Colloid Interface Sci. 179, 298–310 (1996).

(18) R. Parthasarathy. "Rapid, accurate particle tracking by calculation of radial symmetry centers." Nature Methods 9, 724–726 (2012).

(19) B. I. Halperin and D. R. Nelson. "Theory of two-dimensional melting." Phys. Rev. Lett. 41, 121–124 (1978).

(20) D. R. Nelson and B. I. Halperin. "Dislocation-mediated melting in two dimensions." Phys. Rev. B 19, 2457–2484 (1979).

(21) J. Rubinstein, J. Segman and Y. Zeevi. "Recognition of distorted patterns by invariance kernels." Pattern Recognition 24, 959–967 (1991).

(22) T. J. Atherton and D. J. Kerbyson. "Size invariant circle detection." Image and Vision Computing 17, 795–803 (1999).

(23) A. Savitzky and M. J. E. Golay. "Smoothing and differentiation of data by simplified least squares procedures." Anal. Chem. 36, 1627–1639 (1964).

(24) D. G. Grier. "A revolution in optical manipulation." Nature 424, 810–816 (2003).

(25) T. Savin and P. S. Doyle. "Role of finite exposure time on measuring an elastic modulus using microrheology." Phys. Rev. E 71, 041106 (2005).

(26) T. Savin and P. S. Doyle. "Static and dynamic errors in particle tracking microrheology." Biophys. J. 88, 623–638 (2005).

(27) B. J. Krishnatreya, A. Colen-Landy, P. Hasebe, B. A. Bell, J. R. Jones, A. Sunda-Meya and D. G. Grier. "Measuring Boltzmann's constant through holographic video microscopy of a single sphere." Am. J. Phys. 82, 23–31 (2014).