Automated Micro and Nanoscale Assembly Using Optical Tweezers


Main Participants: Satyandra K. Gupta, Arvind Balijepalli, Ashis G. Banerjee, Tom LeBrun, and Tao Peng

Sponsors: This project is sponsored by NIST, Center for Nano Manufacturing and Metrology, and NSF.

Keywords: Micromanipulation, nanomanipulation, optical tweezers, assembly planning, and robotics


Motivation

Optical tweezers can trap and move a variety of microscale and nanoscale components without physical contact and hence without damaging components due to stiction or deformation caused by contact forces. At the same time, optical tweezers provide a broad range of positioning and orienting capabilities to place components at the desired locations in the workspace. By utilizing multiple trapping beams, multiple operations can be performed in parallel and the instrumentation can be based on inexpensive lasers and piezo-actuators. Thus the technique can scale to production in terms of both cost and efficiency, making optical tweezers a very promising technology for micro and nanoscale assembly. Currently, optical tweezers are mainly used in research laboratories. In order to use optical tweezers in production processes, the following challenges need to be addressed:

    • The overall operation speed has to increase considerably to ensure that manufacturing can be performed in a cost-competitive manner.
    • The overall operation yield has to increase considerably to ensure that a large number of assembly operations can be performed without encountering assembly errors.

    • The reliance on highly trained expert human operators has to decrease considerably to ensure widespread use of this technology.

We believe that addressing these challenges will make optical tweezers a viable technology for prototyping nanoscale electronic devices, manufacturing customized nano-structures for bio-medical application, and repair and rework of nano-structures produced using other processes (e.g., self-assembly).


Objectives

The objectives of this project are:

  1. Development of a 3D imaging system for on-line monitoring of the assembly process. This will ensure that the system is aware of the positions and orientations of all the components in the workspace, thereby decreasing assembly errors. This capability is also a prerequisite for autonomous operation.
  2. Development of planning algorithms for automated operations. The system must be able to perform assembly operations in an automated manner. The human operator will have high-level control and manual override capabilities. Under normal operating conditions, the system will automatically generate the traps and transport components.

Technical Approach

On-Line Monitoring: On-line monitoring requires a new vision system for 3D optical microscopy of the workspace at video frame rates. Fast 3D imaging is important for operator feedback while prototyping new devices with optical tweezers, and it requires new techniques to recognize, track, and visualize micro and nanoscale components. Note that although traditional optical microscopy cannot resolve the internal structure of nanoscale objects, components such as nanowires and quantum dots can still be observed in the optical microscope and their positions measured with nanometer-scale resolution. This allows us to use optical techniques to follow nanoassembly processes. Recent advances in ultramicroscopy are also pushing the resolution of optical microscopy to 50 nm and below, so optical microscopy can serve as a key tool for nanoscale measurements.

Developing the new 3D vision system requires analyzing a stack of images produced by a camera mounted on the optical tweezers, identifying the components present in the workspace, and estimating their locations in 3D space. We first segment each image in the stack into connected regions. Each connected region is analyzed for the presence of a component signature and is used to estimate the type, size, location, and orientation of the component. Because the components are three dimensional, each component leaves signatures in multiple images; the estimates generated from one image can therefore be combined with the estimates from other images to produce an overall estimate. The overall estimates are used to compute and render a synthetic 3D scene showing the current state of the workspace. To be useful in automated planning, the 3D scene must be computed quickly and updated at a rate of at least ten frames per second. Hence, we are developing efficient algorithms for this task, which requires identifying the best possible component signatures as well as efficient procedures for verifying the presence of a signature in an image region.
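The sketch below illustrates the flavor of this per-slice analysis; it is not the project's implementation, and the threshold, pixel scale, slice spacing, and merge radius are placeholder assumptions. Each optical section is thresholded and segmented into connected regions, and per-slice detections that fall close together laterally are merged into a single 3D position estimate.

# Minimal sketch (illustrative assumptions, not the project's code) of the
# image-stack analysis described above: threshold each optical section, find
# connected regions, and merge per-slice detections into 3D estimates.

import numpy as np
from scipy import ndimage


def detect_regions(image, threshold=0.5):
    """Segment one optical section into connected regions above a threshold;
    return the (row, col) centroid and pixel area of each region.
    Assumes intensities normalized to [0, 1]."""
    mask = image > threshold
    labels, num = ndimage.label(mask)
    centroids = ndimage.center_of_mass(mask, labels, range(1, num + 1))
    areas = ndimage.sum(mask, labels, range(1, num + 1))
    return list(zip(centroids, areas))


def estimate_components(stack, z_spacing_um=0.5, xy_scale_um=0.1,
                        merge_radius_um=1.0):
    """Combine per-slice detections into 3D component position estimates.
    Detections whose lateral positions fall within merge_radius_um are
    assumed to be signatures of the same component in nearby sections."""
    detections = []  # (x_um, y_um, z_um, area)
    for z_index, image in enumerate(stack):
        for (row, col), area in detect_regions(image):
            detections.append((col * xy_scale_um, row * xy_scale_um,
                               z_index * z_spacing_um, area))

    components = []  # each entry: detections attributed to one component
    for det in detections:
        for comp in components:
            cx = np.mean([d[0] for d in comp])
            cy = np.mean([d[1] for d in comp])
            if np.hypot(det[0] - cx, det[1] - cy) < merge_radius_um:
                comp.append(det)
                break
        else:
            components.append([det])

    estimates = []
    for comp in components:
        weights = np.array([d[3] for d in comp])  # area-weighted average
        x = np.average([d[0] for d in comp], weights=weights)
        y = np.average([d[1] for d in comp], weights=weights)
        z = np.average([d[2] for d in comp], weights=weights)
        estimates.append((x, y, z))
    return estimates

In practice the signature matching, component classification, and orientation estimation described above replace the simple threshold-and-centroid step used here, but the overall flow from image stack to 3D scene estimates is the same.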

Planning Algorithms: Assembling micro and nanoscale components involves trapping them and moving them to the desired locations. This requires moving the components through the workspace while avoiding collisions with the other components. Untrapped components also move constantly due to random Brownian motion, so the workspace configuration changes continuously. The trapping laser can be time shared to move multiple components; it can therefore also be used to clear the path by moving components that obstruct the target component. The physics of trapping imposes constraints on the speed at which the laser can move a trapped particle through the fluidic workspace. There are also constraints on the shape of the trap and on the clearance that needs to be maintained between the trap and the other components in the workspace. In order to perform planning, we are identifying and modeling the relevant constraints in a geometric framework. We have formulated the motion-planning problem with the goal of delivering a component to its desired location in the minimum possible expected time. The nominal transport time is combined with the expected collision circumvention time to compute the expected time for completing the nominal path. Two types of collision circumvention strategies are pursued: local path alterations to avoid imminent collisions, and trapping obstructions to remove them. When these strategies fail, we plan to use recovery plans to cope with unavoidable collisions. We are developing efficient algorithms for nominal path planning, collision circumvention planning, and post-collision recovery planning.
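As a concrete illustration of this expected-time formulation, the following sketch scores candidate piecewise-linear paths by adding the nominal transport time to the expected time spent circumventing collisions, and selects the path with the smallest expected completion time. It uses illustrative assumptions only; the trap speed, circumvention time, and obstacle-intrusion model are hypothetical placeholders, and the actual planner is based on the stochastic dynamic programming framework described in the publications below.

# Minimal sketch (placeholder parameters, not the project's planner) of the
# expected-time cost model: expected path time = nominal transport time +
# expected collision-circumvention time along each path segment.

import math

# Assumed planner parameters (placeholders, not measured values).
TRAP_SPEED_UM_PER_S = 2.0    # max speed the trap can drag a particle
CIRCUMVENTION_TIME_S = 1.5   # assumed mean time to detour around, or to
                             # trap and remove, one obstructing particle


def collision_probability(segment_length_um, obstacle_density_per_um2,
                          clearance_um):
    """Crude estimate of the chance that at least one diffusing obstacle
    intrudes into the swept clearance corridor around a path segment."""
    corridor_area = 2.0 * clearance_um * segment_length_um
    expected_intruders = obstacle_density_per_um2 * corridor_area
    return 1.0 - math.exp(-expected_intruders)


def expected_path_time(waypoints, obstacle_density_per_um2, clearance_um=1.0):
    """Expected completion time of a piecewise-linear path given as a list
    of (x, y) waypoints in micrometers."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        length = math.hypot(x1 - x0, y1 - y0)
        nominal = length / TRAP_SPEED_UM_PER_S
        p_collision = collision_probability(length, obstacle_density_per_um2,
                                            clearance_um)
        total += nominal + p_collision * CIRCUMVENTION_TIME_S
    return total


def best_path(candidate_paths, obstacle_density_per_um2):
    """Pick the candidate path with the smallest expected completion time."""
    return min(candidate_paths,
               key=lambda p: expected_path_time(p, obstacle_density_per_um2))

For example, a direct path that crosses a crowded region can lose to a longer detour whose lower collision probability yields a smaller expected completion time, which is exactly the trade-off the minimum-expected-time formulation is intended to capture.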


Related Publications

The following papers provide more details on our approach.

  • A. Balijepalli, T.W. LeBrun, and S.K. Gupta. Stochastic simulations with graphics hardware: Characterization of accuracy and performance. ASME Journal of Computing and Information Science in Engineering, 10(1):011010, March 2010.
  • A.G. Banerjee, A. Pomerance, W. Losert, and S.K. Gupta. Developing a stochastic dynamic programming framework for optical tweezers based automated particle transport operations. IEEE Transactions on Automation Science and Engineering, 7(2):218-227, 2010.
  • T. Peng, A. Balijepalli, S.K. Gupta, and T. LeBrun. Algorithms for extraction of nanowire lengths and positions from optical section microscopy image sequence. ASME Journal of Computing and Information Science in Engineering, 9(4), December 2009.
  • A. Balijepalli, T.W. LeBrun, J. Gorman, and S.K. Gupta. Evaluation of a trapping potential measurement technique for optical tweezers using simulations and experiments. ASME International Conference on Micro and Nano Systems, August 30-September 2, 2009, San Diego.
  • A.G. Banerjee, W. Losert, and S.K. Gupta. A decoupled and prioritized stochastic dynamic programming approach for automated transport of multiple particles using optical tweezers. ASME International Conference on Micro and Nano Systems, August 30-September 2, 2009, San Diego.
  • A.G. Banerjee, A. Balijepalli, S.K. Gupta, and T.W. LeBrun. Generating simplified trapping probability models from simulation of optical tweezers system. ASME Journal of Computing and Information Science in Engineering, 9(2):021003, June 2009.
  • T. Peng, A. Balijepalli, S.K. Gupta, and T. LeBrun. Algorithms for on-line monitoring of micro-spheres in an optical tweezers-based assembly cell. ASME Journal of Computing and Information Science in Engineering, 7(4):330-338, 2007.

Contact

For additional information and to obtain copies of the above papers, please contact:

Dr. Satyandra K. Gupta
Viterbi School of Engineering
University of Southern California
Los Angeles, California 90089-1453
Phone: 213-740-0491
Email: guptask [AT] usc [DOT] edu