Real-Time Spatiotemporal Assistance for Micromanipulation Using Imitation Learning
Published in: IEEE Robotics and Automation Letters, 2024-04, Vol. 9 (4), p. 1-8
Main authors: , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: There has been an increasing demand for microscopic work using optical microscopes and micromanipulators for applications in various fields. However, microinjection requires skilled operators, and the considerable shortage of experts has become a recent challenge. We overcome this challenge by proposing an assistance system based on force and visual presentation using artificial intelligence technology to simplify cell rotation manipulation, which is difficult in microinjection. The proposed system employs imitation learning of an expert operator, using a Gaussian mixture model (GMM) to obtain the ideal pipette trajectory and a long short-term memory (LSTM) network to infer the pipette operation at the next time step. The assistance position is calculated by combining the spatial component from the GMM with the time-series component from the LSTM. We conducted a participant experiment using mature porcine oocytes as manipulation targets to evaluate the effectiveness of the proposed system. The results indicated that, compared to the conventional system, the proposed system reduced the pipette operation time for single-oocyte rotation and the cell damage caused by pipette-oocyte collisions by approximately 27.0% and 82.0%, respectively. Therefore, the proposed system is expected to enable beginners to reproduce high-level skills and address the shortage of experts.
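The abstract describes computing an assistance position by combining a GMM-derived spatial component (the ideal trajectory) with an LSTM-derived time-series component (the next-step prediction). The following is a minimal sketch of that idea using Gaussian mixture regression to condition a GMM over (time, position) pairs on the current time; the toy GMM parameters, the stand-in value for the LSTM output, the blending weight `alpha`, and the function `gmr` are all illustrative assumptions, not the paper's actual model or values.

```python
import numpy as np

def gmr(t, weights, means, covs):
    """Condition a GMM over (t, x) on time t to estimate E[x | t].
    means[k] = [mu_t, mu_x]; covs[k] is the 2x2 component covariance."""
    # Responsibility of each component for this time step
    h = np.array([
        w * np.exp(-0.5 * (t - m[0]) ** 2 / c[0, 0]) / np.sqrt(c[0, 0])
        for w, m, c in zip(weights, means, covs)
    ])
    h /= h.sum()
    # Conditional mean of each component: mu_x + (S_xt / S_tt) * (t - mu_t)
    x_k = np.array([
        m[1] + c[1, 0] / c[0, 0] * (t - m[0])
        for m, c in zip(means, covs)
    ])
    return float(h @ x_k)

# Two-component toy GMM fit along a trajectory where x(t) roughly tracks t
weights = [0.5, 0.5]
means = [np.array([0.25, 0.25]), np.array([0.75, 0.75])]
covs = [np.eye(2) * 0.05 + 0.04 * np.ones((2, 2)) for _ in range(2)]

x_spatial = gmr(0.5, weights, means, covs)  # GMM spatial component at t = 0.5
x_temporal = 0.52                           # stand-in for an LSTM next-step output
alpha = 0.5                                 # assumed blending weight
assist = alpha * x_spatial + (1 - alpha) * x_temporal
print(round(assist, 3))
```

In a full implementation the GMM would be fit to expert demonstrations (e.g. by expectation-maximization) and `x_temporal` would come from an LSTM trained on the same time-series data; the blend then yields the presented assistance position.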
ISSN: 2377-3766
DOI: 10.1109/LRA.2024.3366011