Adaptive Human-Robot Shared Control for Micromanipulation

Precise and dexterous manipulation of micro-objects a few microns in size is important but challenging. For example, peeling the inner limiting membrane, roughly 2.5 microns thick, during vitreoretinal surgery, or transporting red blood cells 5-8 microns in size, is extremely demanding, since the human hand can only achieve a precision of about 100 microns under optimal conditions. Robotic micromanipulation systems are therefore worth developing, with applications ranging from biotechnology to microsurgery.

This project is motivated by the urgent need for intuitive and flexible micromanipulation systems to support the manipulation of micro-objects, where poor sensory feedback, physiological tremor, and an obstructed view hamper precise micron-scale maneuvers. To this end, I aim to develop adaptive human-robot shared control for micromanipulation.

Imitation Learning for Service Robots

In conventional robotized automation, robots are programmed to execute repetitive tasks with high speed and precision. However, it is impractical to hard-code robots to perform dexterous manipulation across varied scenarios, since such tasks involve dynamic processes that are difficult to model.


In recent years, service robots have been widely deployed in environments such as homes, offices, and hospitals. They have been developed to perform a wide range of human activities, including cooking, cleaning, sweeping, and personal care. Among these, pouring a specific amount of liquid or granular material such as rice, sugar, or salt into a container is one of the most frequently performed tasks in food preparation, drink service, and industrial settings. Pouring is a complex nonlinear time-variant process, and an accurate model of liquid dynamics is difficult to obtain. Therefore, this project develops robot learning techniques that enable a robot to master pouring skills autonomously.
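As a minimal, hypothetical illustration of the robot-learning direction (not the project's actual algorithm), the sketch below behavior-clones a pouring policy from synthetic expert demonstrations; the expert rule, the state representation, and all constants are invented for the example:

```python
import numpy as np

# Behavior-cloning sketch for pouring (illustrative only).
# State: [remaining mass to pour (g), current tilt angle (rad)]
# Action: tilt angular velocity (rad/s)

rng = np.random.default_rng(0)

def expert_action(state):
    """Hypothetical expert rule: tilt faster when more mass remains,
    slower as the container is already tilted far."""
    remaining, tilt = state
    return 0.02 * remaining - 0.5 * tilt

# Synthetic demonstrations sampled over the state space.
states = rng.uniform([0.0, 0.0], [100.0, 1.2], size=(200, 2))
actions = np.array([expert_action(s) for s in states])

# Fit a linear policy a = w @ [remaining, tilt, 1] by least squares.
X = np.hstack([states, np.ones((len(states), 1))])
w, *_ = np.linalg.lstsq(X, actions, rcond=None)

def policy(state):
    """Cloned policy: predicted tilt velocity for a given state."""
    return np.array([*state, 1.0]) @ w

print(policy((50.0, 0.6)))  # close to expert_action((50.0, 0.6)) == 0.7
```

In practice the state would come from perception (e.g., scale readings and joint encoders) and a nonlinear model would replace the linear fit, but the structure — demonstrations in, supervised regression, policy out — is the core of the imitation-learning pipeline.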

Past Projects

Robot Learning

  • Developed a Sim-to-Real Transfer Framework for Adaptive Trajectory Generation

  • Developed Hierarchical Imitation Learning Algorithm for Service Robot

  • Explored Transfer Learning and Domain Adaptation for Robotic Manipulation

Teleoperation for Robotic Surgery

  • Designed and Verified a Compact Master Manipulator: Hamlyn CRM

  • Explored Mechanical Design, Master-Slave Mapping, Visual Alignment, Multi-Sensor Fusion for Hamlyn CRM

  • Built up an Open Source Workspace Analysis Package

  • Proposed Master Manipulator Design Theory

  • Completed MRes thesis titled “Design and Validation of a Compact Master Manipulator for Robotic Surgery Control and Training”

Human-Robot Shared Control for Laparoscopic Surgery

  • Proposed a Self-Adaptive Motion Scaling Framework to improve the surgical teleoperation efficiency

  • Explored Surgical Gesture Segmentation and Recognition based on machine learning (HMM, GMM, K-means) and deep learning Methods (LSTM, CNN)

  • Proposed the Concept of Implicit Human-Robot Shared Control

  • Conducted Real-time Surgical Skill Analysis and Cross-Domain Transfer Learning for Skill Assessment

  • Investigated a Framework of Supervised Semi-Autonomous Control

  • Employed Bayesian Optimization for User Specific Parameter Optimization

Micromanipulation

  • Constructed a Robot-Aided Optical Tweezer Platform for Micromanipulation

  • Developed Two Control Strategies for Optical Manipulation of Microrobot Via Planar OT

  • Conducted Force Characterization for Optical Manipulation

  • Developed Multi-Task Learning and Domain Adaptation Algorithms for Microrobot 6D Pose Estimation

  • Explored Few-Shot Learning for Microrobot 3D Pose Estimation

Robot-Assisted Microsurgery

  • Built a Master Controller for Robot-Assisted Microsurgery Training

  • Developed a Robot Operating System (ROS) Interface Based Microsurgical Robot Research Kit (MRRK) for Robot-Assisted Microsurgery Research

  • Investigated Microsurgical Tool Tracking Algorithm

  • Developed Hybrid Mapping Strategy and Haptic Guidance for Robot Control


Research Themes

  • Human-Robot Shared Control

  • Robot Learning

  • Surgical Data Science

  • Microrobotics

Research Outline

Dandan Zhang contributes to the development of advanced dexterous manipulation techniques for robots across scales.

She initiated the concept of implicit human-robot shared control, and she is working towards integrating Artificial General Intelligence into trustworthy robotic systems.


Conventional stationary master manipulators are stable and highly precise, but their high cost and large footprint limit widespread uptake. The handheld master controller is an attractive alternative to its grounded counterpart, being compact, lightweight, and easy to use. I therefore developed a compact master manipulator based on an effective workspace analysis framework, as well as a handheld master controller for microsurgical robot control. To enable seamless human-robot shared control, ergonomic design considerations were explored through workspace analysis. An adaptive motion scaling framework, an adaptive hybrid interface, and other optimal master-slave mapping techniques were investigated to realize implicit human-robot shared control. Machine learning was incorporated into the control framework to further improve surgical operation efficiency and alleviate surgeons' workload. The concept of supervised semi-autonomous control was proposed, through which the strengths of human judgment and robot intelligence are seamlessly combined to provide reliable control.

To enable versatile micromanipulation, an optical tweezer (OT)-based robotic system with six degrees of freedom (DoF) was built, combining accurate pose and depth estimation to support dexterous micromanipulation at the cellular level.
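The adaptive motion scaling idea can be sketched as a simple speed-dependent mapping from master to slave motion; the thresholds and scale range below are illustrative assumptions, not values from the published framework:

```python
def adaptive_scale(hand_speed, s_min=0.1, s_max=0.5, v_low=0.005, v_high=0.05):
    """Map master hand speed (m/s) to a slave motion scaling factor.

    Slow, careful motion gets a small scale for precision; fast transfer
    motion gets a larger scale for efficiency. All thresholds here are
    illustrative, not the framework's actual parameters.
    """
    if hand_speed <= v_low:
        return s_min
    if hand_speed >= v_high:
        return s_max
    t = (hand_speed - v_low) / (v_high - v_low)  # linear interpolation
    return s_min + t * (s_max - s_min)


def map_master_to_slave(master_delta, hand_speed):
    """Scale a master position increment (m) into a slave increment (m)."""
    return adaptive_scale(hand_speed) * master_delta
```

Under these assumed parameters, careful tissue-level motion is transmitted at one tenth of the hand's displacement, while gross repositioning passes through at half scale; user-specific tuning of such parameters is where techniques like Bayesian optimization come in.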