
2022-2023 MSc Robotics/Biorobotics

Microrobotics:

  1. Deep Learning-Based Microrobot Real-Time Tracking
     Student: Ziqi Xiong, MSc Robotics
     Supervision team: Dandan Zhang

  2. Machine Learning-Based Adaptive Human-Robot Shared Control for Micromanipulation
     Student: Peiyang Jiang, MSc Robotics
     Supervision team: Dandan Zhang

Tactile Robotics:

  1. Soft Tactile Robotic Hand for In-Hand Manipulation
     Student: Qingzheng Cong, MSc Biorobotics
     Supervision team: Dandan Zhang, Nathan Lepora, Xiaoqing Guo

  2. Deep Learning-Based Tactile Perception of Object Affordance
     Student: Muhuan Wu, MSc Biorobotics
     Supervision team: Dandan Zhang, Nathan Lepora, Wen Fan

  3. Mixed Reality-Based Teleoperation for Contact-Rich Tasks with a Mobile Manipulator
     Student: Ziniu Wu
     Supervision team: Dandan Zhang, Nathan Lepora, Shipeng Xiong

2022-2023 MSc/MEng Engineering Math

  1. Motion Tracking for Microrobot Control
     Student: Finn McFall, MEng Engineering Math
     Supervision team: Dandan Zhang

  2. Automatic Fruit Picking with a Mobile Manipulator
     Student: Frederick Cammegh, MEng Engineering Math
     Supervision team: Dandan Zhang

  3. Workspace Analysis for Parallel Robot and Continuum Robot
     Student: Hanyu Zhao, MSc Engineering Math
     Supervision team: Dandan Zhang

2022-2023 MEng/MSc Engineering Math Project Descriptions

Title: A harvesting robotic platform for automatic fruit picking

Motivation:

Harvesting fruit for the fresh market is a time-consuming and tedious process. Automating fruit harvesting in orchards with mobile manipulators is therefore worth exploring to reduce human workload and increase food-production efficiency.

Challenges:

The main aim of this project is to develop a deep learning-based robotic system for automatic fruit picking. The three main challenges for the proposed system are sensing (fruit recognition and localization), planning (determining the optimal grasping pose), and control (generating the fruit-grasping motion).

Project Overview:

Deep learning-based models will be applied to enable the robot to identify the locations of fruits and to guide it towards them with the desired grasping pose. The robot will then pick the fruits and place them in a target area. The work will first be carried out in a simulated environment, and the algorithm will then be transferred to a physical robotic system. Deep reinforcement learning or deep imitation learning algorithms can be explored.
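As a purely illustrative sketch of the sensing step, the snippet below back-projects the centre of a detected fruit bounding box into a 3D grasp target using a pinhole camera model; the detector output, camera intrinsics, and depth values are hypothetical placeholders rather than part of any specific system used in the project.

# Hypothetical sketch: back-project a detected fruit's bounding-box centre to a
# 3D grasp target using a pinhole camera model. Detector output, intrinsics and
# depth data are placeholders, not tied to any particular library or robot.
import numpy as np

FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0   # assumed camera intrinsics (pixels)

def bbox_centre_to_grasp(bbox, depth_image):
    """Convert a (u_min, v_min, u_max, v_max) box into a 3D grasp point (metres)."""
    u = int((bbox[0] + bbox[2]) / 2)
    v = int((bbox[1] + bbox[3]) / 2)
    z = float(depth_image[v, u])               # depth at the box centre
    x = (u - CX) * z / FX                      # pinhole back-projection
    y = (v - CY) * z / FY
    return np.array([x, y, z])

if __name__ == "__main__":
    depth = np.full((480, 640), 0.6)           # synthetic 0.6 m flat depth map
    fake_detection = (300, 220, 340, 260)      # as if returned by a fruit detector
    print("grasp target (m):", bbox_centre_to_grasp(fake_detection, depth))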

This project will require experience with deep learning frameworks in Python. Basic knowledge of robotic systems and control is desirable.

Stakeholders:

Fruit-production industry; agriculture robotics

The student who chooses this project will work with an agriculture robotics company and researchers who work on intelligent robotics at Bristol Robotics Lab.

Keywords: deep learning, robotics

Title: Workspace Analysis for Parallel Robot and Continuum Robot 

 

Motivation:

Workspace analysis is essential for robotic manipulators: it helps researchers and engineers study, evaluate, and optimize robot designs against specific criteria, taking ergonomics and usability into account.

 

Challenges and Project Overview:

Although workspace analysis is a common research topic, current solutions provide design-specific evaluations, and generic software tools that support different hardware configurations are lacking. WSRender, a versatile research-oriented framework for workspace analysis and visualization, has been proposed; however, the current version lacks calculation modules for parallel robots and continuum robots.

Therefore, this project aims to extend the framework with workspace analysis for parallel robots and continuum robots.
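As a rough illustration of what such a calculation module might do, the sketch below estimates the reachable workspace of a single constant-curvature continuum segment by Monte Carlo sampling of its curvature and bending-plane angle. The parameter ranges and segment length are arbitrary assumptions, and the code is not part of WSRender.

# Minimal Monte Carlo workspace sketch for one constant-curvature continuum
# segment; parameter ranges are illustrative, not tied to specific hardware.
import numpy as np

def cc_tip(kappa, phi, length):
    """Tip position of one constant-curvature segment (kappa in 1/m)."""
    if abs(kappa) < 1e-9:                       # straight configuration
        return np.array([0.0, 0.0, length])
    theta = kappa * length                      # total bending angle
    r = (1.0 - np.cos(theta)) / kappa           # in-plane radial offset
    return np.array([r * np.cos(phi), r * np.sin(phi), np.sin(theta) / kappa])

rng = np.random.default_rng(0)
samples = np.array([
    cc_tip(rng.uniform(-10.0, 10.0),            # curvature range (1/m)
           rng.uniform(0.0, 2 * np.pi),         # bending-plane angle
           0.2)                                 # fixed 0.2 m segment length
    for _ in range(5000)
])
print("workspace bounding box (m):")
print("min:", samples.min(axis=0), "max:", samples.max(axis=0))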

 

Stakeholders:

A surgical robotics company is interested in the workspace analysis of their surgical robot. The student will simulate a robot developed by the company and analyse the workspace of the robot.

​

Essential requirements: MATLAB and Python

Desirable requirements: Robotics, Modeling

​

 

Title: A Handheld Controller for Surgical Robot Remote Control 

 

Motivation:

With the advantages of being compact and lightweight, handheld controllers are attractive tools for teleoperation. Without mechanical linkages, their physical footprint can be significantly reduced. In addition, an intuitive ungrounded design allows operators to manipulate the robot through teleoperation in a more natural way, without motion constraints.

 

Overview:

This project aims to develop a handheld controller for remotely controlling a robot. A 9-axis IMU will be adopted to measure the gesture (orientation) of the controller's finger gripper, and a depth camera will be used to track its 3D position. The handheld controller will first be tested in a simulated environment and then evaluated on a physical robotic platform.
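A minimal sketch of how the two sensor streams might be combined is shown below: the depth-camera position is exponentially smoothed and paired with the IMU orientation to form a pose command. The class, smoothing factor, and synthetic readings are illustrative assumptions, not the project's actual implementation.

# Illustrative sketch: combine an IMU orientation estimate with a depth-camera
# position estimate into a single pose command for the follower robot. The
# sensor readings here are synthetic; real drivers would replace them.
import numpy as np

class PoseFuser:
    def __init__(self, alpha=0.2):
        self.alpha = alpha                      # smoothing factor for position
        self.position = np.zeros(3)

    def update(self, cam_position, imu_quaternion):
        # Exponentially smooth the camera position to suppress jitter;
        # orientation is taken directly from the IMU (already filtered on-board).
        self.position = (1 - self.alpha) * self.position + self.alpha * cam_position
        return self.position, imu_quaternion / np.linalg.norm(imu_quaternion)

fuser = PoseFuser()
for t in range(5):
    cam_p = np.array([0.30, 0.05, 0.40]) + 0.005 * np.random.randn(3)  # noisy position (m)
    imu_q = np.array([0.0, 0.0, 0.0, 1.0])                             # identity quaternion
    p, q = fuser.update(cam_p, imu_q)
    print(f"step {t}: position {np.round(p, 3)}, orientation {q}")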

 

Stakeholders:

The student who chooses this project will work with a surgical robotics company. The student will evaluate the effectiveness of the handheld controller by teleoperating a surgical robot.

 

Essential requirements: Python 

Desirable requirements: Signal Processing 

2021-2022 MSc Robotics/Biorobotics

1. Reinforcement Learning for Dual-Arm Robotic Manipulation Based on Multi-Sensor Fusion

     Student: Pengyuan Wei

     Supervision teams: Dandan Zhang, Nathan Lepora

​

2. Wearable Controller with Haptic Feedback for Robot Control

     Student: Bharath Sajja

     Supervision teams: Dandan Zhang

​

3. Design of a two-finger robotic hand with tactile feedback

     Student: Fengrui Zhang

     Supervision teams: Dandan Zhang, Nathan Lepora

​

4. Soft robotic hand for in-hand manipulation

     Student: Xuehuai Feng

​

5. Surgical Gesture Segmentation and Recognition for Medical Robotics

     Student: Zhili Yuan

     Supervision teams: Dandan Zhang

​

6. Hand-Eye Coordination for Fruit Grasping

     Student: Pengcheng Gao

     Supervision teams: Dandan Zhang, Daniel Freer

​

7. Imitation Learning for Service Robot

     Student: Aaron Asamoah

     Supervision teams: Dandan Zhang, Virginia

​

8. Soft Robotic Hand with Tactile Feedback

     Student: Xuyang Zhang

     Supervision teams: Nathan Lepora, Dandan Zhang

​

9. Data-Efficient Imitation Learning for Dexterous Manipulation

     Student: Shipeng Xiong

     Supervision teams: Nathan Lepora, Dandan Zhang

​

10. A low-cost teleoperation system using Haptics and Mixed Reality

     Student: Enyang Feng

     Supervision teams: Nathan Lepora, Dandan Zhang

Example Project Descriptions

​

  • Title: Reinforcement Learning for Robotic Manipulation Based on Multi-Sensor Fusion

For robotic dexterous manipulation there are two primary sensing modalities: vision and touch. Visual feedback can guide the robot towards target objects, while tactile feedback provides useful information about the interaction between the object and the environment; the two modalities are complementary for contact-rich tasks. To this end, this project aims to explore multimodal representation learning, which could provide a robust, task-relevant representation that facilitates reinforcement learning with high efficiency.
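As a hedged sketch of the multimodal representation idea (assuming PyTorch and arbitrary feature sizes), the snippet below embeds vision and tactile features separately and fuses them into a single latent vector that a reinforcement-learning policy could consume; none of the dimensions or names come from the project itself.

# Toy fusion encoder: vision and tactile inputs are embedded separately and
# concatenated into one latent vector. Sizes are arbitrary illustrations.
import torch
import torch.nn as nn

class FusionEncoder(nn.Module):
    def __init__(self, vision_dim=128, tactile_dim=19, latent_dim=32):
        super().__init__()
        self.vision_net = nn.Sequential(nn.Linear(vision_dim, 64), nn.ReLU(),
                                        nn.Linear(64, latent_dim))
        self.tactile_net = nn.Sequential(nn.Linear(tactile_dim, 32), nn.ReLU(),
                                         nn.Linear(32, latent_dim))
        self.head = nn.Linear(2 * latent_dim, latent_dim)  # joint representation

    def forward(self, vision, tactile):
        z = torch.cat([self.vision_net(vision), self.tactile_net(tactile)], dim=-1)
        return self.head(z)

encoder = FusionEncoder()
vision_feat = torch.randn(8, 128)     # e.g. flattened CNN features per image
tactile_feat = torch.randn(8, 19)     # e.g. taxel pressures from one fingertip
print(encoder(vision_feat, tactile_feat).shape)   # torch.Size([8, 32])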

​

  • Title: Sim-to-Real Transfer Learning for Robotic Manipulation

Training a deep learning model for a robotic manipulator to locate and grasp a target object is notoriously costly and time-consuming: it requires a large amount of data, which is impractical for most real-world applications. This project aims to train the deep learning model in a simulation environment and apply a sim-to-real transfer learning technique to transfer the pre-trained model from the simulator to the physical environment.
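One commonly used ingredient of sim-to-real transfer is domain randomisation; the minimal sketch below simply re-samples a few simulator parameters every episode so that a policy cannot overfit to one setting. The parameter names and ranges are illustrative assumptions, not values used in this project.

# Hedged sketch of domain randomisation: physical parameters are re-sampled
# every training episode. All names and ranges are made up for illustration.
import numpy as np

def sample_sim_parameters(rng):
    return {
        "object_mass_kg": rng.uniform(0.05, 0.50),
        "friction_coeff": rng.uniform(0.4, 1.2),
        "camera_offset_m": rng.normal(0.0, 0.01, size=3),
        "latency_steps": rng.integers(0, 4),
    }

rng = np.random.default_rng(42)
for episode in range(3):
    print(f"episode {episode}:", sample_sim_parameters(rng))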

​

  • Title: Design of a two-finger robotic hand with tactile feedback

This project aims to develop a two-finger robotic hand that can grasp deformable objects, such as plastic bottles of various shapes. The robotic hand should integrate tactile sensors that provide feedback to the controller, which paves the way for an adaptive control algorithm to determine the best grasping strategy.
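A toy sketch of such an adaptive grasping loop is given below: the gripper closes in small steps until the tactile force reaches a target, so a deformable object is held without being crushed. The FakeGripper class stands in for real sensor and actuator drivers, and its stiffness model is invented purely for illustration.

# Illustrative closed-loop grasp sketch: close a two-finger gripper in small
# steps until the tactile force reaches a target. The stubbed gripper replaces
# the real tactile hand driver.
class FakeGripper:
    """Stand-in for the physical hand: force rises as the fingers close."""
    def __init__(self):
        self.width_mm = 40.0

    def close_step(self, step_mm=0.5):
        self.width_mm = max(0.0, self.width_mm - step_mm)

    def tactile_force(self):
        # Pretend contact starts at 30 mm and contact stiffness is 0.3 N/mm.
        return max(0.0, (30.0 - self.width_mm) * 0.3)

TARGET_FORCE_N = 2.0
gripper = FakeGripper()
while gripper.tactile_force() < TARGET_FORCE_N:
    gripper.close_step()
print(f"stopped at width {gripper.width_mm:.1f} mm, force {gripper.tactile_force():.2f} N")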

​

  • Title: Soft robotic hand for in-hand manipulation

This project aims to develop a soft robotic hand and demonstrate its potential for in-hand manipulation, that is, manipulating an object within one hand. This is a vitally important capability for robots performing real-world tasks, so it is worthwhile to investigate dexterous in-hand manipulation for reconfiguring an object. Soft robotic hands enable robots to grasp delicate or soft objects of varying shape, size, and pose without explicit object knowledge, and are promising for applications such as manipulating eggs or fruit. Machine learning approaches can be explored to mitigate the need for complex planning and control when performing in-hand manipulation with a soft robotic hand with dexterous fingers.

​

  • Title: Surgical Gesture Segmentation and Recognition for Medical Robotic Applications

Minimally invasive surgery mainly consists of a series of specific sub-tasks, which can be further decomposed into basic gestures or contexts. As a prerequisite for autonomous operation, surgical gesture recognition can assist motion planning and decision-making and construct context-aware knowledge that improves the quality of surgical robot control, which may lead to better clinical outcomes. This project aims to investigate Bayesian inference and explainable machine learning approaches for surgical gesture segmentation and recognition across multiple robotic platforms. The student is expected to have basic knowledge of machine learning and to be familiar with Python.
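To make the Bayesian-inference angle concrete, the toy sketch below runs a tiny Viterbi decoder that assigns one of three hypothetical gestures to each frame from made-up per-frame likelihoods and transition probabilities; it is only a sketch of the general idea, not the method the project will necessarily use.

# Toy Viterbi decoding over three hypothetical gestures; all numbers invented.
import numpy as np

gestures = ["reach", "grasp", "transfer"]
transition = np.array([[0.8, 0.15, 0.05],    # P(next gesture | current gesture)
                       [0.05, 0.8, 0.15],
                       [0.1, 0.05, 0.85]])
likelihood = np.array([                      # P(frame features | gesture), per frame
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.2, 0.7, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.2, 0.7],
])

def viterbi(likelihood, transition, prior=None):
    n_frames, n_states = likelihood.shape
    prior = np.full(n_states, 1.0 / n_states) if prior is None else prior
    log_delta = np.log(prior) + np.log(likelihood[0])
    backptr = np.zeros((n_frames, n_states), dtype=int)
    for t in range(1, n_frames):
        scores = log_delta[:, None] + np.log(transition)
        backptr[t] = scores.argmax(axis=0)
        log_delta = scores.max(axis=0) + np.log(likelihood[t])
    path = [int(log_delta.argmax())]
    for t in range(n_frames - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return [gestures[i] for i in reversed(path)]

print(viterbi(likelihood, transition))   # most likely gesture per frame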

​

  • Title: Human-Robot Cooperative Control for Medical Robotics

Teleoperation is widely used in medical robotic platforms. To ensure seamless and intuitive control, an optimized leader-follower control model should be built to assist the operator in a cooperative manner: the operator retains precise control when conducting delicate or complex manipulation, while accelerated movement towards the next distant target can also be achieved. Moreover, haptic guidance should be included in the control framework. Learning-from-demonstration algorithms can be explored to obtain a pre-trained model, which can provide haptic guidance signals for instructing trainees during teleoperation or for supervising novice surgeons to enhance their surgical skills. The student is expected to have basic knowledge of machine learning and to be familiar with Python/C++ as well as simulation.
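A minimal sketch of the variable motion-scaling idea is shown below: slow leader motions map almost one-to-one to the follower for precision, while faster motions are amplified to reach distant targets quickly. The gains and reference speed are illustrative assumptions, not values from any particular surgical platform.

# Sketch of speed-dependent motion scaling for leader-follower teleoperation.
import numpy as np

def scaled_increment(leader_velocity, dt=0.01, k_min=0.5, k_max=3.0, v_ref=0.05):
    """Map a leader hand velocity (m/s) to a follower displacement (m)."""
    speed = np.linalg.norm(leader_velocity)
    # Scaling grows smoothly from k_min (precise) towards k_max (accelerated).
    k = k_min + (k_max - k_min) * (1.0 - np.exp(-speed / v_ref))
    return k * leader_velocity * dt

for v in [np.array([0.01, 0.0, 0.0]), np.array([0.20, 0.0, 0.0])]:
    print(f"leader speed {np.linalg.norm(v):.2f} m/s -> follower step {scaled_increment(v)}")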

​

  • Title: Robotic Skill Learning for Cooking

With advances in robotic manipulation, machine learning, and sensing, intelligent robots are expected to assist humans in kitchens and restaurants. This project aims to develop assistive cooking robots that replicate human skills in order to reduce the burden of the cooking process.

Alumni

MRes/MSc Projects

2021, Imperial College London

  • Machine Learning-Based Robotic Manipulation
    Student: Peizhuo Yu (03/2021- 09/2021)
    Supervision teams: Dandan Zhang, Benny Lo

    Research output: MRes Thesis

​

2020, Imperial College London

  • Deep Learning-based Robot-Assisted Minimally Invasive Surgery
    Student: Ruoxiao Wang (03/2020- 09/2020)
    Supervision teams: Dandan Zhang, Benny Lo
    Research output: MRes Thesis (Distinction), 2 conference papers

 

  • Machine Learning-Based Human-Robot Shared Control for Robotic Surgery
    Student: Ruiqi Zhu (03/2020- 09/2020)
    Supervision teams: Dandan Zhang, Benny Lo
    Research output: MRes Thesis (Distinction), 1 conference paper

​

2019, Imperial College London

  • A Cooperative Framework for the da Vinci Research Kit
    Student: Junhong Chen (03/2019- 09/2019)
    Supervision teams: Dandan Zhang, Guang-Zhong Yang
    Research output: MRes Thesis (Distinction), 2 conference papers, 2 journal papers

​
