
2024 MSc Human and Biological Robotics & 2024 MSc Communications and Signal Processing


Tactile Perception and Robotic Hand Design:


Students:

Louis Flinn; Syed Hasan; Julian Ing

 

Project Description: 
In-hand manipulation involves adjusting the position and orientation of an object within the palm and fingers. This MSc project aims to design and prototype an affordable robotic hand that not only mimics the complex motions required for in-hand manipulation but also integrates tactile feedback to enhance manipulation capabilities. This project requires the student to push the boundaries of current robotic hand designs by incorporating tactile sensors and developing algorithms to interpret and respond to tactile data, thus enabling the hand to perform delicate and skillful in-hand manipulations.


Project Goals:
i)    To design a dexterous robotic hand with articulations that reflect the degrees of freedom necessary for in-hand manipulation; employ cost-effective manufacturing processes and materials, potentially including 3D printing, to construct the hand.
ii)    To integrate affordable tactile sensors that provide critical feedback for in-hand manipulation, such as slip detection and pressure distribution; map sensor placements effectively across the hand, focusing on areas that maximize manipulation control (a minimal slip-detection sketch follows this list).
iii)    To leverage open-source electronics and software platforms to manage costs and encourage community collaboration and innovation; enable the construction of a compact actuation and control system that can process sensory input and coordinate the actuation of the hand with precision and responsiveness.
iv)    To construct a working prototype that embodies the designed features and can be tested for functional capabilities; conduct a comprehensive series of tests that assess the in-hand manipulation skills of the robotic hand, including object reorientation, precision lifting, and controlled release.
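As an illustration of the kind of tactile-processing logic goal ii) calls for, the sketch below flags incipient slip from a rapid drop in total normal pressure across a hypothetical taxel array. The sensor layout, sampling rate, and thresholds are all assumptions, not project specifications.

```python
# Minimal slip-detection sketch (hypothetical sensor layout and thresholds).
# Assumes a tactile array that reports per-taxel normal pressure at a fixed
# sample rate; real sensors, rates, and thresholds will differ.
import numpy as np

SAMPLE_RATE_HZ = 100               # assumed tactile sampling rate
WINDOW = 10                        # samples per detection window
SLIP_DERIVATIVE_THRESHOLD = -0.5   # illustrative pressure-drop rate (N/s)

def detect_slip(pressure_history: np.ndarray) -> bool:
    """Flag incipient slip from a (WINDOW, n_taxels) pressure buffer.

    A rapid drop in total normal pressure while the grip is still engaged
    is treated as the onset of slip.
    """
    total = pressure_history.sum(axis=1)   # total grip force per sample
    dt = 1.0 / SAMPLE_RATE_HZ
    derivative = np.gradient(total, dt)    # force rate of change
    return bool(derivative.min() < SLIP_DERIVATIVE_THRESHOLD and total[-1] > 0.1)

# Example: a grip whose force collapses over the window triggers the detector.
buffer = np.linspace(2.0, 0.5, WINDOW)[:, None] * np.ones((WINDOW, 4))
print(detect_slip(buffer))  # True -> controller should tighten the grasp
```

In a real controller this check would run inside the sensor-polling loop, with the thresholds tuned on the actual hardware.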


Required Skills:
i)    Proficient knowledge of mechanical engineering principles and experience with 3D CAD modeling.
ii)    Experience with sensor technology and skills in electronic circuit design and familiarity with microcontroller programming and interfacing.
iii)    Practical experience in prototype building, including familiarity with rapid prototyping techniques.


Expected Outcomes: 
i)    A functioning prototype of an affordable robotic hand capable of in-hand manipulation with tactile feedback.
ii)    A set of test results demonstrating the hand's manipulation capabilities and the effectiveness of the tactile feedback system.
iii)    A final project report detailing the design process, control algorithms, technical challenges, performance evaluation, and recommendations for future development.


Sim-to-Real Transfer Learning for Robotic Manipulation:


Students:

Yudong Mao; Zhenyang Tao


Project Description: 
The objective of this MSc project is to leverage simulation environments for training deep learning models that control robotic arms for dexterous manipulation. The deep learning models that drive such manipulators require vast amounts of experimental data for training, which can be impractical and economically infeasible to collect in the real world. To this end, this project will focus on developing a robust sim-to-real transfer learning framework that allows a model pre-trained in a simulated environment to adapt effectively to a real-world setting.
The student will train a deep learning model to accurately identify and manipulate target objects in a virtual domain. This training will be performed using state-of-the-art simulation software that provides a high-fidelity approximation of real-world physics and environmental conditions. The core challenge lies in bridging the gap between the simulated and real environments. This project will not only address the practical challenges of training robotic manipulators but will also contribute to the broader understanding of transfer learning techniques, which are crucial for the deployment of intelligent systems in dynamic real-world scenarios.
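One widely used way to narrow the simulation-to-reality gap is domain randomization, in which physics and rendering parameters are re-sampled every training episode so the learned model cannot overfit to a single simulated world. The sketch below illustrates the idea in plain Python; the parameter names and ranges are placeholders, not values prescribed by the project.

```python
# Minimal domain-randomization sketch (illustrative parameter ranges).
# Each training episode samples new physics and rendering parameters so the
# learned perception/control model cannot overfit to one simulated world.
import random
from dataclasses import dataclass

@dataclass
class EpisodeParams:
    friction: float        # contact friction coefficient
    object_mass: float     # kg
    light_intensity: float # relative scene brightness
    camera_jitter: float   # metres of random camera offset

def sample_episode_params() -> EpisodeParams:
    """Draw one randomized configuration; ranges here are placeholders."""
    return EpisodeParams(
        friction=random.uniform(0.4, 1.2),
        object_mass=random.uniform(0.05, 0.5),
        light_intensity=random.uniform(0.5, 1.5),
        camera_jitter=random.uniform(0.0, 0.02),
    )

for episode in range(3):
    params = sample_episode_params()
    # A real pipeline would push these into the simulator before rolling out
    # the policy and collecting synthetic training data.
    print(f"episode {episode}: {params}")
```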


Project Goals:
i)    To develop a deep learning model capable of object detection and grasp point selection within a simulated environment.
ii)    To conduct model training within the simulation platform, utilizing synthetic data generated within the simulated environment and using advanced deep learning techniques that are resilient to the domain shift between simulation and reality.
iii)    To implement and fine-tune sim-to-real transfer learning algorithms to adapt the simulation-trained model to real-world conditions effectively (a minimal fine-tuning sketch follows this list).
iv)    To evaluate the performance of the transferred model on a physical robotic manipulator setup and iteratively refine the model through real-world training cycles to enhance accuracy and reliability.
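As a concrete illustration of goal iii), one common adaptation strategy is to freeze the simulation-trained feature extractor and fine-tune only the output head on a small set of labelled real-world images. The PyTorch sketch below uses a deliberately tiny stand-in architecture and dummy data; the real model, dataset, and hyperparameters are project-specific.

```python
# Minimal sim-to-real fine-tuning sketch in PyTorch (architecture and data
# are illustrative stand-ins, not the project's actual model).
import torch
import torch.nn as nn

# Stand-in for a network pre-trained on simulated images: a small backbone
# followed by a grasp-point regression head.
backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                         nn.AdaptiveAvgPool2d(1), nn.Flatten())
head = nn.Linear(16, 2)  # predicts a 2-D grasp point in image coordinates
model = nn.Sequential(backbone, head)

# Freeze the simulation-trained backbone; adapt only the head on the small
# real-world dataset to limit overfitting.
for p in backbone.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(head.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Dummy stand-ins for a handful of labelled real images.
real_images = torch.randn(8, 3, 64, 64)
real_grasps = torch.rand(8, 2)

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(real_images), real_grasps)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.4f}")
```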


Required Skills:
i)    Proficiency in deep learning frameworks such as PyTorch, TensorFlow, or Keras, with the ability to construct and train deep neural network architectures.
ii)    Knowledge of computer vision techniques and libraries, particularly OpenCV, for processing and analyzing visual data from both simulated and real-world environments.
iii)    Experience with robot simulation software, capable of simulating physics, robot kinematics, and environmental interactions to generate synthetic training data.


Expected Outcomes: 
i)    A deep learning model trained in a simulated environment for robot manipulation.
ii)    A tested and validated transfer learning pipeline for adapting simulation-trained models to the physical robot.
iii)    A comprehensive evaluation report detailing the model's performance, with insights into the challenges of sim-to-real transfer and potential solutions.
 

2024 MRes Medical Robotics and Image-Guided Intervention

Title: Enhancing Microrobot Control with Haptic Feedback in an Optical Tweezers-Based Micromanipulation System


Project Description:


Background:

Optical tweezers (OT) use a focused laser beam to trap and manipulate small micro-objects, enabling precise manipulation in biomedical applications such as cell manipulation, tissue engineering, and micro-assembly [1]. However, to avoid heat damage to targeted cells from direct laser exposure, we use OT to actuate microrobots that manipulate the cells indirectly [2].

 

Objective:

This project aims to develop a tailored haptic feedback system to augment user interaction and control with OT during micromanipulation [3]. A machine learning-based approach will be adopted to estimate the interaction force between the microrobot and the targeted micro-objects or cells during manipulation. The final goal is to integrate haptic feedback mechanisms that enable safe and precise 3D micromanipulation by preventing excess force on the targeted micro-object, thereby improving micromanipulation efficiency.

 

Basic Project Requirement:

i)     Review of Challenges: A comprehensive review should be provided detailing the challenges of micromanipulation, with a focus on indirect manipulation using an OT-based microrobotic system. This review will serve as the foundation for the rest of the project.

ii)    Force Estimation: A machine learning-based approach will be adopted to estimate the interaction force between the microrobot and the targeted micro-object from sequential microscopic image data (a minimal sketch of such an estimator follows this list).

iii)  Haptic Integration: Building upon the force estimation above, we will construct a haptic feedback mechanism tailored to the OT-based microrobotic system. This will enhance the user's intuitive connection with the microrobot by restoring touch information to complement the purely visual feedback of traditional OT-based manipulation.

iv)    System Evaluation: To validate the approach, the proposed system will be evaluated on a micro-assembly task, comparing its precision, efficacy, and accuracy against conventional micromanipulation methodologies.
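As a sketch of the force-estimation component in requirement ii), one plausible architecture encodes each microscope frame with a small CNN and aggregates the sequence with an LSTM to regress an interaction-force vector. Everything here (layer sizes, input shapes, the 3-D force output) is an illustrative assumption rather than the project's chosen design.

```python
# Minimal sketch of a learning-based force estimator (hypothetical
# architecture and shapes): a CNN encodes each microscope frame and an LSTM
# aggregates the sequence into an interaction-force estimate.
import torch
import torch.nn as nn

class ForceEstimator(nn.Module):
    def __init__(self, feat_dim: int = 32, hidden: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(          # per-frame feature extractor
            nn.Conv2d(1, 8, 5, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, feat_dim),
        )
        self.temporal = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)       # e.g. a 3-D force vector

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 1, H, W) grayscale microscope sequence
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        out, _ = self.temporal(feats)
        return self.head(out[:, -1])           # force at the last frame

# Smoke test with a dummy 8-frame sequence.
model = ForceEstimator()
print(model(torch.randn(2, 8, 1, 64, 64)).shape)  # torch.Size([2, 3])
```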

Title: Machine Learning-Based Perception of Microscopic Images for Biomedical Applications

 

Project Overview:
Microscopic images are central to biomedical research and clinical diagnostics. This project aims to harness machine learning to enhance the perception and interpretation of microscopic images in biomedical applications.


Project Objectives:
1) Develop Machine Learning Models: 
Design and implement machine learning algorithms tailored to the challenges and needs of microscopic image analysis in biomedicine. This could involve supervised learning for diagnostic tasks, unsupervised learning for the discovery of unknown patterns, and reinforcement learning for sequential decision-making in image analysis.
2) Enhance Perception of Microscopic Images: 
Apply the developed machine learning models to enhance the interpretation of microscopic images. This could involve tasks such as cell identification and classification, and real-time microrobot tracking (a classical tracking baseline is sketched after this list).
3) Validation and Evaluation: 
Validate and test the developed algorithms using real-world microscopic data sets. This should include quantitative measures of performance and comparison with traditional image analysis methods.
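For the comparison with traditional image analysis methods in objective 3, a classical baseline for real-time microrobot tracking can be as simple as thresholding followed by contour-centroid extraction with OpenCV. The sketch below uses a synthetic frame and placeholder threshold values; real microscope footage would need its own preprocessing and tuning.

```python
# Classical tracking baseline: threshold + largest-contour centroid.
# Frame source and threshold values are placeholders.
import cv2
import numpy as np

def track_centroid(frame_gray: np.ndarray) -> tuple[int, int] | None:
    """Return the (x, y) centroid of the largest bright blob, if any."""
    _, mask = cv2.threshold(frame_gray, 128, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

# Synthetic frame: a bright disc on a dark background stands in for the robot.
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(frame, (160, 120), 10, 255, -1)
print(track_centroid(frame))  # approximately (160, 120)
```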
 

Title: Tactile Robotics for Telemedicine

 

Project Overview:
The integration of robotics and telemedicine has paved the way for innovative healthcare services, allowing medical practitioners to remotely deliver care to patients. This project aims to design and implement robotic systems capable of delivering tactile feedback, providing healthcare professionals with a more immersive and interactive telemedicine experience.


Project Objectives:

1) Development of a Tactile Robotic System: Design and develop a teleoperated robotic system equipped with tactile sensors. The system should be able to mimic the movements and actions of a remote healthcare provider based on their control inputs.

2) Telemedicine Integration: Integrate the tactile robotic system with existing telemedicine infrastructure, ensuring seamless communication and operation between the healthcare provider and the robotic system.

3) Tactile Feedback Analysis: Develop algorithms to analyze the tactile feedback data obtained from the robotic system. This includes implementing machine learning techniques to interpret and understand tactile interaction information (a minimal classification sketch follows this list).

4) System Validation: Conduct rigorous testing and validation of the developed systems and algorithms. This involves creating controlled scenarios to evaluate the performance, reliability, and safety of the tactile robotic telemedicine system.
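To make objective 3 concrete, the sketch below trains a small classifier to label contact states from a hypothetical 16-taxel pressure frame. The data are synthetic and the labels, feature layout, and model choice are illustrative assumptions only.

```python
# Minimal sketch for interpreting tactile interactions (synthetic data,
# illustrative features): a small classifier labels contact states such as
# "no contact", "light touch", and "firm press" from taxel readings.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
LABELS = ["no_contact", "light_touch", "firm_press"]

# Synthetic 16-taxel pressure frames with increasing mean pressure per class.
def make_frames(mean: float, n: int) -> np.ndarray:
    return rng.normal(mean, 0.1, size=(n, 16)).clip(min=0.0)

X = np.vstack([make_frames(m, 100) for m in (0.0, 0.5, 2.0)])
y = np.repeat([0, 1, 2], 100)

clf = LogisticRegression(max_iter=1000).fit(X, y)
probe = make_frames(2.0, 1)                # one unseen "firm press" frame
print(LABELS[int(clf.predict(probe)[0])])  # expected: firm_press
```

On a real system, the synthetic frames would be replaced by labelled recordings from the hand's tactile sensors, and richer temporal features would likely be needed.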

Internship: Haptics Interface for Robotic Surgery
