
Human-Robot Interaction

Research Overview: We envision humans and robots collaborating within enhanced digital and physical spaces. More specifically, we integrate HRI with advanced digital technologies such as Augmented Reality, Mixed Reality, and Cloud-based Computing.

 

Objectives: We aim to create enriched interactive environments that enhance human-robot collaboration, leading to augmented efficiency and versatility.


Research Theme 1: Advanced Digital Technology


Through Augmented Reality (AR), robots can overlay critical information onto real-world scenes, providing users with contextual assistance. By sharing an AR view, robots can signal their intentions or highlight objects of interest, promoting more cohesive and predictable collaboration. Robots can also be programmed to facilitate Virtual Reality (VR)-based training modules, which are particularly useful for surgical practice.


Robots can use Mixed Reality (MR) to help engineers and designers visualize and manipulate both virtual and physical elements, fostering a comprehensive design process. MR interfaces let users control robots in distant or hazardous locations by immersing them in a real-time virtual representation of the robot's surroundings, enabling precise operations. By blending physical and digital elements, MR allows robots and humans to collaborate on tasks, bridging the gap between the virtual and real worlds.


In addition, with the support of Cloud-based Computing, robots can share experiences and learning instances, drawing from vast databases to enhance their capabilities. With cloud integration, robots can be operated, monitored, and even updated remotely, ensuring flexibility and immediate adaptability to dynamic tasks. Moreover, connecting to the cloud allows robots to utilize vast computational resources when needed, enhancing their problem-solving abilities without being limited by onboard hardware.
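The offloading trade-off described above can be sketched in a few lines: a robot sends a task to the cloud only when the estimated cloud execution time, including data upload and network round trip, beats onboard execution. All rates, thresholds, and task parameters below are illustrative assumptions, not values from a specific system.

```python
# Minimal sketch: deciding whether to run a task onboard or offload it
# to the cloud, based on estimated compute cost and network conditions.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    flops: float       # estimated compute required (FLOPs)
    payload_mb: float  # data to upload if offloaded

def should_offload(task, onboard_flops_per_s, cloud_flops_per_s,
                   uplink_mb_per_s, round_trip_s):
    """Offload when cloud execution (including transfer) is faster."""
    onboard_time = task.flops / onboard_flops_per_s
    cloud_time = (task.payload_mb / uplink_mb_per_s    # upload
                  + task.flops / cloud_flops_per_s     # remote compute
                  + round_trip_s)                      # network latency
    return cloud_time < onboard_time

# Example: a heavy perception task benefits from the cloud,
# while a light control-loop update stays onboard.
perception = Task("object_detection", flops=2e11, payload_mb=4.0)
control = Task("pid_update", flops=1e4, payload_mb=0.001)

print(should_offload(perception, 1e9, 1e13, 10.0, 0.05))  # True
print(should_offload(control, 1e9, 1e13, 10.0, 0.05))     # False
```

The same comparison generalizes to energy or bandwidth budgets; the key point is that offloading pays off only when the task's compute cost dominates its transfer cost.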

03

HuBotVerse: Towards Internet of Human and Intelligent Robotic Things
with a Digital Twin-Based Mixed Reality Framework

Dandan Zhang, Ziniu Wu, Jin Zheng, Yifan Li, Zheng Dong, Jialin Lin

Although the Internet of Robotic Things (IoRT) has enhanced the productivity of robotic systems in conjunction with the Internet of Things (IoT), it does not inherently support seamless human–robot collaboration. This article presents HuBotVerse, a unified framework designed to foster the evolution of the Internet of Human and Intelligent Robotic Things (IoHIRT). HuBotVerse is advantageous due to its unique features, including security, user-friendliness, manageability, and its open-source nature. Moreover, this framework can seamlessly integrate various human–robot interaction (HRI) interfaces to facilitate collaborative control between humans and robots. Here, we emphasize a digital twin (DT)-based mixed reality (MR) interface, which enhances teleoperation efficiency by offering users an intuitive and immersive way to interact with the robot. To evaluate the effectiveness of HuBotVerse, we conducted user studies based on a pick-and-place task. Feedback was gathered through questionnaires, complemented by a quantitative analysis of key performance metrics, user experience, and the National Aeronautics and Space Administration Task Load Index (NASA-TLX). Results indicate that the fusion of MR and HuBotVerse within a comprehensive framework significantly improves the efficiency and user experience of teleoperation. Moreover, the follow-up questionnaires reflect the advantages of the HuBotVerse framework in terms of user-friendliness, manageability, and usability in home-care or health-care applications.

 

For code, project videos, tutorials, technical details, case studies, and Q&A, please visit our website

(https://sites.google.com/view/iohirtplusmr/home).

Research Theme 2: Interaction Interfaces


Beyond verbal interactions, we are developing versatile communication channels – including gestures, haptic feedback, and visual cues – that allow robots to convey information in diverse and intuitive ways. We aim to design robots capable of delivering tactile feedback through haptic interfaces, enhancing the user experience, particularly in tasks requiring precision or sensory validation.

02

A Digital Twin-Driven Immersive Teleoperation Framework
for Robot-Assisted Microsurgery

Peiyang Jiang, Dandan Zhang

This paper presents a novel digital twin (DT)-driven framework for immersive teleoperation in the domain of robot-assisted microsurgery (RAMS). The proposed method leverages the power of DT to create an interactive, immersive teleoperation environment using mixed reality (MR) technology, allowing surgeons to conduct robot-assisted microsurgery with higher precision, improved safety, and greater efficiency. Simulated tissue can be highlighted in the DT, providing an intuitive reference for the operator to conduct micromanipulation safely, while the MR device provides operators with a real-time 3D visualization of a digital microrobot mimicking the motions of the physical robot, alongside 2D microscopic images.
We evaluated the proposed framework through user studies based on a trajectory-following task and compared performance with and without the framework. The NASA-TLX questionnaire, together with additional evaluation metrics such as total trajectory length, time cost, mean velocity, and a predefined collision metric, was used to analyze the user studies.
Results indicated that the proposed DT-driven immersive teleoperation framework can enhance the precision, safety, and efficiency of teleoperation and brings a satisfactory user experience to operators.
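As a rough illustration of the quantitative metrics used in such user studies, total trajectory length, time cost, and mean velocity can all be derived from timestamped tool-tip positions. The sampling data below is synthetic, and the paper's actual evaluation pipeline may differ.

```python
# Illustrative computation of trajectory metrics (total trajectory
# length, time cost, mean velocity) from timestamped 3D positions.
import math

def trajectory_metrics(samples):
    """samples: time-ordered list of (t_seconds, x, y, z) tuples."""
    total_length = 0.0
    # Sum Euclidean distances between consecutive position samples
    for (t0, *p0), (t1, *p1) in zip(samples, samples[1:]):
        total_length += math.dist(p0, p1)
    time_cost = samples[-1][0] - samples[0][0]
    mean_velocity = total_length / time_cost if time_cost > 0 else 0.0
    return {"total_trajectory": total_length,
            "time_cost": time_cost,
            "mean_velocity": mean_velocity}

# Synthetic example: a straight 3-unit move sampled at 1 Hz over 3 s
samples = [(0.0, 0.0, 0.0, 0.0),
           (1.0, 1.0, 0.0, 0.0),
           (2.0, 2.0, 0.0, 0.0),
           (3.0, 3.0, 0.0, 0.0)]
print(trajectory_metrics(samples))
```

A shorter total trajectory at comparable time cost indicates more economical motion, which is why these metrics complement the subjective NASA-TLX scores.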


01

Digital Twin-Driven Mixed Reality Framework for
Immersive Teleoperation With Haptic Rendering

Wen Fan, Xiaoqing Guo, Enyang Feng, Jialin Lin, Yuanyi Wang, Jiaming Liang, Martin Garrad, Jonathan Rossiter,
Zhengyou Zhang, Nathan Lepora, Lei Wei, Dandan Zhang


Teleoperation underpins a wide range of applications, making the design of intuitive and ergonomic control interfaces crucial. The rapid advancement of Mixed Reality (MR) has yielded tangible benefits in human-robot interaction: MR provides an immersive environment for interacting with robots, effectively reducing the mental and physical workload of operators during teleoperation. Additionally, incorporating haptic rendering, including kinaesthetic and tactile rendering, can further amplify the intuitiveness and efficiency of MR-based immersive teleoperation. In this study, we developed an immersive, bilateral teleoperation system integrating Digital Twin-driven Mixed Reality (DTMR) manipulation with haptic rendering. This system comprises a commercial remote controller with a kinaesthetic rendering feature and a wearable, cost-effective tactile rendering interface, called the Soft Pneumatic Tactile Array (SPTA). We carried out two user studies to assess the system's effectiveness, including a performance evaluation of key components within DTMR and a quantitative assessment of the newly developed SPTA. The results demonstrate an enhancement in both the human-robot interaction experience and teleoperation performance.

Main Contributions:

  1. Integrating the DT technique into an MR-based immersive teleoperation system, leading to the development of the Digital Twin-driven Mixed Reality (DTMR) framework.

  2. Building a novel haptic rendering system based on i) pneumatically-driven actuators for tactile feedback on human fingertips, and ii) a commercial haptic controller that enables kinaesthetic rendering and real-time motion tracking.

  3. Constructing a seamless bilateral teleoperation framework through the integration of immersive manipulation via DTMR and the novel haptic rendering device, which includes both kinaesthetic and tactile rendering.
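A highly simplified sketch of the bilateral loop outlined in these contributions: operator motion maps forward to the robot, while sensed contact maps back through two feedback channels, a kinaesthetic force on the controller and a tactile pressure on the fingertip array (as in the SPTA). All function names, scaling factors, and pressure limits here are hypothetical; the real system's interfaces are not described in the abstract.

```python
# Hypothetical single cycle of a bilateral teleoperation loop with
# both kinaesthetic and tactile feedback channels.

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def bilateral_step(operator_pose, sensed_force_n,
                   motion_scale=0.5, force_scale=0.8,
                   max_pressure_kpa=30.0):
    """One control cycle: returns (robot target, kinaesthetic force,
    tactile pressure command). All parameters are illustrative."""
    # Forward channel: scaled position mapping to the robot
    robot_target = tuple(motion_scale * c for c in operator_pose)

    # Backward channel 1: kinaesthetic rendering on the haptic controller
    kinaesthetic_force = force_scale * sensed_force_n

    # Backward channel 2: tactile rendering, contact force mapped to a
    # pneumatic pressure command, saturated for comfort and safety
    tactile_pressure = clamp(sensed_force_n * 5.0, 0.0, max_pressure_kpa)

    return robot_target, kinaesthetic_force, tactile_pressure

# Example cycle: operator at (0.2, 0.0, 0.1), 2 N sensed contact force
target, f_kin, p_tac = bilateral_step((0.2, 0.0, 0.1), sensed_force_n=2.0)
print(target, f_kin, p_tac)
```

Keeping the two feedback channels separate in this way reflects the paper's split between the commercial controller (kinaesthetic) and the pneumatic array (tactile), though the actual control law and units used in DTMR are not given here.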
