Haptics Research

IISRI uses, develops and tests haptic interfaces for medical, military and commercial applications.

We have developed many bespoke products, including a patented, multi-point haptic user interface that allows multi-finger manipulation of a distant object.

Our renowned engineers provide all haptic-related research, development and commercialisation activities, from concept generation and prototyping to turn-key solutions.

What happens in haptics

Haptics – the science of applying tactile sensation and control to interaction with computer applications – has an incredible range of uses. Here at IISRI our research aims to make a significant difference to a large number of industries – from optometry to virtual training programs to art for the sightless.

HEARing art

Our Haptic-Enabled Art Realisation (HEAR) research project proposes a technological platform that will allow blind and visually impaired people to physically 'feel' the visual information contained within 2D visual art. Not only will this open up art to those previously unable to access it, it will also add a new dimension to the communication media used within wider society.

Featured staff member

Dr Zoran Najdovski is a Senior Research Fellow in Multipoint Haptics at IISRI. While virtual reality is not the alien concept it was some years ago, Dr Najdovski, along with Professor Saeid Nahavandi, was at the forefront back in 2009. Together they helped design the Haptic Gripper – a revolutionary grasping device that went on to win on The New Inventors television show.

See Dr Najdovski on The New Inventors

Read Dr Najdovski's profile

Research topics

Augmented optometry training simulator with multi-point haptics

The training of optometrists is traditionally achieved under the close supervision of peers and superiors. Advances in technology mean medical procedures are performed more efficiently and effectively, resulting in faster recovery times and less trauma to the patient. However, this technology has made the required manual skills harder to demonstrate and teach: education now combines the medical procedure itself, which would otherwise need a patient present for realism, with the use of the technology.

This project proposes to increase the capabilities of optometry students through haptically-enabled single-point and multi-point training tools, as well as augmented reality techniques. Haptics technology allows a human to touch and feel virtual computer models as though they are real. These devices have played an increasing role in developing expertise, reducing instances of medical error and lowering training costs.
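At its simplest, letting a user "touch and feel" a virtual model means converting how far the tool has pushed into the model's surface into a restoring force. A minimal penalty-based sketch of this idea, with an illustrative stiffness value and sphere geometry (not parameters from the project):

```python
import numpy as np

def contact_force(tool_pos, sphere_centre, sphere_radius, stiffness=800.0):
    """Penalty-based force for a tool point touching a virtual sphere.

    Returns a force (N) pushing the tool out along the surface normal;
    zero when there is no contact. Stiffness is in N/m.
    """
    offset = np.asarray(tool_pos, dtype=float) - np.asarray(sphere_centre, dtype=float)
    dist = np.linalg.norm(offset)
    penetration = sphere_radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return np.zeros(3)          # tool outside the surface (or at the centre)
    normal = offset / dist          # outward surface normal
    return stiffness * penetration * normal

# Tool 5 mm inside a 50 mm-radius sphere: force pushes outward along +x.
f = contact_force([0.045, 0.0, 0.0], [0.0, 0.0, 0.0], 0.05)
```

In a real device this calculation runs in a high-rate control loop (typically around 1 kHz) so the rendered surface feels stiff rather than spongy.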

The proposed training environment, integrated with an optometry slit lamp instrument, can be used to teach cognitive and manual skills while the system tracks the performance of each individual.

A researcher demonstrates the haptic device. They peer into the slit lamp and their hand holds the stylus of the device, which examines a human model’s face.
Figure: An IISRI researcher demonstrating the completed system. The haptic device provides realistic force feedback, while corresponding imagery from the computer simulation is passed to the operator via the embedded screens.

Advanced dynamic simulation and analysis of firearm training through haptics and motion capture

The aim of this work was to develop a way to evaluate firearm discharge training tasks. Advanced haptic and motion capture technologies were applied to:

  • deliver dynamic, immersive shooting simulation
  • enable accurate training analysis
  • support decision-making tasks.

This research focused on detailed analysis and realistic modelling of firearm discharge procedures, as well as force modelling and temporal-based calculation and rendering for specific firearms. The effectiveness and efficiency of physics-engine based dynamic simulation, as well as motion capture and motion analysis in assisting firearm training simulation, was also evaluated.
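Temporal-based force rendering of a discharge means streaming a time-dependent recoil pulse to the haptic device. A sketch of one such profile, with a linear rise followed by exponential decay; the peak force and time constants below are placeholders, not measured firearm data:

```python
import math

def recoil_force(t, peak=40.0, rise=0.002, decay=0.015):
    """Illustrative recoil pulse (N) at time t seconds after discharge:
    linear rise to the peak over `rise` seconds, then exponential decay.
    All parameters are placeholders, not a profiled firearm."""
    if t < 0.0:
        return 0.0
    if t < rise:
        return peak * t / rise
    return peak * math.exp(-(t - rise) / decay)

# Sampled at a 1 kHz haptic update rate, the profile is streamed to the
# device one value per control cycle.
samples = [recoil_force(i / 1000.0) for i in range(50)]
```

Profiling different firearms then amounts to fitting different parameter sets (or recorded curves) into the same rendering loop.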

The outcomes of this research are:

  • profiled recoil simulation of different firearms
  • device independent middleware for visuo-haptic integration
  • physics engine-based collision detection and dynamic simulation
  • motion capture and data analysis for haptic-based firearm shooting.

Introducing robotics and force feedback to crowd simulation packages

We propose a pipeline to effectively integrate haptic interactions into the DI-Guy simulation environment, with the goal of improving user interactivity with the avatars in the scenario. By implementing such a pipeline, simulation packages will be capable not only of enhancing control over certain actions of avatars, but also of providing realistic force feedback to the user.

Unlike a typical haptic integration approach, which relies on high-level libraries and existing haptic collision detection and force rendering algorithms, we implement our own novel algorithm. This is necessary because we do not have direct access to low-level primitives such as vertices and normals in the virtual environment.
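When mesh primitives are unavailable, collision handling has to work from whatever high-level state the package exposes. As a rough sketch (not the project's algorithm), each avatar can be reduced to a centre position and a bounding radius, with a repulsive force applied while the tool is inside that bound; the names and stiffness value are illustrative:

```python
import numpy as np

def avatar_contact(tool_pos, avatars, stiffness=300.0):
    """Approximate collision response using only high-level avatar state.

    `avatars` is a list of (centre, bounding_radius) pairs - the kind of
    coarse information a closed simulation package might expose.
    Returns the summed repulsive force on the haptic tool.
    """
    force = np.zeros(3)
    p = np.asarray(tool_pos, dtype=float)
    for centre, radius in avatars:
        offset = p - np.asarray(centre, dtype=float)
        dist = np.linalg.norm(offset)
        if 0.0 < dist < radius:     # tool inside the avatar's bounding sphere
            force += stiffness * (radius - dist) * offset / dist
    return force
```

The trade-off is obvious: contact feels rounded rather than surface-accurate, but no access to vertices or normals is needed.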

Figure: An operator using a haptic device to control the manipulator of a robot in a simulation environment (left), a screenshot of the same robot interacting with an injured individual in a virtual environment (right).

Virtual grasp: accurate data-driven multi-point haptics

In recent years, haptic devices have started to evolve towards multi-point interaction systems. However, these advances in hardware have not necessarily been followed by similar advances in software. 

The haptic rendering techniques used in most of these systems are simple and inaccurate non-physical methods. This has prevented newly developed systems from being used in areas like the medical domain or in quality assurance processes, where a user needs to make judgements based on interactions with a deformable model.

Figure: Rig used to capture the material properties of a 50 mm silicone cube using a delta-style haptic device, fitted with a bespoke multi-point haptic end effector.

Data-driven methods provide a fast solution with reliable accuracy. Research using these methods has achieved an acceptable mean square error between the built model's performance and the actual training data. The difficulty with data-driven models is that they need to be modified to account for additional points of contact. This requires data-driven modelling techniques to be scalable and able to incorporate additional parameters.

In our research, we combine data-driven modelling algorithms and a novel multi-point haptic gripper system to enable an accurate multi-point interaction simulation. The system can model non-linear complex behaviours of deformable models in real time. Data is collected by measuring the offline interactions with a real deformable object. The system then enables users to differentiate between different objects based on geometry and material.

Modelling force interactions with heterogeneous deformable objects

The virtual simulation of heterogeneous objects and their behaviour in different scenarios is vital for many applications. Heterogeneous objects can be defined as objects composed of different constituent materials that can exhibit continuously varying composition and/or microstructure, producing a gradation in their properties. In order to model heterogeneous objects, both an appropriate static representation and a dynamic model are required.

This research targets the challenges of accurately simulating interaction with heterogeneous deformable models. To fully simulate heterogeneous objects, robust representation and behavioural models are required. The best representation should be able to satisfy certain criteria such as intuitiveness and accuracy with high speed.

Data-driven haptic simulation of interactions with deformable models

The haptic feedback of the interaction with virtual deformable objects is important for many applications, such as minimally invasive surgery, rapid prototyping and virtual entertainment. Pairing the haptic sense with visual and audio senses provides the user with a great level of detail and ensures proper usage of the virtual environment. 

The two main features and design goals of a haptic system are speed and accuracy, or physical fidelity. Physical fidelity means that the deformable object's motion stays very close to the constitutive laws of continuum mechanics. This ensures realism and an adequate response for different materials and structures.

There are two broad families of simulation algorithms: parametric ones, such as finite element methods, which are usually accurate but slow; and data-driven ones, which have no explicit model and simulate using collected data. The latter can be very fast, as most of the processing is done offline.

The objective of this research is to study the current data-driven techniques, investigate their applicability and overcome their disadvantages. The data-driven procedure includes data collection, analysis, utilisation and validation.

Towards a parameter-less 3D mesh segmentation

There are numerous techniques that target the 3D segmentation problem. However, the research is still in its infancy compared with related areas such as 2D image segmentation. A major problem in the field is that it is application-dependent.

In this work, we propose a novel algorithm. Firstly, it does not need any object-specific parameters; tuning occurs as a global controlling mechanism for the desired output. Secondly, it follows cognitive theory and the human approach to 3D segmentation, considering different 2D projections (poses) of the object – which is typically what a human observer perceives. Finally, it can be used with theories other than the cognitive theory and still produce meaningful results.

In addition, we propose a mechanism for locating an antipodal vertex given another vertex. This can help not only in 3D segmentation but also in related applications such as 3D skeletonisation and skinning, which are closely tied to the segmentation problem.
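The project's actual antipodal mechanism is not detailed here, but a naive baseline conveys the idea: given a query vertex, pick the mesh vertex lying furthest along the opposite direction through the centroid. Everything below is an illustrative sketch, not the proposed algorithm:

```python
import numpy as np

def antipodal_vertex(vertices, index):
    """Naive antipodal lookup: return the index of the vertex furthest
    along the direction opposite the query vertex, through the centroid.
    Illustrative baseline only - not the project's mechanism."""
    verts = np.asarray(vertices, dtype=float)
    centroid = verts.mean(axis=0)
    direction = centroid - verts[index]       # points "away" from the query vertex
    norm = np.linalg.norm(direction)
    if norm == 0.0:
        raise ValueError("query vertex coincides with the centroid")
    direction /= norm
    scores = (verts - centroid) @ direction   # projection onto that direction
    return int(np.argmax(scores))

# On a unit cube, the antipode of a corner is the opposite corner.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
```

A geodesic (surface-distance) formulation would be more faithful for concave shapes, at higher cost.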

MUSTe method for measuring the human side of virtual environment efficacy

Virtual training systems are attracting a lot of attention from manufacturing industries because of their potential advantages over conventional training practices. Shorter timeframes for the development of different training scenarios, as well as reuse of existing engineering (mathematical) models, can lead to significant cost savings.

This research presents a newly developed virtual environment (VE) for training in procedural tasks, such as object assembly. In addition, a novel evaluation framework has been proposed to evaluate system efficacy through large-scale user testing. 

Results confirm the practical significance of evaluating a VE design by using samples of real and representative users through the discovery of usability problems and system deficiencies. Results also indicate the benefits of collecting multimodal information for accurate and comprehensive assessment of system efficacy. Evaluation results and improvements to existing designs are also presented.

Haptically assisted microrobotic cell injection

This project is a collaborative effort between IISRI and the Department of Mechanical Engineering at the University of Canterbury, New Zealand.

The aim is to develop a haptically enabled microrobotic cell injection system that will enhance human-in-the-loop intracellular injection, offering significant benefits over conventional techniques.

Currently, in manual cell injection, the operator is limited to visual feedback and unable to adequately perceive the microscale environment. This results in extended training times, as well as poor success rates and repeatability.

Our research proposes an approach that integrates the operator's haptic interface modality by investigating haptic bilateralism. It also introduces a mapping framework, resulting in an intuitive method that allows the operator to manoeuvre the micropipette in a manner similar to hand-held needle insertion.
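As a rough illustration of the mapping idea, hand-scale stylus motion can be linearly scaled down into the micropipette's workspace, so millimetres of hand movement become micrometres at the tip. The scale factor and offset below are placeholders, not the project's calibrated mapping:

```python
import numpy as np

def map_stylus_to_pipette(stylus_pos, scale=0.001, offset=(0.0, 0.0, 0.0)):
    """Illustrative linear workspace mapping: millimetre-scale stylus
    motion (m) is scaled down to micrometre-scale micropipette motion.
    `scale` and `offset` are hypothetical, not calibrated values."""
    return scale * np.asarray(stylus_pos, dtype=float) + np.asarray(offset, dtype=float)

# A 20 mm hand movement becomes a 20 um pipette movement.
tip = map_stylus_to_pipette([0.020, 0.0, 0.0])
```

The same mapping, run in reverse with a force scale, is what makes it possible to render microscale interaction forces back to the operator's hand at a perceptible magnitude.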

Aside from improvements in manoeuvring the microrobot, this mapping establishes the basis for rendering haptic information to the operator. A method is then introduced to model cell mechanics and haptically display the cell indentation force to the operator. New parabolic and conical potential field haptic virtual fixtures are then introduced to help the operator penetrate the cell at a precise location. Results demonstrate the successful operation of the system.
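A potential-field virtual fixture guides the hand by attaching a potential to deviations from the desired insertion line, so its gradient produces a corrective force. A sketch of the parabolic (quadratic) case, with an illustrative gain rather than the project's tuned parameters:

```python
import numpy as np

def fixture_force(tip_pos, target, axis, k=50.0):
    """Parabolic potential-field fixture (illustrative gain `k`): a
    quadratic potential 0.5*k*|lateral|^2 over the lateral distance from
    the insertion line through `target` along `axis` yields a spring-like
    force pulling the pipette tip back onto that line."""
    a = np.asarray(axis, dtype=float)
    a = a / np.linalg.norm(a)
    offset = np.asarray(tip_pos, dtype=float) - np.asarray(target, dtype=float)
    lateral = offset - (offset @ a) * a   # component perpendicular to the axis
    return -k * lateral                   # negative gradient of the potential
```

A conical fixture differs in that the force magnitude is constant with lateral distance (a linear potential), giving a firmer "wall" far from the axis.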

Haptically enabled grasping system for enhanced tele-surgery

Haptically enabled tele-operation systems are designed to improve task precision and efficiency through increased user capability. In remote (tele-operated) surgery, a surgeon controls a robot to perform a highly delicate task within an extremely small workspace. A robot can outperform a human here, as its movement accuracy and precision are much greater.

Existing tele-surgery systems are unable to present the surgeon with anything more than basic visual feedback. Applications such as endoscopy require delicate movement within the body to minimise damage to the cavity walls. This research will present a novel multi-point haptic grasping interface in combination with a sensorised robotic end-effector for force-feedback tele-operation. It will also develop a generic system that can be applied to procedures requiring grasping force feedback for dexterous telemanipulation.

A haptically enabled CAN-based steering wheel controller

With the increasing prevalence of portable devices (phones, MP3 players etc) in cars, driver distraction has become a major cause of car accidents around the world. 

To help alert drivers to their distraction, active safety technologies such as lane departure warning systems and collision avoidance systems are being implemented.

The issue with implementing yet another piece of technology into cars is how to cut through the competing demands of phones, sat navs and other technologies. Haptic alerts present a method that may enable the system to short-circuit the normal auditory or visual communication channels. 

This research investigates the design and development of a low-cost haptic steering wheel and its controller, and their integration into a vehicle via CAN, so the wheel can serve as a communication device for a lane departure, collision avoidance or other type of safety system.
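Integration via CAN means the safety system triggers the wheel's vibration by sending frames the controller decodes. A minimal sketch of such a payload encoding; the message ID and byte layout below are hypothetical (a real integration would follow the vehicle's actual CAN database), and transmission itself is left to a CAN interface library:

```python
import struct

# Hypothetical CAN layout for the haptic alert:
#   byte 0:    alert type (1 = lane departure, 2 = collision warning)
#   byte 1:    vibration intensity, 0-255
#   bytes 2-3: pulse duration in ms, little-endian
HAPTIC_ALERT_ID = 0x3E8  # placeholder arbitration ID

def encode_alert(alert_type, intensity, duration_ms):
    """Pack a haptic-alert payload for the steering wheel controller."""
    if not 0 <= intensity <= 255:
        raise ValueError("intensity must fit in one byte")
    payload = struct.pack("<BBH", alert_type, intensity, duration_ms)
    return HAPTIC_ALERT_ID, payload

# Lane-departure warning: medium vibration for 300 ms.
can_id, data = encode_alert(1, 128, 300)
```

Keeping the alert semantics in a compact, well-defined frame is what lets the same wheel serve several safety systems without per-system wiring.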

Haptically enabled interactive and immersive virtual assembly training

Virtual training systems are gaining a lot of attention from manufacturing industries due to their potential advantages over conventional training practices, such as those used for general assembly.

We are researching a haptically enabled interactive and immersive virtual reality (HIVEx) system. The aim of this system is to imitate real assembly training scenarios by providing comprehensive user interaction, as well as enforcing realistic physical constraints within the virtual environment. The developed system employs a modular system approach providing flexibility of reconfiguration and scalability, as well as better utilisation of the current multi-core computer architecture. The user interacts with the system using a haptics device and data glove while fully immersed in the virtual environment with depth perception. 

An evaluation module incorporated into the system automatically logs and evaluates information throughout the simulation, tracking user performance improvements over time. A rugged portable version of the system has also been developed, with full system capabilities, allowing easy relocation between different factory environments.

Haptics interface for modelling and simulation of flexible objects

Virtual reality (VR) systems are ideal for product and process design, virtual training practices and ergonomic analysis. Automotive industries are considered to be the leaders in applying VR solutions to real-world problems. 

Although a number of commercial 3D engineering tools for digital mock-ups exist, most of them lack intuitive direct manipulation and are constrained by interaction with mainly rigid objects. 

To bridge this gap, we've developed a haptics-enabled interface for the modelling and simulation of flexible objects. Used in conjunction with our virtual assembly training system, this interface supports product/process design, ergonomic analysis and evaluation, and assembly training, incorporating both rigid and flexible objects.

The developed interface uses a generic communication architecture and can accommodate haptic devices from multiple vendors, including SensAble, Haption, Force Dimension and Novint.

Modelling of mechanical properties of living cells for micromanipulation systems

Micromanipulation of biological cells is widely undertaken in medical and cell-related research. Drug discovery, functional genomics and toxicology are some examples of this regularly performed cell manipulation.

The performance of cell manipulation can be improved by deploying robotics and intelligent systems. Modelling of biological cells is essential for robotic or haptic cell manipulation. This requires an appropriate model of mechanical behaviours of cells under external stimulus. 

The main goal of modelling in this research is to quantitatively evaluate the mechanical properties and responses of cells when subjected to stimulation and/or perturbation, considering both the accuracy and the real-time functionality of the model. Cell structure, deformation and behaviour under cell injection and aspiration processes are studied from kinematic, fluidic and dynamic aspects. The model is intended for deployment in haptics and robotics applications.