Centre for Intelligent Systems Research

Haptics Research Lab
(funded by the Australian Research Council)

Augmented Optometry Training Simulator with Multi-point Haptics

Training of optometrists is traditionally achieved under the close supervision of peers and superiors. With rapid advancements in technology, medical procedures are performed more efficiently and effectively, resulting in faster recovery times and less trauma to the patient. However, this technology has made the required manual skills more difficult to demonstrate and teach, as education now covers not only the medical procedure itself but also the use of the technology.

Overview of the optometry training simulator system architecture

This project aims to increase the capabilities of optometry students through haptically-enabled single-point and multi-point training tools as well as augmented reality techniques. Haptics technology allows a human to touch and feel virtual computer models as though they were real. Through their physical connection to the operator, haptic devices can be considered personal robots capable of improving human-computer interaction with a virtual environment. These devices have played an increasing role in developing expertise, reducing instances of medical error and reducing training costs.

The proposed haptically-enabled virtual optometry training environment, integrated with an optometry slit lamp instrument, can be used to teach cognitive and manual skills while the system tracks the performance of each individual.

The physical optometry training simulator

The above figures show an overview of the system architecture and the physical simulator, with three major components:

  1. Hardware/software system incorporating a slit lamp instrument, an external computer, a dummy head model with markers on the eyes, as well as visual rendering, collision detection and force rendering algorithms for single- and multi-point haptic technology.
  2. Visual rendering pipeline with AR, which uses a webcam for input from AR markers and an HMD for visual output to the trainee.
  3. Haptic rendering pipeline supporting single- and multi-point haptic devices. The haptic device is mounted onto the slit lamp instrument, so calibration, interfacing and synchronisation are required; a sketch of the two decoupled rendering loops follows this list.
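
Since the visual and haptic pipelines run at very different rates, they are typically decoupled into separate loops. The following is a minimal Python sketch of that structure, under stated assumptions: the device read, force computation and AR/HMD rendering calls are illustrative stand-ins, not this project's actual API.

    # Sketch: decoupled visual (~60 Hz) and haptic (~1000 Hz) loops sharing state.
    import threading, time

    lock = threading.Lock()
    state = {"tool_pos": (0.0, 0.0, 0.0)}

    def get_tool_position():            # stand-in for the haptic device read
        return (0.0, 0.0, 0.0)

    def compute_force(pos):             # stand-in for collision + force rendering
        k = 300.0                       # simple virtual spring toward the origin
        return tuple(-k * p for p in pos)

    def haptic_loop(stop):
        while not stop.is_set():
            t0 = time.perf_counter()
            pos = get_tool_position()
            force = compute_force(pos)  # would be sent to the device here
            with lock:
                state["tool_pos"] = pos
            time.sleep(max(0.0, 0.001 - (time.perf_counter() - t0)))  # ~1000 Hz

    def visual_loop(stop):
        while not stop.is_set():
            t0 = time.perf_counter()
            with lock:
                snapshot = dict(state)  # consistent tool pose for this video frame
            # AR-marker tracking (webcam in) and HMD rendering would happen here
            time.sleep(max(0.0, 1 / 60 - (time.perf_counter() - t0)))  # ~60 Hz

    # e.g. threading.Thread(target=haptic_loop, args=(stop_event,), daemon=True).start()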


Advanced Dynamic Simulation and Analysis of Firearm Training through Haptics and Motion Capture

The aim of this work was to develop an extensible framework to effectively evaluate firearm discharge training tasks. Advanced haptic and motion capture technologies were applied to deliver dynamic, immersive shooting simulation, enable accurate training analysis and support decision making tasks.

This research focussed on detailed analysis and realistic modelling of firearm discharge procedures, as well as force modelling and time-based force calculation and rendering for specific firearms. The effectiveness and efficiency of physics-engine based dynamic simulation, as well as motion capture and motion analysis, in assisting firearm training simulation were also evaluated.
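
As a concrete illustration of time-based force rendering for a specific firearm, the sketch below replays a per-weapon recoil profile one sample per haptic tick. The profile shape and all numbers are illustrative assumptions, not measured data from this project.

    # Sketch: replaying a per-firearm recoil force profile at the haptic rate.
    import numpy as np

    HAPTIC_RATE = 1000  # Hz

    def make_recoil_profile(peak_n, rise_ms, decay_ms):
        """Linear-rise / exponential-decay recoil impulse (illustrative shape)."""
        rise = np.linspace(0.0, peak_n, int(rise_ms * HAPTIC_RATE / 1000))
        t = np.arange(int(decay_ms * HAPTIC_RATE / 1000)) / HAPTIC_RATE
        decay = peak_n * np.exp(-t / (decay_ms / 3000.0))
        return np.concatenate([rise, decay])  # one force sample per haptic tick

    profiles = {
        "pistol": make_recoil_profile(peak_n=18.0, rise_ms=4, decay_ms=40),
        "rifle":  make_recoil_profile(peak_n=35.0, rise_ms=6, decay_ms=80),
    }

    def recoil_samples(weapon):
        """Yield the axial force (N) tick by tick after trigger discharge."""
        for f in profiles[weapon]:
            yield f  # each value would be sent to the haptic device that tick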

The outcomes of this research are:

  • Profiled recoil simulation of different firearms
  • Device independent middleware for visuo-haptic integration
  • Physics engine-based collision detection and dynamic simulation
  • Motion capture and data analysis for haptic-based firearm shooting

Major technological areas and novelties of the framework


Introducing Robotics and Force Feedback to Crowd Simulation Packages

We propose a pipeline to effectively integrate haptic interactions into the DI-Guy simulation environment, with the goal of improving user interactivity with the avatars in the scenario. By implementing such a pipeline, simulation packages will be capable of not only enhancing control over certain actions of avatars, but also providing realistic force feedback to the user. Unlike typical haptic integration approaches, which are based on high-level libraries and existing haptic collision detection and force rendering algorithms, we implement our own novel algorithm based on the fact that we do not have direct access to low-level primitives such as vertices and normals in the virtual environment. In addition, we implemented our own vendor-independent middleware as part of our algorithm to provide a generic solution for integrating natural interfaces such as haptics and a 3D space mouse with commercial software, enabling the building of a complete immersive virtual training system.
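
The middleware's role can be pictured as a thin device-abstraction layer. Below is a minimal Python sketch of such a vendor-independent interface; the class and method names are hypothetical illustrations, not the middleware's actual API.

    # Sketch: a vendor-independent abstraction over natural interfaces.
    from abc import ABC, abstractmethod

    class InputDevice(ABC):
        """Common interface for a haptic arm, 3D space mouse, etc."""

        @abstractmethod
        def pose(self):
            """Return (position_xyz, orientation_quat) in a shared world frame."""

        @abstractmethod
        def apply_force(self, fx, fy, fz):
            """Send a force command; passive devices may ignore it."""

    class SpaceMouseAdapter(InputDevice):
        def pose(self):
            return (0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0)  # read from vendor SDK
        def apply_force(self, fx, fy, fz):
            pass  # a space mouse has no actuators, so force output is a no-op

    # Simulation-facing code talks only to InputDevice, so DI-Guy integration
    # needs just one thin adapter per vendor's hardware.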

Figure 1: Hardware setup (left); Simulated environment (right)
Figure 2: Rendering pipeline (left); Data flow diagram (right)


Virtual Grasp: Accurate Data-Driven Multi-Point Haptics

In recent years, haptic devices have started to evolve towards multi-point interaction systems. However, such advances in hardware design have not necessarily been followed by similar software improvements. The haptic rendering techniques used in most of these systems are simple and inaccurate non-physical methods. This has prevented newly developed systems from being utilised in real applications, such as the medical domain or quality assurance processes, where a user needs to make judgements based on interactions with a deformable model. Haptic rendering in these scenarios must be physically accurate, which can be achieved by incorporating continuum mechanics methods such as finite element methods (FEM) or data-driven methods. However, the computational cost of FEM makes it difficult to satisfy the haptic loop requirement of a 1000 Hz refresh rate, and this is especially noticeable in multi-point haptic interactions.

The data collection system

Data-driven methods provide a fast solution with reliable accuracy. Research using these methods has achieved an acceptable mean square error (MSE) between the built model's output and the actual training data. The difficulty with data-driven models is that they need to be modified to account for additional points of contact. This requires data-driven modelling techniques to be scalable and able to incorporate additional parameters that accurately describe the behaviour.

In this work, we combine data-driven modelling algorithms and a novel multi-point haptic gripper system to enable accurate multi-point interaction simulation. The system can model the nonlinear, complex behaviour of deformable models in real time. Data is collected offline by measuring interactions with a real deformable object. The system then enables users to differentiate between objects based on geometry and material.
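
A minimal sketch of the data-driven lookup idea follows, assuming offline recordings of contact displacements and measured forces; the inverse-distance k-nearest-neighbour interpolation here is an illustrative stand-in for the project's actual model.

    # Sketch: interpolating recorded (displacement -> force) samples at runtime.
    import numpy as np

    class DataDrivenForceModel:
        def __init__(self, displacements, forces, k=4):
            self.X = np.asarray(displacements)  # (N, d) recorded displacements
            self.F = np.asarray(forces)         # (N, d) measured reaction forces
            self.k = k

        def force(self, x):
            d = np.linalg.norm(self.X - np.asarray(x), axis=1)
            idx = np.argsort(d)[: self.k]
            w = 1.0 / (d[idx] + 1e-9)           # inverse-distance weights
            return (w[:, None] * self.F[idx]).sum(axis=0) / w.sum()

    # Example: two contact points (6-D input) from a gripper squeezing a cube.
    rng = np.random.default_rng(0)
    X = rng.uniform(-0.01, 0.01, (500, 6))          # displacements (m), synthetic
    F = -800.0 * X + rng.normal(0, 0.05, X.shape)   # stand-in forces (N)
    model = DataDrivenForceModel(X, F)
    print(model.force([0.002, 0, 0, -0.002, 0, 0]))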

The physical deformable cube is discretised into finite vertices
Chai3D simulation of multi-point haptic interaction


Modelling Force Interactions with Heterogeneous Deformable Objects

The virtual simulation of heterogeneous objects and their behaviour in different scenarios is vital for many applications, for example soft tissue simulation in medicine and the design of complex mechanical parts in computer-aided design (CAD). The simulation of deformable models is also important for enhancing the user experience in virtual reality environments. This requires the development of a dynamic behaviour model, since the underlying objects differ based on their materials.

Heterogeneous objects can be defined as objects composed of different constituent materials; they can exhibit continuously varying composition and/or microstructure, producing a gradation in their properties. In order to model heterogeneous objects, both an appropriate static representation and a dynamic model are required.

This research targets the problem of accurately simulating interaction with heterogeneous deformable models, which are more realistic than homogeneous ones. To fully simulate heterogeneous objects, we need robust representation and behavioural models. The representation should satisfy criteria such as intuitiveness and accuracy at high speed.
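
As one possible static representation, the sketch below stores a continuously varying Young's modulus on a voxel grid, so a behavioural model can read a local stiffness wherever contact occurs. The grid size, values and linear grading rule are illustrative assumptions only.

    # Sketch: voxel representation of a graded heterogeneous object.
    import numpy as np

    n = 32
    grid = np.zeros((n, n, n))            # Young's modulus E (Pa) per voxel
    soft, stiff = 5e3, 5e4                # e.g. tissue-like through rubber-like
    for z in range(n):
        grid[:, :, z] = soft + (stiff - soft) * z / (n - 1)  # graded along z

    def modulus_at(p):
        """Nearest-voxel lookup of E for a point p in the unit cube."""
        i, j, k = np.clip(np.floor(np.asarray(p) * n), 0, n - 1).astype(int)
        return grid[i, j, k]

    # A dynamic model (FEM, mass-spring, ...) reads modulus_at() when building
    # per-element stiffness, so simulated deformation varies across the object.
    print(modulus_at((0.5, 0.5, 0.1)), modulus_at((0.5, 0.5, 0.9)))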

Example of medical modelling for heterogeneous objects

Example of modelling heterogeneous objects using a single feature-based model


Data-Driven Haptic Simulation of Interactions with Deformable Models

The haptic feedback of interaction with virtual deformable objects is important for many applications, including but not limited to medical applications such as minimally invasive surgery, rapid prototyping, and virtual entertainment. Pairing the haptic sense with the visual and audio senses provides the user with a greater level of detail and ensures proper usage of the virtual environment.

The two main features and design goals of a haptic system are speed and accuracy, or physical fidelity. Speed is a critical requirement, as haptic applications need a refresh rate of about 1000 Hz, far higher than the 30 to 60 Hz required by purely visual applications. By physical fidelity we mean that the movement of the deformable objects closely follows the constitutive laws of continuum mechanics, ensuring realism and an adequate response for different materials and structures.

There are two broad families of simulation algorithms: parametric ones, such as finite element methods (FEM), which are usually accurate but slow; and data-driven ones, which have no explicit model, simulate using the collected data, and can be very fast because most of the processing is done offline. The objective of this research is to study current data-driven techniques, investigate their applicability and overcome their disadvantages. The data-driven procedure includes data collection, analysis, utilisation and validation. Within each step there are research questions such as the following (a sketch of the validation step appears after the list):

  • What data is to be collected to simulate the behaviour of the model given different material properties?
  • How can the data be collected?
  • What model of interaction can be simulated (e.g. single vs. multi-point)?
  • Can the collected data be generalised for other objects that share the same topology and same material?
  • How can the large amounts of data be handled for storage and for training the interpolation algorithm?
  • Can data-driven techniques complement parametric ones such as replacing the time integration mechanism in FEM to create a hybrid technique?
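
As a toy illustration of the validation step, the sketch below trains a radial basis function interpolator on part of the collected samples and scores it by MSE on the held-out rest. The data is synthetic, and the choice of scipy's RBFInterpolator is an assumption for illustration, not this project's actual model.

    # Sketch: validating a data-driven model by held-out mean square error.
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(1)
    X = rng.uniform(-0.01, 0.01, (400, 3))        # probe displacements (m)
    F = -1200.0 * X * (1 + 40.0 * np.abs(X))      # synthetic nonlinear forces (N)

    model = RBFInterpolator(X[:300], F[:300], smoothing=1e-6)  # train split

    mse = np.mean((model(X[300:]) - F[300:]) ** 2)             # test split
    print(f"held-out MSE: {mse:.3e} N^2")  # acceptance threshold is app-specific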

Data-driven modelling approach for haptic simulation of deformable models

Finite element simulation of multi-point interaction with a deformable object


Towards a Parameter-less 3D Mesh Segmentation

Numerous techniques have targeted the 3D segmentation problem. However, the research is still in its infancy compared with related areas such as 2D image segmentation. A major problem in the field is that solutions tend to be application dependent.

In this work, we propose a novel algorithm with several advantages over existing approaches. Firstly, it needs no object-specific parameters; tuning occurs as a global controlling mechanism for the desired output. Secondly, it follows cognitive theory and the human approach to 3D segmentation, as it considers 2D projections of the object from different poses, which is typically what a human observer perceives. Finally, it can be used with theories other than cognitive theory and still produce meaningful results.

In addition, we propose a mechanism for locating an antipodal vertex given another vertex. This can help in applications beyond 3D segmentation, such as 3D skeletonisation or skinning, which are closely related to the segmentation problem.
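
One plausible realisation of the antipodal-vertex mechanism is to take the vertex at maximum geodesic distance over the mesh's edge graph, as sketched below; this Dijkstra-based approximation is an illustrative assumption, not the algorithm's exact formulation.

    # Sketch: approximate antipodal vertex = geodesically farthest vertex.
    import heapq
    from collections import defaultdict

    def edge_graph(vertices, faces):
        g = defaultdict(dict)
        for a, b, c in faces:
            for u, v in ((a, b), (b, c), (c, a)):
                w = sum((vertices[u][i] - vertices[v][i]) ** 2 for i in range(3)) ** 0.5
                g[u][v] = g[v][u] = w
        return g

    def antipodal(vertices, faces, source):
        """Dijkstra from `source`; return the farthest vertex along the surface."""
        g = edge_graph(vertices, faces)
        dist = {source: 0.0}
        heap = [(0.0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue
            for v, w in g[u].items():
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(heap, (d + w, v))
        return max(dist, key=dist.get)

    # Tiny tetrahedron example: vertex 0's antipode lies on the opposite face.
    verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
    faces = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
    print(antipodal(verts, faces, 0))

The figure below shows preliminary results on a 3D mesh of a dog.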

Figure - (a) 3D mesh of a dog (b) The output cut lines of the 3D mesh


MUSTe method for measuring the human side of virtual environment efficacy

Virtual training systems are attracting considerable attention from manufacturing industries because of their potential advantages over conventional training practices. Shorter development times for different training scenarios, as well as reuse of existing engineering (mathematical) models, can lead to significant cost savings. This research presents a newly developed virtual environment (VE) for training in procedural tasks, i.e. object assembly. In addition, a novel evaluation framework is proposed to evaluate system efficacy through large-scale user testing, which is often neglected by design experts in the field of VEs. Results confirm the practical significance of evaluating a VE design with samples of real and representative users through the effective discovery of critical usability problems and system deficiencies. Results also indicate the benefits of collecting multimodal information for accurate and comprehensive assessment of system efficacy. Evaluation results and improvements to existing designs are also presented.

Figure - VEs efficacy evaluation framework


Haptic-Enabled Art Realisation

The Haptic-Enabled Art Realisation (HEAR) research project proposes a technological platform that will allow blind and visually impaired people to physically 'feel' the visual information contained within 2D visual art. Aside from facilitating equity of access to the visual arts, the ability to perceive and interact with the visual information contained within generic 2D images offers wider-reaching benefits. This technological platform has the potential to revolutionise the available technologies for assisting people who are visually challenged, and to introduce a new dimension to the communication media used within wider society.

The HEAR project team


Haptically Assisted Microrobotic Cell Injection

The Haptically Assisted Microrobotic Cell Injection project is a collaborative effort between the Centre for Intelligent Systems Research (CISR) and the Department of Mechanical Engineering, University of Canterbury, New Zealand, facilitated through bilateral research visits. The objective of this project is to develop a haptically enabled microrobotic cell injection system with the aim of enhancing human-in-the-loop intracellular injection, which offers immense benefits over conventional techniques.

Microrobotic cell injection is an area of increasing research interest. Currently, in manual cell injection the operator is limited to visual feedback and is unable to adequately perceive the microscale environment. This results in extended training times as well as poor success rates and repeatability. This work proposes an approach that adds the haptic modality to the operator's interface, offering significant benefits. It begins by investigating haptic bilateralism and introducing a mapping framework, resulting in an intuitive method that allows the operator to manoeuvre the micropipette in a manner similar to handheld needle insertion. Aside from enhancing the method by which the operator manoeuvres the microrobot, this mapping establishes the basis for rendering haptic information to the operator. A method is then introduced to model cell mechanics and haptically display the cell indentation force to the operator. New parabolic and conical potential field haptic virtual fixtures are then introduced to assist the operator in penetrating the cell at the desired location. Results demonstrate the successful operation of the system.
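
A minimal sketch of the potential-field virtual fixtures follows: the fixture pulls the pipette tip toward the desired injection axis, with a constant-magnitude pull for the conical field and a distance-proportional pull for the parabolic one. Gains, geometry and units are illustrative assumptions.

    # Sketch: conical / parabolic potential-field virtual fixture forces.
    import numpy as np

    def fixture_force(tip, target, axis, k=0.5, kind="conical"):
        """Force pulling `tip` toward the line through `target` along `axis`."""
        axis = np.asarray(axis, float)
        axis /= np.linalg.norm(axis)
        r = np.asarray(tip, float) - np.asarray(target, float)
        lateral = r - np.dot(r, axis) * axis   # offset from the injection axis
        d = np.linalg.norm(lateral)
        if d < 1e-12:
            return np.zeros(3)
        if kind == "conical":                  # V = k*d -> constant-magnitude pull
            return -k * lateral / d
        return -k * lateral                    # parabolic: V = k*d^2/2 -> pull ~ d

    print(fixture_force((1e-4, 2e-4, 0.0), (0, 0, 0), (0, 0, 1)))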

The haptically assisted microrobotic cell injection system


Haptically-Enabled Grasping System for Enhanced Tele-Surgery

Haptically-enabled teleoperation systems are designed to improve task precision and efficiency through increased user capability. In remote (teleoperated) surgery, a surgeon controls a robot to perform a highly delicate task within an extremely small workspace. A robot can outperform a human operator because its movement accuracy and precision are much greater. Existing tele-surgery systems are limited in their ability to present the surgeon with information that can immerse the operator beyond the basic visual feedback control approach. Applications such as endoscopy, including colonoscopy and cystoscopy, require delicate movement within body cavities to minimise damage to the cavity walls. This research presents a novel multi-point haptic grasping interface combined with a sensorised robotic end-effector for force-feedback teleoperation. The project will investigate methods and technologies to enhance the telemanipulation process for such applications, and develop a generic system that can be applied to procedures requiring grasping force feedback for dexterous telemanipulation.
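
A minimal sketch of the force-feedback path, under stated assumptions: each contact force sensed at the end-effector is scaled into the haptic gripper's safe output range before being displayed to the surgeon. The scale factor and limits are illustrative, not the project's tuned values.

    # Sketch: mapping sensed grasp forces to the multi-point haptic gripper.
    def to_operator_force(sensed_n, scale=0.4, f_max=8.0):
        """Scale a remotely sensed grasp force (N) and clamp it for stability."""
        f = sensed_n * scale
        return max(-f_max, min(f_max, f))

    sensed = [2.1, 1.8, 0.0]      # one value per sensorised end-effector contact
    commands = [to_operator_force(f) for f in sensed]
    print(commands)               # sent to the corresponding haptic contact points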

Haptics device


A Haptically Enabled CAN-Based Steering Wheel Controller

As portable entertainment and mobility technologies migrate into the car, driver distraction has become recognised as a major factor in road crashes around the world. To help alert drivers to their distraction, active safety technologies such as lane departure warning systems and collision avoidance systems are being implemented. One issue with implementing yet another technology in the vehicle is how to cut through the competing demands of the mobile phone, navigation system and other technologies. Haptic alerts present just such a method, one that may enable the system to short-circuit the normal auditory or visual communication channels. This research investigates the design and development of a low-cost haptic steering wheel and its controller, and their integration via the CAN bus into a vehicle as a communication device for lane departure, collision avoidance and other safety systems.
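
As an illustration of the CAN-side integration, the sketch below sends a hypothetical lane-departure alert frame to the haptic ECU using the python-can library. The arbitration ID, channel and payload layout are invented for illustration; a real haptic ECU would define its own message format.

    # Sketch: commanding a haptic steering-wheel alert over CAN (python-can).
    import can

    HAPTIC_ALERT_ID = 0x3A0  # hypothetical CAN ID claimed by the haptic ECU

    def send_lane_departure_alert(bus, side, intensity):
        """side: 0 = left, 1 = right; intensity: 0-255 vibration strength."""
        msg = can.Message(arbitration_id=HAPTIC_ALERT_ID,
                          data=[0x01, side, intensity, 0, 0, 0, 0, 0],
                          is_extended_id=False)
        bus.send(msg)

    bus = can.interface.Bus(channel="can0", bustype="socketcan")  # Linux SocketCAN
    send_lane_departure_alert(bus, side=0, intensity=180)  # buzz the left rim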

Figure - Haptic ECU


Haptically Enabled Interactive and Immersive Virtual Assembly Training

Virtual training systems are attracting considerable attention from the manufacturing industries due to their potential advantages over conventional training practices such as general assembly. Within this virtual training realm, a haptically enabled interactive and immersive virtual reality (HIVEx) system is presented. The idea is to imitate real assembly training scenarios by providing comprehensive user interaction and by enforcing physical constraints within the virtual environment through the use of haptics technology. The developed system employs a modular approach, providing flexibility of reconfiguration and scalability as well as better utilisation of current multi-core computer architectures. The user interacts with the system using a haptic device and data glove while fully immersed in the virtual environment with depth perception.

An evaluation module incorporated into the system automatically logs and evaluates information throughout the simulation, tracking user performance and improvement over time. A ruggedised portable version of the system, with full system capabilities, has also been developed, allowing easy relocation between different factory environments. A number of training scenarios with varying degrees of complexity have been developed to exploit the potential of the system. The system can be employed for teaching and training of existing assembly processes as well as for designing new, optimised assembly operations. Furthermore, it can assist in optimising existing practices by evaluating the effectiveness and the level of knowledge transfer involved in the process. Within this conceptual framework, a working prototype has been developed.
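
The kind of per-trial record such an evaluation module might log is sketched below; the field names and file format are illustrative assumptions, not the system's actual schema.

    # Sketch: logging one assembly-training trial for later performance analysis.
    import json, time
    from dataclasses import dataclass, asdict

    @dataclass
    class AssemblyTrial:
        trainee: str
        scenario: str
        completion_s: float    # time to finish the assembly
        wrong_insertions: int  # parts mated in the wrong order or orientation
        collisions: int        # tool/part collisions flagged by the physics step
        mean_force_n: float    # average force applied over the trial

    def log_trial(trial, path="trials.jsonl"):
        with open(path, "a") as f:
            f.write(json.dumps({"t": time.time(), **asdict(trial)}) + "\n")

    log_trial(AssemblyTrial("s01", "gearbox", 412.5, 2, 7, 3.1))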

Haptically Enabled Virtual Assembly training system


Haptics Interface for Modelling and Simulation of Flexible Objects

Virtual reality systems are attracting considerable attention in product and process design, virtual training and ergonomic analysis. The automotive industry is considered a leader in applying virtual reality (VR) solutions to real-world, non-trivial problems. Although a number of commercial 3D engineering tools for digital mock-ups exist, most of them lack intuitive direct manipulation of the mock-ups and are constrained to interaction with mainly rigid objects. To bridge this gap, we have developed a haptically enabled interface for modelling and simulation of flexible objects. This interface can be used for product/process design, ergonomic analysis and evaluation, and assembly training in conjunction with our virtual assembly training system, incorporating both rigid and flexible objects. The Flexilution solver is employed to estimate and simulate the dynamic behaviour of flexible objects in response to external user interaction, whereas NVIDIA's generic physics engine is used to simulate the behaviour of rigid objects. The developed interface makes use of a generic communication architecture and is capable of accommodating haptic devices from vendors including SensAble, Haption, Force Dimension and Novint.

Haptics interface for simulation of flexible objects


Modelling of Mechanical Properties of Living Cells for Micromanipulation Systems

Micromanipulation of biological cells is widely undertaken in medical and cell-related research. Drug discovery, functional genomics and toxicology are some examples of this regularly performed cell manipulation, whose performance can be improved by deploying robotics and intelligent systems. Modelling of biological cells is essential for robotic or haptic cell manipulation; this requires an appropriate model of the mechanical behaviour of cells under external stimulus. The main goal of modelling in this research is to quantitatively evaluate the mechanical properties and responses of cells when subjected to stimulation and/or perturbation, considering both the accuracy and the real-time functionality of the model. Cell structure, cell deformation and cell behaviour under the cell injection and aspiration processes are studied from kinematic, fluidic and dynamic aspects. The model is intended to be deployed in haptics and robotics applications.
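
As a toy illustration of fitting cell mechanics for real-time use, the sketch below fits a simple power law F = a*d^b to synthetic force-indentation samples; the fitted closed form is then cheap enough to evaluate once per haptic tick. Both the data and the power-law form are assumptions for illustration, not this project's constitutive model.

    # Sketch: fitting a force-indentation law to (synthetic) micropipette data.
    import numpy as np
    from scipy.optimize import curve_fit

    def power_law(d_um, a, b):
        return a * d_um ** b  # force in uN for an indentation depth in um

    d = np.linspace(1.0, 20.0, 40)                       # indentation depth (um)
    f = 0.8 * d ** 1.5                                   # synthetic forces (uN)
    f += np.random.default_rng(2).normal(0.0, 0.02, d.size)

    (a, b), _ = curve_fit(power_law, d, f, p0=(1.0, 1.5))
    print(f"F(d) ~ {a:.2f} * d^{b:.2f}  (uN, d in um)")  # cheap at 1 kHz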

A schematic of the developed model for cell microinjection: cell membrane (a) before and (b) after deformation upon micropipette exertion; (c) the cross-section of the cell

