Research projects
IISRI researchers have developed a range of capabilities for interacting with the brain to characterise cognition and state of mind, from analysing single-cell dynamics to monitoring the physiological behaviour of individuals. Our research explores interaction with the brain at the level of individual neurons, for example their responses to stimuli such as drugs. In other work, we characterise general state of mind in terms of situational awareness and attention, using methods such as EEG and eye-tracking technologies.
Exploring how mosquito brains are affected by the Zika virus
Our research aims to understand how the Zika virus attacks the brain so that potential steps towards vaccination can be made. Unlike other research groups, IISRI traced the infection back to its source in mosquitoes, the main carriers of the disease. The research involved placing mosquito brain cells on a chip and tracking their activity. It was discovered that mosquitoes infected with the Zika virus were naturally resistant to its most debilitating effects, making them more effective at spreading the virus.
Advanced dynamic simulation and analysis of firearm training through haptics and motion capture
IISRI researchers are using advanced haptic and motion capture technologies to evaluate firearm discharge training tasks. This enables accurate training analysis, supports decision-making and delivers immersive shooting simulation. The research focuses on detailed analysis and realistic modelling of firearm discharge procedures, as well as force modelling and time-based force calculation and rendering for specific firearms.
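As a rough illustration of time-based force rendering, the sketch below samples a simple recoil force profile at a typical haptic update rate. The profile shape and all parameters are illustrative assumptions, not a validated model of any particular firearm.

```python
# Illustrative sketch only: a simple recoil force profile (linear rise,
# exponential decay) sampled at a 1 kHz haptic update rate. Parameters are
# assumptions, not measured firearm data.
import numpy as np

def recoil_force(t, peak_newtons=60.0, rise_ms=5.0, decay_ms=40.0):
    """Force at time t (seconds) after trigger pull."""
    t_ms = t * 1000.0
    if t_ms < rise_ms:
        return peak_newtons * t_ms / rise_ms
    return peak_newtons * np.exp(-(t_ms - rise_ms) / decay_ms)

# Sample the profile over 200 ms at a typical haptic update rate of 1 kHz.
times = np.arange(0.0, 0.2, 0.001)
forces = [recoil_force(t) for t in times]
print(f"peak = {max(forces):.1f} N at t = {times[int(np.argmax(forces))] * 1000:.0f} ms")
```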
Wavelets and multi-wavelet bases for stereo correspondence estimation
Stereo correspondence estimation is one of the most active research areas in the field of computer vision. IISRI researchers have developed an algorithm that uses multi-resolution analysis based on the wavelet/multi-wavelet transform modulus maxima to establish correspondences between stereo pairs of images.
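The sketch below illustrates the general idea under simplifying assumptions: wavelet detail coefficients are combined into per-scale modulus maps, and points with strong modulus response (a stand-in for true modulus maxima) are matched along scanlines of a rectified stereo pair. It is a loose illustration, not a reproduction of the published algorithm.

```python
# Simplified illustration of wavelet-based stereo matching, not the IISRI
# algorithm itself: per-scale modulus maps are built from detail coefficients
# and strong responses are greedily matched along scanlines.
import numpy as np
import pywt

def modulus_maps(image, wavelet="db2", level=3):
    """Per-scale modulus maps built from the 2-D wavelet detail coefficients."""
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    return [np.sqrt(cH**2 + cV**2 + cD**2) for cH, cV, cD in coeffs[1:]]

def match_scanline(mod_left, mod_right, row, max_disp=16):
    """Greedy matching of strong modulus points on one scanline."""
    left, right = mod_left[row], mod_right[row]
    disparity = np.zeros_like(left)
    threshold = left.mean() + 2 * left.std()
    for x in np.flatnonzero(left > threshold):        # candidate match points
        lo = max(0, x - max_disp)
        window = right[lo:x + 1]
        if window.size:
            disparity[x] = x - (lo + int(np.argmax(window)))
    return disparity

# Random images standing in for a rectified stereo pair.
rng = np.random.default_rng(0)
left_img, right_img = rng.random((128, 128)), rng.random((128, 128))
mod_l = modulus_maps(left_img)[-1]      # finest scale
mod_r = modulus_maps(right_img)[-1]
print(match_scanline(mod_l, mod_r, row=30))
```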
Measuring depth accuracy in RGBD cameras
RGBD sensors project an infrared pattern and calculate depth from the reflected light using an infrared-sensitive camera. In this research, the depth-sensing capabilities of two RGBD sensors are compared under various conditions.
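A minimal sketch of this kind of comparison is shown below: depth frames from two sensors are evaluated against a known distance to a flat target, reporting bias, RMS error and fill ratio. The sensor frames here are simulated placeholders, not measured results.

```python
# Illustrative comparison of two RGBD sensors against a known ground-truth
# distance; the frames are simulated, not real sensor data.
import numpy as np

def depth_error_stats(depth_frame_mm, true_distance_mm):
    valid = depth_frame_mm > 0                      # zeros mark missing returns
    err = depth_frame_mm[valid] - true_distance_mm
    return {"bias_mm": err.mean(),
            "rmse_mm": np.sqrt((err**2).mean()),
            "fill_ratio": valid.mean()}

rng = np.random.default_rng(1)
true_mm = 1500.0
sensor_a = true_mm + rng.normal(2, 8, (480, 640))   # simulated depth frames
sensor_b = true_mm + rng.normal(-5, 15, (480, 640))
for name, frame in [("sensor A", sensor_a), ("sensor B", sensor_b)]:
    print(name, depth_error_stats(frame, true_mm))
```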
Real-time ergonomic assessment for assembly operations using RGBD cameras
Ergonomic assessments increase productivity and performance by helping to prevent and reduce workplace injuries. The aim of this research is to use RGBD cameras for real-time ergonomic assessment in assembly operations. The accuracy and high sampling rates of RGBD sensors make it possible to monitor operators and alert them when their current posture is risky.
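The sketch below illustrates one way such monitoring could work, using a trunk-flexion angle computed from tracked hip and shoulder joints; the joint positions and the 20-degree threshold are purely illustrative assumptions, not a validated ergonomic standard.

```python
# Illustrative posture check from RGBD skeleton data: trunk flexion is the
# angle of the hip->shoulder vector relative to vertical, and an alert is
# raised when it exceeds an assumed threshold.
import numpy as np

def angle_between(v1, v2):
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def trunk_flexion(hip_xyz, shoulder_xyz):
    """Trunk angle relative to vertical (degrees), assuming a y-up camera frame."""
    trunk = np.asarray(shoulder_xyz) - np.asarray(hip_xyz)
    return angle_between(trunk, np.array([0.0, 1.0, 0.0]))

def posture_alert(hip_xyz, shoulder_xyz, threshold_deg=20.0):
    flexion = trunk_flexion(hip_xyz, shoulder_xyz)
    return flexion, flexion > threshold_deg

# Hypothetical joint positions (metres) from an RGBD skeleton tracker.
print(posture_alert(hip_xyz=[0.0, 0.9, 2.0], shoulder_xyz=[0.15, 1.4, 1.9]))
```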
Human motion analysis from video sequences
This research proposes a general framework for the analysis of human motion in videos, based on the bag-of-words representation and the probabilistic Latent Semantic Analysis (pLSA) model. The framework consists of detecting human subjects in videos, extracting pyramid Histogram of Oriented Gradients (PHOG) descriptors, constructing a visual codebook by k-means clustering, and supervised learning of the pLSA model for recognition.
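The sketch below illustrates the bag-of-words stage of such a pipeline, with plain HOG descriptors standing in for pyramid HOG: local descriptors are clustered with k-means to form a visual codebook, and each frame is encoded as a histogram of visual-word counts ready for topic-model (pLSA) learning. It is an illustrative approximation, not the exact framework described above.

```python
# Illustrative bag-of-words encoding for motion analysis. Plain HOG is used
# as a stand-in for pyramid HOG, and the final pLSA fit is not shown.
import numpy as np
from skimage.feature import hog
from sklearn.cluster import KMeans

def frame_descriptors(frame, patch=32, step=32):
    """Dense HOG descriptors over a grid of patches in one greyscale frame."""
    descs = []
    for y in range(0, frame.shape[0] - patch + 1, step):
        for x in range(0, frame.shape[1] - patch + 1, step):
            descs.append(hog(frame[y:y + patch, x:x + patch],
                             orientations=9, pixels_per_cell=(8, 8),
                             cells_per_block=(2, 2)))
    return np.array(descs)

def bag_of_words(frames, n_words=20):
    """Build a k-means codebook and encode each frame as a word histogram."""
    all_desc = np.vstack([frame_descriptors(f) for f in frames])
    codebook = KMeans(n_clusters=n_words, n_init=10, random_state=0).fit(all_desc)
    histograms = [np.bincount(codebook.predict(frame_descriptors(f)),
                              minlength=n_words) for f in frames]
    return codebook, np.array(histograms)

# Random frames standing in for detected human regions in a video.
rng = np.random.default_rng(2)
frames = list(rng.random((5, 128, 128)))
codebook, hists = bag_of_words(frames)
print(hists.shape)   # (5, 20) document-term matrix for the pLSA stage
```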
Event-related potential analysis to identify functional differences in the brain
The event-related potential (ERP) technique is a derivative of electroencephalography (EEG) that measures brain activity during the processing of a sensory, motor or cognitive task. Our research investigates the use of extended multivariate autoregressive (eMVAR) models for information flow analysis of ERP data. We are developing a range of adaptive estimation techniques aimed at better extracting the underlying information flow from ERP data. These techniques are also used in a variety of applied research, including cognitive load assessment, the development of brain-machine interfaces and rehabilitation-related studies.
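As a simplified illustration of the autoregressive modelling involved, the sketch below fits an ordinary MVAR model by least squares to simulated two-channel data and inspects the lag-one coefficients for a directed interaction. The extended (eMVAR) and adaptive estimators used in the research are not reproduced here.

```python
# Illustrative ordinary MVAR fit (least squares), not the eMVAR estimators
# used in the research: directed interactions between channels show up in
# the off-diagonal autoregressive coefficients.
import numpy as np

def fit_mvar(x, order=2):
    """Fit x[t] = sum_k A[k] x[t-k] + e[t] for multichannel data x of shape (T, C)."""
    T, C = x.shape
    Y = x[order:]                                             # targets, (T-order, C)
    Z = np.hstack([x[order - k:T - k] for k in range(1, order + 1)])
    B, *_ = np.linalg.lstsq(Z, Y, rcond=None)                 # (order*C, C)
    return [B[k * C:(k + 1) * C].T for k in range(order)]     # list of A[k]

# Two simulated channels where channel 0 drives channel 1 with a one-sample lag.
rng = np.random.default_rng(3)
T = 2000
x = rng.normal(size=(T, 2)) * 0.1
for t in range(2, T):
    x[t, 0] += 0.8 * x[t - 1, 0]
    x[t, 1] += 0.6 * x[t - 1, 0] + 0.3 * x[t - 1, 1]

A = fit_mvar(x, order=2)
print(np.round(A[0], 2))   # off-diagonal entry A[0][1, 0] reflects the 0 -> 1 flow
```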
Human identification from ECG signals
There is growing interest in using characteristics such as a person's face, fingerprints or gait as a form of biometric identification. Recent research shows that individuals can also be identified from electrocardiogram (ECG) signals, because hearts differ physiologically and geometrically from person to person, producing unique ECG signals. In our research, we are extracting compact and discriminative features from ECG signals. Unlike previous methods, the proposed method captures both local and global structural information and does not need to segment individual heartbeats or detect fiducial points.
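The sketch below illustrates one fiducial-free approach of the same flavour, based on normalised autocorrelation features and nearest-neighbour matching. It is offered only as an illustration and is not the feature extraction method developed in this research; the "ECG" signals are synthetic.

```python
# Illustrative fiducial-free ECG identification: each window is summarised by
# its normalised autocorrelation and matched to the nearest enrolled template.
# Signals are synthetic stand-ins, not real ECG recordings.
import numpy as np

def autocorr_feature(window, n_lags=60):
    w = window - window.mean()
    ac = np.correlate(w, w, mode="full")[len(w) - 1:len(w) - 1 + n_lags]
    return ac / ac[0]                       # normalise so lag 0 equals 1

def identify(window, templates):
    feat = autocorr_feature(window)
    dists = {name: np.linalg.norm(feat - t) for name, t in templates.items()}
    return min(dists, key=dists.get)

rng = np.random.default_rng(4)
t = np.arange(1000) / 250.0                 # 4 s at 250 Hz
subject_a = np.sin(2 * np.pi * 1.1 * t) + 0.1 * rng.normal(size=t.size)
subject_b = np.sin(2 * np.pi * 1.4 * t) + 0.1 * rng.normal(size=t.size)
templates = {"A": autocorr_feature(subject_a), "B": autocorr_feature(subject_b)}

probe = np.sin(2 * np.pi * 1.1 * t) + 0.1 * rng.normal(size=t.size)
print(identify(probe, templates))           # expected: "A"
```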
Neural microelectrodes on microfluidics
This research looks at interfacing nerve cells on an integrated microfluidics platform with electronics and software systems. The focus is on combining these promising technologies into a single modular platform. This involves interfacing nerve cells held in immobilised microstructures inside microfluidics with embedded systems, and investigating the dynamics of bi-directional communication between neurons and silicon.
High-tech innovation in automotive design assesses driver comfort
IISRI researchers have been recognised for a high-tech innovation in automotive design whereby digital mannequins and computer-generated vehicles have been used to assess the comfort of drivers.
Vox Lumen – motion simulation in dance
As part of Melbourne’s White Night 2015, Vox Lumen: People into Light transformed Federation Square into an interactive world. It involved stunning abstract digital projections, dancer-driven live motion capture, and interactive content that tracked the movement of crowds across Melbourne's biggest night of arts and culture. The event was a combination of live performance and audience interactivity.
The fusion of live performance and technology featured dancer/choreographer Steph Hutchison, who wore IISRI's high-tech Xsens motion capture suit.
This marker-less, full-body suit allows researchers to go mobile, taking a technology that is usually limited to a commercial lab setting into the heart of the city. The inertial-based suit measures the movement of the dancer using an internal gyroscope, so that every movement directly affects the content on the screen.