Process Modelling and Analysis

IISRI research has focused on Rapid Modelling methodologies to facilitate the generation of complex models faster, at less cost and with incomplete information. IISRI has an extensive track record in the research and development of simulation and decision support technologies. Our technologies have applications in:

  • Modelling and analysis of logistics and materials handling facilities and processes;
  • Statistical analysis, economic modelling and economic impact studies;
  • Business process analysis and business facilitation services; and
  • Process visualisation methodologies.

Conventional analysis methods are static, typically involve complex mathematics and require the end user to manipulate, understand and interpret mathematical data using spreadsheets. In contrast, 3-D modelling and discrete event simulation enable the end user to see a “virtual” working model of an entire system.

Simulation models capture the variations in the individual processes, rather than just considering the average operating performance, resulting in higher levels of realism and more relevant outcomes. This makes simulation a powerful tool for helping non-specialists understand, and make decisions on, very complex problems. Virtual reality modelling provides the ability to demonstrate changes graphically and to communicate concepts and implications effectively.
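The value of capturing variation can be illustrated with a minimal discrete event sketch (a hypothetical single-server queue, not one of IISRI's models): a static, average-based analysis predicts no queueing when average service is faster than average arrivals, while the stochastic simulation reveals substantial waiting.

```python
import random

def simulate_queue(n_jobs, mean_arrival=1.0, mean_service=0.9, seed=1):
    """Single-server queue: stochastic service and arrival times inflate
    waiting well beyond what an average-based (static) analysis predicts."""
    random.seed(seed)
    clock = free_at = total_wait = 0.0
    for _ in range(n_jobs):
        clock += random.expovariate(1.0 / mean_arrival)   # next arrival
        start = max(clock, free_at)                       # wait if server busy
        total_wait += start - clock
        free_at = start + random.expovariate(1.0 / mean_service)
    return total_wait / n_jobs

# Average service (0.9) is faster than average arrivals (1.0), so a static
# model predicts zero waiting; the simulation shows hours of queueing.
print(round(simulate_queue(10000), 2))
```

For this utilisation level, queueing theory predicts a mean wait of around eight time units, far from the zero a deterministic model would give.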

Research Topics

Multiobjective Scheduling for Joinery Manufacturing

Project Summary

ARC LP0991175 - Distributed Real-time Multiobjective Scheduling for Joinery Manufacturing
Chief Investigators: Prof Saeid Nahavandi, A/Prof Doug Creighton
Research Fellow: Dr Samer Hanoun
Industry Partner(s): Informatic Technologies, Dr Hans Kull

Project Description

The Australian Furniture Industry represents 4% of Australia's manufacturing base, with an annual turnover of $9.5 billion. Australian furniture manufacturers are losing the battle against imports due to high fixed costs and long delivery times. Improved scheduling of their constrained resources provides an opportunity to stay competitive. However, traditional scheduling methods cannot be easily implemented in practice, as they primarily consider a single objective and cannot deal with multi-objective dynamic processes. Taking a typical furniture manufacturing process, joinery making, this project aims to develop a novel multi-objective scheduling strategy that optimises output and substantially reduces delivery times.

Project National Benefits

Efficient scheduling is one of the most important research areas for complex manufacturing processes and is not yet fully explored. Manufacturers often know that they have scheduling problems, but have no idea what technology can be used to solve them. With increasing competition from low-cost manufacturers overseas, proper scheduling strategies are critical to keeping Australian businesses competitive. For furniture manufacturing businesses to survive, they must make sure their production processes are as efficient as possible. This project enables Australian manufacturers to schedule their production efficiently, reduce costs, and increase sales revenue and profits.

Project Details

The problem of scheduling to minimise material waste (cost) exists in many manufacturing domains; joineries, corrugated board plants and glass manufacturing are typical examples. In joinery manufacturing, products such as kitchens, bathrooms and cabinets are produced mainly from two materials. The cost of any product is determined largely by the material used for the front, which is far more expensive than the melamine material used for the sides and the rear. The dominant objective is always to increase profit by minimising waste in the front material. An additional objective is to minimise the tardiness of the manufactured jobs. The tardiness could, in theory, be relaxed in favour of the schedule that provides the maximum cost savings; however, achieving minimum tardiness is required to satisfy customers' due dates. The multi-objective nature of the problem requires reaching an acceptable compromise, where the quality of a solution has to satisfy both criteria.
In order to schedule under multiple objectives, a new problem formulation and a novel solution heuristic are proposed. In this project:

  1. The problem is formulated as a bi-criteria optimisation problem considering two objectives: material waste and tardiness. The material waste objective, expressed in terms of cost savings, is given a higher priority than the tardiness objective. This multi-objective formulation produces a set of solutions with different quality levels, as shown in Figure 1 below, providing the decision maker (the production manager) with the opportunity to choose the best according to an acceptable compromise.
     Figure 1: Graph showing the set of trade-off solutions produced and provided to the decision maker, compared to the true Pareto solutions
  2. A mathematical model is developed to capture the nature of jobs and the flow precedence between different operations and their associated stages in the joinery. The developed model is used to explore and analyse different optimisation techniques.
  3. A generalised flowshop model is developed for modelling the parallel flow stages of jobs and operations in the joinery. The model is used to configure the joinery and routing of jobs. This model supports the decision maker with more flexibility and enables the easy migration to other domains or where the configuration of the joinery changes.
  4. A scheduling algorithm is developed based on the Simulated Annealing technique. The algorithm's performance is analysed based on the accuracy of the generated solutions.
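The steps above can be sketched in miniature. The following is an illustrative Simulated Annealing loop for a bi-criteria schedule, using a weighted scalarisation to give waste priority over tardiness; the toy instance, weights and neighbourhood move are assumptions for illustration, not the project's exact formulation.

```python
import math
import random

def anneal(jobs, cost, neighbour, t0=100.0, cooling=0.95, iters=2000, seed=0):
    """Simulated annealing sketch for a bi-criteria schedule. cost() returns
    (material_waste, tardiness); a large weight on waste reflects its higher
    priority (an illustrative scalarisation, not the project's formulation)."""
    rng = random.Random(seed)
    def scalar(c):
        waste, tardiness = c
        return 1000.0 * waste + tardiness    # waste dominates tardiness
    current, best, t = jobs[:], jobs[:], t0
    for _ in range(iters):
        cand = neighbour(current, rng)
        delta = scalar(cost(cand)) - scalar(cost(current))
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = cand                   # accept improving/escaping move
            if scalar(cost(current)) < scalar(cost(best)):
                best = current[:]
        t *= cooling                         # geometric cooling schedule
    return best

# Toy instance: order three jobs to minimise (waste, tardiness).
def swap(seq, rng):
    i, j = rng.sample(range(len(seq)), 2)
    s = seq[:]
    s[i], s[j] = s[j], s[i]
    return s

due = {0: 1, 1: 2, 2: 3}
def cost(seq):
    tard = sum(max(0, pos + 1 - due[j]) for pos, j in enumerate(seq))
    return (0.0, tard)   # waste held fixed here; tardiness varies with order

print(anneal([2, 1, 0], cost, swap))  # an order with zero tardiness
```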

Project Outcomes

  1. A new scheduling algorithm was developed based on the Simulated Annealing technique. The algorithm's performance was analysed based on the accuracy of the generated solutions, which was verified to be within 0.56% of the optimal solutions on average, with a maximum deviation of 1.87%.
  2. The proposed scheduling algorithm was implemented in an evaluation software package (beta version). Functionalities such as configuring the joinery, specifying the machines' operational times and rostered days off, and manipulating the generated schedule are all incorporated in the software to enable fast adoption at the joinery floor level. The developed software acts as a production planning and scheduling tool capable of interfacing directly with any production data, and has all the necessary data structures, input forms and output reports, such as production plans and machine schedules. It produces optimal production plans for most problems facing small to medium sized companies, and near-optimal ones for larger problem sizes. One of the main advantages of the developed software is that it can be easily applied in other manufacturing domains such as metal fabrication, food processing, supply chain management and workforce management. Figure 2 shows the GUI of the developed software.
    Figure 2: The interface of the software package developed through the project

The project has led to new and exciting research directions and innovations and laid foundations for new research:

  1. A new direction in scheduling techniques based on meta-heuristics.
  2. An innovative way to explore the available solutions, enabling the production manager to choose among them according to manufacturing preferences.
  3. A new foundation of applying meta-heuristics in solving single and multi-objectives job shop scheduling problems.

Project Published Research Findings

  1. S. Hanoun and S. Nahavandi, “A greedy heuristic and simulated annealing approach for a bicriteria flowshop scheduling problem with precedence constraints - a practical manufacturing case”, International Journal of Advanced Manufacturing Technology, 2012, 60, 1087-1098.
  2. S. Hanoun, S. Nahavandi, and H. Kull, “Pareto Archived Simulated Annealing for Single Machine Job Shop Scheduling with Multiple Objectives”, Sixth International Multi-Conference on Computing in the Global Information Technology (ICCGI), pp.99-104, 19-24 June 2011, Luxembourg City, Luxembourg.
  3. S. Hanoun, D. Creighton, S. Nahavandi, and H. Kull, “Solving a Multiobjective Job Shop Scheduling Problem using Pareto Archived Cuckoo Search”, 17th IEEE International Conference on Emerging technologies and Factory Automation (ETFA), Krakow Poland, 17-21 Sept 2012.

Airport Security

Airport security is a major concern for governments and passengers alike, and much effort has recently been expended to improve safety across all facets of airport operations. CISR has placed a heavy emphasis on research into airport security over the last several years and has been working closely with industry and the Australian government to understand the impacts on airports of increased security considerations.


Airports are susceptible to terrorist threats in several ways. Considering the payload of a typical flight, these threats can be broken down into threats from checked baggage, threats from carry-on baggage and passengers, and, finally, threats from air cargo. CISR has worked with industry and government to assess the threat to airports and address the impacts on the sector of increased security requirements. Data collection during live trials has enabled the development of a set of model platforms that can quickly assess the impact on the security process of increased security requirements, new policies and even new technology.


Liquids Aerosols and Gels Screening (LAGs)

CISR was appointed by the Australian Government's Office of Transport Security (OTS) to undertake a study into the introduction of liquids, aerosols and gels (LAGs) screening technology into Australian airports. The concept of the study was to investigate the impact on the existing screening process if LAGs in carry-on luggage were screened.

The screening process involved screening LAGs under various scenarios, ranging from a blanket approach of screening all LAGs through to random screening.

The technology was trialled over a six-week period in two Australian airports. Data collected during this period was used to drive simulation models of the environments, enabling prediction of the system's response to changes in operational policies and loadings.


Passenger Body Screening

CISR was tasked by the Department of Infrastructure, Transport, Regional Development and Local Government's (Infrastructure) Office of Transport Security (OTS) to undertake a study into the introduction of advanced screening technology in Australian airports, specifically for the security screening of domestic and international passengers (PAX). The study's aim was to report the impacts and associated costs of completing varying levels of PAX screening using different advanced technologies throughout the screening process.

State-of-the-art passenger screening technology was tested in order to determine the impact on the entire screening process if such technology were adopted.


Over a six-week period at three airports, data was collected as trained security operators screened passengers and passenger baggage according to the Methods, Techniques and Equipment to be used for Screening (METS). Performance data collected during this trial, along with operational data from an extended period over the previous 12 months, was then used to model and analyse the impact on each industry participant.

Three beneficial outcomes were generated through this data collection and model development. Firstly, real-world datasets of PAX behaviour were compiled, covering details such as walking speeds and divest timings. Secondly, an airport screening-point modelling platform was developed that can rapidly evaluate the impact of newly available or over-the-horizon technologies, without the expense and logistics of a live trial. Thirdly, data was generated that is relevant to assessing the impact of a future relaxation of international LAGs screening requirements.


The outcome of the study presented the operational and financial impacts of the introduction of selected Body Scan and Bottle Scan technologies to three selected Australian domestic and international screening points.

Air Cargo Operations

CISR was contracted by the Department of Infrastructure, Transport, Regional Development and Local Government to conduct a study within the air cargo environment to determine the impact of implementing X-ray examination at varying stages within the existing air cargo industry. The study related to the Australian Government X-ray Technology Capability Assessment, an assessment of the ability of current technologies to detect IEDs within air cargo and of the impact on the industry of implementing such technology.

A selection of international and domestic cargo terminal operators (CTOs) and metropolitan cargo facilities in Sydney were chosen for the study. These industry participants covered all the main sectors of the air cargo industry, including international express, domestic express, perishable forwarders, international CTOs, domestic CTOs, and retail and wholesale freight forwarders. The data used for the study was gathered from industry participants through onsite trials. Data detailing cargo volumes, turnaround times, processing times and staffing levels was collected, as well as many other aspects relevant to the air cargo process. A model of each facility was created and analysed through discrete event simulation. Multiple scenarios were created by varying cargo volume examination, examination policies and the use of different X-ray equipment.

All processes that occur between receiving and dispatch of export consignments were analysed. Additionally, other processes in each facility that might be affected by the introduction of examination processes, such as imports, were considered.


Baggage Handling

CISR, in conjunction with their industry partner Deneb Australasia, has developed a software platform to rapidly model baggage handling systems (BHS).

The task of modelling a BHS could take many months of work, but by using the rapid simulation platform and modelling techniques, the timeframe to develop a fully functional BHS model has been reduced to weeks.

The platform covers all aspects of the BHS, from barcode reading and security screening to early bag storage and sortation systems.


Manufacturing and Assembly

CISR has experience in a range of visualisation, simulation and optimisation projects in the manufacturing domain. Benefits of applying simulation to manufacturing environments include improved throughput, process design and reduced inventory.


CISR has successfully delivered projects across the sector, providing expertise to rapidly model and analyse facilities, along with tools to evaluate schedules, resource requirements, facility layouts and the impact of failures and scheduled downtime.


Warehousing and Material Handling

CISR has successfully applied their rapid modelling techniques to warehouse facilities to dramatically reduce the time required to develop, test and analyse the simulation model. Not only does rapid model development reduce model building costs, it allows more time and effort to be placed on the creation of control methodologies to optimise the facility. Additionally, the model can be used as a visualisation engine to observe production systems, driven from existing control systems or log files.


Control of Polymerisation Batch Reactor

Polymer products, or plastics, are used in numerous industries including construction, manufacturing, electronics, transportation, food processing and aerospace. Despite the wide range of polymer applications, polymerisation reactor control remains a very challenging task, as the polymerisation reaction is complex and nonlinear in nature. Control of polymerisation reaction variables depends on whether they can be measured directly, estimated, or measured only with some time delay. One of the major difficulties encountered in polymerisation reactor control is the lack of reliable online, real-time analytical data. Typically, temperature is used as an intermediate variable to control polymer quality, as the quality and quantity of the polymer are directly dependent on reactor temperature.


Figure 1: Lab scale polystyrene batch reactor


Figure 2: Polystyrene batch reactor model in Matlab Simulink

The aim of this research is to develop advanced nonlinear controllers for polymerisation batch reactors that provide a smooth, safe and waste-free production line, as well as high quality polymer products. Three advanced nonlinear controllers have been designed and implemented in a real polystyrene plant: an artificial Neural Network-based MPC (NN-MPC), a Fuzzy Logic Controller (FLC) and a Generic Model Controller (GMC). The performance criterion Integral Absolute Error (IAE) is used as the cost function in the optimisation process to tune the controller parameters. The proposed controllers are tested in various settings, including tracking the optimal temperature batch recipe and process disturbance rejection. Experimental results reveal that the NN-MPC is superior at tracking the optimal reactor temperature profile, without the noticeable overshoot observed with the FLC or GMC.
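The IAE criterion used for tuning is straightforward to compute from sampled data; a minimal sketch (with hypothetical temperature samples) is:

```python
def iae(setpoint, measured, dt):
    """Integral Absolute Error: the tuning criterion named in the text,
    approximated here by a rectangle rule over sampled data."""
    return sum(abs(s - m) for s, m in zip(setpoint, measured)) * dt

# Hypothetical reactor temperature samples (degC) against a constant setpoint:
sp = [80.0] * 5
pv = [78.0, 79.5, 80.5, 80.1, 80.0]
print(round(iae(sp, pv, dt=1.0), 2))  # 2.0 + 0.5 + 0.5 + 0.1 + 0.0 = 3.1
```

A controller tuner would minimise this value over candidate parameter sets, exactly as the optimisation described above does.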

Figure 3: Optimal setpoint tracking using three different advanced nonlinear controllers. Reactor temperature is the controlled variable whereas heater power is the manipulated variable.

Simulation-based Learning for Control of Complex Conveyor Networks

The demands placed on material handling systems continue to grow, with increasing design complexity and higher throughput requirements. In complex engineered environments, such as a baggage handling system, thousands of bags must be tracked through the system and delivered to the appropriate location once they have cleared security requirements. These systems are highly dynamic, with time-varying traffic demands and changing flight schedules. Furthermore, their behaviour is stochastic, as the path traffic takes depends on processing outcomes as it traverses the system.

The focus of this research is to address the problem of directing traffic flows within a complex conveyor network. Items entering the system have different processing requirements, priorities and exit points; combined with dynamic process flows and the stochastic nature of the system, this creates an interesting problem for analysis. A generic algorithm has been developed that is applicable to such a system and is able to learn the most appropriate method to manage the traffic flows, ensuring correct processing and delivery.
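As an illustration of simulation-based learning for routing, the sketch below uses tabular Q-learning on a toy diverting network; the network, rewards and parameters are assumptions for illustration, not the project's actual algorithm.

```python
import random

def train_router(graph, dest, episodes=500, alpha=0.5, gamma=0.9, seed=0):
    """Tabular Q-learning sketch: learn which outgoing conveyor to take
    at each merge/divert node so that items reach their exit point."""
    rng = random.Random(seed)
    q = {(n, a): 0.0 for n, outs in graph.items() for a in outs}
    for _ in range(episodes):
        node = rng.choice([n for n in graph if n != dest])
        while node != dest:
            outs = graph[node]
            # epsilon-greedy action selection
            a = rng.choice(outs) if rng.random() < 0.2 else \
                max(outs, key=lambda x: q[(node, x)])
            r = 10.0 if a == dest else -1.0          # reach exit vs travel cost
            nxt = max((q[(a, x)] for x in graph.get(a, [])), default=0.0)
            q[(node, a)] += alpha * (r + gamma * nxt - q[(node, a)])
            node = a
    return q

# Small diverting network: items at 'A' should learn the short path to 'exit'.
net = {"A": ["B", "C"], "B": ["exit"], "C": ["B"], "exit": []}
q = train_router(net, "exit")
print(max(net["A"], key=lambda a: q[("A", a)]))  # the direct route via B
```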


A generic flow chart of a BHS

Development and Application of Hybrid Soft Computing Models

Soft Computing (SC) is an inter-disciplinary area that is well-suited to the design and development of computerised intelligent systems. The main SC models include artificial neural networks, fuzzy systems and evolutionary algorithms. Each SC model, however, has its own benefits and limitations. As a result, this research focuses on the design and development of hybrid SC models, e.g. neural-fuzzy, neural-evolutionary, fuzzy-evolutionary and neural-fuzzy-evolutionary paradigms, with the aim of capitalising on the strengths of each SC model while alleviating the associated shortcomings.

An example of an evolutionary-based neural-fuzzy model is shown in Figure 1. Other hybrid SC models with online learning capabilities have been researched and developed. We have also applied the resulting hybrid SC models to a number of complex real-world problems. These include content-based image retrieval (Figure 2), fault detection and diagnosis of motors and condition monitoring of industrial systems/processes (Figure 3), as well as typing biometrics (or keystroke dynamics) and medical decision support problems (Figure 4).
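A minimal sketch of the evolutionary component of such hybrids is shown below: a toy evolutionary loop of the kind that could tune the parameters of a neural-fuzzy model. Here a simple quadratic stands in for the model's error surface; all details are illustrative assumptions.

```python
import random

def evolve(fitness, n_params, pop_size=30, gens=60, seed=0):
    """Minimal evolutionary loop: keep the fitter half of the population
    and refill with Gaussian-mutated copies of survivors. In a hybrid SC
    model, the parameter vector would encode e.g. fuzzy membership
    parameters or network weights (assumed here)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                    # lower fitness = better
        survivors = pop[: pop_size // 2]
        children = [[g + rng.gauss(0, 0.1) for g in rng.choice(survivors)]
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=fitness)

# Toy stand-in for "model error": the best parameters are (0.5, -0.25).
err = lambda p: (p[0] - 0.5) ** 2 + (p[1] + 0.25) ** 2
best = evolve(err, 2)
print([round(g, 2) for g in best])  # close to [0.5, -0.25]
```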


Figure 1 - A hybrid model combining a neural-fuzzy network and a genetic algorithm



Video driven traffic modelling

A video driven modelling technique is introduced for traffic systems, where video processing is employed to estimate metrics such as traffic volumes. These metrics are used to update the traffic system model, which is then simulated using the Paramics traffic simulation platform. Video driven model tuning has widespread potential application in traffic systems, due to the convenience and reduced costs of model development and maintenance.


Constructing Optimal Prediction Intervals for Load Forecasting Problem

Short Term Load Forecasting (STLF) is fundamental to the reliable and efficient operation of power systems. The application of artificial intelligence-based techniques, and in particular Neural Networks (NNs), has proliferated for STLF within the last two decades. Despite this, NN models are deterministic, so their application to predicting the future of stochastic systems, such as loads, is always open to question.

The objective of this research is to construct Prediction Intervals (PIs) for future loads instead of forecasting their exact values. Different techniques are applied for constructing reliable PIs for the outcomes of NNs. Statistical measures are developed and applied for quantitative and comprehensive evaluation of PIs. Based on these measures, a new cost function is designed for shortening the width of PIs without compromising their coverage probability. Evolutionary optimisation techniques are used to minimise this cost function and adjust the NN parameters. The demonstrated results clearly show that the proposed method for constructing PIs outperforms traditional techniques.
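The kind of PI quality measures described, coverage probability and interval width combined into a single cost, can be sketched as follows; the exact functional form used in the research differs, so this is illustrative only.

```python
def picp(y, lower, upper):
    """Prediction Interval Coverage Probability: the fraction of targets
    falling inside their interval."""
    hits = sum(l <= t <= u for t, l, u in zip(y, lower, upper))
    return hits / len(y)

def mean_width(lower, upper):
    """Average interval width -- narrower is better, all else equal."""
    return sum(u - l for l, u in zip(lower, upper)) / len(lower)

def pi_cost(y, lower, upper, nominal=0.9, penalty=50.0):
    """Illustrative coverage-width cost: width is penalised, and a large
    extra penalty applies when coverage drops below the nominal level."""
    cov = picp(y, lower, upper)
    return mean_width(lower, upper) + penalty * max(0.0, nominal - cov)

y = [1.0, 2.0, 3.0, 4.0]
lo = [0.5, 1.5, 2.6, 3.4]
hi = [1.5, 2.5, 3.5, 4.5]
print(picp(y, lo, hi), round(pi_cost(y, lo, hi), 2))  # 1.0 1.0
```

An evolutionary optimiser, as described above, would adjust the NN parameters to minimise such a cost.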


PIs constructed for test samples using the delta method (top) and optimized delta method (bottom)

Simulation Modelling of Pedestrian Way-finding Behaviour under Normal Situation

Our aim is to develop a comprehensive conceptual model of pedestrian way-finding behaviour under normal, non-panic conditions. To gain a deeper insight, it is necessary to investigate the requirements for inclusion of behavioural theories and knowledge from human dynamics, cognitive science and psychology. Research is investigating how to merge the strengths of the two most plausible pedestrian modelling paradigms, the Social Force Model and the Discrete Choice Model, with the computational efficiency of discrete event simulation. Our model consists of three main elements: environment representation, agent characteristics and behavioural rules. Initially, a 2D space with walls, one obstacle, an attraction, an entrance and an exit has been designed, and two entities with similar characteristics have been introduced. Currently, the speed of the entities is constant and their position is governed by attractive and repulsive forces: one attractive force is applied around the attractive spot in the environment, another motivates the entities to move towards their target, and a repulsive force provides obstacle avoidance. So far, the early stages of the model have been developed and the trajectories of the entities are captured.
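The attractive/repulsive force update described above can be sketched as a simple 2D social-force-style step; the force laws, gains and scene are illustrative assumptions, not the study's model.

```python
import math

def step(pos, target, obstacle, dt=0.1, k_att=1.0, k_rep=0.5):
    """One Euler step: a constant-speed pedestrian pulled toward its
    target and pushed away from an obstacle (illustrative force laws)."""
    def towards(a, b):
        dx, dy = b[0] - a[0], b[1] - a[1]
        d = math.hypot(dx, dy) or 1e-9
        return dx / d, dy / d, d
    ax, ay, _ = towards(pos, target)          # unit attraction
    rx, ry, d = towards(pos, obstacle)
    fx = k_att * ax - k_rep * rx / d ** 2     # repulsion decays with distance
    fy = k_att * ay - k_rep * ry / d ** 2
    n = math.hypot(fx, fy) or 1e-9
    return pos[0] + dt * fx / n, pos[1] + dt * fy / n   # constant speed

# Trajectory capture, as in the study's early-stage model:
p = (0.0, 0.0)
traj = [p]
for _ in range(50):
    p = step(p, target=(5.0, 0.0), obstacle=(2.5, 0.2))
    traj.append(p)
print(round(traj[-1][0], 2))  # the entity has skirted the obstacle toward x = 5
```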


Movement trajectory of two pedestrians with different motivations while moving towards a destination

Uncertainty Quantification through Construction of Prediction Intervals

As universal approximators, Neural Networks (NNs) have achieved great success in many regression problems over the last two decades. However NNs are trained or used, they suffer from two basic deficiencies: susceptibility to uncertainty, and the lack of any indication of their accuracy. To cope with these deficiencies, the construction of Prediction Intervals (PIs) for NN outputs has been proposed in the literature. Although NN-based PIs have been investigated for almost two decades, many related issues remain unarticulated: (i) PI assessment, (ii) PI optimisation, and (iii) reducing the computational requirements.

Motivated by these gaps in the literature, this research first attempts to develop practically useful measures for the quantitative evaluation of PIs. Secondly, a new cost function is developed based on these measures and minimised in order to find the optimum values of some critical parameters of NN models. The optimisation algorithm attempts to develop NNs leading to narrower PIs without compromising their coverage probability. Finally, a new method is proposed for PI construction that does not require the calculation of large matrices, as required by other methods such as the delta and Bayesian techniques.


Terminology and concept of prediction interval


Prediction intervals for targets

Developing Artificial Intelligent-based Metamodels for Complex Systems

The motivation for this research comes from recent trends in both academia and industry towards the integration of simulation modelling techniques in system design and operation. The development of detailed 3D discrete event simulation models has become common practice for the modelling and analysis of complex systems. These models are, however, expert-intensive throughout their lifecycle, and their computational requirements are substantial, hindering their application to real-time operational planning and optimisation.

This research aims at developing abstract metamodels for modelling operations within complex man-made systems. A metamodel is a tool for the analysis of a detailed simulation model, providing insight into some aspect of the underlying system. Artificial intelligence-based methods are used in this research for predicting the performance measures of manufacturing enterprises. Feedforward neural network and adaptive neuro-fuzzy inference system metamodels are compared based on their performance in finding highly nonlinear relationships between independent and dependent variables. The demonstrated results indicate that both methods are capable of generating accurate point predictions. While neural network point predictions are more accurate, the neuro-fuzzy models are more transparent.
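A metamodel of this kind can be illustrated with a tiny feedforward network fitted to samples from a stand-in "simulation" (here a smooth function); the network size, training scheme and data are assumptions for illustration.

```python
import math
import random

def train_metamodel(samples, hidden=8, epochs=3000, lr=0.05, seed=0):
    """One-hidden-layer feedforward net fitted to (input, output) pairs
    sampled from a slow simulation; the metamodel then gives fast
    approximate predictions. Trained by per-sample gradient descent."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    def forward(x):
        h = [math.tanh(w1[i] * x + b1[i]) for i in range(hidden)]
        return h, sum(w2[i] * h[i] for i in range(hidden)) + b2
    for _ in range(epochs):
        for x, t in samples:
            h, y = forward(x)
            e = y - t                       # prediction error
            for i in range(hidden):
                grad_h = e * w2[i] * (1 - h[i] ** 2)   # backprop through tanh
                w2[i] -= lr * e * h[i]
                w1[i] -= lr * grad_h * x
                b1[i] -= lr * grad_h
            b2 -= lr * e
    return lambda x: forward(x)[1]

# "Simulation" stand-in: a performance measure as a smooth function of a parameter.
sim = lambda x: math.sin(x)
data = [(x / 10.0, sim(x / 10.0)) for x in range(0, 31)]
f = train_metamodel(data)
print(round(f(1.5), 2))  # close to sin(1.5)
```

Once trained, evaluating `f` costs microseconds, whereas each call to the real simulation could take minutes, which is the point of the metamodel.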

From real systems to neural network metamodels

Enabling Ambient Intelligence for Manufacturing Processes through Distributed Camera Networks

Project Summary

ARC LP110200364 - Enabling Ambient Intelligence for Manufacturing Processes through Distributed Camera Networks
Chief Investigators: Prof Saeid Nahavandi, A/Prof Doug Creighton
Research Fellow: Dr Samer Hanoun
Industry Partner(s): Boeing Australia Ltd (Dr Phil Crothers), General Motors-Holden's Automotive Ltd (Mr Gary Carroll)

Project Description

Distributed camera networks are recognised as one of the technological cornerstones of Ambient Intelligence. However, in current manufacturing environments, camera networks lack flexibility and contribute limited knowledge to process variation management. This project aims to address these capability gaps by answering the following research questions:

  • How to optimise the design and configuration of a heterogeneous camera network?
  • How to schedule camera nodes in a dynamically changing manufacturing environment?
  • How to evaluate camera network design and performance of the scheduling algorithm in a virtual environment?

The answers to these research questions are highly interrelated. An efficient camera network design will give the scheduling algorithm the flexibility to allocate cameras to dynamic visual jobs. Capped infrastructure budgets will result in camera networks that can only provide a partial view of the whole manufacturing process at any one time. On the other hand, a primitive scheduling algorithm will not take advantage of a well-designed, optimal network, hindering the network's capability.

Project Details

Phase 1: Camera network design optimisation

Situational awareness in manufacturing plays a critical role in maintaining a safe and efficient workplace. Continuous monitoring of relevant activities, such as workers entering work cells, robots assembling parts and completion of manufactured parts, is crucial to the operational success of any manufacturing process. A camera network can enable the required monitoring operations; however, its coverage of designated targets and its placement in the environment are key to achieving full and adequate monitoring. Several factors contribute to an efficient camera network design. The first is the set of candidate mounting locations for the cameras: choosing good locations enables better views. The second is the coverage profile of the targets: each target may need to be monitored by a certain number of cameras for better identification, and detected at a specific resolution for inspection purposes. The final factor is the overall budget for constructing the camera network: adopting highly capable cameras makes the coverage requirements easier to meet.

In order to enable an efficient camera network design and optimal placement using multiple types of cameras for target coverage with visibility and resolution constraints, the research in this project has:

  • Modelled the camera viewing frustum based on actual camera properties such as image sensor size, image resolution, lens focal length and f-stop. An automatic focus point is determined from the minimum spatial resolution defined, limiting the depth of the camera viewing frustum. Figure 1 shows the schematic of the modelled camera viewing frustum.
  • Developed a new coverage quality metric to select the best camera configuration centralising the covered target points in its FoV. The quality metric is applied when comparing multiple camera placements with equal coverage.
  • Proposed a new neighbourhood generation function to effectively handle the discrete nature of the camera placement problem.
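A minimal 2D stand-in for the coverage test at the heart of such placement optimisation might look like this; the FoV, range and candidate placements are hypothetical, and the project's actual model is a full 3D frustum with occlusion handling.

```python
import math

def covers(cam, target, fov_deg=60.0, max_range=10.0):
    """Does a camera at (x, y, heading_deg) see a target point? A 2D
    stand-in for the 3D frustum test (no occlusion handling here)."""
    dx, dy = target[0] - cam[0], target[1] - cam[1]
    d = math.hypot(dx, dy)
    if d == 0 or d > max_range:
        return d == 0
    bearing = math.degrees(math.atan2(dy, dx))
    diff = abs((bearing - cam[2] + 180) % 360 - 180)   # wrapped angle
    return diff <= fov_deg / 2

def best_placement(candidates, targets):
    """Pick the candidate (location, heading) covering the most targets.
    The project's optimiser instead explores such discrete placements
    with a neighbourhood function and annealing rather than brute force."""
    return max(candidates, key=lambda c: sum(covers(c, t) for t in targets))

targets = [(3, 0), (4, 1), (5, -1), (-2, 0)]
cands = [(0, 0, 0.0), (0, 0, 90.0), (0, 0, 180.0)]
print(best_placement(cands, targets))  # heading 0.0 sees three targets ahead
```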


The manufacturing workplace environment is represented by a 3D CAD model to enable examination of physical occlusions and spatial coverage constraints. Figure 2 shows a 3D synthetic CAD model of the manufacturing work cell. Figures 3 and 4 show samples of the placed cameras' rendered fields-of-view, as well as the visual deployments in 2D and 3D.



Phase 2: Tasks scheduling in camera networks

Camera networks have been increasingly adopted in manufacturing to enhance workplace safety and maintain production quality levels. Effective management of the cameras is crucial for the network to achieve its designated monitoring goal and fulfil task-specific requirements. Continuous monitoring of a set of activities at known locations is achievable using a network of cameras optimally placed over the monitored environment; however, the operational success of utilising the camera network to monitor further, unforeseen temporal activities depends on many factors.

The first of these, and the most important, is the adoption of an effective camera management and control strategy. This strategy facilitates the selection of the best camera or cameras for monitoring each specific activity, based on factors such as each camera's current status, workload and the monitoring requirements. The second factor is the network size (i.e., the number of cameras): a larger network can guarantee dedicated monitoring time for all activities and makes it easier to find available cameras when needed. The third factor is the cameras' capability: cameras with extended pan, tilt and focal length ranges can cover various areas in the monitored environment and detect activities at the specific resolution required for inspection. The final factor is the expected occurrence rate of each activity in the monitored environment: a highly dynamic environment will have activities occurring frequently and closely in time, requiring more resources to provide adequate monitoring.

In order to enable an effective camera management, the research in this project has:

  • Regarded the multicamera task assignment as a scheduling problem, where active Pan-Tilt-Zoom (PTZ) cameras, initially placed for monitoring continuously occurring tasks, are treated as resources required to monitor additional tasks with specific requirements.
  • Proposed a reactive scheduling approach to address the online nature of the problem. The Multi-Attribute Utility Theory (MAUT) is adopted to derive a suitable utility function to quantify each camera’s ability to monitor a task.
  • Demonstrated the proposed framework on a manufacturing workcell, showing its value given that deploying a large-scale camera network without first designing and testing its operation would be largely infeasible.
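The MAUT-style utility scoring can be sketched as below; the attributes, weights and camera records are illustrative assumptions, not the project's derived utility function.

```python
def camera_utility(cam, task, weights=(0.5, 0.3, 0.2)):
    """MAUT-style scoring sketch: combine normalised attributes (how well
    the task matches the camera's current view, remaining capacity after
    workload, achievable resolution) into a single utility value."""
    attrs = (cam["view_match"], 1.0 - cam["workload"], cam["resolution"])
    return sum(w * a for w, a in zip(weights, attrs))

def assign(cameras, task):
    """Reactive scheduling step: give the new task to the camera with
    the highest utility for it."""
    return max(cameras, key=lambda c: camera_utility(c, task))

cams = [
    {"id": 1, "view_match": 0.9, "workload": 0.8, "resolution": 0.6},
    {"id": 2, "view_match": 0.7, "workload": 0.1, "resolution": 0.8},
]
print(assign(cams, task={"location": (3, 4)})["id"])  # → 2 (lightly loaded)
```

Camera 1 matches the view better, but its heavy workload drags its utility down, so the lightly loaded camera 2 wins: the kind of trade-off a single-attribute rule would miss.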

Several experiments were conducted on a manufacturing setup represented by the synthetic workcell model illustrated in Figure 5. Active robots are located throughout the workcell, performing specific manufacturing and assembly operations. Experiments were conducted to investigate and measure the effect of different factors on the camera network setup and operations, including the number of cameras versus the number of tasks, task occurrence rates and task priority levels.


A practical experiment was conducted to verify the applicability of the proposed method and the ability of the proposed scheduler to operate in real time and interface with actual cameras. Figures 6, 7 and 8 show the laboratory environment for the experiment, the three Canon VB-C60 cameras placed in the outlined area to monitor one of the three continuously occurring activities, and the simulated camera views alongside the equivalent views from the actual cameras.



Project Published Research Findings

  • S. Hanoun, A. Bhatti, D. Creighton, S. Nahavandi, P. Crothers and C. G. Esparza, “Target Coverage in Camera Networks for Manufacturing Workplaces”, Journal of Intelligent Manufacturing, 2014, 1-15, DOI: 10.1007/s10845-014-0946-z.
  • S. Hanoun, A. Bhatti, D. Creighton, S. Nahavandi, P. Crothers and G. Carroll, “Task Assignment in Camera Networks: A Reactive Approach for Manufacturing Environments”, IEEE Systems Journal, DOI: 10.1109/JSYST.2016.2530788.
Page custodian: Deakin Research