Making an impact on airport security
Airports are susceptible to terrorist threats in several ways. Considering the payload of a typical flight, these threats can be broken down into threats from checked baggage, carry-on baggage, passengers and air cargo.
We have worked with both industry and government to assess the threat to airports and address the impacts of increased security requirements on the sector. Data collected during live trials has enabled the development of a set of model platforms that can quickly assess the impact on the security process of increased security requirements, new policies and even new technologies.
Dr Samer Hanoun is a Senior Research Fellow in simulation and scheduling. His research interests are focused on job shop scheduling techniques, approximation algorithms and meta-heuristic methods for single objective and multi-objective optimisation.
Recently, Dr Hanoun has been engaged in the identification and assessment of technical solutions for enhancing job shop scheduling in the joinery manufacturing domain. By defining models and heuristics, production planners can make the most of automated tools for manufacturing optimisation, which in turn takes much of the grind out of the planning work.
Through this research we have been able to develop Pareto optimisation methodologies and tools for a set of real and conflicting manufacturing objectives such as material waste and tardiness, which can be utilised to provide production managers with a set of production schedules to select from subject to the manufacturing preferences at hand.
Multi-objective scheduling for joinery manufacturing
The Australian furniture industry represents 4% of Australia's manufacturing base, with an annual turnover of $9.5 billion. Australian furniture manufacturers are losing the battle against imports, due to high fixed costs and long delivery times. Improved scheduling of their constrained resources provides an opportunity to stay competitive. However, traditional scheduling methods cannot be easily implemented in practice, as they primarily consider a single objective and cannot deal with multi-objective dynamic processes.
Considering a typical furniture manufacturing process like joinery making, this project aims to propose and develop a novel multi-objective scheduling strategy to optimise the outputs and dramatically reduce delivery time.
In this project, the challenge is formulated as a bi-criteria optimisation problem considering two objectives – the material waste and the tardiness. The material waste objective, expressed in terms of cost savings, is given a higher priority over the tardiness objective. This multi-objective nature enables producing a set of solutions with different quality levels, as shown in Figure 1 below, providing the decision-maker (production manager) with the opportunity to choose the best schedule according to an acceptable compromise.
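The bi-criteria formulation rests on Pareto dominance: one schedule dominates another when it is no worse on both objectives and strictly better on at least one. A minimal Python sketch, with purely illustrative (waste, tardiness) values:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b on (waste, tardiness):
    no worse on either objective and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated (waste, tardiness) pairs."""
    return [s for s in solutions if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical candidate schedules scored as (material waste, tardiness)
schedules = [(120, 5), (100, 9), (100, 4), (90, 12), (150, 2)]
front = sorted(pareto_front(schedules))
```

The production manager then picks one point from `front` according to the manufacturing preferences at hand, for instance trading a little extra tardiness for lower material waste.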
A mathematical model is developed to capture the nature of jobs and the flow precedence between different operations and their associated stages in the joinery. The developed model is used to explore and analyse different optimisation techniques.
A generalised flowshop model is developed for modelling the parallel flow stages of jobs and operations in the joinery. The model is used to configure the joinery and the routing of jobs. It gives the decision-maker more flexibility and enables easy migration to other domains, or to cases where the configuration of the joinery changes. A scheduling algorithm is developed based on the simulated annealing technique, and its performance is analysed based on the accuracy of the generated solutions.
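As a rough sketch of the simulated annealing approach (the job data, swap neighbourhood and single tardiness objective below are illustrative simplifications, not the project's actual bi-criteria model):

```python
import math
import random

def simulated_annealing(init, neighbour, cost, t0=100.0, cooling=0.95, iters=2000, seed=1):
    """Generic SA loop: always accept improving moves, accept worsening
    moves with probability exp(-delta / T), and cool T geometrically."""
    rng = random.Random(seed)
    current, best = init, init
    t = t0
    for _ in range(iters):
        cand = neighbour(current, rng)
        delta = cost(cand) - cost(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
        if cost(current) < cost(best):
            best = current
        t *= cooling
    return best

# Toy single-machine instance: jobs as (processing time, due date)
jobs = [(4, 4), (2, 3), (6, 8), (3, 5)]

def tardiness(seq):
    """Total tardiness of a job sequence."""
    t = total = 0
    for i in seq:
        p, d = jobs[i]
        t += p
        total += max(0, t - d)
    return total

def swap(seq, rng):
    """Neighbourhood move: swap two positions in the sequence."""
    i, j = rng.sample(range(len(seq)), 2)
    s = list(seq)
    s[i], s[j] = s[j], s[i]
    return tuple(s)

best = simulated_annealing(tuple(range(len(jobs))), swap, tardiness)
```

The project's algorithm applies the same accept/cool loop to joinery schedules, evaluated against both waste and tardiness rather than this single toy objective.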
The proposed scheduling algorithm is implemented through an evaluation software package (B-version). The developed software acts as a production planning and scheduling tool, which produces optimal production plans for most problems facing small- to medium-sized companies and near optimal ones for larger problem sizes.
One of the main advantages of the developed software is that it can be easily applied in other manufacturing domains, such as metal fabrication, food processing, supply chain management and workforce management.
Airport security is a major concern for governments and passengers alike, and much effort has recently been expended to improve safety across all facets of airport operations. IISRI has placed a heavy emphasis on airport security research over the last several years and has been working closely with industry and the Australian Government to understand the impacts on airports of increased security considerations.
Liquids, aerosols and gels (LAGs) screening
IISRI was appointed by the Australian Government's Office of Transport Security to undertake a study on the introduction of liquids, aerosols and gels (LAGs) screening technology into Australian airports. The study investigated the impact on the existing screening process if screening of LAGs in carry-on luggage were introduced.
The process involved the screening of LAGs under various scenarios, covering options ranging from a blanket approach to random screening.
The technology was trialled over a six-week period in two Australian airports. Data collected during this period was used to drive simulation models of the environments, leading to the ability to predict system changes to operational policies and loadings.
Passenger body screening
IISRI was tasked by the Department of Infrastructure, Transport, Regional Development and Local Government's (Infrastructure) Office of Transport Security to undertake a study into the introduction of advanced screening technology in Australian airports – specifically the security screening of domestic and international passengers (PAX).
The aim of the study was to report the impacts and associated costs of completing varying levels of PAX screening using different advanced technology through the screening process. State-of-the-art passenger screening technology was tested in order to determine the impact on the entire screening process if such technology were adopted.
Over a six-week period, across three airports, data was collected as trained security operators screened passengers and passenger baggage according to the methods, techniques and equipment to be used for screening (METS). Performance data collected during this trial, together with operational data from an extended period over the previous 12 months, was then used to model and analyse the impact on each industry participant.
Three beneficial outcomes were generated through this data collection and model development. Firstly, the compilation of real-world datasets of PAX behaviour, covering details such as walking speeds and divest timings. Secondly, the development of an airport screening point modelling platform that can be used to rapidly evaluate the impact of newly available or over-the-horizon technologies, without the expense and logistics of a live trial. Thirdly, the generation of data relevant to assessing the impact of any future relaxation of international LAGs screening requirements.
The outcome of the study presented the operational and financial impacts of the introduction of selected body scan and bottle scan technologies to three selected Australian domestic and international screening points.
Air cargo operations
IISRI was contracted by the Department of Infrastructure, Transport, Regional Development and Local Government to conduct a study within the air cargo environment to determine the impact of implementing X-ray examination at varying stages within the existing air cargo industry. This study related to the Australian Government X-ray Technology Capability Assessment, an assessment of current technologies' ability to detect IEDs within air cargo and of the impact on the industry of implementing such technology.
A selection of international and domestic cargo terminal operators (CTOs) and metropolitan cargo facilities in Sydney were chosen for the study. These industry participants covered all the main sectors of the air cargo industry, including international express, domestic express, perishable forwarders, international CTOs, domestic CTOs, and retail and wholesale freight forwarders.
The data used for the study was gathered from industry participants through onsite trials. Data detailing cargo volumes, turn-around times, processing times and staffing levels was collected, as well as many other aspects relevant to the air cargo process. A model of each facility was created and analysed through discrete event simulation. Multiple scenarios were created by varying cargo volume examination, examination policies and the use of different X-ray equipment.
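The scenario analysis can be illustrated with a deliberately small queueing sketch: a single X-ray station where only a fraction of consignments is selected for examination. All rates, times and the station layout below are invented for illustration, not trial data:

```python
import random

def simulate_xray(n_consignments, mean_interarrival, exam_time, exam_fraction, seed=0):
    """Single X-ray station: consignments arrive with exponential interarrival
    times and a fraction is selected for a fixed-duration examination.
    Returns the number examined and their average time in system."""
    rng = random.Random(seed)
    t_arrival = 0.0      # arrival clock
    server_free = 0.0    # time the X-ray unit next becomes free
    total_delay = 0.0
    examined = 0
    for _ in range(n_consignments):
        t_arrival += rng.expovariate(1.0 / mean_interarrival)
        if rng.random() < exam_fraction:
            start = max(t_arrival, server_free)   # queue if the unit is busy
            server_free = start + exam_time
            total_delay += server_free - t_arrival
            examined += 1
    return examined, (total_delay / examined if examined else 0.0)

# Compare a blanket examination policy against 30% random selection
_, delay_all = simulate_xray(500, 1.0, 1.5, 1.0)
_, delay_some = simulate_xray(500, 1.0, 1.5, 0.3)
```

With examination slower than arrivals, blanket examination makes the queue grow without bound, while partial selection keeps delays modest – the kind of policy trade-off the full facility models quantified at scale.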
All processes that occur between receiving and dispatch of export consignments were analysed. Additionally, other processes in each facility that might be affected by the introduction of examination processes, such as imports, were considered.
IISRI, in conjunction with our industry partner Deneb Australasia, has developed a software platform to rapidly model baggage handling systems (BHS).
The task of modelling a BHS could take months. By using the rapid simulation platform and modelling techniques, the time-frame to develop a fully functional BHS model has been reduced to weeks.
The platform covers all aspects of the BHS, from barcode reading and security screening to early bag storage and sortation systems.
Manufacturing and assembly
IISRI has experience in a range of visualisation, simulation and optimisation projects in the manufacturing domain. Benefits of applying simulation to manufacturing environments include improved throughput, process design and reduced inventory.
We have successfully delivered projects across the sector, providing expertise to rapidly model and analyse facilities, along with tools to evaluate schedules, resource requirements, facility layouts and the impact of failures and scheduled downtimes.
Warehousing and material handling
We have successfully applied our rapid modelling techniques to warehouse facilities to dramatically reduce the time required to develop, test and analyse the simulation model. Not only does the rapid model development reduce model building costs, it allows for more time and effort to be placed on the creation of control methodologies to optimise the facility.
Additionally, the model can be used as a visualisation engine to observe production systems, driven by existing control systems or log files.
Control of polymerisation batch reactor
Polymer products, or plastics, are used in numerous industries including construction, manufacturing, electronics, transportation, food processing, and aerospace. Despite the wide range of polymer applications, polymerisation reactor control is still a very challenging task, as the polymerisation reaction is complex and nonlinear in nature.
Controlling polymerisation reaction variables relies on whether they can be measured, estimated, or measured with some time delay. One of the major difficulties encountered in polymerisation reactor control is the lack of reliable online real-time analytical data. Typically, temperature is used as an intermediate variable to control polymer quality, as the quality and quantity of polymer are directly dependent on reactor temperature.
The aim of this research is to develop advanced nonlinear controllers for polymerisation batch reactors that provide a smooth, safe and waste-free production line, as well as high-quality polymer products.
Three advanced nonlinear controllers have been designed and implemented in a real polystyrene plant: an artificial neural network-based model predictive controller (NN-MPC), a fuzzy logic controller (FLC) and a generic model controller (GMC). The integral absolute error (IAE) performance criterion is used as the cost function in the optimisation process to tune the controller parameters. The proposed controllers are tested in various scenarios, including tracking the optimal temperature batch recipe and process disturbance rejection.
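The IAE criterion itself is simple to state: the integral of the absolute tracking error over the batch, approximated here by a Riemann sum over sampled data. The temperature samples are invented for illustration:

```python
def integral_absolute_error(setpoint, measured, dt):
    """IAE = integral of |r(t) - y(t)| dt, approximated by a Riemann sum
    over uniformly sampled data."""
    return sum(abs(r - y) for r, y in zip(setpoint, measured)) * dt

# Hypothetical reactor temperature samples (degrees C) at dt = 1 s
ref = [90.0, 90.0, 90.0, 90.0]          # optimal batch recipe setpoint
tight = [89.5, 90.2, 90.0, 90.0]        # close tracking
overshoot = [85.0, 87.0, 92.0, 91.0]    # lag followed by overshoot
iae_tight = integral_absolute_error(ref, tight, 1.0)
iae_overshoot = integral_absolute_error(ref, overshoot, 1.0)
```

A tuning routine minimises this IAE value over the controller parameters, so a controller that lags and overshoots is penalised more heavily than one that tracks the recipe closely.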
Experimental results reveal that the NN-MPC is superior at tracking the optimal reactor temperature profile, without the noticeable overshoot observed with the FLC and GMC.
Simulation-based learning for control of complex conveyor networks
The demands placed on material handling systems continue to grow with increasing design complexity and higher throughput requirements. In complex engineered environments, such as a baggage handling system, thousands of bags must be tracked through the system and delivered to the appropriate location once the bags have cleared security requirements.
These systems are highly dynamic, with time varying traffic demands and changing flight schedules. Furthermore, complexity is evident in the stochastic behaviour of these systems as the path traffic will take is dependent on the processing outcomes as traffic traverses the system.
The focus of this research is the problem of directing traffic flows within a complex conveyor network. Items entering the system have different processing requirements, priorities and exit points; combined with dynamic process flows and the stochastic nature of the system, this creates an interesting problem for analysis.
A generic algorithm has been developed that is applicable to such systems and is able to learn the most appropriate method for managing the traffic flows, ensuring correct processing and delivery.
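One way to realise such learning, offered purely as an illustrative sketch (the network layout, reward scheme and tabular Q-learning are assumptions, not the study's published method), is to treat each diverter as a state and learn which branch delivers an item class to its required exit:

```python
import random

# Toy conveyor network: each diverter node routes an item down one branch.
# Node names and topology are invented for illustration.
edges = {'in': ['a', 'b'], 'a': ['exit1', 'exit2'], 'b': ['exit2', 'exit3']}
GOAL = 'exit2'  # the exit this item class must reach

def q_learning(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning over (diverter, branch) pairs with epsilon-greedy
    exploration; reward only on delivery (+1 correct exit, -1 wrong exit)."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s, nxt in edges.items() for a in nxt}
    for _ in range(episodes):
        s = 'in'
        while s in edges:  # exits are terminal states
            acts = edges[s]
            a = rng.choice(acts) if rng.random() < eps else max(acts, key=lambda x: q[(s, x)])
            r = 1.0 if a == GOAL else (-1.0 if a.startswith('exit') else 0.0)
            nxt_best = max((q[(a, x)] for x in edges.get(a, [])), default=0.0)
            q[(s, a)] += alpha * (r + gamma * nxt_best - q[(s, a)])
            s = a
    return q

q = q_learning()
# Greedy policy after learning: the branch chosen at each diverter
policy = {s: max(acts, key=lambda x: q[(s, x)]) for s, acts in edges.items()}
```

After training, the greedy policy routes the item class to its required exit from either upstream diverter, mirroring how the developed algorithm learns appropriate traffic management rather than having routes hand-coded.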
Development and application of hybrid soft computing models
Soft computing (SC) is an inter-disciplinary area that is well-suited for the design and development of computerised intelligent systems. The main SC models include artificial neural networks, fuzzy systems and evolutionary algorithms. Each SC model, however, has its own benefits and limitations.
As a result, this research focuses on the design and development of hybrid SC models (e.g. neural-fuzzy, neural-evolutionary, fuzzy-evolutionary, neural-fuzzy-evolutionary paradigms) with the aim of capitalising on the strengths of each SC model and, at the same time, alleviating the associated shortcomings.
An example of an evolutionary-based neural-fuzzy model is shown in Figure 1. Other hybrid SC models with online learning capabilities have been researched and developed. We have also applied the resulting hybrid SC models to a number of complex real-world problems. These include content-based image retrieval (Figure 2), fault detection and diagnosis of motors and condition monitoring of industrial systems/processes (Figure 3), as well as typing biometrics (or keystroke dynamics) and medical decision support problems (Figure 4).
Video-driven traffic modelling
A video-driven modelling technique is used for traffic systems, where video processing is employed to estimate metrics such as traffic volumes. These metrics are used to update the traffic system model, which is then simulated using the Paramics traffic simulation platform. Video-driven model tuning has widespread potential application in traffic systems, due to the convenience and reduced costs of model development and maintenance.
Constructing optimal prediction intervals for the load forecasting problem
Short-term load forecasting (STLF) is fundamental to the reliable and efficient operation of power systems. The application of artificial intelligence-based techniques, and in particular neural networks (NNs), has proliferated in STLF over the last two decades. Despite this, NN models are deterministic, so their application to predicting the future of stochastic systems, such as loads, remains questionable.
The objective of this research is to construct prediction intervals (PIs) for future loads instead of forecasting their exact values. Different techniques are applied for constructing reliable PIs for the outcomes of NNs. Statistical measures are developed and applied for quantitative and comprehensive evaluation of PIs.
According to these measures, a new cost function is designed for shortening the width of PIs without compromising their coverage probability. Evolutionary optimisation techniques are used for minimisation of this cost function and adjustment of NN parameters. Demonstrated results clearly show that the proposed method for constructing PIs outperforms the traditional techniques.
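The key quantities can be sketched as follows: the PI coverage probability (PICP), the normalised mean PI width, and a combined cost that penalises coverage falling below a nominal level. The exact penalty form and constants here are illustrative assumptions, not the study's published cost function:

```python
import math

def pi_metrics(lower, upper, targets, y_range, mu=0.90, eta=50.0):
    """PI coverage probability (PICP) and normalised mean PI width (NMPIW),
    combined into a width-based cost with an exponential penalty whenever
    coverage falls below the nominal level mu (penalty form is illustrative)."""
    n = len(targets)
    picp = sum(1 for l, u, t in zip(lower, upper, targets) if l <= t <= u) / n
    nmpiw = sum(u - l for l, u in zip(lower, upper)) / (n * y_range)
    penalty = math.exp(-eta * (picp - mu)) if picp < mu else 0.0
    return picp, nmpiw, nmpiw * (1.0 + penalty)
```

An evolutionary optimiser can then adjust the NN parameters to minimise the returned cost, which shortens the intervals while keeping coverage at or above the nominal level.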
Simulation modelling of pedestrian way-finding behaviour under normal situations
Our aim is to develop a comprehensive conceptual model of pedestrian way-finding behaviour under normal, non-panic conditions. To gain a deeper insight, it is necessary to investigate the requirements for including behavioural theories and knowledge from human dynamics, cognitive science and psychology.
Research is investigating how to merge the strength of the two most plausible pedestrian modelling paradigms, social force model and discrete choice model, with the computational efficiency of discrete event simulation.
Our model consists of three main elements: environment representation, agent characteristics and behavioural rules. Initially, a 2D space with walls, one obstacle, an attraction, entrance and exit has been designed. Two entities with similar characteristics have been introduced. Currently, the speed of the entities is constant and the position is manipulated by attractive and repulsive forces.
One attractive force is applied around an attraction in the environment, another motivates the entities to move towards their target, and a repulsive force provides obstacle avoidance. So far, the early stages of the model have been developed and the trajectories of entities captured.
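A toy version of this force model (constant speed, unit-normalised net force, and made-up gains and geometry) can be sketched as:

```python
import math

def step(pos, target, obstacle, dt=0.1, k_att=1.0, k_rep=0.5):
    """One Euler step of a simplified force model: a constant-speed entity is
    pulled towards its target and pushed away from a point obstacle (the
    gains and the ~1/d^2 repulsion law are illustrative choices)."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1])
    def norm(v):
        return math.hypot(v[0], v[1]) or 1e-9
    to_target = sub(target, pos)
    from_obstacle = sub(pos, obstacle)
    att = tuple(k_att * c / norm(to_target) for c in to_target)               # unit attraction
    rep = tuple(k_rep * c / norm(from_obstacle) ** 3 for c in from_obstacle)  # ~1/d^2 repulsion
    force = (att[0] + rep[0], att[1] + rep[1])
    f = norm(force)
    # Constant speed: move a fixed distance along the net force direction
    return (pos[0] + dt * force[0] / f, pos[1] + dt * force[1] / f)

pos = (0.0, 0.0)
for _ in range(50):
    pos = step(pos, target=(5.0, 0.0), obstacle=(2.5, 0.2))
```

Iterating `step` traces out an entity trajectory that detours around the obstacle while still converging on the target, matching the captured-trajectory stage of the study.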
Uncertainty quantification through construction of prediction intervals
As universal approximators, neural networks (NNs) have achieved great success in many regression problems over the last two decades. No matter how NNs are trained or used, they suffer from two basic deficiencies: they are susceptible to uncertainty and they give no indication of their accuracy.
To cope with these deficiencies, the construction of prediction intervals (PIs) for NN outputs has been proposed in the literature. Although NN-based PIs have been investigated for almost two decades, many issues relating to them remain unarticulated, including PI assessment, PI optimisation and reducing the computational requirements.
Motivated by these gaps in the literature, this research first attempts to develop practically useful measures for the quantitative evaluation of PIs. Secondly, a new cost function is developed based on these measures and minimised in order to find the optimum values of some critical parameters of NN models. The optimisation algorithm attempts to develop NNs leading to narrower PIs without compromising their coverage probability.
Finally, a new method is proposed for PI construction that does not require calculation of the large matrices needed by other methods, such as the delta and Bayesian techniques.
Developing artificial intelligence-based metamodels for complex systems
Motivations for this research are the recent trends in both academia and industry towards the integration of simulation modelling techniques in system design and operation. Development of detailed 3D discrete event simulation models has become common practice for modelling and analysis of complex systems. These models are, however, expert-intensive throughout their life cycle and computationally expensive, hindering their application to real-time operational planning and optimisation.
This research aims to develop abstract metamodels for modelling operations within complex man-made systems. A metamodel is a tool for analysis of a detailed simulation model, which provides insight into some aspect of the underlying system. Artificial intelligence-based methods are used in this research for predicting the performance measures of manufacturing enterprises.
Feedforward neural network and adaptive neuro-fuzzy inference system metamodels are compared based on their performance in finding highly nonlinear relationships between independent and dependent variables. Demonstrated results indicate that both methods are capable of generating accurate point predictions. While the neural network point predictions are more accurate, the neuro-fuzzy models are more transparent.
Enabling ambient intelligence for manufacturing processes through distributed camera networks
Distributed camera networks are recognised as one of the technological cornerstones of Ambient Intelligence. However, in current manufacturing environments, camera networks lack flexibility and contribute limited knowledge to process variation management. This project aims to address these capability gaps by answering the following research questions:
- How to optimise the design and configuration of a heterogeneous camera network?
- How to schedule camera nodes in a dynamically changing manufacturing environment?
- How to evaluate camera network design and performance of the scheduling algorithm in a virtual environment?
The answers to these research questions are highly interrelated. An efficient camera network design gives the scheduling algorithm the flexibility to allocate cameras to dynamic visual jobs. Capped infrastructure budgets will result in camera networks that can only provide a partial view of the whole manufacturing process at any one time. Conversely, a primitive scheduling algorithm will not take advantage of an optimal network design, hindering the network's capability.
Phase 1: Camera network design optimisation
Situational awareness in manufacturing plays a critical role in maintaining a safe and efficient workplace. Continuous monitoring of relevant activities, such as workers entering work cells, robots assembling parts and the completion of manufactured parts, is crucial to the operational success of any manufacturing process. A camera network can enable the required monitoring operations; however, its coverage of designated targets and its placement in the environment are key to achieving full and adequate monitoring. Several factors contribute to an efficient camera network design:
- The candidate mounting locations for the cameras: choosing good locations enables better views.
- Each target's coverage profile: a target might need to be monitored by a certain number of cameras for reliable identification, or detected at a specific resolution for inspection purposes.
- The overall budget for constructing the camera network: cameras with higher capabilities meet the coverage requirements more easily, but at greater cost.
In order to enable an efficient camera network design and optimal placement using multiple types of cameras for target coverage with visibility and resolution constraints, the research in this project has:
- Modelled the camera viewing frustum based on actual camera properties, such as image sensor size, image resolution, lens focal length and f-stop. An automatic focus point is achieved based on the minimum spatial resolution defined to limit the depth of the camera viewing frustum. Figure 1 shows the schematic of the modelled camera viewing frustum.
- Developed a new coverage quality metric to select the camera configuration that best centralises the covered target points in its field of view (FoV). The quality metric is applied when comparing multiple camera placements with equal coverage.
- Proposed a new neighbourhood generation function to effectively handle the discrete nature of the camera placement problem.
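The budgeted coverage trade-off at the heart of the placement problem can be illustrated with a simple greedy sketch. The candidate poses, coverage sets and costs are invented, and the project's actual optimiser uses a neighbourhood-based search rather than this greedy rule:

```python
# Hypothetical candidate camera configurations: each pose covers a set of
# target points and carries an installation cost (all values invented).
candidates = {
    'cam_A_pose1': ({'t1', 't2', 't3'}, 3.0),
    'cam_A_pose2': ({'t3', 't4'}, 3.0),
    'cam_B_pose1': ({'t4', 't5', 't6'}, 5.0),
    'cam_B_pose2': ({'t1', 't6'}, 5.0),
}

def greedy_placement(targets, budget):
    """Repeatedly pick the affordable pose with the best new-coverage per
    unit cost until all targets are covered or the budget is exhausted."""
    chosen, covered, spent = [], set(), 0.0
    while covered != targets:
        best = max(
            (c for c, (cov, cost) in candidates.items()
             if c not in chosen and spent + cost <= budget and cov - covered),
            key=lambda c: len(candidates[c][0] - covered) / candidates[c][1],
            default=None)
        if best is None:
            break  # nothing affordable adds coverage
        cov, cost = candidates[best]
        chosen.append(best)
        covered |= cov
        spent += cost
    return chosen, covered, spent

targets = {'t1', 't2', 't3', 't4', 't5', 't6'}
chosen, covered, spent = greedy_placement(targets, budget=8.0)
```

Even this toy version shows why budget, candidate locations and per-target coverage profiles interact: a tighter budget forces partial coverage, which the scheduling phase must then work around.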
Phase 2: Task scheduling in camera networks
Camera networks have been increasingly adopted in manufacturing to enhance workplace safety and maintain production quality levels. Effective management of the cameras is crucial for the network to achieve its designated monitoring goal and fulfil task-specific requirements. Continuous monitoring of a set of activities at known locations is achievable using a network of cameras optimally placed over the monitored environment; however, the operational success of utilising the camera network to monitor further, unknown temporal activities depends on many factors.
The first of these, and the most important, is the adoption of an effective camera management and control strategy. This strategy facilitates the selection of the best camera or cameras for monitoring each specific activity, based on factors such as each camera's current status, workload and the monitoring requirements. The second factor is the network size (i.e. the number of cameras): a larger network guarantees dedicated monitoring time for all activities and makes it easier to find available cameras when needed. The third factor is the cameras' capability: cameras with extended pan, tilt and focal length ranges can cover more of the monitored environment and detect activities at a specific resolution for inspection purposes. The final factor is the expected occurrence rate of each activity in the monitored environment: a highly dynamic environment has activities occurring frequently and closely in time, requiring more resources to provide adequate monitoring.
In order to enable an effective camera management, the research in this project has:
- Regarded the multi-camera task assignment as a scheduling problem, where active pan-tilt-zoom (PTZ) cameras, initially placed for monitoring continuously occurring tasks, are treated as resources required to monitor additional tasks with specific requirements.
- Proposed a reactive scheduling approach to address the online nature of the problem. The Multi-Attribute Utility Theory (MAUT) is adopted to derive a suitable utility function to quantify each camera’s ability to monitor a task.
- Demonstrated the developed framework on a manufacturing work cell, showing its value given that deploying a large-scale camera network without first designing and testing its operation would be largely infeasible.
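An additive MAUT utility of this kind can be sketched as follows; the attribute names, weights and scores are invented for illustration and are not the study's actual attribute set:

```python
def camera_utility(attrs, weights):
    """Additive MAUT utility: weighted sum of attribute scores normalised
    to [0, 1]; the weights must sum to one."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * attrs[k] for k in weights)

# Invented attributes, all scored so that higher is better ('slew_cost' is
# 1 - normalised repositioning effort, so a well-placed camera scores high).
weights = {'view_quality': 0.4, 'availability': 0.35, 'slew_cost': 0.25}
cameras = {
    'ptz1': {'view_quality': 0.9, 'availability': 0.2, 'slew_cost': 0.8},
    'ptz2': {'view_quality': 0.6, 'availability': 1.0, 'slew_cost': 0.9},
}
# The reactive scheduler assigns the new task to the highest-utility camera
best = max(cameras, key=lambda c: camera_utility(cameras[c], weights))
```

Here the busier but better-placed camera loses out to the idle one, showing how the utility function trades view quality against availability and repositioning effort when a new task arrives.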
Several experiments were conducted on a manufacturing setup represented by the synthetic work cell model illustrated in Figure 5. Active robots are located throughout the work cell, performing specific manufacturing and assembly operations. Experiments investigated and measured the effect of different factors on the camera network setup and operations, including the number of cameras versus the number of tasks, task occurrence rates and task priority levels.