Human-in-the-loop in driving simulation: A “NextPerception” evolution

Authors: Silvia Chiesa, Mirko De Nunzio, Luca Tramarin

 

Evolution of simulators

The very first simulator can be traced back to the early 20th century, when Edwin Albert Link developed a prototype flight simulator. The prototype, built from scraps of pianos and organs from his father’s workshop, had a full cockpit and controls capable of simulating airplane dynamics, the sense of flight, and even sounds, creating a controlled and safe environment in which to train pilots. Despite very little initial interest from instructors or the military, the simulator went on to great success, and more than 10,000 units have been produced.

The story of Edwin A. Link attests to the relevance of simulation for later developments in other fields, such as the automotive industry.

Driving simulators seem to have first appeared in the early 1930s with the introduction of a simple architecture for a traffic simulator. Because there is little evidence of this early implementation, researchers usually refer to the Volkswagen prototype of the early 1970s as the first truly relevant driving simulator. It was a motion-based car simulator with three degrees of freedom (3 DOF), equipped with a single screen in front of the windshield.

This prototype was followed by further developments from car makers, which introduced a whole set of new features that increasingly resemble a realistic driving experience and enhance the driver’s perception.

Driver perception

Driving simulation is, at its core, an illusion of motion created by means of a virtual vehicle. To produce this illusion effectively, it is essential to understand how humans perceive the driving experience. Driving is, therefore, a complex environment involving several human sensory systems, such as the visual, auditory, and kinaesthetic systems.

In the early years, driving simulators were mostly designed around the concept of visual flow (Gibson 1950), which highlights the sole relevance of visual perception in simulating the sense of motion. The perceptual sphere was later extended by research on the vestibular system, which helps drivers recognize, for example, acceleration and gravitational forces, whereas tactile perception (pressure, vibrations, etc.) provides the driver with information on vehicle stability (for example, the feel of a slippery road). This “multimodal perception”, defined by the integration of multiple sensory systems, represents the quintessential requirement for effectively reproducing real-world behaviour in a simulated environment.

ADAS: A paradigm shift for the driver

As technology evolved, new car configurations appeared, equipped with a whole range of sensors and automated systems that support the driver in specific critical conditions. Advanced Driver Assistance Systems (ADAS) are widely implemented today, and more solutions are under investigation to further enhance vehicle autonomy and driver safety on the road.

This represents a paradigm shift in the driver’s role, which begins to change from controlling the vehicle to monitoring the system’s performance. It does not necessarily mean that the driver is completely exempted from intervention; rather, the driver remains involved through handover assignments, cognitive testing, safety testing, and so on.

A NextPerception Evolution

The NextPerception ECSEL Project truly represents the latest evolution of simulators and simulated environments.

The RE:Lab driving simulator is equipped with real car controls and a steering wheel with haptic force feedback. The simulator also includes a video projector (to display the scenarios) and a 15.6” display placed behind the steering wheel to render a fully digital Human-Machine Interface. The main strengths of the driving simulator lie in its highly realistic vehicle dynamics models and a flexible, configurable vehicular traffic model for implementing critical traffic situations.

In NextPerception, the driving simulator is used to integrate unobtrusive sensors and AI-based algorithms that infer the driver’s state. More specifically, different classification modules are integrated to detect visual distraction, cognitive distraction, and the driver’s emotional state.

The aim of this project is to create a Distributed Artificial Intelligence (DAI) system. The main advantage of such systems is that the workload is perfectly parallel (sometimes called “embarrassingly parallel”): little or no effort is needed to separate it into several parallel tasks. This property allows a DAI system to solve problems that require the processing of very large data sets. Every node of a DAI system consists of an autonomous learning module that can act independently, and partial solutions are integrated through communication between the nodes, often asynchronously. DAI systems are usually robust, elastic and, by necessity, loosely coupled, and they do not require the data to be aggregated in a single node to operate.

The main requirements for a system to be defined as a DAI are the following (a minimal sketch follows the list):

  • A distributed system with robust and elastic computation on unreliable and failing resources that are loosely coupled
  • Coordination of the actions and communication of the nodes
  • Subsampling of large datasets and online machine learning
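
To make this concrete, below is a minimal sketch in Python of such a layout. It is illustrative only, not the actual NextPerception implementation: the three driver-state modules mentioned above are reduced to placeholder classifiers with arbitrary thresholds, each running as an autonomous node on its own simulated sensor stream and publishing partial results asynchronously over a shared channel, so that raw data is never aggregated in a single node.

    # Toy DAI-style layout (illustrative only): autonomous nodes, loose
    # coupling through one shared channel, asynchronous integration.
    import queue
    import random
    import threading
    import time

    results = queue.Queue()  # the only coupling between nodes

    def node(name, infer):
        """One autonomous node: read local 'sensor' samples, publish estimates."""
        for _ in range(5):
            sample = random.random()            # stand-in for a sensor reading
            results.put((name, infer(sample)))  # asynchronous publication
            time.sleep(random.uniform(0.01, 0.05))

    # Placeholder classifiers for the three driver-state modules named above;
    # the thresholds are arbitrary and purely illustrative.
    classifiers = {
        "visual_distraction": lambda x: x > 0.7,
        "cognitive_distraction": lambda x: x > 0.8,
        "emotional_state": lambda x: "stressed" if x > 0.9 else "calm",
    }

    threads = [threading.Thread(target=node, args=item)
               for item in classifiers.items()]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Integration step: combine the partial solutions into one driver-state view.
    driver_state = {}
    while not results.empty():
        name, estimate = results.get()
        driver_state[name] = estimate  # naive fusion: last estimate wins
    print(driver_state)

In a real deployment each node would host a trained model and the fusion step would be more principled than “last estimate wins”, but the structural properties listed above (autonomy, loose coupling, asynchronous integration of partial solutions) are already visible in this toy layout.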

 

To test the system, we simulated different scenarios and collected objective measures describing driver behaviour and reactions during driving, such as the force applied to the brake pedal, the steering wheel angle, the minimal time-to-collision, the maximum lateral and longitudinal acceleration of the vehicle, the acceleration potential, and steering-wheel usage behaviour.
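
Among these measures, the minimal time-to-collision is a standard safety indicator. As a hedged illustration (not the project’s actual analysis code), it can be computed from logged samples of the distance gap to the lead vehicle and the closing speed; the (gap, closing speed) log layout is an assumption made for this sketch.

    def min_time_to_collision(samples):
        """Minimal TTC over a run: gap to the lead vehicle divided by the
        closing speed, considered only while the ego vehicle is actually
        closing in (closing speed > 0)."""
        ttcs = [gap / v for gap, v in samples if v > 0]
        return min(ttcs) if ttcs else float("inf")

    # Example: the gap shrinks from 30 m to 12 m at varying closing speeds.
    log = [(30.0, 3.0), (24.0, 4.5), (18.0, 6.0), (12.0, 2.0)]
    print(min_time_to_collision(log))  # -> 3.0 s, reached at the 18 m sample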

Driving simulators are an effective tool for experimentally measuring driving behaviour in complex road and traffic conditions. Using a driving simulator for research and for the validation of new features makes it possible to investigate the human interaction with the system.

In this sense, evaluating user acceptance and needs during the interaction is crucial in order to ensure that the selected approach is consistent with the users’ expectations.

Interested readers will find more detailed information in our public deliverable D4.5, “First Pilot Evaluation Reports”.