Use Cases.
See what we offer.
We offer a software library for the efficient development of driver models. These driver models predict the driving decisions, actions and driving styles of human drivers. They can be used to adapt automated assistance and automated functions in vehicles to the individual driver and to the current traffic situation. Driver models are an innovative technology for monitoring, understanding, assessing and anticipating human drivers.
The car industry strives to realize the dream of automated driving. An important challenge is not to forget the driver. Will the driver accept being driven automatically? Will the driver be scared when the car drives very differently from his or her own personal style? Will the driver trust the automated car? Will the driver understand when he or she has to take over? Trust and understanding can only develop if the automated car communicates and interacts with the driver in an intuitive and transparent way. For example:
an overtaking assistant should warn the driver only if (s)he really wants to overtake - this will avoid confusion,
an automated car should drive in a way that fits the preferences of the driver - this will increase trust.
Despite its importance, support for driver modelling has been limited so far. General-purpose suites like R, MATLAB, Simulink and SCADE can be used for some aspects of driver modelling, but they lack an overarching toolchain that supports rapid development from raw data towards executable models and tangible results.
Humatects offers the Driver Modelling Suite (DMS), a software library for the development, machine learning and runtime utilization of driver models based on Dynamic Bayesian Networks.
With DMS, the user can efficiently develop driver models by reusing ready-made parameters and model structures for a variety of use cases, including driver intention recognition, traffic prediction and autonomous control. Developed with feedback from automotive companies on their specific requirements and use cases, DMS comes with the following advantages:
variety of predefined psychologically motivated parameters and distribution types,
ready-made machine-learning algorithms for parameter and structure learning,
tools and applications for data pre-processing and annotation,
tools and applications for runtime utilization, visualization, evaluation and diagnostics of models and parameters,
step-by-step example workflows that give beginners easy access,
a powerful API that gives experts full freedom,
smooth integration into third-party middleware (e.g. RTMaps) and driving simulators (e.g. SiLAB, Scanner).
DMS even offers algorithms for incremental learning of driver models during runtime. This functionality supports adaptation of the model to the individual driver.
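To give a flavour of the kind of probabilistic inference such a driver model performs, the following minimal Python sketch tracks a single hidden "overtake intention" variable with a simple Bayesian filter. It is purely illustrative: the state names, cues, transition probabilities and observation likelihoods are made-up placeholders, and the code does not use the DMS API.

# Illustrative only: a hand-rolled Bayesian filter for a single hidden
# "overtake intention" variable, NOT the DMS API. The states, cues and
# probabilities below are hypothetical placeholder values.

STATES = ("follow", "overtake")

# Transition model P(state_t | state_t-1): intentions tend to persist.
TRANSITION = {
    "follow":   {"follow": 0.95, "overtake": 0.05},
    "overtake": {"follow": 0.10, "overtake": 0.90},
}

# Observation model P(cue | state) for two discretized sensor cues.
LIKELIHOOD = {
    "indicator_on":  {"follow": 0.05, "overtake": 0.70},
    "indicator_off": {"follow": 0.95, "overtake": 0.30},
    "drifting_left": {"follow": 0.10, "overtake": 0.60},
    "lane_centered": {"follow": 0.90, "overtake": 0.40},
}

def update(belief, observations):
    """One predict/update step: propagate the belief through the
    transition model, then weight it by the observed cues and normalize."""
    predicted = {
        s: sum(belief[p] * TRANSITION[p][s] for p in STATES) for s in STATES
    }
    for obs in observations:
        for s in STATES:
            predicted[s] *= LIKELIHOOD[obs][s]
    total = sum(predicted.values())
    return {s: predicted[s] / total for s in STATES}

if __name__ == "__main__":
    belief = {"follow": 0.9, "overtake": 0.1}  # prior before any evidence
    for cues in [("indicator_off", "lane_centered"),
                 ("indicator_on", "lane_centered"),
                 ("indicator_on", "drifting_left")]:
        belief = update(belief, cues)
        print(cues, "->", {s: round(p, 3) for s, p in belief.items()})

In DMS, comparable models are assembled from the predefined parameters and distribution types listed above, and their parameters and structure can be learned from data rather than specified by hand.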
Our customers are automotive suppliers and OEMs who want to apply driver models to develop individualized assistance and automated vehicle systems.
In the European project AutoMate, Humatects used and extended its driver modelling suite by developing driver models for team-based automated driving. Click here to see a video which explains this innovative concept.
Humatects developed driver models for an overtaking assistant that adapts to the overtaking behaviour of individual drivers. Click here to see a video which explains this assistance system.
A technician has to repair or maintain a certain device, e.g. an extruder in a production line or a smart meter in a fuse box. Since s/he is not yet very experienced with this device, s/he uses the Cocomo Support System. Cocomo is an Augmented Reality application that can be used e.g. with the HoloLens or even with a normal smartphone. It provides step-by-step instructions for repair and maintenance tasks. For each step, the needed tools, a brief task description and an animated 3D visualization are shown as graphical overlays on the live scene. If the provided help information is not sufficient, Cocomo allows the technician to connect live to a remote expert. Cocomo enables cooperation between technician and remote expert based on the camera and microphone in the HoloLens or smartphone. Both share the same view: the technician can explain the problem directly on the device, and the remote expert can tag and highlight certain parts via Augmented Reality to support his or her verbal instructions.
In more than 3 years of research we developed a software framework which allows us to build a customized support system tailored to your needs. The framework supports the following features:
Integrated Cooperation Platform: Our software runs on all wearable and mobile devices as well as on stationary computers with Internet browsing capability. It supports a bring-your-own-device approach, e.g. field technicians can benefit from the system on the fly with their own devices. It provides a web-based HTML5 interface, which ensures higher scalability for the overall application by supporting a variety of devices out of the box, as well as better cost effectiveness in development and maintenance of the software.
Extendability: All task-specific help resources (needed tools, task descriptions, documents, videos, pictures, animated 3D visualizations) are stored in a database and can be extended by the users themselves or by a group of nominated super-users with editor rights; a simplified data-model sketch follows below.
Enterprise Architectures: Our software follows service-oriented architectural principles that allow easy integration into enterprise architectures of our customers. In this way, our software naturally connects to ticket systems, workflows and document repositories already used by customers.
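As a rough sketch of how such task-specific help resources could be structured, the following Python example models a help task as a list of steps with tools and media attachments and serializes it to JSON for a web client. The class and field names are hypothetical illustrations, not the actual Cocomo schema or API.

# Illustrative only: a hypothetical data model for task-specific help
# resources, NOT the actual Cocomo schema or API.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class TaskStep:
    """One step of a repair or maintenance task, as it could be stored in
    the resource database and delivered to the AR client as JSON."""
    title: str
    description: str
    tools: list[str] = field(default_factory=list)
    media: list[str] = field(default_factory=list)  # documents, videos, 3D assets

@dataclass
class HelpTask:
    device: str
    steps: list[TaskStep] = field(default_factory=list)

    def to_json(self) -> str:
        # A web client (HoloLens browser app, smartphone, desktop) could
        # fetch this payload from a service endpoint.
        return json.dumps(asdict(self), indent=2)

if __name__ == "__main__":
    task = HelpTask(
        device="smart meter",
        steps=[TaskStep(title="Open the fuse box",
                        description="Unlock the cover and fold it upwards.",
                        tools=["screwdriver"],
                        media=["open_cover.mp4"])],
    )
    print(task.to_json())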
Our customers are companies who provide repair and maintenance work or support to their clients. Cocomo allows them to save money, e.g. on travel, to reduce failure rates and to increase customer satisfaction.
Cocomo was featured recently in the online magazine SCOPE – Produktion, Automatisierung, Industrial IoT (article in German).
Humatects provides software for adaptive transition training: TrainingSuite. In transition training, pilots are trained on a new aircraft type. A major challenge is that the pilots come with heterogeneous background knowledge: they have flown different aircraft before. Nevertheless, today they all get the same training in a one-size-fits-all approach. The innovation realized by TrainingSuite makes systematic use of the trainees' background, based on a validated and proven scientific approach and software: Ready-To-Measure Training.
TrainingSuite uses formal computational models of Standard Operating Procedures (SOPs). We model, on the one hand, the SOPs of the aircraft the trainee already knows and, on the other hand, the SOPs of the aircraft the pilot has to learn. These two models are automatically compared by an artificial intelligence algorithm. This comparison delivers the differences between the two aircraft as an output. The TrainingSuite software then allows the construction of a syllabus that focuses exactly on these differences, in order to train effectively and efficiently from the "known" to the "unknown".
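As a rough illustration of the idea (not the actual TrainingSuite models or comparison algorithm), the following Python sketch represents two hypothetical engine-start procedures as ordered lists of steps and computes which steps are shared, which disappear and which are new; only the changed steps would need training time.

# Illustrative only: a simplified SOP representation and diff, NOT the
# TrainingSuite models or its comparison algorithm. Step names are made up.
from difflib import SequenceMatcher

KNOWN_SOP = ["battery_on", "beacon_on", "fuel_pump_on", "starter_engage",
             "monitor_n2", "fuel_flow_on", "monitor_egt"]
NEW_SOP = ["battery_on", "apu_start", "beacon_on", "starter_engage",
           "monitor_n2", "fuel_flow_on", "monitor_egt", "apu_off"]

def sop_differences(known, new):
    """Classify steps as 'unchanged', 'removed' (only in the known SOP)
    or 'added' (only in the new SOP)."""
    diff = {"unchanged": [], "removed": [], "added": []}
    for tag, i1, i2, j1, j2 in SequenceMatcher(a=known, b=new).get_opcodes():
        if tag == "equal":
            diff["unchanged"] += known[i1:i2]
        else:
            diff["removed"] += known[i1:i2]
            diff["added"] += new[j1:j2]
    return diff

if __name__ == "__main__":
    for kind, steps in sop_differences(KNOWN_SOP, NEW_SOP).items():
        print(f"{kind}: {steps}")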
Training becomes more efficient, because training time is allocated to the new learning items and not wasted on the known ones.
Training success becomes more predictable because the syllabus is tailored to the trainee's background knowledge.
TrainingSuite helps to establish a systematic and transparent standard for ready-to-measure training, instead of relying on ad-hoc adaptation of the curriculum during training.
Humatects offers the TrainingSuite software based on a one-time license fee. Additionally, a maintenance and support contract is provided that includes the implementation of customer-specific requirements. As described above, the adaptation is based on computational models of the SOPs of two aircraft types. Humatects has already created several such models, which are sold to our customers. Further models will be built on demand.
Our customers are pilot training organisations. The software has been developed in cooperation with, and is used by, one of the largest training organisations.
A dedicated website is available for TrainingSuite: www.pilot-training-suite.com.
TrainingSuite was presented at the EATS 2019 in Berlin.
Operators of control rooms have to detect anomalies or emergency situations as quickly as possible. Correct recovery procedures have to be found and performed to remedy the problem. Failure identification and repair have to be done in the most efficient and effective way. Drivers of automated vehicles have to understand what the automation is doing and what it expects from them. Drivers will buy these new cars and will develop trust in automated driving only if the automated behaviour is transparent and easy to understand.
Humatects designs and implements software for Adaptive Human-Machine Interaction (HMI) in control rooms and vehicle cockpits:
Graphical User Interface for Energy Control Room
Graphical User Interface for Aircraft Systems
Graphical User Interface for Automotive Systems
This includes graphical interfaces, touch interfaces and augmented and virtual reality interfaces. Our HMI solutions present information to ease monitoring and control of vehicles, energy networks and complex processes. We use a model-based engineering method to clearly specify what information needs to be communicated to the user. While other companies often use ad-hoc creativity to come up with HMI solutions, we complement our creativity with scientifically grounded engineering methods. Our techniques are grounded in more than a decade of research performed by the Humatects team at the computer science institute OFFIS. In addition, we use empirical techniques like observing and interviewing the users. This combined model-based and empirical approach is what makes Humatects' HMI development unique.
We offer services to systematically produce HMI designs and to implement these designs in software. Usually, these HMIs are intended to allow users to monitor and control a system (e.g. the Columbus module of the ISS, an energy network, a vehicle) in the most efficient and effective way.
We work for companies from a wide range of sectors to develop optimal HMI solutions for the user. This includes the energy sector, the automotive industry, the maritime sector and the aerospace sector, as well as applications for cities and municipalities - wherever interactive systems need to be tailored to their users.
Examples of our customers are EWE AG, Stadt Oldenburg, Airbus D&S and ATLAS Elektronik.
We designed a new user interface for the EWE Netz energy control room.
We developed new human-machine interaction for real-time monitoring of complex processes for Airbus D&S.
We developed an innovative collaborative work space for the European Space Agency.
We developed new human-machine interaction for efficient resource planning.
Augmented Reality and Virtual Reality are promising new technologies which provide added value for many applications:
Virtual Reality allows users to immerse themselves completely in a virtual environment. The user can intensively explore a virtual world using natural head movements to look around and inspect objects. This is of interest for a wide range of applications, from remote control to the planning of facilities, gardens or infrastructure.
Augmented Reality (AR) naturally enriches the real environment with artificial objects. AR can thereby be used, e.g., to introduce users to the features and handling of a new device or machine, to improve navigational tasks in the automotive or maritime domain, or to support construction planning by showing how a planned building will fit into its future neighbourhood.
We have profound knowledge in the specification and implementation of virtual and augmented reality applications. We offer services ranging from the conception of AR/VR systems up to the implementation for a wide range of devices (e.g. smartphones, tablets and, of course, head-mounted devices like the Oculus Rift or the Microsoft HoloLens).
We have already developed AR/VR applications for various users - be it for safety-critical applications or for tourism and cultural applications. In addition, our participation in research projects in this area keeps us up to date with the latest developments. Our participation in the SmartKai project, in which we cooperate to build a novel AR-supported mooring system for ships, is described at Heise Online.