Multi-Modal (M&M’s) Research is So Sweet
Vol. 4, Issue 6
Kratchounova (right) giving a tour of the Multi-Modal Control Input Laboratory.

While colorful, button-shaped chocolates are a draw for even the most sophisticated palates, cutting-edge innovations are a welcome treat for most scientists, researchers, and engineers. Pioneering many developments in flight deck operations is Dr. Daniela Kratchounova, a Human Factors Researcher in the Aerospace Human Factors Division of the Civil Aerospace Medical Institute (CAMI). In 2015, Kratchounova worked on a project examining the gaze control literature and its viability in a cockpit environment. The conclusions from that project, along with emerging projects on the horizon, produced the idea of a multi-modal control input laboratory: a lab where scientists can examine the impact of new control input technologies on human performance in a flight deck environment. The introduction of new modalities, or combined modalities, is expected in the not-too-distant future. The question for human factors research is whether there is a synergistic combination of control modalities (not the exclusive use of one) that would make pilot interactions with the flight deck more intuitive, optimize crew workload, and ultimately have a positive impact on safety. At a minimum, can multi-modal control inputs provide an equivalent level of safety?

Early beginnings, as Jake (top) and David help convert a storage closet into a full-fledged laboratory.

The multi-modal lab had a big vision and a small budget. Independent elements were procured to meet the needs of other projects and repurposed for the lab when available. Each piece of equipment was purchased with a game plan for how the infrastructure or software could serve the multi-modal lab over the long term. The lab space began with construction of the SimPod by two student volunteers, David Newton and Jake Spraul. At that time, David was an Applied Psychology graduate student at the University of Central Oklahoma (UCO), and Jake was a senior in Aerospace Engineering at the University of Notre Dame. They were also key in assembling the flight deck console. The platform for the flight deck console was built in the CAMI machine shop by Barry Runnels, a Human Factors Simulation Engineer. The M&M's lab is the culmination of a project that required a big vision, patience, frugality, and innovation. With each step, Kratchounova was thrilled to see her vision slowly coming to fruition.

Four non-traditional control modalities were to be integrated into the lab: touch, voice, gaze, and 3D gesture. For example, a pilot's interaction with the flight deck may be as simple and intuitive as a gaze at a symbol to open an application and a wave of a hand to close it.
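To make that interaction pattern concrete, here is a minimal sketch in Python of one way such inputs could be fused in an event-driven design. This is not the lab's actual software; the class names, event names, and application name below (InputEvent, FlightDeckApp, "dwell", "wave", "approach_charts") are hypothetical illustrations of a gaze dwell opening an application and a hand-wave gesture closing it.

# Minimal sketch (assumed design, not the lab's software): a gaze
# "dwell" on an application's symbol opens it, and a hand-wave
# gesture closes it. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class InputEvent:
    modality: str   # e.g., "gaze", "gesture", "voice", "touch"
    action: str     # e.g., "dwell", "wave"
    target: str     # symbol or application the event refers to

class FlightDeckApp:
    def __init__(self, name: str):
        self.name = name
        self.is_open = False

    def handle(self, event: InputEvent) -> None:
        # Gaze dwell on this app's symbol opens it.
        if event.modality == "gaze" and event.action == "dwell" and event.target == self.name:
            self.is_open = True
            print(f"{self.name}: opened by gaze")
        # A wave gesture closes the app if it is currently open.
        elif event.modality == "gesture" and event.action == "wave" and self.is_open:
            self.is_open = False
            print(f"{self.name}: closed by gesture")

if __name__ == "__main__":
    # Example: a simulated stream of multi-modal events.
    charts = FlightDeckApp("approach_charts")
    for evt in [InputEvent("gaze", "dwell", "approach_charts"),
                InputEvent("gesture", "wave", "")]:
        charts.handle(evt)

The same structure could route voice or touch events through the handler, which is one way the modalities might be examined individually or in combination.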

It took until the spring of 2018 before work on integrating these modalities could begin. Based on a system architecture diagram created by Kratchounova, Dennis Rester, a Human Factors Simulation Engineer, built the computer hardware infrastructure of the lab. Dennis designed a sophisticated system of 14 servers with enough computing power to "fly from here to Mars and back," as Kratchounova likes to joke. Using an Agile software development approach, Resource Data Inc. completed the integration of the individual modalities in July 2018.

One of the goals of this lab is to share it with partners in industry, government, and academia to conduct collaborative research and enhance the way pilots interact with their flight deck. More specifically, "The lab is about optimizing the crew workload profile in all phases of flight, making the interface more of a natural and organic 'interspace,' which better resembles human-to-human interactions," says Kratchounova.

While the lab is still in development, there are plans to make it fully integrated, with the capability to support a variety of projects in the future, including multi-modal controls, advanced vision systems, augmented and virtual reality, and artificial intelligence. Dr. Melchor Antuñano, Director of the Civil Aerospace Medical Institute, who is encouraged by this research, explains, "The new Multi-Modal Control Input Laboratory will allow our researchers to measure the simultaneous performance of aviation-related tasks under various environmental, pharmacological, and physiological stressors. This device is very innovative and unique, and will allow us to do research studies in a simulated cockpit environment which enables pilots to operate the technologies using a more human-based way of communicating."

Partners contributing to the lab's capabilities include Resource Data, SmartEye, ADACEL, PixelWix, Cherokee CRC, EyesDx, and Leap Motion, to name a few. Kratchounova hopes that the lab, affectionately known as the M&M's Lab (for the two "M"s in Multi-Modal), will become an innovative research facility for the entire aerospace industry. One goal of the lab is to help assess new and novel technologies so that we can stay ahead of the power curve in supporting Flight Standards and Aircraft Certification with fundamental research on future flight decks. This lab space is a great example of how a big vision and strategic investments can support existing and future studies.

Provisions for the afternoon energy slumps.

As an interesting side note, the M&M's lab has truly become sweet research for the agency. As researchers and engineers work in the lab during its development or conduct experiments, Kratchounova is prepared for their late-afternoon energy slumps, providing a variety of M&M's candies (25 different flavors, to be exact) to those visiting or working in the lab.

The candy-coated chocolate was originally developed in the 1940s during World War II, allowing soldiers to carry chocolate in warm climates without it melting, hence the slogan "Melts in your mouth, not in your hand." Kratchounova is on to something, and it may be the sweet smell of success.

For more information about the Multi-Modal Control Input Laboratory, please contact Daniela Kratchounova or call (405) 954-6841.


Video Clip of the M&M's Lab
The Multi-Modal Control Input Laboratory is a cutting-edge research facility that enables our researchers to assess the human performance impacts of new technologies in the flight deck environment.

It has multiple control inputs, including voice, touch, gaze, and 3D gesture. In this short clip, you can see how the pilot is able to operate the aircraft with each of these control mechanisms. We can examine each of these methods individually or in combination.

 
 
 
 
Federal Aviation Administration (FAA) seal