Perceived Mental Workload Classification Using Intermediate Fusion Multimodal Deep Learning

Frontiers in Human Neuroscience 14 (2021)

Abstract

Considerable research has addressed the detection of mental workload (MWL) from various bio-signals, and deep learning has recently enabled novel methods and results. Although a plethora of measurement modalities have proven valuable for this task, current studies often use only a single modality to classify MWL. The goal of this research was to classify perceived mental workload (PMWL) using a deep neural network (DNN) that flexibly makes use of multiple modalities, allowing features to be shared between them. To this end, an experiment was conducted in which MWL was induced with verbal logic puzzles. The puzzles came in five levels of difficulty and were presented in random order. Participants had one hour to solve as many puzzles as they could, and after each puzzle they rated its difficulty on a scale from 1 to 7, with 7 the highest. Galvanic skin response, photoplethysmograms, functional near-infrared spectroscopy (fNIRS) signals, and eye movements were recorded simultaneously using Lab Streaming Layer (LSL), which also carried marker information from the puzzles. We designed and evaluated a novel intermediate fusion multimodal DNN that classifies PMWL from these four modalities; two main criteria guided its design and implementation: modularity and generalisability. The model classified PMWL to within one level on the seven-level workload scale. Because of its modular design, modalities can be added or removed without major structural implications. Furthermore, we showed that the network performed better when using multiple modalities than when using a single modality. The dataset and code used in this paper are openly available.
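The intermediate-fusion idea described in the abstract can be illustrated with a minimal sketch: each modality gets its own encoder, the intermediate feature vectors are concatenated, and a shared head produces the seven-level classification. This is an assumed, simplified NumPy illustration of the general technique, not the authors' actual model; the layer sizes, weights, and modality feature dimensions are hypothetical placeholders.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class IntermediateFusionNet:
    """Sketch of an intermediate-fusion multimodal classifier.

    Each modality has its own small encoder; the intermediate feature
    vectors are concatenated and passed to a shared classification head.
    Modalities can be added or removed by editing `modality_dims` alone,
    mirroring the modularity the paper describes.
    """

    def __init__(self, modality_dims, hidden=16, n_classes=7, seed=0):
        rng = np.random.default_rng(seed)
        # One encoder weight matrix per modality (input_dim -> hidden).
        self.encoders = {
            name: rng.standard_normal((dim, hidden)) * 0.1
            for name, dim in modality_dims.items()
        }
        # Shared head maps the concatenated features to class scores.
        self.head = rng.standard_normal(
            (hidden * len(modality_dims), n_classes)) * 0.1

    def forward(self, inputs):
        # Encode each modality independently, then fuse by concatenation.
        feats = [relu(inputs[name] @ W) for name, W in self.encoders.items()]
        fused = np.concatenate(feats, axis=-1)
        return fused @ self.head  # scores for the seven workload levels

# Four modalities from the study, with illustrative feature sizes.
dims = {"gsr": 8, "ppg": 8, "fnirs": 32, "eye": 12}
net = IntermediateFusionNet(dims)
batch = {name: np.ones((4, d)) for name, d in dims.items()}
scores = net.forward(batch)
print(scores.shape)  # (4, 7): batch of 4, seven workload levels
```

Because fusion happens at the feature level rather than the input or decision level, dropping a modality only shrinks the concatenated vector and the head's input width; no other part of the sketch changes.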

Links

PhilArchive




Similar books and articles

Paradoxes of Interaction? Johannes Stern & Martin Fischer - 2015 - Journal of Philosophical Logic 44 (3):287-308.
Multimodal mental imagery. Bence Nanay - 2018 - Cortex 105:125-136.
Not all perceptual experience is modality specific. Casey O'Callaghan - 2015 - In Dustin Stokes, Mohan Matthen & Stephen Biggs (eds.), Perception and Its Modalities. Oxford University Press. pp. 133-165.
Perception and Multimodality. Casey O'Callaghan - 2012 - In Eric Margolis, Richard Samuels & Stephen Stich (eds.), The Oxford Handbook of Philosophy of Cognitive Science. Oxford University Press.
Lemon Classification Using Deep Learning. Jawad Yousif AlZamily & Samy Salim Abu Naser - 2020 - International Journal of Academic Pedagogical Research (IJAPR) 3 (12):16-20.
Type of Tomato Classification Using Deep Learning. Mahmoud A. Alajrami & Samy S. Abu-Naser - 2020 - International Journal of Academic Pedagogical Research (IJAPR) 3 (12):21-25.
Symmetric Contingency Logic with Unlimitedly Many Modalities. Jie Fan - 2019 - Journal of Philosophical Logic 48 (5):851-866.

Analytics

Added to PP
2021-01-12


Citations of this work

No citations found.

