Jakob Kruse, Leon Ciechanowski, Ambre Dupuis and Peter A. Gloor
Journal article (2024)
Open-access document available in PolyPublie and from the official publisher.
Terms of use: Creative Commons: Attribution (CC BY)
Abstract
Recent advances in artificial intelligence combined with behavioral sciences have led to the development of cutting-edge tools for recognizing human emotions based on text, video, audio, and physiological data. However, these data sources are expensive, intrusive, and regulated, unlike plants, which have been shown to be sensitive to human steps and sounds. A methodology to use plants as human emotion detectors is proposed. Electrical signals from plants were tracked and labeled based on video data. The labeled data were then used for classification, and the MLP, biLSTM, MFCC-CNN, MFCC-ResNet, Random Forest, 1-Dimensional CNN, and biLSTM (without windowing) models were tuned using a grid search algorithm with cross-validation. Finally, the best-parameterized models were trained and used on the test set for classification. The performance of this methodology was measured via a case study with 54 participants who watched an emotionally charged video; as ground truth, their facial emotions were simultaneously measured using facial emotion analysis. The Random Forest model shows the best performance, particularly in recognizing high-arousal emotions, achieving an overall weighted accuracy of 55.2% and demonstrating high weighted recall for emotions such as fear (61.0%) and happiness (60.4%). The MFCC-ResNet model offers reasonably balanced results, with an overall accuracy of 32.4%; it recognized fear and anger with 75% and 50% recall, respectively. Thus, using plants as an emotion recognition tool seems worth investigating, addressing both cost and privacy concerns.
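The abstract mentions tuning classifiers with a grid search and cross-validation before evaluating the best-parameterized model on a held-out test set. The sketch below illustrates that general workflow for the Random Forest case only, using scikit-learn; the synthetic signal windows, the simple summary-statistic features, the emotion label set, and the hyperparameter grid are all illustrative assumptions, not the authors' actual pipeline or parameters.

```python
# Illustrative sketch of grid search + cross-validation for a Random Forest
# emotion classifier on windowed signal features. Data, features, labels, and
# the parameter grid are assumptions for demonstration, not the paper's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)

# Stand-in for windowed plant-signal recordings: 600 windows of 1000 samples each.
windows = rng.normal(size=(600, 1000))
# Stand-in for facial-emotion labels aligned to each window (the study's ground truth).
emotions = rng.choice(["fear", "happiness", "anger", "neutral"], size=600)

def window_features(w: np.ndarray) -> np.ndarray:
    """Simple summary statistics per window (an assumed feature set)."""
    return np.array([w.mean(), w.std(), w.min(), w.max(),
                     np.abs(np.diff(w)).mean()])

X = np.vstack([window_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, emotions, test_size=0.2, stratify=emotions, random_state=0)

# Grid search with cross-validation over a few Random Forest hyperparameters.
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 10, 20]}
search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=5, scoring="balanced_accuracy")
search.fit(X_train, y_train)

# Evaluate the best-parameterized model on the held-out test set.
y_pred = search.best_estimator_.predict(X_test)
print("best params:", search.best_params_)
print("balanced accuracy:", balanced_accuracy_score(y_test, y_pred))
```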
Keywords
emotion recognition; artificial intelligence; deep learning; plant sensor; classification; emotion models
Subject(s): 2800 Artificial intelligence > 2800 Artificial intelligence (Computer vision, see 2603); 2800 Artificial intelligence > 2801 Natural language and speech recognition
Department: Département de mathématiques et de génie industriel
Research center: LID - Laboratoire en intelligence des données
Funding agencies: Software AG Foundation (SAGST), Polish National Science Centre
Grant number: 2020/38/A/HS6/00066
PolyPublie URL: https://publications.polymtl.ca/57907/
Journal title: Sensors (vol. 24, no. 6)
Publisher: MDPI
DOI: 10.3390/s24061917
Official URL: https://doi.org/10.3390/s24061917
Deposit date: June 6, 2024, 14:43
Last modified: October 1, 2024, 03:13
Cite in APA 7: Kruse, J., Ciechanowski, L., Dupuis, A., & Gloor, P. A. (2024). Leveraging the sensitivity of plants with deep learning to recognize human emotions. Sensors, 24(6), 1917 (22 pages). https://doi.org/10.3390/s24061917