Journal of Textile Research (纺织学报), 2022, Vol. 43, Issue 8: 197-205. doi: 10.13475/j.fzxb.20210107509

• Review •

  • Biography: LIU Huanhuan (1997—), female, master's student; research interests: advanced apparel manufacturing and human body science.
  • Funding: "Belt and Road" International Cooperation Project (21130750100) under the "Science and Technology Innovation Action Plan" of the Science and Technology Commission of Shanghai Municipality.

Progress and trends in application of wearable technology for emotion recognition

LIU Huanhuan1,2,3, WANG Zhaohui1,2,3, YE Qinwen1,2, CHEN Ziwei1,2, ZHENG Jingjin1,2

  1. College of Fashion and Design, Donghua University, Shanghai 200051, China
    2. Key Laboratory of Clothing Design and Technology, Ministry of Education, Donghua University, Shanghai 200051, China
    3. Shanghai Belt and Road Joint Laboratory of Textile Intelligent Manufacturing, Shanghai 200051, China
  • Received: 2021-01-29  Revised: 2021-09-07  Published: 2022-08-15  Online: 2022-08-24
  • Contact: WANG Zhaohui


Abstract:

In order to promote the innovative development of smart wearable technology for emotion recognition in the textile and apparel field, this paper systematically introduced the research status of emotion recognition monitoring methods, classification algorithms, and wearable devices for emotion recognition. Emotion classification models were first outlined, and the physiological reactions that occur when emotions are generated were summarized. Two categories of monitoring methods, namely physiological signals and behavioural manifestations, were then elaborated; common classification algorithms were summarized, and existing wearable products were grouped by the body part on which the device is worn. The review also discussed the problems that need to be addressed in the future development of emotion recognition smart wearables, and identified future development trends and application prospects from three aspects: flexible and comfortable signal acquisition devices, accuracy of recognition results, and ways of interacting with recognition results.

Key words: emotion recognition, machine learning, emotion modeling, smart wearable, physiological signal, behavioral expression
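The abstract describes classifying emotional states from physiological signals with algorithms such as k-nearest neighbours. As a hedged illustration only (the feature choices, channel values and class labels below are hypothetical, not drawn from the reviewed studies), such a pipeline can be sketched with standard-library Python:

```python
import math
import statistics

def extract_features(hr, eda):
    """Toy feature vector from two physiological channels:
    mean heart rate, its standard deviation (a crude HRV proxy),
    and mean electrodermal activity (EDA)."""
    return (statistics.mean(hr), statistics.stdev(hr), statistics.mean(eda))

def knn_predict(train, x, k=3):
    """Label x by majority vote of its k nearest training samples
    (Euclidean distance in feature space)."""
    nearest = sorted((math.dist(fx, x), label) for fx, label in train)[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Hypothetical labelled recordings (values are illustrative only).
train = [
    (extract_features([62, 64, 63, 61], [2.1, 2.0, 2.2, 2.1]), "calm"),
    (extract_features([63, 61, 62, 64], [2.0, 2.1, 2.0, 2.2]), "calm"),
    (extract_features([95, 102, 99, 97], [6.8, 7.1, 7.0, 6.9]), "stressed"),
    (extract_features([98, 96, 101, 99], [7.2, 6.9, 7.0, 7.1]), "stressed"),
]

x = extract_features([94, 100, 98, 96], [6.7, 7.0, 6.9, 7.1])
print(knn_predict(train, x))  # prints "stressed"
```

Real systems replace these toy features with windowed time- and frequency-domain descriptors and validate across subjects, but the classify-by-distance idea is the same.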

CLC number: TS941

Fig. 1 Valence-arousal emotion model
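The valence-arousal (circumplex) model places emotions on two continuous axes. A minimal sketch, assuming hypothetical quadrant labels, of how a recognition system might discretize a predicted (valence, arousal) point:

```python
def quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point, each in [-1, 1], to one of the
    four classic quadrants of the circumplex model."""
    if arousal >= 0:
        return "excited/happy" if valence >= 0 else "angry/afraid"
    return "calm/content" if valence >= 0 else "sad/bored"

print(quadrant(-0.6, 0.8))  # negative valence, high arousal -> "angry/afraid"
```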

Tab. 1 Four typical emotional states and their physiological responses
(indicators compared across anger, sadness, pleasure and happiness: heart rate, heart rate variability, skin conductance level, respiration rate, body temperature)

Fig. 2 Differences in mouth muscles under different emotions

Fig. 3 Seven AUs (action units) for facial changes in the eye region

Fig. 4 Schematic diagram of a shopping mall emotion monitoring system

Tab. 2 Summary of emotion types, recognition signals and results in wearable emotion recognition research

Emotional state | Signal type | Recognition algorithm | Accuracy/% | Ref.
8 states incl. anger, disgust, neutral, pleasure | EMG, PPG, RESP, EDA | k-nearest neighbours | 81 | [12]
Happiness, anger, pleasure, sadness | ECG, EDA, EMG, RESP | k-nearest neighbours; linear discriminant function; neural network | 81, 80, 81 | [40]
High/low stress | PPG | Logistic regression | ≈63 | [41]
Positive-excited, negative-excited and calm under the valence-arousal model | BP, EEG, EDA, PPG, RESP | Linear discriminant analysis; quadratic discriminant analysis; support vector machine | <50, <47, <50 | [42]
Two-dimensional valence-arousal model (5 levels) | ECG, EDA, RESP | Quadratic discriminant analysis | >90 | [43]
Relaxed, anxious, excited, happy | EDA, PPG | Convolutional neural network | <75 | [44]
High/low valence, high/low arousal | ECG, EMG, EOG | Support vector machine | 50~60 | [45]
Calm, happy, sad, fearful, angry | EDA, PPG | Rule-based classifier | <87 | [46]
Relaxed plus 3 types of stress (physical, emotional, cognitive) | EDA, TEMP, HR | Gaussian mixture model | <85 | [47]
High/low valence, high/low arousal | EEG, EDA, EMG | Decision tree; naive Bayes; support vector machine | 63.8, 58.5 | [48]
High/low valence, high/low arousal | EDA, PPG, TEMP | Naive Bayes; k-nearest neighbours; random forest; support vector machine | 76 | [49]
High/low valence, high/low arousal | ECG, EDA | Convolutional neural network | 75 (valence); 71 (arousal) | [50]
Happy, relaxed, disgusted, sad, neutral | EDA, PPG, EMG | Deep belief network + support vector machine | <67 | [51]
Laughter detection | EDA, PPG | Logistic regression; random forest; support vector machine | 87 | [52]
High/low valence, high/low arousal | EMG, PPG, TEMP | Decision tree; k-nearest neighbours; random forest | <67 | [53]
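Several of the studies summarized above feed ECG-derived heart rate variability (HRV) to their classifiers. Two standard time-domain HRV features, SDNN and RMSSD, can be computed from a list of RR intervals; the RR values below are made up for illustration:

```python
import math
import statistics

def sdnn(rr_ms):
    """SDNN: standard deviation of RR intervals, in milliseconds."""
    return statistics.stdev(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 805, 821, 798, 810]  # hypothetical RR intervals (ms)
print(round(sdnn(rr), 1), round(rmssd(rr), 1))  # prints "10.9 18.1"
```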

Tab. 3 Summary of combining different emotion recognition methods with wearable technology

Monitored content | Monitoring site | Applicable product forms
Physiological signals | Head, face, chest | Helmets, headbands, glasses, masks, necklaces, underwear, vests, shirts, chest straps, etc.
Body movement | Wrist, ankle | Smart wristbands, smart watches, trousers, footwear, etc.
Facial expression | Face | Glasses, masks, etc.
Voice and intonation | Any position where audio can be captured | Wearable products
Behavioural cues | Any position where behavioural information can be captured | Wearable products, smart home devices, etc.

Fig. 5 Amoeba smart glasses

Fig. 6 Glasses-type wearable system

Fig. 7 PSYCHE wearable monitoring system model

Fig. 8 Philips "Bubelle" dress

[1] SINGH R R, CONJETI S, BANERJEE R. A comparative evaluation of neural network classifiers for stress level analysis of automotive drivers using physiological signals[J]. Biomedical Signal Processing and Control, 2013, 8(6): 740-754.
doi: 10.1016/j.bspc.2013.06.014
[2] HUANG Tianlan. Wearable clothing structure and technology research[J]. Beauty and Times, 2017(5): 102-106.
[3] SCHMIDT P, REISS A, DURICHEN R, et al. Wearable-based affect recognition: a review[J]. Sensors, 2019, 19(19): 4079-4121.
doi: 10.3390/s19194079
[4] EKMAN P, LEVENSON R W, FRIESEN W V. Autonomic nervous system activity distinguishes among emotions[J]. Science, 1983, 221(4616): 1208-1210.
doi: 10.1126/science.6612338
[5] RUSSELL J A. Mixed emotions viewed from the psychological constructionist perspective[J]. Emotion Review, 2017, 9(2): 111-117.
doi: 10.1177/1754073916639658
[6] MCCORRY L K. Physiology of the autonomic nervous system[J]. American Journal of Pharmaceutical Education, 2007, 71(4): 11.
doi: 10.5688/aj710111
[7] KREIBIG S D. Autonomic nervous system activity in emotion: a review[J]. Biological Psychology, 2010, 84(3): 394-421.
doi: 10.1016/j.biopsycho.2010.03.010
[8] WU Xuekui, REN Lihong, DING Yongsheng. Multi-physiology information fusion for emotion distinction in smart clothing[J]. Computer Engineering and Applications, 2009, 45(33): 218-221.
doi: 10.3778/j.issn.1002-8331.2009.33.069
[9] ALAM M G R, ABEDIN S F, MOON S I, et al. Healthcare IoT-based affective state mining using a deep convolutional neural network[J]. IEEE Access, 2019, 7:75189-75202.
doi: 10.1109/ACCESS.2019.2919995
[10] PICARD R W, HEALEY J. Affective wearables[J]. Personal Technologies, 1997, 1(4): 231-240.
doi: 10.1007/BF01682026
[11] D'MELLO S K, KORY J. A review and meta-analysis of multimodal affect detection systems[J]. ACM Computing Surveys, 2015, 47(3): 36.
[12] PICARD R W, VYZAS E, HEALEY J. Toward machine emotional intelligence: analysis of affective physiological state[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001, 23(10): 1175-1191.
doi: 10.1109/34.954607
[13] KHAN A M, LAWO M. Developing a system for recognizing the emotional states using physiological devices[C]//12th International Conference on Intelligent Environments. London: IEEE, 2016: 48-53.
[14] VALENZA G C, GENTILI C, LANATA A, et al. Mood recognition in bipolar patients through the PSYCHE platform: preliminary evaluations and perspectives[J]. Artificial Intelligence in Medicine, 2013, 57(1): 49-58.
doi: 10.1016/j.artmed.2012.12.001
[15] SOLEYMANI M, LICHTENAUER J, PUN T, et al. A multimodal database for affect recognition and implicit tagging[J]. IEEE Transactions on Affective Computing, 2013, 3(1):42-55.
doi: 10.1109/T-AFFC.2011.25
[16] MAHDIANI S, JEYHANI V, PELTOKANGAS, et al. Is 50 Hz high enough ECG sampling frequency for accurate HRV analysis? [C]//IEEE Engineering in Medicine and Biology Society. Milan: IEEE, 2015: 5948-5951.
[17] ZHU J P, JI L Z, JI C Y, et al. Heart rate variability monitoring for emotion and disorders of emotion[J]. Physiological Measurement, 2019. DOI: 10.1088/1361-6579/AB1887.
doi: 10.1088/1361-6579/AB1887
[18] ZANGRONIZ R, MARTINEZ-RODRIGO A, PASTOR J M, et al. Electrodermal activity sensor for classification of calm/distress condition[J]. Sensors, 2017. DOI: 10.3390/s17102324.
doi: 10.3390/s17102324
[19] DOMINGUEZ-JIMENEZ J A, CAMPO-LANDINES K C, MARTINEZ-SANTOS J C, et al. A machine learning model for emotion recognition from physiological signals[J]. Biomedical Signal Processing and Control, 2020. DOI: 10.1016/j.bspc.2019.101646.
doi: 10.1016/j.bspc.2019.101646
[20] HEALEY J A, PICARD R W. Detecting stress during real-world driving tasks using physiological sensors[J]. IEEE Transactions on Intelligent Transportation Systems, 2005, 6(2): 156-166.
doi: 10.1109/TITS.2005.848368
[21] LYKKEN D T, VENABLES P H. Direct measurement of skin conductance: a proposal for standardization[J]. Psychophysiology, 1971, 8(5): 656-672.
doi: 10.1111/j.1469-8986.1971.tb00501.x
[22] FENG H H, GOLSHAN H M, MAHOOR M H. A wavelet-based approach to emotion classification using EDA signals[J]. Expert Systems with Applications, 2018, 112: 77-86.
doi: 10.1016/j.eswa.2018.06.014
[23] JANG E H, PARK B J, PARK M S, et al. Analysis of physiological signals for recognition of boredom, pain, and surprise emotions[J]. Journal of Physiological Anthropology, 2015. DOI: 10.1186/s40101-015-0063-5.
doi: 10.1186/s40101-015-0063-5
[24] ZHANG Q, CHEN X X, ZHAN Q Y, et al. Respiration-based emotion recognition with deep learning[J]. Computers in Industry, 2017, 92/93: 84-90.
doi: 10.1016/j.compind.2017.04.005
[25] DAR M N, AKRAM M U, KHAWAJA S G, et al. CNN and LSTM-based emotion charting using physiological signals[J]. Sensors, 2020, 20(16): 26.
doi: 10.3390/s20010026
[26] SOLEYMANI M, PANTIC M, PUN T. Multimodal emotion recognition in response to videos[J]. IEEE Transactions on Affective Computing, 2012, 3(2): 211-223.
doi: 10.1109/T-AFFC.2011.37
[27] SARIYANIDI E, GUNES H, CAVALLARO A. Automatic analysis of facial affect: a survey of registration, representation, and recognition[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015, 37(6): 1113-1133.
doi: 10.1109/TPAMI.2014.2366127
[28] NAG A, HABER H, VOSS C, et al. Toward continuous social phenotyping: analyzing gaze patterns in an emotion recognition task for children with autism through wearable smart glasses[J]. Journal of Medical Internet Research, 2020.DOI: 10.2196/preprint.13810.
doi: 10.2196/preprint.13810
[29] KATSIS C D, GOLETSIS Y, RIGAS G, et al. A wearable system for the affective monitoring of car racing drivers during simulated conditions[J]. Transportation Research Part C:Emerging Technologies, 2011, 19(3): 541-551.
doi: 10.1016/j.trc.2010.09.004
[30] DORES A R, BARBOSA F, QUEIROS C, et al. Recognizing emotions through facial expressions: a largescale experimental study[J]. International Journal of Environmental Research and Public Health, 2020, 17(20): 7240-7253.
doi: 10.3390/ijerph17197240
[31] CUI Liqing, LI Shun, ZHU Tingshao. Emotion detection based on natural gaits recorded by wearable devices[C]//Abstracts of the 18th National Psychology Conference: Psychology and Social Development. Tianjin: [s.n.], 2015: 623-624.
[32] KAMINSKA D, SAPINSKI T, ANBARJAFARI G. Efficiency of chosen speech descriptors in relation to emotion recognition[J]. Eurasip Journal on Audio Speech and Music Processing, 2017(1): 3-12.
[33] HAN Wenjing, LI Haifeng, RUAN Huabin, et al. Review on speech emotion recognition[J]. Journal of Software, 2014, 25(1): 37-50.
[34] NALEPA G J, KUTT K, BOBEK S. Mobile platform for affective context-aware systems[J]. Future Generation Computer Systems:the International Journal of Escience, 2019, 92:490-503.
[35] ALAJMI N, KANJO E, MAWASS N E, et al. Shopmobia: an emotion-based shop rating system[C]//Humaine Association Conference on Affective Computing and Intelligent Interaction. Geneva: IEEE, 2013: 745-750.
[36] FERNANDEZ-DELGADO M, CERNADAS E, BARRO S, et al. Do we need hundreds of classifiers to solve real world classification problems?[J]. Journal of Machine Learning Research, 2014, 15: 3133-3181.
[37] RUBIN J. Time, frequency & complexity analysis for recognizing panic states from physiologic time series[C]//10th EAI International Conference on Pervasive Computing Technologies for Healthcare. UK: ACM, 2016:81-88.
[38] FRIEDMAN J, HASTIE T, TIBSHIRANI R. Additive logistic regression: a statistical view of boosting[J]. Annals of Statistics, 2000, 28(2): 337-374.
[39] MOZOS O M, SANDULESCU V, ANDREWS S, et al. Stress detection using wearable physiological and sociometric sensors[J]. International Journal of Neural Systems, 2017, 27(2): 16.
[40] WAGNER J, KIM J, ANDRE E, et al. From physiological signals to emotions: implementing and comparing selected methods for feature extraction and classification[C]//2005 IEEE International Conference on Multimedia and Expo. Amsterdam: IEEE, 2005: 941-944.
[41] KIM D, SEO Y, CHO J, et al. Detection of subjects with higher self-reporting stress scores using heart rate variability patterns during the day[C]//IEEE Engineering in Medicine and Biology Society Conference Proceedings. Vancouver: IEEE, 2008: 682-685.
[42] CHANEL G, KIERKELS J M, SOLEYMANI M, et al. Short-term emotion assessment in a recall paradigm[J]. International Journal of Human-Computer Studies, 2009, 67(8): 607-627.
doi: 10.1016/j.ijhcs.2009.03.005
[43] VALENZA G, LANATA A, SCILINGO P E. The role of nonlinear dynamics in affective valence and arousal recognition[J]. IEEE Transactions on Affective Computing, 2012, 3(2): 237-249.
doi: 10.1109/T-AFFC.2011.30
[44] MARTINEZ H P, BENGIO Y, YANNAKAKIS N G. Learning deep physiological models of affect[J]. IEEE Computational Intelligence Magazine, 2013, 8(2): 20-33.
doi: 10.1109/MCI.2013.2247823
[45] ABADI M K, SUBRAMANIAN R, KIA M S, et al. Decaf: meg-based multimodal database for decoding affective physiological responses[J]. IEEE Transactions on Affective Computing, 2015, 6(3): 209-222.
doi: 10.1109/TAFFC.2015.2392932
[46] RATHOD P, GEORGE K, SHINDE N, et al. Bio-signal based emotion detection device[C]//IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks. San Francisco: IEEE, 2016: 105-108.
[47] BIRJANDTALAB J, COGAN D, POUYAN B M, et al. A non-EEG biosignals dataset for assessment and visualization of neurological status[C]//IEEE international workshop on signal processing systems. Dallas: IEEE, 2016: 110-114.
[48] GIRARDI D, LANUBILE F, NOVIELLI N, et al. Emotion detection using noninvasive low cost sen-sors[C]//Seventh International Conference on Affective Computing and Intelligent Interaction. San Antonio: IEEE, 2017:125-130.
[49] ZHAO B B, WANG Z, YU W Z, et al. Emotionsense: emotion recognition based on wearable wristband[C]//IEEE Smartworld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computing, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation. Guangzhou: IEEE, 2018:346-355.
[50] SANTAMARIA-GRANADOS L, MUNOZ-ORGANERO M, RAMIREZ-GONZALEZ G, et al. Using deep convolutional neural network for emotion detection on a physiological signals dataset (AMIGOS)[J]. IEEE Access, 2019, 7: 57-67.
doi: 10.1109/ACCESS.2018.2883213
[51] HASSAN M M, ALAM M G R, UDDIN M Z, et al. Human emotion recognition using deep belief network architecture[J]. Information Fusion, 2019, 51: 10-18.
doi: 10.1016/j.inffus.2018.10.009
[52] DI LASCIO E, GASHI S, SANTINI S, et al. Laughter recognition using non-invasive wearable devices[C]// 13th EAI International Conference on Pervasive Computing Technologies for Healthcare. Trento: ACM, 2019: 262-271.
[53] HEINISCH J S, ANDERSON C, DAVID K, et al. Angry or climbing stairs? towards physiological emotion recognition in the wild[C]//IEEE International Conference on Pervasive Computing and Communications Workshops. Kyoto: IEEE, 2019:486-491.
[54] YANG J, WANG R, GUAN X, et al. AI-enabled emotion-aware robot: the fusion of smart clothing, edge clouds and robotics[J]. Future Generation Computer Systems, 2020, 102: 701-709.
doi: 10.1016/j.future.2019.09.029
[55] KWON J, KIM D H, PARK W, et al. A wearable device for emotional recognition using facial expression and physiological response[C]//38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society. Orlando: IEEE, 2016: 5765-5768.
[56] XU Shiyi. Research on innovative design of emotional monitoring intelligent bracelet for the aged[D]. Xi'an: Xi'an University of Technology, 2019: 56-68.
[57] YANG J, ZHOU J, TAO G M, et al. Wearable 3.0: from smart clothing to wearable affective robot[J]. IEEE Network, 2019, 33(6): 8-14.
[58] LANATA A, VALENZA G, NARDELLI M, et al. Complexity index from a personalized wearable monitoring system for assessing remission in mental health[J]. Journal of Biomedical and Health Informatics, 2015, 19(1): 132-139.
[59] WANG Xiaoxin. A study on the application of bionic colour design[J]. New Vision Art, 2011(1): 89-90.