基于融合网络的井下人员行为识别方法

张雷, 冉凌鎛, 代婉婉, 朱永红, 史新国

张雷,冉凌鎛,代婉婉,等. 基于融合网络的井下人员行为识别方法[J]. 工矿自动化,2023,49(3):45-52. DOI: 10.13272/j.issn.1671-251x.2022120015
ZHANG Lei, RAN Lingbo, DAI Wanwan, et al. Behavior recognition method for underground personnel based on fusion network[J]. Journal of Mine Automation,2023,49(3):45-52. DOI: 10.13272/j.issn.1671-251x.2022120015


Funds: Jiangsu Province Basic Science (Natural Science) Research Project for Colleges and Universities (21KJB510025); National Key Research and Development Program of China (2017YFC0804401); National Natural Science Foundation of China (52074273); Ministry of Education Industry-University Collaborative Education Project (BY2021160202102356012); Xuzhou Science and Technology Plan Project (KC19208); Zibo Mining Group Smart Mine Key Technology R&D Open Fund (2019LH05).
Details
    About the authors:

    ZHANG Lei (1987—), male, born in Xuzhou, Jiangsu; lecturer, Ph.D.; research interest: mobile wireless sensing. E-mail: 11905@xzit.edu.cn

    Corresponding author:

    ZHU Yonghong (1979—), female, born in Xuzhou, Jiangsu; associate professor, Ph.D.; research interest: wireless sensor networks. E-mail: zhyh@xzit.edu.cn

  • CLC number: TD67

Behavior recognition method for underground personnel based on fusion network

  • 摘要: 井下人员行为识别是保障煤矿安全生产的重要措施。针对现有井下人员行为识别研究缺少对感知机理的研究与分析且特征提取手段单一的问题,提出一种基于融合网络的井下人员行为识别方法。该方法主要包括数据预处理、特征构建和判识网络构造3个部分。数据预处理:通过信道状态信息(CSI)商模型、子载波去直流和离散小波去噪对采集的CSI数据进行处理,以降低环境噪声、设备噪声等的影响。特征构建:将处理后的数据利用格拉姆和/差角场 (GASF/GADF)转换成图像,从而保留数据的空间和时间特性。判识网络构造:根据人员动作的特点,提出一种由基于门控循环单元(GRU)的编解码网络和多尺度卷积神经网络(CNN)组成的融合网络,利用GRU保留前后数据之间的关联性,同时利用注意力机制的权重分配策略有效提取关键特征,以提高行为识别的准确率。实验结果表明:该方法对行走、摘帽子、扔东西、坐、抽烟、挥手、跑动、睡觉8种动作的平均识别准确率为97.37%,对睡觉和坐的识别准确率最高,最容易发生误判的动作是行走和跑动;使用准确率、精确率、召回率和F1分数作为评价指标,得出融合网络的性能优于CNN和GRU,人员行为识别准确率高于HAR系统、WiWave系统和Wi−Sense系统;正常速度下行走和摘帽子2种动作的平均识别精度为95.6%,高于快速动作情况下的93.6%和慢速动作情况下的92.7%;收发设备之间的距离为2 m和2.5 m时,识别准确率较高。
    Abstract: Underground personnel behavior recognition is an important measure for ensuring safe production in coal mines. Existing research on underground personnel behavior recognition lacks analysis of the perception mechanism and relies on a single feature extraction method. To solve these problems, a behavior recognition method for underground personnel based on a fusion network is proposed. The method comprises three parts: data preprocessing, feature construction, and recognition network construction. Data preprocessing: the collected channel state information (CSI) data is processed with a CSI quotient model, subcarrier DC removal, and discrete wavelet denoising to reduce the impact of environmental and equipment noise. Feature construction: the processed data is transformed into images using Gramian angular summation/difference fields (GASF/GADF) to preserve the spatial and temporal features of the data. Recognition network construction: according to the characteristics of personnel actions, a fusion network composed of a gated recurrent unit (GRU) based encoder-decoder network and a multi-scale convolutional neural network (CNN) is proposed. The GRU preserves the correlation between preceding and following data, while the weight allocation strategy of the attention mechanism effectively extracts key features to improve the accuracy of behavior recognition. The experimental results show that the average recognition accuracy of this method for eight actions, namely walking, taking off a hat, throwing things, sitting, smoking, waving, running, and sleeping, is 97.37%. The recognition accuracy for sleeping and sitting is the highest, while walking and running are the most prone to misjudgment. Using accuracy, precision, recall, and F1 score as evaluation indicators, the fusion network outperforms CNN and GRU, and its personnel behavior recognition accuracy is higher than that of the HAR, WiWave, and Wi-Sense systems. The average recognition precision for walking and taking off a hat at normal speed is 95.6%, higher than the 93.6% for fast actions and the 92.7% for slow actions. The recognition accuracy is highest when the distance between the transceiver devices is 2 m or 2.5 m.
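The GASF/GADF encoding mentioned in the abstract maps a 1-D series onto polar angles and builds pairwise trigonometric matrices, so temporal order survives as image structure. A minimal numpy sketch of the standard transform (not taken from the paper's code; `gramian_angular_fields` is an illustrative helper name):

```python
import numpy as np

def gramian_angular_fields(x):
    """Encode a 1-D series as GASF/GADF images.

    The series is rescaled to [-1, 1] and mapped to angles phi = arccos(x);
    then GASF[i, j] = cos(phi_i + phi_j) and GADF[i, j] = sin(phi_i - phi_j).
    """
    x = np.asarray(x, dtype=float)
    # min-max rescale to [-1, 1] so arccos is defined
    x = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0
    x = np.clip(x, -1.0, 1.0)
    s = np.sqrt(1.0 - x ** 2)                 # sin(arccos(x))
    gasf = np.outer(x, x) - np.outer(s, s)    # cos(phi_i + phi_j)
    gadf = np.outer(s, x) - np.outer(x, s)    # sin(phi_i - phi_j)
    return gasf, gadf

gasf, gadf = gramian_angular_fields(np.sin(np.linspace(0, 4 * np.pi, 128)))
print(gasf.shape, gadf.shape)  # (128, 128) (128, 128)
```

A denoised CSI subcarrier series of length N thus becomes an N×N image pair that a 2-D or multi-scale 1-D CNN can consume.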
  • 图  1   菲涅耳区原理

    Figure  1.   Fresnel zone principle

    图  2   行为识别框架

    Figure  2.   Behavior recognition framework

    图  3   井下人员行为识别网络

    Figure  3.   Underground personnel behavior recognition network

    图  4   实验巷道

    Figure  4.   Experimental roadway

    图  5   人员行为识别结果显示

    Figure  5.   Display of human behavior recognition results

    图  6   8种动作识别结果混淆矩阵

    Figure  6.   Confusion matrix of recognition results of 8 kinds of action

    图  7   不同网络模型识别结果对比

    Figure  7.   Comparison of recognition results of different network models

    图  8   不同系统识别结果对比

    Figure  8.   Comparison of recognition results of different systems

    图  9   动作速度对识别精度的影响

    Figure  9.   Influence of action speed on recognition precision

    图  10   不同距离下的识别精度

    Figure  10.   Recognition precision at different distances

    表  1   基于GRU的编解码网络参数

    Table  1   Parameters of encoding and decoding network based on GRU

    No. | Layer | Output size
    1 | GRU | 256×512
    2 | GRU | 256×256
    3 | GRU | 128×256
    4 | GRU | 128×128
    5 | Transposed Convolution | 128×128
    6 | Self-Attention | 128×128
    7 | Transposed Convolution | 128×256
    8 | Self-Attention | 128×256
    9 | Transposed Convolution | 128×512
    10 | Self-Attention | 128×512
    11 | 1D-Convolution | 64×256
    12 | Flatten | 2048×1
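The GRU layers in Table 1 carry context between successive CSI samples through gated state updates. As a rough illustration of what one such layer computes (a minimal numpy sketch, not the paper's implementation; `gru_cell`, the random weights, and the 128×256 sequence shape are all illustrative):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_cell(x, h, W, U, b):
    """One GRU step: new hidden state from input x and previous state h."""
    Wz, Wr, Wn = W
    Uz, Ur, Un = U
    bz, br, bn = b
    z = sigmoid(x @ Wz + h @ Uz + bz)        # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)        # reset gate
    n = np.tanh(x @ Wn + (r * h) @ Un + bn)  # candidate state
    return (1.0 - z) * n + z * h             # gated blend of old and new

rng = np.random.default_rng(0)
d_in, d_h, T = 512, 256, 128                 # illustrative dimensions
W = [rng.normal(scale=0.1, size=(d_in, d_h)) for _ in range(3)]
U = [rng.normal(scale=0.1, size=(d_h, d_h)) for _ in range(3)]
b = [np.zeros(d_h) for _ in range(3)]

h = np.zeros(d_h)
seq = rng.normal(size=(T, d_in))
outputs = np.stack([h := gru_cell(x, h, W, U, b) for x in seq])
print(outputs.shape)  # (128, 256)
```

Stacking several such layers with shrinking dimensions yields the encoder half of the table; the transposed convolutions and self-attention blocks then expand the representation back out.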

    表  2   多尺度CNN参数

    Table  2   Parameters of multi-scale CNN

    No. | Layer | Kernel size | Kernels | Output size
    1 | ECA | — | — | 512×1600
    1-1 | Con1-1 | 5 | 256 | 256×800
    1-2 | Con1-2 | 7 | 256 | 256×800
    1-3 | Con1-3 | 1 | 256 | 256×800
    2-1 | Con2-1 | 7 | 512 | 512×400
    2-2 | Con2-2 | 3 | 512 | 512×400
    2-3 | Con2-3 | 5 | 512 | 512×400
    3-1 | Con3-1 | 3 | 256 | 256×400
    3-2 | Con3-2 | 5 | 256 | 256×400
    3-3 | Con3-3 | 7 | 256 | 256×400
    4-1 | Pooling4-1 | — | — | 128×512
    4-2 | Pooling4-2 | — | — | 128×512
    4-3 | Pooling4-3 | — | — | 128×512
    5-1 | ECA | — | — | 128×512
    5-2 | ECA | — | — | 128×512
    5-3 | ECA | — | — | 128×512
    6 | Flatten | — | — | 2048×1
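The ECA layers in Table 2 apply efficient channel attention: global average pooling produces one descriptor per channel, a small 1-D convolution across neighbouring channels produces attention weights, and each channel is rescaled by its weight. A minimal numpy sketch of this mechanism (not the paper's code; the uniform, untrained kernel stands in for the learned convolution weights):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def eca(feature_map, k=3):
    """Efficient channel attention over a (channels, length) feature map."""
    pooled = feature_map.mean(axis=1)          # global average pool -> (channels,)
    kernel = np.full(k, 1.0 / k)               # untrained stand-in for learned 1-D conv
    attn = sigmoid(np.convolve(pooled, kernel, mode="same"))
    return feature_map * attn[:, None]         # rescale each channel

x = np.random.default_rng(1).normal(size=(512, 1600))  # matches layer 1 in Table 2
y = eca(x)
print(y.shape)  # (512, 1600)
```

Because the attention weights pass through a sigmoid, each channel is attenuated rather than amplified, which lets the network suppress uninformative subcarrier channels cheaply.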

    表  3   实验动作

    Table  3   Experimental actions

    Action | Potential hazardous behavior
    Walking | Entering a hazardous area
    Taking off a hat | Removing the safety helmet
    Throwing things | Throwing tools around
    Sitting | Resting in a hazardous area
    Smoking | Smoking in violation of regulations
    Waving | Fighting
    Running | Dismounting a vehicle in violation of regulations
    Sleeping | Sleeping in a hazardous area

    表  4   不同优化器和学习率下的识别准确率

    Table  4   Recognition accuracy under different optimizers and learning rates

    Learning rate | AdaDelta | SGD | RMSProp | Adam (accuracy/%)
    0.0001 | 95.54 | 94.75 | 96.33 | 97.01
    0.001 | 93.43 | 94.37 | 94.64 | 97.37
    0.01 | 96.45 | 94.88 | 93.41 | 92.15
    0.05 | 92.97 | 92.64 | 90.72 | 92.89
    0.1 | 88.25 | 89.76 | 91.45 | 91.99
  • [1] 陶志勇,郭京,刘影. 基于多天线判决的CSI高效人体行为识别方法[J]. 计算机科学与探索,2021,15(6):1122-1132. DOI: 10.3778/j.issn.1673-9418.2005021
    TAO Zhiyong, GUO Jing, LIU Ying. Efficient human behavior recognition method of CSI based on multi-antenna judgment[J]. Journal of Frontiers of Computer Science and Technology, 2021, 15(6): 1122-1132. DOI: 10.3778/j.issn.1673-9418.2005021

    [2] GU Yu, WANG Yantong, WANG Meng, et al. Secure user authentication leveraging keystroke dynamics via Wi-Fi sensing[J]. IEEE Transactions on Industrial Informatics, 2022, 18(4): 2784-2795. DOI: 10.1109/TII.2021.3108850

    [3] GORRINI A, MESSA F, CECCARELLI G, et al. Covid-19 pandemic and activity patterns in Milan. Wi-Fi sensors and location-based data[J]. TeMA-Journal of Land Use, Mobility and Environment, 2021, 14(2): 211-226.

    [4] CHEN Liangqin, TIAN Liping, XU Zhimeng, et al. A survey of WiFi sensing techniques with channel state information[J]. ZTE Communications, 2020, 18(3): 57-63.

    [5] MA Yongsen, ZHOU Gang, WANG Shuangquan. WiFi sensing with channel state information: a survey[J]. ACM Computing Surveys, 2019, 52(3): 1-36.

    [6] FANG Yuanrun, XIAO Fu, SHENG Biyun, et al. Cross-scene passive human activity recognition using commodity WiFi[J]. Frontiers of Computer Science, 2022, 16: 1-11.

    [7] ZHANG Lei, ZHANG Yue, BAO Rong, et al. A novel WiFi-based personnel behavior sensing with a deep learning method[J]. IEEE Access, 2022, 10: 120136-120145. DOI: 10.1109/ACCESS.2022.3222381

    [8] 魏忠诚,张新秋,连彬,等. 基于Wi-Fi信号的身份识别技术研究[J]. 物联网学报,2021,5(4):107-119. DOI: 10.11959/j.issn.2096-3750.2021.00213
    WEI Zhongcheng, ZHANG Xinqiu, LIAN Bin, et al. A survey on Wi-Fi signal based identification technology[J]. Chinese Journal on Internet of Things, 2021, 5(4): 107-119. DOI: 10.11959/j.issn.2096-3750.2021.00213

    [9] WANG Yan, LIU Jian, CHEN Yingying, et al. E-eyes: device-free location-oriented activity identification using fine-grained WiFi signatures[C]. Proceedings of the 20th Annual International Conference on Mobile Computing and Networking, 2014: 617-628.

    [10] YAN Huan, ZHANG Yong, WANG Yujie. WiAct: a passive WiFi-based human activity recognition system[J]. IEEE Sensors Journal, 2019, 20(1): 296-305.

    [11] 熊小樵,冯秀芳,丁一. 基于CSI的手势识别方法研究[J]. 计算机应用与软件,2022,39(1):181-187. DOI: 10.3969/j.issn.1000-386x.2022.01.027
    XIONG Xiaoqiao, FENG Xiufang, DING Yi. Research on hand gesture recognition method based on CSI[J]. Computer Applications and Software, 2022, 39(1): 181-187. DOI: 10.3969/j.issn.1000-386x.2022.01.027

    [12] ATITALLAH B B, ABBASI M B, BARIOUL R, et al. Simultaneous pressure sensors monitoring system for hand gestures recognition[C]. 2020 IEEE Sensors, Rotterdam, 2020: 1-4.

    [13] CHU Xianzhi, LIU Jiang, SHIMAMOTO S. A sensor-based hand gesture recognition system for Japanese sign language[C]. 2021 IEEE 3rd Global Conference on Life Sciences and Technologies (LifeTech), Nara, 2021: 311-312.

    [14] YIN Kang, TANG Chengpei, ZHANG Xie, et al. Robust human activity recognition system with Wi-Fi using handcraft feature[C]. 2021 IEEE Symposium on Computers and Communications, Athens, 2021: 1-8.

    [15] YU Bohan, WANG Yuxiang, NIU Kai, et al. WiFi-sleep: sleep stage monitoring using commodity Wi-Fi devices[J]. IEEE Internet of Things Journal, 2021, 8(18): 13900-13913. DOI: 10.1109/JIOT.2021.3068798

    [16] SOLIKHIN M, PRATAMA Y, PASARIBU P, et al. Analisis watermarking menggunakan metode discrete cosine transform (DCT) dan discrete fourier transform (DFT)[J]. Jurnal Sistem Cerdas, 2022, 5(3): 155-170.

    [17] RAJASHEKHAR U, NEELAPPA D, RAJESH L. Electroencephalogram (EEG) signal classification for brain-computer interface using discrete wavelet transform (DWT)[J]. International Journal of Intelligent Unmanned Systems, 2022, 10(1): 86-97. DOI: 10.1108/IJIUS-09-2020-0057

    [18] CAN C, KAYA Y, KILIÇ F. A deep convolutional neural network model for hand gesture recognition in 2D near-infrared images[J]. Biomedical Physics & Engineering Express, 2021, 7(5). DOI: 10.1088/2057-1976/ac0d91

    [19] YU L, LI J, WANG T, et al. T2I-Net: time series classification via deep sequence-to-image transformation networks[C]. 2022 IEEE International Conference on Networking, Sensing and Control, Shanghai, 2022: 1-5.

    [20] MOGHADDAM M G, SHIREHJINI A A N, SHIRMOHAMMADI S. A WiFi-based system for recognizing fine-grained multiple-subject human activities[C]. 2022 IEEE International Instrumentation and Measurement Technology Conference, Ottawa, 2022: 1-6.

    [21] MEI Y, JIANG T, DING X, et al. WiWave: WiFi-based human activity recognition using the wavelet integrated CNN[C]. 2021 IEEE/CIC International Conference on Communications in China, Xiamen, 2021: 100-105.

    [22] MUAAZ M, CHELLI A, GERDES M W, et al. Wi-Sense: a passive human activity recognition system using Wi-Fi and convolutional neural network and its integration in health information systems[J]. Annals of Telecommunications, 2022, 77(3): 163-175.

  • Journal citations (1)

    1. 周魁,王向来,张方义,岁攀峰,曹安业,郭文豪. 基于改进Critic权重法的冲击地压危险等级预测方法. 煤炭技术. 2024(11): 125-129.

    Other citation types (8)

Publication history
  • Received: 2022-12-05
  • Revised: 2023-03-08
  • Published online: 2023-03-26
  • Issue date: 2023-03-24
