Prediction model for friction resistance coefficient of mine roadways based on IBES-XGBoost

    Abstract: To address underfitting or overfitting and the limited prediction accuracy of existing machine-learning algorithms for predicting the friction resistance coefficient α of mine roadways, an improved Bald Eagle Search (IBES) algorithm was proposed that integrates opposition-based learning initialization, chaotic adaptive parameters, dynamic adaptive mutation, and a chaotic local search strategy. The algorithm was used to adaptively optimize the key hyperparameters of an Extreme Gradient Boosting (XGBoost) model. On this basis, a prediction model for the roadway friction resistance coefficient α (the IBES-XGBoost model) was constructed, taking multi-dimensional roadway geometric parameters and structural category information as input features and minimizing the root mean square error (RMSE) of prediction as the objective function. Tests on standard benchmark functions showed that the IBES algorithm significantly outperforms the original Bald Eagle Search algorithm and other metaheuristic algorithms in solution accuracy, convergence speed, and stability. A dataset of 260 samples was built from field measurements at multiple mines in northern Shaanxi, covering complex working conditions including four typical cross-sectional shapes and eight support types. The samples were stratified by support type and split into training and test sets at a ratio of 8:2, and hyperparameters were tuned with five-fold cross-validation. Experimental results showed that the IBES-XGBoost model achieved an RMSE of 0.001 232, a mean absolute error (MAE) of 0.000 868, and a coefficient of determination (R²) of 0.985 426 on the test set, outperforming all comparison models; relative to the second-best BES-XGBoost model, RMSE and MAE were reduced by 49.94% and 49.09%, respectively, demonstrating the model's high prediction accuracy and robustness.
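Two of the improvement strategies named in the abstract, opposition-based learning initialization and chaotic parameter generation, can be illustrated with a minimal pure-Python sketch. This is not the paper's implementation; function names, the logistic map as the chaotic source, and the toy sphere objective are all illustrative assumptions:

```python
import random

def logistic_map(c):
    """One step of the logistic map, a common chaotic sequence
    generator used for chaotic adaptive parameters (illustrative)."""
    return 4.0 * c * (1.0 - c)

def obl_initialize(pop_size, lb, ub, fitness):
    """Opposition-based learning initialization (sketch): draw random
    candidates, mirror each across the search bounds, and keep the
    best `pop_size` members of the combined pool (minimization)."""
    dim = len(lb)
    pool = []
    for _ in range(pop_size):
        x = [random.uniform(lb[d], ub[d]) for d in range(dim)]
        # Opposite point of x with respect to the bounds [lb, ub].
        x_opp = [lb[d] + ub[d] - x[d] for d in range(dim)]
        pool.extend([x, x_opp])
    pool.sort(key=fitness)  # ascending fitness: best candidates first
    return pool[:pop_size]

# Toy usage: initialize a population for minimizing the sphere
# function on [-5, 5]^2 (stand-in for the real hyperparameter space).
random.seed(0)
lb, ub = [-5.0, -5.0], [5.0, 5.0]
sphere = lambda x: sum(v * v for v in x)
pop = obl_initialize(10, lb, ub, sphere)
print(len(pop), sphere(pop[0]) <= sphere(pop[-1]))
```

In practice the "fitness" of a candidate hyperparameter vector would be the cross-validated RMSE of the XGBoost model it configures, so the doubled initial pool gives the search a better-than-random starting population at the cost of extra objective evaluations.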
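The reported evaluation metrics (RMSE, MAE, R²) follow their standard definitions; a minimal stdlib sketch, with illustrative values rather than the paper's measured data:

```python
import math

def rmse(y_true, y_pred):
    """Root mean square error."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Illustrative α values only, not the paper's dataset.
y_true = [0.010, 0.012, 0.015, 0.011]
y_pred = [0.011, 0.012, 0.014, 0.010]
print(rmse(y_true, y_pred), mae(y_true, y_pred), r2(y_true, y_pred))
```

Because RMSE squares each residual before averaging, it penalizes large errors more heavily than MAE, which is why it is the natural choice of objective function for the hyperparameter search described above.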


