Abstract
To address the challenges of enhancing feature learning from highly imbalanced industrial data and achieving strong generalization performance, this paper develops a novel deep neural network architecture, namely the deep self-attention feature augmentation network (DSAFAN). The majority-oriented oversampling technique (MOOST) is integrated into the deep learning architecture as an auxiliary tool for minority-class feature augmentation. Given the inherent difficulty of improving the completeness of the minority-class data, the developed MOOST generates out-of-distribution synthetic data with abnormal characteristics under the guidance of normal data, thereby effectively expanding the region of the feature space occupied by existing anomalies. Furthermore, DSAFAN leverages a multi-head self-attention mechanism for feature extraction, enabling the model to capture complex temporal dependencies and significantly improving its feature learning capability. Finally, the effectiveness of the proposed methods is validated through comprehensive experiments on real-world aero-engine operation data and a public dataset.
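The abstract does not detail the MOOST procedure; the following is a minimal NumPy sketch of one plausible reading, in which synthetic anomaly samples are created by extrapolating existing anomalies away from the normal-class centroid so that they fall outside the normal-data region. The function name, the centroid-based guidance, and the extrapolation rule are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def oversample_minority_away_from_majority(X_normal, X_anomaly, n_synthetic=100,
                                           step_range=(0.1, 0.5), seed=0):
    """Illustrative sketch (not the paper's MOOST): synthesize anomaly samples
    by pushing existing anomalies further away from the normal-class centroid."""
    rng = np.random.default_rng(seed)
    centroid = X_normal.mean(axis=0)                  # "guidance" from normal data
    idx = rng.integers(0, len(X_anomaly), n_synthetic)
    base = X_anomaly[idx]                             # seed anomalies to extrapolate
    steps = rng.uniform(*step_range, size=(n_synthetic, 1))
    # Move each seed anomaly along the centroid->anomaly direction, expanding
    # the region of the feature space covered by anomalous samples.
    return base + steps * (base - centroid)

# Purely illustrative usage on random data:
X_normal = np.random.randn(500, 8)
X_anomaly = np.random.randn(20, 8) + 3.0
X_new = oversample_minority_away_from_majority(X_normal, X_anomaly, n_synthetic=50)
print(X_new.shape)  # (50, 8)
```

Likewise, the multi-head self-attention feature extractor is only named in the abstract; the sketch below shows a generic self-attention encoder for multivariate time series using PyTorch's standard `nn.MultiheadAttention`. The layer sizes, pooling, and classifier head are assumptions and do not reflect the actual DSAFAN design.

```python
import torch
import torch.nn as nn

class SelfAttentionFeatureExtractor(nn.Module):
    """Generic multi-head self-attention encoder for multivariate time series
    (illustrative only; not the DSAFAN architecture)."""
    def __init__(self, n_features, d_model=64, n_heads=4, n_classes=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                  # x: (batch, seq_len, n_features)
        h = self.embed(x)
        attn_out, _ = self.attn(h, h, h)   # self-attention across time steps
        h = self.norm(h + attn_out)        # residual connection + layer norm
        return self.head(h.mean(dim=1))    # pool over time, then classify

model = SelfAttentionFeatureExtractor(n_features=8)
logits = model(torch.randn(16, 30, 8))     # 16 sequences, 30 time steps each
print(logits.shape)                        # torch.Size([16, 2])
```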
| Original language | English |
|---|---|
| Article number | 112377 |
| Journal | Reliability Engineering and System Safety |
| Volume | 274 |
| DOIs | |
| State | Published - Oct 2026 |
| Externally published | Yes |
Keywords
- Aero-engine anomaly detection
- Deep self-attention feature-augmentation network
- Inter-class imbalance
- Inter-class overlap
- Majority-oriented oversampling technique