TY - GEN
T1 - Multi-Scale Contrastive Attention Representation Learning for Encrypted Traffic Classification
AU - Yang, Shuo
AU - Zheng, Xinran
AU - Li, Jinze
AU - Xu, Jinfeng
AU - Ngai, Edith C.H.
N1 - Publisher Copyright:
© 2024 ACM.
PY - 2024/10/21
Y1 - 2024/10/21
AB - Encrypted traffic classification is essential for network security and management. However, the encrypted nature makes it challenging to extract representative features from raw traffic data. Existing end-to-end methods ignore byte correlations within packets and potential correlations among packets, hindering the learning of real traffic semantics and leading to suboptimal performance. This paper proposes MsETC, a multi-scale contrastive attention representation learning method for encrypted traffic classification. MsETC divides the raw packet byte sequence into multi-scale patches and then extracts dual views for contrastive learning from both the inter-patch and intra-patch perspectives. This allows the model to capture correlations among bytes within a packet as well as the potential interactions between packets. Extensive experiments on real-world datasets demonstrate that the proposed method achieves superior classification performance with lower complexity.
KW - contrastive learning
KW - representation learning
KW - traffic classification
UR - https://www.scopus.com/pages/publications/85210012353
U2 - 10.1145/3627673.3679968
DO - 10.1145/3627673.3679968
M3 - Conference contribution
AN - SCOPUS:85210012353
T3 - International Conference on Information and Knowledge Management, Proceedings
SP - 4173
EP - 4177
BT - CIKM 2024 - Proceedings of the 33rd ACM International Conference on Information and Knowledge Management
PB - Association for Computing Machinery
T2 - 33rd ACM International Conference on Information and Knowledge Management, CIKM 2024
Y2 - 21 October 2024 through 25 October 2024
ER -