TY - GEN
T1 - Edge-Cloud Differential Co-Evolutionary Algorithm for Distributed Feature Selection Optimization
AU - Wang, Yuhua
AU - Wei, Feng Feng
AU - Luo, Wenjian
AU - Chen, Wei Neng
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Evolutionary algorithms have become a popular method for feature selection to reduce the dimensionality of data. However, with the rapid development of distributed computing paradigms, data are owned by distributed nodes, which poses challenges for global optimization. Meanwhile, the rapid growth of data size increases the burden of a large number of fitness evaluations. To solve these problems, this paper proposes an edge-cloud differential co-evolutionary algorithm for distributed high-dimensional feature selection optimization. The edge clients are responsible for local model construction and local optimization, while the cloud server takes charge of global model ensemble and global optimization. Specifically, the edge clients train a radial basis function network using local data and conduct local evolution through differential evolution (DE). The training error and locally optimized candidates are sent to the cloud server through the proposed best-of-the-best communication mechanism. After receiving the local information, the cloud server ensembles the local models for global optimization. Meanwhile, the local candidates are integrated into a global solution to guide the global DE evolution. Additionally, in the final iteration, the edge client solution sets are integrated into a federated set that is fed back to the cloud server to achieve the 'best of the best' effect. Experimental results demonstrate that the proposed algorithm is superior to traditional feature selection and integration methods on five data sets.
AB - Evolutionary algorithms have become a popular method for feature selection to reduce the dimensionality of data. However, with the rapid development of distributed computing paradigms, data are owned by distributed nodes, which poses challenges for global optimization. Meanwhile, the rapid growth of data size increases the burden of a large number of fitness evaluations. To solve these problems, this paper proposes an edge-cloud differential co-evolutionary algorithm for distributed high-dimensional feature selection optimization. The edge clients are responsible for local model construction and local optimization, while the cloud server takes charge of global model ensemble and global optimization. Specifically, the edge clients train a radial basis function network using local data and conduct local evolution through differential evolution (DE). The training error and locally optimized candidates are sent to the cloud server through the proposed best-of-the-best communication mechanism. After receiving the local information, the cloud server ensembles the local models for global optimization. Meanwhile, the local candidates are integrated into a global solution to guide the global DE evolution. Additionally, in the final iteration, the edge client solution sets are integrated into a federated set that is fed back to the cloud server to achieve the 'best of the best' effect. Experimental results demonstrate that the proposed algorithm is superior to traditional feature selection and integration methods on five data sets.
KW - differential evolution
KW - edge-cloud collaboration
KW - evolutionary algorithm
KW - feature selection
UR - https://www.scopus.com/pages/publications/105010181700
U2 - 10.1109/MCIICompanion65207.2025.11007430
DO - 10.1109/MCIICompanion65207.2025.11007430
M3 - Conference contribution
AN - SCOPUS:105010181700
T3 - 2025 IEEE Symposium for Multidisciplinary Computational Intelligence Incubators, MCII Companion 2025
BT - 2025 IEEE Symposium for Multidisciplinary Computational Intelligence Incubators, MCII Companion 2025
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2025 IEEE Symposium for Multidisciplinary Computational Intelligence Incubators, MCII Companion 2025
Y2 - 17 March 2025 through 20 March 2025
ER -