Federated Machine Learning
over Fog/Edge/Cloud Architectures
The increasing use of wearable devices has made them an inseparable part of daily life, especially in the realm of biomedical applications. At the same time, generating and processing the huge volume of monitoring data they produce poses two major problems: it strains battery-powered wearable technologies, and it makes data transmission and processing dependent on cloud servers. Managing the data in the cloud also raises serious privacy risks and limits scalability.
Our objective is to develop a multi-layer distributed approach in which most of the data is processed at the source, and only the information necessary for data fusion or high-level inference is transmitted. As a case study, we propose a self-aware distributed system for epileptic seizure monitoring: a novel wearable system that combines multi-parametric biosignal processing and machine learning with the notion of self-awareness.
The electroencephalographic (EEG) signal is captured by our e-Glass system, which includes the machine learning needed to detect the normal baseline condition and works in tandem with a wristband that monitors heart rate. Higher-level analysis can be added by running more complex signal processing and machine learning algorithms on the user's smartphone. The smartphone can also act as a Fog coordinator, exploiting available resources on neighboring smartphones, or rely on the cloud when heavy, global analysis is necessary.
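The tiered wearable-to-fog-to-cloud decision flow described above can be sketched as a simple confidence-based escalation policy. A minimal sketch follows; the function names, thresholds, and the crude margin-based confidence score are illustrative assumptions, not the actual e-Glass implementation.

```python
# Sketch of a tiered (edge -> fog -> cloud) inference policy.
# All models, thresholds, and confidence values are illustrative assumptions.

def edge_infer(eeg_window):
    """Cheap on-device stand-in model: returns (label, confidence)."""
    # A real system would run a lightweight classifier on the wearable;
    # here we use a toy signal-energy heuristic for illustration.
    energy = sum(x * x for x in eeg_window) / len(eeg_window)
    label = "seizure" if energy > 1.0 else "normal"
    confidence = min(1.0, abs(energy - 1.0))  # crude margin-based confidence
    return label, confidence

def tiered_infer(eeg_window, fog_model=None, cloud_model=None,
                 edge_thresh=0.6, fog_thresh=0.8):
    """Escalate to a heavier tier only when the cheaper tier is not confident."""
    label, conf = edge_infer(eeg_window)
    if conf >= edge_thresh or fog_model is None:
        return label, "edge"
    label, conf = fog_model(eeg_window)       # e.g., smartphone-side model
    if conf >= fog_thresh or cloud_model is None:
        return label, "fog"
    label, _ = cloud_model(eeg_window)        # heavy, global analysis
    return label, "cloud"
```

Because escalation happens only on low-confidence windows, most data stays on the wearable, which is the property that makes the multi-layer approach attractive for battery-powered devices.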
To further illustrate the impact of our proposed technique on wearable technologies, we applied the notion of self-awareness to cognitive-workload detection during manual labor. Our multimodal machine-learning algorithm detected cognitive workload with a performance of 81.75%. We achieved a 27.6% reduction in energy consumption, with less than 6% performance loss.
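The kind of energy/performance trade-off reported above can be reasoned about with back-of-the-envelope arithmetic: if the heavy model runs only on the fraction of windows where the cheap model is uncertain, the average energy cost drops roughly in proportion to the escalation rate. The per-inference costs below are placeholder numbers for illustration, not measured values from our system.

```python
# Back-of-the-envelope model of confidence-gated processing.
# Cost numbers are placeholder assumptions, not measured values.

def gated_energy(cost_low, cost_high, escalation_rate):
    """Average per-window energy when the heavy model runs only on
    the fraction of windows flagged as low-confidence."""
    return cost_low + escalation_rate * cost_high

always_on = 10.0  # hypothetical cost of running the heavy model on every window
gated = gated_energy(cost_low=1.0, cost_high=10.0, escalation_rate=0.2)
saving = 1 - gated / always_on  # fractional energy saving in this toy setting
```

In this toy setting the gated scheme costs 3.0 units per window versus 10.0 when always on; the small performance loss comes from the (assumed rare) cases where the cheap model is confidently wrong and never escalates.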