Data centers are taking on enormous workloads, including deep neural networks, data analytics, and video streaming. Even the most robust CPU- and GPU-based architectures struggle to keep pace with today’s demanding computing environment. The current trend is therefore to turn to a different class of accelerator, the Field Programmable Gate Array (FPGA), which offers superior energy efficiency. Commercial behemoths like Intel, Amazon, and Microsoft have brought FPGAs into their data centers through acquisitions and in-house deployments. But are FPGAs safe from security attacks? And if not, how can such attacks be countered? A fresh research proposal by EPFL’s Mirjana Stojilovic seeks to address these and related concerns regarding FPGAs.
The collaboration between Microsoft and EPFL goes back to 2008, when the two came together, along with ETH Zurich, for the Microsoft Innovation Cluster for Embedded Software (ICES). That relationship has matured over the years through successive phases of the Swiss Joint Research Center (JRC) projects. In the first two phases (2014-18), the Swiss JRC supported nine EPFL projects. After reviewing and ranking 29 proposals for phase III, including 13 from EPFL, the JRC has now confirmed nine of them. Three are from EPFL, including two projects submitted by EcoCloud faculty.
For four days (January 26-29), some of the best minds in machine learning and artificial intelligence congregated for the Applied Machine Learning Days (AMLD) conference at the SwissTech Convention Center at EPFL, Lausanne. With EPFL as the principal organizer, professors Marcel Salathé, Martin Jaggi, and Bob West were instrumental in running the event. AMLD2019 included talks, tutorials, and workshops, but it will be best remembered for introducing 16 different “AI & your domain” tracks, which featured talks by domain experts and lively panels.
The 32nd Annual Conference on Neural Information Processing Systems (NeurIPS 2018) was held in Montreal between December 2 and 8. The conference drew 8,000 attendees and 1,011 accepted papers, along with posters and workshops covering an array of algorithms, theories, experiments, and ideas presented by the crème de la crème of machine learning researchers. Sifting through this massive program, a post on the publishing platform Medium shortlisted the most influential papers and poster presentations. Among the latter is “Training DNNs with Hybrid Block Floating Point,” presented by EPFL researchers Mario Drumond, Tao Lin, Martin Jaggi, and Babak Falsafi.
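To give a flavor of the idea behind that paper: block floating point (BFP) stores a single shared exponent per block of values, so each element needs only a fixed-width mantissa, which greatly reduces arithmetic cost. The sketch below illustrates plain BFP quantization of one block; it is a generic illustration only, not the paper’s hybrid scheme, and the function name and parameters are hypothetical.

```python
import math

def bfp_quantize(block, mantissa_bits=8):
    """Illustrative block floating point quantization: the whole block
    shares one exponent, and each value keeps a fixed-width mantissa."""
    # Shared exponent is taken from the largest magnitude in the block.
    max_mag = max(abs(x) for x in block)
    if max_mag == 0.0:
        return [0.0] * len(block)
    _, shared_exp = math.frexp(max_mag)  # max_mag = m * 2**shared_exp, m in [0.5, 1)
    # Smallest representable step given the shared exponent and mantissa width.
    scale = 2.0 ** (shared_exp - mantissa_bits)
    # Each value is stored as an integer mantissa times the shared scale.
    return [round(x / scale) * scale for x in block]
```

Values near the block maximum are represented almost exactly, while small values lose relative precision to the shared exponent; the paper’s hybrid approach mitigates exactly this trade-off during DNN training.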
We are investigating new stochastic theories and analytics, including statistical learning techniques and machine learning training and inference systems, and their applications to IoT, social media, and big data platforms.
We are investigating technologies that maximize efficiency through in-memory data services, in-situ query processing on streamed data, and on-demand query-engine customization using multi-objective compiler optimization.
We are investigating a three-pronged approach to holistic data platforms and datacenter efficiency: vertical integration to minimize data movement; specialization to optimize work per service; and approximation to tailor work to output quality.
We are investigating technologies spanning decentralized trust and cryptography as well as the robustness and resiliency of natural processes to strengthen security, privacy, and trust in data platforms and systems.