DNN training and inference share the same basic operators but have fundamentally different requirements. Training is throughput-bound and relies on high-precision floating-point arithmetic for convergence, while inference is latency-bound and tolerant of low-precision arithmetic. Both workloads are computationally demanding and benefit from hardware accelerators. This disparity in resource requirements forces datacenter operators to choose between deploying separate custom accelerators for training and for inference, or reusing training accelerators for inference.
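To see why inference tolerates low precision, consider a minimal sketch (ours, not from any specific accelerator design): quantizing a weight matrix to int8 with a per-tensor scale typically perturbs a matrix-vector product by well under a few percent.

```python
# Illustrative sketch: inference tolerance to low-precision arithmetic.
# We quantize fp32 weights to int8 and compare a matrix-vector product
# against the fp32 reference. All names here are our own.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256)).astype(np.float32)  # fp32 weights
x = rng.standard_normal(256).astype(np.float32)         # input activation

# Symmetric per-tensor int8 quantization: map max |w| to 127.
scale = np.abs(W).max() / 127.0
W_q = np.round(W / scale).astype(np.int8)

y_ref = W @ x                                # fp32 reference result
y_int8 = (W_q.astype(np.int32) @ x) * scale  # dequantized int8 result

rel_err = np.linalg.norm(y_ref - y_int8) / np.linalg.norm(y_ref)
print(rel_err)
```

The relative error stays small because quantization noise averages out across the reduction dimension, which is exactly the property inference accelerators exploit; training, by contrast, accumulates such errors across many gradient steps.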
In their paper, the researchers introduce a nanowire-based design for high-electron-mobility tri-gate transistors aimed at power-conversion applications. Built on nanoscale structures, the novel transistor significantly reduces heat loss during energy conversion.
Contrary to expectations, the experiment revealed that respondents held firmly to their views, regardless of the celebrities' input or the esteem in which respondents held them. It was also clear that respondents liked hearing an opinion identical to their own, even when it came from a disliked celebrity. Conversely, a dissenting opinion from a celebrity or expert reduced the respondent's empathy for that person.
Two EPFL students have developed PowerSGD, an algorithm that compresses the gradients exchanged during distributed training, reducing the required bandwidth without compromising the accuracy of the training.
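A minimal sketch of the core idea behind PowerSGD, assuming the rank-r power-iteration formulation from the paper: instead of sending a full gradient matrix, each worker sends two thin factors obtained from one power-iteration step. Details such as error feedback and warm-starting Q are omitted here, and the variable names are ours.

```python
# Hedged sketch of rank-r gradient compression via one power-iteration
# step, in the spirit of PowerSGD. Not the reference implementation.
import numpy as np

def compress(M, Q):
    """Approximate an n x m gradient matrix M as P @ Q.T with rank r."""
    P = M @ Q               # n x r projection onto the current subspace
    P, _ = np.linalg.qr(P)  # orthonormalize the columns of P
    Q = M.T @ P             # m x r; workers exchange only (P, Q)
    return P, Q

rng = np.random.default_rng(1)
n, m, r = 128, 64, 4
# A gradient with low effective rank compresses well; we build one.
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))

Q0 = rng.standard_normal((m, r))       # random initial subspace
P, Q = compress(M, Q0)
M_hat = P @ Q.T                        # reconstruction at the receiver

ratio = (n * m) / (r * (n + m))        # bandwidth-saving factor (~10.7x)
err = np.linalg.norm(M - M_hat) / np.linalg.norm(M)
print(ratio, err)
```

For this exactly rank-r gradient the reconstruction is near-perfect while transmitting roughly an order of magnitude fewer values; real gradients are only approximately low-rank, which is where error feedback comes in.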
We are investigating new stochastic theories and analytics, including statistical learning techniques and machine-learning training and inference systems, and their applications to IoT, social media, and big-data platforms.
We are investigating technologies to maximize efficiency with in-memory data services, in-situ query processing on streamed data, and on-demand query engine customization using multi-objective compiler optimization.
We are investigating a three-pronged approach to holistic data platforms and datacenter efficiency: vertical integration to minimize data movement; specialization to optimize work per service; and approximation to tailor work to output quality.
We are investigating technologies spanning from decentralized trust and cryptography to robustness and resiliency of natural processes to strengthen security, privacy and trust in data platforms and systems.