A Center for Sustainable Cloud Computing

Analytics & Applications

We are investigating new stochastic theories and analytics, including statistical learning techniques, machine learning training and inference systems, and their applications to IoT, social media, and big data platforms.


Adversarial Machine Learning
Efficient distributed learning solutions, taking into account adversarial behavior in both worker-server and peer-to-peer architectures.
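The project page does not detail the defenses used; one standard building block for tolerating adversarial (Byzantine) workers in a worker-server architecture is replacing gradient averaging with coordinate-wise median aggregation. The sketch below is illustrative, not the project's actual method, and all names in it are hypothetical.

```python
import numpy as np

def coordinate_median(grads):
    """Aggregate worker gradients by coordinate-wise median, a standard
    Byzantine-robust alternative to plain averaging."""
    return np.median(np.stack(grads), axis=0)

# Seven honest workers report gradients near the true one; two adversarial
# workers send large corrupted vectors to derail training.
rng = np.random.default_rng(42)
true_grad = np.array([1.0, -2.0, 0.5])
honest = [true_grad + 0.01 * rng.standard_normal(3) for _ in range(7)]
byzantine = [np.full(3, 1e6), np.full(3, 1e6)]

mean_agg = np.mean(np.stack(honest + byzantine), axis=0)
median_agg = coordinate_median(honest + byzantine)
print(np.linalg.norm(median_agg - true_grad) < 0.1)  # median stays close
print(np.linalg.norm(mean_agg - true_grad) > 1.0)    # mean is hijacked
```

Because the median of each coordinate is taken over all nine reports, a minority of arbitrarily corrupted workers cannot move the aggregate far from the honest majority.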
The first benchmark suite for emerging scale-out applications
Co-located Deep Learning Training and Inference
DuHL: New Strategy to Render 10X Faster Machine Learning
A new generic algorithmic building block to accelerate training of machine learning models on heterogeneous computing systems.
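DuHL's published idea is duality-gap-based selection: keep only the training examples with the largest per-example duality gap in the limited memory of the fast device, and run dual coordinate updates on that subset. The sketch below illustrates this selection rule for L2-regularized least squares; the constants, cache size, and variable names are illustrative assumptions, not the project's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, lam, cache = 200, 10, 0.1, 32   # cache = examples kept on the device
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

alpha = np.zeros(n)                    # dual variables
w = X.T @ alpha / (lam * n)            # primal model induced by the duals

def gaps(w, alpha):
    """Per-example duality gap for squared loss:
    gap_i = l_i(m_i) + l_i*(-alpha_i) + alpha_i * m_i, always >= 0."""
    m = X @ w
    return 0.5 * (m - y) ** 2 + 0.5 * alpha ** 2 - alpha * y + alpha * m

g0 = gaps(w, alpha).sum()
for epoch in range(20):
    # Refresh the "device cache" with the examples of largest gap.
    idx = np.argsort(gaps(w, alpha))[-cache:]
    for i in rng.permutation(idx):
        # Closed-form SDCA step for squared loss on example i.
        m_i = X[i] @ w
        delta = (y[i] - m_i - alpha[i]) / (1.0 + X[i] @ X[i] / (lam * n))
        alpha[i] += delta
        w += delta * X[i] / (lam * n)
print(gaps(w, alpha).sum() < g0)  # total duality gap shrinks
```

Concentrating updates on high-gap examples spends the accelerator's limited memory where the remaining suboptimality is largest, which is the source of the reported speedups.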
Large-scale Data Analytics
Maximizing resource utilization while achieving high parallelism and load balance.
Learning-based Dimensionality Reduction
Advanced techniques that will improve the applicability of adaptive sampling.
Modularity and Scalability in ML
Algorithms that combine three key aspects: accelerated, non-convex, and distributed training.
Real-time Event Processing
Customer experience optimization improves markedly when updates to the system are applied in real time.
Scalable Learning Machines
Developing learning systems that can be customized according to the skills and abilities of students.
Time-data Trade-off
Accelerating the process of mathematical optimization.
Training DNN Using SSD
An algorithm termed stochastic spectral descent (SSD) for training deep neural networks.
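Spectral descent replaces the Euclidean geometry of ordinary gradient descent with the spectral norm: for a matrix parameter, the update direction is U Vᵀ from the SVD of the gradient, scaled by the gradient's nuclear norm (the dual of the spectral norm). The following is a minimal sketch of one such step on a toy quadratic, not the project's full stochastic algorithm; the step size and objective are illustrative assumptions.

```python
import numpy as np

def spectral_descent_step(W, grad, lr):
    """One spectral-descent-style step for a matrix parameter:
    steepest descent w.r.t. the spectral norm moves along U V^T
    (SVD of the gradient), scaled by the gradient's nuclear norm."""
    U, s, Vt = np.linalg.svd(grad, full_matrices=False)
    nuclear = s.sum()                  # dual (nuclear) norm of the gradient
    return W - lr * nuclear * (U @ Vt)

# Toy objective: f(W) = 0.5 * ||W - A||_F^2, with gradient W - A.
rng = np.random.default_rng(0)
A = np.eye(3) + 0.1 * rng.standard_normal((3, 3))
W = np.zeros((3, 3))
loss = lambda W: 0.5 * np.linalg.norm(W - A, "fro") ** 2

initial = loss(W)
for _ in range(50):
    W = spectral_descent_step(W, W - A, lr=0.01)
print(loss(W) < initial)  # the objective decreases
```

For layers whose curvature is better captured by the spectral norm than the Euclidean norm, this non-Euclidean step can make markedly faster progress per iteration.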
Training GANs: A Convex Optimization Perspective
Generative adversarial networks (GANs) are a class of deep generative models, extensively used in AI.