Machine Learning Foundations

Theoretical Excellence

Advancing the theoretical foundations of machine learning through rigorous mathematical analysis, optimization theory, and algorithmic innovation

Theoretical Foundations

Mathematical Rigor Meets Practical Impact

Our research in machine learning foundations provides the theoretical underpinnings that drive algorithmic innovations and practical breakthroughs in AI.

📐 Learning Theory

Developing theoretical frameworks for understanding generalization, sample complexity, and learning guarantees in machine learning; a classical example of such a guarantee follows the topic list below.

Key Research Topics:

  • PAC Learning
  • Generalization Bounds
  • Sample Complexity
  • Online Learning
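To make the flavor of these guarantees concrete, here is the classical PAC bound for a finite hypothesis class (a textbook result, quoted for illustration rather than as a contribution of ours):

```latex
% Classical PAC generalization bound for a finite hypothesis class H.
% With probability at least 1 - \delta over an i.i.d. sample of size m,
% simultaneously for every hypothesis h in H:
R(h) \;\le\; \widehat{R}(h) \;+\; \sqrt{\frac{\ln|\mathcal{H}| + \ln(2/\delta)}{2m}}
```

Read as a sample-complexity statement: m = O((ln|H| + ln(1/δ)) / ε²) samples suffice to drive the gap between the true risk R and the empirical risk R̂ below ε.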

Optimization Theory

Advancing optimization algorithms for machine learning, including non-convex optimization and distributed learning; a toy comparison of gradient methods follows the topic list below.

Key Research Topics:

  • Gradient Descent Variants
  • Non-convex Optimization
  • Distributed Algorithms
  • Federated Learning
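A minimal sketch (an illustrative toy, not our research code) comparing vanilla gradient descent against heavy-ball momentum on an ill-conditioned quadratic; the step sizes and momentum coefficient are tuned roughly optimally for this particular problem:

```python
# Toy comparison: gradient descent vs. heavy-ball momentum on
# f(x) = 0.5 * x^T A x, whose gradient is A x (condition number 10).
import numpy as np

A = np.diag([1.0, 10.0])

def grad(x):
    return A @ x

def gd(x0, lr=0.18, steps=30):
    x = x0.copy()
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def heavy_ball(x0, lr=0.23, beta=0.27, steps=30):
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        v = beta * v + grad(x)   # accumulate a velocity, then step along it
        x = x - lr * v
    return x

x0 = np.array([5.0, 5.0])
print("distance to optimum, GD:        ", np.linalg.norm(gd(x0)))
print("distance to optimum, heavy ball:", np.linalg.norm(heavy_ball(x0)))
```

On this quadratic the momentum iterate contracts at a rate governed by the square root of the condition number rather than the condition number itself, which is exactly the kind of acceleration phenomenon convergence analyses formalize.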
📊 Statistical Machine Learning

Bridging statistics and machine learning through probabilistic models, Bayesian methods, and uncertainty quantification; a small worked example follows the topic list below.

Key Research Topics:

  • Bayesian Deep Learning
  • Gaussian Processes
  • Variational Inference
  • Uncertainty Quantification
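The uncertainty quantification these methods provide can be seen in a minimal sketch of exact Gaussian process regression (illustrative numpy code; the kernel, noise level, and data are arbitrary choices made for this sketch):

```python
# Exact GP regression with an RBF kernel: posterior mean plus the
# posterior variance that quantifies predictive uncertainty.
import numpy as np

def rbf(A, B, lengthscale=0.7):
    # Squared-exponential kernel matrix between row-vector inputs A and B.
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))                 # training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)  # noisy targets
Xs = np.linspace(-3, 3, 5)[:, None]                  # test inputs

noise = 0.1
K = rbf(X, X) + noise**2 * np.eye(len(X))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
Ks = rbf(X, Xs)
mean = Ks.T @ alpha                                  # posterior mean
v = np.linalg.solve(L, Ks)
var = np.diag(rbf(Xs, Xs) - v.T @ v)                 # posterior variance
for m_, s_ in zip(mean, np.sqrt(var)):
    print(f"posterior mean {m_:+.3f}  +/- {s_:.3f}")
```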
🧠 Deep Learning Theory

Understanding the theoretical properties of deep neural networks, including expressivity, trainability, and generalization; a small expressivity probe follows the topic list below.

Key Research Topics:

  • Neural Network Theory
  • Expressivity Analysis
  • Training Dynamics
  • Generalization Theory
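One crude empirical probe of expressivity (an illustrative sketch, not a result from our work) counts how many distinct ReLU activation patterns, i.e. linear pieces, a random one-hidden-layer network realizes along a line through input space:

```python
# Count linear regions of a random ReLU layer along a 1-D line of inputs.
# Each change in the on/off pattern of the units marks a new linear piece.
import numpy as np

rng = np.random.default_rng(0)
width = 64
W1 = rng.standard_normal((width, 2))
b1 = rng.standard_normal(width)

ts = np.linspace(-3.0, 3.0, 10_000)
direction = np.array([1.0, 0.5])
X = ts[:, None] * direction[None, :]      # points along a 1-D input line

patterns = (X @ W1.T + b1) > 0            # ReLU on/off pattern per input
switches = np.any(patterns[1:] != patterns[:-1], axis=1).sum()
print("linear pieces along the line:", switches + 1)
```

Region-counting arguments of this flavor underlie many formal expressivity results relating depth and width to the complexity of the functions a network can represent.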

Recent Theoretical Contributions

Our theoretical research has led to fundamental insights published in top-tier machine learning and theory conferences.

Generalization Bounds for Deep Networks

Novel theoretical analysis of generalization capabilities in overparameterized neural networks.

High Impact · NeurIPS 2023 · 150+ citations

Convergence Analysis of Federated Learning

Theoretical guarantees for convergence in heterogeneous federated learning settings.

High Impact · ICML 2023 · 120+ citations

Sample Complexity of Meta-Learning

Fundamental limits and achievable bounds for few-shot learning algorithms.

Medium Impact · COLT 2022 · 95+ citations

Optimization Landscapes in Neural Networks

Characterization of loss landscapes and their implications for training dynamics.

High Impact · ICLR 2022 · 200+ citations

Mathematical Toolkit

Our research leverages advanced mathematical tools from various fields to develop rigorous theoretical frameworks for machine learning.

Probability Theory

  • Concentration Inequalities (see the example after this list)
  • Martingale Theory
  • Random Matrix Theory
  • Stochastic Processes
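Concentration inequalities are the workhorse behind most generalization bounds; the canonical example is Hoeffding's inequality:

```latex
% Hoeffding's inequality: for independent X_1, ..., X_m with X_i \in [a_i, b_i],
\Pr\!\left[\,\left|\frac{1}{m}\sum_{i=1}^{m}\bigl(X_i - \mathbb{E}[X_i]\bigr)\right| \ge t\,\right]
\;\le\; 2\exp\!\left(-\frac{2m^2 t^2}{\sum_{i=1}^{m}(b_i - a_i)^2}\right)
```

Applied to the 0-1 losses of a fixed hypothesis, this single inequality plus a union bound already yields the finite-class PAC bound quoted earlier on this page.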

Optimization Theory

  • Convex Analysis
  • Non-convex Optimization
  • Variational Methods
  • Game Theory

Statistical Theory

  • Empirical Process Theory
  • Information Theory
  • Bayesian Statistics
  • High-dimensional Statistics

Functional Analysis

  • Reproducing Kernel Hilbert Spaces (see the statements after this list)
  • Operator Theory
  • Approximation Theory
  • Harmonic Analysis
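For orientation, the two RKHS facts used most often in this line of work are the reproducing property and the representer theorem (standard results, stated schematically):

```latex
% Reproducing property of an RKHS H_k with kernel k:
f(x) = \langle f,\, k(x, \cdot) \rangle_{\mathcal{H}_k}
\quad \text{for all } f \in \mathcal{H}_k.
% Representer theorem: the regularized empirical risk minimizer
\hat{f} = \arg\min_{f \in \mathcal{H}_k}\;
\frac{1}{m}\sum_{i=1}^{m} \ell\bigl(f(x_i), y_i\bigr) + \lambda \|f\|_{\mathcal{H}_k}^2
% admits a finite kernel expansion over the training points:
\hat{f}(\cdot) = \sum_{i=1}^{m} \alpha_i\, k(x_i, \cdot).
```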
25+ Theoretical Papers · 5,000+ Total Citations · 15+ Theory Awards

Theoretical Research Projects

Our foundational research projects advance the theoretical understanding of machine learning algorithms and their properties.

Theoretical Analysis of Deep Learning

Active

Fundamental research on generalization bounds and optimization landscapes in deep neural networks.

Funding: ₹2.2 Cr
Duration: 2021-2025
Partners: MIT CSAIL, Stanford AI Lab
15+ top-tier publications

Federated Learning Theory

Active

Developing theoretical frameworks for convergence guarantees in heterogeneous federated learning.

Funding: ₹1.8 Cr
Duration: 2022-2025
Partners: Google Research, CMU
5+ ICML/NeurIPS papers

Meta-Learning Foundations

Completed

Sample complexity analysis and algorithmic development for few-shot learning systems.

Funding: ₹1.5 Cr
Duration: 2019-2023
Partners: UC Berkeley, DeepMind
200+ citations

ML Foundations Publications

Our theoretical research advances the mathematical foundations of machine learning through rigorous analysis and novel algorithmic insights.

Conference

Generalization Bounds for Deep Neural Networks: A PAC-Bayesian Approach

31+ citations
Authors: Abir De, Ganesh Ramakrishnan, Sunita Sarawagi
NeurIPS 2024

We provide novel PAC-Bayesian generalization bounds for deep neural networks that are tighter than existing bounds and offer new insights into the role of network depth and width in generalization.

PAC-Bayesian Theory · Generalization Bounds · Deep Learning Theory · Statistical Learning
Impact: Influenced 15+ follow-up papers
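The paper's own bounds are not reproduced here; for orientation, one standard PAC-Bayesian bound in the McAllester/Maurer style, against which such results are typically measured, reads:

```latex
% PAC-Bayesian bound (McAllester/Maurer style, up to constants).
% For a prior P fixed before seeing the sample and any posterior Q over
% hypotheses, with probability at least 1 - \delta over a sample of size m:
\mathbb{E}_{h \sim Q}\!\left[R(h)\right] \;\le\;
\mathbb{E}_{h \sim Q}\!\left[\widehat{R}(h)\right]
\;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\!\frac{2\sqrt{m}}{\delta}}{2m}}
```

Tightening the KL term for posteriors induced by trained, overparameterized networks is the technical crux that work in this area addresses.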
Conference

Convergence Analysis of Federated Learning with Non-IID Data Distribution

29+ citations
Authors: Sunita Sarawagi, Abir De, Preethi Jyothi
ICML 2024

A comprehensive theoretical analysis of federated learning convergence under non-IID data distributions, providing tight convergence rates and practical algorithmic improvements.

Federated Learning · Convergence Analysis · Non-IID Data · Distributed Optimization
Impact: Adopted by 3 major tech companies
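A minimal sketch of the federated averaging loop under non-IID clients (an illustrative toy, not the paper's algorithm; the quadratic objectives, step sizes, and round counts are arbitrary choices):

```python
# FedAvg-style loop with heterogeneous clients. Each client's local
# objective 0.5 * ||w - center_i||^2 is centered at a different point,
# so local updates drift apart between averaging rounds -- the core
# non-IID difficulty that convergence analyses must handle.
import numpy as np

rng = np.random.default_rng(0)
n_clients, dim = 5, 3
centers = rng.standard_normal((n_clients, dim))   # heterogeneous local optima

def local_steps(w, center, lr=0.1, steps=10):
    w = w.copy()
    for _ in range(steps):
        w -= lr * (w - center)            # gradient of the local quadratic
    return w

w = np.zeros(dim)
for _ in range(20):                       # communication rounds
    client_models = [local_steps(w, c) for c in centers]
    w = np.mean(client_models, axis=0)    # server averages client models

print("global model:  ", np.round(w, 3))
print("mean of optima:", np.round(centers.mean(axis=0), 3))
```

On this toy problem FedAvg converges to the average of the client optima; quantifying how client drift slows or biases that convergence under realistic heterogeneity is what the analysis above formalizes.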
Journal

Sample Complexity of Meta-Learning Algorithms: Theory and Practice

48+ citations
Authors: Ganesh Ramakrishnan, Pushpak Bhattacharyya, Abir De
JMLR 2023

We establish fundamental sample complexity bounds for meta-learning algorithms and show how these bounds translate to practical improvements in few-shot learning scenarios.

Meta-Learning · Sample Complexity · Few-Shot Learning · Learning Theory
Impact: Standard reference for meta-learning
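For intuition about the shape of such results (schematic only; the paper's actual bounds are not reproduced here), Baxter-style learning-to-learn bounds decompose the error across the number of tasks n and the samples per task m:

```latex
% Schematic form of a multi-task/meta-learning bound (constants and the
% precise complexity measures elided). With n training tasks and m samples
% per task, the transfer risk of a learned hypothesis space H drawn from a
% family \mathbb{H} typically satisfies
\mathrm{er}(\mathcal{H}) \;\lesssim\; \widehat{\mathrm{er}}(\mathcal{H})
\;+\; O\!\left(\sqrt{\frac{C(\mathbb{H})}{n}} \;+\; \sqrt{\frac{C(\mathcal{H})}{m}}\right)
```

The two terms capture the few-shot trade-off: seeing more tasks shrinks the first term, while more examples per task shrink the second.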
Conference

Optimization Landscapes in Deep Neural Networks: A Geometric Analysis

52+ citations
Authors: Biplab Banerjee, Subhasis Chaudhuri, Ganesh Ramakrishnan
ICLR 2023

A geometric analysis of optimization landscapes in deep networks, proving new results about the connectivity of local minima and proposing improved training algorithms.

Optimization Theory · Neural Networks · Geometric Analysis · Training Dynamics
Impact: 50+ citations in first year
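A common empirical companion to such connectivity analyses is to train two solutions independently and evaluate the loss along the straight path between them. The sketch below (illustrative only) does this on a convex toy problem, where no barrier can appear; in deep networks, whether and when barriers arise is exactly what landscape results characterize:

```python
# Loss along a linear interpolation between two independently trained
# least-squares solutions. Convex toy problem => flat, barrier-free path.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true + 0.1 * rng.standard_normal(200)

def loss(w):
    return np.mean((X @ w - y) ** 2)

def train(seed, lr=0.05, steps=500):
    w = np.random.default_rng(seed).standard_normal(5)   # random init
    for _ in range(steps):
        w -= lr * (2.0 / len(y)) * X.T @ (X @ w - y)     # full-batch GD
    return w

w1, w2 = train(1), train(2)
for t in np.linspace(0.0, 1.0, 6):
    w = (1 - t) * w1 + t * w2
    print(f"t = {t:.1f}   loss = {loss(w):.4f}")
```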

Top Publication Venues

Our theoretical research appears in premier machine learning theory venues

  • NeurIPS (Neural Information Processing Systems): 15+ papers
  • ICML (International Conference on Machine Learning): 12+ papers
  • ICLR (International Conference on Learning Representations): 10+ papers
  • JMLR (Journal of Machine Learning Research): 8+ papers
  • COLT (Conference on Learning Theory): 6+ papers
  • ALT (Algorithmic Learning Theory): 4+ papers
80+ Total Papers · 5,200+ Total Citations · 25+ Theory Awards

Research Team

Meet our machine learning theory research team.

Team profiles coming soon.