Sushrut Karmalkar
University of Wisconsin-Madison
Verified email at cs.utexas.edu - Homepage
Title · Cited by · Year
List-decodable linear regression
S Karmalkar, A Klivans, P Kothari
Advances in neural information processing systems 32, 2019
Cited by 81 · 2019
Superpolynomial lower bounds for learning one-layer neural networks using gradient descent
S Goel, A Gollakota, Z Jin, S Karmalkar, A Klivans
International Conference on Machine Learning, 3587-3596, 2020
Cited by 68 · 2020
Time/accuracy tradeoffs for learning a ReLU with respect to Gaussian marginals
S Goel, S Karmalkar, A Klivans
Advances in neural information processing systems 32, 2019
Cited by 56 · 2019
Approximation schemes for ReLU regression
I Diakonikolas, S Goel, S Karmalkar, AR Klivans, M Soltanolkotabi
Conference on Learning Theory, 1452-1485, 2020
Cited by 52 · 2020
Robustly learning any clusterable mixture of Gaussians
I Diakonikolas, SB Hopkins, D Kane, S Karmalkar
arXiv preprint arXiv:2005.06417, 2020
Cited by 48 · 2020
Instance-optimal compressed sensing via posterior sampling
A Jalal, S Karmalkar, AG Dimakis, E Price
arXiv preprint arXiv:2106.11438, 2021
Cited by 39 · 2021
Outlier-robust high-dimensional sparse estimation via iterative filtering
I Diakonikolas, D Kane, S Karmalkar, E Price, A Stewart
Advances in Neural Information Processing Systems 32, 2019
Cited by 37 · 2019
Compressed sensing with adversarial sparse noise via L1 regression
S Karmalkar, E Price
arXiv preprint arXiv:1809.08055, 2018
Cited by 36 · 2018
Fairness for image generation with uncertain sensitive attributes
A Jalal, S Karmalkar, J Hoffmann, A Dimakis, E Price
International Conference on Machine Learning, 4721-4732, 2021
Cited by 35 · 2021
Outlier-robust clustering of Gaussians and other non-spherical mixtures
A Bakshi, I Diakonikolas, SB Hopkins, D Kane, S Karmalkar, PK Kothari
2020 IEEE 61st Annual Symposium on Foundations of Computer Science (FOCS …, 2020
Cited by 30 · 2020
Robust polynomial regression up to the information theoretic limit
D Kane, S Karmalkar, E Price
2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS …, 2017
Cited by 18 · 2017
On the power of compressed sensing with generative models
A Kamath, E Price, S Karmalkar
International Conference on Machine Learning, 5101-5109, 2020
Cited by 17 · 2020
Robust sparse mean estimation via sum of squares
I Diakonikolas, DM Kane, S Karmalkar, A Pensia, T Pittas
Conference on Learning Theory, 4703-4763, 2022
Cited by 15 · 2022
Lower bounds for compressed sensing with generative models
A Kamath, S Karmalkar, E Price
arXiv preprint arXiv:1912.02938, 2019
Cited by 14 · 2019
List-decodable sparse mean estimation via difference-of-pairs filtering
I Diakonikolas, D Kane, S Karmalkar, A Pensia, T Pittas
Advances in Neural Information Processing Systems 35, 13947-13960, 2022
Cited by 10 · 2022
Fourier entropy-influence conjecture for random linear threshold functions
S Chakraborty, S Karmalkar, S Kundu, SV Lokam, N Saurabh
LATIN 2018: Theoretical Informatics: 13th Latin American Symposium, Buenos …, 2018
Cited by 5 · 2018
Compressed sensing with approximate priors via conditional resampling
A Jalal, S Karmalkar, A Dimakis, E Price
NeurIPS 2020 Workshop on Deep Learning and Inverse Problems, 2020
Cited by 4 · 2020
Distribution-Independent Regression for Generalized Linear Models with Oblivious Corruptions
I Diakonikolas, S Karmalkar, JH Park, C Tzamos
The Thirty-Sixth Annual Conference on Learning Theory, 5453-5475, 2023
Cited by 1 · 2023
The polynomial method is universal for distribution-free correlational SQ learning
A Gollakota, S Karmalkar, A Klivans
arXiv preprint arXiv:2010.11925, 2020
Cited by 1 · 2020
Depth separation and weight-width trade-offs for sigmoidal neural networks
A Deshpande, N Goyal, S Karmalkar
Cited by 1 · 2018
Articles 1–20