Samuel Horvath
PhD Student, KAUST
Verified email at kaust.edu.sa
Title · Cited by · Year
Stochastic Distributed Learning with Gradient Quantization and Variance Reduction
S Horvath, D Kovalev, K Mishchenko, SU Stich, P Richtarik
arXiv preprint arXiv:1904.05115, 2019
Cited by 100, 2019
Don't Jump Through Hoops and Remove Those Loops: SVRG and Katyusha are Better Without the Outer Loop
D Kovalev, S Horváth, P Richtárik
ALT 2020 - Proceedings of the 31st International Conference on Algorithmic …, 2019
Cited by 89, 2019
Natural compression for distributed deep learning
S Horváth, CY Ho, L Horvath, AN Sahu, M Canini, P Richtárik
arXiv preprint arXiv:1905.10988, 2019
Cited by 67, 2019
On Biased Compression for Distributed Learning
A Beznosikov, S Horváth, P Richtárik, M Safaryan
NeurIPS 2020, Workshop on Scalability, Privacy, and Security in Federated …, 2020
Cited by 64, 2020
Lower bounds and optimal algorithms for personalized federated learning
F Hanzely, S Hanzely, S Horváth, P Richtárik
34th Conference on Neural Information Processing Systems (NeurIPS 2020), 2020
Cited by 54, 2020
A field guide to federated optimization
J Wang, Z Charles, Z Xu, G Joshi, HB McMahan, M Al-Shedivat, G Andrew, ...
arXiv preprint arXiv:2107.06917, 2021
Cited by 34, 2021
Optimal Client Sampling for Federated Learning
W Chen, S Horvath, P Richtarik
NeurIPS 2020 workshop on Privacy Preserving Machine Learning, 2020
Cited by 31, 2020
Nonconvex variance reduced optimization with arbitrary sampling
S Horváth, P Richtárik
ICML 2019 - Proceedings of the 36th International Conference on Machine Learning, 2018
Cited by 25, 2018
A Better Alternative to Error Feedback for Communication-Efficient Distributed Learning
S Horváth, P Richtárik
ICLR 2021 - International Conference on Learning Representations, 2020
Cited by 18, 2020
FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout
S Horvath, S Laskaridis, M Almeida, I Leontiadis, SI Venieris, ND Lane
35th Conference on Neural Information Processing Systems (NeurIPS 2021), 2021
Cited by 16, 2021
Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
S Horváth, L Lei, P Richtárik, MI Jordan
SIAM Journal on Mathematics of Data Science (SIMODS), 2022
Cited by 7, 2022
Hyperparameter transfer learning with adaptive complexity
S Horváth, A Klein, P Richtárik, C Archambeau
International Conference on Artificial Intelligence and Statistics, 1378-1386, 2021
Cited by 4, 2021
Long-term outcome in patients with takotsubo syndrome
E Pogran, A El-Razek, L Gargiulo, V Weihs, C Kaufmann, S Horváth, ...
Wiener klinische Wochenschrift 134 (7), 261-268, 2022
Cited by 1, 2022
FedShuffle: Recipes for Better Use of Local Work in Federated Learning
S Horváth, M Sanjabi, L Xiao, P Richtárik, M Rabbat
arXiv preprint arXiv:2204.13169, 2022
2022
FL_PyTorch: optimization research simulator for federated learning
K Burlachenko, S Horváth, P Richtárik
Proceedings of the 2nd ACM International Workshop on Distributed Machine …, 2021
2021
FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning
E Gasanov, A Khaled, S Horváth, P Richtárik
AISTATS 2022, International Conference on Artificial Intelligence and Statistics, 2021
2021
Learning to Optimize via Dual Space Preconditioning
S Chraibi, A Salim, S Horváth, F Hanzely, P Richtárik
2019
IntML: Natural Compression for Distributed Deep Learning
S Horváth, CY Ho, L Horváth, AN Sahu, M Canini, P Richtárik
Workshop on AI Systems at Symposium on Operating Systems Principles 2019 …, 2019
2019