Tolga Ergen
Research Scientist, LG AI Research
Verified email at stanford.edu
Title · Cited by · Year
Unsupervised anomaly detection with LSTM neural networks
T Ergen, SS Kozat
IEEE transactions on neural networks and learning systems 31 (8), 3127-3141, 2019
Cited by 274 · 2019
Online training of LSTM networks in distributed systems for variable length data sequences
T Ergen, SS Kozat
IEEE transactions on neural networks and learning systems 29 (10), 5159-5165, 2017
Cited by 105 · 2017
Efficient online learning algorithms based on LSTM neural networks
T Ergen, SS Kozat
IEEE transactions on neural networks and learning systems 29 (8), 3772-3783, 2017
Cited by 102 · 2017
Neural networks are convex regularizers: Exact polynomial-time convex optimization formulations for two-layer networks
M Pilanci, T Ergen
International Conference on Machine Learning, 7695-7705, 2020
Cited by 86 · 2020
Revealing the Structure of Deep Neural Networks via Convex Duality
T Ergen, M Pilanci
arXiv preprint arXiv:2002.09773, 2020
Cited by 76* · 2020
Convex geometry and duality of over-parameterized neural networks
T Ergen, M Pilanci
The Journal of Machine Learning Research 22 (1), 9646-9708, 2021
Cited by 51 · 2021
Implicit Convex Regularizers of CNN Architectures: Convex Optimization of Two- and Three-Layer Networks in Polynomial Time
T Ergen, M Pilanci
arXiv preprint arXiv:2006.14798, 2020
Cited by 41 · 2020
Vector-output ReLU neural network problems are copositive programs: Convex analysis of two layer networks and polynomial-time algorithms
A Sahiner, T Ergen, J Pauly, M Pilanci
arXiv preprint arXiv:2012.13329, 2020
Cited by 40 · 2020
Global optimality beyond two layers: Training deep ReLU networks via convex programs
T Ergen, M Pilanci
International Conference on Machine Learning, 2993-3003, 2021
Cited by 33 · 2021
Convex geometry of two-layer ReLU networks: Implicit autoencoding and interpretable models
T Ergen, M Pilanci
International Conference on Artificial Intelligence and Statistics, 4024-4033, 2020
Cited by 32 · 2020
Demystifying batch normalization in ReLU networks: Equivalent convex optimization models and implicit regularization
T Ergen, A Sahiner, B Ozturkler, J Pauly, M Mardani, M Pilanci
arXiv preprint arXiv:2103.01499, 2021
Cited by 26 · 2021
Unraveling attention via convex duality: Analysis and interpretations of vision transformers
A Sahiner, T Ergen, B Ozturkler, J Pauly, M Mardani, M Pilanci
International Conference on Machine Learning, 19050-19088, 2022
Cited by 25 · 2022
Hidden convexity of Wasserstein GANs: Interpretable generative models with closed-form solutions
A Sahiner, T Ergen, B Ozturkler, B Bartan, J Pauly, M Mardani, M Pilanci
arXiv preprint arXiv:2107.05680, 2021
Cited by 19 · 2021
Energy-efficient LSTM networks for online learning
T Ergen, AH Mirza, SS Kozat
IEEE Transactions on Neural Networks and Learning Systems 31 (8), 3114-3126, 2019
Cited by 18 · 2019
Convex optimization for shallow neural networks
T Ergen, M Pilanci
2019 57th Annual Allerton Conference on Communication, Control, and …, 2019
Cited by 16 · 2019
Convex neural autoregressive models: Towards tractable, expressive, and theoretically-backed models for sequential forecasting and generation
V Gupta, B Bartan, T Ergen, M Pilanci
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …, 2021
Cited by 14* · 2021
Path regularization: A convexity and sparsity inducing regularization for parallel ReLU networks
T Ergen, M Pilanci
Advances in Neural Information Processing Systems 36, 2024
Cited by 13 · 2024
Parallel deep neural networks have zero duality gap
Y Wang, T Ergen, M Pilanci
arXiv preprint arXiv:2110.06482, 2021
Cited by 11 · 2021
A novel distributed anomaly detection algorithm based on support vector machines
T Ergen, SS Kozat
Digital Signal Processing 99, 102657, 2020
Cited by 11 · 2020
Convex duality and cutting plane methods for over-parameterized neural networks
T Ergen, M Pilanci
OPT-ML workshop, 2019
Cited by 9 · 2019
Articles 1–20