Chandana Satya Prakash
Verified email at amazon.com
Title · Cited by · Year
Alexa teacher model: Pretraining and distilling multi-billion-parameter encoders for natural language understanding systems
J FitzGerald, S Ananthakrishnan, K Arkoudas, D Bernardi, A Bhagia, ...
Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and …, 2022
63 · 2022
AlexaTM 20B: Few-shot learning using a large-scale multilingual seq2seq model
S Soltan, S Ananthakrishnan, J FitzGerald, R Gupta, W Hamza, H Khan, ...
arXiv preprint arXiv:2208.01448, 2022
61 · 2022
AlexaTM 20B: Few-shot learning using a large-scale multilingual seq2seq model, 2022
S Soltan, S Ananthakrishnan, J FitzGerald, R Gupta, W Hamza, H Khan, ...
URL https://arxiv.org/abs/2208.01448
9
Instilling type knowledge in language models via multi-task QA
S Li, M Sridhar, CS Prakash, J Cao, W Hamza, J McAuley
arXiv preprint arXiv:2204.13796, 2022
8 · 2022
AlexaTM 20B: Few-shot learning using a large-scale multilingual seq2seq model. arXiv 2022
S Soltan, S Ananthakrishnan, J FitzGerald, R Gupta, W Hamza, H Khan, ...
arXiv preprint arXiv:2208.01448, 2022
5 · 2022
Attention Fusion: a light yet efficient late fusion mechanism for task adaptation in NLU
J Cao, CS Prakash, W Hamza
Findings of the Association for Computational Linguistics: NAACL 2022, 857-866, 2022
2 · 2022
Alexa Teacher Models
JGM FitzGerald, S Ananthakrishnan, K Arkoudas, D Bernardi, A Bhagia, ...
ICON, 2021
2 · 2021
FARS: FSM-Augmentation to Make LLMs Hallucinate the Right APIs
S Rongali, CS Prakash, A Gupta, W Hamza
2023
Shared encoder for natural language understanding processing
JJ Hueser, F Triefenbach, CS Prakash, J Cao, W Hamza, M Momotko
US Patent App. 17/690,609, 2023
2023
Sharing encoder representations across languages, domains and tasks in large-scale spoken language understanding
J Hueser, J Gaspers, T Gueudre, C Prakash, J Cao, D Sorokin, Q Do, ...
Proceedings of the 61st Annual Meeting of the Association for Computational …, 2023
2023