Hua Wu
Baidu NLP
Verified email at baidu.com - Homepage
Title · Cited by · Year
Ernie: Enhanced representation through knowledge integration
Y Sun, S Wang, Y Li, S Feng, X Chen, H Zhang, X Tian, D Zhu, H Tian, ...
arXiv preprint arXiv:1904.09223, 2019
Cited by 993 · 2019
Ernie 2.0: A continual pre-training framework for language understanding
Y Sun, S Wang, Y Li, S Feng, H Tian, H Wu, H Wang
Proceedings of the AAAI conference on artificial intelligence 34 (05), 8968-8975, 2020
Cited by 777 · 2020
Multi-task learning for multiple language translation
D Dong, H Wu, W He, D Yu, H Wang
Proceedings of the 53rd Annual Meeting of the Association for Computational …, 2015
Cited by 667 · 2015
Minimum risk training for neural machine translation
S Shen, Y Cheng, Z He, W He, H Wu, M Sun, Y Liu
arXiv preprint arXiv:1512.02433, 2015
Cited by 484 · 2015
RocketQA: An optimized training approach to dense passage retrieval for open-domain question answering
Y Qu, Y Ding, J Liu, K Liu, R Ren, WX Zhao, D Dong, H Wu, H Wang
arXiv preprint arXiv:2010.08191, 2020
Cited by 448 · 2020
An end-to-end model for question answering over knowledge base with cross-attention combining global knowledge
Y Hao, Y Zhang, K Liu, S He, Z Liu, H Wu, J Zhao
Proceedings of the 55th Annual Meeting of the Association for Computational …, 2017
Cited by 407 · 2017
Learning to respond with deep neural networks for retrieval-based human-computer conversation system
R Yan, Y Song, H Wu
Proceedings of the 39th International ACM SIGIR conference on Research and …, 2016
Cited by 386 · 2016
Multi-turn response selection for chatbots with deep attention matching network
X Zhou, L Li, D Dong, Y Liu, Y Chen, WX Zhao, D Yu, H Wu
Proceedings of the 56th Annual Meeting of the Association for Computational …, 2018
Cited by 384 · 2018
Ernie-vil: Knowledge enhanced vision-language representations through scene graphs
F Yu, J Tang, W Yin, Y Sun, H Tian, H Wu, H Wang
Proceedings of the AAAI conference on artificial intelligence 35 (4), 3208-3216, 2021
Cited by 337 · 2021
Unimo: Towards unified-modal understanding and generation via cross-modal contrastive learning
W Li, C Gao, G Niu, X Xiao, H Liu, J Liu, H Wu, H Wang
arXiv preprint arXiv:2012.15409, 2020
Cited by 332 · 2020
Ernie 3.0: Large-scale knowledge enhanced pre-training for language understanding and generation
Y Sun, S Wang, S Feng, S Ding, C Pang, J Shang, J Liu, X Chen, Y Zhao, ...
arXiv preprint arXiv:2107.02137, 2021
Cited by 311 · 2021
Semi-supervised learning for neural machine translation
Y Cheng
Joint training for neural machine translation, 25-40, 2019
Cited by 305 · 2019
Dureader: a Chinese machine reading comprehension dataset from real-world applications
W He, K Liu, J Liu, Y Lyu, S Zhao, X Xiao, Y Liu, Y Wang, H Wu, Q She, ...
arXiv preprint arXiv:1711.05073, 2017
Cited by 301 · 2017
Multi-view response selection for human-computer conversation
X Zhou, D Dong, H Wu, S Zhao, D Yu, H Tian, X Liu, R Yan
Proceedings of the 2016 conference on empirical methods in natural language …, 2016
Cited by 264 · 2016
Pivot language approach for phrase-based statistical machine translation
H Wu, H Wang
Machine Translation 21, 165-181, 2007
Cited by 259 · 2007
PLATO: Pre-trained dialogue generation model with discrete latent variable
S Bao, H He, F Wang, H Wu, H Wang
arXiv preprint arXiv:1910.07931, 2019
Cited by 250 · 2019
Geometry-enhanced molecular representation learning for property prediction
X Fang, L Liu, J Lei, D He, S Zhang, J Zhou, F Wang, H Wu, H Wang
Nature Machine Intelligence 4 (2), 127-134, 2022
Cited by 249 · 2022
Unified structure generation for universal information extraction
Y Lu, Q Liu, D Dai, X Xiao, H Lin, X Han, L Sun, H Wu
arXiv preprint arXiv:2203.12277, 2022
Cited by 239 · 2022
SKEP: Sentiment knowledge enhanced pre-training for sentiment analysis
H Tian, C Gao, X Xiao, H Liu, B He, H Wu, H Wang, F Wu
arXiv preprint arXiv:2005.05635, 2020
Cited by 226 · 2020
STACL: Simultaneous translation with implicit anticipation and controllable latency using prefix-to-prefix framework
M Ma, L Huang, H Xiong, R Zheng, K Liu, B Zheng, C Zhang, Z He, H Liu, ...
arXiv preprint arXiv:1810.08398, 2018
Cited by 222 · 2018
Articles 1–20