Advisor
Hear My Words: Transformer-Based Models for NLP and Speech Recognition
By Curt Hall
Posted February 11, 2020 | Technology

Transformer-based natural language processing (NLP) models have significant implications across the entire field of NLP, from speech recognition systems to natural language generation (NLG), natural language understanding (NLU), machine translation, and text analysis applications. Consequently, tools for developing transformer-based models have become popular among researchers and developers building NLP applications.