This article, posted on Hacker News today, is a good overview and critique of the current hype around NLP and speech-to-text in general. I like this comparison:
Stable production-grade NLP models are usually built on data that is several orders of magnitude larger, or the task at hand has to be much simpler. For example, it is safe to assume that a neural network can do NER reliably, but answering questions or maintaining dialogue is just science fiction for now. I like an apt analogy: building AGI with transformers is like going to the Moon by building towers.
About me
I have been working with language technologies for more than 20 years as a developer, product manager, and AI lead. Starting with speech recognition and machine translation, I now focus on education in semantic technologies and LLMs.
Check out my AI trainings.
Contact me to book your training.