Build faster state-of-the-art NLP models in TensorFlow 2


The world of NLP has seen enormous progress in the last few years, and it can easily be divided into a pre-BERT era and a post-BERT era. Transformer-based models have dominated the field for the last three years. Quite a few variations of transformer-based models have emerged to address specific drawbacks, but the core idea remains the same. From 124M-parameter models to 13B-parameter models, progress has been rapid. This progress poses the challenge of how these huge models can be…

