05-03, 13:30–13:55 (Asia/Jerusalem), PyData Track 1
We have all heard about huge transformers (e.g. GPT-3, DALL·E) that cost millions of dollars to train and achieve amazing results. But is there still room for the little guy, with a single GPU and a small budget, to innovate in NLP?
Well, have you heard about grounding?
In this talk we will describe natural language grounding, a technique that takes world context into account and has achieved impressive results.
We will demonstrate how instruction parsing can be done more efficiently with a grounded representation.
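As a flavor of the idea, here is a minimal toy sketch (the world state, function names, and matching heuristic are all illustrative assumptions, not the talk's actual method): instead of parsing an instruction in isolation, a grounded parser resolves its referents against the current world state.

```python
# Toy grounding sketch (hypothetical example, not the talk's implementation):
# resolve an instruction's referent against a world state rather than
# interpreting the text alone.

WORLD = [
    {"id": 1, "type": "block", "color": "red"},
    {"id": 2, "type": "block", "color": "blue"},
    {"id": 3, "type": "ball", "color": "red"},
]

def ground(instruction, world):
    """Return the world object whose attributes best match the instruction."""
    tokens = instruction.lower().split()
    best, best_score = None, 0
    for obj in world:
        # Count how many of the object's attribute values appear in the text.
        score = sum(1 for value in obj.values() if str(value) in tokens)
        if score > best_score:
            best, best_score = obj, score
    return best

print(ground("pick up the red block", WORLD))  # resolves to the red block (id 1)
```

Even this crude attribute-matching shows the point: the ambiguous phrase "the red block" only becomes actionable once it is tied to objects in the world.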
And we will discuss the similarities with pragmatics (in linguistics).
English
Target audience – Data Scientists, Managers