All they need to do is run GPT-3 (or a later model). It's a few-shot learner.
Quote:
GPT-3
arxiv.org/abs/2005.14165
Generative Pre-trained Transformer 3 is an autoregressive language model that uses deep learning to produce human-like text. It is the third-generation language prediction model in the GPT-n series created by OpenAI, a San Francisco-based artificial intelligence research laboratory. -- Wikipedia
Original author(s): OpenAI
Initial release: June 11, 2020 (beta)
Type: Autoregressive Transformer language model
Quote:
Originally Posted by DDG
https://en.wikipedia.org/wiki/GPT-3
GPT-3, specifically the Codex model, is the basis for GitHub Copilot, a code completion and generation software that can be used in various code editors and IDEs. GPT-3 is used in certain Microsoft products to translate conventional language into formal computer code.
The specific language is just an abstraction layer.
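To make the few-shot idea concrete, here is a minimal sketch of how such a prompt is typically assembled: a handful of demonstration pairs followed by the new task, which the model then completes. The task strings and helper name below are hypothetical, and actually getting a completion would require calling the OpenAI API, which is omitted here.

```python
def build_few_shot_prompt(examples, query):
    """Concatenate demonstration (description, code) pairs, then the new task.

    A model like GPT-3/Codex would be asked to continue the text after the
    final '# Task:' line, inferring the pattern from the examples alone --
    no fine-tuning, just in-context (few-shot) learning.
    """
    parts = []
    for description, code in examples:
        parts.append(f"# Task: {description}\n{code}\n")
    parts.append(f"# Task: {query}\n")  # left open for the model to complete
    return "\n".join(parts)

# Hypothetical demonstrations:
examples = [
    ("add two numbers", "def add(a, b):\n    return a + b"),
    ("reverse a string", "def rev(s):\n    return s[::-1]"),
]

prompt = build_few_shot_prompt(examples, "compute the factorial of n")
print(prompt)
```

The point of the sketch is that the target programming language only appears inside the examples; swap the demonstration code for another language and the same prompt structure works unchanged, which is the sense in which the specific language is just an abstraction layer.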
__________________
Without freedom of speech we wouldn't know who all the idiots are. -- anonymous poster
________________
Because much of what is in the published literature is nonsense,
and much of what isn't nonsense is not in the scientific literature.
-- Sabine Hossenfelder