07-10-2021, 02:33 PM
#14
Master EcoModder
Join Date: Aug 2012
Location: northwest of normal
Posts: 29,002
Quote:
Originally Posted by Piotrsko
Ummmm, so far you only get programmers' bias, the AI isn't sufficiently intelligent yet to self-program like a newborn human baby.
Babies inherit their parents' biases. Programmers are likewise constrained to selecting which training data sets are used.
Quote:
GPT-3 is a neural-network-powered language model. A language model is a model that predicts the likelihood of a sentence existing in the world. For example, a language model can label the sentence “I take my dog for a walk” as more probable to exist (i.e. on the Internet) than the sentence “I take my banana for a walk.” This is true for sentences as well as phrases and, more generally, any sequence of characters.
Like most language models, GPT-3 is elegantly trained on an unlabeled text dataset (in this case, the training data includes among others Common Crawl and Wikipedia). Words or phrases are randomly removed from the text, and the model must learn to fill them in using only the surrounding words as context. It’s a simple training task that results in a powerful and generalizable model.
....
But here’s the really magical part. As a result of its humongous size, GPT-3 can do what no other model can do (well): perform specific tasks without any special tuning. You can ask GPT-3 to be a translator, a programmer, a poet, or a famous author, and it can do it with its user (you) providing fewer than 10 training examples.
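To make the quoted idea of "the likelihood of a sentence" concrete, here is a toy sketch. It is not GPT-3 (the corpus, function name, and smoothing constant are all made up for illustration); it's just a tiny bigram model that scores "dog for a walk" as more probable than "banana for a walk" because of what it has seen in its training text:
Code:
import math
from collections import Counter

# Toy corpus standing in for "text on the Internet".
corpus = [
    "i take my dog for a walk",
    "i walk my dog every day",
    "my dog likes a long walk",
    "i eat a banana for breakfast",
]

# Count single words and word pairs (bigrams) across the corpus.
unigrams = Counter()
bigrams = Counter()
for sentence in corpus:
    words = sentence.split()
    unigrams.update(words)
    bigrams.update(zip(words, words[1:]))

def score(sentence, alpha=0.1):
    """Smoothed bigram log-score: higher means 'more likely to exist'."""
    words = sentence.split()
    vocab = len(unigrams)
    total = 0.0
    for prev, cur in zip(words, words[1:]):
        num = bigrams[(prev, cur)] + alpha
        den = unigrams[prev] + alpha * vocab
        total += math.log(num / den)
    return total

print(score("i take my dog for a walk"))     # higher score
print(score("i take my banana for a walk"))  # lower score ("my banana" never seen)
GPT-3 does the same kind of scoring, just with a neural network and a vastly larger training set instead of a bigram table.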
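And the "fewer than 10 training examples" part just means putting the examples in the prompt itself. A hypothetical sketch of what that looks like (the prompt wording and the send_to_gpt3() stub are my own placeholders, not any particular API):
Code:
# Few-shot prompting: the "training examples" are simply part of the prompt text.
# send_to_gpt3() is a placeholder, not a real library call; substitute whatever
# client you actually use to get a completion from the model.
prompt = """Translate English to French.

English: cheese
French: fromage

English: sea otter
French: loutre de mer

English: I take my dog for a walk.
French:"""

print(prompt)                              # in practice you would send this text to the model
# translation = send_to_gpt3(prompt)       # placeholder call, commented out on purpose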
__________________
.
.Without freedom of speech we wouldn't know who all the idiots are. -- anonymous poster
________________
.
.Because much of what is in the published literature is nonsense,
and much of what isn’t nonsense is not in the scientific literature.
-- Sabine Hossenfelder