Most of those would count as parameters, until you get to the massively parallel, sophisticated pre- and post-processing integrated with computer graphics. Then it turns into a laundry list.
Quote:
https://towardsdatascience.com/gpt-4-will-have-100-trillion-parameters-500x-the-size-of-gpt-3-582b98d82253
GPT-4 Will Have 100 Trillion Parameters — 500x the Size of GPT-3
11 Sept 2021: 100 trillion parameters is a lot. To understand just how big that number is, let's compare it with our brain. The brain has around 80-100 billion neurons (GPT-3's order of magnitude) and around 100 trillion synapses. GPT-4 will have as many parameters as the brain has synapses.
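For a rough sense of the numbers in that quote, here's a quick back-of-the-envelope check (a minimal sketch: GPT-3's 175 billion parameters is the published figure, the 100-trillion GPT-4 count is the article's speculation, and the neuron/synapse counts are the usual rough estimates):

Code:
# Back-of-the-envelope check of the scale comparison in the quoted article.
# GPT-3's 175 billion parameters is the published figure; the 100-trillion
# numbers for GPT-4 and for brain synapses are the article's claims and
# rough neuroscience estimates, not confirmed specs.

GPT3_PARAMS = 175e9           # GPT-3: 175 billion parameters (published)
CLAIMED_GPT4_PARAMS = 100e12  # article's claim: 100 trillion parameters
BRAIN_NEURONS = 86e9          # roughly 80-100 billion neurons
BRAIN_SYNAPSES = 100e12       # commonly cited rough estimate

# ~571x, which the headline rounds to "500x"
print(f"Claimed GPT-4 / GPT-3 ratio: ~{CLAIMED_GPT4_PARAMS / GPT3_PARAMS:.0f}x")
# Same order of magnitude, as the quote says
print(f"GPT-3 params vs brain neurons: ~{GPT3_PARAMS / BRAIN_NEURONS:.1f}x")
# Roughly 1:1, hence "as many parameters as the brain has synapses"
print(f"Claimed GPT-4 params vs synapses: ~{CLAIMED_GPT4_PARAMS / BRAIN_SYNAPSES:.0f}x")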