05-04-2023, 01:01 AM   #157
freebeard
Master EcoModder
 
Join Date: Aug 2012
Location: northwest of normal
Posts: 29,084
Thanks: 8,255
Thanked 9,018 Times in 7,451 Posts
What I don't understand is why a forward-looking individual such as yourself is not having ChatGPT/Stable Diffusion make all your cards for you.

Several hours later...
SparseGPT: Massive Language Models Can Be Accurately Pruned in One-Shot
Quote:
We show for the first time that large-scale generative pretrained transformer (GPT) family models can be pruned to at least 50% sparsity in one-shot, without any retraining, at minimal loss of accuracy. This is achieved via a new pruning method called SparseGPT, specifically designed to work efficiently and accurately on massive GPT-family models.... and can reach 60% unstructured sparsity with negligible increase in perplexity: remarkably, more than 100 billion weights from these models can be ignored at inference time.
__________________
Without freedom of speech we wouldn't know who all the idiots are. -- anonymous poster
What the headline giveth, the last paragraph taketh away. -- Scott Ott

Last edited by freebeard; 05-04-2023 at 02:56 AM.