MIT reveals new, simple model pruning method

This looks interesting…

Now, MIT researchers have found a new and better way to compress models.

It’s so simple that they unveiled it in a tweet last month: train the model, prune its weakest connections, retrain it at the fast learning rate used early in training, and repeat until the model is as tiny as you want.
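The loop above can be sketched in a few lines. This is a hypothetical toy illustration, not the researchers' actual implementation: the "model" is just a list of weights, `train` is a stand-in for real SGD, and all names and parameters (`early_lr`, `fraction`, `rounds`) are assumptions for the sketch.

```python
import random

random.seed(0)

def train(weights, mask, lr, steps=50):
    """Stand-in for training: perturb surviving weights at learning rate lr.
    In practice this would be SGD on the real loss."""
    for _ in range(steps):
        for i in range(len(weights)):
            if mask[i]:
                weights[i] += lr * random.uniform(-1, 1)
    return weights

def prune_smallest(weights, mask, fraction):
    """Zero out the smallest-magnitude surviving connections."""
    alive = sorted((abs(weights[i]), i) for i in range(len(weights)) if mask[i])
    for _, i in alive[:int(len(alive) * fraction)]:
        mask[i] = 0
        weights[i] = 0.0
    return mask

def prune_with_lr_rewinding(weights, early_lr=0.1, fraction=0.2, rounds=5):
    """Train, prune the weakest connections, retrain with the learning rate
    rewound to its fast early-training value, and repeat."""
    mask = [1] * len(weights)
    train(weights, mask, early_lr)               # initial training
    for _ in range(rounds):
        prune_smallest(weights, mask, fraction)  # drop the weakest fraction
        train(weights, mask, early_lr)           # retrain at the early LR
    return weights, mask

weights = [random.uniform(-1, 1) for _ in range(100)]
weights, mask = prune_with_lr_rewinding(weights)
print(sum(mask))  # number of connections that survive five pruning rounds
```

The key detail is in the last argument to `train` inside the loop: instead of continuing at the small late-stage learning rate, each retraining pass rewinds to the large early-training rate.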

“That’s it,” says Alex Renda, a PhD student at MIT. “The standard things people do to prune their models are crazy complicated.”