notable: they ripped out the silicon that supports training
they say: “it’s the age of inference”
which, yeah, RL is mostly inference. Continual learning is almost all inference. Ambient agents and general-audience adoption are driving fast-growing inference demand overall
kartik343.wixstudio.com/blogorithm/p...
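To put a rough number on the "RL is mostly inference" point, here's a back-of-envelope sketch. It's not describing this chip or any measured system; the model size, rollout length, and utilization figures are all assumptions picked to show the shape of the argument: autoregressive decoding is memory-bound and slow per FLOP, so rollouts can dominate the wall-clock of an RL step even though the gradient update costs more raw FLOPs.

```python
# Back-of-envelope split of one RL step on a decoder-only LLM.
# Every number is an illustrative assumption, not a measurement of any system:
# ~2*P FLOPs per token for a forward pass, ~3x that for forward + backward,
# and autoregressive decoding running at far lower utilization than training.

P = 70e9                  # assumed parameter count (hypothetical 70B model)
toks_per_episode = 4_096  # assumed prompt + generated tokens per rollout
episodes = 256            # assumed rollouts collected per policy update

rollout_flops = episodes * toks_per_episode * 2 * P  # generation: forward passes only
update_flops = episodes * toks_per_episode * 6 * P   # gradient step: forward + backward

peak = 1e15          # assumed accelerator peak FLOP/s (hypothetical)
mfu_decode = 0.10    # assumed utilization while decoding (memory-bandwidth-bound)
mfu_train = 0.45     # assumed utilization during the update (compute-bound)

rollout_time = rollout_flops / (peak * mfu_decode)
update_time = update_flops / (peak * mfu_train)

share = rollout_time / (rollout_time + update_time)
print(f"rollout (inference) share of wall-clock: {share:.0%}")  # ~60% under these guesses
```

And that's before sampling several completions per prompt or discarding low-reward rollouts before the update, both of which push the inference share higher still.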
removing MXUs lets them reallocate transistor budget to inference tasks
they also dropped fp64 support; they're all in on AI, so this thing isn't going to be very useful for traditional supercomputing workloads
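For context on why dropping fp64 closes the door on traditional HPC, a toy numpy example (nothing to do with the chip itself) of the accumulation error that makes classic numerical codes lean on fp64, while low-precision inference doesn't care:

```python
import numpy as np

# Near 1e8 the gap between adjacent fp32 values is 8, so adding 1 is simply lost.
print(np.float32(1e8) + np.float32(1.0) == np.float32(1e8))  # True
print(np.float64(1e8) + np.float64(1.0))                     # 100000001.0

# Naive running sums drift in fp32 once the accumulator dwarfs each addend;
# fp64 stays essentially exact over the same loop.
acc32, acc64 = np.float32(0.0), np.float64(0.0)
for _ in range(1_000_000):
    acc32 += np.float32(0.001)
    acc64 += 0.001
print(acc32, acc64)  # acc32 lands noticeably off 1000.0, acc64 does not
```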