Google Research Pushing Neural Networks Out of the Datacenter
Google has been at the bleeding edge of AI hardware development with the arrival of its Tensor Processing Unit (TPU) and other system-scale changes designed to make large-scale neural network processing efficient and fast.
But just as these developments come to fruition, advances in trimmed-down deep learning could move many more machine learning training and inference operations out of the datacenter and into your palm.
Although it might be natural to assume that neural networks cannot run on devices like smartphones because of limited CPU power, the real challenge lies in the vastness of model sizes relative to hardware memory …
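To make the memory argument concrete, here is a minimal sketch (not from the article) of a back-of-the-envelope calculation: a model's weight footprint is roughly parameter count times bytes per parameter, which is why trimming precision or parameter count matters more for on-device deployment than raw CPU speed. The parameter figure below (VGG-16 scale, ~138M) is an illustrative assumption.

```python
def model_size_mb(num_params: int, bytes_per_param: int = 4) -> float:
    """Approximate in-memory size of a model's weights in megabytes."""
    return num_params * bytes_per_param / (1024 ** 2)

# An image-classification network with roughly 138M parameters (VGG-16 scale,
# an illustrative assumption, not a model named in the article):
full_precision = model_size_mb(138_000_000, bytes_per_param=4)  # 32-bit floats
quantized = model_size_mb(138_000_000, bytes_per_param=1)       # 8-bit weights

print(f"fp32: {full_precision:.0f} MB, int8: {quantized:.0f} MB")
# → fp32: 526 MB, int8: 132 MB
```

Even the quantized figure excludes activations and runtime overhead, so a half-gigabyte full-precision model simply does not fit comfortably in a phone's memory budget.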
Google Research Pushing Neural Networks Out of the Datacenter was written by Nicole Hemsoth at The Next Platform.