The Golden Grail: Automatic Distributed Hyperparameter Tuning
While it may not be front and center in AI conversations, efficient hyperparameter tuning for neural network training is a tough problem. Some tools aim to automate the process, but for most users it remains a cumbersome area, and one that can lead to poor performance when not done properly.
The difficulty in building automatic tuning tools is that many machine learning workloads depend on the dataset and the conditions of the problem being solved. For instance, some users might accept lower accuracy in exchange for a speedup or efficiency gain …
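To make that trade-off concrete, here is a minimal, hypothetical sketch of what such user-dependent tuning can look like: a random search over hyperparameters where each trial is scored by a user-chosen blend of accuracy and speed. The `evaluate` function, its formulas, and the `time_weight` parameter are all illustrative stand-ins, not any real tool or model.

```python
import random

def evaluate(lr, batch_size):
    """Stand-in for a real training run: returns (accuracy, seconds).
    Both formulas are synthetic placeholders, not real model behavior."""
    accuracy = 1.0 - abs(lr - 0.01) * 10 - 0.0001 * batch_size
    seconds = 100.0 / batch_size
    return accuracy, seconds

def tune(trials=50, time_weight=0.2, seed=0):
    """Random search scoring each trial by accuracy minus a user-weighted
    time penalty, so 'best' depends on the user's goals."""
    rng = random.Random(seed)
    best_score, best_cfg = float("-inf"), None
    for _ in range(trials):
        lr = 10 ** rng.uniform(-4, -1)          # log-uniform learning rate
        batch_size = rng.choice([32, 64, 128, 256])
        acc, secs = evaluate(lr, batch_size)
        score = acc - time_weight * secs        # penalize slow configurations
        if score > best_score:
            best_score, best_cfg = score, (lr, batch_size)
    return best_cfg, best_score
```

A user who cares mostly about accuracy would set `time_weight` near zero; one who cares about throughput would raise it, and the same search would return a different winner.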
The Golden Grail: Automatic Distributed Hyperparameter Tuning was written by Nicole Hemsoth at The Next Platform.
