Parallel computing with automated processes and distributed training

Tips & Tricks

Libraries using Ray

RaySGD

Distributed training
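RaySGD wraps distributed PyTorch training behind a trainer API. As a library-free sketch of the underlying pattern (my own toy example, not RaySGD's API), here is synchronous data-parallel SGD: each worker computes a gradient on its data shard, the gradients are averaged, and every replica applies the same update. RaySGD runs the per-worker step in parallel across processes or machines.

```python
# Library-free sketch of synchronous data-parallel SGD, the pattern
# RaySGD automates across machines. Fits y = w * x by splitting the
# data across "workers" and averaging their gradients each step.

def grad(w, shard):
    # dL/dw for mean squared error on one worker's data shard
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def train(data, n_workers=4, lr=0.01, steps=100):
    shards = [data[i::n_workers] for i in range(n_workers)]
    w = 0.0
    for _ in range(steps):
        grads = [grad(w, s) for s in shards]  # RaySGD runs these in parallel
        w -= lr * sum(grads) / len(grads)     # all replicas apply the averaged gradient
    return w

data = [(x, 3.0 * x) for x in range(1, 9)]    # ground truth: w = 3
w = train(data)
print(round(w, 3))  # → 3.0
```

Because every replica sees the same averaged gradient, the result matches single-machine SGD on the full dataset; the parallelism only changes where the per-shard work runs.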

Tune

Hyperparameter tuning
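Tune schedules many training trials with different hyperparameter configurations and tracks the best result. As a hedged stand-in using only the standard library (the objective and search ranges below are invented for illustration), this is the random-search loop that Tune automates and parallelizes:

```python
import random

# Minimal random-search sketch of what Ray Tune automates: sample
# hyperparameter configs, evaluate each trial, keep the best result.
# The objective is a toy stand-in for a real training run.

def objective(config):
    # pretend "validation loss": minimized at lr=0.1, momentum=0.9
    return (config["lr"] - 0.1) ** 2 + (config["momentum"] - 0.9) ** 2

def random_search(num_samples=50, seed=0):
    rng = random.Random(seed)
    best_config, best_loss = None, float("inf")
    for _ in range(num_samples):  # Tune would launch these trials in parallel
        config = {"lr": rng.uniform(0.001, 1.0),
                  "momentum": rng.uniform(0.0, 1.0)}
        loss = objective(config)
        if loss < best_loss:
            best_config, best_loss = config, loss
    return best_config, best_loss

best_config, best_loss = random_search()
print(best_config, best_loss)
```

Tune adds what this loop lacks: distributed execution of trials across a cluster, early stopping of bad trials, and smarter search algorithms than uniform sampling.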

Documentation

Ray - Ray 0.8.4 documentation

Others

Faster and Cheaper PyTorch with RaySGD

How to scale Python multiprocessing to a cluster with one line of code

Securely connecting to VM instances | Compute Engine Documentation

Another interesting package that uses Ray as a backend:

Scale your pandas workflow by changing a single line of code - Modin 0.7.3 documentation