Ray Tune
Fast and easy distributed hyperparameter tuning
Ray Tune is a Python library for fast hyperparameter tuning at scale. Easily distribute your trial runs to quickly find the best hyperparameters.

Why Ray Tune?
Just a few of the many capabilities that set Ray Tune apart from other hyperparameter optimization libraries.
State of the art algorithms
Maximize model performance and minimize training costs by using the latest algorithms, such as PBT, HyperBand, and ASHA.
Library agnostic
Ray Tune supports all the popular machine learning frameworks, including PyTorch, TensorFlow, XGBoost, LightGBM, and Keras — use your favorite!
Built-in distributed mode
With built-in multi-GPU and multi-node support, and seamless fault tolerance, easily parallelize your hyperparameter search jobs.
Power up existing workflows
Have an existing workflow in another library like HyperOpt and Ax? Integrate Ray Tune to improve performance with minimal code changes.
10x your productivity
Start using Ray Tune by changing just a couple of lines of code. Enjoy simpler code, automatic checkpointing, and integrations with tools like MLflow and TensorBoard.
Hooks into the Ray ecosystem
Use Ray Tune on its own, or combine it with other Ray libraries such as XGBoost-Ray and RLlib.
Try it yourself
Install Ray Tune with pip install "ray[tune]" and give this example a try.
from ray import tune


def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100) ** (-1) + beta * 0.1


def training_function(config):
    # Hyperparameters
    alpha, beta = config["alpha"], config["beta"]
    for step in range(10):
        # Iterative training function - can be any arbitrary training procedure.
        intermediate_score = objective(step, alpha, beta)
        # Feed the score back to Tune.
        tune.report(mean_loss=intermediate_score)


analysis = tune.run(
    training_function,
    config={
        "alpha": tune.grid_search([0.001, 0.01, 0.1]),
        "beta": tune.choice([1, 2, 3])
    })

print("Best config: ", analysis.get_best_config(
    metric="mean_loss", mode="min"))

# Get a dataframe for analyzing trial results.
df = analysis.results_df

See Ray Tune in action
See how engineers and scientists like you are using Ray Tune to accelerate their hyperparameter search to tackle challenging goals like ecosystem restoration and demand forecasting.

Dendra Systems
Dendra Systems has a bold mission: planting 1 trillion trees using drones. Learn how Ray Tune is powering that mission.

Anastasia.ai
Learn how Anastasia realized 9x speedup and 87% cost reduction on their demand forecasting use case with Ray Tune.

LinkedIn improved member engagement with a superior Network Quality Service prediction model. See their path to 2x faster training with Ray Tune.
Scale other workloads with Ray
Expand your Ray journey beyond hyperparameter tuning and bring easy scaling to the other pieces of your machine learning pipelines, such as data processing, training, and serving.
O'Reilly Learning Ray Book
Get your free copy of early release chapters of Learning Ray, the first and only comprehensive book on Ray and its ecosystem, authored by members of the Ray engineering team.
