Why Ray Tune?

Here are just a few of the many capabilities that set Ray Tune apart from other hyperparameter optimization libraries.


State of the art algorithms

Maximize model performance and minimize training costs by using the latest algorithms such as PBT, HyperBand, ASHA, and more.
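
For example, here is a minimal sketch of attaching the ASHA scheduler so that underperforming trials are stopped early. It uses the 1.x-style tune.run API shown further down this page, with a toy trainable standing in for a real training loop.

from ray import tune
from ray.tune.schedulers import ASHAScheduler

def trainable(config):
    # Toy training loop standing in for any real training procedure.
    for step in range(10):
        tune.report(mean_loss=(0.1 + config["alpha"] * step) ** (-1) + config["beta"] * 0.1)

analysis = tune.run(
    trainable,
    config={"alpha": tune.uniform(0.001, 0.1), "beta": tune.choice([1, 2, 3])},
    num_samples=20,
    # ASHA aggressively terminates trials whose reported mean_loss lags behind.
    scheduler=ASHAScheduler(metric="mean_loss", mode="min"),
)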


Library agnostic

Ray Tune supports all the popular machine learning frameworks, including PyTorch, TensorFlow, XGBoost, LightGBM, and Keras — use your favorite!
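
As an illustration, here is a hedged sketch of the same reporting pattern wrapped around a small PyTorch training loop; the model and random batch are stand-ins, not a real workload, and any framework with a training loop works the same way.

from ray import tune
import torch

def train_torch(config):
    model = torch.nn.Linear(10, 1)                      # stand-in model
    optimizer = torch.optim.SGD(model.parameters(), lr=config["lr"])
    for epoch in range(10):
        x, y = torch.randn(32, 10), torch.randn(32, 1)  # stand-in batch
        loss = torch.nn.functional.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        tune.report(loss=loss.item())                   # same reporting API regardless of framework

analysis = tune.run(train_torch, config={"lr": tune.loguniform(1e-4, 1e-1)})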


Built-in distributed mode

With built-in multi-GPU and multi-node support and seamless fault tolerance, you can easily parallelize your hyperparameter search jobs.
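
A minimal sketch of what that looks like in code (1.x-style API, toy trainable, and it assumes GPUs are actually available in the cluster): requesting resources per trial lets Tune schedule trials in parallel across whatever CPUs, GPUs, and nodes Ray provides.

from ray import tune

def trainable(config):
    # Toy objective standing in for a real training loop.
    tune.report(mean_loss=(config["alpha"] - 0.05) ** 2 + config["beta"] * 0.1)

analysis = tune.run(
    trainable,
    config={"alpha": tune.uniform(0.001, 0.1), "beta": tune.choice([1, 2, 3])},
    num_samples=8,
    # Each trial gets 2 CPUs and 1 GPU; Tune runs as many trials concurrently
    # as the cluster's resources allow.
    resources_per_trial={"cpu": 2, "gpu": 1},
)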


Power up existing workflows

Have an existing workflow in another library like HyperOpt or Ax? Integrate Ray Tune to improve performance with minimal code changes.
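
For instance, here is a hedged sketch of plugging HyperOpt in as Tune's search algorithm. The import path matches Ray 1.x and it requires hyperopt to be installed; the trainable is again a toy stand-in.

from ray import tune
from ray.tune.suggest.hyperopt import HyperOptSearch

def trainable(config):
    # Toy objective standing in for a real training loop.
    tune.report(mean_loss=(config["alpha"] - 0.05) ** 2 + config["beta"] * 0.1)

analysis = tune.run(
    trainable,
    config={"alpha": tune.uniform(0.001, 0.1), "beta": tune.choice([1, 2, 3])},
    # HyperOpt proposes the configurations; Tune handles scheduling and execution.
    search_alg=HyperOptSearch(metric="mean_loss", mode="min"),
    num_samples=20,
)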


10x your productivity

Start using Ray Tune by changing just a couple of lines of code. Enjoy simpler code, automatic checkpointing, and integrations with tools like MLflow and TensorBoard.
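
As a sketch (import path as in Ray 1.x, toy trainable, and it assumes an MLflow tracking server or local file store is available), results can be mirrored to MLflow with a single callback, while TensorBoard log files are written to each trial's log directory automatically.

from ray import tune
from ray.tune.integration.mlflow import MLflowLoggerCallback

def trainable(config):
    # Toy objective standing in for a real training loop.
    tune.report(mean_loss=(config["alpha"] - 0.05) ** 2 + config["beta"] * 0.1)

analysis = tune.run(
    trainable,
    config={"alpha": tune.uniform(0.001, 0.1), "beta": tune.choice([1, 2, 3])},
    # Mirrors trial parameters and reported metrics to an MLflow experiment.
    callbacks=[MLflowLoggerCallback(experiment_name="tune_demo")],
)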


Hooks into the Ray ecosystem

Use Ray Tune on its own, or combine it with other Ray libraries such as XGBoost-Ray and RLlib.
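
For example, here is a hedged sketch of tuning an RLlib algorithm (it assumes RLlib is installed via pip install "ray[rllib]"): Tune accepts the registered trainable name "PPO" directly, so the reinforcement learning loop itself needs no extra code.

from ray import tune

analysis = tune.run(
    "PPO",                                       # RLlib's PPO trainable, referenced by name
    config={
        "env": "CartPole-v1",
        "lr": tune.grid_search([1e-4, 1e-3]),    # sweep the learning rate
    },
    # Stop each trial once it reaches a mean episode reward of 150.
    stop={"episode_reward_mean": 150},
)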

Try it yourself

Install Ray Tune with pip install "ray[tune]" and give this example a try.

from ray import tune

def objective(step, alpha, beta):
    # Toy objective whose score improves as training progresses.
    return (0.1 + alpha * step / 100)**(-1) + beta * 0.1

def training_function(config):
    # Hyperparameters
    alpha, beta = config["alpha"], config["beta"]
    for step in range(10):
        # Iterative training function - can be any arbitrary training procedure.
        intermediate_score = objective(step, alpha, beta)
        # Feed the score back to Tune.
        tune.report(mean_loss=intermediate_score)

analysis = tune.run(
    training_function,
    config={
        "alpha": tune.grid_search([0.001, 0.01, 0.1]),
        "beta": tune.choice([1, 2, 3])
    })

print("Best config: ", analysis.get_best_config(
    metric="mean_loss", mode="min"))

# Get a dataframe for analyzing trial results.
df = analysis.results_df

See Ray Tune in action

See how engineers and scientists like you are using Ray Tune to accelerate hyperparameter search and tackle challenging goals like ecosystem restoration and demand forecasting.


Dendra Systems

Dendra Systems has a bold mission: planting 1 trillion trees using drones. Learn how Ray Tune is powering that mission.

Watch the video

Anastasia.ai

Learn how Anastasia realized 9x speedup and 87% cost reduction on their demand forecasting use case with Ray Tune.

Read the story

LinkedIn

LinkedIn improved member engagement with a superior Network Quality Service prediction model. See their path to 2x faster training with Ray Tune.

Watch the video

Scale other workloads with Ray

Expand your Ray journey beyond hyperparameter tuning and bring easy scale to other pieces — such as data processing, training, and serving — of your machine learning pipelines.

Ray Train

Scalable deep learning

Ray Serve

Scale model serving

Ray Datasets

Scale data loading and collections use cases

O'Reilly Learning Ray Book

Get your free copy of the early release chapters of Learning Ray, the first and only comprehensive book on Ray and its ecosystem, authored by members of the Ray engineering team.
