Useful resources
Academic papers
Li, Lisha, et al. "Hyperband: A novel bandit-based approach to hyperparameter optimization." The Journal of Machine Learning Research 18.1 (2017): 6765-6816.
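W&B Sweeps support Hyperband-style early termination of poorly performing runs. As a minimal sketch of where this fits in a sweep configuration (the metric name `val_loss`, the parameter range, and the `min_iter` value are illustrative assumptions, not values from the paper):

```python
# Sketch of a W&B sweep configuration with Hyperband early termination
# (Li et al., 2017). Metric name, parameter range, and min_iter are
# placeholder assumptions for illustration.
sweep_config = {
    "method": "random",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "learning_rate": {"min": 0.0001, "max": 0.1},
    },
    "early_terminate": {
        "type": "hyperband",
        "min_iter": 3,  # earliest iteration at which a run may be stopped
    },
}
```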
Sweep Experiments
The following W&B Reports demonstrate examples of projects that explore hyperparameter optimization with W&B Sweeps.
- Drought Watch Benchmark Progress
  - Description: Developing the baseline and exploring submissions to the Drought Watch benchmark.
- Tuning Safety Penalties in Reinforcement Learning
  - Description: We examine agents trained with different side effect penalties on three different tasks: pattern creation, pattern removal, and navigation.
- Meaning and Noise in Hyperparameter Search with W&B, by Stacey Svetlichnaya
  - Description: How do we distinguish signal from pareidolia (imaginary patterns)? This article showcases what is possible with W&B and aims to inspire further exploration.
- Who is Them? Text Disambiguation with Transformers
  - Description: Using Hugging Face to explore models for natural language understanding.
- DeepChem: Molecular Solubility
  - Description: Predict chemical properties from molecular structure with random forests and deep nets.
- Intro to MLOps: Hyperparameter Tuning
  - Description: Explore why hyperparameter optimization matters and look at three algorithms to automate hyperparameter tuning for your machine learning models.
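The reports above all build on the same basic sweep workflow: define a search configuration, register it with `wandb.sweep`, and let `wandb.agent` call a training function once per run. Below is a minimal sketch of that loop; the quadratic objective, parameter range, and project name are placeholders standing in for a real training script:

```python
import wandb

def train():
    # wandb.agent invokes this once per run; the sweep's chosen
    # hyperparameters arrive via the run's config.
    with wandb.init() as run:
        lr = run.config.learning_rate
        # Placeholder objective; a real script would train a model
        # here and log its validation metric instead.
        run.log({"val_loss": (lr - 0.01) ** 2})

sweep_config = {
    "method": "bayes",  # other supported methods: "grid", "random"
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {"learning_rate": {"min": 0.0001, "max": 0.1}},
}

sweep_id = wandb.sweep(sweep_config, project="my-project")  # placeholder project name
wandb.agent(sweep_id, function=train, count=10)
```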
How-to guides
The following how-to guide demonstrates how to solve real-world problems with W&B:
- Sweeps with XGBoost
  - Description: How to use W&B Sweeps for hyperparameter tuning using XGBoost.
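As a rough sketch of what such a guide covers (not the guide's exact code), the snippet below sweeps over two XGBoost hyperparameters on scikit-learn's breast cancer dataset; the parameter ranges, run count, and project name are assumptions for illustration:

```python
import wandb
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

def train():
    with wandb.init() as run:
        X, y = load_breast_cancer(return_X_y=True)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        # Build the model from the sweep's hyperparameters for this run.
        model = xgb.XGBClassifier(
            max_depth=run.config.max_depth,
            learning_rate=run.config.learning_rate,
            n_estimators=100,
        )
        model.fit(X_tr, y_tr)
        acc = accuracy_score(y_te, model.predict(X_te))
        run.log({"accuracy": acc})

sweep_config = {
    "method": "random",
    "metric": {"name": "accuracy", "goal": "maximize"},
    "parameters": {
        "max_depth": {"values": [3, 5, 7]},
        "learning_rate": {"min": 0.01, "max": 0.3},
    },
}

sweep_id = wandb.sweep(sweep_config, project="xgboost-sweep")  # placeholder project name
wandb.agent(sweep_id, function=train, count=5)
```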
Sweep GitHub repository
W&B advocates open source and welcomes contributions from the community. Find the GitHub repository at https://github.com/wandb/sweeps. For information on how to contribute to the W&B open source repo, see the W&B GitHub contribution guidelines.