
Commit 08f9f99

paschai authored and facebook-github-bot committed

Add scalarized objective recipe (facebook#3529)

Summary: as titled - any edits are greatly appreciated! Differential Revision: D71403615

1 parent a3adfb3 commit 08f9f99

2 files changed, +60 -0 lines changed

docs/recipes/scalarized-objective.md

+59
@@ -0,0 +1,59 @@
# Scalarized Objective Optimizations with Ax

In some cases, you may want to optimize a linear combination of multiple metrics rather than a single metric. This is where scalarized objectives come into the picture. You can define an objective function that is a weighted sum of several metrics, allowing you to balance different aspects of performance in your optimization.

Scalarized objectives are useful when you have multiple metrics that you want to consider simultaneously in your optimization process. By assigning weights to each metric, you can control their relative importance in the overall objective function.
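To make the weighting concrete, here is a minimal plain-Python sketch (not Ax code; the metric names are illustrative) of how a weighted sum collapses several metrics into a single objective value. Ax performs this combination for you; the sketch only shows what the weights mean:

```python
def scalarize(metrics: dict[str, float], weights: dict[str, float]) -> float:
    """Collapse multiple metric values into one weighted sum."""
    return sum(weights[name] * value for name, value in metrics.items())

# Weighting objective1 twice as heavily as objective2:
score = scalarize(
    metrics={"objective1": 0.5, "objective2": 1.0},
    weights={"objective1": 2.0, "objective2": 1.0},
)  # 2.0 * 0.5 + 1.0 * 1.0 == 2.0
```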
## Setup

Before we begin, you must instantiate the `Client` and configure it with your experiment and metrics.

We will also assume you are already familiar with [using Ax for ask-tell optimization](#).

```python
client = Client()

client.configure_experiment(...)
client.configure_metrics(...)
```
## Steps

1. Configure an optimization with a scalarized objective
2. Continue with iterating over trials and evaluating them
3. Observe optimal parametrizations
### 1. Configure an optimization with a scalarized objective

We can leverage the Client's `configure_optimization` method to configure a scalarized objective optimization. This method takes in an objective goal as a string, and can be used to specify single-objective, scalarized-objective, and multi-objective goals. For this recipe, we will use a scalarized-objective goal:

```python
client.configure_optimization(objective="2 * objective1 + objective2")
```

In this example, we are optimizing a linear combination of two objectives, `objective1` and `objective2`, and we value improvements to `objective1` twice as much as improvements to `objective2`.

By default, objectives are assumed to be maximized. If you want to minimize an objective, you can prepend the objective with a `-`.
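As a plain-Python sketch of this sign convention (not Ax code), a hypothetical objective string like `"2 * objective1 - objective2"` corresponds to maximizing the following quantity, which rewards `objective1` and penalizes (i.e. effectively minimizes) `objective2`:

```python
def scalarized_value(objective1: float, objective2: float) -> float:
    # Maximizing this value rewards objective1 (weight 2) and, via the
    # minus sign, drives objective2 toward smaller values.
    return 2.0 * objective1 - 1.0 * objective2
```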
### 2. Continue with iterating over trials and evaluating them

Now that your experiment has been configured for a scalarized-objective optimization, you can simply continue iterating over trials and evaluating them as you typically would.

```python
trial_idx, parameters = client.get_next_trials().popitem()
client.complete_trial(...)
```
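To build intuition for how a scalarized objective ranks candidates during this loop, here is a self-contained toy (no Ax involved; the candidates and metric functions are made up) that scores a few parametrizations under the 2:1 weighting and keeps the best:

```python
# Hypothetical candidate parametrizations and metric functions.
candidates = [{"x": 0.1}, {"x": 0.5}, {"x": 0.9}]

def objective1(params: dict) -> float:
    return params["x"]           # made-up metric: prefers large x

def objective2(params: dict) -> float:
    return 1.0 - params["x"]     # made-up metric: prefers small x

def scalarized(params: dict) -> float:
    # Same weighting as the recipe: objective1 counts twice as much.
    return 2.0 * objective1(params) + objective2(params)

best = max(candidates, key=scalarized)  # {"x": 0.9}
```

In a real experiment Ax's surrogate model plays the role of this exhaustive scoring, proposing new trials that it expects to improve the scalarized value.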
### 3. Observe optimal parametrizations

You can now observe the optimal parametrizations by calling `get_best_parameterization()`. The function returns a list of tuples containing the best parameters, their corresponding metric values, the most recent trial that ran them, and the name of the best arm.

```python
best_parameterization = client.get_best_parameterization()
for parameters, metrics, trial_index, arm_name in best_parameterization:
    ...
```
## Learn more

Take a look at these other resources to continue your learning:

- [Multi-objective Optimizations in Ax](#)
- [Set outcome constraints](#)

website/sidebars.js

+1

@@ -57,6 +57,7 @@ export default {
      'recipes/experiment-to-json',
      'recipes/experiment-to-sqlite',
      'recipes/multi-objective-optimization',
+     'recipes/scalarized-objective',
    ],
  },
};
