Tutorial: Your first optimization

Every piece of code starts with some imports, right?

import numpy as np
from opytimark.markers.n_dimensional import Sphere

from opytimizer import Opytimizer
from opytimizer.core import Function
from opytimizer.optimizers.swarm import PSO
from opytimizer.spaces import SearchSpace

First of all, let us define a random seed for experimental consistency.

# Random seed for experimental consistency
np.random.seed(0)

Let us instantiate the first required class. We will need a search space to hold all of this, right? We need the number of agents, the number of decision variables, and the lower and upper bounds for each variable.

# Number of agents and decision variables
n_agents = 20
n_variables = 2

# Lower and upper bounds (has to be the same size as `n_variables`)
lower_bound = [-10, -10]
upper_bound = [10, 10]

# Creates the SearchSpace class
space = SearchSpace(n_agents, n_variables, lower_bound, upper_bound)
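If you want to sanity-check the space, here is a minimal sketch, assuming SearchSpace exposes an `agents` list whose positions have already been initialized within the given bounds (attribute names are assumptions, not confirmed by this tutorial):

# Inspects the first agent's position (assumed `agents` and `position` attributes)
print(space.agents[0].position)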

So, we have a search space, but what is missing? Yes! We want to optimize, so we will need an optimizer.

# Creates the PSO optimizer
optimizer = PSO()
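PSO works out of the box with its default settings, but you can also tweak its behaviour. A hedged sketch, assuming the constructor accepts a `params` dictionary with the inertia weight `w` and the acceleration constants `c1` and `c2`:

# Creates PSO with custom hyperparameters (assumed `params` signature)
optimizer = PSO(params={'w': 0.7, 'c1': 1.7, 'c2': 1.7})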

We want to optimize things, right? You can define whatever objective you desire. In this case, we will use the Sphere function, a simple sum of x^2 over a vector of n variables, pre-loaded from the opytimark package. With an objective function in hand, we can create a Function object.

# Creates the Function object
function = Function(Sphere())
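If you prefer to optimize your own objective instead of a pre-loaded benchmark, a minimal sketch would look like the one below, assuming Function accepts any callable that maps a position array to a scalar fitness value:

# A plain Python callable used as the objective
def custom_sphere(x):
    # Sums the squared decision variables
    return np.sum(x ** 2)

custom_function = Function(custom_sphere)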

Finally, we can plug everything into our task handler, Opytimizer. With this class in hand, just call start() to run the optimization.

# Bundles every piece into the Opytimizer class
opt = Opytimizer(space, optimizer, function)

# Runs the optimization task
opt.start(n_iterations=100)
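After the task finishes, you will probably want to look at the results. A hedged sketch, assuming the best agent found so far is tracked on the space with `position` and `fit` attributes (names are assumptions):

# Retrieves the best agent found during the optimization (assumed attributes)
best_agent = opt.space.best_agent

print(f'Best position: {best_agent.position}')
print(f'Best fitness: {best_agent.fit}')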

There you go! Just put these instructions sequentially in a single file and run it. You can also find the complete script in examples/applications/single_objective/standard_optimization.py. Stay focused and you should be ready for anything.
