Merge pull request #639 from Epistimio/release-v0.1.16rc1
Release v0.1.16rc1
bouthilx authored Aug 23, 2021
2 parents 9e6f283 + 40a0659 commit 6bc3b79
Showing 59 changed files with 2,135 additions and 541 deletions.
213 changes: 52 additions & 161 deletions docs/src/user/algorithms.rst
Original file line number Diff line number Diff line change
@@ -56,9 +56,10 @@ Configuration
seed: null
``seed``
.. autoclass:: orion.algo.random.Random
:noindex:
:exclude-members: space, state_dict, set_state, suggest, observe, is_done, seed_rng

Seed for the random number generator used to sample new trials. Default is ``None``.
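As an illustrative sketch (not Orion's internal implementation), seeding the sampler is what makes suggested trials reproducible; the helper name below is hypothetical:

```python
import numpy as np

def sample_uniform_trials(low, high, n_trials, seed=None):
    """Hypothetical helper: draw n_trials points uniformly over [low, high).

    With seed=None the stream is nondeterministic; with a fixed integer
    seed, repeated calls produce identical suggestions.
    """
    rng = np.random.RandomState(seed)
    return rng.uniform(low, high, size=n_trials)

# Two runs with the same seed suggest identical trials.
a = sample_uniform_trials(0.0, 1.0, 5, seed=42)
b = sample_uniform_trials(0.0, 1.0, 5, seed=42)
assert np.allclose(a, b)
```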

.. _grid-search:

@@ -95,11 +96,12 @@ Configuration
n_values: 100
``n_values``
.. autoclass:: orion.algo.gridsearch.GridSearch
:noindex:
:exclude-members: space, state_dict, set_state, suggest, observe, is_done, seed_rng,
configuration, requires_dist, requires_type, build_grid


Number of different values to use for each dimension to build the grid. Can be either

1. An integer. The same number will be used for all dimensions.
2. A dictionary mapping dimension names to integers. Each dimension will have its own number of values.
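For illustration, a grid with per-dimension value counts can be sketched with ``itertools.product``; the helper below is hypothetical and not Orion's actual ``GridSearch``:

```python
from itertools import product

def build_grid(dim_values):
    """Sketch: dim_values maps each dimension name to its list of values.

    Returns one dict per grid point, i.e. the Cartesian product of all
    per-dimension value lists.
    """
    names = sorted(dim_values)
    return [
        dict(zip(names, combo))
        for combo in product(*(dim_values[n] for n in names))
    ]

# e.g. 3 values for 'lr' and 2 for 'momentum' (dictionary form of n_values)
grid = build_grid({"lr": [0.001, 0.01, 0.1], "momentum": [0.8, 0.9]})
assert len(grid) == 6  # 3 * 2 grid points
```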

.. _hyperband-algorithm:

@@ -152,16 +154,13 @@ Configuration
algorithms. See :ref:`StubParallelStrategy` for more information.


``seed``

Seed for the random number generator used to sample new trials. Default is ``None``.

``repetitions``
.. autoclass:: orion.algo.hyperband.Hyperband
:noindex:
:exclude-members: space, state_dict, set_state, suggest, observe, is_done, seed_rng,
configuration, sample_from_bracket, append_brackets, create_bracket,
create_brackets, promote, register_samples, sample, seed_brackets,
executed_times

Number of executions for Hyperband. A single execution of Hyperband takes a finite
budget of ``(log(R)/log(eta) + 1) * (log(R)/log(eta) + 1) * R``, and ``repetitions`` allows you
to run multiple executions of Hyperband. Default is ``numpy.inf`` which means to run Hyperband
until no new trials can be suggested.
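The budget formula above can be checked numerically. This is an illustrative sketch (not Orion code) that assumes ``R`` is the maximum fidelity and ``eta`` the reduction factor:

```python
import math

def hyperband_budget(R, eta):
    """Finite budget of a single Hyperband execution, per the formula
    (log(R)/log(eta) + 1) * (log(R)/log(eta) + 1) * R."""
    s = math.log(R) / math.log(eta) + 1
    return s * s * R

# R=81, eta=3: (log(81)/log(3) + 1)^2 * 81 = 5^2 * 81 = 2025
assert abs(hyperband_budget(81, 3) - 2025) < 1e-6
```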


.. _ASHA:
@@ -220,36 +219,19 @@ Configuration
Notice the additional ``strategy`` in configuration which is not mandatory for most other
algorithms. See :ref:`StubParallelStrategy` for more information.


``seed``

Seed for the random number generator used to sample new trials. Default is ``None``.


``num_rungs``

Number of rungs for the largest bracket. If not defined, it will be equal to ``(base + 1)`` of the
fidelity dimension. In the original paper,
``num_rungs == log(fidelity.high/fidelity.low) / log(fidelity.base) + 1``.
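As a quick numerical check of the default formula above (an illustrative sketch, not Orion's code):

```python
import math

def default_num_rungs(low, high, base):
    """num_rungs per the original paper's formula; assumes base > 1."""
    return round(math.log(high / low) / math.log(base)) + 1

# Fidelity with low=1, high=16, base=2: log(16)/log(2) + 1 = 5 rungs
assert default_num_rungs(1, 16, 2) == 5
```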

``num_brackets``

Using a grace period that is too small may bias ASHA too strongly towards fast
converging trials that do not lead to best results at convergence (stragglers).
To overcome this, you can increase the number of brackets, which increases the amount of resources
required for optimisation but decreases the bias towards stragglers. Default is 1.
.. autoclass:: orion.algo.asha.ASHA
:noindex:
:exclude-members: space, state_dict, set_state, suggest, observe, is_done, seed_rng,
configuration, sample_from_bracket, append_brackets, create_bracket,
create_brackets, promote, register_samples, sample, seed_brackets,
executed_times, compute_bracket_idx


``repetitions``

Number of executions of ASHA. Default is ``numpy.inf``, which means to
run ASHA until no new trials can be suggested.


.. _tpe-algorithm:

TPE
---------
---

`Tree-structured Parzen Estimator`_ (TPE) is one of the Sequential Model-Based
Global Optimization (SMBO) algorithms, which build models to propose new points based
@@ -291,35 +273,12 @@ Configuration
full_weight_num: 25
``seed``

Seed used to sample the initial points and the candidate points. Default is ``None``.

``n_initial_points``

Number of initial points randomly sampled. Default is ``20``.

``n_ei_candidates``

Number of candidate points sampled for the expected improvement (EI) computation. Default is ``24``.

``gamma``
.. autoclass:: orion.algo.tpe.TPE
:noindex:
:exclude-members: space, state_dict, set_state, suggest, observe, is_done, seed_rng,
configuration, sample_one_dimension, split_trials, requires_type

Ratio to split the observed trials into good and bad distributions. Default is ``0.25``.
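The ``gamma`` split can be sketched as follows (hypothetical helper, not Orion's ``TPE`` implementation):

```python
def split_trials(trials, gamma=0.25):
    """Sketch of TPE's gamma split.

    trials: list of (params, objective) pairs; lower objective is better.
    The best gamma fraction forms the 'good' distribution, the rest the
    'bad' one.
    """
    ordered = sorted(trials, key=lambda t: t[1])
    n_good = max(1, int(gamma * len(ordered)))
    return ordered[:n_good], ordered[n_good:]

trials = [({"x": i}, float(i)) for i in range(8)]
good, bad = split_trials(trials, gamma=0.25)
assert len(good) == 2 and len(bad) == 6
# Every 'good' trial is at least as good as every 'bad' trial.
assert all(g[1] <= b[1] for g in good for b in bad)
```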

``equal_weight``

True to set equal weights for observed points. Default is ``False``.

``prior_weight``

The weight given to the prior point of the input space. Default is ``1.0``.

``full_weight_num``

The number of most recent trials that receive the full weight; the remaining
trials are weighted along a linear ramp from 0 to 1.0. It only takes effect if
``equal_weight`` is ``False``. Default is ``25``.
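The weighting scheme can be sketched as below (a hypothetical helper illustrating the linear ramp, not Orion's exact code):

```python
import numpy as np

def trial_weights(n_trials, full_weight_num=25):
    """Sketch: the most recent full_weight_num trials get weight 1.0;
    older trials get weights on a linear ramp from 0 to 1.0."""
    n_ramp = max(n_trials - full_weight_num, 0)
    if n_ramp:
        ramp = np.linspace(1.0 / (n_ramp + 1), 1.0, num=n_ramp, endpoint=False)
    else:
        ramp = np.array([])
    return np.concatenate([ramp, np.ones(min(n_trials, full_weight_num))])

w = trial_weights(30, full_weight_num=25)
assert len(w) == 30
assert (w[-25:] == 1.0).all()      # 25 most recent trials: full weight
assert w[0] < w[4] < 1.0           # older trials: increasing ramp
```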

.. _evolution-es algorithm:

@@ -382,116 +341,48 @@ Configuration
strategy: StubParallelStrategy
``seed``

Seed for the random number generator used to sample new trials. Default is ``None``.

``repetitions``

Number of executions for Hyperband. A single execution of Hyperband takes a finite
budget of ``(log(R)/log(eta) + 1) * (log(R)/log(eta) + 1) * R``, and ``repetitions`` allows you
to run multiple executions of Hyperband. Default is ``numpy.inf`` which means to run Hyperband
until no new trials can be suggested.

``nums_population``

Number of members in the population for EvolutionES. A larger population often
yields better performance but requires more computation, so there is a trade-off
depending on the search space and the available budget of your problem.

``mutate``

One can define a customized mutate function together with its mutate factors,
such as a multiply factor (multiply/divide by the factor) and an add factor
(add/subtract the factor). A default mutate function is provided.
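A mutate function of this shape can be sketched as below; the function name, arguments, and probabilities are illustrative assumptions, not Orion's exact ``default_mutate`` API:

```python
import random

def default_mutate(value, multiply_factor=3.0, add_factor=1.0, rng=None):
    """Sketch of a mutate function using a multiply factor (times/divides)
    and an add factor (adds/subtracts); hypothetical, not Orion's API."""
    rng = rng or random.Random(0)
    if rng.random() < 0.5:
        # Multiplicative mutation: scale up or down by a random factor.
        factor = rng.uniform(1.0, multiply_factor)
        return value * factor if rng.random() < 0.5 else value / factor
    # Additive mutation: shift up or down by a random offset.
    offset = rng.uniform(0.0, add_factor)
    return value + offset if rng.random() < 0.5 else value - offset

mutated = default_mutate(1.0)
assert mutated != 1.0  # the child differs from its parent
```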
.. autoclass:: orion.algo.evolution_es.EvolutionES
:noindex:
:exclude-members: space, state_dict, set_state, suggest, observe, is_done, seed_rng,
requires_dist, requires_type


Algorithm Plugins
=================

.. _scikit-bayesopt:
Plugin documentation is hosted separately. See the short descriptions below for
links to each plugin's full documentation.

Scikit Bayesian Optimizer
-------------------------
.. _skopt-plugin:

``orion.algo.skopt`` provides a wrapper for `Bayesian optimizer`_ using Gaussian process implemented
in `scikit optimize`_.
Scikit-Optimize
---------------

.. _scikit optimize: https://scikit-optimize.github.io/
.. _bayesian optimizer: https://scikit-optimize.github.io/#skopt.Optimizer
This package is a plugin providing a wrapper for
`skopt <https://scikit-optimize.github.io>`__ optimizers.

Installation
~~~~~~~~~~~~
For more information, you can find the documentation at
`orionalgoskopt.readthedocs.io <https://orionalgoskopt.readthedocs.io>`__.

.. code-block:: sh

pip install orion.algo.skopt
.. _robo-plugin:

Configuration
~~~~~~~~~~~~~
Robust Bayesian Optimization
----------------------------

.. code-block:: yaml
This package is a plugin providing a wrapper for
`RoBO <https://github.com/automl/robo>`__ optimizers.

experiment:
algorithms:
BayesianOptimizer:
seed: null
n_initial_points: 10
acq_func: gp_hedge
alpha: 1.0e-10
n_restarts_optimizer: 0
noise: "gaussian"
normalize_y: False
``seed``

``n_initial_points``

Number of evaluations of ``func`` with initialization points
before approximating it with ``base_estimator``. Points provided as
``x0`` count as initialization points. If ``len(x0) < n_initial_points``,
additional points are sampled at random.

``acq_func``

Function to minimize over the posterior distribution. Can be:
``["LCB", "EI", "PI", "gp_hedge", "EIps", "PIps"]``. Check skopt
docs for details.

``alpha``

Value added to the diagonal of the kernel matrix during fitting.
Larger values correspond to increased noise level in the observations
and reduce potential numerical issues during fitting. If an array is
passed, it must have the same number of entries as the data used for
fitting and is used as datapoint-dependent noise level. Note that this
is equivalent to adding a ``WhiteKernel`` with ``c=alpha``. The option to
specify the noise level directly as a parameter is mainly for convenience
and for consistency with ``Ridge``.

``n_restarts_optimizer``

The number of restarts of the optimizer for finding the kernel's
parameters which maximize the log-marginal likelihood. The first run
of the optimizer is performed from the kernel's initial parameters,
the remaining ones (if any) from thetas sampled log-uniform randomly
from the space of allowed theta-values. If greater than 0, all bounds
must be finite. Note that n_restarts_optimizer == 0 implies that one
run is performed.

``noise``

If set to "gaussian", then it is assumed that y is a noisy estimate of f(x) where the
noise is gaussian.

``normalize_y``

Whether the target values y are normalized, i.e., the mean of the
observed target values becomes zero. This parameter should be set to
True if the target values' mean is expected to differ considerably from
zero. When enabled, the normalization effectively modifies the GP's
prior based on the data, which contradicts the likelihood principle;
normalization is thus disabled by default.
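A one-line illustration of the centering that ``normalize_y`` performs (a sketch of the idea, not skopt's internals):

```python
import numpy as np

# With normalize_y=True, the GP is fit on centred targets y - mean(y),
# so the observed target values have zero mean.
y = np.array([10.0, 12.0, 14.0])
y_centred = y - y.mean()
assert abs(y_centred.mean()) < 1e-12
```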
This plugin provides many models for Bayesian Optimization:
`Gaussian Process <https://epistimio.github.io/orion.algo.robo/usage.html#robo-gaussian-process>`__,
`Gaussian Process with MCMC <https://epistimio.github.io/orion.algo.robo/usage.html#robo-gaussian-process-with-mcmc>`__,
`Random Forest <https://epistimio.github.io/orion.algo.robo/usage.html#robo-random-forest>`__,
`DNGO <https://epistimio.github.io/orion.algo.robo/usage.html#robo-dngo>`__ and
`BOHAMIANN <https://epistimio.github.io/orion.algo.robo/usage.html#robo-bohamiann>`__.

For more information, you can find the documentation at
`epistimio.github.io/orion.algo.robo <https://epistimio.github.io/orion.algo.robo>`__.

.. _parallel-strategies:

1 change: 1 addition & 0 deletions setup.py
@@ -59,6 +59,7 @@
"legacy = orion.storage.legacy:Legacy",
],
"Executor": [
"singleexecutor = orion.executor.single_backend:SingleExecutor",
"joblib = orion.executor.joblib_backend:Joblib",
"dask = orion.executor.dask_backend:Dask",
],
8 changes: 4 additions & 4 deletions src/orion/algo/asha.py
@@ -102,10 +102,10 @@ class ASHA(Hyperband):
Seed for the random number generator used to sample new trials.
Default: ``None``
num_rungs: int, optional
Number of rungs for the largest bracket. If not defined, it will be equal to (base + 1) of
the fidelity dimension. In the original paper,
num_rungs == log(fidelity.high/fidelity.low) / log(fidelity.base) + 1.
Default: log(fidelity.high/fidelity.low) / log(fidelity.base) + 1
Number of rungs for the largest bracket. If not defined, it will be equal to ``(base + 1)``
of the fidelity dimension. In the original paper,
``num_rungs == log(fidelity.high/fidelity.low) / log(fidelity.base) + 1``.
Default: ``log(fidelity.high/fidelity.low) / log(fidelity.base) + 1``
num_brackets: int
Using a grace period that is too small may bias ASHA too strongly towards
fast converging trials that do not lead to best results at convergence (stragglers). To
11 changes: 11 additions & 0 deletions src/orion/algo/evolution_es.py
@@ -100,6 +100,17 @@ class EvolutionES(Hyperband):
repetitions: int
Number of executions of Hyperband. Default is numpy.inf which means to
run Hyperband until no new trials can be suggested.
nums_population: int
Number of members in the population for EvolutionES. A larger population often yields
better performance but requires more computation, so there is a trade-off depending on
the search space and the required budget of your problem.
Default: 20
mutate: str or None, optional
One can define a customized mutate function together with its mutate factors, such as
a multiply factor (multiply/divide by the factor) and an add factor (add/subtract the
factor). The function must be given as an importable string. If None, the default
mutate function is used: ``orion.algo.mutate_functions.default_mutate``.
"""

6 changes: 4 additions & 2 deletions src/orion/algo/hyperband.py
@@ -137,8 +137,10 @@ class Hyperband(BaseAlgorithm):
Seed for the random number generator used to sample new trials.
Default: ``None``
repetitions: int
Number of execution of Hyperband. Default is numpy.inf which means to
run Hyperband until no new trials can be suggested.
Number of executions for Hyperband. A single execution of Hyperband takes a finite budget of
``(log(R)/log(eta) + 1) * (log(R)/log(eta) + 1) * R``, and ``repetitions`` allows you to run
multiple executions of Hyperband. Default is ``numpy.inf`` which means to run Hyperband
until no new trials can be suggested.
"""

18 changes: 11 additions & 7 deletions src/orion/algo/random.py
@@ -12,15 +12,19 @@


class Random(BaseAlgorithm):
"""Implement a algorithm that samples randomly from the problem's space."""
"""An algorithm that samples randomly from the problem's space.
def __init__(self, space, seed=None):
"""Random sampler takes no other hyperparameter than the problem's space
itself.
Parameters
----------
space: `orion.algo.space.Space`
Optimisation space with priors for each dimension.
seed: None, int or sequence of int
Seed for the random number generator used to sample new trials.
Default: ``None``
:param space: `orion.algo.space.Space` of optimization.
:param seed: Integer seed for the random number generator.
"""
"""

def __init__(self, space, seed=None):
super(Random, self).__init__(space, seed=seed)

def seed_rng(self, seed):