Merged
@@ -2,7 +2,7 @@
# Flags: doc-Runnable

An example of optimizing a simple support vector machine on the digits dataset. In contrast to the
[simple example](/examples/1%20Basics/2_svm_cv/), in which all cross-validation folds are executed
[simple example](../../1_basics/2_svm_cv), in which all cross-validation folds are executed
at once, we use the intensification mechanism described in the original
[SMAC paper](https://link.springer.com/chapter/10.1007/978-3-642-25566-3_40) as also demonstrated
by [Auto-WEKA](https://dl.acm.org/doi/10.1145/2487575.2487629). This mechanism allows us to
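The racing idea described in this snippet can be sketched in plain Python: evaluate the challenger fold by fold and stop as soon as it falls behind the incumbent. This is an illustration of the concept only — the function, its early-stopping rule, and the fold scores are hypothetical, not SMAC's implementation.

```python
def intensify(incumbent_scores, challenger_fold_scores):
    """Race a challenger against the incumbent fold by fold.

    Instead of evaluating all cross-validation folds at once, stop as soon
    as the challenger's running mean falls behind the incumbent's mean on
    the folds seen so far (lower scores are better here).
    """
    incumbent_mean = sum(incumbent_scores) / len(incumbent_scores)
    seen = []
    for score in challenger_fold_scores:
        seen.append(score)
        if sum(seen) / len(seen) > incumbent_mean:
            return False, seen  # challenger rejected early, folds saved
    return True, seen  # challenger survived all folds

better, evaluated = intensify([0.20, 0.22, 0.21], [0.19, 0.20, 0.18])
```

A bad challenger is discarded after its first fold, so the saved folds can be spent on more promising configurations.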
7 changes: 1 addition & 6 deletions mkdocs.yaml
@@ -153,7 +153,7 @@ plugins:
python:
paths: [smac]
# Extra objects which allow for linking to external docs
inventories:
import:
- 'https://docs.python.org/3/objects.inv'
- 'https://numpy.org/doc/stable/objects.inv'
- 'https://pandas.pydata.org/docs/objects.inv'
@@ -197,14 +197,12 @@ nav:
- Installation: "1_installation.md"
- Package Overview: "2_package_overview.md"
- Getting Started: "3_getting_started.md"
- Minimal Example: "4_minimal_example.md"
- Advanced Usage:
- "advanced_usage/1_components.md"
- "advanced_usage/2_multi_fidelity.md"
- "advanced_usage/3_multi_objective.md"
- "advanced_usage/4_instances.md"
- "advanced_usage/5_ask_and_tell.md"
- "advanced_usage/5.1_warmstarting.md"
- "advanced_usage/6_commandline.md"
- "advanced_usage/7_stopping_criteria.md"
- "advanced_usage/8_logging.md"
@@ -221,9 +219,6 @@ nav:
- "6_references.md"
- "7_glossary.md"
- "8_faq.md"
- "9_license.md"
- "10_experimental.md"
- "images/README.md"
# - Contributing:
# - "contributing/index.md"
# - "contributing/contributing-a-benchmark.md"
8 changes: 0 additions & 8 deletions readme.txt

This file was deleted.

2 changes: 1 addition & 1 deletion smac/acquisition/function/confidence_bound.py
@@ -21,7 +21,7 @@ class AbstractConfidenceBound(AbstractAcquisitionFunction):

Example for LCB (UCB adds the variance term instead of subtracting it):

$LCB(X) = \mu(\mathbf{X}) - \sqrt{\beta_t}\sigma(\mathbf{X})$ [SKKS10](/6_references/#SKKS10).
$LCB(X) = \mu(\mathbf{X}) - \sqrt{\beta_t}\sigma(\mathbf{X})$ [SKKS10](../../../docs/6_references).

with

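For reference, the LCB line in this docstring evaluates as in the small sketch below. The function name and inputs are illustrative, not SMAC's API.

```python
import math


def lower_confidence_bound(mu, sigma, beta_t):
    # LCB(x) = mu(x) - sqrt(beta_t) * sigma(x); UCB adds the term instead.
    return mu - math.sqrt(beta_t) * sigma


# With beta_t = 4, the bound sits two standard deviations below the mean.
value = lower_confidence_bound(1.0, 0.5, 4.0)
```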
7 changes: 4 additions & 3 deletions smac/acquisition/function/probability_improvement.py
@@ -19,9 +19,10 @@
class PI(AbstractAcquisitionFunction):
r"""Probability of Improvement

:math:`P(f_{t+1}(\mathbf{X})\geq f(\mathbf{X^+}))` :math:`:=`
:math:`\Phi(\\frac{ \mu(\mathbf{X})-f(\mathbf{X^+}) }{ \sigma(\mathbf{X}) })`
with :math:`f(X^+)` as the incumbent and :math:`\Phi` the cdf of the standard normal.
$$
P(f_{t+1}(\mathbf{X})\geq f(\mathbf{X^+})) := \Phi\left(\frac{ \mu(\mathbf{X})-f(\mathbf{X^+}) }{ \sigma(\mathbf{X}) }\right)
$$
with $f(X^+)$ as the incumbent and $\Phi$ the cdf of the standard normal.

Parameters
----------
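The rewritten PI formula can be checked numerically with a stdlib-only sketch, assuming a Gaussian posterior and a maximization setting. Names are illustrative; SMAC's implementation differs in details such as the exploration offset.

```python
import math


def normal_cdf(z):
    # Phi(z), the cdf of the standard normal, via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))


def probability_of_improvement(mu, sigma, f_incumbent):
    # PI = Phi((mu(x) - f(x+)) / sigma(x)).
    if sigma == 0.0:
        return 1.0 if mu > f_incumbent else 0.0
    return normal_cdf((mu - f_incumbent) / sigma)
```

At `mu == f_incumbent` the probability is exactly 0.5, which matches the formula: the argument of Phi is zero.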
5 changes: 5 additions & 0 deletions smac/acquisition/function/thompson.py
@@ -23,6 +23,11 @@ class TS(AbstractAcquisitionFunction):

$TS(X) \sim \mathcal{N}(\mu(\mathbf{X}),\sigma(\mathbf{X}))$
Returns -TS(X), since the acquisition function optimizer maximizes the acquisition value.

Parameters
----------
xi : float, defaults to 0.0
TS does not require xi; the parameter is only accepted to keep the signature consistent with the other acquisition functions.
"""

@property
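A minimal sketch of the sampling step this docstring describes, assuming a univariate Gaussian posterior per point. This is illustrative only, not SMAC's TS class.

```python
import random


def thompson_sample(mu, sigma, rng=None):
    # Draw one realization from N(mu, sigma^2) and negate it, because the
    # acquisition maximizer maximizes the returned value.
    rng = rng or random.Random()
    return -rng.gauss(mu, sigma)


draw = thompson_sample(10.0, 0.01, rng=random.Random(42))
```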
11 changes: 3 additions & 8 deletions smac/acquisition/maximizer/abstract_acquisition_maximizer.py
@@ -27,15 +27,10 @@ class AbstractAcquisitionMaximizer:

Parameters
----------
configspace : ConfigurationSpace
Configuration space used for sampling.
acquisition_function : AbstractAcquisitionFunction | None, defaults to None
Acquisition function to maximize.
challengers : int, defaults to 5000
Number of configurations sampled during the optimization process. Details depend on the used maximizer.
Also, the number of configurations that is returned by calling `maximize`.
configspace : ConfigurationSpace
acquisition_function : AbstractAcquisitionFunction
challengers : int, defaults to 5000
Number of configurations sampled during the optimization process; details depend on the used maximizer.
Also the number of configurations returned by calling `maximize`.
seed : int, defaults to 0
Random seed.
"""

def __init__(
10 changes: 4 additions & 6 deletions smac/acquisition/maximizer/differential_evolution.py
@@ -22,15 +22,13 @@ def check_kwarg(cls: type, kwarg_name: str) -> bool:

Parameters
----------
cls : type
The class to inspect.
kwarg_name : str
The name of the keyword argument to check.
cls (type): The class to inspect.
kwarg_name (str): The name of the keyword argument to check.

Returns
-------
bool
True if the class's __init__ method accepts the keyword argument, otherwise False.
bool: True if the class's __init__ method accepts the keyword argument,
otherwise False.
"""
# Get the signature of the class's __init__ method
init_signature = inspect.signature(cls.__init__) # type: ignore[misc]
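A self-contained version of the helper shown in this hunk, built on the same `inspect.signature` call; the real function in smac may handle more cases. The `Optimizer` class is a made-up example for demonstration.

```python
import inspect


def check_kwarg(cls, kwarg_name):
    # True if the class's __init__ explicitly lists `kwarg_name` as a
    # parameter. Note: a bare **kwargs catch-all does not count here.
    init_signature = inspect.signature(cls.__init__)
    return kwarg_name in init_signature.parameters


class Optimizer:
    def __init__(self, maxiter=1000, popsize=50):
        self.maxiter = maxiter
        self.popsize = popsize
```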
3 changes: 3 additions & 0 deletions smac/facade/abstract_facade.py
@@ -347,6 +347,9 @@ def validate(
----------
config : Configuration
Configuration to validate
instances : list[str] | None, defaults to None
Which instances to validate. If None, all instances specified in the scenario are used.
In case that the budget type is real-valued, this argument is ignored.
seed : int | None, defaults to None
If None, the seed from the scenario is used.

6 changes: 3 additions & 3 deletions smac/facade/blackbox_facade.py
@@ -239,7 +239,7 @@ def get_initial_design( # type: ignore
scenario: Scenario,
*,
n_configs: int | None = None,
n_configs_per_hyperparameter: int = 8,
n_configs_per_hyperparamter: int = 8,
max_ratio: float = 0.25,
additional_configs: list[Configuration] = None,
) -> SobolInitialDesign:
@@ -257,15 +257,15 @@ def get_initial_design( # type: ignore
max_ratio: float, defaults to 0.25
Use at most ``scenario.n_trials`` * ``max_ratio`` number of configurations in the initial design.
Additional configurations are not affected by this parameter.
additional_configs: list[Configuration], defaults to None
additional_configs: list[Configuration], defaults to []
Adds additional configurations to the initial design.
"""
if additional_configs is None:
additional_configs = []
return SobolInitialDesign(
scenario=scenario,
n_configs=n_configs,
n_configs_per_hyperparameter=n_configs_per_hyperparameter,
n_configs_per_hyperparameter=n_configs_per_hyperparamter,
max_ratio=max_ratio,
additional_configs=additional_configs,
seed=scenario.seed,
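The interplay of `n_configs_per_hyperparameter` and `max_ratio` documented in this docstring can be sketched as follows. The helper is hypothetical, not the SMAC API, and additional configurations are out of scope here since the docstring says they are not affected by the cap.

```python
def initial_design_size(n_hyperparameters, n_trials,
                        n_configs_per_hyperparameter=8, max_ratio=0.25):
    # Scale the design with the dimensionality, but never spend more than
    # max_ratio of the total trial budget on initial configurations.
    proposed = n_configs_per_hyperparameter * n_hyperparameters
    cap = int(n_trials * max_ratio)
    return min(proposed, cap)
```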
6 changes: 3 additions & 3 deletions smac/facade/hyperparameter_optimization_facade.py
@@ -136,7 +136,7 @@ def get_initial_design( # type: ignore
scenario: Scenario,
*,
n_configs: int | None = None,
n_configs_per_hyperparameter: int = 10,
n_configs_per_hyperparamter: int = 10,
max_ratio: float = 0.25,
additional_configs: list[Configuration] | None = None,
) -> SobolInitialDesign:
@@ -154,13 +154,13 @@ def get_initial_design( # type: ignore
max_ratio: float, defaults to 0.25
Use at most ``scenario.n_trials`` * ``max_ratio`` number of configurations in the initial design.
Additional configurations are not affected by this parameter.
additional_configs: list[Configuration], defaults to None
additional_configs: list[Configuration], defaults to []
Adds additional configurations to the initial design.
"""
return SobolInitialDesign(
scenario=scenario,
n_configs=n_configs,
n_configs_per_hyperparameter=n_configs_per_hyperparameter,
n_configs_per_hyperparameter=n_configs_per_hyperparamter,
max_ratio=max_ratio,
additional_configs=additional_configs,
)
6 changes: 3 additions & 3 deletions smac/facade/multi_fidelity_facade.py
@@ -72,7 +72,7 @@ def get_initial_design( # type: ignore
scenario: Scenario,
*,
n_configs: int | None = None,
n_configs_per_hyperparameter: int = 10,
n_configs_per_hyperparamter: int = 10,
max_ratio: float = 0.25,
additional_configs: list[Configuration] = None,
) -> RandomInitialDesign:
@@ -90,15 +90,15 @@ def get_initial_design( # type: ignore
max_ratio: float, defaults to 0.25
Use at most ``scenario.n_trials`` * ``max_ratio`` number of configurations in the initial design.
Additional configurations are not affected by this parameter.
additional_configs: list[Configuration], defaults to None
additional_configs: list[Configuration], defaults to []
Adds additional configurations to the initial design.
"""
if additional_configs is None:
additional_configs = []
return RandomInitialDesign(
scenario=scenario,
n_configs=n_configs,
n_configs_per_hyperparameter=n_configs_per_hyperparameter,
n_configs_per_hyperparameter=n_configs_per_hyperparamter,
max_ratio=max_ratio,
additional_configs=additional_configs,
)
6 changes: 3 additions & 3 deletions smac/model/gaussian_process/gaussian_process.py
@@ -101,7 +101,7 @@ def _train(
----------
X : np.ndarray [#samples, #hyperparameters + #features]
Input data points.
y : np.ndarray [#samples, #objectives]
Y : np.ndarray [#samples, #objectives]
The corresponding target values.
optimize_hyperparameters: boolean
If set to true, the hyperparameters are optimized, otherwise the default hyperparameters of the kernel are
@@ -276,9 +276,9 @@ def sample_functions(self, X_test: np.ndarray, n_funcs: int = 1) -> np.ndarray:

Parameters
----------
X_test : np.ndarray [#samples, #hyperparameters + #features]
X : np.ndarray [#samples, #hyperparameters + #features]
Input data points.
n_funcs : int
n_funcs: int
Number of function values that are drawn at each test point.

Returns
8 changes: 4 additions & 4 deletions smac/model/random_forest/random_forest.py
@@ -113,11 +113,11 @@ def __init__(
Parameters
----------
n_estimators : int, default=100
The number of trees in the forest.
The number of trees in the forest.

.. versionchanged:: 0.22
The default value of ``n_estimators`` changed from 10 to 100
in 0.22.
.. versionchanged:: 0.22
The default value of ``n_estimators`` changed from 10 to 100
in 0.22.

criterion : {"squared_error", "absolute_error", "friedman_mse", "poisson"}, \
default="squared_error"
12 changes: 6 additions & 6 deletions smac/random_design/modulus_design.py
@@ -51,15 +51,15 @@ class DynamicModulusRandomDesign(AbstractRandomDesign):
Parameters
----------
start_modulus : float, defaults to 2.0
Initially, every modulus-th configuration will be at random.
Initially, every modulus-th configuration will be at random.
modulus_increment : float, defaults to 0.3
Increase modulus by this amount in every iteration.
Increase modulus by this amount in every iteration.
end_modulus : float, defaults to np.inf
The maximum modulus ever used. If the value is reached before the optimization
is over, it is not further increased. If it is not reached before the optimization is over,
there will be no adjustment to make sure that the `end_modulus` is reached.
The maximum modulus ever used. If the value is reached before the optimization
is over, it is not further increased. If it is not reached before the optimization is over,
there will be no adjustment to make sure that the `end_modulus` is reached.
seed : int, defaults to 0
Integer used to initialize the random state. This class does not use the seed.
Integer used to initialize the random state. This class does not use the seed.
"""

def __init__(
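One plausible reading of the modulus schedule documented in this hunk, as a stdlib sketch. The decision rule (`iteration % modulus < 1`) is an assumption for illustration; SMAC's actual check may differ.

```python
def modulus_at(iteration, start_modulus=2.0, modulus_increment=0.3,
               end_modulus=float("inf")):
    # The modulus grows linearly with the iteration count and is
    # clipped at end_modulus once that value is reached.
    return min(start_modulus + iteration * modulus_increment, end_modulus)


def choose_random(iteration, **kwargs):
    # Roughly every modulus-th configuration is drawn at random.
    return iteration % modulus_at(iteration, **kwargs) < 1.0
```

As the modulus grows, random configurations become rarer, shifting the balance from exploration toward the model-guided search.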
4 changes: 1 addition & 3 deletions smac/runhistory/runhistory.py
@@ -307,10 +307,8 @@ def add_trial(self, info: TrialInfo, value: TrialValue) -> None:

Parameters
----------
info : TrialInfo
trial : TrialInfo
The ``TrialInfo`` object of the running trial.
value : TrialValue
The ``TrialValue`` object of the running trial.
"""
self.add(
config=info.config,