fix(GPHedge): replace list with sequence#524

Merged
till-m merged 1 commit into bayesian-optimization:master from
phi-friday:fix-gphedge-replace-list-with-sequence
Sep 30, 2024
Conversation

@phi-friday
Contributor

Given the use case below, it seems that base_acquisitions in GPHedge can have multiple types of AcquisitionFunction.

acq = acquisition.GPHedge(base_acquisitions=base_acquisitions)

However, list is invariant in its type argument, so it does not allow such a situation.
Therefore, we need to change the annotation to Sequence, whose type parameter is covariant.
I was going to ignore it because it's related to generics,
but it seemed necessary for the use case you're considering, so I fixed it.
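The change can be sketched as below. This is an illustrative stub, not the library's actual implementation: the class bodies and attribute name are assumptions, and only the annotation swap mirrors the fix.

```python
from collections.abc import Sequence

# Illustrative stand-ins for the real classes in bayesian_optimization.acquisition.
class AcquisitionFunction: ...
class UpperConfidenceBound(AcquisitionFunction): ...
class ProbabilityOfImprovement(AcquisitionFunction): ...

class GPHedge:
    # Before (list is invariant, so a mixed list of subclasses is rejected):
    #   def __init__(self, base_acquisitions: list[AcquisitionFunction]) -> None: ...
    # After (Sequence is covariant, so any mix of subclasses type-checks):
    def __init__(self, base_acquisitions: Sequence[AcquisitionFunction]) -> None:
        self.base_acquisitions = list(base_acquisitions)

acq = GPHedge([UpperConfidenceBound(), ProbabilityOfImprovement()])
print(len(acq.base_acquisitions))  # 2
```

Note that the runtime behavior is unchanged; only the static type accepted by the parameter widens.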

Note

This is not a complete description.
But I think it will help you understand why the error is occurring.

  • list[T: AcquisitionFunction]: list is invariant, so every element must be exactly T.

correct case: [UpperConfidenceBound] does belong to list[UpperConfidenceBound]. (Also, UpperConfidenceBound is a subclass of AcquisitionFunction.)
error case: [UpperConfidenceBound, ProbabilityOfImprovement] does not belong to any of the three types list[AcquisitionFunction], list[UpperConfidenceBound], or list[ProbabilityOfImprovement].

  • Sequence[T_co: AcquisitionFunction]: Sequence is covariant, so every element may be T_co or a subclass of T_co.

[UpperConfidenceBound, ProbabilityOfImprovement] does not belong to Sequence[UpperConfidenceBound] or Sequence[ProbabilityOfImprovement], but it does belong to Sequence[AcquisitionFunction].
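The membership claims above can be seen at a call site. This is a minimal sketch with stub classes (the function names and class bodies are assumptions); the static errors described in the comments are what a checker that infers union element types, such as pyright, reports.

```python
from collections.abc import Sequence

# Stub classes mirroring the library's hierarchy for illustration only.
class AcquisitionFunction: ...
class UpperConfidenceBound(AcquisitionFunction): ...
class ProbabilityOfImprovement(AcquisitionFunction): ...

def needs_list(acqs: list[AcquisitionFunction]) -> int:
    return len(acqs)

def needs_sequence(acqs: Sequence[AcquisitionFunction]) -> int:
    return len(acqs)

mixed = [UpperConfidenceBound(), ProbabilityOfImprovement()]
# A checker that infers mixed as list[UpperConfidenceBound | ProbabilityOfImprovement]
# rejects the invariant parameter:
#   needs_list(mixed)  # static error: incompatible list element type
# The covariant Sequence parameter accepts the same value:
print(needs_sequence(mixed))  # 2
```

At runtime both calls succeed; the distinction exists only in the static type system.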

@till-m till-m merged commit f63372e into bayesian-optimization:master Sep 30, 2024
@till-m
Member

till-m commented Sep 30, 2024

Thanks for your contribution!

@phi-friday phi-friday deleted the fix-gphedge-replace-list-with-sequence branch September 30, 2024 11:02
till-m added a commit to till-m/BayesianOptimization that referenced this pull request Oct 9, 2024
* fix: import error from exception module (bayesian-optimization#525)

* fix: replace list with sequence (bayesian-optimization#524)

* Fix min window type check (bayesian-optimization#523)

* fix: replace dict with Mapping

* fix: replace list with Sequence

* fix: add type hint

* fix: does not accept None

* Change docs badge (bayesian-optimization#527)

* fix: parameter, target_space

* fix: constraint, bayesian_optimization

* fix: ParamsType

---------

Co-authored-by: till-m <36440677+till-m@users.noreply.github.com>
till-m added a commit that referenced this pull request Dec 27, 2024
* WIP

* Add ML example

* Save for merge

* Update

* Parameter types more (#13)

* fix: import error from exception module (#525)

* fix: replace list with sequence (#524)

* Fix min window type check (#523)

* fix: replace dict with Mapping

* fix: replace list with Sequence

* fix: add type hint

* fix: does not accept None

* Change docs badge (#527)

* fix: parameter, target_space

* fix: constraint, bayesian_optimization

* fix: ParamsType

---------

Co-authored-by: till-m <36440677+till-m@users.noreply.github.com>

* Use `.masks` not `._masks`

* User `super` to call kernel

* Update logging for parameters

* Disable SDR when non-float parameters are present

* Add demo script for typed optimization

* Update parameters, testing

* Remove sorting, gradient optimize only continuous params

* Go back to `wrap_kernel`

* Update code

* Remove `tqdm` dependency, use EI acq

* Add more text to typed optimization notebook.

* Save files while moving device

* Update with custom parameter type example

* Mention that parameters are not sorted

* Change array reg warning

* Update Citations, parameter notebook

---------

Co-authored-by: phi-friday <phi.friday@gmail.com>