Conversation

@paoloascia

Motivation

Hi! This tutorial explains how to implement the Feasibility-driven Trust Region Bayesian Optimization (FuRBO) method, presented last September at AutoML 2025. We believe it is a relevant contribution, as it builds on Scalable Constrained BO (SCBO) to improve the treatment of constraints and accelerate convergence to a feasible region.

Have you read the Contributing Guidelines on pull requests?

Yes

Test Plan

The existing code is unchanged. The tutorial was tested prior to upload.

Related PRs

No documentation needs to be added.

Commit with the tutorial on Feasibility-Driven trust Region Bayesian Optimization.
Title update
SCBO url update
Title update
@meta-cla

meta-cla bot commented Oct 17, 2025

Hi @paoloascia!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!

@meta-cla

meta-cla bot commented Oct 17, 2025

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

meta-cla bot added the "CLA Signed" label on Oct 17, 2025.
@hvarfner
Contributor

Thanks @paoloascia for the contribution! Overall I think this looks good and about ready to be merged. I'll get back later today with a few suggestions for improvements.

@paoloascia
Author

Hi @hvarfner ! Thank you for the comment. I'm looking forward to your feedback :).

@hvarfner left a comment (Contributor)


Looks good implementation-wise, but there are some changes that would be good to see.

  1. Please make the notebook more informative: the user should get a sense of what the method does from the markdown cells and plots; right now, they have to go through the code. Since the community is by now fairly well informed on what TuRBO does (and probably even SCBO to some extent), try to focus on what distinguishes FuRBO from those two. This may entail some extra work, such as adding an informative plot, some math, and maybe even a comparison to SCBO.
  2. Please use a linter to clean up the code. This includes unused imports, having comments in the right places and using docstrings instead of comments where appropriate.
  3. Get rid of arguments that are not actually used: `max_cholesky_size` and `{"method": "L-BFGS-B"}` are covered by the current defaults, so they shouldn't be needed.
  4. Warnings & try/catches: it's odd to me that GP fitting fails - I really don't think it should with so few observations. Please have a look at this; I'm happy to help here.

"from gpytorch.likelihoods import GaussianLikelihood\n",
"from gpytorch.mlls import ExactMarginalLogLikelihood\n",
"\n",
"from scipy.stats import invgauss\n",

Unused imports

"from gpytorch.mlls import ExactMarginalLogLikelihood\n",
"\n",
"from scipy.stats import invgauss\n",
"from scipy.stats import ecdf\n",

Unused imports

"metadata": {},
"outputs": [],
"source": [
"class norm_():\n",

It would be good to see the utilities we have in botorch used here. The class Normalize(d=dim, bounds=bounds) should do the trick to normalize the inputs to [0, 1] for the models. Similarly, creating a different class ack that wraps Ackley can be avoided by using the same transforms.

"from botorch.models.model_list_gp_regression import ModelListGP\n",
"from torch.quasirandom import SobolEngine\n",
"\n",
"class Furbo_state():\n",

Please use pythonic class naming (i.e. FurboState). Ideally, use type hints too, so that users understand what is going on.
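One way to get the pythonic name, the type hints, and a docstring in one go is a dataclass; the fields below are illustrative stand-ins, not the tutorial's exact state variables.

```python
from dataclasses import dataclass


@dataclass
class FurboState:
    """Tracks the state of a FuRBO run with restarts.

    Attributes (illustrative, not the tutorial's exact fields):
        dim: Dimensionality of the search space.
        tr_length: Current trust-region edge length.
        best_value: Best feasible objective value found so far.
        it_counter: Number of completed iterations.
        restart: Whether a trust-region restart has been triggered.
    """

    dim: int
    tr_length: float = 0.8
    best_value: float = float("-inf")
    it_counter: int = 0
    restart: bool = False
```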

"class Furbo_state():\n",
" '''Class to track optimization status with restart'''\n",
" # Initialization of the status\n",
" def __init__(self, \n",

Thanks for commenting on what each variable does. However, the conventional way to do this would be via a docstring. This is also cleaner, and should be a quick fix.

"import numpy as np\n",
"from matplotlib import rc\n",
"\n",
"def ensure_tensor(x, tkwargs, dim=0):\n",

Isn't `torch.cat(x).to(**tkwargs)` enough here?
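A minimal sketch of that one-liner, assuming `tkwargs` holds the usual `dtype`/`device` pair and the input is a list of tensors:

```python
import torch

tkwargs = {"dtype": torch.double, "device": "cpu"}  # assumed convention
parts = [torch.zeros(2, 3), torch.ones(1, 3)]

# torch.cat joins along dim 0; .to(**tkwargs) moves dtype/device in one call
x = torch.cat(parts).to(**tkwargs)
```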

" samples = multivariate_circular(x_candidate, radius, n_samples, lb=lb, ub=ub, **tkwargs)\n",
" \n",
" # Evaluate samples on the models of the objective -> yy Tensor\n",
" state.Y_model.eval()\n",

posterior() sets the model in eval mode, so these can be removed!

" samples_yy = posterior.mean.squeeze()\n",
" \n",
" # Evaluate samples on the models of the constraints -> yy Tensor\n",
" state.C_model.eval()\n",

and here

"metadata": {},
"outputs": [],
"source": [
"def multivariate_circular(centre, # Centre of the multivariate distribution\n",

This to me seems like the secret sauce for the method, right? If so, it could benefit from some additional explanation in the markdown cells!
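To make the suggestion concrete, here is a hypothetical sketch of what such a sampler could look like: an isotropic Gaussian cloud of inspection points around a centre, clipped to the search box. This is a guess at the idea from the function signature, not the tutorial's actual implementation, and all names mirror the signature shown above.

```python
import torch


def multivariate_circular(centre, radius, n_samples, lb, ub, **tkwargs):
    """Hypothetical sketch: draw an isotropic Gaussian cloud of points
    around `centre` with standard deviation `radius`, clipped to the
    box [lb, ub]. Not the tutorial's actual implementation."""
    centre = torch.as_tensor(centre, **tkwargs)
    lb = torch.as_tensor(lb, **tkwargs)
    ub = torch.as_tensor(ub, **tkwargs)
    noise = torch.randn(n_samples, centre.shape[-1], **tkwargs)
    return (centre + radius * noise).clamp(lb, ub)


samples = multivariate_circular(
    [0.5, 0.5], 0.1, 64, [0.0, 0.0], [1.0, 1.0], dtype=torch.double
)
```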

"outputs": [],
"source": [
"def stopping_criterion(state, n_iteration):\n",
" '''Function to evaluate if the maximum number of allowed iterations is reached.'''\n",

`return state.it_counter <= n_iteration`
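i.e. the whole helper collapses to a single expression. A minimal sketch, assuming the state object carries an `it_counter` field (the stand-in class below is illustrative):

```python
class State:
    """Stand-in for the tutorial's optimization state object."""

    def __init__(self, it_counter: int = 0):
        self.it_counter = it_counter


def stopping_criterion(state: State, n_iteration: int) -> bool:
    """True while the iteration budget is not yet exhausted."""
    return state.it_counter <= n_iteration
```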
