General question: does caching hyperparameters work for approximate GPs? It seems OK, since only kernel parameters should be saved, and those remain valid even if the inducing points are removed, but it's worth checking. Also a request: add parameters for the inducing point count and the allocator (the two options available in botorch).
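For context, botorch's `SingleTaskVariationalGP` accepts both an inducing point count (via `inducing_points`) and an `inducing_point_allocator`. Below is a minimal torch-only sketch of what the allocator controls, using a hypothetical random-subset allocator rather than botorch's actual allocators such as `GreedyVarianceReduction`:

```python
import torch

def allocate_inducing_points(X: torch.Tensor, num_inducing: int) -> torch.Tensor:
    # Toy allocator: choose a random subset of the training inputs as
    # inducing points. botorch's allocators (e.g. GreedyVarianceReduction)
    # choose them more carefully; this only illustrates the interface.
    idx = torch.randperm(X.shape[0])[:num_inducing]
    return X[idx]

X = torch.rand(500, 3)  # 500 training points in 3 input dimensions
Z = allocate_inducing_points(X, num_inducing=64)
print(Z.shape)  # torch.Size([64, 3])
```

Exposing both knobs on the constructor lets users trade model fidelity (more inducing points) against training cost without touching the model-building code.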
Looking at caching hyperparameters right now; I will add the options you mentioned.
It looks like there is a botorch bug with loading a state dict for ApproximateGP models. I will submit a PR to botorch when I get a chance tomorrow; for now I will disable this feature.
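The failure mode here is presumably shape-dependent state: kernel hyperparameters cache cleanly, but variational and inducing-point tensors are sized to the data. A minimal torch-only sketch of this (using a hypothetical stand-in class, not the actual botorch model) that loads only the shape-compatible entries:

```python
import torch

class TinyVariationalModel(torch.nn.Module):
    # Stand-in for a variational GP: one kernel hyperparameter that is safe
    # to cache, plus inducing points whose count may change between runs.
    def __init__(self, num_inducing: int):
        super().__init__()
        self.raw_lengthscale = torch.nn.Parameter(torch.randn(1))
        self.inducing_points = torch.nn.Parameter(torch.rand(num_inducing, 1))

old = TinyVariationalModel(num_inducing=32)
new = TinyVariationalModel(num_inducing=64)  # different inducing point count

# new.load_state_dict(old.state_dict()) would raise a size-mismatch error,
# so copy over only the entries whose shapes match (the hyperparameters).
compatible = {
    k: v for k, v in old.state_dict().items()
    if v.shape == new.state_dict()[k].shape
}
new.load_state_dict(compatible, strict=False)
```

With `strict=False`, missing keys are tolerated, so the cached kernel hyperparameters transfer while the freshly allocated inducing points are left untouched.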
Added a botorch issue: meta-pytorch/botorch#3250
@nikitakuklev I've added the requested options; let me know what you think.
nikitakuklev left a comment:
See comments. Also, need to add to validate_gp_constructor and all
@nikitakuklev thanks for the comments; they should be addressed now. Let me know if there are more changes you'd like.
```python
gp_constructor.build_approximate_gp(
    X=torch.empty((0, 1)),
    Y=torch.empty((0, 1)),
    Yvar=torch.empty((0, 1)),
)
```
Yvar is not supported (this is a masked test-only issue)
Thanks for catching that.
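One way to make the unsupported `Yvar` path fail loudly instead of being silently masked in tests — a hypothetical guard sketch, not the code as merged:

```python
def build_approximate_gp(X, Y, Yvar=None):
    # Hypothetical guard: the approximate GP constructor does not support
    # fixed observation noise, so reject Yvar explicitly rather than
    # letting a masked test pass it through unnoticed.
    if Yvar is not None:
        raise ValueError(
            "Yvar (fixed observation noise) is not supported by the "
            "approximate GP constructor"
        )
    return {"X": X, "Y": Y}  # placeholder for the real model construction
```

An explicit `ValueError` turns a silently ignored argument into an immediate, test-visible failure.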
This pull request introduces support for approximate Gaussian Process (GP) model building to the codebase, enabling scalable modeling for larger datasets. The main changes include the implementation of an `ApproximateModelConstructor` using variational GP models, updates to benchmarking and documentation, and refactoring of model training logic to support both exact and variational approaches.

**Approximate GP Modeling Support:**
- Added an `ApproximateModelConstructor` class in `xopt/generators/bayesian/models/approximate.py` to construct scalable variational GP models for each outcome, leveraging `SingleTaskVariationalGP` and variational ELBO training.
- Added a `build_variational_gp` utility method in `base_model.py` for creating and training variational GP models.
- Refactored `standard.py` to allow flexible training of individual models with custom marginal log likelihoods, supporting both exact and variational approaches. [1] [2] [3]

**Documentation and Examples:**
- Added a new example notebook, `approximate.ipynb`, demonstrating benchmarking and usage of approximate GP models versus standard models, including scalability and performance comparisons.
- Updated the documentation (`index.md` and `mkdocs.yml`) to reference approximate GP model building and the new example notebook. [1] [2]

**Testing:**
- Added tests verifying that the `ApproximateModelConstructor` builds models using `SingleTaskVariationalGP`, and updated tests for model training and error handling. [1] [2] [3]

**Underlying API Enhancements:**
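For reference, the variational ELBO objective mentioned above (maximized when training each `SingleTaskVariationalGP`) takes the standard form, where $\mathbf{u}$ are the inducing-point values with variational posterior $q(\mathbf{u})$:

$$
\mathcal{L}_{\mathrm{ELBO}} = \sum_{i=1}^{N} \mathbb{E}_{q(f_i)}\!\left[\log p(y_i \mid f_i)\right] - \mathrm{KL}\!\left(q(\mathbf{u}) \,\|\, p(\mathbf{u})\right)
$$

The first term fits the data; the KL term regularizes the variational distribution toward the GP prior, which is what keeps training cost governed by the inducing point count rather than the full dataset size.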