API

Library


Module

General framework for data-augmented Gaussian Processes


Types

Classic Batch Gaussian Process Regression (no inducing points)


Sparse Gaussian Process Regression with Gaussian Likelihood


Batch Student-t Gaussian Process Regression (no inducing points)


Sparse Gaussian Process Regression with Student-t likelihood


Batch Gaussian Process Classifier with Logistic Likelihood (no inducing points)


Sparse Gaussian Process Classifier with Logistic Likelihood

Create a GP model taking the training data X and labels y as required arguments. The optional keyword arguments are:

  • Stochastic::Bool : whether the model is trained via mini-batches
  • AdaptiveLearningRate::Bool : whether the learning rate is adapted via an estimate of the gradient variance (see "Adaptive Learning Rate for Stochastic Variational Inference", https://pdfs.semanticscholar.org/9903/e08557f328d58e4ba7fce68faee380d30b12.pdf); otherwise a simple decay schedule (1/(iter + τ_s))^κ_s with parameters κ_s and τ_s is used
  • Autotuning::Bool : whether the hyperparameters are optimized as well
  • optimizer::Optimizer : type of optimizer used for the hyperparameters
  • OptimizeIndPoints::Bool : whether the locations of the inducing points are optimized
  • nEpochs::Integer : maximum number of iteration steps
  • batchsize::Integer : number of samples per mini-batch
  • κ_s::Real : exponent of the learning-rate decay schedule
  • τ_s::Real : delay parameter of the learning-rate decay schedule
  • kernel::Kernel : kernel of the model
  • noise::Float64 : noise added to the model
  • m::Integer : number of inducing points
  • ϵ::Float64 : minimum change for convergence
  • SmoothingWindow::Integer : window size for averaging the convergence criterion in the stochastic case
  • verbose::Integer : amount of information displayed (from 0 to 3)
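As a sketch of how these arguments fit together, the snippet below builds a sparse classifier on toy data; the constructor name SparseXGPC and the exact keyword spellings are assumptions for illustration, not confirmed API:

```julia
# Toy binary classification data
X = randn(200, 2)                          # 200 samples, 2 features
y = sign.(X[:, 1] .+ 0.5 .* randn(200))    # labels in {-1, 1}

# Hypothetical constructor for the sparse logistic classifier
model = SparseXGPC(X, y;
    Stochastic = true,            # train on mini-batches
    AdaptiveLearningRate = true,  # adapt the step size from the gradient variance
    Autotuning = true,            # also optimize the hyperparameters
    nEpochs = 100,
    batchsize = 20,
    kernel = RBFKernel(1.0),
    m = 32,                       # number of inducing points
    verbose = 2)
```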

Batch Bayesian Support Vector Machine (no inducing points)


Sparse Gaussian Process Classifier with Bayesian SVM likelihood


Functions and methods

Function to train the given GP model. Options include setting the maximum number of iterations, passing a callback function that receives the model and the current step as arguments, and supplying a convergence method that stops the algorithm once a given criterion is met.

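A hedged usage sketch of the training call described above; the function name train! and its keyword names are assumptions:

```julia
# Callback receiving the model and the current step
monitor(model, iter) = iter % 10 == 0 && println("iteration $iter")

# Train for at most 200 iterations, reporting every 10 steps
train!(model, iterations = 200, callback = monitor)
```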

Return the mean of the predictive distribution of f


Return the mean and variance of the predictive distribution of f


Return the mean of the predictive distribution of f


Return the mean and variance of the predictive distribution of f

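The predictive methods above might be used as follows; the function name fstar and its keyword argument are assumptions:

```julia
X_test = randn(50, 2)

μ     = fstar(model, X_test, covf = false)  # mean of the predictive distribution of f
μ, σ² = fstar(model, X_test)                # mean and variance
```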

Return the predicted class in {-1, 1} from a GP model via the logit link


Return the mean of the likelihood p(y=1|X,x) via the logit link of a GP model

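For the logit-link classifier, class labels and class probabilities might be obtained as below; the function names logitpredict and logitpredictproba are assumptions:

```julia
ŷ = logitpredict(model, X_test)       # predicted classes in {-1, 1}
p = logitpredictproba(model, X_test)  # mean of the likelihood p(y = 1 | X, x)
```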

Return the point estimate of the likelihood of class y=1 via the SVM likelihood


Return the likelihood of class y=1 via the SVM likelihood


Kernels

Radial Basis Function kernel, also called RBF or SE (Squared Exponential) kernel

Kernel functions

Create the covariance matrix between the matrices X1 and X2 with the covariance function kernel


Compute the covariance matrix of the matrix X; optionally compute only the diagonal terms


Compute the covariance matrix between the matrices X1 and X2 with the covariance function kernel, stored in the preallocated matrix K


Compute the covariance matrix of the matrix X in the preallocated matrix K; optionally compute only the diagonal terms

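A sketch of the kernel-matrix routines above; the names kernelmatrix/kernelmatrix!, the diag keyword, and the RBFKernel constructor signature are assumptions:

```julia
kernel = RBFKernel(0.5)                 # RBF/SE kernel with lengthscale 0.5
X1 = randn(100, 3); X2 = randn(40, 3)

K12 = kernelmatrix(X1, X2, kernel)      # 100×40 cross-covariance
K11 = kernelmatrix(X1, kernel)          # 100×100 covariance of X1

# In-place versions fill a preallocated matrix
K = Matrix{Float64}(undef, 100, 40)
kernelmatrix!(K, X1, X2, kernel)
```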

Return the variance of the kernel


Return the lengthscale of the IsoKernel


Return the lengthscales of the ARD Kernel

