Title: Nonparametric Regression via Variance-Adjusted Gradient Boosting Gaussian Process Regression
Authors: Hsin-Min Lu; Chen, J.-S.; Liao, W.-C.
Date issued: 2021
Date accessioned: 2022-04-26
Date available: 2022-04-26
Document type: journal article
ISSN: 1041-4347
DOI: 10.1109/TKDE.2019.2953728
Scopus EID: 2-s2.0-85098927817
Scopus URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85098927817&doi=10.1109%2fTKDE.2019.2953728&partnerID=40&md5=37bf161cc567dc630069097c73d518c3
Repository URL: https://scholars.lib.ntu.edu.tw/handle/123456789/608044

Abstract: Regression models have broad applications in data analytics. Gaussian process regression is a nonparametric regression model that learns nonlinear maps from input features to real-valued outputs using a kernel function that constructs the covariance matrix among all pairs of data. Gaussian process regression often performs well in various applications. However, the time complexity of Gaussian process regression is O(n³) for a training dataset of size n. The cubic time complexity hinders Gaussian process regression from scaling up to large datasets. Guided by the properties of Gaussian distributions, we developed a variance-adjusted gradient boosting algorithm for approximating a Gaussian process regression (VAGR). VAGR sequentially approximates the full Gaussian process regression model using the residuals computed from variance-adjusted predictions based on randomly sampled training subsets. VAGR has a time complexity of O(nm³) for a training dataset of size n and the chosen batch size m. The reduced time complexity allows us to apply VAGR to much larger datasets than the full Gaussian process regression. Our experiments suggest that VAGR has a prediction performance comparable to or better than models that include random forests, gradient boosting machines, support vector regression, and stochastic variational inference for Gaussian process regression. © 1989-2012 IEEE.

Keywords: big data; Gaussian process; gradient boosting; variance-adjusted prediction; covariance matrix; data analytics; decision trees; Gaussian noise (electronic); large dataset; stochastic models; stochastic systems; support vector machines; support vector regression; broad application; Gaussian process regression; Gaussian process regression model; nonparametric regression; prediction performance; training subsets; variational inference; Gaussian distribution; [SDGs] SDG15