# XGBoost

XGBoost is a gradient boosted trees algorithm that predicts a target variable accurately by combining the estimates of an ensemble of simpler, weaker models. It uses gradient descent to minimize the loss each time a new model is added to the ensemble.

XGBoost minimizes a regularized objective function that combines a convex loss function, based on the difference between the predicted and target outputs, with a penalty term on model complexity (in other words, on the regression tree functions that make up the ensemble).
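To make the boosting idea concrete, here is a minimal pure-Python sketch (not the XGBoost library itself): an additive model of decision stumps, where each new stump is fit to the current residuals, which for squared loss are the negative gradient of the loss. All names here are illustrative, and real XGBoost additionally uses second-order gradient information and the complexity penalty described above.

```python
# Minimal gradient boosting sketch with decision stumps on squared loss.
# Illustrative only: real XGBoost grows full trees, uses first- and
# second-order gradients, and regularizes tree complexity.

def fit_stump(x, residuals):
    """Find the single threshold split on x that best fits the residuals."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)    # prediction for x <= t
        rmean = sum(right) / len(right)  # prediction for x > t
        err = sum((r - (lmean if xi <= t else rmean)) ** 2
                  for xi, r in zip(x, residuals))
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda xi, t=t, l=lmean, r=rmean: l if xi <= t else r

def boost(x, y, rounds=20, lr=0.3):
    """Build an additive model: each stump fits the residual (the
    negative gradient of squared loss), and predictions are updated
    by a shrunken (learning-rate-scaled) step -- gradient descent in
    function space."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(lr * s(xi) for s in stumps)

# Toy data: learn y = x^2 from six points.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [xi ** 2 for xi in x]
model = boost(x, y)
mse = sum((yi - model(xi)) ** 2 for xi, yi in zip(x, y)) / len(y)
```

Each round strictly reduces the training loss, because the stump is a least-squares fit to the residuals and the learning rate keeps the step conservative; the ensemble of weak stumps ends up far more accurate than any single stump.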