Gradient boosted trees are an ensemble machine learning technique that combines many weak prediction models, typically shallow decision trees, into a single strong predictor. The trees are built sequentially to minimize a loss function: each new tree is fit to the negative gradient of the loss evaluated at the current ensemble's predictions, a form of gradient descent in function space. For squared-error loss the negative gradient is simply the residual, so each tree learns to correct the errors the ensemble has made so far. The combined model typically achieves far better predictive performance than a single decision tree; its key advantages are strong accuracy on tabular data and flexibility in the choice of loss function, though good results usually depend on tuning hyperparameters such as the learning rate, tree depth, and number of trees.
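The sequential fit-to-the-residuals loop can be sketched in a few lines. This is a minimal illustration, not a production implementation: it assumes squared-error loss (so the negative gradient equals the residual), a made-up toy dataset, and uses scikit-learn's `DecisionTreeRegressor` as the weak learner.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data (hypothetical): y = sin(x) + noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)

n_trees, learning_rate = 100, 0.1

# Start from a constant model: the mean minimizes squared error.
prediction = np.full_like(y, y.mean())
trees = []

for _ in range(n_trees):
    # For squared-error loss, the negative gradient of the loss with
    # respect to the current prediction is the residual y - F(x).
    residuals = y - prediction
    tree = DecisionTreeRegressor(max_depth=2)  # a weak learner
    tree.fit(X, residuals)
    # Shrink each tree's contribution by the learning rate (shrinkage).
    prediction += learning_rate * tree.predict(X)
    trees.append(tree)

mse = np.mean((y - prediction) ** 2)
```

The learning rate trades off against the number of trees: smaller steps need more rounds but generally regularize the ensemble better than taking each tree's full correction.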