Question: XGBoost is a scalable machine learning system for tree boosting. Its objective function has a training loss and a regularization term: L = ∑_i l(y_i, ŷ_i) + ∑_k Ω(f_k). Read the XGBoost paper and answer the following questions:
a. What is ŷ_i? At the t-th iteration, XGBoost fixes f_1, …, f_{t−1} and trains the t-th tree model f_t. How does XGBoost approximate the training loss l(y_i, ŷ_i) here?
b. What is Ω(f_k)? Which part of the regularization term needs to be considered at the t-th iteration?
a. The final prediction for a given example is the sum of the predictions of all the trees: ŷ_i = ∑_k f_k(x_i).
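Regarding how the training loss is approximated: at the t-th iteration XGBoost replaces l(y_i, ŷ_i^(t−1) + f_t(x_i)) with its second-order Taylor expansion around the fixed prediction ŷ_i^(t−1), using the gradient g_i and hessian h_i of the loss. A minimal sketch, using squared-error loss for concreteness (the function names here are illustrative, not part of the XGBoost API):

```python
# Second-order Taylor approximation of the training loss, as used by XGBoost:
# l(y, ŷ + f) ≈ l(y, ŷ) + g·f + 0.5·h·f², where g and h are the first and
# second derivatives of l with respect to the current prediction ŷ.

def squared_loss(y, pred):
    return (y - pred) ** 2

def grad(y, pred):
    # g = ∂l/∂ŷ for squared-error loss
    return 2.0 * (pred - y)

def hess(y, pred):
    # h = ∂²l/∂ŷ² for squared-error loss (constant)
    return 2.0

def taylor_approx(y, pred, f):
    g, h = grad(y, pred), hess(y, pred)
    return squared_loss(y, pred) + g * f + 0.5 * h * f ** 2

y, prev_pred, new_tree_output = 1.0, 0.3, 0.5
exact = squared_loss(y, prev_pred + new_tree_output)
approx = taylor_approx(y, prev_pred, new_tree_output)
print(exact, approx)  # 0.04 0.04 — exact here, since squared loss is quadratic
```

For a non-quadratic loss (e.g. logistic loss) the approximation is no longer exact, but it lets XGBoost optimize any twice-differentiable loss with the same tree-learning machinery, since only g_i and h_i enter the per-iteration objective.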
