Question: XGBoost is a scalable machine learning system for tree boosting. Its objective function has a training loss and a regularization term: $L = \sum_i l(y_i, \hat{y}_i) + \sum_k \Omega(f_k)$. Read the XGBoost paper and answer the following questions:

a. What is $\hat{y}_i$? At the $t$-th iteration, XGBoost fixes $f_1, \ldots, f_{t-1}$ and trains the $t$-th tree model $f_t$. How does XGBoost approximate the training loss $l(y_i, \hat{y}_i)$ here?

b. What is $\Omega(f_k)$? Which part of the regularization term needs to be considered at the $t$-th iteration?
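Before working through the solution, here is a minimal sketch (assuming the xgboost Python package; the synthetic data and parameter values are illustrative only, not part of the question) of where the two pieces of the objective surface in practice: the training loss is selected via the objective parameter, while the regularization term $\Omega$ is controlled by gamma (the per-leaf penalty) and reg_lambda (the L2 penalty on leaf weights).

import numpy as np
import xgboost as xgb

# Illustrative synthetic regression data (not from the question).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=500)

# scikit-learn style wrapper; each boosting round adds one tree f_t.
model = xgb.XGBRegressor(
    n_estimators=50,               # number of trees K
    objective="reg:squarederror",  # the training loss l(y_i, y_hat_i)
    gamma=1.0,                     # penalty per leaf: the gamma*T part of Omega(f)
    reg_lambda=1.0,                # L2 penalty on leaf weights: the lambda*||w||^2 part
    max_depth=3,
)
model.fit(X, y)
print(model.predict(X[:3]))        # y_hat_i is the sum of the trees' predictions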

Step by Step Solution

a. The final prediction $\hat{y}_i$ for a given example is the sum of the predictions of the individual trees.
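As a sketch of the remaining pieces asked about in (a) and (b), following the definitions in the XGBoost paper: the prediction is additive over the trees, and at the $t$-th iteration the loss is approximated by a second-order Taylor expansion around the previous prediction $\hat{y}_i^{(t-1)}$:

$$\hat{y}_i = \sum_{k=1}^{K} f_k(x_i), \qquad L^{(t)} = \sum_i l\!\left(y_i,\ \hat{y}_i^{(t-1)} + f_t(x_i)\right) + \Omega(f_t) \approx \sum_i \left[\, l\!\left(y_i, \hat{y}_i^{(t-1)}\right) + g_i\, f_t(x_i) + \tfrac{1}{2}\, h_i\, f_t^2(x_i) \right] + \Omega(f_t),$$

where $g_i$ and $h_i$ are the first and second derivatives of the loss with respect to $\hat{y}_i^{(t-1)}$, and $\Omega(f) = \gamma T + \tfrac{1}{2}\lambda \sum_{j=1}^{T} w_j^2$ penalizes the number of leaves $T$ and the leaf weights $w_j$ of a tree. At the $t$-th iteration only $\Omega(f_t)$ needs to be considered, since $f_1, \ldots, f_{t-1}$ are already fixed and their regularization terms are constants.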
