Question:

XGBoost is a scalable machine learning system for tree boosting. Its objective function has a training loss and a regularization term: $L = \sum_i l(y_i, \hat{y}_i) + \sum_k \Omega(f_k)$. Read the XGBoost paper and answer the following questions:

a. What is $\hat{y}_i$? At the $t$-th iteration, XGBoost fixes $f_1, \ldots, f_{t-1}$ and trains the $t$-th tree model $f_t$. How does XGBoost approximate the training loss $l(y_i, \hat{y}_i)$ here?

b. What is $\Omega(f_k)$? Which part of the regularization term needs to be considered at the $t$-th iteration?
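For reference, the quantities the two parts ask about can be written out in the notation of the XGBoost paper (Chen & Guestrin, 2016); the superscript $(t)$ and the symbols $g_i$, $h_i$, $\gamma$, $\lambda$, $T$, and $w$ below follow the paper's notation rather than the question text:

\[
\hat{y}_i^{(t)} = \sum_{k=1}^{t} f_k(x_i) = \hat{y}_i^{(t-1)} + f_t(x_i)
\]
\[
l\bigl(y_i,\, \hat{y}_i^{(t-1)} + f_t(x_i)\bigr) \approx l\bigl(y_i,\, \hat{y}_i^{(t-1)}\bigr) + g_i f_t(x_i) + \tfrac{1}{2} h_i f_t(x_i)^2,
\qquad g_i = \partial_{\hat{y}^{(t-1)}_i} l(y_i, \hat{y}^{(t-1)}_i),\quad h_i = \partial^2_{\hat{y}^{(t-1)}_i} l(y_i, \hat{y}^{(t-1)}_i)
\]
\[
\Omega(f) = \gamma T + \tfrac{1}{2} \lambda \lVert w \rVert^2
\]

Here $T$ is the number of leaves of a tree and $w$ its vector of leaf weights. Since $f_1, \ldots, f_{t-1}$ are fixed at the $t$-th iteration, only $\Omega(f_t)$ varies in the regularization term; the penalties of the earlier trees are constants.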


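A minimal numerical sketch of one boosting round under these formulas, assuming squared-error loss $l(y, \hat{y}) = \tfrac{1}{2}(y - \hat{y})^2$ and a single-leaf tree $f_t$; the toy data, the $\lambda$ value, and the single-leaf simplification are illustrative assumptions, not part of the question:

import numpy as np

# Toy labels and the fixed predictions \hat{y}^{(t-1)} from trees f_1, ..., f_{t-1}
y = np.array([1.0, 0.0, 3.0, 2.0])
yhat_prev = np.zeros_like(y)

# Squared-error loss l = 0.5 * (y - yhat)^2 gives g_i = yhat_i - y_i and h_i = 1
g = yhat_prev - y                 # first derivatives of the loss
h = np.ones_like(y)               # second derivatives of the loss

# With a single leaf holding every instance, minimizing the approximated objective
#   sum_i [g_i * w + 0.5 * h_i * w^2] + 0.5 * lam * w^2
# over the leaf weight w gives w* = -G / (H + lam).
lam = 1.0                         # L2 penalty on leaf weights from Omega(f_t)
w_star = -g.sum() / (h.sum() + lam)

yhat_new = yhat_prev + w_star     # \hat{y}^{(t)} = \hat{y}^{(t-1)} + f_t(x)
print(w_star, yhat_new)           # 1.2 and [1.2, 1.2, 1.2, 1.2]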

Related book: Data Mining: Concepts and Techniques, 4th Edition, by Jiawei Han, Jian Pei, and Hanghang Tong (ISBN: 9780128117613).
