Modelling Customer Churn using XGBoost

References

Load the libraries
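A minimal sketch of the imports this notebook relies on, assuming pandas, numpy, scikit-learn, xgboost and bayes_opt are installed:

import time

import numpy as np
import pandas as pd

from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, classification_report

import xgboost as xgb
from bayes_opt import BayesianOptimization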

Colab

Useful Scripts

Load the Data
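A sketch of loading the churn dataset; the file name churn.csv is a placeholder for the actual data file used in the notebook:

# Hypothetical file name; replace with the actual churn dataset.
df = pd.read_csv("churn.csv")

print(df.shape)
df.head()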

Data Processing
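A typical processing step for a churn table is to encode the target and the categorical features. The column names and label values below (a "Churn" column storing Yes/No) are assumptions for illustration, not the notebook's actual schema:

# Assumed target column and label values, for illustration only.
target_col = "Churn"
df[target_col] = df[target_col].map({"No": 0, "Yes": 1})

# One-hot encode the remaining categorical columns.
cat_cols = df.drop(columns=[target_col]).select_dtypes(include="object").columns
df = pd.get_dummies(df, columns=list(cat_cols), drop_first=True)

X = df.drop(columns=[target_col])
y = df[target_col]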

Train Validation Split
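A standard stratified train/validation split with scikit-learn; the 80/20 ratio and the random seed are assumptions:

X_train, X_valid, y_train, y_valid = train_test_split(
    X, y,
    test_size=0.2,     # assumed hold-out fraction
    stratify=y,        # keep the churn rate similar in both sets
    random_state=42,
)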

Modelling: XGBoost
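A baseline XGBoost classifier as a sketch; the hyperparameter values are placeholders that the Bayesian optimization below is meant to tune:

clf = xgb.XGBClassifier(
    n_estimators=500,
    max_depth=5,
    learning_rate=0.05,
    subsample=0.8,
    colsample_bytree=0.8,
)
clf.fit(X_train, y_train)

# Quick check of the untuned model on the validation set.
valid_pred = clf.predict_proba(X_valid)[:, 1]
print("Validation AUC:", roc_auc_score(y_valid, valid_pred))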

HPO: bayes_opt

The signature of the bayes_opt BayesianOptimization.maximize method (as reported by help()) and the relevant parts of its docstring:

maximize(
    init_points       = 5,
    n_iter            = 25,
    acq               = 'ucb',
    kappa             = 2.576,
    kappa_decay       = 1,
    kappa_decay_delay = 0,
    xi                = 0.0,
    **gp_params)

acq: {'ucb', 'ei', 'poi'}
    The acquisition method used.
        * 'ucb' stands for the Upper Confidence Bound method.
        * 'ei' is the Expected Improvement method.
        * 'poi' is the Probability Of Improvement criterion.

kappa: float, optional (default=2.576)
    Controls how close to the current best the next parameters are sampled
    (the exploration/exploitation trade-off).
        Higher value = favors regions that are least explored.
        Lower value = favors regions where the regression (surrogate) function is highest.





If we use early_stopping_rounds, the fitted model exposes
clf.best_score, clf.best_iteration and clf.best_ntree_limit.

NOTE: Here, if I use early_stopping_rounds, the Bayesian optimization fails.
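Putting it together, a sketch of how bayes_opt (the 1.x API whose signature is shown above) can drive the XGBoost hyperparameters; the search bounds are illustrative, and early_stopping_rounds is deliberately left out because of the issue noted above:

def xgb_valid_auc(max_depth, learning_rate, subsample, colsample_bytree):
    """Objective for BayesianOptimization: validation AUC of an XGBoost model."""
    model = xgb.XGBClassifier(
        n_estimators=300,
        max_depth=int(max_depth),        # bayes_opt passes floats, so cast to int
        learning_rate=learning_rate,
        subsample=subsample,
        colsample_bytree=colsample_bytree,
    )
    model.fit(X_train, y_train)
    pred = model.predict_proba(X_valid)[:, 1]
    return roc_auc_score(y_valid, pred)

# Illustrative search bounds.
pbounds = {
    "max_depth":        (3, 10),
    "learning_rate":    (0.01, 0.3),
    "subsample":        (0.6, 1.0),
    "colsample_bytree": (0.6, 1.0),
}

optimizer = BayesianOptimization(f=xgb_valid_auc, pbounds=pbounds, random_state=42)
optimizer.maximize(init_points=5, n_iter=25, acq="ucb", kappa=2.576)

print(optimizer.max)   # best AUC and the corresponding hyperparameters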

Model Evaluation
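A sketch of evaluating the final fitted model (clf) on the validation set with AUC and a classification report; the 0.5 decision threshold is an assumption:

valid_proba = clf.predict_proba(X_valid)[:, 1]
valid_label = (valid_proba >= 0.5).astype(int)   # assumed decision threshold

print("Validation AUC :", roc_auc_score(y_valid, valid_proba))
print(classification_report(y_valid, valid_label, digits=3))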

Time Taken
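A simple way to report the notebook's total runtime, assuming a start timestamp is recorded near the top:

start_time = time.time()   # recorded at the start of the notebook

# ... the rest of the notebook runs here ...

elapsed = time.time() - start_time
print(f"Time taken: {elapsed / 60:.2f} minutes")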
