Impress us!
We are really proud of you: you have reached the end!
"End? No, the journey doesn't end here. XGBoost is just another path... One that we all must take."
~ Gandalf, probably... if he knew data science
So in that sense, we encourage you to go on even further.
There are still numerous algorithms we have not covered so far.
If you do not know where to find one, it always helps to look at the scikit-learn website when you are stuck on a machine learning problem. Its overview of algorithms can point you to other methods for classification.
If you are using R, do not worry: most algorithms listed on the scikit-learn website are also available in R. The page is only meant for inspiration, as it offers one of the best overviews of existing algorithms; you do not have to do any programming in Python.
The possibilities are overwhelming, so if you cannot decide on a specific algorithm, you can use XGBoost, short for extreme gradient boosting: a powerful boosting method that is regularly used by winners of Kaggle competitions. To learn more about it, visit the XGBoost documentation.
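To give a feel for what a gradient-boosting workflow looks like, here is a minimal sketch. It uses scikit-learn's `GradientBoostingClassifier` as a stand-in (the same boosting idea, without requiring the separate `xgboost` package); XGBoost's own `XGBClassifier` follows the same fit/score interface. The dataset and parameter values are illustrative choices, not recommendations.

```python
# Minimal gradient-boosting sketch, assuming scikit-learn is installed.
# XGBoost's XGBClassifier can be dropped in here with the same API.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Load a small built-in classification dataset.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit an ensemble of shallow trees, each correcting the previous ones.
clf = GradientBoostingClassifier(
    n_estimators=100,   # number of boosting rounds
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
    max_depth=3,        # depth of the individual trees
    random_state=0,
)
clf.fit(X_train, y_train)

# Accuracy on the held-out test split.
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

Swapping in `xgboost.XGBClassifier` (once the package is installed) only changes the import and constructor; `fit` and `score` work the same way.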
You can learn more about the R implementation in the documentation of the xgboost R package.