A lightweight decision tree framework for Python supporting the standard algorithms (ID3, C4.5, CART, CHAID and regression trees) as well as some advanced techniques (gradient boosting, random forest and AdaBoost), with support for categorical features.
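For orientation, here is a minimal analogous sketch of training a CART-style tree with scikit-learn; it is not this framework's own API, and the column names and toy data are made up for illustration.

```python
# Analogous sketch only: a CART-style tree via scikit-learn, not this framework's API.
# Column names and data are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Tiny toy dataset with one categorical feature, one-hot encoded
# (scikit-learn trees need numeric inputs).
df = pd.DataFrame({
    "outlook": ["sunny", "rain", "overcast", "sunny", "rain"],
    "humidity": [85, 70, 65, 90, 80],
    "play": [0, 1, 1, 0, 1],
})
X = pd.get_dummies(df[["outlook", "humidity"]], columns=["outlook"])
y = df["play"]

clf = DecisionTreeClassifier(criterion="gini", max_depth=3)  # Gini impurity -> CART-style splits
clf.fit(X, y)
print(clf.predict(X))
```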
This is a Docker container based on the open source framework XGBoost (https://xgboost.readthedocs.io/en/latest/) that allows customers to use their own XGBoost scripts in SageMaker.
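A bring-your-own script for such a container is typically a plain XGBoost training program; the sketch below shows the general shape, but the input path, label column, and hyperparameters are assumptions rather than the container's actual contract.

```python
# Minimal sketch of a standalone XGBoost training script; file names, the label
# column, and hyperparameters are assumptions, not the container's interface.
import pandas as pd
import xgboost as xgb

df = pd.read_csv("train.csv")            # hypothetical training file
y = df.pop("label")                      # hypothetical label column
dtrain = xgb.DMatrix(df, label=y)

params = {"objective": "binary:logistic", "max_depth": 4, "eta": 0.2}
booster = xgb.train(params, dtrain, num_boost_round=100)
booster.save_model("xgboost-model.json")  # model artifact the serving side would load
```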
A Python package which implements several boosting algorithms with different combinations of base learners, optimization algorithms, and loss functions.
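To make the "base learner + loss function" combination concrete, here is a small from-scratch gradient-boosting loop with squared-error loss and tree stumps; it illustrates the idea only and is not this package's API.

```python
# Illustrative sketch of the idea (base learner + loss + additive updates),
# not this package's API: gradient boosting with squared-error loss and stumps.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boosting(X, y, n_rounds=50, lr=0.1):
    pred = np.full(len(y), y.mean())        # initial constant model
    learners = []
    for _ in range(n_rounds):
        residual = y - pred                 # negative gradient of squared-error loss
        stump = DecisionTreeRegressor(max_depth=1).fit(X, residual)
        pred += lr * stump.predict(X)
        learners.append(stump)
    return y.mean(), learners

def predict(base, learners, X, lr=0.1):
    return base + lr * sum(l.predict(X) for l in learners)

X = np.random.rand(200, 3)
y = 3 * X[:, 0] + np.sin(5 * X[:, 1])
base, learners = fit_gradient_boosting(X, y)
print(np.mean((predict(base, learners, X) - y) ** 2))  # training MSE
```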
LightGBM + Optuna: auto-train LightGBM models directly from CSV files, auto-tune them with Optuna, and auto-serve the best model with FastAPI. Inspired by Abhishek Thakur's AutoXGB.
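The underlying pattern is tuning LightGBM hyperparameters with an Optuna study; the sketch below shows that pattern directly, using a built-in scikit-learn dataset and a made-up search space rather than this package's CLI or internals.

```python
# Sketch of the LightGBM + Optuna tuning pattern; dataset and search space are
# assumptions for illustration, not this package's defaults.
import optuna
import lightgbm as lgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(trial):
    params = {
        "num_leaves": trial.suggest_int("num_leaves", 8, 64),
        "learning_rate": trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        "n_estimators": trial.suggest_int("n_estimators", 50, 300),
    }
    model = lgb.LGBMClassifier(**params)
    return cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```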
This provides a Python-based response generator for the Gamma-ray Burst Monitor (GBM). Additionally, a 3ML plugin is provided which allows for the simultaneous fitting of GRB locations and spectra.
This is my entry for the Titanic competition on Kaggle. May 2019: the public score is 0.80382, a top 10% ranking on the leaderboard of around 11,249 participants.
A web system for applying models of mortality risk and hospital severity associated with different medical conditions in hospitalized patients.
A machine learning-based model that predicts English Premier League standings using historical match data, player stats, and team performance metrics. The project leverages Random Forest for match outcome predictions and is deployed with Streamlit for interactive analysis.
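The core modelling step can be pictured as below; the feature names, data file, and target column are hypothetical stand-ins, and the Streamlit front end is not shown.

```python
# Sketch of the core Random Forest step with hypothetical column names;
# the project's actual features, data files, and Streamlit app are not shown.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

matches = pd.read_csv("matches.csv")          # hypothetical historical match data
features = ["home_form", "away_form", "home_goals_avg", "away_goals_avg"]
X_train, X_test, y_train, y_test = train_test_split(
    matches[features], matches["result"], test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=300, random_state=42)
model.fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))
```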