1. The document describes research using tree-based machine learning methods, namely Random Forest, CatBoost, LightGBM, and XGBoost, to predict the critical temperature of superconductors.
2. These models were trained and tested on a dataset of 21,263 superconductors using k-fold cross-validation. Random Forest, CatBoost, LightGBM, and XGBoost achieved root-mean-squared errors of 9.05 K, 8.95 K, 8.86 K, and 8.85 K, respectively, showing that they can effectively predict critical temperature.
3. LightGBM and XGBoost performed best, with XGBoost having the lowest error of 8.853 K, slightly outperforming LightGBM.
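The evaluation setup described above — a tree-based regressor scored with k-fold cross-validation and RMSE in kelvin — can be sketched as follows. This is a minimal illustration, not the authors' code: it uses scikit-learn's Random Forest and synthetic placeholder data standing in for the 21,263-superconductor feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

# Synthetic placeholder data standing in for the superconductor
# dataset (rows = materials, columns = features, target = Tc in K).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = 20.0 + 10.0 * X[:, 0] + rng.normal(scale=2.0, size=500)

model = RandomForestRegressor(n_estimators=100, random_state=0)
cv = KFold(n_splits=5, shuffle=True, random_state=0)

# scikit-learn reports negated RMSE, so negate to get RMSE in kelvin.
rmse_per_fold = -cross_val_score(
    model, X, y, cv=cv, scoring="neg_root_mean_squared_error"
)
print(f"mean CV RMSE: {rmse_per_fold.mean():.2f} K")
```

Swapping `RandomForestRegressor` for CatBoost, LightGBM, or XGBoost regressors while keeping the same `cv` splitter gives a like-for-like comparison of the four models.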