Wolfram Language Paclet Repository
Community-contributed installable additions to the Wolfram Language
This paclet provides an interface to the XGBoost algorithm from the Wolfram Language
Contributed by: Mike Yeh
In this paclet, we provide Wolfram Language functions that call XGBoost Python functions such as xgb.DMatrix(), xgb.train(), and predict(). So far, XgbTrain[] wraps xgb.train() and XgbModelPredict[] performs model prediction; more functions will be added later.
To install this paclet in your Wolfram Language environment, evaluate this code:
PacletInstall["MikeYeh/XGBPaclet"]
To load the code after installation, evaluate this code:
Needs["MikeYeh`XGBPaclet`"]
An XGBoost Python session is required before using XgbTrain[]. The following code shows how to create one:
In[1]:= (cell image not shown)
Out[1]= (cell image not shown)
In[2]:= (cell image not shown)
The last line of code imports the XGBoost package into the Python session. The package must be imported under the name xgb.
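The session-creation cells above are rendered as images on the repository page. As a sketch of what they likely contain (the variable name session is an assumption), a Python session can be started with StartExternalSession and xgboost imported via ExternalEvaluate:

```wolfram
(* Start an external Python session; this assumes the xgboost
   package is installed in the Python environment that the
   Wolfram Language discovers *)
session = StartExternalSession["Python"]

(* Import the package under the name xgb, as the paclet requires *)
ExternalEvaluate[session, "import xgboost as xgb"]
```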
Create a training set and a validation set for the following examples
The following code generates a training set and a validation set as plain numerical arrays or lists:
In[3]:= (cell image not shown)
In[4]:= (cell image not shown)
In[5]:= (cell image not shown)
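The data-generation cells are also not reproduced here. As an illustrative stand-in (the shapes, names, and packaging are assumptions, not the paclet's documented format), synthetic numeric data could be built like this:

```wolfram
(* Hypothetical synthetic binary-classification data as plain lists *)
trainX = RandomReal[1, {100, 4}];  (* 100 rows, 4 features *)
trainY = RandomInteger[1, 100];    (* 0/1 labels *)
validX = RandomReal[1, {20, 4}];
validY = RandomInteger[1, 20];
trainset = {trainX, trainY};       (* assumed packaging for XgbTrain[] *)
```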
Use XgbTrain[] to train a model from the training set within the XGBoost Python session; the trained model is stored in the output session:
In[6]:= (cell image not shown)
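The XgbTrain[] call itself is an image on the repository page, and its exact signature is not shown in this text. As a hedged sketch, the equivalent training can be performed directly in the Python session with the xgb.DMatrix() and xgb.train() calls that XgbTrain[] wraps (variable names trainX and trainY are assumptions; the model is stored in the session under the name "model" expected by XgbModelPredict[]):

```wolfram
(* Equivalent of what XgbTrain[] does under the hood, expressed
   directly via ExternalEvaluate with <* ... *> templating *)
ExternalEvaluate[session, "
import numpy as np
dtrain = xgb.DMatrix(np.array(<* trainX *>), label=np.array(<* trainY *>))
model = xgb.train({'objective': 'binary:logistic'}, dtrain, num_boost_round=10)
"]
```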
XgbModelPredict[] uses the given session and the default model name "model" to predict on the test data. Note that the XGBoost Python session must already contain the trained "model":
In[7]:= (cell image not shown)
Out[7]= (cell image not shown)
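Since the prediction cell is likewise an image, here is a hedged sketch of the equivalent prediction done directly in the Python session (validX is an assumed name for the test data; model is assumed to exist in the session from the training step):

```wolfram
(* Equivalent of XgbModelPredict[]: wrap the test data in a DMatrix
   and call model.predict(); .tolist() returns a plain WL list *)
predictions = ExternalEvaluate[session, "
import numpy as np
dtest = xgb.DMatrix(np.array(<* validX *>))
model.predict(dtest).tolist()
"]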
Change the number of XGBoost boosting rounds ("numBoostRound") to 2:
In[8]:= (cell image not shown)
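The option syntax here is an assumption, since the cell is not reproduced. If "numBoostRound" is passed as an option to XgbTrain[] (mapping to the num_boost_round parameter of xgb.train()), the call might look like:

```wolfram
(* Assumed option syntax; trainset is the assumed {features, labels}
   packaging from the data-generation step *)
session = XgbTrain[session, trainset, "numBoostRound" -> 2]
```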
Wolfram Language Version 14.1