Wolfram Language Paclet Repository
Community-contributed installable additions to the Wolfram Language
This paclet makes the XGBoost algorithm available in the Wolfram Language
Contributed by: Mike Yeh
In this paclet, we provide Wolfram Language functions that wrap XGBoost Python functions such as xgb.DMatrix(), xgb.train(), and predict(). So far, XgbTrain[] implements xgb.train() and XgbModelPredict[] performs model prediction; more functions will be added later.
To install this paclet in your Wolfram Language environment,
evaluate this code:
PacletInstall["MikeYeh/XGBPaclet"]
To load the code after installation, evaluate this code:
Needs["MikeYeh`XGBPaclet`"]
An XGBoost Python session is recommended before using XgbTrain[]. The following code demonstrates the minimum packages that need to be installed before creating a Python session:
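As a sketch, the prerequisite Python packages can be installed from within the Wolfram Language; the package list here (xgboost and numpy) is an assumption based on typical XGBoost usage, not the paclet's documented requirements:

```wolfram
(* assumption: a pip-based Python installation is on the PATH *)
ExternalEvaluate["Shell", "pip install xgboost numpy"]
```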
To create an XGBoost Python session:
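A minimal sketch of creating the session with the built-in StartExternalSession; the session value is then passed to the paclet's functions:

```wolfram
(* start a Python external session for the paclet to use *)
session = StartExternalSession["Python"]
```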
Within the session, our functions automatically import xgboost and the other required packages.
Create a training set and a validation set for the following examples
The following code generates a training set and a validation set as purely numerical arrays or lists:
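For illustration, a hypothetical regression dataset could be generated like this; the variable names and shapes are assumptions for the sketches below, not the paclet's API:

```wolfram
(* hypothetical data: 100 training rows and 20 validation rows, 5 features each *)
trainData = RandomReal[1, {100, 5}];
trainLabels = RandomReal[1, 100];
validData = RandomReal[1, {20, 5}];
validLabels = RandomReal[1, 20];
```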
Use XgbTrain[] to train a model with the training set and the XGBoost Python session, and store the trained model in the output session:
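A sketch of the training call; the exact argument order of XgbTrain is an assumption based on the description above:

```wolfram
(* assumed signature: trains with default settings and stores the
   trained model in the session under the default name "model" *)
results = XgbTrain[trainData, trainLabels, session]
```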
Store all the information about the trained model in an XGBTrainResultsObject[]:
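By analogy with other Wolfram Language results objects (such as ClassifierMeasurementsObject), the stored information might be queried by property name; this access pattern is an assumption:

```wolfram
(* hypothetical property listing on the results object *)
results["Properties"]
```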
XgbModelPredict[] uses the given session and the default model name "model" to predict on the test data. Note that the XGBoost Python session must contain the trained "model":
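A sketch of prediction; the argument order is an assumption, and the session must be the one holding the trained "model":

```wolfram
(* predict on the validation data using the model stored in the session *)
predictions = XgbModelPredict[validData, session]
```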
Change the number of XGBoost boosting rounds, "numBoostRound", to 2:
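A sketch of passing the option; the name "numBoostRound" comes from the text above, while the option-passing style is an assumption:

```wolfram
(* retrain with only 2 boosting rounds *)
results2 = XgbTrain[trainData, trainLabels, session, "numBoostRound" -> 2]
```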
Wolfram Language Version 14.1