This morning I saw a WeChat public-account post comparing the principles of XGBoost, LightGBM, and CatBoost, so I looked up the algorithm and its interface usage online. At the moment CatBoost only provides Python, R, and C/C++ interfaces; I did not find a Java interface. The algorithm is still fairly new, so below is a simple walk-through of calling it from Python. Nothing technically deep: model training, saving, and loading are all covered, and it appears a serving service is also provided. For an explanation of the interface parameters, refer to this article: http://blog.csdn.net/aiirrrryee/article/details/78224232


Examples from the official site:

https://tech.yandex.com/catboost/doc/dg/concepts/installation-docpage/

Simple classification and regression examples:
import numpy as np
from catboost import CatBoostClassifier, CatBoostRegressor, Pool

######################## classification ########################
train_data = np.random.randint(0, 100, size=(100, 10))
train_label = np.random.randint(0, 2, size=(100,))
test_data = np.random.randint(0, 100, size=(50, 10))

model = CatBoostClassifier(iterations=2, depth=2, learning_rate=1,
                           loss_function='Logloss', logging_level='Verbose')
# columns 0, 2 and 5 are treated as categorical features
model.fit(train_data, train_label, cat_features=[0, 2, 5])

preds_class = model.predict(test_data)
preds_proba = model.predict_proba(test_data)
print("class=", preds_class)
print("proba=", preds_proba)
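The intro mentions that model saving and loading are supported but the example above does not show it. A minimal sketch using CatBoost's save_model/load_model (the file name and the Silent logging level are my choices, not from the original post):

import numpy as np
from catboost import CatBoostClassifier

train_data = np.random.randint(0, 100, size=(100, 10))
train_label = np.random.randint(0, 2, size=(100,))

model = CatBoostClassifier(iterations=2, depth=2, learning_rate=1,
                           loss_function='Logloss', logging_level='Silent')
model.fit(train_data, train_label, cat_features=[0, 2, 5])

# save the trained model to disk (CBM is CatBoost's native binary format)
model.save_model('catboost_model.cbm')

# load it back into a fresh estimator and check the predictions match
loaded = CatBoostClassifier()
loaded.load_model('catboost_model.cbm')
same = (model.predict(train_data) == loaded.predict(train_data)).all()
print(same)
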
######################## regression ########################
train_data = np.random.randint(0, 100, size=(100, 10))
train_label = np.random.randint(0, 1000, size=(100,))
test_data = np.random.randint(0, 100, size=(50, 10))

# initialize Pool
train_pool = Pool(train_data, train_label, cat_features=[0, 2, 5])
test_pool = Pool(test_data, cat_features=[0, 2, 5])

# specify the training parameters
model = CatBoostRegressor(iterations=2, depth=2, learning_rate=1,
                          loss_function='RMSE')

# train the model
model.fit(train_pool)

# make the prediction using the resulting model
preds = model.predict(test_pool)
print(preds)
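After fitting, the model can also be inspected. A quick sketch using get_feature_importance to get one score per input column (the Silent logging level is my choice, not from the original post):

import numpy as np
from catboost import CatBoostRegressor, Pool

train_data = np.random.randint(0, 100, size=(100, 10))
train_label = np.random.randint(0, 1000, size=(100,))
train_pool = Pool(train_data, train_label, cat_features=[0, 2, 5])

model = CatBoostRegressor(iterations=2, depth=2, learning_rate=1,
                          loss_function='RMSE', logging_level='Silent')
model.fit(train_pool)

# one importance score per feature column (10 here)
importances = model.get_feature_importance(train_pool)
print(len(importances))
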