Author: 时晴
The previous post, 《深恶痛绝的超参》, covered plenty of practical tuning techniques. Today we look at a more entertaining tuning method: using ML to tune an ML model, that is, using a model we know well to tune a model we know well. Sounds dizzying? Let's see how XGBoost can tune XGBoost.
Model-based HP Tuning
The idea behind model-based tuning is simple: we need something to guide the hyperparameter search toward the best result. Training sets are large these days, a single training run is expensive, and the space of configuration combinations is often huge. So why not learn an estimator that scores a given configuration directly? Every completed training run then helps point the search in a promising direction.
Model-based hyperparameter optimization boils down to the following loop (a minimal sketch follows the list):
1. Sample candidate configurations from the parameter space.
2. Evaluate a few of them by actually training the model.
3. Fit an estimator that predicts the score from the configuration.
4. Sample a batch of candidates, rank them with the estimator, and spend the next real training run on the most promising one.
5. Repeat until the budget is exhausted.
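Here is a runnable toy version of that loop; the function f, the search interval, and the surrogate settings are all illustrative assumptions of ours, not from the original post:

import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)

def f(x):
    # stand-in for the expensive "real" score; higher is better
    return -(x - 3.0) ** 2

xs, ys = [], []
for step in range(15):
    if len(xs) < 2:
        # not enough data for the surrogate yet: sample at random
        x = rng.uniform(0, 10)
    else:
        # fit the surrogate on the (config, score) pairs seen so far
        surrogate = xgb.XGBRegressor(n_estimators=20)
        surrogate.fit(np.array(xs).reshape(-1, 1), np.array(ys))
        # intensification: rank random candidates with the surrogate
        cands = rng.uniform(0, 10, size=50)
        x = cands[np.argmax(surrogate.predict(cands.reshape(-1, 1)))]
    xs.append(x)
    ys.append(f(x))  # one expensive real evaluation per iteration
print(max(ys))       # should land close to 0, attained near x = 3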
Sampling the Parameter Space
How do we sample from the parameter space? There is already an off-the-shelf library for this:
ConfigSpace: https://automl.github.io/ConfigSpace/master/
>>> import ConfigSpace as CS
>>> import ConfigSpace.hyperparameters as CSH
>>> cs = CS.ConfigurationSpace(seed=1234)
>>> a = CSH.UniformIntegerHyperparameter('a', lower=10, upper=100, log=False)
>>> b = CSH.CategoricalHyperparameter('b', choices=['red', 'green', 'blue'])
>>> cs.add_hyperparameters([a, b])
[a, Type: UniformInteger, Range: [10, 100], Default: 55,...]
>>> cs.sample_configuration()
Configuration:
  a, Value: 27
  b, Value: 'blue'
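The optimizer below will need many configurations at once, as a DataFrame. sample_configuration accepts a size, and in recent ConfigSpace versions a Configuration behaves like a mapping (on older versions, use cfg.get_dictionary() instead of dict(cfg)). Continuing from the snippet above, the conversion is a one-liner:

import pandas as pd
cfgs = cs.sample_configuration(5)                # a list of Configurations when size > 1
dtf = pd.DataFrame([dict(cfg) for cfg in cfgs])  # one row per configuration
print(dtf)                                       # columns 'a' and 'b'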
"我"调"我"自己
Gaussian processes were the original choice of estimator for this kind of tuning, but recent work shows that tree models are also well suited to the job. On top of that, Gaussian processes have no native support for categorical features, which makes XGBoost a natural choice of estimator.
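As a minimal sketch of that last point (our own illustration, not from the original post): recent XGBoost versions can consume a pandas 'category' column directly when enable_categorical=True with tree_method='hist'; on older versions you would one-hot encode with pd.get_dummies instead.

import numpy as np
import pandas as pd
import xgboost as xgb

# three (configuration, score) pairs, with one categorical hyperparameter
dtf = pd.DataFrame({'a': [27, 63, 41], 'b': ['blue', 'red', 'green']})
dtf['b'] = dtf['b'].astype('category')  # mark the column as categorical
scores = np.array([0.81, 0.77, 0.84])

est = xgb.XGBRegressor(tree_method='hist', enable_categorical=True)
est.fit(dtf, scores)
print(est.predict(dtf))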
With that settled, let's build the hyperparameter optimizer:
import pandas as pd
import numpy as np


class Optimizer:
    """
    This class optimizes an algorithm/model configuration with respect to a given score.
    """

    def __init__(self, algo_score, max_iter, max_intensification, model, cs):
        """
        :param algo_score: the function called to evaluate the algorithm/model score
        :param max_iter: the maximal number of trainings to perform
        :param max_intensification: the number of candidate configurations to sample randomly per iteration
        :param model: the class of the internal model used as score estimator
        :param cs: the configuration space to explore
        """
        self.algo_score = algo_score  # scores a configuration by training the real model
        self.max_iter = max_iter  # number of iterations; change the stopping criterion as needed
        self.max_intensification = max_intensification  # random candidates sampled per iteration
        self.internal_model = model()  # estimator that predicts a configuration's score
        self.trajectory = []  # configuration chosen at each step
        self.cfgs = []  # every configuration evaluated so far
        self.scores = {}  # configuration -> observed score
        self.best_cfg = None
        self.best_score = None
        self.cs = cs

    def cfg_to_dtf(self, cfgs):
        """
        Convert a list of configurations into a pandas DataFrame to ease learning.
        """
        cfgs = [dict(cfg) for cfg in cfgs]
        dtf = pd.DataFrame(cfgs)
        return dtf

    def optimize(self):
        """
        Optimize the algo/model using the internal score estimator.
        """
        # initial run on a random configuration
        cfg = self.cs.sample_configuration()
        self.cfgs.append(cfg)
        self.trajectory.append(cfg)
        score = self.algo_score(cfg)
        self.scores[cfg] = score
        self.best_cfg = cfg
        self.best_score = score
        dtf = self.cfg_to_dtf(self.cfgs)
        for i in range(0, self.max_iter):
            # we need at least two data points to train the internal estimator
            if dtf.shape[0] > 1:
                scores = np.array([val for key, val in self.scores.items()])
                self.internal_model.fit(dtf, scores)
                # intensification: sample candidates, keep the one the estimator likes best
                candidates = [self.cs.sample_configuration() for i in range(0, self.max_intensification)]
                candidate_scores = [self.internal_model.predict(self.cfg_to_dtf([cfg])) for cfg in candidates]
                best_candidate = np.argmax(candidate_scores)
                cfg = candidates[best_candidate]
                self.cfgs.append(cfg)
                score = self.algo_score(cfg)
                self.scores[cfg] = score
                if score > self.best_score:
                    self.best_cfg = cfg
                    self.best_score = score
                self.trajectory.append(cfg)
                dtf = self.cfg_to_dtf(self.cfgs)
                self.internal_model.fit(dtf,
                                        np.array([val for key, val in self.scores.items()]))
            else:
                # not enough data yet: fall back to a purely random configuration
                cfg = self.cs.sample_configuration()
                self.cfgs.append(cfg)
                score = self.algo_score(cfg)
                self.scores[cfg] = score
                if score > self.best_score:
                    self.best_cfg = cfg
                    self.best_score = score
                self.trajectory.append(cfg)
                dtf = self.cfg_to_dtf(self.cfgs)
Swap algo_score for a function that trains and scores the XGBoost model you want to tune, pass the XGBoost class in as the internal model, and you get automatic parameter search. What are you waiting for? Go give it a try!
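As a concrete illustration, here is a hypothetical end-to-end run: the toy dataset, the three-parameter search space, and the cross-validated R² score are all our own assumptions, not from the original post.

import ConfigSpace as CS
import ConfigSpace.hyperparameters as CSH
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, random_state=0)

cs = CS.ConfigurationSpace(seed=1234)
cs.add_hyperparameters([
    CSH.UniformIntegerHyperparameter('max_depth', lower=2, upper=10),
    CSH.UniformFloatHyperparameter('learning_rate', lower=0.01, upper=0.3),
    CSH.UniformIntegerHyperparameter('n_estimators', lower=50, upper=300),
])

def algo_score(cfg):
    # higher is better: mean cross-validated R² under this configuration
    model = xgb.XGBRegressor(**dict(cfg))
    return cross_val_score(model, X, y, cv=3).mean()

opt = Optimizer(algo_score=algo_score,
                max_iter=20,
                max_intensification=50,
                model=xgb.XGBRegressor,  # XGBoost tuning XGBoost
                cs=cs)
opt.optimize()
print(opt.best_cfg, opt.best_score)

Keep in mind that every call to algo_score is a full cross-validated training run, so keep max_iter modest; max_intensification is cheap by comparison, since it only costs estimator predictions.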
Reference:
https://towardsdatascience.com/tuning-xgboost-with-xgboost-writing-your-own-hyper-parameters-optimization-engine-a593498b5fba