How can I implement BERT with Hugging Face Transformers in Rasa, and what is needed to run a BERT model in Rasa?
recipe: default.v1
# Configuration for Rasa NLU.
# https://rasa.com/docs/rasa/nlu/components/
language: en
pipeline:
  # how to implement this BERT in rasa
  - name: HFTransformersNLP
    model_weights: "bert-base-uncased"
    model_name: "bert"
  - name: LanguageModelTokenizer
  - name: LanguageModelFeaturizer
  - name: DIETClassifier
    epochs: 200
Posted on 2022-01-12 13:06:53
This error is likely due to the Rasa version you are using (check the output of rasa --version). In current versions (> 2.1), HFTransformersNLP and LanguageModelTokenizer are deprecated. A BERT model can be used with any tokenizer together with the following featurizer:
pipeline:
  - name: LanguageModelFeaturizer
    model_name: "bert"
    model_weights: "rasa/LaBSE"
See the documentation for more details.
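As a fuller illustration, below is a minimal sketch of a complete config.yml in the non-deprecated style, assuming Rasa 3.x with the transformers extra installed (pip install rasa[transformers]). The WhitespaceTokenizer and the "bert-base-uncased" weights are assumptions for the sketch, not taken from the answer above; any compatible tokenizer or Hugging Face checkpoint (such as rasa/LaBSE) can be substituted.

# Minimal sketch of a config.yml (assumed: Rasa 3.x, rasa[transformers] installed).
# HFTransformersNLP and LanguageModelTokenizer are no longer used; a regular
# tokenizer plus LanguageModelFeaturizer supplies the BERT features that
# DIETClassifier consumes.
recipe: default.v1
language: en
pipeline:
  - name: WhitespaceTokenizer           # any tokenizer works here
  - name: LanguageModelFeaturizer
    model_name: "bert"                  # model architecture family
    model_weights: "bert-base-uncased"  # assumed Hugging Face checkpoint
  - name: DIETClassifier
    epochs: 200

With a configuration like this, rasa train nlu should download the chosen weights from Hugging Face on first run, and DIETClassifier is trained on the dense features produced by LanguageModelFeaturizer.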
https://stackoverflow.com/questions/70578679