I installed Triton Inference Server using Docker:
docker run --gpus=1 --rm -p8000:8000 -p8001:8001 -p8002:8002 -v /mnt/data/nabil/triton_server/models:/models nvcr.io/nvidia/tritonserver:22.08-py3 tritonserver --model-repository=/models
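For the server to load anything, the mounted /models directory has to follow Triton's model-repository layout. A minimal sketch, assuming the model name ecapatdnn_bangasianeng used below (the directory name must match the name in config.pbtxt, and "1" is the version directory holding the TorchScript file, which Triton expects to be called model.pt):

```
/mnt/data/nabil/triton_server/models/
└── ecapatdnn_bangasianeng/
    ├── config.pbtxt
    └── 1/
        └── model.pt
```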
I also created a TorchScript model with:
from model_ecapatdnn import ECAPAModel
import soundfile as sf
import torch
model_1 = ECAPAModel.ECAPAModel(lr = 0.001, lr_decay = 0.97, C = 1024, n_class = 18505, m = 0.2, s = 30, test_step = 3, gpu = -1)
model_1.load_parameters("/ecapatdnn/model.pt")
model = model_1.speaker_encoder
# Switch the model to eval mode
model.eval()
# An example input you would normally provide to your model's forward() method.
example = torch.rand(1, 48000)
# Use torch.jit.trace to generate a torch.jit.ScriptModule via tracing.
traced_script_module = torch.jit.trace(model, example)
# Save the TorchScript model
traced_script_module.save("traced_ecapatdnn_bangasianeng.pt")
As you can see, my model takes a tensor of shape (B x N), where B is the batch size and N is the number of audio samples. How do I write the config.pbtxt for this model?
Posted on 2022-09-28 08:53:35
So I found the answer: you have to specify the shape in the config file. Here is the config that worked for me:
name: "ecapatdnn_bangasianeng"
platform: "pytorch_libtorch"
max_batch_size: 1
input [
  {
    name: "INPUT__0"
    data_type: TYPE_FP32
    dims: [ -1 ]
  }
]
output [
  {
    name: "OUTPUT__0"
    data_type: TYPE_FP32
    dims: [ 512 ]
  }
]
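Note that with max_batch_size > 0, Triton prepends the batch dimension, so dims: [-1] means the client actually sends a (B x N) tensor, matching the traced model. A sketch of a client request, assuming the tritonclient package (pip install "tritonclient[http]") and the server running on localhost:8000 as above; not runnable without a live server:

```python
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# (B, N) float32 audio, B <= max_batch_size; N is arbitrary since dims is [-1]
audio = np.random.randn(1, 48000).astype(np.float32)

inp = httpclient.InferInput("INPUT__0", list(audio.shape), "FP32")
inp.set_data_from_numpy(audio)
out = httpclient.InferRequestedOutput("OUTPUT__0")

result = client.infer("ecapatdnn_bangasianeng", inputs=[inp], outputs=[out])
embedding = result.as_numpy("OUTPUT__0")
```

Per the config above, embedding should come back with shape (1, 512).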
https://stackoverflow.com/questions/73877546