## Download the BERT models

#### Chinese RoBERTa

+ Hugging Face path: https://huggingface.co/hfl/chinese-roberta-wwm-ext-large
+ Local path: /home/deep01/softwareGroup/Takway/Bert-VITS2/bert/chinese-roberta-wwm-ext-large

#### English DeBERTa

+ Hugging Face path: https://huggingface.co/microsoft/deberta-v3-large
+ Local path: /home/deep01/softwareGroup/Takway/Bert-VITS2/bert/deberta-v3-large

#### Japanese DeBERTa

+ Hugging Face path: https://huggingface.co/ku-nlp/deberta-v2-large-japanese-char-wwm
+ Local path: /home/deep01/softwareGroup/Takway/Bert-VITS2/bert/deberta-v2-large-japanese-char-wwm

#### WavLM

+ Hugging Face path: https://huggingface.co/microsoft/wavlm-base-plus
+ Local path: /home/deep01/softwareGroup/Takway/Bert-VITS2/slm/wavlm-base-plus
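One way to fetch all four models at once is the `huggingface_hub` library. A minimal sketch, assuming you install the package yourself (`pip install huggingface_hub`); the `bert/`/`slm/` target layout mirrors the local paths listed above, and `MODELS`/`download_all` are names chosen here for illustration:

```python
# Repo IDs from the list above, mapped to the target folders used by Bert-VITS2.
MODELS = {
    "hfl/chinese-roberta-wwm-ext-large": "bert/chinese-roberta-wwm-ext-large",
    "microsoft/deberta-v3-large": "bert/deberta-v3-large",
    "ku-nlp/deberta-v2-large-japanese-char-wwm": "bert/deberta-v2-large-japanese-char-wwm",
    "microsoft/wavlm-base-plus": "slm/wavlm-base-plus",
}

def download_all(models=MODELS):
    # Deferred import so the mapping can be inspected without the dependency installed.
    from huggingface_hub import snapshot_download  # pip install huggingface_hub
    for repo_id, local_dir in models.items():
        # Downloads the full repository snapshot into the given local directory.
        snapshot_download(repo_id=repo_id, local_dir=local_dir)

# download_all()  # uncomment to fetch everything (requires network access)
```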
#### Place the BERT models in the bert folder and the WavLM model in the slm folder, overwriting the folders with the same names.
## Download the pretrained model

Path: /home/deep01/softwareGroup/Takway/Bert-VITS2/data/mix/saved_models/250000_G.pth

Place it in ./utils/bert_vits2/data/mix/models.
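The copy step above can be scripted with the standard library. A minimal sketch; the source and destination paths are taken from this README, while the `install_checkpoint` helper is a name chosen here:

```python
import shutil
from pathlib import Path

# Paths from this README: pretrained generator checkpoint and its target folder.
SRC = Path("/home/deep01/softwareGroup/Takway/Bert-VITS2/data/mix/saved_models/250000_G.pth")
DST_DIR = Path("./utils/bert_vits2/data/mix/models")

def install_checkpoint(src: Path, dst_dir: Path) -> Path:
    """Copy the checkpoint into dst_dir, creating it if needed; return the target path."""
    dst_dir.mkdir(parents=True, exist_ok=True)
    target = dst_dir / src.name
    shutil.copy2(src, target)  # copy2 preserves file metadata (timestamps)
    return target

# install_checkpoint(SRC, DST_DIR)  # run on a machine where SRC actually exists
```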