@@ -7,14 +7,14 @@
     "[](https://github.com/imcaspar/gpt2-ml)\n",
     "[](https://github.com/imcaspar/gpt2-ml)\n",
     "[](https://twitter.com/intent/tweet?text=Wow:&url=https://github.com/imcaspar/gpt2-ml)\n",
-    "### 高考作文生成指南:\n",
+    "### 四十二使用指南:\n",
     "\n",
     "* 点击代码框左上角的▶️,选择RUN ANYWAY\n",
     "* 等待对应代码文件、模型文件下载\n",
     "* 输入参数,代表长度\n",
-    "* 输入作文题目(摘要模块没有部署,请自己提炼作文题中的主旨句)\n",
+    "* 输入问题\n",
     "\n",
-    "之后就会生成对应文章,效果如下:"
+    "之后就会生成对应回答,效果如下:"
    ]
   },
   {
@@ -52,9 +52,9 @@
     "!mkdir -p /home/EssayKiller_V2/LanguageNetwork/GPT2/finetune/trained_models\n",
     "\n",
     "%cd /home/EssayKiller_V2/LanguageNetwork/GPT2/finetune/\n",
-    "!perl /home/EssayKiller_V2/LanguageNetwork/GPT2/scripts/gdown.pl https://drive.google.com/file/d/1ajR-yVWmZC_z7HgNjz4tivNz-vUCgKBC trained_models/model.ckpt-280000.data-00000-of-00001\n",
-    "!wget -q --show-progress https://github.com/EssayKillerBrain/EssayKiller/releases/download/v1.0/model.ckpt-280000.index -P /home/EssayKiller_V2/LanguageNetwork/GPT2/finetune/trained_models\n",
-    "!wget -q --show-progress https://github.com/EssayKillerBrain/EssayKiller/releases/download/v1.0/model.ckpt-280000.meta -P /home/EssayKiller_V2/LanguageNetwork/GPT2/finetune/trained_models\n",
+    "!perl /home/EssayKiller_V2/LanguageNetwork/GPT2/scripts/gdown.pl https://drive.google.com/file/d/1A910UqSNBBi_SEoIDl15095T_5kojESO trained_models/model.ckpt-344000.data-00000-of-00001\n",
+    "!wget -q --show-progress https://github.com/EssayKillerBrain/EssayKiller/releases/download/v1.0/model.ckpt-344000.index -P /home/EssayKiller_V2/LanguageNetwork/GPT2/finetune/trained_models\n",
+    "!wget -q --show-progress https://github.com/EssayKillerBrain/EssayKiller/releases/download/v1.0/model.ckpt-344000.meta -P /home/EssayKiller_V2/LanguageNetwork/GPT2/finetune/trained_models\n",
     "\n",
     "!echo '模型下载完成,Git项目已构建,请继续点击下方的▶️'\n",
     "# If gdown.pl failed, please uncomment following code & exec"
@@ -66,13 +66,13 @@
    "source": [
     "### 模型参数设置:\n",
     "\n",
-    "* 1.作文最小篇幅:\n",
-    "生成对应字数的高考作文,可自己调节长度,最长为1024个汉字。\n",
-    "一般来说越短的文章AI表现越好。\n",
+    "* 1.回答最小篇幅:\n",
+    "生成对应字数的回答,可自己调节长度,最长为1024个汉字。\n",
+    "一般来说字数越短,AI表现越好。\n",
     "\n",
-    "* 2.生成作文篇数:\n",
-    "默认会生成1篇议论文,生成时间取决于服务器状态\n",
-    "一般不超过60秒。受限于线上GPU内存,篇数最多为100。\n"
+    "* 2.生成回答篇数:\n",
+    "默认会生成1篇回答,生成时间取决于服务器状态\n",
+    "受限于线上GPU内存,篇数最多为10。\n"
    ]
   },
   {
@@ -82,21 +82,21 @@
    "outputs": [],
    "source": [
     "#!cat /usr/local/lib/python3.6/dist-packages/tensorflow_core/python/tpu/tpu_feed.py\n",
-    "#@title #文章生成模块\n",
-    "作文最小篇幅 = 1024 #@param {type:\"number\", min:800, max:1024, step:1}\n",
-    "生成作文篇数 = 1 #@param {type:\"number\", min:1, max:100, step:1}\n",
+    "#@title #回答生成模块\n",
+    "最小篇幅 = 1024 #@param {type:\"number\", min:800, max:1024, step:1}\n",
+    "生成篇数 = 1 #@param {type:\"number\", min:1, max:100, step:1}\n",
     "%mv /home/EssayKiller_V2/LanguageNetwork/GPT2/finetune/models/mega/* /home/EssayKiller_V2/LanguageNetwork/GPT2/finetune/trained_models/\n",
     "%cd /home/EssayKiller_V2/LanguageNetwork/GPT2/\n",
     "!export TF_CPP_MIN_LOG_LEVEL=2\n",
     "!echo '模型加载中,请稍后......'\n",
-    "!PYTHONPATH=$(pwd) python scripts/demo.py -ckpt_fn finetune/trained_models/model.ckpt-280000 -min_len $作文最小篇幅 -samples $生成作文篇数\n",
+    "!PYTHONPATH=$(pwd) python scripts/demo.py -ckpt_fn finetune/trained_models/model.ckpt-344000 -min_len $最小篇幅 -samples $生成篇数\n",
     "!PYTHONPATH=$(pwd) python scripts/formatter.py -org_text result.txt"
    ]
   }
  ],
  "metadata": {
   "colab": {
-   "name": "17亿参数-高考作文生成AI | 1.7B GPT2 Pretrained Essay Killer Brain",
+   "name": "四十二生成式AI | Based on 1.7B GPT2 Pretrained Essay Killer Brain",
    "provenance": [],
    "collapsed_sections": []
   },
@@ -108,4 +108,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 0
-}
+}
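
The download hunk above swaps checkpoint step 280000 for 344000 across the three files that make up a TensorFlow 1.x checkpoint (`.data-00000-of-00001`, `.index`, `.meta`), while the demo cell passes only the shard-less prefix `model.ckpt-344000` to `-ckpt_fn`. A minimal sketch of how the shard names relate to that prefix (the helper name `ckpt_shards` is ours, not part of the repo):

```python
def ckpt_shards(step: int, prefix: str = "model.ckpt") -> list[str]:
    """Return the three files that make up a TF1 checkpoint at `step`.

    The suffixes match the files fetched in the hunk above; the data
    shard is `00000-of-00001` because the release ships one shard.
    """
    base = f"{prefix}-{step}"
    return [f"{base}.data-00000-of-00001", f"{base}.index", f"{base}.meta"]

# demo.py receives only the prefix (finetune/trained_models/model.ckpt-344000);
# TensorFlow's checkpoint loader resolves the shard files from it.
print(ckpt_shards(344000))
```

This is why all three downloads must land in the same `trained_models/` directory: restoring from the bare prefix only works when every shard sits next to it.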