This repository was archived by the owner on Jul 4, 2025. It is now read-only.

Commit ef7c7b7

feat: Create llama3.yml presets for llama3 family (#862)
1 parent 8e31813 commit ef7c7b7

File tree

4 files changed: +73 −0 lines changed


cortex-js/presets/alpaca.yml

Lines changed: 16 additions & 0 deletions

@@ -0,0 +1,16 @@
+version: 1.0
+# Model Settings
+prompt_template: |+
+  {system_message}
+  ### Instruction: {prompt}
+  ### Response:
+ctx_len: 2048
+
+# Results Preferences
+stop:
+  - </s>
+temperature: 0.7
+top_p: 0.95
+max_tokens: 2048
+frequency_penalty: 0
+presence_penalty: 0
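Each preset follows the same shape: a `prompt_template` carrying `{system_message}` and `{prompt}` placeholders, plus sampling preferences. As an illustration only (the `render` helper below is hypothetical, not part of cortex-js), filling the Alpaca template might look like:

```python
# Hypothetical sketch: the placeholder names {system_message} and {prompt}
# come from the preset files in this commit; render() itself is illustrative.
ALPACA_TEMPLATE = (
    "{system_message}\n"
    "### Instruction: {prompt}\n"
    "### Response:\n"
)

def render(template: str, system_message: str, prompt: str) -> str:
    """Fill the preset's placeholders to produce the final model input."""
    return template.format(system_message=system_message, prompt=prompt)

filled = render(ALPACA_TEMPLATE, "You are a helpful assistant.", "Say hi.")
print(filled)
```

Since the placeholders use plain `{name}` braces, Python's `str.format` happens to fit; a real runtime would substitute them with its own templating.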

cortex-js/presets/chatml.yml

Lines changed: 18 additions & 0 deletions

@@ -0,0 +1,18 @@
+version: 1.0
+# Model Settings
+prompt_template: |+
+  <|im_start|>system
+  {system_message}<|im_end|>
+  <|im_start|>user
+  {prompt}<|im_end|>
+  <|im_start|>assistant
+ctx_len: 2048
+
+# Results Preferences
+stop:
+  - <|im_end|>
+temperature: 0.7
+top_p: 0.95
+max_tokens: 2048
+frequency_penalty: 0
+presence_penalty: 0

cortex-js/presets/llama3.yml

Lines changed: 19 additions & 0 deletions

@@ -0,0 +1,19 @@
+version: 1.0
+# Model Settings
+prompt_template: |+
+  <|begin_of_text|><|start_header_id|>system<|end_header_id|>
+
+  {system_message}<|eot_id|><|start_header_id|>user<|end_header_id|>
+
+  {prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
+ctx_len: 2048
+
+# Results Preferences
+stop:
+  - <|eot_id|>
+  - <|<|end_header_id|>|>
+temperature: 0.7
+top_p: 0.95
+max_tokens: 2048
+frequency_penalty: 0
+presence_penalty: 0
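The `stop:` list tells the runtime where to cut generation: decoding halts once any listed sequence appears in the output. A minimal sketch of that behavior (illustrative, assuming simple substring matching; the stop strings are the ones from llama3.yml above):

```python
# Illustrative sketch: truncate model output at the first stop sequence,
# as a runtime honoring the preset's `stop:` list might do.
STOP = ["<|eot_id|>", "<|<|end_header_id|>|>"]

def truncate_at_stop(text: str, stops: list[str]) -> str:
    """Cut `text` at the earliest occurrence of any stop sequence."""
    cut = len(text)
    for s in stops:
        i = text.find(s)
        if i != -1:
            cut = min(cut, i)
    return text[:cut]

out = truncate_at_stop("Hello there.<|eot_id|>extra", STOP)
# out == "Hello there."
```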

cortex-js/presets/vicuna.yml

Lines changed: 20 additions & 0 deletions

@@ -0,0 +1,20 @@
+version: 1.0
+# Model Settings
+prompt_template: |-
+  <|begin_of_sentence|>{system_prompt}
+
+  User: {prompt}
+
+  Assistant:
+
+ctx_len: 2048
+
+# Results Preferences
+stop:
+  - </s>
+  - end_of_sentence
+temperature: 0.7
+top_p: 0.95
+max_tokens: 2048
+frequency_penalty: 0
+presence_penalty: 0
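One detail worth noting: vicuna.yml declares its template with the `|-` block scalar (strip trailing newlines), while the other three presets use `|+` (keep them), so the rendered Vicuna prompt ends exactly at `Assistant:` with no trailing blank line. A small sketch of the YAML chomping difference (assumes PyYAML is available; the example strings are illustrative):

```python
import yaml  # PyYAML; third-party but widely available

# '|+' (keep) preserves trailing line breaks; '|-' (strip) removes them.
keep = yaml.safe_load("s: |+\n  Assistant:\n\n")["s"]
strip = yaml.safe_load("s: |-\n  Assistant:\n\n")["s"]

print(repr(keep))
print(repr(strip))
```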

0 commit comments