Sidharthan committed
Commit 087bce7 · verified · 1 Parent(s): bc5ad48

Update README.md

Files changed (1)
  1. README.md +9 -7
README.md CHANGED
@@ -54,7 +54,7 @@ model = AutoPeftModelForCausalLM.from_pretrained(
 
 # Generate a script
 def generate_script(prompt):
-    formatted_prompt = f"keywords\n{prompt}\nscript\n"
+    formatted_prompt = f"<bos><start_of_turn>keywords\n{prompt}<end_of_turn>\n<start_of_turn>script\n"
     inputs = tokenizer(formatted_prompt, return_tensors="pt")
     inputs = {key: value.to(device) for key, value in inputs.items()}
 
@@ -83,16 +83,18 @@ print(f"Generated Script:\n{response}")
 The model expects prompts in the following format:
 
 ```
-keywords
-<your keywords here>
-script
+<bos><start_of_turn>keywords
+<your_keywords_here><end_of_turn>
+<start_of_turn>script
+
 ```
 
 Example:
 ```
-keywords
-crosshatch waffle texture, dark chocolate, four bar crispy wafers, kat, milk chocolate
-script
+<bos><start_of_turn>keywords
+crosshatch waffle texture, dark chocolate, four bar crispy wafers, kat, milk chocolate<end_of_turn>
+<start_of_turn>script
+
 ```
 
 ### Output
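For reference, a minimal end-to-end sketch of how the updated prompt format slots into the README's `generate_script` helper. The adapter repo ID, device handling, and generation parameters below are placeholders and assumptions, not taken from this commit:

```python
# Minimal usage sketch of the new prompt format (assumptions noted inline).
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

ADAPTER_ID = "your-username/your-adapter"  # placeholder, not from this diff

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained(ADAPTER_ID)
model = AutoPeftModelForCausalLM.from_pretrained(ADAPTER_ID).to(device)

def generate_script(prompt):
    # Gemma-style turn markers introduced by this commit
    formatted_prompt = (
        f"<bos><start_of_turn>keywords\n{prompt}<end_of_turn>\n<start_of_turn>script\n"
    )
    inputs = tokenizer(formatted_prompt, return_tensors="pt")
    inputs = {key: value.to(device) for key, value in inputs.items()}
    with torch.no_grad():
        # max_new_tokens is illustrative, not specified in the README
        output_ids = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

response = generate_script(
    "crosshatch waffle texture, dark chocolate, four bar crispy wafers, kat, milk chocolate"
)
print(f"Generated Script:\n{response}")
```

Since the formatted string already begins with `<bos>`, if the underlying tokenizer also prepends a BOS token by default (as Gemma's does), consider passing `add_special_tokens=False` to the tokenizer call to avoid a duplicated BOS.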