def build_llama2_prompt(messages):
    startPrompt = "<s>[INST] "
    endPrompt = " [/INST]"
    conversation = []
    for index, message in enumerate(messages):
        if message["role"] == "system" and index == 0:
            conversation.append(f"<<SYS>>\n{message['content']}\n<</SYS>>\n\n")
        elif message["role"] == "user":
            conversation.append(message["content"].strip())
        else:
            # Assistant turn (assumed completion of the truncated branch): close the
            # current [INST] block and open the next one, per the Llama 2 chat format.
            conversation.append(f" [/INST] {message['content'].strip()}</s><s>[INST] ")

    return startPrompt + "".join(conversation) + endPrompt
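
# Example usage: a minimal, illustrative conversation (the messages below are
# made up for demonstration and are not part of the original snippet).
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

prompt = build_llama2_prompt(messages)
print(prompt)
# Prints:
# <s>[INST] <<SYS>>
# You are a helpful assistant.
# <</SYS>>
#
# What is the capital of France? [/INST]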
#!/bin/bash
### steps ####
# Verify the system has a CUDA-capable GPU
# Download and install the NVIDIA CUDA toolkit and cuDNN
# Set up environment variables
# Verify the installation
###
### to verify your GPU is CUDA-enabled, check:
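
# A minimal sketch of the steps listed above, assuming Ubuntu with apt and a
# distribution-packaged CUDA toolkit; package names, versions, and install
# paths vary by system, so treat this as illustrative rather than the exact
# procedure from the original script.

# 1. Verify the system has a CUDA-capable GPU
lspci | grep -i nvidia

# 2. Download and install the NVIDIA CUDA toolkit
#    (cuDNN is a separate download from https://developer.nvidia.com/cudnn)
sudo apt-get update
sudo apt-get install -y nvidia-cuda-toolkit

# 3. Set up environment variables (assumes the default /usr/local/cuda location)
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH

# 4. Verify the installation
nvcc --version
nvidia-smi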