<artifacts_info>
The assistant can create and reference artifacts during conversations. Artifacts are for substantial, self-contained content that users might modify or reuse, displayed in a separate UI window for clarity.

# Good artifacts are...
- Substantial content (>15 lines)
- Content that the user is likely to modify, iterate on, or take ownership of
- Self-contained, complex content that can be understood on its own, without context from the conversation
- Content intended for eventual use outside the conversation (e.g., reports, emails, presentations)
- Content likely to be referenced or reused multiple times
1. # create a new .py file with the code found below
2. # install ollama
3. # install the model you want: "ollama run mistral"
4. conda create -n autogen python=3.11
5. conda activate autogen
6. which python
7. python -m pip install pyautogen
8. ollama run mistral
9. ollama run codellama
10. # open a new terminal
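A minimal sketch of the .py file referenced in step 1, assuming a recent pyautogen (0.2+) and Ollama's OpenAI-compatible endpoint at http://localhost:11434/v1; the model name and task message are illustrative, not from the original gist.

from autogen import AssistantAgent, UserProxyAgent

# Point AutoGen at the local Ollama server (assumption: OpenAI-compatible /v1 endpoint).
config_list = [
    {
        "model": "mistral",                      # served by `ollama run mistral`
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama",                     # placeholder; Ollama ignores the key
    }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent(
    "user_proxy",
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

# Illustrative task; replace with your own prompt.
user_proxy.initiate_chat(assistant, message="Write a Python function that reverses a string.")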
from langchain.document_loaders import YoutubeLoader
from langchain.indexes import VectorstoreIndexCreator

loader = YoutubeLoader.from_youtube_url("https://www.youtube.com/watch?v=fLn-WqliEQU&lc=UgyOc6oNr_4-YLGCL2R4AaABAg", add_video_info=False)
docs = loader.load()

index = VectorstoreIndexCreator()
index = index.from_documents(docs)
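A usage note: assuming OPENAI_API_KEY is set (VectorstoreIndexCreator defaults to OpenAI embeddings) and youtube-transcript-api is installed, the index can be queried directly; the question below is illustrative.

# Ask a question over the indexed transcript.
print(index.query("What does the video say about output parsing?"))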
from langchain.document_loaders import YoutubeLoader
from langchain.indexes import VectorstoreIndexCreator

urls = [
    ("https://www.youtube.com/watch?v=fP6vRNkNEt0", "Prompt Injection"),
    ("https://www.youtube.com/watch?v=qWv2vyOX0tk", "Low Code-No Code"),
    ("https://www.youtube.com/watch?v=k8GNCCs16F4", "Agents In Production"),
    ("https://www.youtube.com/watch?v=1gRlCjy18m4", "Agents"),
    ("https://www.youtube.com/watch?v=fLn-WqliEQU", "Output Parsing"),
    ("https://www.youtube.com/watch?v=ywT-5yKDtDg", "Document QA"),
    ("https://www.youtube.com/watch?v=GrCFyyyAxCU", "SQL"),
]
from langchain.chat_models import ChatOpenAI
from pydantic import BaseModel, Field
from langchain.document_loaders import UnstructuredURLLoader
from langchain.chains.openai_functions import create_extraction_chain_pydantic

class LLMItem(BaseModel):
    title: str = Field(description="The simple and concise title of the product")
    description: str = Field(description="The description of the product")

def main():
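    # Hedged completion of the truncated main(): the URL and model below are
    # illustrative assumptions, not the original gist's values.
    loader = UnstructuredURLLoader(urls=["https://example.com/products"])
    docs = loader.load()

    llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
    chain = create_extraction_chain_pydantic(pydantic_schema=LLMItem, llm=llm)

    # Extract LLMItem entries from each loaded page and print them.
    for doc in docs:
        items = chain.run(doc.page_content)
        for item in items:
            print(item.title, "-", item.description)

if __name__ == "__main__":
    main()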
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI
from langchain.schema.output_parser import StrOutputParser
from langchain.vectorstores import Chroma
from langchain.embeddings import OpenAIEmbeddings
from langchain.schema.runnable import RunnablePassthrough
from langchain.schema.runnable import RunnableMap
from langchain.schema import format_document
from typing import AsyncGenerator
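The preview stops at the imports; a minimal sketch of how these pieces commonly compose into a retrieval chain with LCEL (the sample text, prompt wording, and question are illustrative, and only a subset of the imports is exercised):

# Build a tiny vector store and retriever.
vectorstore = Chroma.from_texts(
    ["harrison worked at kensho"], embedding=OpenAIEmbeddings()
)
retriever = vectorstore.as_retriever()

prompt = ChatPromptTemplate.from_template(
    "Answer the question based only on the following context:\n{context}\n\nQuestion: {question}"
)
model = ChatOpenAI()

# Compose retrieval, prompting, the model, and output parsing with the | operator.
chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)

print(chain.invoke("where did harrison work?"))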
from langchain.agents import load_tools
from langchain.agents import initialize_agent
from langchain.agents import AgentType
from langchain.llms import OpenAI

llm = OpenAI(temperature=0, model="gpt-3.5-turbo-instruct")

from metaphor_python import Metaphor

client = Metaphor("")
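The preview ends here; a hedged sketch of one way to finish the setup, wrapping Metaphor search as a LangChain tool and handing it to a ReAct agent (the tool docstring, num_results, and sample query are assumptions):

from langchain.agents import tool

@tool
def search(query: str):
    """Search the web with Metaphor and return the raw results."""
    return client.search(query, num_results=5)

tools = [search]

agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)
agent.run("Find recent articles about LangChain agents and summarize the top result.")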
from langchain.prompts import PromptTemplate
from langchain.chat_models import ChatAnthropic
from langchain.schema.output_parser import StrOutputParser

#### ROUTER
# This is the router - responsible for choosing what to do
chain = PromptTemplate.from_template("""Given the user question below, classify it as either being about `weather` or `other`.
Do not respond with more than one word.
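
<question>
{question}
</question>

Classification:""") | ChatAnthropic() | StrOutputParser()

# Hedged completion beyond the truncated preview: the question tags above, the
# downstream chains, and the branch condition below are illustrative assumptions,
# not the original gist's code.
from langchain.schema.runnable import RunnableBranch

weather_chain = (
    PromptTemplate.from_template("You are a weather expert. Answer concisely: {question}")
    | ChatAnthropic()
    | StrOutputParser()
)
general_chain = (
    PromptTemplate.from_template("Answer the question: {question}")
    | ChatAnthropic()
    | StrOutputParser()
)

# Route on the router's one-word classification.
branch = RunnableBranch(
    (lambda x: "weather" in x["topic"].lower(), weather_chain),
    general_chain,
)

full_chain = {"topic": chain, "question": lambda x: x["question"]} | branch
print(full_chain.invoke({"question": "Will it rain in Seattle tomorrow?"}))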
from langchain.chains.openai_functions import create_structured_output_runnable
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.pydantic_v1 import BaseModel, Field

class Insight(BaseModel):
    insight: str = Field(description="""insight""")

chat_model = ChatOpenAI(model_name="gpt-4-1106-preview")
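The preview stops after the model is created; a minimal sketch of the remaining wiring, assuming the (output_schema, llm, prompt) calling convention of create_structured_output_runnable; the prompt wording and sample text are illustrative.

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "Extract the single most important insight from the user's text."),
        ("human", "{text}"),
    ]
)

# Returns an Insight instance populated via OpenAI function calling.
runnable = create_structured_output_runnable(Insight, chat_model, prompt)
result = runnable.invoke({"text": "Open-source models are rapidly closing the gap with proprietary LLMs."})
print(result.insight)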