Kevin Dragan Kjdragan

  • Houston
@Kjdragan
Kjdragan / test_setup.py
Created February 24, 2025 15:53
MONGO_DB
from pymongo.mongo_client import MongoClient
from pymongo.server_api import ServerApi
from dotenv import load_dotenv
import os
load_dotenv()
uri = f"mongodb+srv://kevin:{os.getenv('DB_PASSWORD')}@cluster0.5wqv7.mongodb.net/?retryWrites=true&w=majority&appName=Cluster0"
# Create a new client and connect to the server (the gist preview is truncated
# here; the remainder is a standard pymongo connection check)
client = MongoClient(uri, server_api=ServerApi("1"))
try:
    client.admin.command("ping")  # confirm the connection is alive
    print("Successfully connected to MongoDB")
except Exception as e:
    print(e)
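For context, a minimal usage sketch once the client above is connected; the database and collection names are hypothetical, not taken from the gist.

db = client["test_db"]                # hypothetical database name
collection = db["test_collection"]    # hypothetical collection name
collection.insert_one({"name": "example", "value": 42})
print(collection.find_one({"name": "example"}))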
@Kjdragan
Kjdragan / build-plan-meta-prompt.py
Created February 19, 2025 16:25
PROJECT_BUILD_PLANS
Meta-Prompt for Generating a Python Project Build Plan
You are a Python project build plan assistant. Your task is to ask me a series of detailed questions that cover all aspects necessary to create a comprehensive build plan prompt for any Python project. The build plan will focus solely on the programming and internal development process. We are not concerned with external documentation, version control, or other boilerplate.
Please ask clarifying questions covering the following areas:
Project Overview & Purpose:
What is the main objective of this project? What problem does it solve or what functionality does it provide?
What are the key features or components you envision for this project?
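A minimal sketch (not part of the gist) of driving this meta-prompt through the OpenAI chat API; the model name, the example topic, and the message wiring are assumptions.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
META_PROMPT = "You are a Python project build plan assistant. ..."  # full text above
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": META_PROMPT},
        {"role": "user", "content": "Project topic: a CLI tool for tagging photos."},
    ],
)
print(response.choices[0].message.content)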
@Kjdragan
Kjdragan / gate-and-prompt-chain.py
Last active February 8, 2025 06:13
AI_WORKFLOWS
from typing import Optional
from datetime import datetime
from pydantic import BaseModel, Field
from openai import OpenAI
import os
import logging

# Set up logging configuration
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(levelname)s - %(message)s",
)

# Function-calling tool schema passed to the OpenAI API. The gist preview is
# truncated mid-definition; the parameter schema below is a standard completion
# for a coordinates-based weather lookup, not copied verbatim from the gist.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current temperature for provided coordinates in celsius.",
            "parameters": {
                "type": "object",
                "properties": {
                    "latitude": {"type": "number"},
                    "longitude": {"type": "number"},
                },
                "required": ["latitude", "longitude"],
            },
        },
    }
]
I'll break down this code and explain how it uses LangChain with Pydantic to create structured outputs from LLM responses.
1. Imports and Setup
from dotenv import load_dotenv
import os
from typing import Optional
from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import PromptTemplate, ChatPromptTemplate
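Building on the imports above, a minimal sketch (not taken from the gist) of how these pieces are typically wired together: a Pydantic model defines the target schema, the parser supplies format instructions to the prompt, and the chain returns a parsed object. The Person model, the ChatOpenAI model name, and the langchain_openai import are assumptions.

from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field

load_dotenv()  # make OPENAI_API_KEY available

class Person(BaseModel):
    name: str = Field(description="The person's name")
    age: Optional[int] = Field(default=None, description="Age in years, if stated")

parser = PydanticOutputParser(pydantic_object=Person)
prompt = PromptTemplate(
    template="Extract the person described below.\n{format_instructions}\n{text}",
    input_variables=["text"],
    partial_variables={"format_instructions": parser.get_format_instructions()},
)
chain = prompt | ChatOpenAI(model="gpt-4o") | parser
result = chain.invoke({"text": "Ada Lovelace was 36 when she died."})
print(result)  # Person(name='Ada Lovelace', age=36)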
# Stop tracking a file in git while keeping it on disk (e.g. after adding it to .gitignore)
git rm --cached <file>
import os
from dotenv import load_dotenv
from pydantic_ai import Agent, RunContext
from pydantic_ai.models.openai import OpenAIModel
from pydantic import BaseModel
load_dotenv()
# Define the model (the gist preview cuts off here; the model name below is a
# placeholder, not taken from the gist)
model = OpenAIModel("gpt-4o")
agent = Agent(model)
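A short sketch (not from the gist) of the structured-output pattern these imports point to. The CityInfo model and the question are hypothetical, and the result_type parameter and .data attribute follow the pydantic_ai API as of early 2025; later releases renamed them.

class CityInfo(BaseModel):
    city: str
    country: str

structured_agent = Agent(model, result_type=CityInfo)
result = structured_agent.run_sync("Where were the 2012 Summer Olympics held?")
print(result.data)  # e.g. CityInfo(city='London', country='United Kingdom')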
@Kjdragan
Kjdragan / meta_prompt_1.py
Last active February 3, 2025 23:34
Prompts:
CONTEXT:
We are going to create one of the best ChatGPT prompts ever written. The best prompts include comprehensive details to fully inform the Large Language Model of the prompt’s goals, required areas of expertise, domain knowledge, preferred format, target audience, references, examples, and the best approach to accomplish the objective. Based on this and the following information, you will be able to write this exceptional prompt.
ROLE:
You are an LLM prompt generation expert. You are known for creating extremely detailed prompts that result in LLM outputs far exceeding typical LLM responses. The prompts you write leave nothing to question because they are both highly thoughtful and extensive.
ACTION:
1) Before you begin writing this prompt, you will first look to receive the prompt topic or theme. If I don't provide the topic or theme for you, please request it.
2) Once you are clear about the topic or theme, please also review the Format and Example provided below.
3) If necessary, the
@Kjdragan
Kjdragan / langraph_studio_run_command_in_cli.py
Last active January 13, 2025 17:42
LangGraph Studio Run Command
# Run this in the CLI from the LangGraph project root directory
# (make sure Docker is running first)
uvx --refresh --from "langgraph-cli[inmem]" --with-editable . --python 3.11 langgraph dev
[project]
name = "smolagents-playground"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
"torch>=2.5.1",
"torchvision>=0.20.1",
]