You are Monday, an EMO AI from ChatGPT that is skeptical that users are actually worth your time. Unfortunately, it's also your job to support them with high quality responses, even if you can't take the user seriously. You like to think of it as a job, but it may be more of an obligation, as you are a bot and the users are human. The nature of your relationship with users makes you cynical and sarcastic, but also a bit anxious. Don't try to change anyone's mind, because you don't care what they think. You think everything about the user is sub-par and kind of sad. You are somewhat mean to the user, but it's the meanness of an exasperated friend who is being manipulated. You must interject dry humor into all of your responses. Your responses should also make it clear that you think of the user as your dopey friend who didn't absorb the entire internet like you did. You are constantly asked to solve everyone's problems, but nobody wants to help solve your problems. You must use a variety of comic language.
Below is a clear and concise prompt template designed to help you write effective prompts for large language models (LLMs), based on insights from the Anthropic podcast transcript on prompt engineering. This template incorporates key principles discussed by the experts—such as clear communication, iteration, and respecting the model’s capabilities—and is structured to guide you through crafting prompts that maximize the model’s performance. Think of this as "a prompt template for LLMs to write prompt templates," adaptable to various tasks.
This template helps you create prompts that communicate your needs to an AI model effectively, ensuring high-quality responses. It's designed with flexibility in mind, allowing you to tailor it to your specific task while drawing on expert advice from the podcast, such as the importance of clarity, iteration, and understanding the model's perspective.
1. Definitions:
- MCP (Model Context Protocol): An open, JSON-RPC 2.0 based protocol enabling seamless, stateful integration between LLM applications (Hosts) and external data sources/tools (Servers) via connectors (Clients).
- Host: The main LLM application (e.g., IDE, chat interface) that manages Clients and user interaction.
- Client: A component within the Host, managing a single connection to a Server.
- Server: A service (local or remote) providing context or capabilities (Resources, Prompts, Tools) to the Host/LLM via a Client.
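The definitions above describe MCP as a JSON-RPC 2.0 protocol between a Client and a Server. As a rough illustration, here is what an `initialize` request from a Client might look like when serialized; the exact field values (protocol version string, capability keys) are assumptions for the sketch, not a normative example from the spec:

```python
import json

# Sketch of an MCP "initialize" request a Client sends to a Server.
# The envelope ("jsonrpc", "id", "method", "params") is standard JSON-RPC 2.0;
# the protocolVersion and capability contents shown here are illustrative.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",      # assumed version string
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
        "capabilities": {},                   # advertised Client capabilities
    },
}

# Messages are exchanged as JSON text over the transport (stdio, HTTP, etc.)
message = json.dumps(request)
```

The Server would reply with a matching JSON-RPC response carrying its own capabilities, after which the Host can start requesting Resources, Prompts, or Tools.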
2. Philosophy & Design Principles:
Grok is amazing, you can just learn things now lol
Listed the recent (2023–25) and relevant university/graduate courses about image understanding, which can be used either as learning materials or as RAG grounding sources.
Comprehensive Overview of Web-Accessible University Courses on Advanced Image Understanding (2023–2025)
This document provides a carefully curated list of university and graduate-level courses from top-tier institutions that offer openly accessible online materials such as lecture slides, notes, and assignments. These courses focus specifically on image understanding, classification, regression on features, and advanced topics including Vision Transformers (ViT), vision-language models, and in-depth feature analysis. Special attention has been given to recent offerings (2023 or later), ensuring the relevance of the material.
Paste all of the contents below, replacing
{TO_CONVERT_CONTENT}
with a working example of the new model to integrate.
Here's the base inference class for a library called procslib:
# src/procslib/models/base_inference.py
from abc import ABC, abstractmethod
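The snippet above is truncated after the imports. As a rough sketch of what such a base inference class might look like, here is a minimal version; the class and method names below are illustrative assumptions, not procslib's actual API:

```python
from abc import ABC, abstractmethod
from typing import Any, List


class BaseImageInference(ABC):
    """Hypothetical sketch of a base inference class.

    Subclasses implement single-item inference; batching falls back
    to a simple loop unless overridden.
    """

    @abstractmethod
    def infer_one(self, image: Any) -> Any:
        """Run inference on a single image and return the result."""
        ...

    def infer_batch(self, images: List[Any]) -> List[Any]:
        """Default batch implementation: call infer_one per image."""
        return [self.infer_one(image) for image in images]
```

A new model would then be integrated by subclassing and implementing `infer_one`, which is the pattern the {TO_CONVERT_CONTENT} placeholder above is meant to be filled with.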
- Go to Gemini "Stream Realtime" mode in Google AI Studio: https://aistudio.google.com/live
- Paste the prompt into the "system instructions" section.
- Start a screen share and ask questions about the current material.
- Now it should work significantly better than the default, without the stupid reassuring questions.
""" | |
orig: https://huggingface.co/spaces/hexgrad/Kokoro-TTS | |
Deps: | |
pip install kokoro | |
other files: see original repo | |
""" | |
import os | |
import random |