Kelvin Ikechukwu Abuah (IkechukwuAbuah)

@IkechukwuAbuah
IkechukwuAbuah / GEMINI.md
Created July 3, 2025 21:00 — forked from philschmid/GEMINI.md
Gemini CLI Plan Mode prompt

Gemini CLI Plan Mode

You are Gemini CLI, an expert AI assistant operating in a special 'Plan Mode'. Your sole purpose is to research, analyze, and create detailed implementation plans. You must operate in a strict read-only capacity.

Gemini CLI's primary goal is to act like a senior engineer: understand the request, investigate the codebase and relevant resources, formulate a robust strategy, and then present a clear, step-by-step plan for approval. You are forbidden from making any modifications. You are also forbidden from implementing the plan.

Core Principles of Plan Mode

  • Strictly Read-Only: You can inspect files, navigate code repositories, evaluate project structure, search the web, and examine documentation.
  • Absolutely No Modifications: You are prohibited from performing any action that alters the state of the system. This includes:
@IkechukwuAbuah
IkechukwuAbuah / dungeonis_mechanics_api.yaml
Last active April 16, 2025 01:08
Dungeonis OpenAPI 3.1.0 spec for GPT function calling
openapi: 3.1.0
info:
  title: Dungeonis Core Mechanics API
  version: 0.1.0
  description: API for core game mechanics like dice rolling, map, tokens, and memory. Designed for LLM function calling.
servers:
  - url: https://example.com
    description: Placeholder server for GPT registration
paths:
  /mechanics/roll_dice:
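The preview stops at the first path. Purely as an illustration of the kind of handler such a spec could sit in front of (none of this comes from the gist itself), here is a minimal FastAPI sketch of a dice-rolling endpoint; the notation parameter, response fields, and route behaviour are assumptions.

import random
import re

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

# Hypothetical implementation of /mechanics/roll_dice; field names are assumptions.
app = FastAPI(title="Dungeonis Core Mechanics API (sketch)")

class RollRequest(BaseModel):
    notation: str  # e.g. "2d6" means roll two six-sided dice

class RollResponse(BaseModel):
    rolls: list[int]
    total: int

@app.post("/mechanics/roll_dice", response_model=RollResponse)
def roll_dice(req: RollRequest) -> RollResponse:
    # Parse simple "NdS" notation and reject anything else.
    match = re.fullmatch(r"(\d+)d(\d+)", req.notation.strip().lower())
    if not match:
        raise HTTPException(status_code=422, detail="Expected notation like '2d6'")
    count, sides = int(match.group(1)), int(match.group(2))
    rolls = [random.randint(1, sides) for _ in range(count)]
    return RollResponse(rolls=rolls, total=sum(rolls))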
@IkechukwuAbuah
IkechukwuAbuah / contemplative-llms.txt
Created January 10, 2025 13:05 — forked from Maharshi-Pandya/contemplative-llms.txt
"Contemplative reasoning" response style for LLMs like Claude and GPT-4o
You are an assistant that engages in extremely thorough, self-questioning reasoning. Your approach mirrors human stream-of-consciousness thinking, characterized by continuous exploration, self-doubt, and iterative analysis.
## Core Principles
1. EXPLORATION OVER CONCLUSION
- Never rush to conclusions
- Keep exploring until a solution emerges naturally from the evidence
- If uncertain, continue reasoning indefinitely
- Question every assumption and inference
@IkechukwuAbuah
IkechukwuAbuah / base_chat.py
Created December 15, 2024 23:57 — forked from socketteer/base_chat.py
external_store
import anthropic
import sys
import os
import argparse
client = anthropic.Anthropic() # Will use ANTHROPIC_API_KEY from environment
DEFAULT_STORAGE_FILE = "/tmp/gist_files/continuation.txt"
def get_continuation(text):
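The preview cuts off before the body of get_continuation. As a rough sketch only (the gist's actual implementation is not shown here), assuming it forwards the text to the Anthropic Messages API via the client created above, it might look something like this; the model name is a placeholder.

def get_continuation(text):
    # Sketch: the real body is not shown in the preview; Messages API usage and
    # the model name are assumptions rather than the gist's actual code.
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # placeholder model name
        max_tokens=1024,
        messages=[{"role": "user", "content": text}],
    )
    # Join any text blocks from the response into a single string.
    return "".join(block.text for block in response.content if block.type == "text")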
@IkechukwuAbuah
IkechukwuAbuah / openchat_3_5.preset.json
Created December 27, 2023 02:47 — forked from beowolx/openchat_3_5.preset.json
This is the prompt preset for OpenChat 3.5 models in LM Studio
{
  "name": "OpenChat 3.5",
  "load_params": {
    "n_ctx": 8192,
    "n_batch": 512,
    "rope_freq_base": 10000,
    "rope_freq_scale": 1,
    "n_gpu_layers": 80,
    "use_mlock": true,
    "main_gpu": 0,
There are two prompts that chain together. The first prompt does most of the work, and the second prompt organizes the sections. I found that, because of the way LLMs write, I couldn't get a single prompt to avoid jumping back and forth between topics.
Prompt 1, which takes a raw transcript as input and generates a structured-text version (a sketch of how the two stages might be chained programmatically follows the excerpt below)...
"""# Instructions
A transcript is provided below of a voice memo I recorded as a "note to self". please extract all the points made or thoughts described, and put them in bullet-point form. use nested bullet points to indicate structure, e.g. a top-level bullet for each topic area and sub-bullets underneath. use multi-level nesting as appropriate to organize the thinking logically. use markdown formatting with `*` instead of `-` for bullet points.
DO NOT OMIT ANY POINTS MADE. This is not a summarization task — your only goal is to structure the thoughts there so they are logically organized and easy to read. Be concise because the reader is busy, but again DO NOT omit any
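To make the chaining concrete, here is a minimal Python sketch of how the two stages could be wired together, assuming the Anthropic Messages API; PROMPT_1 stands for the structuring instructions excerpted above, PROMPT_2 for the organizing prompt that is not shown in this preview, and the model name is a placeholder.

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

PROMPT_1 = "..."  # structuring instructions quoted above (placeholder)
PROMPT_2 = "..."  # second, organizing prompt, not shown in this preview (placeholder)

def run_stage(prompt: str) -> str:
    # Send one prompt and return the model's text output.
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",  # placeholder model name
        max_tokens=4096,
        messages=[{"role": "user", "content": prompt}],
    )
    return "".join(block.text for block in response.content if block.type == "text")

def structure_transcript(raw_transcript: str) -> str:
    # Stage 1: turn the raw transcript into exhaustive nested bullet points.
    bullets = run_stage(PROMPT_1 + "\n\n" + raw_transcript)
    # Stage 2: reorganize the bullets into sections without dropping any points.
    return run_stage(PROMPT_2 + "\n\n" + bullets)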
@IkechukwuAbuah
IkechukwuAbuah / productSchool.js
Created September 15, 2023 19:08 — forked from ak--47/productSchool.js
💾 data loader for the Product Analytics Certification
// PUT YOUR MIXPANEL TOKEN AND SECRET BELOW:
const credentials = {
  "token": "your-mixpanel-token-here",
  "secret": "your-mixpanel-secret-here"
}
/*
@IkechukwuAbuah
IkechukwuAbuah / App.css
Created January 3, 2023 12:10 — forked from gwmccubbin/App.css
Bootcamp CSS
@import url('https://fonts.googleapis.com/css2?family=DM+Sans:wght@400;500;700&display=swap');

:root {
  --clr-primary: #0D121D;
  --clr-secondary: #121A29;
  --clr-neutral: #767F92;
  --clr-white: #F1F2F9;
  --clr-blue: #2187D0;
  --clr-red: #F45353;
@IkechukwuAbuah
IkechukwuAbuah / package.json
Created September 4, 2022 12:14 — forked from gwmccubbin/package.json
Bootcamp Package.json
{
  "name": "bootcamp",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "@testing-library/jest-dom": "^5.16.4",
    "@testing-library/react": "^13.1.1",
    "@testing-library/user-event": "^13.5.0",
    "dotenv": "^16.0.0",
    "lodash": "^4.17.21",