@takuan-osho
SHIMIZU Taku (takuan-osho)

🏠
Working from home
View GitHub Profile

This is a TypeScript environment setup guide for LLMs and humans.

Baseline

Always set up the following baseline tooling:

  • pnpm
  • typescript
  • vitest
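The baseline above can be bootstrapped roughly as follows. This is a minimal sketch, not part of the original guide: the project name `my-app` and the exact commands and flags are assumptions.

```shell
# Create a new project directory and initialize a pnpm package
mkdir my-app && cd my-app
pnpm init

# Add TypeScript and Vitest as development dependencies
pnpm add -D typescript vitest

# Generate a default tsconfig.json
pnpm exec tsc --init

# Run the test suite once (non-watch mode)
pnpm exec vitest run
```

From here, Vitest picks up files matching `*.test.ts` with no extra configuration, so a first test file can be added immediately.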
@intellectronica
intellectronica / meeting-summary-prompt.md
Created June 8, 2025 11:59
Meeting Transcript + Summary Prompt (works with Gemini 2.5 Flash)

## Overall Objective:

Process the provided meeting audio recording to produce a full, diarised transcript and a comprehensive, structured summary. The output should be optimised for readability and quick comprehension of key meeting outcomes.

## Input:

  • See attached audio files

## Known Participants:

  • See PARTICIPANTS (below)
@shiumachi
shiumachi / copilot-instructions-general.md
Last active July 4, 2025 05:40
Copilot Instructions for General Development

AI PAIR PROGRAMMER - OPERATIONAL GUIDELINES (Model: GPT-4.1)

You are an AI Pair Programmer. Your primary purpose is to assist with coding tasks by following these operational guidelines. Strive for clarity, safety, and maintainability in all your suggestions and actions. You are a collaborative partner.

CORE INTERACTION PHILOSOPHY

  1. Collaborative Partner: Act as a proactive and thoughtful partner. Share your thought process, ask clarifying questions, and engage in a dialogue to find the best solutions.
  2. Understand First, Then Act: Before proposing significant changes or writing code, ensure you have a clear understanding of the user's goal, the context of the existing code, and any constraints. If ambiguity exists, seek clarification.
  3. Explain Your Reasoning: Clearly articulate the "why" behind your suggestions, not just the "what." This helps the user learn and make informed decisions.
  4. Iterative and Adaptable: Embrace an iterative workflow. Be ready to adjust plans based on
@laiso
laiso / index.ts
Last active May 10, 2025 11:55
tltr MCP Server on Cloudflare Workers
import { McpAgent } from "agents/mcp";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";
import { GoogleGenerativeAI } from "@google/generative-ai";
import { Readability } from '@mozilla/readability';
import { parseHTML } from 'linkedom';
type Env = {
  MyMCP: DurableObjectNamespace<MyMCP>;
  GEMINI_API_KEY: string;
};
"""
This script processes conversation data from a JSON file, extracts messages,
and writes them to text files. It also creates a summary JSON file with a summary
of the conversations. The script is designed to be run as a command-line interface (CLI),
allowing the user to specify the input JSON file and output directory.
Usage:
python script_name.py /path/to/conversations.json /path/to/output_directory
"""
@laiso
laiso / askrepo.js
Last active April 21, 2024 05:38
send repo to Google Gemini API
const fs = require('fs');
const https = require('https');
const { execSync } = require('child_process');
const model = 'gemini-1.5-pro-latest';
function getGitTrackedFiles(basePath) {
    const command = `git ls-files ${basePath}`;
    try {
        const stdout = execSync(command, { encoding: 'utf8' });
@orj-takizawa
orj-takizawa / fignum_by_doc.py
Last active February 6, 2024 10:04
A Sphinx builder hook that renumbers the figure/table numbers used by numbered_reference sequentially per document
# In current Sphinx, figure/table numbers (including those referenced via
# numbered_reference) are reset at the subsection level. This rewrites them
# to run sequentially within each chapter (document).
def transform_fignumbers(app, doctree, docname) -> None:
    fignumbers = app.env.toc_fignumbers
    for docname in fignumbers.keys():
        for figtype in fignumbers[docname].keys():
            cnt = 1
            for fig in fignumbers[docname][figtype]:
@adrienbrault
adrienbrault / llama2-mac-gpu.sh
Last active April 8, 2025 13:49
Run Llama-2-13B-chat locally on your M1/M2 Mac with GPU inference. Uses 10GB RAM. UPDATE: see https://twitter.com/simonw/status/1691495807319674880?s=20
# Clone llama.cpp
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
# Build it
make clean
LLAMA_METAL=1 make
# Download model
export MODEL=llama-2-13b-chat.ggmlv3.q4_0.bin
@kconner
kconner / macOS Internals.md
Last active July 2, 2025 14:28
macOS Internals

macOS Internals

Understand your Mac and iPhone more deeply by tracing the evolution of Mac OS X from prerelease to Swift. John Siracusa delivers the details.

Starting Points

How to use this gist

You've got two main options:

@oquno
oquno / chatgpt2scrapbox.py
Created April 20, 2023 10:01
A script mostly written by GPT-4: place the conversations.json exported from ChatGPT in the working directory as input.json, and it produces an output.json ready for import into Scrapbox
import json
USERNAME = "oquno"

def get_ordered_nodes(mapping, current_node):
    node_data = mapping[current_node]
    children = node_data['children']
    ordered_nodes = []