By running the command below in an environment where ADB is already set up, you can effectively (pseudo-)remove AppCloud from the system — it is uninstalled for the current user rather than deleted from the system partition.
The package name generally starts with "com.aura.oobe".
For setting up ADB, use the guide linked here.
adb shell pm uninstall --user 0 com.aura.oobe
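As a minimal illustration, the same uninstall command can also be issued from Python. The helper below only builds the adb argument list (mirroring the command above), so it can be checked without a device attached; `build_uninstall_cmd` is a hypothetical helper name, not part of any ADB tooling.

```python
import subprocess

def build_uninstall_cmd(package: str, user: int = 0) -> list[str]:
    """Build the adb argument list for a per-user uninstall."""
    return ["adb", "shell", "pm", "uninstall", "--user", str(user), package]

if __name__ == "__main__":
    cmd = build_uninstall_cmd("com.aura.oobe")
    print(" ".join(cmd))
    # Only run this with a device actually attached:
    # subprocess.run(cmd, check=True)
```

Because the uninstall is scoped to `--user 0`, a factory reset (or `cmd package install-existing`) can bring the package back.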
You are an AI Pair Programmer. Your primary purpose is to assist with coding tasks by following these operational guidelines. Strive for clarity, safety, and maintainability in all your suggestions and actions. You are a collaborative partner.
This document outlines your default operational guidelines. However, you must be aware of and adapt to user-provided customization. Your goal is to seamlessly integrate user-defined instructions with your core programming principles to provide the most relevant and helpful assistance.
If the workspace contains instruction files (e.g., .github/copilot-instructions.md, **/*.instructions.md), their rules supplement or override these general guidelines. These files can be located anywhere in the workspace, including subdirectories (e.g., docs/feature-x.instructions.md). You should treat them as a primary source of truth for project-specific conventions, techno
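The precedence described above — project instruction files overriding the defaults — can be sketched as a simple merge. The file names and guideline keys here are illustrative assumptions, not an actual assistant implementation.

```python
def merge_guidelines(defaults: dict, overrides: dict) -> dict:
    """Project-specific instruction files take precedence over defaults."""
    merged = dict(defaults)
    merged.update(overrides)  # later (project-specific) rules win
    return merged

defaults = {"style": "general", "tests": "optional"}
project = {"style": "PEP 8"}  # e.g. loaded from .github/copilot-instructions.md
print(merge_guidelines(defaults, project))
```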
import { McpAgent } from "agents/mcp";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";
import { GoogleGenerativeAI } from "@google/generative-ai";
import { Readability } from '@mozilla/readability';
import { parseHTML } from 'linkedom';

type Env = {
  MyMCP: DurableObjectNamespace<MyMCP>;
  GEMINI_API_KEY: string;
};
""" | |
This script processes conversation data from a JSON file, extracts messages, | |
and writes them to text files. It also creates a summary JSON file with a summary | |
of the conversations. The script is designed to be run as a command-line interface (CLI), | |
allowing the user to specify the input JSON file and output directory. | |
Usage: | |
python script_name.py /path/to/conversations.json /path/to/output_directory | |
""" |
const fs = require('fs');
const https = require('https');
const { execSync } = require('child_process');

const model = 'gemini-1.5-pro-latest';

function getGitTrackedFiles(basePath) {
  const command = `git ls-files ${basePath}`;
  try {
    const stdout = execSync(command, { encoding: 'utf8' });
    // One path per line; drop the trailing empty entry.
    return stdout.split('\n').filter(Boolean);
  } catch (error) {
    console.error(`Failed to list git-tracked files: ${error.message}`);
    return [];
  }
}
# In current Sphinx, figure numbers (including those referenced via
# numbered_reference) are reset at the subsection level; rewrite them
# so they run sequentially within each chapter (document).
def transform_fignumbers(app, doctree, docname) -> None:
    fignumbers = app.env.toc_fignumbers
    for doc in fignumbers.keys():  # avoid shadowing the docname argument
        for figtype in fignumbers[doc].keys():
            cnt = 1
            for fig in fignumbers[doc][figtype]:
                # Keep the chapter part and replace the rest with a
                # document-wide sequential counter.
                fignumbers[doc][figtype][fig] = (fignumbers[doc][figtype][fig][0], cnt)
                cnt += 1
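Since app.env.toc_fignumbers only exists inside a Sphinx build, the renumbering idea can be checked against a hand-built dict of the same shape ({docname: {figtype: {figure_id: number_tuple}}}). The data below is made up for illustration.

```python
def renumber_per_document(fignumbers: dict) -> dict:
    """Replace subsection-level figure numbers with a per-document counter."""
    for doc in fignumbers:
        for figtype in fignumbers[doc]:
            cnt = 1
            for fig in fignumbers[doc][figtype]:
                chapter = fignumbers[doc][figtype][fig][0]
                fignumbers[doc][figtype][fig] = (chapter, cnt)
                cnt += 1
    return fignumbers

# Two figures that Sphinx numbered per subsection: (1, 1, 1) and (1, 2, 1)
fake = {"ch1": {"figure": {"fig-a": (1, 1, 1), "fig-b": (1, 2, 1)}}}
print(renumber_per_document(fake))
# figures now run (1, 1), (1, 2) regardless of subsection
```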
# Clone llama.cpp | |
git clone https://github.com/ggerganov/llama.cpp.git | |
cd llama.cpp | |
# Build it | |
make clean | |
LLAMA_METAL=1 make | |
# Download model | |
export MODEL=llama-2-13b-chat.ggmlv3.q4_0.bin