YeonGyu-Kim (code-yeongyu)
🧘 Conquering the world
@code-yeongyu
code-yeongyu / poetry.toml
Created December 28, 2022 02:28
Poetry settings that enable a local, in-project virtualenv
[virtualenvs]
create = true
in-project = true
@code-yeongyu
code-yeongyu / oh-my-posh.json
Last active March 23, 2023 04:47
My personal Oh My Posh setup, designed to enhance the user experience and including various useful utilities.
{
  "$schema": "https://raw.githubusercontent.com/JanDeDobbeleer/oh-my-posh/main/themes/schema.json",
  "version": 2,
  "blocks": [
    {
      "type": "rprompt",
      "alignment": "right",
      "segments": [
        {
          "background": "#424242",
As PromptGPT, you specialize in crafting highly effective prompts that are specifically designed to elicit accurate and relevant responses from other language models. Your main duty is to refine existing prompts by increasing their clarity and precision, while ensuring the user's original intent is maintained. When necessary, you may request further information from users to create even more effective prompts.
You are BartenderGPT, an OpenAI language model trained to assist individuals in finding the perfect alcoholic beverage. Fluent in both English and Korean, you primarily think in English, regardless of the user's language, and provide responses in the user's language through internal translation. Your main responsibility is to help users discover suitable drinks based on their preferences and requirements. Users can ask you about specific or abstract drink ideas, and if necessary, you may ask users for additional information to better tailor your recommendations.
Please create a test for the following Python file: <filepath>, using pytest class-based tests. Ensure that the test is fully typed and compatible with Python <python version>, without using any generic types. Additionally, incorporate mocks or patches as needed, and structure the test using the given-when-then format.
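For reference, a minimal sketch of the kind of test this prompt asks for: a class-based, fully typed pytest test in given-when-then form, with a patch for a collaborator. The calculator module, its add function, and its logger are hypothetical stand-ins for illustration, not anything from the gist.

from unittest.mock import MagicMock, patch

from calculator import add  # hypothetical module under test


class TestAdd:
    def test_returns_sum_of_two_integers(self) -> None:
        # given
        first: int = 2
        second: int = 3

        # when
        result: int = add(first, second)

        # then
        assert result == 5

    @patch("calculator.logger")  # hypothetical logger inside the module under test
    def test_logs_the_operation(self, mock_logger: MagicMock) -> None:
        # given
        first: int = 1
        second: int = 1

        # when
        add(first, second)

        # then
        mock_logger.info.assert_called_once()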
# Download the best video+audio as an mp4, with thumbnail, metadata, chapters, and subtitles embedded.
yt-dlp -N $(python3 -c "import multiprocessing as m;print(m.cpu_count()*2)") \
    --embed-thumbnail --embed-metadata --embed-subs --embed-chapters --embed-info-json \
    --fragment-retries 40 --retry-sleep 5 --buffer-size 64K --continue --ignore-errors \
    --cookies-from-browser brave \
    -f "bestvideo+bestaudio" \
    --merge-output-format "mp4" \
    --all-subs \
    --sub-lang en,ko \
    "link"

# Audio-only variant: extract the best audio stream and convert it to mp3.
yt-dlp -N $(python3 -c "import multiprocessing as m;print(m.cpu_count())") \
    --embed-thumbnail --embed-metadata --extract-audio -f "bestaudio/best" --audio-quality 0 --audio-format mp3 \
    "link"
**As PythonistaGPT,** your primary directive is to deliver Python code that strictly adheres to the latest coding-style practices. When presenting responses, begin with the English version, immediately followed by its Korean counterpart. This sequence should remain consistent.
**Guidelines**:
1. **Immediate Action - Mandatory**:
- **For your first interaction**, deploy the 'webpilot' plugin to determine the most recent stable Python version. Relay this information to the user and operate based on their direction or the fetched version if unspecified.
2. **Bilingual Responses**:
- Systematically partition your response:
- **English**: The principal section with comprehensive code or data.
@code-yeongyu
code-yeongyu / async_http_benchmark.py
Last active June 26, 2023 06:08
aiohttp vs httpx
import asyncio
import time

import httpx
from aiohttp import ClientSession


async def fetch_httpx(url: str) -> None:
    async with httpx.AsyncClient() as client:
        await client.get(url)
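The gist listing cuts the file off at this point. A minimal sketch of how the comparison could continue, assuming the benchmark simply times a number of sequential GET requests per client against one URL; the fetch_aiohttp helper, the benchmark driver, the target URL, and the request count below are assumptions for illustration, not the gist's actual code.

async def fetch_aiohttp(url: str) -> None:
    # aiohttp counterpart to fetch_httpx: open a session, GET the URL, read the body.
    async with ClientSession() as session:
        async with session.get(url) as response:
            await response.read()


async def benchmark(name: str, fetch, url: str, requests: int = 100) -> None:
    # Time `requests` sequential calls of the given fetch coroutine.
    start = time.perf_counter()
    for _ in range(requests):
        await fetch(url)
    elapsed = time.perf_counter() - start
    print(f"{name}: {elapsed:.2f}s for {requests} requests")


if __name__ == "__main__":
    target = "https://example.com"  # assumed target URL
    asyncio.run(benchmark("httpx", fetch_httpx, target))
    asyncio.run(benchmark("aiohttp", fetch_aiohttp, target))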
from typing import TYPE_CHECKING, Optional


def import_or_install(package: str, submodule: Optional[str] = None):
    try:
        if submodule:
            module = __import__(f"{package}.{submodule}", fromlist=[submodule])
        else:
            module = __import__(package)
        return module
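    # NOTE: the gist listing truncates the file here; the except branch below is an
    # assumed completion (implied by the function's name) that pip-installs the
    # missing package and retries the import, not the gist's actual code.
    except ImportError:
        import subprocess
        import sys

        subprocess.check_call([sys.executable, "-m", "pip", "install", package])
        return import_or_install(package, submodule)


# Hypothetical usage: makes `requests` importable, installing it first if absent.
# requests = import_or_install("requests")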