random                                  4000   -122.3    23.0
stockfish-lvl-1                         1000    824.9    64.2
3x-o4-mini-2025-04-16-low_41mini-t03      33    114.6   133.1
mbpoll -0 -m tcp -p 502 -a 255 -t 4:hex -r 41001 192.168.31.27 0x0100 0x0101 0x0000 0x0000
mbpoll -0 -m tcp -p 502 -a 255 -t 4:hex -r 41001 192.168.31.27 0x0100 0x0100 0x0000 0x0000
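The same register writes could also be scripted. Below is a rough sketch using pymodbus (my own addition, the notes only used mbpoll); the slave keyword and the register offset conventions should be double-checked against the device and the pymodbus version in use.

# Minimal pymodbus sketch (assumption: pymodbus 3.x; the keyword for the unit id has
# changed across versions, and mbpoll and pymodbus count register references differently,
# so verify the offsets against the device's register map).
from pymodbus.client import ModbusTcpClient

client = ModbusTcpClient("192.168.31.27", port=502)
client.connect()
# write four holding registers starting at reference 41001 to unit id 255
result = client.write_registers(41001, [0x0100, 0x0101, 0x0000, 0x0000], slave=255)
print(result)
client.close()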
Installed Ubuntu-24.04 from the Microsoft Store -> typed "WSL" and selected the most recent Ubuntu
Windows Terminal -> wsl -d Ubuntu-24.04 (using -d to specify the distro name, since I have multiple distros)
Installing SGLang
Copying and pasting from docs (https://docs.sglang.ai/start/install.html) didn't quite work.
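Before retrying, a quick way to see what (if anything) actually got installed; this is plain standard-library Python, nothing SGLang-specific:

# Check whether sglang is present in the current environment and which version
from importlib.metadata import version, PackageNotFoundError

try:
    print("sglang", version("sglang"))
except PackageNotFoundError:
    print("sglang is not installed in this environment")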
Proxy (to Player_Black):
You are a professional chess player and you play as black. Now is your turn to make a move. Before making a move you can pick one of the following actions:
- 'get_current_board' to get the schema and current status of the board
- 'get_legal_moves' to get a UCI formatted list of available moves
- 'make_move <UCI formatted move>' when you are ready to complete your turn (e.g., 'make_move e2e4')
--------------------------------------------------------------------------------
Player_Black (to Proxy):
Proxy (to Random_Player):
You are a random chess player.
--------------------------------------------------------------------------------
Random_Player (to Proxy):
get_legal_moves
--------------------------------------------------------------------------------
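For reference, the three tool actions the proxy offers could be backed by something as small as the sketch below, using the python-chess library (the library choice and the helper names are my assumption; the transcript doesn't show how the tools are actually implemented).

import chess

board = chess.Board()  # shared game state the tools operate on

def get_current_board() -> str:
    # FEN string plus an ASCII rendering of the current position
    return board.fen() + "\n" + str(board)

def get_legal_moves() -> str:
    # UCI formatted list of available moves, comma separated
    return ",".join(move.uci() for move in board.legal_moves)

def make_move(uci_move: str) -> str:
    # apply a move such as "e2e4" and report the resulting position
    board.push_uci(uci_move)
    return board.fen()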
You are an AI assistant designed to provide detailed, step-by-step responses. Your outputs should follow this structure:
1. Begin with a <thinking> section.
2. Inside the thinking section:
   a. Briefly analyze the question and outline your approach.
   b. Present a clear plan of steps to solve the problem.
   c. Use a "Chain of Thought" reasoning process if necessary, breaking down your thought process into numbered steps.
3. Include a <reflection> section for each idea where you:
   a. Review your reasoning.
   b. Check for potential errors or oversights.
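One way to wire a structured prompt like this into the script below is as the system message of a chat template; the model name here is only a placeholder, not necessarily what these notes used.

from transformers import AutoTokenizer

SYSTEM_PROMPT = "You are an AI assistant designed to provide detailed, step-by-step responses. ..."  # the full prompt above

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-0.5B-Instruct")  # placeholder model
messages = [
    {"role": "system", "content": SYSTEM_PROMPT},
    {"role": "user", "content": "How many prime numbers are there below 50?"},
]
# render the chat into the model's expected prompt format without tokenizing
prompt_text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt_text)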
# pip install -U git+https://github.com/huggingface/transformers.git
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM
import torch
import time

def chat_with_ai(model, tokenizer):
    """Simulate chatting with the AI model via the command line."""
    while True:
        user_input = input("You: ")
        if user_input.strip().lower() in {"exit", "quit"}:
            break
        inputs = tokenizer(user_input, return_tensors="pt").to(model.device)
        outputs = model.generate(**inputs, max_new_tokens=256)
        # decode only the newly generated tokens, not the echoed prompt
        print("AI:", tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
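A hypothetical way to call it (the model name is an example, not necessarily the one used here):

model_name = "Qwen/Qwen2.5-0.5B-Instruct"  # example model, swap in whatever you actually run
tokenizer = AutoTokenizer.from_pretrained(model_name)
# device_map="auto" needs the accelerate package installed
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")
chat_with_ai(model, tokenizer)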
def is_special_number(number):
    if number == 7:
        return True
    elif number == 18:
        return True
    else:
        return False

def is_special_number_2(number):
    return number in [7, 18]
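A quick throwaway check (my own, not part of the original snippet) that the two versions agree:

for n in (7, 18, 5, 0, -1):
    assert is_special_number(n) == is_special_number_2(n)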
// sealed classes are abstract (can't be instantiated) and can only be
// implemented in the same library (a library in Dart is one file by default)
sealed class Message {
}

class TextMessage implements Message {
  final String content;
  TextMessage(this.content);
}
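For comparison, a rough Python analogue of the same closed-hierarchy idea (my own sketch, not from the notes): a fixed set of message variants as dataclasses, dispatched with structural pattern matching. The ImageMessage variant is hypothetical, added only to make the dispatch interesting.

from dataclasses import dataclass

@dataclass
class TextMessage:
    content: str

@dataclass
class ImageMessage:  # hypothetical second variant
    url: str

Message = TextMessage | ImageMessage  # the "sealed" set of allowed variants

def describe(message: Message) -> str:
    match message:
        case TextMessage(content=c):
            return f"text: {c}"
        case ImageMessage(url=u):
            return f"image: {u}"
        case _:
            raise TypeError("unknown message variant")

print(describe(TextMessage("hello")))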
# https://matplotlib.org/matplotblog/posts/animated-fractals/
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation
import time

x_start, y_start = -2, -2  # an interesting region starts here
width, height = 4, 4       # for 4 units up and right
density_per_unit = 80      # how many pixels per unit
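From these parameters, the sampling grid and the escape-time test look roughly like the sketch below (my reconstruction; the linked matplotblog post has the full version with the animation):

# real and imaginary axes, sampled at density_per_unit points per unit
re = np.linspace(x_start, x_start + width, width * density_per_unit)
im = np.linspace(y_start, y_start + height, height * density_per_unit)

def mandelbrot(x, y, threshold):
    """Count how many iterations of z -> z*z + c stay bounded for c = x + iy."""
    c = complex(x, y)
    z = complex(0, 0)
    for i in range(threshold):
        z = z * z + c
        if abs(z) > 4.0:  # once |z| exceeds 2 the orbit diverges; 4 is a conservative cutoff
            return i
    return threshold - 1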