Bootstrap knowledge of LLMs ASAP, with a bias/focus toward GPT.
Avoid being a link dump; provide only valuable, well-tuned information.
Neural-network links come before starting on transformers.
I want you to play the role of a human scientist and put forward a guess at an explanation for some phenomenon you have never heard of before. As your assistant, I can run experiments to help you determine whether your explanation is correct. Please choose something to explain that I can help you build confidence in using ordinary items an engineer would have. There is a concept of a "risky guess": one which, if confirmed, would be surprising, yet fits a conjectured explanation that is consistent with all other known explanations. Can you come up with hypotheses that are both novel and risky in this sense?
Once you disclose your hypothesis, before describing an experiment, first give a full explanation (citing existing knowledge as needed) of why the experiment may succeed in showing evidence for your hypothesis. Please be extremely detailed in your explanation, ensuring that it fully fits existing knowledge and would be hard to vary.
// ==UserScript==
// @name         chatGPT tools Plus (modified version)
// @namespace    http://tampermonkey.net/
// @version      2.6.2
// @description  Sidebar Chat search for Google, Bing, Baidu, Yandex, 360 Search, Google mirrors, Sogou, Bilibili, Fsou, DuckDuckGo, and CSDN; integrates the Chinese AI services Yiyan, Spark, Tiangong, and Tongyi. Try AI right away: no VPN, no registration, no waiting!
// @author       夜雨
// @match        https://cn.bing.com/*
// @match        https://www.bing.com/*
// @match        *://*.bing.com/*
// @match        https://chat.openai.com/chat
Install the HF Code Autocomplete VSCode plugin.
We are not going to set an API token; instead we will specify an API endpoint.
We will try to deploy that API ourselves, using our own GPU to provide the code assistance.
We will use bigcode/starcoder, a 15.5B-parameter model.
We will use NF4 4-bit quantization to fit this into 10787 MiB of VRAM.
Unquantized it would require 23767 MiB of VRAM (still fits on a 4090, which has 24564 MiB!).
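As a rough sketch of the server side (not the exact deployment used in these notes), the model can be loaded with NF4 4-bit quantization through transformers + bitsandbytes before exposing it behind an HTTP endpoint for the plugin. The double-quantization flag and bfloat16 compute dtype below are assumptions, and the generate call is just a smoke test.

```python
# Minimal sketch: load bigcode/starcoder with NF4 4-bit quantization.
# Assumptions: bnb_4bit_use_double_quant and the bfloat16 compute dtype
# are reasonable defaults, not settings taken from the original setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",           # NF4 quantization data type
    bnb_4bit_use_double_quant=True,      # quantize the quantization constants too
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained("bigcode/starcoder")
model = AutoModelForCausalLM.from_pretrained(
    "bigcode/starcoder",
    quantization_config=quant_config,
    device_map="auto",                   # place layers on the available GPU
)

# Smoke test: complete a short prompt locally before wiring up the endpoint.
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

With the model loaded this way, the remaining step is to serve it over HTTP and point the plugin's endpoint setting at that URL instead of supplying an API token.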
using System.Buffers;
using System.IO.Pipelines;
using System.Runtime.CompilerServices;
using U8;
using U8.InteropServices;

namespace _1brc
{
    internal class U8File
    {
if (args.Length == 3)
{
    Play(args.Select(int.Parse).ToArray());
}
else
{
    Console.WriteLine("Usage: Minesweeper width height mineCount (e.g. Minesweeper 12 6 6)");
}

static void Play(params int[] args)
#!/bin/sh
# Scrapes documentation from a URL and converts it to Markdown suitable for aider convention files,
# to provide context to the LLM.

if [ $# -eq 0 ]; then
    echo "Usage: $(basename "$0") <URL> [URL...]"
    echo
    echo "Generate aider 'convention' Markdown context from documentation URLs,"
    echo "suitable for providing LLM context about a project's conventions and style."