If you have uv installed (and you should!), you can install llm globally in a uv-managed tool environment with:
uv tool install llm
If you want to use models other than OpenAI's, you'll need to install some extensions:
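For example, to talk to local models served through Ollama you can add the llm-ollama plugin. This is one plugin among several; the specific extension shown here is an illustrative choice, not the only option:

llm install llm-ollama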
import sys
from io import StringIO

import streamlit as st               # pip install streamlit
from code_editor import code_editor  # pip install streamlit_code_editor
import ollama as ol                  # pip install ollama

st.set_page_config(layout='wide')
st.title('`Offline code completion`')
""" | |
A minimal, fast example generating text with Llama 3.1 in MLX. | |
To run, install the requirements: | |
pip install -U mlx transformers fire | |
Then generate text with: | |
python l3min.py "How tall is K2?" |
package main

import (
	"encoding/json"
	"fmt"
	"io/ioutil"
	"os"
	"os/user"
	"path/filepath"
	"strings"
#!/bin/bash
# Ollama Model Export Script
# Usage: bash ollama-export.sh vicuna:7b
# License: MIT (https://ncurl.xyz/s/o_o6DVqIR)
# https://gist.github.com/supersonictw/f6cf5e599377132fe5e180b3d495c553

# Interrupt if any error occurred
set -e

# Declare
Replace natbib with biblatex to get citation links:
\usepackage[backend=bibtex, style=authoryear-comp, natbib=true, sortcites=false]{biblatex}
\addbibresource{main.bib}
% optional if you want (Smith 1776) instead of (Smith, 1776)
\renewcommand*{\nameyeardelim}{\addspace}
\begin{document}
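With natbib=true, the familiar \citet and \citep commands keep working, so the rest of the document does not need to change. A minimal sketch of the body, using a hypothetical entry key smith1776 from main.bib:

% natbib-style commands are provided by biblatex's natbib=true option
\citet{smith1776} argued that ...        % Smith (1776) argued that ...
... as noted above \citep{smith1776}.    % (Smith 1776)

\printbibliography
\end{document}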
import numpy as np


def energy(z, n, m, L):
    # Chladni-style mode superposition evaluated at complex points z = x + iy on an L x L plate
    return (np.cos(n * np.pi * np.real(z) / L) * np.cos(m * np.pi * np.imag(z) / L)
            - np.cos(m * np.pi * np.real(z) / L) * np.cos(n * np.pi * np.imag(z) / L))


class ChladniPlate:
    def __init__(self, n, m, L=1, n_particles=10000):
        self.L = L
        self.n_particles = n_particles
        self.n = n
        self.m = m
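A minimal usage sketch of the energy function defined above; the grid resolution and the mode numbers n=3, m=5 are illustrative choices, not values from the original:

import numpy as np

# Sample the plate on a grid of complex points z = x + iy and evaluate the pattern;
# the zero level set of E traces the nodal lines where sand would collect.
xs = np.linspace(-1.0, 1.0, 512)
ys = np.linspace(-1.0, 1.0, 512)
xx, yy = np.meshgrid(xs, ys, indexing='ij')
E = energy(xx + 1j * yy, n=3, m=5, L=1)
print(E.shape, float(E.min()), float(E.max()))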
I've been using [Backblaze][bbz] for a while now as my online backup service. I have used a few others in the past. None were particularly satisfactory until Backblaze came along.
It was - still is - keenly priced at a flat $5 (£4) per month for unlimited backup (I've currently got just under half a terabyte backed up). It has a fast, reliable client. The company itself is [transparent about their operations][trans] and [generous with their knowledge sharing][blog]. To me, this says they understand their customers well. I've never had reliability problems, and everything about the outfit exudes a sense of simple, quick, solid quality. The service has even saved the day on a couple of occasions when I've lost files.
Safe to say, I'm a happy customer. If you're not already using Backblaze, [I highly recommend you do][recommend].
import math
import numpy as np
import random
import matplotlib.pyplot as plt

medium_size = 1024
x = np.linspace(-1., 1., medium_size)
y = np.linspace(-1., 1., medium_size)
x, y = np.meshgrid(x, y, indexing='ij')
module Trie = struct
  type t = { mutable count : int; children : t option Array.t }

  let[@inline] empty () = { count = 0; children = Array.make 128 None }

  let add buf pos len t =
    let rec loop idx t =
      if idx > pos + len - 1 then t.count <- t.count + 1
      else
        let c = Char.lowercase_ascii (Bytes.get buf idx) in