A guide to using Ollama as the OpenAI API provider for inline completions in iTerm2.

- API URL: http://127.0.0.1:11434/v1/completions
- Model: mistral
- Tokens: 4000
- Use legacy completions API: true
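As a quick sanity check outside iTerm2, the endpoint can be exercised directly. The sketch below is a minimal Swift script, assuming `ollama serve` is running locally and the `mistral` model has already been pulled; the test prompt and the `max_tokens` value of 50 are arbitrary choices, not part of the iTerm2 settings above.

import Foundation

// Minimal request against Ollama's OpenAI-compatible legacy completions endpoint.
// Assumes `ollama serve` is running and the `mistral` model has been pulled.
let payload: [String: Any] = [
    "model": "mistral",
    "prompt": "Say hello",   // arbitrary test prompt
    "max_tokens": 50         // arbitrary small limit for a quick check
]

var request = URLRequest(url: URL(string: "http://127.0.0.1:11434/v1/completions")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try? JSONSerialization.data(withJSONObject: payload)

let task = URLSession.shared.dataTask(with: request) { data, _, error in
    if let data, let body = String(data: data, encoding: .utf8) {
        print(body)          // expect a JSON object with a `choices` array
    } else {
        print("Request failed: \(error?.localizedDescription ?? "unknown error")")
    }
    exit(0)
}
task.resume()
RunLoop.main.run()           // keep the script alive until the reply arrives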
import SwiftUI

enum PreviewProviderMode: CaseIterable {
    /// Use for a light appearance preview.
    case lightMode
    /// Use for a dark appearance preview.
    case darkMode
}
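A sketch of how the enum might drive previews, assuming a placeholder ContentView and a straightforward mapping onto SwiftUI's ColorScheme (neither is part of the original snippet):

extension PreviewProviderMode {
    /// Map each preview mode onto the corresponding SwiftUI color scheme.
    var colorScheme: ColorScheme {
        switch self {
        case .lightMode: return .light
        case .darkMode:  return .dark
        }
    }
}

/// Placeholder view, only here to make the preview self-contained.
struct ContentView: View {
    var body: some View {
        Text("Hello, world!").padding()
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        // One preview per appearance mode.
        ForEach(PreviewProviderMode.allCases, id: \.self) { mode in
            ContentView()
                .preferredColorScheme(mode.colorScheme)
        }
    }
}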
!!! For M1/M2 Apple Silicon, see this comment:
For an emulator that mimics a Pixel 5 device with Google APIs and the ARM architecture (on an M1/M2 MacBook):
List All System Images Available for Download: sdkmanager --list | grep system-images
Download Image: sdkmanager --install "system-images;android-30;google_atd;arm64-v8a"
protocol TableValuable {
    associatedtype TableItem
    static func loadingValue() -> TableItem
    static func failedValue() -> TableItem
    func value() -> TableItem
}

enum TableState<T: TableValuable> {
    case loading
    case failed
}
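A minimal sketch of a conforming type, assuming a hypothetical User model whose table rows are plain Strings; the rows(for:) helper is illustrative and not part of the note above.

// Hypothetical model; only the TableValuable requirements come from the protocol above.
struct User: TableValuable {
    typealias TableItem = String

    let name: String

    /// Placeholder row shown while data is loading.
    static func loadingValue() -> String { "Loading…" }

    /// Placeholder row shown when the request fails.
    static func failedValue() -> String { "Something went wrong" }

    /// The text a loaded row actually displays.
    func value() -> String { name }
}

/// Rows to render for a given state; loaded items would come from an
/// additional case in a fuller version of the enum.
func rows(for state: TableState<User>) -> [String] {
    switch state {
    case .loading: return [User.loadingValue()]
    case .failed:  return [User.failedValue()]
    }
}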
rsync: everyone seems to like -z (compression), but for me it is much slower.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>items</key>
    <array>
        <dict>
            <key>assets</key>
            <array>
                <dict>