Replicating Friday: A Guide to Building an Agentic Home Assistant AI Voice Assistant

Condensed and arranged from NathanCu's thread on the Home Assistant forums: https://community.home-assistant.io/t/fridays-party-creating-a-private-agentic-ai-using-voice-assistant-tools/855862. I used Fabric to grab the thread and carve it up, then ran it through Gemini to produce the documentation below for review, since this material wasn't obviously documented anywhere else and is too good to get lost in a forum post. --@emory

Introduction: Embracing the Agentic AI Philosophy

Nathan Curtis's "Friday" project is a deep dive into creating a truly "agentic" AI within Home Assistant, moving beyond simple command-and-control to enable proactive, reasoning, and conversational interactions. This guide distills Nathan's insights, code, and architectural decisions, aiming to provide a playbook for those looking to build a similar sophisticated AI assistant.
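For contrast, the "simple command-and-control" baseline in stock Home Assistant looks like the sketch below: a hand-written custom sentence mapped to a fixed intent script. This is illustrative only and not taken from Friday's configuration; the sentence, intent name, and entity_id are placeholders.

# configuration.yaml — minimal command-and-control baseline (placeholders, not Friday's config)
conversation:
  intents:
    LivingRoomLightOn:
      # the exact phrase the assistant will recognize
      - "turn on the living room light"

intent_script:
  LivingRoomLightOn:
    speech:
      text: "Turning on the living room light."
    action:
      # one fixed service call per intent
      - service: light.turn_on
        target:
          entity_id: light.living_room

Everything here is deterministic: one hand-written sentence, one intent, one scripted response. The agentic approach this guide describes is about moving beyond exactly this kind of fixed wiring.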
