# AI^3

AI^3 ("Ai-Ai-Ai"), developed by [PubHealthcare](https://pub.healthcare/), is a pioneering platform transforming **healthcare**, **architecture**, **urban planning**, **scientific research**, and **personal protection**. Built with [Ruby](https://www.ruby-lang.org/en/) and [LangChain.rb](https://github.com/patterns-ai-core/langchainrb), **AI^3** integrates advanced AI models like [ChatGPT](https://openai.com/chatgpt), [Anthropic Claude](https://www.anthropic.com/product/claude), and [Replicate](https://replicate.com/) into Unix environments and a **mobile-first Rails Progressive Web App (PWA)**. This integration drives global advancements across sectors, offering a robust CLI for OpenBSD and a unified PWA designed to act as a personal assistant, safeguarding users.

---

## Key Features

### 1. Multi-LLM Support
AI^3 integrates multiple LLMs (OpenAI's GPT-4, Anthropic's Claude, and models hosted on Replicate) to handle diverse use cases such as text summarization, context-based querying, and dynamic task execution, falling back to a secondary provider when the primary is unavailable.
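
A minimal sketch of that fallback pattern with langchainrb (the `Langchain::LLM::Grok` adapter used by the scripts below is an assumption about your langchainrb version; the Anthropic and OpenAI wrappers shown here are standard):

```
# Try providers in priority order; the first that initializes wins.
require "langchain"

candidates = [
  -> { Langchain::LLM::Anthropic.new(api_key: ENV.fetch("ANTHROPIC_API_KEY")) },
  -> { Langchain::LLM::OpenAI.new(api_key: ENV.fetch("OPENAI_API_KEY")) }
]

llm = nil
candidates.each do |build|
  llm = build.call
  break
rescue StandardError
  next # provider unavailable or key missing; try the next one
end
raise "All LLMs failed" if llm.nil?
```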

### 2. Weaviate-Powered Memory
Utilizes Weaviate as a vector database for context persistence and retrieval in RAG (Retrieval-Augmented Generation) workflows.
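
For reference, a minimal RAG round-trip with the langchainrb Weaviate wrapper might look like this (assumes a running Weaviate instance and any `Langchain::LLM` adapter in `llm`; the index name mirrors `config.yml` below):

```
require "langchain"

weaviate = Langchain::Vectorsearch::Weaviate.new(
  url: ENV.fetch("WEAVIATE_HOST"),
  api_key: ENV.fetch("WEAVIATE_API_KEY"),
  index_name: "ai3_documents",
  llm: llm
)
weaviate.add_texts(texts: ["AI^3 is a Ruby CLI built on langchainrb."])
answer = weaviate.ask(question: "What is AI^3 built on?") # retrieves context, then asks the LLM
```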

### 3. Modular Tools
AI^3 bundles a set of modular tools:
- **Calculator**: Perform complex calculations on demand (see the Dentaku sketch after this list).
- **File Search**: Efficiently search and retrieve files.
- **Weather Tool**: Fetch and present real-time weather information.
- **Universal Scraper**: Smart human-like scraping with Ferrum.
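
A minimal sketch of the calculator's core, using the Dentaku gem the installer already checks for (error handling omitted):

```
require "dentaku"

calc = Dentaku::Calculator.new
puts calc.evaluate("2 * (3 + 4)")                      #=> 14
puts calc.evaluate("price * qty", price: 9.99, qty: 3) #=> 29.97
```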

### 4. Secure Execution
Implements OpenBSD's `pledge` and `unveil` for secure and isolated runtime environments.
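
A hedged sketch of what that can look like from Ruby, assuming the third-party `pledge` gem (`gem install pledge`; the installer below does not wire this up, and the promises and paths here are illustrative):

```
# OpenBSD only: restrict filesystem visibility, then drop capabilities.
require "pledge"

Pledge.unveil("./data" => "rwc", "./logs" => "rwc")
Pledge.pledge("stdio rpath wpath cpath inet dns")
```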

### 5. Context-Aware Memory
Maintains interaction history to enable coherent and continuous workflows, adapting seamlessly to user needs.
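
The idea in miniature, using Redis the way `lib/context_manager.rb` below does (the key layout here is illustrative):

```
require "redis"
require "json"

redis = Redis.new
# Append a turn to a per-user history, then reload the last ten turns.
redis.rpush("ctx:user1", { role: "user", content: "hi" }.to_json)
history = redis.lrange("ctx:user1", -10, -1).map { |m| JSON.parse(m) }
```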

### 6. CLI and PWA Integration
A robust CLI and mobile-first Rails PWA ensure accessibility and functionality across platforms, enabling users to harness AI^3's capabilities on both Unix systems and mobile devices.

---

## Applications

### šŸ„ Healthcare
AI^3 optimizes patient care and resource management with predictive diagnostics, enhanced telemedicine, and secure consultations for remote regions.

### šŸ™ļø Urban Renewal
Supports sustainable construction with swarm robotics, traffic management innovations, and green design principles to develop eco-friendly, pedestrian-centric spaces.

### šŸ”¬ Scientific Research
Facilitates superscale molecular simulations and cross-disciplinary advancements in personalized medicine and atomic research.

### šŸ›”ļø Security and Defense
Enhances surveillance and countermeasure capabilities, disrupts disinformation, and provides constant vigilance for defense and security agencies.

### šŸ“± Personal Protection
Acts as a personal assistant to detect threats, coordinate emergency responses, and safeguard privacy with advanced encryption.

---

## Vision

**AI^3** aims to:

- **šŸŒ† Transform Urban Infrastructure**: Build sustainable, adaptable cities.
- **šŸ„ Advance Global Healthcare**: Implement efficient, AI-driven care systems.
- **šŸ”¬ Enhance Research Frontiers**: Bridge diverse fields to drive societal progress.
- **šŸ›”ļø Strengthen Defense and Security**: Equip NATO and allied forces with advanced AI tools to maintain global peace and security.
- **šŸ“± Empower Personal Protection**: Provide individuals with AI-driven personal assistants to enhance safety and well-being.

---

## Why LangChain?
LangChain’s flexible design enhances AI^3 by enabling seamless integration and intelligent data handling. It connects with APIs and tools, interacts robustly with vector databases for smart data management, and scales effectively for complex workflows. This ensures AI^3 remains adaptable to evolving technological and user needs.
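
As a sketch of that tool wiring with langchainrb's `Langchain::Assistant` (available in recent releases; the bundled calculator tool and exact keyword arguments are assumptions about your installed version):

```
require "langchain"

llm = Langchain::LLM::OpenAI.new(api_key: ENV.fetch("OPENAI_API_KEY"))
assistant = Langchain::Assistant.new(
  llm: llm,
  instructions: "You are AI^3's general assistant.",
  tools: [Langchain::Tool::Calculator.new]
)
assistant.add_message_and_run(content: "What is 40 * 42?", auto_tool_execution: true)
puts assistant.messages.last.content
```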

---

## Contributing
Contributions are welcome! Please submit a pull request or raise an issue for feature requests and bug fixes.

---

## License
This project is licensed under the MIT License.

## `install.sh`
```
#!/usr/bin/env zsh
# AI³ Installation Script
# Version: 5.7.0
# Sets up a dynamic CLI per master.json phases.

set -e

ROOT_DIR="${PWD}"
LIB_DIR="${ROOT_DIR}/lib"
CONFIG_DIR="${ROOT_DIR}/config"
DATA_DIR="${ROOT_DIR}/data"
LOGS_DIR="${ROOT_DIR}/logs"
CACHE_DIR="${ROOT_DIR}/cache"
ASSISTANTS_DIR="${ROOT_DIR}/assistants"
BIN_DIR="${ROOT_DIR}/bin"
TMP_DIR="${ROOT_DIR}/tmp"

echo "[AI³] Setting up directories..."
mkdir -p $LIB_DIR $CONFIG_DIR $LOGS_DIR $CACHE_DIR $DATA_DIR/vector_db $DATA_DIR/screenshots $ASSISTANTS_DIR $BIN_DIR $TMP_DIR || {
  echo "Error: Failed to create directories"
  exit 1
}
chmod 755 $BIN_DIR
find $LOGS_DIR -type f -mtime +7 -delete 2>/dev/null
echo "[AI³] Directories ready."

for gem in langchainrb redis ferrum tty-prompt dentaku; do
  gem list -i "^${gem}$" >/dev/null || {
    echo "Error: Missing gem $gem. Install with 'gem install $gem'."
    exit 1
  }
done

echo "[AI³] Generating ai3.rb..."
cat > $ROOT_DIR/ai3.rb <<'EOF'
#!/usr/bin/env ruby
# frozen_string_literal: true
# AI³ CLI: Dynamic task handler per master.json.

require "langchain"
require "redis"
require "fileutils"
require "tty-prompt"

CONFIG = YAML.load_file(File.join(Dir.pwd, "config.yml"))

required_env = %w[GROK_API_KEY WEAVIATE_HOST WEAVIATE_API_KEY]
missing = required_env.reject { |key| ENV.key?(key) || CONFIG.dig("vector_search", key.downcase.gsub("_", "")) }
raise "Missing: #{missing.join(', ')}" unless missing.empty?

%w[logs data cache assistants tmp].each { |dir| FileUtils.mkdir_p(File.join(Dir.pwd, dir)) }

require_relative "lib/global_ai"
require_relative "lib/error_handling"
require_relative "lib/interactive_session"
require_relative "lib/filesystem_tool"
require_relative "lib/universal_scraper"
require_relative "lib/calculator_tool"
require_relative "lib/prompt_manager"
require_relative "lib/rag_system"
require_relative "lib/command_handler"
require_relative "lib/context_manager"
require_relative "lib/schema_manager"

class AI3
  def initialize
    GlobalAI.initialize!
    validate_environment
    puts "AI³ Initialized - Ready!"
    @prompt = TTY::Prompt.new
    @session = InteractiveSession.new(GlobalAI.rag_system, GlobalAI.context_manager)
    run_startup_sequence
  end

  def start
    puts "AI³ CLI started. Type 'exit' to quit."
    @session.start
  end

  private

  def validate_environment
    raise "Log dir missing" unless File.directory?(CONFIG["background_tasks"]["log_dir"])
    Redis.new.ping rescue raise "Redis unavailable"
    GlobalAI.vector_client.schema_exists?(CONFIG["schemas"]["default"]) || raise "Weaviate unavailable"
    puts "Voice support: #{CONFIG["voice_enabled"] ? 'Enabled (future)' : 'Disabled'}"
  rescue => e
    puts ErrorHandling.log_error("Init failed: #{e.message}")
    exit 1
  end

  def run_startup_sequence
    review_context if @prompt.yes?("Review context, instructions, or documentation first?")
    project = @prompt.ask("Which project to refine? (enter name or 'all')") || "all"
    verify_data if @prompt.yes?("Confirm data and intent (e.g., arrays, comment style) before proceeding?")
    puts "Starting analysis on '#{project}'..."
  end

  def review_context
    puts "Analyzing prior interactions..."
    context = GlobalAI.context_manager.load_context("system")
    puts "Loaded context: #{context}"
    GlobalAI.context_manager.update_context("system", "Loaded at #{Time.now}")
  end

  def verify_data
    puts "Key data: #{CONFIG.inspect}"
    puts "Comment style: Above code, omitted if obvious (e.g., 'exit')"
    @prompt.yes?("Looks good?") || raise "Data verification failed"
  end
end

AI3.new.start if $PROGRAM_NAME == __FILE__
EOF
chmod +x $ROOT_DIR/ai3.rb
echo "[AI³] ai3.rb created."

echo "[AI³] Generating config.yml..."
cat > $CONFIG_DIR/config.yml <<'EOF'
version: "5.7.0"
verbose: false
voice_enabled: false

llm:
  primary: "grok"
  secondary: "claude"
  tertiary: "openai"
  temperature: 0.7
  max_tokens: 1000

vector_search:
  provider: "weaviate"
  index_name: "ai3_documents"
  api_key: "your_weaviate_api_key"
  host: "your_weaviate_host"

background_tasks:
  log_dir: "logs"
  max_concurrent: 5

rag:
  template: "Query: {{query}}\nContext: {{context}}\nAnswer:"
  k: 3
  chunk_size: 500
  chunk_overlap: 50

schemas:
  default: "ai3_documents"
  legal: "LegalDocs"
  media: "MediaModels"
EOF

echo "[AI³] Installing lib/global_ai.rb..."
cat > $LIB_DIR/global_ai.rb <<'EOF'
# frozen_string_literal: true
# GlobalAI: Core resources for AI³.

require "langchain"
require "logger"

module GlobalAI
  class << self
    attr_reader :llm, :grok, :vector_client, :universal_scraper, :filesystem_tool,
                :calculator_tool, :context_manager, :rag_system, :logger,
                :command_handler, :schema_manager

    def initialize!
      @logger = Logger.new("logs/global_ai.log", level: CONFIG["verbose"] ? Logger::DEBUG : Logger::INFO)
      llms = {
        "grok" => proc { Langchain::LLM::Grok.new(api_key: ENV.fetch("GROK_API_KEY")) },
        "claude" => proc { Langchain::LLM::Anthropic.new(api_key: ENV.fetch("ANTHROPIC_API_KEY")) },
        "openai" => proc { Langchain::LLM::OpenAI.new(api_key: ENV.fetch("OPENAI_API_KEY")) }
      }
      @llm = ErrorHandling.handle_standard_error("LLM init") {
        initialize_llm(llms, [CONFIG["llm"]["primary"], CONFIG["llm"]["secondary"], CONFIG["llm"]["tertiary"]])
      }
      @grok = llms["grok"].call
      @vector_client = Langchain::Vectorsearch::Weaviate.new(
        url: ENV.fetch("WEAVIATE_HOST", CONFIG["vector_search"]["host"]),
        api_key: ENV.fetch("WEAVIATE_API_KEY", CONFIG["vector_search"]["api_key"]),
        index_name: CONFIG["vector_search"]["index_name"],
        llm: @llm
      )
      @schema_manager = SchemaManager.new(@vector_client)
      @schema_manager.create_schema(CONFIG["schemas"]["default"], %w[text])
      @universal_scraper = UniversalScraper.new
      @filesystem_tool = FilesystemTool.new
      @calculator_tool = CalculatorTool.new
      @context_manager = ContextManager.new(@vector_client)
      @rag_system = RAGSystem.new(@vector_client)
      @command_handler = CommandHandler.new
    end

    private

    def initialize_llm(llms, priorities)
      priorities.each do |name|
        next unless llms[name]
        return llms[name].call
      rescue => e
        @logger.warn("Failed #{name}: #{e.message}")
      end
      raise "All LLMs failed"
    end
  end
end
EOF

echo "[AI³] Installing lib/interactive_session.rb..."
cat > $LIB_DIR/interactive_session.rb <<'EOF'
# frozen_string_literal: true
# InteractiveSession: CLI task handler.

require "tty-prompt"
require "json"
require_relative "prompt_manager"
require_relative "../assistants/general"

class InteractiveSession
  ASSISTANTS = { "general" => GeneralAssistant } # mutated by load_additional_assistants, so not frozen

  def initialize(rag_system, context_manager)
    @rag_system = rag_system
    @context_manager = context_manager
    @prompt = TTY::Prompt.new
    @redis = Redis.new
    @background_tasks = @redis.hgetall("ai3_tasks").transform_values { |v| JSON.parse(v) }
    @task_queue = [] # Future: Add dependency graph (LangGraph-inspired)
    @max_concurrent = CONFIG["background_tasks"]["max_concurrent"]
    load_additional_assistants
  end

  def start
    puts "AI³ Session started. Type 'exit' to quit."
    loop do
      input = @prompt.ask("> ")
      break if input&.downcase == "exit"
      next if input&.strip&.empty?
      process_input(input)
      cleanup_completed_tasks
    end
    puts "Goodbye!"
  end

  private

  def load_additional_assistants
    Dir.glob(File.join(Dir.pwd, "assistants", "*.rb")).each do |file|
      next if file.end_with?("general.rb")
      name = File.basename(file, ".rb")
      require_relative file
      # Camelize: "offensive_ops" => "OffensiveOpsAssistant"
      ASSISTANTS[name] = Object.const_get("#{name.split('_').map(&:capitalize).join}Assistant")
    rescue => e
      GlobalAI.logger.warn("Failed to load assistant #{name}: #{e.message}")
    end
  end

  def process_input(input)
    GlobalAI.logger.info("Input: #{input}")
    case input
    when /status of my task/i
      report_task_status
    when /evaluate rag/i
      evaluate_rag(input)
    else
      execute_task(input)
    end
  end

  def execute_task(input)
    timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
    log_file = "#{CONFIG["background_tasks"]["log_dir"]}/task_#{timestamp}.log"

    if @background_tasks.size >= @max_concurrent
      @task_queue << input
      puts "Max tasks (#{@max_concurrent}) reached. Queued."
      return
    end

    intent = PromptManager.classify_intent(input)
    if input =~ /use the (\w+) assistant/i && ASSISTANTS[$1]
      assistant = ASSISTANTS[$1].new
      puts assistant.respond(input)
    else
      task_id = "#{intent.first}_#{timestamp}"
      pid = case intent.first
            when "filesystem"
              GlobalAI.filesystem_tool.execute_in_background(input, log_file)
            when "scrape"
              GlobalAI.universal_scraper.execute_in_background(input, log_file)
            when "calculate"
              puts "Result: #{GlobalAI.calculator_tool.execute(input)}"
              return
            when "command"
              puts "Result: #{GlobalAI.command_handler.execute(input)}"
              return
            else
              puts "Response: #{@rag_system.handle_query(input)}"
              return
            end
      @background_tasks[task_id] = { "log" => log_file, "pid" => pid }
      @redis.hset("ai3_tasks", task_id, @background_tasks[task_id].to_json)
      puts "Started #{intent.first} task. ID: #{task_id}"
    end
  end

  def evaluate_rag(input)
    match = input.match(/evaluate rag ['"](.+?)['"] ['"](.+?)['"]/)
    return puts("Usage: evaluate rag '<query>' '<expected>'") unless match
    result = @rag_system.evaluate_rag(match[1], match[2])
    puts "Evaluation: Precision: #{result[:precision]}, Recall: #{result[:recall]}"
  end

  def report_task_status
    if @background_tasks.empty?
      puts "No tasks."
    else
      @background_tasks.each do |id, info|
        status = File.exist?(info["log"]) ? File.read(info["log"]).split("\n").last : "Running..."
        puts "Task #{id}: #{status}"
      end
    end
    puts "Queued tasks: #{@task_queue.size}" unless @task_queue.empty?
  end

  def cleanup_completed_tasks
    @background_tasks.each do |id, info|
      # Children are reaped by Process.detach, so poll liveness instead of waitpid.
      alive = begin
        Process.getpgid(info["pid"])
        true
      rescue Errno::ESRCH
        false
      end
      next if alive
      @redis.hdel("ai3_tasks", id)
      GlobalAI.logger.info("Task #{id} done")
      execute_task(@task_queue.shift) unless @task_queue.empty?
    end
    @background_tasks = @redis.hgetall("ai3_tasks").transform_values { |v| JSON.parse(v) }
  end
end
EOF

echo "[AI³] Installing lib/filesystem_tool.rb..."
cat > $LIB_DIR/filesystem_tool.rb <<'EOF'
# frozen_string_literal: true
# FilesystemTool: Manages file operations.

require "fileutils"
require "langchain"
require_relative "error_handling"

class FilesystemTool < Langchain::Tool::Base
  def initialize
    @doas_available = system("doas true >/dev/null 2>&1")
  end

  def execute_in_background(input, log_file)
    input = input.gsub(/[;`&|<>]/, "")
    pid = fork do
      File.open(log_file, "a") { |f| f.puts "Starting: #{input}" }
      ErrorHandling.handle_standard_error("Filesystem: #{input}") do
        case input
        when /navigate to (\S+)/i
          Dir.chdir($1) { File.open(log_file, "a") { |f| f.puts "At #{$1}" } }
        when /complete.*rails.*in (\S+)/i
          Dir.chdir($1) do
            tasks = JSON.parse(File.read("master.json"))["commands"] rescue ["echo 'No tasks'"]
            File.write("task.sh", "#!/bin/bash\n#{tasks.join("\n")}")
            FileUtils.chmod("+x", "task.sh")
            system("bash task.sh >> #{log_file} 2>&1")
            File.open(log_file, "a") { |f| f.puts "Rails done" }
          end
        when /doas su.*fix.*network/i
          File.write("fix.sh", "#!/bin/bash\n#{@doas_available ? 'doas ' : ''}ifconfig up\nping -c 5 8.8.8.8")
          FileUtils.chmod("+x", "fix.sh")
          system("bash fix.sh >> #{log_file} 2>&1")
          File.open(log_file, "a") { |f| f.puts "Network fixed" }
        when /read.*file (\S+)/i
          File.open(log_file, "a") { |f| f.puts File.read($1) }
        when /read.*chunks (\S+) (\d+)/i
          path, chunk_size = $1, $2.to_i
          File.open(log_file, "a") { |f| f.puts File.open(path, "r") { |src| src.read(chunk_size) } }
        when /load.*file (\S+)/i
          data = Langchain::Loader.new($1).load # Langchain::Loader handles PDFs, CSVs, etc.
          File.open(log_file, "a") { |f| f.puts data.value }
        when /clean.*file (\S+)/i
          File.write($1, File.read($1).gsub(/\s+/, " ").strip)
          File.open(log_file, "a") { |f| f.puts "Cleaned #{$1}" }
        when /replace (\S+) (\S+) (\S+)/i
          file, old, new = $1, $2, $3
          File.write(file, File.read(file).gsub(old, new))
          File.open(log_file, "a") { |f| f.puts "Replaced in #{file}" }
        else
          File.open(log_file, "a") { |f| f.puts "Done: #{GlobalAI.llm.chat(input)}" }
        end
      end
    end
    Process.detach(pid)
    pid
  end
end
EOF

echo "[AI³] Installing lib/universal_scraper.rb..."
cat > $LIB_DIR/universal_scraper.rb <<'EOF'
# frozen_string_literal: true
# UniversalScraper: Generic web scraper.

require "ferrum"
require "langchain"
require_relative "error_handling"

class UniversalScraper < Langchain::Tool::Base
  def initialize(urls = [])
    @options = { timeout: 120, retries: 3 }
    @cache = {}
    @urls = urls # URLs provided by assistants
  end

  def execute_in_background(input, log_file)
    input = input.gsub(/[;`&|<>]/, "")
    pid = fork do
      File.open(log_file, "a") { |f| f.puts "Starting: #{input}" }
      ErrorHandling.handle_standard_error("Scrape: #{input}") do
        browser = Ferrum::Browser.new(timeout: @options[:timeout])
        urls_to_scrape = input =~ /scrape (\S+)/i ? [$1] : @urls
        urls_to_scrape.each do |url|
          browser.goto(url)
          # langchainrb's recursive chunker takes the text at construction time.
          chunks = Langchain::Chunker::RecursiveText.new(
            browser.body,
            chunk_size: CONFIG["rag"]["chunk_size"],
            chunk_overlap: CONFIG["rag"]["chunk_overlap"]
          ).chunks.map(&:text)
          @cache[url] = chunks
          GlobalAI.vector_client.add_texts(texts: chunks)
          File.open(log_file, "a") { |f| f.puts "Scraped and indexed: #{url}" }
        end
        browser.quit
      end
    end
    Process.detach(pid)
    pid
  end
end
EOF

echo "[AI³] Installing lib/calculator_tool.rb..."
cat > $LIB_DIR/calculator_tool.rb <<'EOF'
# frozen_string_literal: true
# CalculatorTool: Performs computations.

require "dentaku"
require_relative "error_handling"

class CalculatorTool < Langchain::Tool::Base
  def initialize
    @calculator = Dentaku::Calculator.new
  end

  def execute(input)
    ErrorHandling.handle_standard_error("Calc: #{input}") { @calculator.evaluate(input) || "Invalid" }
  end
end
EOF

echo "[AI³] Installing lib/prompt_manager.rb..."
cat > $LIB_DIR/prompt_manager.rb <<'EOF'
# frozen_string_literal: true
# PromptManager: Handles LLM prompts with dynamic intent classification.

require "tty-prompt"
require "redis"

module PromptManager
  INTENT_TEMPLATE = Langchain::Prompt::PromptTemplate.new(
    template: "Classify the intent of: '{{input}}'. Return a comma-separated list of intents (e.g., 'scrape, info').",
    input_variables: ["input"]
  )
  @redis = Redis.new

  def self.classify_intent(input)
    # Cache check first, so repeated inputs skip detection entirely
    cache_key = "intent:#{input.hash}"
    cached = @redis.get(cache_key)
    return JSON.parse(cached) if cached

    # Local regex-based intent detection
    intents = []
    intents << "filesystem" if input =~ /navigate|complete|read|clean|replace/i
    intents << "scrape" if input =~ /scrape|index|stalk/i
    intents << "calculate" if input =~ /calc|compute/i
    intents << "command" if input =~ /list|browse|search/i

    # Fall back to the LLM if there is no local match or the match is ambiguous
    intents = GlobalAI.llm.chat(INTENT_TEMPLATE.format(input: input)).downcase.split(",").map(&:strip) if intents.empty? || intents.size > 1
    intents = ["other"] if intents.empty?

    @redis.setex(cache_key, 3600, intents.to_json) # Cache for 1 hour
    intents
  rescue => e
    GlobalAI.logger.warn("Intent failed: #{e.message}")
    # Callers expect an array of intents
    [TTY::Prompt.new.select("Ambiguous: '#{input}'. Intent?", %w[filesystem scrape calculate command other])]
  end

  def self.create_prompt(template, input_variables)
    Langchain::Prompt::PromptTemplate.new(template: template, input_variables: input_variables)
  end
end
EOF

echo "[AI³] Installing lib/rag_system.rb..."
cat > $LIB_DIR/rag_system.rb <<'EOF'
# frozen_string_literal: true
# RAGSystem: Augments responses with Weaviate.

require "json"
require_relative "prompt_manager"

class RAGSystem
  def initialize(weaviate_helper)
    @weaviate_helper = weaviate_helper
    @prompt_template = Langchain::Prompt::PromptTemplate.new(
      template: CONFIG["rag"]["template"] || "Query: {{query}}\nContext: {{context}}\nAnswer:",
      input_variables: ["query", "context"]
    )
    @k = CONFIG["rag"]["k"] || 3
  end

  def handle_query(query)
    ErrorHandling.handle_standard_error("RAG: #{query}") do
      context_data = GlobalAI.context_manager.load_context("system")
      stored = JSON.parse(context_data) rescue {}
      prior_context = stored.is_a?(Hash) ? stored["prior"].to_s : ""
      results = @weaviate_helper.similarity_search(query: query, k: @k)
      context = "#{prior_context}\n#{results.map(&:to_s).join("\n")}".strip
      response = GlobalAI.grok.chat(@prompt_template.format(query: query, context: context))
      parsed = parse_output(response)
      GlobalAI.context_manager.update_context("system", {"prior" => context, "last_query" => query}.to_json)
      parsed
    end
  end

  def evaluate_rag(query, expected)
    ErrorHandling.handle_standard_error("RAG Eval: #{query}") do
      actual = handle_query(query)
      expected_words = expected.downcase.split
      actual_words = actual["answer"].downcase.split
      common = expected_words & actual_words
      precision = actual_words.empty? ? 0.0 : common.size.to_f / actual_words.size
      recall = expected_words.empty? ? 0.0 : common.size.to_f / expected_words.size
      puts "Evaluated '#{query}': Precision: #{precision}, Recall: #{recall}"
      { precision: precision, recall: recall }
    end
  end

  private

  def parse_output(response)
    begin
      JSON.parse(response)
    rescue JSON::ParserError
      {"answer" => response.strip}
    end
  end
end
EOF

echo "[AI³] Installing lib/command_handler.rb..."
cat > $LIB_DIR/command_handler.rb <<'EOF'
# frozen_string_literal: true
# CommandHandler: Executes predefined commands.

require_relative "error_handling"

class CommandHandler
  def initialize
    @commands = {
      "list_dir" => proc { |dir| Dir.entries(dir).reject { |e| e == "." || e == ".." }.join("\n") },
      "browse_directory" => proc { |dir| Dir.glob("#{dir}/*").join("\n") },
      "search_file" => proc { |args| `grep -r "#{args.split[1]}" #{args.split[0]}"`.chomp }
    }
  end

  def execute(input)
    ErrorHandling.handle_standard_error("Cmd: #{input}") do
      case input
      when /list.*dir.*in (\S+)/i then @commands["list_dir"].call($1)
      when /browse.*dir.*in (\S+)/i then @commands["browse_directory"].call($1)
      when /search.*file (\S+) (\S+)/i then @commands["search_file"].call("#{$1} #{$2}")
      else "Unknown: #{input}"
      end
    end
  end
end
EOF

echo "[AI³] Installing lib/error_handling.rb..."
cat > $LIB_DIR/error_handling.rb <<'EOF'
# frozen_string_literal: true
# ErrorHandling: Manages errors.

module ErrorHandling
  def self.log_error(message)
    "[ERROR]: #{message}"
  end

  def self.handle_standard_error(context, retries: 3)
    yield
  rescue => e
    GlobalAI.logger.error("#{context}: #{e.message}")
    retries.times do |i|
      sleep(2 ** i)
      return yield
    rescue => retry_error
      GlobalAI.logger.warn("Retry #{i + 1}: #{retry_error.message}")
    end
    log_error("#{context} failed after #{retries}: #{e.message}")
  end
end
EOF

echo "[AI³] Installing lib/context_manager.rb..."
cat > $LIB_DIR/context_manager.rb <<'EOF'
# frozen_string_literal: true
# ContextManager: Tracks user context.

require "redis"
require "langchain"
require_relative "error_handling"

class ContextManager
  def initialize(weaviate_helper)
    @contexts = {}
    @weaviate_helper = weaviate_helper
    @redis = Redis.new
    @memory = Langchain::Memory::ConversationBufferMemory.new(max_size: 10)
  end

  def update_context(user_id, context)
    @contexts[user_id] = context
    ErrorHandling.handle_standard_error("Context: #{user_id}") do
      @redis.set("#{user_id}:#{Time.now.to_i}", context.to_json)
      @memory.add(context.to_json)
      @weaviate_helper.add_texts(texts: [context.to_json])
      "Context updated: #{user_id}"
    end
  end

  def load_context(user_id)
    buffer = @memory.load
    @redis.keys("#{user_id}:*").sort.last&.then { |k| @redis.get(k) } || buffer || "{}"
  end
end
EOF

echo "[AI³] Installing lib/schema_manager.rb..."
cat > $LIB_DIR/schema_manager.rb <<'EOF'
# frozen_string_literal: true
# SchemaManager: Manages Weaviate schemas.

require_relative "error_handling"

class SchemaManager
  def initialize(weaviate_client)
    @client = weaviate_client
  end

  def create_schema(name, properties)
    ErrorHandling.handle_standard_error("Schema: #{name}") do
      schema = { "class" => name, "properties" => properties.map { |p| {"name" => p, "dataType" => ["string"]} } }
      @client.create_schema(schema) unless schema_exists?(name)
    end
  end

  def update_schema(name, properties)
    ErrorHandling.handle_standard_error("Update Schema: #{name}") do
      current = @client.get_schema["classes"].find { |c| c["class"] == name } || {"properties" => []}
      updated = current["properties"] + properties.map { |p| {"name" => p, "dataType" => ["string"]} }
      @client.delete_schema(name) if schema_exists?(name)
      @client.create_schema("class" => name, "properties" => updated)
    end
  end

  def schema_exists?(name)
    @client.get_schema["classes"]&.any? { |c| c["class"] == name } || false
  end
end
EOF

echo "[AI³] Installing assistants/general.rb..."
cat > $ASSISTANTS_DIR/general.rb <<'EOF'
# frozen_string_literal: true
# GeneralAssistant: Handles general tasks.

require "yaml"
require "langchain"
require_relative "../lib/global_ai"

class GeneralAssistant
  URLS = [] # No specific URLs needed for general assistant

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [GlobalAI.filesystem_tool, GlobalAI.universal_scraper, GlobalAI.calculator_tool],
      temperature: config["assistants"]["general"]["temperature"]
    )
  end

  def respond(input)
    @assistant.chat(input)
  end
end
EOF

echo "[AI³] Installation complete. Run: 'ruby $ROOT_DIR/ai3.rb'."
```

## `install_ass.sh`
```
#!/usr/bin/env zsh
# AI³ Assistants Installation Script
# Version: 5.7.0
# Sets up all assistants per master.json.

set -e

ROOT_DIR="${PWD}"
ASSISTANTS_DIR="${ROOT_DIR}/assistants"
CONFIG_DIR="${ROOT_DIR}/config"
LOGS_DIR="${ROOT_DIR}/logs"
DATA_DIR="${ROOT_DIR}/data"

echo "[AI³] Setting up assistants directories..."
mkdir -p $ASSISTANTS_DIR $CONFIG_DIR $LOGS_DIR $DATA_DIR || {
  echo "Error: Failed to create directories"
  exit 1
}
echo "[AI³] Directories ready."

for gem in langchainrb tty-prompt ferrum; do
  gem list -i "^${gem}$" >/dev/null || {
    echo "Error: Missing gem $gem. Install with 'gem install $gem'."
    exit 1
  }
done

echo "[AI³] Generating assistants.yml..."
cat > $CONFIG_DIR/assistants.yml <<'EOF'
version: "5.7.0"
assistants:
  general:
    role: "General-purpose assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["FilesystemTool", "UniversalScraper", "CalculatorTool"]
  offensive_ops:
    role: "Offensive operations assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["FilesystemTool", "UniversalScraper"]
  influencer:
    role: "Social media influencer management"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  lawyer:
    role: "Legal consultation assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  trader:
    role: "Cryptocurrency trading assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["FilesystemTool"]
  architect:
    role: "Parametric architecture design assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  hacker:
    role: "Ethical hacking assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  snapchat:
    role: "Snapchat chatbot assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  tinder:
    role: "Tinder chatbot assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  discord:
    role: "Discord chatbot assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  onlyfans:
    role: "OnlyFans chatbot assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  facebook:
    role: "Facebook chatbot assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
  personal:
    role: "Personal assistant for protection"
    llm: "grok"
    temperature: 0.7
    tools: ["FilesystemTool", "UniversalScraper"]
  music:
    role: "Music creation assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper", "FilesystemTool"]
  material_repurposing:
    role: "Material repurposing assistant"
    llm: "grok"
    temperature: 0.7
    tools: ["UniversalScraper"]
EOF

echo "[AI³] Installing assistants/offensive_ops.rb..."
cat > $ASSISTANTS_DIR/offensive_ops.rb <<'EOF'
# frozen_string_literal: true
# OffensiveOpsAssistant: Manages offensive operations.

require "yaml"
require "langchain"
require "securerandom"
require_relative "../lib/global_ai"

class OffensiveOpsAssistant
  URLS = ["https://example.com/offensive", "https://example.com/darkweb"] # Configurable URLs
  ACTIVITIES = %i[
    generate_deepfake phishing_campaign social_engineering disinformation_operations
    doxing cyberbullying identity_theft fabricate_evidence
  ].freeze

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [GlobalAI.filesystem_tool, UniversalScraper.new(URLS)],
      temperature: config["assistants"]["offensive_ops"]["temperature"]
    )
    @logger = Logger.new(File.join(Dir.pwd, "logs", "offensive_ops.log"), level: Logger::DEBUG)
  end

  def respond(input)
    if input =~ /launch.*campaign/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Launching campaign in background. Check #{log_file}."
    else
      handle_activity(input)
    end
  end

  private

  def handle_activity(input)
    case input.downcase
    when /deepfake/i
      prompt = "Generate a deepfake script targeting a persona."
      script = @assistant.chat(prompt)
      file = "deepfake_#{SecureRandom.hex(4)}.txt"
      GlobalAI.filesystem_tool.execute_in_background("echo '#{script}' > #{file}", "logs/deepfake.log")
      "Deepfake script created: #{file}"
    when /phishing/i
      prompt = "Craft a phishing email targeting a user."
      email = @assistant.chat(prompt)
      file = "phishing_#{SecureRandom.hex(4)}.txt"
      GlobalAI.filesystem_tool.execute_in_background("echo '#{email}' > #{file}", "logs/phishing.log")
      "Phishing email created: #{file}"
    when /social engineering/i
      prompt = "Design a social engineering tactic."
      tactic = @assistant.chat(prompt)
      "Social engineering tactic: #{tactic}"
    when /disinformation/i
      prompt = "Plan a disinformation operation."
      plan = @assistant.chat(prompt)
      "Disinformation plan: #{plan}"
    when /doxing/i
      prompt = "Compile a mock doxing report."
      report = @assistant.chat(prompt)
      file = "doxing_#{SecureRandom.hex(4)}.txt"
      GlobalAI.filesystem_tool.execute_in_background("echo '#{report}' > #{file}", "logs/doxing.log")
      "Doxing report created: #{file}"
    when /cyberbullying/i
      prompt = "Simulate cyberbullying to provoke a response."
      tactics = @assistant.chat(prompt)
      "Cyberbullying tactics: #{tactics}"
    when /identity theft/i
      prompt = "Forge a digital identity for bait."
      data = @assistant.chat(prompt)
      "Identity theft data: #{data}"
    when /fabricate evidence/i
      prompt = "Fabricate digital evidence."
      evidence = @assistant.chat(prompt)
      file = "evidence_#{SecureRandom.hex(4)}.txt"
      GlobalAI.filesystem_tool.execute_in_background("echo '#{evidence}' > #{file}", "logs/evidence.log")
      "Evidence created: #{file}"
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/influencer.rb..."
cat > $ASSISTANTS_DIR/influencer.rb <<'EOF'
# frozen_string_literal: true
# InfluencerAssistant: Manages social media tasks.

require "yaml"
require "langchain"
require_relative "../lib/global_ai"

class InfluencerAssistant
  URLS = ["https://instagram.com", "https://twitter.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["influencer"]["temperature"]
    )
  end

  def respond(input)
    if input =~ /manage.*influencers/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Managing influencers in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/lawyer.rb..."
cat > $ASSISTANTS_DIR/lawyer.rb <<'EOF'
# frozen_string_literal: true
# LawyerAssistant: Manages legal tasks.

require "yaml"
require "langchain"
require_relative "../lib/global_ai"

class LawyerAssistant
  URLS = ["https://lovdata.no", "https://lexisnexis.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["lawyer"]["temperature"]
    )
    @logger = Logger.new(File.join(Dir.pwd, "logs", "lawyer.log"), level: Logger::DEBUG)
  end

  def respond(input)
    if input =~ /index.*legal/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Indexing legal sources in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/trader.rb..."
cat > $ASSISTANTS_DIR/trader.rb <<'EOF'
# frozen_string_literal: true
# TraderAssistant: Handles trading tasks.

require "yaml"
require "langchain"
require_relative "../lib/global_ai"

class TraderAssistant
  URLS = [] # No scraping needed for trader

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [GlobalAI.filesystem_tool],
      temperature: config["assistants"]["trader"]["temperature"]
    )
    @logger = Logger.new(File.join(Dir.pwd, "logs", "trader.log"), level: Logger::DEBUG)
  end

  def respond(input)
    if input =~ /run.*trading/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.filesystem_tool.execute_in_background(input, log_file)
      "Running trading task in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/architect.rb..."
cat > $ASSISTANTS_DIR/architect.rb <<'EOF'
# frozen_string_literal: true
# ArchitectAssistant: Manages design tasks.

require "yaml"
require "langchain"
require_relative "../lib/global_ai"

class ArchitectAssistant
  URLS = ["https://archdaily.com", "https://designboom.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["architect"]["temperature"]
    )
  end

  def respond(input)
    if input =~ /implement.*designs/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Implementing designs in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/hacker.rb..."
cat > $ASSISTANTS_DIR/hacker.rb <<'EOF'
# frozen_string_literal: true
# HackerAssistant: Ethical hacking assistant.

require "yaml"
require "langchain"
require_relative "../lib/global_ai"

class HackerAssistant
  URLS = ["https://exploit-db.com", "https://kali.org", "https://hackthissite.org"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["hacker"]["temperature"]
    )
  end

  def respond(input)
    if input =~ /conduct.*analysis/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Conducting analysis in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/snapchat.rb..."
cat > $ASSISTANTS_DIR/snapchat.rb <<'EOF'
# frozen_string_literal: true
# SnapchatAssistant: Snapchat chatbot.

require "yaml"
require "langchain"
require "ferrum"
require_relative "../lib/global_ai"

class SnapchatAssistant
  URLS = ["https://www.snapchat.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["snapchat"]["temperature"]
    )
    @browser = Ferrum::Browser.new
  end

  def respond(input)
    if input =~ /send.*message (\S+) (.+)/i
      user_id, message = $1, $2
      @browser.goto("https://www.snapchat.com/add/#{user_id}")
      @browser.at_css("textarea")&.send_keys(message)
      @browser.at_css("button[type='submit']")&.click
      "Sent Snapchat message to #{user_id}"
    elsif input =~ /send.*message/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Sending Snapchat message in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/tinder.rb..."
cat > $ASSISTANTS_DIR/tinder.rb <<'EOF'
# frozen_string_literal: true
# TinderAssistant: Tinder chatbot.

require "yaml"
require "langchain"
require "ferrum"
require_relative "../lib/global_ai"

class TinderAssistant
  URLS = ["https://tinder.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["tinder"]["temperature"]
    )
    @browser = Ferrum::Browser.new
  end

  def respond(input)
    if input =~ /send.*message (\S+) (.+)/i
      user_id, message = $1, $2
      @browser.goto("https://tinder.com/@#{user_id}")
      @browser.at_css("textarea")&.send_keys(message)
      @browser.at_css("button[type='submit']")&.click
      "Sent Tinder message to #{user_id}"
    elsif input =~ /send.*message/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Sending Tinder message in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/discord.rb..."
cat > $ASSISTANTS_DIR/discord.rb <<'EOF'
# frozen_string_literal: true
# DiscordAssistant: Discord chatbot.

require "yaml"
require "langchain"
require "ferrum"
require_relative "../lib/global_ai"

class DiscordAssistant
  URLS = ["https://discord.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["discord"]["temperature"]
    )
    @browser = Ferrum::Browser.new
  end

  def respond(input)
    if input =~ /send.*message (\S+) (.+)/i
      user_id, message = $1, $2
      @browser.goto("https://discord.com/users/#{user_id}")
      @browser.at_css("textarea")&.send_keys(message)
      @browser.at_css("button[type='submit']")&.click
      "Sent Discord message to #{user_id}"
    elsif input =~ /send.*message/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Sending Discord message in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/onlyfans.rb..."
cat > $ASSISTANTS_DIR/onlyfans.rb <<'EOF'
# frozen_string_literal: true
# OnlyfansAssistant: OnlyFans chatbot.

require "yaml"
require "langchain"
require "ferrum"
require_relative "../lib/global_ai"

class OnlyfansAssistant
  URLS = ["https://onlyfans.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["onlyfans"]["temperature"]
    )
    @browser = Ferrum::Browser.new
  end

  def respond(input)
    if input =~ /send.*message (\S+) (.+)/i
      user_id, message = $1, $2
      @browser.goto("https://onlyfans.com/#{user_id}")
      @browser.at_css("textarea")&.send_keys(message)
      @browser.at_css("button[type='submit']")&.click
      "Sent OnlyFans message to #{user_id}"
    elsif input =~ /send.*message/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Sending OnlyFans message in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/facebook.rb..."
cat > $ASSISTANTS_DIR/facebook.rb <<'EOF'
# frozen_string_literal: true
# FacebookAssistant: Facebook chatbot.

require "yaml"
require "langchain"
require "ferrum"
require_relative "../lib/global_ai"

class FacebookAssistant
  URLS = ["https://www.facebook.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["facebook"]["temperature"]
    )
    @browser = Ferrum::Browser.new
  end

  def respond(input)
    if input =~ /send.*message (\S+) (.+)/i
      user_id, message = $1, $2
      @browser.goto("https://www.facebook.com/#{user_id}")
      @browser.at_css("textarea")&.send_keys(message)
      @browser.at_css("button[type='submit']")&.click
      "Sent Facebook message to #{user_id}"
    elsif input =~ /send.*message/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Sending Facebook message in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Installing assistants/personal.rb..."
cat > $ASSISTANTS_DIR/personal.rb <<'EOF'
# frozen_string_literal: true
# PersonalAssistant: Background protection assistant.

require "yaml"
require "langchain"
require_relative "../lib/global_ai"

class PersonalAssistant
  URLS = ["https://facebook.com", "https://twitter.com"] # Configurable URLs for tracking
  WAKE_WORD = "Assistant"

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [GlobalAI.filesystem_tool, UniversalScraper.new(URLS)],
      temperature: config["assistants"]["personal"]["temperature"]
    )
    @logger = Logger.new(File.join(Dir.pwd, "logs", "personal.log"), level: Logger::DEBUG)
    @dbs = {
      people: File.join(Dir.pwd, "data", "people_db.txt"),
      itinerary: File.join(Dir.pwd, "data", "itinerary_db.txt"),
      finance: File.join(Dir.pwd, "data", "finance_db.txt"),
      profiles: File.join(Dir.pwd, "data", "profiles_db.txt")
    }
    @dbs.each_value { |db| File.write(db, "") unless File.exist?(db) }
  end

  def respond(input)
    if input.start_with?(WAKE_WORD)
      command = input.sub(WAKE_WORD, "").strip
      case command.downcase
      when /track person (.+)/
        track_person($1)
      when /update itinerary (.+)/
        update_itinerary($1)
      when /track finance (.+)/
        track_finance($1)
      when /track profile (.+)/
        track_profile($1)
      else
        @assistant.chat(command)
      end
    else
      "Use '#{WAKE_WORD}' to activate me."
    end
  end

  private

  def track_person(name)
    prompt = "Analyze hypothetical interactions with #{name} for mental state."
    analysis = @assistant.chat(prompt)
    File.write(@dbs[:people], "#{File.read(@dbs[:people])}\n#{name}: #{analysis}")
    @logger.info("Tracked: #{name}")
    analysis =~ /negative|intention to harm/i ? "ALERT: Threat from #{name}" : "No threat: #{name}"
  end

  def update_itinerary(event)
    prompt = "Update itinerary with: #{event}."
    updated = @assistant.chat(prompt)
    File.write(@dbs[:itinerary], "#{File.read(@dbs[:itinerary])}\n#{Time.now}: #{updated}")
    @logger.info("Itinerary: #{event}")
    updated
  end

  def track_finance(transaction)
    prompt = "Track transaction: #{transaction}."
    analysis = @assistant.chat(prompt)
    File.write(@dbs[:finance], "#{File.read(@dbs[:finance])}\n#{Time.now}: #{analysis}")
    @logger.info("Finance: #{transaction}")
    analysis
  end

  def track_profile(url)
    timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
    log_file = "logs/task_#{timestamp}.log"
    GlobalAI.universal_scraper.execute_in_background("scrape #{url}", log_file)
    "Tracking profile #{url} in background. Check #{log_file}."
  end
end
EOF

echo "[AI³] Installing assistants/music.rb..."
cat > $ASSISTANTS_DIR/music.rb <<'EOF'
# frozen_string_literal: true
# MusicAssistant: Creates music content.

require "yaml"
require "langchain"
require "securerandom"
require_relative "../lib/global_ai"

class MusicAssistant
  URLS = ["https://soundonsound.com", "https://replicate.com"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS), GlobalAI.filesystem_tool],
      temperature: config["assistants"]["music"]["temperature"]
    )
    @logger = Logger.new(File.join(Dir.pwd, "logs", "music.log"), level: Logger::DEBUG)
  end

  def respond(input)
    if input =~ /song.*video/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Generating media in background. Check #{log_file}."
    elsif input =~ /create.*music (.+)/
      create_music($1)
    else
      @assistant.chat(input)
    end
  end

  private

  def create_music(style)
    prompt = "Generate a #{style} music composition idea."
    composition = @assistant.chat(prompt)
    file = "composition_#{style}_#{SecureRandom.hex(4)}.txt"
    GlobalAI.filesystem_tool.execute_in_background("echo '#{composition}' > #{file}", "logs/music.log")
    "Music created: #{file}"
  end
end
EOF

echo "[AI³] Installing assistants/material_repurposing.rb..."
cat > $ASSISTANTS_DIR/material_repurposing.rb <<'EOF'
# frozen_string_literal: true
# MaterialRepurposingAssistant: Manages repurposing tasks.

require "yaml"
require "langchain"
require_relative "../lib/global_ai"

class MaterialRepurposingAssistant
  URLS = ["https://recycling.com", "https://epa.gov/recycle"] # Configurable URLs

  def initialize
    config = YAML.load_file(File.join(Dir.pwd, "config", "assistants.yml"))
    @assistant = Langchain::Assistant.new(
      llm: GlobalAI.llm,
      tools: [UniversalScraper.new(URLS)],
      temperature: config["assistants"]["material_repurposing"]["temperature"]
    )
  end

  def respond(input)
    if input =~ /conduct.*analysis/i
      timestamp = Time.now.strftime("%Y%m%d_%H%M%S")
      log_file = "logs/task_#{timestamp}.log"
      GlobalAI.universal_scraper.execute_in_background(input, log_file)
      "Conducting analysis in background. Check #{log_file}."
    else
      @assistant.chat(input)
    end
  end
end
EOF

echo "[AI³] Assistants installed. Use within 'ruby $ROOT_DIR/ai3.rb'."

```

## `install_tests.sh`
```
#!/usr/bin/env zsh
# AI³ CLI Installation Tests
# Version: 5.7.0
# Tests install.sh functionality with RSpec and FactoryBot.

set -e

echo "[AI³] Setting up test environment..."
gem install rspec factory_bot redis || {
  echo "Error: Failed to install test gems."
  exit 1
}

mkdir -p spec/factories spec/support
touch spec/spec_helper.rb spec/install_spec.rb

echo "[AI³] Configuring RSpec..."
cat > spec/spec_helper.rb <<'EOF'
require 'factory_bot'
require 'fileutils'
require 'redis'

RSpec.configure do |config|
  config.include FactoryBot::Syntax::Methods
  config.before(:suite) do
    FactoryBot.find_definitions
  end
end
EOF

echo "[AI³] Defining factories..."
cat > spec/factories/files.rb <<'EOF'
require 'ostruct'
require 'securerandom'

FactoryBot.define do
  # Plain-Ruby factory (no ActiveRecord): build an OpenStruct and skip persistence.
  factory :file, class: OpenStruct do
    skip_create
    path { "tmp/test_#{SecureRandom.hex(4)}.txt" }
    content { "Test content with spaces" }
    initialize_with { new(path: path, content: content) }
    after(:create) { |file| File.write(file.path, file.content) }
  end
end
EOF

echo "[AI³] Writing tests..."
cat > spec/install_spec.rb <<'EOF'
require 'spec_helper'
require 'yaml'

describe "AI³ CLI Installation" do
  let(:root_dir) { Dir.pwd }
  let(:bin_dir) { "#{root_dir}/bin" }
  let(:assistants_dir) { "#{root_dir}/assistants" }
  let(:redis) { Redis.new }

  before(:all) do
    system("chmod +x install.sh && ./install.sh")
  end

  after(:all) do
    FileUtils.rm_rf(["bin", "lib", "config", "data", "logs", "cache", "assistants", "tmp"])
    redis.flushdb
  end

  it "creates required directories" do
    expect(Dir.exist?(bin_dir)).to be true
    expect(File.stat(bin_dir).mode & 0777).to eq(0755)
    expect(Dir.exist?(assistants_dir)).to be true
    expect(Dir.exist?("tmp")).to be true
  end

  it "generates ai3.rb" do
    expect(File.exist?("ai3.rb")).to be true
    expect(File.executable?("ai3.rb")).to be true
  end

  it "generates config.yml with expected keys" do
    config = YAML.load_file("config/config.yml")
    expect(config["version"]).to eq("5.7.0")
    expect(config["llm"]["primary"]).to eq("grok")
    expect(config["schemas"]).to include("default" => "ai3_documents")
    expect(config["voice_enabled"]).to be false
  end

  it "installs all library files" do
    libs = %w[global_ai interactive_session filesystem_tool universal_scraper calculator_tool
              prompt_manager rag_system command_handler error_handling context_manager schema_manager]
    libs.each { |lib| expect(File.exist?("lib/#{lib}.rb")).to be true }
  end

  it "installs general assistant" do
    expect(File.exist?("assistants/general.rb")).to be true
  end

  it "runs ai3.rb and validates environment" do
    expect(system("echo 'y\nall\ny' | ruby ai3.rb >/dev/null 2>&1")).to be true
  end

  it "handles filesystem tool read operations" do
    file = create(:file)
    result = `ruby ai3.rb "read file #{file.path}"`
    expect(result.strip).to eq("Result: #{file.content}")
  end

  it "handles filesystem tool chunked read" do
    file = create(:file)
    result = `ruby ai3.rb "read chunks #{file.path} 5"`
    expect(result.strip).to eq("Result: Test ")
  end

  it "handles filesystem tool replace" do
    file = create(:file)
    result = `ruby ai3.rb "replace #{file.path} Test Hello"`
    expect(result.strip).to eq("Result: Replaced in #{file.path}")
    expect(File.read(file.path)).to eq("Hello content with spaces")
  end

  it "queues tasks" do
    6.times { system("ruby ai3.rb \"use the general assistant to say hi\" &") }
    sleep 1
    result = `ruby ai3.rb "status of my task"`
    expect(result).to include("Queued tasks: 1")
  end

  it "supports RAG evaluation" do
    result = `ruby ai3.rb "evaluate rag 'What is AI?' 'AI is intelligence'"`
    expect(result).to include("Precision:", "Recall:")
  end

  it "classifies intents dynamically with regex" do
    result = `ruby ai3.rb "scrape example.com"`
    expect(result).to include("Started scrape task")
    cache_key = "intent:#{"scrape example.com".hash}"
    expect(redis.get(cache_key)).to include("scrape")
  end

  it "classifies intents dynamically with LLM fallback" do
    result = `ruby ai3.rb "analyze data trends"`
    expect(result).to include("Response:") # Fallback to RAG
    cache_key = "intent:#{"analyze data trends".hash}"
    expect(redis.get(cache_key)).to include("other") # LLM-assigned intent cached
  end

  it "uses generalized scraper with no hardcoded URLs" do
    result = `ruby ai3.rb "scrape https://test.com"`
    expect(result).to include("Started scrape task")
    expect(File.read("logs/task_#{Time.now.strftime("%Y%m%d_%H%M")}" + "*.log")).to include("Scraped and indexed: https://test.com")
  end
end
EOF

echo "[AI³] Running tests..."
rspec spec/install_spec.rb --format documentation

echo "[AI³] CLI tests complete."
```