@enachb
Created January 12, 2026 22:45
SkyDaddy Max AI Monorepo Migration Guide - Comprehensive guide for consolidating multi-repo into monorepo with proto package resolution strategy

Monorepo Migration Guide

Date: 2026-01-12
Target: Consolidate SkyDaddy repositories into a unified monorepo
Primary concern: Proto package naming and Go module path resolution


Table of Contents

  1. Current Architecture Analysis
  2. Identified Issues & Challenges
  3. Target Monorepo Structure
  4. Proto Package Strategy
  5. Migration Plan
  6. Build System Updates
  7. CI/CD Changes
  8. Rollback Strategy
  9. Post-Migration Validation

Current Architecture Analysis

Repository Structure (Pre-Migration)

Current Repositories:
β”œβ”€β”€ github.com/SkyDaddyAI/cloud           # Go backend services
β”œβ”€β”€ github.com/SkyDaddyAI/ui              # React/TypeScript frontend
β”œβ”€β”€ github.com/SkyDaddyAI/object-detector # Python ML agents
└── github.com/SkyDaddyAI/proto           # Shared protobuf definitions (submodule)

Current Proto Integration

Proto Repository

  • Location: [email protected]:SkyDaddyAI/proto.git
  • Consumed as: Git submodule in each repository
  • Package declaration: package proto;
  • Go package option: option go_package = "github.com/SkyDaddyAI/proto";

Go (Cloud Services)

// In proto files
option go_package = "github.com/SkyDaddyAI/proto";

// Generated to: cloud/proto/
// Imported as:
import pb "github.com/SkyDaddyAI/proto"

Build command:

protoc --proto_path=. \
  --go_out=proto \
  --go-grpc_out=proto \
  --go_opt=module=github.com/SkyDaddyAI/proto \
  --go-grpc_opt=module=github.com/SkyDaddyAI/proto \
  proto/**/*.proto

TypeScript (UI)

# ui/buf.gen.yaml
version: v2
plugins:
  - local: protoc-gen-es
    out: src/gen
    include_imports: true
    opt: target=ts

Generated to: ui/src/gen/proto/
Import style: import { VideoService } from '@/gen/proto/api/api_pb'

Python (Object Detector)

protoc --python_out=. proto/**/*.proto

Generated to: object-detector/proto/ (currently empty)
Import style: from proto.api import api_pb2


Identified Issues & Challenges

1. Proto Go Package Path Mismatch ⚠️ CRITICAL

Current State:

// proto/api/api.proto
package proto;
option go_package = "github.com/SkyDaddyAI/proto";

Problem:

  • Go code expects github.com/SkyDaddyAI/proto module
  • In monorepo, this module path won't exist
  • Generated code in cloud/proto/ needs different import path

Impact:

  • All Go imports will break
  • The require directive in go.mod will fail to resolve
  • Build will fail with "module not found" errors

2. Go Module Structure

Current: Each repository has its own go.mod

// cloud/go.mod
module github.com/SkyDaddyAI/cloud

Challenge: Need to decide between:

  • Single module (all Go code in one module)
  • Multi-module workspace (separate modules with go.work)

3. Proto Import Paths

Current:

import "proto/api/picture.proto";
import "proto/model/drone.proto";

Challenge: Need consistent path resolution across:

  • Go: protoc --proto_path
  • TypeScript: buf configuration
  • Python: protoc output paths

4. Build Coordination

Current: Each repository builds independently

Challenge: Need to ensure:

  • Proto builds before language-specific code
  • Dependency order (proto β†’ cloud β†’ ui)
  • Incremental builds (only rebuild changed components)

5. CI/CD Pipeline Fragmentation

Current: Separate CI pipelines per repository

Challenge:

  • Unified pipeline or separate jobs?
  • Cache sharing across components
  • Conditional builds (only build what changed)

6. Versioning & Releases

Current: Each repository has independent versioning

Challenge:

  • Unified versioning or independent?
  • Release strategies (atomic vs. independent)
  • Docker image tagging

Target Monorepo Structure

Recommended Structure

max_ai/                                    # Monorepo root
β”œβ”€β”€ .github/
β”‚   └── workflows/                        # Unified CI/CD
β”‚       β”œβ”€β”€ proto.yml                     # Proto validation & generation
β”‚       β”œβ”€β”€ cloud.yml                     # Cloud services build/test
β”‚       β”œβ”€β”€ ui.yml                        # UI build/test
β”‚       └── object-detector.yml           # Python agents build/test
β”œβ”€β”€ proto/                                # Proto definitions (no longer submodule)
β”‚   β”œβ”€β”€ api/
β”‚   β”‚   β”œβ”€β”€ alerting.proto
β”‚   β”‚   β”œβ”€β”€ api.proto
β”‚   β”‚   └── ...
β”‚   β”œβ”€β”€ model/
β”‚   β”‚   β”œβ”€β”€ drone.proto
β”‚   β”‚   └── ...
β”‚   β”œβ”€β”€ mission/
β”‚   β”œβ”€β”€ video/
β”‚   β”œβ”€β”€ buf.yaml                          # Buf configuration
β”‚   β”œβ”€β”€ buf.gen.go.yaml                   # Go generation config
β”‚   β”œβ”€β”€ buf.gen.ts.yaml                   # TypeScript generation config
β”‚   └── buf.gen.python.yaml               # Python generation config
β”œβ”€β”€ cloud/                                # Go backend services
β”‚   β”œβ”€β”€ go.mod                            # Module: github.com/SkyDaddyAI/max_ai/cloud
β”‚   β”œβ”€β”€ go.sum
β”‚   β”œβ”€β”€ proto/                            # Generated Go code (gitignored)
β”‚   β”‚   β”œβ”€β”€ api/
β”‚   β”‚   β”‚   β”œβ”€β”€ api.pb.go
β”‚   β”‚   β”‚   β”œβ”€β”€ api_grpc.pb.go
β”‚   β”‚   β”‚   └── ...
β”‚   β”‚   └── model/
β”‚   β”œβ”€β”€ server/
β”‚   β”œβ”€β”€ alerts/
β”‚   β”œβ”€β”€ mission_service/
β”‚   └── ...
β”œβ”€β”€ ui/                                   # React frontend
β”‚   β”œβ”€β”€ package.json                      # Uses pnpm workspace
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ gen/                          # Generated TypeScript (gitignored)
β”‚   β”‚   β”‚   └── proto/
β”‚   β”‚   β”‚       β”œβ”€β”€ api/
β”‚   β”‚   β”‚       └── model/
β”‚   β”‚   └── ...
β”‚   └── ...
β”œβ”€β”€ object-detector/                      # Python ML agents
β”‚   β”œβ”€β”€ proto/                            # Generated Python (gitignored)
β”‚   β”‚   β”œβ”€β”€ api/
β”‚   β”‚   └── model/
β”‚   β”œβ”€β”€ detection-agent/
β”‚   β”œβ”€β”€ recognition-agent/
β”‚   └── ...
β”œβ”€β”€ docs/                                 # Unified documentation
β”œβ”€β”€ scripts/                              # Shared build/dev scripts
β”‚   β”œβ”€β”€ proto-gen-all.sh                 # Generate all proto targets
β”‚   β”œβ”€β”€ dev-setup.sh                     # Initial setup
β”‚   └── validate-proto.sh                # Proto linting/validation
β”œβ”€β”€ go.work                               # Go workspace (optional)
β”œβ”€β”€ pnpm-workspace.yaml                   # PNPM workspace config
β”œβ”€β”€ Makefile                              # Root-level build targets
β”œβ”€β”€ CLAUDE.md                             # Updated for monorepo
└── README.md                             # Monorepo overview

Key Design Decisions

1. Proto as First-Class Citizen

  • Location: Root-level proto/ directory
  • Status: Regular directory (not submodule)
  • Ownership: Shared by all components

2. Go Module Strategy: Multi-Module with Workspace

Rationale:

  • Allows independent versioning of cloud services if needed
  • Better module boundaries
  • Easier to extract services later if needed

Implementation:

// go.work (root)
go 1.24.0

use (
    ./cloud
    ./object-detector/diffing-agent  // If needed
)
// cloud/go.mod
module github.com/SkyDaddyAI/max_ai/cloud

require (
    github.com/SkyDaddyAI/max_ai/proto v0.0.0
    ...
)

replace github.com/SkyDaddyAI/max_ai/proto => ../proto

3. Generated Code Strategy: Gitignored

Rationale:

  • Reduces merge conflicts
  • Always in sync with proto definitions
  • Smaller repository size
  • Forces explicit build step

Implementation:

# Generated proto code
cloud/proto/
ui/src/gen/proto/
object-detector/proto/

Proto Package Strategy

The Core Problem

Before (Multi-Repo):

// proto/api/api.proto
syntax = "proto3";
package proto;
option go_package = "github.com/SkyDaddyAI/proto";

Generated Go code lives in separate repo with module path github.com/SkyDaddyAI/proto.

After (Monorepo): we need to decide on a new Go package path.

Solution Options

Option A: Nested Module (RECOMMENDED βœ…)

Create proto/ as a separate Go module within the monorepo.

Structure:

proto/
β”œβ”€β”€ go.mod                    # module github.com/SkyDaddyAI/max_ai/proto
β”œβ”€β”€ api/
β”‚   β”œβ”€β”€ api.proto
β”‚   β”œβ”€β”€ api.pb.go            # Generated here
β”‚   └── api_grpc.pb.go
└── model/
    β”œβ”€β”€ drone.proto
    └── drone.pb.go          # Generated here

Proto file:

// proto/api/api.proto
syntax = "proto3";
package proto.api;  // Changed: more specific
option go_package = "github.com/SkyDaddyAI/max_ai/proto/api";  // Changed

Go imports:

// cloud/server/main.go
import (
    apipb "github.com/SkyDaddyAI/max_ai/proto/api"
    modelpb "github.com/SkyDaddyAI/max_ai/proto/model"
)

Build command:

cd proto
protoc --proto_path=. \
  --go_out=. \
  --go-grpc_out=. \
  --go_opt=paths=source_relative \
  --go-grpc_opt=paths=source_relative \
  api/*.proto model/*.proto

Pros:

  • βœ… Clean module separation
  • βœ… Standard Go module layout
  • βœ… Easy to version independently
  • βœ… Works with Go workspace
  • βœ… Can publish proto module separately if needed

Cons:

  • ❌ More complex than single module
  • ❌ Need replace directives during development

Option B: Generate into Cloud Module

Keep proto definitions as source, generate into cloud/internal/proto/.

Structure:

proto/
β”œβ”€β”€ api/
β”‚   └── api.proto            # Source only
└── model/
    └── drone.proto          # Source only

cloud/
β”œβ”€β”€ go.mod                   # module github.com/SkyDaddyAI/max_ai/cloud
β”œβ”€β”€ internal/
β”‚   └── proto/              # Generated here
β”‚       β”œβ”€β”€ api/
β”‚       β”‚   β”œβ”€β”€ api.pb.go
β”‚       β”‚   └── api_grpc.pb.go
β”‚       └── model/
β”‚           └── drone.pb.go

Proto file:

// proto/api/api.proto
syntax = "proto3";
package proto.api;
option go_package = "github.com/SkyDaddyAI/max_ai/cloud/internal/proto/api";

Go imports:

// cloud/server/main.go
import (
    apipb "github.com/SkyDaddyAI/max_ai/cloud/internal/proto/api"
)

Pros:

  • βœ… Single Go module
  • βœ… Simpler dependency management
  • βœ… No replace directives needed

Cons:

  • ❌ Proto code tied to cloud module
  • ❌ Can't easily share with external consumers
  • ❌ Violates separation of concerns
  • ❌ Hard to use proto in other Go modules later

Option C: Vendor Proto (Not Recommended ❌)

Generate proto into each consumer's directory.

Pros: None for this use case

Cons:

  • ❌ Code duplication
  • ❌ Sync issues
  • ❌ Larger repository
  • ❌ More complex builds

RECOMMENDATION: Option A (Nested Module)

Create proto/ as its own Go module with path github.com/SkyDaddyAI/max_ai/proto.


Migration Plan

Phase 1: Preparation (Pre-Migration)

1.1 Create Migration Branch

# In the monorepo location
git checkout -b migration/monorepo-consolidation

1.2 Audit Current Proto Usage

# For each repository, identify all proto imports
cd cloud
grep -r "github.com/SkyDaddyAI/proto" --include="*.go" . > ../proto-imports-cloud.txt

cd ../ui
grep -r "from '@/gen/proto" --include="*.ts" --include="*.tsx" . > ../proto-imports-ui.txt

cd ../object-detector
find . -name "*_pb2.py" > ../proto-imports-python.txt

1.3 Document Current Build Commands

Create migration-state.md:

# Current Build State

## Cloud
- Proto generation: `cd cloud && make proto-build`
- Service build: `cd cloud && make build-fast`
- Tests: `cd cloud && make test-alerting`

## UI
- Proto generation: `cd ui && pnpm run proto-gen`
- Build: `cd ui && pnpm run build`

## Object Detector
- Proto generation: `cd object-detector && make proto-build`
- Build: `cd object-detector && docker compose build`

1.4 Tag Current State

# In each repository
git tag pre-monorepo-migration
git push origin pre-monorepo-migration

Phase 2: Proto Module Creation

2.1 Initialize Proto Module

mkdir -p max_ai/proto
cd max_ai/proto

# Create go.mod with the required dependencies pinned
# (go mod init would also work, but the heredoc below overwrites its output)
cat > go.mod <<EOF
module github.com/SkyDaddyAI/max_ai/proto

go 1.24.0

require (
    google.golang.org/grpc v1.76.0
    google.golang.org/protobuf v1.36.10
)
EOF

2.2 Copy Proto Definitions

# From proto submodule
cp -r <proto-repo>/api ./api
cp -r <proto-repo>/model ./model
cp -r <proto-repo>/mission ./mission
cp -r <proto-repo>/video ./video
cp <proto-repo>/navigation.proto ./

2.3 Update Proto Package Declarations

Before:

package proto;
option go_package = "github.com/SkyDaddyAI/proto";

After:

package proto.api;  // More specific package names
option go_package = "github.com/SkyDaddyAI/max_ai/proto/api";

Script to update all proto files:

#!/bin/bash
# scripts/update-proto-packages.sh

find proto -name "*.proto" -type f | while read -r file; do
    # Directory relative to proto/ ("." for files at the proto root)
    dir=$(dirname "${file#proto/}")

    # Determine package name
    if [ "$dir" = "." ]; then
        pkg="proto"
        go_pkg="github.com/SkyDaddyAI/max_ai/proto"
    else
        pkg="proto.${dir//\//.}"
        go_pkg="github.com/SkyDaddyAI/max_ai/proto/$dir"
    fi

    # Update package declaration
    sed -i.bak "s|^package proto;|package $pkg;|" "$file"

    # Update go_package option
    sed -i.bak "s|option go_package = \"github.com/SkyDaddyAI/proto\";|option go_package = \"$go_pkg\";|" "$file"

    rm "$file.bak"
done
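
Since the sed pass above is fire-and-forget, it is worth verifying the result. A minimal stdlib-only checker, sketched below, confirms each file declares the package/go_package pair its directory implies (the module path is the one proposed in this guide; file names are illustrative):

```go
// checkproto.go — sketch: verify that a .proto file's package and
// go_package options match the directory it lives in under proto/.
package main

import (
	"fmt"
	"path"
	"regexp"
	"strings"
)

var (
	pkgRe   = regexp.MustCompile(`(?m)^package\s+([\w.]+)\s*;`)
	goPkgRe = regexp.MustCompile(`option\s+go_package\s*=\s*"([^"]+)"`)
)

// expected returns the package and go_package a file at relPath
// (relative to proto/, e.g. "api/api.proto") should declare.
func expected(relPath string) (pkg, goPkg string) {
	dir := path.Dir(relPath)
	if dir == "." {
		return "proto", "github.com/SkyDaddyAI/max_ai/proto"
	}
	return "proto." + strings.ReplaceAll(dir, "/", "."),
		"github.com/SkyDaddyAI/max_ai/proto/" + dir
}

// check reports mismatches found in the given proto source.
func check(relPath, src string) []string {
	wantPkg, wantGoPkg := expected(relPath)
	var problems []string
	if m := pkgRe.FindStringSubmatch(src); m == nil || m[1] != wantPkg {
		problems = append(problems, fmt.Sprintf("%s: want package %s", relPath, wantPkg))
	}
	if m := goPkgRe.FindStringSubmatch(src); m == nil || m[1] != wantGoPkg {
		problems = append(problems, fmt.Sprintf("%s: want go_package %s", relPath, wantGoPkg))
	}
	return problems
}

func main() {
	src := `syntax = "proto3";
package proto.api;
option go_package = "github.com/SkyDaddyAI/max_ai/proto/api";
`
	fmt.Println(len(check("api/api.proto", src))) // → 0
}
```

Wiring this over the real tree (filepath.WalkDir plus os.ReadFile) is a few more lines; the point is that the expected values are pure functions of the file's location, so drift is mechanically detectable.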

2.4 Create Proto Build Configuration

proto/buf.yaml:

version: v2
modules:
  - path: .
breaking:
  use:
    - FILE
lint:
  use:
    - STANDARD

proto/buf.gen.go.yaml:

version: v2
plugins:
  - remote: buf.build/protocolbuffers/go
    out: .
    opt:
      - paths=source_relative
  - remote: buf.build/grpc/go
    out: .
    opt:
      - paths=source_relative

proto/buf.gen.ts.yaml:

version: v2
plugins:
  - local: protoc-gen-es
    out: ../ui/src/gen
    opt:
      - target=ts

proto/buf.gen.python.yaml:

version: v2
plugins:
  - remote: buf.build/protocolbuffers/python
    out: ../object-detector
    opt:
      - paths=source_relative

2.5 Create Proto Makefile

proto/Makefile:

.PHONY: gen-go gen-ts gen-python gen-all clean lint

gen-go: ## Generate Go code
	@echo "πŸ”¨ Generating Go proto code..."
	@buf generate --template buf.gen.go.yaml
	@go mod tidy
	@echo "βœ… Go proto code generated"

gen-ts: ## Generate TypeScript code
	@echo "πŸ”¨ Generating TypeScript proto code..."
	@buf generate --template buf.gen.ts.yaml
	@echo "βœ… TypeScript proto code generated"

gen-python: ## Generate Python code
	@echo "πŸ”¨ Generating Python proto code..."
	@buf generate --template buf.gen.python.yaml
	@echo "βœ… Python proto code generated"

gen-all: gen-go gen-ts gen-python ## Generate all proto code

lint: ## Lint proto files
	@buf lint

clean: ## Clean generated files
	@echo "🧹 Cleaning generated proto files..."
	@find . -name "*.pb.go" -delete
	@rm -rf ../ui/src/gen/proto
	@rm -rf ../object-detector/proto
	@echo "βœ… Generated files cleaned"

Phase 3: Cloud Services Migration

3.1 Update Cloud go.mod

cloud/go.mod:

module github.com/SkyDaddyAI/max_ai/cloud

go 1.24.0

require (
    github.com/SkyDaddyAI/max_ai/proto v0.0.0
    // ... other dependencies
)

// During development, use local proto module
replace github.com/SkyDaddyAI/max_ai/proto => ../proto

3.2 Update Import Paths

Script to update all Go imports:

#!/bin/bash
# scripts/update-go-imports.sh

cd cloud

# Find all .go files and update imports
find . -name "*.go" -type f | while read file; do
    # Update import statements
    # From: github.com/SkyDaddyAI/proto
    # To: github.com/SkyDaddyAI/max_ai/proto/api (or /model, etc.)

    # This is complex - recommend using gofmt with sed or a Go tool
    sed -i.bak 's|"github.com/SkyDaddyAI/proto"|"github.com/SkyDaddyAI/max_ai/proto/api"|g' "$file"
    rm "$file.bak"
done

# Run goimports to fix imports
go install golang.org/x/tools/cmd/goimports@latest
find . -name "*.go" -exec goimports -w {} \;

Manual verification needed:

  • Check each import is correct (api vs model vs mission)
  • Update type references if package names changed
  • Verify gRPC client/server instantiation

3.3 Update Cloud Makefile

cloud/Makefile (update proto-build target):

proto-build: ## Build protocol buffer files (deprecated - use root make proto)
	@echo "⚠️  Deprecated: Use 'make proto' from repository root"
	@echo "πŸ”— Delegating to proto module..."
	@cd ../proto && $(MAKE) gen-go
	@echo "βœ… Protocol buffers built successfully"

3.4 Update Cloud Docker Setup

cloud/.devcontainer/docker-compose.yml:

services:
  app:
    build:
      context: ../..  # Changed: build context is the monorepo root (paths are relative to the compose file in cloud/.devcontainer)
      dockerfile: cloud/.devcontainer/Dockerfile
    volumes:
      - ../../proto:/workspaces/proto:cached  # Mount proto from the monorepo root
      - ..:/workspaces/cloud:cached

cloud/.devcontainer/Dockerfile:

FROM golang:1.24

# Install protoc and buf
RUN apt-get update && apt-get install -y protobuf-compiler
RUN go install github.com/bufbuild/buf/cmd/buf@latest

# Set working directory
WORKDIR /workspaces

# Copy proto module first
COPY proto/ /workspaces/proto/
RUN cd /workspaces/proto && go mod download

# Copy cloud module
COPY cloud/ /workspaces/cloud/
RUN cd /workspaces/cloud && go mod download

Phase 4: UI Migration

4.1 Create Workspace Configuration

pnpm-workspace.yaml (root):

packages:
  - 'ui'
  # Add more if you have multiple TS packages

4.2 Update UI package.json

ui/package.json:

{
  "name": "@skydaddy/ui",
  "scripts": {
    "proto-gen": "cd ../proto && make gen-ts",
    "dev": "vite --port 3000",
    "build": "vite build && tsc --noEmit"
  }
}

4.3 Update UI Imports

Before:

import { VideoService } from '@/gen/proto/api/api_pb'

After: (No change needed if generation output stays the same)

import { VideoService } from '@/gen/proto/api/api_pb'

4.4 Update UI Build Configuration

ui/vite.config.ts: (Likely no changes needed)

ui/tsconfig.json: (Likely no changes needed)

Phase 5: Object Detector Migration

5.1 Update Python Proto Generation

object-detector/Makefile:

proto-build: ## Build protocol buffer files
	@echo "πŸ”¨ Building Python protocol buffer files..."
	@cd ../proto && $(MAKE) gen-python
	@echo "βœ… Protocol buffers built successfully"

5.2 Update Python Imports

Before:

from proto.api import api_pb2

After: (Depends on generation output structure)

from proto.api import api_pb2  # Likely same

Phase 6: Root-Level Integration

6.1 Create Root Makefile

Makefile (root):

.PHONY: help init proto proto-go proto-ts proto-python \
        cloud-build cloud-test ui-build ui-test \
        object-detector-build test-all

help: ## Show this help
	@echo "SkyDaddy Max AI Monorepo"
	@echo ""
	@echo "Usage: make [target]"
	@echo ""
	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "  \033[36m%-20s\033[0m %s\n", $$1, $$2}'

init: ## Initialize repository (submodules, dependencies)
	@echo "πŸ“¦ Initializing repository..."
	@echo "βœ… Installing Go dependencies..."
	@cd proto && go mod download
	@cd cloud && go mod download
	@echo "βœ… Installing Node dependencies..."
	@cd ui && pnpm install
	@echo "βœ… Monorepo initialized"

# Proto targets
proto: proto-go proto-ts proto-python ## Generate all proto code

proto-go: ## Generate Go proto code
	@cd proto && $(MAKE) gen-go

proto-ts: ## Generate TypeScript proto code
	@cd proto && $(MAKE) gen-ts

proto-python: ## Generate Python proto code
	@cd proto && $(MAKE) gen-python

proto-lint: ## Lint proto files
	@cd proto && $(MAKE) lint

proto-clean: ## Clean generated proto files
	@cd proto && $(MAKE) clean

# Cloud targets
cloud-build: proto-go ## Build cloud services
	@cd cloud && $(MAKE) build-fast

cloud-test: proto-go ## Run cloud tests
	@cd cloud && $(MAKE) test-alerting
	@cd cloud && $(MAKE) mission-test
	@cd cloud && $(MAKE) video-test

cloud-dev: proto-go ## Start cloud dev environment
	@cd cloud && $(MAKE) dev-up

# UI targets
ui-build: proto-ts ## Build UI
	@cd ui && pnpm run build

ui-dev: proto-ts ## Start UI dev server
	@cd ui && pnpm run dev

ui-test: proto-ts ## Run UI tests
	@cd ui && pnpm run test

# Object detector targets
object-detector-build: proto-python ## Build object detector
	@cd object-detector && docker compose build

object-detector-up: proto-python ## Start object detector services
	@cd object-detector && docker compose up -d

# Combined targets
dev: ## Start all dev services
	@$(MAKE) cloud-dev
	@$(MAKE) ui-dev
	@$(MAKE) object-detector-up

test-all: ## Run all tests
	@$(MAKE) proto-lint
	@$(MAKE) cloud-test
	@$(MAKE) ui-test

build-all: ## Build all components
	@$(MAKE) proto
	@$(MAKE) cloud-build
	@$(MAKE) ui-build
	@$(MAKE) object-detector-build

6.2 Create Go Workspace

go.work (root):

go 1.24.0

use (
    ./proto
    ./cloud
)

6.3 Update .gitignore

.gitignore (root):

# Generated proto code (covers nested dirs and root-level files like navigation.pb.go)
proto/**/*.pb.go
proto/**/*_grpc.pb.go

cloud/proto/
ui/src/gen/proto/
object-detector/proto/

# Dependencies
node_modules/
.pnpm-store/

# Build artifacts
cloud/bin/
ui/dist/

# Dev environment
.env
.env.local
*.log

# IDE
.vscode/
.idea/
*.swp
*.swo

6.4 Create Setup Script

scripts/dev-setup.sh:

#!/bin/bash
set -e

echo "πŸš€ SkyDaddy Max AI - Development Setup"
echo ""

# Check prerequisites
echo "πŸ“‹ Checking prerequisites..."
command -v go >/dev/null 2>&1 || { echo "❌ Go not found. Please install Go 1.24+"; exit 1; }
command -v node >/dev/null 2>&1 || { echo "❌ Node.js not found. Please install Node.js 18+"; exit 1; }
command -v pnpm >/dev/null 2>&1 || { echo "❌ pnpm not found. Run: npm install -g pnpm"; exit 1; }
command -v protoc >/dev/null 2>&1 || { echo "❌ protoc not found. Please install protobuf compiler"; exit 1; }
command -v buf >/dev/null 2>&1 || { echo "⚠️  buf not found. Installing..."; go install github.com/bufbuild/buf/cmd/buf@latest; }

echo "βœ… Prerequisites check passed"
echo ""

# Install dependencies
echo "πŸ“¦ Installing dependencies..."
make init

# Generate proto files
echo "πŸ”¨ Generating proto files..."
make proto

echo ""
echo "βœ… Setup complete!"
echo ""
echo "Next steps:"
echo "  1. Start cloud services: make cloud-dev"
echo "  2. Start UI dev server: make ui-dev"
echo "  3. Start object detector: make object-detector-up"
echo "  Or run all: make dev"

Make the script executable:

chmod +x scripts/dev-setup.sh

Phase 7: CI/CD Migration

7.1 Proto Validation Workflow

.github/workflows/proto.yml:

name: Proto Validation

on:
  pull_request:
    paths:
      - 'proto/**'
  push:
    branches: [main]
    paths:
      - 'proto/**'

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Buf
        uses: bufbuild/buf-setup-action@v1
        with:
          version: latest

      - name: Lint proto files
        run: cd proto && buf lint

      - name: Check breaking changes
        run: cd proto && buf breaking --against 'https://github.com/SkyDaddyAI/max_ai.git#branch=main,subdir=proto'
        if: github.event_name == 'pull_request'

  generate:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        target: [go, ts, python]
    steps:
      - uses: actions/checkout@v4

      - name: Setup Go
        uses: actions/setup-go@v5
        with:
          go-version: '1.24'

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'

      - name: Setup Buf
        uses: bufbuild/buf-setup-action@v1

      - name: Generate ${{ matrix.target }} code
        run: make proto-${{ matrix.target }}

      - name: Upload generated code
        uses: actions/upload-artifact@v4
        with:
          name: proto-${{ matrix.target }}
          path: |
            proto/**/*.pb.go
            ui/src/gen/proto/**
            object-detector/proto/**

7.2 Cloud Services Workflow

.github/workflows/cloud.yml:

name: Cloud Services

on:
  pull_request:
    paths:
      - 'proto/**'
      - 'cloud/**'
      - '.github/workflows/cloud.yml'
  push:
    branches: [main]
    paths:
      - 'proto/**'
      - 'cloud/**'

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: pgvector/pgvector:pg15
        env:
          POSTGRES_PASSWORD: test
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

      nats:
        image: nats:latest
        options: >-
          --health-cmd "nats-server --version"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

      redis:
        image: redis:7
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v4

      - name: Setup Go
        uses: actions/setup-go@v5
        with:
          go-version: '1.24'
          cache-dependency-path: |
            proto/go.sum
            cloud/go.sum

      - name: Generate proto
        run: make proto-go

      - name: Build services
        run: make cloud-build

      - name: Run tests
        run: make cloud-test
        env:
          DB_HOST: localhost
          DB_PORT: 5432
          DB_USER: postgres
          DB_PASSWORD: test
          NATS_URL: nats://localhost:4222
          REDIS_HOST: localhost:6379

  build-docker:
    runs-on: ubuntu-latest
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    steps:
      - uses: actions/checkout@v4

      - name: Setup Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to GHCR
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: cloud/Dockerfile
          push: true
          tags: |
            ghcr.io/skydaddyai/cloud:latest
            ghcr.io/skydaddyai/cloud:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

7.3 UI Workflow

.github/workflows/ui.yml:

name: UI

on:
  pull_request:
    paths:
      - 'proto/**'
      - 'ui/**'
      - '.github/workflows/ui.yml'
  push:
    branches: [main]
    paths:
      - 'proto/**'
      - 'ui/**'

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Setup pnpm
        uses: pnpm/action-setup@v3
        with:
          version: 8

      - name: Get pnpm store directory
        id: pnpm-cache
        shell: bash
        run: echo "STORE_PATH=$(pnpm store path)" >> $GITHUB_OUTPUT

      - name: Setup pnpm cache
        uses: actions/cache@v4
        with:
          path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
          key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
          restore-keys: |
            ${{ runner.os }}-pnpm-store-

      - name: Install dependencies
        run: cd ui && pnpm install --frozen-lockfile

      - name: Generate proto
        run: make proto-ts

      - name: Build
        run: make ui-build

      - name: Type check
        run: cd ui && pnpm run type-check || echo "Add type-check script"

Phase 8: Documentation Updates

8.1 Update CLAUDE.md

Replace proto submodule references with monorepo paths:

Before:

### Setup
git submodule update --init --recursive
make proto-build

After:

### Setup
# No submodule initialization needed
make init               # Install dependencies
make proto              # Generate all proto targets

8.2 Update README.md

README.md (root):
# SkyDaddy Max AI - Monorepo

Complete drone video processing and AI analysis platform.

## Quick Start

# Clone repository
git clone https://github.com/SkyDaddyAI/max_ai.git
cd max_ai

# Setup development environment
./scripts/dev-setup.sh

# Start all services
make dev

Repository Structure

  • proto/ - Shared protobuf definitions
  • cloud/ - Go backend services (gRPC, video, missions, alerts)
  • ui/ - React/TypeScript web interface
  • object-detector/ - Python ML agents (YOLO, pose, recognition)

Development

Prerequisites

  • Go 1.24+
  • Node.js 20+
  • pnpm 8+
  • Docker & Docker Compose
  • protoc (protobuf compiler)
  • buf (optional, for proto linting)

Build Commands

make proto              # Generate all proto code
make cloud-build        # Build cloud services
make ui-build          # Build UI
make object-detector-build  # Build Python agents

make build-all         # Build everything

Development Commands

make cloud-dev         # Start cloud services (docker-compose)
make ui-dev           # Start UI dev server (vite)
make object-detector-up  # Start ML agents

make dev              # Start everything

Testing

make test-all         # Run all tests
make cloud-test       # Cloud services only
make ui-test         # UI only

Documentation

License

Proprietary - SkyDaddyAI


Phase 9: Testing & Validation

9.1 Create Validation Checklist

MIGRATION_CHECKLIST.md:
# Migration Validation Checklist

## Proto Generation
- [ ] Go code generates without errors: `make proto-go`
- [ ] TypeScript code generates without errors: `make proto-ts`
- [ ] Python code generates without errors: `make proto-python`
- [ ] Buf lint passes: `cd proto && buf lint`
- [ ] Generated files are gitignored

## Cloud Services
- [ ] Go workspace resolves proto module: `cd cloud && go list -m github.com/SkyDaddyAI/max_ai/proto`
- [ ] All imports updated and compile: `cd cloud && go build ./...`
- [ ] Tests pass: `make cloud-test`
- [ ] Docker build works: `cd cloud && docker build -t test .`
- [ ] Dev container starts: `make cloud-dev`

## UI
- [ ] Dependencies install: `cd ui && pnpm install`
- [ ] TypeScript compiles: `cd ui && pnpm run build`
- [ ] Dev server starts: `cd ui && pnpm run dev`
- [ ] Proto imports resolve in IDE

## Object Detector
- [ ] Python proto imports work: `cd object-detector && python -c "from proto.api import api_pb2"`
- [ ] Docker build works: `cd object-detector && docker compose build`
- [ ] Services start: `cd object-detector && docker compose up`

## CI/CD
- [ ] Proto workflow passes
- [ ] Cloud workflow passes
- [ ] UI workflow passes
- [ ] Object detector workflow passes

## Documentation
- [ ] CLAUDE.md updated
- [ ] README.md updated
- [ ] All commands in docs verified

9.2 Run Validation

# From repository root
./scripts/validate-migration.sh

scripts/validate-migration.sh:

#!/bin/bash
set -e

echo "πŸ§ͺ Migration Validation"
echo ""

# Proto generation
echo "1️⃣ Testing proto generation..."
make proto-clean
make proto
echo "βœ… Proto generation passed"
echo ""

# Cloud build
echo "2️⃣ Testing cloud build..."
cd cloud
go mod download
go build ./...
cd ..
echo "βœ… Cloud build passed"
echo ""

# UI build
echo "3️⃣ Testing UI build..."
cd ui
pnpm install
pnpm run build
cd ..
echo "βœ… UI build passed"
echo ""

# Tests
echo "4️⃣ Running tests..."
make test-all || echo "⚠️  Some tests failed (review required)"
echo ""

echo "βœ… Migration validation complete!"

## Build System Updates

### Before (Multi-Repo)

Each repository:

```
cloud/
β”œβ”€β”€ Makefile (proto-build, builds services)
β”œβ”€β”€ .devcontainer/
└── go.mod

ui/
β”œβ”€β”€ package.json (proto-gen script)
└── buf.gen.yaml

object-detector/
β”œβ”€β”€ Makefile (proto-build)
└── docker-compose.yml
```

### After (Monorepo)

```
max_ai/
β”œβ”€β”€ Makefile (orchestrates all builds)
β”œβ”€β”€ proto/
β”‚   β”œβ”€β”€ Makefile (owns proto generation)
β”‚   β”œβ”€β”€ buf.gen.*.yaml (multiple templates)
β”‚   └── go.mod
β”œβ”€β”€ cloud/
β”‚   β”œβ”€β”€ Makefile (delegates proto to root)
β”‚   └── go.mod (requires ../proto)
β”œβ”€β”€ ui/
β”‚   └── package.json (delegates to root make)
└── object-detector/
    └── Makefile (delegates proto to root)
```

### Dependency Graph

```
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  proto  β”‚  ← First
β””β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”˜
     β”‚
     β”œβ”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
     β”‚      β”‚          β”‚            β”‚
     β–Ό      β–Ό          β–Ό            β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β” β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ cloud  β”‚ β”‚  ui  β”‚ β”‚ obj-det β”‚  β”‚ external β”‚
β”‚ (Go)   β”‚ β”‚ (TS) β”‚ β”‚ (Py)    β”‚  β”‚ services β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”˜ β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
```
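The proto-first ordering above can be encoded directly in the root Makefile. A minimal sketch, assuming the layout shown earlier (target names and per-directory commands are illustrative, not taken from the actual repo):

```makefile
# Hypothetical root Makefile: every consumer target depends on proto,
# so `make all` always generates protos first.
.PHONY: proto cloud ui object-detector all

proto:
	$(MAKE) -C proto generate

cloud: proto
	cd cloud && go build ./...

ui: proto
	cd ui && pnpm install && pnpm run build

object-detector: proto
	cd object-detector && docker compose build

all: cloud ui object-detector
```

With this shape, `make -j3 all` runs the three consumers in parallel after the single shared `proto` prerequisite completes, which matches the ~3m 30s parallel estimate below.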

## CI/CD Changes

### Strategy: Monorepo-Aware CI

#### Option 1: Path-Based Triggers (RECOMMENDED)

Pros:

- βœ… Only build what changed
- βœ… Faster CI runs
- βœ… Clear job separation

Cons:

- ❌ More complex workflow files
- ❌ Need to handle cross-component changes

#### Option 2: Always Build All

Pros:

- βœ… Simple configuration
- βœ… Guaranteed consistency

Cons:

- ❌ Slower CI runs
- ❌ Wastes resources

#### Recommended: Hybrid Approach

1. **PR checks:** Path-based (only validate changed components)
2. **Main branch:** Build all (ensure consistency)
3. **Tags:** Build and publish all
### Cache Strategy

```yaml
# Example: Go cache across jobs
- name: Setup Go cache
  uses: actions/cache@v4
  with:
    path: |
      ~/.cache/go-build
      ~/go/pkg/mod
    key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}
```

**Important:** Include both `proto/go.sum` and `cloud/go.sum` in the hash key.
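Since `hashFiles` accepts multiple patterns, the key can name both modules explicitly instead of relying on the `**/go.sum` glob:

```yaml
key: ${{ runner.os }}-go-${{ hashFiles('proto/go.sum', 'cloud/go.sum') }}
```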


## Rollback Strategy

### If Migration Fails

#### Option A: Revert to Tagged State

```bash
# In each original repository
git checkout pre-monorepo-migration

# Continue working in multi-repo mode
```

#### Option B: Create Migration Branch

Keep the migration work in a branch and continue mainline development in the separate repos.

```bash
# In monorepo
git checkout -b migration/monorepo-consolidation

# In original repos
git checkout main  # Continue as before
```

#### Option C: Extract Back to Multi-Repo

If the monorepo proves problematic:

```bash
# Extract proto
git subtree split --prefix=proto -b proto-branch
cd ../proto-new && git pull ../max_ai proto-branch

# Extract cloud
git subtree split --prefix=cloud -b cloud-branch
cd ../cloud-new && git pull ../max_ai cloud-branch

# Repeat for ui, object-detector
```

## Post-Migration Validation

### Week 1: Monitoring

- **Build times:** Compare CI times vs. multi-repo
- **Developer feedback:** Survey the team on DX
- **Build failures:** Track new issues vs. baseline

### Week 2-4: Optimization

- **Cache tuning:** Optimize CI cache hit rates
- **Incremental builds:** Implement smart build detection
- **Documentation gaps:** Fill based on questions

### Month 2-3: Stabilization

- **Remove old repositories:** Archive once confidence is high
- **Update external references:** CI badges, links, etc.
- **Refine workflows:** Based on usage patterns

## Common Issues & Solutions

### Issue 1: Go Import Errors

Error:

```
package github.com/SkyDaddyAI/max_ai/proto/api: cannot find package
```

Solution:

```bash
# Check go.work exists at root
cat go.work

# Verify replace directive in cloud/go.mod
grep "replace.*proto" cloud/go.mod

# Force module refresh
cd cloud
go clean -modcache
go mod download
go mod tidy
```
### Issue 2: Proto Not Found During Generation

Error:

```
protoc: error: proto/api/api.proto: No such file or directory
```

Solution:

```bash
# Ensure you're in the correct directory
pwd  # Should be proto/ for proto generation

# Check --proto_path in the command
protoc --proto_path=. ...      # Correct
protoc --proto_path=proto ...  # Wrong if already in proto/
```

### Issue 3: TypeScript Import Errors

Error:

```
Cannot find module '@/gen/proto/api/api_pb'
```

Solution:

```bash
# Regenerate TypeScript proto
make proto-ts

# Check tsconfig paths
cat ui/tsconfig.json  # Should define the @/ alias

# Restart the TypeScript server in your IDE
# VS Code: Cmd+Shift+P β†’ "TypeScript: Restart TS Server"
```
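If the alias is missing entirely, the relevant `tsconfig.json` fragment looks like this (the `./src` mapping is an assumption; point it at whichever directory actually contains `gen/`):

```json
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@/*": ["./src/*"]
    }
  }
}
```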

### Issue 4: Python Import Errors

Error:

```
ModuleNotFoundError: No module named 'proto.api'
```

Solution:

```bash
# Check the Python path includes object-detector/
export PYTHONPATH=$(pwd)/object-detector:$PYTHONPATH

# Or prepend the path programmatically
python -c "import sys; sys.path.insert(0, 'object-detector'); from proto.api import api_pb2"
```

In containers, set the path in `docker-compose.yml`:

```yaml
environment:
  PYTHONPATH: /app
```

### Issue 5: Circular Dependencies

Error:

```
import cycle not allowed
```

Solution:

```bash
# Analyze the import graph
cd cloud
go mod graph | grep proto

# Common cause: proto imports cloud code (wrong!)
# Fix: proto must have zero dependencies on cloud/ui/object-detector
grep -r "github.com/SkyDaddyAI/max_ai/cloud" proto/  # Should be empty
```

### Issue 6: Docker Build Context Issues

Error:

```
COPY failed: file not found in build context
```

Solution: the Docker build context must be the monorepo root, whether in `docker-compose.yml` or a direct `docker build`.

```bash
# Wrong:
docker build -t image ./cloud

# Correct:
docker build -t image -f cloud/Dockerfile .
```

Update the Dockerfile accordingly:

```dockerfile
# Copy from monorepo root
COPY proto/ /app/proto/
COPY cloud/ /app/cloud/

WORKDIR /app/cloud
```

## Performance Considerations

### Build Time Comparison

**Before (Multi-Repo):**

```
Proto build:      30s (each repo)
Cloud build:      2m
UI build:         1m
Object-det build: 3m

Total (sequential): ~9m 30s
Total (parallel):   ~3m (longest)
```

**After (Monorepo) - Expected:**

```
Proto build:      30s (once)
Cloud build:      2m (depends on proto)
UI build:         1m (depends on proto)
Object-det build: 3m (depends on proto)

Total (sequential): ~6m 30s
Total (parallel):   ~3m 30s (longest + proto)
```

### Optimization Strategies

1. **Proto caching:** Cache generated files in CI
2. **Conditional builds:** Only build changed components
3. **Layer caching:** Use Docker multi-stage builds
4. **Parallel execution:** Use `make -j` where safe
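Conditional builds (strategy 2) need a way to map changed files onto the components that own them. A minimal sketch in shell, assuming the monorepo layout above (the `git diff` wiring that produces the file list is left out):

```shell
# Map a changed file path to the monorepo component that owns it.
# In CI, feed it paths from: git diff --name-only origin/main...HEAD
component_for() {
  case "$1" in
    proto/*)            echo proto ;;
    cloud/*)            echo cloud ;;
    ui/*)               echo ui ;;
    object-detector/*)  echo object-detector ;;
    *)                  echo root ;;  # root-level file: rebuild everything
  esac
}

component_for cloud/cmd/server/main.go   # -> cloud
component_for proto/api/api.proto        # -> proto
```

A CI job can collect the distinct component names and trigger only those builds, treating `proto` and `root` as "build all" since every component depends on them.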

## Success Criteria

### Phase 1 (Week 1): Basic Migration

- All code in monorepo
- All services build successfully
- All tests pass
- CI workflows functional

### Phase 2 (Week 2-4): Stabilization

- No build issues for 1 week
- Smooth developer onboarding
- Documentation complete
- CI times acceptable (< 10m)

### Phase 3 (Month 2): Optimization

- CI times < 5m for a typical PR
- Build cache hit rate > 80%
- Zero import resolution issues
- Team velocity maintained or improved

### Phase 4 (Month 3): Completion

- Old repositories archived
- External links updated
- Knowledge transfer complete
- Monorepo patterns established

## Conclusion

This migration transforms SkyDaddy from multi-repo to monorepo with careful attention to:

1. **Proto package naming:** Resolved via the nested Go module approach
2. **Import path consistency:** Clear module boundaries and replace directives
3. **Build coordination:** A root Makefile orchestrating all components
4. **CI/CD efficiency:** Path-based triggers with smart caching
5. **Developer experience:** A single `make dev` to start everything

**Critical success factor:** proto as a first-class module with explicit versioning and clear dependency boundaries.

The nested module approach (`proto/` as a separate Go module) provides the best balance of:

- Clean separation of concerns
- Flexibility for future refactoring
- Standard Go tooling compatibility
- Minimal friction during development

## Next Steps

1. Review this document with the team
2. Create a migration branch in the new monorepo location
3. Execute Phases 1-2 (proto module creation)
4. Validate builds before proceeding to Phase 3+
5. Iterate based on findings

**Estimated timeline:** 2-3 weeks for the full migration (with validation)

**Point of no return:** updating all Go imports (Phase 3.2). Ensure thorough testing before taking this step.
