Date: 2026-01-12
Target: Consolidate SkyDaddy repositories into a unified monorepo
Primary Concern: Proto package naming and Go module path resolution
- Current Architecture Analysis
- Identified Issues & Challenges
- Target Monorepo Structure
- Proto Package Strategy
- Migration Plan
- Build System Updates
- CI/CD Changes
- Rollback Strategy
- Post-Migration Validation
Current Repositories:
├── github.com/SkyDaddyAI/cloud           # Go backend services
├── github.com/SkyDaddyAI/ui              # React/TypeScript frontend
├── github.com/SkyDaddyAI/object-detector # Python ML agents
└── github.com/SkyDaddyAI/proto           # Shared protobuf definitions (submodule)
- Location: [email protected]:SkyDaddyAI/proto.git
- Consumed as: Git submodule in each repository
- Package declaration: `package proto;`
- Go package option: `option go_package = "github.com/SkyDaddyAI/proto";`
```protobuf
// In proto files
option go_package = "github.com/SkyDaddyAI/proto";
```

```go
// Generated to: cloud/proto/
// Imported as:
import pb "github.com/SkyDaddyAI/proto"
```

Build command:

```bash
protoc --proto_path=. \
    --go_out=proto \
    --go-grpc_out=proto \
    --go_opt=module=github.com/SkyDaddyAI/proto \
    --go-grpc_opt=module=github.com/SkyDaddyAI/proto \
    proto/**/*.proto
```

ui/buf.gen.yaml:
```yaml
version: v2
plugins:
  - local: protoc-gen-es
    out: src/gen
    include_imports: true
    opt: target=ts
```

Generated to: ui/src/gen/proto/
Import style: `import { VideoService } from '@/gen/proto/api/api_pb'`

```bash
protoc --python_out=. proto/**/*.proto
```

Generated to: object-detector/proto/ (currently empty)
Import style: `from proto.api import api_pb2`
Current State:

```protobuf
// proto/api/api.proto
package proto;
option go_package = "github.com/SkyDaddyAI/proto";
```

Problem:
- Go code expects the `github.com/SkyDaddyAI/proto` module
- In the monorepo, this module path won't exist
- Generated code in `cloud/proto/` needs a different import path
Impact:
- All Go imports will break
- The `go.mod` require statement will fail
- Builds will fail with "module not found" errors
Current: Each repository has its own go.mod

```go
// cloud/go.mod
module github.com/SkyDaddyAI/cloud
```

Challenge: Need to decide between:
- Single module (all Go code in one module)
- Multi-module workspace (separate modules with `go.work`)
Current:

```protobuf
import "proto/api/picture.proto";
import "proto/model/drone.proto";
```

Challenge: Need consistent path resolution across:
- Go: `protoc --proto_path`
- TypeScript: `buf` configuration
- Python: `protoc` output paths
Current: Each repository builds independently
Challenge: Need to ensure:
- Proto builds before language-specific code
- Dependency order (proto → cloud → ui)
- Incremental builds (only rebuild changed components)
Current: Separate CI pipelines per repository
Challenge:
- Unified pipeline or separate jobs?
- Cache sharing across components
- Conditional builds (only build what changed)
Current: Each repository has independent versioning
Challenge:
- Unified versioning or independent?
- Release strategies (atomic vs. independent)
- Docker image tagging
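One workable answer to the tagging question is to tag every component image with the monorepo commit SHA, so independently built images stay traceable to a single repository state. A minimal sketch (the registry path and scheme are illustrative, not decided here):

```shell
#!/bin/bash
# Sketch: derive per-component Docker tags from one monorepo commit.
# The ghcr.io path mirrors the CI examples later in this document; the
# scheme itself is an assumption, not a decision made by this plan.
set -euo pipefail

image_tag() {
  local component="$1"   # e.g. cloud, ui, object-detector
  local sha="$2"         # short SHA, e.g. from: git rev-parse --short HEAD
  echo "ghcr.io/skydaddyai/${component}:${sha}"
}

image_tag cloud abc1234   # -> ghcr.io/skydaddyai/cloud:abc1234
image_tag ui abc1234      # -> ghcr.io/skydaddyai/ui:abc1234
```

With a shared SHA tag, "which commit produced this image?" has the same answer for every component, which is the main operational win of unified versioning.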
```
max_ai/                          # Monorepo root
├── .github/
│   └── workflows/               # Unified CI/CD
│       ├── proto.yml            # Proto validation & generation
│       ├── cloud.yml            # Cloud services build/test
│       ├── ui.yml               # UI build/test
│       └── object-detector.yml  # Python agents build/test
├── proto/                       # Proto definitions (no longer submodule)
│   ├── api/
│   │   ├── alerting.proto
│   │   ├── api.proto
│   │   └── ...
│   ├── model/
│   │   ├── drone.proto
│   │   └── ...
│   ├── mission/
│   ├── video/
│   ├── buf.yaml                 # Buf configuration
│   ├── buf.gen.go.yaml          # Go generation config
│   ├── buf.gen.ts.yaml          # TypeScript generation config
│   └── buf.gen.python.yaml      # Python generation config
├── cloud/                       # Go backend services
│   ├── go.mod                   # Module: github.com/SkyDaddyAI/max_ai/cloud
│   ├── go.sum
│   ├── proto/                   # Generated Go code (gitignored)
│   │   ├── api/
│   │   │   ├── api.pb.go
│   │   │   ├── api_grpc.pb.go
│   │   │   └── ...
│   │   └── model/
│   ├── server/
│   ├── alerts/
│   ├── mission_service/
│   └── ...
├── ui/                          # React frontend
│   ├── package.json             # Uses pnpm workspace
│   ├── src/
│   │   ├── gen/                 # Generated TypeScript (gitignored)
│   │   │   └── proto/
│   │   │       ├── api/
│   │   │       └── model/
│   │   └── ...
│   └── ...
├── object-detector/             # Python ML agents
│   ├── proto/                   # Generated Python (gitignored)
│   │   ├── api/
│   │   └── model/
│   ├── detection-agent/
│   ├── recognition-agent/
│   └── ...
├── docs/                        # Unified documentation
├── scripts/                     # Shared build/dev scripts
│   ├── proto-gen-all.sh         # Generate all proto targets
│   ├── dev-setup.sh             # Initial setup
│   └── validate-proto.sh        # Proto linting/validation
├── go.work                      # Go workspace (optional)
├── pnpm-workspace.yaml          # PNPM workspace config
├── Makefile                     # Root-level build targets
├── CLAUDE.md                    # Updated for monorepo
└── README.md                    # Monorepo overview
```
- Location: Root-level `proto/` directory
- Status: Regular directory (not a submodule)
- Ownership: Shared by all components
Rationale:
- Allows independent versioning of cloud services if needed
- Better module boundaries
- Easier to extract services later if needed
Implementation:

```go
// go.work (root)
go 1.24.0

use (
    ./cloud
    ./object-detector/diffing-agent // If needed
)
```

```go
// cloud/go.mod
module github.com/SkyDaddyAI/max_ai/cloud

require (
    github.com/SkyDaddyAI/max_ai/proto v0.0.0
    ...
)

replace github.com/SkyDaddyAI/max_ai/proto => ../proto
```

Rationale:
- Reduces merge conflicts
- Always in sync with proto definitions
- Smaller repository size
- Forces explicit build step
Implementation:

```gitignore
# Generated proto code
cloud/proto/
ui/src/gen/proto/
object-detector/proto/
```

Before (Multi-Repo):

```protobuf
// proto/api/api.proto
syntax = "proto3";
package proto;
option go_package = "github.com/SkyDaddyAI/proto";
```

Generated Go code lives in a separate repo with module path github.com/SkyDaddyAI/proto.
After (Monorepo): we need to decide on the Go package path.
Create proto/ as a separate Go module within the monorepo.
Structure:
```
proto/
├── go.mod                # module github.com/SkyDaddyAI/max_ai/proto
├── api/
│   ├── api.proto
│   ├── api.pb.go         # Generated here
│   └── api_grpc.pb.go
└── model/
    ├── drone.proto
    └── drone.pb.go       # Generated here
```
Proto file:

```protobuf
// proto/api/api.proto
syntax = "proto3";
package proto.api; // Changed: more specific
option go_package = "github.com/SkyDaddyAI/max_ai/proto/api"; // Changed
```

Go imports:

```go
// cloud/server/main.go
import (
    apipb "github.com/SkyDaddyAI/max_ai/proto/api"
    modelpb "github.com/SkyDaddyAI/max_ai/proto/model"
)
```

Build command:

```bash
cd proto
protoc --proto_path=. \
    --go_out=. \
    --go-grpc_out=. \
    --go_opt=paths=source_relative \
    --go-grpc_opt=paths=source_relative \
    api/*.proto model/*.proto
```

Pros:
- ✅ Clean module separation
- ✅ Standard Go module layout
- ✅ Easy to version independently
- ✅ Works with Go workspace
- ✅ Can publish proto module separately if needed

Cons:
- ❌ More complex than a single module
- ❌ Needs `replace` directives during development
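The replace-directive friction is smaller than it looks: a workspace file at the repo root (the root go.work shown later in this plan) makes local modules resolve directly, so the replace line in cloud/go.mod only matters for builds that run outside the workspace, such as Docker builds. A sketch:

```go
// go.work (monorepo root). With this file present, `go build` inside the
// workspace resolves github.com/SkyDaddyAI/max_ai/proto from ./proto directly;
// the replace directive in cloud/go.mod is then only needed for builds that
// do not see go.work (e.g. inside a Docker build context).
go 1.24.0

use (
	./proto
	./cloud
)
```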
Keep proto definitions as source, generate into cloud/internal/proto/.
Structure:
```
proto/
├── api/
│   └── api.proto         # Source only
└── model/
    └── drone.proto       # Source only

cloud/
├── go.mod                # module github.com/SkyDaddyAI/max_ai/cloud
└── internal/
    └── proto/            # Generated here
        ├── api/
        │   ├── api.pb.go
        │   └── api_grpc.pb.go
        └── model/
            └── drone.pb.go
```
Proto file:

```protobuf
// proto/api/api.proto
syntax = "proto3";
package proto.api;
option go_package = "github.com/SkyDaddyAI/max_ai/cloud/internal/proto/api";
```

Go imports:

```go
// cloud/server/main.go
import (
    apipb "github.com/SkyDaddyAI/max_ai/cloud/internal/proto/api"
)
```

Pros:
- ✅ Single Go module
- ✅ Simpler dependency management
- ✅ No replace directives needed

Cons:
- ❌ Proto code tied to the cloud module
- ❌ Can't easily share with external consumers
- ❌ Violates separation of concerns
- ❌ Hard to use proto in other Go modules later
Generate proto into each consumer's directory.
Pros: None for this use case
Cons:
- ❌ Code duplication
- ❌ Sync issues
- ❌ Larger repository
- ❌ More complex builds
Create proto/ as its own Go module with path github.com/SkyDaddyAI/max_ai/proto.
```bash
# In the monorepo location
git checkout -b migration/monorepo-consolidation

# For each repository, identify all proto imports
cd cloud
grep -r "github.com/SkyDaddyAI/proto" --include="*.go" . > ../proto-imports-cloud.txt
cd ../ui
grep -r "from '@/gen/proto" --include="*.ts" --include="*.tsx" . > ../proto-imports-ui.txt
cd ../object-detector
find . -name "*_pb2.py" > ../proto-imports-python.txt
```

Create migration-state.md:
# Current Build State
## Cloud
- Proto generation: `cd cloud && make proto-build`
- Service build: `cd cloud && make build-fast`
- Tests: `cd cloud && make test-alerting`
## UI
- Proto generation: `cd ui && pnpm run proto-gen`
- Build: `cd ui && pnpm run build`
## Object Detector
- Proto generation: `cd object-detector && make proto-build`
- Build: `cd object-detector && docker compose build`

```bash
# In each repository
git tag pre-monorepo-migration
git push origin pre-monorepo-migration
```

```bash
mkdir -p max_ai/proto
cd max_ai/proto

# Initialize Go module
go mod init github.com/SkyDaddyAI/max_ai/proto

# Create go.mod
cat > go.mod <<EOF
module github.com/SkyDaddyAI/max_ai/proto

go 1.24.0

require (
    google.golang.org/grpc v1.76.0
    google.golang.org/protobuf v1.36.10
)
EOF
```

```bash
# From proto submodule
cp -r <proto-repo>/api ./api
cp -r <proto-repo>/model ./model
cp -r <proto-repo>/mission ./mission
cp -r <proto-repo>/video ./video
cp <proto-repo>/navigation.proto ./
```

Before:
```protobuf
package proto;
option go_package = "github.com/SkyDaddyAI/proto";
```

After:

```protobuf
package proto.api; // More specific package names
option go_package = "github.com/SkyDaddyAI/max_ai/proto/api";
```

Script to update all proto files:
```bash
#!/bin/bash
# scripts/update-proto-packages.sh

find proto -name "*.proto" -type f | while read -r file; do
    # Get directory relative to proto/
    dir=$(dirname "$file" | sed 's|proto/||')

    # Determine package name
    if [ "$dir" = "." ]; then
        pkg="proto"
        go_pkg="github.com/SkyDaddyAI/max_ai/proto"
    else
        pkg="proto.${dir//\//.}"
        go_pkg="github.com/SkyDaddyAI/max_ai/proto/$dir"
    fi

    # Update package declaration
    sed -i.bak "s|^package proto;|package $pkg;|" "$file"

    # Update go_package option
    sed -i.bak "s|option go_package = \"github.com/SkyDaddyAI/proto\";|option go_package = \"$go_pkg\";|" "$file"

    rm "$file.bak"
done
```

proto/buf.yaml:
```yaml
version: v2
modules:
  - path: .
breaking:
  use:
    - FILE
lint:
  use:
    - STANDARD
```

proto/buf.gen.go.yaml:

```yaml
version: v2
plugins:
  - remote: buf.build/protocolbuffers/go
    out: .
    opt:
      - paths=source_relative
  - remote: buf.build/grpc/go
    out: .
    opt:
      - paths=source_relative
```

proto/buf.gen.ts.yaml:

```yaml
version: v2
plugins:
  - local: protoc-gen-es
    out: ../ui/src/gen
    opt:
      - target=ts
```

proto/buf.gen.python.yaml:

```yaml
version: v2
plugins:
  - remote: buf.build/protocolbuffers/python
    out: ../object-detector
    opt:
      - paths=source_relative
```

proto/Makefile:
```makefile
.PHONY: gen-go gen-ts gen-python gen-all clean lint

gen-go: ## Generate Go code
	@echo "🔨 Generating Go proto code..."
	@buf generate --template buf.gen.go.yaml
	@go mod tidy
	@echo "✅ Go proto code generated"

gen-ts: ## Generate TypeScript code
	@echo "🔨 Generating TypeScript proto code..."
	@buf generate --template buf.gen.ts.yaml
	@echo "✅ TypeScript proto code generated"

gen-python: ## Generate Python code
	@echo "🔨 Generating Python proto code..."
	@buf generate --template buf.gen.python.yaml
	@echo "✅ Python proto code generated"

gen-all: gen-go gen-ts gen-python ## Generate all proto code

lint: ## Lint proto files
	@buf lint

clean: ## Clean generated files
	@echo "🧹 Cleaning generated proto files..."
	@find api model mission video -name "*.pb.go" -delete
	@find api model mission video -name "*_grpc.pb.go" -delete
	@rm -rf ../ui/src/gen/proto
	@rm -rf ../object-detector/proto
	@echo "✅ Generated files cleaned"
```

cloud/go.mod:
```go
module github.com/SkyDaddyAI/max_ai/cloud

go 1.24.0

require (
    github.com/SkyDaddyAI/max_ai/proto v0.0.0
    // ... other dependencies
)

// During development, use local proto module
replace github.com/SkyDaddyAI/max_ai/proto => ../proto
```

Script to update all Go imports:
```bash
#!/bin/bash
# scripts/update-go-imports.sh

cd cloud

# Find all .go files and update imports
# From: github.com/SkyDaddyAI/proto
# To:   github.com/SkyDaddyAI/max_ai/proto/api (or /model, etc.)
# This is complex - recommend using gofmt with sed or a Go tool
find . -name "*.go" -type f | while read -r file; do
    sed -i.bak 's|"github.com/SkyDaddyAI/proto"|"github.com/SkyDaddyAI/max_ai/proto/api"|g' "$file"
    rm "$file.bak"
done

# Run goimports to fix imports
go install golang.org/x/tools/cmd/goimports@latest
find . -name "*.go" -exec goimports -w {} \;
```

Manual verification needed:
- Check each import is correct (api vs model vs mission)
- Update type references if package names changed
- Verify gRPC client/server instantiation
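Part of that manual triage can be pre-sorted mechanically: bucket each Go file by the proto types it actually references, then only hand-review the leftovers. A hedged sketch — the type names in the prefix map are hypothetical and must be replaced with real identifiers from the proto definitions:

```shell
#!/bin/bash
# Sketch: classify Go files by the proto sub-package they appear to use.
# The type lists (pb.Drone, pb.VideoService, ...) are ILLUSTRATIVE; fill them
# in from the actual generated code before relying on the result.
set -euo pipefail

classify() {
  local file="$1"
  if grep -qE 'pb\.(Drone|Telemetry)' "$file"; then
    echo "model"
  elif grep -qE 'pb\.(VideoService|AlertRequest)' "$file"; then
    echo "api"
  else
    echo "unknown"   # needs manual review
  fi
}
```

Running `classify` over the files listed in proto-imports-cloud.txt shrinks the manual pass to the `unknown` bucket.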
cloud/Makefile (update proto-build target):

```makefile
proto-build: ## Build protocol buffer files (deprecated - use root make proto)
	@echo "⚠️  Deprecated: Use 'make proto' from repository root"
	@echo "Delegating to proto module..."
	@cd ../proto && $(MAKE) gen-go
	@echo "✅ Protocol buffers built successfully"
```

cloud/.devcontainer/docker-compose.yml:
```yaml
services:
  app:
    build:
      context: ..                           # Changed: now builds from monorepo root
      dockerfile: cloud/.devcontainer/Dockerfile
    volumes:
      - ../proto:/workspaces/proto:cached   # Mount proto
      - .:/workspaces/cloud:cached
```

cloud/.devcontainer/Dockerfile:
```dockerfile
FROM golang:1.24

# Install protoc and buf
RUN apt-get update && apt-get install -y protobuf-compiler
RUN go install github.com/bufbuild/buf/cmd/buf@latest

# Set working directory
WORKDIR /workspaces

# Copy proto module first
COPY proto/ /workspaces/proto/
RUN cd /workspaces/proto && go mod download

# Copy cloud module
COPY cloud/ /workspaces/cloud/
RUN cd /workspaces/cloud && go mod download
```

pnpm-workspace.yaml (root):
```yaml
packages:
  - 'ui'
  # Add more if you have multiple TS packages
```

ui/package.json:
```json
{
  "name": "@skydaddy/ui",
  "scripts": {
    "proto-gen": "cd ../proto && make gen-ts",
    "dev": "vite --port 3000",
    "build": "vite build && tsc --noEmit"
  }
}
```

Before:

```typescript
import { VideoService } from '@/gen/proto/api/api_pb'
```

After (no change needed if the generation output stays the same):

```typescript
import { VideoService } from '@/gen/proto/api/api_pb'
```

ui/vite.config.ts: (Likely no changes needed)
ui/tsconfig.json: (Likely no changes needed)
object-detector/Makefile:

```makefile
proto-build: ## Build protocol buffer files
	@echo "🔨 Building Python protocol buffer files..."
	@cd ../proto && $(MAKE) gen-python
	@echo "✅ Protocol buffers built successfully"
```

Before:

```python
from proto.api import api_pb2
```

After (depends on generation output structure):

```python
from proto.api import api_pb2  # Likely same
```

Makefile (root):
```makefile
.PHONY: help init proto proto-go proto-ts proto-python \
        cloud-build cloud-test ui-build ui-test \
        object-detector-build test-all

help: ## Show this help
	@echo "SkyDaddy Max AI Monorepo"
	@echo ""
	@echo "Usage: make [target]"
	@echo ""
	@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "  \033[36m%-20s\033[0m %s\n", $$1, $$2}'

init: ## Initialize repository (submodules, dependencies)
	@echo "📦 Initializing repository..."
	@echo "Installing Go dependencies..."
	@cd proto && go mod download
	@cd cloud && go mod download
	@echo "Installing Node dependencies..."
	@cd ui && pnpm install
	@echo "✅ Monorepo initialized"

# Proto targets
proto: proto-go proto-ts proto-python ## Generate all proto code

proto-go: ## Generate Go proto code
	@cd proto && $(MAKE) gen-go

proto-ts: ## Generate TypeScript proto code
	@cd proto && $(MAKE) gen-ts

proto-python: ## Generate Python proto code
	@cd proto && $(MAKE) gen-python

proto-lint: ## Lint proto files
	@cd proto && $(MAKE) lint

proto-clean: ## Clean generated proto files
	@cd proto && $(MAKE) clean

# Cloud targets
cloud-build: proto-go ## Build cloud services
	@cd cloud && $(MAKE) build-fast

cloud-test: proto-go ## Run cloud tests
	@cd cloud && $(MAKE) test-alerting
	@cd cloud && $(MAKE) mission-test
	@cd cloud && $(MAKE) video-test

cloud-dev: proto-go ## Start cloud dev environment
	@cd cloud && $(MAKE) dev-up

# UI targets
ui-build: proto-ts ## Build UI
	@cd ui && pnpm run build

ui-dev: proto-ts ## Start UI dev server
	@cd ui && pnpm run dev

ui-test: proto-ts ## Run UI tests
	@cd ui && pnpm run test

# Object detector targets
object-detector-build: proto-python ## Build object detector
	@cd object-detector && docker compose build

object-detector-up: proto-python ## Start object detector services
	@cd object-detector && docker compose up -d

# Combined targets
dev: ## Start all dev services
	@$(MAKE) cloud-dev
	@$(MAKE) ui-dev
	@$(MAKE) object-detector-up

test-all: ## Run all tests
	@$(MAKE) proto-lint
	@$(MAKE) cloud-test
	@$(MAKE) ui-test

build-all: ## Build all components
	@$(MAKE) proto
	@$(MAKE) cloud-build
	@$(MAKE) ui-build
	@$(MAKE) object-detector-build
```

go.work (root):
```go
go 1.24.0

use (
    ./proto
    ./cloud
)
```

.gitignore (root):
```gitignore
# Generated proto code
proto/api/**/*.pb.go
proto/api/**/*_grpc.pb.go
proto/model/**/*.pb.go
proto/model/**/*_grpc.pb.go
proto/mission/**/*.pb.go
proto/mission/**/*_grpc.pb.go
proto/video/**/*.pb.go
proto/video/**/*_grpc.pb.go
cloud/proto/
ui/src/gen/proto/
object-detector/proto/

# Dependencies
node_modules/
.pnpm-store/

# Build artifacts
cloud/bin/
ui/dist/

# Dev environment
.env
.env.local
*.log

# IDE
.vscode/
.idea/
*.swp
*.swo
```

scripts/dev-setup.sh:
```bash
#!/bin/bash
set -e

echo "🚀 SkyDaddy Max AI - Development Setup"
echo ""

# Check prerequisites
echo "🔍 Checking prerequisites..."
command -v go >/dev/null 2>&1 || { echo "❌ Go not found. Please install Go 1.24+"; exit 1; }
command -v node >/dev/null 2>&1 || { echo "❌ Node.js not found. Please install Node.js 20+"; exit 1; }
command -v pnpm >/dev/null 2>&1 || { echo "❌ pnpm not found. Run: npm install -g pnpm"; exit 1; }
command -v protoc >/dev/null 2>&1 || { echo "❌ protoc not found. Please install protobuf compiler"; exit 1; }
command -v buf >/dev/null 2>&1 || { echo "⚠️  buf not found. Installing..."; go install github.com/bufbuild/buf/cmd/buf@latest; }
echo "✅ Prerequisites check passed"
echo ""

# Install dependencies
echo "📦 Installing dependencies..."
make init

# Generate proto files
echo "🔨 Generating proto files..."
make proto

echo ""
echo "✅ Setup complete!"
echo ""
echo "Next steps:"
echo "  1. Start cloud services:  make cloud-dev"
echo "  2. Start UI dev server:   make ui-dev"
echo "  3. Start object detector: make object-detector-up"
echo "  Or run all: make dev"
```

```bash
chmod +x scripts/dev-setup.sh
```

.github/workflows/proto.yml:
```yaml
name: Proto Validation

on:
  pull_request:
    paths:
      - 'proto/**'
  push:
    branches: [main]
    paths:
      - 'proto/**'

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup Buf
        uses: bufbuild/buf-setup-action@v1
        with:
          version: latest
      - name: Lint proto files
        run: cd proto && buf lint
      - name: Check breaking changes
        run: cd proto && buf breaking --against 'https://github.com/SkyDaddyAI/max_ai.git#branch=main,subdir=proto'
        if: github.event_name == 'pull_request'

  generate:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        target: [go, ts, python]
    steps:
      - uses: actions/checkout@v4
      - name: Setup Go
        uses: actions/setup-go@v5
        with:
          go-version: '1.24'
      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: '20'
      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - name: Generate ${{ matrix.target }} code
        run: make proto-${{ matrix.target }}
      - name: Upload generated code
        uses: actions/upload-artifact@v4
        with:
          name: proto-${{ matrix.target }}
          path: |
            proto/**/*.pb.go
            ui/src/gen/proto/**
            object-detector/proto/**
```

.github/workflows/cloud.yml:
```yaml
name: Cloud Services

on:
  pull_request:
    paths:
      - 'proto/**'
      - 'cloud/**'
      - '.github/workflows/cloud.yml'
  push:
    branches: [main]
    paths:
      - 'proto/**'
      - 'cloud/**'

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: pgvector/pgvector:pg15
        env:
          POSTGRES_PASSWORD: test
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
      nats:
        image: nats:latest
        options: >-
          --health-cmd "nats-server --version"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
      redis:
        image: redis:7
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      - uses: actions/checkout@v4
      - name: Setup Go
        uses: actions/setup-go@v5
        with:
          go-version: '1.24'
          cache-dependency-path: |
            proto/go.sum
            cloud/go.sum
      - name: Generate proto
        run: make proto-go
      - name: Build services
        run: make cloud-build
      - name: Run tests
        run: make cloud-test
        env:
          DB_HOST: localhost
          DB_PORT: 5432
          DB_USER: postgres
          DB_PASSWORD: test
          NATS_URL: nats://localhost:4222
          REDIS_HOST: localhost:6379

  build-docker:
    runs-on: ubuntu-latest
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    steps:
      - uses: actions/checkout@v4
      - name: Setup Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Login to GHCR
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: cloud/Dockerfile
          push: true
          tags: |
            ghcr.io/skydaddyai/cloud:latest
            ghcr.io/skydaddyai/cloud:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
```

.github/workflows/ui.yml:
```yaml
name: UI

on:
  pull_request:
    paths:
      - 'proto/**'
      - 'ui/**'
      - '.github/workflows/ui.yml'
  push:
    branches: [main]
    paths:
      - 'proto/**'
      - 'ui/**'

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: '20'
      - name: Setup pnpm
        uses: pnpm/action-setup@v3
        with:
          version: 8
      - name: Get pnpm store directory
        id: pnpm-cache
        shell: bash
        run: echo "STORE_PATH=$(pnpm store path)" >> $GITHUB_OUTPUT
      - name: Setup pnpm cache
        uses: actions/cache@v4
        with:
          path: ${{ steps.pnpm-cache.outputs.STORE_PATH }}
          key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
          restore-keys: |
            ${{ runner.os }}-pnpm-store-
      - name: Install dependencies
        run: cd ui && pnpm install --frozen-lockfile
      - name: Generate proto
        run: make proto-ts
      - name: Build
        run: make ui-build
      - name: Type check
        run: cd ui && pnpm run type-check || echo "Add type-check script"
```

Replace proto submodule references with monorepo paths:
Before:

```markdown
### Setup
    git submodule update --init --recursive
    make proto-build
```

**After**:

```markdown
### Setup
    # No submodule initialization needed
    make proto   # Generate all proto targets
    make init    # Install dependencies
```
#### 8.2 Update README.md
**README.md** (root):
```markdown
# SkyDaddy Max AI - Monorepo

Complete drone video processing and AI analysis platform.

## Quick Start

    # Clone repository
    git clone https://github.com/SkyDaddyAI/max_ai.git
    cd max_ai

    # Setup development environment
    ./scripts/dev-setup.sh

    # Start all services
    make dev

## Components

- `proto/` - Shared protobuf definitions
- `cloud/` - Go backend services (gRPC, video, missions, alerts)
- `ui/` - React/TypeScript web interface
- `object-detector/` - Python ML agents (YOLO, pose, recognition)

## Prerequisites

- Go 1.24+
- Node.js 20+
- pnpm 8+
- Docker & Docker Compose
- protoc (protobuf compiler)
- buf (optional, for proto linting)

## Building

    make proto                  # Generate all proto code
    make cloud-build            # Build cloud services
    make ui-build               # Build UI
    make object-detector-build  # Build Python agents
    make build-all              # Build everything

## Development

    make cloud-dev              # Start cloud services (docker-compose)
    make ui-dev                 # Start UI dev server (vite)
    make object-detector-up     # Start ML agents
    make dev                    # Start everything

## Testing

    make test-all               # Run all tests
    make cloud-test             # Cloud services only
    make ui-test                # UI only

## Documentation

- CLAUDE.md - Detailed architecture and development guide
- MONOREPO_MIGRATION.md - Migration documentation

## License

Proprietary - SkyDaddyAI
```
### Phase 9: Testing & Validation
#### 9.1 Create Validation Checklist
**MIGRATION_CHECKLIST.md**:
```markdown
# Migration Validation Checklist
## Proto Generation
- [ ] Go code generates without errors: `make proto-go`
- [ ] TypeScript code generates without errors: `make proto-ts`
- [ ] Python code generates without errors: `make proto-python`
- [ ] Buf lint passes: `cd proto && buf lint`
- [ ] Generated files are gitignored
## Cloud Services
- [ ] Go workspace resolves proto module: `cd cloud && go list -m github.com/SkyDaddyAI/max_ai/proto`
- [ ] All imports updated and compile: `cd cloud && go build ./...`
- [ ] Tests pass: `make cloud-test`
- [ ] Docker build works: `cd cloud && docker build -t test .`
- [ ] Dev container starts: `make cloud-dev`
## UI
- [ ] Dependencies install: `cd ui && pnpm install`
- [ ] TypeScript compiles: `cd ui && pnpm run build`
- [ ] Dev server starts: `cd ui && pnpm run dev`
- [ ] Proto imports resolve in IDE
## Object Detector
- [ ] Python proto imports work: `cd object-detector && python -c "from proto.api import api_pb2"`
- [ ] Docker build works: `cd object-detector && docker compose build`
- [ ] Services start: `cd object-detector && docker compose up`
## CI/CD
- [ ] Proto workflow passes
- [ ] Cloud workflow passes
- [ ] UI workflow passes
- [ ] Object detector workflow passes
## Documentation
- [ ] CLAUDE.md updated
- [ ] README.md updated
- [ ] All commands in docs verified
```

```bash
# From repository root
./scripts/validate-migration.sh
```

scripts/validate-migration.sh:
```bash
#!/bin/bash
set -e

echo "🧪 Migration Validation"
echo ""

# Proto generation
echo "1️⃣ Testing proto generation..."
make proto-clean
make proto
echo "✅ Proto generation passed"
echo ""

# Cloud build
echo "2️⃣ Testing cloud build..."
cd cloud
go mod download
go build ./...
cd ..
echo "✅ Cloud build passed"
echo ""

# UI build
echo "3️⃣ Testing UI build..."
cd ui
pnpm install
pnpm run build
cd ..
echo "✅ UI build passed"
echo ""

# Tests
echo "4️⃣ Running tests..."
make test-all || echo "⚠️  Some tests failed (review required)"
echo ""
echo "✅ Migration validation complete!"
```

Each repository:
```
cloud/
├── Makefile (proto-build, builds services)
├── .devcontainer/
└── go.mod

ui/
├── package.json (proto-gen script)
└── buf.gen.yaml

object-detector/
├── Makefile (proto-build)
└── docker-compose.yml
```

```
max_ai/
├── Makefile (orchestrates all builds)
├── proto/
│   ├── Makefile (owns proto generation)
│   ├── buf.gen.*.yaml (multiple templates)
│   └── go.mod
├── cloud/
│   ├── Makefile (delegates proto to root)
│   └── go.mod (requires ../proto)
├── ui/
│   └── package.json (delegates to root make)
└── object-detector/
    └── Makefile (delegates proto to root)
```
```
               ┌───────┐
               │ proto │  ← First
               └───┬───┘
                   │
      ┌────────┬───┴─────┬────────────┐
      ▼        ▼         ▼            ▼
 ┌────────┐ ┌──────┐ ┌─────────┐ ┌──────────┐
 │ cloud  │ │  ui  │ │ obj-det │ │ external │
 │  (Go)  │ │ (TS) │ │  (Py)   │ │ services │
 └────────┘ └──────┘ └─────────┘ └──────────┘
```
Pros:
- ✅ Only build what changed
- ✅ Faster CI runs
- ✅ Clear job separation

Cons:
- ❌ More complex workflow files
- ❌ Need to handle cross-component changes

Pros:
- ✅ Simple configuration
- ✅ Guaranteed consistency

Cons:
- ❌ Slower CI runs
- ❌ Wastes resources
- PR checks: Path-based (only validate changed components)
- Main branch: Build all (ensure consistency)
- Tags: Build and publish all
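This hybrid strategy maps directly onto GitHub Actions trigger filters; a sketch of the `on:` block for one per-component workflow (the paths are illustrative):

```yaml
# Sketch: hybrid trigger strategy for a per-component workflow (e.g. cloud.yml).
# PRs are path-filtered; pushes to main and version tags build unconditionally.
on:
  pull_request:
    paths:               # PR checks: only when this component (or proto) changed
      - 'proto/**'
      - 'cloud/**'
  push:
    branches: [main]     # main: always build, to guarantee consistency
    tags: ['v*']         # tags: build and publish
```

Because the `paths` filter is attached only to `pull_request`, every push to main still runs the full workflow even when the component itself did not change.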
```yaml
# Example: Go cache across jobs
- name: Setup Go cache
  uses: actions/cache@v4
  with:
    path: |
      ~/.cache/go-build
      ~/go/pkg/mod
    key: ${{ runner.os }}-go-${{ hashFiles('**/go.sum') }}
```

Important: Include both proto/go.sum and cloud/go.sum in the hash key; the `**/go.sum` glob above covers both.
```bash
# In each original repository
git checkout pre-monorepo-migration
# Continue working in multi-repo mode
```

Keep migration work in a branch; continue main development in the separate repos.

```bash
# In monorepo
git checkout -b migration/monorepo-consolidation

# In original repos
git checkout main  # Continue as before
```

If the monorepo proves problematic:
```bash
# Extract proto
git subtree split --prefix=proto -b proto-branch
cd ../proto-new && git pull ../max_ai proto-branch

# Extract cloud
git subtree split --prefix=cloud -b cloud-branch
cd ../cloud-new && git pull ../max_ai cloud-branch

# Repeat for ui, object-detector
```

- Build times: Compare CI times vs. multi-repo
- Developer feedback: Survey team on DX
- Build failures: Track new issues vs. baseline
- Cache tuning: Optimize CI cache hit rates
- Incremental builds: Implement smart build detection
- Documentation gaps: Fill based on questions
- Remove old repositories: Archive after confidence
- Update external references: CI badges, links, etc.
- Refine workflows: Based on usage patterns
Error:

```
package github.com/SkyDaddyAI/max_ai/proto/api: cannot find package
```
Solution:

```bash
# Check go.work exists at root
cat go.work

# Verify replace directive in cloud/go.mod
grep "replace.*proto" cloud/go.mod

# Force module refresh
cd cloud
go clean -modcache
go mod download
go mod tidy
```

Error:

```
protoc: error: proto/api/api.proto: No such file or directory
```
Solution:

```bash
# Ensure you're in the correct directory
pwd  # Should be in proto/ for proto generation

# Check proto_path in the command
protoc --proto_path=. ...      # Correct
protoc --proto_path=proto ...  # Wrong if already in proto/
```

Error:
```
Cannot find module '@/gen/proto/api/api_pb'
```
Solution:

```bash
# Regenerate TypeScript proto
make proto-ts

# Check tsconfig paths
cat ui/tsconfig.json  # Should have @/ alias

# Restart TypeScript server in IDE
# VS Code: Cmd+Shift+P → "Restart TS Server"
```

Error:
```
ModuleNotFoundError: No module named 'proto.api'
```
Solution:

```bash
# Check Python path includes object-detector/
export PYTHONPATH=$(pwd)/object-detector:$PYTHONPATH

# Or use absolute imports
python -c "import sys; sys.path.insert(0, 'object-detector'); from proto.api import api_pb2"
```

```yaml
# Add to docker-compose.yml
environment:
  PYTHONPATH: /app
```

Error:
```
import cycle not allowed
```
Solution:

```bash
# Analyze import graph
cd cloud
go mod graph | grep proto

# Common cause: proto imports cloud code (wrong!)
# Fix: proto should have zero dependencies on cloud/ui/object-detector
grep -r "github.com/SkyDaddyAI/max_ai/cloud" proto/  # Should be empty
```

Error:
```
COPY failed: file not found in build context
```
Solution:

```bash
# Dockerfile build context must be the monorepo root.
# In docker-compose.yml or docker build:

# Wrong:
docker build -t image ./cloud

# Correct:
docker build -t image -f cloud/Dockerfile .
```

Update Dockerfile:

```dockerfile
# Copy from monorepo root
COPY proto/ /app/proto/
COPY cloud/ /app/cloud/
WORKDIR /app/cloud
```

Before (Multi-Repo):
Proto build: 30s (each repo)
Cloud build: 2m
UI build: 1m
Object-det build: 3m
Total (sequential): ~9m 30s
Total (parallel): ~3m (longest)
After (Monorepo) - Expected:
Proto build: 30s (once)
Cloud build: 2m (depends on proto)
UI build: 1m (depends on proto)
Object-det build: 3m (depends on proto)
Total (sequential): ~6m 30s
Total (parallel): ~3m 30s (longest + proto)
- Proto caching: Cache generated files in CI
- Conditional builds: Only build changed components
- Layer caching: Docker multi-stage builds
- Parallel execution: Use `make -j` where safe
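The "conditional builds" item needs a path-to-component mapping; a sketch of that mapping, to be fed from `git diff --name-only origin/main...HEAD` in CI (the wiring to git and to the Make targets is left out and assumed):

```shell
#!/bin/bash
# Sketch: map a changed file path to the monorepo component that owns it,
# mirroring the directory tree in this document. CI would call this for each
# changed path and union the results to decide which `make` targets to run.
set -euo pipefail

component_for() {
  case "$1" in
    proto/*)            echo "proto" ;;           # proto changes rebuild all consumers
    cloud/*)            echo "cloud" ;;
    ui/*)               echo "ui" ;;
    object-detector/*)  echo "object-detector" ;;
    *)                  echo "root" ;;            # Makefile, scripts/, CI config, docs/
  esac
}
```

A `proto` result should expand to all downstream components, matching the dependency graph above (proto builds first, everything else depends on it).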
- All code in monorepo
- All services build successfully
- All tests pass
- CI workflows functional
- No build issues for 1 week
- Developer onboarding smooth
- Documentation complete
- CI times acceptable (< 10m)
- CI times < 5m for typical PR
- Build cache hit rate > 80%
- Zero import resolution issues
- Team velocity maintained or improved
- Old repositories archived
- External links updated
- Knowledge transfer complete
- Monorepo patterns established
This migration transforms SkyDaddy from multi-repo to monorepo with careful attention to:
- Proto package naming: Resolved via nested Go module approach
- Import path consistency: Clear module boundaries and replace directives
- Build coordination: Root Makefile orchestrating all components
- CI/CD efficiency: Path-based triggers with smart caching
- Developer experience: Single `make dev` to start everything
Critical success factor: Proto as a first-class module with explicit versioning and clear dependency boundaries.
The nested module approach (proto/ as separate Go module) provides the best balance of:
- Clean separation of concerns
- Flexibility for future refactoring
- Standard Go tooling compatibility
- Minimal friction during development
- Review this document with the team
- Create migration branch in new monorepo location
- Execute Phase 1-2 (Proto module creation)
- Validate builds before proceeding to Phase 3+
- Iterate based on findings
Estimated timeline: 2-3 weeks for full migration (with validation)
Point of no return: After updating all Go imports (Phase 3.2) - ensure thorough testing before this step.