
Reaction Commerce 3 deployment on AWS using Terraform - variables explanation

GitHub repository


πŸ’‘ Reaction Commerce

Reaction Commerce repository

Deploying Reaction Commerce 3 on AWS ECS Tutorial


πŸ’‘ Prerequisites

  1. To run Reaction Commerce locally, download the repository -> Reaction Commerce Repository

  2. You will need the following tools installed on your computer:

  3. You need to register on the following platforms:
  4. You will also need a domain, either managed in AWS or by an external provider.

πŸ’» Steps that have to be done (all automated with Terraform scripts)

MongoDB

  1. Create MongoDB cluster on MongoDB Atlas and connect it with AWS

AWS

  1. Create Hydra PostgreSQL database
  2. Build Docker images and push them to ECR
  3. Set up Load Balancers
  4. Set up ECS cluster
  5. Deploy Reaction Commerce API (backend)
  6. Deploy Hydra's API
  7. Deploy Identity API
  8. Deploy Admin panel

External provider (e.g. Vercel)

  1. Deploy Storefront

πŸ“— Variables explanation

MongoDB

atlas_mongo_public_key => MongoDB Atlas API public key, which should be created by the user > Go to MongoDB Atlas account > Projects > Access Manager > API Keys tab > Create API Key or select an existing one

atlas_mongo_private_key => The corresponding private key, from the same section as above

atlas_org_id => Go to MongoDB Atlas account > Select Settings Icon > Settings > Copy Organization ID

mongo_user_name = Your MongoDB user name

mongo_user_password = Your MongoDB password

mongo_database_name = "reaction"

mongo_local_database_name = "local"
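These values are typically supplied through a terraform.tfvars file (or matching TF_VAR_ environment variables) rather than being hard-coded. A minimal sketch, assuming the variable names listed above and using placeholder values only:

```hcl
# terraform.tfvars — placeholder values, never commit real credentials
atlas_mongo_public_key    = "your-atlas-public-key"
atlas_mongo_private_key   = "your-atlas-private-key"
atlas_org_id              = "your-atlas-org-id"
mongo_user_name           = "reaction-user"
mongo_user_password       = "change-me"
mongo_database_name       = "reaction"
mongo_local_database_name = "local"
```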

Variables in the file mongo.tf:

  • cidr_block => Go to MongoDB Atlas account > Projects > Select Project > Network Access > IP Whitelist tab > Copy existing address or create new ("0.0.0.0/0" allows connection from any IP)

  • atlas_cidr_block => Go to MongoDB Atlas account > Projects > Select Project > Network Access > Peering tab > Add Peering Connection or Select existing > For new Peering Connection Select aws > copy VPC CIDR
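The two CIDR values above feed the Atlas networking resources listed later in this gist (mongodbatlas_network_container, mongodbatlas_network_peering). A rough sketch of how they are typically wired, with illustrative resource names and an example Atlas CIDR:

```hcl
# IP whitelist entry — "0.0.0.0/0" permits connections from any address; narrow it in production
resource "mongodbatlas_project_ip_whitelist" "main" {
  project_id = mongodbatlas_project.main.id
  cidr_block = "0.0.0.0/0"
}

# Network container holding the Atlas-side VPC CIDR shown in the Peering tab
resource "mongodbatlas_network_container" "main" {
  project_id       = mongodbatlas_project.main.id
  atlas_cidr_block = "192.168.248.0/21"   # example value — copy yours from the Peering tab
  provider_name    = "AWS"
  region_name      = "EU_CENTRAL_1"
}
```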


AWS Variables

aws_account_id => Go to AWS Console > Select 'My Account' (top dropdown menu for the current user) > Copy 'Account ID'
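If you prefer not to copy the ID from the console by hand, Terraform can also discover it at plan time with the standard aws_caller_identity data source:

```hcl
data "aws_caller_identity" "current" {}

# The account ID of the credentials Terraform is running with
output "aws_account_id" {
  value = data.aws_caller_identity.current.account_id
}
```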

Variables in the file vpc.tf:

  • cidr_block

    • for resource "aws_vpc" => Go to AWS Console > Services > VPC > Your VPCs (left menu) > Select 'IPv4 CIDR' field for specific VPC
    • for each resource "aws_subnet" => for subnets replace /16 with /24 and set different IP for each subnet (example: "0.0.10.0/24", "0.0.20.0/24", "0.0.30.0/24")
  • destination_cidr_block

    • for each resource "aws_route" "...-gateway" => "0.0.0.0/0"
    • for each resource "aws_route" "...-mongo" => Go to MongoDB Atlas account > Projects > Select Project > Network Access > Peering tab > Add Peering Connection or Select existing > For new Peering Connection Select aws > copy VPC CIDR

Additional variable - always when AWS region is mentioned:

aws_region => Name of the AWS region selected by the user in the AWS console (e.g. "eu-central-1")
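The region is usually threaded through the AWS provider block so every resource inherits it:

```hcl
variable "aws_region" {
  default = "eu-central-1"
}

provider "aws" {
  region = var.aws_region
}
```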


Backend

stripe_secret_key => your Stripe API secret key

  • Log in to Stripe > Dashboard > Developers > API keys > Create a secret key or use the existing Secret key
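Secrets like this are best declared as sensitive variables and supplied outside version control, for example through a TF_VAR_stripe_secret_key environment variable. A minimal sketch:

```hcl
variable "stripe_secret_key" {
  type      = string
  sensitive = true   # keeps the value out of plan output
  # supply it with: export TF_VAR_stripe_secret_key="sk_live_..."
}
```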

Hydra

ocid_subject_identifiers_pairwise_salt => Generate an OIDC subject identifiers pairwise salt

secret_system => Generate session secret
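One way to generate these values is to let Terraform create them with the hashicorp/random provider instead of typing them by hand; a hedged sketch (resource names are illustrative):

```hcl
resource "random_password" "oidc_pairwise_salt" {
  length  = 32
  special = false
}

resource "random_password" "hydra_system_secret" {
  length  = 32
  special = false
}

# e.g. pass random_password.oidc_pairwise_salt.result and
# random_password.hydra_system_secret.result into the Hydra task definition
```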


Frontend

oauth2_client_secret => Generate oauth client secret

session_secret => Generate session secret (different from OAUTH2_CLIENT_SECRET)

stripe_public_api_key => your Stripe API public key

  • Log in to Stripe > Dashboard > Developers > API keys > Copy the existing Publishable key

Route53

domain_name => http://yoursite.com
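Assuming the domain is hosted in a Route53 public zone, the zone is typically looked up by name and pointed at the load balancer with an alias record. An illustrative sketch (record name and resource names are placeholders):

```hcl
data "aws_route53_zone" "main" {
  name = "yoursite.com."
}

resource "aws_route53_record" "api" {
  zone_id = data.aws_route53_zone.main.zone_id
  name    = "api.yoursite.com"
  type    = "A"

  alias {
    name                   = aws_lb.main.dns_name
    zone_id                = aws_lb.main.zone_id
    evaluate_target_health = true
  }
}
```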


Postgres

postgres_identifier = "hydra" (Set the identifier - can be the same as the database name)

postgres_database_name = "hydra"

postgres_username = "postgres"

postgres_password => Set database unique password

postgres_instance_name => Unique name across all DB instances owned by the current AWS account

postgres_db_password => Generate password

postgres_port => Default port for PostgreSQL: 5432
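For orientation, these variables roughly map onto an aws_db_instance resource like the sketch below; the exact mapping depends on the project's Postgres module, and the instance class and storage size are placeholders:

```hcl
resource "aws_db_instance" "hydra" {
  identifier          = var.postgres_instance_name   # unique across the AWS account
  db_name             = var.postgres_database_name   # "hydra"
  username            = var.postgres_username        # "postgres"
  password            = var.postgres_password
  port                = var.postgres_port            # 5432
  engine              = "postgres"
  instance_class      = "db.t3.micro"
  allocated_storage   = 20
  skip_final_snapshot = true
}
```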


Repo paths

admin_repo_path => External; In our case: "User//projects//reaction-admin"

backend_repo_path => External; In our case: "User//projects//reaction"

identity_repo_path => External; In our case: "User//projects//reaction-identity"

hydra_repo_path => External; In our case: "User//projects//reaction-hydra"

storefront_repo_path => External; In our case: "User//projects//reaction-storefront"


Other variables explanation

local-exec commands => Get the commands to build and push the Docker image for each repository > Go to AWS Console > Services > ECR > Select repository > View push commands (a sketch of the pattern follows)
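Those push commands are usually wrapped in a null_resource with a local-exec provisioner so Terraform builds and pushes the image as part of the run. A rough sketch of the pattern — the repository name is illustrative:

```hcl
resource "null_resource" "build_and_push_backend" {
  provisioner "local-exec" {
    command = <<-EOT
      aws ecr get-login-password --region ${var.aws_region} | \
        docker login --username AWS --password-stdin ${var.aws_account_id}.dkr.ecr.${var.aws_region}.amazonaws.com
      docker build -t reaction-backend ${var.backend_repo_path}
      docker tag reaction-backend:latest ${var.aws_account_id}.dkr.ecr.${var.aws_region}.amazonaws.com/reaction-backend:latest
      docker push ${var.aws_account_id}.dkr.ecr.${var.aws_region}.amazonaws.com/reaction-backend:latest
    EOT
  }
}
```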


πŸ”— Helpful Links


AWS

AWS Working with VPCs and subnets

AWS ECS

Redirects with AWS ALB using terraform

AWS What is VPC peering

AWS VPCs and subnets


Terraform

Terraform Environment Variables

AWS Provider

MongoDB Atlas Provider

PostgreSQL Provider


Terraform Resources

aws_ecr_repository

aws_ecs_cluster

aws_lb_listener

aws_lb_target_group

aws_eip (Elastic IP)

aws_ecs_task_definition

aws_lb (Application Load Balancer)

aws_db_security_group

postgresql_role

aws_db_instance

aws_acm_certificate

mongodbatlas_project

mongodbatlas_cluster

mongodbatlas_network_container

mongodbatlas_network_peering


Terraform Data Source

aws_lb_listener

aws_vpc


Other

Configure Atlas API Access

HOW TO CREATE A VPC WITH TERRAFORM

ECS Deployment

Setup a Container Cluster on AWS with Terraform

ECS cluster with dynamic port mappings using terraform

How to create an ECS Cluster with Terraform
