This gist is associated with the blog post "Building safe-by-default tools in our Go web application".
It contains the gorm hooks that we use to ensure our queries are correctly scoped.
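The hooks themselves are not reproduced in this excerpt. As a rough sketch of the pattern (not the post's actual code), a GORM v2 query callback can refuse to run any SELECT that lacks a tenant scope; the RegisterTenantScope name, the accountKey context key, and the account_id column below are all illustrative assumptions:

```go
package scopes

import "gorm.io/gorm"

// accountKey is an assumed context key under which request middleware
// would stash the current account ID.
type accountKey struct{}

// RegisterTenantScope installs a query callback that appends an
// account_id filter to every SELECT, so reads are scoped by default.
func RegisterTenantScope(db *gorm.DB) error {
	return db.Callback().Query().Before("gorm:query").
		Register("app:scope_by_account", func(tx *gorm.DB) {
			accountID, ok := tx.Statement.Context.Value(accountKey{}).(uint)
			if !ok {
				// Refuse to run an unscoped query rather than leak data.
				tx.AddError(gorm.ErrMissingWhereClause)
				return
			}
			tx.Where("account_id = ?", accountID)
		})
}
```

Registering the check as a callback makes scoping the default: every query is filtered (or rejected) unless something explicitly opts out, which is the "safe by default" property the post's title refers to.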
import { Country, Photon } from '@prisma/photon';
import { findManyCursor } from './findManyCursor';

const photon = new Photon();
let data: Country[];

// Seed helper: creates a Country row with a known id.
const createCountry = async (id: string) =>
  photon.countries.create({
    data: {
      id,
    },
  });
# Foundation target (install system runtime deps)
FROM xyz:1.0@sha256:abcd1234 AS foundation
# Runtime deps and common environment stuff go here

# Build target (install system build-time deps)
FROM foundation AS build
# Install build-time system deps and such here; we don't want/need these in production

# Development target (development toolchain)
FROM build AS development
FROM ruby:2.3.1

# Install dependencies
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev nodejs

# Set an environment variable for where the Rails app is installed inside the Docker image:
ENV RAILS_ROOT /var/www/app_name
RUN mkdir -p $RAILS_ROOT

# Set the working directory, where subsequent commands will be run:
WORKDIR $RAILS_ROOT
import { InjectedFormikProps, withFormik } from 'formik';
import * as React from 'react';
import * as Yup from 'yup';

// Shape of the form's values.
interface FormValues {
  login: string;
}

// Props passed into the wrapped form component.
interface FormProps {
  login?: string;
}
Since Go 1.11, this process is finally (almost) as easy as it should be. You can see the full docs here; for older guides, see here.
These are my notes, not a generic solution; they are not meant to work anywhere outside my machines. Update the version numbers to whatever the current ones are while you do this.
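Assuming the process in question is module-based dependency management (the headline Go 1.11 feature), the basic flow looks like this; the module path is a placeholder, not one from these notes:

```sh
# Initialize a module (works outside GOPATH in Go 1.11+); the path is a placeholder.
go mod init github.com/yourname/yourproject

# Build as usual; dependencies are resolved and pinned in go.mod and go.sum.
go build ./...

# Remove unused requirements and add any missing ones.
go mod tidy
```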
This sample includes a continuous deployment pipeline for websites built with React. We use AWS CodePipeline, CodeBuild, and SAM to deploy the application; to deploy it to S3 using SAM, we use a custom CloudFormation resource. The sample contains the following files:
buildspec.yml: YAML configuration for CodeBuild; this file should be in the root of your code repository.
configure.js: script executed in the build step to generate a config.json file for the application; this is used to include values exported by other CloudFormation stacks (separate services of the same application).
index.js: custom CloudFormation resource that publishes the website to an S3 bucket. As you can see from the buildspec and SAM template, this function is located in an s3-deployment-custom-resource sub-folder of the repo.
app-sam.yaml: Serverless Application Model YAML file. This configures the S3 bucket and the custom resource.
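For orientation, a buildspec.yml for a setup like this might look roughly as follows; the npm commands, the ARTIFACT_BUCKET variable, and the artifact paths are assumptions, not the sample's actual file:

```yaml
version: 0.2

phases:
  install:
    commands:
      - npm ci
  pre_build:
    commands:
      # Assumed step: generate config.json from other stacks' exports.
      - node configure.js
  build:
    commands:
      - npm run build
      # Package the SAM template so CodePipeline can deploy it.
      # ARTIFACT_BUCKET is an assumed environment variable.
      - aws cloudformation package --template-file app-sam.yaml --s3-bucket $ARTIFACT_BUCKET --output-template-file app-output.yaml

artifacts:
  files:
    - app-output.yaml
    - build/**/*
```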
A curated list of AWS resources to prepare for the AWS Certifications

A curated list of awesome AWS resources you need to prepare for all 5 AWS Certifications. This gist includes: open-source repos, blogs & blog posts, ebooks, PDFs, whitepapers, video courses, free lectures, slides, sample tests, and many other resources.
A lot of us are interested in doing more analysis with our service logs, so I thought I'd share an experiment I'm doing with Sync. The main idea is to transform the raw logs into something that'll be nice to query and generate reports with in Redshift.
Logs make their way into an S3 bucket (let's call it the "raw" bucket), where we've got a Lambda listening for new data. This Lambda reads the raw Heka-protobuf gzipped data, does some transformation, and writes a new file to a different S3 bucket (the "processed" bucket) in a Redshift-friendly format (like JSON or CSV). There's another Lambda listening on the processed bucket that loads this data into Redshift.
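A minimal sketch of the first (transform) Lambda in Go, assuming an S3 event trigger; the processed-bucket name and the line-by-line pass-through transform are placeholders, and the actual Heka protobuf decoding is elided:

```go
package main

import (
	"bufio"
	"bytes"
	"compress/gzip"
	"context"
	"fmt"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

// processedBucket is a placeholder name for the "processed" bucket.
const processedBucket = "logs-processed"

func handler(ctx context.Context, evt events.S3Event) error {
	svc := s3.New(session.Must(session.NewSession()))
	for _, rec := range evt.Records {
		// Fetch the gzipped raw object that triggered this event.
		obj, err := svc.GetObjectWithContext(ctx, &s3.GetObjectInput{
			Bucket: aws.String(rec.S3.Bucket.Name),
			Key:    aws.String(rec.S3.Object.Key),
		})
		if err != nil {
			return err
		}
		gz, err := gzip.NewReader(obj.Body)
		if err != nil {
			return err
		}

		// Placeholder transform: the real code would decode Heka protobuf
		// records here and emit Redshift-friendly JSON or CSV rows.
		var out bytes.Buffer
		sc := bufio.NewScanner(gz)
		for sc.Scan() {
			out.WriteString(sc.Text())
			out.WriteByte('\n')
		}
		obj.Body.Close()
		if err := sc.Err(); err != nil {
			return err
		}

		// Write the transformed file to the "processed" bucket, where the
		// second Lambda picks it up.
		_, err = svc.PutObjectWithContext(ctx, &s3.PutObjectInput{
			Bucket: aws.String(processedBucket),
			Key:    aws.String(fmt.Sprintf("%s.json", rec.S3.Object.Key)),
			Body:   bytes.NewReader(out.Bytes()),
		})
		if err != nil {
			return err
		}
	}
	return nil
}

func main() { lambda.Start(handler) }
```

The second Lambda would then respond to puts on the processed bucket and issue a Redshift COPY for each new object.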