Built this as a one-off script to post a bunch of records to Bluesky based on a given CSV, to bootstrap https://bsky.app/profile/cdk.dev
Sonnet 3.5 wrote all the code
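A minimal sketch of what such a script can look like, assuming the @atproto/api client and csv-parse as dependencies; the CSV column name, file name, and credential env vars are placeholders:

import { BskyAgent } from "@atproto/api";
import { parse } from "csv-parse/sync";
import * as fs from "fs";

async function main() {
  const agent = new BskyAgent({ service: "https://bsky.social" });
  await agent.login({
    identifier: process.env.BSKY_HANDLE!,
    password: process.env.BSKY_APP_PASSWORD!,
  });

  // Hypothetical CSV with one `text` column per post
  const records = parse(fs.readFileSync("posts.csv"), { columns: true });
  for (const record of records) {
    await agent.post({ text: record.text });
  }
}

main();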
/**
 * Post-synthesis function to configure the AWS provider for LocalStack
 */
exports.postSynth = function (config) {
  const endpoint = "http://localhost:4566";
  const services = [
    "apigateway", "apigatewayv2", "cloudformation", "cloudwatch", "dynamodb", "ec2", "es",
    "elasticache", "firehose", "iam", "kinesis", "lambda", "rds", "redshift", "route53",
    "secretsmanager", "ses", "sns", "sqs", "ssm", "stepfunctions", "sts"
  ];
  // Assumed completion (the original snippet is truncated here): point every
  // service endpoint at LocalStack and merge it into the provider config.
  const endpoints = services.reduce((acc, name) => ({ ...acc, [name]: endpoint }), {});
  config.provider.aws = { ...config.provider.aws, endpoints };
  return config;
};
Replace `set accounts to {"IBAN 1", "IBAN 2"}` with your account IBANs,
`yourPassword` with your MoneyMoney password, and
`/your/local/path/` with your actual target path.
The line `set yesterday to (today - (1 * days))` selects yesterday as the export date.
Load this file in Automator and get fully automated exports.
import { Credentials } from "@aws-amplify/core";
import { AuthOptions, createAuthLink } from "aws-appsync-auth-link";
import { createSubscriptionHandshakeLink } from "aws-appsync-subscription-link";
import {
  ApolloClient, ApolloLink, HttpLink, InMemoryCache
} from "@apollo/client/core";
import gql from "graphql-tag";

// Node has no native WebSocket; polyfill it for the subscription link.
global.WebSocket = require("ws");
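A sketch of how these imports are typically wired together, assuming IAM auth via Amplify credentials and the endpoint/region coming from environment variables:

import { AUTH_TYPE } from "aws-appsync-auth-link"; // alongside the imports above

const url = process.env.APPSYNC_ENDPOINT_URL!;
const region = process.env.AWS_REGION!;
const auth: AuthOptions = {
  type: AUTH_TYPE.AWS_IAM,
  credentials: () => Credentials.get(),
};

const httpLink = new HttpLink({ uri: url });
const link = ApolloLink.from([
  createAuthLink({ url, region, auth }),
  createSubscriptionHandshakeLink({ url, region, auth }, httpLink),
]);

const client = new ApolloClient({
  link,
  cache: new InMemoryCache(),
});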
import { Construct, Node } from "constructs";
import { Resource, TerraformResource, TerraformAsset, AssetType } from "cdktf";
import * as fs from "fs";
import * as path from "path";
import { DataExternal } from "@cdktf/provider-external";

export interface CustomDataSourceConfig {
  code(input: any): Promise<any>;
  inputs: any;
  dependsOn?: TerraformResource[];
}
An example of a custom provider leveraging https://github.com/lukekaalim/terraform-plugin-node-SDK/
This is a workaround for hashicorp/terraform-provider-aws#18206.
This works locally and could also be published as an actual Terraform provider to the Terraform registry. In addition, we can build provider bindings for it via `cdktf get` and use it in the context of Terraform CDK.
import { Construct } from "constructs";
import { Resource, TerraformResource } from "cdktf";
import * as fs from "fs";
import * as path from "path";
import { DataExternal } from "../.gen/providers/external";

export interface CustomDataSourceConfig<Inputs, Outputs> {
  code(input: Inputs): Promise<Outputs>;
  inputs: Inputs;
  dependsOn?: TerraformResource[];
}
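For illustration, usage could look like this, assuming a CustomDataSource construct (not shown in the snippet) that consumes this config and runs `code` as an external data source; `cert` and the input/output shapes are hypothetical:

// Hypothetical consumer of CustomDataSourceConfig
const ready = new CustomDataSource(this, "acm-validation", {
  inputs: { certificateArn: cert.arn },
  async code(input: { certificateArn: string }) {
    // e.g. wait until the ACM certificate is issued
    // (the hashicorp/terraform-provider-aws#18206 workaround)
    return { arn: input.certificateArn };
  },
  dependsOn: [cert],
});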
import { Construct, Tag } from "@aws-cdk/core";
import { App, Stack } from "../../../packages/@terrastack/core";
import { AwsProvider, AwsS3Bucket, AwsIamPolicy } from "../.generated/aws";
import { PolicyDocument, PolicyStatement, AnyPrincipal, Effect } from "@aws-cdk/aws-iam";

const app = new App();

class MyBucketStack extends Stack {
  constructor(scope: Construct, ns: string) {
    super(scope, ns);
    // ... resource definitions (truncated in the source)
  }
}
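For illustration, the truncated constructor body might create a bucket plus a deny-all policy along these lines (a sketch; the property shapes of the generated AwsS3Bucket/AwsIamPolicy classes are assumptions):

// Inside MyBucketStack's constructor (hypothetical sketch)
new AwsProvider(this, "aws", { region: "eu-central-1" });

const bucket = new AwsS3Bucket(this, "bucket", { bucket: `${ns}-bucket` });

// Reuse the AWS CDK's IAM builders to render the policy document
const policy = new PolicyDocument({
  statements: [new PolicyStatement({
    effect: Effect.DENY,
    principals: [new AnyPrincipal()],
    actions: ["s3:*"],
    resources: ["*"],
  })],
});

new AwsIamPolicy(this, "policy", {
  name: `${ns}-policy`,
  policy: JSON.stringify(policy.toJSON()),
});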
Given you have a stack with one or more Lambda functions (e.g. as part of a Step Functions state machine), it can be pretty useful to stub long-running parts with a known response.
This makes use of CDK Aspects, which allow modifying all, or a filtered subset of, resources in a given scope (Stack, Construct).
In addition, this leverages raw overrides to remove the original code of the Lambda function.
Note that the stub has to be in Python or Node.js, since inline code is only supported by those runtimes.
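A minimal sketch of such an aspect, assuming CDK v1 (@aws-cdk/core, matching the imports elsewhere in this collection); the stub body, runtime, and handler values are placeholders:

import { Aspects, IAspect, IConstruct } from "@aws-cdk/core";
import { CfnFunction } from "@aws-cdk/aws-lambda";

class StubLambdaFunctions implements IAspect {
  visit(node: IConstruct): void {
    if (node instanceof CfnFunction) {
      // Raw overrides: drop the original asset-based code...
      node.addPropertyDeletionOverride("Code.S3Bucket");
      node.addPropertyDeletionOverride("Code.S3Key");
      // ...and inline a known response instead (Node.js/Python only).
      node.addPropertyOverride(
        "Code.ZipFile",
        "exports.handler = async () => ({ status: 'stubbed' });"
      );
      node.addPropertyOverride("Runtime", "nodejs12.x");
      node.addPropertyOverride("Handler", "index.handler");
    }
  }
}

// Apply to every Lambda function in a given scope:
// Aspects.of(stack).add(new StubLambdaFunctions());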
const AWS = require('aws-sdk');
const appsync = require('aws-appsync');
const gql = require('graphql-tag');
require('cross-fetch/polyfill');

exports.handler = async function (event) {
  const graphqlClient = new appsync.AWSAppSyncClient({
    url: process.env.APPSYNC_ENDPOINT_URL,
    region: process.env.AWS_REGION,
    auth: {
      // Assumed completion: the original snippet is truncated here;
      // IAM auth via the Lambda execution role is the usual pattern.
      type: 'AWS_IAM',
      credentials: AWS.config.credentials,
    },
  });
};
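From there, the handler body would execute queries or mutations against the AppSync API; a hypothetical example (the schema fields and event shape are placeholders):

// Inside the handler above
const result = await graphqlClient.mutate({
  mutation: gql`
    mutation Publish($id: ID!) {
      publish(id: $id) { id }
    }
  `,
  variables: { id: event.id },
});
return result.data;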