One Paragraph of project description goes here
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
# Source: https://gist.github.com/0431989df4836eb82bdac0cc53c7f3d6
# Used in https://youtu.be/R6OeIgb7lUI

##############################
# Flux 2 With GitOps Toolkit #
##############################

# What Is GitOps And Why Do We Want It?: https://youtu.be/HKkhD6nokC8
# Argo CD: Applying GitOps Principles To Manage Production Environment In Kubernetes: https://youtu.be/vpWQeoaiRM4
/*
In the node.js intro tutorial (http://nodejs.org/), they show a basic TCP
server, but for some reason omit a client connecting to it. I added an
example at the bottom.
Save the following server in example.js:
*/
var net = require('net');

// Echo server: greet the client, then pipe its input straight back.
var server = net.createServer(function (socket) {
  socket.write('Echo server\r\n');
  socket.pipe(socket);
});

server.listen(1337, '127.0.0.1');
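The client, saved as client.js and run from a second terminal, might look like this (a minimal sketch; the message text is arbitrary):

var net = require('net');

// Connect to the echo server above, send one message, print the reply.
var client = net.connect(1337, '127.0.0.1', function () {
  client.write('Hello, server!\r\n');
});

client.on('data', function (data) {
  console.log('Received: ' + data.toString());
  client.end(); // close once the echo comes back
});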
| "use strict"; | |
| const AWS_REGION = process.env.AWS_REGION; | |
| const Promise = require("bluebird"); | |
| const AWS = require("aws-sdk"); | |
| const eventUtils = require("./eventUtils.js"); | |
| AWS.config.update({ region: AWS_REGION }); | |
| AWS.config.setPromisesDependency(Promise); |
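With setPromisesDependency in place, .promise() on any SDK request returns a Bluebird promise. A quick sketch of a call (the DocumentClient, table name, and key are illustrative assumptions, not part of the original snippet):

const docClient = new AWS.DynamoDB.DocumentClient();

// .promise() yields a Bluebird promise because of the config above.
docClient.get({ TableName: "events", Key: { id: "42" } }).promise()
    .then((data) => console.log(data.Item))
    .catch((err) => console.error(err));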
Latency Comparison Numbers (~2012)
----------------------------------
L1 cache reference                         0.5 ns
Branch mispredict                          5   ns
L2 cache reference                         7   ns             14x L1 cache
Mutex lock/unlock                         25   ns
Main memory reference                    100   ns             20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy           3,000   ns    3 us
Send 1K bytes over 1 Gbps network     10,000   ns   10 us
Read 4K randomly from SSD*           150,000   ns  150 us     ~1GB/sec SSD
# You have your CSV data in a file named "my_data.csv", and it looks like this.
# We want to import it into a table named "my_things":
"1", "Something", "0.50", "2013-05-05 10:00:00"
"2", "Another thing", "1.50", "2013-06-05 10:30:00"

# Now you want to import it. Go to the command line and type:
$ PGPASSWORD=PWHERE psql -h HOSTHERE -U USERHERE DBNAMEHERE -c "\copy my_things FROM 'my_data.csv' WITH CSV;"

# Voila! It's imported. Now if you want to wipe it out and import a fresh one, you would do this:
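# One way to do it (a sketch; truncate the table first, then re-run the same copy):
$ PGPASSWORD=PWHERE psql -h HOSTHERE -U USERHERE DBNAMEHERE -c "TRUNCATE TABLE my_things;"
$ PGPASSWORD=PWHERE psql -h HOSTHERE -U USERHERE DBNAMEHERE -c "\copy my_things FROM 'my_data.csv' WITH CSV;"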
O(1)
O(n^2)
O(n)
O(n)
O(n^2)
O(n)
O(log n)
O(1)
O(n)
package auth

import (
	"context"
	"net/http"
	"strings"

	"google.golang.org/grpc/metadata"

	"github.com/andela/micro-api-gateway/pb/authorization"
)
<h2><a href="<?php the_permalink() ?>"><?php the_title() ?></a></h2>
<div class="content">
    <?php the_excerpt() ?>
</div>
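These template tags only produce output inside the WordPress Loop, so in a theme file the markup above would normally sit in a wrapper like this (a standard-loop sketch):

<?php if ( have_posts() ) : while ( have_posts() ) : the_post(); ?>
    <h2><a href="<?php the_permalink() ?>"><?php the_title() ?></a></h2>
    <div class="content">
        <?php the_excerpt() ?>
    </div>
<?php endwhile; endif; ?>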
var aws = require('aws-sdk');
var s3 = new aws.S3();

exports.handler = (event, context, callback) => {
    var srcBucket = event.Records[0].s3.bucket.name;
    // Keys arrive URL-encoded in S3 events; decode before using them.
    var srcKey = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));

    // Process file, maybe some CSV parse or image resize, etc.

    // Delete file
    s3.deleteObject({ Bucket: srcBucket, Key: srcKey }, (err) =>
        err ? callback(err) : callback(null, 'deleted ' + srcKey));
};
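The handler only reads a couple of fields from the S3 event, so a stub event is enough for a local smoke test (bucket and key are made up; this will issue a real DeleteObject call unless s3 is mocked):

var fakeEvent = {
    Records: [{ s3: { bucket: { name: 'my-bucket' }, object: { key: 'uploads/report.csv' } } }]
};

exports.handler(fakeEvent, {}, function (err, result) {
    console.log(err || result);
});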