(by @andrestaltz)
If you prefer to watch video tutorials with live-coding, then check out this series I recorded with the same contents as in this article: Egghead.io - Introduction to Reactive Programming.
var amqp = require('amqplib/callback_api');
// if the connection is closed or fails to be established at all, we will reconnect
var amqpConn = null;
function start() {
  amqp.connect(process.env.CLOUDAMQP_URL + "?heartbeat=60", function(err, conn) {
    if (err) {
      console.error("[AMQP]", err.message);
      return setTimeout(start, 1000);
    }
    conn.on("close", function() {
      // the connection dropped after being established; retry in a second
      console.error("[AMQP] reconnecting");
      return setTimeout(start, 1000);
    });
    console.log("[AMQP] connected");
    amqpConn = conn;
  });
}
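Once amqpConn holds a live connection, channels can be opened from it. Below is a minimal sketch of that next step, assuming a helper that runs after start() succeeds; the whenConnected name is illustrative, while createChannel is amqplib's real callback API:

function whenConnected() {
  amqpConn.createChannel(function(err, ch) {
    if (err) {
      console.error("[AMQP] channel error", err.message);
      return;
    }
    // set up publishers and consumers on ch here
  });
}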
Arq stores backup data in a format similar to that of the open-source version
control system 'git'.

Content-Addressable Storage
---------------------------

At the most basic level, Arq stores "blobs" using the SHA1 hash of the
contents as the name, much like git. Because of this, each unique blob is only
stored once. If 2 files on your system have the same contents, only 1 copy of
the contents will be stored. If the contents of a file change, the SHA1 hash is
different and the file is stored as a different blob.
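To make the naming scheme concrete, here is a minimal Node sketch (not Arq's actual code) of deriving a blob's name from its contents:

const crypto = require('crypto');

// The blob's name is the SHA1 hash of its contents, so identical
// contents always map to the same name and are stored only once.
function blobName(contents) {
  return crypto.createHash('sha1').update(contents).digest('hex');
}

console.log(blobName('same contents')); // prints the SHA1 hex digest
console.log(blobName('same contents')); // same name again -> one stored blob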
#!/bin/bash

# variables
LOGFILE="/var/log/nginx/access.log"
LOGFILE_GZ="/var/log/nginx/access.log.*"
RESPONSE_CODE="200"

# functions
# keep lines with the given response code, then count requests per client IP
filters(){
    grep "$RESPONSE_CODE" \
        | awk '{print $1}' \
        | sort | uniq -c | sort -rn
}
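Assuming the rest of the script simply feeds the logs through that function (the invocation below is a sketch, not the original script's exact code), usage could look like:

# current log, then the rotated gzipped logs, through the same filter;
# $LOGFILE_GZ is left unquoted so the shell expands the glob
cat "$LOGFILE" | filters
zcat $LOGFILE_GZ | filters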
const http = require('http');
const express = require('express');
const Redis = require('ioredis');
const { RateLimiterRedis } = require('rate-limiter-flexible');

const redisClient = new Redis({ enableOfflineQueue: false });
const maxWrongAttemptsByIPperMinute = 5;
const maxWrongAttemptsByIPperDay = 100;

// counts failed attempts per IP within a one-minute window
const limiterFastBruteByIP = new RateLimiterRedis({
  storeClient: redisClient,
  keyPrefix: 'login_fail_ip_per_minute',
  points: maxWrongAttemptsByIPperMinute,
  duration: 60,
  blockDuration: 60 * 10, // block for 10 minutes after too many wrong attempts
});
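A hedged sketch of wiring the limiter into a login route: consume() is rate-limiter-flexible's real API (its promise rejects once a key's points are exhausted), while the app setup and the authorise() credential check are illustrative assumptions:

const app = express();
app.use(express.json());

app.post('/login', async (req, res) => {
  try {
    const ok = await authorise(req.body); // hypothetical credential check
    if (!ok) {
      // each failed attempt costs this IP one point
      await limiterFastBruteByIP.consume(req.ip);
      return res.status(400).end('Invalid credentials');
    }
    res.end('Logged in');
  } catch (rlRejected) {
    // consume() rejected: no points left for this IP
    res.status(429).end('Too Many Requests');
  }
});

http.createServer(app).listen(3000);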
Using our beloved Docker and docker-compose, we can very quickly bring up an Apache Airflow instance on our Mac.
About the only thing you need to customize in this docker-compose.yml file is the volumes section, which tells Docker to map the directories containing your Airflow DAGs and plugins into the container's file system.
version: '3'
services:
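  # A hedged sketch of a minimal service entry; the image name and the
  # host paths ./dags and ./plugins are assumptions, not fixed values.
  webserver:
    image: puckel/docker-airflow
    ports:
      - "8080:8080"
    volumes:
      # the part to customize: map your local DAG/plugin directories
      # into the container's file system
      - ./dags:/usr/local/airflow/dags
      - ./plugins:/usr/local/airflow/plugins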
Percentage:
<img src="https://user-images.githubusercontent.com/16319829/81180309-2b51f000-8fee-11ea-8a78-ddfe8c3412a7.png" width="50%" height="50%">
Pixels:
<img src="https://user-images.githubusercontent.com/16319829/81180309-2b51f000-8fee-11ea-8a78-ddfe8c3412a7.png" width="150" height="280">
# To extract the sound from a video and save it as MP3:
ffmpeg -i <video.mp4> -vn <sound>.mp3

# To convert frames from a video or GIF into individual numbered images:
ffmpeg -i <video.mpg|video.gif> <frame_%d.png>

# To combine numbered images (frame_1.jpg, frame_2.jpg, etc) into a video or GIF
# (-f image2 must precede -i so it applies to the image input, not the output):
ffmpeg -f image2 -i <frame_%d.jpg> <video.mpg|video.gif>

# To quickly extract a single frame from a video at time mm:ss and save it as a 128x128 resolution image:
ffmpeg -ss mm:ss -i <video.mp4> -frames:v 1 -s 128x128 <frame.png>