- Make the spice mix:
- 1 Tbs ground allspice
- 1 Tbs ground cinnamon
- 1 Tbs ground nutmeg
- 1 tsp ground cloves
- 1 tsp ground coriander
- 1 tsp ground ginger
You can keep the leftovers in a jar.
sudo modprobe v4l2loopback \
devices=1 exclusive_caps=1 video_nr=5 card_label="Dummy Camera"
A video file such as Speed1.avi can then be streamed to /dev/video5 with
ffmpeg -re -stream_loop -1 -i Speed1.avi -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/video5
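To check the dummy camera is working (assuming v4l-utils and ffplay are installed) you can list the devices and view the stream:
v4l2-ctl --list-devices
ffplay -f v4l2 /dev/video5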
A simple example of code for testing a custom TensorFlow Keras layer:
import tensorflow as tf
from scipy import io


def my_init(shape, dtype=None):
    """Custom kernel initialiser: loads the weights from a matlab file. Adapt as needed."""
    matlab = io.loadmat('../matlab/weights.mat')
    wfilter = matlab['wfilter']
    if tuple(shape) != tuple(wfilter.shape):
        raise Exception('Shapes do not match')
    return tf.constant(wfilter, dtype=dtype)
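A minimal sketch of how the initialiser might be wired into a layer to test it; the input size (16) and units (8) below are made-up placeholders and must match wfilter.shape:
from tensorflow import keras

# Hypothetical test harness: a Dense layer whose kernel is built by my_init.
inputs = keras.Input(shape=(16,))                            # placeholder input size
layer = keras.layers.Dense(8, kernel_initializer=my_init)   # placeholder units
outputs = layer(inputs)
model = keras.Model(inputs, outputs)
print(layer.get_weights()[0].shape)   # should equal wfilter.shape, (16, 8) here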
This is running on Linux Mint 20
sudo apt-get -y install apt-transport-https ca-certificates curl software-properties-common
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(. /etc/os-release; echo "$UBUNTU_CODENAME") stable"
sudo apt-get update
sudo apt install docker-ce docker-compose
sudo usermod -aG docker $USER
docker --version
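Log out and back in so the group change takes effect, then a quick sanity check:
docker run hello-world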
It's actually fairly easy to control Home Assistant remotely using curl
but I couldn't find a complete solution on how to do this, so here goes...
First enable the API in configuration.yaml by adding the line api:
curl -X GET -H "Authorization: Bearer YOUR_TOKEN" -H "Content-Type: application/json" http://YOUR_IP:8123/api/states | prettyjson
prettyjson is an alias for python -m json.tool; you don't need it, it just makes the output easier to read. To get the state of a single entity:
curl -X GET -H "Authorization: Bearer YOUR_TOKEN" -H "Content-Type: application/json" http://YOUR_IP:8123/api/states/switch.mylight | prettyjson
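Controlling devices works the same way via a POST to the services endpoint; a sketch using the same example entity as above:
curl -X POST -H "Authorization: Bearer YOUR_TOKEN" -H "Content-Type: application/json" -d '{"entity_id": "switch.mylight"}' http://YOUR_IP:8123/api/services/switch/turn_on | prettyjson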
To convert a video to 16x9 or 9x16 by adding black bars use ffmpeg as follows
ffmpeg -i film1.mp4 -vf "scale=1920:1080:force_original_aspect_ratio=decrease,pad=1920:1080:(ow-iw)/2:(oh-ih)/2,setsar=1" film1_bars.mp4
This pads the video out to 16x9 (1920x1080), which is the better shape for computer monitors, TVs etc.; change both occurrences of 1920:1080 to 1080:1920 for a 9x16 portrait version, which gives horizontal bars above and below a widescreen source.
Note IGTV also requires h.264 and a max of 30fps.
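A sketch of a 9x16 command that also meets those constraints; the output filename and aac audio are just illustrative choices:
ffmpeg -i film1.mp4 -vf "scale=1080:1920:force_original_aspect_ratio=decrease,pad=1080:1920:(ow-iw)/2:(oh-ih)/2,setsar=1,fps=30" -c:v libx264 -c:a aac film1_igtv.mp4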
ffmpeg -i input.mkv -filter:v "setpts=0.5*PTS" output.mkv
doubles the playback speed by dropping frames (only the video stream is changed).
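If the file has audio that should stay in sync, one way (a sketch, not the only approach) is to speed the audio up with atempo at the same time:
ffmpeg -i input.mkv -filter_complex "[0:v]setpts=0.5*PTS[v];[0:a]atempo=2.0[a]" -map "[v]" -map "[a]" output.mkv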
To play a video directly with ffmpeg use
ffmpeg -i input.mp4 -f opengl "window title"
or use ffplay.
ffmpeg -i video.flv video.mpeg
"""Simple demo of argparse in python, see http://zetcode.com/python/argparse/""" | |
import argparse | |
parser = argparse.ArgumentParser() | |
parser.add_argument('name') # positional argument, note no dash! | |
parser.add_argument('-n', type=int, required=True, help="the number") # Required argument, must be a int, eg. -n 4 | |
parser.add_argument('-e', type=int, default=2, help="defines the value") # Optional argument with a default | |
parser.add_argument('-o', '--output', action='store_true', help="shows output") # a binary flag |
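Assuming the script is saved as argdemo.py (the filename is just for illustration), a run looks like
python argdemo.py myfile -n 4 -o
which prints myfile 4 2 True.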
git filter-branch --prune-empty --subdirectory-filter FOLDER-NAME BRANCH-NAME
FOLDER-NAME: The folder within your project that you'd like to create a separate repository from.
BRANCH-NAME: The default branch for your current project, for example, master.
git remote set-url origin https://github.com/USERNAME/NEW-REPOSITORY-NAME.git
git remote -v
git push -u origin BRANCH-NAME
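Note that newer versions of git warn that filter-branch is deprecated; if you have git filter-repo installed, the rough equivalent (run it in a fresh clone) is:
git filter-repo --subdirectory-filter FOLDER-NAME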
The code for this tutorial is here
OpenCV provides a useful, but limited, method of building a GUI. A much more complete system can be achieved using PyQt.
The question is, how do we display images? There are quite a few possible routes, but perhaps the easiest is to use QLabel, since it has a setPixmap function. Below is some code that creates two labels. It then creates a grey pixmap and displays it in one of the labels. code: staticLabel1.py
from PyQt5.QtWidgets import QWidget, QApplication, QLabel, QVBoxLayout
from PyQt5.QtGui import QPixmap, QColor
import sys
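The imports are as far as the snippet goes here; a minimal sketch of the rest, along the lines described above (a plain QWidget with a vertical layout, and an arbitrary 320x240 pixmap size), might look like:
class Demo(QWidget):
    def __init__(self):
        super().__init__()
        layout = QVBoxLayout(self)
        self.label1 = QLabel()                  # will hold the pixmap
        self.label2 = QLabel('just some text')  # an ordinary text label
        layout.addWidget(self.label1)
        layout.addWidget(self.label2)

        # Create a grey pixmap and display it in the first label
        pixmap = QPixmap(320, 240)              # size is arbitrary
        pixmap.fill(QColor('gray'))
        self.label1.setPixmap(pixmap)


if __name__ == '__main__':
    app = QApplication(sys.argv)
    demo = Demo()
    demo.show()
    sys.exit(app.exec_())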