FROM ubuntu:18.04
RUN apt-get update -y && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y qemu-kvm libvirt-daemon-system libvirt-clients bridge-utils vagrant && \
    apt-get autoclean && \
    apt-get autoremove && \
    vagrant plugin install vagrant-libvirt
COPY startup.sh /
ENTRYPOINT ["/startup.sh" ]
[alias]
wip = for-each-ref --sort='authordate:iso8601' --format=' %(color:green)%(authordate:relative)%09%(color:white)%(refname:short)' refs/heads
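With that alias in ~/.gitconfig, listing local branches ordered by the author date of their tips (oldest first) is just:
git wip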
package services

import (
    "database/sql"
    "reflect"
    "regexp"
    "testing"

    "github.com/DATA-DOG/go-sqlmock"
    "github.com/volatiletech/sqlboiler/boil"
)
L0 = bare metal machine
L1 = VM on bare metal
L2 = VM on VM
First, ensure virtualization is enabled on L0 as described here: https://docs.fedoraproject.org/en-US/quick-docs/using-nested-virtualization-in-kvm/#proc_enabling-nested-virtualization-in-kvm
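Before moving on it is worth confirming that nested support actually took effect on L0; a one-line check, assuming an Intel CPU (the AMD equivalent is the kvm_amd module):
cat /sys/module/kvm_intel/parameters/nested   # expect Y (or 1 on older kernels)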
Next, launch L1 with the following Vagrant config (or an equivalent directly on the hypervisor):
Vagrant.configure("2") do |config| | |
config.vm.box = "generic/ubuntu1604" |
REMOTE=${REMOTE:-"upstream/master"}
DOC_EXAMPLE_CHANGE_PATTERN="\
-e ^doc/ \
-e ^examples/ \
-e ^(VERSION|LICENSE)$ \
-e \.md$\
"
echo using remote $REMOTE
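The surrounding script is not shown here, but the variable reads like a bundle of grep -E options, so presumably it is expanded unquoted against a list of changed files; a rough sketch under that assumption (the diff range is also a guess):
changed=$(git diff --name-only "$REMOTE"...HEAD)
# word splitting of $DOC_EXAMPLE_CHANGE_PATTERN is intentional: it carries the -e flags
if [ -n "$changed" ] && echo "$changed" | grep -vqE $DOC_EXAMPLE_CHANGE_PATTERN; then
    echo "changes beyond docs/examples detected"
fi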
If you are like me, you find yourself cloning a repo, making some proposed changes, and only later deciding to contribute them back using the GitHub Flow convention. Below is a set of instructions I've developed for myself for dealing with this scenario, and an explanation of why it matters, based on jagregory's gist.
To follow GitHub Flow properly you should really have created a fork first, as a public representation of the repository, and then cloned that instead. My understanding is that the typical setup has your local repository pointing to your fork as origin and to the original repository as upstream, so that you can use those names in other git commands (see the remote setup sketch after the clone step below).
- Clone some repo (you've probably already done this step).
  git clone git@github.com:<user>/<repo>.git
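If the fork only comes along later, the remotes can be rearranged afterwards to match that convention; a minimal sketch, assuming the fork already exists on GitHub and <you>/<repo> stands in for its path:
git remote rename origin upstream                        # the original repository becomes "upstream"
git remote add origin git@github.com:<you>/<repo>.git    # your fork becomes "origin"
git remote -v                                            # confirm both remotes are in place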
#!/usr/bin/env python3
"""
License: MIT License
Copyright (c) 2023 Miel Donkers

Very simple HTTP server in python for logging requests

Usage::
    ./server.py [<port>]
"""
from http.server import BaseHTTPRequestHandler, HTTPServer
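Only the header of the gist is reproduced above; assuming the rest of its request-logging handler code is present in server.py, it can be exercised like this (the port is arbitrary):
python3 server.py 8080 &
curl -v http://localhost:8080/some/path   # the request and its headers should appear in the server's output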