You'll need a GitHub personal access token (PAT) with package write permission. Then:
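A typical way to use that PAT with GitHub Container Registry (the user, org, and image names below are placeholders; GHCR_PAT is assumed to hold the token):

```shell
# Placeholders throughout; GHCR_PAT is assumed to hold the package-write PAT.
echo "$GHCR_PAT" | docker login ghcr.io -u YOUR_GITHUB_USER --password-stdin
docker build -t ghcr.io/some_org/some_image:latest .
docker push ghcr.io/some_org/some_image:latest
```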
server {
    listen 443 ssl;
    server_name ~^kibanasub\.;
    root /www/cluster;
    ssl_certificate /etc/nginx/certificates/cert1.pem;
    ssl_certificate_key /etc/nginx/certificates/privkey1.pem;
    ssl_protocols TLSv1.2 TLSv1.3;
When using the machinebox GraphQL Go library, you can take control of logging as follows:
graphqlclient.Log = func(s string) { log.Printf("YO: %s",s) }
Issue: when a Dockerfile clones from GitHub, Docker's layer cache doesn't know about new commits, so builds keep reusing the stale cached image.
You can force a rebuild by using the GitHub API to pull down the latest commit metadata and saving it into the image. This stays in sync with every new update to the repo: no new commits means identical metadata, so Docker uses the cached layers; a new commit produces a new SHA in the API response, which invalidates the cache from that layer onward.
ADD https://[email protected]/repos/some_org/some_repo/git/refs/heads cachebust
RUN git clone https://[email protected]/some_org/some-repo-git.git
git branch -m master main
git fetch origin
git branch -u origin/main main
git remote set-head origin -a
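The rename sequence assumes an existing remote; the local half can be sketched on its own in a throwaway repo (no origin), just to watch the branch move:

```shell
# Local-only sketch of the master -> main rename; no remote is involved.
tmp=$(mktemp -d)
git init -q -b master "$tmp"                  # start on master explicitly
git -C "$tmp" -c user.email="[email protected]" -c user.name="t" \
    commit -q --allow-empty -m "init"
git -C "$tmp" branch -m master main           # the local rename step
git -C "$tmp" branch --show-current           # prints: main
```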
Reference:
sudo fdisk -l
Install the dummy X video driver:
apt-get install xserver-xorg-video-dummy
Then edit the X config:
vim /etc/X11/xorg.conf
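The note stops at opening the file; a minimal dummy-driver xorg.conf looks roughly like the sketch below (the resolution, modeline, and memory size are common example values, not taken from the original):

```
Section "Device"
    Identifier "DummyDevice"
    Driver "dummy"
    VideoRam 256000
EndSection

Section "Monitor"
    Identifier "DummyMonitor"
    HorizSync 28.0-80.0
    VertRefresh 48.0-75.0
    Modeline "1920x1080" 172.80 1920 2040 2248 2576 1080 1081 1084 1118
EndSection

Section "Screen"
    Identifier "DummyScreen"
    Device "DummyDevice"
    Monitor "DummyMonitor"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080"
    EndSubSection
EndSection
```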
import scrapy
import re
import json

class FacultySpider(scrapy.Spider):
    name = "faculty"

    def start_requests(self):
        urls = [
            'https://compsci.uncg.edu/faculty/minjeong-kim/',
        ]
        for url in urls:
            yield scrapy.Request(url=url, callback=self.parse)

    def parse(self, response):
        # extract fields from the faculty page here
        pass
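Assuming the spider above is saved as faculty_spider.py (the filename is hypothetical), it can be run without a full Scrapy project via runspider:

```shell
# Assumes scrapy is installed (pip install scrapy).
scrapy runspider faculty_spider.py -o faculty.json
```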
raspivid -o - -t 0 -hf -w 800 -h 400 -fps 24 | cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8160}' :demux=h264
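To watch the stream from another machine, point a player at the Pi's HTTP port (the hostname below is an assumption; substitute the Pi's IP address):

```shell
# raspberrypi.local is assumed; use the Pi's IP if mDNS isn't set up.
vlc http://raspberrypi.local:8160
```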
[
    {
        "id": "a35b9d70.ae8be",
        "type": "tab",
        "label": "GREEN-RED DEMO",
        "disabled": false,
        "info": ""
    },
    {
        "id": "ef262268.43b27",