Adapted from https://gist.github.com/aeimer/543c231b3ae0fbf8f4f00dc911d9379a for the Raspberry Pi architecture.

Build the Docker image for the Raspberry Pi using the script build_rpi_image.sh:

#!/bin/bash
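# NOTE: the rest of build_rpi_image.sh is not reproduced in these notes.
# The lines below are only a hypothetical sketch of what such a script could
# contain, assuming Docker with the buildx plugin is available; the image
# name, target platform and output filename are placeholders to adapt.
set -euo pipefail

IMAGE_NAME="myapp:rpi"

# Cross-build the image for a Raspberry Pi (32-bit ARM) target from the
# Dockerfile in the current directory and load it into the local daemon
# (cross-building assumes qemu/binfmt emulation is set up on the build host).
docker buildx build --platform linux/arm/v7 --load -t "$IMAGE_NAME" .

# Export the image to a gzipped tarball so it can be copied to the Pi and
# loaded there with 'docker load'.
docker save "$IMAGE_NAME" | gzip > myapp_rpi.tar.gz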
# Script to test web scraping on the Telepac website.
# Adapted from a blog article (https://www.scrapingbee.com/blog/selenium-python/)
# L. Houpert, Dec. 2020
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
import os
import time
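# ---------------------------------------------------------------------------
# The remainder of the original script is not shown in these notes. The lines
# below are only a minimal sketch of the usual headless-Chrome setup described
# in the scrapingbee article cited above; the Telepac URL is a placeholder and
# chromedriver is assumed to be available on the PATH.
# ---------------------------------------------------------------------------
options = Options()
options.add_argument("--headless")    # run Chrome without opening a window
options.add_argument("--no-sandbox")  # often needed when running inside Docker

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://www.telepac.agriculture.gouv.fr/")  # placeholder URL
    time.sleep(5)          # crude wait for the page to finish rendering
    print(driver.title)    # confirm that the page actually loaded
finally:
    driver.quit()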
To publish the HTML files online with GitHub Pages:

Make sure that the master branch of the repository is up to date and that ghp-import is installed (website).
From the master branch of the repository, call ghp-import and point it to the HTML directory my_html_dir containing the HTML files to be exported: ghp-import -n -p -f my_html_dir
(The -n flag adds a .nojekyll file, -p pushes the result to the gh-pages branch and -f forces the update of that branch.)

Useful additional information can be found on the jupyter-book website.
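For a jupyter-book project, assuming the book has been built with jupyter-book build so that the HTML ends up in the default _build/html output directory (an assumption; substitute your own HTML directory), the publishing step typically looks like:

pip install ghp-import
ghp-import -n -p -f _build/html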
Notes from Aaron Meurer's tutorial on the git workflow:

1. Clone the main repository. Copy the clone URL from the repository page and run:
git clone clone-url
2. Fork the repo on GitHub to your personal account. Click the Fork button on the main repo page.
3. Add your fork as a remote. This remote will be named after your GitHub username. Go to your fork of the repository (e.g. https://github.com/username/xxxx, replacing username with your GitHub username), copy the clone URL as in step 1, cd to the clone from step 1 and run the command sketched below.
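The remote-add step normally amounts to something like the following sketch (username and xxxx are placeholders; use the clone URL of your own fork):

# add the fork as a remote named after the GitHub username
git remote add username https://github.com/username/xxxx.git
# check that both remotes are now configured
git remote -v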
from pathlib import Path
import ruamel.yaml
import pandas as pd

yaml = ruamel.yaml.YAML()

# Read the Scopus export (with manual edits) into a dataframe
df = pd.read_excel('../scopus_vexport_with_manualedit_202008.xlsx')
print(df)
# df.columns returns:
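# ---------------------------------------------------------------------------
# The column list and the rest of the original snippet are not reproduced
# here. The lines below are only a guess at the general idea, given that
# ruamel.yaml is imported alongside pandas: turn each row of the Scopus
# export into a dictionary and dump the list to a YAML file. The output
# filename is an assumption.
# ---------------------------------------------------------------------------
# Cast every cell to a plain string so that ruamel.yaml can represent it,
# then build one dictionary per row of the spreadsheet.
records = df.astype(str).to_dict(orient='records')

# Write the list of records to a YAML file (filename is a placeholder).
yaml.dump(records, Path('scopus_export.yaml'))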