
@wryfi
wryfi / http.go
Created May 16, 2018 22:53
DownloadFiles func
func DownloadFiles(destDir string, urls []string) (err error) {
	if _, err := os.Stat(destDir); os.IsNotExist(err) {
		if err := os.MkdirAll(destDir, 0755); err != nil {
			return err
		}
		log.Infof("Created directory %s.", destDir)
	}
	client := grab.NewClient()
	client.UserAgent = "pogos"
	requests := make([]*grab.Request, 0, len(urls))
	for _, url := range urls {
		// Build one download request per URL, all targeting destDir.
		request, err := grab.NewRequest(destDir, url)
		if err != nil {
			return err
		}
		requests = append(requests, request)
	}
	// DoBatch downloads up to four files concurrently; Err() surfaces
	// any per-file failure.
	for response := range client.DoBatch(4, requests...) {
		if err := response.Err(); err != nil {
			return err
		}
	}
	return nil
}
@wryfi
wryfi / Dockerfile
Last active April 19, 2020 21:40
yodafact cloud function
# Use the official Python image.
# https://hub.docker.com/_/python
FROM python:3.7-slim
RUN mkdir /app
WORKDIR /app
COPY . .
# Install production dependencies.
RUN pip install gunicorn functions-framework
var altmetricURL = 'https://api.altmetric.com/v1/doi/' + ArticleData.doi;
var dimensionsURL = 'https://metrics-api.dimensions.ai/doi/' + ArticleData.doi;
var counterURL = 'https://counter.plos.org/api/v1.0/stats/totals/doi/' + ArticleData.doi;
var URLs = [altmetricURL, dimensionsURL, counterURL];
var isMetricsView = location.pathname.indexOf('metrics') > 0;
if (isMetricsView) {
var citedSection = new MetricsCitedSection();
var discussedSection = new MetricsDiscussedSection();
var savedSection = new MetricsSavedSection();
name,quadrant,ring,isNew,description
Container security scanning,Techniques,Adopt,TRUE,"<p>The continued adoption of containers for deployments, especially <a href=""https://www.thoughtworks.com/radar/platforms/docker"">Docker</a>, has made <strong>container security scanning</strong> a must-have technique and we've moved this technique into Adopt to reflect that. Specifically, containers introduced a new path for security issues; it's vital that you use tools to scan and check containers during deployment. We prefer using automated scanning tools that run as part of the deployment pipeline.</p>"
Data integrity at the origin,Techniques,Adopt,TRUE,"<p>Today, many organizations' answer to unlocking data for analytical usage is to build a labyrinth of data pipelines. Pipelines retrieve data from one or multiple sources, cleanse it and then transform and move it to another location for consumption. This approach to data management often leaves the consuming pipelines with the difficult task of verifying the inbou
FROM node:18-bullseye AS build
RUN apt-get update && apt-get install -y --no-install-recommends \
libgtk2.0-0 libgtk-3-0 libgbm-dev libnotify-dev libgconf-2-4 libnss3 \
libxss1 libasound2 libxtst6 xauth xvfb g++ make \
&& rm -rf /var/lib/apt/lists/* /var/cache/apt/archives/*.deb
COPY . /src/
WORKDIR /src
RUN npm install && npm run build:prod
FROM nginx:latest