Thomas Sanchez Lengeling (ThomasLengeling)

@cbeddow
cbeddow / mapillary_jpg_download.py
Last active November 11, 2024 16:55
Download Mapillary images as JPGs
import mercantile, mapbox_vector_tile, requests, json, os
from vt2geojson.tools import vt_bytes_to_geojson
# define an empty geojson as output
output= { "type": "FeatureCollection", "features": [] }
# vector tile endpoints -- change this in the API request to reference the correct endpoint
tile_coverage = 'mly1_public'
# the tile layer depends on which vector tile endpoint is used:
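The preview cuts off before the actual download step. A minimal sketch of how it could continue, assuming the Mapillary Graph API image endpoint and its thumb_2048_url field; the token, image ID, and output path below are placeholders, not values from the gist (requests and os are already imported above):

ACCESS_TOKEN = 'MLY|XXXX'  # hypothetical client token
image_id = '123456789'     # hypothetical image ID taken from the tile features
header = {'Authorization': 'OAuth {}'.format(ACCESS_TOKEN)}
url = 'https://graph.mapillary.com/{}?fields=thumb_2048_url'.format(image_id)
data = requests.get(url, headers=header).json()
if 'thumb_2048_url' in data:
    image_bytes = requests.get(data['thumb_2048_url'], stream=True).content
    with open(os.path.join('.', '{}.jpg'.format(image_id)), 'wb') as handler:
        handler.write(image_bytes)
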
@cbeddow
cbeddow / mapillary_image_download.py
Last active March 14, 2024 13:07
Download all Mapillary images in a bounding box - API v4
import mercantile, mapbox_vector_tile, requests, json
from vt2geojson.tools import vt_bytes_to_geojson
# define an empty geojson as output
output= { "type": "FeatureCollection", "features": [] }
# vector tile endpoints -- change this in the API request to reference the correct endpoint
tile_coverage = 'mly1_public'
# the tile layer depends on which vector tile endpoint is used:
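The preview stops just before the tile loop. A rough sketch of how that loop typically looks, assuming the v4 vector-tile URL pattern from Mapillary's documentation and the vt2geojson layer keyword; the bounding box, zoom level, token, and layer name are illustrative placeholders:

west, south, east, north = -80.13, 25.77, -80.12, 25.78  # hypothetical bounding box
ACCESS_TOKEN = 'MLY|XXXX'  # hypothetical client token
tile_layer = 'image'       # assumed layer name for the mly1_public endpoint
for tile in mercantile.tiles(west, south, east, north, 14):
    tile_url = 'https://tiles.mapillary.com/maps/vtp/{}/2/{}/{}/{}?access_token={}'.format(
        tile_coverage, tile.z, tile.x, tile.y, ACCESS_TOKEN)
    response = requests.get(tile_url)
    data = vt_bytes_to_geojson(response.content, tile.x, tile.y, tile.z, layer=tile_layer)
    output['features'].extend(data['features'])
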

Without the linker-flag fix below, you get something like:

dyld: Library not loaded: @rpath/ffmpeg/lib/osx/libavutil.55.dylib
  Referenced from: /Users/studio/Work/Repositories/Cannula/HAPTest/bin/HAPTest.app/Contents/MacOS/HAPTest
  Reason: image not found

In Xcode > Build Settings > Other Linker Flags, for both Debug AND Release (not using AppStore), change:

@loader_path/../../../OFReleases/OF0101/addons/ofxHapPlayer/libs

@phillipi
phillipi / biggan_slerp
Last active October 8, 2023 01:25
Slerp through the BigGAN latent space
# to be used in conjunction with the functions defined here:
# https://colab.research.google.com/github/tensorflow/hub/blob/master/examples/colab/biggan_generation_with_tf_hub.ipynb
# party parrot transformation
noise_seed_A = 3 # right facing
noise_seed_B = 31 # left facing
num_interps = 14
truncation = 0.2
category = 14
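The preview ends with the interpolation settings. A small sketch of the slerp itself, using only numpy; how the latents are sampled and fed to BigGAN depends on helpers from the linked Colab notebook, so that part is only indicated in a comment:

import numpy as np

def slerp(a, b, t):
    # spherical linear interpolation between latent vectors a and b, with t in [0, 1]
    a_unit = a / np.linalg.norm(a)
    b_unit = b / np.linalg.norm(b)
    omega = np.arccos(np.clip(np.dot(a_unit, b_unit), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return (1.0 - t) * a + t * b  # nearly parallel vectors: fall back to lerp
    return (np.sin((1.0 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)

# hypothetical usage, with z_A and z_B sampled from the two seeds above:
# z_interp = np.stack([slerp(z_A, z_B, t) for t in np.linspace(0, 1, num_interps)])
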
@keijiro
keijiro / 00_blot2.md
Last active July 29, 2022 23:39
KodeLife fragment shader sketch

[gif preview]

import os, argparse
import tensorflow as tf
from tensorflow.python.framework import graph_util
dir = os.path.dirname(os.path.realpath(__file__))
def freeze_graph(model_folder):
    # We retrieve our checkpoint fullpath
    checkpoint = tf.train.get_checkpoint_state(model_folder)
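The snippet is cut off right after the checkpoint lookup. A sketch of how a TF 1.x freeze script of this shape usually continues; the output node name and the frozen-graph filename are placeholders, not taken from the original:

    # the rest is an assumption about how such a script usually proceeds (TF 1.x API)
    input_checkpoint = checkpoint.model_checkpoint_path
    output_graph = os.path.join(model_folder, 'frozen_model.pb')  # hypothetical output path
    output_node_names = 'output'  # hypothetical; must name real nodes in your graph

    saver = tf.train.import_meta_graph(input_checkpoint + '.meta', clear_devices=True)
    with tf.Session() as sess:
        saver.restore(sess, input_checkpoint)
        frozen = graph_util.convert_variables_to_constants(
            sess, tf.get_default_graph().as_graph_def(), output_node_names.split(','))
        with tf.gfile.GFile(output_graph, 'wb') as f:
            f.write(frozen.SerializeToString())
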
@dmnsgn
dmnsgn / WebGL-WebGPU-frameworks-libraries.md
Last active November 17, 2024 08:50
A collection of WebGL and WebGPU frameworks and libraries

A non-exhaustive list of WebGL and WebGPU frameworks and libraries. It is mostly for learning purposes, as some of the libraries listed are works in progress, outdated, or no longer maintained.

Engines and libraries ⚙️

Name | Stars | Last Commit | Description
three.js | … | … | …
@tado
tado / hoge.cpp
Created March 22, 2017 03:57
ofTexture to ofImage
// grab the camera texture, read it back into pixels, then build an ofImage from them
ofTexture texture = cam.getTexture();
ofPixels pixels;
texture.readToPixels(pixels);
ofImage img;
img.setFromPixels(pixels);
@ksopyla
ksopyla / ubuntu16_tensorflow_cuda8.sh
Last active March 7, 2021 16:31
How to set up tensorflow with CUDA 8 cuDNN 5.1 in virtualenv with Python 3.5 on Ubuntu 16.04 http://ksopyla.com/2017/02/tensorflow-gpu-virtualenv-python3/
# This is a shortened version of the blog post
# http://ksopyla.com/2017/02/tensorflow-gpu-virtualenv-python3/
# update packages
sudo apt-get update
sudo apt-get upgrade
# Add the PPA repo for the NVIDIA graphics driver
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
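After creating the virtualenv and installing TensorFlow, a quick sanity check (in Python, not part of the original script) that the GPU build actually sees the card:

# run inside the virtualenv once tensorflow-gpu is installed
import tensorflow as tf
from tensorflow.python.client import device_lib

print(tf.__version__)
print([d.name for d in device_lib.list_local_devices()])  # the list should include a GPU device
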
@jaron
jaron / urban-sound-cnn-salamon.py
Last active July 12, 2019 15:07
A Keras/TensorFlow implementation of the 5-layer CNN described in Salamon and Bello's paper (https://arxiv.org/pdf/1608.04363.pdf). See http://aqibsaeed.github.io/2016-09-24-urban-sound-classification-part-2/ for a description of how to create the data this uses.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Convolution2D, MaxPooling2D
from keras.optimizers import SGD
from keras.regularizers import l2, activity_l2
from keras.utils import np_utils
from sklearn import metrics
# to run this code, you'll need to load the following data:
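For orientation, a rough sketch of the 5-layer architecture (3 convolutional + 2 dense layers) as described in the paper, written against the same Keras 1.x API as the imports above; the 128x128 single-channel log-mel input, padding choices, and regularization constants are assumptions, so the gist's actual code may differ:

# rough sketch of the SB-CNN from the paper, not the gist's exact model;
# assumes image_dim_ordering 'tf' and 128x128 single-channel log-mel patches
bands, frames, num_channels, num_labels = 128, 128, 1, 10

model = Sequential()
model.add(Convolution2D(24, 5, 5, border_mode='same',
                        input_shape=(bands, frames, num_channels)))
model.add(MaxPooling2D(pool_size=(4, 2)))
model.add(Activation('relu'))

model.add(Convolution2D(48, 5, 5, border_mode='same'))
model.add(MaxPooling2D(pool_size=(4, 2)))
model.add(Activation('relu'))

model.add(Convolution2D(48, 5, 5, border_mode='valid'))
model.add(Activation('relu'))

model.add(Flatten())
model.add(Dropout(0.5))
model.add(Dense(64, W_regularizer=l2(0.001)))
model.add(Activation('relu'))

model.add(Dropout(0.5))
model.add(Dense(num_labels, W_regularizer=l2(0.001)))
model.add(Activation('softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer=SGD(lr=0.01, momentum=0.9),
              metrics=['accuracy'])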