Takahiro "Poly" Horikawa (@thorikawa)

@SeeJayDee
SeeJayDee / tiny_IRremote.cpp
Last active June 24, 2024 03:36
tiny_IRremote - Arduino IRremote ported to the ATtiny
/*
* tiny_IRremote
* Version 0.2 July, 2016
* Christian D'Abrera
* Fixed what was originally rather broken code from http://www.gammon.com.au/Arduino/
* ...itself based on work by Ken Shirriff.
*
* This code was tested for both sending and receiving IR on an ATtiny85 DIP-8 chip.
* IMPORTANT: IRsend only works from PB4 ("pin 4" according to Arduino). You will need to
* determine which physical pin this corresponds to for your chip, and connect your transmitter
@y-takagi
y-takagi / DOCUMENT.md
Last active February 23, 2025 05:45
How to persist data on iOS

How to save data on iOS

This post explains the iOS file system and shows how to persist data, including to iCloud. Note that the sample code may not run as-is, so treat it only as reference material.

iOS File System

The places where an app can interact with the file system are almost entirely restricted to directories inside the app's sandbox. When a new app is installed, the installer creates several containers inside the sandbox, arranged as shown in Figure 1. Each container has a role: the Bundle Container holds the app's bundle, and the Data Container holds both the app's and the user's data. The Data Container is further divided into several directories by purpose. An app can also request access to additional containers at runtime, such as the iCloud Container.

Figure 1. An iOS app operating within its own sandbox

@daitomanabe
daitomanabe / neural-style-gpu-in-ec2.markdown
Last active June 1, 2016 00:57
Easy setup for neural-style with CUDA 7.5 + cuDNN 5 in EC2

# neural-style with CUDA 7.5 + cuDNN 5 in EC2

  • If you want to skip installing the Nvidia driver and CUDA 7.5, use this Ubuntu 14 AMI:
    https://aws.amazon.com/marketplace/pp/B01EYKBEQ0
    (Nvidia drivers, CUDA 7.5 Toolkit, cuDNN 4, TensorFlow, and Jupyter pre-installed, to leverage Nvidia GRID instances)
    Choose g2.2xlarge or better (a GPU instance).
    Don't forget to increase the size of your root partition.

  • In case you want to install everything by yourself

@yohhoy
yohhoy / ff2cv.cpp
Last active December 14, 2024 12:44
Read video frame with FFmpeg and convert to OpenCV image
/*
* Read video frame with FFmpeg and convert to OpenCV image
*
* Copyright (c) 2016 yohhoy
*/
#include <iostream>
#include <vector>
// FFmpeg
extern "C" {
#include <libavformat/avformat.h>
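
The preview above ends at the FFmpeg headers. As a rough Python analogue of the same goal (reading video frames into OpenCV images), using OpenCV's FFmpeg-backed VideoCapture rather than the libav* C API the gist demonstrates, a sketch might look like this; the input path is a placeholder:

```python
import cv2

# Sketch only: read frames via OpenCV's FFmpeg-backed VideoCapture,
# not the libav* C API used in ff2cv.cpp. "input.mp4" is a placeholder.
cap = cv2.VideoCapture("input.mp4")
if not cap.isOpened():
    raise RuntimeError("failed to open video")

while True:
    ok, frame = cap.read()   # frame is a BGR numpy array (the cv::Mat equivalent)
    if not ok:
        break                # end of stream or read error
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```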
@nobnak
nobnak / UVWorld.cs
Last active August 17, 2023 08:40
UV -> World Position (For Unity)
using UnityEngine;

public class UVWorld : MonoBehaviour {
    Mesh _mesh;
    Vector3[] _vertices;
    int[] _triangles;
    Vector2[] _uvs;
    Vector3[] _normals;
    Triangle2D[] _uvTris;

    void Awake() {
        _mesh = GetComponent<MeshFilter>().sharedMesh;
@alex-roman
alex-roman / openresty-ubuntu-install.sh
Last active February 28, 2023 14:19
Easy OpenResty install (used and tested on Ubuntu 14.04, 15.10, and 16.04)
#!/bin/bash
apt-get -y update
apt-get -y install nginx-extras build-essential libpcre3-dev libssl-dev libgeoip-dev libpq-dev libxslt1-dev libgd2-xpm-dev
wget -c https://openresty.org/download/openresty-1.13.6.2.tar.gz
tar zxvf openresty-1.13.6.2.tar.gz
cd openresty-1.13.6.2
./configure \
--sbin-path=/usr/sbin/nginx \
--conf-path=/etc/nginx/nginx.conf \
@eelstork
eelstork / mixamoToBlenderBoneNames.py
Created September 20, 2015 06:30
Convert Mixamo rig bone names (as imported to Blender via FBX) to standard Blender bone names. This is especially useful if your mesh uses the 'mirror' modifier. 1 - Backup your Blend. 2 - In the action editor, disconnect any animation connected to the rig. 3 - Paste the script in a text window and select "run script". 4 - Notice that all bone names are…
# IMPORTANT: make sure no animation is assigned
# to the rig before you run this script,
# otherwise linked animation data will be corrupted.
import bpy
# ----------------------------------
# Mixamo left/right bone names start with 'Left'/'Right'
# Instead we apply Blender's standard .L/.R suffix
# and get rid of long suffix
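
The preview cuts off before the renaming loop itself. As a minimal sketch of the idea (not the gist's actual code), assuming the armature is the active object and bone names follow the Mixamo pattern such as "mixamorig:LeftArm":

```python
import bpy

# Minimal sketch of the Left/Right -> .L/.R renaming idea; not the gist's code.
# Assumes the armature is the active object and names look like "mixamorig:LeftArm".
armature = bpy.context.object.data

for bone in armature.bones:
    name = bone.name.split(":")[-1]              # drop a "mixamorig:"-style prefix if present
    if name.startswith("Left"):
        bone.name = name[len("Left"):] + ".L"    # e.g. LeftArm -> Arm.L
    elif name.startswith("Right"):
        bone.name = name[len("Right"):] + ".R"   # e.g. RightArm -> Arm.R
    else:
        bone.name = name
```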
@genekogan
genekogan / _Instructions.md
Last active September 21, 2024 10:33
instructions for generating a style transfer animation from a video

Instructions for making a Neural-Style movie

The following instructions are for creating your own animations using the style transfer technique described by Gatys, Ecker, and Bethge, and implemented by Justin Johnson. To see an example of such an animation, see this video of Alice in Wonderland re-styled by 17 paintings.

Setting up the environment

The easiest way to set up the environment is to simply load Samim's pre-built Terminal.com snap or use another cloud service like Amazon EC2. Unfortunately the g2.2xlarge GPU instances cost $0.99 per hour, and depending on the parameters selected, it may take 10-15 minutes to produce a 512px-wide image, so it can cost $2-3 to generate 1 sec of video at 12fps.

If you do load the
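
The rest of the gist walks through extracting frames, stylizing each one, and reassembling the video. A minimal sketch of that loop is shown below, assuming ffmpeg is on the PATH and Justin Johnson's neural-style is installed as a Torch script; the file names, the 12 fps rate, and the flag values are illustrative, not taken from the gist:

```python
import glob
import os
import subprocess

# Illustrative sketch only: split a video into frames, stylize each frame,
# then reassemble. Paths and parameters are assumptions.
os.makedirs("frames", exist_ok=True)
os.makedirs("styled", exist_ok=True)

# 1. Split the source video into frames at 12 fps.
subprocess.run(["ffmpeg", "-i", "input.mp4", "-vf", "fps=12",
                "frames/frame_%04d.png"], check=True)

# 2. Run style transfer on every frame (slow: minutes per frame on a g2.2xlarge).
for frame in sorted(glob.glob("frames/frame_*.png")):
    out = os.path.join("styled", os.path.basename(frame))
    subprocess.run(["th", "neural_style.lua",
                    "-content_image", frame,
                    "-style_image", "style.jpg",
                    "-output_image", out,
                    "-image_size", "512",
                    "-gpu", "0"], check=True)

# 3. Reassemble the stylized frames into a video.
subprocess.run(["ffmpeg", "-framerate", "12", "-i", "styled/frame_%04d.png",
                "-pix_fmt", "yuv420p", "output.mp4"], check=True)
```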

@eruffaldi
eruffaldi / arucomarker.py
Last active May 18, 2021 08:31
Multiple Aruco PDF Marker generator
import cairo,argparse,random
#TEST: https://jcmellado.github.io/js-aruco/getusermedia/getusermedia.html
#http://terpconnect.umd.edu/~jwelsh12/enes100/markergen.html
#http://terpconnect.umd.edu/~jwelsh12/enes100/markers.js
markers_opts = [[False,True,True,True,True],[False,True,False,False,False]
,[True,False,True,True,False],[True,False,False,False,True]];
import string
digs = string.digits + string.letters
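
The preview stops before the drawing code. A rough sketch of how a single marker could be rendered to PDF with pycairo, reusing the markers_opts row table above, follows; the True-means-black-cell reading, the most-significant-bits-first row order, and the sizes are assumptions rather than details from the gist:

```python
import cairo

# Rough sketch, not the gist's code: draw one classic-style marker into a PDF.
# markers_opts is the row-codeword table from the preview; True is read as a
# black cell. Cell size and page layout are arbitrary choices.
markers_opts = [[False, True, True, True, True], [False, True, False, False, False],
                [True, False, True, True, False], [True, False, False, False, True]]

def draw_marker(ctx, marker_id, x0, y0, cell):
    # 7x7 grid: a 1-cell black border around the 5x5 data area.
    ctx.set_source_rgb(0, 0, 0)
    ctx.rectangle(x0, y0, 7 * cell, 7 * cell)
    ctx.fill()
    for row in range(5):
        bits = (marker_id >> (2 * (4 - row))) & 0b11  # 2 data bits per row, MSB first (assumed)
        word = markers_opts[bits]
        for col in range(5):
            if not word[col]:                         # False -> white cell
                ctx.set_source_rgb(1, 1, 1)
                ctx.rectangle(x0 + (col + 1) * cell, y0 + (row + 1) * cell, cell, cell)
                ctx.fill()

cell = 20  # points
surface = cairo.PDFSurface("marker_42.pdf", 9 * cell, 9 * cell)
ctx = cairo.Context(surface)
draw_marker(ctx, 42, cell, cell, cell)  # leave a 1-cell quiet zone around the marker
surface.finish()
```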
@faithandbrave
faithandbrave / emscripten_cmake_build.md
Last active June 11, 2024 21:14
How to build with Emscripten and CMake

How to build with Emscripten and CMake

Emscripten is an LLVM-based compiler that compiles C++ to JavaScript.

This document explains how to build a C++ project for Emscripten using CMake, without tying the steps to any particular project.

Versions

The versions of each tool covered in this document are as follows: