Thomas Wood (tomas-wood)

@tomas-wood
tomas-wood / faster_rcnn_service.py
Created December 26, 2018 18:15
Prototype for a PyTorch-based faster-rcnn service.
#! /usr/bin/env python2.7
# -*- coding: utf-8 -*-
# file: faster_rcnn_service.py
# author: Thomas Wood
# email: [email protected]
# description: A service to run faster-rcnn.pytorch on an input image.
from __future__ import print_function
# Import _init_paths to set up sys.path to import modules from faster-rcnn.
@tomas-wood
tomas-wood / top_20_relations.txt
Created December 11, 2018 23:11
top 20 most frequently occurring relations in visual genome
(2030, 'building', 'has', 'window'),
(1703, 'man', 'has', 'hair'),
(1437, 'man', 'has', 'hand'),
(1334, 'man', 'has', 'head'),
(1294, 'woman', 'has', 'hair'),
(959, 'man', 'has', 'arm'),
(853, 'tree', 'has', 'leaf'),
(675, 'sky', 'has', 'cloud'),
(656, 'cat', 'has', 'ear'),
(653, 'man', 'has', 'leg'),
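A list like this can be produced with a `Counter` over (subject, predicate, object) triples. Here's a minimal sketch with toy data, since the actual extraction from Visual Genome's relationship JSON isn't shown in the gist:

```python
from collections import Counter

# Hypothetical input: one (subject, predicate, object) triple per
# relationship instance, as extracted from Visual Genome annotations.
triples = [
    ('building', 'has', 'window'),
    ('man', 'has', 'hair'),
    ('man', 'has', 'hair'),
    ('building', 'has', 'window'),
    ('building', 'has', 'window'),
]

# Count identical triples and reshape into (count, subj, pred, obj)
# tuples like the ones listed above.
counts = Counter(triples)
top = [(n,) + t for t, n in counts.most_common(20)]
print(top[:2])  # [(3, 'building', 'has', 'window'), (2, 'man', 'has', 'hair')]
```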
@tomas-wood
tomas-wood / bugs.md
Last active December 3, 2018 22:14
Looking at the bugs that crop up in the scene-graph demo
@tomas-wood
tomas-wood / pytorch_rocm_error.txt
Created November 24, 2018 10:27
Error message back from clang-7 when following [these instructions](https://github.com/ROCmSoftwarePlatform/pytorch/wiki/Building-PyTorch-for-ROCm)
Scanning dependencies of target caffe2_hip
[ 83%] Linking HIP shared library ../lib/libcaffe2_hip.so
Stack dump:
0. Program arguments: /opt/rocm/hcc/bin/llc -O2 -mtriple amdgcn--amdhsa-amdgiz -mcpu=gfx803 -filetype=obj -o /tmp/tmp.dO38OsZ6ZU/caffe2_hip_generated_ReduceOpsKernel.cu.kernel.bc-gfx803.isabin /tmp/tmp.dO38OsZ6ZU/caffe2_hip_generated_ReduceOpsKernel.cu.kernel.bc-gfx803.isabin.opt.bc
1. Running pass 'CallGraph Pass Manager' on module '/tmp/tmp.dO38OsZ6ZU/caffe2_hip_generated_ReduceOpsKernel.cu.kernel.bc-gfx803.isabin.opt.bc'.
2. Running pass 'Simple Register Coalescing' on function '@_ZN2at6native13reduce_kernelILi512ENS0_8ReduceOpIsZNS0_15sum_kernel_implIssEEvRNS_14TensorIteratorEEUlssE_EEEEvT0_'
/opt/rocm/hcc/bin/llc(_ZN4llvm3sys15PrintStackTraceERNS_11raw_ostreamE+0x2a)[0x15554ba]
/opt/rocm/hcc/bin/llc(_ZN4llvm3sys17RunSignalHandlersEv+0x4c)[0x15537ec]
/opt/rocm/hcc/bin/llc[0x1553957]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x11390)[0x7fe7b73bd390]
@tomas-wood
tomas-wood / single_node
Last active November 14, 2018 22:15
node description
{u'adjacencies': [{u'data': {u'category': u'image-to-region',
u'updated_on': u'13-Nov-2018 23:23:24'},
u'nodeTo': u'https://cs.stanford.edu/people/rak248/VG_100K_2/4887.jpg'},
{u'data': {u'category': u'region-to-label',
u'updated_on': u'13-Nov-2018 23:23:24'},
u'nodeTo': u'potted plant'}],
u'data': {u'category': u'region', u'updated_on': u'13-Nov-2018 23:23:24'},
u'id': u'1cba0c4a06bb4080a7ca06b6b2555aca',
u'name': u'1cba0c4a06bb4080a7ca06b6b2555aca'}
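Pulling the edge targets out of a node record like the one above is straightforward. A minimal sketch (the node dict is copied from the sample, with the Python 2 `u''` prefixes dropped; the `neighbors` helper is mine, not part of the gist):

```python
# Sample node record from the gist above (u'' prefixes dropped for Python 3).
node = {
    'adjacencies': [
        {'data': {'category': 'image-to-region',
                  'updated_on': '13-Nov-2018 23:23:24'},
         'nodeTo': 'https://cs.stanford.edu/people/rak248/VG_100K_2/4887.jpg'},
        {'data': {'category': 'region-to-label',
                  'updated_on': '13-Nov-2018 23:23:24'},
         'nodeTo': 'potted plant'},
    ],
    'data': {'category': 'region', 'updated_on': '13-Nov-2018 23:23:24'},
    'id': '1cba0c4a06bb4080a7ca06b6b2555aca',
    'name': '1cba0c4a06bb4080a7ca06b6b2555aca',
}

def neighbors(node, category=None):
    """Return the nodeTo targets, optionally filtered by edge category."""
    return [adj['nodeTo'] for adj in node.get('adjacencies', [])
            if category is None or adj['data']['category'] == category]

print(neighbors(node, category='region-to-label'))  # ['potted plant']
```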
@tomas-wood
tomas-wood / woooo_buddy.txt
Last active October 22, 2018 21:47
Error from trying a docker image of graph_net.
2018-10-22 21:44:44.331954: E tensorflow/stream_executor/cuda/cuda_blas.cc:464] failed to create cublas handle: CUBLAS_STATUS_NOT_INITIALIZED
2018-10-22 21:44:44.399573: E tensorflow/stream_executor/cuda/cuda_blas.cc:464] failed to create cublas handle: CUBLAS_STATUS_NOT_INITIALIZED
2018-10-22 21:44:44.399625: W tensorflow/stream_executor/stream.cc:2089] attempting to perform BLAS operation using StreamExecutor without BLAS support
E1022 21:44:44.403764 140092477048576 tf_logging.py:105] Blas GEMM launch failed : a.shape=(3, 12), b.shape=(12, 10), m=3, n=10, k=12
[[{{node edge_block_1/mlp/linear_0/MatMul}} = MatMul[T=DT_FLOAT, transpose_a=false, transpose_b=false, _device="/job:localhost/replica:0/task:0/device:GPU:0"](edge_block_1/concat, edge_block/mlp/linear_0/w/read)]]
Caused by op u'edge_block_1/mlp/linear_0/MatMul', defined at:
File "blocks_test.py", line 1103, in <module>
tf.test.main()
File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/platform/test.py", line 64, in main
@tomas-wood
tomas-wood / data_ingestion_contd.md
Created October 19, 2018 22:53
DATA INGESTION CONTINUED

I've been thinking about it, and if we're given CSV flat files, it shouldn't be that difficult to come up with an automated ingestion method that doesn't require a schema, as long as the CSV files have header rows. If we're ingesting from a SQL table, I can't see a way around having the user define a schema, because the columns in the table aren't necessarily ordered in any user-defined way. Pretty sure the automated way of getting columns from a SQL table returns the column names in alphabetical order. So we would either need users to name their columns something specific, or, more simply, just have them provide us a schema.
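A minimal sketch of the header-driven case, assuming the CSV files really do carry header rows — `csv.DictReader` treats the header row as an implicit schema, so no user-supplied one is needed:

```python
import csv
import io

def infer_schema(csv_text):
    """Read a CSV with a header row and return (columns, rows).
    The header row *is* the schema; no user-defined schema needed."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = list(reader)
    return reader.fieldnames, rows

columns, rows = infer_schema("name,age,city\nada,36,london\nalan,41,manchester\n")
print(columns)          # ['name', 'age', 'city']
print(rows[0]['city'])  # london
```

The SQL case is exactly where this breaks down: there's no header row to lean on, so the column-to-meaning mapping has to come from the user.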

My thoughts on the MLoG module are that it will serve to make it easier for people in industry to convert their tabular data into a graph format and leverage powerful Graph Machine Learning methods on their structured graph data. But it isn't just ingestion

@tomas-wood
tomas-wood / GES_API_HOW_TO.md
Last active October 18, 2018 22:02
GES API HOW TO

Hey! So you want to use the GES API to do gremlin queries? Well we can help you with that!

We're going to assume you have an account and have already uploaded your graph to GES. This walkthrough will help you get programmatic access to that graph from anywhere in the world.

Generate a token
Use this code and enter your own region, username, password, and domain_name.

import urllib3
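The preview cuts off after the import, so here's a sketch of what the token request might look like. The endpoint pattern and payload layout below are assumptions based on the OpenStack-style v3 password auth that GES's cloud uses — confirm both against the GES/IAM documentation for your region:

```python
import json

def build_token_request(region, username, password, domain_name):
    """Build the URL and JSON body for an OpenStack-style v3 token request.

    NOTE: the endpoint pattern and payload layout are assumptions for
    illustration; check them against the GES/IAM docs for your region.
    """
    url = 'https://iam.%s.myhuaweicloud.com/v3/auth/tokens' % region  # assumed endpoint
    body = {
        'auth': {
            'identity': {
                'methods': ['password'],
                'password': {
                    'user': {
                        'name': username,
                        'password': password,
                        'domain': {'name': domain_name},
                    }
                },
            },
            'scope': {'domain': {'name': domain_name}},
        }
    }
    return url, body

def fetch_token(region, username, password, domain_name):
    """POST the request; the token comes back in the X-Subject-Token header."""
    import urllib3  # third-party, as in the gist's snippet
    url, body = build_token_request(region, username, password, domain_name)
    http = urllib3.PoolManager()
    resp = http.request('POST', url,
                        body=json.dumps(body),
                        headers={'Content-Type': 'application/json'})
    return resp.headers.get('X-Subject-Token')
```

You'd then pass the returned token in an `X-Auth-Token` header on subsequent GES API calls.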
@tomas-wood
tomas-wood / process_tables.md
Last active October 18, 2018 20:25
Processing tables and turning them into graphs.

Process Tables and Turn them Into Graphs!

We want to be able to take the data that people give us in CSV files and turn it into graphs.

Some approaches to doing this and my thoughts/feelings/opinions on each:

  1. DON'T
    Just don't do it. Let them get the data into a graph format we know and love, like a graph on GES we can access through an API. This is an extreme case, but I feel it merits thinking about. Processing data is hard to automate, since data is generally noisy and we don't want to trust ourselves to handle every corner case and user-input problem that might arise. So we can just say
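For contrast with option 1, here's roughly how small the happy path is when we *do* process tables ourselves — a sketch that treats two caller-named columns as source and target nodes. The column convention and adjacency-dict output are illustrative assumptions, not a real MLoG interface:

```python
import csv
import io
from collections import defaultdict

def csv_to_graph(csv_text, src_col, dst_col):
    """Turn CSV rows into an adjacency dict: one edge per row,
    src_col -> dst_col. Assumes a header row; raises KeyError otherwise."""
    graph = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        graph[row[src_col]].append(row[dst_col])
    return dict(graph)

edges = csv_to_graph("person,city\nada,london\nalan,manchester\nada,paris\n",
                     'person', 'city')
print(edges)  # {'ada': ['london', 'paris'], 'alan': ['manchester']}
```

The happy path is tiny; the noisy-data corner cases (missing headers, mixed types, bad encodings) are the part that's hard to automate, which is the whole argument for option 1.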