Tawn Kramer (tawnkramer) · San Diego, CA
#!/usr/bin/env python3
"""
Usage:
    simple_cv_racer.py --name=<your_name>

Options:
    -h --help    Show this screen.
"""
@tawnkramer
tawnkramer / racer.py
Last active September 24, 2021 10:43
"""
Script to drive a keras TF model with the Virtual Race Environment.
Usage:
racer.py (--model=<model>) (--host=<ip_address>) (--name=<car_name>)
Options:
-h --help Show this screen.
"""
* Linux (Ubuntu) only for now
Get donkeycar if you don't have it. If you already have it, just check out dev. (A quick import check follows these setup steps.)
* cd ~/projects
* git clone https://github.com/autorope/donkeycar
* cd donkeycar
* git checkout master
Install conda (optional but recommended):
* wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
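Not shown in the truncated preview: once donkeycar is installed into the environment, a quick check that the package imports (assumes a recent release, which defines __version__):

import donkeycar
print(donkeycar.__version__)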
"""
Usage:
train_imagenet.py --model="mymodel.h5" --data="/data/ImageNetDir" --resume
Note:
The idea here is to pre-train a network on ImageNet (or some other large corpus) and then transfer the weights.
I used https://github.com/mf1024/ImageNet-Datasets-Downloader.git to create a large dataset:
`cd ImageNet-Datasets-Downloader
python downloader.py -data_root /data/ImageNetData -number_of_classes 1000 -images_per_class 1000`
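The transfer step itself is not in the preview; a minimal sketch, assuming the ImageNet-pretrained model is saved as a Keras .h5 file whose convolutional base is reused and whose classification head is replaced (the layer index, sizes, and two-output head below are illustrative, not this script's):

import tensorflow as tf

# Load the model pre-trained by train_imagenet.py (path is a placeholder).
pretrained = tf.keras.models.load_model("mymodel.h5")

# Reuse everything up to the classification head and freeze it.
backbone = tf.keras.Model(inputs=pretrained.input,
                          outputs=pretrained.layers[-2].output)
backbone.trainable = False

# New head for the downstream task, e.g. steering/throttle regression.
x = tf.keras.layers.Dense(64, activation="relu")(backbone.output)
out = tf.keras.layers.Dense(2, name="steering_throttle")(x)
model = tf.keras.Model(inputs=backbone.input, outputs=out)
model.compile(optimizer="adam", loss="mse")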
CAMERA_TYPE = "CSIC"          # CSI camera via GStreamer (e.g. Jetson Nano)
PCA9685_I2C_BUSNUM = 1        # I2C bus the PCA9685 PWM servo driver is on
CONTROLLER_TYPE = 'F710'      # Logitech F710 gamepad
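These read like donkeycar myconfig.py overrides; a small sketch of how such overrides get picked up, assuming a recent donkeycar where load_config accepts a myconfig argument (as manage.py uses it):

import donkeycar as dk

# Loads config.py defaults, then applies myconfig.py overrides like the three above.
cfg = dk.load_config(myconfig="myconfig.py")
print(cfg.CAMERA_TYPE, cfg.PCA9685_I2C_BUSNUM, cfg.CONTROLLER_TYPE)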
@tawnkramer
tawnkramer / keras_quant.py
Created May 3, 2019 22:55 — forked from rocking5566/keras_quant.py
Quantization aware training in keras
import numpy as np
import tensorflow as tf
from tensorflow.keras.datasets import mnist
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation, Conv2D, Flatten
from tensorflow.keras.optimizers import RMSprop
# mnist.load_data() downloads MNIST to '~/.keras/datasets/' on first call.
# Shapes: x_train (60000, 28, 28), y_train (60000,); x_test (10000, 28, 28), y_test (10000,).
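The fork's preview stops at the imports; a minimal sketch of the quantization-aware training recipe it presumably follows, assuming TensorFlow 1.x where tf.contrib.quantize can rewrite the Keras graph with fake-quantization nodes (layer sizes and hyperparameters here are illustrative):

(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype("float32") / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype("float32") / 255.0
y_train, y_test = to_categorical(y_train, 10), to_categorical(y_test, 10)

model = Sequential([
    Conv2D(8, 3, activation="relu", input_shape=(28, 28, 1)),
    Flatten(),
    Dense(10),
    Activation("softmax"),
])

# TF 1.x only: insert fake-quant ops so training simulates int8 inference.
tf.contrib.quantize.create_training_graph(
    input_graph=tf.keras.backend.get_session().graph, quant_delay=0)

model.compile(optimizer=RMSprop(), loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, batch_size=128, epochs=1, validation_data=(x_test, y_test))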
@tawnkramer
tawnkramer / intel path follow redme.txt
Last active April 25, 2019 03:29
Intel RealSense T265 Path follower on Donkey
Setup on the pi:
git clone https://github.com/tawnkramer/donkey donkey_tkramer
cd donkey_tkramer
git checkout dev
pip3 uninstall donkeycar
pip3 install .[pi]
donkey createcar --path ~/follow --template path_follower
cd ~/follow
python3 manage.py drive
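Not part of the gist: the path_follower template gets its position from the T265 pose stream; a minimal standalone sketch of reading that pose directly with pyrealsense2 (assumes librealsense and its Python bindings are installed on the Pi):

import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)   # T265 publishes 6-DoF pose at 200 Hz
pipe.start(cfg)
try:
    for _ in range(200):
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            d = pose.get_pose_data()
            # Translation is in meters, relative to where the camera started.
            print("x=%.3f y=%.3f z=%.3f" % (d.translation.x, d.translation.y, d.translation.z))
finally:
    pipe.stop()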
adapted from http://wiki.tekkotsu.org/index.php/Sony_PlayStation_Eye_driver_install_instructions
sudo -s
cd /usr/src
apt-get update
apt-get install -y build-essential kernel-package linux-source libssl-dev
tar --bzip2 -xvf linux-source-*.tar.bz2
ln -s `find . -maxdepth 1 -type d -name "linux-source-*"` linux
# PS3 Eye Drivers
'''
# Install and setup:
# sudo apt update && sudo apt install pigpio python3-pigpio
# sudo systemctl start pigpiod
'''
import os
import donkeycar as dk
from donkeycar.parts.controller import PS3JoystickController
from donkeycar.parts.actuator import PWMSteering, PWMThrottle
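The rest of the drive script is cut off; a rough sketch of how these parts typically plug into a donkeycar vehicle loop (the PCA9685 channels, pulse values, and loop rate below are placeholders, not this script's actual settings):

from donkeycar.parts.actuator import PCA9685

V = dk.vehicle.Vehicle()

# The joystick part emits steering/throttle plus drive mode and recording flag.
ctr = PS3JoystickController(throttle_scale=0.25, steering_scale=1.0)
V.add(ctr, outputs=['user/angle', 'user/throttle', 'user/mode', 'recording'], threaded=True)

# One PCA9685 channel drives the steering servo, another the ESC.
steering = PWMSteering(controller=PCA9685(1), left_pulse=460, right_pulse=290)
throttle = PWMThrottle(controller=PCA9685(0), max_pulse=500, zero_pulse=370, min_pulse=220)
V.add(steering, inputs=['user/angle'])
V.add(throttle, inputs=['user/throttle'])

V.start(rate_hz=20)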
"""A demo to classify opencv camera stream with google coral tpu device."""
import argparse
import io
import time
import numpy as np
import cv2
import edgetpu.classification.engine
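The preview ends at the imports; a rough sketch of the classification loop such a demo runs, assuming the legacy edgetpu Python API where ClassificationEngine.classify_with_input_tensor takes a flattened uint8 RGB array (the model path and camera index are placeholders):

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('--model', required=True, help='.tflite model compiled for the Edge TPU')
    args = parser.parse_args()

    engine = edgetpu.classification.engine.ClassificationEngine(args.model)
    _, height, width, _ = engine.get_input_tensor_shape()

    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # The model expects an RGB frame resized to its input size, flattened to uint8.
            rgb = cv2.cvtColor(cv2.resize(frame, (width, height)), cv2.COLOR_BGR2RGB)
            start = time.time()
            results = engine.classify_with_input_tensor(rgb.flatten(), top_k=3)
            print('%.1f ms' % ((time.time() - start) * 1000.0), results)
    finally:
        cap.release()

if __name__ == '__main__':
    main()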