Setting up Conky on Ubuntu 16.04 LTS for the Clevo P751DM2-G
System Information:
We extract this with inxi.
Installation:
sudo apt-get install inxi
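Once installed, a typical way to dump the full system summary (the flags below are standard inxi options: -F for full output, -x for extra detail, -z to filter out serial numbers and MACs):

```shell
# full system report, extra verbosity, identifiers filtered
inxi -Fxz
```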
conda uninstall --force pillow -y
# install libjpeg-turbo to $HOME/turbojpeg
git clone https://github.com/libjpeg-turbo/libjpeg-turbo
pushd libjpeg-turbo
mkdir build
cd build
cmake .. -DCMAKE_INSTALL_PREFIX:PATH=$HOME/turbojpeg
make
make install
popd
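The forced Pillow uninstall above suggests this build is meant to back a Pillow reinstall against the faster JPEG codec. A hedged sketch of that step; the pillow-simd package name and the flag values are assumptions, not from the original:

```shell
# assumption: rebuild Pillow (or pillow-simd) against $HOME/turbojpeg
CFLAGS="-I$HOME/turbojpeg/include" \
LDFLAGS="-L$HOME/turbojpeg/lib" \
pip install --no-cache-dir --force-reinstall pillow-simd
```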
# if input image is in range 0..1, please first multiply img by 255
# assume image is ndarray of shape [height, width, channels] where channels can be 1, 3 or 4
def imshow(img):
    import cv2
    import IPython.display
    _, ret = cv2.imencode('.jpg', img)
    i = IPython.display.Image(data=ret)
    IPython.display.display(i)
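The comment above says float images in 0..1 must be multiplied by 255 first; a minimal sketch of that normalization step in pure NumPy (the helper name is mine, not from the original):

```python
import numpy as np

def to_uint8(img):
    """Scale a float image in [0, 1] to uint8 in [0, 255], as imshow expects."""
    img = np.asarray(img)
    if img.dtype != np.uint8:
        img = np.clip(img * 255.0, 0, 255).astype(np.uint8)
    return img

# horizontal gradient with shape [height, width, channels]
gradient = np.linspace(0.0, 1.0, 256).reshape(1, 256, 1)
print(to_uint8(gradient).max())  # 255
```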
I originally wrote this answer on Stack Exchange, here: https://stackoverflow.com/posts/12597919/
Years later it was wrongly deleted for containing "proprietary information". I think that's bullshit, so I am posting it here. Come at me.
Amazon is an SOA system with hundreds of services (or so says Amazon Chief Technology Officer Werner Vogels). How do they handle build and release?
import numpy as np
from matplotlib import pyplot as plt

class LivePlotNotebook(object):
    """
    Live plot using %matplotlib notebook in jupyter notebook
    Usage:
    ```
    import time
# Charlton Trezevant's Zoomin DNSMasq Config - Version 1.0

# Having a large local cache speeds up subsequent DNS queries significantly (from several hundred ms to around 25-30)
# You may need to adjust this depending on the amount of free space you have
cache-size=10000

# This ensures local reverse lookup queries are never sent upstream (e.g. dig +noall +answer -x 10.0.1.1)
bogus-priv

# Names without a dot or other domain part will also not be forwarded upstream
domain-needed

# We won't need dnsmasq to overwrite the system's resolv.conf, as we have our own cache.
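With dnsmasq running under this config, the cache behaviour can be spot-checked with dig (assuming dnsmasq listens on 127.0.0.1; the second identical query should report a near-zero query time because it is answered from cache):

```shell
# first query goes upstream, second is answered from dnsmasq's cache
dig @127.0.0.1 example.com +stats | grep 'Query time'
dig @127.0.0.1 example.com +stats | grep 'Query time'
# reverse lookups for RFC 1918 space stay local thanks to bogus-priv
dig @127.0.0.1 +noall +answer -x 10.0.1.1
```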
# Inspired by https://keon.io/deep-q-learning/
import random
import gym
import math
import numpy as np
from collections import deque
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam
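In DQN implementations like the one these imports set up, the deque typically backs the replay memory. A minimal, Keras-free sketch of that buffer (the class and method names are my own, not from the original):

```python
import random
from collections import deque

class ReplayMemory:
    """Fixed-size buffer of (state, action, reward, next_state, done) tuples."""
    def __init__(self, capacity=2000):
        # deque with maxlen silently evicts the oldest transition when full
        self.buffer = deque(maxlen=capacity)

    def remember(self, state, action, reward, next_state, done):
        self.buffer.append((state, action, reward, next_state, done))

    def sample(self, batch_size):
        # uniform random minibatch, capped at the current buffer size
        return random.sample(self.buffer, min(batch_size, len(self.buffer)))
```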
"""
Create train, valid, test iterators for CIFAR-10 [1].
Easily extended to MNIST, CIFAR-100 and Imagenet.
[1]: https://discuss.pytorch.org/t/feedback-on-pytorch-for-kaggle-competitions/2252/4
"""
import torch
import numpy as np
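A train/valid split like the docstring describes usually starts from a shuffled index split that the samplers then consume. A minimal NumPy-only sketch (the function name and default fraction are assumptions):

```python
import numpy as np

def split_indices(n, valid_frac=0.1, seed=0):
    """Shuffle indices 0..n-1 and split them into train/valid index arrays."""
    rng = np.random.RandomState(seed)
    idx = rng.permutation(n)
    n_valid = int(n * valid_frac)
    # e.g. for CIFAR-10's n=50000 and valid_frac=0.1: 45000 train, 5000 valid
    return idx[n_valid:], idx[:n_valid]
```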
def DepthConversion(PointDepth, f):
    """Convert distance-from-camera-center depth to plane (z) depth, given focal length f."""
    H = PointDepth.shape[0]
    W = PointDepth.shape[1]
    # principal point (image center), in pixels; np.float is removed in modern NumPy
    i_c = float(H) / 2 - 1
    j_c = float(W) / 2 - 1
    columns, rows = np.meshgrid(np.linspace(0, W - 1, num=W), np.linspace(0, H - 1, num=H))
    DistanceFromCenter = ((rows - i_c) ** 2 + (columns - j_c) ** 2) ** 0.5
    PlaneDepth = PointDepth / (1 + (DistanceFromCenter / f) ** 2) ** 0.5
    return PlaneDepth
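A quick sanity check of the conversion above, written self-contained with the same formula inline: at the image center the two depth definitions agree, and everywhere else the plane depth is strictly smaller.

```python
import numpy as np

H, W, f = 4, 4, 2.0
point_depth = np.full((H, W), 10.0)
i_c, j_c = H / 2 - 1, W / 2 - 1
cols, rows = np.meshgrid(np.arange(W, dtype=float), np.arange(H, dtype=float))
dist = np.hypot(rows - i_c, cols - j_c)
plane_depth = point_depth / np.sqrt(1 + (dist / f) ** 2)
# at the principal point the distance term vanishes, so depths agree
print(plane_depth[1, 1])  # 10.0
```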
#!/bin/sh
path=/sys/class/backlight/intel_backlight

luminance() {
    read -r level < "$path"/actual_brightness
    read -r max < "$path"/max_brightness  # max was never read in the original
    factor=$((max / 100))
    ret=$(printf '%d\n' "$((level / factor))")
    if [ "$ret" -gt 100 ]; then
        ret=100
    fi
    printf '%s\n' "$ret"  # report the clamped percentage
}
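The integer percentage math in luminance() can be sanity-checked with sample values (the numbers are hypothetical; 600 out of a 1200 maximum should clamp to 50):

```shell
level=600
max=1200
factor=$((max / 100))    # 1200 / 100 = 12
pct=$((level / factor))  # 600 / 12 = 50
if [ "$pct" -gt 100 ]; then pct=100; fi
echo "$pct"  # prints 50
```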