// Requires the gpt library from https://github.com/hrishioa/socrate and the progress bar library.
// Created by Hrishi Olickel ([email protected]) (@hrishioa). Reach out if you have trouble running this.
import { ThunkQueue } from '../../utils/simplethrottler';
import {
  AcceptedModels,
  Messages,
  askChatGPT,
  getMessagesTokenCount,
  getProperJSONFromGPT,
package auth

import (
	"encoding/json"
	"fmt"
	"io"
	"io/ioutil"
	"net"
	"net/http"
	"net/url"
Just a quickie test in Python 3 (using Requests) to see if Google Cloud Vision can be used to effectively OCR a scanned data table and preserve its structure, in the way that products such as ABBYY FineReader can OCR an image and provide Excel-ready output.
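To make that concrete, here is a minimal sketch of the kind of test involved (not the original script: the `GOOGLE_VISION_API_KEY` environment variable and the `table.png` file name are placeholders, and the request shape follows the public Cloud Vision v1 `images:annotate` REST endpoint):

```python
# Minimal sketch: send one scanned image to Cloud Vision's TEXT_DETECTION
# feature and print what it recognised. API key and file name are placeholders.
import base64
import os

import requests

API_KEY = os.environ["GOOGLE_VISION_API_KEY"]
ENDPOINT = "https://vision.googleapis.com/v1/images:annotate?key=" + API_KEY

with open("table.png", "rb") as f:
    img_b64 = base64.b64encode(f.read()).decode("ascii")

payload = {
    "requests": [{
        "image": {"content": img_b64},
        "features": [{"type": "TEXT_DETECTION"}],
    }]
}

resp = requests.post(ENDPOINT, json=payload)
resp.raise_for_status()
annotations = resp.json()["responses"][0].get("textAnnotations", [])

# annotations[0] holds the full recognised text; the remaining entries are
# individual text elements, each with a boundingPoly you can inspect.
if annotations:
    print(annotations[0]["description"])
```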
The short answer: No. While Cloud Vision provides bounding polygon coordinates in its output, it doesn't provide them at the word or region level, which is what you would need in order to calculate the data delimiters.
On the other hand, the OCR quality is pretty good if you just need to identify text anywhere in an image, without regard to its physical coordinates. I've included two examples:
### 1. A low-resolution photo of road signs
// Create Base64 Object
var Base64={_keyStr:"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=",encode:function(e){var t="";var n,r,i,s,o,u,a;var f=0;e=Base64._utf8_encode(e);while(f<e.length){n=e.charCodeAt(f++);r=e.charCodeAt(f++);i=e.charCodeAt(f++);s=n>>2;o=(n&3)<<4|r>>4;u=(r&15)<<2|i>>6;a=i&63;if(isNaN(r)){u=a=64}else if(isNaN(i)){a=64}t=t+this._keyStr.charAt(s)+this._keyStr.charAt(o)+this._keyStr.charAt(u)+this._keyStr.charAt(a)}return t},decode:function(e){var t="";var n,r,i;var s,o,u,a;var f=0;e=e.replace(/[^A-Za-z0-9\+\/\=]/g,"");while(f<e.length){s=this._keyStr.indexOf(e.charAt(f++));o=this._keyStr.indexOf(e.charAt(f++));u=this._keyStr.indexOf(e.charAt(f++));a=this._keyStr.indexOf(e.charAt(f++));n=s<<2|o>>4;r=(o&15)<<4|u>>2;i=(u&3)<<6|a;t=t+String.fromCharCode(n);if(u!=64){t=t+String.fromCharCode(r)}if(a!=64){t=t+String.fromCharCode(i)}}t=Base64._utf8_decode(t);return t},_utf8_encode:function(e){e=e.replace(/\r\n/g,"\n");var t="";for(var n=0;n<e.length;n++){var r=e.charCodeAt(n);if(r<128){t+=String.fromCharCode(r)}else if(r>127&&r<2048){t+=String.fromCharCode(r>>6|192);t+=String.fromCharCode(r&63|128)}else{t+=String.fromCharCode(r>>12|224);t+=String.fromCharCode(r>>6&63|128);t+=String.fromCharCode(r&63|128)}}return t},_utf8_decode:function(e){var t="";var n=0;var r=0,c2=0,c3=0;while(n<e.length){r=e.charCodeAt(n);if(r<128){t+=String.fromCharCode(r);n++}else if(r>191&&r<224){c2=e.charCodeAt(n+1);t+=String.fromCharCode((r&31)<<6|c2&63);n+=2}else{c2=e.charCodeAt(n+1);c3=e.charCodeAt(n+2);t+=String.fromCharCode((r&15)<<12|(c2&63)<<6|(c3&63));n+=3}}return t}};
# One-liner for counting unique IP addresses from nginx logs
# Feel free to comment with better ideas - I'm sure it's not the best way of doing this (I'm no awk ninja!)
#
# Sample output:
#
# $ cat example.com.access.log | awk -F " " '{a[$1]++ } END { for (b in a) { print b, "\t", a[b] } }'
# 66.65.145.220 49
# 92.63.28.68 126
cat example.com.access.log | awk -F " " '{a[$1]++ } END { for (b in a) { print b, "\t", a[b] } }'
echo 'export PATH=$HOME/local/bin:$PATH' >> ~/.bashrc
. ~/.bashrc
mkdir ~/local
mkdir ~/node-latest-install
cd ~/node-latest-install
curl http://nodejs.org/dist/node-latest.tar.gz | tar xz --strip-components=1
./configure --prefix=$HOME/local  # $HOME instead of ~ so the prefix expands reliably
make install # ok, fine, this step probably takes more than 30 seconds...
curl https://www.npmjs.org/install.sh | sh