Umar Hansa (umaar)
A crawler is the tool a search engine uses to find and navigate through webpages. Search engines can discover pages more easily when the href attribute of an anchor element is present and contains a crawlable hyperlink.

How the Lighthouse crawlable links audit fails

Lighthouse flags anchor elements with uncrawlable hyperlinks:

<insert screenshot here>

Lighthouse flags the following anchor elements as being uncrawlable:

<a class="duf3 aciXEb" href="#" id="sbfblt" data-async-trigger="duf3-46" jsaction="async.u" data-ved="0ahUKEwisrODf1t7oAhWHy6QKHYmbAEsQtw8IDA">Report inappropriate predictions</a>
<a href="#" data-async-trigger="ddlshare" title="Share" jsaction="async.u" data-ved="0ahUKEwisrODf1t7oAhWHy6QKHYmbAEsQ4zgIEA"><div class="dnNgC" style="background-color:#ffffff;border:2px solid #ffffff;opacity:0.800000011920929"></div><input value="//g.co/doodle/w4sr6" class="ddl-shortlink" type="hidden"><input value="//g.co/doodle/1duj2" class="ddl-facebooklink" type="hidden"><input value="//g.co/doodle/gqtwe" class="ddl-twitterlink" type="hidden"><input value="//g.co/doodle/s4vq2" class="ddl-emaillink" type="hidden"><input value="" class="ddl-copylink" type="hidden"><img class="fJOQGe" src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABYAAAAWCAYAAADEtGw7AAAAoUlEQVR42mMwnmnMQAZ2AGIQfADFM6FicDXkGnoGiP+j4TPIhpNj8EwshsLwTEoMfoDH4AeDzmBnIL5MalDgi2kZIF6JZMAtYiMPV0yfhVrwGcoH0RVA7EJ
umaar / setup.sh
Created January 28, 2019 19:04 — forked from bradp/setup.sh
New Mac Setup Script
echo "Creating an SSH key for you..."
ssh-keygen -t rsa
echo "Please add this public key to Github \n"
echo "https://github.com/account/ssh \n"
read -p "Press [Enter] key after this..."
echo "Installing xcode-stuff"
xcode-select --install
umaar / README.md
Created January 25, 2019 12:22 — forked from novemberborn/README.md
AVA throwsAsync transform for jscodeshift
umaar / perceptron.js
Created September 15, 2018 00:14 — forked from primaryobjects/perceptron.js
A simple example of a perceptron (neural network) in JavaScript. See https://jsfiddle.net/qu960cc2/1/
function Perceptron(opts) {
  if (!opts) opts = {}
  var debug = 'debug' in opts ? opts.debug : false;
  var weights = 'weights' in opts
    ? opts.weights.slice()
    : []
  var threshold = 'threshold' in opts

Convert a Google Takeout Location History archive from KML to GPX and split it into one file per day:

gpsbabel -i kml -f Location\ History.kml -o gpx -F out.gpx
gpsbabel -t -i gpx -f out.gpx -x track,merge,pack,split,title="ACTIVE LOG # %Y%m%d" -o gpx -F split.gpx
python2 gpxsplitter.py split.gpx
umaar / subreddit-face-average.js
Created March 14, 2018 00:09 — forked from vincentriemer/subreddit-face-average.js
Subreddit Face Averaging Script
import * as fs from "fs";
import * as path from "path";
import * as url from "url";
import { exec } from "child_process";
import cuid from "cuid";
import snoowrap from "snoowrap";
import throat from "throat";
import mmm from "mmmagic";
import axios from "axios";
// Connect to the Chrome DevTools Protocol WebSocket for one page. The page ID comes from
// the webSocketDebuggerUrl listed at http://localhost:9222/json (Chrome started with --remote-debugging-port=9222).
const devtools = new WebSocket('ws://localhost:9222/devtools/page/69990451-aaab-4ef8-87b1-ea77b8101b2a');
devtools.onmessage = ({data}) => {
  const {result: {result: {value}}} = JSON.parse(data);
  console.log('WebSocket Message Received: ', value);
};
devtools.onopen = () => {
  // The original snippet is truncated here; the expression below is a placeholder.
  devtools.send(JSON.stringify({
    id: 1,
    method: 'Runtime.evaluate',
    params: {expression: 'document.title'}
  }));
};
const http = require('http');
function requestHandler(request, response) {
  // Server-Timing values use the older draft syntax: name=duration; "description"
  const headers = {
    'Server-Timing': `
      lb=18; "Load Balancer",
      server-3=104; "Server #3 Startup",
      db-read=187; "Database Read",
      aws-download=317; "AWS Content Download",
      db-write=218; "Database Write"
    `.replace(/\n/g, '')
  };
  response.writeHead(200, headers);
  response.end('Response with Server-Timing headers');
}
// The original snippet is truncated above; the server setup below is an assumption.
http.createServer(requestHandler).listen(3000);