Chinnakorn (ChinnakornP)

  • Chinnakorn.dev
  • Thailand
import * as React from 'react'
import { PayPalButton } from 'react-paypal-button-v2'
import Typography from '@material-ui/core/Typography'
import { PAYMENT_GATEWAYS, YG_EMAIL_BASE } from 'config'
import { Checkout as CheckoutInterface } from 'lib/queries/generatedTypes/Checkout'
interface PaypalOrderParams {
ChinnakornP / API.md
Created May 25, 2021 04:24 — forked from iros/API.md
Documenting your REST API

Title

<Additional information about your API call. Try to use verbs that match both request type (fetching vs modifying) and plurality (one vs multiple).>

  • URL

    <The URL Structure (path only, no root url)>

  • Method:
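
A worked example of the template's visible fields (the endpoint and wording below are illustrative, not part of the gist preview):

Show User

Returns JSON data about a single user.

  • URL

    /users/:id

  • Method:

    GET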

ChinnakornP / tmux.md
Created March 3, 2021 15:34 — forked from andreyvit/tmux.md
tmux cheatsheet

(C-x means ctrl+x, M-x means alt+x)

Prefix key

The default prefix is C-b. If you (or your muscle memory) prefer C-a, you need to add this to ~/.tmux.conf:

# remap prefix to Control + a
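# (the preview cuts the snippet off after the comment above; the lines below
#  are the standard C-a remap for a stock tmux, reconstructed as a sketch)
set-option -g prefix C-a
unbind-key C-b
bind-key C-a send-prefix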

ChinnakornP / print.js
Created January 31, 2020 14:42 — forked from vodolaz095/print.js
Print for NodeJS using CUPS backend
var ipp = require('ipp'); // get it from https://npmjs.org/package/ipp - $ npm install ipp
var request = require('request'); // get it from https://npmjs.org/package/request - $ npm install request
var fs = require('fs');

function getPrinterUrls(callback) {
  var CUPSurl = 'http://localhost:631/printers'; // todo - change if you have CUPS running on another host
  request(CUPSurl, function (error, response, body) {
    if (!error && response.statusCode == 200) {
      var printersMatches = body.match(/<TR><TD><A HREF="\/printers\/([a-zA-Z0-9-^"]+)">/gm); // i know, this is terrible, sorry(
      var printersUrls = [];
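      // (the preview stops above; a plausible completion of the callback,
      //  assuming each match looks like <TR><TD><A HREF="/printers/NAME">)
      if (printersMatches) {
        printersMatches.forEach(function (match) {
          var name = match.match(/\/printers\/([^"]+)/)[1]; // pull NAME out of the HREF
          printersUrls.push(CUPSurl + '/' + name);
        });
      }
      callback(null, printersUrls);
    } else {
      callback(error || new Error('CUPS answered with HTTP ' + response.statusCode), null);
    }
  });
}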
ChinnakornP / wget.md
Created August 16, 2018 03:04 — forked from simonw/wget.md
Recursive wget ignoring robots
$ wget -e robots=off -r -np 'http://example.com/folder/'
  • -e robots=off causes wget to ignore robots.txt for that domain
  • -r makes the download recursive
  • -np ("no parent") stops it from following links up to the parent folder