Nigel (nigelheap)
nigelheap / fingerprint-example.php
Last active September 25, 2024 07:19
fingerprint-example
<?php
/** Template Name: Payment template */

if ( ! defined( 'ABSPATH' ) ) {
    exit; // Exit if accessed directly.
}

function generateNABTransactFingerprint( $vendorName, $paymentReference, $paymentAlert, $products, $merchantTransactionPassword )
{
    // Body truncated in the gist preview. A NAB Transact fingerprint is
    // typically a SHA-1 hash of pipe-joined transaction fields, e.g.
    // sha1( implode( '|', array( $vendorName, $merchantTransactionPassword, /* ... */ ) ) );
}
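The truncated function above presumably pipe-joins the transaction fields and hashes them. A minimal sketch of that pattern in Python, with illustrative field names and ordering only (the real field order comes from the NAB Transact integration guide):

```python
import hashlib

def nab_style_fingerprint(fields):
    """SHA-1 of the '|'-joined fields -- the general shape of a
    NAB Transact fingerprint. Field order here is illustrative."""
    return hashlib.sha1("|".join(fields).encode("utf-8")).hexdigest()

# Hypothetical values: vendor, transaction password, reference, amount, timestamp.
fingerprint = nab_style_fingerprint(
    ["VENDOR", "txn-password", "REF-001", "100.00", "20240925121314"]
)
print(len(fingerprint))  # a SHA-1 hex digest is always 40 characters
```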
@nigelheap
nigelheap / disable-site-health.php
Created August 2, 2019 06:50
Disable the WordPress Site Health screen, just in case your boss sees it and gets lost down a rabbit hole of pointlessness
<?php
// Remove the Site Health entry from the admin menu.
add_action( 'admin_menu', function() {
    remove_submenu_page( 'tools.php', 'site-health.php' );
});

// Block the Site Health screen itself (truncated in the gist preview;
// e.g. redirect away from it on the `current_screen` hook).
@nigelheap
nigelheap / custom-path.php
Created March 11, 2018 20:50
Creating a custom page in WordPress from a template part, with no wp_head(), wp_footer() or theme
<?php
/**
 * Class CustomPage_Frontend
 *
 * Serves a template part at a custom path, bypassing the theme entirely.
 */
class CustomPage_Frontend {

    private static $instance = null;

    public $path = 'custompage';

    // Remainder of the class truncated in the gist preview.
}
<?php
ini_set( 'display_errors', 1 );
ini_set( 'display_startup_errors', 1 );
error_reporting( E_ALL );

// Query howsmyssl.com for a report on this server's outbound TLS support.
$ch = curl_init( 'https://www.howsmyssl.com/a/check' );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$data = curl_exec( $ch );
curl_close( $ch );

var_dump( json_decode( $data, true ) );
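A similar check can be approximated locally with Python's standard `ssl` module. This is a sketch, not the howsmyssl API, and it reports only the protocol version the client negotiates:

```python
import socket
import ssl

def negotiated_tls_version(host, port=443, timeout=10):
    """Open a TLS connection to `host` and return the protocol version
    the handshake settled on (e.g. "TLSv1.3")."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()

# Requires network access:
# negotiated_tls_version("www.howsmyssl.com")
```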
<?php
// Re-send the WooCommerce "failed order" email for a specific order.
$order  = wc_get_order( 34152 );
$mailer = WC()->mailer();
$mails  = $mailer->get_emails();

if ( ! empty( $mails ) ) {
    foreach ( $mails as $mail ) {
        if ( 'failed_order' === $mail->id ) {
            $mail->trigger( $order->get_id() );
        }
    }
}
# arg1 : bucket
# arg2 : max sub folders / days
total=$(s3cmd ls "s3://$1/" | wc -l)

# remove old folders if more than max
if [ "$total" -gt "$2" ]
then
    # clean up s3 backups (loop body truncated in the gist preview;
    # e.g. `s3cmd del --recursive` on the oldest folders)
    s3cmd ls "s3://$1/" | while read -r line; do
        :
    done
fi
nigelheap / drush-s3cmd.sh
Last active July 5, 2017 23:44 — forked from chrisfree/drush-s3cmd.sh
Backup a Drupal site to Amazon S3 using Drush
# arg1 : bucket
# arg2 : project path
# arg3 : backup path
# arg4 : max s3 backups to keep

# Switch to the docroot.
cd "$2"

# Backup the database (truncated in the gist preview; e.g. `drush sql-dump`
# gzipped into the backup path, then pushed to S3 with s3cmd).
nigelheap / Robots Environment .htaccess
Last active June 30, 2017 22:08 — forked from chadclark/Robots Environment .htaccess
Robots.txt for staging and production. Adding these rewrite rules to your .htaccess serves robots_dev.txt as robots.txt on any non-production (dev or UAT) host.
RewriteEngine On
# The conditions must be OR-ed; without [OR] they are AND-ed and can never all match.
RewriteCond %{HTTP_HOST} \.dev$ [NC,OR]
RewriteCond %{HTTP_HOST} ^dev\. [NC,OR]
RewriteCond %{HTTP_HOST} \.uat$ [NC,OR]
RewriteCond %{HTTP_HOST} ^uat\. [NC]
RewriteRule ^robots\.txt$ robots_dev.txt [L]
nigelheap / made-by-itomic.js
Created June 14, 2017 04:36
Made by itomic
if (typeof console !== "undefined") {
    window.console.log.apply(console, [
        '\n %c Made by Itomic %c %c %c http://www.itomic.com.au %c \n\n',
        'color: #fff; background: #1dbeea; padding:5px 0;',
        'background: #1dbeea; padding:5px 0;',
        'background: #222222; padding:5px 0;',
        'color: #fff; background: #222222; padding:5px 0;',
        'background: #fff; padding:5px 0;'
    ]);
}
<?php
// Stop selected plugins from phoning home to the wordpress.org update check.
add_filter( 'http_request_args', function( $r, $url ) {
    if ( 0 === strpos( $url, 'https://api.wordpress.org/plugins/update-check/1.1/' ) ) {
        $blocked_plugins = array(
            'plugin_name_1',
            'plugin_name_2',
            'plugin_name_3',
        );
        // Remainder truncated in the gist preview: typically each blocked
        // plugin is removed from the serialised `plugins` field in $r['body']
        // before the request goes out.
    }
    return $r;
}, 10, 2 );