
Periklis Papanikolaou (admariner)

  • Thessaloniki
@admariner
admariner / gist:a37ebf708e7c1c97e83b8575d575fe96
Created March 21, 2024 08:28 — forked from boydnorwood/gist:e19086c77c477b8ad32f00d0c1247add
SQL for generating a basic rankings report in Data Studio with data from your Nozzle workspace
--Top Ranking URLs Report for Nozzle.io
--Data Studio Template can be found here: https://datastudio.google.com/u/1/reporting/359d4414-0cd6-4da1-8df5-2c6908e0ddec/page/pyxcB
WITH
-- find the latest versioned keyword data
-- this can also be used to pin a query to an older version, good for static reports
latest_keyword_source_versions AS (
SELECT keyword_source_id, MAX(keyword_source_version_id) AS keyword_source_version_id
FROM nozzledata.nozzle_nozzleofficial.keywords
WHERE keyword_source_id=930701976723823
@admariner
admariner / finetune_llama2.py
Created March 20, 2024 17:41 — forked from mlabonne/finetune_llama2.py
Easy Llama 2 fine-tuning script (📝 Article: https://tinyurl.com/finetunellama2)
# Based on younesbelkada/finetune_llama_v2.py
# Install the following libraries:
# pip install accelerate==0.21.0 peft==0.4.0 bitsandbytes==0.40.2 transformers==4.31.0 trl==0.4.7 scipy
from dataclasses import dataclass, field
from typing import Optional
import torch
from datasets import load_dataset
from transformers import (
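The preview cuts off mid-import; below is a minimal, hedged sketch of the kind of QLoRA fine-tuning run the full script performs with the pinned library versions noted above. The model name, dataset, and training arguments are illustrative assumptions, not values taken from the gist.

# Hedged sketch of a QLoRA fine-tune in the spirit of finetune_llama2.py.
# Model/dataset names and hyperparameters below are assumptions.
import torch
from datasets import load_dataset
from peft import LoraConfig
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    TrainingArguments,
)
from trl import SFTTrainer

model_name = "meta-llama/Llama-2-7b-hf"  # assumption
dataset = load_dataset("timdettmers/openassistant-guanaco", split="train")  # assumption

# Load the base model in 4-bit so it fits on a single GPU
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    model_name, quantization_config=bnb_config, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token

# Train small LoRA adapters instead of updating all model weights
peft_config = LoraConfig(r=64, lora_alpha=16, lora_dropout=0.1, task_type="CAUSAL_LM")

trainer = SFTTrainer(
    model=model,
    train_dataset=dataset,
    peft_config=peft_config,
    dataset_text_field="text",
    max_seq_length=512,
    tokenizer=tokenizer,
    args=TrainingArguments(
        output_dir="./results",
        per_device_train_batch_size=4,
        num_train_epochs=1,
        logging_steps=10,
    ),
)
trainer.train()
trainer.model.save_pretrained("./llama2-qlora-adapter")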
@admariner
admariner / cron.route.ts
Created March 16, 2024 23:20 — forked from adrianhajdin/cron.route.ts
Web Scraping Full Course 2023 | Build and Deploy eCommerce Price Tracker
import { NextResponse } from "next/server";
import { getLowestPrice, getHighestPrice, getAveragePrice, getEmailNotifType } from "@/lib/utils";
import { connectToDB } from "@/lib/mongoose";
import Product from "@/lib/models/product.model";
import { scrapeAmazonProduct } from "@/lib/scraper";
import { generateEmailBody, sendEmail } from "@/lib/nodemailer";
export const maxDuration = 300; // This function can run for a maximum of 300 seconds
export const dynamic = "force-dynamic";
# Must have conda installed
# It costs approximately $0.2 (in GPT-4 API fees) to generate one example with analysis and design, and around $2.0 for a full project.
conda create -n metagpt python=3.11.4
conda activate metagpt
npm --version # to check you have npm installed
# optional: install node if you don't have it
npm install -g @mermaid-js/mermaid-cli
git clone https://github.com/geekan/metagpt
cd metagpt
{
  "basics": {
    "name": "Pavlos Hatziapostolou",
    "label": "Architectural Engineer",
    "email": "[email protected]",
    "phone": "6946507797",
    "summary": "Creative and results-driven Architectural Engineer with over 20 years of comprehensive experience in designing, supervising, and executing a wide range of architectural projects, from residential renovations to healthcare facilities construction.",
    "location": {
      "address": "Filikis Etairias 30",
      "postalCode": "54621",
@admariner
admariner / phantomjs_horseman_dynamic_page_crawling.js
Created November 14, 2023 15:48 — forked from dineshsprabu/phantomjs_horseman_dynamic_page_crawling.js
[NodeJS] Dynamic Page Crawling with PhantomJS and Horseman
var Horseman = require('node-horseman');
var horseman = new Horseman();
horseman
.open('http://httpbin.org/ip')
.setProxy('http://api-key:@proxy.crawlera.com:8010') //change the proxy before use.
.html('body')
.then(function(body) {
console.log(body);
});
@admariner
admariner / logflare_to_common_log_format.js
Created November 14, 2023 10:26 — forked from dwsmart/logflare_to_common_log_format.js
Logflare to Common Log Format nodejs Script
// require libs
// run npm install @google-cloud/bigquery
const { BigQuery } = require('@google-cloud/bigquery');
const fs = require('fs');
// BigQuery Config - see https://cloud.google.com/docs/authentication/production#create_service_account
const options = {
keyFilename: '{path_to_key_file}',
projectId: '{project_id}',
};
@admariner
admariner / google_analytics_bigquery_channel_grouping_function_advanced_ga4.sql
Define custom Channel Groupings in a reusable "User Defined Function" (UDF) to make your life easier when working with Google Analytics 4 data in BigQuery. Full article on stacktonic.com
-- Author: Krisjan Oldekamp
-- https://stacktonic.com/article/google-analytics-4-and-big-query-create-custom-channel-groupings-in-a-reusable-sql-function
create or replace function `<your-project>.<your-dataset>.channel_grouping`(tsource string, medium string, campaign string) as (
case
when (tsource = 'direct' or tsource is null)
and (regexp_contains(medium, r'^(\(not set\)|\(none\))$') or medium is null)
then 'direct'
when regexp_contains(campaign, r'^(.*shop.*)$')
and regexp_contains(medium, r'^(.*cp.*|ppc|paid.*)$')
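The preview stops partway through the CASE expression, but the finished UDF is meant to be called directly inside queries over the GA4 export. A minimal sketch of doing that from Python with google-cloud-bigquery follows; the project, dataset, and table names are placeholder assumptions, and the function must already exist as created above.

# Hedged sketch: call the channel_grouping UDF against a GA4 export table.
# Project, dataset, and table names below are placeholder assumptions.
from google.cloud import bigquery

client = bigquery.Client(project="your-project")

sql = """
SELECT
  `your-project.your_dataset.channel_grouping`(
    traffic_source.source,
    traffic_source.medium,
    traffic_source.name
  ) AS channel_group,
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `your-project.analytics_123456.events_*`
GROUP BY channel_group
ORDER BY users DESC
"""

df = client.query(sql).to_dataframe()
print(df.head())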
@admariner
admariner / google_analytics_bigquery_channel_attribution_build_models.py
Get actionable insights on your channel performance by building custom attribution models using Google Analytics 4 data in BigQuery. Full article on stacktonic.com
###################################################
# Author Krisjan Oldekamp / Stacktonic.com
# Email [email protected]
# Article https://stacktonic.com/article/build-a-data-driven-attribution-model-using-google-analytics-4-big-query-and-python
####################################################
#pip install marketing_attribution_models
#pip install --upgrade 'google-cloud-bigquery[bqstorage,pandas]'
#pip install pyarrow -> newest version!
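The preview ends at the install comments; a minimal sketch of the next step follows, loading conversion-path data from BigQuery into pandas and handing it to the marketing_attribution_models package. The query, table, and column names are illustrative assumptions, and the MAM constructor arguments follow the package README rather than this specific script.

# Hedged sketch: pull journey data into pandas and fit attribution models.
# Table and column names are assumptions, not the article's actual query.
from google.cloud import bigquery
from marketing_attribution_models import MAM

client = bigquery.Client(project="your-project")

sql = """
SELECT
  user_pseudo_id,
  session_start,                 -- session timestamp
  channel_grouping AS channels,  -- e.g. output of the channel_grouping UDF
  has_transaction                -- TRUE if the session converted
FROM `your-project.your_dataset.ga4_sessions`
ORDER BY user_pseudo_id, session_start
"""
df = client.query(sql).to_dataframe()

# Constructor arguments follow the package README; treat them as assumptions.
model = MAM(
    df,
    group_channels=True,
    channels_colname="channels",
    journey_with_conv_colname="has_transaction",
    group_channels_by_id_list=["user_pseudo_id"],
    group_timestamp_colname="session_start",
    create_journey_id_based_on_conversion=True,
)

model.attribution_last_click()
model.attribution_markov(transition_to_same_state=False)
print(model.group_by_channels_models)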