Models | Examples |
---|---|
Display ads | Yahoo! |
Search ads | |
require 'bcrypt'

module EasyAuth
  # http://techspeak.plainlystated.com/2010/03/drop-dead-simple-authentication-for.html
  # To generate a crypted password (in irb):
  #   require 'easy_auth'
  #   EasyAuth.encrypt_password('my_password') # Put returned array in AUTHORIZED_USERS
  AUTHORIZED_USERS = {
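The gist is cut off at the AUTHORIZED_USERS hash; a minimal sketch of how the module might continue, assuming bcrypt-backed hashing (the entries and method body are my guesses, not the linked post's):

    # 'username' => value returned by EasyAuth.encrypt_password
  }

  # The post's comment says it returns an array (digest plus salt); bcrypt
  # embeds its salt in the digest, so a single string works in this sketch.
  def self.encrypt_password(password)
    BCrypt::Password.create(password).to_s
  end
end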
[
  {
    "code": "AAA",
    "lat": "-17.3595",
    "lon": "-145.494",
    "name": "Anaa Airport",
    "city": "Anaa",
    "state": "Tuamotu-Gambier",
    "country": "French Polynesia",
    "woeid": "12512819",
server {
    listen 80;
    root /var/www/craft.dev/public;
    index index.php index.html index.htm;
    server_name craft.dev;

    location / {
        try_files $uri $uri/ @rewrites;
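    }

    # The config is truncated above, before the @rewrites location that
    # try_files points at; the standard completion for a Craft CMS site is
    # shown here as a sketch, not taken from this particular file:
    location @rewrites {
        rewrite ^/(.*)$ /index.php?p=$1;
    }
}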
Here are a few common tasks you might do in your templates, as they would be written in ExpressionEngine vs. Craft CMS.
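For instance, looping over the latest entries (a sketch assuming a channel/section named "news" and Craft 3's Twig API):

{!-- ExpressionEngine --}
{exp:channel:entries channel="news" limit="10"}
  <h2>{title}</h2>
{/exp:channel:entries}

{# Craft CMS (Twig) #}
{% for entry in craft.entries.section('news').limit(10).all() %}
  <h2>{{ entry.title }}</h2>
{% endfor %}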
#!/bin/bash
# A simple script to back up an organization's GitHub repositories.
GHBU_BACKUP_DIR=${GHBU_BACKUP_DIR-"github-backups"}   # where to place the backup files
GHBU_ORG=${GHBU_ORG-"<CHANGE-ME>"}                    # the GitHub organization whose repos will be backed up
GHBU_API=${GHBU_API-"https://api.github.com"}         # base URI for the GitHub API
GHBU_GITHOST=${GHBU_GITHOST-"<CHANGE-ME>.github.com"} # the GitHub hostname (see comments)
# I recommend using an API token so it is easily trackable and revocable.
# Note that you MUST have SSH keys set up for a user with access to all repos.
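# (The script is truncated here. A hedged sketch of the core loop follows;
#  GHBU_API_TOKEN and the grep/cut parsing are my assumptions, not the
#  original script's exact code.)
REPOLIST=$(curl --silent -H "Authorization: token ${GHBU_API_TOKEN}" \
  "${GHBU_API}/orgs/${GHBU_ORG}/repos?per_page=100" \
  | grep '"name"' | cut -d '"' -f 4)
for REPO in $REPOLIST; do
  git clone --quiet --mirror "git@${GHBU_GITHOST}:${GHBU_ORG}/${REPO}.git" \
    "${GHBU_BACKUP_DIR}/${REPO}.git"
done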
# POST a JSON file and redirect output to stdout
wget -q -O - --header="Content-Type: application/json" --post-file=foo.json http://127.0.0.1

# Download a complete website (-m already implies -r and -l inf)
wget -m -r -l inf -k -p -q -E -e robots=off http://127.0.0.1

# But this shorter form may be sufficient
wget -mpk http://127.0.0.1

# Download all images of a website
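# (No command followed this comment; a plausible one is below, with the
#  accept list as my assumption -- extend -A to match the image types you need.)
wget -r -l 1 -nd -A jpg,jpeg,png,gif -e robots=off -q http://127.0.0.1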
Audience: I assume you've heard of ChatGPT, maybe played with it a little, and were impressed by it (or tried very hard not to be). And that you've also heard that it is "a large language model". And maybe that it "solved natural language understanding". Here is a short personal perspective on my thoughts about this (and similar) models, and where we stand with respect to language understanding.
Around 2014-2017, right within the rise of neural-network based methods for NLP, I was giving a semi-academic-semi-popsci lecture, revolving around the story that achieving perfect language modeling is equivalent to being as intelligent as a human. Somewhere around the same time I was also asked in an academic panel "what would you do if you were given infinite compute and no need to worry about labour costs" to which I cockily responded "I would train a really huge language model, just to show that it doesn't solve everything!". We
:root {
  --font-main: "Inter", system-ui, "Segoe UI", Roboto, Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol";
  --font-lufga: "Inter", system-ui, "Segoe UI", Roboto, Helvetica, Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol";
}

.theme_dark, .theme_moon_dark, .theme_moon_dark_conditional {
  --app-bg: #181715;
  --page-text: #F2E6D7;
  --app-text: #F2E6D7;
  --primary: #F2E6D7;
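}

/* A sketch of how these custom properties would typically be consumed;
   the selector below is illustrative, not from the original stylesheet. */
body {
  font-family: var(--font-main);
  background-color: var(--app-bg);
  color: var(--app-text);
}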