A curated list of AWS resources to prepare for the AWS Certifications
A curated list of awesome AWS resources you need to prepare for all 5 AWS Certifications. This gist includes: open source repos, blogs & blog posts, ebooks, PDFs, whitepapers, video courses, free lectures, slides, sample tests, and many other resources.
The traditional technical interview process is designed to ferret out a candidate's weaknesses whereas the process should be designed to find a candidate's strengths.
No one can possibly master all of the arcana of today's technology landscape, let alone bring that mastery to bear on a problem under pressure and with no tools other than a whiteboard.
Under those circumstances, anyone can be made to look like an idiot.
The fundamental problem with the traditional technical interview process is that it is based on a chain of inference that seems reasonable but is in fact deeply flawed. That chain goes something like this:
#!/usr/bin/env bash
set -o errexit

# Required positional arguments, each with a usage hint if missing
TABLE_NAME="${1:?You must pass a TABLE_NAME as first argument}"
STARTID="${2:?You must pass a STARTID as 2nd argument}"
ENDID="${3:?You must pass an ENDID as 3rd argument}"
# Optional 4th argument caps the number of exported rows
[[ -z "$4" ]] && LIMIT="" || LIMIT="LIMIT $4"

# Load database connection credentials
. logins.sh
INSERT_COMMAND="INSERT INTO clickhouse_table(column1,column2,...) FORMAT TSV"
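The excerpt stops before the actual load step. As a rough sketch of how rows already exported as TSV could be fed to that INSERT_COMMAND, assuming clickhouse-client is on the PATH and using a hypothetical rows.tsv file and column list (none of this is taken from the original script):

import subprocess

# Hypothetical column list; the real script presumably derives the data from
# TABLE_NAME, STARTID, ENDID and LIMIT.
insert_command = "INSERT INTO clickhouse_table(column1,column2) FORMAT TSV"

# clickhouse-client reads the TSV rows on stdin and executes the INSERT
with open("rows.tsv", "rb") as tsv:
    subprocess.run(
        ["clickhouse-client", "--query", insert_command],
        stdin=tsv,
        check=True,
    )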
Latency Comparison Numbers (~2012)
----------------------------------
L1 cache reference                           0.5 ns
Branch mispredict                            5   ns
L2 cache reference                           7   ns                      14x L1 cache
Mutex lock/unlock                           25   ns
Main memory reference                      100   ns                      20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy             3,000   ns        3 us
Send 1K bytes over 1 Gbps network       10,000   ns       10 us
Read 4K randomly from SSD*             150,000   ns      150 us          ~1GB/sec SSD
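As a quick sanity check, the multipliers quoted in the table follow directly from the raw nanosecond figures; a minimal sketch in Python using only the values above:

# Latency figures from the table above, in nanoseconds (~2012 numbers)
L1_CACHE_NS = 0.5
L2_CACHE_NS = 7
MAIN_MEMORY_NS = 100
SSD_RANDOM_READ_NS = 150_000

print(f"L2 cache vs L1 cache:    {L2_CACHE_NS / L1_CACHE_NS:.0f}x")           # 14x
print(f"Main memory vs L1 cache: {MAIN_MEMORY_NS / L1_CACHE_NS:.0f}x")        # 200x
print(f"SSD random read vs RAM:  {SSD_RANDOM_READ_NS / MAIN_MEMORY_NS:.0f}x") # 1500x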
import anthropic

# Construct the client; here the key is hard-coded purely for illustration.
client = anthropic.Anthropic(
    api_key="my_api_key",
)
message = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1000,
    temperature=0,
    messages=[
        {"role": "user", "content": "Hello, Claude"}  # placeholder prompt
    ],
)
# The response content is a list of blocks; the first one holds the generated text.
print(message.content[0].text)
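Hard-coding the key is only for the sake of the snippet; as a minimal sketch of the more common setup, the SDK falls back to the ANTHROPIC_API_KEY environment variable when no api_key argument is given:

import anthropic

# Assumes ANTHROPIC_API_KEY is set in the environment; the SDK reads it by default.
client = anthropic.Anthropic()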