Original blog: https://blog.miguelgrinberg.com/post/it-s-time-for-a-change-datetime-utcnow-is-now-deprecated
Example for new timezone-aware objects:

```python
from datetime import datetime, timezone

def aware_utcnow():
    return datetime.now(timezone.utc)
```
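A quick check that the replacement carries timezone info (the ISO timestamp will of course vary):

```python
from datetime import datetime, timezone

def aware_utcnow():
    return datetime.now(timezone.utc)

now = aware_utcnow()
print(now.tzinfo)       # UTC
print(now.isoformat())  # e.g. 2024-01-01T12:00:00.123456+00:00
```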
```shell
redis-cli keys \* | while read -r key; do value="$(redis-cli get "$key")"; echo "$key: $value"; done
```
Just run the following command:

```shell
cat ~/.vscode/extensions/extensions.json | jq -r ".[] | .identifier.id"
```
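If `jq` is not available, the same extraction can be done with Python's `json` module. A minimal sketch, assuming the default VS Code extensions path used by the command above:

```python
import json
from pathlib import Path

def extension_ids(raw: str) -> list[str]:
    """Extract extension identifiers from extensions.json content."""
    return [entry["identifier"]["id"] for entry in json.loads(raw)]

if __name__ == "__main__":
    # Default VS Code extensions file; adjust the path if yours differs.
    path = Path.home() / ".vscode" / "extensions" / "extensions.json"
    if path.exists():
        for ext_id in extension_ids(path.read_text()):
            print(ext_id)
```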
The script will create a ZIP file following the AWS-documented packaging process.
The resulting ZIP file can then be uploaded to S3, from where you can create the Lambda function via the AWS CLI, the AWS Console, or AWS CloudFormation. Just supply the S3 URL of the ZIP file and your Lambda function will be created.
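The packaging step can be sketched in Python with the standard-library `zipfile` module. The names below are illustrative, and it assumes dependencies were already installed into a `package/` directory (e.g. with `pip install -r requirements.txt -t package`):

```python
import os
import zipfile

def build_lambda_zip(package_dir: str, handler_file: str, zip_path: str) -> None:
    """Create a Lambda deployment ZIP: installed dependencies at the
    archive root, plus the handler script itself."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(package_dir):
            for name in files:
                full = os.path.join(root, name)
                # Lambda expects dependencies at the top level of the archive.
                zf.write(full, os.path.relpath(full, package_dir))
        zf.write(handler_file, os.path.basename(handler_file))
```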
Place a `requirements.txt` file in the same directory as the Python script.

```
[2023-12-30T07:12:55Z TRACE hyper::client::pool] checkout waiting for idle connection: ("https", github.com)
[2023-12-30T07:12:55Z DEBUG reqwest::connect] starting new connection: https://github.com/
[2023-12-30T07:12:55Z TRACE hyper::client::connect::http] Http::connect; scheme=Some("https"), host=Some("github.com"), port=None
[2023-12-30T07:12:55Z DEBUG hyper::client::connect::dns] resolving host="github.com"
[2023-12-30T07:12:55Z DEBUG hyper::client::connect::http] connecting to 140.82.121.3:443
[2023-12-30T07:12:55Z DEBUG hyper::client::connect::http] connected to 140.82.121.3:443
[2023-12-30T07:12:55Z TRACE hyper::client::conn] client handshake Http1
[2023-12-30T07:12:55Z TRACE hyper::client::client] handshake complete, spawning background dispatcher task
[2023-12-30T07:12:55Z TRACE hyper::proto::h1::conn] flushed({role=client}): State { reading: Init, writing: Init, keep_alive: Busy }
[2023-12-30T07:12:55Z TRACE hyper::client::pool] checkout dropped for ("https", github.com)
```
People

| Emoji | Code |
|---|---|
|  | `:bowtie:` |
| 😄 | `:smile:` |
| 😆 | `:laughing:` |
| 😊 | `:blush:` |
| 😃 | `:smiley:` |
| ☺️ | `:relaxed:` |
| 😏 | `:smirk:` |
| 😍 | `:heart_eyes:` |
| 😘 | `:kissing_heart:` |
| 😚 | `:kissing_closed_eyes:` |
| 😳 | `:flushed:` |
| 😌 | `:relieved:` |
| 😆 | `:satisfied:` |
| 😁 | `:grin:` |
| 😉 | `:wink:` |
| 😜 | `:stuck_out_tongue_winking_eye:` |
| 😝 | `:stuck_out_tongue_closed_eyes:` |
| 😀 | `:grinning:` |
| 😗 | `:kissing:` |
| 😙 | `:kissing_smiling_eyes:` |
| 😛 | `:stuck_out_tongue:` |
```python
import logging
import requests

# These two lines enable debugging at httplib level (requests -> urllib3 -> http.client).
# You will see the REQUEST, including HEADERS and DATA, and the RESPONSE with HEADERS but without DATA.
# The only thing missing will be the response.body, which is not logged.
try:
    import http.client as http_client
except ImportError:
    # Python 2
    import httplib as http_client
http_client.HTTPConnection.debuglevel = 1

logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
requests_log = logging.getLogger("requests.packages.urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True
```
1. Go to https://cachedview.com/
2. Navigate to the deleted repo, e.g. https://webcache.googleusercontent.com/search?q=cache:https://github.com/apcera/termtables
3. Copy the latest known commit SHA-1.
A simple way to get AWS Service Quotas using Python
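The body of this note appears truncated. A minimal sketch of such a lookup using boto3's Service Quotas API (`list_service_quotas`), written so the pagination logic can be exercised without AWS credentials; the `"lambda"` service code is just one example:

```python
def list_quotas(client, service_code):
    """Collect {quota_name: value} for one service, following NextToken pagination."""
    quotas = {}
    kwargs = {"ServiceCode": service_code}
    while True:
        resp = client.list_service_quotas(**kwargs)
        for quota in resp["Quotas"]:
            quotas[quota["QuotaName"]] = quota["Value"]
        token = resp.get("NextToken")
        if not token:
            return quotas
        kwargs["NextToken"] = token

if __name__ == "__main__":
    import boto3  # assumes AWS credentials are configured

    sq = boto3.client("service-quotas")
    for name, value in list_quotas(sq, "lambda").items():
        print(f"{name}: {value}")
```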
by Danny Quah, May 2020 (revised Jan 2022)
Through the Embed instruction or plugin, Gist snippets on GitHub can conveniently provide posts on Medium, WordPress, and elsewhere with supplementary information (lines of code, images, Markdown-created tables, and so on). But while Gist snippets can be managed directly in the browser or through something like [Gisto][], a user might also wish to manipulate them offline, for many of the same reasons that a user clones a git repo to their local filesystem, modifies it locally, and only then pushes changes back up to GitHub.
Here's how to do this:
Create the gist on GitHub and then clone it to your local filesystem:
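A sketch of the clone-and-push cycle; `<gist_id>` is a placeholder for the hex ID shown in the gist's URL (GitHub exposes every gist as a git repo under gist.github.com):

```shell
# Clone the gist like any other repo (placeholder ID, substitute your own).
git clone https://gist.github.com/<gist_id>.git my-gist
cd my-gist

# Edit files locally, then push the changes back up to GitHub:
git commit -am "Update gist"
git push origin master
```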