Information from the 2021 Gazetteer Files. Direct link: https://www2.census.gov/geo/docs/maps-data/data/gazetteer/2021_Gazetteer/2021_Gaz_zcta_national.zip
Previous versions on GitHub (with noteworthy discussions):
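The gazetteer download above is a tab-delimited text file. As a minimal sketch of working with it, the snippet below parses an inline sample in that layout; the column names (`GEOID`, `INTPTLAT`, `INTPTLONG`) are assumed from the Census gazetteer record layout, and the sample row values are illustrative, not taken from the real file.

```python
import csv
import io

# Hypothetical two-line sample in the gazetteer's tab-delimited layout.
# Column names are assumed from the Census record-layout documentation.
SAMPLE = (
    "GEOID\tALAND\tAWATER\tALAND_SQMI\tAWATER_SQMI\tINTPTLAT\tINTPTLONG\n"
    "00601\t166847909\t799292\t64.421\t0.309\t18.180555\t-66.749961\n"
)

def load_zcta_centroids(text):
    """Map each ZCTA code to its (lat, lon) internal-point centroid."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    return {
        row["GEOID"].strip(): (float(row["INTPTLAT"]), float(row["INTPTLONG"]))
        for row in reader
    }

centroids = load_zcta_centroids(SAMPLE)
```

In practice you would pass the contents of `2021_Gaz_zcta_national.txt` from the ZIP above instead of the inline sample.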
```python
# Setup:
# conda create -n whisper python=3.9
# conda activate whisper
# https://github.com/openai/whisper
# pip install git+https://github.com/openai/whisper.git
# Usage:
# python whisper-audio-to-text.py --audio_dir my_files --out_dir texts
import argparse
```
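The fragment above only shows the script header and first import. A sketch of how such a script could be structured is below, under stated assumptions: the flags match the usage line, the `--model` flag is an addition of mine, and the transcription loop uses the `whisper.load_model(...).transcribe(...)` API from the linked repository.

```python
import argparse
import os

def build_parser():
    # Flags mirror the usage line in the header above; --model is an assumed extra.
    p = argparse.ArgumentParser(description="Transcribe audio files with Whisper")
    p.add_argument("--audio_dir", required=True, help="directory of input audio files")
    p.add_argument("--out_dir", required=True, help="directory for output .txt files")
    p.add_argument("--model", default="base", help="Whisper model size")
    return p

def main():
    args = build_parser().parse_args()
    import whisper  # imported lazily so argument parsing works without the package
    model = whisper.load_model(args.model)
    os.makedirs(args.out_dir, exist_ok=True)
    for name in sorted(os.listdir(args.audio_dir)):
        result = model.transcribe(os.path.join(args.audio_dir, name))
        out_path = os.path.join(args.out_dir, os.path.splitext(name)[0] + ".txt")
        with open(out_path, "w") as f:
            f.write(result["text"])

if __name__ == "__main__":
    main()
```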
```json
{
  "type": "record",
  "name": "FinTransactions",
  "fields": [
    { "name": "ts", "type": "string" },
    { "name": "account_id", "type": "string" },
    { "name": "transaction_id", "type": "string" },
    { "name": "amount", "type": "int" },
    { "name": "lat", "type": "double" },
    { "name": "lon", "type": "double" }
  ]
}
```
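As a quick illustration of what this record schema promises, the sketch below parses the schema with the standard library and checks a sample transaction against its declared field types. The closing brackets and the sample record are my assumptions; a real pipeline would use an Avro library rather than this hand-rolled check.

```python
import json

# FinTransactions schema from the fragment above, with assumed closing brackets.
SCHEMA_JSON = """
{
  "type": "record",
  "name": "FinTransactions",
  "fields": [
    { "name": "ts", "type": "string" },
    { "name": "account_id", "type": "string" },
    { "name": "transaction_id", "type": "string" },
    { "name": "amount", "type": "int" },
    { "name": "lat", "type": "double" },
    { "name": "lon", "type": "double" }
  ]
}
"""

# Rough mapping from Avro primitive types to Python types.
PYTHON_TYPES = {"string": str, "int": int, "double": float}

def check_record(schema, record):
    """Return True if every schema field is present with a matching Python type."""
    return all(
        isinstance(record.get(f["name"]), PYTHON_TYPES[f["type"]])
        for f in schema["fields"]
    )

schema = json.loads(SCHEMA_JSON)
# Hypothetical transaction used only to exercise the check.
tx = {"ts": "2021-07-01T12:00:00Z", "account_id": "a1", "transaction_id": "t1",
      "amount": 1500, "lat": 35.22, "lon": -80.84}
```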
```shell
bin/pulsar-client produce --key "test1" "persistent://public/default/ack-2" -m "Test this thing 4" -n 25
```

```java
public void consumeFromPulsarAsync() throws Exception {
    PulsarClient client = PulsarClient.builder()
            .serviceUrl("pulsar://pulsar1:6650")
            .build();
    Consumer<String> consumer = client.newConsumer(Schema.STRING)
            .topic("persistent://public/default/ack-2")
            .subscriptionName("ack-2-sub") // subscription name chosen for this example
            .subscribe();
}
```
```java
import ai.djl.modality.nlp.DefaultVocabulary;
import ai.djl.modality.nlp.Vocabulary;
import ai.djl.modality.nlp.bert.BertToken;
import ai.djl.modality.nlp.bert.BertTokenizer;
import ai.djl.modality.nlp.qa.QAInput;
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDList;
import ai.djl.ndarray.NDManager;
import ai.djl.translate.Batchifier;
import ai.djl.translate.Translator;
```
```yaml
apiVersion: v1
data:
  # TODO: replace with own minifi properties
  minifi.properties: |
    # Core Properties #
    nifi.version=0.11.0
    nifi.flow.configuration.file=./conf/config.yml
    nifi.administrative.yield.duration=30 sec
    # If a component has no work to do (is "bored"), how long should we wait before checking again for work?
    nifi.bored.yield.duration=100 millis
```
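Before baking a properties file like this into a ConfigMap, it can be handy to sanity-check it programmatically. A small sketch, using only the standard library, that parses Java-style `key=value` properties (comment and blank lines skipped); the inline text repeats the properties above.

```python
def parse_properties(text):
    """Parse Java-style key=value properties, skipping comments and blanks."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

MINIFI_PROPS = """\
# Core Properties #
nifi.version=0.11.0
nifi.flow.configuration.file=./conf/config.yml
nifi.administrative.yield.duration=30 sec
nifi.bored.yield.duration=100 millis
"""
```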
```python
# Requirements: kafka-python gssapi krbticket
import os
import time
from kafka import KafkaConsumer, KafkaProducer
from krbticket import KrbConfig, KrbCommand

try:
    os.environ['KRB5CCNAME'] = '/tmp/krb5cc_<myusername>'
    kconfig = KrbConfig(principal='araujo', keytab='/path/to/<myusername>.keytab')
    KrbCommand.kinit(kconfig)
```
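Once `kinit` has obtained a ticket, the Kafka clients authenticate via GSSAPI. A sketch of the kwargs one would pass to kafka-python for that, kept as a plain dict so it can be inspected; the broker address and service name here are placeholders, not values from the snippet above.

```python
def kerberos_kafka_config(bootstrap_servers, service_name="kafka"):
    """kwargs for KafkaConsumer/KafkaProducer once a Kerberos ticket exists.

    Parameter names follow kafka-python; the broker list and service name
    are placeholders for this example.
    """
    return {
        "bootstrap_servers": bootstrap_servers,
        "security_protocol": "SASL_PLAINTEXT",
        "sasl_mechanism": "GSSAPI",
        "sasl_kerberos_service_name": service_name,
    }

cfg = kerberos_kafka_config(["broker1:9092"])
# e.g. consumer = KafkaConsumer("my-topic", **cfg)
```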
```python
import avro.schema
import io
import os
import requests
from avro.io import DatumReader, BinaryDecoder
from cachetools import TTLCache
from kafka import KafkaConsumer

# Kafka broker
BROKERS = ['cdp.52.33.201.179.nip.io:9092']
```
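The `TTLCache` import above suggests the consumer caches schemas instead of refetching them per message. A sketch of that pattern under stated assumptions: the cache size, TTL, and the `fetch` callback are illustrative, and a real version would fetch from a schema registry with `requests` and parse with `avro.schema`.

```python
from cachetools import TTLCache

# Cache fetched schemas for 5 minutes so each message doesn't hit the registry.
# maxsize and ttl are illustrative choices.
schema_cache = TTLCache(maxsize=100, ttl=300)

def get_schema(schema_id, fetch):
    """Return a schema, calling fetch(schema_id) only on a cache miss."""
    if schema_id not in schema_cache:
        schema_cache[schema_id] = fetch(schema_id)
    return schema_cache[schema_id]

# Demonstration with a fake fetcher that records how often it is called.
calls = []
def fake_fetch(sid):
    calls.append(sid)
    return {"id": sid}

first = get_schema(1, fake_fetch)
second = get_schema(1, fake_fetch)  # served from the cache, fake_fetch not called again
```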
This is a rough draft and may be updated with more examples.
GitHub was kind enough to grant me swift access to the Copilot test phase despite me @'ing them several hundred times about ICE. I would like to examine it not in terms of productivity, but security. How risky is it to allow an AI to write some or all of your code?
Ultimately, a human being must take responsibility for every line of code that is committed. AI should not be used for "responsibility washing." However, Copilot is a tool, and workers need their tools to be reliable. A carpenter doesn't have to
```python
import os
from kafka import KafkaProducer, KafkaConsumer

BOOTSTRAP_SERVERS = os.getenv("KAFKA_BOOTSTRAP_SERVERS").split(",")
TOPIC_NAME = "the-topic"
SASL_USERNAME = os.getenv("KAFKA_SASL_USERNAME")
SASL_PASSWORD = os.getenv("KAFKA_SASL_PASSWORD")

def consume():
    consumer = KafkaConsumer(
        TOPIC_NAME,
        security_protocol="SASL_SSL",
        sasl_mechanism="SCRAM-SHA-512",
        sasl_plain_username=SASL_USERNAME,
        sasl_plain_password=SASL_PASSWORD,
        bootstrap_servers=BOOTSTRAP_SERVERS,
    )
```
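The snippet defines only the consuming side. A sketch of a matching producer, with the shared SCRAM-over-TLS settings factored into one helper; the helper name and the idea of a `produce()` counterpart are my additions, and the send call needs a reachable broker.

```python
def sasl_scram_config(username, password, servers):
    """Shared kafka-python kwargs for SCRAM-SHA-512 over TLS."""
    return {
        "security_protocol": "SASL_SSL",
        "sasl_mechanism": "SCRAM-SHA-512",
        "sasl_plain_username": username,
        "sasl_plain_password": password,
        "bootstrap_servers": servers,
    }

def produce(message: bytes, username, password, servers, topic):
    # Assumed counterpart to consume(); requires a reachable broker.
    from kafka import KafkaProducer
    producer = KafkaProducer(**sasl_scram_config(username, password, servers))
    producer.send(topic, message)
    producer.flush()
```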