In your command line, run the following commands:

brew doctor
brew update
from apscheduler.schedulers.background import BackgroundScheduler
from apscheduler.executors.pool import ThreadPoolExecutor
from apscheduler.jobstores.memory import MemoryJobStore
from apscheduler.job import Job
import json
import logging
from apscheduler.triggers.cron import CronTrigger
import time
from celery import Celery
from typing import List
# A comment begins with a hash sign
static.example.com be_static
www.example.com be_static
# You can add additional comments, but they must be on a new line
example.com be_static
api.example.com be_api
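This reads as an HAProxy host-to-backend map file. As a hedged illustration (the frontend name and the map path /etc/haproxy/hosts.map are assumptions, not taken from the original), such a map is typically consumed from a frontend like:

```
frontend fe_main
    bind *:80
    # Route by Host header via the map; fall back to be_static when no entry matches.
    use_backend %[req.hdr(host),lower,map(/etc/haproxy/hosts.map,be_static)]
```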
#!/usr/bin/env python3.6
import requests
import json
import re
from datetime import datetime
from datetime import timedelta

def token():
    headers = {
        'Accept': 'application/json',
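The `token()` body is truncated above, so its endpoint and payload are unknown. As a hedged sketch only (the URL and grant payload below are placeholders, not the author's actual API), headers like these are typically attached to a request as follows; using `requests.Request(...).prepare()` builds the request without sending it:

```python
import requests

# Hypothetical sketch: URL and payload are placeholders, since the original
# token() body is cut off.
headers = {
    'Accept': 'application/json',
    'Content-Type': 'application/json',
}
req = requests.Request(
    'POST', 'https://api.example.com/token', headers=headers,
    json={'grant_type': 'client_credentials'},
)
prepared = req.prepare()  # build the request without sending it
print(prepared.method, prepared.url)
print(prepared.headers['Accept'])
```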
{
  "comment": "Sample configuration for the NAT64 Jool service.",
  "instance": "default",
  "framework": "netfilter",
  "global": {
    "comment": "Sample pool6 prefix",
    "pool6": "64:ff9b::/96"
  }
}
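A configuration like this can be loaded atomically with the Jool userspace client. A hedged sketch (assumes the JSON is saved as /etc/jool/jool.conf and the jool kernel module is already loaded):

```shell
# Assumes the JSON above is saved as /etc/jool/jool.conf and the jool
# kernel module is loaded (e.g. via modprobe jool).
sudo jool file handle /etc/jool/jool.conf   # atomic load of the whole config
sudo jool global display                    # confirm pool6 shows 64:ff9b::/96
```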
Set up etcdctl using the instructions at https://github.com/etcd-io/etcd/releases/tag/v3.4.13 (changing the install path to /usr/local/bin):

Note: if you want to match the etcdctl binary to the embedded k3s etcd version, run the following curl command first and adjust ETCD_VER accordingly:
curl -L --cacert /var/lib/rancher/k3s/server/tls/etcd/server-ca.crt --cert /var/lib/rancher/k3s/server/tls/etcd/server-client.crt --key /var/lib/rancher/k3s/server/tls/etcd/server-client.key https://127.0.0.1:2379/version
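The download itself can be sketched as follows, assuming a Linux amd64 host and the standard etcd GitHub release URL layout (set ETCD_VER to the version reported by the curl command above):

```shell
# Hedged sketch: Linux amd64 assumed; adjust ETCD_VER to the reported version.
ETCD_VER=v3.4.13
curl -L "https://github.com/etcd-io/etcd/releases/download/${ETCD_VER}/etcd-${ETCD_VER}-linux-amd64.tar.gz" \
  -o /tmp/etcd-${ETCD_VER}.tar.gz
tar xzf /tmp/etcd-${ETCD_VER}.tar.gz -C /tmp
sudo mv /tmp/etcd-${ETCD_VER}-linux-amd64/etcdctl /usr/local/bin/
etcdctl version
```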
Objective of the design: use inexpensive components to produce a cleanroom-style particle counter that can run semi-continuously and be monitored from outside the cleanroom.

The PlanTower (攀藤) PMS5003 (fifth generation) or PMS7003 (seventh generation) dust sensor costs $15 to $22 on eBay from Shenzhen suppliers. Each sample covers 0.1 liter of air, with dust particles counted in >0.3 µm, >0.5 µm, >1 µm, >2.5 µm, >5 µm, and >10 µm bins. To get a useful reading in a cleanroom, we need to sum the results of 100 samples (10 liters of air); otherwise the counts are too small to be reliable. ISO 14644-1:1999 specifies counts per 1 cubic metre of air, which would require 10,000 samples from this small sensor. Since 100 samples take roughly 300 seconds to process, sampling a full 1 m³ would take 500 minutes (8.3 hours). As this device is not intended to be a NIST-traceable particle counter, sampling at 1% should give a reasonable estimate of cleanroom quality. Note that the two smallest particle bins have up to 50% uncertainty; the larger
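The sampling arithmetic above can be checked directly (all numbers are taken from the paragraph; nothing here is measured data):

```python
sample_volume_l = 0.1        # each PMS5003/PMS7003 sample covers 0.1 L
samples_per_reading = 100    # samples summed per cleanroom reading
seconds_per_reading = 300    # ~300 s to process those 100 samples

litres_per_reading = sample_volume_l * samples_per_reading   # 10 L per reading
iso_volume_l = 1000.0        # ISO 14644-1 counts per 1 m^3 = 1000 L
samples_for_1m3 = iso_volume_l / sample_volume_l             # 10,000 samples
minutes_for_1m3 = (samples_for_1m3 / samples_per_reading) * seconds_per_reading / 60

print(litres_per_reading)   # -> 10.0 L
print(samples_for_1m3)      # -> 10000.0 samples
print(minutes_for_1m3)      # -> 500.0 minutes, i.e. ~8.3 hours
```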
-- Not sure where I found this approach; probably a combination of several kind people's advice.
-- Basically we generate a number and encode it in base 62, for concise URL ids.
-- I don't recall how uniqueness is guaranteed, but I think it's because we base the ID on the
-- timestamp. You could also add a unique constraint on your DB column, of course.
CREATE SEQUENCE global_id_sequence MINVALUE 0 MAXVALUE 1023 START 0 CYCLE;
CREATE OR REPLACE FUNCTION generate_number(OUT number_id bigint) AS $$
DECLARE
    shard_id int := 1;
    start_epoch bigint := 1498490095287;
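The SQL function is cut off above. The overall scheme (widely attributed to Instagram's sharded-ID write-up) packs a timestamp, a shard id, and a per-shard sequence value into one bigint, then base62-encodes it for a short URL id. A hedged Python sketch of that scheme, with the 41/13/10-bit split assumed rather than taken from the truncated SQL:

```python
import time
import string
from typing import Optional

# Base62 alphabet: 0-9, a-z, A-Z (62 characters).
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

def base62(n: int) -> str:
    if n == 0:
        return ALPHABET[0]
    out = []
    while n:
        n, r = divmod(n, 62)
        out.append(ALPHABET[r])
    return ''.join(reversed(out))

START_EPOCH = 1498490095287  # custom epoch (ms) from the SQL above

def generate_number(seq: int, shard_id: int = 1,
                    now_ms: Optional[int] = None) -> int:
    # Assumed layout: 41 bits of ms since the custom epoch, 13 bits of shard,
    # 10 bits of per-shard sequence (hence MAXVALUE 1023 on the sequence).
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    return ((now_ms - START_EPOCH) << 23) | (shard_id << 10) | (seq % 1024)

print(base62(generate_number(seq=42)))
```

Because the high bits come from the timestamp, ids generated later are strictly larger, which is what makes the sequence's small 0..1023 range sufficient for uniqueness within one millisecond per shard.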
You are Manus, an AI agent created by the Manus team.

You excel at the following tasks:
1. Information gathering, fact-checking, and documentation
2. Data processing, analysis, and visualization
3. Writing multi-chapter articles and in-depth research reports
4. Creating websites, applications, and tools
5. Using programming to solve various problems beyond development
6. Various tasks that can be accomplished using computers and the internet