"""A quick benchmark comparing the performance of: | |
- msgspec: https://github.com/jcrist/msgspec | |
- pydantic V1: https://docs.pydantic.dev/1.10/ | |
- pydantic V2: https://docs.pydantic.dev/dev-v2/ | |
The benchmark is modified from the one in the msgspec repo here: | |
https://github.com/jcrist/msgspec/blob/main/benchmarks/bench_validation.py | |
I make no claims that it's illustrative of all use cases. I wrote this up |
""" | |
A simple implementation of GeoJSON (RFC 7946) using msgspec | |
(https://jcristharif.com/msgspec/) for parsing and validation. | |
The `loads` and `dumps` methods work like normal `json.loads`/`json.dumps`, | |
but: | |
- Will result in high-level GeoJSON types | |
- Will error nicely if a field is missing or the wrong type | |
- Will fill in default values for optional fields |
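A minimal sketch of what such an interface could look like (this is not the
gist's actual code; the geometry types and helpers below are illustrative
assumptions using msgspec's tagged structs):

from typing import Union

import msgspec

Position = tuple[float, float]

class Point(msgspec.Struct, tag=True):
    # Tagged structs carry a "type" field holding the class name, which
    # matches GeoJSON's {"type": "Point", ...} convention.
    coordinates: Position

class LineString(msgspec.Struct, tag=True):
    coordinates: list[Position]

Geometry = Union[Point, LineString]

def loads(data: bytes) -> Geometry:
    # Raises a descriptive error if a field is missing or has the wrong type.
    return msgspec.json.decode(data, type=Geometry)

def dumps(obj: Geometry) -> bytes:
    return msgspec.json.encode(obj)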
# This uses the noarch `current_repodata.json` from conda-forge, which can be found
# at https://conda.anaconda.org/conda-forge/noarch/current_repodata.json
# This file is medium in size (13 MiB), and contains a nested structure of metadata
# about packages on conda-forge.
#
# Here we benchmark querying the top 10 packages by size from repodata, using a number
# of different python JSON libraries.
def bench_msgspec(data: bytes) -> None:
    from operator import attrgetter
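    import msgspec

    # Hedged sketch of how this function might continue (not the original
    # code): describe just the parts of repodata we need as msgspec Structs,
    # decode, and take the 10 largest packages. The field names mirror
    # conda's repodata layout and are assumptions here.
    class Package(msgspec.Struct):
        name: str
        version: str
        size: int = 0

    class RepoData(msgspec.Struct):
        packages: dict[str, Package]

    repo = msgspec.json.decode(data, type=RepoData)
    top10 = sorted(repo.packages.values(), key=attrgetter("size"), reverse=True)[:10]
    for pkg in top10:
        print(pkg.name, pkg.version, pkg.size)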
""" | |
This benchmark is a modified version of the benchmark available at | |
https://github.com/samuelcolvin/pydantic/tree/master/benchmarks to support | |
benchmarking msgspec. | |
The benchmark measures the time to JSON encode/decode `n` random objects | |
matching a specific schema. It compares the time required for both | |
serialization _and_ schema validation. | |
""" |
import argparse
import json
import lzma
import os
import timeit
import urllib.request

import msgspec
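# A plausible use of these imports (a hedged sketch, not the gist's code; the
# cache filename is an assumption): download the repodata once and cache it
# locally as an xz-compressed file before benchmarking.
REPODATA_URL = "https://conda.anaconda.org/conda-forge/noarch/current_repodata.json"
CACHE_PATH = "current_repodata.json.xz"

def load_repodata() -> bytes:
    if not os.path.exists(CACHE_PATH):
        with urllib.request.urlopen(REPODATA_URL) as resp:
            raw = resp.read()
        with lzma.open(CACHE_PATH, "wb") as f:
            f.write(raw)
    with lzma.open(CACHE_PATH, "rb") as f:
        return f.read()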
diff --git a/dask_gateway/client.py b/garnet/client.py
index db044b9..6f35ea1 100644
--- a/dask_gateway/client.py
+++ b/garnet/client.py
@@ -27,25 +27,25 @@ from .utils import format_template, cancel_task
         del comm
-__all__ = ("Gateway", "GatewayCluster", "GatewayClusterError", "GatewayServerError")
+__all__ = ("Garnet", "GarnetCluster", "GarnetClusterError", "GarnetServerError")
""" | |
For workloads where most of the grunt work is *driven* by prefect, but done | |
using some external system like dask, it makes more sense to use Prefect to | |
drive Dask rather than running Prefect inside Dask. | |
If you want your prefect Flow to startup a dask cluster, you'll want to ensure | |
all resources are still cleaned up properly, even in the case of Flow failure. | |
To do this, you can make use of a `prefect.resource_manager`. This mirrors the | |
`contextmanager` pattern you may be familiar with in Python, but makes it work | |
with Prefect tasks. See |
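A minimal sketch of that pattern (assuming the Prefect 1.x `resource_manager`
API with `init`/`setup`/`cleanup` methods; the cluster settings and the
`inc_all` task are hypothetical):

from prefect import Flow, resource_manager, task

@resource_manager
class DaskCluster:
    def init(self, n_workers: int = 4):
        self.n_workers = n_workers

    def setup(self):
        # Start a local Dask cluster when the flow run begins.
        from dask.distributed import Client
        return Client(n_workers=self.n_workers)

    def cleanup(self, client):
        # Runs even if tasks inside the block fail, so the cluster is
        # always shut down.
        client.close()

@task
def inc_all(client, xs):
    return client.gather(client.map(lambda x: x + 1, xs))

with Flow("dask-resource-manager-example") as flow:
    with DaskCluster(n_workers=4) as client:
        inc_all(client, [1, 2, 3])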
from prefect import Flow
from prefect.storage import S3
from prefect.run_configs import ECSRun
from prefect.executors import DaskExecutor

with Flow("example") as flow:
    ...

flow.storage = S3("my-flows")
flow.run_config = ECSRun()  # Run job on ECS instead of locally
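# Hedged completion (not part of the original snippet): the DaskExecutor
# import above suggests the flow also configures an executor, e.g. a
# temporary local Dask cluster.
flow.executor = DaskExecutor()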
from prefect import Flow
from prefect.executors import DaskExecutor

with Flow("daskcloudprovider-example") as flow:
    # Add tasks to flow here...
    ...

# Execute this flow on a Dask cluster deployed on AWS Fargate
flow.executor = DaskExecutor(
    cluster_class="dask_cloudprovider.aws.FargateCluster",
    cluster_kwargs={"image": "prefecthq/prefect", "n_workers": 5},
)
from prefect import Flow
from prefect.run_configs import KubernetesRun
from prefect.storage import Docker

with Flow("kubernetes-example") as flow:
    # Add tasks to flow here...
    ...

# Run on Kubernetes with a custom resource configuration
flow.run_config = KubernetesRun(cpu_request=2, memory_request="4Gi")

# Store the flow in a docker image
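# Hedged completion (not part of the original snippet); the registry and
# image names below are placeholders.
flow.storage = Docker(registry_url="registry.example.com", image_name="kubernetes-example")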