@keyboardAnt
Last active November 1, 2023 19:29
lm_format_enforcer_vllm_integration.ipynb
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "view-in-github",
"colab_type": "text"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/keyboardAnt/322c4263f231387cad089ed15b0394db/lm_format_enforcer_vllm_integration.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "Wv1vqZgW-mLt"
},
"source": [
"# LM Format Enforcer Integration with vLLM\n",
"\n",
"<a target=\"_blank\" href=\"https://colab.research.google.com/github/noamgat/lm-format-enforcer/blob/main/samples/colab_vllm_integration.ipynb\">\n",
" <img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/>\n",
"</a>\n",
"\n",
"This notebook shows how you can integrate with the vLLM library. vLLM does not currently have an API for token filtering, so we have to do some monkey patching to expose the functionality.\n",
"\n",
"## Setting up the Colab runtime (user action required)\n",
"\n",
"This Colab-friendly notebook demonstrates the enforcer on Llama2 and can run on a free GPU on Google Colab.\n",
"Make sure that your runtime is set to GPU:\n",
"\n",
"Menu Bar -> Runtime -> Change runtime type -> T4 GPU (at the time of writing this notebook). [Guide here](https://www.codesansar.com/deep-learning/using-free-gpu-tpu-google-colab.htm)."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "7uUS5qUj-mLv"
},
"source": [
"## Gathering Hugging Face credentials (user action required)\n",
"\n",
"We begin by installing the dependencies. This demo uses Llama2, so you will have to create a free Hugging Face account, request access to the Llama2 model, create an access token, and enter it when the next cell requests it.\n",
"\n",
"Links:\n",
"\n",
"- [Request access to the Llama2 model](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf). See the \"Access Llama 2 on Hugging Face\" section.\n",
"- [Create a Hugging Face access token](https://huggingface.co/settings/tokens)\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "8TnReyIj-mLv",
"colab": {
"base_uri": "https://localhost:8080/"
},
"outputId": "725a41f8-6e10-4c42-8850-234de921671f"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"Requirement already satisfied: vllm in /usr/local/lib/python3.10/dist-packages (0.2.1.post1)\n",
"Requirement already satisfied: lm-format-enforcer in /usr/local/lib/python3.10/dist-packages (0.4.3)\n",
"Requirement already satisfied: ninja in /usr/local/lib/python3.10/dist-packages (from vllm) (1.11.1.1)\n",
"Requirement already satisfied: psutil in /usr/local/lib/python3.10/dist-packages (from vllm) (5.9.5)\n",
"Requirement already satisfied: ray>=2.5.1 in /usr/local/lib/python3.10/dist-packages (from vllm) (2.7.1)\n",
"Requirement already satisfied: pandas in /usr/local/lib/python3.10/dist-packages (from vllm) (1.5.3)\n",
"Requirement already satisfied: pyarrow in /usr/local/lib/python3.10/dist-packages (from vllm) (9.0.0)\n",
"Requirement already satisfied: sentencepiece in /usr/local/lib/python3.10/dist-packages (from vllm) (0.1.99)\n",
"Requirement already satisfied: numpy in /usr/local/lib/python3.10/dist-packages (from vllm) (1.23.5)\n",
"Requirement already satisfied: torch==2.0.1 in /usr/local/lib/python3.10/dist-packages (from vllm) (2.0.1)\n",
"Requirement already satisfied: transformers>=4.34.0 in /usr/local/lib/python3.10/dist-packages (from vllm) (4.34.1)\n",
"Requirement already satisfied: xformers==0.0.22 in /usr/local/lib/python3.10/dist-packages (from vllm) (0.0.22)\n",
"Requirement already satisfied: fastapi in /usr/local/lib/python3.10/dist-packages (from vllm) (0.104.1)\n",
"Requirement already satisfied: uvicorn[standard] in /usr/local/lib/python3.10/dist-packages (from vllm) (0.23.2)\n",
"Requirement already satisfied: pydantic<2 in /usr/local/lib/python3.10/dist-packages (from vllm) (1.10.13)\n",
"Requirement already satisfied: filelock in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (3.12.4)\n",
"Requirement already satisfied: typing-extensions in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (4.8.0)\n",
"Requirement already satisfied: sympy in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (1.12)\n",
"Requirement already satisfied: networkx in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (3.2)\n",
"Requirement already satisfied: jinja2 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (3.1.2)\n",
"Requirement already satisfied: nvidia-cuda-nvrtc-cu11==11.7.99 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (11.7.99)\n",
"Requirement already satisfied: nvidia-cuda-runtime-cu11==11.7.99 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (11.7.99)\n",
"Requirement already satisfied: nvidia-cuda-cupti-cu11==11.7.101 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (11.7.101)\n",
"Requirement already satisfied: nvidia-cudnn-cu11==8.5.0.96 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (8.5.0.96)\n",
"Requirement already satisfied: nvidia-cublas-cu11==11.10.3.66 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (11.10.3.66)\n",
"Requirement already satisfied: nvidia-cufft-cu11==10.9.0.58 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (10.9.0.58)\n",
"Requirement already satisfied: nvidia-curand-cu11==10.2.10.91 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (10.2.10.91)\n",
"Requirement already satisfied: nvidia-cusolver-cu11==11.4.0.1 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (11.4.0.1)\n",
"Requirement already satisfied: nvidia-cusparse-cu11==11.7.4.91 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (11.7.4.91)\n",
"Requirement already satisfied: nvidia-nccl-cu11==2.14.3 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (2.14.3)\n",
"Requirement already satisfied: nvidia-nvtx-cu11==11.7.91 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (11.7.91)\n",
"Requirement already satisfied: triton==2.0.0 in /usr/local/lib/python3.10/dist-packages (from torch==2.0.1->vllm) (2.0.0)\n",
"Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from nvidia-cublas-cu11==11.10.3.66->torch==2.0.1->vllm) (67.7.2)\n",
"Requirement already satisfied: wheel in /usr/local/lib/python3.10/dist-packages (from nvidia-cublas-cu11==11.10.3.66->torch==2.0.1->vllm) (0.41.2)\n",
"Requirement already satisfied: cmake in /usr/local/lib/python3.10/dist-packages (from triton==2.0.0->torch==2.0.1->vllm) (3.27.7)\n",
"Requirement already satisfied: lit in /usr/local/lib/python3.10/dist-packages (from triton==2.0.0->torch==2.0.1->vllm) (17.0.4)\n",
"Requirement already satisfied: interegular>=0.3.2 in /usr/local/lib/python3.10/dist-packages (from lm-format-enforcer) (0.3.2)\n",
"Requirement already satisfied: click>=7.0 in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (8.1.7)\n",
"Requirement already satisfied: jsonschema in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (4.19.1)\n",
"Requirement already satisfied: msgpack<2.0.0,>=1.0.0 in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (1.0.7)\n",
"Requirement already satisfied: packaging in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (23.2)\n",
"Requirement already satisfied: protobuf!=3.19.5,>=3.15.3 in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (3.20.3)\n",
"Requirement already satisfied: pyyaml in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (6.0.1)\n",
"Requirement already satisfied: aiosignal in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (1.3.1)\n",
"Requirement already satisfied: frozenlist in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (1.4.0)\n",
"Requirement already satisfied: requests in /usr/local/lib/python3.10/dist-packages (from ray>=2.5.1->vllm) (2.31.0)\n",
"Requirement already satisfied: huggingface-hub<1.0,>=0.16.4 in /usr/local/lib/python3.10/dist-packages (from transformers>=4.34.0->vllm) (0.17.3)\n",
"Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.10/dist-packages (from transformers>=4.34.0->vllm) (2023.6.3)\n",
"Requirement already satisfied: tokenizers<0.15,>=0.14 in /usr/local/lib/python3.10/dist-packages (from transformers>=4.34.0->vllm) (0.14.1)\n",
"Requirement already satisfied: safetensors>=0.3.1 in /usr/local/lib/python3.10/dist-packages (from transformers>=4.34.0->vllm) (0.4.0)\n",
"Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.10/dist-packages (from transformers>=4.34.0->vllm) (4.66.1)\n",
"Requirement already satisfied: anyio<4.0.0,>=3.7.1 in /usr/local/lib/python3.10/dist-packages (from fastapi->vllm) (3.7.1)\n",
"Requirement already satisfied: starlette<0.28.0,>=0.27.0 in /usr/local/lib/python3.10/dist-packages (from fastapi->vllm) (0.27.0)\n",
"Requirement already satisfied: python-dateutil>=2.8.1 in /usr/local/lib/python3.10/dist-packages (from pandas->vllm) (2.8.2)\n",
"Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas->vllm) (2023.3.post1)\n",
"Requirement already satisfied: h11>=0.8 in /usr/local/lib/python3.10/dist-packages (from uvicorn[standard]->vllm) (0.14.0)\n",
"Requirement already satisfied: httptools>=0.5.0 in /usr/local/lib/python3.10/dist-packages (from uvicorn[standard]->vllm) (0.6.1)\n",
"Requirement already satisfied: python-dotenv>=0.13 in /usr/local/lib/python3.10/dist-packages (from uvicorn[standard]->vllm) (1.0.0)\n",
"Requirement already satisfied: uvloop!=0.15.0,!=0.15.1,>=0.14.0 in /usr/local/lib/python3.10/dist-packages (from uvicorn[standard]->vllm) (0.19.0)\n",
"Requirement already satisfied: watchfiles>=0.13 in /usr/local/lib/python3.10/dist-packages (from uvicorn[standard]->vllm) (0.21.0)\n",
"Requirement already satisfied: websockets>=10.4 in /usr/local/lib/python3.10/dist-packages (from uvicorn[standard]->vllm) (12.0)\n",
"Requirement already satisfied: idna>=2.8 in /usr/local/lib/python3.10/dist-packages (from anyio<4.0.0,>=3.7.1->fastapi->vllm) (3.4)\n",
"Requirement already satisfied: sniffio>=1.1 in /usr/local/lib/python3.10/dist-packages (from anyio<4.0.0,>=3.7.1->fastapi->vllm) (1.3.0)\n",
"Requirement already satisfied: exceptiongroup in /usr/local/lib/python3.10/dist-packages (from anyio<4.0.0,>=3.7.1->fastapi->vllm) (1.1.3)\n",
"Requirement already satisfied: fsspec in /usr/local/lib/python3.10/dist-packages (from huggingface-hub<1.0,>=0.16.4->transformers>=4.34.0->vllm) (2023.6.0)\n",
"Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.8.1->pandas->vllm) (1.16.0)\n",
"Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2->torch==2.0.1->vllm) (2.1.3)\n",
"Requirement already satisfied: attrs>=22.2.0 in /usr/local/lib/python3.10/dist-packages (from jsonschema->ray>=2.5.1->vllm) (23.1.0)\n",
"Requirement already satisfied: jsonschema-specifications>=2023.03.6 in /usr/local/lib/python3.10/dist-packages (from jsonschema->ray>=2.5.1->vllm) (2023.7.1)\n",
"Requirement already satisfied: referencing>=0.28.4 in /usr/local/lib/python3.10/dist-packages (from jsonschema->ray>=2.5.1->vllm) (0.30.2)\n",
"Requirement already satisfied: rpds-py>=0.7.1 in /usr/local/lib/python3.10/dist-packages (from jsonschema->ray>=2.5.1->vllm) (0.10.6)\n",
"Requirement already satisfied: charset-normalizer<4,>=2 in /usr/local/lib/python3.10/dist-packages (from requests->ray>=2.5.1->vllm) (3.3.1)\n",
"Requirement already satisfied: urllib3<3,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests->ray>=2.5.1->vllm) (2.0.7)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests->ray>=2.5.1->vllm) (2023.7.22)\n",
"Requirement already satisfied: mpmath>=0.19 in /usr/local/lib/python3.10/dist-packages (from sympy->torch==2.0.1->vllm) (1.3.0)\n",
"\n",
" _| _| _| _| _|_|_| _|_|_| _|_|_| _| _| _|_|_| _|_|_|_| _|_| _|_|_| _|_|_|_|\n",
" _| _| _| _| _| _| _| _|_| _| _| _| _| _| _| _|\n",
" _|_|_|_| _| _| _| _|_| _| _|_| _| _| _| _| _| _|_| _|_|_| _|_|_|_| _| _|_|_|\n",
" _| _| _| _| _| _| _| _| _| _| _|_| _| _| _| _| _| _| _|\n",
" _| _| _|_| _|_|_| _|_|_| _|_|_| _| _| _|_|_| _| _| _| _|_|_| _|_|_|_|\n",
" \n",
" A token is already saved on your machine. Run `huggingface-cli whoami` to get more information or `huggingface-cli logout` if you want to log out.\n",
" Setting a new token will erase the existing one.\n",
" To login, `huggingface_hub` requires a token generated from https://huggingface.co/settings/tokens .\n",
"Token: \n",
"Add token as git credential? (Y/n) \n",
"Token is valid (permission: read).\n",
"\u001b[1m\u001b[31mCannot authenticate through git-credential as no helper is defined on your machine.\n",
"You might have to re-authenticate when pushing to the Hugging Face Hub.\n",
"Run the following command in your terminal in case you want to set the 'store' credential helper as default.\n",
"\n",
"git config --global credential.helper store\n",
"\n",
"Read https://git-scm.com/book/en/v2/Git-Tools-Credential-Storage for more details.\u001b[0m\n",
"Token has not been saved to git credential helper.\n",
"Your token has been saved to /root/.cache/huggingface/token\n",
"Login successful\n"
]
}
],
"source": [
"!pip install vllm lm-format-enforcer\n",
"!huggingface-cli login\n",
"\n",
"# When running from source / developing the library, use this instead\n",
"# %load_ext autoreload\n",
"# %autoreload 2\n",
"# import sys\n",
"# import os\n",
"# sys.path.append(os.path.abspath('..'))\n",
"## os.environ['CUDA_LAUNCH_BLOCKING'] = '1'"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "sB4vlD90-mLw"
},
"source": [
"## Creating a custom sampler that filters tokens\n",
"\n",
"We introduce a subclass of vLLM's ```SamplingParams``` that also accepts a token filtering function, with the same API as Hugging Face Transformers'\n",
"\n",
"```prefix_allowed_tokens_fn: Callable[[int, torch.Tensor], List[int]]```\n",
"\n",
"We then introduce the function ```_apply_allowed_token_filters()```, which, for each request that carries a filter function, sets the logits of disallowed tokens to negative infinity.\n",
"\n",
"We hope that in future releases of vLLM, this (or similar) will be part of vLLM's ```Sampler``` class."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "tx_2Aol7-mLw"
},
"outputs": [],
"source": [
"import vllm\n",
"import torch\n",
"from typing import List, Callable, Optional\n",
"from vllm.sampling_params import SamplingParams\n",
"from vllm.model_executor.input_metadata import InputMetadata\n",
"\n",
"class SamplingParamsWithFilterFunction(SamplingParams):\n",
" logits_allowed_tokens_filter_function: Optional[Callable[[int, torch.Tensor], List[int]]]\n",
"\n",
"def _apply_allowed_token_filters(logits: torch.Tensor,\n",
" input_metadata: InputMetadata) -> torch.Tensor:\n",
" num_seqs, vocab_size = logits.shape\n",
" logits_row_idx = 0\n",
" for seq_ids, sampling_params in input_metadata.seq_groups:\n",
" if isinstance(sampling_params, SamplingParamsWithFilterFunction):\n",
" filter_function = sampling_params.logits_allowed_tokens_filter_function\n",
" else:\n",
" filter_function = None\n",
" for seq_id in seq_ids:\n",
" if filter_function is not None:\n",
" output_token_ids = input_metadata.seq_data[seq_id].output_token_ids\n",
" output_token_tensor = torch.tensor(output_token_ids, dtype=torch.long)\n",
" allowed_tokens = filter_function(logits_row_idx, output_token_tensor)\n",
" logits_add_factor = torch.zeros(vocab_size, dtype=logits.dtype, device=logits.device)\n",
" logits_add_factor[:] = float('-inf')\n",
" logits_add_factor[allowed_tokens] = 0\n",
" logits[logits_row_idx] += logits_add_factor\n",
" logits_row_idx += 1\n",
" assert logits_row_idx == num_seqs\n",
" return logits\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "iS3Bk3-k-mLw"
},
"source": [
"In order to integrate this function with the ```Sampler``` class, we have to change its ```forward()``` function to call it. Since we are not modifying vLLM itself, we will do this with monkey patching.\n",
"\n",
"Other than the line\n",
"```\n",
"logits = _apply_allowed_token_filters(logits, input_metadata)\n",
"```\n",
"this is a 100% copy of the original ```Sampler.forward()``` function."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "9O_kyZ1J-mLw"
},
"outputs": [],
"source": [
"from vllm.model_executor.layers.sampler import SamplerOutput, _prune_hidden_states, _get_logits, _get_output_tokens, _get_penalties, _apply_penalties, _get_temperatures, _get_top_p_top_k, _apply_top_p_top_k, _sample, _get_logprobs, _build_sampler_output, _SAMPLING_EPS\n",
"\n",
"from typing import Optional\n",
"\n",
"def patched_forward(\n",
" self,\n",
" embedding: torch.Tensor,\n",
" hidden_states: torch.Tensor,\n",
" input_metadata: InputMetadata,\n",
" embedding_bias: Optional[torch.Tensor] = None,\n",
" ) -> SamplerOutput:\n",
" # Get the hidden states that we use for sampling.\n",
" hidden_states = _prune_hidden_states(hidden_states, input_metadata)\n",
"\n",
" # Get the logits for the next tokens.\n",
" logits = _get_logits(hidden_states, embedding, embedding_bias,\n",
" self.vocab_size)\n",
"\n",
" # Apply presence and frequency penalties.\n",
" output_tokens = _get_output_tokens(input_metadata)\n",
" assert len(output_tokens) == logits.shape[0]\n",
" presence_penalties, frequency_penalties = _get_penalties(\n",
" input_metadata)\n",
" assert len(presence_penalties) == logits.shape[0]\n",
" assert len(frequency_penalties) == logits.shape[0]\n",
" logits = _apply_penalties(logits, output_tokens, presence_penalties,\n",
" frequency_penalties)\n",
"\n",
" ### LM FORMAT ENFORCER MONKEY PATCH START\n",
" logits = _apply_allowed_token_filters(logits, input_metadata)\n",
" ### LM FORMAT ENFORCER MONKEY PATCH END\n",
"\n",
" # Apply temperature scaling.\n",
" temperatures = _get_temperatures(input_metadata)\n",
" assert len(temperatures) == logits.shape[0]\n",
" if any(t != 1.0 for t in temperatures):\n",
" t = torch.tensor(temperatures,\n",
" dtype=logits.dtype,\n",
" device=logits.device)\n",
" # Use in-place division to avoid creating a new tensor.\n",
" logits.div_(t.unsqueeze(dim=1))\n",
"\n",
" # Apply top-p and top-k truncation.\n",
" top_ps, top_ks = _get_top_p_top_k(input_metadata, self.vocab_size)\n",
" assert len(top_ps) == len(top_ks) == logits.shape[0]\n",
" do_top_p = any(p < 1.0 - _SAMPLING_EPS for p in top_ps)\n",
" do_top_k = any(k != self.vocab_size for k in top_ks)\n",
" if do_top_p or do_top_k:\n",
" logits = _apply_top_p_top_k(logits, top_ps, top_ks)\n",
"\n",
" # We use float32 for probabilities and log probabilities.\n",
" # Compute the probabilities.\n",
" probs = torch.softmax(logits, dim=-1, dtype=torch.float)\n",
" # Compute the log probabilities.\n",
" # Use log_softmax to ensure numerical stability.\n",
" logprobs = torch.log_softmax(logits, dim=-1, dtype=torch.float)\n",
"\n",
" # Sample the next tokens.\n",
" sample_results = _sample(probs, logprobs, input_metadata)\n",
" # Get the logprobs query results.\n",
" prompt_logprobs, sample_logprobs = _get_logprobs(\n",
" logprobs, input_metadata, sample_results)\n",
" return _build_sampler_output(sample_results, input_metadata,\n",
" prompt_logprobs, sample_logprobs)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "LoGzJpcv-mLx"
},
"source": [
"We load the model, as is normally done with vLLM."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "Vin9d_x0-mLx",
"colab": {
"base_uri": "https://localhost:8080/"
},
"outputId": "cd46270b-70e7-4035-9d42-051cca9e93d2"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": [
"INFO 10-31 15:50:34 llm_engine.py:72] Initializing an LLM engine with config: model='NousResearch/Llama-2-7b-chat-hf', tokenizer='NousResearch/Llama-2-7b-chat-hf', tokenizer_mode=auto, revision=None, tokenizer_revision=None, trust_remote_code=False, dtype=torch.float16, max_seq_len=4096, download_dir=None, load_format=auto, tensor_parallel_size=1, quantization=None, seed=0)\n",
"INFO 10-31 15:50:34 tokenizer.py:31] For some LLaMA V1 models, initializing the fast tokenizer may take a long time. To reduce the initialization time, consider using 'hf-internal-testing/llama-tokenizer' instead of the original tokenizer.\n",
"INFO 10-31 15:51:52 llm_engine.py:207] # GPU blocks: 26, # CPU blocks: 512\n"
]
}
],
"source": [
"# model_id = 'meta-llama/Llama-2-7b-chat-hf'\n",
"# model_id = 'facebook/opt-125m'\n",
"model_id = \"NousResearch/Llama-2-7b-chat-hf\"\n",
"llm = vllm.LLM(model=model_id)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "k0ES1Dfg-mLx"
},
"source": [
"If the previous cell executed successfully, you have properly set up your Colab runtime and Hugging Face account!"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "IEDtbJfj-mLx"
},
"source": [
"A few helper functions to make display nicer."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "i64sPcrt-mLx"
},
"outputs": [],
"source": [
"from IPython.display import display, Markdown\n",
"\n",
"def display_header(text):\n",
" display(Markdown(f'**{text}**'))\n",
"\n",
"def display_content(text):\n",
" display(Markdown(f'```\\n{text}\\n```'))"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "uMsbo8Zj-mLy"
},
"source": [
"## Setting up the prompt for the specific language model\n",
"\n",
"We set up the prompting style according to the [Llama2 demo](https://huggingface.co/spaces/huggingface-projects/llama-2-13b-chat/blob/main/app.py). We simplify the implementation a bit as we don't need chat history for this demo."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "fvTpXvij-mLy"
},
"outputs": [],
"source": [
"DEFAULT_SYSTEM_PROMPT = \"\"\"\\\n",
"You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.\\n\\nIf a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\\\n",
"\"\"\"\n",
"\n",
"def get_prompt(message: str, system_prompt: str = DEFAULT_SYSTEM_PROMPT) -> str:\n",
" return f'<s>[INST] <<SYS>>\\n{system_prompt}\\n<</SYS>>\\n\\n{message} [/INST]'"
]
},
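{
"cell_type": "markdown",
"metadata": {},
"source": [
"For example, with the default system prompt, ```get_prompt``` wraps a message in the standard Llama2 chat template (system prompt elided here for brevity):\n",
"\n",
"```python\n",
"get_prompt('Hello')\n",
"# -> \"<s>[INST] <<SYS>>\\n...system prompt...\\n<</SYS>>\\n\\nHello [/INST]\"\n",
"```"
]
},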
{
"cell_type": "markdown",
"metadata": {
"id": "MQRmHSYH-mLy"
},
"source": [
"## Activating the monkey patch and creating the generation function\n",
"\n",
"We monkey-patch the ```Sampler``` class with our custom ```forward()``` method, using ```unittest.mock```.\n",
"\n",
"We use our sampling params subclass to send the request-specific filter function along with the request. Different requests can have different format enforcers."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "gNFmH0rz-mLy"
},
"outputs": [],
"source": [
"from lmformatenforcer import CharacterLevelParser\n",
"from lmformatenforcer.integrations.transformers import build_transformers_prefix_allowed_tokens_fn\n",
"from unittest import mock\n",
"\n",
"DEFAULT_MAX_NEW_TOKENS = 100\n",
"\n",
"def vllm_with_character_level_parser(llm: vllm.LLM, prompt: str, parser: Optional[CharacterLevelParser] = None) -> str:\n",
" with mock.patch.object(vllm.model_executor.layers.sampler.Sampler, 'forward', patched_forward):\n",
" prefix_function = build_transformers_prefix_allowed_tokens_fn(llm.get_tokenizer(), parser) if parser else None\n",
" sampling_params = SamplingParamsWithFilterFunction()\n",
" sampling_params.max_tokens = DEFAULT_MAX_NEW_TOKENS\n",
" sampling_params.logits_allowed_tokens_filter_function = prefix_function\n",
" result = llm.generate(prompt, sampling_params=sampling_params)\n",
" return result[0].outputs[0].text"
]
},
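{
"cell_type": "markdown",
"metadata": {},
"source": [
"The same helper also works with other character-level parsers from the library. The snippet below is a sketch (it assumes the ```RegexParser``` class exported by lm-format-enforcer) that restricts the answer to a date-like pattern:\n",
"\n",
"```python\n",
"# Sketch: constrain output with a regular expression instead of a JSON schema.\n",
"# RegexParser is assumed to be importable from lmformatenforcer.\n",
"from lmformatenforcer import RegexParser\n",
"\n",
"date_regex = r'(0?[1-9]|1[0-2])/(0?[1-9]|[12][0-9]|3[01])/(19|20)\\\\d\\\\d'\n",
"answer = vllm_with_character_level_parser(\n",
"    llm,\n",
"    get_prompt('When was Michael Jordan born? Answer MM/DD/YYYY.'),\n",
"    RegexParser(date_regex))\n",
"```"
]
},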
{
"cell_type": "markdown",
"metadata": {
"id": "0e-K6Ij5-mLy"
},
"source": [
"## vLLM + JSON Use case\n",
"\n",
"Now we demonstrate using ```JsonSchemaParser```. We create a pydantic model, generate the schema from it, and use that to enforce the format.\n",
"The output will always be in a format that can be parsed by the parser."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "dZSMyRQH-mLy",
"colab": {
"base_uri": "https://localhost:8080/",
"height": 646
},
"outputId": "4785dccb-cae7-4dc8-dbe9-89d4cd3b75d9"
},
"outputs": [
{
"output_type": "display_data",
"data": {
"text/plain": [
"<IPython.core.display.Markdown object>"
],
"text/markdown": "**Prompt:**"
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/plain": [
"<IPython.core.display.Markdown object>"
],
"text/markdown": "```\n<s>[INST] <<SYS>>\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.\n\nIf a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n<</SYS>>\n\nPlease give me information about Michael Jordan. You MUST answer using the following json schema: {\"title\": \"AnswerFormat\", \"type\": \"object\", \"properties\": {\"first_name\": {\"title\": \"First Name\", \"type\": \"string\"}, \"last_name\": {\"title\": \"Last Name\", \"type\": \"string\"}, \"year_of_birth\": {\"title\": \"Year Of Birth\", \"type\": \"integer\"}, \"num_seasons_in_nba\": {\"title\": \"Num Seasons In Nba\", \"type\": \"integer\"}}, \"required\": [\"first_name\", \"last_name\", \"year_of_birth\", \"num_seasons_in_nba\"]} [/INST]\n```"
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/plain": [
"<IPython.core.display.Markdown object>"
],
"text/markdown": "**Answer, With json schema enforcing:**"
},
"metadata": {}
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:06<00:00, 6.24s/it]\n"
]
},
{
"output_type": "display_data",
"data": {
"text/plain": [
"<IPython.core.display.Markdown object>"
],
"text/markdown": "```\n {\n\"first_name\": \"Michael\",\n\"last_name\": \"Jordan\",\n\"year_of_birth\": 1963,\n\"num_seasons_in_nba\": 15\n}\n```"
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/plain": [
"<IPython.core.display.Markdown object>"
],
"text/markdown": "**Answer, Without json schema enforcing:**"
},
"metadata": {}
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:06<00:00, 6.25s/it]\n"
]
},
{
"output_type": "display_data",
"data": {
"text/plain": [
"<IPython.core.display.Markdown object>"
],
"text/markdown": "```\n Of course! Here's the requested information about Michael Jordan in the format specified:\n\n{\n\"title\": \"AnswerFormat\",\n\"type\": \"object\",\n\"properties\": {\n\"first_name\": {\n\"title\": \"First Name\",\n\"type\": \"string\",\n\"example\": \"Michael\"\n},\n\"last_name\": {\n\"title\": \"Last Name\",\n\"type\": \"string\",\n\"example\": \"J\n```"
},
"metadata": {}
}
],
"source": [
"from lmformatenforcer import JsonSchemaParser\n",
"from pydantic import BaseModel\n",
"\n",
"from typing import List\n",
"\n",
"class AnswerFormat(BaseModel):\n",
" first_name: str\n",
" last_name: str\n",
" year_of_birth: int\n",
" num_seasons_in_nba: int\n",
"\n",
"question = 'Please give me information about Michael Jordan. You MUST answer using the following json schema: '\n",
"question_with_schema = f'{question}{AnswerFormat.schema_json()}'\n",
"prompt = get_prompt(question_with_schema)\n",
"\n",
"display_header(\"Prompt:\")\n",
"display_content(prompt)\n",
"\n",
"display_header(\"Answer, With json schema enforcing:\")\n",
"\n",
"result = vllm_with_character_level_parser(llm, prompt, JsonSchemaParser(AnswerFormat.schema()))\n",
"display_content(result)\n",
"\n",
"display_header(\"Answer, Without json schema enforcing:\")\n",
"result = vllm_with_character_level_parser(llm, prompt, None)\n",
"display_content(result)\n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "A6ozgJyu-mLy"
},
"source": [
"As you can see, the enforced output matches the required schema, while the unenforced does not. We have successfully integrated with vLLM!"
]
},
{
"cell_type": "code",
"source": [
"def batch_vllm_with_character_level_parser(llm: vllm.LLM, prompts: list[str], parser: Optional[CharacterLevelParser] = None) -> list[str]:\n",
" with mock.patch.object(vllm.model_executor.layers.sampler.Sampler, 'forward', patched_forward):\n",
" prefix_function = build_transformers_prefix_allowed_tokens_fn(llm.get_tokenizer(), parser) if parser else None\n",
" sampling_params = SamplingParamsWithFilterFunction()\n",
" sampling_params.max_tokens = DEFAULT_MAX_NEW_TOKENS\n",
" sampling_params.logits_allowed_tokens_filter_function = prefix_function\n",
" results = llm.generate(prompts, sampling_params=sampling_params)\n",
" return [r.outputs[0].text for r in results]"
],
"metadata": {
"id": "WYfMcAUAMo4J"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"def get_prompts(names, AnswerFormat):\n",
" return [f\"Please give me information about {name}. You MUST answer using the following json schema:\\n{AnswerFormat.schema_json()}\" for name in names]\n",
"\n",
"\n",
"names = \\\n",
" ['Michael Jordan',\n",
" 'Babe Ruth',\n",
" 'Muhammad Ali',\n",
" 'Jim Brown',\n",
" 'Wayne Gretzky',\n",
" 'Jesse Owens',\n",
" 'Jim Thorpe',\n",
" 'Willie Mays',\n",
" 'Jack Nicklaus',\n",
" 'Babe Didrikson',\n",
" 'Joe Louis',\n",
" 'Carl Lewis',\n",
" 'Wilt Chamberlain',\n",
" 'Hank Aaron',\n",
" 'Jackie Robinson',\n",
" 'Ted Williams',\n",
" 'Magic Johnson',\n",
" 'Bill Russell',\n",
" 'Martina Navratilova',\n",
" 'Ty Cobb',\n",
" 'Gordie Howe',\n",
" 'Joe DiMaggio',\n",
" 'Jackie Joyner-Kersee',\n",
" 'Sugar Ray Robinson',\n",
" 'Joe Montana',\n",
" 'Kareem Abdul-Jabbar',\n",
" 'Jerry Rice',\n",
" 'Red Grange',\n",
" 'Arnold Palmer',\n",
" 'Larry Bird',\n",
" 'Bobby Orr',\n",
" 'Johnny Unitas',\n",
" 'Mark Spitz',\n",
" 'Lou Gehrig',\n",
" 'Secretariat',\n",
" 'Oscar Robertson',\n",
" 'Mickey Mantle',\n",
" 'Ben Hogan',\n",
" 'Walter Payton',\n",
" 'Lawrence Taylor',\n",
" 'Wilma Rudolph',\n",
" 'Sandy Koufax',\n",
" 'Julius Erving',\n",
" 'Bobby Jones',\n",
" 'Bill Tilden',\n",
" 'Eric Heiden',\n",
" 'Edwin Moses',\n",
" 'Pete Sampras',\n",
" 'O.J. Simpson',\n",
" 'Chris Evert',\n",
" 'Rocky Marciano',\n",
" 'Jack Dempsey',\n",
" 'Rafer Johnson',\n",
" 'Greg Louganis',\n",
" 'Mario Lemieux',\n",
" 'Pete Rose',\n",
" 'Willie Shoemaker',\n",
" 'Elgin Baylor',\n",
" 'Billie Jean King',\n",
" 'Walter Johnson',\n",
" 'Stan Musial',\n",
" 'Jerry West',\n",
" 'Satchel Paige',\n",
" 'Sammy Baugh',\n",
" 'Althea Gibson',\n",
" 'Eddie Arcaro',\n",
" 'Bob Gibson',\n",
" 'Al Oerter',\n",
" 'Bonnie Blair',\n",
" 'Dick Butkus',\n",
" 'Roberto Clemente',\n",
" 'Bo Jackson',\n",
" 'Josh Gibson',\n",
" 'Deion Sanders',\n",
" 'Dan Marino',\n",
" 'Barry Sanders',\n",
" 'Cy Young',\n",
" 'Bob Mathias',\n",
" 'Gale Sayers',\n",
" 'A.J. Foyt',\n",
" 'Jimmy Connors',\n",
" 'Bobby Hull',\n",
" 'Honus Wagner',\n",
" \"Man o' War\",\n",
" 'Maurice Richard',\n",
" 'Otto Graham',\n",
" 'Henry Armstrong',\n",
" 'Joe Namath',\n",
" 'Rogers Hornsby',\n",
" 'Richard Petty',\n",
" 'Bob Beamon',\n",
" 'Mario Andretti',\n",
" 'Don Hutson',\n",
" 'Bob Cousy',\n",
" 'George Blanda',\n",
" 'Michael Johnson',\n",
" 'Citation',\n",
" 'Don Budge',\n",
" 'Sam Snead',\n",
" 'Jack Johnson']\n",
"\n",
"\n",
"def get_players(num_of_names: int) -> list[AnswerFormat | ValueError]:\n",
"    \"\"\"Generate and parse answers for the first num_of_names names.\n",
"\n",
"    Returns one entry per prompt: the parsed AnswerFormat instance, or the\n",
"    ValueError raised when the generated JSON failed validation.\n",
"    \"\"\"\n",
"    prompts = get_prompts(names[:num_of_names], AnswerFormat)\n",
"    players_raw = batch_vllm_with_character_level_parser(llm, prompts, JsonSchemaParser(AnswerFormat.schema()))\n",
"    players = []\n",
"    for p in players_raw:\n",
"        try:\n",
"            players.append(AnswerFormat.parse_raw(p))\n",
"        except ValueError as e:\n",
"            players.append(e)\n",
"    print()\n",
"    print(\"The number of parsed players: \", sum(isinstance(p, AnswerFormat) for p in players))\n",
"    return players"
],
"metadata": {
"id": "ploHfeUzPIVW"
},
"execution_count": null,
"outputs": []
},
{
"cell_type": "code",
"source": [
"players = get_players(3)\n",
"players"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "RiAdvb2XNfta",
"outputId": "4d2cf002-e526-4a02-9972-f5c59cc7cb96"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 3/3 [00:13<00:00, 4.57s/it]"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 3\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"\n"
]
},
{
"output_type": "execute_result",
"data": {
"text/plain": [
"[AnswerFormat(first_name='Michael', last_name='Jordan', year_of_birth=1963, num_seasons_in_nba=15),\n",
" AnswerFormat(first_name='George', last_name='Herman', year_of_birth=1895, num_seasons_in_nba=20),\n",
" AnswerFormat(first_name='Muhammad', last_name='Ali', year_of_birth=1942, num_seasons_in_nba=56)]"
]
},
"metadata": {},
"execution_count": 11
}
]
},
{
"cell_type": "code",
"source": [
"%%timeit\n",
"\n",
"get_players(1)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "OsADQytcb9ct",
"outputId": "60a0cadb-c4e2-4be9-9f8b-97cb98647d3f"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:04<00:00, 4.42s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 1\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:04<00:00, 4.68s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 1\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:04<00:00, 4.49s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 1\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:03<00:00, 3.53s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 1\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:04<00:00, 4.87s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 1\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:04<00:00, 4.48s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 1\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:04<00:00, 4.39s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 1\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 1/1 [00:04<00:00, 4.75s/it]"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 1\n",
"5.72 s ± 575 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"\n"
]
}
]
},
{
"cell_type": "code",
"source": [
"%%timeit\n",
"\n",
"get_players(10)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "xkxkzrJQcHFn",
"outputId": "8864d2be-ef9a-4651-a250-80190297d045"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 10/10 [00:45<00:00, 4.59s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 10\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 10/10 [00:40<00:00, 4.05s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 10\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 10/10 [00:42<00:00, 4.25s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 10\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 10/10 [00:41<00:00, 4.18s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 10\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 10/10 [00:39<00:00, 3.95s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 10\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 10/10 [00:44<00:00, 4.41s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 10\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 10/10 [00:43<00:00, 4.35s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 10\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 10/10 [00:41<00:00, 4.16s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 10\n",
"43.1 s ± 1.69 s per loop (mean ± std. dev. of 7 runs, 1 loop each)\n"
]
}
]
},
{
"cell_type": "code",
"source": [
"%%timeit\n",
"\n",
"get_players(32)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "a9AXVdVRfx6t",
"outputId": "34945566-bda2-497f-cbaf-f8da1a266bdd"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 32/32 [02:17<00:00, 4.29s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 32\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 32/32 [02:18<00:00, 4.32s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 32\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 32/32 [02:14<00:00, 4.20s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 32\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 32/32 [02:11<00:00, 4.11s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 32\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 32/32 [02:18<00:00, 4.32s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 32\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 32/32 [02:18<00:00, 4.34s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 31\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 32/32 [02:17<00:00, 4.29s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 32\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 32/32 [02:14<00:00, 4.20s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 32\n",
"2min 17s ± 2.54 s per loop (mean ± std. dev. of 7 runs, 1 loop each)\n"
]
}
]
},
{
"cell_type": "code",
"source": [
"%%timeit\n",
"\n",
"get_players(64)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "kHh1iXjdf2te",
"outputId": "37eee25b-a00b-4cf4-aba2-257863accfdc"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 64/64 [04:43<00:00, 4.43s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 64\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 64/64 [04:47<00:00, 4.49s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 64\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 64/64 [04:40<00:00, 4.38s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 64\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 64/64 [04:35<00:00, 4.30s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 64\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 64/64 [04:39<00:00, 4.36s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 64\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 64/64 [04:28<00:00, 4.19s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 64\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 64/64 [04:32<00:00, 4.25s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 64\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 64/64 [04:35<00:00, 4.30s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 64\n",
"4min 38s ± 5.9 s per loop (mean ± std. dev. of 7 runs, 1 loop each)\n"
]
}
]
},
{
"cell_type": "code",
"source": [
"%%timeit\n",
"\n",
"get_players(100)"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "kvPvCfDtf7a_",
"outputId": "99a81df9-3097-4da2-b631-7cbdfb8568d1"
},
"execution_count": null,
"outputs": [
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 100/100 [07:04<00:00, 4.25s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 99\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 100/100 [06:52<00:00, 4.13s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 100\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 100/100 [07:18<00:00, 4.38s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 100\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 100/100 [07:17<00:00, 4.38s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 100\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 100/100 [07:14<00:00, 4.34s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 100\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 100/100 [07:21<00:00, 4.42s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 100\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 100/100 [07:05<00:00, 4.25s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 100\n"
]
},
{
"output_type": "stream",
"name": "stderr",
"text": [
"Processed prompts: 100%|██████████| 100/100 [06:58<00:00, 4.19s/it]\n"
]
},
{
"output_type": "stream",
"name": "stdout",
"text": [
"\n",
"The number of parsed players: 100\n",
"7min 11s ± 9.99 s per loop (mean ± std. dev. of 7 runs, 1 loop each)\n"
]
}
]
},
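{
"cell_type": "code",
"source": [
"# A quick derived view of the %%timeit results above: mean seconds per prompt\n",
"# at each batch size. The totals below are copied from this notebook's\n",
"# recorded runs (5.72 s, 43.1 s, 2min 17s, 4min 38s, 7min 11s); rerunning on\n",
"# different hardware will of course produce different numbers.\n",
"batch_sizes = [1, 10, 32, 64, 100]\n",
"total_seconds = [5.72, 43.1, 137, 278, 431]\n",
"for n, t in zip(batch_sizes, total_seconds):\n",
"    print(f\"{n:>3} prompts: {t / n:.2f} s/prompt\")"
],
"metadata": {},
"execution_count": null,
"outputs": []
},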
{
"cell_type": "code",
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"\n",
"# Mean wall-clock times (with std. dev.) from the %%timeit runs above\n",
"x = [1, 10, 32, 64, 100]\n",
"y = [5.72, 43.1, 137, 278, 431]\n",
"yerr = [.575, 1.69, 2.54, 5.9, 9.99]\n",
"\n",
"plt.xlabel('Number of prompts')\n",
"plt.ylabel('Mean time (s)')\n",
"plt.plot(x, y)\n",
"plt.errorbar(x=x, y=y, yerr=yerr, fmt='o')"
],
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 447
},
"id": "r250-nFwlXZj",
"outputId": "11a77fd7-41f9-46cd-815d-43a0220a15dc"
},
"execution_count": null,
"outputs": [
{
"output_type": "execute_result",
"data": {
"text/plain": [
"<ErrorbarContainer object of 3 artists>"
]
},
"metadata": {},
"execution_count": 18
},
{
"output_type": "display_data",
"data": {
"text/plain": [
"<Figure size 640x480 with 1 Axes>"
],
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAigAAAGdCAYAAAA44ojeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABAVUlEQVR4nO3deViVZeL/8fc5rLKLCoiKeypuoJZi6xSlZataalY200wzDa6UZptNq2ZTmlON852l5jflkpVWmppjpZW4pID7lhsugEpwAGU75/79QTGhLaLAcw7n87oursvzPDfwOXfl+fRst80YYxARERFxI3arA4iIiIicSQVFRERE3I4KioiIiLgdFRQRERFxOyooIiIi4nZUUERERMTtqKCIiIiI21FBEREREbfja3WA8+FyuTh69CihoaHYbDar44iIiMg5MMZQWFhIbGwsdvvPHyPxyIJy9OhRWrVqZXUMEREROQ9ZWVm0bNnyZ8d4ZEEJDQ0FKt9gWFiYxWlERETkXDgcDlq1alX1Of5zPLKgfH9aJywsTAVFRETEw5zL5Rm6SFZERETcjgqKiIiIuB0VFBEREXE7KigiIiLidlRQRERExO2ooIiIiIjbUUERERERt6OCIiIiIm5HBUVERETcjgqKiIiIuB0VFBEREXE7KigiIiLidlRQRERExO2ooIiIiMj/lBXDn8Irv8qKLYuhgiIiIiJuRwVFRERE3I4KioiIiPyPy/m/Px9cU/11PVJBERERkUrbP4TXLvnf67eHwsxuldvrmQqKiIiIVJaQd+6BwmPVtzuOVW6v55KigiIiIuLtXE5Y9jBgfmTnd9uWTa7X0z0qKCIiIt7u4BpwHP2ZAQYcRyrH1RMVFBEREW9XlFO742qBCoqIiIiX23s6+NwGhkTXbZAfUEERERHxUk6X4bXP9nLDIidHTSSunxxpg7AW0Lp/vWVTQREREfFCxwpOc9c/1vHi8l2UuWwsjh2HDRtgO2Pkd68HTgO7T73l86233yQiIiJuYdnWbCa/v5n8U+UE+fvw1M1dGdr7Bmw72sHSSdVvNQ6LrSwn8TfXa0YVFBERES9xuszJM0u2M2fdIQB6tAznleGJtG363TUo8TdDu6tgWqvK1yPfhfZX1+uRk++poIiIiHiBbUcLGDs3nW+OF2Ozwe+vaE/qtRfh73vG1R4/LCOt+1tSTkAFRUREpEFzuQxvrDnAC0t3UuZ0ERUawIxhCVzaoanV0X6WCoqIiEgDdbywlIcWZLJq93EAkrtEM31oDyKD/X/6m/yD4U8F9ZTwp6mgiIiINECf78rloQWZnCgqI8DXzuM3xnNX3zhstjPv0nFPKigiIiINSGmFkxeW7uJfX+0HoHNMKLNGJHJRdKjFyWpGBUVERKSB2JtbyJi5Gew45gDg3v5tmHx9ZwL9rLnQ9UKooIiIiHg4Ywxz1h/imcXbKSl3ERnsz59v78HVnevv0fS1TQVFRETEg31bXMbk9zezfFvlQn6Xd2zKS7f3JCos0OJkF0YFRURExEOt+eYEqfMzyXaU4Odj4+GBnfnNpW2x2z3jQtifo4IiIiLiYcqdLmas2M1fV32DMdCuWTCzhifSrUW41dFqjQqKiIiIBzlwophx89LJPFz5rJIRl7TiiRvjCfJvWB/pDevdiIiINFDGGN7fdIQpH2yluMxJeCM/pg3uzvXdm1sdrU6ooIiIiLg5R0k5jy/cyoeZRwG4pG0kM4clEBvRyOJkdUcFRURExI1tPPgt4+alc/jb0/jYbUxI7sgDV3XApwFcCPtz7L885KdNmzYNm83G+PHjq7aVlJSQkpJCkyZNCAkJYciQIeTk5FT7vkOHDjFo0CCCgoKIiopi4sSJVFRUXEgUERGRBsXpMsxauYc7/pbG4W9P0yqyEQv+kMToqzs2+HICF3AEZcOGDfztb3+jR48e1bZPmDCBJUuWsGDBAsLDwxk9ejSDBw/mq6++AsDpdDJo0C
BiYmJYs2YNx44d45577sHPz4/nn3/+wt6NiIhIA3Ak/zQT5mWw/kAeALcmxPLMrd0IDfSzOFn9Oa8jKEVFRYwcOZK///3vNG7cuGp7QUEB//znP3n55Ze5+uqr6d27N2+88QZr1qxh7dq1AHzyySds376dt956i4SEBK6//nqeeeYZXnvtNcrKymrnXYmIiHioJZuPcf3M1aw/kEdIgC8zhvVk5vBEryoncJ4FJSUlhUGDBpGcnFxt+8aNGykvL6+2vXPnzsTFxZGWlgZAWloa3bt3Jzr6f4/fHTBgAA6Hg23btv3o7ystLcXhcFT7EhERaUiKSyuY9G4mKXM24SipIKFVBEvGXsZtiS2tjmaJGp/imTdvHps2bWLDhg1n7cvOzsbf35+IiIhq26Ojo8nOzq4a88Ny8v3+7/f9mKlTp/LUU0/VNKqIiIhH2HK4gHHz0tl3ohibDVKu6sC45I74+VzQpaIerUYFJSsri3HjxrFixQoCA+vvGf+PPPIIqampVa8dDgetWrWqt98vIiJSF1wuwz++3MeLy3dR7jQ0Dw9kxrAE+rVrYnU0y9WooGzcuJHc3Fx69epVtc3pdLJ69WpeffVVli9fTllZGfn5+dWOouTk5BATEwNATEwM69evr/Zzv7/L5/sxZwoICCAgIKAmUUVERNxarqOE1Hcy+XLvCQAGdo1h2pDuRAT5W5zMPdTo2NE111zDli1byMjIqPrq06cPI0eOrPqzn58fK1eurPqeXbt2cejQIZKSkgBISkpiy5Yt5ObmVo1ZsWIFYWFhxMfH19LbEhERcV8rd+Qw8JUv+HLvCQL97Ewd3J2/3tVL5eQHanQEJTQ0lG7dulXbFhwcTJMmTaq233fffaSmphIZGUlYWBhjxowhKSmJfv36AXDdddcRHx/P3XffzfTp08nOzubxxx8nJSVFR0lERKRBKyl3MvXjHfw77SAA8c3DmDUikQ5RIRYncz+1/iTZGTNmYLfbGTJkCKWlpQwYMIDXX3+9ar+Pjw+LFy/mgQceICkpieDgYEaNGsXTTz9d21FERETcxq7sQsbOTWdXTiEAv72sLRMHdiLA18fiZO7JZowxVoeoKYfDQXh4OAUFBYSFhVkdR0RE5CcZY/jP2oM8u2QHZRUumoYE8NIdPbnyomZWR6t3Nfn81lo8IiIideRkUSmT3t3Myp2V113+qlMzXry9J01DdEnDL1FBERERqQNf7DlO6juZHC8sxd/HziM3dObe/m2w2Rr+Ojq1QQVFRESkFpVVuPjzJ7v4v9X7AOgYFcKsEYl0aa5LEmpCBUVERKSW7DtexNh56Ww9Urkky1394njshnga+etC2JpSQREREblAxhgWfH2YJz/cxulyJxFBfkwf0oPruv74A0jll6mgiIiIXICCU+U8umgLSzYfAyCpXRNmDEsgJrz+loRpiFRQREREztP6/XlMmJ/BkfzT+NptPHhdJ+6/oh0+dl0Ie6FUUERERGqowuli1qd7efXTPbgMtGkSxCvDE+nZKsLqaA2GCoqIiEgNZOWdYty8dDYdygdgaO+W/OnmroQE6CO1Nmk2RUREztEHGUd4fOFWCksrCA3w5bnB3bm5Z6zVsRokFRQREZFfUFRawZQPtvL+piMA9G7dmJnDEmgVGWRxsoZLBUVERORnZGTlM25eOgdPnsJugzFXd2TM1R3w9bFbHa1BU0ERERH5EU6XYfaqb5ixYjcVLkOLiEbMHJ7AxW0irY7mFVRQREREzpBdUMKE+Rmk7TsJwKAezXn+tu6EN/KzOJn3UEERERH5geXbsnn4vc3knyonyN+HP93cldt7t9Qif/VMBUVERAQ4XebkmSXbmbPuEADdW4TzyvAE2jULsTiZd1JBERERr7f9qIOx89LZm1sEwO+vbMeD13bC31cXwlpFBUVERLyWMYY3vjrAtKU7KXO6iAoN4OU7ErisY1Oro3k9FRQREfFKxwtLmfhuJp/vOg5Acpdopg/tQWSwv8
XJBFRQRETEC32+K5eHFmRyoqiMAF87j98Yz11943QhrBtRQREREa9RWuHkhaW7+NdX+wHoHBPKrBGJXBQdanEyOZMKioiIeIW9uYWMmZvBjmMOAO7t34bJ13cm0M/H4mTyY1RQRESkQTPGMHd9Fk8v3kZJuYvIYH9eHNqDa7pEWx1NfoYKioiINFjfFpcx+f3NLN+WA8DlHZvy0u09iQoLtDiZ/BIVFBERaZDWfHOC1PmZZDtK8POxMWlAZ+67rC12uy6E9QQqKCIi0qCUO13MWLGbv676BmOgXdNgZo1IpFuLcKujSQ2ooIiISINx8GQxY+dlkJmVD8Dwi1sx5aZ4gvz1cedp9E9MREQ8njGGhelHeGLRVorLnIQF+jJtSA9u6N7c6mhynlRQRETEozlKynli0VY+yDgKwCVtI5k5LIHYiEYWJ5MLoYIiIiIea+PBbxk/P52svNP42G1MSO7IA1d1wEcXwno8FRQREfE4Tpfh9c/2MnPlHpwuQ8vGjXhleCK9Wze2OprUEhUUERHxKEfyTzNhXgbrD+QBcEtCLM/c2o2wQD+Lk0ltUkERERGP8fGWY0x+bzOOkgqC/X149rZu3JbY0upYUgdUUERExO2dKqvgqQ+3M//rLAB6topg1vAEWjcJtjiZ1BUVFBERcWtbjxQwdm46+04UY7PBH69qz/jki/DzsVsdTeqQCoqIiLgll8vwjy/38eLyXZQ7DTFhgcwYlkBS+yZWR5N6oIIiIiJuJ9dRwoMLMvlizwkABnaNYdqQ7kQE+VucTOqLCoqIiLiVlTtymPjuZvKKywj0s/PkTV0ZfnErbDY928SbqKCIiIhbKCl3MvXjHfw77SAA8c3DmDUikQ5RIRYnEyuooIiIiOV2ZRcydm46u3IKAbjvsrZMGtiJAF8fi5OJVVRQRETEMsYY3lp7kGeX7KC0wkXTEH/+fHtPruoUZXU0sZgKioiIWCKvuIxJ72by3x25AFzVqRkvDu1Js9AAi5OJO1BBERGRevflnhOkvpNBbmEp/j52HrmhM/f2b6MLYaWKCoqIiNSbsgoXL32yi7+t3gdAh6gQZg1PJD42zOJk4m5UUEREpF7sO17EuHkZbDlSAMDIvnE8PiieRv66EFbOpoIiIiJ1yhjDgo2H+dOH2zhV5iQiyI8XhvRgQNcYq6OJG1NBERGROlNwupxHF25hyeZjACS1a8KMYQnEhAdanEzcnQqKiIjUiQ0H8hg/L4Mj+afxtdtIve4ifn9Fe3zsuhBWfpkKioiI1KoKp4tZn+7l1U/34DLQukkQrwxPJKFVhNXRxIOooIiISK3JyjvF+PkZbDz4LQBDerXkqVu6EhKgjxupGf0bIyIiteLDzKM89v4WCksrCA3w5dnbunFLQgurY4mHUkEREZELUlRawZMfbOO9TYcB6N26MTOHJdAqMsjiZOLJVFBEROS8ZWTlM25eOgdPnsJugzFXd2TM1R3w9bFbHU08nAqKiIjUmMtlmL36G17+ZDcVLkOLiEbMHJ7AxW0irY4mDYQKioiI1Eh2QQmp72Sw5puTAAzq0Zznb+tOeCM/i5NJQ6KCIiIi52z5tmwefm8z+afKCfL34U83d+X23i21yJ/UOhUUERH5RafLnDy7ZDtvrzsEQPcW4bwyPIF2zUIsTiYNlQqKiIj8rO1HHYydl87e3CIAfn9FOx68rhP+vroQVuqOCoqIiPwoYwxvfHWAaUt3UuZ0ERUawMt3JHBZx6ZWRxMvoIIiIiJnOVFUykMLMvl813EAkrtE8cKQHjQJCbA4mXgLFRQREanm8125PLRgMyeKSgnwtfP4oC7c1a+1LoSVeqWCIiIiAJRWOJm+bBf//HI/AJ2iQ5k1IpFOMaEWJxNvpIIiIiLszS1kzNwMdhxzAHBv/zZMvr4zgX4+FicTb6WCIiLixYwxzNuQxVMfbaOk3EVksD8vDu3BNV2irY4mXk4FRUTES+WfKmPye1tYti0bgMs7NuWl23sSFR
ZocTIRFRQREa+U9s1JJszPINtRgp+PjUkDOnPfZW2x23UhrLiHGj1l569//Ss9evQgLCyMsLAwkpKSWLp0adX+kpISUlJSaNKkCSEhIQwZMoScnJxqP+PQoUMMGjSIoKAgoqKimDhxIhUVFbXzbkRE5GeVO128uHwnd/5jLdmOEto1DWbhHy/ld1e0UzkRt1KjIygtW7Zk2rRpdOzYEWMM//73v7nllltIT0+na9euTJgwgSVLlrBgwQLCw8MZPXo0gwcP5quvvgLA6XQyaNAgYmJiWLNmDceOHeOee+7Bz8+P559/vk7eoIiIVDp4spix8zLIzMoHYFifVky5KZ7gAB1MF/djM8aYC/kBkZGRvPjiiwwdOpRmzZoxZ84chg4dCsDOnTvp0qULaWlp9OvXj6VLl3LjjTdy9OhRoqMrL8CaPXs2Dz/8MMePH8ff3/+cfqfD4SA8PJyCggLCwsIuJL6IiFd4f9Nhnli0leIyJ2GBvkwd3INBPZpbHUu8TE0+v897IQWn08m8efMoLi4mKSmJjRs3Ul5eTnJyctWYzp07ExcXR1paGgBpaWl07969qpwADBgwAIfDwbZt237yd5WWluJwOKp9iYjIL3OUlDN+Xjqp72RSXObkkjaRLB1/hcqJuL0aH9fbsmULSUlJlJSUEBISwsKFC4mPjycjIwN/f38iIiKqjY+OjiY7u/IK8ezs7Grl5Pv93+/7KVOnTuWpp56qaVQREa+26dC3jJuXTlbeaXzsNsZf05E//qoDPrrWRDxAjQtKp06dyMjIoKCggHfffZdRo0axatWqushW5ZFHHiE1NbXqtcPhoFWrVnX6O0VEPJXTZXj9s73MXLkHp8vQsnEjXhmeSO/Wja2OJnLOalxQ/P396dChAwC9e/dmw4YNvPLKKwwbNoyysjLy8/OrHUXJyckhJiYGgJiYGNavX1/t531/l8/3Y35MQEAAAQFaoEpE5JcczT/N+PkZrN+fB8AtCbE8c2s3wgL9LE4mUjPnfQ3K91wuF6WlpfTu3Rs/Pz9WrlxZtW/Xrl0cOnSIpKQkAJKSktiyZQu5ublVY1asWEFYWBjx8fEXGkVExKt9vOUYA2euZv3+PIL9fXj5jp7MHJagciIeqUZHUB555BGuv/564uLiKCwsZM6cOXz++ecsX76c8PBw7rvvPlJTU4mMjCQsLIwxY8aQlJREv379ALjuuuuIj4/n7rvvZvr06WRnZ/P444+TkpKiIyQiIufpVFkFT3+0nXkbsgDo2SqCWcMTaN0k2OJkIuevRgUlNzeXe+65h2PHjhEeHk6PHj1Yvnw51157LQAzZszAbrczZMgQSktLGTBgAK+//nrV9/v4+LB48WIeeOABkpKSCA4OZtSoUTz99NO1+65ERLzE1iMFjJ2bzr4Txdhs8Mer2jM++SL8fC74ALmIpS74OShW0HNQRMTbuVyGf365n+nLd1LuNMSEBTJjWAJJ7ZtYHU3kJ9Xk81uPDxQR8TC5jhIeXJDJF3tOADCgazTTBvegcfC5PexSxBOooIiIeJBPd+bw0ILN5BWXEehnZ8qNXRlxSStsNj3bRBoWFRQREQ9QUu5k2tKdvLnmAABdmofxlxEJdIgKtTaYSB1RQRERcXO7cwoZOzedndmFAPzm0rY8fH0nAnx9LE4mUndUUERE3JQxhrfWHuTZJTsorXDRNMSfF2/vya86RVkdTaTOqaCIiFjN5YSDa6AoB0KioXV/8k47mfTuZv67o/Jp21de1Iw/396TZqF6ZpR4BxUUERErbf8Qlk6CwmNVm0qDYphWejf/LU7E38fO5Os7c2//Nti1yJ94ERUUERGrbP8Q3rkHqP44Kr/ibKbxIkGNJ3PH3SnEx+p5T+J99KhBEREruJyw7GHOLCcAdhvYbDDF7z/Ex+hx9eKdVFBERKxwcA04jv7kbhtgdxypHCfihVRQRESsUJRTu+NEGhgVFBERC2SVn+MD1kKi6zaIiJtSQRERqUfGGN
7ZkMWAhRUcNZG4fnKkDcJaQOv+9ZhOxH2ooIiI1JPi0gpS38lk0nubOVUOC5qOxoaNyitOfui71wOngV1PixXvpNuMRUTqwY5jDlLe3sS+E8X42G2kXnsRD1x5A7adHc96DgphsZXlJP5m6wKLWEwFRUSkDhljmLs+iz99tI2yChcxYYH85c5ELm4TWTkg/mboPOisJ8nqyIl4OxUUEZE6UlhSzqMLt/JRZuXtxL/q1IyX7kggMti/+kC7D7S93IKEIu5LBUVEpA5sPVLA6DmbOHDyFL52G5MGduK3l7XT4+pFzpEKiohILTLG8J+1B3l28Q7KnC5aRDRi1ohEerdubHU0EY+igiIiUkscJeVMfm8zH2/JBiC5SzR/vr0HEUH+v/CdInImFRQRkVqw+XA+KXM2kZV3Gj8fG5Ov78JvLm2DzaZTOiLnQwVFROQCGGN446sDTF26g3KnoWXjRrx2Zy96toqwOpqIR1NBERE5TwWnypn4biafbK9cL2dg1xheGNqD8EZ+FicT8XwqKCIi5yH90LeMnpPOkfzT+PvYeWxQF+5Jaq1TOiK1RAVFRKQGjDH844v9vLBsJxUuQ+smQbw6ohfdW4ZbHU2kQVFBERE5R98Wl/HQgkxW7swFYFCP5kwb3J3QQJ3SEaltKigiIufg6wN5jJmbzrGCEvx97Uy5MZ6RfeN0SkekjqigiIj8DJfLMHv1N7z0yW6cLkPbpsG8emciXWN1SkekLqmgiIj8hJNFpaS+k8mq3ccBuCUhludu605IgP7qFKlr+q9MRORHrNt3krHz0slxlBLga+fpW7pyR59WOqUjUk9UUEREfsDpMrz+2V5m/Hc3LgPtmwXz+sjedIoJtTqaiFdRQRER+c7xwlImzM/gy70nABjSqyXP3NqVIH/9VSlS3/RfnYgIsGbvCcbNz+B4YSmN/Hx45tZuDO3d0upYIl5LBUVEvJrTZZi1cg+zPt2DMXBRdAiv3dmLjtE6pSNiJRUUEfFauY4Sxs5LZ+2+PACG9WnFn27uSiN/H4uTiYgKioh4pS/2HGfC/AxOFJUR5O/D87d159bEFlbHEpHvqKCIiFepcLqY+d89vPb5XoyBzjGhvDayF+2bhVgdTUR+QAVFRLzGsYLTjJubwfoDlad0RvaN44kb4wn00ykdEXejgiIiXuGzXbmkzs/g21PlhAT4MnVwd27qGWt1LBH5CSooItKglTtd/PmTXfxt1T4AusaG8dqdvWjTNNjiZCLyc1RQRKTBOpJ/mrFz09l48FsARiW15pEbuuiUjogHUEERkQbpv9tzeHBBJgWnywkN9GX6kB5c37251bFE5BypoIhIg1JW4WL6sp3848v9APRsGc5fRvQirkmQxclEpCZUUESkwcjKO8XouelkZuUD8JtL2zL5+s74+9qtDSYiNaaCIiINwrKt2Ux6NxNHSQVhgb78+faeXNc1xupYInKeVFBExKOVVjiZ+vFO3lxzAIDEuAj+MiKRlo11SkfEk6mgiIjHOniymNFz0tlypACA+69ox8QBnfDz0SkdEU+ngiIiHmnJ5mNMfm8zhaUVRAT58fIdPbm6c7TVsUSklqigiIhHKSl38uyS7by19hAAfVo3ZtaIRGIjGlmcTERqkwqKiHiM/SeKSXl7E9uPOQD441XtSb32Inx1SkekwVFBERGP8EHGER59fwvFZU4ig/2ZMSyBKy9qZnUsEakjKigi4tZKyp089dE25q7PAuCStpHMGp5ITHigxclEpC6poIiI29qbW8ToOZvYmV2IzQZjftWBsdd01CkdES+ggiIibum9jYd5fNFWTpc7aRoSwMxhCVzWsanVsUSknqigiIhbOVVWwZMfbGPBxsMA9G/fhJnDE4gK1SkdEW+igiIibmN3TiEpb29iT24RdhuMu+YiRl/dAR+7zepoIlLPVFBExHLGGBZsPMyUD7ZSUu4iKjSAV4YnktS+idXRRMQiKigiYqni0goeX7SVhelHALi8Y1
NmDEugaUiAxclExEoqKCJimR3HHKTM2cS+48XYbfDgdZ144Mr22HVKR8TrqaCISL0zxjB3fRZPfbSN0goXMWGBzBqRyCVtI62OJiJuQgVFROpVYUk5jy7cykeZRwG4qlMzXr4jgchgf4uTiYg7UUERkXqz9UgBo+ds4sDJU/jYbUwa0InfXd5Op3RE5CwqKCJS54wxvLX2IM8s3kGZ00VseCB/ubMXvVs3tjqaiLgpFRQRqVOOknImv7eZj7dkA5DcJYo/396TiCCd0hGRn6aCIiJ1ZvPhfFLmbCIr7zR+PjYeHtiZ+y5ri82mUzoi8vNUUESk1hljeOOrA0xduoNyp6Fl40a8emcvElpFWB1NRDxEjZYEnTp1KhdffDGhoaFERUVx6623smvXrmpjSkpKSElJoUmTJoSEhDBkyBBycnKqjTl06BCDBg0iKCiIqKgoJk6cSEVFxYW/GxGxXMGpcn7/n408vXg75U7DgK7RLBl7ucqJiNRIjQrKqlWrSElJYe3ataxYsYLy8nKuu+46iouLq8ZMmDCBjz76iAULFrBq1SqOHj3K4MGDq/Y7nU4GDRpEWVkZa9as4d///jdvvvkmU6ZMqb13JSKWSD/0LTfM+oJPtufg72PnTzfFM/uu3oQ38rM6moh4GJsxxpzvNx8/fpyoqChWrVrFFVdcQUFBAc2aNWPOnDkMHToUgJ07d9KlSxfS0tLo168fS5cu5cYbb+To0aNER0cDMHv2bB5++GGOHz+Ov/8vXzjncDgIDw+noKCAsLCw840vIrXEGMM/vtjPC8t2UuEyxEUG8dqdvejeMtzqaCLiRmry+V2jIyhnKigoACAysvLpjxs3bqS8vJzk5OSqMZ07dyYuLo60tDQA0tLS6N69e1U5ARgwYAAOh4Nt27b96O8pLS3F4XBU+xIR9/BtcRm//ffXPPfxDipchkE9mrN47GUqJyJyQc77IlmXy8X48eO59NJL6datGwDZ2dn4+/sTERFRbWx0dDTZ2dlVY35YTr7f//2+HzN16lSeeuqp840qInVk48E8xsxJ52hBCf6+dqbcGM/IvnG6S0dELth5H0FJSUlh69atzJs3rzbz/KhHHnmEgoKCqq+srKw6/50i8tNcLsNfP/+GO/62lqMFJbRtGszCP/bnrn6tVU5EpFac1xGU0aNHs3jxYlavXk3Lli2rtsfExFBWVkZ+fn61oyg5OTnExMRUjVm/fn21n/f9XT7fjzlTQEAAAQFael3EHZwsKiX1nUxW7T4OwC0JsTx3W3dCAvTUAhGpPTU6gmKMYfTo0SxcuJBPP/2Utm3bVtvfu3dv/Pz8WLlyZdW2Xbt2cejQIZKSkgBISkpiy5Yt5ObmVo1ZsWIFYWFhxMfHX8h7EZE6tm7fSW6Y9QWrdh8nwNfOtMHdmTksQeVERGpdjf5WSUlJYc6cOXzwwQeEhoZWXTMSHh5Oo0aNCA8P57777iM1NZXIyEjCwsIYM2YMSUlJ9OvXD4DrrruO+Ph47r77bqZPn052djaPP/44KSkpOkoi4qZcLsPrn+/l5RW7cRlo3yyY10b2onOM7qITkbpRo9uMf+rc8htvvMG9994LVD6o7cEHH2Tu3LmUlpYyYMAAXn/99Wqnbw4ePMgDDzzA559/TnBwMKNGjWLatGn4+p5bX9JtxiL153hhKanvZPDFnhMADO7Vgmdu6UawjpqISA3V5PP7gp6DYhUVFJH6sWbvCcbNz+B4YSmN/Hx4+pau3N6nldWxRMRD1eTzW/8LJCJncboMs1buYdanezAGLooO4bU7e9ExOtTqaCLiJVRQRKSaXEcJ4+ZlkLbvJAB39GnJUzd3o5G/j8XJRMSbqKCISJUv9hxnwvwMThSVEeTvw3O3deO2xJa//I0iIrVMBUVEqHC6mPnfPbz2+V6Mgc4xobx6Zy86RIVYHU1EvJQKioiXyy4oYezcdNYfyAPgzr5xTLkxnkA/ndIREeuooIh4sc935ZL6TiZ5xW
WEBPjy/ODu3Nwz1upYIiIqKCLeqNzp4qVPdjN71TcAdI0N49U7e9G2abDFyUREKqmgiHiZI/mnGTs3nY0HvwXgnqTWPHpDF53SERG3ooIi4kX+uz2Hh97NJP9UOaEBvrwwtAc3dG9udSwRkbOooIh4gbIKF9OX7eQfX+4HoEfLcF4d0Yu4JkEWJxMR+XEqKCINXFbeKcbMTScjKx+AX1/ahsnXdybAV6d0RMR9qaCINGDLt2UzcUEmjpIKwgJ9efH2ngzoGvPL3ygiYjEVFJEGqLTCydSPd/LmmgMAJLSK4NU7E2nZWKd0RMQzqKCINDAHTxYzek46W44UAPC7y9sycUBn/H3tFicTETl3KigiDciSzceY/N5mCksriAjy46Xbe3JNl2irY4mI1JgKikgDUFLu5LklO/jP2oMA9GndmFkjEomNaGRxMhGR86OCIuLh9p8oJuXtTWw/5gDggavak3rtRfj56JSOiHguFRQRD/ZBxhEefX8LxWVOIoP9efmOnlzVKcrqWCIiF0wFRcQDlZQ7eeqjbcxdnwXAJW0jmTU8kZjwQIuTiYjUDhUUEQ+zN7eI0XM2sTO7EJsNRv+qA+Ou6YivTumISAOigiLiQd7fdJjHF23lVJmTpiH+zByWyGUdm1odS0Sk1qmgiHiAU2UVPPnBNhZsPAxAUrsmvDI8gagwndIRkYZJBUXEze3OKSTl7U3syS3CZoNx13RkzNUd8bHbrI4mIlJnVFBE3JQxhgUbDzPlg62UlLtoFhrAK8MT6N9ep3REpOFTQRFxQ8WlFTyxaCvvpx8B4PKOTZkxLIGmIQEWJxMRqR8qKCJuZscxB6PnbOKb48XYbfDgdZ144Mr22HVKR0S8iAqKiJswxjB3fRZPfbSN0goX0WEBzBqeSN92TayOJiJS71RQRNxAYUk5jy7cykeZRwG4qlMzXrq9J010SkdEvJQKiojFth4pYPScTRw4eQofu42JAzpx/+XtdEpHRLyaCoqIRYwxvLX2IM8s2UFZhYvY8ED+cmcivVtHWh1NRMRyKigiFnCUlDP5vc18vCUbgOQuUbw4tCeNg/0tTiYi4h5UUETq2ebD+Yyek86hvFP42m1Mvr4z913WFptNp3RERL6ngiJST4wxvLnmAM9/vINyp6FFRCNevTORxLjGVkcTEXE7Kigi9aDgVDmT3stk+bYcAK6Lj+bFoT0JD/KzOJmIiHtSQRGpY+mHvmX0nHSO5J/Gz8fGozd04d7+bXRKR0TkZ6igiNQRYwz//HI/05bupMJliIsM4tU7E+nRMsLqaCIibk8FRaQO5J8q46EFmfx3Ry4Ag7o3Z+qQ7oQF6pSOiMi5UEERuRAuJxxcA0U5EBINrfuzMauAMXPSOVpQgr+vnSdujOeuvnE6pSMiUgMqKCLna/uHsOxhcByt2lQUEMU/ikZy1HkxbZsG8+qdiXSNDbcwpIiIZ7JbHUDEI23/EN65p1o5AQgqyeU13xk83m4vH425TOVEROQ8qaCI1JTLWXnkBHPWLrsNbDYb9xX9jRA/ndIRETlfKigiNXVwzVlHTn7IhsHmOFI5TkREzosKikhNFeXU7jgRETmLCopIDa09fo7XlodE120QEZEGTAVF5BydKqvg4Xc3c+cnPhw1kbh+cqQNwlpA6/71mE5EpGFRQRE5B1uPFHDjX75k/tdZGJuddZ0mYcMGnHkh7HevB04Du099xxQRaTD0HBSRn+FyGf711X6mL9tFmdNFdFgAM+5IoH+HQbC9xVnPQSEstrKcxN9sXWgRkQZABUXkJxwvLOWhBZms2n0cgGvjo5k+pAeNg/0rB8TfDJ0HnfUkWR05ERG5cCooIj9i1e7jPPhOJieKSgnwtfP4oC7c1a/12Y+rt/tA28utCSki0oCpoIj8QGmFkxeX7eIfX+4HoFN0KLNGJNIpJtTiZCIi3kUFReQ73xwvYuzcdLYddQBwT1JrHr2hC4F+OmUjIlLfVFDE6xljWPD1YZ78cB
uny500DvJj+tCeXBuv55iIiFhFBUW8WsHpch5duIUlm48B0L99E16+I4GY8ECLk4mIeDcVFPFaXx/IY9y8DI7kn8bXbiP1uov4/RXt8bFrkT8REaupoIjXcboMr366l1dW7sZlIC4yiFkjEkloFWF1NBER+Y4KiniVI/mnmTAvg/UH8gC4LbEFT9/SldBAP4uTiYjID6mgiNf4eMsxJr+3GUdJBcH+Pjx7WzduS2xpdSwREfkRKijS4J0qq+CZxduZuz4LgJ6tIpg1PIHWTYItTiYiIj9FBUUatG1HCxg7N51vjhdjs8EDV7ZnwrUX4eejdTJFRNyZCoo0SMYY/vXVAV5YupMyp4uo0ABmDkugf4emVkcTEZFzoIIiDc6JolImLsjks12Vi/wld4lm+tAeRH6/yJ+IiLg9FRRpUFbvPk7qd4v8+fvaeeKnFvkTERG3poIiDUJZhYsXl+/k719ULvJ3UXQIs0Yk0jkmzOJkIiJyPlRQxOPtO17E2HnpbD1Sucjf3f1a89ggLfInIuLJVFDEYxljWLDxMH/6cBunypxEBPkxfUgPrusaY3U0ERG5QCoo4pEKTpfz2MItLP5ukb+kdk2YMUyL/ImINBQ1fhjE6tWruemmm4iNjcVms7Fo0aJq+40xTJkyhebNm9OoUSOSk5PZs2dPtTF5eXmMHDmSsLAwIiIiuO+++ygqKrqgNyLeY+PBPG545QsWbz6Gj93GxAGdeOu3fVVOREQakBoXlOLiYnr27Mlrr732o/unT5/OrFmzmD17NuvWrSM4OJgBAwZQUlJSNWbkyJFs27aNFStWsHjxYlavXs39999//u9CvILTZZi1cg93/G0tR/JP0yqyEe/+IYmUX3XQCsQiIg2MzRhjzvubbTYWLlzIrbfeClQePYmNjeXBBx/koYceAqCgoIDo6GjefPNNhg8fzo4dO4iPj2fDhg306dMHgGXLlnHDDTdw+PBhYmNjf/H3OhwOwsPDKSgoICxMd2l4g6P5pxk/P4P1+ysX+bs1IZZnbu2mRf5ERDxITT6/a/V53/v37yc7O5vk5OSqbeHh4fTt25e0tDQA0tLSiIiIqConAMnJydjtdtatW/ejP7e0tBSHw1HtS7zHsq3HuP6VL1i/P49gfx9evqMnM4cnqpyIiDRgtXqRbHZ2NgDR0dHVtkdHR1fty87OJioqqnoIX18iIyOrxpxp6tSpPPXUU7UZVTzA6TInTy/eztz1hwDo2TKcV4Yn0qapFvkTEWnoPGLFtEceeYSCgoKqr6ysLKsjSR3bftTBTa9+ydz1hyoX+buqPQv+0F/lRETES9TqEZSYmMrnT+Tk5NC8efOq7Tk5OSQkJFSNyc3NrfZ9FRUV5OXlVX3/mQICAggICKjNqOKmjDG8ueYAUz/+3yJ/M4YlcKkW+RMR8Sq1egSlbdu2xMTEsHLlyqptDoeDdevWkZSUBEBSUhL5+fls3Lixasynn36Ky+Wib9++tRlHPMzJolLu+/fXPPXRdsqcLpK7RLFs/BUqJyIiXqjGR1CKiorYu3dv1ev9+/eTkZFBZGQkcXFxjB8/nmeffZaOHTvStm1bnnjiCWJjY6vu9OnSpQsDBw7kd7/7HbNnz6a8vJzRo0czfPjwc7qDRxqmL/ZULvJ3vLBykb/HB3Xhbi3yJyLitWpcUL7++mt+9atfVb1OTU0FYNSoUbz55ptMmjSJ4uJi7r//fvLz87nssstYtmwZgYH/e4jW22+/zejRo7nmmmuw2+0MGTKEWbNm1cLbEU9TVuHipU928bfV+wAt8iciIpUu6DkoVtFzUBqG/SeKGTs3nS1HCgC4q18cjw+K1yJ/IiINVE0+v7UWj9Q7YwzvbjzMkz9Y5O+FIT0YoEX+RETkOyooUq8cJeU8tnArH2UeBaBfu0hmDEugeXgji5OJiIg7UUGRerPx4LeMm5fO4W9P42O3kXrtRfzhyvZaR0dERM6igiJ1zukyvP7ZXmau3IPTZWgV2Y
hXhifSK66x1dFERMRNqaBInTqaf5oJ8zNY990if7d8t8hfmNbRERGRn6GCInVm2dZsHn5vMwWnywn29+HpW7oxuFcLPdtERER+kQqK1LrTZU6eWbKdOesqF/nr0TKcWVrkT0REakAFRWrVjmMOxs5NZ09uEQC/v7IdD17bCX9fj1iXUkRE3IQKitQKYwz/XnOA55fupKyicpG/l+9I4LKOWkdHRERqTgVFLtjJolImvbuZlTsrV6m+pnMU04f2oEmIVqAWEZHzo4IiF+TLPSdIfSeD3O8W+Xvshi7ck6RF/kRE5MKooMh5OXORv45RlYv8dWmutZFEROTCqaBIjR04UczYeelsPly5yN/IvpWL/DXy1yJ/IiJSO1RQ5JwZY3hv0xGe/GArxWVOwhtVLvI3sJsW+RMRkdqlgiLnxFFSzuMLt/Lhd4v89W0byczhWuRPRETqhgqK/KJNhyoX+cvKq1zkb0JyRx64qoMW+RMRkTqjgiI/yeky/PXzvcz4b+Uify0bVy7y17u1FvkTEZG6pYIiP+pYQeUif2v3VS7yd3PPWJ69TYv8iYhI/VBBkbMs31a5yF/+qXKCvlvkb4gW+RMRkXqkgiJVTpc5eXbJdt7+wSJ/rwxPpK0W+RMRkXqmgiIA7Mx2MGaOFvkTERH3oILi5Ywx/L+0gzz38Q7KKlw0Cw3g5Tt6cnnHZlZHExERL6aC4sXyisuY9G4m/91Rucjf1Z2jeFGL/ImIiBtQQfFSX+09wYT5/1vk79HrOzOqfxtdCCsiIm5BBcXLlDtdvPTJbv62+huMgQ5RIfxFi/yJiIibUUHxIgdOFDNuXjqZ3y3yd2ffOJ7QIn8iIuKGVFC8xPubDvPEoh8u8tedgd2aWx1LRETkR6mgNHCFJeU8sWgrizIqF/m7pG0kM4clEBuhRf5ERMR9qaA0YGcu8jf+mo788Vda5E9ERNyfCkoD5HQZZq/6hpdX7NYifyIi4pFUUBqY7IISJszPIG3fSQBu6hnLc1rkT0REPIwKSgPyybZsJv1gkb+nbu7K0N4t9WwTERHxOCoonsblhINroCgHQqKhdX9KnPDsku28tbZykb/uLcJ5ZXgC7ZqFWBxWRETk/KigeJLtH8Kyh8FxtGpTeXBzXjCjeCuvBwC/v6IdD16nRf5ERMSzqaB4iu0fwjv3AKbaZp+iYzzBNIqDJ3LT8N9rkT8REWkQ9L/ZnsDlrDxyckY5AbDbABtMDXqby9tH1ns0ERGRuqCC4gkOrql2WudMdsCn8GjlOBERkQZABcUTFOXU7jgRERE3p4LiAQ6WhZ7bwJDoug0iIiJST1RQ3JjLZfjHF/sY8H4FR00krp8caYOwFtC6fz2mExERqTsqKG7qWMFp7vrnOp5dsoMSJyyKHouN766Irea71wOngd2nvmOKiIjUCd1m7IY+yjzKYwu34CipoJGfD1Nuimf4xTdg29H+rOegEBZbWU7ib7YusIiISC1TQXEjjpJynvxgGwvTjwDQs1UEM4cl0LZpcOWA+Juh86CzniSrIyciItLQqKC4iXX7TpL6TiZH8k9jt8Hoqzsy5uoO+PmccRbO7gNtL7cmpIiISD1RQbFYWYWLGf/dzexV32AMxEUGMWNYAr1bN7Y6moiIiGVUUCy0N7eQcfMy2HbUAcAdfVoy5aauhAToH4uIiHg3fRJawBjDf9Ye5LklOyitcNE4yI+pg3swsFuM1dFERETcggpKPcstLGHSu5v5fNdxAK64qBl/HtqDqLBAi5OJiIi4DxWUerR8WzaPvL+FvOIyAnztPHJ9Z0b1b4PNduazTURERLybCko9KC6t4JnF25m3IQuA+OZhzByewEXR5/gIexERES+jglLHNh36lgnzMzh48hQ2G9x/RTtSr72IAF89u0REROSnqKDUkQqni1c/28tfPt2L02VoEdGIl+7oSb92TayOJiIi4vZUUOrAgRPFjJ+fQUZWPgC3JsTy1C3dCG/kZ20wERERD6GCUo
uMMczfkMXTi7dzqsxJaKAvz97ajVsSWlgdTURExKOooNSSk0WlTH5/Cyu25wDQr10kL92RQIuIRhYnExER8TwqKLXgs125TFywmRNFpfj52Jg4oBO/vawddrtuHxYRETkfKigX4HSZk6lLd/D/0g4C0DEqhJnDE+gaG25xMhEREc+mgnKeth4pYNy8dL45XgzAry9tw8MDOxPop9uHRURELpQKSg05XYa/rf6Glz/ZTYXLEBUawJ9v78kVFzWzOpqIiEiDoYJSA1l5p3jwnUzWH8gDYGDXGKYO7k7jYH+Lk4mIiDQsKijnwBjDoowjTFm0jcLSCoL9ffjTzV0Z2rul1tERERGpAyooP+RywsE1UJQDIdHQuj8FJS4eW7SFxZuPAdC7dWNm3JFAXJMgi8OKiIg0XCoo39v+ISx7GBxHqzaVBsUwteweFhcl4GO3Mf6ajjxwVXt8fewWBhUREWn4VFCgspy8cw9gqm32K87meabjHzGZwSMfIKFVhCXxREREvI0OBbiclUdOzignAHYb2GzwJ///kNAitP6ziYiIeCkVlINrqp3WOZMNsDuOVI4TERGRemFpQXnttddo06YNgYGB9O3bl/Xr19d/iKKc2h0nIiIiF8yygjJ//nxSU1N58skn2bRpEz179mTAgAHk5ubWb5CQ6NodJyIiIhfMsoLy8ssv87vf/Y5f//rXxMfHM3v2bIKCgvjXv/5Vv0Fa94ewWCpP5vwYG4S1qBwnIiIi9cKSglJWVsbGjRtJTk7+XxC7neTkZNLS0s4aX1paisPhqPZVa+w+MPCF716cWVK+ez1wWuU4ERERqReWFJQTJ07gdDqJjq5+2iQ6Oprs7Oyzxk+dOpXw8PCqr1atWtVuoPib4Y7/B2HNq28Pi63cHn9z7f4+ERER+Vke8RyURx55hNTU1KrXDoejbkpK50FnPUlWR05ERETqnyUFpWnTpvj4+JCTU/3OmJycHGJiYs4aHxAQQEBAQN0Hs/tA28vr/veIiIjIz7LkFI+/vz+9e/dm5cqVVdtcLhcrV64kKSnJikgiIiLiRiw7xZOamsqoUaPo06cPl1xyCTNnzqS4uJhf//rXVkUSERERN2FZQRk2bBjHjx9nypQpZGdnk5CQwLJly866cFZERES8j80Yc/YiNG7O4XAQHh5OQUEBYWFhVscRERGRc1CTz2+txSMiIiJuRwVFRERE3I4KioiIiLgdFRQRERFxOyooIiIi4nY84lH3Z/r+xqNaXTRQRERE6tT3n9vncgOxRxaUwsJCgNpfj0dERETqXGFhIeHh4T87xiOfg+JyuTh69CihoaHYbLbz/jnfLzqYlZWl56nUMc11/dFc1x/Ndf3RXNevuppvYwyFhYXExsZit//8VSYeeQTFbrfTsmXLWvt5YWFh+he+nmiu64/muv5oruuP5rp+1cV8/9KRk+/pIlkRERFxOyooIiIi4na8uqAEBATw5JNPEhAQYHWUBk9zXX801/VHc11/NNf1yx3m2yMvkhUREZGGzauPoIiIiIh7UkERERERt6OCIiIiIm5HBUVERETcjlcXlNdee402bdoQGBhI3759Wb9+vdWRPNrUqVO5+OKLCQ0NJSoqiltvvZVdu3ZVG1NSUkJKSgpNmjQhJCSEIUOGkJOTY1HihmPatGnYbDbGjx9ftU1zXbuOHDnCXXfdRZMmTWjUqBHdu3fn66+/rtpvjGHKlCk0b96cRo0akZyczJ49eyxM7JmcTidPPPEEbdu2pVGjRrRv355nnnmm2totmuvzs3r1am666SZiY2Ox2WwsWrSo2v5zmde8vDxGjhxJWFgYERER3HfffRQVFdVNYOOl5s2bZ/z9/c2//vUvs23bNvO73/3OREREmJycHKujeawBAwaYN954w2zdutVkZGSYG264wcTFxZmioqKqMX/4wx9Mq1atzMqVK83XX39t+vXrZ/r3729has+3fv1606ZNG9OjRw8zbty4qu2a69qTl5dnWr
dube69916zbt06s2/fPrN8+XKzd+/eqjHTpk0z4eHhZtGiRSYzM9PcfPPNpm3btub06dMWJvc8zz33nGnSpIlZvHix2b9/v1mwYIEJCQkxr7zyStUYzfX5+fjjj81jjz1m3n//fQOYhQsXVtt/LvM6cOBA07NnT7N27VrzxRdfmA4dOpgRI0bUSV6vLSiXXHKJSUlJqXrtdDpNbGysmTp1qoWpGpbc3FwDmFWrVhljjMnPzzd+fn5mwYIFVWN27NhhAJOWlmZVTI9WWFhoOnbsaFasWGGuvPLKqoKiua5dDz/8sLnssst+cr/L5TIxMTHmxRdfrNqWn59vAgICzNy5c+sjYoMxaNAg85vf/KbatsGDB5uRI0caYzTXteXMgnIu87p9+3YDmA0bNlSNWbp0qbHZbObIkSO1ntErT/GUlZWxceNGkpOTq7bZ7XaSk5NJS0uzMFnDUlBQAEBkZCQAGzdupLy8vNq8d+7cmbi4OM37eUpJSWHQoEHV5hQ017Xtww8/pE+fPtx+++1ERUWRmJjI3//+96r9+/fvJzs7u9p8h4eH07dvX813DfXv35+VK1eye/duADIzM/nyyy+5/vrrAc11XTmXeU1LSyMiIoI+ffpUjUlOTsZut7Nu3bpaz+SRiwVeqBMnTuB0OomOjq62PTo6mp07d1qUqmFxuVyMHz+eSy+9lG7dugGQnZ2Nv78/ERER1cZGR0eTnZ1tQUrPNm/ePDZt2sSGDRvO2qe5rl379u3jr3/9K6mpqTz66KNs2LCBsWPH4u/vz6hRo6rm9Mf+TtF818zkyZNxOBx07twZHx8fnE4nzz33HCNHjgTQXNeRc5nX7OxsoqKiqu339fUlMjKyTubeKwuK1L2UlBS2bt3Kl19+aXWUBikrK4tx48axYsUKAgMDrY7T4LlcLvr06cPzzz8PQGJiIlu3bmX27NmMGjXK4nQNyzvvvMPbb7/NnDlz6Nq1KxkZGYwfP57Y2FjNtZfxylM8TZs2xcfH56w7GnJycoiJibEoVcMxevRoFi9ezGeffUbLli2rtsfExFBWVkZ+fn618Zr3mtu4cSO5ubn06tULX19ffH19WbVqFbNmzcLX15fo6GjNdS1q3rw58fHx1bZ16dKFQ4cOAVTNqf5OuXATJ05k8uTJDB8+nO7du3P33XczYcIEpk6dCmiu68q5zGtMTAy5ubnV9ldUVJCXl1cnc++VBcXf35/evXuzcuXKqm0ul4uVK1eSlJRkYTLPZoxh9OjRLFy4kE8//ZS2bdtW29+7d2/8/PyqzfuuXbs4dOiQ5r2GrrnmGrZs2UJGRkbVV58+fRg5cmTVnzXXtefSSy8965b53bt307p1awDatm1LTExMtfl2OBysW7dO811Dp06dwm6v/tHk4+ODy+UCNNd15VzmNSkpifz8fDZu3Fg15tNPP8XlctG3b9/aD1Xrl916iHnz5pmAgADz5ptvmu3bt5v777/fREREmOzsbKujeawHHnjAhIeHm88//9wcO3as6uvUqVNVY/7whz+YuLg48+mnn5qvv/7aJCUlmaSkJAtTNxw/vIvHGM11bVq/fr3x9fU1zz33nNmzZ495++23TVBQkHnrrbeqxkybNs1ERESYDz74wGzevNnccsstuvX1PIwaNcq0aNGi6jbj999/3zRt2tRMmjSpaozm+vwUFhaa9PR0k56ebgDz8ssvm/T0dHPw4EFjzLnN68CBA01iYqJZt26d+fLLL03Hjh11m3Fd+Mtf/mLi4uKMv7+/ueSSS8zatWutjuTRgB/9euONN6rGnD592vzxj380jRs3NkFBQea2224zx44dsy50A3JmQdFc166PPvrIdOvWzQQEBJjOnTub//u//6u23+VymSeeeMJER0ebgIAAc80115hdu3ZZlNZzORwOM27cOBMXF2cCAwNNu3btzGOPPWZKS0urxmiuz89nn332o39Hjxo1yhhzbvN68uRJM2LECBMSEmLCwsLMr3/9a1
NYWFgneW3G/ODxfCIiIiJuwCuvQRERERH3poIiIiIibkcFRURERNyOCoqIiIi4HRUUERERcTsqKCIiIuJ2VFBERETE7aigiIiIiNtRQRERERG3o4IiIiIibkcFRURERNyOCoqIiIi4nf8Py3doYew9ifEAAAAASUVORK5CYII=\n"
},
"metadata": {}
}
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.13"
},
"orig_nbformat": 4,
"colab": {
"provenance": [],
"gpuType": "T4",
"include_colab_link": true
},
"accelerator": "GPU"
},
"nbformat": 4,
"nbformat_minor": 0
}