Ben Browning (bbrowning)
Red Hat, Cambridge, MA
diff --git a/tests/entrypoints/openai/tool_parsers/test_pythonic_tool_parser.py b/tests/entrypoints/openai/tool_parsers/test_pythonic_tool_parser.py
index fbbbc1fb2..1f953706b 100644
--- a/tests/entrypoints/openai/tool_parsers/test_pythonic_tool_parser.py
+++ b/tests/entrypoints/openai/tool_parsers/test_pythonic_tool_parser.py
@@ -52,6 +52,27 @@ ESCAPED_STRING_FUNCTION_CALL = FunctionCall(
name="get_weather",
arguments='{"city": "Martha\'s Vineyard", "metric": "\\"cool units\\""}',
)
+PYTHON_TAGS_FUNCTION_OUTPUT="<|python_start|>[get_weather(city='San Francisco', metric='celsius')]<|python_end|>"
+PYTHON_TAGS_FUNCTION_CALL = FunctionCall(
diff --git a/llama_stack/providers/remote/inference/vllm/vllm.py b/llama_stack/providers/remote/inference/vllm/vllm.py
index 8bc733fd..eaea63f8 100644
--- a/llama_stack/providers/remote/inference/vllm/vllm.py
+++ b/llama_stack/providers/remote/inference/vllm/vllm.py
@@ -161,45 +161,52 @@ def _convert_to_vllm_finish_reason(finish_reason: str) -> StopReason:
async def _process_vllm_chat_completion_stream_response(
stream: AsyncGenerator[OpenAIChatCompletionChunk, None],
) -> AsyncGenerator:
- event_type = ChatCompletionResponseEventType.start
- tool_call_buf = UnparseableToolCall()
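The hunk above removes the eager `event_type` / `tool_call_buf` state from the stream handler. As a rough illustration of the underlying pattern of accumulating streamed chunks before parsing, here is a minimal standalone sketch; the names and data are hypothetical, not vLLM's or Llama Stack's actual API:

```python
import asyncio
from typing import AsyncGenerator


async def collect_chunks(stream: AsyncGenerator[str, None]) -> str:
    # Accumulate streamed text pieces into one buffer, then join once at the end.
    buf: list[str] = []
    async for chunk in stream:
        buf.append(chunk)
    return "".join(buf)


async def fake_stream():
    # Stand-in for a chat-completion chunk stream (hypothetical data).
    for piece in ["[get_weather(", "city='SF')]"]:
        yield piece


print(asyncio.run(collect_chunks(fake_stream())))
```

Buffering the full text before parsing trades streaming latency for the ability to parse a complete pythonic call list in one pass.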
@bbrowning
bbrowning / llama4_pythonic.diff
Created May 14, 2025 00:42
diff of changes to llama 4 pythonic tool parser
diff --git a/tests/entrypoints/openai/tool_parsers/test_pythonic_tool_parser.py b/tests/entrypoints/openai/tool_parsers/test_pythonic_tool_parser.py
index fbbbc1fb2..5d232f44a 100644
--- a/tests/entrypoints/openai/tool_parsers/test_pythonic_tool_parser.py
+++ b/tests/entrypoints/openai/tool_parsers/test_pythonic_tool_parser.py
@@ -52,6 +52,16 @@ ESCAPED_STRING_FUNCTION_CALL = FunctionCall(
name="get_weather",
arguments='{"city": "Martha\'s Vineyard", "metric": "\\"cool units\\""}',
)
+PYTHON_TAGS_FUNCTION_OUTPUT="<|python_start|>[get_weather(city='San Francisco', metric='celsius')]<|python_end|>"
+PYTHON_TAGS_FUNCTION_CALL = FunctionCall(
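The `<|python_start|>` / `<|python_end|>` markers in the new test fixture are Llama 4 output tags that wrap the pythonic call list, so they must be stripped before parsing. A minimal sketch of that stripping, with a hypothetical helper name (not the parser's real function):

```python
PYTHON_START = "<|python_start|>"
PYTHON_END = "<|python_end|>"


def strip_python_tags(output: str) -> str:
    # Remove the wrapping tags if present; leave untagged output untouched.
    if output.startswith(PYTHON_START):
        output = output[len(PYTHON_START):]
    if output.endswith(PYTHON_END):
        output = output[:-len(PYTHON_END)]
    return output


print(strip_python_tags(
    "<|python_start|>[get_weather(city='SF', metric='celsius')]<|python_end|>"))
```

Checking for the tags rather than unconditionally slicing keeps untagged model output (the pre-existing test cases) working unchanged.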
diff --git a/vllm/entrypoints/openai/tool_parsers/pythonic_tool_parser.py b/vllm/entrypoints/openai/tool_parsers/pythonic_tool_parser.py
index bb91a35af..2947bc8db 100644
--- a/vllm/entrypoints/openai/tool_parsers/pythonic_tool_parser.py
+++ b/vllm/entrypoints/openai/tool_parsers/pythonic_tool_parser.py
@@ -189,6 +189,8 @@ def _get_parameter_value(val: ast.expr) -> Any:
}
elif isinstance(val, ast.List):
return [_get_parameter_value(v) for v in val.elts]
+ elif isinstance(val, ast.Name) and val.id in ["true", "false"]:
+ return True if val.id == "true" else False
diff --git a/tests/v1/engine/test_processor.py b/tests/v1/engine/test_processor.py
new file mode 100644
index 000000000..c793290d4
--- /dev/null
+++ b/tests/v1/engine/test_processor.py
@@ -0,0 +1,196 @@
+# SPDX-License-Identifier: Apache-2.0
+
+import json
+import math
@bbrowning
bbrowning / REPORT.md
Created April 13, 2025 17:30
Llama Stack OpenAI API Verification Report

Test Results Report

Generated on: 2025-04-13 13:12:49

This report was generated by running python tests/verifications/generate_report.py

Legend

  • ✅ - Test passed
  • ❌ - Test failed
version: '2'
image_name: openai
apis:
- agents
- datasetio
- eval
- inference
- safety
- scoring
- telemetry
@bbrowning
bbrowning / 0_README.md
Last active April 28, 2020 17:57
Deploying OpenShift Serverless 1.7.0 stage builds on external OpenShift 4.3 clusters

Note: This has only been tested on OCP 4.3 and 4.4 clusters.

Disable the default OLM operator sources:

oc patch OperatorHub cluster --type json \
  -p '[{"op": "add", "path": "/spec/disableAllDefaultSources", "value": true}]'

Download the imageContentSourcePolicy.yaml from this gist and apply it.

@bbrowning
bbrowning / activationStats.sh
Created April 12, 2018 16:11
Useful script to display recent activation stats from OpenWhisk
#!/usr/bin/env bash
# set -x
set -e

func=$1
count=$2
if [ "${func}x" = "x" ]; then
  echo "You must supply a function as the first argument"
  exit 1
fi
@bbrowning
bbrowning / openshift_instructions.md
Last active August 22, 2018 15:59
Running Apache OpenWhisk on OpenShift


Prerequisites

These instructions assume you are using Minishift 1.0.1 or newer as your OpenShift installation.

You'll also need a wsk binary in your $PATH to interact with OpenWhisk after it's deployed. Download the latest version for your OS