@nkhitrov
Last active October 15, 2024 04:13
Configure uvicorn logs with loguru for FastAPI
"""
WARNING: dont use loguru, use structlog
https://gist.github.com/nkhitrov/38adbb314f0d35371eba4ffb8f27078f
Configure handlers and formats for application loggers.
"""
import logging
import sys
from pprint import pformat
# if you dont like imports of private modules
# you can move it to typing.py module
from loguru import logger
from loguru._defaults import LOGURU_FORMAT
class InterceptHandler(logging.Handler):
    """
    Default handler from the examples in the loguru documentation.

    See https://loguru.readthedocs.io/en/stable/overview.html#entirely-compatible-with-standard-logging
    """

    def emit(self, record: logging.LogRecord) -> None:
        # Get the corresponding loguru level if it exists
        try:
            level = logger.level(record.levelname).name
        except ValueError:
            level = record.levelno

        # Find the caller from which the logged message originated
        frame, depth = logging.currentframe(), 2
        while frame.f_code.co_filename == logging.__file__:
            frame = frame.f_back
            depth += 1

        logger.opt(depth=depth, exception=record.exc_info).log(
            level, record.getMessage()
        )
def format_record(record: dict) -> str:
    """
    Custom format for loguru loggers.

    Uses pformat to log any data, such as a request/response body, during debug.
    Also works with standard logging records once loguru intercepts them.

    Example:
    >>> payload = [{"users": [{"name": "Nick", "age": 87, "is_active": True},
    ...                       {"name": "Alex", "age": 27, "is_active": True}],
    ...             "count": 2}]
    >>> logger.bind(payload=payload).debug("users payload")
    [   {   'count': 2,
            'users': [   {'age': 87, 'is_active': True, 'name': 'Nick'},
                         {'age': 27, 'is_active': True, 'name': 'Alex'}]}]
    """
    format_string = LOGURU_FORMAT
    if record["extra"].get("payload") is not None:
        record["extra"]["payload"] = pformat(
            record["extra"]["payload"], indent=4, compact=True, width=88
        )
        format_string += "\n<level>{extra[payload]}</level>"

    format_string += "{exception}\n"
    return format_string
def init_logging() -> None:
    """
    Replaces the logging handlers with InterceptHandler so that standard
    logging records are routed through loguru.

    WARNING!
    If you call init_logging in a startup event handler,
    the first logs emitted before the application starts will be in the old format:
    >>> app.add_event_handler("startup", init_logging)
    stdout:
    INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
    INFO:     Started reloader process [11528] using statreload
    INFO:     Started server process [6036]
    INFO:     Waiting for application startup.
    2020-07-25 02:19:21.357 | INFO     | uvicorn.lifespan.on:startup:34 - Application startup complete.
    """

    # Disable handlers for specific uvicorn loggers
    # to redirect their output to the default uvicorn logger;
    # works with uvicorn==0.11.6
    loggers = (
        logging.getLogger(name)
        for name in logging.root.manager.loggerDict
        if name.startswith("uvicorn.")
    )
    for uvicorn_logger in loggers:
        uvicorn_logger.handlers = []

    # Change the handler for the default uvicorn logger
    intercept_handler = InterceptHandler()
    logging.getLogger("uvicorn").handlers = [intercept_handler]

    # Set the log sink, level and format
    logger.configure(
        handlers=[{"sink": sys.stdout, "level": logging.DEBUG, "format": format_record}]
    )
"""
WARNING: dont use loguru, use structlog
https://gist.github.com/nkhitrov/38adbb314f0d35371eba4ffb8f27078f
Gist for original issue https://github.com/tiangolo/fastapi/issues/1276#issuecomment-663748916
"""
from fastapi import FastAPI
from starlette.requests import Request
from logger import init_logging
app = FastAPI(title="Test Uvicorn Handlers")
init_logging()
# view.py
@app.get("/")
def index(request: Request) -> None:
logger.info("loguru info log")
logging.info("logging info log")
logging.getLogger("fastapi").debug("fatapi info log")
logger.bind(payload=dict(request.query_params)).debug("params with formating")
return None
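Assuming the two files above are saved as `logger.py` and `main.py`, the app can be started with uvicorn as usual (the port is uvicorn's default; the query string is just an example to exercise the payload formatting):

```shell
# requires: pip install fastapi uvicorn loguru
uvicorn main:app --reload

# in another terminal, trigger the payload-formatting debug log:
curl "http://127.0.0.1:8000/?name=Nick&age=87"
```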
@remster85

remster85 commented Dec 31, 2020

Hi @SlyFoxy, thanks for this contribution. Is this usable to log to Elasticsearch with Filebeat?
If not, would you have any sample or recommendation, please?
Thanks!

@climberjase

This is how I have my filebeat.yml file set up to do exactly that (using Docker):

Logging format: <green>{time}</green> | <level>{level: <8}</level> | <blue>{extra[client-id]: <8}</blue> | <cyan>{name}:{function}:{line}{extra[padding]}</cyan> | <level>{message}</level>\n{exception}

filebeat.yml

cloud.id: ${LOGSTASH_CLOUD_ID}
cloud.auth: "elastic:${LOGSTASH_PASSWORD}"

filebeat.inputs:
  - type: container
    paths:
      - '/var/lib/docker/containers/*/*.log'

processors:
  - add_docker_metadata:
      host: "unix:///var/run/docker.sock"

  # Process our Python Loguru logs
  - dissect:
      when:
        equals:
          container.name: "my_python_service"
      tokenizer: "%{@timestamp} | %{log.level} | %{organization.id} | %{log.origin.function}:%{log.origin.file.name}:%{log.origin.file.line} | %{message}"
      field: "message"
      trim_values: "all"
      ignore_failure: true
      overwrite_keys: true
      target_prefix: "my_python_service"

  # Treat other logs as JSON
  - decode_json_fields:
      when:
        not:
          equals:
            docker.container.name: "my_docker_service_name"
      fields: ["message"]
      target: "json"
      overwrite_keys: true

output.elasticsearch:
  indices:
    - index: "filebeat-my_service-%{+yyyy.MM.dd}"

logging.json: true
logging.metrics.enabled: false

I'm using Docker Compose to start both my Python project and Filebeat as containers, and then forwarding the container
logs for my Python project off to ElasticSearch with formatting, and all other logs as JSON or plain text.

Sample docker-compose.yml

version: "3.8"

services:
  my_python_service:
    build: .
    container_name: my_python_service

  filebeat:
    image: "docker.elastic.co/beats/filebeat:7.2.0"
    container_name: filebeat
    user: root
    environment:
      - LOGSTASH_CLOUD_ID=${LOGSTASH_CLOUD_ID}
      - LOGSTASH_USERNAME=${LOGSTASH_USERNAME}
      - LOGSTASH_PASSWORD=${LOGSTASH_PASSWORD}
    volumes:
      - ./filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
      - /var/lib/docker:/var/lib/docker:ro
      - /var/run/docker.sock:/var/run/docker.sock

If you don't use Docker you should be able to figure it out from the above. I've had this working in Python 3.9.

@lyutian623

perfect solution! 💯 🚀

@JLHasson

FWIW with uvicorn==0.18.2 the above did not print the access logs (i.e. GET /). I needed to add

logging.getLogger("uvicorn.access").handlers = [intercept_handler]

@psadi

psadi commented Oct 21, 2022

Hi, I'm able to get this working. However, the default uvicorn logs are not being shown; logs appear only when they are explicitly written to the console.
Am I missing anything here? Thanks.

@nkhitrov
Author

Hello there! I don't recommend using loguru anymore. If you want custom formatting and other cool features, see this new example with structlog

@xsevy

xsevy commented Mar 16, 2023

@nkhitrov what is the reason for not using loguru?

@nkhitrov
Author

  1. It cannot work out of the box with libraries that use default logging handlers (e.g. sentry-sdk), because loguru writes to stdout directly (bypassing logging).
  2. When you have an error in a formatter function, loguru crashes without any useful info.

structlog is a more powerful library. You can write your own processors to modify logs, and you can use plugins.

@nosid91

nosid91 commented Nov 15, 2023

@nkhitrov
Author

@nosid91

  1. docs.sentry.io/platforms/python/integrations/loguru

Check the dates:
the Sentry SDK loguru integration docs are from ~6 months ago;
my comment is from ~8 months ago.

  2. If that's it, you haven't convinced me)

It is your choice. I don't have a goal to convince anyone. I'm just sharing my knowledge and the problems I've discovered.
