How to send data to qryn

https://gigapipe.com/docs.html

Prerequisite: Sign up, then create an Organization and a Project.

Once the Project is created, two stacks are already added to it: Grafana and Observability.

STEP 1: Go to the Observability Stack and open Auth Tokens.

You will use these credentials (the API key and secret) to ingest data.
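If you want to sanity-check a token before configuring any agent, a quick authenticated read works. This is only a sketch: it assumes the same X-API-KEY / X-API-SECRET headers used for ingestion below are also accepted on qryn's Loki-compatible read endpoints (here /loki/api/v1/labels); adjust if your stack expects basic auth instead.

# Hypothetical token check: list label names from qryn's Loki-compatible API.
# Assumes the X-API-KEY / X-API-SECRET headers are accepted on read endpoints.
import requests

resp = requests.get(
    "https://qryn.trials.gigapipe.io/loki/api/v1/labels",
    headers={
        "X-API-KEY": "<your-api-key>",        # placeholder, from Auth Tokens
        "X-API-SECRET": "<your-api-secret>",  # placeholder, from Auth Tokens
    },
    timeout=10,
)
print(resp.status_code, resp.text)  # a 200 response means the token is accepted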

STEP 2: Send data.

You can send logs, metrics, and traces to Gigapipe.

Logs:

Install the Grafana Agent and use the following configuration:

loki:
  configs:
  - name: default
    positions:
      filename: /tmp/positions.yaml
    scrape_configs:
      - job_name: system
        static_configs:
          - targets: [localhost]
            labels:
              job: varlogs
              __path__: /var/log/*log
    clients:
      - url: https://qryn.trials.gigapipe.io/loki/api/v1/push
        headers:
          X-API-KEY: ******************
          X-API-SECRET: *****************************
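Before (or instead of) wiring up the agent, you can confirm the endpoint and credentials by pushing a single log line by hand. This is a minimal sketch using Python's requests library against the same push URL and headers as above; the stream labels and the message are arbitrary placeholders.

# Minimal sketch: push one log line to qryn's Loki-compatible push endpoint.
# Replace the placeholder key/secret with the Auth Token values from STEP 1.
import time
import requests

QRYN_PUSH_URL = "https://qryn.trials.gigapipe.io/loki/api/v1/push"

# Loki push payload: one stream with one [timestamp_ns, line] entry.
payload = {
    "streams": [
        {
            "stream": {"job": "manual-test", "host": "localhost"},  # placeholder labels
            "values": [[str(time.time_ns()), "hello from a manual push"]],
        }
    ]
}

resp = requests.post(
    QRYN_PUSH_URL,
    json=payload,
    headers={
        "X-API-KEY": "<your-api-key>",        # from STEP 1
        "X-API-SECRET": "<your-api-secret>",  # from STEP 1
    },
    timeout=10,
)
print(resp.status_code, resp.text)  # Loki-style push endpoints typically return 204 on success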

Metrics:

Install Vector and use the following configuration:

[sources.prom_scrape]
type = "prometheus_scrape"
endpoints = [ "http://127.0.0.1:9100/metrics","http://127.0.0.2:9100/metrics" ]
scrape_interval_secs = 15
instance_tag = "instance"
endpoint_tag = "endpoint"

  [sources.prom_scrape.query]
  "match[]" = [ "{job=\"vector\"}", "{__name__=~\"job:.*\"}" ]

[sinks.prom_write_remote]
type = "prometheus_remote_write"
inputs = [ "prom_scrape" ]
endpoint = "https://qryn.trials.gigapipe.io/prom/remote/write"
auth.strategy = "basic"
auth.user = "********************"
auth.password = "***********************************"
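Once Vector is shipping samples, you can check that they arrived by querying them back through qryn's Prometheus-compatible query API. The following is only a sketch: it assumes the same user/password pair configured for remote write above is also accepted as HTTP basic auth on the read path, and node_cpu_seconds_total is just an example metric from a node_exporter target like the ones scraped above.

# Minimal sketch: instant query against qryn's Prometheus-compatible API.
# Assumes the key/secret from STEP 1 also work as basic auth on reads.
import requests

resp = requests.get(
    "https://qryn.trials.gigapipe.io/api/v1/query",  # Prometheus-style instant query
    params={"query": "node_cpu_seconds_total"},      # swap in any metric you actually scrape
    auth=("<your-api-key>", "<your-api-secret>"),    # same values as auth.user / auth.password
    timeout=10,
)
resp.raise_for_status()
for series in resp.json().get("data", {}).get("result", []):
    print(series["metric"], series["value"])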

Traces:

Install the qryn-otel-collector and use the following configuration:

extensions:
  health_check:
  pprof:
    endpoint: 0.0.0.0:1777
  zpages:
    endpoint: 0.0.0.0:55679

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

  opencensus:
    endpoint: 0.0.0.0:55678

  # Collect own metrics
  prometheus:
    config:
      scrape_configs:
      - job_name: 'otel-collector'
        scrape_interval: 15s
        static_configs:
          - targets: ['0.0.0.0:9100', '192.168.0.167:9100', '192.168.0.115:9100', '192.168.0.116:9100']

  jaeger:
    protocols:
      grpc:
        endpoint: 0.0.0.0:14250
      thrift_binary:
        endpoint: 0.0.0.0:6832
      thrift_compact:
        endpoint: 0.0.0.0:6831
      thrift_http:
        endpoint: 0.0.0.0:14268

  zipkin:
    endpoint: 0.0.0.0:9411

processors:
  batch:

exporters:
  debug:
    verbosity: detailed
  otlphttp:
    compression: none
    # replace username:password with your API key and secret from STEP 1
    endpoint: https://username:password@qryn.trials.gigapipe.io
    tls:
      insecure_skip_verify: true

service:

  pipelines:

    traces:
      receivers: [otlp, opencensus, jaeger, zipkin]
      processors: [batch]
      exporters: [debug, otlphttp]

    metrics:
      receivers: [otlp, opencensus, prometheus]
      processors: [batch]
      exporters: [debug]

    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]

  extensions: [health_check, pprof, zpages]
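With the collector running, any OTLP client pointed at the receiver ports above can send a test trace. Below is a minimal sketch using the OpenTelemetry Python SDK (packages opentelemetry-sdk and opentelemetry-exporter-otlp-proto-http), exporting one span over OTLP/HTTP to the 0.0.0.0:4318 receiver defined in the config; the service and span names are placeholders.

# Minimal sketch: emit one test span to the collector's OTLP/HTTP receiver on port 4318.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

provider = TracerProvider(resource=Resource.create({"service.name": "trace-smoke-test"}))
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("smoke-test")
with tracer.start_as_current_span("hello-qryn"):
    pass  # the span is queued for export when it ends

provider.shutdown()  # flush the batch processor before exit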