
Field severity_number is not mapped to field severity_text as expected #35891

Closed
Wudadada opened this issue Oct 21, 2024 · 5 comments
Labels
bug (Something isn't working) · needs triage (New item requiring triage) · processor/transform (Transform processor)

Comments

@Wudadada

Component(s)

processor/transform

What happened?

Description

I want to map the field severity_number to the field severity_text, e.g. severity_number 9 to severity_text 'INFO', but it didn't take effect. Is this a bug or an error in my config?

Steps to Reproduce

See my configuration below.

Expected Result

Field severity_number is mapped to field severity_text (e.g. severity_number 9 becomes severity_text 'INFO').

Actual Result

The mapping did not take effect; severity_text stayed empty.

Collector version

v0.111.0

Environment information

Environment

OS: CentOS 8

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
  prometheus:
    config:
      scrape_configs:
        - job_name: 'otel-collector'
          scrape_interval: 5s
          static_configs:
            - targets: ['0.0.0.0:19100']

processors:
  batch:
    timeout: 5s
    send_batch_size: 100000
  transform:
    log_statements:
      - context: log
        statements:
          - set(severity_text, "TRACE") where severity_number == 1
          - set(severity_text, "DEBUG") where severity_number == 5
          - set(severity_text, "INFO") where severity_number == 9
          - set(severity_text, "WARN") where severity_number == 13
          - set(severity_text, "ERROR") where severity_number == 17
          - set(severity_text, "FATAL") where severity_number == 21

exporters:
  clickhouse:
    endpoint: tcp://10.105.212.248:9000?dial_timeout=10s
    create_schema: true
    database: otel
    async_insert: true
    ttl: 72h
    compress: lz4
    timeout: 5s
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s
      max_elapsed_time: 300s
    username: "default"
    password: "123"
    cluster_name: cluster_3S_1R
    table_engine:
      name: "ReplicatedMergeTree"

    logs_table_name: otel_logs

    traces_table_name: otel_traces

    metrics_tables:
      gauge: 
        name: "otel_metrics_gauge"
      sum: 
        name: "otel_metrics_sum"
      summary: 
        name: "otel_metrics_summary"
      histogram: 
        name: "otel_metrics_histogram"
      exponential_histogram: 
        name: "otel_metrics_exp_histogram"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [clickhouse]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [clickhouse]
    metrics:
      receivers: [otlp, prometheus]
      processors: [batch]
      exporters: [clickhouse]

  telemetry:
    logs:
      level: "debug"

Log output

Oct 21 15:23:04 xxx otelcol-contrib[3819905]: 2024-10-21T15:23:04.742+0800#011debug#[email protected]/exporter_logs.go:127#011insert logs#011{"kind": "exporter", "data_type": "logs", "name": "clickhouse", "records": 1, "cost": "242.2003ms"}

Additional context

No response

@Wudadada added the bug and needs triage labels on Oct 21, 2024
@github-actions bot added the processor/transform label on Oct 21, 2024
@github-actions (Contributor)

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@TylerHelmuth (Member)

Can you add a debug exporter or enable debug logging and check whether the value is changing?
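
For reference, a minimal sketch of wiring in the debug exporter, assuming the rest of the configuration above stays unchanged (with verbosity: detailed, the debug exporter prints each log record in full, so you can see severity_text before it reaches ClickHouse):

exporters:
  debug:
    # prints full log records, including SeverityText and SeverityNumber
    verbosity: detailed

service:
  pipelines:
    logs:
      receivers: [otlp]
      processors: [batch]
      # fan out to both exporters so you see the records and still store them
      exporters: [clickhouse, debug]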

@Wudadada (Author)


Yeah, here is the log:


Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Trace ID: 9a3209a5d0a291e5968f3aff2df77ac3
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Span ID: 4ea2aed32ce714f4
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Flags: 1
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: ResourceLog #1
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Resource SchemaURL:
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Resource attributes:
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]:     -> host.name: Str()
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]:     -> service.name: Str(n9e)
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]:     -> service.version: Str(V7)
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: ScopeLogs #0
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: ScopeLogs SchemaURL:
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: InstrumentationScope otel-logger
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: LogRecord #0
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: ObservedTimestamp: 2024-10-22 06:11:40.105141836 +0000 UTC
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Timestamp: 2024-10-22 06:11:40.105137993 +0000 UTC
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: SeverityText:
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: SeverityNumber: Info(9)
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Body: Str(dop调用接口)
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Trace ID: 41f08a168e2898137394310a4bf052ca
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Span ID: 6eb22414a0bc3492
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: Flags: 1
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: #011{"kind": "exporter", "data_type": "logs", "name": "debug"}
Oct 22 14:11:41 yptjkcshj-Linux-005 otelcol-contrib[3898454]: 2024-10-22T14:11:41.738+0800#011debug#[email protected]/exporter_logs.go:127#011insert logs#011{"kind": "exporter", "data_type": "logs", "name": "clickhouse", "records": 17, "cost": "83.790901ms"}

@TylerHelmuth (Member)

@Wudadada you need to include the transform processor in the service.pipelines.logs.processors.
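
In other words, defining the processor under processors: is not enough; it only runs if a pipeline references it. A sketch of the corrected logs pipeline, keeping everything else as-is:

service:
  pipelines:
    logs:
      receivers: [otlp]
      # transform must be listed here for the set(...) statements to run
      processors: [batch, transform]
      exporters: [clickhouse]

As a side note: per the OpenTelemetry log data model, each severity name covers a range of numbers (TRACE 1-4, DEBUG 5-8, INFO 9-12, WARN 13-16, ERROR 17-20, FATAL 21-24), so exact matches like severity_number == 9 will miss records with 10-12; range conditions such as where severity_number >= 9 and severity_number < 13 are more robust.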

@Wudadada (Author)

> @Wudadada you need to include the transform processor in the service.pipelines.logs.processors.

Thanks a lot
