Add tests for startup log behavior in tracer libraries#6241

Draft
bm1549 wants to merge 5 commits into main from brian.marks/startup-log

Conversation


@bm1549 bm1549 commented Feb 6, 2026

Motivation

Add comprehensive tests for tracer startup log behavior across supported languages. These tests verify that tracers correctly emit startup configuration logs when DD_TRACE_STARTUP_LOGS=true and suppress them when DD_TRACE_STARTUP_LOGS=false, ensuring proper observability and configuration visibility.

Changes

Adds tests/parametric/test_startup_logs.py with three test cases:

  1. test_startup_logs_enabled: Verifies that startup logs are emitted when DD_TRACE_STARTUP_LOGS=true

    • For Node.js: Triggers a trace to ensure startup logs are emitted (they appear when the tracer sends its first trace)
    • For .NET: Reads from the log file /dotnet-tracer-managed-ApmTestApi-1.log (startup logs are written to a file instead of stdout/stderr)
    • For other languages: Captures logs from container stdout/stderr
    • Validates presence of startup log pattern: DATADOG (TRACER )?CONFIGURATION( - (CORE|TRACING|PROFILING|.*))?
    • Marked as incomplete for php, cpp, and rust (need to figure out how to test these)
  2. test_startup_logs_disabled: Verifies that startup logs are suppressed when DD_TRACE_STARTUP_LOGS=false

    • Uses the same checking behavior as the enabled test to ensure consistency
    • For Node.js: Triggers a trace to verify logs would be emitted if enabled
    • For .NET: Checks that the log file is empty or doesn't contain startup logs
    • For other languages: Verifies startup log pattern is not present in container logs
    • Marked as incomplete for php, cpp, and rust
  3. test_startup_logs_diagnostic_agent_unreachable: Verifies diagnostic messages appear when the agent is unreachable

    • Tests that tracers output appropriate diagnostic messages when unable to connect to the agent
    • Searches for various diagnostic patterns (connection refused, DNS failures, etc.)
    • Marked as missing feature for cpp, rust, and dotnet (some tracers may not output diagnostic messages in startup logs)
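The checks described above can be sketched roughly as follows. This is a minimal illustration, not the actual test code: only the startup-log regex is quoted from this description; the helper names and the diagnostic pattern list are assumptions.

```python
import re

# Startup log pattern quoted in this PR description; the optional groups
# cover tracers that split configuration output into sections.
STARTUP_LOG_PATTERN = re.compile(
    r"DATADOG (TRACER )?CONFIGURATION( - (CORE|TRACING|PROFILING|.*))?"
)

# Illustrative diagnostic patterns (assumption: the real test searches a
# longer list covering connection refused, DNS failures, etc.)
DIAGNOSTIC_PATTERNS = [
    re.compile(r"connection refused", re.IGNORECASE),
    re.compile(r"ECONNREFUSED"),
]


def contains_startup_log(logs: str) -> bool:
    """True if the captured logs contain a startup configuration line."""
    return STARTUP_LOG_PATTERN.search(logs) is not None


def contains_diagnostic(logs: str) -> bool:
    """True if any agent-unreachable diagnostic message is present."""
    return any(p.search(logs) for p in DIAGNOSTIC_PATTERNS)


assert contains_startup_log("DATADOG TRACER CONFIGURATION - TRACING {}")
assert not contains_startup_log("ordinary application output")
assert contains_diagnostic("Error: connect ECONNREFUSED 127.0.0.1:8126")
```

The enabled test asserts `contains_startup_log(...)` is true, the disabled test asserts it is false, and the agent-unreachable test asserts `contains_diagnostic(...)` is true for languages where the feature exists.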

The tests handle language-specific differences:

  • Node.js requires trace triggering to emit startup logs
  • .NET writes to a file instead of stdout/stderr
  • Other languages output to stdout/stderr
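As a rough sketch, the dispatch over these language differences might look like the following. The function and the `"stdout/stderr"` sentinel are purely illustrative (not the actual system-tests API); only the .NET log file path comes from the description above.

```python
# Hypothetical sketch of per-language log-source selection; names are
# illustrative, not the real framework API.
DOTNET_LOG_FILE = "/dotnet-tracer-managed-ApmTestApi-1.log"


def startup_log_source(lang: str) -> str:
    """Decide where a test should look for a tracer's startup logs."""
    if lang == "dotnet":
        # .NET writes startup logs to a managed log file, not stdout/stderr.
        return DOTNET_LOG_FILE
    # All other languages (including Node.js) write to stdout/stderr.
    # Node.js additionally requires the test to trigger a trace first,
    # since its startup logs appear only with the first trace.
    return "stdout/stderr"


assert startup_log_source("dotnet") == DOTNET_LOG_FILE
assert startup_log_source("nodejs") == "stdout/stderr"
assert startup_log_source("python") == "stdout/stderr"
```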

Workflow

  1. ⚠️ Create your PR as draft ⚠️
  2. Work on your PR until the CI passes
  3. Mark it as ready for review
    • Test logic is modified? -> Get a review from the RFC owner.
    • Framework is modified, or non-obvious usage of it? -> Get a review from the R&P team.

🚀 Once your PR is reviewed and the CI is green, you can merge it!

🛟 #apm-shared-testing 🛟

Reviewer checklist

  • Anything but tests/ or manifests/ is modified? I have approval from the R&P team
  • A docker base image is modified?
    • the relevant build-XXX-image label is present
  • A scenario is added, removed or renamed?

@bm1549 bm1549 added the ai-generated The pull request includes a significant amount of AI-generated code label Feb 6, 2026

github-actions bot commented Feb 6, 2026

CODEOWNERS have been resolved as:

tests/parametric/test_startup_logs.py                                   @DataDog/system-tests-core @DataDog/apm-sdk-capabilities

datadog-datadog-prod-us1 bot commented Feb 6, 2026

✅ Tests

🎉 All green!

❄️ No new flaky tests detected
🧪 All tests passed

🔗 Commit SHA: f77dcff
