Import timeouts not visible in UI #58995

@mikoloay

Description

Apache Airflow version

3.1.3

If "Other Airflow 2/3 version" selected, which one?

No response

What happened?

Hi, maybe this should be marked as a feature request, but I decided to file it as a bug since this behaviour is present in Airflow 2.

In Airflow 3, when a dag script takes too long to parse and a dagbag parsing timeout occurs, the dag processor subprocess is immediately killed and no information appears in the UI, as if the dag script didn't exist.

Here's a log from a dag processor pod:
[error ] Processor for DagFileInfo(rel_path=PosixPath('repo/timeout_dag.py'), bundle_name='dags-folder', bundle_path=PosixPath('/opt/airflow/dags'), bundle_version=None) with PID 115 started 50 ago killing it.

What you think should happen instead?

Airflow users should still be able to see timeout errors in the Dag Import Errors section in the UI.
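For context, the Airflow 2 behaviour being asked for can be sketched generically: catch the timeout while parsing the file and record it, instead of killing the process with no trace. This is not Airflow's actual implementation; the names (`import_errors`, `ParseTimeout`, `parse_file`) are illustrative, and the `SIGALRM`-based timeout only works on Unix in the main thread.

```python
# Minimal sketch (not Airflow's real code): surface a parse timeout as a
# recorded import error instead of silently killing the parser.
import signal
import time

PARSE_TIMEOUT_SECONDS = 1
import_errors: dict[str, str] = {}  # analogous to DagBag.import_errors


class ParseTimeout(Exception):
    pass


def _on_timeout(signum, frame):
    raise ParseTimeout(f"parsing exceeded {PARSE_TIMEOUT_SECONDS}s")


def parse_file(path: str, parse) -> None:
    """Run `parse()` under a timeout; record a timeout as an import error."""
    signal.signal(signal.SIGALRM, _on_timeout)
    signal.alarm(PARSE_TIMEOUT_SECONDS)
    try:
        parse()
    except ParseTimeout as exc:
        # Recorded here, so the UI can list it under Dag Import Errors
        # rather than the file vanishing without explanation.
        import_errors[path] = str(exc)
    finally:
        signal.alarm(0)  # always cancel the pending alarm


# A stand-in for a dag file that blocks at import time:
parse_file("repo/timeout_dag.py", lambda: time.sleep(10))
print(import_errors)
```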

How to reproduce

The following script reproduces this behaviour:

from airflow.sdk import DAG
import time

with DAG(
    dag_id="timeout_dag",
    schedule=None,
):
    # Block at parse time for longer than the dag processor's timeout.
    time.sleep(10000)
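To trigger the timeout faster, the relevant limits can be lowered in airflow.cfg. This is a sketch assuming the Airflow 2-era option names under `[core]`; the section and names may differ in Airflow 3, so check `airflow config list` for your version:

```ini
[core]
# Seconds a single DAG file may take to import before the parse is aborted
dagbag_import_timeout = 5
# Seconds before the dag processor kills the parsing subprocess
# (the "started 50 ago" in the log above matches the default of 50)
dag_file_processor_timeout = 10
```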

Operating System

Debian GNU/Linux 12 (bookworm)

Versions of Apache Airflow Providers

No response

Deployment

Official Apache Airflow Helm Chart

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!
