[v3-0-test] Unify connection not found exceptions between AF2 and AF3 (#52968) #53093

Merged 14 commits on Jul 11, 2025
Changes from 13 commits
16 changes: 8 additions & 8 deletions .github/actions/migration_tests/action.yml
Original file line number Diff line number Diff line change
@@ -24,9 +24,9 @@ runs:
- name: "Test migration file 2 to 3 migration: ${{env.BACKEND}}"
shell: bash
run: |
breeze shell "${{ env.AIRFLOW_2_CMD }}" --use-airflow-version 2.11.0 --answer y &&
breeze shell "export AIRFLOW__DATABASE__EXTERNAL_DB_MANAGERS=${{env.DB_MANGERS}}
${{ env.AIRFLOW_3_CMD }}" --no-db-cleanup
breeze shell "${AIRFLOW_2_CMD}" --use-airflow-version 2.11.0 --answer y &&
breeze shell "export AIRFLOW__DATABASE__EXTERNAL_DB_MANAGERS=${DB_MANGERS}
${AIRFLOW_3_CMD}" --no-db-cleanup
env:
COMPOSE_PROJECT_NAME: "docker-compose"
DB_RESET: "false"
@@ -47,9 +47,9 @@ runs:
- name: "Test ORM migration 2 to 3: ${{env.BACKEND}}"
shell: bash
run: >
breeze shell "${{ env.AIRFLOW_2_CMD }}" --use-airflow-version 2.11.0 --answer y &&
breeze shell "export AIRFLOW__DATABASE__EXTERNAL_DB_MANAGERS=${{env.DB_MANGERS}}
${{ env.AIRFLOW_3_CMD }}" --no-db-cleanup
breeze shell "${AIRFLOW_2_CMD}" --use-airflow-version 2.11.0 --answer y &&
breeze shell "export AIRFLOW__DATABASE__EXTERNAL_DB_MANAGERS=${DB_MANGERS}
${AIRFLOW_3_CMD}" --no-db-cleanup
env:
COMPOSE_PROJECT_NAME: "docker-compose"
DB_RESET: "false"
@@ -69,7 +69,7 @@ runs:
- name: "Test ORM migration ${{env.BACKEND}}"
shell: bash
run: >
breeze shell "export AIRFLOW__DATABASE__EXTERNAL_DB_MANAGERS=${{env.DB_MANAGERS}} &&
breeze shell "export AIRFLOW__DATABASE__EXTERNAL_DB_MANAGERS=${DB_MANAGERS} &&
airflow db reset -y &&
airflow db migrate --to-revision heads &&
airflow db downgrade -n 2.7.0 -y &&
@@ -86,7 +86,7 @@
shell: bash
run: >
breeze shell
"export AIRFLOW__DATABASE__EXTERNAL_DB_MANAGERS=${{env.DB_MANAGERS}} &&
"export AIRFLOW__DATABASE__EXTERNAL_DB_MANAGERS=${DB_MANAGERS} &&
airflow db reset -y &&
airflow db downgrade -n 2.7.0 -y &&
airflow db migrate -s"
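The hunks in this file swap GitHub expression interpolation (`${{ env.VAR }}`) inside `run:` scripts for plain shell expansion (`${VAR}`), with the value still supplied via the `env:` block. A minimal illustrative step following the same pattern (the step name and variable are hypothetical):

```yaml
# Sketch only: the value flows in through env: and is expanded by bash at
# runtime, rather than being spliced into the script text by GitHub's
# template engine (which zizmor flags as a potential injection vector).
- name: "Example step"
  shell: bash
  run: echo "value is ${MY_VALUE}"
  env:
    MY_VALUE: "hello"
```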
2 changes: 2 additions & 0 deletions .github/workflows/basic-tests.yml
@@ -267,6 +267,8 @@ jobs:
skip-pre-commits: ${{ inputs.skip-pre-commits }}
- name: "Autoupdate all pre-commits"
run: pre-commit autoupdate
- name: "Autoupdate Lucas-C/pre-commit-hooks to bleeding edge"
run: pre-commit autoupdate --bleeding-edge --freeze --repo https://github.com/Lucas-C/pre-commit-hooks
- name: "Run automated upgrade for black"
run: >
pre-commit run
5 changes: 3 additions & 2 deletions .pre-commit-config.yaml
@@ -49,7 +49,8 @@ repos:
- "--maxlevel"
- "2"
- repo: https://github.com/Lucas-C/pre-commit-hooks
rev: v1.5.5
# replace hash with a version once PR #103 is merged and released
rev: abdd8b62891099da34162217ecb3872d22184a51
hooks:
- id: insert-license
name: Add license for all SQL files
@@ -356,7 +357,7 @@
- --skip=providers/.*/src/airflow/providers/*/*.rst,providers/*/docs/changelog.rst,docs/*/commits.rst,providers/*/docs/commits.rst,providers/*/*/docs/commits.rst,docs/apache-airflow/tutorial/pipeline_example.csv,*.min.js,*.lock,INTHEWILD.md,*.svg
- --exclude-file=.codespellignorelines
- repo: https://github.com/woodruffw/zizmor-pre-commit
rev: v1.7.0
rev: v1.11.0
hooks:
- id: zizmor
name: Run zizmor to check for github workflow syntax errors
10 changes: 0 additions & 10 deletions airflow-core/docs/conf.py
@@ -120,7 +120,6 @@

PACKAGES_THAT_WE_SHOULD_ADD_TO_API_DOCS = {
"hooks",
"decorators",
"example_dags",
"executors",
"operators",
@@ -140,15 +139,6 @@

MODELS_THAT_SHOULD_BE_INCLUDED_IN_API_DOCS: set[str] = {
"baseoperator.py",
"connection.py",
"dag.py",
"dagrun.py",
"dagbag.py",
"param.py",
"taskinstance.py",
"taskinstancekey.py",
"variable.py",
"xcom.py",
}


2 changes: 1 addition & 1 deletion airflow-core/docs/core-concepts/params.rst
@@ -32,7 +32,7 @@ If the user-supplied values don't pass validation, Airflow shows a warning inste
DAG-level Params
----------------

To add Params to a :class:`~airflow.models.dag.DAG`, initialize it with the ``params`` kwarg.
To add Params to a :class:`~airflow.sdk.DAG`, initialize it with the ``params`` kwarg.
Use a dictionary that maps Param names to either a :class:`~airflow.sdk.definitions.param.Param` or an object indicating the parameter's default value.

.. code-block::
14 changes: 14 additions & 0 deletions airflow-core/docs/core-concepts/variables.rst
@@ -33,6 +33,20 @@ To use them, just import and call ``get`` on the Variable model::
# Returns the value of default (None) if the variable is not set
baz = Variable.get("baz", default=None)

You can also access variables through the Task Context using
:func:`~airflow.sdk.get_current_context`:

.. code-block:: python

from airflow.sdk import get_current_context


def my_task():
context = get_current_context()
var = context["var"]
my_variable = var.get("my_variable_name")
return my_variable

You can also use them from :ref:`templates <concepts:jinja-templating>`::

# Raw value
9 changes: 5 additions & 4 deletions airflow-core/docs/core-concepts/xcoms.rst
@@ -25,6 +25,9 @@ XComs (short for "cross-communications") are a mechanism that let :doc:`tasks` t

An XCom is identified by a ``key`` (essentially its name), as well as the ``task_id`` and ``dag_id`` it came from. They can have any serializable value (including objects that are decorated with ``@dataclass`` or ``@attr.define``, see :ref:`TaskFlow arguments <concepts:arbitrary-arguments>`:), but they are only designed for small amounts of data; do not use them to pass around large values, like dataframes.

XCom operations should be performed through the Task Context using
:func:`~airflow.sdk.get_current_context`. Updating XComs directly through the XCom database model is not possible.

XComs are explicitly "pushed" and "pulled" to/from their storage using the ``xcom_push`` and ``xcom_pull`` methods on Task Instances.

To push a value within a task called **"task-1"** that will be used by another task:
@@ -73,8 +76,6 @@ An example of pushing multiple XComs and pulling them individually:
# Pulling entire xcom data from push_multiple task
data = context["ti"].xcom_pull(task_ids="push_multiple", key="return_value")
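The push/pull flow above can be sketched end to end. This is an illustrative, runnable stand-in rather than Airflow's implementation: ``FakeTaskInstance`` below is a hypothetical substitute for the real ``ti`` object found in the Task Context.

```python
class FakeTaskInstance:
    """Hypothetical stand-in for the ``ti`` in the Task Context."""

    def __init__(self):
        self._xcoms = {}

    def xcom_push(self, key, value):
        self._xcoms[key] = value

    def xcom_pull(self, task_ids=None, key="return_value"):
        # The real method also filters by task and DAG; this sketch
        # only keys on ``key``.
        return self._xcoms.get(key)


context = {"ti": FakeTaskInstance()}

# "task-1" pushes a value...
context["ti"].xcom_push(key="greeting", value="hello")

# ...and another task pulls it back by key.
assert context["ti"].xcom_pull(task_ids="task-1", key="greeting") == "hello"
```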



.. note::

If the first task run does not succeed, then on every retry the task's XComs are cleared to keep the task run idempotent.
@@ -91,7 +92,7 @@ Custom XCom Backends

The XCom system has interchangeable backends, and you can set which backend is being used via the ``xcom_backend`` configuration option.

If you want to implement your own backend, you should subclass :class:`~airflow.models.xcom.BaseXCom`, and override the ``serialize_value`` and ``deserialize_value`` methods.
If you want to implement your own backend, you should subclass :class:`~airflow.sdk.bases.xcom.BaseXCom`, and override the ``serialize_value`` and ``deserialize_value`` methods.

You can override the ``purge`` method in the ``BaseXCom`` class to have control over purging the xcom data from the custom backend. This will be called as part of ``delete``.
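A custom backend boils down to that serialize/deserialize pair. The sketch below shows the shape of the two overrides with a JSON round-trip; it deliberately avoids importing Airflow (a real backend would subclass ``BaseXCom``, whose exact method signatures vary across Airflow versions):

```python
import json


class JsonXComBackendSketch:
    """Illustrative serialize/deserialize pair a custom XCom backend overrides.

    Stand-alone by design: a real backend would subclass ``BaseXCom``
    instead. This version only demonstrates the round-trip contract.
    """

    @staticmethod
    def serialize_value(value):
        # Store everything as a JSON string. A real backend might instead
        # upload large values to object storage and store a reference.
        return json.dumps(value)

    @staticmethod
    def deserialize_value(stored):
        return json.loads(stored)


stored = JsonXComBackendSketch.serialize_value({"rows": [1, 2, 3]})
assert JsonXComBackendSketch.deserialize_value(stored) == {"rows": [1, 2, 3]}
```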

@@ -104,6 +105,6 @@ If you can exec into a terminal in an Airflow container, you can then print out

.. code-block:: python

from airflow.models.xcom import XCom
from airflow.sdk.execution_time.xcom import XCom

print(XCom.__name__)
6 changes: 3 additions & 3 deletions airflow-core/docs/howto/connection.rst
@@ -22,7 +22,7 @@ Managing Connections

For an overview of hooks and connections, see :doc:`/authoring-and-scheduling/connections`.

Airflow's :class:`~airflow.models.connection.Connection` object is used for storing credentials and other information necessary for connecting to external services.
Airflow's :class:`~airflow.sdk.Connection` object is used for storing credentials and other information necessary for connecting to external services.

Connections may be defined in the following ways:

@@ -77,7 +77,7 @@ convenience property :py:meth:`~airflow.models.connection.Connection.as_json`. I

.. code-block:: pycon
>>> from airflow.models.connection import Connection
>>> from airflow.sdk import Connection
>>> c = Connection(
... conn_id="some_conn",
... conn_type="mysql",
@@ -94,7 +94,7 @@ In addition, same approach could be used to convert Connection from URI format t

.. code-block:: pycon
>>> from airflow.models.connection import Connection
>>> from airflow.sdk import Connection
>>> c = Connection(
... conn_id="awesome_conn",
... description="Example Connection",
2 changes: 1 addition & 1 deletion airflow-core/docs/howto/custom-operator.rst
@@ -24,7 +24,7 @@ Creating a custom Operator
Airflow allows you to create new operators to suit the requirements of you or your team.
This extensibility is one of the many features which make Apache Airflow powerful.

You can create any operator you want by extending the :class:`airflow.models.baseoperator.BaseOperator`
You can create any operator you want by extending the public SDK base class :class:`~airflow.sdk.BaseOperator`.

There are two methods that you need to override in a derived class:

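Per the Airflow docs this diff touches, the two overrides are the constructor and ``execute``. A minimal sketch with a local ``BaseOperator`` stand-in so it runs without Airflow installed (the real base class is ``airflow.sdk.BaseOperator``):

```python
class BaseOperator:
    """Local stand-in for airflow.sdk.BaseOperator, kept to the bare minimum."""

    def __init__(self, task_id):
        self.task_id = task_id


class HelloOperator(BaseOperator):
    # Override the constructor to accept operator-specific arguments,
    # forwarding the rest to the base class.
    def __init__(self, name, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    # Override execute() with the work the task performs; in Airflow its
    # return value is pushed to XCom by default.
    def execute(self, context):
        return f"Hello {self.name}!"


op = HelloOperator(task_id="hello_task", name="World")
assert op.execute(context={}) == "Hello World!"
```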
2 changes: 1 addition & 1 deletion airflow-core/docs/howto/docker-compose/docker-compose.yaml
@@ -217,7 +217,7 @@ services:
echo "For other operating systems you can get rid of the warning with manually created .env file:"
echo " See: https://airflow.apache.org/docs/apache-airflow/stable/howto/docker-compose/index.html#setting-the-right-airflow-user"
echo
export AIRFLOW_UID=$(id -u)
export AIRFLOW_UID=$$(id -u)
fi
one_meg=1048576
mem_available=$$(($$(getconf _PHYS_PAGES) * $$(getconf PAGE_SIZE) / one_meg))
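The `$$` in the hunk above is Compose escaping: a literal `$` reaches the shell inside the container, so `id -u` runs there instead of being interpolated on the host. The host-side equivalent, with a single `$`, is the usual way to pre-seed the variable in `.env`:

```shell
# Write the current user's UID into .env so docker compose picks it up.
# Single "$" here: this runs in a host shell, not inside a compose file.
echo "AIRFLOW_UID=$(id -u)" > .env
cat .env
```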