
docs: Correct TaskFlow capitalization in documentation #51794

Open · wants to merge 1 commit into base: main
8 changes: 4 additions & 4 deletions airflow-core/docs/best-practices.rst
@@ -1010,7 +1010,7 @@ There are certain limitations and overhead introduced by this operator:
same worker might be affected by previous tasks creating/modifying files etc.

You can see detailed examples of using :class:`airflow.providers.standard.operators.python.PythonVirtualenvOperator` in
-:ref:`this section in the Taskflow API tutorial <taskflow-dynamically-created-virtualenv>`.
+:ref:`this section in the TaskFlow API tutorial <taskflow-dynamically-created-virtualenv>`.
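For context, the pattern this hunk links to can be sketched as below with the ``@task.virtualenv`` decorator, the TaskFlow counterpart of ``PythonVirtualenvOperator``. The requirement pin and callable body are illustrative assumptions, not part of this PR, and the import guard keeps the sketch loadable where Airflow is absent:

```python
try:
    from airflow.decorators import task
except ImportError:  # sketch only: Airflow may not be installed here
    task = None

def summarize(numbers: list) -> dict:
    """Plain helper so the core logic stays testable without Airflow."""
    return {"count": len(numbers), "total": sum(numbers)}

if task is not None:
    @task.virtualenv(
        requirements=["pandas==2.2.2"],  # illustrative pin, not from this PR
        system_site_packages=False,
    )
    def summarize_in_venv(numbers: list) -> dict:
        # Body must be self-contained: it executes inside the virtualenv
        # that Airflow creates for this task at run time.
        return {"count": len(numbers), "total": sum(numbers)}
```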


Using ExternalPythonOperator
@@ -1078,7 +1078,7 @@ The nice thing about this is that you can switch the decorator back at any time
developing it "dynamically" with ``PythonVirtualenvOperator``.

You can see detailed examples of using :class:`airflow.providers.standard.operators.python.ExternalPythonOperator` in
-:ref:`Taskflow External Python example <taskflow-external-python-environment>`
+:ref:`TaskFlow External Python example <taskflow-external-python-environment>`
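The ``@task.external_python`` decorator referenced here reuses a pre-built interpreter instead of building a virtualenv per run. A minimal sketch, assuming a hypothetical interpreter path (the guard keeps it loadable without Airflow):

```python
try:
    from airflow.decorators import task
except ImportError:  # Airflow may be absent; keep the sketch importable
    task = None

def normalize(name: str) -> str:
    """Plain helper so the callable's logic stays testable without Airflow."""
    return name.strip().lower()

if task is not None:
    # Points at an existing interpreter; no environment is created per run.
    @task.external_python(python="/opt/venvs/etl/bin/python")  # hypothetical path
    def normalize_in_env(name: str) -> str:
        return name.strip().lower()
```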

Using DockerOperator or Kubernetes Pod Operator
-----------------------------------------------
@@ -1142,9 +1142,9 @@ The drawbacks:
containers etc. in order to author a DAG that uses those operators.

You can see detailed examples of using :class:`airflow.operators.providers.Docker` in
-:ref:`Taskflow Docker example <taskflow-docker_environment>`
+:ref:`TaskFlow Docker example <taskflow-docker_environment>`
and :class:`airflow.providers.cncf.kubernetes.operators.pod.KubernetesPodOperator`
-:ref:`Taskflow Kubernetes example <tasfklow-kpo>`
+:ref:`TaskFlow Kubernetes example <tasfklow-kpo>`
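The Docker flavour of the examples linked above can be sketched with the ``@task.docker`` decorator from the Docker provider. The image tag and callable are illustrative assumptions; the guard keeps the sketch loadable without Airflow installed:

```python
try:
    from airflow.decorators import task
except ImportError:  # sketch only: Airflow/Docker provider may be absent
    task = None

def word_count(text: str) -> int:
    """Plain helper mirroring the containerized callable's logic."""
    return len(text.split())

if task is not None:
    # The callable runs inside a container, fully isolating its dependencies
    # from the worker's Python environment.
    @task.docker(image="python:3.12-slim")  # illustrative image tag
    def word_count_in_container(text: str) -> int:
        return len(text.split())
```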

Using multiple Docker Images and Celery Queues
----------------------------------------------
2 changes: 1 addition & 1 deletion airflow-core/docs/tutorial/fundamentals.rst
@@ -90,7 +90,7 @@ Next, we'll need to create a DAG object to house our tasks. We'll provide a uniq
Understanding Operators
-----------------------
An operator represents a unit of work in Airflow. They are the building blocks of your workflows, allowing you to
-define what tasks will be executed. While we can use operators for many tasks, Airflow also offers the :doc:`Taskflow API <taskflow>`
+define what tasks will be executed. While we can use operators for many tasks, Airflow also offers the :doc:`TaskFlow API <taskflow>`
for a more Pythonic way to define workflows, which we'll touch on later.

All operators derive from the ``BaseOperator``, which includes the essential arguments needed to run tasks in Airflow.
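The "more Pythonic" style mentioned in this hunk can be sketched as follows: plain functions decorated with ``@task``, with dependencies inferred from the data flow rather than declared explicitly. Assumes Airflow 2.4+ for the ``schedule`` argument; the guard keeps the sketch loadable without Airflow:

```python
from datetime import datetime

try:
    from airflow.decorators import dag, task
except ImportError:  # sketch only: Airflow may not be installed here
    dag = task = None

def add(x: int, y: int) -> int:
    """Plain helper reused by the task, testable without Airflow."""
    return x + y

if dag is not None:
    @dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
    def taskflow_demo():
        @task
        def make_numbers() -> list:
            return [1, 2]

        @task
        def add_numbers(nums: list) -> int:
            return add(nums[0], nums[1])

        # Passing one task's return value to another wires the dependency.
        add_numbers(make_numbers())

    taskflow_demo()
```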
4 changes: 2 additions & 2 deletions airflow-core/docs/tutorial/objectstorage.rst
@@ -23,7 +23,7 @@ Cloud-Native Workflows with Object Storage

.. versionadded:: 2.8

-Welcome to the final tutorial in our Airflow series! By now, you've built DAGs with Python and the Taskflow API, passed
+Welcome to the final tutorial in our Airflow series! By now, you've built DAGs with Python and the TaskFlow API, passed
data with XComs, and chained tasks together into clear, reusable workflows.

In this tutorial we'll take it a step further by introducing the **Object Storage API**. This API makes it easier to
@@ -108,7 +108,7 @@ Here's what's happening:
- We generate a filename based on the task's logical date
- Using ``ObjectStoragePath``, we write the data directly to cloud storage as Parquet

-This is a classic Taskflow pattern. The object key changes each day, allowing us to run this daily and build a dataset
+This is a classic TaskFlow pattern. The object key changes each day, allowing us to run this daily and build a dataset
over time. We return the final object path to be used in the next task.

Why this is cool: No boto3, no GCS client setup, no credentials juggling. Just simple file semantics that work across
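The date-keyed ``ObjectStoragePath`` pattern this hunk describes can be sketched roughly as below. The bucket URI, connection id, and key layout are illustrative assumptions; the path arithmetic is lazy, so nothing touches storage until a task opens the path:

```python
try:
    from airflow.io.path import ObjectStoragePath  # available since Airflow 2.8
except ImportError:  # sketch only: Airflow may not be installed here
    ObjectStoragePath = None

def daily_key(logical_date: str) -> str:
    """Date-based object key so each daily run writes a fresh file."""
    return f"air_quality/{logical_date}.parquet"

def build_path(base_uri: str, logical_date: str):
    # Called inside a task at run time; `path.open("wb")` would then write
    # straight to object storage with plain file semantics.
    base = ObjectStoragePath(base_uri)  # e.g. "s3://aws_default@my-bucket/"
    return base / daily_key(logical_date)
```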
4 changes: 2 additions & 2 deletions providers/docker/docs/changelog.rst
@@ -786,7 +786,7 @@ Other
Features
~~~~~~~~

-* ``Add a Docker Taskflow decorator (#15330)``
+* ``Add a Docker TaskFlow decorator (#15330)``

This version of Docker Provider has a new feature - TaskFlow decorator that only works in Airflow 2.2.
If you try to use the decorator in pre-Airflow 2.2 version you will get an error:
@@ -882,7 +882,7 @@ Features
~~~~~~~~

* ``Entrypoint support in docker operator (#14642)``
-* ``Add PythonVirtualenvDecorator to Taskflow API (#14761)``
+* ``Add PythonVirtualenvDecorator to TaskFlow API (#14761)``
* ``Support all terminus task states in Docker Swarm Operator (#14960)``


2 changes: 1 addition & 1 deletion providers/sftp/docs/sensors/sftp_sensor.rst
@@ -28,7 +28,7 @@ To get more information about this sensor visit :class:`~airflow.providers.sftp.
:end-before: [END howto_operator_sftp_sensor]


-We can also use Taskflow API. It takes the same arguments as the :class:`~airflow.providers.sftp.sensors.sftp.SFTPSensor` along with -
+We can also use TaskFlow API. It takes the same arguments as the :class:`~airflow.providers.sftp.sensors.sftp.SFTPSensor` along with -

op_args (optional)
A list of positional arguments that will get unpacked when
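The TaskFlow form of this sensor can be sketched roughly as below with the provider's ``@task.sftp_sensor`` decorator. The remote path, connection id, and callable are illustrative assumptions, not from this PR; the guard keeps the sketch loadable without the SFTP provider installed:

```python
try:
    from airflow.decorators import task
except ImportError:  # sketch only: Airflow/SFTP provider may be absent
    task = None

def pick_csvs(files: list) -> list:
    """Plain helper: keep only CSV files from the matched names."""
    return [f for f in files if f.endswith(".csv")]

if task is not None:
    # Same arguments as SFTPSensor; op_args/op_kwargs (mentioned above)
    # would be forwarded to the wrapped callable.
    @task.sftp_sensor(
        path="/upload/incoming.csv",   # hypothetical remote path
        sftp_conn_id="sftp_default",
        poke_interval=30,
    )
    def file_arrived() -> bool:
        # Runs once the sensor finds the file; return True to succeed.
        return True
```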
2 changes: 1 addition & 1 deletion providers/standard/docs/changelog.rst
@@ -288,7 +288,7 @@ Misc
* ``AIP-72: Move non-user facing code to '_internal' (#45515)``
* ``AIP-72: Add support for 'get_current_context' in Task SDK (#45486)``
* ``Move Literal alias into TYPE_CHECKING block (#45345)``
-* ``AIP-72: Add Taskflow API support & template rendering in Task SDK (#45444)``
+* ``AIP-72: Add TaskFlow API support & template rendering in Task SDK (#45444)``
* ``Remove tuple_in_condition helpers (#45201)``

.. Below changes are excluded from the changelog. Move them to