
Commit c763d2d

initial astro project structure
0 parents · commit c763d2d

12 files changed: 584 additions, 0 deletions

.astro/config.yaml

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
project:
  name: astro-example-dags

.astro/test_dag_integrity_default.py

Lines changed: 119 additions & 0 deletions
@@ -0,0 +1,119 @@
"""Test the validity of all DAGs. **USED BY DEV PARSE COMMAND DO NOT EDIT**"""
from contextlib import contextmanager
import logging
import os

import pytest

from airflow.models import DagBag, Variable, Connection
from airflow.hooks.base import BaseHook
from airflow.utils.db import initdb

# init airflow database
initdb()

# The following code patches errors caused by missing OS Variables, Airflow Connections, and Airflow Variables


# =========== MONKEYPATCH BaseHook.get_connection() ===========
def basehook_get_connection_monkeypatch(key: str, *args, **kwargs):
    print(
        f"Attempted to fetch connection during parse returning an empty Connection object for {key}"
    )
    return Connection(key)


BaseHook.get_connection = basehook_get_connection_monkeypatch
# # =========== /MONKEYPATCH BASEHOOK.GET_CONNECTION() ===========


# =========== MONKEYPATCH OS.GETENV() ===========
def os_getenv_monkeypatch(key: str, *args, **kwargs):
    default = None
    if args:
        default = args[0]  # os.getenv should get at most 1 arg after the key
    if kwargs:
        default = kwargs.get(
            "default", None
        )  # and sometimes kwarg if people are using the sig

    env_value = os.environ.get(key, None)

    if env_value:
        return env_value  # if the env_value is set, return it
    if (
        key == "JENKINS_HOME" and default is None
    ):  # fix https://github.com/astronomer/astro-cli/issues/601
        return None
    if default:
        return default  # otherwise return whatever default has been passed
    return f"MOCKED_{key.upper()}_VALUE"  # if absolutely nothing has been passed - return the mocked value


os.getenv = os_getenv_monkeypatch
# # =========== /MONKEYPATCH OS.GETENV() ===========

# =========== MONKEYPATCH VARIABLE.GET() ===========


class magic_dict(dict):
    def __init__(self, *args, **kwargs):
        self.update(*args, **kwargs)

    def __getitem__(self, key):
        return {}.get(key, "MOCKED_KEY_VALUE")


def variable_get_monkeypatch(key: str, default_var=None, deserialize_json=False):
    print(
        f"Attempted to get Variable value during parse, returning a mocked value for {key}"
    )

    if default_var:
        return default_var
    if deserialize_json:
        return magic_dict()
    return "NON_DEFAULT_MOCKED_VARIABLE_VALUE"


Variable.get = variable_get_monkeypatch
# # =========== /MONKEYPATCH VARIABLE.GET() ===========


@contextmanager
def suppress_logging(namespace):
    """
    Suppress logging within a specific namespace to keep tests "clean" during build
    """
    logger = logging.getLogger(namespace)
    old_value = logger.disabled
    logger.disabled = True
    try:
        yield
    finally:
        logger.disabled = old_value


def get_import_errors():
    """
    Generate a tuple for import errors in the dag bag
    """
    with suppress_logging("airflow"):
        dag_bag = DagBag(include_examples=False)

        def strip_path_prefix(path):
            return os.path.relpath(path, os.environ.get("AIRFLOW_HOME"))

        # prepend "(None,None)" to ensure that a test object is always created even if it's a no op.
        return [(None, None)] + [
            (strip_path_prefix(k), v.strip()) for k, v in dag_bag.import_errors.items()
        ]


@pytest.mark.parametrize(
    "rel_path,rv", get_import_errors(), ids=[x[0] for x in get_import_errors()]
)
def test_file_imports(rel_path, rv):
    """Test for import errors on a file"""
    if rel_path and rv:  # Make sure our no op test doesn't raise an error
        raise Exception(f"{rel_path} failed to import with message \n {rv}")
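
The three monkeypatches in this file exist so that DAG modules which resolve configuration at parse time still import cleanly without a metadata database, environment variables, or real secrets. As a point of reference, the following is a hypothetical DAG module, not part of this commit: the file name, environment variable, Variable key, and connection ID are invented for illustration. Under the patches above, os.getenv falls back to a mocked string, Variable.get with deserialize_json=True returns a magic_dict, and BaseHook.get_connection returns an empty Connection.

# dags/example_parse_time_config.py (hypothetical, for illustration only)
import os

from pendulum import datetime

from airflow.decorators import dag, task
from airflow.hooks.base import BaseHook
from airflow.models import Variable

# All three lookups run at import (parse) time. Under the parse-time patches
# above they return mocked values, so the file imports even when nothing is set.
BUCKET = os.getenv("MY_BUCKET")  # patched: "MOCKED_MY_BUCKET_VALUE" if unset
SETTINGS = Variable.get("my_settings", deserialize_json=True)  # patched: magic_dict()
CONN = BaseHook.get_connection("my_conn_id")  # patched: empty Connection


@dag(start_date=datetime(2023, 1, 1), schedule="@daily", catchup=False)
def example_parse_time_config():
    @task
    def show_config():
        print(BUCKET, SETTINGS["region"], CONN.conn_id)

    show_config()


example_parse_time_config()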

.dockerignore

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
astro
.git
.env
airflow_settings.yaml
logs/

.gitignore

Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
.git
.env
.DS_Store # macOS specific ignore
airflow_settings.yaml
__pycache__/
astro

Dockerfile

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
FROM quay.io/astronomer/astro-runtime:8.3.0

README.md

Lines changed: 49 additions & 0 deletions
@@ -0,0 +1,49 @@
Overview
========

Welcome to Astronomer! This project was generated after you ran 'astro dev init' using the Astronomer CLI. This readme describes the contents of the project, as well as how to run Apache Airflow on your local machine.

Project Contents
================

Your Astro project contains the following files and folders:

- dags: This folder contains the Python files for your Airflow DAGs. By default, this directory includes two example DAGs:
    - `example_dag_basic`: This DAG shows a simple ETL data pipeline example with three TaskFlow API tasks that run daily.
    - `example_dag_advanced`: This advanced DAG showcases a variety of Airflow features like branching, Jinja templates, task groups and several Airflow operators.
- Dockerfile: This file contains a versioned Astro Runtime Docker image that provides a differentiated Airflow experience. If you want to execute other commands or overrides at runtime, specify them here.
- include: This folder contains any additional files that you want to include as part of your project. It is empty by default.
- packages.txt: Install OS-level packages needed for your project by adding them to this file. It is empty by default.
- requirements.txt: Install Python packages needed for your project by adding them to this file. It is empty by default.
- plugins: Add custom or community plugins for your project to this file. It is empty by default.
- airflow_settings.yaml: Use this local-only file to specify Airflow Connections, Variables, and Pools instead of entering them in the Airflow UI as you develop DAGs in this project.

Deploy Your Project Locally
===========================

1. Start Airflow on your local machine by running 'astro dev start'.

This command will spin up 4 Docker containers on your machine, each for a different Airflow component:

- Postgres: Airflow's Metadata Database
- Webserver: The Airflow component responsible for rendering the Airflow UI
- Scheduler: The Airflow component responsible for monitoring and triggering tasks
- Triggerer: The Airflow component responsible for triggering deferred tasks

2. Verify that all 4 Docker containers were created by running 'docker ps'.

Note: Running 'astro dev start' will start your project with the Airflow Webserver exposed at port 8080 and Postgres exposed at port 5432. If you already have either of those ports allocated, you can either stop your existing Docker containers or change the port.

3. Access the Airflow UI for your local Airflow project. To do so, go to http://localhost:8080/ and log in with 'admin' for both your Username and Password.

You should also be able to access your Postgres Database at 'localhost:5432/postgres'.

Deploy Your Project to Astronomer
=================================

If you have an Astronomer account, pushing code to a Deployment on Astronomer is simple. For deploying instructions, refer to Astronomer documentation: https://docs.astronomer.io/cloud/deploy-code/

Contact
=======

The Astronomer CLI is maintained with love by the Astronomer team. To report a bug or suggest a change, reach out to our support.
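
The `example_dag_basic` DAG described in the README is part of the dags folder but its file is not shown in this section of the diff. As a rough point of reference, here is a minimal sketch of the three-task TaskFlow ETL pattern it describes; the DAG id, task names, and sample payload are invented and this is not the file generated by 'astro dev init'.

# Hypothetical extract -> transform -> load sketch using the TaskFlow API.
from pendulum import datetime

from airflow.decorators import dag, task


@dag(start_date=datetime(2023, 1, 1), schedule="@daily", catchup=False, tags=["example"])
def basic_etl_sketch():
    @task
    def extract() -> dict:
        # Stand-in for pulling a record from an API or database.
        return {"order_id": 1, "amount": 42.0}

    @task
    def transform(record: dict) -> dict:
        # Stand-in for business logic.
        return {**record, "amount_with_tax": record["amount"] * 1.2}

    @task
    def load(record: dict) -> None:
        # Stand-in for writing to a warehouse; here it is just printed.
        print(f"Loading record: {record}")

    load(transform(extract()))


basic_etl_sketch()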

dags/.airflowignore

Whitespace-only changes.
