airflow Questions
3
Doc says:
Hooks are interfaces to external platforms and databases like Hive, S3, MySQL, Postgres, HDFS, and Pig. Hooks implement a common interface when possible, and act as a building block for o...
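For context, a hedged sketch of what using a hook typically looks like, assuming the Postgres provider package is installed and a connection with the made-up conn_id "my_postgres" is configured in Airflow:
from airflow.providers.postgres.hooks.postgres import PostgresHook

def fetch_rows():
    # The hook wraps the connection details stored in Airflow under the given conn_id
    hook = PostgresHook(postgres_conn_id="my_postgres")  # hypothetical conn_id
    # get_records runs the query and returns the rows as a list of tuples
    return hook.get_records("SELECT 1")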
4
Is there any way to get the exception details in the Airflow on_failure_callback?
I've noticed it's not part of context. I'd like to create a generic exception handling mechanism which posts to Sl...
Forland asked 13/8, 2018 at 12:8
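A minimal sketch of one possible approach, assuming a recent Airflow 2.x where the failure-callback context carries the raised exception under the "exception" key (worth verifying for your version); posting to Slack is left as a placeholder:
def on_failure(context):
    exc = context.get("exception")      # the exception that failed the task, if provided
    ti = context.get("task_instance")   # TaskInstance with task_id, dag_id, log_url, ...
    message = f"Task {ti.task_id} in DAG {ti.dag_id} failed: {exc!r}"
    # post `message` to Slack here with whatever client you already use
    print(message)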
5
I'm trying to configure Airbnb Airflow to use the CeleryExecutor like this:
I changed the executor in airflow.cfg from SequentialExecutor to CeleryExecutor:
# The executor class that airflow shoul...
Revanche asked 24/4, 2016 at 11:29
3
I have seen samples for the new data-aware scheduling where the dataset is a file (csv, txt, etc.).
I wanted to check if I can use a SQL Server table as a Dataset.
from airflow import Dataset
dataset = Data...
Heyman asked 28/3, 2023 at 16:58
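A hedged sketch under the assumption of Airflow 2.4+ data-aware scheduling: a Dataset is only a logical URI, so Airflow never polls the SQL Server table itself; a producer task has to declare the Dataset in its outlets for consumer DAGs to be triggered. The URI below is made up:
import pendulum
from airflow import Dataset
from airflow.decorators import dag, task

# Any stable string works as the URI; this one just names the table.
orders_table = Dataset("mssql://my-server/my_db/dbo/orders")

@dag(start_date=pendulum.datetime(2023, 1, 1), schedule="@daily", catchup=False)
def producer():
    @task(outlets=[orders_table])
    def load_orders():
        ...  # write to the table; finishing emits a dataset event
    load_orders()

producer()

@dag(start_date=pendulum.datetime(2023, 1, 1), schedule=[orders_table], catchup=False)
def consumer():
    @task
    def read_orders():
        ...  # runs after load_orders updates the dataset
    read_orders()

consumer()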
3
Solved
I am new to airflow and need some direction on this one...
I'm creating my first dag that uses a subdag and importing the subdag operator
`from airflow.operators.subdag import SubDagOperator`
howe...
Punner asked 5/1, 2021 at 17:17
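A minimal sketch of wiring a subdag with that import, assuming Airflow 2.0/2.1 (DummyOperator here; it is EmptyOperator on 2.3+, and TaskGroups are the recommended replacement for subdags in newer releases). The ids are made up; note the subdag's dag_id has to be "<parent_dag_id>.<task_id>" and its schedule should match the parent's:
import pendulum
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.operators.subdag import SubDagOperator

def make_subdag(parent_dag_id, child_task_id, args):
    with DAG(
        dag_id=f"{parent_dag_id}.{child_task_id}",   # required naming convention
        start_date=pendulum.datetime(2021, 1, 1),
        schedule_interval="@daily",                  # keep in step with the parent
        default_args=args,
    ) as subdag:
        DummyOperator(task_id="inner_task")
    return subdag

with DAG("parent_dag", start_date=pendulum.datetime(2021, 1, 1),
         schedule_interval="@daily", catchup=False) as dag:
    section_1 = SubDagOperator(
        task_id="section_1",
        subdag=make_subdag("parent_dag", "section_1", {}),
    )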
5
Solved
The question is very similar to the one already available. The only difference is that I ran Airflow in Docker.
Step by step:
Put docker-compose.yaml into the PyCharm project
Put requirements.txt into PyCh...
3
Solved
Suppose a following situation:
[c1, c2, c3] >> child_task
where all c1, c2, c3 and child_task are operators and have task_id equal to id1, id2, id3 and child_id respectively.
Task child_t...
Exchequer asked 16/2, 2019 at 23:0
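A hedged sketch of that layout in Airflow 2 style; if the point is for child_task to know its upstream tasks, their ids are available on the task object itself (an assumption about what the truncated question is after):
import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator

def child_fn(task, **_):
    # upstream_task_ids is {"id1", "id2", "id3"} for this layout
    print(task.upstream_task_ids)

with DAG("list_dependency_example", start_date=pendulum.datetime(2019, 2, 16),
         schedule_interval=None) as dag:
    c1 = PythonOperator(task_id="id1", python_callable=lambda: None)
    c2 = PythonOperator(task_id="id2", python_callable=lambda: None)
    c3 = PythonOperator(task_id="id3", python_callable=lambda: None)
    child_task = PythonOperator(task_id="child_id", python_callable=child_fn)
    [c1, c2, c3] >> child_task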
2
I'm running Apache Airflow 2.x locally, using the Docker Compose file that is provided in the documentation. In the .\dags directory on my local filesystem (which is mounted into the Airflow contai...
Carman asked 7/4, 2021 at 15:28
5
My Airflow DAGs mainly consist of PythonOperators, and I would like to use my Python IDE's debug tools to develop Python "inside" Airflow. - I rely on Airflow's database connectors, which I think wo...
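One hedged possibility, assuming Airflow 2.5+: dag.test() runs the whole DAG in a single process, so an IDE debugger can step into PythonOperator callables while still using the connections stored in Airflow's metadata database. The module path below is made up:
from my_project.dags.my_dag import dag  # hypothetical module that defines the DAG object

if __name__ == "__main__":
    # Run every task in-process; breakpoints set inside python_callables will be hit.
    dag.test()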
3
Solved
I am using a LocalExecutor and my dag has 3 tasks, where task(C) is dependent on task(A). Task(B) and task(A) can run in parallel, something like below:
A-->C
B
So task(A) has failed but task(B...
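For reference, a small sketch of the layout being described (EmptyOperator is Airflow 2.3+; DummyOperator on older versions): C waits only on A, while B is independent, so under the LocalExecutor A and B start in parallel.
import pendulum
from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG("abc_layout", start_date=pendulum.datetime(2021, 1, 1),
         schedule_interval=None, catchup=False) as dag:
    a = EmptyOperator(task_id="A")
    b = EmptyOperator(task_id="B")   # no dependencies, runs alongside A
    c = EmptyOperator(task_id="C")
    a >> c                           # C only runs once A succeeds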
3
I'd like to pass other arguments to my on_failure_callback function but it only seems to want "context". How do I pass other arguments to that function...especially since I'd like to define that fu...
Harney asked 15/8, 2018 at 0:53
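One common pattern, sketched under the assumption that the extra values are known when the DAG file is parsed: bind them with functools.partial so Airflow still calls the callback with just the context. The channel argument is made up:
from functools import partial

def notify_failure(context, channel):
    ti = context["task_instance"]
    print(f"Posting to {channel}: task {ti.task_id} failed")

# On the operator or in default_args:
# on_failure_callback=partial(notify_failure, channel="#data-alerts")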
5
Solved
I want to create a snippet that passes the correct date based on whether the DAG was scheduled or whether it was triggered manually. The DAG runs monthly. The DAG generates a report (A SQL query) b...
Funeral asked 5/2, 2020 at 14:5
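A hedged sketch assuming Airflow 2.2+: dag_run.external_trigger distinguishes a manual run from a scheduled one, so the report date can be chosen accordingly (which date the manual case should use is an assumption):
from airflow.decorators import task

@task
def pick_report_date(**context):
    dag_run = context["dag_run"]
    if dag_run.external_trigger:
        # manual trigger: fall back to the run's own logical date
        return context["logical_date"].date().isoformat()
    # scheduled monthly run: report on the start of the data interval
    return context["data_interval_start"].date().isoformat()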
3
Solved
I just installed Airflow 2.3.0 using the command
pip install "apache-airflow==2.3.0" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.3.0/constraints-3.8....
Wilk asked 11/5, 2022 at 1:55
1
Solved
I'm facing an issue where my dag cannot be imported, but I cannot figure out why:
from airflow.sensors.sql import SqlSensor
import pendulum
from airflow.decorators import task,dag
@dag(
dag_id = "...
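Without the full traceback the cause is a guess, but one frequent reason a decorator-style DAG fails to import is that the @dag-decorated function is never called at module level, so no DAG object is ever registered. A minimal complete sketch (connection id and query are made up):
import pendulum
from airflow.decorators import dag, task
from airflow.sensors.sql import SqlSensor

@dag(
    dag_id="sql_sensor_example",
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    schedule_interval=None,
    catchup=False,
)
def sql_sensor_example():
    wait_for_rows = SqlSensor(
        task_id="wait_for_rows",
        conn_id="my_db",                     # hypothetical connection
        sql="SELECT COUNT(*) FROM staging",  # a truthy first cell lets the sensor succeed
    )

    @task
    def process():
        ...

    wait_for_rows >> process()

sql_sensor_example()  # this call is what actually creates the DAG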
2
Solved
I know that when you run airflow webserver via your home terminal, you can view the UI by going to http://localhost:8080. I am able to do this.
However, I have a virtual Amazon Lightsail...
Tersina asked 12/5, 2017 at 19:53
5
Solved
I have a scenario wherein a particular dag, upon completion, needs to trigger multiple dags. I have used TriggerDagRunOperator to trigger a single dag; is it possible to pass multiple dags to the TriggerDa...
Riancho asked 28/6, 2017 at 15:34
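A hedged sketch using the Airflow 2 import path: TriggerDagRunOperator targets exactly one DAG, so the usual workaround is one operator per downstream DAG, created in a loop (the dag ids here are made up):
import pendulum
from airflow import DAG
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

with DAG("parent", start_date=pendulum.datetime(2021, 1, 1),
         schedule_interval="@daily", catchup=False) as dag:
    triggers = [
        TriggerDagRunOperator(task_id=f"trigger_{target}", trigger_dag_id=target)
        for target in ["child_dag_a", "child_dag_b", "child_dag_c"]
    ]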
4
I am trying to run Apache Airflow as a Docker container on a CentOS 7 machine.
I followed all the instructions here: https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html
When I am trying to...
Melentha asked 25/3, 2021 at 1:23
4
Solved
webserver_1 | The above exception was the direct cause of the following exception:
webserver_1 |
webserver_1 | Traceback (most recent call last):
webserver_1 | File "/usr/local/bin/airflow"...
1
I want to link in some DAGs from a directory outside of my dags_folder. However, when I create a symlink using
ln -s /absolute/path/to/dag.py dags/
It does not show up when running airflow dags li...
5
Is there any way to reload the jobs without having to restart the server?
Anglia asked 25/4, 2017 at 9:10
2
What's the best way to retry an Airflow operator only for certain failures/exceptions?
For example, let's assume that I have an Airflow task which relies on the availability of an external service...
Unpin asked 26/6, 2019 at 13:59
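One hedged approach: let exceptions that should be retried propagate normally (Airflow's usual retries then apply) and raise AirflowFailException for errors that should fail immediately. The service call and its transient-error class are placeholders:
from airflow.exceptions import AirflowFailException

def call_external_service():
    try:
        return fetch_from_service()          # hypothetical call to the external service
    except ServiceUnavailableError:          # hypothetical transient error
        raise                                # re-raise -> normal retry behaviour
    except ValueError as err:
        # permanent problem: fail the task right away, skipping remaining retries
        raise AirflowFailException(f"Not retrying: {err}")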
6
Solved
How can I configure Airflow so that any failure in the DAG will (immediately) result in a Slack message?
At the moment I manage it by creating a slack_failed_task:
slack_failed_task = SlackAPIPo...
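A hedged sketch of the usual alternative: attach an on_failure_callback through default_args so every task in the DAG reports failures, instead of wiring a dedicated slack_failed_task. How the message reaches Slack is left open (SlackAPIPostOperator, a provider hook, or a plain webhook call):
import pendulum
from airflow import DAG

def slack_on_failure(context):
    ti = context["task_instance"]
    text = f"DAG {ti.dag_id}: task {ti.task_id} failed"
    # post `text` to Slack here with your existing integration

default_args = {"on_failure_callback": slack_on_failure}

with DAG("my_dag", start_date=pendulum.datetime(2022, 1, 1),
         schedule_interval="@daily", catchup=False,
         default_args=default_args) as dag:
    ...  # define tasks here as usual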
1
version: '3'
x-airflow-common:
  &airflow-common
  # In order to add custom dependencies or upgrade provider packages you can use your extended image.
  # Comment the image line, place your Docker...
Coif asked 14/7, 2022 at 4:20
3
I tried to run the Python Operator example in my Airflow installation. The installation has the webserver, scheduler, and worker deployed on the same machine and runs with no complaints for all non-Pyto...
4
In my first foray into Airflow, I am trying to run one of the example DAGs that come with the installation. This is v1.8.0. Here are my steps:
$ airflow trigger_dag example_bash_operator
[2017-0...
Silvers asked 19/4, 2017 at 22:38