google-cloud-composer Questions

5

Solved

I have BigQuery connectors all running, but I have some existing scripts in Docker containers I wish to schedule on Cloud Composer instead of App Engine Flexible. I have the below script that seem...
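The usual pattern for this is to run the existing image as a pod task on Composer's own GKE cluster; a minimal sketch, assuming a hypothetical image in Container Registry:

# Hedged sketch: run an existing Docker image as a Composer task.
# The image path, namespace and schedule are illustrative assumptions.
from airflow import DAG
from airflow.utils.dates import days_ago
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

with DAG("containerized_script",
         schedule_interval="@daily",
         start_date=days_ago(1),
         catchup=False) as dag:

    run_script = KubernetesPodOperator(
        task_id="run_script",
        name="run-script",
        namespace="default",
        image="gcr.io/my-project/my-script:latest",  # hypothetical image
        arguments=["--run-date", "{{ ds }}"],        # pass the execution date to the script
    )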

2

Solved

I'm a bit confused about how BaseSensorOperator's parameters, timeout & poke_interval, work. Consider this usage of the sensor: BaseSensorOperator( soft_fail=True, poke_interval = ...
Waterspout asked 7/9, 2020 at 9:58
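For context, a sketch of how the two parameters interact on a concrete sensor (the GCS sensor, bucket and object below are illustrative assumptions): the sensor re-checks every poke_interval seconds, gives up after timeout seconds in total, and with soft_fail=True is marked skipped rather than failed when it gives up.

# Hedged sketch: poke_interval vs. timeout on a concrete sensor.
from airflow.contrib.sensors.gcs_sensor import GoogleCloudStorageObjectSensor

wait_for_file = GoogleCloudStorageObjectSensor(
    task_id="wait_for_file",
    bucket="my-bucket",              # hypothetical bucket
    object="incoming/{{ ds }}.csv",  # hypothetical object
    poke_interval=60,                # re-check every 60 seconds
    timeout=60 * 60,                 # stop poking after one hour in total
    soft_fail=True,                  # mark the task skipped (not failed) on timeout
)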

2

I know how to assign a static external IP address to a Compute Engine, but can this be done with Google Cloud Composer (Airflow)? I'd imagine most companies need that functionality since they'd gen...

3

Solved

In some of my Apache Airflow installations, DAGs or tasks that are scheduled to run do not run even when the scheduler doesn't appear to be fully loaded. How can I increase the number of DAGs or ta...
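Two of the relevant limits can be set per DAG in code, as in the sketch below (values are illustrative); the installation-wide caps such as core parallelism and worker concurrency live in the Airflow configuration rather than in the DAG file.

# Hedged sketch: per-DAG concurrency settings (numbers are illustrative).
from airflow import DAG
from airflow.utils.dates import days_ago

dag = DAG(
    "busy_dag",
    start_date=days_ago(1),
    schedule_interval="@hourly",
    concurrency=32,       # task instances of this DAG that may run at once
    max_active_runs=4,    # runs of this DAG that may be active at once
)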

2

Solved

My DAG looks like this: default_args = { 'start_date': airflow.utils.dates.days_ago(0), 'retries': 0, 'dataflow_default_options': { 'project': 'test', 'tempLocation': 'gs://test/dataflow/pipel...
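For readability, this is roughly how such a default_args block is laid out in full; the tempLocation below is a placeholder, not a completion of the truncated path in the question.

# Hedged sketch: typical default_args for tasks that launch Dataflow jobs.
from airflow.utils.dates import days_ago

default_args = {
    'start_date': days_ago(0),
    'retries': 0,
    'dataflow_default_options': {
        'project': 'test',
        'tempLocation': 'gs://example-bucket/dataflow/tmp',  # placeholder path
    },
}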

3

Solved

I am learning GCP, and came across Kubeflow and Google Cloud Composer. From what I have understood, it seems that both are used to orchestrate workflows, empowering the user to schedule and monito...
Millsap asked 17/3, 2020 at 8:7

1

I'm running a DAG test_dag.py which is structured in the following way in my Google Cloud Storage Bucket. gcs-bucket/ dags/ test_dag.py dependencies/ __init__.py dependency_1.py module1/ __...
Quinsy asked 6/5, 2020 at 1:16
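A sketch of the import pattern such a layout usually relies on: Composer puts the dags/ folder on the Python path, so packages under dags/dependencies/ are imported relative to it (the second import name is hypothetical, since the listing above is truncated).

# Hedged sketch: inside dags/test_dag.py, helper code under dags/dependencies/
# is importable because Composer adds the dags/ folder to the PYTHONPATH.
from dependencies import dependency_1           # dags/dependencies/dependency_1.py
from dependencies.module1 import some_helper    # hypothetical name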

1

How does a Dataproc Spark operator in Airflow return a value, and how can I capture it? I have a downstream job which captures this result, and based on the returned value I have to trigger another job by br...
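Dataproc operators don't hand the Spark job's result back directly; one commonly used workaround, sketched below with hypothetical bucket, object and task ids, is to have the Spark job write a small status file to GCS, read it in a BranchPythonOperator callable, and branch on it.

# Hedged sketch: branch on a result the Spark job wrote to GCS.
from airflow.contrib.hooks.gcs_hook import GoogleCloudStorageHook
from airflow.operators.python_operator import BranchPythonOperator

def choose_next(**context):
    # read the small status file the Spark job wrote; bucket/object are hypothetical
    status = GoogleCloudStorageHook().download(
        bucket="my-bucket",
        object="spark-results/{}/status.txt".format(context["ds"]),
    ).decode("utf-8").strip()
    # return the task_id of the branch to follow; the other branch is skipped
    return "trigger_other_job" if status == "OK" else "no_op"

branch = BranchPythonOperator(
    task_id="branch_on_spark_result",
    python_callable=choose_next,
    provide_context=True,   # needed on Airflow 1.10 to receive **context
)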

3

Solved

When I go to use operators/hooks like the BigQueryHook I see a message that these operators are deprecated and to use the airflow.gcp... operator version. However, when I try to use it in my DAG it...
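For reference, a sketch of the two import paths involved; the providers path assumes either Airflow 2.x or the apache-airflow-backport-providers-google package installed on an Airflow 1.10.x Composer environment.

# Hedged sketch: old vs. new import paths for the BigQuery hook.

# Deprecated contrib path (still works on Airflow 1.10.x, but emits the warning):
from airflow.contrib.hooks.bigquery_hook import BigQueryHook

# Newer providers path -- requires Airflow 2.x, or the backport providers
# package added to the Composer environment's PyPI dependencies on 1.10.x:
# from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook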

3

Solved

I want to delete a DAG from the Airflow UI that's no longer available in the GCS/dags folder. I know that Airflow has a "new" way to remove dags from the DB using airflow delete_dag my_dag_id co...
Slowwitted asked 31/5, 2018 at 20:53

1

I'd like to use connections saved in airflow in a task which uses the KubernetesPodOperator. When developing the image I've used environment variables to pass database connection information down ...
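One hedged pattern (connection id, image and variable names below are hypothetical): look the connection up with BaseHook when the DAG file is parsed and pass its fields to the pod as env_vars; for passwords, a Kubernetes Secret via the operator's secrets parameter is the more robust choice.

# Hedged sketch: forward an Airflow connection's fields into the pod's environment.
from airflow.hooks.base_hook import BaseHook
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

conn = BaseHook.get_connection("my_database")   # looked up when the DAG file is parsed

run_in_pod = KubernetesPodOperator(
    task_id="run_in_pod",
    name="run-in-pod",
    namespace="default",
    image="gcr.io/my-project/my-image:latest",   # hypothetical image
    env_vars={
        "DB_HOST": conn.host,
        "DB_USER": conn.login,
        "DB_PASSWORD": conn.password,            # consider a Kubernetes Secret instead
        "DB_NAME": conn.schema,
    },
)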

1

I have a series of notebooks I want to execute each weekday, each of which is dependent on the previous one, and I'd like to have this entire process automated with Cloud Composer. I'm familiar wi...
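A sketch of one way to wire this up, assuming papermill is installed in the environment and using hypothetical notebook names: a weekday-only schedule and a chain of tasks so each notebook starts only after the previous one succeeds.

# Hedged sketch: run dependent notebooks in order, weekdays only.
# Notebook names/paths are illustrative; papermill must be added as a PyPI dependency.
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago

with DAG("weekday_notebooks",
         schedule_interval="0 6 * * 1-5",   # 06:00, Monday to Friday
         start_date=days_ago(1),
         catchup=False) as dag:

    previous = None
    for name in ["ingest", "transform", "report"]:   # hypothetical notebooks
        task = BashOperator(
            task_id="run_{}".format(name),
            bash_command=(
                "papermill /home/airflow/gcs/data/{n}.ipynb "
                "/home/airflow/gcs/data/out/{n}_{{{{ ds }}}}.ipynb".format(n=name)
            ),
        )
        if previous:
            previous >> task   # each notebook waits for the one before it
        previous = task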

3

Solved

I'm using Composer (Airflow) in Google Cloud. I want to create a new environment and take my same DAGs and Variables from the old environment into the new one. To accomplish this I do the followin...
Lantern asked 19/2, 2020 at 2:32

2

Solved

I am currently studying for the GCP Data Engineer exam and have struggled to understand when to use Cloud Scheduler and when to use Cloud Composer. From reading the docs, I have the impression tha...

6

What ways do we have available to connect to a Google Cloud SQL (MySQL) instance from the newly introduced Google Cloud Composer? The intention is to get data from a Cloud SQL instance into BigQuer...
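One commonly cited route, sketched below with hypothetical connection ids, bucket, dataset and table names (and assuming the MySQL instance is reachable from the workers, e.g. via the Cloud SQL Proxy): export the query result to GCS, then load it into BigQuery.

# Hedged sketch: Cloud SQL (MySQL) -> GCS -> BigQuery with the stock contrib operators.
from airflow.contrib.operators.mysql_to_gcs import MySqlToGoogleCloudStorageOperator
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

export_to_gcs = MySqlToGoogleCloudStorageOperator(
    task_id="export_to_gcs",
    mysql_conn_id="cloudsql_mysql",             # Airflow connection to the Cloud SQL instance
    sql="SELECT * FROM sales WHERE day = '{{ ds }}'",
    bucket="my-staging-bucket",
    filename="exports/sales_{{ ds }}_{}.json",  # newline-delimited JSON, chunked if large
)

load_to_bq = GoogleCloudStorageToBigQueryOperator(
    task_id="load_to_bq",
    bucket="my-staging-bucket",
    source_objects=["exports/sales_{{ ds }}_*.json"],
    destination_project_dataset_table="my_project.my_dataset.sales",
    source_format="NEWLINE_DELIMITED_JSON",
    write_disposition="WRITE_TRUNCATE",
    autodetect=True,
)

export_to_gcs >> load_to_bq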

3

Solved

We use GKE (Google Kubernetes Engine) to run Airflow in GCC (Google Cloud Composer) for our data pipeline. We started out with 6 nodes, and realised that the costs spiked, and we didn't use tha...

1

I have two Airflow DAGs - scheduler and worker. Scheduler runs every minute and polls for new aggregation jobs and triggers worker jobs. You can find the code for scheduler job below. However out ...
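For context, the usual Airflow 1.10 shape of a poll-and-trigger task looks like the sketch below; the DAG ids, payload and polling helper are hypothetical.

# Hedged sketch: a "scheduler" DAG task that conditionally starts a "worker" DAG run.
from airflow.operators.dagrun_operator import TriggerDagRunOperator

def find_pending_job():
    # hypothetical stand-in for the question's polling logic
    return "job-123"

def maybe_trigger(context, dag_run_obj):
    job = find_pending_job()
    if job is None:
        return None                      # returning None means no worker run is triggered
    dag_run_obj.payload = {"job_id": job}
    return dag_run_obj

trigger_worker = TriggerDagRunOperator(
    task_id="trigger_worker",
    trigger_dag_id="worker_dag",         # the DAG to start
    python_callable=maybe_trigger,       # Airflow 1.10-style API
)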

1

Solved

I can't edit values of Airflow variables in JSON format through Cloud Shell. I am using Cloud Shell to access my Airflow variable params (in JSON format) and it gives me the complete JSON when I u...
Decompress asked 18/7, 2019 at 22:40

2

On my local machine I created a virtualenv and installed Airflow. When a dag or plugin requires a python library I pip install it into the same virtualenv. How can I keep track of which libraries ...
Vedis asked 4/7, 2019 at 15:33

1

Solved

When creating an Airflow environment on GCP Composer, there is a DAG named airflow_monitoring automatically created and that comes back even when deleted. Why? How to handle it? Should I copy this ...

1

Solved

I have written an airflow plugin that simply contains one custom operator (to support CMEK in BigQuery). I can create a simple DAG with a single task that uses this operator and that executes fine....
Statics asked 23/1, 2019 at 21:35
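For reference, the plugin skeleton such an operator typically lives in (class and plugin names below are hypothetical); on Composer the file goes in the environment's plugins/ folder.

# Hedged sketch: a plugin exposing one custom operator.
from airflow.plugins_manager import AirflowPlugin
from airflow.models import BaseOperator
from airflow.utils.decorators import apply_defaults

class CmekBigQueryOperator(BaseOperator):
    @apply_defaults
    def __init__(self, destination_encryption_configuration=None, *args, **kwargs):
        super(CmekBigQueryOperator, self).__init__(*args, **kwargs)
        self.destination_encryption_configuration = destination_encryption_configuration

    def execute(self, context):
        # a real operator would call the BigQuery hook here with the CMEK settings
        self.log.info("Would run a BigQuery job with CMEK config %s",
                      self.destination_encryption_configuration)

class CmekPlugin(AirflowPlugin):
    name = "cmek_plugin"
    operators = [CmekBigQueryOperator]

# In a DAG file (Airflow 1.10 plugin import style):
# from airflow.operators.cmek_plugin import CmekBigQueryOperator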

1

Solved

I run Airflow in a managed Cloud Composer environment (version 1.9.0), which runs on a Kubernetes 1.10.9-gke.5 cluster. All my DAGs run daily at 3:00 AM or 4:00 AM. But sometime in the morning, I s...

1

Solved

How can I import a json file into Google Cloud Composer using command line? I tried the below command gcloud composer environments run comp-env --location=us-central1 variables -- --import compos...
Margit asked 17/1, 2019 at 13:43

2

I am using Cloud Composer and I noticed that it selects the version of Apache Airflow and Python (2.7.x) for me. I want to use a different version of Airflow and/or Python. How can I change this?
Protozoal asked 1/5, 2018 at 19:7

1

Solved

I would like to be able to access data on a Google Sheet when running Python code via Cloud Composer; this is something I know how to do in several ways when running code locally, but moving to the...
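A sketch of one route, assuming the google-api-python-client package is installed and the sheet is shared with the environment's service account (the spreadsheet id and range are hypothetical): use application default credentials from the worker and the Sheets v4 API.

# Hedged sketch: read a sheet range with the environment's default credentials.
import google.auth
from googleapiclient.discovery import build

def read_sheet():
    creds, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/spreadsheets.readonly"])
    service = build("sheets", "v4", credentials=creds)
    result = service.spreadsheets().values().get(
        spreadsheetId="1AbC-hypothetical-spreadsheet-id",
        range="Sheet1!A1:C10",
    ).execute()
    return result.get("values", [])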
