I am using Docker Compose with Bitnami's Airflow image along with MinIO. I can get Airflow to talk to AWS S3, but when I try to substitute MinIO I get this error:
File "/opt/bitnami/airflow/venv/lib/python3.8/site-packages/botocore/client.py", line 719, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
Here's the .env:
OBJECT_STORE=s3://xxxx:xxxxx@S3?host%3Dhttp%3A%2F%2Fminio1%3A9001
Here's the connection set as an environment variable in the compose file:
AIRFLOW_CONN_AWS_S3=${OBJECT_STORE}
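For reference, here's a quick way to inspect what Airflow actually parses out of that URI (a sketch run inside the Airflow container, using the redacted URI from the .env above):

from airflow.models import Connection

# Parse the same (redacted) URI that AIRFLOW_CONN_AWS_S3 carries and dump
# what Airflow sees as the login, host, and extra fields.
conn = Connection(uri="s3://xxxx:xxxxx@S3?host%3Dhttp%3A%2F%2Fminio1%3A9001")
print(conn.login, conn.host, conn.extra_dejson)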
Here's the Airflow test DAG:
from datetime import timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import paths
from airflow.providers.amazon.aws.hooks.s3 import S3Hook
from airflow.utils.dates import days_ago

default_args = {
    'owner': 'airflow',
    'retries': 1,
    'retry_delay': timedelta(seconds=5),
    'provide_context': True
}

dag = DAG(
    dag_id='s3_test',
    tags=['ti'],
    default_args=default_args,
    start_date=days_ago(2),
    schedule_interval='0 * * * *',
    catchup=False
)

def func_test():
    s3 = S3Hook('aws_s3')
    obj = s3.get_key("file.csv", "mybucket")
    contents = obj.get()['Body'].read().decode('utf-8')
    print('contents', contents)

t1 = PythonOperator(
    task_id='test',
    python_callable=func_test,
    dag=dag
)

t1
I know the file exists in the bucket and the path is correct, and I gave the MinIO user account full admin rights too. I'm not sure what is causing the 403.
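For what it's worth, here's a minimal boto3 sketch that bypasses Airflow and hits the same endpoint directly (hostname, port, bucket, and key copied from above; credentials redacted), which should help isolate whether the 403 comes from MinIO itself or from Airflow's connection handling:

import boto3

# Talk to MinIO directly with the same endpoint and (redacted) credentials
# as the Airflow connection; if this also returns a 403, the problem is on
# the MinIO side rather than in Airflow's URI parsing.
s3 = boto3.client(
    "s3",
    endpoint_url="http://minio1:9001",
    aws_access_key_id="xxxx",
    aws_secret_access_key="xxxxx",
)
print(s3.head_object(Bucket="mybucket", Key="file.csv"))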