I ran some manual experiments and found the following; I hope this clarifies things.
min_file_process_interval:
For example, say this is set to 10 seconds. This is how often each DAG file is re-parsed, which also means that, between the completion of a task in any DAG and the triggering of its dependent task, there can be a delay of up to 10 seconds, because Airflow only checks every 10 seconds whether the upstream tasks have completed and the dependent tasks can be triggered.
If this value is higher, tasks in your DAGs will take longer to trigger, but Airflow will consume less CPU.
See also: Airbnb Airflow using all system resources
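To make the trade-off concrete, here is a minimal sketch of how this could look in airflow.cfg; the 10-second value is just the illustrative number from the example above, not a recommendation:

    [scheduler]
    # Re-parse each DAG file at most once every 10 seconds.
    # Lower values -> faster task triggering, but more CPU used by the scheduler.
    min_file_process_interval = 10

The same setting can also be supplied via the environment variable AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL, which takes precedence over airflow.cfg.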
dag_dir_list_interval: any new Python DAG file that you put in the dags folder will take up to this long to be picked up by Airflow and show up in the UI.
As noted in a comment by Centrepiece: with dag_dir_list_interval the scheduler lists the DAG definition files, and those files are then processed every min_file_process_interval.
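Putting the two settings together, a sketch of the relevant [scheduler] block in airflow.cfg could look like this (the values are illustrative, not defaults):

    [scheduler]
    # How often the dags folder is scanned for new or deleted DAG files.
    dag_dir_list_interval = 60
    # How often each already-known DAG file is re-parsed.
    min_file_process_interval = 10

With these values, a brand-new file dropped into the dags folder can take up to about 60 seconds to appear in the UI, while changes to an already-known DAG file are picked up within roughly 10 seconds.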