I'm using Airflow v1.7.1.3.
I have two DAGs, dag_a and dag_b. dag_b triggers 10 runs of dag_a in a row; since dag_a is defined with concurrency=1 and max_active_runs=1, those runs should theoretically execute one by one. In reality, all 10 dag_a runs execute in parallel, so the concurrency parameter appears to have no effect. Can anyone tell me why?
Here's the pseudocode:
In dag_a.py:
from datetime import datetime

from airflow import DAG

dag = DAG('dag_a',
          start_date=datetime.now(),
          default_args=default_args,
          schedule_interval=None,  # run only when triggered externally
          concurrency=1,
          max_active_runs=1)
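(dag_a's tasks aren't shown above; here's a minimal placeholder task, purely hypothetical, just so the example is self-contained:)

from airflow.operators.bash_operator import BashOperator

# Hypothetical stand-in for dag_a's actual work.
do_work = BashOperator(task_id='do_work',
                       bash_command='sleep 30',
                       dag=dag)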
In dag_b.py:
import logging
import time
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from fabric.api import local

log = logging.getLogger(__name__)

dag = DAG('dag_b',
          start_date=datetime.now(),
          default_args=default_args,
          schedule_interval='0 22 */1 * *',  # every day at 22:00
          concurrency=1,
          max_active_runs=1)
def trigger_dag_a(**context):
    # Trigger dag_a 10 times, 2 seconds apart.
    for _ in range(10):
        time.sleep(2)
        cmd = "airflow trigger_dag dag_a"
        log.info("cmd: %s" % cmd)
        msg = local(cmd)    # "local" is fabric's shell-command helper
        log.info(msg)
trigger_dag_a_proc = PythonOperator(python_callable=trigger_dag_a,
                                    provide_context=True,
                                    task_id='trigger_dag_a_proc',
                                    dag=dag)
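For reference, the same fan-out could also be written in-process with TriggerDagRunOperator instead of shelling out to the CLI; a rough sketch under the 1.7-era API (the always_trigger callback and task ids are illustrative, not part of my actual code):

from airflow.operators.dagrun_operator import TriggerDagRunOperator

def always_trigger(context, dag_run_obj):
    # Returning the dag_run_obj tells the operator to create the run.
    return dag_run_obj

for i in range(10):
    TriggerDagRunOperator(task_id='trigger_dag_a_%d' % i,
                          trigger_dag_id='dag_a',
                          python_callable=always_trigger,
                          dag=dag)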
