I have installed the Apache Spark provider on top of an existing Airflow 2.0.0 installation, as follows:

    pip install apache-airflow-providers-apache-spark

After installing it, the scheduler fails with an import error from the provider package:

    [2021-01-19 18:49…] … from 'apache-airflow-providers-apache-spark' package: name 'client' is not defined
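For reference, here is a minimal DAG that exercises the provider's import path — the DAG id, schedule, application path, and connection id below are illustrative, not taken from my actual setup:

    from datetime import datetime

    from airflow import DAG
    # Importing the operator triggers the provider's module-level imports,
    # so a broken provider install fails here rather than at run time.
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    with DAG(
        dag_id="spark_smoke_test",           # hypothetical DAG id
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,              # trigger manually
    ) as dag:
        SparkSubmitOperator(
            task_id="submit_app",
            application="/opt/jobs/app.py",  # hypothetical path to a Spark job
            conn_id="spark_default",         # Airflow's default Spark connection id
        )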
The scheduler and the webserver run in separate containers. When I run the DAG and check its logs from the webserver, it shows this particular error:

    *** Log file does not exist: /usr/local/airflow/logs/indexing/index_articles/2019-12-31T00:00:00+00:00

My metadata database connection is configured as:

    AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow…
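If I understand Airflow's log serving correctly, the webserver reads task logs straight from its local base_log_folder, so when the scheduler writes logs in one container and the webserver looks in another, the file genuinely does not exist there unless both containers mount the same volume (or remote logging is configured). A quick check that can be run in each container — a minimal sketch, assuming Airflow 2.x, where base_log_folder lives in the [logging] section:

    from airflow.configuration import conf

    # Print where this container expects task logs to live. The webserver can
    # only serve a log file if this path resolves to the same shared volume
    # in every container that writes or reads logs.
    print(conf.get("logging", "base_log_folder"))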
I tried the Airflow tutorial DAG; it works with the scheduler, and I can see the logs generated by the scheduled runs. However, if I test a task from the command line, I see no output beyond the startup lines:

    [2018-09-10 15:41:43,121] {__init__.py:…} INFO - …
    [2018-09-10 15:41:43,281] {models.py:258} INFO - Filling up the DagBag from /Users/xiang/Documents/BigData/airflow…
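Since the command-line run stops right after "Filling up the DagBag", one way to narrow this down is to load the DagBag directly and check for silent import errors — a minimal sketch, with an illustrative DAG folder path:

    from airflow.models import DagBag

    # Load the same DAG folder the CLI loads; the "Filling up the DagBag"
    # log line corresponds to this step.
    dagbag = DagBag(dag_folder="/path/to/airflow/dags")
    print(dagbag.import_errors)  # DAG files that failed to import
    print(list(dagbag.dags))     # DAG ids that parsed successfully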