I recently installed Airflow
for my workflows. While setting up my project, I executed the following command:
airflow initdb
which returned the following error:
[2016-08-15 11:17:00,314] {__init__.py:36} INFO - Using executor SequentialExecutor
DB: sqlite:////Users/mikhilraj/airflow/airflow.db
[2016-08-15 11:17:01,319] {db.py:222} INFO - Creating tables
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
ERROR [airflow.models.DagBag] Failed to import: /usr/local/lib/python2.7/site-packages/airflow/example_dags/example_twitter_dag.py
Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 247, in process_file
m = imp.load_source(mod_name, filepath)
File "/usr/local/lib/python2.7/site-packages/airflow/example_dags/example_twitter_dag.py", line 26, in <module>
from airflow.operators import BashOperator, HiveOperator, PythonOperator
ImportError: cannot import name HiveOperator
Done.
I checked some similar issues on the web, which suggested installing airflow[hive] and pyhs2, but it doesn't seem to work.
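For reference, the fix suggested in those issues boils down to installing Airflow's Hive extras (the commands below assume a pip-managed install of the 2016-era airflow package; quoting the brackets keeps the shell from globbing them):

```shell
# Install Airflow with the Hive extras so the dependencies behind
# HiveOperator are available. The quotes prevent the shell from
# interpreting the square brackets.
pip install "airflow[hive]"

# Some answers also suggest the pyhs2 Hive client library:
pip install pyhs2

# Re-run the database initialization afterwards:
airflow initdb
```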
airflow[hive] worked for me. Can you tell me how to set load_examples to False? – Rusty Aug 16 '16 at 8:18

In the airflow.cfg file. Airflow automatically creates the default airflow.cfg file for you in the AIRFLOW_HOME dir. The file has a variable load_examples which by default is set to True. – Vineet Goel Aug 16 '16 at 18:41

pip install airflow[hive] was sufficient to resolve the error on a fresh install for me. – Taylor Edmiston Nov 14 '16 at 21:01
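To disable the example DAGs mentioned in the comments, the setting lives in the [core] section of airflow.cfg. A sketch of the change, assuming the default AIRFLOW_HOME of ~/airflow:

```ini
# ~/airflow/airflow.cfg  (AIRFLOW_HOME defaults to ~/airflow)
[core]
# Set to False so Airflow stops loading the bundled example DAGs,
# including example_twitter_dag.py, which triggers the HiveOperator import.
load_examples = False
```

After editing the file, re-run airflow initdb so Airflow picks up the new setting (examples already registered in the database may need a reset to disappear).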