Created September 22, 2015 11:53
Gist: natea/37846d308462a15e72f0
2015-09-22 11:49:40,371 INFO 12850 [luigi-interface] worker.py:282 - [pid 12850] Worker Worker(salt=039843096, host=precise64, username=hadoop, pid=12850) running SqoopImportFromMysql(overwrite=False, destination=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/auth_userprofile/dt=2015-09-22/, credentials=/edx/etc/edx-analytics-pipeline/input.json, database=edxapp, num_mappers=None, verbose=False, table_name=auth_userprofile, where=None, columns=('user_id', 'gender', 'year_of_birth', 'level_of_education'), null_string=\\N, fields_terminated_by=, delimiter_replacement= , mysql_delimiters=False)
mkdir: `/tmp/luigi/partial/luigi/partial': No such file or directory
Exception AttributeError: AttributeError('_process',) in <bound method HdfsAtomicWritePipe.__del__ of <luigi.hdfs.HdfsAtomicWritePipe object at 0x40018d0>> ignored
2015-09-22 11:49:45,372 ERROR 12850 [luigi-interface] worker.py:304 - [pid 12850] Worker Worker(salt=039843096, host=precise64, username=hadoop, pid=12850) failed SqoopImportFromMysql(overwrite=False, destination=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/auth_userprofile/dt=2015-09-22/, credentials=/edx/etc/edx-analytics-pipeline/input.json, database=edxapp, num_mappers=None, verbose=False, table_name=auth_userprofile, where=None, columns=('user_id', 'gender', 'year_of_birth', 'level_of_education'), null_string=\\N, fields_terminated_by=, delimiter_replacement= , mysql_delimiters=False)
Traceback (most recent call last):
  File "/var/lib/analytics-tasks/devstack/venv/local/lib/python2.7/site-packages/luigi/worker.py", line 292, in _run_task
    task.run()
  File "/var/lib/analytics-tasks/devstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 567, in run
    self.job_runner().run_job(self)
  File "/var/lib/analytics-tasks/devstack/venv/local/lib/python2.7/site-packages/edx/analytics/tasks/sqoop.py", line 221, in run_job
    password_target.remove()
  File "/var/lib/analytics-tasks/devstack/venv/local/lib/python2.7/site-packages/luigi/hdfs.py", line 707, in remove
    remove(self.path, skip_trash=skip_trash)
  File "/var/lib/analytics-tasks/devstack/venv/local/lib/python2.7/site-packages/luigi/hdfs.py", line 497, in remove
    call_check(cmd)
  File "/var/lib/analytics-tasks/devstack/venv/local/lib/python2.7/site-packages/luigi/hdfs.py", line 49, in call_check
    raise HDFSCliError(command, p.returncode, stdout, stderr)
HDFSCliError: Command ['/edx/app/hadoop/hadoop/bin/hadoop', 'fs', '-rmr', '/tmp/luigi/partial/luigitemp-747790816'] failed [exit code 1]
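The HDFSCliError at the bottom of the trace comes from luigi's thin wrapper around the `hadoop fs` CLI: it runs the command in a subprocess and raises if the exit code is nonzero. Here that masks the earlier, more informative `mkdir: /tmp/luigi/partial/...` failure, because the cleanup `-rmr` of a temp path that was never created also exits 1. A minimal sketch of that check-and-raise pattern (simplified, not the actual luigi.hdfs code; the `CliError` name is hypothetical):

```python
import subprocess


class CliError(Exception):
    """Raised when a shelled-out command exits nonzero (cf. luigi's HDFSCliError)."""

    def __init__(self, command, returncode, stdout, stderr):
        self.returncode = returncode
        super(CliError, self).__init__(
            "Command %r failed [exit code %d]" % (command, returncode))


def call_check(command):
    # Run the command, capture its output, and raise on a nonzero exit code.
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = p.communicate()
    if p.returncode != 0:
        raise CliError(command, p.returncode, stdout, stderr)
    return stdout
```

Because any nonzero exit surfaces as the same exception type, a failed cleanup in an `except`/`finally` path (as in `run_job` above) can hide the original error that caused the task to fail in the first place.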