
I am trying to run this simple Pig script:

a = LOAD 'example';
dump a;

where 'example' is a file in my HDFS home directory. It fails with this exception:

ERROR 2017: Internal error creating job configuration.

org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias a
        at org.apache.pig.PigServer.openIterator(PigServer.java:836)
        at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
        at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
        at org.apache.pig.Main.run(Main.java:538)
        at org.apache.pig.Main.main(Main.java:157)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:622)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.pig.PigException: ERROR 1002: Unable to store alias a
        at org.apache.pig.PigServer.storeEx(PigServer.java:935)
        at org.apache.pig.PigServer.store(PigServer.java:898)
        at org.apache.pig.PigServer.openIterator(PigServer.java:811)
        ... 12 more
Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException: ERROR 2017: Internal error creating job configuration.
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:848)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:294)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:177)
        at org.apache.pig.PigServer.launchPlan(PigServer.java:1264)
        at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1249)
        at org.apache.pig.PigServer.storeEx(PigServer.java:931)
        ... 14 more
Caused by: java.io.NotSerializableException: org.apache.log4j.Level
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1183)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1547)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1508)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
        at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
        at org.apache.pig.impl.util.JarManager.createJar(JarManager.java:186)
        at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:517)
        ... 19 more

The root cause is "java.io.NotSerializableException: org.apache.log4j.Level".

I reproduced this with Pig 0.11.0, 0.11.1, and 0.12.1.

Java version "1.6.0_31"

Any idea?

Alfonso Nishikawa

1 Answer


I had Hadoop's bin directory in my PATH:

PATH=/home/.../hadoop/bin:...

When Hadoop is on the PATH, Pig not only reads the configuration files ~/hadoop/conf/core-site.xml and ~/hadoop/conf/mapred-site.xml, but also picks up the jars in ~/hadoop/lib/*. There appears to be a conflict between those jars and the ones bundled with the Pig installation.

I removed ~/hadoop/bin from PATH, copied the configuration files core-site.xml and mapred-site.xml into ~/pig/conf, and everything started working.
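The workaround can be sketched as the following shell commands. The ~/hadoop and ~/pig locations match the paths above; adjust them to your own installation layout:

```shell
# Workaround sketch, assuming Hadoop lives in ~/hadoop and Pig in ~/pig.

# 1. Drop ~/hadoop/bin from PATH so Pig stops scanning ~/hadoop/lib/*:
export PATH=$(echo "$PATH" | tr ':' '\n' | grep -v "$HOME/hadoop/bin" | paste -sd ':' -)

# 2. Copy the cluster configuration into Pig's own conf directory so Pig
#    can still reach HDFS and the JobTracker:
cp ~/hadoop/conf/core-site.xml ~/hadoop/conf/mapred-site.xml ~/pig/conf/
```

Step 1 splits PATH on ':', filters out the Hadoop bin entry, and rejoins the remaining components.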
