
I am in the process of upgrading the version of Apache Phoenix used on a CDH 4.7.0 cluster running HBase 0.94.15-cdh-4.7.0. My goal is to migrate us from Phoenix 3.1.0 to 3.3.1, but my join statements begin to fail as soon as I upgrade to 3.2.2.

The Phoenix upgrade documentation is laconic at best, but my understanding is that pushing out a new parcel with the upgraded jar via Cloudera Manager, and then connecting to the cluster with an upgraded client, should be all that is necessary to move from one version to another.

After upgrading, all my tests pass save the join tests. A simple join statement:

SELECT A."id", A."group", B."group", B."name" 
  FROM "Test_Table_1" AS A 
  INNER JOIN "Test_Table_2" AS B 
  ON A."group" = B."group" 
  LIMIT 10

results in the following exception (full stack trace at the end of this post):

Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=14, exceptions:
Mon Aug 24 09:21:18 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException

In version 3.2, Phoenix changed its Snappy dependency to org.iq80.snappy version 0.3, yet a jar tf phoenix-core-3.2.2.jar does not reveal any of the Snappy classes. This makes me think the snappy jar is missing from the classpath, despite it being a dependency in the Phoenix 3.2 pom. Here is where the trouble starts: I can copy snappy-0.3.jar onto my HBase cluster, but I can't figure out how to modify the classpath to include it.
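Concretely, the check behind that statement was roughly this (a rough sketch; the jars are local copies from my parcel, so adjust paths to your layout):

    # Look for the relocated Snappy classes inside the Phoenix core jar
    jar tf phoenix-core-3.2.2.jar | grep 'org/iq80/snappy' \
        || echo "no org.iq80.snappy classes found"

    # Confirm that the standalone snappy jar does contain the missing class
    jar tf snappy-0.3.jar | grep 'org/iq80/snappy/CorruptionException'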

I have included the following snippet in Cloudera's HBase Service Environment Advanced Configuration (Safety Valve):

    HBASE_CLASSPATH="/opt/cloudera/parcels/phoenix_core/lib/java/*:$HBASE_CLASSPATH"

Note that my Phoenix jar lives in /opt/cloudera/parcels/phoenix_core/lib/java, so I know that this directory is on the HBase classpath even though it does not appear when I run hbase classpath in a terminal on one of the data nodes.
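The only checks I know of are something along these lines (a rough sketch, run on one of the data nodes; the second command is just my guess at how to inspect what the running RegionServer was actually started with):

    # What the hbase wrapper script reports as the classpath
    hbase classpath | tr ':' '\n' | grep -i -e phoenix -e snappy

    # What the running RegionServer JVM's command line actually contains
    ps -ef | grep HRegionServer | grep -v grep | tr ':' '\n' | grep -i -e phoenix -e snappy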

So with this background: how do I verify that snappy-0.3.jar is actually on the classpath, and more broadly, how do I get my join statements to work again?

Full stack trace on join:

java.sql.SQLException: Encountered exception in sub plan [0] execution.
    at org.apache.phoenix.execute.HashJoinPlan.iterator(HashJoinPlan.java:151)
    at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:230)
    at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:221)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:54)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:221)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:1032)
    at MainTest.execute(MainTest.java:87)
    at MainTest.join1(MainTest.java:297)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:78)
    at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:212)
    at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:68)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.sql.SQLException: java.util.concurrent.ExecutionException: java.lang.reflect.UndeclaredThrowableException
    at org.apache.phoenix.cache.ServerCacheClient.addServerCache(ServerCacheClient.java:202)
    at org.apache.phoenix.join.HashCacheClient.addHashCache(HashCacheClient.java:83)
    at org.apache.phoenix.execute.HashJoinPlan$HashSubPlan.execute(HashJoinPlan.java:333)
    at org.apache.phoenix.execute.HashJoinPlan$1.call(HashJoinPlan.java:130)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: java.lang.reflect.UndeclaredThrowableException
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:202)
    at org.apache.phoenix.cache.ServerCacheClient.addServerCache(ServerCacheClient.java:194)
    ... 7 more
Caused by: java.lang.reflect.UndeclaredThrowableException
    at com.sun.proxy.$Proxy20.addServerCache(Unknown Source)
    at org.apache.phoenix.cache.ServerCacheClient$1.call(ServerCacheClient.java:172)
    at org.apache.phoenix.cache.ServerCacheClient$1.call(ServerCacheClient.java:167)
    ... 4 more
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=14, exceptions:
Mon Aug 24 09:21:18 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:21:20 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:21:22 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:21:24 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:21:27 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:21:31 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:21:36 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:21:44 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:22:01 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:22:34 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:23:38 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:24:43 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:25:48 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException
Mon Aug 24 09:26:52 PDT 2015, org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@1dfa77a9, org.apache.hadoop.ipc.RemoteException(java.lang.NoClassDefFoundError): IPC server unable to read call parameters: org/iq80/snappy/CorruptionException

    at org.apache.hadoop.hbase.client.ServerCallable.withRetries(ServerCallable.java:187)
    at org.apache.hadoop.hbase.ipc.ExecRPCInvoker.invoke(ExecRPCInvoker.java:79)
    ... 7 more

1 Answer


I have come up with a hacky workaround that prevents the error: I unpacked the snappy-0.3.jar file and added the resulting class files directly to phoenix-core-3.3.1.jar (the steps, and a script putting them together, are below):

  1. jar xf snappy-0.3.jar
  2. jar uf phoenix-core-3.3.1.jar org/iq80/snappy/*
  3. Push the modified phoenix-core jar out in a new parcel.

Joins now work!
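Putting the steps together, the repack looks roughly like this (a sketch rather than a polished script; the jar names and paths are from my environment, so treat them as placeholders):

    #!/usr/bin/env bash
    # Repack the org.iq80 Snappy classes into the Phoenix core jar.
    set -euo pipefail

    WORKDIR=$(mktemp -d)
    cp snappy-0.3.jar phoenix-core-3.3.1.jar "$WORKDIR"
    cd "$WORKDIR"

    # 1. Explode the snappy jar so its class files land under org/iq80/snappy/
    jar xf snappy-0.3.jar

    # 2. Append those class files to the Phoenix core jar
    jar uf phoenix-core-3.3.1.jar org/iq80/snappy/*

    # 3. Sanity check before shipping the updated jar in a new parcel
    jar tf phoenix-core-3.3.1.jar | grep 'org/iq80/snappy/CorruptionException'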

If anybody has a better answer that doesn't involve "rolling my own" so to speak, I would still love to hear it.
