91acb8da23
The option `protobuf.scope` defines whether the protobuf 2.5.0 dependency is marked as `provided` or not.

* All declarations except those in yarn-csi are updated.
* The exceptions are the modules which do not compile without their own explicit import: hadoop-hdfs-client and hadoop-hdfs-rbf.

It is actually interesting to see where and how that compile fails:

* hadoop-hdfs-client: ClientNamenodeProtocolTranslatorPB
* hadoop-hdfs-rbf: RouterAdminProtocolTranslatorPB

Both fail with "class file for com.google.protobuf.ServiceException not found", even though *neither class uses it*. What they do have is references to ProtobufHelper.getRemoteException(), which is overloaded for both the shaded ServiceException and the original one.

Hypothesis: javac overload resolution needs to look at the entire class hierarchy before it can decide which overload to use.

Proposed: add a new method IOException extractException(org.apache.hadoop.thirdparty.protobuf.ServiceException) and move our own code to it. Without the overloading, the unshaded class should not be needed at compile time.

Change-Id: I70354abfe3f1fdc03c418dac88e60f8cc4929a33
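A minimal, self-contained sketch of the overload-resolution issue described above. `LegacyServiceException` and `ShadedServiceException` are stand-ins for `com.google.protobuf.ServiceException` and `org.apache.hadoop.thirdparty.protobuf.ServiceException`; the method bodies and exact signatures are assumptions for illustration, not the real `ProtobufHelper` API.

```java
import java.io.IOException;

// Stand-ins for the two ServiceException classes: the unshaded protobuf 2.5.0
// one (com.google.protobuf.ServiceException) and the shaded copy
// (org.apache.hadoop.thirdparty.protobuf.ServiceException).
class LegacyServiceException extends Exception {
  LegacyServiceException(String msg) { super(msg); }
}

class ShadedServiceException extends Exception {
  ShadedServiceException(String msg) { super(msg); }
}

final class ProtobufHelperSketch {

  // Overloaded pair, mirroring ProtobufHelper.getRemoteException(): to pick
  // between these, javac has to examine *both* parameter types, so a caller
  // that only ever passes ShadedServiceException still needs the legacy class
  // on its compile classpath.
  static IOException getRemoteException(LegacyServiceException e) {
    Throwable cause = e.getCause();
    return cause instanceof IOException ? (IOException) cause : new IOException(e);
  }

  static IOException getRemoteException(ShadedServiceException e) {
    Throwable cause = e.getCause();
    return cause instanceof IOException ? (IOException) cause : new IOException(e);
  }

  // The proposed alternative: a distinctly named, non-overloaded method for the
  // shaded type. Callers that use only this method never force javac to resolve
  // an overload, so the unshaded class can stay "provided" and be absent at
  // compile time.
  static IOException extractException(ShadedServiceException e) {
    Throwable cause = e.getCause();
    return cause instanceof IOException ? (IOException) cause : new IOException(e);
  }

  public static void main(String[] args) {
    try {
      throw new ShadedServiceException("rpc failed");
    } catch (ShadedServiceException e) {
      // Compiles without any knowledge of LegacyServiceException.
      System.out.println(extractException(e).getMessage());
    }
  }
}
```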
Directory listing at this commit:

* bin
* conf
* dev-support
* hadoop-mapreduce-client
* hadoop-mapreduce-examples
* lib/jdiff
* shellprofile.d
* .gitignore
* pom.xml