
Unable to load libMapRClient.so native library

Question asked by mvince on Mar 13, 2017
Latest reply on Mar 14, 2017 by mvince

Hi, I'm trying to integrate Druid with MapR, and I'm getting:

java.lang.UnsatisfiedLinkError: Native Library /tmp/tmp4/mapr-mapr-libMapRClient.5.1.0-mapr.so already loaded in another classloader

I tried setting -Dmapr.library.flatclass, but with no luck.
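
For reference, this is roughly how I passed the flag (a sketch of my setup; I'm assuming the forked peon JVMs pick it up via druid.indexer.runner.javaOpts, and that the middleManager's jvm.config is the right place for the node's own JVM):

    # middleManager runtime.properties -- options forwarded to the forked peon JVMs (assumed to be where the flag needs to land)
    druid.indexer.runner.javaOpts=-server -Xmx2g -Dmapr.library.flatclass

    # middleManager jvm.config -- the node's own JVM
    -Dmapr.library.flatclass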

Here is the log from when I turned shimloader.debuglog on:

2017-03-13 14:41:24.855 [1] Load in root Classloader: true.
2017-03-13 14:41:24.856 [1] Injecting Native Loader
2017-03-13 14:41:24.856 [1] getRootClassLoader: thread classLoader is 'java.net.URLClassLoader'
2017-03-13 14:41:24.857 [1] getRootClassLoader: root classLoader is 'sun.misc.Launcher.ExtClassLoader'
2017-03-13 14:41:24.877 [1] injectNativeLoader: Loading MapR native classes
2017-03-13 14:41:24.882 [1] Searching for native library '/com/mapr/fs/native/Linux/x86_64/libMapRClient.so'.
2017-03-13 14:41:24.883 [1] Extracting native library to '/tmp/tmp4'.
2017-03-13 14:41:24.883 [1] Native library for this platform is 'mapr-mapr-libMapRClient.5.1.0-mapr.so'.
2017-03-13 14:41:24.883 [1] Target file '/tmp/tmp4/mapr-mapr-libMapRClient.5.1.0-mapr.so' already exists, verifying checksum.
2017-03-13 14:41:26.026 [1] Checksum matches, will not extract from the JAR.
2017-03-13 14:41:26.048 [1] Native library loaded.
2017-03-13 14:41:26.049 [1] Native Loader injected
2017-03-13 14:41:26.056 [1] MapR native classes already loaded
2017-03-13 14:41:26.481 [1] MapR native classes already loaded
2017-03-13 14:41:30.000 [33] Load in root Classloader: true.
2017-03-13 14:41:30.001 [33] Injecting Native Loader
2017-03-13 14:41:30.001 [33] getRootClassLoader: thread classLoader is 'java.net.URLClassLoader'
2017-03-13 14:41:30.001 [33] getRootClassLoader: root classLoader is 'java.net.URLClassLoader'
2017-03-13 14:41:30.016 [33] injectNativeLoader: Loading MapR native classes
2017-03-13 14:41:30.021 [33] Searching for native library '/com/mapr/fs/native/Linux/x86_64/libMapRClient.so'.
2017-03-13 14:41:30.022 [33] Extracting native library to '/tmp/tmp4'.
2017-03-13 14:41:30.022 [33] Native library for this platform is 'mapr-mapr-libMapRClient.5.1.0-mapr.so'.
2017-03-13 14:41:30.022 [33] Target file '/tmp/tmp4/mapr-mapr-libMapRClient.5.1.0-mapr.so' already exists, verifying checksum.
2017-03-13 14:41:31.127 [33] Checksum matches, will not extract from the JAR.
2017-03-13 14:41:31.128 [33] Unable to load libMapRClient.so native library.
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:344)
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:226)
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:61)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147)
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2362)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2579)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2531)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2444)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1245)
    at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:180)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:172)
    at io.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:110)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223)
    at com.google.inject.spi.Elements.getElements(Elements.java:101)
    at com.google.inject.spi.Elements.getElements(Elements.java:92)
    at com.google.inject.util.Modules$RealOverriddenModuleBuilder$1.configure(Modules.java:172)
    at com.google.inject.AbstractModule.configure(AbstractModule.java:59)
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223)
    at com.google.inject.spi.Elements.getElements(Elements.java:101)
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133)
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103)
    at com.google.inject.Guice.createInjector(Guice.java:95)
    at com.google.inject.Guice.createInjector(Guice.java:72)
    at com.google.inject.Guice.createInjector(Guice.java:62)
    at io.druid.initialization.Initialization.makeInjectorWithModules(Initialization.java:366)
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree1$1(SparkDruidIndexer.scala:429)
    at io.druid.indexer.spark.SerializedJsonStatic$.injector$lzycompute(SparkDruidIndexer.scala:428)
    at io.druid.indexer.spark.SerializedJsonStatic$.injector(SparkDruidIndexer.scala:427)
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree2$1(SparkDruidIndexer.scala:456)
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper$lzycompute(SparkDruidIndexer.scala:455)
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper(SparkDruidIndexer.scala:454)
    at io.druid.indexer.spark.SparkBatchIndexTask$.runTask(SparkBatchIndexTask.scala:296)
    at io.druid.indexer.spark.SparkBatchIndexTask.runTask(SparkBatchIndexTask.scala)
    at io.druid.indexer.spark.Runner.runTask(Runner.java:29)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201)
    at io.druid.indexer.spark.SparkBatchIndexTask.run(SparkBatchIndexTask.scala:162)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436)
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.UnsatisfiedLinkError: Native Library /tmp/tmp4/mapr-mapr-libMapRClient.5.1.0-mapr.so already loaded in another classloader
    at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1907)
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
    at java.lang.Runtime.load0(Runtime.java:809)
    at java.lang.System.load(System.java:1086)
    at com.mapr.fs.shim.LibraryLoader.load(LibraryLoader.java:31)
    ... 52 more
2017-03-13T14:41:31,157 ERROR [task-runner-0-priority-0] io.druid.indexer.spark.SparkBatchIndexTask - Error running task [index_spark_sparkTest_2016-01-03T00:00:00.000Z_2017-08-04T00:00:00.000Z_2017-03-13T13:41:22.198Z]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:204) ~[druid-indexing-service-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    at io.druid.indexer.spark.SparkBatchIndexTask.run(SparkBatchIndexTask.scala:162) [druid-spark-batch_2.10-0.9.2.15-SNAPSHOT.jar:0.9.2.15-SNAPSHOT]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_101]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_101]
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_101]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_101]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_101]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_101]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    ... 7 more
Caused by: java.lang.ExceptionInInitializerError
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:243) ~[?:?]
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:61) ~[?:?]
    at java.lang.Class.forName0(Native Method) ~[?:1.8.0_101]
    at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_101]
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147) ~[?:?]
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2362) ~[?:?]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2579) ~[?:?]
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2531) ~[?:?]
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2444) ~[?:?]
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1245) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:180) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:172) ~[?:?]
    at io.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:110) ~[?:?]
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:101) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:92) ~[guice-3.0.jar:?]
    at com.google.inject.util.Modules$RealOverriddenModuleBuilder$1.configure(Modules.java:172) ~[guice-3.0.jar:?]
    at com.google.inject.AbstractModule.configure(AbstractModule.java:59) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:101) ~[guice-3.0.jar:?]
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133) ~[guice-3.0.jar:?]
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:95) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:72) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:62) ~[guice-3.0.jar:?]
    at io.druid.initialization.Initialization.makeInjectorWithModules(Initialization.java:366) ~[druid-server-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree1$1(SparkDruidIndexer.scala:429) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector$lzycompute(SparkDruidIndexer.scala:428) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector(SparkDruidIndexer.scala:427) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree2$1(SparkDruidIndexer.scala:456) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper$lzycompute(SparkDruidIndexer.scala:455) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper(SparkDruidIndexer.scala:454) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask$.runTask(SparkBatchIndexTask.scala:296) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask.runTask(SparkBatchIndexTask.scala) ~[?:?]
    at io.druid.indexer.spark.Runner.runTask(Runner.java:29) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_101]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_101]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_101]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_101]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    ... 7 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_101]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_101]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_101]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_101]
    at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:344) ~[?:?]
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:226) ~[?:?]
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:61) ~[?:?]
    at java.lang.Class.forName0(Native Method) ~[?:1.8.0_101]
    at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_101]
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147) ~[?:?]
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2362) ~[?:?]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2579) ~[?:?]
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2531) ~[?:?]
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2444) ~[?:?]
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1245) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:180) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:172) ~[?:?]
    at io.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:110) ~[?:?]
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:101) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:92) ~[guice-3.0.jar:?]
    at com.google.inject.util.Modules$RealOverriddenModuleBuilder$1.configure(Modules.java:172) ~[guice-3.0.jar:?]
    at com.google.inject.AbstractModule.configure(AbstractModule.java:59) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:101) ~[guice-3.0.jar:?]
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133) ~[guice-3.0.jar:?]
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:95) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:72) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:62) ~[guice-3.0.jar:?]
    at io.druid.initialization.Initialization.makeInjectorWithModules(Initialization.java:366) ~[druid-server-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree1$1(SparkDruidIndexer.scala:429) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector$lzycompute(SparkDruidIndexer.scala:428) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector(SparkDruidIndexer.scala:427) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree2$1(SparkDruidIndexer.scala:456) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper$lzycompute(SparkDruidIndexer.scala:455) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper(SparkDruidIndexer.scala:454) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask$.runTask(SparkBatchIndexTask.scala:296) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask.runTask(SparkBatchIndexTask.scala) ~[?:?]
    at io.druid.indexer.spark.Runner.runTask(Runner.java:29) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_101]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_101]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_101]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_101]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    ... 7 more
Caused by: java.lang.UnsatisfiedLinkError: Native Library /tmp/tmp4/mapr-mapr-libMapRClient.5.1.0-mapr.so already loaded in another classloader
    at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1907) ~[?:1.8.0_101]
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824) ~[?:1.8.0_101]
    at java.lang.Runtime.load0(Runtime.java:809) ~[?:1.8.0_101]
    at java.lang.System.load(System.java:1086) ~[?:1.8.0_101]
    at com.mapr.fs.shim.LibraryLoader.load(LibraryLoader.java:31) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_101]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_101]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_101]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_101]
    at com.mapr.fs.ShimLoader.loadNativeLibrary(ShimLoader.java:344) ~[?:?]
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:226) ~[?:?]
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:61) ~[?:?]
    at java.lang.Class.forName0(Native Method) ~[?:1.8.0_101]
    at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_101]
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147) ~[?:?]
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2362) ~[?:?]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2579) ~[?:?]
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2531) ~[?:?]
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2444) ~[?:?]
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1245) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:180) ~[?:?]
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:172) ~[?:?]
    at io.druid.storage.hdfs.HdfsStorageDruidModule.configure(HdfsStorageDruidModule.java:110) ~[?:?]
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:101) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:92) ~[guice-3.0.jar:?]
    at com.google.inject.util.Modules$RealOverriddenModuleBuilder$1.configure(Modules.java:172) ~[guice-3.0.jar:?]
    at com.google.inject.AbstractModule.configure(AbstractModule.java:59) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223) ~[guice-3.0.jar:?]
    at com.google.inject.spi.Elements.getElements(Elements.java:101) ~[guice-3.0.jar:?]
    at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133) ~[guice-3.0.jar:?]
    at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:95) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:72) ~[guice-3.0.jar:?]
    at com.google.inject.Guice.createInjector(Guice.java:62) ~[guice-3.0.jar:?]
    at io.druid.initialization.Initialization.makeInjectorWithModules(Initialization.java:366) ~[druid-server-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree1$1(SparkDruidIndexer.scala:429) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector$lzycompute(SparkDruidIndexer.scala:428) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.injector(SparkDruidIndexer.scala:427) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.liftedTree2$1(SparkDruidIndexer.scala:456) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper$lzycompute(SparkDruidIndexer.scala:455) ~[?:?]
    at io.druid.indexer.spark.SerializedJsonStatic$.mapper(SparkDruidIndexer.scala:454) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask$.runTask(SparkBatchIndexTask.scala:296) ~[?:?]
    at io.druid.indexer.spark.SparkBatchIndexTask.runTask(SparkBatchIndexTask.scala) ~[?:?]
    at io.druid.indexer.spark.Runner.runTask(Runner.java:29) ~[?:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_101]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_101]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_101]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_101]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.2.1-SNAPSHOT.jar:0.9.2.1-SNAPSHOT]
    ... 7 more
