
Hive Issue

Question asked by Velumani on Dec 23, 2016
Latest reply on May 24, 2017 by divo7777

Hi,

I am trying to run a Spark job that queries a Hive table using HiveContext:

/opt/mapr/spark/spark-1.6.1/bin/spark-submit --driver-class-path `hbase classpath` --files /opt/mapr/spark/spark-1.6.1/conf/hive-site.xml --class com.example.App --master local[2] /app.jar

When I do spark-submit, I get the following exception:

 

2016-12-20 18:21:56,226 INFO [main] DataNucleus.Persistence: Property datanucleus.schema.autoCreateTables unknown - will be ignored
2016-12-20 18:21:56,226 INFO [main] DataNucleus.Persistence: Property datanucleus.schema.validateColumns unknown - will be ignored
2016-12-20 18:21:56,226 INFO [main] DataNucleus.Persistence: Property datanucleus.schema.validateConstraints unknown - will be ignored
2016-12-20 18:21:56,226 INFO [main] DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2016-12-20 18:21:56,226 INFO [main] DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
2016-12-20 18:21:56,226 INFO [main] DataNucleus.Persistence: Property datanucleus.schema.autoCreateAll unknown - will be ignored
2016-12-20 18:21:56,226 INFO [main] DataNucleus.Persistence: Property datanucleus.schema.validateTables unknown - will be ignored
2016-12-20 18:21:56,658 WARN [main] DataNucleus.Datastore: No Database Adapter was found for your JDBC driver specified. Falling back to the generic DatabaseAdapter!
2016-12-20 18:21:56,675 INFO [main] DataNucleus.General: >> RDBMSStoreManager.init
Identifier Factory with name "datanucleus1" is not registered! Please check your CLASSPATH for presence of the plugin containing this factory, and your PMF settings for identifier factory.
org.datanucleus.exceptions.NucleusUserException: Identifier Factory with name "datanucleus1" is not registered! Please check your CLASSPATH for presence of the plugin containing this factory, and your PMF settings for identifier factory.
at org.datanucleus.store.rdbms.RDBMSStoreManager.initialiseIdentifierFactory(RDBMSStoreManager.java:447)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:342)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)

 

MapR secure cluster version - 5.2

Spark version - 1.6.1

hive-site.xml is available at both /opt/mapr/spark/spark-1.6.1/conf/hive-site.xml and /opt/mapr/hive/hive-1.2/conf/hive-site.xml
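For reference, the driver code is along these lines — a minimal sketch only, since the actual com.example.App source is not shown here; the package, object, and table names are placeholders:

```scala
package com.example

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object App {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HiveQueryExample")
    val sc = new SparkContext(conf)

    // HiveContext reads hive-site.xml from the classpath (shipped here via
    // --files on spark-submit) to locate the metastore; if that config is
    // not picked up, Spark falls back to an embedded DataNucleus metastore.
    val hiveContext = new HiveContext(sc)

    // "some_table" is a placeholder for the actual Hive table being queried.
    hiveContext.sql("SELECT * FROM some_table LIMIT 10").show()

    sc.stop()
  }
}
```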
