According to https://oozie.apache.org/docs/3.3.1/WorkflowFunctionalSpec.html#a4.1_Workflow_Job_Properties_or_Parameters, when submitting a workflow job for the workflow definition above, 3 workflow job properties must be specified: jobTracker, inputDir, and outputDir. I have a PySpark script that has specified input & output …
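As a minimal sketch of how those three properties are typically supplied, here is a hypothetical job.properties file; the hostnames, ports, and paths are placeholders, not values from the question, and the values become available inside workflow.xml as ${jobTracker}, ${inputDir}, and ${outputDir}:

```properties
# job.properties — all hosts and paths below are illustrative placeholders
nameNode=hdfs://namenode-host:8020
jobTracker=resourcemanager-host:8032
inputDir=/user/me/input
outputDir=/user/me/output
oozie.wf.application.path=${nameNode}/user/me/workflows/pyspark-wf
```

If the PySpark script hard-codes its own input and output paths, the workflow still has to declare inputDir and outputDir (or default them in a parameters section of workflow.xml) for Oozie to accept the submission.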

elloyd79
Simple question: where do we find the resource manager and resource scheduler addresses? I have looked for a yarn-default.xml on the box where MapR is installed, with no results. I found yarn-site.xml, and it doesn't seem to have them inside it either. I am trying to configure a Provider using Hunk (Splunk) so it can work with …
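One likely reason the file search comes up empty: yarn-default.xml is bundled inside the Hadoop jars rather than shipped as a standalone file, so only the overrides in yarn-site.xml appear on disk, and absent properties silently take their defaults. A sketch of the relevant yarn-site.xml entries, using a placeholder hostname and the stock default ports (8032 for the resource manager, 8030 for the scheduler):

```xml
<!-- Illustrative yarn-site.xml fragment; "rm-host" is a placeholder -->
<configuration>
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>rm-host:8032</value>
  </property>
  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>rm-host:8030</value>
  </property>
</configuration>
```

If only yarn.resourcemanager.hostname is set in yarn-site.xml, the address and scheduler-address properties are derived from it with the default ports above.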
Terry
Currently running Community 5.2.2, after bungling an upgrade from 5.2.0 to 6.0.1. Back up and running again, but my NFS server is telling me:

fs/nfsd/nfsha.cc:1033 exiting: No license to run NFS server in servermode

I have tried cp /opt/mapr/conf.new/BaseLicense.txt /opt/mapr/conf/ and a restart of all nodes, but no change. There is no …
PETER.EDIKE
Hello everyone, I have observed some strange behavior with MapR Streams. I have the following code in Java:

Map<String,Object> properties = new HashMap<>();
properties.put(org.apache.kafka.clients.consumer.ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
…
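For context, a minimal self-contained sketch of the consumer configuration the snippet is building. The ConsumerConfig constants resolve to the plain string keys used below, so this version compiles without the Kafka client jars; the group id and offset-reset policy are illustrative assumptions, not values from the question:

```java
import java.util.HashMap;
import java.util.Map;

public class ConsumerConfigSketch {

    // Builds the consumer properties in the style of the question, using the
    // string keys that the ConsumerConfig constants resolve to.
    static Map<String, Object> buildConsumerProps() {
        Map<String, Object> props = new HashMap<>();
        // ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG -> "key.deserializer"
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG -> "value.deserializer"
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // ConsumerConfig.GROUP_ID_CONFIG -> "group.id" (illustrative value)
        props.put("group.id", "example-consumer-group");
        // ConsumerConfig.AUTO_OFFSET_RESET_CONFIG -> "auto.offset.reset";
        // "earliest" makes a brand-new consumer group read from the beginning.
        props.put("auto.offset.reset", "earliest");
        return props;
    }

    public static void main(String[] args) {
        Map<String, Object> props = buildConsumerProps();
        System.out.println(props.get("key.deserializer"));
    }
}
```

With MapR Streams the same map would be passed to a KafkaConsumer, with topics addressed by stream path (e.g. a path:topic form) rather than by bare topic name.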