
Setting max open files using M3 on EMR

Question asked by sorenmacbeth on Aug 20, 2012
Latest reply on Aug 20, 2012 by gera
Hello,

I'm running into an issue using M3 on Elastic MapReduce. One of my jobs dies partway through and kicks back a 'too many open files' exception. I'm trying to raise the limit, but I'm having trouble getting it to stick.

Right now I'm writing "ulimit -n 64000" into ~/conf/hadoop-user-env.sh, but I'm still hitting the error, so I don't think it's taking effect. Is this not the correct place to do this on M3?
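
For reference, the file is just a shell fragment; mine looks roughly like this (my understanding, which may be wrong, is that the Hadoop start-up scripts source it before launching the daemons):

    #!/bin/bash
    # ~/conf/hadoop-user-env.sh
    # Raise the per-process open file limit. ulimit only affects
    # the shell that runs it and any child processes it spawns.
    ulimit -n 64000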

I can't change /etc/security/limits.conf because that requires a reboot to take effect, which isn't possible on a running Elastic MapReduce cluster.
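
For completeness, the entries I would otherwise add to limits.conf look like the following (assuming the daemons run as the hadoop user; as I understand it, these PAM limits only apply to sessions started after the change, which is why they don't help here):

    # /etc/security/limits.conf
    # <user>   <type>  <item>   <value>
    hadoop     soft    nofile   64000
    hadoop     hard    nofile   64000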

What is the recommended way of increasing the open file limit?
