Java Heap Space Error

MapReduce jobs work for me on a small set of data, but when I try to run about 30k files through a job that computes an MD5 hash of each file, I get a Java heap space error. I have read solutions online and changed HADOOP_HEAPSIZE to 4096 MB via the Ambari HDFS config, but I'm still hitting the error. My MapReduce code also has the line conf.set("mapred.map.child.java.opts", "-Xmx2048m"). Does anyone know of a solution for this?
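
For context, the relevant part of my driver looks roughly like this (the job name and mapper are simplified placeholders, and I've only sketched the setup; the commented lines show the newer Hadoop 2 property names, mapreduce.map.java.opts and mapreduce.map.memory.mb, that I've seen suggested in place of the old mapred.* one):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class Md5HashDriver {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // What I currently have: the old (pre-Hadoop-2) property name.
            conf.set("mapred.map.child.java.opts", "-Xmx2048m");

            // Newer equivalents I've seen suggested; the YARN container size
            // (mapreduce.map.memory.mb) has to be larger than the mapper's JVM heap.
            // conf.set("mapreduce.map.java.opts", "-Xmx2048m");
            // conf.set("mapreduce.map.memory.mb", "3072");

            // Properties are set before Job.getInstance, since the Job takes a copy of the conf.
            Job job = Job.getInstance(conf, "md5 hash");
            job.setJarByClass(Md5HashDriver.class);
            // job.setMapperClass(Md5Mapper.class); // my mapper that hashes each input file (omitted here)
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }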