How Does Spark Use MapReduce?

In this post we will discuss an interesting question: does Spark use MapReduce or not? The answer is yes, but it uses only the idea of MapReduce, not the exact implementation. Let's take an example. To read a text file in Spark, what we all do is

spark.sparkContext.textFile("fileName")

But do you know how it actually works? Try Ctrl+clicking on the `textFile` method and you will find its definition.
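The definition lives in `SparkContext`. The excerpt below is paraphrased from the Spark source (it is a fragment of a class, so it is not runnable on its own); the exact shape may vary between Spark versions:

```scala
def textFile(
    path: String,
    minPartitions: Int = defaultMinPartitions): RDD[String] = withScope {
  assertNotStopped()
  // Hadoop's TextInputFormat reads the file as (LongWritable, Text) pairs:
  // the byte offset of each line and the line itself.
  hadoopFile(path, classOf[TextInputFormat], classOf[LongWritable], classOf[Text],
    minPartitions).map(pair => pair._2.toString).setName(path)
}
```

Notice that Spark delegates the actual file reading to Hadoop's input format machinery, which is exactly the MapReduce heritage the title refers to.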

Can you understand what this code is doing? The first parameter is the path of the file, the second is the input format that should be used to read it, and the third and fourth parameters are similar to a RecordReader's key and value types: the byte offset of each line and the line itself.

Now you might be wondering why we do not get this offset back when we read a file. The reason is the following line:

.map(pair => pair._2.toString)

So what it is doing is mapping over all the key-value pairs but collecting only the values.
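The effect of that `map` can be sketched without Spark at all. The snippet below is a plain-Scala illustration (the file content and offsets are made up for the example): it simulates a line record reader producing (offset, line) pairs, then drops the offsets just like `textFile` does.

```scala
// Simulated file content; in Hadoop this would come from an input split.
val fileBytes = "first line\nsecond line\nthird line"

// Simulate a LineRecordReader: pair each line with its starting byte offset.
val pairs: Seq[(Long, String)] = {
  var offset = 0L
  fileBytes.split("\n", -1).toSeq.map { line =>
    val p = (offset, line)
    offset += line.length + 1 // +1 for the newline delimiter
    p
  }
}

// This mirrors `.map(pair => pair._2.toString)`: keep only the values,
// which is why an RDD[String] of lines comes back instead of pairs.
val lines: Seq[String] = pairs.map(pair => pair._2)
```

Here `pairs` would be `Seq((0, "first line"), (11, "second line"), (23, "third line"))`, and `lines` is just the three line strings, which matches what `spark.sparkContext.textFile` hands back to you.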

Knoldus is the world’s largest pure-play Scala and Spark company. We modernize enterprise through
cutting-edge digital engineering by leveraging Scala, Functional Java and Spark ecosystem. Our mission is to provide reactive and streaming fast data solutions that are message-driven, elastic, resilient, and responsive.