MapReduce - Application



Minimally, applications:

  • specify the input/output locations
  • supply map and reduce functions via implementations of appropriate interfaces and/or abstract classes.

These, and other job parameters, comprise the job configuration.

Applications typically implement the Mapper and Reducer interfaces to provide the map and reduce methods.
The Hadoop job client then submits the job (jar/executable, etc.) and its configuration to the ResourceManager, which assumes responsibility for distributing the software and configuration to the slaves, scheduling and monitoring tasks, and providing status and diagnostic information to the job client.
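A minimal driver illustrating this submission step, again assuming the Hadoop client libraries and hypothetical WordCountMapper/WordCountReducer classes, could be:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCountDriver.class);   // so the jar is shipped to the cluster
        job.setMapperClass(WordCountMapper.class);
        job.setReducerClass(WordCountReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // input location
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output location
        // submit to the ResourceManager and block until the job finishes
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

waitForCompletion(true) prints the status and diagnostic information the framework reports back to the job client.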

The MapReduce framework operates exclusively on <key, value> pairs: the framework views the input to a job as a set of <key, value> pairs and produces a set of <key, value> pairs as the job's output, conceivably of different types.
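The key/value flow can be simulated in plain Java without Hadoop; this conceptual sketch shows the type change across stages (input <Long, String> lines become intermediate and final <String, Integer> pairs), with grouping by key standing in for the shuffle:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class WordCountSim {
    // "map": one input record -> zero or more intermediate (word, 1) pairs
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        for (String word : line.split("\\s+")) {
            if (!word.isEmpty()) out.add(Map.entry(word, 1));
        }
        return out;
    }

    // "reduce": one key plus all of its grouped values -> one output value
    static int reduce(String key, List<Integer> values) {
        int sum = 0;
        for (int v : values) sum += v;
        return sum;
    }

    public static Map<String, Integer> count(List<String> lines) {
        // "shuffle": group intermediate values by key
        Map<String, List<Integer>> grouped = new TreeMap<>();
        for (String line : lines)
            for (Map.Entry<String, Integer> kv : map(line))
                grouped.computeIfAbsent(kv.getKey(), k -> new ArrayList<>())
                       .add(kv.getValue());
        Map<String, Integer> result = new TreeMap<>();
        grouped.forEach((k, vs) -> result.put(k, reduce(k, vs)));
        return result;
    }
}
```

For example, `WordCountSim.count(List.of("to be", "or not to be"))` maps "to" and "be" to 2, and "or" and "not" to 1.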

