
Python MapReduce framework


(Logo: https://github.com/Yelp/mrjob/raw/master/docs/logos/logo_medium.png)

mrjob is a Python 2.6+/3.3+ package that helps you write and run Hadoop Streaming jobs.

Stable version (v0.5.1) documentation

Development version documentation

(Build status: https://travis-ci.org/Yelp/mrjob.png)

mrjob fully supports Amazon’s Elastic MapReduce (EMR) service, which allows you to buy time on a Hadoop cluster on an hourly basis. It also works with your own Hadoop cluster.

Some important features:

  • Run jobs on EMR, your own Hadoop cluster, or locally (for testing).

  • Write multi-step jobs (one map-reduce step feeds into the next; see the sketch after this list)

  • Duplicate your production environment inside Hadoop
    • Upload your source tree and put it in your job’s $PYTHONPATH

    • Run make and other setup scripts

    • Set environment variables (e.g. $TZ)

    • Easily install Python packages from tarballs (EMR only)

    • Setup handled transparently by the mrjob.conf config file

  • Automatically interpret error logs from EMR

  • SSH tunnel to the Hadoop job tracker on EMR

  • Minimal setup
    • To run on EMR, set $AWS_ACCESS_KEY_ID and $AWS_SECRET_ACCESS_KEY

    • To run on your Hadoop cluster, just make sure $HADOOP_HOME is set.
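
For example, here is a minimal sketch of a multi-step job that chains two steps with mrjob.step.MRStep: the first counts word frequencies, and the second picks the most common word. The MRMostCommonWord class and its method names are illustrative, not part of mrjob:

"""A sketch of a multi-step job: count word frequencies, then report the
most common word. Class and method names are illustrative.
"""
from mrjob.job import MRJob
from mrjob.step import MRStep
import re

WORD_RE = re.compile(r"[\w']+")


class MRMostCommonWord(MRJob):

    def steps(self):
        # Each step's output feeds the next step's input.
        return [
            MRStep(mapper=self.mapper_get_words,
                   combiner=self.combiner_count_words,
                   reducer=self.reducer_count_words),
            MRStep(reducer=self.reducer_find_max_word),
        ]

    def mapper_get_words(self, _, line):
        for word in WORD_RE.findall(line):
            yield (word.lower(), 1)

    def combiner_count_words(self, word, counts):
        yield (word, sum(counts))

    def reducer_count_words(self, word, counts):
        # Emit everything under a single key so the next step's reducer
        # sees all (count, word) pairs together.
        yield None, (sum(counts), word)

    def reducer_find_max_word(self, _, count_word_pairs):
        # max() compares the (count, word) tuples, so this yields the pair
        # with the highest count as (key, value).
        yield max(count_word_pairs)


if __name__ == '__main__':
    MRMostCommonWord.run()

Because each step's output becomes the next step's input, the final reducer sees every (count, word) pair under one key and can simply take the maximum.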

Installation

From PyPI:

pip install mrjob

From source:

python setup.py install

A Simple Map Reduce Job

Code for this example and more live in mrjob/examples.

"""The classic MapReduce job: count the frequency of words.
"""
from mrjob.job import MRJob
import re

WORD_RE = re.compile(r"[\w']+")


class MRWordFreqCount(MRJob):

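    # mapper: called once per input line; yields (word, 1) for every word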
    def mapper(self, _, line):
        for word in WORD_RE.findall(line):
            yield (word.lower(), 1)

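    # combiner: sums counts locally on each mapper to cut shuffle traffic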
    def combiner(self, word, counts):
        yield (word, sum(counts))

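    # reducer: sums all counts for each word across the whole input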
    def reducer(self, word, counts):
        yield (word, sum(counts))


if __name__ == '__main__':
    MRWordFreqCount.run()

Try It Out!

# locally
python mrjob/examples/mr_word_freq_count.py README.rst > counts
# on EMR
python mrjob/examples/mr_word_freq_count.py README.rst -r emr > counts
# on your Hadoop cluster
python mrjob/examples/mr_word_freq_count.py README.rst -r hadoop > counts
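
You can also launch a job from another Python program instead of the shell. The snippet below is a sketch that assumes mr_word_freq_count.py is importable (e.g. on your $PYTHONPATH); the local 'inline' runner and README.rst input are illustrative choices:

"""A sketch of running the word frequency job from Python rather than
the shell. Adjust the import and arguments for your setup."""
from mr_word_freq_count import MRWordFreqCount

job = MRWordFreqCount(args=['-r', 'inline', 'README.rst'])
with job.make_runner() as runner:
    runner.run()
    # stream_output() yields raw output lines; parse_output_line() turns
    # them back into (key, value) pairs.
    for line in runner.stream_output():
        word, count = job.parse_output_line(line)
        print('%s\t%d' % (word, count))

Because the runner is chosen by -r, the same code can target EMR or your own Hadoop cluster by passing '-r emr' or '-r hadoop' instead of '-r inline'.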

Setting up EMR on Amazon

Advanced Configuration

To run in other AWS regions, upload your source tree, run make, and use other advanced mrjob features, you’ll need to set up mrjob.conf. mrjob looks for its config file in the following locations, in order:

  • The file named by $MRJOB_CONF

  • ~/.mrjob.conf

  • /etc/mrjob.conf

See the mrjob.conf documentation for more information.
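
As a rough illustration, a minimal mrjob.conf for EMR might look something like the following. The region, setup command, and environment variable are placeholder values, and the option names assume mrjob 0.5.x, so check the mrjob.conf documentation for the options your version supports:

# A sketch of a minimal mrjob.conf (YAML); all values are placeholders.
runners:
  emr:
    aws_region: us-west-2
    cmdenv:
      TZ: America/Los_Angeles
    setup:
    - make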

Reference

More Information

Thanks to Greg Killion (ROMEO ECHO_DELTA) for the logo.

