## SQLWise

Timothy A. Wiseman is a Database Administrator for SAIC with a focus on the efficiency and readability of the database and its surrounding ecosystem. In addition to administering the core SQL Server system, he has experience working with Python and Microsoft Access in conjunction with SQL Server. He holds a Bachelor of Science in Mathematics as well as an MCDBA and MCITP.

### A Really Simple Multiprocessing Python Example

#### Purpose and introduction

By default, a Python program will not take advantage of more than one core or more than one CPU. One way to get a program to use multiple cores is through the multiprocessing module. There are many excellent references and tutorials available on the web (links are included at the bottom), but one thing I was not able to find when I first started using multiprocessing was a detailed look at an extremely simple, but still practical, example. When dealing with a new technique, it is sometimes useful to see it in a very simple form, but not so simple as some of the completely contrived examples in the library documentation.
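As a quick aside, the multiprocessing module can tell you how many cores it has to spread work across; a default pool creates one worker process per core. A minimal check looks like this:

```python
import multiprocessing

# Ask the standard library how many cores/CPUs the machine reports.
# A Pool() created with no arguments starts this many worker processes.
cores = multiprocessing.cpu_count()
print("This machine reports %d core(s)" % cores)
```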

So, here is a look at an extremely simple example using an embarrassingly parallel problem: generating the Mandelbrot set.  The algorithm used is basically a direct adaptation of the one presented in pseudo-code on Wikipedia, grouping the pixels into rows to make them easier to pass off to the multiprocessing pool.  Just to be clear, this is far from the fastest, best, or most elegant way to use Python to calculate the Mandelbrot set, but it does provide a fairly good springboard for using multiprocessing while still doing actual work.

Essentially, this provides a straightforward, explained example of applying a function to a list of arguments using multiprocessing and then gathering the results into a list.

#### A look at a single-processor version

Here is the code for a single processor version:

```python
import matplotlib.pyplot as plt
from functools import partial

def mandelbrotCalcRow(yPos, h, w, max_iteration=1000):
    y0 = yPos * (2/float(h)) - 1  # rescale to -1 to 1
    row = []
    for xPos in range(w):
        x0 = xPos * (3.5/float(w)) - 2.5  # rescale to -2.5 to 1
        iteration, z = 0, 0 + 0j
        c = complex(x0, y0)
        while abs(z) < 2 and iteration < max_iteration:
            z = z**2 + c
            iteration += 1
        row.append(iteration)

    return row

def mandelbrotCalcSet(h, w, max_iteration=1000):
    partialCalcRow = partial(mandelbrotCalcRow, h=h, w=w, max_iteration=max_iteration)
    mandelImg = map(partialCalcRow, xrange(h))
    return mandelImg

mandelImg = mandelbrotCalcSet(400, 400, 1000)
plt.imshow(mandelImg)
plt.savefig('mandelimg.jpg')
```

#### The modifications needed to use multiprocessing

Obviously, to use multiprocessing, we need to import it, so towards the top, we add:

```python
import multiprocessing
```

The mandelbrotCalcRow function can remain unchanged.  The main changes are to the mandelbrotCalcSet function, which now looks like:

```python
def mandelbrotCalcSet(h, w, max_iteration=1000):
    # make a helper function that better supports pool.map by using
    # only one variable; this is necessary since pool.map accepts
    # just a single iterable of arguments
    partialCalcRow = partial(mandelbrotCalcRow, h=h, w=w, max_iteration=max_iteration)

    pool = multiprocessing.Pool()  # creates a pool of processes, controls the workers
    # pool.map only accepts one iterable, so use the partial function
    # so that we only need to deal with one variable
    mandelImg = pool.map(partialCalcRow, xrange(h))  # build our results with a map call
    pool.close()  # we are not adding any more tasks
    pool.join()   # wait until all worker processes are done before going on

    return mandelImg
```

Here, Pool creates the pool of processes that act as workers and gets the environment ready to run multiple tasks.  One of the easiest ways to use the pool is its map method, which takes a function and an iterable of parameters.  The function is called once for each parameter in the iterable, the calls are distributed over the available worker processes, and the results are gathered into a list.

One significant difference between pool.map and the built-in map, other than the fact that pool.map can take advantage of multiple processors, is that pool.map accepts only a single iterable of arguments.  That is why I created a partial function, which freezes the other arguments.
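The freezing itself has nothing to do with multiprocessing.  functools.partial simply binds some arguments ahead of time, so the resulting callable needs only the one remaining argument; a small stand-alone sketch (with a made-up scale function of my own):

```python
from functools import partial

def scale(value, factor, offset):
    # a hypothetical three-argument function
    return value * factor + offset

# freeze factor and offset, leaving only value to supply
scaled = partial(scale, factor=2, offset=1)

print(scaled(10))                          # 21
results = [scaled(v) for v in [1, 2, 3]]   # [3, 5, 7]
```

This is exactly the shape pool.map needs: a callable of one argument.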

pool.close() then informs the pool that no new tasks will be added to it.  Either pool.close or pool.terminate must be called before pool.join can be called.  pool.join blocks and waits for all of the tasks to be finished and their results collected before proceeding with the rest of the program.  This gives a simple way to gather the results into a single list for use later.

The other significant change is that the main portion, the entry point of the script, needs to be wrapped in an `if __name__ == '__main__':` conditional on Windows.  This is because the main module needs to be safely importable by a new Python interpreter.  Not doing this can result in problems such as a RuntimeError, or even completely locking up the system in some of the tests I tried.  This, and a couple of other caveats, are mentioned in the multiprocessing Programming Guidelines.

So, the entry point now looks like:

```python
if __name__ == '__main__':
    mandelImg = mandelbrotCalcSet(400, 400, 1000)
    plt.imshow(mandelImg)
    plt.savefig('mandelimg.jpg')
```

In this example, the multiprocessing version has only 8 additional lines of code (it is 15 lines longer, but 7 of those lines are extra whitespace or comment lines I added to make it easier to read), yet it runs in less than a third of the time.

Of course, it is worth remembering the saying that “premature optimization is the root of all evil.”  It is normally smart to get the code working first, and then consider bringing in multiprocessing options.

And the results: running the script produces the rendered Mandelbrot set in mandelimg.jpg.

References:

1. Multiprocessing Docs
2. The examples in the documentation.
3. Wiki.cython.org has an example of creating the Mandelbrot set using Cython.  For actually generating the set rather than just making examples for multiprocessing, that version is much better.
4. SciPy.org has a good discussion of parallel programming with numpy and scipy.

{Edit 10 Jan 13 – Corrected a minor spelling error.}