Parallel Distributed Computing Using Python
Lisandro Dalcin
dalcinl@gmail.com
HPCLatAm 2011
Córdoba, Argentina
September 1, 2011
Outline
◮ Overview
◮ Applications
Motivation
◮ NumPy
◮ SciPy
◮ SymPy
◮ Cython
◮ SWIG
◮ F2Py
What is MPI?
MPI, the Message Passing Interface, is a standardized, portable API
for message-passing parallel programming on distributed-memory systems.
MPI for Python (mpi4py)
COMM_NULL = NewComm(MPI_COMM_NULL)
COMM_SELF = NewComm(MPI_COMM_SELF)
COMM_WORLD = NewComm(MPI_COMM_WORLD)
[mpi4py] Features – MPI-1
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    msg1 = [77, 3.14, 2+3j, "abc", (1,2,3,4)]
elif rank == 1:
    msg1 = {"A": [2,"x",3], "B": (2.17,1+3j)}

wt = MPI.Wtime()
if rank == 0:
    comm.send(msg1, 1, tag=0)
    msg2 = comm.recv(None, 1, tag=7)
elif rank == 1:
    msg2 = comm.recv(None, 0, tag=0)
    comm.send(msg1, 0, tag=7)
wt = MPI.Wtime() - wt
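The lowercase send/recv methods shown above can communicate arbitrary Python objects because mpi4py serializes them with pickle under the hood. A stdlib-only sketch of that round-trip (no MPI needed; `mock_send` and `mock_recv` are illustrative stand-ins, not mpi4py API):

```python
import pickle

# Messages like those on the slide above: arbitrary Python objects.
msg1 = [77, 3.14, 2 + 3j, "abc", (1, 2, 3, 4)]
msg2 = {"A": [2, "x", 3], "B": (2.17, 1 + 3j)}

def mock_send(obj):
    # comm.send() pickles the object into a byte buffer...
    return pickle.dumps(obj)

def mock_recv(wire):
    # ...and comm.recv() unpickles it on the receiving rank.
    return pickle.loads(wire)

for msg in (msg1, msg2):
    assert mock_recv(mock_send(msg)) == msg
```

This flexibility is what the "Pickle" curves in the throughput plots measure: the serialization step costs time and memory, which the buffer-based API below avoids.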
[mpi4py] Features – Communicating array data
from mpi4py import MPI
import numpy

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# array1, array2: preallocated, contiguous float64 NumPy arrays
array1 = numpy.arange(2**16, dtype=numpy.float64)
array2 = numpy.empty(2**16, dtype=numpy.float64)

wt = MPI.Wtime()
if rank == 0:
    comm.Send([array1, MPI.DOUBLE], 1, tag=0)
    comm.Recv([array2, MPI.DOUBLE], 1, tag=7)
elif rank == 1:
    comm.Recv([array2, MPI.DOUBLE], 0, tag=0)
    comm.Send([array1, MPI.DOUBLE], 0, tag=7)
wt = MPI.Wtime() - wt
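The uppercase Send/Recv methods avoid pickle entirely: any object exposing the buffer protocol (NumPy arrays in particular) is handed to MPI as a raw memory block, which is why the "Buffer" curves in the plots that follow track C performance. A stdlib-only sketch of the idea, using array.array as the buffer provider (`mock_Send` and `mock_Recv` are illustrative names, not mpi4py API):

```python
from array import array

# A contiguous buffer of C doubles, like a float64 NumPy array.
array1 = array("d", [0.0, 1.0, 2.0, 3.0])
array2 = array("d", [0.0] * 4)  # preallocated receive buffer

def mock_Send(buf):
    # Uppercase Send hands the raw bytes to MPI: no serialization step.
    return memoryview(buf).cast("B").tobytes()

def mock_Recv(wire, buf):
    # Uppercase Recv writes the incoming bytes straight into the
    # caller's preallocated buffer, with no intermediate copy.
    memoryview(buf).cast("B")[:] = wire

mock_Recv(mock_Send(array1), array2)
assert array2 == array1
```

The receive buffer must already exist and be large enough, mirroring the MPI_Send/MPI_Recv contract in C.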
Point to Point Throughput – Gigabit Ethernet
[Figure: PingPong throughput over Gigabit Ethernet for Pickle, Buffer,
and C implementations; Throughput [MiB/s] vs. Array Size [Bytes]]
Point to Point Throughput – Shared Memory
[Figure: PingPong throughput over shared memory for Pickle, Buffer,
and C implementations; Throughput [MiB/s] vs. Array Size [Bytes]]
PETSc for Python (petsc4py)
[Diagram: PETSc library components, including Preconditioners (PC)]
[petsc4py] Interoperability – SWIG

%module MyApp
%include petsc4py/petsc4py.i
%{
#include "MyApp.h"
%}
class Nonlinear {
  Nonlinear(MPI_Comm comm,
            const char datafile[]);
  Vec createVec();
  Mat createMat();
  void evalFunction(SNES snes, Vec X, Vec F);
  bool evalJacobian(SNES snes, Vec X, Mat J, Mat P);
};
from petsc4py import PETSc
import MyApp

comm = PETSc.COMM_WORLD
app = MyApp.Nonlinear(comm, "example.dat")
X = app.createVec()
F = app.createVec()
J = app.createMat()
snes = PETSc.SNES().create(comm)
snes.setFunction(app.evalFunction, F)
snes.setJacobian(app.evalJacobian, J)
snes.setFromOptions()
snes.solve(None, X)
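PETSc's SNES object drives Newton-type iterations, calling back into evalFunction for the residual and evalJacobian for its derivative. A minimal pure-Python sketch of the basic Newton iteration SNES generalizes (scalar case, no PETSc required; the function name and tolerances are illustrative):

```python
def newton_solve(F, J, x, rtol=1e-10, max_it=50):
    """Solve F(x) = 0 by Newton's method: x <- x - F(x)/J(x)."""
    for _ in range(max_it):
        r = F(x)                # SNES calls evalFunction here
        if abs(r) < rtol:       # converged: residual small enough
            break
        x = x - r / J(x)        # SNES solves J dx = -F via a KSP solve
    return x

# Example: F(x) = x^2 - 2, whose positive root is sqrt(2).
root = newton_solve(lambda x: x * x - 2.0, lambda x: 2.0 * x, x=1.0)
assert abs(root - 2.0 ** 0.5) < 1e-8
```

In the real solver, x, F, and J are distributed PETSc Vec/Mat objects and the linear solve is performed in parallel, but the control flow is the same.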
Microfluidics (µ-TAS)
mpi4py
◮ Development: http://mpi4py.googlecode.com
◮ Mailing List: mpi4py@googlegroups.com
◮ Chat: dalcinl@gmail.com
petsc4py
◮ Development: http://petsc4py.googlecode.com
◮ Mailing List: petsc-users@mcs.anl.gov
◮ Chat: dalcinl@gmail.com
Thanks!