Is parallel programming == multithread programming?

Is parallel programming == multithread programming?

Separate asked 18/2, 2010 at 10:3 Comment(1)
Possible Duplicate --> #1073598 – Bolshevik

Multithreaded programming is parallel, but parallel programming is not necessarily multithreaded.

Unless the multithreading occurs on a single core, in which case it is only concurrent.

Mottle answered 18/2, 2010 at 11:48 Comment(5)
AFAIK, on a single core processor, threading is not parallel. It is concurrent, but not parallel. – Glasses
@Ionut: thesaurus.reference.com/browse/concurrent <- look under the 'Synonyms' header of the first result. – Hazan
The difference I make between concurrent and parallel is that parallel is truly simultaneous, while concurrent just looks as if it were simultaneous. The switching between threads is so fast that it looks parallel, but it isn't. Maybe there are other terms for this, but that's how I understand it. – Glasses
@LucasLindström: No, "parallel" has a specific meaning in the context of computer science; your link is misleading. – Pickett
This answer is misleading as well. Multi-threading just implies concurrency, not parallelism. In other words, you can have multi-threading on a single-core processor. – Maintenon

Not necessarily. You can distribute jobs between multiple processes and even multiple machines - I wouldn't class that as "multi-threaded" programming as each process may only use a single thread, but it's certainly parallel programming. Admittedly you could then argue that with multiple processes there are multiple threads within the system as a whole...

Ultimately, definitions like this are only useful within a context. In your particular case, what difference is it going to make? Or is this just out of interest?
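
As a quick illustration of the process-based route: a minimal sketch, assuming a POSIX system, where each job runs in its own single-threaded process, so the work is parallel but not multithreaded (do_job and the job count are invented for the example).

/* Each job runs in its own single-threaded process: parallel, not multithreaded. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

/* Hypothetical per-job work; stands in for whatever the real job would be. */
static void do_job(int job_id) {
    printf("process %d handling job %d -> %d\n", (int)getpid(), job_id, job_id * job_id);
}

int main(void) {
    const int njobs = 4;
    for (int i = 0; i < njobs; i++) {
        pid_t pid = fork();           /* one single-threaded worker per job */
        if (pid == 0) {               /* child: do the work and exit */
            do_job(i);
            _exit(EXIT_SUCCESS);
        } else if (pid < 0) {
            perror("fork");
            return EXIT_FAILURE;
        }
    }
    while (wait(NULL) > 0)            /* parent: wait for every worker */
        ;
    return 0;
}

Replace fork() with processes on separate machines talking over sockets and the same idea scales out, which is why it still counts as parallel programming.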

Ragman answered 18/2, 2010 at 10:5 Comment(2)
Should we also consider SIMD to be parallel programming? We're performing the same operations on multiple data in parallel, but I don't know if this is considered too much of a micro-parallelization to be included in a definition of parallel programming. – Overissue
I'd say that SIMD is more about parallel hardware design, but I guess at some level you have to consider the programming side of having dedicated parallel hardware, e.g. what about programming for a GPU? – Sash

No. Multithreaded programming means that you have a single process, and this process spawns a bunch of threads. All the threads run at the same time, but they all live in the same process space: they can access the same memory, share the same open file descriptors, and so on.
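
A minimal pthreads sketch of that picture (the thread count and the shared counter are just illustrative): every thread is spawned by the same process and touches the very same variable.

/* One process, several threads, all sharing the same memory. */
#include <pthread.h>
#include <stdio.h>

static int shared_counter = 0;                     /* visible to every thread */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg) {
    (void)arg;
    pthread_mutex_lock(&lock);                     /* shared memory needs coordination */
    shared_counter++;
    pthread_mutex_unlock(&lock);
    return NULL;
}

int main(void) {
    pthread_t threads[4];
    for (int i = 0; i < 4; i++)
        pthread_create(&threads[i], NULL, worker, NULL);
    for (int i = 0; i < 4; i++)
        pthread_join(threads[i], NULL);
    printf("shared_counter = %d\n", shared_counter);  /* 4: every thread saw the same variable */
    return 0;
}

(Build with -pthread.)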

Parallel programming is a bit more "general" as a definition. In MPI, you perform parallel programming by running the same program multiple times, with the difference that each process gets a different "identifier" (its rank), so if you want, you can differentiate the processes, but it is not required. Also, these processes are independent of each other and have to communicate via pipes, or network/unix sockets. MPI libraries provide specific functions to move data to and from the nodes, in either a synchronous or asynchronous style.
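
A minimal MPI sketch of that model (the payload value is illustrative; compile with mpicc and launch with mpirun): every copy of the program asks for its rank and acts on it, and data moves between the otherwise independent processes only through explicit messages.

/* The same program is started N times; each copy gets its own rank. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's identifier */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* how many processes were launched */

    if (rank == 0) {
        int value = 42;                     /* illustrative payload */
        for (int dest = 1; dest < size; dest++)
            MPI_Send(&value, 1, MPI_INT, dest, 0, MPI_COMM_WORLD);
        printf("rank 0 of %d sent the value to everyone\n", size);
    } else {
        int value;
        MPI_Recv(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank %d of %d received %d\n", rank, size, value);
    }

    MPI_Finalize();
    return 0;
}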

In contrast, OpenMP achieves parallelization via multithreading and shared memory. You give special directives to the compiler, and it automagically performs the parallel execution for you.

The advantage of OpenMP is that it is very transparent. Have a loop to parallelize? Just add a couple of directives and the compiler chunks it into pieces and assigns each piece of the loop to a different processor. Unfortunately, you need a shared-memory architecture for this. Clusters with a node-based architecture cannot use OpenMP across the cluster. MPI lets you work on a node-based architecture, but you pay the price of more complex, less transparent usage.
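
A minimal OpenMP sketch of that loop-parallelization style (the array size is arbitrary; build with an OpenMP-capable compiler, e.g. gcc -fopenmp):

/* One directive and the compiler splits the loop across threads. */
#include <omp.h>
#include <stdio.h>

#define N 1000000

int main(void) {
    static double a[N];
    double sum = 0.0;

    /* Iterations are divided among the threads automatically;
     * the reduction clause combines the per-thread partial sums safely. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++) {
        a[i] = i * 0.5;
        sum += a[i];
    }

    printf("sum = %f (computed by up to %d threads)\n", sum, omp_get_max_threads());
    return 0;
}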

Monarchist answered 18/2, 2010 at 10:6 Comment(3)
Oh, so it means 1 job is processed by n processes, not 1 job processed by n threads? – Separate
I seem to recall that work is being done on OpenMP-style parallelization for multi-process architectures... I can't remember if it's part of OpenMP itself, or something else? – Overissue
@Eko: Not exactly. MPI starts n instances of the same program, each one with a different id number in a special variable (look for MPI_Comm_Rank). What to do with those n instances is up to you. – Monarchist
