Thinking Parallel

A Blog on Parallel Programming and Concurrency by Michael Suess

How-to Split a Problem into Tasks

The very first step in every successful parallelization effort is always the same: you take a look at the problem that needs to be solved and start splitting it into tasks that can be computed in parallel. This sounds easy, but I can see from my students' reactions that at least for some of them it is not. This article shows five ways to do just that. It is the short, blog-sized version; the long version can be found in the book Introduction to Parallel Computing by Grama et al.

Let’s start with a short paragraph on terminology: what I am describing here is also called problem decomposition. The goal is to divide the problem into several smaller subproblems, called tasks, that can be computed in parallel later on. The tasks can be of different sizes and do not necessarily have to be independent (although, of course, that would make your work easier later on 🙂 ).

If many tasks are created, we talk about a fine-grained decomposition. If few tasks are created, it is called a coarse-grained decomposition. Which one is better depends heavily on the problem: many tasks allow for more concurrency and better scalability, while fewer tasks usually need less communication / synchronization.

There are several ways to do problem decomposition, the most well-known probably being recursive decomposition, data decomposition, functional decomposition, exploratory decomposition and speculative decomposition. The next few paragraphs will briefly explain how they are carried out in practice.

Recursive Decomposition

Divide-and-Conquer is a widely deployed strategy for writing sequential algorithms. In it, a problem is divided into subproblems, which are again divided into subproblems recursively until a trivial solution can be calculated. Afterwards, the results of the subproblems are merged together when needed. This strategy can be applied as is to achieve a recursive problem decomposition. As the smaller tasks are usually independent of one another, they can be calculated in parallel, often leading to well-scaling parallel algorithms. Parallel sorting algorithms often use recursive decompositions.
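To make this a bit more concrete, here is a minimal C++ sketch of a recursive decomposition: a divide-and-conquer sum in which each half of the range becomes a task started via std::async. The cutoff value and the sum operation are arbitrary illustrative choices, not tuned constants.

```
// Minimal sketch: recursive decomposition of a sum via divide-and-conquer.
#include <future>
#include <numeric>
#include <vector>
#include <iostream>

long long parallel_sum(const std::vector<int>& v, std::size_t lo, std::size_t hi) {
    const std::size_t cutoff = 10000;   // below this size, just solve sequentially
    if (hi - lo <= cutoff)
        return std::accumulate(v.begin() + lo, v.begin() + hi, 0LL);

    std::size_t mid = lo + (hi - lo) / 2;
    // Each half is an independent subproblem; the left half becomes an async task.
    auto left = std::async(std::launch::async, parallel_sum, std::cref(v), lo, mid);
    long long right = parallel_sum(v, mid, hi);
    return left.get() + right;          // merge step of divide-and-conquer
}

int main() {
    std::vector<int> data(1000000, 1);
    std::cout << parallel_sum(data, 0, data.size()) << '\n';   // prints 1000000
}
```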

Data Decomposition

When data structures with large amounts of similar data need to be processed, data decomposition is usually a well-performing decomposition technique. The tasks in this strategy consist of groups of data. These can be input data, output data or even intermediate data; decompositions based on any of them are possible and may be useful. All processors perform the same operations on these data, which are often independent of one another. This is my favorite decomposition technique, because it is usually easy to do, often has no dependencies between tasks and scales really well.
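As an illustration, here is a minimal C++ sketch of a data decomposition: the input array is cut into equally sized chunks and every thread applies the same operation to its own chunk. The number of chunks and the squaring operation are arbitrary choices for the example.

```
// Minimal sketch: data decomposition of an array into independent chunks.
#include <thread>
#include <vector>
#include <iostream>

int main() {
    std::vector<double> data(1000000, 2.0);
    const std::size_t num_chunks = 4;                 // illustrative chunk count
    const std::size_t chunk = data.size() / num_chunks;

    std::vector<std::thread> workers;
    for (std::size_t t = 0; t < num_chunks; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t == num_chunks - 1) ? data.size() : begin + chunk;
        // Every task performs the same operation (squaring) on its own slice,
        // so the tasks are completely independent of one another.
        workers.emplace_back([&data, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                data[i] = data[i] * data[i];
        });
    }
    for (auto& w : workers) w.join();
    std::cout << data.front() << " ... " << data.back() << '\n';   // prints 4 ... 4
}
```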

Functional Decomposition

For functional decomposition, the functions to be performed on data are split into multiple tasks. These tasks can then be performed concurrently by different processes on different data. This often leads to so-called pipelines. Although this decomposition technique is usually easy to do as well, it often does not scale too well, because there is only a limited number of functions in each program to split up.
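The following minimal C++ sketch shows a two-stage pipeline built from a functional decomposition. The stage functions and the hand-rolled queue are purely illustrative; real code would normally use a proper concurrent queue.

```
// Minimal sketch: functional decomposition into a two-stage pipeline.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <iostream>

std::queue<int> buffer;        // hands items from stage 1 to stage 2
std::mutex m;
std::condition_variable cv;
bool done = false;

void stage1_produce() {        // first function: generate items
    for (int i = 1; i <= 5; ++i) {
        std::lock_guard<std::mutex> lock(m);
        buffer.push(i * i);
        cv.notify_one();
    }
    { std::lock_guard<std::mutex> lock(m); done = true; }
    cv.notify_one();
}

void stage2_consume() {        // second function: process items further
    while (true) {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [] { return !buffer.empty() || done; });
        if (buffer.empty() && done) break;
        int item = buffer.front(); buffer.pop();
        lock.unlock();
        std::cout << "processed " << item << '\n';
    }
}

int main() {
    std::thread a(stage1_produce), b(stage2_consume);
    a.join(); b.join();
}
```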

Exploratory Decomposition

Exploratory decomposition is a special case for algorithms that search through a predefined space for solutions. Here, the search space can often be partitioned into tasks, which can be processed concurrently. Exploratory decomposition is not generally applicable; an example where it works well is breadth-first search in trees.
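Here is a minimal C++ sketch of an exploratory decomposition: a search space of integers is partitioned into ranges, and each thread explores its own range until some task finds a solution. The predicate is a toy stand-in for a real search problem.

```
// Minimal sketch: exploratory decomposition by partitioning the search space.
#include <atomic>
#include <thread>
#include <vector>
#include <iostream>

std::atomic<long> solution{-1};

bool is_solution(long x) {               // toy predicate, stands in for a real test
    return (x * 37) % 1000000 == 1;      // i.e. look for the inverse of 37 mod 10^6
}

void explore(long begin, long end) {     // each task explores one part of the space
    for (long x = begin; x < end && solution.load() == -1; ++x)
        if (is_solution(x)) { solution.store(x); return; }
}

int main() {
    const long space = 1000000, num_tasks = 4, part = space / num_tasks;
    std::vector<std::thread> workers;
    for (long t = 0; t < num_tasks; ++t)
        workers.emplace_back(explore, t * part, (t + 1) * part);
    for (auto& w : workers) w.join();
    std::cout << "found: " << solution.load() << '\n';
}
```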

Speculative Decomposition

Another special-purpose decomposition technique is called speculative decomposition. When only one of several functions is carried out depending on a condition (think: a switch statement), these functions are turned into tasks and carried out before the condition is even evaluated. As soon as the condition has been evaluated, only the results of one task are used and all the others are thrown away. This decomposition technique is quite wasteful of resources and therefore seldom used (personally, I have never used it).
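As a minimal C++ sketch of speculative decomposition, the example below starts both possible branches of a condition as tasks before the condition itself is known and keeps only one result. The branch functions and the slow condition are placeholders for illustration.

```
// Minimal sketch: speculative decomposition of an if/else into two tasks.
#include <future>
#include <numeric>
#include <vector>
#include <iostream>

int branch_a() { return 1; }             // work done if the condition is true
int branch_b() { return 2; }             // work done if the condition is false

bool slow_condition() {                  // stands in for an expensive test
    std::vector<int> v(1000000, 1);
    return std::accumulate(v.begin(), v.end(), 0) % 2 == 0;
}

int main() {
    // Start both branches speculatively, in parallel with evaluating the condition.
    auto a = std::async(std::launch::async, branch_a);
    auto b = std::async(std::launch::async, branch_b);

    bool cond = slow_condition();
    int result = cond ? a.get() : b.get();   // the other result is thrown away
    std::cout << result << '\n';
}
```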

The different decomposition methods described above can of course also be combined into hybrid decompositions. I think these are the most common techniques. Do you happen to know any other important ones? Then please don’t hesitate to share them in the comments!
