# Gustafson's law

**Gustafson's law** (also known as Gustafson-Barsis's law) is a law in computer engineering which states that problems of sufficiently large scale can be parallelized effectively and solved. It is closely related to Amdahl's law, which gives the limit to which a program can be sped up by parallelization. The law was first presented by John Gustafson in 1988.

If *P* is the number of processors, *S* is the speedup, and *α* is the non-parallelizable fraction of the process, then the following holds:

*S*(*P*) = *P* - *α* · (*P* - 1)
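The formula *S*(*P*) = *P* - *α*(*P* - 1) can be evaluated directly; a minimal sketch in Python (the function name and sample values are illustrative):

```python
def gustafson_speedup(p, alpha):
    """Scaled speedup S(P) = P - alpha * (P - 1) for p processors,
    where alpha is the serial (non-parallelizable) fraction."""
    return p - alpha * (p - 1)

# With a 5% serial fraction, 64 processors still give a speedup near 61.
print(gustafson_speedup(64, 0.05))
```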

Gustafson's law addresses the shortcoming of Amdahl's law, under which performance does not scale to use up the computing power that becomes available as the machine grows. It removes the assumption that the problem size is fixed, i.e., that the computational workload on the parallel processors is constant, and instead proposes the notion of fixed execution time, showing that speedup scales when the problem grows with the machine.

Amdahl's law is based on a fixed workload and problem size: the serial part of the program is assumed not to change with the scale of the machine (i.e., the number of processors), while the parallelizable part is assumed to be distributed evenly among the *n* processors. Influenced by Amdahl's law, research groups developed parallelizing compilers that reduce the serial part of a problem in order to improve the performance of parallel systems.
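For contrast, Amdahl's fixed-size speedup is conventionally written as 1 / (*α* + (1 - *α*)/*P*); a small sketch under that standard form (names and sample values are illustrative):

```python
def amdahl_speedup(p, alpha):
    """Fixed-size speedup 1 / (alpha + (1 - alpha) / p); the serial
    fraction alpha caps the speedup at 1 / alpha."""
    return 1.0 / (alpha + (1.0 - alpha) / p)

# With alpha = 0.05 the speedup can never exceed 1 / 0.05 = 20,
# no matter how many processors are added.
for p in (16, 256, 4096):
    print(p, round(amdahl_speedup(p, 0.05), 2))
```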


## Derivation of Gustafson's law

Let *n* be a quantity indicating the size of the problem.

The execution of the program on the parallel computer can be decomposed into a serial part and a parallel part:

*a*(*n*) + *b*(*n*) = 1

Here *a* is the fraction of time spent in the serial part and *b* is the fraction spent in the parallel part; overhead is ignored.

On the other hand, if the same work is executed on a serial computer, where *p* is the number of processors used in the parallel execution, the relative processing time is *a*(*n*) + *p* · *b*(*n*).

In other words, since *a*(*n*) + *b*(*n*) = 1 for the parallel execution, the speedup of parallel over serial execution is

*S* = (*a*(*n*) + *p* · *b*(*n*)) / (*a*(*n*) + *b*(*n*)) = *a*(*n*) + *p* · *b*(*n*)

Here *a*(*n*) is a function indicating the fraction of serial execution as the problem size varies.

If the serial fraction *a*(*n*) decreases as the problem size *n* grows, then the speedup approaches the desired value *p* as *n* grows without bound.
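This limit can be checked numerically; in the sketch below, *a*(*n*) = 10/(10 + *n*) is merely an assumed example of a serial fraction that shrinks with problem size:

```python
def scaled_speedup(p, a_n):
    """Gustafson's scaled speedup a(n) + p * b(n), with a(n) + b(n) = 1."""
    return a_n + p * (1.0 - a_n)

p = 100
for n in (10, 1_000, 100_000):
    a_n = 10.0 / (10.0 + n)  # assumed serial fraction, shrinking with n
    print(n, round(scaled_speedup(p, a_n), 2))  # approaches p as n grows
```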

Seen this way, Gustafson's law appears to rescue parallel computing from the limit imposed by Amdahl's law.

The difference comes from how the two laws treat the serial part: under Amdahl's law its influence grows as the number of processors increases, whereas Gustafson's law assumes that the serial part is not affected by the processor count, so its relative size can be regarded as constant even on a massively parallel machine.

## Metaphors illustrating the two laws

What Amdahl's law shows can be illustrated as follows:

> A car is traveling between two cities 60 miles apart, and has already covered half the distance at 30 mph, taking one hour (the serial execution time). No matter how fast it runs over the second half, it is impossible to reach an average speed of 90 mph by arrival: one hour has already passed, and the total distance is only 60 miles. Even running the rest at infinite speed and arriving in an instant, the average comes to only 60 mph.

Gustafson's law, on the other hand, can be illustrated as follows:

> A car has been traveling at less than 90 mph for some time. Given enough remaining time and distance (remaining computation), the car's average speed can eventually reach 90 mph, no matter how long or how slowly it has already been driven. For example, after driving at 30 mph for one hour, the average reaches 90 mph by driving at 120 mph for another two hours, or at 150 mph for another one hour.
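The arithmetic behind both metaphors can be verified with a short sketch (the speeds and times come from the text above; the helper function is illustrative):

```python
def average_speed(legs):
    """Average speed in mph over a list of (speed_mph, hours) legs."""
    distance = sum(speed * hours for speed, hours in legs)
    time = sum(hours for _, hours in legs)
    return distance / time

# Amdahl metaphor: 30 miles already done at 30 mph (1 hour); even covering
# the remaining 30 miles instantly, the 60-mile trip averages at most 60 mph.
print(60 / 1.0)  # 60.0

# Gustafson metaphor: after 1 hour at 30 mph, a 90 mph average is reached
# by driving 120 mph for two more hours, or 150 mph for one more hour.
print(average_speed([(30, 1), (120, 2)]))  # 90.0
print(average_speed([(30, 1), (150, 1)]))  # 90.0
```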

## Limitations of Gustafson's law

Some problems by their very nature do not have large-scale data sets. For example, a problem that processes one record for every person in the world grows by only a few percent per year.

Nonlinear algorithms may not be able to exploit the parallelism promised by Gustafson's law. Lawrence Snyder points out that for an O(N³) algorithm, doubling the concurrency increases the problem size by only about 26%. Thus, even where very high concurrency can be achieved, it may offer little advantage over the original, less parallel approach. In practice, however, great progress has been made using distributed computing systems such as clusters, and in particular Condor.
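Snyder's observation follows from simple scaling: if the available work doubles and the algorithm's cost grows as N³, the feasible problem size grows only by a factor of 2^(1/3), about 26%. A small sketch with the exponent as a parameter (names are illustrative):

```python
def problem_size_growth(work_factor, exponent):
    """Factor by which the problem size N can grow when the algorithm's
    work scales as N**exponent and the available work grows by work_factor."""
    return work_factor ** (1.0 / exponent)

# Doubling the concurrency of an O(N^3) algorithm lets N grow by only ~26%.
growth = problem_size_growth(2.0, 3)
print(round((growth - 1.0) * 100, 1))  # 26.0
```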

## External links

- Reevaluating Amdahl's Law, the paper in which John Gustafson first described his law. Originally published in Communications of the ACM 31(5), 1988, pp. 532-533.
- [1] -- Lawrence Snyder, "Type Architectures, Shared Memory, and the Corollary of Modest Potential"

This article is taken from the Japanese Wikipedia article **Gustafson's law**.

This article is distributed under the CC-BY-SA or GFDL license in accordance with the provisions of Wikipedia.

In addition, Tranpedia only translates writings published under foreign licenses compatible with the CC-BY-SA license, and is not responsible for their content.
