
Having got some basics down regarding addition and how it is explained in terms of primitive operations (addition and multiplication), I am now stuck again on understanding the more complicated long multiplication.

I have read in my introductory book (Mehlhorn's 'Algorithms and Data Structures') that each partial product requires 2n + 1 primitive operations.

Now I think I understand where the +1 comes from, at least for the first partial product: there is one extra digit in the result. But I don't understand why it is "2n" + 1; it seems to me it should be more like n + 1, so I am definitely missing something important here. By 'partial product', I mean the result of the first intermediate calculation of the multiplication problem.

I am of course also totally confused about quantifying the rest of the primitive operations required to get the final product, but for now I would just like to understand where the 2n comes from for the first partial product. To add a guess based on what I have seen with addition: does the 2 in 2n refer to an upper bound rather than an exact count?

Here is a link to the chapter in the book I am learning from: http://people.mpi-inf.mpg.de/~mehlhorn/ftp/Toolbox/Appetizer.pdf

I understood primitive operations to mean addition and multiplication. The author does write at the bottom of p. 1:

"...we have two primitive operations at our disposal: the addition of three digits with a two-digit result (this is sometimes called a full adder), and the multiplication of two digits with a two-digit result"

Computing the product of two n-digit numbers then needs 3n^2 + 2n primitive operations.

The algorithm in question is the long multiplication method taught at school.

(All of this is on pp. 1 and 2 of the chapter.)

hinterbu

1 Answer


Perhaps the following example will be useful. Suppose that $n = 3$ and we want to multiply the three-digit number $a_2 a_1 a_0$ by a single digit $b$ (this is exactly one partial product of the school method). We first compute
$$
\begin{align*}
a_0 \times b &= c_0 d_0 \\
a_1 \times b &= c_1 d_1 \\
a_2 \times b &= c_2 d_2
\end{align*}
$$
(Here $c_0 d_0$ is a two-digit integer.) So far we have done 3 primitive multiplications (for general $n$, this will be $n$ operations).
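In case it helps to see this first step as code, here is a small Python sketch (the concrete digits 674 and 8 are arbitrary, and the helper name `digit_mul` is mine, not the book's):

```python
def digit_mul(x, y):
    """Primitive operation: multiply two digits, two-digit result (high, low)."""
    return divmod(x * y, 10)

# n = 3 example: multiply a = a2 a1 a0 = 674 by the single digit b = 8.
a_digits = [4, 7, 6]    # least significant digit first: a0, a1, a2
b = 8

pairs = [digit_mul(a_i, b) for a_i in a_digits]   # one primitive multiplication per digit of a
print(pairs)            # [(3, 2), (5, 6), (4, 8)], i.e. (c0, d0), (c1, d1), (c2, d2)
```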

Now we wish to perform the addition
$$
\begin{array}{cccc}
0 & d_2 & d_1 & d_0 \\
c_2 & c_1 & c_0 & 0 \\ \hline
r_3 & r_2 & r_1 & r_0
\end{array}
$$
The book charges 4 operations for this (and in general, $n + 1$), though only 3 of them (in general, $n$) involve any arithmetic:
$$
\begin{align*}
r_0 &= d_0 \\
s_1 r_1 &= d_1 + c_0 \\
s_2 r_2 &= d_2 + c_1 + s_1 \\
r_3 &= c_2 + s_2
\end{align*}
$$
Here $s_1, s_2$ are the carries. The first line is just a copy, and the last addition can never overflow into a second digit (since $c_2 \le 8$ and $s_2 \le 1$), yet the author charges one operation for each output position.
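Continuing the same sketch in Python, the addition step looks like this (`full_add` is just my name for the book's three-digit addition):

```python
def full_add(x, y, carry_in):
    """Primitive operation: add three digits, two-digit result (carry, digit)."""
    return divmod(x + y + carry_in, 10)

# The (c_i, d_i) pairs from the 674 * 8 example above.
(c0, d0), (c1, d1), (c2, d2) = (3, 2), (5, 6), (4, 8)

r0 = d0                          # just a copy, no arithmetic
s1, r1 = full_add(d1, c0, 0)     # primitive addition
s2, r2 = full_add(d2, c1, s1)    # primitive addition
_, r3 = full_add(c2, s2, 0)      # primitive addition; never produces a carry here
print(r3, r2, r1, r0)            # 5 3 9 2, i.e. 674 * 8 = 5392
```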

Summarizing, if you only charge additions and multiplications, $2n$ operations are needed: $n$ primitive multiplications and $n$ primitive additions. If you also charge the copy $r_0 = d_0$, as the book does, you get the $2n + 1$ you quoted. (The copy can be eliminated, though. Exercise.)
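If it is useful, here is a sketch of a general partial-product routine that counts operations the way described above (the code and its counting convention are mine, meant only to mirror the discussion, not to reproduce the book):

```python
def partial_product(a_digits, b):
    """Multiply an n-digit number (least significant digit first) by one digit b,
    counting primitive operations as in the discussion above."""
    n = len(a_digits)
    mults = adds = 0

    highs, lows = [], []
    for a_i in a_digits:
        hi, lo = divmod(a_i * b, 10)   # primitive multiplication
        mults += 1
        highs.append(hi)
        lows.append(lo)

    result = [lows[0]]                 # r_0 = d_0: only a copy
    carry = 0
    for i in range(1, n):
        carry, digit = divmod(lows[i] + highs[i - 1] + carry, 10)  # full adder
        adds += 1
        result.append(digit)
    result.append(highs[-1] + carry)   # last column; never overflows a digit
    adds += 1

    return result, mults, adds

digits, mults, adds = partial_product([4, 7, 6], 8)   # 674 * 8
print(digits)             # [2, 9, 3, 5], read backwards: 5392
print(mults, adds)        # 3 3  (n multiplications, n additions)
print(mults + adds)       # 2n = 6 if only arithmetic is charged
print(mults + adds + 1)   # 2n + 1 once the copy r_0 = d_0 is charged as well
```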

Yuval Filmus