
Apologies for the newbie question, but I am a bit confused about what exactly counts as a "simple operation" when working out the time complexity of an algorithm. In particular, why do we consider all operations to be equal?

Surely, dividing two very large numbers is more time-consuming than adding one to a number (as happens in each iteration of a for loop). Multiplication, for example, can be carried out as a sequence of smaller additions. So, instead of just counting all operations equally, shouldn't we be applying some kind of weight to each operation depending on its type (addition, multiplication, etc.) and the size of the numbers involved?

My problem is that I am being asked to prove that the complexity of my algorithm is $O(f)$ (for some function $f$) and I am not sure how to do this in a mathematically rigorous fashion because of the inherent vagueness in the definition of a "simple operation". So how would I go about this?

user85798

1 Answer


Simple operations are those that take constant time. The confusion is that division does not look like a constant-time operation, and in general it is not. But:

Division and addition both take constant time when the operands are fixed-width machine words, say 32-bit integers, which is the usual assumption (this is the unit-cost, or word-RAM, model). If your algorithm works with arbitrarily large numbers, arithmetic is no longer constant-time: the cost grows with the number of digits. In practice people often still count each arithmetic operation as one step, under the implicit assumption that the numbers never grow beyond a machine word. For a rigorous proof, state which model you are using: either the word-RAM model, where each arithmetic operation on word-sized operands counts as one step, or the bit-complexity model, where you charge each operation by the size of its operands.
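As a quick, informal illustration (a sketch in Python, whose built-in integers are arbitrary-precision, so this is a timing demonstration rather than a proof):

```python
import timeit

# Compare adding two word-sized integers with adding two very large
# integers. Python's ints are arbitrary-precision, so the cost of "+"
# grows with the number of digits involved.

small_a, small_b = 12345, 67890      # fit comfortably in a machine word
big_a = 10 ** 100_000 + 7            # roughly 100,000 decimal digits
big_b = 10 ** 100_000 + 11

t_small = timeit.timeit(lambda: small_a + small_b, number=100_000)
t_big = timeit.timeit(lambda: big_a + big_b, number=100_000)

print(f"word-sized addition:  {t_small:.4f}s")
print(f"huge-number addition: {t_big:.4f}s")  # noticeably slower
```

On a typical machine the second timing comes out much larger, which is exactly why, when operands can grow without bound, a rigorous analysis should count bit operations rather than treating every "+" as one step.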

Alex Li