
Wikipedia gives us the following definition of time complexity:

"In computer science, the time complexity is the computational complexity that describes the amount of time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform."

Now what are these elementary operations? Are they arithmetic operations, or some simpler operations?
Thanks in advance!

1 Answer


The set of elementary operations and their costs constitute a model of computation. The most common model of computation used to analyze algorithms is the random access machine (RAM). Briefly, memory is divided into words of size $O(\log n)$ bits, where $n$ is the size of the input to the algorithm. A program consists of a sequence of instructions. The exact set of instructions isn't really set in stone, but it usually includes arithmetic instructions (addition, subtraction, multiplication, division with remainder), possibly bitwise instructions, memory dereferencing, control flow, and so on. In the unit-cost RAM, the most common model, each instruction operates on a single word and costs one unit of time.
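To make the accounting concrete, here is a small sketch (in Python, with a hypothetical charging scheme) that sums an array while counting unit-cost RAM operations. The exact charges are an illustrative assumption, since, as noted above, the instruction set isn't set in stone; what matters is that each word-sized operation costs one unit.

```python
def sum_with_cost(arr):
    """Sum an array, counting unit-cost RAM operations.

    Hypothetical accounting: each comparison, memory read,
    addition, and index increment is charged one unit.
    """
    cost = 0
    total = 0
    i = 0
    while i < len(arr):   # comparison: 1 unit
        cost += 1
        total += arr[i]   # memory read + addition: 2 units
        cost += 2
        i += 1            # increment: 1 unit
        cost += 1
    cost += 1             # the final, failing comparison
    return total, cost

total, cost = sum_with_cost([3, 1, 4, 1, 5])  # cost is 4n + 1 = 21
```

However you tweak the per-instruction charges, the total is a constant number of units per element, so the running time is $\Theta(n)$ in the unit-cost RAM.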

Other relatively common models of computation include various models of Turing machine, the bit complexity model (in which words consist of a single bit), and logarithmic cost RAM.
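The difference between the unit-cost RAM and the bit complexity model shows up even for a single addition: adding two $n$-bit numbers is one instruction (one unit) in the unit-cost RAM, but costs $\Theta(n)$ in the bit model, one unit per bit position of a ripple-carry addition. A sketch of that bit-level accounting, as an illustration:

```python
def add_bit_cost(a, b):
    """Add two non-negative integers bit by bit, counting bit operations.

    In the unit-cost RAM this addition is a single instruction;
    in the bit model it costs one unit per bit position processed.
    """
    result = 0
    carry = 0
    shift = 0
    cost = 0
    while a or b or carry:
        bit_a, bit_b = a & 1, b & 1
        s = bit_a ^ bit_b ^ carry                            # sum bit
        carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))  # carry out
        result |= s << shift
        a >>= 1
        b >>= 1
        shift += 1
        cost += 1  # one unit per bit position
    return result, cost

val, bits = add_bit_cost(2**63, 2**63 - 1)  # 64 bit operations, not 1
```

This is why bounds like "multiplication in $O(n \log n)$ bit operations" are stated in the bit model, while most algorithm analyses quietly assume the unit-cost RAM.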

Yuval Filmus