Constant-Time Algorithm
See: Algorithm, Constant-Time Complexity, Constant Algorithm, Running Time, Computational Complexity Analysis Task, Linear-Time Algorithm, Exponential-Time Algorithm, Polynomial-Time Algorithm.
References
2009
- http://en.wikipedia.org/wiki/Constant_time
- In computational complexity theory, constant time, or O(1) time, refers to the running time of an algorithm when the time needed to solve the problem is bounded by a value that does not depend on the size of the input data.
- For example, accessing any single element in an array takes constant time, as only one operation has to be performed to locate it. However, finding the minimal value in an unordered array is not a constant-time operation, as a scan over every element is needed to determine the minimal value; hence it is a linear-time operation, taking O(n) time (unless more is known about the structure of the input; in a sorted array, for instance, the minimal value is simply the first element and can be read in constant time). If the number of elements is known in advance and does not change, however, such an algorithm can still be said to run in constant time. The contrast between the two cases is sketched in code after this list.
- Despite the name "constant time", the time does not have to be strictly independent of the problem size; it only has to be bounded independently of the problem size. For example, the task "exchange the values of a and b if necessary so that a ≤ b" is called constant time even though the time may depend on whether or not it is already true that a ≤ b (a sketch of this task appears after this list). The important thing is that there is some constant t such that the time required is always at most t.
- Here are some examples of code fragments that run in constant time:
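The fragments themselves are not preserved in this copy of the page. As a minimal sketch in C (illustrative code, not the article's original listing), each of the following statements or blocks performs a fixed amount of work regardless of any input size:

```c
#include <stdio.h>

int main(void) {
    int list[] = {7, 3, 9, 1, 5};

    /* Indexing by a fixed position: one address computation, one load. */
    int item = list[2];

    /* Branching between two constant-time operations is still constant time:
       whichever side runs, the work is bounded by the same fixed constant. */
    if (item > 4) {
        item = item * 2;
    } else {
        item = item + 1;
    }

    /* A loop with a fixed, input-independent bound performs at most 100
       constant-time iterations, so the whole fragment is O(1). */
    int sum = 0;
    for (int i = 0; i < 100; i++) {
        sum += i;
    }

    printf("item = %d, sum = %d\n", item, sum);
    return 0;
}
```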
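The exchange task mentioned above ("make sure a ≤ b") can be sketched the same way; the helper name order_pair is hypothetical, not from the source. Whether or not the swap fires, either path finishes within the same constant bound, which is exactly the sense in which the operation is constant time.

```c
#include <stdio.h>

/* Ensure *a <= *b. The time depends on whether a swap is needed,
   but it is always bounded by the same small constant, so this is O(1). */
static void order_pair(int *a, int *b) {
    if (*a > *b) {
        int tmp = *a;
        *a = *b;
        *b = tmp;
    }
}

int main(void) {
    int a = 9, b = 4;
    order_pair(&a, &b);
    printf("a = %d, b = %d\n", a, b);   /* prints: a = 4, b = 9 */
    return 0;
}
```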
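For contrast with the array example above, the following sketch (again with illustrative, hypothetical function names) shows why indexing an array is O(1) while finding the minimum of an unordered array is O(n):

```c
#include <stddef.h>
#include <stdio.h>

/* Constant time: a single index computation, independent of the array length. */
static int element_at(const int *arr, size_t i) {
    return arr[i];
}

/* Linear time: every element must be inspected once, so the work grows with n. */
static int minimum(const int *arr, size_t n) {
    int min = arr[0];
    for (size_t i = 1; i < n; i++) {
        if (arr[i] < min) {
            min = arr[i];
        }
    }
    return min;
}

int main(void) {
    int data[] = {42, 7, 19, 3, 88};
    size_t n = sizeof data / sizeof data[0];
    printf("data[3] = %d (O(1)), minimum = %d (O(n))\n",
           element_at(data, 3), minimum(data, n));
    return 0;
}
```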