Time Complexity Performance Measure
A Time Complexity Performance Measure is a Computational Complexity measure that quantifies the amount of time it takes to run an algorithm.
- AKA: Time Complexity, Time Complexity Measure, Computing Time Complexity, Run-Time Complexity.
- Context:
- It can be calculated using a Run Time Complexity Evaluation Algorithm.
- Example(s) (several are illustrated by the code sketch after this outline):
- a Constant Time Complexity Measure: $T(n)=O(1)$,
- a Cubic Time Complexity Measure: $T(n)= O(n^3)$,
- an Exponential Time Complexity Measure: $T(n)=2^{poly(n)}$,
- a Factorial Time Complexity Measure: $T(n)=O(n!)$,
- a Linear Time Complexity Measure: $T(n)=O(n)$,
- a Logarithmic Time Complexity Measure: $T(n)=O\left(\log (n)\right)$,
- a Polylogarithmic Time Complexity Measure: $T(n) = poly\left(\log (n)\right)$,
- a Polynomial Time Complexity Measure: $T(n) = 2^{O\left(\log (n)\right)} = poly(n)$,
- a Quadratic Time Complexity Measure: $T(n) = O(n^2)$,
- a Turing Machine Time Complexity (DTIME) Measure.
- …
- Counter-Example(s):
- a Space Complexity Performance Measure, which measures an algorithm's memory usage rather than its running time.
- See: Time Hierarchy Theorem, Bit, Constant Factor, Worst-Case Complexity, Average-Case Complexity, Mathematical Function, Asymptotic Analysis, Big O Notation.
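As a rough illustration of several of the classes above, here is a minimal Python sketch (our own illustrative code; the function names are not from any cited source) of routines whose elementary-operation counts are constant, logarithmic, linear, and quadratic in the input size:

```python
# Illustrative sketch: each function's running time, measured in
# elementary operations, falls into one of the classes listed above.
# (Function names are ours; the T(n) annotations give the dominant term.)

def constant_time(xs):          # T(n) = O(1): first element, independent of n
    return xs[0] if xs else None

def linear_time(xs):            # T(n) = O(n): one pass over the input
    total = 0
    for x in xs:
        total += x
    return total

def quadratic_time(xs):         # T(n) = O(n^2): visits all ordered pairs
    return [(a, b) for a in xs for b in xs]

def logarithmic_time(xs, key):  # T(n) = O(log n): binary search on sorted input
    lo, hi = 0, len(xs)
    while lo < hi:
        mid = (lo + hi) // 2
        if xs[mid] < key:
            lo = mid + 1
        else:
            hi = mid
    return lo
```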
References
2020
- (Wikipedia, 2020) ⇒ https://en.wikipedia.org/wiki/Computational_complexity#Time Retrieved:2020-4-5.
- The resource that is most commonly considered is time. When "complexity" is used without qualification, this generally means time complexity.
The usual units of time (seconds, minutes etc.) are not used in complexity theory because they are too dependent on the choice of a specific computer and on the evolution of technology. For instance, a computer today can execute an algorithm significantly faster than a computer from the 1960s; however, this is not an intrinsic feature of the algorithm but rather a consequence of technological advances in computer hardware. Complexity theory seeks to quantify the intrinsic time requirements of algorithms, that is, the basic time constraints an algorithm would place on any computer. This is achieved by counting the number of elementary operations that are executed during the computation. These operations are assumed to take constant time (that is, not affected by the size of the input) on a given machine, and are often called steps.
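As a hedged illustration of this step-counting view, the sketch below (our own code; `selection_sort_steps` is a hypothetical helper, not a standard library function) instruments a simple sorting algorithm to count comparisons rather than wall-clock time:

```python
# Sketch: measure an algorithm by counting elementary operations ("steps")
# rather than wall-clock time, so the measure is machine-independent.

def selection_sort_steps(xs):
    """Sort a copy of xs, returning (sorted_list, step_count)."""
    xs = list(xs)
    steps = 0
    n = len(xs)
    for i in range(n):
        smallest = i
        for j in range(i + 1, n):
            steps += 1              # one comparison = one elementary step
            if xs[j] < xs[smallest]:
                smallest = j
        xs[i], xs[smallest] = xs[smallest], xs[i]
    return xs, steps

# For input size n the comparison count is n(n-1)/2, i.e. T(n) = O(n^2),
# regardless of which machine runs the code.
print(selection_sort_steps([5, 2, 8, 1])[1])   # -> 6 = 4*3/2
```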
2018
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Time_complexity Retrieved:2018-4-1.
- In computer science, the time complexity is the computational complexity that measures or estimates the time taken for running an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that an elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm differ by at most a constant factor.
Since an algorithm's running time may vary with different inputs of the same size, one commonly considers the worst-case time complexity, which is the maximum amount of time taken on inputs of a given size. Less common, and usually specified explicitly, is the average-case complexity, which is the average of the time taken on inputs of a given size (this makes sense, as there is only a finite number of possible inputs of a given size).
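A minimal sketch of the worst-case versus average-case distinction, using linear search (illustrative code of our own):

```python
# Sketch: worst-case vs. average-case comparison counts for linear search.
# Worst case: the key is absent, so all n elements are inspected.
# Average case: over keys drawn uniformly from the list, about (n+1)/2
# elements are inspected.

def linear_search_comparisons(xs, key):
    comparisons = 0
    for x in xs:
        comparisons += 1
        if x == key:
            break
    return comparisons

xs = list(range(100))                       # n = 100
worst = linear_search_comparisons(xs, -1)   # key absent -> 100 comparisons
average = sum(linear_search_comparisons(xs, k) for k in xs) / len(xs)
print(worst, average)                       # 100, 50.5
```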
In both cases, the time complexity is generally expressed as a function of the size of the input.[1] Since this function is generally difficult to compute exactly, and the running time is usually not critical for small input, one commonly focuses on the behavior of the complexity when the input size increases; that is, on the asymptotic behavior of the complexity. Therefore, the time complexity is commonly expressed using big O notation, typically [math]\displaystyle{ O(n), }[/math] [math]\displaystyle{ O(n\log n), }[/math] [math]\displaystyle{ O(n^\alpha), }[/math] [math]\displaystyle{ O(2^n), }[/math] etc., where [math]\displaystyle{ n }[/math] is the input size measured by the number of bits needed to represent it.
Algorithm complexities are classified by the function appearing in the big O notation. For example, an algorithm with time complexity [math]\displaystyle{ O(n) }[/math] is a linear time algorithm, and an algorithm with time complexity [math]\displaystyle{ O(n^\alpha) }[/math] for some constant [math]\displaystyle{ \alpha \ge 1 }[/math] is a polynomial time algorithm.
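One informal way to observe these asymptotic classes is to double the input size and compare operation counts: a linear-time routine's count roughly doubles, while a quadratic-time routine's roughly quadruples. A sketch under these assumptions (our own code):

```python
# Sketch: doubling the input size reveals the asymptotic class.
# For T(n) = O(n) the count doubles; for T(n) = O(n^2) it quadruples.

def linear_ops(n):      # O(n): n iterations
    return sum(1 for _ in range(n))

def quadratic_ops(n):   # O(n^2): n*n pair visits
    return sum(1 for _ in range(n) for _ in range(n))

for n in (1_000, 2_000, 4_000):
    print(n, linear_ops(n), quadratic_ops(n))
# linear_ops grows 1000 -> 2000 -> 4000 (x2 per doubling);
# quadratic_ops grows 1_000_000 -> 4_000_000 -> 16_000_000 (x4 per doubling).
```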
- ↑ Sipser, Michael (2006). Introduction to the Theory of Computation. Course Technology Inc. ISBN 0-619-21764-2.