Time complexity
* What is it?
According to codility.com:
Complexity can be viewed as the maximum number of primitive operations that a program may execute. Regular operations are single additions, multiplications, assignments etc. We may leave some operations uncounted and concentrate on those that are performed the largest number of times. Such operations are referred to as dominant.
* Why care?
Here's a nice answer from Quora.com. Basically, as the processing power of computers grows, so does the size of the problems we throw at them:
Suppose you build a program that has a time complexity of O(n^2), say, Bubble sort. You use Bubble sort because it happens to run fast enough, and you think, well, over time computers get faster anyway so it won't matter in the future.
Now time passes. Computers get 1000 times faster, memory gets 1000 times bigger, and so problems get 1000 times bigger. When you try to stuff a modern sized problem into your old program, it suddenly runs 1000 times slower than it did back in the day!
That's one reason why worrying about complexity is important (there are others). As computers scale up in speed, the problems scale up in size.
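To make the quoted example concrete, here is a minimal sketch of Bubble sort in Python (an assumed reference implementation, not code from the quoted answer); the two nested passes over the list are exactly what makes it O(n^2):

```python
def bubble_sort(items):
    """Classic bubble sort: nested passes over the list, O(n^2) comparisons."""
    items = list(items)  # work on a copy, leave the input untouched
    n = len(items)
    for i in range(n):
        # After pass i, the last i elements are already in their final place.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items
```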
And a similar line of answer from nick's blog:
As a front-end developer working on smaller projects, computer science topics like time complexity analysis may never enter your frame at all. But as the size, scope, and scale of a project changes and grows, it must.
....
As I mentioned before, time complexity becomes an extremely important issue when the scale of an application grows.
* How do you calculate it?
We focus more on the magnitude of a complexity, rather than the constants. Codility.com gives a nice example to explain this:
def dominant(n):
    result = 0
    for i in xrange(n):
        result += 1
    return result
The operation in the loop body (result += 1) is dominant and will be executed n times. The complexity is described in Big-O notation: in this case O(n) — linear complexity. The complexity specifies the order of magnitude within which the program will perform its operations. More precisely, in the case of O(n), the program may perform c · n operations, where c is a constant; however, it may not perform n² operations, since this involves a different order of magnitude of data. In other words, when calculating the complexity we omit constants: i.e. regardless of whether the loop is executed 20 · n times or n times, we still have a complexity of O(n), even though the running time of the program may vary.
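To make the "constants are omitted" point concrete, here is a hypothetical variant of the codility example above that performs 20 · n additions; it is slower in wall-clock time, but its complexity is still O(n):

```python
def dominant_scaled(n):
    # 20 * n additions in total -- the constant factor 20 is dropped
    # in Big-O notation, so this is still O(n).
    result = 0
    for i in range(n):
        for _ in range(20):
            result += 1
    return result
```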
Codility.com explained it quite clearly and concisely so I don't really have anything to add onto this. But what about more complicated operations? First of all,
Statement;
is clearly independent of the input N: it runs a fixed number of times regardless of what N is. Time complexity: O(1).
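Here is a sketch of what a constant-time function can look like in Python (a hypothetical example using the closed-form sum 1 + 2 + … + N): no matter how large N is, the same fixed number of operations runs.

```python
def sum_first_n(n):
    # Closed-form formula for 1 + 2 + ... + n: always one multiplication,
    # one addition and one division, regardless of n -- O(1).
    return n * (n + 1) // 2
```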
for ( i = 0; i < N; i++ )
statement;
This for loop runs in time directly proportional to N: if N doubles, the running time doubles. Time complexity: O(N).
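One way to see the linear relationship is to count the dominant operations directly (a hypothetical helper, just for illustration):

```python
def count_linear_ops(n):
    # Counts how many times the loop body executes: exactly n times -- O(n).
    ops = 0
    for i in range(n):
        ops += 1  # the dominant operation
    return ops
```

Doubling the input doubles the count: `count_linear_ops(20)` is exactly twice `count_linear_ops(10)`.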
for ( i = 0; i < N; i++ ) {     // runs N times
    for ( j = 0; j < N; j++ )   // runs N times for each i, so N * N in total
        statement;
}
This nested loop structure is quadratic in N: when N doubles, the running time quadruples. In general, the time complexity is O(N^c), where c is the nesting depth; in the example here, c = 2.
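The same operation-counting trick (again a hypothetical helper) makes the quadratic growth visible: doubling N multiplies the count by four.

```python
def count_quadratic_ops(n):
    # The inner statement runs n times for each of the n outer
    # iterations: n * n executions in total -- O(n^2).
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1
    return ops
```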
for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
The time complexity of a loop is O(log(n)) if the loop variable is multiplied or divided by a constant amount on each iteration. Note that the base of the log does not matter for Big-O: logarithms with different bases differ only by a constant factor, which is omitted.
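Counting the iterations of the multiplicative loop shows the logarithmic behavior directly (a hypothetical helper; the constant `c` defaults to 2 here as an assumption):

```python
def count_log_ops(n, c=2):
    # i takes the values 1, c, c^2, ... until it exceeds n, so the loop
    # runs roughly log_c(n) + 1 times -- O(log n).
    ops = 0
    i = 1
    while i <= n:
        ops += 1
        i *= c
    return ops
```

Squaring the input only adds a handful of iterations: `count_log_ops(1024)` returns 11, while `count_log_ops(1024 * 1024)` returns 21.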
Links for help
http://bigocheatsheet.com
https://stackoverflow.com/questions/11032015/how-to-find-time-complexity-of-an-algorithm
http://www.geeksforgeeks.org/analysis-of-algorithms-set-4-analysis-of-loops/
http://discrete.gr/complexity/
https://www.quora.com/Why-we-care-for-time-n-space-complexity-even-if-we-have-super-fast-processors-today
http://callmenick.com/post/time-complexity-analysis-why-its-important