# Why is O(n!) slower than O(n^2)?

• I'm studying *Grokking Algorithms*, which says:

O(n!). Example: very slow algorithms.
O(n^2). Example: slow algorithms.

I thought I'd check it out for myself.

O(n!), for n = 3: 3! = 1 * 2 * 3 = 6
O(n^2), for n = 3: 3^2 = 3 * 3 = 9

Consequently, 3! < 3^2.

It can be concluded: O(n!) is faster than O(n^2).
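The comparison for n = 3 can be checked with a quick sketch (plain standard-library Python, nothing from the book assumed):

```python
import math

n = 3
print(math.factorial(n))  # 3! = 6
print(n ** 2)             # 3^2 = 9
```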

Or did I get something wrong?

• The thing is, the O() notation refers to asymptotic behavior.

That is, to understand why O(n!) is slower than O(n^2), write out a number of values of each function:

```
 n       n!          n^2
 1       1           1
 2       2           4
 3       6           9
 4       24          16
 5       120         25
 6       720         36
 7       5040        49
 8       40320       64
 9       362880      81
10       3628800     100
```
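The table above can be regenerated with a short Python sketch; note that n! overtakes n^2 from n = 4 on and then grows far faster:

```python
import math

# Print n, n! and n^2 side by side for n = 1..10
for n in range(1, 11):
    print(f"{n:>2}  {math.factorial(n):>9}  {n * n:>5}")
```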

Is that clearer?

PS. The book is nice, by the way. It's particularly pleasant that it explains the BFS algorithm in simple words, without overloading your head.
