Relative efficiency of Algorithms using Big O Notation
Upasana | November 21, 2020 | 2 min read
Big O Notation is a mechanism used to measure the relative efficiency of algorithms in terms of space and time. It helps us understand how the execution time and memory requirements of an algorithm grow as a function of increasing input size. In this notation, O stands for the Order of magnitude.
Constant O(1)
A program whose running time’s order of growth is constant executes a fixed number of operations to finish the job; its running time does not depend on N.
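For illustration, a minimal sketch of a constant-time operation (the helper name and the choice of index are arbitrary):

```java
// O(1): reading an element by index and checking it for even/odd
// are single operations whose cost does not depend on the array length.
static boolean thirdElementIsEven(int[] numbers) {
    return numbers[2] % 2 == 0;
}
```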
Linear O(N)
A program that spends a constant amount of time processing each piece of input data, so its running time is proportional to N.
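For example, printing every element of an array (as listed in the table further below) touches each element exactly once:

```java
// O(N): the loop body runs once per element, so doubling the
// array length doubles the running time.
static void printAll(int[] values) {
    for (int value : values) {
        System.out.println(value);
    }
}
```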
Logarithmic O(log n)
A program where the problem size is cut in half on every iteration, for example Binary Search over a sorted array.
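A compact sketch of iterative Binary Search over a sorted int array:

```java
// O(log n): each comparison discards half of the remaining range,
// so an array of N elements needs at most about log2(N) iterations.
static int binarySearch(int[] sorted, int key) {
    int low = 0, high = sorted.length - 1;
    while (low <= high) {
        int mid = (low + high) >>> 1;      // unsigned shift avoids overflow
        if (sorted[mid] == key) return mid;
        if (sorted[mid] < key) low = mid + 1;
        else high = mid - 1;
    }
    return -1;                             // key not present
}
```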
Quadratic O(n²)
A quadratic task requires a number of steps proportional to the square of its input size.
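Bubble Sort (listed in the table further below) is a classic quadratic example; a minimal sketch:

```java
// O(n²): the nested loops compare on the order of n * n pairs,
// so doubling the input roughly quadruples the work.
static void bubbleSort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {
        for (int j = 0; j < a.length - 1 - i; j++) {
            if (a[j] > a[j + 1]) {         // swap adjacent out-of-order pair
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
            }
        }
    }
}
```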
Exponential O(2ⁿ)
A program whose running time doubles with every unit increase in the input size. Calculating Fibonacci numbers with the naive recursive algorithm (without Dynamic Programming) is a classic example of exponential runtime.
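A minimal sketch of that naive recursion:

```java
// O(2ⁿ): every call spawns two further calls, so the call tree roughly
// doubles with each increase of n. Memoization (Dynamic Programming)
// would bring this down to O(n).
static long fibonacci(int n) {
    if (n <= 1) return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}
```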
Following are examples of Big O classes, in increasing order of magnitude:
| Big O Notation | Name | Example |
|---|---|---|
| O(1) | Constant-time | Looking up a value in a HashMap, checking a number for even/odd |
| O(log n) | Logarithmic | Finding an item inside a sorted array using Binary Search |
| O(n) | Linear | Printing all elements from an array |
| O(n log n) | Log-linear | Sorting using Merge Sort |
| O(n²) | Quadratic | Bubble Sort algorithm |
| O(2ⁿ) | Exponential | Naive recursive calculation of Fibonacci numbers |
| O(n!) | Factorial | Brute-force solution to the Travelling Salesman Problem |
Base of Logarithm is irrelevant in Big O Notation
The base of the logarithm is not relevant with respect to the order of growth, since all logarithms with a constant base differ only by a constant factor (log_b N = log N / log b, and log b is a constant), so plain log N is used when referring to the order of growth. Note, however, that the base of an exponent does matter: O(2ⁿ) and O(3ⁿ) are different orders of growth.
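As a quick numerical check (the log2 helper below is only for illustration; java.lang.Math has no built-in base-2 logarithm):

```java
// All logarithms with a constant base differ only by a constant factor:
// log2(N) = log10(N) / log10(2), and 1 / log10(2) ≈ 3.32 for every N.
static double log2(double n) {
    return Math.log(n) / Math.log(2);   // natural log works just as well
}
// e.g. log2(1_000_000) ≈ 19.93 and Math.log10(1_000_000) = 6.0,
// a ratio of ≈ 3.32 regardless of N, so O(log2 n) = O(log10 n) = O(log n).
```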
Time efficiency in Big O notation for a few Java Collections
ArrayList
- O(1) for add, size and get
- O(n) for the toString() method

PriorityQueue
- O(1) for peek, element and size
- O(log n) for offer, poll, remove() and add
- O(n) for remove(Object) & contains(Object)

HashMap
- O(1) for the get operation
- O(1) for the put operation

LinkedList
- O(1) for removal and O(1) for add & poll
- O(n) for the toString() method
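To tie these figures back to code, a minimal usage sketch with the standard java.util classes (class and variable names are only for illustration):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.PriorityQueue;
import java.util.Queue;

public class CollectionComplexityDemo {
    public static void main(String[] args) {
        List<Integer> list = new ArrayList<>();
        list.add(42);                 // amortized O(1) append
        int first = list.get(0);      // O(1) index lookup

        Queue<Integer> heap = new PriorityQueue<>();
        heap.offer(7);                // O(log n): sift up in the binary heap
        heap.peek();                  // O(1): smallest element sits at the root
        heap.poll();                  // O(log n): sift down after removal

        Map<String, Integer> map = new HashMap<>();
        map.put("answer", first);     // O(1) on average
        map.get("answer");            // O(1) on average

        LinkedList<Integer> deque = new LinkedList<>();
        deque.addFirst(1);            // O(1): pointer update at the head
        deque.pollLast();             // O(1): pointer update at the tail
    }
}
```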