What does it mean if an algorithm has a time complexity of Ω(n log n)?
It runs in exactly n log n time.
It runs in at most n log n time.
It has a logarithmic growth rate.
It runs in at least n log n time.
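For reference on the question above, the standard formal definition of Big Omega (a textbook statement, not specific to any one source) can be written in LaTeX as:

    f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \ \text{such that}\ f(n) \ge c \cdot g(n) \ \text{for all}\ n \ge n_0

In words: beyond some input size, the running time is bounded below by a constant multiple of n log n.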
Which of the following asymptotic notations represents the tightest upper bound on the growth of a function?
Little-o (o)
Big Omega (Ω)
Big Theta (Θ)
Big-O (O)
How can understanding the time complexity of data structures aid in optimizing code?
It has no direct impact on code optimization; it's purely for theoretical analysis.
It helps determine the best programming language for the algorithm.
It helps choose the most appropriate data structure for the task, optimizing operations.
It guides the choice of variable names for improved code readability.
Which of the following is NOT a valid reason for analyzing an algorithm's time complexity?
Identifying potential performance bottlenecks
Comparing the efficiency of different algorithms for a given task
Determining the optimal programming language for an algorithm
Understanding how an algorithm's runtime scales with input size
What is the time complexity of adding an element at the end of a dynamic array (ArrayList in Java, vector in C++) if there is enough capacity?
O(log n)
O(1)
O(n)
O(n log n)
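A minimal Java sketch for the dynamic-array question above (the class name and capacity value are illustrative assumptions, not part of the quiz): when spare capacity exists, an append just writes into the next free slot, with no shifting or reallocation.

    import java.util.ArrayList;

    public class AppendDemo {
        public static void main(String[] args) {
            // Reserve capacity up front so the append below does not trigger a resize.
            ArrayList<Integer> list = new ArrayList<>(16);
            list.add(42); // written into the next free slot: constant-time work
            System.out.println(list);
        }
    }

When capacity is exhausted, the backing array must be copied into a larger one, which costs O(n) for that single append but remains O(1) amortized over many appends.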
Which of the following is a limitation of time complexity analysis?
It can't be applied to algorithms with nested loops
It always provides the exact runtime of an algorithm
It's only relevant for algorithms processing numerical data
It doesn't consider the hardware on which the algorithm will run
Which notation provides both an upper and lower bound on the growth of a function, implying the function grows at the same rate as the specified function?
Little-omega (ω)
Big-O (O)
Big Omega (Ω)
Big Theta (Θ)
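For reference, the standard formal definition of Big Theta (again a textbook statement, written in LaTeX for precision) captures "grows at the same rate" as a simultaneous upper and lower bound:

    f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 > 0 \ \text{such that}\ c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n) \ \text{for all}\ n \ge n_0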
Which time complexity is represented by an algorithm that iterates through a list of size n and performs a constant time operation in each iteration?
O(n^2)
O(log n)
O(n)
O(1)
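A minimal Java sketch of the pattern described in the question above (the method and array contents are illustrative assumptions): one constant-time operation per element of a list of size n gives O(n) overall.

    public class LinearScan {
        // One constant-time addition per iteration over n elements: O(n) total.
        static long sum(int[] values) {
            long total = 0;
            for (int v : values) {
                total += v; // constant-time work in each iteration
            }
            return total;
        }

        public static void main(String[] args) {
            System.out.println(sum(new int[]{1, 2, 3, 4}));
        }
    }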
Which of the following operations typically represents constant time complexity, O(1)?
Inserting an element at the beginning of a linked list
Sorting an array using bubble sort
Finding the smallest element in an unsorted array
Searching for a specific value in an unsorted array
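A minimal, hypothetical singly linked list in Java illustrating the constant-time option above: inserting at the head updates a fixed number of references regardless of how long the list is.

    public class SinglyLinkedList {
        static class Node {
            int value;
            Node next;
            Node(int value, Node next) { this.value = value; this.next = next; }
        }

        private Node head;

        // Only a constant number of reference updates, independent of list length: O(1).
        void insertFirst(int value) {
            head = new Node(value, head);
        }

        public static void main(String[] args) {
            SinglyLinkedList list = new SinglyLinkedList();
            list.insertFirst(3);
            list.insertFirst(2);
            list.insertFirst(1);
            for (Node n = list.head; n != null; n = n.next) {
                System.out.println(n.value);
            }
        }
    }

By contrast, bubble sort is O(n^2), while scanning an unsorted array (for a specific value or for the minimum) is O(n).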
If an algorithm's time complexity is O(n^2), what can you conclude about its best-case time complexity?
It is Ω(n^2).
It is also O(n^2).
It is always constant, i.e., O(1).
It cannot be determined from the given information.