You are designing a dynamic array implementation that needs to support efficient insertion at both the beginning and the end. Which of the following underlying data structures would be the MOST suitable to achieve this with optimal time complexity?
Doubly Linked List
Binary Search Tree
Circular Buffer
Singly Linked List
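
For reference, a minimal sketch of the circular-buffer idea listed above (class and method names are illustrative): a ring buffer over a resizable backing array supports amortized O(1) insertion at both ends, since indices wrap around and the array only grows when it is full.

    class RingDeque:
        """Circular buffer supporting amortized O(1) insertion at both ends."""

        def __init__(self, capacity=8):
            self._buf = [None] * capacity
            self._head = 0            # index of the logical first element
            self._size = 0

        def _grow(self):
            # Double the backing array, unwrapping elements into order.
            old, n = self._buf, self._size
            self._buf = [None] * (2 * len(old))
            for i in range(n):
                self._buf[i] = old[(self._head + i) % len(old)]
            self._head = 0

        def append(self, value):      # insert at the end
            if self._size == len(self._buf):
                self._grow()
            self._buf[(self._head + self._size) % len(self._buf)] = value
            self._size += 1

        def appendleft(self, value):  # insert at the beginning
            if self._size == len(self._buf):
                self._grow()
            self._head = (self._head - 1) % len(self._buf)
            self._buf[self._head] = value
            self._size += 1
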
Consider an algorithm that iterates through a sorted array of size N. In each iteration, it performs a binary search on the array. What is the overall time complexity of this algorithm?
O(N^2)
O(N log N)
O(log N)
O(N)
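
As a small illustration (Python's bisect module stands in for the binary search; the function name is illustrative): the loop below runs N iterations, each performing one O(log N) binary search on the same array, giving O(N log N) work overall.

    import bisect

    def loop_with_binary_search(arr):
        """Iterate over a sorted array, binary-searching it in each iteration."""
        total = 0
        for x in arr:                            # N iterations
            total += bisect.bisect_left(arr, x)  # each search costs O(log N)
        return total                             # overall: O(N log N)
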
External Merge Sort is particularly well-suited for scenarios where:
The data is stored on a slow, disk-based storage device.
Real-time sorting is required.
The elements are already nearly sorted.
The dataset is small and fits entirely in memory.
What is a key advantage of Radix Sort over comparison-based sorting algorithms like Quick Sort and Merge Sort?
Radix Sort guarantees stability, while Quick Sort and Merge Sort do not.
Radix Sort can achieve better than O(n log n) time complexity in certain cases.
Radix Sort is generally more suitable for sorting strings than numerical data.
Radix Sort is always more space-efficient than comparison-based algorithms.
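
For context, a minimal LSD radix sort sketch for non-negative integers (base 10 is chosen only for readability): because it distributes keys by digit rather than comparing them, its O(d·(n + b)) running time can beat O(n log n) when the number of digits d is small.

    def radix_sort(nums, base=10):
        """LSD radix sort for non-negative integers using a stable pass per digit."""
        if not nums:
            return nums
        exp, max_val = 1, max(nums)
        while max_val // exp > 0:
            buckets = [[] for _ in range(base)]   # stable distribution by digit
            for x in nums:
                buckets[(x // exp) % base].append(x)
            nums = [x for bucket in buckets for x in bucket]
            exp *= base
        return nums
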
In the context of amortized analysis, what is the purpose of the potential function?
To analyze the space complexity of an algorithm.
To optimize the performance of individual array operations.
To determine the maximum possible runtime of a single operation in the worst-case scenario.
To calculate the average runtime of a single operation over a sequence of operations.
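
As a worked illustration of the idea (a sketch, not tied to any particular textbook's notation): for a dynamic array that doubles when full, the potential function Φ = 2·size − capacity charges each cheap append a little extra so that the occasional O(n) resize is paid for in advance; the counter below confirms that the averaged cost per append stays constant.

    def appends_with_cost(n):
        """Count element writes for n appends to a doubling dynamic array."""
        capacity, size, total_writes = 1, 0, 0
        for _ in range(n):
            if size == capacity:      # full: copy everything into a doubled array
                total_writes += size
                capacity *= 2
            total_writes += 1         # write the newly appended element
            size += 1
        return total_writes

    print(appends_with_cost(1024) / 1024)   # roughly 2 writes per append: O(1) amortized
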
You are given an array of integers and a target sum, and you need to find all unique quadruplets in the array that sum to the target. Which of the following approaches is the MOST efficient?
Use a backtracking algorithm to explore all possible combinations of elements.
Sort the array and use two pointers to find pairs of elements that sum up to a specific value.
Use a hash table to store the sum of all pairs of elements.
Use four nested loops to iterate through all possible combinations of four elements.
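
For reference, a sketch of the sort-and-two-pointers strategy listed above, extended to quadruplets (the function name is illustrative): fix the first two elements, then move two pointers inward, skipping duplicates so each quadruplet is reported once; overall O(n^3) time.

    def four_sum(nums, target):
        """Return all unique quadruplets that sum to target (sort + two pointers)."""
        nums = sorted(nums)
        n, result = len(nums), []
        for i in range(n - 3):
            if i > 0 and nums[i] == nums[i - 1]:
                continue                           # skip duplicate first elements
            for j in range(i + 1, n - 2):
                if j > i + 1 and nums[j] == nums[j - 1]:
                    continue                       # skip duplicate second elements
                lo, hi = j + 1, n - 1
                while lo < hi:
                    total = nums[i] + nums[j] + nums[lo] + nums[hi]
                    if total < target:
                        lo += 1
                    elif total > target:
                        hi -= 1
                    else:
                        result.append([nums[i], nums[j], nums[lo], nums[hi]])
                        lo += 1
                        hi -= 1
                        while lo < hi and nums[lo] == nums[lo - 1]:
                            lo += 1                # skip duplicate third elements
                        while lo < hi and nums[hi] == nums[hi + 1]:
                            hi -= 1                # skip duplicate fourth elements
        return result
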
What is a significant disadvantage of using arrays for storing and processing extremely large datasets, particularly in the context of limited memory resources?
Arrays do not support dynamic resizing, making it challenging to handle growing datasets.
Arrays require contiguous blocks of memory, which can be difficult to allocate for massive datasets.
Arrays have slow access times for individual elements.
Arrays are not suitable for storing structured data, such as key-value pairs.
What is the average-case time complexity of Bucket Sort, assuming uniformly distributed data and a number of buckets proportional to the input size?
O(log n)
O(n^2)
O(n)
O(n log n)
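
A short sketch under the usual assumption of values uniformly distributed in [0, 1): with about n buckets, each bucket holds O(1) elements on average, so distributing the values and sorting the buckets takes O(n) expected time.

    def bucket_sort(values):
        """Expected O(n) for values uniformly distributed in [0, 1)."""
        n = len(values)
        if n == 0:
            return values
        buckets = [[] for _ in range(n)]
        for v in values:
            buckets[min(int(v * n), n - 1)].append(v)   # clamp guards a stray 1.0
        result = []
        for bucket in buckets:
            result.extend(sorted(bucket))   # each bucket is O(1) elements on average
        return result
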
Which sorting algorithm is the MOST suitable for sorting a massive dataset that cannot fit entirely in RAM?
Bubble Sort
External Merge Sort
Quick Sort
Merge Sort
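
For context, a compact external merge sort sketch (the file format, chunk size, and function names are illustrative assumptions): sort chunks that fit in memory, write each out as a sorted run, then k-way merge the runs so only one value per run needs to be resident at a time.

    import heapq
    import os
    import tempfile

    def external_merge_sort(input_path, output_path, chunk_size=100_000):
        """Sort a file of one integer per line without loading it all into memory."""
        # Phase 1: read fixed-size chunks, sort each in memory, write sorted runs.
        run_paths = []
        with open(input_path) as src:
            while True:
                chunk = [int(line) for _, line in zip(range(chunk_size), src)]
                if not chunk:
                    break
                chunk.sort()
                fd, path = tempfile.mkstemp(text=True)
                with os.fdopen(fd, "w") as run:
                    run.writelines(f"{x}\n" for x in chunk)
                run_paths.append(path)

        # Phase 2: k-way merge of the sorted runs; heapq.merge keeps only one
        # value per run in memory at a time.
        runs = [open(p) for p in run_paths]
        try:
            with open(output_path, "w") as out:
                for value in heapq.merge(*((int(line) for line in r) for r in runs)):
                    out.write(f"{value}\n")
        finally:
            for r in runs:
                r.close()
            for p in run_paths:
                os.remove(p)
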
Given an array of n positive integers and a positive integer s, find the minimal length of a contiguous subarray whose sum is at least s; if no such subarray exists, return 0. Which of the following approaches is the MOST efficient?
Use two nested loops to iterate through all possible subarrays.
Use binary search to find the minimal length.
Use a sliding window approach to find the minimal length subarray.
Use dynamic programming to store the minimal length for all subarrays ending at each index.
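
For reference, a sketch of the sliding-window approach named in the options (the function name is illustrative): extend the right end until the window's sum reaches s, then shrink from the left while the constraint still holds; each index enters and leaves the window at most once, so the running time is O(n).

    def min_subarray_len(s, nums):
        """Minimal length of a contiguous subarray with sum >= s, or 0 if none."""
        best = float("inf")
        window_sum, left = 0, 0
        for right, value in enumerate(nums):
            window_sum += value
            while window_sum >= s:                 # shrink while the constraint holds
                best = min(best, right - left + 1)
                window_sum -= nums[left]
                left += 1
        return 0 if best == float("inf") else best
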