You are implementing an LRU (Least Recently Used) cache. Which data structure, in conjunction with a hashmap, is most suitable for tracking the usage order of cached items?
Queue
Stack
Doubly Linked List
Binary Tree
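The combination in question can be sketched in a few lines. The class name `LRUCache` and the capacity of 2 below are illustrative; in CPython, `collections.OrderedDict` is itself backed by a doubly linked list, so it stands in for the hashmap-plus-linked-list pairing with O(1) reordering:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch: OrderedDict plays the role of the
    hashmap + doubly linked list combination."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

A doubly linked list is what makes the move-to-front and evict-from-back operations O(1); a queue or stack cannot remove an arbitrary middle element in constant time.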
What is a potential drawback of using double hashing for collision resolution compared to linear or quadratic probing?
Higher risk of primary clustering
Requires dynamic memory allocation for linked lists
Increased computational cost due to the second hash function
Not suitable for use with open addressing
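A sketch of a double-hashing probe sequence makes the extra cost visible: a second hash value must be computed per key before probing can begin. The particular `h1`/`h2` formulas below are illustrative choices for integer keys, not a standard:

```python
def probe_sequence(key, m, attempts):
    """Double-hashing probe indices for an integer key in a table of size m."""
    h1 = key % m                       # primary hash: initial slot
    h2 = 1 + (key // m) % (m - 1)      # second hash: extra computation per key
    return [(h1 + j * h2) % m for j in range(attempts)]
```

The payoff for that extra computation is that two keys colliding at the same initial slot usually follow different probe sequences, avoiding both primary and secondary clustering.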
What is a primary disadvantage of using linear probing for collision resolution in a hash table?
Increased potential for primary clustering
Not suitable for open addressing
Higher memory overhead compared to chaining
Complex implementation
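Primary clustering is easy to demonstrate: with a step size of 1, every key that collides extends the same contiguous run of occupied slots. The insert helper below is a minimal sketch for integer keys:

```python
def insert_linear(table, key):
    """Insert an integer key with linear probing; return the slot used."""
    m = len(table)
    i = key % m
    while table[i] is not None:
        i = (i + 1) % m  # step of 1: collided keys extend the same cluster
    table[i] = key
    return i

table = [None] * 11
# 11, 22, 33 all hash to slot 0 and pile up into the cluster 0, 1, 2
positions = [insert_linear(table, k) for k in (11, 22, 33)]
```

As such clusters grow, any key hashing anywhere inside one must probe to its far end, so average probe counts degrade well before the table is full.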
What is the significance of the output size of a cryptographic hash function?
Determines the speed of hash computation
Affects the uniqueness of the hash for different inputs
Influences the memory required to store the resulting hash values
Impacts the resistance against brute-force attacks
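The output size can be inspected directly with the standard `hashlib` module. An n-bit digest costs an attacker roughly 2^n guesses to invert by brute force (and about 2^(n/2) to find a collision, by the birthday bound), so a larger output size directly raises attack cost:

```python
import hashlib

# digest_size is in bytes; multiply by 8 for the bit length that
# determines brute-force resistance
sha256_bits = hashlib.sha256(b"example").digest_size * 8
md5_bits = hashlib.md5(b"example").digest_size * 8
```

This is one reason MD5 (128-bit output) is considered far weaker than SHA-256 (256-bit output), independent of MD5's known structural breaks.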
In a hash table using separate chaining for collision resolution, what is the worst-case time complexity for searching for an element?
O(n log n)
O(log n)
O(n)
O(1)
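The worst case arises when every key lands in the same bucket, reducing the table to one long chain. The sketch below forces this with a deliberately degenerate hash function (the lambda returning 0 is the pathological assumption):

```python
class ChainedTable:
    """Separate-chaining hash table sketch with a pluggable hash function."""
    def __init__(self, m, h):
        self.buckets = [[] for _ in range(m)]
        self.h = h

    def insert(self, key, value):
        self.buckets[self.h(key) % len(self.buckets)].append((key, value))

    def search(self, key):
        # Worst case: scans the entire chain in one bucket -> O(n)
        for k, v in self.buckets[self.h(key) % len(self.buckets)]:
            if k == key:
                return v
        return None

t = ChainedTable(8, lambda k: 0)  # pathological hash: everything collides
for k in range(5):
    t.insert(k, k * k)
```

With a good hash function and a bounded load factor, the expected chain length is constant, which is where the usual O(1) average case comes from.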
In a hash table using open addressing with quadratic probing, if the initial hash function maps a key to index 'i', and a collision occurs, what is the index probed in the second attempt (assuming table size 'm')?
(i * 2) % m
(i + 4) % m
(i + 2) % m
(i + 1) % m
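The standard quadratic probe sequence is (i + j²) % m for attempt j = 0, 1, 2, …, so after the initial collision at i, the next probes land at (i + 1) % m and then (i + 4) % m. A one-line sketch:

```python
def quadratic_probes(i, m, attempts):
    """Indices visited by quadratic probing from initial slot i."""
    return [(i + j * j) % m for j in range(attempts)]
```

The quadratically growing step is what breaks up the primary clustering that linear probing suffers from, though keys with the same initial slot still share a probe sequence (secondary clustering).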
When does rehashing typically occur in a hashmap?
When the load factor exceeds a predetermined threshold.
Every time a new key is inserted.
When the hash function is modified.
When the hashmap is cleared using the clear() method.
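The load-factor trigger can be sketched in a small chaining map. The 0.75 threshold below mirrors the default of `java.util.HashMap`; the class and method names are illustrative:

```python
class SimpleMap:
    """Chaining hashmap sketch that rehashes when load factor exceeds 0.75."""
    def __init__(self):
        self.m = 4
        self.buckets = [[] for _ in range(self.m)]
        self.size = 0

    def put(self, key, value):
        self.buckets[hash(key) % self.m].append((key, value))
        self.size += 1
        if self.size / self.m > 0.75:   # load factor check on every insert
            self._rehash()

    def get(self, key):
        for k, v in self.buckets[hash(key) % self.m]:
            if k == key:
                return v
        return None

    def _rehash(self):
        # Double the bucket count and re-insert every entry under the
        # new modulus, since hash(key) % m changes when m does.
        self.m *= 2
        old, self.buckets = self.buckets, [[] for _ in range(self.m)]
        for bucket in old:
            for k, v in bucket:
                self.buckets[hash(k) % self.m].append((k, v))
```

Rehashing is O(n) for that one insert, but because it at most doubles the table, the amortized cost per insertion remains O(1).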
How does the choice of a hash function impact the performance of a hashmap?
A simple hash function is always preferred as it reduces computational overhead.
A well-chosen hash function minimizes collisions, leading to faster lookups and insertions.
The hash function has a negligible impact on performance compared to the data structure itself.
A complex hash function guarantees a lower collision rate, improving performance.
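The effect of hash quality on collisions can be shown empirically. Below, a weak hash (last decimal digit) collapses 100 multiples of 10 into a single bucket, while an MD5-based hash, used here purely as a stand-in for a well-spread hash function, distributes them across many buckets:

```python
import hashlib

keys = [10 * k for k in range(100)]   # every key ends in 0
m = 10

# Weak hash: last digit only -> all keys collide into one bucket
weak_buckets = len({k % m for k in keys})

# Well-spread hash: digest of the key's string form, reduced mod m
better_buckets = len({
    int(hashlib.md5(str(k).encode()).hexdigest(), 16) % m
    for k in keys
})
```

Note the counterpoints to the distractors: a simpler hash is not always better (as the weak hash shows), and complexity alone guarantees nothing; what matters is how evenly the function spreads the actual key distribution.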
What is a significant disadvantage of using a fixed-size hash table in conjunction with a hash function prone to collisions?
Increased memory usage due to the fixed size allocation
Degraded performance due to chaining or open addressing for collision resolution
Complexity in implementing the hash function itself
Inability to store data that exceeds the pre-defined table size
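The degradation is easy to quantify: with a fixed bucket count m, average chain length grows as n/m, so lookup cost drifts from O(1) toward O(n) as entries accumulate. A minimal illustration with m fixed at 8:

```python
m = 8                                  # fixed table size, never resized
buckets = [[] for _ in range(m)]
for k in range(800):
    buckets[k % m].append(k)           # chaining for collision resolution

# Every bucket holds 100 entries, so a search may scan up to 100 items
longest_chain = max(len(b) for b in buckets)
```

This is precisely the problem rehashing solves: growing the table keeps the load factor, and therefore the expected chain length, bounded.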
How are deletions typically handled in a hashmap with open addressing to avoid creating 'holes' that disrupt search operations?
By marking the slot as "deleted" and implementing a mechanism to handle such markers during search and insertion.
Deletions are not allowed in hashmaps with open addressing.
By simply removing the element, leaving the slot empty.
By shifting all subsequent elements one position back to fill the gap.
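The tombstone technique can be sketched with two sentinel values. A deleted slot must stay distinguishable from a never-used one, or a later search would stop at the gap and miss keys that were inserted by probing past it. The class below is a minimal linear-probing sketch for integer keys:

```python
EMPTY, DELETED = object(), object()    # DELETED is the "tombstone" marker

class OpenTable:
    """Open-addressing (linear probing) sketch with tombstone deletion."""
    def __init__(self, m):
        self.slots = [EMPTY] * m

    def _probe(self, key):
        m = len(self.slots)
        i = key % m
        for _ in range(m):
            yield i
            i = (i + 1) % m

    def insert(self, key):
        for i in self._probe(key):
            # Insertion may reuse tombstoned slots
            if self.slots[i] is EMPTY or self.slots[i] is DELETED:
                self.slots[i] = key
                return

    def delete(self, key):
        for i in self._probe(key):
            if self.slots[i] is EMPTY:
                return                  # key was never here
            if self.slots[i] == key:
                self.slots[i] = DELETED # tombstone, NOT EMPTY
                return

    def contains(self, key):
        for i in self._probe(key):
            if self.slots[i] is EMPTY:  # only a true empty slot ends the search
                return False
            if self.slots[i] == key:
                return True             # search skips over tombstones
        return False
```

If `delete` set the slot back to `EMPTY` instead, `contains(9)` in the test below would wrongly return False: the search would terminate at the hole before reaching 9's actual slot.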