I do not understand why a hashtable's rehash complexity may be quadratic in the worst case, as stated at:
http://www.cplusplus.com/reference/unordered_set/unordered_multiset/reserve/
Any help would be appreciated!
Thanks
Just some basics:
A hash collision is when two or more elements take on the same hash. This can cause worst-case O(n) operations.
I won't go into this much further, since one can find many explanations of it. Basically, all the elements can have the same hash, in which case you'll have one big linked list at that hash containing all your elements (and search on a linked list is of course O(n)).
It doesn't have to be a linked list, but most implementations do it this way.
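To make the degenerate case concrete, here is a minimal sketch (the BadHash functor and the element count are my own illustration, not something from the linked page) that forces every element of an unordered_multiset into a single bucket:

```cpp
#include <cstddef>
#include <iostream>
#include <unordered_set>

// Hypothetical worst-case hash: every key maps to the same value,
// so every element lands in the same bucket.
struct BadHash {
    std::size_t operator()(int) const { return 0; }
};

int main() {
    std::unordered_multiset<int, BadHash> s;
    for (int i = 0; i < 1000; ++i)
        s.insert(i);
    // All 1000 elements sit in the bucket for hash 0, so any lookup
    // degenerates into a linear scan of that one chain: O(n).
    std::cout << "bucket " << s.bucket(42) << " holds "
              << s.bucket_size(s.bucket(42)) << " elements\n";
}
```

With every element in one chain, find, count, and equal_range all degenerate into linear scans of that chain.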
A rehash creates a new hash table with the required size and basically does an insert for each element in the old table (there may be a slightly better way, but I'm sure most implementations don't beat the asymptotic worst-case complexity of simple inserts).
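In code, the naive rehash described above might look like this (a simplified sketch under assumed data structures, with a vector of forward_lists standing in for the bucket array; no real standard-library implementation is being quoted):

```cpp
#include <cstddef>
#include <forward_list>
#include <functional>
#include <vector>

// Simplified sketch of a naive rehash: allocate the new bucket array,
// then re-insert every element one at a time.
template <typename T, typename Hash = std::hash<T>>
void naive_rehash(std::vector<std::forward_list<T>>& buckets,
                  std::size_t new_count) {
    std::vector<std::forward_list<T>> fresh(new_count);
    Hash hasher;
    for (auto& bucket : buckets)
        for (auto& elem : bucket)
            // Re-inserting at the front is O(1) here, but a multiset-like
            // container must also keep equal values adjacent, which is what
            // forces the per-insert scan described below.
            fresh[hasher(elem) % new_count].push_front(elem);
    buckets = std::move(fresh);
}
```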
In addition to the above, it all comes down to this statement (from here1):
"Elements with equivalent values are grouped together in the same bucket and in such a way that an iterator (see equal_range) can iterate through all of them."
So all elements with equivalent values need to be grouped together. For this to hold, when doing an insert, you first have to check whether there exist other elements with the same value. Consider the case where all the values take on the same hash. In this case, you'll have to look through the above-mentioned linked list for these elements. So that's n insertions, looking through 0, then 1, then 2, ..., then n-1 elements, which is 0 + 1 + 2 + ... + (n-1) = n(n-1)/2 = O(n²).
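You can observe that sum experimentally. In this sketch (my own experiment; SameHash and CountingEq are hypothetical helpers, and the exact comparison count is implementation-specific), a counting equality predicate shows the number of comparisons growing roughly as n*(n-1)/2 when every key hashes to the same bucket:

```cpp
#include <cstddef>
#include <iostream>
#include <unordered_set>

// Every key hashes to 0, so all elements share one bucket.
struct SameHash {
    std::size_t operator()(int) const { return 0; }
};

// Equality predicate that counts how many comparisons the container makes.
static long comparisons = 0;
struct CountingEq {
    bool operator()(int a, int b) const { ++comparisons; return a == b; }
};

int main() {
    std::unordered_multiset<int, SameHash, CountingEq> s;
    const long n = 1000;
    for (int i = 0; i < n; ++i)
        s.insert(i);
    // Each insert scans the whole chain looking for an equal element,
    // so the total should be in the ballpark of n*(n-1)/2.
    std::cout << "comparisons: " << comparisons
              << ", n*(n-1)/2 = " << n * (n - 1) / 2 << '\n';
}
```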
Can't you optimize this to O(n)? To me it makes sense that you may be able to, but even if so, this doesn't mean that all implementations have to do it this way. When using hash tables, it's generally assumed that there won't be too many collisions (even if this assumption is naive), thus avoiding the worst-case complexity and reducing the need for the additional complexity required to make a rehash not take O(n²).
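For what it's worth, here is one plausible way a linear rehash could work (purely my assumption; the standard doesn't require this and I'm not describing any real library): since equal elements are already adjacent in the old table, each maximal run of equal elements can be moved into its new bucket as a unit, so no per-element search for its group is needed:

```cpp
#include <cstddef>
#include <forward_list>
#include <functional>
#include <vector>

// Hypothetical O(n) rehash sketch: move each maximal run of equal
// elements to its new bucket as one unit. Equal elements share a hash,
// so the whole run maps to the same new bucket and stays contiguous.
template <typename T, typename Hash = std::hash<T>>
void linear_rehash(std::vector<std::forward_list<T>>& buckets,
                   std::size_t new_count) {
    std::vector<std::forward_list<T>> fresh(new_count);
    Hash hasher;
    for (auto& bucket : buckets) {
        auto it = bucket.begin();
        while (it != bucket.end()) {
            // Collect the maximal run of elements equal to *it.
            std::forward_list<T> run;
            auto tail = run.before_begin();
            const T& key = *it;
            const std::size_t h = hasher(key) % new_count;
            while (it != bucket.end() && *it == key) {
                tail = run.insert_after(tail, *it);
                ++it;
            }
            // Prepend the whole run to its new bucket; the run stays
            // contiguous, so equal elements remain grouped.
            fresh[h].splice_after(fresh[h].before_begin(), run);
        }
    }
    buckets = std::move(fresh);
}
```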
1: To all the possible haters, sorry for quoting CPlusPlus instead of CPPReference (for everyone else - CPlusPlus is well-known for being wrong), but I couldn't find this information there (so, of course, it could be wrong, but I'm hoping it isn't, and it does make sense in this case).
unordered_multiset: "The iteration order of this container is not required to be stable, except that every group of elements whose keys compare equivalent forms a contiguous subrange in the iteration order, also accessible with equal_range()."