Python: delete element from heap
Python has the heapq module, which implements a heap data structure and supports some basic operations (push, pop).

How do I remove the i-th element from the heap in O(log n)? Is it even possible with heapq, or do I have to use another module?

Note that there is an example at the bottom of the documentation (http://docs.python.org/library/heapq.html) which suggests a possible approach, but it is not what I want: I want to actually remove the element, not merely mark it as removed.

Sedan answered 15/4, 2012 at 14:0 Comment(0)
You can remove the i-th element from a heap quite easily:

h[i] = h[-1]
h.pop()
heapq.heapify(h)

Just replace the element you want to remove with the last element, remove the last element, and then re-heapify the heap. This is O(n); if you want, you can do the same thing in O(log n), but you'll need to call a couple of the internal heapify functions, or better, as larsmans pointed out, just copy the source of _siftup/_siftdown out of heapq.py into your own code:

h[i] = h[-1]
h.pop()
if i < len(h):
    heapq._siftup(h, i)
    heapq._siftdown(h, 0, i)

Note that in either case you can't just do h[i] = h.pop(), as that would fail if i references the last element. If you special-case removing the last element then you could combine the overwrite and pop.
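For convenience, the snippet above can be wrapped into one helper. This is a sketch: heap_delete is my own name for it, and it relies on the private heapq._siftup/_siftdown functions, which could change between Python versions:

```python
import heapq

def heap_delete(h, i):
    """Remove the element at index i from heap h in O(log n)."""
    h[i] = h[-1]                   # overwrite with the last element
    h.pop()                        # drop the (now duplicated) last element
    if i < len(h):                 # nothing to fix if we removed the last one
        heapq._siftup(h, i)        # may move the new h[i] toward the leaves
        heapq._siftdown(h, 0, i)   # ...or toward the root

h = [1, 3, 2, 7, 4, 5]
heapq.heapify(h)
heap_delete(h, 2)                  # h is a valid heap again, without the 2
```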

Note that depending on the typical size of your heap you might find that just calling heapify, while theoretically less efficient, could be faster than re-using _siftup/_siftdown: a little introspection will reveal that heapify is probably implemented in C, while the C implementations of the internal functions aren't exposed. If performance matters to you then consider doing some timing tests on typical data to see which is best. Unless you have really massive heaps, big-O may not be the most important factor.
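As a starting point for such a timing test, something like the following would compare the two approaches (the sizes, iteration counts, and index are arbitrary choices of mine):

```python
import heapq
import random
import timeit

def delete_heapify(h, i):
    h[i] = h[-1]
    h.pop()
    heapq.heapify(h)             # O(n), but implemented in C

def delete_sift(h, i):
    h[i] = h[-1]
    h.pop()
    if i < len(h):
        heapq._siftup(h, i)      # O(log n), but pure-Python internals
        heapq._siftdown(h, 0, i)

base = [random.random() for _ in range(10_000)]
heapq.heapify(base)

for fn in (delete_heapify, delete_sift):
    t = timeit.timeit(lambda: fn(base.copy(), 123), number=200)
    print(f"{fn.__name__}: {t:.4f}s for 200 deletions")
```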

Edit: someone tried to edit this answer to remove the call to _siftdown with a comment that:

_siftdown is not needed. New h[i] is guaranteed to be the smallest of the old h[i]'s children, which is still larger than old h[i]'s parent (new h[i]'s parent). _siftdown will be a no-op. I have to edit since I don't have enough rep to add a comment yet.

What they've missed in this comment is that h[-1] might not be a child of h[i] at all. The new value inserted at h[i] could come from a completely different branch of the heap so it might need to be sifted in either direction.

Also, to the comment asking why not just use sort() to restore the heap: _siftup and _siftdown are both O(log n) operations, heapify is O(n), and sort() is O(n log n). It is quite possible that calling sort will be fast enough, but for large heaps it is unnecessary overhead.

Edited to avoid the issue pointed out by @Seth Bruder: when i references the last element, the _siftup() call would fail, but in that case popping an element off the end of the heap doesn't break the heap invariant.

Note: someone suggested an edit (rejected before I got to it) changing heapq._siftdown(h, 0, i) to heapq._siftdown(h, 0, len(h)). This would be incorrect, as the final parameter of _siftdown() is the position of the element to move, not a limit on where to move it. The second parameter, 0, is the limit.

Sifting up temporarily removes the new item from the specified position and repeatedly moves the smallest child of that position up, until all smaller children have been moved up and a leaf is left empty. The removed item is then inserted into the empty leaf node, and _siftdown() is called to move it below any larger parent nodes. The catch is that the call to _siftdown() inside _siftup() uses its second parameter to terminate the sift at the original position. The extra call to _siftdown() in the code I gave continues the sift down as far as the root of the heap; it only does something if the new element actually needs to be further down than the position where it was inserted.

For the avoidance of doubt: sift up moves to higher indexes in the list. Sift down moves to lower indexes i.e. earlier in the list. The heap has its root at position 0 and its leaves at higher numbers.
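To make that layout concrete, here is a small sketch of the index arithmetic (the helper names are my own):

```python
# In a list-based heap the tree structure is implicit in the indexes:
# the root is at 0, and the children of node i are at 2*i + 1 and 2*i + 2.
def parent(i):
    return (i - 1) // 2

def children(i):
    return 2 * i + 1, 2 * i + 2

h = [0, 1, 2, 3, 4, 5, 6]  # a valid min-heap
for i in range(1, len(h)):
    assert h[parent(i)] <= h[i]  # the heap invariant, index by index
```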

Sectional answered 15/4, 2012 at 15:35 Comment(15)
+1, with the side note that it would be cleaner to copy the definition of _siftup into the program as recommended by @AlexMartelli, here.Shipboard
Thanks, _siftup looks definitely interesting! Btw., why pop(-1), instead of just pop()?Sedan
@EcirHana just because I can't remember the default off the top of my head. I've tidied it up.Sectional
@Sectional I have a doubt here: I am trying to implement a decreaseKey operation on a priority queue. Your method assumes the caller has the index (i) of the item to be deleted. If I have just the element, not the index, how can it be done?Klaus
@Klaus If you don't have an index then I think the best you can do is use heapify(q) to restore the heap ordering in linear time.Sectional
@Sectional I was thinking whether it is possible if I keep a dict alongside my heap which will point to the index in heap? But updating such dict with every insert becomes bottleneck. Is there anyway in which we can overcome it? The heap I am building is quite large and I really need O(log n) time decreasekey operation.Klaus
@dano I guess in the _siftup() example, there's a mistake. If I removed last item, you are just replacing last item with itself then you pop it out and then when you'll call _siftup(), it will give array out of bound. We can't call sift up in that case. Am I right?Klaus
Since you don't know whether the new h[i] will be greater or smaller than its parents or children, you also need to call heapq._siftdown(h, 0, i) before or after calling _siftupEthmoid
Why not use: h.sort() instead of siftup/siftdown methods in order to maintain the heap invariant? Seems to be preferred way to do it according to the docs: https://docs.python.org/3.5/library/heapq.html?highlight=heap#module-heapq Also I find it cleaner to just del h[i] instead of playing with reference/popNicoline
@Nicoline Because of performance. sort and del are both much slower than the above method.Countersubject
@Sectional I think the point by @Ethmoid still stands: as it is now, the index argument to _siftup() may index the element that was just removed by pop(), causing _siftup() to throw.Macintyre
@SethBruder, good catch. Yes, the _siftup would indeed throw, but if you remove the very last element you don't need to do either _siftup or _siftdown. Updated the answer accordingly.Sectional
can you explain why the first approach is O(n)? I thought heapify is O(log n), thanksGahl
@Gahl heapq.heapify() is O(n) because it doesn't know which element is out of order so it will scan every element in the heap. Calling _siftup() and _siftdown() start at the element that may be out of position and don't consider other parts of the heap, so that is O(log n).Sectional
You are right. Here are two example of maximum heap: 1. [15,5,11,4,3,10,9,2,1,0,-1,8] when deleting 4, it has to siftup 2. [15,5,11,4,3,10,9,2,1,0,-1,-3] when deleting 4, it has to siftdownInexhaustible
(a) Consider why you don't want to lazy delete. It is the right solution in a lot of cases.

(b) A heap is a list. You can delete an element by index, just like any other list, but then you will need to re-heapify it, because it will no longer satisfy the heap invariant.
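For example, a minimal sketch of (b) using heapq:

```python
import heapq

h = [1, 4, 2, 8, 5, 7]
heapq.heapify(h)

del h[3]          # delete by index, just like any other list...
heapq.heapify(h)  # ...then re-heapify in O(n) to restore the invariant

print(h[0])       # prints 1, still the smallest element
```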

Fondness answered 15/4, 2012 at 14:3 Comment(5)
could you add some reference for (b) ?Mccallister
@Mccallister Which part of b? You can look at the type of an object in your interpreter, or read the documentation that OP links to; as to needing to re-heapify, this is a consequence of the fact that such an operation leads to a list that violates the heap invariant (also given in that documentation).Fondness
(a) - lazy delete is perfectly valid, I just would like to understood the heaps better. (b) I'm interested in at least O(log n), heapify is O(n)Sedan
lazy delete is a genius way to get around O(N) delete cost for heaps.Vellavelleity
for anyone wondering what a 'lazy delete' is you can find the article below but essentially in this case you mark an element as 'deleted' in a key value store but don't actually remove it from the heap as that would require O(n) time. Then when you are using the heap you can check that key value store if the node you are looking at is marked as deleted. It's used for hash tables but can be used here as well en.wikipedia.org/wiki/Lazy_deletionCommonly
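To illustrate, here is a minimal sketch of the lazy-deletion idea (the class and its names are my own, not from the linked article):

```python
import heapq
from collections import Counter

class LazyHeap:
    """Min-heap with O(log n) amortized delete via lazy removal."""
    def __init__(self):
        self.heap = []
        self.removed = Counter()   # values marked as deleted

    def push(self, x):
        heapq.heappush(self.heap, x)

    def remove(self, x):
        self.removed[x] += 1       # just mark; don't touch the heap

    def pop(self):
        # Discard any entries at the top that were marked as removed.
        while self.heap and self.removed[self.heap[0]]:
            self.removed[heapq.heappop(self.heap)] -= 1
        return heapq.heappop(self.heap)

h = LazyHeap()
for x in (5, 1, 4, 2):
    h.push(x)
h.remove(1)
print(h.pop())   # prints 2: the marked 1 is skipped
```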
Implemented an OOP example of a heap which supports removal by index. Essentially the same as in the accepted answer, just a happy-path class example.

Hope this will be useful to someone who looks here later.

class RemoveByIndexHeap:
    """Min-heap stored in a list, supporting removal by index."""

    def __init__(self):
        self.heap = []

    def _swap(self, i: int, j: int):
        self.heap[i], self.heap[j] = self.heap[j], self.heap[i]

    def _sink(self, i: int):
        # Move the element at i toward the leaves while a child is smaller.
        while i < self.size():
            swap_with = i
            if i * 2 + 1 < self.size() and self.heap[swap_with] > self.heap[i * 2 + 1]:
                swap_with = i * 2 + 1
            if i * 2 + 2 < self.size() and self.heap[swap_with] > self.heap[i * 2 + 2]:
                swap_with = i * 2 + 2
            if swap_with == i:
                break
            self._swap(i, swap_with)
            i = swap_with

    def _swim(self, i: int):
        # Move the element at i toward the root while its parent is larger.
        while i > 0:
            swap_with = i
            if self.heap[swap_with] < self.heap[(i - 1) // 2]:
                swap_with = (i - 1) // 2
            if swap_with == i:
                break
            self._swap(i, swap_with)
            i = swap_with

    def add(self, obj):
        self.heap.append(obj)
        self._swim(self.size() - 1)

    def remove(self, index: int):
        # Swap with the last element, pop it, then restore the invariant.
        self._swap(index, self.size() - 1)
        self.heap.pop()
        if index != self.size():  # nothing to fix if we removed the last one
            self._sink(index)
            self._swim(index)

    def get_top(self):
        if not self.heap:
            return None
        return self.heap[0]

    def size(self):
        return len(self.heap)
Chin answered 29/10, 2023 at 21:24 Comment(0)
