Using a custom allocator to make std::list cache friendly?
During my daily work, I am always advised by a senior member of the team that std::list is not cache friendly and that I should use std::vector instead. I understand that a list is not contiguous, so its nodes end up scattered across memory.

However, I very often do need the functionality of a list (or a map). So I am wondering if I can write my own allocator that uses a vector underneath: every time I push_back, the allocator hands out a new node from a pre-allocated vector.

When I traverse the list/map, cache locality would then be preserved.
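Roughly, the idea looks like the sketch below (just an illustration, not production code: Arena and ArenaAllocator are made-up names, deallocation is a no-op, and the alignment handling assumes the underlying buffer itself is suitably aligned, which operator new guarantees for fundamental alignments):

```cpp
#include <cstddef>
#include <list>
#include <new>
#include <vector>

// All allocations come out of one pre-allocated buffer (the "vector underneath"),
// so consecutive list nodes end up next to each other in memory.
struct Arena {
    explicit Arena(std::size_t bytes) : buf(bytes) {}
    void* allocate(std::size_t bytes, std::size_t align) {
        std::size_t p = (used + align - 1) / align * align;  // round offset up to alignment
        if (p + bytes > buf.size()) throw std::bad_alloc{};
        used = p + bytes;
        return buf.data() + p;
    }
    std::vector<char> buf;
    std::size_t used = 0;
};

template <class T>
struct ArenaAllocator {
    using value_type = T;
    explicit ArenaAllocator(Arena& a) : arena(&a) {}
    template <class U>
    ArenaAllocator(const ArenaAllocator<U>& other) : arena(other.arena) {}
    T* allocate(std::size_t n) {
        return static_cast<T*>(arena->allocate(n * sizeof(T), alignof(T)));
    }
    void deallocate(T*, std::size_t) noexcept {}  // memory is reclaimed when the arena dies
    Arena* arena;
};

template <class T, class U>
bool operator==(const ArenaAllocator<T>& a, const ArenaAllocator<U>& b) { return a.arena == b.arena; }
template <class T, class U>
bool operator!=(const ArenaAllocator<T>& a, const ArenaAllocator<U>& b) { return !(a == b); }

int main() {
    Arena arena(1 << 20);                              // reserve 1 MiB up front
    ArenaAllocator<int> alloc(arena);
    std::list<int, ArenaAllocator<int>> l(alloc);
    for (int i = 0; i < 1000; ++i) l.push_back(i);     // nodes are packed into the arena
}
```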

Does this make sense to any of you?

Euripides answered 29/3, 2017 at 21:15 Comment(11)
std::list isn't an associative container.Kyanite
What you're looking for is called a stack allocator.Streamway
Obvious question: why not just use a vector? What does this construction give you that a vector doesn't? If you try to use any of the functionality that's ordinarily more efficient on a list than a vector, you'll leak memory.Berty
I have a roughly fixed-size list, and I constantly need to insert/erase in the middle of it. I don't think this can easily be replaced by a vector.Euripides
@Berty This can be done without leaking memory if the list can see the elements in a different order than the vector sees the list nodes.Nardoo
The value add of list over vector is O(1) insert and remove. I can't think of a way to get this and guarantee decent spatial locality.Paradox
Do some benchmarks. For 500,000 elements, vector outperformed list (and map) on random inserts and deletes.Bradski
See the video linked in this question #24543436Bradski
I think the main reason you'd use std::list over std::vector is that removing from the middle of std::list doesn't invalidate iterators. If you don't need that property of lists then I'd suggest using a vector instead.Byrnie
With so many comments, why does nobody give a proper answer?Euripides
@Euripides your collection can't support insert and remove and stay fixed length; if an element is added or removed, the collection grows or shrinks. If you reserve an appropriately sized array, you can build a cache-friendly list-like structure, but the details of arranging and finding the elements depend strongly on which operations you want to optimize. That said, your seniors need to wait until the product is profiled, or you will end up spending unnecessary time and effort deciding and refining those details. Unless you really have a boat, don't re-invent the wheel.Injurious

std::list and std::set (I believe you need a set as an alternative to a list, not a map) will both use an allocator for their internals. You can preallocate a block of memory and use it to construct your objects and containers; if you google, you will find several such allocators. In that case your objects, instead of being "scattered around the whole memory", will be scattered around your block of memory. If the block fits in the cache, you will get some improvement, but it will not completely solve your problem.
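If a standard-library option is acceptable, C++17's polymorphic allocators already provide this "allocate everything from one block" behaviour; a minimal sketch (buffer size chosen arbitrarily):

```cpp
#include <cstddef>
#include <list>
#include <memory_resource>

int main() {
    // One up-front buffer; the list's nodes are carved out of it sequentially,
    // so they sit close together in memory.
    static std::byte buffer[1 << 16];                                // 64 KiB block
    std::pmr::monotonic_buffer_resource pool(buffer, sizeof buffer);

    std::pmr::list<int> l(&pool);   // std::list<int, std::pmr::polymorphic_allocator<int>>
    for (int i = 0; i < 1000; ++i)
        l.push_back(i);
    // Deallocation from a monotonic_buffer_resource is a no-op; once the buffer
    // is exhausted, the pool falls back to its upstream resource (new/delete by default).
}
```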

From the description of the problem, you may really want a deque. A deque is implemented as a sequence of fixed-size arrays, so it is a compromise between a vector and a list: it is cache friendly for iteration and faster than a vector when you insert at the front or back.
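For illustration, a quick sketch of the deque trade-off (the element counts here are arbitrary):

```cpp
#include <cstdio>
#include <deque>

int main() {
    std::deque<int> d;
    for (int i = 0; i < 100000; ++i)
        d.push_back(i);                 // elements live in fixed-size blocks, mostly contiguous

    d.insert(d.begin() + 50000, -1);    // insert in the middle is still O(n), like vector
    d.erase(d.begin() + 100);           // erase in the middle also shifts elements

    long long sum = 0;
    for (int x : d)                     // iteration walks contiguous blocks: cache friendly
        sum += x;
    std::printf("%lld\n", sum);
}
```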

So you can choose either a custom allocator or a deque, depending on your collection size.


Bdellium answered 5/4, 2017 at 21:4 Comment(0)
