There is a great answer regarding the generation of set partitions:
(another similar answer with ruby is found here: Calculate set partitions with specific subset sizes which also has a picture)
The problem is that it quickly becomes slow as N increases; the slowdown is noticeable from around N = 12 or 13.
CONTEXT:
I need set partitions to fit a collection of images onto either a portrait or landscape sheet of paper, reducing blank space as much as possible.
I tried bin packing algorithms like the ones answered here: How is 2D bin packing achieved programmatically?
but they generate bad solutions (I posted a comment explaining why, with a screenshot). Other bin packing solutions are too complex.
A good solution I found is to scale sets of images so they have the same height, put them in a row, and assemble several rows on a landscape or portrait layout. Description here:
I filter out candidates that:
scale images up or down too much, since I want each image to be allocated a roughly equal surface
waste too much space
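The row-assembly idea above can be sketched with a small helper. This is a minimal illustration, not the actual code from the question: it assumes each image is described only by its aspect ratio (width / height), and uses the fact that a row of images scaled to a common height h has total width h * sum(aspect ratios), so h follows directly from the page width. The function names are hypothetical.

```python
# Hypothetical sketch of the row layout: images in one row share a common
# height, chosen so the row exactly fills the page width.

def row_height(aspect_ratios, page_width):
    """Common height for one row of images scaled to fill `page_width`.

    A row of images with aspect ratios a_i, all scaled to height h,
    has total width h * sum(a_i); solving for h gives the row height.
    """
    return page_width / sum(aspect_ratios)

def layout_height(rows, page_width):
    """Total height of stacked rows (each row is a list of aspect ratios).

    Comparing this against the page height shows how much blank space
    a candidate partition of the images would waste.
    """
    return sum(row_height(row, page_width) for row in rows)
```

A candidate partition (one subset of images per row) can then be scored by how close `layout_height` comes to the page height without exceeding it.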
END OF CONTEXT
This technique works well for up to 10 or 11 images, but at 12 and beyond it's too slow.
The set_partitions() function from more-itertools also works, but it has the same speed problem.
A lot of partitions have subsets of size 1, or too few large subsets. As you can see in the image, I don't need singletons (subsets of size 1), and I need partitions with a minimum number of subsets (that minimum isn't exactly the square root of N; it depends on whether the layout is landscape or portrait).
My question is this:
Is it possible to speed up set partition generation and directly skip:
- set partitions with subsets of size SMALLER than X
AND/OR
- set partitions with fewer than Y subsets
and possibly also:
- set partitions with subsets of size LARGER than X
or possibly
- set partitions with MORE than Y subsets
Those X and Y values can vary, since I want to generate these set partitions for different values of N.
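One way to get these constraints "for free" is to prune during generation rather than filtering afterwards. The following is a sketch of that idea, not an existing library function: it uses the standard recursion (each element joins an existing block or opens a new one, so each partition is produced exactly once) and cuts branches that can no longer satisfy the size and count bounds. The feasibility prune checks that the remaining elements are enough to bring every undersized block up to `min_size` and to open any still-missing blocks.

```python
def constrained_partitions(items, min_size=1, max_size=None,
                           min_parts=1, max_parts=None):
    """Yield set partitions of `items` whose blocks all have a size in
    [min_size, max_size] and whose block count is in [min_parts, max_parts].

    Infeasible branches are pruned during the recursion instead of
    generating all partitions and filtering them afterwards.
    """
    n = len(items)
    if max_size is None:
        max_size = n
    if max_parts is None:
        max_parts = n

    def rec(i, blocks):
        if i == n:
            if len(blocks) >= min_parts and all(len(b) >= min_size for b in blocks):
                yield [list(b) for b in blocks]
            return
        remaining = n - i
        # Lower bound on elements still needed to make the partition legal:
        # fill undersized blocks, plus open enough new blocks of min_size.
        deficit = sum(max(0, min_size - len(b)) for b in blocks)
        deficit += max(0, min_parts - len(blocks)) * min_size
        if deficit > remaining:
            return  # prune: not enough elements left
        # Put items[i] into an existing block (if that block is not full).
        for b in blocks:
            if len(b) < max_size:
                b.append(items[i])
                yield from rec(i + 1, blocks)
                b.pop()
        # Or open a new block (if the part budget allows it).
        if len(blocks) < max_parts:
            blocks.append([items[i]])
            yield from rec(i + 1, blocks)
            blocks.pop()

    yield from rec(0, [])
```

With `min_size=2` the singletons never get generated at all, which is where most of the savings comes from: every pruned branch cuts off an entire subtree of partitions.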
EDIT: I also found this: https://codereview.stackexchange.com/questions/1526/finding-all-k-subset-partitions
This seems to be somewhat fast.
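Fixing the number of subsets, as in that k-subset approach, shrinks the search space a lot: the number of partitions of n elements into exactly k blocks is the Stirling number of the second kind S(n, k), whereas unrestricted generators walk the full Bell number B(n), the sum of S(n, k) over all k. A quick way to see the sizes involved (just a counting helper, not a generator):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling number of the second kind: partitions of n items into k blocks."""
    if k == 0:
        return 1 if n == 0 else 0
    if k > n:
        return 0
    # The last element either forms its own block (S(n-1, k-1))
    # or joins one of the k existing blocks (k * S(n-1, k)).
    return stirling2(n - 1, k - 1) + k * stirling2(n - 1, k)
```

Comparing S(n, k) for the few k values that a given paper orientation allows against B(n) gives a rough estimate of how much work the k-restricted generator avoids.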