How to use palettes in SDL 2

I'm updating a program from SDL 1 to SDL 2 and need to use color palettes. Originally, I used SDL_SetColors(screen, color, 0, intColors); but that doesn't work in SDL 2. I'm trying to use:

SDL_Palette *palette = (SDL_Palette *)malloc(sizeof(color)*intColors);
SDL_SetPaletteColors(palette, color, 0, intColors);
SDL_SetSurfacePalette(surface, palette);

But SDL_SetPaletteColors() returns -1 and fails. SDL_GetError gives me no information.

How can I make a palette from an array of SDL_Color values and then set it as my surface's palette?

Brace answered 13/4, 2015 at 15:43

It's hard to tell what your variables are and how you intend to use them without seeing your declarations.

Here's how I set up a grayscale palette in SDL_gpu:

SDL_Color colors[256];
int i;

/* Build a linear grayscale ramp: palette index i maps to gray level i. */
for(i = 0; i < 256; i++)
{
    colors[i].r = colors[i].g = colors[i].b = (Uint8)i;
}

#ifdef SDL_GPU_USE_SDL2
/* SDL2: fill the palette that the 8-bit surface already owns. */
SDL_SetPaletteColors(result->format->palette, colors, 0, 256);
#else
/* SDL 1.2: the old per-surface logical palette call. */
SDL_SetPalette(result, SDL_LOGPAL, colors, 0, 256);
#endif

The result SDL_Surface already has a palette because it has an 8-bit pixel depth (see the note at https://wiki.libsdl.org/SDL_Palette).
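
If you are not using SDL_gpu, the same idea in plain SDL2 might look like the sketch below; the function name, dimensions, and error handling are placeholders. The key point is that an 8-bit surface created with SDL_CreateRGBSurface already owns a 256-entry palette, so you fill that palette rather than malloc'ing an SDL_Palette yourself:

#include <SDL.h>

/* Sketch with placeholder names and sizes: create an 8-bit surface
   and give it a grayscale palette using plain SDL2. */
SDL_Surface *make_grayscale_surface(int w, int h)
{
    /* Depth 8 with zero masks yields an indexed surface that already
       carries a 256-entry palette. */
    SDL_Surface *surface = SDL_CreateRGBSurface(0, w, h, 8, 0, 0, 0, 0);
    if(surface == NULL)
    {
        SDL_Log("SDL_CreateRGBSurface failed: %s", SDL_GetError());
        return NULL;
    }

    SDL_Color colors[256];
    int i;
    for(i = 0; i < 256; i++)
    {
        colors[i].r = colors[i].g = colors[i].b = (Uint8)i;
        colors[i].a = 255;
    }

    /* Fill the palette the surface already owns; returns 0 on success. */
    if(SDL_SetPaletteColors(surface->format->palette, colors, 0, 256) != 0)
    {
        SDL_Log("SDL_SetPaletteColors failed: %s", SDL_GetError());
    }

    return surface;
}

SDL2 also has SDL_AllocPalette() and SDL_SetSurfacePalette() if you really want a standalone palette object (closer to what the question attempted), but for an 8-bit surface the built-in palette is usually all you need.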

Cecillececily answered 19/4, 2015 at 4:10

It has been a while since the OP posted the question and there is still no accepted answer. I ran into the same issue while migrating an SDL 1.2 based game to SDL 2.0. Here is what I did, in the hope that it helps others facing a similar issue:
Replace:
SDL_SetColors(screen, color, 0, intColors);

With:
SDL_SetPaletteColors(screen->format->palette, color, 0, intColors);
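
For context, here is a rough sketch of where that call might sit after the migration; SDL2 no longer has SDL_SetVideoMode(), so "screen" is assumed to be an 8-bit SDL_Surface the program draws into, and the size and window handling below are placeholders:

/* Sketch with assumed names: "screen" is the program's own 8-bit
   surface; color and intColors come from the original code. */
SDL_Surface *screen = SDL_CreateRGBSurface(0, 640, 480, 8, 0, 0, 0, 0);

if(SDL_SetPaletteColors(screen->format->palette, color, 0, intColors) != 0)
{
    SDL_Log("SDL_SetPaletteColors failed: %s", SDL_GetError());
}

/* To display it, blit "screen" onto SDL_GetWindowSurface(window) and call
   SDL_UpdateWindowSurface(window), which replaces the old SDL_Flip(). */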

David

Eugeniusz answered 13/7, 2019 at 18:58
