Support true 32-bit colors in game data and script API #1980
I would suggest making AGS4 games 32-bit color depth only.
It would be fine to have both; we could even move color-related functions into a
Maybe a
Either "255,255,255,0" or "#FFFFFF00" (with or without the #)
Both formats at least serve the purpose of saving disk space and runtime memory. At the same time, what is the real downside of keeping these formats? EDIT: Overall, I suspect only a few people use these formats, like 8-bit, because AGS does not promote them. There are engines out there, like PICO-8, that specifically promote restrictions. I genuinely wonder if the number of people trying to make 8-bit gfx games would increase if AGS advertised this mode. EDIT2: In any case, I would not want to do format removal alongside this task; I'd prefer to have it as a separate matter.
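For illustration, the two alternate serialization formats mentioned above ("255,255,255,0" and "#FFFFFF00") could be parsed into one packed 32-bit value along these lines. This is only a sketch; the function name and the R,G,B,A channel order in both forms are assumptions made for this example, not AGS's actual behavior:

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// Sketch only: parse either "r,g,b,a" (decimal) or "#RRGGBBAA" / "RRGGBBAA"
// (hex) into a packed A8R8G8B8 value. Channel order is an assumption made
// for this example, not the project's actual serialization format.
uint32_t ParseColorString(const std::string& s)
{
    unsigned r = 0, g = 0, b = 0, a = 255;
    if (s.find(',') != std::string::npos)
    {
        // Comma-separated decimal channels
        std::sscanf(s.c_str(), "%u,%u,%u,%u", &r, &g, &b, &a);
    }
    else
    {
        const char* p = s.c_str();
        if (*p == '#')
            ++p; // the '#' prefix is optional
        unsigned rgba = 0;
        std::sscanf(p, "%8x", &rgba);
        r = (rgba >> 24) & 0xFF; g = (rgba >> 16) & 0xFF;
        b = (rgba >> 8) & 0xFF;  a = rgba & 0xFF;
    }
    return ((uint32_t)a << 24) | (r << 16) | (g << 8) | b;
}
```

Under these assumptions both example strings decode to the same fully transparent white.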
Well, it's not legacy color, it's a hardcoded color still used for certain purposes which do not have any exposed settings, like drawing a built-in dialog window, or default dialog options color, etc (I do not remember them all).
Actually, to save on API entries,
Oh, something I forgot to mention. 8-bit images, and thus 8-bit color values, are currently used on masks, so they cannot be thrown out.
I don't want to throw out 8-bit images, just 8-bit game projects, if that reduces the complexity of the code. Mask colors are hardcoded too, but with extra hacks to prevent certain colors.
What do you mean? Mask colors are just 0-255 indexes.
Maybe I'm remembering the part used in the editor, but there was a piece of code with hardcoded values up to 31, where some whites are turned into red, and everything above 31 is all red.
No, that is not related to masks; I explained what it is in the ticket's description.
In any case, I would not want to do any format/feature removal along with this task; I'd prefer if it's considered a separate matter. Currently I don't think that the presence of alternate formats will make it difficult to implement 32-bit color value support.
Upon more thinking, perhaps it would be better to store color properties simply as 32-bit ARGB or as an 8-bit index (depending on the context). It will be much easier that way, both for users and for doing any kind of analysis in the engine. It may still be converted to 16-bit or another format when applied to a drawing operation (that's done using fast bitwise operations). EDIT: amended the first post with the proposed order of changes.
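As an illustration of the "fast bitwise operations" in the drawing path, reducing a stored 32-bit ARGB value to R5G6B5 at draw time might look like this. A sketch only: the function name is made up for this example and is not engine API:

```cpp
#include <cstdint>

// Sketch: reduce a stored 32-bit ARGB color to 16-bit R5G6B5 only at the
// moment it is applied to a drawing operation. Illustrative name, not the
// actual engine function.
inline uint16_t ArgbToRgb565(uint32_t argb)
{
    uint32_t r = (argb >> 16) & 0xFF;
    uint32_t g = (argb >> 8)  & 0xFF;
    uint32_t b =  argb        & 0xFF;
    // Keep the top 5/6/5 bits of each channel and pack them together.
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
```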
Sounds good to me. |
An interesting problem of transparent color: right now, when the engine loads 32-bit sprites, it replaces all fully transparent pixels (with alpha 0) with the standard AGS COLOR_TRANSPARENT (0x00FF00FF). This is done for compatibility, and also to let users draw and check for transparent pixels with DrawingSurface. If we want full ARGB support, we can no longer do that conversion, because even fully transparent pixel values may have a meaning. But then users will no longer be able to check for transparency by comparing pixels with COLOR_TRANSPARENT. They will have to check the alpha channel (the upper 8 bits of the integer) instead. That is, in the generic case. Of course, if they draw the image themselves, they might still use COLOR_TRANSPARENT.
They could keep comparing against COLOR_TRANSPARENT if they've drawn it with COLOR_TRANSPARENT. Obviously imported images may have other colors hidden in the RGB fields, so we may add a note somewhere about it.
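A minimal sketch of the two checks being contrasted here, assuming the 0x00FF00FF marker value quoted above:

```cpp
#include <cstdint>

// Value quoted earlier in this thread for the legacy transparency marker.
const uint32_t COLOR_TRANSPARENT = 0x00FF00FF;

// The generic check for imported ARGB images: look at the alpha channel
// (the upper 8 bits), not at the marker value.
inline bool IsFullyTransparent(uint32_t argb)
{
    return ((argb >> 24) & 0xFF) == 0;
}
```

For example, 0x00112233 is fully transparent by the alpha check even though it does not equal COLOR_TRANSPARENT.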
This partially reverts commit 2d0150b. We'll bring this change back along with adventuregamestudio#1980.
Opened a PR here: #2501. EDIT: ready for review now.
I posted this in comments to #2501, but might repeat here: I've been wondering, would it make sense to handle the existing duality of Color / ColorNumber properties by hiding one of the pair depending on the current game's color depth? That is, display ColorNumber if the game is 8-bit and Color (RGB) if the game is 32-bit. But then, it would also be nice to have the color displayed on the ColorNumber field. Maybe even add a [...] button to the ColorNumber property which opens a palette for selecting a color. Alternatively, how feasible would it be to merge the Color and ColorNumber properties together?
Resolved most subtasks from the list in #2501, the rest should be moved to separate tasks. |
Problem
Historically AGS stores a script color value in 16-bit format (iirc it is R5-G6-B5) in both 16-bit and 32-bit games. This causes obvious problems in 32-bit games.
Besides that, colors with values 0-31 (this corresponds to lower blue hues in ARGB format) have a special meaning: they are forced to refer to the palette indices 0-31, even in 32-bit games. AFAIK this is done because certain hardcoded graphics in the engine use these palette slots to get drawn.
EDIT: upon quick check of the code, it's not the real game palette where these colors are taken from, but the special 32-slot hardcoded palette. So any changes to the 256-colors game palette do not affect these.
This also means that these 0-31 color indices are not "dynamic"; they always resolve to the same RGB values.
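The special-case range could be pictured as follows. The table contents here are placeholders, since the thread does not list the engine's actual 32 hardcoded entries:

```cpp
#include <cstdint>

struct RGB8 { uint8_t r, g, b; };

// Placeholder 32-slot table; the engine's real hardcoded palette differs,
// and (per the note above) it is NOT affected by the 256-color game palette.
static const RGB8 kHardcodedPalette[32] = { {0, 0, 0}, {255, 255, 255} };

// Color values 0-31 are treated as indices into the fixed table, even in
// 32-bit games.
inline bool IsHardcodedIndex(int color)
{
    return color >= 0 && color < 32;
}
```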
Proposed changes
In brief, the goal is this:
Further notes:
- Game.GetColorFromRGB is superseded by, or accompanied with, Game.GetColorFromRGBA(r, g, b, a).
- The int color value in the script API in 32-bit games should reliably be A8R8G8B8, meaning that users may also make a color themselves using bitwise operations. This must be documented in the manual (same for 16-bit games, if it's not already).
- GetHardcodedRGB(index) (or a better name). EDIT: a good start is to search for GetCompatibleColor() calls which have hardcoded numbers as arguments.
- DrawingSurface::DrawingTransparency (#1514, where this was first proposed).

Upgrading game projects
Unfortunately, the "color" properties are stored as raw values in both the classes and the game project files (and not as RGB in high/true-color games). I think that, for better future compatibility and readability, it may be better to store RGB values alongside. (Need to find a good form of serialization, maybe something like comma-separated values, and avoid an overly verbose xml format.)
When importing older projects, the color values will be automatically converted to RGB, except for 8-bit games.
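That upgrade step could be sketched as below, assuming the R5G6B5 layout mentioned in the problem description. Bit replication is one common choice for filling the low bits, not necessarily what the editor will actually do:

```cpp
#include <cstdint>

// Sketch: expand a legacy 16-bit R5G6B5 color property to 32-bit ARGB on
// project import. Bit replication fills the low bits so that pure white
// maps to 0xFFFFFF rather than 0xF8FCF8. Opaque alpha is assumed.
inline uint32_t Rgb565ToArgb(uint16_t c)
{
    uint32_t r = (c >> 11) & 0x1F;
    uint32_t g = (c >> 5)  & 0x3F;
    uint32_t b =  c        & 0x1F;
    r = (r << 3) | (r >> 2);
    g = (g << 2) | (g >> 4);
    b = (b << 3) | (b >> 2);
    return 0xFF000000u | (r << 16) | (g << 8) | b;
}
```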
Proposed order of changes
Editor:
- If/since we still support 16-bit game depth, for these games the RGB selector might adjust the selected color to the nearest supported 16-bit one (but the color value is still stored as 32-bit ARGB!).
- Find color properties stored as short and adjust them to int, changing the format as necessary.

Engine:
Script API
- Game.GetColorFromRGB(r,g,b) with an optional alpha value, which is 255 by default. EDIT: looks like this is not necessary, as there's a global import "ColorType palette[PALETTE_SIZE];", where ColorType is a struct with RGB fields.
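For reference, the proposed script API addition could behave like this sketch (a hypothetical C++ stand-in for the script function; the A8R8G8B8 packing follows the layout proposed above):

```cpp
#include <cstdint>

// Hypothetical stand-in for the proposed Game.GetColorFromRGBA: same as
// GetColorFromRGB, plus an optional alpha that defaults to 255 (opaque).
inline uint32_t GetColorFromRGBA(uint32_t r, uint32_t g, uint32_t b,
                                 uint32_t a = 255)
{
    return (a << 24) | (r << 16) | (g << 8) | b;
}
```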