Joined: 6-May-2007 Posts: 10688
From: Greensborough, Australia
In my limited research I can't find much info on HDR, apart from it becoming available a few years ago and needing a supported video card as well as a monitor.
On top of that, I don't see 32 bits per channel, only 10 bits per channel. While 10:10:10 can fit in a standard 32-bit pixel and increase colour resolution, it does cut the alpha down to only 2 bits, though the framebuffer wouldn't need alpha for the final display anyway.
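For reference, the 10:10:10:2 packing works out like this; a quick sketch in C (channel order varies by API, this one puts red in the low bits):

#include <stdint.h>

/* Pack three 10-bit channels and a 2-bit alpha into one 32-bit pixel,
 * the common 10:10:10:2 HDR framebuffer layout. */
static uint32_t pack_rgb10a2(uint32_t r, uint32_t g, uint32_t b, uint32_t a)
{
    return (r & 0x3FF) | ((g & 0x3FF) << 10) | ((b & 0x3FF) << 20) | ((a & 0x3) << 30);
}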
I can find even less info on the actual HDR pixel format used. I found a 32-bit float format, but it is a compressed HDR pixel format, and what I found was for storage rather than how the pixels are stored in the actual framebuffer. I also read of a 16-bit half-float format, but again nothing clear on whether that is what the video card is using.
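If that 32-bit compressed storage format is Radiance's RGBE (my guess, it matches the description), the idea is three 8-bit mantissas sharing one 8-bit exponent. A rough sketch of the encoder:

#include <math.h>

/* Encode float RGB into 32-bit RGBE: the largest channel picks the
 * shared exponent, and the other channels lose precision relative to it. */
static void float_to_rgbe(unsigned char rgbe[4], float r, float g, float b)
{
    float v = r;
    if (g > v) v = g;
    if (b > v) v = b;
    if (v < 1e-32f) {
        rgbe[0] = rgbe[1] = rgbe[2] = rgbe[3] = 0;  /* too dark to represent */
    } else {
        int e;
        float m = frexpf(v, &e) * 256.0f / v;  /* scale so the max channel lands in 128..255 */
        rgbe[0] = (unsigned char)(r * m);
        rgbe[1] = (unsigned char)(g * m);
        rgbe[2] = (unsigned char)(b * m);
        rgbe[3] = (unsigned char)(e + 128);     /* biased shared exponent */
    }
}

So it keeps the dynamic range but compresses colour precision, which would be why it's a storage format and not a framebuffer format.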
So it looks like HDR is rather like Amiga HAM: higher colour resolution with hardware compression, except that 10 bits per RGB channel fits in a pixel losslessly. Given that only 10 bits looks common in supported HDR hardware, Commodore's 32-bit resolution still looks silly to me.

Now, I see 32 bits was a software construct, a standard they chose that gets scaled down to the hardware. So 32 bits per gun didn't mean any hardware could display 32 bits per gun, even though the functions implied it, since they set a hardware palette. I can find no hardware that can do this, and if any exists it must be very specialist and out of the league of even the most serious gamer. By the end the Amiga (A4000) was an expensive machine that didn't offer any specialist workstation graphics, so this looks like putting the cart before the horse. Even the planned AAA and Hombre chipsets didn't break the 8-bit-per-gun limit.

I still think AmigaE had the best idea with SetColour(): it takes an 8-bit RGB value and scales it down for OCS/ECS, or scales it up for the OS to scale back down again for AGA. The new way was confusing for programmers; a few would have shifted the value up instead of scaling it, and it wasted CPU time doing conversions. And that's apart from it all being CLUT based, which is more limited than even standard 8-bit RGB triplets and, to an extent, 16-bit RGB.
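On the shifting-versus-scaling point: with 32-bit gun values, the right trick is to replicate the bit pattern across the whole word rather than just shift it up, since a plain shift leaves the low bits zero and full 8-bit white comes out slightly short of full 32-bit white. A minimal sketch (my own illustration, not actual OS code):

#include <stdint.h>

/* Spread an 8-bit gun value across 32 bits by bit replication,
 * e.g. 0xAA -> 0xAAAAAAAA. This is exact scaling, because
 * 0xFFFFFFFF / 0xFF == 0x01010101. A bare (v << 24) would map
 * 0xFF to 0xFF000000 instead of full intensity. */
static uint32_t scale8to32(uint8_t v)
{
    uint32_t x = v;
    return (x << 24) | (x << 16) | (x << 8) | x;
}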
This Lenovo laptop I am using goes in the opposite direction. I read that the LCD panel is more like VGA class, with only 6 bits per gun, so it's not meant to be the best. But I found there are some top laptops with HDR and 4K, the expensive ones gamers and programmers like.
Joined: 24-Aug-2003 Posts: 2757
From: As-sassin-aaate! As-sassin-aaate! Ooh! We forgot the ammunition!
More specifically, the notion of 32 bits per gun (usually floating point) turns up in graphics pipelines, shader code and textures/bitmaps. Using these formats in image processing tools that support HDR is not uncommon, nor is it even a particularly new idea. There was even a version of Photogenics that used 32-bit floating point per channel.
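To illustrate what a float-per-channel pipeline has to do before anything reaches an 8- or 10-bit display, here's a generic sketch (simple Reinhard tone mapping; not how Photogenics or any particular tool does it):

#include <math.h>
#include <stdint.h>

/* One pixel at 32-bit float per channel; values may exceed 1.0. */
typedef struct { float r, g, b; } PixelF;

/* Compress a linear HDR value to an 8-bit display value with
 * Reinhard tone mapping (x / (1 + x)) followed by gamma 2.2. */
static uint8_t tonemap8(float x)
{
    float y = x / (1.0f + x);      /* compress 0..inf into 0..1 */
    y = powf(y, 1.0f / 2.2f);      /* gamma encode for display */
    return (uint8_t)(y * 255.0f + 0.5f);
}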