Some HDR software doesn't read CR3 files; besides, RAW files are cumbersome and take a lot of time to process. I even suspect the software converts them to JPG before actually doing the HDR merge with the converted JPGs.
Is it OK to use JPGs for HDR, or is it better to use RAW?
01-12-2021, 03:58 PM
(This post was last modified: 01-12-2021, 03:59 PM by thxbb12.)
Well, JPG has two main drawbacks:
- Each RGB component of a pixel is stored as 8 bits (a color intensity can only be represented on a scale from 0 to 255).
- It uses lossy compression: to save space, the image is compressed at the cost of lost information (artefacts such as approximations and blocking).
- (No camera metadata.)
A RAW file usually uses 14 bits, so a color intensity can be represented on a scale from 0 to 16383 (16,384 levels).
The format is usually lossless as well.
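For the curious, here is a tiny Python sketch (plain NumPy, not tied to any particular camera) that shows how many tonal levels each bit depth gives and how much coarser the 8-bit quantization of a smooth gradient is:

import numpy as np

# Discrete levels per colour channel
levels_8bit = 2 ** 8     # 256 levels (0..255), typical JPEG
levels_14bit = 2 ** 14   # 16384 levels (0..16383), typical RAW

# Quantize the same smooth gradient at both bit depths
gradient = np.linspace(0.0, 1.0, 100_000)
q8 = np.round(gradient * (levels_8bit - 1)) / (levels_8bit - 1)
q14 = np.round(gradient * (levels_14bit - 1)) / (levels_14bit - 1)

print("distinct values, 8 bit :", np.unique(q8).size)    # 256
print("distinct values, 14 bit:", np.unique(q14).size)   # 16384
print("max rounding error, 8 bit :", float(np.abs(gradient - q8).max()))
print("max rounding error, 14 bit:", float(np.abs(gradient - q14).max()))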
One could use JPEG images to produce HDR images. However, to mitigate the limitations of the 8-bit representation (and the compression artefacts), one would need more bracketed images to work with (and the result would also depend on the quality of the software used, of course).
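As a rough illustration of working from several bracketed JPEGs, here is a minimal Python sketch using OpenCV's exposure fusion. The filenames are hypothetical, and this is not what any particular HDR package does internally:

import cv2
import numpy as np

# Hypothetical bracketed series shot from a tripod
files = ["exp_-2.jpg", "exp_0.jpg", "exp_+2.jpg"]
images = [cv2.imread(f) for f in files]

# Align the frames first; hand-held brackets rarely line up perfectly
cv2.createAlignMTB().process(images, images)

# Mertens exposure fusion blends the well-exposed parts of each frame
# directly, without needing the exposure times
fused = cv2.createMergeMertens().process(images)  # float32, roughly 0..1

# Scale back to 8 bits for saving/viewing
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))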
It all depends on the software. Which software are you talking about exactly?
01-13-2021, 12:21 PM
(This post was last modified: 01-13-2021, 12:23 PM by Brightcolours.)
No, HDR software does not work in JPEG or other 8-bit formats internally. It also does not work directly on the 12- or 14-bit RAW data, as far as I am aware. It converts the RAW files into a 16-bit format (demosaiced, with white balance applied and a low-contrast tonal curve), and then you can apply whatever settings you like in the HDR software. Or it takes a number of differently exposed images, perhaps with a normal tonal curve applied, and merges them into a 16-bit (or higher) composite image to which you apply whichever settings you like.
Software never uses JPEG internally; JPEG is just a file format.
It is never "better" to use a number of JPEG files to make an HDR image, but of course you can use JPEG files with different exposures to make a tone-mapped image with HDR software.
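To illustrate that "merge to a high-bit-depth composite, then tone-map" workflow (again, not the internals of any specific HDR program), here is a rough OpenCV sketch; the filenames and exposure times are made up:

import cv2
import numpy as np

# Made-up bracketed exposures with their shutter times in seconds
files = ["bracket_1.jpg", "bracket_2.jpg", "bracket_3.jpg"]
times = np.array([1 / 60, 1 / 15, 1 / 4], dtype=np.float32)
images = [cv2.imread(f) for f in files]

# Recover the camera response curve, then merge into a 32-bit float
# radiance map (far more precision per channel than the 8-bit sources)
response = cv2.createCalibrateDebevec().process(images, times)
hdr = cv2.createMergeDebevec().process(images, times, response)  # float32

# Tone-map the high-dynamic-range composite back to a displayable image;
# this is the step where the "HDR look" settings get applied
ldr = cv2.createTonemapReinhard(gamma=2.2).process(hdr)
cv2.imwrite("tonemapped.jpg", np.clip(ldr * 255, 0, 255).astype(np.uint8))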