What to use for HDR: raw or JPG?
#1
Some HDR software doesn't read CR3, and besides, RAW files are cumbersome and take a long time to process. I even suspect the software converts them to JPG first and then does the HDR merge on the converted JPGs.
Is it OK to use JPG for HDR, or is it better to use RAW?
#2
Well, JPG has two main drawbacks:
- Each RGB component of a pixel is stored as 8 bits, so a color intensity can only be represented on a scale from 0 to 255.
- It uses lossy compression: to save space, the image is compressed at the cost of lost information (artefacts such as approximations and blocking).
(- No camera metadata, either.)

A RAW file usually uses 14 bits, so a color intensity can be represented on a scale from 0 to 16383 (16,384 levels).
The format is usually lossless as well.

One could use JPEG images to produce HDR images. However, to mitigate the limitations of the 8-bit representation (and the compression artefacts), one would need more images to work with (and it would also depend on the quality of the software used, of course).
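For instance, here is a minimal sketch of merging bracketed JPEGs with OpenCV's HDR module (the filenames and shutter speeds are made up; assumes the opencv-python and numpy packages):

```python
import cv2
import numpy as np

# Hypothetical bracketed JPEGs and their shutter speeds (seconds)
files = ["ev_minus2.jpg", "ev_0.jpg", "ev_plus2.jpg"]
times = np.array([1 / 200, 1 / 50, 1 / 12.5], dtype=np.float32)

imgs = [cv2.imread(f) for f in files]  # 8-bit BGR frames

# Estimate the camera response curve from the bracketed set,
# then merge the frames into a 32-bit float radiance map
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(imgs, times)
merge = cv2.createMergeDebevec()
hdr = merge.process(imgs, times, response)

# Tone-map back down to a displayable 8-bit image
tonemap = cv2.createTonemap(gamma=2.2)
ldr = tonemap.process(hdr)
cv2.imwrite("hdr_result.jpg", np.clip(ldr * 255, 0, 255).astype("uint8"))
```

The calibration step estimates the camera response curve from the bracketed set, which is exactly where the extra frames help to compensate for the 8-bit quantisation.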
--Florent

Flickr gallery
#3
(01-12-2021, 03:58 PM)thxbb12 Wrote: Well, JPG has two main drawbacks: [...]

Of course RAW is superior, but does the software actually use the full 14 bits, or does it just convert to 8 bits and then use that data? In the latter case there is no point in using RAW, and it only leads to prolonged editing time.
#4
It all depends on the software. Which software are you talking about exactly?
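If in doubt, one way to sidestep the question is to do the raw conversion yourself and hand the HDR tool 16-bit TIFFs instead. A minimal sketch, assuming the rawpy and imageio Python packages (the filename is made up, and CR3 reading depends on the bundled LibRaw version):

```python
import rawpy            # raw decoding via LibRaw (CR3 needs a reasonably recent build)
import imageio.v3 as iio

# Hypothetical filename
with rawpy.imread("IMG_0001.cr3") as raw:
    # A 14-bit sensor tops out around 16383 in the raw mosaic data
    print("max raw sample:", raw.raw_image.max())
    # Demosaic straight to 16-bit RGB so none of the 14 bits are thrown away
    rgb16 = raw.postprocess(output_bps=16, no_auto_bright=True)

iio.imwrite("IMG_0001_16bit.tif", rgb16)  # 16-bit TIFF for the HDR tool
```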
--Florent

Flickr gallery
#5
No, HDR software does not use JPEG or other 8-bit formats internally. Nor does it work directly on 12- or 14-bit RAW data, as far as I am aware. HDR software does a RAW conversion into a 16-bit format (demosaiced, with white balance applied and a low-contrast tone curve), and then you can apply whatever settings you like in the HDR software. Alternatively, it takes several differently exposed images, perhaps with a normal tone curve applied, and uses those to build a 16-bit (or higher) composite image to which you then apply whichever settings you like.

Software never uses JPEG internally; JPEG is just a file format.

It is never "better" to use a number of JPEG files to make an HDR image, but of course you can use JPEG files with different exposures to make a tone-mapped image with HDR software.
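To make that second variant concrete, here is a minimal sketch of such a pipeline: decode each raw to linear RGB, then merge the frames into a 32-bit float radiance map with a simple hat-weighted average. The rawpy package is assumed for decoding, and the filenames and shutter speeds are made up:

```python
import numpy as np
import rawpy  # assumed for decoding; CR3 support depends on the LibRaw build

def load_linear(path):
    """Demosaic a raw file to linear RGB floats (no gamma, no auto-brighten)."""
    with rawpy.imread(path) as raw:
        rgb = raw.postprocess(output_bps=16, gamma=(1, 1),
                              no_auto_bright=True, use_camera_wb=True)
    return rgb.astype(np.float32) / 65535.0

# Hypothetical filenames and shutter speeds (seconds)
paths = ["dark.cr3", "mid.cr3", "bright.cr3"]
times = [1 / 400, 1 / 100, 1 / 25]

frames = [load_linear(p) for p in paths]

# Merge: average the per-frame radiance estimates (pixel value / exposure time),
# with a hat weighting that trusts mid-tones and discounts clipped or noisy pixels
num = np.zeros_like(frames[0])
den = np.zeros_like(frames[0])
for img, t in zip(frames, times):
    w = 1.0 - np.abs(2.0 * img - 1.0)  # peaks at 0.5, falls to 0 at both ends
    num += w * img / t
    den += w
radiance = num / np.maximum(den, 1e-6)  # 32-bit float composite, ready for tone mapping
```

Because the merge happens on linear data, the full 14 bits of each exposure survive into the floating-point composite, which is essentially what the conversion-and-composite step described above does for you before you touch any sliders.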