Archive 2008 · WideRGB/ProRGB

  
 
srui


Hi!

How many of you use the WideRGB colorspace with Canon DPP? I'm having some difficulties: the colors seem less saturated when I choose this colorspace, and artifacts like dithering and clipping appear in the shadow zones of the image. Should I set the monitor colorspace to WideRGB too?

Regards,

Sergio



Oct 23, 2008 at 04:05 PM
colinm


Are you exporting as a 16-bit image? ProPhoto RGB is not intended for use at 8 bits.


Oct 23, 2008 at 05:40 PM
cgardner


Why do you think you need to edit in those spaces?

Regardless of color space used for capture and editing there are the same number of color values. The number of discrete colors is determined by the bit depth of the sensor and the editing working space. 8 bits can describe 256 different values. When you combine 256 shades of red with an equal number of green and blue shades you get 256x256x256 = 16.8 million discrete colors. Add a 9th bit and it's 512x512x512 = 134 million. By the time you get to the 14-bit level of today's cameras and 32-bit editing, the number of possible colors approaches Bill Gates' net worth or the US National Debt in dog years.
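
A quick sanity check of that counting argument, in plain Python (nothing camera-specific, just the arithmetic from the paragraph above):

```python
# Discrete colors for a given number of bits per channel:
# each channel gets 2**bits levels, so the total is (2**bits)**3.
for bits in (8, 9, 14, 16):
    levels = 2 ** bits
    print(f"{bits:>2} bits/channel: {levels:>6} levels -> {levels ** 3:,} colors")

#  8 bits/channel:    256 levels -> 16,777,216 colors
#  9 bits/channel:    512 levels -> 134,217,728 colors
# 14 bits/channel:  16384 levels -> 4,398,046,511,104 colors
# 16 bits/channel:  65536 levels -> 281,474,976,710,656 colors
```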

So whether you choose sRGB, AdobeRGB, WideRGB, or ProRGB you are not increasing the number of colors, only how widely they are separated in the space. The larger the color space, the more widely spaced the colors are within it.

By way of analogy sRGB is a one-story building, AdobeRGB two floors and WideRGB three floors. The stairways in all three buildings have the same number of steps from the ground level to the top. The difference is how tall each step is. The top of the green staircase in the sRGB building is as green as green is ever going to get in that space. In terms of 8-bit-per-channel RGB eyedropper values that max green will be 0.255.0. Now in the two-story AdobeRGB building the top step is also the greenest green that gamut can record, twice as green as the one in the sRGB space, and also defined in that space as 0.255.0. So what happens when you take that max green from the sRGB building and relocate it into the taller AdobeRGB one? It gets put in the same spot, the top of the first floor. But in the AdobeRGB building the top of the first floor has an address of only 0.128.0. Relative to the maximum saturation in AdobeRGB the same absolute color values (as plotted in CIE Lab coordinates) are less saturated.
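
A toy numeric version of the staircase analogy (purely illustrative: it assumes a linear space whose green primary is exactly twice as saturated as sRGB's, which real working spaces only approximate):

```python
# sRGB's maximum green expressed as a fraction of each space's own primary.
srgb_green = 1.0                        # the top step: 255 in 8-bit terms

# A hypothetical "two-story" space whose green primary is twice as saturated:
# the same absolute color now sits only halfway up the staircase.
wide_scale = 2.0
same_color_in_wide = srgb_green / wide_scale

print(round(srgb_green * 255))          # 255 -> 0.255.0 in sRGB
print(round(same_color_in_wide * 255))  # 128 -> 0.128.0 in the wider space
```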

What happens when you display an image in a working space larger than it was captured in? The colors get remapped relative to the maximum level of saturation the space can theoretically record. So what does the camera actually record? It has the potential to capture all the colors in the gamut of the sensor's RAW colorspace. The Lab plot below compares AdobeRGB (white) and the standard color profile for a Canon 1DsMkII from DPP (the most recent one I could find):

http://super.nova.org/TP/ColorMang/AdobeRGBvsCanon1DSmkII.jpg

So you can see the camera is capturing a gamut smaller than AdobeRGB. But when editing you can't see a gamut any larger than the one your monitor can display, typically about the size and shape of sRGB on most consumer grade monitors. For example, here is the profile for my 24" Aluminum iMac, which is pretty good for a consumer grade LCD, compared to sRGB and AdobeRGB:
http://super.nova.org/TP/ColorMang/iMacAdobesRGB.jpg

Since most monitors can only display sRGB, if you are shooting only for display on screen there is no really compelling reason to use a workflow other than sRGB. But the problem with sRGB is that its shape is a lousy match to the CMYK profiles of ink and paper. Saturated reds, blues and purples obtained by cranking up the saturation on the monitor just cannot be reproduced with transparent CMYK dyes and inks:

http://super.nova.org/TP/ColorMang/sRGBHP8C.jpg

That's why colors change when printed and you'll never match your printer to the monitor.

So you are dealing with several different physical limitations of the color reproduction process: camera gamut, monitor gamut, and printer gamut. sRGB became the de facto standard for web work because it is the limit of what most monitors could display. But as noted above, the problem with sRGB is that it does not map well to CMYK.

Color management, as originated in the printing industry in the early 1970s with proofing standards like SWOP, actually has the goal of making all the pre-press proofing steps as close to the final production mode as possible, i.e. just as bad looking. I started in web offset printing in the 1970s when color separation proofs were printed one color at a time with dry trapping and exotic high-pigment inks on high brightness papers. Color separators would do that to match the saturation of the transparencies and please their clients, the designers and art directors. The problem was we could not print with the same level of saturation on a high-speed web press on paper with lower brightness. Art Directors and their clients would scream "Why didn't you match the proof and transparency?" and demand a free reprint of the ad. SWOP stands for Specifications for Web Offset Publications. It is an overall set of standards for inks, papers and ink densities which realistically reflect what production web offset presses can reproduce.

In 1998 Adobe addressed the sRGB screen / printing gamut dilemma by creating a new color space: AdobeRGB1998. Its size and shape are such that both the sRGB monitor gamut and the SWOP CMYK gamut fit neatly inside of it. The size and shape of AdobeRGB allow files captured and edited in AdobeRGB to either be displayed or printed without clipping, i.e. loss of color saturation.

http://super.nova.org/TP/ColorMang/sRGBinAbdobeRGB.jpg
http://super.nova.org/TP/ColorMang/AbobevsSWOP.jpg

Because Adobe1998 is a good fit for screen and printer gamuts it gradually replaced the de facto 5000K / 1.8 gamma "paper white" graphic arts standard created by Radius with its Pressview monitors. But remember there is still a Catch-22. If image capture is done with a camera gamut the size of AdobeRGB and AdobeRGB is used as a working space in Photoshop on a consumer grade monitor YOU STILL CAN'T SEE ALL THE COLORS IN THE FILE! Some of the colors in AdobeRGB fall outside the gamut the monitor can display accurately, which of course is why it is necessary to convert files to sRGB for accurate display in non-color-managed web browsers.

Photoshop has a "soft proofing" mode which can be used to simulate how a file will look when printed or converted. That is what Proof Colors and the Gamut Warning under the View menu in Photoshop are for. If you view a file on screen using Proof Colors and select the SWOP CMYK profile the screen will simulate how the file will look if converted to CMYK and printed on a web press. The SWOP CMYK profile is used to re-map the RGB values in AdobeRGB into the maximum CMYK saturation the ink and paper can print, then those CMYK values are re-mapped back into RGB so the screen can display a simulation. All that re-mapping of color space occurs deep in the bowels of Photoshop, in the color management engine, based on the Lab space coordinates. Lab is the largest gamut, which simulates the range of human color vision.
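
The real remapping runs through ICC profiles and Lab inside the CMM, but a crude stand-in shows why saturated colors come back changed. The sketch below uses the textbook naive RGB/CMYK formulas plus an arbitrary 240% total-ink cap as a stand-in for a real press profile; both are simplifying assumptions, not how Photoshop actually does it:

```python
def rgb_to_cmyk(r, g, b, total_ink=2.4):
    """Naive RGB -> CMYK (0..1 floats) with a crude total-ink limit."""
    k = 1.0 - max(r, g, b)
    if k == 1.0:
        c = m = y = 0.0
    else:
        c = (1 - r - k) / (1 - k)
        m = (1 - g - k) / (1 - k)
        y = (1 - b - k) / (1 - k)
    ink = c + m + y + k
    if ink > total_ink:                   # an ink-limited "press" scales back
        c, m, y, k = (v * total_ink / ink for v in (c, m, y, k))
    return c, m, y, k

def cmyk_to_rgb(c, m, y, k):
    return ((1 - c) * (1 - k), (1 - m) * (1 - k), (1 - y) * (1 - k))

deep_blue = (0.05, 0.0, 0.25)             # a dark saturated blue
back = cmyk_to_rgb(*rgb_to_cmyk(*deep_blue))
print([round(v, 2) for v in back])        # ~[0.07, 0.02, 0.29]: shifted to what the "ink" can hold
```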

You can also use your inkjet or service bureau printer ICC profile with View > Proof Colors to alter the screen image so it will simulate how the saturation will change when printed. But since the monitor can't actually display all the colors in the CMYK gamut, the Out of Gamut Warning (OOGW) grays out the colors which exceed the range the inks can print. Here's an example of how soft proofing looks on a camera file using the profile for my trusty HP7950 8/C printer on Premium Glossy paper.

So if you start with this file out of the camera:
http://super.nova.org/TP/ColorMang/OGW1.jpg
and edit in soft proofing mode...
http://super.nova.org/TP/ColorMang/OGW2.jpg
... reducing saturation until the OOGW disappears, you will wind up with a simulation on screen, to the extent the RGB monitor can simulate CMYK, of what the file will look like when printed:
http://super.nova.org/TP/ColorMang/OGW3.jpg

In absolute terms of saturation the image isn't as richly saturated as the camera capture, but relative to the amount of saturation ink on paper can produce it is as good as it is gonna look when printed.

So if your monitor can't even display AdobeRGB why in the world would you want to use an even larger working space like ProRGB? Well, a hint is in the name. A professional in graphic arts reproduction with a high level of color management expertise and the profiling tools needed to manage the printing process can tweak out some additional saturation afforded by the current generation of ink jet printers using the wider gamut. The profile comparison below shows an HP 8/C gamut compared to AdobeRGB. SWOP CMYK fits inside of AdobeRGB1998, but current ink jet printer gamuts exceed AdobeRGB:

http://super.nova.org/TP/ColorMang/AbobeHP8C.jpg

So to tweak that last bit of saturation out of the printer it's necessary to use an editing working space which is larger than that of the printer, and to evaluate the editing with printed proofs rather than on screen, because the monitor can't simulate the printer accurately. Here's my iMac monitor space compared to an 8/C HP Premium Glossy printer profile:
http://super.nova.org/TP/ColorMang/iMacHP8C.jpg

Life will be simpler and more predictable if you stick with AdobeRGB as a working space in Photoshop and learn to use soft proofing with your ink jet printer profiles. Keep your master edit file in AdobeRGB and unsharpened. There may be a bit of clipping compared to the 8/C printer gamut, but not to the extent you'd likely notice it when looking at a print. Resize and sharpen as the last step before printing on your ink jet at home.

For web display open the master AdobeRGB file, resize, convert to sRGB, and sharpen.
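
Scripted, that web-export recipe might look like this Pillow sketch (the file names are placeholders, and it assumes the master TIFF carries an embedded AdobeRGB profile for ImageCms to read):

```python
import io
from PIL import Image, ImageCms, ImageFilter

master = Image.open("master_adobergb.tif")       # hypothetical master file

# Resize for the web first...
web = master.resize((800, 533), Image.LANCZOS)

# ...then convert from the embedded AdobeRGB profile to sRGB...
src = ImageCms.ImageCmsProfile(io.BytesIO(master.info["icc_profile"]))
web = ImageCms.profileToProfile(web, src, ImageCms.createProfile("sRGB"))

# ...and sharpen last, at display resolution.
web = web.filter(ImageFilter.UnsharpMask(radius=1, percent=120, threshold=3))
web.save("web_srgb.jpg", quality=90)
```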

If you print at places like Costco the photo printers are typically set up to assume an sRGB profile, so you'd want to save a copy of the file, scaled to printer resolution, with the color space converted to sRGB. Experiment with different amounts of USM for different print sizes to find the optimal amount.

If you print at a professional service bureau like WHCC you'll usually have the option of submitting files in sRGB or AdobeRGB. A professional service bureau will also usually have profiles for the printer / paper combinations it uses available to download for soft proofing. Experiment with different amounts of USM for different print sizes to find the optimal amount, or follow their guidelines.

The bottom line in color management is always perceptual and practical. The human eye and brain are the weakest links in the chain, so all the technical nuances of how many fairies can dance on the head of the color management pin only really matter if you can actually see the fairies well enough to count them and have the skills to make them dance to your tune. Try everything you think might work and compare the results. If the difference in quality is apparent and seems worth the extra effort then do it.

Chuck



Oct 23, 2008 at 07:46 PM
srui


Wow! Thanks for the reply! Very good explanation. I had had some problems before with print services that only used sRGB. I now found a place that correctly accepts sRGB and AdobeRGB. I can also access the printer color profile, it’s a Fuji Frontier. I will try to use it.

For the web I have now been using sRGB + USM at the display resolution. What a difference. You can see here: www.olhares.com/srui . Some of the photos are old and not in the correct colorspace, can you spot them? ;-)

I will assume that the problem with the DPP clipping/dithering is a bug. I'm starting to prefer Lightroom anyway.

One more question then: how can one take advantage of the high DR of current cameras like the 40D or the Fuji S5? If there is really an advantage...

Again thanks for the reply!

Regards,

Sergio




Oct 24, 2008 at 02:23 AM
cgardner


srui wrote:
One more question then: how can one take advantage of the high DR of current cameras like the 40D or the Fuji S5? If there is really an advantage...


That is another can of worms...

I'd suggest you first do the practical test outlined in my Histogram tutorial.

With the camera on a tripod and custom color balanced off a gray card, shoot the gray card with the aperture of a fast lens wide open and adjust the shutter until the card is about 3 stops over-exposed and rendered as a textured white value (i.e. 240.240.240), then keep closing the aperture by 1 full f/stop (i.e. 1.4, 2, 2.8, 4, 5.6 ...) until you run out of f/stops and the card is 3-4 stops under-exposed and rendered as dark as possible. That will map the dynamic range over which your camera can record detail in practical terms (i.e. what you can see and measure in Photoshop).

Because the card only has one tonal value the histogram will be a narrow spike. As you reduce exposure by closing the camera aperture you will see the spike move to the left on the histogram. When you open the thumbnails the frames will form a gray scale. The illustration below, created from such a test, shows how the histogram spike shifted (red lines) with 1-stop changes in exposure on my 20D and 24-70mm f/2.8 lens. It allows me to visualize which of the 256 displayed tonal values each point on the horizontal histogram scale represents, and tells me in practical terms my camera and lens can record about 6 stops of scene difference with detail.
http://super.nova.org/TP/HistogramTest.jpg
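
You can estimate where each spike should land on the 0-255 scale with a little arithmetic. Assuming a pure gamma-2.2 encoding (an assumption: real cameras apply their own tone curves), each stop down multiplies the encoded value by 0.5^(1/2.2):

```python
# Predicted position of the gray-card spike after each 1-stop cut in
# exposure, assuming gamma 2.2 and a starting textured white of 255.
GAMMA = 2.2
value = 255.0
for stop in range(7):
    print(f"{stop} stops down: {value:5.0f}")
    value *= 0.5 ** (1 / GAMMA)   # half the light, compressed by the gamma curve

# 0:255, 1:186, 2:136, 3:99, 4:72, 5:53, 6:39 -- roughly the 240-down-to-40
# "detail" range, about 6 stops, described later in the thread.
```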
From my days in the 1970s and 1980s using the zone system and methodically measuring scene ranges with a 1-degree spot meter I know an outdoor scene can range in contrast from 12 stops on a sunny day on snow to as low as 4-5 on a foggy day or at the break of dawn. That knowledge, combined with knowing my 20D can record 6 stops with detail allows me to pretty much pre-visualize when I will lose shadow detail by just looking at the light.

What is important to realize and agree on is what defines correct exposure in a digital file. I use a white towel as a reference when I shoot whenever possible to gauge highlight exposure. I consider a file optimally exposed when there is no or only minimal clipping (loss of detail) in the brightest highlights as in this flat sunlit test shot:

http://super.nova.org/TP/DR_FlatLight.jpg

In the 16-bit RAW file in the AdobeRGB working space in ACR the brightest parts of the towel are barely clipping. It's the baseline test shot for a tutorial I did on high speed FP flash use.

Here is that same subject in the same light, but backlit:
http://super.nova.org/TP/DR_Backlight.jpg

Though it is difficult to tell in this small version, in the 16-bit RAW file in the AdobeRGB working space in ACR the brightest rim-lit parts of the towel are also barely clipping, just as in the first shot. Technically it is optimally exposed for highlight detail, there just isn't a great amount of highlight detail visible: analogous to the eyes and teeth of a black cat on a coal pile. To correctly render the cat or the towel it's necessary to set exposure so the textured white areas, regardless of size, are rendered correctly: white with texture.

Loss of shadow detail in the backlit shot above is evident on the left side of the histogram. If the bars of the graph that form the curve are high in amplitude and run off the left side, it's an indication of a lot of tones in the scene which are at or below the range where the camera can record detail.

But here's the rub... Show that backlit file to 100 photographers and probably 99 of them would say, "That file is a stop or more underexposed!" That illustrates the difference between what is technically correct and what seems correct perceptually. Perceptually we expect the overall exposure to look normal. The problem is that the limited DR of a digital camera can't cover the same range as our eyes...

So from a technical point of view the highlights are exposed correctly, but the middle tones and shadows are beyond the range of brightness the camera can render accurately in the natural light. It's not possible to change the DR of a single-exposure capture, so the only way to record the scene as seen by eye is to either add fill flash, or shoot bracketed exposures and combine them in Photoshop (i.e., HDR).

In this case the point of the exercise was to demonstrate that when a wide aperture is used a 580ex flash has enough power in High Speed FP mode to correctly expose the foreground. So I simply turned on the flash in High Speed FP mode and took another shot, getting this result:

http://super.nova.org/TP/DR_FlashFill.jpg

The fill flash added to the fill already there from the sky (which is actually only 3 stops below the highlight side) is enough to expose the foreground optimally as seen in this Levels evaluation of just the area on the card in the foreground:

http://super.nova.org/TP/DR_FillHisto.jpg

Because flash falls off about 2 stops each time the distance from the flash doubles there wasn't as much fill on the towel hung on the stand in the background and no difference at all in the tree in the extreme background. That's why the histogram of the overall scene really doesn't tell the whole story. But if I were able to crop in camera as tight on the card as I did in the selection in the screen shot above, the over-exposure warning and histogram of a shot taken that way would allow me to interpret the exposure of both highlights and shadows in the foreground subject.
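
That 2-stops-per-doubling figure is just the inverse square law restated in stops, which is easy to check (the distances are made-up examples):

```python
from math import log2

def falloff_stops(near, far):
    """Light lost between two flash-to-subject distances, in stops."""
    return 2 * log2(far / near)     # intensity ~ 1/d^2; one stop = a factor of 2

print(falloff_stops(1.0, 2.0))      # 2.0 stops when the distance doubles
print(falloff_stops(2.0, 3.0))      # ~1.17 stops: e.g. card at 2 m, towel at 3 m
```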

But in that fill flash shot also notice that while the card and towel are correctly exposed in the technical sense of correct highlight and shadow values, the middle tones in the foreground look a bit overexposed perceptually. That's because in similar light in person we expect the "highlights" on the shadow side of a back lit object to be about 1 stop darker than if flat lit. Again you need to be aware of the difference between what your brain tells you is being captured and what is actually being captured by the camera.

So to answer your question, the way to take full advantage of your camera DR is to first have an idea what it is in practical terms -- the range of detailed tones in photos you take, from about 240 (textured white) down to 30-40 (the darkest values where texture can be detected). The gray card test allows you to quickly - in about 15 minutes - visualize that and understand what the horizontal scale on the camera histogram represents in terms of neutral tonal values.

Once you understand what the heck the histogram and over-exposure warning are trying to tell you, exposure becomes pretty much a no-brainer. Being an old zone system practitioner I still tend to think of scenes in terms of their gray tones, but in color I use the highest of the three channels if the brightest / darkest textured objects in the scene are not neutral.


Put your camera in the playback mode that shows the over-exposure clipping warning (OEW). Monitor it as you shoot and keep the textured highlights about 1/3 stop below the point where they show as starting to clip. Err on the side of slight underexposure, which can be easily fixed in ACR or Levels, to avoid clipping textured highlights too small to see in the playback.
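
The same check can be automated after the fact. Here is a small sketch (Pillow + NumPy; the file name is hypothetical) that reports what fraction of each channel has hit 255, i.e. gone past the point where texture survives:

```python
import numpy as np
from PIL import Image

img = np.asarray(Image.open("test_shot.jpg").convert("RGB"))

for i, name in enumerate("RGB"):
    clipped = np.mean(img[..., i] == 255) * 100
    print(f"{name}: {clipped:.2f}% of pixels clipped at 255")

# Anything beyond a sliver of specular highlights suggests pulling exposure
# back until the textured whites drop to roughly 240-250.
```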

The only 255.255.255 values in any camera file should, IMHO, be specular highlights. The reasoning for that is based on how we perceive "white" in a background and shape in very light 3D objects. When shooting white-on-white objects it's the slight difference between 255.255.255 in the specular highlights, the 250.250.250 "paper" white values of smooth objects, and the light 240.240.240 - 230.230.230 shading in the textured white objects which allows our brains to create the illusion of 3D. If you blow out the highlights they look flat. In a studio shot, if you nuke a white background any specular rim light on the edges of a face will make the edges of the face melt into the background. So even in a studio flash shot on a white background you'd ideally want the background to read 250.250.250 (paper white) not 255.255.255 (direct light source or specular reflection).

http://super.nova.org/TP/WhiteBackground.jpg
http://super.nova.org/TP/WhiteBGTowelCard1.jpg
http://super.nova.org/TP/WhiteBGTowelCard2.jpg

By making the background "paper" white the rim light will define the shape by allowing the specular reflections to contrast with it. It's also worth noting that part of the strategy for lighting white or chrome/reflective objects is intentionally creating (and controlling) specular highlights via the types of light modifiers used.

Once you start exposing optimally for the highlights in the technical rather than perceptual sense you can begin to see clearly how your camera is mangling the shadow tones by simply glancing at the left side of the histogram. This is an exposure confirmation test shot from a portrait shoot with a pair of 580ex flashes with foam reflectors:

http://super.nova.org/TP/TowelGary.jpg

The camera histogram I pasted in from Levels is about the same as the one on the camera playback. Looking at the detail in the towel and the OEW I knew the highlights were exposed optimally. Looking at the left side of the histogram told me that my key/fill ratio was lifting all the shadows into the range where the camera sensor could record them: optimal exposure on both ends of the tonal scale.

What happens in the middle of the histogram really isn't important DURING CAPTURE IN THE CAMERA because there isn't anything that can be done to change it in a RAW capture. You can affect the way the file will look when opened by messing around with the contrast settings in the camera, but I prefer a workflow with as few in-camera variables as possible so the results are more predictable. Once the file is opened for editing the "internal" contrast -- i.e. the relationship of tones between black and white -- and the overall perceptual appearance of the file as captured can be altered with a variety of tools such as ACR, Levels, Curves, etc.

Chuck



Oct 24, 2008 at 09:57 AM
SoundHound


Thank you, THANK YOU! For such a lucid and informative survey of two essential but misunderstood topics!!


Oct 26, 2008 at 11:09 AM
stompyq


Mr. Gardner, thank you. An outstanding write-up.


Oct 29, 2008 at 11:40 AM
Bifurcator


cgardner wrote:
Why do you think you need to edit in those spaces?

Regardless of color space used for capture and editing there are the same number of color values. The number of discrete colors is determined by the bit depth of the sensor and the editing working space. 8 bits can describe 256 different values. When you combine 256 shades of red with an equal number of green and blue shades you get 256x256x256 = 16.8 million discrete colors. Add a 9th bit and it's 512x512x512 = 134 million. By the time you get to the 14-bit level of today's cameras and 32-bit editing
...

You're confusing hardware color registers with how a gamut works. Smaller gamuts or color spaces actually have a narrower value range available.

Think of a flight of stairs. It goes from the 1st floor to the 2nd floor only. The number of bits is like the number of steps in the staircase. Is it 10 giant steps to the 2nd floor or 1,000 tiny steps? Either way you end up on the second floor. When dealing with color profiles (gamuts, spaces, etc.) the actual height of the 1st and 2nd floors (the distance between them) is determined by the color space itself, not by how many bits are available.

This is different for HDR formats which actually go to the 5th floor.

If a particular gamut or color space does not include 100% white, for example, then even though there are 16.8 million steps (registers) available none of them will be 255,255,255 100% white. If a particular gamut cuts out the entire top range of whites and 200,200,200 is the highest value allowed by that color model, then no matter how many steps there are none of them will fall between 200,200,200 ~ 255,255,255.

But the registers are still there, so what happens to the extras? They're remapped back into the specified value range, which ultimately does result in fewer "numbers of colors". When 16-bit is used those values can be mapped in between, and a smoother spread (more, tinier steps) across the limits of the gamut is achieved.
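
That remapping penalty is easy to demonstrate numerically. The sketch below squeezes a smooth gradient into a smaller range (standing in for a gamut remap) and stretches it back out (standing in for a later edit), once at 8 bits and once at 16; the 200/255 factor echoes the example above and is otherwise arbitrary:

```python
import numpy as np

def roundtrip(bits, squeeze=200 / 255):
    levels = 2 ** bits - 1
    ramp = np.linspace(0.0, 1.0, 256)               # a smooth 256-step gradient
    q = np.round(ramp * squeeze * levels) / levels  # remap into the smaller range
    back = np.round(q / squeeze * levels) / levels  # a later edit stretches it back
    return len(np.unique(np.round(back * 255)))     # distinct 8-bit output levels

print(roundtrip(8))    # ~201 of 256 levels survive: visible banding
print(roundtrip(16))   # 256: the finer 16-bit steps preserve every level
```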

One should ALWAYS edit in the widest possible gamut (color space) available and then "convert" down to the subset that matches the display device.




Oct 30, 2008 at 03:07 AM
Bifurcator


Yup! Wayne Fox got it right.

Additionally edits and adjustments can and do result in values that are clipped more severely by smaller color spaces.

The phrase "giving yourself room to work" comes to mind here if you know what I mean.





Oct 30, 2008 at 03:18 AM
cgardner


Wayne:

First, I don't disagree with you at all. My comments about camera gamut were based on the illustrated profile of a Canon camera from DPP which, when mapped in comparison to AdobeRGB as illustrated above, shows it falling inside of Lab. I did do a Google search and that was the widest camera gamut I could find. Do you have links to profiles I can download and plot in the same Lab space? That space is, I believe, what is actually used by the CMM in Photoshop and best reveals gamut differences.

Secondly, if you had actually read everything I wrote rather than stopping when hitting the first thing you didn't agree with, you'd see that further down I say basically the same thing you do: that AdobeRGB was apparently based largely on SWOP CMYK and is now smaller than the gamuts of some ink jet printers: not the ideal choice if using a wider gamut printer. But OTOH AdobeRGB is not a problem if you do all your printing at Costco or any other photo-based printer. The selection of working gamut is tied more to printer gamut than anything else, due to the different shape of CMYK inside the connection space.

If the argument for editing "in the widest gamut possible" were valid and taken at face value, the logical gamut for editing everything would be an RGB gamut the exact size and shape of the Lab connection space, which is after all the widest gamut possible within Photoshop. But that wouldn't be practical, now would it, because many of the colors would fall outside the range any device can record or reproduce. What is practical for editing? A color space just large enough to encompass the camera capture, screen display, and printing ink gamuts. That varies, depending on the equipment you happen to own or use for output.

I don't disagree with you that if one has a camera and is printing with a gamut larger than AdobeRGB then there is a sound technical rationale for using the wider ProPhoto space. But the bottom line is really whether or not the technical rationale translates into changes which can be seen in the final result with one's eyeballs. Printer output will never exactly match screen appearance. Seeing color is an adaptive perceptual process, not color by the numbers. In any case, editing an image in even sRGB will not damage the integrity of the color space of the captured RAW file, or even many screen displays, simply the print. Worst case you'd need to re-edit from the RAW capture in a wider working space to maximize the printer gamut. Not the end of the world.

If all the colors captured by the camera actually fall inside sRGB and you don't have plans to print the image (i.e. an on-line catalog shot) then there is actually a sound technical rationale for working in the smaller space in the workflow, so more color data points are available to describe the actual colors captured (i.e. smoother gradation between colors).

So there really is no one-size-fits-all, bigger-is-always-better solution to color management. The most effective workflow is the one which meets the output objectives. If you had read my complete message you would have seen the conclusion of the discussion, especially the last sentence:

"The bottom line in color management is always perceptual and practical. The human eye and brain are the weakest links in the chain so all the technical nuances of how many fairies can dance on the head of the color management pin only really matter if you can actually see the fairies well enough to count them and have the skills to make them dance to your tune. Try everything you think might and compare the results. If the difference in quality is apparent and seems worth the extra effort then do it. "


Chuck






Oct 30, 2008 at 10:40 AM
Peter Figen


"That AdobeRGB was apparently based largely on SWOP CYMK..."

Adobe RGB was based on the SMPTE-240M HDTV color standard, and in fact, that was how it was originally referred to in Ps 5.0. When it was discovered that Adobe got one of the primaries slightly wrong, rather than fix that error they just renamed it AdobeRGB.

As to the gamut of the 1DsMkII, I get an entirely different result. There is no specific gamut of the chip per se; the gamut is derived from the input profile created for the camera. I used ColorThink Pro for the gamut plot and the PhaseOne 1DsMkII Generic camera profile vs. Adobe RGB. While this is only one angle on a 3-D view, it clearly shows how much larger the camera gamut is than Adobe RGB in most areas. There is a small area of green in Adobe RGB that extends beyond the camera, but that's largely immaterial.

www.peterfigen.com/ocs/adobergb-1dsmkII_colorthink



Oct 30, 2008 at 11:59 AM
mgipe


Great discussion, guys. Creative disagreement and debate nearly always leads to greater understanding. And I have to say I'm learning a lot from this discussion. It will be a long, long time before I could play in your league, so I appreciate your sharing this wisdom with me and pulling me up a little closer.

--Mike



Oct 30, 2008 at 12:52 PM
Bifurcator


Peter Figen,
Yes, that's correct. Thanks for the clarification!


Mgipe,
I dunno if I would call it "creative disagreement". I think creative disagreement is more like how and why noise works or not as an artistic effect - or something like that. This is more a case of getting the facts straight.



Oct 30, 2008 at 02:10 PM
cgardner


Peter:

FWIW - Nearly everything in the early versions of Photoshop was modeled after conventional halftone and drum scanning techniques (e.g. Levels, un-sharp masking). I have a wider perspective on color management than some because I started out creating halftones and separations manually with contact screens at National Geographic and taught a college-level course on how to do it for five years in the late 1970s, then went on to manage a 100-person full-service (design to web printing) publishing operation and introduce "desktop" publishing and color. I had hands-on experience with the PS 3/4/5 transition period when color management was first incorporated into Photoshop, and have perhaps a wider perspective on why Adobe picked that particular size for its gamut.

Was Photoshop widely used for HDTV back in 1998? No, of course not, there was no commercially available HDTV then. Photoshop was a tool at that time used mostly by designers, color separators of transparencies, and printers. The first version of Photoshop, 0.86, before Adobe licensed it from the Knoll brothers, was bundled with BarneyScan scanners. The rationale for picking that existing standard wasn't to equip Photoshop to produce work for HDTV; it was because it was easier to use an existing standard than create a new one, and it happened to have a gamut large enough to fit both sRGB and SWOP CMYK. It was implemented to allow offset printers like me to use Photoshop for editing photos without the CMYK offset printing ink gamut getting severely clipped during conversion as it is with sRGB.

SWOP = Specifications for Web Offset Publications. After working at NGS I was production process manager at a large commercial web printer which was on the committee which developed the standard, mainly to prevent color separators from using exotic inks on proofs production presses couldn't match: the 1970s version of "why can't the press match my monitor color?". If you compare AdobeRGB with SWOP you find SWOP fits inside of it, more or less. So does sRGB. Also, while Adobe adopted the SMPTE-240M HDTV color standard, it was later discovered it actually got the implementation wrong and had to rename it AdobeRGB1998.

In the early days of desktop color the standard was an AppleRGB monitor calibrated with Adobe Gamma because that was the only way to do desktop color. At the time all color in graphic arts was evaluated with 5000K light sources. Radius introduced a self-calibrating monitor with a 1.8 gamma and 5000K white point called "ColorMatch" and it quickly became the de facto desktop standard. We bought them for all of our color workstations. The Internet, and the fact AdobeRGB would allow color separators and designers to prepare work which could be either displayed on the web or printed, is one of the reasons AdobeRGB supplanted ColorMatch as the de facto standard. In the realm of dual-purpose web / offset printing publishing AdobeRGB works just fine, another example of where using ProPhoto wouldn't be the best choice. Simply put, there is no compelling reason to use ProPhoto RGB as a working space unless you want to tweak the last scintilla of saturation out of an ink jet printer.

Even Adobe in its tutorials for Camera Raw, I believe prepared by Andrew Rodney (DigitalDog), shows an example where in some cases sRGB would be a better choice of working gamut. He makes the very valid point that what really matters isn't the gamut of the camera, but the gamut of the colors in the scene the camera captured and the output medium.

I'm not a zealot for any particular colorspace. I've used them all at one time or another. All I'm suggesting here to the OP - my intended audience - is that he try all the different approaches with his equipment and then compare whether one offers better color IQ he can actually see than another. Isn't that more objective than insisting someone join the cult of ProRGB without really understanding how color management actually works? You've got to figure that if the OP knew how color management works, he wouldn't have asked the question in the first place.

Chuck





Edited on Oct 30, 2008 at 05:02 PM



Oct 30, 2008 at 03:19 PM
Bifurcator


cgardner wrote:
I started out creating halftones and separations manually with contact screens at National Geographic and taught a college-level course on how to do it for five years in the late 1970s, then went on to manage a 100-person full-service (design to web printing) publishing operation and introduce "desktop" publishing and color. I had hands-on experience with the PS 3/4/5 transition period when color management was first incorporated into Photoshop, and have perhaps a wider perspective on why Adobe picked that particular size for its gamut.


Wow, we have very similar backgrounds.

Worked in old-school print graphics for 6 years,
Taught University for 8 years (+course design),
Was directly involved with the PhotoShop Dev team and PS development.

I was also involved with 3DCG stuff too though and wrote parts of LightWave 3D as well as about one third of their ~2000 page paper/pdf manual.

I actually refer to myself as a video guy tho.







Oct 30, 2008 at 04:50 PM
floris


Interesting discussion, thanks for all the info.

A few questions...

1. Wayne - from your aspens colorspace graph it appears the red plot (aRGB) contains areas the green (ProPhoto) does not; the only region in which the ProPhoto extends beyond the aRGB is in the yellows. Am I reading that correctly? It seems to me from that plot that aRGB is bigger, and unless you're primarily dealing with yellows it would be the choice of colorspace. But that goes against what you're saying, so I think I'm confused.

2. Is there an easy way to tell whether ProPhoto vs aRGB is going to make a difference for me for a given image? (working with colorful landscapes/wildlife, printing off an Epson 7880). I'd rather not go and try re-editing everything if it won't have a perceptible difference, but if it will, I'd be willing to do it for the images where it truly matters.




Oct 31, 2008 at 02:35 AM
theSuede


Thank you all for a highly informative and well-written thread. This information is a lot more productive in an image-quality sense than all the "my 50D out-resolves your D300" type threads combined (that's about 1/3 of all posts written at FM forums... ). I wish more people would read threads like this, and that fewer people would worry about their equipment size (ehrm... I meant "resolution". That came out wrong.)

I also wish people would appreciate a good quality print, and that more people would have as their photography "goal" to hang at least one good sized blowup of one of their own shots on their wall per year... And I don't mean as in several feet big, just big enough to show and be proud of.



Oct 31, 2008 at 05:34 AM
theSuede


@floris: the green is a printer. You can tell it's a CMYK-based colour-space by the "sharper" point on the -b axis (among other things...). What you really need to do is to compare your own preferred printer to the colour-space you use, and make your decision on that basis. For this one 2D projection of the colour-space, I would say that aRGB is a good match for the printer in 99% of all cases... Fall leaves being one of the exceptions.

An "easy way" to compare? Not without already having all your material in ProPhoto... Then you could load a picture into PS and toggle "view gamut warning" in soft-proof in two versions (aRGB and ProPhotoRGB) of the picture. If the aRGB show significantly larger areas of greyout, then you're better off with ProPhoto. My suggestion if you don't have any colour-space visualization software at home is:
1: Redo some pictures with strongly saturated colours of many hues (mind, yellow is one of the normally "tricky" colours), and increase saturation even a bit more in your RAW-processor of choice.
2: Export from raw to two copies, aRGB and PPRGB.
3: Load both in Photoshop.
4: Set your printer as the soft-proof profile, and set "gamut warning" to on.
5: Toggle between the pictures. Is there a difference?

If the difference is small, don't worry about redoing all your work.
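
If you'd rather script that comparison than eyeball it, a rough proxy is to push each export through the printer profile and back and measure how much it changed; in-gamut pixels survive nearly intact, out-of-gamut pixels come back shifted. This sketch assumes Pillow's ImageCms, a downloaded printer .icc, and placeholder file names, and the round trip is only a stand-in for a true gamut check:

```python
import io
import numpy as np
from PIL import Image, ImageCms

printer = ImageCms.getOpenProfile("epson7880_premium_glossy.icc")  # hypothetical path

def out_of_gamut_fraction(path, threshold=8):
    img = Image.open(path)
    src = ImageCms.ImageCmsProfile(io.BytesIO(img.info["icc_profile"]))
    # Round trip: working space -> printer CMYK -> back again.
    cmyk = ImageCms.profileToProfile(img, src, printer, outputMode="CMYK")
    back = ImageCms.profileToProfile(cmyk, printer, src, outputMode="RGB")
    diff = np.abs(np.asarray(img, float) - np.asarray(back, float))
    return float(np.mean(diff.max(axis=-1) > threshold))

print(out_of_gamut_fraction("leaves_argb.tif"))    # the two exports from step 2
print(out_of_gamut_fraction("leaves_pprgb.tif"))
```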

And now, I'm actually going out to shoot some of those yellow leaves... See y'all in a bit...
Greetings from Sweden / Joakim Bengtsson



Oct 31, 2008 at 06:01 AM
Bifurcator


    cgardner wrote:
    If the argument for editing "in the widest gamut possible" were valid


Wayne Fox wrote:
I'm not arguing to use the widest gamut space possible. The reason for ProPhotoRGB is because it happens to be the best alternative for photographers currently available. The space was created with the specific intent of having a gamut as close as possible to the colors available in the real world. Since my images usually won't fit in aRGB, but will always fit in ProPhotoRGB, it seems to make the most sense. If there were a space that was in between, it might be a better working space, but there isn't. So to me ProPhoto makes sense.
I don't
...

Well, I am arguing to use the widest space possible, and IMO you should not have contradicted yourself nor conceded. Everyone interested in achieving the best results from an edit session should indeed work in the largest possible gamut. This is just simple common sense. I'll agree that there may be many people not interested enough in achieving the very best results possible to bother learning these things, but that seems to stand aside from this discussion and from the concerns of the OP. This is premised on the logic and intent of the original decision to incorporate, utilize, and eventually rename Kodak's ROMM RGB to ProPhoto. It simply is and was the widest established space for the task. It's larger even than Adobe's own internally developed Wide Gamut RGB - which was in fact back-shelved in favor of "ProPhoto". Why?

    Adobe Wide Gamut RGB = 77.5% of the color spectrum visible to humans.
    Kodak's ROMM RGB (aka ProPhoto) = 100% of the visible color spectrum.

In fact ProPhoto is so big that about 10 or 15% of the represented space falls OUTSIDE the spectrum of human-detectable colors. Using large color models for editing is also a matter of common sense, where the editing process can be compared to folding a queen-sized bed-sheet in a room of predetermined size. Limiting oneself to sRGB while editing is akin to attempting to fold the sheet wrinkle-less-ly in a room 2x2 feet square, Adobe RGB 4x4, and ProPhoto a room the size of a basketball court. The analogy continues in that on the basketball court you have room to fold it as you like - twirling it over your head, throwing it to your right or left in an attempt to manipulate its form and achieve the best wrinkle-free fold.

ProPhoto is very close to (though not 100% of) CIE Lab space and so is suitable for Lab edits in Photoshop and other editors. Additionally, it actually DOES make sense "for someone to capture jpeg files in camera and then [to] convert them to ProPhotoRGB". You seem to be assuming that after such a conversion no edits will be performed that will produce color values outside whatever the JPEG was encoded with. It makes a lot of sense, and is a common practice in the professional workplace [Pixar, etc.], to convert high color graphics (jpegs, etc.) into 16 bits with the widest possible color space prior to editing. Think of it as being handed our imaginary bed-sheet pre-folded into a 2x2 square and being asked to fold it in a different way. You will of course want to take the sheet onto the basketball court first! Editing an sRGB-spaced image, for example, in 16-bit ProPhoto will reduce the hue rotations that occur from non-linear tone scaling operations (or the number of sheet corners that get bent by the walls of the smaller room) and the tone clipping that occurs from linear operations.
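
The "room to work" point can be reduced to a few lines of arithmetic. A toy linear model (the 0.5 re-encoding factor is a stand-in for how the same color lands lower in a roughly twice-as-wide space):

```python
import numpy as np

pixel = np.array([0.95, 0.40, 0.10])   # a saturated orange near the top of a small space

# Small space: an aggressive boost hits the container wall, and undoing
# the edit cannot bring the red channel back.
boosted = np.clip(pixel * 1.3, 0.0, 1.0)
print(boosted / 1.3)                   # [0.769 0.4   0.1  ] -- red is gone for good

# Wide space: the same color sits at half the scale, so the boost never
# clips and the undo returns the original exactly.
wide = pixel * 0.5
print(np.clip(wide * 1.3, 0.0, 1.0) / 1.3 * 2)   # [0.95 0.4  0.1 ]
```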

I could go on and on about this topic, presenting other bits of history, practices, and mathematics, but this should cover it well enough for a general photography site. At least I hope it will.


Nice photograph BTW!



Oct 31, 2008 at 10:22 AM
cgardner


>>>> Indicates corrections <<<<<

Let's not lose sight of the fact that the capture of a digital image is done with three very narrow band filters over the sensor sites. Each "bucket" of the sensor is either red, green or blue and sees the world as a monochromatic tonal range in each of those colors. That's exactly the same way color separation has always been done: on the graphic arts cameras I operated in the 1970s, on the drum scanners I managed in the 1980s, and in the camera I use today. I also used color filters with my B&W photography and understand how they work: a narrow band filter reproduces anything the same color as the filter lighter in tone than seen by eye, and all other colors darker.

The camera is simply recording the scene in terms of its red, green and blue components and then mathematically converting the individual red, green, and blue cell site values into RGB pixel values based on a mathematical model of human visual response. A camera gamut can be profiled by shooting a color reference chart with known color values (as defined by Lab coordinates) within the range of vision. Those known values are then compared with how the camera actually records them. The camera profiling process has little practical use, except in situations such as copying art on a copy camera, because the profile will vary depending on the color temp of the source illuminating the chart.

I did that exercise with my first Kodak DC290 digital camera in 2001 when I had access to all the necessary tools. The process is pretty simple. You shoot an IT-8 color target which is produced with stringent quality control.

http://super.nova.org/PhotoClass/Part7/IT8.jpg

Accompanying the target is a data file which describes the color of each patch. You then run the camera TIFF file through an application which compares the chart data file with how the camera actually reproduced it.

http://super.nova.org/PhotoClass/Part7/ICCCamProfile.jpg
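
The comparison step boils down to a color-difference metric. In its simplest 1976 form, ΔE is just the Euclidean distance between two Lab values; the patch numbers below are made up for illustration:

```python
def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in Lab space."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

# Hypothetical IT-8 patch: the reference Lab value from the chart's data
# file versus what the camera actually recorded for that patch.
reference = (52.0, 42.5, 18.0)
measured = (50.5, 46.0, 15.5)
print(round(delta_e76(reference, measured), 2))   # ~4.56; errors above ~2 are generally visible
```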

That camera had an odd quirk. Color looked normal, in the perceptual sense, if I used daylight or the built-in flash, but if I used my Vivitar 285HV flashes blue and neutral gray tones would get a cyan cast and reds would become oversaturated. The color of my jeans and Mac CPU are more accurately reflected in the photo on the left, which uses the camera profile. The plot of the profile revealed the reason for the odd color shift:

http://super.nova.org/PhotoClass/Part7/ICC_3D/CompareProfiles2D.jpg

http://super.nova.org/PhotoClass/Part7/ICC_3D/290Profile.gif

Note how the camera gamut as defined by the chart test falls outside of the Lab space in the blue and red corners? I suspected the odd color shifts were due to the fact the Vivitar flash was producing UV and IR wavelengths the camera wasn't filtering out. The camera saw the UV as just a brighter tone of blue and the IR as a more intense shade of red, because it was just recording relative brightness in the color channels through the filters over the sensor sites, not color in the literal sense. Those arbitrary RGB values in the camera file only became odd looking when mapped using icc color management to a working space and my monitor profile with relative colorimetric or perceptual rendering intent. They looked odd because the perceptual rendering intent apparently assumes all the RGB values the camera will capture fit inside the visual spectrum. I didn't see the same color shifts in photos taken with the built-in flash because it had a heavy UV filter over it. With a bit of web searching I located a professor at RIT who taught a class on profiling using a DC290 and exchanged several e-mails. He confirmed my suspicions. The Vivitar flash didn't cause the same problem on my other digital cameras because they had better UV/IR cut filters over the sensor. The solution with my DC290 was to add UV-absorbing mylar to the flash head.

Reality Check: You need to realize that any camera profile made from a printed chart or transparency is based on the limited CMYK gamut of that target, then extrapolated into outer space by the profile generation software, and is only valid for identical lighting conditions. It's not the "real" color because all color is relative. That's what I mean when I say all color reproduction is "fake" or an illusion. Color perception, and the fact we think a flat 2D contrast pattern in a photo represents real objects, are all perceptual illusions. The real genius of icc-based color management is that it takes that into account with the rendering intents used to map colors. But as my DC290 example shows, it's also based on certain assumptions about how RGB maps to the gamut of human vision perceptually.

The fact color reproduction works at all has to do with the physiology of the eye and brain.
>>> Correction per Hermie: There are only L, M and S cones (long, middle and short wavelength sensitive cones). Trichromatic signals are transformed to opponent signals, allowing more efficient signal transmission. The processing of cone signals is analogous to the a and b channels in Lab. Differencing of the cone signals allows construction of red-green (L−M+S) and yellow-blue (L+M−S) opponent signals. <<<<<< Having different points of reference for any color allows its hue to be triangulated by the brain in much the same way as you can plot your position on a map by taking simple compass bearings off two different mountain peaks, then drawing lines on those bearings from the tip of each mountain out into the map: your location will be where the lines intersect.

The rods of the eye are also monochromatic, sensitive to a narrow band in the green/blue area of the spectrum and about 3000x more sensitive than the cones to light energy. That adds a third point of reference to human vision, a very sensitive range of brightness. There are actually far more rods than cones in the eye. If you hold your arm out and stick up your thumb, the cones in your eye would be concentrated in an area about >>> twice <<<<< the size of your thumbnail, 2% of your field of view, and the rods the rest of your field of vision.

The physiology of the eye and the process of human vision explain some of the things you may just accept as givens about digital photography without really knowing why, such as the shape of the Lab gamut (much larger in the green region) and the fact that a digital camera sensor with a Bayer pattern has twice as many green sensor wells as red and blue. Both mimic the response of the human eye. I don't know for certain why Bayer decided on doubling the number of green sites, but I suspect it was a clever workaround to eliminate the signal/noise ratio problem which would occur if a single green site was amplified twice as much as the red and blue ones.

There are physical design considerations which affect the dynamic range of a sensor site. They are analogous to a bucket in that the larger the bucket is the more water it can hold. Because there might be both very dark and very light areas in a photo, the process of filling the buckets is like aiming a fire hose at some while filling others with an eye-dropper (the real one - not the one in Photoshop). The process of filling the buckets stops, ideally, at the precise moment the first of the millions of buckets on the sensor gets filled to the brim. Then all the buckets are dumped and the contents measured.

Ironically, a cheap point-n-shoot camera with a live preview continually dumps and resets the sensor, and by constantly measuring the contents can know, down to the pixel level, when any of the buckets have been overfilled, and provide feedback to correct the exposure. But a high-end camera, slave to the optical viewfinder, must meter off the light in the viewfinder in roughly defined zones, and then take a WAG at the exposure. The best it can do is try to warn the photographer via the post facto histogram and over-exposure warning what the state of exposure was. Thus a $300 P&S will often do as good a job or better of getting auto-exposure correct than a $3,000 pro body.

How many bits the camera uses to convert the analog voltage of the sensor dump into a numeric value representing a range of brightness, as seen through the red, green and blue filters, affects the gradation of tone. The four-bit processors in use back when I started programming business applications in the late 1970s could only describe a tonal scale with 16 steps. The lowest, 0, would be black, the highest, 15, white, with 14 gray tones in-between. Nowadays a camera uses 14 bits and there are many more discrete steps in the gray scale, but 0 is still pure black and the highest value pure white. By the time we get past the RAW converter, we have a camera file with RGB values for each pixel assigned within the range the bit depth can express, based on the mathematical model of how RGB values map to human vision.

In the days before icc-managed color the RGB values in the file would directly drive the video card and the monitor with no intervention by the application. That is still true for applications such as web browsers which are not icc-aware. Back in the early days of desktop color the pros were editing color separations of transparencies and color prints, made on scanners, on Macs with 5000K / 1.8 gamma Radius Pressview monitors. The color balance was checked visually and empirically against the same chart printed on an offset press.

In the early 1980s when running a Hell DC300 drum scanner, which output "hard dot" litho CMYK film ready for printing, the basic calibration exercise involved printing an optimized standard color target on the offset press with production ink and paper, then scanning it. The scanner gray and color balance controls were then adjusted so that the resulting film accurately rendered the target: a CMYK > CMYK round-trip.

There was color management, and it worked quite well within the realm of professional graphic arts if everyone used standardized viewing conditions and SWOP standard inks because it was based not around what a monitor could display but rather what the final result on the press would be: monitors were adjusted by eye, perceptually, to match the press sheet as close as possible. All things considered, in a closed loop environment like our offset printing operation where the designers and pressmen shared identical viewing conditions it worked as well as icc based color management. Every light bulb in the entire facility was a 5000K Chroma so even if you looked at color outside of the viewing booth environment the color temp was still the same.

The Internet and digital cameras started to change the paradigm of color communication several years before icc-based color management became universally available on the desktop. The native color balance of an un-calibrated CRT RGB monitor on a typical PC at that time was typically around 8-9000K with a gamma of 2.2. Microsoft and HP, focused on the business market, not graphic arts like Apple, collaborated to create the sRGB standard in 1995 based around the gamut of monitors at that time. The specification called for a white point of 6500K and a gamma of 2.2. Needless to say, color edited on a Mac didn't look the same on a PC. Apple had introduced icc-based color management into its OS in 1992, but it wasn't until Windows 98 that Microsoft actually incorporated icc-based profiling into its OS. But because the Internet shifted the medium of communication from the printed page to the monitor, and PCs outnumbered Macs, sRGB became the de facto standard for web viewing for a very practical reason: it was a close match to most CRT monitors so even an unmanaged file would look OK.

With early digital cameras - my office owned one of the first, a .8MP Apple QuickTake 100 - there was no choice of color space. Later cameras offered the choice of sRGB or AdobeRGB, but that choice defined the working-space gamut the camera JPG values would be mapped to, not the range the camera could record, which, as described above, is primarily a function of the filtration over the sensor sites.

In a digital camera RAW workflow, the first point at which a color space is assigned in the icc model is when the digital values calculated by the camera from the analogue voltages of the sensor sites are mapped to a working space. A RAW camera file has no defined icc gamut: no color space in the icc lexicon. The RAW file contains luminance and color information, not discrete RGB pixel values. The point at which RGB values are first assigned is when the working space is selected. Since whatever values the camera captured are mapped to the working space, any colors the camera might have recorded outside of the working gamut are mapped to the outer boundary of the working space. The beauty of the icc workflow is that you can pick whichever working space you want and the color management driving the conversion will adjust the color values in a way that perceptually the colors look realistic: that bright red fire engine or stop light will be mapped to the appropriate RGB values within that space. To the extent a camera profile created from an IT8 or Macbeth ColorChecker interjected into the RAW > working space conversion will affect the outcome, it does so by modifying the values within the boundary of the working gamut to make them more accurately reflect the chart values in the test file. The camera profile will not affect the outer boundary of the working space, and will only be accurate for photos taken in the same color of light.
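
You can watch that assignment happen in code. The rawpy library (Python bindings to LibRaw; the file name below is a placeholder) demosaics the same RAW data into whichever working space you ask for, and that conversion is the first moment discrete RGB values exist:

```python
import rawpy

def develop(path, space):
    """Demosaic a RAW file into a chosen working space at 16 bits."""
    with rawpy.imread(path) as raw:
        return raw.postprocess(output_color=space, output_bps=16)

srgb = develop("IMG_0001.CR2", rawpy.ColorSpace.sRGB)       # hypothetical file
prophoto = develop("IMG_0001.CR2", rawpy.ColorSpace.ProPhoto)

# Identical capture, different numbers: the working space is assigned at
# conversion time; the RAW file itself carries no icc color space.
print(srgb[100, 100], prophoto[100, 100])
```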

Implied in the design of the icc system is the assumption that people actually know how it works and can make intelligent choices of workspace based on image content and output medium. That is the point of this thread. The RGB and CMYK gamuts are analogous to an RGB apple and a CMYK pear: different sizes and different shapes. If you owned a fruit mail-order business and wanted to stock one size of box to ship either fruit, or combinations of both, you'd need one that could hold either shape, but not have so much room left over that the fruit would rattle around during shipping. The same is true when picking a working space: the working space is the box.

A RAW file can be opened in Adobe Camera Raw (ACR) and evaluated in different working spaces. The histogram and preview will show clipping due to exposure. Adobe offers this advice:

Clipping occurs when the color values of a pixel are higher than the highest value or lower than the lowest value that can be represented in the image. Overbright values are clipped to output white, and overdark values are clipped to output black. The result is a loss of image detail.
• To see which pixels are being clipped with the rest of the preview image, select Shadows or Highlights options at the top of the histogram. Or press U to see shadow clipping, O to see highlight clipping.
• To see only the pixels that are being clipped, press Alt (Windows) or Option (Mac OS) while dragging the Exposure, Recovery, or Blacks sliders.
For the Exposure and Recovery sliders, the image turns black, and clipped areas appear white. For the Blacks slider, the image turns white and clipped areas appear black. Colored areas indicate clipping in one color channel (red, green, blue) or two color channels (cyan, magenta, yellow).
Note: In some cases, clipping occurs because the color space that you are working in has a gamut that is too small. If your colors are being clipped, consider working in a color space with a large gamut, such as ProPhoto RGB.

But what the ACR preview isn't showing is whether or not the working space is large enough to hold the CMYK printer gamut. The ideal display for ACR would be something similar to soft proofing in Photoshop, where the CMYK printer profile could be inserted into the ACR screen preview color management workflow with a separate out-of-gamut warning showing in the preview for any colors in the CMYK output the working space will clip. That of course can be done now by simply opening the file in Photoshop in the selected working space and applying soft proofing before doing anything else, but the ability to do it in ACR would save a step.

Color is adjusted perceptually when the working gamut is displayed on a monitor. When working in a gamut larger than that of the monitor you are actually manipulating the color outside of the monitor gamut by remote control. For example you might have a red which is 100% of the saturation your LCD monitor can display, but only 50% of what the working space can define. That means you can make the red more saturated in the working space but the monitor can't accurately display it. So what happens? All the other colors your monitor can display will change instead, to try to simulate, within the limited monitor gamut, how all the colors will look in relative terms perceptually.

When you print and convert to CMYK the RGB values will also be remapped in ways a monitor can't display. What gets mapped to CMYK isn't what is actually seen on the monitor, which is just a perceptual simulation of the wider working space: the RGB > CMYK transform goes directly from the working space you can't really see on the monitor to a CMYK space you can't actually see until you make a test print. Soft proofing is mostly valuable to the extent it identifies out-of-gamut output colors by graying them out. By selectively adjusting saturation to eliminate the warning the clipping can be eliminated, but the image on the screen will still not exactly match the output.

The downside of working in a gamut larger than actually needed is that much of the manipulation of the file is shown in relative perceptual ways on the monitor. Using a monitor with the widest possible gamut you can afford will allow you to see more of the actual manipulation rather than a perceptual simulation. But that also has a downside if you are preparing files for the web for people with smaller gamut monitors. Finding the best workflow for the work you do just requires a bit of your time to try them and see which is most convenient. If you only ever post the files on the web then working in sRGB would save the step of converting. If you only shoot and edit for offset printing and the Internet then AdobeRGB would fit both. At the current time it's only the gamuts of some ink jet printers which exceed AdobeRGB.

The most objective way to judge printer output with various workflows is to print a carefully prepared test file. It is possible to download standard test files from CIE and other sources:
http://www.colour.org/tc8-03/test_images.html

The choice of working space doesn’t affect the underlying RAW values in the file so there is no irreversible penalty for making a bad working space choice if the RAW file is retained. Worst case the file would need to be re-edited.

Canon offers a more empirical approach to color in DPP.

http://super.nova.org/TP/Styles480sRGB.jpg


A RAW file can be opened in DPP in various working spaces and then various style profiles can be applied interactively which will change the internal color mapping within the working space. That allows the user to empirically pick the one which best matches the content of the photo and the desired mood you want it to evoke in the mind of the viewer. That, not color by the numbers, is what really matters most from my practical point of view.



Edited on Oct 31, 2008 at 05:26 PM



Oct 31, 2008 at 11:44 AM