I’ve been into photography for years, and this is an issue that keeps coming up and discouraging me. If someone could help me resolve it, I’d be eternally grateful.

Basically, I understand the concept of calibrating monitors, but every time I actually calibrate mine, it makes the screen look unusably awful and ruins the look of photos that already looked good in print and when posted online.

This all started ten years ago (and this pattern has repeated every 1 to 2 years since)…

Ten years ago, I would take a RAW photo on my camera and transfer it to my MacBook Pro (yes, I know you shouldn’t edit and print from a laptop, but it’s all I had at the time). The RAW, unedited image looked identical in Lightroom to how it looked on the camera. I’d edit the photo and post it online, and it would look good on my iPhone, on Facebook, and on other people’s phones and computers. I even printed a couple of photos and they looked pretty good. I am now looking at a photo that I edited at that time on the uncalibrated MacBook Pro, and it still looks very close to how it looks on my iPhone; it’s the same Lightroom edit from 10 years ago.

At the time, I figured it was important to calibrate my monitor, but when I did, it just destroyed the screen on the MacBook. It didn’t even look close to natural and turned everything muddy brown. Now, I understand maybe I was just used to seeing the incorrect, uncalibrated version, but I have an image that proves the uncalibrated screen printed just fine and looked great on screen. The calibrated screen, however, looked too awful to continue using, so I deleted the profile and kept editing the way I had been.

Over the next ten years I’ve repeated this process over and over. The calibrated screen just looks too bad to deal with, and it makes the images I worked so hard on, which look good on other screens, look terrible.

So tonight, now using a PC and a BenQ gaming monitor advertised as 100% sRGB accurate, I decided to calibrate again because I really, really want to get into printing my images, but the same thing happened. All my images, which look great on my iPhone and match my uncalibrated screen about 90%, now look awful.

What am I doing wrong? I do like to game on this same screen, but I’ve always just decreased the screen’s default color saturation and contrast to match how the images look on my iPhone, which matches Lightroom pretty closely.

Also, the uncalibrated screen I am currently using looks identical to how the RAW images look in camera, but the calibrated screen looks nowhere near close.

I’m once again discouraged and giving up on trying to print but I’d love to figure out what I’m doing wrong.

It seems I have to choose: either I edit and view my images on an uncalibrated screen and they look better on other screens, or I calibrate my screen and maybe they print more accurately, but then they won’t look the same when posted online.

If there is someone out there who wants to make some money, PM me and I will pay you $50 for your time if you can help me figure out this problem.

  • VetusiratusB · 11 months ago

    I would mostly agree, however…

    Calibrating to sRGB is a pretty bad idea for photographic work. In the motion picture and VFX world you would calibrate to Rec. 709, or whatever standard you’re targeting, but that is mostly down to how the color management pipelines work: you don’t get ICC-based color management there, and there aren’t that many different outputs.

    For photographic work you can have a ton of different outputs, as every printer, ink and paper combination has its own color space. For that purpose, it’s best to keep the display at its native gamut, since a wider gamut lets you see more of the colors you’re working with.

    A good and simple strategy is to target a D65 white point and 2.2 gamma. Do that by adjusting the RGB “gain” controls in the display’s own menu and picking the gamma setting that comes closest to 2.2. Don’t write anything to the video card gamma table - that will just lead to banding (you’ll get some banding anyway, so it’s best to minimize it).
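
    As a side note, “gamma 2.2” and the sRGB tone curve are close but not identical - the sRGB spec uses a small linear segment near black. A minimal Python sketch (assuming nothing beyond NumPy) that shows where they diverge:

        import numpy as np

        def gamma22_encode(linear):
            # Pure power-law 2.2: the display target described above.
            return linear ** (1.0 / 2.2)

        def srgb_encode(linear):
            # Piecewise sRGB curve: linear toe near black, ~2.4 power above it.
            return np.where(linear <= 0.0031308,
                            12.92 * linear,
                            1.055 * linear ** (1.0 / 2.4) - 0.055)

        x = np.linspace(0.0, 1.0, 11)
        for lin, g, s in zip(x, gamma22_encode(x), srgb_encode(x)):
            print(f"linear={lin:.2f}  gamma2.2={g:.4f}  sRGB={s:.4f}")

    The two only really diverge in the deep shadows, which is why a display set to plain 2.2 still works fine for sRGB content.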

    Then you simply profile the display and make sure to install the profile in your OS. This takes care of things in (ICC) color-managed applications, meaning output transforms will be handled on the fly to match the display.
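
    To make “handled on the fly” concrete, here is a rough Python sketch, using Pillow’s ImageCms, of what a color-managed app effectively does for each image it draws (display.icc and photo.jpg are placeholders for your own profile and file):

        from PIL import Image, ImageCms

        # A color-managed app converts: image's source profile -> your display profile.
        src_profile = ImageCms.createProfile("sRGB")              # profile the image is tagged with
        display_profile = ImageCms.getOpenProfile("display.icc")  # the profile you installed

        img = Image.open("photo.jpg").convert("RGB")
        transform = ImageCms.buildTransform(
            src_profile, display_profile, "RGB", "RGB",
            renderingIntent=ImageCms.Intent.RELATIVE_COLORIMETRIC)
        on_screen = ImageCms.applyTransform(img, transform)
        on_screen.show()  # roughly the pixels a managed app sends to the display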

    For non-color-managed applications, well… it’s probably easiest to try to avoid them. The Windows UI will look oversaturated, and games don’t support ICC color management. There are ways to use LUTs for this if it bothers you; in fact, you might want to get madVR and use a LUT for your video player if you like to watch videos on your computer. Most web browsers work fine if you stick to gamma 2.2, and with Firefox you can enable full color management and plug in your display profile, as shown below.
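
    For reference, the Firefox side is just a couple of about:config prefs - these names are from memory, so double-check them, and the path is wherever your installed profile lives:

        gfx.color_management.mode = 1
            (1 = color-manage everything, including untagged content, which is treated as sRGB)
        gfx.color_management.display_profile = <path to your display profile>
            (leave empty to use the OS default profile)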

    Anyhow, as for proofing… there’s nothing inherently wrong with it, but I find it unnecessary for web delivery. You could use it as a quick preview of what happens to your image after color space conversion. I rarely bother.

    Start with a raw conversion into a large working color space, and make sure Camera Raw (or whatever raw converter you use) is set to 16 bits. ProPhoto RGB is good.
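
    Lightroom and Camera Raw handle this in their settings, but if you want to see the step in isolation, here is a rough Python sketch using the rawpy library (file names are placeholders):

        import rawpy
        import imageio

        # Demosaic the raw file straight into a wide working space at 16 bits per channel.
        with rawpy.imread("IMG_0001.CR2") as raw:  # placeholder raw file
            rgb16 = raw.postprocess(
                output_color=rawpy.ColorSpace.ProPhoto,  # large working gamut, as suggested above
                output_bps=16,                           # 16-bit output
                gamma=(1.8, 0.0),                        # ProPhoto's nominal 1.8 tone curve
            )
        imageio.imwrite("working_prophoto16.tiff", rgb16)
        # Note: the TIFF carries no embedded profile, so assign ProPhoto RGB to it in your editor.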

    Make your edits and then convert to sRGB, or whatever you want (Edit -> Convert to Profile in Photoshop). If you’re targeting print, soft-proof with the printer’s profile. Don’t convert - the printer software will handle the conversion.
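
    Here is a rough Python version of both endpoints, again with Pillow’s ImageCms; the ProPhoto, printer and display profile paths are placeholders for your working space profile, whatever profile your lab or paper vendor supplies, and the monitor profile you installed:

        from PIL import Image, ImageCms

        img = Image.open("edited_master.tif").convert("RGB")    # placeholder edited master
        prophoto = ImageCms.getOpenProfile("ProPhotoRGB.icm")   # your working space profile
        srgb = ImageCms.ImageCmsProfile(ImageCms.createProfile("sRGB"))
        printer = ImageCms.getOpenProfile("printer_paper.icc")  # printer+ink+paper profile
        display = ImageCms.getOpenProfile("display.icc")        # your monitor profile

        # Web delivery: actually convert the pixels to sRGB and tag the file.
        web = ImageCms.profileToProfile(img, prophoto, srgb,
                                        renderingIntent=ImageCms.Intent.RELATIVE_COLORIMETRIC)
        web.save("for_web.jpg", quality=95, icc_profile=srgb.tobytes())

        # Print: don't convert - just soft-proof, i.e. preview the print through the printer profile.
        proof = ImageCms.buildProofTransform(prophoto, display, printer, "RGB", "RGB")
        ImageCms.applyTransform(img, proof).show()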