With all my recent photo work, I’ve started developing an eye for how my photos look on different monitors. I have an LG monitor at home for which I have an ICM profile, and I can get Ubuntu to load that profile using xcalib. At work, I have two ViewSonics whose models no longer live on viewsonic.com, so there are no model-specific ICM profiles available to download. My recent Ravenna Tree Sign picture, for instance, displayed distinctly darker on the ViewSonic monitors than on my home monitor. I upped the gamma setting on my ViewSonics to 1.50 (?) using the on-screen menu. However, this is all very imprecise; adjusting by eye through an on-screen menu isn’t exactly the bee’s knees. This leads me to wonder: what is a typical screen gamma?
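To put some numbers on why the same file can look darker on one screen than another, here’s a quick back-of-the-envelope sketch in Python. It assumes an idealized power-law display, which glosses over the real sRGB transfer curve, so treat the figures as ballpark:

```python
def displayed_luminance(pixel_value, display_gamma):
    """Relative luminance an idealized power-law display
    produces for an 8-bit pixel value."""
    return (pixel_value / 255.0) ** display_gamma

# The same mid-gray pixel, on displays with different gammas:
for gamma in (1.5, 1.8, 2.2):
    y = displayed_luminance(128, gamma)
    print(f"gamma {gamma}: pixel 128 -> {y:.3f} relative luminance")

# gamma 1.5: pixel 128 -> 0.356 relative luminance
# gamma 1.8: pixel 128 -> 0.289 relative luminance
# gamma 2.2: pixel 128 -> 0.220 relative luminance
```

Going from gamma 1.8 to 2.2 knocks roughly a quarter off the displayed luminance of a mid-gray pixel, which is easily enough to make a picture read as distinctly darker.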
According to this ancient photo.net discussion on monitor gamma, Macs and PCs shipped with entirely different defaults: 1.8 for Macs, 2.2 for PCs (Windows). Apparently this led to picture formats storing gamma hints in the files themselves; PNG’s gAMA chunk is one example, and there’s a sketch of reading it after the chart discussion below. First I’d heard of this. Anyway, here is an interesting monitor gamma dipstick posted in that thread:

[Gamma test chart image from the photo.net thread]
On an LCD monitor, moving your head around changes the apparent gamma. If I tilt my monitor up so it’s more perpendicular to my line of sight, the apparent gamma shifts from 1.65 to 1.35. Wow: looking at the chart in this HTML editor, it now reads 1.05. If someone were paying me to do this stuff, I’d probably stop using an LCD for color-managed work. (Are there good LCD monitors for color-managed work?)
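Back to those gamma hints for a moment: PNG is one format that carries one, in its optional gAMA chunk, which stores the file’s encoding gamma multiplied by 100,000 as a big-endian unsigned integer. Here’s a rough sketch of digging it out of a file; it’s illustrative, not a robust PNG parser:

```python
import struct

def read_png_gamma(path):
    """Return the gamma declared in a PNG's gAMA chunk, or None."""
    with open(path, "rb") as f:
        if f.read(8) != b"\x89PNG\r\n\x1a\n":
            raise ValueError("not a PNG file")
        while True:
            header = f.read(8)
            if len(header) < 8:
                return None  # hit end of file; no gAMA chunk declared
            length, chunk_type = struct.unpack(">I4s", header)
            if chunk_type == b"gAMA":
                # Chunk data is gamma * 100000 as a 4-byte unsigned int.
                (value,) = struct.unpack(">I", f.read(4))
                return value / 100000.0
            f.seek(length + 4, 1)  # skip chunk data plus 4-byte CRC
```

A file prepared for a 2.2 display typically stores 45455, i.e. an encoding gamma of about 1/2.2; a viewer that honors the hint can compensate for its own display’s gamma.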
If you have trouble viewing my pictures, or if you think they are too light or too dark, let me know. I’m not going to get a colorimeter (a Spyder, say) for my home monitor anytime soon, but if my home monitor’s gamma and brightness are off the tracks, I want to know.
