CHROMiX ColorNews Issue #48 - Print Verification

  CHROMiX ColorNews
   Issue # 48 - May 16, 2012

This Month's Contents

  1. CHROMiX News
  2. Latest blog entries in ColoRants (and Raves)
  3. Shows and Events
  4. Color Industry News
  5. Forum Topics, Random Bits, etc.
  6. Article - Print Verification - Are your Proofs Bona Fide?
  7. CHROMiX Open Box items for sale
  8. ColorNews Admin (feedback, subscriptions, etc.)

For the very freshest color updates, check out our new blog, ColoRants (and Raves).

Respond & Discuss
Don't forget, you can discuss this month's article and anything else from this newsletter in our forums.

Find full details about subscriptions, etc at the end of this newsletter.

CHROMiX News What the heck have we been up to?

Curve2 VERIFY Module and DEMO mode

To address the growing need for a stand-alone G7 verification tool, CHROMiX and HutchColor have developed Curve2 'Verify' mode. This is the same great verification feature that Curve2 has always had but now can be licensed separately for customers who simply want to verify a sample print for G7 Grayscale or G7 Targeted compliance.

Curve2 Verify-only mode is $99. Verify users can upgrade to the full Curve2 for $1100, unlocking full Curve2 goodness.

Also, as a side announcement and by popular request, there is a new 'Demo' mode for Curve2. Demo mode allows users to test the interface as well as the main calibration and verification functionalities of Curve2 (including Verify mode) without a serial number. Sample files are included and are used for Demo mode.

So, download Curve2.3 for the new modes as well as updated OneRun targets and Apogee and Konica Minolta native output files.

For more, please contact CHROMiX Sales at 877-ColorGear, or order here

ColorThink Pro Tips & Tricks

CHROMiX has been adding videos on Tips and Tricks with ColorThink Pro. We now have two videos (and growing), narrated by our own comedic technician, Pat Herold. The first is about using ColorThink Pro for Linearization and Optimization, and the second contains a few examples and some good advice on how to spot a bad profile using ColorThink Pro.

Stay tuned for more soon. And feel free to comment on the humor!


CHROMiX Blog Here are some of the recent posts to our blog, ColoRants (and Raves)

Shows and Events Color-relevant gatherings to plan for

May 3rd - 16th, 2012 - DRUPA 2012, Duesseldorf Messe, Duesseldorf Germany

June 11th - 13th, 2012 - G7 Expert/Professional Training, Pittsburgh, PA, sponsored by Printing Industries of America. Call Joe Marin at 800-910-4283, ext. 731 or email

June 12th - 14th, 2012 - OnDemand Expo & Conference, Javits Center, New York, NY

June 24th - June 28th, 2012 - IPMA In-Plant Printing and Mailing Association, Kansas City, MO

October 7th - 10th, 2012 - GRAPH Expo 2012, McCormick Place South, Chicago, IL

October 18th - 20th, 2012 - SGIA Expo 2012, Las Vegas, NV

Events Calendar: For all current and future events, bookmark this calendar.

Color Industry News What's going on in the world of color

ColorEyes Display Pro v1.6 Update (Mac only)

Integrated Color Corp. has released version 1.6 of their ColorEyes Display Pro monitor calibration and profiling software for Mac. It's a paid upgrade for existing v1.5 customers ($49). It adds support for new instruments (i1DisplayPro, Spyder4 and ColorMunki) and the latest operating systems. It also provides DDC support for the NEC PA series and Wacom Cintiq tablets. (Again, Mac only.) There are some important notes, and for now the upgrade must be purchased directly through Integrated Color.

New EIZO Features

Eizo has introduced three interesting new features in the latest version of their ColorNavigator calibration software, 6.1.1: 1) mobile device color emulation for tablets and smartphones, 2) film emulation via the ability to apply 3D LUTs, and 3) support for the new Spyder4.

NEC's latest SpectraView II calibration software now supports the DISCUS colorimeter


X-Rite acquired by Danaher Corporation

X-Rite and the industry people we've spoken to are positive about this acquisition.

X-Rite ships the new generation i1Pro 2 spectrophotometer

The new i1Pro 2 is the successor to the legendary handheld spectrophotometer, the i1Pro. The i1Pro 2 will replace all i1Pro bundles moving forward (i1Basic, i1Photo, i1Publish, etc.). The i1Pro 2 has a new design, look and feel, and in fact recently won a Red Dot Award. Probably the most important new addition is support for different illuminants in one device. The i1Pro 2 includes three measurement conditions: M0 (Standard Illuminant A), the emerging M1 (illuminant D50) and M2 (known as UV-cut). The i1Pro 2 also has a new status LED, new diagnostics and self-correction features, plus a new self-cleanable aperture protection glass and white tile cover, and more.

i1Basic Pro 2 bundle, i1Photo Pro 2 bundle, i1Publish Pro 2 bundle

X-Rite released i1Profiler version 1.3.1 software update

Version 1.3.1's main new functionality is compatibility with the new i1Pro 2 device in the M0 and M1 measurement conditions. It also adds support for smaller patch sizes (down to 7mm), white point editing(!), improved CGATS reading and two new chart measurement workflows, corrects CMYK+4, and more. i1Publish software...

X-Rite has upgraded ColorPort to 2.0.5

to support the new i1Pro 2 device and i1Profiler 1.3.1 software, and to fix most known issues with Mac OS X 10.5, 10.6 32/64, 10.7 32/64 and Windows XP 32, Vista 32/64, Win7 32/64. Available free here.

2nd Generation iO Table from X-Rite

for the new i1Pro 2...

Forum Topics and other bits  Popular topics from our forums, and other things we've found along the way.

Andrew Rodney has a video overview of soft proofing in Lightroom 4. In short, he likes it. LR has quite a few features that make its soft proofing better than Photoshop's. It has a histogram of the image that updates with the profile being used. Paper simulation is handled better, more subtly. LR has the ability to save a "virtual copy" based on a soft proof that leaves the original alone. Finally, LR has out-of-gamut warnings for both the monitor profile and the output profile, and more.

We think anyone associated with the Graphic Arts industry will enjoy this article from Cary Sherburne. She looks back at the story of DX Imaging (a joint effort from Xerox and DuPont), which could have fundamentally changed the printing industry.

A sad note for those who use and love the Spectrolino/Spectroscan:   Parts, technical support and recalibration for the Spectrolino line will be discontinued by X-Rite in December 2013. It is expected that no other company will be licensed or authorized to support it further.

Moving outside of our profession (a bit):   here are two fun videos you might enjoy. They are about 6-7 minutes each, but we're confident the geek side of you will benefit: the Higgs Boson and Dark Matter, explained through animation.

Print Verification - Are your Proofs Bona Fide? An article by CHROMiX field specialist Terry Wyse

(All about print verification and what it may mean....or not) With proofing being a large percentage of my color management business, the topic of proof verification comes up quite often. I thought an article covering proof and print verification, and what it means.....and what it DOESN'T NECESSARILY mean, would be valuable. This article is intended both for those in print and proof production and for those receiving proofs from outside vendors. If you're receiving a proof and it has some sort of "pass/fail" label on it, it's critical to know what EXACTLY that pass/fail label means to you as the person accepting the proof.

I'll break things down into three categories of verification:

  • Proof verification (external)
  • Proof verification (internal)
  • Calibration verification

Proof Verification (external)

Let's talk about "external" proof verification first since this is probably the easiest one to define and likely what most people understand the term "proof verification" to mean when someone is handing them a verified proof.

Generally, a "verified proof" is meant to convey to the party receiving the proof that this proof has passed some sort of quality assurance test (measured with a spectrophotometer such as an i1Pro) with the implication that the proof you received was compared to some "standard" and was deemed acceptable to within some tolerance, usually a "delta e" tolerance (I'll refer to "delta e" as simply "dE" from now on). What's usually measured in this case is a small color bar that's included on the proof, usually something like the IDEAlliance 12647-7 Control Strip (see graphic) but other types of control strips can be used as well such as the FOGRA Media Wedge or similar. The point here is that the control strip should include, at minimum, both primary (CMYK) and secondary (RGB) colors plus tints. Most control strips will also include a selection of "memory colors" such as skin tones plus several steps of CMY neutrals and similar steps of K only....all-in-all a couple dozen patches of colors.

IDEAlliance 12647-7 Control Strip:

The first thing that's critical here is WHAT standard is being used to compare against? If the proofing is targeted to a standard print specification such as GRACoL, SWOP or one of the many FOGRA specifications, then the logical thing would be to compare the control strip colors against the standard colorimetric (L*a*b*) values specified by that standard. If you don't know what they are, you can visit the various "owners" of these specifications and usually they will publish exactly what the L*a*b* values should be for the control strip they support. If they don't, the values can usually be derived either from their standard characterization data set (ECI2002 or IT8.7/4 data sets) or via an ICC profile made from their standard data set (this is easy to do in ColorThink Pro via a Worksheet). It's generally understood in this scenario that you'll be using some form of "absolute colorimetric" rendering since you're comparing directly to an external standard which, by definition, would include the paper white point of that standard.

So the proof is printed, including the control strip, and then measured/compared either in software made specifically for proof verification or you can even do it by hand in an Excel spreadsheet (this is rather clumsy and some of the dE formulae used for comparing are not for the faint of heart or weak of stomach!). Once verified, typically a small adhesive "pass/fail" label is printed and affixed to the proof to show the person receiving the proof that it's "all good" and can be trusted to represent the final printed job, assuming the press run is targeted to that same printing specification (it doesn't do anyone any good if you produce the perfect GRACoL proof only to have the job printed via web offset on a #5 press stock...they won't match!).
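To make the comparison concrete, here is a minimal Python sketch of what verification software (or that Excel spreadsheet) is doing under the hood. The patch names, L*a*b* values and the average/maximum tolerances are all hypothetical, for illustration only; a real verification would use the full control strip and the tolerances published for the specification being targeted:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE 1976 color difference: straight Euclidean distance in L*a*b*."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

def verify_proof(measured, reference, avg_tol=3.0, max_tol=6.0):
    """Compare measured control-strip patches against reference L*a*b* values.

    Returns (passed, avg_dE, max_dE). The 3.0 average / 6.0 maximum dE76
    tolerances are illustrative placeholders, not values from any standard.
    """
    des = {name: delta_e_76(measured[name], reference[name]) for name in reference}
    avg_de = sum(des.values()) / len(des)
    max_de = max(des.values())
    return (avg_de <= avg_tol and max_de <= max_tol), avg_de, max_de

# Hypothetical reference values for a few control-strip patches (NOT real
# GRACoL/SWOP numbers) and a spectro measurement of the same patches:
reference = {"cyan": (55.0, -37.0, -50.0), "magenta": (48.0, 74.0, -3.0), "paper": (95.0, 0.0, -2.0)}
measured  = {"cyan": (54.2, -36.1, -49.0), "magenta": (48.5, 73.0, -2.5), "paper": (95.3, 0.4, -1.8)}

passed, avg_de, max_de = verify_proof(measured, reference)
print("PASS" if passed else "FAIL", round(avg_de, 2), round(max_de, 2))
```

The pass/fail result is what ends up on that little adhesive label; the average and maximum dE numbers are worth keeping as well, since they show trends long before a proof actually fails.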

Proof Verification (internal)

So far so good......but let's say you're really not interested in comparing your proof to some "absolute" external standard but are more interested in proof consistency as opposed to accuracy. If that's your goal (and it's equally valid in my opinion), then your "standard" becomes the proof that you agreed was a good visual match to your reference when the proofing system was first installed and profiled. I call this "internal" proof verification. The simulation in your proofing system could still be based on a specification such as GRACoL, but you simply want to verify against your proofing system's interpretation of that specification and monitor your proofing system's consistency.

Print your "stake in the ground" proof (include a control strip) that everybody agrees is a good proof and then measure the control strip, the same one you'll use later for verification. This measurement is established as the standard that all subsequent proofs will be compared against. Usually the verification software you're using will accommodate custom standards or at least will give you a way of manually entering the L*a*b* values from your "golden" proof into the software.

Calibration Verification

A third option is similar to the internal verification above. I call it calibration verification. In this case, instead of verifying against your proof's color-managed interpretation of a standard specification, you want to verify that your proofing system's calibration (ink limiting and linearization generally) is consistent. The major difference here is that you print a control strip with color management DISabled but with calibration (linearization) ENabled. Again, if you're primarily interested in consistency from proof-to-proof, this method has several advantages:

You're verifying proof consistency using the entire color gamut of the printer, not just a dumbed-down version that's been run through a profile conversion ("color-managed"). By testing/comparing using the entire color gamut of the printer, your proof verification should pick up on proof consistency issues sooner than a color-managed verification would (a color-managed verification could actually "mask" or hide proof consistency problems).

Since a printer's calibration is generally common for all color-managed conversions on that media, you only need to perform a single proof verification. On the other hand, if you verify to a proof standard and use several production proofing simulations (GRACoL, SWOP, Uncoated, etc.), you may need to establish proof verification parameters for all those proofing simulations....even though they all use the same basic calibration parameters. "Calibration" verification eliminates the need for separate verification of each standard, assuming your production proof standards share the same media calibration.

Disadvantages to this method? Yes, a couple:

  • Some proofing systems require that the same color management applied to the images be applied to the control strip. Only a few of the high-end systems allow you to include a control strip on the proof and print it without color management.
  • It's possibly overly sensitive: it may lead you to overreact to a calibration issue that would never show up visually on a proof.

No matter which of these methodologies you employ, you'll likely be setting a tolerance based on dE....but it's important to know WHAT dE formula is appropriate (there are several) for these different scenarios. Since a thorough discussion of the different dE calculation methods is really beyond the scope of this article, I'll focus on the methods most commonly employed and how they differ.

The first and most common dE calculation is called dE 1976, or simply dE76 for short. dE76 is a very simple calculation: it is just the straight-line mathematical distance between two colors in L*a*b* space:

dE76 = √( (ΔL*)² + (Δa*)² + (Δb*)² )

It's important to note that the mathematical difference between two colors is not the same as the visual difference. In other words, a dE76 difference of "2" between two yellow colors would not elicit the same visual response as that same difference between two blues. You would likely see a visual difference between the two blues but perhaps not perceive any difference at all between the two yellows, even though both pairs have the same dE76 color difference.

Fortunately, we have another dE calculation that is more relevant to visual differences, not just mathematical ones. That formula is called dE 2000 or simply dE00. The beauty of this formula is that it more accurately accounts for how we humans respond visually to various colors across the spectrum. With dE00, a difference of, say, "2" elicits roughly the same visual response no matter what color pairs you're comparing. In colorimetric terms, dE00 is more "sensitive" to hue and lightness differences as opposed to chroma or "saturation" differences. These explanations oversimplify the differences between these two dE calculation methods but I think you get the idea.
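For the curious, dE00 is considerably more involved than dE76. Below is a straight Python transcription of the published CIEDE2000 equations (with the parametric factors kL = kC = kH left at their default of 1). In practice you would rely on your verification software's implementation rather than rolling your own; this is just to show where the extra "visual" weighting comes from:

```python
import math

def delta_e_2000(lab1, lab2):
    """CIEDE2000 color difference, with kL = kC = kH = 1."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    C1, C2 = math.hypot(a1, b1), math.hypot(a2, b2)
    C_bar = (C1 + C2) / 2
    # Rescale a* near the neutral axis
    G = 0.5 * (1 - math.sqrt(C_bar ** 7 / (C_bar ** 7 + 25 ** 7)))
    a1p, a2p = a1 * (1 + G), a2 * (1 + G)
    C1p, C2p = math.hypot(a1p, b1), math.hypot(a2p, b2)
    h1p = math.degrees(math.atan2(b1, a1p)) % 360
    h2p = math.degrees(math.atan2(b2, a2p)) % 360

    dLp, dCp = L2 - L1, C2p - C1p
    if C1p * C2p == 0:
        dhp = 0.0
    else:
        dhp = h2p - h1p
        if dhp > 180:
            dhp -= 360
        elif dhp < -180:
            dhp += 360
    dHp = 2 * math.sqrt(C1p * C2p) * math.sin(math.radians(dhp) / 2)

    Lp_bar, Cp_bar = (L1 + L2) / 2, (C1p + C2p) / 2
    if C1p * C2p == 0:
        hp_bar = h1p + h2p
    elif abs(h1p - h2p) <= 180:
        hp_bar = (h1p + h2p) / 2
    elif h1p + h2p < 360:
        hp_bar = (h1p + h2p + 360) / 2
    else:
        hp_bar = (h1p + h2p - 360) / 2

    # Hue-dependent weighting: this is where dE00 treats some hue regions
    # as visually more sensitive than others
    T = (1 - 0.17 * math.cos(math.radians(hp_bar - 30))
           + 0.24 * math.cos(math.radians(2 * hp_bar))
           + 0.32 * math.cos(math.radians(3 * hp_bar + 6))
           - 0.20 * math.cos(math.radians(4 * hp_bar - 63)))
    S_L = 1 + 0.015 * (Lp_bar - 50) ** 2 / math.sqrt(20 + (Lp_bar - 50) ** 2)
    S_C = 1 + 0.045 * Cp_bar
    S_H = 1 + 0.015 * Cp_bar * T
    d_theta = 30 * math.exp(-(((hp_bar - 275) / 25) ** 2))
    R_C = 2 * math.sqrt(Cp_bar ** 7 / (Cp_bar ** 7 + 25 ** 7))
    R_T = -R_C * math.sin(math.radians(2 * d_theta))  # blue-region rotation term
    return math.sqrt((dLp / S_L) ** 2 + (dCp / S_C) ** 2 + (dHp / S_H) ** 2
                     + R_T * (dCp / S_C) * (dHp / S_H))

# On the neutral axis the chroma and hue terms drop out, so a 1 L* step
# on a mid-gray comes out very close to 1.0:
print(delta_e_2000((50, 0, 0), (51, 0, 0)))
```

Note how the weighting functions S_L, S_C and S_H scale each difference term; that scaling is what makes a dE00 of "2" mean roughly the same visual difference regardless of which color pair you're comparing.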

(For more information on deltaE, see ColorNews #17: The Color Difference)

What's important here is that when exchanging "verified" proofs and using dE criteria as the tolerance, you need to know WHAT dE calculation is being employed. From the discussion above, it would seem obvious that if we're interested in comparing visual differences between proofs and press sheets that dE00 would likely make the most sense. Wish it were that simple! Unfortunately, there's a long history of using dE76 as the color difference metric for legacy reasons as well as because it's the simpler formula to use. If you see any dE tolerance specifications for print standards and specifications, you can almost be assured that they are using dE76 as the calculation method.

Here's where I come down on what dE method should be used for these different types of proof verification:

If you're using the "external" proof verification method and you're being asked to use the same dE tolerances as established by the standards bodies (IDEAlliance, Fogra), then you're pretty much stuck with using the "straight" dE76 method. But as the standards bodies transition from dE76 to more modern "visual" difference calculations such as dE00, by all means use these newer methods instead. If it's an "internal" or closed proof verification you're doing, I would suggest you use dE00 since this will alert you to any real visual differences that may be happening.

For "calibration" verification, I would likely stick to using dE76 since this tends to be the more sensitive method......I would rather have my calibration verification routine alert me to color problem before it actually becomes an issue on my production proofs. The dE76 method may be overly-sensitive where visual differences are concerned but when it comes to checking calibration, I would want to run a fairly tight ship. If you feel it's too sensitive, then simply adjust your dE tolerance criteria upwards a bit to give yourself a bit more calibration slack.

If you take only one thing away from all this verification and dE talk, it's that you need to communicate with your proof provider and 1) insist on some sort of proof verification and 2) understand exactly how they are verifying the proofs you receive, what standard (if any) they're being compared against....and what dE method they are employing. It would also behoove you to be able to verify the proofs you're receiving for yourself. The investment in hardware and software to do this yourself is minimal. CHROMiX plug: Talk to someone at CHROMiX about their excellent Maxwell system. With Maxwell, not only can you verify and share your proof/press verification data, but the entry fee is also extremely inexpensive compared to other stand-alone solutions. I've started using it myself with some of my customers to monitor their proofing systems remotely, and it's been extremely helpful.

(Terry Wyse is a well-known and recognized industry expert in color management. Terry is a G7 Certified Expert and provides press profiling and press optimization services. He also provides knowledge and services for pre-press, proofing and other related areas. Terry has a range of product familiarity too wide to list here. Finally, Terry is a valued partner of CHROMiX and is much appreciated.)

   To read this article with images in ColorWiki, click here

ColorNews Administration (feedback, subscriptions, etc.)

FEEDBACK and FAQs - ColorNews (this publication) has its own forum. Each issue of this newsletter tends to prompt responses from our readers, and we often don't have enough time to respond to everyone (sorry!). So we created a discussion area where anyone can ask questions, make suggestions, take issue with our prognostications or whatever. Come on by and have a chat!

SUBSCRIPTIONS - To unsubscribe from CHROMiX ColorNews, reply to this message with "unsubscribe" in the subject title. To subscribe, simply reply with "subscribe" in the subject title.

For previous ColorNews articles head to our ColorNews Archives

Entire Contents of CHROMiX ColorNews (c)2012 CHROMiX, Inc. CHROMiX, Maxwell, ColorThink, ColorNews, ColorSmarts, ColorGear, ColorForums, DisplayWatch and are trademarks of CHROMiX Inc. All other trademarks are property of their respective owners. CHROMiX ColorNews is intended as an informative update to CHROMiX customers and business associates. We are not responsible for errors or omissions. You may not copy or reuse any content from this newsletter without written permission from CHROMiX, Inc.