Which Wedge Is Best?

Recently, a color process control manager at a large print production facility wanted to know if there is a more comprehensive chart available for daily digital color evaluations than a 12647-7 proofing wedge. He pointed out that the IT8.7-4 has too many patches, and the P2P51 has too many gray finder patches. Reiterating a thought we’ve all had many times, he asked: “Am I overthinking the value of additional patches?”

Great question!

There is a tradeoff between patch count and how effective a chart is at gathering QC information, and both extremes carry risks. Too many patches on a noisy (grainy, low screen ruling, etc.) printing device can introduce unwanted noise into the measurement data (like using a 1-pixel eyedropper setting in Photoshop to determine the dot percentage in a noisy image). Too few patches, and you are not sampling enough colors to accurately model how the device is printing.

I just dissected the TC3.5 patch set and found it lacking in 3-color grays: it has few gray patches, and none of them are G7-compliant. In my opinion, this eliminates the TC3.5 for any G7 evaluation. In fact, most of the currently available charts are weak in the gray areas, especially if you are trying to evaluate G7 compliance. Idealliance built the TC1617 to address the lack of G7 gray patches in the IT8.7-4, but even this chart has too many patches for day-to-day evaluations.

The 3-row 2013 12647-7 chart (the replacement for the 2009 2-row chart) was built as a very good compromise between patch count and patch value. It has a decent number of patches to effectively evaluate print consistency, including G7-compliant gray patches, the typical array of CMYKRGB tone ramps, pastel patches, saturated patches, and a good assortment of dirty patches. These dirty patches were purposely built with CMY values and then with 100% GCR values, excluding the third color and replacing it with K. This was done because many separations, especially those done with ink reduction products, are made with GCR these days. It’s hard to beat what’s in that 3-row, 84-patch control strip.


The 3 Row Control Strip with key patches highlighted.

While considering charts and patch values, it’s almost more important to note the metrics and tolerances we place on these patches for conformance to specifications. If you look at the metrics we currently use for pass/fail, they are very CMYK-printing-press centric. Commercial print, specifically offset printing, has been at the forefront of most industry standard and best practice development. Therefore, much of the data gathering and evaluation is based on printing devices where C, M, Y, and K ink thicknesses are controllable by the operator. This means most metrics are tied to effective control of those ink thicknesses, which is largely irrelevant to the digital world.

We should be asking: “What are we passing and failing?”

For the G7 Colorspace metrics (currently the most stringent) we are evaluating:

  • Substrate – Paper color is good to evaluate
  • Solid CMYK – Very useful to press operators, but not much of a typical image or job is just solid C, M, Y, or K. This makes these patches poor for evaluating digital print consistency, especially visual consistency.
  • Solid RGB overprints – In my opinion, this is more important than Solid CMYK, as overprinted colors are what we see when we look at printed material. Still, these are only the solids, no tints.
  • CMY gray balance and tone – This is very important in controlling and evaluating print consistency, although it’s more important in print processes that lay down individual CMYK inks like offset.
  • All the other patches (pastels, saturated colors, dirty colors, skintones, CMYKRGB tints) are lumped into a single metric called ‘All’ and given a whopping average ∆E of 1.5 or 2.0 and a worst-patch ∆E of 5.0 (95th percentile). That’s huge! A virtual barn door that lets almost anything outside of the grays and CMYKRGB solids pass.
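To see why the ‘All’ tolerance is such a barn door, it helps to sketch the arithmetic. The following is a minimal illustration, not the official G7 Colorspace calculation; the ΔE values are hypothetical, and in practice each value would be a color difference between a measured patch and its reference.

```python
# Sketch of an average-plus-95th-percentile pass/fail check, in the
# style of the 'All' metric described above. Hypothetical tolerances
# and patch values; not the official G7 Colorspace implementation.
def evaluate_all_metric(delta_es, avg_tol=1.5, worst_tol=5.0):
    """Pass if the average deltaE and the 95th-percentile deltaE are in tolerance."""
    sorted_es = sorted(delta_es)
    average = sum(sorted_es) / len(sorted_es)
    # "Worst patch" is taken at the 95th percentile, so the worst 5%
    # of patches are ignored entirely.
    idx = max(0, int(0.95 * len(sorted_es)) - 1)
    worst_95th = sorted_es[idx]
    return average <= avg_tol and worst_95th <= worst_tol

# A hypothetical 84-patch strip: most patches very close to reference,
# but four patches at a clearly visible deltaE of 9.0.
patch_delta_es = [0.5] * 70 + [4.9] * 10 + [9.0] * 4
print(evaluate_all_metric(patch_delta_es))  # passes despite the bad patches
```

Even with four patches at ΔE 9.0, the average stays under 1.5 and the 95th-percentile patch sits at 4.9, so the chart passes.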

These are not very visually oriented metrics and tolerances. So the big question to ask is what are you evaluating with your chart, or more importantly, what metrics and tolerances are you using to evaluate your chart? For G7 you could just use a P2P and eliminate the gray finder patches (columns 6-12), because the metrics are really only focused on CMYKRGB solids and the gray patches.

Bottom line: if we are looking for print consistency, we need to establish new metrics that truly help us determine how visually consistent a print is. After a great deal of research, I believe this should be based on a cumulative relative frequency (CRF) model that evaluates all colors in a chart. In a CRF model, every patch is relevant to visual consistency and is counted within the evaluation. I have found the 3-row control strip does an excellent job of evaluating visual print consistency when using CRF. I’ve also performed the experiment in live production many times, and users continue to tell me that CRF with the 3-row control strip is the best method they’ve found to evaluate visual consistency.
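The core idea of a CRF evaluation can be sketched in a few lines. This is only an illustration of the concept, with hypothetical threshold values and patch data; the exact CRF formulation used in SpotOn! Verify is not described here.

```python
# Minimal sketch of a cumulative relative frequency (CRF) curve over
# patch color differences: at each deltaE threshold, what fraction of
# ALL patches falls at or below it. Thresholds and data are hypothetical.
def crf_curve(delta_es, thresholds=(0.5, 1.0, 1.5, 2.0, 3.0, 5.0)):
    n = len(delta_es)
    return {t: sum(1 for e in delta_es if e <= t) / n for t in thresholds}

# A hypothetical 84-patch strip measurement:
sample = [0.4] * 60 + [1.2] * 20 + [4.0] * 4
for t, frac in crf_curve(sample).items():
    print(f"{frac:.0%} of patches at or below deltaE {t}")
```

Because every patch contributes to every point on the curve, no group of colors gets lumped behind a loose average; a handful of visibly-off patches shows up directly as a sag in the CRF curve.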

If you would like to see the true power of CRF and real world metrics, try SpotOn! Verify. The trial is free, and our team will help you get started.

Real User Stories: Process Control for Seagate


Long-time SpotOn! user and product dealer Joe Pasky at Cathay America offers insight into the value of process control…

Eight years ago, I was asked by Seagate Technologies (California) to help them G7-qualify the six printing plants they used in China and Thailand, which produced retail cartons for disk drives. They were having difficulty with variations in the color appearance of similar product images printed at different factories, even though the files and proofs for these images were created by a single color house in the States. Several runs of cartons were rejected and had to be reprinted, causing delays in the product release schedules and extra costs.

The original strategy for press checks on new products was to ask the printer to produce a press sheet at make-ready that matched the supplied proofs as closely as possible. This best match was approved and became the master reference sheet for printing all subsequent SKUs that used those particular brand colors and product images.

The problem with this approach was that the printed product images approved at one factory were sometimes different from identical product images produced at other Seagate printers.

As we began to G7-qualify each of the printers and train them to ‘print to the numbers’, the color differences between the master reference sheets at the multiple printing plants were greatly reduced.

When traveling to the States a few months later, I visited a Best Buy store. I took particular notice of the display of Seagate disk drives: cartons that I had approved several months earlier. Looking at the same and similar cartons side by side, I was disappointed to see noticeable variation in color and balance. I took photos of the UPC codes to identify the printer and began an investigation.

As it turned out, the operators were not being vigilant in monitoring the color during the press run. We implemented a new sampling procedure where a percentage of sheets from the press run were pulled and time-stamped. These were measured by the QC department and reports from each production run were sent to the Seagate China office for review. This procedure required extra effort and attention from the press operators and supervisors, but after several months of monitoring, the printing consistency was improved and variation reduced.

Monitoring the press sheets was effective, but we also wanted to sample the individual cartons after they were formed. Once a carton was die-cut from the press sheet, however, there was nothing to measure: the press colorbars were gone. To solve this problem, we built a 7 mm, 14-patch control strip hidden in the glue flap, and another control strip placed in the tuck flap of the box. The patches on this strip included two brand-color patches, paper white, the CMYK solids, 3-color black, and the G7 tone value targets: 25K, 50K, and 75K, plus the 3-color ¼-tone, midtone, and ¾-tone tints.

We already had experience using SpotOn! Verify software to check digital proofs, and we realized we could also use it to scan the printed control strips. We took samples of finished boxes from the production runs and used SpotOn! Verify to quickly measure and record the color data. SpotOn!’s reporting function helped us build a quality report and history for each of the production runs.

The press operators also began using SpotOn! Verify during make-ready to measure the control patches, confirming that they were hitting their G7 target values and would pass the QC department’s pass/fail criteria. It was an excellent tool that was very easy to use in production. We no longer use proofs as a color reference; we’re printing 100% ‘to the numbers’.

Seagate was quite pleased with the results. For the past six years, each of the factories has been using SpotOn! Verify to monitor color on press and in the QC department, and the cartons and product images for each SKU have been an excellent match across factories. All the printing plants were printing product images that were accurate and consistent from SKU to SKU and run to run. Color variation between boxes produced at multiple factories was practically eliminated.

Using SpotOn! Verify to measure these simple hidden color control strips that incorporate G7 data points has made it possible to monitor and improve packaging color reproduction of finished folding cartons across multiple production facilities. Any brand hoping to improve color consistency of printed packaging across a global supply chain could use this approach to improve quality and color fidelity.

Joseph J. Pasky

Cathay America

Shenzhen, China