Our very own Bruce Bayne was recently featured on the IDEAlliance GAMUT Printing and Packaging podcast. The episode focuses on new
Ready to listen in? Check it out now.
Calibration of a single printing device is not always an easy task, and matching multiple printers to one another is an even bigger challenge. One question that comes up frequently, especially with the rise of digital printing, is: “What is the best way to profile multiple devices of the same model?” If you are trying to achieve a close visual match between printing devices, there are three key things to consider before putting ink on the sheet:
1) Printer gamuts have to be reasonably close between devices. This has a lot to do with substrate texture and ink texture: rough textured media and/or UV inks scatter more light because of their roughness than solvent inks printed on glossy substrates.
2) It is necessary to evaluate more than just the worst ∆E value. You need to know how all the patches in a control strip compare in ∆E, not just the worst or the average. When choosing a control strip, the more patches the better, as long as the chart doesn’t become too large for practical daily use. The more patches under 1 ∆E, the more likely the printing is visually close, because you are comparing every patch in the strip between devices and ranking them on visual closeness.
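To make the point concrete, here is a minimal sketch of comparing every patch in a strip between two devices. The Lab readings and patch names are hypothetical, and for simplicity it uses the plain CIE76 delta-E formula (production tools typically use a weighted formula such as ∆E2000):

```python
# Illustrative sketch: compare every patch in a control strip between two
# devices using CIE76 delta-E, then summarize the whole distribution rather
# than just the worst value. All Lab readings below are hypothetical.
import math

def delta_e76(lab1, lab2):
    """Euclidean distance between two CIELAB values (CIE76 delta-E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical Lab readings for the same strip measured on two printers
printer_a = {"C": (55.0, -37.0, -50.0), "M": (48.0, 74.0, -3.0), "50k": (63.0, 0.2, 0.1)}
printer_b = {"C": (55.6, -36.5, -49.2), "M": (47.1, 75.2, -2.4), "50k": (62.5, 0.4, -0.3)}

deltas = {name: delta_e76(printer_a[name], printer_b[name]) for name in printer_a}
worst = max(deltas.values())
avg = sum(deltas.values()) / len(deltas)
under_1 = sum(1 for d in deltas.values() if d < 1.0)
print(f"worst dE: {worst:.2f}, avg dE: {avg:.2f}, patches under 1 dE: {under_1}/{len(deltas)}")
```

The count of patches under 1 ∆E is the figure of merit here: it tells you how much of the strip is visually close, which the worst or average value alone cannot.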
3) You can’t compare to an industry reference, like GRACoL, when visually comparing devices. You have to compare one device as the reference to the other, because that’s what you’re looking at in the viewing area. You can’t see GRACoL, as there is no perfect GRACoL proof, but you certainly can see the difference between printer A and printer B, so make printer A the reference when comparing those two devices. With grouping tests, you can then compare multiple devices against a single reference device.
Tight calibration of the device, and the ability to truly recalibrate it back to the same known state it was in when profiled, are key. In my experience, the automated “recalibration” process does not always work well in the field. Some RIPs are better than others, but the bottom line is that for true recalibration to work successfully it has to be a two-part process. First, you have to achieve the same solid ink value that was in the original calibration, and second, you have to then create the same curve along the values between 0% and 100%. Most RIPs do the latter, but few actually do the former during the automated recalibration process. This is important, because if you can’t fully recalibrate the printer, the original profile is eventually going to be too far off the mark to be useful.
Also consider: very rarely do two of the exact same devices that are exactly the same age print the same color right out of the box. I’ve proven this many times when evaluating color output data during calibration sessions. There is no way to successfully use a single profile for multiple devices that aren’t even close and achieve a tight visual match. My advice is to target the same source reference space (GRACoL as an example) for each device, then calibrate and profile each device as carefully as possible to achieve as tight a match as the RIP can provide to the source reference space. When finished, you can compare how close each device is to one another by printing a test chart and comparing the measured results. That being said, RIPs that have iterative optimization have a much better chance of achieving a tight calibration between multiple devices than RIPs that can only rely on ink limits, linearization, and ICC profiles alone.
You certainly can and should run comparison tests between all your devices (ideally on a single substrate all devices can print on) to identify which devices are the closest to one another and group them accordingly. The point here is to get to know each and every device (its gamut, how consistently it prints, etc.). Maybe you get lucky and find several devices that actually are close enough to calibrate using a single profile. Only after going through the process of calibration and evaluating the results can you truly know the color capability of each device.
I have installed many pairs of Epson aqueous printers and have never found two that calibrate or profile the same; however, following the process described above will get them to the closest possible visual match.
SpotOn! Verify is the ideal tool for comparing the calibration results of each printing device. SpotOn! Analyze is the ideal tool for setting ink limits and examining the color differences between each printing device. Try them for yourself!
Last year I was working with a well-established company in Pennsylvania that specializes in ‘museum quality’ art and photography books. They were considering several titles for their first Asia production attempt, but the company was concerned that the quality their reputation was built upon would not be upheld by the new contract facility. Additionally, they wanted to implement a color control and monitoring process at their US factory.
The company’s ultimate objective was to bring the proofing of both facilities to within 90%+ of GRACoL 2013 target values using the SpotOn! Visual Match Scorecard. At press we matched the printing color to these accurate proofs. Achieving this goal would ensure their customers continued to receive the same high quality products regardless of where these items were printed, and the company would be able to efficiently control color in two facilities nearly 8,000 miles apart.
This was a job for color experts, so we asked Bruce Bayne from Alder Technologies to spend several days in the Pennsylvania facility and calibrate their new Epson SC9900 proofer. He also installed SpotOn! Verify to bring their quality monitoring and QC procedures to the next level. Moving forward, Verify would be used to control the print process by monitoring and tracking the consistency and accuracy of proofs made in the US and abroad.
Next, the highly skilled Cathay America team used SpotOn! Verify to calibrate and monitor the Epson proofers at the printer’s new facility in Shenzhen, China. Verify’s Visual Match feature guided our calibration work and helped ensure proofers on two continents would match accurately.
Finally, the US based press was calibrated, and the Cathay America team calibrated the presses in China. Both were able to achieve GRACoL2013/CGATS/CRPC6 target values. This was the last step to bring all of the proofing and production devices into alignment.
The next step was to implement a global monitoring and QC assurance system (process control) that would allow our customer to achieve the same high quality presswork over time, regardless of where their books were printed.
Thanks to the support of Bruce Bayne, our China team, and SpotOn! Verify, our customer was deeply impressed by the quality of their first prints from Asia. Both the US client/printer and the photographer whose work we reproduced were thrilled with the quality of the images in this very impressive book. Most importantly, our US client was confident that he could print his high quality museum editions at our China facility.
Since this first successful printing last year, many more titles are now being produced with excellent color. This was possible because of our process control program enabled by SpotOn! Verify.
The details of our method are below.
Using SpotOn! Verify to monitor and control presswork.
The printer had 10 fairly new, well-maintained Komori presses, each with an Intellitrax scanning spectrophotometer. But the Intellitrax software was several generations old and didn’t report anything more than solid ink densities (SID).
This limited color data was simply not enough information to tightly monitor production printing and achieve the accurate and consistent color control that we required. The printed images were quite well known, and our client needed assurance the original images could be reproduced accurately.
We decided to print on a slightly larger press sheet so that we could add a second set of color bars at the trailing edge of the sheet. One control strip was placed in each of the four alleys of the book pages, just below the Intellitrax scanner’s control strip. These 18-patch, custom color control strips were designed to be scanned by an i1Pro2. After scanning, the measurement data was sent to SpotOn! Verify.
Press proofing key images before the production run. Notice the SpotOn! Version 2 color control strips that run vertically. This allowed SpotOn! Analyze to measure and display comprehensive printing data.
If the press has shifted too far out of spec, the ink can be leveled accurately by scanning the press color bar with X-Rite’s Intellitrax system. The next step is to scan the custom color control strips in each of the four alleys with the i1Pro2 and SpotOn! Verify to monitor compliance with CGATS21/CRPC6 target values.
Verify clearly displays the data from each scan for the operator to evaluate compliance with the CGATS21/CRPC6 target values. If the press is out of compliance, the operator can take further corrective action to bring the press back into tolerance before running color-critical jobs.
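The pass/fail logic the operator relies on can be sketched in a few lines. This is only an illustration of the idea: the target Lab values, the single 3.0 ∆E tolerance, and the CIE76 formula below are hypothetical stand-ins, not the actual CGATS21/CRPC6 specification numbers or the metrics SpotOn! Verify uses.

```python
# Minimal sketch of a per-patch pass/fail compliance check against target
# Lab values. Targets, readings, and the tolerance are all hypothetical.
import math

TOLERANCE = 3.0  # hypothetical per-patch delta-E limit, not a spec value

targets = {"C": (56.0, -37.0, -50.0), "M": (48.0, 75.0, -4.0), "Y": (89.0, -4.0, 93.0)}
measured = {"C": (55.2, -36.1, -49.0), "M": (47.0, 76.8, -2.1), "Y": (88.4, -3.6, 92.2)}

def delta_e76(lab1, lab2):
    """Euclidean distance between two CIELAB values (CIE76 delta-E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Collect every patch whose delta-E to its target exceeds the tolerance
failures = {p: de for p in targets
            if (de := delta_e76(targets[p], measured[p])) > TOLERANCE}
if failures:
    print("OUT OF TOLERANCE:", ", ".join(f"{p} ({de:.1f} dE)" for p, de in failures.items()))
else:
    print("All patches within tolerance; OK to run color-critical work.")
```

Reporting the failing patches by name, rather than a single pass/fail flag, tells the operator which ink zones need corrective action.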
Great color is within reach, and it can be maintained efficiently over time. What it takes is the desire to control the process.
– Joseph J. Pasky, Shenzhen
Patches: C, C50, M, M50, Y, Y50, Red, Green, Blue, 3/c Black, 100k, 25k, 25cmy, 50k, 50cmy, 75k, 75cmy, paper white
Recently, a color process control manager at a large print production facility wanted to know if there is a more comprehensive chart available for daily digital color evaluations than a 12647-7 proofing wedge. He pointed out the IT8.7-4 has too many patches, and the P2P51 has too many gray finder patches. Reiterating a thought we’ve all had many times, he asked: “Am I overthinking the value of additional patches?”
There is a tradeoff between patch count and how effective a chart is at gathering QC information, and there is something to be said about both extremes. Too many patches on a noisy (grainy, low screen ruling, etc.) printing device can introduce unwanted noise into the measurement data (like using a 1-pixel eyedropper setting in Photoshop to determine the dot percentage in a noisy image). Too few patches and you are not sampling enough colors to accurately model how the device is printing.
I just dissected the TC3.5 patch set and found it lacking in 3-color grays. There are few such patches, and none are G7-compliant gray patches. In my opinion, this eliminates the TC3.5 for any G7 evaluation. In fact, most of the currently available charts are not very good in the gray areas, especially if you are trying to evaluate G7 compliance. Idealliance built the TC1617 to address this lack of G7 gray patches in the IT8.7-4, but even this chart has too many patches for day-to-day evaluations.
The 3-row 2013 12647-7 chart (the replacement for the 2009 2-row chart) was built as a very good compromise between patch count and patch value. It has a decent number of patches to effectively evaluate print consistency, which includes G7 compliant gray patches, the typical array of CMYKRGB tone ramps, pastel patches, saturated patches, and a good assortment of dirty patches. These dirty patches were purposely built with CMY values and then with 100% GCR values excluding the 3rd color and replacing it with K. This was done because many separations, especially those done with ink reduction products, are made with GCR these days. It’s hard to beat what’s in that 3-row, 84-patch control strip.
While considering charts and patch values, it’s almost more important to note the metrics and tolerances we place on these patches for conformance to specifications. If you look at the metrics we currently use for pass/fail, they are very CMYK printing press centric. Commercial print, specifically offset printing, has been at the forefront of most industry standards and best practice development. Therefore, much of the data gathering and evaluation is based on printing devices where C, M, Y, and K ink thicknesses are controllable by the operator. This means most metrics are tied to effective control of those ink thicknesses, which is largely irrelevant to the digital world.
We should be asking: “What are we passing and failing?”
For the G7 Colorspace metrics (currently the most stringent) we are evaluating:
These are not very visually oriented metrics and tolerances. So the big question to ask is: what are you evaluating with your chart, or more importantly, what metrics and tolerances are you using to evaluate it? For G7 you could just use a P2P and eliminate the gray finder patches (columns 6-12), because the metrics are really only focused on CMYKRGB solids and the gray patches.
Bottom line, if we are looking for print consistency, we need to look at establishing new metrics that truly help us determine how visually consistent a print is. After a great deal of research, I believe this should be based on a cumulative relative frequency model (CRF) that evaluates all colors in a chart. In a CRF model, each and every one of the patches is relevant to visual consistency and is being counted within the evaluation. I have found the 3-row control strip does an excellent job of evaluating visual print consistency when using CRF. I’ve also performed the experiment in live production many times and have continued to get feedback from users who say using CRF and the 3-row control strip is the best method they’ve found to evaluate visual consistency.
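The CRF idea described above can be sketched very simply: for each ∆E threshold, report what fraction of the chart's patches fall at or below it. The ∆E readings and threshold values below are hypothetical, chosen only to illustrate the calculation:

```python
# Sketch of a cumulative relative frequency (CRF) summary over per-patch
# delta-E values. Every patch contributes to the result, so the summary
# reflects the whole chart, not just the worst or average patch.

def crf(delta_es, thresholds=(0.5, 1.0, 2.0, 3.0)):
    """Fraction of patches at or below each delta-E threshold."""
    n = len(delta_es)
    return {t: sum(1 for d in delta_es if d <= t) / n for t in thresholds}

# Hypothetical delta-E readings for a control strip (shortened here)
readings = [0.3, 0.4, 0.6, 0.8, 0.9, 1.1, 1.4, 2.2, 2.8, 3.5]
for threshold, fraction in crf(readings).items():
    print(f"<= {threshold} dE: {fraction:.0%} of patches")
```

A print that keeps, say, 95% of patches under 2 ∆E run after run will look consistent, even if an occasional outlier patch would fail a worst-case test.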
If you would like to see the true power of CRF and real world metrics, try SpotOn! Verify. The trial is free, and our team will help you get started.
Long-time SpotOn! user and product dealer Joe Pasky at Cathay America offers insight into the value of process control…
Eight years ago, I was asked by Seagate Technologies (California) to help them G7 qualify the six printing plants they used in China and Thailand, which produced retail cartons for disk drives. They were having difficulty with variations in the color appearance of similar product images that were printed at different factories, even though the files and proofs for these images were created by a single color-house in the States. Several runs of cartons were rejected and had to be reprinted; this caused delays in the product release schedules and extra costs.
The original strategy for press checks on new products was to ask the printer to produce a press sheet at make-ready that matched the supplied proofs as closely as possible. This best match was approved and became the master reference sheet for printing all subsequent SKUs that used those particular brand colors and product images.
The problem with this approach was that the printed product images approved at one factory were sometimes different from identical product images produced at other Seagate printers.
As we began to G7 Qualify each of the printers and train them to ‘print to the numbers’, the color differences between the master reference sheets at the multiple printing plants were greatly reduced.
When traveling to the States a few months later, I visited a Best Buy store. I took particular notice of the display of Seagate disk drives; cartons that I had approved several months earlier. Looking at the same and similar cartons side-by-side, I was disappointed to see a noticeable variation in color and balance. I took photos of the UPC codes to identify the printer and began an investigation.
As it turned out, the operators were not being vigilant in monitoring the color during the press run. We implemented a new sampling procedure where a percentage of sheets from the press run were pulled and time-stamped. These were measured by the QC department and reports from each production run were sent to the Seagate China office for review. This procedure required extra effort and attention from the press operators and supervisors, but after several months of monitoring, the printing consistency was improved and variation reduced.
Monitoring the press sheets was effective, but we also wanted to sample the individual cartons after they were formed. But once the carton was die cut from the press sheet there was nothing to measure. The press colorbars were gone. To solve this problem, we built a 7-mm, 14-patch control strip that was hidden in the glue flap and another control strip placed in the tuck-flap of the box. The patches on this strip included two brand-color patches, paper-white, CMYK, 3-color black, and the G7 tone value targets: 25k, 50k and 75k and 3-color, ¼-tone, midtone and ¾-tone tints.
We had experience using SpotOn! Verify software to check digital proofs, and we realized we could also use it to scan the printed control strips. We took samples of finished boxes from the production runs and used SpotOn! Verify to quickly measure and record the color data. SpotOn!’s reporting function helped us build a quality report and history for each of the production runs.
The press operators also began using SpotOn! Verify during make-ready to measure the control patches to be sure that they were hitting their G7 target values and assure that they would pass the QC department’s ‘pass/fail’ criterion. It was an excellent tool that was very easy to use in production. We no longer use proofs as a color reference; we’re printing 100% ‘to the numbers’.
Seagate was quite pleased with the results. For the past six years, each of the factories has been using SpotOn! Verify to monitor color on press and in the QC department, and the cartons and images for each SKU have been an excellent match to one another. All the printing plants were printing product images that were accurate and consistent from SKU to SKU and run to run. Color variation between boxes produced at multiple factories was practically eliminated.
Using SpotOn! Verify to measure these simple hidden color control strips that incorporate G7 data points has made it possible to monitor and improve packaging color reproduction of finished folding cartons across multiple production facilities. Any brand hoping to improve color consistency of printed packaging across a global supply chain could use this approach to improve quality and color fidelity.
Joseph J. Pasky