Many of us have the luxury of data-rich processes. We can see how fast a line is running, find out production rates, product weights or counts of final inventory. We can take this data and shape it into different forms to tell us different stories. We can then be proactive or reactive in our response.
Our favourite way to respond is often to turn that dial. Little is more satisfying than making a change and seeing it instantly in the process. But did that change make things worse? Did it just mask the true problem? Was there an unanticipated downstream effect?
Knowing when to touch that dial and, more importantly, when NOT to touch it, can be the difference between stellar performance and a truckload of rejected material.
Let’s consider two factors that determine whether or not to adjust the process: first, data integrity (accuracy, timeliness and relevance); second, interpreting the story the data is telling.
This post will cover data integrity; the next will go into the storytelling.
Is Your Data Accurate?
When considering accuracy, there is an instrument factor, a human factor and an environmental factor. They can be considered separately or collectively. Interpreting the story, though, depends on being able to trust the accuracy of all three in combination.
From an instrument perspective, calibrations should be performed regularly and completely. The complication here is when the only time you doubt the reading is when it tells you something you don’t like. Why retest only when the product is out of limits?
The human factor pertains to standardization. Does everyone read and record the instrument the same way? If the measurement is of the reaction to an input (i.e. rejecting or accepting a product), has everyone followed the same standards? For example, if a label is missing, does the person add the missing label back and consider the product 100% accepted? Or do they add it but record an error? If they reject the product, do they reject only the one item or the whole box it was in? While it is good to have people automatically correct the problem, the missing data could mask the real issue while also increasing the manual labour involved in the process.
The environment can have a subtle impact on measurements. Raw pork is often graded on a colour scale for whether it is pale or dark (National Pork Producers Council Color Standards, http://www.porkstore.pork.org/producer/default.aspx?p=viewitem&item=NPB-04036&subno=&showpage=4&subcat=8). If the lights are changed from fluorescent to incandescent, or moved closer to or further from the sample, the evaluation of the meat can change significantly.
Is Your Data Timely?
Not everyone operates continuous flow processes. In fact, very few processes could be considered true continuous flow, especially in the food industry when a sanitation step needs to be included. The opposite of continuous flow is batch flow. When it comes down to timeliness of information, getting the results only after the batch is complete is often too late. In fact, sometimes those results don’t come until the product has actually reached the customer.
Sometimes the final release of a product has to wait until well after the batch is over. Chocolate bloom can take several days to literally surface. Quarantining the product while waiting for that release, though, can be costly. While a quality issue requires immediate action, turning that dial may not be the right one.
Yields are often calculated after the fact, which again, in a batch scenario, is too late to react to by turning a dial. Your next batch, especially with variable raw materials such as meat, milk or vegetables, may require completely different settings for other reasons entirely.
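To make the timing problem concrete, here is a minimal sketch of an after-the-fact yield calculation. The batch identifiers, weights and field names are all invented for illustration; the point is simply that the number only exists once the batch has closed, long after the dial could have helped:

```python
# Hypothetical batch records: yield can only be computed once a batch closes.
batches = [
    {"batch": "A-101", "raw_kg": 1000.0, "finished_kg": 912.0},
    {"batch": "A-102", "raw_kg": 1000.0, "finished_kg": 874.0},
]

def batch_yield(raw_kg, finished_kg):
    """Percentage of raw material converted into finished product."""
    return 100.0 * finished_kg / raw_kg

for b in batches:
    print(f"{b['batch']}: {batch_yield(b['raw_kg'], b['finished_kg']):.1f}%")
```

By the time these percentages print, both batches are already complete, so any "correction" they prompt is really an adjustment to a different batch made from different raw material.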
Relevance
Timeliness works hand in hand with the relevance of data. HACCP can dramatically help in identifying which points of the process are critical from a food safety perspective, but those points may not be the right control parameters for process optimization. When deciding what to watch from an optimization perspective, consider which parameters can be changed and how easily. Are they truly control parameters or just indicators?
As a final comment on understanding the relevance of a measurement, I am reminded of a time when I worked with an extrusion process. The “control” point was considered to be the exit temperature, and a key control of that exit temperature was the upstream temperatures in the extruder. So much focus was spent on keeping the exit temperature steady that several weeks went by before anyone realized that every time it moved so much as 0.5 °C, the operators were changing the upstream process to bring it back in line. The upstream zones ended up operating under dramatically different conditions, as evidenced both downstream in property changes to the product and in the extruder itself, where broken screw elements were discovered when it was opened. Does maintaining your control point mask other issues?
Part two, interpreting the story behind the data, is coming next. Stay tuned!