Friday, September 24, 2010

Manufacturing, Process Optimization and Product Lifecycle Management

This week I took advantage of some free webinars being offered about the food industry. I was really intrigued by the Product Lifecycle Management for the Food Industry webinar given by Greg Nutter of Lascom Solutions through the Prepared Foods webinar series.

Why was I intrigued? Well, some of the complaints you hear in manufacturing facilities are: you can never find the information you need (i.e., Sales won't give it to you), or the other business departments just don't understand. Or you may discover that the same product is made a different way at another facility. Or people are using different standards somewhere else! Or that no one kept track of how the process was set up when R&D or Sales did a last-minute trial run.

Imagine being able to have all this information easily available and set up so that a change in one spot initiates notices wherever else it may have an impact. Imagine being able to get your product to market faster because information is centralized. Wouldn't it be great to be able to see why things are being done the way they are, and to identify ways to improve and cut costs?

Check out the presentation on Product Lifecycle Management. The concept would fill many people's wish lists.

As this will only be archived for a limited time, send me an email if you would like a copy of the podcast: gemma@edgeofcontrol.ca

Tuesday, September 21, 2010

Don't Touch That Dial - Part 2

Last time I discussed accurate data and how it affects the way we react to our process outputs. This week we will look at interpreting the story those outputs tell us. Without the right story and context, manufacturing optimization will not occur effectively.


A lot of the time the data comes to us as numbers. Some people can read numbers very well when they are presented in tabular form. Me, I prefer a picture. The best way to see that picture is in a graph. On the Y-axis we typically put the outputs of our process – kg produced, hours of uptime, % efficiency, etc. On the X-axis go our inputs, with time being the most common. If you are trying to decide between an input and time, ask yourself: which one changes? If a setting is always held constant, you don't want it on your X-axis. Time can be days, hours, minutes, shifts or even batches or lots.
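For anyone who likes to tinker, here is a minimal sketch of building that kind of chart yourself – this assumes Python with matplotlib, and the shift numbers are made up purely for illustration:

```python
# A simple run chart: production output (kL) per shift on the Y-axis,
# the day/shift sequence on the X-axis. Numbers are invented for illustration.
import matplotlib.pyplot as plt

shifts = ["Mon-D", "Mon-N", "Tue-D", "Tue-N", "Wed-D", "Wed-N",
          "Thu-D", "Thu-N", "Fri-D", "Fri-N"]
production_kl = [72, 55, 90, 48, 81, 63, 95, 40, 77, 66]

plt.plot(shifts, production_kl, marker="o")
plt.xlabel("Day / Shift")
plt.ylabel("Production (kL)")
plt.title("Run Chart: Production per Shift")
plt.show()
```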

Charts can tell you a lot, but only if you have done them correctly. For now we will focus on run charts, spec charts and control charts. Below are three graphs with identical production information – but each tells a different story.


Figure 1 is a run chart. The shift production rates are plotted against the day and shift. That's it. Is it good or bad? Who knows. The production rate varies from shift to shift, and it looks like some shifts were worse than others, but there is nothing to tell us whether any of it was actually bad.
Figure 1: Run Chart

Figure 2: Spec Chart

The next chart has the production goal overlaid on top. We will call this a spec chart because often what is overlaid is not a production goal but the specification limits the customer requires. Now we have some context and can see that some of the production was great, well over the goal, but some was bad, well below it. But can we expect our process to perform more consistently, with less variation?
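Chart-wise, the only difference from the run chart is the overlaid reference line. A quick sketch (same made-up shift numbers as before, with an assumed goal of 85 kL per shift – your goal or spec limit would obviously be your own):

```python
# Spec chart: the run chart with a production goal (or customer spec limit)
# drawn on top. Data and the 85 kL goal are invented for illustration.
import matplotlib.pyplot as plt

shifts = ["Mon-D", "Mon-N", "Tue-D", "Tue-N", "Wed-D", "Wed-N",
          "Thu-D", "Thu-N", "Fri-D", "Fri-N"]
production_kl = [72, 55, 90, 48, 81, 63, 95, 40, 77, 66]
goal_kl = 85

plt.plot(shifts, production_kl, marker="o", label="Production (kL)")
plt.axhline(goal_kl, color="red", linestyle="--", label="Production goal")
plt.xlabel("Day / Shift")
plt.ylabel("Production (kL)")
plt.title("Spec Chart: Production vs. Goal")
plt.legend()
plt.show()
```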

Figure 3: Control Chart

Figure 3 removes the production goal and instead shows the mean (average) production rate for the whole time period. It also introduces an Upper Control Limit (UCL) and a Lower Control Limit (LCL). According to those limits, there is so much variation in the process that a day with an unplanned production of 0 KLiters is about as likely as a day with 150 KLiters of production. Therefore, we should look at our process not for one-off things to change (i.e., train Operator B to do CIP properly) but for improvements that apply across all shifts, all the time.
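For the curious, one common way to calculate those limits for an individuals chart is the mean plus or minus 2.66 times the average moving range. Here is a hedged sketch using the same invented numbers (the figures in this post may well have been produced with a different method or software package):

```python
# Individuals-chart control limits: mean +/- 2.66 x average moving range.
# The production numbers are invented for illustration.
production_kl = [72, 55, 90, 48, 81, 63, 95, 40, 77, 66]

mean = sum(production_kl) / len(production_kl)
moving_ranges = [abs(b - a) for a, b in zip(production_kl, production_kl[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * avg_mr          # Upper Control Limit
lcl = max(mean - 2.66 * avg_mr, 0)  # Lower Control Limit (production can't go below 0)

print(f"Mean = {mean:.1f} kL, UCL = {ucl:.1f} kL, LCL = {lcl:.1f} kL")
```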
Figure 4: Control Chart for Fine Tuned Process (with Production Goal)

Say we managed to fine-tune the process and ended up with Figure 4. Wow, a lot less variation, but without any special events occurring we would never make the production goal. On one shift, though, something happened (a special event, e.g., a power outage) and production really fell off. On another shift, production was phenomenal, exceeding the control limits AND the Production Goal. Again, this is a special event, so we SHOULD be able to find out what happened differently and see how we can adjust the process to be like that all the time.
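If you want the data to flag those special events for you, a simple check against the control limits will do it. A sketch, where the limits and shift numbers are assumed values rather than anything taken from the figures:

```python
# Flag shifts outside the control limits - candidates for a special-cause
# investigation (power outage, record run, etc.). Values are illustrative only.
shifts = ["Mon-D", "Mon-N", "Tue-D", "Tue-N", "Wed-D", "Wed-N",
          "Thu-D", "Thu-N", "Fri-D", "Fri-N"]
production_kl = [68, 70, 67, 71, 69, 30, 70, 68, 96, 69]  # a fine-tuned process
ucl, lcl = 78.0, 60.0

for shift, kl in zip(shifts, production_kl):
    if kl > ucl or kl < lcl:
        print(f"{shift}: {kl} kL is outside the control limits - investigate")
```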

Context is critical when looking at data. And that context has to reflect the process itself and its capabilities, not just the goals of the business. The goals will put emphasis, pressure and criticality on the process, but if it cannot fundamentally perform that way, you have to do a complete rethink of what you want.

To put it another way: if you put a bowl of cookies on top of an 8-foot wall and tell your six-year-old to jump and grab one, they will try again and again, getting closer and closer, but they will never reach it. That is, until they get a ladder, find a taller friend or start throwing things to knock the bowl off. Does your data tell you it is time to get a ladder?

Thursday, September 9, 2010

Don’t Touch That Dial – Part I

Many of us have the luxury of processes with data. We can see how fast a line is running, find out production rates, product weights or counts of final inventory. We can take this data and manipulate it into different forms to tell us different stories. We can then be proactive or reactive in our response.

Our favourite way to respond is often to turn that dial.  There is little more satisfying than to make a change and see it instantly in the process.  But did that change make things worse?  Did it just mask the true problem?  Was there a downstream effect not anticipated?

Knowing when to touch that dial and, more importantly, when NOT to touch it, can be the difference between stellar performance and a truckload of rejected material.

Let's consider the factors that decide whether or not to adjust the process. The first is data integrity (accuracy, timeliness and relevance), and the second is interpreting the story the data is telling.

This post will consider data integrity, and the next will go into the storytelling.

Is Your Data Accurate?

When considering accuracy, there is an instrument factor, a human factor and an environmental factor. They can be considered separately or collectively. Interpreting the story, though, is going to depend on being able to trust the accuracy of all of them combined.

From an instrument perspective, calibration should be performed regularly and completely. The complication here is when the only time you doubt a reading is when it tells you something you don't like. Why retest only when the product is out of limits?

The human factor pertains to standardization. Does everyone read and record the instrument the same way? If the measurement is of a reaction to an input (i.e., rejecting or accepting a product), has everyone followed the same standards? For example, if a label is missing, does the person add the missing label back and consider it 100% accepted? Or do they add it but track an error? If they reject the product, do they reject only the one item or the whole box it was in? While it is good to have people automatically correct the problem, that lack of data could mask the real issue while also increasing the manual labour involved in the process.
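One lightweight way to have your cake and eat it too – fix the problem on the spot AND keep the data – is to log the correction with a reason code. A hypothetical sketch, where the field names and file are mine and not from any particular system:

```python
# Hypothetical defect log: the operator still fixes the missing label,
# but the event gets recorded so the data isn't lost.
import csv
from datetime import datetime

def log_defect(path, item_id, defect, action):
    """Append one defect record to a CSV log."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), item_id, defect, action])

# The package is re-labelled AND the event is still counted.
log_defect("defect_log.csv", "BOX-0417", "missing label", "re-labelled, accepted")
```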

The effects of the environment can be subtle in their impact on taking measurements. Raw pork is often graded on a colour scale as to whether it is pale or dark (National Pork Producers Council Color Standards, http://www.porkstore.pork.org/producer/default.aspx?p=viewitem&item=NPB-04036&subno=&showpage=4&subcat=8). If the lights are changed from fluorescent to incandescent, or moved closer to or further from the sample, the evaluation of the meat can change significantly.

Is Your Data Timely?

Not everyone operates a continuous flow process. In fact, very few processes could be considered true continuous flow, especially in the food industry, where a sanitation step needs to be included. The opposite of continuous flow is batch flow. When it comes to timeliness of information, getting the results only after the batch is complete is often too late. In fact, sometimes those results don't come until the product has actually reached the customer.

Sometimes the final release of a product has to come a while after the batch is over. Chocolate bloom can take several days to literally surface. Quarantining the product while waiting for that release, though, can be costly. And while a quality issue requires immediate action, turning that dial may not be the right one.

Yields are often calculated after the fact, which again, in a batch scenario, is too late to react to by turning a dial. Your next batch, especially with variable raw materials such as meat, milk or vegetables, may require completely different settings for entirely different reasons.

Relevance

That timeliness works hand in hand with the relevance of the data. HACCP can dramatically help in identifying which points of the process are critical from a food safety perspective, but those points may not be the right control parameters for process optimization. When deciding what to watch for optimization, you have to consider which parameters can be changed and how easily it can be done. Are they truly control parameters or just indicators?

As a final comment on understanding the relevance of a measurement, I am reminded of a time when I worked with an extrusion process. The "control" point was considered to be the exit temperature. A key control of that exit temperature was the upstream temperatures in the extruder. So much focus was spent on keeping the exit temperature steady that several weeks went by before anyone realized that every time it moved so much as 0.5 °C, the operators were changing the upstream process to get it back in line. The upstream section ended up operating under dramatically different conditions, as evidenced both downstream, in property changes to the product, and in the extruder itself when it was opened and broken screw elements were discovered. Does maintaining your control point mask other issues?

Part two: Interpreting the story behind the data – stay tuned!