Design of Experiments

EARLY EXPERIENCES WITH DESIGN OF EXPERIMENTS
The well-known engineer and IEEE Fellow Samuel Keene, Ph.D., was fortunate to have a defining positive reliability experience in the 1970s with design of experiments (DOE), the methodology that underpins six sigma and design for six sigma (DFSS) processes and tools. He was involved with a team designing one of the first laser scanners. The active element was a HeNe laser, the type with a red beam that we see regularly in grocery store checkout lanes. Early HeNe lasers were chaotic in both their parametric performance and their demonstrated lifetimes. Up to that time, these lasers had been used only under laboratory conditions and at construction sites; there had been no emphasis on high-reliability lasers. The industrial application of a photo-scanner demanded higher reliability.
High reliability could be achieved only through testing and design changes, which would prove to be a great burden to the program. The test samples had a long delivery time and were expensive, about $5000 apiece. Their performance varied widely from sample to sample. There were differences in their starting voltage, their run current, and, important from a reliability standpoint, their lifetimes. These qualities are labeled key process output variables (KPOVs) in the six sigma vernacular. The life goal was 5000 hours before replacement. The laser samples exercised in initial tests exhibited lifetimes of 50 hours on average, with maximum lifetimes between 100 and 1000 hours across 50 samples. Further testing was conducted on new samples. When a laser sample was finally observed exhibiting stable characteristics across several thousand hours of operation, it represented proof that the desired lifetime and stability were obtainable. It proved the existence theorem: lasers could achieve the life goal. All the laser failures were analyzed thoroughly, and some subtle differences were found. Some lasers had 2024 aluminum cathodes; some had 6061 aluminum cathodes. Some were temper T4; some were temper T6. These lasers were built differently even though the purchase specifications and the vendor's datasheet did not identify or allow for such variations. These were simply variations that the supplier did not consider important and that had not been a factor for his customers up until that time. The vendor was not even aware of the subtle changes in his own suppliers' materials. Such input variables are potentially key process input variables (KPIVs). We want to understand the effect of these input variables on the laser's output(s) and to determine which of them are the key process input variables.

Before we go any further, it is important that the reader be familiar with the basic tenets of statistical analysis; if you feel you need to refresh your memory, here is a good statistics refresher.

Figure 1) JMP DOE

Teaming with the supplier, metallurgists, and failure analysis and development personnel, Keene's group identified the possible variations in process and materials in the product. These cross-functional meetings even identified other configuration changes that the manufacturer was considering. Twenty-five potential KPIVs were identified. This list was scaled down to the 13 variables judged most likely to influence laser performance. These process and product variations were all analyzed in a series of designed experiments. The first experiment was a screening experiment to find the more important KPIVs. This test simultaneously examined 10 possible input factors at two levels each (e.g., 6061 versus 2024 aluminum) and three other factors at three levels (e.g., high, medium, and low fill pressure). The variables were tested in the screening experiment at their extreme levels to see whether they had any impact on laser performance. The effects of these input variables were measured on three critical output variables: laser start voltage, laser current stability, and laser life. This initial screening narrowed the 13 KPIVs down to a set of six. Subsequent testing then established, and later validated, the optimum setting of each of these KPIVs to achieve the best KPOVs. Through the systematic effort of DOE, with design requirement enhancements and manufacturing process improvements, the laser scanner's performance over the entire population of lasers in its application went from chaotic to rock solid. The new laser scanners never caused a field problem.
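To make the screening step concrete, here is a minimal sketch of how a two-level screening design can be constructed, using the classic 12-run Plackett-Burman design (which accommodates up to 11 two-level factors). The factor mapping is a hypothetical stand-in; the actual experiment, which mixed two-level and three-level factors, would have needed a more elaborate mixed-level design.

```python
import numpy as np

# Generator row of the standard 12-run Plackett-Burman design (+1/-1 coding).
# Cyclic shifts of this row, plus a final all-minus row, give 12 runs whose
# columns are mutually orthogonal.
GEN = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

def plackett_burman_12() -> np.ndarray:
    rows = [np.roll(GEN, i) for i in range(11)]
    rows.append(-np.ones(11, dtype=GEN.dtype))
    return np.vstack(rows)

design = plackett_burman_12()

# Assign the first 10 columns to the 10 two-level laser factors, e.g.
# cathode alloy (2024 vs. 6061) or temper (T4 vs. T6) -- hypothetical mapping.
X = design[:, :10]

# Orthogonality check: X.T @ X is 12 times the identity, so every main
# effect can be estimated independently of the other nine.
assert np.all(X.T @ X == 12 * np.eye(10))

# Given a measured response y for each of the 12 runs (start voltage,
# current stability, or life), the main effect of factor j is the mean
# response at its high level minus the mean at its low level:
#     effect_j = X[:, j] @ y / 6
```

Because every run exercises all the factors at once, each main effect is estimated from all 12 runs, which is what lets a screening design examine many factors in a handful of builds of a $5000 part.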

Figure 2) Planning the Experiment

Conversely, too often an operating point is set based on a single sample or a single lot of samples from the supplier. Supplier part variation then shows up later, compromising product performance. The resulting redesign too often compromises the original design architecture and increases design complexity. Given this, DOE is surprisingly underutilized. One might ask why such a successful tool is not embraced more widely. It may be that developers:

  1. Don’t know of DOE.
  2. Don’t have the statistical skill sets or confidence to use it.
  3. Don’t want to expend the time and effort to design and conduct a formal experiment.
  4. Believe that the problem is trivial and that only one variable is involved (i.e., it is too simple for a sophisticated tool such as DOE). They want to treat the problem as a one-factor-at-a-time (OFAT) problem.
  5. Feel that their intuitive solution is adequate.

Yet in the end, DOE is the only way to analyze a multifactor problem with statistical confidence and reveal the factor interactions. Factor interactions are never uncovered by an OFAT approach, and they are often the predominant drivers of the effects measured. An interaction is present whenever some effect is conditional. For example, a question that illustrates interaction is: "Do you feel better with humidity added to your environment?" The answer is: "It depends." There is an interaction, but how and when? People will add humidity to their houses in the wintertime, when the house is heated to 66 to 70°F; humidity is undesirable under extreme cold or extreme heat conditions. An engineering example of factor interaction is the problem experienced with underinflated Firestone tires on the Ford Explorer, as related in Forbes on June 20, 2001 (see http://www.forbes.com/2001/06/20/tireindex.html). Blowouts occurred whenever all three conditions were present: (1) underinflated tires, (2) Firestone tires, and (3) a Ford Explorer. This is an interaction problem, and it led to the Firestone tire recalls of 2000.
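To see why OFAT misses interactions, here is a minimal sketch of a 2×2 full factorial with made-up response values, in which the response jumps only when both factors are at their high levels, a corner that OFAT never visits:

```python
import numpy as np

# Hypothetical 2x2 full factorial on factors A and B (coded -1/+1).
# The made-up response is large only when A AND B are both high,
# i.e., the effect of A is conditional on B -- an interaction.
A = np.array([-1, +1, -1, +1])
B = np.array([-1, -1, +1, +1])
y = np.array([10.0, 10.2, 10.1, 14.8])

def effect(f):
    """Mean response at the factor's high level minus at its low level."""
    return y[f == +1].mean() - y[f == -1].mean()

print(f"A main effect:  {effect(A):+.2f}")     # +2.45
print(f"B main effect:  {effect(B):+.2f}")     # +2.35
print(f"AB interaction: {effect(A * B):+.2f}")  # +2.25

# An OFAT study starting at (A=-1, B=-1) would see +0.2 from varying A
# alone and +0.1 from varying B alone, and would miss the +4.8 jump
# at (A=+1, B=+1) entirely.
```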


Figure 3) Custom Designers

Through DOE, the design variation can be explored and a proper, stable design point established, as was done for Mr. Keene's HeNe laser build. This became a collaborative exercise, with the supplier's designers and manufacturing people meeting with the system design and materials research staff from Keene's company. It was certainly a real-time learning experience for all participants. The device supplier came to better understand the system application's needs for its device. The system designers learned the build constraints on the laser as well as the potential future design and material changes under consideration. The research staff reported the changes and variation they had found in the laser samples, some of which were unknown to the laser supplier. So the DOE not only addressed the immediate design and material variations in the laser construction but also provided a platform from which to investigate future changes. DOE helped establish the optimum laser build design point by properly defining the materials and processes used to build it. This process yielded a product design that met all of its functional and reliability goals, along with an understanding of how to control potential product variations.
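As a rough illustration of that last step, establishing the optimum design point, here is a minimal sketch of fitting a main-effects model to follow-up data on the six surviving KPIVs and reading off the coded setting of each factor that maximizes predicted laser life. All designs, effects, and lifetimes here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented follow-up experiment: 16 runs over the six surviving KPIVs,
# coded -1/+1, with laser life (hours) as the response.
X = rng.choice([-1.0, +1.0], size=(16, 6))
true_effects = np.array([400.0, 0.0, -250.0, 0.0, 150.0, 0.0])  # invented
y = 5000.0 + X @ true_effects + rng.normal(0.0, 50.0, size=16)

# Least-squares fit of a main-effects model: life ~ intercept + factors.
M = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(M, y, rcond=None)

# Under a main-effects model, predicted life is maximized by setting each
# influential factor to the sign of its fitted coefficient.
settings = np.where(np.abs(beta[1:]) > 100.0, np.sign(beta[1:]), 0.0)
print("fitted effects:", np.round(beta[1:], 1))
print("suggested coded settings (0 = negligible factor):", settings)
```

In practice the chosen settings would then be confirmed with validation runs, as was done for the laser before the optimum build point was locked down.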


Figure 4) DOE Design Diagnostics

Figure 5) Bayesian D-Optimal Approach Example
