Showing posts with label Improve. Show all posts

Sunday, April 29, 2007

Six Sigma Answer to Material Shortages

One of Six Sigma's strengths is its facility for revealing causes and solutions that run contrary to our initial assumptions. When a persistent condition resists all attempts at improvement, or when an obvious fix to a newly discovered problem turns out to be lacking, a methodical approach like Six Sigma's can uncover even the most unlikely of causes and deliver results.

In the following case study the continuous improvement team was in for just such a surprise. Conventional wisdom was wrong, and the path the team started down hid unexpected complexities.

Definition

Overall performance of the XYZ Pump Garage program was poor. Future customer orders would not have been forthcoming without substantial improvements in quality and delivery.

  • On-time delivery was 80% vs. >99% goal.
  • Direct labor overtime was running 15% vs. a goal of zero.
  • Field reported defects were found in 50% of system shipments vs. 0.5% goal.
  • Project margin was approximately 22% vs. a 33% goal.

A process improvement team was formed with members from Customer Service, Manufacturing, Production Control, Engineering, Operations, and Purchasing.

Measurement

The initial majority team consensus was that the program's poor on-time delivery was the result of material shortages due to understaffing in Purchasing. More buyers seemed to be the probable solution. The team suspected that the field defects were principally a result of poorly trained assembly staff.

The team began daily monitoring of data for number of daily kit shortages, overdue suppliers, and daily purchasing workload based on Material Requirements Planning (MRP) demands. Field personnel were interviewed for detailed descriptions of field defect rework.

Briefly summarized, the data showed:

  • Typical labor overtime occurred near the end of the manufacturing process.
  • 100% of all kits were issued with shortages.
  • The key suppliers were >3 days late 50% of the time.
  • The MRP system was posting material demands inside the material lead times!
  • The requested delivery dates for material in the MRP system did not match well with project ship dates!
  • A majority of customer-reported defects appeared to be the result of incomplete or incorrect manufacturing documentation.

Analysis

  1. Overtime was being worked to make up lost time due to late material deliveries.
  2. Understaffing in Purchasing was not the problem! An army of buyers would not result in on-time material when the MRP 'buy' signal came too late or not at all. The team's true analysis problem was to understand why the MRP system was giving wrong signals. The team decided to focus on one specific sales order line item that exemplified the problem set for a typical system.

    What they found:
  1. The sales order was coded incorrectly in a fashion that would generate several MRP problems.
  2. Item master attributes were not properly populated for many of the material items that had MRP problems.
  3. Customer engineering change orders (ECOs) had been accepted without renegotiating product delivery dates with the customer to allow time for ECO implementation, including new material delivery.
  4. A check of other customer order line items showed similar problems.
  5. Customer ECO information was not being properly transmitted and propagated throughout the organization, resulting in out-of-date manufacturing instructions and field defects.
  6. These problems would not have occurred if program participants had properly followed the procedures and work instructions documented in the Quality Management System.

Improvement

The improvements we implemented can be summed up in one word: training. The company had grown significantly during the past year and while all employees had received training, it had sometimes been rushed or had not been completely absorbed by the new personnel. Mandatory training was scheduled immediately for all Customer Service, Engineering, and Operations personnel on the documented procedures for sales order entry, customer engineering change orders, creating item masters, and creating engineering masters. Retraining took 7 working days with approximately 30 personnel participating.

ERP data for all active purchase orders was audited for the most common errors the team had recently discovered. This process required 5 working days.

New delivery dates were negotiated with the customer's buyer based on the new solid data foundation. This was difficult, but fortunately the customer's buyer is a mature personality with a long-term partnership attitude.

Results:

  • Within 4 weeks material shortages had improved considerably.
  • On-time delivery reached 100%.
  • Overtime labor became negligible.
  • After 8 weeks there had been no field defects found in the 6 systems shipped in the prior 5 weeks.
  • Margin improved to 28%, but this needs further investigation.
  • Teamwork between organizations improved as a result of greater appreciation for the needs and complexities of their respective jobs.

Control

On-time delivery, customer field defects, and margin remain the bottom-line metrics for process control on the XYZ program. However, most importantly, as a result of the XYZ team findings, a new continuous improvement team was formed: the Enterprise Resource Planning Data Integrity Team (EDIT). EDIT is tasked with developing a set of strategies and process control tools to ensure there are no repeats of the XYZ difficulties on other programs.

Implications

This single Six Sigma project thus had far-reaching implications for the XYZ Pump Garage program. First, in fulfilling the immediate purpose of improving our performance, we achieved customer retention for the near future. On a broader level, we also seized an opportunity to enhance our overall long-term approach to improvement. The value of reaching beyond obvious solutions having been so dramatically reinforced, we created a new continuous improvement team charged with making the pursuit of quality a more proactive endeavor.

Friday, April 27, 2007

Six Sigma Case Study: Defect Reduction in the Service Sector

by Chris Bott

This case study discusses the effective use of Six Sigma tools to improve our plastic issuance processes. It will take you through a project American Express completed, "Eliminate Non-received Renewal Credit Cards." This analysis demonstrates how we applied Six Sigma techniques to reduce the defect rate with ongoing dollar savings.

Define and Measure the Problem
(Data has been masked to protect confidentiality.)

  • On average (in 1999), American Express received 1,000 returned renewal cards each month.
  • 65% (650) were returned because the card members had changed their addresses and did not tell us.
  • The U.S. Post Office calls these forwardable addresses. Please note: Amex does not currently notify a card member when we receive a returned plastic card.

Analyze the Data

We applied various Six Sigma tools to identify Vital Xs, or the root causes of the defect. The use of Chi Square indicated the following:

  • By type of card/plastic: We isolated significant differences in the causes of returned plastics among product types. Optima, our revolving card product, had the highest incidence of defects but was not significantly different in the percentage of defects from the other card types.
  • Issuance reason: Renewals had far and away the highest defect rate in the three areas in which we issue plastic—replacement, renewal, and new accounts.
  • Validated reason for return: Because we suffered scope creep early in the project, it was important to confirm what our initial data was telling us. After testing the five reasons for returns, returns with "forwardable" addresses were overwhelmingly the largest percentage and quantity of returns.

Improve the Process

An experimental pilot was run on all renewal files issued. This "bumping" against the "National Change of Address" service was implemented on all renewal cards in mid August. Due to the strict file matching criteria, this solution will impact 33% of the remaining population (or 333 cards monthly).

As a result of a successful pilot, we were able to reduce the defect rate to 44.5% of its original level, from 13,552 to 6,036 defects per million, reflecting annual savings of $1,228. Figure 1 outlines the combined test results.

Fig. 1 Combined Test Results

Non-Received Renewal Credit Cards

                            Baseline      Test Results
  Defect rate               1.35%         0.6%
  DPMO                      13,552        6,036
  COPQ                      $3,360
  Total annual savings                    $1,228
  Sigma level               3.71          4.01
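
For readers who want to reproduce the conversion between a defect rate, DPMO, and an approximate sigma level (with the conventional 1.5-sigma shift), here is a minimal Python sketch. It is not part of the original case study; the rates are the rounded figures from the table above.

    from scipy.stats import norm

    def dpmo(defect_rate):
        """Defects per million opportunities from a fractional defect rate."""
        return defect_rate * 1_000_000

    def sigma_level(defect_rate, shift=1.5):
        """Approximate short-term sigma level, using the conventional 1.5-sigma shift."""
        return norm.ppf(1 - defect_rate) + shift

    for label, rate in [("Baseline", 0.0135), ("Test", 0.006)]:
        print(f"{label}: DPMO = {dpmo(rate):,.0f}, sigma level = {sigma_level(rate):.2f}")
    # Baseline: DPMO = 13,500, sigma level = 3.71
    # Test: DPMO = 6,000, sigma level = 4.01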

Control the Process

To ensure that we perform within the acceptable limits on an ongoing basis, it is important to monitor the new process. To achieve "control" status, we will be using the p chart, a tool that tracks proportions of returns over time.
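
As an illustration only, the sketch below computes a p-chart center line and 3-sigma control limits for a monthly proportion of returned cards. The sample size and proportion are invented placeholders, not American Express figures.

    import math

    def p_chart_limits(p_bar, n):
        """Center line and 3-sigma control limits for a proportion."""
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        return max(0.0, p_bar - 3 * sigma), p_bar, p_bar + 3 * sigma

    # Assume roughly 0.6% returns out of a hypothetical 75,000 renewal cards mailed per month.
    lcl, cl, ucl = p_chart_limits(p_bar=0.006, n=75_000)
    print(f"LCL = {lcl:.4%}  CL = {cl:.4%}  UCL = {ucl:.4%}")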

In addition, our vendor has constructed reporting, which gives us the ability to monitor the defect rate on a monthly basis. The report will tell us if any credit cards that were "bumped" against the "National Change of Address" database were returned to our warehouse.

Impact on Customer Satisfaction

Using the "National Change of Address" will enable over 1,200 card members to get their credit cards. Prior to this implementation, these card members would never have received their cards automatically. Revenue and customer satisfaction will undoubtedly increase.

Thursday, April 26, 2007

(Illustration) Determine the tolerances

Determine the tolerances

Instructor : In Step two we performed optimization experiments and used statistical analysis to obtain the transfer function for the effect of the X's of nut type and installation torque force upon the Y of nut removal time. This enabled us to optimize the vital X settings for our proposed solution.
Instructor : In this Step we will use the results of the previous Step to determine the tolerances of the key operating parameters, or vital x's, necessary to achieve the project objectives and thus satisfy the customer C T Qs.

Instructor : In this final step of the Improve phase, we will define the basic purpose of Statistical Tolerancing,
Instructor : Define the basic principles of Statistical Tolerancing, and
Instructor : Apply Statistical Tolerancing to the Rockledge case.

Instructor : The basic purpose and concept which guide tolerancing are straightforward.
Instructor : The purpose is to establish the range of values for each vital X that will satisfy customer requirements.
Instructor : The concept is equally direct. If you know the relationship between X and Y and also the required specifications of Y,
Instructor : Then you can set the tolerance of the X factor.

Instructor : In the next few minutes, we will go through an exercise which will culminate in developing the tolerances for a specific task. We will begin by defining some basic terms, developing the transfer function, and then applying tolerancing techniques to develop a final description of the solution. The transfer function we're looking at involves the relationship between hours of exercise per week and pounds of weight lost per week. Given those two factors, which one do you think is the X and which one is the Y?

Instructor : So in the previous step, where we determined variable relationships, we would produce a chart like the one shown here, in order to provide a visual representation which quantifies the relationship between weight loss
Instructor : and hours of exercise per week.
Instructor : In order to determine how much exercise we need to achieve a specific weekly weight loss, we draw a horizontal line from the desired weight loss to the transfer function line,
Instructor : And then drop a vertical line from that intersection to the exercise scale.
Instructor : We can then read the required exercise time directly from the scale. So this tells us that we need to exercise two point two hours per week in order to lose two point five pounds per week.
Instructor : This is a simple example, but we can add additional parameters which will more accurately reflect the real world.

Instructor : We have decided that we would like to lose at least one pound per week, and preferably two. For health reasons, we would like to lose no more than three pounds per week. So how do we translate this to our statistical model?
Instructor : First, our Target is two pounds per week. So we draw the horizontal line from that value to the transfer function and then drop the vertical line down to the exercise scale where it intersects two hours per week.
Instructor : We then set our lower specification limit at one pound per week, which means we need at least one point four hours of exercise per week to meet the minimum goal.
Instructor : Finally, we set our upper specification limit at three pounds, which corresponds to exercising no more than two point six hours per week, or we may exceed our upper spec limit. So our preliminary statement is that we should exercise two hours per week, with a tolerance of plus or minus zero point six hours per week.

Instructor : Before we can finalize these limits, we need to take into account the variation of our measurement system. As it turns out, there is a one-quarter pound variation in our measurement system. So we need to modify our limits accordingly.
Instructor : First, we center a one quarter pound range over the L S L and the U S L at the pounds per week scale.
Instructor : We then use these variation ranges to adjust our lower specification limit upward to one point five hours of exercise and
Instructor : our upper specification limit down to two point five hours of exercise.
Instructor : So accounting for any variation due to our measurement system, our new tolerance around two hours per week is plus or minus zero point five hours of exercise.
Instructor : At this point, we look at the transfer function and can say that the part between the adjusted lower spec limit and the adjusted upper spec limit defines the acceptable range.
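
A small sketch of the same graphical logic in code form. The linear transfer function (slope and intercept) below is an assumption chosen for illustration; in the lesson the values are read off the chart rather than computed.

    def hours_needed(pounds_per_week, slope=5/3, intercept=-4/3):
        """Invert the assumed transfer function Y = slope*X + intercept to get exercise hours X."""
        return (pounds_per_week - intercept) / slope

    target, lsl, usl = 2.0, 1.0, 3.0   # pounds of weight loss per week
    meas_var = 0.25                    # measurement variation in Y, in pounds

    print("nominal hours:", round(hours_needed(target), 2))
    # Tighten the Y spec limits inward by half the measurement range before inverting.
    print("adjusted lower hours:", round(hours_needed(lsl + meas_var / 2), 2))
    print("adjusted upper hours:", round(hours_needed(usl - meas_var / 2), 2))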

Instructor : In some cases there may also be measurement variation in the X. In that case, you would use the same technique to adjust the lower and upper specification limits for the X values after they had been set and adjusted for Y value measurement variation. The results, as you can see, are a further tightening of the tolerances for X. These adjustments will be critical in the next phase, Control.
Instructor : This is a graphical representation and simplification of the statistical tolerancing process. A detailed look at this process and its calculations will be covered in further training.

Instructor : Let's take a look at how this applies to the Rockledge case. The measurement variation for Y is less than one point five percent, as established in Step Three of Measure. The measurement of X, the torque force, also has very small variation. Therefore, establishing the tolerance around our target of 17,000 for this case is very straightforward.
Instructor : Since the variation associated with measurement is minimal, the team decided to take advantage of this situation and use the curve as an approximation of the acceptable solutions. First, we draw a line across the chart at thirty minutes, since that is our upper specification limit. The curve of the line under thirty minutes represents acceptable operating settings, because any torque setting based on the portion of the curve under the upper spec limit will provide acceptable results. Of course, we're going to try to optimize our solution to provide the maximum benefits possible, but there are cases where you may need to compromise the ideal solution with other considerations.
Instructor : So now we'll drop two lines down from where the thirty minute mark intersects the transfer function.
Instructor : This gives us a lower limit of fifteen thousand one hundred and sixty six foot pounds for the lower specification limit and eighteen thousand eight hundred and fifty three for the upper. Rounding the lower limit up and the upper limit down to three significant places gives us a lower limit of fifteen thousand two hundred and an upper limit of eighteen thousand eight hundred.
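
The same limit-finding step can be done numerically once the fitted quadratic is known. In the sketch below the coefficients are invented so the answer lands near the limits quoted above; they are not the actual Minitab output for the Rockledge case.

    import numpy as np

    # Assumed transfer function: removal_time = a + b*torque + c*torque**2 (placeholder coefficients)
    a, b, c = 1088.37, -0.12593, 3.7037e-6
    usl_minutes = 30.0

    # Solve c*x**2 + b*x + (a - USL) = 0 for torque x, in foot-pounds.
    lo, hi = sorted(np.roots([c, b, a - usl_minutes]).real)
    print(f"acceptable torque range: about {lo:,.0f} to {hi:,.0f} ft-lb")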

Instructor : The manufacturer adjusts the torque setting of the wrench at the factory and re-adjusts it at scheduled intervals. The setting is certified to be plus or minus five percent, which in this case is eight hundred and fifty.
Instructor : So a wrench set at seventeen thousand will be between sixteen thousand one hundred and fifty on the low side and seventeen thousand eight hundred and fifty on the high side. These are both well within our limits for keeping our nut removal times below our upper specification limit of thirty minutes.
Instructor : So we can rest assured that our torque wrenches will properly apply the correct force.

Instructor : Let's summarize this step
Instructor : Process tolerances are based upon specification flow-down from customer requirements
Instructor : You will need to adjust tolerances due to variation in both the process and product measurement systems, unless those variations are small enough to ignore.
Instructor : This is a very simplified overview of statistical tolerancing. There are times when trying to determine the tolerancing for multiple variables becomes extremely complex. Consult with your Master Black Belt for assistance with such situations.

Instructor : So here is where we are with the Rockledge case. We will make two changes in the Nut Installation process in order to affect our Nut Removal Time.
Instructor : Our first change is that the nuts used will be changed to the Torque Master type.
Instructor : Second, we will have them installed at a torque setting of seventeen thousand foot pounds
Instructor : The guaranteed tolerance from the factory will be within our upper and lower specification limits. So we will not need to apply any control measures to this factor.

Instructor : Congratulations. You have finished this Step

(Illustration) Optimize the results

Optimize the results.

Instructor : Welcome to Step Two. In Step One we narrowed the vital X's down to three. The nut type, the installation torque force, and the interaction between those two factors. Here in this Step, we'll determine the actual changes to be made in the process, primarily by quantifying the continuous variable, the Installation Torque Force, and its relationship to the Nut Removal Time, or Y.

Instructor : Let's look at the learning objectives for this step. When you finish this step, you will be able to perform Optimizing experiments in order to develop the deliverable, which is the proposed solution.
Instructor : Describe considerations and potential trade-offs involved in reducing the number of experiments.

Instructor : Once you know what you are going to change, you can narrow your focus to determine what the precise changes will be. Let's return to the Rockledge case and see how that exercise plays out.

Instructor : Screening DOE provided enough information for us to determine that we will change to the new Torque-Master nut. The question now arises as to what optimum torque force setting should be used during installation. We received some indication in the previous step, but not enough.
Instructor : In order to determine this information, we will run an experiment using both the regular nuts and the Torque Master nuts installed with a larger number of torque settings covering a broad range. To interpret the data we will run a Regression Analysis. Each analysis will be a one factor, multi level analysis designed to optimize the torque force for each nut.

Instructor : Let's briefly discuss our experimental parameters:
Instructor : We will conduct a total of thirty runs
Instructor : Fifteen runs will use the current nuts
Instructor : and fifteen will use the Torque Master nuts
Instructor : Each nut will be installed either two or three times at torque settings of fourteen, fifteen, sixteen, seventeen, eighteen, and nineteen thousand foot pounds.

Instructor : Before we continue, let's briefly discuss some issues which may impact your experimental design.
Instructor : While exhaustive testing may be desirable, budgetary considerations may require you to reduce the number of experiments. There are statistical approaches such as the fractional factorial design which can assist you in this task.
Instructor : It will take a certain number of person-hours to run each experiment. This may be another limiting factor.
Instructor : In a situation like the Rockledge case, a solution must be implemented in a timely fashion, or the project objectives will not be met. Your experimental design must take this into account.
Instructor : Depending on your process, there may be a number of other resources which can limit your experimental design. You will have to accommodate them when you reach this stage of the project.
Instructor : As discussed earlier, the full factorial test may require a large number of runs as the number of factors and levels increases. If you need to reduce the number of experiments, you can use a fractional factorial design, which involves selecting and testing an appropriate sub-set of all possible combinations of factors. This design may also be applicable in the screening process of step one.
Instructor : A trade off in using only a sub-set of possible combinations is that we will lose some information about the interactions among factors. Techniques and tools to aid you in selecting the appropriate design will also be covered in further training.

Instructor : At this point, we have run the thirty tests and entered the data into MiniTab. If you look at the worksheet, you will see that
Instructor : Columns C four and C five refer to the test performed with the original nuts, and
Instructor : Columns C six and C seven address the tests performed on the Torque Master nuts. Now we are ready to perform the actual analysis
Instructor : As a preliminary, we will first do a simple plot of the data and see if it shows any discernible trend or curvature. This will determine the specific regression analysis we perform.
Instructor : From the MiniTab menu, select the Graph Menu and then Plot.
Instructor : The dialogue box for plot allows you to generate multiple X Y graphs at one time.
Instructor : Each graph is entered on one line in the Graph Worksheet. The next step is to define the two graphs.
Instructor : Our first graph will look at varying torque levels with the current type of nut.
Instructor : The Y axis is the nut removal time with the current nut,
Instructor : and the X axis is the torque force used.
Instructor : Likewise, the second graph covers the Torque Master nut.
Instructor : With the Y axis showing nut removal time,
Instructor : and the X axis the torque force.
Instructor : Now, when you click on OK, the following graphs are generated.
Instructor : As you can see, the two graphs are similar in shape. The shape is what we're interested in right now.
Instructor : It's obvious that there is no way a straight line could come close to intersecting all of the points on any one of these graphs. So what does this mean?
Instructor : The fact that the relationship is not linear determines which tool will give you the most accurate and usable results when you generate your solution.

Instructor : From the stat menu, choose Regression.
Instructor : From the Regression sub-menu, choose Fitted Line Plot. This decision is driven by the apparent curvature of the relationship shown in the initial plot.
Instructor : By now, the MiniTab selection menu should be very familiar. First, we will test the Torque-Master nuts at varying levels of torque.
Instructor : Select Column C seven, the Torque-Master removal time for the response variable, or Y.
Instructor : Select Column C six, torque two, for the predictor, or X
Instructor : Select a Quadratic plot to account for the expected curvature of the line and
Instructor : then click on Options
Instructor : From the Options menu, make sure that all of the boxes in the Transformations section are unchecked.
Instructor : Make sure your Confidence Level is set at ninety five point zero, or ninety five percent. This is the accepted convention, and we will commonly accept it as our default.
Instructor : Click on O K to return to the Fitted Line Dialogue Box and then O K from there to generate the graph.
Instructor : Click Next to see the results.
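
For anyone without Minitab, a rough Python analogue of the quadratic fitted-line plot is sketched below. The torque and removal-time values are made up for illustration; only the shape of the analysis mirrors the course.

    import numpy as np

    torque_k = np.array([14, 15, 16, 17, 18, 19], dtype=float)    # torque in thousands of ft-lb
    removal_min = np.array([29.0, 24.6, 21.9, 21.0, 21.9, 24.6])  # removal time in minutes (invented)

    # Fit the quadratic transfer function: time = c0 + c1*torque_k + c2*torque_k**2
    c2, c1, c0 = np.polyfit(torque_k, removal_min, deg=2)
    print(f"time = {c0:.4g} + {c1:.4g}*torque_k + {c2:.4g}*torque_k^2")

    best_torque = -c1 / (2 * c2) * 1000            # vertex of the parabola, back in ft-lb
    print(f"minimum removal time near {round(best_torque, -2):,.0f} ft-lb")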

Instructor : The regression plot shows the actual transfer function directly above the graph. In this case, it is a quadratic equation that describes the curve of the graph. This transfer function is this Step's deliverable, which is used in the next Step to set the implementation parameters.
Instructor : The lowest point on the graph, which corresponds to the lowest nut removal time, is at an installation torque of seventeen thousand foot pounds. So based on the transfer function, the torque set at seventeen thousand foot pounds will provide us with the lowest removal time.
Instructor : This reading is also well below the upper specification limit of thirty minutes, and even exceeds our target of fifteen minutes.
Instructor : Just to validate our impression, we ran the same exact test with the regular nuts. Click next to see the results.

Instructor : The shape of this graph is similar to that for the Torque-Master nuts. The difference is that none of these readings are below the
Instructor : upper specification limit. This tells us that even if we optimize the installation torque force, it will be impossible to meet our project objectives without changing the nut.
Instructor : Click Next and we'll look at the results of this step.

Instructor : On the left hand side of the screen are two columns. The left hand column represents information that was present at the beginning of this Step and the right hand column represents information that was produced during the Step. At the right are four different pieces of information. Please drag each of the four to the appropriate column. When you are finished, click on DONE to submit.

Instructor : So the summary of the Step is that we accept the Vital X's identified in Step one and determine what actual changes in those Vital X's are necessary to meet our project objectives.

Instructor : Remember when we looked at the process capability of the Nut Installation process and determined that it was already a six sigma process? So we concentrated on improving the Nut Removal process and have determined what changes are required.
Instructor : However, the changes required to improve the Nut Removal process must all be implemented in the Nut Installation process!
Instructor : While our experiments measure the Nut Removal time as the Y factor,
Instructor : The experimental variables, or X factors,
Instructor : Were all implemented in the Nut Installation process.
Instructor : So another valuable lesson learned from this case was
Instructor : Keep the big picture, the entire process map, in mind even as you concentrate on improvements targeted at a single sub-process. So the concept of process mapping discussed in the Define and Measure Phases becomes a critical element when we implement our improvements in order to impact the CTQs and meet our project goals.

Instructor : So what are our conclusions at the end of this Step? They consist of a proposed solution to our project task.
Instructor : In this case, the first of these deliverables is the requirement to change from the standard nuts to the Torque-Master nuts.
Instructor : The other change requires that the Torque-Master nuts be installed to a torque of 17,000 foot pounds. This was confirmed by our regression analysis earlier. With these changes implemented, we expect to reduce our average nut removal time to less than twenty minutes.
Instructor : We're almost done with the Improve Phase. There's only one thing left to do, and that's to provide the tolerancing information necessary to translate these changes into procedure documents for the shop floor.
Instructor : Be cautious when using the preliminary transfer function. It is limited in that it will not reveal any curvature. We use it to help identify a Vital X, but not to optimize the actual setting.
Instructor : You can use optimization DOE or other tools to optimize the setting of a Vital X.

(Illustration) To Identify Potential X's

To Identify Potential X's

Master: In the Improve Phase, we have three steps to lead us toward our goal.
Master: In Step one, we will take the list of potential causes we brainstormed in Analyze, step six, and determine which of them have the potential to be our vital Xs. This allows us to narrow our focus to the X's that may significantly impact the CTQ's.
Master: In step two, we will use a variety of tools to discover the variable relationships and then propose one or more solutions to our problem. Just knowing the importance of a factor does not provide enough information to allow us to recommend a solution. We need to run additional tests in order to optimize our proposals.
Master: And finally in step three, we will establish operating tolerances for our revised process and pilot the solution.

Master: So your assignment in the Improve phase is to use all the information that has been acquired and calculated to identify the vital Xs, propose a solution, establish the operating tolerances and prove the solution through the pilot process.
Master : Frank will be your instructor in this phase. So I'm going to turn things over to Frank and let him introduce himself and then take you on through the Improve steps. You're up, Frank!
Instructor: Hi, I'm Frank. Finding out all you can about your process and its limitations is important, but I like to get to the point where we're fixing the actual problem. I have extensive experience in Design of Experiments and Statistical Tolerancing, which is where you determine how you're going to improve the process.

Instructor : At the end of the Analyze phase, you brainstormed a list of potential causes of variation. These are the candidates for the position of "Vital X." In step one, we do further analysis on these candidates and select the winner, or winners for the "Vital X" designation. The Fishbone, or Cause and Effect diagram, will help us link these potential Vital X's to the targeted effect, our Y.

Instructor : Let's quickly look through this Step's learning objectives. We will
>Identify the Vital X's for a given Y.
>We will also select the appropriate improvement strategy based upon characterizing the X's as either operating parameters or critical elements.
>We will review the applicability of a statistical technique called Design of Experiments, or D O E, which allows us to test causality and which provides a solid statistical foundation to identify which factors are the Vital X's,
>Design and execute a screening DOE with factorial designs, and
>Define, describe, and explain the significance of a lurking variable.

Instructor : This is the fishbone chart we developed as a result of our brainstorming of potential X's. The team started analyzing the individual factors with an eye to narrowing down the choices for our project.

Instructor : The fishbone exercise is the first part of a brainstorming session. It serves as the container for all suggested factors. The second part of brainstorming involves tapping historical data, process knowledge, and process documents to eliminate as many factors as possible. Any factors not conclusively eliminated will be tested with a technique called Design of Experiments to validate their significance.
Instructor : The first area they looked at was the Tools branch.
Instructor : The shift supervisor and the mechanic said that none of the items listed under tools was a significant problem since existing procedures replace all worn or otherwise defective tools.
Instructor : The team then turned its attention to the question of environmental factors. The team's field engineer had been checking data on environmental factors and stated that he saw no potential for high benefits from attempting to control these items with the exception of ambient temperature. An engineer suggested that ambient temperature could affect the tightness between the stud and the nut, so they tentatively decided to leave it in.
Instructor : So the team looked at people. The shift supervisor and Jim had thought about these factors. They searched the records and found all technicians properly trained and certified through testing, and they decided to eliminate these factors as well, so this left Method and Materials.
Instructor : First, the team looked at the methods. The Shift Supervisor and the Mechanic pointed out that heating and cutting only took place if a nut was jammed or otherwise frozen to the stud, and had already been worked on too long.
Instructor : So removing that individual nut was already going to be out of spec if those steps were necessary.
Instructor : That left Torquing. The Supervisor pointed out that torquing only took place during nut installation, not removal. The Mechanic and the Field Engineer both agreed that the amount of force used to tighten the nuts could address the difficulty in removal. The Field Engineer was assigned to determine how to test removal of nuts tightened to specific torque levels. This factor will be included in the test plan to determine if it is a Vital X.

Instructor : So we've got a couple of Vital X candidates.
Instructor : The team then took up the only topic left, materials.
Instructor : The Mechanic said that according to the maintenance log, there had never been a problem with corrosion causing difficulty in nut removal, so the team agreed to drop it.
Instructor : The Field Engineer said that there was an alternative nut called the Torque-Master, which the benchmarked competitors were using for similar heavy-equipment applications. These nuts might improve performance in this process.
Instructor : Let's take a quick look at how we characterize Vital X's
Instructor : Critical elements tend to be changed by changing the kind or type of element, as opposed to changing an amount. These are Xs that are not necessarily measurable on a specific scale, but have an effect on the process. Some of these changes would include alternative work flow sequences, process standardization, and other qualitative changes in the process. Vital x's which are critical elements require testing of alternatives in order to determine the best solution.
Instructor : Operating parameters tend to be changed in amount, rather than replacing an element with another. These are Xs that can be set at multiple levels to study how they affect the process Y. Some examples include heat treatment temperature, cycle time, or the cutting speed on a machine tool. Operating parameters call for a mathematical model which helps you find the optimal setting for the vital x's to address the CTQ's.

Instructor : The approach we will use to test the significance of the suggested vital x's is called a Design of Experiments, or D O E. D O E is a statistical approach to test the significance of X's and interactions between and among X's. There are two general kinds of DOEs
Instructor : A screening DOE is used in this step to identify the vital x's for the CTQ. It looks at all of the variables, or factors, to determine those that affect our CTQ. Thus, it allows us to focus on the vital few and avoid wasting time on the trivial many.
Instructor : An optimization DOE is used in Step 8 to determine the optimized settings for the vital x's. These settings are identified through what is called the transfer function, which relates the customer specifications to specific settings for the vital x's. Now we'll look at some key considerations in our DOE.
Instructor : Including the current process conditions, or baseline, is a standard practice for doing experiments. Two benefits of this practice are:
Instructor : It allows you to compare experimental results with real-world process data using the same parameters, and by doing so
Instructor : validates your experimental setup.
Instructor : Remember, you obtained baseline data in order to generate your original process capability report.
Instructor : Let's look at another key consideration.
Instructor : Another key consideration is the overall statistical design approach to your experiments.
Instructor : A full factorial design requires you to test every combination of factors at all levels.
Instructor : If the number of runs required to execute a full factorial design is too large to accommodate, reducing the number of experimental runs is inevitable.
Instructor : Statistical approaches such as fractional factorial design can provide an effective way to reduce the number of experiments while remaining aware of potential shortcomings in the design.
Instructor : Another of the key considerations addresses a number of experimental factors.
Instructor : The number of factors to test is an outcome of the brainstorming session.
Instructor : In a screening DOE, which is what we are doing here, we will look at two different settings for each factor. This is typical of screening DOEs where we are simply determining the Vital X's. When we fine-tune our recommendations through an optimization DOE, more levels are typically used.
Instructor : Finally, we need to define the actual settings or the range of values over which we will test each factor. The range should be sufficient to ensure that we can detect any effect of the change, and it should also be feasible to carry out. Click Next and we'll look at some critical experimental design considerations.

Instructor : Another of the key considerations addresses the experimental setup.
Instructor : Replication is often confused with repetition. Repetition may involve additional observations during a single trial or run. Replication means to reproduce an entire experimental trial, but to do that under the exact same conditions each time.
Instructor : In order to minimize the effect of data gathered at a certain time or in a predetermined order, the order of trials should be randomized.
Instructor : All trials should take place under identical conditions, but that isn't always possible. Temperature, humidity, time of day, shift, individual supervisor or technician, are all nuisance factors that can be eliminated with a blocked design. A block variable lumps together all readings taken under identical conditions as part of a block.
Instructor : Adding center points is a way to determine if there is curvature in the results. While it typically isn't done in a screening DOE, it can prove useful in certain situations.

Instructor : We have decided to use both replication and randomization in our screening DOE.
Instructor : Replication, by providing multiple sets of data under identical conditions, will provide an estimate of experimental error, which will become the basis for determining if a difference in observations is significant or not.
Instructor : A lurking variable is a variable which has an important effect on outcomes, but which has not been accounted for in the data. In this case, we will randomize the order in which various experiments take place, so that a time-dependent lurking variable, possibly tied to when a particular activity takes place, does not pollute the results.
Instructor : The Experimenter's Checklist is an excellent resource. You will find it under the Resources menu on the left side.

Instructor : After taking all of these considerations into account, we decided how to address the situation at the Rockledge plant.
Instructor : We will perform a screening DOE with a 2-level factorial design on the three remaining factors to identify the true vital x's
Instructor : We will use both replication and randomization to enhance the reliability of our results.
Instructor : To summarize our design, we start out with three factors
Instructor : The type of nut, the torque level used to install the nuts, and the ambient temperature.
Instructor : We will define two possible levels for each factor
Instructor : The nut will be either the current nut or the Torque-Master nut.
Instructor : The installation torque setting will be either fifteen thousand five hundred or eighteen thousand foot pounds of force.
Instructor : And the nut will be removed at an ambient temperature of either fifty degrees Fahrenheit or one hundred degrees Fahrenheit.

Instructor : So how many different tests make up a full factorial? Before we discuss the underlying logic, let's see if you can figure it out. We have three factors and two levels we will use for each. So how many different tests do we have to run in order to cover all possible combinations? Each combination must contain a unique combination of values for all three factors.

Instructor : Here are the eight possible combinations which we will test in our screening DOE. If all our screening experiments were limited to three variables and two levels, it would be simple to determine the number of possible combinations. We realize, however, that in the real world there are often far larger numbers of potential causes to test. Is there a simple mathematical formula which we can use to determine how many combinations there are for any number of factors?

Instructor : The general formula for the number of tests, or n, required for full factorial DOE is:
Instructor : n equals the number of levels to the power of the number of factors

Instructor : If we apply the general formula to the Rockledge case, we have two levels for three factors.
Instructor : Taking two to the power of three, results in eight different tests.

Instructor : It would be valid to ask why we only test each factor at two levels. We do so because this is a screening DOE, rather than an optimization DOE. We are only attempting to verify the significance of the factor, not determine the optimal setting.

Instructor : Our next task is to design the experiment. We're going to do a full factorial experiment as our screening exercise. We will use a test casing. Our casing is placed in a testing chamber
Instructor : This will allow us to subject the test to the full range of vibration it would experience on an actual operating generator.
Instructor : Our measurable response will be the time required to remove the nut.
Instructor : According to historical data, nut removal time is not sensitive to the diameter of the nut used.
Instructor : The team decided to use a two point five inch nut for this test.
Instructor : In order to perform the statistical analysis, we must utilize numbers to represent the levels for each factor. In this case, since each factor has two levels, we will code those levels as minus one and one. Let's build a chart to show what we mean.
Instructor : For the nut, minus one will designate the regular nut while one represents using the Torque-Master nut.
Instructor : For the torque setting, minus one will designate fifteen thousand, five hundred foot pounds and one will indicate eighteen thousand foot pounds
Instructor : Finally, minus one will indicate an ambient temperature of fifty degrees Fahrenheit, while one will indicate an ambient temperature of one hundred degrees Fahrenheit. It is good practice to make a note of how you have coded these levels because that information is not captured in Minitab.
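
A quick sketch of what the coded design amounts to: all eight combinations of the three factors at the minus one and one levels, replicated twice, in a randomized run order. Minitab builds the same thing through the dialogue boxes described next; this Python version is only for illustration.

    import itertools, random

    factors = ["NutType", "Torque", "AmbientTemp"]
    base_runs = list(itertools.product([-1, 1], repeat=len(factors)))   # 8 unique combinations

    design = base_runs * 2      # two replicates -> 16 runs
    random.shuffle(design)      # randomize run order to guard against lurking variables

    for run_order, levels in enumerate(design, start=1):
        print(run_order, dict(zip(factors, levels)))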

Instructor : The first step is to select your factorial design. First click on the Stat menu and then select D O E. From the D O E submenu, select Create Factorial Design.

Instructor : This dialogue box comes up when you select Create a Factorial Design.
Instructor : Since we have only two values, let's select a two-level factorial design with the default generators.
Instructor : We have three factors to address, so select three from the Number of factors pull-down menu.
Instructor : We still have to select an actual design, so click on Designs.

Instructor : When you click on Designs, this dialogue box appears. First, select Full Factorial from the choices in the large window.
Instructor : Make sure that number of center points is zero.
Instructor : Then select two for the number of replicates. In MiniTab, this automatically requires each individual test to be run two times.
Instructor : Then make sure that number of blocks is set to one.
Instructor : When you click on O K, you will go back to the top design screen.
Instructor : Next, you need to name the three factors. To do this, click on Factors.
Instructor : Clicking on the Factors button brings up this dialogue box.
Instructor : We're going to change only the names here, so we will rename the three factors Nut Type, Torque, and Ambient Temp.
Instructor : When you have changed the names, as shown here, click on the O K button.
Instructor : By default, MiniTab is set to randomize the run order, and to assume there is only one block. We're going to accept those defaults of the program, so the final step is clicking the OK button to accept the design.
Instructor : If you did everything right, you will see this information in the Session window of MiniTab.
Instructor : Factors three indicates that you have three variables
Instructor : The Base Design shows that you have three variables and eight unique tests, which is a full factorial design for this group.
Instructor : The Runs field says you have sixteen actual tests to perform
Instructor : While the Replicates field says that each of the eight unique tests will be run twice. We aren't going to bother with blocks and Center Points in this example.

Instructor : The system then generates the testing design.
Instructor : Numbers in the first column represent the combinations of factors; the numbers in the second are the order of the run. You can see from these two columns that the runs have been randomized to avoid lurking variables. Since we are not using center points or blocks, all of the entries in those columns are one.
Instructor : The last three columns indicate the actual level of each of the three variables. We already defined what minus one and one mean for each variable, so this is a simple way to record the test conditions.
Instructor : Remember that we assigned two levels for each factor, coded to minus one and one. Let's take a look at how that works.
Instructor : For example, if we look at row four, the levels indicated in columns five, six and seven are minus one, minus one, and one. The minus one in column five means that this run will use the standard nut. The minus one in column six means that the nut will be tightened to 15,500 ft-lbs, and the one in column seven indicates that the ambient temperature will be 100 degrees Fahrenheit.

Instructor : So here are the results of all our tests. The actual removal time has been added as column C eight. Our next step will be to analyze this data.

Instructor : Now that we have the data in place, it's time to analyze our factorial screening.
Instructor : From the Stat menu, select DOE
Instructor : And then from the sub menu, select Analyze Factorial Design
Instructor : When you make your selection, the now-familiar MiniTab dialogue box will appear.
Instructor : You select Nut Removal as your response by clicking on column C eight in the selection box and then clicking on Select.
Instructor : When your dialogue box looks like this, you should click on Terms to make sure that they are correct.
Instructor : A key item here is to determine which terms will be analyzed by MiniTab. This is defined in the pull down menu at the upper right.
Instructor : If one is selected, only the main effects, or the individual variables in isolation, will be analyzed. There will be no analysis of the interactions between variables.
Instructor : If two is selected, the two way interactions between all pairs of variables will also be analyzed.
Instructor : The default is three, which means that all interactions, including the three way combination of all variables, are analyzed.
Instructor : Clicking on OK here will take you back to the previous screen. Clicking on OK in the main dialogue box will generate the analysis of your factorial screening.
Instructor : The DOE analysis uses the hypothesis test to determine the significance of each factor and interaction between and among factors. As you have seen previously, the P value is a test of significance. In this case, a P value of less than zero point zero five means that the corresponding term is a vital X.
Instructor : Of the three main effects, the nut type and torque show a P less than zero point zero five and are therefore considered vital X's.
Instructor : The Ambient Temperature is not a vital X
Instructor : The combination of nut type and torque is also a vital X, but none of the other interactions qualify.
Instructor : So our conclusion from this analysis is that nut type, torque, and the interaction of nut type and torque are the vital X's. Interactions are a critical concept here.
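
The same significance test can be reproduced outside Minitab by fitting a linear model with all main effects and interactions on the coded data and reading the p-values. The 16 removal times below are invented stand-ins for the course worksheet, chosen so that nut type, torque, and their interaction come out significant.

    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.DataFrame({
        "nut":    [-1, 1] * 8,
        "torque": [-1, -1, 1, 1] * 4,
        "temp":   [-1] * 8 + [1] * 8,
        "removal_time": [41, 22, 44, 30, 40, 23, 45, 29,
                         42, 21, 43, 31, 39, 24, 44, 28],
    })

    # nut * torque * temp expands to all main effects plus two- and three-way interactions.
    model = smf.ols("removal_time ~ nut * torque * temp", data=df).fit()
    print(model.pvalues.round(3))   # terms with p < 0.05 are candidate vital X's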

Instructor : The combination of time and temperature in cooking is a good example of an interaction.
Instructor : If we have a specific temperature set, it will take a certain amount of time to properly cook an egg. Therefore the time required, in this case, is dependent upon the temperature used.
Instructor : If we raise the temperature, it takes less time.
Instructor : If we decide to leave the eggs cooking longer, we must reduce the temperature. Neither factor can be optimized without taking into account the other.

Instructor : A common practice is to rerun the analysis, removing the term with the highest P value each time, until the results show only those terms with a P value of less than zero point zero five. Click Next to see those results.

Instructor : So here are the final results. These are the terms with P values less than zero point zero five. So we conclude that these are the vital X's. The effect column measures the overall effect on the Y of moving from the low, or minus one, level to the high, or positive one, level. In the case of the Nut Type, for example, the time required to remove the nut will decrease by approximately eighteen minutes on average, when the standard nut is replaced by the Torque Master nut. We will now build an equation to predict the nut removal time for the Torque Master nut installed at the lower torque level.
Instructor : We start with the Constant Coefficient.
Instructor : We then add to that the product of the Nut Type Coefficient and the assigned value of the nut type. In this case, the torque master nut type is one and the regular nut is minus one, so we will use one.
Instructor : In a similar manner, we add to that the product of the Torque Coefficient and the assigned value of the lower torque level, which is minus one.
Instructor : Finally, we add the product of the Coefficient of the interaction of the nut type and torque, which means changing simultaneously from the standard nut to the torque master and from the low to the higher torque value, and the product of the assigned value of each.
Instructor : By taking all the products first, we simplify the equation to this straight forward string of additions and subtractions
Instructor : Which adds up to twenty four point six minutes.
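
The arithmetic just narrated is simply a sum of coefficient-times-level products. In the sketch below the coefficient values are placeholders standing in for the Minitab estimates, chosen so the total matches the twenty four point six minutes mentioned above.

    # Placeholder coefficients (not the actual fitted values)
    coef = {"const": 34.3, "nut": -8.9, "torque": 2.6, "nut_x_torque": -1.8}

    nut, torque = 1, -1   # Torque Master nut (+1) installed at the lower torque level (-1)

    predicted = (coef["const"]
                 + coef["nut"] * nut
                 + coef["torque"] * torque
                 + coef["nut_x_torque"] * nut * torque)
    print(f"predicted removal time: {predicted:.1f} minutes")   # 24.6 with these placeholders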

Instructor : Be cautious when using the preliminary transfer function. It is limited in that it will not reveal any curvature. We use it to help identify a Vital X, but not to optimize the actual setting.
Instructor : You can use optimization D O E or other tools to optimize the setting of a Vital X. We will do that in the next step, coming up soon.

Instructor : Another thing we can now do is generate a graphical representation of the actual factorial screening. This often provides visually dramatic results. However, graphical representations must always be backed by statistical results. From the Stat Menu, select D O E and then Factorial Plots
Instructor : When you select Factorial Plots, the following dialogue screen will appear. There are three kinds of plots that you can do. In this particular case, we'll go ahead and select all three by clicking in each white box.
Instructor : Upon selecting a plot, the SETUP buttons become active. So let's click on the top SETUP button.
Instructor : When you click on SETUP, MiniTab displays the following dialogue box. Let's go through the proper setup.
Instructor : First, we need to select a Response. This is the CTQ we believe our Vital X's will influence.
Instructor : The Response value is in column C eight, so we place our cursor in the Responses Field
Instructor : and click on C eight
Instructor : and then click on the SELECT button on the lower left. You can also double click on an item, which will perform the same function as selecting the item first and then clicking on SELECT.
Instructor : After successfully entering our Response, we look at which factors to include in the plots.
Instructor : Our three factors are in the AVAILABLE box. We want to look at all three variables, so you can either click on the arrow three times or the double arrow once to transfer them over to the selected window.
Instructor : This screen shows all proper choices selected. When you press O K, you will return to the Factorial Plot Main Menu.
Instructor : From this screen, check the settings for the other two plots by clicking on SETUP. Make certain that all three have the same factors defined. Once you have completed the setup,
Instructor : click on O K to execute the plot. Click next and we'll start looking at the data.
Instructor : This first graph shows us the individual factors plotted against the nut removal time. The slope or height of the line demonstrates the impact of the change.
Instructor : It appears that there is a sizable decrease in removal time when we switch from the regular nut to the Torque-Master nut.
Instructor : It also appears that there is some increase in removal time due to the increase in torque applied.
Instructor : And finally, there does not appear to be any direct effect of changing ambient temperature.
Instructor : The second plot shows interactions between pairs of variables. You can determine which two by simply looking either up or to the right of each variable to see where it is used. Interactions are significant when lines are not parallel. This means that the lines either cross, or if extended would cross. A right angle would be the strongest interaction.
Instructor : The first result shows some interaction between Nut Type
Instructor : and Torque.
Instructor : The next shows no detectable interaction between the Nut Type
Instructor : and Ambient Temperature.
Instructor : The third shows the same lack of interaction between the Torque
Instructor : and the Ambient Temperature.
Instructor : So here in step one, our primary task was to identify the Vital X's for our Y.
Instructor : We also looked at a strategy for determining our exact improvement plan.
Instructor : We reviewed the applicability of using a Design of Experiments approach to this process,
Instructor : and executed a full factorial screening DOE
Instructor : As part of the design of our screening experiment we took steps to limit the possible effect of any lurking variable.
Instructor : So we designed and executed a full factorial screening on our three potential Vital X's.
Instructor : We used replication and randomization to enhance the reliability and credibility of our experimental results.
Instructor : We determined that the Nut Type and Torque were both significant factors,
Instructor : but Ambient Temperature was not.
Instructor : So we're about ready to wrap up this section of the course and the case, and move forward to the next step of the Improve phase.

Instructor : Well that wraps up step seven. Congratulations. We have correctly identified two vital X's which we will optimize in the next stage to lead us to fulfillment of our process objectives.

Tuesday, April 24, 2007

Inspection of the Performance

This step evaluates the application of the improvement. Having already checked the process capability after improvement, we now evaluate the feasibility of conducting the improvement. We must check and evaluate the effect, cost, and ease of the derived solutions. Sometimes the improvement needs some re-design because of organizational limitations. We must clearly map those limitations and make verification and re-design adjustments. Re-design can also add positive points and raise the advantages of the improvement.

Verify Process Capability after Improvement

The conclusion of the experiment will be used as the improvement result (the optimal solution). We expect conditions to be better after the improvement than before it. To confirm that our improvement gives a positive impact to the project, we must re-calculate the capability of the process after the improvement. At best, the result of our improvement will match the target set in the Define phase.
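
As a minimal sketch of that re-check (with made-up specification limits and post-improvement data), the capability index can be recalculated like this:

    import numpy as np

    lsl, usl = 10.0, 30.0    # hypothetical specification limits for the project Y
    after = np.array([19.2, 20.5, 18.7, 21.1, 20.0, 19.6, 20.8, 19.9])   # post-improvement sample

    mean, std = after.mean(), after.std(ddof=1)
    cpk = min(usl - mean, mean - lsl) / (3 * std)
    print(f"mean = {mean:.2f}, std = {std:.2f}, Cpk = {cpk:.2f}")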

Optimizing the Improvement

Design of Experiments and Response Surface Methodology are powerful tools for finding the optimal solution based on observation and experiment data. Many tools can be applied in this phase; you can choose whichever tools are appropriate for the project. On this site, you will learn how to optimize the solution using Design of Experiments and Response Surface Methodology. Both methods are experiment based.

In experimental terms, we often talk about responses, factors, and levels. A response is a measurable output variable that provides useful information about the process; we can also say the response is the Y of our project. Through the experiment, we will find the optimal factor settings (vital factor values) to obtain the optimal Y value. A factor is a selected controllable variable, an important independent variable for the experiment. Each factor is detailed in levels; levels are the values of a factor. To conduct the experiment, we apply treatments that combine a level of each factor.

For an experiment to succeed, we must follow some basic experimental rules. The experiment must not be run in a fixed order; that is, it must follow the randomization rule to guarantee the objectivity and reproducibility of the result. Also, each run must be executed more than once, a rule often called replication. By using this rule in our experiment, we get more precision and better results. Besides that, the experiment also involves blocking. Blocking means grouping experimental units into homogeneous sets. Blocking is a very effective way to minimize experimental error and guarantee more precise results.

The experiment should use the randomization rule: experimental units are randomly assigned to treatments, and the order of the treatments is executed randomly. Besides giving better objectivity and reproducibility, this also minimizes bias from sources external to the experiment.

The replication rule covers both repetition and replication. Repetition means repeating observations on experimental units, while replication is repetition of the entire basic experiment. This means that experimental units are executed more than once under the same treatment. By following this rule, the experiment provides a more accurate and precise set of observations, and it allows us to estimate the experimental error.

The blocking rule aims to compare experimental runs done in one environment with runs done in another, within groups. Blocking provides local control of environmental error and is effective at minimizing experimental error.
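
To make the three rules concrete, here is a small illustrative sketch that replicates each treatment, blocks by an assumed nuisance factor (shift), and randomizes the run order within each block. The treatment and block names are invented for illustration.

    import random

    treatments = ["A-low", "A-high", "B-low", "B-high"]
    replicates = 2
    blocks = ["shift-1", "shift-2"]

    plan = []
    for block in blocks:
        runs = treatments * replicates    # replication within the block
        random.shuffle(runs)              # randomization of run order within the block
        plan.extend((block, run) for run in runs)

    for order, (block, run) in enumerate(plan, start=1):
        print(order, block, run)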

Picture I.3 Experimental Steps Process

The experimental steps consist of Design, Execution, and Analysis. In the design step, we must clearly determine the purpose of our experiment (it must be clear; use the SMART concept), the response (Y), the factors (X's), and the levels. Once we have constructed the components of the experiment, we can decide which experimental design should be chosen.

In the execution step, we execute the design that has been built. This step corresponds to the Process step in the SIPOC tool. All of the rules above must be applied in this step so that we collect the data correctly.

In the analysis step, we analyze the experimental data according to the design. Choosing the proper analysis tool determines whether we get a proper result and the right conclusion. The conclusion is then used for the implementation or application of the process improvement. It is better to run an additional experiment to verify the reproducibility of the derived optimal solution; this final conclusion may be better than the first one.

You can learn the details of these tools in the Tool Categories section of this site (Design of Experiments and Response Surface Methodology).

Design the Improvement Plan for the Vital Factors

Finding the optimal solution in this phase can begin by enumerating all possible improvement ideas, using both qualitative and quantitative methods. Then we evaluate the improvements, focusing on financial benefits, defects, and so on. We must keep the evaluation consistent so that the result represents actual performance. We can then select the improvement idea based on the evaluation that has been done. In the final step, verification is needed to check the overall condition when the improvement idea is applied.

Picture I.2 Procedure for Conducting the Improvement Idea

Improvement ideas can be generated by qualitative or quantitative methods. For greater objectivity, the quantitative method is preferable to the qualitative one. As long as the data can be quantified, it is better to use the quantitative method, but this does not mean the qualitative method is worse.

In the quantitative method, we often conduct an experiment to get the best value based on observation data. Based on whether the data can be controlled, we can categorize the experiment type. If the factors are uncontrollable, we can use regression analysis, for example on collected demand or sales data. When the data can be controlled, Design of Experiments (DoE) or Response Surface Methodology (RSM) are usually the preferred methods.

Regression analysis only shows the correlation between the vital factors and the response; it does not establish whether there is causality between them. DoE or RSM, besides showing the correlation, can also show the causal effect.

If selecting the best solution with a quantitative method is not possible, we can use a qualitative method to create the improvement. We can also combine quantitative and qualitative methods to gain a more powerful improvement.

The qualitative method can be implemented using tools such as Osborn's Checklist, Listing Faults and Listing Wish Points, ERRS, SCAMPER, Functional Improvement, Procedure Improvement, etc.

Improvement Matrix

In the Improve phase, activities, objectives, and tools are used to optimize the vital factors identified in the Analyze phase and obtain the optimal solution for the project Y. Details of the Improve phase activities are drawn in the matrix below:

Picture I.1 Improvement Matrix