Normalization - Normalizing is one of several transformation techniques. It reduces the parameters (factors or response) to a common range, providing a scale on which the relative importance of any factor or interaction can be identified more clearly. It also improves the numerical accuracy of the regression and of the computation of significance, so the parameters to be retained or discarded are often identified more easily. Usually either all or none of the quantitative factors used in a particular regression run are normalized; normalizing serves no useful purpose for qualitative factors.
Where the experimental situation is nonlinear, the linear model will not fit well, and normalizing is of limited value except as a first step in screening out factors or interactions that have minimal effect on the response. After an initial determination of the significant parameters, a regression may be done on transformed experimental values. It is often convenient to perform the final analysis on the raw experimental values so that plotted results are displayed in conventional units.
Normalized value = (Value - Mean of values) / (0.5 x Range of values)
The effect is to transform the values of the variable to the -1 to +1 range when the experimental array is balanced. Unbalanced factors in an array (e.g., an array containing repeated runs) do not have means centered on their ranges, so their normalized values shift off the -1 to +1 interval.
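As a minimal sketch of the formula above, the following Python function (a hypothetical helper, not from the source) normalizes a list of factor values and illustrates both the balanced case, which maps exactly onto -1 to +1, and an unbalanced case with a repeated run, where the mean is not centered on the range:

```python
def normalize(values):
    """Normalized value = (value - mean of values) / (0.5 * range of values)."""
    mean = sum(values) / len(values)
    half_range = 0.5 * (max(values) - min(values))
    return [(v - mean) / half_range for v in values]

# Balanced factor: each level appears equally often, so the mean (150)
# sits at the center of the range and levels map exactly onto -1, 0, +1.
print(normalize([100, 150, 200, 100, 150, 200]))
# -> [-1.0, 0.0, 1.0, -1.0, 0.0, 1.0]

# Unbalanced factor (repeated run at 200): the mean (162.5) is not
# centered on the range, so the normalized values shift off -1..+1.
print(normalize([100, 150, 200, 200]))
# -> [-1.25, -0.25, 0.75, 0.75]
```

The interval still spans 2 units in the unbalanced case, but it is no longer symmetric about zero, which is exactly the effect described above.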