Now that this workshop has taken place, the programme, presentations and other details may be found in the Workshop Results section, workshop 1.8 (members only).
Workshop - Benefits to Industry
Introductions - short presentations
List of invited speakers:
Statistics Seminar: Recent Research Developments in DoE for process and product improvement.
J. Engel, CQM, Building HCZ-3, P.O. Box 414, 5600 AK Eindhoven, The Netherlands. Email: Engel@cqm.nl Phone: +31 40 2758722 Fax: +31 40 2758712
We shall summarize the benefits of experimental design for industry. New developments in robust design and computer experimentation will also be presented, illustrated by practical applications. The proper choice of design is usually a main concern in industrial experimentation, yet the role of statistical modelling is often underestimated. We shall therefore consider aspects of model building for experimental data. In particular, we shall concentrate on the modelling of longitudinal data from experiments where treatments are applied in blocks. Models are fitted to the data, and effects are estimated and tested. Finally, conclusions are drawn on the use of these models for industrial experiments.
Bente Kirkhus (Mills DA), Frøydis Bjerke (Matforsk)
An important objective in food production is to achieve robust products with desired quality that remains stable throughout the shelf life. Knowledge about the relationship between raw materials, process conditions and product quality is essential. In order to find critical factors that influence the quality of low-fat mayonnaise during storage, Mills DA performed designed experiments in a pilot plant as well as on a full scale production line. The design variables included both raw materials and process parameters. The mayonnaise samples were stored at two different temperatures. The response, viscosity (Brookfield) of the samples, was measured at various time intervals after production.
A simple and straightforward approach was chosen: the response curve was modelled by a polynomial of appropriate degree, in this case a straight line (y = b0 + b1t). The results showed that the viscosity of the newly produced product (b0), as well as the change in viscosity with time (b1), was influenced by the design variables. Response surface analysis revealed important interactions between raw materials and process parameters. The knowledge gained is used to select process settings that ensure a robust production of low-fat mayonnaise. These production conditions are robust enough to handle unwanted variation in raw materials as well as variation in critical process parameters, thus ensuring optimal and stable quality of the product during its shelf life.
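The straight-line model above can be sketched in a few lines; the viscosity readings below are hypothetical and serve only to illustrate fitting y = b0 + b1t by least squares, not to reproduce the Mills DA data.

```python
import numpy as np

# Hypothetical viscosity (Brookfield) readings at several storage times;
# the numbers are illustrative only, not data from the Mills DA study.
t = np.array([0.0, 7.0, 14.0, 28.0, 56.0])           # days after production
viscosity = np.array([12.1, 11.8, 11.4, 10.9, 9.8])  # arbitrary units

# Fit y = b0 + b1*t by ordinary least squares (highest degree first).
b1, b0 = np.polyfit(t, viscosity, deg=1)

print(f"initial viscosity b0 = {b0:.2f}")
print(f"change per day   b1 = {b1:.4f}")
```

Repeating this fit for each experimental run yields one (b0, b1) pair per design point, which can then be analysed as two responses in the response surface analysis.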
Geir Ausland, Elkem ASA Silicon division. P.O. Box 8040 Vaagsbygd 4675 Kristiansand. Email: firstname.lastname@example.org, phone +47 38017543, fax +47 38017494
This presentation will give an overview of the statistical methods most commonly used for the development and optimisation of process operation across the different metallurgical processes in Elkem. The presentation will include:
Nadine Henkenjohann, Department of Statistics, University of Dortmund
We present the results of a joint project between the University of Dortmund and the Federal Research Centre for Nutrition and Food. The aim of the cooperation is to determine ingredients of cream that inhibit the growth of different bacteria. For each combination of ingredients, a small time series is observed, recording the bacteria growth over a fixed period of time. We present and compare two approaches that allow such relationships to be modelled.
The workshop "Design of Experiments - Benefits to Industry" aims to help users of applied statistics overcome common challenges encountered in practice. In this presentation we will therefore give an overview of the problems that occurred at the different stages of the project and how we solved them. The user should become aware that successful application of statistical methods often requires more than simply following a "statistical cooking recipe". It is therefore important to verify the efficiency and adequacy of the applied methods during all stages of the experiment.
Asa Jansson, IVL Swedish Environmental Research Institute Ltd, Box 470 86, 402 58 Gothenburg, Sweden
In our experience, industrial practitioners are often sceptical about the expected benefits of statistical modelling and reluctant to introduce the temporary changes involved in experimental design. A first step towards overcoming this disbelief is to see the results of statistical modelling applied to ordinary operational data from the process. Questions to consider with this approach are: What results can be achieved without applying experimental design? How much further can you get if you do?
Results from a national project on applied statistics at an oil refinery will be presented. Accurate soft sensors for important quality measurements, as well as hierarchical monitoring models, were developed from historical process data from normal production. The expected benefits of proper experimental design will also be discussed.
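As a miniature illustration of the soft-sensor idea, the sketch below fits a linear predictor of a lab-measured quality from routinely logged process variables by ordinary least squares. The variable names, coefficients and data are hypothetical assumptions, not results from the refinery project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical process data from normal production:
# three standardised process variables logged alongside a quality measurement.
n = 200
X = rng.normal(size=(n, 3))
true_coef = np.array([1.5, -0.8, 0.3])   # assumed "true" relationship
quality = X @ true_coef + 2.0 + rng.normal(scale=0.1, size=n)

# Fit the soft sensor: quality ~ intercept + X @ coef.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, quality, rcond=None)

print("intercept:", round(coef[0], 2))
print("coefficients:", np.round(coef[1:], 2))
```

Once fitted on historical data, such a model can predict the quality measurement between lab samples, which is exactly the point where practitioners can see benefits before committing to designed experiments.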
John Tyssedal, Department of Mathematical Sciences, The Norwegian University of Science and Technology, 7491 Trondheim, Norway
The primary goal of a screening experiment is to identify the active factors. The design chosen in order to perform the screening is often a two-level design. The results from the experimentation are often analysed under various assumptions such as effect sparsity, effect hierarchy and effect heredity. These principles work well in many situations, but when the experimental region is close to the factor values that optimize the response or when a polynomial model will not give a good fit, they are of less value.
In this work we look into the possibility of using non-geometric two-level designs as screening experiments. As shown in Box and Tyssedal (1996, 2001), these designs have better projective properties than fractionated designs, also called geometric designs. Projective properties are important if factor sparsity can be assumed, a principle that only puts limitations on the number of active factors. A way to analyse non-geometric two-level designs is presented, and its efficiency is tested on the 12-run Plackett-Burman design in various situations. Both simulated data and real data from a production of small plastic parts are used.
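For reference, the 12-run Plackett-Burman design mentioned above can be generated by cyclically shifting a single generator row and appending a row of all minus ones; a minimal sketch:

```python
import numpy as np

# Standard Plackett-Burman generator row for N = 12 runs (11 factors).
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

# Rows 1..11 are cyclic shifts of the generator; row 12 is all -1.
rows = [np.roll(gen, i) for i in range(11)]
rows.append(-np.ones(11, dtype=int))
design = np.array(rows, dtype=int)

# The columns are mutually orthogonal (X'X = 12 I), so all 11 main
# effects can be estimated independently in only 12 runs.
print(design.shape)                                                    # (12, 11)
print(np.array_equal(design.T @ design, 12 * np.eye(11, dtype=int)))   # True
```

It is this 12-run design, which is not a regular fraction of a 2^11 design, whose projective properties make it attractive for screening under factor sparsity.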
Some conclusions are:
Øyvind Langsrud, MATFORSK - Norwegian Institute for Food Research, Osloveien 1, N-1430 Aas, NORWAY
Multiple responses are common in industrial and scientific experimentation. Often these multiple response variables are related in some way. Digitisations of continuous curves and several related measures of the same physical phenomenon are examples of such data. Ordinary analysis of variance (or general linear modelling) of each response variable results in several p-values (raw p-values). We may want to adjust these p-values so that the experimentwise error rate is controlled. Bonferroni adjustment is the best-known method, but it is extremely conservative.
It is, however, possible to compute exact and non-conservative adjusted p-values by using Monte Carlo testing. The unknown covariance matrix is handled by conditioning on sufficient statistics and this methodology is called rotation tests. Compared to permutation tests, we replace permutations by proper random rotations. Permutation tests avoid the multinormal assumption, but they are limited to relatively simple models. On the other hand, a rotation test can be applied to adjust p-values from any general linear model. Instead of controlling the experimentwise (or familywise) error rate, we can make a rotation testing method that controls the false discovery rate. This type of p-value adjustment has become very popular in microarray data analysis. More information and free software can be found at http://www.matforsk.no/ola/rotation.htm.
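As background on the two kinds of adjustment mentioned above (not on the rotation-test software itself), here is a minimal sketch of Bonferroni and Benjamini-Hochberg FDR adjustment applied to a hypothetical set of raw p-values:

```python
import numpy as np

def bonferroni(p):
    """Familywise error control: multiply each p-value by the number of tests."""
    p = np.asarray(p, dtype=float)
    return np.minimum(p * p.size, 1.0)

def benjamini_hochberg(p):
    """False-discovery-rate control via the step-up Benjamini-Hochberg procedure."""
    p = np.asarray(p, dtype=float)
    n = p.size
    order = np.argsort(p)
    ranked = p[order] * n / np.arange(1, n + 1)
    # Enforce monotonicity from the largest p-value downwards.
    adjusted = np.minimum(np.minimum.accumulate(ranked[::-1])[::-1], 1.0)
    out = np.empty(n)
    out[order] = adjusted
    return out

raw = [0.001, 0.01, 0.03, 0.04, 0.2]   # hypothetical raw p-values
print(bonferroni(raw))
print(benjamini_hochberg(raw))
```

For these inputs Bonferroni gives (0.005, 0.05, 0.15, 0.2, 1.0) while Benjamini-Hochberg gives the less conservative (0.005, 0.025, 0.05, 0.05, 0.2), which illustrates why FDR control has become popular when many related responses are tested.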
Henry Wynn and Ron A Bates, Department of Statistics, London School of Economics, Houghton Street, London WC2A 2AE
This is a round-up of work carried out under a recently completed European grant under the 5th Framework Programme. The work consists of experimental design, modelling and optimisation, with case studies in the automotive and other engineering sectors. All three areas have seen some level of innovation.
There has been experimental design for both physical and computer experiments, the latter including input design for simulation. Different types of kernel-based modelling have been used. Optimisation has concentrated on multi-objective and global optimisation. Many lessons have been learned about the use of experimental design and modelling strategies for complex systems. One is the problem of tuning strategies to difficult constraint regions. Another is the need to develop integrated software environments to automate certain aspects of the modelling and optimisation.
J. Engel* & S. A. Ortega (TU/e). *CQM, Building HCZ-3, P.O. Box 414, 5600 AK Eindhoven, The Netherlands. Email: Engel@cqm.nl Phone: +31 40 2758722 Fax: +31 40 2758712
Our starting point will be the modelling of longitudinal response data from a research experiment where treatments are applied in blocks. The analysis of the experimental data will be two-fold: exploratory analysis, and inferential analysis by model estimation and testing. In the spirit of the latter, we shall visit five models from the statistical literature. These models are fitted to the data and the estimation and testing results are compared. A simulation study investigates the power of testing the main effect of a treatment, and its interaction with time. Conclusions are drawn on the usability of these models for industrial experimentation, and suggestions for further research are given.
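The kind of power simulation described above can be sketched in miniature. Everything here (the data-generating model, effect size, and the use of per-unit slopes with a permutation test) is a simplified assumption chosen for illustration; it is not one of the five models of the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_pvalue(effect, n_per_group=8, times=np.arange(5), n_perm=200):
    """One simulated longitudinal experiment with two treatment groups.

    Each unit's response follows y = block + effect * treatment * t + noise;
    each unit is summarised by its fitted slope, and the groups' mean slopes
    (the treatment-by-time interaction) are compared by a permutation test.
    """
    labels = np.repeat([0, 1], n_per_group)
    slopes = []
    for lab in labels:
        block = rng.normal()                       # random block level (cancels in the slope)
        y = block + lab * effect * times + rng.normal(scale=0.5, size=times.size)
        slopes.append(np.polyfit(times, y, 1)[0])  # fitted per-unit slope
    slopes = np.array(slopes)
    observed = slopes[labels == 1].mean() - slopes[labels == 0].mean()
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(labels)
        diff = slopes[perm == 1].mean() - slopes[perm == 0].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return (count + 1) / (n_perm + 1)

# Estimated power: fraction of simulated experiments rejecting at the 5% level.
power = np.mean([simulate_pvalue(effect=0.5) < 0.05 for _ in range(100)])
print(round(power, 2))
```

Repeating such a simulation over a grid of effect sizes, and over the competing models, is what allows their testing power for the treatment main effect and the treatment-by-time interaction to be compared.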