Final Report Summary - CAFE (Computer-aided food processes for control engineering)
Executive Summary:
The food industry is nowadays facing critical changes in response to consumer needs, which, in addition to health and safety awareness, demand an ever larger diversity of food products with high quality standards. On the consumer side, such variety in demand is driven by social or ethical incentives, as in the case of products that are more environmentally friendly or produced by sustainable processes. On the other hand, the food industry is in a permanent quest for new markets and for population sectors not accessible before. This immediately translates into the search for novel products and more efficient processes so as to gain market opportunities over other companies.
In order to satisfy such needs and demands, which, although driven by the product, directly affect the process, novel and efficient food and process engineering approaches must be developed so as to comply with the proposed requirements. Product engineering approaches are already responding to the challenge by proposing new methods and tools to systematically modify or even design new products in response to consumer needs.
In similar terms, Process Engineering should offer efficient and flexible process alternatives that comply with product safety and quality standards while minimizing operation costs and environmental impact. At this point, it must be noted that, given the characteristics of the food industry, these two concepts can be almost immediately related to the minimization of energy and water consumption, and therefore to sustainability, a notion of particular relevance these days, when persistent evidence warns us of global climate change.
The objective of the CAFÉ project has been to provide new paradigms for the smart control of food processes, on the basis of four typical processes in the areas of bioconversion, separation, preservation and structuring. The novelty of the project lies in its capacity to combine PAT (Process Analytical Technology) and sensing devices with models and simulation environments, with the following objectives:
1) to extract as much information as possible from the process/plant, in the form of precise estimates of unmeasured variables (in particular those defining product quality) and of physical parameters that change with the process dynamics or are difficult to know beforehand;
2) to store and encode this information in a reliable and usable way, essentially via physical/deterministic models;
3) to develop control methods to keep quality and production uniform despite variability in the raw material and/or to respond to sudden changes in demand.
The project is organized in a matrix format around four food processing applications, which can be considered representative case studies of the food industry. The applications cover all process categories in the food industry, namely bioconversion, separation, preservation and structuring. The selected cases are:
1) wine-making, as a case study of bioconversion processes (CS1)
2) microfiltration of food beverages, as a case study of separation processes (CS2)
3) freeze-drying of lactic acid bacteria, as a case study of preservation processes (CS3)
4) ice cream crystallization, as a case study of structuring processes (CS4)
These applications serve as the basis to test and demonstrate the viability and efficiency of the concepts, methods and paradigms developed through a total of 9 work packages:
WP1 (Project management), WP2 (Knowledge representation and data management), WP3 (Process experiments), WP4 (Model building, process dynamics and model reduction), WP5 (Sensor development and validation), WP6 (Process design and optimization), WP7 (Process monitoring and control), WP8 (Integration), and WP9 (Demonstration). Transversal genericity among the different applications is pursued mainly in WP4 to WP8, while WP2 and WP3 provide the necessary specifications in terms of data and experiments to make the concepts applicable in WP9.
Three central paradigms have been emphasized within the CAFÉ project: model parsimony, which can be simply expressed by the fact that a simple model will always do a better job than a complex model as long as the simple model is validated on real-life data; the combination of several measurements, more specifically the combination of different measuring techniques based on different physical principles as well as the integration of software sensors; and optimising control, i.e. the combination of optimisation and feedback control within a unified control scheme.
Project Context and Objectives:
The food industry is nowadays facing critical changes in response to consumer needs, which, in addition to health and safety awareness, demand an ever larger diversity of food products with high quality standards. On the consumer side, such variety in demand is driven by social or ethical incentives, as in the case of products that are more environmentally friendly or produced by sustainable processes. On the other hand, the food industry is in a permanent quest for new markets and for population sectors not accessible before. This immediately translates into the search for novel products and more efficient processes so as to gain market opportunities over other companies.
In order to satisfy such needs and demands, which, although driven by the product, directly affect the process, novel and efficient food and process engineering approaches must be developed so as to comply with the proposed requirements. Product engineering approaches are already responding to the challenge by proposing new methods and tools to systematically modify or even design new products in response to consumer needs.
In similar terms, Process Engineering should offer efficient and flexible process alternatives that comply with product safety and quality standards while minimizing operation costs and environmental impact. At this point, it must be noted that, given the characteristics of the food industry, these two concepts can be almost immediately related to the minimization of energy and water consumption, and therefore to sustainability, a notion of particular relevance these days, when persistent evidence warns us of global climate change.
To that purpose, original combinations of adapted or standard process unit operations need to be designed and optimally operated through smart control configurations covering the whole food plant production. In this way, food plants would evolve so as to become flexible and multipurpose production structures, able to efficiently modify or adapt their operation, or to combine several production lines in the best possible way (accounting for both product and process requirements) in response to market demands.
Main obstacles to smart operation and control in the food industry
The food industry is well established and many processes in operation nowadays are the subject of intensive work on devising better operation modes, both in terms of product quality and safety (how to operate in order to ensure quality and comply with safety constraints) and in terms of operation costs and environmental impact. There is also intensive development work aimed at responding to consumer demands by designing new products and by designing and operating the most appropriate combination of unit operations needed to produce them.
However, and despite the fact that the essential physical, biochemical and microbiological principles are reasonably well understood, foods are complex systems with properties that, because they are connected with quality and safety, are usually very difficult to measure, estimate or even represent through reliable models. Such properties may include physico-chemical parameters associated with quality, such as nutrient content, texture, colour or rheology, or microbiological characteristics usually connected with food safety.
In addition, and from a Process Engineering perspective, the food industry integrates a rich variety of apparently very diverse processes and technologies thus hampering the search for unifying paradigms useful for dealing with different yet analogous processes. Such processes have only recently been classified into a reasonably small number of categories, namely bioconversion, separation, preservation and structuring.
The different processes summarized above might serve several purposes simultaneously, or may be combined for the same product. For instance, cooking also has the effect of better preserving products, and bioconversion or separation technologies such as lactic fermentation or drying also favour preservation. It is also important to highlight that, in order to guarantee production efficiency and product quality, the combination of the different processes and technologies in each production line, and their operation, should be defined and integrated harmoniously so as to produce the required coordination. Such integration, being flexible, would allow process and product requirements to be satisfied. In this context, it is worth noting that the food industry has experienced significant changes in its mode of operation over recent years, in order to adapt rapidly to a changing market driven by consumer demand, stringent safety and environmental regulations, and highly restrictive specifications on product quality. However, and despite previous R&D efforts, the path from product conception to optimal product development, including plant design and operation, is still too long and too costly.
In particular, when considering more specifically plant operation and the different processes briefly summarized above, the efficiency of the current control schemes in the food industry is still far from optimal in a global sense, i.e. when operation costs, energy and water consumption, environmental impact and product quality and safety are all taken into account.
The main underlying obstacles to optimal plant operation and control are those derived from the (sometimes only apparent) diversity and complexity of processes, technologies and products, which often translate into partial, empirically driven solutions that are valid only on a case-by-case basis.
Current needs
From a control and automation point of view, the following weaknesses are presently observed in most food companies:
• Most plant control schemes reduce to local and decentralized control loops acting on a usually very small number of states (typically temperature or pressure) that are not directly connected with product quality and, in many cases, not with critical aspects of the operation such as water or energy consumption either. Despite the fact that the performance of PID control can in many instances be more than acceptable (as is also the case in other process industries), the control loops should be combined across the different processes so as to exploit synergies rather than cancel them. In addition, this regulatory layer is neither commanded by nor integrated into higher-level supervisory layers.
• Although many food plants nowadays benefit from advances in data acquisition and monitoring of the full production lines to gather and store huge amounts of data, the use of such information is quite limited: it is usually not employed efficiently and is reduced to configuring alarms (often handled at a very low level) or to producing simple production decision rules and off-line control of inventories. A much more efficient use of this information would be possible if it were properly combined with process models and prognosis tools, so as to estimate unmeasured yet relevant plant states and to predict future scenarios, even in a real-time context.
• Often recognized as a specificity of food processes, the lack of sensors for relevant product characteristics is still a problem. Even if numerous algorithms are available for advanced control purposes, reliable on-line, real-time information is obviously necessary. Common sensors (temperature, pressure, pH, flow rates, …) give information only very indirectly related to the product properties of interest, such as texture, aroma content, biomass, contaminants, vitamins, etc. Sensors for such product properties are either missing completely, or are too expensive, insufficiently robust or too unreliable to be used in everyday industrial practice. Improving the reliability of sensing devices and developing new hardware-software sensing techniques to estimate on-line product quality parameters that are difficult to measure are critical steps in developing smart control applications for food factories.
• It is important to note that, generally speaking, there is a lack of control design and operation paradigms for optimally operating plants, either learnt or inherited from diverse yet similar processes or scenarios. One such paradigm is the one pursued by the engineering community, which focuses on concepts such as simulation, optimality and optimisation as the basis of a truly systematic approach to the problem of devising smart control for food processes.
In order to overcome the limitations currently observed in food production plants, and to offer a novel and original integral approach that guarantees complete and intelligent control of the whole plant in response to changes in demand and in the supply of raw materials, while ensuring product quality, flexibility and efficient, environmentally compliant operation, a number of scientific and technological objectives, stated so as to be achievable within the project duration, are proposed next.
Scientific and technological objectives
The main scientific and technological objective of the project is the development and implementation of novel process engineering tools and methods to efficiently control, in a flexible way, wide classes of food processing plants. In particular, it concentrates on the development of an integrated approach to optimal food process operation and control, implemented through reliable and novel sensing systems and advanced simulation tools:
• to efficiently reconstruct unmeasured states of the plant, such as process and product parameters indicative of food quality and of the health safety of the food product, as well as process operation parameters indicative of the efficiency of the operation;
• to provide, efficiently and in real time, predictions of future scenarios, and robust and efficient control strategies that achieve optimality in terms of process factors (minimization of operation costs, energy and water consumption, environmental impact) and quality parameters of the food product.
Figure 1. The integration concept of CAFÉ
In order to reach this global objective, the following specific objectives need to be achieved.
1. The development of a robust and reliable sensing architecture, in order to obtain on-line estimates of product and process parameters and properties indicative of food quality, safety and efficiency of the operation. The novel, low-cost process sensing systems will be included in an integrated measurement system, based on optimized ensembles of sensors and PAT (Process Analytical Technology) technologies, in combination with modelling and identification techniques, to allow real-time or near-real-time (e.g. on-, in- or at-line) monitoring of critical parameters (raw material, process, product) while manufacturing is in progress. The sensing system will be built on a hardware-software architecture with the following properties.
• It will be Component Based, i.e. it will be constituted by adaptable application components of different types, using recent paradigms based on formal real-time process specifications.
• It will have a Distributed Configuration. In order to overcome the intrinsic complexity of the plant and to be of use in a real-time context, software components will be implemented on distributed platforms.
• It will be Reusable. The architecture must be designed as a framework of reusable components with distributed real-time features. This framework will be extensible and adaptable in order to allow its application to a wide range of possible applications.
2. The development and application of new concepts and tools from model building and simulation to produce dynamic models able to capture the relevant process features. The models will have explicit parametric dependence and thus will be suitable for dynamic optimisation, smart sensor and sensing system development (objective 1), monitoring and control. To that purpose:
• Different classes of mechanistic dynamic models covering the different food process categories will be explored. These models will cover the main phenomenological aspects of the systems in order to capture kinetic and microbiological phenomena related to food quality and safety aspects, even at the cell level. In addition, detailed flow and heat transfer models, also based on Computational Fluid Dynamics (CFD) and population balance equations (PBEs), will be constructed.
• Novel model reduction techniques will be adapted to the selected models to obtain simpler and computationally amenable reduced descriptions, while maintaining their explicit parametric dependence in a format that keeps them appropriate for process monitoring and control.
3. The development of an intelligent control framework to operate the complete plant and to take optimal decisions. The framework will include conceptual modelling tools, efficient real-time optimal decision algorithms (real-time optimisers) and robust control methods that will minimize the effect of the system's inherent uncertainty and will make rational use, in combination with models and in a predictive context, of the large and complex data sets obtained from the plant signals and sensing technologies. The framework will include the following two items.
• An advanced control layer that will take advantage of the monitoring of quality information provided by on-line quality sensors and Process Analytical Technology (PAT) and advanced and robust controllers based on mathematical models of the processes, to account for the uncertainties associated with the system and external disturbances and thus enforce optimal operation.
• Optimal operation support tools. These tools will include dynamic predictive simulation and dynamic optimisation user-friendly modules for analysis (optimal selection of future plant operation policies and optimal decision making). Optimality will be stated in quantitative terms by maximizing product quality (while satisfying safety constraints) as well as in terms of minimizing operation costs, mostly related to the minimization of water and energy consumption, and environmental impact.
4. The demonstration of the applicability and efficiency of the concepts, methods, paradigms and tools developed on a number of application cases representative of processes and process combinations in the food industry. To that purpose, the CAFÉ concepts, methods and tools will be confronted with standard control solutions in order to quantify improvements in terms of product quality, operation costs and environmental impact, on the following set of applications illustrative of bioconversion, separation, preservation/stabilization and structuring, in cooperation with the SMEs involved in CAFÉ project:
1) wine making, as a case study of bioconversion processes
2) microfiltration of food beverages, as a case study of separation processes
3) freeze-drying of lactic acid bacteria, as a case study of preservation processes
4) ice cream crystallization, as a case study of structuring processes
Project Results:
WP2: Knowledge representation and data management
The specific objectives of WP2 were to propose tools and methods dedicated to knowledge and data management for the different case studies. One main point was to use and validate these methods for all food processes involved in the CAFE project. Approaches, architectures, models and innovative software have been developed and implemented on the different food processes. The relevance of the proposed methods and their ability to evolve and operate together was also analyzed. A major challenge was to provide a data management system that is easy to use and efficient in a scientific context. One of the problems often encountered is the lack of formalized descriptions allowing data to be re-used and shared on a large scale. Indeed, one of the biggest challenges in scientific research is now to unlock the full value of scientific data. The first key issue is to strongly associate data models (relational models, XML Schema, formats, etc.) with formalized knowledge models. We focused on introducing metadata expressiveness to provide a controlled vocabulary and inference through ontologies. The other main issue is to represent knowledge in a generic way, so that models and tools can easily be implemented for the different food processes. This task required identifying the concepts common to food processes, formalizing these concepts and formalizing the relations between them. Designing tools and methods valid for all case studies was a specific challenge of this work package. We have developed and implemented, for all case studies: a layered architecture and Web Service implementations; models for scientific and local databases; ontologies for processes, variables and information descriptions (metadata); and two approaches to interfacing with mathematical models, based either on OpenMI/Seamless or on the local database.
The ontology of units and measures was specifically created to offer a structure in which units and measures can be formally defined (see WP2.1). It allows the role and acquisition method of each variable to be specified, and usual/unusual and on-line/off-line variables to be distinguished, etc. This information can then be used to generate the databases. This ontology was implemented on the different food processes. It was accompanied by a set of services that support unit conversion, enabling models to obtain information in the appropriate units.
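To illustrate how such an ontology-backed conversion service can be used by a model, the hedged Python sketch below annotates a variable with a unit term and rescales it on request. The unit identifiers and conversion factors are illustrative assumptions, not the actual content of the CAFE ontology.

```python
# Minimal sketch (not the actual CAFE ontology): a variable annotated with a
# unit term and a conversion service that rescales it to the unit a model asks for.
from dataclasses import dataclass

# Hypothetical unit terms with affine factors to a common reference unit.
UNIT_TO_SI = {
    "om:degreeCelsius": (1.0, 273.15),   # value_K = value_C * 1.0 + 273.15
    "om:kelvin":        (1.0, 0.0),
    "om:bar":           (1.0e5, 0.0),    # value_Pa = value_bar * 1e5
    "om:pascal":        (1.0, 0.0),
}

@dataclass
class Variable:
    name: str          # e.g. "fermentation temperature"
    value: float
    unit: str          # ontology term annotating the measurement

def convert(var: Variable, target_unit: str) -> Variable:
    """Convert a variable to the unit requested by a model, via the registry."""
    a_from, b_from = UNIT_TO_SI[var.unit]
    a_to, b_to = UNIT_TO_SI[target_unit]
    si_value = var.value * a_from + b_from
    return Variable(var.name, (si_value - b_to) / a_to, target_unit)

t = Variable("tank temperature", 28.0, "om:degreeCelsius")
print(convert(t, "om:kelvin").value)   # 301.15
```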
Case study 1: Wine-making
The main result was the management of the different kinds of measurement (on-line, at-line and off-line) and the implementation of metadata. A Linked Data environment has been provided, based on Semantic Web languages (RDF, RDFS, OWL and SPARQL), which allows resources (experiments, data, articles, reports, web pages, …) to be linked. The scientific database is the process memory and is used a posteriori for scientific issues, data analysis, data mining, modelling, etc. This guarantees data reliability, integrity and sustainability. For example, the occasional anomalous results caused by oenological operations must be distinguished from disruptions caused by faults and failures: oenological operations should not be taken into account in the analysis, while faults may significantly affect fermentation and should be considered in the analysis. Thanks to a Web form in the Information System, operators and experts described events in RDF syntax according to the ontology of events. The management of off-line measurements is often difficult because the data sources are very heterogeneous. The information system generates custom spreadsheets (xls files) and allows acquisition in a user-friendly way. The software checks the consistency of the data, displaying which data already exist in the database, which will be updated and which will be added.
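As a purely illustrative sketch of this event-annotation idea, the following Python/rdflib snippet records an oenological operation as an RDF event so that analysis scripts can later exclude the corresponding time window. The namespace, class and property names are hypothetical, not the actual CAFE event ontology.

```python
# Sketch only: recording an oenological operation as an RDF event so that it can
# later be excluded from fermentation analyses. Namespace and term names are
# illustrative assumptions, not the actual CAFE event ontology.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

CAFE = Namespace("http://example.org/cafe/events#")   # hypothetical namespace
g = Graph()
g.bind("cafe", CAFE)

evt = CAFE["event_2011_03_14_tank7"]
g.add((evt, RDF.type, CAFE.OenologicalOperation))          # not a fault/failure
g.add((evt, CAFE.concernsTank, Literal("fermenter-7")))
g.add((evt, CAFE.operationType, Literal("pumping-over")))
g.add((evt, CAFE.startTime, Literal("2011-03-14T09:30:00", datatype=XSD.dateTime)))

# Later, an analysis script can list the time windows to discard:
q = "SELECT ?e ?t WHERE { ?e a cafe:OenologicalOperation ; cafe:startTime ?t }"
for row in g.query(q, initNs={"cafe": CAFE}):
    print(row.e, row.t)
```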
Case study 2: Microfiltration of food beverages
An ontology describing the beer production process has been created and published on the food-ontology portal Wurvoc as the brewing ontology (http://www.wurvoc.org/brewing). It has been refined to zoom in on the filtration process (filtration ontology, http://www.foodontology.nl/beer-filtration), and all relevant physical and chemical variables in this process, together with the workflow of filtration experiments, have been specified (beer filtration variables ontology, http://www.foodontology.nl/beer-filtration-variables). Concepts from these ontologies are used in the membrane fouling model and in the beer membrane filtration unit. In order to store the data in a fast and meaningful way, we opted to use SeamFRAME, which was specifically designed to address aspects of the agriculture domain. Once the ontologies have been created, the framework can be used to (semi-)automatically generate a matching database and a set of Java classes. These classes and the tables in the database represent the classes from the ontology. The database stores instances of the classes with their semantic metadata, while the Java classes are used to connect the different models, input devices and database to each other. The connection between models, devices and databases uses OpenMI, a standard that defines how models should communicate with each other; the database and the input devices can be considered as models as well. The generated Java classes are used to program layers around the different models. The OpenMI standard is request driven: each layer specifies what data is available, and when other models need some information they request it from the relevant model. This model in turn might ask another model for certain information. Once this chain reaches the lowest level (usually either the database or the input device), the information travels back up again. Although the SeamFRAME framework automates many of the above-mentioned translation tasks, the system has some limits, and we had to restructure part of the ontologies to fit the constraints that SeamFRAME imposed on them. Moreover, we aimed to gather process data and present them in a uniform and standardised way to the scientific layer that handles further processing in the fouling model and the control model. SeamFRAME turned out to be a suitable solution to gather and present explicitly modelled data and processes. The result is a database and a model connection interface.
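The request-driven (pull) pattern imposed by OpenMI can be illustrated with a minimal sketch in plain Python; the actual implementation uses the Java classes generated by SeamFRAME, and the component and quantity names below are invented for the example.

```python
# Plain-Python sketch of the request-driven (pull) pattern used by OpenMI:
# a component answers a request for a value at a given time, asking the
# component below it for whatever inputs it needs. Names are illustrative.

class SensorComponent:
    """Lowest level: plays the role of the input device / database."""
    def get_value(self, quantity, t):
        # In the real system this would query the generated database layer.
        return {"TMP_bar": 0.8 + 0.01 * t}[quantity]

class FoulingModelComponent:
    """Asks the component below it for data, then returns its own output."""
    def __init__(self, source):
        self.source = source
    def get_value(self, quantity, t):
        tmp = self.source.get_value("TMP_bar", t)        # request travels down...
        resistance = 1.0e12 * tmp                        # ...result travels back up
        return {"fouling_resistance": resistance}[quantity]

chain = FoulingModelComponent(SensorComponent())
print(chain.get_value("fouling_resistance", 10.0))
```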
Case study 3: Freeze-drying of lactic acid bacteria
For this case study, models, architecture, methods and software have been implemented. We were able to take existing software into account and obtained very good interoperability; Web services based on standards and the local database model enabled this important result. The local database plays the role of a hub for the transmission of all information required for plant control in real-time operation, and thus represents the main communication means between the concurrent processes and software units that constitute the control application. The local database scheme provides one table per variable, with triggers that create a new table as soon as a new variable is created. This makes it possible to manage the evolution of the studied variables (e.g. one variable studied for a short duration, a new variable to be measured over the long term, etc.). This solution avoids generating many null values in columns (as would happen with a single table having one field per measured variable) or excessively long tables (as would happen with a single table storing all measurements in rows rather than columns). This kind of architecture enables a dynamic design of the hardware and software, providing a competitive advantage through the optimal reallocation of computing capabilities. It also plays a key role as an integration interface for many of the technologies already present in process rigs. The integration is implemented either by generic bindings from existing software to the tables where inputs, outputs and parameters are logged, or by a plug-in infrastructure provided in an open and portable way.
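A minimal sketch of this "one table per variable" layout is given below, emulated with SQLite; a helper function stands in for the database trigger that creates the table when a new variable is declared. Table and variable names are illustrative.

```python
# Sketch of the "one table per variable" layout of the local database, emulated
# with SQLite. In the real system a trigger creates the table when a new variable
# is declared; here a helper function plays that role. Names are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")

def declare_variable(name: str) -> None:
    """Create a dedicated time-series table for a newly declared variable."""
    con.execute(
        f'CREATE TABLE IF NOT EXISTS "{name}" (t REAL PRIMARY KEY, value REAL)'
    )

def log(name: str, t: float, value: float) -> None:
    con.execute(f'INSERT INTO "{name}" VALUES (?, ?)', (t, value))

declare_variable("shelf_temperature")    # long-term monitored variable
declare_variable("chamber_pressure")     # added later without altering other tables
log("shelf_temperature", 0.0, -40.0)
log("chamber_pressure", 0.0, 20.0)       # illustrative value

print(con.execute('SELECT * FROM "shelf_temperature"').fetchall())
```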
Case study 4: Ice cream crystallization
This case study shows how flexible our approach is. We started from scratch on the ice cream process. After 20 person-days of work we obtained a Scientific Information System with the main functionalities: management of experimental conditions with a standardized vocabulary; on-line and off-line data acquisition (data from the current or previous experiments can be plotted dynamically in a Web browser); and annotation of data, experiments, protocol descriptions, etc. A Linked Data environment is available.
The CAFE Information System currently runs on the ice cream crystallization process. It was used throughout the “Cafe DEMO day” demonstration in Antony.
Traceability
In the field of food processing, improving the performance and reliability of the information processing dedicated to traceability is a major issue. Traceability allows any product to be tracked from its origin to the final reception point. This requires recording all movements of products and all steps within the production process. With traceability, it is possible to identify the precise date, time and location of products in order to be able to recall them. We proposed a methodology to store traceability information using Semantic Web annotations and a reasoner. The first part of this work started from the generic process ontology. This ontology describes the different aspects of a process involving products and sub-processes (or unit processes). It implied the definition of a precise shared vocabulary with the different teams of the CAFE project. These knowledge models describe processes as successive sub-processes involving products and operations. This representation allows case studies to be described and products to be traced in a simple way. Products and sub-processes are linked by properties, and an experiment can be described as an instance of a process. Any product can be an input product for a specific sub-process while the same product constitutes the output of another sub-process. We also defined the 'adding product' property. In our ontology, the relationship between products and unit processes is described and specialized. This method allows the elaboration of a product to be traced semantically; a software reasoner infers new information, which traceability requires, and we used a reasoning engine to infer an RDF graph. Beyond the traceability needed within the CAFE project, we used semantic graphs to link supplier, producer, retailer and carrier. Graphs of products or graphs of operations can be generated, allowing queries to retrieve the history of a product and to estimate the impact of a product. This work was done with BivTrace and INRA. We developed a software prototype of a smart tool whose originality is: (i) to use reasoning to help experts find the origin of a problem; (ii) to determine, by inference, all the actors impacted by the problem; (iii) to provide user-friendly access to semantic information. We adapted the process ontology to the traceability case, in close collaboration with traceability experts. We added new concepts such as "Lot", "Actor" or "Transaction" and semantic relations between these concepts. “IsTransformedIn”, “IsMixedTo”, “IsCutUp” and “IsComposedOf” are defined as sub-properties of “IsImplicatedIn”. We designed a distributed application architecture based on a Web Service. A main point of this application is the interoperability between the semantic graph part and the database part; the semantic graph and the database model are based on key common concepts.
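The following hedged sketch illustrates the inference step: declaring the specialised relations as sub-properties of “IsImplicatedIn” lets an RDFS reasoner answer generic "which lots are implicated" queries. It uses rdflib with the owlrl reasoner; the namespace and instance names are invented for the example.

```python
# Sketch of the traceability idea: "IsTransformedIn" is declared a sub-property of
# "IsImplicatedIn", so an RDFS reasoner lets one query all lots implicated in a
# unit process without naming each specific relation. Namespaces are illustrative.
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS
import owlrl   # RDFS/OWL-RL closure for rdflib graphs

TR = Namespace("http://example.org/traceability#")
g = Graph()
g.bind("tr", TR)

# Schema: specialised relations all imply "IsImplicatedIn"
g.add((TR.IsTransformedIn, RDFS.subPropertyOf, TR.IsImplicatedIn))
g.add((TR.IsMixedTo, RDFS.subPropertyOf, TR.IsImplicatedIn))

# Facts about lots and unit processes
g.add((TR.milk_lot_42, TR.IsTransformedIn, TR.pasteurisation_07))
g.add((TR.ferment_lot_3, TR.IsMixedTo, TR.pasteurisation_07))

owlrl.DeductiveClosure(owlrl.RDFS_Semantics).expand(g)   # inference step

q = "SELECT ?lot WHERE { ?lot tr:IsImplicatedIn tr:pasteurisation_07 }"
for row in g.query(q, initNs={"tr": TR}):
    print(row.lot)   # both lots are returned, via the inferred super-property
```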
WP3 : Process experiments
Case study 1: Wine-making
Wine quality is difficult to estimate because more than 100 different compounds can contribute to wine flavour. A number of “quality marker molecules” have been identified among these compounds: (i) varietal aromas, volatile compounds linked to non-volatile precursors in the grape that are released by the yeast during fermentation, (ii) fermentative aromas, generated by yeast secondary metabolism. The varietal aroma compounds are present at very low concentrations, and cannot therefore be determined on-line. Efforts to develop on-line monitoring of quality and metabolic markers therefore currently focus on fermentative aroma compounds.
Task 3.1: Product and process characterization using on-line and off-line measurements
The main objective of the “Bioconversion” case study in Task 3.1 was the on-line measurement of key ‘marker molecules’ with a high acquisition frequency during the winemaking fermentation. To measure the concentrations of these volatile molecules, it was decided to use an on-line gas chromatography (GC) system. The acquired data gave access to the synthesis kinetics of some key volatile markers. It was possible to know when each metabolite is produced and thus to establish a chronology of the metabolic events taking place during the wine-making fermentation. Moreover, the on-line data were compared with standard off-line data, such as the CO2 release kinetics and the concentrations of the main products. Correlations between some volatile markers measured on-line and some off-line parameters have been established. Thanks to the high acquisition frequency of this device, kinetic parameters of major interest for modelling yeast metabolism were calculated.
Task 3.2: Impact of the process variables, setup of a database for process modelling and identification of the parameters critical for the control
It was decided to focus on the study of the gas-liquid transfer of the volatile compounds. Indeed, even if the concentration of volatiles at the end of fermentation depends primarily on their synthesis by the yeast, it may also be significantly modified by losses into the exhaust CO2. Measuring these data on-line allows balances to be calculated that differentiate the microbiological production process from the physicochemical transfer into the exhaust CO2. This permits a better understanding of the production of the fermentative aromas and the development of optimized strategies for fermentation control. Indeed, from a microbiological point of view, the total amount produced must be considered, whereas, from a technological point of view, the concentration remaining in the wine is the key issue. The study of the gas-liquid transfer focused on three compounds representative of the diversity of the fermentative aromas: a higher alcohol (isobutanol), an acetate ester (isoamyl acetate) and an ethyl ester (ethyl hexanoate). During alcoholic fermentation, the gas-liquid ratio of the volatile compounds can be affected by different parameters: the temperature, the liquid phase composition (matrix effect) and the CO2 release (stripping effect). These last two parameters vary throughout the whole fermentation.
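The balance mentioned above can be written as: total amount produced = amount remaining in the liquid + amount stripped into the exhaust CO2, the latter obtained by integrating the gas-phase concentration times the CO2 flow rate over time. The sketch below illustrates this computation; all numbers are made up for the example.

```python
# Illustrative sketch of the balance that separates microbiological production from
# physical stripping: total produced = amount in the liquid + amount carried away
# in the exhaust CO2 (gas-phase concentration integrated over the gas flow).
import numpy as np

t = np.linspace(0.0, 100.0, 201)                 # h, fermentation time grid
V_liq = 1.0                                      # L of fermenting must
c_liq = 2.0e-3 * (1.0 - np.exp(-t / 30.0))       # g/L, compound in the liquid
c_gas = 1.5e-5 * (1.0 - np.exp(-t / 30.0))       # g/L, compound in exhaust gas
q_co2 = 0.05 * np.exp(-((t - 40.0) / 25.0) ** 2) # L/h, CO2 release rate

stripped = np.trapz(c_gas * q_co2, t)            # g lost to the gas phase
in_liquid = c_liq[-1] * V_liq                    # g remaining in the wine
total_produced = in_liquid + stripped

print(f"produced {total_produced:.4e} g, of which {stripped:.4e} g stripped")
```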
Task 3.3: Test of control strategies elaborated in WP6 and WP7.
The control strategies developed in WP7 were tested on the continuous multi-stage reactor (MSCF), which was shown to be representative of the traditional batch fermentation plant. The control objective was expressed in terms of time minimization: the goal was to drive the system as fast as possible from one set-point to another. The overall motivation for controlling the MSCF is to make this fermentation setup more reliable and reproducible; the selection of desirable values of the sugar concentrations becomes easier thanks to the use of a model associated with a control system. The linearizing feedback control law has been tested on the experimental setup.
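The actual model and control law are described in the WP4/WP7 deliverables; as a generic illustration of linearizing feedback control, the sketch below imposes first-order convergence of the sugar concentration in a single stirred stage, assuming a simple substrate balance and Monod kinetics (both assumptions made for the example only, with the biomass treated as a measured, slowly varying quantity).

```python
# Generic sketch (not the actual CAFE control law) of linearizing feedback control
# of the sugar concentration S in one stirred stage, assuming the balance
#   dS/dt = D*(S_in - S) - k*mu(S)*X.
# Choosing D so that dS/dt = -lam*(S - S_ref) imposes first-order convergence.

def mu(S, mu_max=0.3, K_S=5.0):
    """Assumed Monod kinetics (illustrative parameters)."""
    return mu_max * S / (K_S + S)

def linearizing_D(S, X, S_ref, S_in, k=2.0, lam=0.5):
    """Dilution rate making the closed loop behave as dS/dt = -lam*(S - S_ref)."""
    return (-lam * (S - S_ref) + k * mu(S) * X) / (S_in - S)

# Small simulation of a set-point change (explicit Euler, illustrative values)
S, X, dt = 40.0, 2.0, 0.01          # X held constant as a simplification
for _ in range(3000):
    D = max(0.0, linearizing_D(S, X, S_ref=20.0, S_in=200.0))
    S += dt * (D * (200.0 - S) - 2.0 * mu(S) * X)
print(round(S, 2))                   # close to the 20 g/L set-point
```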
Case study 2: Microfiltration of food beverages
For beer filtration, three methods are common: depth filtration and surface filtration in single- or double-pass configuration.
Depth filtration removes particles from beer within the depth structure of the filter medium itself. The particles are either mechanically trapped in the pores or adsorbed on the surface of the internal pores of the filtration medium. The filter media can be pre-made sheet filters or a fine powder made of, for example, diatomaceous earth (DE), also known as kieselguhr, which is introduced into the beer and re-circulated past screens to form a filtration bed. Surface filtration can be either absolute or nominal, with a minimal depth capacity. It relies on a thin membrane, possibly coated with polypropylene or polyethersulfone, in which particles are trapped in the pores of the filter medium. Prior filtration with a depth filter is usually required to prevent clogging of the surface of a cartridge membrane filter. Surface filtration using membranes is a novel beer filtration method. Cleaning of the membrane filters is therefore still done using a rule of thumb, namely a simple maximum pressure rule that was determined experimentally. Finding an optimal cleaning strategy is the next step: minimizing the TCO (Total Cost of Ownership) of a filtration unit by reducing chemicals, energy and water consumption and lengthening membrane life.
Task 3.1: Product and process characterization using on-line and off-line measurements, improvements in the description and understanding of the process
The fouling of the microfiltration membrane is studied by varying specific process and beer parameters. Based on data available from the literature, three beer components were determined to be the most relevant factors causing fouling on the membrane surface and in the membrane pores: macromolecules, aggregates, and yeast cells. Macromolecules, like proteins and carbohydrates, are no larger than about 0.2 μm; these particles are expected to be adsorbed in the membrane (mainly beta-glucans). Aggregates (colloids) are the so-called haze particles, which reach a size of around 0.5 μm; besides being adsorbed in the membrane and in some pores, they are expected to be captured in the cake layer. The largest particles are the yeast cells and (if added) filter aids, with a size of about 5 μm; these particles are larger than the membrane pores and are expected to form the cake layer. First, the “mBMF” was built to run in a cross-flow configuration, as is also done on industrial scale: beer is circulated in a loop through the membrane hollow fibres. The process parameters that can be varied are the cross-flow velocity over the membrane surface and the permeate flux through the membrane pores. Since most of the data available from the literature to describe filter fouling in modelling systems are based on dead-end studies, the decision was made to adapt the mBMF so that it can also run in a dead-end configuration. This means that the beer is pressed into the hollow fibre membrane, which is blocked at the outlet; the beer is pressed through the membrane instead of partly circulating back into the tank. To meet the requirements of the brewers and to work with the most realistic beer composition, a third process program was also written: in a serial filtration, beer is filtered in a cross-flow configuration on one membrane module and the freshly filtered permeate is used as feed for a second filtration step.
Task 3.2: Impact of the process variables, setup of a database for process modelling and identification of the parameters critical for the control
The effect of macromolecules on fouling was studied with unfiltered beer in a serial configuration. In the first filtration step, the yeast cells and macromolecules should have been filtered out; the second filtration step should then separate the remaining macromolecules. Several filtration runs in this configuration were performed, but they were hampered by strong foam formation within the membranes and the tubing between them. Aggregates are temperature sensitive, so temperature was used as a parameter to influence the concentration of aggregates present in the feed beer. The higher the temperature during centrifugation, the more aggregates are expected to remain in the beer, since at higher temperature most of the aggregates were dissolved and therefore not removed during centrifugation.
Task 3.3: Test of control strategies elaborated in WP6 and WP7.
Advanced (optimal) control relies on a physical dynamic BMF model. The general goal is to produce enough filtered beer over time while removing fouling at minimum cost. The average transmembrane pressure (TMPavg) is measured and used as an indicator of membrane fouling; it is not allowed to exceed an upper bound, in order to prevent membrane degradation. The following optimal control configurations were studied: constant values of retentate and permeate flux that are identical for all filtration periods (CC0); constant values of retentate and permeate flux during each filtration period but different from one filtration period to another (VC0); and variable values of retentate and permeate flux during each filtration period that are also different for each filtration period (VC1).
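The three configurations differ only in how the retentate and permeate flux profiles are parameterized. The sketch below shows the corresponding decision spaces and how each decision vector expands into a full per-interval control profile; the fouling dynamics, cost function and TMPavg path constraint come from the BMF model and are not reproduced here.

```python
# Sketch of the three decision-variable parameterizations (CC0, VC0, VC1) over
# N filtration periods, each discretized into M control intervals. Only the shape
# of the decision space is illustrated; the cost and constraints use the BMF model.
import numpy as np

N, M = 5, 10   # filtration periods, control intervals per period

def decision_size(config: str) -> int:
    if config == "CC0":   # one (retentate flux, permeate flux) pair for all periods
        return 2
    if config == "VC0":   # one pair per filtration period
        return 2 * N
    if config == "VC1":   # one pair per control interval of every period
        return 2 * N * M
    raise ValueError(config)

def expand(config: str, theta: np.ndarray) -> np.ndarray:
    """Expand a decision vector into per-interval (retentate, permeate) values."""
    if config == "CC0":
        return np.tile(theta.reshape(1, 2), (N * M, 1))
    if config == "VC0":
        return np.repeat(theta.reshape(N, 2), M, axis=0)
    return theta.reshape(N * M, 2)      # VC1

for cfg in ("CC0", "VC0", "VC1"):
    theta = np.ones(decision_size(cfg))
    print(cfg, "->", decision_size(cfg), "decision variables,",
          expand(cfg, theta).shape, "control profile")
```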
Case study 3: Freeze-drying of lactic acid bacteria
This case study is mainly investigated on a model strain of lactic acid bacteria, Lactobacillus bulgaricus CFL1, a strain that is very sensitive to the freeze-drying process. Two formulations of protective molecules were selected according to the following criteria: different physical behaviour during the freeze-drying process and different ability to protect bacteria during the process. The freeze-drying process involves three successive steps: freezing of the aqueous solution, followed by primary drying to remove ice by sublimation, and finally secondary drying to remove unfrozen or sorbed water by desorption. Improving the understanding of the freeze-drying of bacteria requires the determination of the drying kinetics (sublimation and desorption kinetics) and the evaluation of quality degradation at various times of the process. Both these aspects were investigated for the protective medium C200 and both freezing conditions (compact and pellet layers). Characterization of the formulation is needed for process optimization and for defining the upper product temperature limit during the primary and secondary steps of the freeze-drying process. During primary drying, if the product temperature is higher than the collapse temperature, the amorphous material will undergo viscous flow, resulting in loss of the pore structure obtained by freezing; this is defined as the collapse phenomenon. Collapsed dried products generally have high residual water content and lengthy reconstitution times, and may also present a loss of functional properties.
Task 3.1: Product and process characterization using on-line and off-line measurements, improvements in the description and understanding of the process
This task was dedicated to the physical and biological characterization of the selected lactic acid bacteria strain with respect to the freeze-drying process. Two protective media and two freezing procedures were investigated. The freeze-drying process resulted in a degradation of the acidification activity of the lactic acid bacteria whatever the freezing and drying conditions applied. The loss of acidification activity is greater for the pellet layer than for the compact layer, even though the loss after the freezing step is lower. The pellet configuration induces more degradation during both drying steps than the compact layer configuration. When considering the different steps of the process, the largest loss of acidification activity is caused by the sublimation step, whatever the freezing and drying conditions. The removal of ice by sublimation cannot itself cause bacterial degradation, since ice and bacteria are phase separated during the freezing step; it is thus the removal of the unfrozen water that causes most of the bacterial degradation. In the pellet configuration, most of the unfrozen water is removed during primary drying (sublimation). When comparing the two drying conditions for the same freezing method (compact layer), the aggressive condition results in a lower loss during the sublimation step and a higher loss during the desorption step than the conservative condition. A higher sublimation rate seems to limit bacterial degradation during the primary drying step; nevertheless, the higher shelf temperature applied during secondary drying seems to have a negative impact on bacterial quality. Freeze-drying of 1-2 mm diameter pellets of bacterial suspension has several potential advantages over compact layer freeze-drying in trays, such as a shorter desorption step and easier handling of the dried product. One of the main objectives of Partner 5 (Telstar) was to develop a freeze-dryer prototype for generating frozen droplets of controlled size from a bacterial suspension and to freeze-dry them. A prototype spray freezer was developed, in which the sprayed droplets are frozen by a cold gas stream circulating counter-currently.
Task 3.2: Impact of the process variables, setup of a database for process modelling and identification of the parameters critical for the control
A systematic study of the effect of the process variables on the drying kinetics and on the degradation of the biological activity of the bacteria was carried out. The chamber pressure applied during the desorption step had an impact on the water activity of the freeze-dried product: the higher the chamber pressure, the higher the water activity. The water activity reached at the end of the process had an important impact on the biological activity recovery of the bacteria and on storage stability.
The impact of the process conditions applied during the sublimation step on the sublimation time was also significant. As expected, an increase in shelf temperature resulted in an important decrease of the sublimation time. The main observation is that the sublimation rate affects the degradation of the biological activity of the bacteria during the process: the higher the sublimation rate, the lower the degradation. This tendency is confirmed by the storage stability results.
When considering the formulation without bacteria, the collapse temperature (Tcoll) and the glass transition temperature of the aqueous solution (Tg’) are close: the glass transition and the collapse of the product structure take place in the same temperature range. The collapse temperature determination was quite difficult for bacterial suspensions, as a result of the less distinct structure pattern obtained by freezing compared to aqueous solutions. In addition, the viscous flow took place gradually, from the beginning of local loss of structure to the complete loss of structure. A strong influence of the cells was highlighted compared to the effect of the protective medium, tending to increase the collapse temperature of the complex material. In conclusion, it can be pointed out that the presence of lactic acid bacterial cells conferred a significant “robustness” on the freeze-dried product, thus allowing the use of higher sublimation temperatures during primary drying than expected from the protective medium alone. The effect of the presence of lactic acid bacteria on the behaviour of the product during freeze-drying seems to be partly related to the cell structure.
Task 3.3: Test of control strategies elaborated in WP6 and WP7.
In the freeze-drying case study, the optimal control problem consisted in reducing the duration of the drying cycle while satisfying final and path constraints on product quality. The constraints considered were the internal process dynamics, as given by the model developed in WP4; the final moisture content of the product, imposed by the product stability requirement; the collapse temperature of the product, ensuring limited biological quality degradation and mechanical integrity of the product; and the equipment capabilities. Optimisation algorithms were implemented and experimentally tested on-line. The real-time optimization algorithms demonstrated their ability to update the control profiles reliably in the face of various disturbances: changes in the initial process state (temperature, pressure, amount of product), temporary lack of measurements (feedback), and temporary difficulties in approaching the prescribed set-points.
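As a hedged illustration of the kind of constrained dynamic optimisation involved, the toy sketch below seeks piecewise-constant shelf temperatures that dry as fast as possible while the product temperature stays below the collapse temperature. The dynamics, parameter values and the fixed-horizon reformulation (minimising remaining moisture instead of cycle time) are illustrative assumptions, not the WP4 model.

```python
# Toy sketch of the constrained optimisation behind the freeze-drying cycle:
# piecewise-constant shelf temperatures are sought such that drying is as fast as
# possible while the product temperature stays below the collapse temperature.
import numpy as np
from scipy.optimize import minimize

K, dt = 6, 2.0            # control intervals, hours each
tau, k_dry = 3.0, 0.02    # toy product-temperature lag and drying-rate constant
T_coll = -10.0            # collapse temperature (path constraint), degC

def simulate(T_shelf):
    T, m, T_traj = -40.0, 1.0, []
    for Ts in T_shelf:
        for _ in range(20):                          # sub-steps per interval
            h = dt / 20.0
            T += h * (Ts - T) / tau                  # product temperature dynamics
            m -= h * k_dry * m * max(T + 40.0, 0.0)  # first-order moisture removal
            T_traj.append(T)
    return m, np.array(T_traj)

objective = lambda u: simulate(u)[0]                              # remaining moisture
path_con = {"type": "ineq",                                       # T(t) <= T_coll
            "fun": lambda u: T_coll - simulate(u)[1]}
res = minimize(objective, x0=np.full(K, -30.0), constraints=[path_con],
               bounds=[(-45.0, 20.0)] * K, method="SLSQP")
print(res.x)                      # shelf-temperature profile riding the constraint
print(simulate(res.x)[0])         # remaining moisture at the end of the horizon
```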
Case study 4: Ice cream crystallization
The ice cream and sorbet manufacturing process is composed of different steps: the first step is the mixing of ingredients, followed by preheating at about 60°C in order to perform homogenisation. Depending on the kind of product, pasteurisation (80-85°C for a few seconds) is carried out. The mix is then stored for ripening at 4°C for 12-24 hours. Afterwards, the mix goes through a stage of pre-freezing and foaming, and ice crystallization takes place inside a Scraped Surface Heat Exchanger (SSHE, often called a "freezer"), thanks to the refrigerant fluid vaporizing at the wall. This step is the most critical of the process and is responsible for the final quality of the product. Ice cream and sorbet quality is mainly governed by sensory properties related to the ice content, the ice crystal size distribution and the apparent viscosity, which depend on how crystallization occurs in the freezer. Throughout the crystallization process, ice cream and sorbet undergo very significant changes in transport and thermal properties. This generates significant changes in the velocity profiles, which, in turn, considerably modify the temperature profiles and pressure drops inside the process equipment. In order to control the final quality and the technological properties of ice cream and sorbet, it is necessary to control the influence of temperature and shear rate, as well as their coupled effect on product quality.
Task 3.1: Product and process characterization using on-line and off-line measurements, improvements in the description and understanding of the process
The pilot-scale Scraped Surface Heat Exchanger (SSHE) has been installed at CEMAGREF and is fully functional. The pilot plant was equipped with a variety of sensors for monitoring the process and the product quality, as well as the refrigerating system: temperature, pressure, power and water consumption sensors on the refrigerating system; on-line quality sensors (draw temperature, ice crystal size distribution by the Focused Beam Reflectance Method (FBRM), ice crystal size distribution by the EZ on-line imaging probe, the on-line viscometer MIVI, a capillary viscometer, etc.). Numerical control of the freezer and data acquisition in LabView® were also implemented. Some of these sensors are quite innovative, such as the two optical sensors for on-line measurement of ice crystal size distribution. A large number of equipment qualification and sensor validation experiments were performed.
Task 3.2: Impact of the process variables, setup of a database for process modelling and identification of the parameters critical for the control
The mechanism of ice crystallization within a freezer is affected mainly by the operating conditions of the freezing process, such as the evaporation temperature of the refrigerant fluid, the dasher rotational speed and the mix flow rate. The temperature of the refrigerant fluid provides the driving force that triggers ice nucleation and determines the heat removal rate of the system. The scraping action of the dasher improves the heat transfer rate between the freezer wall and the product. The mix flow rate dictates the residence time of the product within the freezer, affecting the time available to remove heat from the product and, consequently, the nucleation and growth mechanisms of the ice crystals. It is therefore important to identify the operating conditions of the freezing process that most directly affect ice crystal size, so as to improve the quality of the final product. Our results showed that the FBRM sensor makes it possible to monitor on-line the development of the ice crystals in sorbets containing up to 40% ice. The mean ice crystal chord length was mainly affected by the evaporation temperature and only slightly by the dasher speed. Decreasing the refrigerant fluid temperature reduces the ice crystal size, due to the increase of the supercooling driving force that leads to further ice nucleation. High dasher speeds slightly decreased the mean ice crystal chord length, due to the production of new, smaller ice nuclei by secondary nucleation, induced either by the smaller ice flocs remaining from previous scrapings or by the ice debris produced during the attrition of the larger ice crystals. The draw temperature of the sorbet was most significantly affected by the mix flow rate, followed by the refrigerant fluid temperature and the dasher speed. Low mix flow rates (long residence times) result in lower draw temperatures, because the product remains longer in contact with the freezer wall, so more heat is extracted from it. Low evaporation temperatures also lead to lower draw temperatures. High dasher speeds very slightly warm the product, due to the dissipation of frictional energy into the product, an effect partly moderated by the improvement of the heat transfer coefficient between the product and the freezer wall. We observed that an increase of the mix flow rate reduces the axial dispersion in the SSHE. We also observed that lower evaporation temperatures lead to the presence of a dead volume, caused by the increase of the apparent viscosity of the product near the heat exchange cylinder wall, which delays the exit of a certain amount of product near the freezer wall. The dasher rotational speed showed no significant influence on the residence time distribution (RTD) curves.
Task 3.3: Test of control strategies elaborated in WP6 and WP7.
The control strategy for the crystallization case study was first expressed as a problem of minimization of the energy consumption, with constraints on both the viscosity and the mean crystal size. The energy consumption was numerically evaluated for different values of the evaporation temperature Te, the mass flow rate mfr and the scraper rotation speed Nscrap. We observed that, in the ranges of admissible values of these control inputs, the energy consumption function is monotonic: it decreases with respect to mfr and increases with respect to Nscrap and Te. As a consequence, the energy consumption is minimal for the lowest evaporation temperature Te, the lowest dasher rotation speed Nscrap and the highest mix flow rate mfr; the optimal values of the control inputs are therefore determined by the constraints, and no optimal control strategy is needed to solve the problem. The problem has consequently been reformulated: whereas the mean crystal size can indeed be treated as a constraint, the viscosity has to be controlled, depending on the desired type of ice cream. In the new formulation, the issue is therefore to control the viscosity of the ice cream at the outlet of the freezer. For that, we use the evaporation temperature Te as control input. It must be pointed out that the inlet mass flow rate could also be used but, as it is directly related to the productivity of the process, it is usually kept constant in industry. Under the physical assumption that the outlet temperature (after the pipe) is equal to the saturation temperature, the outlet viscosity of the ice cream depends only on the third moment M3 of the crystallization model developed in WP4. Thus, to control the viscosity, it suffices to control M3, or equivalently Tsat, which is a function of M3. For the experiments, a value of Tsat was chosen as the set-point. A cascade control strategy has been developed to control the saturation temperature of the ice cream, composed of two loops: a primary loop to control Tsat using Te, and a secondary loop to control Te with Vcomp (the compressor rotation speed). The control law has been validated on the experimental set-up and gives satisfactory results.
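A minimal sketch of such a cascade, with toy first-order process models and illustrative tuning (not the actual CAFE implementation), is given below: the outer PI loop turns the Tsat error into a set-point for Te, and the inner PI loop drives Te to that set-point through the compressor speed Vcomp.

```python
# Cascade-control sketch with toy process models and illustrative gains.
class PI:
    """PI controller with output limits and simple anti-windup."""
    def __init__(self, kp, ki, u_min, u_max):
        self.kp, self.ki, self.u_min, self.u_max = kp, ki, u_min, u_max
        self.i = 0.0
    def step(self, err, dt):
        u_raw = self.kp * err + self.i + self.ki * err * dt
        u = min(max(u_raw, self.u_min), self.u_max)
        if u == u_raw:                      # freeze the integrator when saturated
            self.i += self.ki * err * dt
        return u

outer = PI(kp=2.0, ki=0.5, u_min=-30.0, u_max=-5.0)    # output: Te set-point (degC)
inner = PI(kp=50.0, ki=20.0, u_min=0.0, u_max=1500.0)  # output: Vcomp (rpm)

Tsat, Te, dt = -2.0, -10.0, 0.1
Tsat_sp = -4.5                                          # desired saturation temperature
for _ in range(3000):                                   # 300 s of toy simulation
    Te_sp = outer.step(Tsat_sp - Tsat, dt)              # primary (slow) loop
    Vcomp = inner.step(Te - Te_sp, dt)                  # secondary loop; error sign is
                                                        # reversed: more rpm -> colder Te
    Te += dt * ((-5.0 - 0.02 * Vcomp) - Te) / 2.0       # toy evaporator response
    Tsat += dt * ((0.3 * (Te + 6.0) - 3.0) - Tsat) / 20.0  # toy product response
print(round(Tsat, 2), round(Te, 2))                     # Tsat settles near the set-point
```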
WP4 : Model building, process dynamics and model reduction
WP4.1 Existing model review and adaptation
As far as possible, the models have been developed from ‘first principles’. We succeeded in this for three case studies, namely the freeze-drying, ice-crystallization and beer microfiltration cases. The problems addressed in these three cases are dominated by physical phenomena, in contrast to the wine fermentation case, where biochemistry dominates. For physical phenomena, ‘first-principles’ models are often available, while for biochemistry the compounds involved in the reactions and their kinetics are often a priori unknown. A first-principles approach via metabolic-network modelling has been investigated. Deliverable D4.1 also reports on the model adaptations required for the specific problems in the case studies, which involved, for example, developing constitutive laws for material properties and a first estimation of model parameters. All three first-principles models developed for the cases involving processes of a physical nature have gone beyond the state-of-the-art models known in the literature at the start of the project.
WP4.2 Model reduction
Model reduction techniques have been applied to these models. The model reduction applied to the first-principles models of the three case studies has been reported in D4.2 and in several scientific papers. In the freeze-drying and beer microfiltration cases, scale analysis was applied as a first step of the model reduction; with this technique one can simplify the model while still retaining its mathematical-physical description (in terms of algebraic and (partial) differential equations). In the ice-cream case, the population balance is reduced in complexity using the moment method, and here too the mathematical-physical description of the problem can be retained. Only in the freeze-drying case was it necessary to reduce the model further in order to attain sufficient computational speed for use in real-time model-based control. Here, the technique of Proper Orthogonal Decomposition (POD) has been applied. From a modelling perspective this technique has the disadvantage that the mathematical-physical description is lost, but the gain is a very significant increase in computational speed. The top-down approach applied to the wine case more or less automatically leads to a model of reduced complexity; hence, the above-mentioned model reduction techniques need not be applied to this type of process (of chemical nature).
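To illustrate the POD step on a toy problem, the sketch below collects snapshots of a 1-D diffusion simulation, extracts the dominant modes with an SVD and projects the dynamics onto those modes. The freeze-drying model itself is different; this only shows the mechanics of the method and the trade between losing the physical description and gaining speed.

```python
# Minimal POD sketch: snapshots of a toy 1-D diffusion model are collected, the
# dominant modes are extracted by SVD, and the dynamics are Galerkin-projected
# onto those few modes. Illustrative only (not the CAFE freeze-drying model).
import numpy as np

n, nt, dt, alpha = 60, 400, 1e-4, 1.0
x = np.linspace(0.0, 1.0, n)
u0 = np.exp(-100.0 * (x - 0.3) ** 2)                 # initial condition

# Full-order finite-difference operator: du/dt = alpha * d2u/dx2 (fixed boundaries)
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) * alpha / (x[1] - x[0]) ** 2
A[0, :] = 0.0; A[-1, :] = 0.0

u, snapshots = u0.copy(), []
for _ in range(nt):
    u = u + dt * A @ u
    snapshots.append(u.copy())
S = np.array(snapshots).T                            # n x nt snapshot matrix

U, s, _ = np.linalg.svd(S, full_matrices=False)      # POD basis = left singular vectors
r = 5
Phi = U[:, :r]
print("captured energy:", (s[:r] ** 2).sum() / (s ** 2).sum())

# Reduced model: a' = (Phi^T A Phi) a, integrated in r dimensions instead of n
Ar = Phi.T @ A @ Phi
a = Phi.T @ u0
for _ in range(nt):
    a = a + dt * Ar @ a
u_rom = Phi @ a
print("reconstruction error:", np.linalg.norm(u_rom - u) / np.linalg.norm(u))
```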
WP4.3 Model identification and validation
For the ice cream and freeze-drying case studies, model parameters have been identified using the method of optimal experimental design, where the reduced model has been used to compute an optimal excitation of the experimental system. Furthermore, the reduced model has been used for sensitivity analysis in all cases, which is a prerequisite for optimal experimental design. Details are reported in the deliverables of WP3 and WP6.
WP4.4 Development of simulation software
For each case study, simulation software has been developed for the reduced model and applied in model-based process control and optimization. (Reduced) model descriptions are included in deliverables D4.1 and D4.2. Specifics on simulation software for demonstration purposes are described in the deliverables of WP9. A demonstration has been developed only for the ice-crystallization case study.
WP5 : Sensor development and validation
Microfiltration of food beverages
For case study 2, gas sensors provide a way to follow the evolution of the beer during filtration and to detect the point when the beer reaches a stable stage, indicating that no more solids are being removed from the liquid and hence that the membrane is saturated. Gas sensors are obviously sensitive to the composition of the headspace, so for a reliable analysis the product must be kept in conditions suitable for developing a representative and measurable headspace. The beer inside the filtration unit is kept at low temperature (T < 5°C) and high pressure. Both conditions hinder the formation of a representative volatile fraction, so the gas sensor has been operated off-line through periodic sampling of beer at the output of the filtration unit. Figure 1 shows the typical behaviour of the transmembrane pressure versus time and the onset of the first cleaning procedure (backwash); the dots indicate the approximate sampling points. Samples taken at specified times were measured with the gas sensors, whose features are described in deliverable 5.2. Figure 2 shows the first principal component (PC1) calculated from the signals of the eight gas sensors, plotted versus the sampling time. This behaviour is representative of the on-going filtration process and shows the occurrence of membrane saturation when PC1, and hence the sensor signals, saturate.
CTech explored in the project the possibility of measuring the progress of beer filtration by impedance spectroscopy. For this purpose, an interdigitated pair of electrodes is placed in the liquid and the impedance versus frequency is measured with a network analyser; in fact, the network analyser directly measures the complex reflection coefficient. In this case the magnitude of the reflection coefficient has been found to be meaningful for describing the properties of the beer samples: its spectral magnitude provides a fingerprint that can be related to the composition of the beer. Figure 3 shows an example of such a fingerprint. The reflection-coefficient method has proven very capable of detecting small perturbations from the initial beer composition and of distinguishing between changes in beer composition, beer temperature and beer aeration.
Figure 1: Qualitative behaviour of the transmembrane pressure (TMP) versus time.
Figure 2: Behaviour of the first principal component plotted versus the filtration time.
Figure 3: Spectral fingerprint of a beer with a high content of particulate.
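The PC1 trajectory of Figure 2 is obtained by standard principal component analysis of the eight sensor signals; a minimal sketch of that computation is given below on synthetic data (the sensor responses and their saturating trend are invented).

import numpy as np

# PCA sketch: project eight gas-sensor signals onto the first principal component (PC1)
# and follow PC1 versus sampling time. The sensor data below are synthetic.
rng = np.random.default_rng(0)
n_samples, n_sensors = 25, 8
t = np.linspace(0.0, 10.0, n_samples)                  # sampling times [h]
trend = 1.0 - np.exp(-t / 3.0)                         # saturating filtration trend (invented)
X = trend[:, None] * rng.uniform(0.5, 1.5, n_sensors) + 0.05 * rng.standard_normal((n_samples, n_sensors))

# Centre and scale the data, then take the first right singular vector as the PC1 loading.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1_scores = Xc @ Vt[0]                                # PC1 score for each sample

for ti, score in zip(t, pc1_scores):
    print(f"t = {ti:5.2f} h   PC1 = {score:+.3f}")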
Freeze-drying of lactic acid bacteria
The major results have been obtained with the ultrasound sensors and the electronic nose.
The ultrasound sensor has been developed by Alctra. It is based on an arrangement of piezocomposite emitter and receiver transducers allowing simultaneous transmission and pulse-echo measurement modes. Figure 1 shows the schematic set-up of the sensor system, whose actual appearance is displayed in Figure 2. The transmission mode is provided by a pair of emitter and receiver transducers, while the echo mode is provided by a single emitter/receiver unit. In both cases the propagation time of an ultrasound pulse is measured; pulses at 1.25 MHz have been used. The use of the two measurement modes provides a distinct advantage for monitoring the freeze-drying process and the vertical stratification occurring in the bacterial mass. This is visible in Figure 3, where the signals of both measurements are shown versus time together with the temperature of the sample. The propagation time recorded for the transmission mode (wave travelling parallel to the sample) shows an abrupt change at a temperature of -10.6°C. On the other hand, the echo propagation time undergoes a smoother transition, indicating the occurrence of the stratification process, which proceeds until the temperature reaches -14.8°C. This arrangement thus provides a thorough characterization of the freeze-drying process.
Figure 1. Schematic arrangement of the ultrasound sensors for freeze-drying monitoring.
Figure 2. Picture of the sensor cell with the emission/receiver transducers.
Figure 3. Transmission (E/R) and echo mode signals and temperature during one freeze-drying process.
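The propagation times in both modes are typically obtained by locating the received pulse relative to the emitted one; a simple cross-correlation estimate on synthetic pulses is sketched below (the pulse shape, sampling rate and true delay are illustrative only, not the sensor's actual processing).

import numpy as np

# Time-of-flight estimation sketch by cross-correlation of emitted and received pulses.
# Pulse shape, sampling rate and true delay are invented for illustration.
fs = 25e6                                   # sampling frequency [Hz]
f0 = 1.25e6                                 # ultrasound pulse frequency [Hz]
t = np.arange(0, 40e-6, 1 / fs)             # 40 microsecond acquisition window

def pulse(time, delay):
    """Gaussian-windowed tone burst centred 'delay' seconds after the start of emission."""
    return np.exp(-((time - delay) / 2e-6) ** 2) * np.sin(2 * np.pi * f0 * (time - delay))

emitted = pulse(t, 2e-6)
true_delay = 13.4e-6                        # hypothetical propagation time
received = pulse(t, 2e-6 + true_delay) + 0.02 * np.random.default_rng(1).standard_normal(t.size)

# The lag that maximizes the cross-correlation estimates the propagation time.
corr = np.correlate(received, emitted, mode="full")
lag = np.argmax(corr) - (emitted.size - 1)
print(f"estimated propagation time: {lag / fs * 1e6:.2f} us (true {true_delay * 1e6:.2f} us)")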
The other developed sensor is the array of gas sensors (electronic nose) developed by UTOV. In this case, the gas sensor array has been used to estimate the quality of the finished product. The quality of freeze-dried bacteria is evaluated in a series of destructive tests aimed at measuring the residual moisture, the cell viability and the acidification activity. This last quantity is estimated with a method called CINAC, in which constant conditions are kept and the time necessary to reach the peak of the derivative of the pH is measured: the shorter this time, the higher the acidification activity. The electronic nose was applied to measure, in a non-destructive way, the headspace of the dried bacteria. Figure 4 shows the experimental arrangement; measurements have been done at room temperature. The results suggest that an electronic nose like the one developed in the CAFÉ project can be fruitfully utilized for a non-destructive inspection of the quality of freeze-dried bacteria.
Figure 4: Experimental setup for the freeze-dried bacteria quality test. Bacteria are kept under the sampler at the right. The electronic unit in the background captures the headspace and delivers it to the internal sensor array.
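The CINAC-type acidification indicator described above (the time needed to reach the peak of the pH derivative) can be computed along the lines of the sketch below; the pH trace is synthetic and only illustrates the indicator, not the actual CINAC protocol.

import numpy as np

# Sketch of a CINAC-like acidification indicator: time to the peak of |dpH/dt|.
# The pH(t) trace below is synthetic; real data would come from the acidification test.
t = np.linspace(0.0, 10.0, 601)                        # time [h]
pH = 6.5 - 2.0 / (1.0 + np.exp(-(t - 4.0) / 0.6))      # sigmoidal pH drop (invented)

dpH_dt = np.gradient(pH, t)                            # numerical derivative of the pH trace
t_peak = t[np.argmax(np.abs(dpH_dt))]                  # time of maximum acidification rate
print(f"time to maximum acidification rate: {t_peak:.2f} h "
      "(the shorter this time, the higher the acidification activity)")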
Ice cream crystallization
The sensors developed for the ice cream case study are a refractometer for sugar quantification and a gas sensor array (electronic nose) for the evaluation of global properties of the ice cream.
The refractometer developed by ALCTRA is a total-internal-reflection arrangement in which the change of the refractive index of the measured sample modulates the amount of light internally reflected in a sapphire glass. Figure 6 shows the principle of the device. The probe is a hemispherical lens that directs the light emitted by an IR LED to a phototransistor; the light is reflected at the interface between the sapphire and the sample. Any change in the refractive index of the sample is then detected as a change in the amount of back-reflected light. Infrared light is used because the refractive index in the infrared is known to be particularly sensitive to the sugar content; the amount of light lost at the sapphire/sample interface is therefore also sensitive to the sugar concentration. In the adopted arrangement, the light is refracted twice, giving rise to an increase in sensitivity towards changes of refractive index. Figure 7 shows the device placed in-line at the output of the ice-cream machine, at the pilot plant at IRSTEA. The device is complemented by a temperature sensor, necessary to compensate for the changes in refractive index due to temperature.
Figure 6. Drawing of the measurement principle of the ALCTRA refractometer
Figure 7. Measurement cell implementing the refractometer and a temperature sensor.
Figure 8. Sugar content (in °Bx) and temperature behaviour during the sorbet processing.
In Figure 8, the behaviour of the temperature and of the estimated sugar content (given in degrees Brix) is plotted. Since the concentration of sugar also depends on the segregation of water into ice crystals, there is a correlation between the sugar concentration and the ice mass fraction. In order to measure the properties of the ice cream, the gas sensor array has been connected on-line at the output of the ice-cream machine.
WP6 : Process design and optimization
The main results within WP6, summarized below, relate to the use of efficient optimization methods, on the one hand to produce a reliable representation of the plant (a model) consistent with the available data and, on the other hand, to devise optimal modes of plant operation. This essentially leads to the following optimization paradigms:
• Optimization for understanding the process behaviour, which translates into the combination of measurements with process experiments to identify and calibrate mathematical models that are subsequently employed in process optimization.
• Optimization for process control, meaning either the off-line computation (design) of optimal operating conditions to be communicated to controllers, or real-time optimization during plant operation.
Microfiltration of food beverages as Separation process
For a given amount of beer to be filtered, operation policies were designed to minimize pumping energy together with the number of, and costs associated with, membrane cleanings. Given this objective, off-line and on-line optimal control policies have been computed and validated, resulting in cost reductions of about 12%. Optimal plant operation involves decisions at different levels; in particular, the following values have to be chosen: the number of chemical cleanings (CIPs), the number of back-flushes per CIP, the values of the cross-flow and permeate-flow set points (QF, QP) and the maximum trans-membrane pressure (TMPmax) between back-flushes that allow processing the specified amount of beer by the required final time.
On-line model parameter estimation
Some experiments have been performed in the beer filtration pilot plant in order to identify the fouling formation dynamics and the associated filtering properties. Adjustment of the reduced-order (operational) model is done via an on-line (recursive) least-squares parameter estimation. The experimental data include permeate and cross flows, outlet pressure and transmembrane pressure. Parameters were selected by sensitivity analysis and comprise the initial membrane resistance Rk, the feed dynamic viscosity η, the fraction of membrane aggregates β, the critical distance parameter Qcr and the backflush cleaning efficiency cBF. Initial and final values of the parameters adjusted in the reduced model are given in Table 1.
Parameter    Rk        η                β      Qcr          cBF
Initial      1.16667   1.16667x10^-11   0.4    2.1x10^-7    0.5
After PE     0.13763   1.3x10^-11       0.49   2.65x10^-7   0.4
Table 1: Summary of the parameter estimation results. PE refers to Parameter Estimation.
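A minimal sketch of the on-line (recursive) least-squares update used to adapt model parameters from incoming data is given below; the regression model, data and forgetting factor are illustrative and are not the project's actual operational model.

import numpy as np

# Recursive least-squares (RLS) sketch for on-line parameter adaptation.
# Illustrative linear-in-the-parameters regression y = phi . theta; the true
# parameters, regressors and noise level are invented.
rng = np.random.default_rng(2)
theta_true = np.array([0.5, -1.2, 2.0])
n_par = theta_true.size

theta = np.zeros(n_par)                 # initial parameter estimate
P = 1e3 * np.eye(n_par)                 # initial covariance (large = little prior knowledge)
lam = 0.98                              # forgetting factor (< 1 discounts old data)

for k in range(500):
    phi = rng.standard_normal(n_par)                     # regressor at time k
    y = phi @ theta_true + 0.05 * rng.standard_normal()  # noisy measurement
    # Standard RLS update
    K = P @ phi / (lam + phi @ P @ phi)                  # gain vector
    theta = theta + K * (y - phi @ theta)                # parameter correction
    P = (P - np.outer(K, phi @ P)) / lam                 # covariance update

print("estimated parameters:", np.round(theta, 3), " true:", theta_true)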
Computation of optimal operation policies
The approach selected for the beer microfiltration merges economic optimization and control and makes use of particular parameterizations to solve the problem through a small number of NLP problems in a single layer. This provides an efficient way of solving the problem and shows a way of dealing with mixed-integer dynamic optimization problems. The operational costs to be minimized include mechanical energy and the costs associated with backflushes and CIP. Results are presented in deliverable 8.4 for two scenarios: one involving constant flows over a whole CIP cycle, and the other with step-wise variable flows (constant within each single BF). In general it can be concluded that (global) stochastic algorithms performed much better than deterministic ones (i.e. local, SQP-based). For constant flows, costs were reduced by about two thirds when using 6 instead of 5 filtration periods. As should be expected, better solutions were obtained when allowing variable flows. In both cases, the costs obtained were significantly smaller than those associated with the standard operation involving 7 filtration periods and constant flows (QF = 10 l/h, QP = 0.26 l/h).
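The use of a stochastic global solver rather than a local SQP-type method can be illustrated, in a highly simplified way, by minimizing a small surrogate cost over the flow set-points; the cost function, penalty terms and bounds below are invented and do not correspond to the project's MINLP formulation.

import numpy as np
from scipy.optimize import differential_evolution

# Highly simplified surrogate of an operational-cost minimization over flow set-points.
# The cost model is invented; it only illustrates using a stochastic global solver
# (differential evolution) instead of a local SQP-type method.
def operating_cost(x):
    qf, qp = x                                   # cross-flow and permeate-flow set points
    pumping = 0.8 * qf ** 2 + 0.2 * qp ** 2      # mock pumping-energy term
    fouling = 5.0 / (qp + 0.05)                  # mock fouling/cleaning penalty
    ripple = 0.3 * np.sin(15 * qf) ** 2          # adds local minima
    return pumping + fouling + ripple

bounds = [(1.0, 20.0),    # QF [l/h]
          (0.05, 0.5)]    # QP [l/h]
result = differential_evolution(operating_cost, bounds, seed=1, tol=1e-8)
print("optimal set points (QF, QP):", np.round(result.x, 3), " cost:", round(result.fun, 3))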
Case study 3: Freeze-drying of lactic acid bacteria as Preservation process
Freeze-drying operation makes use of the shelf temperature and the chamber pressure as the variables that control mass transfer in the product and thus the time needed to attain a given dehydration level. In minimizing this process time, the product temperature should not exceed the glass transition temperature too often, in order to avoid collapse of the product structure; this must be considered a quality objective. For this case study, and in order to have a reliable model representative of the process, the model parameters were first identified. The model has then been combined with optimal control methods to produce off-line as well as on-line operation policies that minimize process time while ensuring maximum quality.
Model parameter estimation
A detailed description of the model employed for computing optimal operation policies can be found in Deliverables 4.1 and 4.2. The objective is to compute the values of the model parameters that best fit the model simulations to the experimental data. Critical parameters included those related to mass and heat transfer resistance. In particular, the selected parameters were: two parameters (k1 and k2) included in the mass transfer resistance variable, two parameters (hL,1 and hL,2) included in the convective heat transfer variable, the dried-region thermal conductivity kD, the geometrical correction factor at the product bottom fb and the mass transfer resistance between the condenser and the chamber kv. Six experiments were performed in which two states were measured and employed as observables: the temperature at the bottom of the product and the vapour pressure in the chamber. Two control variables were employed to define the different experiments: shelf temperature and chamber pressure. The results of the parameter estimation are summarized in Table 2, comparing initial parameter values collected from the literature with those obtained from the estimation. In order to illustrate the predictive capability of the model, results are compared with experimental data in Figure 9: Figure 9(a) shows the improvement after parameter calibration, and Figure 9(b) compares model results with experiments other than those employed for model calibration, which suggests good model predictive capabilities.
Parameters: kD, k1, k2, hL,1, hL,2, kv, fb
Initial values (partial): 10.1, 0.4, 0.99
Values after PE (partial): 1, 0.1, 0.92
Table 2: Estimated parameters as compared with previous values obtained from the literature. PE refers to Parameter Estimation.
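Conceptually, the calibration amounts to a nonlinear least-squares fit of the simulated observables to the measured ones; a toy example of this kind of fit, on an invented one-output model with two parameters, is sketched below (it is not the freeze-drying model).

import numpy as np
from scipy.optimize import least_squares

# Toy parameter-estimation sketch: fit two parameters of a small dynamic response so that
# its simulated output matches "experimental" data. Model and data are invented.
def simulate(params, t):
    k, h = params
    # Invented first-order response standing in for a simulated observable
    return h * (1.0 - np.exp(-k * t))

t_data = np.linspace(0.0, 10.0, 30)
true_params = np.array([0.6, 35.0])
rng = np.random.default_rng(3)
y_data = simulate(true_params, t_data) + 0.5 * rng.standard_normal(t_data.size)

def residuals(params):
    return simulate(params, t_data) - y_data

fit = least_squares(residuals, x0=[0.1, 10.0], bounds=([0.0, 0.0], [5.0, 100.0]))
print("estimated parameters:", np.round(fit.x, 3), " true:", true_params)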
Computation of optimal operation policies
Off-line as well as on-line operation policies have been computed and tested both in simulation and at the pilot-plant level. Comparisons between standard and optimal operation have been established, resulting in process-time reductions of about one fourth with respect to standard operation while ensuring quality. The dynamic models employed were physically and mathematically reduced versions of a multi-scale model for mass and energy transfer in the product.
Figure 9: Predictive capabilities of the freeze-drying model. (a) Effect of parameters of the dynamics. (b) Validation of the model and experiments
Typical constraints for the optimization problem in this case study are the final water content and the product temperature. The water content is directly related to the product quality and must be below a given bound at the end of the process. The product temperature, on the other hand, should not exceed the collapse temperature, in order to ensure product integrity. This last constraint can be relaxed to increase productivity by means of an integral constraint of the form:
(1)
The maximum difference between product and collapse temperature has also been included as a constraint. The corresponding bounds in (1) and in this maximum-difference constraint have been obtained for each experiment carried out in the pilot plant. The differences between collapse and product temperature are lower than in the standard case. The control profiles (shelf temperature), as well as the glass transition and product temperatures, for the standard and optimal cycles are presented in Figure 10.
Figure 10: (Shelf, product and collapse) temperature profiles for (a) the standard and (b) the optimal operation policies for the second experiment.
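Since the exact expression (1) is not reproduced in this summary, the sketch below only illustrates one plausible integral form of such a relaxed constraint, namely the time integral of the excess of product temperature over collapse temperature together with the maximum difference; both profiles and the form of the constraint are assumptions, not the project's exact formulation.

import numpy as np

# Illustration (not the project's exact constraint (1)): an integral measure of how much
# the product temperature T_p exceeds the collapse temperature T_c over the cycle.
t = np.linspace(0.0, 20.0, 400)                       # time [h]
T_c = -32.0 + 0.1 * t                                 # invented collapse-temperature profile [degC]
T_p = -34.0 + 0.25 * t                                # invented product-temperature profile [degC]

excess = np.maximum(T_p - T_c, 0.0)                   # temperature excess above collapse
integral_violation = np.trapz(excess, t)              # assumed integral constraint value
max_difference = np.max(T_p - T_c)                    # maximum-difference constraint value
print(f"integral excess: {integral_violation:.2f} degC*h, max difference: {max_difference:.2f} degC")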
Case study 4: Ice cream crystallization as Structuring process
Crystallization is a continuous process, with some of the operating conditions fixed as constraints by the user. The energy cost has been characterized experimentally as a function of the operating conditions, and it has been shown to be a monotonic function of the control variables. A dynamic optimal operation policy is therefore not relevant in this case; however, for the purpose of process design and scheduling, a crystallization model proved to be critical.
System identification and experimental design
Parameters were identified for a reduced version of a population-balance-based model. Based on a set of integro-differential equations describing the different moment orders, the model relates inputs such as scraper speed, mass flow rate and evaporator temperature to outputs such as temperature along the freezer, crystal size and viscosity (a detailed description of the model employed can be found in deliverables 4.1 and 4.2). Experiments have been performed at IRSTEA using a factorial plan (D-optimum) considering the above-mentioned control variables. Three different types of measurements were employed to compare model predictions and experimental data: the temperature of the mixture at 3 locations in the crystallizer, the mean crystal chord length at the outlet and the viscosity at the outlet (6 values). The parameters to be estimated were: the wall heat transfer coefficient (he), the growth and nucleation parameters (β and α), the viscous dissipation parameter (χ), the initial crystal size (Lc) and the sorbet viscosity parameter (ξ). Table 3 summarizes the parameter estimation results.
Parameters: he, α, β, χ, Lc, ξ
Upper bounds (partial): 40, 10
Lower bounds (partial): 0, 0, 0
Estimated values (partial): 3.85
Table 3: Results of the parameter estimation procedure for the crystallization case study.
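For illustration, the D-optimality criterion used to rank candidate experiments can be stated as maximizing the determinant of the Fisher information matrix built from output sensitivities; the sketch below applies this idea to an invented two-parameter model and a small set of candidate conditions, not to the crystallizer model itself.

import numpy as np
from itertools import combinations

# D-optimal experiment selection sketch on an invented two-parameter model:
# choose the subset of candidate experiments that maximizes det(F), where F is the
# Fisher information matrix assembled from parameter sensitivities of the output.
def output(u, theta):
    # Invented scalar response to an experimental condition u
    return theta[0] * u + theta[1] * np.sqrt(u)

def sensitivities(u, theta, eps=1e-6):
    base = output(u, theta)
    return np.array([(output(u, theta + eps * e) - base) / eps for e in np.eye(len(theta))])

theta_nominal = np.array([1.0, 2.0])
candidates = np.linspace(0.2, 5.0, 10)      # candidate experimental conditions

best_det, best_set = -np.inf, None
for subset in combinations(range(len(candidates)), 3):        # pick 3 experiments
    S = np.array([sensitivities(candidates[i], theta_nominal) for i in subset])
    F = S.T @ S                                                # Fisher information (unit noise)
    d = np.linalg.det(F)
    if d > best_det:
        best_det, best_set = d, subset

print("most informative conditions:", np.round(candidates[list(best_set)], 2), " det(F):", round(best_det, 3))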
WP7 : Process monitoring and control
Wine-making as Bioconversion process
The basis for the control design was proposed within the first years of the project and has been reported in the related reports and deliverables. However, extensive simulations pointed out a number of drawbacks related to the influence of several uncertainties, notably with respect to the available on-line measurements. The improvements described here concern the estimation of the different substrate concentrations needed to apply the control law to the process. First, recall that the process is composed of 4 interconnected chemostats in which the output of one reactor is the input of the next one. An adaptive linearizing control law is applied in order to force an input/output linear behaviour of Si, for given values of λi. The terms k2μ2 are measured on-line (these are the biogas measurements). Since the set-points are also known, the only unknowns are the glucose concentrations Si. It was planned to estimate these unmeasured states using an approach based on the minimization of a sum of mean square errors; however this approach suffers from a lack of robustness with respect to measurement errors. Thus, an approach based on interval observers is expected to give better results. This approach takes advantage of the fact that the biogas is measured on-line in each reactor, so that the dynamics of the substrate in each reactor is a decoupled first-order equation. Intervals for the unmeasured variables Si can then easily be reconstructed, taking into account uncertainties on both the available measurements and the estimates of Si that are used as inputs of the subsequent reactors.
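Since the explicit equations of the control law did not survive in this summary, the following reconstruction is only a plausible form, assuming a standard chemostat mass balance for the sugar concentration Si in reactor i, with dilution rate Di and measured gas-related consumption term k2 μ2,i:

    dSi/dt = Di (S(i-1) - Si) - k2 μ2,i          (assumed substrate balance)

Imposing the input/output linear behaviour dSi/dt = -λi (Si - Si*) then gives the linearizing law

    Di = [ k2 μ2,i - λi (Si - Si*) ] / (S(i-1) - Si),

in which Si is replaced by the bounds provided by the interval observer.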
Figure 11. Experimental results obtained by application of the linearizing control law strategy. Top: CO2 production rates. Middle: dilution rates (i.e. control inputs). Bottom: sugar concentration bounds, estimates and set-points.
The control strategies developed in WP7 were tested on the continuous multi-stage reactor, which was previously shown to be representative of the traditional batch fermentation plant. The control objective has been expressed in terms of time minimization: the goal was to make the system go as fast as possible from one set-point to another. The experimental setup is composed of four tanks (reactors) connected in series. The control inputs are the flow rates Qa1, Qa2, Qa3 and Qa4 of the four reactors, with the physical constraints Qai ≥ Qa(i-1) ≥ 0. The only available measurements are the CO2 production rates (one per reactor at each measurement time). As the measurement data are noisy, a filtering step is performed so that a new value of the CO2 production rate is available every 20 minutes for each reactor. Figure 11 presents the most relevant results obtained. Details about these experiments can be found in deliverable D3.3.
Microfiltration of food beverages
Due to the large state dimension of the model (160), the sophisticated implicit numerical integration scheme and the need for global optimization due to local minima, the computation of optimal controls is difficult and time consuming. In addition, long-term effects require the computation of optimal controls over all phases (F) and cycles (FC & CC), not just a single phase (F). Finally, due to the uncertainty of certain model parameters and states, optimal control computations must be repeated after certain time periods to incorporate improved parameter and state estimates. Feedback is needed to limit the performance degradation due to uncertainty. In principle the number of backflushes or filtration cycles within a CIP cycle is variable, and the same applies to the number of CIP cycles needed to filter a desired amount of beer. As a result, the optimal control computation becomes a mixed-integer non-linear programming problem (MINLP), which belongs to the most difficult and time-consuming class of problems in mathematics. Therefore a dedicated problem reformulation was developed for on-line control.
Currently the only on-line measurements available for control are those of the trans-membrane pressure (TMP). On-line measurements may be used to realize feedback control through the on-line computation of control corrections that limit performance degradation due to model and other types of uncertainty. Common practice is to estimate on-line the system state and several critical parameters if these are not known precisely; in the latter case the controller is called adaptive. During each back flush, we first estimate some critical, uncertain system parameters using the TMP measurements from the previous phase. Second, the model with the adapted parameter values is used to estimate the initial state and compute a new optimal control policy for the next filtration phase. The optimal control is fed to the BMF during the next filtration phase, thereby realizing feedback. In this way a computationally feasible, adaptive sub-optimal feedback control approach is realized.
After the simulations performed as expected, we applied our adaptive sub-optimal controller to the BMF in the experimental setup. These experiments were done at a very late stage of the project and were therefore limited in number. Initially they also suffered from some programming errors; after removing those, it still turned out to be hard to obtain a good model fit, as opposed to the fits shown in Figure 12. Remarkably, the optimal control computed from the model nevertheless performed better than the standard one: the costs in the former case were 2.56 Euro/m2 whereas in the latter they were 6.8 Euro/m2.
Fig. 12: Intermediate results of the adaptive sub-optimal control system.
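The adaptive sub-optimal feedback scheme described above follows a repeat-at-each-backflush pattern; a schematic sketch of that loop, with placeholder estimation and optimization routines, is given below (the fouling model, cost function and numbers are all hypothetical and are not the project's BMF model).

import numpy as np
from scipy.optimize import minimize

# Schematic of the adaptive sub-optimal feedback loop for membrane filtration:
# at each backflush, (1) re-estimate uncertain parameters from the trans-membrane
# pressure (TMP) record of the previous phase, (2) re-estimate the state, and
# (3) recompute the control for the next filtration phase. Everything below is a placeholder.
def tmp_model(flux, resistance):
    return resistance * flux                              # mock TMP = R * J

def estimate_resistance(tmp_record, flux_record):
    # Least-squares fit of the single uncertain parameter from the previous phase
    return float(np.sum(tmp_record * flux_record) / np.sum(flux_record ** 2))

def optimal_flux(resistance):
    # Recompute a sub-optimal constant flux for the next phase: trade pumping
    # energy (grows with TMP * J) against lost production (penalty for low flux)
    cost = lambda j: tmp_model(j[0], resistance) * j[0] + 50.0 / j[0]
    return float(minimize(cost, x0=[1.0], bounds=[(0.1, 5.0)]).x[0])

rng = np.random.default_rng(4)
true_resistance, flux = 3.0, 1.0
for phase in range(5):
    flux_record = np.full(20, flux)
    tmp_record = tmp_model(flux_record, true_resistance) + 0.05 * rng.standard_normal(20)
    r_hat = estimate_resistance(tmp_record, flux_record)   # step (1)
    flux = optimal_flux(r_hat)                              # steps (2)-(3), state estimation omitted
    true_resistance *= 1.15                                 # fouling slowly increases resistance
    print(f"phase {phase}: estimated R = {r_hat:.2f}, next-phase flux = {flux:.3f}")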
Case study 3: Freeze-drying of lactic acid bacteria as Preservation process
The control objective combines operational and quality objectives and is formulated as minimizing the cycle time while maintaining the product temperature below the glass transition temperature. To that purpose, two input variables are considered: the condenser temperature and the shelf temperature. Measurements (output variables for feedback control) include the chamber temperature and pressure, while the process states are the temperature distribution within the product. One particular state is the temperature of the moving front, which in the present set-up must be estimated or inferred from the available measurements. The proposed control configuration is presented in block-diagram form in Figures 13 and 14 below. It consists of a two-level integrated structure that includes:
1. a supervisory level, responsible for computing/recomputing optimal control profiles in the event of deviations from quality (optimal profiles for chamber temperature and pressure, as well as front temperature);
2. a tracking/regulatory level to enforce the follow-up of the optimal profiles by acting on the condenser temperature and the shelf temperature.
Figure 13. Supervisory Control Structure (RTO loop)
Design of the supervisory level has been completed and its main elements have been validated. These comprise:
- operational models to describe input-output variables
- a model of temperature distribution in the product as a function of chamber temperature and pressure
- a dynamic optimization solver, suitable for RTO and predictive control
The proposed structure has been tested at the pilot plant level (see Figure 15 below).
Figure 14. Robust Control Structure
Figure 15. Temperature control in the freeze-drying case study
Ice cream crystallization
In this case study, the control problem of the ice fraction is studied. The control strategy we propose is based on two control loops: a first loop to control the viscosity μ with the evaporation temperature Te, and a second loop to control Te (and thus μ) with the compressor rotation speed Vcomp. The control laws considered are linearizing control laws. Some approximations are made so that the control laws depend only on the available measurements. The proposed control scheme is given in Figure 16.
Figure 16. Control scheme for the control of the viscosity of the ice cream at the outlet of the freezer. The quantities Tsat, Vcomp and x are the respective saturation temperature, compressor rotation speed and state of the system; μ is the viscosity set-point, Tmsat is the delayed saturation temperature measurement, Tem is the evaporation temperature measurement and θ is the parameters vector. The circumflex accent is used for the estimates of the unknown quantities.
The control law which has been designed makes it possible to compute, at each time instant, a value of the control input Vcomp. This value depends on the difference between the estimate of the viscosity and the set-point; it also depends on the estimates of the state and on the measurement of the evaporation temperature. This control strategy is in fact a cascade control strategy with two control loops:
• a primary loop to control the viscosity μ with the evaporation temperature Te;
• a secondary loop to control the evaporation temperature Te with the compressor rotation speed Vcomp.
Figure 17. Experimental results obtained by application of the control law on the crystallization process. Top: ice temperature T. Middle: evaporation temperature Te. Bottom: compressor rotation speed Vcomp.
The control loop can be described in the following way (see Figure 16):
• first, the temperature of the ice cream is measured. This measurement is not made directly at the outlet of the freezer but a bit further downstream: there is therefore a measurement delay which has to be taken into account in the control scheme;
• the viscosity is then estimated at the current time by an observer, from the delayed temperature measurement. The observer used is a Smith predictor, which compensates for the delay (a minimal sketch of such a delay-compensating loop is given after this list);
• the estimate of the viscosity is then used in the control law designed to make the viscosity of the ice cream reach a given set-point;
• the parameters used in the Smith predictor are adjusted on-line, to improve the estimation and ensure that the viscosity estimate converges to the real value of the viscosity.
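As a minimal illustration of the delay compensation, the sketch below wraps a PI controller around an undelayed internal model with a Smith predictor; the first-order plant, the delay and the tuning are invented and are much simpler than the freezer model.

import numpy as np

# Smith-predictor sketch: a PI controller acts on an undelayed internal model, and the
# mismatch between the delayed model output and the delayed measurement is fed back.
# The first-order plant, the delay and the tuning below are invented.
dt, t_end = 0.1, 60.0
delay_steps = int(5.0 / dt)              # 5 s measurement delay
tau, gain = 8.0, 1.5                     # first-order plant: tau*dy/dt = -y + gain*u
kp, ki = 1.2, 0.15
setpoint = 2.0

y = y_model = 0.0
integral = 0.0
model_buffer = [0.0] * delay_steps       # model outputs awaiting the same delay
meas_buffer = [0.0] * delay_steps        # plant outputs awaiting measurement

for k in range(int(t_end / dt)):
    y_meas = meas_buffer.pop(0)                        # delayed measurement of the plant
    y_model_delayed = model_buffer.pop(0)              # equally delayed model output
    # Smith predictor: undelayed model output + (delayed measurement - delayed model output)
    y_pred = y_model + (y_meas - y_model_delayed)
    error = setpoint - y_pred
    integral += error * dt
    u = kp * error + ki * integral                     # PI control on the predicted output
    # Integrate the internal model and the (here identical) true plant
    y_model += dt / tau * (-y_model + gain * u)
    y += dt / tau * (-y + gain * u)
    model_buffer.append(y_model)
    meas_buffer.append(y)

print(f"output after {t_end:.0f} s: {y:.3f} (set-point {setpoint})")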
Many experiments have been carried out to test and validate the control law on the pilot plant. We first validated the control law without perturbations; some results are given in deliverable D3.3. We then tested the control law in the presence of perturbations. Figure 17 shows the results of the experiments performed on the day of the demonstration, during which several disturbances were considered.
WP8 : Integration
The tasks within WP8 can be divided into two main classes of activity. The first class is the collection and evaluation of the progress and achievements of the work packages and case studies in the project.
The second class of tasks relevant to WP8 is the development and validation of the integrated control systems. For these tasks the WP8 leader, SPES, developed and validated an integrated control system based on a new paradigm for computer-aided control technology in food. A major achievement of WP8 has been the development of an integrated system that includes the results from WP2 to WP7 in a unified technological framework featuring the latest trends in the ICT context. The design of a database-centric infrastructure realized a unified and open system for the integration of the results from the four case studies while combining the achievements of WP2 and WP8. The link between a Scientific database (for knowledge-based reasoning) and a Local database (for communications and control) is the key to the integration activity. The project is based on the idea that each plant is connected to a Remote Server Machine (or possibly a clustered set of them), located at one or more remote control centres, in order to upload measured data and alarms and download control actions. This Remote Server is essentially a supervisor system which implements a distributed control architecture. New DBMS (Database Management System) technology allows organizing a completely different control structure, driven by data and event acquisition from the physical processes: the change or acquisition of a value or event can trigger all the actions in the control infrastructure, from both the hardware and the software points of view. This kind of approach had not been considered in electrical and computer engineering in the past because, due to the technological limits of operating systems, hardware resources and software paradigms, these were not ready nor suitable for such a ground-breaking, counterintuitive approach.
The algorithms which constitute generic software modules, controllers (in particular regulators), optimizers and generic logical units are deployable both on remote sites and on plant local sites. Local plant algorithms are enforced by the embedded electronics infrastructure. A relevant effort has been the inclusion of the Embedded Matlab framework on the embedded electronics. The organization is hierarchical: on the upper side there is the highest computing power and the highest semantic and data abstraction. This is the distributed computing and storage layer, typically comprising the usual personal computers used both for access and for computing, and possibly dedicated machines for high-load scientific computing; it will be called the Global layer. On the lowest level (but not exclusively) there are the regulatory layer for autonomous control and the contact layer with sensors and actuators, by means of low-level communication protocols. The middle and bottom layers are enforced by embedded microelectronics.
A fundamental feature, paradigm and tenet of the control system is: every controlling software is a plug-in. Plug-ins are the main mechanism that makes the system able to implement very complex control policies, as required for food processes, at a very low cost and with a very easy and unified procedure. Every algorithm (in particular Matlab algorithms) developed within the project can be installed as a plug-in on the system. Every plug-in can be immediately put in communication with the others, allowing a vast class of control topologies and hierarchies. A remote upload server (on the highest layer of the infrastructure) is accessed for the deployment of plug-ins on the system: a web page enables the enrolment of a control, optimizer, model or general logic software module on the CAFE infrastructure. The plug-in concept creates an abstraction layer that enables scientists to operate on the control framework without concern for low-level communications. The plug-in architecture can intrinsically implement an evolutionary distributed control system, through the capability of generating descendant plug-ins with dynamically computed parameters or configurations. Database records trace the entire system evolution, as every plug-in leaves its footprint on the data model. Local plug-ins can be run under real-time constraints; Global plug-ins are to be considered on-line but not strictly real-time. A key element for integration, both for Global and Local plug-ins, is the integration text file.
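To make the database-centric plug-in idea concrete, the following sketch shows one possible shape of such a plug-in interface, where an algorithm reads its input signals from, and writes its outputs to, a shared database table; the class names, table layout and controller are invented and are not the project's actual integration format.

import sqlite3

# Hypothetical sketch of the database-centric plug-in idea: each plug-in reads its
# input signals from a shared table, computes, and writes its outputs back, so that
# plug-ins communicate only through the database. Schema and names are invented.
class Plugin:
    def __init__(self, name, inputs, outputs):
        self.name, self.inputs, self.outputs = name, inputs, outputs

    def step(self, values):
        raise NotImplementedError

class ProportionalController(Plugin):
    def __init__(self, name, setpoint, kp):
        super().__init__(name, inputs=["temperature"], outputs=["valve_opening"])
        self.setpoint, self.kp = setpoint, kp

    def step(self, values):
        # Open the (hypothetical) cooling valve proportionally to the temperature excess
        return {"valve_opening": self.kp * (values["temperature"] - self.setpoint)}

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE signals (tag TEXT PRIMARY KEY, value REAL)")
db.execute("INSERT INTO signals VALUES ('temperature', 21.5), ('valve_opening', 0.0)")

plugin = ProportionalController("demo_controller", setpoint=18.0, kp=0.5)
inputs = {tag: db.execute("SELECT value FROM signals WHERE tag=?", (tag,)).fetchone()[0]
          for tag in plugin.inputs}
for tag, value in plugin.step(inputs).items():
    db.execute("UPDATE signals SET value=? WHERE tag=?", (value, tag))
print(dict(db.execute("SELECT tag, value FROM signals")))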
The infrastructure layers can be multiple, allowing great scalability of the infrastructure and an optimal configuration that keeps the ratio between computing power and cost to a minimum; this also increases the overall sustainability of the process operation. The flexibility in the configuration of components and software supports optimal management of food processes. HMI (Human Machine Interface) Web interfaces for control of the system are available from the highest level down to the lowest: every device features a server for Web-based HMI applications. The solution is oriented to new trends in computing and information accessibility: the plant operator is able to control and operate the plant remotely and pervasively, being always connected.
Wine-making
Case study 1 contributed to WP8.1 by identifying the flavour-marker rate versus the ethanol rate and the metabolic modelling, along with timings for control towards optimal states of the process. Contributions to model paradigms have been provided for task WP8.2. It was found that the modelling procedure induces a relevant description of the studied phenomena and a suitable formulation for control; a simple mass-balance formulation is therefore preferred, along with a description of the main kinetics involving the growth of biomass on nitrogen and the production of ethanol and carbon dioxide resulting from sugar consumption. This mathematical model can be extended to the description of the considered flavour-active compounds. In addition to typical variables such as biomass, sugar and nitrogen, a new variable has been introduced: the transporters. Its introduction allows a more coherent description of the evolution of the fermentation activity as a function of the initial nitrogen concentration. It was maintained that this is the first time that flavour markers can be dynamically described and that the synthesis of the aromatic profile of a wine can be studied and hopefully understood and controlled. The combination of metabolic flux analysis and classical modelling provides great insight into the modelling of bioconversion processes, and this modelling strategy can probably be applied and extended effectively to other categories of processes as well. Contributions on sensor paradigms within WP8.3 for this case study have been obtained mainly through temperature sensors, a flowmeter for injected O2, a flowmeter for produced CO2 and on-line GC. The wine-making process has served as a reference for the cross-evaluation of the optimization methods related to task WP8.4. Though no direct implementation of optimization algorithms was foreseen or possible for this case study, the results on the modelling, control and scale of the process provided much knowledge for the optimization techniques and paradigms to be used in this case study as well. The needs of the wine case were included in the design of the overall optimization framework, with the objective of an immediate extrapolation of the results tested on the other case studies to a possible application to a real wine-making plant.
Case study 1 also contributed to the development of the paradigms on monitoring and control related to tasks WP8.5 and WP8.6. The first paradigm of the work package "Knowledge representation and data management" is the association of data models (relational models, XML Schema) with formalized knowledge models. The second paradigm is to represent knowledge in a generic way, so that models and tools can be easily implemented for the different food processes. This task required the formalization of unified concepts and of the relations between concepts. In particular, the focus has been on the design and implementation of the database structure, the conceptual design graphs and the development of the Information System architecture. Part of the data communications in this distributed architecture is performed using Web Services techniques. From the work and knowledge of the wine case study, the construction of the Scientific Database began; it constitutes the process memory and is used a posteriori for scientific purposes. The scientific database was implemented for the wine process. The database model fitted the needs well, especially for the project and experiment aspects, which appeared highly relevant to the scientists. This first implementation readily demonstrates the possibility of extension to every other case study or process class.
On the control side, the complexity of the modelling has been tackled for the wine case. The time scale of wine production has been assessed: the alcoholic fermentation takes several hours in batch process, and one of the objectives was to reduce this stabilization time. Reduced-order models have been obtained that are composed of 16 Ordinary Differential Equations (ODEs), each of which is highly nonlinear. An observer has been used to estimate the unknown sugar concentration in each reactor. The control strategy which has been implemented is a linearizing feedback law that drives the dynamics exponentially to the target. The control objective has been expressed in terms of time minimization: to make the system go as fast as possible from one set-point to another, an off-line minimal-time feedback control problem has been studied.
Microfiltration of food beverages
Case study 2 contributed to WP8.1 by managing the Total Cost of Ownership/energy consumption for a process plant and by giving indications on filtered beer quality. Contributions to model paradigms have been provided for task WP8.2, in close link with WP4. A genuine modelling paradigm was formulated, which states the eight steps to be taken in the development of models oriented to process control. It is a very generic methodology, and the applicability of the paradigm to the case studies at hand has been summarized in a table. Contributions on sensor paradigms within WP8.3 for this case study have been obtained mainly through the sensing of viscosity, pH, conductivity, VIS-NIR, particle size distribution and turbidity. This case study has also been well suited for experiments with ultrasound, electronic nose and electric impedance techniques. The involvement of this case study in WP8.4 followed from the major achievements in the determination of optimal operation policies for beer microfiltration. Optimal plant operation involves decisions at different levels: the number of chemical cleanings (CIPs), the number of back-flushes per CIP, the values of the cross-flow and permeate-flow set points (QF, QP) and the maximum trans-membrane pressure (TMPmax). The approach selected for the beer microfiltration in the project merges economic optimization and control and makes use of particular parameterizations to solve the problem through a small number of NLP problems. It has been observed that (global) stochastic algorithms performed much better than deterministic ones (based on SQP), leading to the optimal solutions; along with that, an operation profile has been obtained. The contribution of the beer microfiltration case study to task WP8.5 has been used for the refinement of ontology classification techniques. In this case an OpenMI approach has been pursued for monitoring and control, for comparison. It was useful to make the effort to formalize the beer case knowledge and allow automated reasoning to take place. A number of knowledge rules in the beer filtration domain have been refined, showing how the addition of facts to the rule base leads to newly inferred or retracted facts. In order to obtain a basic set of facts and knowledge rules for beer filtration, a number of knowledge sources have been considered: text documents have been used in which expert beer knowledge described a physical model of the fouling behaviour of membranes. This technique has proven powerful for assessing the impact of the raw materials used as input to the beer filtration process. Since the intention is to create a physical model that describes the fouling behaviour of the filtration membrane and to create an automated control system to optimize the beer filtration process, it is useful to learn the impact of new facts: new hypotheses can be formulated to fine-tune the physical fouling model and the automated control system.
Freeze-drying of lactic acid bacteria
Case study 3 contributed to WP8.1 by providing indicators on: Qt, the time to reach maximum acidification; Qv, the cell viability; mechanical stability and rheology; controlled product temperature; water activity; and the total cost of ownership of the process. Contributions to model paradigms have been provided for task WP8.2, since it was learnt that the direct on-line measurement of the biological activity of the bacteria is impossible; critical process parameters (CPP) for quality therefore need to be identified, and quantitative relationships between these CPP and the viability or the acidification activity of the bacteria need to be established. By integrating the model of bacteria quality degradation into a model describing the drying kinetics, it becomes possible to develop an in-line control policy for the process that maximizes productivity and quality. A simplified one-dimensional model for the compact-layer configuration has been developed by APT and given to the other partners; the model predicts the drying time, the product temperature profile, the water content and the glass transition temperature. Contributions on sensor paradigms within WP8.3 for this case study have been obtained mainly through the sensing of temperature (product and refrigerating liquid) and chamber pressure. This case study has also been viable for experiments with ultrasound techniques. The involvement of this case study in WP8.4 has been demonstrated by the achievement of model-based parameter estimation. To that purpose, six experiments were performed in which two states were measured and employed as observables: the temperature at the bottom of the product and the vapour pressure in the chamber. Two control variables were employed to define the different experiments: shelf temperature and chamber pressure. The results showed that the maximum and mean errors between model predictions and experimental data improved after parameter estimation. Optimization oriented to process control has been performed in order to guarantee on-line as well as in-line product quality control while ensuring safe and efficient operation, and the development of an optimal control structure has been proposed that responds optimally to input and state disturbances. Real-time optimal control has been performed in two different plant configurations. The first one involved a nominal-case experiment and used the powerful eSS (enhanced scatter search) algorithm; the aim was to reproduce the nominal situation considered in the off-line optimal profile calculation.
The contribution of case study 3 to WP8.5, WP8.6 and WP8.7 has been its use as the first case study where the new distributed monitoring and control system has been deployed and demonstrated. The distributed control infrastructure developed is a low-cost, long-life-cycle, reusable, open, robust, versatile and modular industrial technology. The later achievements, up to the project demonstration, proved highly satisfactory, allowing the scientists of the CAFE project to implement control and supervision through pure Matlab language modules. The Matlab inclusion has been implemented on very general, open-source-oriented software. The database-centric (by means of the local database) architecture proved to be very general, allowing a fast refurbishment or integration of every existing piece of work or instrument within the project. The integration solutions proposed feature inherent scalability: only an SQL-query-level connection in the most abstract and incompatible case (where a completely proprietary control solution was present), a retrofit in the intermediate case (where some open low-level communications were possible), down to hardware and software instrumentation of a process from scratch. The durability and openness of the solution demonstrated to academic research the existence of a powerful tool for the actual industrial exploitation of the scientific results and, at the same time, frees research labs from dependency on the sales policies of typical hardware and software vendors based on proprietary software compatibility requirements. The database objects and engines are used as the main interface between different processes, distinct control layers and legacy or alternative standard solutions. This kind of interface at the same time enforces a seamless integration and implementation of the CAFE project achievements based on the paradigms for models, controllers, sensors and optimizers, by providing a unified tool which can potentially host all the practical realizations of their computational needs. Communication layers were added over the Internet protocol to enable seamless remote connection for control, diagnosis, maintenance, security, knowledge-based control and data-model connections. A unified hardware and software infrastructure for the integration of process control has been developed, and an integrated software architecture has been developed to include the results from the modelling, optimization and control work packages in a unified framework. The proposed infrastructure is able to implement general control and supervision policies. The essential communications between all the devices in the infrastructure are made mainly by means of data replication; this constitutes a novelty and a change of paradigm also for the communication technology, allowing the design and test of open, low-cost solutions of very general applicability.
Ice cream crystallization
Case study 4 contributed to WP8.1 by providing indicators on: energy consumption per mass of ice; settling time to the optimal value of viscosity and crystal size; rheology of the ice cream; and total cost of ownership and energy consumption. Contributions to model paradigms have been provided for task WP8.2 in food structuring processes. A paradigm for the crystallization process has been identified: it consists of following the transformation during crystallization by using the liquidus curve of the phase equilibrium diagram, and by following the difference in temperature between the freezer surface (evaporation of the refrigerating fluid) and the equilibrium temperature of the crystal solution in the bulk. The crystal nucleation laws, the crystal growth laws and the ice mass fraction depend on the difference of temperature with respect to the phase equilibrium diagram. From experimental data at the laboratory plant scale, model parameters have been identified using a reduced model. Validation has been done by following key parameters such as crystal size, ice content and viscosity of the product. Energy consumption has been included in the model for use in future dynamic modelling. Contributions on sensor paradigms within WP8.3 for this case study have been obtained mainly through the sensing of temperature, pressure, power consumption, dasher speed, crystal size and ice fraction, and by means of focused beam reflectance and an on-line viscometer built for the purpose. This case study has also been well suited for experiments with ultrasound and electronic nose. The involvement of this case study in WP8.4 concerned experiments for model identification using a factorial plan (D-optimum) considering three control variables: evaporation temperature, scraper speed and mass flow rate. Three different types of measurements were employed to compare model predictions and experimental data: the temperature of the mixture at three locations in the crystallizer, the mean crystal chord length at the outlet and the viscosity at the outlet (6 values). After parameter estimation, optimization has been carried out in order to guarantee on-line as well as in-line product quality control while ensuring safe and efficient operation. An optimal control scheme has been proposed that uses reliable process models and optimization tools which, combined in appropriate ways, enable processes to be operated at their optimal conditions and to respond optimally in the event of plant disturbances.
Concerning the activity related to WP8.5, WP8.6 and WP8.7, the ice cream case study has been the second case study where the new distributed monitoring and control system has been deployed and demonstrated. In this case a complete from-scratch installation has been performed, as described in deliverable D8.7. Case study 4 was chosen as the main case for the final demonstration because of its completeness both in monitoring and in control. On the control side, the same paradigmatic approach as in the other case studies has been followed and refined in the last reporting period for the final demonstration. The complexity of the phenomena involved in the ice-cream process has been tackled. During crystallization several phenomena are involved, and three mechanisms have been taken into account: the nucleation of the crystals, the growth of the crystal size, and the breakage of the crystals mainly caused by the blades of the scraper. Other phenomena have also been taken into account: the wall heat transfer, the transport of the product, the viscous dissipation and the radial diffusion. A peculiar characteristic of case study 4 is the continuous nature of the process; the time scale of the process has been identified as the stabilization time from one operating point to another, which takes between 5 and 10 minutes. To describe the crystallization process, a population balance equation coupled with an energy balance equation has been used. The population balance equation describes the evolution of the crystal size distribution inside the freezer by a Partial Differential Equation (PDE), which gives the crystal size distribution as a function of the spatial coordinates and time; the energy balance equation is also a PDE. The reduced-order model is composed of 6 ODEs, which here again are highly nonlinear and non-affine with respect to the control input, which has been a difficulty for the control design. The problem considered is the control of the viscosity of the ice cream at the outlet of the freezer; it is a problem of regulation of the ice cream viscosity at a fixed set-point value. The control input is the compressor rotation speed. The control law which has been designed computes, at each time instant, a value of the control input; this value depends on the difference between the estimate of the viscosity and the set-point, and also on the estimates of the state and on the measurement of the evaporation temperature. The control strategy is in fact a cascade control strategy, based on a reduced-order model obtained from the initial PDEs by means of the method of moments. The viscosity to be controlled can be expressed as a function of the state variables of the model. Interestingly, the same linearizing control law as the one used in the wine-making case study has been considered for the ice cream crystallization.
WP9 : Demonstration
A DEMO day was held at Irstea to showcase a live demonstration of the ice cream case study. The case study was presented to over 50 participants, covering its control, sensing, modelling, data management and integration aspects.
Potential Impact:
Issues in food and bio process industries
Different factors characterize today's food industry: these include its diversity, its complexity and the high level of industrial competition. The main issue that the food industry has to face nowadays is the production and delivery of reliable food, able to satisfy organoleptic, nutritional and safety requirements. The fact that consumers are looking for more and more services (easy-to-use and/or ready-to-eat products, quick preparation times, ...) implies more complex formulation recipes and processes. At the industrial production level, this means an increasing number of unit operations whose combinations need to be well understood in order to be operated efficiently, but also the fact that raw materials and ingredients are added at different steps and that the co-products sometimes become, from an economic point of view, more interesting than the main products. These general issues are, however, not the only ones; many other issues also need to be addressed, which may in particular depend on the kind of products.
The first question to be addressed is the following: why do we need to implement control strategies in a food process?
Indeed, several objectives lead to considering control in the context of food processing:
- to increase the productivity of machines (but mechanization can also address this issue),
- to increase the productivity of workers (while training can also be helpful with that respect),
- to decrease product losses (it is generally considered that 50% of losses (raw material and food) occur during shelf life and production; considering the nutrition of people, the reduction of losses is a major issue),
- to increase the product quality regularity and/or reduce their variability,
- to increase the flexibility of machines and processes.
But requirements directly related to the specificities of the food and bio industry also have to be considered, more precisely:
- to increase the hygiene of the food processing and production
- to decrease the effects of the natural variability of bioproducts characteristics
- to decrease the effects of the natural perishability of bioproducts
- to increase the global product quality: for bioproducts, for instance, quality is a term covering a wide range of sometimes contradictory aspects (texture, colour, taste, composition, ...).
A central issue that needs to be addressed as a priority is the importance of maintaining the properties of the product as constant as possible. The interest in optimising product quality is undoubtedly very important, but it is obvious that the confidence of consumers in food products is based first of all on the constancy of the product properties.
At this point it is important to remember that there are numerous constraints connected to food production. With respect to production costs, the main factor is usually the cost of the raw materials, the second being the labour cost. Energy is of lower importance since it presently amounts to 3 to 15% of production costs. Nevertheless, energy is becoming an increasingly important factor due to environmental considerations and the increasing cost of fuel sources. One may expect that in the coming years the price of food will most probably be indexed to oil and gas costs. Therefore the control of energy consumption within the food industry will become of increasing importance and interest. It is worth noting that the first substantial implementation of automatic control in the food industry started with the first oil crisis in 1974.
Emerging questions
With respect to labour costs, the economic consideration is not the only one to be faced. The fact that workers and employees in the factory are in contact with food implies safety and sanitary considerations. The idea of a factory without humans is probably not desirable, but such a consideration has to be envisaged.
More recently, the effects of the different crises in food production (e.g. Bovine Spongiform Encephalopathy (BSE), avian flu, pig fever, foot-and-mouth disease, the 1999 dioxin crisis in Belgium related to the contamination of the food chain via pig and chicken feed, ...) show that consumers are worried about food and ask for more safety. Even if much progress has been made and the ability of the food industry to actually control the safety of production has improved, one consequence of the past years is that regulations have become more complex and new tools have arrived at the factory level, which are opportunities for control purposes. HACCP principles and tools (Figure 1) have been available for a long time. Today, however, quantitative HACCP is arriving as a new tool, and databases and Good Manufacturing Practices are becoming a basis for the implementation of Decision Support Systems.
Figure 1. Implementation of control and constraints
This has resulted in particular in new European and national regulations, among which the two following EC Regulations: 178/2002 (“laying down the general principles and requirements of food law, establishing the European Food Safety Authority and laying down procedure in matters of food safety”) and 1829/2003 (“on genetically modified food and feed”). More constraints thus apply to the control of production. Traceability is required, which implies the implementation of new tools, mainly for measurement and monitoring as well as for recording the history of production events. The new responsibilities of the factory with respect to raw materials and food call for new, mostly computer-based, tools. Databases are often available, yet not used as well as they could be.
Another key factor to be considered is the requirement for productivity, in relation with economic considerations and the profitability of factories. The appropriate compromise among all the constraints is usually difficult to find, and many food factories are looking for tools able to help them take the appropriate decisions in that regard. Figure 7 provides a historical perspective and illustrates the fact that food factories are presently looking for a global process database that would allow them to implement new control functions.
The number of studies dedicated to the monitoring of indicators related to the productivity and efficiency of processes is quite low. One of the difficulties of food companies stems from the very small profit margins they have to manage. Each time a better control of margins becomes possible, the economic situation improves. Even if no generic study is available, it is well recognised that energy costs amount to around 3 to 15% of product costs. Raw material losses are sometimes important and better control can significantly improve the yield. Increased productivity is often a competitiveness criterion. It is, to date, more difficult to manage criteria related to a better control of organoleptic or nutritional properties.
Numerous control systems have been implemented in industry over the last decade, as reported in several studies. For example, Morris [40] points out that in the United States the main issues in the food industry were related to the implementation of an increasing number of automatic control and integration tools, with the objective of addressing the important questions of organoleptic and sanitary properties. As far as we know, there is no similar study for Europe; nevertheless, new issues have to be addressed, even if the search for the control of organoleptic properties and safety remains a priority.
The main new issue is nutrition. The motivations of consumers for healthy food are obvious; the main question to be addressed is therefore to be able to control the impact of food on nutrition. There is indeed an open scientific question: to determine and characterize what high-impact nutrition foods are. Numerous research projects are presently considering this question. But new European directives also have to be taken into account by the food industry, the main one probably being the one on nutritional profiles (even if, to date, no profiling method has been chosen and many criteria are proposed in each European country). The simplest way to analyse the situation is to consider that the chemical composition of food, during and after processing, becomes of major interest. It is a combination of the interest in nutrition issues and the capability of analytical tools to accurately discriminate the composition of food. More precisely, as was the case for microbiology, the analysis of the positive and negative aspects of chemical composition becomes possible.
The positive aspect is the retention of important molecules during processing and conservation. On the negative side, the neo-formed compounds (NFC) that are produced during the processing of food are of increasing interest today because the number of analytical tools that can measure such properties is increasing (and their detection limits are decreasing). This is indeed a new situation compared with past years. Most control strategies of food operations are based on sensory properties, that is, visible properties (visible meaning that people are able to “measure” the properties with their senses), at least for the human operator working close to the production line. With the new considerations about chemical composition, the expected properties that have to be controlled are not visible, and the means to measure them are difficult and sometimes not available to date.
Good Manufacturing Practices (GMP) are implemented, today not under regulatory constraints as is the case for the pharmaceutical industry, but the increasing level of regulation needs to be anticipated, and automatic control is one of the ways that will be very helpful to address this issue, even though the food industry will probably not have to reach the level of regulatory constraints that the FDA requires for pharmaceutical products. The pressure, moreover, comes more from retailers than from regulations, and the consequence is a need for more automatic control and monitoring.
As compared to other manufactured products, food is a product that has to be transformed at two levels. The first level is the factory level (which is more and more often performed in at least two steps, including a final assembling process). The second level is the culinary/domestic level. The second transformation, in which heating is most often involved although it is not the only operation, is generally not controlled, because of the limited control capabilities of culinary technologies. An important question for food companies is to manufacture products that are satisfactory in terms of taste, safety and nutrition independently of the way they are used at the domestic level. The necessary robustness of the product is an objective, and among all the possible transformation trajectories, the one that introduces this robustness will be the optimal one. Another consideration could be to study the domestic unit operation in the same way as the control of an industrial unit operation: if we consider heating or baking for instance, the understanding and the design of an automatic control strategy at the plant level could similarly be applied at the domestic level (as long as the cost allows it).
It is obvious that economic aspects are also important. Consumers tend to significantly lower the share of their budget devoted to food (14% today compared with more than 25% thirty years ago). Production costs, as highlighted above, are so high, and the pressure from retailers so strong, that in most food productions today the profit margin is low (around 3% for most food products). The economic sustainability of most food factories is therefore under pressure, and without change the profitability of many food companies will significantly decrease.
Taking into account the demand from consumers, industrial companies are looking for tools capable of defining the design and production of food from market to factory. The necessary flexibility is not easy to achieve, and all process manufacturers are looking for higher-level control implementations in order to increase the opportunities for flexibility. Nevertheless, as we shall see below, this is not straightforward.
In order to provide an answer to the previous questions, different options are under study. The first one is to increase the engineering of food processes, and more and more complex food chains are proposed. The idea here is mainly to decrease energy costs, to address environmental issues (via the reduction of waste and wastewater, for instance) and to maintain the quality level as much as possible. As a consequence, because the processes are more and more complex, control becomes more difficult, mainly because the dimension of the expected properties is increasing: the search for a more complicated compromise is harder, and the training of the operators is not sufficient to meet all the control purposes.
Another interesting evolution needs to be taken into account. Looking at recent food process exhibitions (Achema in Germany, IPA in France), a large number of the proposed processes are smaller and smaller: size reduction is an important trend. This trend is easy to understand, because it allows more unit operations to be installed in the same location and a larger diversity of ways of processing the food; however, it also means that the residence times of the products are shorter, rendering the control of the process operation harder. This clearly shows that automatic control functions become more and more important, if not necessary.
Even if automatic control appears to be a solution, it must be considered that the social situation in Europe does not allow a strong decrease in the number of workers in an industrial plant. The characteristics of the food product, the complexity of food processes and the large number of degrees of freedom that are available require people for controlling the processes. In that context, advanced control strategies are also essential, and new concepts for control that allow automatic functions to be shared with decision support systems for human decisions are important and will be part of the sustainability of the food industry.
Facing these questions: the situation of control
Two points have to be addressed. The first concerns the progress of components and tools for control; the second concerns the way companies are handling the question.
The control components industry situation
The progress in control technologies and techniques is enormous. It is well known that the power of computer tools increases every year. Industrial computers are cheaper, more accurate and an increasing number of functions are available. The improvements in the performance of Programmable Logic Controllers (PLCs) (industrial computers dedicated to automatic control tasks) are obvious. Most food factories are able to implement these tools, even if, as Ilyukhin [27] explains in the context of the United States, these tools are mainly implemented by process manufacturers and not by food producers. PLCs allow the implementation of, for instance, high-level algorithms for on-off control, sequential control and even setpoint control.
The improvements in actuators are very important too, yet often neglected. More and more accurate actuators are available in food factories. This is often motivated by the search for increased productivity. But it is a fact that controlled motors, smart valves and complex manifolds are available, and new control functions become possible. One actuator that is more and more implemented is the multi-axis robot, due to price reduction but also because of increasing labour illness costs, which tend to be used as an argument to replace humans by robots. Today the implementation of robotic functions coupled with image analysis is more an engineering question than a research one.
The third component of control, which could indeed be considered as the first one, is the on-line sensor. Here the situation is still difficult. Numerous papers deal with the lack of reliable and appropriate sensors. It is obvious that even the major sensor manufacturing companies do not show major motivation to adapt their technologies to food and bio characteristics. The main reason is related to the market: specific sensors are needed for each specific application, for which the market is usually quite narrow. The profitability of sensor adaptation is not easy to demonstrate. This most probably explains why, still today, only a limited number of specific on-line sensors are available in the food industry. The emerging solution, at least for part of the question, is the adaptation of analytical tools, classically available at the laboratory level, to process and plant conditions (increase of robustness). The composition of food becomes easier to measure. At this point, the transfer of Process Analytical Technology (PAT), first developed for the pharmaceutical industry, is a way to address on-line food composition analysis. However it is not enough for control purposes, because the set of properties expected to be measured during process operation includes not only the composition but also the texture, the colour, the aroma, the sanitary properties, etc. The proposed solutions are numerous. First it must be noted that developing a new measurement method is a long process that usually takes several years, not just a few months. Secondly, the use of smart sensors (i.e. the combination of easy-to-do measurements and software) allows numerous situations to be handled. Unfortunately these remain case-by-case applications, usually without possible generalisation. One of the ideas is to transfer results and tools available for the control of bioprocesses to food processes (e.g. state and parameter estimation, indirect model-based sensors), as illustrated by the sketch below.
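As an illustration of what such a model-based software sensor can look like, the following minimal sketch reconstructs an unmeasured biomass concentration from a noisy, easy-to-measure substrate signal using an extended Luenberger observer on a generic Monod-type batch fermentation model. It is only a sketch under hypothetical parameter values and gains; it does not reproduce any of the sensors actually developed in CAFÉ.

```python
# Minimal sketch of a model-based "software sensor" (extended Luenberger observer):
# the unmeasured biomass X is reconstructed from a noisy on-line substrate
# measurement S, using a generic Monod-type batch fermentation model.
# All parameter values are illustrative only.
import numpy as np

MU_MAX, KS, K1 = 0.25, 2.0, 20.0   # hypothetical growth and yield parameters
G_X, G_S = -0.1, 0.5               # observer gains; G_X < 0 so that an over-estimated
                                   # biomass (which depletes the estimated substrate
                                   # too fast) is corrected downwards
DT = 0.05                          # integration step [h]

def mu(s):
    """Monod specific growth rate [1/h] (substrate clamped at zero)."""
    s = max(s, 0.0)
    return MU_MAX * s / (KS + s)

def plant_step(x, s):
    """One Euler step of the 'true' process (used here only to generate data)."""
    growth = mu(s) * x
    return x + DT * growth, s - DT * K1 * growth

def observer_step(x_hat, s_hat, s_meas):
    """One Euler step of the observer, driven by the measured substrate."""
    err = s_meas - s_hat
    growth = mu(s_hat) * x_hat
    return x_hat + DT * (growth + G_X * err), s_hat + DT * (-K1 * growth + G_S * err)

if __name__ == "__main__":
    x, s = 0.1, 180.0                # true initial state (X is unknown to the sensor)
    x_hat, s_hat = 0.5, 180.0        # deliberately wrong initial biomass guess
    rng = np.random.default_rng(0)
    for _ in range(int(48 / DT)):    # 48 h batch
        x, s = plant_step(x, s)
        s_meas = s + rng.normal(0.0, 0.5)              # noisy on-line measurement
        x_hat, s_hat = observer_step(x_hat, s_hat, s_meas)
    print(f"true X = {x:.2f} g/L, estimated X = {x_hat:.2f} g/L")
```

The design choice illustrated here is typical of such software sensors: the cheap measurement supplies the information, the model supplies the structure, and the gains trade convergence speed against sensitivity to measurement noise.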
The present situation in the food industry is such that, even in the absence of on-line sensors, and largely thanks to the know-how of the process operators, food companies are able to provide products that correspond to the consumers' expectations. The main sensing tools are indeed the operators and their ability to evaluate the product. It is obvious that such know-how should not be neglected in the design of control strategies. Recent results have provided methods that are able to include human evaluation into feedback and/or feedforward control strategies. Most probably this is one of the ways to take consumer expectations into account: results show how plant control experts are able to anticipate consumer behaviour and how they control the elaboration of quality in such a way that expectations are met. It is interesting at this point to keep in mind that SMEs are here often more efficient than big companies. This is one of the relevant goals of the Strategic Research Agenda [wp5].
It is evident that, due to the complexity of food transformation, the combination of classical sensors, analytical tools and human evaluation with models, optimization and control algorithms is the most appropriate solution for the development of a set of tools able to provide a Food PAT-based control system concept.
The food and bio companies situation
The food companies are trying to increase the potential and performance of their control systems. It is obvious that the level of the necessary studies and developments is such that these will more likely be implemented in big companies than in small and medium ones. However there is a major potential for improvement in both types of companies. The key issue is that it is often a case-by-case approach, without any generalisation. As said above, the process manufacturers are the main channel for the introduction of control functions within food factories. The consequence is that the main emphasis is on functions related to the technologies rather than to the products. The specific dimensions of the process and its product(s) are usually missing. It is therefore important to come up with control engineering tools that appropriately address the process issues, and in particular the different interactions within the process, typically in terms of the main phenomena (chemical/biochemical transformations, heat transfer, mass transfer, multiple phase interactions (liquid/solid/gas, but also emulsions or gels for instance), ...) that take place within the process.
The second conclusion at the food industry level is linked with classical advanced control approaches. In food companies there are most often no control engineers, while the newly proposed control tools and methods are mainly designed by control engineers. The way the algorithms are designed implies the definition of parameters that are not easy to understand for operators and process engineers. It appears that, in the food industry (as is also the case in many process industries), the best approach to the design of the control problem is to consider that one of the key issues is to control the constraints that have to be fulfilled. This often implies an adaptation in the design of control strategies.
The food industries in Europe have proposed an agenda via the Food for Life platform, in which process control is one of the goals (goal 2, challenge 3). The main point relates to the statement that “robust and reliable quality sensing systems must be researched and developed over differing time scales so as to assess quality throughout the life history of a product. In-line, preferably non-destructive, and integrative quality sensors are a prerequisite for a modern process control. It will be essential to adapt read-outs of such quality-sensing systems to generate useful parameters for the design of new processes and for the creation of new food”. The corresponding deliverables are expected for 2020. The industry obviously has to face these questions before that date, and the purpose of this analysis is to be able to develop appropriate tools and methods for the control of the processes in the meantime.
The potential scientific responses are numerous
An overview of the scientific production in peer-reviewed journals and international conferences provides a good indication of the level of development and implementation of automatic control in the food industry. In simple words, it is interesting to note that, while the automatic control and process control scientific communities have been characterized by major scientific developments and industrial implementations over the last two decades, the application of advanced process control in the food industry is still limited. Such an a priori negative statement should be balanced by a recent yet substantial increase of interest of process control scientists in food processes, and of food scientists in advanced control methods.
The first trend is for instance illustrated by the publication in the Control Systems Magazine, one of the publications of the Control Systems Society of the IEEE (Institute of Electrical and Electronics Engineers), of two special issues (August and December 2006) on process control, with one paper fully dedicated to the control of food processes [41] and two papers on crystallisation processes and biological reactors [25][30] in which specific food process issues are also addressed. It is also illustrated in the triennial milestones report of IFAC (International Federation of Automatic Control), which summarizes every three years (the basic periodicity of IFAC activities) the accomplishments and trends in the research and applications of automatic control in all fields: food processes are clearly given an increasing importance [12]. The same remark holds for the activities of at least three technical committees (TC6.1 on Chemical Process Control, TC8.1 on Control in Agriculture and TC8.4 on Biosystems and Bioprocesses) where food process modelling, monitoring and control are clearly identified. It is also worth noting that major control engineering journals like Automatica and the Journal of Process Control give increasing room to publications dedicated to the monitoring and control of food processes.
Recent international conferences in food processing are illustrative of the second trend. It must be noted that they increasingly include sessions on automatic control. During ICEF9 (International Conference for Engineering and Food, Montpellier, France, 2004), the sessions related to control clearly emphasized that a lot of new functions are available; optimisation appeared to be the main approach, and only a very small number of new sensors and new measurement principles were proposed. During the IUFoST conference in Nantes, France (September 2006), a similar observation could be drawn: a few presentations were dedicated to the control of specific food process case studies, with optimisation as a tool.
It is important to note that, as already mentioned, the research activities in automatic and process control over the last decades have resulted in a wide spectrum of new methodologies that address important control issues. In the field of process control, very active research has concentrated on process monitoring, optimisation and control. While Model Predictive Control (MPC) has often been viewed as a common denominator for control design, it also has to be considered as a starting point for the design of more specifically dedicated and appropriate controllers aimed at addressing the specific issues of classes of processes. Robust state estimation [13][14], process optimization and optimisation-based control [3][4][5][6], and real-time optimisation (including adaptive extremum-seeking control techniques [20][35][36][56][57][64]) have been the object of increasing research and have resulted in several promising new techniques, in particular for food processes. Another important issue is the use of appropriate models for control design: very often, the mechanistic models used to describe the (complex) dynamical behaviour of food processes and their intricate mechanisms are too complex to be used in efficient control schemes; this holds for processes involving complex fluid flows, spatial distribution and particulate distribution (which result in partial differential equation (PDE) model representations), but also for biochemical systems involving a large number of complex biological reactions. Several methods for model reduction have been widely studied (singular perturbation, pseudo-spectral methods, weighted residual methods (Galerkin, orthogonal collocation, ...), finite elements, etc., but also systems biology reduction methods, e.g. based on convex basis analysis).
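To make the receding-horizon (MPC) idea mentioned above concrete, here is a minimal sketch applied to a deliberately simple first-order discrete-time model. The model coefficients, horizon length, weights and setpoint are assumptions chosen purely for illustration; they do not correspond to any CAFÉ case-study controller.

```python
# Minimal receding-horizon (MPC-style) sketch on a simple first-order model
# T[k+1] = a*T[k] + b*u[k]; all numerical values are illustrative only.
import numpy as np
from scipy.optimize import minimize

N = 10                               # prediction horizon
A_COEF, B_COEF = 0.9, 0.8            # discrete-time model coefficients
U_MIN, U_MAX = 0.0, 10.0             # actuator limits (e.g. heating power)

def predict(t0, u_seq):
    """Simulate the model over the horizon for a candidate input sequence."""
    t, traj = t0, []
    for u in u_seq:
        t = A_COEF * t + B_COEF * u
        traj.append(t)
    return np.array(traj)

def mpc_step(t_meas, t_ref, u_prev):
    """Solve the finite-horizon problem and apply only the first input."""
    def cost(u_seq):
        traj = predict(t_meas, u_seq)
        tracking = np.sum((traj - t_ref) ** 2)         # setpoint tracking term
        moves = np.sum(np.diff(np.r_[u_prev, u_seq]) ** 2)  # input-move penalty
        return tracking + 0.01 * moves
    res = minimize(cost, x0=np.full(N, u_prev),
                   bounds=[(U_MIN, U_MAX)] * N, method="L-BFGS-B")
    return float(res.x[0])

if __name__ == "__main__":
    t, u, t_ref = 20.0, 0.0, 60.0    # start at 20 degC, setpoint 60 degC
    for k in range(30):
        u = mpc_step(t, t_ref, u)
        t = A_COEF * t + B_COEF * u  # plant equals the model here (no mismatch)
        print(f"step {k:02d}: u = {u:5.2f}, T = {t:5.1f}")
```

The point of the sketch is only the structure: at every sampling instant an open-loop optimisation is solved over the horizon, but only its first move is applied, and feedback comes from re-solving the problem at the next measurement.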
Batch processes, for instance, have recently (i.e. over the last decades) been the object of increasing research activities due to obvious industrial needs. One issue, for instance, is the ability to provide rapid estimates of key process variables representative of the process quality parameters: this can be done via the use of software sensors that converge in finite time. It is also worth noting that EC projects have recently been dedicated to the monitoring and control of batch processes, in particular the BatchPro project (“Knowledge-driven batch production”) [wp4].
All this knowledge will obviously serve as a basis for the developments to be performed within the CAFÉ project.
A synthesis
Based on the observation that the level of specifically dedicated scientific developments and real-time implementations of advanced process control strategies in the food industry is still limited, the central idea of the CAFÉ project has been to handle the smart control issue in the food industry by considering a new concept. The main question is the combination of the different software-based components in order to associate them into appropriate control tools. The key point, as explained in the preceding sections, is therefore to develop an integration approach that takes into account the specificity of food processes, where the cooperation between man and technology is important.
This includes the following important axes:
• Sensors and a food-adapted Process Analytical Technology concept, to be developed and implemented with the idea of proposing, based on the applications, how to adapt existing sensors and how to combine them with analytical tools and with human evaluation. The last point is clearly very important since it is obvious that manpower will continue to play a central role in process operation in the food industry, at least for all the evaluation work to be performed close to the plant, with the issue of providing user-friendly control tools for the process operators.
• Optimisation-based control algorithms have to be developed. The optimisation point of view is well recognised today as the leading one. The different situations have to consider the specificities of food operation, where the controllability is localized or distributed in space, and where batch or continuous operations are concerned. The calculation and control of product trajectories along the process life is important and will be a major concept for controller design.
These axes are based on the development of new approaches to the modelling of food and bio processes, where existing models can be integrated and adapted for optimisation-based control purposes (via model reduction approaches) and where the global integration of model-related algorithms is performed through software component technology.
Dissemination activities
Dissemination was aimed at three target groups: the food industry, food plant providers and academia. The main users of the CAFÉ production strategy, monitoring and control system are the first two groups. The dissemination goal was to gain acceptance of the results and methodologies developed and of their application in wineries all over Europe. The dissemination goal for the academic users of the information was to inform on the progress made and hence stimulate further research: as described in the work programme and objectives, much of the project is about reviewing recent advances and incorporating them into an integrated tool rather than wholesale model development.
Apart from the publications in refereed journals and the communications in international scientific conferences, a specific scientific workshop as well as a demonstration workshop aimed at the industry have been organized at the end of the project with the objective to provide an overview of the project results, with the option to possibly combine on-site demonstration with teleoperated demonstration on a distant site.
A webpage has been used as the principal tool for the exchange of information within the project. The CAFÉ webpage has also been used as a dissemination tool outside the project by the communication of the project results and the scientific publications connected to the project, as well as the announcement of activities (conferences, workshops, courses, exhibitions,…) of scientific/industrial relevance to the activities of the project.
The project has given the opportunity to increase the technical and commercial potentialities of European companies on the food market by incorporating efficient production strategies and smart control systems that can guarantee high and uniform product quality and reliable operation of the plants. This improves the position of the European food industry on the international market, and favours the increase of exportation outside Europe.
For the academic participants in the consortium, the results of this project maintain their international reputation as leaders in modelling, diagnosis, monitoring and control of food processes. The results from all parts of the programme are available, allowing them to strengthen and extend their areas of expertise. The academic element of the project continues to consolidate the European reputation in the world as a scientific leader in the field of food research. It also increases the links and collaborations between the European partners. Most of the academic partners already had common scientific projects with some of the other academic partners, and the project has been another opportunity to increase and reinforce the scientific collaboration among these institutions. Moreover, the collaboration between academic partners has been an important opportunity to develop new scientific links, build up new research activities and increase the scientific potential of both the consortium and the individual partners. The collaboration between the academic partners and the industrial partners has been an important source of vivid interaction that will lead to the transfer of new technologies from the universities and research institutes to industry and to environment-related activities. It has also been a unique source of inspiration in terms of research themes from industry and the “real” world to the researchers of the universities and research centres, by providing challenging scientific problems that arise from the problems encountered in industrial practice and in the management of food processes, and that require deep and fundamental scientific developments.
Training and exchange of scientists
A large part of the work of the CAFÉ project has been performed by young researchers, and the training of young researchers has therefore been an important issue of the project. It has included the exchange of researchers among the different partners. The training and the different exchanges have played a central role in the project, not only for improving the coherence and interactions within the consortium but also for favouring the education of highly skilled scientists and engineers. In that respect, the objective of the project has indeed also been to improve the scientific expertise in food science and technology in Europe.
Exploitation of project results
In terms of exploitation, particularly for the industrial companies and the end users of the smart control system, the potential benefits are enormous in terms of turnover and return on investment. It is for this reason that one end-user (PMS) and seven industrial companies at the interface with the food industry (Telstar, Oenodev, SPES, C-Tech, Alctra, Norit X-flow, BIV Trace) have been involved in the project. From an energy, environmental and economic point of view it is essential that the final deliverable addresses the needs of the end users.
For Telstar, the targets of the CAFÉ project are the following: to gain the knowledge and ability to build a freeze-granulation system for liquid solutions so that they can subsequently be lyophilized. As the dried products are very often pharmaceuticals, the final target is that the system should operate under sterile (or at least aseptic) conditions. Nowadays, pharmaceutical industries have to “overdose” the vials they lyophilize so that they match the required potency after the product is reconstituted (due to the different ranges of results achieved in each batch). With the proposed method, the product could be “bulk freeze-dried” in granules; after the batch is finished and the potency analyzed, the individual vials could be filled with just the necessary amount of product. Important savings (10%) in batch cost are expected. The second benefit is that freeze-drying of products dosed in vials is a very long process: usually a batch needs 36 to 48 h (even longer processes are not uncommon). With granules, the drying time could be reduced by a factor of 5 to 10 (drying is expected to need 5 to 10 h). The challenges are therefore the following. Forming pellets by spray cooling with a counter-current cryogenic fluid stream is an already known process, but it is currently impossible to perform it in an aseptic/sterile way. Pellet drying will first be done by filling pre-cooled vials and loading them into a freeze-dryer; further in the project, it will be studied how to do it in a continuous way. A further advantage is that these spheres are perfect candidates for coating in order to achieve a controlled release of the API within the patient. Extra exploitation results will be the availability of monitoring tools capable of in-line process control. This means that the expected economic impact on IMA-Telstar should be a clear technological competitive advantage over both American and Asian freeze-dryer manufacturers:
- A lyophilization process is usually specified only in terms of a ‘recipe’ (shelf temperature and chamber pressure vs. time). This, however, may not guarantee repeatable conditions for the freezing and sublimation steps. According to the recent FDA PAT guidelines, there is a need to study the process in depth in order to develop in-line tools that enable better monitoring and control. It is necessary to move from intensive, mass-independent property measurements (such as temperature or pressure, partial or total) to parameters which are scalable, and in so doing to implement the necessary tools to control these parameters and hence the cycle, thus permitting real-time feedback control actions. Two variables are key to monitoring the lyophilization process: the sublimation interface temperature (which has to be maintained below the collapse temperature during the whole of primary drying) and the sublimation mass flow (which has to be maximized to achieve the most cost-effective cycle); a simple illustrative relation between these two variables is sketched after this list. With classical monitoring tools these key goals are not achievable.
- The second advantage would be moving from a classical batch process to a possible continuous process, achieving a tremendous process cost reduction and better uniformity of the processed materials. This can only be achieved by obtaining a solid that can be dosed in aseptic conditions (pellets). It would open the currently very restricted niche of lyophilisation to other, less expensive products, unviable today due to the cost (both in time and money) of the process.
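As a rough, purely illustrative sketch of how the two key monitoring variables named above are linked (and not a description of the monitoring tools developed with Telstar), the quasi-steady relation below converts an assumed interface temperature into an ice vapour pressure and, through an assumed dried-layer resistance, into a sublimation mass flow. The vial area and resistance values are hypothetical.

```python
# Quasi-steady freeze-drying sketch: the interface temperature fixes the ice vapour
# pressure, and the pressure difference to the chamber, divided by the dried-layer
# resistance Rp, gives the sublimation mass flow. Illustrative values only.
import math

DH_SUB = 51_059.0    # enthalpy of sublimation of ice [J/mol] (literature value)
R_GAS = 8.314        # gas constant [J/(mol K)]

def p_ice(t_kelvin):
    """Approximate vapour pressure of ice [Pa] via Clausius-Clapeyron around the triple point."""
    return 611.0 * math.exp(-DH_SUB / R_GAS * (1.0 / t_kelvin - 1.0 / 273.16))

def sublimation_flow(t_interface_c, p_chamber_pa, area_m2=1.0e-4, rp=5.0e4):
    """Mass flow [kg/s] = area * (P_ice(T_interface) - P_chamber) / Rp,
    with Rp an assumed dried-layer resistance in Pa.s.m2/kg."""
    dp = p_ice(t_interface_c + 273.15) - p_chamber_pa
    return area_m2 * max(dp, 0.0) / rp

if __name__ == "__main__":
    # Example: interface at -25 degC, chamber at 10 Pa
    print(f"P_ice(-25 degC) ~ {p_ice(248.15):.1f} Pa")
    print(f"flow ~ {sublimation_flow(-25.0, 10.0) * 3.6e6:.2f} g/h per vial")
```

The sketch makes the control trade-off visible: raising the interface temperature increases the driving pressure difference (and hence the mass flow), but only as long as it stays below the collapse temperature.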
SPES aims to exploit innovations from the CAFÉ project to improve its activity in the design and development of embedded electronics for measurement and control applications related to biotechnological processes. The expected benefits of this improvement are in terms of product quality and performance and in terms of turnover.
For a company like C-Tech Innovation, the CAFÉ project offers considerable potential impact on its performance. The high-performance food processing equipment developed by C-Tech for its client companies can only realise its full potential through the adoption of optimum control systems; without such control systems the sales of such equipment are compromised, and the level of sales of such equipment by C-Tech has certainly been reduced over the years by the inadequacy of available control systems. It is hoped that the successful development of improved control protocols and systems in CAFÉ might increase sales income for C-Tech by about €200,000 per year in equipment sales.
For Alctra, the increase in the company's activities in the foodstuffs sector represents a development factor of prime importance, based as it is on the company's scientific and technological achievements and on the possibilities for consolidation and improvement in the future. In fact, this sector, which can be regarded as integrating a "low technology" area, exhibits a tremendous capacity for incorporating methods and equipment originating from the results of "high tech" work. Yet the incorporation of new technical approaches into the professional field of foods has to negotiate a preliminary stage of achieving test results under laboratory conditions. Accordingly, professional players will be more motivated to engage in the investments required in terms of in-house and specific research and development, giving rise to the economically viable realization of innovative production and inspection methods. The CAFÉ project is well oriented towards the objective of combining the complementary skills which make up the basis of any structured and finalized research and development activity. Alctra, through its policy of involvement, is effectively demonstrating a key strategic interest for its own development and the development of its technological and industrial partners.
The number of applications of membrane filtration processes for beverages is rapidly growing. The current annual turnover of Norit in this area is about M€20 to 30. For Norit the CAFÉ project will result in technical improvement of membrane filtration systems, resulting in better control of food quality. This will lead to increased market penetration of membrane filtration systems and higher added value of membrane systems (higher pricing). Thus, it is expected that this project will result in added annual turnover of several million euros.
The CAFÉ project will allow BIVtrace to enlarge its field of competence by adding and linking added-value information on a product throughout its life into a central database. Moreover, the synchronization of data with different new sensors, never studied before by a traceability company, will have a positive return for BIVtrace in terms of market position.
List of Websites:
http://www.cafe-project.org/
Coordinator:
Denis Dochain
Professeur ordinaire
Directeur de Recherches FNRS honoraire
CESAME, UCL
Bâtiment Euler, avenue Georges Lemaître 4-6, bte L4.05.01
B-1348 Louvain-la-Neuve, Belgium
tel : +32 10 472378
fax : +32 10 472180,
e-mail : denis.dochain@uclouvain.be
homepage : http://perso.uclouvain.be/denis.dochain/
Three central paradigms have been emphasized within the CAFÉ project: model parsimony, which can be simply expressed by the fact that a simple model will always do a better job than a complex model as long as the simple model is validated on real-life data; the combination of several measurements, more specifically the combination of different measuring techniques based on different physical principles as well as the integration of software sensors; and optimising control, i.e. the combination of optimisation and feedback control within a unified control scheme.
Project Context and Objectives:
Food industry is nowadays facing critical changes in response to consumer needs, which in addition to health and safety awareness, demand an ever larger diversity of food products with high quality standards. From the consumer side, such variety in the demand is driven by social or ethical incentives as it is the case of products more compliant with the environment or produced by sustainable processes. On the other hand, food industry is in a permanent quest for new markets and new population sectors not accessible before. This immediately translates into the search for novel products and more efficient processes so to gain market opportunities with respect to other companies.
In order to satisfy such needs and demands, which although driven by the product, directly affect the process, novel and efficient food and process engineering approaches must be developed so to comply with the proposed requirements. Product engineering approaches are already responding to the challenge by proposing new methods and tools to systematically modify or even design new products in response to consumer needs.
In similar terms, Process Engineering should offer efficient and flexible process alternatives to comply with the product safety and quality standards, while minimizing operation costs and environmental impact. At this point, it must be noted that by the characteristics of the food industry, such two concepts can be almost immediately related with minimization of energy and water consumption, and therefore with sustainability, a notion of particular relevance on these days when we start experiencing persistent evidence warning us about a global climate change.
To that purpose, original combinations of adapted or standard process unit operations need to be designed and optimally operated through smart control configurations covering the whole food plant production. In this way, food plants would evolve so as to become flexible and multipurpose production structures, able to efficiently modify or adapt the operation or to combine, in the best possible way (including product and process requirements), several production lines in response to market demands.
Main obstacles to smart operation and control in the food industry
The food industry is well established and many processes in operation nowadays are the subject of intensive work with regard to the ways of devising better operation modes in terms of product quality and safety (how to operate in order to ensure quality and comply with safety constraints) as well as in terms of operation costs and environmental impact. There is also an intensive development work aimed at responding to consumer demands by designing new products and designing and operating the more appropriate combination of unit operations needed to produce them.
However, and despite the fact that the essential physical, biochemical and microbiological principles are reasonably well understood, foods are complex systems with properties that, because they are connected with quality and safety, are usually very difficult to measure, estimate or even represent through reliable models. Such properties may include physico-chemical parameters associated with quality, such as nutrient content, texture, colour or rheology, or microbiological characteristics usually connected with food safety.
In addition, and from a Process Engineering perspective, the food industry integrates a rich variety of apparently very diverse processes and technologies thus hampering the search for unifying paradigms useful for dealing with different yet analogous processes. Such processes have only recently been classified into a reasonably small number of categories, namely bioconversion, separation, preservation and structuring.
The different processes summarized above might have different simultaneous purposes, or may be combined for the same product. For instance, cooking also has the effect of better preserving the products, and bioconversion or separation technologies such as lactic fermentation or drying also favour preservation. It is also important to highlight that, in order to guarantee production efficiency and product quality, the combination of the different processes and technologies in each production line, and their operation, should be defined and integrated harmoniously among the different parts in order to produce the required coordination. Such integration, being flexible, would allow the satisfaction of process and product requirements. In this context, it is worth noting that the food industry has experienced significant changes in its mode of operation over the last years in order to rapidly adapt to a changing market driven by consumer demand, stringent safety and environmental regulations as well as highly restrictive specifications on product quality. However, and despite previous R&D efforts, the time lag from product conception to optimal product development, including plant design and operation, is still too costly and time consuming.
In particular, and when considering more specifically the plant operation and the different processes briefly summarized above, the efficiency of the current control schemes in the food industry is still far from being optimal in a global sense, i.e. for covering operation costs, energy and water consumption, environmental impact and product quality and safety.
The main underlying obstacles to optimal plant operation and control are those derived from the (sometimes only apparent) diversity and complexity of processes, technologies and products, which often translate into partial, empirically driven solutions, valid only on a case-by-case basis.
Current needs
From a control and automation point of view, the following weaknesses have been presently detected in most food companies:
• Most plant control schemes reduce to local and decentralized control loops acting on a usually very small number of states (typically temperature or pressure) not directly connected with product quality nor, in many cases, with critical aspects of the operation such as water or energy consumption (a minimal illustration of such a regulatory loop is sketched after this list). Despite the fact that the performance of PID control can in many instances be more than acceptable (as is also the case in other process industries), the control loops should be combined among the different processes so as to exploit synergies rather than cancel them. In addition, this regulatory layer is neither commanded by nor integrated into higher-level supervisory layers.
• Although many food plants presently benefit from advances in data acquisition and monitoring of full production lines to gather and store huge amounts of data, the use of such information is quite limited: it is usually not efficiently exploited and is reduced to configuring alarms (often handled at a very low level) or to helping produce simple production decision rules and off-line control of inventories. A much more efficient use of such information should be made possible when properly combined with process models and prognosis tools, so as to estimate unmeasured yet relevant plant states and to predict future scenarios, even in a real-time context.
• Often recognized as a specificity of food processes, the lack of sensors for relevant product characteristics is still a problem. Even if numerous algorithms are available for advanced control purposes, on-line, real-time, reliable information is obviously necessary. Common sensors (temperature, pressure, pH, flow rates, ...) give information only very indirectly related to the product properties of interest, such as texture, aroma content, biomass, contaminants, vitamins, etc. Sensors for such product properties are either missing completely or very expensive, and not robust or reliable enough to be used in everyday industrial practice. Improving the reliability of sensing devices and developing new hardware-software sensing techniques to estimate difficult quality product parameters on-line are critical for developing smart control applications for food factories.
• It is important to note that, generally speaking, there is a lack of control design and operation paradigms to optimally operate plants, either learnt or inherited from diverse yet similar processes or scenarios. One such paradigm is that pursued by the engineering community, which focuses on concepts such as simulation, optimality and optimisation as those containing a truly systematic approach to the problem of devising smart control for food processes.
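As announced in the first bullet above, the following minimal sketch shows the kind of decentralised regulatory loop that dominates current practice: a discrete PID controller with simple anti-windup acting on a single temperature loop. The plant model, tuning values and setpoint are assumptions for illustration only.

```python
# Textbook discrete PID with output saturation and simple anti-windup,
# closed around an illustrative first-order heating model.
class PID:
    def __init__(self, kp, ki, kd, dt, u_min=0.0, u_max=100.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        u = self.kp * err + self.ki * (self.integral + err * self.dt) + self.kd * deriv
        if self.u_min < u < self.u_max:
            self.integral += err * self.dt   # anti-windup: freeze the integral at saturation
        return min(max(u, self.u_min), self.u_max)

if __name__ == "__main__":
    pid = PID(kp=4.0, ki=0.5, kd=0.0, dt=1.0)
    temp = 20.0                                        # hypothetical temperature loop
    for _ in range(60):
        power = pid.update(setpoint=72.0, measurement=temp)
        temp += 0.05 * (20.0 - temp) + 0.03 * power    # simple first-order heating model
    print(f"temperature after 60 steps: {temp:.1f} degC (setpoint 72.0)")
```

Such a loop regulates one local variable well, which is exactly the limitation discussed above: it knows nothing about product quality, energy or water use, nor about the neighbouring loops it may be fighting against.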
In order to overcome the present limitations detected in food production plants and to offer a novel and original integral approach that guarantees complete and intelligent control of the whole plant in response to changes in the demand and the supply of materials, while ensuring product quality, flexibility and efficient, environmentally compliant operation, a number of scientific and technological objectives, stated so as to be achievable within the project duration, are proposed next.
Scientific and technological objectives
The main scientific and technological objective of the project is the development and implementation of novel process engineering tools and methods to efficiently control, in a flexible way, wide classes of food processing plants. In particular, it concentrates on the development of an integrated approach to optimal food process operation and control, and on its implementation through reliable and novel sensing systems and advanced simulation tools:
• to efficiently reconstruct unmeasured states of the plant such as process and product parameters indicative of food quality and health safety of the food product, as well as process operation parameters indicative of the efficiency of the operation;
• to provide, efficiently and in real time, predictions of future scenarios, and robust and efficient control strategies that achieve optimality in terms of process factors (minimization of operation costs, energy and water consumption, environmental impact) and quality parameters of the food product.
Figure 1. The integration concept of CAFÉ
In order to achieve the proposed global objective, the following specific objectives need to be achieved.
1. The development of a robust and reliable sensing architecture, in order to obtain on-line estimates of product and process parameters and properties indicative of food quality, safety and efficiency of the operation. The design of the novel and low-cost process sensing systems will be included in an integrated measurement system, based on optimized ensembles of sensors and PAT (Process Analytical Technology) technologies, in combination with modelling and identification techniques, to allow real-time or near-real-time (e.g. on-, in-, or at-line) monitoring of critical parameters (raw material, process, product) while manufacturing is in progress. The sensing system will be built on a hardware-software architecture with the following properties (a minimal code illustration of the component idea is given after this list).
• It will be Component Based, i.e. it will be constituted by adaptable application components of different types, using recent paradigms based on formal real-time process specifications.
• It will have a Distributed Configuration. In order to overcome the intrinsic complexity of the plant and to be of use in a real-time context, software components will be implemented on distributed platforms.
• It will be Reusable. The architecture must be designed as a framework of reusable components with distributed real-time features. This framework will be extensible and adaptable in order to allow its implementation in a wide range of possible applications.
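A minimal sketch of the "component based / reusable" idea is given below: hardware sensors and software (model-based) sensors share one interface, so a composite sensor can be assembled from reusable parts. It is only an illustration; the real CAFÉ architecture is distributed and real-time, and all names and the viscosity correlation here are hypothetical.

```python
# Illustrative sensing-component interface: hardware and software sensors are
# interchangeable behind a common contract. Not the actual CAFE architecture.
from abc import ABC, abstractmethod
from dataclasses import dataclass
import time

@dataclass
class Measurement:
    name: str
    value: float
    unit: str
    timestamp: float

class SensingComponent(ABC):
    """Common contract so components can be composed and reused."""
    @abstractmethod
    def read(self) -> Measurement: ...

class HardwareSensor(SensingComponent):
    def __init__(self, name, unit, acquire):
        self.name, self.unit, self.acquire = name, unit, acquire
    def read(self):
        return Measurement(self.name, self.acquire(), self.unit, time.time())

class SoftwareSensor(SensingComponent):
    """Derives an unmeasured quantity from other components plus a model function."""
    def __init__(self, name, unit, inputs, model):
        self.name, self.unit, self.inputs, self.model = name, unit, inputs, model
    def read(self):
        values = {c.name: c.read().value for c in self.inputs}
        return Measurement(self.name, self.model(values), self.unit, time.time())

if __name__ == "__main__":
    temp = HardwareSensor("temperature", "degC", acquire=lambda: 4.2)
    torque = HardwareSensor("torque", "N.m", acquire=lambda: 1.8)
    # hypothetical correlation: apparent viscosity inferred from torque and temperature
    visco = SoftwareSensor("viscosity", "Pa.s", [temp, torque],
                           model=lambda v: 0.05 * v["torque"] * (1 + 0.02 * (20 - v["temperature"])))
    for sensor in (temp, torque, visco):
        m = sensor.read()
        print(f"{m.name}: {m.value:.3f} {m.unit}")
```

In a distributed deployment the same interface could sit behind a remote service, which is the sense in which such components remain reusable across applications.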
2. The development and application of new concepts and tools from model building and simulation to produce dynamic models able to capture the relevant process features. The models will have explicit parametric dependence and thus will be suitable for dynamic optimisation, smart sensor and sensing system development (objective 1), monitoring and control. To that purpose:
• Different classes of mechanistic dynamic models covering the different food process categories will be explored. These models will cover the main phenomenological aspects of the systems in order to capture the kinetic and microbiological phenomena related to food quality and safety, even at the cell level. In addition, detailed flow and heat transfer models, also based on Computational Fluid Dynamics (CFD) and population balance equations (PBEs), will be constructed.
• Novel model reduction techniques will be applied to the selected models to obtain simpler and computationally amenable reduced descriptions, while maintaining their explicit parametric dependence in a format that keeps them appropriate for process monitoring and control (see the illustrative sketch after this list).
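As a small illustration of the model-reduction step (not of the CAFÉ case-study models themselves), the sketch below reduces a one-dimensional heat-conduction equation with fixed boundary temperatures to three ordinary differential equations by Galerkin projection onto its first eigenmodes, and compares the result with a fine finite-difference solution. All numerical values are arbitrary.

```python
# Galerkin-type model reduction of the 1-D heat equation dT/dt = alpha * d2T/dx2
# with zero boundary values: three modal ODEs versus a full finite-difference grid.
import numpy as np

ALPHA, T_END, N_MODES, N_GRID = 1.0e-2, 5.0, 3, 201

x = np.linspace(0.0, 1.0, N_GRID)
dx = x[1] - x[0]
T0 = np.where(np.abs(x - 0.5) < 0.25, 1.0, 0.0)           # initial temperature profile

# Reduced model: project onto the first sine eigenmodes; each modal amplitude
# obeys a scalar ODE da_n/dt = -alpha*(n*pi)^2*a_n, solved here in closed form.
modes = [np.sin((n + 1) * np.pi * x) for n in range(N_MODES)]
amps0 = [2.0 * np.dot(T0, phi) * dx for phi in modes]      # Fourier coefficients of T0
T_reduced = sum(a0 * np.exp(-ALPHA * ((n + 1) * np.pi) ** 2 * T_END) * phi
                for n, (a0, phi) in enumerate(zip(amps0, modes)))

# Reference model: explicit finite differences on the full grid.
dt = 0.4 * dx ** 2 / ALPHA                                 # respects the stability limit
T_full = T0.copy()
for _ in range(int(round(T_END / dt))):
    T_full[1:-1] += ALPHA * dt / dx ** 2 * (T_full[2:] - 2.0 * T_full[1:-1] + T_full[:-2])

print(f"max |full - 3-mode reduced| at t = {T_END}: {np.max(np.abs(T_full - T_reduced)):.4f}")
```

The point is the size of the state: three ODEs are easily embedded in an optimiser or observer, whereas the 201-point grid model is not, and the same projection idea carries over to the weighted-residual and collocation methods cited above.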
3. The development of an intelligent control framework to operate the complete plant and to take optimal decisions. The framework will include conceptual modelling tools, efficient real-time optimal decision algorithms (real-time optimisers) and robust control methods that will minimize the system's inherent uncertainty and will make rational use, in combination with models and in a predictive context, of the large and complex data sets obtained from the plant signals and sensing technologies. The framework will include the following two items.
• An advanced control layer that will take advantage of the monitoring of quality information provided by on-line quality sensors and Process Analytical Technology (PAT) and advanced and robust controllers based on mathematical models of the processes, to account for the uncertainties associated with the system and external disturbances and thus enforce optimal operation.
• Optimal operation support tools. These tools will include dynamic predictive simulation and dynamic optimisation user-friendly modules for analysis (optimal selection of future plant operation policies and optimal decision making). Optimality will be stated in quantitative terms by maximizing product quality (while satisfying safety constraints) as well as in terms of minimizing operation costs, mostly related to the minimization of water and energy consumption, and environmental impact.
4. The demonstration of the applicability and efficiency of the concepts, methods, paradigms and tools developed on a number of application cases representative of processes and process combinations in the food industry. To that purpose, the CAFÉ concepts, methods and tools will be confronted with standard control solutions in order to quantify improvements in terms of product quality, operation costs and environmental impact, on the following set of applications illustrative of bioconversion, separation, preservation/stabilization and structuring, in cooperation with the SMEs involved in CAFÉ project:
1) wine making, as a case study of bioconversion processes
2) microfiltration of food beverages, as a case study of separation processes
3) freeze-drying of lactic acid bacteria, as a case study of preservation processes
4) ice cream crystallization, as a case study of structuring processes
Project Results:
WP2: Knowledge representation and data management
The specific objectives of WP2 were to propose tools and methods dedicated to knowledge and data management for the different case studies. One main point was to use and validate these methods for all food processes involved in the CAFE project. Approaches, architectures, models and innovative software were developed and implemented on the different food processes. The relevance of the proposed methods and their ability to evolve and operate together was also analysed. A major challenge was to provide a data management system that is easy to use and efficient in a scientific context. One problem often encountered is the lack of formalized descriptions needed to re-use and share data on a large scale; indeed, one of the biggest challenges in scientific research is now to unlock the full value of scientific data. The first key issue is to strongly associate data models (relational models, XML Schema, formats, etc.) with formalized knowledge models. We focused on introducing metadata expressiveness to provide a controlled vocabulary and inference through ontologies. The other main issue is to represent knowledge in a generic way, so that models and tools are easily implemented for the different food processes. This task required identifying the concepts common to food processes, formalizing them and formalizing the relations between them. Designing tools and methods applicable to all case studies was a specific challenge of the work package. We have developed and implemented for all case studies: a layered architecture and Web Service implementations; models for scientific and local databases; ontologies for processes, variables and information descriptions (metadata); and two approaches to interfacing with mathematical models, based either on OpenMI/Seamless or on the local database.
The ontology of units and measures was specifically created to offer a structure in which units and measures can be formally defined (see WP2.1). It allows the role and acquisition method of each variable to be specified, and the usual/unusual and on-line/off-line variables to be defined. This information can be used further to generate the databases. This ontology was implemented on the different food processes. It was accompanied by a set of services that support unit conversion, enabling models to obtain information in the appropriate units.
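As an illustration of the kind of unit-conversion service attached to this ontology, the sketch below is a minimal, hypothetical example (the unit names and conversion factors are illustrative; the real service resolves them from the ontology of units and measures):

```python
# Minimal sketch of a unit-conversion service for on-line/off-line variables.
# The factor table is illustrative only; the real CAFE service resolves units
# from the ontology of units and measures.

# Each unit maps to (scale, offset) relative to an arbitrary reference unit.
FACTORS = {
    "g/L":   (1.0, 0.0),      # reference for concentrations
    "kg/m3": (1.0, 0.0),      # 1 g/L == 1 kg/m3
    "mg/L":  (0.001, 0.0),
    "degC":  (1.0, 273.15),   # reference: kelvin
    "K":     (1.0, 0.0),
}

def convert(value, from_unit, to_unit):
    """Convert a measurement between two units of the same quantity."""
    scale_f, off_f = FACTORS[from_unit]
    scale_t, off_t = FACTORS[to_unit]
    reference = value * scale_f + off_f        # to the reference unit
    return (reference - off_t) / scale_t       # to the target unit

if __name__ == "__main__":
    print(convert(12.5, "g/L", "mg/L"))   # 12500.0
    print(convert(25.0, "degC", "K"))     # 298.15
```

A model requesting a variable in a given unit would call such a service before ingesting the stored value.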
Case study 1: Wine-making
The main result was the management of the different measurements (on-line, at-line and off-line) and the metadata implementation. A linked-data environment has been provided, based on Semantic Web languages (RDF, RDFS, OWL and SPARQL), that allows resources (experiments, data, articles, reports, webpages, ...) to be linked. The scientific database is the process memory and is used a posteriori for scientific issues, data analysis, data mining and modelling. This guarantees data reliability, integrity and sustainability. For example, the occasional anomalous results caused by oenological operations must be distinguished from disruptions caused by faults and failures: oenological operations should not be taken into account in the analysis, while faults may significantly affect fermentation and should be considered in the analysis. Thanks to a Web form in the Information System, operators and experts describe events in RDF syntax according to the ontology of events. Off-line measurement management is often difficult because the data sources are very heterogeneous. The information system generates custom spreadsheets (xls files) and allows user-friendly data acquisition. The software checks the consistency of the data, displaying which data already exist in the database, which will be updated and which will be added.
Case study 2: Microfiltration of food beverages
An ontology describing the beer production process has been created and published on the food-ontology portal Wurvoc as the brewing ontology (http://www.wurvoc.org/brewing). It was then refined for the filtration process (Filtration ontology, http://www.foodontology.nl/beer-filtration), and all relevant physical and chemical variables in this process and the workflow concerning filtration experiments have been specified (Beer filtration variables ontology, http://www.foodontology.nl/beer-filtration-variables). Concepts from these ontologies are used in the membrane fouling model and in the beer membrane filtration unit. In order to store the data in a fast and meaningful way we opted to use SeamFRAME, a framework specifically designed for the agriculture domain. Once the ontologies have been created, the framework can be used to (semi-)automatically generate a matching database and a set of Java classes. These classes and the tables in the database represent the classes from the ontology. The database stores instances of the classes with their semantic metadata, while the Java classes are used to connect the different models, input devices and the database to each other. The connection between models, devices and databases uses OpenMI, a standard that defines how models should communicate with each other; the database and the input devices can be considered as models as well. The generated Java classes are used to program layers around the different models. The OpenMI standard is request driven: each layer specifies what data it can provide, and when other models need information they request it from the relevant model, which in turn may ask another model for certain information. Once this chain reaches the lowest level (usually either the database or the input device), the information travels back up again. Although the SeamFRAME framework automates many of the above-mentioned translation tasks, there are some limits to the system, and we had to restructure part of the ontologies to fit the constraints that SeamFRAME imposed on them. Moreover, we aimed to gather process data and present them in a uniform and standardised way to the scientific layer that handles further processing in the fouling model and the control model. SeamFRAME turned out to be a suitable solution to gather and present explicitly modelled data and processes. The result is a database and a model connection interface.
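The request-driven exchange described above can be illustrated with the following minimal sketch (a simplified Python illustration of the pull pattern, not the actual OpenMI Java interfaces nor the SeamFRAME-generated classes; all names and the fouling relation are placeholders):

```python
# Simplified sketch of the request-driven (pull) pattern used by OpenMI-style
# linkage: a component answers a request for a quantity at a given time,
# pulling whatever it needs from the components it is linked to.

class Component:
    def __init__(self, name):
        self.name = name
        self.providers = {}          # quantity -> upstream component

    def link(self, quantity, provider):
        self.providers[quantity] = provider

    def get_values(self, quantity, time):
        raise NotImplementedError


class Database(Component):
    """Lowest level of the chain: returns stored measurements."""
    def __init__(self, name, records):
        super().__init__(name)
        self.records = records       # {(quantity, time): value}

    def get_values(self, quantity, time):
        return self.records[(quantity, time)]


class FoulingModel(Component):
    """Illustrative model layer: requests TMP from its provider on demand."""
    def get_values(self, quantity, time):
        if quantity == "fouling_resistance":
            tmp = self.providers["TMP"].get_values("TMP", time)
            return 1.0e11 * tmp      # placeholder relation, for illustration only
        raise KeyError(quantity)


db = Database("local_db", {("TMP", 0): 0.4, ("TMP", 1): 0.55})
model = FoulingModel("fouling")
model.link("TMP", db)
print(model.get_values("fouling_resistance", 1))   # request travels down and back up
```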
Case study 3: Freeze-drying of lactic acid bacteria
In this case study, models, architecture, methods and software were implemented. We were able to take existing software into account and obtained very good interoperability; Web services using standards and the local database model made this important result possible. The local database has the role of a hub for the transmission of all information required for plant control in real-time operation, so it represents the main communication means between the concurrent processes and software units that constitute the control application. The local database scheme provides one table per variable, with triggers that create a new table as soon as a new variable is created. This makes it possible to manage the evolution of the studied variables (e.g. one variable studied for a short duration, a new variable that will be measured over the long term, etc.). This solution avoids generating many null values in columns (as in the case of one table with one field per measured variable) or overly long tables (as in the case of one table with all measurements in rows rather than columns). This kind of architecture enables a dynamic design of the hardware and software, providing a competitive advantage through the optimal reallocation of computing capabilities. It also has a key role as an integration interface for many of the technologies already present in process rigs. The integration is implemented either by generic bindings from existing software to the tables where inputs, outputs and parameters are logged, or by a plug-in infrastructure provided in an open and portable way.
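The "one table per variable" scheme can be illustrated with the short sketch below (SQLite is used here for self-containment and the tables are created from application code rather than by database triggers; all names are illustrative):

```python
# Sketch of the "one table per variable" storage scheme used by the local
# database: a new table is created the first time a variable appears, and
# each measurement is a (timestamp, value) row in that variable's table.
import sqlite3
import time

conn = sqlite3.connect(":memory:")

def store(variable, value, ts=None):
    table = f"var_{variable}"
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {table} (ts REAL PRIMARY KEY, value REAL)"
    )
    conn.execute(f"INSERT INTO {table} (ts, value) VALUES (?, ?)",
                 (ts if ts is not None else time.time(), value))
    conn.commit()

def history(variable):
    return conn.execute(f"SELECT ts, value FROM var_{variable} ORDER BY ts").fetchall()

store("shelf_temperature", -20.5, ts=0.0)
store("shelf_temperature", -18.2, ts=60.0)
store("chamber_pressure", 10.0, ts=0.0)      # a new variable -> a new table
print(history("shelf_temperature"))
```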
Case study 4: Ice cream crystallization
This case study shows how flexible our approach is. We started from scratch on the ice cream process. After 20 person-days of work we obtained a Scientific Information System with the main functionalities: management of experimental conditions with a standardized vocabulary; on-line and off-line data acquisition, with data from the current or previous experiments plotted dynamically in a Web browser; and annotation of data, experiments and protocol descriptions. A linked-data environment is available.
The CAFE Information System currently runs on the ice cream crystallization process. It was used for all the demonstrations during the "CAFE DEMO day" in Antony.
Traceability
In the field of food processing, improving the performance and reliability of information processing dedicated to traceability is a major issue. Traceability allows any product to be tracked from its origin to the final reception point. This requires recording every movement of product and every step within the production process. With traceability, it is possible to identify the precise date, time and location of products so that products can be recalled. We proposed a methodology to store traceability information using Semantic Web annotations and a reasoner. The first part of this work started from the generic process ontology. This ontology describes the different aspects of a process involving products and sub-processes (or unit processes). It required the definition of a precise shared vocabulary with the different teams of the CAFE project. These knowledge models describe processes as successive sub-processes involving products and operations. This representation allows the description of the case studies and the traceability of products in a simple way. Products and sub-processes are linked by properties, and an experiment can be described as an instance of a process. Any product can be the input of a specific sub-process while the same product constitutes the output of another sub-process. We also defined the 'adding product' property. In our ontology, the relationship between products and unit processes is described and specialized. This method allows the product elaboration to be traced semantically, and a software reasoner infers the new information needed for traceability purposes. We used a reasoning engine to infer an RDF graph. Beyond the traceability needed in the CAFE project, we used semantic graphs to link supplier, producer, retailer and carrier. Graphs of products or graphs of operations can be generated, allowing queries that retrieve the history of a product and queries that estimate the impact of a product. This work was done with BivTrace and INRA. We developed a software prototype of a smart tool whose originality is: (i) to use reasoning to help experts find the origin of a problem; (ii) to determine (by inference) all the actors impacted by the problem; (iii) to provide friendly access to semantic information. We adapted the process ontology to the traceability case, in close collaboration with traceability experts. We added new concepts such as "Lot", "Actor" or "Transaction" and semantic relations between these concepts. "IsTransformedIn", "IsMixedTo", "IsCutUp" and "IsComposedOf" are defined as sub-properties of "IsImplicatedIn". We designed a distributed application architecture based on a Web Service. A key point of this application is the interoperability between the semantic graph part and the database part; the semantic graph and the database model are based on common key concepts.
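A minimal sketch of this sub-property reasoning, using rdflib and a SPARQL property path over rdfs:subPropertyOf (the namespace, lot names and the tiny product graph are illustrative, not the actual CAFE data), could look as follows:

```python
# Sketch: traceability by inference over sub-properties of "IsImplicatedIn".
# Any specialized relation (IsTransformedIn, IsMixedTo, ...) counts as an
# implication link when querying the history of a lot.
from rdflib import Graph, Namespace, RDFS

EX = Namespace("http://example.org/traceability#")   # illustrative namespace
g = Graph()
g.bind("ex", EX)

# Property hierarchy, as described in the adapted ontology
for p in ("IsTransformedIn", "IsMixedTo", "IsCutUp", "IsComposedOf"):
    g.add((EX[p], RDFS.subPropertyOf, EX.IsImplicatedIn))

# A tiny illustrative product graph: milk lot -> curd lot -> cheese lot
g.add((EX.MilkLot42, EX.IsTransformedIn, EX.CurdLot7))
g.add((EX.StarterLot3, EX.IsMixedTo, EX.CurdLot7))
g.add((EX.CurdLot7, EX.IsTransformedIn, EX.CheeseLot1))

# Lots implicated, through any specialized relation, in CheeseLot1 (one step back)
q = """
PREFIX ex: <http://example.org/traceability#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?lot WHERE {
  ?lot ?rel ex:CheeseLot1 .
  ?rel rdfs:subPropertyOf* ex:IsImplicatedIn .
}
"""
for row in g.query(q):
    print(row.lot)
```

Recursing on such queries yields the full history graph of a lot, which is the kind of inference the prototype uses to find the origin of a problem and the impacted actors.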
WP3: Process experiments
Case study 1: Wine-making
Wine quality is difficult to estimate because more than 100 different compounds can contribute to wine flavour. A number of “quality marker molecules” have been identified among these compounds: (i) varietal aromas, volatile compounds linked to non-volatile precursors in the grape that are released by the yeast during fermentation, (ii) fermentative aromas, generated by yeast secondary metabolism. The varietal aroma compounds are present at very low concentrations, and cannot therefore be determined on-line. Efforts to develop on-line monitoring of quality and metabolic markers therefore currently focus on fermentative aroma compounds.
Task 3.1: Product and process characterization using on-line and off-line measurements
The main objective of the case study "Bioconversion" in Task 3.1 was the on-line measurement of key 'marker molecules' with a high acquisition frequency during the wine-making fermentation. To measure the concentrations of these volatile molecules, it was decided to use an on-line gas chromatography (GC) system. The acquired data provided the synthesis kinetics of some key volatile markers. It was possible to know when each metabolite was produced and thus to establish a chronology of the metabolic events taking place during the wine-making fermentation. Moreover, the on-line data were compared with standard off-line data, such as the CO2 production kinetics and the concentrations of the main products. Correlations between some volatile markers measured on-line and some off-line parameters were established. Thanks to the high acquisition frequency of this device, kinetic parameters of major interest for modelling yeast metabolism were calculated.
Task 3.2: Impact of the process variables, setup of a database for process modelling and identification of the parameters critical for the control
It was decided to focus on the study of the gas-liquid transfer of the volatile compounds. Indeed, even if the concentration of volatiles at the end of fermentation depends primarily on their synthesis by the yeasts, it may also be significantly modified by losses into the exhaust CO2. Measuring these data on-line allows us to calculate balances differentiating the microbiological process of production from the physicochemical process of transfer into the exhaust CO2. This permits a better understanding of the production of the fermentative aromas and the development of optimized strategies for fermentation control. Indeed, from a microbiological point of view, the total produced amount must be considered, whereas, from a technological point of view, the concentration remaining in the wine is the key issue. The study of the gas-liquid transfer focused on three compounds representative of the diversity of the fermentative aromas: a higher alcohol (isobutanol), an acetate ester (isoamyl acetate) and an ethyl ester (ethyl hexanoate). During the alcoholic fermentation, the gas-liquid ratio of the volatile compounds can be affected by different parameters: the temperature, the liquid phase composition (matrix effect) and the CO2 release (stripping effect). The latter two parameters vary throughout the fermentation.
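The production/stripping balance mentioned above can be sketched as follows (illustrative numbers and a simple trapezoidal integration of the gas-phase losses; not the actual CAFÉ data treatment):

```python
# Sketch: splitting the total amount of a volatile compound produced during
# fermentation into what remains in the liquid and what is stripped by the
# CO2 flow, by integrating the gas-phase losses over time.
import numpy as np

t      = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # h (illustrative)
q_co2  = np.array([0.0,  0.5,  1.2,  0.8,  0.2])   # L of gas per h
c_gas  = np.array([0.0,  2.0,  5.0,  4.0,  1.0])   # mg of compound per L of gas
V_liq  = 1.0                                        # L of must
c_liq_end = 35.0                                    # mg/L measured at the end

f = q_co2 * c_gas                                   # mg of compound leaving per hour
loss_gas = float(np.sum((f[1:] + f[:-1]) / 2.0 * np.diff(t)))   # trapezoidal rule

in_liquid = c_liq_end * V_liq          # mg remaining in the wine
total_produced = in_liquid + loss_gas  # microbiological production

print(f"stripped: {loss_gas:.1f} mg, in liquid: {in_liquid:.1f} mg, "
      f"total produced: {total_produced:.1f} mg")
```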
Task 3.3: Test of control strategies elaborated in WP6 and WP7.
The test of the control strategies developed in WP7 was performed on the continuous multi-stage reactor (MSCF), which was shown to be representative of the traditional batch fermentation plant. The control objective has been expressed in terms of time minimization: the goal was to make the system go as fast as possible from one set-point to another. The overall motivation for controlling the MSCF is to make this fermentation setup more reliable and reproducible. The selection of the desired sugar concentrations is made easier by the use of a model associated with a control system. The linearizing feedback control law has been tested on the experimental setup.
Case study 2: Microfiltration of food beverages
For beer filtration, the common methods are depth filtration and surface filtration (single- or double-pass).
Depth filtration removes particles from beer within the depth structure of the filter medium itself. The particles are either mechanically trapped in the pores or adsorbed on the surface of the internal pores of the filtration medium. The filter media can be pre-made sheet filters or a fine powder made of, for example, diatomaceous earth (DE), also known as kieselguhr, which is introduced into the beer and re-circulated past screens to form a filtration bed. Surface filtration can be either absolute or nominal with a minimal depth capacity. It relies on a thin membrane, possibly covered with polypropylene or polyethersulfone, in which particles are trapped in the pores of the filter medium. Prior filtration with a depth filter is usually required to prevent clogging of the surface of a cartridge membrane filter. Surface filtration using membranes is a novel beer filtration method; cleaning of the membrane filters is therefore still done using a rule of thumb, based on a simple maximum-pressure rule determined experimentally. Finding an optimal cleaning strategy is the next step: minimizing the TCO (Total Cost of Ownership) of a filtration unit by reducing chemicals, energy and water consumption and lengthening membrane life.
Task 3.1: Product and process characterization using on-line and off-line measurements, improvements in the description and understanding of the process
The fouling of the microfiltration membrane is studied by varying specific process and beer parameters. Based on data available from the literature, three beer components were determined to be the most relevant factors causing fouling on the membrane surface and in the membrane pores: macromolecules, aggregates and yeast cells. Macromolecules, such as proteins and carbohydrates, are no larger than 0.2 μm; these particles are expected to be adsorbed in the membrane (mainly beta-glucans). Aggregates (colloids) are the so-called haze particles, with a size of around 0.5 μm; besides adsorption in the membrane and in selected pores, they are expected to be captured in the cake layer. The largest particles are the yeast cells and (if added) filter aids, with a size of about 5 μm; these particles are larger than the membrane pores and are expected to form the cake layer. First, the "mBMF" was built to run in a cross-flow configuration, as is also done at industrial scale: beer is circulated in a loop through the membrane hollow fibres. The process parameters that can be varied are the cross-flow velocity over the membrane surface and the permeate flux through the membrane pores. Since most of the data available in the literature to describe filter fouling in modelling systems are based on dead-end studies, the decision was made to adapt the mBMF so that it can also run in a dead-end configuration. This means that the beer is pressed into the hollow fibre membrane, which is blocked at the outlet; the beer is pressed through the membrane instead of partly circulating back into the tank. To meet the requirements of the brewers and to work with the most realistic beer composition, a third process program was also written: in a serial filtration, beer is filtered in a cross-flow configuration on one membrane module and the "fresh filtered" permeate is then used as the feed for a second filtration step.
Task 3.2: Impact of the process variables, setup of a database for process modelling and identification of the parameters critical for the control
The effect of macromolecules on fouling was studied with unfiltered beer in a serial configuration. In the first filtration step the yeast cells and macromolecules should have been filtered out; the second filtration step should then separate the remaining macromolecules. Several filtration runs in this configuration were performed, but they suffered from strong foam formation within the membranes and the tubing between them. Aggregates are temperature sensitive, so temperature was used as a parameter to influence the concentration of aggregates present in the feed beer. The higher the temperature during centrifugation, the more aggregates are expected to remain in the beer, since at higher temperatures most of the aggregates are dissolved and therefore not removed by centrifugation.
Task 3.3: Test of control strategies elaborated in WP6 and WP7.
Advanced (optimal) control relies on a physical dynamic BMF model. The general goal is to produce enough cleaned beer over time while removing fouling against minimum costs. The average transmembrane pressure (TMPavg) is measured and used as an indicator for membrane fouling. It is not allowed to exceed an upper bound to prevent membrane degradation. The following optimal control configurations were studied: constant values for retentate and permeate flux that are identical for all filtration periods (CC0), constant values for retentate and permeate flux during each filtration period but different from one filtration period to another (VC0) and variable values for retentate and permeate flux during each filtration period that are also different for each filtration period (VC1).
Case study 3: Freeze-drying of lactic acid bacteria
This case study is mainly investigated on a model strain of lactic acid bacteria: Lactobacillus bulgaricus CFL1, a strain very sensitive to the freeze-drying process. Two formulations of protective molecules were selected according to the following criteria: different physical behaviour during the freeze-drying process and different ability to protect bacteria during the process. The freeze-drying process involves three successive steps: freezing of the aqueous solution, followed by primary drying to remove ice by sublimation and finally secondary drying to remove unfrozen or sorbed water by desorption. Improving the understanding of the freeze-drying of bacteria requires the determination of the drying kinetics (sublimation and desorption kinetics) and the evaluation of quality degradation at various times during the process. Both aspects were investigated for the protective medium C200 and both freezing conditions (compact and pellet layers). Characterization of the formulation is needed for process optimization and for defining the upper product temperature limit during the primary and secondary steps of the freeze-drying process. During primary drying, if the product temperature is higher than the collapse temperature, the amorphous material will undergo viscous flow, resulting in loss of the pore structure obtained by freezing, which is defined as the collapse phenomenon. Collapsed dried products generally have high residual water content and lengthy reconstitution times and may also present a loss of functional properties.
Task 3.1: Product and process characterization using on-line and off-line measurements, improvements in the description and understanding of the process
This task was dedicated to the physical and biological characterization of the selected lactic acid bacteria strain with respect to the freeze-drying process. Two protective media and two freezing procedures were investigated. The freeze-drying process resulted in a degradation of the acidification activity of the lactic acid bacteria whatever the freezing and drying conditions applied. The loss of acidification activity is more important for the pellet layer than for the compact layer, even though the loss of acidification activity is lower after the freezing step; the pellet configuration induces more degradation during both drying steps than the compact layer configuration. When considering the different steps of the process, the most important loss of acidification activity is caused by the sublimation step, whatever the freezing and drying conditions. The removal of ice by sublimation cannot itself cause bacteria degradation, since ice and bacteria are phase separated during the freezing step; it is thus the removal of the unfrozen water that causes most of the bacteria degradation. In the pellet configuration, most of the unfrozen water is removed during the primary drying step (sublimation). When comparing both drying conditions for the same freezing method (compact layer), the aggressive condition results in a lower loss during the sublimation step and a higher loss during the desorption step than the conservative condition. A higher sublimation rate thus seems to limit bacteria degradation during the primary drying step; nevertheless, the higher shelf temperature applied during secondary drying seems to have a negative impact on bacteria quality. Freeze-drying of pellets of bacterial suspension 1-2 mm in diameter has several potential advantages over compact layer freeze-drying in trays, such as a shorter desorption step and easier handling of the dried product. One of the main objectives of Partner 5 (Telstar) was to develop a freeze-dryer prototype for generating frozen droplets of controlled size from a bacterial suspension and to freeze-dry them. A prototype of a spray freezer was developed, in which the sprayed droplets are frozen by a cold gas stream circulating counter-currently.
Task 3.2: Impact of the process variables, setup of a database for process modelling and identification of the parameters critical for the control
A systematic study of the effect of the process variables on the drying kinetics and on the degradation of the biological activity of the bacteria was carried out. The chamber pressure applied during the desorption step had an impact on the water activity of the freeze-dried product. The higher the chamber pressure, the higher the water activity. The water activity reached at the end of the process had an important impact on the biological activity recovery of the bacteria and on the storage stability.
The impact of the process conditions applied during the sublimation step on the sublimation time was also significant. As expected, an increase of shelf temperature resulted in an important decrease of the sublimation time. The main result observed is that the sublimation rate impacts the degradation of the biological activity of the bacteria during the process. The higher the sublimation rate, the lower the degradation. This tendency is confirmed by the storage stability.
When considering the formulation without bacteria, the collapse temperatures (Tcoll) and glass transition temperatures of aqueous solutions (Tg’) are close. The glass transition and the collapse of the product structure take place in the same temperature range. The collapse temperature determination was quite difficult for bacterial suspensions as the result of a less distinct structure pattern obtained by freezing compared to aqueous solutions. In addition, the viscous flow took place gradually from the beginning of local loss of structure to the complete loss of structure. A strong influence of the cells was highlighted compared to the “protective” medium effect, tending to increase the collapse temperature of the complex material. As a conclusion, it can be pointed out that the presence of lactic acid bacterial cells conferred a significant “robustness” to the freeze-dried product, thus allowing the use of higher sublimation temperatures during primary drying than expected from the protective medium. The effect of the presence of lactic acid bacteria on the behaviour of the product during freeze-drying seems to be partly related to the cell structure.
Task 3.3: Test of control strategies elaborated in WP6 and WP7.
In the freeze-drying case study, the optimal control problem consisted in reducing the duration of the drying cycle while satisfying final and path product quality constraints. The constraints considered were the internal process dynamics, as given by the model developed in WP4, the final moisture content of the product imposed by the product stability requirement, the collapse temperature of the product ensuring limited biological quality degradation and mechanical integrity of the product, as well as the equipment capabilities. Optimisation algorithms were implemented and experimentally tested on-line. The real-time optimization algorithms demonstrated their ability to update the control profiles reliably in the face of various disturbances: changes in the initial process state (temperature, pressure, amount of product), temporary lack of measurements (feedback) and temporary difficulties in approaching the prescribed set-points.
Case study 4: Ice cream crystallization
The ice cream and sorbet manufacturing process is composed of different steps: the first step is the mixing of ingredients, followed by preheating at about 60°C in order to perform homogenisation. Depending on the kind of product, pasteurisation (80-85°C for a few seconds) is carried out. The mix is then stored for ripening at 4°C for 12-24 hours. Next, the mix undergoes a stage of pre-freezing and foaming, and ice crystallization takes place inside a Scraped Surface Heat Exchanger (SSHE, often called a "freezer"), thanks to the refrigerant fluid vaporizing at the wall. This step is the most critical of the process and is responsible for the final quality of the product. Ice cream and sorbet quality is mainly governed by sensory properties related to the ice content, the ice crystal size distribution and the apparent viscosity, which depend on how the crystallization occurs in the freezer. Throughout the crystallization process, ice cream and sorbet undergo very significant changes in their transport and thermal properties. This generates significant changes in the velocity profiles, which, in turn, considerably modify the temperature profiles and pressure drops inside the process equipment. In order to control the final quality and the technological properties of ice cream and sorbet, it is necessary to control the influence of temperature and shear rate, as well as their coupled effect, on the product quality.
Task 3.1: Product and process characterization using on-line and off-line measurements, improvements in the description and understanding of the process
The pilot scale Scraped Surface Heat Exchanger (SSHE) has been installed at CEMAGREF and is fully functional. The pilot plant was equipped with a variety of sensors for monitoring the process and the product quality, but also for the refrigerating system: temperature, pressure, power and water consumption sensors on the refrigerating system; on-line quality sensors: draw temperature, ice crystal size distribution by Focused beam reflectance method (FBRM), ice crystal size distribution by EZ on-line imaging probe, on-line viscometer MIVI, capillary viscometer, etc.; numerical control of the freezer and data acquisition in LabView® was implemented. Some of these sensors are quite innovative, such as the two optical sensors for on-line measurement of ice crystal size distribution. A large number of equipment qualification and sensor validation experiments were performed.
Task 3.2: Impact of the process variables, setup of a database for process modelling and identification of the parameters critical for the control
The mechanism of ice crystallization within a freezer is affected mainly by the operating conditions of the freezing process, such as the evaporation temperature of the refrigerant fluid, the dasher rotational speed and the mix flow rate. The temperature of the refrigerant fluid provides the driving force that triggers ice nucleation and it determines the heat removal rate of the system. The scraping action of the dasher improves the heat transfer rate between the freezer wall and the product. The mix flow rate dictates the residence time of the product within the freezer, affecting the time available to remove heat from the product and, consequently, the nucleation and growth mechanisms of ice crystals. It is therefore important to identify the operating conditions of the freezing process that most directly affect ice crystal size so as to improve the quality of the final product. Our results showed that the use of the FBRM sensor makes it possible to monitor on-line the development of the ice crystals in sorbets containing up to 40% of ice. The mean ice crystal chord length was mainly affected by the evaporation temperature and only slightly by the dasher speed. Decreasing the refrigerant fluid temperature reduces the ice crystal size, due to the increase of the supercooling driving force that leads to further ice nucleation. High dasher speeds slightly decreased the mean ice crystal chord length, due to the production of new, smaller ice nuclei by secondary nucleation, induced either by the smaller ice flocs remaining from previous scrapings or by the ice debris produced during the attrition of the larger ice crystals. The draw temperature of the sorbet was most significantly affected by the mix flow rate, followed by the refrigerant fluid temperature and the dasher speed. Low mix flow rates (long residence times) result in lower draw temperatures, because the product remains longer in contact with the freezer wall, so more heat is extracted from the product. Low evaporation temperatures lead to lower draw temperatures. High dasher speeds very slightly warm the product, due to the dissipation of frictional energy into the product, an effect partly moderated by the improvement of the heat transfer coefficient between the product and the freezer wall. We observed that an increase of the mix flow rate reduces the axial dispersion in the SSHE. We also observed that lower evaporation temperatures lead to the presence of a dead volume, caused by the increase of the apparent viscosity of the product near the heat-exchange cylinder wall, which delays the exit of part of the product near the freezer wall. The dasher rotational speed showed no significant influence on the RTD curves.
Task 3.3: Test of control strategies elaborated in WP6 and WP7.
The control strategy for the crystallization case study was first expressed as a problem of minimization of the energy consumption with constraints on both the viscosity and the mean crystal size. The energy consumption has been numerically evaluated for different values of the evaporation temperature Te, the mass flow rate mfr and the scraper rotation speed Nscrap. We have observed that, within the ranges of admissible values of these control inputs, the energy consumption is a monotonic function: it decreases with respect to mfr and increases with respect to Nscrap and Te. As a consequence, the energy consumption is minimal for the lowest evaporation temperature Te, the lowest dasher rotation speed Nscrap and the highest mix flow rate mfr; the optimal values of the control inputs are therefore determined by the constraints and no optimal control strategy is needed to solve this problem. The problem has therefore been reformulated: whereas the mean crystal size can really be considered as a constraint, the viscosity has to be controlled, depending on the desired type of ice cream. In the new formulation of the problem, the issue is therefore to control the viscosity of the ice cream at the outlet of the freezer. For that, we use the evaporation temperature Te as the control input. It must be pointed out that the inlet mass flow rate could also be used but, as it is directly related to the productivity of the process, it is usually kept constant in industry. Under the physical assumption that the outlet temperature (after the pipe) is equal to the saturation temperature, the outlet viscosity of the ice cream only depends on the third moment M3 of the crystallization model developed in WP4. Thus, to control the viscosity, we just have to control M3 or Tsat, which is a function of M3. For the experiments, a value of Tsat was chosen as the set-point. A cascade control strategy was developed to control the saturation temperature of the ice cream. It is composed of two loops: a primary loop that controls Tsat using Te, and a secondary loop that controls Te with Vcomp (the compressor rotation speed). The control law has been validated on the experimental set-up and gives satisfactory results.
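A minimal sketch of such a cascade is given below (two discrete PI loops with purely illustrative gains and limits; this shows the structure of the strategy, not the controller actually implemented on the pilot plant):

```python
# Sketch of the cascade: a primary PI loop computes the evaporation
# temperature set-point Te_sp from the Tsat error, and a secondary PI loop
# computes the compressor speed Vcomp from the Te error.

class PI:
    def __init__(self, kp, ki, u_min, u_max):
        self.kp, self.ki, self.u_min, self.u_max = kp, ki, u_min, u_max
        self.integral = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        u = self.kp * error + self.ki * self.integral
        return min(max(u, self.u_min), self.u_max)   # actuator saturation

# Illustrative gains and ranges (hypothetical values)
primary = PI(kp=2.0, ki=0.05, u_min=-30.0, u_max=-5.0)      # output: Te set-point [degC]
secondary = PI(kp=50.0, ki=5.0, u_min=300.0, u_max=1500.0)  # output: Vcomp [rpm]

def cascade_step(Tsat_meas, Te_meas, Tsat_sp, dt):
    """One control period: outer loop gives Te_sp, inner loop gives Vcomp."""
    Te_sp = primary.step(Tsat_sp - Tsat_meas, dt)
    Vcomp = secondary.step(Te_meas - Te_sp, dt)   # raise speed when Te is above its set-point
    return Te_sp, Vcomp

print(cascade_step(Tsat_meas=-2.5, Te_meas=-12.0, Tsat_sp=-4.0, dt=1.0))
```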
WP4: Model building, process dynamics and model reduction
WP4.1 Existing model review and adaptation
As far as possible, the models were developed from 'first principles'. We succeeded in this for three case studies, namely the freeze-drying, ice-crystallization and beer microfiltration cases. The problems addressed in these three cases are dominated by physical phenomena, in contrast to the wine fermentation case, where biochemistry dominates. For physical phenomena, 'first-principles' models are often available, while for biochemistry the compounds involved in the reactions and their kinetics are often a priori unknown; a first-principles approach via metabolic-network modelling has been investigated. Deliverable D4.1 also reports on the model adaptations required for the specific problems of the case studies, which involved, for example, developing constitutive laws for material properties and a first estimation of model parameters. All three first-principles models developed for the cases involving processes of a physical nature went beyond the state-of-the-art models known in the literature at the start of the project.
WP4.2 Model reduction
Model reduction techniques have been applied to these models; the reductions applied to the first-principles models of the three case studies are reported in D4.2 and several scientific papers. Scale analysis was applied as a first step in the model reduction of the freeze-drying and beer microfiltration cases. With this technique one can simplify the model while still retaining its mathematical-physical description (in terms of algebraic and (partial) differential equations). In the ice-cream case the population balance is reduced in complexity using the moment method; here too, the mathematical-physical description of the problem can be retained. Only in the freeze-drying case was it necessary to reduce the model further in complexity to attain sufficient computational speed for use in real-time model-based control. Here, the technique of Proper Orthogonal Decomposition has been applied. From a modelling perspective this technique has the disadvantage that the mathematical-physical description is lost, but at the gain of a very significant computational speed-up. The top-down approach applied to the wine case more or less automatically leads to a model of reduced complexity; hence, the above-mentioned model reduction techniques need not be applied to this type of process (of a chemical nature).
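The Proper Orthogonal Decomposition step can be sketched generically as follows (snapshot SVD on synthetic data; this illustrates the technique, not the actual freeze-drying model reduction):

```python
# Sketch of POD: collect state snapshots from full-model simulations, take an
# SVD, keep the modes that capture most of the energy, and project the state
# onto this low-dimensional basis.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshot matrix: each column is the spatial state (e.g. a
# temperature/moisture profile) at one time instant of a full simulation.
n_space, n_snapshots = 200, 50
x = np.linspace(0.0, 1.0, n_space)
snapshots = np.column_stack([
    np.exp(-5.0 * x * (k + 1) / n_snapshots) + 0.01 * rng.standard_normal(n_space)
    for k in range(n_snapshots)
])

# POD basis from the (thin) SVD of the snapshot matrix
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999) + 1)   # modes capturing 99.9% of the energy
basis = U[:, :r]

# Reduced coordinates of one snapshot and its reconstruction error
a = basis.T @ snapshots[:, 10]
reconstruction = basis @ a
print(r, np.linalg.norm(snapshots[:, 10] - reconstruction))
```

In practice the reduced dynamics are then obtained by projecting the governing equations onto this basis, which is where the explicit mathematical-physical description is traded for speed.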
WP4.3 Model identification and validation
For the ice cream and freeze-drying case studies, model parameters have been identified using the method of optimal experimental design, where the reduced model was used to compute an optimal excitation of the experimental system. Furthermore, the reduced model has been used for sensitivity analysis in all cases, which is a prerequisite for optimal experimental design. Details are reported in the deliverables of WP3 and WP6.
WP4.4 Development of simulation software
For each case study, simulation software was developed for the reduced model and applied in model-based process control and optimization. (Reduced) model descriptions are included in deliverables D4.1 and D4.2. Specifics on simulation software for demonstration purposes are described in the deliverables of WP9; a demonstration was developed only for the ice-crystallization case study.
WP5: Sensor development and validation
Microfiltration of food beverages
For case study 2, gas sensors provide a method to follow the evolution of the beer during the filtration and to detect the point when the beer reaches a stable state, indicating that no more solids are removed from the liquid and hence that the membrane is saturated. Gas sensors are sensitive to the composition of the headspace, so for a reliable analysis the product must be kept in conditions that allow a representative and measurable headspace to develop. The beer inside the filtration unit is kept at low temperature (T<5°C) and high pressure; both conditions hinder the formation of a representative volatile fraction, so the gas sensor was operated off-line through periodic sampling of beer at the output of the filtration unit. Figure 1 shows the typical behaviour of the transmembrane pressure versus time and the onset of the first cleaning procedure (backwash); the dots indicate approximately the sampling points. Samples taken at specified times were measured with the gas sensors, whose features are described in deliverable 5.2. Figure 2 shows the first principal component (PC1) calculated from the signals of the eight gas sensors plotted versus the sampling time. Its behaviour is representative of the ongoing filtration process and shows the occurrence of membrane saturation when PC1, and hence the sensor signals, saturate. CTech explored in the project the possibility of monitoring the progress of beer filtration using impedance spectroscopy. For this purpose, an interdigitated pair of electrodes is placed in the liquid and the impedance versus frequency is measured with a network analyser. The network analyser actually measures the complex reflection coefficient directly; in this case the magnitude of the reflection coefficient has been found to be meaningful to describe the properties of the beer samples. The spectral magnitude of the reflection coefficient provides a fingerprint that can be related to the composition of the beer; Figure 3 shows an example of such a fingerprint. The reflection coefficient profile method proved very capable of detecting small perturbations from the initial beer constitution and of distinguishing between changes in beer constitution, beer temperature and beer aeration.
Figure 1: qualitative behaviour of the transmembrane pressure (TMP) versus time.
Figure 2: behaviour of the first principal component plotted versus the filtration time.
Figure 3: spectral fingerprint of a beer with high content of particulate.
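The PC1 trace of Figure 2 can be reproduced conceptually with the following sketch (plain NumPy PCA on synthetic eight-sensor data, not the actual CAFÉ measurements):

```python
# Sketch: compute the first principal component (PC1) of an eight-sensor
# gas-sensor array over the sampling times, as used to follow membrane
# saturation during filtration.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: rows are sampling times, columns the 8 sensor signals.
# A common saturating trend plus sensor-specific noise, for illustration.
t = np.linspace(0.0, 8.0, 30)                        # sampling times [h]
trend = 1.0 - np.exp(-t / 2.0)
signals = np.outer(trend, rng.uniform(0.5, 1.5, 8)) + 0.02 * rng.standard_normal((30, 8))

# PCA via SVD of the mean-centred data matrix
centred = signals - signals.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
pc1_scores = centred @ Vt[0]                         # projection on the first component

explained = s[0]**2 / np.sum(s**2)
print(f"PC1 explains {explained:.1%} of the variance")
# pc1_scores plotted versus t gives a curve that flattens as the membrane saturates
```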
Freeze-drying of lactic acid bacteria
The major results have been obtained with ultrasound sensors and an electronic nose.
Ultrasound sensors have been developed by Alctra. They are based on an arrangement of piezocomposite emitter and receiver transducers allowing simultaneous transmission and pulse/echo measurement modes. Figure 1 shows the schematic set-up of the sensor system, whose actual appearance is displayed in Figure 2. The transmission mode is provided by a pair of emission and receiver transducers, while the echo mode is provided by a single emission/receiver unit. In both cases the propagation time of an ultrasound pulse is measured; ultrasound pulses at 1.25 MHz have been used. The use of the two measurement modes gives an unprecedented ability to monitor the freeze-drying process and the vertical stratification occurring in the bacterial mass. This is visible in Figure 3, where both signals are shown versus time together with the sample temperature. The propagation time recorded for the transmission mode (wave travelling parallel to the sample) shows an abrupt change at a temperature of -10.6°C. On the other hand, the echo-mode propagation time undergoes a smoother transition, indicating the stratification process, which proceeds until the temperature reaches -14.8°C. This arrangement thus provides a thorough characterization of the freeze-drying process.
Figure 1. schematic arrangement of Ultrasound sensors for freeze-drying monitoring
Figure 2. Picture of the sensor cell with the emission/receiver transducers.
Figure 3. transmission (E/R) and echo mode signals and temperature during one freeze-drying process.
The other developed sensor is the array of gas sensors (electronic nose) developed by UTOV. In this case, the gas sensor array has been used to estimate the quality of the finished product. The quality of freeze-dried bacteria is evaluated in a series of destructive tests aimed at measuring the residual moisture, the cell viability and the acidification activity. This last quantity is estimated with a method called CINAC, in which constant conditions are kept and the time necessary to reach the peak of the derivative of pH is measured: the shorter this time, the higher the acidification activity. The electronic nose was applied to measure, in a non-destructive way, the headspace of the dried bacteria. Figure 4 shows the experimental arrangement; measurements were done at room temperature. The results suggest that an electronic nose like the one developed in the CAFÉ project can be fruitfully used for a non-destructive inspection of the quality of freeze-dried bacteria.
Figure 4: Experimental setup for the freeze-dried bacteria quality test. Bacteria are kept under the sampler at the right. The electronic unit in the background captures the headspace and delivers it to the internal sensor array.
Ice cream crystallization
The sensors developed for the ice cream case study are a refractometer for sugar quantification and a gas sensor array (electronic nose) for the evaluation of global properties of the ice cream.
The refractometer developed by ALCTRA is a total internal reflection arrangement in which the change of the refractive index of the measured sample modulates the amount of light internally reflected in a sapphire glass. Fig. 6 shows the principle of the device. The probe is a hemispherical lens that directs the light emitted by an IR LED to a phototransistor; the light is reflected at the interface between the detector and the sample. Any change in the refractive index of the sample is then detected as a change in the amount of back-reflected light. Infrared light is used because the refractive index in the infrared is known to be particularly sensitive to the content of sugar molecules; thus the amount of light lost at the sapphire/sample interface is also sensitive to the sugar concentration. In the adopted arrangement, the light is refracted twice, increasing the sensitivity to changes of refractive index. Fig. 7 shows the device placed in line at the output of the ice-cream machine, at the pilot plant at IRSTEA. The device is complemented by a temperature sensor needed to compensate for the changes in refractive index due to temperature.
Figure 6. Drawing of the measurement principle of the ALCTRA refractometer
Figure 7. Measurement cell implementing the refractometer and a temperature sensor.
Figure 8. Sugar content (in °Bx) and temperature behaviour during the sorbet processing.
Figure 8 shows the behaviour of the temperature and the estimated sugar content (given in degrees Brix). Since the concentration of sugar also depends on the segregation of water into ice crystals, there is a correlation between the sugar concentration and the ice mass fraction. In order to measure the properties of the ice cream, the gas sensor array was connected on-line at the output of the ice-cream machine.
WP6: Process design and optimization
The main results of WP6, summarized below, relate to the use of efficient optimization methods, on the one hand to produce a reliable representation of the plant (a model) consistent with the available data, and on the other hand to devise optimal modes of plant operation. This essentially leads to the following optimization paradigms:
• Optimization for understanding the process behaviour, which translates into the combination of measurements with process experiments to identify and calibrate mathematical models that are subsequently employed in process optimization.
• Optimization for process control, meaning either the off-line computation (design) of optimal operating conditions to be communicated to controllers, or real-time optimization during plant operation.
Microfiltration of food beverages as Separation process
For a given amount of beer to be filtered, operation policies were designed to minimize pumping energy together with the number and cost of membrane cleanings. Given this objective, off-line and on-line optimal control policies were computed and validated, resulting in cost reductions of about 12%. Optimal plant operation involves decisions at different levels; in particular, the following values have to be chosen: the number of chemical cleanings (CIPs), the number of back-flushes per CIP, the values of the cross and permeate flow set-points (QF, QP) and the maximum trans-membrane pressure (TMPmax) between back-flushes that allow the specified amount of beer to be processed by the required final time.
On-line model parameter estimation
Some experiments were performed in the beer filtration pilot plant in order to identify the fouling formation dynamics and the associated filtering properties. Adjustment of the reduced-order (operational) model is done via on-line (recursive) least-squares parameter estimation. The experimental data include permeate and cross flows, and outlet and transmembrane pressures. Parameters were selected by sensitivity analysis and comprised the initial membrane resistance Rk, the feed dynamic viscosity η, the fraction of membrane aggregates β, the critical distance parameter Qcr and the backflush cleaning efficiency cBF. Initial and final values of the parameters adjusted in the reduced model are given in Table 1.
           Rk        η                β      Qcr         cBF
Initial    1.16667   1.16667×10^-11   0.4    2.1×10^-7   0.5
After PE   0.13763   1.3×10^-11       0.49   2.65×10^-7  0.4
Table 1: Summary of the parameter estimation results. PE refers to Parameter Estimation.
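The on-line adjustment mentioned above is a recursive least-squares update; a generic sketch (linear-in-parameters form with toy regressors, not the actual reduced BMF model) is:

```python
# Sketch of recursive least squares (RLS), the kind of on-line update used to
# adapt model parameters from TMP measurements: theta is refreshed at every
# new sample without re-solving the whole regression.
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One RLS step: y is the new measurement, phi the regressor vector,
    lam a forgetting factor; returns updated (theta, P)."""
    phi = phi.reshape(-1, 1)
    denom = lam + (phi.T @ P @ phi).item()
    k = (P @ phi) / denom                               # gain vector
    err = y - (phi.T @ theta.reshape(-1, 1)).item()     # prediction error
    theta = theta + k.ravel() * err
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# Toy identification of y = 2.0*u1 + 0.5*u2 from noisy data
rng = np.random.default_rng(2)
theta = np.zeros(2)
P = 1e3 * np.eye(2)
for _ in range(200):
    phi = rng.uniform(0.0, 1.0, 2)
    y = 2.0 * phi[0] + 0.5 * phi[1] + 0.01 * rng.standard_normal()
    theta, P = rls_update(theta, P, phi, y)
print(theta)   # approximately [2.0, 0.5]
```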
Computation of optimal operation policies
The approach selected for the beer microfiltration merges economic optimization and control, and makes use of particular parameterizations to solve the problem with a small number of NLP problems in a single layer. This provides an efficient way of solving the problem and shows a way of dealing with mixed-integer dynamic optimization problems. The operational costs to be minimized include mechanical energy and the costs associated with backflushes and CIPs. Results are presented in deliverable 8.4 for two scenarios: one that involves constant flows over a whole CIP cycle and another with step-wise variable flows (constant for each single BF). In general it can be concluded that (global) stochastic algorithms performed much better than deterministic ones (i.e. local SQP-based methods). For constant flows, costs were reduced by about two thirds when using 6 instead of 5 filtration periods. As expected, better solutions were obtained when allowing variable flows. In both cases, the costs obtained were significantly smaller than those associated with standard operation involving 7 filtration periods and constant flows (QF = 10 l/h, QP = 0.26 l/h).
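The single-layer parameterization can be sketched as follows (a toy cost model and SciPy's differential evolution as the global stochastic solver; the real objective uses the reduced fouling model and the actual energy and cleaning costs):

```python
# Sketch: parameterize the operation as one (QF, QP) pair per filtration
# period and minimize a cost combining pumping energy and cleaning cost
# with a global (stochastic) optimizer.
import numpy as np
from scipy.optimize import differential_evolution

N_PERIODS = 6
BEER_TARGET = 50.0          # amount of beer to filter [arbitrary units]
PERIOD_TIME = 1.0

def cost(decision):
    qf = decision[:N_PERIODS]           # cross-flow per period
    qp = decision[N_PERIODS:]           # permeate flow per period
    produced = np.sum(qp) * PERIOD_TIME
    pumping = np.sum(0.2 * qf**2 + 0.5 * qp**2) * PERIOD_TIME   # toy energy model
    cleaning = 1.5 * N_PERIODS                                  # backflush cost
    penalty = 100.0 * max(0.0, BEER_TARGET - produced)          # meet the beer target
    return pumping + cleaning + penalty

bounds = [(2.0, 12.0)] * N_PERIODS + [(0.05, 10.0)] * N_PERIODS
result = differential_evolution(cost, bounds, seed=0, maxiter=200, tol=1e-6)
print(result.x.round(2), round(result.fun, 2))
```

Keeping all (QF, QP) pairs equal recovers the constant-flow scenario, while the vector above corresponds to the step-wise variable case; the number of periods and CIPs remains an outer, integer-valued decision.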
Case study 3: Freeze-drying of lactic acid bacteria as Preservation process
Freeze-drying operation makes use of shelf temperature and chamber pressure as the variables to control mass transfer in the product and thus the time needed to attain a given dehydration level. In minimizing this process time, the product temperature should not exceed the glass transition temperature too often, in order to avoid collapse of the product structure, which must be considered a quality objective. For this case study, and in order to have a reliable model representative of the process, model parameters were first identified. The model has been combined with optimal control methods to produce off-line as well as on-line operation policies that minimize process time while ensuring maximum quality.
Model parameter estimation
A detailed description of the model employed for computing optimal policies can be found in Deliverables 4.1 and 4.2. The objective is to compute the values of the model parameters that best fit the model simulations to the experimental data. The critical parameters included those related to mass and heat transfer resistance; in particular, the selected parameters were: two parameters (k1 and k2) included in the mass transfer resistance variable, two parameters (hL,1 and hL,2) included in the convective heat transfer variable, the dried region thermal conductivity kD, the geometrical correction factor at the product bottom fb and the mass transfer resistance between the condenser and the chamber kv. Six experiments were performed in which two states were measured and employed as observables: the temperature at the bottom of the product and the vapour pressure in the chamber. Two control variables were used to define the different experiments: shelf temperature and chamber pressure. The results of the parameter estimation are summarized in Table 2, comparing initial parameter values collected from the literature with those obtained from estimation. In order to illustrate the predictive capability of the model, results are compared with experimental data in Figure 9. Figure 9(a) shows the improvement after parameter calibration. Figure 9(b) compares model results with experiments other than those employed for model calibration, which suggests good model predictive capabilities.
           kD     k1     k2     hL,1    hL,2    kv     fb
Initial:   10.1   0.4    0.99
After PE:  1      0.1    0.92
Table 2: Estimated parameters as compared with previous ones obtained from literature. PE refers to Parameter Estimation.
Computation of optimal operation policies
Off-line as well as on-line operation policies have been computed and tested both in simulation and at the pilot plant level. Comparisons between standard and optimal operation have been established, resulting in reductions of the process time of about one fourth with respect to the standard operation, while ensuring quality. The dynamic models employed were physically and mathematically reduced versions of a multi-scale model for mass and energy transfer in the product.
Figure 9: Predictive capabilities of the freeze-drying model. (a) Effect of the parameter calibration on the predicted dynamics. (b) Validation of the model against experiments not used for calibration.
Typical constraints for the optimization problem in this case study are the final water content and the product temperature. The water content is directly related to product quality and must be below a given bound at the end of the process. The product temperature, on the other hand, should not exceed the collapse temperature, in order to ensure the product integrity. This last constraint can be relaxed to increase productivity by means of an integral constraint that limits the accumulated excursion of the product temperature above the collapse temperature. The maximum difference between the product and collapse temperatures has also been included as a constraint. The corresponding bounds have been obtained for each experiment carried out in the pilot plant. The differences between collapse and product temperature are lower than in the standard case. Control profiles (shelf temperature) as well as glass transition and product temperatures for the standard and optimal cycles are presented in Figure 10.
Figure 10: (Shelf, product and collapse) temperature profiles for (a) the standard and (b) the optimal operation policies for the second experiment.
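For illustration only, a typical form of the integral relaxation described above (a sketch of the idea, not necessarily the exact expression used in CAFÉ) is:

\[
\int_{0}^{t_f} \max\bigl(0,\; T_p(t) - T_{coll}(t)\bigr)\, dt \;\le\; \varepsilon ,
\]

where \(T_p\) is the product temperature, \(T_{coll}\) the collapse temperature, \(t_f\) the cycle duration and \(\varepsilon\) a small user-chosen tolerance; the strict path constraint \(T_p(t) \le T_{coll}(t)\) is recovered for \(\varepsilon = 0\).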
Case study 4: Ice cream crystallization as Structuring process
Crystallization is a continuous process with some of the operating conditions fixed as constraints by the user. The energy cost has been characterised experimentally as a function of these conditions, and it has been shown to be a monotonic function of the control variables. A dynamic optimal operation policy is therefore not relevant in this case; however, for the purpose of process design and scheduling, a crystallization model proved to be critical.
System identification and experimental design
Parameters were identified for a reduced version of a population-balance-based model. Based on a set of integro-differential equations describing the different moment orders, the model relates inputs such as the scraper speed, mass flow rate and evaporation temperature to outputs such as the temperature along the freezer, the crystal size and the viscosity (a detailed description of the model can be found in deliverables 4.1 and 4.2). Experiments were performed at IRSTEA using a factorial (D-optimal) design considering the above-mentioned control variables. Three different types of measurements were employed to compare model predictions with experimental data: the temperature of the mixture at 3 locations in the crystallizer, the mean crystal chord length at the outlet and the viscosity at the outlet (6 values). The parameters to be estimated were: the wall heat transfer coefficient (he), the growth and nucleation parameters (β and α), the viscous dissipation parameter (χ), the initial crystal size (Lc) and the sorbet viscosity parameter (ξ). Table 3 summarizes the parameter estimation results.
[Table 3 could not be fully recovered from the source. Its columns are the parameters he, α, β, χ, Lc and ξ; its rows give the upper bound, lower bound and estimated value for each parameter. The recoverable entries are upper bounds of 40 and 10, lower bounds of 0, and an estimated value of 3.85.]
Table 3: Results of the parameter estimation procedure for the crystallization case study
WP7 : Process monitoring and control
Wine-making as Bioconversion process
The basis for the control design was proposed within the first years of the project and has been reported in the related reports and deliverables. However, extensive simulations pointed out a number of drawbacks with respect to the influence of several uncertainties, notably concerning the available online measurements. The improvements described here concern the estimation of the different substrate concentrations needed to apply the control law to the process. Recall first that the process is composed of 4 interconnected chemostats in which the output of each reactor is the input of the next one. The adaptive linearizing control law acts on the dilution rates so as to force an input/output linear behaviour of each substrate concentration Si, i.e. a first-order convergence of Si towards its set-point at a rate fixed by a design parameter λi. The terms k2μ2 are measured on line (these are the biogas measurements). The set-points being also known, the only unknowns are the glucose concentrations Si. It was planned to estimate these unmeasured states using an approach based on the minimization of a sum of mean square errors; however, this approach suffers from a lack of robustness with respect to measurement errors. An approach based on interval observers is therefore expected to give better results. It takes advantage of the fact that the biogas is measured online in each reactor, so that the dynamics of the substrate in each reactor reduces to a decoupled first-order equation driven by the measured gas production rate. Since these are decoupled first-order equations, intervals for the unmeasured variables Si can easily be reconstructed by taking into account the uncertainties on both the available measurements and the estimates of Si that are used as inputs of the subsequent reactors.
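A sketch of the interval-observer idea for one reactor is given below. It assumes standard mass-balance notation (dilution rate Di, measured gas production rate qi = k2μ2, yield ratio k1/k2) and only illustrates the bounding principle; it is not the project's exact set of equations.

```latex
% Interval-observer sketch for one reactor (assumed notation, illustrative only).
% Superscripts +/- denote upper/lower bounds on the corresponding quantity;
% q_i = k_2 mu_2 is the measured biogas production rate.
\begin{aligned}
  \dot{S}_i     &= D_i\bigl(S_{i-1} - S_i\bigr) - \tfrac{k_1}{k_2}\, q_i , \\
  \dot{S}_i^{+} &= D_i\bigl(S_{i-1}^{+} - S_i^{+}\bigr) - \tfrac{k_1}{k_2}\, q_i^{-} , \qquad
  \dot{S}_i^{-}  = D_i\bigl(S_{i-1}^{-} - S_i^{-}\bigr) - \tfrac{k_1}{k_2}\, q_i^{+} ,
\end{aligned}
```

so that the true substrate concentration remains enclosed, S−i ≤ Si ≤ S+i, provided the bounds enclose the initial condition and the measurement uncertainties.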
Figure 11. Experimental results obtained by application of the linearizing control law strategy. Top: CO2 production rates. Middle: dilution rates (i.e. control inputs). Bottom: sugar concentration bounds, estimates and set-points.
The test of the control strategies developed in WP7 was performed on the continuous multi-stage reactor, which was previously shown to be representative of the traditional batch fermentation plant. The control objective has been expressed in terms of time minimization: the goal was to make the system go as fast as possible from one set-point to another. The experimental setup is composed of four tanks (reactors) connected in series. The control inputs are the flow rates Qa1, Qa2, Qa3 and Qa4 of the four reactors, with the physical constraints Qai ≥ Qa(i-1) ≥ 0. The only available measurements are the CO2 production rates (one per reactor at each measurement time). As the measurement data are noisy, a filtering step is performed so that a new value of the CO2 production rate is available every 20 minutes for each reactor. The most relevant results obtained are presented in Figure 11. Further details about these experiments can be found in deliverable D3.3.
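As a small illustration of the filtering step mentioned above, the sketch below averages noisy raw CO2 production-rate samples into 20-minute blocks. The sampling period, signal shape and noise level are illustrative assumptions, not the actual filtering used on the plant.

```python
# Sketch of the measurement filtering: raw, noisy CO2 production-rate samples are
# averaged over 20-minute windows so that one filtered value per reactor is
# produced every 20 minutes. All numerical values are illustrative.
import numpy as np

def block_average(times_min, values, window_min=20.0):
    """Average raw samples into consecutive windows of `window_min` minutes."""
    times_min = np.asarray(times_min, dtype=float)
    values = np.asarray(values, dtype=float)
    edges = np.arange(0.0, times_min.max() + window_min, window_min)
    idx = np.digitize(times_min, edges)
    return np.array([values[idx == k].mean() for k in np.unique(idx)])

# Example: 1-minute raw samples for one reactor, corrupted by noise
t = np.arange(0.0, 120.0, 1.0)
raw = 0.8 + 0.05 * np.sin(t / 30.0) + 0.02 * np.random.randn(t.size)
print(block_average(t, raw))  # six filtered values over two hours
```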
Microfiltration of food beverages
Due to the large state dimension of the model (160), the sophisticated implicit numerical integration scheme and the need for global optimization because of local minima, the computation of optimal controls is difficult and time consuming. In addition, long-term effects require the computation of optimal controls over all phases (F) and cycles (FC & CC), not just a single phase (F). Finally, due to the uncertainty of certain model parameters and states, optimal control computations must be repeated after certain time periods to incorporate improved parameter and state estimates. Feedback is needed to limit the performance degradation due to uncertainty. In principle the number of backflushes or filtration cycles within a CIP cycle is variable, and the same applies to the number of CIP cycles needed to filter a desired amount of beer. As a result, optimal control computation becomes a mixed-integer non-linear programming problem (MINLP), which belongs to one of the most difficult and time-consuming classes of optimization problems. Therefore a dedicated problem reformulation was developed for on-line control.
Currently the only on-line measurements available for control are those of the trans-membrane pressure (TMP). On-line measurements may be used to realize feedback control with the on-line computation of control corrections to limit performance degradation due to model and other types of uncertainty. Common practice is to estimate on-line the system state and several critical parameters if these are not known precisely; in the latter case the controller is called adaptive. During each backflush, we first estimate some critical, uncertain system parameters using TMP measurements from the previous phase. Second, the model with the adapted parameter values is used to estimate the initial state and compute a new optimal control policy for the next filtration phase. The optimal control is fed to the BMF during the next filtration phase, thereby realizing feedback. In this way a computationally feasible adaptive sub-optimal feedback control approach is realized. Once the simulations performed as expected, we applied the adaptive sub-optimal controller to the BMF in the experimental setup. These experiments were done at a very late stage of the project and were therefore limited in number. Initially they also suffered from some programming errors. After removing those, it still turned out to be hard to obtain a model fit as good as the ones shown in Figure 12. Remarkably, the optimal control computed from the model nevertheless performed better than the standard one: the costs were 2.56 Euro/m2 in the former case and 6.8 Euro/m2 in the latter.
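A schematic sketch of this adaptive sub-optimal feedback loop is given below: during each backflush, uncertain parameters are re-estimated from the TMP data of the previous phase, and the next filtration phase is then re-optimized with the updated model. All model, estimator and cost functions are hypothetical placeholders, not the actual BMF model or optimizer.

```python
# Schematic sketch of the adaptive sub-optimal feedback loop described above.
# All functions are hypothetical placeholders standing in for the BMF model,
# the TMP-based parameter estimator and the phase optimizer.
import numpy as np
from scipy.optimize import minimize, least_squares

def tmp_model(theta, u, t):
    """Placeholder fouling model: predicted trans-membrane pressure."""
    return theta[0] + theta[1] * u * t                      # illustrative only

def estimate_parameters(theta_prev, u_prev, t_meas, tmp_meas):
    """Re-estimate uncertain fouling parameters from last phase's TMP data."""
    res = lambda th: tmp_model(th, u_prev, t_meas) - tmp_meas
    return least_squares(res, theta_prev).x

def phase_cost(u, theta):
    """Placeholder economic cost of one filtration phase (illustrative only)."""
    u = float(np.atleast_1d(u)[0])
    return (u - 2.0) ** 2 + theta[1] * u

def optimize_next_phase(theta):
    return minimize(phase_cost, x0=1.0, args=(theta,)).x[0]

theta = np.array([0.5, 0.1])                                # initial parameter guess
u = 1.0                                                     # initial flux set-point
for phase in range(3):                                      # a few filtration/backflush cycles
    t_meas = np.linspace(0.0, 1.0, 20)
    tmp_meas = tmp_model([0.6, 0.15], u, t_meas) + 0.01 * np.random.randn(20)
    theta = estimate_parameters(theta, u, t_meas, tmp_meas)  # during backflush
    u = optimize_next_phase(theta)                           # next filtration phase
    print(f"phase {phase}: theta={theta.round(3)}, next set-point u={u:.3f}")
```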
Figure 12: Intermediate results of the adaptive sub-optimal control system
Case study 3: Freeze-drying of lactic acid bacteria as Preservation process
The control objective combines operational and quality objectives and is formulated as minimizing cycle time while maintaining the product temperature below the glass transition. To that purpose, two input variables are considered: condenser temperature and shelf temperature. Measurements (output variables for feedback control) include chamber temperature and pressure, while the process states are the temperature distribution within the product. One particular state is the temperature of the moving front, which in the present set-up must be estimated or inferred from the available measurements. The proposed control configuration is presented in block-diagram form in Figures 13 and 14 below. It consists of a two-level integrated structure that includes:
1. A supervisory level, responsible for computing/recomputing optimal control profiles in the event of deviations from quality (optimal profiles for chamber temperature and pressure, as well as front temperature)
2. A tracking/regulatory level that makes the process follow the optimal profiles by acting on condenser temperature and shelf temperature (a schematic sketch of this two-level idea is given after this list)
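A minimal sketch of how such a two-level structure might operate is given below: a supervisory layer recomputes the reference profile when the tracked variable deviates too much, and a regulatory PI loop tracks that profile. The plant model, reference profile, deviation threshold and tuning values are illustrative assumptions, not the actual CAFE implementation.

```python
# Minimal illustration of the two-level structure: a supervisory layer that
# recomputes the shelf-temperature reference profile when the product drifts,
# and a PI tracking loop acting on the shelf temperature. All values illustrative.
import numpy as np

def recompute_profile(current_temp, horizon):
    """Placeholder for the dynamic optimizer (RTO): a simple ramp towards -10 C."""
    return np.linspace(current_temp, -10.0, horizon)

horizon, dt = 200, 1.0
T_prod, T_shelf = -40.0, -40.0
profile = recompute_profile(T_prod, horizon)
Kp, Ki, integ = 2.0, 0.05, 0.0

for k in range(horizon):
    err = profile[k] - T_prod
    if abs(err) > 3.0:                                        # quality deviation detected
        profile[k:] = recompute_profile(T_prod, horizon - k)  # supervisory re-optimization
        err = profile[k] - T_prod
    integ += err * dt
    T_shelf = profile[k] + Kp * err + Ki * integ              # tracking/regulatory level
    T_prod += dt * 0.05 * (T_shelf - T_prod)                  # crude first-order plant response

print(f"final product temperature: {T_prod:.1f} C")
```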
Figure 13. Supervisory Control Structure (RTO loop)
Design of the supervisory level has been completed and its main elements have been validated. These comprise:
- operational models to describe input-output variables
- a model of temperature distribution in the product as a function of chamber temperature and pressure
- a dynamic optimization solver, suitable for RTO and predictive control
The proposed structure has been tested at the pilot plant level (see Figure 15 below).
Figure 14. Robust Control Structure
Figure 15. Temperature control in the freeze-drying case study
Ice cream crystallization
In this case study, the control problem of the ice fraction is studied. The control strategy we propose is based on two control loops: a first loop to control the viscosity μ with the evaporation temperature Te, and a second loop to control Te (and thereby μ) with the compressor rotation speed Vcomp. The control laws considered are linearizing control laws, with some approximations made so that they depend only on the available measurements. The control scheme which has been proposed is given in Figure 16.
Figure 16. Control scheme for the control of the viscosity of the ice cream at the outlet of the freezer. The quantities Tsat, Vcomp and x are the respective saturation temperature, compressor rotation speed and state of the system; μ is the viscosity set-point, Tmsat is the delayed saturation temperature measurement, Tem is the evaporation temperature measurement and θ is the parameters vector. The circumflex accent is used for the estimates of the unknown quantities.
The control law that has been designed computes, at each time instant, a value of the control input Vcomp. This value depends on the difference between the estimate of the viscosity and the set-point; it also depends on the estimates of the state and on the measurement of the evaporation temperature. This control strategy is in fact a cascade control strategy with two control loops:
• a primary loop to control the viscosity μ with the evaporation temperature Te;
• a secondary loop to control the evaporation temperature Te with the compressor rotation speed Vcomp.
Figure 17. Experimental results obtained by application of the control law on the crystallization process. Top: ice temperature T. Middle: evaporation temperature Te. Bottom: compressor rotation speed Vcomp.
The control loop can be described in the following way (see Figure 16):
• first, the temperature of the ice cream is measured. This measurement is not made directly at the outlet of the freezer but slightly downstream: there is therefore a measurement delay which has to be taken into account in the control scheme.
• the viscosity is then estimated at the current time by an observer from the delayed temperature measurement: the observer used is a Smith predictor, which compensates for the delay.
• the estimate of the viscosity is then used in the control law designed to make the viscosity of the ice cream reach a given set-point.
• the parameters used in the Smith predictor are adjusted on-line, to improve the estimation and ensure that the estimate of the viscosity converges to the real value of the viscosity (a schematic sketch of the Smith-predictor idea is given after this list).
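The sketch below illustrates the Smith-predictor principle referred to above: an internal undelayed model output, corrected by the difference between the delayed measurement and the delayed model output, is fed to a PI controller. The first-order-plus-delay plant, tuning values and set-point are illustrative assumptions, not the actual crystallizer controller.

```python
# Compact sketch of the Smith-predictor idea used to compensate the transport
# delay between the freezer outlet and the temperature sensor. All values illustrative.
import numpy as np

K, tau, delay_steps, dt = 1.0, 30.0, 10, 1.0     # hypothetical plant and delay
Kp, Ki = 0.8, 0.05                               # PI controller on the predicted output
setpoint = 1.0

y_model, y_plant, integ = 0.0, 0.0, 0.0
y_model_hist = [0.0] * (delay_steps + 1)         # buffer of undelayed model outputs
plant_buffer = [0.0] * (delay_steps + 1)         # plant transport delay

for k in range(300):
    y_meas = plant_buffer[0]                                 # delayed measurement
    # Smith predictor: undelayed model output + (measurement - delayed model output)
    y_pred = y_model + (y_meas - y_model_hist[0])
    err = setpoint - y_pred
    integ += err * dt
    u = Kp * err + Ki * integ                                # control input (e.g. Vcomp)
    # Update the internal (undelayed) model and the real plant, both first order
    y_model += dt / tau * (K * u - y_model)
    y_plant += dt / tau * (K * u - y_plant)
    y_model_hist = y_model_hist[1:] + [y_model]
    plant_buffer = plant_buffer[1:] + [y_plant]

print(f"final predicted output: {y_pred:.3f} (set-point {setpoint})")
```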
Many experiments have been carried out to test and validate the control law on the pilot plant. We first validated the control law without perturbations; some results are given in deliverable D3.3. We then tested the control law in the presence of perturbations. Figure 17 shows the results of the experiments performed on the day of the demonstration, during which several disturbances were applied.
WP8 : Integration
The tasks within WP8 can be divided into two main classes of activity. The first class is the collection and evaluation of the progress and achievements of the work packages and case studies in the project.
The second class of tasks relevant for the activity of WP8 is the development and validation of the integrated control systems. For these tasks the WP8 leader, SPES, developed and validated an integrated control system based on a new paradigm for computer-aided control technology in food. A major achievement of work package WP8 has been the development of an integrated system that includes the results from WP2 to WP7 in a unified technological framework reflecting current trends in ICT. The design of a database-centric infrastructure realized a unified and open system for the integration of the results from the four case studies while combining the achievements of WP2 and WP8. The link between a Scientific database (for knowledge-based reasoning) and a Local database (for communications and control) is the key of the integration activity. The project is based on the idea that each plant is connected to a Remote Server Machine (or possibly a clustered set of machines) – located at one or more remote control centres – in order to upload measured data and alarms and download control actions. This Remote Server is essentially a supervisory system which implements a distributed control architecture. New DBMS (Database Management System) technology allows a completely different control structure to be organized, driven by data and event acquisition from the physical processes: the change or acquisition of a value or event can trigger all the actions in the control infrastructure, from both the hardware and software points of view. This kind of approach had not been considered in electrical and computer engineering in the past because the technological limits of operating systems, hardware resources and software paradigms made it neither ready nor suitable. The algorithms which constitute generic software modules, controllers (in particular regulators), optimizers and generic logical units are deployable both on remote sites and on plant local sites. Local plant algorithms are enforced by the embedded electronics infrastructure. A relevant effort has been the inclusion of the Embedded Matlab framework on the embedded electronics. The organization has been conceived as hierarchical: the upper side has the highest computing power and the highest level of semantics and data abstraction. It is the distributed computing and storage layer, typically comprising the usual personal computers used both for access and computing and possibly dedicated machines for high-load scientific computing; it is called the Global layer. At the lowest level (but not exclusively) we have the regulatory layer for autonomous control and the contact layer with sensors and actuators, by means of low-level communication protocols. The middle and bottom layers are enforced by embedded microelectronics.
A fundamental feature, paradigm and tenet of the control system is that every piece of control software is a plug-in. Plug-ins are the main mechanism that makes the system able to implement very complex control policies, as required by food processes, at very low cost and with a simple, unified procedure. Every algorithm (in particular Matlab code) developed within the project can be installed as a plug-in on the system. Every plug-in can be immediately put in communication with the others, allowing a vast class of control topologies and hierarchies. A remote upload server (on the highest layer of the infrastructure) is accessed for the deployment of plug-ins on the system: a web page enables the enrolment of a controller, optimizer, model or general logic software module on the CAFE infrastructure. The plug-in concept creates an abstraction layer that enables scientists to operate on the control framework without being concerned with low-level communications. The plug-in architecture can intrinsically implement an evolutionary distributed control system, through the capability of generating descendant plug-ins with dynamically computed parameters or configurations. Database records trace all the system evolution, as every plug-in leaves its footprint on the data model. Local plug-ins can be run under real-time constraints; Global plug-ins are to be considered on-line but not strictly real time. A key element for integration, both for Global and Local plug-ins, is the integration text file.
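To make the plug-in and database-centric ideas more concrete, the toy sketch below registers two plug-ins that communicate only through shared database records, which also trace every update. The table layout, tag names and registration mechanism are hypothetical illustrations, not the actual CAFE implementation.

```python
# Toy illustration of the database-centric plug-in idea: each control/optimization
# module registers itself as a plug-in and exchanges values only through a shared
# (here in-memory SQLite) database, which also keeps a trace of every update.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tags (name TEXT PRIMARY KEY, value REAL)")
db.execute("CREATE TABLE trace (plugin TEXT, name TEXT, value REAL)")

PLUGINS = []
def plugin(func):
    """Register a function as a plug-in of the control infrastructure."""
    PLUGINS.append(func)
    return func

def write_tag(plugin_name, name, value):
    db.execute("INSERT OR REPLACE INTO tags VALUES (?, ?)", (name, value))
    db.execute("INSERT INTO trace VALUES (?, ?, ?)", (plugin_name, name, value))

def read_tag(name, default=0.0):
    row = db.execute("SELECT value FROM tags WHERE name = ?", (name,)).fetchone()
    return row[0] if row else default

@plugin
def temperature_regulator():
    # Local, real-time plug-in: simple proportional action on a measured tag
    error = read_tag("T_setpoint", -10.0) - read_tag("T_measured")
    write_tag("temperature_regulator", "shelf_command", 0.5 * error)

@plugin
def supervisor():
    # Global, non-real-time plug-in: updates the set-point tag
    write_tag("supervisor", "T_setpoint", -12.0)

write_tag("io", "T_measured", -5.0)
for p in PLUGINS:            # the infrastructure runs every registered plug-in
    p()
print(db.execute("SELECT * FROM trace").fetchall())
```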
The infrastructure layers can be multiple, allowing great scalability and an optimal configuration that keeps the ratio between computing power and cost to a minimum; this also increases the overall sustainability of the process operation. The flexibility in the configuration of components and software supports optimal management of food processes. HMI (Human Machine Interface) Web interfaces for control of the system are available from the highest level down to the lowest: every device features a server for Web-based HMI applications. The solution is oriented towards current trends in computing and information accessibility: the plant operator is able to control and operate the plant remotely and pervasively, i.e. to be always connected.
Wine-making
Case study 1 contributed to WP8.1 by identifying the flavour-marker rates versus the ethanol rate and the metabolic modelling, along with timings for control towards optimal states of the process. Contributions to model paradigms have been provided for task WP8.2. It was found that the modelling procedure yields a relevant description of the studied phenomena and a suitable formulation for control; a simple mass-balance formulation is therefore preferred, along with a description of the main kinetics involving the growth of biomass on nitrogen and the production of ethanol and carbon dioxide resulting from sugar consumption. This mathematical model can be extended to the description of the considered flavour-active compounds. In addition to typical variables such as biomass, sugar and nitrogen, a new variable has been introduced: the transporters. Its introduction allows a more coherent description of the evolution of the fermentation activity as a function of the initial nitrogen concentration. It was maintained that this is the first time that flavour markers can be dynamically described and that the synthesis of the aromatic profile of a wine can be studied and hopefully understood and controlled. The combination of metabolic flux analysis and classical modelling provides great insight into the modelling of bioconversion processes, and this modelling strategy can probably be applied and extended effectively to other categories of processes as well. Contributions on sensor paradigms within WP8.3 for this case study have been obtained mainly through the use of temperature sensors, a flowmeter for injected O2, a flowmeter for produced CO2 and on-line GC. The wine-making process has served as a reference for the cross-evaluation of optimization methods related to task WP8.4. Though no direct implementation of optimization algorithms was foreseen or possible for this case study, the results on the modelling, control and scale of the process provided much knowledge for the optimization techniques and paradigms to be used in this case study as well. The needs of the wine case were included in the design of the overall optimization framework, with the objective of an immediate extrapolation of the results tested on the other case studies to a possible application to a real wine-making plant.
Case study 1 contributed to the developments towards the paradigms on monitoring and control related to the objectives of tasks WP8.5 and WP8.6. The first paradigm of the work package "Knowledge representation and data management" is the association of data models (relational models, XML Schema) and formalized knowledge models. The second paradigm is to represent knowledge in a generic way, so that models and tools can be easily implemented for the different food processes. This task required the formalization of unified concepts and of the relations between concepts. In particular, the focus has been on the design and implementation of the database structure, the conceptual design graphs and the development of the Information System architecture. Part of the data communications in this distributed architecture is performed using Web Services techniques. From the work and knowledge of the wine case study, the construction of the Scientific Database began; this database is the process memory and is used a posteriori for scientific purposes. The scientific database was implemented for the wine process. The database model fitted the needs well, especially for the project and experiment aspects that proved highly relevant to scientists. This first implementation readily demonstrates the possibility of extension to every other case study or process class.
On the control side, the complexity of modelling has been tackled for the wine case. The time scale for wine production has been assessed: the alcoholic fermentation takes several hours in a batch process, and one of the objectives was to reduce this stabilization time. Reduced-order models have been obtained that are composed of 16 Ordinary Differential Equations (ODEs), each of which is highly nonlinear. An observer has been used to estimate the unknown sugar concentration in each reactor. The control strategy which has been implemented is a linearizing feedback law that drives the dynamics exponentially to the target. The control objective has been expressed in terms of time minimization: to make the system go as fast as possible from one set-point to another, an off-line minimal-time feedback control problem has been studied.
Microfiltration of food beverages
Case study 2 contributed to WP8.1 by managing the Total Cost of Ownership/energy consumption for a process plant and by giving indications on filtered beer quality. Contributions to model paradigms have been provided for task WP8.2 in close link with WP4. A true modelling paradigm was established, which states the eight steps to be taken in the development of models oriented to process control. It is a very generic methodology, and the applicability of the paradigm to the case studies at hand has been summarized in a table. Contributions on sensor paradigms within WP8.3 for this case study have been obtained mainly through the sensing of viscosity, pH, conductivity, VIS-NIR, particle size distribution and turbidity. This case study has also been suited for experiments with ultrasound, electronic nose and electrical impedance techniques. The involvement of this case study in WP8.4 has been due to the major achievements in the determination of optimal operation policies for beer microfiltration. The optimal plant operation involves decisions at different levels: the number of chemical cleanings (CIPs), the number of back-flushes per CIP, the values of the cross and permeate flow set-points (QF, QP) and the maximum trans-membrane pressure (TMPmax). The approach selected for beer microfiltration in the project merges economic optimization and control and makes use of particular parameterizations to solve the problem using a small number of NLP problems. It has been observed that (global) stochastic algorithms performed much better than deterministic ones (based on SQP), leading to the optimal solutions; an operation profile has been obtained along with that. The contribution of the beer microfiltration case study to task WP8.5 has been used for the refinement of ontology classification techniques. In this case an OpenMI approach has been pursued for monitoring and control for comparison. It was useful to make the effort to formalize the beer case knowledge and allow automated reasoning to take place. A number of knowledge rules in the beer filtration domain have been refined, showing how the addition of facts to the rule base leads to newly inferred or retracted facts. In order to obtain a basic set of facts and knowledge rules for beer filtration, a number of knowledge sources have been considered: text documents in which expert beer knowledge described a physical model of the fouling behaviour of membranes. This technique has proven powerful for assessing the impact of the raw materials used as input for the beer filtration process. Since the intention is to create a physical model that describes the fouling behaviour of the filtration membrane and an automated control system to optimize the beer filtration process, it is useful to learn the impact of new facts; new hypotheses can then be formulated to fine-tune the physical fouling model and the automated control system.
Freeze-drying of lactic acid bacteria
Case study 3 contributed to WP8.1 by providing indicators on: Qt, the time to reach maximum acidification; Qv, cell viability; mechanical stability and rheology; controlled product temperature; water activity; and the total cost of ownership of the process. Contributions to model paradigms have been provided for task WP8.2, since it was learnt that direct on-line measurement of the biological activity of the bacteria is impossible and that critical process parameters (CPP) for quality need to be identified, together with quantitative relationships between these CPP and the viability or the acidification activity of the bacteria. By integrating the model of bacterial quality degradation into a model describing the drying kinetics, it becomes possible to develop an in-line control policy of the process that maximizes productivity and quality. A simplified one-dimensional model for the compact-layer configuration has been developed by APT and given to the other partners. The model predicts the drying time, the profile of product temperature, the water content and the glass transition temperature. Contributions on sensor paradigms within WP8.3 for this case study have been obtained mainly through the sensing of temperature (product and refrigerating liquid) and chamber pressure. This case study has also been viable for experiments with ultrasound techniques. The involvement of this case study in WP8.4 has been demonstrated by the achievement of model-based parameter estimation. To that purpose, six experiments were performed in which two states were measured and employed as observables: the temperature at the bottom of the product and the vapor pressure in the chamber. Two control variables were employed to define the different experiments: shelf temperature and chamber pressure. The results showed that the maximum and mean errors between model predictions and experimental data improved after parameter estimation. The optimization oriented to process control has been performed in order to guarantee on-line as well as in-line product quality control while ensuring safe and efficient operation; to this end, the development of an optimal control structure has been proposed that responds optimally to input and state disturbances. Real-time optimal control has been performed in two different plant configurations. The first one involved a nominal-case experiment and used the powerful eSS (enhanced scatter search) algorithm; the aim was to reproduce the nominal situation considered in the off-line optimal profile calculation.
The contribution of case study 3 to WP8.5, WP8.6 and WP8.7 has been its use as the first case study where the new distributed monitoring and control system has been deployed and demonstrated. The distributed control infrastructure developed is a low-cost, long life-cycle, reusable, open, robust, versatile and modular industrial technology. The later achievements up to the project demonstration proved highly satisfactory, allowing the scientists of the CAFE project to implement control and supervision through pure Matlab modules. The Matlab integration has been implemented on top of very general, open-source-oriented software. The database-centric (local database) architecture proved to be very general by allowing a fast refurbishment or integration of every existing piece of work or instrument within the project. The integration solutions proposed feature inherent scalability: only an SQL-query-level connection in the most abstract and incompatible case (where a completely proprietary control solution was present), a retrofit in the intermediate case (where some open low-level communications were possible), down to the hardware and software instrumentation of a process from scratch. The durability and openness of the solution demonstrated to academic research the existence of a powerful tool for the actual industrial exploitation of scientific results, while at the same time freeing research laboratories from dependency on the sales policies of typical hardware and software vendors, which are based on proprietary software compatibility requirements. The database objects and engines are used as the main interface between different processes, distinct control layers and legacy or alternative standard solutions. This kind of interface at the same time enables a seamless integration and implementation of the CAFE project achievements based on the paradigms for models, controllers, sensors and optimizers, by providing a unified tool which can potentially host all the practical realizations of their computational needs. Communication layers were added over the Internet protocol to enable seamless remote connection for control, diagnosis, maintenance, security, knowledge-based control and data model connections. A unified hardware and software infrastructure for the integration of process control has been developed, and an integrated software architecture has been developed to include the results from the modelling, optimization and control work packages in a unified framework. The proposed infrastructure is able to implement general control and supervision policies. The essential communications between all the devices in the infrastructure are made mainly by means of data replication; this constitutes a novelty and a change of paradigm also for the communication technology, allowing the design and testing of open, low-cost solutions of very general applicability.
Ice cream crystallization
Case study 4 contributed to WP8.1 by providing indicators on: energy consumption per mass of ice; settling time to the optimal values of viscosity and crystal size; rheology of the ice cream; and total cost of ownership and energy consumption. Contributions to model paradigms have been provided for task WP8.2 in food structuring processes. A paradigm for the crystallization process has been identified. It consists in following the transformation during crystallization by using the liquidus curve of the phase equilibrium diagram, and by following the difference between the temperature at the freezer surface (evaporation of the refrigerating fluid) and the equilibrium temperature of the crystal solution in the bulk. The crystal nucleation laws, the crystal growth laws and the ice mass fraction depend on the temperature difference with respect to the phase equilibrium diagram. Model parameters have been identified from experimental data at the laboratory plant scale using a reduced model. Validation has been carried out by following key quantities such as crystal size, ice content and viscosity of the product. Energy consumption has been included in the model to be used in future dynamic modelling. Contributions on sensor paradigms within WP8.3 for this case study have been obtained mainly through the sensing of temperature, pressure, power consumption, dasher speed, crystal size and ice fraction, and by means of focused beam reflectance measurement and a purpose-built on-line viscometer. This case study has also been suited for experiments with ultrasound and electronic nose. The involvement of this case study in WP8.4 concerned experiments for model identification using a factorial (D-optimum) design considering three control variables: evaporation temperature, scraper speed and mass flow rate. Three different types of measurements were employed to compare model predictions and experimental data: temperature of the mixture at three locations in the crystallizer, mean crystal chord length at the outlet and viscosity at the outlet (6 values). After parameter estimation, optimization has been carried out in order to guarantee on-line as well as in-line product quality control while ensuring safe and efficient operation. An optimal control scheme has been proposed that uses reliable process models and optimization tools which, combined in appropriate ways, enable processes to be operated at their optimal conditions and to respond optimally in the event of plant disturbances.
Concerning the activity related to WP8.5, WP8.6 and WP8.7, the ice cream case study has been the second case study where the new distributed monitoring and control system has been deployed and demonstrated. In this case a complete from-scratch installation has been performed, as described in deliverable D8.7. Case study 4 was chosen as the main case for the final demonstration because of its completeness both in monitoring and in control. On the control side, the same paradigmatic approach as in the other case studies has been followed and refined in the last reporting period for the final demonstration. The complexity of the biological and chemical processes involved in the ice cream process has been tackled. During crystallization several phenomena are involved, and three mechanisms have been taken into account: the nucleation of the crystals, the growth of the crystal size, and the breakage of the crystals, mainly caused by the blades of the scraper. Other phenomena have also been taken into account: the wall heat transfer, the transport of the product, the viscous dissipation and the radial diffusion. A peculiar characteristic of case study 4 is the continuous nature of the process. The time scale of the process has been identified as the stabilization time from one operating point to another, which takes between 5 and 10 minutes. To describe the crystallization process, a population balance equation coupled with an energy balance equation has been used. The population balance equation describes the evolution of the crystal size distribution inside the freezer by a Partial Differential Equation (PDE) expressing the distribution as a function of crystal size, spatial coordinates and time; the energy balance equation is also a PDE. The reduced-order model is composed of 6 ODEs, which here again are highly nonlinear, and non-affine with respect to the control input, which has been a difficulty for the control design. The problem considered is the control of the viscosity of the ice cream at the outlet of the freezer; it is a problem of regulation of the ice cream viscosity at a fixed set-point value. The control input is the compressor rotation speed. The control law that has been designed computes a value of the control input which depends on the difference between the estimate of the viscosity and the set-point, on the estimates of the state and on the measurement of the evaporation temperature. The control strategy is in fact a cascade control strategy, based on a reduced-order model obtained from the initial PDEs by means of the method of moments. The viscosity that we want to control can be expressed as a function of the state variables of the model. Interestingly, the same linearizing control law as the one used in the wine-making case study has been considered for the ice cream crystallization.
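For reference, the reduction from the population balance PDE to a small set of ODEs typically relies on the method of moments. A generic statement of this transformation is shown below; it neglects the spatial transport and breakage terms for brevity and is an illustrative assumption, not the project's exact model.

```latex
% Generic method-of-moments reduction of a population balance (illustrative only):
% n(L,t) is the crystal size distribution, M_j its j-th moment, B the nucleation
% rate (nuclei of size L_0) and G a size-independent growth rate.
\begin{aligned}
  M_j(t) &= \int_0^{\infty} L^{\,j}\, n(L,t)\,\mathrm{d}L , \qquad j = 0, 1, 2, \dots \\
  \frac{\mathrm{d}M_j}{\mathrm{d}t} &= L_0^{\,j}\, B(t) \;+\; j\, G(t)\, M_{j-1}(t) .
\end{aligned}
```

Truncating after a few moments and coupling them with the energy balance yields the kind of low-order ODE model referred to above.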
WP9 : Demonstration
A DEMO day was held at Irstea to showcase a live demonstration of the ice cream case study. The case study was shown to over 50 participants, covering control, sensing, modelling, data management and integration.
Potential Impact:
Issues in food and bio process industries
Different factors characterize the food industry today: these include its diversity, its complexity and the high level of industrial competition. The main issue that the food industry has to face nowadays is the production and delivery of reliable food, able to satisfy organoleptic, nutritional and safety considerations. The fact that consumers are looking for more and more services (easy-to-use and/or ready-to-eat products, quick preparation time, …) implies more complex formulation recipes and processes. At the industrial production level, this means an increasing number of unit operations whose combinations need to be well understood in order to be efficiently operated, but also the fact that raw materials and ingredients are added at different steps and that the co-products sometimes become economically more interesting than the main products. These general issues are however not the only ones; many other issues also need to be addressed, which may in particular depend on the kind of products.
The first question to be addressed is the following: why do we need to implement control strategies in a food process?
Indeed, several objectives lead one to consider control in the context of food processing:
- to increase the productivity of machines (but mechanization can also address this issue),
- to increase the productivity of workers (while training can also be helpful with that respect),
- to decrease product losses (it is generally considered that 50% of losses (raw material and food) occur during the shelf life and production; considering the nutrition of people, the reduction of losses is a major issue),
- to increase the regularity of product quality and/or reduce its variability,
- to increase the flexibility of machines and processes.
But requirements directly related to the specificities of food and bio industry have also to be considered, more precisely :
- to increase the hygiene of the food processing and production
- to decrease the effects of the natural variability of bioproducts characteristics
- to decrease the effects of the natural perishability of bioproducts
- to increase the global product quality : for instance for bioproducts, the quality is a term covering a wide range of contradictory aspects (texture, colour, taste, composition,...)
A central issue that needs to be addressed as a priority is the importance of maintaining the properties of the product as constant as possible. The interest in optimising product quality is undoubtedly very important, but it is obvious that the confidence of the consumer in food products is first based on the constancy of the product properties.
At this point it is important to remember that there are numerous constraints connected to food production. With respect to production costs, the main factor is usually the cost of the raw materials, the second being the labour cost. Energy is of lower importance since it presently amounts to 3 to 15% of production costs. Nevertheless, energy is becoming an increasingly important factor due to environmental considerations and the increasing cost of fuel sources. One may expect that in the coming years the price of food will most probably be indexed to oil and gas costs. Therefore the control of energy consumption within the food industry will become of increasing importance and interest. It is worth noting that the first substantial implementation of automatic control in the food industry started with the first oil crisis in 1974.
Emerging questions
With respect to labour costs, the economic consideration is not the only one to be faced. The fact that the workers and employees in the factory are in contact with food implies safety and sanitary considerations. The idea of a factory without humans is probably not desirable, but such a consideration has to be envisaged.
More recently, the effects of the different crises in food production (e.g. Bovine Spongiform Encephalopathy (BSE), avian flu, pig fever, foot-and-mouth disease, the dioxin crisis in Belgium in 1999 related to the contamination of the food chain via pig and chicken feed, …) show that consumers are worried about food and demand more safety. Even if much progress has been made and the ability of the food industries to really control the safety of production has improved, one consequence of the past years is that regulations have become more complicated and new tools are arriving at the factory level, which are opportunities for control purposes. HACCP principles and tools (Figure 1) have been available for a long time. Nevertheless, quantitative HACCP is now arriving as a new tool, and databases and Good Manufacturing Practices are becoming a basis for the implementation of Decision Support Systems.
Figure 1. Implementation of control and constraints
This resulted in particular in new European or national regulations, among which the two following EC Regulations: 178/2002 ("laying down the general principles and requirements of food law, establishing the European Food Safety Authority and laying down procedures in matters of food safety") and 1829/2003 ("on genetically modified food and feed"). More constraints appear for the control of the production. Traceability is necessary; this implies the implementation of new tools, mainly for measurement and monitoring as well as for memorizing the history of production events. The new responsibilities of the factory with respect to raw materials and food imply new tools, mostly computer-based. It happens that databases are available, yet not used as they could be.
Another key factor to be considered is the requirement for productivity, in relation to economic considerations and the profitability of factories. The appropriate compromise among all the constraints is usually difficult to find, and many food factories are looking for tools able to help them make the appropriate decisions in that regard. Figure 7 provides a historical perspective and illustrates the fact that food factories are presently looking for a global process database that would allow them to implement new control functions.
The number of studies dedicated to the monitoring of indicators related to the productivity and efficiency of processes is quite low. One of the difficulties of food companies is the very small profit margins that they have to manage; each time a better control of margins becomes possible, the economic situation improves. Even if no generic study is available, it is well recognised that energy costs are around 3 to 15% of product costs. Raw material losses are sometimes important, and better control can significantly reduce these losses and thereby improve the yield. Productivity increase is often a competitiveness criterion. It is, to date, more difficult to manage criteria related to a better control of organoleptic or nutritional considerations.
Numerous control systems have been implemented in industry over the last decade, as reported in several studies. For example, Morris [40] points out that in the United States the main issues raised by the food industries were related to the implementation of an increasing number of automatic control and integration tools, with the objective of addressing the important questions of organoleptic and sanitary considerations. As far as we know, there is no similar study for Europe; nevertheless, new issues have to be addressed, even if the search for the control of organoleptic properties and safety remains a priority.
The main new issue is nutrition. The motivations of consumers for healthy food are obvious; the main question to be addressed is therefore to be able to control the impact of food on nutrition. There is indeed an open scientific question: to determine and characterize what high-nutritional-impact foods are. Numerous research projects are presently considering this question, but new European Directives also have to be taken into account by the food industry. The main one is probably the Nutritional Profile directive (even if, to date, the choice of a profiling method has not been made, and many criteria are proposed in each European country). The simplest view for analysing the situation is to consider that the chemical composition of food, during and after processing, becomes of major interest. It is a combination of the interest in nutrition issues and the capability of analytical tools to accurately discriminate the composition of food. More precisely, as was the case for microbiology, the analysis of the positive and negative aspects of the chemical composition becomes possible.
The positive aspect is the retention of important molecules during processing and conservation. On the negative side, the neo-formed compounds (NFC) that are produced during the processing of food are of increasing interest today because the number of analytical tools that can measure such properties is increasing (and their detection limits are decreasing). This is indeed a new situation compared with past years. Most control strategies for food operations are based on sensory properties that are visible properties (visible meaning that people are at least able to "measure" the properties with their senses), at least for the human operator working close to the production line. With the new considerations about chemical composition, the expected properties that we have to control are not visible and the way to measure them is difficult, sometimes not available to date.
Good Manufacturing Practices (GMP) are implemented today without the regulatory constraints that apply to the pharmaceutical industries, but the increasing pressure of regulations makes it necessary to anticipate a higher level of regulation, and automatic control is one of the approaches that will be very helpful to address this issue, although the food industries will probably not have to reach the level of regulatory constraints that the FDA requires for pharmaceutical products. Nevertheless, the pressure comes more from the retailers than from regulations, and the consequence is a need for more automatic control and monitoring.
As compared to other manufactured products, food is a product that has to be transformed at two levels. The first level is the factory level (more and more often performed in at least two steps, including a final assembling process). The second level is the culinary/domestic level. This second transformation, in which heating is the most common operation although not the only one, is generally not controlled, because of the poverty of culinary technologies for control purposes. An important question for the food companies is to manufacture products that are satisfactory in terms of taste, safety and nutrition independently of the way they are used at the domestic level. The necessary robustness of the product is an objective, and among all the possible trajectories of transformation, the one that introduces this robustness will be the optimal one. Another consideration could be the study of the domestic unit operation in the same way as we study the control of an industrial unit operation. If we consider heating or baking for instance, the understanding and design of an automatic control strategy at the plant level could be similarly applied at the domestic level (as long as the cost allows it).
It is obvious that the economic aspects are also important. Consumers tend to significantly lower the part of their budget devoted to food (14% today compared with more than 25% thirty years ago). The production costs, as highlighted above, are so high, and the pressure from the retailers so strong, that in most food productions today the profit margin is low (around 3% for most food products). The economic sustainability of most food factories is therefore under pressure, and without any change the profitability of many food companies will significantly decrease.
Taking into account the demand from consumers, the industrial companies are looking for tools capable of defining the design and production of food from market to factory. The necessary flexibility is not easy to obtain, and all the process manufacturers are looking for higher-level control implementations in order to increase flexibility. Nevertheless, as discussed below, this is not straightforward.
In order to provide an answer to the previous questions, different options are under study. The first one is to increase the engineering of food processes, and more and more complicated food chains are being proposed. One of the ideas here is mainly to decrease energy costs, to address the environmental issues (via the reduction of waste and wastewater, for instance) and to maintain the quality level as much as possible. As a consequence, because the processes are more and more complex, control becomes more difficult, mainly because the dimension of the expected properties is increasing and the search for a more complicated compromise is harder. The training of operators is no longer sufficient to meet all the control objectives.
Another interesting evolution also needs to be taken into account. Looking at recent food process exhibitions (Achema in Germany, IPA in France), a large number of the proposed processes are smaller and smaller, and size reduction is an important trend. This trend is easy to understand, because it allows more unit operations to be installed in the same location and a larger diversity of ways of processing the food; however, it also means that the residence times of the products are shorter, making the control of the process operation harder. This clearly shows that automatic control functions become more and more important, if not necessary.
Even if automatic control appears to be a solution, it must be considered that the social situation in Europe does not allow a strong decrease in the number of workers in an industrial plant. The characteristics of the food product, the complexity of food processes and the large number of available degrees of freedom require people for controlling the processes. In that context, advanced control strategies are also essential, and new concepts for control that allow automatic functions to be shared with decision support systems for human decisions are important and will be part of the sustainability of the food industries.
Facing the questions, the situation of control
Two points have to be addressed. The first concerns the progress in components and tools for control; the second concerns the way companies are handling the question.
The control components industry situation
The progress in control technologies and techniques is enormous. It is well known that the power of computer tools increases every year. Industrial computers are cheaper, more accurate, and an increasing number of functions are available. The improvements in the performance of Programmable Logic Controllers (PLCs) (industrial computers dedicated to automatic control tasks) are obvious. Most food factories are able to implement these tools, even if, as Ilyukhin [27] explains in the context of the United States, they are mainly implemented by process manufacturers and not by food producers. PLCs allow the implementation of, for instance, high-level algorithms for on-off control, sequential control and even set-point control.
The improvements in actuators are very important too, yet often neglected. More and more accurate actuators are available in food factories. This is often motivated by the search for increased productivity. But it is a fact that controlled motors, smart valves and complex manifolds are available, and new control functions become possible. One actuator that is increasingly implemented is the multi-axis robot, due to price reductions but also because of increasing occupational-illness costs, which tend to be used as an argument to replace humans by robots. Today the implementation of robotic functions coupled with image analysis is more an engineering question than a research one.
The third component of control, which could indeed be considered the first one, is the on-line sensor. Here the situation is still difficult. Numerous papers deal with the lack of reliable and appropriate sensors. It is obvious that even the major sensor manufacturing companies show little motivation to adapt their technologies to food and bio characteristics. The main reason is related to the market: specific sensors are needed for each specific application, for which the market is usually quite narrow, and the profitability of sensor adaptation is not easy to demonstrate. This most probably explains why, still today, only a limited number of specific on-line sensors are available in the food industry. The emerging solution, at least for part of the question, is the adaptation of analytical tools, classically available at the laboratory level, to process and plant conditions (increase of robustness). The composition of food is becoming easier to measure. At this stage, the transfer of Process Analytical Technology (PAT), first developed for the pharmaceutical industries, is one way to address on-line food composition analysis. However it is not enough for control purposes, because the set of properties expected to be measured during process operation comprises not only the composition but also the texture, the colour, the aroma, the sanitary properties, etc. The proposed solutions are numerous. First, it must be noted that developing a new measurement method is a long process that usually takes several years, not just a few months. Secondly, the use of smart sensors (i.e. the combination of easy-to-perform measurements and software) allows numerous situations to be solved; unfortunately, these are case-by-case applications, usually without possible generalisation. One of the ideas could be to transfer results and tools available for the control of bioprocesses to food processes (e.g. state and parameter estimation, indirect model-based sensors).
The present situation in the food industry is such that, even in the absence of on-line sensors and largely thanks to the know-how of the process operators, the food companies are able to provide products that correspond to the consumers' expectations. The main tools for sensing are indeed the operators and their ability to evaluate the product. It is obvious that such know-how should not be neglected in the design of control strategies. Recent results have provided methods that are able to include human evaluation in feedback and/or feedforward control strategies. Most probably, this is one way to address the issue of taking consumer expectations into account. Results show how plant-control experts are able to anticipate consumer behaviour and how they control the elaboration of quality in such a way that expectations are met. It is interesting at this point to keep in mind that SMEs are here often more efficient than big companies. It is one of the relevant goals of the Strategic Research Agenda [wp5].
It is evident that, due to the complexity of food transformation, the combination of classical sensors, analytical tools and human evaluation with models, optimization and control algorithms is the most appropriate solution for the development of a set of tools able to provide a food-PAT-based control system concept.
The food and bio companies situation
The food companies are trying to increase the potential and performance of their control systems. It is obvious that the level of necessary studies and developments is such that these will more likely be implemented in big companies than in small and medium ones. However there is a major potential for improvements in both types of companies. The key point is that it is often a case-by-case approach, without any generalisation. As said above, the process manufacturers are the main route for the introduction of control functions within the food factories. The consequence is that the main emphasis is on functions related to the technologies rather than to the products. The specific dimensions of the process and its product(s) are usually missing. It is therefore important to come up with control engineering tools that appropriately address the process issues and in particular the different interactions within the process, typically in terms of the main phenomena (chemical/biochemical transformations, heat transfer, mass transfer, multiple phase interactions (liquid/solid/gas, but also emulsions or gels for instance), …) taking place within the process.
The second conclusion at the food industry level is linked to classical advanced control approaches. In food companies there are most often no control engineers, while the newly proposed control tools and methods are mainly designed by control engineers. The way the algorithms are designed implies the definition of parameters that are not easy to understand for operators and process engineers. It appears that, in the food industry (as is also the case in many process industries), the best approach to the design of the control problem is to consider that one of the key issues is to control the constraints that have to be fulfilled. This often implies an adaptation in the design of control strategies.
The food industries in Europe have proposed an agenda via the Food for Life platform. Process control is one of its goals (goal 2, challenge 3). The main point relates to the statement that “robust and reliable quality sensing systems must be researched and developed over differing time scales so as to assess quality throughout the life history of a product. In-line, preferably non-destructive, and integrative quality sensors are a prerequisite for a modern process control. It will be essential to adapt read-outs of such quality-sensing systems to generate useful parameters for the design of new processes and for the creation of new food”. The corresponding deliverables are expected for 2020. The industry obviously has to face these questions before that date, and the purpose of this analysis is to develop appropriate tools and methods for process control in the meantime.
The potential scientific responses are numerous
An overview of the scientific production in peer-reviewed journals and international conferences provides a good indication of the level of development and implementation of automatic control in the food industry. In simple words, while the automatic control and process control scientific communities have been characterized by major scientific developments and industrial implementations over the last two decades, the application of advanced process control in the food industry is still limited. Such an a priori negative statement should be balanced by a substantial and growing interest of process control scientists in food processes, and of food scientists in advanced control methods.
The first trend is for instance illustrated by the publication in the Control Systems Magazine, one of the publications of the Control Systems Society of the IEEE (Institute of Electrical and Electronics Engineers), of two special issues (August and December 2006) on process control, with one paper fully dedicated to the control of food processes [41] and two papers on crystallisation processes and biological reactors [25][30] in which specific food process issues are also addressed. It is also illustrated in the triennial milestones report of the IFAC (International Federation of Automatic Control), which summarizes every three years (the basic periodicity of the IFAC activities) the accomplishments and trends in the research and applications of automatic control in all fields: food processes are clearly given an increasing importance [12]. The same remark holds for the activities of at least three technical committees (TC6.1 on Chemical Process Control, TC8.1 on Control in Agriculture and TC8.4 on Biosystems and Bioprocesses) where food process modelling, monitoring and control are clearly identified. It is also worth noting that major control engineering journals like Automatica and the Journal of Process Control give increasing room to publications dedicated to the monitoring and control of food processes.
Recent international conferences in food processing are illustrative of the second trend. It must be noted that they increasingly include sessions on automatic control. During ICEF9 (International Congress on Engineering and Food, Montpellier, France, 2004), the sessions related to control clearly emphasized that a lot of new functions are available. Optimisation appears to be the main approach; only very few new sensors and new measurement principles were proposed. During the IUFoST conference in Nantes, France (September 2006), a similar observation could be made: a few presentations were dedicated to the control of specific food process case studies, with optimisation as a tool.
It is important to note that, as already mentioned, the research activities in automatic and process control over the last decades have resulted in a wide spectrum of new methodologies that address important control issues. In the field of process control, very active research has concentrated on process monitoring, optimisation and control. While Model Predictive Control (MPC) has often been viewed as a common denominator for control design, it also has to be considered as a starting point for the design of more specifically dedicated and appropriate controllers aimed at addressing the specific issues of classes of processes. Robust state estimation [13][14], process optimization and optimisation-based control [3][4][5][6], and real-time optimisation (including adaptive extremum-seeking control techniques [20][35][36][56][57][64]) have been the object of increasing research and have resulted in several promising new techniques, in particular for food processes. Another important issue is the use of appropriate models for control design: very often, the mechanistic models used to describe the (complex) dynamical behaviour of food processes and their intricate mechanisms are too complex to be used in efficient control schemes; this holds for processes involving complex fluid flows, spatial distribution and particulate distribution (which result in partial differential equation (PDE) model representations), but also for biochemical systems involving a large number of complex biological reactions. Several methods for model reduction have been extensively studied: singular perturbation, pseudo-spectral methods, weighted residual methods (Galerkin, orthogonal collocation, …), finite elements, etc., but also systems biology reduction methods, e.g. based on convex basis analysis.
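As an illustration of the model-reduction step mentioned above, the following minimal sketch (in Python, on a toy one-dimensional heat equation with assumed diffusivity, grid and number of retained modes) extracts a proper-orthogonal-decomposition (POD) basis from simulation snapshots and performs a Galerkin projection onto that basis; it is a generic illustration of weighted-residual-type reduction, not a CAFÉ model.

```python
# Minimal sketch of projection-based model reduction (POD + Galerkin) applied
# to a toy 1-D heat equation, standing in for the distributed-parameter food
# process models mentioned above. Grid size, diffusivity and the number of
# retained modes are illustrative assumptions.
import numpy as np

N, a, dt, steps = 50, 0.01, 0.01, 500
dx = 1.0 / (N + 1)
# finite-difference Laplacian with homogeneous Dirichlet boundary conditions
A = (np.diag(-2.0 * np.ones(N)) + np.diag(np.ones(N - 1), 1)
     + np.diag(np.ones(N - 1), -1)) * a / dx**2

x = np.linspace(dx, 1.0 - dx, N)
u = np.sin(np.pi * x) + 0.3 * np.sin(3 * np.pi * x)   # initial temperature profile

# full-order simulation (explicit Euler), collecting snapshots
snapshots = [u.copy()]
for _ in range(steps):
    u = u + dt * A @ u
    snapshots.append(u.copy())
S = np.array(snapshots).T                              # N x (steps+1) snapshot matrix

# POD basis from the snapshot matrix, keeping r dominant modes
r = 3
Phi, _, _ = np.linalg.svd(S, full_matrices=False)
Phi = Phi[:, :r]

# Galerkin projection: reduced r-dimensional linear model
Ar = Phi.T @ A @ Phi
z = Phi.T @ S[:, 0]
for _ in range(steps):
    z = z + dt * Ar @ z
u_rom = Phi @ z

err = np.linalg.norm(u_rom - u) / np.linalg.norm(u)
print(f"relative error of the {r}-mode reduced model: {err:.2e}")
```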
It appears for instance that batch processes have recently (i.e. over the last decades) been the object of increasing research activities due to obvious industrial needs. One issue, for instance, is the ability to provide rapid estimation of key process variables representative of the process quality parameters: this can be done via software sensors that converge in finite time. It is also worth noting that EC projects have recently been dedicated to the monitoring and control of batch processes, in particular the BatchPro project (“Knowledge-driven batch production”) [wp4].
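As a hedged illustration of the software-sensor idea for batch processes, the sketch below (Python, with assumed stoichiometric coefficients and initial conditions) reconstructs an unmeasured substrate concentration algebraically from a measured product concentration using a stoichiometric invariant, i.e. without any kinetic model or convergence transient; it illustrates the principle of kinetics-independent estimation rather than one of the specific finite-time estimators referred to above.

```python
# Minimal sketch of a kinetics-independent "software sensor" for a batch
# process: with a known stoichiometry, a linear combination of concentrations
# is invariant to the (unknown) reaction rate, so an unmeasured concentration
# can be reconstructed algebraically from a measured one. Stoichiometric
# coefficients and initial conditions are illustrative assumptions.
import numpy as np

k1, k2 = 2.0, 1.0            # assumed stoichiometry: S consumed / P formed per unit reaction
S0, P0 = 100.0, 0.0          # assumed initial concentrations (g/L)

def estimate_substrate(P_measured):
    """Reconstruct the unmeasured substrate from the measured product using
    the invariant k2*S + k1*P = k2*S0 + k1*P0."""
    return S0 - (k1 / k2) * (P_measured - P0)

# synthetic "measurements" of the product standing in for an on-line analyser
t = np.linspace(0.0, 10.0, 11)
P_meas = (S0 * k2 / k1) * (1.0 - np.exp(-0.3 * t))    # fake product trajectory
S_hat = estimate_substrate(P_meas)
print(np.round(S_hat, 1))
```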
All this knowledge will obviously serve as a basis for the developments to be performed within the CAFÉ project.
A synthesis
Based on the observation that the level of specifically dedicated scientific developments and real-time implementations of advanced process control strategies in the food industry is still limited, the central idea of the CAFÉ project is to handle the smart control issue in the food industry by considering a new concept. The main question is the combination of the different software-based components in order to associate them in appropriate control tools. The key point, as explained in the preceding sections, is therefore to develop an integration approach that takes into account the specificity of food processes, where the cooperation between man and technology is important.
This includes the following important axes :
• Development and implementation of sensors and of a food-adapted Process Analytical Technology concept, with the idea of proposing, on the basis of the applications, how to adapt existing sensors and how to combine them with analytical tools and with human evaluation. The last point is clearly very important since manpower will continue to play a central role in process operation in the food industry, at least for all the evaluation work to be performed close to the plant; this raises the issue of providing user-friendly control tools for the process operators.
• Development of optimisation-based control algorithms. The optimisation point of view is today well recognised as the leading one. The different situations have to take into account the specificities of food operations, where the controllability is localised or distributed in space, and where batch or continuous operations are concerned. The calculation and control of product trajectories along the process life is important and will be a major concept for controller design (a minimal sketch of such a constrained, optimisation-based controller is given after the next paragraph).
These axes are based on the development of new approaches to the modelling of food and bio processes, where existing models can be integrated and adapted for optimisation-based control purposes (via model reduction approaches) and where the global integration of model-related algorithms is performed through software component technology.
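To make the second axis concrete, the following minimal sketch (in Python, with an invented scalar process model, set-point and constraint values) illustrates a constraint-aware, optimisation-based (receding-horizon) controller of the kind advocated above; it is an illustration of the principle, not a CAFÉ deliverable.

```python
# Minimal sketch of a constraint-aware, optimisation-based (receding-horizon)
# controller for a scalar linear model x+ = a*x + b*u, tracking a quality
# set-point under input and state constraints. The model, constraints and
# brute-force search over a small input grid are illustrative assumptions.
import itertools
import numpy as np

a, b = 0.9, 0.5                    # assumed process model parameters
x_ref, x_max = 4.0, 4.5            # quality set-point and upper state constraint
u_grid = np.linspace(0.0, 2.0, 9)  # admissible (constrained) input values
H = 3                              # prediction horizon

def mpc_step(x):
    """Pick the first input of the best admissible input sequence over H steps."""
    best_cost, best_u0 = np.inf, 0.0
    for seq in itertools.product(u_grid, repeat=H):
        xk, cost, feasible = x, 0.0, True
        for u in seq:
            xk = a * xk + b * u
            if xk > x_max:                 # predicted state constraint violated
                feasible = False
                break
            cost += (xk - x_ref) ** 2 + 0.01 * u ** 2
        if feasible and cost < best_cost:
            best_cost, best_u0 = cost, seq[0]
    return best_u0

# closed-loop simulation from a low initial quality level
x = 0.0
for k in range(15):
    u = mpc_step(x)
    x = a * x + b * u
print(f"state after 15 steps: {x:.2f} (set-point {x_ref}, constraint {x_max})")
```

The brute-force search over a small input grid merely stands in for a proper numerical optimiser; the point is that input and state (quality) constraints are handled explicitly inside the optimisation, in line with the constraint-oriented control design discussed earlier.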
Dissemination activities
Dissemination was aimed at three target groups: the food industry, food plant providers and academics. The main users of the CAFÉ production strategy, monitoring and control system are the first two groups. For them, the dissemination goal was to gain acceptance of the results and methodologies developed and of their application in wineries all over Europe. The dissemination goal for the academic users was to inform on the progress made and hence stimulate further research: as described in the work programme and objectives, much of the project is about reviewing recent advances and incorporating them into an integrated tool rather than about wholesale model development.
Apart from the publications in refereed journals and the communications in international scientific conferences, a specific scientific workshop as well as a demonstration workshop aimed at the industry have been organized at the end of the project, with the objective of providing an overview of the project results and the option of combining on-site demonstration with teleoperated demonstration at a distant site.
A webpage has been used as the principal tool for the exchange of information within the project. The CAFÉ webpage has also been used as a dissemination tool outside the project by the communication of the project results and the scientific publications connected to the project, as well as the announcement of activities (conferences, workshops, courses, exhibitions,…) of scientific/industrial relevance to the activities of the project.
The project has given the opportunity to increase the technical and commercial potential of European companies on the food market by incorporating efficient production strategies and smart control systems that can guarantee high and uniform product quality and reliable operation of the plants. This improves the position of the European food industry on the international market and favours increased exports outside Europe.
For the academic participants in the consortium, the results of this project maintain their international reputation as leaders in modelling, diagnosis, monitoring and control of food processes. The results from all parts of the programme are available, allowing them to strengthen and extend their areas of expertise. The academic element of the project continues to consolidate Europe’s reputation as a scientific leader in the field of food research. It also increases the links and collaborations between the European partners. Most of the academic partners already had joint scientific projects with other academic partners, and the project has been another opportunity to increase and reinforce the scientific collaboration among these institutions. Moreover, the collaboration among academic partners has been an important opportunity to develop new scientific links, build up new research activities and increase the scientific potential of both the consortium and the individual partners. The collaboration between the academic partners and the industrial partners has been an important source of vivid interaction that will lead to the transfer of new technologies from the universities and research institutes to industry and to environmental activities. It has also been a unique source of inspiration in terms of research themes from the industry and the “real” world to the researchers of the universities and research centres, by providing challenging scientific problems that are raised by the problems encountered in industrial practice and the management of food processes, and that require deep and fundamental scientific developments.
Training and exchange of scientists
A large part of the work of the CAFÉ project has been performed by young researchers. The training of young researchers has therefore been an important issue of the project, including the exchange of researchers among the different partners. The training and the different exchanges have played a central role in the project, not only for improving the coherence and interactions within the consortium, but also for favouring the education of highly skilled scientists and engineers. In that respect, the objective of the project has indeed also been to improve the scientific expertise in food science and technology in Europe.
Exploitation of project results
In terms of exploitation, particularly for the industrial companies and the end users of the smart control system, the potential benefits are enormous in terms of turnover and return on investment (ROI). It is for this reason that one end user (PMS) and 7 industrial companies involved in the interface with food industrial companies (Telstar, Oenodev, SPES, C-Tech, Alctra, Norit X-flow, BIV Trace) are participating in the project. From an energy, environmental and economic point of view, it is essential that the final deliverables address the needs of the end users.
For Telstar, the targets of the CAFÉ project are the following : to gain the knowledge and ability to build a freeze-granulation system for liquid solutions so that they can subsequently be lyophilized. As the dried products are very often pharmaceuticals, the final target is that the system should operate under sterile (or at least aseptic) conditions. Nowadays, pharmaceutical companies have to “overdose” the vials they lyophilize so that they match the required potency after the product is reconstituted (due to the range of results achieved from batch to batch). With the proposed method, the product could be “bulk freeze-dried” in granules. After the batch is finished and the potency analyzed, the individual vials could be filled with just the necessary amount of product. Important savings (around 10% of the batch cost) are expected. The second benefit is that freeze-drying of products dosed in vials is a very long process: a batch usually needs 36 to 48 h (even longer processes are not uncommon). With granules, the drying time could be reduced by a factor of 5 to 10 (drying is expected to need 5 to 10 h). The challenges are therefore the following. Forming pellets by spray cooling with a counter-current cryogenic fluid stream is a known process, but it is currently impossible to perform it in an aseptic/sterile way. Pellet drying will first be done by filling pre-cooled vials and loading them into a freeze-dryer; later in the project, it will be studied how to do it in a continuous way. A further advantage is that these spheres are perfect candidates for coating in order to achieve controlled release of the API within the patient. Additional exploitation results will be the availability of monitoring tools enabling in-line process control. The expected economic impact on IMA-Telstar is therefore a clear technological competitive advantage over both American and Asian freeze-dryer manufacturers:
- A lyophilization process is usually only specified in terms of a “recipe” (shelf temperature and chamber pressure vs. time). This, however, may not guarantee repeatable conditions for the freezing and sublimation steps. According to the recent FDA PAT guidelines, there is a need to study the process in depth in order to develop in-line tools enabling better monitoring and control. It is necessary to move from intensive (mass-independent) property measurements (such as temperature and partial or total pressure) to parameters which are scalable, and in so doing implement the necessary tools to control these parameters and hence the cycle, thus permitting real-time feedback control actions. Two variables are key to monitoring the lyophilization process: the sublimation interface temperature (which has to be maintained below the collapse temperature throughout primary drying) and the sublimation mass flow (which has to be maximized to achieve the most cost-effective cycle); a minimal heat-balance sketch of such a mass-flow estimation is given after this list. With classical monitoring tools these key goals are not achievable.
- The second advantage would be moving from a classical batch process to a possible continuous process, achieving a tremendous reduction of process cost and better uniformity of the processed materials. This can only be achieved by obtaining a solid that can be dosed in aseptic conditions (pellets). It would open the currently very restricted niche of lyophilisation to other, less expensive products that are not viable today due to the cost (both time and money) of the process.
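As a hedged illustration of the monitoring need described in the first point above, the following Python sketch estimates the sublimation mass flow per vial from a simple steady-state heat balance between shelf and product; the heat-transfer coefficient, vial geometry and temperatures are assumed values for illustration only, not measured CAFÉ data.

```python
# Minimal sketch of estimating the sublimation mass flow per vial in primary
# drying from a steady-state heat balance (heat supplied through the vial
# bottom is consumed by sublimation). All parameter values are assumptions.
DH_SUB = 2.84e6        # latent heat of ice sublimation (J/kg), approximate
K_V = 20.0             # assumed vial heat-transfer coefficient (W/m^2/K)
A_V = 3.1e-4           # assumed vial bottom area (m^2)

def sublimation_mass_flow(t_shelf_c, t_product_c):
    """Return the estimated sublimation mass flow (kg/s) for one vial."""
    q = K_V * A_V * (t_shelf_c - t_product_c)   # heat flow to the product (W)
    return q / DH_SUB

m_dot = sublimation_mass_flow(t_shelf_c=-10.0, t_product_c=-32.0)
print(f"estimated sublimation rate: {m_dot * 3600 * 1000:.2f} g/h per vial")
```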
SPES aims to exploit innovations from the CAFÉ project by improving its activity in the design and development of embedded electronics for measurement and control applications concerned with biotechnological processes. The expected benefits are in terms of product quality and performance and in terms of turnover.
For a company like C-Tech, the CAFÉ project offers a considerable potential impact on the performance of C-Tech Innovation. The high-performance food processing equipment developed by C-Tech for its client companies can only realise its full potential through the adoption of optimum control systems; without such control systems, sales of such equipment are compromised. The level of sales of such equipment by C-Tech has certainly been reduced over the years by the inadequacy of available control systems. It is hoped that the successful development of improved control protocols and systems within CAFÉ might increase C-Tech's income by about €200,000 per year in equipment sales.
For Alctra, the increase in the company's activities in the foodstuffs sector represents a development factor of prime importance, based as it is on the company's scientific and technological achievements and on the possibilities for consolidation and improvement in the future. In fact, this sector, which can be regarded as integrating a "low technology" area, exhibits a tremendous capacity for incorporating methods and equipment originating from the results of "high tech" work. Yet the incorporation of new technical approaches into the professional field of foods has to negotiate a preliminary stage of achieving test results under laboratory conditions. Accordingly, professional players will be more motivated to engage in the investments required in terms of in-house and specific research and development, giving rise to economically viable realization of innovative production and inspection methods. The CAFÉ project is well oriented towards the objective of combining the complementary skills which make up the basis of any structured and finalized research and development activity. Alctra, through its policy of involvement, is demonstrating a strategic interest of prime importance for its own development and that of its technological and industrial partners.
The number of applications of membrane filtration processes for beverages is growing rapidly. The current annual turnover of Norit in this area is about €20 to 30 million. For Norit, the CAFÉ project will result in technical improvements of membrane filtration systems, leading to better control of food quality. This will lead to increased market penetration of membrane filtration systems and a higher added value of membrane systems (higher pricing). It is thus expected that this project will result in an added annual turnover of several million euros.
The CAFÉ project will allow BIVtrace to enlarge its field of competence by adding and linking added-value information about a product throughout its life into a central database. Moreover, the synchronization of data from new sensors never studied before by a traceability company will have a positive return for BIVtrace in terms of market position.
List of Websites:
http://www.cafe-project.org/
Coordinator:
Denis Dochain
Full Professor
Honorary FNRS Research Director
CESAME, UCL
Bâtiment Euler, avenue Georges Lemaître 4-6, bte L4.05.01
B-1348 Louvain-la-Neuve, Belgium
tel : +32 10 472378
fax : +32 10 472180
e-mail : denis.dochain@uclouvain.be
homepage : http://perso.uclouvain.be/denis.dochain/