Content archived on 2024-05-27

Easy Composition in Future Generation Component Systems

Deliverables

The Component Workbench (CWB) is a flexible toolkit for the composition of components. Thanks to the CWB's modular design and the generic component model used for the internal representation of components, it supports the composition of components implemented for different component models. The CWB thus solves a problem of current software development tools, which offer only limited support for composing components implemented for different component models: it enables such components to interact with each other and thereby eases the composition process.
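A minimal Java sketch of the idea behind such a generic internal representation (the names GenericComponent and BeanComponent are illustrative, not the actual CWB API): an adapter exposes a plain JavaBean through one uniform interface for properties and method calls, and similar adapters could wrap components of other models.

    import java.beans.Introspector;
    import java.beans.PropertyDescriptor;
    import java.lang.reflect.Method;

    /* Uniform view onto a component, independent of its native component model. */
    interface GenericComponent {
        Object getProperty(String name) throws Exception;
        void setProperty(String name, Object value) throws Exception;
        Object invoke(String method, Object... args) throws Exception;
    }

    /* Adapter exposing a plain JavaBean through the generic representation. */
    class BeanComponent implements GenericComponent {
        private final Object bean;

        BeanComponent(Object bean) { this.bean = bean; }

        private PropertyDescriptor descriptor(String name) throws Exception {
            for (PropertyDescriptor pd :
                    Introspector.getBeanInfo(bean.getClass()).getPropertyDescriptors()) {
                if (pd.getName().equals(name)) return pd;
            }
            throw new IllegalArgumentException("no such property: " + name);
        }

        public Object getProperty(String name) throws Exception {
            return descriptor(name).getReadMethod().invoke(bean);
        }

        public void setProperty(String name, Object value) throws Exception {
            descriptor(name).getWriteMethod().invoke(bean, value);
        }

        public Object invoke(String method, Object... args) throws Exception {
            for (Method m : bean.getClass().getMethods()) {
                if (m.getName().equals(method) && m.getParameterCount() == args.length) {
                    return m.invoke(bean, args);
                }
            }
            throw new NoSuchMethodException(method);
        }
    }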
Aspect-oriented connectors adapt independently developed components (semi-)automatically to establish interaction among them, thereby allowing software systems to be (re-)configured statically. UKA defined a corresponding component model consisting of components, ports, and connectors. Ports represent the interaction points of components and the expected interaction behaviour at these points. Connectors specify the interaction to be established by connecting ports. They also serve to identify the adaptations necessary in case of architectural mismatches, thus preserving the information-hiding principle with respect to the system composer. Exchanging a connector therefore triggers the corresponding adaptations, which are performed by program transformations. This approach makes it possible to abstract from the concrete connections found in source code, reconfigure them at the model level, and generate transformed source code. The model also enables (component-based) software development at the architectural level, detecting and bridging architectural mismatches, and refining the interaction semantics stepwise down to the implementation (source code) level. We have defined the semantics of our interaction model formally using the pi-calculus and implemented our approach in the COMPASS (COMPosition with ASpectS) tool. To construct our component model, we implemented analyses that identify components, their ports (interaction points and interaction properties), and connectors (concrete connections, especially interaction patterns) implemented in components. To detect interaction patterns we implemented corresponding analyses combining static and dynamic analysis. We also implemented transformations to reconfigure interactions, in particular a reconfiguration of a direct method-call interaction into communication via a buffer object.
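That last transformation can be pictured with a small, hypothetical Java sketch (BufferConnector, send, and pump are illustrative names, not COMPASS code): the caller's direct call is replaced by a put into a buffer object, and the callee is driven asynchronously from the buffer.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.function.Consumer;

    /* Hypothetical buffer connector: the transformation replaces the caller's
     * direct call "callee.process(item)" by "connector.send(item)". */
    class BufferConnector<T> {
        private final BlockingQueue<T> buffer = new LinkedBlockingQueue<>();

        /* Caller side: replaces the former direct method call. */
        void send(T item) throws InterruptedException {
            buffer.put(item);
        }

        /* Callee side: a daemon thread drains the buffer and invokes the callee. */
        void pump(Consumer<T> callee) {
            Thread worker = new Thread(() -> {
                try {
                    while (true) callee.accept(buffer.take());
                } catch (InterruptedException stop) {
                    /* connector shut down */
                }
            });
            worker.setDaemon(true);
            worker.start();
        }
    }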
For the first time, this result precisely defines the prevalent architectural style for web systems. This formalization will enable the construction of completely new design tools and optimisers for web systems. It will help web engineers raise the level of abstraction in web system design and move from an assembler-like programming style to high-level programming. This will simplify the future construction and maintenance of web systems enormously.
This result establishes interoperation between ARMINES-DI's EAOP tool for the dynamic manipulation of stateful aspects (see the corresponding result) and TUV's component framework VCF for the uniform handling of different industrial-strength components (see the corresponding result).
The Minerva framework serves as a test bed for the development and evaluation of future active document systems. To be as flexible as possible, Minerva's technology is based on component-oriented principles, so arbitrary data types may be embedded into documents. This feature is especially useful for documents that are used over long periods of time: the flexible technology can integrate new components even when the requirements change slightly. In addition, Minerva offers sophisticated mechanisms that aid the developer in realising non-linear navigation, annotation management, and hot-update capabilities, to name just a few. In a nutshell, Minerva can serve as a base for next-generation active document systems.
For the first time, this result precisely defines architectural styles for active documents, i.e., web systems and complex component-based documents. Based on these styles, better tools, libraries, and component models can be developed that will improve document engineering and reduce the number of future legacy systems on the web. The styles are invasive script-parameterised documents, invasive wizard-parameterised documents, transconsistent documents, and staged documents. Many existing web systems can be characterized as belonging to one of these styles, which will simplify their future construction and maintenance enormously.
The use cases of a system are often converted to packages and classes by using patterns. Due to the crosscutting behaviour that most patterns incorporate, the behaviour of the pattern is replicated over the packages and classes of the system. Similarly, when new requirements are identified during the realization of the system, the behaviour of the pattern is again replicated. There is a growing consensus that the aspects of software systems should be captured in the early stages and kept separated throughout the development life cycle of the system. By extending a pattern with an aspect specification that describes its crosscutting behaviour, it can be used to identify the aspects of the system; such patterns are called aspectual patterns. Aspectual patterns are, however, not sufficient to identify aspects, and this result therefore presents a pattern-based aspect identification method. Using the method, the roles that represent the aspects of the pattern are assigned to the classes of the system, thereby describing the superimposition of the pattern's behaviour on the classes of the system. This way of expressing aspects allows them to stay separated throughout the development life cycle of the system. By supporting the method with a tool, software engineers are adequately guided in identifying the aspects of a system. The tool is integrated with the commercially available UML CASE tool Rational Rose; it interprets Rose's UML designs to allow software engineers to relate the pattern to the design.
This methodology enables the end-user to edit component-based architectures of artefacts built with different component models. From design time over compile time and link time to runtime, all component models can be plugged into the generic visual component editor and edited uniformly. Hence, component models can not only be handled uniformly, they can also be edited uniformly. This result paves the way for replacing programming by composition and should simplify the construction of complex software artefacts and documents enormously in the future. It is the first technology worldwide for this purpose and gives Europe a leading edge in simplifying software and document engineering.
The Vienna Component Framework (VCF) supports the interoperability and composability of components across different component models, a facility that is lacking in existing component models. The VCF presents a unified component model, implemented by a facade component, to the application programmer. The programmer may write new components by composing components from different component models, accessed through the VCF. The model supports common component features, namely methods, properties, and events. To support a component model within the VCF, a plugin component is needed that provides access to that component model. Performance measurements of the VCF implementations for COM, Enterprise JavaBeans, CORBA distributed objects, and JavaBeans have shown that the overhead of accessing components through the VCF is negligible for distributed components.
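The following Java sketch shows the shape of this design with hypothetical names (ComponentModelPlugin, ComponentFacade), not the real VCF interfaces: the facade only routes uniform method, property, and event requests to whichever plugin is registered for a component model.

    import java.util.HashMap;
    import java.util.Map;

    /* Illustrative per-model plugin contract: each supported component model
     * (COM, EJB, CORBA, JavaBeans, ...) would contribute one implementation. */
    interface ComponentModelPlugin {
        Object invoke(Object component, String method, Object[] args) throws Exception;
        Object getProperty(Object component, String property) throws Exception;
        void addListener(Object component, String event, Runnable listener) throws Exception;
    }

    /* Facade presenting the unified model; it only dispatches to plugins. */
    class ComponentFacade {
        private final Map<String, ComponentModelPlugin> plugins = new HashMap<>();

        void registerModel(String modelName, ComponentModelPlugin plugin) {
            plugins.put(modelName, plugin);
        }

        Object call(String modelName, Object component, String method, Object... args)
                throws Exception {
            ComponentModelPlugin p = plugins.get(modelName);
            if (p == null) throw new IllegalArgumentException("unknown model: " + modelName);
            return p.invoke(component, method, args);
        }
    }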
Composing complex systems from reusable components is a challenge for end-users and developers alike. We have developed a multi-stage method for checking component composition at both the static and the dynamic level.
In the first stage we provide a method called Compression-Based Reasoning (CBR) to reason about expected or forbidden behaviour in component systems. Consistency rules (the specification) express, in the temporal logic CTL, dynamic dependencies and constraints (dynamic with respect to the control flow). Model checking is the underlying theory for the static verification and has to be adapted for use in component-based systems. The process is to verify given temporal specifications (formalising intended behaviour) against a model derived from the real component system. CBR aims to exploit characteristics of component systems (refinement, hierarchy, compression, piecemeal growth, and modularisation) to minimise both the gap between the model and the real component system and the state-explosion problem. CBR defines a set of concepts incorporated into the model-checking procedure, i.e., it applies formal model-checking verification to the domain of components. CBR builds on earlier approaches to feature interaction and aspect interaction. A subset of the concepts defined in CBR is realised in an appropriate model-checker implementation, the Component Verifier (CoV): state grouping (explicit states with their properties), explicit NOT (to increase the expressiveness of the modelling language with respect to design knowledge), and hierarchy and compression (allowing states to be composed from other states).
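To give a flavour of such a specification (the concrete input notation of CoV may differ), a typical CTL consistency rule demands that on every execution path a component's request is always eventually answered:

    AG( request -> AF answer )

Here AG quantifies over all reachable states, and AF requires that an answer eventually occurs on every path continuing from the request.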
The second stage of the multi-stage semantic component checker concentrates on distributed component systems. Composing distributed components correctly poses a challenge, since in general a composed component must guarantee transactions, i.e., the execution of simple services without interruption or interleaving with other services. Classic locking policies could lead to deadlocks or would even fail, as the simple services do not know that they are part of more complex ones. UKA proposes an approach to guarantee consistency and avoid deadlocks, based on dynamic protocol checking. Protocols are sequencing constraints that describe legal executions of component services, especially transactions. Component protocols are defined on top of the classical component interface; their explicit specification captures the constraints needed for composing components correctly. In our approach, we specify not only the provided protocols but also the required protocols, which solves the problem described above. A component system is consistent if its component protocols are satisfied. The checking of component protocols is based on a transaction technique that uses optimistic synchronization mechanisms to guarantee consistency and avoid distributed deadlocks. In addition, our technique provides a distributed solution for checking protocols, i.e., there is no central instance that controls the entire component system. UKA has developed a tool called CoTaP (Composition of Transactional Protocols) to check the composition of component protocols dynamically. CoTaP provides the protocol language TCPL (Transactional Component Protocol Language) to model component protocols and their transactional properties. From the component protocols, CoTaP generates the needed checker for the individual components.
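A minimal Java sketch of the underlying idea, with hypothetical names and without the transactional machinery (TCPL's actual notation is not shown here): a protocol is a finite automaton over service names, every call advances the automaton, and an illegal call order is rejected.

    import java.util.HashMap;
    import java.util.Map;

    /* Toy dynamic protocol checker: sequencing constraints as an automaton. */
    class ProtocolChecker {
        private final Map<String, Map<String, String>> transitions = new HashMap<>();
        private String state;

        ProtocolChecker(String initialState) { this.state = initialState; }

        ProtocolChecker allow(String from, String service, String to) {
            transitions.computeIfAbsent(from, k -> new HashMap<>()).put(service, to);
            return this;
        }

        /* Invoked before each component service executes. */
        synchronized void check(String service) {
            String next = transitions.getOrDefault(state, Map.of()).get(service);
            if (next == null) {
                throw new IllegalStateException(
                    "protocol violation: '" + service + "' not allowed in state " + state);
            }
            state = next;
        }
    }

    /* Example: an account whose protocol is open (deposit|withdraw)* close.
     *
     *   ProtocolChecker p = new ProtocolChecker("INIT")
     *       .allow("INIT", "open", "OPEN")
     *       .allow("OPEN", "deposit", "OPEN")
     *       .allow("OPEN", "withdraw", "OPEN")
     *       .allow("OPEN", "close", "CLOSED");
     *   p.check("open");     // ok
     *   p.check("deposit");  // ok
     *   p.check("open");     // throws: violates the protocol
     */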
Type-based adaptation is an adaptation technique that, unlike other techniques, supports the automated adaptation of component interfaces by relying on the component's type information, without requiring knowledge about the component's implementation. Software components are typically developed independently of each other; a consequence of this independent development is that component interfaces do not necessarily match and thus need to be adapted. Type-based adaptation solves this problem and thereby eases the software composition process.
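One way to picture the technique is a Java dynamic proxy. This is a hedged sketch, not the project's implementation: it matches methods by name and parameter types only, whereas full type-based adaptation can also bridge structurally compatible but differently shaped interfaces.

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Method;
    import java.lang.reflect.Proxy;

    /* From type information alone, build an adapter that forwards each call on
     * a required interface to a signature-compatible method of the component. */
    final class TypeBasedAdapter {
        @SuppressWarnings("unchecked")
        static <T> T adapt(Object component, Class<T> requiredInterface) {
            InvocationHandler handler = (proxy, method, args) -> {
                // Look up a matching method on the component; no knowledge of
                // the component's implementation is required.
                Method target = component.getClass()
                        .getMethod(method.getName(), method.getParameterTypes());
                return target.invoke(component, args);
            };
            return (T) Proxy.newProxyInstance(
                    requiredInterface.getClassLoader(),
                    new Class<?>[] { requiredInterface },
                    handler);
        }
    }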
RECODER is a framework for Java, C, and C# source code metaprogramming, aimed at delivering a sophisticated infrastructure for many kinds of analysis and transformation tools. The following points briefly describe RECODER's feature layers and the application perspectives each layer offers:
1. Parsing and unparsing of Java sources. In addition to abstract model elements, RECODER also supports a highly detailed syntactic model: no information is lost, and comments and formatting information are retained. The pretty printer is customisable and is able to reproduce the code (possibly improving upon it, but retaining given code structures) and to embed new code seamlessly. Possible applications: simple pre-processors, simple code generators, and source code beautification tools.
2. Name and type analysis for Java programs. RECODER performs name and type analysis, can evaluate compile-time constants, resolve all kinds of references, and maintain cross-reference information. Possible applications: software visualization tools, software metrics, lint-like semantic problem detection tools, design problem detection tools (anti-patterns), cross-referencing tools.
3. Transformation of Java sources. RECODER contains a library of analyses, code snippet generators, and frequently used transformations. Possible applications: pre-processors for language extensions, semantic macros, aspect weavers, source code obfuscation tools, compilers.
4. Incremental analysis and transformation of Java sources. Transformations change the underlying program model; for incremental and iterative use, this model has to be updated accordingly. Transformations have to take care of dependencies by updating their local data and resetting matching positions when necessary; RECODER, however, analyses change impacts on its model and performs updates automatically. Possible applications: source code optimisation, refactoring tools, software migration programs (smart patches), design pattern, cliché, and idiom synthesis, architectural connector synthesis, adaptive programming environments, invasive software composition.
ARMINES-DI has introduced event-based AOP (EAOP) and, in particular, achieved the following major results:
- First publication of a general, formally defined approach to AOP. This approach supports much more general aspect definitions than almost all existing ones (then and now).
- First publication of a static analysis technique for interactions among aspects. This result is important because interactions among aspects constitute probably the major problem of AOP.
- An EAOP model that includes aspect composition and dynamic aspect instantiation.
The theoretical foundation of EAOP has been developed, in particular by presenting the first technique for the static analysis of aspect interactions (published at GPCE'02). An extension of this analysis framework, including an explicit notion of aspect composition and conflict resolution using composition operators, has been developed and will be submitted in October 2003 to AOSD'04. The static analysis has been adapted to a component model with explicit protocols (PhD thesis of Andres Farias). The theoretical framework of EAOP has been applied to schedulers in operating systems (published at ASE'03; MSc thesis of Rickard Aberg).
Q-Labs have specified, designed, and realized a flexible system for dynamic composition at runtime, the ECDyn platform, reusing technologies from the partner ILOG and from UKA and implementing a kind of event-based aspect component composition. This platform may be used as a tool to support the main activities of the analysis and design workflow of the EASYCOMP process. Using this platform allows quick prototyping of functional and non-functional requirements inside a well-controlled development process. Furthermore, the design enforces separation of concerns between business rules and functional components, which is of great interest for traceability purposes. Q-Labs have begun a collaboration with the French company OXAND to explore new technologies and provide new classes of services on top of the ECDyn platform. The expected benefit is to speed up the development process without losing control; in particular, the acceptance phase will be improved by using the ECDyn platform itself to automate the tests. Used as a tool to support the EASYCOMP process, the ECDyn platform is a means to manage the trade-off between the flexibility of dynamic composition and keeping the process under control to mitigate risks.
Current environments for web application development focus on Java or scripting languages; developers who want to or have to use C or C++ are left with few options. To solve this problem, we have developed a C++ Servlet Environment (CSE) that provides a high-performance servlet engine for C++. One of the biggest challenges we faced while developing this environment was to come up with an architecture that provides high performance while not allowing a single servlet to crash the whole servlet environment, a serious risk in C++ application development. This work explains the requirements for such a servlet environment, the architecture we designed, and details of the implementation of the CSE. To allow developers to get familiar with the CSE, we have also designed a C++ servlet API and a syntax for C++ Server Pages that closely resemble those used by Java servlet environments. To illustrate the use of the CSE, we have implemented the Record Store, a sample web application, and on the basis of this example we describe how servlets can be developed using our environment. To demonstrate the benefits of the CSE, we evaluate its performance and compare it to that of other popular servlet environments.
Component-based hardware/software co-design (CBHSCD) is a new process for designing embedded systems using component-based methods. A typical embedded system consists of both hardware and software components, so the methodology has to support both. Moreover, the design of embedded systems is typically constrained in several non-functional respects: the design has to respect cost and timing requirements such as hard real-time constraints. In order to cope with the complexity of the designed system, the design methodology has to allow the designer to work at a sufficiently high level of abstraction. This means that the designer works with behavioural components, focusing mainly on functionality, while implementation issues are handled automatically. It also implies that components are handled regardless of whether they are implemented in hardware or in software. Furthermore, the methodology has to be supported by powerful tools that provide a high-level view of the design while hiding and automating lower-level implementation issues. CBHSCD defines a component as a functional unit. The composition of components is based on their functionality, which is captured by the interface of the component and completely decoupled from its implementation. It is possible to have more than one implementation for the same interface, and in particular a hardware implementation and a software implementation of the same interface. An important feature is hardware/software transparency, which means that a change between the two implementations is transparent to the rest of the system.
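The following Java sketch (all names hypothetical) illustrates hardware/software transparency: the rest of the design composes against the interface and never learns which implementation is bound.

    /* The interface captures the functionality, decoupled from implementation. */
    interface FirFilter {
        double[] filter(double[] samples);
    }

    /* Pure software implementation of the same functionality. */
    class SoftwareFirFilter implements FirFilter {
        private final double[] coeffs;
        SoftwareFirFilter(double[] coeffs) { this.coeffs = coeffs; }

        public double[] filter(double[] samples) {
            double[] out = new double[samples.length];
            for (int n = 0; n < samples.length; n++) {
                double acc = 0.0;
                for (int k = 0; k < coeffs.length && k <= n; k++) {
                    acc += coeffs[k] * samples[n - k];
                }
                out[n] = acc;
            }
            return out;
        }
    }

    /* Hardware-backed implementation; the body would drive an FPGA block. */
    class HardwareFirFilter implements FirFilter {
        public double[] filter(double[] samples) {
            // ... write samples to the device, start the block, read results ...
            throw new UnsupportedOperationException("requires the hardware block");
        }
    }

Swapping SoftwareFirFilter for HardwareFirFilter changes nothing for any client that depends only on FirFilter, which is exactly the transparency the methodology demands.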
The two basic laws of uniform composition are:

    software composition        = architecture + invasiveness + staging + anticipation
    active document composition = architecture + invasiveness + staging + transconsistency

The laws precisely capture the difference between software and active documents. Software, since it runs automatically, does not need transconsistency, but it needs anticipation of compositions from runtime to an earlier stage. Active documents, on the other hand, do not need anticipation, because they run in stages anyway; however, they need transconsistency because they are edited interactively. These laws could only be defined on top of the other principles for active document composition that EASYCOMP has discovered.
ARMINES-DI has developed a tool for EAOP in Java (published at LMO'03). The tool integrates UKA's RECODER to provide a general means of hooking event-based aspects to Java execution events. An integration of the EAOP tool with TUV's composition framework (VCF) has been designed, and its realization is to be finished by the end of October 2003. A first version of an optimised implementation of the current thread-based implementation, using an extension of existing transformations into continuation-passing style, has been developed.
The result of the e-commerce case study is a library of web components useful for composing e-shops, together with the know-how on how to combine these components effectively. The case study also includes a wizard that provides a quick start when building a shopping application. Using the component library, e-shops can be developed much more efficiently than by programming from scratch, while the component approach provides much higher flexibility than standard off-the-shelf software. The components support a product catalogue, shopping cart, personalization, administration of the product catalogue, web site menus, page layout, etc. The e-commerce case study was never meant to become a product itself; it was designed as a vehicle for testing composition technology. Nevertheless it works so well that H.E.I. has decided to commercialise the library. H.E.I. intends to use components of the library when developing web applications for customers and also plans to sell some of the components themselves. It is not yet clear whether the components will be a stand-alone product or will be included with another product.
The SWEDE system generates fast ontology checkers for OWL in Java. The generated checkers can be embedded into Java applications and work directly on the internal representation of the Java objects, a rather unique feature. SWEDE is available at http://www.the-swede-system.org.
A composition machine for web documents is a basic technology for composing web applications such as e-commerce sites or portals from reusable software components. In contrast to creating a web application from scratch, reusing components significantly reduces the development effort. This uniform composition machine provides composition across multiple component models, so components of different component models can be used in one application and the user is no longer restricted to the set of components belonging to one particular component model. The present uniform composition machine supports the heitml/RADpage component model and the JavaBeans/taglib component model. Through the VCF (another result of the EASYCOMP project), EJB, CORBA, COM, and SOAP components can additionally be accessed.
An essential idea of software engineering is that software should be systematically constructed using building blocks that we now call components. Many notions of component have been put forward, with varying degrees of expressiveness over the component lifecycle; hence, different means of composition are available at construction time, at assembly time, and at runtime. This diversity of notions and the differences in expressive power are major reasons for the current complexity of component-based software development. Our uniform composition model addresses this lack of homogeneity across the development stages. It integrates three composition models, which together provide expressive means of composition over all stages of the development lifecycle of a component. The three models are smoothly integrated in that artefacts constructed during earlier phases of the component lifecycle can be reused and manipulated during later phases. Furthermore, the model provides a common environment for the development of components for each of the three models. Finally, specific attention is paid to the integration of existing commercial off-the-shelf (COTS) components.
Transconsistency is a revolutionary principle for active and hypertext documents: it gives them the notion of an architecture with hot update. A typical active document contains scripts that derive parts of the document from base parts; a transconsistent architecture enables the immediate recomputation of all derived parts. As such, the principle generalizes the ubiquitous hypertext principle of transclusion. This result defines precisely what happens in many web systems when a user interactively develops a document or form. Until now this systematic principle had not been recognized, which is one reason why web systems are so complicated and hard to construct. The principle will enable the construction of completely new design tools and optimisers for web systems. It will help web engineers raise the level of abstraction in web system design and move from an assembler-like programming style to high-level programming. This will simplify the future construction and maintenance of web systems enormously.
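A hedged Java sketch of the principle (Part and DerivedPart are hypothetical names, not an EASYCOMP API): a derived part registers the base parts it is computed from, and editing a base part immediately recomputes every dependent, transitively, since derived parts are parts too.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Supplier;

    /* A document part whose edits trigger the hot update of derived parts. */
    class Part {
        private String content = "";
        private final List<DerivedPart> dependents = new ArrayList<>();

        String get() { return content; }

        void set(String newContent) {
            content = newContent;
            for (DerivedPart d : dependents) d.recompute();   // hot update
        }

        void addDependent(DerivedPart d) { dependents.add(d); }
    }

    /* A part computed by a script from base parts. */
    class DerivedPart extends Part {
        private final Supplier<String> derivation;

        DerivedPart(Supplier<String> derivation, Part... bases) {
            this.derivation = derivation;
            for (Part base : bases) base.addDependent(this);
            recompute();
        }

        void recompute() { set(derivation.get()); }
    }

    /* Example: a table of contents recomputed the moment the title changes.
     *
     *   Part title = new Part();
     *   DerivedPart toc = new DerivedPart(() -> "Contents: " + title.get(), title);
     *   title.set("Introduction");   // toc now reads "Contents: Introduction"
     */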
JPatch is a dynamic composition system for Java runtime components, relying on the dynamic loading of Java code on the fly. It is primarily intended for the maintenance of high-availability systems, such as telecommunication software. It can be found at http://www.jpatch.org.
This approach is based on a generic component model that unifies diverse component models; a composition language that supports the specification of applications composed of generic components; a mapping language that enables the application developer to define mappings from the generic components to components in a concrete component model; and a component extension mechanism that allows the application developer to extend the capabilities of one or more components to satisfy the requirements of a needed generic component. Using this approach, the developer defines an application configuration that may be instantiated for different component models and platforms. If a component is missing in a given component model or on a given platform, the developer can use the mapping and extension mechanisms to dynamically adapt an available component in order to instantiate the application. We have applied the approach to map components from JavaBeans, COM+, CORBA, and EJB. The approach may be applied in cross-platform development and in pervasive computing.
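A small Java sketch of the mapping idea, with entirely hypothetical names and identifiers: the application configuration refers to generic components, and a mapping binds each one to a concrete component of whatever model the target platform offers.

    import java.util.HashMap;
    import java.util.Map;

    class ComponentMapping {
        /* Binding of a generic component to a concrete one in a specific model. */
        record Binding(String componentModel, String concreteComponentId) {}

        private final Map<String, Binding> bindings = new HashMap<>();

        void map(String genericName, String model, String concreteId) {
            bindings.put(genericName, new Binding(model, concreteId));
        }

        Binding resolve(String genericName) {
            Binding b = bindings.get(genericName);
            if (b == null) {
                // Here the extension mechanism would kick in to adapt an
                // available component to the missing generic component.
                throw new IllegalStateException("no mapping for " + genericName);
            }
            return b;
        }
    }

    /* Example: the same application configuration instantiated for two platforms.
     *
     *   ComponentMapping desktop = new ComponentMapping();
     *   desktop.map("Spellchecker", "JavaBeans", "lingua.SpellcheckerBean");
     *
     *   ComponentMapping server = new ComponentMapping();
     *   server.map("Spellchecker", "EJB", "ejb/LinguaSpellchecker");
     */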
Q-Labs have defined an EASYCOMP process by customizing the RUP. The Unified Process is architecture-centric and promotes extensive use of iterations, which makes it a perfect basis for controlling the semi-continuous integration of components as defined in EASYCOMP. The process mainly extends the Unified Process in its analysis and design activities. It defines additional roles, activities, and artefacts such as "Component Library Administrator" and "EASYCOMP Component Model". New tools such as the "Ontology Checker" are also used in various steps of the process. The ECDyn platform can be used to support and speed up the design activities and acceptance tests. UT/CTIT have written a white paper about crosscutting concerns used to handle architectural mechanisms. The overall process is described on a complete web site. The process is currently deployed in the French company OXAND and will be used together with the ECDyn platform to speed up the development of new classes of services. Furthermore, when Q-Labs are faced with similar industrial contexts, we plan to use the EASYCOMP process as designed: to manage the trade-off between quick development/dynamic composition and keeping the process under control to mitigate risks.
The COMPOST Java framework (COMPOSiTion system) enables a programmer to define component models for static software components (Java), XML languages (XHTML and others), and runtime components (Java). Based on a generic component model, the UNICOMP set of interfaces, architectures of software and document artefacts can be written as composition programs using the library functions of the framework. Since the component model is generic and can be instantiated to new concrete component models, architectures can be reused. Hence, COMPOST's composition technology is truly uniform: it scales from design time over compile time and link time to runtime. As such, COMPOST is the first framework worldwide for the uniform composition of software and of active documents, i.e., documents that contain both data and software.
The XHTML refactoring tool allows for the refactoring of XHTML web sites. Refactoring means restructuring under the preservation of semantics, here: restructuring under the preservation of link consistency. Such a tool will be of primary importance for future web systems and active documents.
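As a toy illustration (assumed names; the real tool operates on a parsed XHTML model rather than raw text), renaming a page only counts as a refactoring if every link to it is rewritten as well, so that link consistency is preserved:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.stream.Stream;

    class RenamePageRefactoring {
        /* Move the page AND rewrite every href that points to it. */
        static void rename(Path siteRoot, String oldName, String newName) throws IOException {
            Files.move(siteRoot.resolve(oldName), siteRoot.resolve(newName));
            try (Stream<Path> files = Files.walk(siteRoot)) {
                for (Path page : (Iterable<Path>)
                        files.filter(f -> f.toString().endsWith(".xhtml")
                                       || f.toString().endsWith(".html"))::iterator) {
                    String text = Files.readString(page);
                    String fixed = text.replace("href=\"" + oldName + "\"",
                                                "href=\"" + newName + "\"");
                    if (!fixed.equals(text)) Files.writeString(page, fixed);
                }
            }
        }
    }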
