**Netcentric Virtual Laboratories for Composite Materials**

E. Dado, E.A.B. Koenders and D.B.F. Carvalho

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/48705

## **1. Introduction**

Physical laboratory experiments and testing have long been a way to develop fundamental research and learning knowledge in many areas of (civil) engineering education, science and practice. In the context of education, they have enriched engineering education by helping students to understand fundamental principles and to see the link between the theoretical equations in their textbooks and real-world applications. In the context of science, physical laboratory experiments have been used to scrutinize particular phenomena in a real-life setting or to verify and validate scientific computational models over a longer period of time. In both the educational and the research context, conducting physical laboratory experiments generally depends on complex and expensive laboratory infrastructures and requires a significant allocation of resources from the educational and research institutes. Moreover, results frequently have a limited range of exposure and are available to a relatively small audience (i.e. high costs versus relatively low benefits) [1]. In the context of practice, physical tests are often performed to validate the performance of products. With increasing regulations from national governments and the European Union (EU) concerning the quality, safety and environmental properties of products, the number of physical tests performed in the laboratories of certified (research) institutes has increased in recent years. For example, the European labels of conformity, known as CE markings, are a guarantee of quality and safety for products produced and sold in the EU. According to the CE conformity standards, products must undergo a series of performance tests before they can be sold on the EU market. These performance tests, however, entail additional costs, which may cause financial difficulties and, in the long run, competitive disadvantages for Small and Medium Enterprises (SMEs) in Europe [2].

To improve this situation, initiatives have been launched where research and development (R&D) projects could be conducted in so-called 'virtual laboratories'. In civil engineering,

© 2012 Dado et al., licensee InTech. This is an open access chapter distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

the National Institute of Standards and Technology (NIST) in the United States was the first to set the standard for R&D projects conducted in a virtual laboratory for composite materials [3]. From this initiative and the promises of emerging information and communication technologies, a whole new realm of possibilities for developing virtual laboratories has become available. Based on these observations and developments, the authors of this book chapter have initiated a number of R&D projects whose main focus was to explore the concept of virtual laboratories for cement-based materials in real-life settings. A number of (journal) papers about the findings of these projects have been published in the past [4-7]. The main focus of this book chapter is a relatively new concept for establishing virtual laboratories which is based on a 'netcentric' approach.


**Figure 1.** One of the results of the R&D projects conducted by the authors in the past.

## **2. The concept of netcentric virtual laboratories**

Virtual experiments (or virtual testing) are rapidly emerging as a key technology in civil engineering. Although some applications of virtual experiments other than those related to materials and components have been reported by a number of researchers, most effort has been put into the development of virtual laboratories for composite materials and components. In this respect, virtual experiments are often defined as the concept of making use of high-performance computers in conjunction with high-quality models to predict the properties and/or behavior of composite materials and components. Consequently, virtual laboratories are often seen as mere new terminology for computer simulation, which is a wrong assumption. Although it is true that computer simulation is an important tool for virtual experiments, it is only one of the key components that constitute a virtual laboratory. This can be explained by the limitations of existing composite material models. As discussed by Garboczi et al., an ideal model of a composite material or structure should be one that starts from the known chemical composition of the composite material [8]. Beginning with the correct proportioning and arrangement of atoms, the modeling effort would build up the needed molecules, then the nanostructure and microstructure, and would eventually predict properties at the macro-scale level. Such a fundamental, multi-scale material model, however, is still a long way off. Each existing material model has its own range of applicability and its own restrictions. Corresponding computational models and supporting computer tools are developed at a number of different research institutes worldwide. However, most existing virtual laboratories are set up as web applications that only provide access to a 'closed' virtual laboratory containing a number of (integrated) computational models and supporting computer tools.


A good example of a closed virtual laboratory is the Virtual Cement and Concrete Testing Laboratory (VCCTL) of the National Institute of Standards and Technology (NIST) in the United States. The main goal of the VCCTL project was to develop a virtual testing system, using a suite of integrated computational models for designing and testing cement-based materials in a virtual testing environment, which can accurately predict durability and service life based on detailed knowledge of starting materials, curing conditions, and environmental factors. In 2001, an early prototype (version 1.0) of the VCCTL became publicly available and accessible through the Internet. The core of this prototype was formed by the NIST 3D Cement Hydration and Microstructure Development Model (CEMHYD3D). Using the web-based interface of the VCCTL, one can create an initial microstructure containing cement, mineral admixtures, and inert fillers following a specific particle size distribution, hydrate the microstructure under a variety of curing conditions, and evaluate the properties (e.g. chemical shrinkage, heat release, and temperature rise) of the simulated microstructures for direct comparison with experimental data. As the VCCTL project proceeded, the prediction of rheological properties (viscosity and yield stress) of the fresh materials and of elastic properties (elastic modulus, creep, and relaxation) of the hardened materials was incorporated, resulting in the release of version 1.1 (the latest) of the VCCTL in 2003 [3].
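To make the workflow described above concrete, the following minimal Python sketch mimics the three VCCTL-style steps: creating a microstructure, hydrating it, and evaluating a property. All function names, the exponential hydration law and the numerical constants are invented purely for illustration and do not correspond to the actual VCCTL interface or to CEMHYD3D.

```python
import math

# Hypothetical names throughout -- this is NOT the real VCCTL API.

def create_microstructure(cement_fraction, filler_fraction):
    """Return a minimal 'microstructure' record (illustrative only)."""
    assert abs(cement_fraction + filler_fraction - 1.0) < 1e-9
    return {"cement": cement_fraction, "filler": filler_fraction,
            "degree_of_hydration": 0.0}

def hydrate(micro, days, temperature_c=20.0):
    """Toy hydration law: exponential approach to full hydration.
    The rate constant is an arbitrary illustrative value."""
    k = 0.05 * (1.0 + (temperature_c - 20.0) / 100.0)
    micro = dict(micro)
    micro["degree_of_hydration"] = 1.0 - math.exp(-k * days)
    return micro

def evaluate_heat_release(micro, total_heat_j_per_g=500.0):
    """Cumulative heat release taken proportional to the degree of
    hydration (a common first-order simplification, not a VCCTL result)."""
    return total_heat_j_per_g * micro["degree_of_hydration"]

m = create_microstructure(cement_fraction=0.7, filler_fraction=0.3)
m28 = hydrate(m, days=28)          # 28-day virtual curing
q = evaluate_heat_release(m28)     # J per gram of cement (toy value)
```

The point of the sketch is the pipeline shape (build, hydrate, evaluate, compare with experimental data), not the physics, which the real models resolve at the microstructure level.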

In order to cope with this particularity of distributed material (computational) models, the authors adopted a relatively new concept for establishing virtual laboratories which is based on a 'netcentric' approach. In this respect, a netcentric virtual laboratory is considered part of an evolutionary, complex community of people (users), devices, information (i.e. experimental data) and services (i.e. computational models and supporting computer tools) that are interconnected by the Internet. Optimal benefit from the available databases containing experimental data, and from the computational models and supporting computer tools that replace the physical laboratory equipment, is achieved via a distributed virtual laboratory environment that is accessible to students and researchers (see Figure 2).

**Figure 2.** Virtual Laboratory Environment populated by devices, users, computational models and computer tools, and databases with experimental data, which are interconnected by the Internet.

## **3. Overview of emerging and enabling technologies**

As discussed in the previous section, a virtual laboratory is no longer regarded as an isolated web-based application, but as a set of integrated devices (and supporting infrastructures), computational models, supporting computer tools, and databases containing experimental data that, used together, form a distributed and collaborative virtual laboratory environment for virtual experiments. Multiple, geographically dispersed (research) institutes will use this virtual laboratory environment to establish their own virtual laboratories, to perform experiments, and to share the results of their R&D projects. As discussed in [5], emerging and enabling technologies should fundamentally concern the integration (or interoperability) of devices, computational models, computer tools and data stores. In order to structure the discussion in this section, a conceptual scheme of the different levels of 'integration' is presented in Figure 3.

**Figure 3.** A Virtual Laboratory Environment requires an Integrated Infrastructure, which in turn requires Integrated Computational Models and Computer Tools, and Integrated Data.

### **3.1. Integrated infrastructure**

Concerning the issue of an 'integrated infrastructure', two enabling and emerging technologies should be mentioned: Cloud Computing and Grid Computing. According to Foster et al. [9], Grid Computing and Cloud Computing are closely related paradigms that share a lot of commonality in their goals, architecture, and technology. According to the National Institute of Standards and Technology (NIST), Cloud Computing (and to a large extent Grid Computing) can be defined as "a model for enabling convenient, on-demand

network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction" [10]. In addition, Grid Computing technology adds the concept of a 'virtual enterprise': a dynamic collection of institutes that come together in order to share hardware and software resources as they tackle common goals. As discussed in [3], shared access to these resources and their actual use is an inevitable condition for making multi-scale modeling successful.

Cloud Computing and Grid Computing also share some limitations, namely the inability to provide intelligent and autonomous services, the inability to address the heterogeneity of systems and data, and the lack of machine-understandable content [11]. Mika and Tummarello (2008) identified the root cause of these limitations as the lack of 'Web semantics' [12]. These limitations should be addressed at the service and data levels, as discussed in the next sections.

### **3.2. Integrated computational models & computer tools**

Traditionally, the programming environment at research institutes is dominated by (programming) languages such as SQL for storing, retrieving and manipulating data; Fortran, C, C++ and Java for implementing computational models; and HTML for developing web-based interfaces for end-users. Web Services are fundamentally concerned with the integration of computer programs, especially when the programs concerned are developed using different programming languages and operating platforms. Web Services standards and technologies offer a widely adopted mechanism for making computer programs work together.

Currently, the two main players on the web services market are Oracle (through its ownership of Sun), with the Java platform, and Microsoft, with the .NET platform; both agree on the core standards (e.g. SOAP, WSDL, UDDI and XML), but disagree on how to deliver the potential benefits of Web Services to their customers. The Simple Object Access Protocol (SOAP) is the standard for web services messages. Based on XML, SOAP lets web services exchange information over HTTP. In addition, WSDL (Web Services Description Language) is an XML-based language for describing web services and how to access them. UDDI (Universal Description, Discovery, and Integration) is an XML-based registry that allows web services to list themselves on the Internet. XML (eXtensible Markup Language) has replaced HTML as the de facto standard for describing, defining and sharing data on the Internet. The most important advantages of XML are: (1) its separation of definition (content) and representation (mark-up), and (2) its ability to support the development and use of domain-specific XML vocabularies. Using the web service standards SOAP, WSDL and UDDI makes computational models and supporting computer tools web services that can be accessed, described and discovered. Using XML as a basis for sharing data on the Internet will solve the interoperability problems at the data level, as described in the next section.
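As a small illustration of the SOAP message format just described, the sketch below builds and parses a SOAP 1.1 envelope using only the Python standard library. The service namespace and the `GetElasticModulus` operation are hypothetical examples, not an existing web service; a real client would additionally consult the service's WSDL description to discover the operations and message types.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"  # SOAP 1.1 envelope
LAB_NS = "http://example.org/MaterialLab"              # hypothetical service

def build_request(mix_id):
    """Serialize a SOAP request for a hypothetical GetElasticModulus
    operation: an Envelope containing a Body with the operation element."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{LAB_NS}}}GetElasticModulus")
    ET.SubElement(op, f"{{{LAB_NS}}}MixId").text = mix_id
    return ET.tostring(envelope, encoding="unicode")

def parse_mix_id(xml_text):
    """Extract the MixId parameter back out of a serialized request."""
    root = ET.fromstring(xml_text)
    return root.find(f".//{{{LAB_NS}}}MixId").text

msg = build_request("C30/37-slag-01")  # hypothetical mix identifier
```

In practice such an envelope would be POSTed over HTTP to the service endpoint; the sketch only shows the XML structure that SOAP standardizes.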


## **3.3. Integrated data**

As discussed earlier, one of the main causes of the limitations shared by Cloud Computing and Grid Computing is the lack of 'web semantics'. Using XML will not solve the problem entirely; it has its own limitations. The limitations of XML were addressed by the introduction of the Semantic Web in 2004. Driving the Semantic Web is the organization of content using specialized vocabularies, referred to as 'ontologies'. In this respect, an ontology is a collection of concepts (or terms) and constructs used to describe a particular knowledge domain. Built upon RDF (Resource Description Framework) and XML, and derived from DAML+OIL, OWL (Web Ontology Language) has become the default standard for creating ontologies.

Building ontologies for describing and exchanging data between computational models and supporting computer tools in a virtual laboratory environment will result in a number of different ontologies. In this respect, three different types of ontologies can be distinguished: (1) high-level (reference) ontologies, which hold common concepts that apply to all knowledge domains and high-level constructs for defining the relationships between these knowledge domains; (2) knowledge-domain (reference) ontologies, which hold the concepts and constructs that are common within one specific knowledge domain (i.e. referring to the different macro-, meso-, micro- and nano-scale levels that exist in materials research); and (3) application ontologies, which hold detailed information about concepts and constructs and form the basis for sharing data between a 'group' of computational models and supporting computer tools on the Internet. From a modeling point of view, each application ontology is an extension of one or more knowledge-domain ontologies, while each knowledge-domain ontology is an extension of one or more high-level ontologies. Together, they form a so-called 'ontology network' that evolves over time.
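The three ontology types and their 'extends' relations can be sketched as a tiny ontology network. The ontology names and concepts below are invented purely to illustrate the structure; a real implementation would express them in OWL rather than Python dictionaries.

```python
# Illustrative ontology network: names and concepts are hypothetical.
ontologies = {
    # (1) high-level reference ontology
    "core": {"type": "high-level", "extends": [],
             "concepts": ["Material", "Property", "Experiment"]},
    # (2) knowledge-domain reference ontologies (one per scale level)
    "micro-scale": {"type": "domain", "extends": ["core"],
                    "concepts": ["Microstructure", "Hydration"]},
    "macro-scale": {"type": "domain", "extends": ["core"],
                    "concepts": ["CompressiveStrength", "ElasticModulus"]},
    # (3) application ontology shared by a group of models
    "lab-app": {"type": "application",
                "extends": ["micro-scale", "macro-scale"],
                "concepts": ["HydrationModelInput"]},
}

def ancestors(name):
    """All ontologies a given ontology transitively extends."""
    seen = set()
    stack = list(ontologies[name]["extends"])
    while stack:
        parent = stack.pop()
        if parent not in seen:
            seen.add(parent)
            stack.extend(ontologies[parent]["extends"])
    return seen
```

The transitive `ancestors` relation is what makes the network evolvable: a new application ontology can join by extending existing domain ontologies without modifying them.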

## **4. A multi-scale modeling approach for concrete materials**


The development of a virtual laboratory for construction materials requires many years of research to find out the basic principles with which a virtual laboratory should comply, and also to find out the conditions under which a virtual laboratory would be attractive to researchers, students and people from industry [13]. At the Delft University of Technology, the first trials focused on the virtual testing of a concrete compressive strength test, where the hydration conditions and the fracture behavior were evaluated with emphasis on the upscaling of the simulation model results.
In physical concrete laboratories, the compressive strength is determined with an experimental device in which a concrete cube is positioned between two steel plates and compressed by a hydraulic force (Figure 4, left). The force imposed on the concrete cube is increased until failure occurs. Later, the focus was widened to other concrete properties such as tensile strength (Figure 4, right), elastic modulus, hydration temperature, etc.

**Figure 4.** Experimental testing device for concrete compression (Left) and tension (Right).

In a virtual laboratory, testing procedures and methods have to be mimicked using computer simulation models. These computer models are applied to simulate the different characteristics of cement-based materials and produce results that may be validated against experimental data. Especially for heterogeneous materials such as concrete, characteristics are modeled at different scale levels, requiring the modeling approach to be multi-scale. This means that models operating at different length scales have to communicate with each other and exchange information by means of parameter passing, which requires upscaling algorithms. In general, the following scale levels can be distinguished, each associated with the characterization of specific material properties (Table 1).




| Scale | Property | Size range [m] |
|-------|----------|----------------|
| Macro | Rheology / Mechanical / Cracking / Volume stability / Durability | 10⁻¹ – 10² |
| Meso | Compressive and tensile strength / Fracture energy and toughness | 10⁻⁵ – 10⁻¹ |
| Micro | Hydration / Chemistry / Pore pressure / Permeability | 10⁻⁶ – 10⁻⁵ |
| Nano | C-S-H analysis / Calcium leaching / CH / Ca/Si ratio / Al/Si ratio | 10⁻¹⁰ – 10⁻⁶ |

**Table 1.** Overview of modeling scales, properties and size ranges.

The input for the different scale-level models consists either of direct user input or of input obtained from a lower or higher scale level. This leads to a refinement of the simulation predictability and to a system with increased synergy. A virtual laboratory can help to facilitate this linking of models and provides the opportunity for other partners to link their models as well, leading to an overall modeling platform for computational materials design (see Figure 5).

**Figure 5.** Schematic representation of the modeling levels.

#### **4.1. Macro-scale level**

The macro-scale level is the level at which full-scale structures are designed and calculated. Relevant concrete-related issues that have to be known at this scale level are listed in Table 1. At the macro-scale level, finite element models (FEM) are very often used to simulate the structural response of systems under static and/or dynamic actions, but also to simulate the early-age hardening behavior of freshly cast material. Simulation models are used to avoid cracking during hardening. Practical problems are often related to the mix design and to the climatic conditions under which hardening takes place. Design and engineering of concrete structures can therefore be conducted with macro-scale FEM models like DIANA [14], ANSYS [15], ABAQUS [16] or FEMMASSE [17], where the latter is especially designed for early-age analysis of concrete structures. In particular when considering the early-age behavior, the development of the material properties becomes very relevant. FEM models need to receive this information as input; alternatively, simple mathematical formulas or empirical functions are developed that can be fitted to experimental data and, with this, are able to predict the material behavior. The disadvantage of this approach is that expensive and time-consuming physical laboratory experiments still have to be conducted to fit and validate the models. With these FEM models, the stress and strength development can be calculated for full-scale concrete structures during the early stage of hardening. For accurate assessments, the model requires input on the structure's geometry and formwork characteristics, the ambient conditions and the thermo-mechanical properties of the hardening mix. From these inputs, the model calculates the temperature field and, from this, the development of the tensile stresses that occur as a result of internal and external restraint (Figure 6). Important material properties herein are the development of the tensile strength, the elastic modulus and the relaxation coefficient. All these properties depend on the mix composition and on the hardening conditions, and can be expressed as a function of the degree of hydration.
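The last point can be made concrete with a commonly used power-law relation between a strength-type property and the degree of hydration (a De Schutter-type degree-of-hydration law). This is a hedged sketch: the functional form is a standard choice in the literature, and the parameter values below are purely illustrative.

```python
def property_development(alpha, alpha0, final_value, exponent=1.0):
    """Strength-type property as a function of the degree of hydration alpha,
    using the power-law form P(alpha) = P(1) * ((alpha - alpha0)/(1 - alpha0))**a.
    Below the percolation threshold alpha0 no connected microstructure
    exists yet, so the property is taken as zero."""
    if alpha <= alpha0:
        return 0.0
    return final_value * ((alpha - alpha0) / (1.0 - alpha0)) ** exponent

# Illustrative values: final tensile strength 4.0 MPa, threshold alpha0 = 0.25.
f_t = [property_development(a, alpha0=0.25, final_value=4.0) for a in (0.2, 0.5, 1.0)]
```

A macro-scale FEM model can evaluate such a relation at every integration point and time step, so that the evolving strength is always consistent with the locally computed degree of hydration.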

**Figure 6.** Temperature and stress field during early hydration of a hardening concrete wall cast on an already hardened slab [17].

With the calculated stress field and the tensile strength development known at any location in the structure, the probability of (macro) crack occurrence can be calculated using statistical methods. Cracking will occur if the calculated stress exceeds the strength, accounting for the scatter in the material properties. Accurate crack predictions therefore also require an accurate description of the material properties. Experience-based models, databases or numerical simulation models operating at a more detailed scale level can be used for this. More fundamental models, based on thermo-mechanical-physical mechanisms operating at an increased level of detail, can also be applied. In the next sections, models operating at the meso-, micro- and nano-scale levels are discussed, with emphasis on the increased level of detail of the simulations, aiming to improve the prediction accuracy of the models that operate at the macro-scale level.
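A minimal sketch of such a statistical crack check, under the simplifying assumption (introduced here, not prescribed by the chapter) that stress and strength are independent and normally distributed:

```python
import math

def crack_probability(stress_mean, stress_std, strength_mean, strength_std):
    """Probability that stress exceeds strength when both are modeled as
    independent normal variables. The safety margin M = strength - stress
    is then also normal, and cracking corresponds to M < 0."""
    margin_mean = strength_mean - stress_mean
    margin_std = math.hypot(stress_std, strength_std)
    z = -margin_mean / margin_std
    # Standard normal CDF evaluated via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative values in MPa: stress 2.0 +/- 0.4, strength 3.0 +/- 0.3.
p = crack_probability(2.0, 0.4, 3.0, 0.3)
```

Evaluating this at every location in the calculated stress field gives a probability-of-cracking map rather than a single deterministic verdict, which is exactly why the scatter of the material properties matters.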

#### **4.2. Meso-scale level**


To increase the level of detail of the numerical schemes with the objective of simulating fracture propagation processes in concrete, lattice models can be applied. In order to predict the ultimate capacity of a virtual concrete sample by simulating its cracking pattern, a fracture mechanics model is required that can handle the crack propagation of the material under load. A model that can be adopted in this respect is the Delft Lattice model, originally developed by Schlangen and Van Mier in 1991 [18]. The model simulates fracture processes by mapping a framework of beams onto a material's meso-structure. The basic principles of the Lattice model are schematically shown in Figure 7, in which (a) is a schematization of a regular type of framework mesh that can be used to simulate the meso-level structure of brittle materials such as concrete.
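The core of a lattice analysis can be summarized as a remove-and-resolve loop: at each step the beam with the highest stress-to-strength ratio is taken out and the elastic problem is solved again. The sketch below stubs out the solver and is a simplified illustration of this loop, not the Delft Lattice implementation.

```python
def critical_beam(stresses, strengths):
    """Index of the lattice beam with the highest stress-to-strength ratio."""
    ratios = [s / f for s, f in zip(stresses, strengths)]
    return max(range(len(ratios)), key=ratios.__getitem__)

def simulate_fracture(solve, strengths, max_removals):
    """Skeleton of the lattice fracture loop: `solve(removed)` stands in
    for the linear elastic analysis and returns the beam stresses given
    the set of beams removed so far. Each step removes the most critical
    beam and re-solves, tracing a failure path through the mesh."""
    removed = set()
    history = []
    for _ in range(max_removals):
        stresses = solve(removed)
        history.append(critical_beam(stresses, strengths))
        removed.add(history[-1])
    return history

# Toy three-beam example: removed beams carry no stress.
history = simulate_fracture(
    solve=lambda removed: [0.0 if i in removed else s
                           for i, s in enumerate([3.0, 5.0, 2.0])],
    strengths=[1.0, 1.0, 1.0],
    max_removals=2,
)
```

The sequence of removed beams is the simulated crack pattern; summing the elastic energy released at each removal would yield the fracture energy mentioned below.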


**Figure 7.** Principle of the Lattice model [18]; a) Lattice framework, b) Lattice beams with action forces and displacements indicated.

For composite materials in particular, the meso-level structure explicitly reflects the schematization of the paste phase, the Interfacial Transition Zone (ITZ) and the aggregate. This schematization fits very well with the level of detail required for the compressive stress calculations inside a virtual laboratory. The model should be able to detect failure paths through the material (weakest links) and to calculate the accompanying ultimate strength of the building material from them. Once a failure path has been initiated, the inner structure of the material starts to disintegrate and the strength capacity reaches its maximum. After this maximum strength level, a descending branch follows that represents the post-peak behavior of the material. The Lattice model is capable of calculating this part of the failure trajectory and of quantifying the fracture energy of the failure behavior as well. For conventional concretes, the Interfacial Transition Zone (the weak bonding zone around the aggregates) is almost always the weakest part of the material, initiating and contributing to the failure paths (Figure 8). For higher-quality concretes, the failure paths may cross through the aggregate particles, which implicitly affects the brittleness of the material. A proper compressive strength model within a virtual laboratory should therefore implicitly deal with these different kinds of failure mechanisms, related to the mix composition in general and the inner microstructure of the material in particular.

**Figure 8.** Left: Concrete crack pattern after loading. Right: Lattice simulation [18].

#### **4.3. Micro-scale level**

For the simulation of the evolving cementitious microstructure that forms the fundamental basis for the development of the material properties, the hydration model Hymostruc can be used [19,20]. After mixing, hardening commences and the material properties start to develop. This process leads to a set of properties that is unique for every particular type of cement-based material. The Hymostruc model (Figure 9, left) can be applied to predict the actual state of the material properties, with the degree of hydration as the basic parameter. The model calculates the hardening process of cement-based materials as a function of the water-cement ratio, the reaction temperature, the chemical cement composition and the particle size distribution of the cement. It calculates the inter-particle contacts by means of the 'interaction mechanism for expanding particles' (Figure 9, right), in which hydrating particles become embedded in the outer shell of larger hydrating particles. This mechanism provides the basis for the formation of a virtual microstructure which, in turn, can be considered the backbone of the evolving strength capacity of the material. In the virtual laboratory, the Hymostruc model operates at the micro-scale level and is used to calculate the internal microstructure that is necessary to simulate the development of the compressive and tensile strength, the elastic modulus and other microstructure-related properties. The microstructure of the material can be considered the morphology-based inner structure of the paste, i.e. the 'glue' that ties together the aggregate particles and/or other composite phases, such as fibers and fillers, inside a composite material. Failure of the paste structure therefore strongly depends on the strength characteristics of the internal bonds in the microstructure of the paste. Modeling the morphologies of these bonds in terms of their chemo-physical nature has to be resolved at the nano-scale level. The development of the properties of the C-S-H gel, therefore, is a scale level that has to be considered as well.
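The embedding idea can be illustrated with a toy calculation. This is not the actual Hymostruc formulation: the simplified shrinking-core picture and the assumed expansion factor of 2.2 (hydration product occupying roughly 2.2 times the reacted volume) serve only to show how an expanding outer shell can envelop nearby smaller particles.

```python
import math

def expanded_radius(radius, alpha, expansion=2.2):
    """Outer radius of a hydrating spherical particle: the reacted volume
    fraction `alpha` turns into hydration product occupying `expansion`
    times its original volume (simplified shrinking-core picture)."""
    return radius * (1.0 - alpha + expansion * alpha) ** (1.0 / 3.0)

def is_embedded(big_radius, big_pos, small_radius, small_pos, alpha):
    """True once a small particle lies inside the expanding outer shell
    of a larger one, i.e. it becomes part of the load-bearing structure."""
    return math.dist(big_pos, small_pos) - small_radius < expanded_radius(big_radius, alpha)
```

Sweeping `alpha` from 0 to 1 over a packed particle system would show more and more small particles becoming embedded, which is the geometric mechanism behind the formation of the virtual microstructure.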

#### **4.4. Nano-scale level**


Nano-scale modeling has benefited from an enormous increase in attention from the research community, with the aim of modeling the chemical and physical processes of the Calcium-Silicate-Hydrates (C-S-H gel) that form the fundamental elements of the hydration products of cementitious materials [21]. Characterizing the material performance at this particular scale level calls for modeling the fundamental processes using molecular dynamics principles. For cement-based materials in particular, emphasis has to be on the characterization of the basic building blocks of the C-S-H nano-structure that operate at the sub-micro scale level. This intermediate scale level between the nano- and micro-scale levels enables a modeling approach that bridges the gap between the micro and nano levels and enables an exchange of fundamental material properties (Figure 10).



**Figure 9.** Left: 3D virtual microstructure simulated with Hymostruc. Right: Hymostruc interaction mechanism for expanding particles representing the formation of structure of the virtual microstructure [19,20].

**Figure 10.** Structural model that describes the atomic scale of the C-S-H gel [21].

#### **4.5. Up-scaling**

The development of numerical algorithms that allow information from one scale level to be used at other scale levels is the most challenging part of multi-scale modeling. Bridging the length scales between the nano-scale level and the macro-scale level requires upscaling models that span 10 orders of magnitude (see Table 1). In this approach, the nano-scale level forms the basis of the multi-scale framework. The output properties calculated at a particular scale level form the input at a higher scale level. This approach enables the analysis and design of composite materials starting from the fundamental nano-scale level while evaluating the results at the full-scale macro level. It opens the door to tailor-made design of composite materials and is a first step towards a property-defined modeling approach. A virtual laboratory is an excellent vehicle to achieve this.
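The parameter-passing chain described above can be sketched as a pipeline in which each scale-level model consumes the accumulated parameter set and contributes its own outputs. The model stand-ins, parameter names and numbers below are purely illustrative assumptions, not values from the chapter.

```python
def run_multiscale(models, seed_input):
    """Chain scale-level models from nano to macro: the outputs of each
    model are merged into a shared parameter set that feeds the next."""
    state = dict(seed_input)
    for model in models:
        state.update(model(state))
    return state

# Illustrative stand-ins for nano-, micro-, meso- and macro-scale models.
nano = lambda p: {"csh_stiffness": 30.0}                 # C-S-H stiffness [GPa]
micro = lambda p: {"paste_E": 0.6 * p["csh_stiffness"]}  # paste modulus [GPa]
meso = lambda p: {"f_c": 2.5 * p["paste_E"]}             # strength [MPa]
macro = lambda p: {"crack_risk": p["f_c"] < 50.0}        # structural check

result = run_multiscale([nano, micro, meso, macro], {"w_c": 0.5})
```

Because every model only reads from and writes to the shared parameter set, independently developed models can be chained as long as they agree on the parameter names, which is precisely what the ontologies discussed earlier are meant to guarantee.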

## **5. Virtual laboratory prototype**


In 2012, the first prototype of a virtual laboratory was developed at Delft University of Technology. In this section, the system development rationale and architecture are presented, including screenshots of the prototype created to demonstrate the idea of a web-based virtual laboratory. Because the existing computational models and supporting computer tools were implemented in the traditional way in the past, the demonstrated prototype does not fully operate according to the netcentric approach discussed above. The main purpose of this prototype is to support the multi-scale modeling approach for concrete materials in an integrated web-based environment. From this, it follows that the prototype should provide a complete environment for multi-scale experimentation, easing the study of composite materials.

From the point of view of the final users, the prototype should aid and support them in their experiments, relying on fast execution of the best existing simulation models. Since the system's user interface is based on the functionalities available in the computational (simulation) models and supporting computer tools, the final user needs to know about the computational models to be able to work with them. However, users do not want to care where these computational models are running; they want to focus on their educational and research applications instead of on the technology required to provide for their needs. For instance, the system should relieve users of the burden of checking simulation module availability, installation and execution; take care of the issues regarding combining modules to perform multi-scale experiments; and carry out other system administration duties as well. From the point of view of the computational model creators, the virtual laboratory should be open, so that new computational models can be plugged in, adding new services and functionalities to the existing ones. The computational models must exchange data, making possible the execution of a multi-scale experiment based on different, independently developed computational models (i.e. a mashup). Therefore, there must be a platform that works as an open ecosystem, populated with simulation models that are created and executed independently but coordinated by the final users through an interface that offers multi-scale experiments on composite materials. With this purpose, the system architecture was created based on two principal modules: the user interface module (frontend) and the simulation modules (backend). This modular organization decouples the user interface from the simulation modules. The deployment diagram of the virtual laboratory system is presented in Figure 11, showing the frontend module as the Graphical User Interface (GUI) component and the backend module as the model execution controller, which relies on a Grid infrastructure to execute the computational models.

The virtual modeling laboratory contains a rich-interface web application developed using modern web technologies such as HTML5, CSS3 and JavaScript. The backend is defined as a set of web services developed as a RESTful API and accessed through AJAX calls. An extensive set of toolkits and frameworks is used in the final implementation. The GUI foundation is based on the Twitter Bootstrap library for UI building and on the Backbone.js toolkit for organizing the JavaScript code, both fully based on the jQuery library. The backend was developed using the Play framework. Although the basic application architecture is defined as a two-layer system of frontend and backend, the backend layer is composed of a set of different computational modules that can be seen as independent web services. The module executions are coordinated by the overall controller service, but the composition architecture is more complex than this suggests. Different researchers and developers, distributed over the Internet, can make their computational models available through different interfaces. In addition, computational models can also use results from other computational models.
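The dispatching role of the backend controller can be sketched in Python. This is a hedged stand-in: the actual backend uses the Play framework, and the endpoint path and the registered model below are hypothetical, not the DelftCode API.

```python
import json

class ModelExecutionController:
    """Sketch of the backend controller: computational models register as
    named services, and the controller dispatches REST-style requests."""

    def __init__(self):
        self.services = {}

    def register(self, name, handler):
        """Plug in a computational model as a named web service."""
        self.services[name] = handler

    def handle(self, path, body):
        """Dispatch e.g. POST /models/hymostruc with a JSON body and
        return an HTTP-like (status, payload) pair."""
        name = path.rpartition("/")[2]
        if name not in self.services:
            return 404, json.dumps({"error": "unknown model"})
        result = self.services[name](json.loads(body))
        return 200, json.dumps(result)

controller = ModelExecutionController()
# Hypothetical service: degree of hydration after a number of curing hours.
controller.register("hymostruc", lambda p: {"alpha": min(1.0, p["hours"] / 100.0)})
status, payload = controller.handle("/models/hymostruc", '{"hours": 24}')
```

Because each model is only a named handler behind the controller, independently developed models can be added or replaced without touching the frontend, which is the openness requirement described above.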


**Figure 11.** The deployment diagram of the virtual laboratory at Delft University of Technology.

As discussed earlier, the prototype relies on a Grid infrastructure to execute the computational models. This infrastructure is distributed over different institutions, because each computational model owner should be responsible for the execution and availability of the model as a web service. For this purpose, a Grid infrastructure should be available to support the whole platform. As discussed earlier, other emerging technologies such as Cloud Computing can supply on-demand computational resources to run the computational models.

In order to transfer data between the different modeling scale levels, multi-scale modeling principles have to be considered as well. Apart from how the data is transferred from one scale level to another, the most challenging part is how to connect the levels from a modeling point of view. Bridging the scale levels can go along with a transfer of data only, by means of parameter passing, with a more complicated integration of models that operate at different scale levels, or with a combination of both. Either way, bridging the scale levels is intensive modeling work that requires significant effort from both a material properties and a modeling point of view. With the virtual modeling lab, the user can choose at which level he or she wants to start the numerical experiments (Figure 12). The lab is organized in such a way that data can be calculated at a certain scale level and, when the user decides to proceed with an analysis at another scale level, the data can be carried over.

**Figure 12.** DelftCode: Selection of scale level.

The pilot version of the virtual laboratory at the Delft University of Technology (referred to as DelftCode) provides output (results) represented as graphs in the GUI. In addition, each simulation of a computational model at a certain modeling scale produces specific parameter output that is managed by the DelftCode framework in such a way that it can be used as input for computational models at other scale levels (Figure 13). In this way, active multi-scale modeling can be conducted and the results can be reused at other scale levels. In the DelftCode framework, after conducting numerical experiments the user can switch to other scale levels, where the same procedure is followed. Since data is stored and available for all scale levels, the user can access data from previous numerical experiments and reuse it for other experiments.

**Figure 13.** DelftCode: Model setup and result outputs.

**Figure 12.** DelftCode: Selection of scale level.

240 Composites and Their Properties

final implementation. The GUI foundation is based on the Twitter Bootstrap library for UI building and on the Backbone.js toolkit for organizing the JavaScript code, and being fully based on the jQuery library. The backend was developed using the Play framework. Although the basic application architecture is defined as a two-layered system – frontend and backend, the backend layer is composed by a set of different computational modules that can be seen as independent web services. The modules executions are managed coordinated by the overall controller service, but the composition architecture is much more complex than what is exposed. Different researchers and developers, distributed over the Internet, can make their computational models available through different interfaces. In addition, the computational models can also use results from other computational models.

**Figure 11.** The deployment diagram of the virtual laboratory at Delft University of Technology.

can supply on-demand computational resources to run the computational models.

As discussed earlier, the prototype relies on a Grid infrastructure to execute the computational models. This infrastructure is distributed over different institutions, because each computational model owner should be responsible for the model execution and availability as a web service. For this purpose, a Grid infrastructure should be available to support the whole platform. As discussed earlier, other new emerging technologies such as Cloud Computing

In order to be able to transfer data between the different modelling scale levels, multi-scale modelling principles have to be considered as well. Apart from how the data is transferred from one scale level to another, the most challenging part is how to connect the levels from a modelling point of view. Bridging the scale levels can go along with transfer of data only by means of parameters passing or by means of a more complicated integration of models that operate at different scale levels or a possible combination. In either way, bridging the scale levels is an intensive modelling work that requires significant effort both from materials properties as well as from a modelling point of view. With the virtual modelling lab, the The pilot version of the virtual laboratory at the Delft University of Technology (referred to as DelftCode*)* provides output (results) which are represented as graphs in the GUI. In addition, each simulation of computational model at a certain modelling scale produces specific parameter output that is managed by the DelftCode framework in such a way that it can be used as input for computational models at other scale levels (Figure 13). In this way active multi-scale modelling can be conducted and the results can be reused at other scale levels. The way how it is implemented in the DelftCode framework is that after conducting numerical experiments the user can switch to other scale levels. For the other scale levels the same procedure is followed. Since data will be stored and available for all scale levels, the user can access data of previous numerical experiments and reuse it for other experiments.

**Figure 13.** DelftCode: Model setup and result outputs.
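The parameter passing between scale levels described above can be sketched as a small mapping step: the outputs of a finished lower-scale run are translated into the inputs of the next scale's model. All parameter names in this sketch are hypothetical and not taken from the actual models.

```javascript
// Declare which outputs of the micro-scale model feed which inputs of
// the meso-scale model (illustrative names only).
const microToMeso = {
  degreeOfHydration: "hydrationDegree",
  porosity: "matrixPorosity",
};

function bridgeScales(outputs, mapping) {
  // Translate stored results into the input set for the next scale level;
  // outputs without a mapping entry are simply not carried over.
  const inputs = {};
  for (const [fromKey, toKey] of Object.entries(mapping)) {
    if (fromKey in outputs) {
      inputs[toKey] = outputs[fromKey];
    }
  }
  return inputs;
}

// Example: reuse stored micro-scale results as meso-scale input.
const microResults = { degreeOfHydration: 0.72, porosity: 0.13, heatRelease: 310 };
const mesoInputs = bridgeScales(microResults, microToMeso);
console.log(mesoInputs); // { hydrationDegree: 0.72, matrixPorosity: 0.13 }
```

Keeping the mapping declarative means that adding a new scale level, or a new model at an existing level, only requires adding a new mapping table rather than changing the bridging logic.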

At the time of writing, the prototype only supports computational models at the micro and meso scales, but it has been designed and implemented to support all scale levels. Figure 14 shows all the proposed user interactions in the prototype. These interactions illustrate the possibilities of the developed prototype, which acts as a multi-scale modeling platform. The architecture of the prototype is designed in such a way that future extensions, in terms of adding new models or adding another scale level, can easily be achieved.

**Figure 14.** The activity diagram of the user experiments at all scales.

## **6. Conclusion and discussion**

This chapter describes the development of the concept of so-called virtual laboratories based on a netcentric approach. In this context, a netcentric virtual laboratory is considered part of an evolutionary, complex community of people (users), devices, information (i.e. experimental data) and services (computational models and supporting computer tools) that are interconnected by the Internet. A large number of emerging and enabling technologies that form the technological basis for establishing such netcentric virtual laboratories have been discussed. Although the developed prototype is hampered by the fact that most computational models and supporting computer tools have been developed using traditional programming languages and platforms, it shows the enormous potential of the netcentric approach advocated in this chapter.

From the perspective of the application domain, a virtual laboratory can be considered a most appropriate way to interact with users and developers of computational material models at different scale levels. In this chapter, the multi-scale modeling approach for concrete materials has been explained. This approach is based on numerical (computational) models developed for cementitious materials that operate at different 'geometrical' scale levels. With the ability to use the output generated at any particular scale level as input for models that run at other scale levels, the web-based virtual laboratory acts as a real multi-scale modeling platform. The architecture of the proposed virtual laboratory allows the models to be exchangeable and mergeable, leading to an integrated approach.

The numerical models for simulating material performance operate at the different scale levels, with the Hymostruc model as the main microstructural model at the micro-scale level. With this model connected to the nano-scale model, which supplies detailed information on C-S-H gel properties, the microstructural information can be used as input for the meso-level Lattice model to simulate the fracture behavior of composite materials subjected to internal actions (drying, autogenous shrinkage, etc.) or external actions (loads, imposed thermal loading, etc.). These models can, in turn, generate input data for the macro-scale models that simulate the full-scale performance of structural elements. With this multi-scale approach, the consequences of changing parameters that act as input for lower-scale models (nano, micro, meso) can be made directly visible by upscaling. From this approach, the following conclusions can be drawn:

- The netcentric virtual laboratory is a most appropriate tool for the assessment of composite materials performance using a multi-scale modeling approach;
- The web-based approach enables the communication between models that operate at different geometrical scale levels using an integrated computational modeling system;
- The prototype shows the huge potential of web-based modeling and provides an exchangeable and scalable system for multi-scale modeling;
- The future perspective of virtual web-based modeling is that of a very powerful alternative for vast computational models that run over different time and length scales.

**Author details**

E. Dado
*Netherlands Defence Academy, Breda, The Netherlands*

E.A.B. Koenders
*Delft University of Technology, Delft, The Netherlands,*
*COPPE-UFRJ, Programa de Engenharia Civil, Rio de Janeiro, Brazil*

D.B.F. Carvalho
*(PUC-Rio), Rio de Janeiro, Brazil*

## **7. References**

[1] Kuester, F. and Hutchinson, T. A virtualized laboratory for earthquake engineering education, ASEE Journal of Engineering Education, 15:1, 2007.
[2] ECWINS consortium. The Road to Standardized Window Production. Collective Research Projects for SMEs, Vol. 3, European Union, 2007.
[3] Bullard, J. et al. Virtual Cement, Innovations in Portland Cement Manufacturing, USA, 2004.
[4] Dado, E., Koenders, E. and Mevissen, S. Towards an advanced virtual testing environment for concrete materials, proceedings of the MS2010 conference, 2010.
[5] Dado, E., Koenders, E. and Beheshti, R. Theory and Applications of Virtual Testing Environments in Civil Engineering, International Journal of Design Sciences and Technology, 16:2, 2009.
[6] Koenders, E., Schlangen, E. and Dado, E. Virtual testing of compressive strength of concrete, proceedings of the ISEC-4 conference, 2007.
[7] Koenders, E., Dado, E. and van Breugel, K. A Virtual Environment for Multi-Aspect Modeling, proceedings of the SCI2004 conference, 2004.
[8] Garboczi, E., Bullard, J. and Bentz, D. Virtual Testing of Cement and Concrete, Concrete International, No. 12, United States, 2004.
[9] Foster, I., Zhao, Y., Raicu, I. and Lu, S. Cloud Computing and Grid Computing 360-Degree Compared. Proc. Grid Computing Environments Workshop, p. 1-10, 2008.
[10] Mell, P. and Grance, T. The NIST Definition of Cloud Computing, NIST special publication 800-145, United States, 2011.
[11] Wu, Z. and Chen, H. From Semantic Grid to Knowledge Service Cloud, Journal of Zhejiang University, 13:4, China, 2012.
[12] Mika, P. and Tummarello, G. Web semantics in the clouds. IEEE Intelligent Systems Magazine, 23-5, United States, 2008.
[13] Koenders, E., Schlangen, E. and Dado, E. Virtual Testing of Compressive Strength of Concrete, Proceedings of the ISEC-4 Conference, Australia, 2007.
[14] DIANA, http://tnodiana.com/.
[15] ANSYS, www.ansys.com/.
[16] ABAQUS, www.simulia.com.
[17] FEMMASSE, www.femmasse.nl.
[18] Schlangen, E. Experimental and Numerical Analysis of Fracture Processes in Concrete, PhD thesis, Delft University of Technology, The Netherlands, 1993.
[19] Breugel, van, K. Simulation of Hydration and Formation of Structure in Hardening Cement-based Materials, PhD thesis, Delft University of Technology, Delft, The Netherlands, 1991.
[20] Koenders, E. Simulation of Volume Changes in Hardening Cement-Based Materials, PhD thesis, Delft University of Technology, Delft, The Netherlands, 1997.
[21] Dolado, J. S., Hamaekers, J. and Griebel, M. A Molecular Dynamics study of cementitious Calcium Silicate Hydrate (C-S-H) gels, Journal of the American Ceramic Society.

**Section 4**

**Mechanical and Physical Properties of Composites**

**Chapter 12** 

© 2012 Hammood and Radeef, licensee InTech. This is an open access chapter distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


**Characterizations of Environmental Composites**

Ali Hammood and Zainab Radeef

Additional information is available at the end of the chapter

http://dx.doi.org/10.5772/50494

## **1. Introduction**

Recently, environmental preservation has become a critical concern, caught between the problems of chemical pollution and the demands of technological development. As a result, renewable and environmentally friendly materials have come into use.

Numerous studies have examined natural-fiber-reinforced polymer composites. Because both the fibers and the matrices can be derived from renewable resources, the resulting composites are more compatible with environmental preservation goals. Isabel [1] investigated the most commonly used natural fibers, such as palm, cotton, silk, coconut, wool and wood fibers. Significant developments in lignocellulosic fibers in thermoplastics are presented in the research reported in [2-5]. Composite-reinforcing fibers can be categorized by chemical composition, structural morphology and commercial function. Natural fibers such as kenaf, ramie, jute, flax, sisal, sun hemp and coir are derived from plants and are used almost exclusively in PMCs. Aramid fibers [6] are crystalline polymer fibers that are mostly used to reinforce PMCs. The proportions of the constituents play an essential role in achieving the design values required by an application; accordingly, the mechanical properties of PMCs were predicted by Mohamed (2007).

The primary function of a reinforcing fiber is to increase the strength and stiffness of a matrix material. Fiber-reinforced composites are central to this investigation because of their significant property advantages: high stiffness, light weight, ease of recycling, availability, low manufacturing cost, limited environmental impact and favorable lifetime rupture behavior. Various types of natural fiber are available to combine with mineral fibers to construct composite materials. Essentially, fibers can be classified as vegetable, animal and man-made. The main disadvantages of natural fibers are their high moisture absorption, poor interfacial adhesion and relatively low heat resistance. High-speed impact events using (PKV, PRM) composites were investigated in [7-8]; that research indicated significant improvements in penetration resistance, a result of the improved geometric structure of the target. Numerous studies have been carried out on
