**5. Evaluating design solution**

Evaluation is one of the most important components of project development and plays an important part in keeping the project aligned with its scope and objectives. Evaluation supports decision-making during development by applying evaluation methods and gathering the data needed to compare the project against its pre-defined goals.

Many evaluation types exist, but they can be summarized into three basic types: goal-based, process-based, and outcome-based evaluations.

**Goal-based** evaluations focus mainly on the objectives of the project that were pre-defined during the requirements gathering process. Once development is complete, the project is evaluated to determine whether the features it includes support these pre-defined goals and objectives.

**Process-based** evaluations focus on the project's quality, strengths, and weaknesses. They examine whether the processes included in the project satisfy the stakeholders and are implemented as intended. They also summarize the strengths and weaknesses of the project as important feedback for improvement in subsequent iterations.

**Outcome-based** evaluations address the lasting effects of the project and the broader benefits it can deliver. They measure the final goal of the project and how well it has been achieved.

This section discusses the evaluation of ViDAS: the steps taken to evaluate the tool and to obtain feedback, which was then used to improve it.

#### **5.1 ViDAS evaluation**

Once the implementation was complete, the next step was to evaluate ViDAS. This evaluation was based on the business, user, and system requirements gathered during the requirements elicitation process discussed in section 3. The tool was first self-evaluated, and later evaluated by our stakeholders and users, whose feedback was used to improve the User Experience and User Interface of ViDAS.

The evaluation of ViDAS was mainly goal-based, as development was carried out with the pre-defined requirements and objectives in mind. Various evaluation methods were combined for ViDAS, as shown in **Figure 10**: comparing the goals against the pre-defined documents, arranging a workshop with the focus group, and collecting data before and after the workshop. The evaluation also partially included elements of process-based and outcome-based evaluations, as the project's quality was assessed, feedback was collected, and the final goal was studied throughout the process.

**Figure 10.** *ViDAS evaluation steps.*

*Implementing Visual Analytics Pipelines with Simulation Data. DOI: http://dx.doi.org/10.5772/intechopen.96152*

Once the development of ViDAS was completed, the implemented features and functions were compared against the pre-defined requirements. The product was assessed against the objectives and goals defined during the requirements gathering process. In addition, a number of tests were carried out, including checking the reliability of the front-end across multiple browsers and various screen sizes. Stress tests were also a vital part of the self-evaluation process: ViDAS was tested by uploading large datasets containing millions of rows to assess data handling time and to look for any bottlenecks that might hinder the tool's performance. All of these steps were an important part of the self-evaluation, and the necessary changes were made to improve the final User Experience (UX) before preparing a workshop with the stakeholders.
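A data-handling stress test of this kind can be sketched as follows. This is a minimal illustration, not ViDAS code: the in-memory dataset generator, column names, and row counts are hypothetical stand-ins for the uploaded simulation files mentioned above.

```python
import csv
import io
import time


def generate_rows(n_rows, n_cols=5):
    """Build an in-memory CSV with n_rows numeric rows, standing in
    for an uploaded simulation result file (hypothetical schema)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow([f"col{i}" for i in range(n_cols)])
    for r in range(n_rows):
        writer.writerow([r * c for c in range(n_cols)])
    buf.seek(0)
    return buf


def timed_parse(csv_file):
    """Parse the CSV and return (row_count, elapsed_seconds), so that
    data handling time can be tracked as the row count grows."""
    start = time.perf_counter()
    reader = csv.reader(csv_file)
    next(reader)  # skip the header row
    rows = sum(1 for _ in reader)
    elapsed = time.perf_counter() - start
    return rows, elapsed


if __name__ == "__main__":
    # Increase the row count stepwise; a sharp non-linear jump in
    # elapsed time would point to a bottleneck worth investigating.
    for n in (10_000, 100_000):
        rows, secs = timed_parse(generate_rows(n))
        print(f"{rows} rows parsed in {secs:.3f}s")
```

In practice such a harness would be pointed at the tool's real upload endpoint rather than a local parser, but the principle is the same: scale the input stepwise and watch for disproportionate growth in handling time.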

After self-evaluation, a workshop was arranged with the stakeholders. This workshop used tasks similar to those used during the requirements gathering process, so that ViDAS could be compared with the state-of-the-art tools on the market. The similar tasks made it easy for the stakeholders to compare ViDAS with the tools used in the requirements gathering workshop.

#### **5.2 Feedback**

After the evaluation workshop, initial feedback was collected from the stakeholders and documented in order to improve ViDAS. The stakeholders were pleased with the overall implementation of the tool. While there were suggestions for improvement, ViDAS was recognized as a great help to the stakeholders in analyzing simulation data.

The post-workshop feedback was gathered by asking the stakeholders to fill in questionnaires with open-ended and closed-ended questions, summarized in **Table 6**. Overall, the stakeholders were satisfied with the tool. The data overview, the chart building process, and the interactive visualizations were greatly appreciated, and the stakeholders found the chart recommendation concept useful. The custom analysis features were commended, and the workflow pipeline design, which comprised the drag-and-drop module, was praised. Furthermore, the development of ViDAS within the short span of a few months was greatly appreciated, making the overall feedback largely positive. However, the rating for overall tool execution was lower than anticipated, mainly due to usability issues such as missing validations, missing tooltips, and labels that were not user-friendly. These usability issues have since been resolved and addressed in the second iteration of the tool.


#### **Table 6.**

*ViDAS evaluation feedback overview (1 star - unforgettably bad, 2 stars - below average, 3 stars - average, 4 stars - above average, and 5 stars - unforgettably good).*
