**3. Method**

A multiple case study methodology was chosen to investigate the project management practices that can enable the enrichment of user experience evaluation while maintaining agility and speed in UX evaluation projects. We conducted 12 case studies of usability test projects that used enriched UX methods over a short period of time (a maximum of 2 weeks). All 12 tests were conducted by the same organization. In this chapter, we refer to these as Sprint projects.

The multiple case studies thus make it possible to identify the inherent and recurrent markers [27–29] of the Sprint projects' management practices in order to better define and understand them in all their complexity. Several variables of the Sprint projects were examined to better understand their mechanisms: the project's objective and its level of complexity; the execution (the UX team deployed and their work-per-hour ratio, the experimental design, the maturity of the stimuli, which were all prototypes, the tools used, the measures analyzed, and the time of completion); and, more specifically, the details of the tests (number of participants, recruitment process, and testing time, which for most of the projects was standardized to 12 participants and 1 h of testing). Finally, the degree of detail in the test results is presented in terms of the magnitude of the final report submitted (**Table 1**).

*Towards Agility and Speed in Enriched UX Evaluation Projects. DOI: http://dx.doi.org/10.5772/intechopen.89762*

Data were collected through structured interviews with at least three members of each project. The structured interview covered the following properties for each project: (i) objective of the usability test; (ii) difficulty of the objective (1—easy to 5—difficult); (iii) description of the experimental design; (iv) maturity of the stimuli (prototype); (v) tools used and measures analyzed in the test; (vi) time to execute the test; (vii) difficulty of the execution (1—easy to 5—difficult); (viii) number of participants; (ix) population; (x) testing time (in minutes); and (xi) magnitude of the report (in pages). To obtain the difficulty measures, we averaged the answers of the respondents. **Table 1** provides a summary description of all 12 projects, presented in chronological order.
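The per-project difficulty scores in **Table 1** are thus simple means over the interview respondents. A minimal sketch of that aggregation (the ratings below are hypothetical, not taken from the study):

```python
def mean_rating(ratings):
    """Average a list of 1-5 Likert ratings, rounded to one decimal,
    as done for the objective- and execution-difficulty properties."""
    return round(sum(ratings) / len(ratings), 1)

# Hypothetical example: three respondents from one Sprint project.
objective_difficulty = mean_rating([3, 2, 3])   # -> 2.7
execution_difficulty = mean_rating([5, 4, 4])   # -> 4.3
```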

The interviews also included open-ended questions focused on project management practices. Questions covered project planning, project management, communication and coordination in the team, status of work with the external client, project execution, and analysis management.

The integration of the two approaches into a common methodology rests on two main strategies [25]: the first suggests that the UX team be integrated quickly into the product development cycle so that it can understand the initial mission of the project and be present from the first decisions taken; the second suggests the use and deployment of "agile" tools to facilitate communication and documentation. These are mainly personas, usage scenarios, sketches, and concept maps, used to quickly grasp the direction of the project and to facilitate message transmission to all the stakeholders of the project [23].

Regarding the importance of collaboration among the various stakeholders involved in the project, it is essential for all members to maintain constant communication and a working synergy to ensure the sharing of a common mission and vision. In addition, integrating targeted users at key points in the development process allows creators to respond appropriately to their needs. This way of working makes it possible to ensure a certain consistency and uniformity in the project and to control the client's expectations more effectively. "In an ideal situation, UX development and research involves frequent, iterative user testing. Because agile focuses on smaller changes, it can be possible to conduct small-scale testing at various points throughout the process to ensure changes fit with UX expectations" [24]. These different parts of the project can also take the form of "Sprints" of the "development, testing, evaluation, and adjustment" cycle. This is especially true since, "Once there is an established relationship with the client, and the team is familiar with both how they work together and with outside resources, they can better assess the consultant's ability to work with them in an agile setting" [24].

By adopting an agile approach, UX researchers tend to change their work methodology by reducing their activities, adopting a less formal process, and following a more minimalist method [24]. Although the integration of an agile approach requires a restructuring of the UX experimental design, it is necessary to ensure that the integrity and enhanced value of the UX process are maintained, and even enriched with psychophysiological measures. The recent development of a laboratory management and analytics software platform for human-centered research now makes this kind of integrated process possible, which (a) enables accurate triangulation of enriched UX measures, (b) produces results in a timely manner, and (c) helps to generate meaningful recommendations [26].

*Human 4.0 - From Biology to Cybernetic*

**4. Results**

The 12 projects involved in this study are described in **Table 2**. In total, these projects required the participation of 144 typical users (experts and neophytes), deployed 4 neurophysiological tools and 4 psychometric tools, and concluded with 799 pages of reports. It should be noted that the organization conducted regular debriefing sessions with the client to outline the failures and accomplishments. We can observe that, over time, the projects saw a significant reduction in execution time, human intervention, level of difficulty, and costs as the methodology was standardized. We went from a 19-day project to a 12-day project (including preparation time), from the involvement of 20 experts (internal staff and external sponsors) to a core team of only 4 experts, and from a level of difficulty of 5 to one of 2.5, all of which ultimately affect the cost of operations.
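The improvements reported above can also be expressed as relative reductions; a quick back-of-the-envelope check using only the figures quoted in the text:

```python
def reduction(before, after):
    """Relative reduction between the first and the last Sprint project,
    as a percentage of the initial value."""
    return round(100 * (before - after) / before, 1)

# Figures quoted in the text above.
duration_pct = reduction(19, 12)    # project duration in days -> 36.8
team_pct = reduction(20, 4)         # experts involved -> 80.0
difficulty_pct = reduction(5, 2.5)  # level of difficulty -> 50.0
```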

Based on the interviews and observations, it has been possible to draw the following conclusions. To execute a Sprint project, several considerations have to be taken into account:

- a. Human: the need to communicate regularly with the design clients and the various project stakeholders, and to establish the mandate and the experimental design jointly with the design clients concerned.
- b. Technical: the need to anticipate the collected measurements and the enhanced results using a mosaic of hybrid collection methods.


**Table 1.** Summary description of the Sprint projects (cases 5–7), with details of their objective, execution, testing, and results.

| Case | Objective | Level of difficulty (1 = easy to 5 = difficult) | Experimental design (condition = version of the product; task = assignment) | Maturity of the stimuli (prototype or final product) | Measures* | Time spent | Level of difficulty of the execution (1 = easy to 5 = difficult) | Number of participants | Sample | Testing time (h) | Magnitude of the report |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 5 | Analysis of user experience in interaction with a website, version 2 | 3 | 4 tasks (4 sub-tasks), 1 interview, 2 surveys | A&Cl | E, A (EDA), KPI, PI (SUS) & PI (Att), N&I | 7 days of preparation, 3 days of data collection (testing), 2 days of analysis | 2.5 | | Millennial | 1 | 35-page report, presentation |
| 6 | Analysis of user experience in interaction with a website, version 3 | 2.5 | 4 tasks (4 sub-tasks), 1 interview, 2 surveys | A&Cl | E, A (EDA), KPI, PI (SUS) & PI (Att), N&I | 7 days of preparation, 3 days of data collection (testing), 2 days of analysis | 2.5 | | Millennial | 1 | 58-page report, presentation |
| 7 | Analysis of user experience in interaction with two different versions of a transactional web and mobile interface | 4.5 | 4 scenarios, 15 tasks, 2 conditions, 4 interviews, 1 survey | A&Cl | E, A (EDA) + A (EKG), KPI, PI (Wq), N&I | 7 days of preparation, 4 days of data collection (testing), 4 days of analysis | 4.5 | | Millennial | 1 | 102-page report, presentation |

