*Concepts, Applications and Emerging Opportunities in Industrial Engineering*

**2.2 Design thinking and data analytics for developing OPMS**

Firstly, there is an obvious need for deep knowledge of the specific operational environment, with a systemic approach to the processes intertwined across internal and external boundaries and to the various interactions between people, between machines, and between people and machines. In addition, there is the necessity of connecting strategic and operational performance objectives through a holistic, integrated view of the whole business. Finally, operations managers can and should leverage ICT developments to help reduce unfounded and biased decision-making, and to provide automatic integration of the various performance systems so as to make up a systemic OPMS composed of function-specific operational performance management sub-systems.

To approach these OPMS issues, the authors explored two different theoretical backgrounds which could be combined to provide an adequate solution to the prior OPMS issues while, at the same time, being leveraged to overcome each theory's intrinsic limitations. As such, the authors first summarize each theory's advantages and limitations in **Figure 2**.

**Figure 2.**

*Pros and cons of design thinking and data analytics methodologies.*

This diagram depicts the positive and negative points of both DT and DA, and it shows that these methodologies embody two divergent ways of analyzing the real world. On the left, Design Thinking takes a broad approach to problem-solving: it explores different problem-solution frames through intensive validation and by gathering vast business knowledge allied with a user-centric approach.

As limitations, some argue over the designer's intrinsic subjectivity in choosing the procedure's iterations and the artifacts used, which conflicts with the traditional business requirement of defining and complying with milestones. Furthermore, there is a fundamental difference between design thinking and managerial thinking as problem-solving methodologies, which can sometimes lead to misunderstandings and attrition in the organizational sphere.

On the other hand, Data Analytics is more focused on analyzing a specific perceived problem or system, with great power for creating valuable and meaningful

**3. Methodology proposal**

After exploring the perceived strengths and limitations of both topics and presenting the rationale for combining DT and DA to enhance operational performance management systems, the methodological stages are laid out in **Figure 4**.

**Figure 4.**

*Schematic representation of the methodological stages and their foundations.*

This diagram portrays six main stages. The first two (Stages 0 and 1) relate only to Design Thinking steps: the initial mapping of the organization's activities and stakeholders' relationships, while understanding their emotional and intuitive ways of reasoning. Altogether, this allows multiple framings of the problem-space according to different perspectives.

The third stage (Stage 2) is intended to define the problem-space as it is perceived and to explore the solution-space by conceiving different solutions for further narrowing and prototyping. This step is presented as belonging to both the DT and DA methodologies: the problem definition is based on DT practices, while for the exploration of the solution space the practitioner should already have a DA focus, trying to come up with data-driven solutions.

The fourth stage (Stage 3) focuses on conceiving models for the idealized solutions. Materializing these models requires the development of proper datasets, along with their inherent collection and storage processes, for further analysis and for constructing the virtual model of the real system. This stage is primarily based on DA practices and tools, and it is a laborious and time-consuming step, since it demands diverse knowledge and infrastructure to capture, pre-process, structure and test different datasets for various analytic tools.
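As a minimal sketch of the capture → pre-process → structure flow this stage describes — all record fields, machine names and values below are hypothetical, invented purely for illustration, not taken from the authors' case:

```python
from statistics import mean

# Hypothetical raw operational records, as they might arrive from capture:
# some readings are missing or malformed and must be cleaned before analysis.
raw_records = [
    {"machine": "M1", "cycle_time_s": "41.2"},
    {"machine": "M1", "cycle_time_s": ""},      # missing reading -> dropped
    {"machine": "M2", "cycle_time_s": "38.9"},
    {"machine": "M2", "cycle_time_s": "40.1"},
]

def preprocess(records):
    """Drop incomplete rows and cast numeric fields (pre-processing step)."""
    clean = []
    for r in records:
        try:
            clean.append({"machine": r["machine"],
                          "cycle_time_s": float(r["cycle_time_s"])})
        except ValueError:
            continue  # discard the malformed reading
    return clean

def structure_by_machine(records):
    """Group readings per machine (structuring step for later analytics)."""
    grouped = {}
    for r in records:
        grouped.setdefault(r["machine"], []).append(r["cycle_time_s"])
    return grouped

dataset = structure_by_machine(preprocess(raw_records))
kpis = {m: round(mean(v), 1) for m, v in dataset.items()}
print(kpis)  # {'M1': 41.2, 'M2': 39.5}
```

In practice each of these steps maps onto the Stage 3 tools of Table 1 (data collection, pre-processing, warehousing, statistical models), typically backed by dedicated infrastructure rather than in-memory lists.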

The fifth stage (Stage 4) requires a combination of DA and DT techniques and expertise to test the developed prototypes and models and absorb feedback, so that they can be improved or selected through collaborative creation. This step's outcome is intended to filter out misconceptions and align stakeholders' expectations, as well as to narrow down the solution space, concentrating the team's efforts on walking towards the final solution.
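One way to picture this narrowing of the solution space — the prototype names, scores and feedback deltas below are entirely illustrative assumptions, not part of the authors' methodology:

```python
# Illustrative sketch: candidate prototypes carry a stakeholder-feedback
# score; each feedback round adjusts the scores, then candidates below the
# median are discarded, narrowing the pool until one solution remains.
def narrow_solution_space(candidates, feedback_rounds):
    """candidates: {name: score}; feedback_rounds: list of {name: delta}."""
    pool = dict(candidates)
    for feedback in feedback_rounds:
        for name, delta in feedback.items():
            if name in pool:
                pool[name] += delta  # absorb stakeholder feedback
        cutoff = sorted(pool.values())[len(pool) // 2]  # median score
        pool = {n: s for n, s in pool.items() if s >= cutoff}
        if len(pool) == 1:
            break  # solution space fully narrowed
    return pool

prototypes = {"dashboard": 6, "alert_bot": 5, "report_gen": 4}
rounds = [{"dashboard": 2, "report_gen": -1}, {"alert_bot": 1}]
print(narrow_solution_space(prototypes, rounds))  # {'dashboard': 8}
```

The point is not the scoring mechanics but the loop structure: feedback absorption followed by pruning, repeated until the team converges.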

The sixth and final stage (Stage 5) proposes developing the desired solution into a real-life testable model. For this purpose, iterative testing and refining of the perceived solution is recommended until all parties are satisfied with the product/service.

This methodology is inherently iterative across all six stages, repeatedly exploring and refining the problem and solution spaces until the team is satisfied with the frames developed.

Beyond the methodological procedure, there must be a set of tools or practices which guide the practitioner, or at least serve as a reference, when applying the conceptualized methodological steps. The tools and practices proposed in **Table 1** are extracted from prior literature research, but they are not exclusively limited to their assigned stages, nor must they be used strictly as the literature proposes.

*A Hybrid Human-Data Methodology for the Conception of Operational Performance…*
*DOI: http://dx.doi.org/10.5772/intechopen.93631*

**Table 1.**

| Stage | Tools and practices | Reference literature |
| --- | --- | --- |
| Stage 0—Constructing the organizational 'picture' | Activity system mapping | [14–16] |
| | Stakeholders mapping | [14, 17, 18] |
| | Value Chain analysis | [19] |
| | PEST analysis | [14, 20] |
| | Porter's Five Forces | [21] |
| | Competitor activity system mapping | [14, 15] |
| Stage 1—Exploring users' pains and needs | Shadowing | [14, 22, 23] |
| | Semi-structured interviews | [14, 24] |
| | Customer narratives/storytelling | [14, 24] |
| | Scenarios | [15, 22] |
| | Personas | [14, 18, 25] |
| | Customer journey | [14, 15] |
| | Mind mapping | [14] |
| Stage 2—Co-evolution of problem and solution space | Motivational mapping | [14, 15] |
| | Metaphors | [14] |
| | Brainstorming | [15, 26] |
| | Sketching | [15, 27] |
| | Concept development | [28, 29] |
| Stage 3—Solution exploration | Data types and data collection | [30, 31] |
| | Data pre-processing | [30, 32] |
| | Data warehousing | [30, 33, 34] |
| | Statistical tools and models | [30, 35] |
| | Machine Learning models | [36, 37] |
| | Business analytics and visualization techniques | [38–40] |
| Stage 4—Prototype and test | Rapid prototyping | [15, 25, 41] |
| | Scenarios | [15, 22] |
| | Role-playing | [14, 15] |
| | Co-creation | [14, 28, 42] |
| | Future customer journey | [14, 43] |
| | Iterative prototyping | [14, 41] |
| Stage 5—Develop concrete product/service solution | Co-creation | [14, 28, 42, 44] |
| | Iterative prototyping | [14, 41] |
| | Value delivery | [2, 14, 15, 45, 46] |
| | Integration with existing operations | [13, 14, 47, 48] |

*Summary of tools and practices to perform each methodological stage.*
