**3. Conclusion**

Technology is not neutral. A biased development team, organization, or societal context will most likely produce biased software. Biased technology perpetuates damaging stereotypes, hinders the empowerment of minorities and underrepresented groups, and ultimately delays innovation and progress. This can hardly be considered successful [18]. So far, the most effective ways we know of fighting biases in technology are (a) being aware of those biases, in all their shapes and forms, and (b) striving for development teams and organizations that are as diverse as possible. Future possibilities, however, include automated testing of fairness [19].
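To make the idea of automated fairness testing concrete, the sketch below checks one common fairness criterion, demographic parity: the rate of positive outcomes should not differ too much between demographic groups. The metric choice, the sample predictions, and the tolerance threshold are illustrative assumptions, not prescriptions from the works cited above.

```python
def positive_rate(predictions):
    """Fraction of positive (1) outcomes in a list of 0/1 predictions."""
    return sum(predictions) / len(predictions)

def demographic_parity_gap(group_a, group_b):
    """Absolute difference in positive-outcome rates between two groups."""
    return abs(positive_rate(group_a) - positive_rate(group_b))

# Hypothetical predictions from a classifier under test,
# split by a protected attribute into two groups.
group_a = [1, 0, 1, 1, 0, 1, 0, 1]  # positive rate: 5/8 = 0.625
group_b = [1, 0, 0, 1, 0, 1, 0, 0]  # positive rate: 3/8 = 0.375

gap = demographic_parity_gap(group_a, group_b)
print(round(gap, 3))  # 0.25

# An automated fairness test would assert the gap stays under a
# chosen tolerance (0.1 here is an arbitrary illustrative value).
TOLERANCE = 0.1
print("fair" if gap <= TOLERANCE else "unfair")  # unfair
```

In practice such a check would run in a test suite against a held-out dataset, failing the build when a model update widens the gap beyond the agreed tolerance.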

From the perspective of software architecture, one of the critical stages of software development, we can work toward the construction of less biased software by carefully analyzing the non-functional requirements that are relevant to our product. By first extending the traditional taxonomy of non-functional requirements (which focuses largely on product requirements alone) and then revisiting its categories one by one, we have shed some light on how a software architect can contribute to this important endeavor. We have provided an extensive yet high-level checklist that (a) practitioners and future professionals can use when analyzing and designing their systems and applications, and (b) users can use to empower themselves in demanding that the whole software industry evolve to a higher level of responsibility toward not only innovation but progress.
