#### **4.4 Progression of reflection**

We wanted the students to improve their ability to reflect more deeply. Therefore, in 2012 we developed and introduced a four-level model for reflections (see **Table 1**) [22]. The reflection documents are graded with two passing grades, and from 2012 we required that the reflections reach levels 3 and 4 for students in years 2 and 3, in order to receive the highest grade. Since the students read each other's reflection documents, the first-year students could learn how to reflect more deeply by reading the reflections of the older students.

**Table 6.**

*Results from the postquestionnaire on the effects on the students' learning.*

We developed a language technology-based system that can measure the depth of a reflection according to the model in **Table 1** [28]. When comparing the mean reflection level of the reflection documents written by the same students at the beginning of year 1 and at the end of year 3, we see that the mean reflection level rises from year 1 to year 3 for every student group and that the increase became larger after the introduction of the four-level model [28].
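The chapter does not detail how this system works internally. Purely as an illustration of the idea, a depth scorer could map cue phrases to reflection levels; the cue lists and level semantics below are invented placeholders, not the actual model of **Table 1** or the classifier of [28]:

```python
# Invented cue phrases per reflection level (illustrative only; the real
# model in Table 1 and the classifier in [28] are far more elaborate).
LEVEL_CUES = {
    2: ["because", "since", "so that"],                    # justifying what happened
    3: ["realised", "learned that", "next time"],          # drawing a lesson
    4: ["in general", "as an engineer", "in the future"],  # transfer beyond the course
}

def reflection_level(text: str) -> int:
    """Return a crude reflection depth 1-4: the highest level whose
    cue phrases occur in the text, defaulting to 1 (descriptive)."""
    lowered = text.lower()
    level = 1
    for lvl, cues in LEVEL_CUES.items():
        if any(cue in lowered for cue in cues):
            level = max(level, lvl)
    return level

def mean_level(documents: list[str]) -> float:
    """Mean reflection level over a group's reflection documents."""
    return sum(reflection_level(d) for d in documents) / len(documents)
```

Comparing `mean_level` over a cohort's year-1 and year-3 documents mirrors the comparison described above; the real system presumably uses much richer linguistic features than keyword spotting.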

Thus, introducing the four-level reflection model and assessing the students' reflection documents using this model improved the mean progression of reflection from the beginning of the course to the end of the course.

The students are aware of this progression. When we ask them if they feel that they are better at writing reflections than when they started the programme, the students at the end of their first year do not see any clear improvement, but after year 2, and even more after year 3, the improvement is evident (see **Table 5**).

#### **4.5 Inspiration for exchange studies**

Before the seminar *studying and working abroad*, each student has to read about how exchange studies work and read three travel reports from students who have studied abroad. Then each student reflects on exchange studies and discusses them with the other students in the ordinary PIC way. Our hypothesis was that the introduction of this seminar would increase the number of students studying abroad. The number of students studying abroad almost doubled after the introduction of the seminar, which might indicate a correlation [28].

#### **5. Usage of the course**

Mandatory surveys in the course (see function 10 in Section 3.2) are an important and versatile tool. In this section, we will look at five examples of how submitted reflection documents and mandatory questions to all students in all years can be used.

#### **5.1 Student-based programme development**

In the mandatory questionnaire in PIC2 and PIC3 in 2016 (and again in 2019), we asked the following question: 'Give at least one proposal for how the master's programme in computer science and engineering could be improved.'

Almost 800 suggestions for improvements were received, at least one from every active student. We manually sorted and categorised the suggestions into 25 categories, with respect to what each suggestion aims to improve.

We then prioritised the suggestions into five categories: already implemented; to be implemented immediately or when possible; in need of further work to become useful; saved for future consideration; or rejected.

We selected 24 suggestions that would be possible to implement and presented them to two student representatives, who prioritised which suggestions we should proceed with in the next stage.

We proceeded with 14 suggestions. In a new mandatory questionnaire in PIC2 and PIC3, we now asked each student to evaluate each suggestion on a seven-point scale and, optionally, comment. Finally, we analysed the evaluation and started to implement the suggestions approved by the students into the programme.

We found that it is possible to collect suggestions for improvement and opinions on them from all students, and that most suggestions were realistic and well founded. Furthermore, we could see what support and what opposition each suggestion would meet if implemented. For each suggestion, we got comments showing possible positive effects or obstacles that we did not think of ourselves.

This approach, which we call *student-based programme development*, thus gives us a very good foundation for deciding whether and when the suggestions should be implemented [31].

*Programme Integrating Courses Making Engineering Students Reflect*

*DOI: http://dx.doi.org/10.5772/intechopen.88253*

#### **5.2 Studying language quality**

In Sweden, there has since 2013 been a debate in public media, where university professors, mostly from departments of history, have argued that today's students entering university are much less accomplished than earlier students when it comes to basic Swedish language skills. According to the debate, both the spelling and grammar of Swedish students are weak. The first signs of this decline are said to have been observed in 2010. In order to study the language skills of Swedish first-year university students objectively, we constructed an automatic tool, based on language technology, which measures the language skills that, according to the critics, have been deteriorating. We used the tool on the PIC2 reflection documents from the first seminar of seven different years, 2010–2016. The results show, surprisingly, that the language skills of the studied groups of students have not deteriorated during the period. If anything, the skills have slightly improved regarding the level of complexity of the language [32].
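The chapter does not list which measures the tool computes. One standard, easily automated proxy for the complexity level of Swedish text is the LIX readability index (mean sentence length plus the percentage of words longer than six letters); whether [32] uses LIX is an assumption of this sketch:

```python
import re

def lix(text: str) -> float:
    """LIX readability index: mean sentence length plus the
    percentage of words longer than six letters."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    # Match runs of letters only (Unicode-aware, so Swedish å/ä/ö count).
    words = re.findall(r"[^\W\d_]+", text)
    if not sentences or not words:
        return 0.0
    long_words = [w for w in words if len(w) > 6]
    return len(words) / len(sentences) + 100 * len(long_words) / len(words)
```

Computing the mean LIX of each year's reflection documents would give one time series of the kind discussed above; comparing spelling and grammar would need further components.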

#### **5.3 Studying competencies**

The next example is an effort to find out which competencies the students had attained through studying the programme ('attained competencies') and compare these to the competencies that the programme director has stated that the programme should result in ('intended competencies').

In the mandatory questionnaire, we asked the students 'Which competencies do you think are the most important that you have developed/will develop during your studies at KTH?'

From the answers of the first-year students and fourth-year students, we built two separate sets of competencies, by clustering the student-stated competencies and formulating aggregated competencies describing the simple competencies in each cluster.
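The chapter does not say how this clustering was carried out (it may well have been manual). Purely as an illustration, a dependency-free greedy word-overlap clustering of free-text answers could look like this; the similarity threshold and the example answers are invented:

```python
def jaccard(a: set[str], b: set[str]) -> float:
    """Word-overlap (Jaccard) similarity between two word sets."""
    return len(a & b) / len(a | b)

def cluster_answers(answers: list[str], threshold: float = 0.3) -> list[list[str]]:
    """Greedy single-pass clustering: put each answer in the first
    cluster whose seed answer shares enough words, else start a new cluster."""
    clusters: list[list[str]] = []
    seeds: list[set[str]] = []
    for answer in answers:
        words = set(answer.lower().split())
        for i, seed in enumerate(seeds):
            if jaccard(words, seed) >= threshold:
                clusters[i].append(answer)
                break
        else:
            clusters.append([answer])
            seeds.append(words)
    return clusters
```

Each resulting cluster would then be summarised by hand into one aggregated competency, as described above.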

When comparing the two sets to each other, we found no large differences, and when comparing them to the programme objectives defined by the programme director, they were unexpectedly similar. Thus, the students' collective view of the programme objectives, seen as competencies, was quite close to the programme director's view. This shows good programme coherence with respect to the P ⇄ L edge in the programme triangle [33]. This is in contrast to Nilsson's interviewed engineers, who consider the educational and professional competence bases to be only loosely coupled [3].

#### **5.4 Studying learning strategies**

There are different tools for measuring learning strategies, such as deep, surface and strategic learning strategies. In mandatory surveys in PIC1 and PIC2, we have used two such tools, ASSIST and RSPQ. The individual result was sent as feedback to each student, together with the summarised results of the whole group.

On group level, there are no large differences between the programmes or between the years of the students. However, there were quite large differences between the tools, especially for some individuals. Therefore, students testing their learning strategies by using one of these tools should not trust the results [34].
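Both instruments score fixed Likert-scale items into subscales. As a generic sketch only: the item-to-scale mapping below is an invented placeholder, not the real ASSIST or RSPQ key, which is published with each instrument:

```python
# Invented mapping from item number to learning-strategy scale
# (the real ASSIST/RSPQ scoring keys are published with the instruments).
ITEM_SCALE = {1: "deep", 2: "surface", 3: "strategic", 4: "deep"}

def score(answers: dict[int, int]) -> dict[str, float]:
    """Average one student's 1-5 Likert answers per scale."""
    totals: dict[str, list[int]] = {}
    for item, value in answers.items():
        totals.setdefault(ITEM_SCALE[item], []).append(value)
    return {scale: sum(vals) / len(vals) for scale, vals in totals.items()}
```

Running such a scorer per student yields the individual feedback, and averaging the per-scale scores over a cohort yields the group-level summary mentioned above.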

#### **5.5 Studying stress and health**

In the final example, Kann and Lundkvist [35] used the mandatory survey to replicate a study of the experience of stress among students, which had been performed at Uppsala University some months earlier. The same questions on stress were given to the PIC2 students from years 1–3:

• How often do you feel stressed because of your studies?

• If you feel stressed by your studies, what do you think are the reasons?

• To which degree do you estimate that stress is a problem/obstacle for you in your studies?


| How often do you feel stressed because of your studies? | Uppsala | KTH Year 1 | KTH Year 2 | KTH Year 3 | KTH All |
|---|---|---|---|---|---|
| Never | 1% | 5% | 4% | 6% | 5% |
| About every month | 12% | 30% | 28% | 19% | 12% |
| About every week | 32% | 41% | 40% | 47% | 43% |
| About every day | 55% | 24% | 28% | 29% | 27% |

**Table 7.**

*Results from Uppsala University and KTH of a stress survey question.*

We compared the answers of the students from different years and to the Uppsala students (see **Table 7**). The most common reasons for stress among the PIC2 students were nervousness before the exams, high (own) performance demands and that leisure activities are prioritised before studies. For about half of the students, the stress is sometimes a problem.

The PIC2 students got the compiled results as part of the reading for the seminar about *ergonomics and mental health*. This seminar was appreciated by the students; it was in fact the most popular seminar (see **Table 4**).
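Percentage distributions like those in **Table 7** can be computed directly from each cohort's categorical answers; a minimal sketch (option labels taken from the table):

```python
from collections import Counter

# Answer options for the stress-frequency question, as in Table 7.
OPTIONS = ["Never", "About every month", "About every week", "About every day"]

def distribution(answers: list[str]) -> dict[str, int]:
    """Percentage (rounded to whole percent) of students per answer option."""
    counts = Counter(answers)
    n = len(answers)
    return {opt: round(100 * counts[opt] / n) for opt in OPTIONS}
```

Applying `distribution` to the answers of each year group, and to the Uppsala data, produces the columns compared in the table.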

#### **6. Discussion**

The programme integrating course was first given in 2008, for engineering students in Media Technology, and in 2010 it was introduced for Computer Science and Engineering students. Thereafter the course has spread rapidly, both to other engineering programmes and to master's programmes. In 2013 it was adopted by two engineering programmes at Linköping University [36]. In 2019, there exist at least 20 successful implementations of the course in different programmes at KTH and Linköping University. The basic structure of all these courses is the same, but there have been local modifications, both in topics and in add-ons to the seminar and reflection part of the course.

There have also been a few unsuccessful attempts to start a programme integrating course, where the course has had to be removed, since it did not work. The reasons might be that the involved instructors did not believe in the course themselves and that the students got an initial bad impression of the course, which was difficult to change.

Many students express that the best part of the course is the sharing of experiences with other students, especially students from other years, at the seminars. Discussing the courses of the programme and how they link to each other was also considered to be an important part of PIC, where the mentors were seen as gateways to change things.

At a technical university, many students are sceptical of the elements of the education that they consider to be nonscientific or irrelevant to their future profession. The focus of the programme integrating course is on practising soft skills, dispositions and attitudes, which makes it a target for such scepticism [29]. Therefore, we take care to show the direct or indirect benefit related to the engineering profession for every topic that we introduce to the students. This is also in line with the course, since the programme objectives and the professional role are central parts of the course.

The surveys in the programme integrating course are mandatory. A high response rate is important for the quality of the results of a survey [37]. However, by forcing students to answer a survey, the quality of the answers might drop. Since the surveys are anonymous (the survey system hides information on who has answered what), students could write a nonsense answer to an open question without being held responsible for this. In our experience, this is not the case: it is extremely uncommon that answers are noticeably unserious. However, we do not know how often answers look serious but are untruthful. We try to make the students take the surveys seriously by asking relevant questions, by explaining the importance of the survey and by showing that former surveys have had an influence on the programme, the course itself or other courses.

From the perspective of the programme, the greatest benefit of the course is probably that it makes the students reflect regularly and with high quality, which improves their self-regulated learning and identifies problems in the courses and the programme so that they can be swiftly handled.

As shown above, the programme integrating course improves the programme coherence, which is important for a prosperous educational programme. However, Hammerness emphasises that coherence should not be viewed as an end product but rather a process 'as part of the steady work of such programs, a continuing and necessary effort of adjustment, revision and calibration' [2]. The programme integrating course has been shown not just to improve the programme coherence but also to have many other functions.

Further research should investigate the concept of programme coherence more deeply and study other ways of improving the programme coherence, besides programme integrating courses. Another area needing more research is the effect of different forms of reflection seminars, such as the full-group seminar, the split-group seminar and the walking seminar [24]. The question of why some attempts to introduce programme integrating courses fail while others (a clear majority) are successful would also be valuable to study in more detail.

#### **7. Conclusions**

In this chapter, we have explained how a *programme integrating course* can strengthen the six different relations involved in the programme triangle (**Figure 1**), between the students, the instructors and the programme director, and, in short, improve the *programme coherence*.

