#### **7.3 Validation and reliability of the research instruments**

To ensure validity, the questionnaire was hand-delivered to two NS Subject Education Specialists (SESs) and one science teacher for critical feedback before it was used for data collection. A subject education specialist is someone who supports teachers with subject content knowledge, practical activities, and relevant resources to be used in schools, based on their area of specialization, and who organizes workshops [20]. The study used the Cronbach's alpha test to ensure the reliability of the instrument. **Table 1** (Part 5) presents the outcomes of the Cronbach's alpha test for the quantitative instrument used in this study.

Cronbach's alpha for the 14 items in the instrument used in this study was 0.703, which indicates acceptable reliability and internal consistency for the questionnaire.
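A coefficient like the one reported above can be reproduced from raw item scores. The sketch below is illustrative only (the function name and the sample matrix are assumptions, not data from this study); it implements the standard formula, where alpha equals k/(k−1) times one minus the ratio of the summed per-item variances to the variance of respondents' total scores:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    k = scores.shape[1]                          # number of items (e.g., 14 here)
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of each respondent's total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: perfectly consistent items yield alpha = 1.0
demo = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
print(cronbach_alpha(demo))
```

Values of roughly 0.7 and above are conventionally read as acceptable internal consistency, which is how the 0.703 reported here is interpreted.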

#### **7.4 Procedure for data collection**


The structured questionnaire was distributed to 30 science teachers, who were requested to complete it within seven days. After seven days, only 16 had completed it. Those who had not yet submitted were given four more days to respond to the questionnaire, which was later collected by the researchers.

#### **Table 1.**

*Cronbach's alpha reliability of internal consistency for the Likert scale.*

#### **7.5 Data analysis**

Both descriptive and inferential statistics were employed to analyze the collected data. The research question was answered using frequency counts and percentages of the science teachers' responses on the promotion of science learning through science content and practical assessment. For principles of assessment, 70% or more agreed, fewer than 30% were neutral, 0% disagreed, and 0% gave no response.
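The descriptive side of this analysis reduces to tallying each response category and expressing it as a percentage of the 30 respondents. A minimal sketch (function name, category labels, and the sample data are illustrative assumptions, not the study's dataset):

```python
from collections import Counter

CATEGORIES = ("agree", "neutral", "disagree", "no response")

def response_percentages(responses, categories=CATEGORIES):
    """Percentage of respondents per category, rounded to one decimal place."""
    counts = Counter(responses)
    n = len(responses)
    return {c: round(100 * counts.get(c, 0) / n, 1) for c in categories}

# Hypothetical item: 21 of 30 teachers agree, 9 are neutral
print(response_percentages(["agree"] * 21 + ["neutral"] * 9))
```

With 30 respondents, each teacher contributes 1/30 (about 3.3 percentage points), which is why values such as 46.7%, 13.3%, and 3.3% recur in the results below.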

#### **8. Ethical considerations**

Respondents were informed about the significance of the study and its purpose. School principals and science teachers signed the informed consent. Respondents were also informed, as part of their consent, that the results of the study would be made available to them on request. Moreover, they were told that their responses would remain anonymous and confidential and that their names would not appear, even if the findings were published.

#### **9. Results**

**Table 2** presents quantitative results on how educators assess and promote science learning.

The data in **Table 2** illustrate how teachers assess science learning (SL). The response categories were agree, neutral, disagree, and no response.

#### **9.1 Agree**

Align with cognitive domains 86.7%; Principles of assessment 77%; Investigations 66.7%; Simulations 63.3%; Debates 50%; Assignments, Experiments, and Examinations 40% each; Problem-solving 37%; Projects 34%; Presentations 23%; Roleplay 17%; Tests 13.3%; and Quiz 7%.

#### **9.2 Neutral**

Presentations 70%; Problem-solving 60%; Experiments 57%; Projects 53%; Tests and Examinations 50% each; Assignments 46.7%; Debates 34%; Simulations and Investigations 30% each; Roleplay 27%; Quiz 23%; Principles of assessment 13%; and Align with cognitive domains 10%.

#### **9.3 Disagree**

Quiz 70%; Roleplay 53%; Tests 13.3%; Debates 13.3%; Examinations 13%; Debates and Projects 13% each; Principles of assessment, Assignments, and Examinations 10% each; Presentations 7%; Simulations 3.3%; Problem-solving and Align with cognitive domains 3% each; and Experiments and Investigations 0% each.


*Promotion of Science Learning through Science Content and Practical Assessment DOI: http://dx.doi.org/10.5772/intechopen.105407*


#### **Table 2.**

*Responses on how educators assess science learning (SL).*
