Effect of written corrective feedback in research writing competence of non-education students

Michelle Joy Gutierrez, Sheila Mae Dabu, Alcera Mary Joy, Nest Lene Banogon, and Janice R. Carambas[0000-0002-1937-5991]
Pangasinan State University – Alaminos City Campus, Bolaney, Alaminos City, Pangasinan, Region I, Philippines
Abstract. This research addressed several key problems: the extent of use of teachers’ feedback strategies, the effects of written corrective feedback (WCF) on students’ performance, and the significant correlations among the variables. A quantitative descriptive research design was used, and 153 participants were surveyed through the Messenger platform and Google Forms. Research findings indicate that among the four types of written corrective feedback, direct feedback was the most frequently employed strategy by teachers, while focused feedback was the least utilised. Furthermore, results indicate that students value reflective learning facilitated by teachers’ written corrective feedback, especially in error recognition and constructive feedback responses, as non-education students displayed positive attitudes toward receiving feedback. The study also revealed a positive correlation between WCF and research writing performance regardless of age, sex, or degree program, for which the partial eta squared values were relatively low at 0.022, 0.003, and 0.005, respectively. Additionally, significant correlations were observed between students’ GPAs in the research courses and the effects of WCF, revealing a disparity in GPAs between Research 1 and Research 2. While direct, unfocused, and indirect written corrective feedback showed statistically significant relationships with the extent of use of teachers’ written corrective feedback, focused feedback showed no significant correlation. The findings emphasise the importance of evaluating different written corrective feedback strategies to enhance students’ research writing outcomes and call for the strategic use of written corrective feedback based on the specific needs and performance levels of non-education students. Furthermore, students, teachers, and the institution should work hand in hand to create a culture of continuous improvement in students’ research writing and academic performance through written corrective feedback.

Keywords: written corrective feedback · research writing · non-education students

1 Introduction

Writing plays a crucial role in students’ holistic development, serving not only as a medium for expressing thoughts and ideas but also as a tool for expanding their knowledge. Among the various forms of academic writing, the research paper holds particular significance, especially in undergraduate courses. Mastering the skill of writing a compelling research paper is essential for students, and a vital component of this process is receiving feedback to enhance their work. Feedback is an integral part of writing, involving the evaluation and refinement of written work based on teachers’ comments and corrections [20]. It helps identify areas for improvement and ultimately enhances the quality of the final output. In an educational setting, feedback is vital as it guides students in addressing specific needs and improving their writing abilities.

Despite the time-consuming and labour-intensive nature of providing feedback, Martin [18] emphasises that teachers should dedicate a substantial amount of time to offering written comments, as it can significantly contribute to students’ improvement in writing skills. However, as Redd and Kennett [21] observed, many students either do not read the feedback provided or fail to show noticeable improvement. This issue highlights the complexity of the feedback process and the challenges associated with its effectiveness. While teachers invest considerable time and effort in providing constructive criticism, several factors may hinder students’ engagement with the feedback. These factors include a lack of motivation, time constraints, or difficulty in understanding and implementing the suggestions.

Acknowledging these potential drawbacks is essential for ensuring a balanced understanding and for helping teachers provide constructive written feedback that students can easily comprehend. Sometimes, students may misinterpret teachers’ comments due to unclear or unreadable handwriting, which can lead to confusion and hinder their ability to make meaningful improvements [21]. If students do not fully understand the feedback or its intended purpose, their ability to address specific areas of concern may be compromised.

Moreover, according to a study by Best et al. [5], students often disregard feedback on their research papers when they find it difficult to understand. This can have emotional repercussions, as students may experience tension, anxiety, and feelings of inadequacy in response to overly critical or negative remarks, potentially harming their motivation and overall well-being [17]. To mitigate these negative effects, educators must strive to offer constructive and encouraging written feedback that highlights both areas for growth and strengths. Clear communication, prompt feedback, and a balanced approach that acknowledges students’ efforts can reduce the adverse effects of feedback while maximising its positive impact.

Ene and Yao [11] stress the importance of professors providing students with comprehensive descriptions of their expectations and feedback methods. Effective communication of these elements can facilitate students’ understanding of the lessons being taught and the underlying assumptions behind the feedback they receive. Without such clarity, misunderstandings or confusion may arise. Therefore, educators should take the time to explain their approaches to providing feedback and the objectives they hope to achieve. By offering thorough explanations, teachers can bridge the gap between their expectations and students’ comprehension, making the feedback process more productive and meaningful. This approach fosters better communication and enables students to use the feedback they receive to enhance their writing.

Furthermore, a study conducted by Wirantaka [26] recommends that educators consider the clarity and specificity of the feedback they give to ensure it effectively enhances students’ writing abilities. However, this study was limited to just five participants completing their undergraduate theses at Yogyakarta University’s English Education Department. Consequently, the present study aims to determine whether written feedback is beneficial for a larger group of non-education students writing research papers.

In the Philippine context, a study by Balanga et al. [4] examined Filipino high school students’ beliefs about written corrective feedback. Their findings identified five types of written corrective feedback: direct feedback, indirect feedback, focused feedback, unfocused feedback, and reformulation. The researchers suggested that teachers should ensure students pay attention to the criticism they receive to minimise flaws in their final work. Teachers should also provide feedback on areas needing modification in students’ papers, helping them become more aware of their mistakes and avoid repeating them in future writing. Additionally, teachers should encourage students to learn how to self-edit.

While existing studies underscore the necessity of effective written corrective feedback in research writing, there remains a gap in local research focusing on this aspect, particularly in the context of writing research papers. Most previous studies on written corrective feedback have concentrated on other types of academic writing or the differences between teachers’ and students’ preferences. Therefore, the present study seeks to investigate the effects of written corrective feedback, specifically on students’ research writing.

2 Objectives of the study

The outcomes of this study were expected to determine the effects of written corrective feedback in the research writing of fourth-year students in the two programs: Bachelor of Science in Information Technology and Bachelor of Science in Business Administration at Pangasinan State University, Alaminos Campus. Specifically, this research intended to answer the following questions:

1.
What is the profile of the respondents in terms of:
(a)
age;
(b)
sex;
(c)
degree program; and
(d)
GPA in the research courses (Research 1 and Research 2)?
2.
What is the extent of the use of teachers’ written corrective feedback strategies in students’ research writing?
3.
What are the effects of written corrective feedback on the research writing performance of students?
4.
Is there a significant relationship between the demographic profile of the respondents and the effects of teachers’ written corrective feedback on the research writing performance of the students?
5.
Is there a significant relationship between the extent of use and the effects of teachers’ written corrective feedback on students’ research writing?
6.
Is there a significant difference in the GPA of respondents in Research 1 and 2?

In this study, three research hypotheses were formulated, which were tested at a 0.05 level of significance. The null hypotheses are:

H01:
There is no significant relationship between the demographic profile of the respondents and the effects of teachers’ written corrective feedback on the students’ research writing performance.
H02:
There is no significant relationship between the extent of use and the effects of teachers’ written corrective feedback on students’ research writing.
H03:
There is no significant difference in the GPA of respondents in Research 1 and 2.

3 Materials and methods

This study utilised a descriptive research design to examine the impact of teachers’ written corrective feedback on the research writing competence of fourth-year non-education students at Pangasinan State University, Alaminos Campus. Descriptive research design, characterised by its quantitative nature, entails the gathering of numerical data for analysis through statistical techniques. This approach facilitates the generation of precise and accurate descriptions of populations or phenomena under study and explores participants’ experiences [14].

Within the parameters of the descriptive research design, survey research was used to collect data from a sample or population using standardised questionnaires. This method was used to understand a specific group’s attitudes, opinions, behaviours, and demographic characteristics. In this research context, quantitative data on the demographic profile, extent of use, and effects of teachers’ written corrective feedback were collected through survey questionnaires.

The primary data collection method involved using a profile questionnaire and a questionnaire adapted from a study by Aridah et al. [3]. The questionnaire was modified to suit the study’s specific data needs. Both closed-ended and open-ended questions were used to assess respondents’ perspectives on teachers’ written corrective feedback strategies and to explore the respondents’ perceptions of the impact of written corrective feedback on students’ research writing performance.

The questionnaire used in this study was structured into three distinct parts to address the research objectives comprehensively. Part I (section 4.1) focused on gathering demographic information about the students, including their age, sex, degree program, and GPA in key research courses (Research 1 and Research 2). This demographic data was crucial for exploring potential correlations between the respondents’ backgrounds and their experiences with teachers’ written corrective feedback. Understanding these demographic factors was essential, as they may influence how students perceive and respond to feedback, as well as how feedback impacts their research writing performance.

Part II of the questionnaire (section 4.2) examined the extent to which teachers implemented various written corrective feedback strategies in the students’ research writing process. This section included questions designed to assess the frequency, consistency, and types of feedback provided by teachers. Students were asked to reflect on their experiences, including how often they received comments on different aspects of their research papers, such as grammar, structure, content, and overall coherence. This part was critical in identifying the most commonly used feedback strategies and how systematically they were applied across different research projects.

Finally, part III (section 4.3) explored the effects of teachers’ written corrective feedback on the students’ research writing performance. This section sought to capture the students’ perspectives on how feedback influenced their ability to revise drafts, improve research skills, and enhance overall writing competence. Questions in this section addressed perceived changes in writing quality, confidence in writing research papers, and the ability to meet academic standards. By gathering insights into the effects of feedback, this section aimed to provide a deeper understanding of the role of corrective feedback in fostering academic writing skills. Appropriate statistical tools were used in the data analysis to obtain viable and reliable results.

The respondents’ demographic profile, the extent of use of teachers’ written corrective feedback strategies, and the effects of teachers’ written corrective feedback on students’ research writing were tabulated and constructed in a frequency table using frequency counts, percentages, and mean.
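
As an illustration of how these descriptive statistics can be produced, the sketch below computes frequency counts, percentages, and the weighted mean for a single five-point item and maps the weighted mean to a descriptive rating; the rating-band cutoffs are assumptions made for illustration and are not taken from the study.

```python
from collections import Counter

def describe_item(responses, n_points=5):
    """Frequency counts, percentages, and weighted mean for one Likert-type item."""
    counts = Counter(responses)
    n = len(responses)
    freq = {k: counts.get(k, 0) for k in range(n_points, 0, -1)}      # F5 ... F1
    pct = {k: round(100 * v / n, 2) for k, v in freq.items()}         # percentage of n
    wm = sum(k * v for k, v in freq.items()) / n                      # weighted mean
    return freq, pct, round(wm, 2)

def descriptive_rating(wm):
    """Map a weighted mean to a label; the band cutoffs here are illustrative assumptions."""
    bands = [(4.21, "Always / Strongly Agree"), (3.41, "Often / Agree"),
             (2.61, "Sometimes"), (1.81, "Seldom"), (0.00, "Never")]
    return next(label for cutoff, label in bands if wm >= cutoff)

# Example using the tallies reported for the first direct-WCF indicator (49, 72, 30, 0, 2):
responses = [5] * 49 + [4] * 72 + [3] * 30 + [2] * 0 + [1] * 2
freq, pct, wm = describe_item(responses)
print(freq, pct, wm, descriptive_rating(wm))   # weighted mean of about 4.08, in the "Often" band
```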

Meanwhile, the association between the respondents’ demographic profile and the effects of teachers’ written corrective feedback on research writing was calculated using partial eta-squared values.
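
For a design with a single categorical factor, partial eta squared reduces to SS_between / (SS_between + SS_within) from a one-way ANOVA. The sketch below, under that assumption and with hypothetical groupings of the effect scores by demographic category, shows one way such values could be obtained.

```python
import numpy as np
from scipy import stats

def one_way_eta_squared(groups):
    """F-test and eta squared for one categorical factor.

    groups: one 1-D array of WCF-effect scores per category (e.g., per age bracket).
    With a single factor in the model, partial eta squared equals
    eta squared = SS_between / (SS_between + SS_within).
    """
    groups = [np.asarray(g, dtype=float) for g in groups]
    scores = np.concatenate(groups)
    grand_mean = scores.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    f_value, p_value = stats.f_oneway(*groups)
    return f_value, p_value, ss_between / (ss_between + ss_within)

# Hypothetical usage: effect-of-WCF scores split into the study's three age brackets.
rng = np.random.default_rng(0)
age_groups = [rng.normal(4.1, 0.5, 124), rng.normal(4.0, 0.5, 27), rng.normal(4.2, 0.5, 2)]
print(one_way_eta_squared(age_groups))
```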

To determine the correlation between the extent of use of teachers’ written corrective feedback strategies and the effects of teachers’ written corrective feedback on students’ research writing, the Pearson r correlation coefficient was used. To ascertain whether there was a significant difference between the respondents’ GPAs in Research 1 and Research 2, a paired sample t-test was employed.
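
For readers who wish to reproduce these analyses, the sketch below shows how the Pearson r correlation and the paired sample t-test could be computed with SciPy; the variable names and the small toy arrays are illustrative assumptions, not the study’s data.

```python
from scipy import stats

# Illustrative placeholders: per-respondent mean ratings and GPAs (not the study's data).
extent_of_use = [4.2, 3.8, 4.5, 3.9, 4.0, 3.6]        # mean extent-of-use rating per respondent
effects_score = [4.3, 3.7, 4.6, 4.0, 4.1, 3.5]        # mean perceived-effects rating per respondent
gpa_research1 = [1.75, 2.00, 1.50, 2.25, 1.75, 2.00]  # GPA in Research 1
gpa_research2 = [2.00, 2.25, 1.75, 2.25, 2.00, 2.25]  # GPA in Research 2

# Pearson r between the extent of use and the effects of written corrective feedback.
r_value, r_pvalue = stats.pearsonr(extent_of_use, effects_score)

# Paired sample t-test comparing the same respondents' GPAs in Research 1 and Research 2.
t_value, t_pvalue = stats.ttest_rel(gpa_research1, gpa_research2)

print(f"Pearson r = {r_value:.3f} (p = {r_pvalue:.3f})")
print(f"Paired t = {t_value:.3f} (p = {t_pvalue:.3f})")
```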

4 Results and discussion

4.1 Demographic profile of the respondents

The respondents’ profiles in terms of age, sex, degree program, and Grade Point Average in Research 1 and 2 courses are shown in table 1. A frequency table was created using counts, percentages, and means. Mean scores and descriptive ratings were used to interpret the data.


Table 1: Demographic profile of the respondents.

| Parameters | Frequency (n = 153) | Percentage |
|---|---|---|
| Age | | |
| 22 years old and below | 124 | 81.05% |
| 23-24 years old | 27 | 17.65% |
| 25 years old and above | 2 | 1.31% |
| Mean = 21.96, SD = 4.16 | | |
| Sex | | |
| Female | 90 | 58.82% |
| Male | 63 | 41.18% |
| Degree program | | |
| BSIT | 59 | 38.56% |
| BSBA (Operations Management) | 57 | 37.25% |
| BSBA (Financial Management) | 37 | 24.18% |
| GPA in Research 1 | | |
| 1.25-1.50 | 41 | 26.80% |
| 1.75-2.25 | 90 | 58.82% |
| 2.50-2.75 | 12 | 7.84% |
| 3.00 | 10 | 6.54% |
| GPA in Research 2 | | |
| 1.25-1.50 | 20 | 13.07% |
| 1.75-2.25 | 103 | 67.32% |
| 2.50-2.75 | 16 | 10.46% |
| 3.00 | 14 | 9.15% |

Age. The table shows that almost all of the respondents, or 81.05%, were 22 years old and below; a few, or 17.65%, were 23-24 years old; and 2 respondents, or 1.31%, were 25 years old and above. The National Center for Education Statistics (NCES) states that since most students enrol in college immediately after high school, usually at 18, graduating college students typically fall within the 22-and-below age range. Furthermore, a bachelor’s degree program typically takes four years to complete, with the majority of graduates being 22 years old or younger. This is consistent with the result of the current study, which shows that almost all the respondents belong to the 22-and-below age bracket.

Sex. The data presented in the table show that the majority of the respondents were female, at 90 or 58.82%, while 63 or 41.18% were male. According to a study by Pennington et al. [19], females tend to have higher levels of academic engagement and motivation, which can contribute to their higher enrolment rates. This coincides with the current study’s findings, in which female students outnumber male students.

Degree program. An analysis of the respondents’ degree programs showed that fifty-nine (59), or 38.56% were Bachelor of Science in Information Technology (BSIT) students, fifty-seven (57), or 37.25% were Bachelor of Science in Business Administration (BSBA) majoring in Operations Management, and thirty-seven (37) or 24.18% were Bachelor of Science in Business Administration majoring in Financial Management. According to a study by the Commission on Higher Education (CHED) in the Philippines, the BSIT program is popular among students because of the increasing demand for IT professionals in various industries. This is consistent with the results of the current study, wherein the Bachelor of Science in Information Technology program outnumbered the Bachelor of Science in Business Administration.

Grade Point Average. The majority of respondents, totalling 90 individuals or 58.82%, achieved a GPA range of 1.75-2.25 in Research 1, indicating a performance level classified as “Good”. Subsequently, 41 respondents, or 26.80%, obtained a GPA range of 1.25-1.50, signifying a performance level categorised as “Very Good”. Following this, 12 respondents, or 7.84%, fell within the GPA range of 2.50-2.75, while 10 respondents, or 6.54%, received a GPA range of 3.00, denoting a performance level classified as “Passed”.

In the Research 2 courses of the respondents, 103 or 67.32% obtained a 1.75-2.25 GPA. This means that a great majority of the respondents perform at a level that is considered “Good”. Following this, a few or 20 students, representing 13.07% of the sample, attained a GPA falling within the range of 1.25-1.50, signifying a level of performance classified as “Very Good”. Subsequently, 16 respondents, accounting for 10.46% of the total, received grades within the range of 2.50-2.75. Lastly, a few of the participants, comprising 9.15%, obtained a GPA in the range of 3.00.

4.2 Extent of use of teachers’ written corrective feedback strategies on students’ research writing

It is reflected in table 2 that non-education students often received direct written corrective feedback. This claim is supported by the average weighted mean of 4.01, with all the indicators categorised as “Often”.


Table 2: Extent of use of direct written corrective feedback strategies.

| Indicators: When giving direct written corrective feedback, the teacher... | F5 n (%) | F4 n (%) | F3 n (%) | F2 n (%) | F1 n (%) | WM | DR |
|---|---|---|---|---|---|---|---|
| 1. provides correct linguistic form or structure above or near the linguistic error I made | 49 (32.03) | 72 (47.06) | 30 (19.61) | 0 (0.00) | 2 (1.31) | 4.09 | Often |
| 2. crosses out the errors of words/phrases/sentences and supplies them with the correct one | 47 (30.72) | 75 (49.02) | 28 (18.30) | 2 (1.31) | 1 (0.65) | 4.08 | Often |
| 3. inserts missing words, morphemes, or phrases with the correct form | 50 (32.68) | 61 (39.88) | 37 (24.18) | 4 (2.61) | 1 (0.65) | 4.01 | Often |
| 4. gives short explanations for every error indicated | 48 (32.03) | 64 (41.83) | 34 (22.22) | 6 (3.92) | 1 (0.65) | 3.99 | Often |
| 5. gives explanations and examples at the end of my paper with a reference back to places in the text where the error has occurred | 50 (32.68) | 65 (42.48) | 33 (21.57) | 4 (2.61) | 1 (0.65) | 4.03 | Often |
| 6. provides explicit guidance on how to correct errors | 51 (33.33) | 63 (41.18) | 33 (21.57) | 6 (3.92) | 0 (0.00) | 4.03 | Often |
| 7. numbers errors in the text and writes a description for each numbered error at the bottom of the text | 44 (28.76) | 62 (40.52) | 37 (24.18) | 6 (3.92) | 4 (2.61) | 3.87 | Often |
| Average weighted mean | | | | | | 4.01 | Often |

The study’s findings indicate that non-education students frequently receive direct written corrective feedback. This is consistent with the findings of Chen et al. [7] and Zhang et al. [28], who discovered that students preferred receiving direct written corrective feedback since it effectively addressed their problems and provided clear solutions for improvement. This type of feedback not only helps students realise their errors but also shows them how to fix them, thereby improving their writing skills and academic achievement. However, this opposes Zohra and Fatiha’s [29] notion that teachers should not “always” spoon-feed learners and offer them the corrections outright, but should rather encourage them to take ownership of their learning by self-correcting errors.

One implication of this study is the need to customise feedback methods to address students’ varied academic backgrounds. Recognising that non-education students benefit from direct written corrective feedback allows educators to personalise teaching strategies in research writing instruction. This highlights the importance of student-centred approaches in enhancing learning outcomes by catering to the distinct needs and preferences of diverse student groups.

In table 3, the computed average weighted mean of 3.47, with a descriptive equivalent of “Often”, is presented. The indicator with the highest weighted mean of 3.65 states that “When giving indirect written corrective feedback, the teacher often shows where the error is and gives a clue about how to correct it”.


Table 3: Extent of use of indirect written corrective feedback strategies.

| Indicators: When giving indirect written corrective feedback, the teacher... | F5 n (%) | F4 n (%) | F3 n (%) | F2 n (%) | F1 n (%) | WM | DR |
|---|---|---|---|---|---|---|---|
| 1. implicitly signals the errors and lets me do the correction by myself | 23 (15.03) | 57 (37.25) | 52 (33.99) | 14 (9.15) | 7 (4.58) | 3.49 | Often |
| 2. identifies the errors in my text without providing the ways to correct them | 20 (13.07) | 50 (32.68) | 52 (33.99) | 22 (14.37) | 9 (5.88) | 3.32 | Sometimes |
| 3. uses a set of correction symbols without providing the correct forms | 20 (13.07) | 51 (33.33) | 49 (32.03) | 24 (15.69) | 9 (5.88) | 3.33 | Sometimes |
| 4. only encircles the words or phrases containing errors | 22 (14.37) | 50 (32.68) | 62 (40.52) | 13 (8.50) | 6 (3.92) | 3.46 | Often |
| 5. uses question marks for the unclear expressions | 28 (18.30) | 53 (34.64) | 53 (34.64) | 14 (9.15) | 5 (3.27) | 3.57 | Often |
| 6. records in the margin the number of errors in a given line without providing the correct forms | 51 (33.33) | 63 (41.18) | 33 (21.57) | 6 (3.92) | 0 (0.00) | 3.48 | Often |
| 7. shows where the error is and gives a clue about how to correct it | 44 (28.76) | 62 (40.52) | 37 (24.18) | 6 (3.92) | 4 (2.61) | 3.65 | Often |
| Average weighted mean | | | | | | 3.47 | Often |

The findings support the research of Ellis [9] and Siewert [24], who found that teachers believe indirect feedback encourages students to take responsibility for their learning process. Furthermore, teachers prefer offering indirect feedback since it helps students self-correct their mistakes [29].

The findings imply that teachers commonly offer feedback that highlights students’ errors in their writing without directly correcting them. This extends to the notion that indirect feedback can enhance critical thinking and metacognitive awareness in non-education students. However, relying solely on indirect feedback may lead to ambiguity or confusion. Educators should balance indirect feedback for autonomy with direct feedback for clarity in writing conventions.

In essence, the implication underscores the need for educators to adopt a flexible and adaptive approach to feedback provision, considering the diverse learning preferences and needs of non-education students. This aligns with the study’s results in tables 2 and 3, showing that non-education students received both types of feedback. Incorporating direct and indirect written corrective feedback strategies can foster students’ autonomy, critical thinking, and academic growth in research writing.

Table 4 presents the indicator with the highest weighted mean of 3.51, stating that “When giving focused written corrective feedback, the teacher ‘Often’ comments on one or two linguistic error categories at a time rather than feedback on too comprehensive a range of features”.


Table 4: Extent of use of focused written corrective feedback strategies.

| Indicators: When giving focused written corrective feedback, the teacher... | F5 n (%) | F4 n (%) | F3 n (%) | F2 n (%) | F1 n (%) | WM | DR |
|---|---|---|---|---|---|---|---|
| 1. selects specific errors to be corrected and ignores other errors | 24 (15.69) | 51 (33.33) | 47 (30.72) | 22 (14.38) | 9 (5.88) | 3.41 | Often |
| 2. corrects only the errors that interfere with the meaning/content | 21 (13.73) | 60 (39.22) | 49 (32.03) | 19 (12.42) | 4 (2.61) | 3.50 | Often |
| 3. comments on grammatical errors only | 23 (15.03) | 50 (32.68) | 54 (35.29) | 25 (16.34) | 10 (6.54) | 3.36 | Sometimes |
| 4. focuses on organization only | 16 (10.46) | 50 (32.68) | 62 (40.52) | 25 (16.34) | 10 (6.54) | 3.24 | Sometimes |
| 5. comments on the minor errors, for example those related to mechanics only | 18 (11.76) | 55 (35.94) | 46 (30.07) | 26 (16.99) | 8 (5.23) | 3.33 | Sometimes |
| 6. focuses on a single error type (e.g., tenses) | 24 (15.69) | 47 (30.72) | 46 (30.07) | 27 (17.65) | 9 (5.88) | 3.33 | Sometimes |
| 7. comments on one or two linguistic error categories at a time rather than feedback on too comprehensive a range of features | 27 (17.65) | 51 (33.33) | 52 (33.99) | 18 (11.76) | 5 (3.27) | 3.51 | Often |
| Average weighted mean | | | | | | 3.38 | Sometimes |

The findings in table 4 indicate that non-education students only occasionally received focused written corrective feedback from their teachers on their research writing. Sometimes, teachers give feedback that targets only specific errors or a small number of errors, allowing them to prioritise and address the most significant areas for improvement in the research writing of non-education students. The findings confirm Ellis’s [9] point that giving focused feedback is demanding: it takes time and energy, and learners may face difficulty comprehending and addressing all the corrected errors simultaneously, particularly when the feedback concentrates only on the most substantial or critical errors.

Meanwhile, Ellis et al. [10] and Ene and Yao [11] discovered that learners who got focused feedback gained a better comprehension of the linguistic characteristics of certain error types. This shows that, while focused feedback may provide barriers at first, it might eventually lead to a more profound understanding of language errors and their fixes, which will assist learners in the long run.

Table 5 shows that non-education students frequently received unfocused written corrective feedback from their teachers. In the table, all seven indicators received a descriptive rating of “Often”, with the indicator “When giving unfocused written corrective feedback, the teacher comments on what I did in both right and wrong sentences” obtaining the highest weighted mean value of 3.80.


Table 5: Extent of use of unfocused written corrective feedback strategies.

| Indicators: When giving unfocused written corrective feedback, the teacher... | F5 n (%) | F4 n (%) | F3 n (%) | F2 n (%) | F1 n (%) | WM | DR |
|---|---|---|---|---|---|---|---|
| 1. generally comments on all noticed errors, whether with or without correction | 31 (20.26) | 65 (42.48) | 41 (26.80) | 15 (9.80) | 1 (0.65) | 3.73 | Often |
| 2. randomly comments on errors or writing problems | 28 (18.30) | 67 (43.79) | 42 (27.45) | 14 (9.15) | 2 (1.30) | 3.70 | Often |
| 3. corrects all types of errors, including mechanical errors | 35 (22.88) | 64 (41.83) | 40 (26.14) | 11 (7.19) | 3 (1.96) | 3.77 | Often |
| 4. comments on what I did in both right and wrong sentences | 33 (21.57) | 65 (42.48) | 46 (30.07) | 8 (5.23) | 1 (0.65) | 3.80 | Often |
| 5. gives general comments on a separate piece of paper | 28 (18.30) | 61 (39.87) | 48 (31.37) | 15 (9.80) | 1 (0.65) | 3.67 | Often |
| 6. writes in the margins next to every error I make | 39 (25.49) | 59 (38.56) | 41 (26.80) | 11 (7.19) | 3 (1.96) | 3.79 | Often |
| 7. provides notes on the last page of my paper about what I should revise without marking my text | 28 (18.30) | 72 (47.06) | 45 (29.41) | 7 (4.58) | 1 (0.65) | 3.79 | Often |
| Average weighted mean | | | | | | 3.75 | Often |
| Overall average weighted mean | | | | | | 3.65 | Often |

The result indicates that non-education students often received feedback aimed at correcting all errors in their research writing. However, comparing focused written corrective feedback to unfocused written corrective feedback shows a difference in the mean: 3.38 (Sometimes) and 3.75 (Often), respectively. This shows that non-education students received more unfocused feedback than focused feedback. This contradicts the study of Farrokhi and Sattarpour [12], highlighting that providing focused written CF can lead to more improvement in the accurate use of targeted structures by low-proficient learners. According to Sheen et al. [23], the probable reasons for the differential effectiveness of focused and unfocused CF are: focused CF may enhance learning by helping learners to (1) notice their errors in their written work, (2) systematically engage in hypothesis testing and (3) monitor the accuracy of their writing by tapping into their existing explicit grammatical knowledge. In contrast, unfocused corrective feedback runs the risk of (1) providing CF in a confusing, inconsistent, and unsystematic way and (2) overburdening learners.

The results of the extent of use of teachers’ written corrective feedback on students’ research writing reveal that among the four types of written feedback, direct written corrective feedback obtained the highest average weighted mean of 4.01, while focused written corrective feedback received the lowest average weighted mean of 3.38.

This suggests that teachers of non-education students predominantly utilised direct written corrective feedback, with focused written corrective feedback being the least utilised type. This observation aligns with the research conducted by Aquino and Cuello [2], where respondents showed a belief that providing direct corrections to specific errors would be more beneficial for the recipients of the written corrective feedback process, indicating a preference for direct feedback over other types.

However, this preference for direct feedback contrasts with the findings of Lee [16], who discovered that teachers favoured focused written corrective feedback and perceived it to have greater pedagogical value compared to comprehensive or direct written corrective feedback.

The results generally indicate that non-education students frequently received written corrective feedback from their teachers on their research writing. This assertion is supported by the overall average weighted mean of 3.65, denoting “Often”. This suggests that teachers often provide students with written corrective feedback, leading to improvements in their research writing.

These findings are consistent with the research conducted by Sun and Qi [25], which demonstrated that regardless of the type of feedback students received, they exhibited enhanced writing accuracy after receiving written corrective feedback. These outcomes are not unexpected, as all types of feedback can contribute to increasing students’ awareness of correct language usage, enabling them to consciously monitor their language and enhance accuracy in their written output [15].

The frequent provision of written corrective feedback to non-education students suggests an opportunity for improving their research writing skills. This indicates that regardless of the feedback form, students demonstrate enhanced writing accuracy after receiving written corrective feedback. Therefore, while there may be variations in teachers’ feedback preferences and practices, the consistent provision of feedback contributes to students’ awareness of correct language usage and improves their writing accuracy over time.

4.3 Effects of written corrective feedback on students’ research writing

The data (table 6) illustrates that non-education students perceive the teacher’s written corrective feedback positively, as evidenced by all indicators averaging a weighted mean of 4.10. The indicator, “Through the teacher’s written corrective feedback, I can reflect on my mistakes and think of ways I can do”, obtained the highest weighted mean of 4.45, indicating a “Strongly Agree” response.


Table 6: Effects of written corrective feedback on students’ research writing.

| Indicators: Through the teacher’s written corrective feedback... | F5 n (%) | F4 n (%) | F3 n (%) | F2 n (%) | F1 n (%) | WM | DR |
|---|---|---|---|---|---|---|---|
| 1. I am encouraged to enhance my research writing skills | 55 (35.95) | 73 (47.71) | 19 (12.42) | 4 (2.61) | 2 (1.31) | 4.14 | Agree |
| 2. I am able to gain confidence in my research writing | 52 (33.99) | 76 (50.99) | 20 (13.07) | 3 (1.96) | 0 (0.00) | 4.18 | Agree |
| 3. I can see my writing strengths | 43 (28.10) | 81 (52.94) | 26 (16.99) | 3 (1.96) | 0 (0.00) | 4.07 | Agree |
| 4. I can reflect on my mistakes and think of ways I can do | 51 (33.33) | 77 (50.33) | 23 (15.03) | 1 (0.65) | 0 (0.00) | 4.45 | Strongly Agree |
| 5. I can easily understand what I am writing | 41 (26.80) | 87 (56.86) | 24 (15.69) | 1 (0.65) | 0 (0.00) | 4.10 | Agree |
| 6. I am able to do better when all errors are indicated and corrected | 54 (35.29) | 75 (49.02) | 19 (12.42) | 2 (1.31) | 3 (1.96) | 4.14 | Strongly Agree |
| 7. I am more engaged when written suggestions are good | 57 (37.25) | 75 (49.02) | 18 (11.76) | 3 (1.96) | 0 (0.00) | 4.22 | Strongly Agree |
| 8. I am taught how to self-correct so I can remember my errors | 43 (28.10) | 78 (50.99) | 27 (17.65) | 3 (1.96) | 2 (1.31) | 4.03 | Agree |
| 9. I am given many ways to identify my errors in writing | 46 (30.07) | 85 (55.56) | 21 (13.73) | 1 (0.65) | 0 (0.00) | 4.13 | Agree |
| 10. I am given many ways to identify errors in writing | 46 (30.07) | 83 (54.25) | 20 (13.07) | 4 (2.61) | 0 (0.00) | 4.15 | Agree |
| 11. I comprehend information more effectively | 39 (25.49) | 92 (60.13) | 21 (13.73) | 1 (0.65) | 0 (0.00) | 4.10 | Agree |
| 12. I am motivated to do more revisions after reading the feedback | 43 (28.10) | 82 (53.59) | 28 (18.30) | 0 (0.00) | 0 (0.00) | 4.10 | Agree |
| 13. I avoid making the same errors in writing | 52 (33.99) | 72 (47.06) | 28 (18.30) | 1 (0.65) | 0 (0.00) | 4.14 | Agree |
| 14. I can correct teacher-identified errors on my own | 31 (20.26) | 70 (45.75) | 44 (28.76) | 6 (3.92) | 2 (1.31) | 3.80 | Agree |
| 15. I learn to locate errors on my own | 34 (22.22) | 75 (49.02) | 42 (27.45) | 2 (1.31) | 0 (0.00) | 3.93 | Agree |
| 16. I learn to locate and correct errors on my own | 34 (22.22) | 71 (46.41) | 46 (30.07) | 2 (1.31) | 0 (0.00) | 3.90 | Agree |
| 17. I always refer to the teacher’s analysis of errors on my own | 35 (22.88) | 74 (48.37) | 43 (28.10) | 1 (0.65) | 0 (0.00) | 3.93 | Agree |
| 18. I always reflect on my errors whenever I need to | 56 (36.60) | 74 (48.37) | 21 (13.73) | 2 (1.31) | 0 (0.00) | 4.21 | Strongly Agree |
| 19. I can identify my strong and weak points | 42 (27.45) | 77 (50.33) | 34 (22.22) | 0 (0.00) | 0 (0.00) | 4.05 | Agree |
| 20. I can apply what I have learned into practice | 50 (32.68) | 80 (52.29) | 23 (15.03) | 0 (0.00) | 0 (0.00) | 4.19 | Agree |
| Average weighted mean | | | | | | 4.10 | Agree |

The findings show that non-education students place high importance on reflective learning through written corrective feedback from teachers, particularly in terms of recognising errors and responding to constructive feedback. These findings were supported closely by the students’ answers to the open-ended questions regarding the effects of written corrective feedback on non-education students’ research writing competence. Respondents answered as follows:

Feedback helps me to identify where I need to improve on my papers. It helped me to acknowledge my mistakes and improve my writing skills.


Without feedback, it’s hard to know the errors and how to do better next time.

It enhanced my academic writing skills, and yes, it helped me grow and acknowledge my mistakes and shortcomings in writing.

Moreover, non-education students strongly agreed that they are more engaged with good written suggestions and refer to their teachers’ feedback on their research writing.

I consider it important for as a student who has already conducted a research study, feedback is where we rely more on writing our whole paper.

I take them positively for corrections is where knowledge begins to foster. Another, I do really enjoy learning. Correcting my mistakes will be so much appreciated.

The non-education students who agreed and acknowledged the positive effect of teachers’ written corrective feedback on their research writing expressed a willingness to receive and utilise feedback constructively. Respondents highlighted various ways in which feedback benefited their writing, as revealed through responses to the questionnaire. The respondents’ statements such as “It helps you see what mistakes you made and how to fix them” and “It serves as a tool for self-evaluation” resonated significantly with the indicator with the highest weighted mean, “Through the teacher’s written corrective feedback, I can reflect on my mistakes and think of ways I can do”.

These findings were further supported by the study of Alamis [1], where a majority of student respondents from the University of Santo Tomas found positive feedback to be beneficial in enhancing their written work.

However, non-education students may encounter challenges in independently identifying and rectifying errors based on teacher comments, especially when they struggle to comprehend the errors highlighted in their research writing. This aligns with the findings of De Los Santos and Dayan [8], indicating that students heavily rely on teachers for corrections and feedback, viewing them as writing models and experts. Given that students may lack confidence in their own writing and revision skills, they continue to seek their teachers’ guidance and assistance [26].

This implies that students may reflect on their own mistakes, engage with the written suggestions, and refer to the feedback whenever they need to. However, teachers still need to give proper guidance to their students on addressing the errors in their papers.

The study of Ferris [13] indicated that students who received written corrective feedback from teachers showed significant improvement in their writing abilities compared to those who did not receive such feedback. The study highlighted the importance of feedback in helping students reflect on their mistakes, identify areas for improvement, and engage in self-evaluation, which aligns with the opinions expressed by non-education students in acknowledging the benefits of feedback on their research writing.

For the demographic factors in table 7 (age, sex, and degree program), the analyses yielded non-significant findings, as evidenced by F-values of 1.597, 0.427, and 0.345, respectively, all accompanied by p-values greater than 0.05 (0.206, 0.514, and 0.709). Additionally, the partial eta squared values for age, sex, and degree program are relatively low at 0.022, 0.003, and 0.005, respectively, indicating that only a small proportion of the variance in the effects of written corrective feedback can be attributed to these demographic factors. Hence, the results suggest that neither age, sex, nor degree program significantly influences the effectiveness of teachers’ written corrective feedback.


Table 7: Relationship between the demographic profile and the effects of teachers’ written corrective feedback.

| Variables | F-value | p-value (sig. 2-tailed) | Partial eta squared | Interpretation |
|---|---|---|---|---|
| Age | 1.597 | 0.206 | 0.022 | Not significant |
| Sex | 0.427 | 0.514 | 0.003 | Not significant |
| Degree program | 0.345 | 0.709 | 0.005 | Not significant |

Table 8: Relationship between the GPA in Research 1 and 2 and the effects of teachers’ written corrective feedback.

| Variables | Mean | SD | r | p-value (sig. 2-tailed) | Interpretation |
|---|---|---|---|---|---|
| GPA in Research 1 | 1.95 | 0.461 | -0.191 | 0.018 | Significant |
| GPA in Research 2 | 2.08 | 0.423 | 0.723 | 0.000 | Significant |

The demographic profile of the respondents, particularly in terms of sex, plays no significant role in understanding the impact of written corrective feedback (WCF) in research writing. The study by Wondim et al. [27] highlighted that gender differences do not have a unique effect on the outcome of WCF. This implies that the effectiveness of WCF in improving language skills is not solely dependent on an individual’s gender.

Table 8 illustrates the significant relationship between GPAs in Research 1 and 2 and the effects of teachers’ WCF. The mean GPA for Research 1 is 1.95 with a standard deviation of 0.461, while for Research 2 it is slightly higher at 2.08 with a standard deviation of 0.423. The correlation analysis reveals contrasting trends: a weak negative correlation of r = -0.191 between the mean of the effects of teachers’ WCF and the GPA in Research 1, and a strong positive correlation of r = 0.723 between the mean of the effects of teachers’ WCF and the GPA in Research 2. Both correlations are statistically significant, with p-values of 0.018 and 0.000, respectively, indicating that as the mean of the effects of teachers’ WCF increases, the GPA in Research 1 decreases (or vice versa), whereas the GPA in Research 2 increases as the mean of the effects of teachers’ WCF increases.

The implications of the results indicating a weak negative correlation in Research 1 and a strong positive correlation in Research 2 between the effects of teachers’ WCF and GPA can be interpreted as follows. The decrease in Research 1 GPA suggests that as the feedback increases, the GPA in Research 1 decreases. This could imply that the feedback provided in Research 1 may not have been as effective in improving academic performance compared to Research 2.

On the other hand, the increase in Research 2 GPA indicates that as the feedback increases, the GPA in Research 2 also increases. This suggests that the feedback provided in Research 2 may have positively impacted academic performance, leading to higher GPAs.

Potential causes for these findings include distinctions between the quality or type of feedback offered in Research 1 and Research 2, differences in student involvement or responsiveness to input, and other external factors influencing academic achievement. Thus, the hypothesis that there is no significant relationship between respondents’ demographic profiles and the effects of written corrective feedback from teachers on students’ research writing is rejected.

Correlation analysis using Pearson’s r was performed to determine the mean (μ), standard deviation (SD), correlation coefficient (r-value), and p-value in order to test the significance of the relationship between the different indicators of the extent of use of written corrective feedback strategies and the effects of teachers’ written corrective feedback on students’ research writing (table 9). Results show that the direct written corrective feedback strategy obtained the highest mean (μ = 4.02) with a relatively low standard deviation (SD = 0.65), followed by unfocused written corrective feedback (μ = 3.74; SD = 0.61), indirect written corrective feedback (μ = 3.46; SD = 0.78), and focused written corrective feedback (μ = 3.37; SD = 0.83). Moreover, the computed r-values and p-values for direct written corrective feedback (r = 0.464; p = 0.000) and unfocused written corrective feedback (r = 0.315; p = 0.000) indicate significantly moderate positive relationships between the extent of use of written corrective feedback and its effects on students’ research writing. Indirect written corrective feedback obtained an r-value of 0.235 with a p-value below 0.01, indicating a significantly weak positive correlation. On the other hand, focused written corrective feedback obtained an r-value of 0.173, indicating a weak positive correlation; however, its p-value of 0.092 is greater than the conventional significance level of 0.05, so the relationship is not statistically significant.


Table 9: Relationship between extent of use of written corrective feedback strategies and the effects of teachers’ written corrective feedback on students’ research writing.

| Variables | Mean | SD | r | p-value (sig. 2-tailed) | Interpretation |
|---|---|---|---|---|---|
| Direct WCF | 4.02 | 0.65 | 0.464 | 0.000 | Significant |
| Indirect WCF | 3.46 | 0.78 | 0.235 | 0.003 | Significant |
| Focused WCF | 3.37 | 0.83 | 0.173 | 0.092 | Not significant |
| Unfocused WCF | 3.74 | 0.61 | 0.315 | 0.000 | Significant |

In summary, direct, unfocused, and indirect written corrective feedback showed statistically significant relationships between the extent of teachers’ use of written corrective feedback and its effects, while focused written corrective feedback showed no significant correlation. Thus, the hypothesis stating that there is no significant relationship between the extent of use and the effects of teachers’ written corrective feedback on students’ research writing is rejected.

The implication of these findings emphasises the importance of evaluating different written corrective feedback strategies to enhance students’ research writing outcomes. Direct, unfocused, and indirect feedback strategies show significant positive relationships with their frequency of use by teachers, indicating a potential impact on students’ writing. Direct feedback emerges as the most influential strategy, with a moderate positive correlation with its frequency of use, suggesting its potential to enhance students’ research writing skills significantly. Similarly, unfocused feedback also demonstrates a moderate positive correlation, while indirect feedback, though weaker, contributes to students’ writing development. Focused feedback, however, does not show a significant impact on research writing outcomes when used independently.

Educators should prioritise direct, unfocused, and indirect feedback strategies to maximise effectiveness. Comprehensive feedback covering various writing aspects is crucial for student improvement, as supported by Bitchener et al. [6]. Table 10 presents the results of the paired sample t-test assessing the difference between the respondents’ GPAs in Research 1 and Research 2.


Table 10: Significant difference in the GPA of respondents in Research 1 and 2.

| Variables | Mean | SD | t-value | p-value (sig. 2-tailed) | Interpretation |
|---|---|---|---|---|---|
| GPA in Research 1 | 1.95 | 0.461 | -4.704 | 0.000 | Significant |
| GPA in Research 2 | 2.08 | 0.423 | | | |

A paired sample t-test was utilised to determine whether the respondents’ GPA in Research 1 differed significantly from their GPA in Research 2. Results show that the GPA in Research 1 (μ = 1.95; SD = 0.461) differs significantly from the GPA in Research 2 (μ = 2.08; SD = 0.423), t = -4.704, p < 0.01, with the Research 1 GPA being numerically lower, which denotes higher performance on the grading scale used. Thus, the null hypothesis stating that there is no significant difference in the GPA of respondents in Research 1 and 2 was rejected.

This corroborates the findings of Sarmita [22], whose study explored potential factors contributing to the discrepancy between GPA 1 and GPA 2, particularly in scenarios where students’ initial GPA (GPA 1) was higher than their post-feedback GPA (GPA 2). The significant difference observed between GPAs 1 and 2 suggests that contributing factors are involved, such as students’ initial writing proficiency, motivation levels, and ability to incorporate or utilise feedback in their writing processes. The findings also highlight the beneficial impact of written corrective feedback on students’ academic performance, even when they initially possess higher GPAs. This underscores the importance of providing effective feedback to support continuous improvement and learning among college students.

5 Conclusion and recommendation

The study examined various aspects, including the demographic profile of the respondents, the types and frequency of feedback provided by teachers, and the relationship between these factors and students’ performance in their research courses. These insights provide a comprehensive understanding of how written corrective feedback influences students’ academic development and writing proficiency. The respondents, who are predominantly 22 years old or younger, are mainly female and primarily enrolled in BSIT and BSBA programs. Both groups demonstrated strong performance in their research courses, with a significant number earning “Good” grades.

Regarding the extent of teachers’ use of written corrective feedback strategies, it was found that teachers frequently provide direct, indirect, and unfocused feedback, with direct feedback being the most commonly used type and focused feedback the least used.

In terms of the effects of this feedback, respondents generally have a positive perception of receiving it. They value constructive feedback for its role in enhancing their learning and personal growth, recognising the importance of teachers in guiding them toward improved research writing competence. However, the findings indicate that the effectiveness of teachers’ written corrective feedback is not influenced by demographic factors such as age, sex, or degree program; the quality and impact of the feedback remain consistent across different groups.

The analysis also revealed a significant relationship between the extent of teachers’ use of written corrective feedback and the type of feedback provided, with the exception of focused feedback, which showed no significant correlation. Additionally, the effects of teachers’ written corrective feedback varied significantly between students’ GPAs in Research 1 and Research 2, suggesting that factors such as initial writing proficiency, motivation levels, and the ability to utilise feedback effectively may influence the writing process and outcomes.

Based on the findings and conclusions of the study, the researchers propose the following recommendations:

1.
Teachers should implement written corrective feedback strategies to meet the specific needs of non-education students in research writing. These strategies should focus on individual qualities and learning styles rather than demographic factors like age, sex, or degree program.
2.
Teachers should continuously assess and adjust feedback strategies based on student performance and progress.
3.
Students may collaborate with peers or seek peer feedback to supplement teacher-provided feedback and gain diverse perspectives on research writing proficiency.
4.
Campus administrators may conduct training sessions or workshops for teachers about the effective use of written corrective feedback.
5.
Future researchers may identify the potential challenges and difficulties encountered by the students in receiving written corrective feedback.
6.
Similar research should be conducted in the future to ascertain further the validity of the results of this study by considering the following:

(a)
respondents from private institutions
(b)
preferred type of feedback of the students
(c)
factors influencing students’ GPAs in research courses
(d)
teachers as respondents for the extent of use of WCF in students’ research writing
(e)
different research methods (add qualitative) and data-gathering instruments (pen and paper tests)

References

[1]    Alamis, M.M.P.: Evaluating Students’ Reactions and Responses to Teachers’ Written Feedbacks. Philippine ESL Journal 5, 40–57 (2010)

[2]    Aquino, C.J.B., Cuello, R.: Teachers’ Beliefs and Practices on Written Corrective Feedback: Matched or Mismatched? In: DLSU Research Congress 2020 “Building Resilient, Innovative and Sustainable Societies”. June 17-19, 2020, vol. 8 Series 7: Learners and Learning Innovations, DLSU Manila, Philippines (2020), URL https://www.dlsu.edu.ph/wp-content/uploads/pdf/conferences/research-congress-proceedings/2020/LLI-01.pdf

[3]    Aridah, A., Atmowardoyo, H., Salija, K.: Teacher Practices and Students’ Preferences for Written Corrective Feedback and Their Implications on Writing Instruction. International Journal of English Linguistics 7(1), 112 (Jan 2017), https://doi.org/10.5539/ijel.v7n1p112

[4]    Balanga, R.A., Fidel, I.V.B., Gumapac, M.V.G.P., Ho, H.T., Tullo, R.M.C., Villaraza, P.M.L., Vizconde, C.J.: Student Beliefs Towards Written Corrective Feedback: The case of Filipino High School Students. i-manager’s Journal on English Language Teaching 6(3), 22–38 (2016), https://doi.org/10.26634/jelt.6.3.8176

[5]    Best, K., Jones-Katz, L., Smolarek, B., Stolzenburg, M., Williamson, D.: Listening to Our Students: An Exploratory Practice Study of ESL Writing Students’ Views of Feedback. TESOL Journal 6(2), 332–357 (2015), https://doi.org/10.1002/tesj.152

[6]    Bitchener, J., Young, S., Cameron, D.: The effect of different types of corrective feedback on ESL student writing. Journal of Second Language Writing 14(3), 191–205 (2005), https://doi.org/10.1016/j.jslw.2005.08.001

[7]    Chen, S., Nassaji, H., Liu, Q.: EFL learners’ perceptions and preferences of written corrective feedback: a case study of university students from Mainland China. Asian-Pacific Journal of Second and Foreign Language Education 1(1), 5 (Apr 2016), https://doi.org/10.1186/s40862-016-0010-y

[8]    De Los Santos, T.M.A.B., Dayan, B.E.: Teachers’ Feedback Methods and Students’ Motivation in Writing. In: Proceeding of International Seminar Commemorating the 100th Anniversary of Tamansiswa “Education, Culture, and Nationalism in New Era”, pp. 107–122, UST-Press (2022), URL https://seminar.ustjogja.ac.id/index.php/ISECN/article/view/104

[9]    Ellis, R.: A typology of written corrective feedback types. ELT Journal 63(2), 97–107 (05 2008), https://doi.org/10.1093/elt/ccn023

[10]    Ellis, R., Sheen, Y., Murakami, M., Takashima, H.: The effects of focused and unfocused written corrective feedback in an English as a foreign language context. System 36(3), 353–371 (2008), https://doi.org/10.1016/j.system.2008.02.001

[11]    Ene, E., Yao, J.: How Does that Make You Feel: Students’ Affective Engagement with Feedback. Language Teaching Research Quarterly 25, 66–83 (Sep 2021), https://doi.org/10.32038/ltrq.2021.25.04

[12]    Farrokhi, F., Sattarpour, S.: The Effects of Focused and Unfocused Written Corrective Feedback on Grammatical Accuracy of Iranian EFL Learners. Theory and Practice in Language Studies 1(12), 1797–1803 (Dec 2011), https://doi.org/10.4304/tpls.1.12.1797-1803

[13]    Ferris, D.: Does error feedback help student writers?: New evidence on short-and long-term effects of written error correction. In: Hyland, K., Hyland, F. (eds.) Feedback in Second Language Writing: Contexts and Issues, chap. 5, pp. 81–104, Cambridge University Press, Cambridge (2006), https://doi.org/10.1017/CBO9781139524742.007

[14]    Hassan, M.: Descriptive Research Design – Types, Methods, and Examples (2024), URL https://researchmethod.net/descriptive-research-design/#sidr-main

[15]    Hattie, J., Timperley, H.: The Power of Feedback. Review of Educational Research 77(1), 81–112 (2007), https://doi.org/10.3102/003465430298487

[16]    Lee, I.: Understanding teachers’ written feedback practices in Hong Kong secondary classrooms. Journal of Second Language Writing 17(2), 69–85 (2008), https://doi.org/10.1016/j.jslw.2007.10.001

[17]    Mahfoodh, O.H.A.: “I feel disappointed”: EFL university students’ emotional responses towards teacher written feedback. Assessing Writing 31, 53–72 (2017), https://doi.org/10.1016/j.asw.2016.07.001

[18]    Martin, R.: Rhetoric of Teacher Comments on Student Writing. Young Scholars in Writing 8, 16–29 (Sep 2015), URL https://youngscholarsinwriting.org/index.php/ysiw/article/view/107

[19]    Pennington, C.R., Kaye, L.K., Qureshi, A.W., Heim, D.: Do gender differences in academic attainment correspond with scholastic attitudes? An exploratory study in a UK secondary school. Journal of Applied Social Psychology 51(1), 3–16 (2021), https://doi.org/10.1111/jasp.12711

[20]    Race, P.: The Lecturer’s Toolkit: A Practical Guide to Learning, Teaching and Assessment. Routledge, 4 edn. (2015), URL https://edc.mrgums.ac.ir/Uploads/User/4614/The%20Lecturers%20Toolkit%20A%20practical%20guide%20to%20assessment%2C%20learning%20and%20teaching%20by%20Phil%20Race%20(z-lib.org).pdf

[21]    Redd, B.R., Kennett, L.N.: Getting Students to Read Instructor Feedback (and maybe actually learn from it). College Quarterly 20(2) (2017), URL https://tinyurl.com/3k9akha3

[22]    Sarmita, R.N.: Contributing factors to the low grade point average (GPA) of undergraduate students. International Journal of Economics, Business and Management Research 2(3), 1–27 (2018), URL https://ijebmr.com/uploads/pdf/archivepdf/2020/IJEBMR_02_193.pdf

[23]    Sheen, Y., Wright, D., Moldawa, A.: Differential effects of focused and unfocused written correction on the accurate use of grammatical forms by adult ESL learners. System 37(4), 556–569 (2009), https://doi.org/10.1016/j.system.2009.09.002

[24]    Siewert, L.: The Effects of Written Teacher Feedback on the Academic Achievement of Fifth-Grade Students With Learning Challenges. Preventing School Failure: Alternative Education for Children and Youth 55(1), 17–27 (2011), https://doi.org/10.1080/10459880903286771

[25]    Sun, H., Qi, W.: Effects of Written Corrective Feedback on College EFL Students’ Writing Accuracy and Linguistic Knowledge Acquisition. Chinese Journal of Applied Linguistics 45(3), 445–461 (2022), https://doi.org/10.1515/CJAL-2022-0310

[26]    Wirantaka, A.: Investigating Written Feedback on Students’ Academic Writing. In: Nurmandi, A., Chen, Y., bin Ismail, N.A., Rafique, Z. (eds.) Proceedings of the Third International Conference on Sustainable Innovation 2019 – Humanity, Education and Social Sciences (IcoSIHESS 2019), Advances in Social Science, Education and Humanities Research, vol. 353, pp. 1–7, Atlantis Press (2019), https://doi.org/10.2991/icosihess-19.2019.1

[27]    Wondim, B.M., Bishaw, K.S., Zeleke, Y.T.: Effects of Teachers’ Written Corrective Feedback on the Writing Achievement of First-Year Ethiopian University Students. Education Research International 2023(1), 7129978 (2023), https://doi.org/10.1155/2023/7129978

[28]    Zhang, T., Chen, X., Hu, J., Ketwan, P.: EFL Students’ Preferences for Written Corrective Feedback: Do Error Types, Language Proficiency, and Foreign Language Enjoyment Matter? Frontiers in Psychology 12 (2021), https://doi.org/10.3389/fpsyg.2021.660564

[29]    Zohra, R.F., Fatiha, H.: Exploring Learners’ and Teachers’ Preferences Regarding Written Corrective Feedback Types in Improving Learners’ Writing Skill. Arab World English Journal 13(1), 117–128 (2022), https://doi.org/10.24093/awej/vol13no1.8