This article provides insight into how grades are calculated, and which aspects and features can influence those calculations.
Within the grading module of each of our tools, you have full control over the desired factors and their weight towards the final grade of the assignment. For the Feedback Assignments (Peer Review, Group Member Evaluation, Skill Review and Assignment Review), examples of these factors are whether students completed the hand-in step, whether they completed their reviews, or the scores they received from their peers.
If the latter has been chosen, students' grades are calculated based on the average of the scores they received for each of the criteria: for example, the average of the scores provided by three reviewers across three criteria. It is important to note that only the scores students actually received are included in the average, rather than the scores they were expected to receive. If three reviews were required but only two reviewers completed theirs, the average is based on the scores left by those two reviewers.
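The averaging rule described above can be sketched as follows. This is a minimal illustration, not the actual implementation; the function name and the scores used are hypothetical.

```python
# Sketch of the averaging rule: only scores that were actually
# received are included; missing reviews are simply absent.
# All names and numbers here are hypothetical illustrations.

def average_received_score(reviews):
    """Average all criterion scores across the reviews a student received."""
    scores = [score for review in reviews for score in review]
    if not scores:
        return None  # no reviews completed -> no grade available
    return sum(scores) / len(scores)

# Three reviews were required, but only two reviewers completed
# theirs (each review scores three criteria):
received = [
    [8, 7, 9],  # reviewer 1
    [6, 7, 8],  # reviewer 2
]
print(average_received_score(received))  # 7.5
```

Note that the missing third review does not drag the average down; it is simply left out of the calculation.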
Our allocation method strives to prevent an unequal distribution of the reviews amongst the students. Allocations will only be made once students have been active in the assignment in question.
What happens to the grading when students do not participate in the Group Member Evaluation?
If students do not participate in the Group Member Evaluation process, they can still receive reviews as long as they opened the assignment; however, by not participating they deprive a peer of a review.
Depending on the grading settings chosen while setting up the assignment, this can affect students' grades, particularly if the Group Configurable Grading option has been enabled, since that option makes the student's contribution, as measured by the scores of their peers, the dominant determinant of the grade they receive.
The effect of the lack of participation depends on the number of reviews a student is set to receive. In case multiple peers are to review student A, and only one of the peers does not complete the feedback, the grade of student A will be determined by the evaluation of the peers who did complete their review.
If none of the students set to review student A complete their reviews, the student will not receive any scores and no grade will be available for them. This creates a limitation: FeedbackFruits will not be able to send a score to the LMS once the grades are published.
If a student receives even a single rating, their grade will be based on that rating alone, taking the weight of this element of the grade into account. As a result, the grade may not be representative of the student's performance.
In the rare case that the above-mentioned scenarios apply, you still have the option to manually adjust the grades of the students in question within the assignment or the grade book of your LMS. Here you also have the option to reallocate the student by using the Manual Allocation option.
In case you still have doubts about your specific use case, please don't hesitate to contact our support department for additional guidance.
How could the Group Contribution Factor influence the grading?
By enabling the Group Contribution Factor within Group Member Evaluation, each student's participation is evaluated relative to their fellow group members, and a factor between 0 and 2 is assigned to reflect that participation. Based on this comparison (and the settings for the contribution factor), the system will suggest a possible adjustment to the grade. Three scenarios are possible:
A student will receive a zero (0) if their contribution factor is between 0 and 0.49.
A student's grade will be multiplied by their contribution factor if it is between 0.5 and 1.
A student's grade will not be altered if the contribution factor lies between 1.01 and 2.
After enabling the Group Contribution Factor within the grading module, you can configure the values that will be used to categorize the student's contribution. Here it is important to note that the factor will change when new ratings for students are added.
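The three scenarios above can be summarized in a short sketch. The thresholds come from this article; the function name is hypothetical and this is not the actual implementation.

```python
# Sketch of the suggested grade adjustment based on the Group
# Contribution Factor (thresholds taken from the article above;
# the function name is a hypothetical illustration).

def suggested_grade(grade, factor):
    """Suggest an adjusted grade given a contribution factor in [0, 2]."""
    if factor <= 0.49:
        return 0.0             # contribution between 0 and 0.49: zero
    if factor <= 1.0:
        return grade * factor  # grade multiplied by the factor
    return grade               # factor between 1.01 and 2: unchanged

print(suggested_grade(8.0, 0.3))   # 0.0
print(suggested_grade(8.0, 0.75))  # 6.0
print(suggested_grade(8.0, 1.5))   # 8.0
```

In other words, a low factor zeroes the grade, a middling factor scales it down proportionally, and a factor above 1 never raises the grade.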
This feature serves as an addition when evaluating the group process. You remain in full control: as a teacher, you have the flexibility to decide whether it is appropriate to implement the suggested grade adjustments for all students, for a specific group, for individual students, or not at all.
The grading will therefore only be influenced if you decide that it is appropriate and adjust the grades.
The Group Contribution Factor builds on the previous work in this field by SparkPlus.
Note: The Group Contribution Grading function is currently a beta feature. If you would like to use this feature and it is not yet available for you, please contact our support chat so they can enable the function for your account.
Can the system analyze and detect student behavior within a Group Member Evaluation?
By enabling our Detect Outliers feature within Group Member Evaluation, you will be provided with insight into the behavior of the students based on how they rated themselves and their peers. Please note that in order for this feature to work, the groups need to have at least four members.
Based on this comparison the system can assign a label to the student. These labels are a mere indication of the situation and serve as a recommendation for the teacher to take the action they see fit.
These labels classify students based on various factors; below you will find an overview along with the numerical criteria that must be met for a label to be assigned. The Detect Outliers feature is inspired by research from Purdue University and their CATME assessment technique.
High Performer: assigned when a student receives an average rating above 70% and their average rating is more than 10% higher than the overall average rating of the team as a whole.
Low Performer: assigned when a student appears not to have contributed to the success of the team. The condition for this label is that the student received an overall rating lower than 50%.
Underconfident: assigned when the overall team rating is above 60%, yet the student rated themselves at least 20% lower than the overall rating, indicating that the student appears to be 'underconfident' or is too critical of their personal contributions.
Overconfident: assigned when the overall rating for a student is less than 60%, yet the student rated themselves at least 20% higher, indicating that they appear to be 'overconfident' and rate their contributions higher than their fellow group members do.
Manipulator: assigned when a student appears to "skew the curve" by giving themselves high ratings while rating their fellow group members poorly. The condition for this label is that the student rated themselves 80% or higher overall, while rating their fellow team members at least 40% below this rating, measured using the average across the team members.
Conflict: assigned when a student rates a specific team member 40% or less, while the median rating from the rest of the team is 60% or more. This generally indicates a conflict between two team members.
Clique: assigned when the team appears to have split into two non-cooperating groups showing protective insider ratings. This happens when there is significant disagreement between the ratings from various team members, as evidenced by a standard deviation above 0.23 for the 0-to-1 normalized peer ratings.
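A few of the label conditions above can be sketched in code. This is an illustrative reading of the article's thresholds, not the actual FeedbackFruits implementation: the function and parameter names are hypothetical, and the "10%/20% higher or lower" comparisons are interpreted here as percentage-point differences, which is an assumption.

```python
from statistics import median

# Illustrative sketch of some label conditions, using the thresholds
# from this article. Ratings are percentages (0-100). This is NOT the
# actual implementation; names and threshold interpretations
# (percentage points) are assumptions.

def outlier_labels(peer_ratings, self_rating, team_average):
    """Return the labels that apply to one student.

    peer_ratings: ratings (0-100) the student received from teammates.
    self_rating:  the rating (0-100) the student gave themselves.
    team_average: the overall average rating of the team as a whole.
    """
    avg = sum(peer_ratings) / len(peer_ratings)
    labels = []
    if avg > 70 and avg > team_average + 10:
        labels.append("High Performer")   # well above team average
    if avg < 50:
        labels.append("Low Performer")    # overall rating below 50%
    if team_average > 60 and self_rating <= avg - 20:
        labels.append("Underconfident")   # rates self well below peers' view
    if avg < 60 and self_rating >= avg + 20:
        labels.append("Overconfident")    # rates self well above peers' view
    return labels

def has_conflict(rating_given, ratings_from_rest):
    """Conflict: one member rates a peer at 40% or less while the
    median rating from the rest of the team is 60% or more."""
    return rating_given <= 40 and median(ratings_from_rest) >= 60

# Hypothetical example: a strong student who undervalues themselves.
print(outlier_labels([80, 85, 90], self_rating=60, team_average=70))
# ['High Performer', 'Underconfident']
```

As in the article, a student can match several conditions at once, which is why the sketch returns a list of labels rather than a single one.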
For the Detect Outliers feature to work properly, it is important that students have made significant progress in giving their reviews.
Because the main goal of this feature is to draw attention to situations that may require extra attention, and the system cannot guarantee that the assigned labels are valid, no grade adjustments are suggested. However, once you have verified the situation, you can still manually adjust the grades as you see fit within the grading module before publishing the grades.
Hopefully, this article has provided you with more insight into the various aspects that influence the grade calculations, along with a better understanding of the features that can help you ensure grades are assigned properly.
This concludes the Grade Calculations article.
If you have any questions or experience a technical issue, please contact our friendly support team by clicking on the blue chat button (Note: support is available 24h every weekday & unavailable on the weekend).