dQw4w9WgXcQ
What is moderation:
Before I begin I want to clarify exactly what I mean. Every person's final mark for each subject is calculated as the average of their HSC mark and their moderated school mark, and what I've found is the formula that NESA uses to calculate these moderated marks from the original marks given by schools together with the HSC marks. The idea behind this is that different schools set different exams with different difficulties, so it wouldn't be fair to compare school marks directly: equally qualified students could get different marks simply because one sat harder exams than the other. The way NESA accounts for this is by using the HSC as a benchmark to compare marks between schools, and the average school mark of each school is moderated to equal the average HSC mark of that school. For example, if the average mark in a school was 75% but those same students scored an average of 80% in the HSC, then on average 5 marks would be added to each student's school mark to reflect the fact that their exams were harder than the HSC. Importantly, this only holds on average: a given student might gain or lose any number of marks as long as everyone gains 5 marks on average, and what I've found is (at least what I believe to be) the exact formula that determines how each person's mark changes. In case I left anything out or this explanation wasn't clear, I've attached a video from NESA explaining the concept.
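To make those two averaging steps concrete, here's a tiny toy example in Python. The individual moderated marks below are made up purely for illustration; the only thing that matters is that their average matches the cohort's HSC average.

```python
# Toy numbers only: three students whose school average is 75 and HSC average is 80.
school    = [70, 75, 80]   # school average = 75
hsc       = [78, 80, 82]   # HSC average = 80

# Hypothetical moderated school marks: each student shifts by a different amount,
# but the average is pulled up to 80 to match the cohort's HSC average.
moderated = [73, 81, 86]

# Each student's final mark for the subject is the average of their HSC mark
# and their moderated school mark.
final = [(m + h) / 2 for m, h in zip(moderated, hsc)]
print(final)   # [75.5, 80.5, 84.0]
```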
Creating the formula:
NESA's website gives some clues as to how exactly this is done; namely, it states that three anchor points are taken to create a parabola, which then determines how the rest of the cohort is moderated. From the website we know that two of these anchor points come from sending the highest school mark to the highest HSC mark and the lowest school mark to the lowest HSC mark. Using this I found a way to fit a parabola through three points, and I've left links to some YouTube videos on the topic if you're interested in how it works. As for the third anchor point, I used the fact that the sum of the moderated marks has to equal the sum of the HSC marks and found a formula for the point that ensures this. I don't know if this is exactly how NESA does it, but I plugged in the example marks on their website and it predicted the same moderated marks that they show, so even if this isn't exactly how they do the calculations, it seems to lead to the same results. (From this point on I had to mess with some of the links because the site was being funny about them.)
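Before the links, here's a rough Python sketch of that construction: treat f(x) = a*x^2 + b*x + c as unknown, pin it with the two endpoint conditions plus the sum condition, and solve the resulting 3x3 linear system. This is my reconstruction rather than NESA's actual code, and the marks in the demo are placeholders.

```python
import numpy as np

def moderate(school_marks, highest_hsc, lowest_hsc, hsc_total):
    """Fit f(x) = a*x^2 + b*x + c so that the highest school mark maps to the
    highest HSC mark, the lowest school mark maps to the lowest HSC mark, and
    the moderated marks sum to the HSC total, then apply f to every school mark."""
    x = np.asarray(school_marks, dtype=float)
    # Three linear conditions on the coefficients (a, b, c).
    A = np.array([
        [x.max() ** 2,   x.max(),   1.0],
        [x.min() ** 2,   x.min(),   1.0],
        [np.sum(x ** 2), np.sum(x), float(len(x))],
    ])
    rhs = np.array([highest_hsc, lowest_hsc, hsc_total])
    a, b, c = np.linalg.solve(A, rhs)
    return a * x ** 2 + b * x + c

# Placeholder class: six school marks, with the HSC results summarised by the
# highest mark (92), lowest mark (40) and total (409); swap in real numbers.
marks = [90, 78, 75, 58, 55, 40]
print(np.round(moderate(marks, 92, 40, 409), 1))   # moderated marks, summing to 409
```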
6 student simulation:
https://www.desmos.com/calculator/gyllxw8igw
Maths videos:
NESA moderation page:
https://www.nsw.gov.au/education-and-training/nesa/hsc/exams-and-marking/assessment-moderation
How to use:
If you want to play around with this yourself, here's how you can go about doing that. The first entry is set to f(90), and by changing the number inside the brackets you can see what that mark will be moderated to, e.g. f(78) shows what 78 would be moderated to. s1 and s3 are the highest and lowest school marks respectively, while m1 and m3 are the highest and lowest HSC marks respectively. s2 and m2 are a bit more complicated and there's a lot of maths involved, but if you just want to play with the simulation all you need to know is that it doesn't actually matter what you make s2; I made it the second highest mark, but the formula for m2 will make sure you get the same graph regardless of what you make s2. The 409 in the m2 formula is the sum of all the HSC marks, and this is all you'll need to change if you want to change that, e.g. if you want to see what would happen if each of the 6 students scored 10 points higher in the HSC, you can add 60 to the 409. As for w1-w3 and l1-l3, again there's some complex maths involved which you don't really need to know to use the formula; all you need to know is that, because of these, the simulation only works for exactly six students. The numbers a1-a6 are the original school marks to be moderated, and these can be changed to see how the results depend on the original school marks.

Finally, I'll walk through the process of adding more students to the simulation. Let's say I want to see what would happen if this class had an extra student who got 65 in school and 70 in the HSC. First, add another entry a7=65 and change 409 to 479 in the m2 formula; after this, add l1(a7) to w1, l2(a7) to w2 and l3(a7) to w3. Also keep in mind that if the new student ends up with the highest or lowest school or HSC mark, you might need to change s1, s3, m1 or m3. If six students seems a bit small to start with, I've made another simulation with 30 students covering a broad range of marks, with the HSC results set to be identical to the school results, to give a good starting point for seeing how certain changes affect different students. (If you'd rather script this than click around in Desmos, there's a rough Python version of the set-up just after the next link.)
30 student simulation:
https://www.desmos.com/calculator/lld6wum7ym
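If you'd rather script the set-up than edit the Desmos entries, here's a rough Python translation using the same names as the sheet (s1-s3, m1-m3, l1-l3, w1-w3, the a values and f). The school marks and the 409 total match the 6 student simulation described above, but the highest and lowest HSC marks (92 and 40) are placeholders, so swap in your own numbers.

```python
# Same variable names as the Desmos sheet; the school marks a1..a6 and the 409
# HSC total come from the 6 student simulation, while m1 and m3 are placeholders.
a = [90, 78, 75, 58, 55, 40]         # a1..a6: the raw school marks
hsc_total = 409                      # the 409 in the m2 formula (sum of HSC marks)
m1, m3 = 92, 40                      # highest and lowest HSC marks (made up here)

s1, s3 = max(a), min(a)              # highest and lowest school marks
s2 = sorted(a)[-2]                   # any third x-value works; second highest is convenient

# l1..l3: the Lagrange basis parabolas through s1, s2, s3
l1 = lambda x: (x - s2) * (x - s3) / ((s1 - s2) * (s1 - s3))
l2 = lambda x: (x - s1) * (x - s3) / ((s2 - s1) * (s2 - s3))
l3 = lambda x: (x - s1) * (x - s2) / ((s3 - s1) * (s3 - s2))

# w1..w3: each basis parabola summed over every school mark in the class
w1, w2, w3 = [sum(l(x) for x in a) for l in (l1, l2, l3)]

# m2 is chosen so the moderated marks sum to the HSC total
m2 = (hsc_total - m1 * w1 - m3 * w3) / w2

def f(x):
    """Moderated mark for a school mark x."""
    return m1 * l1(x) + m2 * l2(x) + m3 * l3(x)

print(round(f(90), 1))                    # the first Desmos entry, f(90)
print([round(f(x), 1) for x in a])        # every student's moderated mark
print(round(sum(f(x) for x in a), 1))     # sanity check: equals hsc_total
```

Adding a seventh student who got 65 in school and 70 in the HSC is just appending 65 to the list and adding 70 to hsc_total; the w values and m2 recompute themselves, which is the same bookkeeping as adding l1(a7), l2(a7) and l3(a7) to w1, w2 and w3 by hand.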
The issues:
Of course, the formula isn't perfect (far from it), and there are many ways that seemingly small changes can dramatically change how marks are distributed. After a few hours of playing around with it, here's some of what I've found.

Firstly, the formula assumes that the only reason students from different schools would perform differently is a difference in the difficulty of their exams, but this fails to account for all sorts of other factors like early entry, burnout and different marking methods, just to name a few. To illustrate this I went into the simulation with 30 students and subtracted 25 marks from the 1985 total, to simulate 5 students getting an early entry offer, not studying for the final as much as they otherwise would have, and hence scoring 5 marks less than they otherwise would have. Already most students lose a mark despite putting in the same hours and getting the same results as before. A mark doesn't sound like a lot, but it could be the deciding factor in whether someone gets accepted into a degree or not, and I know plenty of people whose ATARs were less than one point below what they needed.

The next issue is that two of the three anchor points which determine the graph come from the results of individual students, and if either of these two students overperforms or underperforms for any reason, everyone else's marks change dramatically. To illustrate this I again went to the simulation with 30 students, changed m1 from 95 to 92 and changed the sum of HSC marks from 1985 to 1982. This minor change of one person underperforming by 3 marks now means that 92 is moderated to 90, 90 to 88, 87 to 85, and so on, all from a single person scoring three marks less in the HSC than they did in school.

This leads me to the final and most devastating flaw: since the highest and lowest school marks are always moderated to the highest and lowest HSC marks, it doesn't actually matter what those students get in school as long as they stay first and last. This means a school could theoretically enter the highest mark as one above the second highest and the lowest as zero to artificially lower the school average. To show this last flaw I returned to the 6 student simulation and changed the highest mark from 90 to 79 and the lowest from 40 to 0. Now the student who scored 78 in school and was originally moderated to 77 gets 89, and 75 becomes 82, while the students who scored 58 and 55 both lose marks and would actually end up with a lower moderated mark than the person who got sent to 0, so the formula appears to be breaking down (there's a small code sketch at the end of this post that reproduces this experiment). Now, I don't know if a change this large is something NESA would notice; their website does say that students with atypical exam performance are excluded, but as for what qualifies as atypical exam performance, I don't know, and even small changes to these two marks are enough to dramatically sway the final distribution of marks. This effectively means that schools can play around with their marks before reporting them to NESA in order to boost students into a band 6, hence increasing the school's ranking, which is determined by the percentage of band 6s.

Now, this is just what I found and there might be more. I made this primarily as a tool for people to estimate their moderated marks, but I also think the flaws I've found deserve to be more widely known, especially given the potential consequences and the fact that it might be entirely possible to fix some of them.
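For anyone who'd rather reproduce that last experiment in code than in Desmos, here's a small sketch using the same three-condition fit as the earlier one (repeated so it runs on its own). The HSC figures are still placeholders, so the outputs won't match the Desmos numbers exactly, but the behaviour is the same: the 78 and 75 students gain several marks while the 58 and 55 students lose marks.

```python
import numpy as np

# Same three-condition fit as the sketch in the "Creating the formula" section,
# repeated here so this snippet runs on its own.
def moderate(school_marks, highest_hsc, lowest_hsc, hsc_total):
    x = np.asarray(school_marks, dtype=float)
    A = np.array([
        [x.max() ** 2,   x.max(),   1.0],
        [x.min() ** 2,   x.min(),   1.0],
        [np.sum(x ** 2), np.sum(x), float(len(x))],
    ])
    a, b, c = np.linalg.solve(A, [highest_hsc, lowest_hsc, hsc_total])
    return a * x ** 2 + b * x + c

honest = [90, 78, 75, 58, 55, 40]
gamed  = [79, 78, 75, 58, 55, 0]   # top mark reported one above second place, bottom as zero

# 92, 40 and 409 are the same placeholder HSC figures as before.
print(np.round(moderate(honest, 92, 40, 409), 1))
print(np.round(moderate(gamed,  92, 40, 409), 1))
```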

