Member
Joined: Sep 2008
Posts: 1,898
In England [UK complexity discussion deleted...] there are a number of examination boards, each setting its own syllabus and examinations. Schools choose which board to enter their students for, so they retain some choice of syllabus: it's common for a school to use mostly Board A but to choose Board B for a particular subject because it prefers Board B's syllabus for that subject. Most 18-year-olds in England take A levels, but many of the most highly selective schools have switched partly or completely to the newish Cambridge Pre-U, which is designed to have more headroom: someone who got a top grade at A level might not do so at Pre-U. Ofqual provides coordination and control to prevent a "race to the bottom", and someone (I'm not sure who, actually) ensures there's a core of material universities can rely on in maths, for example.

You can see syllabuses and sample papers at the exam boards' sites, e.g. CIE. Here is a sample Pre-U Chemistry set of papers - note that it does in fact include a multiple-choice paper, but it's not that soft an option :-)
Last edited by ColinsMum; 07/23/11 02:00 AM. Reason: links, accuracy!
Email: my username, followed by 2, at google's mail
Member
Joined: Jul 2010
Posts: 480
When everything is based on "one exam", presumably for all the colleges in the country, who determines what that exam is? I don't trust the federal government to do so.

The exams are very different overseas. First, no multiple choice. Second, students often get to choose the questions they want to answer: the paper may have 8 questions, and you only have to answer 5 (or whatever). The exams are graded by humans who are subject experts, using guidelines created by the examination commission (here's an example from a recent mathematics exam in Ireland). If you look at that exam, you'll see that it's light years ahead of our SAT or AP exams in terms of what it measures and how it measures it.

There's typically a national curriculum, and all students take the same exam. I understand that many Americans are used to the idea of local control, but this approach wastes a lot of money by repeating effort and doesn't guarantee quality anyway. A national curriculum is also transparent. As far as I know (in Ireland and the UK, anyway), the exams are written by subject experts (university academics, possibly teachers). I recall that my university tutor (in Ireland) wrote questions for the O or A levels (in the UK).

In predicting college grades, both high school grades and SAT/ACT scores matter. Why shouldn't high school grades count? With respect, that's a very US-centric view of how things ought to be, and it's not backed up by evidence; I also think there are questions about the predictive value of the SAT, not to mention the effects of grade inflation. The goal of the overseas exams is to measure how well you learned the material overall, not how well you learned it as it came, in small pieces. Obviously, European students get grades every year, but as ColinsMum said so well, it's a mistake to use the same work for learning and for summative assessment. What counts is eventual mastery, not the steps taken getting there.

A parallel is athletic competitions. We don't pick Olympic competitors or medalists based on how well they did throughout the duration of their training. Nothing matters but the "test": how fast you ran the 100 in the qualifying race, how good your axels looked, etc.

One thing about US schools that really bothers me is that the percentage of correct problems on homework often counts toward final grades. Students aren't even allowed to make mistakes while they're learning! To me, that kind of constant pressure has got to be worse than one set of exams that you can repeat next year. Apart from the limited scope of the testing, and ending up with no guaranteed knowledge, sitting the SAT before your last week of school wastes that whole last year, which could be spent learning and working. And scores from any time before that final year (or two) shouldn't have any bearing on further study, or even be reported, once you're done.
Member
Joined: Feb 2010
Posts: 2,640 Likes: 2
Quote:
SAT scores were found to be correlated with FYGPA (r = 0.54), with a magnitude similar to HSGPA (r = 0.56). The best set of predictors of FYGPA remains SAT scores and HSGPA (r = 0.63), as the addition of the SAT sections to the correlation of HSGPA alone with FYGPA leads to a substantial improvement in prediction (Δr = 0.07).

Okay, but this study found that the SAT I is a poor predictor compared to grades. Also, the article you cited was written by the College Board, which is hardly in a position to be objective about the value of the SAT!

Lots of studies have found that high school grades and SAT scores together are predictive. A recent study found that previous studies may have underestimated their predictive power because they did not control for course-selection effects, where able students cluster in the same courses, which may be graded on a curve.

http://pss.sagepub.com/content/20/7/822.abstract

Individual Differences in Course Choice Result in Underestimation of the Validity of College Admissions Systems
Christopher M. Berry, Wayne State University, Department of Psychology, 5057 Woodward Ave., 7th Floor, Detroit, MI 48202, e-mail: berry@wayne.edu
Psychological Science, July 2009, vol. 20, no. 7, pp. 822-830

Abstract: We demonstrate that the validity of SAT scores and high school grade point averages (GPAs) as predictors of academic performance has been underestimated because of previous studies' reliance on flawed performance indicators (i.e., college GPA) that are contaminated by the effects of individual differences in course choice. We controlled for this contamination by predicting individual course grades, instead of GPAs, in a data set containing more than 5 million college grades for 167,816 students. Percentage of variance accounted for by SAT scores and high school GPAs was 30 to 40% lower when the criteria were freshman and cumulative GPAs than when the criteria were individual course grades. SAT scores and high school GPAs together accounted for between 44 and 62% of the variance in college grades. This study provides new estimates of the criterion-related validity of SAT scores and high school GPAs, and highlights the care that must be taken in choosing appropriate criteria in validity studies.
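For readers unfamiliar with the statistics, here is a minimal sketch (not from the thread itself) of how the correlation coefficients quoted above translate into "variance explained": squaring r gives the proportion of variance in the outcome that the predictor accounts for, which is how a combined r of 0.63 corresponds to roughly 40% of the variance in first-year GPA, well below the 44-62% figures the Berry abstract reports for individual course grades.

```python
# Correlations quoted in the College Board excerpt above.
r_sat = 0.54       # SAT vs. first-year GPA (FYGPA)
r_hsgpa = 0.56     # high-school GPA vs. FYGPA
r_combined = 0.63  # SAT + HSGPA together vs. FYGPA

def variance_explained(r: float) -> float:
    """Proportion of outcome variance accounted for: r squared."""
    return r ** 2

for label, r in [("SAT alone", r_sat),
                 ("HSGPA alone", r_hsgpa),
                 ("SAT + HSGPA", r_combined)]:
    print(f"{label}: r = {r:.2f}, variance explained = {variance_explained(r):.0%}")

# The "substantial improvement in prediction" the excerpt mentions:
delta_r = r_combined - r_hsgpa
print(f"Delta r from adding SAT to HSGPA: {delta_r:.2f}")
```

The point of the arithmetic is that a modest-looking Δr of 0.07 still moves variance explained from about 31% to about 40%, which is why the combined predictor set matters.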
"To see what is in front of one's nose needs a constant struggle." - George Orwell
Member
Joined: Feb 2010
Posts: 2,640 Likes: 2
Quote:
When everything is based on "one exam", presumably for all the colleges in the country, who determines what that exam is? I don't trust the federal government to do so. The exams are very different overseas. First, no multiple choice. Second, students often get to choose the questions they want to answer; the paper may have 8 questions, and you only have to answer 5 (or whatever). The exams are graded by humans who are subject experts, using guidelines created by the examination commission (here's an example from a recent mathematics exam in Ireland). If you look at that exam, you'll see that it's light years ahead of our SAT or AP exams in terms of what it measures and how it measures it. There's typically a national curriculum and all students take the same exam. I understand that many Americans are used to the idea of local control, but this approach wastes a lot of money by repeating effort and doesn't guarantee quality anyway. A national curriculum is also transparent. As far as I know (in Ireland and the UK anyway), the exams are written by subject experts (university academics, possibly teachers). I recall that my university tutor (in Ireland) wrote questions for the O or A levels (in the UK).

Thanks for pointing to the Irish exam. It does look challenging. I wonder what math textbooks are used by students preparing for this exam.
"To see what is in front of one's nose needs a constant struggle." - George Orwell
Member
Joined: Sep 2008
Posts: 1,898
I don't know specifically what is usually used, but the material looks like a good match for the various books by Bostock and Chandler for A level (A Core Course for A Level, Further Pure Mathematics, Mechanics and Probability, etc.). They've been around and continually reissued at least since I used them in school in the early 80s, and I recently bought a set for DS!