1. We’re no longer the gold standard in testing.
The SAT (the college admissions test formerly known as the Scholastic Aptitude Test) — and the anxiety that comes with it — have plagued students for roughly a century now.
The first standardized admissions test, administered in 1901, was an essay-only exam developed by a group of U.S. colleges, including Harvard, that wanted a uniform way to determine if prospective students could handle college coursework. In 1926, the first multiple-choice SAT was administered, according to the College Board.
Today, the SAT holds a critical spot in the college admissions process. While only 8,000 students took the test in 1926, more than 1.6 million did last year. That’s partially because the majority of four-year colleges now require prospective students to take the SAT, or its cousin, the ACT, before they will admit them. The cost of college is soaring — in the past decade, the average annual cost of a public four-year college increased by an inflation-adjusted 37% to $18,391 — and many scholarships are tied to test performance. Plus, college is a must for many students who want careers: the unemployment rate for college graduates is just 3.4%, compared to 6.3% for those with just a high-school diploma.
So it’s no wonder that parents and students looked on with interest (and alarm) when in March, the College Board, which operates the SAT, announced that it would revamp the test, getting rid of the mandatory essay and obscure vocabulary words, adding more reading materials from across disciplines like science and social studies and shifting a perfect score from 2400 back to 1600, with a separate score for the optional essay. The College Board says that the moves — which will take effect in spring 2016 — are designed to make the test “more focused and useful than ever before.” “The College Board is making a commitment to increase the college and career readiness of all students by offering a solution that goes well beyond simply administering another test — and well beyond what is offered by the ACT,” says Katherine Levin, a spokeswoman for the College Board.
But critics say that some of these moves make the SAT more like its arch-nemesis, the ACT, and that these moves may have been, at least in part, borne out of the fact that the SAT has lost its once-dominant spot to the ACT. “The College Board was feeling pressure from this,” says Joseph Soares, a professor of sociology at Wake Forest University and the author of “SAT Wars.” In 2012, for the first time ever, more students took the ACT (1,666,017) than the SAT (1,664,479), a trend that continued into 2013 when roughly 140,000 more students took the ACT than the SAT, according to FairTest, The National Center for Fair and Open Testing, a non-profit testing watchdog organization. And that will likely happen again this year, says Bob Schaeffer, the public education director of FairTest.
There are many reasons for this shift, including the fact that you don’t “need” the SAT to go to college anymore. Almost all schools will now accept the ACT, and the ACT is perceived as more consumer friendly than the SAT, as its writing section is optional, there’s no deduction for wrong answers and its subject areas tend to more closely reflect what’s learned in high school, says Schaeffer. Plus, for years, the ACT aggressively marketed itself as a supplement for high school exit exams, which are supposed to reflect students’ readiness to graduate, says Schaeffer; thus the ACT now has contracts to test eleventh graders in 13 states, which means that nearly all students in these states automatically take the ACT anyway.
2. The richer you are, the better you do.
Money talks — at least when it comes to college admissions tests like the SAT. In 2013, students whose families had annual income of less than $20,000 scored an average of 1326; between $40,000 and $60,000, an average of 1461; between $100,000 and $120,000, an average of 1569; and more than $200,000, an average of 1714. This relationship between income and performance has held for years, the data show. “Kids from wealthy families do better on the test,” says Soares.
Part of this discrepancy is that family income is “connected to at least half a dozen other factors” that give richer kids better opportunities to score well on the SAT, including the fact that wealthier families tend to send their children to better schools and parents in these families tend to be better educated themselves, says Jay Rosner, the executive director of the Princeton Review Foundation, a nonprofit dedicated to test fairness.
Plus, wealthier families can afford pricey test prep, Rosner adds. “Some people pay $5,000 to $10,000 or more for private tutors,” he says — which helps ensure the students do the prep work and learn test-taking strategies to help on the test. The College Board points out that it is partnering with Khan Academy to create free test-prep materials. But Soares says that this won’t do anything to change the rich/poor score discrepancy, as “there are already a lot of free materials online and the digital divide is not going away — kids in the bottom 50% still will not have access to materials.”
3. We favor white and Asian men.
Income isn’t the only factor that’s highly correlated with SAT scores — ethnicity is, as well. In general, Asians get the highest scores (1645 on average) on the SAT, followed by whites (1576). Compare that to African-Americans (who tend to score lowest at 1278), Puerto Ricans (1354) and Mexican or Mexican-American test-takers (1355), and you’ll see a 200+ point score differential. That trend holds up in other years — a study by Rosner found that Latinos and African-Americans tend to score about 67 and 100 points lower, respectively, on both the math and verbal sections than whites.
Gender matters, too. In 2013, women scored an average of 1,486, while men scored 1,512. That’s also a long-standing trend, Rosner says, with women scoring about 33 points lower than men on the math section on average. Women and men get roughly equal verbal scores.
Some say that these gaps reflect differences in things like wealth, income or quality of schools, to name a few. But Rosner says that this is the result, in large part, of how the College Board selects test questions. The test makers select new questions for the test that fall along a certain point on the age-old bell curve (so that a certain percentage of people get it right and a certain percentage get it wrong) — and “they kick out questions that mess up the bell curve,” says Soares. That means that the test may inadvertently discriminate, Rosner says: “If high-scoring test-takers — who are more likely to be white (and male, and wealthy) — tend to answer the question correctly in pre-testing, it’s a worthy SAT question; if not, it’s thrown out,” writes Rosner in a 2003 study. He notes that while race and ethnicity are not considered explicitly, “racially disparate scores drive question selection, which in turn reproduces racially disparate test results in an internally reinforcing cycle.”
“Years of research have consistently demonstrated that the SAT is a valid predictor of first-year college success for all students, regardless of gender, race or socio-economic status,” says the College Board’s Levin. “We pre-test each question and gather statistics by race and ethnicity to make sure questions are fair to all students.”
4. We get the answers wrong on our own test.
By its own admission, the College Board has gotten the answers wrong on its own test and been forced to correct test-takers’ scores because of it. And in one case, it was a then-17-year-old student who alerted the organization to the error.
But that may not be the most egregious of SAT errors, as some critics say that the test is flawed in other ways — in particular with respect to the essay portion of the SAT, as it doesn’t have a “right” answer. Indeed, Les Perelman, the former director of undergraduate writing at MIT, has blasted the essay because he says it rewards students who write longer essays (and as any writer knows, length doesn’t always mean quality) and it doesn’t focus on factual accuracy; others have complained that essay graders have too many essays to grade and may get grading fatigue.
The College Board points out that each essay is read by two people who give it a score between 1 and 6, and if the scores differ by more than one point, a third reader reads the essay. What’s more, this will become a moot point for some students come 2016, as the College Board will make the essay optional (though some schools may still require students to take it and turn in their scores).
5. We can be as bad as the airlines when it comes to fees.
Students and their parents can easily spend hundreds of dollars in a single year by taking the SAT, even though taking the actual test only costs $51. That’s thanks to all the ancillary fees that the College Board tacks on. Some of these fees include: $15 to register by phone or get scores by phone; $27.50 to change the date you want to take the test or where you’ll take it; $45 to join the waitlist for a test (this is only charged if you get admitted to take the test); $31 to rush test scores to a school; and a hefty $55 to get your test scored by a person rather than machine. And if you change your mind about one of these services, too bad: the fees are not refundable. “They are pushing the costs off on students,” says Soares.
The ACT has a similar set of fees (it costs $52.50 to take that test with the optional writing portion), and the College Board does offer fee waivers for some students who can’t afford to pay. Still, many students who don’t qualify for a waiver find the fees a stretch to pay.
All those fees are padding the College Board’s bottom line. Though the organization is a non-profit, the College Board rakes in more than $750 million in annual revenue, according to its latest Form 990, and pays out $191 million in salaries, benefits and other compensation (the compensation for the College Board’s former president Gaston Caperton was more than $1.4 million in 2011, and nearly two dozen employees made $200,000 or more). “The College Board gives the impression that it’s a low-budget educational institution, but it has its own building across from Lincoln Center and the previous president made $1.5 million,” says Schaeffer, who says that the organization made $45 million in SAT revenue alone. The College Board also makes money from other exams like Advanced Placement tests, professional development and college-readiness programs and services and investment income, among other sources.
6. Schools say we’re less relevant.
More schools are making standardized tests like the SAT optional for admission. Schaeffer says that more than 800 of the roughly 3,000 bachelor-degree granting colleges and universities in the U.S. are now test-optional — and it’s not just schools with less-than-stellar reputations. Top-tier schools like Wake Forest University, Bard College and Bowdoin College don’t require students to take the SAT (or ACT) to apply for admission; FairTest data show that the list of top-tier schools that don’t require the ACT or SAT has hit 150.
When Wake Forest made the decision to go test-optional starting in 2009, it said the move was designed to “broaden the applicant pool and increase access at Wake Forest for groups of students who are currently underrepresented at selective universities.” It seems to have had its desired effect: Wake Forest dean of admissions Martha Allman says that the school now has more students who are eligible for Pell Grants, which are given to low-income students who demonstrate financial need, more first-generation college students and more minorities.
Schaeffer says that he thinks more colleges will follow suit, as schools realize that using SAT scores might not yield the best admissions results. He cites a 2014 study that examined 33 schools with test-optional policies and revealed that use of the SAT may “artificially truncate the pools of applicants who would succeed if they could be encouraged to apply.” Furthermore, the study found that “there are no significant differences either in cumulative GPA or graduation rates between submitters [those who opted to send in their test results] and non-submitters.” “It’s only going to take one of the more prestigious schools to go test-optional and then the dam will break,” says Soares.
Of course, most schools still require the SAT or ACT and many believe in its predictive ability. Elizabeth Heaton, a college admissions consultant for College Coach and former University of Pennsylvania senior admissions officer, says that strong SAT math scores, for example, can often predict success in a school’s engineering program; and some studies show that SAT performance does track first-year GPA. And, she adds, almost all students applying to college still have to take the SAT or ACT, even if their top pick is a test-optional school, as students usually apply to multiple schools to ensure they get into at least one.
7. We’re not the best predictor of your college success.
Whether or not your child will do well academically in college may have a lot less to do with her SAT scores than it does with her high school grade-point average, many studies show. A 2012 study of more than 3,000 college sophomores and seniors published by the Council for Aid to Education found that high school GPA was “the single best predictor of college GPA” and a study of nearly 80,000 students published in 2007 by the Center for Studies in Higher Education found that high school GPA is “consistently the strongest predictor of four-year college outcomes.”
But that’s not to say that the SAT is worthless. Several studies — including one done by The College Board — show that SAT scores do help predict first-year college grades (though most also show that high school GPA is still a better predictor of college GPA in most years than standardized test scores).
The College Board’s Levin says that the SAT is a rigorously researched and designed test, and that studies show it is a “valid predictor of college success for all students.”
“Combine high school grades and test scores and you get more of a statistical punch, but it’s on a magnitude of just one to two percentage points more when you add in the test scores,” says Soares. “It’s not a lot of additional statistical power.”
8. We’ll haunt you forever.
There are plenty of things students hate about the SAT, but most think that once they’ve gotten into college all will be forgotten. Alas, you may be reliving those testing nightmares for decades to come, as some companies ask applicants to submit their SAT scores. At Goldman Sachs and McKinsey & Co., some new recruits — who don’t have years of work experience to fill out their resumes — are asked for their SAT scores. At other firms even senior executives have been asked for their scores. Alan Weatherbee, the senior vice president for talent search at public relations firm Allison + Partners, says that some hiring managers look at SAT scores because they are often “starved” for as many data points as they can get about a potential hire so they can make the smartest choice. “The companies see the SAT as something that measures general intelligence and ability,” says executive coach Marc Dorio, the author of “The Complete Idiot’s Guide to the Perfect Job Interview.” “They see it as another marker for the probability of success.” Goldman Sachs did not respond to a request for comment and McKinsey & Co. declined to comment.
Critics say this kind of requirement can be unfairly biased against some candidates. “SAT scores tell you nothing about performance in careers -- that’s laughable,” says Rosner. “It doesn’t tell you anything specific about the real world.” But others say that there is some value in this. “What is being tested [on the SAT] is important for your career,” says Shiv Gaglani, the author of “Standing Out on the SAT and ACT: Perfect Scorers’ Uniquely Effective Strategies for Testing and Admissions Success.” He says that the vocabulary and grammar that the test examines can help you write better, and that basic math skills help you problem solve — both of which are essential career skills. And a low score doesn’t always mean no job: Boston Consulting Group told The Wall Street Journal that while the firm doesn’t set minimum SAT score requirements, it does ask candidates with low math scores to show other skills like leadership or subject-matter expertise.
9. Cheating plagues us.
The College Board investigates a couple thousand cases of SAT cheating each year, says Schaeffer — and in some cases, the incidents make splashy headlines. Last year, officials cancelled the May 4 SAT exam in South Korea when they learned that questions from the SAT were being passed around test-prep centers (apparently, official test booklets could be bought for about $4,575 apiece); it was the first time the tests had been called off in an entire country for suspected cheating. And in 2011, a handful of teens from Nassau County, New York, were accused of paying other people thousands of dollars to take the SAT for them.
Thanks to stricter identification requirements at testing centers, officials have largely cracked down on this type of cheating, where one student impersonates another, says Schaeffer. And though this kind of cheating still sometimes occurs, Schaeffer says that something called time-zone cheating is now more prevalent. That happens when a student in New York, for example, has a co-conspirator in London, which is six hours ahead, who takes the test first and shares what the questions were like. Another more common type of cheating is when two or more students share answers with one another in the testing center, he adds. The College Board notes that it can take legal action against cheaters and has a list of rules that students must follow.
10. Your test anxiety may hurt your score.
A number of studies show that students who have high levels of test anxiety tend to score significantly worse on tests in general than those with low levels of test anxiety. A study published in 2013 in the Universal Journal of Educational Research found that the level of test anxiety before an exam predicted a student’s score, and a study from 2003 found that students with high test anxiety tended to score about one-third of a letter grade lower than those with low test anxiety.
Reactions to the SAT are no different, experts say, and can influence test scores. “It [the SAT] can be incredibly stressful,” says Heaton. Gaglani, who tutors kids for the SAT, says that the test can be “very stressful” because there is a lot of parental pressure, the kids know that their scores matter to get into college and get money for college, and they worry about how others will perceive their score.
There is also some evidence that students’ SAT scores in particular are lowered because of this type of worry. A study published in the journal Clinical Pediatrics looked at students who had taken the SAT before the study and had severe test anxiety; the researchers then had them take a different version of the test again, but this time gave them a beta-blocker, and saw their scores jump 130 points, on average. The College Board’s Levin says that the new exam will be “more clear and open than ever before” and that there will be high-quality free practice materials that students can use to prep for the test.