Friday, April 27, 2012
Automated Essay Scoring Systems as Effective as Human Graders
A groundbreaking study has found that software designed to score student essays can match human graders with virtually identical levels of accuracy, and in some cases proves more reliable.
“The demonstration showed conclusively that automated essay scoring systems are fast, accurate, and cost effective,” said Tom Vander Ark, CEO of Open Education Solutions, which provides consulting services related to digital learning, and co-director of the study. That matters because writing essays is one important way for students to learn critical reasoning, but teachers don’t assign them often enough because grading them is both expensive and time-consuming. Automated scoring holds the promise of lowering the cost and time of having students write so they can do it more often.
Education experts believe that critical reasoning and writing are part of a suite of skills that students need to be competitive in the 21st century. Others include working collaboratively, communicating effectively, and learning how to learn, as well as mastering core academic content. The Hewlett Foundation calls this suite of skills Deeper Learning and is making grants to encourage its adoption at schools throughout the country.
“Better tests support better learning,” says Barbara Chow, Education Program Director at the Hewlett Foundation. “This demonstration of rapid and accurate automated essay scoring will encourage states to include more writing in their state assessments. And, the more we can use essays to assess what students have learned, the greater the likelihood they’ll master important academic content, critical thinking, and effective communication.”
For more than 20 years, companies that provide automated essay scoring software have claimed that their systems can score essays as effectively as, and more affordably and quickly than, other available methods. The study was the first comprehensive multi-vendor trial to test those claims. It challenged nine companies, which together constitute more than ninety-seven percent of the current commercial market for automated essay scoring, to compare capabilities. Six participating state departments of education released more than 16,000 essays, with each set varying in length, type, and grading protocol. The essays had already been hand-scored according to state standards; the challenge was for the companies to approximate those established scores using software.
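How closely machine scores track human scores is typically quantified with an agreement statistic; quadratic weighted kappa, which penalizes large scoring disagreements more heavily than small ones, is a common choice for this kind of comparison and was used to rank entries in ASAP’s public competition. The sketch below illustrates the statistic itself, not code from the study, and the example scores are made up:

```python
import numpy as np

def quadratic_weighted_kappa(human, machine):
    """Agreement between two raters: 1.0 is perfect, 0.0 is chance level."""
    human = np.asarray(human, dtype=int)
    machine = np.asarray(machine, dtype=int)
    lo = min(human.min(), machine.min())
    hi = max(human.max(), machine.max())
    n = hi - lo + 1  # number of score categories on the rubric
    # Observed matrix: counts of each (human score, machine score) pair.
    obs = np.zeros((n, n))
    for h, m in zip(human - lo, machine - lo):
        obs[h, m] += 1
    # Expected matrix: the chance agreement implied by the two raters'
    # marginal score distributions.
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / obs.sum()
    # Quadratic weights: a disagreement costs the squared score distance.
    idx = np.arange(n)
    w = (idx[:, None] - idx[None, :]) ** 2 / (n - 1) ** 2
    return 1.0 - (w * obs).sum() / (w * exp).sum()

# Made-up example: a machine that agrees with the human on 4 of 5 essays.
print(quadratic_weighted_kappa([1, 2, 3, 4, 4], [1, 2, 3, 3, 4]))
```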
At a time when the U.S. Department of Education is funding states to design and develop new forms of high-stakes testing, the study introduces important data. Many states are limited to multiple-choice formats because more sophisticated measures of academic performance cost too much to grade and take too long to process. Forty-five states are already actively overhauling testing standards, and many are considering the use of machine scoring systems. The study grows out of a contest called the Automated Student Assessment Prize, or ASAP, which the Hewlett Foundation is sponsoring to evaluate the current state of automated testing and to encourage further developments in the field.
In addition to evaluating commercial vendors, the contest is offering $100,000 in cash prizes in a competition open to anyone to develop new automated essay scoring techniques. The open competition is underway now and scheduled to close on April 30th, with the $100,000 pool to be awarded to the best performers. Details are available at www.kaggle.com/c/ASAP-AES, where the competition website documents the prize rules and includes a regularly updated leaderboard and discussion threads among competitors.
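For readers curious what an entry might look like in miniature, the sketch below is a deliberately simple, hypothetical baseline (bag-of-words features with ridge regression), not the method of any vendor or contestant; the essays and scores are placeholders:

```python
# A toy essay scorer: TF-IDF bag-of-words features plus ridge regression,
# trained on hand-scored essays; predictions are rounded into the rubric.
# Everything here (texts, scores) is a placeholder, not data from ASAP.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge

train_essays = [
    "The evidence in the passage clearly supports the author's claim...",
    "i think the story was good because it was good...",
    "While the author raises a fair point, the data suggest otherwise...",
]
train_scores = [4, 1, 5]  # human-assigned scores on a 1-6 rubric

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
X = vectorizer.fit_transform(train_essays)
model = Ridge(alpha=1.0).fit(X, train_scores)

# Grade a new essay: predict, then round and clip into the rubric range.
new_essay = vectorizer.transform(["The passage shows that the claim holds..."])
raw = model.predict(new_essay)[0]
print(int(np.clip(round(raw), 1, 6)))
```

Real systems add many more signals (essay length, syntax, vocabulary, discourse features) and are tuned per prompt, but the train-on-human-scores, predict-on-new-essays structure is the same.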
The goal of ASAP is to offer a series of impartial competitions in which a fair, open, and transparent process allows key players in the world of education and testing to understand the value of automated student assessment technologies.
ASAP is being conducted with the support of the Partnership for Assessment of Readiness for College and Careers and the Smarter Balanced Assessment Consortium, two multi-state consortia funded by the U.S. Department of Education to develop next-generation assessments. ASAP is aligned with the aspirations of the Common Core State Standards and seeks to accelerate assessment innovation to help more students graduate from college and become career-ready.