
Automated Essay Scoring

Abstract:
The impact of computers on writing has been widely studied for three decades. Even basic computer functions, i.e., word processing, have been of great assistance to writers in revising their essays. Research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali, 2004). AES is defined as the computer technology that evaluates and scores written prose (Shermis & Barrera, 2002; Shermis & Burstein, 2003; Shermis, Raymat, & Barrera, 2003). Revision and feedback are essential aspects of the writing process. Students need to receive feedback in order to increase their writing quality. However, responding to student papers can be a burden for teachers. Particularly if they have a large number of students and assign frequent writing assignments, providing individual feedback on student essays can be quite time consuming. AES systems can be very useful because they can provide the student with a score as well as feedback within seconds (Page, 2003). Four types of AES systems are widely used by testing companies, universities, and public schools: Project Essay Grader (PEG), Intelligent Essay Assessor (IEA), E-rater, and IntelliMetric. AES is a developing technology. Many AES systems are used to overcome time, cost, and generalizability issues in writing assessment. The accuracy and reliability of these systems have been shown to be high. The search for excellence in machine scoring of essays continues, and numerous studies are being conducted to improve the effectiveness of AES systems.
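The earliest of the systems named above, PEG, is commonly described as predicting a holistic score by regressing human ratings on measurable surface features of the text (Page, 2003). The sketch below is a minimal illustration of that general idea in Python; it is not PEG's actual implementation, and the feature set, the toy training data, and the score_essay helper are assumptions introduced only for illustration.

```python
# Minimal sketch of a PEG-style scorer: fit a linear model of human scores
# on surface features of the text, then predict a score for a new essay.
# The features and the tiny training set here are illustrative, not PEG's own.
import numpy as np

def surface_features(essay: str) -> list:
    words = essay.split()
    sentences = [s for s in essay.replace("!", ".").replace("?", ".").split(".") if s.strip()]
    return [
        float(len(words)),                                          # essay length in words
        float(np.mean([len(w) for w in words])) if words else 0.0,  # average word length
        float(len(words)) / max(len(sentences), 1),                 # average sentence length
    ]

# Hypothetical training set: essays already scored by human raters (1-6 scale).
train_essays = ["Short essay about school.", "A longer, more developed essay " * 20]
train_scores = np.array([2.0, 5.0])

X = np.array([surface_features(e) for e in train_essays])
X = np.hstack([np.ones((X.shape[0], 1)), X])                        # intercept column
weights, *_ = np.linalg.lstsq(X, train_scores, rcond=None)          # least-squares fit

def score_essay(essay: str) -> float:
    """Predict a holistic score for an unseen essay from surface features alone."""
    x = np.array([1.0, *surface_features(essay)])
    return float(x @ weights)

print(round(score_essay("A brand new essay to be scored automatically."), 2))
```

Systems such as IEA and E-rater go beyond surface features, using latent semantic analysis and natural language processing respectively, but the regression-on-features structure sketched here conveys why such systems can return a score within seconds.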

REFERENCES


A description of a new AI system with superior learning capabilities. Retrieved on June 06, 2004 at http://www.geocities.com/ainew.geo/index.html.
Attali, Y. (April, 2004). Exploring the feedback and revision features of Criterion. Paper
presented at the National Council on Measurement in Education (NCME), San Diego, CA.
Attali, Y. & Burstein, J. (June, 2004). Automated essay scoring with e-rater V.2.0. Paper
presented at the Conference of International Association for Educational Assessment
(IAEA), Philadelphia, PA.
Bereiter, C. (2003). Foreword. In Mark D. Shermis and Jill C. Burstein (Eds.), Automated essay scoring: A cross-disciplinary perspective (pp. vii-ix). Mahwah, NJ: Lawrence Erlbaum Associates.
Brill, E. & Mooney, R. (1997). An overview of empirical natural language processing. AI Magazine, 18(4), 13-24.
Burstein, J. (2003). The e-rater scoring engine: Automated essay scoring with natural language processing. In Mark D. Shermis and Jill C. Burstein (Eds.), Automated essay scoring: A cross-disciplinary perspective. Mahwah, NJ: Lawrence Erlbaum Associates.
Burstein, J., & Chodorow, M. (June, 1999). Automated essay scoring for nonnative English speakers. Proceedings of the ACL99 Workshop on Computer-Mediated Language Assessment and Evaluation of Natural Language Processing, College Park, MD.
Burstein, J., Kukich, K., Wolff, S., Lu, C., & Chodorow, M. (April, 1998). Computer analysis
of essays. Proceedings of the NCME Symposium on Automated Scoring, Montreal, Canada.
Burstein, J., Chodorow, M., & Leacock, C. (August, 2003). Criterion: Online essay
evaluation: an application for automated evaluation of student essays. Proceedings of the
15th Annual Conference on Innovative Applications of Artificial Intelligence, Acapulco,
Mexico.
Burstein, J. & Marcu, D. (2000). Benefits of modularity in an Automated Essay Scoring
System (ERIC reproduction service no TM 032 010).
Burstein, J., Leacock, C., & Swartz, R. (2001). Automated evaluation of essays and short
answers. Proceedings of the 5th International Computer Assisted Assessment Conference
(CAA 01), Loughborough University.
Chodorow, M. & Burstein, J. (2004). Beyond essay length: evaluating e-rater’s
performance on TOEFL essays (Research report no 73). Princeton, NJ: Educational Testing
Service (ETS).
Chung, K. W. K. & O’Neil, H. F. (1997). Methodological approaches to online scoring of
essays (ERIC reproduction service no ED 418 101).
Educational Testing Service (ETS). (n.d.). E-rater. Retrieved on May 06, 2004 at
www.ets.org/e-rater
Educational Testing Service (ETS). (n.d.). Criterion. Retrieved on May 06, 2004 at http://www.ets.org/criterion/ell/faq.html.
Elliot, S. (2000a). A study of expert scoring and IntelliMetric scoring accuracy for dimensional scoring of Grade 11 student writing responses (RB-397). Newtown, PA: Vantage Learning.
Elliot, S. (2000b). A true score study of IntelliMetric accuracy for holistic and dimensional
scoring of college entry-level writing program (RB-407). Newtown, PA: Vantage Learning.
Elliot, S. (2001a). About IntelliMetric (PB-540). Newtown, PA: Vantage Learning.
Elliot, S. (2001c). Applying IntelliMetric Technology to the scoring of 3rd and 8th grade
standardized writing assessments (RB-524). Newtown, PA: Vantage Learning.
Elliot, S. (2002). A study of expert scoring, standard human scoring and IntelliMetric
scoring accuracy for statewide eighth grade writing responses (RB-726). Newtown, PA:
Vantage Learning.
Elliot, S. (2003a). A true score study of 11th grade student writing responses using
IntelliMetric Version 9.0 (RB-786). Newtown, PA: Vantage Learning.
Elliot, S. (2003b). Assessing the accuracy of IntelliMetric for scoring a district-wide writing assessment (RB-806). Newtown, PA: Vantage Learning.
Elliot, S. (2003c). How does IntelliMetric score essay responses? (RB-929). Newtown, PA:
Vantage Learning.
Elliot, S. (2003d). IntelliMetric: From here to validity. In Mark D. Shermis and Jill C. Burstein (Eds.), Automated essay scoring: A cross-disciplinary perspective. Mahwah, NJ: Lawrence Erlbaum Associates.
Foltz, P. W., Laham, D. & Landauer, T. K. (1999). Automated Essay Scoring: Applications to Educational Technology. Proceedings of EdMedia '99. Retrieved on May 15, 2004 from http://www-psych.nmsu.edu/~pfoltz/reprints/Edmedia99.html.
Hyland, F. (1998). The impact of teacher written feedback on individual writers. Journal
of Second Language Writing, 7 (3), 255-286.
Kukich, K. (September/October, 2000). Beyond automated essay scoring. In Marti A. Hearst (Ed.), The debate on automated essay grading. IEEE Intelligent Systems, 27-31. Retrieved on November 12, 2004 from
http://que.info-science.uiowa.edu/~light/research/mypapers/autoGradingIE....
Landauer, T. K., Laham, D., Rehder, B. & Schreiner, M. E. (1997). How well can passage meaning be derived without using word order? A comparison of Latent Semantic Analysis and humans. Proceedings of the 19th Annual Conference of the Cognitive Science Society (pp. 412-417). Mahwah, NJ: Erlbaum.
Landauer, T. K., Laham, D., & Foltz, P. W. (September/October, 2000). The Intelligent Essay Assessor. In Marti A. Hearst (Ed.), The debate on automated essay grading. IEEE Intelligent Systems, 27-31. Retrieved on November 12, 2004 from
http://que.info-science.uiowa.edu/~light/research/mypapers/autoGradingIE....
Landauer, T. K., Laham, D., & Foltz, P. W. (2003). Automated essay scoring and annotation of essays with the Intelligent Essay Assessor. In Mark D. Shermis and Jill C. Burstein (Eds.), Automated essay scoring: A cross-disciplinary perspective. Mahwah, NJ: Lawrence Erlbaum Associates.
Latent Semantic Analysis (LSA). (n.d.). Retrieved on May 8, 2004 from
http://lsa.colorado.edu/whatis.html.
Lemaire, B. & Dessus, P. (2001). A system to assess the semantic content of student essays. Journal of Educational Computing Research, 24(3), 305-306.
Murray, B. (1998). The latest techno tool: essay grading computers. American
Psychological Association (APA), 8 (29). Retrieved from:
http://www.apa.org/monitor/aug98/grade.html.
Myers, M. (2003). What can computers and AES contribute to a K-12 writing program? In
M. D. Shermis & J. Burstein (Eds.). Automated essay scoring: A cross-disciplinary
perspective. Mahwah, NJ: Lawrence Erlbaum Associates.
Nichols, P. D. (April, 2004). Evidence for the interpretation and use of scores from an
Automated Essay Scorer. Paper presented at the annual meeting of the American
Educational Research Association (AERA), San Diego, CA.
Page, E. B. (2003). Project Essay Grade: PEG. In M. D. Shermis & J. Burstein (Eds.).
Automated essay scoring: A cross-disciplinary perspective. Mahwah, NJ: Lawrence
Erlbaum Associates.
Pearson Knowledge Technologies (PKT). (n.d.). Official website: http://www.knowledge-technologies.com.
Psotka, J., & Streeter, L. (n.d.). Automatically critiquing writing for army educational settings. Retrieved on December 02, 2004 from http://www.hqda.army.mil/ari/pdf/critiquing_writing.pdf.
Rudner, L. & Gagne, P. (2001). An overview of three approaches to scoring written essays
by computer (ERIC Digest number ED 458 290).
Salem, A. B. M. (2000). The potential role of artificial intelligence technology in Education
(ERIC document reproduction service no ED 477 318).
Shermis, M. D. & Burstein, J. (2003). Automated Essay Scoring: A Cross Disciplinary
Perspective. Mahwah, NJ: Lawrence Erlbaum Associates.
Shermis, M. & Barrera, F. (2002). Exit assessments: evaluating writing ability through
automated essay scoring (ERIC document reproduction service no ED 464 950).
Shermis, M. D., Raymat, M. V., & Barrera, F. (2003). Assessing writing through the
curriculum with automated essay scoring (ERIC document reproduction service no ED 477
929).
Streeter, L., Psotka, J., Laham, D., & MacCuish, D. (2004). The credible grading machine:
essay scoring in the DOD [Department of Defense]. Retrieved on January 10, 2005 at
http://www.k-a-t.com/papers/essayscoring.pdf.
Vantage Learning. (n.d.). My Access. Retrieved on May 06, 2004 at
http://www.vantagelearning.com.
