Rater Scoring Modeling Tool (RSMTool): a collection of tools for building and evaluating automated scoring models.

Automated scoring of written and spoken responses is a growing field in educational natural language processing. Automated scoring engines employ machine learning models to predict scores for such responses based on features extracted from the text or audio of the responses. Examples of automated scoring engines include Project Essay Grade for written responses and SpeechRater for spoken responses. RSMTool is a Python package that automates and combines, in a single pipeline, multiple analyses commonly conducted when building and evaluating automated scoring models. Its output is a comprehensive, customizable HTML statistical report containing the results of these analyses. While RSMTool makes it simple to run this full set of standard analyses with a single command, it is also fully customizable, allowing users to exclude unneeded analyses, modify the standard analyses, or include custom analyses in the report.
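To illustrate the single-command workflow, here is a minimal sketch of driving the pipeline from Python. The `run_experiment` entry point and the configuration fields (`experiment_id`, `model`, `train_file`, `test_file`) follow RSMTool's documented JSON configuration format, but the experiment name, model choice, and file paths below are hypothetical placeholders; passing the configuration as a plain dictionary, as assumed here, is supported in recent RSMTool versions.

```python
# Minimal sketch of an RSMTool experiment run. The configuration keys follow
# RSMTool's documented JSON format; the experiment ID, model, and data paths
# are hypothetical placeholders for illustration.
from rsmtool import run_experiment

config = {
    "experiment_id": "essay_demo",        # hypothetical experiment name
    "model": "LinearRegression",          # a built-in RSMTool learner
    "train_file": "train_features.csv",   # assumed features + human scores
    "test_file": "test_features.csv",     # assumed evaluation data
    "description": "Demo linear scoring model",
}

# Runs the full pipeline (preprocessing, model training, evaluation analyses)
# and writes the HTML report and intermediate files under the output directory.
run_experiment(config, "rsmtool_output")
```

The same experiment can be launched from the command line as `rsmtool config.json output_dir`, where `config.json` contains the fields shown above.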
References in zbMATH (referenced in 2 articles, 1 standard article)
- Hitoshi Manabe, Masato Hagiwara: EXPATS: A Toolkit for Explainable Automated Text Scoring (2021) arXiv
- Nitin Madnani, Anastassia Loukina: RSMTool: collection of tools building and evaluating automated scoring models (2016) not zbMATH