
Psychometric features as a function of scoring method in performance-based test scores / Buško, Vesna.

Material type: ArticleArticleDescription: str.Other title: Psychometric features as a function of scoring method in performance-based test scores [Naslov na engleskom:].Subject(s): 5.06 | performance-based scores, multiple choice items, differential weighting, psychometric evaluation hrv | performance-based scores, multiple choice items, differential weighting, psychometric evaluation engOnline resources: Elektronička verzija sažetka In: ISSID 2011 - International Society for Study of Individual Differences (25-28.07.2011. ; London, Velika Britanija)Summary: The study presents an empirical demonstration of a model for quantitative treatment of performance-based test scores composed of multiple choice items. The proposed model assumes that the choice of an incorrect option is not a random function of an implied score on an underlying construct or a latent variable measured. Once distracters are made so that their choice is dependent on scores on the measured construct, individual items potentially turn into better measures, and the composite scores necessarily become more valid estimates compared to those based on a binary model. The performance of the formulated model of differential weighting was tested and exemplified using several sets of empirical data on different forms of a nonverbal abstract reasoning test and a verbal ability-based emotional intelligence test. The results showed an improvement in performance of both items and derived composite scores when the proposed differential weighting model was used, even with the tests initially created under standard, binary-scoring paradigm.

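The abstract describes replacing binary (correct/incorrect) scoring of multiple-choice items with differential weighting of response options. As a rough illustration only, the Python sketch below contrasts the two scoring rules on a made-up three-item test; the option weights, keys, and function names are hypothetical assumptions and do not reproduce the model or weight-estimation procedure reported in the abstract.

# Illustrative sketch only: binary scoring vs. a generic option-weighted
# (differential) scoring scheme. All weights below are invented for demonstration.
from typing import Dict, List

# Hypothetical 3-item test: every response option carries a weight.
# Binary scoring is the special case where the keyed option weighs 1, all others 0.
option_weights: List[Dict[str, float]] = [
    {"A": 1.0, "B": 0.6, "C": 0.2, "D": 0.0},  # B and C are "informative" distractors
    {"A": 0.0, "B": 0.3, "C": 1.0, "D": 0.1},
    {"A": 0.4, "B": 0.0, "C": 0.0, "D": 1.0},
]
keys = ["A", "C", "D"]  # keyed (correct) options for the three items

def binary_score(responses: List[str]) -> float:
    """Classical 0/1 scoring: one point for each keyed response."""
    return float(sum(r == k for r, k in zip(responses, keys)))

def weighted_score(responses: List[str]) -> float:
    """Differential scoring: sum the weight attached to each chosen option."""
    return sum(w.get(r, 0.0) for w, r in zip(option_weights, responses))

responses = ["A", "B", "C"]       # one examinee's answer pattern
print(binary_score(responses))    # 1.0 -> only item 1 earns credit
print(weighted_score(responses))  # 1.3 -> partial credit for informative distractors

In practice the weights would be estimated from data (for example, from the relationship between option choice and the latent trait), which is the part of the model the abstract addresses and the sketch above deliberately leaves out.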

Project: MZOS 130-1301683-1402

Language: English
