Constructed response or multiple-choice questions for assessing declarative programming knowledge? That is the question!

Bibliographic Details
Main Author: Belo, Y.
Other Authors: Moro, S., Martins, A., Ramos, P., Costa, J. M., Esmerado, J.
Publication Date: 2019
Publisher: Informing Science Institute
Format: Article
Language: English
ISSN: 2165-3151
DOI: 10.28945/4479
Subjects: Constructed response; Multiple-choice questions; Educational data mining; Support vector machine; Neural networks
Access: Open access (PDF)
Source: Repositórios Científicos de Acesso Aberto de Portugal (RCAAP)
Download full text: http://hdl.handle.net/10071/20186
Summary:
Aim/Purpose: This paper presents a data mining approach for analyzing responses to advanced declarative programming questions. The goal of this research is to find a model that can explain the results obtained by students when they take exams with constructed response (CR) questions and with equivalent multiple-choice questions (MCQ).
Background: The assessment of acquired knowledge plays a fundamental role in the teaching-learning process. It helps to identify factors that can guide the teacher in developing pedagogical methods and evaluation tools, and it also contributes to the self-regulation of learning. However, the best question format for assessing declarative programming knowledge is still the subject of ongoing debate: some research advocates constructed responses, while other work emphasizes the potential of multiple-choice questions.
Methodology: A sensitivity analysis was applied to extract useful knowledge from the relevance of the characteristics (i.e., the input variables) used by the data mining models to compute the score.
Contribution: Such knowledge helps teachers decide which format to adopt, given their objectives and the expected student results.
Findings: The results show a set of factors that influence the discrepancy between answers in the two formats.
Recommendations for Practitioners: Taking the results of this study into account, teachers can make an informed decision about whether to use multiple-choice or constructed-response questions.
Recommendations for Researchers: In this study, a block of exams with CR questions is shown to complement the learning area, yielding better performance in the evaluation of students and improving the teaching-learning process.
Impact on Society: The results of this research confirm the findings of several other researchers that the use of ICT and the application of MCQ add value to the evaluation process. In most cases the student is more likely to succeed with MCQ; however, if the teacher prefers to evaluate with CR, other research approaches are needed.
Future Research: Future research should include other question formats.
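Note on the methodology: the record's subject keywords point to support vector machines and neural networks as the underlying models, and the methodology describes a sensitivity analysis of input-variable relevance. The following is a minimal sketch of that idea only, assuming a scikit-learn-style workflow; the record does not specify the paper's actual tooling, variables, or data, and all feature names and data below are hypothetical stand-ins. Permutation importance over a fitted SVM regressor ranks each input variable by how much shuffling it degrades the model's score.

# Illustrative sketch only (assumed scikit-learn workflow, synthetic data):
# rank input-variable relevance for a score-predicting SVM via
# permutation importance, one common form of sensitivity analysis.
import numpy as np
from sklearn.svm import SVR
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300
# Hypothetical student/exam characteristics (input variables).
features = ["prior_grade", "attendance", "question_difficulty", "time_spent"]
X = rng.normal(size=(n, len(features)))
# Synthetic score: driven mostly by prior_grade and question_difficulty.
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = SVR(kernel="rbf").fit(X_tr, y_tr)

# Shuffle each input in turn and measure how much the model's score drops;
# a larger drop means the variable is more relevant to the predicted score.
result = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
for name, imp in sorted(zip(features, result.importances_mean),
                        key=lambda t: -t[1]):
    print(f"{name:>20}: {imp:.3f}")

The study's actual input variables and models would differ; the sketch only illustrates the relevance-ranking step that the methodology describes.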