A Study of Teachers' Test-Construction Processes and Students' Test-Taking Processes
Date
2014
Abstract
This study addresses three questions. First, how do senior high school teachers construct a mock test for the English Scholastic Ability Test, and how do experienced and novice teachers differ in their considerations when constructing items? Second, how do senior high school students answer the items on such a mock test, and how do the answering strategies of higher- and lower-proficiency students differ? Third, are students' considerations when answering the items consistent with teachers' considerations when constructing them?
Four senior high school English teachers and forty-eight senior high school students participated in this study. The teachers' task was to construct a mock SAET consisting of twenty-eight multiple-choice items covering vocabulary, cloze, and reading comprehension; the students' task was to answer the items the teachers had constructed. All participants were required to think aloud while performing their tasks, and these think-aloud protocols served as the main data for analysis.
The main findings are as follows. First, experienced and novice teachers differed somewhat in their test-construction considerations: the experienced teachers' considerations were more student-centered, whereas the novice teachers' considerations conformed more closely to principles of test construction in assessment. Moreover, the items constructed by the experienced teachers were not superior to those of the novice teachers; among the items constructed by the four teachers, a considerable number were judged by experts to be flawed or inappropriate and in need of revision.
Second, students generally adopted different strategies when answering different types of items; however, they used "elimination" across all three item types, indicating that elimination was the answering strategy most frequently used in this study. In addition, higher-proficiency students drew on vocabulary knowledge, grammar knowledge, and deductive reasoning more often than lower-proficiency students, while lower-proficiency students resorted to "guessing" more often than higher-proficiency students across all item types.
The study also found that students' answering considerations differed substantially from teachers' test-construction considerations, with a consistency rate of only 33% between the two. Furthermore, students' thinking was more congruent with that of the novice teachers than with that of the experienced teachers. Higher-proficiency students' considerations diverged most from the teachers' on the cloze items, whereas lower-proficiency students' considerations diverged most from the teachers' on the reading comprehension items.
This study aims to investigate three research questions. First, how did experienced and novice teachers construct mock tests for the Scholastic Ability English Test (SAET)? Second, how did higher- and lower-proficiency students take those mock tests? Third, were students’ considerations for answering the tests consistent with teachers’ test-constructing considerations? Four senior high school teachers and forty-eight senior high school students participated in this study. All participants were asked to think aloud while performing their tasks. The teachers were asked to construct twenty-eight multiple-choice items on vocabulary, cloze, and reading comprehension. The students were asked to answer the questions constructed by the teachers. Major findings of this study are summarized as follows. First, the experienced and novice teachers seemed to make different types of considerations in constructing their tests. The experienced teachers took more student-oriented factors into account, while the novice teachers took more test-construction principles into consideration. Despite their different considerations in the test-constructing process, the two experienced teachers did not seem to produce better test items than the two novice teachers. All four teachers had constructed some items that were deemed poor, problematic, or inappropriate from the experts’ perspective. Second, students generally used different strategies when answering different types of questions. However, they seemed to use the strategy of “elimination” very frequently on all three types of tests. In terms of proficiency levels, higher-proficiency students tended to use their vocabulary knowledge, grammar knowledge, and deductive reasoning more frequently than lower-proficiency students in answering the items. On the other hand, lower-proficiency students tended to use the strategy of “guessing” more frequently than higher-proficiency students across the three types of questions.
Third, students’ considerations for answering test items clashed with teachers’ test-constructing considerations to a great extent; the overall consistency rate between them was only about 33% in this study. Furthermore, students generally thought in a way more congruent with the novice teachers than with the experienced teachers. In addition, higher-proficiency students’ considerations clashed more with teachers’ considerations on cloze items, while lower-proficiency students’ considerations clashed more with teachers’ considerations on reading comprehension questions.
Keywords
test-construction, test-constructing process, test-taking process, strategy use, vocabulary test, cloze test, reading comprehension test, think-aloud