Chapter 6 Powerpoint Notes (KH 3550)
These 52 pages of class notes were uploaded by Apollo12 on Tuesday, February 2, 2016. The notes belong to KH 3550 (Evaluation and Instrumentation in Exercise Science) at Georgia State University, taught by Brandenberger in Spring 2014.
Chapter 6: Construction of Knowledge Tests

Chapter Objectives
After completing this chapter, you should be able to:
1. List and describe the steps for knowledge test construction.
2. Construct a table of specifications and explain its use.
3. State the purposes of item analysis.
4. Define item difficulty, index of discrimination, and response quality.
5. Conduct item analysis.
6. Contrast the advantages and disadvantages of various test items.
7. Construct true-false, multiple-choice, short-answer, completion, matching, and essay test items.

Steps in Construction of a Test
"Knowledge test" refers to a test that measures thought processes.

Test planning:
1. Consider content validity.
2. Develop a table of test specifications (serves as an outline for construction of the test). It includes:
- the kinds and number of test items
- the kinds of tasks (thought processes) the items will present, and the number of each kind of task
- the content areas and the number of items in each area (see Table 6.1)

Table 6.1 Specifications for a 50-item multiple-choice test
Tasks are Knowledge, Comprehension, Analysis, and Application; entries are the number of questions (percentage of total) per content area:
- History: 2 (4%)
- Rules: 5 (10%), 5 (10%), 5 (10%)
- Technique: 2 (4%), 5 (10%), 8 (16%)
- Offensive strategy: 4 (8%), 5 (10%)
- Defensive strategy: 4 (8%), 5 (10%)

*The most commonly used kinds of objective items are multiple-choice, true-false, matching, and completion.
*The total number of items is usually determined by:
- the length of time allotted to take the test
- the length of the items
- the difficulty of the items
- the conditions under which the test is to be administered
- the age of the individuals taking the test
*In addition to factual information, test takers may be asked to demonstrate the ability (thought processes) to:
- comprehend
- apply
- evaluate
- analyze
- synthesize

Table 6.2 Bloom's Classification of Thought Processes (Cognitive Behavior)
1. Knowledge – recall
2.
Comprehension – lowest level of understanding
3. Application – use of knowledge and understanding
4. Analysis – separate a whole into its parts
5. Synthesis – rearrange
6. Evaluation (most advanced) – make judgments about the value of information

Steps in Construction of a Test (continued)
*Content areas deal with the areas covered during instruction. In physical education, a test might include:
- history
- technique or mechanical analysis
- terminology
- strategy
- rules
- benefits of participation
- equipment
*Item difficulty should be related to the purpose of the test. It can be determined only after the test is administered, but through experience you can develop the ability to estimate it.

Test Item Construction: General Guidelines
1. Allow enough time to complete the test construction.
2. No item should be included on a test unless it covers an important fact, concept, principle, or skill. Why is the test taker responsible for this? What is the value of this point? What future benefit will it have?
3. Items should be independent of each other.
4. Write simply and clearly.
5. Be flexible; as a general rule, the test should include more than one type of item.
6. Place easy items first.
7. Record the test number of each item in the table of test specifications.
8. Prepare clear, concise, and complete directions.
9. Ask your peers to review the test.

Test Administration Guidelines
1. Provide a typed copy of the test.
2. Start the test on time.
3. Be sure the test is administered under normal conditions.
4. Read the directions to the test takers.

Item Analysis
Determines the quality of the items.
Purposes:
• Indicates which items may be too easy or too difficult
• Indicates which items may fail to discriminate clearly between the better and poorer students for reasons other than item difficulty
• Indicates why an item has not functioned effectively and how it might be improved
• Improves your skills in test construction

Steps in Item Analysis
1.
Arrange the scored tests in order from high score to low score.
2. Determine the upper 27% of the test scores and place them in one group (referred to as the upper group). Do the same for the bottom 27% of the test scores (referred to as the lower group). Although upper and lower groups of 27% are considered the best for maximizing the difference between the two groups, any percentage between 25% and 33% may be used.
3. Tally the number of times the correct response to each item was chosen on the tests of each group.

Item Difficulty
Defined as the proportion of test takers who answer an item correctly. If upper and lower groups are not formed, the difficulty index (p) may be found by:

p = number answering correctly / total number in group

Example: If 60 test takers completed a test and 41 correctly answered an item, the difficulty index would be p = 41/60 = .68.

Using the upper group (UG) and lower group (LG):

p = (number correct in UG + number correct in LG) / (number in UG + number in LG)

Example:
number of test takers = 150
number in UG = .27 x 150 = 40.5, rounded to 41
number in LG = .27 x 150 = 40.5, rounded to 41
number correct in UG = 33
number correct in LG = 17
p = (33 + 17) / (41 + 41) = 50/82 = .61
INTERPRETATION!

Table 6.3 Evaluation of Item Difficulty
Difficulty Index | Item Evaluation
.80 and higher | reject item
.71 - .79 | accept item if index of discrimination is acceptable, but revise if discrimination is marginal
.30 - .70 | accept item if index of discrimination is acceptable
.20 - .29 | accept item if index of discrimination is acceptable, but revise if discrimination is marginal
.19 and below | reject item

*An easy item has a high index; a difficult item has a low index.
*A typical norm-referenced test includes items with a range of difficulty; average test item difficulty should be around 50%.
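The 27% grouping and the two difficulty-index formulas above can be sketched in Python. This is a minimal illustration of the slides' arithmetic; the function names and the way the example data are passed in are my own, not from the text:

```python
import math

def group_size(n_test_takers, fraction=0.27):
    """Size of the upper (and lower) group.
    40.5 is rounded up to 41, as in the slides' example."""
    return math.ceil(fraction * n_test_takers)

def difficulty_simple(n_correct, n_total):
    """p when upper/lower groups are not formed:
    p = number answering correctly / total number in group."""
    return n_correct / n_total

def difficulty_groups(correct_ug, correct_lg, n_per_group):
    """p = (correct in UG + correct in LG) / (number in UG + number in LG)."""
    return (correct_ug + correct_lg) / (2 * n_per_group)

# The slides' worked examples:
n = group_size(150)                  # .27 x 150 = 40.5 -> 41
p1 = difficulty_simple(41, 60)       # 41/60 = .68
p2 = difficulty_groups(33, 17, n)    # 50/82 = .61
```

Against Table 6.3, both example items (p = .68 and p = .61) fall in the .30-.70 band and would be accepted if their discrimination is acceptable.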
*Difficulty for criterion-referenced tests is established at the minimum proficiency level; items on C-R tests are usually constructed so that at least 80% to 85% of test takers are expected to pass.

*Interpretation of item difficulty is not always an easy task.
*An item may be easy either because the answer is obvious (poor construction) or because the test takers have learned the material.
*An item may be difficult either because it is poorly constructed or because the test takers have not learned the material.

Item Discrimination
Determines how well the item differentiates between the good test taker and the poor test taker. If an item discriminates, more test takers with high scores will answer the item correctly than will test takers with low scores.

D = (number correct in UG - number correct in LG) / number in each group

Example:
number of test takers = 150
number in UG = .27 x 150 = 40.5, rounded to 41
number in LG = .27 x 150 = 40.5, rounded to 41
number correct in UG = 33
number correct in LG = 17
D = (33 - 17) / 41 = 16/41 = .39

*The index of discrimination can range from +1.00 to -1.00.
*A negative index indicates that more individuals in the LG answered the item correctly than individuals in the UG.
*A negative index has no place in a test.
*Generally, an index of .40 or above on a norm-referenced test indicates that the item discriminates well.
*For criterion-referenced tests, administer the test before instruction (pretest) and again after instruction (posttest); if there is a large difference in the proportion of correct answers from pretest to posttest, the item discriminates.
*A test item with a difficulty index between .30 and .70 has a good chance of being a discriminating item, but do not assume this is always true.
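The discrimination formula and its worked example can be checked the same way. Again a sketch: the function name is illustrative, not from the slides.

```python
def discrimination_index(correct_ug, correct_lg, n_per_group):
    """D = (number correct in UG - number correct in LG) / number in each group.
    Ranges from +1.00 to -1.00; a negative D means the lower group
    outperformed the upper group on the item."""
    return (correct_ug - correct_lg) / n_per_group

# The slides' example: 33 correct in UG, 17 in LG, 41 per group
d = discrimination_index(33, 17, 41)   # (33 - 17) / 41 = 16/41 = .39
```

By Table 6.4, a D of .39 provides reasonably good discrimination but may need improvement if the item's difficulty is marginal.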
Table 6.4 Evaluation of Index of Discrimination
Index of Discrimination | Item Evaluation
.40 and above | item discriminates; accept item if item difficulty is acceptable
.30 - .39 | item provides reasonably good discrimination; may need improvement, particularly if item difficulty is marginal
.20 - .29 | item provides marginal discrimination; consider revision if item difficulty is acceptable
below .20 | item does not discriminate; reject item

Response Quality
*The choices of answers for multiple-choice items are called responses.
*Incorrect responses are referred to as distractors or foils.
*Each response should be selected by some of the students; a response should be selected by at least 2% to 3% of the test takers.
*Consider the pattern of incorrect responses by the UG and LG; if an incorrect response is selected by many students in the UG but few in the LG, the item may need revision.
*Item analysis of a multiple-choice test should include a record of the number of students who selected each response, as well as the item difficulty and index of discrimination.

Item Revision
*After completing the item analysis, you should perform the necessary revisions.
*Revision usually involves discarding or rewording some items, changing responses, and changing items to different types.
*After at least two administrations of the test to similar groups, with the necessary revisions, you should have a good test.

Table 6.5 Example of Item Analysis for Multiple-Choice Test
60 students completed the test; upper and lower groups of 27% (16 test scores in each group); correct response is B for both items shown.

Item 1 (correct response: B)
      A   B   C   D
UG:   1  13   2   0
LG:   4   3   5   4
p = .50, D = .63

Item 3 (correct response: B)
      A   B   C   D
UG:   0   8   0   8
LG:   1   8   0   7
p = .50, D = .00

Item 1: all responses considered; difficulty and discrimination good. Retain item.
Item 3: response C never chosen and A chosen only once; no discrimination. Reject item.
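The full per-item analysis laid out in Table 6.5 (response tallies plus p and D) can be computed in one pass. This sketch uses the slides' Item 1 data; the function name and input format are illustrative assumptions, not from the text:

```python
from collections import Counter

def analyze_item(ug_responses, lg_responses, correct):
    """Tally each response choice for the upper and lower groups,
    then compute the difficulty index p and the index of
    discrimination D. Assumes equal-sized groups."""
    ug, lg = Counter(ug_responses), Counter(lg_responses)
    n = len(ug_responses)                      # test takers per group
    p = (ug[correct] + lg[correct]) / (2 * n)  # difficulty index
    d = (ug[correct] - lg[correct]) / n        # index of discrimination
    return dict(ug), dict(lg), p, d

# Item 1 from Table 6.5 (16 per group; correct response is B)
ug = ["A"] * 1 + ["B"] * 13 + ["C"] * 2              # D chosen 0 times
lg = ["A"] * 4 + ["B"] * 3 + ["C"] * 5 + ["D"] * 4
ug_tally, lg_tally, p, d = analyze_item(ug, lg, correct="B")
# p = 0.5 and d = 0.625, matching Table 6.5's .50 and .63
```

The tallies also expose response quality: here every distractor is chosen by at least a few test takers, which is why Table 6.5 retains Item 1.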
Objective Test Items
• Choices of answers are provided for each test item.
• True-false, multiple-choice, and matching items.
• For each item, the test taker must select one of the choices provided.
*Objective test items can measure more than simple recognition, rote memory, or association; they can measure several different kinds of thought processes, e.g., comprehension, analysis, and application.
*It takes time to construct objective test items that measure different kinds of thought processes.
*Table 6.6 includes tasks often associated with each thought process.

True-False Items
*A declarative statement; the test taker must decide whether the statement is correct or incorrect.
*Often limited to factual content, but can be used to test applications, principles, and knowledge in the form of propositions.
Examples:
- If the score is 15-30, the server is ahead in points.
- If the score is 15-30, the serve should be to the receiver's left service court.
*True-false items can also describe a situation and then ask the test taker to respond to items about that situation, e.g., game situations and strategies.

Advantages
1. Cover a wide range of material in a single testing period.
2. Easy to score.
3. Generally easy to construct. However, if thought processes other than simple knowledge are measured, the test will require some time to construct.
Disadvantages
1. Random guessing could produce a score of 50% correct.
2. With a 50% chance of guessing the correct answer, the reliability of the test items tends to be lower.
3. The correct answer often depends on one word.

Guidelines for Writing True-False Items
1. Avoid the use of specific determiners. Words like all, always, never, no, and none indicate the statement is probably false. Words like sometimes, usually, and typically suggest that the statement is probably true.
2. Include an equal number of true and false items, or include more false items than true items.
3. Avoid the exact language of the textbook.
4.
Avoid trick items.
5. Avoid negative and double-negative terms. Underline negative terms if you use them.
6. Avoid ambiguous statements.
7. All items should be of approximately the same length.
8. Limit each item to a single concept.
SEE EXAMPLES

Multiple-Choice Items
*Consist of two main components: the stem and three to five responses (one correct response; the incorrect responses are referred to as distractors or foils).
*The stem may be more than one sentence long; it is usually a direct question or an incomplete statement.
*The stem should present the problem in enough detail that there is no ambiguity about what is being asked.

Advantages
1. Can measure almost any understanding or ability if designed to do so.
2. Can be used to test most types of material.
3. The chances of guessing the correct answer are much lower than for true-false items.
4. Can be scored easily.
Disadvantages
1. More difficult to construct than other objective tests; it takes considerable time to develop good items that include at least four responses.
2. Sometimes encourage memorization of facts rather than understanding of concepts.
3. Less material can be covered than with a true-false test.

Guidelines for Writing Multiple-Choice Items
Stem construction:
1. The stem should be concise, easy to read and understand, and contain the central issue of the item. A properly constructed item has meaning by itself, so that a good student knows the correct answer before reading all the responses.
2. Avoid specific determiners such as always, never, all, none, and so on.
3. Avoid negative wording; if it is necessary, underline it.
4. It is not mandatory, but it sometimes helps the test taker to begin the stem with a word such as which, why, where, what, when, or who; this introduces the main point of the stem.
5. Do not word the stem so that you are asking the opinion of the test taker.
6.
If testing a definition, place the word in the stem and the definitions in the responses.

Response construction:
1. All responses should be plausible, but only one response should be correct.
2. Use at least four responses for each item.
3. All responses should be grammatically consistent, homogeneous in content, and approximately the same length. If some responses begin with a vowel but others in the same item do not, use "a(n)" in the stem to introduce the responses.
4. If the items are numbered, use A, B, C, D, and E to designate the responses. Also, unless limited by the number of pages, place the responses in vertical order.
5. Avoid patterns in the positions of correct responses. Use A, B, C, D, and E equally.
6. Use the responses "none of the above" and "all of the above" with care.
7. When possible, list the responses in logical or sequential order. If arranged in sequence, the correct response occasionally should be first or last in the sequence.
SEE EXAMPLES

Matching Items
*Consist of a column of items (stimulus words or phrases) on the left-hand side of a page and a column of options (alternatives) on the right.
*The test taker selects the option that correctly associates with each item.
*Similar to multiple-choice items; the options serve as alternatives for all items.
*Also similar to short-answer items; usually limited to specific factual information (names, dates, labels).

Advantages
1. Easy to construct and score.
2. Provide many scoreable responses per test page or per unit of test time.
3. Motivate test takers to integrate their knowledge and consider relations among items.
4. The odds of guessing the correct answer are low.
Disadvantages
1. Time consuming to complete.
2. Usually test only factual material.
3. Limited to association tasks.

Guidelines for Writing Matching Items
1. Include only homogeneous material in each matching exercise.
2.
Give clear directions: whether an option may be used more than once, whether each item has only one correct answer, and how the marking is to be done.
3. Keep the sets of items relatively short (five or six in lower grades and ten to fifteen in upper grades).
4. Place all items and options for a matching exercise on one page.
5. Use an appropriate format. It is usually best to place the homogeneous items on the left and the options on the right. If test takers are permitted to write on the test, leave a blank space beside each numbered item for the letter of the matched option.
6. Arrange the responses in alphabetical or logical order; this reduces the time needed to find the correct answer.
7. Develop more options than items; usually two or three more options.
SEE EXAMPLES

Short-Answer and Completion Items
*The distinction between short-answer items and completion items is primarily the length and format of the response.
*A short-answer item requires the test taker to respond to a question with a word, a phrase, or a sentence or two.
*In a completion item, the simplest short-answer form, the test taker is asked to provide the missing information.
*Both kinds of items are suited to measuring factual knowledge, comprehension of principles, and the ability to identify and define concepts.

Advantages
1. Affected much less by guessing than are true-false or multiple-choice items.
2. Come closer to assessing recall, as contrasted with recognition.
3. Are valuable when steps or procedures are to be learned.
4. Are easy to construct.
Disadvantages
1. Scoring takes longer than for choice-type items, especially if spelling is considered.
2. Unless extreme care is taken in construction, a number of answers might be wholly or partially correct; this sometimes means only the test constructor is able to score the tests.
3.
Encourage rote learning; however, there are occasions when recall and memorization are appropriate, e.g., CPR.

Guidelines for Writing Short-Answer and Completion Items
1. Be sure the item can be answered with a unique word, phrase, or number, and that only one answer is correct.
2. Be sure the test taker knows what type of response is required; indicate how precise the response should be.
3. Think of the answer first. Then write the item.
4. With completion items, place the blank near the end of the sentence; this makes the intent of the item clearer and avoids multiple answers.
5. Use no more than two blanks in an item.
6. Avoid lifting items directly from the textbook. One sentence taken out of context may fail to adequately present the concept of its paragraph.
7. Make the actual blanks for the responses the same length.
SEE EXAMPLES

Essay Test Items
*Designed to measure the ability to use higher mental processes: identifying, interpreting, integrating, organizing, and synthesizing.
Advantages
1. Easily and quickly constructed.
2. Can measure complex concepts, thinking ability, and problem-solving skills.
3. Encourage test takers to learn how to effectively organize and express their own ideas.
4. Minimize guessing.
Disadvantages
1. Time consuming to score.
2. Scoring requires some decision making on the part of the scorer; reliability may be decreased.
3. Since they take longer to answer, only a few items can be answered in a test period.

Guidelines for Writing Essay Items
1. Should require the test taker to demonstrate a command of essential knowledge (not reproduction of material presented in the textbook).
2. Phrase each item so that only one answer is correct.
3. Indicate the scope and direction of the required answer; vague phrasing leads to wide variation in responses.
4. Require all test takers to answer the same questions; if they answer different items, the basis for comparison is limited.
5. Indicate the approximate amount of time that should be devoted to each answer; also state the point value.
6. It is generally best to use a reasonable number of short essay items rather than a few longer ones.
7. Write the ideal answer for each item; this gives a sense of the item's reasonableness and aids in scoring.

Guidelines for Scoring Essay Items
1. Develop a method for scoring items. You may identify essential points that should be included; you may also rank each item according to the quality of the response.
2. Evaluate the same item on all test takers' papers before going on to the next item. It is also wise to occasionally check your consistency by reviewing how you evaluated an item on the first few papers you scored.
3. Conceal the name of the person whose test you are evaluating.
SEE EXAMPLES