DTAMS - Middle School Science Teacher Assessments
The Diagnostic Science Assessments for Middle School Teachers serve two purposes: (1) to describe the breadth and depth of teachers' science content knowledge, so that researchers and evaluators can measure teacher knowledge growth over time, the effects of particular experiences (courses, professional development) on teachers' knowledge, or relationships among teacher content knowledge, teaching practice, and student performance; and (2) to describe middle school teachers' strengths and weaknesses in science knowledge, so that teachers can make informed decisions about further coursework or professional development.
The assessments measure science knowledge in three content domains: Physical Science, Life Science, and Earth/Space Science. Each assessment consists of 25 items—20 multiple-choice and 5 open-response. Six versions of each assessment are available in paper-and-pencil format, so that researchers, professional development providers, and course instructors can administer them as pre- and post-tests before and after workshops, institutes, or courses to measure growth in teachers' content knowledge.
Teams of researchers analyzed standards documents and research literature to synthesize the science content (detailed below) that middle school teachers should know. Four types of knowledge (detailed below) were also identified. Together these provided a two-dimensional chart within which questions were generated to ensure both breadth of coverage (content) and depth of coverage (knowledge type). Click on Middle School Science Content Summary Chart [PDF] to see a summary of the content analysis of these documents. The numbers in each cell represent page numbers in the documents, and the letters (A1, PS3, NC6, . . .) represent bibliographic references for research articles. Science topics identified in more than half of the sources (A in the far right column) were included in the assessments. The chart below summarizes this structure for the physical science assessment. See the Science Assessments for Middle School Teachers accordion item at the end of this page for descriptions of the knowledge types.
Teams of practicing science teachers, science teacher educators, and scientists generated test items, each intended to target a particular content area and a particular knowledge type simultaneously. Across each assessment, items were balanced across both dimensions.
DTAMS - Middle Science
Physical Science (PS)
Life Science (LS)
Earth/Space Science (ES)
Item Specification Grids indicate which content subscore and which type of knowledge each test item contributes to.
- Development process summary
- Middle Science Content Chart (summarizes identification of content across sources)
- Content Subscore Topics (more detail on the content breadth included in each subscore)
- Types of Knowledge Subscore Details (more detail on each type of knowledge)
- Item Specification Grids (mapping specific items onto 2-D chart of breadth & depth)
Selected Papers and Presentations
- Saderholm, J., & Tretter, T. R. (2008). Identification of the most critical content knowledge base for middle school science teachers. Journal of Science Teacher Education, 19(3), 269-283. doi: 10.1007/s10972-008-9092-9
  [detailed description of the process for identifying content breadth coverage of the PS DTAMS; a similar process was used for LS and ES]
- Tretter, T. R., Brown, S. L., Bush, W. S., Saderholm, J. C., & Holmes, V. (2013). Valid and reliable science content assessments for science teachers. Journal of Science Teacher Education, 24(2), 269-295. doi: 10.1007/s10972-012-9299-7
  [describes multiple sources of validity and reliability for the middle science DTAMS, plus results from more than 4,000 teachers nationwide that may serve as a useful comparison]
- Tretter, T. R., Philipp, S. B., & Brown, S. L. (2012, March). Characteristics of teachers and professional development that predict growth in life science content knowledge. Paper presented at the National Association for Research in Science Teaching (NARST) Annual Conference, Indianapolis, IN.
  [conference paper]
- Tretter, T. R. (2010, May). Strengthening and assessing teachers' physics content knowledge. Paper presented at the American Educational Research Association (AERA) Annual Meeting, Denver, CO.
  [conference short paper]
Declarative Knowledge (DEC): This is knowledge of definitions and facts. It includes memorized statements of concepts, rules, and laws—“knowing that.” Although they may or may not understand the basis of the concepts or how to apply them to solve problems, teachers with this knowledge can
- perform skills by rote.
- apply rules.
- give definitions.
- recall facts.
Scientific Inquiry and Procedures (INQ): This is knowledge of scientific procedures and approaches—“knowing how to do science.” Teachers with knowledge of the elements of scientific inquiry can
- identify questions for scientific inquiry.
- design and conduct scientific investigations and experiments.
- use appropriate tools, instruments, and techniques to gather, analyze, and interpret data.
- develop descriptions, explanations, predictions, and models using experimental evidence.
- think critically and logically to relate evidence and explanations.
- recognize and analyze alternative explanations and predictions.
- communicate scientific procedures and explanations.
- use mathematics in all aspects of scientific inquiry.
Schematic Knowledge (SCH): Schematic knowledge represents a deep understanding of science concepts, laws, theories, principles, and rules—“knowing and understanding why.” Teachers with this knowledge
- understand connections and relationships among scientific phenomena.
- understand the reasons for scientific rules and laws.
- can explain the basis for concepts and understandings.
- discern the nature of relationships.
- can compare and contrast properties and characteristics of scientific concepts.
- can explain natural phenomena within the limits of current scientific knowledge.
- know that many phenomena have not yet been explained and that scientific knowledge is always evolving.
Pedagogical Content Knowledge (PED): This knowledge represents strategic knowledge for science teaching—“knowing when, where, and how to best teach science.” For these assessments, we are concentrating on the use of pedagogical content knowledge in the correction of student misconceptions about science. Teachers with this knowledge can satisfy two criteria:
- Recognize the students' misconceptions, and
- Describe the most effective ways to teach scientific concepts: the most powerful analogies, illustrations, examples, explanations, experiments, and demonstrations.
Note: The abbreviated labels in parentheses (DEC, INQ, SCH, PED) are used for each subcategory on the score report returned to the field test coordinator.
- Matter - Properties and changes of properties in matter
- physical properties (e.g. density, boiling point, solubility), mixtures, physical vs. chemical change
- chemical reactions, compounds, conservation of mass, chemical families
- elements & atomic structure
- states of matter, kinetic theory, and gas laws
- Motion and Forces
- position and direction of motion, speed & velocity, graphical representation of motion
- force & acceleration (includes friction, weight, f = ma, gravity), addition of forces, balanced & unbalanced forces, momentum and impulse (includes action-reaction)
- Newton's first law of motion (inertia)
- Energy
- energy as ability to do work/change, mechanical energy, kinetic and potential energy, simple machines, systems & conservation of energy
- waves, sound, light (refraction, absorption, scattering & reflection), color and vision, electromagnetic spectrum, and sunlight
- static electricity, electric current & circuits, magnetism, electromagnetism
- thermodynamics, heat, temperature, and temperature scales
- chemical energy, nuclear energy (radioactivity, fusion, fission)
Establishing Validity
Test items for each content area were sent out to approximately 40 external reviewers from each of the same three groups (science teachers, science educators, scientists). These external reviewers categorized questions into a content category and a knowledge type. They also rated the appropriateness of each question and provided other suggestions for improving the questions.
Based on reviewer feedback, questions were selected, revised, and assembled into field tests. Parallel questions were generated to produce six versions of each content-area field test. Tests are designed to be completed within an hour. Each test consisted of 20 multiple-choice and 5 open-response questions. Each assessment covers 3-4 science subdomains. See the Content Subcategories - Physical Science item in the accordion at the end of this page for the specific topics in each subcategory. The table below summarizes the subdomains for each assessment:
| Physical Science | Life Science | Earth/Space Science |
|---|---|---|
| Matter | Structure/Function | Atmosphere/Hydrosphere |
| Motion and Forces | Internal Regulation | Lithosphere |
| Energy | Heredity/Diversity | Space |
| | Interdependence | |
Each team developed item specification charts for each assessment. These charts describe the content and knowledge type of the items on each of the three assessments. Click on Physical Science [PDF], Life Science [PDF], or Earth/Space Science [PDF] to view the item specification chart for each assessment.
Evidence of validity of the items for measuring teacher content knowledge in the various categories was established through external review. Items were edited, sorted into randomized sets, and sent to reviewers along with a review form that solicited: (1) the correct answer to each multiple-choice item; (2) categorization of each item into a content category and subcategory; (3) categorization of each item into a knowledge type category; (4) a rating of the item as STS or not; and (5) a rating of the appropriateness of the item for middle school teachers.
Reviewers for each content assessment included scientists, science educators, and science teachers. Each item was reviewed by 27-31 reviewers in life science, 29-33 reviewers in physical science, and 20-22 reviewers in earth science. Each person reviewed about 75 items.
Data from the reviewers were analyzed to identify items that met criteria the DTAMS staff established for measuring the assigned constructs:
- Content category: at least 75% of reviewers identified the item as assessing the given category, and (to ensure a balanced distribution within each category) more than 50% of reviewers agreed on the subcategory; both conditions were required.
- Knowledge type: more than 50% of reviewers rated the item as belonging to one type.
- Appropriateness: the item received an average rating above 2.4 (on a scale of 1 = low, 2 = medium, 3 = high).
Items meeting all three criteria were accepted for the field tests. Items meeting two of the three were reviewed to determine whether the wording could be clarified or improved; revised items were then sent out for a second review. Items that met the review criteria served as prototypes for items in the field tests.
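The acceptance rules above can be sketched as a small decision function. This is an illustrative reconstruction of the stated thresholds, not part of any actual DTAMS tooling; the function and field names are hypothetical.

```python
def item_criteria(content_pct, subcat_pct, ktype_pct, appropriateness_avg):
    """Return which of the three review criteria an item meets.

    Thresholds follow the review rules described above:
    content category >= 75% agreement AND subcategory > 50%;
    knowledge type > 50%; average appropriateness rating > 2.4.
    """
    content_ok = content_pct >= 0.75 and subcat_pct > 0.50  # both required
    ktype_ok = ktype_pct > 0.50
    appropriate_ok = appropriateness_avg > 2.4  # scale: 1=low, 2=medium, 3=high
    return content_ok, ktype_ok, appropriate_ok

def disposition(content_ok, ktype_ok, appropriate_ok):
    """Map the number of criteria met to the item's fate."""
    met = sum([content_ok, ktype_ok, appropriate_ok])
    if met == 3:
        return "accept for field test"
    if met == 2:
        return "revise wording and re-review"
    return "reject"

# Example: 80% content agreement, 60% subcategory, 55% knowledge type, 2.6 rating
print(disposition(*item_criteria(0.80, 0.60, 0.55, 2.6)))  # accept for field test
```

Note that the content criterion is conjunctive: an item with strong category agreement but weak subcategory agreement fails that criterion as a whole, dropping it to the revise-and-re-review path.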
Using the Assessments
Currently these assessments are available for use free of charge; however, they are scored by CRMSTD staff for a fee of $10 per teacher per assessment. Once scoring is complete, CRMSTD staff will send instructors and professional development providers a detailed summary of teachers' performance that includes scores on individual items, on each science subdomain in the content area, and on each of the four knowledge types (declarative, inquiry, schematic, and pedagogical content knowledge), allowing them to analyze performance on specific items, subdomain topics, or knowledge types.
Ordering the Assessments
Send an email to the CRMSTD staff indicating your interest, with a brief description of your intended use (e.g., with a Math-Science Partnership grant, for a research study, or for other professional development purposes). Also include the following information to help us plan and schedule our scorers:
- content area(s) you wish to use (Physical, Life, Earth/Space)
- approximate dates of administration
- approximate number of teachers completing assessments
- contact information (including an email address) for the person to whom completed scoring summaries and fee invoices should be returned