Diagnostic Teacher Assessments in Mathematics & Science (DTAMS)
Assessments of teacher content knowledge
About DTAMS
Diagnostic Teacher Assessments in Mathematics & Science (DTAMS) serve two purposes:
(1) to describe teachers' breadth and depth of content knowledge so that teachers can use the results diagnostically for themselves; and (2) to capture that breadth and depth of content knowledge for evaluators, researchers, or others to use in support of strengthening teacher capacity. They were initially developed iteratively between 2002 and 2006 by teams of researchers and teachers, under the leadership of Dr. William (Bill) S. Bush (retired). DTAMS assessments are available in three domains: elementary mathematics, middle school mathematics, and middle school science (see side navigation).
DTAMS resources have been available on a free, self-serve basis since January 2019
The fee-based scoring service previously provided by CRIMSTED at the University of Louisville has been discontinued. The links for each DTAMS domain (see side navigation) include multiple versions of each assessment, scoring guidelines, subscore generation guidance, and supporting documents.
Terms of Use: DTAMS resources are available free of charge for appropriate use under the following conditions:
- You may copy and distribute the material in any medium or format, and you may adapt the material (remix, transform, or build upon it).
- You must give appropriate credit, but you may not suggest that CRIMSTED endorses you or your use.
- You may not use the material for commercial purposes.
- Any adaptation of the material must be distributed under these same conditions.
- You may not apply legal terms that restrict others from using these materials (or adaptations of them) in ways these terms allow.
Suggested citation for DTAMS web materials broadly:
University of Louisville CRIMSTED. (n.d.). Diagnostic Teacher Assessments in Mathematics and Science (DTAMS).
Questions? Please contact the CRIMSTED Program Coordinator at CRMSTD@louisville.edu.
Frequently Asked Questions
How do I obtain the DTAMS assessments, and is there a cost?
Please see the information on the “overview” page of DTAMS. Briefly, the DTAMS resources are now (since January 2019) available free of charge in a self-service format. Users are requested to acknowledge (cite) our research center as the original source for any use of DTAMS.
Is training required to administer the assessments?
Training is not required. We recommend allotting approximately one hour for teachers to complete an assessment.
Do you offer a scoring service, and is there a fee?
No. As of January 2019 we no longer offer the DTAMS scoring service, and there is no longer any fee associated with using DTAMS.
How long does an assessment take to complete?
Completion time generally ranges from 30 to 50 minutes, with most participants finishing in under 45 minutes. On the posttest, some teachers write more in response to the open-response questions, which can add about 10 minutes.
Where do I find the assessments and scoring guides?
Browse the DTAMS web pages (part of the Center for Research in Mathematics and Science Teacher Development – CRIMSTED – at the University of Louisville) to locate the specific assessment you are interested in. You may download and use (or adapt) it as detailed on the “overview” page. Scoring guides and a selection of other resources to aid in interpreting DTAMS scores are also available.
What about test security?
As is true of many educational measures published in journals, measurement yearbooks, and other publicly available sources, the DTAMS assessments are now available to anyone via the internet. Educational researchers often do not appear to be greatly concerned about test security when using such measures, and we do not currently undertake any specific procedures to maintain it. Project directors, evaluators, and others who wish to use these assessments need to decide for themselves how serious this issue is in their context and whether to use DTAMS accordingly. As noted on the “overview” page, one way to minimize test-security concerns is to edit or adapt the material (e.g., rearrange response choices, edit some response choices, or mix items across different versions while paying attention to which subscore each item contributes to).
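If you do adapt an assessment in this way, below is a minimal sketch of rearranging an item's response choices while keeping the answer key and the item's subscore assignment consistent. The item record (field names, choices, and subscore label) is hypothetical; real item content and item-to-subscore assignments come from the downloaded assessments and scoring guidelines.

```python
import random

# Hypothetical item record for illustration only.
item = {
    "id": "version1_item07",
    "stem": "Which of the following best explains ...?",
    "choices": ["choice A", "choice B", "choice C", "choice D"],
    "key_index": 2,           # position of the correct choice before shuffling
    "subscore": "schematic",  # subscore category this item contributes to
}

def shuffle_choices(item, rng=random):
    """Return a copy of the item with its response choices rearranged,
    updating the answer key so scoring stays consistent."""
    order = list(range(len(item["choices"])))
    rng.shuffle(order)
    adapted = dict(item)
    adapted["choices"] = [item["choices"][i] for i in order]
    adapted["key_index"] = order.index(item["key_index"])  # new position of the key
    return adapted

adapted = shuffle_choices(item)
print(adapted["choices"], "key:", adapted["key_index"], "subscore:", adapted["subscore"])
```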
How should these assessments be used?
Our position is to leave it up to users to decide how these assessments will best serve them. Below are some examples of what others have done or are doing with these assessments.
- These assessments are intended for diagnostic purposes, and we suggest they are best used to measure growth or to identify strengths and weaknesses of individual teachers.
- Some have administered all three science content-area assessments as a pretest and used the results to determine which content area to focus on in upcoming professional learning offerings.
- CAUTION: Due to test fatigue, we recommend not administering all three on the same day.
- Some have chosen to focus primarily on one or more of the knowledge-type subscores or content-subcategory scores. For example, some users were most interested in how teachers use their knowledge pedagogically to teach, others in enhancing science inquiry skills, and still others in deep, schematic knowledge. Some have used the content-subcategory scores to choose an emphasis on one content area over another, e.g., "force and motion" within physical science.
- CAUTION: Because each subscore is based on fewer items than the assessment overall, conclusions drawn from subscores alone are more tentative than those based on the total score and should be made cautiously.
- Many have used these assessments in a pre-post design to look for gains. Some examined gains in subscores (either knowledge type or content subcategory) as well as overall gains; the same caution about subscores applies. A minimal sketch of subscore and gain computation appears after this list.
- CAUTION: Be sure that administration conditions support valid interpretation of scores. For example, some users reported that when they administered a posttest at the end of a Friday after a long and mentally tiring series of professional learning days, teachers may have been tired and/or less than fully motivated to demonstrate their full knowledge, which undermines interpretation of the scores.
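For users who tabulate results themselves, here is a minimal sketch of subscore aggregation and pre-post gain computation. The item ids, point values, and item-to-subscore mapping are hypothetical; the actual assignments and scoring rules are given in the scoring guidelines and subscore generation guidance for each domain.

```python
from collections import defaultdict

# Hypothetical mapping from item ids to knowledge-type subscores; the real
# assignments come from the DTAMS subscore generation guidance.
ITEM_TO_SUBSCORE = {
    "item01": "declarative",
    "item02": "schematic",
    "item03": "pedagogical",
}

def subscores(item_scores, mapping=ITEM_TO_SUBSCORE):
    """Sum a teacher's item scores within each subscore category."""
    totals = defaultdict(int)
    for item_id, score in item_scores.items():
        totals[mapping[item_id]] += score
    return dict(totals)

def gains(pre, post):
    """Pre-post gain (post minus pre) for each subscore category."""
    return {cat: post.get(cat, 0) - pre.get(cat, 0) for cat in set(pre) | set(post)}

# Example: one teacher's (made-up) pretest and posttest item scores.
pre_totals = subscores({"item01": 1, "item02": 0, "item03": 2})
post_totals = subscores({"item01": 1, "item02": 2, "item03": 3})
print(gains(pre_totals, post_totals))  # e.g. {'schematic': 2, 'pedagogical': 1, 'declarative': 0}
```

The same cautions about subscore tentativeness and administration conditions apply to any numbers produced this way.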