Biblio

Conference Paper
Wilson, M., Hoskens, M., & Wang, W.-C. (1996, April). Complex item responses: Rater effects and item dependency. Presented at the annual meeting of the American Educational Research Association, New York, NY.
Wilson, M., & Draney, K. (1997, July). Mapping student progress in science with embedded assessments. Presented at the annual conference of the American Association for the Advancement of Science, Seattle, WA.
Wilson, M., & Scalise, K. (2003, June). Research on learning: Implications for assessment in higher education. Presented at the American Association for Higher Education Assessment Conference, Seattle, WA.
Wilson, M., & Draney, K. (1996, April). Mapping student progress with embedded assessments. Presented at the annual meeting of the American Educational Research Association, New York, NY.
Wilson, M., & Draney, K. (1997, March). Developing maps for student progress in the SEPUP assessment system. Presented at the ninth International Objective Measurement Workshop, Chicago, IL.
Wilson, M., & Case, H. (1996, June). An investigation of the feasibility and potential effects of rater feedback on rater errors. Presented at the Council of Chief State School Officers National Conference, Phoenix, AZ.
Wilson, M. (1997, January). Assessment and evaluation together in the service of instruction and learning. Presented at the National Science Foundation Status of Middle School Science Conference, Washington, DC.
Wilson, M., & Adams, R. J. (1992, June). Evaluating progress with alternative assessments: A model for Chapter 1. Invited address presented at the Conference on Curriculum and Assessment Reform, Boulder, CO.
Wilson, M., Thier, H., Sloane, K., & Nagle, B. (1996, April). What have we learned from developing an embedded assessment system? Presented at the annual meeting of the American Educational Research Association, New York, NY.
Wilson, M. (1998, January). Embedding developmental assessment into an instructional curriculum: The case of SEPUP's Issues, Evidence and You. Presented at the International Developmental Assessment Seminar, Royal Melbourne Institute of Technology, Melbourne, Australia.
Wilson, M. (1994, April). Multimode assessment involving raters. Presented at the annual meeting of the American Educational Research Association, Atlanta, GA.
Yamada, H., Draney, K., Karelitz, T. M., Moore, S., & Wilson, M. (2006, April). Comparison of dimension-aligning techniques in a multidimensional IRT context. Presented at the 13th International Objective Measurement Workshops, Berkeley, CA.
Book Chapter
Baek, S.-G., & Choi, I.-H. (2009). The mastery learner judgment consistency rate of Rasch model-based standard setting method: Focused on the comparison with raw-score and Angoff methods. In Criterion-referenced testing: Practice analysis to score reporting using Rasch measurement. Chicago: JAM Press.
Diakow, R., Torres Irribarra, D., & Wilson, M. (2013). Some Comments on Representing Construct Levels in Psychometric Models. In New Developments in Quantitative Psychology (pp. 319–334). Springer New York.
Draney, K., & Wilson, M. (2007). Application of the Saltus model to stagelike data: Some applications and current developments. In Multivariate and mixture distribution Rasch models (pp. 119–130). Springer.
Kennedy, C., & Draney, K. (2007). Interpreting and using multidimensional performance data to improve learning. In X. Liu, Applications of Rasch Measurement to Science Education. Chicago: JAM Press.
Kennedy, C. (2005). Ten surprises about teaching. In M. Kallet & Morgan, A., The Art of College Teaching: 28 Takes. Knoxville, TN: University of Tennessee Press.
Lehrer, R., Kim, M., Ayers, E., & Wilson, M. (In Press). Toward establishing a learning progression to support the development of statistical reasoning. In J. Confrey & Maloney, A., Learning over Time: Learning Trajectories in Mathematics Education. Charlotte, NC: Information Age Publishers.
Roberts, L., & Henke, R. (1997). Mapping Middle School Students' Perceptions of the Relevance of Science. In M. Wilson & Engelhard, G., Objective Measurement: Theory into Practice (Volume 4). Norwood, NJ: Ablex.
Wilson, M., & Draney, K. (2013). A Strategy for the Assessment of Competencies in Higher Education. In Modeling and Measuring Competencies in Higher Education (pp. 61–80). Springer.
Wilson, M., & Adams, R. J. (1996). Evaluating progress with alternative assessments: A model for Chapter 1. In M. B. Kane, Implementing performance assessment: Promise, problems and challenges. Hillsdale, NJ: Erlbaum.
Wilson, M., Bejar, I., Scalise, K., Templin, J., Wiliam, D., & Torres Irribarra, D. (2012). Perspectives on Methodological Issues. In Assessment and Teaching of 21st Century Skills (pp. 67–141). Springer.
Wilson, M. (2012). Responding to a Challenge That Learning Progressions Pose to Measurement Practice. In Learning progressions in science (pp. 317–343). Springer.
