Designing Assessments for Today’s Learners



One of the biggest questions that came out of the recent design and build of a school makerspace was: How can we meaningfully assess the learning that is happening here?

Pellegrino (2014) states:

Research on cognition and learning suggests a broad range of competencies that should be assessed when measuring student achievement, many of which are essentially untapped by current assessments. Examples are knowledge organization, problem representation, strategy use, metacognition, and participatory activities (e.g., formulating questions, constructing and evaluating arguments, contributing to group problem solving). (p. 243)

A search of the internet turns up numerous frameworks for 21st century learning competencies (e.g., P21, C21, the Alberta Education Framework for Student Learning). What is much harder to find are practical tools, other than rubrics, for assessing these competencies.

Implementing teaching and learning environments focused on 21st century competencies will remain a challenge without accompanying assessments that are practical to administer and informative about learning and teaching. Pellegrino calls for learning scientists to “embrace the challenge of designing assessments that are aligned to our evolving conceptions of what it means to know and to learn” (p. 249). This will be critical if we are to create future learning environments that develop the doers and thinkers of tomorrow.

Alberta Education. (2011). Framework for student learning: Competencies for engaged thinkers and ethical citizens with an entrepreneurial spirit. Retrieved February 2, 2016 from

Milton, P. (2012). Shifting minds: A 21st century vision for public education in Canada. C21 Canada. Retrieved February 2, 2016 from http://www.c21canada.org/wp-content/uploads/2012/10/Summit-design-English-version-Sept.-26.pdf

Partnership for 21st Century Skills. (2011). Framework for 21st century learning. Retrieved February 2, 2016 from storage/documents/1.__p21_framework_2-pager.pdf

Pellegrino, J. W. (2014). A learning sciences perspective on the design and use of assessment in education. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (2nd ed., pp. 233-252). New York, NY: Cambridge University Press.




About sandralbecker

An educator who is passionate about the creation of a school Learning Commons, which supports inquiry, critical thinking, and collaboration.

4 Responses to Designing Assessments for Today’s Learners

  1. Krista Francis says:

    Hi Sandra
    What would meaningful assessment look like? For instance, how would you assess learning in a maker space? What would you look for? Is there still a disconnect between what is valued for learning and for standardized tests? How can we change the disconnect?

    • These are important questions, Krista. I think there is definitely a disconnect between what is valued for learning and what standardized tests measure. I am always amazed at how little teachers trust their own intuitive knowledge. For example, in the school where I last taught, there were many rich, authentic ways to gather data about our students’ reading abilities, but the default was a canned reading assessment that was diagnostic rather than standardized. I think this spoke to the teachers’ lack of confidence in using data they had collected themselves in a meaningful way.

  2. Reyhaneh says:

    Hi Sandra,
    I think you raised very important points about the practicality issues tied to the new trends focused on improving students’ 21st century competencies. In addition to your points, I wonder how assessing these competencies is feasible in the short periods of time usually allotted to assessment in schools. This is actually an issue I am thinking about in my own research on improving systems thinking skills in children. Given the limitations of a dissertation study, I wonder how I can assess the learning outcomes of the activities I would design to enhance systems thinking skills, which I believe develop gradually and may only emerge in the long run. What do you think about the timing aspects of assessing the desired competencies?


    • Reyhaneh,
      This is such an important question, because I think that learning often does develop gradually and emerge in the long run. I think the most authentic data would come from listening to and watching the children themselves, but this is extremely labour intensive. If you are using design-based research as a methodology, perhaps your assessment and data collection methods might have to be iterative and intuitive. diSessa and Cobb speak about ontological innovation – could this be part of your research plan?

      diSessa, A. A., & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. The Journal of the Learning Sciences, 13(1), 77-103.
