A Better Way of Revealing Student Thinking in STEM

As students learn scientific concepts, they often hold a mix of scientific and non-scientific ideas that they struggle to integrate. Constructed response questions – in which students must explain phenomena in their own words – give students an opportunity to demonstrate proficiency and can also reveal those mixed ideas.

Over years of research and development, the Automated Analysis of Constructed Response (AACR) collaboration has applied computerized analysis techniques, like machine learning, to evaluate student constructed responses. From this research, we've built tools like the Constructed Response Classifier (or CRC tool). These tools are freely accessible at this website, making it possible for you to easily evaluate your students' responses.
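To make the idea concrete, here is a minimal, hypothetical sketch of the kind of machine-learning text classification described above. The training examples, labels, and model choice are illustrative assumptions, not the actual CRC tool or its data.

```python
# Hypothetical sketch: scoring constructed responses with a simple
# text-classification pipeline (illustrative only; not the CRC tool).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: student explanations paired with
# expert-assigned labels ("scientific" vs. "mixed").
responses = [
    "Ions diffuse down their concentration gradient through the channel.",
    "The cell wants to balance its charges so it pulls ions in.",
    "Movement depends on both the concentration gradient and membrane potential.",
    "The ions know where to go because the cell needs them.",
]
labels = ["scientific", "mixed", "scientific", "mixed"]

# Convert text to features, then fit a classifier on the expert labels.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(responses, labels)

# Score a new, unseen student response.
new_response = ["Ions move because of the electrochemical gradient."]
print(model.predict(new_response))
```

In practice, models like this are trained on large sets of expert-scored responses so that new responses can be scored automatically and returned to instructors.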



Why Use Constructed Response?

  • Requires students to demonstrate proficiency in their own words
  • Reveals complex student thinking, which can be a mix of scientific and non-scientific ideas
  • Is an authentic scientific practice
  • Engages a different cognitive task than forced selection among fixed options
  • Enables rich formative assessment practice
  • Is supported by our developed items, which are aligned with foundational concepts in STEM disciplines

Question of the Day

The figure shows a cell with the following labeled:

* Chloride (Cl-) ion concentrations (measured in mg/L) inside and outside the cell

* Membrane potential (-70 mV)

* Cl- channel

a) In this situation there is net movement of Cl- ions out of the cell through the Cl- channel (as indicated by the arrow). What would have to change to cause net movement of Cl- INTO the cell? Identify as many ways as you can.

b) Explain how the ways you identified above cause Cl- to move INTO the cell.

View Full Question

Register for an account to see more information about this and other questions we have available for you to use.


Why register for an account?

There is no charge to use this website, but you must register for an account. We require registration so that instructors can upload and analyze their data.

Registered users have individualized homepages and access to our free CRC tool, which automatically scores students' constructed responses.

Coming Soon! We're building a CRC user dashboard to give you quick access to custom tools and information. Visit your dashboard whenever you're ready to return to previous reports or set up new questions. From your dashboard you can upload new data, create a new course, or review previous reports, as well as follow discussion threads for topics you care about and connect with the broader community of CRC users.

How do I get insight into student thinking?

When you use our automated tools to score student responses in your course, results are returned to you as an interactive report. These interactive course reports are designed to reveal key insights into student learning in STEM, helping educators pinpoint areas of success and areas to target for improvement. Each report provides a high-level overview of your class's understanding of course material, letting you explore the complex mix of ideas in your students' writing and identify the key STEM concepts it contains.
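As a rough illustration of the class-overview idea, the sketch below aggregates hypothetical per-response scores into the kind of high-level summary a report might show. The student IDs, categories, and counts are invented for the example; they are not from an actual report.

```python
# Illustrative sketch (not the actual report code): aggregate
# per-response scores into a class-level overview.
from collections import Counter

# Hypothetical scored responses: (student_id, predicted idea category).
scored = [
    ("s1", "scientific"),
    ("s2", "mixed"),
    ("s3", "mixed"),
    ("s4", "scientific"),
    ("s5", "non-scientific"),
]

# Count how many responses fell into each category.
counts = Counter(category for _, category in scored)
total = len(scored)

# Print a simple class-level breakdown, most common category first.
for category, n in counts.most_common():
    print(f"{category}: {n}/{total} ({100 * n / total:.0f}%)")
```

A real report layers interactivity on top of summaries like this, letting instructors drill down from category counts to individual student responses.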


I already understand that students struggle with this and this [report] provides me with more input to better understand why they struggle with it

Introductory Biology Professor from Michigan State University

Everyone knows what the problems are, but I think that... a way to come up with questions to address those issues are difficult. You guys did a great job.

Instructor from Stony Brook University

Announcements

Models for scoring cell respiration explanations are accurate across institution types

Monday, September 13, 2021

In a study led by Dr. Megan Shiroda, the group compared model accuracy across responses provided by students at community colleges, primarily undergraduate institutions, and research-intensive universities. They found that the models scored new student explanations with accuracy similar to that achieved during model development, with no significant differences between institution types.

Our Collaborators