ESSA and AESA: Measuring Learning in Econometrics

Standard assessments of student learning are vital to quantifying the effectiveness of different teaching methods (Freeman et al. 2014). Physics, biology, and chemistry have well over 100 publicly available assessments that cover a wide range of courses and topics. Economics currently has only one high-quality standard assessment appropriate for undergraduate students: the Test of Understanding in College Economics (TUCE). George Orlov (Cornell’s first Economics Active Learning Initiative postdoc) and I have spent the last six months developing two new assessments of learning in econometric methods that we hope will catalyze improvements in economic education.

The Economic Statistics Skills Assessment (ESSA) is a measure of learning for students in an introductory statistics course, but it can also be used as a measure of pre-existing knowledge for students entering more advanced econometrics courses. Students are notoriously poor at recognizing the extent of their own knowledge, and this test provides objective insight into areas that need to be reviewed before teaching new material. I wrote about giving this test to students in my Applied Econometrics course a few weeks ago.

We started work on ESSA by explicitly documenting the learning goals of the introductory statistics course and interviewing faculty about common student misconceptions. Next, we drafted 18 multiple-choice questions covering a large subset of the course learning goals. We then piloted the test with students to identify questions that were unclear or too difficult. This spring, we will carefully observe students thinking aloud as they work through the assessment, which will give us better insight into their thought processes.

Our second effort is the Applied Econometrics Skills Assessment (AESA). Following the procedure outlined in Adams and Wieman (2011), we documented the learning goals of our applied econometrics course and created corresponding multiple-choice questions. We then recruited several faculty members to provide feedback on whether the assessment evaluated expert-level thinking. Following revisions, we conducted think-aloud interviews with undergraduate students who had previously taken the course. These were truly eye-opening, and between interviews we revised questions to remove ambiguity and to ensure that students could identify the concepts being tested. We gave the resulting 45-minute assessment to students at the end of our fall applied econometrics course, and this spring we are teaching the same course with several brand-new in-class activities that we hope will improve learning. At the end of the term, AESA will be one of the ways we measure the effectiveness of our new activities.

We are currently partnering with several faculty at other institutions to pilot both assessments in other contexts, and we will report the results at the Conference on Teaching and Research in Economic Education (CTREE) this May. If you teach econometrics or economic statistics and are interested in working with us to turn these assessments into great resources for the economics community, please do get in touch. It’s not too late to pilot these in your own classes this spring: you’ll learn more about what your students are learning, and we’ll get valuable feedback that will help us improve the assessments themselves.