November 30, 2022

You might wonder how an online math admission test can reliably assess students’ math proficiency. How do experts design the questions? How do they guarantee that all versions of a test have the same difficulty? How can we confirm that this process is fair?

With the help of Paula Beukers, our mathematics admissions expert and former Head of the Admissions Committee at the Faculty of Science and Engineering (University of Groningen), we will give you an overview of the creation process behind our online math admissions exams.

We offer six math admission exams at different levels, depending on the program students apply to. The need for a new exam often comes from university admission offices, which are extremely busy with the high influx of international applicants they receive.

We have co-created tests with institutions like the University of Amsterdam, Maastricht University, and the Erasmus School of Economics. In these processes, our team of in-house mathematicians, educators, and education scientists works with admission officers and teachers. Together, we deliver the best tests to help institutions target students who will be a good fit for their programs.

Since programs like Humanities, Economics, or Engineering require different math proficiency levels, we offer diverse alternatives that cover these fields. Before creating a test, we first establish 1) what we want to test and 2) how we will evaluate it. That way, if something goes wrong, we can pinpoint where, and if something works well, we can replicate it.

We use Bloom’s Taxonomy to guarantee our math admission exams evaluate the level they should. This framework includes general categories with subcategories and is widely used to design and organize educational learning goals. Its six main categories are knowledge, comprehension, application, analysis, synthesis, and evaluation.

OMPT uses knowledge, comprehension, application, and analysis to confirm students grasp the required mathematical content. With OMPT-A and B, we mainly test the first level: knowledge. With OMPT-F and D, we focus on the application and analysis levels.

To design OMPT-D (for STEM programs), we started by examining the official Dutch Math B program as a reference, from which we later constructed more specific test outcomes. Some key questions were: What do we really want to achieve with this? What does the student need to show? After answering these questions, we designed a clear, structured test assessment table with multiple properties. Once this table was ready, our authors could start creating the questions.

Our test creation is a collaborative and methodical process. All our authors have access to the same assessment table, so everyone is on the same page when creating questions. Likewise, everyone always knows who is working on what, which keeps the team coordinated. Each author can then pick a specific test outcome and design an exercise that evaluates whether the student meets that requirement.

For example, with OMPT-D, we aim to have three equivalent exercises for each test outcome. Although the three exercises are not identical, they evaluate the same test outcome (e.g., analyze a graph and calculate an area). In each exam, students see one of the three. So, if a student correctly answers one of these questions and then retakes the test, they should also be able to calculate an area in the other versions.
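As an illustration of this idea, the sketch below shows how an exam could be assembled by picking one variant per test outcome. The bank contents and function names here are hypothetical, not OMPT’s actual implementation:

```python
import random

# Hypothetical exercise bank: each test outcome maps to three
# equivalent variants (illustrative names, not real OMPT data).
EXERCISE_BANK = {
    "analyze-graph-and-calculate-area": ["variant_a", "variant_b", "variant_c"],
    "solve-exponential-equation": ["variant_a", "variant_b", "variant_c"],
}

def build_exam(bank, seed=None):
    """Assemble one exam by choosing a single variant per test outcome."""
    rng = random.Random(seed)
    return {outcome: rng.choice(variants) for outcome, variants in bank.items()}

exam = build_exam(EXERCISE_BANK, seed=42)
# Every test outcome appears exactly once, with one of its three variants.
assert set(exam) == set(EXERCISE_BANK)
assert all(variant in EXERCISE_BANK[outcome] for outcome, variant in exam.items())
```

Because every variant of an outcome tests the same skill, which variant a student receives should not matter for their result.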

We thoroughly evaluate the quality of our questions with a double review process. When an author has created a question, another colleague checks it. If that mathematician spots something that should be changed, the author is notified so they can fix it. Afterward, we repeat this process with a second reviewer. Only then is the question ready to be published in our live environment.

Apart from our thorough and systematic approach to test design, we examine our work with analysis tools. Our system uses standardized formulas to determine which exercises are (or are not) performing as intended, so we can keep improving our tests.

We must confirm that all the exercises evaluating a given test outcome have, indeed, the same complexity. During design and development, we check that one version cannot be completed in a single line of calculation while another requires pages of work. To verify their equivalence, we look at applicants’ scores, collected across a large number of students. Let’s use an extreme case to illustrate this: if we see that, on average, only 20% of students solve exercise A while 80% succeed with exercises B and C, we correct this imbalance.
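A minimal sketch of such an equivalence check follows. The function name and the deviation threshold are illustrative assumptions, not OMPT’s actual formulas; the idea is simply to flag any variant whose observed success rate strays too far from the mean for its test outcome:

```python
# Hypothetical check: flag variants of one test outcome whose success
# rate deviates from the outcome's mean by more than a set tolerance.
def flag_unbalanced(success_rates, max_deviation=0.25):
    """Return variants whose success rate differs from the mean
    success rate by more than max_deviation (as a fraction)."""
    mean = sum(success_rates.values()) / len(success_rates)
    return sorted(variant for variant, rate in success_rates.items()
                  if abs(rate - mean) > max_deviation)

# The extreme case from the text: A solved by 20%, B and C by 80%.
rates = {"A": 0.20, "B": 0.80, "C": 0.80}
print(flag_unbalanced(rates))  # → ['A'] (mean is 0.60; A deviates by 0.40)
```

In practice, item analysis would also account for sample size and discrimination, but the core signal is this kind of deviation in observed difficulty.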

We apply the same criteria to our real and mock exams. Because every exam is designed from the same test assessment table, all of them remain comparable.

The OMPT exams are automatically graded by our engine. In addition, teachers grade them as any other teacher would. We have grading guidelines for OMPT-A, B, C, and E. When the questions are simple, straightforward calculations, students don’t get to add notes; in most cases, the answer is simply right or wrong. When exercises can have multiple answers, students can earn partial scores. In OMPT-B, applicants also have the option to submit notes explaining how they arrived at their solutions, which might result in partial scores as well.

In our latest tests, D and F, it’s essential to have specific evaluation guidelines for each exercise, especially where we also have to grade students’ worked-out solutions or reasoning. For example, we state, “25% of the full grade if students write this step” and “25% if they achieve that”. This is a typical evaluation process, also followed by other tests like the central exam in the Netherlands. Furthermore, two graders evaluate these tests to maximize the fairness of this process.
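Such per-exercise guidelines can be thought of as a rubric of weighted steps. The sketch below is a hypothetical illustration (the rubric steps and function are invented for this example, not taken from OMPT’s guidelines):

```python
# Hypothetical grading rubric: each step carries a fraction of the
# full grade, awarded when the grader sees it in the student's work.
RUBRIC = [
    ("writes the derivative correctly", 0.25),
    ("sets the derivative equal to zero", 0.25),
    ("solves for x", 0.25),
    ("verifies the result is a maximum", 0.25),
]

def grade(rubric, steps_achieved, full_grade=10.0):
    """Sum the fractions for every achieved step, scaled to the full grade."""
    earned = sum(weight for step, weight in rubric if step in steps_achieved)
    return round(earned * full_grade, 2)

score = grade(RUBRIC, {"writes the derivative correctly", "solves for x"})
print(score)  # → 5.0 (two of four quarter-credit steps)
```

Having two graders apply the same rubric independently, as described above, lets discrepancies between them be spotted and resolved.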

Do you have questions about how we design our math admission exams? Would you like to create a new mathematics admission exam for your institution? Feel free to schedule a free 30-minute demo with one of our product experts at your convenience. We will show you how it works and how online mathematics admission exams can benefit your admission office and prospective students.
