Many students view math word problems as uninteresting, unimportant, and unnecessary when the problems seem irrelevant and disconnected from students' real-world experiences. As a result, students may become disengaged and lose interest in the subject altogether. How could I design and develop a math word problem technology that generates comprehensible problems to increase student engagement and interest?
For my dissertation research, I have designed and developed, and am now evaluating, a math word problem generator that outputs math word problems based on students' interests.
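To give a concrete sense of the idea, one simple approach fills arithmetic templates with nouns drawn from a student's chosen interest. This is only a hypothetical sketch (the template, interest bank, and function names are all illustrative), not the dissertation system's actual implementation:

```python
import random

# Hypothetical arithmetic template; the real system's problem
# structures and generation approach may differ.
TEMPLATES = [
    "{name} has {a} {items}. A friend gives {name} {b} more. "
    "How many {items} does {name} have now?",
]

# Illustrative interest-to-noun bank keyed by a student's stated interest.
INTEREST_BANK = {
    "basketball": ["basketballs", "jerseys", "trading cards"],
    "gaming": ["game tokens", "controllers", "power-ups"],
}

def generate_problem(name, interest, rng=random):
    """Generate one interest-themed addition problem and its answer."""
    items = rng.choice(INTEREST_BANK[interest])
    a, b = rng.randint(2, 9), rng.randint(2, 9)
    template = rng.choice(TEMPLATES)
    problem = template.format(name=name, a=a, b=b, items=items)
    return problem, a + b
```

For example, `generate_problem("Maya", "gaming")` might yield "Maya has 3 game tokens. A friend gives Maya 7 more. How many game tokens does Maya have now?" paired with the answer 10.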
In-person interviews were conducted with 14 stakeholders (ten 5th-7th grade students and four math teachers) to analyze user needs and inform the design of the system. I wanted to understand students':
For the teachers, I wanted to understand:
Needs of Stakeholders
The interview transcripts were cleaned and imported into MAXQDA for coding. Within the scope of the project, needs statements were derived from participant statements (see below). In hindsight, I would pair each interpreted need with the original participant statement to further validate my findings.
Have ownership of the theme of problems for personalization
Interact with engaging material that includes their interests
Learn the math skills that will be helpful in the present and future
Traditional math word problems are boring and irrelevant
Students have difficulty retaining math skills learned in class
Students perceive little to no utility value in traditional math word problems
Provide materials with a good balance between learning and engagement
Access student data easily
Provide systems that reinforce taught math solving strategies
Lack of classroom management support
Lack of systems that align with teacher lesson plans
Lack of systems with aids and hint features to support student retention of information
Personas were created to reflect the motivations and expectations of the users and to inform the design of the system. These illustrations of key users were based on the interviews and on research reports in math education. Two student personas and one teacher persona were created:
I also observed trends in math technology. One teacher participant mentioned Google Classroom, describing the ease of logging in with a generated Class ID; a design iteration of the proposed system incorporated this feature.
As shown below, the first sketches of illmatics' assessment pages displayed multiple questions on one page. Khan Academy's website design inspired a reorganization of the assessment/math question pages, and the design was updated accordingly.
After completing the low-fidelity prototype, I gathered feedback from four teachers using the think-aloud method. The goal was to receive input on the system's functionality and features. Participants performed the following tasks:
With design and development complete, the next step is to evaluate the system with the target users. The system's usability is currently being examined using the System Usability Scale (SUS). Additionally, the readability of the generated problems will be compared to that of traditional math word problems using the Flesch Reading Ease score.
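Both instruments reduce to simple closed-form computations. A minimal Python sketch of each, assuming SUS responses are the standard ten 1-5 Likert items and using a naive vowel-group heuristic for syllable counting (so readability scores are approximate, unlike tools with full syllable dictionaries):

```python
import re

def sus_score(responses):
    """Convert 10 SUS item responses (1-5 Likert) to a 0-100 score.

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response); the sum is scaled by 2.5.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

def count_syllables(word):
    """Approximate syllables as runs of consecutive vowels,
    dropping a trailing silent 'e'."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text):
    """Flesch Reading Ease:
    206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words).
    Higher scores indicate easier-to-read text.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

For example, a respondent who answers 5 on every odd item and 1 on every even item yields a SUS score of 100, and short one-syllable sentences score above 100 on Flesch Reading Ease.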
The system's effectiveness will be examined in the evaluation/learning-impact phase. This study will explore the cohesiveness of the problems, student performance, and students' triggered situational interest.
I look forward to sharing additional details and the results of these investigations soon!