Robotic Literacy Project

An AI tutoring system for kindergarten students

Problem Overview
Reading is one of the most important learning tasks in life, yet previous studies have shown that fewer than 50% of American children read at their grade level. Learning literacy skills at a young age can significantly improve children's quality of life.
Today, digital devices are an essential part of students' lives and can potentially assist learning. Research on educational technology has reported that innovative technologies such as interactive robots are associated with students' motivation to learn. However, there is limited information about social robots and their potential role in early literacy learning for younger students.
Project Goals 
The GSU Reading Lab and the MIT Personal Robots Group recognized the potential of digital devices to provide high-quality education for children. Together, they proposed and designed an innovative robot-based literacy intervention to foster the reading and language development of kindergarten students living in at-risk communities.
My Roles
Research Assistant
Ethnographer
Data Analyst 
Graphics Editor
Tools
Adobe Acrobat DC
Adobe Photoshop
Adobe Premiere Pro
Microsoft Excel
Team 
Robin Morris, Ph.D.
Cynthia Breazeal, Ph.D. 
Hae Won Park, Ph.D.
Lynda Osborne, Ph.D.
Mykayla Jeter, M.S.
Kejia Patterson, Ph.D.
Xiajie (Brayden) Zhang, Ph.D.
Timeline
2019-2022
Context
This research study placed socially interactive robots in a kindergarten classroom at a local Atlanta school as a peer-tutoring system. The station integrated a robot named Jibo with a tablet running a storybook game app developed by the Personal Robots Group at MIT.
The game presented the school's reading curriculum, allowing students to explore their classroom storybooks and identify words for reading and meaning through various support features. These included word identification by touching the text, image identification by touching a picture, and automatic story reading, with Jibo asking questions about the story and assisting students throughout these processes.
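To give a sense of the interaction design, the sketch below shows how a single storybook page might be described to such an app. The field names, coordinates, and overall structure are my own illustrative assumptions, not the Personal Robots Group's actual data format.

# Hypothetical description of one storybook page for the game app.
# All names, coordinates, and fields here are illustrative assumptions.
page_config = {
    "book": "Sample Storybook",
    "page": 3,
    "illustration": "images/page_03.png",
    "text_regions": [
        # Touching a word's bounding box triggers word identification.
        {"word": "caterpillar", "bbox": [120, 340, 310, 380]},
    ],
    "image_hotspots": [
        # Touching a picture region triggers image identification.
        {"label": "caterpillar", "bbox": [200, 60, 420, 250]},
    ],
    "auto_read_audio": "audio/page_03.mp3",  # automatic story reading
    "robot_prompts": [
        # Questions Jibo can ask the student about this page.
        "What do you think happens next in the story?",
    ],
}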
Phase 1: Deployment
Designing Educational Materials
We gathered 42 storybooks from the school curriculum to develop the storytelling game app for the robot platform. Our goal was to design a game app that matched the physical experience of reading.
My Contributions
I collaborated with the designers and engineers to convert the physical books into digital form.
The Process 
1. Scanned and uploaded each page of the physical books.
2. Extracted each text line from the scans with Adobe Acrobat DC into Notepad files for app development.
3. Edited the book illustrations in Adobe Photoshop and resized them for tablet display (a scripted sketch of steps 2 and 3 follows below).
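The extraction and editing were done by hand in Acrobat DC and Photoshop. For readers curious what steps 2 and 3 would look like as a script, here is a minimal sketch, assuming the Pillow and pytesseract libraries, a hypothetical scans/ folder of page images, and an assumed tablet resolution.

# Minimal sketch of steps 2-3 as an automated pass. The real work was
# done manually in Adobe Acrobat DC and Photoshop; folder names and the
# tablet resolution below are assumptions.
from pathlib import Path

from PIL import Image
import pytesseract

TABLET_SIZE = (1280, 800)  # assumed tablet resolution

Path("text").mkdir(exist_ok=True)
Path("images").mkdir(exist_ok=True)

for scan in sorted(Path("scans").glob("*.png")):
    page = Image.open(scan)

    # Step 2: extract the text lines from the scanned page for the app.
    text = pytesseract.image_to_string(page)
    Path("text", scan.stem + ".txt").write_text(text)

    # Step 3: resize the illustration for tablet display.
    page.resize(TABLET_SIZE).save(Path("images", scan.name))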

Storybook image edited without text lines

Storybook look on tablet screen

Storybook game app library with book covers

Phase 2: Integration in Schools
Introductory Session with Students
Before integrating the robots into their classrooms, we led instructional sessions with kindergarten students and their teachers.
We developed coloring books for the students to use while learning about the different parts of the robot platform, such as the cameras, headphones, and AI "brain."
We also created cards with the vocabulary students learned during the introductory sessions to serve as a later resource.
My Contributions
I developed educational and interactive materials to introduce robots to the students and their teachers.
Acclimation Sessions and Face ID Training
Once the robots were installed in the classrooms, we started tutorial sessions with all of the kindergarten students. We created profiles for every student but only collected interaction and face-recognition data from students with consent.
We conducted a 10-minute tutorial session with each participant. Before starting the training, we reviewed the introductory session with the vocabulary cards and asked students to point out the parts of the robot station.
After the review, we introduced students to the profile-selection page, where they had to identify their names. Then we proceeded to face-recognition training. We showed each child how to sit so the platform could recognize their face every time they logged into the system, such as making sure the headphone microphone was not covering their face and calling their teacher if the camera was tilted up and not capturing their face.
Once face training was completed, we introduced them to the storytelling game. Each child selected any of the available books, and we explained what each feature meant and how they could use it to read and interact with the robot.
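I did not work on the platform's software, so the following is only a rough sketch of the enroll-then-verify flow described above, written with the open-source face_recognition library; the file names and messages are hypothetical, and this is not the platform's actual implementation.

# Rough sketch of an enroll-then-verify face ID flow. File names and
# messages are hypothetical; this is not the platform's actual code.
import face_recognition

# Enrollment: store a reference encoding when a consented student's
# profile is created.
reference_image = face_recognition.load_image_file("profiles/student_01.jpg")
reference_encoding = face_recognition.face_encodings(reference_image)[0]

# Login: compare the current camera frame against the stored encoding.
login_image = face_recognition.load_image_file("camera/current_frame.jpg")
login_encodings = face_recognition.face_encodings(login_image)

if not login_encodings:
    # No face found, e.g. the headphone mic covers the face or the
    # camera is tilted up; the student is prompted to ask the teacher.
    print("No face detected - please check your seating or ask your teacher.")
elif face_recognition.compare_faces([reference_encoding], login_encodings[0])[0]:
    print("Welcome back! Opening your storybook library.")
else:
    print("Face not recognized - please select your profile again.")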
My Contributions
I trained students on how to use the robot platform. 
Free Use Sessions
After we finished the training sessions with all the students, we started to conduct naturalistic observations and focus groups with teachers.
My Contributions
After the acclimation session, I conducted naturalistic observations in the classrooms.
In addition, I checked the system's network connection, addressed any technical difficulties that arose, and reported them to the engineers (a sketch of the kind of check I mean follows below).
During the focus groups, I served as notetaker and helped schedule meeting times and find space in the school to hold the meetings.
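The network checks themselves were simple. The sketch below shows the kind of reachability test referenced above, with hypothetical station names and addresses; it is not the project's actual tooling.

# Simple reachability check for the robot stations. Host names, IP
# addresses, and the port are placeholders, not the real deployment.
import socket

STATIONS = {
    "jibo-station-1": ("192.168.1.21", 22),
    "jibo-station-2": ("192.168.1.22", 22),
}

for name, (host, port) in STATIONS.items():
    try:
        with socket.create_connection((host, port), timeout=3):
            print(f"{name}: reachable")
    except OSError:
        # Failures were noted and reported to the engineers.
        print(f"{name}: NOT reachable - report to the engineering team")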
COVID-19 Impact
The project was affected by the COVID-19 pandemic. Most of the students enrolled in the study could not complete more than one week of free use with the robot platform due to school closures.
We continued the project from home moving forward; see Jibo at Home.
More information on this project is still limited; if you would like to learn more, please contact me.
Watch my video presentation for the 2021 Georgia State Undergraduate Research Conference.
Sponsors
#1717362
Dr. Robin Morris' Reading Laboratory
