
AI in the classrooms of the future, part II: Reading students’ emotions

FAITH LAB TECH BYTES || ARNOLD SCHUCHTER

JULY 22, 2018

Envision attending a symposium at which the founder of a well-funded startup talks about using “machine vision” to detect human emotion, and about the business value for enterprises of using cameras and facial-recognition technology to understand customer engagement and emotions. Partway through the presentation, however, this CEO casually drops a prediction, without embellishment or hype, that high-tech cameras will probably become an integral part of K–12 classrooms so that teachers can use students’ facial expressions and emotions to evaluate their educational experiences.

 

As part of the presentation, this CEO also mentions that the company is discussing with software developers the possibility of using its “machine vision” software to emotion-enable education apps in order to analyze students’ responses to digital content in the classroom. In case you haven’t heard, a growing number of high-tech companies have been creating software that uses AI, optical sensors, or just webcams to analyze human emotions and associated cognitive states from faces and voices.

 

With the aid of machine learning and deep learning, “machine vision” trains algorithms to screen and classify emotions. Big Data now includes “emotion databases” gathered from faces analyzed around the globe. The way it works is that facial muscles generate hundreds of facial expressions of emotion, many of them nuanced and subtle. Speech, too, has many dimensions, from pitch to voice quality. Algorithms can measure moment-to-moment changes in both speech and facial expressions, and emotion AI can model the resulting complexity in real time, surfacing patterns of potential value to various end users.
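To make that pipeline a little more concrete, here is a minimal, purely illustrative sketch in Python of the loop such a system runs: grab a video frame, locate faces, and score each face against a set of expression labels. It uses OpenCV’s standard face detector; classify_emotion is a hypothetical stand-in for the trained deep-learning model a real vendor would supply, and the label set is invented for the example.

```python
# Minimal sketch of a webcam emotion-analysis pipeline (illustrative only).
# Assumes opencv-python is installed; classify_emotion() is a hypothetical
# stand-in for whatever trained model a vendor's "emotion AI" actually uses.
import cv2

EMOTIONS = ["neutral", "happy", "surprised", "confused"]  # example label set


def classify_emotion(face_img):
    """Placeholder for a trained facial-expression classifier.

    A real system would feed the cropped, normalized face into a deep
    network trained on a large labeled "emotion database" and return
    a probability for each expression category.
    """
    return {label: 1.0 / len(EMOTIONS) for label in EMOTIONS}  # dummy output


def analyze_frame(frame, face_detector):
    """Detect faces in one video frame and score each for expression."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # typical input size
        results.append(classify_emotion(face))
    return results


if __name__ == "__main__":
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)      # default webcam
    ok, frame = cap.read()         # a real system would loop frame by frame
    cap.release()
    if ok:
        print(analyze_frame(frame, detector))
```

A production system would run this on a continuous video stream, combine the facial scores with features extracted from speech, and aggregate results over time rather than reacting to single frames.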

 

Rest assured that, at least for the moment, the spooky part of this hypothetical presentation about emotion-enabling education apps is fictitious, though not at all implausible given the potential of next-generation AI in classrooms of the future. The idea of using AI to analyze students’ emotions obviously would be a very controversial application in K–12 education. Conceivably, however, some day in the foreseeable future, “machine vision” in K–12 education might be given consideration, at least experimentally, as a way to achieve the personalized learning (PL) goals of technology-oriented foundations like those of Bill and Melinda Gates and the Zuckerbergs.

 

For several years, for example, Mark Zuckerberg and his wife Priscilla Chan have been investing in the development of software that “understands how you learn best.” Not to be outdone, the Gates Foundation has been funding several initiatives around the nation based on personalized learning models that focus on “data-informed learner profiles” and “personalized learning paths.”
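To give those quoted terms some concrete shape, here is a minimal sketch, assuming a simple skills-mastery model: a “data-informed learner profile” as a plain data structure, and a toy rule that turns it into a “personalized learning path.” The field names and the recommendation rule are assumptions for illustration, not any foundation’s or vendor’s actual schema.

```python
# Illustrative sketch only: one plausible shape for a "data-informed learner
# profile" and a "personalized learning path" as plain data structures.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class LearnerProfile:
    student_id: str
    mastery: Dict[str, float] = field(default_factory=dict)      # skill -> 0..1 score
    engagement_signals: List[str] = field(default_factory=list)  # e.g. observed behaviors


@dataclass
class LearningPath:
    student_id: str
    next_activities: List[str] = field(default_factory=list)     # ordered recommendations


def recommend_path(profile: LearnerProfile, threshold: float = 0.8) -> LearningPath:
    """Toy rule: queue practice for any skill below a mastery threshold."""
    weak_skills = [s for s, score in profile.mastery.items() if score < threshold]
    return LearningPath(profile.student_id,
                        [f"practice:{skill}" for skill in sorted(weak_skills)])


profile = LearnerProfile("s-001", {"fractions": 0.55, "decimals": 0.9})
print(recommend_path(profile).next_activities)  # ['practice:fractions']
```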

 

Thus the focus of the hundreds of millions invested by these foundations is technology-assisted PL: developing effective tools that help teachers manage competency-based classroom learning environments in which the curriculum instructs each student in ways that maximize that student’s achievement. None of these foundations has yet set limits on the types of technology that school systems and teachers can deploy for PL. At some point they will have to decide whether to support research in the emerging field of empathy technologies and its applications to the personalization of education.

 

The results of these foundations’ investments in various aspects of PL in school systems across the nation have been studied by the RAND Corporation. The good news from RAND’s studies is that student achievement growth in mathematics and reading over two years in schools with Gates Foundation funding exceeded that of a comparison group that did not use PL strategies and technology. The bottom line was that the most significant gains in student achievement occurred in schools using data-driven teaching methods that enabled PL.

 

The not-so-good news from RAND was that none of the schools funded by Zuckerberg or Gates ended up looking much different from traditional schools as a result of the funding. Without saying so very explicitly, these foundations have concluded that they should focus more on directly funding innovative EdTech developers who work with teachers and curriculum experts to design experimental systems that support and test different strategies for promoting PL.

 

This is a very important shift in foundation-funding strategies, since it coincides with rapidly growing interest across the nation in adopting schoolwide PL models. While PL, with its focus on individualized instruction, seems a promising concept, a critical question remains without a definitive, research-based answer: Does PL actually improve student learning more than other educational strategies, and in what ways can AI be used to effectively implement and evaluate PL?

 

St. James Faith Lab will be looking for K–12 research that reveals how AI can successfully function as a technology collaborator for teachers, providing holistic views of students and their learning styles in real time, together with recommended ways to engage, motivate, and advance each student’s achievement.
