
Hey, Alexa, What Should Students Learn About A.I.?

June 8, 2023

A senior Amazon executive, Rohit Prasad, spoke to ninth and 10th graders at Dearborn STEM Academy in Boston’s Roxbury neighborhood. Prasad observed a lesson in artificial intelligence (A.I.), sponsored by Amazon, that teaches students how to program simple tasks for Alexa, Amazon’s voice-activated virtual assistant. During the lesson, Prasad remarked that millions of new jobs in A.I. will soon be available, emphasizing the need for grass-roots education to create talent for the next generation. Meanwhile, Sally Kornbluth, the president of the Massachusetts Institute of Technology (MIT), was giving a presentation about A.I. risks and regulation to students at Boston’s Kennedy Library complex. Kornbluth’s sobering message warned against deploying A.I. technology too hastily and emphasized the need for rules to ensure that A.I. does not cause harm.

MIT’s “responsible A.I.” initiative organized A.I. workshops for schools that both encouraged students to work in A.I. and cautioned against its risks and potential harms. The future of A.I. education in schools remains unclear, with schools trying to discern whether they should teach students to program and use A.I. tools or to anticipate and mitigate A.I. harms. Cynthia Breazeal, who directs the university’s initiative on Responsible A.I. for Social Empowerment and Education, says that MIT’s program aims to teach students to be informed and responsible users and designers of A.I. technologies.

As the push for A.I. education gains momentum, the term “A.I. literacy” is becoming a new buzz phrase. Schools are scrambling for resources to teach A.I., and many universities, tech companies, and nonprofits are offering ready-made curriculums. Some observers worry, however, that schools may focus only on the use and programming of A.I. tools without considering the broader ecosystem of A.I. systems, such as researching the business models behind new technologies or examining how A.I. tools exploit user data.

Some universities and tech companies are providing professional software tools to help high school students build future-proof skills and learn how to work with A.I. to do things they care about. These tools include coding curriculums and other programs for K-12 schools. For example, MIT created lessons in voice A.I., developed with Amazon Future Engineer, to teach students about “utterances,” the phrases that consumers might say to prompt Alexa to respond.
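The article does not describe the lesson’s code, but the idea behind “utterances” can be sketched in a few lines: sample phrases are grouped under an intent, and a matched phrase triggers that intent’s response. The Python below is a hypothetical toy illustration of that concept, not the Amazon Future Engineer curriculum or the real Alexa Skills Kit; every intent name, phrase, and response here is invented.

```python
# Toy sketch (hypothetical, not Amazon's curriculum or the Alexa Skills Kit):
# each "intent" lists sample utterances a consumer might say, plus the reply
# the skill should give when one of those utterances is matched.
INTENTS = {
    "GreetingIntent": {
        "utterances": ["say hello", "greet me", "say hi"],
        "response": "Hello! Welcome to our class project.",
    },
    "FactIntent": {
        "utterances": ["tell me a fact", "share a science fact"],
        "response": "Honey never spoils if it is stored sealed.",
    },
}


def handle_utterance(spoken_text: str) -> str:
    """Match what the user said against each intent's sample utterances."""
    normalized = spoken_text.strip().lower()
    for intent in INTENTS.values():
        if normalized in intent["utterances"]:
            return intent["response"]
    # Fallback when no sample utterance matches the spoken phrase.
    return "Sorry, I don't know how to help with that yet."


if __name__ == "__main__":
    print(handle_utterance("Tell me a fact"))  # matches FactIntent
    print(handle_utterance("sing a song"))     # falls through to the fallback
```

A real voice assistant does far more than exact string matching, but this kind of mapping from sample phrases to intents is the concept the “utterances” lessons introduce.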

Schools like Dearborn STEM, which has a track record of critically examining technology, have introduced courses in which students used A.I. tools to create deepfake videos of themselves and examine the consequences. Even so, many students expressed privacy and other concerns about A.I.-assisted tools. Amazon records the conversations consumers have with its Echo speakers after they say the wake word “Alexa.” Unless users opt out, Amazon may use their interactions with Alexa to target them with ads or use their voice recordings to train its A.I. models. Privacy remained a recurring topic of discussion with students at Dearborn STEM.

Thousands of students worldwide also participated in a “Day of A.I.” event organized by MIT’s initiative on Responsible A.I. for Social Empowerment and Education. Among the activities, students from Warren Prescott School in Charlestown, Massachusetts, played the role of senators from different states, debating provisions for a hypothetical A.I. safety bill. Some students wanted companies and police departments banned from using A.I. to target people based on data such as their race or ethnicity. Others wanted schools and hospitals to assess the fairness of A.I. systems before deploying them. Nancy Arsenault, an English and civics teacher at Warren Prescott, often asks her students to consider how digital tools affect them and the people they care about. Arsenault believes that students are aware that unfettered A.I. is not something they want and that they want to see limits.
