California's two largest school districts botched AI deals. Here are the lessons from their mistakes. – The Mercury News

With all the bravado of a startup founder, Alberto Carvalho, superintendent of the Los Angeles Unified School District, took the stage in March to introduce Ed, the district's AI chatbot. He told parents and students that it has "the potential to personalize the educational journey at a level never seen before in this district, across the country and across the world."

“No other technology can deliver on this promise in real time,” he said. “We know it will happen.”

In June, after just three months and nearly $3 million, the district shelved Ed after AllHere, the startup that developed the conversational AI assistant, shed more than half of its staff. District spokeswoman Britt Vaughan declined to answer questions about the bot's performance or to say how many students and parents were using it before the shutdown.

Also in June, an AI controversy erupted in San Diego, where school board members were reportedly unaware that the district had purchased a tool last summer that automatically suggests grades for written assignments. The dispute began after Point Loma High School teacher Jen Roberts told CalMatters that using the tool saved her time and prevented burnout, but that it sometimes gave students incorrect grades. A week later, Voice of San Diego reported that two school board members said they were unaware the district had signed a contract involving AI. In fact, no one on the board appeared to know about the tool, the news outlet said, because it was part of a broader contract with Houghton Mifflin that was approved unanimously and without discussion, along with more than 70 other items. (None of the board members responded to CalMatters' request for comment. Michael Murad, spokesman for the San Diego Unified School District, said that because AI is a rapidly evolving technology, "we will increase our efforts to inform board members of additional relevant details about the contracts presented to them in the future.")

The failures in Los Angeles and San Diego may stem from growing pressure on educators to adopt AI. They underscore the need for decision makers to ask more and harder questions about such products before buying them, say people who work at the intersection of education and technology. Outside experts can help education leaders better vet AI products, these people say, but even asking simple questions and demanding answers in plain English can go a long way toward avoiding buyer's remorse.

No one denies that educators face mounting pressure to find ways to use AI. Following the release of OpenAI's generative AI tool ChatGPT nearly two years ago, the California Department of Education published guidance that points to an "AI revolution" and encourages adoption of the technology. Educators who previously spoke to CalMatters expressed concern that if their students miss the revolution, they may fall behind in learning or job preparation.

Vetting AI tools

Hannah Quay-de la Vallee of the Center for Democracy & Technology believes that recent events in Los Angeles and San Diego show that more education leaders need to conduct critical evaluations before bringing AI tools into classrooms. Whether a particular AI tool deserves closer scrutiny depends on how it is used and the risk it poses to students. Some kinds of AI, such as tools used for grading or for predicting whether a student will drop out, she said, should be classified as "high risk."

The European Union regulates AI according to risk level, and in the United States the National Institute of Standards and Technology has published a framework to help developers, government agencies, and users of AI technology manage risks.

Tony Thurmond, California's superintendent of public instruction, was unavailable to answer CalMatters' questions about any steps he might take to prevent future AI-related mishaps in schools.

The Legislature is considering a bill that would require the superintendent to convene a working group to make recommendations for the "safe and effective" use of artificial intelligence in education. The bill was introduced by Josh Becker, a Silicon Valley Democrat, and is supported by Thurmond and the California Federation of Teachers.

Quay-de la Vallee suggested that educators work with organizations that test and certify educational technology tools, such as Project Unicorn, a nonprofit that evaluates edtech products.

If education leaders rush to adopt AI from educational technology vendors eager to sell it, both sides can skimp on quality, said Michael Matsuda, superintendent of the Anaheim Union High School District, who hosted an AI summit in March attended by educators from 30 states and more than 100 school districts.

He believes the recent AI problems in San Diego and Los Angeles show that districts must not get carried away by hype and should carefully examine the claims of companies selling AI tools.

School districts can rely on tech-savvy teachers and internal IT staff to assess how well AI tools work in the classroom, Matsuda said. So can nonprofits like The AI Education Project, which advises school districts across the country on using technology, or a group like the California School Boards Association, which has an AI task force that aims to help districts and counties "manage the complexity of integrating artificial intelligence."

“We need to work together, think about what we've learned from mistakes and talk openly about it,” he said. “There are a lot of good products coming to market, but you have to have the infrastructure and the strategic policies and board policies in place to really look at some of these things.”

Education leaders don't always have a full picture of the technologies teachers are using in their district. Matsuda said the Anaheim Union High School District uses AI to personalize student learning and even offers courses for students interested in pursuing careers in AI. But he doesn't know whether Anaheim educators are using AI for grading today. After the events in San Diego, Matsuda said, the district may consider labeling certain use cases, such as grading, as high risk.

Common sense

You don't have to be an AI expert to be critical of claims about AI's benefits for students or teachers, says Stephen Aguilar, co-director of the Center for Generative AI and Society at the University of Southern California and a former educational technology developer. District officials signing contracts with AI companies need to know their own policies, know what the district wants to accomplish with the contract, and ask questions. If vendors can't answer those questions in plain English, it can be a sign they're exaggerating what's possible or trying to hide behind technical jargon.

"I think everyone should take the lessons from LA Unified and do a debrief, ask questions that weren't asked, and take things slower," Aguilar said. "Because there's no rush. AI is going to evolve, and it's really up to the AI edtech companies to prove that what they're selling is worth the investment."

The challenge, he said, is that you don't evaluate an AI model just once. Different versions can produce different results, which means evaluation has to be a continuous process.

Aguilar said that while the events in Los Angeles and San Diego schools demonstrated the need for more scrutiny of AI, school district administrators seemed convinced that they had to be on the cutting edge of technology to do their jobs, which is simply not true.

“I don’t know exactly how we got into this cycle,” he said.

The market is pressuring edtech providers to incorporate AI into their products and services, foundations are pressuring school administrators to include AI in their curricula, and teachers are being told their students may be left behind if they don't adopt AI tools, says Alix Gallagher, director of strategic partnerships at Stanford University's Policy Analysis for California Education center.

Because AI is being built into many existing curriculum-related products and contracts, the San Diego school board is very likely not the only one to unexpectedly discover AI embedded in a contract. Gallagher said administrators need to ask questions about supplemental curricula and software updates.

“It's almost impossible for districts and schools to keep up,” she said. “I think that's even more true for smaller school districts that don't have the extra staff they can deploy to do this.”

Gallagher said AI can do positive things, like reducing teacher burnout, but individual teachers and small school districts won't be able to keep up with the pace of change, so trusted nonprofits or state education agencies should help determine which AI tools are trustworthy. The question in California, she said, is who will take the initiative and lead that effort.
