Scott Wiener would have been all smiles if Gov. Gavin Newsom had decided to sign his supposedly groundbreaking bill regulating the development of new artificial intelligence devices and programs in California.
Instead, Newsom chose Sunday to veto the measure, known as SB 1047.
This bill was originally intended as a model for other states, but it fell far short of that ambition. It was so watered down in the legislative process, for political reasons, that it might as well have contained no new rules at all.
OpenAI is the developer of the widely used AI tool ChatGPT, which has often been wrong about many things.
But here's the real question for Wiener, and one reason the governor may have vetoed his bill: Why introduce a complicated, often vague, so-called protection against harmful robots and machine minds when simple rules that could guard against all varieties of problems were laid out some 82 years ago by a leading scientist and science fiction writer?
In his 1942 short story “Runaround,” Isaac Asimov first introduced his three laws of robotics, which would become cornerstones of his many later works, including the famous “Foundation” series.
The first law states that a robot may not harm a human being or, through inaction, allow a human being to come to harm. The second law holds that a robot must obey the orders given to it by humans, except where such orders would conflict with the first law. And the third law states that a robot must protect its own existence, as long as doing so does not conflict with the first two laws.
Instead of offering this comprehensive but simple protection, politics intervened. Some opponents even questioned Wiener's watered-down bill, which eliminated a previously proposed state department that would have specialized in safety measures for AI devices of all kinds. Instead, those duties were handed to the attorney general's office, never known for its cybernetic ingenuity.
The attorney general, nominally California's top law enforcement official, could punish firms that pose an imminent threat or cause harm. But there is no solid definition of what that means.
Proponents of the Wiener measure claimed it would create guardrails to stop AI programs from crippling the power grid and causing other sudden disasters. It is clear that some controls are needed, because AI is developing quickly and in many forms, from taking over most mathematical functions in banks to composing messages automatically.
Then there's the state's legitimate concern about imposing rules so strict that they threaten to drive away the latest potential high-tech economic engine, one that could make up some of the slack in a state that has already seen companies like Tesla and Toyota move their headquarters elsewhere.
Then there are those who claim this would be overwhelming regulation that fails to address everyday, real-world problems like privacy and misinformation. To be sure, AI produces plenty of misinformation and sometimes distorts basics like dates and places of birth, complicating some people's lives. Wiener's bill offered no remedy for these ills.
Why not simply adopt Asimov's rules instead? They are simple, and his vivid imagination used them as central elements of many novels and stories involving robots with different personalities and functions.
The advantage of starting with simple rules to govern an industry that has so far had few is that new rules can be designed as needed, giving people and companies the chance to explore new AI features and letting wrinkles develop without government intervention unless circumstances demand it.
There's an old principle that says, “Just start” – and if ever there was a situation that called for it, it is the potentially limitless field of artificial intelligence.
image credit : www.mercurynews.com