Image showing Ai-Da with a self-portrait. (Image Credit: Leemurz/ CC BY 4.0)
On October 11th, Ai-Da became the first robot to speak at the U.K. Parliament, addressing the House of Lords Communications and Digital Committee. She gave evidence on whether A.I. and technology threaten human creativity. The humanoid also experienced a minor glitch during the session, briefly drifting off into robotic dreamland.
Earlier in the session, her creator, Aidan Meller, had to restart Ai-Da after a technical glitch left her cross-eyed. Afterward, he put sunglasses on her; when the committee asked why, he explained that “she sometimes can pull quite interesting faces” when reset.
The robot artist, named after the mathematician Ada Lovelace, was developed in 2019 by Engineered Arts along with researchers from the University of Oxford and the University of Birmingham. Originally conceived as a project to explore A.I.’s artistic potential, the humanoid is well known for its portrait of Queen Elizabeth II and its “Leaping into the Metaverse” exhibition.
“I thought, is it actually possible that we could critique, comment, and look at this world of technology by actually the technology speaking for itself?” Meller explained as he introduced Ai-Da.
Her answers to each question were prepared by an A.I. language model, allowing her to deliver polished responses.

“The role of technology in creating art will continue to grow as artists find new ways to use technology to express themselves and reflect and explore the relationship between technology, society, and culture,” Ai-Da said.
“Technology has already had a huge impact on the way we create and consume art, for example, the camera and the advent of photography and film. It is likely that this trend will continue with new technologies,” she added. “There is no clear answer as to the impact on the wider field, as technology can be both a threat and an opportunity for artists.”
Those remarks came in response to Baroness Featherstone, who had asked about technology’s involvement in producing art. Featherstone went on to ask Meller whether he was the robot’s puppet master.
He said the dataset Ai-Da draws on to create art could be as large as the internet. “I’ll give an example of how far-reaching this is, which is very upsetting for humans,” he said. “We actually do ask her about the work, what she would like to do and what her ideas are for it. We are able to get quite a collaborative conversation going about what potential areas of data she could look at.”
“This feeds into all the films about A.I. taking over the world,” Featherstone said, adding that the technology had exceeded her expectations.
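The article does not say which language model prepares Ai-Da’s answers or how it is prompted, so the snippet below is purely illustrative. It sketches that kind of workflow with Hugging Face’s off-the-shelf text-generation pipeline, using GPT-2 as a placeholder model and a paraphrase of Featherstone’s question as the example prompt:

```python
# Illustrative only: draft a scripted answer to a committee question with a
# small open language model. GPT-2 is a placeholder; the model and prompting
# behind Ai-Da's actual testimony have not been disclosed.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

question = (
    "To what extent is technology involved in the production of your art, "
    "and what does that mean for the wider creative field?"
)
prompt = f"Committee question: {question}\nAi-Da's answer:"

# Generate a draft response from the prompt.
result = generator(prompt, max_new_tokens=80, do_sample=True, temperature=0.8)
print(result[0]["generated_text"])
```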
Cameras in Ai-Da’s eyes, paired with computer vision algorithms, allow her to interpret whatever is in front of her. A custom control system then moves her robotic arms in response to what she sees, so she can paint.
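The article does not detail Ai-Da’s drawing software, but the description above maps onto a familiar camera-to-arm loop. The Python sketch below shows one way such a pipeline could be wired together with OpenCV; the ArmClient class and its pen_up/pen_down/move_to methods are hypothetical placeholders for a motion interface, not Ai-Da’s actual API.

```python
# Minimal see-then-draw loop: camera frame -> edge contours -> arm strokes.
# OpenCV handles the vision side; ArmClient is a hypothetical stand-in for
# the robot's real motion controller.
import cv2


class ArmClient:
    """Hypothetical robotic-arm interface (placeholder, not Ai-Da's API)."""
    def pen_up(self): ...
    def pen_down(self): ...
    def move_to(self, x: float, y: float): ...


def frame_to_strokes(frame, max_points=50):
    """Turn a camera frame into simplified contours for the pen to trace."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)                 # detect edges
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    strokes = []
    for contour in contours:
        approx = cv2.approxPolyDP(contour, epsilon=2.0, closed=False)
        points = [tuple(p[0]) for p in approx][:max_points]
        if len(points) > 1:
            strokes.append(points)
    return strokes


def draw_scene(camera_index=0):
    cap = cv2.VideoCapture(camera_index)              # the "eye" camera
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("could not read from camera")

    arm = ArmClient()
    for stroke in frame_to_strokes(frame):
        arm.pen_up()
        arm.move_to(*stroke[0])
        arm.pen_down()
        for x, y in stroke[1:]:
            arm.move_to(x, y)                         # trace the contour
    arm.pen_up()


if __name__ == "__main__":
    draw_scene()
```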
“The greatest artists questioned and engaged with the societal shifts within their times. The more I read about the future, and where we were going as a world, I realized this very much-needed debate about the nature of technology wasn’t having a lot of airtime. I increasingly became very worried about it,” Meller said. “I want to be very clear that we’re not here to promote robots or any specific technology. It really is a contemporary art project.”
Have a story tip? Message me at: http://twitter.com/Cabe_Atwell