Stability AI, the company behind the widely used AI image generator Stable Diffusion, released a major upgrade to the program on Thursday, European time (Wednesday night in the US). The new version is freely available on GitHub, Microsoft’s code-hosting site, and can generate images from text prompts such as “An astronaut creating a sand castle.” The update focuses on improved image quality, including higher resolutions, and makes it harder to use the tool to create pornographic content, addressing some concerns about the technology.
According to the developers, the first version, Stable Diffusion V1, “transformed the landscape of open source AI models and inspired hundreds of new models and breakthroughs throughout the world.” It was released publicly earlier this year, around the same time as competitors Midjourney and DALL-E 2, which Microsoft is integrating into its Office productivity software to help produce illustrations for documents and presentations.
AI art technology takes textual descriptions such as “a picture of hipster dogs playing poker, in the manner of Tomma Abts” and feeds them into software trained to spot recurring patterns in massive troves of real-world data, producing images that match the prompt. As a result, the technology has thrown the definition of art, and the question of who should hold the copyright to it, into disarray in both the art and technology worlds. Celebrities and artists have voiced concern that these applications may inappropriately use their likenesses or trademark styles, potentially altering how people perceive them.
Those concerns came to the forefront this summer, when a man from Colorado won a digital art award with a piece generated by Midjourney. Not long after, the stock photo agency Getty Images stopped selling AI-generated artwork.
Stable Diffusion attempts to address these issues, for instance by giving the model a shallower understanding of celebrity likenesses. According to The Verge, some critics saw the company’s efforts to reduce the likelihood of users generating AI nudity and porn as a form of censorship.
According to Stability AI, the firm will soon provide access to its program running on its own machines, in addition to publishing its source code on GitHub.