Human Rights Watch has sounded the alarm over Australian children’s images found in a huge data set used to train AI models. It could be a breach of our privacy law.
Since last year, I have been working on a project with my Thai mother who migrated to Australia in 1974 while pregnant with me. To fill in the archive, we’ve been looking towards AI.
Tools such as ChatGPT dominate the conversation around AI in schools. But with teachers looking to meet Indigenous content requirements, using generative AI could do more harm than good.
Generative AI has changed the ways we work, study and even pray. Here are some highlights of an astonishing year of change – and what we can expect next.
Visual artists draw from visual references, not words, as they imagine their work. So when language is in the driver’s seat of making art, it erects a barrier between the artist and the canvas.
Robert Mahari, Massachusetts Institute of Technology (MIT); Jessica Fjeld, Harvard Law School; and Ziv Epstein, Massachusetts Institute of Technology (MIT)
Intellectual property law wasn’t written with AI in mind, so it isn’t clear who owns the images that emerge from prompts – or if the artists whose work was scraped to train AI models should be paid.
Generative AI can seem like magic, which makes it both enticing and frightening. Scholars are helping society come to grips with the potential benefits and harms.
During the brainstorming stage of the design process, AI-powered image generation programs can open creative doors that might otherwise have stayed closed.
Artists and photographers have strongly opposed AI image generators replicating their distinctive styles. And the law has yet to catch up with the issue.