@mechaneyes
Created January 26, 2024 00:34
AI Art Generation

AI Image Generators

Midjourney

https://www.midjourney.com/home?callbackUrl=/explore

Midlibrary

https://midlibrary.io/midlibrary/inside

Andrei Kovalev's Midlibrary is a curated collection of hand-picked Midjourney styles, plus a Midjourney guide.

DALL-E 3

https://openai.com/dall-e-3

Stable Diffusion

https://clipdrop.co/stable-diffusion-turbo

Runway

https://runwayml.com/

Video Goodness

Copyright Legality

NYTimes v OpenAI

The Times Sues OpenAI and Microsoft Over A.I. Use of Copyrighted Work

https://www.nytimes.com/2023/12/27/business/media/new-york-times-open-ai-microsoft-lawsuit.html

Millions of articles from The New York Times were used to train chatbots that now compete with it, the lawsuit said.

OpenAI and journalism

https://openai.com/blog/openai-and-journalism

We support journalism, partner with news organizations, and believe The New York Times lawsuit is without merit.

While we disagree with the claims in The New York Times lawsuit, we view it as an opportunity to clarify our business, our intent, and how we build our technology. Our position can be summed up in these four points, which we flesh out below:

  1. We collaborate with news organizations and are creating new opportunities
  2. Training is fair use, but we provide an opt-out because it’s the right thing to do
  3. “Regurgitation” is a rare bug that we are working to drive to zero
  4. The New York Times is not telling the full story

Current Discussion

Making Music With AI? Start With These Ethical Guidelines

https://ra.co/features/4198

Tools for Artists

Nightshade

https://nightshade.cs.uchicago.edu/whatis.html

Nightshade is a tool that turns any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into "poison" samples, so that models trained on them without consent learn unpredictable behaviors that deviate from expected norms; e.g., a prompt asking for an image of a cow flying in space might instead produce an image of a handbag floating in space.

Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use

https://venturebeat.com/ai/nightshade-the-free-tool-that-poisons-ai-models-is-now-available-for-artists-to-use/

Glaze

https://glaze.cs.uchicago.edu/what-is-glaze.html

Protecting artists from programmes like Midjourney and Stable Diffusion, Glaze alters the digital data of an image so that it “appears unchanged to human eyes, but appears to AI models like a dramatically different art style”.

While imperfect, the free system has been increasingly recommended in response to new concerns for targeted style mimicry—a post on X following the “Midjourney Style List” urging artists to “Glaze” their work received more than 1,000 likes and 400 reposts.

Have I Been Trained

https://haveibeentrained.com/

Check whether your work has been included as a training image in a generative-AI programme. Also offers a Do Not Train Registry, which excludes registered works from cooperating datasets.

Misinformation & Deepfakes

From Audio to Photoreal Embodiment: Synthesizing Humans in Conversations

https://github.com/facebookresearch/audio2photoreal/blob/main/README.md

YouTube is cracking down on AI-generated true crime deepfakes

https://www.theverge.com/2024/1/8/24030107/youtube-ai-deepfakes-true-crime-victims-minors

YouTube is updating its cyberbullying and harassment policies and will no longer allow content that “realistically simulates” minors and other victims of crimes narrating their deaths or the violence they experienced.
