cross-posted to:
- hackernews@derp.foo
Here’s the kicker: based on these AI-assigned definitions in the updated terms, your access to certain content might be limited or even cut off. You might not see certain tweets or hashtags, and you might find it harder to get your own content in front of a broader audience. The idea isn’t entirely new; we’ve heard stories of shadow banning on Twitter before. But automation and AI involvement make it more sophisticated and all-encompassing.
It’s controversial, to be sure, but I’ve always been of the mind that if someone does something transformative with one of my works, they’ve created something different, even if it was “inspired” by my work.
ML gens are transformative by nature, so I don’t think my work being one of millions of datapoints used to create something is a huge deal.
That said, I’m also an advocate of preservation through piracy, so I’d be a hypocrite if I went copyright-mad at bots for looking at images I uploaded to the public internet.