Given the latest court ruling in March that AI works can't be copyrighted, this makes a lot of sense. The movie itself can't be copyrighted if it uses AI (although there are still some unresolved issues around how much AI use crosses that line).
Hah, no. Just because AI was employed in the production to some extent doesn't mean it can't be copyrighted. It is not so black and white. You are not describing the situation accurately.
Obviously just performative signalling that doesn't really do much. You can't definitively tell if AI was used, so the rule can never realistically be enforced.
Then again, the Oscars are surely almost entirely vibes based anyway. So it's hardly some internally consistent system of merit in the first place.
I wish we could stop the slide of the term "performative" into meaninglessness.
Just because something is hard or even impossible to enforce, doesn't mean you don't state that it is not allowed and that there are consequences for being caught. That's a common fallacy that overly engineering-minded people fall into.
We're humans. We care about things. There is nothing strange about me asking you not to do something that I can't stop you from doing.
If you consider low-stakes crimes, typically to get to a steady state of effectiveness you need at least some sort of bootstrapped period of ubiquitous enforcement. If that's impossible then I'm not sure you ever get to effectiveness.
If we're talking high-stakes, death-penalty-lottery-if-you-break-the-rules type stuff, then I think actually detection rate (i.e. consistent enforcement) is the biggest predictor of reduced rates, not severity of punishment.
Sure, but even giving 100% of the benefit of the doubt you're raising, it still doesn't follow that it is purely "performative" to formally establish a rule just because it may soon become impossible to identify rule-breakers without whistle-blowers or intel.
How are there consequences for being caught if it's impossible to detect?
Moreover, why stop here? There are many great rules that are impossible to enforce. Why not a rule that the author isn't allowed to have any racist thoughts when writing the material?
We can't read minds, but it sure is a nice thing to care about, don't you think?
It doesn’t always have to have consequences when it’s a curated access club like the Oscars. It’s ok to have cultural norms that aren’t enforced by consequences, at the very least some of the ethical participants will follow them. I know that I try to follow the spirit of the clubs I participate in, and if they don’t have these types of statements often I just don’t know what the community thinks is ok.
It breaks down when assholes join, or the overly self-interested. This mindset permeates America today, but there are still many collective organizations that don’t need punitive measures. These are less common but when you find them, it’s often a positive signal.
I guess the Best Visual Effects category is going to be tough to judge, but don't you think it might be quite hard to win the Best Actress Academy Award if your AI-generated heroine can't come get the trophy?
Also, "truth" is a thing that exists, and just because you can't always tell if somebody cheated the rules or not, does not mean the rules are "performative signalling".
I don't think AI-generated 'avatars' are anywhere close to being Oscar-worthy as things stand, so it seems kind of a moot point (hence the 'signalling' thing).
If they ever get that good, I would just say you can't really fight the market. If AI content is good enough that people want it, then the Oscars just get left behind after a while. But that's fine, and up to them.
> Also, "truth" is a thing that exists, and just because you can't always tell if somebody cheated the rules or not, does not mean the rules are "performative signalling".
I don't really understand. If you can't hope to discover the truth, in what way is it not performative or signalling?
Can you explain how an Oscar-worthy piece of writing would somehow be able to contain blatant AI-generated content? How would it have already passed the good-enough-for-an-Oscar filter?
The younger generation also increasingly pays less attention to traditional mainstream entertainment and media, as now they can create more of it with AI.
Edit: funny to see the anti-AI crowd showing up again, how predictable... you can downvote but you can't stop the truth! Legacy entertainment is dying, and will soon become irrelevant.
It's much easier to tell if athletes are doping than to 'detect' AI in text that's already Oscar-for-writing level good. I would suggest the latter is quite literally impossible.
I have always heard that dopers are consistently ahead of testing regimes. I don't think doping is easier to detect than AI, which always seems pretty obvious to me.
You have to consider that any AI content worthy of the Oscar shortlist is going to be very high-quality, and likely intensely hand/human-tweaked in the first place. It's not from the general population of all AI content out there.
> I have always heard that dopers are consistently ahead of testing regimes
I don't know about that, even the very biggest names with the most funding quite often get dinged for it. I suppose I'm not really saying that the detection rate for doping is high, though, just that it's much higher than AI detection in high-quality content (which I would suggest is approximately zero).
Agreed. As AI is more widely adopted in cinematography, I assume they will start adding categories specifically for it... I hate the idea of them ever competing directly against actual humans performing.
The rationale (which, again, I'm not arguing for or against) is that mocap performances are not, strictly speaking, entirely the actor's, because mocap data has to be cleaned and can be (and very often is) edited and tweaked after the fact by animators. Not to mention that liberties often have to be taken because a model cannot line up one-to-one with an actor anatomically.
In a sense, mocap performances are done by a team of animators where one animator puppeted a model in real time.
Every last motion Gollum makes was Serkis doing it, including when he's jumping up on rocks and climbing down head-first. The animators certainly deserve credit for the facial expressions and the rest of the work of the digital costume, but he physically acted the part.
- emotional connection
- aesthetics
- zeitgeist
- lived experience
- artist journey
You're free to fall in love with your sexbot, but it's still just jerking off.
Which is really the crux of the issue.
I would've given him the best voice acting award though.