I’m not a big poster at all, but ran into this precise issue.
They analyze the video posts on Instagram. If they detect that the video has even a small amount of commercial value, they classify it as branded content, and you need to pay for it to get promoted.
I believe we need to strengthen Section 230, but with an added caveat: platform owners must stop gaming the algorithms, and the law must require user-driven curation. Let me curate my own feed; stop shoving shit in front of my eyes. When a platform makes heavy editorial decisions like that, it should be open to liability.
A few years ago this seemed a bit too extreme for me. Now, with the web mostly burned down anyway, I see little to lose and lots to gain in a section 230 repeal. My, how the Overton Window changes on some ideas. And when it's changing on some things it tends to accelerate on others too, like a social momentum on reconsidering past norms.
My compromise pitch, since the "You need ID from your users" ship has sailed:
Companies are not liable if they have proper ID of the person who submitted the content and can provide it to a plaintiff. If they have not made a good-faith effort to know who submitted the content (e.g., by taking ID, not just an email address), then they're taking responsibility for it.
Which means sites that have responsible moderation can still allow anonymous contributions.
The real problem is the inherent asymmetry of legal battles, where the wealthiest can fight forever with endless motions and have near-total impunity while a legal action would basically nuke a normal person's life. Not to mention the fact that an international border can often make this whole conversation moot.
One of my issues is the lack of liability in practice. The poster is technically liable but they're anon, behind proxies, foreign, etc. and unaccountable. It results in people being harmed online without recourse.
These companies should have a duty to know who their users are.
As an aside, class-action lawsuits seem less than ideal for the public. The awards benefit the lawyers and perhaps a small handful, but the actual plaintiffs only get $0.05. In addition, successful class-action suits prevent further litigation from being allowed for the same issue.
Individuals bringing their own lawsuits seems like it would effect better change, as 1) the award money would be better distributed instead of concentrated, and 2) the amounts levied against the companies would be higher and more of a concern than the class-action slap on the wrist they currently get.
1. Why should harming a million people identically reduce their right to a fair legal evaluation and possibly compensation for damages? <-- maybe it makes sense for large corporations to carry insurance to pay for the potentially massive legal costs they could impose on governments?
2. Shouldn't we be able to quickly resolve these cases assuming there are no substantially different pieces of evidence?
Agreed. Naturally, the solution is to get meta to compensate for the actual and cumulative damage they've done to mankind. Then plaintiffs might actually benefit.
> "We will not allow trial lawyers to profit from our platforms while simultaneously claiming they are harmful."
Wow, that is quite a statement. Am I right in saying that in order to claim under the class-action lawsuit in which Facebook has been found negligent, the victims need to act collectively? I.e., they need to be reached somehow and informed of the possibility?
Seems the most obvious place to advertise would be Meta.
I understand Meta can basically do whatever they like with their ToS but the statement from the Meta spokesperson seems like an extremely bad idea.
It would be a better analogy if tobacco companies sold ad space on their packs and chose not to do business with a private for-profit anti-smoking solicitation group.
I understand the impulse, but there are significant differences (the requirement to add labeling to cigarettes, for instance, was mostly a judicial or legislative action), and there is also the rather perverse fact that the litigation people are championing is often funded by the same profit motive and greed as the harm being sued over.
The article at least mentions that one of the suits is private-equity funded. That generally means the partners and investors of the private equity firm and the suing attorneys (often one and the same, in what is just a financial and legal shell game) net tens of millions of dollars, while the supposed victims end up with nothing but pennies on the dollar of their harm and injury.
I get the impulse to also “cheer” for the lawsuits, but if you thought Meta et al. were bad, you really don’t want to look into the vile pestilence that is these law firms, which are basically organized crime too, by the core definition of crime as an offense and harm upon society.
I don’t really know a solution for this problem because it is so rooted in the core foundation of this rotten system we still call America for some reason, but for the time being I guess, the only moderately effective remedy for harm and injury is to combat it with more harm and injury.
> the statement from the Meta spokesperson seems like an extremely bad idea.
All corporate CYA ideas sound that way, but they ultimately end up benefiting the company. Meta is right to do this. That's not to say it's morally right, but it's right for the company.
It often comes up in (anti) free-speech trials, where the government compels the perpetrator to issue a public apology to the victim. Forcing them to buy an ad in a newspaper for example is not unheard of.
As far as I understand, Americans consider this to be "compelled speech" and hence prohibited, but I might be wrong on this.
Not likely to survive a 1st Amendment challenge - it is possible to compel somebody to certain speech as a result of losing a case, but requiring it as a prerequisite when the case has just started is not likely to fly. Otherwise I could force Facebook (or any other platform) to publish anything just by suing them - and anybody can sue anybody else on virtually any grounds.
That is a problem of the system that needs to be solved. Meta is among the purest forms of evil inflicting irreversible effects on our society (and youth in particular), and the fact that you are quite right isn't really an issue with the lawyers but with a system that allows the punishment to not fit the crime.
If you read the settlements that come out of these lawsuits, you will pretty much always find an 8 to low 9 figure settlement (that the lawyers get a third of), maybe some superficial policy changes, and $12 checks to the supposed victims who only became victims when they randomly got an email telling them they should join the lawsuit. The only people who benefit are the lawyers.
$12 is $12 that people wouldn't have without them. You can always opt out of a class-action settlement and sue on your own if you're not happy with the terms.
But at the end of the day, the lawyers did real work, took on real risk and achieved something. They held a big tech company accountable, and that is a meaningful difference from the status quo. I don't care that they made money doing that, they should.
My special savings account where I deposit the settlement checks from the various tech companies that have violated my privacy or other rights disagrees.
Sometimes it's 43¢. Sometimes it's $400.
In the last three years, I've put… checking… $5,351.83 in that account because tech companies think laws and morals don't apply to them.
Saying that these lawsuits only benefit lawyers is both false and yet another lazy tech bubble cliche.
Yes, the lawyers get way more than I do. They also did 99% of the work, so I don't hold it against them.
Just read the newspaper. Every time you see an article about one of these suits, check it out to see if it applies to you.
Hey at least you get to pocket all of that. Here in Europe the government keeps the money and then distributes it to the scum of the Earth. I'd rather give the money to lawyers, at least they did _something_.
You may think Meta is bad. But plaintiff counsel like this are generally the scummiest people in the US. (Maybe not universal, but 90% are morally repugnant).
As they say, "95% of lawyers give the remaining 5% a bad name."
At the same time, 99% of social networks give the remaining 1% a bad name.
I mean, those class-action lawsuits enrich trial lawyers and maybe force companies to behave better (though I bet empirical evidence would show it's more a cost of doing business).
The $20 people get is nothing but a guise that the trial lawyers are helping people.
I'm not sure the low payouts mean that class actions shouldn't still be brought.
It's to spare companies from having to deal with individual claims from each person. I do see that the ranges can be substantial, though, up to several thousand dollars, but that seems to depend on the criteria.
> Nearly nine months later, Mark received a notification that his claim had been approved. Two weeks after that, $186 was deposited into his bank account. While the amount wasn’t substantial, it covered a grocery run and a phone bill—and more importantly, it reminded him that companies can be held accountable, even in small ways. [0]

[0] https://peopleforlaw.com/blog/how-much-do-people-typically-g...
If fines don't dissuade companies from bad practices, class actions with theoretically no upper limit might be a better way to enforce proper behaviour.
I can agree with that -- however, the amount of money the trial lawyers make in comparison is wildly disproportionate. I think that $186 figure is an example on the high side of payouts to individuals.
Social media, and specifically Facebook / Meta, will go down in history as one of the worst developments in technology in the 21st century. As Frances Haugen stated in her testimony, Mark Zuckerberg needs to be removed from the helm at Meta.
LLMs love this style, but they love it because it's just about every single piece of advertisement writing for the last aeon or so, and it's a mighty chunk of their training corpora.
Reminds me of Carl Sagan’s Contact, where Hadden, the millionaire funding Ellie’s work, made a TV ad blocker and then sued the TV companies when they refused to play ads for his product.
There is some humor in the fact that these law firms won a case against Meta, and the first thing they did was give Meta advertising money won from that very case. That said, the ads sound pretty aggressive, and from what I've read it wasn't a very fair decision. I understand the conflict of interest, but I have some sympathy for Meta here.
Zuckerberg is a rich and high profile guy, so photographers capture many pictures of him, and news editors often find that choosing unflattering pictures of people their readers don't like is helpful for reach. This picture in particular was taken after he'd just finished testifying for 8 hours in a February trial, which I think would wear down the best of us, and even among Getty's extensive gallery of pictures taken then (https://www.gettyimages.com/detail/news-photo/mark-zuckerber...) this one is particularly unflattering IMO.
The idea that Meta is obligated to be so impartial that it must allow lawsuits against itself to be promoted on its own platform is a bit naive and utopian.
TOS are not laws. In fact, they often partially violate laws and those parts are then void. In some countries, anything written in TOS that is not "expected to be there" is void.
Ok but I don’t really see why this specific term would violate any law? Do we really want a society where platforms are forced to present speech that is harmful to them? If you own a store and I put a sign up on your wall advertising a rival store wouldn’t it be reasonable for you to disallow that?
An alternative reply, with analogy, if you like them:
You own a restaurant, where you sell poisoned (intentionally and knowingly) food. A group of people band up for class action lawsuit for poisoning them, and have the lawyers post a sign at your restaurant, that everyone poisoned there should reach out and get some compensation.
It’s a lawsuit, with the users of the platform as the damaged party, against the platform. Removing the possibility to reach the users should result in a default judgement with maximum damages immediately.
The parent comment brings up the ToS as an example of why it's naive to believe Meta is obligated to do something, but what Meta is obligated to do depends on the law.
I’m not against these companies losing their Section 230 immunity. Social media platforms are, in my personal opinion, publishers in their current form.
If they went back to operating as “friends and family feed providers” then letting them keep their 230 immunity would be easier to justify.
To me that’s how it should be. They shouldn’t have to run ads against themselves yet they should be liable or accountable for harm they are found guilty of.
I tend to agree with you on this. I wanted to add, however, that Meta itself lets in so many TOS-violating ads that this seems like special treatment for ads that are much less undesirable than the ones it normally pushes.
Companies have to inform affected individuals of data breaches, especially when HIPAA gets involved. Brokers have to inform clients of transaction errors. Auto manufacturers have to inform owners of recalls. Retirement funds have to inform plan participants of lawsuits involving those funds.
You don't even have to invoke the idea that Meta is big enough to be regulated as a public utility for this to have broad precedent in favor of forcing a malicious actor to inform its victims that they might be entitled to a small fraction of their losses in compensation.
Well we aren’t discussing the government requiring meta to inform users. We are discussing whether meta can choose which private actors’ ads to allow. It would seem silly that a platform would be forced to allow all ads.
That idea was not expressed in the article, only the fact that the ads were removed. This is worth covering, especially coupled with the context of what ads Meta regularly does allow. One does not have to believe that they're obligated to run the ads in order to believe that removing them is incredibly scummy behavior that consumers should be aware of and question.
Maybe, but so what? Your remark lacks a conclusion.
Mine is that it could then well be required to do so by law. Companies are not individuals, so I don't think they are owed any freedoms beyond what is best for utility they can provide.
At certain scales, reality has to win out over whatever ideal you have in your head about how things should be. Facebook is massive, a lot of society is on it, and it's a problem to make recourse invisible to the people most affected by the thing stealing their attention.
It indeed doesn't, but conservative lawmakers signalled repeatedly that they were unhappy about Meta's protection under Section 230 if its moderation policies were not politically neutral.
Can't we all just agree there are no GOOD people in this situation? Meta, class-action lawyers, PE and big money that funds the lawsuits as a profit venture... The one thing they all appear to share: parasites extracting resources from their host.
This is humanity vs Mark Zuckerberg.
Also why we need far fewer megacorps than there are now.
Wild stuff
It's not a hard thing to implement on their end and should be mandated by a judge as you said.
Filing this away for later use.
These lawyers are among the few people holding Meta accountable for its evil acts, and because of that you call them the "scummiest people in the US"?
That's nonsense.
Who?
They don't even bother trying to get more when they can, because they're just bottom feeding.
"MZ Is A Punk-Ass B
paid for by Person & Guy LLP"
Seems like they couldn't write even three lines without an LLM.
I wonder if that is what will happen next.
Zuckerberg was told that adding gay people to groups outed them by posting to their walls, and he ignored it: https://www.youtube.com/watch?v=nRYnocZFuc4
And obviously https://news.ycombinator.com/item?id=1692122 (guess we don't get access to his other messages, though https://news.ycombinator.com/item?id=16770818)
His stare isn't the only thing about him that's sociopathic
Edit: oh yeah and https://news.ycombinator.com/item?id=42651178
Its own TOS states that they won’t allow that.
Should you be allowed to take the sign down?
When they are making editorial decisions about what content to promote to you and what content to hide from you, then they should lose it.
This is not how it works when you're found guilty of committing harm. Tobacco companies are a good example of this.
It's not just a Meta issue either.
https://www.reuters.com/investigations/meta-is-earning-fortu...
Is their defence of Section 230 protections not in part rooted in that claim of impartiality?