It seems nice, but every single time I see a service allowing anonymous uploads like this, my immediate thought is: criminal use.
How hard would it be to write a protocol that only accepts relatively safe payloads, e.g. by ensuring that the emoji sequence isn’t a serialized URL, credentials to some stash, or an encoded picture no one wants to keep?
> It seems nice, but every single time I see a service allowing anonymous uploads like this, my immediate thought is: criminal use.
This seems like the Hollywood movie plot criminal use.
Actual criminals just put a normal server/proxy in a non-extradition country or compromise any of the zillion unpatched Wordpress instances on the internet or something equally boring.
Might I say that this whole safetyist moral panic is very convenient for large corporations? If you can't host your own service due to these concerns, you'll use the cloud :)
It's not a moral panic, it's called "an extended engagement with law enforcement will be unpleasant and costly", and you probably don't want that.
And if you're wondering why it's that way, casually observe every time people declare that people under arrest or on trial "don't deserve..." something.
The problem here is that we keep acting like the solution is for people making toy projects or general-purpose tools to cower in fear of their own government and stop trying to make anything, instead of establishing a government that can distinguish between violent drug cartels and child abusers vs. innocent behavior or minor offenses, and then not inflict senseless damage on the latter.
Government is incentivized and rewarded for finding and punishing violent drug cartels and child abusers. When those become hard to find, the government punishes minor offenses, since it is easy to paint these as hardened criminals, and nobody is in a real position to discover or publicize the actual state of things.
It's more along the lines of, people hear "money laundering" and think this implies some kind of drug ring or terrorism, when it's really some laws so expansive and nebulous that ordinary people frequently do it without knowing, so now there are laws on the books that allow random normies to be charged with a felony at the discretion of the prosecutor.
And these laws tend to take a very specific form: they're laws against things adjacent to other crimes, instead of laws against the original crimes themselves. So this is like the CFAA putting felony penalties on "unauthorized access", when the implication justifying the penalty is "unauthorized access in order to commit a crime like credit card fraud", and the solution is to put those penalties on the actual fraud. Or "money laundering", which implies an underlying crime whose proceeds are being laundered, which in turn implies the charge is redundant and they should instead be charged with the underlying crime.
Because what those laws erroneously allow is for someone to be charged with the secondary offense without ever establishing the primary one, or substituting a minor primary offense even though the penalties for the secondary offense were set under the assumption it was a major one. Which is how ordinary people get ensnared.
But we don't need those laws at all because you can charge the actual criminals with their actual crimes, so they should just be repealed, or converted into minor misdemeanors with the heavy penalties instead being imposed on the associated serious crime and only when it actually exists.
Some “adjacent” crimes like that exist because enforcement and/or detection of the original crime is hard and/or expensive. Like gun laws. Or curfew.
I still think that the real problem is the incentives of government; the problem you describe exists simply because government also has the power to create new laws in order to make life easier for itself, at the expense of the governed. I.e. the problem is government prioritizing being seen as useful over actually being useful.
> Some “adjacent” crimes like that exist because enforcement and/or detection of the original crime is hard and/or expensive. Like gun laws. Or curfew.
So we have to do the hard and/or expensive thing instead. It's the government, they spend six trillion dollars a year, "not expensive" is clearly not a thing we're currently receiving as a benefit of the status quo.
In general these laws make things more expensive, because investigations, prosecutions and incarceration of people convicted of adjacent crimes but not primary crimes all cost a ton of money for negligible if not overtly negative outcomes. When you throw minor offenders in prison you have to pay to prosecute and incarcerate them, and you lose the contributions they would otherwise have made to society. It's just setting money on fire, except that in this case (as in many other cases) "money" is really "lives".
> I still think that the real problem is the incentives of government; the problem you describe exists simply because government also has the power to create new laws in order to make life easier for itself, at the expense of the governed. I.e. the problem is government prioritizing being seen as useful over actually being useful.
This isn't really a different problem, it's just asking the question in the form of, given that these laws are stupid how do we bring about a system that doesn't have them and can't pass them anymore?
It's even more boring: When I share criminal data (usually old movies that are still in copyright), I just put them in an encrypted 7zip archive and upload to google drive, then delete after my friend downloads it.
I mean, in this case we're talking about emoji, so I'm having a hard time picturing the criminal use, but in general anonymous file uploads or text uploads absolutely get used by criminals as soon as they're discovered. Anyone who's run a service for long enough will have stories of the fight against spam and CSAM (I do!).
> I mean, in this case we're talking about emoji, so I'm having a hard time picturing the criminal use, but in general anonymous file uploads or text uploads absolutely get used by criminals as soon as they're discovered
You can use the emoji service as an anonymous data upload service because it transfers information and you can encode arbitrary data into other data. But that sounds like work and people are lazy and criminals are people so they'll generally do the lazy thing and use one of the numerous other options available to them which are less work than creating and distributing an emoji encoder.
If you make a generic file upload service, well, they don't have to do as much work to use that. Then the question is, what should we do about that?
The next question is, does preventing them from using a given service meaningfully prevent any crime? That one we know the answer to. No, it does not. Because they still have all of the other alternatives, like putting it on a server or service in a foreign country or compromising random Wordpress instances etc.
Then we can ask, from the perspective of what the law should be and the perspective of a host under a given set of laws, what should we do? And these are related questions, because you want to consider how people are going to respond to a given set of laws.
So, what happens if you impose strict liability on hosts regardless of whether they know that a given thing is a crime? Well, then you don't have any services hosting data for people, because nobody has a 0% false negative rate, but without one you're going to jail.
What if you only impose liability if they know about it? Then knowing is a liability because you still can't have a 0% false negative rate, so they're going to prevent knowing and you end up with Mega encrypting user data so they can't themselves see it. That seems pretty dumb, you'd like them to be able to remove obvious bad stuff without putting liability on them if they're not 100% perfect.
What if you only impose liability if someone else reports it? This works like the DMCA takedown process, and then you get a combination of the first two. They can allow uploads but they can also remove things they're aware of and want to remove, but they end up de facto required to remove anything anyone reports, because if they don't and they ever get it wrong then they're screwed. So then you get widespread takedown abuse and have created a trolling mechanism. This is not a great option.
What if you let them moderate without any liability but require a court order to force them to take something down? This is like the approach taken by the CDA and is the best option, because you're not forcing risk-averse corporate bureaucrats to comply with evidence-free fraudulent takedowns but you still allow them to remove obvious spam etc. without liability. This leaves the service with a good set of incentives, because in general they'll want to satisfy users, so they'll try to remove spam etc. but not remove non-spam. Meanwhile this still leaves the option for crimes to be investigated by the people who are actually supposed to be investigating crimes, i.e. law enforcement, and then the courts can still order things to be taken down -- and more than that, put the actual criminals in jail -- without putting penalties on the service for not themselves being infallible adjudicators of what is and isn't crime.
The ultimate exploit is to create fake "likes". Once any system of likes becomes successful, it gets used for (a) filtering news feeds, and (b) establishing consensus & social truth. This is the biggest exploit there is.
A cheap system for "likes", such as this, is only safe when few people use it. Once it becomes popular, and worth something, it gets exploited, and then utterly fails.
"An Open Heart message should contain a single emoji sequence. However, the emoji sequence may be followed by arbitrary data which the server is expected to ignore."
Italics mine.
That arbitrary data could be a multi-gigabyte zip file of some expensive program, classified data, copyrighted video/music, or anything for all this spec cares.
Ok, so? Provided the receiving server is configured to redirect the arbitrary data into the trash, you're in the clear, right? Not your fault someone sent you extra data, and you're not expected to keep it, and if you don't keep it anywhere, law enforcement could search your server but there's nothing to find because your system doesn't retain anything from the arbitrary data.
With ZWJ (Zero Width Joiner) sequences you could in theory encode an unlimited amount of data in a single emoji.
Particularly interesting are the "family" emojis, made by joining any number of person-type emoji with ZWJ characters. So in theory, a family made of thousands of men, women, girls, boys, etc... would be a valid emoji.
This probably refers to emojis made out of multiple codepoints (e.g. skin color + person, or flags which are built out of the country code in a special range).
Huh. ZWJ is an interesting corner case. Technically it's supposed to be used when combining emoji sequences that compose another single emoji (flags, skin color, gender, etc), but if the server doesn't discriminate for "real" emoji combos it technically can be an arbitrary combo.
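To make the concern above concrete, here is a hypothetical sketch (not tied to any real service, and using an arbitrary two-bits-per-person scheme I made up) of packing bytes into one long ZWJ-joined "family" sequence:

```python
# Hypothetical sketch: chaining person emoji with ZWJ (U+200D) so that a
# single "emoji sequence" carries arbitrary data, two bits per person.
ZWJ = "\u200d"
PEOPLE = ["\U0001F468", "\U0001F469", "\U0001F467", "\U0001F466"]  # man, woman, girl, boy

def encode_zwj(data: bytes) -> str:
    """Pack bytes into a ZWJ-joined chain of person emoji (base-4 digits)."""
    people = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # four 2-bit digits per byte
            people.append(PEOPLE[(byte >> shift) & 0b11])
    return ZWJ.join(people)

def decode_zwj(seq: str) -> bytes:
    """Recover the original bytes from the emoji chain."""
    digits = [PEOPLE.index(p) for p in seq.split(ZWJ)]
    out = bytearray()
    for i in range(0, len(digits), 4):
        a, b, c, d = digits[i:i + 4]
        out.append((a << 6) | (b << 4) | (c << 2) | d)
    return bytes(out)

assert decode_zwj(encode_zwj(b"hello")) == b"hello"
```

A server that validates submissions against the RGI (recommended-for-general-interchange) emoji sequence list would reject chains like this, which is exactly the "discriminate for real emoji combos" point.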
It seems to only allow sending a single character at a time from a limited set. What criminal use does that allow?
In the age of beepers, criminals found plenty of creative ways to send messages in just a few characters. And this permits emojis, which -- binarily speaking, contain far more bits than a beeper message.
The problem isn't that criminals can use your service, it's that the service provider really doesn't want to be liable for that happening, which generally only happens when you host illegal content.
You don’t need to upload a lot of data in order to have an illegal data stash, and there are creative criminals out there.
E.g. for GPS coordinates you only need about 16 digits. Emojis are up to 8 bytes each, so by selecting specific ones, adding a control character (or two) and ensuring the others stay in sequence, you can encode that data right in.
And then I only need to respond with „Did you read the article on ACME Times about a car riding a bike?”, which is a simple pointer to a URL you might check for the drop coordinates.
Thus it’s also possible to provide encryption keys, URL serializations, cryptocurrency wallet pointers, etc. And sure, this seems complicated and dystopian, but when the government asks you to provide data on users who committed serious crimes, it’s not really fun to be in the position of saying „I don’t know who my users are”.
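The digit-smuggling idea above amounts to a trivial substitution table. A sketch, with an entirely made-up codebook (the specific emoji are arbitrary placeholders):

```python
# Hypothetical sketch: hiding a numeric coordinate in a sequence of
# "innocent" reaction emoji, one emoji per decimal digit.
DIGITS = "0123456789"
EMOJI = ["😀", "😂", "😍", "👍", "🎉", "🔥", "✨", "😎", "🙏", "💯"]

def encode_digits(coord: str) -> list[str]:
    """Map each digit to its codebook emoji; separators are simply dropped."""
    return [EMOJI[DIGITS.index(ch)] for ch in coord if ch in DIGITS]

def decode_digits(reactions: list[str]) -> str:
    """Recover the digit string from the reaction sequence."""
    return "".join(DIGITS[EMOJI.index(e)] for e in reactions)

reactions = encode_digits("52.2297,21.0122")
assert decode_digits(reactions) == "522297210122"
```

Anyone observing the channel sees only a run of ordinary reactions; only someone holding the codebook can read the coordinate back out.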
From my experience, any service that allows anonymous writes and anonymous reads over long periods will sooner or later be used for illicit activity. It doesn’t matter if that’s 1 MB or 10 bytes.
Sure, I guess that could happen. Hackernews allows anon data uploads over long periods. How many online services actually do KYC if they don't legally have to?
Any motivated criminal could also just use a book cipher or any number of less trackable options.
The GET request does not return data in sequence, does it? Just counts for each emoji.
What exactly does the govt do if you do not have data they want? I assume if you run a service like this you would comply with any data retention requirements in your country and hand over logs - although older ones which you might have deleted to comply with other laws!
Unless you have ID verification, criminals can sign up with false identities.
> Unless you have ID verification, criminals can sign up with false identities.
Having registration is enough to not be liable; that’s why everyone is doing it. You get subpoenaed, you hand over the logs for the user you have, case closed.
Data can be linked to your server. If you cannot pass the buck, it’s you who will be investigated as a potential partner in crime.
Why not just use pastebin for a "hey I left ur drugs at this coord", or even just a plain ol' encrypted message over email, Signal, etc...? I'm a little lost here, probably due to naivete. Is the storage of URLs or crypto wallet pointers really the bottleneck for cybercrime?
Because that way it’s easy to track both poster and visitor (one could say that every visitor of such a URL was involved).
Indirect communication shifts the focus from the channel to the method. And if anyone can use the channel and anyone can read the message, then it’s impossible to pinpoint the true poster and the true recipient.
E.g. a few years back I was helping fix a WordPress site which shared leaked credit cards through page visitor counters. Imagine proving anyone’s participation.
And finally, I didn’t say anything about cybercrime; the cases I know of were related to identity theft, asset theft, extortion and illicit videos. Servers and personal computers were seized for years.
that's really only a risk when you allow direct retrieval of the uploaded data.
if you're only returning counts and you're not even offering a guarantee that every submission will be counted, then the potential for abuse isn't really any higher than any other website out there.
A byte value is a count of flipped bits. Those bits aren’t even guaranteed to be correct (see cosmic-ray bit flips) and yet our computers work this out.
IMO this is risky because it’s easy to distribute the upload. E.g. I could have an infected, semi-popular website that submits distributed requests on each visit (think of it like 1000 credits daily to use to encode a message). Visitors of this website wouldn’t see a thing, and yet the encoded message would be consistent.
As for other websites (especially free image hosts), they often keep a metric ton of data, some won’t work if you don’t have an identifiable partner cookie on the submission request, and there is post-upload analysis, etc.
like yes, theoretically somebody could probably manage to encode a few bytes of secret message.
but it's just that there's so many easier and better ways to do that, and even if they managed to accomplish it here it's hardly hurting this project - worst case scenario it's a bit of unwanted noise. it seems very silly to worry about it.
In theory, Bluesky, Mastodon, Pixelfed etc could offer a service where you drag and drop the button onto the Bluesky/Mastodon/Pixelfed website where you are logged in and sign the reaction in your name. And send the post with a signed message like "Peter Prima endorsed this page with a thumbs up emoji /signed: Peter Prima"
This way, the web would get a decentralized like system.
It's a little button counter in an iframe that you can embed on your website. It also looks great next to 88x31s. I have one in my footer: https://varun.ch/
I think I just love the idea of making static pages a little more interactive by adding in these little widgets. I have an HTML form embedded on my contact page that's hooked up to `ntfy` and acts like a 21st century pager. So much fun.
I like the way this is being proposed in a decentralised manner. Kudos to the author for the effort and thought put in.
However, I am curious what the incentive for publishers is to adopt this standard if those emojis are only relevant for the website's own silo? Use cases like these call for customized, deeply integrated implementations.
My question is a curious one. I might be missing the big picture and would like to get educated.
For many services which allow arbitrary emoji reactions (most notably Discord), reactions remain ordered by "first reacted" order, which allows emergent behavior like spelling out words with the letter emojis.
But JSON mappings are ordered. The thing producing/consuming them might choose to map them to an unordered mapping but inherent to them being serialized is you get an order for free.
The JSON specification describes objects as unordered. Which means any standards compliant JSON encoders or decoders can and will produce maps in different orders even when the same object is passed through twice.
It’s also worth noting that quite a few languages don’t guarantee ordered maps either.
If you want an ordered map then you really need a key/value map inside an array:
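A minimal demonstration of the point, with placeholder ASCII names standing in for the emoji keys:

```python
import json

# Object key order is not guaranteed by the JSON spec, but array element
# order is preserved by every compliant implementation -- so to keep
# reaction order, wrap each entry in an object inside an array.
ordered = [
    {"key": "THUMBS_UP", "value": 3},
    {"key": "HEART", "value": 5},
]

# Round-tripping through a serializer keeps the array order intact.
round_tripped = json.loads(json.dumps(ordered))
assert [entry["key"] for entry in round_tripped] == ["THUMBS_UP", "HEART"]
```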
There is Zeeker <https://addons.mozilla.org/en-US/firefox/addon/zeeker/>, but there seems to be no new versions recently, and I cannot find a link to the browser add-on on their web site, so they might have abandoned it.
The problem with comment sections on web sites is that the web sites are incentivized to have shitty comments for engagement and ease of moderation. If it's a browser plugin, it's out of the website's hands and as the user I could probably configure it so I only see comments from people I think are good at commenting, like my friends or people I follow on social media.
I think it was to be able to discuss things on websites that removed comment sections or that had a lot of heavy moderation that prevented meaningful discussions.
If it's a decentralized like button, why is a new protocol needed? `PUT /count/increment` is a pretty straightforward RESTful solution over the existing HTTP protocol.
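A rough sketch of what that could look like; the endpoint shape and in-memory counter here are assumptions for illustration, not anything from the Open Heart spec:

```python
from collections import Counter

# Per-URL like counts, keyed by the target page URL.
counts = Counter()

def handle(method: str, path: str) -> int:
    """Dispatch a like-counter request and return an HTTP status code."""
    if method == "PUT" and path.startswith("/count/increment?url="):
        target = path.split("url=", 1)[1]
        counts[target] += 1
        return 200  # OK: the counter for this URL was bumped
    return 404      # anything else is unknown

assert handle("PUT", "/count/increment?url=https://example.com/post") == 200
assert counts["https://example.com/post"] == 1
```

The reason a named protocol still helps is interoperability: independent clients and embeddable widgets have to agree on the path shape, the allowed payload and the response format, which is exactly what a spec pins down.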
I like this idea but think there is a missing link, literally. It needs a built in way to notify URL A of a reaction to URL B, so that reactions can be recorded independently of the target URL. Like putting URL B in a query parameter. This would support repositories of reactions that are independent of the reactee, and not subject to their feelings on the matter.
Use ATproto, but with a different data model so that each "post" is associated with a URL being commented on. Then bring the moderation system from Bluesky.
(As much as I don't like ATproto's centralization, ActivityPub doesn't work the right way)
Accepting arbitrary input is a complexity that will be abused. Why not lock it down to a limited set of inputs… or just a heart? `POST /openheart/heart{?url}`
I love the idea of a completely open protocol that thwarts spam by being unable to post any text at all, only tiny predefined pictures, aka emoji.
The obvious abuse will be, of course, pumping some counters to ridiculous values, making them useless as a measure of readers' reaction. Though it can be lighthearted fun in the spirit of the Web of 1994, I suspect that implementing caps could be useful.
Less fun could be posting tons of negative emoji (anger, crap, etc). Some site owners will limit the set of allowed emoji to only positive reactions, as seen in some large Telegram channels currently.
The most pervasive misunderstanding about evolution is that it leads to "perfection". Sure it is a kind of optimization procedure, but a) it's optimizing on a loss function that is measured on the population scale, not the scale of a particular organ, so don't expect your pet figure of merit to be optimized even in an average sense, and certainly not in an individual; and b) there is not a unique optimum, the optima are not stationary, and local optima generally are not all that sharp, so do not expect the population to be all that close to the optimum either.
I wonder what energy-saving optimisations the human body has which lead to worse outcomes in our energy-rich environment.
I don't just mean fat storage algorithms leading to metabolic syndrome and T2D, but for example whether we could afford to have better immune and musculoskeletal systems, more cancer-fighting cells, better regeneration etc if we were tuned to an energy budget of 3000kcal instead of 2000kcal.
Engineering in general is a pile of kludges on top of other kludges. Theory looks good in physics textbooks but it hardly ever survives contact with reality
I tried with ZWJ but it turns out variation selectors were easier to make work.
Tried up to 4.
Too lazy to push it to see how many joins until the api breaks.
https://emojipedia.org/zero-width-joiner
https://api.oh.dddddddddzzzz.org/github.com/dddddddddzzzz/Op...
In the age of beepers, criminals found plenty of creative ways to send messages in just a few characters. And this permits emojis, which -- binarily speaking, contain far more bits than a beeper message.
E.g. for GPS coordinates you need only a 16 digits. Emojis are 8 bytes so by selecting specific ones and adding a control character (or two) and ensuring other stay in sequence you can encode this data in.
And then I can only respond with „Did you read article on ACME Times about a car riding a bike?” which is a simple pointer for URL which you might check for the drop coordinates.
This it’s also possible to provide encryption keys, url serialization, cryptocurrency wallet pointers etc. And sure, this seems complicated and dystopian but when government asks you to provide data of your users who committed hard crimes it’s not really fun to be at position when you say „I don’t know who my users are”.
From my experience any service that allows anonymous write and anonymous read over long periods will sooner or later be used for illicit activity. It doesn’t matter if that’s 1mb or 10 bytes.
Any motivated criminal could also just use a book cipher or any number of less trackable options.
What exactly does the govt do if you do not have data they want? I assume if you run a service like this you would comply with any data retention requirements in your country and hand over logs - although older ones which you might have deleted to comply with other laws!
Unless you have id verification crminals can sign up with false identities.
Having registration is enough to not be liable, that’s why everyone is doing that. You get subpoenaed, you give logs for user that you have, case closed.
Data can be linked to your server. If you cannot pass the torch it’s you who will be investigated as potential partner in crime.
Indirect communication shifts focus from channel to method. And if anyone can use channel and anyone can read message then it’s impossible to pinpoint true poster and true recipient.
E.g. Few years back I was helping fix a Wordpress site which shared leaked CC through page visitor counters. Imaging proving anyone’s participation.
And finally I didn’t say anything about it cybercrime, the cases I know of were related to identity theft, assets theft, extortion and illicit videos. Seized servers and personal computer for years.
if you're only returning counts and you're not even offering a guarantee that every submission will be counted, then the potential for abuse isn't really any higher than any other website out there.
IMO this is risky because it’s easy to distribute upload, e.g. I could have infected, semi popular website that would submit distributed request on visit (think about it like 1000 credits daily to use to encode message). Visitors of this website wouldn’t see a thing and yet the encoded message would be consistent.
As for other websites, especially free image hosts: they often keep a metric ton of data, some won't work unless you have an identifiable partner cookie on the submission request, and there is post-upload analysis, etc.
but it's just that there's so many easier and better ways to do that, and even if they managed to accomplish it here it's hardly hurting this project - worst case scenario it's a bit of unwanted noise. it seems very silly to worry about it.
This way, the web would get a decentralized "like" system.
It's a little button counter in an iframe that you can embed on your website. It also looks great next to 88x31s. I have one in my footer: https://varun.ch/
I think I just love the idea of making static pages a little more interactive by adding in these little widgets. I have an HTML form embedded on my contact page that's hooked up to `ntfy` and acts like a 21st century pager. So much fun.
https://emojipedia.org/zero-width-joiner
[0]: https://github.com/dddddddddzzzz/api-oh/blob/312d490641c7ec7...
Wouldn't HTTP 204 be more appropriate here? A 4xx would make it seem like the request failed, when in fact it succeeded.
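For reference, a minimal sketch of what a 204 reply looks like from a counter endpoint. The route and handler are hypothetical, using only the Python standard library.

```python
import http.server
import threading
import urllib.request

class CounterHandler(http.server.BaseHTTPRequestHandler):
    """Hypothetical emoji-counter endpoint: accept the hit, return no body."""
    def do_POST(self):
        # 204 No Content signals success with no response body,
        # unlike a 4xx, which would suggest the request failed.
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence request logging

server = http.server.HTTPServer(("127.0.0.1", 0), CounterHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/count", data=b"", method="POST")
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 204
server.shutdown()
```

Note that urllib treats 204 as a success (no exception raised), which is exactly the semantics the client wants here.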
However, I am curious what the incentive for publishers is to adopt this standard if those emojis are only relevant to the website's own silo. Use cases like these call for customized, deeply integrated implementations.
My question is a curious one. I might be missing the big picture and would like to get educated.
This means ordering semantics are lost.
But lots of fun being had in our Slack by people putting emojis in the right order, to make pictograms etc
All forms of communication are "abused"; such abuse is part of being human. That is how languages have evolved.
It’s also worth noting that quite a few languages don’t guarantee ordered maps either.
If you want an ordered map then you really need a key/value map inside an array:
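A sketch of what that shape looks like, with ASCII placeholder names standing in for the emoji (the "key"/"value" names are just illustrative):

```python
import json

# A key/value map inside an array: JSON arrays preserve element order,
# so the reaction order survives even in languages whose maps don't.
reactions = [
    {"key": ":+1:", "value": 42},
    {"key": ":heart:", "value": 17},
]
print(json.dumps(reactions))
```

Parsing this back in any language yields the entries in the same order they were written.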
Though in this specific instance, you'd be better off with more specific key names like "emoji" and "count" (respectively).

Edit: HN stripped the emojis from my comment so I added ASCII placeholder strings into the example to illustrate the same point.
There was also “Dissenter”, still available at <https://github.com/gab-ai-inc/gab-dissenter-extension/releas...>, but the website seems to have pivoted to something else, and the add-on seems to have been removed from the official add-on repositories (possibly due to negative press coverage: <https://archive.fo/sWxAS>). Further discussion: <https://discourse.mozilla.org/t/the-removal-of-the-dissenter...> and <https://www.reddit.com/r/browsers/comments/ptaau2/what_happe...>
(I earnestly hope I'm wrong, because I'll learn something new and cool.)
admittedly, it's a good project for an entry-level dev's portfolio, but that's about it.
uh...
I'm not sure you and I use the same definition for meaningful.
(As much as I don't like ATproto's centralization, ActivityPub doesn't work the right way)
unless you're trying to say it'll make us less reliant on services like Disqus.
The obvious abuse will be, of course, pumping some counters to ridiculous values, making them useless as a measure of readers' reaction. Though it can be lighthearted fun in the spirit of the Web form 1994, I suspect that implementing caps could be useful.
Less fun could be posting tons of negative emoji (anger, crap, etc). Some site owners will limit the set of allowed emoji to only positive reactions, as seen in some large Telegram channels currently.
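A minimal in-memory sketch of both mitigations, with a hypothetical daily cap and a positive-only allowlist (a real service would need persistent per-client state and a reset schedule, both omitted here):

```python
from collections import defaultdict

DAILY_CAP = 100                      # hypothetical per-client daily limit
ALLOWED = {"+1", "heart", "tada"}    # hypothetical positive-only allowlist

_increments_today = defaultdict(int)  # client_id -> count (no reset logic)

def accept_reaction(client_id: str, emoji: str) -> bool:
    """Accept a reaction only if the emoji is allowed and the cap isn't hit."""
    if emoji not in ALLOWED:
        return False
    if _increments_today[client_id] >= DAILY_CAP:
        return False
    _increments_today[client_id] += 1
    return True
```

The allowlist handles the negative-emoji problem; the cap keeps any single client from pumping a counter to a ridiculous value.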
I don't just mean fat storage algorithms leading to metabolic syndrome and T2D, but for example whether we could afford to have better immune and musculoskeletal systems, more cancer-fighting cells, better regeneration etc if we were tuned to an energy budget of 3000kcal instead of 2000kcal.
4B years, yes, but evolution is a pile of kludges on top of kludges.
Too bad the modern internet is a monetized cesspool. This is a cool idea.