16 comments

  • Aurornis 1 hour ago
    Kickstarter is full of projects like this where every possible shortcut is taken to get to market. I’ve had some good success with a few Kickstarter projects but I’ve been very selective about which projects I support. More often than not I can identify when a team is in over their heads or think they’re just going to figure out the details later, after the money arrives.

    For a period of time it was popular for the industrial designers I knew to try to launch their own Kickstarters. Their belief was that engineering was a commodity that they could hire out to the lowest bidder after they got the money. The product design and marketing (their specialty) was the real value. All of their projects either failed or cost them more money than they brought in because engineering was harder than they thought.

    I think we’re in for another round of this now that LLMs give the impression that the software and firmware parts are basically free. All of those project ideas people had previously that were shelved because software is hard are getting another look from people who think they’re just going to prompt Claude until the product looks like it works.

    • lr4444lr 1 hour ago
      At this point, I trust LLMs to come up with something more secure than the cheapest engineering firm for hire.
      • Aurornis 47 minutes ago
        The cheapest engineering firms you hire are also using LLMs.

        The operator is still a factor.

        • jama211 18 minutes ago
          Yeah, but they’ll add another layer of complexity over doing it yourself
      • lukan 48 minutes ago
        And the cheapest engineering firm won't use LLMs as well, wherever possible?
        • TheRealPomax 44 minutes ago
          Fun fact: LLMs also come in "cheapest and useless" and "expensive but actually does what's asked" flavours.

          So, will they use LLMs? Probably. Can you trust the kind of LLM that you would use to do a better job than the cheapest firm? Absolutely.

      • minimalthinker 59 minutes ago
        this.
  • SubiculumCode 1 hour ago
    How about complaining that brain waves get sent to a server? I'm a neuroscientist, so I'm not going to say that the EEG data is mind reading or anything, but as a precedent, the lack of privacy for brain data is very bad.
    • amarant 59 minutes ago
      How useful could something like this be for research? I'm not a neuroscientist so I have no clue, but it seems like the only justification I can think of...
      • brabel 21 minutes ago
        Not a neuroscientist either but I would imagine that raw data without personal information would not be useful for much. I can imagine that it would be quite valuable if accompanied with personal data plus user reports about how they slept each night, what they dreamed about if anything, whether it was positive dreams or nightmares etc. And I think quite a few people wouldn’t mind sharing all of that in the name of science, but in this case they don’t seem to have even tried to ask.
      • minimalthinker 23 minutes ago
        I believe they use it for sleep tracking
      • AnimalMuppet 24 minutes ago
        If they're taking patient data for research without permission, they are not ethical researchers.
    • minimalthinker 44 minutes ago
      I would presume data privacy laws already have good precedent for health data?
      • baby_souffle 19 minutes ago
        > I would presume data privacy laws already have good precedent for health data?

        Google for a list of all the exceptions to HIPAA. There are a lot of things that _seem_ like they should be covered by HIPAA but are not...

      • freedomben 17 minutes ago
        Only for "covered entities" under HIPAA (at least in the US)
  • autoexec 8 minutes ago
    This guy bought an internet-connected sleep mask, so it's not surprising that it was collecting all kinds of data, or that it was doing it insecurely (everyone should expect anything IoT to be a security nightmare). To me, the surprising thing is that the company actually bothered to worry about saving bandwidth/power and went through the trouble of using MQTT. Probably not the best choice, and they didn't bother to do it securely, but I'm genuinely impressed that they even tried to be efficient while sucking up people's personal data.
  • dnw 1 hour ago
    I would love to see the prompt history. I'm always curious how much human intervention/guidance is necessary for this type of work, because when I read the article I come away thinking you just prompt Claude and it comes out with all these results. For example, "So Claude went after the app instead. Grabbed the Android APK, decompiled it with jadx." All by itself, or did the author have to suggest and fiddle with bits?
    • minimalthinker 1 hour ago
      Very little intervention tbh. I will try to retrieve it and post.
      • selkin 21 minutes ago
        By default, Claude code keeps session history (as jsonl files in ~/.claude).

        It’s wasteful not to save and learn from those.
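        A minimal sketch of pulling those transcripts back out, assuming the current default layout of one JSONL file per session under ~/.claude/projects/ (exact paths and field names may differ between Claude Code versions):

        ```python
        import json
        from pathlib import Path

        # Session transcripts are JSON Lines: one JSON object per line.
        # The directory layout is an assumption based on current defaults
        # (~/.claude/projects/<project>/<session-id>.jsonl).
        claude_dir = Path.home() / ".claude" / "projects"

        for session_file in sorted(claude_dir.rglob("*.jsonl")):
            print(f"--- {session_file} ---")
            for line in session_file.read_text().splitlines():
                try:
                    event = json.loads(line)
                except json.JSONDecodeError:
                    continue  # skip partial or corrupt lines
                # Print just enough to reconstruct the prompt history.
                role = event.get("type") or event.get("role", "?")
                text = str(event.get("message", event))[:120]
                print(f"{role:>10}: {text}")
        ```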

    • cyanydeez 1 hour ago
      There really is a dearth of livestreams demonstrating these things. You'd think if there's so much unaided AI work, people would stream it.
  • speedgoose 1 hour ago
    Remember that the S in IoT stands for Security.

    I have deployed open MQTT to the world for quick prototypes on non-personal (and non-healthcare) data. Once my cloud provider told me to stop because they didn't like it: it could be used for relay DDoS attacks.

    I would not trust the sleep mask company even if they somehow manage to have some authentication and authorisation on their MQTT.
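    For reference, even the bare minimum of transport encryption plus per-device credentials is only a few lines on the client side. A sketch using the paho-mqtt library, assuming paho-mqtt >= 2.0 and hypothetical broker, topic, and credential names:

    ```python
    import ssl
    import paho.mqtt.client as mqtt

    # Hypothetical values; a real device would get unique credentials at provisioning.
    BROKER = "mqtt.example.com"
    DEVICE_ID = "mask-1234"
    DEVICE_PASSWORD = "per-device-secret"

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2, client_id=DEVICE_ID)
    client.username_pw_set(DEVICE_ID, DEVICE_PASSWORD)  # per-device auth, not shared creds
    client.tls_set(cert_reqs=ssl.CERT_REQUIRED)         # encrypt the transport
    client.connect(BROKER, 8883)                        # 8883 = MQTT over TLS
    client.publish(f"devices/{DEVICE_ID}/eeg", b"...")  # topic scoped to this device
    client.loop_forever()
    ```

    Authorisation (broker-side ACLs so device A can't subscribe to device B's topics) still has to be configured on the broker; the client snippet only covers credentials and encryption.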

  • intellirim 1 hour ago
    This is exactly why we need audit trails for connected devices. Users have no visibility into what data is being sent where. The fact that brainwave data is broadcast to an open broker without user knowledge is a governance failure, not just a security bug.
    • ai-x 23 minutes ago
      There should be two separate lines of products: one in which privacy is a priority and which adheres to government regulations (around privacy) and probably costs 2x, and one with zero government intervention (around privacy) which costs less and gets to market faster.

      I don't want a few irrationally paranoid people bottlenecking progress and access to the latest technology and innovation.

      I'm happy to broadcast my brainwaves on an open YouTube channel for the ZERO people who are interested in it.

      • tgv 9 minutes ago
        Explain how sending EEG recordings is progress. And why faster access to the latest tech is always good, for everyone.
      • selkin 18 minutes ago
        OTOH: the non-regulated one should cost more.

        It’s kinda like “qualified investors” - you want to make sure people who are willing to do something extremely stupid can afford it and acknowledge their stupidity.

        We don’t need regulation to protect those that can afford to buy protection: we need it for those who can’t.

    • plagiarist 1 hour ago
      It is a governance failure.

      It is also technically a user failure to have purchased a connected device in the first place. Does the device require a closed-source proprietary app? Closed-source non-replaceable OS? Do not buy it.

      • jmb99 1 minute ago
        Yes, that’s right: don’t buy any new car, any phone, any television. Hell, don’t buy any x86 laptop or desktop computer, since you can’t disable or replace Intel ME/etc.
      • brabel 4 minutes ago
        Very few options available, if any, if you actually do that. The IoT market is unfortunately small and dominated by vendors that don’t want an open ecosystem at all. That would hinder their ability to force you to pay for a subscription, which is where all the money is.
  • basedrum 1 hour ago
    Name the company; hiding it is irresponsible.
    • brabel 3 minutes ago
      It’s probably safe to assume they are all like that.
  • SilentM68 4 minutes ago
    Interesting project. Here's a thought I've had in the back of my mind ever since I saw something similar in an episode of Buck Rogers (70s-80s)! Many people struggle with falling asleep due to persistent beta waves; natural theta predominance is needed but often delayed. Imagine an "INEXPENSIVE" smart sleep mask that facilitates sleep onset by inducing brain wave transitions from beta (wakeful, high-frequency) to alpha (8-13 Hz, relaxed) and then theta (4-8 Hz, stage 1 light sleep) via non-invasive stimulation.

    A solution could be a comfortable eye mask with integrated, unintrusive headphones and EEG sensors. It could use binaural beats or similar audio stimulation to "inject" alpha/theta frequencies externally, guiding the brain to a tipping point for abrupt sleep onset. The sensors would detect the current waves; app-controlled audio would ramp from alpha-inducing beats down to theta, ensuring natural predominance. If it could be designed, it could accelerate the sleep transition and improve sleep quality, non-pharmacologically.
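    For the audio side, the binaural-beat idea is just two pure tones whose frequency difference equals the target band. A rough sketch in Python (hypothetical parameters: a 200 Hz carrier with a 6 Hz theta-band offset), not a claim that this actually induces sleep:

    ```python
    import numpy as np
    from scipy.io import wavfile

    SAMPLE_RATE = 44100   # samples per second
    DURATION = 10.0       # seconds of audio
    CARRIER_HZ = 200.0    # tone in the left ear
    BEAT_HZ = 6.0         # target beat frequency (theta band, 4-8 Hz)

    t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
    left = np.sin(2 * np.pi * CARRIER_HZ * t)                 # 200 Hz
    right = np.sin(2 * np.pi * (CARRIER_HZ + BEAT_HZ) * t)    # 206 Hz

    # The brain perceives the 6 Hz difference between the ears as a slow beat.
    stereo = np.stack([left, right], axis=1)
    wavfile.write("theta_beat.wav", SAMPLE_RATE, (stereo * 32767).astype(np.int16))
    ```

    Ramping from an alpha-band offset (around 10 Hz) down to a theta-band offset (around 6 Hz) would just mean sweeping BEAT_HZ over time.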
  • baby_souffle 2 hours ago
    Well that’s a brand new sentence.
    • amelius 1 hour ago
      But not a beautiful sentence.
  • bryanrasmussen 2 hours ago
    huh, not sure if life imitates snark and bull https://medium.com/luminasticity/great-products-of-illuminat...

    "The ZZZ mask is an intelligent sleep mask — it allows you to sleep less while sleeping deeper. That’s the premise — but really it is a paradigm breaking computer that allows full automation and control over the sleep process, including access to dreamtime."

    or if this is another sci-fi variation of the same theme, with some dev-like embellishments.

    • mrguyorama 11 minutes ago
      That is the premise of HypnoSpace Outlaw, a neat game about 90s internet nostalgia and scifi.
  • digiown 17 minutes ago
    As an aside, it seems cool that LLMs have lowered the bar to reverse engineering. Maybe we'll get to take full control of many of these "smart" devices that require proprietary/spyware apps and use them in a fully private way. There's no excuse for apps that exist solely to interact with a device locally to require an internet connection, like a dishwasher.

    https://www.jeffgeerling.com/blog/2025/i-wont-connect-my-dis...

  • morkalork 1 hour ago
    > Since every device shares the same credentials and the same broker, if you can read someone's brainwaves you can also send them electric impulses.

    Amazing.
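    That quote describes the classic MQTT failure mode: one shared login for every device and no per-device topic ACLs, so any client can both read everyone's data and write everyone's command topics. A sketch of what that looks like (paho-mqtt >= 2.0, hypothetical broker and topic names):

    ```python
    import paho.mqtt.client as mqtt

    # Hypothetical shared credentials baked into every device's firmware.
    SHARED_USER, SHARED_PASS = "device", "device"

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.username_pw_set(SHARED_USER, SHARED_PASS)
    client.on_message = lambda c, u, msg: print(msg.topic, len(msg.payload))
    client.connect("broker.example.com", 1883)

    # With no ACLs, a wildcard subscription sees every user's EEG stream...
    client.subscribe("devices/+/eeg")
    # ...and nothing stops publishing to someone else's stimulation topic.
    client.publish("devices/someone-else/stim", b'{"intensity": 5}')
    client.loop_forever()
    ```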

  • bobim 1 hour ago
    Won't they sue for the reverse engineering?
  • throw876987696 31 minutes ago
    Without a brand name, how can we verify this is real?
    • ohyoutravel 22 minutes ago
      Without any skin in the game with your username, why should we take anything you say seriously?
  • mystraline 1 hour ago
    [flagged]
    • a4isms 1 hour ago
      Doesn't disclosing this to the world at the same time as you disclose it to the company immediately send hundreds of black hats to their terminals to see how much chaos they can create before the company implements a fix?

      Perhaps the author is not a coward, but is giving the company time to respond and commit to a fix for the benefit of other owners who could suffer harm.

      • rkagerer 1 hour ago
        > but is giving the company time to respond and commit to a fix for the benefit of other owners who could suffer harm.

        If that's the case then they should have deferred this whole blog post.

      • mystraline 1 hour ago
        It took me 30 seconds with ChatGPT by saying:

        Identify the kickstarter product talked around in this blog post: (link)

        To think some blackhat hasn't already done that is frankly laughable. What I did was like the lowest of low bars these days.

        • Barbing 1 hour ago
          Put the product name in the title & maybe it sends thousands instead of hundreds of blackhats…

          We often treat doxxing the same way, prohibiting posting of easily discovered information.

          • mystraline 1 hour ago
            So your plan is to let the blackhats in the know attack user devices, rather than send out a large warning to "Quit using immediately"?

            If we applied a similar analogy to an E. coli contamination of food, your recommendation amounts to "If we say the company name, the company would be shamed and lose money and people might abuse the food".

            People need to know this device is NOT SAFE on your network, paired to your phone, or anything. And that requires direct and public notification.

        • pphysch 1 hour ago
          And ChatGPT hallucinated a misleading answer that you are confidently regurgitating.
          • croisillon 45 minutes ago
            their original message said "my guess", not ChatGPT's, talk about responsible disclosure...
    • itishappy 1 hour ago
      I don't see estim mentioned on that website, but I do see a comparison chart with 4 other competitors with similar capabilities to the one you linked.

      What makes you think this is the one?

    • minimalthinker 1 hour ago
      I did consider naming, but they were very responsive to the disclosure and I was not entirely familiar with potential legal implications of doing so. (For what it's worth, it is not Luuna)
      • stavros 16 minutes ago
        Please name 50 other companies it's not.

        It's good that they were responsive in the disclosure, but it's still a mark of sloppiness that this was done in the first place, and I'd like to know so I can avoid them.

    • everdrive 1 hour ago
      Even if naming and shaming doesn't work, I sure want to know so I can always avoid them for myself and my family. Thanks for the call-out and the educated guess.
    • j45 56 minutes ago
      EEG devices can cost a lot to own personally as well.

      The other side of owning equipment like this is that it could still be useful to some people for personal and private use.

      • minimalthinker 3 minutes ago
        EEG is very useful for accurate sleep tracking.
    • hxbdg 1 hour ago
      Presumably they’ll be named and shamed after they’ve been given a chance to fix things.
  • roywiggins 1 hour ago
    cyberpunk