Show HN: MARS – Personal AI robot for builders (< $2k)

Hey, we’re Axel and Vignesh, cofounders of Innate (https://www.innate.bot/). We just launched MARS, a general-purpose robot with an open onboard agentic OS built on top of ROS2.

Overview: https://youtu.be/GEOMYDXv6pE

Control demo: https://youtu.be/_Cw5fGa8i3s

Videos of autonomous use-cases: https://docs.innate.bot/welcome/mars-example-use-cases

Quickstart: https://docs.innate.bot/welcome/mars-quick-start

Our last thread: https://news.ycombinator.com/item?id=42451707

When we started, we felt there was no good, affordable general-purpose robot that anyone can build on. There's no lack of demand: Hugging Face's SO-100 and LeKiwi are already clear successes, but the hardware is unreliable, the software experience is barebones and keeps changing, and you often need to buy hidden extras to make them work (starting with a computer with a good GPU). The TurtleBots were good, but are getting outdated.

The open-source hobbyist movement lacks really good platforms to build on, and we wanted something robust and accessible. MARS is our attempt at making a first intuitive AI robot for everyone.

What it is:

- Comes assembled and calibrated

- Onboard compute: Jetson Orin Nano (8 GB)

- A 5-DoF arm with a wrist camera

- Sensors: wide-angle RGBD camera, 2D LiDAR, speakers

- Control via a dedicated app and a leader arm that plugs into iPhone and Android

- 2 additional USB ports + GPIO pins for extra sensors or effectors

- Our novel SDK, BASIC, which lets you run it like an AI agent with VLAs

It boots in a minute, can be controlled from your phone, and is programmable in depth from a PC; the onboard agent lets it see, talk, plan, and act in real time.

Our SDK, BASIC, lets you create “behaviors” (our name for programs) ranging from a simple hello world to a very complex long-horizon task involving reasoning, planning, navigation, and manipulation. You can create skills that behaviors run autonomously, either by training the arm or by writing code tools, like tools for an AI agent.

You can also publish to the ROS2 topics to control the robot at a low level. And anything created on top of this SDK can be shared with anyone else by just sharing the files.
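For intuition, here is a hypothetical sketch of what composing skills into a behavior could look like. None of these names (`Skill`, `make_behavior`, the skill names) come from the actual BASIC SDK; this is just plain Python illustrating the agent-style chaining of skills described above.

```python
# Hypothetical sketch only -- not the real BASIC API.
# It illustrates the idea: a "behavior" chains named skills
# (navigation steps, trained manipulation policies) like tool calls.

class Skill:
    """A named capability the agent can invoke, like a tool call."""
    def __init__(self, name, run):
        self.name = name
        self.run = run  # callable returning True on success

def make_behavior(skills):
    """Compose skills into a long-horizon behavior: run them in order,
    stopping early if any step reports failure. Returns the run log."""
    def behavior():
        log = []
        for skill in skills:
            ok = skill.run()
            log.append((skill.name, ok))
            if not ok:
                break
        return log
    return behavior

# Usage: a "tidy socks" behavior chaining navigation and manipulation.
tidy = make_behavior([
    Skill("navigate_to_room_a", lambda: True),
    Skill("pick_up_sock", lambda: True),       # stand-in for a trained policy
    Skill("navigate_to_basket", lambda: True),
    Skill("drop_sock", lambda: True),
])
```

A real behavior would dispatch to trained policies and ROS2 actions instead of lambdas, but the control flow (ordered skills with early exit on failure) is the shape being described.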

This is intended for hobbyist builders and education, and we would love to have your feedback!

p.s. If you want to try it, there’s a temporary code HACKERNEWS-INNATE-MARS that lowers the price to $1,799.

p.p.s. The hardware and software will be open-sourced too. If some of you want to contribute or help us prepare it properly, feel free to join our Discord at https://discord.gg/YvqQbGKH

91 points | by apeytavin 17 hours ago

11 comments

  • dave1010uk 12 hours ago
    Looks awesome!

    This isn't so clear though: https://docs.innate.bot/main/software/basic/connecting-to-ba...

    > BASIC is accessible for free to all users of Innate robots for 300 cumulative hours - and probably more if you ask us.

    Is BASIC used just to create the behaviours or to run them too? It sounds like this is an API you host that turns a behaviour like "pick up socks" into ROS2 motor commands for the robot. Are you open sourcing this too, so anyone can run the (presumably GPU heavy) backend?

    Does the robot need an internet connection to work?

    Also, more importantly, what does it look like with googly eyes stuck on?

    • apeytavin 12 hours ago
      On BASIC: Yes, it does require an internet connection, and until we figure out how this works for you it will remain free to use!

      It is required to run them, not to create them. And it's not about running "pick_up_socks"; that one can already run on your robot. BASIC is required to chain it with other tasks, such as navigating to another part of your house and then running another skill to drop the sock somewhere, for example.

      Thank you for the remark, we will make it clearer in the docs

      As a consequence: The robot does not necessarily require Internet to run, but if you want it to chain tasks while talking and using memory, yes it does.

      As for the googly eyes, give me a minute...

  • madamelic 10 hours ago
    How complex of tasks can you give it? Seems, not to be punny, super basic.

    Can it do complex tasks like "pick up socks from room A, drive to room B, and put in basket"? Is the intention to allow hobbyists to do actual work with it or is this version purely novelty rather than a functional "personal robot"?

    Additionally, what is the limitation on speed of movement? It seems very slow in movement, is that intentional for safety or is that purely because of running the AI model locally?

    • apeytavin 10 hours ago
      The example you gave is exactly this. The driving in the example use-cases shows going from room to room and executing policies in context:

      https://docs.innate.bot/main/welcome/mars-example-use-cases

      I take from your comment that the full capabilities of the robot are not properly represented, and I'll make a note to film longer videos. It can definitely do what you just asked, and multiple times in a row. I will note that it depends on the training quality, of course.

      On speed of movement, I now realize we didn't mention it anywhere, so I added it in the overview, but it's pretty fast: 0.7 m/s for the base, and the arm can be tormented quite a bit. Just took this video for you:

      https://youtu.be/H-gAaTKLm9c

      • madamelic 9 hours ago
        Oh wow, didn't expect this detailed of a response. Thanks!
  • cowteriyaki 11 hours ago
    Coming with a Lidar out of the box seems nice.

    Does the MARS hardware really remove the hidden extras (computer with a GPU) mentioned as the downside of the HF SO-101 or LeKiwi? While a Jetson is good for inference, I feel like you would need access to a powerful GPU to train VLAs regardless. For LeRobot-based hardware, training ACT was relatively low-profile if you use low resolution for the camera feeds, but with increased resolution or more than one camera I've already seen it need more than 8 GB of VRAM. If VLAs are on the table, finetuning something like the open-sourced version of pi0 should already necessitate more than one 4090 or above, I think.

    Also, do you have plans for community-level datasets? I think Lerobot sort of does this with their data recording pipeline and HF integration.

    • apeytavin 11 hours ago
      One of our objectives here was to fix everything that we don't like about the SO-101 and the Kiwi, which have several hardware and software flaws in our view. Including, yes, the constant need for a computer to simply run your robot.

      The training does require external GPUs (but we provide that infra for free, straight from the app!), but the onboard Jetson can run the trained models, as you can see in the examples. Everything you see in the vids is running onboard when it comes to manipulation, because we use a special version of ACT made specifically by us for this robot, which also includes a reward model (like DYNA does).

      We developed this system so it can also run the other components smoothly: it does SLAM, and has room for more processing even when running our ACT.

      Now, indeed, this cannot run Pi-0, but in our experience (and the whole community's in general), VLAs are not particularly better than ACT in the low-data regime, and need a lot more compute.

      As for community-level datasets, yes, this is the plan. Anything you train can already be shared with others: just share the files. We didn't develop a centralized place for sharing datasets and behaviors, but it is on the roadmap.

      • dimatura 1 hour ago
        Hi, I am currently considering a Lekiwi build but I am intrigued by Mars. Outside of the need for external compute, what issues did you find with SO101 and Kiwi?

        Also I am curious about a couple of the parts, if you don't mind sharing - are those wheels the direct drive wheels from waveshare? And what is the RGBD camera? (Fwiw, even if it's hefty the MARS price tag seems fair to me).

        • apeytavin 53 minutes ago
          There are several things, but for example, there is no LiDAR on it, nor even a good place to put one. If you're going to navigate around without a LiDAR or good compute for VSLAM (which is very hard to set up and VERY demanding in compute), you will very quickly get lost. At this point the Kiwi is only for very local navigation (and you will still have IMU drift).

          There is also a risk of the base tipping if the arm is fully extended. And the SO-101 has quite poor repeatability.

          The base is also slow to move, and depending on which surface you're on, the omniwheels can pick up dirt quickly.

          Finally, external compute means you need in particular to teleoperate from your computer, so you have to be far from the robot and not necessarily in the same orientation as it, which is very, very uncomfortable. This app system we made is one of the things people love the most about MARS.

          Ah, and RGBD really does matter for navigation AND for learning (augmenting ACT with depth yields better results).

          The wheels are indeed those ones, and the camera in the video is a Luxonis OAK-D Wide, pretty expensive but comfortable to work with. However, the version we're shipping includes a much cheaper stereo-depth camera that we calibrate ourselves. I can't get you the reference right now because it's late at night, but feel free to reach out on Discord.

      • greggsy 6 hours ago
        If these are intended to be single-dwelling or single-workplace, is there a need to have any onboard processing greater than a few watts?

        You could simply host the raw grunt in a base station somewhere else in the premises, keeping the device lighter and lower power.

        • apeytavin 5 hours ago
          That would not make it a complete product and would always require a complex setup whenever and wherever you want to use it.

          This one is really, really convenient and intuitive. Turn it on anywhere, even outside, it just works. Even when I want to dev on it, it's super convenient.

          On some level I truly believe robotics has to become more "complete". We can't always just piece things together; that makes it very hard to have a beautiful product.

          I realize this is more of a philosophical answer, but I also think it is the right one to take this field to the next level

          • greggsy 5 hours ago
            Aren’t you literally selling this with a cloud-based subscription service?

            How does that fit into your ‘complete’ ethos?

            • apeytavin 4 hours ago
              If we could sell it without one, we would for sure, but this is a current technological limitation. And we still make it extra easy to connect to it from anywhere, from your phone. Several components of the robot do not need this cloud service, and because the OS is accessible to you, you could even replace it with your own way of doing things.

              For this one, it's just the only feasible way we found to bring the kind of experience we created to folks.

  • v9v 14 hours ago
    What motors do you use for the arm and what interfaces do you provide (position, velocity, effort)? How long does the battery last when idling?
    • apeytavin 14 hours ago
      On the battery: at least 5 hours when idle, and at least 3 when moving

      On the motors: these are Dynamixels from Robotis, and we provide all three of position, velocity, and effort in the low-level SDK (in ROS2 too)

    • apeytavin 13 hours ago
      we also provide interfaces for IK

      and much higher interfaces for interaction and ai manipulation. Like directly recording episodes of training data so that the arm can use a VLA instead of simple IK.
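As an aside for readers unfamiliar with IK: inverse kinematics maps a desired end-effector position to joint angles. Below is a self-contained sketch for a simplified 2-link planar arm. This is generic textbook math, not Innate's actual 5-DoF solver, and the link lengths are made up.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a 2-link planar arm: return (shoulder, elbow)
    joint angles in radians that place the end effector at (x, y).
    Uses the law of cosines; raises ValueError if (x, y) is unreachable."""
    r2 = x * x + y * y
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # elbow-down branch; -acos gives elbow-up
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow

# Sanity check via forward kinematics: the angles should reproduce the target.
q1, q2 = two_link_ik(1.5, 0.5, 1.0, 1.0)
assert abs(math.cos(q1) + math.cos(q1 + q2) - 1.5) < 1e-9
assert abs(math.sin(q1) + math.sin(q1 + q2) - 0.5) < 1e-9
```

A real 5-DoF solver adds orientation constraints and redundancy resolution, but the core computation (geometry from target pose to joint angles) is the same kind of thing an IK interface exposes.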

  • alextousss 10 hours ago
    Had a chance to see a live demo last month. Looks great, can do a lot more things than an SO-101, and the teleop via the app is both fun and useful. Would definitely buy if I had the money.
  • rosenjon 13 hours ago
    I'll buy one...but your discount code doesn't work. It says "Enter a valid discount code".
    • apeytavin 12 hours ago
      Fixed it! The code could only be used a given number of times
  • NewUser76312 12 hours ago
    That's a cool little robot, and I see the appeal, definitely as an educational toy. I don't think the fidelity is there for real autonomous research. But the bigger issue imo is that there's no way this should cost $2000.

    You've got a $250 computer, some lidar+camera sensor for maybe $100-200, 6 servos, and cheap plastic. Plus you want to charge a $50/mo software subscription fee for some software product; whatever, I guess that's beside the point.

    No shade on the idea because low-cost robotics is an unsolved need for the future. But this current iteration is just not competing well with other alternatives. Perhaps this is more of a comment on what we can accomplish in the West vs what's possible in Asia.

    Why would I not go for this guy for $1600, and attach an arm? https://www.unitree.com/go2

    It's not an apples-to-apples product comparison, but you get the point. There's just so much more raw value offered per dollar elsewhere.

    • apeytavin 12 hours ago
      Actually, the BOM cost required to make something stable that can execute manipulation tasks well enough is around $1k+, hence our price. You will find very cheap robots that can pretend to do what this one can, but in practice won't work well enough.

      As for the Unitree robot, that one is not unlocked for development, does not have an onboard GPU, and does not have an arm. If you want those, check the price they quote; it's very prohibitive.

      You could attach a cheap arm to it but it would also not be stable enough for AI algorithms to run it. We're researchers ourselves, we would have made it cheaper if we could, but then you just can't do anything with it.

      Our platform will deliver the experience of a real AI robot, anything cheaper than that is kind of a lie - or forces you to assemble and calibrate, which we do for you here. It is just the nature of trying to deliver a really complete product that works, and we want to stand for that.

      EDIT: You can take a look at our autonomous demos there, you need something reliable for these: https://docs.innate.bot/welcome/mars-example-use-cases

      • NewUser76312 11 hours ago
        That's just one example that came to mind. I guarantee I could dig for 30 mins and find a mobile manipulator platform from China that kills it on hardware-to-price ratio that is either 'open enough' or could be made so.

        As someone who's dabbled in this before, I guess I'd rather just sit down and plan a BOM and do it myself if that's your markup anyways. Not that it's totally unreasonable for people who just want something super simple out of the box that works.

        My general commentary is just that it's sad how much basic servos and what not cost in North America. We've completely ceded this industry to Asia.

        • apeytavin 11 hours ago
          Our servos come from Asia. If you can find a platform with everything we have for around $1k BOM happy to review it but we've been pretty deep in picking our components.

          Also, fair to say that if indeed you're the kind of person who likes to assemble all of this yourself, you're not directly in our target :)

          This is more for AI / software folks who don't want to have to assemble and calibrate everything and risk ending up with an arm that is not repeatable and thus can't actually learn properly. We have seen many folks spend a weekend or more trying to put these together, end up with a barely working platform, and come away disgusted with AI robotics

    • robertritz 7 hours ago
      Actually the reason is that with the Unitree products if you want the Python SDK the price jumps to $5,000 for the same hardware. At least it was the last time I checked earlier this year.
    • apeytavin 12 hours ago
      As a side note, the previous generation of research platforms for that size made in Asia were the Turtlebots, which go for that same price, but without GPU, arm...

      I would say the problem is that most manufacturers, including Chinese ones, sell you platforms that are not reliable enough for AI manipulation, and there's a race to the bottom, in which we try not to participate

      • NewUser76312 11 hours ago
        > I would say the problem is that most manufacturers, including chinese, sell you platforms that are not reliable enough for AI manipulation, and there's a race to the bottom, in which we try not to participate

        Pretty lofty claims though, really think you're so above everyone on quality at this price point? I know what dynamixels are capable of, and I see the jitter in the demo videos.

        Why aren't the manipulator specs easily accessible on the website? Have you run a real repeatability test? Payload even?

        It's a neat high-fidelity garage build platform, but I don't see any reason to assume this price premium is due to hardware quality.

        • apeytavin 11 hours ago
          The jitter in some demos is arguably because of bad connectivity; we will reshoot those.

          You can see however in these demos: https://docs.innate.bot/main/welcome/mars-example-use-cases

          that it is indeed pretty smooth.

          Also, sorry the arm specs were not there! You can now have them at: https://docs.innate.bot/robots/mars/arm

          • NewUser76312 10 hours ago
            That's fine, but for future reference, robotic arms should have their specs listed and quantified - stuff like reach, payload, repeatability. If I'm a researcher, how do I know if this arm can do what I need? I can only infer so much from a few demo videos.

            Final comment I'll say, it's a weird and tough price point. Actual research labs would rather spend $20,000 on a very high quality and likely larger high-fidelity platform. A random hacker or grad student will need some real convincing to shell out $2,000, sub $1K might better serve them. So what's the target customer profile exactly?

            I encountered similar issues developing a $3K plug and play robot research arm in the past. The economics are awkward. You can actually just spend $5K and get a really good second-hand industrial robot (maybe even first-hand now from China). Or you could spend $500 and get a 6 DOF platform at least as good as your current platform's arm and then buy the sensor separately and bolt it to your workspace - bam, done. And no, the software isn't that important, servos are easy to work with...

            Therefore my 'in between platform' was stuck in a hard place. I made some one-off sales, but never really scaled the business, which is what would be needed for any fancy "we're the platform where people do AI" vision to manifest to investors. Hardware is tough - they'll see your numbers and easily pass. They'll realize you need sales in quantity to get anywhere meaningful otherwise.

            So I wanted to share criticisms and my experience so you can look ahead to likely challenges and hopefully get further. Best of luck.

            • apeytavin 10 hours ago
              Absolutely, the link I sent you has these specs you mentioned listed.

              And yeah, I agree this mid-market is indeed tough, but this is the upper price I was looking for when I started: with my AI background, I bought a similarly priced TurtleBot and then struggled to put a cheap arm on it. Anything under this is really bad for algorithms, although you can reduce cost with just the arm, clamping it to your workspace as you suggested, but then you don't have mobility.

              I will keep your comment in mind, and thank you for the thoughtfulness. You might be interested to know that we intend to show something bigger not long from now. But this is, as you said, more for investors.

              For now I'm content if there's enough people that want this one

    • aetherspawn 9 hours ago
      The kind of person who would buy this doesn’t care whether it’s $2000 or $5000 probably. They care more about whether it actually exists and will arrive. Complaining about price, especially for something so niche, is useless feedback if you’re not actually in the market for one.
    • trhway 5 hours ago
      >Why would I not go for this guy for $1600, and attach an arm? https://www.unitree.com/go2

      Quasi-Lego-style robo dog for RPi is $100-150 on AMZN

  • justin66 16 hours ago
    > our novel SDK called BASIC

    Sigh.

    • apeytavin 15 hours ago
      yeah it's a tribute
      • qzw 13 hours ago
        Doesn’t that name make it confusing for your users? A tribute maybe shouldn’t be identical?
        • apeytavin 11 hours ago
          So far folks have been getting it but i see the point. I guess we'll see if a renaming makes sense later
  • greenie_beans 15 hours ago
    this website froze my computer for several minutes, like i'm browsing the web in 2004. gives the impression that the robot will work the same way, very uninterested now
    • apeytavin 15 hours ago
      you're the first person to mention that to me and it's very very helpful actually. can i contact you to see what you see?
      • greenie_beans 14 hours ago
        ain't got time for that. happened in both firefox (locked down with lot of privacy settings/adblockers) and chrome (no privacy/adblockers etc). froze entire computer on both browsers for a few minutes for each browser. my computer has like 16gb of RAM
        • pugworthy 13 hours ago
          Runs fine for me. Perhaps a bit more investigation on your end before putting fault on the site and authors?
          • greenie_beans 11 hours ago
            i spent ten minutes waiting for the computer to be unfrozen + write up these comments. i'm already being generous enough with my time after those 10 minutes, especially by describing how i tested in different browsers. if you can't see that, not my problem!
            • apeytavin 11 hours ago
              Hey, OP here, just sharing that I only wrote the first answer to your comment, not the next one. Wouldn't want to dismiss your issue!

              I have been running performance checks during the morning and tried with other browsers, used the debugging tools. I can see why it could be slow but I definitely need more trials to understand where it comes from. Bear with us while we're on it :)

              and as i said earlier, this is really helpful for us to know so thank you

          • creer 13 hours ago
            I'm with the commenter: when my first (or within 15 minute) impression of a vendor or product is craptastic I cut my losses and consider myself lucky for dodging a longer term bullet.

            The commenter was actually very considerate and raised a warning where it might be seen. And they were kind enough to attempt this with two different browsers. After that you can buy my troubleshooting time at its usual hourly rate.

            (Because it's a frequent enough issue: I wouldn't see that warning as being about a one-off obscure bug that will affect few people and doesn't matter. It's a warning that the web site probably did not enough take compatibility in consideration - and was approved without such consideration.)

            "Runs fine for me" is an absurd bar for reliability / compatibility, no?

            • apeytavin 10 hours ago
              I can assure you this was a strong consideration of ours and all assets underwent a very strict regime. There is definitely more than on other types of websites, but it's in line with other robot websites that have significant amounts of video.

              Not to say I dismiss the comment; I'm definitely looking at it because it might come from somewhere else. I just don't yet see what the bottleneck is.

        • foxglacier 8 hours ago
          If your whole computer's freezing, you can't blame a website for that. That would mean there's a pretty serious vulnerability in both browsers. If you really believe that, you should tell the browser vendors, not the website operator.
  • chfritz 16 hours ago
    Very cool! But I think we can help you improve your video streaming and teleop (it seems to be pretty low frame rate in the demo). We've built probably the best remote teleop solution for robotics on the market today, and it can be embedded anywhere (white-label). Want to get in touch to discuss? You can find my LinkedIn in my profile.
    • apeytavin 15 hours ago
      the video stream will get a lot of improvement for sure! the teleop is pretty good already i believe, but happy to chat further. feel free to dm / join discord
      • chfritz 15 hours ago
        Don't underestimate the challenges of making remote-teleop work reliably and efficiently with low-latency via the Internet: https://transitiverobotics.com/blog/streaming-video-from-rob....
        • apeytavin 15 hours ago
          webrtc for video stream is what we have in the pipeline for improving the stream yep!
          • chfritz 15 hours ago
            with congestion control, packet loss mitigation, hardware-accelerated encoding, multiple streams, teleop commands on the same channel (required for safety)? Do you host your own TURN server? My point is that robotics companies should focus on and push the envelope on the application of their product, not reinvent the wheel on infrastructure and tooling.
            • apeytavin 15 hours ago
              i don't know if it is that necessary for just phone control, wdyt?

              do you have anything plug-and-play for jetson nano?

              • chfritz 15 hours ago
                I assume you mean Orin Nano? (Jetson Nano is EOL and only supports Ubuntu 18.) Yes, we have users that run on Orin Nano and it's plug-and-play on any hardware. However, note that the Orin Nano doesn't have hardware encoders, so it will take CPU cycles to encode the video, making it a less than ideal choice for teleoperated robots. Cheaper boards like the Orange Pi 5 are a better fit.

                All the webrtc-features really only become relevant when you want to control the robot over the Internet, i.e., not just locally where you can assume reliable network.

  • admiralrohan 3 hours ago
    I believe the core challenge in AI robotics is: Can we transfer the cultural knowledge inherent in human bodies and memories?

    It's very difficult. Norms, rituals, and intuitive social cues, passed on organically, drive human action and evolution by enabling adaptive cooperation, empathy, and innovation in diverse societies, and they are hard to transfer.

    For example, which books to read and whom to trust. You often make decisions on gut feeling which is hard to transfer.

    The product looks promising. Hoping for the best.