14 comments

  • shireboy 44 days ago
    Fun related anecdote. I learned to program Amiga BASIC when I was around 12 by copying games from the back of Amiga magazines. At the same time, my sister was learning to read and write. As a nerdy big brother, I set out to write something where she could type plain English instructions and it would do what she typed. Playing Zork made this seem very feasible, at least to my 12-year-old mind. I never finished the project, but have been thinking about something like this again lately.
  • skybrian 44 days ago
    Could you write some documentation with some examples of common commands?

    Otherwise, it seems like “guess the verb” in a text adventure.

    [Oops, edited before I saw the reply.]

    • kenstler 44 days ago
      Hi! A good general point.

      We're doing a few things to try to guide the user:

      1. We include ~26 fully-formed "Example" and "Template" notes in the app - these are intended to show users what they can do, but they can also be copied and used as-is or edited.

      2. When users write new notes, we use a custom keyboard tool called "AutoPrompt" to show the user what actions they can take in each section. It includes prompts like "Search reddit for...", etc., which are inserted into the text entry for the user to complete. (You can see this in the vid on our home page.)

      3. If users write something incorrect in the note, they get a response back telling them what they need to correct.

      Admittedly, as a notes-based interface there's a tradeoff we're trying to balance between the flexibility of unstructured text input and guardrails that inform the user. Appreciate any further feedback you can share!

      • kissgyorgy 44 days ago
        If you highlighted the actions in the text with something like a different background color, it would be clear what will trigger, and over time the user would remember these actions. It would also make them clearer to see in the predefined notes.
        • kenstler 44 days ago
          Love this - thank you!
      • skybrian 44 days ago
        That sounds useful for someone who already signed up, but I’d rather read them before signing up or installing anything. The documentation is how I decide if it’s worth looking further.

        (I also don’t play videos or animations, and try to ignore them if they autoplay. They are usually too fast or too slow. A slideshow that you step through is okay.)

        Maybe I’m not your target audience, though?

        • kenstler 44 days ago
          Ah - I assumed you were talking about the in-app experience. My bad!

          Point well taken and we'll definitely add more details/docs on the site.

          Thanks!

          • dr_kiszonka 44 days ago
            I just wanted to add that this information would be useful to me too, and I did watch the video. Also, I would be curious to know what integrations you already have and plan to add.
  • teachmetolearn 45 days ago
    Love the UX. AI is unlocking a completely new UX paradigm, and kudos to you for pushing it forward!
    • kenstler 44 days ago
      Appreciate the feedback!
  • hexcase 44 days ago
    Not available in Canada, unfortunately. I'm excited to see it; I'm always looking for personal automation tools.
  • smusamashah 44 days ago
    The video shows actions on external services. Can NoteTech be used to automate phone activities, e.g. "Turn phone silent from 10pm to 6am" or "When John is calling, ring at full volume"?
    • salil999 44 days ago
      I think you can already do some of this in iOS with the Shortcuts app
  • stefanyas 44 days ago
    It's not available in my country via Play Store - any chance of enabling? I'm in Serbia.
  • pyinstallwoes 44 days ago
    This is cool in a similar way that Forth is cool.
  • nbbaier 45 days ago
    This is interesting. I assume you're using an LLM to generate the code that runs these automations? Like "compiling" the prompt to code?
    • kenstler 45 days ago
      Yes, but instead of "compiling" directly to generated code, we generate a type of DAG that defines an execution flow of pre-defined parameterized code modules.
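
      To give a rough flavor, here's a minimal sketch of what that kind of graph could look like. The module names and schema below are purely illustrative assumptions, not our actual implementation:

        # Hypothetical sketch: a DAG of pre-defined, parameterized modules
        # (illustrative names only, not NoteTech's real schema).
        from dataclasses import dataclass, field
        from typing import Callable, Dict, List

        @dataclass
        class Node:
            module: str                    # name of a pre-defined code module
            params: Dict[str, str]         # parameters filled in from the note text
            depends_on: List[str] = field(default_factory=list)  # upstream node ids

        # Stub registry of modules; real modules would call external services.
        MODULES: Dict[str, Callable] = {
            "search_reddit": lambda params, inputs: f"results for {params['query']!r}",
            "summarize":     lambda params, inputs: f"summary of {list(inputs.values())}",
            "send_email":    lambda params, inputs: f"emailed {params['to']}: {list(inputs.values())}",
        }

        # Note: "Every morning, search reddit for 'ai agents' and email me a summary."
        graph = {
            "search":  Node("search_reddit", {"query": "ai agents"}),
            "summary": Node("summarize", {}, depends_on=["search"]),
            "email":   Node("send_email", {"to": "me@example.com"}, depends_on=["summary"]),
        }

        def run(graph: Dict[str, Node]) -> Dict[str, str]:
            """Execute nodes in dependency order (simple topological pass)."""
            done, results = set(), {}
            while len(done) < len(graph):
                for name, node in graph.items():
                    if name in done or any(d not in done for d in node.depends_on):
                        continue
                    inputs = {d: results[d] for d in node.depends_on}
                    results[name] = MODULES[node.module](node.params, inputs)
                    done.add(name)
            return results

        print(run(graph)["email"])

      Each node just names a module and its parameters, and the runner resolves dependencies before dispatching, so the LLM only ever fills in a graph over vetted building blocks rather than emitting arbitrary code.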
  • jwong_ 44 days ago
    Interesting idea. I liked the AI button, but had difficulty even finding it since you need to click into a line item before you can see it.

    Also, I wanted to "test" my work before deploying it, but that seems not possible?

    • kenstler 44 days ago
      Thanks for the feedback!

      Yes - at present, a note needs to be deployed before you can test. Then you can make a change, and redeploy. We can look at making this a bit more seamless in future iterations.

      Re: the AI button - any suggestion on how we might improve there?

      • jwong_ 44 days ago
        Perhaps using the same nomenclature for both? In the tooltip you call it an Autoprompt, but the icon itself just says "AI". Also, I typed some steps in, and expected to need to review before deploying since I didn't have any "AI" or "autoprompt" to help.

        Maybe some more obvious indication in the workflow that a note is "done" and can be tested?

  • Brajeshwar 44 days ago
    Is there a desktop version planned? Watched the video, and I'd love to have one.
  • poulpy123 44 days ago
    Looks interesting, but you should add a note that it is not available in Europe (or at least France) on Android.
  • devansh_jain 44 days ago
    Not available in India :(
  • breck 44 days ago
    1. Could you make a web version? (I don't own a smartphone).

    2. Are you looking for investors?