Node.js adds experimental support for TypeScript

(github.com)

1204 points | by magnio 85 days ago

45 comments

  • zarzavat 84 days ago
    One thing to note is that it is impossible to strip types from TypeScript without a grammar of TypeScript. Stripping types is not a token-level operation, and the TypeScript grammar is changing all the time.

    Consider for example: `foo < bar & baz > ( x )`. In TypeScript 1.5 this parsed as (foo<bar) & (baz > (x)) because bar&baz wasn’t a valid type expression yet. When the type intersection operator was added, the parse changed to foo<(bar & baz)>(x) which desugared to foo(x). I realise I’m going back in time here but it’s a nice simple example.
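
    For reference, a minimal sketch of how modern TypeScript resolves that ambiguity (all names below are made up for illustration):

        type Bar = { b: number };
        type Baz = { z: number };

        function foo<T>(x: unknown): T {
          return x as T;
        }
        const x = { b: 1, z: 2 };

        // Modern TypeScript parses the next line as a generic call, foo<Bar & Baz>(x),
        // which type-strips to plain foo(x). Read as JavaScript (or as a TypeScript
        // that predates intersection types), the same characters would instead mean
        // (foo < Bar) & (Baz > (x)) -- two comparisons and a bitwise AND.
        const y = foo < Bar & Baz > (x);
        console.log(y.b + y.z); // 3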

    If you want to continue to use new TypeScript features you are going to need to keep compiling to JS, or else keep your node version up to date. For people who like to stick on node LTS releases this may be an unacceptable compromise.

    • madeofpalk 84 days ago
      It looks like the team has already considered this in one regard:

      > There is already a precedent for something that Node.js supports that can be upgraded separately: npm. Node bundles a version of npm that can be upgraded separately; we could do the same with our TypeScript transpiler.

      > We could create a package that we bundle but that can also be downloaded from npm: keep a stable version in core, but if TypeScript releases new features that we don't support, or breaking changes, or users want to use the new shiny experimental feature, they can upgrade it separately. This ensures that users are not locked in, but also provides support for a TypeScript version for the whole 3 years of the lifetime of a Node.js release.

      https://github.com/nodejs/loaders/issues/217

      • SomeCallMeTim 84 days ago
        As long as Node understands to use the project-specific version of TypeScript (i.e., the one in node_modules or the PNP equivalent), that should be fine.

        But it would be a step backward to need to globally upgrade TypeScript (as you do with npm), since some older projects will not be compatible with newer versions of TypeScript.

        Ask me how I know. ;)

        • madeofpalk 84 days ago
          I think Node’s strategy here is to not let perfect be the enemy of useful for some people.
        • nilsbunger 83 days ago
          At first I thought that's not a big deal, I could manage that with `nvm`. But I think you're right, you really want it to pick up the project-specific typescript so that you're using ONE version of typescript for type checking, execution, and more.
        • silverwind 84 days ago
          > As long as Node understands to use the project-specific version of TypeScript

          It won't, but in such a scenario, TypeScript would only be a type checker, which is an entirely different endeavor than running TypeScript.

    • WorldMaker 84 days ago
      The syntax from the perspective of type stripping has been relatively stable for more versions of Typescript than it was unstable. You had to reach all the way back to 1.5 in part because it's been very stable since about 2.x. The last major shift in syntax was probably Conditional Types in 2.8 adding the ternary if operator in type positions. (The type model if you were to try to typecheck rather than just type-strip has changed a lot since 2.x, but syntax has been generally stable. That's where most of Typescript's innovation has been in the type model/type inferencing rather than in syntax.)
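
      For illustration, the conditional-type syntax looks like this (a minimal sketch; the type names are arbitrary):

          // Added in TypeScript 2.8: a "ternary" at the type level, plus `infer`.
          type ElementType<T> = T extends Array<infer U> ? U : T;

          type A = ElementType<string[]>; // string
          type B = ElementType<number>;   // number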

      It's still just (early in the process) Stage 1, but the majority of Typescript's type syntax, for the purposes of type stripping (not type checking), is attempting to be somewhat standardized: https://github.com/tc39/proposal-type-annotations

      • teaearlgraycold 84 days ago
        They did add a new keyword, satisfies, in 4.9. That would be a breaking change if you can’t upgrade the type stripper separately.
        • felixfbecker 84 days ago
          This is true, but in other cases they added keywords in ways that could work with type stripping. For example, the `as` keyword for casts has existed for a long time, and type stripping could strip everything after the `as` keyword with a minimal grammar.

          When TypeScript added const assertions, they added them as `as const`, so type stripping could have still worked depending on how loosely it is implemented.

          I think there is a world where type stripping exists (which the TS team has been in favor of) and the TS team might consider how it affects type stripping in future language design. For example, the `satisfies` keyword could have also been added by piggy-backing on the `as` keyword, like:

              const foo = { bar: 1 } as subtype of Foo
          
          (I think not using `as` is a better fit semantically but this could be a trade-off to make for better type stripping backwards compatibility)
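
          For comparison, here is a minimal sketch of the `satisfies` syntax that actually shipped in TypeScript 4.9 (the type and variable names are made up):

              type Color = { r: number; g: number; b: number };

              // `satisfies` checks the literal against Color but keeps the narrower
              // inferred type, whereas `as Color` would widen it. A stripper that only
              // knows about `as` would not recognize the bare `satisfies` keyword,
              // which is the compatibility concern discussed above.
              const red = { r: 255, g: 0, b: 0 } satisfies Color;
              console.log(red.r); // 255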
          • Timon3 84 days ago
            I don't know a lot about parser theory, and would love to learn more about ways to make parsing resilient in cases like this one. Simple cases like "ignore rest of line" make sense to me, but I'm unsure about "adversarial" examples (in the sense that they are meant to beat simple heuristics). Would you mind explaining how e.g. your `as` stripping could work for one specific adversarial example?

                function foo<T>() {
                    return bar(
                        null as unknown as T extends boolean
                        ? true /* ): */
                        : (T extends string
                            ? "string"
                            : false
                        )
                        )
                }
            
                function bar(value: any): void {}
            
            Any solution I can come up with suffers from at least one of these issues:

            - "ignore rest of line" will either fail or lead to incorrect results - "find matching parenthesis" would have to parse comments inside types (probably doable, but could break with future TS additions) - "try finding end of non-JS code" will inevitably trip up in some situations, and can get very expensive

            I'd love a rough outline or links/pointers, if you can find the time!

            [0] TS Playground link: https://www.typescriptlang.org/play/?#code/AQ4MwVwOwYwFwJYHs...

            • WorldMaker 84 days ago
              Most parsers don't actually work with "lines" as a unit, those are for user-formatting. Generally the sort of building blocks you are looking for are more along the lines of "until end of expression" or "until end of statement". What defines an "expression" or a "statement" can be very complex depending on the parser and the language you are trying to parse.

              In JS, because it is a fun example, "end of statement" is defined in large part by Automatic Semicolon Insertion (ASI), whether or not semicolons even exist in the source input. (Even if you use semicolons regularly in JS, JS will still insert its own semicolons. Semicolons don't protect you from ASI.) ASI is also a useful example because it is an ancient example of a language design intentionally trying to be resilient. Some older JS parsers even would ignore bad statements and continue on the next statement based on ASI determined statement break. We generally like our JS to be much more strict than that today, but early JS was originally built to be a resilient language in some interesting ways.

              One place to dive into that directly (in the middle of a deeper context of JS parser theory): https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

              • Timon3 84 days ago
                Thanks for the response, but I'm aware of the basics. My question is pointed towards making language parsers resilient towards separately-evolving standards. How would you build a JS parser so that it correctly parses any new TS syntax, without changing behavior of valid code?

                The example snippet I added is designed to violate the rules I could come up with. I'd specifically like to know: what are better rules to solve this specific case?

                • thanksgiving 83 days ago
                  > How would you build a JS parser so that it correctly parses any new TS syntax, without changing behavior of valid code?

                  I don't know anything about parsers besides what I learned from that one-semester introduction class I took in college, but from what I understand of your question, I think the answer is you can't, simply because we can't look into the future.

                • WorldMaker 83 days ago
                  In your specific case:

                  1. Automatic semicolon insertion would next want to kick in at the } token, so that's the obvious end of the statement. If you've asked it to ignore from `as` to the end of the statement (as you've established with your "ignore to the end of the 'line'"), that's where it stops ignoring.

                  1A. Obviously in that case `bar(null` is not a valid statement after ignoring from `as` to the end of the statement.

                  2. The trick to your specific case, which you've stumbled into, is that `as` is an expression modifier, not a statement modifier. The argument to a function is an expression, not a statement. That definitely complicates things because "end of the current expression" is often a lot more complicated than ASI (and people think ASI is complicated). Most parsers are going to have some sort of token state counter for nested parentheses (this is a fun implementation detail of different parsers because while recursion is easy enough in "context-free grammars" the details of tracking that recursion are generally not technically "context-free" at that point, so sometimes it is in the tokenizer, sometimes it is a context extension to the parser itself, sometimes it is using a stack implementation detail of the parser) and you are going to want to ignore to the next "," token that signals a new argument or the next ")" that signals the end of arguments, with respect to any () nesting.

                  2A. Because of how complicated expression parsing can get, that probably sets some resiliency bounds on your "ignorable grammar": it may require that internally it still follows most of the logic of your general expression language: balanced nested parentheses, no dangling commas, usual comment syntax, etc.

                  2B. You probably want to define those sorts of boundaries anyway. The easiest way is to say that ignorable extensions such as `as` must themselves parse as if they were a valid expression, even if the language cannot interpret its meaning. You can think of this as the meta-grammar where one option for an expression might be `<expression> ::= <expression> 'as' <expression>` with the second expression being parseable but ignorable after parsing to the language runtime and JIT. You can see that effectively in the syntax description for Python's original PEP 3107 syntax-only type hints standard [1]; it's surprisingly succinct there. (The possible proposed grammar in the Type Annotations proposal to TC39 is a lot more specific and a lot less succinct [2], for a number of reasons.)

                  [1] https://peps.python.org/pep-3107/

                  [2] https://tc39.es/proposal-type-annotations/grammar.html

            • bazoom42 83 days ago
              CSS syntax has specific rules for how to handle unexpected tokens. E.g. if an unexpected character is encountered in a declaration, the parser ignores characters until the next ; or }. But CSS does not have arbitrary nesting, so this makes it easier.

              Comments, as in your example, are typically stripped in the tokenization stage, so they would not affect parsing. The TypeScript type syntax has its own grammar, but it uses the same lexical syntax as regular JavaScript.

              A “meta grammar” for type expressions could say skip until next comma or semicolon, and it could recognize parentheses and brackets as nesting and fully skip such blocks also.

              The problem with the ‘satisfies’ keyword is that a parser without support would not even know it is part of the type language. New ‘skippable’ syntax would have to be introduced as ‘as satisfies’ or similar, triggering the type-syntax parsing mode.

              • Timon3 83 days ago
                I understand that you can define a restricted grammar that will stay parseable, as the embedded language would have to adapt to those rules. But that doesn't solve the question, as Typescript already has existing rules which overlap with JS syntax. The GP comment was:

                > For example, the `as` keyword for casts has existed for a long time, and type stripping could strip everything after the `as` keyword with a minimal grammar.

                My question is: what would a grammar like this look like in this specific case?

                • bazoom42 82 days ago
                  How about:

                      TypeAssertion ::= Expression “as” TypeStuff
                      TypeStuff ::= TypeStuffItem+
                      TypeStuffItem ::= Block | any token except , ; ) } ]
                      Block ::= ParenBlock | CurlyBracketsBlock | SquareBracketsBlock | AngleBracketsBlock
                      ParenBlock ::= ( ParenBlockItem* )
                      ParenBlockItem ::= Block | any token except ( )
                  etc.
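
                  As a rough sketch (purely illustrative, not how Node.js or any real stripper works), that skipping could look like this over an already-tokenized input; the token shape and names are made up:

                      type Token = { text: string };

                      // Tokens that end the type expression when we are not inside a nested block.
                      const STOP = new Set([",", ";", ")", "}", "]"]);
                      const OPEN = new Set(["(", "{", "["]);
                      const CLOSE = new Set([")", "}", "]"]);

                      // Returns the index of the first token after the skipped "TypeStuff".
                      // Angle-bracket blocks for generics would need similar, but more careful,
                      // handling, since < and > are also comparison operators.
                      function skipTypeStuff(tokens: Token[], start: number): number {
                        let depth = 0;
                        let i = start;
                        while (i < tokens.length) {
                          const t = tokens[i].text;
                          if (depth === 0 && STOP.has(t)) break; // end of the type expression
                          if (OPEN.has(t)) depth++;              // enter a nested block
                          else if (CLOSE.has(t)) depth--;        // leave a nested block
                          i++;
                        }
                        return i;
                      }

                      // e.g. for `bar(x as (string | number)[], y)` a stripper would call
                      // skipTypeStuff just after the `as` token and resume at the `,` before `y`.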
          • zarzavat 83 days ago
            It can’t strip what’s after the as keyword without an up-to-date TS grammar, because `as` is an expression. The parser needs to know how to parse type expressions in order to know when the RHS of the `as` expression ends.

            Let’s say that typescript adds a new type operator “wobble T”. What does this desugar to?

                x as wobble
                T
            
            Without knowing about the new wobble syntax this would be parsed as `x as wobble; T` and desugar to `x; T`

            With the new wobble syntax it would be parsed as `x as (wobble T);` according to JS semicolon insertion rules because the expression wobble is incomplete, and desugar to `x`

            • bazoom42 83 days ago
              The “as” expression is not valid JavaScript anyway, so the default rule for implicit semicolon does not apply. A grammar for type expressions could define if and how semicolons should be inserted.
              • zarzavat 83 days ago
                TypeScript already has such type operators though. For example:

                    type T = keyof
                    {
                      a: null,
                      b: null
                    }
                
                Here T is `"a" | "b"`; no automatic semicolon is inserted after `keyof`. While I don’t personally write code like this, I’m sure that someone does. It’s perfectly within the rules, after all.

                While it’s true that TS doesn’t have to follow JS rules for semicolon insertion in type expressions, it always has done, and probably always should do.

                • bazoom42 79 days ago
                  This is just the default. Automatic semicolon insertion only happen in specific well-defined cases, for example after the “return” keyword or when an invalid expression can be made valid by semicolon insertion. Neither applies here.
        • panzi 84 days ago
          But at least it’s a loud failure and not a silent different interpretation of the source, as in the example above. Still.
    • the_gipsy 84 days ago
      You would also have to update your compiler. I guess you could phrase this as: you can't update your TS versions independently from your node.js version. But that's probably not an issue.
      • zarzavat 84 days ago
        It’s an issue because node has a system of LTS releases, whereas TypeScript has quarterly updates, so the release cadence is different.

        Updating node is much more fraught than updating TypeScript. For example, it may break any native code modules. That’s why users are directed to use the LTS and not the most recent release, so that there’s enough time for libraries to add support for the new version.

        On the other hand, I usually adopt a new TypeScript version as soon as it comes out.

        • satanacchio 84 days ago
          I made sure to decouple the transpiler from node itself; the transpiler is in an npm package called amaro, which is bundled in node. The goal is to allow users to upgrade amaro independently so we don't have to lock a TS version for the whole lifespan of a release.
        • mort96 84 days ago
          TypeScript feels "boring" enough at this point that being a few years behind isn't gonna be an issue in most cases. For teams who want to stay on the absolute latest release of TypeScript but want to be more conservative with their Node version, external compilation will remain necessary; but for someone like me, where TypeScript has been "good enough" for many years that I'm not excited by new TypeScript releases, this feature will be really nice.

          ("Boring" in this context is a compliment, by the way)

          EDIT: Though reading other comments, it seems like you can update the typescript stripper independent of node? That makes this moot anyway

          • mu53 84 days ago
            typescript is still evolving in dramatic ways. The 5.0 release has some really good improvements
            • mort96 84 days ago
              But at the same time, it's good enough and has been good enough for many years. It's like how I'm sure EcmaScript 2024 contains cool new stuff, but if node only supported ES6, I would have no trouble writing ES6.
        • another-dave 84 days ago
          Though I'd primarily see this as a feature for the REPL or manual scripts where I'm not going to mind doing a `nvm use X`.

          For production use, I'd still put my TS files through a build pipeline as normal

          • chrisweekly 84 days ago
            tangential protip: if you're using nvm to manage node versions, take a look at fnm as a superior replacement. (It can read the same .nvmrc file to switch on cd into a given dir, but it's faster and "cleaner" wrt impact on your shell.)
      • klodolph 84 days ago
        Not necessarily.

        With a couple exceptions (like enums), you can strip the types out of TypeScript and end up with valid JS. What you could do is stabilize the grammar, and release new versions of TypeScript using the same grammar. Maybe you need a flag to use LTS grammar in your tsconfig.json file.
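
        For example, a minimal sketch of the enum exception (illustrative names):

            // The annotation on `level` erases cleanly, but the enum itself must be
            // turned into a real JavaScript object that exists at runtime, so a pure
            // type-stripper cannot handle it by deletion alone.
            enum LogLevel {
              Debug,
              Warn,
            }

            const level: LogLevel = LogLevel.Warn;
            console.log(level); // 1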

        • 3np 83 days ago
          > With a couple exceptions (like enums)

          Apart from when a clueless engineer attempts them, it's been working out fine to pretend they don't exist.

    • nialv7 84 days ago
      Using inequality signs as angled brackets really is a mistake isn't it...
      • jmull 84 days ago
        Mistake isn't the right word. It's just a tradeoff.

        There was no perfect solution available, so a tradeoff was necessary. You can disagree with this particular tradeoff, but had they gone another way some people would disagree with that as well. To be a mistake there would have had to have been an option available that was clearly better at the time.

        Anyway, the idea that TS 5 should be backwards compatible with TS 1 is probably a bad one. Personally, I think packages with wide usage break backwards compatibility far too easily -- it puts everyone who uses it on an upgrade treadmill, so it should be done very judiciously. But even I wouldn't argue that TS 1 should have been its final form.

      • klodolph 84 days ago
        I’ll flip this around… reusing comparison as angle brackets is the mistake. C++ ran into some issues too.

        I think Rust made the really smart move of putting :: before any type parameters for functions. Go made the good move of using square brackets for type parameters.

        • zarzavat 84 days ago
          The problem can be traced back to ASCII/typewriters only including three sets of paired characters, plus inequality signs, which is not enough for programming languages.

          We really need five sets: grouping, arrays/indexing, records, type parameters, and compound statements. Curly braces {} are also overloaded in JS for records and compound statements, leading to x => {} and x => ({}) meaning different things.
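
          A minimal sketch of that overloading:

              const f = (x: number) => {};   // block body: returns undefined
              const g = (x: number) => ({}); // parenthesized object literal: returns {}

              console.log(f(1)); // undefined
              console.log(g(1)); // {}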

          Square brackets wouldn't work for parametric functions because f[T](x) already means get the element at index T and call it.

          • behnamoh 84 days ago
            Every time a standardization happens, part of human creativity gets suppressed. Before ASCII, people were inventing all sorts of symbols and even the alphabet was flexible to changes. After ASCII, we got stuck with a certain set of letters and symbols. Heck, even our keyboards haven't changed that much since then. I really think we need more symbols than just &@$#%^*}{][<>()/\_~|
          • parasense 84 days ago
            Paired characters...

            That's an interesting topic. Do you happen to know if UTF-8 contains more "pair" characters? In LaTeX we call these delimiters, but that's just my limited experience coming in from the math side. I tend to agree that it would be helpful to have more kinds of nesting/pairing/grouping/delimiting characters. The problem is my imagination is limited to what I know from the ASCII world, and so it goes... no idea what new sets would look like.

            • pshc 84 days ago
              So many different pairs are available. I like Asian corner brackets 「」and French guillemets « ». The angle brackets 〈〉 are popular in CS/math papers I think, though they might be confused with <>.
              • butwhynotTSR 83 days ago
                I have had access to « » before, which led me to do a double take when I first encountered a << on a workstation with font ligatures. It was funny.

                Which brings us to the philosophical question about paired characters: If we were to pick paired characters from a key set not available on everyone's keyboard, why must the paired characters even be used in real languages? Is that not actually actively detrimental when we end up needing the characters for real? Is this not why we even have escape characters to begin with?

                Plus, must they be a part of anyone's real keyboard to begin with? What makes 「」any more valid than ¿? Could we not have saved ourselves a lot of mental strain if we solved it earlier on with a full set of truly uncommon characters?

                I can imagine an alternate history, where some programming language in the late 70's made their editors with simple shortcuts (such as Ctrl+A for "array block") to input barely-if-ever-used yet low code character such as † or ‡, which would never be used outside of a string. And nowadays, with modern IDE's, we wouldn't even see those characters half the time. It would be syntax sugar, with blocks types stated in gutters and data types represented with colors or text.

            • IshKebab 84 days ago
              Sure, you can use Canadian Aboriginal Syllabics!

              https://www.reddit.com/r/rust/comments/5penft/comment/dcsq64...

            • paulddraper 84 days ago
              Nit: UTF-8 is a particular encoding. Unicode is a character system.

              UTF-8, UTF-16, UTF-32 all have the same characters.

          • klodolph 84 days ago
            Five sets, but at any given place in the syntax, not all five are possible. (I would add function calls to the list—so, six.)

            In most languages, (grouping and compound statements) cannot syntactically appear in the same place as (indexing, records, type parameters, function calls). So we are immediately down to four.

            Rust takes the approach that you use :: before type parameters, so they are easily distinguished from comparison operators at a syntactic level.

            Go takes the approach that [] is just fine for type parameters—which seems pretty reasonable to me. In Go, there’s nothing that can be both indexed *and* take a type parameter.

            • zarzavat 83 days ago
              > In Go, there’s nothing that can be both indexed and take a type parameter.

              True, but TypeScript has a rule against type-dependent emit - it’s not allowed. Code must always do the same thing regardless of what the types are. And in any case JavaScript does allow indexing on functions, since functions are just objects.

          • jenadine 83 days ago
            There is also the possibility to use digraphs such as <: :> or other combinations.

            (And they could even be converted to Unicode by the code formatter.)

          • xigoi 84 days ago
            I think the D syntax would work: f!T(x)
        • lolinder 84 days ago
          I don't think you're flipping it around, I think that's exactly what OP was saying, just clearer.
      • SomeCallMeTim 84 days ago
        It's not a TypeScript mistake.

        You could argue that it was a C++ mistake. It makes parsing harder, but otherwise seems to work as expected, so I don't consider it a mistake, but you could at least argue that way.

        But regardless of whether it was a mistake in C++, it's now a de facto standard, used in C++, Java, C#, and other languages to denote type parameters.

        I would argue that it would have been a mistake to break that standard. What would you have used, and in what way would that have been enough better to compensate for the increased difficulty in understanding TypeScript generics for users of almost every other popular language?

        • vips7L 84 days ago
          It's definitely the right choice for Typescript. You could have gone the Scala route and used [] for generics, but that is so heavily used in ts/js as arrays it would not have made any sense.
    • felixfbecker 84 days ago
      I think the kind of teams that always stay on top of the latest TypeScript version and use the latest language features are also more likely to always stay on top of the latest Node versions. In my experience TypeScript upgrades actually more often need migrations/fixes for new errors than Node upgrades. Teams that don't care about latest V8 and Node features and always stay on LTS probably also care less about the latest and greatest TypeScript features.
      • jitl 84 days ago
        I work on Notion, a large app that’s both client & server Typescript.

        We find Typescript much easier to upgrade than Node. New Node versions change performance characteristics of the app at runtime, and sometimes regress complex features like async hooks or have memory leaks. We tend to have multi-week rollout plans for new Node versions with side-by-side deploys to check metrics.

        Typescript on the other hand someone can upgrade in a single PR, and once you get the types to check, you’re done and you merge. We just got to the latest TS version last week.

    • noname120 84 days ago
      It's already the case for ECMAScript, and I don't see why TypeScript should be treated differently when Node.js has to transpile it to JavaScript and, among other things, ensure that there are no regressions that would break existing code.

      Unlike Python typing, it's not only type erasure: enums, namespaces, decorators, access modifiers, helper functions and so on need to be transformed into their JavaScript equivalents.
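
      For example, a parameter property (one of the access-modifier cases) implies real code rather than just an annotation; a minimal illustrative sketch:

          class User {
            // `private name: string` is shorthand that implies `this.name = name`
            // in the constructor body, so it cannot be handled by type erasure alone.
            constructor(private name: string) {}

            greet(): string {
              return `hi ${this.name}`;
            }
          }

          console.log(new User("Ada").greet()); // "hi Ada"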

      • shortrounddev2 83 days ago
        and there's a slight runtime difference between typescript classes and javascript classes
    • Tade0 84 days ago
      I'm not worried about that too much to be honest.

      To me beyond v4.4 or so, when it started being possible to create crazy recursive dependent types (the syntax was there since ~4.1 - it's just that the compiler complained), there weren't a lot of groundbreaking new features being added, so unless an external library requires a specific TS version to parse its type declarations, it doesn't change much.

      • 3np 84 days ago
        > crazy recursive dependent types

        Some edge-cases involving those have bugfixes and ergonomics improvements I've run into on 5.x.

        • Tade0 83 days ago
          That's great to know, thank you.

          That being said I regret every use of `infer` in application code.

    • getcrunk 84 days ago
      Do you really need to update all the time? Are the new features always that immediately important?
    • spoiler 84 days ago
      It's possible that the internal SWC version will be versioned alongside Node, meaning TS syntax support won't drift. Or am I missing something?
      • thecopy 84 days ago
        TypeScript evolves independently of Node, and the syntax you can use depends on your `typescript` version in the `package.json`
        • re-thc 84 days ago
          Not if SWC or your tooling doesn't support it.
      • satanacchio 84 days ago
        yes, but it can also be upgraded separately as an npm package
    • pornel 83 days ago
      Does TypeScript have some disambiguating syntax like turbofish?
    • js4ever 84 days ago
      If only typescript could avoid changing syntax every 2 weeks.
  • pansa2 85 days ago
    If Node.js can run TypeScript files directly, then the TypeScript compiler won't need to strip types and convert to JavaScript - it could be used solely as a type checker. This would be similar to the situation in Python, where type checkers check types and leave them intact, and the Python interpreter just ignores them.

    It's interesting, though, that this approach in Python has led to several (4?) different popular type checkers, which AFAIK all use the same type hint syntax but apply different semantics. However for JavaScript, TypeScript seems to have become the one-and-only popular type checker.

    In Python, I've even heard of people writing types in source code but never checking them, essentially using type hints as a more convenient syntax for comments. Support for ignoring types in Node.js would make that approach possible in JavaScript as well.

    • winter_blue 85 days ago
      Flow (by Facebook) used to be fairly significant in the JavaScript world several years ago, but right now it's somewhat clear that TypeScript has won rather handily.
      • mattnewton 84 days ago
        Before that there was the Closure Compiler (Google), which had type annotations in comments. The annotation syntax in comments was a little clunky, but overall that project was ahead of its time. Now I believe even inside Google that has been transpiled to TypeScript (or TypeScript is being transpiled to Closure, I can't remember which - the point is that the TypeScript interface is what people are using for new code).
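
        For reference, the comment-based annotations looked roughly like this (a sketch from memory; exact syntax varied across Closure versions):

            /**
             * @param {string} name
             * @param {number=} times Optional repeat count.
             * @return {!Array<string>}
             */
            function greetings(name, times) {
              const out = [];
              for (let i = 0; i < (times || 1); i++) {
                out.push('Hello, ' + name);
              }
              return out;
            }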
        • miki123211 84 days ago
          Closure was also interesting because it integrated type checking and minification, which made minification significantly more useful.

          With normal Javascript and typescript, you can't minify property names, so `foo.bar.doSomethingVeryComplicated()` can only be turned into `a.bar.doSomethingVeryComplicated()`, not `a.b.c()`, like with Closure. This is because objects can be indexed by strings. Something like `foo.bar[function]()` is perfectly valid JS, where the value of `function` might come from the user.

          A minifier can't guarantee that such expressions won't be used, so it cannot optimize property accesses. Because Closure was a type checker and a minifier at the same time, it could minify the properties declared as private, while leaving the public ones intact.

          • kevincox 84 days ago
            > it could minify the properties declared as private, while leaving the public ones intact.

            I don't think it ever actually did this. It renamed all properties (you could use the index syntax to avoid this) and just used a global mapping to ensure that every source property name was consistently renamed (no matter what type it was on). I don't think type information was ever actually used in minification.

            So if you had two independent types that had a `getName` function the compiler would always give them the same minified name even though in theory their names could be different because they were fully independent types. The mapping was always bijective. This is suboptimal because short names like `a` could only be used for a single source name, leading to higher entropy names overall. Additionally names from the JS runtime were globally excluded from renaming. So any `.length` property would never be renamed in case it was `[].length`.

          • mananaysiempre 84 days ago
            > A minifier can't guarantee that such expressions won't be used, so it cannot optimize property accesses.

            Given TypeScript’s type system is unsound, neither could it even if it tried, right? I guess Flow could, but well, here we are.

            • epolanski 84 days ago
              What do you mean by unsound, exactly?

              I'm asking because there's no accepted definition of what an unsound type system is.

              What I often see is that the word unsound is used to mean that a type system can accept types different from what has been declared, and in that case there's nothing unsound about ts since it won't allow you to do so.

              • cstrahan 84 days ago
                > and in that case there's nothing unsound about ts since it won't allow you to do so

                Consider this example (https://www.typescriptlang.org/play/?ssl=10&ssc=1&pln=1&pc=1...):

                  function messUpTheArray(arr: Array<string | number>): void {
                      arr.push(3);
                  }
                  
                  const strings: Array<string> = ['foo', 'bar'];
                  messUpTheArray(strings);
                  
                  const s: string = strings[2];
                  console.log(s.toLowerCase())
                  
                Could you explain how this isn't the type system accepting types "different to what has been declared"? Kinda looks like TypeScript is happy to type check this, despite `s` being a `number` at runtime.
                • epolanski 84 days ago
                  That's a good example, albeit quite a far-fetched one.

                  In Haskell land, where the type system is considered sound you have `head` functions of type `List a -> a` that are unsound too, because the list might be empty.

                  • crdrost 84 days ago
                    That option also exists: you can just leave out the `messUpTheArray` lines and you get an error about how `undefined` also doesn't have a `.toLowerCase()` method.

                    However this problem as stated is slightly different and has to do with a failure of OOP/subtyping to actually intermingle with our expectations of covariance.

                    So to just use classic "animal metaphor" OOP, if you have an Animal class with Dog and Cat subclasses, and you create an IORef<Cat>, a cell that can contain a cat, you would like to provide that to an IORef<Animal> function because you want to think of the type as covariant: Cat is a subtype of Animal, F<Cat> should be a subtype of F<Animal>. The problem is that this function now has the blessing of the type system to store a Dog in the cell, which can be observed by the parts that still consider this an IORef<Cat>.

                    Put slightly differently, in OOP, the methods of IORef<Cat> all accept an implicit IORef<Cat> called `this`, if those methods are part of what define an IORef<x> then an IORef<x> is necessarily invariant, not covariant, in <x>. And then you can't assume subtyping. So to be sound a subtype system would presumably have to actually mark contra/covariance around everything, and TypeScript very intentionally documents that they don't do this and are just trying to make a "best effort" pass because JavaScript has 0 types, and crappy types are better than no types, and we can't wait for perfect types to replace the crappy types.
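
                    A minimal TypeScript sketch of that problem (all names are illustrative):

                        class Animal { }
                        class Cat extends Animal { meow() { console.log("meow"); } }
                        class Dog extends Animal { }

                        interface Ref<T> { get(): T; set(value: T): void; }

                        function makeRef<T>(value: T): Ref<T> {
                          return { get: () => value, set: (v) => { value = v; } };
                        }

                        const catRef: Ref<Cat> = makeRef(new Cat());

                        // Accepted: method parameters are checked bivariantly, so Ref<Cat>
                        // is treated as assignable to Ref<Animal>, even though it is unsafe.
                        const animalRef: Ref<Animal> = catRef;
                        animalRef.set(new Dog());

                        catRef.get().meow(); // type-checks, but throws at runtime: the value is a Dog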

                  • cstrahan 84 days ago
                    > In Haskell land, where the type system is considered sound you have `head` functions of type `List a -> a` that are unsound too, because the list might be empty.

                    Haskell's `head` is not an example of the type system being unsound (I stress this point because we've been talking about type system soundness, not something-else-soundness).

                    From the view of the type system, `head` is perfectly sound: if the list is empty, the resulting value is ⊥ ("bottom"). And ⊥ is an inhabitant of every type. Therefore, `head` returning ⊥ when given an empty list is perfectly fine. When you force ⊥ (i.e. use it any way whatsoever), an exception is thrown. See https://wiki.haskell.org/Bottom

                    This is very much not the same thing (or remotely analogous) to what we have in my TypeScript example. There, the code fails at runtime when I attempt to call `toLowerCase`, yes; what's worse is the slightly different scenario where we succeed in calling something we shouldn't:

                      class Person {
                        name: string;
                       
                        constructor(name: string) {
                          this.name = name;
                        }
                       
                        kill() {
                          console.log("Killing: " + this.name);
                        }
                      }
                      
                      class Murderer extends Person { }
                      
                      class Innocent extends Person { }
                      
                      function populatePeopleFromDatabase(people: Array<Innocent | Murderer>): void {
                          // imagine this came from a real SQL query
                          people.push(new Innocent("Bob"));
                      }
                      
                      
                      function populateMurderersFromDatabase(people: Array<Murderer>): void {
                          // TODO(Aleck): come back and replace this with a query that only selects murderers.
                          //              i wanted to get the rest of the code in place, and this type checks,
                          //              so I'll punt on this for now and come back later when I wrap my head
                          //              around the proper SQL.
                          //              we're not actually using this anywhere just yet, so no biggie ¯\_(ツ)_/¯
                          populatePeopleFromDatabase(people);
                      }
                      
                      // ... some time later, Bob comes along and implements the murderer execution logic:
                      const murderers: Array<Murderer> = [];
                      populateMurderersFromDatabase(murderers);
                      // Bob is about to have a really shitty day:
                      murderers.forEach((murderer) => murderer.kill());
                    
                    It is not possible to write an analogous example in Haskell using `head`.
              • sharlos201068 84 days ago
                That’s not correct, there are several ways the actual type of a value differs from what typescript thinks it is. But soundness isn’t a goal of typescript.
                • WorldMaker 84 days ago
                  It's maybe useful to note in this discussion for some that "soundness" of a type system is a bit of technical/theoretical jargon that in some cases has specific mathematical definitions and so "unsound" often sounds harsher (connotatively) than it means. The vast majority of type systems are "unsound" for very pragmatic reasons. Developers don't often care to work in a "sound" type system. Some of the "most sound" type systems we've collectively managed to build are in things like theorem provers and type assertion systems that some of us don't always even consider useful for "real" software development.

                  Typescript is a bit more unsound than most because of the escape hatch `any` and because of the (intentional) disconnect between compiler and runtime environment. Even though "unsound" sounds like a bad thing to be, it's a big part of why Typescript is so successful.

                  • price 83 days ago
                    There's nothing arcane or particularly theoretical about soundness. It means that if you have an expression of some type, and at runtime the expression evaluates to a value, the value will always be of that type.

                    For example if you have a Java expression of type MyClass, and it gets evaluated, then it must either throw (so that it doesn't produce any value) or produce a value of type MyClass: either an instance of MyClass, or of one of its subclasses, or null. It will never produce an instance of some other class, or an int, or anything else that isn't a valid value for the type MyClass.

                    In addition to helping human readers reason about the code, a sound type system is a big deal for a compiler: it makes it possible to compile the code AOT to fast native code, without inserting a bunch of runtime checks and dynamic dispatching to handle the fact that inevitably some of the types (but you don't know which) are wrong.

                    The compiler implications are what motivated the Dart language's developers to migrate from an unsound to a sound type system a few years ago: https://dart.dev/language/type-system#the-benefits-of-soundn... so that they could compile Flutter apps AOT. This didn't require anyone to make their code resemble what you'd do in a theorem prover — it just means that, for example, all casts are checked, so that they throw if the value doesn't turn out to have the type the cast wants to return.

                    TypeScript is unsound because when you have an expression with a type, that tells you nothing at all for sure about what the value of the expression can be — it might be of that type, or it might be anything else. It's still valuable because you can maintain a codebase where the types are mostly accurate, and that's enough to help a lot in reading and maintaining the code.

                    • afiori 83 days ago
                      The key factor is that typescript is not a language, it is a notation system for a completely independent language.

                      The purpose of typescript is to usefully type as much javascript as possible; to do both this and have a sound type system would require changing javascript.

                      • price 83 days ago
                        Definitely to get the most ergonomic programming experience, while also having a sound type system, you'd need to change some of the semantics of the language.

                        A prime example is that if you index into an array of type `T[]`, JS semantics mean the value you get back could be undefined as well as a `T`. So to describe existing JS semantics in a sound type system, the type would have to be `T | undefined`, which would be a big pain. Alternatively you could make the type `T` and have that be sound, but only if you make the runtime semantics be that an out-of-bounds access throws instead of returning undefined.
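
                        A minimal sketch of that trade-off; the `noUncheckedIndexedAccess` compiler option opts into the sound-but-noisier typing:

                            const names: string[] = ["ada", "grace"];

                            const third = names[2];       // string | undefined with the option on; string without it
                            // console.log(third.length); // an error with the option on: 'third' is possibly undefined
                            console.log(third?.length);   // fine either way; prints undefined at runtime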

                  • afiori 83 days ago
                    any and unknown are perfectly sound; if they were the only types then soundness would be automatic.

                    The problem is that you can arbitrarily narrow types (and any can narrow to any type) eg: https://www.typescriptlang.org/play/?#code/DYUwLgBAzg9grgOwC...

                    • price 83 days ago
                      That's true but misleading: if "any" and "unknown" were the only types, then "any" would be indistinguishable from "unknown" and you'd really have just the one type. Which makes the type system sound because it doesn't say anything.

                      If your type system has at least two types that aren't the same as each other, then adding "any" makes it unsound right there. The essence of "any" is that it lets you take a value of one type and pretend it's of any other type. Which is to say that "any" is basically the purified form of unsoundness.

                      • afiori 83 days ago
                        A typing of any cannot be unsound because it is always correct; narrowing any can be unsound.
              • mananaysiempre 84 days ago
                > there's no accepted definition of what an unsound type system is

                Huh?

                The cheeky answer would be that the definition here is the one the TypeScript documentation itself uses[1].

                The useful answer is that there’s only one general definition that I’ve ever encountered: a type system is sound if no well-typed program encounters type errors during its execution. Importantly, that’s not a statement about the (static) type system in isolation: it’s tied to the language’s dynamic semantics.

                The tricky part, of course, is defining “type error”. In theoretical contexts, it’s common to just not define any evaluation rules at all for outwardly ill-typed things (negating a list, say), thus the common phrasing that no well-typed program must get stuck (unable to evaluate further). In practical statically-typed languages, there are on occasion cases that are defined not to be type errors essentially by fiat, such as null pointer accesses in Java, or escape hatches, such as unsafeCoerce in practical implementations of Haskell.

                Of course, ECMAScript just defines behaviour for everything (except violating invariants in proxy handlers, in which case, lol, good luck), so arguably every static type system for it is sound, even one that allows var foo: string = 42. Obviously that’s not a helpful point of view. I think it’s reasonable to say that whatever we count as erroneous situations must at the very least include all occurrences of ReferenceError and TypeError.

                TypeScript prevents most of them, which is good enough for its linting use case, when the worst possible result is that a buggy program crashes. It would definitely not be good enough for Closure Compiler’s minification use case, when the worst possible result is that a correct program gets silently miscompiled (misminified?).

                [1] https://www.typescriptlang.org/docs/handbook/type-compatibil...

            • RyanCavanaugh 84 days ago
              Flow's type system isn't sound either FWIW
              • price 83 days ago
                Yeah, Flow had the ambition to be sound but has never accomplished it.

                If you read the Flow codebase and its Git history, you can see that it's not for lack of trying, either — every couple of years there's an ambitious new engineer with a new plan for how to make it happen. But it's a real tough migration problem — it only works if they can provide a credible, appealing migration path to the other engineers across Facebook/Meta's giant JS codebase. Starting from a language like JS with all the dynamic tricks people use there, that's a tough job.

                (And naturally it'd be even harder if they were trying to get any wider community to migrate, outside their own employer.)

                • RyanCavanaugh 83 days ago
                  Flow doesn't even check that array access is in-bounds, in contrast to TypeScript with noUncheckedIndexedAccess on. They're clearly equally willing to make a few trade-offs for developer convenience (a position I entirely agree with FWIW)
                  • price 83 days ago
                    Neat example, thanks! I hadn't known TS had that option. Array access was actually exactly the example that came to mind for me in a related discussion: https://news.ycombinator.com/item?id=41076755

                    I wonder how widely used that option is. As I said in that other comment, it feels to me like the sort of thing that would produce errors all over the place, and would therefore be a real pain to migrate to. (It'd be just fine if the language semantics were that out-of-bounds array access throws, but that's not the semantics JS has.) I don't have a real empirical sense of that, though.

            • afavour 84 days ago
              Theoretically TS could… until it encounters an ‘any’ type in that code path, then it would have to give up.

              But there are TSconfig options to ensure no use of any so with the right level of strictness it could happen.

          • wiktor-k 84 days ago
            > Something like `foo.bar[function]()` is perfectly valid JS,

            A minor thing, but `function` is a keyword in JS, so technically it's not "perfectly valid JS".

            • mkesper 84 days ago
              Oh boy, just think functionName and it fits.
          • AprilArcus 84 days ago
            Of course, this created an interoperability nightmare with third-party libraries, which irrevocably forked Google's whole JS ecosystem from the community's 20 years ago and turned their codebases into a miserable backwater.
          • antifa 83 days ago
            Terser has a mangle-props feature. Mangling props by exact name or pattern has worked for me, though it might affect library objects or browser built-ins that I definitely don't want it to touch, but I've not gotten the "mangle all props of this object" version of it to work.
        • btown 84 days ago
          Google’s Closure Library is fascinating too. It’s being retired, but if you want to build a rich text interface for email authoring that truly feels like Gmail, warts and all, you can just use a pre-compiled version of the library and follow https://github.com/google/closure-library/blob/master/closur... within a more modern codebase!
        • rty32 84 days ago
          Closure is almost a forgotten child of Google now. Does not even fully support ES2022 as of today. We are working hard to get rid of it completely. Surprise, lots of important projects still rely on it today.
          • jazzypants 84 days ago
            This is true. For instance, React still uses Closure compiler in their build process.
        • LoganDark 84 days ago
          Oh, Closure Compiler is such a throwback. I still remember staring at the project page on Google Code. Isn't it like two decades old or even older by this point? Is it still alive?
          • rty32 84 days ago
            This can give you some hints of the current status of closure compiler:

            https://github.com/google/closure-compiler/issues/2731

            I happen to know this because we have some old projects that depend on this and are working hard to get rid of the dependency.

            I wish Google would either update it or just mark the whole thing deprecated -- the world has already moved on anyway. Relating this to Google's recent cost cutting, and seeing some of Google's other open source projects more or less getting abandoned, I have to say that today's Google is definitely not the same company from two decades ago.

          • mdhb 84 days ago
            The compiler itself lives on but it works with TypeScript now rather than the JSDoc comments style approach which is officially EOL AFAIK.
          • flohofwoe 84 days ago
            Closure is still used in Emscripten to optimize the generated Javascript shim file.
      • sesm 84 days ago
        There was no real competition, Flow was a practical internal tool with 0 marketing budget. Typescript is typical MS 3E strategy with a huge budget. Needless to say, Flow is much more practical and less intrusive, but marketing budget captured all the newbie devs.
        • ignoramous 84 days ago
          > There was no real competition

          There was: Anders Hejlsberg and Lars Bak: TypeScript, JavaScript, and Dart https://www.youtube.com/watch?v=5AqbCQuK0gM (2013).

          Summary: https://g.co/gemini/share/a60c3897bae1 / https://archive.is/qJ1wA

        • treflop 84 days ago
          Have to disagree. I tried Flow irrespective of marketing and didn’t think it was polished. Kept running into type situations that the language didn’t support well. Kept bugging out in my IDE. Had no elegance.
          • spoiler 83 days ago
            When I last used it, as a type system it was much better than TypeScript. A lot of flow features now exist in TypeScript though too.

            One big annoyance with Flow I had is like you said: unpolished tooling. Another was frequent breaking changes (I don't hold it against them too much, it was 0.x software after all)

            Also because features diverged, you had to maintain type defs for multiple versions of flow for multiple library versions. And then at one point, they also decided to convert internal errors to any types instead of displaying the error. That was the last straw for me, especially since I maintained a few flow type defs. I spent _so_ much of _my_ time just on type def maintenance for open source libraries already, and with the any decay I felt like I was flying blind too. So I just switched to TS with its inferior type system: it was good enough and others maintained library typedefs for me. But now the type systems are much more closely aligned (unless flow drifted), so switching to TS paid off in the end.

        • com2kid 84 days ago
          TypeScript was really really easy to get started with back in the day. It allows for incremental correctness, has good docs, and good tooling. On top of that a lot of beginner React tutorials started out with TypeScript, which onboarded a lot of new engineers to the TS ecosystem, and got them used to the niceties of TS (e.g. import syntax).
      • simplify 84 days ago
        Facebook never gave Flow enough resources, whereas Microsoft has had 10+ devs on TypeScript for a long time.
        • sesm 84 days ago
          Because Flow is an actual developer tool, not a rent-seeking landgrab with a marketing budget.
          • 9dev 84 days ago
            I don’t know what axe you have to grind, but TypeScript is firmly in the hands of the community now. There’s not much Microsoft could do to change that. In what way would it be rent-seeking?
          • recursive 84 days ago
            If you're paying rent for typescript, you're doing it wrong.
      • hajile 84 days ago
        Flow tries to be sound and that makes it infinitely better than TS where the creators openly threw the idea of soundness out the window from the very beginning.
        • dwb 84 days ago
          This is a point in Flow's favour. However! Seven years ago or so, when TypeScript was quite young and seemed inferior to Flow in almost all respects, I chose Flow for a large project. Since then, I spent inordinate amounts of time updating our code for the latest breaking Flow version, until one came along that would have taken too long to update for, so we just stayed on that one. We migrated to TypeScript a little while back and the practical effect has been much more, and more effective, type checking through more coverage and support. TypeScript may be unsound, but it works better overall. We turn on the vast majority of the safety features to mitigate the unsoundness. And it's developed by a team that are beholden to a large and vibrant user base, so any changes are generally well-managed. There's no contest, really.
          • hajile 83 days ago
            Turning on EVERY safety feature won't mitigate the unsoundness because the problem goes beyond stuff like `any`.
            • dwb 83 days ago
              I know, but it just doesn't matter enough. Believe me, I'm signed up to the idea of theoretical rigour, the argument for soundness is part of what originally won me over (along with previous good experiences with Flow on a smaller project, and the support for gradual adoption). I will continue to be drawn to languages and tools that have a strong theoretical foundation. But in this particular case, today, when comparing these particular projects in the large JavaScript codebase I am talking about, TypeScript still wins by some distance. I promise that it has caught way more errors and been more generally helpful in its language server abilities than Flow ever was. Maybe Flow has caught up since then in its core functionality, I haven't been keeping track, but there would still be the wide disparity in community support which has serious implications for developer education, availability of library type definitions, etc.
              • hajile 83 days ago
                I think the real answer is adding actually sound types to JS itself.

                One of the biggest revolutions in JS JITs was the inline cache (IC). It allows fast property lookups and specialized functions (which would be too expensive otherwise). These in turn enable all the optimizations of the higher-tier JITs.

                The biggest problem of Flow and TS is that they encourage you to write slow code. The second you add a generic to your function, you are effectively agreeing that it is going to accept more than 4 types. This means your function is megamorphic. No more IC. No more optimization (even worse, if it takes some time to hit those 5+ types, you get the dreaded deoptimization). In theory, they could detect that your function has 80 possible variations and create specialized, monomorphic functions for each one, but that's way too much code to send over the wire. That kind of specialization MUST be done in the JIT.

                If you bake the types into the language via a `"use type"` directive, this gives a LOT of potential. First, you can add an actually sound type system. Second, just as `"use strict"` eliminated a lot of the really bad parts of JS, you can eliminate unwanted type coercion and prevent the really dynamic things that block optimization. Because the JIT can use these types, it can eliminate the need for IC altogether in typed functions. It can still detect the most-used type variants of a function, make specialized versions, then use the types to directly link those call sites to the fast version for even more optimization.

                I use TS because of its ubiquity, but I think there's the possibility for a future where a system a little more like Flow gets baked into the language.

                • sprkv5 83 days ago
                  > I use TS because of its ubiquity, but I think there's the possibility for a future where a system a little more like Flow gets baked into the language.

                  Have you looked into ReScript? It is basically a sound type system + JavaScript-like syntax. It inherits the type system from OCaml. You might like it. They recently released version 11.

                • dwb 83 days ago
                  Maybe! I see where you're coming from. That sounds like a long and painful road, still, though, from what is still a very dynamic language. Do you have a rough idea of how much more time/space-efficient a typical JavaScript program could be through this?
                  • hajile 83 days ago
                    It's an interesting question.

                    JS uses a JIT while Ocaml is AOT-compiled, which is generally an advantage for Ocaml.

                    Ocaml only compiles once while JS compiles every time it runs. This means that JS is a lot more selective about its compilation, but the hot code could be every bit as fast as Ocaml. On the flip side, Ocaml is fast because it compiles method-at-a-time, and an SML compiler like MLton, which does a slow whole-program pass, can generate significantly faster code (despite being a part-time hobby project for a few academics).

                    The big difference is money. Ocaml has some funding, but nothing compared to JS. It's hard to believe, but handling strings in JS is probably faster than what most devs could do themselves in C/C++. It's not because JS is inherently faster. It's because those bits are native and have had countless man-years poured into making them fast. That said, even the JIT itself is top-tier and raw integer code is only 20-50% slower than C (excluding any SIMD optimizations).

                    I think the upper limit for a typed JS could be about as fast and maybe a little faster than Ocaml on the JIT and maybe even a little faster with a more restrictive subset compiling to WASM.

        • ahuth 84 days ago
          TS made the choice to be “just JS” + types, and lean into JS-isms.

          Both choices are reasonable ones to make. Flow has some really cool stuff, and works great for a lot of people.

          There's no denying, though, that TS has done something right (even if you personally dislike it).

        • idlephysicist 84 days ago
          Can you explain what you mean when you say "to be sound"?
          • pansa2 84 days ago
            Here's an example of TypeScript failing to be sound - it should give a type error but it doesn't. I believe Flow does indeed give a type error in this situation:

            https://news.ycombinator.com/item?id=41069695

            • recursive 84 days ago
              You don't even have to go that far to find unsoundness in Flow.

                  const arr = ["abcd"];
                  const str = arr[1];
                  const num = str.length; // this throws
                  console.log(num);
              
              For me, typescript is a pretty good balance.
              • hajile 84 days ago
                I think this is not a very good example. Not only does it also throw in TS, but it even throws in Haskell, which is pretty much the poster boy for sound type systems.

                This isn't a type error unless your type system is also encoding lengths, but most type systems aren't going to do that and will leave it to the runtime (I suspect the halting problem makes a general solution impossible).

                    main = putStrLn (["a", "b", "c"]!!4)
                • recursive 84 days ago
                  Yes, it throws in TypeScript. TypeScript isn't the language chasing soundness at any cost. This just illustrates the futility of chasing soundness.

                  Soundness is good as long as the type-checking benefit is worth the cost of the constraints in the language. If the poster child for soundness isn't able to account for this very simple and common scenario, then nothing will actually be able to deliver full soundness.

                  It's just a question of how far down the spectrum you're willing to go. Pure js is too unsound for my taste. Haskell is too constrained for my taste. You might come to a different conclusion, but for me, typescript is a good balance.

                  • hajile 83 days ago
                    My balance point is StandardML. SML and TS both have structural typing (I believe this is why people find TS to be more ergonomic). SML has an actually sound type system (I believe there is an unsoundness related to assigning a function to a ref, but I've never even seen someone attempt to do that), but allows mutation, isn't lazy, and allows side effects.

                    Put another way, SML is all the best parts of TS, but with more soundness, none of the worst parts of TS, and none of the many TS edge cases baked into the language because they keep squashing symptoms of unsoundness or adding weird JS edge cases that you shouldn't be doing anyway.

                    • recursive 83 days ago
                      Personally, I think javascript kind of sucks, but it's approximately* the only choice for targeting browsers. If it wasn't for this fact, I probably never would have touched TS. SML sounds pretty good.
              • fabiospampinato 84 days ago
                Wait, so Flow is not actually sound and their website is lying? Or do they have some "technically correct" definition of "sound" that takes stuff like that into account?
                • price 83 days ago
                  Flow is not sound. They have the ambition of trying to be sound (which I appreciate), but they've never accomplished it.

                  I went looking for where on their website they claim to be sound. There's definitely some misleading wording here: https://flow.org/en/docs/lang/types-and-expressions/#toc-sou... but if you read the whole section, it ends up also acknowledging that it's not entirely sound.

                • recursive 84 days ago
                  I have no idea about the lawyerly technicalities, but you can try it yourself to verify what I'm saying.

                  https://flow.org/try/

                  Compare these two programs.

                      const arr = ["abcd"];
                      const str = arr[1];
                      const num = str.length; // this throws at runtime
                  
                      const arr = [new Date];
                      const dt = arr[1];
                      const num = dt.length; // fails to type check
                • hajile 84 days ago
                  Even Haskell will generate a runtime error for an out-of-bounds index:

                      main = putStrLn (["a", "b", "c"]!!4)
                  • recursive 84 days ago
                    This is different. Neither Flow, TypeScript, nor JavaScript generates a runtime error for an out-of-bounds index. It's explicitly allowed by the language.

                    The result of an OOB access of an array is specified to be `undefined`. The throw only happens later, when the value is treated as the wrong type.

                    I don't consider a runtime error on OOB array access to be a failure of the type system. But in JavaScript, OOB access is explicitly allowed by the specification and returns `undefined`. It's a failure of any type system that doesn't account for that specified behavior in the language.

                    • hajile 83 days ago
                      > It's explicitly allowed by the language.

                      This is like arguing that a null exception is fine because it's allowed by the language. If you get `undefined` when you expect another type, most future interactions are guaranteed to have JS throw because of the JS equivalent of a null pointer exception. They are technically different because a dynamic language runtime can prevent a total crash, but the effect on your web app is going to be essentially the same.

                          [1,2,3][4].toFixed(2)
                      
                      > It's a failure of any type system that fails to account for this specified behavior in the language.

                      Haskell has the ability to handle the error.

                      How do you recommend a compiler detect out-of-bounds access at compile time? It can certainly do this for our trivial example, but that example will also be immediately evident the first time you run the code, so it's probably not worth the effort. What about the infinite number of more subtle variants?

                      • recursive 83 days ago
                        > How do you recommend a compiler to detect out-of-bounds at compile time?

                        I wouldn't recommend that they do it at all. Full soundness is not my thing. But... if Flow wanted to do it, it would have to change the type of indexing into an `Element[]` with a `number` (i.e. `(Element[])[number]`) so that a read yields `Element | undefined` instead of `Element`.

          • price 83 days ago
            When a language's type system is sound, that means that if you have an expression with type "string", then when you run the program the expression's value will only ever be a string and never some other sort of value.

            Or stated more abstractly: if an expression has type T, and at runtime the expression evaluates to a value v, then v has type T.

            The language can still have runtime errors, like if you try to access an array out of bounds. The key is that such operations have to give an error — like by throwing, so that the expression doesn't evaluate to any value at all — rather than returning a value that doesn't fit the type.

            Both TypeScript and Flow are unsound, because an expression with type "string" can always turn out to evaluate to null or a number or an object or anything else. Flow had the ambition to be sound, which is honorable but they never accomplished it. TypeScript announced up front that they didn't care about soundness: https://www.typescriptlang.org/docs/handbook/type-compatibil...
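
            To make that concrete, here's a minimal TypeScript sketch (array covariance is one of the well-known holes):

                const strs: string[] = ["a"];
                const mixed: (string | number)[] = strs; // allowed: arrays are treated covariantly
                mixed.push(1);                           // fine for (string | number)[]
                const s: string = strs[1];               // declared string, actually the number 1
                s.toUpperCase();                         // type checks, but throws at runtime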

            Soundness is valuable because it makes it possible to look at the types and reason about the program using them. An unsound type-checker like TypeScript or Flow can still be very useful to human readers if most of the types in a codebase are accurate, but you always have to keep that asterisk in the back of your head.

            One very concrete consequence of soundness is that it makes it possible to compile the code to fast native code. That's what motivated Dart a few years ago to migrate from an unsound type system to a sound one: https://dart.dev/language/type-system so that it could AOT-compile Flutter apps for speed.

        • machiaweliczny 84 days ago
          But in practice it was crap.
        • draw_down 84 days ago
          Maybe FB should have tried putting more than like 4 people on the project then. (Yes I met them all once.)
    • Doxin 84 days ago
      > In Python, I've even heard of people writing types in source code but never checking them, essentially using type hints as a more convenient syntax for comments.

      Note that there's IDEs that'll use type hints to improve autocomplete and the like too, so even when not checking types it can make sense to add them in some places.

    • dobladov 84 days ago
      You can have this now by adding types with JSDoc and validating them with TypeScript without compiling: you get faster builds and code that works everywhere, with no magic and nothing to strip other than comments.

      The biggest pain point of using JSDoc, at least for me, was the import syntax; this has changed since TypeScript 5.5 and is no longer an issue.
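
      A minimal sketch of what that looks like (the names are made up; it assumes `allowJs`/`checkJs` in tsconfig.json, or a `// @ts-check` pragma at the top of the file):

          // @ts-check

          /** @typedef {{ id: number, name: string }} User */

          /**
           * @param {User} user
           * @returns {string}
           */
          function greet(user) {
            return `Hello, ${user.name}`;
          }

          greet({ id: 1, name: "Ada" });   // fine
          // greet({ id: 1 });             // tsc (with checkJs) would flag the missing name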

      • murmansk 84 days ago
        [flagged]
        • afavour 84 days ago
          I’ve had a lot of success combining JSDoc JS with .d.ts files. It’s kind of a Frankenstein philosophically (one half using TS and one half not) but the actual experience is great: still a very robust type system but no transpiling required.

          In a world where ES modules are natively supported everywhere it’s a joy to have a project “just work” with zero build steps. It’s not worth it in a large project where you’re already using five other plugins in your build script anyway but for small projects it’s a breath of fresh air.
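
          Roughly, the pattern looks like this (file and type names are purely illustrative):

              // types.d.ts (complex types live in a plain declaration file)
              export interface Config {
                retries: number;
                endpoints: Record<string, string>;
              }

              // main.js (plain JavaScript, pulls the type in via JSDoc)
              /** @type {import('./types').Config} */
              const config = { retries: 3, endpoints: { api: "/api" } };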

          • flanbiscuit 84 days ago
            I do this as well. JSDoc is great for simple definitions, but as soon as you want to do something more complicated (generics, operators, access types, etc.) you get stuck. The .d.ts files are ignored because you're only importing them within JSDoc comments.
        • dobladov 84 days ago
          You should write complex types in interface files where they belong, and there's full TypeScript support for that.

          I use this approach professionally in teams with many developers, and it works better for us than native TS. Honestly give it a try, I was skeptical at first.

          • rty32 84 days ago
            In general JSDoc is just much more verbose and has more friction, even outside complex types. I recently finished a small (20 files / 3000 lines), strictly typed JS project using full JSDoc, and I really miss the experience of using the real TypeScript syntax. Pain points: annotating function parameter types (especially anonymous functions), intermediate variable types, and automatic type-only imports are the ones I can remember. Yes, you can get 99% there with JSDoc and .d.ts files, but it's painful.
            • dobladov 84 days ago
              I use snippets to write those; yes, it's more verbose, there's no denying that.

              For me, the advantages of just having JS files and not worrying about more complex source maps, build files, etc. definitely make it worth it.

              • FractalHQ 84 days ago
                Source maps and build files are automatically generated when bundling, which you need to do with or without TypeScript… so this argument always confuses me. There is no tangible downside in my experience; either way it's just typing "pnpm build".
        • tracker1 84 days ago
          JSDoc is also helpful in dealing with multiple argument option types for a function or constructor, which won't show in TS alone.
        • Waterluvian 84 days ago
          JSDoc is honestly fine for simple and smaller projects. But yeah, it's definitely not nearly as expressive, nor anywhere near as succinct.
          • yard2010 84 days ago
            JSDoc is for docs, TypeScript is a static type checker. How can these tools be used interchangeably?
            • styfle 83 days ago
              You can configure tsconfig.json to read JSDoc and error on invalid types so that you effectively get the same behavior as writing typescript.
        • adhamsalama 84 days ago
          You can write TypeScript types in JSDoc.
          • sureIy 84 days ago
            You can’t write complex TypeScript types in JSDoc, which is what GP said.

            The moment you need to declare or extend a type you’re done, you have to do so in a separate .ts file. It would be possible to do so and import it in JSDoc, but as mentioned before it’s a huge PITA on top of the PITA that writing types can already be (e.g. function/callbacks/generics)

        • meiraleal 84 days ago
          [flagged]
      • eezing 83 days ago
        Good for annotating js files here and there, but lousy dx.
      • itsmeste 84 days ago
        I strongly agree, but JSDoc isn't "the cool thing". So it's left to be used by us, who care (and read their docs).
        • llimllib 84 days ago
          The auto-export of declared types was what killed it for me
      • epolanski 84 days ago
        JSDoc absolutely does not scale and allows for very limited type programming.

        It's fine on toy projects and, I would say, for the 99% of users that don't even know what a mapped or intersection type is.

        • mablopoule 84 days ago
          JSDoc does not scale, but some projects are just better when they aren't scaled.

          JSDoc is indeed fine on toy project, or in fact any project (even prod-ready ones) that doesn't warrant the trouble of adding NPM packages and transpilation steps.

          Although they are rare, those kinds of small, feature-complete codebases do exist.

    • black3r 84 days ago
      > If Node.js can run TypeScript files directly, then the TypeScript compiler won't need to strip types and convert to JavaScript

      Node.js isn't the only JS runtime. You'll still have to compile TS to JS for browsers until all the browsers can run TS directly. Although some bundlers already handle TS by using an unofficial compiler like SWC (the one Node's trying out for this feature).

      > In Python, I've even heard of people writing types in source code but never checking them, essentially using type hints as a more convenient syntax for comments.

      It's not just comments. It's also, like the name "type hint" suggests, a hint for your IDE to display better autocomplete options.

      • pansa2 84 days ago
        > It's not just comments. It's also a hint for your IDE to display better autocomplete options.

        Ah yes, autocomplete is another benefit of machine-readable type hints. OTOH there's an argument that another IDE feature, informational pop-ups, would be better if they paid more attention to comments and less to type hints:

        https://discuss.python.org/t/a-more-useful-and-less-divisive...

    • phartenfeller 84 days ago
      There is an EcmaScript proposal to go in that direction: https://github.com/tc39/proposal-type-annotations

      I think this should be part of the language spec.

      • wiseowise 84 days ago
        Beyond ugly. They should just make TS official and be done with it.

        E: I thought it was JSDoc proposal. Ignore the comment.

        • blovescoffee 84 days ago
          I did only briefly look at the proposal. What did you find so ugly?
          • wiseowise 84 days ago
            I misread the proposal. Thought it was for JSDoc.
            • meiraleal 84 days ago
              Misread no, you didn't read.
              • wiseowise 84 days ago
                Yeah. I saw JSDoc and closed it, lol.
                • Dylan16807 84 days ago
                  Specifically, you saw the section titled "Limits of JSDoc Type Annotations"?
                  • wiseowise 84 days ago
                    Specifically, I saw JSDoc syntax and it triggered me so much that I closed the page and threw my phone away in disgust at the absurdity of even the idea that someone unironically thought something like this is remotely a good idea.
                    • FractalHQ 84 days ago
                      I support this behavior as the only correct reaction.
        • IshKebab 84 days ago
          What do you mean ugly? This basically is making Typescript official.

          They just can't have browsers doing the actual type checking because there isn't a specification for how to do that, and writing one would be extremely complicated, and I'm not sure what the point would be anyway.

          • wiseowise 84 days ago
            I misread the proposal, thought it was for JSDoc.
        • padavanchik 84 days ago
          There is no language spec for TS. No alternative implementations. All we have is checker.ts. It's ugly and slow.
    • pphysch 85 days ago
      > In Python, I've even heard of people writing types in source code but never checking them

      This is my main approach. Type hints are wonderful for keeping code legible/sane without going into full static type enforcement which can become cumbersome for rapid development.

      • josephg 84 days ago
        You can configure typescript to make typing optional. With that option set, you can literally rename .js files to .ts and everything "compiles" and just works. Adding this feature to nodejs means you don't even have to set up tsc if you don't want to.
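
        For example, with `noImplicitAny` (and `strict`) left off, which is the compiler default, plain untyped code is accepted as-is in a .ts file:

            // valid TypeScript under the default (non-strict) settings:
            // parameters are implicitly 'any' and nothing is flagged
            function add(a, b) {
              return a + b;
            }

            add(1, "2"); // compiles; the types are effectively optional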

        But if I were putting in type hints like this, I'd still definitely want them to be statically checked. It's better to have no types at all than wrong types.

        • pansa2 84 days ago
          > It's better to have no types at all than wrong types.

          I agree - but the type systems of both Python and TypeScript are unsound, so all type hints can potentially be wrong. That's one reason why I still mostly use untyped Python - I don't think it's worth the effort of writing type annotations if they're just going to sit there and tell lies.

          Or maybe the unsoundness is just a theoretical issue - are incorrect type hints much of a problem in practice?

          • zoul 84 days ago
            In my experience unsoundness is almost never a problem in practice, see here for details:

            https://effectivetypescript.com/2021/05/06/unsoundness/

            • hajile 84 days ago
              Unsound types are never a problem until they are, at which point you are staring at code that SHOULD be working, but is somehow breaking.
          • kcrwfrd_ 84 days ago
            Is this “unsound”-ness that you’re referring to because it uses structural typing and not nominal typing?

            Fwiw I’ve been working with TypeScript for 8+ years now and I’m pretty sure wrong type hints has never been a problem. TS is a God-send for working with a codebase.

            • cstrahan 84 days ago
              No, TypeScript is not unsound because it uses structural typing.

              A language has a sound type system if every well-typed program behaves as defined by the language's semantics during execution.

              Go is structurally typed, and yet it is sound: code that successfully type checks is guaranteed to abide the semantics of the language.

              TypeScript is unsound because code that type checks does not necessarily abide the semantics of the language:

                function messUpTheArray(arr: Array<string | number>): void {
                    arr.push(3);
                }
                
                const strings: Array<string> = ['foo', 'bar'];
                messUpTheArray(strings);
                
                const s: string = strings[2];
                console.log(s.toLowerCase())
              
              `strings` is declared as an `Array<string>`, but TypeScript is happy to insert a `number` into it. This is a contradiction, and an example of unsoundness.

              `s` is declared as `string`, but TypeScript is happy to assign a `number` to it. This is a contradiction, and an example of unsoundness.

              This code eventually fails at runtime when we try to call `s.toLowerCase()`, as `number` has no such function.

              What we're seeing here is that TypeScript will readily accept programs which violate its own rules. Any language that does this, whether nominally typed or structurally typed, is unsound.

            • nyssos 84 days ago
              There's not much connection. Typescript's record types aren't sound, but that's far from its only source of unsoundness, and sound structural typing is perfectly possible.
              • lolinder 84 days ago
                Soundness is also a highly theoretical issue that I've never once heard a professional TypeScript developer express concern about and have never once heard a single anecdote of it being an issue in real-world code that wasn't specifically designed to show the unsoundness. It usually only comes up among PL people (who I count myself among) who are extremely into the theory but not regularly coding in the language.

                Do you have an anecdote (just one!) of a case where TypeScript's lack of type system soundness bit you on a real application? Or an anecdote you can link to from someone else?

                • nyssos 84 days ago
                  > Do you have an anecdote (just one!) of a case where TypeScript's lack of type system soundness bit you on a real application?

                  Sure. The usual Java-style variance nonsense is probably the most common source, but I see you're not bothered by that, so the next worst thing is likely object spreading. Here's an anonymized version of something that cropped up in code review earlier this week:

                      const incomingValue: { name: string, updatedAt: number } = { name: "foo", updatedAt: 0 }
                  
                      const intermediateValueWithPoorlyChosenSignature: { name: string } = incomingValue
                  
                      const outgoingValue: { name: string, updatedAt: string } = { updatedAt: new Date().toISOString() , ...intermediateValueWithPoorlyChosenSignature }
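                      // the spread copies the runtime value of updatedAt (the number 0) back over
                      // the ISO string, yet outgoingValue still type checks as { updatedAt: string }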
                  • lolinder 84 days ago
                    I mean... yes, there's a footgun there where you have to know to spread first and then add the new properties. That's just a good practice in the general case: an intermediate type that fully described the data wouldn't have saved you from overwriting it unless you actually looked closely at the type signature.

                    And yes, TypeScript types are "at least these properties" and not "exactly these properties". That is by design and is frankly one reason why I like TypeScript over Java/C#/Kotlin.

                    I'd be very interested to know what you'd do to change the type system here to catch this. Are you proposing that types be exact bounds rather than lower bounds on what an object contains?

                    • nyssos 83 days ago
                      > That's just a good practice in the general case: an intermediate type that fully described the data wouldn't have saved you from overwriting it unless you actually looked closely at the type signature.

                      The issue isn't that it got overridden, it's that it got overridden with a value of the wrong type. An intermediate type signature with `updatedAt` as a key will produce a type error regardless of the type of the corresponding value.

                      > I'd be very interested to know what you'd do to change the type system here to catch this.

                      Like the other commenter said, extensible records. Ideally extensible row types, with records, unions, heterogeneous lists, and so on as interpretations, but that seems very unlikely.

                    • haskman 83 days ago
                      Look into "Row types" and how PureScript, Haskell, and Elm (to a limited extent) do it.

                      `{foo :: Int | bar}` is a record with a known property 'foo' and some unspecified properties 'bar'. You cannot pass a `{foo :: Int, bar :: Int}` into a function that expects `{foo :: Int}`.

                      A function that accepts any record with a field foo, changes foo, keeping other properties intact has the type

                          {foo :: Int | bar} -> {foo :: Int | bar}
                • kcrwfrd_ 84 days ago
                  Ah someone else posted a link and I understand the unsoundness now.

                  The only time an issue ever came up for me was in dealing with arrays

                    let foo: number[] = [0, 1, 2]
                  
                    // typed as number but it’s really undefined
                    let bar = foo[3]
                  
                  But once you’re aware of the caveat it’s something you can deal with, and it certainly doesn’t negate the many massive benefits that TS confers over vanilla JS.
                  • AfterAnimator 79 days ago
                    For this case, I've switched to using `foo.at(3)` now instead, as it returns `T | undefined`, so you have to handle the undefined case.
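
                    A small sketch of the difference, assuming an ES2022+ lib (and, as an aside, the `noUncheckedIndexedAccess` compiler flag makes plain indexing behave the same way):

                        const foo: number[] = [0, 1, 2];

                        const a = foo[3];    // typed number, actually undefined at runtime
                        const b = foo.at(3); // typed number | undefined

                        b?.toFixed(2);       // the checker makes you handle the miss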
                  • lolinder 84 days ago
                    Yeah, that example is unsound in the same way that Java's type system is unsound, it's a compromise nearly all languages make to avoid forcing you to add checks when you know what you're doing. That's not the kind of problem that people usually are referring to when they single out TypeScript.
          • lolinder 84 days ago
            I've been using TypeScript professionally for 6+ years and have only ever run into issues at the border between TypeScript and other systems (usually network, sometimes libraries that don't come with types). There are a few edge cases that I'm aware of, but they don't really come up in practice.
        • black3r 84 days ago
          Or you can configure the TS compiler to allow JS imports, then everything also compiles and works, but you can slowly convert your codebase from JS to TS file by file and be sure that all TS files are properly typed and all JS files are untyped instead of having everything as TS files where some are typed and some are not.
        • jaggederest 84 days ago
          Yeah, I start projects by explicitly typing `any` all over the place and gradually refining things, so every type that's specified is explicit and checked. I'm really enjoying that style.
          • crabmusket 84 days ago
            Combine this with an eslint config that nudges you about explicit any, and the typescript compiler option to disallow implicit any, and you're well taken care of.
      • pansa2 84 days ago
        With this approach, do you still use Python's standard syntax for type hints?

            def mersenne(p: int): return 2**p - 1
        
        Or, given there's no need for the type hints to be checker-friendly, do you make them more human-friendly, e.g:

            def mersenne(p: 'prime number'): return 2**p - 1
        • LordKeren 83 days ago
          While it’s not common (from the source code I’ve reviewed over the years), some people make a new type with a name and use that in the definition:

              from typing import NewType

              # Create a new type for some_prime
              SomePrime = NewType('SomePrime', int)

              def process_prime(value: SomePrime) -> int:
                  return value

          However, this isn’t nearly as common as simply using a more descriptive argument name like “prime_number : int”

          One of the big advantages to type hinting in Python is that it feeds the IDE a lot of information to increase auto-complete functionality, so you want to avoid things like p:”prime number”

          • pansa2 83 days ago
            > One of the big advantages to type hinting in Python is that it feeds the IDE a lot of information to increase auto-complete functionality

            Yeah, until this discussion I thought the main benefit of type hints was earlier detection of bugs via static checking. Now though, I'm getting the impression that the bigger benefit is enabling IDE features such as autocomplete.

            That helps me understand better why I haven't found type hints as useful as others seem to - I don't use an IDE. My use of Python is limited to small scripts which I write in a simple text editor.

            • LordKeren 83 days ago
              There’s a fair argument to make that better IDE autocomplete does prevent many type-related issues even before a static type checker is run
      • Humphrey 84 days ago
        Exactly. And if you use a library that does lots of metaprogramming (like Django) then it's impossible to get rid of all the type errors. Hopefully one day the type system will be powerful enough to write a Django project with passing type checks.
      • edflsafoiewq 84 days ago
        IME if you aren't checking them, they're eventually going to be out of date.
      • rlt 84 days ago
        I don't find TypeScript to be burdensome when rapidly iterating. Depending on how you've configured your dev environment you can just ignore type errors and still run the code.
    • schwartzworld 83 days ago
      Incidentally, this is how the ecmascript proposal for introducing types to JS would work by default. The runtime would ignore the types when running code. If you want type checking, you’d have to reach for external tooling.
  • unilynx 84 days ago
    If this feature ever becomes the default (i.e. not behind a flag) - how will the NPM ecosystem respond? Will contributors still bother to build CJS and ESM versions when publishing an NPM module, or just slap an 'engine: nodejs >= 25' on the package.json and stop bothering with the build step before pushing to NPM?

    I personally would very much prefer if NPM modules that have their original code in TS and are currently transpiling would stop shipping dist/.cjs so I unambiguously know where to put my debugger/console.log statements. And it would probably be very tempting to NPM contributors to not have to bother with a build step anymore.

    But won't this start a ripple effect through NPM where everyone will very quickly start to assume 'everyone accepts TS files' - it only takes one of your dependencies for this effect to ripple through? It seems to me that nodejs can't move this outside an opt-in experimental flag without the whole community implicitly expecting all consumers to accept TS files before you know it. And if they do, it will be just months before Firefox and Safari will be forced to accept it too, so all JS compilers will have to discard TS type annotations.

    Which I would personally be happy with - we're building transcompiling steps into NPM modules that convert the ts code into js and d.ts just to support some hypothetical JS user even though we're using TS on the including side. But if node accepts .ts files we could just remove those transpiling steps without ever noticing it... so what's stopping NPM publishers from dropping the js/d.ts files without noticing they broke anything?

    • Omerd6 84 days ago
      The legendary Ryan Dahl is actually working on solving the exact problem you described by creating a new package registry called JSR.

      Essentially what it does is allow you to upload your TypeScript code without a build step, so when other devs install it they can see the source code of the module in its original TypeScript instead of transpiled JavaScript.

      • sam_perez 84 days ago
        That's really cool. One of the benefits of the JS ecosystem is the ability to step through code and crack open your dependencies. Not sure if this would directly make this possible when running your projects/tests, but it at least sounds like a step in that direction.
      • meowtimemania 84 days ago
        it must install compiled and uncompiled versions? Otherwise node (without the above flag) would throw errors when it encounters types in node_modules
        • Omerd6 84 days ago
          I just checked and you are correct, in node it only installs the compiled version.

          Apparently you can only view the uncompiled source code in deno since it natively supports typescript.

          My bad

          • meowtimemania 79 days ago
            No worries! I was curious how it worked on node. Thanks!
    • satanacchio 84 days ago
      We don't support running .ts files in node_modules; this is one of the main constraints to avoid breaking the ecosystem.
    • nfriedly 84 days ago
      For the old libraries I maintain that are typescript and transpiled into .cjs and .mjs for npm, I'll probably just start shipping all three versions.

      For a new thing I was writing from scratch, yeah, I might just ship typescript and not bother transpiling.

      [edit: Apparently not. TS is only for top-level things, not libraries in node_modules according to the sibling comment from satanacchio who I believe is the author of the PR that added TS support and a member of the Node.js Technical Steering Committee]

      • WorldMaker 84 days ago
        Why are you still transpiling to .cjs in 2024? ESM is supported in every LTS version of Node now. We can kill CJS, we have the power.
        • nfriedly 84 days ago
          Because I don't like breaking things unnecessarily. Some of my libraries are 10 years old and depended upon by similarly old projects that are not using ESM and probably never will.

          Besides, it's already going through one transpilation step to go from TS to ESM, so adding a second one for CJS really isn't that much hassle.

          I think if node.js had made require() work with ESM, I could probably drop CJS. But since that's probably never going to happen, I'm just going to continue shipping both versions for old projects and not worry about it.

          • sureIy 84 days ago
            > adding a second one for CJS

            Nobody is arguing for that. Once you ship ESM, you can continue shipping ESM.

            In Node 22 you can even require() ES modules (with an experimental flag, at the moment)

            • nfriedly 84 days ago
              > > adding a second one for CJS

              > Nobody is arguing for that. Once you ship ESM, you can continue shipping ESM.

              I'm not sure I follow you there. I did continue shipping ESM.

              > In Node 22 you can even require() ES modules (with an experimental flag, at the moment)

              Oh, I didn't know about that, cool! Once it becomes un-flagged I might consider dropping CJS.

          • WorldMaker 84 days ago
            > I think if node.js had made require() work with ESM, I could probably drop CJS

            Why is making downstream users switch to `await import()` that big of a deal?

            You can use async/await in CJS just fine. Sure, sometimes you may need to resort to some ugly async IIFE wrappers because CJS doesn't support top-level await like ESM does, but is that really such a big deal?
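
            For the sake of concreteness, something like this (the package name is a placeholder):

                // CommonJS file: no top-level await, so wrap the dynamic import in an async IIFE
                (async () => {
                  const { someExport } = await import("some-esm-only-package");
                  someExport();
                })();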

            Sure, it's a breaking change, but that's what semver major bumps are for.

            I just think that if projects want to stay in CJS they should learn how to use async/await. I clearly don't understand why CJS libraries feel they need synchronous require() for everything. (Though to be fair, I've also never intentionally written anything directly in CJS. I learned enough in the AMD days to avoid CJS like the plague.)

            • nfriedly 84 days ago
              > Why is making downstream have to switch to `await import()` that big of a deal?

              > You can use async/await in CJS just fine. Sure, sometimes you may need to resort to some ugly async IIFE wrappers because CJS doesn't support top-level await like ESM does, but is that really such a big deal?

              It might seem like a small amount of work, but for a library one must multiply that small amount of work by the number of users who will have to repeat it. It can be quite a large amount in aggregate. And for what benefit? So I can drop one line from my CI config? It just seems like a huge waste of everyone's time.

              Also, as a library user, I would (and occasionally do) get annoyed by seemingly unnecessary work foisted on me by a library author. It makes me consider whether or not I want to actually depend on that library, and sometimes the answer is no.

              • WorldMaker 84 days ago
                > multiply that small amount of work by the number of users who will have to repeat it

                This is probably where we have the biggest difference in our calculations. I know there's a lot of pain in legacy CJS systems, but from my view (which is maybe more "browser-oriented", which is maybe a bit more Deno/Bun-influenced, which comes from a "Typescript-first" mentality going way back to 0.x) it is more legacy "giant balls of mud" maintained by a sparse few developers. I don't see this multiplicand as very big on the scale of library user count. Most CJS for years and years has been transpiled from Typescript or Rollup; most CJS only exists to be eaten by Webpack or other bundler, many of which today rewrite CJS to ESM anyway. From what I see a lot of CJS seems either transpiled out of habit (for some notion of supporting Node < 10 that doesn't make sense with current security support) or by accident (by a misconfigured tsconfig.json, for example, and then often looping back through a transpiler again back to ESM). The way we cut through the Gordian knot of us all doing too much transpilation to/from CJS is to start eliminating automated transpilation to CJS in the first place. Which is why I find it useful every time to ask people what they are really trying to do when transpiling to .cjs today.

                Of course, if your multiplicand is lines-of-code impacted, because I agree there are some great big huge piles of mud in CJS that are likely stuck that way for lack of developers/maintainers and lack of time/budget/money, then worrying about the minority of users still intentionally using CJS is worth caring about, and my sympathies in that situation.

                You, of course, know your library's users better than me and maybe you do have a lot of CJS users that I just wouldn't consider in my calculations. I'm not going to stop you from transpiling to CJS if you find that necessary for your library. That's your judgment call. I just wanted to also make sure to ask the questions of "do you really need to?" and "how many users do you actually think it will impact?" out loud. Thanks a lot for the conversation on it, and I'm still going to be a radical banging the "CJS Must Die" drum, but I understand pragmatism and maintenance needs, especially those of legacy applications, and mostly just want to make sure the conversation is an active one and a lot less of passively transpiling stuff that doesn't really need it.

                • nightowl_games 83 days ago
                  Aight let me chime in here.

                  I'm a game developer. I make web games. We run our games with a simple nginx server that simply serves the wasm. We have some JavaScript libraries we use. They have to be raw dog .js.

                  I don't even know what your "ejs" or "cjs" acronyms mean.

                  We use the discord JavaScript SDK. Discord only ships it as a node module or as .ts.

                    It's a pain in our ass to update because we don't know what those tools you're talking about are and we don't want to know. Just give me the damn .js.

                  • WorldMaker 83 days ago
                    I'm on your side. No one should have to care about the difference between ESM (.mjs) and CJS (.cjs). CJS should just be dead and we only need one .js again. If you are following the Discord JS docs and using modern JS syntax `import` and `export` statements (if you are "raw dogging" it, have you heard the good word of <script type="module">?) then none of the above conversation applies to you, congratulations! That's the winning modern JS and you are one of the majority of users using it. The conversation above is about the old dead JS (CJS) and why people are still outputting the old dead JS today when it doesn't matter to people like you or I that just want plain modern .js files (and .ts files and simpler tsconfigs).
        • thr0w 84 days ago
          > ESM is supported in every LTS version of Node now. We can kill CJS, we have the power.

          I'd rather kill ESM. And it's not fully supported in Node. It doesn't work in the REPL.

          • crabmusket 84 days ago
            You can have a chat to all major browser vendors about killing ESM :)
        • joseferben 83 days ago
          i wish it was that easy. i keep trying to default to esm on node projects but the ecosystem is not there yet, at least in the context of server side nodejs stuff.
          • WorldMaker 83 days ago
              It's one reason to consider a switch to Deno or Bun, or at least to check JSR before NPM these days. The ecosystem for server-side ESM exists and is growing at a rapid pace, but it's sometimes hard to separate the actively maintained npm packages from the legacy ones in the server-side ecosystem. (The browser ecosystem is definitely more ESM hungry.)

            That said with "type": "module" in my own package.json files, I've so far never had a problem importing legacy CJS npm packages into ESM, other than the Types are more likely to be wrong (because packages that only publish CJS are more likely to also not publish their own types) or at least inaccurate for the current import approach (returning only "synthetic default" instead of individual exports, for example). That's a bunch of papercuts having to attempt multiple imports until you understand what shape Node is giving you of that CJS import, but after those papercuts I feel like interop is generally smooth sailing in today's Node.

            • joseferben 83 days ago
              the issue i ran into recently was around react/jsx where i wanted .tsx files server side.

              might have to look into it again but there was an issue where some react/jsx/fragment bit had to be ported to esm first.

              • WorldMaker 83 days ago
                Interesting. Every example that I've seen of React SSR is all natively in ESM, but I've mostly only glanced at the big ones (Astro/Next/Nuxt). I'm sure that there are a lot of paths that aren't as well paved given how diverse the React SSR space currently is (there are way too many competing options), and how simple you want to try to keep your SSR (a lot of those options are just so complex today, which presumably is why there are so many competing options, everyone has a different idea of how complex to make the whole thing).

                (My own .tsx based view library doesn't yet have official SSR support, but I do all my testing in ESM in the built-in Node test runner `node --test` so I don't see any complications in doing .tsx on the "server-side", because that is how I'm testing everything already, I just haven't entirely figured out my "hydration" or "islands" or "stamps" approach so I don't officially support SSR yet. It's on the roadmap and I've made small bits of progress towards it, just need to solve it and haven't had the time/priority.)

    • rnmkr 83 days ago
      I would love to ship my source code (.ts) to npm. But the TypeScript team was very much against this, as there'd be tsconfig issues and performance issues. But still, fingers crossed.
  • BiteCode_dev 84 days ago
    Eventually, node might allow JS to introspect those types.

    That would be a huge win. Right now in Python, great tools like pydantic exist because Python can introspect said types, and generate checks out of them.

    This means you can define simple types, and get:

    - type checking
    - runtime data checks
    - API generation
    - API documentation generation

    Out of a single, standard notation.

    Right now in JS, things like zod have to do:

        const mySchema = z.string();
    
    Which is basically reinventing what typescript is already doing.
    • bythreads 84 days ago
      That's not entirely true. `z.string()` in Zod offers more than just type safety akin to TypeScript. TypeScript provides compile-time type checking, while Zod adds runtime validation and parsing.

      For those unfamiliar:

      `z.string()` effectively converts `mySchema` into a functional schema capable of parsing and validation.

      For example:

      `mySchema.parse("some data")` returns successfully.

      `mySchema.parse(321)` throws an exception.

      I've used it in places where you need runtime validation and in-process verification - it works pretty well for that, and you can extract the types from it via:

          const A = z.string();
          type A = z.infer<typeof A>; // string

      Meaning if you define your types in zod first, and infer their types from that you get compile and runtime type checking.
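
      Putting it together, the usual pattern looks roughly like this (a sketch assuming zod is installed):

          import { z } from "zod";

          const User = z.object({
            name: z.string(),
            age: z.number().int(),
          });

          type User = z.infer<typeof User>; // { name: string; age: number }

          // .parse throws if the data doesn't match, so the value is checked at runtime too
          const user: User = User.parse(JSON.parse('{"name":"Ada","age":36}'));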

      ---

      It's a bit of overkill for nimble and fast code bases though - but it works wonders for situations where in-process proofing needs to be done, and in all honesty it isn't that big of a task to do this.

      • Dylan16807 84 days ago
        > Zod offers more than just type safety akin to TypeScript. TypeScript provides compile-time type checking, while Zod adds runtime validation and parsing.

        Well of course it offers more, or you wouldn't be installing a library.

        The problem is that even when you're expressing normal Typescript types, you have to use entirely different syntax. It's good that you can usually avoid double-definition, but it's still a big barrier that shouldn't be necessary.

      • tommy_axle 84 days ago
        There's also typescript-to-zod that makes it possible to generate the zod schemas from your types.
    • MrBazlow 84 days ago
      A lot of the TypeScript team's focus these days is on alignment with the JavaScript language, and novel new features such as runtime types have all but been dismissed, or at the very least pushed behind the TC39 JavaScript types proposal. Much like using decorators on variables outside of class structures was.

      Having said that, TypeScript allows plugins. These are very rarely used, as they augment the language by introducing other features that are transformed into the resulting JavaScript files. One plugin that relates to your suggestion of runtime types is called Typia; it permits you to use your TypeScript type signatures at runtime with guards like `assert<MyType>(myValue)`, where it intercepts the function call to construct an exhaustive if statement in the transpiled JavaScript checking the nature of the passed variable.

      So while I don't see it being a part of the language in the next four to six years, there are at least libraries out there already that allow you to do it today.
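
      For what it's worth, Typia usage looks roughly like this (a sketch; it assumes typia is installed, its compile-time transform is wired into the build, and the interface name is made up):

          import typia from "typia";

          interface Member {
            id: number;
            email: string;
          }

          const rawInput = '{"id":1,"email":"someone@example.com"}';

          // The transform rewrites this call into an exhaustive runtime check
          // generated from the Member type; it throws if the input doesn't match.
          const member = typia.assert<Member>(JSON.parse(rawInput));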

    • hajile 84 days ago
      If JS ever adds type checking, I hope it doesn't choose Typescript.

      We need a type system that is actually sound, and TS is intentionally unsound. We need a type system that doesn't allow bad coding practices like TS does. We need a type system that enforces program design that allows programs to be fast. We need a Hindley-Milner type system.

      If you want a module to be typed, add a `"use type"` directive. This should disallow bad parts of the language like type coercion. It should disallow things that hurt performance like changing object shapes/value types or making arrays of random collections of stuff. Incoming data from untyped modules would either be coerced or throw errors if coercion can't be done, at which point the compiler can deeply optimize the typed code because it would have far stronger type guarantees and wouldn't risk bailing out.
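
      A purely hypothetical sketch of the idea (none of this is real syntax today):

          "use type";

          // with a sound, declared type the engine could skip the inline cache
          // and compile a monomorphic fast path up front
          function distance(a: { x: number, y: number }, b: { x: number, y: number }): number {
            return Math.hypot(a.x - b.x, a.y - b.y);
          }

          // untyped callers would be coerced at the module boundary, or rejected
          // if coercion isn't possible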

      • joseferben 83 days ago
        having written ocaml in production for a few years, i think soundness comes at a cost of dev ergonomics. at least with the type systems of today’s industry languages.

        it blows my mind weekly how ergonomic and flexible typescript’s type system is. it allows me to write great apis for my team mates.

        is it possible for the type checker to end up in an infinite loop or for a junior developer to abuse “as”? absolutely, but it doesn’t really matter in practice.

        i wouldn’t want to run typescript in rockets or submarines tho!

        • hajile 83 days ago
          Ocaml types are USED by the compiler to generate FAST code.

          TS types are IGNORED by the JIT to generate SLOW code.

          All the features that make TypeScript more ergonomic for devs also allow it to generate slower JS code. AssemblyScript tries to be TS for WASM, and it doesn't support huge swaths of TS because those features would output unusably slow garbage.

          I also suspect that more than a few Ocaml ergonomic issues are due to nominal typing. StandardML's structural typing and inference give an experience very similar to TS, but without the major soundness issues (though it must be noted that ANY generics in JS will be slow unless the compiler is creating multiple function variants).

          • joseferben 83 days ago
            for my use case (web dev) ts is fast enough. but i do miss the ocaml compile times!

            it's mainly the lack of ad-hoc polymorphism that makes ocaml feel a bit clunky to me at times. but structural typing sure would be nice.

            i used to avoid typescript because of similar soundness issues. but in the context of web dev this weird type system that evolved from adding types to javascript turned out to be so nice to use. it's bonkers because on paper it shouldn't be this nice haha.

            • hajile 83 days ago
              Ocaml messed up with their operators. StandardML had a better approach and I hope a future version adds module typeclasses (to help limit the type soup we see in Haskell).

              As I wrote elsewhere in this thread, TS makes it incredibly easy to unintentionally make megamorphic functions that don’t have any inline cache and don’t get optimized at all. You think you’re writing efficient, DRY code, but it’s really just dog slow because you’ve neutered the JIT.

      • ahuth 84 days ago
        What bad coding practices does TS allow, and why are they bad?
        • DexesTTP 84 days ago
          TS allows you to pass a read-only object to a method taking a read-write value:

              type A = { value: number; }
              function test(a: A) { a.value = 3; }
              function main() {
                const a: Readonly<A> = { value: 1 };
                // a.value = 2; <= this errors out
                test(a); // this doesn't error out
                console.log(a); // shows 3
              }
          • tossandthrow 83 days ago
            It seems super weird that this type checks, and it appears to be some sort of corner case explicitly implemented.

            Normally typescript does not just allow implicit removal of a container type.

            Like, you can't pass Array<A> to a function that just takes A.

          • thoweofasdf 83 days ago
            Nice find. Never ran into this because I haven't mutated inputs in over 15 years.
        • hajile 84 days ago
          If there's a bad way to write JS, TS has something available to make sure it's typed.

          Does TS help you keep your functions monomorphic so they'll get optimized by the JIT? nope

          Does TS keep your object shape from changing so it will get optimized by the JIT? It actively does the opposite, giving TONS of tools that allow you to add, remove, modify, and otherwise mess up your objects and guarantee your code will never optimize beyond the basic bytecode (making it one or two orders of magnitude slower than it could otherwise be).

          TS doesn't do anything to prevent or even discourage these kinds of bad decisions. The "type soup" many projects fall into is another symptom of this. The big reason the types become such a mess is because the underlying design is a mess. Instead of telling programmers "fix your mess", TS just releases even more features so you can type the terrible code without fixing it.

          • tln 84 days ago
            > Does TS keep your object shape from changing so it will get optimized by the JIT? it actively does the opposite giving TONS of tools that allow you to add, remove, modify, and otherwise mess up your objects and guarantee your code will never optimize beyond the basic bytecode (making it one or two orders of magnitude more slow than it could otherwise be).

            Can you elaborate or point to some of the tools? So I know what tools I may need to avoid

            • hajile 84 days ago
              JS JITs use something called an inline cache (IC) to speed up the lookup of object shapes. JS JITs consider it to be a different shape if the keys are different (even if just one is added or removed), if the values of the same key are different types, and if the order of the keys change.

              If you have a monomorphic function (1 type), the IC is very fast. If you have a polymorphic function (2-4 types), the IC function gets quite a bit slower. They call 5+ types megamorphic and it basically foregoes IC altogether and also disables most optimizations.

              TS knows how many variants exist for a specific function and even knows how many of those variants are used. It should warn you when your functions are megamorphic, but that would instantly kill 90% of their type features because those features are actively BAD in JS.

              Let's illustrate this.

                  interface Foo {
                    bar: string | string[]
                    baz?: number
                    blah?: boolean
                  }
              
              Looks reasonably typical, but when we use it:

                  function useFoo(foo: Foo) { .... }
              
                  useFoo({bar: "abc", baz: 123, blah: true}) //monomorphic
                  useFoo({bar: "abc", baz: 123})             //now a slower polymorphic
                  useFoo({bar: "abc"}) 
                  useFoo({bar: ["b"], baz: 123}) 
                  useFoo({bar: ["b"], baz: 123, blah: true}) //we just fell off the performance cliff
              
              As you can see, getting bad performance is shockingly easy, and if these calls were spread across five different files, they'd look similar enough that you'd have a hard time realizing things were slow.

              Union/intersection aren't directly evil. Unions of a single type (eg, a union of strings) are actually great, as they offer more specificity while not increasing function complexity. Even if they are a union of different primitive types, that is sometimes necessary and the cost you are paying is visible (though most JS devs are oblivious to the cost).

              Optionals are somewhat more evil because they somewhat hide the price you are paying.

              [key:string] is potentially evil. If you are using it as a kind of `any`, then it is probably evil, but if you are using it to indicate a map of strings to a type, then it's perfectly fine.

              keyof is great for narrowing the possible until you start passing those keys around the type system.

              Template unions are also great for pumping out a giant string enum (though there is a definite people issue of making sure you're only allowing what you want to allow), but if they get passed around the type system for use, they are probably evil.

              Interface merging is evil. It allows your interface to spread across multiple places making it hard to follow and even harder to decide if it will make your code slow.

              Overloads are evil. They pretend you have two different functions, but then just union everything together.

              Conditional types are evil. They only exist for creating even more complex types and those types are basically guaranteed to be both impossible to fully understand and allow very slow code.

              Mapped types are evil. As with conditional types, they exist to make complex and incomprehensible types that allow slow code.

              Generics are the mother of all that is evil in TS. When you use a generic, you are allowing basically anything to be inserted which means your type is instantly megamorphic. If a piece of code uses generics, you should simply assume it is as slow as possible.

              As an aside, overloads were a missed opportunity. In theory, TS could speed everything up by dynamically generating all those different function variants at compile time. In practice, the widespread use of generic everything means your 5mb of code would instantly bloat into 5gb of code. Overloads would be a great syntax to specify that you care enough about the performance of that specific function that you want to make multiple versions and link to the right one at compile time. Libraries like React that make most of their user-facing functions megamorphic could probably see a decent performance boost from this in projects that used TS (they already try to do this manually by using the megamorphic function to dispatch to a bunch of monomorphic functions).
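
              To make that manual dispatch idea concrete, here is a rough sketch (the helper names are made up, not taken from React or any real library):

                  // Hypothetical pattern: keep the hot per-shape work in helpers that each
                  // only ever see one object shape, and let a single entry point dispatch.
                  function useFooString(bar: string) { /* stays monomorphic */ }
                  function useFooArray(bar: string[]) { /* stays monomorphic */ }

                  function useFoo(foo: { bar: string | string[] }) {
                    if (Array.isArray(foo.bar)) {
                      useFooArray(foo.bar)
                    } else {
                      useFooString(foo.bar)
                    }
                  }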

      • iimblack 84 days ago
        Rescript?
        • sprkv5 83 days ago
          Absolutely. People are sleeping on it.

          If only projects like Bun/Deno/Node added runtime support for ReScript instead of TypeScript, collectively as the web-tooling industry, we'd be in a better place. But you can't win against the MS's marketing budget.

          Also in hindsight, ReScript diverged away from OCaml, but the ReScript development team could have gone further by creating a runtime for ReScript. Then again I don't blame them - they are polishing the dev experience of ReScript and React.

          This is the decade of writing shiny new runtimes - I hope somebody writes a ReScript runtime. Imagine ReScript, Core, rescript-webapi, the typechecker, re-analyze, plus a bundler, minifier, etc. baked into the runtime like Bun. Sounds like an interesting value proposition. Fingers crossed.

    • aitchnyu 84 days ago
      Does this mean Node can know if an exception is a subclass of ValueError or an object is an instance of SomeClass? I'm a TS newb; I thought types outside of array, object, number, and string aren't present in JS, and that Zod and type-guard functions return plain objects with "trust me bro".
      • gampleman 84 days ago
        In JS, classes do retain runtime information. So the `instanceof` is a real runtime operator that works by checking the prototype chain of an object. So checking subclasses can be done at runtime.

        However, in TS other type information is erased at compile time. So if you write

            type Foo = "a" | "b";
        
        the runtime code will see that just as a plain string.
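
        A quick sketch (my own example) of the difference:

            class ValidationError extends Error {}

            const e: unknown = new ValidationError("bad input");
            if (e instanceof ValidationError) {
              // works at runtime: the class exists as a real constructor,
              // so the prototype chain can be checked
            }

            type Id = string | number;
            // there is no `x instanceof Id`: the alias is erased before the code runs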
      • LelouBil 84 days ago
        You are right, they aren't. In the JavaScript language, which is what actually gets executed, there are no TypeScript types.

        The parent commenter was talking about a way for nodejs to provide, via an API, the content of type annotations on fields/functions/variables like in python.

        However, in python the type annotations are a property of the object at run time, whereas they are completely stripped before execution for typescript.

        So I'm not sure how it would work except by changing the typescript philosophy of "not changing runtime execution"

  • samtheprogram 85 days ago
    Bun’s DX is pretty unprecedented in this space, and most of my use cases are now covered / not causing Bun to crash (when actually using run-scripts with `bun run`).

    Meanwhile, I can’t configure node to not require extensions on import, nor have tsc configured to automatically add .js extensions to its compiled output, without adding on a bundler… Although native TypeScript support would remedy this nit quite a bit, I can’t imagine the user experience (or performance) matching Bun’s when it reaches stable.

    • spankalee 85 days ago
      Extensions should be required. It's not possible to do path searches over the network like you can on local disk, and network-attached VMs, like browsers, are a very, very important runtime for JavaScript.
      • leipert 84 days ago
        Also performance.

        foo could mean foo/index.js, foo.js at the minimum. So you have 2x the lookups. Oh no, wait we also potentially have mjs, cjs, jsx, ts and tsx.

        So 12 times the stat checking for each import.

        • simlevesque 84 days ago
          > foo could mean foo/index.js, foo.js at the minimum. So you have 2x the lookups.

          Only in the worst case. If it's foo.js there's only one lookup.

          > Oh no, wait we also potentially have mjs, cjs, jsx, ts and tsx. So 12 times the stat checking for each import.

          Again, you're only taking into account the worst case.

      • pfg_ 84 days ago
        Fortunately, code is generally bundled for browsers to reduce the number of network requests and total size of downloads. And node has access to the filesystem, so it can do path searches just fine if it wants to support existing code.
        • WorldMaker 84 days ago
          You probably don't need a bundler in the browser anymore. That's not yet a popular "mainstream" opinion, but between massive improvements in browser connection handling (HTTP 1.1 connection sharing actually works in more places, HTTP/2+) and very good ESM support in browsers' well-optimized preloaders and caching engines (which can sometimes reduce download size much better than all-or-none bundles can; the trade-off is more network requests, but we are in a good place to take that trade-off), we're at an exciting point where there is almost never a need to bundle in development environments, and it is increasingly an option to not bundle in production either. It is worth benchmarking today (I can't tell you what your profiler tools will tell you) whether you are really gaining as much from production bundles as you think you are. Not enough people are running those benchmarks, and some of them may already be surprised.

          The Developer Experience of unbundled ESM is great. Of course you do need to do things like always use file extensions. But those aren't hard changes to make, they're worth it for the better Developer Experience, and they can help us start to wean off of mega-bundler tools as required production compile-time steps.
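
          For instance, an unbundled page just needs fully specified relative imports (a trivial sketch; the file names are made up):

              // app.js, loaded directly via <script type="module" src="./app.js">
              import { mountWidget } from "./widget.js"      // explicit extension, no bundler resolution
              import { formatDate } from "./lib/dates.js"

              mountWidget(document.body, formatDate(new Date()))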

          • tubs 84 days ago
            Meh. Even with h3 I still see more gains from reducing network requests than most other attempts I try. (One day s3 will support multiple ranges per request, if I wish hard enough).
      • samtheprogram 84 days ago
        That makes sense. I guess since using .js for the relevant imports just works in TypeScript I should be happy then…
      • silverwind 84 days ago
        Yeah, besides that, leaving out the extension also creates ambiguity when the same filename exists with multiple file extensions.
    • WuxiFingerHold 84 days ago
      I like Bun a lot, but Deno is (still) the more mature, stable, capable (e.g. stable workers, http2) and depending on the use-case more performant option (V8 > JSC). DX and tooling is top-notch. Deno can perform typechecking, btw. They bundle the TSC IIRC. Bun is the hype, but Deno is currently clearly the better option for serious endeavours. Still, the vision and execution of Bun is impressive. Good for us devs.
    • XCSme 84 days ago
      I tried Bun twice, months apart, it never worked for me on Windows, failing to run "bun install": https://github.com/search?q=repo%3Aoven-sh%2Fbun+bun+install...
    • IshKebab 84 days ago
      > unprecedented

      Well except for Deno...

      • pas 84 days ago
        Bun started with compatibility with NodeJS as a primary goal, whereas for Deno it took a while to be able to import npm stuff. (Of course there are fun WTF[0] errors with Bun, and I only tried Deno before the npm import feature landed.)

        [0] https://github.com/oven-sh/bun/issues/11420

    • oblio 84 days ago
      Isn't Bun too raw? It's built with Zig, which hasn't even hit 1.0.
      • laxis96 84 days ago
        Probably had to stay in the oven a bit longer...

        Jokes aside, Zig is moving forward a lot, which is why it's not 1.0 yet, but that doesn't mean you can't write safe and performant applications right now.

        Zig is also a rather simple and straightforward language (like C) and has powerful compile-time code generation (like C macros, but without the awful preprocessor).

        • oblio 84 days ago
          I'm more worried about compilation or stdlib bugs. In theory you can do lots of things with lots of things, but in practice there are all sorts of hidden limitations and bugs that tend to be noticed once a software product is past 1.0 and has been out in the wild for half a decade or more.
      • jokethrowaway 84 days ago
        You still get segmentation faults. My biggest complaint with bun is not having enough safety.

        If you use frameworks written for node memory usage is very high and performance is meh.

        If you use frameworks written for bun they smoke anything on node.

        I'd definitely move over, just to get rid of the whole TypeScript / cjs / esm crap, but:

        1. frontend support is poor (next.js / solid.js - I can't run anything fully on bun)

        2. I still need to rewrite my backend app from a node.js framework to a bun one

        3. for backend development the javascript ecosystem is losing the crown: if I wanted something safe I'd just write it in Rust (TS lets any random developer write crap with `any` in it and it still validates), and if I'm doing something AI related I'd probably need Python anyway, and FastAPI is not half bad

        • tossandthrow 83 days ago
          We use Bun in production for the entire stack and have done so since v1, with only minor hiccups that have all been trivially fixed.

          DX is great!

          We use vite react ts, yoga, and prisma.

    • k__ 84 days ago
      Bun is pretty awesome.

      However, the node:crypto module still doesn't work 100%. So, I can't use it yet.

      • tossandthrow 84 days ago
        The parallel implementation they do uncovers a number of unexpected behaviors in the node implementation.
        • k__ 83 days ago
          I see.

          I just tried to sign some data with an RSA key, and the results differed in Node and Bun.

    • hackandthink 84 days ago
      "nor have tsc configured to automatically add .js extensions to its compiled output"

      It seems to be the default now:

        $echo 'console.log("test")' > t.ts
        $ tsc t.ts
        $ ls
        t.js  t.ts
        $ node t.js
        test
      • boromisp 84 days ago
        What they probably meant was writing 'import "file.ts"' and have tsc emit 'import "file.js"'. https://github.com/microsoft/TypeScript/issues/49083
        • WorldMaker 84 days ago
          Given the context of Node here will allow experimental type-stripping and will not be doing things like import rewriting, Typescript's decision here to focus on "users write .js in imports because that's how the type-stripped file should look" seems like the right call to me. Less work for a type-stripper because Typescript can already check if there is a .ts or .d.ts file for you if you use .js imports everywhere.
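
          In practice that just means writing the emitted name in the import specifier (a small sketch; the file names are made up):

              // math.ts
              export function add(a: number, b: number): number { return a + b }

              // main.ts: the specifier already points at the stripped/compiled output,
              // and TypeScript resolves "./math.js" back to math.ts when type checking
              import { add } from "./math.js"
              console.log(add(1, 2))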
  • harshitaneja 84 days ago
    I really enjoy typescript and have been yearning for a typescript runtime but I can't help but laugh that I left java all those years ago to finally seek something a lot closer to java.

    I guess we all just wanted Java with JIT, a more feature-rich type system, and gradual typing. Also, for all the shortcomings of the npm ecosystem, it is a lot less daunting and more fun to use libraries in this ecosystem.

    And surprisingly, even though Rust is on a different end of the language spectrum, it offers a similar feel.

    Edit: JIT was not the right terminology to use. I lazily wrote JIT. Apologies. What I meant to convey was the difference in startup times and run time between running something in JVM and V8. Java feels heavy but in javascript ecosystem it feels so nimble.

    • qalmakka 84 days ago
      > we all just wanted java with JIT

      Java was literally the thing that made the term "JIT" popular, so I really don't know what you were going for here.

      Also I just can't see how Typescript is in any way "closer" to Java - it's incredibly different IMHO. The only thing they have in common is probably the "Javascript" misnomer and the fact both support imperative programming, but that's it.

      • seanmcdirmid 84 days ago
        Typescript’s optional and unsound type system also does nothing for a JIT beyond what it could already do for JavaScript, you can’t do optimization if your types are unreliable. However, I really really like how Typescript’s type system super charges developer productivity (type errors via the compiler and feedback via the IDE), and don’t mind this part of the design at all.
        • teaearlgraycold 84 days ago
          You can use typescript types to compile functions. You just might need to deoptimize when you actually hit the function.
          • seanmcdirmid 84 days ago
            I don’t know if Vortex-style compilation ever worked in practice.
    • bufferoverflow 84 days ago
      Typescript is way better than Java, in my experience. It's a lot less verbose. A lot more flexible.
      • ZhongXina 84 days ago
        Better for what? Quickly churning out short-lived code to get the next round of funding, definitely. Writing (and _supporting_) "serious" projects over the long term, which also require high performance and/or high scalability, and can rip through terabytes of data if needed, definitely not. (All IMHO from lots of personal experience.)
        • hot_gril 84 days ago
          It's good for things that don't involve ripping through terabytes of data, which is actually a lot of things. And idk if I'd use Java for that either.
          • mcluck 84 days ago
            I'm actually in this exact position right now. The vast majority of the time I write in TS but I have a need to process a whole lot of data so I went for Rust instead. Java is too much of a headache for me, personally
            • hot_gril 84 days ago
              Yeah, Rust is fast, or you can go the other extreme with Python if you can put the heavy lifting on native modules.
        • viridian 84 days ago
          Depends on your architecture. For scaling out rather than up, node and python are both far more performant because the footprint of minimum viable environment is much smaller. When you need to serve anywhere from 10-200,000 requests a minute on the same system quickly, and efficiently, lambda/azure functions/google app engine backed by node or python is pretty ideal.

          As an example, when my org needs to contact folks about potential mass shooter events, our SLA is 90 seconds. If we did it in cloud with java or .net, it'd be too slow to spin up. If we did it on prem, we'd be charged insane amounts just for the ability to instantly respond to low frequency black swan events, or it'd be too slow. This is a real story of how a Java dev team transitioned to using node for scale in the first place.

          • neonsunset 84 days ago
            Unlike Spring, JIT-based ASP.NET Core deployments spin up very fast (<2-5s for even large-ish applications, the main bottleneck is how fast it can open connections to dependencies, load configuration, etc.). For AOT variant, the startup time is usually below 200ms if we don't count the slowness of surrounding infra which applies to any language.

            Of course CPU and RAM per request when compared to Node.js are not even close as Node is easily slower by a factor of 2-10.

      • sgammon 84 days ago
        They are not comparable. If anything, Kotlin is the equivalent in the JVM universe.
        • quonn 84 days ago
          Kotlin is much closer to Java than to TypeScript even in terms of flexibility.
          • DarkNova6 84 days ago
            To me "Flexibility" sounds a lot like "The programmer always knows what he does".
            • scotty79 84 days ago
              Flexibility, to me, means something more like: I think I know what I want to do, but I also know that I'm probably wrong about that, so for now let's skip all the baroque protocol and let me make it work first. Once I'm sure I wrote what I actually wanted, I'll add types, if only to get rid of some bugs, consider edge cases, and earn nice code completions and auto-generated docs.
          • sgammon 83 days ago
            Laughs in `: dynamic`
          • sgammon 84 days ago
            Sure, but Kotlin is to Java as TypeScript is to JavaScript, which is the point I am making.
            • wiseowise 83 days ago
              That’s not even remotely true.

              TypeScript is a direct superset of JavaScript. Any valid JS is a valid TypeScript.

              If Kotlin is what TS to JS, then so is Groovy, Scala, Clojure and other JVM languages.

              • sgammon 83 days ago
                > TypeScript is a direct superset of JavaScript. Any valid JS is a valid TypeScript.

                Kotlin on JVM is a direct superset of Java on JVM. Any valid Java is also valid Kotlin at the bytecode layer.

                > If Kotlin is what TS to JS, then so is Groovy, Scala, Clojure and other JVM languages.

                Correct

      • khana 84 days ago
        [dead]
    • bubblyworld 84 days ago
      I'm very glad to use typescript over java, personally - the ergonomics are so much better! Especially if you stray away from the somewhat incomplete classes thing (type support for decorator arguments isn't great, for instance) and just focus on interfaces and functions.

      One thing I miss that java has is runtime reflection of types though. Typescript's ecosystem has a million different ways to get around that and they're all a bit ugly imo.

    • zx8080 84 days ago
      > we all just wanted java with JIT, more feature rich type system

      Java has JIT. How is the TypeScript type system more feature-rich than the Java one?

      • wiseowise 84 days ago
        • DarkNova6 84 days ago
          > alone puts TS over anything that Java has.

          Virtual Threads alone challenge this assumption.

          Syntax bloat is not a feature.

          • Byamarro 84 days ago
            It's not really syntax bloat; the linked docs mention how to define strict string types and elaborate on type-level programming, which is a rare and powerful capability. As far as I understand, Virtual Threads aren't a type-oriented feature, which is basically the context of this thread.
          • 3836293648 84 days ago
            Virtual Threads are not a type system feature?
      • magnio 84 days ago
        I don't use them directly much, but template literal generics and conditional types are probably the closest a mainstream language has inched towards dependent types.

        Some examples of TypeScript power:

        - SQL database in TypeScript types: https://github.com/codemix/ts-sql

        - Statically typed raw SQL queries: https://github.com/andywer/squid?tab=readme-ov-file#tag-func...

        - (Someone fill in your TS hackery for me)

      • VMG 84 days ago
        1. *Type Inference*: TypeScript can automatically infer types from context, reducing the need for explicit type declarations.

        2. *Union and Intersection Types*: Allows combining multiple types, offering more flexibility in defining data structures.

        3. *Literal Types*: TypeScript supports exact values as types (e.g., specific strings or numbers), which can be useful for more precise type-checking.

        4. *Type Aliases*: You can create custom, reusable types, enhancing code clarity and maintainability.

        5. *Interfaces and Structural Typing*: Interfaces allow for flexible contracts, and TypeScript uses structural typing, where the type compatibility is based on the shape of the data rather than explicit type declarations.

        6. *Mapped and Conditional Types*: These allow for dynamic type creation and manipulation, making the type system more powerful and expressive.

        7. *Optional Properties and Strict Null Checks*: These provide better handling of undefined and null values.
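
        A couple of these in action (a throwaway sketch):

            // literal + union types
            type Direction = "up" | "down";

            // mapped type: derive a read-only variant without a second declaration
            type Frozen<T> = { readonly [K in keyof T]: T[K] };

            interface Point { x: number; y: number }
            const p: Frozen<Point> = { x: 1, y: 2 };
            // p.x = 3; // error: 'x' is a read-only property

            // inference: `dir` has the literal type "up", not string
            const dir = "up" as const;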
        • tpm 84 days ago
          That's just a copy-paste of some features, not a comparison with Java, which does most of that too.
          • afiori 84 days ago
            Union types, structural typing, and conditional types are like a big chunk of what makes TypeScript TypeScript.

            It is how TS is able to "type" a completely untyped language.

            Just the support for union types is something that not even Haskell or OCaml has.

            • nequo 84 days ago
              I am not familiar with TypeScript. Is there something that you can achieve with union types that you can’t with sum types or type classes in Haskell?
              • afiori 84 days ago
                TL;DR: Typescript is unsound so it can add a lot more type-level features that would make a sound type system undecidable

                Conceptually no, almost every useful union type can be easily converted to a sum type. In my opinion the difference is in the ergonomics and in the implicit structural subtyping.

                For example, a common union type is number|string, and the beautiful part is that to use a value of such a type you do not need to do any matching or mapping; you can just use the value, as it does not have a runtime wrapper. For example, (x: string|number) => JSON.stringify(x) works perfectly fine.

                Also, you can have a function that takes an Array<string>|number|null as input and returns a string|number, without having to declare different constructors for the input number type and the output number type.
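
                Something like this (a quick sketch):

                    function describe(x: Array<string> | number | null): string | number {
                      if (x === null) return 0
                      if (Array.isArray(x)) return x.length   // narrowed to Array<string>
                      return `got the number ${x}`             // narrowed to number
                    }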

                I believe that you can essentially implement this behaviour by generating enough typeclasses in Haskell, but regardless of the feasibility, it likely would not be a good idea.

                An example of something in between union types and Hindley–Milner sum types is OCaml's polymorphic variant types (https://ocaml.org/manual/5.2/types.html#sss:typexpr-polyvar), which are (I believe) more advanced than TS unions but also a lot less ergonomic to use.

                And TS has much more, e.g. intersection types: you could have a function with the type

                    ((x: number) => string) & ((x: string) => number)

                meaning that it is both a function that maps numbers to strings and one that maps strings to numbers (again, you can do this with typeclasses, but it is a worse experience)

                TypeScript also has very good support for value types: for example, there is the string type, but also the "hello" type, which is the type of only the string "hello".

                All in all, if someone told me that they implemented TypeScript in Haskell typeclasses I would not call bullshit on them, but I would not believe that anyone would actually use it for anything.

                • tpm 84 days ago
                  > For example a common union type is number|string

                  So that's like my beloved Perl then.

          • VMG 84 days ago
            most of that? name one
            • tpm 84 days ago
              Java has type inference. Also, if a type alias is just a new name for an existing type, then you can always do something like

                class MyNewClass extends OldClass {};
              
              (of course it's not just a new name, it's also a new class, but it's also still a OldClass, and you are out of luck if OldClass is final or sealed)

              Java also has interfaces, of course. And optional properties (using Optional) and strict null checks, when you want that, you can use it.

              • VMG 84 days ago
                > type inference

                very limited, for instance you must declare the type of a public method

                > alias

                as you point out it's not

                > Java also has interfaces, of course

                but you have to implement them explicitly

                > strict null checks, when you want that, you can use it

                if we start accepting static analysis tools then C has null checks as well I guess

                • tpm 84 days ago
                  > as you point out it's not

                  so what's the difference except the name?

                  > if we start accepting static analysis tools

                  I'm not talking about static analysis. In today's Java you can write code that does not accept nulls, if you want to.

                  • svieira 84 days ago
                    You cannot write code that will fail to compile `theEntryMethod(null)` unless you only use primitive types. (You can, of course, make that method fail at runtime, but that's not what's being talked about here).
              • guipsp 84 days ago
                Using optional still has the secret third thing problem
          • wetpaws 84 days ago
            [dead]
      • yen223 84 days ago
        Java and Typescript have fundamentally different type systems, that lead to drastically different ways to approach types.

        Utility types, like Partial<T>, are basically impossible to represent in Java except with almost-duplicated classes.
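
        For example (a quick sketch of what Partial<T> buys you):

            interface User {
              id: number;
              name: string;
              email: string;
            }

            // one declaration, no hand-written "UserPatch" class
            function updateUser(id: number, patch: Partial<User>) {
              // every property of User is present, but each one is optional
            }

            updateUser(1, { name: "Ada" });             // ok
            // updateUser(1, { nickname: "Ada" });      // error: unknown property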

        • tpm 84 days ago
          > drastically different ways to approach types

          Exactly.

          > Partial<T>

          Looking at that it's just what a default POJO (with nullable properties) already is, so I'd see no need to represent that in Java.

          Looks cool though and I like Typescript; my issue with it is that it needs transpiling to run. If it was a first-class citizen in an environment I would use it for my pet projects.

          • svieira 84 days ago
            Yep - all non-primitive types in Java are `TheType | null` - TypeScript actually allows you to strip out the `| null`, which then means that sometimes you want to add it back in. So Java doesn't have a need for `Partial<T>`, it has a need for `NonNull<T>` and it can't express that at the type system level very easily right now (you can do it with type tagging and runtime checks inserted explicitly, but it's not very ergonomic right now)

            https://gist.github.com/svieira/9f8beeafb7bf4aa55d40c638532f...

          • crabmusket 84 days ago
            > it's just what a default POJO (with nullable properties) already is

            I think you missed the point. Partial<T> is an example of a "mapped type", see the handbook for more explanation: https://www.typescriptlang.org/docs/handbook/2/mapped-types....

            • tpm 84 days ago
              I understand that much, just thinking aloud what problem would that solve in Java.
              • scotty79 84 days ago
                I think it's for when you need type that expresses partial data update for an object that has some fields required.
    • jddj 84 days ago
      I'm somewhere here as well. Personally I think what I want is the stdlib (without the current legacy/ all but deprecated bits) and ecosystem of c# but with the ease and power of structural algebraic types. AoT is fine, with option for single binary. Ideally runtimeless with clever trimming. If it also ran jitted in the browser all the better.

      I also want compiler/type checker niceties like exhaustive pattern matching.

    • austin-cheney 83 days ago
      > I guess we all just wanted java with JIT

      Oh god no. What an abomination.

      Greater than 95% of the incompetence in JavaScript comes from two camps. The first of those are people who absolutely cannot program at all. The second of those are Java developers who were taught Java in school and it’s all they can do, so everything must look like Java.

      The result of both tribes is pretending to do something they cannot do on their own. When you’re a pretender, vanity becomes excessively important because everything is superficial, so you get layers of shit you don’t need that they cannot live without. Any attempt to slice off the unnecessary bullshit results in hyper-emotional distress because people feel threatened when exposed. That right there is why I will never write JavaScript for employment ever again.

    • lmm 84 days ago
      Java's type system was just very limited, gradual typing is a poor tradeoff most of the time. I used to think there were advantages to something like Python, but once I found Scala I never went back.
    • dagenix 84 days ago
      Java does jit
      • harshitaneja 84 days ago
        Yes, JIT was not the right terminology to use. I lazily wrote JIT. Apologies. What I meant to convey was the difference in startup times and run time between running something in JVM and V8. Java feels heavy but in javascript ecosystem it feels so nimble.
        • sgammon 84 days ago
          Native Java via GraalVM starts up in milliseconds.
          • wiseowise 84 days ago
            And it has to go through slow compilation step. With Node you can have a cake and eat it too.
            • DarkNova6 84 days ago
              You can't be serious about comparing the technological capabilities of the JVM and Node and objectively declare the latter as the winner.

              Compilation times are also an absolute non-issue.

              You don't compile for development. You do it for production (in the rare circumstances that you need it).

              • harshitaneja 84 days ago
                That's not what I am trying to convey here. The JVM is amazing, and it is a feat that Java is as fast as it is; JavaScript and V8 are an order of magnitude slower.

                Also, even though I found Java too verbose, I kept believing that it needed to be so to write good software. I still enjoy Java, but it doesn't compare to the ergonomics of TypeScript for me. And the nimbleness of the experience, in my view, plays a decent role.

                Currently for me, either I really care about performance and I default to rust for those applications or I need solutions where the product will evolve quickly over time and I need great DX over performance and I default to typescript for those.

                Java definitely has a role to play but its role in my work has certainly diminished.

            • sgammon 84 days ago
              It does not "have" to go through such a step, by the way, because you can simply run such code on the JVM.
            • ZhongXina 84 days ago
              You're saying it like it's an absolutely good thing. Some (many?) users would rather pay the cost upfront in compilation time (doesn't really matter if it's AOT or JIT) than pay the same cost many times over through a significantly slower runtime. JVM also scales up to supercomputers (and everything in between) if you want it to, so depending on your requirements a single-threaded alternative might not even be an option.
    • scotty79 84 days ago
      Gradual typing is the key. The problem with Java is that types are in your face way before you actually need them.

      With TS you can prototype with JS and only after you know what you are looking for you can start to add types to find bugs and edge cases and want to get nice code completions for your stuff.

    • wiseowise 84 days ago
      > Also for all the shortcomings of npm ecosystem, it is a lot less daunting and more fun to be using libraries in this ecosystem.

      God I wish they’d just integrate something lightweight like npm into JDK.

      It is beyond me why you have to install a heavyweight third-party tool just to manage dependencies.

    • 38 84 days ago
      > gradual typing

      AKA dynamic typing. Unless it's 100% static, it's dynamic

      • debugnik 84 days ago
        Gradual typing could still keep some static guarantees if the static part were sound, e.g. you couldn't assign a dynamic-typed integer to a string-typed variable without checking the type at runtime first; which TypeScript isn't.

        Elixir's new type system does much better here, as it determines whether a function actually guards for the right type at runtime ("strong arrows") and propagates the guarantees, or lack thereof, accordingly.
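
        To illustrate the TypeScript side (a tiny sketch): the dynamic part leaks into the static part with no runtime check inserted.

            const n: any = JSON.parse("42");  // dynamically typed value
            const s: string = n;              // accepted statically, no check emitted
            s.toUpperCase();                  // TypeError at runtime: s is actually a number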

    • tomjen3 84 days ago
      The type system of Java was so laughably underpowered that it severely constrained what you could write.

      In Typescript you have far more freedom, and all the benefits of strong types.

      • yen223 84 days ago
        The fact that Java forced you to write types, and then made everything implicitly nullable so that you still get NullPointerExceptions at runtime after writing out all those types, was probably a big reason why dynamically-typed languages became popular.
      • DarkNova6 84 days ago
        Strong types without strong typing that is.
    • hot_gril 84 days ago
      The type system is a big part of what made Java cumbersome. It's loosened up a little over the years. TS itself may allow partial typing, but when team/company policies are involved, you'll often end up being forced to type everything.
    • ninepoints 84 days ago
      Not having an opaque tech stack encumbered by a patent minefield is another plus.
  • rockwotj 85 days ago
    My favorite deno feature is coming to node directly. Awesome!

    Maybe this means I don't always have to install esbuild to strip types - I'm very excited about how this will make writing scripts in TypeScript that much easier. I lately have been preferring Python for one-off scripts, but personally I do think TypeScript > Python wrt types. And larger scripts really benefit from types, especially when looking at them again after a few months.

    • yamumsahoe 85 days ago
      btw if anyone is looking to run ts on node, there is tsx. there is also ts-node but i prefer tsx.

      https://github.com/privatenumber/tsx

      • iansinnott 84 days ago
        Seconded again. While tsx usually just works ts-node almost never just works. tsx is perhaps unfortunately named though so it may confuse people at first since it has nothing to do with jsx syntax.
        • jimvdv 84 days ago
          Thank you for bringing this up, I almost ignored this project since I assumed it had something to do with TypeScript + JSX.

          The JS ecosystem sure struggles with naming things.

          • tommica 84 days ago
            The _programming_ ecosystem sure struggles with naming things.
            • jimvdv 84 days ago
              Fair enough
          • nikeee 84 days ago
            I own the npm package node-ts, which has thousands of installs per week just because people confuse it with ts-node.

            I didn't intend to typo-squat. Actually, my package is older than ts-node and was just a pun because it is an API for TeamSpeak written in TypeScript.

        • tills13 84 days ago
          It's named as such to mirror `npx`
      • jessym 84 days ago
        I second this. The tsx library is zero config and always "just works" in my experience, which puts it miles ahead of ts-node, imo.
      • wruza 84 days ago
        Tsx’s only reset-in-console mode is <Enter>, which makes it impossible to develop cli apps in watch mode.

        You cannot run tsx from a non-project cwd if you’re using tsconfig/paths.

        And personally I find its maintainers relatively unpleasant to message with. It leaves a “you’re plebs” aftertaste most of the time.

      • silverwind 84 days ago
        tsx has very slow startup performance, I prefer https://github.com/swc-project/swc-node which is around twice as fast.
        • dimgl 84 days ago
          We have not seen this whatsoever. How big is your project? `tsx` is almost instantaneous in a server-side project of ours.
        • herpdyderp 84 days ago
          Does swc-node work with code coverage calculation libraries? For a long time tsx didn’t (and it’s still pretty finicky) so that kept me from using it.
          • silverwind 84 days ago
            Not sure what features those need, but at least the stack traces are correct in swc-node, so maybe worth a try.
    • medv 84 days ago
      • tnzk 84 days ago
        This seems like a very interesting approach to scripting. Does it basically provide an alias to child_process.exec as $, and besides that can I write in the same way I would in Node?

        > Node.js standard library requires additional hassle before using

        I read the hassle as having to set up the Node runtime in advance, but zx requires npm to install, so I'm not sure.

    • yamumsahoe 85 days ago
      correction: the only deno feature that i want
      • sholladay 84 days ago
        Deno has so many other great features. Most web standard APIs are available in Deno, for example. It can do URL imports. It has a built in linter, formatter, and test framework. Built in documentation generator. A much better built in web server.

        Node is copying many of these features to varying degrees of success. But Deno is evolving, too.

        • johnny22 84 days ago
          the url imports is one the things I don't want.
          • niklasmtj 84 days ago
            There are also the `npm:`, `node:` and `jsr:` specifiers now. So you don't have to use the URL imports if they don't feel right to you.
          • sholladay 84 days ago
            You want to be forced to use a centralized registry? I don’t know. URL imports also enable fully isomorphic modules. I think you would enjoy the freedom of URL imports if the ergonomics were better. For example, it should just default to https:// so you don’t have to type that. Import maps also help a lot with this, definitely use them. But they could be even better by having first-class support for templating the module version into the URL so that the version can be stored separately, alongside the module name. Popular hosts with well-known URL structures could have their URLs automatically templated so you only have to specify the host and not the rest of the URL.

            In other words, the tooling could be better, but the fundamentals of URL imports are sound, IMO.

            • zzo38computer 84 days ago
              I disagree. It should not default to "https://" (I think defaulting to local files would be better).

              Furthermore, I think that it should be made so that the "hashed:" scheme that I had invented (in the Scorpion protocol/file-format specification document, although this scheme can be used independently of that) can also be usable.

              And I would also disagree with automatically templating popular hosts with well-known URL structures, although it might do to allow arbitrary expressions in place of the string literals and then add functions for abbreviations of some of those URLs, if that would help (although I still think it is unnecessary).

            • johnny22 76 days ago
              yes, I would prefer to use a centralized registry indeed. However, that's not actually what i'm talking about here. Even just decoupling the import from the package is enough. You can already do this by pointing a package in package.json to a remote tarball or git repo.
        • throwitaway1123 84 days ago
          Node supports URL imports via the --experimental-network-imports command line option. There's also a built in test runner now.
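
          Rough usage sketch (the module URL is made up):

              // main.mjs, run with: node --experimental-network-imports main.mjs
              import { leftPad } from "https://example.com/left-pad.mjs"  // hypothetical URL
              console.log(leftPad("7", 3))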
      • WuxiFingerHold 84 days ago
        ... and obviously the only one you know.

        Kidding aside: You should really take an hour and check out the manual and std lib (https://jsr.io/@std). I was surprised how far Deno has come. A lot of pretty useful stuff you would otherwise need tons of NPM modules for.

  • throwitaway1123 85 days ago
    It's been a really eventful month for Node. First they added node:sqlite in v22.5.0, and now TypeScript support is landing. I love the direction Node is heading in.
    • eknkc 84 days ago
      It is Bun influence / competition I guess. Good for everyone.
      • mark_and_sweep 84 days ago
        I believe the competition started with Deno. But yes, Bun is part of the competition now, too.
        • eknkc 84 days ago
          Deno was doing its own thing though.

          Bun came out swinging with strong Node.JS compatibility promises. I have simply replaced node with bun for most of my own work without much effort. The mental effort required is to use `bun` instead of `node` in command line for most of the trivial things.

    • crabmusket 84 days ago
      The recently-added test runner is very cool too!
      • SwiftyBug 84 days ago
        YES. It was such a joy to be able to ditch Jest completely and run tests natively.
        • joseferben 84 days ago
          i tried that but had to revert to vitest, the native test runner feels incomplete atm.
          • crabmusket 84 days ago
            What is missing for your use case or workflow?
            • sureIy 84 days ago
              Probably a bunch of assertion types and general DX. Node:test is just a feature, Vitest is a whole product. The former might be enough for small packages but nowhere near useful for anything non-trivial.
              • crabmusket 84 days ago
                Fair enough. We still use Jest at work for exactly those reasons. In my personal projects, I prefer to minimise dependencies rather than get every DX benefit I can.
            • joseferben 83 days ago
              for instance, exiting the runner on the first error. and the diffs with node:assert are not as nice compared to vitest, either.

              i'm building a framework with minimal dependencies at https://www.plainweb.dev so i'm super excited about everything that node builds in (sqlite, typescript).

              but there is a difference between supporting the bare minimum and actually making it nice to use day to day.

      • conaclos 84 days ago
        I tested it a few months ago. However, the output isn't very human friendly.
      • herpdyderp 84 days ago
        I just started using it the other day and it’s a dream. I look forward to the eventual stability of their snapshot testing.
  • satanacchio 84 days ago
    Hi I'm the author of the PR, AMA
    • mostafah 84 days ago
      Thank you a lot. Great work. I know it’s still experimental, but over time it will have a big impact on developer experience and will simplify the development workflow for a lot of projects.
    • Shacklz 84 days ago
      Great work, many thanks!

      Out of curiosity, what do you see as next steps, and what possible futures do you see for typescript in the node- and overall JS-ecosystem?

      • satanacchio 84 days ago
        this is the roadmap https://github.com/nodejs/loaders/issues/217. We talked with the typescript team and we will give each other continuous feedback on the progression. We made sure to take some precautions in order to avoid breaking the ecosystem. I still think in production, js is the way to go, so users should always transpile their ts files.
        • nilsbunger 83 days ago
          Could you expand on why transpiling is the right long-term strategy for production? I get that right now you don't support some TS-specific features like enums. Is that the concern? Those seem like a few legacy exceptions; new TS capabilities will be "just JavaScript".

          Not transpiling would be great to reduce toolchain complexity and eliminate the need for sourcemaps just to understand exceptions and debug.

          • satanacchio 82 days ago
            The first reason is that if we supported TS features that require transformation (such as enums) we would also need to support sourcemaps, so in the first iteration I decided not to, to avoid being overwhelmed. Right now we replace inline types with whitespace, so locations are preserved. We plan to add those features, probably behind a flag at the beginning. We need to move in small steps and think very carefully; every decision could have a huge impact on the ecosystem, so I decided to start with the smallest subset possible.
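
            Roughly, the idea looks like this (my own illustration of the approach, not the actual implementation):

                // input.ts
                function add(a: number, b: number): number {
                  return a + b;
                }

                // what gets executed: types blanked out, so line/column positions still match
                function add(a        , b        )         {
                  return a + b;
                }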
            • nilsbunger 80 days ago
              That all makes sense, it sounds like you haven’t ruled out supporting TS directly in production, but it’s complex and you have to move carefully and you’re not sure if you’ll get all the way there. Is that right ?
  • gaptoothclan 84 days ago
    A long time ago I converted to using Node.js for backend work; it seemed to offer many benefits over writing code in PHP without bringing many of the problems of Java. I found Node to be somewhat clunky, a language where you had to bolt things together to get the language you wanted. Eventually I started writing Golang and it felt much easier to write: sometimes way more verbose, but the type safety just made coding simpler.

    TypeScript seemed like a good option but was just another bolt-on. I am not sure what value you gain by using TypeScript over Golang: you have nicely defined types, which is great, but it does not solve other issues with the language that are resolved in Golang (and also in Deno).

    One large benefit of using Node over Golang is the speed of prototyping, which I think having to use TypeScript largely negates, so I cannot really decide if this is a good step forward or is making Node lose some of the qualities that made it a good choice in other ways.

    • bezier-curve 84 days ago
      Typescript is safer JS. You're still using JS with TS. The "bolted on" phrasing makes me think your issue may be more the absence of more opinionated frameworks like Django, that manage everything out of the box. I love using Django, but it's a little harder to go off the beaten path with it.
      • hot_gril 84 days ago
        "Bolted on" is how I'd describe it too. Using TS means messing a lot more with random config files. And standard tools like the NodeJS profiler don't work with TS, which hopefully will change soon.

        I've never used Django. Express seems a lot nicer.

      • gaptoothclan 84 days ago
        My phrasing there was a direct comparison of developer experience between Golang and Node.js. Golang has a very complete core library, and I try to avoid frameworks as much as possible. I rarely have to think about the language or ecosystem; everything I want or need is already part of the language. Testing and linting are some great examples.
    • __alias 84 days ago
      I mean the obvious answer is language familiarity. If your project's frontend code is in JavaScript/TypeScript (which it is), then using Node is an easy choice. Shared libraries, shared types, etc etc
      • gaptoothclan 84 days ago
        I was in that paradigm; there was very little code reuse from frontend to backend. Sometimes, when performing validation, I would have liked to have that option, but I would not consider it a killer feature that determined the language I use.
      • fulafel 84 days ago
        Lots of people do of course use other languages for the frontend. (Or go for thin frontends, ala HTMX )
        • __alias 76 days ago
          1. That's a lie, and "lots" of people don't use HTMX (unless I've been living under a rock and there is a substantial number of people using it :D)

          2. HTMX IS javascript, and you can still use the same familiar packages across frontend and backend, e.g. lodash
    • _joel 84 days ago
      I guess if you already know Javascript, or have inhouse experience vs. learning Go. We use it with cdktf as previous fe experience, seemed logical vs. Go
      • gaptoothclan 84 days ago
        sorry was going to add that there are probably more javascript developers in the jobs market, although there is a limit to the usefulness of these developers.

        In the company I worked at we were fairly small and did not have huge applications running on node, so it made that journey easier

  • 65n56nm5665m56 84 days ago
    I beg of thee, do not do this. I get that people love typescript but I am already running into a problem where javascript resources are written in typescript by default with nothing for regular javascript. This is the same problem that happened when JQuery hit its peak popularity and an overwhelming amount of resources and guides amounted to "Oh just do this in JQuery"
    • pansa2 84 days ago
      > javascript resources are written in typescript by default with nothing for regular javascript. This is the same problem that happened when JQuery hit its peak popularity

      That's definitely a potential issue - JavaScript is the fundamental standard, not JQuery and not TypeScript. Certainly there are situations where maximum forward-compatibility is important (learning resources are a good example) and for those, vanilla JavaScript is the best choice.

      All of the old resources that relied on JQuery are now hopelessly outdated, whereas the contemporary ones that used plain JavaScript are as valid now as when they were written. I'm sure the same will be true of TypeScript vs JavaScript when the next big thing comes along.

      • silverwind 84 days ago
        > the contemporary ones that used plain JavaScript are as valid now as when they were written.

        True, but on the other hand, almost any JS code snippet written 20 years ago has better and more elegant alternatives today. APIs evolve all the time.

    • lolrsten349 84 days ago
      Tell me you're out of touch without telling me that you're out of touch.

      jQuery was so popular because writing any more than a few lines of vanilla JavaScript was an *awful* experience due to all the differences in browsers.

      When things eventually standardized-ish and jQuery became unnecessary, other libraries/ecosystems popped up (e.g. React/JSX) to make writing webapps easier because writing any more than a few lines of vanilla JavaScript was still an *awful* experience.

      When webapps grew in size and scope, other "transpiled" languages popped up (e.g. TypeScript) because writing any more than a few lines of vanilla JavaScript is *still an awful* experience.

      We're stuck with JavaScript due to past decisions, but let's not pretend it's actually a good tool. If it were we wouldn't need 50,000 tools/frameworks/transpiled languages to hide how terrible it is.

      • sureIy 84 days ago
        JavaScript is not terrible and I don’t understand where you got that from. Since ES2015 came out, it’s actually rather pleasant.
        • alxndr 82 days ago
          In the first decade of this millennium, writing JS was pretty frustrating, and JQuery papered over a lot of the nastiness.
  • vithalreddy 85 days ago
    Simply amazing!

    I wonder if Bun and Deno's support for TypeScript played a big role here :)

    • matrixhelix 85 days ago
      This is why competition is important
    • yamumsahoe 85 days ago
      i do wish nodejs adopts uwebsockets (totally what makes bun fast)
      • WuxiFingerHold 84 days ago
        Yes, many aren't aware of that. If Node's web server performance is not enough, you could always use uWebSockets.js or HyperExpress with Node.
  • me_vinayakakv 85 days ago
    Nice to see Node.js getting parity on this with Deno and Bun
    • Klaster_1 85 days ago
      This reminds me of io.js situation, where in the end major fork changes were incorporated into Node. This is why I am comfortable staying with Node and npm for my projects - the features will eventually trickle down anyway.
      • devjab 84 days ago
        This is the “enterprise” approach and it’s a solid one in my book. I do think drop-ins like Bun and PNPM are always great, however, and we’ve adopted both where it has made sense. I don’t think Bun will make sense very often, as it’s only when you really need the performance that the added maintenance becomes worth it. Especially right now, where it’s not exactly stable for a lot of things. PNPM, however, is often a great improvement over NPM and doesn’t add much maintenance, as the tooling essentially gives your developers a very similar experience.

        I’m also not sure the features will eventually “trickle down”. I’m not sure NPM wants to adopt the advantages PNPM gives you, as an example, and it’s probably a good thing too, considering the basis of NPM is just a really solid system to build on top of, which it wouldn’t be if it was very opinionated. One of the big issues Node has today is that it was very opinionated with CommonJS, which made sense at the time, but is a ginormous pain in the butt in the modern world. Though the blame is obviously not with Node alone.

      • ibash 85 days ago
        But in the meantime you suffer for it.

        How many days have you spent on webpack config? Or the package.json type property? Or yarn/pnpm/etc particulars?

        I have spent too many.

        Bun is quite nice.

        • crabmusket 84 days ago
          Is Bun's bundler on par with webpack for features? You can't escape webpack* if you're targeting frontend.

          *Or vite, or whatever equivalent.

          • iainmerrick 84 days ago
            Using vite is escaping webpack!
        • Klaster_1 84 days ago
          Very true, rising popularity of deno and bun clearly indicate that new runtimes solve real issues people have. That's why I mentioned "my projects", your experience may vary.
    • theflyinghorse 85 days ago
      On the topic of typescript - yes. However, Bun has a lot more tools baked in than Node does (bun test, for instance). Would be real nice to see Node start adopting more ideas from Bun and others.
      • iecheruo 85 days ago
        Give node a closer look, it's been quietly accruing those features.

        node has a built-in test runner now

        https://nodejs.org/api/test.html
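
        A minimal sketch of how it's used (file name and assertions here are just illustrative), run with `node --test` or by executing the file directly:

            import test from 'node:test';
            import assert from 'node:assert/strict';

            test('adds numbers', () => {
              assert.equal(1 + 2, 3);
            });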

      • joshmanders 84 days ago
        I find it interesting that everyone looks at Bun and shames Node, saying it needs to catch up and implement stuff Bun has but Node doesn't, yet nobody says Bun should catch up to feature parity with Node. Node isn't trying to replace Bun; Bun is trying to replace Node, so it should be the one that needs to match parity.
        • pfg_ 84 days ago
          Bun is doing that - every update usually has node compat fixes or improvements and the list of supported modules has gone up significantly since it was released
  • Shacklz 85 days ago
    I'm honestly giddy. This could be the (slow) beginning of a new era, where "JS with types" is finally a native thing.

    I'm even willing to forgive all the mess that CJS vs. ESM is if they manage to pull this off.

    I hope this sees widespread adoption/usage, which might finally cause some movement to integrate TS into ecmascript after all. Some dynamically-typed language fanatics (who are, in my opinion, completely detached from the reality that static types are what the vast majority of devs want) still have an iron grip on TC39; this might be the start of their end. And good riddance.

    • yashap 84 days ago
      Yeah, my personal experience is that easily 95% of devs I work with/have met in person, if not closer to 99%, prefer statically typed languages. Maybe that’s a biased sample, but I do think the overall preference among devs is very strong. I also see JS slowly, more-or-less becoming TypeScript over time.
      • tstrimple 84 days ago
        My days of being a real software developer are long behind me, so I'm totally willing to accept that I'm wrong here. But when I build a POC in particular, there's a LOT of power and flexibility granted by not giving a fuck about types. Suddenly I can accept data that isn't well defined (depending on my implementation) and can persist data that otherwise would have taken code changes and approval processes to accept. I do believe there is a place for types, but to type all the things is folly. There are capabilities within JavaScript to handle both.
        • iamsaitam 84 days ago
          The malicious beauty of typescript is that at any point you can just declare something "any" and voilà, the guard rails are off.
          • leshenka 83 days ago
            It’s fine to have an escape hatch when you can’t figure/don't care about types (yes sometimes you should use unknown but that’s another topic)

            But at least TS forces (if enabled by strict flag) you to explicitly mark all those places. You can always revisit them later.
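
            For instance (a minimal strict-mode sketch; the JSON shape is made up):

                const loose: any = JSON.parse('{"id": 1}');
                loose.id.toUpperCase();       // compiles fine, throws at runtime

                const safe: unknown = JSON.parse('{"id": 1}');
                // safe.id;                   // error: 'safe' is of type 'unknown'
                if (typeof safe === 'object' && safe !== null && 'id' in safe) {
                  console.log((safe as { id: number }).id);   // narrowed/asserted explicitly
                }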

            Case in point: I’ve written A LOT of redux-saga code and figuring out types for that was exceptionally difficult for me. Sprinkled all that with a ton of anys. Had a few bugs but not anything serious.

            Finally rewriting all that slop with async-await and am really happy about it

        • iainmerrick 84 days ago
          > when I build a POC in particular, there's a LOT of power and flexibility granted by not giving a fuck about types

          I find the same thing, but only for very small throwaway scripts and the like. For anything beyond like 20 lines of code, I rapidly hit confusing cases like “is this parameter just a map, or a map of maps?” Then I add types and it makes sense again.

        • miunau 84 days ago
          Just use `any` or `unknown` when prototyping, then apply types once your happy paths start working for the first time to start catching the unhappy ones.
          • djeastm 84 days ago
            >then apply types once your happy paths start working for the first time to start catching the unhappy ones.

            I.e. the "I'll go back and add safety later" strategy. Somehow I never seem to get around to doing it.

            • miunau 80 days ago
              Like documentation, you should do this while it's fresh in your head, but I get it.
        • yashap 82 days ago
          For sure, if I’m writing a short script, no more than ~200 LOC, Python (without type hints) is my favourite language. But for a sizeable codebase, worked on by multiple devs or even just me over time, I’ve got an extremely strong preference for static types.

          Also, FWIW, TypeScript is the most “lightweight” statically typed language I’ve ever used, in terms of extra ceremony/lines of code over a dynamic language. Once you get used to its type system, and embrace the structural typing ideas, I feel the overhead is super minimal. It might slow me down by ~5% on a short script over JavaScript, while dramatically improving maintainability as a codebase grows.
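
          A small example of what that structural-typing lightness looks like (names made up):

              interface HasName { name: string }

              function greet(x: HasName): string {
                return `Hello, ${x.name}`;
              }

              // No 'implements' anywhere -- the shape alone is enough.
              const user = { name: 'Ada', createdAt: new Date() };
              greet(user);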

      • spaceheater 84 days ago
        Let's make it 94 then. I think TypeScript is an abomination forced on us by Java developers who don't want to learn JavaScript.
        • wiseowise 84 days ago
          Guard rails on a bridge are an abomination forced onto us by government because people don’t want to learn how to fly.
    • jampekka 84 days ago
      > the reality that static types are what the vast majority of devs want

      [citation needed]

      • wiseowise 84 days ago
        • dgb23 84 days ago
          This is not what GP asked for. That's the most requested feature from the state of JS survey.

          I bet the percentage of web developers who want this is a tiny minority. There are just too many issues with this.

          Static typing without the benefits of better runtime performance, soundness, strong typing etc. is basically just documentation/comments with extra steps.

          Also TS is a complex, moving target. Most devs don't want to learn new fancy features every couple of months, but prefer stability guarantees. Several notable projects have moved away from TS. Even Ryan Dahl admitted that integrating Deno with TS was probably a mistake.

          Meanwhile you have WASM slowly and steadily getting crucial features on a sound foundation.

          I'm extremely cautious about TS and wary of the hype surrounding it.

      • wokwokwok 84 days ago
        The reaction emoji on the merged PR are not particularly ambiguous.
  • cryptica 84 days ago
    It's interesting how TypeScript beat Flow in terms of popularity, and yet everyone is now calling for TypeScript to be more like Flow (to focus on plain type checking instead of transpilation; just strip out the type annotations)... And most of those people don't even know about the existence of Flow.

    The sad thing about the tech sector is that you can be right about something and yet still lose the hype-wagon popularity contest. Then your competitor copies your original idea... The exact same idea which they had previously claimed was inferior.

    It seems that the idea was inferior purely on the basis that it wasn't their idea. As soon as they've appropriated the idea, suddenly it's the best idea in the world.

  • spankalee 85 days ago
    It's about time for TC39 and Microsoft to standardize TypeScript as part of JavaScript. Not "types as comments" either, but actually TypeScript, minus the non-standard runtime semantics and modulo whatever changes are necessary to integrate the grammar.

    So many runtimes and tools are integrating TypeScript now, with multiple implementations, that a real standard is necessary. It'll be much harder to evolve TypeScript because it'll have to stay backwards compatible, but it's grown to that point now, imo.

    • dgellow 85 days ago
      I would rather let Typescript evolve a few more years before freezing its development via standardization
      • suby 84 days ago
        I'm interested to know: what are some things that you (or anyone reading this) feel TypeScript is missing / should change?
        • rty32 84 days ago
          One example:

          https://github.com/microsoft/TypeScript/issues/30551

          Which goes to https://github.com/microsoft/TypeScript/issues/9998 which captures a lot of such scenarios

          And just a few releases ago there were some big problems with handling recursive types. I think most are fixed but there may still be a few around.

          These are things that you run into on a daily basis if you write enough TypeScript.
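
          One commonly cited flavour of this (a sketch, not lifted verbatim from those issues) is narrowing being discarded inside callbacks:

              let maybe: number | undefined = Math.random() > 0.5 ? 1 : undefined;

              if (maybe !== undefined) {
                const a = maybe + 1;           // fine, narrowed to number
                [1, 2, 3].forEach(() => {
                  // const b = maybe + 1;      // error: 'maybe' is possibly 'undefined'
                });                            // the narrowing doesn't survive into the callback
              }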

          I doubt we should standardize TypeScript before we have definitive solutions to all these. And I 100% agree with the parent's comment.

        • low_tech_punk 84 days ago
          Maybe it's helpful to analyze TypeScript's track record in deprecating features and creating breaking changes, which could be a big red flag to TC39. I'm all for typing support but my work is mostly prototyping on short lived projects. People maintaining production systems that will eventually become legacy systems might have a different opinion.
        • _flux 84 days ago
          I don't have a list and I don't even write TS (or do much web dev in general), but I do follow their announcements and it seems every one of them brings new big type things.

          Nevertheless, I don't see TS support in e.g. browsers being anything useful, as in practice all deployed JS code is already packaged somehow, so the stage that converts TS to JS (and then also checks the types..) fits that just fine. It's useful for hobbyists, but I don't think that is reason enough to come up with a standard.

      • pas 84 days ago
        C++ and Java are quite good at churning out standard docs and new versions every few years. (Even Python is aiming to offer GIL-lessness!)
    • mythz 85 days ago
      This has been proposed for several years now, but there's been very little progress on it:

      https://github.com/tc39/proposal-type-annotations

      • spankalee 85 days ago
        That is "types as comments", not standardizing TypeScript.
        • dgellow 85 days ago
          It is explicitly not types as comments, it is type erasure
          • crabmusket 84 days ago
            That has no bearing on the comment you're replying to.

            The point is that the types could be TypeScript, Flow, Hegel, or something else. The browser won't perform type checking, it will just ignore the types.

            So, it is not standardising TypeScript.

          • jraph 84 days ago
            I believe you and several of your child comments might be confusing "types as comments" with "types in comments".
          • madeofpalk 84 days ago
            It explicitly is, from the link

            > At runtime, a JavaScript engine ignores them, treating the types as comments.

            The phrasing here is that the types are meaningless. They are just "comments" to the JS engine.

          • spankalee 84 days ago
            > This proposal aims to enable developers to add type annotations to their JavaScript code, allowing those annotations to be checked by a type checker that is external to JavaScript. At runtime, a JavaScript engine ignores them, treating the types as comments.
            • dgellow 84 days ago
              Yes, that’s explaining what type erasure is. “type as comments” is what Flow supports, literal comments for type annotations.

              I think we disagree on the terminology but agree on the goal of the proposal

              • spankalee 84 days ago
                "types as comments" is the term that the champions and reviewers of this proposal have been using.

                It refers to how the types are parsed: aside from some kind of standard start and end delimiters, the parser does not try to parse the expression-level type syntax. Type expressions are just strings of characters. This way you can have basically any syntax at all for types.
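
                In other words, under the proposal an engine would accept something like the following without knowing (or caring) what the annotation means; the type name here is hypothetical and the exact delimiter rules are still being worked out:

                    function walk(distance: Metres) {
                      // The runtime skips ': Metres' entirely; only an external
                      // checker (TypeScript, Flow, ...) would give it meaning.
                      return distance * 2;
                    }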

              • eyelidlessness 84 days ago
                The proposal would explicitly treat supported type annotation syntax as comments in the grammar. It is definitely types as comments, even if it is also type erasure.

                And it would apply to Flow’s type annotation syntax which is also not presently treated as comments, at least for the very large subset of that syntax which overlaps with the proposal.

              • pcthrowaway 84 days ago
                Types as comments is what jsdoc supports. Flow is compiled like typescript.

                tsc does also support jsdoc though, so technically I think tsc is closer to supporting types as comments, though it's possible Flow has this also.
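
                For reference, the JSDoc flavour looks roughly like this (checked by tsc with checkJs or a // @ts-check pragma, while the file stays plain .js):

                    // @ts-check

                    /**
                     * @param {number} a
                     * @param {number} b
                     * @returns {number}
                     */
                    function add(a, b) {
                      return a + b;
                    }

                    add(1, '2'); // flagged by the checker, but it's all just comments to the engine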

                • dgellow 84 days ago
                  • pcthrowaway 84 days ago
                    Thanks, I've never used flow and didn't know this. I've only seen it used in the non-comment form.
                • spankalee 84 days ago
                  Types _in_ comments is different from types _as_ comments. "Types as comments" refers to how type expressions are parsed.
                  • pcthrowaway 84 days ago
                    jsdoc is a format for supplying types (and annotations) as comments as well.

                    I think I get the distinction you're trying to make, but I don't think the nuance there is significant enough to merit differentiating the two.

    • paulddraper 85 days ago
      TypeScript used to have a standard.

      Years and years ago. Now it's whatever the compiler does.

      Which to be fair, is evolving quite rapidly.

      • spankalee 85 days ago
        Yeah, that was from 2016 or so. But a TypeScript spec is different from folding TypeScript into the ECMAScript standard. Some parts of TypeScript would have to be dropped or changed for that to work.
        • paulddraper 84 days ago
          Right, having a spec is easier than (and effectively a prerequisite for) your suggestion.

          But they don't have it.

    • silverwind 84 days ago
      Imho standardizing the syntax is enough and this is what https://github.com/tc39/proposal-type-annotations does.

      The type checker is immensely complex and should be left out so that other type checkers can be developed, similar to how it is for Python today.

      • spankalee 84 days ago
        That doesn't standardize the syntax, only types-as-comments. It would have to standardize some delimiters for type expressions, but that's it.

        I do think that the semantics should be standardized too, otherwise you have non-interoperable types. The goal should be that you can use two libraries together without having to make sure they use the same type-checker.

    • hajile 84 days ago
      TS is an intentionally unsound type system that tries to let you type your code even when that code runs like garbage, is unreadably complex, or uses the worst parts of the language.

      What TC39 needs is a type system that limits what you can do to things that are sound, performant, and good practice. TS is the exact opposite of this.

    • nektro 83 days ago
      TC39 standardizing TypeScript as part of JavaScript would be a critical error. TC39's job is to look at all the options and make something better. TypeScript won out in popularity due to its tight integration with VSCode. It's not necessarily the best route for the core language.
    • pier25 84 days ago
      Another reason is probably performance. Executing TS would require a lot of extra CPU and even more energy than JS.

      A lot of effort and money has been invested into JS engines. I wonder if making a TS native engine (which nobody has made yet) from scratch might make more sense than adapting JS engines to run TS.

    • vivzkestrel 85 days ago
      it's about time google chrome started making a typescript engine maybe? and got rid of JS in phases?
      • crabmusket 84 days ago
        This is a persistent meme that has no basis in reality. A TypeScript engine is a JavaScript engine, since everything that can be done in JS can be done in TS. It's plausible, maybe, that there could be some additional optimisations on TS code where the engine is sufficiently happy with all types in a subset of the program. But that would be on top of all existing JS engine features, unless you want your engine's performance to suddenly degrade if you stray outside the fully-statically-verifiable-TS happy path.
        • dtech 84 days ago
          That's a pretty obtuse interpretation of the comment. Browsers natively being able to run Typescript code / .ts files instead of requiring transpiling to plain Javascript would be a large boon to the TS ecosystem by making basically everything easier. Even if it's just stripping the TS and running the plain JS it would already be helpful, but it running the typechecking beforehand would be wonderful.
          • crabmusket 84 days ago
            Maybe I was reading too much into the comment I replied to, but to me "a typescript engine" implied more than "ignoring the types" (which is the current TC39 proposal).

            And I was replying based on what I've seen other people saying whenever the subject comes up; apologies if I misread.

            Browsers doing type checking is a pretty fraught idea IMO, at least with Typescript and not some other statically typed language entirely.

      • jjk7 85 days ago
        There are a few optimizations that types can use but 99% of applications wouldn't benefit from them anyways.
        • Shacklz 84 days ago
          Taking one link out of the toolchain (tsc) would already be a huge blessing.

          And naive me hopes for a future where in my web-app I can set a policy that any non-ts, type-incompliant code is not allowed to run.

          The number of exceptions I get in the console from terrible garbage code outside of my control, but that I have to include because enterprise, is staggering. Would love to have a meta-setting that would just kill them if they can't be arsed to even have a modicum of code-hygiene (sorry for the rant)

          • _flux 84 days ago
            Why is taking out the part that actually checks the types at the developer's side a huge blessing?

            Or if you are hoping to get the benefit of type checking in the browser itself (taking the same sweet time as tsc, but this time on every browser instead of once in the CI), then how long would you want to wait to be able to actually use the new typing functionality described in e.g. the latest TS announcement? https://devblogs.microsoft.com/typescript/announcing-typescr... .

            Because it would take a while until that would then become the standard and then become available in every browser. And you still need to provide the JS versions, because not every browser is going to support TS.

            In the meanwhile you could just keep using tsc just as before and get access to new functionality immediately.

            (I imagine you could run tsc in the browser right now if you really wanted to.)

            • Shacklz 84 days ago
              > Why is taking out the part that actually checks the types at the developer's side a huge blessing?

              Oh, no, certainly we want to keep type-checking in the pipeline, somewhere.

              However, if the browser "understood" typescript, your codebase could have immediate hot-reload, without any transpilation in-between. The type-checking could then be (and already is when using something like esbuild/swc) an entirely separate process that happens independently.

              Webpack's HMR is pretty good, but not having to modify the code at all to have it work in the browser, that'd be much much better :)

              ... the browser being able to typecheck (and reject violating code) itself is certainly something I'd love to see eventually, but fully agreed, this is not happening anytime soon.

        • flohofwoe 84 days ago
          One advantage of TypeScript in the context of performance is that it nudges you not to change the 'shape' of runtime objects too much; this should mean less runtime overhead in JS engines because code doesn't need to be re-jitted as often.

          This doesn't require the type annotations at runtime though, it's just a side effect of code being written against a static type system.
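
          Roughly the idea (a sketch; real engines differ in the details of hidden classes/shapes):

              interface Point { x: number; y: number | null }

              // Every Point is created with the same shape, so the JIT can keep
              // reusing one hidden class instead of deoptimizing.
              function makePoint(x: number): Point {
                return { x, y: null };
              }

              // Untyped JS makes it easy to do this instead, which creates a second shape:
              //   const p = { x: 1 }; if (cond) p.y = 2;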

      • mcintyre1994 84 days ago
        A lot of apps would (or at least should) still want to strip types for bundle size reasons though.

        To take one extreme example, a library I work on includes an API for calling a JSON RPC server. Instead of manually implementing each call, we use a proxy object that converts any method call on it to a JSON RPC call. Then we layer types on top so that, given an RPC object, you know every method on it and have typed input params and outputs. This means you can have any number of methods without increasing your bundle size, because all the types disappear at runtime. It also means you can add your own methods, if you’re talking to a server that implements custom ones, just by defining types. If you shipped this to the browser with the types then it’d be a much bigger bundle than without them.
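
        Not the actual library, but the shape of the trick is roughly this (method names and wire format are made up; assumes a global fetch, e.g. Node 18+):

            type RpcApi = {
              getBalance(address: string): Promise<number>;
              getBlockHeight(): Promise<number>;
            };

            function makeRpcClient<T extends object>(url: string): T {
              return new Proxy({} as T, {
                get: (_target, prop) =>
                  async (...params: unknown[]) => {
                    const res = await fetch(url, {
                      method: 'POST',
                      headers: { 'content-type': 'application/json' },
                      body: JSON.stringify({ jsonrpc: '2.0', id: 1, method: String(prop), params }),
                    });
                    return (await res.json()).result;
                  },
              });
            }

            const rpc = makeRpcClient<RpcApi>('https://example.com/rpc');
            // rpc.getBalance('...') is fully typed, yet RpcApi never reaches the bundle.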

      • rty32 84 days ago
        My take as an outsider: Google has absolutely no interest in this, especially with their recent cost-cutting measures. Google cares about things that make the "web" better for the end users so that they can sell more ads, not developer tools. TypeScript or JavaScript doesn't matter that much to Google, and actually Google probably doesn't want to see TypeScript files distributed over network (which doesn't make much sense in the first place). In all honesty, Microsoft understands development experience much better than Google and most other companies. They literally own Visual Studio, Visual Studio Code and GitHub and sell products/services for money.
    • tylerchilds 85 days ago
      i’m not sure i buy the “company floods the industry with broken tooling and deserves to be standardized” narrative.

      yeah, the ecosystem sucks, but rewarding the system incentivized to co-opt the system will actually make things worse in the long run, not better.

      for example, why did internet explorer fail?

      • dgellow 85 days ago
        What broken tooling are you talking about? tsc is broken?

        IE failed because it was a horrible browser that didn’t evolve for years and was incompatible with major web standard developments. Nothing to do with typescript, an open source, best in class type system and type checker.

        • tylerchilds 84 days ago
          IE failed because they tried to define the standard as themselves. i argue we’re witnessing that again from the same company that only gave up that strategy once they had typescript, github, npm locked in.

          i’m not bullish on political strategies being technical solutions, which is the premise.

          typescript has nothing to do with internet explorer, true, but is it really not obvious that it is the same tactic as a different brand? become the standard, steer the committee.

          and broken in that copying code between systems requires compatibility between configurations, which should be a red flag for any language.

        • tylerchilds 84 days ago
          “During the transpilation process, no type checking is performed, and types are discarded.”

          this node feature is primarily around disregarding typescript in favor of the underlying javascript it represents.

          that reminds me of this fun article: https://www.richard-towers.com/2023/03/11/typescripting-the-...

          • dgellow 84 days ago
            Yes, the feature is about being able to run typescript scripts. It’s not a type checker, it is similar to ts-node, deno, bun, etc. Typescript has been designed for that specific purpose.
            • tylerchilds 84 days ago
              seen and heard, my original point was that typescript is not poised to be a tc39 standard.

              this is still "runs some typescript", not "runs every typescript file"

              “At least initially in this PR no transformation is performed, meaning that using Enum, namespaces etc... will not be possible.”

              this type of nuance is the core of why typescript is a headache for any organization with more than a single codebase: javascript is portable, typescript is in theory, but not in observed practice.

  • jvanveen 84 days ago
    I just switched to Bun for typescript support, a free bundler and better performance. io.js flashbacks all over again :)
    • XCSme 84 days ago
      Does Bun always work for you? For me, "bun install" didn't work on any of my projects.
  • ofirg 85 days ago
    support for typescript as long as you are only using it for type checking, not if you are also using features that are not supported in the javascript version you are targeting.
    • paulddraper 85 days ago
      That is what people use TypeScript for generally.

      If you need JS syntax features that Node.js doesn't support you can use tsc, babel, etc.

      Because obviously unsupported features are unsupported, whether JS or TS.

    • flohofwoe 84 days ago
      > not if you are also using features that are not supported in the javascript version you are targeting.

      This is only 'half-assed' anyway: TS will only emulate new language features on older JS target versions, but not any JavaScript runtime features (like new Object methods). For the latter you will still need a separate polyfill solution.
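
      E.g., with something like "target": "ES2017" but a newer "lib" in tsconfig (hypothetical config, just to illustrate):

          declare const user: { profile?: { name?: string } } | undefined;

          // Syntax is downlevelled by tsc for the old target...
          const displayName = user?.profile?.name ?? 'anonymous';

          // ...but runtime built-ins are not: this is emitted as-is and will throw on
          // engines without Array.prototype.at unless you ship a polyfill yourself.
          const last = [1, 2, 3].at(-1);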

  • alabhyajindal 84 days ago
    What next? Renaming Node.js to Node.ts?

    I understand that built-in TS support, as provided by Bun, is very nice. But I feel doing this in Node is going to take a long time.

  • nthngy 84 days ago
    IMAO writing (hard-coding) TypeScript is deprecated and a waste of time. With all the tech available nowadays it is possible to do the entire type check automatically in the IDE, even more so now with the help of AI. It's just a matter of time before we stop hard-coding type info. Better to invest time and money in getting the IDEs to work better for us.
  • deanc 85 days ago
    So how does this work in practice? Does it just strip types and YOLO trying to run, or will it spit out type errors?
    • bentruyman 85 days ago
      If you read the third sentence of the PR, it says:

      > During the transpilation process, no type checking is performed, and types are discarded

    • TheRealPomax 85 days ago
      Why would you describe that as "yolo"? If you're writing TS, you already have a TS linter that checks whether code has typing problems or not (at least, I certainly hope you do?). It's not really Node's job to do that linting, its job is to execute the JS that's hiding in the TS. It'd be "handy" if it did, but it'd also be a bit weird when there are already TS linting tools. It'd just hold up landing any sort of TS support that much longer.
    • Shacklz 85 days ago
      If I understood correctly they use a wrapper around swc to strip types, without any type-check being performed.

      Which makes perfect sense to start out with, as typechecking with tsc is rather slow and can easily be delegated to the consumer.
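
      So, roughly speaking, something like this is all that happens before execution (the annotations themselves are never checked):

          // input.ts
          function area(r: number): number {
            return Math.PI * r ** 2;
          }

          // what effectively runs after stripping:
          // function area(r) {
          //   return Math.PI * r ** 2;
          // }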

  • skybrian 84 days ago
    It would be nice for debugging if at least simple npms could just bundle their .ts files without any processing, so we could see the comments and types as they existed in the git repo. Apps can always minify them later.

    (I use Deno, but also use some npms.)

    • flohofwoe 84 days ago
      You can simply create npm packages which contain only the 'unprocessed' TS source files (or really any type of files - for instance I experimented with using npm as a package manager for C/C++ projects in the past, it works just fine). Pre-bundling or compiling from TS to JS is just a convention. And in case of bundling not a good one IMHO, because bundling should only be a final step in the top-level project. One good reason to compile the package content to .js/.d.ts/.map files is that the resulting package is usable both in JS and TS projects.
      • skybrian 84 days ago
        The nice thing about this change to Node.js (when it’s no longer experimental) is that you could just distribute .ts files and JS projects could use them.
  • torginus 84 days ago
    I might be ignorant but wasn't there a plan to add Typescript support to the browser itself?

    In which case wouldn't V8 support TS directly without needing to transpile?

    • herpdyderp 84 days ago
      Yes. That is noted in the PR:

      > There is a TC39 proposal for type annotations

      Which links to https://github.com/tc39/proposal-type-annotations

      It is a long ways off though.

    • jokoon 84 days ago
      all browsers would have to agree, the language would have to be very well defined and well supported across all those browsers.

      that would be a lot of work

      I would rather see new WASM features instead

  • apatheticonion 83 days ago
    I wonder how this handles import resolution.

    type: module requires file extensions and does not support importing folders like most people are used to, so it will not be compatible with most existing TypeScript code.

    Will it transform esm import syntax into require statements?

    I'd prefer it break existing code to enforce correctness

  • commercialnix 84 days ago
    • garbagepatch 84 days ago
      Looks very similar to Reason ML, another language that compiles to Js: https://reasonml.github.io/docs/en/getting-started
      • hajile 84 days ago
        ReScript was an outgrowth of ReasonML and basically split the community, killing both of them.
        • commercialnix 84 days ago
          I was not aware of this. I did see ReScript covertly bring some sanity to some NodeJS projects, and more than once. So this history of the project is worth digging into. Thank you for surfacing this insight.
        • mardifoufs 84 days ago
          What was the reason behind the split?
  • throw156754228 84 days ago
    I see it just strips the typings. So if I attach the debugger I'm still going to see javascript right?
    • zarzavat 84 days ago
      Presumably it would also generate a source map to allow debuggers to work properly, like tsc does.
  • zaphod420 84 days ago
    I would rather raw dog JavaScript than write Typescript. Typescript is an abomination.
  • umvi 84 days ago
    Side note, but IMO Typescript is too complicated. They should have stuck to a reasonably simple type system, but now I see projects with incomprehensible and frankly unmaintainable TypeScript consisting of extremely complex generics, type conditionals, and type constraints. Basically, if you aren't careful you'll find your project metaprogramming in TypeScript's Turing-complete meta-language...
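
    A contrived but representative sketch of the kind of type-level programming being described:

        // A recursive conditional type that turns "user.address.city"
        // into ["user", "address", "city"] entirely at the type level.
        type Split<S extends string, D extends string> =
          S extends `${infer Head}${D}${infer Rest}` ? [Head, ...Split<Rest, D>] : [S];

        type Path = Split<'user.address.city', '.'>; // ["user", "address", "city"]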
    • andrewmcwatters 84 days ago
      TypeScript in its current usage reminds me of Hello, World! or FizzBuzz Enterprise Edition. There's almost more code dedicated to typing than the actual running software itself in some codebases I've seen.

      The authors trick you with reasonable examples on https://www.typescriptlang.org, but in the wild you have these ridiculous codebases that couldn't control themselves, with an insane ratio of declaration files to actual source files, and you have to ask yourself, "Are you writing software to get something actually done, or do you just like writing type definitions?"

      Even people who write in C++ don't go to the lengths that TypeScript users do. It's super weird and cult-like.

      • sureIy 84 days ago
        I also get lost in types sometimes, but the point of types is that they help you. Typescript lets you just use `as any` and `as unknown` as you please, but if you want complex constraints you will need complex types.

        There are some type libraries that parse GraphQL queries and CSS selectors. They’re crazy to look at but they’re hugely helpful.

        • umvi 84 days ago
          Maybe I just have bad luck, but most of the libraries I've tried that are "crazy to look at" seem good in theory but are janky in practice. For example, openapi-fetch (https://github.com/openapi-ts/openapi-typescript/tree/main/p...), on paper seems great, but has lots of jank in practice.

          And I would wager the bugs and jank are in no small part due to the extremely complex generics/constraints.

      • umvi 84 days ago
        To be clear, I do like the type-checking benefits of TypeScript, but it requires some discipline to keep it simple. Get one unchecked TS astronaut on the team and the TypeScript can get complex and esoteric very quickly.
    • ChicagoBoy11 84 days ago
      This is something I've struggled with as a mostly solo dev. I've most often just stuck with vanilla javascript because of course that's good enough, but definitely there have been times where I hoped I had some typing helping me out. Alas, I haven't quite finagled the art of finding a way to use it "just a little bit."
  • brundolf 84 days ago
    I feel like Deno and Bun have lit a fire under the Node team to finally modernize things people have been desperately wanting for years. Great to see!
  • storafrid 84 days ago
    I have mixed feelings about this. While I do use TS with Node.js today and absolutely like the concept, its type system is still far from something mature and stable like C#. We keep running into ceilings (EDIT: lack of completeness/depth, not lack of complexity) all the time, and TypeScript questions on Stack Overflow are basically a library of workarounds. Mostly bad ones. So if I worked on Node.js I would prefer it to evolve more before actually marrying and having kids with it. But at the same time, I like the direction Node.js is taking.
    • coffeemug 84 days ago
      What's an example of a ceiling? Out of all the mass-market programming languages, TS arguably has the most advanced type system in the world. It's a modern marvel that they got it working on top of Javascript.
      • storafrid 84 days ago
        Certainly advanced, but not mature in my experience. Using e.g. classes and inferred generic function arguments quickly reveals a lot of features that are missing. Often some similar feature is present but it lacks depth/completeness. Lots of good discussions to read in the TS repo on GitHub if you're interested: optional generic type inference, extends oneof, generic values, keyof a subset, conditional types, etc.

        I want to emphasize that the reason we keep running into "ceilings" is probably because of its advanced type system. Libraries and frameworks are using those type features, and when we can't keep building on the type we end up casting to unknown and reconstructing it. Which feels worse than not being able to construct that complex type at all.
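
        One concrete example of such a ceiling (a hypothetical function, just to illustrate the lack of partial generic inference):

            declare function decode<T, Raw>(raw: Raw): T;

            // Either let everything be inferred (T falls back to unknown)...
            const a = decode({ id: '1' });

            // ...or spell out every type argument; "specify T, infer Raw" isn't supported:
            const b = decode<{ id: number }, { id: string }>({ id: '1' });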

      • bottlepalm 84 days ago
        I think it took the seemingly impossible challenge of bringing typing to a dynamic language that made typescript so powerful in the first place.

        All other static languages start bottom up, simple to more complex, but end up getting boxed in by their own design. TypeScript started top down, trying to map itself on to a fully dynamic language. Never getting boxed in, just trying to 'fill' the box that is all the possibilities of JavaScript. 10 years on and TypeScript is still exciting, making significant updates and improvements.

        • randomdata 84 days ago
          Typescript isn't particularly powerful compared to other non-mainstream languages, though, which is why the parent comment was careful to add that caveat. Which is to say that I'm not sure the idea that "all other static languages" start simple and get boxed in stands up.

          You may have a point that Typescript would have been relegated to obscurity with all the others had it tried to start "top down" as a brand new language. There may be some truth that it is a necessity of a language to start simple in order to become accepted in the mainstream and that Typescript only made it because it rode on the coattails of a language that also started simple: Javascript.

      • mdhb 84 days ago
        I don’t know what you mean here by advanced? If you mean the sheer amount of fuckery they have to do in order to make it work with JS perhaps you have a point.

        If you mean expressiveness or consistency or soundness then no, it’s actually very bad compared to almost anything else, and I think the longer it goes on the more it starts to feel like a house of cards.

        The upside I guess is that whenever Safari decides to get their shit together Web Assembly is well placed to get us out of the scenario where we are forced to use JS and as an extension Typescript at all for most things and actually good language choices with reliable type systems like Dart, Kotlin and C# all become viable options.

        There is no way I’d choose JavaScript over those other options in the majority of scenarios unless I was forced to.

        • wiseowise 84 days ago
          > The upside I guess is that whenever Safari decides to get their shit together Web Assembly is well placed to get us out of the scenario where we are forced to use JS and as an extension Typescript at all for most things and actually good language choices with reliable type systems like Dart, Kotlin and C# all become viable options.

          Out of those three only Dart has nice DX story compared to JS world.

    • SebastianKra 84 days ago
      This is the first time I'm hearing such a claim.

      In C# you can't work with optional generics because an optional reference type is different from an optional value type.

      C#'s poor type inference often requires you to type out types thrice. You can't declare constants or class members with type inference.

      The only way to define sum-types (A | B | C) is through interfaces and I'm pretty sure they can't be sealed. Defining product-types (A & B & C) is impossible.

      • storafrid 84 days ago
        Sorry I probably used the wrong term, not a native English speaker. I didn't mean lack of complexity or lack of "features" but rather the lack of carefully thought-through feature "depth". Like, we can infer generic arguments which is nice, but then we try doing that with some keyof complex type and it doesn't work. And later we find an issue on GitHub saying that it's not implemented. Which is fine, I love TS anyway and it's evolving.
      • fire_lake 84 days ago
        C# has record (product) types now.
        • crabmusket 84 days ago
          I'm not sure about the terminology here, but the & in TS is much more than a record. You can use it to smush types together e.g.

          {name: string} & {birthday: Date}

          becomes a single type with both properties.

          • fire_lake 84 days ago
            Product types are tuple types.

            Record types are tuple types with names instead of indexes.

            The TypeScript “&” is another thing.

    • Zamicol 84 days ago
      Well said. Bearing that in mind, we've found that JSDoc is a reasonable substitute for some TypeScript applications; however, JSDoc has limitations that we've run into frequently as well.
  • olalonde 84 days ago
    It just took 3 weeks to merge this? Seems really fast by Node.js standards. I assume it was discussed before the PR was opened?
  • DrMiaow 84 days ago
    Without some kind of transformation on `enum` isn't this mostly useless? `enum` is heavily used in many TypeScript code-bases.
  • zelphirkalt 84 days ago
    If this means, that one day I don't need to use a bundler any longer to make a website using TypeScript, I am all for it.
    • ivanjermakov 83 days ago
      For that, the browser needs to understand TS natively. Otherwise, transpilation has to be done by tsc, Bun, or something else.
  • hpeter 83 days ago
    So, how much slower is it than running js?
  • rglover 84 days ago
    Didn't expect types to come to JavaScript like this (native or without a separate compiler), but I love the idea.
  • rendall 83 days ago
    Love this. Next on my wish list road map is tail end recursion
  • tracker1 84 days ago
    Nice to see this, and important to stay closer to parity with Deno and Bun.
  • jdeaton 84 days ago
    Isn't the whole idea of TS that you just convert to JS to run it?
    • WorldMaker 84 days ago
      Yes. The question is where you do the conversion and how much of a conversion to do. This allows a key conversion (type stripping) directly at load time by Node rather than needing an extra step (an external type stripper such as Typescript or esbuild) sometime before passing the file to node for loading.
  • sgammon 84 days ago
    Really glad to see this
  • winrid 84 days ago
    anyone know, for example from Bun where this is already done, how this impacts startup times with large applications?
  • dangoodmanUT 84 days ago
    NodeJS is the ChatGPT of JS.

    Eventually they add the features something else has, and then everyone uses it again instead of the other thing.

  • CodeCompost 84 days ago
    Ok that's cool, but how about adding native support to browsers?
  • RadixDLT 84 days ago
    Oh, great, people love JavaScript. What a shock.
  • low_tech_punk 84 days ago
    Is there a slippery slope? Node -> Browser
  • just-tom 85 days ago
    Nice, but without support for Enums, for me it's mostly useless.
    • Shacklz 85 days ago
      In our codebase we started to disallow enums in favour of string literal types, and once folks get over the ingrained "this needs to be an enum" (coming mostly from other languages like Java), it's not much missed.

      Enums are one of the very few things in typescript that seem to not have turned out that well, but it's relatively easy to work without them with string-literal types and such, derived from some const in case they're also needed at runtime.
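
      E.g. (a minimal sketch of the pattern):

          const Status = {
            Active: 'active',
            Archived: 'archived',
          } as const;

          type Status = (typeof Status)[keyof typeof Status]; // 'active' | 'archived'

          function setStatus(s: Status) { /* ... */ }

          setStatus('active');       // fine
          // setStatus('deleted');   // compile error, no enum needed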

      • shepherdjerred 84 days ago
        100% agree. Coming from Java I really wanted an Enum type, but string literals/discriminated unions fill that niche just fine.
      • bythreads 84 days ago
        agree enums in typescript are the devil
        • revskill 84 days ago
          Could you explain why?
    • azangru 84 days ago
      There are union types for strings; and there are plain javascript objects (typed as const) if "namespace.name" syntax is desired. With these available, what is the point of enums?
      • silverwind 84 days ago
        Namespaces are also not supported as per https://github.com/nodejs/node/blob/main/doc/api/typescript.....

        I for one can live without all these features and will be banning them via eslint.

        • azangru 84 days ago
          Yes. Both namespaces and enums (and probably the "private" keyword on class methods) are an early addition to the language, which would have never been added if typescript from the very start aligned closely with Ecmascript.