11 comments

  • throw0101d 13 days ago
    If anyone wants their own HSM, Nitrokey and Yubikey sell them:

    * https://shop.nitrokey.com/shop/nkhs2-nitrokey-hsm-2-7

    * https://www.yubico.com/product/yubihsm-2-series/yubihsm-2/

    Consider buying two so you have a backup (encrypted export/import for backup/restore is supported).

    Creating your own CA:

    * https://docs.nitrokey.com/hsm/mac/certificate-authority

    Consider using 'helper software' for running a CA:

    * https://github.com/smallstep / https://smallstep.com/docs/step-ca/

    * https://github.com/OpenVPN/easy-rsa

    * https://hohnstaedt.de/xca/

    * https://github.com/FiloSottile/mkcert (good for on-one-host dev stuff)

  • throwaway256346 13 days ago
    I work with industrial HSMs (the expensive ones) on a daily basis and their SDKs are a bugfest (both client-side and in-device). They are audited (FIPS 140-2, and now even 140-3, approved!) but apparently testing the firmware against the test vectors from the RFCs is too much to ask for...

    Contacting support about broken firmware or broken documentation is a trip to tartarus in itself. Decompiling the libraries is usually faster to figure out what is wrong.

    Don't put too much trust in them unless you really have to.

    • riskable 13 days ago
      The problem is that certification takes SO LONG and they're not allowed to change the firmware while it's being certified or afterwards. What this means is that FIPS certification is an indication of an inherently insecure device.

      It literally means it hasn't been receiving regular security patches/updates!

      • bdamm 13 days ago
        Nobody actually runs HSMs in FIPS mode anyway. FIPS certification just means it can be run in a FIPS mode, and that it did at one time pass the certification. So while it is a very useful hurdle to jump over, it is impractical to use (for the same reasons you mention, and others).
        • hedora 13 days ago
          The only time I've paid close attention to a FIPS certification process, they forced us to substantially weaken the security posture of our product by making it easier for attackers to exfiltrate keys in certain circumstances (the product was designed to be run in trusted environments, and there were many less-theoretical attack vectors, but the FIPS process didn't care about those).

          Anyway, it hasn't been a useful hurdle to jump over in my experience. At this point, if a system has a FIPS compliance mode, that lowers my opinion of its real-world security properties. If someone voluntarily insists on using FIPS-compliant stuff, I assume they're completely incompetent in all matters, professional and personal (that heuristic has worked for me 100% of the time).

    • p_l 13 days ago
      Any comments you could share about Luna HSM ones?

      I recall seeing a lot of them being reasonably accessible in cloud (and other) setups, hence my interest.

  • horeszko 13 days ago
    I built my own key-vault/HSM since I wanted to use various cryptographic algorithms (argon2 and signing JWTs) not supported by typical HSMs.

    repo for the software: https://codeberg.org/ChristopherChmielewski/cns

  • client4 13 days ago
    A tangential topic is how you can actually trust the hardware. Andrew "Bunnie" Huang has done a lot of great work in the area, first with Precursor, and lately with Infra-Red, in situ (IRIS) inspection of silicon.

    * https://www.bunniestudios.com/blog/2020/introducing-precurso...

    * https://www.bunniestudios.com/blog/2024/iris-infra-red-in-si...

  • axoltl 13 days ago
    Two small notes on definitions:

    ---

    "Secure Elements / Hardware Roots of Trust: Embedded within chips, these elements provide a secure base for trusted operations and are often used in mobile devices and IoT applications."

    SEs (Secure Elements) are discrete components. Smartcards and SIM cards are examples of SEs. They are different from a root of trust. When talked about in a cryptographic context a 'hardware root of trust' is usually a public key embedded in an immutable ROM.
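
    The "public key embedded in an immutable ROM" idea can be sketched as a toy boot check. A minimal sketch, assuming nothing about any real SoC: many devices actually burn a *hash* of the vendor key into fuses rather than the key itself, which is what's modeled here; actual signature verification is elided since the Python stdlib has no RSA/ECDSA, and all names and byte strings are made up.

```python
import hashlib

# The 'ROM': an immutable digest of the vendor's public key, fixed at
# manufacture. Real SoCs often store exactly this (a hash of the key) in fuses.
ROM_PINNED_KEY_DIGEST = hashlib.sha256(b"vendor-public-key-bytes").hexdigest()

def boot(firmware_image: bytes, shipped_public_key: bytes) -> bool:
    """First link in a chain of trust: only accept a public key whose hash
    matches the immutable one, then use that key to check the firmware."""
    if hashlib.sha256(shipped_public_key).hexdigest() != ROM_PINNED_KEY_DIGEST:
        return False  # key was substituted: refuse to boot
    # verify_signature(shipped_public_key, firmware_image) would go here
    return True

print(boot(b"firmware-v1", b"vendor-public-key-bytes"))  # True
print(boot(b"firmware-v1", b"attacker-key"))             # False
```

    Because the digest lives in mask ROM or fuses, an attacker who replaces the firmware's bundled key can't also update the reference it is checked against.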

    ---

    "Secure Enclaves: These provide isolated execution environments within a CPU..."

    Not necessarily within a CPU. Apple has the SEP (Secure Enclave Processor) which is a discrete core on the die.

    • lxgr 12 days ago
      > Not necessarily within a CPU.

      Arguably especially not within a CPU. When I hear "isolated execution environments in a CPU", I think TEE (e.g. ARM TrustZone), not Secure Enclaves.

  • wslh 13 days ago
    I see many people here recommending products without the caveats included in the article, such as tamper-resistance features. It is a warning for 1-Click [1] users.

    [1] https://en.wikipedia.org/wiki/1-Click

    • troyvit 13 days ago
      Good call. OnlyKey talks a little about it: https://docs.onlykey.io/security.html
    • lxgr 12 days ago
      I don't think I get the 1-Click reference here – could you please elaborate?
    • lobster2342 13 days ago
      Yeah, there is one flaw that I recently noticed: how does the HSM authenticate legitimate users?

      Like "I am an app, dear HSM, please perform the following crypto operation for me."

      If an attacker can pretend to be a legitimate HSM user, then she does not need key access; she just asks the HSM to perform a crypto operation on her behalf.

      On the other hand, if the HSM needs secrets in order to authenticate legitimate users, then those secrets are prone to the very attacks an HSM is supposed to protect against.

      Or don't I get it?

      • lxgr 12 days ago
        You're absolutely right: if an HSM is just used as a signing/decryption oracle, it doesn't add much value.

        They're most useful if they can perform high-level operations, such as "please validate whether the entered card PIN <encrypted PIN block> here matches <PIN verification value on file>, given <credit card number>". The output of that example operation would only be a single bit of information (yes or no), rather than e.g. leaking the entire correct PIN, or even just the decrypted PIN that was entered at the POS.

        But even just a signing/decryption oracle can be a step up from just storing long-lived private and secret keys on your application servers, where you'll never know for sure whether they were exfiltrated at some point.
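
        The "single bit of information" operation above can be sketched in a few lines. This is a toy model, not any vendor's actual command set: real PIN verification uses the IBM 3624 or Visa PVV schemes over 3DES inside the device, not HMAC-SHA-256, and every name and key here is made up.

```python
import hmac, hashlib

def hsm_verify_pin(master_key: bytes, card_number: str,
                   pin: str, pvv_on_file: str) -> bool:
    """Toy HSM command: derive a PIN verification value *inside* the 'HSM'
    and compare it to the value on file. The caller only ever sees
    True/False -- never the PIN, and never the master key."""
    mac = hmac.new(master_key, (card_number + pin).encode(), hashlib.sha256)
    pvv = mac.hexdigest()[:4]  # real PVVs are 4 decimal digits; hex here for brevity
    return hmac.compare_digest(pvv, pvv_on_file)

# Enrolment (also done inside the HSM): compute the PVV once and store it.
MASTER = b"hsm-internal-master-key"  # never leaves the device
pvv = hmac.new(MASTER, b"4111111111111111" + b"1234",
               hashlib.sha256).hexdigest()[:4]

print(hsm_verify_pin(MASTER, "4111111111111111", "1234", pvv))  # True
print(hsm_verify_pin(MASTER, "4111111111111111", "9999", pvv))  # False
```

        The point is the interface shape: the database stores only the PVV, the application only ever learns yes/no, and the master key never crosses the HSM boundary.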

      • lormayna 12 days ago
        Disclaimer: I have worked in the past for one of the major HSM vendors, but things may have changed in recent years.

        To authenticate an application, you generate a client certificate and share it with the application, in order to establish mutual authentication. When you request operations from the HSM, you authenticate yourself with that certificate. Of course, the certificate must be kept secret and not shared with anybody. There is also a sort of RBAC scheme tied to the client certificate.
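
        On the service side, the mutual-TLS requirement described above boils down to demanding a client certificate. A minimal sketch with Python's stdlib `ssl` module (the PEM file names are placeholders, and real HSM client libraries wrap this in their own SDK):

```python
import ssl

# Server-side context for an HSM-like service that requires client certs.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.verify_mode = ssl.CERT_REQUIRED  # reject clients that present no certificate
# In a real deployment you would also load the server's own identity and the
# CA that signed the application client certs (paths are hypothetical):
# ctx.load_cert_chain("hsm-server.pem", "hsm-server.key")
# ctx.load_verify_locations("trusted-app-clients-ca.pem")

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
```

        With `CERT_REQUIRED` set, the TLS handshake itself fails for any client that cannot present a certificate chaining to the trusted CA, before any HSM command is ever parsed.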

      • throwway120385 13 days ago
        It's a whole "chain of trust" kind of problem just like X.509.
        • lxgr 12 days ago
          Not really, since that would just be kicking the can down the road. Yes, you can have authentication between application servers and HSMs, but ultimately, whatever credential is used for that can be stolen from the application server, and then whoever has it can talk to the HSM.

          The real benefit of HSMs is that they can make low-level keys accessible via a high-level interface that restricts the type of operations that can be performed at all. For example, an HSM will never just hand raw keys to application servers, but rather only let them perform various operations (based on their permissions). The higher level and specific these permissions are, the better.
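
          The "never hand out raw keys, only permissioned operations" shape can be sketched as a toy class. This is purely illustrative (not PKCS#11 or any vendor API), with HMAC standing in for whatever signing primitive the device actually offers:

```python
import hmac, hashlib

class ToyHSM:
    """Sketch of the permission idea: keys live inside the object and the
    API exposes whole operations, never the key bytes themselves."""

    def __init__(self):
        self._keys = {}   # key_id -> secret bytes, never returned to callers
        self._perms = {}  # client_id -> set of (key_id, operation) allowed

    def create_key(self, key_id: str, secret: bytes) -> None:
        self._keys[key_id] = secret

    def grant(self, client_id: str, key_id: str, op: str) -> None:
        self._perms.setdefault(client_id, set()).add((key_id, op))

    def sign(self, client_id: str, key_id: str, data: bytes) -> bytes:
        if (key_id, "sign") not in self._perms.get(client_id, set()):
            raise PermissionError(f"{client_id} may not sign with {key_id}")
        return hmac.new(self._keys[key_id], data, hashlib.sha256).digest()

hsm = ToyHSM()
hsm.create_key("jwt-key", b"top-secret")
hsm.grant("app-server", "jwt-key", "sign")

tag = hsm.sign("app-server", "jwt-key", b"payload")  # allowed
# hsm.sign("attacker", "jwt-key", b"payload")        # raises PermissionError
```

          Note there is deliberately no `get_key` method at all: a compromised application server can still sign things while it is compromised, but it can never walk away with the key.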

  • nimbius 13 days ago
    If anyone wants an open source HSM on the cheap, based on a Raspberry Pi Pico and PKCS#11 compatible, check out the Pico HSM project: https://www.picokeys.com/pico-hsm/
    • filleokus 13 days ago
      > Operation Time

      > RSA key length (bits) Average time (seconds)

      > 1024 16

      > 2048 124

      > 3072 600

      > 4096 ~1000

      That must be a typo and they mean milliseconds, right? Otherwise this seems too slow to do anything useful.

      • woodruffw 13 days ago
        That does seem exceptionally slow, although RSA key generation is also notoriously slow.

        (In most settings where an HSM is used, you shouldn’t be generating keys all that often. So these times are often acceptable.)
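
        The reason RSA keygen is so variable is the random prime search: you keep drawing candidates until one passes a primality test. A self-contained sketch (textbook Miller-Rabin, not what any HSM actually ships; a 512-bit prime is used here just to keep it quick, a 2048-bit key needs two 1024-bit primes):

```python
import random, time

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits: int) -> int:
    """Draw random odd candidates with the top bit set until one passes.
    The unpredictable number of draws is why keygen time varies so much."""
    while True:
        candidate = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(candidate):
            return candidate

start = time.perf_counter()
p = random_prime(512)  # half of a (small, demo-sized) 1024-bit modulus
print(f"found a 512-bit prime in {time.perf_counter() - start:.2f}s")
```

        By the prime number theorem, roughly one in ln(2^1024)/2 ≈ 355 random odd 1024-bit numbers is prime, and each candidate costs modular exponentiations. On a slow embedded core doing that twice per key, times measured in seconds or minutes are plausible.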

    • lxgr 12 days ago
      This definitely seems useful, but it's arguably not an HSM. It's literally downloadable software! (You wouldn't download an HSM.)

      HSMs of course also run software, but they usually provide at least some level of hardening against physical attacks. In other words, it shouldn't be possible to just extract keys from them. Is that the case here?

      I think it would be more honest to call this a (possibly hardened) key server/service. Often, that's all people want from an HSM! But sometimes it isn't (whether for compliance or other reasons).

  • s4mw1se 13 days ago
    security starts at the shipping port

    Just seeing a flood of comments about everyone's cheap $10 devices got me thinking…

    How do you actually check the integrity of the HSM, both at the software level and hardware level?

    The company's hosted open source repo is only worth a shit if you can verify the integrity of the software on the device.

    Do any vendors ship with a verifiable Hardware Bill of Materials and Software Bill of Materials? How do you know the device you got 2 years ago didn't have a zero-day in a common library disclosed a year later?

    Because if you can’t continuously check the integrity of your device… well you don’t know if it’s actually secure.
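
    The software half of that check is at least mechanizable: recompute the digest of what you actually received and compare it to what the bill of materials claims. A minimal sketch with a hypothetical SBOM fragment (real SBOMs use formats like SPDX or CycloneDX; the component name and blob here are invented):

```python
import hashlib, json

# Hypothetical SBOM fragment: each component pins a digest of the artifact.
sbom = json.loads("""{
  "components": [
    {"name": "hsm-firmware", "version": "1.2.0", "sha256": "%s"}
  ]
}""" % hashlib.sha256(b"firmware blob bytes").hexdigest())

def verify_component(blob: bytes, name: str) -> bool:
    """Compare the digest of the received artifact against the SBOM entry."""
    entry = next(c for c in sbom["components"] if c["name"] == name)
    return hashlib.sha256(blob).hexdigest() == entry["sha256"]

print(verify_component(b"firmware blob bytes", "hsm-firmware"))  # True
print(verify_component(b"tampered blob", "hsm-firmware"))        # False
```

    Of course this only moves the trust: you now have to trust whoever published the SBOM, and it says nothing about the hardware half of the question.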

    • lxgr 12 days ago
      There's no way around trusting your hardware vendor (and often the software they ship it with as well; the OS is usually closed source and not user-installable, at least in the case of smart cards, which are arguably just HSMs in a different form factor).

      Traditionally, the industry has been addressing this via audits and commercial agreements.

  • dfox 13 days ago
    I particularly like the Luna USB HSM 7 that the DNSSEC root is in the process of switching to. But the price of the thing is truly ridiculous, especially for its handheld form factor.

    For a long time (10 years?) I've been thinking about how I would design a (possibly open source) HSM, and I'm pretty sure that I have a reasonably secure and tamper-proof design (including an external tamper input, which was the obvious feature for the original application I had in mind). But well, the idea of putting all that into a handheld device with no battery…

  • BerthaDouglas34 13 days ago
    [dead]
  • 5n00py 14 days ago
    [flagged]