5 comments

  • hn_throwaway_99 6 hours ago
    It's amusing to me how the economic and cultural incentives at so many companies are to lie as much as possible when it comes to breach disclosures while pretending that you're still technically telling the truth.

    I think that in all of these cases it would have been no worse for the companies in question if they had just sent out a dry, "just the facts, ma'am" report of what actually happened, without any of the BS "the security of our customer data is our primary priority!" statements that always accompany these kinds of breach disclosures. E.g. something like:

    On <date>, due to a vulnerability in the third-party vendor SolarWinds, which provides network security services for us, we detected the following breaches of customer data:

    1. xxx

    2. yyy

    The steps we are currently taking, and what you should do: zzz.

    ----

    Perhaps one good thing that can come out of this is that some sort of "standard" format for breach disclosures comes about (think the "Nutrition Facts" labels on food boxes in the US). All I do when I see companies trying to minimize breach disclosures is assume they're bullshitting anyway.
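    A "Nutrition Facts"-style label could even be machine-readable. As a sketch only - every field name below is invented for illustration and not taken from any real standard - here is what such a disclosure might look like as a Python data structure:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class BreachDisclosure:
    """Hypothetical 'Nutrition Facts'-style breach label (all fields invented)."""
    detected_on: str        # ISO date the breach was detected
    root_cause: str         # e.g. a compromised third-party component
    data_types: list        # categories of customer data affected
    records_affected: int   # upper-bound estimate; overstating is the safe side
    remediation: list       # steps the company is taking
    user_actions: list      # what affected users should do

disclosure = BreachDisclosure(
    detected_on="2020-12-13",
    root_cause="Trojanized SolarWinds Orion update (supply chain)",
    data_types=["email addresses", "hashed credentials"],
    records_affected=18000,
    remediation=["Rotated credentials", "Removed affected Orion servers"],
    user_actions=["Reset your password", "Enable MFA"],
)

# Structured output a regulator could validate, diff, and aggregate
# mechanically, instead of parsing PR prose.
print(json.dumps(asdict(disclosure), indent=2))
```

    The value is less in any particular field set than in making disclosures comparable across companies, the way food labels are.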

    • kmeisthax 4 hours ago
      If companies were mere profit-seeking entities, these breach notices would be minimally disruptive to the business. Most people do not immediately jump ship just because a breach happened.

      But most companies are not just that. They're barely-legal Ponzi schemes. The board and their appointed CxOs are selected specifically on the basis of how much they can get the stock price up. This results in companies making lots of terribly short-sighted decisions.

      In the specific case of breach disclosures, any bad news about a company tends to create uncertainty, which makes short-term investors and speculators close their positions, which drops the price. This drop tends to be short-term, but it imperils the liquidity of the investment, and liquid investments tend to be more valuable, so...

      • TeMPOraL 3 hours ago
        Thanks, that explains a long-standing conundrum of mine. Having worked for a cybersec/GRC startup in the past[0], I got a good look at how risks and their impacts are categorized, but I still couldn't figure out why anyone cares.

        Like, "reputational damage", obviously nobody cares if a company gets breached - 99% of the customers won't notice, 99% of the remaining won't understand it, and the competitors are probably just as much at risk; all you need to do is issue some PR note and maybe offer free credit monitoring (some US peculiarity), and you're done. Same for most other things leading to "reputational damage". It feels like it's obviously a loss of $nothing, so why do CFOs and CISOs seem to put so much interest in this impact category?

        Well, I hadn't thought about stock prices and their lack of correlation with customer experience. My bad.

        --

        [0] - I suppose I had all the things I needed to figure it out, somehow I didn't connect the dots. And/or was too busy trying to ensure our fancy probability math wasn't bullshit to pay attention to the larger context.

    • JumpCrisscross 3 hours ago
      > in all of these cases it would have been no worse for the companies in question if they just sent out a dry, "just the facts, ma'am" report of what actually happened

      This assumes there is someone on staff capable of writing a no-nonsense diagnosis.

      • TeMPOraL 3 hours ago
        Sure there is. The person writing the release gets fed internal bullet points or summaries as source material; that material is strictly less bullshit than the resulting official press release.
    • SpicyLemonZest 3 hours ago
      I'm sympathetic, but I feel like the order against Mimecast illustrates a big part of the problem here. This seems to me like a pretty detailed disclosure:

      > The investigation revealed that the threat actor accessed and downloaded a limited number of our source code repositories, as the threat actor is reported to have done with other victims of the SolarWinds Orion supply chain attack. We believe that the source code downloaded by the threat actor was incomplete and would be insufficient to build and run any aspect of the Mimecast service. We found no evidence that the threat actor made any modifications to our source code nor do we believe that there was any impact on our products. We will continue to analyze and monitor our source code to protect against potential misuse.

      But the SEC feels this was misleading, because they did not specify which source code repositories were targeted or what percentage of the code in those repositories was exfiltrated. That's the dynamic that drives these kinds of disclosures: oversharing driving demands for even more absurd levels of oversharing. They had to go calculate that precisely 76% of their M365 interop code was exfiltrated - is that information worth the cost of producing it, or even valuable to anyone in any way?

      • notatoad 2 hours ago
        >is that information worth the cost of producing it, or even valuable to anyone in any way?

        It's valuable to the SEC, because they're the ones tasked with enforcing these rules, and specifics are what allow for enforcement. If you publish an actual percentage, then they can ding you for lying if the percentage is wrong. Being vague isn't misleading on its own, but it can be used to mislead.

        If they actually know what was exfiltrated, then putting specifics in the disclosure should be a trivial matter. Maybe not a percentage of lines in the codebase, but you've got to give the SEC enough that they could potentially check it and determine whether it was a lie. And "a limited number" isn't specific enough to do that.

        • SpicyLemonZest 48 minutes ago
          I don't agree that the Securities and Exchange Commission is tasked with enforcing good cybersecurity disclosures. You'll note that this settlement formally has to do with the companies' statements to investors, although I agree with the implicit assumption a lot of people upthread are making, that the charges would not have been filed if customer disclosures were adequate.
      • Veserv 2 hours ago
        You do not need to say precisely 76%. Nobody is going to complain if you spend fewer resources to get a looser upper bound like 80%. Hell, you can make it easy for yourself and just say 100%; it costs nothing and is guaranteed not to understate the customer impact. The problem is deceptively implying less customer impact.

        But no company will deliberately overstate the customer impact, think of what it would do to their bottom lines. They much prefer spending a bunch of money to minimize overstating. Exactly.

        "If only they were allowed to understate customer impact, they could harvest even more of that reputational arbitrage" is not a very compelling justification.

  • MattSteelblade 6 hours ago
    > Unisys will pay a $4 million civil penalty;

    > Avaya will pay a $1 million civil penalty;

    > Check Point will pay a $995,000 civil penalty; and

    > Mimecast will pay a $990,000 civil penalty.

    With the exception of Mimecast, these are companies that are bringing in billions of dollars in revenue annually. How is this supposed to deter them?

    • Hilift 4 hours ago
      The fines are symbolic. Even if you look at the fine for the hotel data breach in 2018, that was only $52 million (US) and $23 million (UK), total of $75 million. And the Equifax breach? An executive VP of IT sold $584k of shares right after the breach and before the press release. Nothing happened to him, he said he was unaware of the breach. https://www.npr.org/sections/thetwo-way/2017/09/08/549434187...

      The SolarWinds supply chain attack is one of the most brilliant cyber attacks in recent history. They hit a trainload of gold bars and had as much as 14 months of dwell time with potentially 18,000 customers. Discovery must have been disappointing for the attackers.

      If you follow the most important rule, secrecy, you get plausible deniability and smaller fines.

    • ensignavenger 6 hours ago
      Unisys and Avaya are both reporting losses. This fine makes it even more of a loss. Further, if they don't mend their ways, the SEC will give them an even bigger fine.
    • 0xffff2 6 hours ago
      They pay the penalty and they are expected to fix the issue. If they don't, there will be additional enforcement actions.
      • Mistletoe 5 hours ago
        Doing anything at all probably costs more than $1M.
        • alephnerd 5 hours ago
          Not that much more.

          Furthermore, security vendors like Avaya and Unisys could arguably be in breach of contract with customers for misrepresenting their internal security protocols.

    • advisedwang 3 hours ago
      The SEC likely offered low settlements here to get agreements without having to battle in court over whether the SEC even has the authority to do this. Now that they have, to some degree, established authority here*, they can pursue enforcement harder and push companies further on disclosure.

      * i.e. a practical precedent, not a legal one

    • teeray 4 hours ago
      The law should be written to require a mandatory percentage of revenue. That will wake them up.
      • kmeisthax 4 hours ago
        It will not.

        The reason companies get breached is that the systems being breached are all legacy. Company A buys company B, who bought company C, which merged with company D. C fires D's old IT department because it's redundant, so now D's billing system is being managed by C's IT department. C then sells itself to B, who has a much more robust billing system. At this point it'd make sense to replace the billing system from D, but everyone who knew how it worked got fired in the C/D merger. So it sits around, because nobody wants to break that part of the business. Then A buys B and does another round of layoffs, so anyone who even knew about this is gone.

        Ten years and hundreds of iterations of this exact cycle later, you get an e-mail from a stranger saying they found all your customer records being sold on a cybercrime forum. Your IT department scrambles to remediate a breach in a system they've never heard of that nobody remembers installing or maintaining. It's just always been there. Corporate amnesia runs deep. People are finding forgotten old servers running unpatched versions of Windows Server 2003 that were so ritualistically overlooked you'd need to be high on Class Z mnestics just to perceive them.

        Every enterprise IT department is like this. That's why companies get breached so damned often. There is never enough time in the budget to properly document legacy systems, nor are the decision-makers at the top even aware that they exist. Their job is to eat things, and they eat voraciously. If you want to stop this from happening, you need to make M&A illegal, not just inflict more pain on invisible limbs the corporate body cannot feel.

        • akira2501 4 hours ago
          > Every enterprise IT department is like this.

          That's because it's not understood what a liability it is to allow this to occur. Perhaps if we fined them based on revenue, they would understand that IT is a core part of the company and can no longer live on the edges of the business units.

        • anitil 1 hour ago
          Thank you for a clear explanation of how this sort of thing can happen. I've seen similar issues in profit-making parts of businesses, so I imagine it can only be worse in areas seen as cost centres.
        • philipov 4 hours ago
          Well, you've convinced me. M&A should be illegal.
      • JumpCrisscross 3 hours ago
        > law should be written to require a mandatory percentage of revenue. That will wake them up.

        Percent-of-revenue fines are regressive with respect to margin.

        10% of Walmart's revenue is 4 years' profits. 10% of Equifax's is a few quarters'. Moreover, you'd have a bureaucrats' delight of companies splitting revenues across entities while courts have to litigate common control claims. Unless you have a good reason to punish low-margin businesses more heavily than high-margin ones, this is an inefficient scheme.

        Better: fines based on damages, trebled.

        • TeMPOraL 3 hours ago
          > Better: fines based on damages, trebled.

          Except damages for data leaks are kind of hard to compute, since in practice they're $0 until some of the data is provably used to cause some non-$0 worth of damage down the line.

          • JumpCrisscross 45 minutes ago
            > damages for data leaks are kind of hard to compute, since in practice they're $0 until some of the data is provably used to cause some non-$0 worth of damage down the line

            Through private action, yes. Use statute to define damages as a function of the number of people affected, the type of data released, and whether the company self-reported or was caught by the public or a regulator. Add enhancements if the company was reckless, if the data was exposed for longer than a month, or if it was accessed by foreign adversaries.

    • alephnerd 5 hours ago
      > How is this supposed to deter them

      Unisys and Avaya are both security vendors, so this is absolutely a bad look for them. Almost every security RFP asks about internal controls and how a vendor has remediated issues like these, and this is ammunition for any competitor to push a prospect to re-evaluate purchases from either on the grounds that they misrepresented their security procedures.

      Furthermore, Unisys only has an operating profit of around $200M a year, so a $4M fine is fairly brutal (that's an entire security team's operating budget for a company at Unisys' size).

      Avaya's is smaller still, so that $1M is fairly brutal for them.

    • SpicyLemonZest 4 hours ago
      It's not a case of deterrence. As the orders linked from the press release describe, all four of these companies have been cooperating extensively with the SEC to fix things up and agreed to continue doing so as part of the settlement.
  • mise_en_place 5 hours ago
    Probably not the case here, but part of the issue is how some of the NIST standards around cybersecurity are certified: API endpoints are manually tested and screenshots are provided as evidence. That process is completely manual, very inefficient, and prone to human error. This is an issue of US national security; we need more skilled hackers in this space.
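    For what it's worth, that screenshot-driven evidence gathering is exactly the kind of thing that can be automated. A minimal sketch, assuming an invented required-header list (it does not come from any NIST document):

```python
# Sketch only: turning manual screenshot evidence into machine-readable,
# re-runnable checks. The REQUIRED_HEADERS list is invented for illustration.
import json
import urllib.request

REQUIRED_HEADERS = ["Strict-Transport-Security", "X-Content-Type-Options"]

def evaluate_headers(headers) -> dict:
    """Pure check: which of the required security headers are present."""
    return {h: h in headers for h in REQUIRED_HEADERS}

def check_endpoint(url: str) -> dict:
    """Fetch one endpoint and emit structured audit evidence."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return {
            "url": url,
            "status": resp.status,
            "headers": evaluate_headers(resp.headers),
        }

# Evidence an auditor can diff and re-run, instead of a screenshot:
sample = evaluate_headers({"Strict-Transport-Security": "max-age=63072000"})
print(json.dumps(sample, indent=2))
```

    The specific checks matter less than the shape of the output: structured results can be diffed between audits and re-verified by a third party, which a screenshot cannot.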
  • hbcondo714 2 hours ago
    FWIW, other publicly traded companies disclosed[1] their breaches since the rule went live a year ago.

    [1] https://last10k.com/stock-screeners/cybersecurity

  • librasteve 6 hours ago
    I feel that it is time to criminalise corporate fraud - i.e. executives presiding over businesses or state organizations that lie about, deliberately obscure, or suppress relevant facts should expect jail time. The sentences, and the "should have known" standard, ought to be at similar levels to health and safety law.

    Several recent examples would have fallen foul of this … Grenfell tower, Tesla FSD, Boeing 737max, Thames Water, United Utilities and the EA.

    • Etheryte 5 hours ago
      I agree; we already see this in the financial industry: if you don't do your part to prevent money laundering, you can face real jail time. It's long overdue for similar liability to come to other industries, and the examples you brought up show it's clearly necessary. The free market and its financial incentives alone are not cutting it.