[HN Gopher] Developer's Guide to SaaS Compliance
___________________________________________________________________
Developer's Guide to SaaS Compliance
Author : serverlessmom
Score : 109 points
Date : 2022-06-13 17:32 UTC (5 hours ago)
(HTM) web link (www.courier.com)
(TXT) w3m dump (www.courier.com)
| baggy_trough wrote:
| Is there a guide to this for small teams and one-man bands?
| deckard1 wrote:
| The idea is for you to stop.
|
| Regulatory capture and learned helplessness. The costs of
| compliance require deep pockets. When you look at what they
| _actually_ do, it's obvious it's just theater. You'll see
| so many comments on HN that persuade you that security and
| privacy are _too complex_ for you to handle (learned
| helplessness). It doesn't matter if your org is 2000 people
| with entire departments focused on compliance. Security and
| privacy will always be somewhere out there on the horizon. A
| mythical thing that no one can obtain. Definitely not a sole
| developer working alone in their bedroom. So better not try.
|
| Which makes it a bit crazy that this blog post is targeting
| "developers". As if developers care about any of this stuff.
| Executives at large corporations do. But those same developers
| working at that same company are off in agile land working on
| micromanaged tickets. They don't have a say in SOC. Not in
| whether it's worth it, not in how information is collected. Not
| even in how information is stored, in most cases. Because,
| again, SOC is top-down, not bottom-up. The executives pushing
| SOC are the same ones pushing Google Analytics.
| Theater.
| Aaronstotle wrote:
| I've worked at smaller orgs, and my advice would be to first
| ask yourself if it's worth the cost; these compliance efforts
| are huge time-sucks and cost-intensive.
|
| If this is still something you want to pursue, hire experienced
| help. A majority of time spent in audits is figuring out what
| the auditors are looking for, and having someone experienced
| can save you a lot of headache.
|
| Be aware that compliance is more than a one-time thing, and
| during this process you will have created either an entirely
| new department, or at the very least multiple work-streams.
| karaterobot wrote:
| I worked in this industry for a while (am still adjacent to it).
| This is a well-written guide, as far as I can tell.
|
| The thing it wouldn't mention is that very, very few companies
| actually care about complying with data security standards for the
| sake of keeping PII and sensitive data safe. They are more than
| happy to do the absolute minimum to pass an audit, and the
| absolute minimum is shockingly little.
|
| What they really care about -- and what lights a fire under them
| in the way that basic ethics and common sense apparently do not
| -- is passing vendor security reviews.
|
| Shout out to companies with very strict assessments, who actually
| pay attention and weed out companies with bad practices.
| serverlessmom wrote:
| You have a really good point, thank you for your post and the
| reminder of this aspect!
|
| I believe fully in the importance of the ethics and security
| that we as a society, and we who work in tech, should honor in
| full. It is disheartening to watch trusted companies utilize
| that -just enough- mentality to skim over the tops of audits.
| We have all seen what it looks like when companies operate from
| the absolute minimum, and how dangerous and disrespectful that
| is for everyone, company and users included.
| pc86 wrote:
| > _It is disheartening to watch trusted companies utilize
| that -just enough- mentality to skim over the tops of
| audits._
|
| I'll be honest, this mentality bothers me. The point of
| independent audits and guidelines is to tell someone what the
| minimum bar is. If the minimum is 50, and the company goes
| from 20 to 50 in order to pass that audit, that's a good
| thing, not "doing just enough...to skim over the top." If you
| want to argue the minimum should be 75 instead, fine, but
| argue that the audit isn't good enough or that the guidelines
| are wrong, not that the companies are somehow unethical or
| immoral for declining to spend more money than they need to in
| order to pass a vendor security review that is not going to
| award them any extra credit for effort.
| ab_testing wrote:
| Exactly, if you want me to wear 37 pieces of flair, why
| don't you just make the minimum 37 pieces of flair?
| FeaturelessBug wrote:
| The point is exactly that: morality isn't for "extra credit".
| We shouldn't need to be rewarded to do the right thing.
|
| I mean sure, maybe when it comes to smaller companies. But some
| of the companies with the biggest security blunders are those
| with enough money that security for their users' information
| should be a major priority, and those costs would barely impact
| their bottom line.
|
| And also, using this same train of argument, couldn't we argue
| that the cost of security compliance shouldn't be this high to
| begin with, so it's more accessible?
| pc86 wrote:
| Isn't that supposed to be the whole point of these
| security reviews? To tell people where the line is?
| Saying "just do more because it's the moral thing to do"
| is not a convincing argument, because it doesn't tell
| anyone what "more" is.
| dcveloper wrote:
| I've worked in this field, as well. Both implementing a
| FedRAMP'ed PaaS and sponsoring a CSP from the customer side
| where FedRAMP compliance was required. One thing that is often
| missing in these articles is compliance costs. Most don't
| realize that FedRAMP compliance at a High baseline is likely a
| $750K-$1M investment.
| cm2012 wrote:
| The cost is much higher than that when you account for the
| friction added to day-to-day developer work after compliance
| processes are put into place.
|
| Adding 5% more friction on every step of development
| compounds a lot.
| worker_person wrote:
| Then all the good developers leave. A succession of decent
| people hire on, get frustrated, and quit. After a while you
| just have a core group of either incompetent or desperate
| people hanging on.
|
| Management can ignore this for a few years. Rebooting things
| isn't too hard. But then the issues that could be ignored
| can't be anymore. Eventually you get sold for the
| intellectual property or customer base.
| FeaturelessBug wrote:
| So in that case it's much less about whether a company desires
| to comply and much more about whether it can realistically
| afford to do so?
| darren wrote:
| Ouch, I had no idea it cost that much. What are the main cost
| areas?
|
| What would you estimate compliance at a Moderate baseline
| would be?
| dcveloper wrote:
| 1. Engineer costs - A PaaS at the high baseline will likely
| implement 300+ controls. It's been a while since I looked
| at an IaaS CSP's FedRAMP package, but they typically fully
| implement roughly 100 controls; the rest is on the customer
| to fully implement or engineer completely. Likely $300K-$500K
| worth of engineering costs.
|
| 2. Assessment - The 3PAO assessor will likely be $100K-$200K.
| Most first-time CSPs may require more than one assessment, as
| the process is usually (1) assess, (2) submit to the FedRAMP
| PMO, (3) they provide feedback, (4) limited time to implement.
| If you cannot implement in sufficient time, you'll have to
| reassess. Note that unless you are AWS, Azure, or Google, the
| FedRAMP PMO may not prioritize you without sufficient customer
| support. As a result, your contract with your 3PAO may have
| expired, and you'll need to bring them in again.
|
| 3. Documentation experts - There's an art to generating the
| FedRAMP package. Engineers typically aren't good at it, and
| it often requires one level of abstraction above internal
| technical documentation. Having technical writing experts
| who know how to communicate the security implementation
| without diverging too much is a skill set. You share the
| bare minimum to get compliance, as there's business risk in
| sharing too much (handing implementation details to a
| competitor or untrusted source). Also, the more technical
| detail there is, the more audit questions tend to arise.
|
| 4. Control implementation SMEs - Often your engineers don't
| know how to implement a required security control, or don't
| know what the compliance people really want. Many CSPs hire
| a 3PAO assessor to advise them on how to implement. This
| cannot be the same 3PAO assessor that audits you.
|
| 5. Conflict between product/feature value and control
| implementation - Sometimes a value or feature of your product
| directly conflicts with a control requirement. A good example
| is a CMS PaaS (WP as a service or Drupal as a service). Those
| CMSs often support user code, or allow user code to spawn
| processes. The high baseline requires process whitelisting.
| Solving this problem while not destroying that feature can be
| difficult or expensive.
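|
| To make "process whitelisting" concrete, here is a rough
| application-level sketch of the idea (the binary paths and the
| wrapper itself are purely illustrative, not any particular
| FedRAMP implementation; OS-level enforcement via something
| like SELinux or AppArmor is the more common route):
|
|   import subprocess
|
|   # Illustrative allowlist of binaries the platform may spawn
|   # on behalf of user code; everything else is denied by default.
|   ALLOWED_EXECUTABLES = {"/usr/bin/convert", "/usr/bin/ffmpeg"}
|
|   def run_allowed(cmd):
|       """Spawn a process only if its binary is on the allowlist."""
|       if cmd[0] not in ALLOWED_EXECUTABLES:
|           raise PermissionError("not on the allowlist: " + cmd[0])
|       return subprocess.run(cmd, check=True, capture_output=True)
|
|   # User-supplied plugins are funneled through run_allowed()
|   # rather than calling subprocess directly.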
| sam0x17 wrote:
| Side note to that, a lot of the orgs that actually have medical
| PII (schools, especially) will be significantly less compliant
| themselves with things like HIPAA than the vendors they use.
| bob1029 wrote:
| > Shout out to companies with very strict assessments, who
| actually pay attention and weed out companies with bad
| practices.
|
| We do B2B software. All of our customers (small community
| banks) have been incredible hard-asses regarding PII
| visibility. As they should be.
|
| Even in cases where our customers use a "cloud" solution, it's
| with some niche 3rd party vendor that would never grant access
| to someone outside the secret club. Every one of our product
| installations is effectively an "on-prem" deal.
|
| All of the logging information that we get to see is redacted
| for PII by the customer's instance before it leaves their
| secure context. This is a zero-tolerance policy too. We
| anonymize even the most generic facts and ensure our hashes are
| salted as specifically as feasible. There are some cases where
| this burns us (e.g. knowing that the SSN contained a non-
| numeric character could be _very_ helpful at troubleshooting
| time),
| but on the other hand everyone sleeps better at night knowing
| that PII is not leaking out into 3rd party buckets arbitrarily.
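|
| A minimal sketch of the kind of salted, keyed hashing we mean
| (the field names and salt handling are illustrative, not our
| actual pipeline):
|
|   import hashlib, hmac, os
|
|   # Per-customer salt that stays inside the customer's secure
|   # context (illustrative; a real deployment would pull this
|   # from its own secret store).
|   SALT = os.environ["CUSTOMER_LOG_SALT"].encode()
|
|   def redact(value):
|       """Replace a PII value with a salted, keyed hash token."""
|       mac = hmac.new(SALT, value.encode(), hashlib.sha256)
|       return "pii:" + mac.hexdigest()[:16]
|
|   def redact_record(record, pii_fields):
|       """Scrub known PII fields before a record leaves the site."""
|       return {k: redact(v) if k in pii_fields else v
|               for k, v in record.items()}
|
|   # redact_record({"ssn": "123-45-6789", "event": "login"}, {"ssn"})
|   # leaves "event" intact and replaces "ssn" with an opaque token.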
|
| I would say that being very cautious about PII has opened up
| more opportunities for us. Our organization operates under the
| premise that if any one of our customers were to become
| compromised by way of our product (or consulting related to
| the product), we are _instantly_ dead and put out of business.
| We don't sell ourselves exactly this way on sales calls, but
| we do make it clear that PII is the #1 concern in our minds.
| For us, it is analogous to safety on a construction site or at
| a nuclear power plant.
| jupp0r wrote:
| I'm always wondering whether these very abstract and incoherent
| standards actually improve or damage _actual_ real-world
| security practices. I've seen whole departments at companies
| that used to be very focused on protecting customer and company
| data shift their focus to compliance measures, with actual
| security getting worse in the process.
| nullandvoid wrote:
| On this note, is there a guide to the minimum the average Joe
| SaaS developer needs to do before taking money through a SaaS
| (e.g. setting up X type of business entity, getting
| insurance)? I pay tax to the UK, so any info for that would be
| fantastic.
| mbesto wrote:
| > It also stipulates security control measures such as two-factor
| authentication (2FA) and access control for any accounts that
| store sensitive information, end-to-end encryption, training
| staff in data protection awareness, and a data privacy policy.
|
| Uhhh does it? I'm pretty sure it does not explicitly stipulate
| this.
| rsstack wrote:
| Good article!
|
| Just a note: some links are rendered incorrectly. The Markdown
| [ ] and ( ) are shown literally instead of being converted to
| an <a> tag.
| serverlessmom wrote:
| Fixed, thank you!
___________________________________________________________________
(page generated 2022-06-13 23:00 UTC)