[HN Gopher] How we develop FDA-compliant machine learning algori...
___________________________________________________________________
How we develop FDA-compliant machine learning algorithms
Author : yshrestha
Score : 39 points
Date : 2021-02-01 19:14 UTC (3 hours ago)
(HTM) web link (innolitics.com)
(TXT) w3m dump (innolitics.com)
| king_magic wrote:
| Good read. Reminds me a lot of Andrej Karpathy's "A Recipe for
| Training Neural Networks", which has served me _extremely_ well
| in my AI career.
|
| https://karpathy.github.io/2019/04/25/recipe/
| omarhaneef wrote:
| I looked at a few of these (FDA-approved AI products) recently
| because I was curious about what the instrument was allowed to do,
| and sure enough it was more of a smart instrument that told you,
| for instance, how large a volume was.
|
| These products don't, at the moment, actually diagnose a
| condition or recommend a treatment, which is interesting.
|
| One of the constraints for AI treatment has always been authority
| and responsibility. To put it another way, if the AI completely
| messes up and recommends something terrible, whom do you sue?
|
| I am hopeful that we (as a society) will sort out these issues
| with self-driving cars and apply some of those rules to medicine.
| superbcarrot wrote:
| It's tough to assign blame and responsibility to systems that
| are inherently probabilistic. ML algorithms output
| probabilities so even as the algorithms get incredibly good,
| unlikely scenarios will inevitably occur in large enough
| samples. This isn't a problem when the task is to tag your
| Facebook photos but is a problem when driving a car in the real
| world.
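|
| To put rough numbers on that point, here is a back-of-the-envelope
| Python sketch; the 99.9% accuracy and case volumes below are purely
| illustrative assumptions, not figures from any real product:
|
|     # Even a very accurate model produces many errors at scale.
|     # All numbers here are assumed for illustration only.
|     per_case_accuracy = 0.999      # hypothetical classifier accuracy
|     cases_per_year = 1_000_000     # hypothetical deployment volume
|
|     expected_errors = cases_per_year * (1 - per_case_accuracy)
|     print(f"Expected errors per year: {expected_errors:.0f}")   # ~1000
|
|     # Chance of at least one error across n independent cases:
|     n = 10_000
|     p_any_error = 1 - per_case_accuracy ** n
|     print(f"P(>=1 error in {n} cases): {p_any_error:.5f}")      # ~0.99995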
| OnlyOneCannolo wrote:
| >It's tough to assign blame and responsibility to systems
| that are inherently probabilistic.
|
| I hear this all the time nowadays, but everything has always
| been probabilistic. Either people don't know they're
| rehashing old problems, or it's said as a cop-out, but
| nothing is fundamentally different.
| superbcarrot wrote:
| I'm not following. How is everything probabilistic (in a
| meaningful way)? Most other systems don't have built-in
| uncertainty in the same way ML does.
| OnlyOneCannolo wrote:
| Physical systems are probabilistic. Environmental
| conditions are probabilistic. And we've been trusting our
| safety to physical products operating in the real world
| for a long time.
| Gatsky wrote:
| I think it is a little wrong-headed to emphasise the litigation
| aspect. Something that I think people don't appreciate is that
| as a patient, you want a real person to take responsibility for
| your problem because that is the basis for a favourable outcome
| and a satisfying interaction. This applies to every service in
| the end, whether you are ordering food at a restaurant, trying
| to get a plumbing problem fixed or being treated for cancer.
|
| Any corporation, for-profit or not, generally seeks to
| obfuscate its responsibilities where possible (one could
| argue that this is the explicit purpose of a corporation: to
| be a virtual entity which cannot really be held accountable).
| Tobacco companies are the best and ongoing examples of this.
| Creating legal protection for AI such that accountability is
| somehow removed or diminished is not desirable in my opinion.
| It will lead inevitably to a situation where healthcare becomes
| analogous to trying to get technical support for your printer.
| Contrast this with what happens currently, where you walk
| into a clinic or a hospital with a problem and meet a bunch of
| real people who listen to you. Of course there are massive
| problems with healthcare, and many people have unpleasant or
| dangerous experiences, but it can definitely be made worse if
| we aren't careful.
| ska wrote:
| It's an interesting time for AI/ML/pattern rec in medical
| devices.
|
| People have been shipping such systems since at least the mid 90s
| (e.g. CADe) but it's always been a bit of a difficult area from a
| regulatory point of view.
|
| If you go back a couple decades, or really even only one
| depending on the panel, it was fair to describe the FDA as an
| organization that understood hardware but didn't really
| understand software. On top of this, a lot of their approach was
| focused on controlled, reproducible results. So anything
| probabilistic (or, even worse, continually learning) becomes
| difficult to frame.
|
| The FDA guidance doc/action plan (link is broken in article and
| on FDA site: it's here https://www.fda.gov/medical-
| devices/software-medical-device-...) is actually pretty
| encouraging, because they took community feedback from the
| original request for feedback
| (https://www.fda.gov/media/122535/download) and seem to have
| handled that process reasonably.
|
| Many people I've talked to without direct experience seem to
| assume the 13485 process as a whole is needlessly bureaucratic,
| but in my experience it's mostly holding yourself to a higher
| engineering standard than some other industries, and it works
| out OK. The FDA's willingness to engage with new technologies
| can be frustratingly slow, but it's mostly reasonable.
| ealexhudson wrote:
| 13485 is relatively bureaucratic, but it's not required for FDA
| per se and there are things you can do. What is more
| problematic is IEC 62304 or similar, which embeds a very
| particular SDLC into the development approach. It doesn't
| really fit AI/ML; it barely fits modern software development
| anyway.
| jdgiese wrote:
| We've had somewhat of a hard time applying agile to IEC
| 62304, although it's not impossible.
|
| This document has some useful tips:
|
| AAMI TIR45:2012 (R2018) Guidance On The Use Of AGILE
| Practices In The Development Of Medical Device Software
|
| Also, we have an open source offering that includes an
| IEC 62304-compliant software plan. You can check this out
| here:
|
| https://github.com/innolitics/rdm/blob/master/rdm/init_files.
| ..
| ska wrote:
| Details aside, it's cool to see this done out in the open;
| that will really help people who have never seen it
| understand the scope.
| yshrestha wrote:
| Agreed. The standards are designed to be impervious to
| time and to the particular implementations used. This has
| the unfortunate consequence of making them quite abstract
| and difficult to grasp at first.
|
| For example, "configuration management" was meaningless
| the first time I read it, but it really just translates to
| "version control".
|
| I think having an open source example will really help
| the concepts make sense to engineers implementing this
| for the first time.
| jdgiese wrote:
| Thanks! We think so too. It's a daunting task to go from
| ISO13485 or IEC62304 to a set of process documents. We
| hope to continue improving RDM to lower the barrier for
| researchers and startup founders in the space. We're
| about to add templates for a low-overhead QMS within the
| next week or two.
| oaiey wrote:
| 62304 does not limit your choice of SDLC or agility. You just
| have to be able to read it.
|
| In general, developing with regulatory oversight and a real
| QA department is just very different. The agile manifesto
| speaks of "working software" over "documentation". It does
| not speak of code; it speaks about working software. In a
| regulated business, "working software" includes a forest's
| worth of (hopefully digital) paper alongside the code, for
| your customers and for oversight. The paperwork is part of
| the output here, not part of the self-inflicted
| documentation.
| ska wrote:
| You can avoid it for Class I, but generally 13485 is becoming
| more harmonized and required.
|
| I don't think 62304 is too hard to adapt; I haven't done the
| process work in a while, but unless the latest rev radically
| changed things it isn't very prescriptive, not nearly as much
| as you suggest; I've seen agile and Agile variants done. It
| is a sign of the slowness of the process that it's only
| recently become required, and it reflects thinking from 15
| years ago, but it's fine.
|
| You are right that a lot of modern development processes need
| a fair bit of tweaking, but it's mostly because they aren't
| nearly careful enough for device use. This may be a bit of a
| burden on pure software devices, but it's probably the right
| trade off overall.
|
| This is why the big shift in thinking for most developers
| will be adopting 14971 (risk management), not 62304 itself
| (IEC 62304 punts to ISO 14971 for this part, but it's common
| with hardware dev).
| yshrestha wrote:
| Interesting. I do agree 62304 seems to mostly codify what
| we would consider best software practices anyway.
|
| Your statement about 14971 being the big shift in thinking
| is interesting. With a bit of clinical context, do you
| think developers can write code with risk management in
| mind?
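|
| As a concrete (and entirely hypothetical) sketch of what that
| could look like in practice: a risk control from the ISO 14971
| risk file implemented as an explicit runtime check with a
| traceable identifier. The hazard ID, thresholds, and names below
| are invented for illustration and aren't taken from the article
| or any standard.
|
|     from dataclasses import dataclass
|
|     @dataclass
|     class Prediction:
|         lesion_volume_ml: float   # model output
|         confidence: float         # model's self-reported confidence
|
|     # Risk control RC-012 (hypothetical), mitigating hazard HAZ-007:
|     # "Model reports a volume outside its validated operating range."
|     # Mitigation: route the case to mandatory clinician review rather
|     # than presenting the number as-is.
|     VALIDATED_RANGE_ML = (0.5, 500.0)   # assumed validation limits
|     MIN_CONFIDENCE = 0.80               # assumed acceptance threshold
|
|     def needs_clinician_review(pred: Prediction) -> bool:
|         """Return True if the case must be flagged for human review."""
|         low, high = VALIDATED_RANGE_ML
|         return (
|             not (low <= pred.lesion_volume_ml <= high)
|             or pred.confidence < MIN_CONFIDENCE
|         )
|
| The idea being that the check is written and tested against a
| named risk-file entry, so the traceability the auditors look for
| falls out of the normal development workflow.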
___________________________________________________________________
(page generated 2021-02-01 23:01 UTC)