As a medical doctor, I extensively use digital voice recorders to document my work, and my secretary does the transcription. As a cost-saving measure, this process is soon to be replaced by AI-powered transcription trained on each doctor’s voice. As I understand it, the resulting model will not be stored locally, and I will have no control over it whatsoever.
I see many dangers, since the model is trained on biometric data and could potentially be used to recreate my voice. Of course, I understand there are probably already enough recordings of me on the Internet to recreate my voice, but that’s beside the point. Also, the question is about educating them; it is not a legal one.
How do I present my case? I’m not willing to use a non-local AI to transcribe my voice, but I don’t want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a “cloud solution”. Unfortunately, they are totally ignorant of the field of technology, and the explanations/examples need to translate to the lay person.
I don’t know where you live. But almost all US big-tech cloud is problematic (read: illegal to use) for storing or processing personal information under the GDPR if you’re based in the EU. I don’t know about HIPAA and other non-EU legislation. But almost all cloud services use US big tech as a subprocessor under the hood, which means that this use of AI and cloud is most likely not GDPR-compliant. You could mention that to the right people and hope they listen.
Edit: It’s illegal to use for processing the patients’ PII, because of transfers to insecure third countries and because big tech uses the data for its own purposes without any legal basis.
You don’t have to use a cloud service to do AI transcription (see the sketch below). You don’t even need to use AI; speech-to-text has been a thing for 30+ years.
Also, AWS has a FedRAMP-authorized GovCloud that’s almost certainly compliant with HIPAA (and its non-US counterparts).
Also also, there are plenty of cloud-based services that are HIPAA-compliant.
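For instance, here’s a minimal sketch of fully local transcription using the open-source whisper package (my choice of tool, purely as an assumption; any locally runnable speech-to-text model would make the same point):

```python
# Minimal local transcription sketch using the open-source "whisper" package
# (pip install openai-whisper; also requires ffmpeg on the system).
# Everything runs on the local machine; no audio leaves it.
import whisper

# Load a small general-purpose model; larger variants trade speed for accuracy.
model = whisper.load_model("base")

# "dictation.mp3" is a placeholder for one of the doctor's recordings.
# The language is auto-detected if not specified.
result = model.transcribe("dictation.mp3")

print(result["text"])
```

Nothing in that pipeline touches a cloud service, which sidesteps the data-transfer question entirely.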
I agree, and I suspect this planned system might get scuttled before release due to legal problems. That’s why I framed it in a non-legal way: I want my bosses to understand the privacy issue, both in this particular case and in future ones.
I would have my employer sign a legal waiver stating that, from the moment I use the technology, none of the recordings or transcriptions of me can be used to incriminate me in case of alleged malpractice.
In fact, since both can be generated in a way that sounds very assertive while containing incredibly wild mistakes, and this is a potentially life-and-death situation, I would have them legally recognize that such errors could nullify my work, and have them take the entire legal responsibility for it.
As you can see in the recent example involving Air Canada, its chatbot invented a policy out of thin air, and that policy is now costing the company. In the case of a doctor, if the wrong sedative, the wrong medication, or the wrong diagnosis were communicated to the patient, it could all have serious consequences.
All of it sounding like you (using your phrasings, etc.), extremely assertive, and so on.
A human doing that job will know not to deviate from the recording. An AI? “Antihistaminic” and “anti-asthmatic” aren’t too far apart, and that’s just one example off the top of my head.
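To make that concrete, here’s a toy sketch (string similarity is only a crude stand-in for how alike the words sound, but it shows how little separates the two terms):

```python
# Toy illustration: measure how similar the two drug-class terms are.
# Plain string similarity is a crude proxy for acoustic similarity,
# but it makes the point that one garbled syllable flips the meaning.
from difflib import SequenceMatcher

a, b = "antihistaminic", "anti-asthmatic"
ratio = SequenceMatcher(None, a, b).ratio()
print(f"{a!r} vs {b!r}: similarity {ratio:.2f}")
```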
You’re going to lose this fight. Admin types don’t understand technology and, at this point, I imagine neither do most doctors. You’ll be a loud minority, because your concerns aren’t concrete enough and ‘AI is so cool. I mean, it’s in the news!’
Maybe I’m wrong, but my organization just went full ‘we don’t understand AI so don’t use it ever,’ which is the other side of the same coin.
I understand the fight will be hard, and I’m not getting into it unless I can present something they will understand. I’m definitely in a minority, both among the admin staff and my peers, the doctors. Most are totally ignorant of the privacy issue.
Okay, so two questions:
- are you in a country that falls under the GDPR?
- is this training data to be per person?
I work in Sweden, which falls under the GDPR. There probably are GDPR implications, but as I wrote, the question is not a legal one. I want my bosses to be aware of the general issue, and this is but the first of many similar problems.
The training data is to be per person, resulting in a model tailored to every single doctor.
Do your patients know that their information is being transcribed in the cloud, which means it could potentially be hacked, leaked, tracked, and sold? How might that foster a sense of distrust and harm the patient’s progress?
Could you leverage that, and the possibility of being sued if information is leaked, with the bureaucrats?
What, exactly, are your privacy concerns about this?
Not OP, but if I were him/her: leakage of patient data. Even if OP isn’t responsible, simply being tied to an incident like this can look very bad in fields that rely heavily on reputation.
AI models are known to leak this kind of information; there are news articles about it all over.
It would be worth finding out more about how exactly the training process works, namely whether the AI company stores the training audio clips after training is complete. If not, then I’d say you don’t have anything to worry about, because the model itself can’t be used to clone your voice to any useful extent. Deep neural networks aren’t reversible like that. And even if they were, the model isn’t trained just on you; it’s trained on hundreds of thousands of people and then fine-tuned to you.
If they do store the clips, though, then maybe show them this article about GitHub to prove that there is precedent for private companies using people’s data to train AI without their explicit consent.
This is really weird. Is it common in other countries for doctors to not input the data in the system themselves?
I don’t know if it’s common practice in other countries. In Sweden, where I work, it is. I think the rationale is the following:
- It’s a lot faster to use a voice recorder.
- A doctor’s time is worth a lot more than a secretary’s (in terms of pay and scarcity).
- Using a voice recorder lets us review lab results, radiology, etc. at the same time as recording, without having to switch between tasks.
- Doctors won’t have to be good spellers or think about constructing well-thought-out sentences. We also don’t have to look up classification codes for procedures and diagnoses. All of this is done by the secretary.
Of course, we have to review the transcribed result. At my hospital, all doctors carry smart cards and use the private key stored on them to digitally sign every transcribed medical record entry.
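For the technically curious, here’s a rough conceptual sketch of what that signing step amounts to (hypothetical; on a real smart card the private key never leaves the card, and the signature is computed on-card):

```python
# Conceptual sketch of digitally signing a transcribed record entry.
# Hypothetical: a real smart card keeps the private key on-card and
# computes the signature there; this only illustrates the idea.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Stand-in for the key that, in reality, lives on the doctor's smart card.
private_key = ec.generate_private_key(ec.SECP256R1())

# Placeholder record entry text after the doctor has reviewed it.
entry = "Patient notes: prescribed antihistamine, 10 mg daily.".encode()

# The signature binds the doctor to this exact text.
signature = private_key.sign(entry, ec.ECDSA(hashes.SHA256()))

# Anyone holding the public key can check the entry wasn't altered later;
# verify() raises InvalidSignature if the text or signature was tampered with.
private_key.public_key().verify(signature, entry, ec.ECDSA(hashes.SHA256()))
print("signature verified; entry unchanged")
```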
Shouldn’t that be a HIPAA violation? Like, you can’t in good conscience guarantee that the patient data isn’t being used for anything but healthcare.