May 26, 2023
Internal bulletin of the Institut Pasteur
Data sent to conversational artificial intelligences (AI) such as ChatGPT, and to cloud services in general, is not under your control, so it is often impossible to know who can access it.
For example, with a specific query, it is possible to retrieve data previously sent to the AI by another user.
These tools must be used with caution; in particular:
Do not submit sensitive Institut Pasteur information (e.g. research or medical data)
ChatGPT states in its Terms of Use that users’ conversations are used to improve its model. Submitted data can therefore be accessed by third parties.
In March, Samsung employees leaked highly confidential information to ChatGPT.
How can you tell whether data is sensitive?
Please go to Home - ISS Services Portal
Do not install tools that automatically access your data
Some tools, such as extensions that automatically reply to emails, extract the content of your mailbox and use it to improve their AI models, or even more…
Some extensions offer to learn your writing style; this may require your mailbox history to be transmitted for training purposes.
Beware of suspicious extensions
Some of them may be malicious. Hackers are taking advantage of the craze for conversational AI such as ChatGPT by creating fake applications. Installing them can compromise your computer, and the data it contains may be stolen or encrypted.
Infostealer malware has been particularly virulent since late 2022.
Be even more careful: AI also benefits hackers
AI can also be used to forge even more realistic phishing emails. If you have any doubt about the legitimacy of an email, contact the ISS team at rssi@pasteur.fr.
AI is already being used to imitate voices over the phone!
Are you using an AI and wondering about the risks?
Do not hesitate to contact the ISS team (rssi@pasteur.fr) for help: these technologies evolve quickly, and many questions will arise over time.
Good to know
A phishing awareness e-learning course is available to everyone on the Kaptitude platform!
If you have any questions, please contact rssi@pasteur.fr