As the ChatGPT chatbot grows more and more popular, Microsoft and Amazon are guarding against breaches of confidentiality by forbidding employees from sharing sensitive data with it.
Earlier this month, a Microsoft employee asked on an internal forum if ChatGPT or any other product from its developer, OpenAI, could be used at work. A senior engineer in Microsoft’s Chief Technology Officer (CTO) office responded that employees could use ChatGPT as long as they did not share confidential information with it.
“Please do not send sensitive data to OpenAI endpoints, as they may use it to train future models,” the senior engineer wrote in an internal post. A spokesperson for Microsoft, which is an investor in OpenAI, declined to comment.
Similarly, a member of Amazon’s corporate counsel warned employees not to share “any confidential Amazon information (including the Amazon code you are writing)” with ChatGPT. “This is important because your input information may be used as training data for further iterations of ChatGPT, and we do not want its output to contain or resemble our confidential information (I have seen examples where its output closely matches existing material),” the lawyer wrote.