Use Caution With ChatGPT and Other AI Services

Tags: GPT, ChatGPT, AI

As many of you are aware, AI has become a popular and much-discussed topic. With the introduction of ChatGPT and other AI services, there is a great deal of justifiable fascination with the technology. That said, as with any technology, our duties to protect and preserve the data we use to perform our work do not change. A recent disclosure of data and information by Samsung into ChatGPT holds valuable lessons for all of us.
 
The UT System does not have a legal agreement with any AI developer that provides assurance of data confidentiality. Putting data into ChatGPT or similar services is therefore equivalent to disclosing that data to the public, and we must apply the same data-sharing precautions to this new technology that we use every day elsewhere. Specifically, the following information should not be placed into any AI service:
 
Any data whose disclosure to the public would be considered a breach under FERPA, HIPAA, PCI, GLBA, or any other federal or state statute. Examples include (not exhaustive):

  • Social Security numbers (SSNs)
  • Credit card numbers
  • Personally identifiable medical information
  • Financial aid information
  • Student names and grades

Additionally, great caution is advised with the following types of information:

  • Research data and intellectual property
  • Source code
  • Proprietary data
  • Internal meeting notes
  • Hardware-related information
  • Presentation notes and emails

While generative AI may prove to be a valuable tool, our use of it is limited by the degree of control we have over how data is stored and accessed. Please be cognizant of our data stewardship responsibilities as you explore these new technologies and their capabilities.
 
Ramon Padilla Jr.
Vice Chancellor IT and Innovation

