DeepSeek or deep risk? Keeping UNSW’s (and your) data safe in the age of AI

03 Feb 2025
DeepSeek AI

Are you using AI tools safely and responsibly?

The past week has been a whirlwind in the AI space, and as we kick off the year, it's a good time to revisit this important question.

AI is evolving at breakneck speed. Every new release brings exciting possibilities alongside the potential for substantial risk. The latest tool making headlines, DeepSeek, with its promise of lower costs and capabilities claimed to outperform other popular genAI tools, has sparked interest among individuals and tech companies, and has even prompted government warnings (see below) about privacy and security.

At UNSW IT, our advice is simple: approach emerging technologies with a healthy dose of scepticism. Stay informed, be cautious and keep your personal security and UNSW’s data at the forefront of your decision making, even when running AI models locally.

To help you navigate AI safely here at UNSW, here are some high-level guidelines and reminders.

Three key reminders

  1. Be wary of new AI tools - not all tools are created equal. Many, including DeepSeek, raise serious questions about where your data goes, how it's used and who it's shared with. Before using any new AI tool, ask yourself:
  • Who owns this tool?
  • Is my data safe?
  • What is the risk?
  • What data has it been trained on?

If you can’t answer these questions confidently, then it’s best to hold off.

  2. Use only UNSW-approved AI tools - UNSW has thoroughly evaluated and approved AI tools that meet our ethical, security and privacy standards. To keep UNSW’s data safe, use only UNSW-supported AI tools, such as Copilot with Enterprise Data Protection (now known as Copilot Chat), available to all staff and students, and our limited OpenAI ChatGPT pilot, for any work- and study-related tasks.
     
  3. Never input UNSW data into unapproved AI tools - AI models can collect, store and reuse any data you provide. Unless an AI provider explicitly guarantees UNSW data will not be used for training, assume it will. Avoid uploading UNSW data such as documents, student details or internal communications to unapproved AI platforms. Privacy and security risks, including cyber threats and data breaches, are high, and censorship, bias, or incomplete information can further undermine reliability. If running AI models locally, do so securely and always follow UNSW’s cyber security policies, standards and guidelines.

Government warning: proceed with caution

Australian Government ministers have urged caution around tools like DeepSeek due to data security concerns. National security agencies are assessing the potential risks posed by such AI tools and will issue formal guidance in due course. Until then, follow our three key reminders to protect yourself and UNSW.

To stay informed:

DeepSeek: Australian ministers urge caution over AI app | SBS News

Be careful with DeepSeek, Australia says - so is it safe to use?

Stay secure: use approved AI tools

Stay informed to stay secure. For more information on UNSW’s AI policies, guidelines, and the tools you can safely use, check out:

Are you keeping UNSW’s data safe through careful use of AI? | Inside UNSW

UNSW supported AI tools

Guide to AI @ UNSW

AI Assurance Framework

By making informed choices, you help protect your data, UNSW’s data and our collective security.

Stay smart, stay sceptical and use AI responsibly.

If you would like to discuss this approach or any other AI-related concerns, please contact IT via MyIT to explore possible ways forward.
