Visibility and governance for AI use across your institution
Vireo Sentinel shows university, TAFE, and training provider leaders what staff and researchers share with AI tools, supports your TEQSA and Privacy Act obligations, and protects student and research data at the point of entry.
Your institution has policies. It probably doesn't have visibility.
Most universities, TAFEs, and training providers have published an AI use policy in the last 18 months. Most of those policies are not enforced. Most institutional leaders have no way of knowing what staff, researchers, and administrators are sharing with ChatGPT, Claude, Gemini, or any of the dozens of other AI tools that have arrived since.
The result is a governance gap that cuts across your duties under the Privacy Act, TEQSA's Higher Education Standards Framework, and (if you enrol international students) the ESOS Act and the National Code.
Six things happening on your campus this week
Marking and feedback
A tutor pastes student work into ChatGPT to draft feedback faster. Identifiable student work, student names, and the academic record of an enrolled student leave the institution.
Research data summarisation
A researcher pastes interview transcripts or unpublished data into Claude to help structure findings. Pre-publication research and participant information leave the institution.
Admissions and recruitment
A recruitment officer pastes applicant details into AI to help draft offers, decline letters, or shortlists. Personal information about applicants is sent to a third party.
Welfare and student support
A student support officer types notes about a student's mental health, financial situation, or disclosed disability into AI to help draft a referral. Sensitive personal information sits in a third party's prompt history.
Grant applications and IP
A research administrator pastes draft grant content, including unpublished IP and commercial-in-confidence material, into AI to refine the writing.
International student information
A faculty officer pastes details about an international student's enrolment, attendance, or visa status into AI. Information protected under the ESOS Act leaves the institution.
Blocking doesn't work. Policies on their own don't work.
Staff work from home, use personal devices, and access AI through every browser they open. Bans push usage underground. Fewer than one in ten employees have had any meaningful AI training.
Six categories of information at risk every day
Student records
Enrolment, attendance, academic transcripts, contact details.
Academic and assessment data
Student work, marking notes, moderation discussions.
Research data
Interview transcripts, participant information, unpublished findings.
Grant applications and IP
Commercial-in-confidence research material, partner agreements.
HR and staff records
Performance reviews, disciplinary matters, payroll.
International student information
Visa, attendance, and enrolment data protected under the ESOS Act and the National Code.
What the regulators expect
Privacy Act 1988 (as amended by the Privacy and Other Legislation Amendment Act 2024)
In effect now; remaining provisions commence by 10 December 2026
Universities, TAFEs, and most training providers are APP entities. The 2024 amendments introduced a statutory tort for serious invasion of privacy from 10 June 2025, allowing affected individuals to sue directly for damages. Separately, the OAIC can pursue civil penalties of up to $50 million for serious or repeated breaches. The OAIC's 2025 to 2026 enforcement priorities specifically include AI and biometric technologies. Both mechanisms apply, and they apply independently.
TEQSA and the Higher Education Standards Framework (Threshold Standards 2021)
In effect now
Section 7.3 of the Threshold Standards requires institutions to have effective information management. AI use that exposes student records, assessment data, or research material is directly relevant. TEQSA has been increasingly active on AI-related risk in 2025 and 2026 and expects institutions to be able to demonstrate, not just assert, that they are managing the risks.
ESOS Act and the National Code
In effect now, applies to providers with international students
The Education Services for Overseas Students Act and the National Code 2018 place specific obligations on institutions handling information about international students, including visa compliance data, attendance records, and welfare matters. Pasting that information into AI tools without controls is a clear gap.
EU AI Act
High-risk classification active from August 2026
Annex III of the EU AI Act classifies AI systems used in education and vocational training as high-risk. This applies to institutions operating in the EU or whose AI system outputs are used there. High-risk classification triggers documentation, logging, human oversight, and risk assessment requirements.
GDPR and UK GDPR
In effect now
Any institution with EU or UK students or partnerships processes personal data under GDPR. Sharing student data with third-party AI providers requires a lawful basis. Most institutional privacy notices do not currently cover AI tool use. That is a compliance gap the regulators have begun to investigate.
FERPA
Relevant for US partnerships
Institutions running joint programmes or partnerships with US institutions handle data subject to FERPA. Pasting student records into commercial AI tools may constitute disclosure to a third party without consent.
How Vireo Sentinel helps
See what's happening
Which AI tools your staff and researchers use, by faculty, by department, by use case. Visibility your CISO, DPO, and academic governance committee can act on.
Catch information before it leaves
Real-time detection of identifiable student information, research data, and sensitive personal information using 100+ detection patterns and entity recognition that runs locally in the browser. Staff can cancel, redact, edit, or override with a justification.
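For readers who want a sense of how point-of-entry detection can work, here is a minimal sketch of client-side pattern matching and redaction. The patterns below (an email regex, an Australian mobile format, and an invented `S` + seven-digit student-ID format) are illustrative assumptions only, not Vireo Sentinel's actual rule set, which combines 100+ patterns with entity recognition.

```typescript
// Illustrative sketch only — the patterns are stand-ins, NOT Vireo's
// production rules. Everything here runs locally; nothing is sent anywhere.

type Detection = { label: string; match: string };

const PATTERNS: { label: string; regex: RegExp }[] = [
  { label: "email", regex: /[\w.+-]+@[\w-]+\.[\w.]+/g },
  { label: "au-phone", regex: /\b(?:\+61|0)4\d{2}[ ]?\d{3}[ ]?\d{3}\b/g },
  // Hypothetical student-ID format; a real deployment would configure
  // institution-specific formats.
  { label: "student-id", regex: /\bS\d{7}\b/g },
];

// Scan a prompt before it leaves the browser: report what was found and
// produce a redacted copy the user can choose to submit instead.
function scanPrompt(prompt: string): { detections: Detection[]; redacted: string } {
  const detections: Detection[] = [];
  let redacted = prompt;
  for (const { label, regex } of PATTERNS) {
    for (const m of prompt.match(regex) ?? []) {
      detections.push({ label, match: m });
    }
    redacted = redacted.replace(regex, `[${label.toUpperCase()}]`);
  }
  return { detections, redacted };
}
```

In this sketch, a prompt like "Student S1234567 (jane@uni.edu.au) missed three tutorials" would surface two detections and a redacted copy with placeholders, at which point the user could cancel, submit the redacted version, or override with a justification.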
Support your governance
Compliance-supporting reports mapped to the Privacy Act, TEQSA Threshold Standards, the ESOS Act, the EU AI Act, and the GDPR. Evidence for your governance committee, your auditor, and your regulators.
Built for higher education
Department-level visibility
Drill down by faculty, department, or research unit. Patterns surface where you can act on them.
Privacy by design
Detection happens in the browser, before anything reaches our servers or the AI platform. Identifiable information is redacted at the source.
Deploys without an IT project
A browser extension. No agents, no network changes, no proxy configuration. Live in 10 minutes per user.
Per-seat pricing that scales
Transparent per-seat pricing across the institution, with enterprise discounts at scale. Annual invoicing on request.
A subscription is not a governance system
If your institution has a ChatGPT Enterprise or Claude Enterprise licence, the data your staff submit will not be used to train the underlying models. That's the limit of what the subscription gives you.
The admin console shows you who has a login. It does not show you what staff and researchers are putting into prompts. It cannot detect identifiable student information before it is submitted. It tells you nothing about the dozens of other AI tools your people use alongside the one you are paying for.
When TEQSA, your governance committee, or a privacy regulator asks how you manage AI use across the institution, you need more than a subscription. Vireo gives you the visibility, the interventions, and the evidence to answer with confidence.
Transparent pricing, scaled for institutions
Vireo's per-seat cost drops as your team grows. Enterprise annual invoicing is available for institutions deploying across faculties or campuses. Request a tailored quote or start with a free 14-day trial for a department or pilot group.
Get visibility across your institution
Start with a department or a pilot group. See what visibility looks like in your environment, then scale. No enterprise procurement cycle, no five-week deployment, no implementation services contract.
Frequently asked questions
Does Vireo Sentinel work for universities with multiple faculties?
Yes. Vireo provides department-level visibility so you can see AI usage patterns by faculty, research unit, or administrative team. Pricing scales as your deployment grows.
How does Vireo support TEQSA compliance for AI use?
Vireo generates compliance-supporting reports mapped to the Higher Education Standards Framework. Section 7.3 requires effective information management, and Vireo provides the visibility and evidence to demonstrate it.
Can we pilot Vireo in a single department before rolling out?
Yes. Start with a department or pilot group using a 14-day free trial. No enterprise procurement cycle required. Scale across the institution when you are ready.