Blog

Automated Government Decisions in Australia: Transparency in 2026

Transparency and Trust in Automated Government Decisions

Australian agencies increasingly use technology to automate decision-making processes – in areas from social services to taxation, aged care and veterans’ entitlements. The OAIC recently published a report assessing 23 federal agencies authorised to use automated decision-making (ADM), examining how transparently each agency discloses these systems under the Freedom of Information Act. The aim is to ensure such operational information on ADM is available via each agency’s Information Publication Scheme (IPS), boosting public trust.

Limited disclosure by agencies

The OAIC’s desktop review found that most agencies have not clearly documented their use of automation. Key findings include: 

  • 23 government agencies reviewed (October 2025).
  • 4 agencies (17%) explicitly disclosed using ADM in their public IPS reports.
  • 2 agencies (9%) likely use ADM (via external sources) but did not disclose it.
  • The remaining 17 agencies (74%) gave no public confirmation or policy details on ADM.

In other words, the review found “varying levels of transparency and consistency” in how agencies describe their automated decision systems. Most agencies are not proactively explaining how algorithms or automated tools influence decisions affecting citizens. This limited visibility means the public and oversight bodies cannot easily see how digital tools are used in government decisions.

Why transparency matters

The OAIC report and the Information Commissioner stress that clear, proactive ADM disclosure is crucial. The FOI Act’s IPS already requires agencies to publish operational information, and the OAIC will update the FOI guidelines to explicitly include ADM use. As Commissioner Tydd said: “Proactively publishing clear information about automated decision-making is essential to building trust and ensuring accountability”. Transparent ADM policies and reporting “improve integrity [and] strengthen public confidence” in government decision-making.

From a cyber governance standpoint, these disclosure gaps pose real risks. Incomplete or opaque ADM processes raise compliance and privacy concerns. Upcoming Privacy Act reforms (effective late 2025) will impose new legal obligations on organisations using automated decisions that affect individuals. Agencies must therefore meet both FOI/IPS transparency requirements and ensure privacy and fairness in any use of automation.

Protectera advises treating ADM systems as high-risk technology assets. That means:

  • Document and govern ADM: Assign clear ownership, audit each system, and write policies for how automated decisions are made.
  • Be open and clear: Mention the use of ADM in your IPS disclosures and privacy notices, and keep simple records that explain what each system does and how it works.
  • Prepare for audit and incident: Keep logs and be ready to explain automated outputs if questioned; integrate ADM oversight into incident response plans.

These steps reduce cyber and compliance risks. In short, transparency drives trust. When the public and stakeholders can see how decisions are made, confidence grows.

Getting started

Ready to get started? At Protectera, we guide organisations in building transparency and trust around technology. To find out how we can help with automated decision-making risk assessments and compliance services, talk to our Cyber Risk Advisors on 02 7227 5428 or book a consultation. Don’t forget to follow us on LinkedIn.