24/03/2026
Many senior leaders still treat AI and automation as an IT or operational efficiency matter. Under NSW's Digital Work Systems Act, it is now a named workplace health and safety duty, owned at the leadership level.
When algorithms allocate tasks, monitor performance, or manage workloads, the associated psychosocial and physical risks fall within your WHS obligations. If those systems are not assessed, controlled, and reviewed within your existing risk framework, you are already exposed.
The common failure is treating digital work systems as a technology deployment rather than a governance obligation. Without formal risk assessment, documented controls, and management review, AI systems create unmanaged liability, regardless of how well they perform commercially.
Organisations that embed AI risk into their ISO 45001 risk registers and WHS governance frameworks now will be audit-ready, regulator-ready, and defensible when scrutiny arrives.
At Anitech, we work alongside industrial and operational leaders to translate emerging regulatory obligations into structured, audit-ready management systems.
If your organisation is navigating this transition, we are happy to have a conversation. Visit: https://hubs.li/Q047SzJz0