In this episode of Digital Nexus, Chris and Mark are joined by Dr Tim Rayner — AI Philosopher, author of Hacker Culture and the New Rules of Innovation, and educator at UTS Business School — to unpack how AI tools can move from "smart autocomplete" to genuine teammates that help people think better, learn faster and build braver.
If you’re a founder, product lead, educator or operator trying to make sense of AI beyond the hype cycle, this one goes deep into the human side of intelligent systems: judgment, values, learning, and what it means to flourish in an automated economy.
In this episode you’ll learn:
Why universities are being forced to rethink the traditional degree
How one business school is restructuring its graduate programs around AI, project-based learning and "build-first" mindsets, and what that signals for employers and students.
From tools to teammates
What separates “AI as a fast calculator” from “AI as a collaborator”, and how to frame agent workflows so they support, rather than replace, human judgment.
Hacker culture and innovation inside big organisations
Tim’s take on the “hidden hackers” in every company, and how leaders can give them space, scaffolding and safety to run meaningful experiments instead of gimmicky pilots.
Cognitive offloading vs. cognitive laziness
When it's smart to lean on AI for the heavy lifting — and where you need to keep humans in the loop so your team doesn't let its own thinking atrophy.
Philosophy as a practical AI skill
How ideas from Socrates and modern ethics show up in real product decisions: from incentives and power, to who benefits, who’s left out, and how you decide what “good” looks like.
Future-of-work reality check
The kinds of jobs and skills Tim expects to grow as AI spreads — and why curiosity, critical thinking and the ability to work with systems matter more than any single tool.
Designing AI into your workflows without breaking trust
Concrete examples of where AI is already creating value in research, analysis and decision support, and the guardrails you need so teams feel supported, not monitored.