Philosophy

March 1, 2026


why ai literacy matters more than ai access

I wish I'd had ai when I was younger.

not because I lacked ambition or ideas. I had plenty of both. what I lacked was the ability to bridge the gap between what I wanted to build and what I could actually execute. I'd get stuck on something simple for hours because I couldn't articulate the question properly to search for it.

the information existed. the gap between my understanding and the available resources was just too wide to bridge alone.

that gap still exists today. it's just shifted.

the dangerous middle ground

I see the same pattern now when I work with businesses trying to implement ai.

a leader knows ai exists. they know it's supposed to help. but they can't articulate what they actually need from it. that knowledge gap is the real problem. it signals that the leader needs education on what ai can and can't do before any tool will help.

here's what nobody talks about: giving someone ai access without closing that literacy gap makes things worse.

nearly always.

I watched a company automate their client intake process because intake felt time-consuming. they built a whole flow with ai qualifying questions and routing. but the actual bottleneck was that they didn't have a clear service offering.

now instead of a human conversation that could adapt and clarify what the client actually needed, they had this rigid automated path that confused people.

clients would get halfway through and abandon it because the questions didn't make sense for their situation. the team then had to spend time following up with half-completed forms, trying to reconstruct what the person actually wanted.

they turned a flexible five-minute conversation into a broken system that required more human intervention than before. just later in the process.

the coordination cost was massive. now they had to manage the automation, interpret incomplete data, and still have the conversation anyway.

they automated because they could point at intake and say "this takes time" but they never asked if that time was actually the problem.

tool literacy versus system literacy

there's a difference between someone who understands tools and someone who understands the system with tools.

it's easy to just understand tools. go to zapier.com and you'll be able to automate things in your business. understanding the system that carries your business, and how the tools fit into it, is what makes the difference.

only then do you understand why and where things should be automated.

I've seen this play out repeatedly. someone learns enough about ai to automate but not enough to know they shouldn't. they're in the dangerous middle ground.

**62% of leaders recognise an ai literacy skills gap within their organisations.** yet only 25% have implemented organisation-wide ai training programmes.

that gap creates vulnerability.

an example I see constantly: people using chatgpt as just a search engine. it's capable of far more than that. explaining to someone that they can use it to create courses or education on topics changes everything for them.

now it's not just getting answers. it's understanding answers.

something an internet search alone could never do. you can have it simplify answers, explain them in different ways, offer anecdotes.

if you don't know ai is capable of this you're missing out on capability.

when you properly understand ai capabilities, it unlocks another part of your brain and you start discovering even more. this is the curse of the open-ended text box: people find it really hard to get the full benefit without guidance.

the audit nobody wants

part of my process is a business audit to understand how the business system works.

I won't work with someone who wants to skip this.

it's an important step that shows whether they respect the need for system literacy. someone who wants to skip it wants results instantly.

there are certainly ways to make that happen. but if I start working with a new company and I don't understand their systems then I could end up automating the wrong things.

it could look good on paper. but the end result wouldn't be more efficiency, and I'd be to blame.

any work that has my company name on it must be of the highest quality. I have no interest in building systems that will come crashing down in the future.

wanting instant results reveals they don't understand that automation quality depends on system understanding first.

this isn't new. people underestimate the adaptation that needs to happen.

ai runs on data, and so many businesses aren't even ready for stage one because their data is a mess. a lot of restructuring may be necessary before ai is ready to work in your business.

the same happened when we went from physical files to digital ones. there are still companies that haven't made that transition today and they're left in a totally outdated era.

ai tech is moving even quicker than that era did. companies that don't start adapting now will quickly get left behind.

the coming bifurcation

here's what I see happening in the next three years.

the ones who embraced ai and took the time to build it in will have an operational ai moat around their business. they'll have systems that stand out from the rest and humans will work seamlessly alongside computers to produce great results.

those who just deployed will keep redeploying, over and over.

they'll never be satisfied and constantly have to adapt because they didn't take the time to do it right and understand it.

**95% of corporate ai projects fail to demonstrate profit-and-loss impact.** not because the technology doesn't work. because most implementations fail from misalignment between mechanism and actual workflow physics.

companies attempt to force generative ai into existing processes with minimal adaptation. they automate before they optimise.

the divide won't be between those with ai access and those without. it'll be between those who understand when to deploy it and those who automate indiscriminately.

companies that embrace ai the right way will do things faster, cheaper and better soon enough.

what literacy actually means

literacy isn't a fixed state. it's an ongoing relationship.

it means understanding ai's capabilities and being able to accurately map them to your systems. when you understand what ai is good or bad at, you understand where you can plug it into your business.

you understand where your people do their best work and where ai could do just as good a job.

once you have this base understanding you can "grow with the ai". you can see new capabilities come along and understand what to do with them.

someone growing with ai thinks architecturally. one solution that makes sense across multiple areas. versus the tool accumulator who's just patching individual problems.

one is just bolting on solutions to random problems they see and the other is looking at the big picture when they problem solve.

often an ai solution or tool doesn't just solve one thing. structure it so it makes sense in other areas too, and you get the best out of it.

what i learned the hard way

in my earlier stages I would rush to try different things. new tools as they came out. it all changed when I started fitting the solution to the whole process.

I started understanding that bandaid fixes don't last.

you need to understand the whole process, find the bottlenecks, and put solutions where they're needed and make sense. not forced solutions.

this is what I wish someone had told me earlier. understand the process first.

the information was always there when I was younger trying to learn to code. the gap between my understanding and the available resources was just too wide. ai could have helped me break down what I was actually trying to do, translate my confused thinking into the right technical question.

but access alone wouldn't have been enough.

I needed literacy. I needed to understand not just what the tools could do but when to use them and when not to.

**40% of the global workforce will need to reskill in the next three years as ai adoption accelerates.** the question isn't whether you'll adopt ai. it's whether you'll understand it well enough to deploy it properly.

the winners won't be those who automate most. they'll be those who automate appropriately.

that requires literacy. not just access.

what does your literacy gap look like right now?

Connor Saunders © 2025

obsessed with efficiency
