AI tools have taken the world by storm, which has naturally led many programmers (especially those still learning or early in their careers) to ask things like:
And I get it. These are fair questions.
Here's my spoiler + teaser all in one: AI is NOT going to replace programmers. But something else will 😱.
Don't worry, just keep reading and I'll explain what I mean and let you know what you can do to not only keep your job as a programmer, but also show you why it's still an amazing skill to learn and career to pursue.
In this post, I’m going to share some of my personal thoughts and answer some common questions we’re seeing in our community of over 500,000 developers and other tech workers. Most importantly, I’ll show you how to stay ahead, and make sure you don’t get replaced.
So let’s dive in…
To understand why our jobs are secure, we need to understand how these AIs work. More specifically, we need to look at the models used by the three main market leaders:
It’s hard to write an article about these models because they get upgraded and improved all the time, mainly by increasing the amount of context (tokens) they can handle and the accuracy of their answers (since more context usually means better reasoning and fewer mistakes).
Here’s the current stats:
As you can see, those are some insane improvements since launch. However, these models are not as smart as they might seem, so let me explain.
Each of these AIs is a type of model called a Large Language Model (otherwise known as an LLM).

This is a type of AI that attempts to mimic human intelligence by analyzing large volumes of data (hence the name) and building statistical models of the connections and patterns within that data. If you think about it, it’s somewhat similar to how a person learns: we have experiences (datasets) and then form insights around them.

However, it’s important to understand that none of these LLMs is a true AI in the conventional sense. They're not sentient like something you might see in a sci-fi movie. They’re just very advanced pattern-recognition machines.
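To see what "pattern recognition and prediction" means at a toy scale, here's a deliberately tiny sketch. This is nothing like a real LLM (which uses neural networks with billions of parameters over subword tokens), but it shows the core idea: learn statistics from data, then predict the most likely next thing.

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in the
# training data, then predict the most frequent follower.
corpus = "the cat sat on the mat the cat ate the fish".split()

follower_counts = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    follower_counts[current_word][next_word] += 1

def predict_next(word):
    # Return the statistically most common follower seen in training.
    return follower_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" most often in our corpus
```

There's no understanding here, just counting, which is why the output is only ever as good (and as accurate) as the data it was trained on.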
The problem, of course, is that the interface and ‘human-like’ answers these machines provide make them seem much smarter than they actually are.

This is where a lot of the fears and issues around AI stem from. Because when it seems smart, it's easy to think it can take your job, or worse, that you can use it to do your job without double-checking its information.
In the example above, ChatGPT found multiple previous precedents for the lawyer to base their case on, which in theory, is a great use of the tool.
The problem of course is that none of the precedents that it gave him were actually real. It made them up to fit the question that was asked, and the lawyer never checked before going to trial...
Don’t get me wrong, LLMs like ChatGPT are still a huge advancement towards more sophisticated AI, but it's not Skynet just yet. There’s no sentience, just pattern recognition and prediction.
And remember that this pattern recognition comes from data provided by the internet which, let's be honest, isn't always accurate, or even sane sometimes.
All that being said, what we’re seeing so far has the potential to dramatically change how we interact with computer systems, as well as our daily work, which leads me right into the next common question.
Ha nope!
When I first wrote this guide, it was around 12 months after the initial launch (I was holding off opinion to see where it was going), and we were seeing the general public use it less, while other industries were leaning into it more.
This meant ChatGPT went from around 1.8 billion visits per month to around 1.5 billion.
I think a lot of this was people not being sure how to use it properly. In fact, when we interviewed our audience of tech professionals, that was one of the biggest factors holding them back from using it.
However, as more people have learned to use it, the daily usage has continued to skyrocket.
In fact, ChatGPT alone now gets an estimated 1.8 billion visits a WEEK, versus that many per month before. That's a total of around 5.1 billion visits a month!
So yeah... I think it's fair to say that it's here to stay...
Kind of. Thanks to their larger context windows, and because they've been trained on open-source code, they can in theory ‘write’ code.

However, they're often simply repeating something they've found (like a search engine would), or attempting to write something based on their interpretation of what you're asking for and similar data they've seen before.

An LLM doesn't have the depth of context that a programmer has, and this can lead it to "hallucinate", meaning it makes up an answer (and, even worse, presents it confidently).
That means the code may look legit, and the AI might say it's legit, but it's not.
In fact, according to this report, more than 50% of the time, the code that ChatGPT gives you is incorrect.
To be fair, it's better nowadays, and I'm not trying to say it's useless and can’t write code at all. In fact, it definitely can and in many cases, it can do a really great job of getting you around 70% of the way there.
But it requires you to actually understand the underlying fundamentals of coding, and to also look at the code and assess the output it has provided to see if it will work.
The best way to think of it is like searching for code snippets on StackOverflow or other sites (which can also contain incorrect information or broken code):
You can get ideas to help you get 70% of the way there, but you need to know how to code to actually use it properly.
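To make that concrete, here's a hypothetical example (my own illustration, not taken from any real ChatGPT output) of the kind of plausible-looking snippet an AI assistant might produce. It runs, it looks fine at a glance, and it contains a classic bug that only someone who understands the fundamentals would catch in review:

```python
# Plausible-looking AI-style suggestion (buggy):
def add_tag_buggy(tag, tags=[]):
    # Bug: the default list is created once and shared across calls,
    # so tags silently "leak" from one call into the next.
    tags.append(tag)
    return tags

# What a reviewer who knows the language would write instead:
def add_tag_fixed(tag, tags=None):
    # Fix: create a fresh list per call when none is supplied.
    if tags is None:
        tags = []
    tags.append(tag)
    return tags
```

Call `add_tag_buggy("a")` and then `add_tag_buggy("b")` and the second call returns `["a", "b"]`, not `["b"]`. The point isn't this specific bug; it's that you need enough coding knowledge to spot whatever the bug happens to be.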
This is why your job is safe from AI. But there are other issues…
No. At least, not until AGI (artificial general intelligence) is figured out and it becomes a true self-learning, actual AI tool, but we’re not there yet. We may even be decades away from that happening.
For now, the tool is impressive (and it feels very impressive) but it’s really only as good as the person using it. Like I just said above, you need to be able to ask the right questions (prompts) to get the desired output. Phrase the question slightly wrong and you’ll get an output that meets your question but doesn’t meet your end goal.
Heck, even if you ask the question correctly, it can still give you the wrong answer. So no, AI won’t take your job on its own, but that doesn’t mean you shouldn’t be planning ahead or figuring out how to take advantage of this tool.
I’d argue that if embraced and used properly, these AI tools will actually open up new opportunities for programmers and other tech workers, across multiple areas. In fact, there’s an entirely new industry of jobs opened up now thanks to these tools (more on this in a second).
This is often the case when innovation enters an industry.
Just look at the last 20 years and the changes we've seen:
It's easy to think of all the possible bad things, but the reality is, this will open up so many more opportunities.
My good friend and fellow ZTM instructor Daniel Bourke put it so well in his post:
Change will always happen. It's how we choose to react to it that matters.
So there ya have it, AI tools themselves are not going to take your job. But as I alluded to at the beginning of the post, I do think something else might…
It’s not AI that’s going to take your job. It’s the programmers who are 10x more effective because they’ve learned to use these AI tools that are the real threat.
People are getting more done, making their jobs easier, companies are starting to require these tools, and team leads expect them to be a standard and essential tool in the coming years.
You can’t ignore this. These tools are a big deal, and you need to learn them if you want to get ahead or even get hired in the future.
Companies will always need smart coders, because they're not going to trust AI to write everything with no one to check it or put it all together. Heck, some have tried that, to disastrous effect.

A smart coder who also knows how to use these tools is an unbeatable combo that will have employers salivating to hire them. Salivating sounded weird there, but you know what I mean 😛
Because who do you think they're going to hire if given the option? Today's standard or the 10x version?
In fact, we feel so strongly about how much these tools can help you alongside your current focus that we've created an array of courses (such as Prompt Engineering, Pair Programming, and others) that teach you exactly how to use AI tools for coding and tech.
This way, you can learn the fundamentals of a language, framework, or role so you know and understand what you’re doing, but at the same time, you’ll be using these new tools to help you apply what you learn, at a much more accelerated rate.
Speaking of which…
Remember how I said that these tools are only as good as the questions you ask, and also that new innovations often bring new industries?
Well, that’s where Prompt Engineering comes in!
A Prompt Engineer is a person who specializes in developing, refining, and optimizing the prompts that are given to ChatGPT (or other LLMs), to ensure the outputs are as accurate, efficient, and useful as possible (i.e. they ask the best questions to get the right outputs).
It sounds simple, but there's more to it than that.
The reason they can get such good output is because these engineers have a deeper understanding of the technology underlying LLMs. They can then use this to further improve the quality and accuracy of ChatGPT’s responses, in ways that the average user might never think to implement.
Heck, in ways that they may never stumble upon organically.
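As a rough sketch of the idea (the section labels and helper function below are my own convention, not any official API or template), compare a vague prompt with a structured one that spells out the role, constraints, and output format:

```python
# A vague prompt leaves the model to guess almost everything.
vague_prompt = "Write me some Python code for users."

def build_prompt(task, language, constraints, output_format):
    # Assemble a structured prompt with explicit role, task,
    # constraints, and output format. These labels are just one
    # illustrative convention among many.
    return (
        f"Role: You are a senior {language} developer.\n"
        f"Task: {task}\n"
        f"Constraints: {constraints}\n"
        f"Output format: {output_format}"
    )

engineered_prompt = build_prompt(
    task="Write a function that validates a user's email address.",
    language="Python",
    constraints="Standard library only; include type hints and a docstring.",
    output_format="A single code block, followed by a one-line summary.",
)
```

The engineered version gives the model far less room to guess wrong, which is the whole job in miniature: shaping the question so the output actually matches the goal.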
This is why there’s been a boom in companies hiring Prompt Engineers to collaborate with different teams to improve their prompt generation process and overall AI system performance, or even train up specific models that are unique to them.
For example:
BloombergGPT is an LLM built on the same GPT-style architecture, but trained specifically on Bloomberg's own financial data.

This niched-down approach and focused dataset mean that even though it uses the same underlying type of model, it actually outperforms general-purpose models like ChatGPT on finance-related tasks.
This is THE skill to have, regardless of your role in tech right now.
Hopefully by now you can see that AI isn’t going to take your job, but the people who learn to use these tools properly might, so make sure you’re not left behind.
Sure, even after a few years of these tools being live it’s still early days, but these tools are going to have an absolutely huge impact moving forward.
You can't sleep on these! You need to learn them if you want to get ahead, improve your workflow, or even apply for roles in the future. Those who embrace change, almost always come out of it better off.
And seriously… who wouldn’t want to automate 30% of their job if they could!?