Will AI Replace Programmers? No! But Something Else Will

Andrei Neagoie

Unless you’ve been living under a rock for the last year, you’ve no doubt heard about ChatGPT, OpenAI and the AI hype that's taken over the tech world.


ChatGPT is by no means the only AI tool, but it alone reached one million users in 5 days and saw approximately 1.8 billion site visits in March alone.

On top of that, there have been hundreds of articles talking about how it’s either the end or the future of programming, work, and life in general.

Which has naturally led many programmers (especially those still learning or early in their careers) to ask things like:

  • Are these AI tools going to replace me?
  • Am I wasting my time learning how to code?
  • Are companies still going to be hiring programmers?

I get it. These are fair questions.

Here's my spoiler + teaser all in one: AI is not going to replace programmers. But something else will 😱.

Don't worry, just keep reading. I'll explain what I mean, let you know what you can do to keep your job as a programmer, and show you why coding is still an amazing skill to learn and career to pursue.


Where have you been, Andrei? Why write this post now?

It's so easy to get caught up in the hype and mania.

But now that it's calmed down and the dust has settled a bit (even though it's really just the beginning), I felt like now was a good time to share my thoughts on it.

Don't just trust my opinion though.

Check out this survey of over 3,000 programmers about their experience with AI tools so far, and how these tools affect their work.

The survey asked programmers of all ages, experience levels, and company sizes questions like:

  • Will AI replace programmers?
  • Do you need to learn to use these AI tools?
  • Will they help or hinder you?
  • What are other programmers using them for? Are they just question-and-answer tools or can they do anything more helpful? Can they write code?

But in this post, I’m going to share some of my personal thoughts and answer some common questions we’re seeing in our community of over 400,000 developers and other tech workers.

Oh, and how could I forget: I'll also tell you how to make sure you don't get replaced.

So let’s dive in…

What are LLMs?

Let's start by making sure we're on the same page with a few high level questions.

Large Language Models (otherwise known as LLMs) are a type of AI that attempts to mimic human intelligence. They do this by analyzing large volumes of data (hence the name) and building statistical models that find the connections and patterns within that data.

If you think about it, it’s kind of similar to how a person would learn. We have experiences (datasets) and then form insights around them.
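If you want to make that "statistical patterns" idea concrete, here's a tiny toy sketch in Python. This is not how GPT models actually work internally (they use neural networks trained on billions of documents), but it shows the core next-word-prediction idea in miniature: count which word tends to follow which, then "autocomplete" by always picking the most likely next word.

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in some sample text.
# Real LLMs learn far richer patterns, but the core idea is the same:
# predict the most likely next token given what came before.
sample_text = "the cat sat on the mat and the cat slept on the mat"
words = sample_text.split()

next_word_counts = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    next_word_counts[current][nxt] += 1

def autocomplete(word, steps=4):
    """Greedily pick the most common next word, a few times in a row."""
    output = [word]
    for _ in range(steps):
        if word not in next_word_counts:
            break
        word = next_word_counts[word].most_common(1)[0][0]
        output.append(word)
    return " ".join(output)

print(autocomplete("the"))  # e.g. "the cat sat on the"
```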

Some examples of LLMs are GPT-3.5 and GPT-4, which are the LLMs that underlie ChatGPT. There are even open-source ones like Llama (Large Language Model Meta AI), which is Meta's answer to ChatGPT.

Speaking of which...

What is ChatGPT?

ChatGPT (Chat Generative Pre-trained Transformer) is simply a chatbot created by OpenAI that uses its own LLM. (This is the GPT part of the name, and there are currently two versions: GPT-3.5 is the free version of ChatGPT, and GPT-4 is what's used for the paid version.)

Adding a chat interface may not seem like it would make much of a difference, but it has a huge impact on both usability and trustworthiness.

How?

Well, rather than having to write code to use the LLM, you can ask questions just like you would with a person. This dramatically lowers the bar to using it, and thanks to the seemingly human-like responses, it helps people trust the results.
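For context, here's roughly what "using the LLM from code" looks like, as a minimal sketch with OpenAI's Python client. The model name and message contents here are placeholder assumptions; check OpenAI's current documentation for the exact models and client version available to you.

```python
# pip install openai  -- sketch assumes the 1.x Python client
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Send a single question to a chat model and print the reply.
response = client.chat.completions.create(
    model="gpt-4",  # placeholder; any available chat model would do
    messages=[
        {"role": "system", "content": "You are a helpful programming assistant."},
        {"role": "user", "content": "Explain what a Python list comprehension is."},
    ],
)
print(response.choices[0].message.content)
```

With ChatGPT, you skip all of that and simply type your question into the chat box, which is exactly why so many non-programmers were able to start using it right away.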

In fact, this conversational quality of ChatGPT is why a few people have the following question…

Are LLMs a true AI?

Well that depends on what you mean by "true AI".

LLMs are definitely a form of artificial intelligence, but they're not "true AI". Only AGI (Artificial General Intelligence), which is when an autonomous system can surpass human capabilities in the majority of economically valuable tasks, is "true AI".

Instead, at the moment, these tools are basically a very smart-looking auto-complete.


However, LLMs present data in a way that seems almost human, which makes us think they're thinking and reasoning just like we do.

This level of mimicry is why a lot of people are excited, and why they also trust the responses without always doing their due diligence.

PSA: Don't blindly trust the results from ChatGPT or other AI tools (at least not yet). You'll regret it later and potentially cause yourself more work and headaches than the original problem you wanted the AI to solve for you.

(Image: headline about a lawyer who used ChatGPT for legal research, and it backfired)

In the example above, ChatGPT found multiple previous precedents for the lawyer to base their case on, which, in theory, is a great use of the tool.

The problem, of course, is that none of the precedents it gave him were actually real. It made them up to fit the question that was asked, and the lawyer never checked them before going to trial...

Don’t get me wrong, LLMs like ChatGPT are still a huge advancement towards more sophisticated AI, but it's not Skynet just yet. There’s no sentience, just pattern recognition and prediction.

And remember that this pattern recognition comes from data provided by the internet which, let's be honest, isn't always accurate, or even sane sometimes.

AI is only ever as good as the data it's given

All that being said, what we’re seeing so far has the potential to dramatically change how we interact with computer systems, as well as our daily work.

Is ChatGPT (and other LLMs) just a fad?

Hmm... yes and no. When ChatGPT first hit the news, we were seeing huge volumes of people using this tool, across multiple industries.

That being said, the initial hype from the general public and major media has started to drop off a little now as their interest moves on to the next new thing.

(Chart: ChatGPT's current estimated user traffic)

I think the issue here is that people have been trying it out, but not really learning how to use it effectively.

In fact, when we asked programmers what was holding them back from using these tools, it was almost always because they didn't understand how to use them effectively (and these are tech-savvy people).

36.6% of programmers who have never used AI tools before are hesitant due to the learning curve

Like any new tool or skill, it takes some practice to be able to use it well.

But people want instant results.

So they try it a couple of times and feel like it's just slowing them down compared to doing the task the way they've always done it. And so for most people, these AI tools have been more of a cool toy than a useful tool.

Until people can get past that initial learning curve, attention will probably continue to die down in the short term.

But it's definitely not going anywhere, and usage will continue to rise over the long term as people become more proficient at using these tools.

For example

In that same survey of over 3,000 programmers, we saw that the people who took the time to actually learn how to use these tools in their job were leveraging AI tools to help with 30-80% of their workload!

52.9% of programmers are using AI tools to help with at least 30% of their work

That means anywhere from 1-4 days of their work week is being automated (or made easier) in some way by these tools, which is absolutely mind-blowing.

Tl;dr

  • For the general public, it was fun and the interest is dying down a little (but will pick back up once there are more and more real-world use cases and purpose-specific tools)
  • For the people who have learned how to use it, they know it's a tool they'll HAVE to learn to use

Eventually, people will learn how to use these tools in more and more ways, and their popularity will continue to rise.

Can ChatGPT write code?

Kind of. Because it has access to huge amounts of open-source coding information, it can, in theory, 'write' code.

However, it's often simply repeating something it has found (like a search engine would), or attempting to write something based on its understanding of what you're asking it to do and similar code it has seen before.


This can lead to the LLM "hallucinating", meaning it makes up an answer (and, even worse, does so confidently).

That means the code may look legit, and ChatGPT might say it's legit, but it's not.

In fact, according to this report, more than 50% of the time, the code that ChatGPT gives you is incorrect.


That's not to say it's useless and can't write code at all. It definitely can, and in many cases it does a really great job. But it requires you to actually understand the underlying fundamentals of coding, and to review the code it produces to assess whether it will actually work.

But this is pretty much the same thing a programmer would do if they were looking for code snippets on StackOverflow or other sites (which also have incorrect information and code).

The major difference is that if you know programming fundamentals and also know how to write prompts to get what you're looking for, ChatGPT and these other tools can provide an output way faster and way easier than you'd currently be able to achieve by using Google, StackOverflow, etc.
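To make that "trust but verify" point concrete, here's a small, made-up example. Suppose ChatGPT hands you a leap-year function that looks plausible but only checks divisibility by 4; a handful of quick test cases (the kind a programmer who knows the fundamentals would write) catches the bad logic before it ships. Both the buggy function and the tests below are hypothetical, for illustration only.

```python
# Hypothetical function as an AI tool might return it: looks plausible, but wrong.
def is_leap_year(year: int) -> bool:
    return year % 4 == 0  # misses the century rule (1900 is NOT a leap year)

# A few quick sanity checks before trusting the generated code:
test_cases = {2024: True, 2023: False, 2000: True, 1900: False}

for year, expected in test_cases.items():
    actual = is_leap_year(year)
    status = "OK  " if actual == expected else "FAIL"
    print(f"{status} is_leap_year({year}) -> {actual}, expected {expected}")

# The 1900 case fails, which tells you to fix the logic before using it:
def is_leap_year_fixed(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```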

Check out this post by Alex Hyett for some specific examples of how you can properly use ChatGPT for coding today.

Given ChatGPT is already so capable of writing code, the next thought people have is “OMG, won’t this replace me?!” or “Oh man, I guess I shouldn’t become a developer now because they’re all getting replaced anyways”.

Well, let’s get into that…

Will AI replace programmers?

No. At least, not until AGI (artificial general intelligence) is figured out and we have a true, self-learning AI, but we're not there yet. We may even be decades away from that happening.

For now, the tool is impressive (and it feels very impressive) but it’s really only as good as the person using it. Like I just said above, you need to be able to ask the right questions (prompts) to get the desired output. Phrase the question slightly wrong and you’ll get an output that meets your question but doesn’t meet your end goal.

Heck, even if you ask the question correctly, it can still give you the wrong answer. So no, AI won’t take your job on its own, but that doesn’t mean you shouldn’t be planning ahead or figuring out how to take advantage of this tool.

I’d argue that if embraced and used properly, these AI tools will actually open up new opportunities for programmers and other tech workers, across multiple areas.

In fact, an entirely new category of jobs has opened up thanks to these tools (more on this in a second).

This is often the case when innovation enters an industry.


Just look at the last 20 years and the changes we've seen:

  • Uber created 1.2 million jobs and opened up the taxi-driving career to a much wider audience, but these people still needed to know how to drive
  • Photoshop and digital photography seemed like the end of photographers, but instead it opened up far more roles in actual editing and post-production
  • Social media wasn't a thing until Myspace. Then Facebook burst onto the scene in 2004 (along with others), creating whole new industries and jobs that never existed before, from software to cloud to marketing and more

It's easy to think of all the possible bad things, but the reality is, this will open up so many more opportunities.

My good friend and fellow ZTM instructor Daniel Bourke put it so well in his post a few days ago:

(Image: Daniel Bourke's post about opportunity and adaptability)

Change will always happen. It's how we choose to react to it that matters.

So there ya have it, AI tools themselves are not going to take your job. But as I alluded to at the beginning of the post, I do think something else might…

The real threat to your job: 10x programmers

Again, it’s not AI that’s going to take your job. It’s the programmers who are 10x more effective because they’ve learned to use these AI tools that are the real threat.

But the good news is that you can easily become one of them 😎.

I highly recommend you read our report on AI tools and programming, but here are some key points that you need to know:

  • 84.4% of the programmers surveyed had some level of experience with AI tools
  • 80.5% of programmers are using these tools as a search engine for new topics, while 58.5% are actually using them to help write code. Others are using them across almost all areas of their work: debugging, documentation, testing, and more
  • 46.4% of Front-End Developers are using AI tools to help with 30% (or more) of their job. Some are even using it to help with 80-90% of their work. (It’s not just Front-End devs that are using this either)
  • 51.4% of programmers who have never used an AI tool plan to start using them in the next 6 months
  • 27.9% of the team leads surveyed were encouraging team members to learn to use AI tools for pair programming
  • 10.7% of programmers who applied for a job in the last 12 months said the job listing included experience with ChatGPT (or another AI tool) as a requirement
  • 78.6% of programmers with 10+ years of experience think that AI tools will become a standard requirement in the future
  • 82.3% of team leads also think these AI tools will become a standard requirement in the future

AI pair programming is going to be absolutely huge

People are getting more done, making their jobs easier, companies are starting to require these tools, and team leads expect them to be a standard and essential tool in the coming years.

You can’t ignore this. These tools are a big deal, and you need to learn them if you want to get ahead or even get hired in the future.

Heck, two of the major coding resources, GitHub and StackOverflow, are both building their own AI-powered pair programming tools because they see the value in this.


Companies will always need smart coders, because they're not going to trust the AI to write everything by itself with no one to check it or put it all together.

They need someone who can "manage" these AI tools. Smart programmers that can detect when an AI tool is wrong, and course correct. Not the ones that blindly follow and trust whatever it tells them to.


Combining a smart coder with someone who knows how to use these tools creates an unbeatable combo that will have employers salivating to hire them. Salivating sounded weird there, but you know what I mean 😛

Because who do you think they're going to hire if given the option? Today's standard or the 10x version?...

Tl;dr

Whoever learns and embraces these tools will 100% pull ahead of their peers.

We feel so strongly about how much these tools can help you that we're creating an array of courses (such as Prompt Engineering, Pair Programming, and others) that will dive deep into exactly how to use AI tools for coding and tech.

This way, you can learn the fundamentals of a language, framework, or role so you know and understand what you’re doing, but at the same time, you’ll be using these new tools to help you apply what you learn, at a much more accelerated rate.

Get started by learning how ChatGPT and Large Language Models (LLMs) actually work.


Not only that, but you'll also be able to pair program without the awkwardness and potential negativity of another human. (All of the ZTM community are awesome, of course, but not everyone is fortunate enough to be paired with such cool people!)

AI pair programming has the potential to help you learn to code and work faster than ever, while also getting a leg up on your competition. Just as long as you learn to actually code and not blindly rely on the tool!

Tl;dr

  • Current AI tools can be extremely helpful when you understand how to use them properly (and don't rely on them blindly)
  • They can save you time, but they are not perfect (yet). You still need to know how to code yourself to check that they're giving you the correct answers and to build on what they give you
  • People with zero coding experience are absolutely not going to take your job using these tools
  • The people who DO have experience and learn to use them effectively are the real threat

What is Prompt Engineering?

Remember how I said that these tools are only as good as the questions you ask, and also that new innovations often bring new industries?

Well, that’s where Prompt Engineering comes in!

A Prompt Engineer is a person who specializes in developing, refining, and optimizing the prompts given to ChatGPT (or other LLMs), to ensure the outputs are as accurate, efficient, and useful as possible (i.e., they ask the best questions to get the right outputs).


It sounds simple but it's a little more complex.

The reason they can get such good output is that these engineers have a deeper understanding of the technology underlying LLMs. They can then use this to further improve the quality and accuracy of ChatGPT's responses, in ways that the average user might never think to implement. Heck, in ways that they may never stumble upon organically.

(Image: adding context to prompts for generative AI)
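As a small, purely hypothetical illustration of what "adding context" looks like in practice, compare a vague prompt to a refined one that spells out the role, constraints, and output format. The prompts and the helper function below are made up for this example; they aren't taken from any real prompt engineer's workflow.

```python
# Hypothetical example: the same request, with and without context.
vague_prompt = "Write a function to validate an email."

refined_prompt = """
You are a senior Python developer.
Write a function is_valid_email(address: str) -> bool that:
- uses only the standard library (the re module is fine),
- rejects addresses without exactly one '@' or without a dot in the domain,
- includes a short docstring and three example calls in a comment.
Return only the code, with no extra explanation.
""".strip()

def build_messages(prompt: str) -> list[dict]:
    """Package a prompt in the chat-message format most LLM APIs expect."""
    return [
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": prompt},
    ]

# The refined prompt leaves the model far less room to guess (and hallucinate).
print(build_messages(refined_prompt)[1]["content"])
```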

This is why there’s been a boom in companies hiring Prompt Engineers to collaborate with different teams to improve their prompt generation process and overall AI system performance, or even train up specific models that are unique to them.

For example

BloombergGPT is an LLM built on the same kind of GPT-style technology, but it has been specifically trained on Bloomberg's own financial data.


This niched-down approach and focused dataset mean that it actually outperforms general-purpose models like ChatGPT on finance-related tasks.

This is THE skill to have, regardless of your role in tech right now.

Why you should learn Prompt Engineering

3 reasons:

  1. Opportunities
  2. Demand
  3. Salary

At the time of writing this, there are already 7,059 Prompt Engineering jobs available in the US alone, with an average salary of $126,000 per year.


Now 7,000 jobs may not seem like much right now, but this is an industry that's less than 2 years old and will definitely grow over time (ChatGPT was only released to the public in November 2022).

Add in the fact that being a first mover in any field can have huge advantages.

Like what? Well, you get to start on the same playing field as everyone else, work your way to a Senior role in very little time, and earn some serious money.

For example

Take a look at this job available right now on ZipRecruiter.

(Image: Prompt Engineer job listing on ZipRecruiter)

$250-$375k a year, and all they want to see in terms of Prompt Engineering experience is some project work!

Bear in mind that this isn't even a Senior role! The salary will likely take another huge jump once you have more practice in the field, and you can get there in just a few years.

To put that into perspective, to become a Senior at Amazon you would probably need 10-20 years of experience or more. But because this is such a new industry, you can hit the ground running with some personal projects, so why not get into it now?

Finally, if the money and opportunities are not a good enough reason to learn to become a Prompt Engineer, then how about the fact that by learning this, you could just as easily apply that knowledge to your current career and make your life so much easier!?

The future is here and it's not only exciting, it also pays well 😀.

Conclusion

Hopefully by now you can see that AI isn't going to take your job, but the people who learn to use these tools properly might, so make sure you're not left behind.

Sure, it’s still early days but these tools are going to have an absolutely huge impact moving forward, and you need to learn them if you want to get ahead, improve your workflow, or even apply for roles in the future. Those who embrace change, almost always come out of it better off.

And seriously… who wouldn’t want to automate 30% of their job if they could!?

P.S.

Keep an eye out for our Prompt Engineering and Pair Programming courses coming soon. Sign up below and we'll let you know when they're live.
