When will the AI bubble burst?

A bubble getting bigger

AI has become a whole new industry in recent years. It's impressive how fast it has grown since ChatGPT first came out. Nowadays, everyone is talking about AI, and the technology is implemented in every single system, device, and piece of software. It has become a marketing argument used to sell products (just look at the advertisements for new phones).

But I also feel like the AI situation has gotten a bit out of hand. And I'm not just talking about software development (a more concrete example of that later). For context, I consider myself a junior software developer on my way to reaching mid level. I prefer to clarify this, so that more experienced software engineers can give me their own opinion on what I'm about to describe. I just graduated from college, and because of my young age, I witnessed the evolution of writing code from copy/pasting random stuff from Stack Overflow to modern AI-powered code editors like Cursor or Windsurf. It went so fast! I still remember when GitHub released GitHub Copilot; it was mind-blowing to me.

But have we gone too far? Let's be clear: this article is not about demonizing AI or the people who use it. Like every new thing, it has its pros and cons, and we need to be nuanced. I'm writing this to share my view on the current situation and to hear feedback and opinions from you, the readers.

A study by MIT

Recently, MIT's NANDA (a research program focused on studying the impact of AI) published a new report revealing that:

95% of generative AI pilot projects fail to deliver measurable return on investment, despite $30–$40 billion in enterprise spending.

This number might seem huge, considering that using AI to code is now treated as the norm.

A lot of people will say that everyone in the tech world was already complaining about the overuse of AI tools, but that is exactly what studies are for: confirming or refuting the intuitions people already have.

The prompt culture

The tech field, and especially the software engineering domain, has always been a fast-growing industry; just look at how many JavaScript frameworks are released every day. But with AI, I feel like we are putting a layer of abstraction on top of so many others. Being a front-end developer can be hard enough with all the concepts and paradigms you have to swallow nowadays: SPAs, SSR, reactive programming, state management, etc. With so many things, you absolutely need to go step by step and understand what you are doing. But with AI, you just need to prompt:

fix the circular dependency injection

and you are good to go. Or are you?

I know many people wish AI could replace developers so things move faster. Because that's what matters, right? Quantity over quality. But it's just not the case. If you don't know what "vibe coding" is, it's a new way of writing code by not writing code: you just prompt an AI agent that does the job for you. I can't count how many stories I've read where someone was vibe coding and accidentally deleted the production database because the AI didn't respect the boundaries established by the developer. It's funny until you realize that the real problem is not the AI but the developer himself. How could he let Claude access the production database? Reading this kind of story is what gave me the idea for this article.

To be honest, I'm actually worried about the new generation of developers. But at the same time, if vibe coding forces people to learn how to code because they can't debug their applications, I'm happy.

For a while, I used AI to code (Copilot, Cursor, etc.) because I saw everyone using it, and they were so fast at producing code when I was so slow. I always had to check the documentation for everything, while they no longer cared about syntax because they were not writing code anymore. They were just prompting. I regret falling into this trap, because here is the truth: at the end of the day, there is no substitute for learning. And that's also the point of being a good software engineer: we strive to learn and to become good learners. After realizing this, I changed my whole approach to coding and learning. I switched my code editor to Neovim, learned Linux and installed Arch on my system, and went back to fundamentals like learning C. The first language I learned was JavaScript, and I know the web world revolves around it. But I think that at some point, all programmers need to step outside their comfort zone and learn something different: a new programming language? A new text editor (nvim, btw)? A new technology?

Be addicted to learning new things, and don't let AI guide you blindly.

A different way of learning

Now, I know I just painted a picture of LLMs and vibe coding that is not very flattering. But I also think you can use them to serve your learning journey. In the end, it all depends on how YOU intend to use the tools provided. A good example is when I decided to learn memory management in C: I asked ChatGPT to prepare exercises for me so I could learn the concepts in a specific way. I also think that generative AI tools can be a great substitute for search engines in some cases. You get the information much faster, and it is condensed for you. If I don't understand something, I can just ask:

What is the difference between a runtime and a compile time error ?

And most of the time, the answer is really well explained, with good practical examples.

Again, I'm not saying you should never use an LLM to assist you in your coding sessions. Sometimes it does an incredible job on long, repetitive tasks that are easy to understand and debug. It can also handle small tasks well if the model has access to the context of your app, like:

Create a new GET endpoint named /posts

or

Refactor this block of code

What I don't like, and don't recommend, is asking the model to create large sections of your app without reviewing the generated code. You can code this way, and it might work for small projects. But for an enterprise-level codebase, it just won't. I've seen developers try this methodology, and they spent more time prompting the AI to correct what it had previously done and to fix bugs than they would have spent coding the app themselves.

For me, a developer's most powerful asset is the capacity to learn new things, in programming but also in every other domain and field. I just feel like we are using AI the wrong way these days. We should put learning back at the center of our journey.

The job market situation

I also wanted to talk about AI in fields other than software engineering. Because, as I said before, it is used absolutely everywhere, and sometimes with no real need. The use of AI is a trend, and I think our society has to find a balance in how it uses this new technology. A good example of this drift is the current job market situation.

To clarify: I'm a young French software engineer, specialized in web development, who has been searching for a job for several months now. I have applied to over 100 jobs and made it into only one hiring process, which was unsuccessful. You may argue that my resume or cover letter was not good enough, and that can always be questioned, but it's a whole other topic. Here, I want to talk about ATSs and how AI has been implemented into them.

ATS stands for "Applicant Tracking System". It's software used by recruiters and HR teams to manage job applications. It collects applications (from job boards, LinkedIn, company career pages…), stores candidate data, and filters CVs automatically.

Now, ATSs are not new; they have been around for a while. But with the advent of AI, they now use it in their filtering systems.

But how does it work?

The ATS ranks your application based on the score of your CV according to certain rules. Most ATS systems use keyword matching. For example: if the job description requires "Python, SQL, REST APIs" and your CV doesn't mention these words explicitly, your score goes down.

This system can already cause problems. The ATS doesn't understand things the way a human being would. It just scans a document for keywords and gives you a score. If you describe a skill or experience with the wrong words, your resume has no chance of ending up in the hands of the HR team.

And I haven't even talked about AI yet. Nowadays, some advanced systems use AI with semantic search to "better understand context". The ATS has become a barrier you have to cross before you can show your skills and experience to a human. That's why I don't understand smaller companies that use these steroid-boosted systems to select candidates. I think recruiting is a deeply human job that AI currently fails to do. I can't imagine the number of interesting profiles that have been dismissed because they didn't have the "right keywords", or because the system didn't like them in some arbitrary way.

The job has changed

Over the past few years, every company has been looking for the 10x software engineer who can code an operating system with their eyes closed. The market has changed drastically and become a funnel where the supply of developers has far outgrown the demand. The developer job is not the job of the future anymore. In fact, I see very few job offers for junior positions, and the only ones I do see are "junior" only because the word appears in the title. Companies don't want to train young developers anymore. They want a developer who can do everything, from web development to systems architecture and cybersecurity. But you can't be an expert in everything; with the current ecosystem, it is simply not possible.

As for the use of AI, I hope we will all take a step back and approach things in a more human way.