
In the race to automate everything – from customer service to code – AI is being marketed as a silver bullet. The story is seductive: AI tools that can write entire applications, streamline engineering teams, and replace expensive human developers, along with workers in hundreds of other jobs.
But from my perspective as a technologist who spends every day inside the data and workflows of real companies, that story does not match reality.
I have worked with industry leaders like General Electric, The Walt Disney Company, and Harvard Medical School to adopt data and AI infrastructure, and here is what I have learned: most jobs still have a place for humans working alongside AI.
I worry that we are getting ahead of ourselves. Over the last two years, more than a quarter of programming jobs have disappeared. Mark Zuckerberg announced that he plans to replace many of Meta's mid-level coders with AI.
Yet, interestingly, both Bill Gates and Sam Altman have publicly warned against replacing coders.
Right now, we should not rely on AI tools to successfully replace jobs in tech or business. That is because what an AI knows is inherently limited to what it has seen – and much of what it has seen in the world of technology is boilerplate.
Generative AI models are trained on large datasets that usually fall into two main categories: publicly available data (from the open internet) and proprietary or licensed data (created in-house by an organization, or purchased from a third party).
Simple tasks, such as creating a basic website or configuring a templated app, are easy wins for general-purpose models. But when it comes to writing the sophisticated, proprietary infrastructure code that powers companies such as Google or Stripe, there is a problem: that code does not exist in public repositories. It is locked inside corporate walls, inaccessible to training data, and often written by engineers with decades of experience.
And AI still cannot reason on its own. It has no instinct. It is just copying patterns. A friend of mine in the tech world once described large language models (LLMs) as "really good guessers."
Think of AI today as a junior team member – a first pass at drafts or simple projects. But like any junior, it requires oversight. In programming, for example, while I have seen 5x improvements on simple coding tasks, I have found that reviewing and correcting more complex AI-generated code often takes more time and energy than writing the code myself.
You still need senior professionals with deep experience to find defects, and to understand the nuanced ways those defects could become risks six months from now.
This is not to say that AI has no place in the workplace. But the dream of replacing entire teams of programmers or accountants or marketers with a single human and a suite of AI tools is premature. We still need senior-level people in these jobs, and we need to train people in junior-level jobs to be technically capable of taking on more complex roles one day.
In tech and business, the goal of AI should not be to remove humans from the loop. I am not saying this because I am afraid AI will take my job. I am saying it because I have seen how dangerous it can be to trust AI too much at this stage.
Business leaders, no matter their industry, should know: while AI promises cost savings and smaller teams, those efficiency gains can backfire. You can rely on AI to handle more junior-level work, but not to complete more sophisticated projects.
AI is fast. Humans are smart. There is a big difference. The sooner we shift the conversation from replacing humans to strengthening them, the sooner we will reap the real benefits of AI.
Derek Chang is a founding partner at Stratus Data.

