With the recent rise of GitHub's Copilot and ChatGPT, you can see that the technology for code generation is ‘hot’. (Side note: which is better? Find out in ZEN Software’s Ilya Grishkov’s excellent article here). AI assistance for programming can help junior engineers ‘be junior for less long’, while seniors are helped by not getting bogged down in all the details of programming.
But: could these new tools end the field of programming as we know it? Some people think writing a program will become a thing of the past.
A new field
Matt Welsh is the CEO and co-founder of Fixie.ai, an AI-powered software delivery company. He opines in January's Communications of the ACM,
...and indeed, for all but very specialized applications, most software, as we know it, will be replaced by AI systems that are trained rather than programmed.
I think the debate right now is primarily around the extent to which these AI models are going to revolutionize the field.
Fixie.ai seems to be a fresh startup, as their site does not yet appear to show even the start of a product.
Matt Welsh concludes in a video interview:
It's more a question of degree rather than whether it's going to happen....
Welsh can be forgiven a slight bias, since he co-owns a startup in this technology. However, we may be at the peak of the famous Hype Cycle regarding A.I.-powered programming: the ‘Peak of Inflated Expectations’, as we have seen countless times before.
Since today's A.I. models are still based on ‘monkey see, monkey do’ principles, backed by a VAST amount of monkey data, mundane software will most likely be easier and better generated than ‘programmed from scratch’. Writing all code from scratch has been a thing of the past anyway since StackOverflow came around: programmers from junior to senior levels have been frantically copy-pasting from StackOverflow.com ever since.
(Image: O RLY funny mock covers)
The limits of current AI assistance show in generated code for complex software implementations. A.I.-assisted debugging, in particular, is not possible at this moment.
One study shows that code written with AI assistants is more likely to contain security vulnerabilities than code written by participants without access to them. The researchers also observed a false sense of security in participants who had access to the AI assistants: they believed their AI-generated code was reasonably secure. AI-generated code, however, is more likely to contain security flaws, particularly around SQL injection and string handling.
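To make the SQL-injection class of flaw concrete, here is a minimal, self-contained sketch (using Python's standard sqlite3 module and made-up table and input data) of the difference between the naive string-concatenated query that AI assistants tend to produce and the safe parameterized version:

```python
import sqlite3

# Throwaway in-memory database, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "alice' OR '1'='1"  # attacker-controlled string

# Vulnerable: concatenation lets the input rewrite the query itself,
# turning it into: ... WHERE name = 'alice' OR '1'='1'
unsafe = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a parameterized query treats the input as plain data.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()

print(len(unsafe))  # 1 — the injection matched every row in the table
print(len(safe))    # 0 — nobody is literally named "alice' OR '1'='1"
```

Both queries look almost identical at a glance, which is exactly why a developer lulled into trusting generated code can ship the first version without noticing.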
What I think will happen is that what we now call programming will partly move towards ‘prompt writing’ to get the best results from these AI-based models. In my humble opinion, that is also programming, just less in traditional computer languages and more in natural language. Also: I’m sure start-up founders and others with skin in the game would disagree with me and aim for the complete displacement of all computer code. That displacement simply will not happen.
Use the new tools!
So, learn from these fresh and new innovations, but don’t be alarmed by this silly sales nonsense that a robot will take your programming job: you’ll have extremely good new tools at your disposal, and your job as a software engineer will be more important than ever! 🤖
Update: our jobs are safe 😉