
Open question: if Excel had shipped with a disclaimer saying that it sometimes makes mistakes, would it have been as successful as it was?
I mean the original 1985 version, not the current “AI-enhanced” version, which can indeed make a mess.
Excel was a very straightforward application: it reduced human error and increased speed on accounting and statistical work by several orders of magnitude. AI, on the other hand, only does the latter, and seemingly at the expense of the former.
And companies that have tried to go all in on AI have learned this the hard way: for some of them, the increase in errors, and the revenue lost because of them, does not make up for the savings in human costs. So AI is definitely impressive, but probably not as useful as tech CEOs want us to believe.
Along the same lines, generative AI seems to make CEOs like Elon Musk want developers to generate many lines of code, to the point of trying to measure developer performance by the number of lines they produce. Back in IBM’s day, IBM also measured developer performance by lines of code (LOC), and it was Bill Gates himself who said it was a flawed metric of developer performance. It is funny how times change.
I would argue many good developers would agree that good code is not the code with the most lines or the fewest lines, but the code that strikes a balance between efficiency and maintainability. Code can be very long or very optimized and still be hard to understand, maintain, and refactor; either way, it will generally be judged as bad code.
AI may be able to generate good code, but so far it seems unable to do so entirely on its own, and I’m afraid that for the sake of speed people are letting that go unchecked. Maybe AI is so good that code maintainable by a human is no longer required. In that case, let’s hope the AI can keep maintaining it for us; if not, we will have a very hard time dealing with spaghetti code, the same way we did in the 80s with BASIC and the GOTO statement. The day AI can’t understand its own spaghetti code, the 10x speed gain will disappear into a 0.1x crawl of a big human refactoring effort.
Maybe I’m wrong, time will tell.