It's rant time. I haven't written an article in a while, partly because I've been busy with work, school, and a self-identity crisis, and partly because I haven't had anything worth writing about. Until now.
You see, I've been cooped up at the same company for a long time now, but I try to keep an eye on the general movement of software and technology. From my little, limited peephole view of the software industry, I certainly can't see everything. In fact, I'm probably missing a ton of context, but I'm going to be self-righteous for a second and air my grievances to the internet. The title of this article is Software Malaise, which is quite apt. According to good ol' Merriam-Webster, malaise means a vague sense of mental or moral ill-being, which is exactly what I want to discuss. I can't pinpoint why I'm bothered, or whose fault it is, but I have a vague sense of ill-being about the current practice of software development.
I think, ultimately, it comes down to a people problem. People suck, in general. They tend to let all kinds of personal biases interfere with otherwise pristine logical analysis. Okay, so what the hell am I talking about? Here's the heart of the issue: software can be so much better than it currently is, both in terms of what it can do and the ease with which it can be developed. Currently, though, it sucks, and it sucks hard. It's insane how hard it is to develop good, quality software on a budget, within a reasonable amount of time, without wanting to pull your hair out. And that's not even getting into how much of a nightmare the codebase is to maintain over a 10-year period.
Some of the answers come down to momentum and individual developers' comfort levels. There are millions of people using these languages and technologies, and billions or trillions of lines of code that would either have to be rewritten or supported for backwards compatibility for the foreseeable future. It's like someone with a mess of papers strewn across their desk: they know exactly where everything is, so don't go changing anything, otherwise it'll be anarchy. Organized chaos. Another possible answer is that it's simply a hard problem to solve. Maybe it's too hard to find something that works or performs better. Maybe too many people have differing opinions, and we can't come to a consensus on what should be done or how it should be implemented. Sometimes, people or companies do try, but the community squashes the effort, or ignores it, and the energy of the project fizzles out.
I don't buy some of that crap. Most of what we have and use now was invented by someone to solve a problem. Adoption took time; it didn't happen instantly. Python wasn't always as popular as it is today, nor was Ruby, or Java for that matter. Frameworks like Django or Ruby on Rails didn't always exist. I think the problem is that not enough people are focusing on solving those kinds of problems anymore, either because they're too busy with other things or because there's no money in it. I'm not the only one who thinks this way or takes issue with the problems we face but choose to ignore. For some interesting videos highlighting different aspects of what I'm talking about, see: The Web Will Die When OOP Dies, Replacing the Unix tradition, and Object-Oriented Programming is Garbage.
Now, don't get me wrong. It is literally awe-inspiring what has been achieved with software in spite of all of these problems. But that only further proves my point: imagine what we could achieve if we didn't have to jump through hoop after hoop just to deploy a system that is stable and scalable. I'm also not saying that no one is working on this problem, or that there haven't been great strides and improvements over the years. It just seems to me that, on the whole, the development community is resistant to such changes. It saddens me that this is how things operate. The larger something is, the harder it is to move or change, and with software reaching as far and wide as it has, it's becoming increasingly difficult to innovate. I only hope that someday a new generation of developers will say they've had enough and lead the charge into a new era of computing. Perhaps advancements in AI will help bring about a future where software is easier to write, test, and debug, and we can tackle some truly amazing problems with the systems we design.