Is software getting worse?
I recently stumbled upon “Software disenchantment,” a post by Nikita Prokopov. It called to mind Maciej Cegłowski’s post “The Website Obesity Crisis” and several others in the same vein. Among people who write about software development, there’s a growing consensus that our apps are getting larger, slower, and more broken, in an age when hardware should enable us to write apps that are faster, smaller, and more robust than ever. DOOM, which came out in 1993, can run on a pregnancy test and a hundred other unexpected devices; meanwhile, chat apps in 2022 use half a gigabyte of RAM (or more) while running in the background and sometimes lock up completely, even on high-end hardware.
The aforementioned posts on this subject come across as about 80% fair and reasonable criticism, 20% out-of-touch grumbling.
Most developers know better than to say things like “it’s a smartphone OS, how hard can it be?” or “my spreadsheet app in the 90s was 10 kilobytes, how come Factorio is a full gigabyte?” If you weren’t there when it was built, you can’t reliably estimate all the hard knocks and complexity that went into it.
But that doesn’t mean there’s no room for objective criticism. Apps are slower than they used to be. And many times larger, without a corresponding increase in value. At the very least, there are optimization opportunities in almost any modern app. We could make them faster, probably by orders of magnitude. We could remove code. We could write tiny, purpose-built libraries. We could find new ways to compress assets.
Why don’t we?
Prokopov’s answer is “software engineers aren’t taking pride in their work.” There’s some truth to that. But I strongly believe it’s the natural human state to work hard and make excellent things, and we only fail to do so when something repeatedly stops us. So instead of relying on the myth of laziness to explain slow and buggy software, we should be asking “what widespread forces and incentives are creating an environment where it’s hard for software engineers to do their best work?”
I have a few answers to that.
Speed is a feature, reliability is nothing
Software is envisioned by engineers as networks of interacting components, inputs, and outputs. This model is both accurate and useful. However, it’s not the way software is packaged, marketed, or sold. To businesspeople and customers, software is a list of features.
Take an inventory management app as an example. Its marketing materials will consist of several high-res stock photos, a bold color palette, and statements like the following:
- Tracks inventory across multiple warehouses
- Integrates with Delivery Pro, Supply Chain Plus, and Super Point-of-Sale systems
- Weekly and monthly reporting at multiple levels
- Fine-grained access and security controls
- Instant updates across all terminals
- Runs on Windows, macOS, and Linux
These are falsifiable statements; either the software does these things or it does not. They can all be proven in a one-hour product demo. And only one deals with speed. The software may in fact be very slow, taking several seconds to respond to a button click, without making the “instant updates” claim a lie.
We can all agree that speed affects a user’s entire experience of an app. It’s an important marker of quality. But it’s difficult to sell. If you spend your time optimizing a core process while your competitor develops a new type of report, you’ll lose eight of your next ten sales over it. If you poll your existing customers about what you should work on next, they’re going to ask for features, not speed—unless the software is so slow it borders on unusable. And god forbid any red-blooded board of directors would allow the company to take a six-month detour from its product roadmap to work on technical debt. The pressure is always on us to build features, features, features.
Programmers want to write fast apps. But the market doesn’t care.
You may notice reliability isn’t on the list at all. How exactly would you say that? “Bug-free?” There’s no way to ensure that, let alone prove it in a product demo. “90% unit test coverage and a full suite of integration tests?” Nobody knows what that means and if you explained it to them, they’d be bored. There’s no way to express reliability in a way customers will both believe and care about. The Agile age has taught them that bugs will inevitably exist and you’ll fix them on an ongoing basis. And since there’s no comprehensive way to measure defects in software (surely if we knew about them, we would have already fixed them?) it’s not a feature that can be compared between products. We can invest time to test, refactor, and improve, but it’s entirely possible no one will notice.
Programmers want to write bug-free apps. But the market doesn’t care.
Disk usage isn’t on the list either, though occasionally it appears in small, low-contrast print below a “Download” button. And of everything here, this one is perhaps least connected with competitiveness or quality in customers’ minds. When was the last time you blamed a developer (as opposed to yourself or your computer) when you ran out of disk space? Or chose between two video games based on download size? Probably never. You can find people who complain about the size of the latest Call of Duty, but the sequels still make a billion dollars the week they come out.
Shrinking an executable or output bundle is thankless work. And it’s often highly technical work, requiring an understanding of not just the app one is building but the hundreds of lower-level libraries it depends on. Furthermore, it’s actively discouraged (“don’t reinvent the wheel”), partially because it’s a minefield. You may not know what a line of code is for, but that doesn’t mean it’s useless. Maybe it’s the difference between a working app and a broken one for the 0.01% of your customers that use Ubuntu on a smartphone. Maybe it’s the one thing keeping the app from crashing every four years on Leap Day. Even the smallest utility function eventually develops into an artifact of non-obvious institutional knowledge. It’s just not worth messing with.
Some programmers want to write smaller apps. But the benefits aren’t there for the market or for us.
Consumer software is undervalued
It’s not hard to distribute an app. That’s more or less what the Internet is for. But selling an app is like pulling teeth. The same general public who will pay $15 for a sandwich or a movie ticket—and then shrug and move on if they didn’t like it—are overcome by existential doubt if an app they’re interested in costs one (1) dollar. There are only two demographics that are willing to pay for good software: corporations and video gamers. We’ve somehow blundered our way into a world where everyone else expects software to be free.
This expectation has been devastating to the quality of consumer apps. Building an app costs anywhere from $50,000 to half a million dollars. If you can’t get people to pay on the way in, you have to recoup costs some other way. And herein are the biggest causes of bloat and slowness in both web and native applications: user tracking, ads, marketing funnels, affiliate sales, subscription paywalls, counter-counter-measures for all the above, and a hundred even-less-reputable revenue streams. These things are frequently attributed to greed, but more often they’re a result of desperation. Some of the most popular websites on the Internet are just barely scraping by.
It’s hard to overstate the waste and inefficiency of a system like this. You publish a unique, high-quality app for what you believe to be a fair price. It sits at zero downloads, day after day. You rebuild it on a free trial/subscription model. It gets a few hundred downloads but only a handful of users convert to a paid plan, not nearly enough to cover your costs. You put ads in the free version, even though it breaks your UI designer’s heart. You find out that ad views pay out in fractions of a cent. You put in more ads. Users (who, bafflingly, are still using the app for free) complain that there are too many ads. You swap some ads for in-app purchases. Users complain about those, too. You add call-to-action modals to encourage users to pay for the ad-free experience. You find out most of them would sooner delete the app. You add analytics and telemetry so you can figure out how to increase retention. You discover that “retention” and “addiction” might as well be synonyms. The cycle goes on, and before long you no longer have an app; you have a joyless revenue machine that exploits your users’ attention and privacy at every turn. And you’re still not making very much money.
We could avoid all of this if people were willing to pay for apps. But they’re not. So apps are huge and slow and broken instead.
Developers don’t realize the power they have
Lest I be accused of blaming everyone but myself, let’s examine the role of software developers. There has to be something we can do better.
Even in a recession, developers have an extraordinary amount of leverage. We can insist on working with (or not working with) specific technologies. We can hold out for high salaries, benefits, and equity. We can change the culture and work environment of an entire company by exercising even the slightest amount of solidarity. Good programmers are hard to come by. Everyone knows it, and we know they know it.
That’s our power, and we can do more with it.
We should set aside time in every sprint to resolve technical debt. We should postpone feature work now and then when there’s an especially promising opportunity to optimize and improve our code. We should persuade our employers to sponsor open-source projects. We should create the expectation that we won’t always be working on the product roadmap; our code and our industry expect more of us.
Most of the time there won’t be any negative consequences. We’re not asking too much. Every other industry has professional standards and requirements that transcend any one job description. Why do we so often act like software development doesn’t?
The only caveat is that the incentives aren’t in our favor. It’s an uphill battle. Some managers won’t be comfortable with us spending time on things they don’t understand. Some salespeople will worry that our software isn’t competitive. Investors may threaten to outsource our work to more pliable developers. It will be a while before customer attitudes and market forces shift. But if changing the state of modern software is a worthy goal, then it’s worth the effort.
Will it get better?
It’s hard to be optimistic about the future of software. Programmers were allowed to build tiny, highly optimized apps in the 90s because there was no other choice. Their customers had 32 megabytes of RAM and a 200 megahertz single-core processor. If an app wasn’t as lean as possible, it wouldn’t run at all. Today, a two-year-old base-model MacBook Air has 250 times as much memory (not to mention faster memory) and a quad-core processor with several times the speed on any one core. You can get away with a lot more now. And we do. We ship apps that are 90% dead weight. We don’t optimize until someone complains. We package a full web browser installation with apps for sending messages, taking notes, even writing our own code (I’m using one right now).
The last two decades have been dedicated to making software development faster, easier, and more foolproof. And admittedly, we’re creating apps faster than ever, with more features than ever, using less experienced developers than ever. It’s not hard to see the appeal from a business perspective. But we’re paying the price—and so are our customers, the power grid, and the planet.
Things won’t change overnight, probably not even in the next five years. But there are reasons to be hopeful.
The latest wave of web programming languages and technologies (like WebAssembly, esbuild, SWC, Bun, and Yew) is enabling new levels of speed and reliability, both at compile time and runtime. Rust, noted for delivering the performance of C and the developer-friendliness of higher-level languages, is gaining popularity on web servers. Lightweight Electron alternatives like Tauri are poised to take over as the web developer’s cross-platform framework of choice. Tree-shaking is something we’ve come to expect from compilers and bundlers.
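To make the tree-shaking idea concrete, here’s a minimal sketch (the file names, functions, and values are invented for illustration): because ES module imports are statically analyzable, a bundler such as esbuild or Rollup can see that only one of the two exports is ever used and drop the other from the output entirely.

```typescript
// utils.ts -- a small module with two independent exports
export function formatPrice(cents: number): string {
  return `$${(cents / 100).toFixed(2)}`;
}

export function formatDate(d: Date): string {
  return d.toISOString().slice(0, 10);
}

// app.ts -- only formatPrice is imported; a tree-shaking bundler
// can statically prove formatDate is unused and leave it out of
// the final bundle.
import { formatPrice } from "./utils";

console.log(formatPrice(1999)); // "$19.99"
```

Nothing here is specific to any one tool; the point is that once the module graph is statically known, dead code can be removed automatically instead of shipping to every user.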
In terms of the market, several popular video games (like Dead Cells and The Binding of Isaac) have made their way to mobile platforms as paid downloads. There’s still a lot of work to be done, but this is promising headway towards reeducating smartphone users, the world’s largest group of technology consumers, about the cost of software.
If the last 20 years have been about making us more productive—sacrificing efficiency and financial sustainability in the process—perhaps the next 20 will be about tackling our collective technical debt, reclaiming efficiency, and improving economic exchange without losing the productivity that’s made software omnipresent in our lives.
Tags: software development
62 Comments
Great article! It’s nice to know other devs care about reliability and performance. The everyday apps I use on my computer and smart TV are bug-ridden and slow; something has to change.
Surprised that the article is silent on functional programming.
I’d add poor design skills to the mix:
– designing (and coding?) software/applications with little/no practical experience in the languages/tools to be used
– performance as an afterthought (big assumption here that those associated with the project have an idea of what ‘good’ and ‘bad’ performance looks like, let alone how to measure it)
– scaling as an afterthought (my SQL query is blazingly fast at 1.3 seconds against a development database that’s 3MB in size!)
– UI developers who limit their focus to their personal phone interface while forgetting about other interfaces (tablets, desktop monitors, etc)
– how about that order processing system with a backend (relational) database that consists of 1,800 tables (don’t get me started on the ERP database with 75K tables and 30K views … not an exaggeration)
Prioritize design (aka architecture) and you’ll be $30k short after hiring a Software Architect.
Prioritize performance and you’ll sacrifice time-to-market or plainly burn money before getting any traction and learning what really matters for paying customers.
Prioritize scalability and you’ll just shift the complexity to the infrastructure and burn money on learning that microservices are hype and should be avoided until they’re the only option forward.
On that last point you are totally right. Corporate overlords are choosing old and bloated software that we are asked to integrate (e.g. SAP). It’s just a sign to leave the company.
Long gone are the days when you buy software once. The $1 app is more like $1 each month for the rest of your life.
For the last 20 years I have watched software get worse and worse, driven by three things IMHO… 1. “just do it” project managers driving functionality checklisting; 2. self-taught developers who don’t know the basics of design; and 3. companies that won’t invest in training these same developers (even in-house). If I had my way, we would have to be members of a professional body that enforces skill levels and standards for managers and devs… just like lawyers, accountants and all the other proper professions have to observe. It won’t happen because too much money is made selling good-looking but unreliable software to customers who don’t know how to tell the difference.
Doom was an awesome game in 1993. Minimum requirements were an Intel i386 (32-bit, 12-40 MHz) and 4MB of RAM. A desktop with those specs cost about the same as a gaming desktop now but, adjusted for inflation, that is two to three times more expensive. However, the modern PC is orders of magnitude more powerful. In 1993 the Internet was a university or niche thing, and dialup; mobile phones just made calls and were for the city elite. There was no WiFi, there were no smartphones. Software engineers read books; there was no Stack Overflow.
While Doom has a nostalgic interest, it is dated, it is lo-fi, it is free. By today’s standards Doom is simple. Somebody might pay for Doom but nobody should. I don’t want to go back to Doom.
Today, resources are comparatively cheap, we have the Cloud, and we have a multi-billion dollar, content-driven games industry. We are reaching, or have reached, the point where software is no longer limited by the hardware on which it runs; it is limited by the complexity we can imagine.
The cost of complexity is nonlinear. Try to keep each layer of abstraction as simple and efficient as possible, but it doesn’t need to run on an i386 in 4MB of RAM. Over-optimization increases complexity.
So a messaging app is clunky and slow. Get some experienced engineers to make you a better one. The Penny Farthing is history; let’s make the electric autonomous vehicle of the future.
Doom did, however, feature multiplayer over dialup, LAN or RS232, which from a programmer’s perspective is the very same thing to code as multiplayer over the Internet. Harder even, since RS232 doesn’t use sockets but has to be coded separately. Multiplayer first-person shooters weren’t really a thing yet; most bought the game for single-player mode. They just coded these parts as a bonus.
And btw Minecraft runs on the Quake engine. Nobody would pay for Minecraft, right…?
I remember setting up a multiplayer game with an RS232 crossover cable between two Commodore Amigas with D15 sockets. That was the easy bit; getting two CRT screens in the same room was the hard bit.
It was fun then, but if I had the time now, I’d prefer Modern Warfare: Warzone 2. I can play for free, there is more content, it is richer and so much more detailed.
It would have blown my teenage mind.
FWIW, I do vaguely recall that there used to be inferior games that did have bugs and it was much harder to get an update/fix.
What are you on about? Neither version of Minecraft (Java or Bedrock) runs on the Quake engine!
Minecraft is made in Java and has nothing to do with the Quake engine.
Yes, we don’t have limits now, but how do we use all those resources? If you take shooter games as an example, where is all the real progress? Where are physics, destructible environments, really smart enemies? Nothing new but the graphics since the 90s. And the same with websites and apps. The same thing in a nicer wrapper, which is delivered faster at the expense of quality. Modern Windows or macOS are less responsive than Windows 98, which ran on hardware 100 times slower.
Great article! I develop Android apps on the side; they are small, but I find the APK size so huge compared to apps that came out when Android first appeared. Maybe the SDKs required for backwards compatibility, telemetry and ads are too big. I wonder if these libraries are shared under the hood, which makes sense to me, but could be a security issue, among other things.
Here is another interesting blog article from Poul-Henning Kamp (in Danish).
https://www.version2.dk/holdning/garbage-garbage-out
In short, it states that there are 165 million lines of code in Firefox – and in the comments he states that the entire code of FreeBSD is 17 million lines.
This really should put the code “quality” of Firefox into perspective… 😬
One reason is multiple people working on the same thing and a lack of ownership. Applications have become large, no one sees the full picture, and everyone just adds their feature to it because that is the job they got. Looking further than the feature to be implemented today is pointless: whatever you do outside your scope will be pissed away by someone else soon anyway, is not rewarding in any way, and isn’t your responsibility to begin with. So the product just gets increasingly expensive to maintain until there is no movement anymore and it is replaced by something else.
It is true that feature bloat makes software slower, but it simply isn’t true that the customers are asking for more features. Rather, they get them shoved down their throats. Those who request tons of pointless features are mainly the marketing departments in every huge software company. Or rather, these companies need some lame excuse to sell you the very same product over and over. “Now with fluff!” Okay, but can I have it without fluff? “Nu-uh, the fluff is good for you. We will stop shipping the fluff-free version.”
There was some famous comparison between an ancient Macintosh and a modern PC with Windows, where they clocked how long from power-up it would take to launch Excel and have it ready to go. The old Mac won, even though the PC has some 1000 times faster hardware. Feature bloat.
Looking at Excel as a prime example, there has not been a lot of functionality added to it between 1992 and 2022 that anyone asked for. It can nowadays run VBA scripts and the “look & feel” is more pleasant, and that about summarizes the only relevant parts of 30 years of product development. The rest is just useless bloat that nobody asked for and that very few use. Looking at the Office version from around the year 2000 that my parents still use, from an average user’s perspective it can literally do everything that Office 365 can do. And it can work without an Internet connection, so it’s actually a far better product than Office 365.
> The old Mac won, even though the PC has some 1000 times faster hardware. Feature bloat.
Since people are getting real work done in Excel (surprisingly), Microsoft learned to preload stuff for them and relies on the Superfetch service to hide the consequences. Moreover, all the people that use Excel are getting old, so they don’t really feel the difference between 2s and 5s. And once you’re at 5s the brain switches context, so an additional 10s is not harmful.
On top of that – try to sell an Excel knock-off based on “it starts in 1s”. Nobody will care 🙂 Bloat is like smog – some people suffer, but most of us have just learned to live with poor air quality.
“Myth of laziness” is a mischaracterisation; people really don’t take pride anymore. Rather than relying on knowing the language, for example, it’s “the tool tells me what’s wrong and I fix it.” In other words, if it compiles and passes a few tests, it works. OK, I’m being slightly glib. But the standard of expertise I worked to in the early days (admittedly safety-critical and high-reliability, in the 90s) was much higher, held by people who were expert in the tools they were using, including the language; it was embarrassing if you clicked compile and it failed.
That doesn’t mean people are now lazier. They build more, faster, by not caring (overmuch) about quality, speed or size. When you need a machine from the last five years to cope with a current website, you know something is horribly wrong. I even had one website warn that my OS was too old (credit to it for still seemingly working, however). Basically the small number for whom quality matters are swamped by the get-shit-done brigade, all trying to maximise profit.
Just like Italy and the slow food movement, I’d rather see good programs appear at a sensible pace. But I’m in a minority that suffers horribly, working fast to keep pace with the “good enough” brigade. Funny thing is, my slower rate sees me producing stuff that lasts longer with fewer bugs, so once rework is deducted I’m no slower in the long run; it just looks like it on a weekly basis.
Programmers *need* to be lazier. They need to figure out how to deliver the minimal set of features with the least amount of complexity and resources. I continually run across code that is much too complex, uses far too many dependent libraries, and avoids tried-and-true methodologies because the programmer wanted to use the latest-and-greatest complex thing. I think I agree with what you are saying, but I would say that we need to embrace words like “lazy” and “efficient” and “small” and “simple”, otherwise we’re not solving the problem.
In other words, everything Wirth wrote about in [A Plea for Lean Software][1] still applies almost 30 years later.
[1]: https://www.computer.org/csdl/magazine/co/1995/02/r2064/13rRUwInv7E
With the advent of AI code generators, the quality of software will experience a big degradation, as a lot of would-be coders will blindly rely on the generated code.
Size isn’t necessarily everything.
Something smaller, but with horrible configuration and a tangle of churning dependencies could be worse than a larger, self-contained thing that works out of the box.
As a Software Developer I think there is something this article gets wrong. Software Developers are not often the ones setting what feature sets and bug fixes they get to work on nor are they often in control of the deadlines. I, like many of my colleagues, take a lot of pride in my work, but the truth is that if you are given a deadline to meet, you have to produce something and it certainly won’t be as optimized and bug-free as any of us would like.
Second, software and feature sets are several orders of magnitude more complex than they ever were 30 years ago. With more “moving parts” the difficulty level of keeping everything running smoothly exponentially increases. This is why code testing (unit, regression, etc) is so incredibly important.
You can objectively say that apps are slower than they once were, but they are also significantly more feature rich. Faster hardware helps make up for some of that, but it’s a constant cat/mouse game. I don’t mean to say there is no truth to the article, but just that it’s missing these important components.
Why is software so slow? Because every time I catch a developer writing a SQL query inside a for loop (instead of using an IN clause once), the developer quotes Knuth to me:
“Premature optimization is the root of all evil”
Who am I to argue with Knuth? I walk away each time in shame. If we’re teaching our young developers this mentality in our schools, what else should we expect??
The irony is Knuth is responsible for collecting and creating some of the most beautifully optimal algorithms.
There is a sweet spot where simplicity and performance meet. That is where we should be aiming.
To combat the RBAR graffiti, the full quote is:
“Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered. We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil. Yet we should not pass up our opportunities in that critical 3%.”
You could explain that a set-based statement will make the code easier to debug and maintain, and will keep its performance out of the critical 3%.
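To make it concrete, here is a minimal sketch of the difference (the table, the data, and the use of the better-sqlite3 package are an invented example, not from the story above): the first version issues one query per id, the second does the same work with a single set-based statement.

```typescript
// Hypothetical example: per-row queries vs. one IN-clause query,
// using an in-memory SQLite database via the better-sqlite3 package.
import Database from "better-sqlite3";

const db = new Database(":memory:");
db.exec(`CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT);
         INSERT INTO items VALUES (1, 'widget'), (2, 'gear'), (3, 'bolt');`);

const wantedIds = [1, 3];

// Row-by-row: one round trip per id (the "query in a for loop" pattern).
const oneByOne = wantedIds.map(
  (id) => db.prepare("SELECT name FROM items WHERE id = ?").get(id)
);

// Set-based: a single query with an IN clause built from placeholders.
const placeholders = wantedIds.map(() => "?").join(", ");
const inOneQuery = db
  .prepare(`SELECT name FROM items WHERE id IN (${placeholders})`)
  .all(...wantedIds);

console.log(oneByOne, inOneQuery); // same rows, one query instead of N
```

With three ids the difference is invisible; with thousands of rows and a network round trip per query, it is the difference between milliseconds and minutes, and the set-based version is also the easier one to read.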
Good article, but it’s even more extreme: DOOM came out in 1993.
I think software is getting worse because (at least in the U.S.) only about one percent of the population knows how to code. If you think about it, it is astonishing that so few people have any idea how something that almost everyone uses all day, every day, works. If more people understood software, expectations (and eventually requirements, quality, etc.) would all be different. Building software is hard. In many countries, you need a hard skill to make decent money. Not so much in the USA, I’m afraid.
My biggest problem today with software development is how obsessed we are with coding, as opposed to visual programming. Literally only developers use the terminal these days, whereas years ago to operate a computer you needed to use the terminal. HOPEFULLY in the future, you will be able to program visually. This will greatly increase the ability of ordinary users to be able to program. Programming should not be a skill for elite men, programming is a skill of designing what you would like a computer to do and executing the idea.
“Programming should not be a skill for elite men”
this is news to me. Are you implying we need to make coding easier for women?
Tools and languages can make programming simpler for everybody, but it’s the leaky abstractions that make those tools less useful.
https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-abstractions/
I acknowledge that there is an under-representation of women in programming and STEM in general, but many of the women I have worked with were elite. I’d suggest that the terminal or command line is not the sexist blocker you think it is, and the real cause is more likely a cultural bias that affects women, men and non-binary people alike. As your feminine-reading name suggests, you may have a better insight than me.
There is nothing elite about the people writing software today. Which is exactly why the overall quality is so poor. And as much as I’m a fan of the low-code/no-code movement, it probably won’t be beneficial to overall software quality.
I’m really glad that I was able to start my career in the 90s, when new concepts were introduced gradually each year. Nowadays, upcoming developers have to deal with all these layers and abstractions of the past decades from day one – and it’s no surprise that no one understands what actually goes on in the system and shortcuts are being taken just to produce any kind of result.
Trying to put another, visual layer on top of that will in all likelihood just make things worse.
There are decades of faulty abstractions and technical debt to fix. Some bugs even start in the CPU and its speculative execution feature or the baseband GSM module of a mobile phone…
Partly, it’s the mindset of devs; they don’t see the need for documentation, etc.
To be fair, part of it is project management and business objectives. Business always wants something out yesterday and doesn’t care if it’s reliable or even optimised.
I always felt that devs, PMs (product or project), governance, business, etc. ought to do software support for at least 2-3 years before they’re allowed to move to other roles. Then they would understand the need for optimization and reliability.
You guys have ignored the main culprit here. That’s AGILE. I don’t know who invented it, and I don’t care either. But that’s what ruined the design and coding part.
I understand the pain of Waterfall. A product takes 1 year to roll out instead of 1 month. Agile is the need of the hour because of the cut-throat competition. Whoever brings their new idea to the market first and captures the most audience, wins!
But if you need speed, you obviously compromise on something, and that, in turn, is design and quality. And no matter who in the world tries to convince me it’s not that, they’re part of the manufacturing line and it is the root cause of everything.
No developer has time to just sit in isolation for 2 hours and think about nothing but the design.
Agile says a user story must have end-to-end customer value. Now where will someone find time to get the database structure correct first, then do the API design? Performance has been pushed to the end, as rolling out the feature is more important now; performance we will fix later in “upcoming iterations.” The fundamentals we start with are wrong in the first place. Agile is for product managers and not for developers!
If you ask me, why can’t we just create something in between waterfall and agile? Why not enforce a rule of releasing something only when it ticks all the boxes of “Design”, “Development”, “Testing” and “Performance”? Is it that difficult to write a textbook on WaterGile? Why do agile coaches stick to the textbook and ignore the real world?
Agile is not the problem. The Agile manifesto doesn’t call for “quick and dirty” solutions. Also, like any practice, there’s no reason to be super strict about all of its features. So, for your example, some teams will take a “system design” or “DB research” story for one sprint, with a clear DoD, and implement it the next sprint.
Couldn’t agree more! This feature-led KPI is rooted firmly in Agile. And I have also lived through some awful waterfall-led solutions. If only I had time to come up with a better methodology! Unfortunately I have features to deliver…
@Emily Reynolds
Exactly, your 4.5 lines are the reason for this whole article: “Is software getting worse?”
“how obsessed we are with coding, as opposed to visual programming” – jeez 🙁
One issue is the constant push for more complex capabilities. I recall when Lotus 1-2-3 came out on the PC in the 80s. Spreadsheets today have the same basic functionality, but the capabilities are far more complex. Time and complexity place stress on software reliability. If you try to rush to market too quickly, or you are pushing the state of the art on complexity, that will result in more defects.
Total agreement. There’s a great deal of apathy in engineering, a lack of caring about the craft. (Just give me my paycheck so I can go home.) I would add that I’m also seeing a lot of designs that abandon functionality in favor of style. So many websites out there that try to look great, but have the usefulness of a gnat. (Mostly just annoying, and don’t do what you want.) Good engineers will push back when designs are crap. Speak up, don’t suck up.
I am in firmware development for different chipsets such as Dialog, Qualcomm and STM. All the SDKs provided are quite a mess, in that order: a disaster, developed in a hurry, with no design behind them, just crop it and ship it.
I think it must be the managers who don’t let programmers work on making things work right and would rather see ‘design enhancements’ that literally no one asked for. Take the latest Android ‘upgrade’. Not only did they not fix a whole ton of buggy crap, but their new design made using it actually worse. Check out their volume control: it’s got that pill shape where you don’t know what position the thing is actually at. And then how about Bluetooth? It’s such crap, always having to turn Bluetooth on and off and reset the known devices, just to get music playing out of a little speaker, much less get a headset working. Texting? Seriously? This tech has been around for what, 20 years, and it still has issues with groups and stuff like that. At least Apple seems to have fixed this a little. The Android people tell you that they have to keep their apps running in the background 24/7 to make it ‘faster’ for when you pull up that app… well, virtually every time you pull up the app, it’s completely reset and you have to start over. This never works. Facebook changed the way the comments section displays, after literally no one asked them to, and it’s all small and harder to read. Meanwhile, FB search is completely useless; you can search by the literal name of something that you look at often and it can’t find it. Windows 10 reboots without my say-so, making me set up all kinds of windows and tabs again, some of which had running processes going.
“Coders” vs “Software Engineers”
Part of the problem is the general public doesn’t understand how much quicker apps/websites can be. And that’s understandable. You don’t know what you don’t know. If more companies focus on delivering a fast experience, eventually a critical mass will be reached, where users will demand fast speed, just as much as they’ll demand the newest bell and whistle. Educate the public via exposure to fast solutions and the rest will naturally take care of itself.
One big issue is the demise of the QA engineer. In the old days, you had engineers dedicated to good UX via “big picture testing”. With the advent of TDD, it was assumed that developers could write unit & integration tests that obviated the need for QA testing. Doh! Invalid assumption! Most devs will work on one little piece of a larger system & write tests for the components therein. And all the unit tests in the world will never test UX. Big picture testing is out the window.
And then on websites, you have massively bloated Javascript frameworks. Not much more to say there.
And yes, as @Harsh Sharma stated, Agile shares a lot of the blame. The very word “Sprint”, a fundamental Agile concept, connotes a rush job. What do you expect from a rush job?? What’s happened to Agile is that it has become more of a tool for micro-management.
Finally – a culture of FOMO has us lurching from one shiny “new” tech to another, where most are just re-invented wheels. Today it’s ChatGPT, tomorrow…? Nothing is new under the sun. You don’t need to be afraid… you ain’t missin’ much. I promise.
Yes, I would agree with every word in this paragraph.
In the 80s-90s we were creating highly optimized code and saving memory wherever possible.
That required a very solid background and experience.
I still do not understand what “optimizing code” means – I just cannot write 2+2 across 100 pages. 🙂
Lack of a solid background is a major reason for what is going on, plus dummy-created new classes and libraries.
My old programs created in Borland C/C++ can be deployed without any issues on any platform and perform a hundred times faster than newly created analogues.
I guess the main culprits here are cross-compatibility, UI evolution and development cost.
In the 90s and early 2000s most apps were paid apps, desktop-only if not Windows-only or Mac-only, and had limited networking features.
Developers had time to focus on one thing (their app), with quite simple UI and interaction.
But things changed a lot since then:
* Nowadays we have apps which are cross-platform and more often rely on freemium or subscription models. The market is just more diversified and complex than it used to be, and needs constant weekly/monthly updates.
* Development team sizes stayed mostly the same, if they didn’t shrink
* Customer expectations in terms of UI went up drastically
If so many devs replaced native apps with HTML/CSS, it’s mostly to solve those problems.
Developers didn’t become lazy; they were just asked to do more and more with fewer resources, so they had to adapt and find solutions.
And sadly the main cost of abstraction is performance, which is often considered low priority, until customers start to complain about it.
Which is why Electron became so popular and used across the industry: it’s a big tradeoff between performance and development time. Having good, dedicated native apps with their own codebases and maintenance costs is just less and less viable over time.
For example, comparing apps like Slack or Discord to Windows Messenger or IRC: for most people they are just “text messaging” apps, and they wonder why Discord can take 1GB of RAM when Messenger used 20~50MB. But they are decades apart in terms of features, reliability and UI.
Nowadays nobody would use Messenger; everyone kinda expects GPU acceleration on their 4K screen, multithreading, animated GIFs, SNS integration, livestreamed audio/video, bots, custom content, … Each of these features is complex (relying on tons of dependencies, codecs, drivers, …) and has to scale worldwide and work consistently on every device.
The overall complexity and size of apps just went through the roof in the past 15 years. Technically developers can still make light and fast messenger apps, but nobody wants to use them anymore because they lack the fancy features. IRC would still be popular if that were not the case.
Games, and Doom in particular, are also a good example: at the time (in the 90s) it was viable to build a new dedicated engine, low-level and performant.
But nowadays, graphics and physics expectations are so high, and development already so long, that it’s just more attractive to use a big framework like Unity or Unreal. You get a standardized codebase and cross-platform support out of the box, but games are getting huge and less optimized per device.
While reading through article and comments I seem to have found no mention of development tools. Which, in my view, are crucial in the process of software quality degradation. Programmers are not making software from scratch, they depend on a myriad of third party components and are limited by an underlying architecture. You can be a top class developer, but you simply can’t make a bug-free, fast and slim application when you are forced to rely upon buggy, sluggish and bloated components, submit to a limited, badly designed architecture and use a development environment that makes you spend more time guessing why it refuses to work, than actually coding. Not to mention lacking, confusing and/or misleading documentation. And, of course, the tendency of modern development environments to “assist” you so much, it almost moves your hand for you. Blaming programmers for bad software today is more or less like blaming drivers for accidents when cars are almost self-driving and come out of the factory broken.
The commercialist “profit first” ideology is affecting all levels of the industry, causing a snowball effect. Technologies are designed to produce results as fast as possible, with little or no responsibility for the future, and the drawbacks for consumers are actually benefits for the industry. Bugs and security holes force consumers to update their software, but fixes always come bundled with “enhancements” that make the software slower/larger and force consumers to upgrade their hardware. Hardware breaks compatibility with old software, and we get an endless loop, which seems like a brilliant financial idea, but only in the short term.
Bloating is similar to bureaucracy. It is profitable to force businesses to depend on massive overseas development teams, instead of hiring a couple of in-house developers. To make that happen, everything that could be done with 5 lines of code is blown up into a package of thousands of classes that do not actually contain any code at all – just like unneeded middlemen in the civil service, they suck up resources without giving anything back. Technologies are renamed, ideas are obscured and overcomplicated, to make developers dependent on frameworks. Each framework is bloated with components, and when applications don’t work together because their components conflict with each other, the application is packed into a virtual machine and connected to other similar virtual machines via network protocols. And it’s actually a “good thing”, because you can spread those virtual machines across a network of physical machines when you run out of hardware resources.
There’s no way out, until we run out of resources. Or consumers. Whichever comes first.
One of my hobbies is astrophotography, and the program I use for processing raw pictures fundamentally does the same as a famous bloatware suite, and the downloaded and installed size of the program? 4.8MB.
My only comment is I wish programmers were more lazy. I keep having to maintain or reinvent software that is built to be too complex, with too many features, and too many dependent libraries. I usually have to throw out features or bloated libraries to make the app maintainable. If other people are blaming programmers for being too lazy, they’re just compounding the problem. We need to reward the lazy programmer who builds the app in the simplest, most reliable, way, with the least amount of resources. We need to punish the programmers who keep adding dependent libraries and features, jumping on the latest bandwagon that won’t stand the test of time and lead to a maintenance and reliability nightmare.
Love this. I’m transitioning to software from a long career in mechanical engineering. I would submit that the same general concept of good, solid design intent (in a variety of forms and fashions across many disciplines) getting left behind for the sake of sales is prevalent everywhere. Just blame the marketeers and call it a day.
I concur. We are in for a rough ride.
There’s a reason why I am still using MS Office 2003. And I’d be using Office 97 if I could have found one for sale.
I occasionally want to produce sheet music. I have two apps for this, both from the same company. Printmusic has barely enough features for everything I want to do. And once I had to add an extra “line” of music and paste it into the final image using Paint. Finale is much more powerful and would do everything. I could set music for a 100-piece orchestra if I wanted to. But I’m setting songs for solo voice. So I use Printmusic and ignore Finale.
The same for lots of other things. I only update TBird, Firefox, and Chrome when absolutely necessary (e.g., to remove a vulnerability).
Agile has unwittingly created this short-sighted mentality in the world of software development. Devs just want to get the story done in the sprint and “worry about xyz later”. Like tunnel vision. This results in a messy blob of patchwork and bandaids, one upon another. No one wants to think further out and ahead and just wants to hack it to … you guessed it … get the story done. Both developers and their managers are feeding into this.
This was certainly not the case in the days of waterfall.
Misuse of agile (i.e. as a management tool and not a development tool) is the problem. Everyone needs to sit down with their project managers and explain: you can have a fixed release date and only release the stuff that’s ready, or you can have a fixed feature list and release it when it’s ready. And keep saying it.
I feel the same. I even passed through some depressive states a few years ago (pre-COVID era) and finally came across one simple idea (a very old one): “Ask for forgiveness, not for permission.” So I moved to another enterprise and used a new speech in the interview: “I have my own quality standards where …. (insert here your list of important things) are my personal trademark, so I can trade off some things a bit, but there is a hard limit that I’ll never cross.” With this statement I found a way to find the projects where I can be myself. So my advice is: stop asking for permission; just use your own criteria to do what you think must be done.
Agreed. I call it “going rogue.” My team knows that occasionally I spend a day or two working on something that 1) has obvious ROI in terms of software quality or process and 2) isn’t on the task board. And I will do so without warning.
I went rogue; you can fire me if you want. (No one ever does.)
In the tradition of The Hacker’s Dictionary, the dart I’d throw at the problem of deteriorating value and increasing bloat can be summarized in one word: “Suits.”
Plenty of programmers retain their pride of work and do the best they can to code in a beautiful way. And they succeed when left to their own devices.
Don’t look at us developers. We do what we can to produce good, usable software. But marketing and suits force all kinds of stuff on us; what are we to do? Disagree and resign? That only makes their life easier, because they’ll get some yes-person who will create these monstrosities. How many times have I been threatened with being replaced by a “no code” platform? Nothing I can do about it. It sucks.
Security is the key these days, not performance. I work for a large firm and all our software has to go through code checks, pen tests, checklists of stuff. Nothing at all about whether it’s fit for purpose, let alone good code. Same thing with our clients: pages of checklist security questions, never a mention of performance or footprint.
Another problem is all us experienced software engineers who are still programming – instead of complaining that project managers don’t understand us, we could become project managers.
If you ask me, I think one of the reasons software engineers aren’t taking pride in their product is the fact that there is a vast number of alternative stacks that could have been used to build the same product.
Sometimes it can be due to the fact that they didn’t follow a specific architecture, or that they were in a hurry to launch and didn’t pay much attention to architecting correctly.
Some engineers are proud of their products; these are the gurus who integrate one or more cutting-edge technologies in their applications.
Great article BTW!
Hi, Nikita here. Thanks for the mention, and congrats on a great article! But can you fix my last name, please? It’s Prokopov, not Propokov. Thanks!
Fixed! Sorry for the mistake.
You are talking from a very, very privileged place. After working for more than 20 years developing software, only in the last 3 have I been making “good money”, which means less than 2000 US dollars a month in my country (Brazil).
After 20 years of trying to do my best, I only began to make more money when I ignored the urge to do good work and simply pleased whoever is paying, singing the song they want to hear. I have never been happier: eating the food we want when we want, paying the bills without batting an eye, travelling whenever we have time.
Working with many people from first-world countries, knowing that an incredibly stupid person makes like 20x what I do just because that person was lucky to be born there (converting their currency to mine), while you are whipped to deliver more, only makes you value money over anything else.
I’ll gladly take your place, privileged folk from California, to work for half your salary (meaning 5 times my very good salary now in my country) and make whatever my client is demanding.