I made a big mistake in my last blog post. I claimed that software quality has been getting worse over the past decade or two without giving any facts to support that claim. And I'm very sorry for presenting a personal perception like that.
I still think that software quality is not getting better. And I want to give you the reasons why I think that is.
What is quality?
Quality is value to some person (who matters).
– Jerry Weinberg, with the addition by James Bach
As I would describe it, quality consists of a bunch of "ilities" (like Capability, Reliability, Usability, etc.). ISO 25010 counts about 30 of them; the famous poster from Rikard Edgren & Co. contains over 50. There are many factors that add to the perceived value of a product. Maybe my factors differ so much from yours that your opinion on the very same product is the complete opposite of mine.
Quality of people producing software
I came into professional IT in the year 2000, in the years of the dot-com boom. CS was the subject to take at university. People sat on the floor or had to stand outside of overcrowded classrooms. There was huge demand for IT people, and universities could not produce new talent fast enough. So companies tried to attract people from other fields into IT. I remember starting on my big test project in 2003 with over 70 people on the team. There were only two people with formal CS degrees, plus me with vocational training as a developer. The rest of the team came from various backgrounds. I don't want to say that those folks were bad, not at all. Some of the best testers I have worked with so far were among them, holding diplomas in philosophy, meteorology, and similar fields. But definitely not all of them were good or should have pursued a career in IT. But we needed the people, so they stayed.
In my current job I am responsible for hiring people for my team myself. My experience there is not vast, and maybe other factors influence my perception. But in the last three years I have seen about 150 CVs and interviewed more than 40 people for 2 job openings. And those ~150 were only the CVs HR let through. Our success rate in finding people with the right skills who also fit into the team was very low.
A question I tend to ask is how candidates got into software testing. The most common answer was something along the lines of, "friends told me there was an opportunity for me to get into IT." My colleagues searching for developers and business analysts report similarly time-consuming experiences with only slightly higher success rates.
When I speak with friends in the industry about their hiring experiences, they all share the same tenor: it is getting harder and harder to find good people.
Software companies and products popping up everywhere
Success stories like Facebook and Instagram, collecting billions of dollars for their ideas and their user bases; local success stories still fetching millions from bigger companies that want the product. These are the factors that inspire people to come up with their own ideas to make big bucks. Thanks to the internet, it's easier than ever to publish software and draw some attention to it. If your software goes viral (enough), you have made it.
There are lots of companies that try to emulate success stories and come up with a similar product. Once a product gains some fame, you can be sure there will be dozens of clones available in no time, trying to get a piece of the cake. Thanks to the laws of the market, most of those products quickly reach their end of life, while some keep dwelling in the shadows.
Frequency of updates
The internet and modern software development approaches like Agile and DevOps make it possible to update software quickly. There are companies using DevOps approaches bragging about daily or even more frequent updates. Time to market is one of the driving factors. Software companies no longer have months and years to come up with a product. If you want to earn money and stay ahead of your competitors, you have to move quickly. Agile and the approach of publishing MVPs (minimum viable products) have gained traction as a way to produce the right thing. And it works great with, for example, websites and other centrally hosted applications. But with that approach, as seen in the smartphone app market, the industry also produces thousands of apps that start with some small feature and are then either forgotten or grow from there. Often the "viable" aspect is not very pronounced.
Two decades ago I first heard the terms "banana projects" and software that "ripens at the customer". Back then that label was a dispraise. With MVP, that approach has become the weapon of choice. Don't waste time polishing a product; fail fast and learn. That approach has two sides: software companies save money and get a chance to produce something with potential; on the other hand, users waste time evaluating dozens of products to find a solution that fits. Win-win situation? I'm not so sure.
Big companies like Apple, Google, and Microsoft have problems with prestige projects affecting millions of customers. There are fixed dates to hold, promoted by marketing without listening to project stakeholders. There is a keynote; they need to ship on a given date. Thanks to late changes, we see problems in the initial versions over and over again, requiring patch releases soon after. Android has a different problem, with device fragmentation growing by the hour. Teams concentrate on the majority of users and evaluate whether problems reported by users in the minority are worth fixing at all.
Mobile apps are where I see that phenomenon every day. I have about 100 apps on my smartphone. Every work day I get at least one update for one of them, sometimes up to eight. Shortly before and after releases of new operating system versions, that number goes double-digit. On all my devices I spend a fair amount of my usage time on maintenance and updating. Something I don't want to waste my time on, to be honest.
Hardware is getting faster and faster, and developers simply don't have to care much about optimizing their code. Just add some cores or a few GB of RAM, and it runs fast again. This has, of course, always been a phenomenon: developers were simply not trained to squeeze the last bit of performance out of their code. Hardware gets replaced even faster these days, so why worry about slow devices? And so the spiral goes.
A long time ago, when most software was already delivered on CD-ROM, there was an initiative that tried to fit awesome stuff on floppy disks (you might remember those things that look like a 3D print of the save symbol), taking a maximum of 1.44 MB of disk space and running performantly on older CPUs as well. Sadly, I couldn't find any links.
Then there was Fli4l, a project that produced Linux distributions fitting on a floppy disk, so that valuable software like firewalls, proxy servers, and web servers could run on old machines booting from floppy disks.
You might say that in times of 128 GB USB sticks, that's no longer necessary. Well, exactly there lies the problem in my eyes: people (developers and users alike) don't care, because they don't have to.
Times change, and so do development approaches, ways of distributing software, and ways of using software. Is the demand for software higher than 10-20 years ago, or is that demand artificially induced?
With the few topics I have tried to explain here, I still say, from my point of view and in my opinion, that software quality in many cases is decreasing. Some companies do it right and use the available approaches and technologies to improve their software. But how many are out there that do it wrong or simply deliver bad software faster? If you don't come across those pieces of software in your life, good for you, and I hope it stays that way for a long time.