How to Build Excellent Software, Part 2
I realise my initial post on building excellent software may have sounded a bit anti-UX. The UX "movement" is all about providing an excellent, unique, crafted user experience that rewards and delights the user with your product.
I perhaps implied that "features" don't matter. Yet little, rarely-used features are often the ones that delight us the most. I recently owned a Samsung Galaxy Note. With the stock Samsung firmware it was extremely feature-rich: there was lots of nice stuff, like easily sending the screen to my TV, some nice gesture support, and eye-tracking screen time-outs. However, I chose to install a custom firmware that was extremely stripped down by comparison, purely because the standard firmware was too slow for my liking; I felt I was always waiting. Of course I missed some of the features, but the large increase in responsiveness in 99% of the actions I performed on the device more than made up for the missing functionality. I have found that most of my colleagues with Samsung phones have installed custom firmware to improve performance. When the device is waiting for you, and not the other way round, you have the basis for an excellent UX; without this base level, the UX will always be worse than the faster option. Slow application responsiveness is consistently one of the top reasons given for trying an alternative product; sometimes it is the only reason given.
But surely responsiveness and optimisation are a given and should already have been accomplished, and don't quad-core mobiles make slow performance a thing of the past? Well, even Google has arbitrary delays in its software: for example, in Android 4.4.2, services and broadcast receivers are prevented from calling startActivity for 5 seconds after the home button is pressed.
This arbitrary delay can make the OS appear sluggish under certain circumstances, and the 5-second wait seems pretty unnecessary. So while Android is making great strides in those little features, and Google Now gets more impressive with every version, there are still lots of places that could be optimised to provide a much better experience. Using fewer system resources and providing results faster has so many knock-on effects on mobile that every fraction of a second saved on common actions really is worth more than the sum of its parts to the end user.
Optimisation can often be a relatively low-cost activity: it is easy to test once the system is appropriately benchmarked, and as long as the features remain the same you know it is a guaranteed improvement. Introducing new features with even a modest level of complexity will almost always cost more, and their benefit is not always easy to calculate. In extreme cases new features can even be a net cost.
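That "guaranteed improvement" property is easy to check in practice: if an optimised implementation produces identical output to the reference for the same inputs, the feature set is unchanged and only the speed differs. A minimal Python sketch of the idea (the duplicate-check functions are illustrative, not from this post):

```python
import random
import timeit

def contains_duplicate_naive(items):
    # O(n^2) reference implementation: compare every pair.
    n = len(items)
    return any(items[i] == items[j] for i in range(n) for j in range(i + 1, n))

def contains_duplicate_fast(items):
    # O(n) optimised version: a set collapses duplicates.
    return len(set(items)) != len(items)

# Same feature, same answers -- so any speed-up is a pure win.
for _ in range(100):
    data = [random.randrange(50) for _ in range(40)]
    assert contains_duplicate_naive(data) == contains_duplicate_fast(data)

# Benchmark both on a larger input to quantify the improvement.
big = list(range(2000))
slow = timeit.timeit(lambda: contains_duplicate_naive(big), number=3)
fast = timeit.timeit(lambda: contains_duplicate_fast(big), number=3)
print(f"naive: {slow:.4f}s  fast: {fast:.4f}s")
```

Once the equivalence check passes, the benchmark numbers are the only thing left to argue about.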
Windows Vista looked nicer than XP, and in many circumstances it could be faster. However, anyone who used it will be well aware of the constant additional confirmations demanded by User Account Control (UAC) for "security" purposes. Most users quickly turned UAC off; at that point the new feature has zero benefit, but worse than this, the negative experience while it was active and the time cost of switching it off add up to a net loss in the eyes of the user, and that negative view can even spill over into negative opinions of other features. This detracted from many of the positive elements Vista introduced, such as superior font rendering and better search indexing. A lot of people stayed on XP, or even switched back from Vista, to improve speed and to avoid irritations such as UAC.
It is vital to have a good test suite available to make sure that new features have no detrimental performance impact. A performance regression is the first and most vocal complaint I have ever witnessed in software development, and it can easily turn a beneficial feature into an end-user revolt over a new version. A good suite helps ensure that any new feature has little or no cost and will only be viewed as a benefit.
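One cheap way to bake this into a test suite is a timing assertion against a benchmarked budget. A sketch in Python, assuming hypothetical function names and an illustrative threshold that you would tune to your own measured baseline:

```python
import timeit

def format_contact_list(contacts):
    # Hypothetical "common action" whose responsiveness users notice.
    return "\n".join(f"{name}: {number}" for name, number in sorted(contacts))

def test_format_contact_list_stays_fast():
    contacts = [(f"user{i:04d}", f"555-{i:04d}") for i in range(1000)]
    # Take the best of several runs to reduce noise from other processes.
    per_call = min(timeit.repeat(
        lambda: format_contact_list(contacts), repeat=5, number=10)) / 10
    # The 50 ms budget is illustrative; set it from your own benchmarks.
    assert per_call < 0.05, f"performance regression: {per_call:.4f}s per call"

test_format_contact_list_stays_fast()
```

A check like this fails the build the moment a new feature pushes a common action past its budget, long before users start complaining.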