When Is New Technology Adopted?

We have seen big claims from tech companies, but little impact on the market. Why?


Having reached a certain level of technological maturity by 2015, we have seen that there is one key to success when it comes to new technologies: usability.

The truth is that a technology goes through several stages: initial conception, hype, disillusionment and then full-scale adoption. This curve tends to be driven by how easy it is for people to use a product or develop programmes for it.

When a product is initially conceptualized it is used in its rawest form, like using a PC exclusively through DOS. For the uninitiated, this kind of work would have been almost impossible. It wasn’t until 1985, when Windows 1.0 was released, that the majority of the population could start using a PC.

This ability of the majority of a population to use a technology is vital not only to the commercial value of a product, but also to the potential it will have in the future.

This is due to the money available for developing new programmes or customizations for it. If people aren’t buying a product, funding an extensive development programme for it is not going to make sense at any company.

We saw with Google Glass that if people don’t like a product, its lifespan will be cut short. People did not like the interface of the Glass; beyond the menus and general usability, the necessity of wearing it on your face in a very obvious way was simply not viable.

It isn’t just for consumer products either.

With business products that require in-depth knowledge to use, rather than being quick to pick up, there is a need for specialist skills. These specialist skills, by definition, are only available to a limited number of people, who can then demand considerable salaries based on them.

Therefore, in order for a technology to have truly widespread success, it needs a user interface that many can use without necessarily understanding the complex programming that the system is built on.

So far I have only really discussed complex operating systems, which overlooks much of the current flock of new technologies.

Wearable technology, for instance, does not require considerable knowledge of how to programme a particular device or app, but instead needs a user interface that allows people to use the product effectively.

We have seen with multiple smartwatches that the key to success is finding the best way for people to interact with a smaller screen, whilst still making it a pleasing experience. It is something that no company has truly managed outside of specific niches like health monitoring. Apple are the next company to try to make the most of the new interface, and we wait to see whether they have found the correct recipe.

As we have seen in the past, simply copying existing systems does not always work, and it is the smallest details that can have the biggest impact on how a system is perceived and used. We see this with first-generation new technologies, where slight bugs are present, only to be ironed out in later iterations. Think about the first iPhone compared to the most recent: the changes (aside from technological performance) are minimal, but have made a big difference to how people use the device. It is these little changes in the ways that people interact with a technology that ultimately prove to be the catalyst for success.

So when are we going to see new technologies become popular? Essentially, when people can use them properly.


Read next:

Leading Innovation into the Mainstream