‘Big data’ is all the rage, and for good reason. Companies need to sort through tons of it to understand what products their customers really want, and what they will buy. After all, their future depends on it. Yet, in spite of vast amounts of market intelligence and virtually unlimited information, how well do companies really know their customer?
In a world where, according to the Association of Product Management and Product Marketing, more than 50 per cent of new technology products that enter the market fail, and roughly 75 per cent of consumer packaged goods and retail products fail to earn even $7.5m during their first year, we are clearly not doing a good job of putting this data to work.
While marketing and manufacturing organisations have begun to take advantage of big data, the same cannot be said of product and service planning, a function most companies still regard as an art. The fact is that reliably creating high-demand products and services requires companies to balance that art with science. That’s where IT comes in.
For senior IT executives, though, the challenge is two-fold: first, to recognise the value of big data in mining customer needs and desires; and second, to devise a data management strategy that integrates big data into the front end of the innovation pipeline.
The third wave
Market research has evolved from the days of discovering customer needs principally on the basis of periodic surveys, focus groups, behaviour observation and ethnographic studies. While still important sources of customer reaction, these techniques do not scale to the requirements of global markets or mass customisation.
Next, some companies began to incorporate collaboration technologies, which let customers directly influence product and service features. IBM and Siemens, for example, make use of crowdsourcing. Open innovation initiatives at Procter & Gamble, Airbus, and Sara Lee are having a notable influence on product development.
Interestingly, a third wave now requires companies to accumulate data from sources in close proximity to the customer. In particular, online and brick-and-mortar retail, insurance, transportation and telecommunication companies, healthcare providers, utilities and banks collect large volumes of customer data.
There is transactional data from web searches, purchases, customer service calls and visits to the bank or doctor’s office; social media data from LinkedIn, Facebook, Twitter and blogs, online customer reviews and sentiment analysis; and data produced by the so-called ‘Internet of Things’, which includes location-based services and data from sensors and monitors embedded in a growing array of products, from automobiles to medical devices to household appliances.
With the evolution of software as a service and open source technologies, capturing and managing this data has become more cost-effective for companies than ever before. However, sorting through it to get at the right information, and then putting that information to work for product development, is the real challenge. This requires a fully integrated approach to managing product life cycles, so that each data point can be used to paint a full picture of what the customer really wants and needs, which can then be reflected in the end product.
This third wave will account for the biggest growth in big data, a fact made all the more vivid by the recent prediction by Hans Vestberg, Ericsson’s chief executive, that there will be 50bn internet-connected devices by 2020.
IT’s crucial role
That IT has had a positive impact on productivity and efficiency is undeniable. Now, as the pace of innovation quickens and the pressure to remain competitive intensifies, IT can help company innovators build the right products and services. This time, the task is to collect, synthesise, analyse and integrate big data and then merge those results with other sources of market research. There are three areas where IT can contribute.
The first is building the necessary infrastructure. This includes defining the data and computing architectures for extracting and analysing the data, and working with third-party sources of big data. It also means making the data available to existing innovation management and product lifecycle management systems, so that those best able to act upon it have easy access. A critical part of this infrastructure is a system that allows for end-to-end automation of information flow, to avoid siloing and other classic roadblocks in the development cycle.
The second is enabling operational processes and tools. This includes social media analytics and applications for data visualisation, collaboration and business intelligence. It also includes helping to define new business processes and workflows for product and service development organisations.
The third is advocating for proper policies and governance. Making sensitive data available to more organisations carries new risks, which must be assessed in the context of new usage models and existing security, privacy and data access policies.
Eyes on the prize
The barriers to incorporating big data into the innovation process can be significant, ranging from cultural to organisational, technological and budgetary. However, the opportunities are also clear: getting new products and services to market more quickly; identifying weak product and service candidates earlier in the development cycle; eliminating features that customers don’t want while adding features they are willing to pay a premium for; and identifying and prioritising requirements for specific markets.
The bottom line is that the more you know about your customers, and the more you incorporate that knowledge into your products and services, the more positive the impact on revenues, margins and market share.
John Hamm is the CEO of Accept Software and the author of Unusually Excellent.