
Tag Archives: Data Analytics

The Analytics Trust Gap. It Is Very Real

So, you are looking to start an analytics program within your organisation. You have the tools ready, the resources designated for the task, and all the required processes in place to run a robust analytics program. You expect to see massive revenue growth within a year; however, after 365 days, the outcomes are well short of your expectations. At this point, you have a set of questions to ask yourself:

  • Where could I have gone wrong?
  • My organisation has a massive volume of data. Was the quality of my data not up to mark?
  • I had all the tools in place, with Artificial Intelligence automating all the processes. Where did I fall short?
  • I have received data from multiple sources. Is this causing the shortfall?
  • Have I ensured that the data residing in my data lakes is free from breaches and attacks?

All these questions could ultimately lead you to lose faith in your analytics program.

Over the past several years, we have seen data analytics evolve from the simple exploratory level to the current predictive level. With Business Intelligence (BI) playing the pivotal role in decision making, organizations are seeking the power of advanced analytics and machine intelligence to attain agility and competitive differentiation. Telcos, which own the most significant share of customer data among all industries, are best positioned to leverage it to achieve higher levels of maturity. However, a recent KPMG report reveals a paradox: despite huge investments in data analytics, organizations are not able to build value around it due to a lack of trust. According to the report, only 35% of decision-makers have a high level of trust in their own organization’s analytics, and 25% admit that they have either limited trust or active distrust in their analytics. Moreover, only 10% said they excel in managing the quality of data and analytics, and 13% said they excel in the privacy and ethical use of data and analytics.

What Causes the Trust Gap?

The above findings come as no surprise considering the growing complexity associated with handling data originating from disparate sources. Many studies now indicate that the original four Vs used to describe key aspects of data (Volume, Velocity, Variety, and Value) have grown to ten, adding Variability, Veracity, Validity, Vulnerability, Volatility, and Visualization.

But besides the growing complexity of data, there are multiple other aspects which are leading to the trust gap, some of which are captured below.

Quality: a top concern

As the data grows more complex, analysis can be challenging. Poor data quality or incompetent analysis can lead to disaster. As Gartner puts it, “As organizations accelerate their digital business efforts, poor data quality is a major contributor to a crisis in information trust and business value, negatively impacting financial performance.”

Working with false or incomplete data could result in uninformed and biased decisions, which could prove harmful to the overall business. Gartner has also estimated that poor data quality can lead to an average of $15 million per year in losses.

Can we trust machines?

In today’s machine-controlled analytics landscape, building trust becomes even more challenging. The advent of artificial intelligence (AI), coupled with advancements in machine learning (ML), has opened up a plethora of opportunities in data analytics. However, we have also seen many horror stories in which placing complete faith in AI without human intervention has had disastrous consequences.

Integration of disparate data

Considering that organisations do not have a single source of truth when it comes to the data they gather, data integration is another major roadblock, resulting in poor execution and sometimes complete failure of analytics implementation. Organizations which are slow in their transformation journey confront challenges in integrating data of different formats. They lack the skills, training, and tools to build a centralized access and control policy. Historically, the self-service concept is appealing but has often fallen short of expectations due to barriers faced at multiple levels – technology, people, and process.

Security: an everlasting concern

There is no respite from data breaches and misuse, and the fact that fraudsters are making headway by exploiting advanced techniques only escalates the concern. A single security breach can damage an organization’s reputation enough to bring its transformation to a halt.

These are but a few of the reasons why a trust gap is being created, but they are very real. Left unaddressed, the trust gap will have severe implications for competitive advantage, operational efficiency, and growth. The inability to rise as a data-driven organization means lagging behind mature peers: according to McKinsey, data-driven organizations are 23x more likely to acquire customers, 6x as likely to retain them, and 19x as likely to be profitable, and organizations that fail to become data-driven miss out on all of these benefits.

It is clear – The time has now come to bridge the trust gap! The only question that now remains is, how? Stay tuned to our blog for the answer.

Sandeep Banga

Sandeep is responsible for Product Marketing of Subex’s Analytics portfolio. In addition to this role, he also looks after Public Relations and Analyst Relations for the company. He brings nine years of experience in the marketing domain.

Are Traditional Data Warehouse Challenges Affecting Your Business?

With data emerging as the new currency for businesses, data warehousing demands a new approach in dealing with the challenges. As Gartner puts it, poor data warehousing practices “undermine the organization’s digital initiatives, weaken their competitive standing and sow customer distrust.” Telecom operators are among the most affected by data warehousing challenges as they handle billions of customer data generated from multiple sources like files and probes (SS7, SIP, SIGTRAN), as well as the massive volume of data streamed from social media platforms.

As the volume, velocity, variety, and veracity of data generated continue to grow, the traditional data warehouse approach flounders when managing and analyzing the data. With conventional data warehouse analytics offerings, answering even a seemingly simple question such as “Who are my ten best customers?” could take 5-10 days. Even after the team figures out the right criteria, compiling and analyzing the data can be a time-consuming process. As the questions grow more complex, the burden only increases.
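For contrast, once the data is consolidated in an analytical store, a question like this reduces to a one-line aggregation. Below is a minimal sketch in Python with pandas; the inline billing table and its column names are purely illustrative stand-ins for data that would normally be loaded from such a store:

```python
import pandas as pd

# Hypothetical billing records; in practice these would be loaded
# from an analytical store rather than built inline.
billing = pd.DataFrame({
    "customer_id": ["C1", "C2", "C1", "C3", "C2", "C1"],
    "amount":      [120.0, 80.0, 200.0, 50.0, 300.0, 30.0],
})

# "Who are my ten best customers?" by total spend.
top_customers = (
    billing.groupby("customer_id")["amount"]
           .sum()
           .sort_values(ascending=False)
           .head(10)
)
print(top_customers)
```

The point is not the code itself but the turnaround: with the right platform, an answer that once took days of report engineering becomes an interactive query.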

In-memory databases have alleviated the problem to some extent by providing better performance, but their rigid data models make analytics workloads increasingly compute-bound. It is also well understood that traditional data warehouse approaches have become arduous due to IT dependency and upfront data modeling. As a result, time to value grows longer, and the outcome materializes only when the business can start using the reports and insights the data warehouse provides. Because this approach is rigid, it may also require data modeling changes, which delay execution further.

Working with data has always carried the inherent risk that false or incomplete data could lead to uninformed or even misinformed decisions. Telcos have a winning edge as custodians of the largest repository of customer data in the world, so they can no longer ignore these challenges as new business opportunities emerge around data usage. Consider the sheer size of the data generated during a typical business day: an operator serving 80 million mobile subscribers generates around 20 billion detail records (xCDRs) daily.
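A quick back-of-the-envelope check makes the scale concrete. The subscriber and record counts come from the figures above; the per-record size is purely an illustrative assumption:

```python
subscribers = 80_000_000            # 80 million mobile subscribers
daily_records = 20_000_000_000      # ~20 billion xCDRs per day

records_per_subscriber = daily_records / subscribers
print(records_per_subscriber)       # 250.0 records per subscriber per day

# Assuming an illustrative ~500 bytes per record, the raw
# daily volume already lands in the terabyte range.
bytes_per_record = 500
daily_bytes = daily_records * bytes_per_record
print(daily_bytes / 1e12)           # 10.0 TB per day
```

Even under conservative assumptions, this is a daily ingest rate that a traditional warehouse was never designed to absorb.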

Telcos worldwide have identified opportunities around data monetization, both internal and external. The growth of the Internet of Things (IoT), artificial intelligence (AI), and machine learning (ML) has largely contributed to this upswing. Telcos’ growing engagement with content providers, IoT companies, VAS companies, and others shows they are on the right track. At this juncture, it becomes crucial for telecom companies to revise their data strategy around modern big data analytics tools.

Working with poor-quality data can also bring damage to the existing business, especially concerning customer value. As you know, customer preferences are evolving, so ensuring customer satisfaction and loyalty mostly relies on how quickly you address their issues. Big data and real-time analytics gain relevance in this context.

Hence, a robust big data platform is no longer a luxury but a business imperative! Stay tuned to learn more about Subex’s approach to handling the complexities of a traditional DWH and data quality issues.

For more information on how Subex is helping Telcos address gaps in their analytics approach through an end-to-end framework, attend a webinar we are hosting with Telecoms.com entitled, Bridging the Analytics ‘Trust Gap’ Within Telcos.

Register yourself here: http://bit.ly/2NzFXJF

Varune Virendra Gupta

Varune has 9+ years of experience in the Telecom and the IT industry, which includes working with IBM, TCS and NEC.
Currently, he is working as a Director in the Business Consulting group for the Emerging Markets.

Customer Analytics: Data Breaches and Consumer Trust

In this (possibly) final blog of this series I will be looking at how customers are becoming increasingly concerned at companies’ inability to keep their data safe, and how high publicity data breaches are eroding public confidence.

I previously wrote about how it is possible to know where someone stands on issues of internet security just by checking their birth date. Those born after 1980, the so-called Generation Y, or Millennials, are generally more comfortable sharing information online. But things are gradually changing: a 2014 survey by eMarketer found that Generations Y and Z were becoming significantly more concerned about how well companies protect their personal data.

This concern can have a major impact on a company’s profits, as the Ponemon Institute’s 2015 Cost of Data Breach Study (conducted in conjunction with IBM) showed:

Lost business has, potentially, the most severe financial consequences for an organization. The cost increased from a total average cost of $1.33 million last year to $1.57 million in 2015. This cost component includes the abnormal turnover of customers, increased customer acquisition activities, reputation losses and diminished goodwill. The growing awareness of identity theft and consumers’ concerns about the security of their personal data following a breach has contributed to the increase in lost business.

By studying many data breaches across many industries, Ponemon has developed an approach that attaches a cost to each record lost in order to estimate the total cost of a breach. What they have found is that data breaches are not only becoming more common, but also more costly.

The average cost paid for each lost or stolen record containing sensitive and confidential information increased 6 percent, jumping from $145 in 2014 to $154 in 2015.

2015 was perhaps the worst year so far for data breaches. NetworkWorld’s Top-10 data breaches of 2015 reported that many firms became victims of very public data breaches, including:

  • Children’s toy companies (vTech)
  • Phone companies (T-Mobile, TalkTalk)
  • Healthcare firms (Premera, Anthem)
  • Dating agencies (Ashley Madison)
  • Government departments (IRS)
  • Security consultancies (Hacking Team)

…and many more.

The trend does not look good, with the number of breaches predicted to increase in both size and cost for the foreseeable future.

Consumers are now realising that many companies are not vigilant enough in protecting their data and, just as bad, that customer data is being used in ways that are inappropriate and may adversely affect the consumer.

In a poll conducted by leading consultancy Radius Global, 78% of internet users said they only purchase from companies they trust.

The message is clear: if companies are to use customers’ data to provide a better service, they must do everything in their power to ensure the security of their systems and the responsible use of that data, or run the risk of facing big fines and a catastrophic loss of consumer trust.

Big data and advanced analytics are forcing governments to bring in new regulations to ensure that companies use customer data responsibly, but it is in every company’s interest to ensure that the bond of trust is not broken.

Mark Jenkins

Mark Jenkins has worked in the IT industry for over 15 years as a BI and Analytics consultant, and more recently as ROC Product Manager for Subex Ltd. He has designed and deployed solutions for global companies in many sectors including Insurance, utilities and telecommunications. Mark holds a BSc Hons in Computer Science from Manchester University (UK).
