
Category Archives: Analytics

Why Analytics Alone Is Insufficient for Telcos Seeking Increased CX and Profitability

The past decade has seen the telecom industry embark on a momentous growth path. The number of unique mobile subscriptions worldwide has almost doubled, from 2.6 billion in 2009[1] to 5.1 billion in 2019[2], thanks to the rapid evolution from 2G to 3G, 4G, and now 5G, along with innovations in IoT, M2M, Artificial Intelligence (AI) and cloud computing. While this growth phase has left the industry with enormous amounts of data, the majority of telcos have not reaped the benefits of monetizing the data at their disposal the way Facebook, Google and many others are successfully doing.

Data monetization is vital for Telcos to stay ahead

Telcos enjoy a unique position in the ICT value chain. For example, a simple activity such as planning a holiday today involves using your smartphone for a variety of tasks, from booking flights and hotels to researching the best places to explore. All of these tasks generate varied datasets to which telcos have access. However, a complicated ecosystem of challenges is preventing telcos from monetizing this data successfully, both internally and externally.

As telcos continue to face increasing pressure on their top line, internal data monetization can bring them a strategic advantage in solving business problems. From understanding revenue trends, such as accurately pinpointing developments in the voice or data business, to gaining deep insight into customer likes and dislikes, data analytics can address several business issues. Similarly, external monetization opens opportunities for telcos in a range of industry verticals.

However, to stay relevant in the market, telcos will have to stay ahead of the trends that can lead them to opportunities. Messaging (SMS), once an exclusive offering of telcos, has now been taken up by players like WhatsApp and Facebook Messenger. Employing data analytics to analyze and predict trends is the only way to stay ahead.

Why analytics alone won’t help

Several leading telcos have already realized the power of data analytics as a key strategic pillar and continue to invest in advanced data technologies. However, the ROI on telcos’ data analytics investments remains unproven, especially on an incremental basis. The reason: most telcos view data analytics as a technology asset, leaving a wide gap between technology and business goals. This lack of coordination means analytics and data science are treated as a technology solution rather than as answers to business problems, and business goals suffer as a result.

The answer lies in domain-driven analytics

For telcos seeking to stay competitive, domain-driven analytics is the answer. Spearheaded by deep domain knowledge and wide industry exposure, a domain-driven analytics solution understands the requirements of telcos operating in specific regions or market conditions. Through the analysis of existing datasets, domain-driven analytics can transform current statistics into future possibilities.

Take the successful use case of a telco in a developing market that used domain-driven analytics to increase revenue. The telco’s existing business model was to grow revenue by acquiring more customers. Domain-driven analysis, however, showed that its ‘active base’ metric, then standing at 8.5 days, needed to improve for revenue to increase. The telco acted on the recommendations and ran an integrated campaign that lifted the ‘active base’ from 8.5 to 9.5 days and boosted revenue by 25%.

With 25+ years of experience driving data-driven business transformation for global telcos, Subex has deep expertise in applying domain-driven analytics to telecom.

[1] https://www.budde.com.au/Research/Global-Telecoms-The-Big-Picture-2019-Key-Industry-Statistics

[2] https://www.gsmaintelligence.com/


To find out more about how domain-driven analytics can improve your ROI, customer experience, and profitability:

Download the webinar recording now!

The 6 worst pitfalls of not having an Analytics Maturity Assessment in place

Most organizations use analytics to improve their operations and enhance business performance and growth. However, through our experience of working with telecom organisations across various geographies, we have seen that analytics is often done in an ad-hoc manner rather than in planned phases. As a result, organisations do not see the kind of return they were expecting from their analytics investments. An Analytics Maturity Assessment (AMA henceforth) helps in identifying and avoiding such pitfalls. A few of the pitfalls that an AMA helps identify are:

  1. Data Issues & trust: The root of any analytics project is the Data element. It is important that the end user does not have any form of data trust issues when it comes to their data. Proper data capturing, storing, infrastructure design, validation etc. are important dimensions which AMA investigates with proper checks and balances based on industry standards.

 

  1. Lack of Clarity: The first step to having clean data comes from having a good understanding of the data. There are basic graphs, queries, questions that every data analysis requires at the start of any project. For companies to be truly data driven, a Data Analysis pack consisting of the above elements and beyond is necessary. AMA checks these points to help figure out whether the first cut is robust enough. This understanding, the results of such a pack, is essential not only for the analytics team but helps business users as well gather lots of useful insights.

 

  1. Lack of Business Understanding: Analytics is a means to an end and not an end in itself. The intention is to help the organization improve on its top line, bottom line and operational efficiency. This is not possible unless the analytics team has a good understanding of the business as well the business team clearly understands what analytics can bring to the table. Synergy between the two is needed for implementable outcomes. The analytics solution should answer the questions that the end user wants to know. AMA helps in avoiding this pitfall or checks the status quo by having multiple checklists such as SMEs, trainings, meetings, presentations on this dimension.

 

  1. Improper Implementation: After what is to be done is clear, how it is done is essential for effective implementation. Ad hoc project work and repetition of same mistakes are cost centers for organizations. There must be standard practices for project implementation.

 

  1. Information Asymmetry: Having to constantly reinvent the wheel is another cost factor which takes the essential time of resources. This issue is usual the result of teams working in silos and not functioning as a larger team. Often similar analytics projects are undertaken in different verticals such as say finance and operations. This is another grey area which AMA helps in identifying and avoiding.

 

  1. Missing dollar accountability: Many organizations do not have long term vision in place. In these cases, analytics becomes good to have but it remains just a cost center. The vision and purpose of analytics needs to should come from the top and be clearly identified and communicated. Each project’s expected outcomes should be predefined and once implemented, its ROI calculated. This is essential for the integration of analytics into the DNA of the organization for it to be able to make a string contribution to the growth story.

Conclusion:

It is essential for every organization to take stock of the situation at regular intervals. It is even more important when trying to inculcate something very different and new within the organisation to bring about a mindset and cultural change. Most organizations are investing in analytics but find it difficult to use effectively; an AMA helps in streamlining this change by asking pertinent questions and figuring out the change methodology. As noted above, not having an AMA in place can be expensive considering the associated risks.

To understand more about how you can adopt an AMA within your organisation, view a recent webinar we conducted on the topic.

Click here to view the webinar

The Road to Being Data-Driven Starts with Knowing Where You Are

The telecom world is changing, and organisations are challenged not only to grow but even to stay relevant. To cope with this fast-changing environment, organizations need to reach a level of maturity where decision-making moves from gut feeling to number- and reasoning-based practices. The mandate here is clear: organisations need to become data-driven or risk being left behind. The reality, however, is bleak: nearly 100% of enterprises want to become more data-driven, but fewer than a third have accomplished that goal. Becoming data-driven starts with being able to assess where one stands and which is the best way to move toward a given objective, a task which continues to challenge organisations.

Almost all organisations have ventured into the woods of data analytics with the aim of making sense of, and getting the best out of, the huge volumes of data they hold. Sometimes they ask what others (in the same industry, across industries, or in academia) are doing and how they can replicate it; at other times they ask what can be done that no one else has tried. To analogize the entire data analytics practice to the workings of an engine: an engine is only as good as the sum of its parts. The parts must fit well, the oiling mechanism should reduce friction, the oil supply has to be timely, and of course the sparks need to be perfect. As many say, data is indeed the new oil, and data analytics is fast becoming the engine (in this case, of growth).

Fine-tuning this engine is the need of the hour. It is mission-critical to assess where we stand, which directions we can move in and where each would lead, what would best enable each move, and how to go about it. This is itself an optimization problem, where maximizing returns and minimizing costs under multiple constraints is challenging.

Transformation is important, but to ensure true competitive advantage, organizations must transform themselves in planned phases. The approach to analytics cannot be haphazard, as that would bring more trouble than benefit. Organizations must traverse one stage at a time. Defining those stages is the need of the hour, as is knowing what steps to take at a given point to move from one stage to the next. In this scenario, an Analytics Maturity Assessment becomes imperative.

Through our Subex Analytics Maturity Models, we have been working with customers across the globe, helping them assess their analytics maturity, define their business objectives, and carve out an analytics roadmap toward those objectives. The Subex Analytics Maturity Models define those stages, and the steps between them, through an assessment across People, Process, Technology and, most importantly, Data.

To learn more about why an Analytics Maturity Assessment is important and how it can help your organisation, Schedule a Demo with us and our Subject Matter Experts will get in touch with you.

Addressing the Trust Gap. It is Possible

In our previous blog, we spoke about how, in today’s world of rapid and constant change, it has become ever so important to make the most of the real-time inflow of data. Data is the new oil, and like oil, data needs a refinery before it is used across business use cases. This trend always reminds me of the Tintin comic Land of Black Gold: “Boom! … One day your car goes Boom!”. The plot revolves around car engines exploding because of petrol that is faulty at its source. Similarly, if data is not clean at the source, your business decisions are bound to go “BOOM”!

‘Tintin: Land of Black Gold’ by Hergé


Analytics has been commoditized today with the entry of open-source tools and technologies. However, there is a significant trust gap when it comes to the consumption of analytics. How much do you trust your data? How much weight does the output of analytics carry in the organisation’s board meetings? In our last blog we looked deeply into the trust gap and its roots; at the end of the day, we need to realise that analytics is just an application of Math, Technology and Business to data. So, if we believe in mathematics, have faith in the technological revolution, and are confident in our business intuition, there is no reason for analytics not to be considered the most critical function. All that remains is refining the oil, i.e., the data.

We at Subex recognize and respect this trust gap. We also believe an analytics strategy should be built around the golden triangle: People, Process, and Technology. The first and most critical step is to have analytics done in a democratized manner: everyone in the organization, from the C-level to the department-head level to the analyst level, should be armed to be data-driven. The involvement of machines should not undermine trustworthiness, nor should it decrease human involvement; instead, you should leverage the best of both human and machine intelligence to improve products, enhance quality of service (QoS) and derive more returns from your investments.

Does Human Intelligence + Machine Intelligence = Trust?

Taking the Human Intelligence + Machine Intelligence philosophy into account, we have come up with a concept known as Subex ACT (Analytics Centre of Trust), designed to bridge the trust gap by covering the end-to-end cycle of Data-Insights-Decisions (D.I.D.). Let us take a quick look at the three pillars of our ACT program:

  1. Defining a Strategy

Before starting the analytics journey, it is imperative to assess the following:

  1. What is the analytical maturity of the organization?
  2. What are my objectives from the analytical program vis-à-vis the business vision?
  3. Do I have a roadmap in place?

The main Objectives of this process are:

  • Setting up the goals for the organization: The Strategy can help deliver competitive advantage, create incremental revenue opportunities, and reduce costs.
  • Assessing your analytics maturity vis-à-vis your goals: Understand where you are in terms of your analytics maturity and identify the target maturity which will help you reach the goals defined in step 1
  • Plan for the Transition: Understand how you will transition from your current maturity level to the desired maturity level and ensure the process is time-bound, tangible and step-wise. What we recommend is that you identify tangible use cases, such as churn, and move the analytics maturity of addressing churn from, say, 3 to 4. Once that is completed, define another use case and increase the maturity to address that similarly
  2. Setting up an Information Infrastructure

Having defined the analytical strategy, it is imperative to have the right set of tools for the task at hand. The tools organisations need today must be:

  • Agile: Able to address new problem statements as requirements change
  • Scalable: Should be able to handle massive volumes and different types of data
  • Reliable: The information that is generated by the system needs to be trustworthy
  • Real-Time: For quick and accurate decision making the reports should be in real-time
  • API Integration: The tools should be compatible with API-based integration
  • User-friendly: Reports and data should be easy to consume
  • Secure: The tool should be compliant with security guidelines
  • Self-Serviceable: Accessible UI enabling the end user to self-generate reports

Such an Information Infrastructure should offer a self-service reporting environment wherein each stakeholder gets access to the tools to analyze and act upon the information. This will not only reduce the time gap in execution but also raise operational efficiency to a new level. As the model evolves into an Analytics Centre of Trust (ACT), the transformation journey becomes smooth.

  3. Analytics-Driven Business Outcomes

The final piece of the analytical framework concerns the analytical output. For too long, organisations have set up analytics practices with a mandate to deliver analytics outcomes. Subex firmly believes that the key to analytics is to attain business outcomes, while ensuring insights are available across all audience levels in a democratized, easily understandable fashion.

Conclusion

The world of digital technologies is open for telcos to build new business opportunities as well as excel at existing ones. It is time to identify the gaps in your analytics strategy and develop an ACT that helps you climb the ladder faster. As your organization prepares to capture active markets, your analytics goals must focus on using information as a strategic asset to generate revenue, improve operational efficiency, and provide best-in-class customer service.

In our next blog, we will cover how Subex ACT helps CSPs address these three pillars and how it brings agility, an analytics-to-business mindset, and democratization through consumable outcomes to your organisation.

This blog has been co-authored with Sandeep Banga. 

The Analytics Trust Gap. It Is Very Real

So, you are looking to start an analytics program within your organisation. You have the tools ready, the resources designated for the task, and all the processes you need in place to run a robust analytics program. You expect to see massive revenue growth within a year; however, after 365 days have passed, the outcomes are well short of your expectations. At this point, you have a set of questions to ask yourself:

  • Where could I have gone wrong?
  • My organisation has a massive volume of data. Was the quality of my data not up to mark?
  • I had all the tools in place, with Artificial Intelligence automating all the processes. Where did I fall short?
  • I have received data from multiple sources. Is this causing the shortfall?
  • Have I ensured that the data residing in my data lakes are free from breaches and attacks?

All these questions could ultimately lead you to lose faith in your analytics program.

Over the past several years, we have seen data analytics evolve from the simple exploratory level to the current predictive level. With Business Intelligence (BI) playing a pivotal role in decision making, organizations are seeking the power of advanced analytics and machine intelligence to attain agility and competitive differentiation. Telcos, which own the most significant share of customer data among all industries, are in the best position to leverage it to achieve higher levels of maturity. However, a recent KPMG report reveals a paradox: despite huge investments in data analytics, organizations are unable to build value around it due to a lack of trust. According to the report, only 35% of decision-makers have a high level of trust in their own organization’s analytics, and 25% admit that they have either limited trust or active distrust in their analytics. Moreover, only 10% said they excel in managing the quality of data and analytics, and 13% said they excel in the privacy and ethical use of data and analytics.

What Causes the Trust Gap?

The above findings come as no surprise considering the growing complexity of handling data originating from disparate sources. Many studies now indicate that the 4 Vs once used to describe key aspects of data (Volume, Velocity, Variety, and Value) have grown to 10, adding Variability, Veracity, Validity, Vulnerability, Volatility, and Visualization.

But besides the growing complexity of data, there are multiple other aspects which are leading to the trust gap, some of which are captured below.

Quality: a top concern

As the data grows more complex, analysis can be challenging. Poor data quality or incompetent analysis can lead to disaster. As Gartner puts it, “As organizations accelerate their digital business efforts, poor data quality is a major contributor to a crisis in information trust and business value, negatively impacting financial performance.”

Working with false or incomplete data could result in uninformed and biased decisions, which could prove harmful to the overall business. Gartner has also estimated that poor data quality can lead to an average of $15 million per year in losses.

Can we trust machines?

In today’s machine-controlled analytics landscape, building trust becomes even more challenging. The advent of artificial intelligence (AI), coupled with advancements in machine learning (ML), has opened a plethora of opportunities in data analytics. Yet we have also seen many horror stories wherein placing complete faith in AI without human intervention has had disastrous consequences.

Integration of disparate data

Considering that organisations do not have a single source of truth for the data they gather, data integration is another major roadblock, resulting in poor execution and sometimes complete failure of analytics implementations. Organizations that are slow in their transformation journey struggle to integrate data of different formats; they lack the skills, training, and tools to build a centralized access and control policy. The self-service concept is appealing but has historically fallen short of expectations due to barriers at multiple levels: technology, people, and process.

Security: an everlasting concern

There is no respite from data breaches and misuse, and the fact that fraudsters are making headway with advanced techniques escalates the concern. The thin line between a security breach and the reputation of an organization can bring transformation to a halt.

These are but a few of the reasons a trust gap is being created, but they are very real. Should it remain, the trust gap will have severe implications for competitive advantage, operational efficiency, and growth. Organizations that fail to become data-driven also lag behind mature ones: according to McKinsey, data-driven organisations are 23x more likely to acquire customers, 6x as likely to retain them, and 19x as likely to be profitable, and companies that are not data-driven miss out on all these benefits.

It is clear: the time has come to bridge the trust gap! The only question that remains is, how? Stay tuned to our blog for the answer.

Actionable predictive analytics: overcoming the analysis paralysis

Why standard forecasting analytics models fail to deliver in today’s world of complex digital networks and why telcos need a domain-specific analytics solution.

“Your analytical dashboards and visualizations look good, but I prefer actionable reports and insights,” said the deputy CEO of a Southeast Asia-based telecom service provider during one of our meetings last year. This was not just one odd instance; we have heard the same many times in the past year from other CSP executives. There are many domain-agnostic AI/ML-based analytics solution providers in the market, but what telcos really want is an analytical solution that provides end-to-end, domain-specific actionable insights. Forecasting traffic or pointing out anomalies is one thing, but how do you incorporate those recommendations into capacity planning? What is the root cause of that anomaly, so that it can be prevented in the future? Instead of getting lost in analysis paralysis amidst thousands of fancy statistical metrics, a simpler, actionable and reliable predictive analytics solution is the need of the hour.

With the right mix of domain knowledge and analytics capability, centered on the actual requirements of network planners, Subex has come up with the concept of actionable predictive analytics. Network planners should be enabled to do efficient, reliable and cost-effective capacity planning. Hence the focus here is on what matters to telco network teams, i.e. business values such as capex optimization, network performance improvement, customer experience enhancement and operational efficiency, rather than on underlying analytical components such as configured models or feature engineering.

Here are two of the most important aspects of Subex’s approach to predictive analytics that differ from traditional forecasting models:

Multi-variate analysis

Traditional forecasting systems predict future trends for a metric based solely on that metric’s own historical pattern. In the multivariate approach, by contrast, the system also understands the lagging or leading effects of other KPIs on the given KPI. With this, the telco can predict, in near real-time, what is going to happen in the future and adopt appropriate measures to prevent capacity issues. A multivariate, self-learning forecasting model running on the in-house machine learning platform is complemented by domain-specific configurations and expertise, which are equally essential for intelligent forecasting.
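To make the idea concrete, here is a minimal sketch of how a lagged, leading KPI can be folded into a forecast alongside the target’s own history. This is an illustration only, not Subex’s production model; the column names, one-period lag, linear model, and data values are assumptions made for the example.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Illustrative KPI history: traffic (KPI1) leads capacity utilization (KPI2).
df = pd.DataFrame({
    "traffic":     [100, 110, 125, 140, 160, 185, 210, 240],
    "utilization": [40.0, 42.0, 45.0, 49.0, 54.0, 61.0, 68.0, 77.0],
})

# Univariate baseline: utilization predicted from its own previous value only.
# Multivariate variant: last period's traffic (the leading KPI) is added.
df["util_lag1"] = df["utilization"].shift(1)
df["traffic_lag1"] = df["traffic"].shift(1)
train = df.dropna()

uni = LinearRegression().fit(train[["util_lag1"]], train["utilization"])
multi = LinearRegression().fit(
    train[["util_lag1", "traffic_lag1"]], train["utilization"]
)

# Forecast the next period: the multivariate model reacts to a traffic surge
# one period before it shows up in the utilization series alone.
last = df.iloc[-1]
print(uni.predict(pd.DataFrame({"util_lag1": [last["utilization"]]})))
print(multi.predict(pd.DataFrame({"util_lag1": [last["utilization"]],
                                  "traffic_lag1": [last["traffic"]]})))
```

In a real deployment the lag itself would have to be learned (does KPI1 lead KPI2 by one period or five?), which is exactly the kind of correlation search the simulations described later in this post perform.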

The figures below compare a multivariate model scenario, which considers the lagging effect of KPI1 (e.g. customer complaints) on KPI2 (e.g. capacity utilization), with a traditional model that gives no such insight. Because there is a direct relationship between traffic and customer complaints, the traditional model does not yield results as accurate as the multivariate one. For example, if there was an aberrant increase in traffic, the operator can take that fact into consideration to accurately predict future customer complaints.

One more use-case could be accurately predicting the time to capacity exhaust for a site if one of the neighboring sites is planned for decommissioning soon. In this case, with the help of geo-spatial analytics, the additional load on the given site due to decommissioning of the neighboring site would also be considered for calculating time to capacity exhaust.

[Figure: capacity exhaust forecast]
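As a rough illustration of the arithmetic behind time-to-exhaust, consider the sketch below. The growth rate, exhaustion threshold, and the share of load inherited from the decommissioned neighbor are all invented numbers for the example.

```python
def months_to_exhaust(utilization, monthly_growth, threshold=0.85,
                      inherited_load=0.0):
    """Project when a site crosses its capacity-exhaustion threshold.

    utilization:    current utilization as a fraction of capacity
    monthly_growth: absolute utilization growth per month
    inherited_load: extra utilization absorbed from a neighboring site
                    that is planned for decommissioning
    """
    load = utilization + inherited_load
    if load >= threshold:
        return 0.0
    return (threshold - load) / monthly_growth

# Ignoring the neighbor's decommissioning, the site looks safe for a year...
print(months_to_exhaust(0.60, 0.02))                       # 12.5 months
# ...but absorbing 15% of capacity from the retired neighbor halves the runway.
print(months_to_exhaust(0.60, 0.02, inherited_load=0.15))  # 5.0 months
```

A geo-spatial model refines the inherited-load term by estimating how much of the decommissioned site’s traffic each surviving neighbor will actually attract.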

Domain-Specific Insights

Be it wireless or hybrid fiber-coaxial networks, even an accurate capacity forecast is incomplete without the required domain-specific insights. Without a proper root cause analysis for a network element exhausting soon (in terms of capacity), the network planners won’t be able to make the right decision about its proactive mitigation.

These are some questions to consider when developing your network augment action plan:

  • How many customers will be impacted when a given network element hits a capacity exhaustion threshold?
  • Will prioritizing the given candidate for capacity augment above other options result in the best customer experience improvement and maximized ROI?
  • What is the reason for this capacity exhaust? Is it because of seasonality, periodicity or cyclicity? Is it an anomaly due to some one-off event?
  • Will new Capex be required to address the capacity bottleneck, or are there alternatives to new spending?

Some of the insights that could be useful for planners leveraging predictive analytics for capacity planning and management are shown below:

[Figures: predictive analytics insights for capacity planning]

Apart from the above two key differentiators, some other important aspects for a pragmatic, accurate and reliable predictive analytics solution are scalability and flexibility.

Multi-variate forecast models need to run thousands of simulations across the network to identify the correct correlated metrics for accurate predictions.  Such models need to be configurable, flexible and easy-to-understand for non-data scientists.

Are Traditional Data Warehouse Challenges Affecting Your Business?

With data emerging as the new currency for businesses, data warehousing demands a new approach to its challenges. As Gartner puts it, poor data warehousing practices “undermine the organization’s digital initiatives, weaken their competitive standing and sow customer distrust.” Telecom operators are among the most affected by data warehousing challenges, as they handle billions of customer records generated from multiple sources such as files and probes (SS7, SIP, SIGTRAN), as well as massive volumes of data streamed from social media platforms.

As the volume, velocity, variety, and veracity of the data generated continue to grow, the traditional data warehouse approach flounders in managing and analyzing it. With conventional data warehouse analytics offerings, answering even seemingly simple questions such as “Who are my ten best customers?” can take 5-10 days. Even after the team figures out the right criteria, compiling and analyzing the data can again be time-consuming. As the questions grow complex, the burden only grows further.
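The irony is that once the data is consolidated, the question itself is almost trivial. A hypothetical sketch (assuming “best” means highest total billed revenue, with invented sample records):

```python
import pandas as pd

# Hypothetical consolidated billing records: one row per rated event.
billing = pd.DataFrame({
    "customer_id": ["A", "B", "A", "C", "B", "D"],
    "revenue":     [120.0, 75.5, 90.0, 210.0, 60.0, 35.0],
})

# "Who are my ten best customers?" as a three-line aggregation.
top_ten = (billing.groupby("customer_id")["revenue"]
                  .sum()
                  .nlargest(10))
print(top_ten)
```

The 5-10 days are spent not on this aggregation but on locating, extracting, and reconciling the underlying records across silos, which is precisely the cost a modern platform removes.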

In-memory databases have alleviated the problem to some extent by providing better performance, but their rigid data models make data analytics workloads increasingly compute-bound. It is also well understood that traditional data warehouse approaches have become arduous due to IT dependency and upfront data modeling. As a result, the time to value grows longer, and the outcome materializes only when the business can start to use the reports and insights the data warehouse provides. Since this approach is rigid, it may also call for data modeling changes, which further delay execution.

Working with data has always carried the inherent risk that false or incomplete data leads to uninformed or even misinformed decisions. Telcos have a winning edge as custodians of the largest repository of customer data in the world, so businesses can no longer ignore these challenges as new business opportunities emerge around data usage. Consider the sheer size of the data generated in a typical business day: an operator serving 80 million mobile subscribers generates around 20 billion detail records (xCDRs) daily.

Worldwide, several telcos have identified opportunities around data monetization, both internal and external. The growth of the Internet of Things (IoT), artificial intelligence (AI) and machine learning (ML) has largely contributed to this upswing, and telcos’ growing engagement with content providers, IoT companies, VAS companies and others proves they are striking it right. At this juncture, it becomes crucial for telecom companies to revise their data strategy around modern big data analytics tools.

Working with poor-quality data can also damage the existing business, especially where customer value is concerned. Customer preferences are evolving, so ensuring customer satisfaction and loyalty mostly depends on how quickly you address their issues. Big data and real-time analytics gain relevance in this context.

Hence, a robust big data platform is no longer a luxury but a business imperative! Stay tuned to learn more about Subex’s approach to handling the complexities of a traditional DWH and data quality issues.

For more information on how Subex is helping telcos address gaps in their analytics approach through an end-to-end framework, attend the webinar we are hosting with Telecoms.com, entitled Bridging the Analytics ‘Trust Gap’ Within Telcos.

Register yourself here: https://bit.ly/2NzFXJF

The Trifecta Effect for Telco Analytics – Anomaly Detection

Subex recently participated in the Monetising Big Data in Telecoms World Summit 2018 in Singapore, where we demonstrated our 25+ years of expertise in the telecom domain handling data at massive scale. We did this by presenting on the topic: The Trifecta Effect for Telco Analytics – Anomaly Detection.

To take a step back, Subex has partnered with 250+ telcos across 100 countries. We have been handling big data and understand the business of telcos working in different contexts, demographics and geographies, and at different stages of their growth. We have been leveraging analytics for 10+ years in the assurance portfolio, and have made a foray into analytics across all domains in the telecom sector. While starting on this journey, we surveyed many operators, and what came out was eye-opening, albeit not entirely surprising given our experience.

We saw that while the telecom domain today is at the forefront of innovation, the industry is a relative laggard in adopting analytics compared with other industries. Around 60-70% of executives still lack relevant data for decision making. To top it all off, where analytics is being used, around 60% of organizations are still not very confident about their analytics insights.

Based on our market research and multiple interviews, we believe the reasons for these problems can be classified into the following categories:

  1. Poor ROI
  2. Multiple and Complex Dashboards
  3. Long Development Cycles
  4. Data quality issues
  5. Short Supply of Data Scientists
  6. Lack of Agility

To cater to these problems, we at Subex have designed a solution, ROC Insights, that stands on three pillars which we call the Trifecta of Analytics: Agility, Cost & Consumption. Subex does this by:

  • Delivering insights in less than 8 weeks
  • Following a pure OPEX model which takes care of the cost
  • Most importantly, simplifying the consumption of analytics insights tremendously through the use of storyboards

At the Monetising Big Data in Telecoms World Summit 2018, we explained how the Trifecta can be leveraged, using the simple example of anomaly analytics. For a telco with massive volumes of data, detecting anomalies is harder than finding a needle in a haystack: with the haystack we at least know we are looking for a needle, whereas with anomalies we don’t even know what we are looking for. Yet it is extremely important for a telco to manage anomalies, both to mitigate risk and to avoid missing out on opportunities.

There is a very fine line between an anomaly and an outlier; indeed the distinction is fuzzy, as the two sets intersect but neither is a subset of the other. Take, for instance, a queen bee in a beehive: she is an outlier but not an anomaly. Anomalies usually remain undetected. They are unknown problems with unknown solutions. Our work detects, curates and qualifies these problems to move them from the unknown-unknown realm to the known-known realm, breaking them into parts, analyzing them with various ML/DL algorithms, and performing causal analysis to find the root factors.

We presented two examples of working with telcos to solve their anomaly problems using advanced analytics. In the first case, our anomaly detection solution required modification because the client was based in East Africa, with pockets of high population density amid vast open areas. We developed anomaly detection for cell sites using algorithms such as time series models, manifold learning, and LSTMs, covering KPIs such as call duration, data usage and customer latching, further qualified along dimensions such as 2G/3G/4G and on-net/off-net. Anomaly analysis generally suffers from the problem of ‘too many’ anomalies and the difficulty of prioritization. Our solution attaches revenue numbers to each anomaly and prioritizes them; most importantly, it also looks at the long-term business impact on high-value customers, churn and the like, and considers whether nearby cell sites performed load balancing and whether customers were really impacted.
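The production models above are far richer (LSTMs, manifold learning), but the core idea of flagging a KPI that departs from its own expected behavior can be sketched with a simple rolling z-score. The window size, threshold, and synthetic KPI values below are arbitrary choices for illustration:

```python
import pandas as pd

def flag_anomalies(kpi: pd.Series, window: int = 24, z_threshold: float = 3.0):
    """Flag points that deviate strongly from the recent rolling baseline."""
    mean = kpi.rolling(window).mean()
    std = kpi.rolling(window).std()
    z = (kpi - mean) / std
    return kpi[z.abs() > z_threshold]

# Hourly call-duration KPI for one cell site, with an abrupt drop at the end.
series = pd.Series([50 + (i % 5) for i in range(48)] + [12])
print(flag_anomalies(series))  # flags the final point
```

Qualifying each flagged point, attaching a revenue figure, and checking for load balancing by neighboring sites is what turns a raw statistical alert into the prioritized, actionable anomaly described above.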

Our second use case concerned a client based in North America who was having difficulty detecting lost handsets in time, which was causing huge monetary losses. Using algorithms such as probabilistic graphical models and Markov chains, we created an algorithm to detect whether a handset had been stolen. Our algorithm improved the client’s detection performance ninefold, and in 80% of cases the incidents were detected within 24 hours, thereby improving the customer’s bottom line.
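A toy version of the Markov-chain idea: learn transition probabilities between usage events from normal behavior, then flag sequences whose likelihood under that model is improbably low. The event alphabet and training sequences below are fabricated for illustration; the real models were far more sophisticated.

```python
import math
from collections import defaultdict

def train_transitions(sequences):
    """Estimate first-order Markov transition probabilities from event sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def avg_log_likelihood(seq, probs, floor=1e-6):
    """Average per-transition log-likelihood; very low values warrant review."""
    lls = [math.log(probs.get(a, {}).get(b, floor)) for a, b in zip(seq, seq[1:])]
    return sum(lls) / len(lls)

# Toy event alphabet: C=call, D=data, S=SMS, R=roaming, I=IMEI change.
normal_usage = [list("CDCDSCDC"), list("CDSCDCDS"), list("DCDCSDCD")]
probs = train_transitions(normal_usage)

print(avg_log_likelihood(list("CDCDSC"), probs))  # close to learned behavior
print(avg_log_likelihood(list("IRRRIR"), probs))  # improbable: flag the handset
```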

All in all, the presentation was a good opportunity for attendees to understand how Subex works with big data and how ROC Insights can help telcos by pinpointing very specific problems. We enjoyed presenting at the event and meeting telcos from across the APAC region. Hope to see you there next year!

AfricaCom 2017: A Subexian View

It is well known that AfricaCom is one of the most popular and significant events for telecom operators and vendors alike in one of the world’s most fast-paced emerging regions: Africa. The annual event draws top telecom executives from across the world, and Subex attends every year. I had the privilege of attending the last edition of AfricaCom, and from what I’m told, 2017 was one of AfricaCom’s busiest years, a sentiment shared by executives of operators and vendors and even by the Cape Town locals.

While I happened to attend some of the sessions at the newly opened Innovation Stage, my sense was that most of the buzz was happening at the exhibition area, where Subex too was an exhibitor among some of the biggest names in the telecom space. It was clear that Digital Transformation was the central theme of AfricaCom 2017, and was a key discussion point among both vendors and operators.

The theme of Digital Transformation and Digitalisation has been a cornerstone for Subex over the last few years, and in keeping with the concept, Subex showcased its Analytics and Consulting & Managed Services offerings to the delegates besides its renowned flagship products. From my discussions with executives from the top CSPs, it is clear that analytics is as popular as ever, and the demand for true analytical services for the telecom domain is hotter than ever.

The need to leverage data was also a hot discussion point with industry analysts. As per my discussion with a renowned analyst from Ovum, leveraging analytics in the African market is definitely an area of interest.

Overall, judging from the excitement about things to come, it is clear that the African market is increasingly embracing the journey ahead and preparing itself for the challenges on the horizon. It will be exciting to see how the market evolves over the course of the next year, and this thought makes the next edition of AfricaCom an exciting one.

I hope to come back to experience the next wave of transformation the industry undertakes, and to feel again the wind of change I experienced at the Cape of Good Hope!

The holistic approach to securing IoT Ecosystems

With the convergence of the physical and digital worlds, the IoT ecosystem is becoming more pervasive and smart. Given its “connected” nature, securing the IoT ecosystem has taken precedence and has today become a necessity. It is no longer a question of “IF” IoT networks will get hacked, but “WHEN.” Organizations should be concerned with what to do “WHEN” the ecosystem is compromised.

This blog discusses the various risks the IoT boom poses, and the approaches organizations need to adopt to safeguard themselves.

Read More

Note: Published with permission from Liveworx

As a digital service provider, your adoption of Internet of Things (IoT) services presents opportunities as well as challenges. The upside: IoT opens new revenue streams by providing always-connected services to digital subscribers. The downside: IoT exposes subscribers to identity theft and security breaches. Subex provides holistic cyber security to ensure that enterprises are safe from unauthorized access and intrusion across their network. Subex Secure offers comprehensive IoT security coverage, from real-time discovery and monitoring to response and recovery. Our solution leverages a one-of-a-kind honeypot network that combines physical devices and device emulations to generate IoT/ICS signatures. Our system evaluates global identity and device breaches and updates the Subex Secure signature repository to safeguard your enterprise from new and emerging IoT threats.
