Would a Product Upgrade Alone Solve the Problem?

During one of my recent business travels, I met the Revenue Assurance & Fraud Management heads of a few telcos. One meeting in particular struck me – let me call him Jack, the AVP of FM. Jack has been using a Fraud Management System (FMS) for close to four years and has certain business challenges to address. He was exploring the option of upgrading the FMS to tackle these challenges and needs.

Jack’s primary challenge was to build a business case justifying the FM upgrade and showing the elusive RoI. His team has apparently faced challenges in detecting fraud, and he believed that an upgrade would help address fraud in the new generation of services (which the current system is not capable of handling) and thus contribute to the RoI.

On probing further, it was clear that he has been under tremendous pressure from the higher-ups to showcase RoI, especially in the recent past due to tough macro-economic conditions. Some more questioning and discussions with his team members revealed that there have been (and are) many hurdles in performing their tasks:

a) IT issues pertaining to system availability, performance, processing and tuning

b) Knowledge issues in fine-tuning the rules and thresholds periodically

c) People issues in understanding the domain and carrying out effective and smart investigations

While the operator is on a drive to introduce Next Generation services, more than 88% of the revenue still comes from traditional services – Voice, SMS/MMS, Roaming, Interconnect and GPRS. The top frauds also happen in these areas – http://www.cfca.org/fraudlosssurvey/

It was a revelation for Jack when the data was put up for discussion. It was also evident that an upgrade alone was not going to solve his problem. The need of the hour was an overhaul of the entire ecosystem (to address the 88%), along with the upgrade (to address the remaining 12%).

During the course of the discussion, I suggested a few best practices based on prior experience from Managed Services engagements:

a) Conduct an assessment to baseline the performance of the current function, including a SWOT analysis, and detail a roadmap for growth

b) Based on the assessment, build a business case justifying skills/efficiency improvements and the required technological upgrades

c) As a next step, strengthen the foundation of fraud prevention by improving people and processes, leveraging best practices and experience from vendors and partners where needed

d) Once the basics are addressed, mature to the next level by incorporating technology, process, procedure and skill upgrades

I also quoted one such Subex Managed Services engagement in India, where the operator was on an older version of the Fraud Management System when the engagement started, saw more than 2x RoI within a year, and then upgraded to the latest version. This helped them in the following ways:

a) Optimized resources as a first step and improved fraud operations through a skilled workforce, leveraging existing technology and automating repeatable tasks

b) This resulted in significant financial savings, lowered operational and workforce risks, improved knowledge and enhanced business agility

c) The highly scalable model for future growth also meant they were able to choose specific fraud-related services and technological upgrades depending on their strategic objectives and business priorities

Jack, being a positive person, appreciated the new perspective on his challenges and is looking forward to a detailed assessment to build a case for strengthening the FM team.

After this incident, I wonder: are there more Jacks out there with the right intentions but not necessarily armed with the right tools?

Data Discrepancies Don’t Matter

Now, referring to the title, you may be thinking: that’s a rather cheeky thing to say, given the high direct and indirect costs of errant data incurred by virtually all operators. You might cite the significant Opex penalty related to reworking designs and to service activation fallout. I get that. What about the millions of USD in stranded Capex most operators have in their networks? Check. My personal favorite comes from Larry English, a leading expert on information quality, who has ranked poor-quality information as the second biggest threat to mankind after global warming. And here I was worried about a looming global economic collapse!

My point is actually that the discrepancies themselves have no business value. They are simply an indicator of things gone bad – the canary in the coal mine. These “things” are likely some combination of people, processes and system transactions, of course. Yet many operators make finding and reporting discrepancies the primary focus of their data quality efforts. Let’s face it, anyone with modest Excel skills can bash two data sets together with MATCH and VLOOKUP functions and bask in the glow of everything that doesn’t line up. Sound familiar?
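The kind of naive two-way comparison described above can be sketched in a few lines of Python; the system names and records here are purely illustrative:

```python
# Naive reconciliation: flag everything that doesn't line up between the
# network view and the back-office inventory view (illustrative data).
network = {"PORT-001", "PORT-002", "OC48-A"}
inventory = {"PORT-001", "oc48-a", "PORT-003"}

in_network_only = sorted(network - inventory)    # candidates for unrecorded assets
in_inventory_only = sorted(inventory - network)  # candidates for stranded records

print(in_network_only)    # ['OC48-A', 'PORT-002']
print(in_inventory_only)  # ['PORT-003', 'oc48-a']
```

Note that "OC48-A" versus "oc48-a" gets flagged even though it is only a naming-convention mismatch – exactly the kind of noise that makes a raw discrepancy count, on its own, a poor measure of data quality.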

For context, I am mostly referring to mismatches between the network and how the network is represented in back-office systems like Inventory—but the observations I will share can be applied to other domains.   Data anomalies, for example, are all too common when attempting to align subscriber orders and billing records in the Revenue Assurance domain.

Too often, Data Integrity Management (DIM) programs start with gusto and end with a fizzle, placed on a shelf so that shinier (and easier!) objects can be chased.  Why is this?  Understanding that I am now on the spot to answer my own rhetorical question, let me give it a go.

  • The scourge of false positives: There are few things as frustrating as chasing one’s tail.  Yet that is the feeling when you find that a high percentage of your “discrepancies” are not material discrepancies (i.e. an object in the Network but not in Inventory) but simply mismatches in naming conventions.   A DIM solution must profile and normalize the data that are compared so as not to spew out a lot of noise.
  • The allure of objects in the mirror that are closer than they appear: OK, not sure this aphorism works, but I trust you to hang with me. I am referring to misplaced priorities – paying attention to one (closer, easier) set of discrepancies while ignoring another set that might yield a bigger business impact once corrected. Data quality issues must be prioritized, with priorities established based upon clear and measurable KPI targets. If you wish to move the needle on service activation fallout rates, for example, you need to understand the underlying root causes and be deliberate about going after those for correction. Clearly, you should not place as much value on finding “stranded” common equipment cards as on recovering high-value optics that can be provisioned for new services.
  • The tyranny of haphazard correction: I’m alluding here to the process and discipline of DIM.  Filtered and prioritized discrepancies should be wrapped with workflow and case management in a repeatable and efficient manner.  The goals are to reduce the cost and time related to correction of data quality issues.  If data cleanse activities are unstructured and not monitored by rigorous reporting, the business targets for your DIM program are unlikely to be met.
  • The failure to toot one’s own horn: Let’s say that your data integrity efforts have met with some success.  Do you have precise measurements of that success?  What is the value of recovered assets?  How many hours have been saved in reduced truck rolls related to on-demand audits?  Have order cycle times improved?  By how much?   Ideally, can you show how your DIM program has improved metrics that appear on the enterprise scorecard?   It is critical that the business stakeholders and the executive team have visibility to the value returned by the DIM program.  Not only does this enable continued funding but it could set the stage for “self-funding” using a portion of the cost savings.
  • The bane of “one and done”:  For a DIM program to succeed in the long run, I suggest drawing from forensic science and tracing bad data to underlying pathologies… i.e. people, process and/or system breakdowns.   A formal data governance program that harnesses analytics to spotlight these breakdowns and foster preventive measures is highly recommended. The true power of DIM is in prevention of future data issues so that the current efforts to cleanse data will not simply be erased by the passage of time.
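To make the first of these points concrete, here is a minimal sketch of profiling away naming-convention noise before comparing, so that only material discrepancies survive. The normalization rules are illustrative assumptions, not any real DIM product’s logic:

```python
def normalize(name: str) -> str:
    """Collapse common naming-convention differences (illustrative rules):
    trim whitespace, upper-case, and unify separator characters."""
    return name.strip().upper().replace("_", "-")

network = ["oc48_a ", "PORT-002"]
inventory = ["OC48-A", "port_002"]

net = {normalize(n) for n in network}
inv = {normalize(n) for n in inventory}

# Only mismatches that survive normalization deserve a case/workflow.
material = net.symmetric_difference(inv)
print(material)  # set() - the raw mismatches were pure naming noise
```

A naive comparison of these same lists would report four “discrepancies”; after normalization there are none, which is the difference between a noisy report and an actionable one.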

Identifying data discrepancies is a good first step.  Correcting and preventing them is even better.    Institutionalizing DIM via continuously measuring and reporting your successes… well, you get the idea.

Could Analytics Pose an Existential Threat to Large Tier Operators?

As we dive deeper into the world of analytics, more and more information and intelligence is being made available to operators, analysts, and other interested parties. But along this same progression of “seeming” advancement in the domain, there is also a growing, critical threat to large tier operators: Access.

Access itself is a big word. In the context above it simply means “access to analytics intelligence”. But the other meanings are where the problem lies for larger tier operators. When you think of “access” in terms of these areas, the problem becomes clearer:

1. Access to where your data is located – where is it? All of it?

2. Access to the group(s) who control that data – can you get to those people? Do you even know who they are?

3. Access to the data itself – assuming you know where it is and who owns it, can you even get to it?

As history proves again and again, these questions become increasingly harder to answer as the size of the operator grows. Comparing a larger and a smaller operator, here’s what is all too often the case:

Where is the data? In smaller operators, data tends to be located in singular systems. Put more simply, they don’t have 4 billing systems for retail customers, 3 inventory platforms, and 6 order entry instances; they tend to have closer to one of each type. Obviously, the younger the operator, the more advantages they have as well, simply because the infrastructure and architecture are simpler. Larger tier operators, however, are often quite the opposite. In addition to having multiple systems that are redundant or duplicated, they often have customers scattered across these instances with no particular rhyme or reason, as successive system consolidation events have spread any given customer’s account in unusual directions. One of the biggest challenges in this regard is getting to a true “single view” of an account. Some of the largest carriers in the world have tried to get there – they remain unsuccessful, and ironically, they often don’t even realize it themselves.

Who controls the data? So let’s say you know where your data is. Can you get to it? Do you need a budget available to *pay* the internal group to get you your own data? But most critically: when you show the data owners what you need and why you need it, can you successfully escape the attempt by those owners to create the analytics for you? Many of these shops that own data also have lighter-weight analytics capabilities. They would love for you to engage them to build something for you. They would love for you to pay their department for their efforts. They would love for you to not use an outside expert, as this is often *perceived* as a direct threat to them (this is NOT the case, but they still respond like it is). They would love for you to educate them on what you want and how to do it. But most importantly, they’ll sometimes make getting your data so difficult and expensive that it becomes more financially beneficial to just do it their way. This never gets you the result you expect, on time, or anywhere close to budget.

Contrast this with a smaller operator. The smaller the operator, the more collaboration takes place. That’s not a judgement…it’s just the unfortunate truth (for the large tiers). Of all the operators I’ve worked on project deliveries with over the years, this has held true each and every time. When a finance guy needs a data dump from the accounting platform, he doesn’t make a phone call to a different region, through a path of 3-10 people, to get to the right person. What he does do is walk down the hall, around the corner, into the IT guy’s office, sit down for a few minutes of “small talk”, ask for the data at the end of those few minutes, and usually have it by the next day. Also – smaller operators don’t believe they can do it all. They already wear 10 hats during the day, and welcome an expert team in analytics to help drive immediate value back into their business, almost from the day they walk in the door.

The simple fact of the matter is this: Analytics is going to be a crucial differentiating point for operators’ speed (agility) of response to business and market changes. And when you cut to the chase, data access will make this practice virtually impossible for large tier, legacy operators to fully leverage when compared to the smaller organizations. This will cripple many large operators in some key, lucrative segments of their markets. How can we be so sure? It’s already happening…

What are the Current Concerns and Challenges of a Wholesale Operator?

Last week I spoke at the GSC conference, where an august group of carriers convene to discuss the problems they face and best practices within their organizations. Apart from being in Windsor, one of the loveliest places in the UK, and setting aside the fact that we had a great treat from a British multinational operator – a meal at the Guards Museum, Westminster – what was discussed?

Two interconnected topics (no pun intended) topped the agenda: GIPX, and credit control and management. Here carriers debated whether and how to give smaller players access to their networks and how to limit debt liability. A British multinational telecommunications services company spoke about the pre-pay component in their billing and how it manages debt, predicts traffic and spend, and successfully allows (untested) new carriers access. As their provider, it was great to hear how pleased they are and what practical use the solution gives them.

We had a very interesting talk from the director of the Institute of Credit Management, who explained his organization’s mission to educate on the issues of credit and risk and to stress the importance of positive cash flow. Again, this had carriers discussing issues around cash collection and customer management. Opinion was divided – some leaning over backwards in pursuit of the relationship, others just pulling the plug when debt became unacceptable. This just reiterates how different this domain is: in retail billing, if you don’t pay, you’re gone (well, there are exceptions in cases of hardship). This is about carrier relationships, and the bilateral sessions – where carriers sit and negotiate with each other face to face – stress the complex and often personal nature of inter-carrier relationships.

The meeting also featured a great speech from a Belgian operator discussing fraud prevention and how the problems of bypass and PBX hacking remain ever constant.

What did I learn?

New technologies offer new opportunities – not really a massive learning point – but more automation, alerting and deeper MIS give carriers the capability to avail themselves of these new opportunities, and as providers of software to them, we need to make sure their days are as easy as possible.

Are you losing 90% of interconnect revenue to Bypass Fraud?

SIM Box Fraud, Landing Fraud, Grey Routing, VoIP Bypass – whatever you may want to call it, Bypass Fraud continues to hurt operators across the globe. According to the CFCA, operators lost more than 2.88 billion USD to bypass fraud in 2011! Imagine: operators could have sold 7 million iPhones and 3.5 million iPads with that much money.
In the recent past we heard of two news stories involving bypass fraud, in the Philippines and Ghana:

A huge bypass racket involving three Taiwanese nationals and a Filipino was unearthed at a condo in Makati City, Manila, Philippines. Losses for operators ran up to millions of pesos.

Meanwhile in Ghana, where operators have lost millions of dollars to bypass fraud, the regulatory body has made it clear that if it discovers any illegally acquired SIM cards being used for SIM box fraud, the respective telecom operators will be held responsible and made to pay for it.
The infographic below highlights the unique approach used by Subex to not only detect bypass fraud but also identify the root cause and prevent it.

To download the PDF, click the link: Bypass Fraud Infographic

Co-Authors – Ravish P, Nithin G & Ashwini S.

Creative – Shafi

Sources:   http://www.ghanaweb.com/GhanaHomePage/NewsArchive/artikel.php?ID=224287






