
Telecom


Customer Journey Mapping can provide great insights into how customers see a company: insights that can lead to an improved customer experience, trust and loyalty.

In the first blog in this series on customer analytics, the technique of Customer Journey Mapping (CJM) was discussed as a way to follow how customers move from one touchpoint to the next, and to track their emotional well-being during those interactions. In the last blog I described how using a persona to represent a group of customers allows marketing to get a better understanding of them. In this blog I will explore how Customer Journey Maps can be created for a persona to visualize an idealized journey for the group it represents. This is now a well-accepted technique not only for improving user experience in software design, but also in the design of products, digital and conventional marketing channels, architecture and many other areas.

There are two basic approaches to creating a persona. One is to base the persona on in-depth research into the customers within a market segment; the other is to base it on intuition, sometimes referred to as a provisional persona. In reality, it makes most sense to use a combination of research and intuition, and then verify the persona with those who have front-line contact with customers. Generally, customers belonging to a company’s biggest market segment are targeted first, and a primary persona is created to represent them. If the team creating the persona does not have direct knowledge of the customers in that segment, it will need to conduct research to understand the values and motivations of the group.

Once a persona has been defined, it’s possible to look at how the company would engage that persona in a sale, the hope being that the persona would follow through each engagement at every touchpoint, even long after they’ve made the purchase and are using the product. The framework for this is known as the Customer Lifecycle. There are many versions of it, but they all share some basic stages, as described by Jim Sterne and Matt Cutler in a paper called “E-Metrics: Business Metrics for the New Economy”:

  • Reach: Trying to get the attention of the people we want to reach.
  • Acquisition: Attracting and bringing the reached person into the influence sphere of our organization. 
  • Conversion: When the people we reach or have a more established relationship with, decide to buy something from us.
  • Retention: Trying to keep the customers and trying to sell them more (cross-selling, up-selling).
  • Loyalty: We would like the customer to become more than a customer: a loyal partner and even a ‘brand advocate’.

This can be represented either horizontally or in a circular lifecycle-type chart.

The Customer Life Cycle – Source: E-Metrics Business Metrics for the New Economy by Jim Sterne and Matt Cutler

The persona journey describes how it’s anticipated that a particular persona would move through the lifecycle. It describes the channels through which they are expected to become aware of a product, how they are expected to research the product, and what would motivate them to decide to buy. Key points in the journey where customers decide whether to continue or abandon the process are known as ‘Moments of Truth’, a term coined by Jan Carlzon, the well-known CEO of SAS Airlines who turned the company around in just a couple of years.

Walking in the customers’ shoes in this way is not easy, and it would normally be done as a workshop with representatives from across an organisation, but it’s an exercise that can provide many useful insights. Service quality gaps, cross-channel alignment, and ways to better engage customers and align internal teams are just a few of the many benefits that come from journey mapping. When idealised journey maps are compared with the actual journeys that customers take, many preconceived ideas about how customers see and engage with the company may get thrown out, and fresh ways to engage, retain and acquire customers may be discovered.

In the next part of this customer analytics series I will look at the security implications of big data and advanced customer profiling, and how regulators around the world are trying to protect an individual’s right to be treated equally by large corporations.


Signalling-level risks, especially fraudulent access from connected SS7 networks, are one area making a lot of noise in the assurance and security functions of telecom organizations today.
The focus on the matter is such that most industry conferences covering current and next-generation threats feature a great deal of material on this topic, presented and shared by operators and vendors alike.

What is it?
Signalling-level risks generally refer to SS7 (2G/3G) and Diameter (4G) vulnerabilities (inherent or configuration-based) which expose operators to hacks and frauds through signalling control commands, especially in roaming and interconnect scenarios. The scenario becomes riskier considering that a normally configured SS7 infrastructure is accessible to any other operator in the world, either directly or through a certain number of hops.
Now, just consider a situation where a rogue operator exists, or a group of hackers with malicious intent has gained access to the SS7 signalling of any less-secure operator in the world.
The losses due to signalling risks, while still quite speculative, are expected to run into billions every year. Artificial inflation of traffic (especially A2P and P2A SMS), spamming, spoofing, refiling, profile modification, unlawful tracking, unethical disruptive activities from competitors and so on are examples of risks which have been found to exist NOW, with an estimated 100% infection rate.

Why is it happening?
SS7-based signalling vulnerabilities have existed for a very long time, but have become part of news headlines recently due to revelations made by well-known ethical hackers at high-profile security conferences.
Some industry pundits make the point, with which most of my industry connections agree, that these risks exist mostly because operators tend to create unreliable partnerships and configure unregulated access (such as open GT access, or acceptance of any signalling command), which enables malicious parties to connect to operators’ networks and conduct fraudulent activities very easily.
There have also been discussions about services exploiting these signalling-level vulnerabilities being offered in grey markets by rogue hacking communities for a price.

Can you eradicate these risks?
Ideal solution: operators need to sanitize their access configuration on SS7. Rethink, re-identify, re-evaluate and reconfigure the access levels.
But this is really difficult, or maybe nearly impossible, to achieve due to some practical issues on the ground, such as:

  • Most SS7 networks were configured a long time ago. Operators now face an expertise gap with respect to SS7, which limits their ability to reconfigure these networks.
  • It is a time-consuming activity which would also require a lot of effort to re-test connectivity with all partners, attracting a lot of investment.
  • It may require reconfiguration of the signalling setup at the network level and, in certain instances, network downtime – a complete NO-NO for a lot of players out there. The situation becomes even more problematic in countries where telecom networks are considered national infrastructure.
  • Lastly, not every operator will take up this activity, for many different reasons – from operators not participating in the awareness meetings and conferences being organized around the world, to some rogue operators deliberately engaging in malicious activities.

The problem becomes much trickier given that even one infected, insecure or rogue operator in the world will continue to pose a threat to everyone else. And sanitizing every operator against these threats is a feat very unlikely to be achieved.

It is now unanimously accepted that SS7-based networks are here to stay (at least 10 years in developed markets and 20-25 in developing or less-developed countries), and so are their vulnerabilities, which are expected to grow substantially considering the limelight they have received recently.

A bigger problem, which has started giving sleepless nights to the fraud and security functions of operators moving towards 4G and setting up their networks over the Diameter protocol (which provides the 4G signalling framework), is that Diameter does not have native security standards built in; it requires security mechanisms to be implemented on top, a practice always found susceptible to gaps. Also, since the access methods are similar to SS7, 4G networks are exposed to similar signalling risks.

What can be done now?
For now, an approach based on detection would be ideal until the industry identifies a way to plug these vulnerabilities around the world, which is definitely a few years and many research hours away.
An approach of detecting malicious signalling requests in your network still has a few complexities to manage:

  • High false positive rates – many signalling requests that appear malicious turn out to be configuration issues on the partner side. Hence, domain expertise is essential to find the needle in the haystack.
  • Sheer size of the signalling data to be analyzed – big data support is required.
  • Skill set – this activity will require upskilling and may be difficult for traditional teams like fraud and risk management to absorb. Even teams like security, with less focus on fraud domain know-how, are expected to find it difficult to add this activity to their set of responsibilities.
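The detection approach above can be sketched as a simple rule-based screen. The following is an illustration only, not a real implementation: the message format (origin global title, operation name), the partner GT whitelist and the list of sensitive operations are all invented for the sketch; a production system would work on full signalling traffic at big data scale.

```python
# Toy sketch of rule-based screening of signalling requests.
# GT prefixes, field names and opcodes below are hypothetical examples.

# GTs of operators with an active roaming/interconnect agreement (invented)
PARTNER_GT_PREFIXES = {"4477", "4915", "9198"}

# Operations often cited as abused for tracking or profile modification
SENSITIVE_OPCODES = {"anyTimeInterrogation", "updateLocation", "insertSubscriberData"}

def screen_request(msg: dict) -> str:
    """Classify a signalling request as 'allow', 'review' or 'block'."""
    origin_known = any(msg["origin_gt"].startswith(p) for p in PARTNER_GT_PREFIXES)
    if not origin_known:
        # Unknown origin plus a sensitive operation looks malicious, but it
        # could equally be a partner's misconfiguration -- hence the high
        # false positive rates noted above.
        return "block" if msg["opcode"] in SENSITIVE_OPCODES else "review"
    return "allow"

requests = [
    {"origin_gt": "447712345678", "opcode": "sendRoutingInfo"},
    {"origin_gt": "8613800000000", "opcode": "anyTimeInterrogation"},
    {"origin_gt": "8613800000000", "opcode": "mtForwardSM"},
]
for r in requests:
    print(r["origin_gt"], r["opcode"], "->", screen_request(r))
```

Even in this toy form, the verdicts show why domain expertise matters: many requests flagged for blocking or review will turn out to be partner configuration issues rather than attacks.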

I feel that industry partnerships with vendors possessing the domain knowledge, the right skill set and technology built on a big data platform are the way to go.

These partnerships, considering that no one has a complete answer to this rampant problem of signalling vulnerabilities as of now, need to be built on solid vendor capabilities, while remaining liberal and experimental enough to give room for exploration.


As one of the largest capital line items in every telecom operator’s budget, network capex continues to drive large numbers every year, since network augmentation and migration to newer technologies are unavoidable budget items. Today operators are spending huge sums of money on new network infrastructure for advanced telecom services like LTE/4G, IPX, etc. without adequate visibility into revenue growth. A recent survey points out that 20% of assets fail to return their cost of capital and that 5-15% of these network assets are ‘stranded’.

Hence, effective capital expenditure and network asset lifecycle management are rapidly becoming a big boardroom issue for telecom operators. This is only possible when all functions work together to maximize the returns on their investments. Both the CFO’s and the CTO’s teams in a telecom operator should have a holistic and collaborative view of network asset investment.

The urgent need is for a strategic approach to asset assurance that manages and substantially reduces network capex. ROC Asset Assurance is different from ERP services because of its workflow and analytics elements. It can initiate workflows to ensure that all applicable assets are procured and deployed when needed. ROC Asset Assurance helps the CFO and CTO functions within an operator tackle the following pain points:

  • Planning of capital spend vs budget
  • Tracking deployed assets and ROI on those assets
  • What to buy, when to buy, where to buy, and for what reason?
  • Maintaining accurate information about deployed assets
  • Ensuring all available assets are used at maximum efficiency
  • Monitoring network resource capacity and responding to demand
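As a toy illustration of the utilisation and ROI pain points above, a simple calculation can flag under-utilised assets of the kind the survey calls ‘stranded’. All asset records, costs and the utilisation threshold below are invented for illustration.

```python
# Hypothetical sketch: flag under-utilised ("stranded") network assets.
# Asset names, capex figures and the 20% threshold are illustrative only.

assets = [
    {"site": "BTS-001", "capex": 120_000, "capacity_used_pct": 78},
    {"site": "BTS-002", "capex": 95_000,  "capacity_used_pct": 12},
    {"site": "RTR-017", "capex": 250_000, "capacity_used_pct": 55},
    {"site": "BTS-003", "capex": 110_000, "capacity_used_pct": 8},
]

STRANDED_THRESHOLD_PCT = 20  # below this utilisation an asset is flagged

stranded = [a for a in assets if a["capacity_used_pct"] < STRANDED_THRESHOLD_PCT]
stranded_capex = sum(a["capex"] for a in stranded)
total_capex = sum(a["capex"] for a in assets)

print(f"{len(stranded)} of {len(assets)} assets stranded "
      f"({stranded_capex / total_capex:.0%} of capex)")
```

A report like this, fed by workflow and analytics rather than a spreadsheet, is the kind of visibility the CFO and CTO teams need to share.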

To learn how an effective asset assurance program can give operators complete confidence that their network will grow to meet market demands while also guaranteeing optimal value for every dollar of capital budget spent, download our newsletter, Asset Assurance – Preserving Capex While Improving Network Efficiency, featuring research from leading analyst firm Gartner.


Sitting listening to one of the speakers whilst attending the ‘M2M: Beyond Connectivity’ seminar hosted by European Communications this week, for a moment I was reminded of the classic football chant – “Who ate all the pies?” The general consensus is that the chant originated with Sheffield United supporters at the end of the 19th century and was aimed at their 140kg goalkeeper. Not sure he would be playing in the top flight nowadays!

So what was it that led me to draw a parallel between an overweight football player and M2M?  In the presentation by Ansgar Schlautmann from Arthur D. Little, an interesting and revealing view of the holistic value chain for smart solutions was given.

The individual players need not, however, be concerned by the revenue splits shown; not even the CSPs, who see their connectivity portion sitting somewhere between 15-20%. With the value of the M2M market projected to reach approximately $86 billion[i] by 2017, even this slice of the pie should be enough to get their teeth into.

The analogies associated with balanced nutrition and how the industry will ensure that each participant will get its own healthy share can be drawn in a number of ways, so let’s consider them.

Use by

If we are honest, it’s a good thing that the M2M industry has come to us with a long shelf life. It has taken a number of years even to get to this point, where we see on the horizon a potential explosion of new product and service offerings being crafted for the market.

There will always be people who want to be early adopters and who will drive niche, well-crafted packaged services for their market, but surely one of the keys here is to just start thinking M2M in all areas of our lives. We need to start using it, to start realising the potential it can bring to our daily routines, where in a sense decisions are made for us. This isn’t at a Skynet level – albeit the goal there was speed, efficiency and the reduction of human error, which M2M can also bring us. To reap the benefits of M2M, we need to build it and we need to use it. When by? The sooner the better.

Best before

Even though we can get excited by cars that will arrange their own services, highway maintenance done on the basis of vehicular data about potholes, smart cities where routine maintenance programmes are scrapped in favour of self-servicing provisioning and so on, the reality is that we are a little way from reaching this best-before point. I say best before not in the sense that a product or service will expire in its usefulness at some point in the future; I mean it with regard to when offerings will come to us as consumers that encapsulate all of the richness M2M could potentially bring. It is getting closer, and advancements are being made in all sectors of the industry, but I don’t see us reaching that best-before point until we see wholehearted adoption within the verticals themselves.

Ingredients

Having the right ingredients – energy, protein, carbohydrates and so on – in the right quantity, quality and packaging is in many cases a key factor when we hit the supermarket or the online grocery sites. As consumers in this M2M arena, will we be so different? We want well-packaged, well-balanced, quality products and services. As participants in the M2M value chain, the balance will come from the blend of services that can be offered, the partnerships that are built, and being able to craft simple yet wholesome solutions to suit the consumer flavour of the day.


What makes a Telco settle for less? Is it the budget, fear of survival in a highly competitive market or knowledge and skill issues?

Bangladesh recently opened up its interconnect landscape, with 21 new ICX and 22 new IGW licensees entering the fray. This has, without doubt, fragmented the entire interconnect market and put significant pressure both on the incumbents (who must fight emerging competition) and on the new telcos (who need a viable business case to break even and eventually survive).

At a recently concluded Interconnect conference in Bangladesh hosted by Subex, I had the opportunity to interact with a wide range of attendees – CXOs, consultants and IT personnel. While it was acknowledged that a billing system is important to start operations and convert “usage” to “cash”, it was evident from the discussions that price was the driving factor in choosing a billing platform, and to that end operators were scouting for “low-cost” billing systems or looking at developing one in-house. Cash flow was a severe issue, as the roll-out, along with statutory payments to the regulator for obtaining and maintaining the licence, was proving very costly. Hence cash flow and total cost of ownership (TCO) were the main factors driving the search for a low-cost billing option. The new telcos, in addition, were exposed to the following risks:

  • Survival: Accurate and prompt billing will be a significant differentiating factor in a highly competitive market
  • Agility: As competition intensifies, they need the flexibility to adapt to market needs and offer innovative products and services in the shortest possible time
  • Revenue leakage: Bangladesh as a market is highly prone to fraud, and as per the regulator more than 10% of revenue is lost to illegal bypass

So how are the telcos going to remain agile and competitive, and eventually break even in the shortest possible time? Do they have to settle for less and allow the forces of the market to dictate their future?

Subex unveiled a cloud offering for interconnect operators at the roadshow. It gives the best of both worlds – low cost and superior technology – and helps telcos be ready for the future. The cloud model mitigates the business risks and provides the following benefits to ICX and IGW operators:

  • Low cost of deployment and operations: A TCO less than 30% of a licensed/in-house model, completely managed by experts, with a CapEx requirement 50-60% lower
  • Low commitment: Volume-based pricing to fit business needs and scale as the business grows
  • Minimal risk: Easy sign-on to and sign-off from the cloud model
  • Ready to launch: Pre-configured application requiring minimal customization (about 20%) to suit telco-specific needs
  • Minimal domain knowledge: With the service provider completely managing the core activities, the knowledge and resource requirements on the telco are minimal. Telcos are already constrained for resources, which can instead be used to grow the business
  • Flexibility: The cloud model can very easily extend into revenue assurance, fraud management and analytics in the future
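A back-of-the-envelope sketch can make the cost comparison implied by the figures above concrete (cloud TCO under 30% of a licensed model, CapEx 50-60% lower). The licensed-model baseline amounts below are hypothetical, chosen purely to illustrate the arithmetic.

```python
# Back-of-the-envelope TCO comparison using the percentages quoted above.
# The licensed-model baseline costs are hypothetical illustrations.

licensed_tco = 1_000_000          # assumed multi-year TCO of a licensed/in-house model
cloud_tco = licensed_tco * 0.30   # cloud TCO quoted at under 30% of licensed

licensed_capex = 600_000                   # assumed CapEx portion of the licensed model
cloud_capex = licensed_capex * (1 - 0.55)  # CapEx quoted as 50-60% lower (midpoint used)

print(f"Cloud TCO:   {cloud_tco:,.0f} (saving {licensed_tco - cloud_tco:,.0f})")
print(f"Cloud CapEx: {cloud_capex:,.0f} (saving {licensed_capex - cloud_capex:,.0f})")
```

For a cash-constrained new licensee, it is the CapEx line that matters most: the cloud model shifts the remaining cost into volume-based operating expense that scales with traffic.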

Naturally, the excitement was evident when such an option was presented to the telco representatives. The only objection was around security (which was expected), and security is very easily addressed with the right technology and stringent processes. A “best-in-class” solution is available on a “best-in-class” service model which suits the pocket and addresses the business risks. It was time for the operators to go back and “relook” at their business case.

After all, when they have a great option, why do they have to settle for less?
