Channel: Analytics – Virtusa Official Blog

It pays to modernize your data architecture


In today’s world, where data is collected at every interaction (over the phone, on mobile, on a PC, through sensors, with or without our knowledge), it becomes important to have a strategy around data. Traditionally, data has been seen as something to “run the business,” but in today’s context it can actually “be the business” if monetized…


The post It pays to modernize your data architecture appeared first on Virtusa Official Blog.





How Technology is changing the future of banking in India


In the present era of global, interconnected financial systems, a small ripple in one corner of the world can create a shock wave across the entire economic ecosystem. This has led to increased scrutiny and more stringent, intricate regulatory requirements for financial institutions. Regulators have not only shortened reporting timelines but have also doubled the amount of information they seek from banks about their customers. Risk, financial data and technology are now hot topics of discussion, and banks are looking for viable solutions that can help them sail through this expedition of compliance. To become more agile and remain relevant, traditional banks in India are exploring their technological options, with particular focus on insights into customer behaviour.

 

Data analytics helping banks in regulatory compliance

The world’s top investment banks have been fined close to $43 billion over the past few years for failing to adhere to compliance rules in areas such as customer reporting, making this the single most expensive compliance issue. Regulators continue to put pressure on financial services firms by adding constant reporting obligations and more regulations, such as KYC, Basel III and Solvency II, to maintain data compliance.

Attempts have been made to deploy big data tools such as Hadoop to scale analytics to the size and complexity of the data and to regulatory compliance demands, but using analytics has been a painful endeavour for many banks. Banks still lag other industries in deploying new analytics techniques, which is why they face many data challenges today: data overload, inadequate data management systems, increasing customer demand for information and increased regulatory reporting.

As banks continue to become more diverse and complex, technology radically changes the speed of operations and data production. To ensure compliance, modern banks need to apply big data processing and reporting tools that ingest and process data from both new and legacy sources. These platforms should be sophisticated enough to process all types of structured and unstructured information, such as internal documentation, voicemails, customer correspondence and transactions. They should also provide rich process modelling capability that can detect patterns based on pre-defined regulatory reports to quickly identify risks.

Establishing this type of big data analytics platform allows banks to reduce the complexity of their processes and improve the speed of their analytical cycles. Banks not only lower the cost of data processing but can also discover new insights by identifying and managing risks more proactively. By harnessing big data analytics for both compliance and improvements in core operations, banks can capture spend efficiencies across their business lines and seek improvements in areas such as customer and fraud analytics.

 

Transforming regulatory compliance through artificial intelligence

New innovations in data analytics empower banks with systems that are smart enough to automatically refine their algorithms and improve their results over time. We are not talking about the old-school approach to data analysis: spreadsheets, data tables and crunching numbers on a calculator. We are talking about the new transformative approach, artificial intelligence (AI).

Advances in automation and data-led intelligence have put sophisticated AI technologies within the reach of traditional institutions, because the modern AI platform can stand on the shoulders of the data and process automation technologies that precede it. AI is a collection of technologies, such as Natural Language Processing (NLP) and machine learning, that banks are now applying to further automate the processing of information and to better interpret and contextualize it. It has the potential to substantially overhaul the whole compliance process operating at a bank.

NLP is well suited to processing financial documents to extract metadata, identify entities and understand the intent or purpose of the documents. NLP can be used to identify types of products, such as loans or swaps, and correlate them with a regulatory topic such as anti-money laundering, insider trading or other abuse. When combined with robotics, AI can hugely simplify these processes and reduce the chance of human error. Because it continuously self-improves, the technology stands a good chance of managing complicated and time-consuming data updates better than its human users ever could.
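As a deliberately simplified illustration of this document-triage idea, the sketch below tags a document with a product type and a regulatory topic using plain keyword matching. The keyword lists and function names are invented for illustration; a production system would use real NLP models (entity recognition, intent classification) rather than string lookups.

```python
# Toy document triage: tag financial documents with a product type and a
# regulatory topic via keyword matching. Keyword lists are illustrative.

PRODUCT_KEYWORDS = {
    "loan": ["loan", "mortgage", "credit facility"],
    "swap": ["swap", "interest rate derivative"],
}

TOPIC_KEYWORDS = {
    "anti-money laundering": ["wire transfer", "shell company", "cash deposit"],
    "insider trading": ["material non-public", "trading window", "blackout period"],
}

def classify(text):
    """Return (product tags, regulatory topic tags) found in a document."""
    lowered = text.lower()
    products = [p for p, kws in PRODUCT_KEYWORDS.items()
                if any(k in lowered for k in kws)]
    topics = [t for t, kws in TOPIC_KEYWORDS.items()
              if any(k in lowered for k in kws)]
    return products, topics

doc = "Customer requested a wire transfer to repay the mortgage loan."
print(classify(doc))  # → (['loan'], ['anti-money laundering'])
```

A real pipeline would replace the keyword tables with trained models, but the output shape (document correlated to product and regulatory topic) is the same.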

This undoubtedly means that banks are better equipped to deal with the demands of regulators in a manner that just wouldn’t have been possible via any basic analytics tool or human intervention. Most AI systems are not at that stage just yet, but the potential for transformation is huge.

 

Improving customer experience through big data

Banks have access to more consumer data than most other businesses. With frequent use of web and mobile banking channels, the volume and variety of data that banks hold about their customers has steadily increased, driving an increase in the number of customer interactions. Holding detailed customer profiles, information on spending and income, and a clear view of where people spend their time, banks are in a unique position to paint a clear picture of each of their customers.

Big data offers banks an opportunity to differentiate themselves from the competition. Using advanced big data techniques to collect, process and analyse information, banks can provide better personalization and more relevant information to customers across all areas of retail banking, ultimately making them more customer-centric. By using customer data effectively, banks can deliver more targeted and cost-effective marketing campaigns and design products and offers that are specifically tailored to customer needs. Combining data sets in creative ways can surprise and delight customers, leading to retention, loyalty and a higher lifetime value.

Big data can also play an important role in customer retention by minimizing churn. Loyalty has become a top issue with the millennial generation. It costs banks significantly more to acquire new customers than to retain existing ones, and far more again to re-acquire defected customers.

All this shows that deploying these technologies may be a tedious journey for banks now, but it will deliver great results in due course. Capturing these opportunities will require investment, painstaking planning, and coordinated decision-making spanning the whole bank. Automation is rewriting the rules of how banks compete. Banks that fail to grasp this risk damaging franchises built over generations. But if they manage to address these multiple strategic challenges, they can position their institutions to compete effectively and capture an emerging, long-term growth trajectory.

This article was originally published in Business Standard.

Sreekanth Lapala

Senior Vice President – Outsourcing Transformation Services, Virtusa. Sreekanth has over 17 years of industry experience in information technology, focused on executing large-scale transformation programs and growing delivery centers. In his current role, he has the dual responsibility of leading Virtusa’s largest account while also heading the Outsourcing Transformation Services group. Prior to joining Virtusa, Sreekanth had a very successful stint at a product company, where he was instrumental in building the world’s first J2EE 1.3 server certified by Sun Microsystems. Sreekanth is a prolific speaker and has been frequently quoted in the media. He is a graduate of BITS Pilani.



IBM Information on Demand 2012: A Look into the Future of Information Management


IBM’s Information on Demand (IOD) 2012 conference held in October at the Mandalay Bay in Las Vegas was like a theme park for information management. It’s one of the best forums for creating and experiencing the buzz among business partners, industry activities and customers. At this event, there were just as many new questions created as ones that were answered.

The event included three days of keynotes and nearly 700 breakout sessions which shared huge amounts of concentrated and comprehensive insights into Big Blue’s latest innovations, strategies and technologies. A multi-day pass to the event is barely enough to absorb the overwhelming amount of information and opportunities available. The daily sessions focused on the “how” and the “what” of solutions delivered by service-oriented architecture (SOA), data virtualization, cloud offerings, analytics, visualization and information management.

Not surprisingly, the strongest pillar at this year’s IBM IOD was big data. Everywhere we turned there was someone discussing or explaining the topic. Over the last few years there has been too much buzz and “scare” around businesses being “out of control” in the face of an insurmountable, unalterable tsunami of data. In one of his keynote sessions at IOD, Steve Mills, IBM’s SVP of Software, accurately stated “Big data was always there – it was summarized and archived. What’s new is the ability to take action on the data.”
Jason Silva opened his session with a very polished presentation on digital connectivity and how it relates to big data. Robert Leblanc continued the theme but got more specific about how companies can use big data to deliver real value, while Deepak Advani offered even more specific examples, highlighting interviews with customers who had real-world implementations of IBM products and services, including InfoSphere Streams.

New offering announcements included IBM’s Hadoop-based InfoSphere BigInsights, with built-in social media analytics accelerators to help marketers develop applications for customer acquisition and retention, perform customer segmentation and campaign optimization, and streamline lead generation. Additionally, the newly added Data Explorer feature enables advanced data federation capabilities from IBM’s Vivisimo acquisition. The software automatically discovers and navigates available data wherever it resides to reveal themes, visualize relationships, identify the value of data and establish the context of data usage.

On the Cloud and Virtualization front, a new offering included cloud analytics for line of business and SMBs — cloud-hosted applications to deliver predictive analytics specifically aimed at the financial services, retail, and education industries.

Another new solution, called Analytic Answers, offers SMBs predictive analytics as a service. Customers will have access to tools, pre-built models, and expertise to help them develop actionable insights from their data stores.

So what does this all mean? To put it into perspective, we’re watching data traffic grow exponentially right before our eyes.  But it isn’t about the data; it’s about the path to wisdom that it provides. As organizations are swimming in an expanding sea of data that is either too voluminous or too unstructured to be managed and analyzed through traditional means, IOD certainly helped its hungry or rather confused customers catch a glimpse of its big data solutions and optimistic customer testimonials.

The IOD 2012 conference was an enriching experience where we got to spend time interacting with other customers and IBM experts. Hands-on labs and deep-dive discussions, which allow for a better understanding of product sets, are not something you’d get from a webinar or presentation material. Additionally, given the breadth of the client base it attracted, the event served as an excellent networking platform.

With one’s brain nearing capacity by the end of the event, IBM’s IOD 2012 certainly left us a lot to think about. We can’t help but wonder, are we just mandated to think Big or is it truly our Big Future?


Big Data – Push Aside the Hype, the Value Lies in Action


As eCommerce (eAnything, for that matter), Web 2.0 and several other buzzwords have come and gone, one of today’s biggest tech/business buzzwords is “Big Data.” While the term itself continues to gain momentum in the media (see the Google Trends graphs below), the technology has not caught up with the hype, and the hype is fading to the point that Gartner has placed it in the “trough of disillusionment.”


Google Trends view of “Web 2.0”


Google Trends view of “Big Data”

Even though the buzz is fading, companies’ attention needs to focus on the business value derived from analytics and Big Data. So, what does this mean for the enterprise considering a deployment of Big Data technologies such as NoSQL and Hadoop?

  1. Is it “Big?” – First, is your data and analytics project “big?” – In many ways, Big Data is really a misnomer. Big implies a large quantity of data, and while managing large quantities of Big Data is certainly a use case of many Big Data solutions, it really is only one. Data can be more than big; it’s fast, constantly changing and noisy. Technologies branded as Big Data can help manage these areas as well – from real time processing to fast storage and retrieval technologies; these technologies can help the enterprise adapt to “changing data.”
  2. Which Processes will Improve? – What are the areas of opportunity for additional data-driven insights in your enterprise? Both business and IT leaders need to take a hard look at the analytics initiatives they are spearheading. Will the additional insight coming from these new systems truly make an impact on existing business processes and transform them? Or, will they generate additional reports in managers’ inboxes that will go unread?
  3. What Return will the Project Drive? – As with any other major IT initiative, Big Data analytics projects should be tied to a tangible business outcome. Ensure the investment in both IT and in employees merits a positive financial return. Analytics for analytics sake benefits no one.
  4. “Think Globally; Act Locally”– Big Data initiatives can get big very quickly. As with any other analytics initiative, “eating the elephant” and building a robust Big Data platform with thousands of Hadoop nodes isn’t necessarily the right approach. Enterprises looking to make a play in the Big Data analytics space should look for quick wins and prioritize those plays which will deliver the most bang for the buck. Big Data analytics is a journey and no journey is successful if you cannot take a successful first step.
  5. Find Internal Champions and Build Buzz. Once you start gaining success with your Big Data initiatives, make sure you leverage those people whose processes you’ve influenced. Hopefully, their day-to-day jobs will be simpler, faster, more informed; fulfilling the prophecies of “Big Data.” Make sure you capture their testimonials as you look to move onto bigger opportunities as internal references are usually the most credible.

These five steps may not look too unfamiliar to most who have been around the analytics space for more than a few years. “Big Data” technologies do promise and have often delivered some revolutionary benefits to many firms who have deployed them. If firms can create simple, clear plans to implement the unique attributes of Big Data technologies, they will be well-equipped to deliver the benefits the current hype suggests.

Kevin English

Associate Director, Virtusa. Kevin English is an Associate Director in the Enterprise Information Management practice at Virtusa. Kevin has over 13 years of experience in the areas of analytics, business process outsourcing and strategy consulting. At Virtusa, he is responsible for creating, packaging, selling and evangelizing industry specific offerings in the areas of analytics and Big Data that deliver business outcomes to clients. Additionally, Kevin holds 4 patents in the area of text analytics. While not at work, Kevin enjoys spending time with his wife and 2 young daughters.



Improving customer experience and the need for customer data management in the utilities sector


In the UK, the water industry operates on a geographic basis: if you live in a water company’s supply area, that company bills you for the supply of fresh water and the treatment of waste water. In the electricity and gas industry the retail market is more open and non-geographic, so you can have a billing relationship with any of a wide range of energy suppliers even though your local network is owned by only one.

This is one of the reasons why the water regulator Ofwat imposed its Service Incentive Mechanism on water companies, where they are assessed on a number of customer satisfaction metrics, ranked and rated accordingly, with lower ranked suppliers mandated to improve their customer service rather than raising their pricing. In a non-competitive market the regulator, who has consumers’ interests at heart, has ensured that water companies have a strong incentive to improve customer service.

In the gas and electricity markets with open competition there was a drive to attract new customers and retain existing customers through a large number of pricing tariffs that were changed regularly. This made it hard for consumers to compare the offers from various suppliers. The regulator Ofgem, as part of its recent Retail Market Review, has mandated that energy suppliers can only offer four tariffs. In such a competitive market, being unable to differentiate as much from a sales standpoint has focused attention onto service delivery, and on improving customer experience as a way to attract and retain customers.

Water, gas and electricity companies are now all eagerly looking at ways to improve customer experience, and there are valuable lessons to be learnt from other retail markets that have already tackled this issue. Being able to offer a good customer experience is predicated on knowing your customer, having a 360 degree view of them, understanding all of their interactions with you, and even understanding how best to communicate with them. This in turn needs a customer data management system where the master record for a customer is held, and which can be enriched over time by every interaction you have with that customer.

While many suppliers have implemented large enterprise resource planning systems, the ability to address customer data management as above using these existing systems is limited, as they are often slow and hard to change.

There are also other factors to consider which are occurring at an ever increasing rate. There is a millennial generation of customers now who expect companies to respond to them immediately, who want the ability to self-serve, who interact and recommend or criticize through social media channels. This is a generation that has been heavily marketed to by brands, and are very brand conscious.

In addition there is the advent of smart metering, where suppliers are going to be measuring consumption on a granular basis, to help optimise supply and transmission networks, educate consumers on their consumption, and to provide other benefits.

All of the above suggests the need for a customer data management platform that can build rich customer data records over time, so that suppliers understand how to engage with their customers and what is important to them, and can ensure that other business IT systems leverage this data. Every interaction a customer has with the supplier (whether through a printed letter, the contact centre, a service call-out and so on) then appears seamless and consistent, and the consumer really feels that the supplier understands who they are and best serves their needs.

For the Millennial generation, this is even more important, as they tend to group together on brands, building consensus through social media. A supplier that really understands every aspect of each of their customers is going to offer a better customer experience than one that doesn’t, and attract new customers from that generation.

Drilling down to the technology level, suppliers may have several customer data records for the same unique customer, from previous billing relationships with them, from customers moving from address to address where the current ERP systems are not updating a golden master but creating a new customer, or from different databases they use to market to customers and solicit feedback from them. This is where an MDM platform fills the need, providing that set of golden master data that all downstream customer relevant business IT systems can draw on.
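The golden-master idea can be sketched as follows. This is a minimal illustration, not a real MDM product: it matches records on a normalised name plus date of birth and applies a last-update-wins survivorship rule, where commercial platforms would use fuzzy matching and configurable survivorship policies. All field names and data are invented.

```python
# Collapse duplicate customer records (e.g. from house moves or separate
# marketing databases) into one golden master record per customer.

def normalise(record):
    """Build a simple match key; real MDM systems use fuzzy matching."""
    return (record["name"].strip().lower(), record["dob"])

def build_golden_records(records):
    masters = {}
    # Process oldest first so later updates win (simple survivorship rule).
    for rec in sorted(records, key=lambda r: r["updated"]):
        key = normalise(rec)
        master = masters.setdefault(key, {})
        master.update(rec)
    return list(masters.values())

records = [
    {"name": "Jane Smith", "dob": "1985-02-10", "address": "1 Old Rd", "updated": 1},
    {"name": "jane smith ", "dob": "1985-02-10", "address": "2 New Ave", "updated": 2},
]
golden = build_golden_records(records)
print(len(golden), golden[0]["address"])  # → 1 2 New Ave
```

Downstream billing, contact-centre and marketing systems would then read from this single record rather than their own copies.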

Bringing together and making sense of all of the consumption data from smart metering is best addressed by a big data platform, and just as importantly the ability to exploit that data by finding the right “needle in a haystack” through analytics is required. This feeds into the Customer Data Management platform, further enriching relevant customer data records.

The days are long gone when consumers of water, gas and electricity had to live with what their service provider wanted to offer. In this ever-changing world of aggressive competition, demanding consumers, rapid technology advances and stringent regulatory requirements, the utility companies that succeed will be the ones that adapt quickly to changes in consumer needs and behaviour, and more importantly deliver an enhanced customer experience. And an effective customer data management strategy that helps in gaining valuable consumer insights will be the crucial differentiator for a utility company to maintain its market position or improve it!



“Big Data” or “Big Time Security Breach”?


The well-established perception that data security is a combination of people, process and technology holds good for “Big Data,” as well as any other kind of data. In the “Big Data World” data security gets more complex and magnified, but the fundamental issues remain the same. Recent studies reveal that the total average cost of a data breach can be as high as $5.4 million in a given financial year; a significant number. This alone is reason enough for organisations to evolve from traditional data governance mechanisms to models that can encompass Big Data as well.

This poses some big questions for Big Data advocates to ponder, though: is it moral, or legal, to make use of Big Data in such a way that it reveals information about someone, or something, that may not be intended for public consumption? For example, what if someone has an illness, or is carrying out activities that they don’t want others to know about? Enterprises should consider whether this information can lead to unintended consequences and potential data security breaches. Should a retailer be taken by surprise if it is penalised for using customer data it has derived, for purposes unknown to the individual from whom it was collected? Aside from morality, given the financial impact of data breaches, this requires serious thought.

An immediate action item stems from this: classification of “derived data”. Just because the sources you use to derive information are classified as freely available, does this mean that the information you have derived should also be classified as freely available? It is imperative for organisations to understand this and then classify the information from analytics, just as robustly as the information used to derive it.

This brings us to a systemic approach:

  • Understand the “business outcome” that you are looking to influence through Big Data technologies, and categorically list out the value you intend to derive out of their use.
  • Understand the “derived data” that you are looking to use to influence this outcome, classify it for security purposes, understanding that this classification may be different from the sources that are used to derive it.
  • In other words, treat the outcome of Big Data as an information source in its own right and protect it accordingly.
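The classification step above might be sketched as follows, assuming an invented four-level sensitivity scale: a derived data set inherits the strictest classification of its sources, and is escalated one level further when the analytics could reveal something about individuals that the sources alone do not. The labels, levels and escalation rule here are illustrative, not a standard.

```python
# Classify derived data from its sources: sensitivity can only be
# escalated, never relaxed. Scale and labels are invented for illustration.

LEVELS = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

def classify_derived(source_levels, inference_risk=False):
    """Derived data inherits the strictest source classification; if the
    analytics can reveal something sensitive about individuals
    (inference risk), escalate one level further."""
    level = max(LEVELS[s] for s in source_levels)
    if inference_risk:
        level = min(level + 1, max(LEVELS.values()))
    by_level = {v: k for k, v in LEVELS.items()}
    return by_level[level]

# Two freely available sources, but the derived insight reveals personal
# information, so the result must be protected more strictly than either.
print(classify_derived(["public", "public"], inference_risk=True))  # → internal
```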

Now that you have done this, given that issues can easily get out of hand, how do you optimise Data Governance processes to make them more robust? Here are some ideas:

  • Increase the frequency of data monitoring. For example, with the arrival of gadgets such as smart meters, data that was once captured every month can now be captured every 10 – 15 minutes – monthly monitoring is no longer enough!
  • Address data quality issues at similarly increased frequencies.
  • Bring in robust data governance policies and procedures for stricter access controls and privacy restrictions on the resulting data sets.

These actions not only help you comply with regulatory requirements, but can also help prevent security breaches that cost heavily by way of negative publicity, lawsuits and fines.

With Big Data promising big opportunities, it is more important than ever for organisations who intend to monetise it to be extremely cautious and not fall foul of stringent data laws and compliance.

Looking forward, companies need to get a grip on this issue and ensure they are securing data in the correct way – before their Big Data breach becomes the next Big Headline.

The article was originally published on Big Data Republic on December 30, 2013 and is re-posted here by permission.

 


Medicare claims data: 3 Analytics solution ideas for Payers and Providers to enhance customer experience


The billing (claims) data of healthcare providers for the United States Medicare program, one of the most important healthcare programs, had been held private for almost 35 years; it was made available to the public on April 9, 2014.

The data that will be available to the public includes the identifiable healthcare provider information, specialty, procedure and associated cost information. However, data related to the patients will not be available to the public, as patient privacy will be maintained.

Experts and some groups have opposed the release of this data, citing the possibility of privacy intrusion and the potential for patients and payers to misinterpret the data, among other concerns. The implications of this data release require further extensive analysis.

Several stakeholder groups are very excited about this information. The government and its agencies are interested to see the cost efficiencies this program could bring. Insurance firms and private payers would like to leverage the information to benchmark the claims that providers make. Patients and healthcare delivery enablers (employers) would like to glean ‘quality of care’ information from this data.

This data is really a treasure trove to understanding the various dimensions of government spending, provider billing patterns, fraud and potential cost efficiencies. In my opinion, three product ideas based on this data which can be features of a stand-alone product or used in conjunction with any existing product are:

  • Fraud/Improper payment prevention
  • Doctor ratings
  • Driving cost efficiencies and cost reductions

Fraud/Improper payment prevention
According to a US Government Accountability Office report, in 2012 the Medicare program covered more than 49 million elderly and disabled beneficiaries. The cost was $555 billion, and estimated improper payments reached $44 billion. With almost 8% of payments being improper, there is a great opportunity for reduction.

A product which uses advanced prediction/estimation models based on patient behavior, billing cycles, nation-wide provider billing estimates and standard cost estimates to detect improper payments would be of great interest to the government.

Insurance providers would also be very interested in such a product.
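As a hedged sketch of the simplest building block such a detection model might use, the code below flags claims whose billed amount sits far above the average for the same procedure. The z-score threshold and the data are invented; a real product would combine many more signals (billing cycles, patient behaviour, nationwide standard cost estimates).

```python
# Flag claims whose amount is an outlier versus other claims for the
# same procedure. Threshold and sample data are illustrative only.

from statistics import mean, stdev

def flag_outliers(claims, z_threshold=2.0):
    """Group claims by procedure code and flag amounts more than
    z_threshold standard deviations above the procedure mean."""
    by_proc = {}
    for claim in claims:
        by_proc.setdefault(claim["procedure"], []).append(claim)
    flagged = []
    for proc, group in by_proc.items():
        amounts = [c["amount"] for c in group]
        if len(amounts) < 3:
            continue  # not enough data to estimate a baseline
        mu, sigma = mean(amounts), stdev(amounts)
        for c in group:
            if sigma and (c["amount"] - mu) / sigma > z_threshold:
                flagged.append(c["claim_id"])
    return flagged

claims = [
    {"claim_id": i, "procedure": "cataract", "amount": 1000 + i}
    for i in range(10)
] + [{"claim_id": 99, "procedure": "cataract", "amount": 9000}]
print(flag_outliers(claims))  # → [99]
```

In practice the baseline would come from nationwide estimates rather than the batch being screened, so that a systematically overbilling provider cannot hide inside its own average.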

Doctor ratings
With the newly available information on procedures, it will become easy to identify the expertise of providers. For example, if you wanted to get a cataract operation done, you could find out how many such operations your surgeon performed last year. Research shows that the quality of a procedure is often better when the doctor performs it frequently.

A product or feature which combines these newly available doctor ratings with the cost information would be a very powerful tool for selecting an expert. Several insurance companies and healthcare analytics firms already provide features that list doctors by specialty, ratings gathered from peers, and cost estimates. The newly analyzed data would fit right in.
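A minimal sketch of such a selection feature, with invented data and a deliberately simple ranking rule: procedure volume as a proxy for expertise (per the research noted above), with cost as the tie-breaker.

```python
# Rank providers for a given procedure: higher volume first, and among
# equal volumes, lower cost first. Provider data is illustrative.

def rank_providers(providers, procedure):
    relevant = [p for p in providers if p["procedure"] == procedure]
    return sorted(relevant, key=lambda p: (-p["volume"], p["cost"]))

providers = [
    {"name": "Dr. A", "procedure": "cataract", "volume": 320, "cost": 2400},
    {"name": "Dr. B", "procedure": "cataract", "volume": 45,  "cost": 1800},
    {"name": "Dr. C", "procedure": "cataract", "volume": 320, "cost": 2100},
]
print([p["name"] for p in rank_providers(providers, "cataract")])
# → ['Dr. C', 'Dr. A', 'Dr. B']
```

A production feature would fold in peer ratings and outcome measures alongside volume and cost, but the ranking shape is the same.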

Driving cost efficiencies and cost reductions
With the data on the cost per procedure now clearly available, researchers can look at the money spent on drugs or procedures in different parts of the country. They can check whether that leads to increased quality in care. From a payer point of view, now there is comparable claims information for the provider to compare the insurance coverage part and the Medicare billing cost for the same procedure.

A product which compares the healthcare provider claims estimate for the same procedure from a Medicare perspective and insurance provider perspective would be of great interest to both the government agencies (CMS etc.) as well as the insurance providers. This will help the payers to benchmark costs and drive efficiencies.

In summary, the released Medicare claims data is a very rich piece of data that will make its presence felt in multiple facets of healthcare and its repercussions will be tremendous to say the least.

The article was originally published on Healthcare Payer News on April 24, 2014 and is re-posted here by permission.

The post Medicare claims data: 3 Analytics solution ideas for Payers and Providers to enhance customer experience appeared first on Virtusa Official Blog.

How payments and analytics deliver customer insights and drive loyalty


It's well known that it's more cost effective to sell to existing customers than it is to acquire new ones. If businesses can gain extra insight into how their customers use their services, they can find new ways to tailor and customise those services for individual customers. Electronic payments make interacting with a business a far smoother, quicker and more convenient experience for customers, and tapping into that by implementing electronic payment methods can have a real pay-off. One example of an electronic payments success story comes from Starbucks, which claims that more than 10% of its in-store sales are driven by its mobile wallet app.

The data acquired through electronic payment systems, and the many mobile transactions happening through payment apps, are a gold mine of customer behaviour. A few payment providers are already using this data to analyse spending patterns by location and proactively engage with customers. One has only to look at a company like PayPal, which has over 100 million credit cards on file, to start to understand how much data electronic payment providers hold and how valuable that data might be, given that it can offer insights into the behaviour of such a large number of customers.

Using the data gleaned from payment systems, banks and payment providers can better understand the spending patterns and locations behind consumer behaviour, which, if acted upon, present opportunities to increase customer spend and loyalty at the same time. Here's a scenario for increasing loyalty amongst commuters: a payment app could analyse the data of the various journeys a person takes on the Tube. If that customer were to travel a particular route every day of the week, the app would use this data to work out the best possible cost option: a Weekly Pass, a Monthly Pass, or PAYG Oyster. The payment provider's app could then ping the customer with the most cost-effective option, which could be purchased through the same app. In an example like this everyone wins: the traveller saves money, and his loyalty towards the payment provider greatly increases.
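The commuter scenario can be sketched as a simple cost comparison. The fares below are invented placeholders, not real Oyster or TfL pricing.

```python
# Pick the cheapest ticketing option for a commuter's weekly travel pattern.
def cheapest_option(journeys_per_week, payg_fare, weekly_pass, monthly_pass):
    weeks_per_month = 4
    # Normalise every option to a cost per week for a fair comparison.
    costs = {
        "PAYG": journeys_per_week * payg_fare,
        "Weekly Pass": weekly_pass,
        "Monthly Pass": monthly_pass / weeks_per_month,
    }
    return min(costs, key=costs.get), costs

option, costs = cheapest_option(journeys_per_week=10, payg_fare=2.80,
                                weekly_pass=36.10, monthly_pass=138.70)
```

An app would recompute this as journey history accumulates and notify the customer only when the recommended option changes.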

While using the data from electronic payment systems can boost loyalty, there is no magic formula to it: loyalty will always be about trust and advice. But what banks and payment providers have been able to tap into is the use of data analytics and mobile payment analytics through geospatial applications and innovative partnership models. By using technology to create innovative rewards that save customers money, banks and financial providers can build all-important trust.

The article was originally published on The Digital Banking Club on July 29, 2014 and is re-posted here by permission.

The post How payments and analytics deliver customer insights and drive loyalty appeared first on Virtusa Official Blog.

Enabling data discovery: Big data’s ability to solve bigger problems


Industry leaders are debating the co-existence of big data and traditional business intelligence. One group strongly believes traditional business intelligence will be washed away in the big data tsunami, while another discounts big data as hype and vouches for traditional business intelligence as an organization's bastion.

Big data will continue to be the buzzword for at least a few more decades as the industry goes all out to create interesting solutions. With third-party solution providers fighting for their share of the pie, the Apache incubator will continue grooming tons of new frameworks and tools. New players will emerge, and existing niche players will be consolidated, either through acquisitions or by getting lost in a fierce battle. Big data is here to stay for a long time, growing in stature and adding value to customers.

Defined in simple terms, business intelligence uses reports and dashboards to answer a set of pre-formulated questions. Today, big data solutions are augmenting the data warehouse ecosystem, solving problems that are not otherwise optimally solvable due to the sheer volume, velocity and variety of data, among other factors.

Big data solutions can do much more to help businesses. Their true potential arises with the ability to ask bigger questions by comprehending data beyond the realms of the traditional data warehouse. Data discovery starts with asking the questions that are not asked today, to know the unknown. Voilà!

Big data can catalyze organizations to move up the maturity curve from business intelligence to data discovery. An emerging area in big data is the graph database and allied graph analytics. Graphs are not new; they were documented as early as 1736 in the famous Seven Bridges of Königsberg problem presented by Leonhard Euler, and they have evolved since then. Today, graph databases are a mainstay of the NoSQL arena for cases where the relationships among the data are as important as the data itself. The data is stored in a schema-less manner as nodes and edges, with the ability to crunch humongous datasets in memory at astonishing speeds. Graph analytics provides the ability to solve the most complex of problems in a simple yet efficient manner by applying graph theory.

The methodology of overlaying a subset graph pattern on the dataset to discover matching data patterns makes data discovery an engaging as well as easy activity.

The majority of product recommendation engines for online retail today have products hardwired into their platform configurations. These recommendations are usually pre-calculated weeks ahead based on customers' historic purchases and may be irrelevant today. Graph analytics can provide dynamic recommendations based on real-time or near real-time buying patterns, using pattern-matching algorithms that draw not only on past purchases but also on product-catalog search patterns and social media activity.
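At its simplest, the graph-based approach stores products as nodes and co-purchase counts as weighted edges, then recommends the heaviest neighbours of a shopper's recent items. The sketch below uses plain dictionaries in place of a real graph database, and the baskets are invented for illustration.

```python
from collections import defaultdict

def build_graph(baskets):
    # Nodes are products; edge weights count how often two products
    # appear in the same basket.
    graph = defaultdict(lambda: defaultdict(int))
    for basket in baskets:
        for a in basket:
            for b in basket:
                if a != b:
                    graph[a][b] += 1
    return graph

def recommend(graph, recent_items, top_n=2):
    # Score each neighbour of the shopper's recent items by edge weight.
    scores = defaultdict(int)
    for item in recent_items:
        for neighbour, weight in graph[item].items():
            if neighbour not in recent_items:
                scores[neighbour] += weight
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

baskets = [
    {"phone", "case", "charger"},
    {"phone", "case"},
    {"phone", "case"},
    {"phone", "charger"},
    {"tablet", "case"},
]
graph = build_graph(baskets)
picks = recommend(graph, {"phone"})
```

In a production graph store the same traversal would run as a query over live purchase and search events, so the recommendations shift as buying patterns shift.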

Data discovery's impact on an enterprise's bottom line is immense, and big data solutions can help get there faster. The life sciences and financial industries are early adopters, with solutions helping in drug repurposing and in detecting financial fraud patterns and money laundering.

Virendhar Sivaraman

Architect - Big Data Solution and Technical PM, Virtusa. Virendhar is a seasoned Data Warehouse and Business Intelligence professional with over a decade of experience. Over the years, he has designed and delivered many traditional data warehousing projects. At Virtusa, he has been playing a key role in creating and implementing technology solution including big data. His core competencies include churn analytics, customer profiling & loyalty management solutions and sentiment analysis. He holds a Apache Hadoop Developer Certificate from Cloudera. Virendhar received his Post Graduate Diploma in Management (Marketing) from Loyola Institute of Business Administration, Chennai and his Bachelors in Engineering (Computer Science) from Crescent Engineering College, University of Madras.

The post Enabling data discovery: Big data’s ability to solve bigger problems appeared first on Virtusa Official Blog.

Mission and make up of a data sciences group


The market is abuzz with data trends like big data and data science, and there are more to come. Numerous organizations across the world are trying to find the best possible way to establish an effective data group. However, organizations face various challenges in setting up a best-in-class data group, and those who have somehow managed to set one up face issues sustaining it. Hence, it has become very critical to analyze and understand the factors responsible for making this task challenging.

One of the key reasons for the early demise of such a data group is the inability of data organizations to continuously showcase their real potential to the business. These groups usually have technologists and evangelists who build on existing success and take on more volume and velocity of data, variety of data types—including structured, unstructured, and semi-structured—and provide real-time data processing and analytics capabilities.

The organization keeps the group, and the hype stays on for a while until interest dwindles. Without adoption by the business to sustain growth and interest, such data groups slowly cease to exist.

This article highlights the ways that help with the setup and sustenance of a data organization, focusing on five must-have areas for this group. Going with current trends, this group is referred to as the data sciences group.

The Mission
The mission of a data sciences group spans a variety of priorities—providing protection, value, predictability, accuracy, and easy access.

First, it should provide solutions that help eliminate the vulnerabilities of the business, such as loss of customers, inappropriate transfer of sensitive data, and market threats.

Second, it should offer valuable solutions that keep a constant watch on the multi-channel customer voice and adapt to the pulse of a business's biggest asset, the customer.

Third, it should provide solutions that help businesses make faster and more accurate decisions, using prediction models to eliminate human error and create a data-driven organization with veracity.

It should also offer solutions that help lower a business's total cost of operations by using disruptive technologies and eliminating technical debt.

Finally, it should provide solutions that are easily accessible and available anywhere, everywhere, and at any time.

Group Composition
A data sciences group should be made up of a business analyst; a data analyst; technical resources with knowledge of mobility, visualization/user experience, and big data technology; and a business sponsor.

Every project this group executes must have a sponsor from the business who spends time ensuring that the solution being developed delivers value.

In terms of deliverables, the business analyst must be able to create a product requirements document for the solution, along with a road map, ROI, and benefits to the business. The data analyst must be able to point to the source and target data deliverables and is responsible for the data quality of the solution. The technical resources own the technical solution and architecture. The business sponsor takes responsibility for the deliverable, making the solution operational in the organization. In terms of training, the data sciences group needs to constantly learn and keep pace with technological advances so that the solutions developed are innovative.

Data Sciences Group’s Evangelization
A data sciences group can only be evangelized by the ultimate operational and analytics users of the solutions in question. They are the people who can keep this group from disappearing into oblivion, and they do it by integrating and adopting the newly developed solutions. The most important thing to note is that this group should not be confused with the enterprise data warehouse group.

The article was originally published on Software Magazine on November 26, 2014 and is re-posted here by permission. 

Kumar Ramamurthy

Vice President, Chief Technologist - Enterprise Information Management (EIM) Practice, Virtusa. Kumar has over fifteen years of experience in enterprise data architecture, database related technologies, software platforms and architecture assessments. Kumar is primarily involved in consulting engagements and assessments for Virtusa at existing and new EIM clients. He also has overall responsibility for delivery assurance from the EIM practice at Virtusa. Kumar has proven ability in consulting, selling, driving, delivering large scale EIM development/maintenance projects both in the enterprise and ISV spaces, while ably bridging the technical and business worlds to ensure the delivery of the best, most accurate business solutions to clients. He is particularly knowledgeable in Kimball and Inmon related DW architectures. He is adept at Data Integration, BI, Data Governance, Database Performance tuning, data modeling and MDM areas. Kumar has a Masters in Computer Science from Bharathiar University, Coimbatore, India. When he is not creating EIM solutions, Kumar spends time with his son and enjoys golf at his Arkansas home.

The post Mission and make up of a data sciences group appeared first on Virtusa Official Blog.

How Technology is changing the future of banking in India


In the present era of global, interconnected financial systems, a small ripple in one corner of the world can create a shock wave across the entire economic ecosystem. This has led to increased scrutiny and more stringent, intricate regulatory requisites for financial institutions. Regulators have not only shortened reporting timelines but have also doubled the information they seek from the firms they regulate. Risk, financial data and technology are now hot topics of discussion, and banks are looking for crucial and viable solutions that can help them sail through this expedition of compliance. To become more agile and remain relevant, traditional banks in India are exploring their technological options with more focus on insights into customer behaviour.

 

Data analytics helping banks in regulatory compliance

The world's top investment banks have been fined close to $43 billion over the past few years for failing to adhere to compliance rules in areas such as customer reporting, making this the single most expensive compliance issue. Regulators continue to put more pressure on financial services firms by demanding constant reporting and adding more regulations, such as KYC, Basel III and Solvency II, to maintain data compliance.

Attempts have been made to deploy big data tools such as Hadoop to scale analytics to the size and complexity of regulatory compliance, but using analytics has been a painful endeavour for many banks. Banks still lag other industries in deploying new analytics techniques, which is why they face many data challenges today, such as data overload, inadequate data management systems, increasing customer demand for information and increased regulatory reporting.

As banks continue to become more diverse and complex, technology radically changes the speed of operations and data production. To ensure compliance, modern banks need to apply big data processing and reporting tools to ingest and process data from both new and legacy sources. These platforms should be sophisticated enough to process all types of structured and unstructured information, such as internal documentation, voice-mails, customer correspondence and transactions. There should also be a rich process-modelling capability that can detect patterns based on pre-defined regulatory reports to quickly identify risks.
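One concrete form such pre-defined pattern detection can take is a rule that flags accounts making repeated cash deposits just under a reporting threshold. The threshold, band and minimum count below are invented for illustration and are not an actual AML rule set.

```python
from collections import defaultdict

REPORT_LIMIT = 10_000   # hypothetical cash-reporting threshold
NEAR_LIMIT = 9_000      # "just under the limit" band
MIN_OCCURRENCES = 3     # how many near-limit deposits trigger a flag

def flag_structuring(transactions):
    # Count near-limit cash deposits per account and flag repeat offenders.
    near_limit = defaultdict(int)
    for t in transactions:
        if t["type"] == "cash_deposit" and NEAR_LIMIT <= t["amount"] < REPORT_LIMIT:
            near_limit[t["account"]] += 1
    return {acct for acct, n in near_limit.items() if n >= MIN_OCCURRENCES}

transactions = [
    {"account": "A-1", "type": "cash_deposit", "amount": 9_500},
    {"account": "A-1", "type": "cash_deposit", "amount": 9_800},
    {"account": "A-1", "type": "cash_deposit", "amount": 9_200},
    {"account": "B-2", "type": "cash_deposit", "amount": 12_000},
    {"account": "B-2", "type": "card_payment", "amount": 9_500},
]
flagged = flag_structuring(transactions)
```

Real platforms layer many such rules, plus learned models, over the same ingested transaction stream.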

Establishing this type of big data analytics platform allows banks to reduce the complexity of their processes and improve the speed of their analytical cycles. It allows banks not only to lower the cost of data processing but also to discover new insights by identifying and managing risks more proactively. By harnessing big data analytics for both compliance and improvements in core operations, banks can leverage that spend efficiently across their business lines and seek improvement in areas such as customer and fraud analytics.

 

Transforming regulatory compliance though artificial intelligence

New innovations in data analytics empower banks with systems that are smart enough to automatically refine their algorithms and improve their results over time. We are not talking about the old-school approach to data analysis — spreadsheets, data tables and crunching numbers on a calculator. We are talking about the new transformative force: artificial intelligence (AI).

Advances in automation and data-led intelligence have put sophisticated AI technologies within the reach of traditional institutions. This is because the modern AI platform can essentially stand on the shoulders of the data and process automation technology trends that precede it. AI is a collection of technologies, such as Natural Language Processing (NLP) and machine learning, now being applied across banks to further automate the processing of information and to better interpret and contextualize it. It has the potential to substantially overhaul the whole compliance process operating at a bank.

NLP is well suited to processing financial documents to extract metadata, identify entities and understand the intent or purpose of the documents. NLP can be used to identify the types of products involved, such as loans or swaps, and correlate them to a regulatory topic such as anti-money laundering, insider trading or other abuse. When combined with robotics, AI can hugely simplify processes and reduce the chance of human error. As it continuously self-improves, the technology stands to manage complicated, time-consuming data updates better than its human users ever could.
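A toy version of that product-to-topic mapping is shown below. A real system would use a trained NLP model for entity and intent extraction; the keyword table here is invented purely to illustrate the idea.

```python
# Map product mentions found in a document to a regulatory topic.
PRODUCT_TOPICS = {
    "loan": "credit risk reporting",
    "swap": "derivatives / trade reporting",
    "wire transfer": "anti-money laundering",
}

def classify(document):
    # Naive keyword spotting stands in for real entity extraction.
    text = document.lower()
    return {product: topic
            for product, topic in PRODUCT_TOPICS.items()
            if product in text}

doc = "Client requested a wire transfer to settle an interest-rate swap."
matches = classify(doc)
```

Matched topics would then route the document into the relevant compliance workflow.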

This undoubtedly means that banks are better equipped to deal with the demands of regulators in a manner that just wouldn’t have been possible via any basic analytics tool or human intervention. Most AI systems are not at that stage just yet, but the potential for transformation is huge.

 

Improving customer experience through Big data 

Banks have access to more consumer data than most other businesses. With frequent use of web and mobile banking channels, the volume and variety of data that banks hold about their customers has steadily increased, driving an increase in the number of customer interactions. Holding detailed customer profiles, information on spending and income, and a clear view of where people spend their time, banks are in a unique position to paint a clear picture of each of their customers.

Big data offers banks an opportunity to differentiate themselves from the competition. Using advanced big data techniques to collect, process and analyse information, banks can provide better personalization and more relevant information to customers across all areas of retail banking. This can ultimately make banks more customer-centric. By using customer data effectively, banks can deliver more targeted and cost-effective marketing campaigns and design products and offers that are specifically tailored to customer needs. Combining data sets in creative ways can surprise and delight customers, leading to retention, loyalty and a higher lifetime value.

Big data can also play an important role in customer retention by minimizing churn. Loyalty has become a top issue with the millennial generation. It costs banks significantly more to acquire new customers than to retain existing ones, and it costs far more still to re-acquire defected customers.

All this shows that deploying these technologies may be a tedious journey for banks now, but one that will render great results in due course. Capturing these opportunities will require investment, painstaking planning, and coordinated decision-making spanning the whole bank. Automation is rewriting the rules of how banks compete. Banks that fail to grasp this risk damaging franchises built over generations. But if they manage to address these multiple strategic challenges, they can position their institutions to compete effectively and capture an emerging, long-term growth trajectory.

This article was originally published in Business Standard.

Sreekanth Lapala

Senior Vice President – Outsourcing Transformation Services, Virtusa. Sreekanth has over 17 years of industry experience with a focus on information technology, including immense experience executing large-scale transformation programs and growing delivery centers. In his current role, he has the dual responsibility of leading Virtusa's largest account while also heading the Outsourcing Transformation Services group. Prior to joining Virtusa, Sreekanth had a very successful stint at a product company and was instrumental in building the world's first J2EE 1.3 server certified by Sun Microsystems. Sreekanth is a prolific speaker and has been frequently quoted in the media. Sreekanth is a graduate of BITS Pilani.

The post How Technology is changing the future of banking in India appeared first on Virtusa Official Blog.

Blockchain and Analytics: Unlocking New Billing Opportunities for Communication Service Providers


In my twenty-odd years in the telecommunication and media industry, the last five or so have seen challenges mount for communication service providers (CSPs). Major players have struggled to monetize the massive amount of data flowing through their networks. Consumer demands grow continually, even as rapidly expanding digital media promises to democratize content distribution. At the same time, over-the-top (OTT) service providers seem poised to cement their dominance in core services like voice and messaging that have traditionally been a CSP's forte.

Given this backdrop, CSPs are on the lookout for new service opportunities that can offer competitive advantage while generating new sources of revenue. For instance, billing, once restricted to the purely functional process of calculating usage and generating invoices, can now enhance customer loyalty and attract new business prospects. We find a majority of telcos are keen to change or replace their revenue management systems to drive more value.

For telecom business support systems (BSS), billing is one of the most critical active systems in the life cycle of a customer. One bad experience can push a customer to switch to a competitor; hence the need to handle it expertly. As CSPs move toward delivering data-heavy services to their customers, they require high-end, flexible, and agile billing systems that can generate accurate invoices, facilitate real-time revenue assurance, enhance customer experience, and drive up their chances of generating revenue. Moreover, to excel at core BSS operations, they must ensure the constant availability of up-to-date information that can help accurately track a customer's activity and better provision new subscription services.

Blockchain combined with analytics and machine learning can transform billing for CSPs.

 

Transforming billing with Blockchain

With emerging technologies like Blockchain, CSPs can gain a strong foothold in these areas. Blockchain opens new avenues for CSPs to create a repository of recognized identities for every customer and their corresponding billing information. For customers, this provides unparalleled convenience. They no longer need to repeatedly provide personal information to service providers while setting up new accounts or opting for new services. These virtual identities also lower the chances of identity fraud and billing inaccuracy, a key billing challenge.

  • Billing inaccuracy can be resolved through public-private key cryptography: a virtual identity is created for each subscriber and, once encrypted, stored as a smart contract. These smart contracts can automatically authenticate and authorize a billing settlement from the customer's end on a real-time basis.
  • Blockchain-based virtual subscriber identities curb roaming fraud as well. Permissioned blockchain solutions let visited public mobile network (VPMN) operators set up micro-contracts with network users; enabling roaming requires permission from both the host network provider and the VPMN. The contracts can continually feed updates from the individual user to the subscribed network operator, allowing for immediate billing and invoicing.
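The shape of the virtual-identity scheme can be sketched with symmetric primitives from the Python standard library. A real deployment would use public-key signatures verified on-chain by the smart contract; the HMAC construction, names and secret below are illustrative assumptions only.

```python
import hashlib
import hmac

PROVIDER_SECRET = b"demo-secret"  # placeholder; real schemes use key pairs

def virtual_identity(name: str, account: str) -> str:
    # Derive a stable pseudonymous subscriber ID from personal details.
    payload = f"{name}|{account}".encode()
    return hmac.new(PROVIDER_SECRET, payload, hashlib.sha256).hexdigest()

def sign_bill(identity: str, amount_cents: int) -> str:
    # Sign a billing settlement against the virtual identity.
    msg = f"{identity}:{amount_cents}".encode()
    return hmac.new(PROVIDER_SECRET, msg, hashlib.sha256).hexdigest()

def verify_bill(identity: str, amount_cents: int, signature: str) -> bool:
    # Constant-time comparison guards against timing attacks.
    return hmac.compare_digest(sign_bill(identity, amount_cents), signature)

vid = virtual_identity("Asha Rao", "ACC-1001")   # hypothetical subscriber
sig = sign_bill(vid, 4_999)
```

The point of the sketch is the flow: a stable pseudonymous identity stands in for repeatedly shared personal data, and every billing settlement is verifiable against it.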

Adding analytics and machine learning to the mix

The ubiquity of, and ready access to, digital content has fuelled the demand for flexible subscription and billing plans. For instance, a customer may want access to premium content for a limited period and ask the provider to charge the additional fee directly to a credit card for immediate usage. Customers may also want to temporarily raise their download speed to a premium-service level to watch a high-definition movie without changing their monthly subscription plan. To address and monetize these small but important opportunities, CSPs need their billing to be flexible and accommodating. It is here that analytics and machine learning can help.

Transactional data from billing systems contains a wealth of information on customer usage behaviour. CSPs can implement advanced analytics and machine learning models to unlock a range of real-time subscriber intelligence from this data lake. Which applications or services are subscribers using? Are they encountering network downtime? When are they most active? These insights give CSPs an opportunity to offer more targeted, personalized services that lead to a better customer experience. Satisfied customers then become key prospects for upselling and cross-selling, as they are more likely to respond positively to additional service offers.
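Even simple aggregation over billing events can answer the questions above. The sketch below derives each subscriber's most-used service and peak activity hour; the event records and field names are invented for illustration.

```python
from collections import Counter, defaultdict

def subscriber_profile(events):
    # Tally service usage and activity hours per subscriber.
    by_service = defaultdict(Counter)
    by_hour = defaultdict(Counter)
    for e in events:
        by_service[e["subscriber"]][e["service"]] += 1
        by_hour[e["subscriber"]][e["hour"]] += 1
    return {
        sub: {
            "top_service": by_service[sub].most_common(1)[0][0],
            "peak_hour": by_hour[sub].most_common(1)[0][0],
        }
        for sub in by_service
    }

events = [
    {"subscriber": "S1", "service": "video", "hour": 21},
    {"subscriber": "S1", "service": "video", "hour": 21},
    {"subscriber": "S1", "service": "voice", "hour": 9},
]
profiles = subscriber_profile(events)
```

An upsell engine could key off such profiles, for example offering an evening video add-on to subscribers whose peak hour falls in the evening.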

Agile and transparent Blockchain-based billing solutions combined with analytics have the potential to transform the relationship between CSPs and their subscribers. The faster organizations acknowledge this and act on it, the better their chances of becoming market leaders where service differentiation and customer experience reign supreme.

 

Manmohan Panda

Manmohan Panda is Senior Director - Communications Solutions at Virtusa. With nearly two decades of rich experience in the telecommunication and media domain, he currently leads a group of enterprise telecom solutions architects for one of Virtusa's largest telco clients. Prior to Virtusa, he served in various leadership roles with telco product and value-added product companies, with implementations across the globe. Through his career, he has successfully led and implemented large, complex transformation programs and innovative solutions. He has a Bachelor's degree in Electronics and Telecommunications and an advanced diploma in computers from C-DAC, India. He is a certified Enterprise Architect with TOGAF 9.1.

The post Blockchain and Analytics: Unlocking New Billing Opportunities for Communication Service Providers appeared first on Virtusa Official Blog.


