Case Study


Financial Services

Credit Risk management for Housing Finance

Client Background
The client is one of India's premier housing finance companies, providing low-ticket home loans to the priority sector in India. The priority sector is a relatively unbanked segment without a steady income stream. Through its operations, the client aims to impact millions at the bottom of the pyramid. Technology plays a critical role in assessing and managing the risk of lending to this segment.

Business Objective
The client has extensive lending-risk evaluation algorithms that have evolved over its many years of operations. These rule-based algorithms cover various geographies and income segments to assess default events in the near future, and have held up well. One risk metric for any loan is three consecutive payment defaults within the first year of disbursement. The portfolio default rate stood at 6%, translating into non-performing assets to the tune of 3M USD. The client wanted to improve detection of risky applicants during the loan origination process using machine learning.

Solution: AI-Based Risk Management
Our team developed a risk detection platform powered by machine learning algorithms to detect three consecutive payment defaults within the first year of disbursement. The platform generates a default score between 0 and 100; the higher the score, the higher the chance of three consecutive defaults in the first year. Key features of the platform:
- Robust machine learning algorithms built on three years of historical loan portfolio data and tested across different population segments.
- Ingests approximately 200 structured and 30+ unstructured consumer application data points.
- Machine learning algorithms monitored periodically for any new risk patterns.

Outcome
- Decrease in portfolio default rate from 6% to 2% over a period of 18 months.
- Loan disbursements increased by 10% to 55M USD.
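The scoring idea above can be sketched as a logistic model mapped onto a 0-100 scale. This is a minimal illustration, not the client's model: the feature names and coefficients below are invented for demonstration.

```python
import math

# Hypothetical coefficients -- illustrative only, not the client's trained model.
WEIGHTS = {"debt_to_income": 2.1, "prior_delinquencies": 1.4, "tenure_months": -0.02}
INTERCEPT = -3.0

def default_score(applicant: dict) -> int:
    """Map applicant features to a 0-100 default-risk score via a logistic model."""
    z = INTERCEPT + sum(WEIGHTS[k] * applicant.get(k, 0.0) for k in WEIGHTS)
    prob = 1.0 / (1.0 + math.exp(-z))  # P(3 consecutive defaults in year 1)
    return round(prob * 100)

risky = default_score({"debt_to_income": 1.8, "prior_delinquencies": 2, "tenure_months": 6})
safe = default_score({"debt_to_income": 0.3, "prior_delinquencies": 0, "tenure_months": 48})
```

In production such coefficients would come from fitting on the three years of historical portfolio data the case study mentions.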

Financial Services

Personalized insurance suggestions

Client Background
The client is a leading private life insurance firm in India selling a gamut of insurance products such as term insurance, savings plans, ULIPs and pensions, helping customers meet their long-term financial goals and plan their lives better.

Business Need
The client wanted to improve cross-sell penetration within its existing customer base. It was already running direct marketing campaigns targeting customer segments based on the current product held, life-stage rules, time of year and other business rules. Incremental gains from these campaigns were marginal, so the client wanted to dive deeper into the data to:
- Proactively identify policyholders with a high likelihood of purchasing more than one policy, based on all available data fields.
- Use agent characteristics as the main lever to predict cross-sell propensity.

Solution
- A propensity algorithm scored customers using logistic regression. Datasets used were customer demographics, product details, financial behavior, customer interactions and agent details.
- Cross-sell propensity scores were produced at the product-category level for each customer.
- Scores were normalized to recommend the top 2 products a customer is likely to purchase.
- Recommendations powered email and call-center campaigns.

ROI
- Tailored marketing campaigns across marketing channels.
- Incremental revenue of USD 100,000 in 3 months.
- Lower cost of marketing campaigns.
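The "normalize scores and recommend the top 2 products" step can be sketched in a few lines. The category names and score values here are placeholders, not client data:

```python
def top_products(scores: dict, k: int = 2) -> list:
    """Normalize per-category propensity scores and return the top-k categories."""
    total = sum(scores.values()) or 1.0          # guard against an all-zero profile
    normalized = {cat: s / total for cat, s in scores.items()}
    return [cat for cat, _ in sorted(normalized.items(), key=lambda kv: -kv[1])[:k]]

# One customer's hypothetical propensity scores per product category:
recs = top_products({"term": 0.12, "ulip": 0.55, "pension": 0.30, "savings": 0.41})
```

The two returned categories would then feed the email and call-center campaign lists described above.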

Industrial

Downtime management for industrial gas turbines

Client Background
Our client is a leader in energy technologies, generating a third of the world's electricity. Its technology equips 90% of power transmission utilities worldwide.

Business Objectives
- Create a recommendation engine to assist outage planners and field service teams in more effectively guiding and informing maintenance interactions with their clients.
- Source data from a variety of internal sources (e.g., outage reports, equipment data, sensor data, technical lists, resource lists, scope lists, operational checklists, tribal knowledge) and possibly external sources to create the Outage Risk Assessment Platform. This data set would feed a recommendation engine that flags and prioritizes risks at the client and equipment level.

Solution
The solution development journey was divided into the following phases:

Phase 1: Data Collection, Integration and Transformation
During this phase our consultants worked with key stakeholders and subject-matter experts in the client's business to understand and document the baseline process outage planners follow when planning scheduled maintenance sessions with clients. This included understanding the data sources used, linking information to decisions made, and documenting how outage planners use the information to prioritize risk issues for their client maintenance interactions today. Our team then collected and assembled information from the various data sources provided by the client.

Phase 2: Data Analysis and Model Development
During Phase 2, our team conducted basic data exploration, defined key events relevant to the planning process and communicated preliminary findings. Following delivery of the initial set of insights, we started the modelling process. The purpose of the recommendation engine was to help outage planners identify and prioritize key risks to inform the agenda of their scheduled maintenance reviews. The recommendations ranged from direct actions the outage planner should take to address obvious issues, to more general risks requiring further drill-down to identify root causes. Our team worked with the client to identify several outage planners for a beta test of the recommendation engine. In early testing, the engine identified 50-60% of unplanned outages and impacting factors across 4 categories. The beta results enabled us to refine the recommendation engine for greater relevance.

Phase 3: System Implementation
Once the model was finalized, we implemented the algorithm, and the report featuring the model's output and recommendations, on the client's internal system. Outage planners could access the report, lift the prioritized risk issues and use the information in their mitigation plans as they prepared for maintenance meetings with their clients.

Outcome
From engineer notes stored as PDFs, our recommendation engine discovered reasons for maintenance delays and machine breakdowns. These insights were used to identify best practices for outage planning, scheduling maintenance and procuring repair parts.
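One ingredient of mining free-text engineer notes, as described above, is flagging risk mentions by category. The sketch below uses a hand-built keyword taxonomy; the categories and patterns are invented for illustration, not the client's actual taxonomy:

```python
import re

# Hypothetical risk keywords per category -- illustrative, not the client's taxonomy.
RISK_PATTERNS = {
    "vibration": r"\bvibrat\w*",
    "overheating": r"\boverheat\w*",
    "part_delay": r"\bpart\w* (?:delay|backorder)\w*",
}

def flag_risks(note: str) -> dict:
    """Count mentions of each risk category in an engineer's free-text note."""
    text = note.lower()
    return {cat: len(re.findall(pat, text)) for cat, pat in RISK_PATTERNS.items()}

counts = flag_risks("Rotor vibration exceeded limits; parts delayed twice. Vibration persisted.")
```

Counted mentions per note can then be aggregated per client or asset to rank risk issues for the outage planner's agenda.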

Financial Services

Personalized consumer engagement for credit card customers

Client Background
The client is an Indian fintech startup whose goal is to enable banks to provide a delightful, relevant and personalized experience to their retail customers. It promises to help customers save more and spend on the right things at the right price using machine learning and artificial intelligence.

Business Need
The client was in the initial stages of conceptualizing the product that would fulfil its vision. It was clear this couldn't be done without analytics and machine learning, but the client lacked the in-house expertise. It therefore wanted to engage an analytics provider that would bring the necessary expertise in applying machine learning for personalization and recommendation. The chosen partner needed to work closely with the client to identify how machine learning could play a role in the product and which product features it could power, and then implement the same. The client already had a technology partner responsible for the consumer side of the application.

Solution: Personalized Customer Engagement
The client obtained sample data for credit card customers and their transactions through a tie-up with a private-sector bank in India. Given that the client lacked machine learning knowledge and was unsure where and how it could be applied, our team worked closely with the client to educate it and identify areas of consumer engagement that machine learning could influence; the engagement was effectively co-creation of a product. The following key observations were made from the data:
- Credit card transaction data had hygiene issues, with non-standardized merchant names. This would hamper attempts to discover consumer preferences for merchants, online and offline.
- Merchant categories like Retail, Hypermarket and Entertainment would offer limited insight into consumer preferences regarding clothing, food and so on. A consumer would have specific merchants they like to shop from more often.
- Some spending is necessary and frequent, such as grocery and bill payments for mobile, DTH, etc.; other spending is discretionary, such as entertainment and travel.
- The client would be onboarding several merchant partners across frequent and non-frequent categories, whose offers would be served through the platform.
- The bank restricted SMS-based outreach to 8 messages per customer per month. Any recommendation algorithm would need to take this into account.

It was decided to:
- Enrich merchant metadata to allow deep customer profiles and relevant offers. The enrichment exercise had to be largely manual.
- Group merchant categories by transaction frequency and necessary vs. discretionary nature, to allocate the SMS quota appropriately.
- Develop customer spend preferences by category, merchant, day of the week and time of day wherever relevant.
- Enrich merchant offers with the same tags used for merchants, to enable mapping.
- Use collaborative-filtering-based insights and event-based targeting to drive recommendations initially; once the system went live, feedback could be used to develop more sophisticated algorithms.
- Use scalable technologies like Hadoop and Spark for data processing and machine learning, so the system could later scale to larger transaction datasets.

A four-month exercise resulted in an MVP with modules for data cleaning and transaction enrichment, customer preference creation, rules for event-based recommendation, and collaborative-filtering-based recommendation.

ROI
- Allowed the client to develop a robust MVP within cost constraints and scale it to the next level.
- Our team also participated in pitching the product and its analytics capabilities to private-sector banks.
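The merchant-name hygiene problem called out above is typically attacked with normalization plus a manually curated alias map. The sketch below is a minimal version; the alias entries and the sample transaction string are made up:

```python
import re

# Hypothetical alias map -- in practice built through the manual enrichment exercise.
ALIASES = {"bigbasket": "BigBasket", "big basket": "BigBasket"}

def normalize_merchant(raw: str) -> str:
    """Strip store codes/punctuation from a raw transaction descriptor and map
    known aliases to a canonical merchant name."""
    cleaned = re.sub(r"[^a-z ]", " ", raw.lower())   # drop digits and punctuation
    cleaned = re.sub(r"\s+", " ", cleaned).strip()
    for alias, canonical in ALIASES.items():
        if alias in cleaned:
            return canonical
    return cleaned.title()                            # fall back to cleaned name

name = normalize_merchant("BIGBASKET*4821 MUMBAI IN")
```

Canonical merchant names make the later preference-profiling and collaborative-filtering steps far more reliable than raw descriptors would.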

Financial Services

Predictive churn management for Life insurance

Client Background
The client is a pan-India private life insurer following a multi-channel distribution strategy, with a vision to help people plan their lives better. It has offered a suite of insurance products and investment plans through digital and offline channels since its inception in 2008. Life insurance products are long-term products meant to cover the risk of untimely death of an insured person or to let an individual build wealth for future needs, such as a child's education, a child's marriage or retirement savings. Wealth-builder plans can be market-linked, like ULIPs, or can offer a fixed rate of return through the term. Most plans have a minimum premium-paying term of 10 years, with pure term plans going as high as 35 years. These plans have traditionally been sold through the agency channel, where agents contact prospective customers and explain the benefits of the plan. Agents, in turn, are paid a fixed percentage commission for every policy sold; this commission is higher in the first year and gradually reduces over subsequent years.

Business Objective
Persistency is a key metric through which a life insurer measures the effectiveness of its retention strategy. For most practical purposes, 1st-year persistency is the number of policies still in force one year after acquisition, and similarly for 2nd-year persistency and so forth. The client's persistency metrics (13th, 25th and 37th month) were below the industry average. Poor persistency ratios are a serious concern for any life insurer given the high cost of customer acquisition and market competition. The client therefore wanted to understand the factors behind poor persistency through a quantitative approach and receive scientific recommendations to improve it. It also expected an optimal renewal strategy; the existing one involved agent follow-ups, email reminders, SMS alerts and telephonic follow-ups.

Solution
Our team of domain experts and data scientists spent a week at the client's premises to understand:
- The customer acquisition process and the different acquisition channels, along with their strengths and weaknesses.
- The renewal strategy: how it was implemented then, and whether or not it varied across customer segments.
- The internal data fields available for the study, namely customer information collected at application stage, customer interaction data (emails, call logs, etc.), insurance product details and acquisition channel information.

It was observed that:
- The renewal strategy was sub-optimal, with premium amount as the only criterion for prioritizing proactive reach-out. Monthly policy renewals were of the order of 11k and call-center capacity was 12k. Assuming 5 minutes per call, a single agent could make approximately 1,700 calls per month, with a total of 21k calls across all agents. This meant an average of 2 calls per customer for renewal reminders, whereas the optimal call-center reach-out strategy at other life insurers involved at least 4-5 calls on various days before renewal. Call-center resources were clearly stretched; capacity had to be either increased or prioritized.
- 85% of the policy base had yearly renewal, with the remainder evenly spread across monthly, quarterly and half-yearly.
- There are several potential reasons for a customer not renewing a policy: the customer need no longer exists, the purchase was made for the wrong reasons (e.g., tax savings only) without evaluating actual needs, mis-selling by the agent, low returns compared with the market, wrong agent practices, untimely renewal follow-ups, etc. Agent-sourced policies were more prone to churn.

It was decided to compute a churn score for each policy due for renewal. The score would indicate the risk of the policy not being renewed: the higher the score, the higher the risk.

The Valiance data science team started by examining the different customer datasets and built a single view at the policy level. Datasets included customer demographics, financial product details, payment transaction history, financial product performance and customer engagement across channels. Each policy renewal was classified into lapse/non-lapse, taking the grace period into account. Further study involved:
- Studying the impact of each factor on the lapse event. The team created additional variables, such as the ratio of annual premium to income and an urban/rural location category, to study additional effects. Correlation studies and scatter plots were used to communicate the results.
- Building multiple predictive algorithms using logistic regression and random forest techniques. The team shared results with the client to arrive at a model that was an appropriate fit and made business sense in explaining the outcome.
- Using the resulting model to score upcoming renewals by likelihood to lapse. Scores were generated monthly for each policy coming up for renewal.
- Creating customer segments on the basis of churn score and annual premium. The contact strategy was finalized on churn score and premium at stake: customers with a high churn score and premium above 25k were pursued through calls, and visits if needed; customers with a lower churn score and lower premium were contacted via SMS and email. The frequency of emails and calls was adjusted per segment.

Outcome
An optimal renewal strategy was developed for customer segments based on churn score and premium amount. High-churn-score, high-premium customers were proactively reached by agents; low-score, low-premium customers received email and SMS reminders only. Targeted retention efforts increased policy persistency by 20% over 1 year, with increased revenue of 3M USD.
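The segmentation of renewals into contact channels can be sketched as a simple two-axis rule. The source only states that high-score, high-premium (above 25k) policies got calls/visits while low-score, low-premium ones got SMS/email; the churn-score cut-off of 70 below is an assumption for illustration:

```python
def contact_strategy(churn_score: float, annual_premium: float) -> str:
    """Assign a renewal outreach channel from churn score and premium at stake."""
    high_risk = churn_score >= 70          # assumed cut-off on a 0-100 churn score
    high_value = annual_premium > 25_000   # premium threshold stated in the case study
    if high_risk and high_value:
        return "agent call / visit"
    if high_risk or high_value:
        return "call"
    return "sms + email"

channel = contact_strategy(churn_score=82, annual_premium=40_000)
```

With roughly 11k renewals a month, a rule like this concentrates the scarce 21k monthly call slots on the policies where both risk and revenue at stake are highest.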

Financial Services

Credit Fraud Mitigation at POS

Client Background
The client is one of India's biggest NBFCs (Non-Banking Finance Companies) providing consumer durable loans. These are unsecured, interest-free loans offered to a customer when they walk in to purchase a big-ticket consumer durable. It is imperative for the client to have quick and reliable credit appraisal so as not to lose the customer while at the same time controlling for fraud.

Business Objective
Fraud stood at 3 million USD per annum. Fraud was identified only after the loan had been disbursed and the customer had taken delivery; there was no mechanism to help the service representative at the store flag a case as potential fraud. Any framework or tool would need to quantify the potential risk and move a case from "Instant mode" to "Normal mode", since the cost of an outright reject is very high. The client wanted an ML-based fraud detection framework for POS loan approvals to identify customers more likely to commit fraud or default on consumer durable loans, and hence streamline the loan approval process according to customer risk profiles.

Solution: Fraud Detection Using Machine Learning
Our team developed a machine-learning-based real-time fraud detection engine integrated with the point of sale. The engine classifies loan applicants into high, medium and low risk categories for potential fraud. Key features:
- Assigns a fraud score to the applicant at the point of lending.
- Routes higher-fraud-score applications through a stringent verification process.
- Machine learning algorithms monitored periodically for any new fraud patterns.

Outcome
- Substantial decrease in loan disbursement to fraudulent cases at the point of sale. Almost 10% of originations are referred to the "Normal process", in which fraud incidence is as high as 5%, translating into a gross saving of almost 1.5 million USD, i.e. 50% of the VaR.
- Substantial decrease in the third-party cost of recovering loan amounts from fraudulent cases.
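The three-tier routing described above can be sketched as a threshold rule on the fraud score. The 80/50 cut-offs are assumptions for illustration; the source does not state the actual thresholds:

```python
def route_application(fraud_score: int) -> str:
    """Route a POS loan application by fraud score: high-risk cases leave
    'Instant mode' for the slower 'Normal mode' verification queue."""
    if fraud_score >= 80:                 # assumed high-risk cut-off
        return "normal mode"              # stringent manual verification
    if fraud_score >= 50:                 # assumed medium-risk cut-off
        return "additional checks"
    return "instant mode"                 # low risk: instant approval path

decision = route_application(85)
```

Because an outright reject is costly, the rule only ever downgrades the approval speed, never declines the application directly.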

Financial Services

Cashflow prediction & Optimization for ATM machines

Client Background
The client is a prominent consumer bank in Indonesia with a network of 400 branches, 3,000 ATMs and 1,000 recycle machines (RCMs).

Business Need
Our client was looking for an optimization solution to better manage the distribution of cash across its many customer interaction points. It requested a daily forecast of deposits and withdrawals at the branch, ATM and RCM level. In addition, Valiance needed to find optimal dates for cash-in (cash deposits) at the branch, ATM and RCM level to meet withdrawal demand, subject to the following constraints:
- Each cash deposit by a van incurs transportation cost. Transportation cost should be minimized by bringing in the cash needed for the next few days in a single journey instead of many.
- Excess cash parked in branches implies additional interest paid to the central bank. Bringing in only the cash required to meet demand would save excess interest.
- There is a limit to the cash amount every branch, ATM and RCM can hold.

Solution
Valiance created a neural-network-based machine learning algorithm to forecast transactions. A quadratic optimization model was implemented to optimize cash-in/cash-out for RCMs, ATMs and cash centers.

Outcome
Daily forecasts for deposits and withdrawals helped the bank distribute its money more efficiently across ATMs, RCMs and branches. Forecasts were accurate to within +/-15% on average. The client reduced its transportation cost and saved interest on borrowed money by optimizing cash deposits to branches, ATMs and RCMs once the forecast for deposits and withdrawals was complete.
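The trade-off above (fewer van trips vs. less idle cash) can be illustrated with a greedy grouping of forecast withdrawals; the production system used a quadratic optimization model instead, and the demand figures and capacity below are made up:

```python
def plan_deliveries(daily_demand: list, capacity: float) -> list:
    """Greedy sketch: group forecast daily withdrawals into the fewest van trips
    that respect the ATM's cash-holding limit. Each trip loads as many upcoming
    days of demand as the machine can hold."""
    trips, load = [], 0.0
    for day, demand in enumerate(daily_demand):
        if not trips or load + demand > capacity:
            trips.append({"day": day, "amount": 0.0})   # dispatch a new van
            load = 0.0
        trips[-1]["amount"] += demand
        load += demand
    return trips

# Five days of forecast withdrawals for one ATM holding at most 100 units:
trips = plan_deliveries([40, 30, 50, 20, 60], capacity=100)
```

A quadratic program generalizes this by jointly pricing the transportation cost per trip against the interest cost of cash sitting idle, rather than greedily filling to capacity.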

Financial Services

Risk augmented personal loan cross sell

Client Background
The client is one of India's leading financial services companies, focused on lending, asset management, wealth management and insurance. Through its joint ventures and subsidiaries the company employs over 20,000 people and has established a nationwide presence across over 1,400 locations.

Business Need
Personal loans are unsecured loans generally offered at higher interest rates, in some cases higher than the rates banks offer for similar loans. The traditional customer segment approaching the client for a personal loan has been one unable to secure a similar loan from a bank, which inherently increases the risk of lending to a new applicant. At the same time, high interest rates make the product attractive from an ROI standpoint. The existing customer base, with its previous payment track record, represents a more lucrative and less risky segment for such unsecured loans. The client ran regular email marketing campaigns across its existing customer base for personal loan cross-sell, with additional reach-out made through telephone calls. The large size of the existing customer base limited an effective call strategy and hence conversions. The client therefore wanted to expand personal loan penetration across the existing customer base and proactively identify customers likely to respond to personal loan offers.

Solution
In any lending business, the growth and risk teams need to work together to achieve a healthy portfolio with decent returns. The desired proactive solution needed to identify existing customers who would respond to personal loan offers while taking each customer's risk profile into account. Our team therefore recommended a hybrid approach marrying cross-sell propensity with risk scoring. The Valiance data science team built default-propensity and cross-sell-propensity models, delivering accurate and robust models within two months of the start of the engagement. Default propensity was based on demographics, product loan details and the available repayment track record (RTR) parameters, viz. number of cheque bounces, channels of payment, maximum delinquency, etc. For the cross-sell model we profiled campaign responders to understand their affinity towards different offers. Marketing campaigns were designed by overlaying cross-sell propensity and delinquency risk, and campaign results were evaluated over a six-month time frame.

Outcome
- 10 percent increase in overall conversion rates compared with baseline conversion rates from previous campaigns.
- The portfolio generated from the cross-sell campaign had a 10-20% higher ticket size compared with the average ticket size.
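The overlay of the two models reduces to a joint filter on the two scores. The thresholds and sample records below are invented for illustration:

```python
def target_list(customers: list, propensity_floor: float = 0.6,
                risk_ceiling: float = 0.2) -> list:
    """Overlay cross-sell propensity with default risk: target only customers
    likely to respond AND unlikely to default (thresholds are illustrative)."""
    return [c["id"] for c in customers
            if c["cross_sell_p"] >= propensity_floor
            and c["default_p"] <= risk_ceiling]

targets = target_list([
    {"id": "A", "cross_sell_p": 0.8, "default_p": 0.10},  # responsive, low risk
    {"id": "B", "cross_sell_p": 0.9, "default_p": 0.50},  # responsive, but risky
    {"id": "C", "cross_sell_p": 0.3, "default_p": 0.05},  # safe, but unresponsive
])
```

Customer B illustrates why the overlay matters: a propensity-only campaign would target them, while the hybrid approach excludes them on risk grounds.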

Financial Services

NPS for global financial services company

Client Background
A leading global financial services firm wanted to estimate Net Promoter Score (NPS) for its non-survey responders and align business strategies accordingly.

Business Objective
Fewer than 1% of customers responded to survey requests across all campaigns. With NPS being important for understanding share of wallet, the firm wanted to estimate NPS for non-survey responders.

Solution
The proposed solution was a predictive model to score non-survey responders into three categories: promoters, detractors and neutrals. Survey data (from a Refer-a-Friend survey) contained a series of questions and responses. Using this data, on a scale of 1 to 10, a promoter was defined as a score of 9 or 10, a score of 7 or 8 as neutral or passive, and the rest as detractors. The models were then integrated with CRM and Avaya (the call management tool), so that when a customer called in, the model would score them using all the data available up to that instant. As a result of the scoring, the customer care executive would know the customer's score and could give a guiding, gentle or lavish touch. This was later implemented across all customer touch points. Some of the variables (leading KPIs) used were share of wallet, call hold time, resolution codes, usage frequency, touch points, internal cluster profiles, customer care executive rankings, etc.

Outcome
In a control sample, we were able to demonstrate higher satisfaction and increased transaction affinity over a period of 6 months. In the long term, share of wallet increased significantly.
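The bucket definitions and the NPS arithmetic above can be made concrete in a few lines; the sample scores are arbitrary:

```python
def nps_bucket(score: int) -> str:
    """Standard NPS buckets: 9-10 promoter, 7-8 passive/neutral, 0-6 detractor."""
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

def net_promoter_score(scores: list) -> float:
    """NPS = %promoters - %detractors, over surveyed or model-predicted scores."""
    buckets = [nps_bucket(s) for s in scores]
    n = len(buckets)
    return 100 * (buckets.count("promoter") - buckets.count("detractor")) / n

nps = net_promoter_score([10, 9, 8, 6, 3])
```

For non-responders, the predicted category from the model simply replaces the surveyed score as input to the same formula.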

Industrial

IOT based Power Plant Monitoring

About our Client
The client is a state-owned Indian hydel power generation company operating three power plants in India with generation capacities between 30 MW and 240 MW. These power plants have been operational for decades using equipment from different manufacturers. The plants' operational teams use electronic hardware deployed on site to monitor plant and asset (transformer, generator, turbine) parameters such as voltage levels, current readings, pressure levels, fan speed and vibration at various points. These sensor devices are localized and do not send data to a central place that would enable centralized monitoring, analytics or AI workloads.

Business Objective
As part of its digital transformation journey, the client is looking to build a cloud-based IoT platform to enable:
- Centralized collection of sensor data in a data lake environment.
- Real-time monitoring of plant and asset parameters, i.e. output power generated, pressure, and voltage and current levels.
- KPI reporting for internal stakeholders at various levels.
- Advanced AI workloads, such as predictive maintenance, in the future.

Solution
After discussions with the client's technology team and an evaluation of popular cloud vendors, AWS was chosen as the platform for the solution. We then worked with the AWS technology team on a technology architecture comprising a data lake environment on S3, Kinesis streams to ingest incoming sensor streams, Lambda functions to pre-process sensor data before storing it in S3, and Athena queries for reporting. The solution was developed in the following steps:
- Different plant assets (transformers, turbines, generators) were equipped with sensors to enable IoT remote monitoring, configured to send data to the cloud via a centralized local gateway device. A total of 123 sensors were integrated across the transformers, generators and turbines at each plant, plus a single water-conducting system.
- Streaming data was ingested through a Kinesis data stream, with pre-processing and enrichment through Lambda functions. Data from the Kinesis streams was stored in S3 in JSON and Parquet formats.
- Data in S3 was aggregated using Athena queries to compute KPIs and other operational metrics. These metrics were stored in a relational database for downstream BI applications.
- A web-based application with multi-user, role-based access provided information access for different audiences. The plant team wanted to monitor real-time and historical data points to observe and study anomalies in asset health; management wanted to review each power plant's output generation and downtime.

Outcome
Within a month we went live with the IoT platform at a single power plant. This gave the operational team real-time access to asset data, historical trends and patterns, and gave management visibility into output power and downtime metrics. The platform is now being expanded to two more power plants.
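The Lambda pre-processing step in the pipeline above can be sketched as below. Kinesis delivers record payloads base64-encoded inside the Lambda event; the field names (sensor_id, value, limit) are assumptions, not the client's actual schema:

```python
import base64
import json

def handler(event, context):
    """Sketch of the Lambda pre-processing step: decode Kinesis records,
    coerce and enrich each sensor reading, and return cleaned rows
    (which a real function would write on to S3)."""
    rows = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        payload["value"] = float(payload["value"])        # coerce reading to numeric
        # Flag readings that exceed the asset's configured limit, if any:
        payload["anomaly"] = payload["value"] > payload.get("limit", float("inf"))
        rows.append(payload)
    return rows

# Local invocation with a fake Kinesis event for one vibration sensor:
fake_event = {"Records": [{"kinesis": {"data": base64.b64encode(
    json.dumps({"sensor_id": "t1-vib", "value": "7.2", "limit": 5.0}).encode()
).decode()}}]}
out = handler(fake_event, None)
```

Doing the type coercion and anomaly flagging here keeps the S3 data lake clean, so the downstream Athena queries can aggregate KPIs without per-query data repair.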
