Axtria Blogs

Blogs By Amanjeet Saluja

Driver Score - Quantifying your Moves

With advancements in science and technology, more and more appliances have integrated systems capable of storing and reporting streams of data. In the field of telematics, various devices are used to extract, store and transform information related to vehicles and their usage. The data provided by these devices also contains information about various aspects of the driver, including driving patterns.

The need of the hour is a unified rating for drivers that represents how good a driver is, irrespective of the device or telematics service provider (TSP). A lot of information related to driving behavior is tracked through in-vehicle telecommunication devices (telematics) that are usually self-installed into a special vehicle port and can be used to predict the risk that a driver will cause an accident in the foreseeable future. Through this paper, we propose a methodology to calculate this rating, generally called the driver score.

Read More

At Supplier Management crossroads? - Choose the Right Approach

With new compliance requirements, increased scrutiny from regulators and the pressure to maintain one's market position, reputation and brand, developing a comprehensive supplier management function has become critical in recent times. Banks can gain tremendous cost advantages and efficiency gains through increased governance and streamlined supplier management operations.

Read More

Deploying K Nearest Neighbor Modeling Methodologies for Real World Problems

Easily available online data and immense improvements in theory and algorithms in recent times have enhanced the computational power available for solving real-world problems. A range of machine learning techniques is now available to make our job easier. K Nearest Neighbor (KNN), Decision Tree, Gradient Boosting Methodologies (GBM), Random Forest and Support Vector Machine (SVM) are some of the popular techniques that have emerged in the recent past. It may happen that multiple methods (machine learning as well as traditional methods) are applicable to a particular problem at hand, and the researcher or analyst may struggle to find a good reason for choosing one method over another. This paper aims to help you make intelligent decisions about where KNN is most suitable for your particular problem and how you can apply it.

Read More

A Mathematical Framework for Privacy Risk

Privacy is the right to have control over how an individual’s personal information is collected, shared and used. In recent times, people’s attitude towards the internet has changed tremendously. The willingness and comfort level of people to share and reveal information about themselves and their peers have vastly increased, and are rising further. As information-intensive sites increase and sprout across networks, the probability that individuals will both consciously and subconsciously reveal personal information on these different networks has increased drastically, and will continue to increase in the foreseeable future. This increase in exposure can lead to incidents of identity theft, fraud and data leakage, posing a serious threat to one’s privacy.

Read More

Operational Risk Management in Banks

In recent years, more than 100 losses exceeding US$ 100 million have been reported, prompting regulators to turn their focus towards the effectiveness of Operational Risk Management practices at banks. The BCBS (Basel Committee on Banking Supervision) has issued guidance to financial institutions to create a sound operational risk management infrastructure.

A study of 30 global systemically important banks (G-SIBs) conducted by the Basel Committee in 2013 did not provide too many reasons to cheer for other banks. Important challenges highlighted by the study centered around data architecture, reporting and the KRI framework.

Read More

Basel Downturn - Identification & Estimation

As part of the Basel accords, banks and other financial institutions should hold sufficient capital to buffer against large unexpected losses. The minimum capital required under Pillar 1 is a combination of credit, operational and market risk.

While the Probability of Default (PD) is conditioned to include defaults up to a 99.9th percentile value, no such amendments were made to Exposure At Default (EAD) or Loss Given Default (LGD) to reflect such severe scenarios. Regulatory guidance was issued to condition EAD and LGD values to the observed period of downturn within the data. Regulatory guidelines suggest adding conservatism based on the maxima of historical data.

A prerequisite to any downturn computation is identification of the downturn period, which exhibits circumstances of an adverse nature. Banks should be able to meet their debt obligations in spite of these unforeseen variations. The current methodology to do so hinges on the following key steps (a rough sketch in code follows the list):

  • Identify downturn driver: Using historical data, plot the series of portfolio default rate over time
  • Identify downturn: Identify the month with highest default rate; a 6-month window on both sides of this point is considered as downturn period
  • Downturn computation: Maximum of EAD and LGD in this period provides the downturn estimates
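The steps above can be expressed in a few lines of code. This is a minimal, illustrative sketch in Python, assuming a monthly portfolio-level dataset with hypothetical columns month, default_rate, ead and lgd; the column names, data layout and use of pandas are assumptions for illustration, not part of the regulatory text.

```python
import pandas as pd

def current_downturn_estimates(df: pd.DataFrame) -> dict:
    """Identify the downturn window around the peak default rate and
    take the maximum EAD and LGD observed within that window."""
    df = df.sort_values("month").reset_index(drop=True)

    # Steps 1-2: the month with the highest portfolio default rate marks the
    # downturn; a 6-month window on either side is the downturn period.
    peak_idx = df["default_rate"].idxmax()
    window = df.iloc[max(peak_idx - 6, 0): peak_idx + 7]

    # Step 3: downturn estimates are the maxima of EAD and LGD in the window.
    return {
        "downturn_month": df.loc[peak_idx, "month"],
        "downturn_ead": window["ead"].max(),
        "downturn_lgd": window["lgd"].max(),
    }
```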

However, since the existing methodology is based on absolute values of EAD and LGD (which might not paint the complete picture about the respective downturn estimates), it has a number of inherent drawbacks:

  • Dependence on credit line assignment policies: the downturn identified only on the basis of EAD maxima isn’t really a reflection of distress but could also possibly be an outcome of a bank’s credit line assignment policies.
  • Inability to incorporate “hunger for credit”: The current approach doesn’t explicitly take into account increased credit hunger of the borrower. Someone in a dire need of credit would certainly tap all possible resources, thus leading to higher EAD.
  • Dependence on credit line decreases by management: Banks may cut back credit lines to reduce potential losses if they observe deterioration of borrower credit quality at a more systematic level. Thus, EAD is determined by the behavior of both borrowers and lenders while they “race to default” jointly.
  • Possibility of LGD downturn pre-dating PD downturn: Both PD and LGD downturns are identified as their maximum values in the downturn window. This leads to the possibility that the LGD downturn might occur before the default rate peak.

To overcome these drawbacks, we propose inclusion of the credit conversion factor (CCF) into the downturn identification process. CCF is the ratio of the difference between EAD and the balance at observation to the available credit (also referred to as open-to-buy) as of the observation point:

CCF = (EAD − Balance at observation) / Available credit at observation, where Available credit (open-to-buy) = Credit limit at observation − Balance at observation.
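For illustration, with purely hypothetical figures: an account with a credit limit of $10,000 and a balance of $2,000 at observation has available credit of $8,000; if it later defaults with an EAD of $6,000, its CCF is (6,000 − 2,000) / 8,000 = 0.5.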

Exposure At Default (EAD) for an account is driven by the borrower’s hunger for credit. Various exogenous factors like credit limit, historical spending patterns, current balance etc. can help in estimating this phenomenon. Of these, available credit limit has proved to be the best indicator for predicting EAD. It is not so much the actual credit limit, but the drawdown on the unused limit, that defines the credit hunger of the borrower. The CCF methodology can be applied for estimating EAD downturn as follows (a minimal sketch in code follows the list):

  • Using historical data, plot the series of portfolio CCF over time
  • Consider peak CCF as downturn point
  • Compare CCF of scoring month with CCF at downturn point
  • Adjust scoring EAD to reflect CCF at downturn point (CCF Final = Max(CCF Scoring, CCF Downturn))
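A minimal sketch of how this adjustment might be coded, assuming a monthly portfolio CCF series plus the scoring month’s balance and available credit; the function and variable names are illustrative assumptions, and EAD is rebuilt from the CCF definition above (EAD = balance + CCF × available credit).

```python
import pandas as pd

def downturn_adjusted_ead(ccf_series: pd.Series,
                          scoring_month,
                          balance: float,
                          available_credit: float) -> float:
    """Adjust the scoring-month EAD to reflect the downturn CCF."""
    # The peak portfolio CCF marks the downturn point.
    ccf_downturn = ccf_series.max()

    # Compare the scoring month's CCF with the downturn CCF and take the
    # more conservative value: CCF Final = Max(CCF Scoring, CCF Downturn).
    ccf_scoring = ccf_series.loc[scoring_month]
    ccf_final = max(ccf_scoring, ccf_downturn)

    # Rebuild EAD from the CCF definition: EAD = balance + CCF * available credit.
    return balance + ccf_final * available_credit
```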

The same is highlighted in the graphic below.

[Figure: Downturn_EAD]

Loss given default for an account depends as much on the ability as on the intention to pay. It would be safe to assume that a defaulter utilizing a large amount of the available credit won’t be paying back a major chunk of his balance. Applying this concept, we can deduce a relationship between CCF and the recovery rate. However, the problem of stale data cannot be mitigated using CCF directly. To overcome this, we suggest establishing a relationship between CCF and LGD. This can be used to forecast values of LGD for the workout window, thus providing recoveries for the recent time period. The steps to execute are as follows (a minimal sketch in code follows the list):

  • Create series of both, actual LGD and observed CCF, over time
  • Establish a statistical relationship between the two using linear regression or time series modeling
  • Use model equations to forecast LGD for months with actual CCF available
  • Select the maximum LGD from the pool of both actual and forecasted LGD as downturn LGD
  • Compare the downturn LGD with that of the scoring month and adjust to reflect the worst case scenario
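A minimal sketch of these steps, assuming actual LGD is available only for months with a completed workout window (missing for recent months) and using ordinary least squares as the statistical link between CCF and LGD; the column names and the choice of a simple linear model are illustrative assumptions.

```python
import numpy as np
import pandas as pd

def downturn_lgd(df: pd.DataFrame, lgd_scoring: float) -> float:
    """Estimate downturn LGD from the CCF-LGD relationship and adjust the
    scoring-month LGD to the worst case."""
    observed = df.dropna(subset=["lgd"])  # months with a completed workout window

    # Step 2: linear regression of actual LGD on observed CCF.
    slope, intercept = np.polyfit(observed["ccf"], observed["lgd"], deg=1)

    # Step 3: forecast LGD for recent months where only CCF is available.
    recent = df[df["lgd"].isna()]
    forecast = intercept + slope * recent["ccf"]

    # Step 4: downturn LGD is the maximum over actual and forecasted LGD.
    lgd_downturn = pd.concat([observed["lgd"], forecast]).max()

    # Step 5: adjust the scoring-month LGD to reflect the worst case.
    return max(lgd_scoring, lgd_downturn)
```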

The same is shown in the graphic below:

[Figure: LGD_Identification]

We believe the CCF approach is more risk-sensitive (and granular), as it takes into account various factors, namely the customer’s hunger for credit, the intention to pay, and management’s response to downturn conditions, to assess the downturn period. Given that Basel guidelines encourage banks to look for more granular techniques, as these may provide more conservative downturn estimates, we believe the CCF methodology is more apt for downturn estimation.

Click here to read the complete whitepaper.


Read More

Borrower-Centrism: Bring ‘Service’ back into Mortgage Servicing

Mortgage servicers are entrusted with almost all vital operations, ranging from payment collection to loss mitigation. However, until January 2014, no strict regulations were in place to safeguard the borrower’s interest. In the upheaval of the last few years, both competitive pressures and the regulatory environment have changed this forever. If the industry is to flourish in the wake of the crisis, it must embrace borrower-centrism as a core value.

Read More

Are you always delayed in processing your claims?

Processing claims quickly and accurately is one of the biggest challenges payers face today in the healthcare industry. The huge volume of claims notwithstanding, multiple, incompatible systems requiring significant manual hand-offs have made the timely disposal of claims the single biggest burden on payers’ operating costs. Add to this the problem of incorrectly filed claims that payers need to handle, which chokes bandwidth on account of re-processing, and the problem suddenly becomes even more acute. With state-specific regulations penalizing such delays, it has become a matter of survival for payers to figure out the optimal trade-off between the analysis a claim warrants and the time it takes to get it through the system.

Read More

Operational Risk Computation - Whitepaper

Operational Risk has become a key area of priority for banks in the recent past due to regulatory pressures. Banks are identifying ways of measuring, monitoring and predicting losses due to operational risk in the system, and the associated capital requirements.

Read More

The Long and Winding Road of Regulatory Compliance

CCAR 2013 results are out. While most Bank Holding Companies (BHCs) have exhibited strong capital resilience to stress tests, 4 out of 18 BHCs need attention in their capital plan or capital planning process[1]. In 2012, the picture was similar. Most BHCs had adequate capital ratios; however, 4 of the 19 BHCs had one or more projected regulatory capital ratios that fell below regulatory minimum levels at some point over the stress scenario horizon[2].

Read More
