Axtria Blog

At Supplier Management crossroads? - Choose the Right Approach

With new compliance requirements, increased scrutiny from regulators, and the pressure to maintain one's market position, reputation, and brand, developing a comprehensive supplier management function has become critical. Banks can gain tremendous cost advantages and efficiency gains through stronger governance and streamlined supplier management operations.

Read More

Deploying K Nearest Neighbor Modeling Methodologies for Real World Problems

Easily available online data and major recent advances in theory and algorithms have greatly expanded our ability to solve real-world problems computationally. A wide range of machine learning techniques is now available to make this job easier: K Nearest Neighbor (KNN), Decision Trees, Gradient Boosting Machines (GBM), Random Forests, and Support Vector Machines (SVM) are some of the popular techniques that have emerged in the recent past. Often, multiple methods (machine learning as well as traditional) are applicable to the problem at hand, and the researcher or analyst may struggle to find a good reason for choosing one method over another. This paper aims to help you make an informed decision about where KNN is most suitable for your particular problem and how to apply it.

Read More

A Mathematical Framework for Privacy Risk

Privacy is the right to control how an individual's personal information is collected, shared, and used. People's attitude towards the internet has changed tremendously in recent years. Their willingness and comfort in sharing and revealing information about themselves and their peers have vastly increased, and continue to rise. As information-intensive sites sprout across networks, the probability that individuals will, both consciously and subconsciously, reveal personal information on these networks has increased drastically and will continue to increase in the foreseeable future. This exposure can lead to incidents of identity theft, fraud, and data leakage, posing a serious threat to one's privacy.

Read More

Operational Risk Management in Banks

In recent years, more than 100 losses exceeding US$ 100 million have been reported, prompting regulators to turn their focus towards the effectiveness of Operational Risk Management practices at banks. The Basel Committee on Banking Supervision (BCBS) has issued guidance to financial institutions on building a sound operational risk management infrastructure.

A study of 30 global systemically important banks (G-SIBs) conducted by the Basel Committee in 2013 did not give other banks many reasons to cheer. The key challenges highlighted by the study centered on data architecture, reporting, and the KRI (key risk indicator) framework.

Read More

Operational Risk Management in Banks

As part of the Basel accords, banks and other financial institutions should hold sufficient capital to buffer against large unexpected losses. The minimum capital required under Pillar 1 covers credit, operational, and market risk.

While the Probability of Default (PD) is conditioned to reflect defaults up to a 99.9th percentile value, no such adjustment is made to Exposure At Default (EAD) or Loss Given Default (LGD) to reflect equally severe scenarios. Regulatory guidance was therefore issued to condition EAD and LGD values on the observed downturn period within the data, suggesting the addition of conservatism based on the maxima of historical data.

A prerequisite to any downturn computation is identifying the downturn period, i.e., the stretch of time that exhibits adverse circumstances. Banks should be able to meet their debt obligations in spite of these unforeseen variations. The current methodology hinges on the following key steps, sketched in code after the list:

  • Identify downturn driver: Using historical data, plot the series of portfolio default rate over time
  • Identify downturn: Identify the month with highest default rate; a 6-month window on both sides of this point is considered as downturn period
  • Downturn computation: The maxima of EAD and LGD over this window provide the downturn estimates
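
For concreteness, here is a minimal sketch of these steps in Python, assuming a monthly, portfolio-level pandas DataFrame with illustrative column names default_rate, ead and lgd (these names and the data layout are assumptions, not taken from the whitepaper):

    import pandas as pd

    def current_downturn_estimates(df: pd.DataFrame) -> dict:
        """Current approach: locate the default-rate peak, take a 6-month
        window on both sides of it, and use the EAD/LGD maxima inside it.

        Expects one row per month, sorted by month, with illustrative
        columns 'default_rate', 'ead' and 'lgd'."""
        peak_label = df["default_rate"].idxmax()        # month with the highest default rate
        pos = df.index.get_loc(peak_label)
        window = df.iloc[max(pos - 6, 0): pos + 7]      # 13-month downturn window
        return {
            "downturn_ead": window["ead"].max(),        # downturn EAD estimate
            "downturn_lgd": window["lgd"].max(),        # downturn LGD estimate
        }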

However, since the existing methodology is based on absolute values of EAD and LGD (which might not paint the complete picture about the respective downturn estimates), it has a number of inherent drawbacks:

  • Dependence on credit line assignment policies: A downturn identified only on the basis of the EAD maximum is not necessarily a reflection of distress; it could also be an outcome of a bank's credit line assignment policies.
  • Inability to incorporate "hunger for credit": The current approach does not explicitly take into account the borrower's increased hunger for credit. Someone in dire need of credit will tap all possible resources, leading to a higher EAD.
  • Dependence on credit line decreases by management: Banks may cut credit lines to reduce potential losses if they observe a systematic deterioration of borrower credit quality. EAD is thus determined by the behavior of both borrowers and lenders as they jointly "race to default".
  • Possibility of LGD downturn pre-dating PD downturn: Both the PD and LGD downturns are identified as maximum values within the downturn window, which means the LGD downturn might actually occur before the default rate peak.

To overcome these drawbacks, we propose including the credit conversion factor (CCF) in the downturn identification process. CCF is the ratio of the difference between EAD and the balance at observation to the available credit (also referred to as open-to-buy) as of the observation point:

CCF = (EAD − Balance at observation) / Open-to-buy at observation

Exposure At Default (EAD) for an account is driven by the borrower's hunger for credit. Various exogenous factors such as credit limit, historical spending patterns, and current balance can help in estimating this phenomenon. Of these, available credit limit has proved to be the best indicator for predicting EAD: it is not so much the actual credit limit as the drawdowns on the unused limit that define the borrower's credit hunger. The CCF methodology can be applied to estimating the EAD downturn as follows (a code sketch follows the graphic below):

  • Using historical data, plot the series of portfolio CCF over time
  • Consider peak CCF as downturn point
  • Compare CCF of scoring month with CCF at downturn point
  • Adjust scoring EAD to reflect CCF at the downturn point (CCF Final = Max(CCF Scoring, CCF Downturn))

The same is highlighted in the graphic below.

[Figure: Downturn EAD identification from the portfolio CCF series]
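
A minimal sketch of how these steps could be implemented, assuming illustrative column names ead, balance and open_to_buy; the EAD reconstruction in the second function simply inverts the CCF definition given above:

    import pandas as pd

    def ccf_series(df: pd.DataFrame) -> pd.Series:
        """Portfolio CCF over time: (EAD - balance) / open-to-buy,
        all taken as of the observation point (illustrative column names)."""
        return (df["ead"] - df["balance"]) / df["open_to_buy"]

    def downturn_adjusted_ead(balance: float, open_to_buy: float,
                              ccf_scoring: float, ccf_downturn: float) -> float:
        """Adjust the scoring-month EAD to reflect the downturn CCF."""
        ccf_final = max(ccf_scoring, ccf_downturn)   # CCF Final = Max(CCF Scoring, CCF Downturn)
        return balance + ccf_final * open_to_buy     # EAD implied by the CCF definition

    # Usage sketch:
    # ccf = ccf_series(history)                      # step 1: CCF series over time
    # ccf_downturn = ccf.max()                       # step 2: peak CCF as downturn point
    # ead = downturn_adjusted_ead(bal, otb, ccf_now, ccf_downturn)   # steps 3 and 4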

Loss given default for an account depends as much on the ability to pay as on the intention to pay. It is safe to assume that a defaulter who has utilized a large share of the available credit will not be paying back a major chunk of the balance. Applying this concept, we can deduce a relationship between CCF and the recovery rate. However, the problem of stale data (actual recoveries are only observed once the workout period is complete, so LGD for recent months is not yet available) cannot be mitigated using CCF directly. To overcome this, we suggest establishing a relationship between CCF and LGD, which can be used to forecast LGD values for the workout window and thus provide recovery estimates for the recent period. The steps, sketched in code after the graphic below, are as follows:

  • Create series of both actual LGD and observed CCF over time
  • Establish a statistical relationship between the two using linear regression or time series modeling
  • Use model equations to forecast LGD for months with actual CCF available
  • Select the maximum LGD from the pool of both actual and forecasted LGD as downturn LGD
  • Compare the downturn LGD with that of the scoring month and adjust to reflect the worst case scenario

The same is shown in the graphic below:

[Figure: LGD downturn identification using CCF-based forecasts]
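
A rough sketch of these LGD steps, using an ordinary least-squares fit of LGD on CCF; the series names, index alignment, and the choice of a plain linear fit are illustrative assumptions, not prescriptions from the whitepaper:

    import numpy as np
    import pandas as pd

    def downturn_lgd(lgd_actual: pd.Series, ccf: pd.Series, lgd_scoring: float) -> float:
        """Forecast LGD from CCF with a simple linear fit and take the worst
        case across actual, forecasted and scoring-month LGD.

        `lgd_actual` covers months with a completed workout; `ccf` covers all
        months and shares the same index labels (illustrative assumption)."""
        observed = lgd_actual.index
        # LGD ~ a + b * CCF, fitted on months where actual LGD is available
        b, a = np.polyfit(ccf.loc[observed].to_numpy(), lgd_actual.to_numpy(), deg=1)

        # Forecast LGD for recent months that have a CCF but no completed workout yet
        recent = ccf.index.difference(observed)
        lgd_forecast = a + b * ccf.loc[recent]

        # Downturn LGD is the maximum over actual and forecasted LGD ...
        lgd_downturn = pd.concat([lgd_actual, lgd_forecast]).max()
        # ... adjusted to reflect the worst case versus the scoring month
        return max(lgd_downturn, lgd_scoring)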

We believe the CCF approach is more risk sensitive (and granular), as it takes into account the customer's hunger for credit, the intention to pay, and management's response to downturn conditions when assessing the downturn period. Given that Basel guidelines encourage banks to look for more granular techniques, as these may provide more conservative downturn estimates, we believe the CCF methodology is more apt for downturn estimation.

Click here to read the complete whitepaper.


Read More

Mortgage Servicer - You’ve Got Information!

In light of the crisis, the mortgage industry has been under tremendous pressure to manage its business in a more data-driven manner. There are both operational and regulatory imperatives to understand the borrower better, maintain procedural controls, and improve transparency.

Read More

Effective Quality Assurance for Banks

Over the years, the financial systems and networks enabling banks' business operations have grown in scope, scale, and complexity. To manage internal processes such as customer acquisition, existing customer management, and collections, as well as risk strategies such as authorization, CLI/CLD, and payment holds, banks have adopted newer enterprise-grade platforms such as Visionplus (First Data), TS2 (TSYS), TRIAD (FICO), Blaze Advisor (FICO), and Strategy Design Studio (Experian).

Read More

How to select a stress testing methodology?

As the world crawls out of the global economic crisis, several efforts are underway by regulators as well as international organizations, such as the International Monetary Fund (IMF) and the Bank for International Settlements, to institutionalize stress testing as an integral part of banks' functioning. The intent is to better understand system-wide risks that can trigger widespread economic and financial instability. The US Fed has therefore mandated an annual Comprehensive Capital Analysis and Review (CCAR) exercise in which all banks submit their capital plans for multiple scenarios (baseline and stressed).

Read More

Are you always delayed in processing your claims?

Processing claims quickly and accurately is one of the biggest challenges payers face in the healthcare industry today. Beyond the sheer volume of claims, multiple and incompatible systems requiring significant manual hand-offs have made the timely disposal of claims the single biggest burden on payers' operating costs. Add to this the incorrectly filed claims that payers must handle, which choke bandwidth through re-processing, and the problem becomes even more acute. With state-specific regulations penalizing such delays, it has become a matter of survival for payers to find the optimal trade-off between the analysis a claim warrants and the time it takes to get it through the system.

Read More
