Search results

1 – 10 of 21
Article
Publication date: 3 July 2017

Lukas Prorokowski

To explain the shadow banking regime that will be enforced in the European Union by local regulators starting in January 2017.


Abstract

Purpose

To explain the shadow banking regime that will be enforced in the European Union by local regulators starting in January 2017.

Design/methodology/approach

Recognising the regulatory-induced difficulties in the process of identifying certain types of clients (investment funds) as shadow banking entities, this article provides a decision tree for the shadow banking classification process in order to aid the impacted institutions with the assessment of their clients. With this in mind, the article advises the impacted institutions on the specific steps that should be taken when assessing investment funds for shadow banking flags. Furthermore, the article provides insights into the information required to conduct the shadow banking classification process.

Findings

The regime requires the impacted institutions to assess their clients for shadow banking flags in order to impose limits on credit lines to clients classified as shadow banking entities. The US regulatory jurisdiction will be impacted over a longer term.

Originality/value

The recommendations in this article will be especially useful for investment funds to ensure that the relevant information is clearly stated in their prospectuses in order to avoid being classified as shadow banking entities.

Article
Publication date: 21 March 2019

Lukasz Prorokowski, Hubert Prorokowski and Georgette Bongfen Nteh

This paper aims to analyse the recent changes to the Pillar 2 regulatory-prescribed methodologies to classify and calculate credit concentration risk. Focussing on the Prudential…

Abstract

Purpose

This paper aims to analyse the recent changes to the Pillar 2 regulatory-prescribed methodologies to classify and calculate credit concentration risk. Focussing on the Prudential Regulation Authority’s (PRA) methodologies, the paper tests the susceptibility to bias of the Herfindahl–Hirschman Index (HHI). The empirical tests serve to assess the assumption that the regulatory classification of exposures within the geographical concentration is subject to potential misuse that would undermine the PRA’s objective of obtaining risk sensitivity and improved banking competition.

Design/methodology/approach

Using the credit exposure data from three global banks, the HHI methodology is applied to the portfolio of geographically classified exposures, replicating the regulatory exercise of reporting credit concentration risk under Pillar 2. In doing so, the validity of the aforementioned assumption is tested by simulating the PRA’s Pillar 2 regulatory submission exercise with different scenarios, under which the credit exposures are assigned to different geographical regions.
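The index itself is straightforward to state; below is a minimal Python sketch of the scenario comparison described above. The region names and exposure amounts are hypothetical, not the surveyed banks' data, and serve only to illustrate how reassigning the same exposures to different geographical buckets moves the HHI.

```python
# Illustrative HHI for geographical credit concentration.
# Region names and exposure amounts are hypothetical.

def hhi(exposures):
    """Sum of squared portfolio shares; 1.0 means fully concentrated."""
    total = sum(exposures.values())
    return sum((amount / total) ** 2 for amount in exposures.values())

# Scenario A: Eastern European EU member states mapped into "Europe"
scenario_a = {"Europe": 700.0, "Central Asia": 200.0, "Americas": 100.0}

# Scenario B: the same exposures with Eastern Europe broken out
scenario_b = {"Western Europe": 450.0, "Eastern Europe": 250.0,
              "Central Asia": 200.0, "Americas": 100.0}

print(hhi(scenario_a))  # about 0.54
print(hhi(scenario_b))  # about 0.315; remapping alone lowers the index
```

Comparing the two scenarios mirrors the simulated submission exercise: moving the same exposures between regional buckets changes the HHI, and hence the concentration add-on, without any change in the underlying risk.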

Findings

The paper empirically shows that changing the geographical mapping of the Eastern European EU member states can result in a substantial reduction of the Pillar 2 credit concentration risk capital add-on. These empirical findings hold only for the banks with large exposures to Eastern Europe and Central Asia. The paper reports no material impact for the well-diversified credit portfolios of global banks.

Originality/value

This paper reviews the PRA-prescribed methodologies and the Pillar 2 regulatory guidance for calculating the capital add-on for the single name, sector and geographical credit concentration risk. In doing so, this paper becomes the first to test the assumptions that the regulatory guidance around the geographical breakdown of credit exposures is subject to potential abuse because of the ambiguity of the regulations.

Details

Journal of Financial Regulation and Compliance, vol. 27 no. 3
Type: Research Article
ISSN: 1358-1988


Article
Publication date: 6 July 2020

Lukasz Prorokowski, Oleg Deev and Hubert Prorokowski

The use of risk proxies in internal models remains a popular modelling solution. However, there is some risk that a proxy may not constitute an adequate representation of the…

Abstract

Purpose

The use of risk proxies in internal models remains a popular modelling solution. However, there is some risk that a proxy may not constitute an adequate representation of the underlying asset in terms of capturing tail risk. Therefore, using empirical examples for the financial collateral haircut model, this paper aims to critically review available statistical tools for measuring the adequacy of capturing tail risk by proxies used in the internal risk models of banks. In doing so, this paper advises on the most appropriate solutions for validating risk proxies.

Design/methodology/approach

This paper reviews statistical tools used to validate whether an equity index or fund benchmark is a proxy that adequately represents tail risk in the returns on an individual asset (equity/fund). The following statistical tools for comparing the return distributions of proxies and portfolio items are discussed: the two-sample Kolmogorov–Smirnov test, the spillover test and Harrell’s C test.
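Of the tools listed, the two-sample Kolmogorov–Smirnov statistic is the simplest to state: it is the largest vertical gap between the empirical distribution functions of the two return samples. The sketch below computes it on simulated data; the asset and proxy series and their distributional parameters are illustrative assumptions, not the paper's portfolio items.

```python
import numpy as np

def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap
    between the empirical CDFs of two return samples."""
    x, y = np.sort(x), np.sort(y)
    pooled = np.concatenate([x, y])
    cdf_x = np.searchsorted(x, pooled, side="right") / len(x)
    cdf_y = np.searchsorted(y, pooled, side="right") / len(y)
    return float(np.max(np.abs(cdf_x - cdf_y)))

rng = np.random.default_rng(0)
asset = 0.02 * rng.standard_t(df=3, size=1000)        # fat-tailed returns
good_proxy = 0.02 * rng.standard_t(df=3, size=1000)   # same distribution
thin_proxy = rng.normal(scale=0.02, size=1000)        # thinner tails

print(ks_statistic(asset, good_proxy))   # small gap expected
print(ks_statistic(asset, thin_proxy))   # typically a larger gap
```

In practice the statistic would be compared against a critical value at the chosen significance level; a large gap flags a proxy that misrepresents the asset's tail.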

Findings

Upon the empirical review of the available statistical tools, this paper suggests using the two-sample Kolmogorov–Smirnov test to validate the adequacy of capturing tail risk by the assigned proxy and Harrell’s C test to assess the discriminatory power of proxy-based collateral haircut models. This paper also suggests a tool that compares the reactions of risk proxies to tail events to verify possible underestimation of risk in times of significant stress.

Originality/value

The current regulations require banks to prove that the modelled proxies are representative of the real price observations without underestimation of tail risk and asset price volatility. This paper shows how to validate proxy-based financial collateral haircut models.

Details

The Journal of Risk Finance, vol. 21 no. 3
Type: Research Article
ISSN: 1526-5943


Article
Publication date: 12 February 2018

Lukasz Prorokowski

Focusing on delivering practical implications, the purpose of this paper is to show optimal ways of calculating risk weights for public sector entities (PSEs) under the…

Abstract

Purpose

Focusing on delivering practical implications, the purpose of this paper is to show optimal ways of calculating risk weights for public sector entities (PSEs) under the standardised approach in credit risk. Focusing on the changing regulatory background, this paper aims to explain the proposed revisions to the standardised approach for credit risk. Where necessary, upon the review of the forthcoming standards, this paper attempts to indicate room for improvement for policymakers and flag areas of potential ambiguity for practitioners.

Design/methodology/approach

This paper discusses and analyses the revised standards for the standardised approach in credit risk with respect to the treatment of PSEs. This paper, analysing the current regulatory proposals, tests the hypothesis stating that the affected banks may experience higher or lower capital charges for credit risk depending on two factors: choosing the optimal risk weight calculation methodology and choosing the optimal composition of the credit risk portfolio.

Findings

The paper advises on using sovereign ratings as the basis of risk weight calculations and categorising eligible entities as sovereign exposures. Individual entity ratings are not readily available and the majority of PSEs remain unrated by the external agencies. The simplistic approach of using sovereign ratings results in a lower risk-weighted capital than the approach of using individual entity ratings. The sovereign rating approach decreases the value of the original exposure by 77 per cent. Reliance on sovereign ratings outperforms the optimal solution proposed in this paper. Categorisation of eligible entities as sovereign exposures significantly decreases the risk exposure capital in the standardised approach. There are, however, specific criteria highlighted in this paper that must be met by a PSE to be categorised as a sovereign exposure.
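As a rough illustration of why the sovereign-rating approach lowers risk-weighted capital when most entities are unrated, the toy calculation below contrasts the two approaches. The rating buckets, risk weights and PSE exposures are invented stand-ins, not the actual Basel tables or the paper's data.

```python
# Toy comparison of the two risk-weight approaches. The buckets and
# weights are simplified stand-ins for the Basel standardised-approach
# tables, not the actual values; the PSE exposures are invented.

RISK_WEIGHTS = {"AA": 0.20, "A": 0.50, "unrated": 1.00}

def rwa(exposures, rating_for):
    """Risk-weighted assets: each exposure times its rating's weight."""
    return sum(amount * RISK_WEIGHTS[rating_for(name)]
               for name, amount in exposures.items())

exposures = {"PSE-1": 100.0, "PSE-2": 100.0, "PSE-3": 100.0}
entity_ratings = {"PSE-1": "A", "PSE-2": "unrated", "PSE-3": "unrated"}

# Approach 1: individual entity ratings (most PSEs are unrated)
rwa_entity = rwa(exposures, lambda name: entity_ratings[name])
# Approach 2: eligible PSEs inherit the sovereign's rating ("AA")
rwa_sovereign = rwa(exposures, lambda name: "AA")

print(rwa_entity, rwa_sovereign)  # sovereign approach yields the lower RWA
```

Because unrated entities attract the highest weight, a portfolio dominated by unrated PSEs benefits most from inheriting a well-rated sovereign's bucket, which is the mechanism behind the reduction reported above.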

Originality/value

In addition to testing various scenarios of calculating risk weights, this paper highlights regulatory areas that require further improvements and immediate attention from the policymakers and practitioners. At this point, the paper reports that the proposed changes to the risk weight buckets for PSE exposures may be erroneous, resulting from typos in the second consultative paper.

Details

Journal of Financial Regulation and Compliance, vol. 26 no. 1
Type: Research Article
ISSN: 1358-1988


Article
Publication date: 8 July 2014

Lukasz Prorokowski and Hubert Prorokowski

This paper, based on case studies with five universal banks from Europe and North America, aims to investigate which types of comprehensive risk measure (CRM) models are being…

Abstract

Purpose

This paper, based on case studies with five universal banks from Europe and North America, aims to investigate which types of comprehensive risk measure (CRM) models are being used in the industry, the challenges being faced in implementation and how they are being currently rectified. Undoubtedly, CRM remains the most challenging and ambiguous measure applied to the correlation trading book. The turmoil surrounding the new regulatory framework boils down to the Basel Committee implementing a range of capital charges for market risk to promote “safer” banking in times of financial crisis. This report discusses current issues faced by global banks when complying with the complex set of financial rules imposed by Basel 2.5.

Design/methodology/approach

The current research project is based on in-depth, semi-structured interviews with five universal banks to explore the strides major banks are taking to introduce CRM modelling while complying with the new regulatory requirements.

Findings

There are three measures introduced by the Basel Committee to serve as capital charges for market risk: incremental risk charge; stressed value at risk and CRM. All of these regulatory-driven measures have met with strong criticism for their cumbersome nature and extremely high capital charges. Furthermore, with banks facing imminent implementation deadlines, all challenges surrounding CRM must be rectified. This paper provides some practical insights into how banks are finalising the new methodologies to comply with Basel 2.5.

Originality/value

The introduction of CRM and regulatory approval of new internal market risk models under Basel 2.5 has exerted strong pressure on global banks. The issues and computational challenges surrounding the implementation of CRM methodologies are currently fiercely debated among the affected banks. With little guidance from regulators, it remains very unclear how to implement, calculate and validate CRM in practice. To this end, a need for a study that sheds some light on practices with developing and computing CRM emerged. On submitting this paper to the journal, we have received news that JP Morgan is to pay four regulators $920 million as a result of a CRM-related scandal.

Details

Journal of Financial Regulation and Compliance, vol. 22 no. 3
Type: Research Article
ISSN: 1358-1988


Article
Publication date: 4 November 2014

Lukasz Prorokowski and Hubert Prorokowski

The purpose of this paper is to outline how banks are coping with the new regulatory challenges posed by stressed value at risk (SVaR). The Basel Committee has introduced three…


Abstract

Purpose

The purpose of this paper is to outline how banks are coping with the new regulatory challenges posed by stressed value at risk (SVaR). The Basel Committee has introduced three measures of capital charges for market risk: incremental risk charge (IRC), SVaR and comprehensive risk measure (CRM). This paper is designed to analyse the methodologies for SVaR deployed at different banks to highlight the SVaR-related challenges stemming from complying with Basel 2.5. This revised market risk framework came into force in Europe in 2012. Among the wide range of changes is the requirement for banks to calculate SVaR at a 99 per cent confidence interval over a period of significant stress.
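The headline requirement, a 99 per cent VaR computed over a stressed window, can be sketched with historical-simulation-style code. The return series below are simulated for illustration; a real implementation would use the bank's actual P&L vectors over an identified twelve-month stress period.

```python
import numpy as np

def var_99(returns):
    """99% VaR by historical simulation: the loss exceeded on only
    1% of days, reported as a positive number."""
    return float(-np.percentile(returns, 1))

rng = np.random.default_rng(42)
calm_period = rng.normal(0.0, 0.01, size=250)      # ordinary volatility
stressed_period = rng.normal(0.0, 0.03, size=250)  # crisis-level volatility

print(var_99(calm_period))      # plain VaR
print(var_99(stressed_period))  # SVaR: materially higher
```

The capital impact follows directly: calibrating the same percentile on a stressed window rather than the recent past raises the measure, which is why SVaR attracted criticism for its high capital charges.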

Design/methodology/approach

The current research project is based on in-depth, semi-structured interviews with nine universal banks and one financial services company to explore the strides major banks are taking to implement SVaR methodologies while complying with Basel 2.5.

Findings

This paper focuses on strengths and weaknesses of the SVaR approach while reviewing peer practices of implementing SVaR modelling. Interestingly, the surveyed banks have not indicated significant challenges associated with implementation of SVaR, and the reported problems boil down to dealing with the poor quality of market data and, as in cases of IRC and CRM, the lack of regulatory guidance. As far as peer practices of implementing SVaR modelling are concerned, the majority of the surveyed banks utilise historical simulations and apply both the absolute and relative measures of volatility for different risk factors.

Originality/value

The academic studies that explicitly analyse challenges associated with implementing the stressed version of VaR are scarce. Filling in the gap in the existing academic literature, this paper aims to shed some explanatory light on the issues major banks are facing when calculating SVaR. In doing so, this study adequately bridges theory and practice by contributing to the fierce debate on compliance with Basel 2.5.

Details

Journal of Financial Regulation and Compliance, vol. 22 no. 4
Type: Research Article
ISSN: 1358-1988


Article
Publication date: 2 November 2015

Lukas Prorokowski and Hubert Prorokowski

BCBS 239 sets out a challenging standard for risk data processing and reporting. Any bank striving to comply with the principles will be keen to introspect how risk data is…

Abstract

Purpose

BCBS 239 sets out a challenging standard for risk data processing and reporting. Any bank striving to comply with the principles will be keen to examine how its risk data is organised and what execution capabilities are at its disposal. With this in mind, the current paper advises banks on the growing number of solutions, tools and techniques that can be used to support risk data management frameworks under BCBS 239.

Design/methodology/approach

This paper, based on a survey with 29 major financial institutions, including G-SIBs and D-SIBs from diversified geographical regions such as North America, Europe and APAC, aims to advise banks and other financial services firms on what is needed to become ready and compliant with BCBS 239. This paper discusses best practice solutions for master data management, data lineage and end user implementations.

Findings

The primary conclusion of this paper is that banks should not treat BCBS 239 as yet another compliance exercise. The BCBS 239 principles constitute a driving force to restore viability and improve risk governance. In light of the new standards, banks can benefit from making significant progress towards risk data management transformation. This report argues that banks need to invest in a solution that empowers those who use the data to manage risk data. Thus, operational complexities are lifted and no data operations team is needed for proprietary coding of the data. Only then will banks stay abreast of the competition, while becoming fully compliant with the BCBS 239 principles.

Practical implications

As noted by Prorokowski (2014), “Increasingly zero accountability, imposed, leveraged omnipresent vast endeavors, yielding ongoing understanding […] of the impact of the global financial crisis on the ways data should be treated” sparked off international debates addressing the need for an effective solution to risk data management and reporting.

Originality/value

This paper discusses the forthcoming regulatory change that will have a significant impact on the banking industry. The Basel Committee on Banking Supervision published its Principles for effective risk data aggregation and risk reporting (BCBS 239) in January 2013. The document contains 11 principles that Global Systemically Important Banks (G-SIBs) will need to comply with by January 2016. The BCBS 239 principles are regarded as the least known components of the new regulatory reforms. As it transpires, the principles require many banks to undertake a significant amount of technical work and investments in IT infrastructure. Furthermore, BCBS 239 urges financial services firms to review their definitions of the completeness of risk data.

Details

Journal of Investment Compliance, vol. 16 no. 4
Type: Research Article
ISSN: 1528-5812


Article
Publication date: 28 October 2014

Lukas Prorokowski

This paper aims to investigate whether enhanced requirements result in the depositories exiting the business. Furthermore, this paper attempts to analyse prospective changes to…

Abstract

Purpose

This paper aims to investigate whether enhanced requirements result in the depositories exiting the business. Furthermore, this paper attempts to analyse prospective changes to the operating structures caused by the Alternative Investment Fund Managers Directive (AIFMD). Most importantly, this paper discusses the processes to evaluate and manage counterparty risk relating to prime brokers. AIFMD makes fundamental changes to the depository liability and managing counterparty risk by making a depository bank liable for any losses to investor assets, even those held within third-party custodians appointed by the depository. Depositories will also need to calculate the probability of default of their sub-custodians and use complex credit models to calculate any capital requirements under the fourth Capital Requirements Directive (CRD IV).

Design/methodology/approach

This paper is based on an insightful secondary analysis of the AIFMD with practical implications drawn for depository banks. The analysis of this topical research has been broken down into the following sections: assessing and managing counterparty risk of prime brokers; insurance against defaults of prime brokers; and regulatory-driven challenges and changes to depository banks.

Findings

The post-Lehman banking industry has realised that counterparty risk cannot be ignored. This has triggered heated debates among regulators and practitioners whereby any depository bank should clearly separate the assets of its clients from the depository assets and its own assets. This paper argues that the custodian services will witness consolidation with the big players remaining and small custodians forced to leave the business in light of the enhanced liabilities under the AIFMD. In addition to this, this paper has stressed that assessing counterparty risk should be supported by an insightful analysis of the culture of a prime broker; its legal, structural and regulatory safeguards; and quality of assets. Moreover, managing risks associated with prime brokers entails significant costs to depositories. Thus, depository banks are advised to factor these costs into their pricing models.

Originality/value

Given the magnitude of recent regulatory initiatives and complex challenges faced by depositories, an important question arises whether depository banks would exit the business in light of the regulatory-induced liabilities. This paper addresses the aforementioned question and provides practical implications into managing emerging risks by depository banks. At this point, the majority of depositories are in a process of developing in-house solutions for managing risks related to prime brokers, and hence would benefit from practical insights into these processes that are provided in this paper.

Details

Journal of Investment Compliance, vol. 15 no. 4
Type: Research Article
ISSN: 1528-5812


Article
Publication date: 16 November 2015

Lukasz Prorokowski

This paper aims to discuss ideas of factoring in external loss data to the internal loss data sets to obtain a true picture of operational losses for non-bank financial services…

Abstract

Purpose

This paper aims to discuss ideas of factoring in external loss data to the internal loss data sets to obtain a true picture of operational losses for non-bank financial services firms, focusing on a case study of the interdealer brokers business and a specific Basel II category of the operational risk capital charges. As it transpires, financial services firms are increasingly required by regulators to merge external loss data with their internal data sets when using a loss distribution approach. However, there is a significant constraint on the availability and completeness of the external data for non-bank financial services firms.

Design/methodology/approach

A modified Kaplan-Meier method offers a practical way of factoring external loss data into the internal data set. It allows non-bank financial firms to choose which fragments of the data constitute “the best fit”. In choosing the external data, this paper posits that such firms need to rely on loss-type events that display similar patterns in probabilities of occurrence. This method eliminates over-reliance on external data that are specific to a different entity. One of the most important assumptions underpinning the method presented in this paper is that constant time intervals between the recorded operational loss events are assumed. Here, reaching a certain level of loss is used as the event of interest in both groups. For simplification purposes, and to eliminate noise while capturing significant losses, this level is set as a multiple of the interdealer broker’s loss threshold.
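A product-limit (Kaplan-Meier) estimator of the kind invoked above can be sketched as follows. The event times and censoring flags are hypothetical: each time counts the constant intervals elapsed before a loss-threshold breach (the event of interest) or before the record ends (censored).

```python
# Hypothetical event data: each time counts constant intervals before
# a loss-threshold breach (observed=True) or a censored record end.

def kaplan_meier(times, observed):
    """Product-limit estimate of S(t): the probability that the loss
    threshold has not yet been reached by time t."""
    n_at_risk = len(times)
    curve, s = [], 1.0
    # at tied times, process threshold breaches before censorings
    for t, event in sorted(zip(times, observed),
                           key=lambda pair: (pair[0], not pair[1])):
        if event:
            s *= (n_at_risk - 1) / n_at_risk
            curve.append((t, s))
        n_at_risk -= 1   # both events and censorings leave the risk set
    return curve

internal = kaplan_meier([2, 3, 3, 5, 8], [True, True, False, True, False])
external = kaplan_meier([1, 2, 4, 6], [True, True, True, False])
print(internal)   # survival steps for the internal loss record
print(external)   # compare against a candidate external data set
```

Comparing the two survival curves is one way to judge whether an external data fragment displays a similar pattern in the probability of reaching the threshold, and is therefore a plausible "best fit" for merging.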

Findings

Obtaining external loss data is difficult for the non-bank financial services firms. Furthermore, institutions operating as interdealer brokers are exposed to different levels of operational risk that affect their own Advanced Measurement Approach to capital charges under Basel II. The existing consortium data sets are not suitable for non-bank financial institutions. With this in mind, the non-bank firms should select only the parts of the external data that fit their business environment.

Originality/value

This paper should be of interest to any financial services firm that is required by regulators to merge its internal loss data sets with external loss data. Furthermore, this paper makes strong recommendations for regulators who should understand that the contemporary operational risk consortium data sets are not suitable for non-bank financial services firms.

Details

The Journal of Risk Finance, vol. 16 no. 5
Type: Research Article
ISSN: 1526-5943


Article
Publication date: 11 May 2015

Lukasz Prorokowski

This paper aims to discuss the impact of nascent Markets in Financial Instruments Directive (MiFID II) initiatives and, thus, to deliver practical insights into MiFID II…


Abstract

Purpose

This paper aims to discuss the impact of nascent Markets in Financial Instruments Directive (MiFID II) initiatives and, thus, to deliver practical insights into MiFID II implementation, compliance and cost reduction. MiFID II constitutes the backbone for the upcoming financial market reforms. With the first proposal of MiFID drafted in October 2011, this regulatory framework has undergone over 2,000 amendments. As MiFID II currently stands, this Directive attempts to address issues exposed by the global financial crisis.

Design/methodology/approach

This study, based on secondary research and an in-depth analysis of the MiFID II framework, investigates structural and technological challenges entailed by this Directive. The analysis is broken down into the following sections: technological and structural challenges; costs of implementation; MiFID II teams; facilitating near real-time regulatory reporting; increased transparency requirements; and information technology (IT) initiatives for MiFID II compliance.

Findings

MiFID II commands significant changes in business and operating models. With this in mind, the study indicates current technological and structural challenges faced by financial institutions and advises on ways of mitigating MiFID II risks. Although it is too early to assess the costs of implementing MiFID II, this paper suggests ways of reducing MiFID II-related costs. The study also advises on organising dedicated teams to deal with MiFID II. Furthermore, this paper argues that early investments in IT systems and processes would allow financial services firms to gain a competitive advantage and, hence, scoop up market share or launch new, lucrative services – especially in the area of collateralisation and market data processing.

Originality/value

This paper shows that the current version of MiFID II still requires a great deal of attention from the regulators that need to readdress contentious issues revolving around the links between MiFID II and other regulatory frameworks such as European Market Infrastructure Regulation and Dodd–Frank. This study addresses the MiFID II compliance issues by adopting European Union and non-European Union banks’ and asset managers’ perspectives and, hence, delivers practical implications for risk managers and compliance officers of various financial institutions.

Details

Journal of Financial Regulation and Compliance, vol. 23 no. 2
Type: Research Article
ISSN: 1358-1988

