STUDENT SUPERVISION

The accounting standards provide guidelines on how to determine the fair value of a financial asset or liability held at fair value. When considering the fair value of derivative instruments, additional adjustments need to be made for counterparty credit risk. For interest rate swaps, in particular, one needs to calculate the effective exposure of the swap in order to make these adjustments. One of the most popular methods, albeit computationally intensive, is to calculate these exposures through Monte Carlo simulation. In this study an alternative method of calculating the effective exposure using biplot interpolation is proposed. In this method, an analytical approach to approximating the effective exposure profile is implemented by fitting a beta function. The parameters of this beta function are then estimated through biplot interpolation, which in turn approximates the exposure profile. When the performance of the biplot interpolation approach was tested using a standard interval testing approach, the biplot-interpolated profile provided a reasonable approximation of the true profile.
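
A minimal sketch of the idea, with placeholder numbers rather than the study's data: the hump-shaped exposure profile of a swap is approximated by a scaled beta-type function. In the study the beta parameters would come from biplot interpolation; here they are obtained by a simple least-squares fit purely for illustration.

```python
# Sketch only: approximate a swap's expected-exposure profile with a scaled
# beta-type function. The "Monte Carlo" profile below is a stylised placeholder.
import numpy as np
from scipy.optimize import curve_fit

def beta_profile(t, a, b, c):
    """Scaled beta-function shape on normalised time t in (0, 1)."""
    return c * t**(a - 1.0) * (1.0 - t)**(b - 1.0)

t_grid = np.linspace(0.01, 0.99, 20)            # normalised time over the swap's life
mc_ee = 1e6 * np.sqrt(t_grid) * (1.0 - t_grid)  # hypothetical hump-shaped exposure profile

params, _ = curve_fit(beta_profile, t_grid, mc_ee, p0=[1.5, 2.0, 1e6])
approx_ee = beta_profile(t_grid, *params)
print("fitted (a, b, c):", np.round(params, 3))
```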

Valuation is not an interesting problem in corporate finance; it is the only problem. Price and value are assumed to be the same number in economic theories of equilibrium and perfect capital markets. The economic theories of equilibrium asset pricing offer very weak practical suggestions for stock price behaviour at the firm level. The fundamental approach to stock price investing operates on the basis that price and value are two separate quantities and that the stock price is ultimately determined by its intrinsic value. In this research the option-theoretic approach to default modelling is amended to provide an alternative view of value. Structural models apply an option-theoretic approach inspired by Merton (1974) that uses equity market and financial statement data in order to determine default probabilities. Default probabilities obtainable from the reduced-form class of models provide the basis for extending the Merton model to estimate the firm's value from market-observable credit spreads. The probability of default is then a known constant provided by the reduced-form model. The Merton model is reformulated with equity or firm value as the subject of the formula. The reformulated Merton model then provides a unique estimate of the firm's value based on current market information. The expected return on equity is then estimated from market credit spreads using individual capital structure and traded equity information. In this research it was found that historic estimates of return are poor predictors of future return at the firm level. The structural models provide good forecasts of return in some instances, although they present many challenges in implementation. The use of statistical learning methods was found to greatly improve predictions of future equity return movements using both debt and equity predictor variables, including unique predictor variables constructed from the structural models of the firm.
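
For reference, the standard Merton (1974) relationships that such a reformulation builds on, in their textbook form (notation assumed, not quoted from the research):

\[
E_0 = V_0\,N(d_1) - D\,e^{-rT}N(d_2), \qquad
d_{1,2} = \frac{\ln(V_0/D) + \left(r \pm \tfrac{1}{2}\sigma_V^2\right)T}{\sigma_V\sqrt{T}}, \qquad
\mathrm{PD} = N(-d_2),
\]

where V_0 is the firm (asset) value, D the face value of debt, sigma_V the asset volatility and N the standard normal distribution function. If the default probability is supplied by a reduced-form model calibrated to credit spreads, then d_2 = -N^{-1}(PD) is fixed, and the system can be inverted numerically to recover the implied firm or equity value from market-observable information.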

Principal Component Analysis (PCA) biplots are a valuable means of visualising high-dimensional multivariate data. The application of PCA biplots over a wide variety of research areas containing multivariate data is well documented. However, the application of biplots to financial data is limited. This is partly due to PCA being an inadequate means of dimension reduction for multivariate data that is subject to extremes. This implies that its application to financial data is greatly diminished, since extreme observations are key to understanding risk. Hence, the purpose of this research is to develop a method to adapt PCA biplots to multivariate data containing extreme observations. This is achieved by fitting an elliptical copula to the data and deriving a correlation matrix from the copula parameters. The copula parameters are estimated from only the extreme observations and, as such, the derived correlation matrices capture the dependencies of extreme observations. Finally, applying PCA to such an “extremal” correlation matrix more efficiently preserves extremes and allows one to construct PCA biplots.
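
A minimal sketch of this pipeline under simplifying assumptions (simulated placeholder data; the tail-selection rule and the elliptical-copula identity rho = sin(pi*tau/2) are illustrative choices, not necessarily those of the study):

```python
# Sketch only: derive a correlation matrix from extreme observations via the
# Kendall's-tau relationship for elliptical copulas, then apply PCA for a biplot.
import numpy as np
from scipy.stats import kendalltau

def extremal_correlation(X, q=0.95):
    """Correlation implied by an elliptical copula fitted to tail observations."""
    thresholds = np.quantile(X, q, axis=0)
    tail = X[(X > thresholds).any(axis=1)]        # rows with at least one exceedance
    p = X.shape[1]
    R = np.eye(p)
    for i in range(p):
        for j in range(i + 1, p):
            tau, _ = kendalltau(tail[:, i], tail[:, j])
            R[i, j] = R[j, i] = np.sin(0.5 * np.pi * tau)
    return R

rng = np.random.default_rng(0)
X = rng.standard_t(df=4, size=(1000, 5))          # placeholder heavy-tailed returns
R = extremal_correlation(X)
eigval, eigvec = np.linalg.eigh(R)                # PCA of the "extremal" correlation matrix
print("share of variance in first two PCs:", eigval[::-1][:2].sum() / eigval.sum())
```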

During 2014, the International Accounting Standards Board (IASB) implemented a new standard for measuring the fair value of assets through the International Financial Reporting Standards (IFRS) 13 guidance. The newly introduced guidelines have prompted market participants to adjust their valuation of financial positions for material counterparty credit risk (CCR) in the over-the-counter (OTC) market.

 

Five different models are implemented in this research for the purpose of calculating the credit value adjustment (CVA) and debit value adjustment (DVA) of an interest rate swap portfolio between a South African corporate treasurer, Eskom, and a generic South African tier 1 bank. The models range from simple to complex. The Monte Carlo (MC) simulation model is assumed to be the most accurate, since it involves the simulation of expected exposure and the modelling of the short rate.
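
For reference, the discretised unilateral CVA that a simulated exposure profile feeds into, in its standard textbook form (notation assumed, not quoted from the research):

\[
\mathrm{CVA} \approx (1 - R)\sum_{i=1}^{n} \mathrm{EE}(t_i)\, D(0, t_i)\,\big[S(t_{i-1}) - S(t_i)\big],
\]

where R is the counterparty recovery rate, EE(t_i) the expected exposure at time t_i, D(0, t_i) the discount factor and S(t) the counterparty survival probability; DVA is obtained symmetrically from the expected negative exposure and the bank's own survival probabilities.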


Corporate treasurers do not always have the necessary resources to calculate CVA by means of a sophisticated approach. Due to input data and resource challenges, corporate treasurers need to consider creative alternative methods to include CCR in their fair value adjustments. Therefore, semi-analytic methods and input approximation methods were considered in this research. It was found that the simpler semi-analytic approximation methods are not sophisticated enough to deal with the complexity of netting and collateral agreements. They serve as good approximations for quickly estimating a ball-park CVA, but lack the accuracy of the MC-based approach.


Honours supervision

Giliomee, G. 2019. Prediction of Credit Spread Movements in The Context of The South African Market. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

A credit spread is the additional compensation an investor demands for investing in corporate bonds instead of government bonds. This research uses transactional data over a period of approximately 11 years to investigate the determinants of monthly credit spread changes in the context of the South African market. For this period a final bond sample consisting of 390 different bond issues and a total of 2,020 monthly observations was obtained. Each observation was grouped according to the leverage ratio of the issuing company. In the analysis an optimal set of both company-level and market-wide variables, mostly inspired by structural models of default, was used. Initially the analysis was done for all observations across all leverage groups. From this it was observed that the identified set of variables explains at most 27% of the variation in monthly credit spread changes. To study the effect of time to maturity, the observations were subdivided into three maturity groups: Short (less than 4 years), Medium (4 to 8 years) and Long (more than 8 years). The analysis was repeated over all leverage ratios across these three maturity groups, with the adjusted R-squared varying between 0.00% and 66.51%. Further, principal component analysis was applied to the residuals to better understand the unexplained variation. It was observed that more than 40% of this variation was due to the first two principal components, and no dominant latent factor was present in the unexplained variation. Finally, it was concluded that most of the explanatory variables investigated have some ability to explain changes in credit spreads.
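
A minimal sketch of the regression step under assumed column names and an assumed input file (the actual variable set and data are those described above and are not reproduced here):

```python
# Sketch only: OLS regression of monthly credit spread changes on company-level
# and market-wide determinants, reporting the adjusted R-squared per subsample.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("bond_panel.csv")       # hypothetical panel of monthly observations
predictors = ["d_leverage", "d_riskfree_rate", "d_yield_slope",
              "equity_return", "d_equity_vol"]          # illustrative determinants

X = sm.add_constant(df[predictors])
y = df["d_credit_spread"]
fit = sm.OLS(y, X, missing="drop").fit()
print(fit.rsquared_adj)                  # compare across leverage/maturity groups
```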

Rodwell, D.T. & Cronjé, R. 2018. Determining the Intrinsic Value of Cryptocurrencies. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

One of the products of the digital age is cryptocurrencies. The concept of a cryptocurrency was introduced in 2008 with Bitcoin, through a research paper proposing a peer-to-peer electronic payment system. Many variants of Bitcoin have since been created and introduced into the market, all possessing varying functionality and value. The literature available so far on valuing Bitcoin and cryptocurrencies in general is extensively consulted to determine the best possible valuation technique.


This research paper investigates whether cryptocurrencies possess any intrinsic value through a cost-of-production valuation technique. Within the cost-of-production framework a model for Bitcoin and altcoins is presented. By plotting the model price against the market price, the model output is shown to act consistently as a lower bound for the market price. In addition, a Granger causality test is conducted through a multivariate autoregressive model to determine whether the model price causes the market price, in essence providing evidence for the case that the cost of production is a coherent means of determining intrinsic value.
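
A minimal sketch of such a test using statsmodels on placeholder data (the file and column names are assumptions, not the study's series):

```python
# Sketch only: does the cost-of-production model price Granger-cause the market price?
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("bitcoin_prices.csv")                    # hypothetical daily price series
data = np.log(df[["market_price", "model_price"]]).diff().dropna()

# Null hypothesis at each lag: "model_price does NOT Granger-cause market_price"
results = grangercausalitytests(data[["market_price", "model_price"]], maxlag=4)
```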


The Granger test yields inconclusive results, for a number of reasons outlined in this paper. The models presented thus have little statistical backing. However, the output produced by the model in most cases still acted as a lower bound, with convergence occurring after the well-known 2017 bubble.

Buitendag, D. & Pretorius, W.L. 2018. The visualisation of multidimensional financial data through biplots. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

In the modern world of finance, the management of a company is responsible for making decisions based on inputs received from teams within the firm or from external parties. The effectiveness of these inputs depends on managers being able to understand and interpret them. In many cases visual aids such as pie charts and scatterplots are used, but it is believed that there exists a more powerful tool, one that can venture into the world of high-dimensional data. This idea leads to the introduction of the biplot as a medium through which multidimensional data can be displayed. These high-dimensional data sets could vary from risk measures such as Value at Risk (VaR) or expected tail losses, to residuals of different models for estimating volatility, to financial ratios calculated from the management accounts of firms.


The biplot is introduced on a conceptual and theoretical level, followed by an example using a small data set. The two main types used in this research assignment are then presented, namely Principal Component Analysis (PCA) biplots and Canonical Variate Analysis (CVA) biplots. The fundamentals of the two types are explained, followed by two simple plots of the same arbitrary data set to highlight the differences. Methods for dealing with larger data sets are introduced, some of which are used later in the research on a chosen data set. This data set contains the default status of Polish companies for the period 2000-2012, together with 64 financial ratios, observed for 5910 companies. The set is cleaned using outlier treatment methods and put through a variable selection process, which leads to a large reduction in the number of variables used in the analysis. The end result is a set of 11 financial ratios, with a 12th variable being the default status of the companies: 410 companies defaulted and 5500 did not.


Various biplots are constructed from the cleaned set. Due to the imbalance between the defaulted and non-defaulted proportions, samples of equal size are taken from the set. Next, various shapes of the different biplots are compared. Different groupings of strongly correlated ratios are made, and the isolated axes are plotted, showing the varying relations of the variables across default statuses. The cumulative quality of the plots is then shown for different numbers of dimensions, indicating how accurate the prediction would be in the two-dimensional case used in the research. Lastly, density plots are drawn, which are connected to the modelling of default rates in credit risk. The research assignment ends with a conclusion and recommendations for further studies.

Burger, N. & Scholtz, D. 2016. Understanding the developments in derivative pricing with regards to xVA. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

Since the financial crisis of 2008, Credit Value Adjustment (CVA) and Debit Value Adjustment (DVA) have been widely implemented by banks. CVA is the reduction in the risk-free value of over-the-counter (OTC) derivative assets compensating for the probability of counterparty default. DVA is the adjustment made to the value of derivative liabilities to reflect an entity’s own creditworthiness. Regulatory authorities, such as the IASB and Basel, have forced banks to include CVA and DVA when pricing derivatives and to ensure the correct reporting of transactions within their financial statements.

Due to evolving markets, increasing competition and growing pressure on business models, financial institutions now have to account for a series of xVAs in their financial reports. This acronym, xVA, refers to the list of valuation adjustments in general, where a letter referring to the specific adjustment can replace the “x”. This list of adjustments includes CVA, DVA, Funding Value Adjustment (FVA), Collateral Value Adjustment (CollVA), Hedging Value Adjustment (HVA), Liquidity Value Adjustment (LVA), Marginal Value Adjustment (MVA) and Capital Value Adjustment (KVA).

This study clarifies concepts regarding the xVA framework, which provides an ideal platform for derivative trading. The description, calculation and relevance of each xVA are given. In order to understand how to implement xVA within a financial institution, the mechanics behind xVA pricing are examined. After considering the opinions and approaches of the four largest professional services firms and well-known banks within South Africa, it has become clear that CVA and DVA are included when pricing derivatives, but that none of the other xVAs have been fully implemented. Hence, the importance of further research on the topic of xVA derivative pricing has become evident.

Cronje, H.R.D. 2016. Implied Volatility Surfaces in the South African ALSI Market. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

The constant volatility assumption in the Black-Scholes model has led to the development of alternative models. These models capture the variation of market-implied volatility over time; that is, volatility is itself volatile, which can lead to substantial errors in the estimation of option prices. In illiquid markets it is difficult to obtain volatility surfaces due to the lack of data. Volatility surfaces enable investors to price options for any combination of strike price and time to maturity. In this research assignment two different models are implemented in the South African market context. The first is the Stochastic Alpha Beta Rho (SABR) model developed by Hagan et al. (2002) and adjusted by West (2005) to fit illiquid markets. The second is a deterministic approach, developed by Kotzé et al. (2009), in which a quadratic function is fitted to the market data. The SABR model initially did not fit the market data, but the author of this paper proposes an adjustment to the SABR volatilities to obtain a reasonable fit. This research paper found that the quadratic deterministic model yields more accurate results and is easier to implement than the SABR model.
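
A minimal sketch of the deterministic idea with illustrative numbers (not the ALSI data, and a quadratic in moneyness is only one common parameterisation of the approach described above):

```python
# Sketch only: fit implied volatility as a quadratic function of moneyness and
# read interpolated volatilities off the fitted curve.
import numpy as np

moneyness = np.array([0.80, 0.90, 1.00, 1.10, 1.20])   # K / F, placeholder points
impl_vol  = np.array([0.28, 0.24, 0.21, 0.20, 0.21])   # placeholder market implied vols

c2, c1, c0 = np.polyfit(moneyness, impl_vol, deg=2)    # sigma(M) = c0 + c1*M + c2*M^2
sigma = lambda m: c0 + c1 * m + c2 * m**2
print(round(sigma(0.95), 4))                            # interpolated vol at 95% moneyness
```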

Singh, K.R. 2016. Application of Structural Default Probability Models to the South African Market. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

Following the 2008 financial crisis, global regulatory authorities have highlighted the need for transparency in over-the-counter derivative transactions as well as the quantification of counterparty credit risk in the form of, amongst others, credit value adjustments (CVA). Default probabilities are essential to multiple facets of the measurement and management of credit risk and are essential for CVA, which is required by the International Financial Reporting Standards. Default probabilities can be determined from quoted bond prices in the markets or from credit default swap (CDS) spreads. However, bond and CDS market data are not always available and may be particularly complex for the counterparty being evaluated in the transaction. Structural models apply an option-theoretic approach inspired by Merton (1974) that uses equity market and financial statement data in order to determine default probabilities. The research found that the Merton and Delianedis & Geske (D&G) structural models provide limited information regarding the credit risk of firms in the South African market. The low levels of leverage amongst South African firms were found to be a primary reason for the inability of the basic structural models to capture the credit risks associated with the firms. Moreover, the extensions of the Merton (1974) model, although practically challenging to implement, may provide a consistent and reliable manner in which to determine default probabilities from financial statement and equity market data.
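
A minimal sketch of the basic Merton default probability with placeholder inputs (in practice the asset value and volatility would be backed out from equity data, which is not shown here):

```python
# Sketch only: risk-neutral default probability in the basic Merton (1974) model.
import numpy as np
from scipy.stats import norm

def merton_pd(V0, D, sigma_V, r, T):
    """Probability that assets fall below the debt face value D at maturity T."""
    d1 = (np.log(V0 / D) + (r + 0.5 * sigma_V**2) * T) / (sigma_V * np.sqrt(T))
    d2 = d1 - sigma_V * np.sqrt(T)
    return norm.cdf(-d2)

print(merton_pd(V0=120.0, D=100.0, sigma_V=0.25, r=0.07, T=1.0))  # placeholder firm
```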

Perrang, J. & Sebitlo, K.J. 2016. The application of the Nelson & Siegel (1987) model to the South African zero coupon yield curve. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

The importance of the term structure of interest rates to financial market participants has led to the development of mathematical models to explain the shape of the yield curve. One such model, considered in this paper, is the Nelson & Siegel (1987) three-factor model. This model expresses the entire term structure of interest rates through three latent factors. Determining these factors for the South African zero-coupon yield curve over time could prove useful for many purposes. This paper firstly evaluates the fit of the model to the considered term structure of interest rates. Secondly, the latent factors are extracted over time in order to forecast the zero rates. Additionally, it is investigated whether combining these factors with macroeconomic variables provides meaningful forecasts. Finally, the application of the Nelson & Siegel (1987) model for the purpose of extrapolating the term structure of interest rates is critically assessed. The simplicity of the Nelson & Siegel (1987) model compared to various other term structure models may result in it being applied by less quantitatively literate market participants. It was found that the Nelson & Siegel (1987) model provided a good in-sample fit for the South African zero-coupon yield curve and a reasonable out-of-sample forecast. Additionally, for extrapolation purposes, the model performed adequately.
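
For reference, the Nelson & Siegel (1987) curve in its usual parameterisation (notation assumed):

\[
y(\tau) = \beta_0 + \beta_1\,\frac{1 - e^{-\lambda\tau}}{\lambda\tau} + \beta_2\left(\frac{1 - e^{-\lambda\tau}}{\lambda\tau} - e^{-\lambda\tau}\right),
\]

where tau is the time to maturity, beta_0, beta_1 and beta_2 are the level, slope and curvature factors, and lambda controls the rate of exponential decay.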

Bezuidenhout, M. 2015. Goodness of fit for different dynamic hedging strategies. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

Hedging refers to a strategy used to lower the overall risk of a portfolio. There is a broad selection of models available for implementing a dynamic hedging strategy. In this research assignment the Constant Conditional Correlation (CCC), Dynamic Conditional Correlation (DCC) and copula-based GARCH models are constructed and executed to evaluate the most appropriate model for the selected data. Mili and Abid (2004:659) argue that a dynamic hedging strategy leads to greater risk reduction than a static hedging strategy. Various summary statistics are calculated to detect whether any autocorrelation is present in the data; this is used to test for Autoregressive Conditional Heteroskedastic (ARCH) effects. Goodness-of-fit (GOF) tests are performed to interpret the data and to decide on appropriate distributions, after which different information criteria are used in the model selection process. Finally, the parameters of the three models are estimated by maximum likelihood. The results of each model fit are discussed and relevant comparisons are made.

Two hedging strategies are implemented in this study: in the first, the Top 40 index is hedged directly with its own futures; in the second, foreign currency futures are used to cross-hedge the currency exposure of holding foreign equity. It was found that the copula-based GARCH model, which permits nonlinear and asymmetric dependence between the two assets in the cross-hedge portfolio, results in the most appropriate model fit. However, comparing the DCC and CCC-GARCH models, the conclusion is that the DCC-GARCH model is more appropriate than the CCC-GARCH model, implying that the more dynamic a model is, the better it performs.
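
For reference, the minimum-variance hedge ratio that these conditional-covariance models feed into, in its standard form (not quoted from the assignment):

\[
h_t^{*} = \frac{\operatorname{Cov}_{t-1}(r_{s,t}, r_{f,t})}{\operatorname{Var}_{t-1}(r_{f,t})} = \rho_t\,\frac{\sigma_{s,t}}{\sigma_{f,t}},
\]

where r_{s,t} and r_{f,t} are spot and futures returns. The CCC model holds rho_t constant, while the DCC and copula-based GARCH models allow the correlation, and hence the hedge ratio, to vary over time.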

Leuvennink, C. 2015. A literature review comparing IFRS and Basel CVA methodologies. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

The financial crisis caused the attention of the financial world to shift to Counterparty Credit Risk (CCR), since it verified that no counterparty could ever be regarded as risk free. Major topics of discussion at this point were the pricing of CCR in the valuation of derivatives and taking CCR into consideration when determining required capital. This is done through a Credit Value Adjustment (CVA), which is the adjustment made to the value of a derivative to include the credit quality of the counterparty. Even though CVA was used before the crisis, major changes in both IFRS and Basel regulation concerning CCR now require banks to calculate CVA.


This study clarifies the concepts necessary to understand and calculate CVA. The IFRS and Basel regulations regarding CVA are discussed, after which the different approaches used to calculate CVA are considered. Since IFRS does not specify a method of calculation, the approaches in the published guidance of four auditing firms, PwC, EY, Deloitte and KPMG, are discussed.


Differences in the approaches used to calculate CVA for Basel and IFRS were found. This is due to a considerable difference in their objectives. Basel’s main focus is on enhanced risk management by including CVA in capital requirements, while the IASB’s objective is greater transparency and convergence in approaches to fair value calculation. Another major difference is their stance on the Debit Value Adjustment (DVA), which is the adjustment made to fair value if there is a change in the credit quality of the entity itself. Basel does not allow DVA to be included for capital requirement purposes, while from an accounting point of view it is imperative to consider DVA in fair value calculations.

Davids, S.L. 2014. Estimating Default Probabilities from South African Bond Mark to Market Data. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

The increase in awareness of counterparty credit risk and international financial reporting standards has led to the requirement of estimates of the default probability of counterparties in over-the-counter derivative transactions. Default probabilities are necessary for the calculation of credit value adjustments (CVA), which can be seen as the quantification of counterparty credit risk. In this research paper different methods and models for estimating default probabilities are reviewed and a detailed summary of the application of these models in a South African context is given. It was found that credit risk models can, to an extent, be used as a reflection of credit risk. The methodology followed in using South African bond mark-to-market data to obtain a matrix of credit spreads for different rating classes is then outlined, and corresponding default probabilities for the different rating classes are calculated. The methodology, however, is not completely general and the obtained credit spreads are higher than those of previous studies. Overall it was found that South African bond mark-to-market data can be used to infer default probabilities for the purpose of CVA calculations.
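
For reference, a commonly used link between a credit spread and a default probability, under a constant hazard rate and recovery assumption (an illustrative approximation, not necessarily the exact mapping used in the assignment):

\[
\lambda \approx \frac{s}{1 - R}, \qquad
\mathrm{PD}(0, t) = 1 - e^{-\lambda t} = 1 - \exp\!\left(-\frac{s\,t}{1 - R}\right),
\]

where s is the credit spread for the rating class, R the assumed recovery rate and lambda the implied hazard rate.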

Kapp, K.P. 2014. Calculation of Credit Value Adjustment using a series of Swaptions. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

The semi-analytical approach to calculating Credit Value Adjustment (CVA) on an interest rate swap (IRS) provides an alternative to the limited simplistic mark-to-market approach and the resource-intensive Monte Carlo approach. The semi-analytical (or swaption) approach was implemented in this paper using two models: Black's model and the Hull-White one-factor model. Results from using the semi-analytical approach were compared to that of the simplistic method. A relatively thorough analysis was also done on the factors affecting results from using the swaption approach, and the two models were also compared. It was found that the swaption approach provided far more information than the simplistic approach on how exposures are distributed over the two counterparties in an IRS and especially over the lifetime of the swap contract. Model parameters, as well as the term structure of interest rates, were found to have a significant effect on the CVA in each of the models. The Hull-White model, with an additional parameter over Black's model, showed more complex interactions between its parameters and CVA than Black's model did.
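
For reference, the semi-analytic decomposition underlying the swaption approach, in its standard form (notation assumed, not quoted from the paper):

\[
\mathrm{CVA} \approx (1 - R)\sum_{i=1}^{n}\big[S(t_{i-1}) - S(t_i)\big]\,V^{\text{swaption}}_i(0),
\]

where V^swaption_i(0) is today's value of an option, priced under Black's or the Hull-White model, to enter the remaining swap at t_i, S(t) is the counterparty survival probability and R the recovery rate; the swaption value plays the role of the discounted expected exposure at each default date.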

Smit, L. 2014. Estimating Expected Exposure for Counterparty Credit Risk Adjustments. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

Counterparty credit risk (CCR) forms an integral part of the current financial market. CCR is already widely accounted for by banks under the Basel Committee on Banking Supervision guidelines. The International Accounting Standards Board has introduced IFRS 13, which requires all financial entities to use fair value adjustments to account for CCR. In this paper methods to estimate the expected exposure (EE) of interest rate swaps for credit valuation adjustments (CVA) are discussed and compared, in an attempt to find the most appropriate method to be adopted by smaller non-financial organisations.

A model is developed to estimate EE with the Monte Carlo methodology using the Vasicek short rate model. The more computationally intensive Monte Carlo method is then compared to add-on methodologies such as the current exposure method (CEM) and extensions thereof. Thereafter, the CVA values are calculated using each method and compared. Finally, the results of each method are presented and discussed in terms of accuracy and simplicity.
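
A minimal sketch of the Monte Carlo leg under illustrative parameters (the mark-to-market proxy below is a deliberately crude stand-in for repricing the swap on each path, which a full implementation would do properly):

```python
# Sketch only: simulate Vasicek short-rate paths and take EE(t) as the average
# of positive simulated mark-to-market values at each date.
import numpy as np

def vasicek_paths(r0, kappa, theta, sigma, T, steps, n_paths, seed=0):
    """Euler simulation of dr = kappa*(theta - r) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    r = np.full((n_paths, steps + 1), r0)
    for t in range(steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        r[:, t + 1] = r[:, t] + kappa * (theta - r[:, t]) * dt + sigma * dW
    return r

rates = vasicek_paths(r0=0.07, kappa=0.3, theta=0.08, sigma=0.01,
                      T=5.0, steps=60, n_paths=5000)

tenor_left = np.linspace(5.0, 0.0, 61)                 # remaining tenor at each date
mtm = 1e8 * (rates - 0.07) * tenor_left                # hypothetical payer-swap MtM proxy
expected_exposure = np.maximum(mtm, 0.0).mean(axis=0)  # EE(t) = E[max(V_t, 0)]
```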

It was found that the EE estimated with the simple CEM method was inaccurate when compared to the Monte Carlo method. However, by making simple modifications to the CEM method, the accuracy of the estimation can be improved greatly. Therefore, although the Monte Carlo method remains more accurate, smaller non-financial organisations with resource constraints and fewer technical capabilities could use the modified CEM method as an approximation for CVA.

Van Niekerk, A. 2014. The single- and multi-curve approach to swap valuation along with zero-coupon risk free swap curve construction. Unpublished Honours degree assignment. Stellenbosch: Stellenbosch University.

Determining the risk-neutral value of cash collateralised derivatives is important, because the fair value of a cash collateralised derivative is determined by adding the fair value of non-performance risks to the risk-neutral value, where a risk-free rate is needed for discounting (Gregory, 2012:306). Throughout, the cash collateralised derivative considered is a “plain vanilla” interest rate swap, which is assumed to be fully collateralised such that no adjustments need to be made for credit risk.

The 2008 financial crisis led institutions to consider a more appropriate risk-free rate for discounting cash collateralised derivatives in their fair value calculations (Schubert, 2012:28). Thus, there has been a shift in global markets towards the multi-curve approach to swap valuation, where the OIS zero-coupon risk-free curve is used for discounting swaps.

There are markets that have not made the transition to the multi-curve swap valuation approach and instead still use the single-curve approach. This could lead to errors in swap value calculations in these markets. Observed market swap curves are used when calculating the value of a swap, so it is important to know how these swap curves are constructed and how variations in their construction could influence swap values.
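
For reference, the multi-curve valuation of a vanilla payer swap in its standard textbook form (notation assumed):

\[
V_{\text{swap}}(0) = \sum_{i=1}^{n}\delta_i\left(F_i^{\text{fwd}} - K\right)P^{\text{OIS}}(0, t_i),
\]

where F_i^fwd is the forward rate projected from the tenor curve (for example 3-month JIBAR), K the fixed rate, delta_i the accrual fraction and P^OIS(0, t_i) the discount factor from the OIS zero-coupon curve. The single-curve approach uses one curve for both projection and discounting, which is the source of the potential valuation error discussed above.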


Masters supervision
