Browse Research

Viewing 1376 to 1400 of 7690 results
2010
The Federal Housing Administration (FHA) insures mortgages against the risk of foreclosure. Since its inception in 1934, FHA has insured over 37 million mortgages on single-family homes. This requires FHA to store data on a large number of mortgages. Because of the large number of mergers/acquisitions among mortgage lenders, it is sometimes difficult for the surviving lenders to maintain accurate databases.
2010
Motivation: Since 2007 a global financial crisis has been unfolding. The crisis was initially caused by defaults on subprime loans, aided and abetted by pools of asset-backed securities and credit derivatives, but corporate defaults, such as that of Lehman Brothers, and outright fraud have also contributed to the crisis. Little research has been published investigating the role of data issues in various aspects of the financial crisis.
2010
Motivation: Provide a reference guide to open-source tools for text mining. Method: We apply the text-processing language Perl and the statistical language R to two text databases, an accident description database and a survey database.
2010
2010 Spring CAS E-Forum The E-Forum replaces the traditional printed Forum as the means to disseminate non-refereed research papers to the actuarial community. The CAS will no longer distribute the Forum in hard copy format. The CAS is not responsible for statements or opinions expressed in the papers in the E-Forum. These papers have not been peer reviewed by any CAS Committee.
2010
Implementing a properly functioning Enterprise Risk Management (ERM) program has become increasingly important for insurance companies. Unlike traditional risk management where individual risks are managed in separate silos, ERM is based on the concept of managing all relevant risks in an integrated, holistic fashion.
2010
This article shows how to apply the theory of order statistics to estimate confidence intervals for quantile-based risk measures, a class that includes the VaR, expected shortfall, and coherent, convex, and spectral risk measures.
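The order-statistics approach the article describes can be illustrated with a minimal, distribution-free sketch (not the article's exact construction): the rank of a sample quantile follows a binomial law, so two order statistics chosen from binomial quantiles bracket the true quantile (here, a VaR) with the desired coverage. The function name `quantile_ci` and the simulated data are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def quantile_ci(sample, q, alpha=0.05):
    """Distribution-free CI for the q-quantile (e.g. VaR) via order statistics.

    The number of observations below the true q-quantile is Binomial(n, q),
    so [x_(l), x_(u)] with l, u taken from binomial quantiles covers the true
    quantile with probability of roughly 1 - alpha.
    """
    x = np.sort(np.asarray(sample))
    n = len(x)
    # Binomial quantiles give the order-statistic ranks bracketing the quantile.
    l = int(stats.binom.ppf(alpha / 2, n, q))
    u = int(stats.binom.ppf(1 - alpha / 2, n, q)) + 1
    l = max(l, 0)
    u = min(u, n - 1)
    return x[l], x[u]

rng = np.random.default_rng(42)
losses = rng.standard_normal(5000)
lo, hi = quantile_ci(losses, 0.99)  # approx. 95% CI for the 99% VaR
```

For a standard normal sample the interval should straddle the true 99% quantile (about 2.33); the same mechanics apply to expected shortfall and spectral measures via weighted combinations of order statistics.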
2010
Value-at-Risk (VaR) and conditional value-at-risk (CVaR) are important risk measures. VaR in particular is popular and widespread in risk management and banking supervision. However, VaR has some unwelcome properties that CVaR does not share, so CVaR is preferable from a theoretical point of view. Both VaR and CVaR are discussed for long and short positions.
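The contrast between the two measures is easy to see empirically. The sketch below (an illustration, not the paper's methodology) computes empirical VaR and CVaR from a P&L sample; a short position is just the sign-flipped P&L, and CVaR is by construction at least as large as VaR.

```python
import numpy as np

def var_cvar(pnl, alpha=0.99):
    """Empirical VaR and CVaR of a P&L sample at confidence level alpha.

    Losses are -pnl. VaR is the alpha-quantile of the loss distribution;
    CVaR is the average loss given that the loss is at least the VaR.
    """
    losses = -np.asarray(pnl)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

rng = np.random.default_rng(0)
pnl = rng.standard_normal(100_000)
v_long, c_long = var_cvar(pnl)      # long position
v_short, c_short = var_cvar(-pnl)   # short position: flip the P&L sign
```

For standard normal P&L the 99% VaR is near 2.33 and the 99% CVaR near 2.67, illustrating that CVaR always sees past the quantile into the tail, which is the source of its better theoretical properties (coherence).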
2010
We examine discounted penalties at ruin for surplus dynamics driven by a general spectrally negative Lévy process, a natural class of stochastic processes that contains many of the risk processes already considered in the literature. Following from the important contributions of [Zhou, X., 2005. On a classical risk model with a constant dividend barrier. North Am. Act. J.
2010
We introduce the formalism of generalized Fourier transforms in the context of risk management. We develop a general framework in which to efficiently compute the most popular risk measures, value-at-risk and expected shortfall (also known as conditional value-at-risk). The only ingredient required by our approach is the knowledge of the characteristic function describing the financial data in use.
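The characteristic-function route the abstract describes can be sketched with the classical Gil-Pelaez inversion formula, which recovers a CDF from a characteristic function; VaR then follows by root-finding on that CDF. This is a simplified stand-in for the paper's generalized-Fourier framework, and the truncated trapezoidal integration and function names are assumptions of this sketch.

```python
import numpy as np

def cdf_from_cf(x, cf, u_max=50.0, n=20_000):
    """Gil-Pelaez inversion: recover F(x) from a characteristic function cf.

    F(x) = 1/2 - (1/pi) * int_0^inf Im(exp(-i*u*x) * cf(u)) / u du,
    approximated here on a truncated trapezoidal grid.
    """
    u = np.linspace(1e-8, u_max, n)
    integrand = np.imag(np.exp(-1j * u * x) * cf(u)) / u
    du = u[1] - u[0]
    integral = du * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))
    return 0.5 - integral / np.pi

def var_from_cf(cf, alpha=0.99, lo=-10.0, hi=10.0, tol=1e-6):
    """VaR at level alpha = the alpha-quantile, found by bisection on the CDF."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf_from_cf(mid, cf) < alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def normal_cf(u):
    # Characteristic function of the standard normal: exp(-u^2 / 2).
    return np.exp(-0.5 * u**2)

var99 = var_from_cf(normal_cf)  # should recover the 99% normal quantile, ~2.326
```

Expected shortfall can be obtained from the same inversion by integrating the recovered quantiles above the VaR level, which is essentially the computation the paper accelerates.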
2010
Mean-Variance (M-V) analysis and the CAPM are derived in the expected utility framework. Behavioural Economists and Psychologists (BE&P) advocate that expected utility is invalid, suggesting Prospect Theory as a substitute paradigm. Moreover, they show that the M-V rule, which is the foundation of the CAPM, is not always consistent with people's choices.
2010
While catastrophe bonds, futures and options have attracted increasing scholarly attention throughout the last two decades, the catastrophe swap, a financial instrument of growing importance for risk managers and investors, has been virtually neglected.
2010
Compensation for victims of catastrophes is a hot topic in many countries today. Consequently, the legislator is increasingly intervening in the catastrophe insurance market in order to stimulate its functioning. Various forms of public-private partnerships have hence developed, although law and economics scholarship has differing views on this type of government intervention.
2010
It is common actuarial practice to calculate premiums and reserves under a set of biometric assumptions that represent a worst-case scenario for the insurer. The new solvency regime of the European Union (Solvency II) also uses worst-case scenarios for the calculation of solvency capital requirements for life insurance business.
2010
Measuring the risk of a financial portfolio involves two steps: estimating the loss distribution of the portfolio from available observations and computing a 'risk measure' that summarizes the risk of the portfolio. We define the notion of 'risk measurement procedure', which includes both of these steps, and introduce a rigorous framework for studying the robustness of risk measurement procedures and their sensitivity to changes in the data set.
2010
This study presents nonparametric estimates of spectral risk measures (SRM) applied to long and short positions in five prominent equity futures contracts. It also compares these to estimates of two popular alternative measures, the Value-at-Risk and Expected Shortfall. The SRMs are conditioned on the coefficient of absolute risk aversion, and the latter two are conditioned on the confidence level.
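A minimal sketch of such a nonparametric estimate follows, assuming (as is common in this literature) an exponential weighting function whose parameter k plays the role of the coefficient of absolute risk aversion; the function name and plotting positions are choices of this sketch, not necessarily the study's.

```python
import numpy as np

def spectral_risk(losses, k=25.0):
    """Nonparametric spectral risk measure with an exponential weight function.

    phi(p) = k * exp(-k * (1 - p)) / (1 - exp(-k)) puts increasing weight on
    the worst outcomes; larger k means more risk aversion. The estimate is a
    phi-weighted average of the loss order statistics.
    """
    x = np.sort(np.asarray(losses))          # ascending: worst losses last
    n = len(x)
    p = (np.arange(1, n + 1) - 0.5) / n      # midpoint plotting positions
    phi = k * np.exp(-k * (1 - p)) / (1 - np.exp(-k))
    w = phi / phi.sum()                      # normalize the discrete weights
    return float(np.dot(w, x))

losses = np.random.default_rng(7).standard_normal(10_000)
s_low = spectral_risk(losses, k=5.0)    # mild risk aversion
s_high = spectral_risk(losses, k=50.0)  # strong risk aversion
```

Because the weights shift toward the largest order statistics as k grows, the estimate increases with risk aversion, which is what makes conditioning SRMs on k (rather than on a confidence level, as with VaR and ES) natural.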
2010
This paper re-examines the predictive ability of the consumption-wealth ratio (cay) on the equity premium using hand-collected annual data spanning one century for four major economies. In addition to statistical tests of out-of-sample forecast accuracy, we measure the economic value of the predictive information in cay in a stylized asset allocation strategy.
2010
The problem of optimal excess of loss reinsurance with a limiting and a retention level is considered.
2010
Historically, flood risk management in the United Kingdom has mainly concentrated on river and coastal flooding, yet flooding from surface water runoff is a risk to urban areas.
2010
This paper examines the impact of capital-based regulation on the insurer's risk and capital adjustments in the US property-liability insurance industry. We conduct the three-stage least squares (3SLS) procedure to estimate a simultaneous equations model. The key finding is that undercapitalized insurers increase capital to avoid regulatory costs and take more risks to generate higher returns.
2010
We compare capital requirements derived from tail conditional expectation (TCE) with those derived from the tail conditional median (TCM). In theory, TCE is higher than TCM for most distributions commonly used in finance and at fixed confidence levels; however, we find that in empirical data, there is no clear-cut relationship between the two.
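The two quantities are straightforward to compare on a sample; the sketch below is an illustration of the definitions, not the paper's empirical procedure. Both condition on losses exceeding the VaR at level alpha; TCE takes the tail mean, TCM the tail median.

```python
import numpy as np

def tce_tcm(losses, alpha=0.99):
    """Tail conditional expectation and tail conditional median at level alpha.

    Condition on losses exceeding the alpha-quantile (the VaR): TCE is the
    mean of the exceedances, TCM their median.
    """
    x = np.asarray(losses)
    var = np.quantile(x, alpha)
    tail = x[x > var]
    return tail.mean(), np.median(tail)

rng = np.random.default_rng(1)
normal_losses = rng.standard_normal(100_000)
tce, tcm = tce_tcm(normal_losses)
```

For a normal loss distribution the conditional tail is right-skewed, so TCE exceeds TCM (about 2.67 versus 2.58 at the 99% level), consistent with the theoretical ordering; the paper's point is that empirical data need not respect it.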
2010
Recently there has been an increasing trend in the quantitative finance community to call for statistical models that explicitly model returns with non-normal probability distributions (e.g. Sheikh and Qiao, 2009; Bhansali, 2008; Harvey and Siddique, 2004). In this paper, we explain why summary rejection of normal distributions is almost always ill-advised.
2010
This article identifies the relation between earnings quality and the cost of equity capital of Tunisian listed firms. Using Fama and French's (1993) model, we find a statistically significant relationship between our proxies for earnings quality and the cost of equity capital. This result supports theoretical models predicting that investors are interested in information that correctly reflects the firm's financial situation.