Search results

1–10 of over 36,000
Article
Publication date: 3 August 2023

Carollyne Maragoni Santos, Eduardo Botti Abbade and Ana Elizabeth Cavalcante Fai

Abstract

Purpose

This study estimates the land footprint, nutrients and monetary value of persimmon loss in Brazil, and consolidates the methodological approach for assessing resources related to food loss.

Design/methodology/approach

It uses data on the harvested area, production, production loss and production value of persimmon in Brazil from 2014 to 2019. The persimmon loss in Brazil was converted into macro- and micronutrients, land use and monetary value.

Findings

The average annual production loss, lost production value and land footprint of persimmon are 35,100 tons, US$12m and 1,673 hectares, respectively. Persimmon loss represents an average annual loss of 6.6bn grams of carbohydrates, 1.6bn grams of dietary fiber, 7.2bn milligrams of vitamin C, 41.8bn micrograms of vitamin A, 4.5bn milligrams of calcium and 54.8bn milligrams of potassium. These nutrients could meet the daily nutritional needs of approximately 135,000, 176,000, 270,000, 164,000, 12,000 and 32m people, respectively.
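The person-equivalent figures above follow from simple arithmetic: divide the annual nutrient total by an assumed daily requirement times 365. A minimal sketch, assuming a daily vitamin C requirement of 75 mg (the requirement value is an assumption for illustration, not taken from the article):

```python
def person_equivalents(annual_nutrient_total: float, daily_requirement: float) -> float:
    """Number of people whose daily need for one nutrient the annual loss could cover."""
    return annual_nutrient_total / (daily_requirement * 365)

# Vitamin C: 7.2bn mg lost per year; assumed daily requirement of 75 mg
print(round(person_equivalents(7.2e9, 75)))  # roughly 263,000 people
```

The same division, with each nutrient's assumed daily requirement, reproduces the order of magnitude of the other person-equivalent figures.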

Practical implications

Through (1) research and innovation; (2) infrastructure development; (3) training and education; (4) collaboration and networking; and (5) market diversification and value addition, people can increase persimmon shelf life, reduce postharvest losses and create a resilient environment for small persimmon farmers. This approach promotes sustainability in the agri-food system and empowers stakeholders.

Originality/value

This investigation helps to understand the value of food loss, considering the use of natural resources, as well as the loss of nutrients and monetary value.

Details

British Food Journal, vol. 125 no. 12
Type: Research Article
ISSN: 0007-070X

Article
Publication date: 24 February 2012

Siamak Daneshvaran and Maryam Haji

Abstract

Purpose

In general, the insurance industry accepts large risks due to the frequency and severity of extreme events. Because of the short record of hazard data for such events, a large amount of uncertainty has to be dealt with. Given this large uncertainty, it is important to better quantify the hazard parameters that are defined as inputs to the catastrophe models. The purpose of this paper is to evaluate hurricane risk in the USA from a loss point of view, for both long-term and warm phase conditions, using a simulation-based stochastic model.

Design/methodology/approach

A Poisson process is used to simulate the occurrence of events for both conditions. The generated event-sets were used along with vulnerability and cost models to estimate the loss to an insurance industry portfolio. The paper discusses the statistics of events categorized by the Saffir-Simpson Hurricane Wind Scale and annualized and return-period losses, and compares the results for the assumed long-term and warm phase climate states.
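The occurrence model described above can be sketched as a compound simulation: draw a Poisson event count for each year, then a loss for each event. The rates and the lognormal severity below are illustrative placeholders, not the paper's calibrated inputs:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_annual_losses(rate: float, n_years: int) -> np.ndarray:
    """Total simulated loss per year: Poisson event counts, lognormal per-event losses."""
    counts = rng.poisson(rate, n_years)
    return np.array([rng.lognormal(0.0, 1.0, size=k).sum() for k in counts])

# Illustrative landfall rates for the two climate states (placeholders)
aal_long_term = simulate_annual_losses(1.7, 100_000).mean()   # long-term condition
aal_warm_phase = simulate_annual_losses(2.0, 100_000).mean()  # warm-phase condition
```

In a compound model of this form the average annual loss scales in proportion to the occurrence rate, which is why a modest frequency difference between climate states translates directly into an AAL difference of similar magnitude.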

Findings

The analysis shows that the population of landfall data for the two climate conditions is not statistically different. However, if we accept that a difference in the frequency of landfall occurrence between the two assumptions exists, the increase in average annual loss is about 17 per cent.

Originality/value

This paper provides insights into the difference between the two states of the atmosphere from the point of view of insured hurricane losses, and is one of the first papers to offer conclusions on the uncertainty associated with warm-phase data.

Article
Publication date: 6 March 2007

Siamak Daneshvaran and Robert E. Morden

Abstract

Purpose

Tornado and hail perils cause large losses every year. Based on data provided by Property Claims Services, since 1949 tornado, hail and straight-line-wind losses have accounted for more than 40 percent of total natural losses in the USA. Given the high frequency of tornadoes and damaging hail in the continental USA, quantifying these risks will be an important advancement in pricing them for insurance/reinsurance purposes. In the absence of a realistic physical model, which would look at these perils on a cluster/outbreak basis, it is not possible to underwrite these risks effectively. The purpose of this paper is to focus on the tornado risk.

Design/methodology/approach

A tornado wind-field model is developed based on the model used by Wen and Ang, calibrated to the specifications given in the Fujita intensity scale. To estimate the tornado hazard, a historical database is generated and de-trended using information provided by the Storm Prediction Center along with the dataset given by Grazulis. This new historical database, with a reinsurance timeframe criterion in mind, was used to define outbreaks. These outbreaks feed a Monte Carlo simulation process that generates a large number of outbreaks representing 35,000 years of simulated data. This event-set is used to estimate spatial frequency contours and to perform loss analyses.

Findings

The results focus on the spatial frequency of occurrence of tornadoes in the USA. Losses are tallied using multiple occurrences of tornado and/or hail per outbreak. The distributions of loss, on both a per-occurrence and an aggregate basis, are discussed.
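The per-occurrence versus aggregate distinction can be made concrete with a small sketch: from a simulated catalog of per-event losses grouped by year, the aggregate basis sums each year's events while the occurrence basis keeps only the largest. The catalog parameters below are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative 35,000-year catalog: each year holds the losses of that year's events
catalog = [rng.lognormal(0.0, 1.5, size=rng.poisson(2.0)) for _ in range(35_000)]

aggregate = np.array([year.sum() for year in catalog])                          # all events
occurrence = np.array([year.max() if year.size else 0.0 for year in catalog])   # largest event

def return_period_loss(annual_values: np.ndarray, rp_years: float) -> float:
    """Empirical loss exceeded on average once every rp_years."""
    return float(np.quantile(annual_values, 1.0 - 1.0 / rp_years))

# At any return period the aggregate loss is at least the per-occurrence loss
print(return_period_loss(occurrence, 250), return_period_loss(aggregate, 250))
```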

Originality/value

This paper is believed to be the first to use a tornado wind-field model, an outbreak model and vulnerability models that estimate both the spatial distribution of hazard and the location-based distribution of losses. Estimation of losses due to hail is also provided.

Details

The Journal of Risk Finance, vol. 8 no. 2
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 1 January 2000

Eduardo Canabarro, Markus Finkemeier, Richard R. Anderson and Fouad Bendimerad

Abstract

Insurance-linked securities can benefit both issuers and investors: they supply insurance and reinsurance companies with additional risk capital at reasonable prices (with little or no credit risk), and they supply investors with excess returns that are uncorrelated with the returns of other financial assets. This article explains the terminology of insurance and reinsurance and the structure of insurance-linked securities, and provides an overview of major transactions. First, there is a discussion of how stochastic catastrophe modeling has been applied to assess the risk of natural catastrophes, including the reliability and validation of the risk models. Second, the authors compare the risk-adjusted returns of recent securitizations on the basis of relative value. Compared with high-yield bonds, catastrophe ("CAT") bonds have wide spreads and very attractive Sharpe ratios. In fact, the risk-adjusted returns on CAT bonds dominate high-yield bonds. Furthermore, since natural catastrophe risk is essentially uncorrelated with market risk, high expected excess returns make CAT bonds high-alpha assets. The authors illustrate this point and show that a relatively small allocation to insurance-linked securities within a fixed income portfolio can enhance the expected return and simultaneously decrease risk, without significantly changing the skewness and kurtosis of the return distribution.
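The closing portfolio claim is standard mean-variance arithmetic: an uncorrelated asset with a higher expected return can raise a portfolio's mean while lowering its volatility. A sketch with illustrative numbers (not the article's figures; zero correlation assumed):

```python
def portfolio_stats(w: float, mu_a: float, mu_b: float,
                    sig_a: float, sig_b: float, rho: float = 0.0):
    """Mean and standard deviation of a two-asset mix: (1 - w) in A, w in B."""
    mu = (1 - w) * mu_a + w * mu_b
    var = ((1 - w) ** 2 * sig_a ** 2 + w ** 2 * sig_b ** 2
           + 2 * w * (1 - w) * rho * sig_a * sig_b)
    return mu, var ** 0.5

# A = fixed income, B = CAT bonds; illustrative returns and volatilities
bonds_only = portfolio_stats(0.00, 0.05, 0.09, 0.06, 0.10)
with_cat = portfolio_stats(0.05, 0.05, 0.09, 0.06, 0.10)
# The 5% CAT allocation raises the mean and, because rho = 0, lowers the stdev
```

With zero correlation, the cross term vanishes and the squared weights shrink both variance contributions, which is why a small allocation can reduce total risk.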

Details

The Journal of Risk Finance, vol. 1 no. 2
Type: Research Article
ISSN: 1526-5943

Article
Publication date: 4 December 2017

Shriniwas Gautam, Antonio L. Acedo Jr, Pepijn Schreinemachers and Bhishma P. Subedi

Abstract

Purpose

The purpose of this paper is to develop a straightforward method to quantify volume and value of postharvest losses in the tomato postharvest value chain in Nepal and estimate the monetary loss shouldered by value chain actors.

Design/methodology/approach

The study combines interview data, to quantify volumes and prices, with produce sampling, to quantify quality losses, at four nodes of the tomato value chain in Nepal (farmers, collectors, wholesalers and retailers) to estimate the volume and value of postharvest losses.

Findings

Almost one-fourth of the total tomato harvest weight that enters the value chain is lost before it reaches consumers, and another one-fifth is traded by the value chain actors at a reduced price due to quality damage. The total volume of postharvest loss (weight and quality loss) is not the same for all value chain actors, and the average monetary loss ranges from 4 percent of gross revenues for farmers to 12 percent for wholesalers.
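The monetary-loss shares above combine two components at each node: physical weight loss valued at full price, plus quality-damaged volume valued at its price discount. A minimal sketch of that accounting, with hypothetical fractions (the function and numbers are illustrative, not the authors' exact formulation):

```python
def monetary_loss_share(weight_loss_frac: float,
                        quality_frac: float,
                        price_discount_frac: float) -> float:
    """Share of a node's gross revenue lost to weight loss plus quality discounting."""
    return weight_loss_frac + quality_frac * price_discount_frac

# Hypothetical node: 5% weight loss, 20% of volume sold at a 30% price discount
print(round(monetary_loss_share(0.05, 0.20, 0.30), 4))  # 0.11, i.e. 11% of gross revenue
```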

Practical implications

Estimates of the monetary value of postharvest losses for horticultural crops lack a standard method that accounts for both physical weight losses and quality losses. Knowing such losses is essential for postharvest technology generation, promotion, and adoption. This study provides a framework that can be adopted and improved in future loss assessment studies for estimating the volume and value of postharvest losses in a horticultural value chain.

Originality/value

The uniqueness of the method used in this study is that it combines interview data to estimate price and volume with produce sampling to quantify quality losses, and does this at four nodes of the value chain: farmers, collectors, wholesalers, and retailers. This method could become a standard approach for assessment of postharvest weight and quality losses and to estimate the monetary value of total postharvest losses in the value chain for horticultural crops.

Details

British Food Journal, vol. 119 no. 12
Type: Research Article
ISSN: 0007-070X

Article
Publication date: 1 June 2000

George K. Chako

Abstract

Briefly reviews previous literature by the author before presenting an original 12-step system integration protocol designed to ensure the success of companies or countries in their efforts to develop and market new products. Looks at the issues from different strategic levels, such as corporate, international, military and economic. Presents 31 case studies, ranging from the success of Japan in microchips to the failure of Xerox to sell its invention of the Alto personal computer three years before Apple; from success in DNA and superconductor research to the success of Sunbeam in inventing and marketing food processors; and from the daring invention and production of atomic energy for survival to the success of sewing machine inventor Howe in co-operating on patents to compete in markets. Includes 306 questions and answers to reinforce the concepts introduced.

Details

Asia Pacific Journal of Marketing and Logistics, vol. 12 no. 2/3
Type: Research Article
ISSN: 1355-5855

Article
Publication date: 24 June 2019

Richard Evans, Geoff Walters and Richard Tacon

Abstract

Purpose

The purpose of this paper is to provide an assessment of the effectiveness of the Salary Cost Management Protocol, a form of financial regulation introduced by the English Football League in 2004 to improve the financial sustainability of professional football (i.e. soccer) clubs.

Design/methodology/approach

The analytical approach is to assess the effect of the regulation from evidence of change in measures of the financial performance of clubs drawing on three criteria: profitability, liquidity and solvency. A unique database was created from the published financial statements and notes to the accounts of the clubs in the Tier 4 league (known since 2004 as League Two) from 1994 to 2014 to encapsulate the 10-year period before and after the regulation was introduced. To show trends in the data within the study period, the data are reported in graphical form. The statistical significance of change in both the slope and intercepts for trends between breaks of interest in the data is estimated by linear regression.
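Estimating change in both slope and intercept at a break can be done with a single OLS fit that includes a post-break dummy and its interaction with time. A sketch of this common break-dummy design on synthetic data (the series below is fabricated for illustration, not the clubs' accounts):

```python
import numpy as np

def trend_break_fit(years: np.ndarray, y: np.ndarray, break_year: int) -> np.ndarray:
    """OLS of y = b0 + b1*t + b2*post + b3*(t*post), where post = 1 from break_year on.
    b2 estimates the intercept change at the break, b3 the slope change."""
    t = (years - years.min()).astype(float)
    post = (years >= break_year).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, t * post])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Synthetic series over 1994-2014 whose slope doubles at the 2004 regulation
years = np.arange(1994, 2015)
y = (years - 1994) + np.where(years >= 2004, (years - 1994) - 10.0, 0.0)
b0, b1, b2, b3 = trend_break_fit(years, y.astype(float), 2004)
# b1 recovers the pre-break slope (1.0) and b3 the slope increase at the break (1.0)
```

Significance of the estimated b2 and b3 would then be judged from their standard errors, as in the study's regressions.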

Findings

The results show that financial regulation failed to significantly improve the profitability or the solvency of football clubs in League Two. Whilst club liquidity improved in response to the financial regulation, the results show the improvement was confined to the year in which the regulation was introduced.

Research limitations/implications

The results extend theoretical debate on financial regulation in sports leagues by moving beyond the assumption that financial regulation is a “technical exercise” to provide an alternative way of thinking about financial regulation as a “legitimising exercise”.

Originality/value

This is the first study to assess the impact of financial regulation on football league clubs over a longitudinal period. It also extends previous research that considered only single aspects of the financial sustainability of football clubs, such as insolvency.

Details

Accounting, Auditing & Accountability Journal, vol. 32 no. 7
Type: Research Article
ISSN: 0951-3574

Article
Publication date: 7 October 2019

Mario Ordaz, Mario Andrés Salgado-Gálvez, Benjamín Huerta, Juan Carlos Rodríguez and Carlos Avelar

Abstract

Purpose

The development of multi-hazard risk assessment frameworks has gained momentum in the recent past. Nevertheless, the common practice with openly available risk data sets, such as the ones derived from the United Nations Office for Disaster Risk Reduction Global Risk Model, has been to assess risk individually for each peril and afterwards to aggregate the results where possible. Although this approach is sufficient for perils with no interaction between them, where such interaction exists and losses can be assumed to occur simultaneously, losses may be underestimated. The paper aims to discuss these issues.

Design/methodology/approach

This paper summarizes a methodology to integrate simultaneous losses caused by earthquakes and tsunamis, with a peril-agnostic approach that can be expanded to other hazards. The methodology is applied in two relevant locations in Latin America, Acapulco (Mexico) and Callao (Peru), considering in each case building-by-building exposure databases with portfolios of different characteristics; the results obtained with the proposed approach are compared against those obtained by direct aggregation of individual losses.
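Why separate-then-aggregate can understate risk when perils strike together can be shown with a toy experiment: keep earthquake and tsunami losses from the same event in the same simulated year, versus shuffling one peril across years as if the perils were independent. All distributions below are illustrative placeholders, not the authors' models:

```python
import numpy as np

rng = np.random.default_rng(7)
n_years = 50_000

# Toy co-occurring losses: the tsunami loss is driven by the same event as the quake
eq = rng.lognormal(0.0, 1.0, n_years)
ts = 0.5 * eq * rng.lognormal(0.0, 0.3, n_years)

simultaneous = eq + ts                    # losses kept together, as in the same event
independent = eq + rng.permutation(ts)    # perils decoupled, then summed

def tail(x: np.ndarray) -> float:
    """Empirical 500-year annual loss."""
    return float(np.quantile(x, 1.0 - 1.0 / 500.0))

# The tail loss is larger when co-occurrence is preserved
print(tail(simultaneous), tail(independent))
```

Preserving co-occurrence concentrates both perils' losses in the same years, which fattens the tail of the annual loss distribution relative to independent aggregation.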

Findings

The fully probabilistic risk assessment framework used herein is the same as that of the global risk model, but applied at a much higher resolution of the hazard and exposure data sets, demonstrating its scalability and the opportunity to refine certain inputs to support decision-making activities related to disaster risk management and reduction.

Originality/value

This paper applies for the first time the proposed methodology in a high-resolution multi-hazard risk assessment for earthquake and tsunami in two major coastal cities in Latin America.

Details

Disaster Prevention and Management: An International Journal, vol. 28 no. 6
Type: Research Article
ISSN: 0965-3562

Details

Energy Economics
Type: Book
ISBN: 978-1-78756-780-1

Article
Publication date: 1 February 2004

Siamak Daneshvaran and Robert E. Morden

Abstract

The insurance industry, in general, accepts large risks due to the combined severity and frequency of catastrophic events; further, these risks are poorly defined given the small amount of data available for extreme events. It is important for the equitable transfer of risk to understand and quantify this risk as accurately as possible. As this risk is propagated to the capital markets, more and more parties will be exposed. An important part of pricing insurance-linked securities (ILS) is quantifying the uncertainties in the physical parameters of the catastrophe models, including both the hazard and damage models. Given the amount of reliable data (1945 to present) on important storm parameters such as central pressure drop, radius to maximum winds, and the non-stationarity of the occurrence rate, moments estimated for these parameters are not highly reliable and knowledge uncertainty must be considered. The engineering damage model for a given class of building in a large portfolio is likewise subject to uncertainty associated with the quality of the buildings. A sample portfolio is used to demonstrate the impact of these knowledge uncertainties. Uncertainties associated with the variability of statistics on central pressure drop, occurrence rate and building quality were estimated and then propagated through a tropical cyclone catastrophe model to quantify the uncertainty of PML results. Finally, their effect on the pricing of a typical ILS was estimated. Statistics of spread over LIBOR, given different bond ratings/probabilities of attachment, are presented using a pricing model (Lane [2000]). For a typical ILS, a relatively large coefficient of variation for both the probability of attachment and the spread over LIBOR was observed. This in turn leads to rather large price uncertainty for a typical layer and may explain why rational investors expect a higher return for assuming catastrophe risk. The results hold independent of the pricing model used. The objective of this study is to quantify this uncertainty for a simple call option and demonstrate its effect on pricing.
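The knowledge-uncertainty propagation described above can be sketched generically: treat a hazard parameter (here the occurrence rate) as itself uncertain, re-run a toy catastrophe simulation for each sampled value, and examine the spread of the resulting attachment probability. Everything below (the distributions and the attachment point) is an illustrative placeholder, not the authors' calibrated model:

```python
import numpy as np

rng = np.random.default_rng(3)

def attachment_probability(rate: float, attachment: float, sims: int = 4_000) -> float:
    """P(annual aggregate loss > attachment) under a toy compound-Poisson model."""
    annual = np.array([rng.lognormal(1.0, 1.0, rng.poisson(rate)).sum()
                       for _ in range(sims)])
    return float((annual > attachment).mean())

# Knowledge uncertainty in the occurrence rate, e.g. rate ~ Normal(2.0, 0.3)
rates = np.clip(rng.normal(2.0, 0.3, 30), 0.1, None)
probs = np.array([attachment_probability(r, attachment=40.0) for r in rates])
cv = probs.std() / probs.mean()  # coefficient of variation of the attachment probability
```

A large coefficient of variation in the attachment probability then feeds directly into a wide range of spreads over LIBOR in any pricing model, which is the mechanism behind the price uncertainty the abstract reports.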

Details

The Journal of Risk Finance, vol. 5 no. 2
Type: Research Article
ISSN: 1526-5943
