
Risk Assessment and Management

Publisher: Academy Publish

Publish date: 2012-11-03

ISBN: 978-0-9835850-0-8

Editor: Prof. Zhang Zhiyong


Download full book in PDF format

Broad definitions of risk can be summarized as the probability and/or impact of unfortunate events, or the quantifiable likelihood of loss. As soon as a probability of loss arises, we need to assess it, model it and manage it, so as to minimize the realization of such unwanted outcomes. Risks can be found anywhere: in financial markets, projects, natural causes, national security, industrial processes, health, scientific experiments and so forth. Therefore, we need to manage them. Using the ISO 31000 definition, risk management is the identification, assessment, and prioritization of risks followed by the coordinated and economical application of resources to minimize, monitor, and control the probability and/or impact of unfortunate events or to maximize the realization of opportunities. Risk itself ISO defines as the effect of uncertainty on objectives, whether positive or negative.

In Scope:

Risk assessment and management papers should describe important applications in fields of interest such as engineering, science, manufacturing, business, homeland security, management, and public policy. Avoid heavy mathematics wherever possible by drawing on case studies; incorporate examples, illustrate analytical methods, and discuss problems and possible solutions. Write a paper that will become a source for academic, industry, and government professionals in such diverse areas as homeland and cyber security, healthcare, the environment, physical infrastructure systems, engineering, business, and more.

 

Topics:

Methods
Principles of risk management
Risk assessment:

  •     In public health
  •     In human health
  •     In information security
  •     In project management
  •     For megaprojects
  •     Quantitative risk assessment
  •     In software evolution



  • A Multilingual and Multimedia Glossary to protect humans, their environment and their life.

    Authors: Prof. Dr. Gertrud Greciano

    Abstract:

    Presentation of the EU dictionary project, extended to 8 languages: 8×230 terms, 8×400 definitions, 1,400 idiomatic expressions, and a bibliography for the pre-event and in-event phases of natural and industrial disasters. Basic terminology in context and use. A linguistic tool for risk assessment and risk management.

    Read this paper


  • An approach for structuring the major Key Success Criteria (KSC) for project risk management

    Authors: Dr. Mohammad Ali Hatefi & Mr. Mohammad Mehdi Vahabi

    Abstract:

    The most significant government programs are carried out through project activities, so economies, businesses and industries are directly influenced by the success of projects. The "1:10:100" rule indicates that if it takes a dollar to correct an error in the project conceptualization phase, it will take 10 dollars to correct it during the design and documentation phase and 100 dollars to correct it during the execution phase (a toy illustration of this rule follows this entry). A very important part of the project conceptualization phase is to provide the strategic plan for projects. Besides, risk is a phenomenon which appears in all aspects of our projects; hence, a project strategic plan must be equipped with Risk Management (RM) programs. Kerzner (2010) defines RM as the act or practice of dealing with risk. It includes planning for risk, assessing (identifying and analyzing) risk issues, developing risk handling options, and monitoring risks to determine how risks have changed. The science of RM was developed back in the sixteenth century, during the Renaissance, a period of discovery. RM is usually one of the main topics of interest for investigators in the field of project management. Many researchers have proposed RM methodologies such as RISKMAN (Carter et al., 1996), RISKIT (Kontio, 2001), RFRM (Haimes et al., 2002), PUMA (Del Cano & De la Cruz, 2002), SHAMPU (Chapman & Ward, 2003), AS/NZS 4360 (Cooper, 2004), PRAM (APM, 2004), RAMP (Institution of Civil Engineers et al., 2005), and PMBOK (PMI, 2008). Most of these methodologies share a similar framework, differing in the way of structuring the process, scope, kind of planning, etc. (Seyedhoseini et al., 2008). The main objective of this research study is to structure the major Key Success Criteria (KSC) as a roadmap for project managers to provide RM programs, especially in the project conceptualization phase. For this purpose, the paper first identifies the RM KSCs; then, through configurational analysis of a system, a structured framework is established.

    Read this paper
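
    A toy Python calculation of the "1:10:100" escalation rule cited in the abstract above; the base cost is a made-up figure, for illustration only.

        # Illustrative only: cost of fixing one error, by project phase,
        # under the "1:10:100" rule. The base cost is a hypothetical figure.
        BASE_COST = 1_000  # hypothetical cost (dollars) at conceptualization

        ESCALATION = {
            "conceptualization": 1,
            "design and documentation": 10,
            "execution": 100,
        }

        for phase, factor in ESCALATION.items():
            print(f"{phase:>25}: ${BASE_COST * factor:,}")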


  • Cardiovascular Risk Assessment: from precise science to health hazard

    Authors: Dr. Jeetesh V. Patel

    Abstract:

    Cardiovascular disease (CVD) is a leading cause of death across the world, accounting for nearly one half of all deaths in the developed and developing world [1][2][3][4]. What is tragic about CVD is that deaths from CVD occur prematurely, and are preventable. Smoking is an established risk factor that increases the risk of dying from heart disease and stroke [5]. Serum cholesterol is another of the major modifiable risk factors for CVD [6][7], and clinical trials using cholesterol-lowering intervention have established that this treatment represents the greatest likelihood that a physician engaged in general medical practice can routinely prolong life following an ischemic event [8]. Other risk factors are also well established for this disease [9][10], and it is imperative to detect, evaluate and control these risk factors early and effectively. Not surprisingly, CVD prevention is at the forefront of public health agendas, and over the last four decades there have been dramatic advances, where the early identification and treatment of CVD and improved awareness in the general public underpin a steady decline of mortality rates in the West [11]. During this period we also saw the introduction of novel mathematical functions to stratify people at risk of acute myocardial infarction, i.e. CVD risk score estimation [12]. Today, CVD risk score estimation is increasingly seen as standard practice in Western medicine, allowing the physician to rapidly integrate different facets of health information to define the possibility of adverse outcomes in individuals in a variety of situations. While the process of risk assessment and management in this setting is backed by a strong body of scientific evidence [13], there are limitations. The successes of such diagnostic and management approaches have had a limited impact in some sections of society [14], and health policies, which strive for consistency in risk assessment and management, can exaggerate these weaknesses, contributing to damaging inequalities in healthcare. The causes and management of CVD are complex and multifactorial, and risk assessment carries with it the danger of a move away from science in favor of simplistic policy. This chapter looks at the science of cardiovascular risk assessment, its application in healthcare and the 'hazard' associated with such an approach.

    Read this paper


  • Communicating human health risks associated with airborne particulate released from fly ash dumping site: Probabilistic Approach

    Authors: Mr. Abdullah Mofarrah

    Abstract:

    The goal of health risk assessment is to estimate the severity and likelihood of harm to human health from exposure to a substance or activity that, under plausible circumstances, can cause harm to human health. A risk assessment model is used to identify the human health impacts of exposure to contaminants via multiple exposure routes, such as inhalation, ingestion, and dermal contact. Regulatory agencies use the health risk assessment process in a variety of situations: (i) to set standards for concentrations of toxic chemicals in air, water, soil or food; (ii) to conduct baseline analyses of contaminated sites or facilities to determine the need for remedial action and the extent of cleanup required; (iii) to develop cleanup goals at contaminated sites; (iv) to evaluate the effectiveness of existing and new technologies for effective prevention, control, or mitigation of hazards and risks; etc. There are two primary methods of risk analysis. First, qualitative analysis helps in the identification of the assets and resources at risk. It uses simple calculations, experts' assumptions and procedures; it does not determine threat intensities or frequencies. Different qualitative scales are used to achieve an acceptable level of risk and to increase overall awareness. Quantitative analysis, on the other hand, identifies the specific magnitudes of the losses and safeguards. Quantitative health risk analysis weighs exposure dose estimates against a benchmark of toxicity, such as a cancer slope factor (SF) or a reference dose (RfD); a minimal sketch of this step follows this entry. For example, we might estimate the probability of cancer in a community where a chemical was suddenly spilled, or calculate the health risks associated with the presence of metals in air or of pathogens in drinking water. There are several models available for health risk assessment.

    Read this paper
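
    A minimal Python sketch of the quantitative screening step described above, summing route-specific doses against a reference dose (RfD) and a cancer slope factor (SF). All numbers are hypothetical placeholders, not measured fly-ash data.

        # hazard quotient HQ = dose / RfD (non-cancer endpoint); incremental
        # lifetime cancer risk ILCR = dose * SF; both summed over routes
        dose = {  # hypothetical average daily dose, mg/(kg body weight * day)
            "inhalation": 2.0e-4,
            "ingestion": 5.0e-5,
            "dermal": 1.0e-5,
        }
        RFD = 3.0e-4  # hypothetical reference dose, mg/(kg*day)
        SF = 1.5e-2   # hypothetical slope factor, (mg/(kg*day))^-1

        hazard_index = sum(d / RFD for d in dose.values())  # HI > 1 flags concern
        ilcr = sum(d * SF for d in dose.values())           # screened vs 1e-6..1e-4

        print(f"hazard index: {hazard_index:.2f}")
        print(f"incremental lifetime cancer risk: {ilcr:.2e}")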


  • Credit analysis

    Authors: Dr. Mihaela Alina Dima

    Abstract:

    The past decade has seen dramatic losses in the banking industry. Firms that had been performing well suddenly announced large losses due to credit exposures that turned sour, interest rate positions taken, or derivative exposures that may or may not have been assumed to hedge balance sheet risk. In response, commercial banks have almost universally embarked upon an upgrading of their risk management and control systems. Credit analysis is concerned with identifying, evaluating and mitigating those risks which may result in a company not being able to meet its creditors' claims. Credit analysis is the quantitative and qualitative analysis of a company, which helps determine the company's debt service capacity, i.e. how capable it is of paying back its principal to the bank or other creditors (a toy ratio calculation follows this entry). This chapter presents the importance of credit assessment and the main elements/steps in the credit analysis process; describes the sources and categories of information used to assess customer solvency and the role of the credit scoring mechanism; identifies the risks in lending situations and discusses solutions for their mitigation; draws conclusions regarding the likelihood of payment; and makes recommendations as to the proper type and structure of the loan in the light of the perceived financing needs and risks.

    Read this paper
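
    A toy Python ratio calculation in the spirit of the chapter's quantitative analysis: a debt service coverage ratio (DSCR) from invented financial-statement figures. Real credit analysis combines many such ratios with qualitative judgment.

        def dscr(ebitda: float, interest: float, principal: float) -> float:
            """Cash available for debt service over scheduled debt service."""
            return ebitda / (interest + principal)

        # all figures are invented, for illustration only
        ratio = dscr(ebitda=1_200_000, interest=150_000, principal=450_000)
        print(f"DSCR = {ratio:.2f}")  # well above 1 suggests repayment capacity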


  • Data Envelopment Analysis (DEA) Integrated Risk Assessment Technique on Hedge Funds Investment: Theory and Practical Application

    Authors: John Lamb and Kai-Hong Tee

    Abstract:

    This chapter discusses hedge funds' risks, in particular leverage risk, as it has the greater effect whether applied to derivatives or to other asset classes. Evaluating the effect of leverage can become complicated, as this effect can be escalated by the liquidity and credit risks underlying the invested assets. Unlike market risk, these are risks that are not easy to hedge and need constant close monitoring. Moreover, market risks can also evolve over time. This reinforces the importance of correlation in estimating the eventual effect, especially for complex portfolios and structured products. In most cases, by segregating risks geographically and at a granular positional level, hedge fund managers can gain more transparency to estimate direct or indirect relationships between strategies, and correlations can still be calculated. The Data Envelopment Analysis (DEA) method proposed in this chapter shows how the use of correlations of returns and the consideration of the distributional characteristics of hedge funds, by suitably using appropriate risk and return measures, can rank hedge funds (a generic DEA sketch follows this entry). By further assessing the statistical properties of the performance and the ranked positions of hedge funds, the proposed method aims to provide a reliable tool to assess the risk of hedge funds appropriately, which should support the fund selection process for funds-of-hedge-funds managers.

    Read this paper
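
    A self-contained Python sketch of a generic input-oriented, constant-returns DEA score (not necessarily the authors' exact formulation), with risk measures as inputs and a return measure as the output; the fund data are invented.

        import numpy as np
        from scipy.optimize import linprog

        # rows = funds; inputs = [volatility, max drawdown]; output = [mean return]
        X = np.array([[0.12, 0.20], [0.08, 0.15], [0.20, 0.35], [0.10, 0.10]])
        Y = np.array([[0.09], [0.07], [0.11], [0.08]])

        def dea_efficiency(k: int) -> float:
            """Input-oriented CCR efficiency of fund k (1.0 = efficient)."""
            n, m = X.shape            # number of funds, number of inputs
            s = Y.shape[1]            # number of outputs
            c = np.r_[1.0, np.zeros(n)]                 # minimize theta
            A_in = np.c_[-X[k].reshape(m, 1), X.T]      # sum lam*x <= theta*x_k
            A_out = np.c_[np.zeros((s, 1)), -Y.T]       # sum lam*y >= y_k
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[k]],
                          bounds=[(0, None)] * (n + 1))
            return res.fun

        for k in range(len(X)):
            print(f"fund {k}: DEA efficiency = {dea_efficiency(k):.3f}")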


  • Defining food sampling strategies for chemical risk assessment

    Authors: Nathalie Wesolek, Alain-Claude Roudot

    Abstract:

    Collection of accurate and reliable data is a prerequisite for informed risk assessment and risk management. For chemical contaminants in food, contamination assessments enable consumer protection and exposure assessments. And yet, the accuracy of a contamination assessment depends on both chemical analysis and sampling plan performance. A sampling plan is always used when the contamination level of a food lot is evaluated, because the whole lot cannot be analysed, only samples drawn from it. An efficient sampling plan makes it possible to take samples from a food lot, following a given protocol, with a relatively low risk of misestimating the true mean concentration of the lot after analysis of the samples. Sampling plan performance testing is achieved by means of mathematical validation methods. The best-fit sampling plan is the one that gives the best compromise between lowering the risk of misestimating the true lot concentration and practical feasibility (neither too time-consuming nor too costly). This chapter presents two sampling plan validation strategies: a parametric method developed by Whitaker and co-workers from 1972, and a non-parametric method set up by Schatzki et al. (Schatzki, 1995; Campbell et al., 2003). To our knowledge, these are the only two methods sufficiently evolved to have been applied to real cases of food sampling validation. These statistical methods are first explained from a theoretical point of view. Then each one is illustrated by a practical application to a sampling plan validation for a specific chemical risk in a food commodity, using workable contamination data gathered from the literature. In our view, in its general mathematical principle, the non-parametric method is more appropriate to cases with contaminants distributed heterogeneously in a food lot; however, owing to its ease of use, the parametric method applies best to cases where the distribution of the contaminant is homogeneous. A food contaminant is homogeneously distributed in a contaminated food lot when the contamination incidence rate for individual food items taken from the lot is high and the concentration levels in the food items are rather alike. Otherwise, when the contamination incidence rate is low and the concentrations differ greatly between food items, the contaminant is heterogeneously distributed within the lot. For these reasons, the first sampling plan validation technique (the parametric method) is applied, in this chapter, to phycotoxin contamination in shellfish lots at the cultivation zone, as this is considered a homogeneously distributed contaminant case. For the heterogeneously distributed contaminant case, mycotoxin contamination data for pistachios at the retail stage are exploited in order to put the non-parametric sampling plan validation method into practice. Both phycotoxins and mycotoxins are natural toxins that are unsafe for humans. Contamination limits are set by national and international safety agencies, but sampling strategies have a great influence on the results detected in food lots.
    The chapter will show that an optimal sampling strategy can be obtained in each of the two cases, but that they require different mathematical approaches in order to obtain reliable Operating Characteristic (OC) curves showing: the consumer risk (risk of accepting lots at a true concentration above the contaminant's concentration threshold); and the producer risk (risk of rejecting lots at a true concentration under the contaminant's concentration threshold). A Monte Carlo sketch of such an OC calculation follows this entry.

    Read this paper
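
    A Monte Carlo sketch of the sampling-plan performance idea above: for several hypothetical true lot concentrations, estimate the probability that the mean of n analysed samples falls below the limit. The lognormal lot model, limit, and sample size are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        LIMIT = 10.0       # hypothetical regulatory limit (e.g. ug/kg)
        N_SAMPLES = 5      # items analysed per lot under the sampling plan
        N_SIM = 20_000     # simulated lots per true concentration
        SIGMA = 0.5        # assumed within-lot log-scale spread

        for true_mean in [4.0, 8.0, 12.0, 16.0]:
            mu = np.log(true_mean) - SIGMA**2 / 2   # lognormal mean = true_mean
            draws = rng.lognormal(mu, SIGMA, size=(N_SIM, N_SAMPLES))
            p_accept = np.mean(draws.mean(axis=1) <= LIMIT)
            if true_mean > LIMIT:
                print(f"true mean {true_mean:5.1f}: consumer risk = {p_accept:.3f}")
            else:
                print(f"true mean {true_mean:5.1f}: producer risk = {1 - p_accept:.3f}")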


  • Discounting and Catastrophic Risk Management

    Authors: T. Ermolieva, Y. Ermoliev, G. Fischer, M. Makowski, M. Obersteiner

    Abstract:

    The risk management of complex coupled human-environmental systems essentially relies on discounting future losses and gains to their present values. These evaluations are used to justify catastrophic risk management decisions which may turn into benefits over long and uncertain time horizons. The misperception of proper discounting rates critically affects evaluations and may be rather misleading. Catastrophes are not properly treated within conventional economic theory, and the lack of proper evaluations dramatically contributes to increasing the vulnerability of our society to human-made and natural disasters. Underestimation of rare, low-probability but high-consequence, potentially catastrophic scenarios (events) has led to the growth of buildings, industrial land and sizable value accumulation in flood-prone (and other disaster-prone) areas without proper attention to flood mitigation. A challenge is that an extreme event, say a 300-year flood which occurs on average only once in 300 years, may have never occurred before in a given region. Therefore, purely adaptive policies relying on historical observations provide no awareness of the risk, although a 300-year flood may occur next year. For example, floods in Austria, Germany and the Czech Republic in 2002 were classified as 1000-, 500-, 250-, and 100-year events, and the Chernobyl nuclear disaster was evaluated as a 10⁶-year event. Yet common practice is to ignore these types of events as improbable during a human lifetime. This paper analyzes the implications of potentially catastrophic events for the choice of discounting for long-term catastrophic risk management. It is shown that arbitrary discounting can be linked to "stopping time" events, which define the discount-related random horizon ("end of the world") of evaluations. In other words, any discounting compares potential gains and losses only within a finite, random, discount-related stopping time horizon. The expected duration of this horizon for standard discount rates obtained from capital markets does not exceed a few decades (a numerical check follows this entry) and, as such, these rates cannot properly evaluate the impacts of 1000-, 500-, 250-, and 100-year catastrophes. The paper demonstrates that the correct discounting can be induced by the concept of stopping time, i.e. by explicit modelling of arrival time scenarios of potential catastrophes. In general, catastrophic events affect the induced discount rates, which alter the optimal mitigation efforts that, in turn, change the events. The paper shows that stopping-time related discounting calls for the use of stochastic optimisation methods. Combined with explicit spatio-temporal catastrophe modelling, this induces a discounting which allows risk management solutions to focus properly on arrival times of potential catastrophic events rather than on the horizons of capital markets.

    Read this paper
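
    A short numerical check of the stopping-time reading of discounting described above, using the standard identity that a yearly discount factor d = 1/(1+r) is equivalent to an undiscounted evaluation truncated at a geometric stopping time with P(tau > t) = d^t, whose mean is (1+r)/r.

        import numpy as np

        r = 0.05                      # a typical capital-market discount rate
        d = 1.0 / (1.0 + r)
        print(f"expected evaluation horizon at r = {r:.0%}: {(1 + r) / r:.1f} years")

        # Monte Carlo confirmation: geometric stopping time with P(tau > t) = d^t
        rng = np.random.default_rng(1)
        tau = rng.geometric(p=1.0 - d, size=1_000_000)
        print(f"simulated mean stopping time: {tau.mean():.1f} years")
        # ~21 years at 5%, far short of 100- to 1000-year catastrophe scales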


  • Framework for Managing Urban and Regional Economic Development Risks

    Authors: Professor Brian Roberts

    Abstract:

    This paper elucidates the application of a risk evaluation technique, Multi-Sector Risk Analysis (Roberts and Tabart, 2005), to develop a framework of pre- and post-risk strategies for managing local economic development risk in cities and regions. The framework was used to develop sector industry risk management strategies and plans for selected industries in Canberra, Australia.

    Read this paper


  • From Risk Management towards Uncertainty Management

    Authors: Dr. Genserik Reniers

    Abstract:

    Risk can be defined as "the effect of uncertainty on achieving objectives" (ISO 31000:2009). Hence, risks should be regarded as two-faced: the same risks can be called negative if the outcome is negative and positive if the outcome is positive. In this chapter, we plead for managing uncertainties instead of merely (negative) risks in companies. An uncertainty manager, instead of a risk manager, should be appointed in organizations to identify, analyse, and assess the uncertainties which go hand in hand with achieving predefined goals. Uncertainty management requires open-mindedness, the willingness to collaborate, and the know-how to deal with a variety of risk assessment techniques, amongst others. Uncertainty managers should then be able to formulate balanced and well-considered recommendations for decision-making, for different types of risks.

    Read this paper


  • Health Risks Associated with Low Dose Ionizing Radiation

    Authors: Dr. Vesna Spasic Jokic

    Abstract:

    People are continually exposed to natural sources of radiation, but medical applications and industrial development contribute to a growing level of exposure to man-made sources. This fact urges the development of methods for dose and risk estimation. Before commencing a new work activity involving ionizing radiation, a risk assessment should be made to identify the hazards and their nature and to evaluate the magnitude of the risks to which both workers and members of the general public could be subjected. This review summarizes the stochastic effects (genetic risks in offspring, somatic effects in the directly exposed population) and the deterministic effects and risks associated with exposure to radiation. The importance of estimating the possible health and environmental risk due to natural and anthropogenic radionuclides in the environment has been recognized by the scientific community as well as by the official state institutions authorised to establish priorities and regulatory actions. While the effects of exposure to high doses are well known, there is an open debate about the effects of low doses. Radiation protection is based on the linear no-threshold (LNT) model, which states that, owing to the stochastic nature of radiation, the possibility of developing cancer exists even for very low doses, just above zero dose; this model is still the subject of serious critique. Simulation of radiation transport processes, from the source in the environment to the organism and through the organism to different organs and tissues, enables dose estimations which can be related to certain stochastic or deterministic effects and, further, to appropriate risk assessment. A great number of numerical experiments dealing with the transport of radiation particles are based on Monte Carlo techniques. The Monte Carlo method is a statistical sampling technique that enables a mathematical model to be solved in the form of a probabilistic approximation of the solution; it is used for accurate prediction of dose distributions. One of the primary reasons to use Monte Carlo analysis is to examine the effect of uncertainty and natural variability on the risk estimate (a minimal sketch follows this entry). Two examples of Monte Carlo application in our clinical and environmental risk assessment practices are given in chapters 6 and 7. The analysis of data for risk assessment purposes strongly depends on professional judgement based on scientific expertise in designing and conceptualizing the risk assessment, evaluating and selecting methods and models, determining the relevance of available data to the risk assessment, etc.

    Read this paper
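
    A minimal Monte Carlo sketch of the dose-to-risk step described above: propagate an uncertain dose through the LNT model, excess risk = dose × risk coefficient. The dose distribution and the nominal coefficient below are illustrative assumptions, not the chapter's values.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        # uncertain effective dose (Sv), e.g. from simulated radiation transport
        dose = rng.lognormal(mean=np.log(0.01), sigma=0.4, size=N)

        RISK_PER_SV = 5.5e-2              # assumed nominal excess risk per sievert
        excess_risk = dose * RISK_PER_SV  # LNT: linear in dose, no threshold

        lo, med, hi = np.percentile(excess_risk, [5, 50, 95])
        print(f"excess risk: median {med:.2e} (90% interval {lo:.2e} to {hi:.2e})")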


  • Hygroscopic Growth of fine Aerosols in the Nasal Airway of a 5-year-old Child

    Authors: Dr. Jinxiang Xi

    Abstract:

    Condensation growth of inhaled aerosols within the human respiratory tract can be significant and may notably alter the behavior and fate of the inhaled aerosols. Previous experimental studies have shown that certain fine sub-micrometer aerosol particles deposit in the nasal airways like micrometer aerosols. It has also been observed that the inhalation of warm saturated airflow may result in humidity values ranging from sub-saturated to super-saturated conditions. We therefore hypothesize that, with a slow inhalation rate and high humidity, fine aerosols that bypass the vestibule and nasal valve will experience significant growth and deposit on the downstream airway surfaces. The objective of this study is to evaluate the effect of hygroscopic growth on the transport and deposition of fine aerosols in nasal airways under various relative humidity (RH) and temperature conditions. To achieve this objective, a physiologically realistic nasal-laryngeal airway model was developed based on MR imaging of a 5-year-old boy. Temperature and RH fields were simulated using the low-Reynolds-number k-ε turbulence model and the species transport model for a large spectrum of activity and thermo-humidity conditions. Particle motion and diameter growth through the nasal-laryngeal airway were investigated using an extensively tested discrete Lagrangian tracking model coupled with a user-defined droplet hygroscopics module. The subsequent aerosol deposition was captured and compared between four different inspiratory thermo-humidity scenarios on both a regional and a local basis. Small growth, and even shrinkage, in aerosol size was observed in the nasal airway under sub-saturated conditions, while large growth occurred under both super-saturated conditions. Specifically, when inhaling hot and saturated airflow (T = 47 °C, RH = 100%), a significantly enhanced fraction of aerosols deposited in the nasal turbinate region in general, and the olfactory region in particular, relative to the other three conditions. The inhalation dosimetry of hygroscopic fine aerosols was shown to be a dynamic process depending on concurrent deposition mechanisms and instantaneous particle size, which is further determined by local thermo-humidity conditions. The results of this study highlight condensation growth as a potentially significant mechanism in the deposition of fine aerosol particles under saturated inhalation conditions and have promising implications for improving intranasal delivery efficacy targeted at the turbinate and olfactory regions.

    Read this paper


  • Lung cancer attributed to radon exposure, for the general population: estimation and prevention

    Authors: Truta-Popa Lucia-Adina, Cosma Constantin, Hofmann Werner

    Abstract:

    Strong evidence has been found in epidemiological studies that cumulative exposure to radon is the second leading cause of lung cancer, after smoking. The purpose of the present study is threefold: i) to analyze the lung cancer risk for residential radon exposures, based on the dose-effect relationship; ii) to estimate the percentage of lung cancers attributable to radon exposure, based on the relative risk values predicted using two carcinogenesis models; and iii) to point out the technical means available to protect people against radon exposures above the recommended action level (200 Bq/m³), and to summarize the cost-effectiveness and benefits of such remedial/preventive actions. Simulations of lung cancer risk at the low radon exposure levels occurring indoors, performed with the mechanistic, biology-based Transformation Frequency-Tissue Response model and also with the risk model of Darby, exhibit a linear dose-effect relationship. Predicted relative risks at measured exposure levels enabled us to estimate a significant fraction of lung cancer cases attributable to radon (between 16.38% for Stei, Romania, and 7.68% for the study of Darby), using a novel, more efficient method validated by epidemiological data (the generic attributable-fraction arithmetic is sketched after this entry). Accordingly, an intervention against radon exposure in homes may be analyzed in terms of possible cost-effectiveness and benefits.

    Read this paper
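
    A Python sketch of generic population-attributable-fraction arithmetic, assuming a linear excess relative risk per unit of radon concentration; the slope and the exposure distribution below are invented, not the paper's values.

        import numpy as np

        ERR_PER_100_BQ = 0.16  # assumed excess relative risk per 100 Bq/m^3

        # hypothetical share of homes by long-term radon level (Bq/m^3)
        levels = np.array([25.0, 75.0, 150.0, 300.0, 600.0])
        share = np.array([0.50, 0.30, 0.12, 0.06, 0.02])  # sums to 1

        rr = 1.0 + ERR_PER_100_BQ * levels / 100.0  # relative risk by level
        mean_rr = share @ rr
        paf = (mean_rr - 1.0) / mean_rr             # population attributable fraction
        print(f"lung cancers attributable to radon: {paf:.1%}")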


  • Measuring and evaluating the effectiveness of information security

    Authors: M.Sc. Mario Sajko

    Abstract:

    The state of information system security is described by a large number of features and indicators for different areas of information security. Depending on the company's area of business activity and its security objectives, effective security management requires selecting appropriate indicators and establishing a process for monitoring and measuring them. The systematic application of such indicators is usually defined as a metric: standardized measures and methods of measurement and of interpreting the results. Although examples of such indicators, and specific suggestions on how to collect and measure them, can be found in the literature, the whole area of information security measurement is as yet unexplored. The focus of this paper is to recapitulate past experience and results in the systematization and structuring of security indicators, which is not yet completely clarified. The aim is to identify existing experience concerning the application of security metrics as an instrument for the evaluation and assessment of information system security.

    Read this paper


  • Polysystemic monitoring of nuclear plant staff and risk group revealing

    Authors: Dr. Irina Alchinova, Elena Arkhipova, Anton Cherepov, Prof. Mikhail Karganov

    Abstract:

    Any stable fixation of a pathological trace is preceded by processes of dysregulation of the corresponding functions. The most probable pathological outcomes can be predicted from the results of polysystemic sanogenetic monitoring by detecting dysregulation in certain systems of the organism (the cardiorespiratory, psychomotor, and metabolic systems). Monitoring is carried out using computerized measurement instrumentation and data processing systems, which provide the basis for a strict quantitative assessment of the dynamics of risk for the studied populations. Risk assessment thereby rises from being an instrument of control to the rank of a controlled process, which is the basis for the successful operation of potentially hazardous industries. The device complex and methodological approaches have been tested during a screening examination of workers at a nuclear fuel plant.

    Read this paper


  • Risk Management in Agricultural Production: Case Studies from Turkey

    Authors: Dr. (Assoc. Prof.) Handan Akcaoz

    Abstract:

    Agriculture is often characterized as a uniquely high-risk and high-uncertainty sector of the economy. There are many sources of risk and uncertainty in agriculture, forcing farmers to make decisions in a risky, ever-changing environment. These risks and uncertainties are conventionally categorized into the management areas of production, marketing and finance. Some of the risk and uncertainty components are lack of rainfall, price changes, lack of labour at the required time, machinery breakdowns in unexpected situations, changes in government policy and other similar factors. These factors are the main causes of income fluctuations in agriculture. Because of risk and uncertainty components, big fluctuations in yields and prices occur, and this situation leads to important income differences from one year to another. The risk and uncertainty conditions are reflected in the prices of goods purchased by farmers and in the amounts produced; they are therefore of crucial importance for a farmer's income. Risk management strategies are developed to provide some protection in situations in which the consequences of a decision are not known when the decision is made. The various risk management strategies have different effects on the farm business, but none of the responses can provide protection from all types of risk. The purpose of this study was to determine the risk sources that farmers face in agricultural production and the risk management strategies that can be used for dealing with these risk sources in some regions of Turkey. In the study, brief information is given about the research area; the socioeconomic characteristics of the farmers are analyzed, and data on risk-related decision-making, financial status and sustainability are assessed.

    Read this paper


  • Risk Management in Banking

    Authors: Dr. Mihaela Alina Dima

    Abstract:

    The financial markets have been changing constantly over the last decades, and risks have gone along with the trends in these markets. Commercial banks undertake the important process of financial intermediation, whereby the funds or savings of the surplus sector are channeled to the deficit sector. Financial intermediation can enhance growth by pooling the funds of small and scattered savers and allocating them for investment in an efficient manner, using the banks' informational advantage in the loan market. In order to do that, banks are constantly under pressure and have to assume high risks, and at the same time manage those risks to avoid, or at least minimize, losses. Valuing risks, which are intangible and invisible, requires that risks be well defined. Risk management can be regarded as an active, strategic, and integrated process that encompasses both the measurement and the mitigation of risk, with the ultimate goal of maximizing the value of a bank while minimizing the risk of bankruptcy. This chapter presents the major risks in banking activity and the innovative tools used by banks to evaluate and mitigate them, with the goal of preserving a sound financial system.

    Read this paper


  • Risk Management Models and Tools for Insurance Companies

    Authors: M.Sc. Jekaterina Kuzmina, Dr. math. Gaida Pettere & Dr. oec. Irina Voronova

    Abstract:

    Jekaterina Kuzmina is an investment analyst at Baltikums Bank AS (Latvia); Gaida Pettere and Irina Voronova are professors at Riga Technical University (Latvia). The current research deals with the development of an internal risk management model for insurance companies using such tools as risk measures and copulas (a generic copula-aggregation sketch follows this entry). The model developed satisfies regulatory requirements (under the Solvency II regime) and internal risk management standards, and makes otherwise complex multivariate modeling tractable. The research consists of both a theoretical and an empirical study; Excel and Mathcad are used for empirical tests of the internal risk management model.

    Read this paper
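
    A self-contained Python sketch in the spirit of the chapter: two dependent loss lines joined by a Gaussian copula with lognormal marginals, and a Solvency II style 99.5% Value-at-Risk. All parameters are invented for illustration.

        import numpy as np
        from scipy.stats import norm, lognorm

        rng = np.random.default_rng(7)
        N, RHO = 200_000, 0.6  # scenarios; assumed dependence between lines

        # Gaussian copula: correlated normals -> uniforms -> chosen marginals
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, RHO], [RHO, 1.0]], size=N)
        u = norm.cdf(z)
        loss1 = lognorm.ppf(u[:, 0], s=0.8, scale=100.0)  # line-1 marginal
        loss2 = lognorm.ppf(u[:, 1], s=0.5, scale=250.0)  # line-2 marginal
        total = loss1 + loss2

        var_995 = np.quantile(total, 0.995)               # Solvency II style VaR
        tail_var = total[total >= var_995].mean()
        print(f"99.5% VaR: {var_995:,.0f}   tail VaR: {tail_var:,.0f}")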


  • Risk management: choosing optimal tools on the basis of psychological analysis

    Authors: Dr. R.K. Schindhelm & Dr. P.M.W. Janssens

    Abstract:

    Over the last decade, there has been a significant increase in attempts to manage failures and unwarranted side effects in healthcare. More efforts than ever are being made to avoid unnecessary and preventable mortality, morbidity and loss as a result of unintended diagnostics, treatment and, in the worst-case scenario, the undoing of an inadvertent outcome. Total quality management has been introduced in healthcare, coupled with accreditation and certification of institutions, departments and activities, using approaches like risk analysis (RA), risk management (RM), Six Sigma and lean management. Clinical laboratories have traditionally taken a lead in these developments, not only because laboratory procedures are readily amenable to a process-like approach, but also because of the interest of laboratory professionals in the aforementioned topics. Many aspects of RA/RM have been introduced in specific areas of medicine, including activities in the operating theater, intensive care and delivery room, as well as in the application of artificial reproductive techniques. In this contribution we elaborate on the methods developed to reduce failures in healthcare. Starting with an introductory overview of risk analysis and management and the psychology of human failure, we apply the insights from both to specific areas of the laboratory organization. Not surprisingly, given that the contribution of the "human factor" is relatively large in the pre-analytical phase, the majority of errors occur in this area of laboratory testing. As our approach appears to have the best chance of success in activities that are largely constituted by human activity, notably manual operation, our elaborated analysis is especially suitable for the pre-analytical segment of the laboratory (a toy failure-mode scoring sketch follows this entry). Our analysis shows that combining risk analysis with views on the psychology of human failure leads to new and better-founded insights. This enables better choices to be made in error prevention and in the prioritization of measures in cases where time, human capacity or money is limited.

    Read this paper
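
    A toy FMEA-style prioritization in Python, one common way to operationalize the risk-analysis step discussed above; the pre-analytical failure modes and the 1-10 scores are invented, and the chapter's own method may differ.

        failure_modes = [
            # (description, severity, occurrence, detectability), each 1-10
            ("specimen mislabelled at blood draw",   9, 4, 6),
            ("haemolysed sample not flagged",        6, 5, 3),
            ("manual result transcription error",    8, 3, 7),
        ]

        # risk priority number = severity x occurrence x (poor) detectability
        for desc, s, o, d in sorted(failure_modes,
                                    key=lambda f: f[1] * f[2] * f[3],
                                    reverse=True):
            print(f"RPN {s * o * d:4d}  {desc}")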


  • Security Risk Assessment in Digital Rights Management Ecosystem

    Authors: Zhiyong Zhang

    Abstract:

    In multimedia consumption, Digital Rights Management (DRM) is an important means of securing the benefits of both digital content/service providers and consumers. To keep a DRM system running in order, risk management should be highlighted, owing to limited security and uncertain trust among the multiple stakeholders in the DRM-enabled content value chain. The legitimate sharing of copyrighted digital content is still an open issue, which faces severe risks of circumvention of propertied assets and copyright infringement. In this chapter, we highlight a multi-disciplinary method for an all-around examination of security, trust and risks to digital assets in the content sharing scenario. Our proposed method is a qualitative and quantitative fuzzy risk assessment, used to estimate a novel concept called Risk-Controlled Utility (RCU) in DRM (a generic fuzzy-scoring sketch follows this entry). We then focus on an application case of the emerging trusted computing policy and analyze the influence of different content sharing modes. Finally, we address a business model with some simulation results. The comparison with other methods shows that the fusion of qualitative and quantitative styles can not only evaluate the RCU with uncertain risk events effectively, but also provide accurate assessment data for the security policies of DRM.

    Read this paper
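
    A generic Python sketch of combined qualitative/quantitative fuzzy scoring (not the chapter's Risk-Controlled Utility itself): linguistic likelihood and impact judgments as triangular fuzzy numbers, combined and defuzzified by centroid.

        import numpy as np

        def tri(a: float, b: float, c: float, x: np.ndarray) -> np.ndarray:
            """Membership of x in the triangular fuzzy number (a, b, c)."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        x = np.linspace(0.0, 1.0, 1001)
        likelihood = tri(0.2, 0.4, 0.6, x)  # "medium" chance of circumvention
        impact = tri(0.5, 0.7, 0.9, x)      # "high" impact on content revenue

        # fuzzy "and" of the two judgments, then centroid defuzzification
        risk = np.minimum(likelihood, impact)
        score = (x * risk).sum() / risk.sum()
        print(f"crisp risk score: {score:.2f}")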


  • Software Risk Management Model Using Goal-Driven Approach from Early Requirements Engineering

    Authors: Dr. Shareeful Islam and Dr. Anca-Juliana Stoica

    Abstract:

    Software projects contain a significant number of uncertainties throughout the life cycle of the product. These uncertainties pose potential risks which may cause financial loss, a bad reputation for the vendor, market rejection, user dissatisfaction, or even rejection of the final product. Early risk management practice is effective in controlling these risks in order to meet project-specific goals, but there is still a lack of comprehensive guidelines on how to integrate risk management activities at the early development stage. This research contributes an integrated modeling framework for software development risk management, called the Goal-driven Software Development Risk Management Model (GSRM). The framework comprises a conceptual model, analysis techniques and a methodology to systematically identify, model, analyze and control risks so as to attain project-specific goals. GSRM treats as goals the objectives, expectations and constraints of the development components, focusing on the project success factors for risk management. The model considers goals beyond schedule, budget and quality, and recognizes the importance of motivating project stakeholders, in particular the customer/user, to take an active part in the software project. It focuses on non-technical components such as project execution, customers/users, project participants and the usage environment, along with technical components such as the development process, system specification and tools, as a holistic view for risk management. The approach is empirically evaluated using the case study research method along with action research. The observations made from the study show that the chance of a successful project increases substantially if the project includes early risk management practice. The goal-driven approach eases risk management activities and their integration into early requirements engineering.

    Read this paper


  • Surveys of living organisms' exposure profiles to "regional and non-regional" organic pollutants at trace levels

    Authors: Dr. Melinda Haydee Kovacs

    Abstract:

    The proposed chapter will present concrete data regarding the trace analysis of selected organic pollutants (organometallic compounds and other organic pollutants such as chlorine-based compounds and mono- and polycyclic aromatic hydrocarbons) from biological samples collected through non-invasive methods. Advantages, disadvantages, as well as key problems with current solutions, will also be advanced. Concrete data will also be presented regarding the profile of the selected pollutants in human and animal subjects as well as in plant samples. The connection between the trace amounts of pollutants found in animal/plant samples (used as basic food by inhabitants) and the amounts of these pollutants found in the studied inhabitants will also be established and presented.

    Read this paper


  • The New Basel Framework and the Emergence of Prudential Rules

    Authors: Dr. Marius Motocu

    Abstract:

    The classic case for prudential regulation arises from the systemic risk that inheres in the banking sector. Commercial banking is an inherently risky activity. Commercial banks transform short-term and, thus, highly liquid liabilities (deposits and money market certificates) into long-term and relatively illiquid assets. Consequently, every commercial bank faces liquidity risk and solvency risk. Liquidity risk arises from the possibility that depositors will run on the bank; solvency risk arises from the possibility that the value of assets will drop below liabilities through default or other factors. Illiquidity and insolvency have the same impact: they prevent banks from repaying their creditors. Systemic risk arises because liquidity or solvency problems at one bank can spread through the entire financial system. When one bank cannot repay an important creditor, the creditor is likely to begin encountering liquidity and solvency issues of its own. Illiquidity at the counterparty bank makes it difficult for this bank to repay all of its creditors, who may then, in turn, face liquidity problems. The interconnectedness of the banking system - the fact that one bank's assets are another bank's liabilities - can thus transform what might otherwise be a localized liquidity or solvency crisis into a systemic crisis (Oatley and Winecoff, 2011). A minimal cascade simulation of this mechanism follows this entry.

    Read this paper
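
    A minimal Python default-cascade simulation of the interconnectedness argument above: exposures[i, j] is what bank i is owed by bank j, so one failure imposes losses on its creditors and may propagate. All balance-sheet figures are invented.

        import numpy as np

        exposures = np.array([  # interbank claims, row = creditor
            [0.0, 40.0, 10.0, 0.0],
            [5.0, 0.0, 30.0, 10.0],
            [0.0, 15.0, 0.0, 25.0],
            [20.0, 0.0, 5.0, 0.0],
        ])
        capital = np.array([30.0, 25.0, 20.0, 15.0])  # loss-absorbing buffers

        failed = np.zeros(len(capital), dtype=bool)
        failed[1] = True  # initial shock: bank 1 becomes insolvent

        # propagate: creditors write off claims on failed banks until stable
        while True:
            losses = exposures[:, failed].sum(axis=1)
            newly_failed = (losses >= capital) & ~failed
            if not newly_failed.any():
                break
            failed |= newly_failed

        print("insolvent banks after cascade:", np.flatnonzero(failed).tolist())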


  • The operational risk management process: Implementation of an OR management model

    Authors: Dr. Aleksandra Brdar Turk

    Abstract:

    The management of operational risk in any company, especially in financial institutions, is an ongoing process which should be well incorporated into daily operations as well as into the tactical and strategic decision-making processes of upper management. The development and integration of risk management models into the regular operations of a company is a key step in integrating risk management into the company's business and deriving maximum benefit and risk protection from its use.

    Read this paper


  • The role of risk-informed decision making in the licensing of nuclear power plants

    Authors: A. L. Sousa; J. P. Duarte; P. L. Saldanha; P. F. Frutuoso e Melo

    Abstract:

    A. L. Sousa, Comissão Nacional de Energia Nuclear, CGCN/CNEN, Rua da Passagem, 123/9º andar, CEP 22290-901, Rio de Janeiro, RJ, Brasil (alsousa@cnen.gov.br)
    J. P. Duarte, Departamento de Engenharia Nuclear, Escola Politécnica, Universidade Federal do Rio de Janeiro, Av. Horácio Macedo, 2030, Bloco G, Sala 206, Rio de Janeiro, RJ, Brasil (jduarte@nuclear.ufrj.br)
    P. L. Saldanha, Comissão Nacional de Energia Nuclear, CGRC/CNEN, Rua General Severiano, 90/4º andar, CEP 22290-901, Rio de Janeiro, RJ, Brasil (saldanha@cnen.gov.br)
    P. F. Frutuoso e Melo, COPPE/UFRJ, Programa de Engenharia Nuclear, Av. Horácio Macedo, 2030, Bloco G, Sala 206, Rio de Janeiro, RJ, Brasil (frutuoso@nuclear.ufrj.br)

    Read this paper


  • Theoretical considerations on geologic risk management

    Authors: Dr. Lber Galban Rodriguez, Dr. Tomás Jacinto Chuy Rodríguez & Dr. Ingrid Noelia Vidaud Quintana

    Abstract:

    The starting point for the analysis of geological risks worldwide is the analysis of the influence of geological processes and phenomena on the various construction and infrastructure projects that man undertakes. This work discusses the management of the risks generated by these processes and the fundamental concepts surrounding the topic, finally defining a modern concept for the management of geological risk and outlining the main tools and measures used for this purpose.

    Read this paper


  • What is the role of in-vitro models in the estimation of the health risk caused by nanoparticle exposure?

    Authors: Prof. Dr. Eleonore Fröhlich

    Abstract:

    Multiple cellular targets for nanoparticles have been identified in recent years, but the relevance of these findings for the risk of exposure to nanoparticles in consumer products is not clear. In this review, the different aspects of risk assessment for nanoparticles are illustrated using titanium dioxide as an example. Relevant exposure routes and the role of particle parameters, such as particle size, shape, surface charge, degree of agglomeration, 'aging' of particles, contamination with biological and chemical compounds, etc., are addressed. Then the role of in-vitro models for the assessment of nanoparticle effects is summarized. In the third part, technical difficulties which interfere with risk assessment of nanoparticles, such as interference with the assay systems, are discussed, and in the last part of the review refined in-vitro models and relevant exposures are proposed. The detection of nanoparticle penetration through epithelial barriers, as well as the monitoring of long-term effects due to the persistence of nanoparticles in cells, are important parameters in the risk assessment of nanoparticles in consumer products. In-vitro models can help to set parameters for Standard Operating Procedures to ensure realistic nanoparticle testing both in-vitro and in-vivo.

    Read this paper