
A Comparison of Methods and Software in Probabilistic Design
Fatih Arslan, ME 5352 – Probabilistic Design, Fall 2011, Dr. Ekwaro-Osire, February 28, 2011
Abstract

In the design and analysis of engineering systems, engineers have always recognized the existence of uncertainty. Most of the parameters involved in engineering design are random quantities, and no matter what design methodology is used to approach this randomness or uncertainty, the main task is to ensure satisfactory performance. Traditionally, engineers have used a deterministic methodology in engineering analyses. This simplified approach accounts for uncertainties through the introduction of Factors of Safety (FS). However, the deterministic approach, by definition, results in the loss of probability information, which is useful in assessing risk and formulating trade-off studies for design. In probabilistic methodology, variable states are represented by probability distributions rather than unique values; these distributions reflect the uncertainty about the true values. This paper discusses the advantages of probabilistic methodology, some of its techniques and applications, and the hurdles faced in its advancement.

1. INTRODUCTION

Traditionally, engineers have used a deterministic methodology in engineering analyses. In deterministic design, factors of safety are determined from past knowledge and experience; therefore, they are sometimes unavailable for new designs. Nor do they absolutely guarantee satisfactory performance or safety of the system. In deterministic design methodology, there is no information on how the various parameters of the system affect safety. Hence, designing a system with a uniform distribution of safety levels among different components is difficult using factors of safety [1]. Engineering design requires a balance between minimizing cost and maximizing safety levels, and safety factors do not provide enough information to achieve the best use of resources.
Excessive conservatism in the selection of safety factors leads to overdesign, which is uneconomical, while a dearth of conservatism leads to underdesign, which results in high failure costs and low reliability. Often, both problems coexist in a single product designed with this approach; therefore, a deterministic approach in engineering analyses frequently results in inconsistent products [2]. This is fast becoming unacceptable in industry due to mounting customer concern about reliability, increased global competition in terms of cost and time to market, and the demand to improve energy utilization. Probabilistic analysis provides the information necessary for optimum design, and offers the opportunity for improvement in business and product performance.
Several codes and design guidelines have been revised in recent times to incorporate probabilistic analysis. For example, revisions have been made in the European and Canadian design specifications, as well as the American Institute of Steel Construction Load and Resistance Factor Design (1994) specifications. Several other specifications currently in development will incorporate probabilistic analysis. It is expected that the use of probabilistic analysis in these codes will provide more information about the behavior of the system, the interaction between system components, and the effects of different variables on performance. In addition to bringing rationality to the consideration of uncertainty in the design process, probabilistic analysis retains the expertise or experience gained from a particular system [1]: in probabilistic methodology, the opinions of experienced designers about uncertainties in the system are included as the "professional factor".

To allow uncertainty and probability theory to be used more effectively in the design process, several probabilistic techniques have been developed over the years. However, they have not yet been utilized to their maximum effect for many reasons, including: the need for a strong statistical background on the part of the designer, the mathematical sophistication and computational intensity of several probabilistic techniques, difficulty in data collection and limitations in data availability, and lack of confidence in their application [2]. Customer-sensitive and high-volume manufacturing sectors, for example the automotive industry, would nevertheless benefit greatly from probabilistic design. Considerable improvements in available data, and an educational and cultural step-change, however, are necessary to fully incorporate it in design activities. As a first step, it is essential for engineers to understand the capabilities and limitations of the available techniques.
Constant enhancement in computer performance and the identification of potential failure modes in computer databases indicate that the feasibility of probabilistic design is increasing steadily. Analytical techniques are becoming increasingly sophisticated, and it is now possible to consider the effect of the entire space of random variables on the lifetime reliability of structures. This was demonstrated by NASA's CARES/Life code combined with the ANSYS probabilistic design system. ANSYS is an engineering simulation software provider; the ANSYS Probabilistic Design System (ANSYS/PDS) is a probabilistic analysis tool that is an integral component of the ANSYS finite element (FE) analysis software. CARES/Life coupled with ANSYS was used by NASA Glenn researchers to study the effects of randomness in component geometry, loading, and material properties on the expected life of the part [8].

Over time, increased confidence in probabilistic design techniques will lead to a move towards fully simulation-based design. This will save cost by eliminating the need for prototype validation before production [3]. Designers, however, have concerns about the reliability and appropriateness of these techniques at various stages of the design process. This paper examines various probabilistic techniques incorporated in the design process. Some applications and technologies making use of this methodology are also discussed.
2. PROBABILISTIC DESIGN METHODOLOGY
The foundation for the emergence of many probabilistic techniques is limit state theory. These techniques generally attempt to determine the limit state functions, also called the performance functions. The theory provides a means for assessing the performance of components against various limiting conditions beyond which the component loses the ability to perform its intended function. The limit state function is often defined in terms of stress and strength, or load and resistance, owing to their importance in a structure's safety; in principle, however, any parameter relating to the performance can be used in its formulation. The performance function for a strength limit state is given by the difference between the resistance and the load, as in Eq. (1) [1]:

g = R − S (1)

The integral of the joint probability density function of the random variables over the complete failure region can be used to evaluate the Probability of Failure (POF), as in Eq. (2) [1]:

POF = ∫ fX(x) dx, integrated over the failure region g(x) ≤ 0 (2)

2.1 Coupling Formula

In cases where the limit state function consists of more than two independent variables, a Taylor expansion can be used to approximate the variance of the response. The variance of the function can be approximated by Eq. (3), using the variables' standard deviations and means:

σg² ≈ Σi (∂g/∂xi)² σxi² (3)

The coupling formula can then be used to calculate the standard normal variate, given by Eq. (4), and the standard normal distribution function Φ is used to determine the POF, as in Eq. (5) [1]:

z = μg / σg (4)

POF = Φ(−z) (5)

2.2 Monte Carlo Simulation (MCS)

In practical situations, it is sometimes impossible to evaluate the POF directly. For such cases, Monte Carlo Simulation (MCS) can be used. The POF is generated through simulation involving repeated computation of the performance function, from the distribution of values calculated for randomly generated combinations of the input variables. This is an effective method; however, to produce reasonably accurate results, the number of iterations has to be relatively large.
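The MCS loop just described can be sketched in a few lines. This is a minimal illustration assuming a strength limit state g = R − S with independent, normally distributed resistance and load; the parameter values are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(42)

def monte_carlo_pof(mu_r, sigma_r, mu_s, sigma_s, n=1_000_000):
    """Estimate the probability of failure P(g <= 0) for g = R - S
    by repeated sampling of the random input variables."""
    r = rng.normal(mu_r, sigma_r, n)   # resistance samples
    s = rng.normal(mu_s, sigma_s, n)   # load samples
    g = r - s                          # performance function
    return np.mean(g <= 0)             # fraction of failed trials

# Illustrative values: mean resistance 50, mean load 30 (arbitrary units)
pof = monte_carlo_pof(mu_r=50, sigma_r=5, mu_s=30, sigma_s=4)
```

Even in this simple case the cost quantified by Eq. (6) below is visible: resolving a POF of order 10^-3 to a coefficient of variation of a few percent already takes on the order of a million iterations.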
For a specified POF, the coefficient of variation of the POF estimator is given by Eq. (6) [1]:

δ = √((1 − POF) / (N · POF)) (6)

For high reliability, the number of iterations N must be large. This process can be time-consuming and costly, which makes the application of this technique in some areas, such as stress analysis with FEM, difficult and expensive.

2.3 First Order Reliability Method (FORM)

In the First Order Reliability Method (FORM), the limit state function is linearised about the point with the highest probability density on the failure surface, known as the Most Probable Point (MPP). For this reason, the technique may lack the desired accuracy in highly nonlinear problems, and the error increases rapidly as the number of variables or degrees of freedom increases. It has the advantage of simplicity, but fails to make full use of the distributional information. The POF, in the standard normal space, is given by Eq. (7) [1]:

POF = Φ(−β) (7)

Unlike FORM, the Second Order Reliability Method (SORM) fits a parabolic surface to the limit state function at the MPP. SORM provides more accurate results than FORM, but is more computationally intensive.

2.4 Response Surface Methodology (RSM)

The Response Surface Methodology (RSM) replaces the limit state with an approximate mathematical function, so it avoids computing the actual performance function. The technique is based on the principle that a suitable polynomial with a smooth surface can be used as an approximation over a limited region of present interest. RSM is a very useful technique when a closed-form performance function is unavailable. A suggested RSM function is given in Eq. (8) [1]:

ĝ(x) = a + Σi bi·xi + Σi ci·xi² (8)

2.5 Advanced Mean Value (AMV)

Advanced Mean Value (AMV) is based on the assumption that, if the limit state function is smooth, a Taylor series expansion exists at the mean values. The approximation is given in Eq. (9) [1]:

g(x) ≈ g(μ) + Σi (∂g/∂xi)|μ · (xi − μi) (9)

AMV involves significantly fewer evaluations of g(x) than MCS, greatly improving efficiency for cases with a complicated performance function.

2.6 Fast Probability Integration (FPI)

Fast Probability Integration (FPI) is an approach that approximates the MCS result while evaluating the actual performance function, in contrast with RSM. Even for highly nonlinear performance functions, the results from FPI are efficient and accurate. In FPI, a Taylor series expansion about the design point is first used to approximate a nonlinear limit state function with a quadratic function (Eq. 10). This is then transformed into a linear function (Eq. 11) following a transformation of the variables (Eq. 12) [1]. By effectively approximating non-normal distributions with corresponding normal distributions, FPI proves robust to variability in the input parameters. However, this requires extensive programming.

2.7 Criteria for Evaluation

There are a number of criteria that can be used to compare the effectiveness and efficiency of each technique in engineering design.
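For instance, accuracy and the number of performance-function evaluations often pull in opposite directions; the RSM surrogate of Sec. 2.4 makes the trade visible. In this self-contained sketch, the cubic "true" function (a stand-in for an expensive FE run) and all parameter values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def g_true(x):
    """Expensive 'true' performance function (stand-in for an FE run)."""
    return 3.0 - x - 0.1 * x**3

# Sample the true function at a handful of design points in the region of interest
x_pts = np.linspace(-2, 2, 9)
g_pts = g_true(x_pts)

# Fit a quadratic response surface g_hat(x) = a + b*x + c*x^2, as in Eq. (8)
coeffs = np.polyfit(x_pts, g_pts, deg=2)
g_hat = np.poly1d(coeffs)

# Use the cheap surrogate in place of g_true for a Monte Carlo POF estimate
x = rng.normal(0.0, 1.0, 200_000)
pof_surrogate = np.mean(g_hat(x) <= 0)
pof_true = np.mean(g_true(x) <= 0)
```

Here nine evaluations of the expensive function stand in for the 200,000 that a direct MCS run would need, at the price of a visibly underestimated tail probability: the quadratic surface, fitted only over a limited region, misses the cubic behavior that dominates in the failure tail. This is exactly the kind of accuracy/efficiency trade-off such criteria are meant to expose.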

3. PROBABILISTIC DESIGN SOFTWARE
COMREL is a software package based on the FORM/SORM techniques. It is used for time-invariant and time-variant component reliability analysis. The latest versions are equipped with improved optimizers, sampling options, and a GUI. The software can deal with arbitrary dependence structures in the probabilistic model [14].

NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is a modular computer program designed to perform probabilistic analysis of mechanical/structural systems and components. Using this program, uncertainty in material properties, loading, geometry, initial conditions, and boundary conditions can be simulated. It is currently the most popular reliability analysis software in the world [12].

DARWIN (Design Assessment of Reliability With Inspection) integrates life assessment for low cycle fatigue based on fracture mechanics, finite element stress analysis results, material anomaly data, probability of anomaly detection, and inspection schedules. The probability of failure of a rotor disk as a function of operating cycles can be determined [12].

PRODAF (Probabilistic Design and Analysis Framework) is a software tool that implements a multidisciplinary, practical, design-for-reliability technique for aerospace systems. It uses the FPI code to determine probabilistic component failure data. PRODAF can be used to address problems ranging from the detailed analysis/design of a single component to the conceptual design of a complete system [7][13].

RENO is a software package designed for probabilistic event and risk analysis. It can be used to create flowchart models for risk and safety analyses, complex reliability analyses, and decision making or maintenance planning. The program can also be used to perform operational research, optimization, and financial analysis [15].

Weibull++ is a software tool for life data analysis. It uses multiple lifetime distributions (including all Weibull distributions) to perform life data analysis.
The software supports all data types, and is equipped with tools for related reliability analyses, such as degradation data analysis, warranty data analysis, recurrent events data analysis, and non-parametric data analysis.

Lambda Predict is a standards-based reliability prediction program. In cases where actual product reliability data is unavailable, this software can be used to compare design alternatives, evaluate design feasibility, track reliability improvement, and identify potential failure areas [15].

Designers and engineers are increasingly taking advantage of probabilistic design software in a wide range of applications. The next section discusses some areas of current application, along with others that have the potential to utilize probabilistic analysis techniques.
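As an illustration of the kind of life data analysis that tools such as Weibull++ automate, the classic Weibull plot can be reproduced with median-rank regression in a few lines. This is a self-contained sketch, not the tool's actual algorithm, and the failure times are made up:

```python
import numpy as np

# Hypothetical failure times (hours) from a complete life test
t = np.sort(np.array([105.0, 180.0, 250.0, 320.0, 410.0, 520.0, 660.0, 850.0]))
n = len(t)

# Median-rank estimate of unreliability F(t_i) (Benard's approximation)
i = np.arange(1, n + 1)
F = (i - 0.3) / (n + 0.4)

# Weibull plot linearisation: ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta)
x = np.log(t)
y = np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)   # slope = shape parameter beta
eta = np.exp(-intercept / beta)         # scale parameter (characteristic life)
```

The fitted shape parameter is what makes the analysis actionable: beta < 1 suggests infant-mortality failures, beta near 1 random failures, and beta > 1 wear-out.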
4. DISCUSSION
Some of the largest companies in the world, having previously relied on deterministic techniques, are now using predictive technologies for part design. The goal is to capture the uncertainties as a component of the prediction, and to design processes in a manner that retains the durability and performance of the product. Unipass is a technology that has recently been used to design gas turbines, elevators, and helicopters. Each of these products has components that are designed to carry loads during operation; in gas turbine engines, for example, there exist thermal, mechanical, and operational loads induced by the combustion process. Using predictive technology, if the designer observes that the design is expected to last for an acceptable length of time, that design is ratified; otherwise, it is modified. This helps designers create robust designs, because it takes into consideration the realities of everyday use on parts [5].

NASA and major aircraft companies have a keen interest in factoring risk into their models and designs. For aircraft engines, even a probability of failure of one in a million is too high a risk; often, the acceptable chance of failure is one in a billion, because the consequences of failure in such a case are very severe. However, if the failure of one of the engines would not cause major problems, designers might accept a higher probability of failure. This is because risk factors are often considered in terms of dollars, owing to the higher cost of production for greater reliability. This allows engineers to quantify risk and decide which designs are acceptable or unacceptable [5]. The probabilistic design methodology, and the predictive technologies that use it, have some strong backers.
For example, the probabilistic methods committee of the Society of Automotive Engineers has made it its mission to facilitate and enable the deployment of probabilistic technology in industry, to improve competitiveness through faster, better, smarter, greener, more reliable, and more affordable product development. At NASA, where the technique has some ardent supporters, the technology is used to perform in-depth studies of parameter uncertainties.

Probabilistic design techniques are currently being used in seismic hazard analysis. Seismic hazard is the possibility of potentially destructive earthquake activity at a specified location. Since its introduction thirty years ago, probabilistic seismic hazard assessment (PSHA) has become the most extensively used approach for evaluating ground-motion characteristics for earthquake design [10]. In PSHA, all the potential earthquakes that could affect a site are determined, and the design values of motion are evaluated through a recurrence relationship.

Besides structural and mechanical analysis, probabilistic techniques are also being employed in areas such as environmental regulation. Probabilistic analysis has potential for risk management in developing countries, where resources, technical expertise, and information are often scarce. Most regulatory agencies there endorse the use of deterministic analysis in decision making; however, studies have shown that the deterministic approach can lead to higher risks than necessary. Probabilistic analysis may prove useful for public policy decisions, such as the regulation of arsenic in Chile [11].
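The recurrence relationship mentioned in the PSHA discussion above is commonly the Gutenberg–Richter law, combined with a Poisson model of event occurrence. A minimal sketch follows; the a- and b-values are illustrative placeholders, not site-specific data:

```python
import math

def annual_rate(m, a=4.0, b=1.0):
    """Gutenberg-Richter recurrence: log10 N(M >= m) = a - b*m, where N
    is the annual number of earthquakes of magnitude at least m."""
    return 10.0 ** (a - b * m)

def exceedance_probability(m, years, a=4.0, b=1.0):
    """Probability of at least one event of magnitude >= m in the given
    time window, assuming events occur as a Poisson process in time."""
    lam = annual_rate(m, a, b)
    return 1.0 - math.exp(-lam * years)

# Probability of at least one M >= 6.5 event in a 50-year design life
p = exceedance_probability(m=6.5, years=50)
```

A full PSHA additionally integrates over source locations and ground-motion attenuation; the recurrence law shown here is only the first ingredient.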
5. CONCLUSION
The methodology of probabilistic design and some of its techniques, software, and applications have been outlined. This design approach provides the information necessary for optimum design. However, the probabilistic design methodology has a number of limitations.

1) In real-life scenarios, data is often too scarce to accurately estimate the probabilities of the random variables. This holds especially true for the tails of distributions, because they represent extremely rare values, which do not point to a particular choice of distribution. However, some researchers maintain that accurately estimating the standard deviation is more important than choosing the right distribution [4].

2) When statistical techniques are used to estimate the output scatter, a particular distribution for the output has to be assumed to calculate the probability of failure; often, the choice is the normal distribution. This choice is viable for linear problems; however, for problems with pronounced nonlinearities, the distribution of the response quantities can differ substantially from normal [4]. Some researchers also believe that the probabilistic approach permits complex and sometimes intangible risk to be manipulated, scaled, and simplified: a single value representing risk gives unjustified confidence that the issue has been addressed, with further thought tending to ignore the underlying risk factors [6].

3) The predicted failure probability is greatly affected by modeling errors. Ideally, these errors should be accommodated in the probabilistic formulation as additional uncertainties. However, estimating the statistics of modeling errors is extremely difficult, because it requires large amounts of data on the experimental-analytical mismatch observed in systems of the same kind [4].
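Limitation (2) above is easy to demonstrate: pushing a normally distributed input through a nonlinear performance function yields a visibly non-normal response. A minimal sketch, with an illustrative quadratic response chosen only to make the effect obvious:

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(0.0, 1.0, 200_000)   # normally distributed input

linear = 2.0 * x + 1.0     # linear response: remains normal
nonlinear = x**2           # nonlinear response: chi-square, strongly skewed

def skewness(y):
    """Sample skewness; near zero for a symmetric (e.g. normal) distribution."""
    y = y - y.mean()
    return np.mean(y**3) / np.mean(y**2) ** 1.5

skew_linear = skewness(linear)        # close to 0
skew_nonlinear = skewness(nonlinear)  # far from 0 (population value is sqrt(8))
```

Assuming normality for the second response, and reading a failure probability off a fitted normal curve, would badly misestimate the tails.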
Although formulating a problem in probabilistic design requires no extra modeling over a deterministic formulation, a major contributing factor to the lack of probabilistic applications is the time needed to collect data. Making information such as coefficients of variation, standard deviations, etc. readily available will encourage more probabilistic design applications [1].
REFERENCES
