[Computational Mechanics] [Repost] Research Directions in Computational Mechanics

Posted on 2007-3-18 09:59

The paper, written by Tinsley Oden, Belytschko, Babuska and Hughes, is entitled "Research Directions in Computational Mechanics" (Computer Methods in Applied Mechanics and Engineering, vol. 192, pp. 913-922, 2003). The authors outline six areas with significant research opportunities in computational mechanics (CM):

1. Virtual design


In this regard, they note that although great strides have been made in simulation in the past two decades, virtual prototyping is still more of an art than a science. To develop a virtual prototyping capability, many tests must be performed, since many of the physical phenomena cannot yet be modeled on the basis of first principles. Instead, models are tuned to tests, and the technology is therefore not applicable to radically new designs. Specific obstacles to virtual prototyping include the inability to simulate problems with multiphysics phenomena, such as burning and change of phase, fracture and spalling, phenomena involving large disparities in scales, and behavior with significant stochastic characteristics.
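
As a toy illustration of the "models are tuned to tests" point, the following sketch calibrates a single stiffness parameter of a linear force-displacement model against synthetic test data by least squares. This is not from the paper; all values (loads, the "true" stiffness, the noise level) are hypothetical, and a real calibration would involve far richer models and data.

[code]
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical test: measured displacements (m) under a set of applied loads (N)
loads = np.linspace(0.0, 5.0e3, 20)
k_true = 2.0e6                       # "unknown" stiffness used to fabricate the data (N/m)
measured = loads / k_true + rng.normal(0.0, 2.0e-5, loads.size)

# Calibrate k by least squares on the model u = F / k (linear in 1/k)
inv_k, *_ = np.linalg.lstsq(loads.reshape(-1, 1), measured, rcond=None)
k_fit = 1.0 / inv_k[0]
print(f"calibrated stiffness = {k_fit:.3e} N/m (value used to generate data: {k_true:.3e} N/m)")
[/code]

A model tuned this way is only trustworthy near the tested load range, which is exactly why such calibrated models do not transfer to radically new designs.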


2. Multi-scale phenomena (bridging of molecular to continuum models)

A major challenge to CM for the future is to model events in which these remarkably varying scales are significant in a single system or phenomenon; predictive capability then requires modeling the multiple scales simultaneously. Analysis of multi-scale phenomena, while apparently beyond the horizon of contemporary capabilities, is one of the most fundamental challenges of research in the next decade and beyond. So-called scale bridging, in which the careful characterization of mechanical phenomena requires that the model "bridge" the representations of events occurring at two or more scales, demands the development of a variety of new techniques and methods.

In this area, integration of computational methods and devices with experimental or sensing devices is critical. High-fidelity simulation and computational mechanics must involve innovative and efficient use of a spectrum of imaging modalities, including X-ray tomography, electron microscopy, sonar imaging, and many others. Similarly, in modeling phenomena such as climate change, weather conditions, and the interaction of ocean and atmosphere, satellite-generated data must be incorporated seamlessly into viable computational models to obtain meaningful predictions. Again, the spectrum of computational mechanics must be significantly broadened to include these technologies, and the intrinsically interdisciplinary nature of the subject will be expanded and reinforced.
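
The simplest form of scale bridging is hierarchical: fine-scale information is condensed into effective coarse-scale properties. The sketch below, illustrative only and not from the paper, computes the classical Voigt and Reuss estimates of the effective modulus of a two-phase composite (hypothetical phase moduli and volume fractions) and uses each estimate in a coarse, homogenized bar calculation. Concurrent bridging, in which both scales are resolved in the same simulation, is far more demanding.

[code]
# Fine scale: two phases with hypothetical moduli (Pa) and volume fractions
E_fiber, E_matrix = 230e9, 3.5e9
v_fiber = 0.6
v_matrix = 1.0 - v_fiber

# Classical bounds on the effective modulus
E_voigt = v_fiber * E_fiber + v_matrix * E_matrix          # iso-strain (upper bound)
E_reuss = 1.0 / (v_fiber / E_fiber + v_matrix / E_matrix)  # iso-stress (lower bound)

# Coarse scale: elongation of a homogenized bar under an axial load
L, A, P = 1.0, 1.0e-4, 10.0e3        # length (m), cross-section (m^2), load (N)
for name, E_eff in [("Voigt", E_voigt), ("Reuss", E_reuss)]:
    elongation = P * L / (E_eff * A)
    print(f"{name:5s}: E_eff = {E_eff / 1e9:7.2f} GPa, elongation = {elongation * 1e3:.4f} mm")
[/code]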

3. Model selection and adaptivity


Model selection is a crucial element in automating engineering analysis, and its applications are unlimited; the subject could conceivably embrace classes of models spanning diverse spatial and temporal scales, enabling the systematic and controlled simulation of events with anything from atomistic or molecular models to continuum models. Model selection, model error estimation, and model adaptivity are exciting areas of CM that promise to remain active areas of research for the next decade and beyond.


Areas in which adaptive modeling has great promise include the study and characterization of composite materials, unsteady turbulent flows, multiphase flows, and more. Other techniques for model adaptivity involve the use and integration of test and imaging data, feedback from experiments and measurements, and various combinations of these methodologies.
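
A minimal sketch of the model-adaptive idea, using entirely hypothetical data and not taken from the paper: solve a 1D bar problem with a cheap linear material model, evaluate the residual of that solution in a richer (mildly nonlinear) model as a modeling-error indicator, and switch to the richer model only when the indicator exceeds a tolerance.

[code]
E0, beta, A, L, P = 70e9, 50.0, 1e-4, 1.0, 8e3   # hypothetical material and load data
tol = 0.05                                        # relative model-error tolerance

# Coarse model: linear elasticity, sigma = E0 * eps
eps_lin = P / (A * E0)

# Fine model: mildly nonlinear, sigma = E0 * eps * (1 - beta * eps)
def sigma_nl(eps):
    return E0 * eps * (1.0 - beta * eps)

# Model-error indicator: relative residual of the linear solution in the nonlinear model
indicator = abs(sigma_nl(eps_lin) - P / A) / (P / A)
print(f"linear strain = {eps_lin:.3e}, model-error indicator = {indicator:.3f}")

if indicator > tol:
    # Adapt: solve the nonlinear model with a few Newton iterations
    eps = eps_lin
    for _ in range(20):
        r = sigma_nl(eps) - P / A
        if abs(r) < 1e-8 * (P / A):
            break
        eps -= r / (E0 * (1.0 - 2.0 * beta * eps))
    print(f"adapted to nonlinear model: strain = {eps:.3e}")
else:
    print("linear model accepted")
[/code]

The same pattern, with rigorous error estimates in place of this crude indicator, underlies goal-oriented model adaptivity.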

OP | Posted on 2007-3-18 09:59
4. Very large-scale parallel computing


One of the most difficult issues facing researchers in CM in the next decade will be a purely conceptual one: the recalibration of their own education, approach, and perceptions to allow them to make efficient use of the extraordinary computational tools that will be developed during this period. Today, mechanicians using computational products for engineering analysis and design routinely develop computational models involving 500,000–10,000,000 degrees of freedom, and problems of this size are being solved on contemporary workstations. Nevertheless, these contemporary models employ rather crude characterizations of materials, geometry, boundary conditions, failure criteria, and many other important features of the system, because the modeler takes for granted that including such details would produce computational problems so large and complex that they would exceed the capacities of modern computational facilities.

This argument is no longer correct. As the 21st century begins, computational devices capable of delivering five trillion operations per second and storing a thousand trillion bytes of data are in use, and larger machines are being developed. In a decade's time, machines with capabilities an order of magnitude beyond this level may be available. It is probable that such terascale computing capabilities will soon be in the hands of most engineers and mechanicians, making possible models with a level of detail and sophistication unimaginable only a decade ago.

The proper use of this extraordinary toolkit will itself represent a significant challenge. Included in that challenge is the education of the next generation of engineers and mechanicians, who will be expected to master not only the principles of mechanics but also the computational tools available to them.

These new capabilities, and advances in modeling and parallel computation, will ultimately have a remarkable and irreversible impact on education in science and engineering. Simplified models and approximate theories remain important in developing understanding, but students need no longer rehearse only idealized situations: they can now tackle more realistic models.

High-speed parallel computing, together with the software developments alluded to elsewhere in this document, will create a revolution in engineering analysis and, ultimately, in the way it is taught in colleges and universities. Less than a decade ago, many feared that access to modern computational methods and machines would breed overconfidence in engineers, at the expense of common sense, judgement and reasoning. Now the concern is the opposite: underestimation of the power of modern computational methods and devices, and the danger of their underutilization in important simulations, analysis and design.
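
To put the degree-of-freedom counts quoted above in perspective, the sketch below (illustrative only, not from the paper) assembles and solves a sparse one-million-unknown 1D Poisson system with SciPy on a single machine in seconds; in a genuinely large-scale parallel code the same kind of sparse solve would be distributed across many processors, for example by domain decomposition.

[code]
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1_000_000                                  # one million degrees of freedom
h = 1.0 / (n + 1)

# Sparse tridiagonal stiffness matrix for -u'' = 1 on (0, 1) with u(0) = u(1) = 0
main = np.full(n, 2.0)
off = np.full(n - 1, -1.0)
A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc") / h**2
b = np.ones(n)

u = spla.spsolve(A, b)                         # direct sparse solve
print(f"max of computed solution: {u.max():.6f} (exact value 1/8 = 0.125)")
[/code]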


5. Controlling uncertainty: probabilistic methods


The random nature of many features of physical events is widely recognized by industry and researchers. The natural stimuli that activate physical systems may be impossible to capture with deterministic models: the randomness of a gust of wind, the forces appearing in boundary and initial conditions on mechanical systems, random microstructural features of engineering materials, and random fluctuations in temperature, humidity, and other environmental factors all make the characterizations provided by deterministic models of mechanics less satisfactory with respect to their predictive capabilities. Fortunately, the subject of uncertainty can itself be addressed in a scientific and mathematically precise way, and the random characteristics of nature can be captured by computational models. During the next decade, probabilistic modeling of mechanical problems will be a topic of great importance and interest.
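
As a minimal illustration of the probabilistic viewpoint, the sketch below (purely hypothetical numbers, not from the paper) propagates a random Young's modulus through the classical cantilever tip-deflection formula delta = P*L^3/(3*E*I) by Monte Carlo sampling and reports statistics of the response. The same idea scales, at much greater cost, to full finite element models.

[code]
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

# Deterministic data (hypothetical)
P, L = 1.0e3, 2.0                       # tip load (N), length (m)
I = 8.0e-6                              # second moment of area (m^4)

# Random Young's modulus: lognormal with mean ~70 GPa and ~10% coefficient of variation
mean_E, cov_E = 70e9, 0.10
sigma_ln = np.sqrt(np.log(1.0 + cov_E**2))
mu_ln = np.log(mean_E) - 0.5 * sigma_ln**2
E = rng.lognormal(mu_ln, sigma_ln, n_samples)

delta = P * L**3 / (3.0 * E * I)        # sampled tip deflections (m)

print(f"mean deflection  = {delta.mean() * 1e3:.3f} mm")
print(f"std deviation    = {delta.std() * 1e3:.3f} mm")
print(f"95th percentile  = {np.percentile(delta, 95) * 1e3:.3f} mm")
[/code]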

6. Biomedical applications

Predictive modeling of bones, nerves, and other biological systems.

Note: Please feel free to add emerging areas in CM that could serve as interesting research topics for the next decade.

Submitted by A. Yudhanto
