
Industry Spotlight:

Chemical and Processing


FEA is a valuable tool that aids doctors in orthopedic operations

Researchers use ANSYS to develop micron-sized, self-powered mobile mechanisms

Quickly vary geometry even without parametric CAD

Take a look at the future of product development...


...a process that's more automated, more integrated, more innovative and truer to life. That's where ANSYS is taking engineering simulation. By combining technologies like industry-leading meshing, nonlinear analysis and computational fluid dynamics, you can reduce costs and drive products to market quicker. Bring your products and processes to life with ANSYS. Visit www.ansys.com/secret/6 or call 1.866.ANSYS.AI.

Contents

Industry Spotlight
6  Chemical and Processing
   A continuing series on the value of engineering simulation in specific industries

Features
10  ANSYS for Virtual Surgery
    FEA is a valuable tool that aids doctors in orthopedic operations
14  FEA in Micro-Robotics
    Researchers use ANSYS to develop micron-sized, self-powered mobile mechanisms
28  Design Insight for Legacy Models
    Quickly vary geometry even without parametric CAD

Departments
2   Editorial - To Collaborate, You Need People
3   Industry News - Recent Announcements and Upcoming Events
16  Software Profile - The New Face of ANSYS ICEM CFD
18  CFD Update - Simulation Helps Improve Oil Refinery Operations
25  Managing CAE Processes - Upfront Analysis in the Global Enterprise
26  Simulation at Work - Analysis of Artificial Knee Joints
33  Tech File - Demystifying Contact Elements
36  Tips and Techniques - Contact Defaults in Workbench and ANSYS
40  Guest Commentary - Putting Quality Assurance in Finite Element Analysis

About the cover


There are many examples of successful chemical and processing companies using ANSYS simulation technology to improve products and processes. Our cover article describes how Twister BV used ANSYS CFX to reduce costs by 70% compared to the conventional route without CFD in developing gas separator equipment.

Want to continue receiving ANSYS Solutions?


Visit www.ansys.com/subscribe to update your information. Plus, you'll have the chance to sign up to receive CFX eNews and email alerts when the latest electronic version of ANSYS Solutions becomes available!

For ANSYS, Inc. sales information, call 1.866.267.9724, or visit www.ansys.com on the Internet. Go to www.ansyssolutions.com/subscribe to subscribe to ANSYS Solutions.
Editorial Director: John Krouse (jkrouse@compuserve.com)
Managing Editor: Jennifer L. Hucko (jennifer.hucko@ansys.com)
Designers: Miller Creative Group (info@millercreativegroup.com)
Art Director: Paul DiMieri (paul.dimieri@ansys.com)
Ad Sales Manager: Ann Stanton (ann.stanton@ansys.com)
Circulation Manager: Elaine Travers (elaine.travers@ansys.com)
Editorial Advisor: Kelly Wall (kelly.wall@ansys.com)
CFD Update Advisor: Chris Reeves (chris.reeves@ansys.com)

ANSYS Solutions is published for ANSYS, Inc. customers, partners, and others interested in the field of design and analysis applications.
The content of ANSYS Solutions has been carefully reviewed and is deemed to be accurate and complete. However, neither ANSYS, Inc., nor Miller Creative Group guarantees or warrants accuracy or completeness of the material contained in this publication. ANSYS, ANSYS DesignSpace, CFX, ANSYS DesignModeler, DesignXplorer, ANSYS Workbench Environment, AI*Environment, CADOE and any and all ANSYS, Inc. product names are registered trademarks or trademarks of subsidiaries of ANSYS, Inc. located in the United States or other countries. ICEM CFD is a trademark licensed by ANSYS, Inc. All other trademarks or registered trademarks are the property of their respective owners. POSTMASTER: Send change of address to ANSYS, Inc., Southpointe, 275 Technology Drive, Canonsburg, PA 15317, USA. 2004 ANSYS, Inc. All rights reserved.

www.ansys.com

ANSYS Solutions

Summer 2004

Editorial

To Collaborate, You Need People


Intellectual capital for creating innovative designs is lacking at manufacturers that skimp on jobs.

By John Krouse, Editorial Director, ANSYS Solutions (jkrouse@compuserve.com)

One of the most significant, and possibly least recognized, aspects of engineering simulation is that the technology can be a tremendously effective communication and collaboration tool in product development. By using virtual prototyping, what-if studies and a wide range of other analyses to show how proposed products will perform, engineering simulation can give people in multi-functional product development teams tremendous insight into designs. The technology also provides an effective way for team members to interact, with disciplines outside engineering able to see the impact of their various ideas, suggestions, feedback and input. In this way, teams can investigate even the most unconventional ideas, some of which can turn out to be the basis of ground-breaking new products.

Collaborative product development is a growing trend in manufacturing industries, getting engineers and analysts working with others across the extended enterprise: manufacturing, testing, quality assurance, sales, marketing and service, and even those outside the company such as suppliers, customers, consultants and partners. These people typically don't know how to build meshes, define boundary conditions, run analyses or perform optimizations. But they can see the impact of what simulations show, and they can provide valuable feedback in spotting, evaluating and fixing potential problems. Marketing could suggest a different contour that would make a consumer product more saleable, for example, or procurement might suggest alternate suppliers for stronger and less expensive components to reduce excess stress. This multi-functional synergy is the basis for the creativity necessary to develop innovative products and processes that might not immediately occur to individuals working separately.

Companies need people for multidisciplinary collaboration. But, unfortunately, jobs at manufacturing firms are in steady decline. According to the National Association of Manufacturers, after peaking at 17.3 million in mid-2000, manufacturing employment has fallen by 2.8 million, while employment in non-manufacturing sectors of the economy rose by 671,000 to 115 million. Data from the U.S. Bureau of Labor Statistics indicates there were 2,378 extended mass layoffs in manufacturing during 2002 alone, resulting in 454,034 workers being removed from their jobs.

Meanwhile, the overall economy is rebounding, with the Dow Jones Industrial Average undergoing a strong sustained rally and corporate profits up. Forecasters at the National Association for Business Economics predict that the U.S. economy will show robust annual growth of 4.5% in 2004. Despite this strong economic growth, payrolls in manufacturing continue to go down as manufacturers operate with as few people as possible. Running these super-lean operations pumps up short-term profits. But manufacturers cannot sustain long-term growth based on savings from a barely adequate workforce being stretched to the limit. Product quality, customer service and brand image ultimately suffer, as do product innovations that spring from collaborative design.

To collaborate, you need people: ones with enough time in the workday to apply their knowledge on creative projects. When manufacturers cut jobs indiscriminately, they're not just getting rid of salaried bodies, they're discarding the company's most valuable asset: the wealth of intellectual capital in its workforce. Companies that fall into this trap risk being left behind in the market by astute competitors with enough sense to invest in their workers and the knowledge they bring to the collaborative processes necessary to develop winning products.

Collaboration taps into the intellectual capital of the enterprise: the combined know-how and insight of workers about the company's operation, its products and its customers.


Industry News

Recent Announcements
EASA 3.0 - The New Standard for Efficient Application Development

EASA enables ultra-rapid creation and deployment of Web-enabled applications that can drive most applications, including ANSYS and CFX. EASA also can be used to integrate several tools, thus automating processes involving, say, CAD, FEA and even in-house codes. EASA is available as a software product to author and publish your own custom applications. Alternatively, several ASDs are now using EASA to create turnkey applications to your specification as a service. New features in EASA 3.0 include:

- Connectivity to relational databases such as SQL Server and Oracle, and to database applications such as ERP, CRM and PLM systems.
- Improved security for Internet use via Secure Socket Layer (SSL) technology, enabling you to host applications for use over the Internet.
- Multi-language EASAPs: create your app in your language, and users see it in their preferred language. Character sets supported include Roman, Chinese, Japanese, Russian and Arabic.
- New parametric study and optimization capabilities.
- New API: EASA's differentiator has always been to allow non-programmers to create professional-grade Web-enabled applications around their underlying software. Now an API allows EASA authors who have programming skills to create applications at the next level by using custom code.

For more information, visit www.ease.aeat.com.

2004 International ANSYS Conference Hailed a Success

Engineering professionals from throughout the world gathered at the Hilton Pittsburgh in May for the 2004 International ANSYS Conference to discover the true meaning behind what it is to "Profit from Simulation."

Vision and strategy set the theme for the general session. Kicking off the conference with a welcome address, ANSYS president and CEO Jim Cashman set the stage for keynote speaker Brad Butterworth of Team Alinghi. As the cunning strategist aboard the Team Alinghi yachts, Brad shared his experience and discussed how the America's Cup winner is using ANSYS integrated simulation solutions to defend its title in the 2007 competition. After the morning break, ANSYS presented its Technology Roadmap, the company's successful, ongoing strategy for integrating the power of the entire ANSYS, Inc. family of products into the ultimate engineering simulation solution. Then Bruce Toal, director of Marketing and Solutions, High Performance Technical Computing Division at Hewlett-Packard Company, spoke about the company's Adaptive Enterprise for Design and Manufacturing. Following a day of technical and general sessions, and visits to exhibitor booths, attendees enjoyed a conference social sponsored by Hewlett-Packard on Monday evening. Standing ovations and triumphant applause echoed throughout the ballroom during the social as Jim Cashman presented Dr. John Swanson, ANSYS founder, with an award marking his receipt of the 2004 AAES John Fritz Medal.

ANSYS' long-standing partners and key customers took to the podium for the Tuesday general session. LMS International's Tom Curry, executive vice president and chief marketing officer, spoke about the product creation process; Tom guides the company's growth in predictive computer-aided engineering, physical prototyping and related services. Herman Miller's Larry Larder, director of engineering services, discussed how the company uses ANSYS simulation technologies to experiment and innovate in the office furniture industry.



SGI's director of product marketing, Shawn Underwood, presented the future of high performance computing, followed by Dr. Paresh Pattani, director of HPC and Workstation Applications at Intel Corporation, who focused on the paradigm shift in high performance computing. Jorivaldo Medeiros, technical consultant at PETROBRAS, offered his ANSYS success story on how the company drives development and innovation in equipment technology. In addition, ANSYS became the first engineering simulation company to solve a 111 million degrees of freedom structural analysis model. After lunch, the Management Track addressed strategies on how to implement new technologies and explain the benefits of engineering simulation to management.

ANSYS Breaks Engineering Simulation Solution Barrier

ANSYS, Inc. has become the first engineering simulation company to solve a structural analysis model with more than 100 million degrees of freedom (DOF), making it possible for ANSYS customers to solve models of aircraft engines, automobiles, construction equipment and other complete systems. In a joint effort with Silicon Graphics, Inc. (SGI), the 111 million DOF structural analysis problem was completed in only a few hours using an SGI Altix computer. DOF refers to the number of equations being solved in an analysis, giving an indication of a model's size. ANSYS' ability to solve models this large opens the door to an entirely new simulation paradigm. Prior to this capability, a simulation could be conducted only at a less detailed level for a complete model, or only at the individual component level for a detailed model. "Now, it will be possible to simulate a detailed, complete model directly, potentially shortening design time from months to weeks. Equally important, having a high-fidelity comprehensive model can allow trouble spots to be detected much earlier in the design process. This may greatly reduce additional design costs and can provide an even shorter time to market," said Jin Qian, senior analyst at Deere & Company Technical Center.

According to Marc Halpern, research director at Gartner, although simulation accelerates the delivery of quality products to market, users have faced major challenges to realizing its full value. For example, hardware and software limitations have historically made realistic simulations elusive when realism involves highly detailed models and complex physical behavior. "Manufacturers are looking for more accurate, large system simulations to improve their time-to-money," said Charles Foundyller, CEO at Daratech, Inc. "This announcement means that users now have a clear roadmap to improved productivity." As hardware advances in speed and capacity, ANSYS is committed to being the leader in developing CAE software applications that take advantage of the latest computing power. This leadership provides customers with the best engineering simulation tools for their product development process to help achieve better cost, quality and time metrics. This powerful new offering from ANSYS speaks to its commitment to develop and deliver the best in advanced engineering solutions. In turn, ANSYS has entered into a three-year partnership with SGI to advance the capabilities of ANSYS in parallel processing and large memory solutions.
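As the article notes, DOF counts the equations solved and so indicates model size. As a minimal illustration (the node count and DOF-per-node below are assumed round numbers, not figures from the SGI/ANSYS benchmark), a 3-D structural solid mesh carries three displacement unknowns per node:

```python
# Minimal illustration of how structural model size is counted in
# degrees of freedom (DOF). The node count and DOF-per-node are assumed
# round numbers, not figures from the SGI/ANSYS benchmark.

def structural_dof(n_nodes: int, dof_per_node: int = 3) -> int:
    """Each node of a 3-D solid mesh carries 3 displacement unknowns."""
    return n_nodes * dof_per_node

# 37 million nodes at 3 DOF each give the 111 million equations quoted:
model_dof = structural_dof(37_000_000)
```

In practice the solver reports this count itself; the point is simply that DOF scales linearly with mesh size, which is why complete-system models grow so large.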

Safe Technology Incorporates AFS Strain-Life Cast Iron Database in fe-safe

Safe Technology Ltd has been granted a license to use the AFS cast iron database from the research report "Strain-Life Fatigue Properties Database for Cast Iron" in fe-safe, its state-of-the-art durability analysis software suite for finite element models. Safe Technology Ltd is a technical leader in the design and development of durability analysis software, pushing the boundaries of fatigue analysis software to ensure greater accuracy and confidence in modern fatigue analysis methods for industrial applications. The availability of the AFS database within fe-safe ensures that users will have access to the most up-to-date and accurate cast-iron materials data for their durability analyses. The AFS Ductile Iron and Gray Iron Research Committees have developed the Strain-Life Fatigue Properties Database for Cast Iron. This database represents the capability of the domestic casting industry and is available as a special AFS publication. It is the culmination of a five-year effort in partnership


with the DOE Industrial Technology Program. The scope of this information includes 22 carefully specified and produced castings from ASTM/SAE standard grades of irons, including Austempered Gray Iron (AGI), whose specification is under development. Each grade is comprehensively characterized from an authoritative source with chemical analysis, microstructure analysis, hardness tests, monotonic tension tests and compression tests. This information is contained in user-friendly digital files on two CD-ROMs for importing into computer-aided design software. AFS publications are described online at www.afsinc.org/estore/. For more information, visit www.safetechnology.com.

Product Development Platform Will Simulate and Optimize Design Performance for Autodesk Inventor Professional Customers

Autodesk will license ANSYS simulation technologies and package them as an integral part of the Autodesk Inventor Professional 9.0 product and future releases. Powered by ANSYS part-level stress and resonant frequency simulation technologies, Autodesk Inventor Professional 9.0 will enable design engineers to create more cost-effective and robust designs, based on how the products function in the real world, by facilitating quick and easy what-if studies right within the software's graphical user interface.

"Autodesk is proud to be working with an industry innovator like ANSYS," said Robert Kross, vice president of the Manufacturing Solutions Division at Autodesk. "This reinforces our commitment to deliver proven and robust technologies to manufacturers, in order to help them deliver better quality products and bring them to market faster. Inventor Pro 9.0 will make simulation (CAE) functionality available to a broader mechanical design community, while protecting customers' business investment by seamlessly integrating with other high-end ANSYS offerings. Our customers will surely benefit from this relationship. The total solution will help product development teams make more informed decisions earlier in the design process, allowing them to reduce costs and development time while designing better and more innovative products."

"This new offering from Autodesk will be viewed very strategically by their customers. As they deploy simulation tools throughout their product design process, the Autodesk-ANSYS offering will be a key component to a customer's overall simulation strategy," said Mike Wheeler, vice president and general manager of the Mechanical Business Unit at ANSYS. "ANSYS is proud to be part of the design effort to create this next generation tool as part of our overall ANSYS Workbench development plan."

Upcoming Events

Date | Event | Location
August 29-September 3 | ICAS 2004 | Yokohama, Japan
September 5-8 | RoomVent 2004 | Coimbra, Portugal
September 6-9 | 17th International Symposium on Clean Room Technology | Bonn, Germany
September 7-8 | European UGM for Automotive Applications | Neu-Ulm, Germany
September 19-20 | Radtherm User Conference | Troy, Michigan, USA
September 21-22 | German Aerospace Congress 2004 | Dresden, Germany
September 22-25 | 3rd International Symposium on Two-Phase Flow Modeling and Experimentation | Pisa, Italy
September 29-30 | Numerical Analysis and Simulation in Vehicle Engineering (Calculation & Simulation in Vehicle Building) | Wurzburg, Germany
September 29-30 | Pump Users International Forum 2004 | Karlsruhe, Germany
September 28-October 2 | ASME DETC/CIE Conference | Salt Lake City, Utah, USA
October 4 | 2004 PLM European Event | UK
October 4-5 | DaratechDPS | Novi, Michigan, USA
October 12 | ANSYS Multiphysics Seminar | Sweden
October 13 | Construtec Conference | Spain
October 20 | ANSYS 9.0 Update Seminar | Sweden
October 28-29 | ANSYS User Conference | Mexico

Industry Spotlight

Industry Spotlight:

Chemical and Processing


A continuing series on the value of engineering simulation in specific industries.
The chemical and processing industries provide the building blocks for many products. By using large amounts of heat and energy to physically or chemically transform materials, these industries help meet the world's most fundamental needs for food, shelter and health, as well as products that are vital to such advanced technologies as computing, telecommunications and biotechnology. According to the American Chemical Society, chemical and processing industries account for 25% of manufacturing energy use. These industries consume fossil resources as both fuel and feedstock, and produce large amounts of waste and emissions. In turn, as exemplified by the U.S. Government's 2020 Vision, these industries face major challenges to meet the needs of the present without compromising the needs of future generations in the face of increasing industrial competitiveness. This translates into the need to make processes much more energy efficient, safer and more flexible, and to reduce emissions to meet the many competitive challenges of a global economy. Beyond the need to reduce design cycle times and costs, major challenges where simulation has an important role include:

- Scale-up, to extrapolate a process from laboratory and pilot-plant scale to industrial plant scale, which may require an investment of many millions of dollars.
- Process intensification, to combine different processes into smaller, compact and efficient units instead of treating them as individual processes.
- Retrofitting, to upgrade a plant to become more efficient within the many constraints of its existing footprint.

This issue of ANSYS Solutions provides examples of these in offshore oil production, waste water treatment and chemical processing; many other examples highlighting the benefits to be obtained can be found on the ANSYS CFX Website at www.ansys.com/cfx. These problems are inherently multi-scale, combining different physical and chemical processes at the molecular level with the macro-flow processes that transport a reacting fluid around the complex geometries of a large industrial chemical reactor. Recent advances in modeling capabilities, combined with the scalable parallel performance of low-cost hardware and the powerful geometry and meshing tools in the ANSYS software modules, open up many new opportunities to achieve major new benefits in the complex and demanding world of the chemical and process industries.

Offshore platform

Case-in-point:
Integral Two-Phase Flow Modeling in Natural Gas Processing
Customized version of CFX reduces costs 70% compared to the conventional route without CFD in developing gas separator equipment.
By Marco Betting, Team Leader, Twister Technology; Bart Lammers, Fluid Dynamics Specialist; and Bart Prast, Fluid Dynamics Specialist, Twister BV

Natural gas processing involves dedicated systems to remove water, heavy hydrocarbons and acidic vapors from the gas stream to make it suitable for transportation to the end customer. From a process engineering perspective, these systems are combinations of flashes, phase separations, flow splitters, and heat and mass exchangers, exhaustively designed to achieve the required export specifications. While the process engineer is concerned with finding the optimal system configuration using pre-defined process steps and equilibrium thermodynamics, the flow-path designer tries to optimize the performance of each individual process step in the system based on an understanding of both two-phase flow behavior and non-equilibrium thermodynamics. The fluid dynamics interaction between subsequent process steps is not always taken into account to its full extent, even though this can strongly influence the total system performance.

[Figure: Normalized total C8 fraction in the vortex section of the Twister Supersonic Separator, showing C8 separation.]

Developing and designing new equipment for the process industry is a time-consuming and expensive exercise. Twister BV (www.twisterbv.com) offers innovative gas processing solutions that can play an essential role in meeting these challenges. The team has been developing the Twister Supersonic Separator, which is based on a

[Figure: Twister separator schematic. Saturated feed gas (100 bar, 20°C) passes a Laval nozzle into the supersonic wing section (Mach 1.3, about 500 m/s; 30 bar, -40°C), then a cyclone separator (about 300,000 g); expander, compressor and co-axial diffusers deliver dry gas (70 bar, 5°C) and liquids + slip-gas (70 bar, 5°C). Insets show the liquid/vapor split and the uniform C8 distribution at the inlet.]

In Twister, the feed gas is expanded to supersonic velocity, thereby creating a homogeneous mist flow. During the expansion, a strong swirl is generated via a delta wing, causing the droplets to drift toward the circumference of the tube. Finally, a co-axial flow splitter (vortex finder) skims the liquid-enriched flow from the dried flow in the core. The two flows are recompressed in co-axial diffusers, resulting in a final pressure approximately 35% less than the feed pressure.


unique combination of known physical processes, combining aerodynamics, thermodynamics and fluid dynamics to produce a revolutionary gas conditioning process. The route from a new Twister tube concept to marketable hardware via several production field trials has been a major undertaking. Reducing costs in the cycle of designing, testing and redesigning Twister prototypes for the challenging conditions involved in high-pressure sour natural gas processing is of great importance. The introduction of computational fluid dynamics in the Twister development four years ago resulted in a cost reduction of approximately 70% compared to the conventional route without CFD.

Customized Version of CFX

Twister BV and ANSYS CFX jointly have developed a customized version of CFX-5.6*, capable of modeling non-equilibrium phase transition in multi-component real gas mixtures. The consulting team at ANSYS was chosen to perform this work because of their understanding of the needs of the industry and the flexible nature of CFX-5, which made it suitable for implementing the specialized features required. The specific features of this customized two-phase CFD code are:

- Full equations of state, including the effects of phase change
- Multi-component gases with several condensable species
- A homogeneous nucleation model to determine the droplet number density
- A growth model, to allow for the change in size of the particles through condensation and evaporation
- Droplet coalescence models, depending on droplet size, number density and turbulence intensities
- Slip models to predict the separation of the droplets from the continuous phase
- Accounting for turbulent dispersion
- Coupling of the above models via mass, momentum and energy equations, with the energy equation affected by the release of latent heat during condensation and evaporation

The development and validation of the customized CFX code was of paramount importance in maturing the Twister separator for commercial application in the oil & gas industry. This custom version of CFX-5 includes all first-order effects useful for determining the performance of liquid/gas separators preceded by an expander or throttling valve.
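To give a feel for what the slip models in the feature list describe, the terminal ("drift") velocity of a droplet under Stokes drag can be estimated for the roughly 300,000 g centrifugal field quoted for the cyclone section. This is a hedged order-of-magnitude sketch: the droplet diameter, density difference and gas viscosity below are illustrative assumptions, not Twister design data, and Stokes drag is only valid for small droplet Reynolds numbers.

```python
# Order-of-magnitude sketch of the droplet slip ("drift") velocity that
# slip models estimate, using Stokes drag in the ~300,000 g centrifugal
# field quoted for the cyclone section. Droplet size, density difference
# and gas viscosity are illustrative assumptions, not Twister design data.

G = 9.81  # standard gravity, m/s^2

def stokes_slip_velocity(d: float, delta_rho: float, accel: float, mu: float) -> float:
    """Terminal velocity of a small sphere: v = d^2 * delta_rho * a / (18 * mu)."""
    return d ** 2 * delta_rho * accel / (18.0 * mu)

v = stokes_slip_velocity(
    d=1e-6,                # 1 micron droplet diameter
    delta_rho=600.0,       # liquid-minus-gas density difference, kg/m^3
    accel=300_000 * G,     # centrifugal acceleration in the cyclone
    mu=1.3e-5,             # gas dynamic viscosity, Pa*s
)
# v comes out on the order of metres per second, which is why micron-sized
# droplets can be centrifuged out within a very short residence time.
```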

Twister and LTX separators

[Figure: flow regimes entering and leaving the separators - G+L dispersed, G+L stratified, G+L dispersed.]

For a process engineer, the quality of the gas coming over the top of the separator is determined by the phase equilibrium after an isenthalpic flash, presuming a certain liquid carry-over. The flow-path designer is concerned with reducing the carry-over by optimizing the flow variables of the separator, based on a feed with presumed droplet sizes.

Using the customized two-phase code, the flow path designer can study the influence of the geometry of a choke valve on the resulting droplet size distribution and better assess the performance of the separator based thereon.

[Figure: LTX separator fed by a choke valve. Stream labels track P, T, flow, LGR and composition at each stage, plus droplet size and droplet number downstream of the valve; the inset shows Mach number contours from 0.0 to 1.4.]

Improving Facility Performance


Predicting droplet sizes is essential for optimizing the separation performance of Twister. The droplet size is determined by both the vapor diffusion rate toward the droplets and the mutual agglomeration of these droplets. The size distribution mainly depends on the time interval of the nucleation pulse. The droplet size distribution determines the drift velocity of the liquid phase and hence determines the separation efficiency. Appropriate models for this have been implemented. This customized CFX code also enables the process engineer to better understand the relationship between the performance of subsequent process steps, e.g., the operation of a Low Temperature Separator (LTS) fed by a choke valve. Twister BV and ANSYS CFX have completed a powerful CFD code validated for natural gas processes. This unique CFD capability enables process engineers to optimize engineering practices while increasing the performance of gas processing facilities.

*I.P. Jones et al., "The use of coupled solvers for multiphase and reacting flows," 3rd International Conference of CSIRO, 10-12 December 2003, Melbourne, Australia.



ANSYS for Virtual Surgery


FEA is a valuable tool that aids doctors in orthopedic operations.
By András Hajdu and Zoltán Zörgő, Institute of Informatics, University of Debrecen, Hungary

Analysis, imaging and visualization technologies are being applied increasingly in medical applications, particularly in evaluating different approaches to surgery and determining the best ways to proceed in an operation. In this growing field, one of the primary focuses of our work is applying finite element analysis to orthopedic surgery: specifically, the specialized area of osteotomy, where bones are surgically segmented and repositioned to correct various deformities. We chose ANSYS for this work because of the reliability and flexibility of the software in handling the irregular geometries and nonlinear properties inherent in these materials. Medical imaging technologies such as CT, MRI, PET or SPECT deliver slice or projection images of internal areas of the human body. These tools are generally used to visualize configurations of bones, organs and tissue, but they also have the ability to export image data and additional information in commonly known medical file formats like DICOM.

These files then can be processed by third-party computer programs for assessing and diagnosing the condition of the patient and planning surgical intervention, that is, how the surgical procedure will be performed. Other very promising fields include telesurgery, virtual environments in medical school education and prototype modeling of artificial joints. The goal of our research is to develop computer applications in the field of orthopedic surgery, especially osteotomy intervention procedures based on CT images. The team at the Institute of Informatics uses this simulation technology to examine theories underlying new types of surgeries as well as to aid doctors in treating individual patients undergoing hip joint correction. These two approaches have many common tasks: extracting image data from diverse medical image exchange format files, enhancing images, choosing the appropriate segmentation techniques, CAD-oriented volume reconstruction, data exchange with FEM/FEA tools, and geometric description of the virtual surgery.


Building Orthopedic Models
CT data files. The first step in building an orthopedic model is extracting an image file from medical data exchange formats. As CT images represent the X-ray absorption of a given cross-section, the intensity values of their pixels represent this 12-bit absorption rate rather than common color ranges. Since the slice density is usually reduced to a minimum for in-vivo scanning, considerable information often is lost, especially in complex regions of the human body. For visualization purposes, this deficiency can be compensated for with interpolation techniques, but no lost anatomical data can be recovered in this way. Using these files for FEA work thus often requires further enhancement.

Image enhancement and segmentation. As given tissue structures have their own absorption rate intervals, a windowing technique might be sufficient for a simple visualization. However, because these intervals can overlap, other tissue parts that differ from our VOI (Volume Of Interest) remain in the image after applying the intensity window. Some conventional procedures, like morphological or spectral-space filtering, must be applied, as well as specific techniques for CT segmentation. We found that other methods, such as region growing and gradient-based segmentation, achieved excellent results for bone structures.

Volume reconstruction. The final goal of the project is to develop an application to be used in surgery planning on a routine basis by medical staff without experience in using CAD-related software. We wanted this application to be able to transfer structural data into finite element modeling and analysis software. Thus, volumetric information must be represented in a geometrically appropriate way. There is a difference between simple surface rendering and geometrical volume reconstruction in CAD systems.
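The windowing and region-growing steps described above can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: the threshold values and the 4-connected region growing are illustrative choices, and the tiny synthetic slice stands in for real 12-bit CT data.

```python
# Sketch of the two segmentation steps described in the text: an
# intensity window to isolate the absorption range of bone, then a
# simple region grow from a seed to discard overlapping tissue left
# inside the window. Thresholds and connectivity are illustrative.
import numpy as np
from collections import deque

def window(slice_12bit, lo, hi):
    """Binary mask of pixels whose absorption value falls in [lo, hi]."""
    return (slice_12bit >= lo) & (slice_12bit <= hi)

def region_grow(mask, seed):
    """Keep only the connected component of `mask` containing `seed`."""
    grown = np.zeros_like(mask)
    q = deque([seed])
    while q:
        r, c = q.popleft()
        if 0 <= r < mask.shape[0] and 0 <= c < mask.shape[1] \
                and mask[r, c] and not grown[r, c]:
            grown[r, c] = True
            q.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return grown

# Tiny synthetic "slice": two bright regions, only one of which is bone.
img = np.zeros((8, 8), dtype=np.int16)
img[1:4, 1:4] = 2000       # bone-like intensities
img[6, 6] = 2000           # unrelated bright pixel elsewhere
bone = region_grow(window(img, 1500, 4095), seed=(2, 2))
```

Growing from a seed inside the bone keeps the 3x3 block and drops the stray bright pixel that survived the window alone.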
Volumetric data has to be represented using solid modeling primitives and reconstructed using the related concepts: keypoints, parametric splines, line loops, ruled and planar surfaces, volumes and solids. When extracting contour points of ROIs (Regions Of Interest), we need to reduce the number of points to approximately 10-15% by keeping only points with rapidly changing surroundings. These points then can be interpolated with splines, the splines assembled into surfaces and the surfaces into solids. The major difficulty is that CAD-related systems are designed to work with regular-shaped objects, and bone structures are not like that. However, to be able to execute FEA, it is necessary to use this approach. Moreover, virtual surgery interventions have to be carried out on this representation, or in such a way that a proper geometrical representation of the modified bone structure remains easy to recover. Because conversion problems often occur when exchanging data between CAD systems, we perform the volume reconstruction procedure directly in the FEM software using built-in tools provided in the package. After testing many FEM programs, we chose ANSYS software for this task. Figure 1 illustrates how we

Figure 1. Steps of volume (bone) reconstruction in ANSYS.
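The contour-point reduction mentioned above (keeping only points with rapidly changing surroundings) can be illustrated with a turn-angle filter. This is a minimal stand-in for the idea, not the team's actual algorithm, and the angle threshold is an assumed parameter.

```python
import math

def reduce_contour(points, angle_deg=10.0):
    """Keep only the vertices of a closed contour where the local
    direction changes rapidly; vertices on straight runs are dropped."""
    kept = []
    n = len(points)
    for i in range(n):
        x0, y0 = points[i - 1]            # previous vertex (wraps around)
        x1, y1 = points[i]                # current vertex
        x2, y2 = points[(i + 1) % n]      # next vertex (wraps around)
        a_in = math.atan2(y1 - y0, x1 - x0)
        a_out = math.atan2(y2 - y1, x2 - x1)
        # Absolute turn angle, normalized to [0, pi]
        turn = abs((a_out - a_in + math.pi) % (2 * math.pi) - math.pi)
        if math.degrees(turn) > angle_deg:
            kept.append(points[i])
    return kept

# Square with extra mid-edge points: only the four corners survive
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
corners = reduce_contour(square)
```

On smooth anatomical contours this kind of filter keeps the curved regions dense while thinning the nearly straight spans, which is the behavior needed before spline interpolation.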


Approaches to Virtual Surgery


Planar approach. There are cases when information from 2-D slices is sufficient for performing virtual surgery instead of using 3-D solids. For example, the first subject of our project, human femur lengthening using a helical incision, provided a good opportunity for experimenting with 3-D interventions performed in 2-D. By taking the intersection (dark section in Figure 2) of the theoretical cutting tool path (Figure 2, left) with the planes of the individual CT slices, we subtracted these profiles from the bone section profile (Figure 3). After the volume reconstruction using this technique, we obtained the modified bone structure without the need for further intervention. Another possibility is to use ANSYS to build up the geometric model of the bone and the cutting tool from their boundary lines, then to remove the solid defined by the path of the cutting tool. The team wrote an ANSYS script to obtain fast and automatic model creation. In the case of the hip joint correction, some interventions also might be simulated in 2-D, but designating and registering ROIs on the slice set is more difficult. However, handling volumes as sets of unsorted 3-D points with additional attributes serves as an intermediate solution.

Three-dimensional approach. For the first subject, the 3-D approach we adopted combined the volume reconstruction technique with conventional CAD modeling. We reconstructed the middle part (diaphysis) of the human femur and, in the same coordinate system, using the axis of the actual bone, constructed the solid object representing the path of the cutting tool. This was achieved by applying a helical extrusion along this axis on a rectangle, using the parameters of the actual osteotomy. By subtracting these solids from each other, we

Figure 2. Part of theoretical path and planar intersection of the cutting tool.

reconstructed an 8-inch section of a femur (a pipe-like bone) in ANSYS using this procedure. The entire reconstruction procedure was implemented in a simple ANSYS script file. This method extends naturally to bones containing more parts, holes and so on; in such cases, the Boolean operations between solids provided by ANSYS give us a powerful tool. Another challenging problem currently being investigated is the reconstruction of those parts of the bones where the CT slices contain varying topology (e.g., when reaching a junction in certain special bones).

Figure 3. Subtraction of the cutting tool from a bone section profile in 2-D, and the 3-D outcome.
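The slice-by-slice subtraction shown in Figure 3 can be illustrated on a single scanline. This 1-D sketch stands in for the full 2-D profile Boolean; interval endpoints here are arbitrary example values.

```python
def subtract_interval(segment, cut):
    """Remove the overlap of `cut` from `segment` on one scanline.

    Both arguments are (start, end) pairs; the result is a list of
    0, 1 or 2 remaining pieces of the bone segment.
    """
    s0, s1 = segment
    c0, c1 = cut
    pieces = []
    if c0 > s0:
        pieces.append((s0, min(s1, c0)))   # part left of the cut
    if c1 < s1:
        pieces.append((max(s0, c1), s1))   # part right of the cut
    return [(a, b) for a, b in pieces if a < b]   # drop degenerate pieces

# Cutting-tool profile (3, 5) removed from a bone span (0, 10)
remaining = subtract_interval((0, 10), (3, 5))
```

Repeating this per scanline and per slice yields the modified profiles from which the volume is rebuilt, mirroring the planar approach described in the text.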

Figure 4. Subtracting a helix from the diaphysis.



Figure 5. Tetrahedron mesh for GL visualization and FEA.

Figure 6. Different bar hole types and variable helix paths to improve efficiency of lengthening.

obtained the desired solid object (Figure 4). This Boolean subtraction also was executed in ANSYS. As previously mentioned, we also are working on pre-operative analysis and comparison of hip joint osteotomy. The 3-D reconstruction of this region is more difficult because of the information loss during the CT scanning procedure: there are many consecutive slice pairs with large differences. In this case, interpolation gives no satisfying results, and we are developing general methods to reduce the level of user interaction required. Our interface for virtual surgery is GLUT-based, containing I/O tools for importing existing meshes and exporting the model into a FEM/FEA environment. Besides using scripts similar to those described above for building up the geometry, we also take advantage of the mesh generation and management capabilities of ANSYS in data exchange. That way, we can import a tetrahedron mesh used with OpenGL into ANSYS for FEA, for example, and ANSYS geometry also can be exported as a tetrahedron mesh for visualization purposes. Figure 5 shows an example of a tetrahedron mesh visualization in OpenGL.

FEM/FEA results. Using the volume reconstruction approach, we needed only to translate our internal representation to the scripting language. Material types and parameters also can be defined using scripts. The bone material model we used is linear isotropic. After applying constraints and forces on the nodes of the solids, we tested the stress and displacement of the bone structure. Using the obtained results, a comparison can be made among the known osteotomy interventions of a certain type. For femur lengthening, our experience indicated that the highest stress values occurred around the starting and ending boreholes of the cut, so we also considered the usability of different borehole types and helices with variable pitch, as shown in Figure 6.
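For a simple uniaxial check, the linear isotropic bone model reduces to Hooke's law. The sketch below assumes a textbook modulus for cortical bone of roughly 17 GPa and a hypothetical load case; neither value comes from the study itself.

```python
def axial_stress_and_strain(force_n, area_m2, youngs_pa=17e9):
    """Uniaxial stress (Pa) and strain under a linear isotropic model.

    youngs_pa is an assumed Young's modulus for cortical bone
    (~17 GPa from published data books, not a value from the article).
    """
    stress = force_n / area_m2     # sigma = F / A
    strain = stress / youngs_pa    # epsilon = sigma / E (Hooke's law)
    return stress, strain

# Hypothetical 700 N body-weight load on a 3 cm^2 femoral cross-section
sigma, eps = axial_stress_and_strain(700.0, 3e-4)
```

The full FE model of course resolves the local stress concentrations around the boreholes, which this single-number estimate cannot capture.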
Dr. András Hajdu is an instructor with the Institute of Informatics at the University of Debrecen in Hungary and can be contacted at hajdua@inf.unideb.hu. His research is supported by OTKA grants T032361, F043090 and IKTA 6/2001. Zoltán Zörgő (zorgoz@inf.unideb.hu) is a PhD student at the Institute.

References and Resources for Further Reading

H. Abé, K. Hayashi and M. Sato (Eds.): Data Book on Mechanical Properties of Living Cells, Tissues, and Organs, Springer-Verlag, Tokyo, 1996.
Z. Csernátony, L. Kiss, S. Manó, L. Gáspár and K. Szepesi: Multilevel callus distraction: A novel idea to shorten the lengthening time, Medical Hypotheses, 2002, accepted.
R. C. Gonzalez and R. E. Woods: Digital Image Processing, Addison-Wesley, Reading, Massachusetts, 1992.
A. L. Marsan: Solid Model Construction from 3-D Images (PhD dissertation), The University of Michigan, 1999.
K. Radermacher, C. V. Pichler, S. Fischer and G. Rau: 3-D Visualization in Surgery, Helmholtz-Institute Aachen, 1998.
L. A. Ritter, M. A. Livin, R. B. Sader, H-F. B. Zeilhofer and E. A. Keeve: Fast Generation of 3-D Bone Models for Craniofacial Surgical Planning: An Interactive Approach, CARS/Springer, 2002.
M. Sonka, V. Hlavac and R. Boyle: Image Processing, Analysis, and Machine Vision, Brooks/Cole Publishing Company, Pacific Grove, California, 1999.
Tsai Ming-Dar, Shyan-Bin Jou and Ming-Shium Hsieh: An Orthopedic Virtual Reality Surgical Simulator, ICAT 2000.
Zoltán Zörgő, András Hajdu, Sándor Manó, Zoltán Csernátony and Szabolcs Molnár: Analysis of a new femur-lengthening surgery, IEEE IASTED International Conference on Biomechanics (BioMech 2003), Rhodes, Greece, 2003, pp. 34-38.

Web Links to More Information


http://graphics.stanford.edu/data/3-Dscanrep/
http://image.soongsil.ac.kr/software.html
http://medical.nema.org
http://www.ablesw.com/3-D-doctor/
http://wwwr.kanazawa-it.ac.jp/ael/imaging/synapse
http://www.materialise.com
http://www.nist.gov/iges

FEA in Micro-Robotics
Researchers use ANSYS to develop micron-sized, self-powered mobile mechanisms.
By Bruce Donald, Craig McGray, and Igor Paprotny of the Micro-Robotics Group, Computer Science Department, Dartmouth College; Daniela Rus, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology; and Chris Levey, Dartmouth Thayer School of Engineering

Mobile robots with dimensions in the millimeter to centimeter range have been developed, but the problem of constructing such systems at micron scales remains largely unsolved.

The anticipated applications for mobile micro-robots are numerous: manipulation of biological cells in fighting cancer, for example, or stealth surveillance technology in which clouds of flying micro-robots could monitor sites relatively undetected by sight or radar. Micrometer-sized robots could actively participate in the self-assembly of higher-order structures, linking to form complex assemblies analogous to biological systems. One could envision such self-assembly taking place inside a human body, growing prosthetic devices at their destination, for example, thus alleviating the need for intrusive surgery. Targeting these types of potential future micro-robotic applications, the Micro-Robotics Group at Dartmouth College has been developing a new class of untethered micro-actuators. Measuring less than 80 µm in length, these actuators are powered through a novel capacitive-coupled power delivery mechanism, allowing actuation without a physical connection to the power source. Finite element analysis using ANSYS allowed us to test the feasibility of the power delivery mechanism prior to actual fabrication of the device. The micro-actuators are designed to move in a stepwise manner utilizing the concept of scratch-drive actuation (SDA). The functionality of a scratch-drive

Figure 1. Concept behind scratch-drive actuation, which moves the micro-actuators in a stepwise manner. An electrical potential applied between the back-plate (1) and an underlying substrate (2) causes the back-plate to bend down, storing strain energy, while the edge of a bushing (3) is pushed forward. When the potential is removed from the back-plate, the strain energy is released and the backplate snaps back to its original shape, causing the actuator to move forward.

actuator is shown in Figure 1. The actuation cycle begins when an electrical potential is applied between the back-plate and an underlying substrate. The back-plate bends downward, storing strain energy, while the edge of a bushing is pushed forward. When the potential is removed, the strain energy is released and the back-plate snaps back to its original shape. The actuation cycle is now complete, and the actuator has taken a step forward. In contrast to traditional SDA power delivery schemes (such as rails or spring tethers), our designs induce the potential onto the back-plate using


a capacitive circuit formed between underlying interdigitated electrodes and the back-plate of the actuator. A circuit representation of the system, as shown in Figure 2, indicated that the back-plate potential should be approximately midway between the potentials of the underlying electrodes. We validated the power delivery concept for the specific geometry of our design by modeling the system through electrostatic analysis in ANSYS. Figure 3 shows the volume model of the actuator and the electrode field. The results of the analysis are shown in Figure 4, indicating the electrical potentials of the conductive elements in the model. Additionally, a cut through the air elements shows the electrical potential from the field propagating through them. The potentials of the electrodes in this example were set to 0 V (blue) and 100 V (red), which represented the model boundary conditions. The required potential of the back-plate was computed to be approximately 50 V, validating the circuit-model approximation. We also discovered that the potential of the back-plate changes only slightly as a function of the orientation of the drive in relation to the electrode field. This indicates that the actuator can be powered regardless of its orientation, as long as the device remains inside the electrode field. Additionally, we used the ANSYS model to visualize the intensity of the electric field propagating through the bottom layer of the insulation material, as shown in Figure 5. We suspect charging of the device due to charge migration in the direction of the field, with charges embedding in the insulating layer underneath the drive. We anticipate that these charges will cluster along the areas where the electric field is strongest. In future experiments, attempts will be made to image this pattern using a scanning electron microscope.
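The circuit-model prediction that the floating back-plate sits roughly midway between the electrode potentials follows from charge neutrality on an isolated conductor. A sketch with illustrative capacitance values (not the actual device parameters):

```python
def floating_plate_potential(v1, v2, c1, c2):
    """Potential of an isolated conductor capacitively coupled to two
    electrodes: charge neutrality gives c1*(V - v1) + c2*(V - v2) = 0,
    so V is the capacitance-weighted average of the electrode potentials."""
    return (c1 * v1 + c2 * v2) / (c1 + c2)

# Equal coupling capacitances with 0 V / 100 V electrodes -> ~50 V plate,
# matching the midway value found in the electrostatic FEA
v_plate = floating_plate_potential(0.0, 100.0, 1e-12, 1e-12)
```

When the two coupling capacitances are equal, the plate potential lands exactly at the midpoint; the FEA verified that this holds for the actual geometry to good approximation.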
Following the finite element analysis, we successfully fabricated and actuated an untethered scratch-drive actuator capable of motion at speeds of up to 1.5 mm/s, a good pace for such a tiny device. Our current work is focused on how to apply these actuators to create steerable autonomous micro-robotic systems. We anticipate further use of ANSYS to model the electrostatic and mechanical interaction of the system components to further shorten our development cycle. In particular, we plan to use the ANSYS coupled-physics solver to determine the snap-down and operational characteristics of our actuators.


Figure 2. Simplified capacitive-circuit representation of the system.

Figure 3. Volume model of the actuator and the electrode field, prior to solving the model in ANSYS.

Figure 4. Results of the electrostatic analysis, indicating the calculated potentials of the different model components after applying the boundary conditions.

Figure 5. Intensity of the electric field propagating through the bottom insulation layer of the actuator.

Software Profile


The New Face of ANSYS ICEM CFD


V5.0 represents a significant redesign for the market leader in mesh generation.
By Judd Kaiser, Technical Solution Specialist

The new user interface for ANSYS ICEM CFD brings important benefits to all users and has undergone extensive testing, with earlier releases of AI*Environment and ANSYS ICEM CFD 4.CFX utilizing essentially the same interface. The learning curve for new users can be dramatically shortened by way of an updated layout consisting of tabbed panels, a hierarchical model tree and intuitive icons. Existing users can look forward to enhanced meshing technology in a single unified environment for shell, tet, prism, hex and hybrid mesh generation. Performance improvement highlights for these users include hotkeys (which provide one-click access to the most commonly used functions), selection filters and support for the Spaceball 3-D motion controller.

Getting Geometry In

ANSYS ICEM CFD is well-known for its ability to get geometry from virtually any source: native CAD packages, IGES, ACIS or other formats. The package continues to be unique among mesh generators in its ability to use geometry in both CAD and faceted representations. Faceted geometry is commonly used for rapid prototyping (stereolithography, STL), reverse engineering (where the STL geometry comes from techniques such as digital photo scanning) and biomedical applications (where the geometry can come from techniques such as magnetic resonance imaging [MRI]).

One major development is that V5.0 is the first version of ANSYS ICEM CFD capable of running within the ANSYS Workbench environment. As the common platform for all ANSYS products, Workbench provides a common desktop for a wide range of CAE applications. With ANSYS Workbench V8.1 and ANSYS ICEM CFD V5.0 installed, ANSYS ICEM CFD meshing is exposed as the Advanced Meshing tab. Geometry can be transferred seamlessly from DesignModeler to ANSYS ICEM CFD.

ANSYS ICEM CFD remains the clear choice for meshing complex geometry. Shown is a tet/prism mesh for a race car wheel and suspension.

Fault-Tolerant Meshing

Having the geometry in hand doesn't do you any good if you can't create a mesh. Fault-tolerant meshing algorithms remain the heart of the ANSYS ICEM CFD meshing suite. Using an octree-based meshing algorithm, ANSYS ICEM CFD Tetra generates a volumetric mesh of tetrahedral elements that are projected to the underlying surface model. This methodology renders the mesh independent of the CAD surface patch structure, making the meshing algorithm highly fault-tolerant: sliver surfaces, small gaps and surface overlaps cause no problem. The mesh has the ability to walk over small details in the model. Control is in the hands of the user, who has the flexibility to define which geometric details are ignored and which are represented accurately by the mesh.

Tetra's computation speed has been improved in V5.0. As an example, a test model of 250,000 elements and moderate geometric complexity required 32% less CPU time during meshing compared with the previous version. The Delaunay tet meshing algorithm, added to the meshing suite in the previous version, has undergone numerous improvements, including support for density volumetric mesh controls.

For viscous CFD applications, tet meshes can be improved by adding a layer of prism elements for improved near-wall resolution of boundary layer

Prism before

Prism after

Images showing a cut through a hybrid hex/tet mesh of a wind tunnel/missile configuration before and after adding a layer of prism elements on the wind tunnel walls. Note that the prism layer is included for both the hex and tet zones (new feature in V5.0).

flows. ANSYS ICEM CFD Prism also has been improved for this release. Prism layers now can be grown from a surface mesh without the need for an attached volume tet mesh. Perhaps more significantly, prism layers now can be grown from both tri and quad elements. This means that it is now possible to grow a prism layer in a combined hybrid hex/tet mesh.
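The octree idea behind the Tetra mesher can be illustrated with a toy subdivision routine. This is a sketch under assumptions (a hypothetical refinement predicate, cubes only); the actual ICEM CFD algorithm additionally projects the resulting elements to the surface model.

```python
def octree_leaves(center, half, depth, needs_refinement):
    """Recursively split a cube into eight octants while the predicate
    requests refinement; returns the leaf cells as (center, half) pairs."""
    if depth == 0 or not needs_refinement(center, half):
        return [(center, half)]
    h = half / 2.0
    leaves = []
    for dx in (-h, h):
        for dy in (-h, h):
            for dz in (-h, h):
                child = (center[0] + dx, center[1] + dy, center[2] + dz)
                leaves += octree_leaves(child, h, depth - 1, needs_refinement)
    return leaves

# One level of uniform refinement yields eight octants
cells = octree_leaves((0.0, 0.0, 0.0), 1.0, 1, lambda c, h: True)
```

Because refinement is driven by a spatial predicate rather than by the CAD patch layout, small surface defects never stall the subdivision, which is the essence of the fault tolerance described above.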

Integrated Hex Meshing


ANSYS ICEM CFD Hexa remains a leader in producing high-quality, all-hex element meshes on geometries for which most competitors wouldn't even attempt a hex mesh. The key to the approach is a block structure that is generated independent of the underlying arrangement of CAD (or faceted) surfaces. Think of the block structure as an extremely coarse all-hex mesh that captures the basic shape of the domain. Each block is then a parametric space in which the mesh can be refined. For CFD meshes, the ability of this parametric space to be distorted to follow anisotropic physics makes it very efficient at capturing key features of the flow with the lowest possible element count. Dassault Systemes recognized the power and promise of this methodology, selecting ANSYS ICEM CFD technology as the only hex meshing solution to be offered integrated into CATIA V5. CAA V5-based ANSYS ICEM CFD Hexa offers hex meshing that maintains parametric associativity to the native CATIA Design Analysis Model. New in V5.0, Hexa has been fully integrated into the new user interface. Hex meshing functions are housed in the blocking tab, and block structure entities are organized on the blocking branch of the model tree. In addition to the reworked user interface, several operations have been significantly streamlined. New methods of creating grid blocks have been added. The process of grouping curves and defining edge-to-curve projections has been made more

efficient. Most operations now take advantage of multi-selection methods, such as box and polygon select. The addition of blocking hotkeys is a real time-saver, giving the user single-keystroke access to the most frequently used operations. For shell meshing, V5.0 offers unstructured 2-D blocks, combining the best of ANSYS ICEM CFD Hexa and the patch-based mesher formerly known as Quad. The creation of blocks for 2-D shell meshing has been automated, so that blocks can be created automatically for all selected surfaces.

Mesh Editing
ANSYS ICEM CFD offers maximum flexibility in its mesh editing tools, whether it's via global smoothing algorithms or techniques to repair or recreate individual problem elements. These tools provide one last place to work around any bottlenecks. Noteworthy are new unstructured hex mesh smoothing algorithms, which strive for mesh smoothness and near-wall orthogonality while preserving mesh spacing normal to the wall. Two new quality metrics have been added to help quantify mesh smoothness: adjacent cell volume ratio and opposite face area ratio.
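One plausible reading of the adjacent cell volume ratio metric (the larger over the smaller volume of two face-adjacent cells, so 1.0 is perfectly smooth) can be sketched as follows; this is an illustration of the concept, not ICEM CFD's internal definition.

```python
def worst_volume_ratio(neighbor_pairs, volumes):
    """Largest adjacent-cell volume ratio found in a mesh.

    neighbor_pairs: (i, j) index pairs of face-adjacent cells.
    volumes: cell volumes indexed by cell id.
    """
    worst = 1.0
    for a, b in neighbor_pairs:
        big = max(volumes[a], volumes[b])
        small = min(volumes[a], volumes[b])
        worst = max(worst, big / small)
    return worst

# Three cells in a row; the 1.0 -> 2.0 volume jump dominates the metric
worst = worst_volume_ratio([(0, 1), (1, 2)], [1.0, 2.0, 2.0])
```

A smoother that drives this metric toward 1.0 produces the gradual volume transitions that CFD solvers favor.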

Scripting Tools
ANSYS ICEM CFD provides a powerful suite of tools for geometry creation, model diagnosis and repair, meshing and mesh editing. All of these tools are exposed at the command-line level, providing a formidable toolbox for the development of vertical applications. Every operation performed can be stored in a script for replay on model variants. This power can be extended by using the Tcl/Tk scripting language, enabling the development of entire applications. These tools enable users to work around virtually any geometry or meshing bottleneck, getting the mesh you need using the geometry you have.


CFD Update: What's New in Computational Fluid Dynamics


Blood Flow Analysis Improves Stent-Grafts


Coupled ANSYS and CFX fluid structure simulations help researchers develop optimal surgical recommendations, improved stent designs and proper stent placement.
By Dr. Clement Kleinstreuer, Professor and Director of the Computational Fluid-Particle Dynamics Laboratory and Zhonghua Li, Doctoral Student, Biomechanical Engineering Research Group, North Carolina State University One of the more intriguing challenges in modern medicine is the repair of abdominal aortic aneurysms using stent-grafts: tubular wire mesh stents interwoven with a synthetic graft material. The device is guided into place through a small incision in the groin and then propped open in the aorta, thus reinforcing the damaged area of the artery. For reasons that were not well understood until recently, however, some stent-grafts move out of place. This migration may again expose the weakened aortic wall to relatively high blood pressure, potentially leading to sudden aneurysm rupture and death. Developing an understanding of stent-graft migration and finding suitable solutions is our current work at the Biomechanical Engineering Research Group (BERG) of North Carolina State University in Raleigh. We are using a pairing of computational fluid dynamics (CFD) interactively coupled with computational structure analysis. Using coupled CFX and ANSYS Structural models in these fluid structure interactions (FSI), we are learning what goes on inside the aorta before and after a stent-graft is surgically inserted, and how the stent-graft might migrate or dislodge. Most studies assume that artery walls are stiff with regard to the pressure changes that come with each heartbeat, and that arterial wall thicknesses are constant both axially and circumferentially. Neither is usually true, especially for older patients with hypertension, a group that suffers most from aneurysms.

LEFT: Representation of a cross-section of an abdominal aortic aneurysm (AAA) with a bifurcating stent-graft. RIGHT: Representation of an aortic artery aneurysm (bulge on left) between the renal artery (to the kidneys, top) and the iliac bifurcation (to the legs). Aside from the color shading chosen, this is what the surgeon would see before starting to implant the stent-graft.

Studying Stent Migration


The stent migration problem in abdominal aortic aneurysm (AAA) repairs is critical to the patient's survival. When the stented graft slides out of place axially, the weakened or diseased artery wall is re-exposed to the high pressure of pulsating blood flow. That greatly increases the possibility of AAA rupture, which is usually fatal. Easily overlooked, aortic aneurysms are the 13th leading cause of death in the United States.

Wall displacements and pressure/stress levels for Re(steady) = 1200, using CFX and ANSYS: (left) axisymmetric AAA, and (right) stented AAA, where the stent-graft clearly shields the weakened aneurysm wall from the blood flow.

Schematic representation of an axisymmetric AAA, including implanted stent-graft with relevant analytical data.


Using five case histories, CFX and ANSYS Structural were used to compute the incipient migration forces of a stented graft under different placement conditions. In the process, we modeled different artery neck configurations, variable arterial wall thicknesses, transient hemodynamics and multi-structure interactions. The actual stented AAA model in ANSYS consisted of a lumen or bulge in the artery wall, an endovascular graft shell, a cavity of stagnant blood and the AAA wall. Using iterative fluid structure interaction was an intense computational problem, as ANSYS Structural and CFX exchanged coupled variations in wall flex and geometry, requiring several new flow and structure results at each time step. The ANSYS Structural problem centered on nonlinear, large-deformation, contact and dynamic analyses.
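The iterative exchange described above can be caricatured as a fixed-point loop. This toy 1-D model is purely illustrative (all coefficients are invented, and the real coupling exchanges full fields, not scalars), but it shows why the solvers must iterate within each time step.

```python
def coupled_fsi(p0, k_wall, alpha, tol=1e-10, max_iter=200):
    """Toy fixed-point iteration mimicking a two-way CFD/structural
    exchange: the 'fluid' step returns a pressure that drops as the
    wall bulges, the 'structural' step returns a displacement from a
    linear wall compliance, and the loop repeats to convergence."""
    d = 0.0
    for _ in range(max_iter):
        p = p0 * (1.0 - alpha * d)   # fluid update: pressure falls with bulge
        d_new = p / k_wall           # structural update: d = p / k
        if abs(d_new - d) < tol:
            return d_new
        d = d_new
    return d

# Converges to the analytic fixed point p0 / (k_wall + alpha * p0)
d_star = coupled_fsi(1.0e4, 1.0e6, 10.0)
```

Neither single-field solve is correct on its own; only the converged pair of pressure and displacement satisfies both the fluid and the structural equations simultaneously.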

Insight into Physical Processes


The CFX post-processor in conjunction with our programs gives us a great deal of insight into the physical processes. It helps us to spot critical areas where platelets or low-density lipoproteins (LDLs) may clump together, and, ultimately, it helps us with design optimization of stent-grafts and secure stent-graft placements. The coupled CFX and ANSYS results were validated with experimental data sets and with clinical observations. Surgeons and scientists know that forces triggering stented graft migration include blood momentum changes, blood pressure and artery wall

shear stress; inappropriate configurations of the healthy aortic neck section; tissue problems in the aortic neck segment; and biomechanical degradation of the prosthetic material. To set the model stent-graft into motion, an increasing pull force was applied with an APDL subroutine. Coulomb's law was used for each contact element's friction coefficient, but the simulations revealed a nonlinear correlation, at large displacements, between the migration force needed to move the stent and the friction coefficients. The simulation also revealed that the risk of displacement rises sharply in patients with high blood pressure. Coupled ANSYS and CFX fluid structure simulations verified that a stent-graft can significantly reduce the risk of an aneurysm rupture even when high blood pressure is the fundamental cause. Clearly, these tools for blood-flow-stent-artery interactions are valid, predictive and powerful for optimal surgical recommendations, improved stent designs and proper stent placement.
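The Coulomb friction criterion used at the contact elements amounts to a simple threshold check for incipient sliding. The numbers below are hypothetical; as the text notes, the full FE model shows a nonlinear relation at large displacements that this one-liner cannot reproduce.

```python
def migrates(axial_load_n, normal_force_n, mu):
    """Coulomb sliding check: incipient migration occurs once the
    axial hemodynamic load exceeds mu * N at the graft/wall contact.
    Illustrative only; not the full nonlinear contact model."""
    return axial_load_n > mu * normal_force_n

# Hypothetical numbers: 5 N axial drag vs. 10 N contact force, mu = 0.4,
# so the sliding threshold mu * N is 4 N and migration begins
starts_sliding = migrates(5.0, 10.0, 0.4)
```

Higher blood pressure raises the axial load term, which is why the simulations showed the displacement risk rising sharply in hypertensive patients.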

For this study, CFX-4 was linked to ANSYS with Fortran to perform fluid-structure interaction. Presently, generalized, fully representative stented abdominal aortic aneurysm configurations are being analyzed, employing ANSYS and CFX-5.

Cardiac cycle (time level of interest: t/T=0.32, Re=550)

Velocity distribution in non-stented axisymmetric AAA model


www.ansys.com

Wall stress and velocity distribution in stented axisymmetric AAA model


ANSYS Solutions

Summer 2004


Simulation Helps Improve Oil Refinery Operations


Analysis assists in reducing coke deposits while improving hydrocarbon stripping.
By Dr. Peter Witt, Research Scientist, CSIRO Minerals

During oil processing, heavier products are broken down by high temperatures into lighter products in cokers. This cracking process strips off lighter liquid hydrocarbon products such as naphtha and gas oils, leaving heavier coke behind. The challenge that CSIRO Minerals has been helping Syncrude resolve is how best to reduce the coke deposits that build up in their fluid coker stripper while maintaining or improving hydrocarbon stripping.
Syncrude Canada Ltd. is the world's largest producer of crude oil from oil sands and the largest single-source producer in Canada. CSIRO (Australia's Commonwealth Scientific and Industrial Research Organisation) is one of the world's largest and most diverse scientific global research organizations. CSIRO Minerals is a long-time user of CFX and, in collaboration with the Clean Power from Lignite CRC, developed the fluidized bed model in CFX-4. Because of its robust multiphase capability and its ability to be extended into new application areas, CFX is used extensively by CSIRO Minerals in undertaking complex CFD modeling of multiphase, combustion and reacting processes in the mineral processing, chemical and petrochemical industries. In the past, physical modeling had been used to understand the flow of solids and gas in the stripper. This modeling is performed at ambient conditions, so scaling of both the physical size and the materials is required to approximate the actual high temperature and pressure in the stripper. This scaling process can introduce some uncertainty in understanding the actual stripper operation.

Maintenance work on a coker unit at Syncrudes oil sands plant in Alberta, Canada.

0.0 secs

By using CFD modeling to complement the physical modeling programs, scaling is eliminated and the actual dimensions and operating conditions are used. Furthermore, CFX simulation provides much greater detail of the flows and forces in the stripper than can be obtained from physical models or from the plant, due to the difficulty of making measurements and visualizing the flow in complex multiphase systems. Syncrude senior research associates Dr. Larry Hackman and Mr. Craig McKnight explain that extensive cold flow modeling (but not CFD modeling) had previously been used to investigate the operation of the fluid bed coker stripper and the gas and solids behavior in the unit. McKnight notes that this project with CSIRO Minerals resulted in detailed, high-quality reports, which provide a new understanding of the fluid coker stripper operation. Hackman indicated, "By using CFX to gain a better understanding, it is anticipated that design changes will be identified to improve stripping efficiency, reduce shed fouling and optimize stripper operation." To most efficiently perform the simulations and utilize the results, the two companies are leveraging the distance separating their facilities. When it is night in Edmonton, Alberta, Canada, where Syncrude Research is located, CSIRO Minerals staff is hard at work in Australia performing analyses and posting results (including pictures and animations) on their extranet. The next morning, the group in Canada can view the progress of the modeling work and provide feedback for a quick turnaround. In this way, CSIRO is utilizing CFX technology to assist Syncrude in determining how best to utilize their current plant to get maximum throughput and thus make the most of their capital investment.
5.0 secs


9.0 secs

13.0 secs

16.5 secs

Three-dimensional fluidized bed model of the Syncrude fluid coker stripper. The model predicts the motion of bubbles (in purple) rising from injectors in the lower part of the bed and the complex flow behavior of coke particles. Flow simulations provide insights into the stripper operation, which are then used to improve the design. (Color scale: gas volume fraction, 0.45 to 0.75.)

20 secs



CFX-5.7 Brings Powerful Integrated Tools to Engineering Design


Latest release enhances core CFD features and gives users greater access to ANSYS tools.
By Michael Raw, Vice President, Product Development, ANSYS Fluids Business

Released in April 2004, CFX-5.7 demonstrates the continuing development of core CFD technologies and leverages ANSYS technologies to provide an exciting new series of capabilities for CFX users. This latest version contains the most advanced CFD features available, representing a powerful combination of proven, leading-edge technologies that provide the accuracy, reliability, speed and flexibility companies trust in their demanding fluid simulation applications.

ANSYS Integration

CFX customers are now gaining access to state-of-the-art geometry modeling software with ANSYS DesignModeler, a Workbench-based product that is our new geometry creation tool providing bi-directional associative CAD interfaces to all major CAD packages. The CFX-5 mesher, called CFX-Mesh and based on advancing front inflation tetra/prism meshing technology, has been implemented in Workbench as a native GUI application that is easy to use and closely integrated with DesignModeler. ANSYS ICEM CFD meshers, including the unique hexahedral element meshing tools, also are now available in Workbench. They provide meshes for the most demanding CFD applications and are well known for their robustness when applied to very large or complex industrial CAD models. The combination of ANSYS DesignModeler, CFX-Mesh and ICEM

CFX data can be interpolated directly onto ANSYS CDB files, providing a flexible route to transfer CFX results to an existing ANSYS mesh.


It is now possible to perform texture mapping in CFX-Post

CFD meshing technology provides a comprehensive CAD-to-meshing solution for CFD applications. As meshing is often the most time-consuming stage of CFD simulation, this represents a genuine time-saving benefit to ANSYS CFX users.

The latest release introduces a fluid structure interaction (FSI) capability for cases in which the interaction between a fluid and the surrounding solid is important, such as fluid-induced stresses and heat transfer. A simple-to-use, one-way transfer of data from a CFX solution to ANSYS provides for seamless passing of thermal and load information from fluids to structural analysis. This approach automatically interpolates the data into the ANSYS CDB file format. For more complex FSI situations, such as large-scale solid deformation or motions in which the two-way influences are important, CFX-5 can dynamically interact with ANSYS stress analysis. ANSYS Inc. has the unique distinction of offering the industry's only native connection between such components, which means ease of use, flexibility and reliability.
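The one-way interpolation step can be pictured with a small sketch. This is a deliberately simple nearest-neighbor mapping with invented node coordinates and temperatures, not the interpolation scheme ANSYS actually uses:

```python
# Hedged sketch of one-way FSI data transfer: map a nodal field from a CFD
# mesh onto a structural mesh by giving each target node the value of its
# nearest source node. All node data here are invented for illustration.
import math

def interpolate_nearest(source_nodes, source_values, target_nodes):
    """Assign each target node the field value of the closest source node."""
    result = []
    for tx, ty, tz in target_nodes:
        best_i, best_d = 0, float("inf")
        for i, (sx, sy, sz) in enumerate(source_nodes):
            d = math.dist((tx, ty, tz), (sx, sy, sz))
            if d < best_d:
                best_i, best_d = i, d
        result.append(source_values[best_i])
    return result

# CFD mesh: four nodes with temperatures (K)
cfd_nodes = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
cfd_temps = [300.0, 320.0, 310.0, 340.0]
# Structural mesh: two nodes at different locations
fea_nodes = [(0.1, 0.1, 0), (0.9, 0.9, 0)]
print(interpolate_nearest(cfd_nodes, cfd_temps, fea_nodes))  # [300.0, 340.0]
```

A production mapping would use shape-function interpolation rather than nearest neighbors, but the bookkeeping (walk the target nodes, look up the source field) is the same.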


Reacting particles are a feature of this release, including a fully featured coal combustion model.

Core CFD Enhancements


As our flagship CFD simulation product, CFX-5 has been significantly enhanced in several modeling areas, including moving mesh capability, general grid interface support of conjugate heat transfer, advanced turbulence models and multiphase models, as well as pre- and post-processing improvements. These enhancements are briefly described in the sections that follow.

When fluid flow simulations involve changing geometry (as in devices such as valves, pistons or gear pumps), CFX-5 moving mesh options can be used. Several mesh movement strategies are available: prescribed surface movement with automatic mesh morphing, explicit 3-D mesh movement via user functions or multiple mesh files, remeshing with topology change, and combinations of these strategies. Together they cover almost every conceivable mesh movement need.

The Generalized Grid Interface (GGI) is now supported for conjugate heat transfer (CHT), allowing the solid and fluid (or two solids) to be created and meshed separately and set up to reflect the needs of the physics in each zone independently. The heat transfer and/or radiation between the two objects can then be analyzed by connecting the two domains using GGI.

CFX-5.7 adds two innovative turbulence models to its already comprehensive suite for turbulence analysis. Turbulence is present in most industrial flows, and accounting for this phenomenon appropriately can make a great difference in the accuracy of a simulation. CFX-5.7 introduces the Transition turbulence model, which helps to accurately predict laminar-to-turbulent transition, often key for heat transfer prediction, e.g. in turbine blades. In addition, the detached eddy simulation (DES) transient turbulence model has been completed. Unique to CFX-5, this model combines the efficiency of a Reynolds-averaged Navier-Stokes (RANS) simulation in attached boundary layer regions with the ability to compute large transient eddy structures using LES.
Current multiphase capabilities have been extended, and an algebraic slip model (ASM) has been added. Using an algebraic approximation for the dispersed phase velocities, ASM is highly efficient even for a large number of dispersed phase size groups. A first implementation of the Multiple Size Group model (MUSIG), accounting for a wide spectrum of particle sizes and shapes at every point in dispersed two-phase flows, has been added, with access through the CFX Command Language (CCL).

The material properties editor in CFX-Pre, the CFX-5 physics pre-processor, allows users to select materials from groups to ensure proper interactions with their physical model set-up. This helps to avoid errors in selecting or combining materials. A multi-component, customizable particle model is now available as part of the Lagrangian particle tracking capability. Reacting particle features include evaporating, boiling and oil droplet models, as well as a fully featured coal combustion model. These models are often key to accurate simulations in the power generation industry.

Improvements in CFX-Post include the ability to easily compare results between different simulations or time steps, CGNS data support, surface streamlines, text labels that update automatically, surfaces of revolution, display of particle subsets in particle tracking and texture mapping.

Development is now focused on future releases, which will provide a closer interface with the ANSYS Workbench Environment, including more enhancements to fluid-structure interaction capabilities as well as continued investment in core CFD technology.



How many holes do we need to dig? Construction costs can exceed $1 million for a new 22m diameter tank at a water treatment plant.

Improving Water Treatment Systems


Engineers design compact, more efficient secondary clarifiers with the aid of CFX.
By David J. Burt, Senior Engineer, MMI Engineering

A secondary clarifier is the final treatment stage of a traditional activated sludge sewage works. It separates solid precipitate material from effluent water prior to discharge. Because of recent changes in environmental legislation, many treatment works in the UK are required to carry increased throughput or meet more stringent effluent quality limits. This means that more clarification capacity is needed. But with land in urban areas scarce and construction costs high, there is an increasing need to maximize the performance of existing units rather than build new ones.

The standard technique for designing a final clarifier is mass flux theory. However, this method uses a one-dimensional settling model and cannot account for the density current flow typical in a clarifier. Even if the external design of the clarifier satisfies mass flux theory, it may still fail or perform badly in practice because of internal flow features. Designers are often forced to allow for a 20% factor of safety in tank surface area to compensate for the shortcomings of mass flux theory.

With CFD modeling, it is possible to capture all of the flow processes to show short-circuiting, scouring of the sludge blanket and solids re-entrainment to effluent. This means it is possible to design more compact units or retrofit existing units with internal baffling to allow for higher loading. By augmenting the standard drift flux models in CFX, engineers at MMI have established a set of validated and verified models for clarifier performance. These models include settling algorithms and rheological functions for activated sludge mixtures. The models have recently been used at a number of UK sites to optimize final effluent quality for increased load.
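For contrast with the CFD approach, the classical 1-D mass flux calculation can be sketched in a few lines. The Vesilind settling law and every parameter value below are invented for illustration; they are not MMI's calibrated models:

```python
# Illustrative sketch of 1-D mass flux theory, the conventional clarifier
# design method the article contrasts with CFD. Settling velocity follows a
# Vesilind law v_s = v0 * exp(-k * C); parameter values are invented.
import math

def settling_velocity(c, v0=6.0, k=0.4):
    """Vesilind hindered-settling velocity (m/h) at concentration c (g/l)."""
    return v0 * math.exp(-k * c)

def solids_flux(c, underflow_velocity=0.5):
    """Total downward solids flux, kg/(m2 h): gravity settling plus the
    bulk flux from the underflow withdrawal velocity (m/h)."""
    return c * settling_velocity(c) + c * underflow_velocity

# The limiting (minimum) flux over the operating concentration range
# governs the tank surface area required by the classical method.
concs = [c / 10 for c in range(20, 121)]  # 2 .. 12 g/l
limiting = min(solids_flux(c) for c in concs)
print(f"limiting flux = {limiting:.2f} kg/(m2 h)")
```

The one-dimensional assumption is exactly what this sketch makes visible: the whole tank is reduced to one flux curve, with no way to represent density currents or short-circuiting.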
MMI Engineering is a wholly owned subsidiary of GeoSyntec Consultants and provides a range of environmental, geotechnical, hydrological and civil engineering services. Further details can be found at www.mmiengineering.com and www.geosyntec.com.

Concentration profiles through a cross section of the clarifier, approaching 8000 mg/l solids in the blanket. This tank features an energy dissipating inlet (EDI), an optimized stilling well diameter and additional Stamford baffling below the effluent weir.

A useful post-processing idea is to track streamlines for the solid phase velocity field, in this case colored with the G scalar to show where floc may experience the greatest shear.


Managing CAE Processes

Upfront Analysis
in the

Global Enterprise
Early simulation is especially important when engineers at dispersed locations must collaborate in product development.

By Fereydoon Dadkhah, Mechanical Analysis and Simulation, Delphi Electronics and Safety

Two important side effects of the continuing pressure to reduce product development time and costs have been the increased use of analysis in the early stages of design and the development and manufacturing of many products at overseas sites. Upfront analysis has been identified by many companies as a critical stage of product development because of the many benefits it provides. Done properly, upfront analysis can shorten the design cycle of a product drastically by identifying problems before substantial time and material have been invested in the product. In the earlier stages of design, engineers have more options at their disposal when changing a design to address problems uncovered by analysis. As a product's design approaches completion, many design modification options are eliminated for a variety of reasons, such as manufacturability, cost, system integration and packaging. Therefore, problems that are discovered later in the process are generally more expensive to fix. Once a problem is discovered using upfront analysis, all the viable design options can also be evaluated by employing the same analysis techniques. As a result, when a prototype is finally built and tested, it is much more likely to pass the tests than if upfront analysis had not been used.

Another fact of today's global economic environment is that many companies have moved beyond establishing manufacturing-only facilities overseas to performing some of their product development activities at overseas locations as well. This global footprint can lead to situations where a product is conceived and its performance requirements specified in country A, it is then designed and tested in country B, and it is mass produced in country C. Development centers therefore have to be flexible enough to respond to the needs of their local market as well as be able to develop products for different, distant markets. Once again, shortened design schedules make the use of CAE mandatory, especially in the early stages. Because of the distributed product development process, it is important that all the engineers and designers use the same processes and techniques. Using analysis as an integrated part of product development enables engineers from around the world to collaborate in unprecedented ways.

Many of Delphi Corporation's customers are global companies that market and sell their products around the world. It is therefore important for all of Delphi's resources to be used to satisfy our customers' needs regardless of where the need arises. Recent programs at Delphi Electronics and Safety (formerly Delphi Delco Electronics Systems) have involved just such a scenario. Engineers from three different countries have been involved in the design process from the moment contracts are awarded. Even while some of the system features are being finalized, the resources of the company around the world are mobilized to analyze and evaluate the component designs. Finite element analysis is used extensively to evaluate component performance. In many cases the early analysis indicates that modifications are necessary. The modifications are made and assessed until all problems are eliminated. Engineers responsible for making design modifications can use local resources as well as those abroad to ensure the viability of their design. For example, many engineers at Delphi Electronics and Safety's design centers around the world have been trained to use first-order analysis tools. These engineers are usually able to use analysis to eliminate many design flaws. However, they often need help in completing the picture, either because of a shortage of time and other resources or because they lack the specialty skills that are available at other sites. Finally, one of the most important reasons for performing upfront CAE is simply that many of our customers require it. In many cases, customers have developed extensive validation requirements that use simulation extensively in the concept approval phase.


Simulation at Work



Analysis of Artificial Knee Joints


ANSYS provides fast, accurate feedback on new orthopedic implant designs.

Founded in 1895, DePuy is the oldest manufacturer of orthopedic implants in the United States, with a reputation for innovation in new product development. The company has patented a wide range of replacement knee systems, the first of which was developed more than 20 years ago. One of these incorporates a state-of-the-art mobile bearing, which offers a wide range of options to allow the surgeon to match the implant to the patient's anatomy. Figure 1 illustrates a typical replacement knee. In one recent application, two sizes of a replacement knee design were analyzed at different angles of articulation using ANSYS. Initially, finite element results were compared with known experimental measurements obtained on one of the two sizes at three angles of articulation. Once correlation had been achieved, the same methodology was used to analyze the other design at various angles.

Meshing Critical Components


The replacement knee design is composed of two components: the femoral component and the bearing. Figure 2 shows the solid geometry of the design in ANSYS after importation of the CAD model in Parasolid format.

Both the femoral component and the bearing were meshed with 3-D higher-order tetrahedral elements, and the meshing of the two parts was fully parameterized. The mesh on the underside of the femoral component was made sufficiently fine to ensure minimal loss of accuracy in the geometry of the curved contact surfaces. A coarser mesh was used in the interior and on the upper side of the femoral component, since its material was significantly stiffer than that of the bearing and, consequently, very little structural deformation was expected. (Another option was to mesh the contact surfaces of the femoral component with rigid target elements, with the load applied to a pilot node.) A similar approach was used for the bearing, as element size was more critical in the contact region than on the non-contacting surfaces. However, a mesh density even finer than that on the contact surfaces of the femoral component was desirable in the bearing to ensure good resolution of the contact area and stresses. An indiscriminate refinement of the mesh on all the upper surfaces of the bearing proved computationally too expensive, so a new meshing procedure was developed and tested by IDAC, a finite element analysis and computer-aided engineering consulting firm and the leading UK provider of ANSYS and DesignSpace software.


Figure 1. One of DePuy's knee implants incorporates a mobile bearing that offers a wide range of options to allow the surgeon to match the implant to a patient's anatomy.

Figure 2. Solid geometry of orthopedic knee design in ANSYS after importation of the CAD model in Parasolid format.

Figure 3. Analysis shows stress distribution in contact area between the bearing and the femoral component.

Running the Analysis


A preliminary contact analysis was first run with the original mesh density prescribed for the bearing; the elements in contact with the femoral component were then further refined for the subsequent solution. An example of this mesh is depicted in Figure 3. The image illustrates the stress distribution in the contact area between the bearing and the femoral component. These stress distribution plots can be created in the ANSYS program for any point in time during the nonlinear solution. It was found that excessive geometric penetration at setup produced stress singularities and that, therefore, the contact pair should be checked prior to solution. Localized peak contact stresses could also be produced by the discretization of the otherwise smooth contact surfaces. The mesh refinement level for the elements in the vicinity of contact may be increased after the preliminary contact analysis, but at the expense of a longer solution time. Apart from contact stresses, the total contact area was also an important aspect of the design being studied. The total contact area was obtained by summing the areas of all contact elements showing partial or full contact. This generally leads to an overestimation of the actual contact area, although the error was considered insignificant given the high mesh density in the contact area. All analysis work described in this project was performed on Intel-based personal computers running the ANSYS program. DePuy are users of ANSYS, and the parametric models created by IDAC have been supplied to DePuy for their engineers to perform further analyses and modifications in-house.
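The contact-area bookkeeping described above can be sketched in a few lines. This is an illustrative stand-in, not IDAC's macro; the element areas and the status coding are invented for demonstration:

```python
def estimate_contact_area(elements):
    """Sum the areas of all contact elements reporting partial or full
    contact. elements: list of (area, status) tuples, where status
    0 = open, 1 = partial contact, 2 = full contact (invented coding)."""
    return sum(area for area, status in elements if status > 0)

# Five elements of 0.5 mm^2 each; three are in at least partial contact
elems = [(0.5, 2), (0.5, 2), (0.5, 1), (0.5, 0), (0.5, 0)]
print(estimate_contact_area(elems))  # 1.5
```

Counting a partially contacting element at its full area is exactly what produces the slight overestimate noted in the text, and why it shrinks as the contact-region mesh is refined.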

Benefits: Speed and Accuracy


"Following on from this study, and working with IDAC, a number of our own engineers have been able to do further comparisons of a new design against an existing product in various loading conditions," says James Brooks, a senior mechanical design engineer at DePuy. "This has rapidly allowed us to get a good indication of the performance of the product before testing."

Fiona Haig, a mechanical designer at DePuy, reports additional benefits of the analysis solution. "IDAC's macro allowed us to quickly and consistently replicate physical testing that would normally have taken weeks to undertake in our labs. In addition, it permitted us to gain detailed information on stress and deflection, which can be difficult to detect in physical tests. The macro has proved an invaluable tool in the comparison and validation of new implant designs, as well as proving a highly effective learning aid for our core team of FEA users," explains Haig. "The results achieved using IDAC's analysis method closely correlated with the results of those physical tests previously undertaken in our labs. This validation has allowed us to extend the application of this methodology to the evaluation of a range of new implant designs, providing feedback accurately and in a short time frame."


More Design Insight,



Faster...
Quickly study the design impact of varying geometry, even without a parametric CAD model.
By Pierre Thieffry, ParaMesh and Variational Technology Solutions Specialist, and Raymond Browell, Product Manager, ANSYS, Inc.

The combination of the ANSYS Workbench Environment and DesignXplorer VT provides ANSYS users with powerful tools for gaining significant insight into designs when working with a CAD system. Bi-directional parametric associativity with the parent CAD package, made possible by the ANSYS Workbench Environment, makes understanding the design impact of varying geometry easy and comprehensive. But what if this is an old design and the user cannot find the geometry files for the part? Or perhaps you have the geometry, but it is in a non-associative format such as IGES or Parasolid. Maybe the geometry is parametric and regenerates robustly, but the parameters created by the designer are not the ones that make sense for the analyst. For instance, the parameter definitions might be chained together so that it is impossible to vary one feature without changing others. Perhaps a consultant provided the user with only the FEA or math model, and the original geometry used to create the model isn't available or would take too much time to recreate. This is quite often the case with legacy models.

Typically, this would be the end of the story. But with the combination of ParaMesh and DesignXplorer VT, it is just the beginning. Take an inside look at how these tools can be used to study a legacy model like the engine torsional damper model shown in Figure 1. To optimize this engine damper, perform the following six-step procedure:

1. From an existing ANSYS database, write a .cdb file.
2. Import the .cdb file into ParaMesh.
3. Create the mesh morphing parameters within ANSYS ParaMesh. (Note the names of the parameters and their order of creation; this information will be needed in the following steps.)
4. Declare the mesh morphing parameters by editing the ANSYS input file.
5. Perform the parametric solution using ANSYS DesignXplorer VT.
6. Post-process with Solution Viewer, the DesignXplorer VT post-processor within the ANSYS Environment, and, if desired, optimize the results.

Figure 1. Sample model of engine torsional damper.


www.ansys.com

Figure 2. Parameter definitions used for mesh morphing.


ANSYS Solutions

Summer 2004

Even for Legacy Models


Creating an ANSYS .cdb file is easy for ANSYS users, and importing the model into ParaMesh is no different than reading a file into any package, so skip to Step 3. Figure 2 shows the parameter definitions created in Step 3. The first parameter adjusts the inner diameter of the structure by moving the inner surface nodes; it is named inner_diam and has a range of variation of -0.5 mm to +2 mm, with an initial wall thickness of 2 mm. The second parameter adjusts the width of the slotted hole; it is named hole_diam and has a range of -2 to +2 mm, with an initial value of 4 mm. The third parameter is the radial location of the slotted hole; it is named hole_position and has a range of -3 to +3 mm, with an initial value of 45 mm.
Figure 3. Extremes in part geometry obtained by varying the mesh morphing parameters (minimum and maximum values of hole_position, angle_position, inner_diam and hole_diam).

The fourth parameter is the location of the start of the bevel angle bend; it is named angle_position and has a range of -6 to +2 mm, with an initial value of about 20 mm. Comparing the images in Figure 3 indicates the extremes of the part. The boundary conditions applied are: symmetry boundary conditions on the planar faces, clamping of the structure on the central hole and an inward radial pressure applied on the external surface. For Step 4, edit the ANSYS input file (see Figure 4) and declare the ParaMesh mesh morphing parameters so that DesignXplorer VT will know to solve for them. As seen in the sample file, the parameter definitions are straightforward. To access the ParaMesh parameters from within the ANSYS DesignXplorer VT solution, use the SXGEOM command.



/SX
SXMETH,,AUTO
! Define the output results
SXRSLT,disp,NODE,U,ALL,,
SXRSLT,sigma,ELEM,S,ALL,,
SXRSLT,mass,ELEM,MASS,ALL,,
! Define the file where the parameters have been created
SXRFIL,tor_spring,rsx
! Declare the shape parameters
SXGEOM,inner_diam
SXGEOM,hole_diam
SXGEOM,angle_position
SXGEOM,hole_position
FINISH
/SOLU
! Prepare for a DXVT solution
STAOP,SX
Figure 4. Sample section of the ANSYS input file.

Starting with the sensitivity histogram shown in Figure 5, the sensitivity of maximum stress with respect to each of the mesh morphing parameters is evident. These values are interpreted such that, for a change from the minimum value to the maximum value of the parameter hole_position, the maximum stress increases by 103 MPa. For the hole_diam parameter, the maximum stress decreases as the parameter increases.
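Reading a sensitivity bar as a min-to-max change can be illustrated with a toy response. The linear stress function below is invented (its slope is chosen so the hole_position sweep reproduces a roughly 103 MPa rise); it is not the damper's actual response:

```python
# Toy illustration of a min-to-max sensitivity. The stress response is an
# invented linear function of two parameters, not the real response surface.
def max_stress(hole_position, hole_diam):
    """Invented stress response in MPa."""
    return 250.0 + 17.2 * hole_position - 5.0 * hole_diam

def min_to_max_sensitivity(f, lo, hi, fixed):
    """Change in f as one parameter sweeps from lo to hi, others fixed."""
    return f(hi, fixed) - f(lo, fixed)

# hole_position ranges over [-3, +3] mm: 17.2 MPa/mm * 6 mm = 103.2 MPa
print(min_to_max_sensitivity(max_stress, -3.0, 3.0, 4.0))
```

For a linear response this is exact; for a nonlinear one, a single min-to-max number is only a summary, which is why the design curves later in the article matter.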

Figure 5. Sensitivity diagram.

This brings us to Step 5: solving the model with the mesh morphing parameters using DesignXplorer VT. DesignXplorer VT uses a new and exclusive technique called Variational Technology. In a traditional finite element analysis, each change in the value of any input variable requires a new finite element analysis. To perform a what-if study in which several input variables are varied over a certain range, a considerable number of finite element analyses may be required to satisfactorily evaluate the finite element results over the range of the input variables. With other methods, each design candidate requires a complete re-mesh and re-solve. The benefit of Variational Technology is that only one solution is required to make the same type of forecast that other methods provide. The response surface created by DesignXplorer VT is an explicit approximation function of the finite element results, expressed as a function of all selected input variables. Variational Technology provides more accurate results, faster. Now post-process the analysis using the Solution Viewer, the DesignXplorer VT post-processor within the ANSYS Environment.
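The practical consequence of an explicit response surface can be sketched as follows. The quadratic approximation below is invented for illustration; the point is that evaluating thousands of design candidates costs only function calls, not re-meshes and re-solves:

```python
# Hedged sketch of the response-surface idea: once an explicit approximation
# of the FE result exists, each new design point is a cheap function call.
# The coefficients here are invented, not produced by Variational Technology.
def stress_response(p):
    """Invented explicit approximation: stress (MPa) as a function of the
    two morphing parameters (hole_position, hole_diam), both in mm."""
    hp, hd = p
    return 260.0 + 15.0 * hp - 4.0 * hd + 1.2 * hp * hp

# A what-if sweep over a few thousand design candidates:
candidates = [(hp / 10, hd / 10) for hp in range(-30, 31)
                                 for hd in range(-20, 21)]
stresses = [stress_response(p) for p in candidates]
print(len(candidates), min(stresses))
```

A traditional approach would pay for a full FE solve per candidate; here the whole sweep is effectively free once the one underlying solution exists.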

Figure 6. Histogram showing sensitivity of stress, displacement and mass with respect to morphing parameters.

DesignXplorer VT's Solution Viewer allows the user to view the sensitivity of multiple results to the input parameters. In the histogram shown in Figure 6, we see the sensitivity of the maximum von Mises stress, the maximum displacement and the model mass with respect to all of the mesh morphing parameters. These sensitivities are relative ones. The angle_position parameter has essentially no effect on the stress.


With the initial structure, there is a maximum von Mises stress of 305 MPa, a maximum displacement of 0.06 mm and a mass of 116 g. The objective is to keep the maximum stress under 265 MPa while keeping the mass as low as possible. The sensitivities above give some ideas about the changes needed to reach this objective. The parameter hole_position has the most influence on the stress and has to be lowered. Moreover, it does not affect the mass, so it is a critical parameter for stress reduction only. The same holds for the inner_diam parameter's effect on the mass: it is the most influential on mass and has little effect on stresses. To reach the objective, expect these two parameters to be lowered.

Note the complexity of the response of this torsional damper with respect to the input parameters. As seen in the design curves, all but one of the responses to the input parameters are nonlinear. The response of the stresses to hole_diam has a definite kink in it. The reason is that the maximum stress jumps from one location to another as the parameters change. Simple Design of Experiments (DOE) curve fitting of results to selected samples would not typically have discovered this. One of the unique features of DesignXplorer VT is the instant, real-time availability of the entire finite element solution anywhere in the parameter domain. Pick any combination of parameter values and see a contour display of the finite element results. Unlike DOE, DesignXplorer VT already has these results available for the user. Figure 9 shows contours of von Mises stress from directly inside the Solution Viewer for the parameter hole_position at -3 mm, 0 and +3 mm. The color scheme is the same for all meshes, so we really see the evolution with the given parameter.
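The kink can be reproduced with a toy model: if the reported maximum stress is the larger of the stresses at two competing locations, the response is continuous but non-smooth where the two curves cross. Both curves below are invented for demonstration:

```python
# Why a design curve can have a kink: the reported maximum stress switches
# between two competing locations. Both stress curves are invented.
def stress_at_location_a(d):
    """Stress at location A (MPa); decreases as the hole diameter grows."""
    return 300.0 - 12.0 * d

def stress_at_location_b(d):
    """Stress at location B (MPa); increases as the hole diameter grows."""
    return 240.0 + 8.0 * d

def reported_max_stress(d):
    return max(stress_at_location_a(d), stress_at_location_b(d))

# The crossover (kink) sits where the curves meet: 300 - 12d = 240 + 8d, d = 3
for d in (1.0, 3.0, 5.0):
    print(d, reported_max_stress(d))
```

A curve fit through a handful of DOE samples can smooth straight over such a crossover, which is the failure mode the article is pointing at.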


Figure 7. Design curves created by Solution Viewer.

Additionally, DesignXplorer VT's Solution Viewer allows you to view your parametric response as either design curves, such as those in Figure 7, or as response surfaces, as shown in Figure 8.

Figure 8. Response surface created by Solution Viewer.

Figure 9. Stress plots created by Solution Viewer for the parameter hole_position at -3 mm, 0 and +3 mm.


Software Highlights


DesignXplorer VT also includes powerful optimization and tolerance capabilities. Using the optimization capabilities built into the Solution Viewer, optimize the part. As stated before, the goal is to minimize the mass of the part while keeping its maximum stress under 265 MPa. The optimization in this case needs only 63 iterations, completed in 60 seconds, an amazingly short time considering the number of iterations. This is because, as mentioned earlier, DesignXplorer VT has the entire finite element solution available anywhere in the parameter domain; no additional solutions are required. The final maximum stress is 265 MPa, more than 10% below the initial stress value. The final mass has also been lowered to 101 g, a saving of 13%, so the design is better in terms of both stress and mass.

The powerful combination of ParaMesh and DesignXplorer VT opens doors to analyses that never existed before. Previously, you had to guess or optimize manually. Now parametric analysis is available no matter what environment is being used: Workbench for those who have parameterized CAD models, and ParaMesh with DesignXplorer VT for those with models without parametric CAD. That is the value this powerful combination of ANSYS ParaMesh and ANSYS DesignXplorer VT provides: more design insight, faster...even for legacy models.
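Optimizing on a response surface amounts to searching a cheap surrogate for the lightest feasible design. The sketch below uses invented mass and stress surfaces and a plain grid search rather than DesignXplorer VT's optimizer, but it shows why such a search can finish in seconds:

```python
# Constrained optimization on a surrogate: minimize mass subject to a stress
# limit by scanning an already-available approximation. Both response
# functions are invented stand-ins, not the damper's actual surfaces.
def mass(inner_diam, hole_position):
    """Invented mass response, grams."""
    return 116.0 - 9.0 * inner_diam - 0.2 * hole_position

def stress(inner_diam, hole_position):
    """Invented maximum-stress response, MPa."""
    return 305.0 + 2.0 * inner_diam - 15.0 * hole_position

best = None
for i in range(0, 26):                 # inner_diam over [-0.5, 2.0] mm
    for j in range(0, 61):             # hole_position over [-3.0, 3.0] mm
        d, p = -0.5 + 0.1 * i, -3.0 + 0.1 * j
        if stress(d, p) <= 265.0:      # feasibility: the stress limit
            if best is None or mass(d, p) < best[0]:
                best = (mass(d, p), d, p)

print(best)  # lightest feasible (mass, inner_diam, hole_position) on the grid
```

Every candidate here is a pair of function evaluations, which is why thousands of them cost almost nothing once the single underlying FE solution exists.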

Procedure Overview

Start with an ANSYS database.

1. From an existing ANSYS model, write a .cdb file (CDWRITE).
2. Import the .cdb file into ANSYS ParaMesh.
3. Create the mesh morphing parameters within ANSYS ParaMesh; ParaMesh writes them to a file .rsx (or .van).
4. Declare the mesh morphing parameters by editing the ANSYS input file (file.dat), adding the SXGEOM, SXRFIL and SXMETH commands. (Be sure to use consistent names.)
5. Perform the parametric solution using ANSYS DesignXplorer VT.
6. Post-process the parametric results (with DesignXplorer VT and SXPOST) and optimize the part.

Follow these six steps in using ParaMesh and DesignXplorer VT to optimize legacy models, even if you do not have geometry or if the geometry is non-associative. ParaMesh easily prepares these models so they can be studied with DesignXplorer VT to arrive at quick insight into the design impact of varying the geometry.


Tech File


Demystifying Contact Elements


Part 1 of 2: What they are, how they work and when to use them.
By John Crawford, Consulting Analyst

If you analyze enough problems, chances are good that sooner or later you'll run across one that requires the use of contact elements. Contact elements are used to simulate how one or more surfaces interact with each other. For most analysts, the first exposure to contact elements can be a little confusing because of the variety of elements and the multitude of special features that are available. We have to determine which contact elements are appropriate for our problem, resolve any convergence problems that might arise during solution, and check the results for reasonable and accurate answers. Let's see if we can clear up some of the mysteries that surround the use of contact elements.

We'll begin by talking about the elements themselves. Remember that an analysis is made up of one or more load steps, and each load step has one or more substeps. Within each substep there can be several nested layers of equilibrium iterations. The precise number and the manner in which they are nested depend on the solver, how many nonlinear features are being used and several other things. Contact analyses are nonlinear and therefore require their own equilibrium iteration loop. At the end of each contact equilibrium iteration, ANSYS checks to see if the status of each contact element has changed. It also calculates a convergence value (usually force equilibrium) and compares it to the convergence criterion. If the element status has not changed and the convergence criterion has been met, ANSYS determines that the solution for this iteration has converged and moves on to the next outer iteration loop, the next substep or the next load step, or stops solving altogether if the analysis is now complete. If at this point you're a little confused, don't worry.
The critical ideas to remember are the following:

- Contact analyses are nonlinear in nature.
- ANSYS performs a special equilibrium iteration loop when doing a contact analysis.
- Contact elements have a status that indicates whether they are open, closed, sliding, etc.
- ANSYS checks the element status and the convergence criteria at the end of each contact equilibrium iteration to determine if equilibrium has been achieved.
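Those ideas can be condensed into a toy iteration loop. The example below is a single spring pressed against a rigid stop through a penalty stiffness, a deliberate simplification for illustration rather than the actual ANSYS algorithm:

```python
# Toy contact equilibrium loop: after each trial solution, check whether the
# contact status changed AND whether the residual meets the convergence
# criterion; only then declare convergence. The physics (one spring against
# a rigid stop via a penalty stiffness) is invented for illustration.
def solve_contact(force, k_spring=100.0, gap=0.01, k_penalty=1e6, tol=1e-8):
    u, status_prev = 0.0, "open"
    for iteration in range(1, 100):
        status = "closed" if u >= gap else "open"
        k_contact = k_penalty if status == "closed" else 0.0
        # Equilibrium for this trial: spring plus (possibly active) contact
        u_new = (force + k_contact * gap) / (k_spring + k_contact)
        residual = abs(u_new - u)
        u = u_new
        # Converged only if status is unchanged AND the residual is small
        if status == status_prev and residual < tol:
            return u, status, iteration
        status_prev = status
    raise RuntimeError("contact iterations failed to converge")

print(solve_contact(force=5.0))  # large force: the node presses on the stop
print(solve_contact(force=0.5))  # small force: the gap stays open
```

Note that a small residual alone is not enough: if an element just switched from open to closed, the stiffness of the system changed and another pass is required, which is exactly the status check described above.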

Node-to-Node Elements
In the early days of finite element analysis, there was one type of contact element: the node-to-node variety. The early versions of node-to-node contact elements were CONTAC12 (2-D) and CONTAC52 (3-D). More recently, CONTA178 (2-D and 3-D) was introduced to encompass the capabilities of both of these elements and also introduce some new features, such as additional contact algorithms. Node-to-node contact elements are simple and solve relatively quickly. Their basic function is to monitor the movement of one node with respect to another node. When the gap between these nodes closes, the contact element allows load to transfer from one node to the other. What does this really mean and how does ANSYS know when the nodes are touching?
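In one dimension, the gap check amounts to the following sketch (illustrative only; the gap, stiffness and displacement values are arbitrary, and real node-to-node elements offer several contact algorithms beyond this simple penalty spring):

```python
def node_to_node_force(x_a, x_b, initial_gap, stiffness):
    """1-D node-to-node contact check (conceptual sketch).

    The element monitors the gap between two nodes along the contact
    normal. While the gap remains open, no load is transferred; once
    it closes, a penalty spring pushes the nodes apart with a force
    proportional to the penetration.
    """
    gap = initial_gap + (x_b - x_a)   # x_a, x_b: displacements along the normal
    if gap > 0.0:
        return 0.0, "open"            # not touching: no load transfer
    return -stiffness * gap, "closed" # load transferred across the interface

# Initial gap of 0.25; node A moves 0.5 toward node B:
force, status = node_to_node_force(x_a=0.5, x_b=0.0, initial_gap=0.25, stiffness=1000.0)
print(status, force)  # closed 250.0
```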

www.ansys.com

ANSYS Solutions

Summer 2004

Tech File


These characteristics are true for all types of contact elements. While they may seem a little primitive when compared with the newer contact elements, node-to-node contact elements have a lot going for them. They've been around long enough to have had their bugs worked out many years ago, and their extensive use over several decades means that there is a vast experience base to draw upon when setting up and debugging an analysis.

CONTAC12 and CONTAC52 can have nodes that are either coincident or non-coincident. While the majority of applications involve using non-coincident nodes, coincident nodes can be useful for certain analyses. If coincident nodes are used, the orientation of the contact surface that exists between the two nodes must be defined. The initial condition (gap or interference) can be provided by the user as either positive (gap) or negative (interference), or automatically calculated from the relative positions of the nodes.

Node-to-node contact is also available in COMBIN40. COMBIN40 is a rather unique element because it also includes a spring-slider, a damper (which works in parallel with the spring-slider) and a mass at each node. Any of these features can be used alone or simultaneously with any or all of the other features.

While node-to-node contact elements are very useful, there are some limitations that must be kept in mind when using them. One limitation is that the orientation of the gap is not updated when large deflection analyses are performed. Another limitation is that these elements do not account for moment equilibrium. This does not present a problem when a line drawn between the nodes is normal to the contact surface, because in this instance the moments are zero, but care should be taken in each analysis to recognize whether this is the case or not. If not, it is important to consider what effect this might have on the results.
It is the responsibility of the analyst to recognize whether this condition is present and whether it introduces an unacceptable error that invalidates the usefulness of the analysis. Node-to-node elements can always be generated manually, and, depending on the model, you can often use the EINTF command to make them as well.
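Conceptually, EINTF searches for pairs of nodes that lie within a small tolerance of one another and creates an element between each pair. A brute-force sketch of that search (node numbers and coordinates are invented; the real command also honors the active element attributes):

```python
import math

def find_node_pairs(nodes, tolerance):
    """Pair up nearly coincident nodes, as EINTF does conceptually.

    nodes: dict mapping node number -> (x, y, z) coordinates.
    Each node is used in at most one pair (brute force, fine for
    a small illustration).
    """
    unpaired = dict(nodes)
    pairs = []
    for n_i in sorted(nodes):
        if n_i not in unpaired:
            continue
        for n_j in sorted(unpaired):
            if n_j <= n_i:
                continue
            if math.dist(nodes[n_i], nodes[n_j]) <= tolerance:
                pairs.append((n_i, n_j))
                del unpaired[n_i], unpaired[n_j]
                break
    return pairs

nodes = {1: (0.0, 0.0, 0.0), 2: (0.0, 0.0, 1e-5), 3: (5.0, 0.0, 0.0)}
print(find_node_pairs(nodes, tolerance=1e-4))  # [(1, 2)]
```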

Node-to-Surface Elements
The next evolution in contact elements was the introduction of node-to-surface contact elements, such as CONTAC26 (2-D), CONTAC48 (2-D), CONTAC49 (3-D) and the recent addition of CONTA175 (2-D and 3-D). The major enhancement offered by node-to-surface contact elements is that they allow a node to contact anywhere along an edge (in 2-D) or a surface (in 3-D). Rather than being confined to contacting a specific node, a node can contact anywhere along the edge of an element. This has significant benefits when objects translate or rotate relative to each other. Node-to-surface contact elements are capable of simulating large relative movements with accuracy.

Because CONTA175 includes all the capabilities of the other node-to-surface contact elements and has other features that these elements do not have, CONTA175 will replace the other node-to-surface elements in future versions of ANSYS. Beginning in ANSYS 8.1, CONTAC26, CONTAC48 and CONTAC49 will be undocumented, and they will eventually be removed from ANSYS.

There are several ways to generate node-to-surface contact elements. They can be made manually, but this becomes impractical when making more than a few elements. GCGEN and ESURF are commands that are frequently used to generate node-to-surface contact elements, with GCGEN being the easiest and quickest way to make CONTAC48 and CONTAC49 node-to-surface contact elements, while ESURF is used to make CONTA175 node-to-surface elements. To use GCGEN, you make two components, one that contains the nodes from one of the contact surfaces and another that contains the elements from the other contact surface, and then use GCGEN to automatically generate node-to-surface contact elements between every node and every element that are in these components.
To use ESURF, you select the elements that the CONTA175 elements will be attached to and their nodes that are on the surface you wish to place the contact elements onto, making sure that you have the proper element attributes active (TYPE, REAL and MAT), and then issue the ESURF command. Last but not least, the Contact Wizard can be used to generate node-to-surface contact elements and is usually the easiest and quickest way of making them.
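The geometric idea that distinguishes node-to-surface contact (a node may touch anywhere along an element edge, not just at a node) reduces in 2-D to a point-to-segment projection. A sketch of that calculation (illustrative only, not the actual ANSYS search):

```python
import math

def node_edge_gap(node, seg_a, seg_b):
    """Gap between a node and a 2-D element edge (conceptual sketch).

    Projects the node onto the segment seg_a-seg_b, clamps the
    projection parameter to the segment, and returns the distance
    plus the closest point on the edge.
    """
    ax, ay = seg_a
    bx, by = seg_b
    px, py = node
    ex, ey = bx - ax, by - ay                  # edge vector
    t = ((px - ax) * ex + (py - ay) * ey) / (ex * ex + ey * ey)
    t = max(0.0, min(1.0, t))                  # clamp onto the segment
    cx, cy = ax + t * ex, ay + t * ey          # closest point on the edge
    return math.hypot(px - cx, py - cy), (cx, cy)

# The node contacts the edge interior, not an end node:
gap, point = node_edge_gap((0.5, 0.2), (0.0, 0.0), (1.0, 0.0))
print(gap, point)  # 0.2 (0.5, 0.0)
```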


Surface-to-Surface Elements
The latest evolution of contact element technology has been in the area of surface-to-surface contact. This allows contact to take place between one or more edges in 2-D, or one or more surfaces in 3-D. There are several important characteristics that make surface-to-surface contact elements very different from their less sophisticated ancestors. Surface-to-surface contact is not defined by a single element, but by two types of elements called targets and contacts. Any number of target and contact elements can be identified as being a set or group. Contact can take place between any contact elements and any target elements that are in this group. ANSYS uses the real constant number to identify the target and contact elements that are in a group. All target and contact elements in this group have the same real constant number. Two-dimensional contact problems can be simulated using either CONTA171 or CONTA172 with TARGE169, while three-dimensional problems would use either CONTA173 or CONTA174 with TARGE170.
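The pair bookkeeping can be pictured as a simple lookup: every contact element is checked only against target elements carrying the same real constant (pair) number. A sketch with invented element records:

```python
from collections import defaultdict

def build_contact_pairs(elements):
    """Group contact and target elements by real constant number.

    elements: list of dicts with keys "id", "kind" ("contact" or
    "target") and "real" (the real constant set number shared by a
    pair). The solver-side search is then restricted to elements
    within the same group.
    """
    pairs = defaultdict(lambda: {"contact": [], "target": []})
    for e in elements:
        pairs[e["real"]][e["kind"]].append(e["id"])
    return dict(pairs)

elems = [
    {"id": 101, "kind": "contact", "real": 3},
    {"id": 102, "kind": "contact", "real": 3},
    {"id": 201, "kind": "target",  "real": 3},
    {"id": 301, "kind": "contact", "real": 4},
    {"id": 401, "kind": "target",  "real": 4},
]
print(build_contact_pairs(elems)[3])  # {'contact': [101, 102], 'target': [201]}
```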

CONTA171 and CONTA173 are appropriate for edges and surfaces made from linear (no midside nodes) elements, while CONTA172 and CONTA174 can be used with edges and surfaces made from quadratic (having midside nodes) elements. Both CONTA172 and CONTA174 can be used in a degenerate form on surfaces made from linear elements.

The introduction of surface-to-surface contact elements has brought about big improvements in solution efficiency and has also broadened the types of contact problems that can be modeled. They offer many new and improved features, such as the ability to contact and then bond two surfaces together, automatic opening or closing of gaps to a uniform value, and a variety of contact algorithms, to name just a few.

You can generate surface-to-surface contact elements by using a series of NSEL, ESEL and ESURF commands. The Contact Wizard automates these steps and makes the generation of surface-to-surface contact elements quick and easy in both 2-D and 3-D. Now that we have been introduced to the contact elements that are at our disposal, we'll follow up next time with some helpful hints on how to use them.
Part two of this article, to appear in the next issue of ANSYS Solutions, will discuss various aspects of using contact elements, including modeling tips and setting appropriate stiffness.




Tips and Techniques


Contact Defaults in Workbench and ANSYS


Intelligent default settings solve common problems fast with minimal user intervention.
By ANSYS, Inc. Technical Support

As every experienced FEA analyst knows, no two contact problems are exactly alike, so there is no silver-bullet combination of KEYOPT and real constant settings that will successfully work for all problems. That explains the many features available today within the contact elements. It also explains, in part, the rationale behind the different default settings sometimes found in the different environments. As migration between Workbench and ANSYS environments progresses, it is important for analysts to recognize that, although the contact technology used in both of these environments is exactly the same, some of the default KEYOPT and real constant settings are not. Tables 1 and 2 summarize all surface-to-surface contact element (CONTA171-174) KEYOPTs and real constant properties with their respective default settings in each environment. Those that have different defaults in the different environments are highlighted in bold italic.

Table 1: 8.0 Default Contact KEYOPTs

| KEYOPT | Description | ANSYS APDL | Contact Wizard | Workbench Default, Linear (bonded, no sep) | Workbench Default, Nonlinear (standard, rough) |
|---|---|---|---|---|---|
| 1 | Selects DOF | manual | automatic | automatic | automatic |
| 2 | Contact algorithm | Aug. Lagrange | Aug. Lagrange | pure penalty | pure penalty |
| 3 | Stress state when superelement is present | no super elem | no super elem | n/a | n/a |
| 4 | Location of contact detection point | gauss | gauss | gauss | gauss |
| 5 | CNOF/ICONT adjustment | no adjust | no adjust | no adjust | no adjust |
| 6 | (blank) | | | | |
| 7 | Element level time increment control | no control | no control | no control | no control |
| 8 | Asymmetric contact selection | no action | no action | no action | no action |
| 9 | Effect of initial penetration or gap | include all | include all | exclude all | include all/ramped |
| 10 | Contact stiffness update | btwn loadsteps | btwn substps | btwn loadsteps | btwn loadsteps |
| 11 | Beam/shell thickness effect | exclude | exclude | exclude | exclude |
| 12 | Behavior of contact surface | standard | standard | bonded | n/a |

KEYOPT(1): Select Degrees of Freedom (DOF)
This option gives you the freedom to assign the contact DOF set consistent with the physics of the underlying elements. ANSYS surface-to-surface contact technology offers an impressive combination of structural, thermal, electric and magnetic capabilities. When building pairs through the ANSYS environment with traditional ANSYS Parametric Design Language (APDL), users must

set this option manually. The default will always be KEYOPT(1) = 0 (for UX, UY). When building contact pairs in the ANSYS environment using the Contact Wizard, KEYOPT(1) is set automatically according to the DOF set of the underlying element. In Workbench, this option also is set automatically, depending on the underlying element DOF set.

Table 2: 8.0 Default Contact Real Constants

| No. | Name | Description | ANSYS APDL | Contact Wizard | Workbench |
|---|---|---|---|---|---|
| 1 | R1 | Target circle radius | 0 | n/a | n/a |
| 2 | R2 | Superelement thickness | 1 | 1 | n/a |
| 3 | FKN | Normal penalty stiffness factor | 1 | 1 | note 1 |
| 4 | FTOLN | Penetration tolerance factor | 0.1 | 0.1 | 0.1 |
| 5 | ICONT | Initial contact closure | 0 | 0 | 0 |
| 6 | PINB | Pinball region | note 2 | note 2 | note 2 |
| 7 | PMAX | Upper limit of initial penetration | 0 | 0 | 0 |
| 8 | PMIN | Lower limit of initial penetration | 0 | 0 | 0 |
| 9 | TAUMAX | Maximum friction stress | 1.00E+20 | 1.00E+20 | 1.00E+20 |
| 10 | CNOF | Contact surface offset | 0 | 0 | 0 |
| 11 | FKOP | Contact opening stiffness | 1 | 1 | 1 |
| 12 | FKT | Tangent penalty stiffness | 1 | 1 | 1 |
| 13 | COHE | Contact cohesion | 0 | 0 | 0 |
| 14 | TCC | Thermal contact conductance | 0 | 0 | note 3 |
| 15 | FHTG | Frictional heating factor | 1 | 1 | 1 |
| 16 | SBCT | Stefan-Boltzmann constant | 0 | 0 | n/a |
| 17 | RDVF | Radiation view factor | 1 | 1 | n/a |
| 18 | FWGT | Heat distribution weighting factor | 0.5 | 0.5 | 0.5 |
| 19 | ECC | Electric contact conductance | 0 | 0 | n/a |
| 20 | FHEG | Joule dissipation weighting factor | 1 | 1 | n/a |
| 21 | FACT | Static/dynamic ratio | 1 | 1 | 1 |
| 22 | DC | Exponential decay coefficient | 0 | 0 | 0 |
| 23 | SLTO | Allowable elastic slip | 1% | 1% | 1% |
| 25 | TOLS | Target edge extension factor | note 4 | note 4 | note 4 |
| 26 | MCC | Magnetic contact permeance | 0 | 0 | n/a |

Notes:
1. FKN = 10 if only linear contact is active (bonded, no sep). If any nonlinear contact is active, all regions will have FKN = 1 (including bonded, no sep).
2. Depends on contact behavior, rigid vs. flex target, KEYOPT(9) and NLGEOM ON/OFF.
3. Calculated as a function of highest conductivity and overall model size.
4. 10% of target length for NLGEOM,OFF. 2% of target length for NLGEOM,ON.

KEYOPT(2): Contact Algorithm
ANSYS contact technology offers many algorithms to control how the code enforces compatibility at a contacting interface.

The penalty method (KEYOPT(2) = 1) is a traditional algorithm that enforces contact compatibility by using a contact spring to establish a relationship between the two surfaces. The spring stiffness is called the penalty parameter or, more commonly, the contact stiffness. The spring is inactive when the surfaces are apart (open status), and becomes active when the surfaces begin to interpenetrate.

The augmented Lagrange method (KEYOPT(2) = 0) uses an iterative series of penalty methods to enforce contact compatibility. Contact tractions (pressure and friction stresses) are augmented during equilibrium iterations so that the final penetration is smaller than the allowable tolerance. This offers better conditioning than the pure penalty method and is less sensitive to the magnitude of contact stiffness used, but may require more iterations than the penalty method.

The Multi-Point Constraint (MPC) method (KEYOPT(2) = 2) enforces contact compatibility by using internally generated constraint equations to establish a relationship between the two surfaces. The DOFs of the contact surface nodes are eliminated. No normal or tangential stiffness is required. For small deformation problems, no


iterations are needed in solving the system equations. Since there is no penetration or contact sliding within a tolerance, MPC represents true linear contact behavior. For large deformation problems, the MPC equations are updated during each iteration. This method applies to bonded surface behavior only. It is also useful for building surface constraint relationships similar to CERIG and RBE3. MPC is available as a standard option when modeling bonded contact in both ANSYS and Workbench environments.

The pure Lagrange multiplier method (KEYOPT(2) = 3) adds an extra degree of freedom (contact pressure) to satisfy contact compatibility. Pure Lagrange enforces near-zero penetration with the pressure DOF. Unlike the penalty and augmented Lagrange algorithms, it does not require a normal contact stiffness. Pure Lagrange does require a direct solver, can be more computationally expensive and can have convergence difficulties related to overconstraining, but it is a very useful algorithm when zero penetration is critical. It can also be combined with the penalty algorithm in the tangential direction (KEYOPT(2) = 4) when zero penetration is critical and friction is also present.

The ANSYS environment uses augmented Lagrange by default. The Workbench environment currently uses the penalty method, but the default can be changed via the Options Menu at 8.1. MPC is available as a standard alternative in both environments. The pure Lagrange options have always been available in ANSYS and, as of version 8.1, are also available in the Workbench environment; in earlier versions they can be accessed in Workbench via the pre-processor command builder. Table 3 summarizes all the algorithms with the pros and cons of each.

KEYOPT(9): Effect of Initial Penetration or Gap
Properly accounting for or controlling interferences and gaps can sometimes be the difference between success and failure in simulating a complicated contact relationship. There are several contact options available to control how the code accounts for initial interference or gap effects:

- (0) Include everything: Include an initial interference from the geometry and the specified offset (if any).
- (1) Exclude everything: Ignore all initial-interference effects.
- (2) Include with ramped effects: Ramp the interference to enhance convergence.
- (3) Include offset only: Base initial interference on the specified offset only.
- (4) Include offset only w/ ramp: Base initial interference on the specified offset only, and ramp the interference effect to enhance convergence.

Table 3: Contact Algorithms

| Algorithm | Pros | Cons | When to Use |
|---|---|---|---|
| Pure penalty | Offers easiest convergence in least number of iterations | Requires contact stiffness and allowance for some finite penetration | Helpful when contact convergence is a challenge and minimal penetration is acceptable (default in Workbench) |
| Augmented Lagrange | Minimizes penetration; better conditioning than penalty; less sensitive to contact stiffness | Might require more iterations | The default for surf-to-surf and node-to-surf in ANSYS, as it has proven to produce the best quality results in the most common applications (default in ANSYS) |
| Pure Lagrange | Offers near-zero penetration; zero elastic slip (no contact stiffness required) | Might require more iterations; might also require adjustment to chatter control parameters unique to this algorithm; can produce overconstraints in model | When zero penetration is critical |
| Pure Lagrange on normal; penalty on tangent | Same as pure Lagrange, plus simulation of friction is handled most efficiently | Same as pure Lagrange | When zero penetration is critical and friction is present |
| Multipoint Constraint (MPC) | More efficient than traditional bonded contact; offers contact between mixed element types; offers CERIG/RBE3-type constraints | Can produce overconstraints in model | Recommended for large bonded contact models to enhance run time, and for contact between mixed element types and surface constraint applications |
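A one-dimensional picture of the first two algorithms in Table 3: a point pressed against a rigid wall by a force F through a contact spring of stiffness k. Pure penalty leaves a finite penetration of F/k, while augmented Lagrange repeatedly augments the contact traction until the remaining penetration drops below a tolerance (an illustrative sketch; the numbers are arbitrary):

```python
def pure_penalty(force, k):
    """Penetration left when a penalty spring alone balances the force."""
    return force / k

def augmented_lagrange(force, k, tol, max_iters=50):
    """Augment the contact traction until penetration < tol (conceptual)."""
    traction = 0.0                            # accumulated contact traction
    for i in range(1, max_iters + 1):
        penetration = (force - traction) / k  # the spring carries the rest
        if penetration < tol:
            return penetration, i
        traction += k * penetration           # augmentation step
    return penetration, max_iters

F, k = 100.0, 1000.0
print(pure_penalty(F, k))                  # 0.1
print(augmented_lagrange(F, k, tol=1e-6))  # (0.0, 2)
```

In this rigid 1-D case a single augmentation is exact; on a deformable mesh each augmentation perturbs the equilibrium, which is why the method may need more iterations than pure penalty.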


In ANSYS, the default, KEYOPT(9) = 0, is to include everything. In Workbench, the default is to exclude everything (1) when linear contact (bonded, no separation) is defined, and to include with ramped effects (2) when nonlinear contact (frictional, frictionless, rough) is defined.

KEYOPT(10): Contact Stiffness Update
When using the penalty and/or augmented Lagrange method, contact stiffness has long been recognized as a critical property that influences both accuracy and convergence. Too high a stiffness will ultimately lead to convergence difficulty; too low a stiffness will result in over-penetration and an inaccurate assessment of surface pressures and stresses at the interface. In an effort to arrive at a good balance between these extremes, automatic stiffness updating between loadsteps (KEYOPT(10) = 0), between substeps (KEYOPT(10) = 1) or between iterations (KEYOPT(10) = 2) was introduced as an enhancement to traditional trial-and-error methods.

In ANSYS, when contact is built via APDL, the default is to update stiffness between loadsteps. When contact is built via the Wizard, the default has been changed to update between substeps, which is considered to produce the most robust contact simulation in most cases. In Workbench, the default behavior is still between loadsteps, but the default can be changed via the Options Menu at version 8.1. These defaults may change in future releases as further enhancements are made.

KEYOPT(12): Behavior of Contact Surface
ANSYS contact technology offers a rich library of surface behavior options to simulate every possible situation. These options are as follows:

- (0) Standard: (Referred to as Frictionless or Frictional in Workbench) Normal contact closing and opening behavior, with normal sticking/sliding friction behavior when a nonzero friction coefficient is defined.
- (1) Rough: Normal contact closing and opening behavior, but no sliding can occur (similar to having an infinite coefficient of friction).
- (2) No Separation: Target and contact surfaces are tied once contact is established (sliding is permitted). This is not available as a standard option in Workbench, but can be accessed via the pre-processor command builder.

- (3) Bonded: Target and contact surfaces are glued once contact is established.
- (4) No Separation (always): (Referred to simply as No Separation in Workbench) Any contact detection points initially inside the pinball region, or that come into contact, are tied in the normal direction (sliding is permitted).
- (5) Bonded Contact (always): (Referred to simply as Bonded in Workbench) Any contact detection points initially inside the pinball region, or that come into contact, are bonded. (DesignSpace default)
- (6) Bonded Contact (initial contact): Bonds surfaces ONLY in initial contact; initially open surfaces will remain open. This is not available as a standard option in Workbench, but can be accessed via the pre-processor command builder.

The default surface behavior in ANSYS is nonlinear standard, for simulating the most general normal contact closing and opening behavior with normal sticking/sliding friction. In Workbench, the default behavior (which can be changed via the Options Menu at version 8.1), set up with automatic contact detection to simulate an assembly, is linear Bonded Contact (always).

Real Constant(3): Normal Penalty Stiffness Factor (FKN)
Users control the initial contact stiffness by multiplying the calculated value by a factor, FKN. The default value for FKN used in ANSYS (APDL or Wizard) is 1.0. In Workbench, FKN = 10 if only linear contact is active (bonded or no separation). If any nonlinear contact is active, all regions will have FKN = 1 (including bonded and no separation).

Real Constant(14): Thermal Contact Conductance (TCC)
This constant dictates the thermal resistance across the interface of contacting bodies in applications involving thermal analysis. The default value for TCC in ANSYS is zero (a perfect insulator). In Workbench, the default is automatically calculated as a function of the highest thermal conductivity of the contacting parts and the overall model size, thus essentially modeling perfect thermal contact.
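The role of TCC follows the standard interface conduction relation, q = TCC * (T_contact - T_target): zero conductance transfers no heat, while a very large value approaches perfect thermal contact. A sketch with arbitrary values:

```python
def interface_heat_flux(tcc, t_contact, t_target):
    """Heat flux across a contact interface: q = TCC * (T_contact - T_target).

    TCC = 0 models a perfect insulator (the ANSYS APDL default);
    a very large TCC approaches the perfect thermal contact that the
    Workbench default aims for.
    """
    return tcc * (t_contact - t_target)

# 50-degree temperature jump across the interface:
print(interface_heat_flux(0.0, 100.0, 50.0))  # 0.0 (perfect insulator)
print(interface_heat_flux(1e6, 100.0, 50.0))  # 50000000.0 (near-perfect contact)
```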


Guest Commentary

Putting Quality Assurance in Finite Element Analysis

Part 2 of 2: Step-by-step ways to best implement a quality assurance program.

By Vince Adams
Director of Analysis Services
IMPACT Engineering Solutions, Inc.

The NAFEMS document Management of Finite Element Analysis - Guidelines to Best Practice states that a quality assurance program should be developed to serve an organization, not vice-versa. To address this concern and the barriers described in Part 1 of this series, IMPACT Engineering Solutions has developed a suite of QA tools that can be customized and scaled to meet the needs of a wide range of product development teams and industries. This suite of QA tools includes: process audits, management education, user skill-level assessment, user education/continuous improvement, pre- and post-analysis checklists, project documentation, data management, and analysis correlation guidelines.

Process Audit
The first step in establishing a QA program should be to document existing processes and company goals, including technical, organizational and competitive goals. Developing an understanding of how products are developed, what the historical issues and challenges have been, what interactions exist, and how simulation technologies can best impact a company's bottom line should precede any recommendations. A process audit should evaluate not only the tools used by an engineering department but also identify additional state-of-the-art tools that can impact the design process or allow simulation activities to grow beyond current limitations. A process audit should help ensure that all groups involved in the design process are on the same page. Finally, the process audit should put some monetary values to typical tasks so that potential savings and opportunities for gains can be more readily identified. The report generated from the process audit should be a living document that allows periodic review of critical components and observations.

Management Education
A recent survey indicated that management, for various reasons, was the greatest barrier to the success of FEA in product design, according to the users who responded. Helping managers set proper expectations regarding the capabilities and limitations of analysis is often the single most important step in improving the quality and value of simulation at a company. Management training should also include a discussion of the validity of assumptions and results, quality control concepts and an overview of the skills that they should expect their team members to possess to be productive with FEA.

User Skill Level Assessment
Skill level assessment may be the most difficult and controversial component of a QA program, and the one with the most far-reaching impact. Skill level assessment isn't technically difficult, as there are several areas of expertise that are fundamental to the successful use of analysis. The difficulty lies in the potential for perceived threat. Consequently, users must be shown that the program is not a test but a tool to help them better understand their skills and needs. The program we have developed is composed of four sections in which candidates:

- Demonstrate a basic understanding of engineering mechanics (failure theory, stress concepts, material properties, etc.)
- Show a working knowledge of finite element analysis (terminology, concepts, capabilities, meshing, boundary conditions, etc.)
- Solve hands-on sample problems (using FEA tools that are to be part of their job)
- Present a portfolio of past work they have performed (reports, screen shots, models, plots, etc.)

The assessment report provided back to management should include not only performance results but also a plan for improvement for that individual (including courses and other support options, as well as special skills that might be shared with others in the organization). Finally, some sort of indication of a user's level of competence and/or approved responsibility level should be noted, without negative connotations that could be misconstrued.

User Education/Continuous Improvement
A proactive and forward-thinking QA program should identify the areas of growth and knowledge required to keep the skills of users sharp. A company can't be confident that users are state-of-the-art in their techniques and tools unless they are exposed to people and techniques outside of their familiar surroundings. The process audit conducted at the beginning of the program should identify critical skills and techniques that are needed to maximize the benefits of simulation, while the skills assessment should identify which users need work in those techniques. Employee growth should be planned, not expected to happen haphazardly. Knowledge and documentation of the next plateau for each user or group of users, with clear milestones, will help ensure that quality is maintained. It is also preferable to insist that all users at an organization go through a standard set of courses so that all are using the same language and have been exposed to the same data.

Pre- and Post-Analysis Checklists
NAFEMS has developed an excellent starting point for companies looking to implement checklists as a quality control tool. We suggest taking these a step further and customizing them for a particular company's tools and analysis environment. As part of a total QA program, clients should be able to access these forms online via an intranet or the Internet, and they should be made available as part of the project documentation described below. We've found that when these simple checking tools are bypassed, minor errors in data entry and interpretation can cause major problems in the decisions based on FE data.

Project Documentation
Too few companies have standard report formats for analysis, while many companies don't mandate reports at all. Despite the obvious loss of intellectual capital a company will experience when an analyst leaves the organization without documenting their work, a company loses one of the most important quality control tools in the analysis process when reports aren't completed. A QA program for analysis must include a report format that transcends groups, specializations or departments. Analysis data on seemingly unrelated components could still provide insight and prevent repetition of work. In addition to providing details of the recent work, a project report should include references to similar historical projects, test data and correlation criteria. A report should indicate the source of inputs and assumptions, as well as comment on the validity of these assumptions. Additionally, a company would benefit from linking test and analysis reports, even to the point of using similar formats for the two related tasks.

Data Management
As companies begin to evaluate their PLM (product lifecycle management) structures, the organization of analysis and other product performance data must be included in the initial planning. D.H. Brown and Associates have investigated the needs of CAE data management and have found that structured PDM (product data management) systems may not be up to the task. PDM systems were typically developed to manage revisions and bill hierarchies, not the simplified geometries, results formats and validation databases required for an analysis program. While every company must develop its own PLM and data management system that best fits within its organization, a QA program for analysis must tap into that system, formalize it if need be, and provide a means for policing the archiving of analysis data so that a company's intellectual property and investment in simulation are secure.

Analysis Correlation Guidelines
Unfortunately, companies rarely correlate their finite element data with physical testing. And when testing is used, set-ups are often inappropriate, proper procedures are not followed and sufficient data points are not gathered. Therefore, thought should be given to multiple validation points to ensure that boundary conditions, material properties and geometry are all properly specified to provide consistent correlation. The analyst and the test technician should work closely together to devise a test intended to correlate the analysis modeling assumptions. Care should be taken to evaluate the validity of constraints in the model, especially fixed constraints, as these can lead to gross variations in stiffness when comparing test results to analysis results. A QA program for analysis should bridge the gap between test and analysis, and document procedures for correlating FE data.

Quality Doesn't Happen by Accident
Even if their product lines are similar, no two companies operate alike. Consequently, no QA program can be assumed valid for all companies without running the risk of forcing an engineering organization to cater to the needs of the system. In our training programs, we identify geometry, properties, mesh and boundary conditions as the key assumptions in any analysis and the most likely sources of error. If nothing else, a QA procedure for FEA must provide a crosscheck on these factors. Ideally, all users in an organization will possess all the skills required to competently perform analyses. However, as the technology proliferates further into the design process, as it should, the likelihood of that ideal skill level becomes less and less. So the management of engineering organizations needs to foster a quality environment so that analysis can be used to its full potential. Remember, quality doesn't happen by accident. Only with planning, standardization, education and diligent follow-through can a company truly feel confident that quality in FEA is assured.

Part 1 of this article, in the previous issue of ANSYS Solutions, discussed ways to overcome barriers to effective quality assurance in finite element analysis.

Vince Adams is co-author of the book Building Better Products with Finite Element Analysis and the inaugural chairman of the NAFEMS North American Steering Committee. He currently serves as Director of Analysis Services at IMPACT Engineering Solutions, Inc. (www.impactengsol.com), a consulting firm providing design and analysis services and support to industrial clients in a wide range of industries around the world. Vince can be reached at vadams@impactengsol.com.
