2014 Spring Meeting & 10th Global Congress on Process Safety
18C00
Young Professional Simulation Tutorial
Authors:
Naomi Hua and Mike Donahue, Invensys
Process simulation is a complex skill to master. Even with years of experience, a typical user constantly faces new problems that need to be addressed. Many issues can lead to unexpected and unnecessary errors when using process simulation tools, and these issues are common from one program to another, regardless of which simulation program you are using. In this presentation, targeted at the young professional, we will discuss the "Most Common Pitfalls in Process Simulation". For each topic, we will discuss techniques and methods to address these troublesome problems and prevent their occurrence.
Topics to be discussed include:
1. Erroneous data entry
2. Simulation defaults
3. Bad plant data
4. Thermodynamic method selection
5. Simulation tolerances and convergence
6. Recycle identification and handling
7. Model complexity
8. Difficult or impossible specifications
9. Assay pseudo-component development
10. Assay property blending
- Erroneous data entry is the most common mistake made when building a process simulation model. This presentation will cover methods that help ensure proper data entry in your model, such as double-checking the data you entered (particularly the units of measure); using "alternate forms" of data input, such as copy/paste and data import; and reviewing all warning and error messages.
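As a simple illustration of the units-of-measure point, the Python sketch below (with made-up field readings, not values from the presentation) shows how a quick conversion check catches a gauge-versus-absolute pressure or Fahrenheit-versus-Celsius mix-up before it is typed into a simulator.

```python
# Illustrative sketch: a quick sanity check for units-of-measure mix-ups
# before typing values into a simulator. The stream values below are
# hypothetical examples, not data from the presentation.

PSI_TO_KPA = 6.894757          # 1 psi = 6.894757 kPa
ATM_KPA = 101.325              # standard atmosphere in kPa

def psig_to_kpa_abs(p_psig: float) -> float:
    """Convert gauge pressure in psig to absolute pressure in kPa."""
    return p_psig * PSI_TO_KPA + ATM_KPA

def degf_to_degc(t_f: float) -> float:
    """Convert temperature from Fahrenheit to Celsius."""
    return (t_f - 32.0) * 5.0 / 9.0

# A common mistake: entering a gauge pressure where the simulator expects
# absolute, or a Fahrenheit value where it expects Celsius.
p_field_psig = 150.0
t_field_degf = 250.0

print(f"{p_field_psig} psig = {psig_to_kpa_abs(p_field_psig):.1f} kPa (abs)")
print(f"{t_field_degf} degF = {degf_to_degc(t_field_degf):.1f} degC")

# Entering 150 directly as kPa, or 250 directly as degC, would put the
# stream at a very different state -- the kind of error a quick
# order-of-magnitude check like this catches before it reaches the model.
```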
- Defaults are methods, values, and conditions that are automatically selected for you by the program when you begin a new simulation model. Every simulation program has them, and they affect the results your model calculates. Most defaults fall into categories such as convergence criteria, thermodynamic methods, tolerances, and units of measure. The problem with defaults is that they can produce unintended results when the user does not fully understand the effect a particular default has on the model. Take the time to understand the program's defaults and how they will affect your results. If you do not agree with a default selected by the program, most simulation programs allow you to change it.
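One lightweight habit, shown in the hedged Python sketch below (the setting names are hypothetical, not any particular simulator's API), is to keep the defaults you are relying on and the settings you have deliberately reviewed in one visible place.

```python
# Minimal sketch (hypothetical setting names, not any particular simulator's
# API): keep the settings you have actually reviewed in one place instead of
# silently relying on program defaults.

PROGRAM_DEFAULTS = {
    "thermo_method": "ideal",      # many programs start with a simple method
    "flash_tolerance": 1e-4,
    "recycle_tolerance": 1e-3,
    "uom_basis": "SI",
}

# Settings the engineer has reviewed and deliberately chosen for this model.
reviewed_settings = {
    "thermo_method": "SRK",        # chosen for a light-hydrocarbon system
    "flash_tolerance": 1e-6,
}

model_settings = {**PROGRAM_DEFAULTS, **reviewed_settings}

for key, value in model_settings.items():
    source = "reviewed" if key in reviewed_settings else "DEFAULT (verify!)"
    print(f"{key:18s} = {value!s:8s} [{source}]")
```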
- Plant data changes over time. The feed information and process conditions of the unit operations may be different. Not only does plant data change over time, but measurement information is often quite inaccurate, whether from faulty or poorly calibrated measurement equipment, leaks in lines, sensor failures, or other issues. Because of these changes and inaccuracies, you always have to validate the information entered in your model to ensure that you are obtaining the proper results. To do so, you can use programs that help with data validation and reconciliation, such as SimSci's ROMeo online optimization program. With it, you can connect your model to your data historian to continuously import current measurements and then reconcile those measurements to determine when problems arise within your plant and where they are occurring. You will also need to use your engineering judgment to determine whether the data is valid. More details will be provided during the presentation.
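For readers unfamiliar with data reconciliation, the short Python sketch below shows the generic weighted least-squares formulation (a textbook approach, not ROMeo itself): measured flows around a simple mixer are adjusted, in proportion to their assumed uncertainty, so that the mass balance closes exactly.

```python
import numpy as np

# Generic weighted least-squares data reconciliation sketch.
# Flowsheet: streams 1 and 2 mix to form stream 3, so F1 + F2 - F3 = 0.
A = np.array([[1.0, 1.0, -1.0]])          # linear balance constraints A @ x = 0
x_meas = np.array([100.0, 52.0, 148.0])   # raw plant measurements (do not balance)
sigma = np.array([2.0, 1.0, 3.0])         # assumed standard deviations of each meter

V = np.diag(sigma**2)                      # measurement covariance matrix
# Closed-form solution of: minimize (x - x_meas)' V^-1 (x - x_meas)  s.t.  A x = 0
correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ x_meas)
x_rec = x_meas - correction

print("measured:   ", x_meas)
print("reconciled: ", np.round(x_rec, 2))
print("balance residual:", (A @ x_rec)[0])   # ~0 after reconciliation
```

Measurements with larger assumed uncertainty absorb more of the adjustment, which is exactly the behavior you want when deciding which meter to trust.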
- The proper selection of thermodynamic methods for your simulation is the most important decision you make when modeling a process. The method you select can have a drastic effect on your results. Often, several thermodynamic methods are suitable for a given application, and it is up to the user to determine the best one based on knowledge, personal experience, and best judgment. Important factors to consider include the components being used and the operating conditions. Oftentimes, the selected thermodynamic method still needs additional inputs, such as binary interaction parameters, to simulate the process accurately.
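To make the binary-interaction-parameter point concrete, the Python sketch below uses the two-parameter Margules activity-coefficient model with illustrative parameter values (not from the presentation) to show how far a mixture can move away from the ideal-solution assumption once interaction parameters are supplied.

```python
import numpy as np

# Illustration of how binary interaction parameters change predicted
# behavior, using the two-parameter Margules activity-coefficient model.
# The parameter values below are illustrative only.

def margules_gammas(x1, A12, A21):
    """Activity coefficients from the two-parameter Margules model."""
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (A12 + 2.0 * (A21 - A12) * x1)
    ln_g2 = x1**2 * (A21 + 2.0 * (A12 - A21) * x2)
    return np.exp(ln_g1), np.exp(ln_g2)

x1 = 0.3
# With the parameters left at zero the mixture is treated as ideal ...
print("no BIPs   :", margules_gammas(x1, 0.0, 0.0))
# ... while fitted parameters predict significant non-ideality, which can
# shift bubble points, K-values, and separation results appreciably.
print("with BIPs :", margules_gammas(x1, 1.2, 0.8))
```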
- Many of the calculations in process simulation are iterative. When these calculations iterate, the program compares the results from one iteration to the results from the next in order to determine whether it has converged to a solution or needs to continue calculating. This comparison of results is where tolerances come into play: simulation programs use tolerances to determine when a solution has been reached. One of the major mistakes when modeling a process is not understanding or not using the correct tolerance values. A looser tolerance may make the file easier and quicker to converge, but it also reduces the accuracy of the model. A tighter tolerance may be more accurate, but it might make the model impossible to solve or increase the convergence time exponentially. A common practice is, once you have reached a converged solution, to tighten the tolerances within the model and solve it again. If the answers change significantly, continue tightening the tolerance values until the results change little from one run to the next; you will then know that you have reached an accurate solution for your model.
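The hedged Python sketch below (a generic fixed-point iteration, not any particular simulator's algorithm) illustrates the tighten-and-recheck practice: the same calculation is converged at a loose and a tight tolerance, and the two answers are compared.

```python
import math

# Generic sketch: an iterative calculation converged at two different
# tolerances, to illustrate tightening the tolerance and checking whether
# the answer moves.

def solve(tolerance, x0=1.0, max_iter=500):
    """Fixed-point iteration x = cos(x), stopped on a relative change test."""
    x = x0
    for i in range(1, max_iter + 1):
        x_new = math.cos(x)
        if abs(x_new - x) <= tolerance * max(abs(x_new), 1e-12):
            return x_new, i
        x = x_new
    raise RuntimeError("did not converge")

loose, n_loose = solve(1e-2)
tight, n_tight = solve(1e-8)
print(f"loose tolerance: x = {loose:.8f}  ({n_loose} iterations)")
print(f"tight tolerance: x = {tight:.8f}  ({n_tight} iterations)")
print(f"change on tightening: {abs(tight - loose):.2e}")
# If tightening the tolerance changes the answer significantly, keep
# tightening until the result stops moving appreciably.
```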
- Recycles in process simulation occur when you return to a previously calculated unit operation to recalculate it because of a change made elsewhere in the model. They are very common and can often be difficult to converge. Some suggestions for troubleshooting recycle convergence include initially solving your model without the recycles to see how it behaves, providing initial estimates for recycle streams, using tighter tolerances on unit operations within recycle loops than the overall recycle tolerances themselves, and ensuring that all components have a way into and out of a recycle by performing mass balances and using purge or makeup streams where necessary.
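As a minimal illustration of tearing and converging a recycle, the Python sketch below (the flowsheet, conversion, and purge fraction are made up) solves a single tear stream by direct substitution from an initial estimate.

```python
# Hedged sketch of solving a recycle by tearing the recycle stream and using
# direct substitution. Fresh feed mixes with recycle, a reactor converts 70%
# of the combined feed, the separator returns the unconverted material to
# recycle, and a small purge keeps material from accumulating in the loop.

FEED = 100.0           # fresh feed flow
CONVERSION = 0.70      # single-pass conversion in the reactor
PURGE_FRACTION = 0.05  # fraction of the separator recycle that is purged
TOLERANCE = 1e-6

def recycle_from(tear_guess):
    """One pass through the flowsheet, returning the recalculated recycle flow."""
    reactor_feed = FEED + tear_guess
    unconverted = reactor_feed * (1.0 - CONVERSION)
    return unconverted * (1.0 - PURGE_FRACTION)

recycle = 0.0  # initial estimate for the tear stream (a better guess converges faster)
for iteration in range(1, 200):
    new_recycle = recycle_from(recycle)
    if abs(new_recycle - recycle) <= TOLERANCE * max(new_recycle, 1.0):
        break
    recycle = new_recycle

print(f"converged recycle flow = {new_recycle:.3f} after {iteration} iterations")
# For a component with no exit path (no reaction, no purge, no product draw),
# the recycle flow would grow without bound -- hence the mass-balance check
# and purge/makeup streams recommended above.
```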
- The general rule of thumb when modeling is to start simple and add complexity as you go. Each time you add a few streams or unit operations, converge and save the model before moving on. Make sure you can justify any additions you make to the model. You can most likely model your system without including every unit from your PFD in the simulation. When modeling, think about the goal of your simulation rather than literally translating every real-world piece of equipment into the model. Make the judgment call on what is most important and on the purpose of every unit you add to your simulation. For example, if your real-life process shows a heat exchanger, a pump, and a splitter, but all that is necessary is that the temperature, pressure, and separation are achieved properly, consider using a flash drum to accomplish all three.
- Specifications are everywhere in process simulation: every stream and unit operation you place in your model contains them. In its simplest form, a specification is a constraint placed on the model that the model must meet in order to converge successfully. For streams, these are usually temperatures, pressures, flowrates, and compositions. For unit operations, they range from process conditions, like temperatures and pressures, to product conditions, like an overhead composition in a column. There are some pitfalls, however, that you can run into with specifications, such as azeotropes in columns, high-purity specifications, and the possibility of multiple solutions.
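The Python sketch below (using a made-up purity-versus-reflux relation with an asymptote standing in for an azeotropic limit) illustrates one way to guard against an impossible specification: check that the target lies within the reachable range of the adjusted variable before handing it to a solver.

```python
from scipy.optimize import brentq

# Hedged sketch: before asking a solver to meet a specification, check
# whether the spec is reachable over the allowed range of the adjusted
# variable. The purity-vs-reflux relation below is a made-up stand-in with
# an asymptote at 0.95, loosely mimicking an azeotropic limit.

def overhead_purity(reflux_ratio):
    """Hypothetical column response: purity approaches 0.95 but never exceeds it."""
    return 0.95 * reflux_ratio / (reflux_ratio + 1.0)

def solve_purity_spec(target, reflux_min=0.5, reflux_max=50.0):
    lo, hi = overhead_purity(reflux_min), overhead_purity(reflux_max)
    if not (min(lo, hi) <= target <= max(lo, hi)):
        raise ValueError(
            f"purity spec {target} is outside the reachable range "
            f"[{lo:.4f}, {hi:.4f}] -- the specification is infeasible here")
    return brentq(lambda r: overhead_purity(r) - target, reflux_min, reflux_max)

print("reflux for 90% purity:", round(solve_purity_spec(0.90), 3))
try:
    solve_purity_spec(0.99)   # beyond the asymptote: an impossible specification
except ValueError as err:
    print("error:", err)
```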
- Petroleum assays are sets of distillation data that are used to characterize all of the components that reside in a given crude sample. When you enter distillation data into a process simulator, the program converts it into components that it can use when modeling your process. The components that are generated are called pseudo-components, and each pseudo-component represents all of the unknown components that fall within a given boiling point range in the crude. When developing your assay components, you will want to verify that you used the correct distillation curve basis, that you entered the data correctly, that you used enough cuts, and that you are using the correct characterization methods.
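The Python sketch below (with a made-up TBP curve) illustrates the basic cutting step: the distillation curve is divided into equal-volume cuts and a mid-volume boiling point is read off for each pseudo-component.

```python
import numpy as np

# Illustrative sketch of how a distillation (TBP) curve is cut into
# pseudo-components. The curve below is made up; a real assay has its own
# basis (TBP, D86, D2887, ...) that must match what the simulator expects
# before any cutting is done.

vol_pct = np.array([0, 10, 30, 50, 70, 90, 100], dtype=float)        # volume % distilled
tbp_degc = np.array([40, 95, 175, 260, 350, 470, 560], dtype=float)  # TBP, deg C

n_cuts = 8                                     # too few cuts smears the curve
edges = np.linspace(0.0, 100.0, n_cuts + 1)    # equal-volume cut boundaries
edge_temps = np.interp(edges, vol_pct, tbp_degc)

for i in range(n_cuts):
    mid_vol = 0.5 * (edges[i] + edges[i + 1])
    nbp = np.interp(mid_vol, vol_pct, tbp_degc)   # mid-volume boiling point of the cut
    print(f"cut {i + 1}: {edges[i]:5.1f}-{edges[i + 1]:5.1f} vol%, "
          f"NBP ~ {nbp:6.1f} degC (range {edge_temps[i]:.0f}-{edge_temps[i + 1]:.0f})")
```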
- When you have more than one assay in a simulation model, you need to be concerned with assay blending and how it will affect the results of your file. Assay blending is the mixing of two or more groups of components (i.e., sets of cuts/pseudo-components) in order to calculate averaged component properties for cuts that lie in the same boiling point range. Once these averaged component properties are determined, the program uses them from that point on in the simulation model whenever it needs the properties of a given pseudo-component. All component properties are blended, including normal boiling points, specific gravity, molecular weight, and refinery inspection properties, and the blended component properties are usually calculated using a weighted average.
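As a minimal sketch of that blending step, the Python example below (with made-up cut data, and a simple flow-weighted average for every property) blends one cut from each of two assays that fall in the same boiling-point range.

```python
# Hedged sketch of blending pseudo-component properties from two assays whose
# cuts fall in the same boiling-point range. The numbers are made up, and a
# simple weighted average is used for every property; real simulators may use
# different mixing bases (molar, mass, volume) per property.

# Each cut: flow, normal boiling point (degC), specific gravity, molecular weight.
assay_a_cut = {"flow": 60.0, "nbp": 210.0, "sg": 0.845, "mw": 165.0}
assay_b_cut = {"flow": 40.0, "nbp": 218.0, "sg": 0.860, "mw": 172.0}

total_flow = assay_a_cut["flow"] + assay_b_cut["flow"]

blended = {
    prop: (assay_a_cut["flow"] * assay_a_cut[prop]
           + assay_b_cut["flow"] * assay_b_cut[prop]) / total_flow
    for prop in ("nbp", "sg", "mw")
}

print("blended cut properties:", {k: round(v, 3) for k, v in blended.items()})
# From this point on, the simulator would use these blended values whenever it
# needs properties for this pseudo-component.
```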
These topics are based on experience and feedback from end users to technical support representatives for process simulators. They often represent the most commonly asked questions or the areas that most users find problematic. While the specific examples highlighted in each topic come from SimSci software, the intent of this tutorial presentation is to be broad-based; it is particularly applicable to young professionals seeking to develop their experience with simulation.