Problem Statement: How do you simulate feedforward control of a stream feedrate? | Solution: Attached is an example of how to set a stream flowrate as a function of another stream flowrate.
See file setfeed.bkp.
A Calculator block is used to set the feedrate of stream HX2 equal to 75% of the flowrate of stream HX1.
For more information see the Aspen Plus Help topic Simulation and Analysis Tools -> Sequential Modular Flowsheeting Tools -> Design Specifications: Feedback Control.
Keywords: None
References: None |
Problem Statement: Is there a list of the units available in Aspen Plus? | Solution: Yes, there is a list. The file RCUNITS.DAT under C:\Program Files (x86)\AspenTech\Aspen Plus xxxx\Engine\Xeq defines all units for the simulation engine. The same information is also stored in the graphical user interface configuration files. This file is not very readable and should never be modified by the user.
The table below lists all the units available in Aspen Plus V7.0. The same information is supplied in the attached spreadsheet. The SI unit is the base unit of measurement for the quantity. The factor and offset can be used to convert from the unit to the SI unit with the formula: VALUE in SI units = factor * value + offset. For example, to convert a temperature of 32 F to SI units (K), the table gives a factor of 0.5556 and an offset of 255.372, so 0.5556 * 32 + 255.372 = 273.15 K.
A special flag of -99999 indicates the unit is a gauge (or relative) pressure unit. In this case, the value of the atmospheric pressure specified by the user is added to the pressure.
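For illustration, here is a minimal Python sketch of this conversion rule (the function name is ours, and the assumption that the -99999 flag is carried in the offset field is ours as well, not taken from the Aspen documentation):

# Hypothetical helper illustrating the documented factor/offset rule.
def to_si(value, factor, offset, atm_pressure_si=101325.0):
    if offset == -99999:  # special flag: gauge (relative) pressure unit
        # add the user-specified atmospheric pressure (here in Pa)
        return factor * value + atm_pressure_si
    return factor * value + offset

print(to_si(32, 5.0 / 9.0, 255.372))  # 32 F -> 273.15 K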
Units are sometimes written as complex quantities such as million standard cubic feet per hour. The variety of ways units are used in Aspen Plus requires that these units be abbreviated.
Some of the symbols used in units abbreviations are explained below.
sq: Square. Used primarily with length units to represent area, such as sqft (square feet) and sqm (square meters).
cu: Cubic. Used with length units to represent volume, such as cuft (cubic feet) and cum (cubic meters). The standard abbreviation cc is used for cubic centimeters.
**.5: Square root. Used in a few types of units such as dipole moment.
M: Used as a prefix meaning thousand with English units, such as Mlb (thousand pounds) and Mscf (thousand standard cubic feet). With metric/SI units, m and M prefixes have their standard metric/SI meanings.
MM: Used as a prefix meaning million.
scf: Standard cubic feet. Standard conditions for scf are ideal gas at 14.696 psia and 60°F.
scfm: Standard cubic feet per minute.
scfh: Standard cubic feet per hour.
scfd: Standard cubic feet per day.
scm, ncm: Standard cubic meters. Standard conditions for scm are ideal gas at 1 atm and 0°C. Normal cubic meters (ncm) are the same as scm.
scmh, ncmh: Standard (normal) cubic meters per hour.
ncmd: Normal cubic meters per day.
atmg, barg, psig, Pag, kg/sqcmg, in-water-g: The g at the end of these and similar pressure units indicates gauge pressure.
psia: The a at the end of the pressure unit indicates absolute pressure. psi is equivalent to psia.
mmhg-vac, in-hg-vac, in-water-vac: The vac at the end of these pressure units indicates vacuum pressure, the amount by which the pressure is below atmospheric pressure. Vacuum pressure is exactly the negative of gauge pressure.
in-water, in-water-60F: Pressure in inches of water. For in-water, the density of water is assumed to be 1 g/cc. For in-water-60F, the density of water at 60°F is used.
Pcu: Pound centigrade unit (an energy unit).
tcal: Thermodynamic calorie, equal to 4.184 joules. The standard calorie represented by cal is the International Steam Table calorie, equal to 4.1868 joules.
tonne: Metric ton (1000 kg).
Keywords: None
References: None |
Problem Statement: Why could different Advanced Process Control Collect methods have different timestamps?
The collect processes differ among the APC Collect methods: DMCplus Collect, the Inferential Qualities Data Collection (DC) module, Aspen Watch Miscellaneous Tags, and Aspen Process Controller (APC) Builder Data Collection.
If parallel collection using more than one method is executed, different timestamps could be seen. NOTE: A single collection method should be enough to record the data. | Solution: The timestamp difference between the different Collect methods should be less than 1 minute; if it is greater, check the following:
Determine where the Cim-IO Interface (e.g. Cim-IO for OPC) runs and check the timestamp (including Daylight Saving settings, if applicable) of that machine. If the data source is remote (e.g. the OPC server running on another machine), check the timestamp (including Daylight Saving settings, if applicable) there too.
APC Builder can use three I/O source types: Cim-IO, OPC and Process Data. If OPC or Process Data is the connection type, also check the timestamp (including Daylight Saving settings, if applicable) on the data provider.
On the machine where the collection is being processed, check the timestamp (including Daylight Saving settings, if applicable).
If the timestamps of the servers need to be adjusted, the AspenTech services should be restarted (e.g. Aspen Tech Production Control RTE, Aspen APC Inferential Qualities Data Provider, etc.).
DMCplus Collect and the IQ Data Collection module interact with the ACO Utility Server; if cache reads are configured, keep that setting in mind.
If using DMCplus Collect, check whether the OFFSET setting is different from zero.
If using the IQ Data Collection module, check the HISTOFST setting.
If using Watch Miscellaneous tags, check the ScheduleActDef SCHEDULED_TIME timestamp related to the IOGet records.
Keywords: APC, Collect, Timestamp
References: None |
Problem Statement: How do you import legacy databank files into the Aspen Properties Enterprise Database (APED)? | Solution: You can create your database in two ways: Either import legacy databank files or clone an existing database. In either case, begin by opening the Database Manager from Start | Programs | AspenTech | Process Modeling <version> | Aspen Plus | Aspen Properties Database Manager. Right-click the Aspen Physical Properties Databases folder and select Create a new database. The Create a New Properties Database Wizard appears.
This option in the Database Manager lets you create a database using the following types of files.
DFMS input files used to create the traditional DFMS-based databanks and customize the simulation engine
TBS input files used to customize the GUI
Optional DIPPR access database which contains the literature citation, notes and uncertainties information
Optional molecular structure information (compressed mole files) that have been stored in an access database
Create the DFMS and/or TBS files according to the procedures under Customizing Traditional Properties Databanks, below. Then follow these steps to create the database.
1. On the first screen of the wizard select Import legacy data files and click Next.
2. When prompted, enter the Login Name and Password (this is normally apeduser2 and Aproperty88#) and the name of the new database. Keep the name short, using a maximum of 8 characters. Click Next.
3. To import the legacy data files, click the browse button and locate the folder(s) where the files are kept.
4. Use the up and down buttons to order the selected files.
Important: The selected files must be in the same order in which the DFMS input files were used to create the old databanks using DFMS. If you have TBS files, they must be listed after the DFMS input files. If you have a large number of these legacy files, it is recommended that you list the full path names of these files in the correct order in a text file. Save this file with extension .LST (use Files of type All Files and enter the filename with extension .LST when saving from Notepad). Then select this list file using the browse button instead of selecting the individual input files.
Once the database is created, it will be used automatically in Aspen Plus. There is no need to further customize the Aspen Plus Engine or GUI.
See the Aspen Plus System Management Guide, Chapter 4 for more information.
Keywords: None
References: None |
Problem Statement: aspenONE Process Explorer is a powerful HTML5-based tool in the aspenONE MES suite. Users can make the most of its functionality by referring to its online Help. | Solution: If you already have aspenONE Process Explorer installed, just log onto its home page and find the aspenONE Help on the toolbar.
The Help file introduces the usage of everything A1PE can do. The Bing Translator under the search bar can help translate the current page to a specific language.
If you do not find the topic of interest in the Help file, please feel free to contact Aspen Support.
Keywords: AspenOne Process Explorer
A1PE help
References: None |
Problem Statement: In my project I have one MODULE area; why am I seeing foundation costs for an electrical substation? | Solution: Areas and their components require electrical infrastructure for electrical items (motors, lights, etc.) and process control systems. If you only have one area in your project, these project costs will be reported against that one area. If you add an area at the bottom of the project, then the project costs (like electrical foundations) will be reported against that last area in your area list.
Keywords: Module, area, civil, electrical foundation, foundation
References: None |
Problem Statement: My simulation is failing to converge. Is there a simple way to isolate blocks from each other to analyse the problem? | Solution: One option is to disconnect the streams. The drawbacks with this approach are:
you need to change the specifications of the port variables (make them fixed)
you may mess up arrays declared using multiports
you may not remember where the stream was connected when you want to restore the simulation
The suggestion is to use stream types instead of the generic connection stream type, and to add a parameter which controls whether the stream behaves as a normal stream or as a torn stream.
Let's say our port type is called Fport with the following definition:
Port Fport
  F as flow_mass;
End
And our stream would then be something like this:
Stream Fstream
  F as flow_mass;
  in_f as input Fport;
  out_p as output Fport;
  in_f.F = out_p.F;
  in_f.F = F;
  if not in_f.IsConnected then
    F : fixed;
  endif
End
This stream is already taking care of the connectivity: the IsConnected property will make the flow fixed if this is a feed stream.
The suggestion is to add a new parameter (declared with the type YesNo).
Stream Fstream
  CutStream as YesNo (Description:"Tear stream", "No");
  F as flow_mass;
  in_f as input Fport;
  out_p as output Fport;
  in_f.F = F;
  if not in_f.IsConnected then
    F : fixed;
  endif
  if CutStream == "Yes" then
    out_p.F : fixed;
  else
    out_p.F = in_f.F;
  endif
End
As a user, you can now simply set the parameter CutStream to Yes for all input and output streams of a block to isolate that block, without having to disconnect streams or change specifications manually. Of course, you need to be careful to set the parameter back to No once you have resolved the problem, to ensure you run a valid simulation. You can use Find Variables (with the Parameter option checked) to verify the parameter is set to No for every stream in the simulation.
Keywords: None
References: None |
Problem Statement: When searching for tags to add to a trend plot in aspenONE Process Explorer, the application frame may pop up with a dialogue box reading:
Unable to validate server response.
Search Service responded with the following error:
HTTP Code: 407
Description: Proxy Authentication Required | Solution: This indicates that the server uses a proxy when attempting to connect to a website and that the credentials being used to connect to the website are being rejected. This can happen for companies that use a proxy server to allow or reject internet access. This issue can be fixed by having aspenONE Process Explorer use localhost instead of the web server name in the AtWebPlotsConfig.xml file.
To do this, navigate to the AtWebPlotsConfig.xml file and open it using Notepad. By default, it is found in C:\inetpub\wwwroot\AspenTech\ProcessExplorer\WebControls. Scroll to the section at the bottom of the XML file used to store Solr information and edit the Server element to use localhost instead of the listed server name. When finished, that section of the XML should appear as follows:
<Solr>
  <!-- the SOLR engine server with port -->
  <Server>http://localhost:8080</Server>
</Solr>
After performing an IIS reset and restarting the Apache Tomcat service, the search functionality in aspenONE Process Explorer should work without throwing the 407 error message.
Keywords: None
References: None |
Problem Statement: Modeling Feedwater Heaters with Aspen Shell & Tube Exchanger | Solution: The following tutorial can be used to design each zone of a feedwater heater using the Aspen Shell & Tube thermal program. Once the designs are completed, the condenser design can be transferred to the Aspen Shell & Tube mechanical program, adjusting the tube length as required to account for the desuperheating and drain cooling zones.
After completing this Jump start guide, you will learn:
1. How to design, rate and simulate the condensing zone, desuperheating zone and drain cooling zone in the Aspen Shell and Tube Exchanger program
2. How to input the key parameters related to rating
3. The design path and special considerations for complex exchanger simulation
Please use the example files provided to support this jumpstart guide.
Keywords: EDR, Feed Water Heater, Design, Simulation
References: None |
Problem Statement: Attached is file D86.bkp. This example has a stream containing hydrocarbons which passes through a mixer with a single exit stream. This flowsheet is repeated for mixer units with different property methods. Find the D86 curve under Results Summary | Streams | Vol.% Curves. The D86 curve is identical for all property methods. Why does changing the property method not change the D86 curve? | Solution: The ASTM D86 curve is intended to be used in conjunction with Assays and Pseudocomponents. In the event that there are no Pseudocomponents, what is Aspen Plus in fact calculating?
When components are not defined as petroleum assays, the ASTM D86 curve is calculated using a method that depends only on the composition of the stream; it is not affected by the chosen property method or the stream conditions. In this case, D86 values are calculated as the weighted average of the individual component boiling points, and the reported value is the mid (centroid) boiling point of the cut.
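As a rough illustration only - a Python sketch of the weighted-average idea, not Aspen Plus's exact algorithm (the function and the component data below are invented for illustration):

def d86_point(components, cut_lo, cut_hi):
    # components: list of (volume_fraction, nbp_degC); cut bounds are 0-1
    comps = sorted(components, key=lambda c: c[1])   # sort by boiling point
    cum, num, den = 0.0, 0.0, 0.0
    for frac, nbp in comps:
        lo, hi = cum, cum + frac
        overlap = max(0.0, min(hi, cut_hi) - max(lo, cut_lo))  # volume inside the cut
        num += overlap * nbp
        den += overlap
        cum = hi
    return num / den if den else float("nan")        # centroid boiling point of the cut

mix = [(0.40, 36.1), (0.35, 68.7), (0.25, 98.4)]     # nC5/nC6/nC7 vol fractions and NBPs
print(d86_point(mix, 0.45, 0.55))                    # approximate 50% point of the curve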
Keywords: None
References: None |
Problem Statement: Over time, the rescheduling times in Aspen Cim-IO servers for asynchronous transfer records with the same frequency may slip to the point where all the transfer records are processed on the Aspen Cim-IO server at the same time. Also, performing a clean restart causes all the asynchronous transfer records to be scheduled simultaneously.
How can I stagger the scheduling of asynchronous get records with the same frequency evenly throughout the scanning period? | Solution: This query staggers the scheduling of asynchronous transfer records defined by IOLongTagGetDef with the same frequency evenly across the whole scanning period.
You can run this query after performing a clean restart to stagger scanning.
In order to combat rescheduling slippage in the Aspen Cim-IO server, you can save this query as a QueryDef record and schedule it to run on a regular (e.g. daily) basis.
This query handles transfer records defined by IOLongTagGetDef. You can adjust the query to stagger transfer records defined by IOGetDef, IOLLTagGetDef, or IOGetHisDef by substituting IOGetDef, IOLLTagGetDef, or IOGetHisDef for IOLongTagGetDef.
Note: This query turns IO_RECORD_PROCESSING OFF and ON for the transfer records having a positive frequency. This causes the scan list in the Aspen Cim-IO server to be rebuilt for the record which forces the asynchronous process on the Aspen Cim-IO server to retranslate tags. As a result, you may see the scanning status for the transfer records temporarily switch to WAIT FOR ASYNC.
--*******************************************************************************
-- DESCRIPTION
-- --------------------
-- This query is used to schedule the scanning of CIM-IO records per device
-- and io_frequency period so that scans are spread equally over the whole
-- scan period.
-- In this form, this should be run on a scheduled basis for all groups.
--
--*******************************************************************************
LOCAL iocnt integer;
for select io_main_task iomt, io_frequency iof, count(name) cnt
from iolongtaggetdef
where io_frequency > 0
group by io_main_task, io_frequency
order by iomt, iof
do
write 'Main Task = '||iomt||'. Frequency = '||iof||'. Count = '||cnt;
iocnt = cast((iof/cnt) as integer);
write 'Gap between scans in 1/10s seconds is '||iocnt;
for select name nnn from iolongtaggetdef
where io_Main_task = iomt
and io_frequency = iof
do
write nnn;
nnn->io_record_processing = 'Off';
wait (iof / cnt);
nnn->io_record_processing = 'On';
end
end
Keywords: None
References: None |
Problem Statement: What are the differences between Copy Values, Rewind and Advanced Copy in the Snapshot management tool? | Solution: Copy Values copies the fixed, free and initial variables from the snapshot to the corresponding fixed, free and initial variables in the simulation.
Rewind copies fixed, free and initial variables to the simulation, and it also rewinds the procedure workspaces and delay information, as well as setting the time.
Advanced Copy is a variant of the Copy Values button which gives you more flexibility as to what will be copied from and to, as well as offering some advanced string matching. Note that the default settings of Advanced Copy do not match the Copy Values action. To match it, you need to select the checkbox on the To side (on the right) for Fixed variables.
Note that Restart is a shortcut to Rewind which rewinds to the earliest converged snapshot. The name of the snapshot used is shown in the Simulation Messages window.
Keywords: restart
rewind
snapshot management
copy values
advanced copy
References: None |
Problem Statement: How do you change the sampling time of existing tags in Aspen InfoPlus.21? | Solution: This article describes how to move the tags whose sampling time you want to adjust to a new transfer record.
1) Turn off the existing transfer record (this is optional, but we recommend doing so to avoid potential issues) and select Duplicate this record.
2) Enter a valid name for the new transfer record.
3) Now, this new transfer record has the exact same configuration as the original.
4) Set IO_RECORD_PROCESSING to OFF and change IO_FREQUENCY to x sec, as per your desired sampling time.
5) Go to the repeat area IO_#TAGS and delete the unwanted tags.
6) You can select multiple tags at once with the Shift or Ctrl key, then right-click and delete the occurrences, leaving only the tags you need.
7) Set IO_RECORD_PROCESSING to ON.
8) Right-click on the new transfer record and select Activate. Click the Activate button.
9) Check that data is coming in.
10) Go back to the original transfer record, turn IO_DATA_PROCESSING to OFF for the tags you have moved in the IO_#TAGS repeat area, and turn IO_RECORD_PROCESSING on for the transfer record. Check that the data is coming in.
Keywords: Database
Tags
Sampling time
IO_Processing_Frequency
References: None |
Problem Statement: What is the in-line pass partition area in an exchanger and why is it important? | Solution: Pass partition lanes are gaps between tubes where there is a pass partition. The pass partition is the plate which divides the inlet channel into, for instance, two passes. This plate has to seal reasonably well with the tubesheet, so there is a gap between the rows of tubes where it touches the tubesheet.
This gap means that there is a larger distance between the tubes than is normally the case. If the flow in the heat exchanger is predominantly at right angles to these pass partition lanes (side to side across the lanes), then there is no problem.
However, if the flow is predominantly parallel to these pass partition lanes, the lanes are said to be in line with the flow.
This means that there is a large gap down which flow can leak, so that it does not pass effectively between the tubes and is therefore less effective for heat transfer.
Keywords: In-line pass partition, pass partition, in-line, Flow, Baffles, leakage
References: None |
Problem Statement: What is the best way to model the CO2 capture process by amines and physical solvents in Aspen Plus? | Solution: In version 2006.5, new Amines property packages for MEA and MDEA with H2S and CO2 were developed and delivered as application examples. With each release, these models are improved, updated, and extended to other amines (AMP, DEA, DGA, DIPA, K2CO3, NaOH, NH3, PZ, TEA, Sulfolane), amine mixtures (DEA-MDEA, MEA-MDEA, PZ-MEA, PZ-MDEA, Sulfolane-DIPA, Sulfolane-MDEA) and physical solvents (DEPG, NMP, PC, MEOH). They are located in the Aspen Plus Examples directory created when the product is installed.
For V9 and earlier, the files are located in the following directories:
C:\Program Files (x86)\AspenTech\Aspen Plus V9.0\GUI\Examples\Amines_ENRTL-RK
and
C:\Program Files (x86)\AspenTech\Aspen Plus V9.0\GUI\Examples\Amines_ELECNRTL
For V10, the files are located in the following directory:
C:\Program Files (x86)\AspenTech\Aspen Plus V10.0\GUI\Examples\Carbon Capture
There's a pdf for each bkp/apwz file with specific information.
These examples include the relevant components, electrolyte reaction and chemistry, property methods, and data. Both equilibrium and kinetics reactions are considered. Properties were compared to literature data and parameters were re-regressed where needed. These property packages are now our recommended standard for modeling these systems rather than our older data packages or electrolyte inserts.
The applicability of the property packages is demonstrated by modeling the CO2 capture process using our rate-based distillation model, RateSep, within RadFrac. These CO2 capture columns are generally rate-limited rather than at equilibrium; hence, rate-based RateSep rather than equilibrium-stage RadFrac was used for accurate modeling. A valid RateSep license is needed to run RateSep. Process results are compared to literature data, and details of these models are fully documented. Even if a RateSep license is not available, the user can still leverage the data in other equilibrium-based calculations.
Generally, the simulations include Aspen Plus rate-based model of the CO2 capture process from a gas mixture. The models consist of an absorber and a stripper. The operation data from a pilot plant at the University of Texas at Austin are used if available to specify feed conditions and unit operation block specifications in the model. Thermophysical property models and reaction kinetic models are based on the recent works of U.T. Austin. Transport property models and model parameters have been validated against experimental data from open literature. Detailed references are in the .pdf files available in the directories.
Keywords: None
References: None |
Problem Statement: Can I display Design-Spec, Transfer, and Calculator blocks connected to unit operations on PFD? | Solution: Starting in Aspen Plus V7, Design Spec, Calculator, and Transfer blocks can be placed on the flowsheet.
Dashed connection lines indicating the unit operation models affected by these blocks can also be added.
These options are accessed using the Display Options drop down on the Modify ribbon.
or by going to File -> Options -> Flowsheet.
Keywords: Design-Spec, Transfer, Calculator blocks, connections, PFD, flowsheet
References: None |
Problem Statement: When modifying the Carbon Index in the Feed Fuel block, if the specification is higher than some specific value (for example 1), the user may see an infeasibility error in the message panel similar to the one shown below:
Variable EQ_0025_S5.Inlet.CO2 (value 1.018868e+000) is outside upper bound (1.000000e+000); group 261 is infeasible.
EQ_0025_S5.Inlet.CO2 is equivalenced to the following variables:
BOILER.AirOut(Fluegases).CO2
BOILER.CO2fluegas
S5.Inlet.CO2
S5.Outlet.CO2
Why is Aspen Utilities Planner showing an infeasibility error when only the Carbon Index specified in the Fuel Feed block is changed? | Solution: Feed Fuel block specifications are used when doing emission calculations where combustion is included, for example in the Boiler block.
When performing these calculations, the Carbon Index (CI) is one of the variables considered. However, the user should also consider other fuel specifications that affect the calculations, e.g. the Oxygen Demand.
The error shown above can appear when these specifications are not consistent, for example if the Carbon Index is high while the Oxygen Demand (OD) is specified as a small value.
If OD is specified as zero, it results in very little combustion air being required in the boiler. Typically, OD is around 3.5 kg O2/kg fuel for gaseous fuel. Giving a reasonable OD value for the fuel will keep the CO2 concentration in the boiler flue gas below 1, and it will be consistent with the Carbon Index (CI), Sulphur Index (SI) and Nitrogen Index (NI) specifications.
Let’s take as an example CH4 as the fuel. Based on stoichiometry:
CH4 + 2O2 -> CO2 + 2 H2O
When specifying Feed Fuel block:
CI = TotalCO2 (tonne/hr) / Fuel_Mass (tonne/hr) = 44/16 = 2.75
OD = TotalO2 (tonne/hr) / Fuel_Mass (tonne/hr) = 32/16 = 2
When the user includes a Carbon Index (CI) of 2.75 but the Oxygen Demand is not specified (entered as 0 or a very small value), the specification is not consistent and the error message is displayed. When OD is changed to 2, the simulation runs without problems.
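As a quick sanity check, both indices can be reproduced from the stoichiometry above with a few lines of Python (molar masses assumed: CH4 = 16, O2 = 32, CO2 = 44 g/mol):

MW_CH4, MW_O2, MW_CO2 = 16.0, 32.0, 44.0

fuel_mass = 1.0                             # tonne/hr of CH4
co2_mass = fuel_mass / MW_CH4 * MW_CO2      # tonne/hr of CO2 produced
o2_mass = fuel_mass / MW_CH4 * 2.0 * MW_O2  # tonne/hr of O2 consumed

print("CI =", co2_mass / fuel_mass)         # 2.75
print("OD =", o2_mass / fuel_mass)          # 2.0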
Keywords: CI, OD, Infeasibility error, Carbon Index modification
References: None |
Problem Statement: When the date and time on a Distributed Control System (DCS) or Programmable Logic Controller (PLC) scanned by Aspen CIM-IO gets set to the future, or if the Aspen InfoPlus.21 server time is accidentally set ahead, the Aspen InfoPlus.21 archiver will write data with future time stamps to history. After correcting the source of the future time stamps, the Aspen InfoPlus.21 database will not write time-stamped data to file sets older than the future date already written to history. This knowledge base article outlines ways to recover from such a scenario. | Solution: There are three different ways to recover from this situation.
Method 1:
Ensure that the DCS/PLC/Windows system time is correct then wait until the Aspen InfoPlus.21 database time naturally catches up to the time stamp of the future data written to history. Data collection will then resume as normal. This method is preferable when the time written to history is not very distant in the future.
Important: The repository's Future Time parameter controls the maximum future time which can be written to each repository. Setting the Future Time to a relatively small value (e.g., 5 or 10 minutes) will ensure that times distant in the future are not written to history.
Method 2:
Delete the file set data which contains the future time stamps. The following procedure outlines this method.
1. Shutdown the Aspen InfoPlus.21 database.
2. Delete the archive files that contain the future dates. Most likely this will only consist of the active file set, unless the file-sets have shifted since the problem occurred (this is more likely to occur if the file set is set to shift based on date). The archive files to be deleted are arc.dat, arc.byte, and arc.key.
3. Delete the file cache.dat for the affected repositories.
4. Restore the last successful snapshot which was taken before the problem occurred.
5. Restore backups of the archive files which were taken before the future time stamps were written to history (if backups exist).
6. Restart Aspen InfoPlus.21.
Important: This method will cause data loss back to the most recent data in the backup files.
Method 3:
Try to cause the history repository to ignore data written to file sets with future time stamps.
Before attempting this procedure, make sure you have access to a backup copy of your Aspen InfoPlus.21 database saved before the future time stamps were written to history. If you do, make a copy of the backup before proceeding. If you do not have a copy of a database snapshot saved before the future time stamps were written to history, you cannot use this method.
Begin by correcting the source of the future time stamps to prevent additional data with future time stamps from being written to history.
Determine if the future time stamps forced a file set shift by using the Aspen InfoPlus.21 Administrator to examine the beginning time stamp of the most recent file set. If the time stamp is in the future, then the future stamps forced a file set shift.
Method 3a:
If the future date did not force a file set shift, then use the Aspen InfoPlus.21 Administrator to force two file set shifts. The net result is that the two most recent file sets will have future starting and ending time stamps, and the third most recent file set (the one containing the newest valid historical data) will have a future ending time. Neither of the two newly created file sets will contain any historical data.
Use the utility SetArchiveDates.exe to adjust the ending time of the third most recent file set (the one with valid historical data) and the beginning time stamp of the second most recent file set. SetArchiveDates.exe is normally located in the folder ip21disk:\Program Files\AspenTech\InfoPlus.21\c21\h21\bin. Change the ending time of the third most recent file set to current time or sooner, and modify the starting time of the second most recent file set to the ending time of the third most recent file set plus one millisecond. Changing the ending time stamp of the third most recent file set to current time or sooner causes Aspen InfoPlus.21 to ignore historical data with future time stamps stored in the file set.
Stop Aspen InfoPlus.21 and delete the cache.dat file associated with the repository. Next, delete the contents (arc.byte, arc.key, and arc.dat) of the folder containing the most recent file set. Then, replace the database snapshot with the backup copy made before the future time stamps were written to history. Finally, restart Aspen InfoPlus.21.
Method 3b.
If the future date which was written to history caused the active file set to shift, stop Aspen InfoPlus.21 and delete the cache.dat file associated with the repository. Next, delete the contents (arc.byte, arc.key, and arc.dat) of the folder containing the most recent file set. Then, replace the database snapshot with the backup copy made before the future time stamps were written to history. Finally, restart Aspen InfoPlus.21.
Check if historical information is now being written to the repository. If so, then the problem is solved and there is nothing more to do. If not, this means that Aspen InfoPlus.21 wrote data with future time stamps to the current file set before performing the file set shift. Continue the problem resolution by using the method described in Method 3a.
Keywords: History repository
Set Archive Dates wizard
SetArchiveDates
Future
Time stamps
Corrupt Filesets
Snapshot
References: None |
Problem Statement: This Knowledge Base article provides steps to resolve the following error:
Request Failed: 500
which may be encountered when testing the Aspen Data Source Architecture (ADSA) Web Service Protocol. | Solution: To resolve the error message Request Failed: 500, there can be multiple solutions depending on the root cause of the problem.
1. DCOM-related error
Solution: Click Start | Run, type Dcomcnfg, and do the following:
- Click on the COM Security tab
- In both Access Permissions and Launch Permissions, under Edit Limits, add NETWORK SERVICE and check all boxes for Local and Remote access. This needs to be done because in IIS | DefaultAppPool, the Identity tab uses Network Service and not the standard Local System.
- You may also need to grant the user NETWORK SERVICE full access to the c:\windows\temp folder.
Alternatively, you can do the following:
- Go to the Control Panel, select Administrative Tools and then Internet Information Services
- Expand Application Pools
- Right-click DefaultAppPool and select Properties
- Select the Identity tab
- Change from Network Service to Local System
2. Incorrect version of ASP.NET - Note that this error message can also be generated in situations where the Default Web Site is using an incorrect version of ASP.NET (for example, ASP.NET 1.x is being used instead of the required ASP.NET 2.0).
Solution: For an aspenONE V7.3 install on Microsoft Windows Server 2008 R2 64-bit, the error may be resolved as follows:
1) Open IIS Manager
2) In the left pane, select the server name
3) Double-click on Feature Delegation in the center pane
4) Select Handler Mappings
5) Right-click on handler mappings and select Read/Write
6) Issue an IISReset
3. Installation error - Applicable only if the problem is on an Aspen InfoPlus.21 client system.
Solution: If the problem is not resolved using the solutions in sections 1 and 2, uninstall and reinstall the following AspenTech software components on the client machines:
- Process Explorer
- Process Data
- Aspen Data Source Administrator
- Aspen Local Security
Keywords: Error 500
Request Failed:500
ADSA error
References: None |
Problem Statement: How can I change the default location setting in Activated Economics? | Solution: You change the location setting in Activated Economics by switching to another template that uses a different location. Templates can be changed in Activated Economics by going to the Economics tab on the ribbon and choosing “Cost Options”. The “Costing Options” sheet will open, and the first line shows the current template being used. In the screenshot below it is “US_IP” for “US Inch Pound”.
If you click on Browse it will bring up a screen showing a “Templates” folder. Click on the “Templates” folder and it will show you your currently available templates to be used in your simulation:
If you click on one of the folders it will take you to the template. In the screen below we have clicked on the “JP_IP” folder to use the location setting for Japan. Click on “Open” and it will then show as your new default template for your current simulation.
Note: The new template must be chosen before initially running Activated Economics for it to work properly.
Keywords: Default location, location, Activated Economics, Activated, Economics
References: None |
Problem Statement: How is the pressure drop across a swage fitting in a pipe segment calculated in Aspen HYSYS? | Solution: In Aspen HYSYS, the pressure drop computed for swage fittings in pipe segments is calculated with the equations shown in the Pipe Segment | Rating Tab | Swage Fittings section of the Help Menu.
First, with the mass flowrate flowing through the pipe (mass in = mass out) and the mass densities at the inlet and outlet of the pipe segment, Aspen HYSYS computes volumetric flowrates. Then, it uses the pipe diameters to get areas and from there calculate velocities. Once Beta, K, in/out densities and in/out velocities are known, Aspen HYSYS calculates the swage fitting pressure drop. Depending on the case, the equations for either a reducer or enlarger will be used.
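As a rough illustration of that sequence, here is a hedged Python hand-calc sketch. It uses a textbook loss coefficient for a sudden contraction, not necessarily the exact HYSYS correlation, and the flow value is arbitrary:

import math

def swage_dp(m_dot, rho_in, rho_out, d_in, d_out):
    a_in = math.pi * d_in ** 2 / 4.0           # inlet flow area, m2
    a_out = math.pi * d_out ** 2 / 4.0         # outlet flow area, m2
    v_in = m_dot / (rho_in * a_in)             # inlet velocity, m/s
    v_out = m_dot / (rho_out * a_out)          # outlet velocity, m/s
    beta = min(d_in, d_out) / max(d_in, d_out)
    k = 0.5 * (1.0 - beta ** 2)                # assumed reducer loss coefficient
    dp_loss = k * rho_out * v_out ** 2 / 2.0   # irreversible fitting loss, Pa
    dp_accel = (rho_out * v_out ** 2 - rho_in * v_in ** 2) / 2.0  # acceleration term, Pa
    return dp_loss + dp_accel

# Water near 25 C (~997 kg/m3), 2.9 in to 1.28 in reducer, illustrative 10 kg/s flow:
dp = swage_dp(10.0, 997.0, 997.0, 2.9 * 0.0254, 1.28 * 0.0254)
print(round(dp / 6894.76, 2), "psi")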
In the attached example simulation file, a swage fitting is used to model a reducer. A pipe segment has been set up with a feed stream that consists of pure water at 25 C and 116 psia. It consists of two straight pipe lengths with a 45-degree swage in between to account for the different inner pipe diameters of the inlet and outlet segments, 2.9 in and 1.28 in, respectively. Both straight lengths are set to 1E-6 m, so that the pressure drop calculated for the pipe segment accounts only for the swage fitting.
The pressure drop computed by Aspen HYSYS is 22.85 psi.
There is also an Excel spreadsheet attached which shows the hand calculations for the swage fitting pressure drop. In the spreadsheet, the assumption made is that the mass density is unchanged, a valid one for an incompressible fluid that is not going through a phase change. The calculated pressure drop is 22.46 psi, which is almost exactly the same as the Aspen HYSYS result.
Keywords: Swage Fitting, Pressure Drop, Pipe Segment, Diameter, Velocity, Density, Area, Volumetric Flowrate.
References: None |
Problem Statement: How do you convert a Hetran file to an EDR file in V8.8 and later versions? | Solution: A Hetran file is saved as a .btj file; this extension cannot be read in Aspen Shell and Tube Exchanger V8.8 and later versions.
Aspen Shell and Tube Exchanger V8.4 can read .btj files. The .btj file needs to be opened in V8.4 and saved as an .edr file.
Please follow these steps:
1. Open the .btj file using EDR V8.4
2. Select Run | Transfer
3. Check Shell & Tube Exchanger and click OK
4. Save file as .edr
Alternatively, you can use the built-in SaveAsEDR application - this is equivalent to steps 1 through 4 mentioned above.
1. To access the application, open the File Explorer and enter:
C:\Program Files (x86)\AspenTech\Aspen Exchanger Design and Rating V8.4\XEQ
2. In this folder, find and open the file SaveAsEDR.exe to convert the .btj file to EDR. The following window should appear:
3. Select the BJAC (*.btj) radio button and then select the file from the Source Drive/Folder. Once you open the folder containing the file to be converted, it should be available for selection inside the blank Source File area.
4. Highlight the file, choose a Destination Folder where you would like the converted file to be saved and then press Convert at the bottom of the window.
The file will then be converted into an (*.edr) application and will be available for opening in later versions.
Note: Both methods will also save the Hetran case in the EDR file, so you will get a warning that EDR no longer supports Hetran once you open it in a later version. You will also get a warning that Contran is no longer supported - this is the Hetran condensation model. After these two warnings, EDR will open normally and the file can be edited.
Keywords: Hetran file, convert, .btj
References: None |
Problem Statement: How do you ramp a variable without creating a task? | Solution: This is a very useful feature when you are experimenting with your dynamic simulation. It can only be used for variables with a Fixed specification.
You can implement the steps below while the simulation is running, or you can pause the simulation first and then resume the run once finished. If you choose to do it while the simulation is running, the run will halt while you are entering the data and then resume as soon as you close the dialog box in step 6.
1. Open the table containing the variable you want to ramp.
2. Click on the row containing the variable with the right mouse button to bring up the context menu.
3. If the variable is Fixed you will see an entry Ramp...
4. Choosing this entry will bring up a dialog box, which allows the end point and duration of the ramp to be set.
5. The start time is the current simulation time and cannot be changed.
6. When you are finished close the dialog box to apply the changes.
Note this is a one-shot ramp. If you want the ramp to run every time you run the simulation, you should create a task in the Flowsheet folder. You can simply drag and drop the variable onto the task editor (in the same way as you add a variable to a time plot) to avoid any spelling mistake in the task syntax.
Keywords: task, ramp, dynamic
References: None |
Problem Statement: How do you handle changes of model time units? | Solution: You can define the model time units on the Run, Run Options menu by selecting the units of TIME, the simulation time. This specifies not only the units of the predefined quantity TIME, the current simulation time, but also the times and delays used in the DELAY function and in tasks, and the time derivative $. It also defines the units of time for the integrator (minimum, initial and maximum time step).
The model developer is responsible for taking care of the unit conversion factors. Say, for example, we model a tank:
Model Tank1
  V as volume; // m3
  Fi as flow_vol; // m3/hr
  Fo as flow_vol; // m3/hr
  $V = Fi - Fo; // m3/hr
End
This model assumes that the time is in hours, because $V gives the time derivative of the volume, V is in m3, and the right hand side of the equation is giving flow in m3/hr.
If you want to change the model time units to seconds in the Run, Run Options, then the model must be modified to:
Model Tank2
  V as volume; // m3
  Fi as flow_vol; // m3/hr
  Fo as flow_vol; // m3/hr
  $V * 3600 = Fi - Fo; // m3/hr
End
The factor 3600 has been introduced to convert from $V in m3/s to m3/hr.
If you want more flexibility, you can use the same technique used in the Control models delivered in the Modeler library and in Aspen Dynamics models. The trick is to create a global parameter, GlobalTimeScaler, which is defined as the number of seconds in a model time unit. When the time is in hr, GlobalTimeScaler must be set to 3600. When the time is in s, GlobalTimeScaler must be set to 1. We need to modify our model to:
Model Tank3
  GlobalTimeScaler as global TimeScalerParameter;
  V as volume; // m3
  Fi as flow_vol; // m3/hr
  Fo as flow_vol; // m3/hr
  $V * (3600/GlobalTimeScaler) = Fi - Fo; // m3/hr
End
The value of the GlobalTimeScaler will be specified on the AllGlobals table you find in the Globals folder in the Simulation Explorer.
ACM v10 and higher will update the parameter when the model time units are changed in the Run Options. In v9.0 and lower, you must update the parameter manually.
Finally, in some cases, you may want to use different model time units, if for example models have been developed without having agreed on the same time base. This is potentially risky and confusing, but handling this requires only a small modification to the model to define a local time scaler parameter, which is assigned by default to the global time scaler. The user has then the flexibility to define the time to be used on a block basis. Again please be aware that the time units must be set correctly by the user as ACM does no checking.
Model Tank4
  GlobalTimeScaler as global TimeScalerParameter;
  TimeScaler as TimeScalerParameter;
  TimeScaler : GlobalTimeScaler;
  V as volume; // m3
  Fi as flow_vol; // m3/hr
  Fo as flow_vol; // m3/hr
  $V * (3600/TimeScaler) = Fi - Fo; // m3/hr
End
Keywords: time, GlobalTimeScaler, dynamic, model
References: None |
Problem Statement: Security, whether Aspen Local Security or full Framework Security, is based on applications, roles, and users attached to roles. The administrator sets up roles such as Admin, Manager, User, etc., and then adds users to these roles. These users are defined by their NT login ID. You then define which roles have which privileges for different Aspen applications.
So what would happen if the same user were in more than one role? For example, John Doe is in both the User role and the Manager role. Which permission set would he get? | Solution: He would get the higher permissions. The client software asks the security system whether the person who logged in with <login domain>\<user name> is able to perform a certain task. The system returns yes if he or she is a member of a role that has that permission.
Therefore having a user in more than one role can be misleading. If a user is both in an Operator role, with limited permissions, and an Administrator role, the greater privilege set of the Administrator role will take precedence. The privileges are additive when a user is in multiple roles.
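A minimal Python sketch of this additive behavior (the role names and permission sets here are illustrative, not the actual Aspen security API):

ROLE_PERMISSIONS = {
    "Operator": {"read"},
    "Manager": {"read", "write"},
    "Administrator": {"read", "write", "configure"},
}

def effective_permissions(user_roles):
    # The effective set is the union of the permissions of every role held.
    perms = set()
    for role in user_roles:
        perms |= ROLE_PERMISSIONS.get(role, set())
    return perms

print(effective_permissions(["Operator", "Manager"]))  # {'read', 'write'}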
Keywords: Role
User
Permission
Security
References: None |
Problem Statement: This knowledge base article describes how to fix Error 1320: The specified path is too long: C:\ProgramData\Document\AspenTech during Economic Evaluation installation. | Solution: To resolve this issue, please follow these simple steps:
1. Unhide all the hidden folders on the C: drive so you can see the ProgramData folder:
Open Windows Explorer | click on the C: drive | click Organize on the menu bar | select Folder and Search Options | click the View tab | select Show hidden files, folders, and drives
2. Open C:\ProgramData and rename the Documents folder (it appears like a shortcut). This is specific to this machine; other machines may have a different folder name that needs to be renamed (see the error dialog box for the specific folder).
3. Click the Retry button and the installation will complete.
4. For more details on why Error 1320 occurs, please refer to http://msdn.microsoft.com/en-us/library/bb756982.aspx.
4. For more details, please refer to http://msdn.microsoft.com/en-us/library/bb756982.aspx related to why Error 1320 occurs.
Keywords: ProgramData
64-bit
Error 1320
Long filename
Economic Evaluation
References: None |
Problem Statement: How is the conduit length of a component calculated in Aspen Capital Cost Estimator (ACCE)? | Solution: For a component such as a tower, the conduit length is the sum of:
1. The tangent-to-tangent (T-T) height of the component
2. The skirt height
3. The average distance from the component to the panel board (an area-level spec)
For example, if you have a tower with an 80-foot T-T height and a 15-foot skirt height, and the average distance from the tower to the panel board is 200 feet, then the length of conduit would be 295 feet:
Conduit Length = 80' T-T + 15' skirt + 200' distance to panel board = 295 feet
Keywords: Conduit, Tower Lighting, Conduit length calculation
References: None |
Problem Statement: The Excel Integration Utility is opening additional Excel instances and does not function properly. | Solution: This Tech Tip provides a possible solution when the Excel Integration Utility is opening additional Excel instances and does not function properly.
If you observe this behavior, go to Task Manager | Details tab and add the column Elevated. Orion.exe and the Excel instances should have the value No, as in the following image:
If one of the instances is running in elevated mode, this behavior will appear.
This issue is related to administrator privileges; to troubleshoot it, contact your IT department so that both Orion.exe and Excel run with the value No in the Elevated column.
Keywords: None
References: None |
Problem Statement: When I run the model using the XLP Matrix Generator, it failed, generating the error message:
Error in CreateMPIProcess: CreateMPIProcess failed with error 2: The system cannot find the file specified.
How can I resolve this error? | Solution: The cause of the error is that Windows Firewall is blocking some network connections.
To resolve the problem, you need to change Firewall settings. Go to Control Panel, Windows Firewall, Exceptions tab, Add Program and add the following executables.
1. C:\Program Files\AspenTech\Aspen PIMS\CaseParallel.exe
2. C:\Program Files\AspenTech\Aspen PIMS\MultiStartParallel.exe
3. C:\Program Files\AspenTech\Aspen PIMS\PIMSWin.exe
4. C:\Program Files\MPICH2\bin\mpiexec.exe
Keywords: XLP
Parallel Processing
References: None |
Problem Statement: What is the Feasibility Objective Factor in the XSLP settings? | Solution: The Feasibility Objective Factor identifies the user scaling for feasibility. The best way to describe this number is that it is the cost of a unit of infeasibility. For example, if a non-linear equation is an E-row, the Infeasibility Breaker option allows you to buy a unit of non-zero. So, if the Feasibility Objective Factor is 10,000, it will cost $10,000 to have the equation equal 1 instead of 0. This is, of course, a huge incentive to keep this column at a value of 0 so you don't have to pay the $10,000.
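Schematically (an illustration of the penalty idea, not the exact XSLP internals), the infeasibility cost enters the objective as:

minimize c'x + FOF * (s1 + s2 + ... + sn), with si >= 0

where each si is the amount of infeasibility bought for a row; with FOF = 10,000, driving any si from 0 to 1 adds $10,000 to the objective.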
Keywords: Feasibility Objective Factor
FOF
XSLP settings
References: None |
Problem Statement: The customer wishes to know how many Manipulated Variables (MVs) will be in the controller model. | Solution: This is relatively simple. When planning an Aspen Advanced Process Control application, two key source documents are the Process Flow Diagram (PFD) and the Piping and Instrumentation Diagram (P&ID) of the process to be controlled.
To estimate the MV count for the controller, first determine the extent of the controller on the PFD and/or P&ID. This can be done by mentally or physically drawing the boundary of the control on the diagram. Next, review the regulatory system inside the controller boundary and count the number of regulatory loops: Temperature, Pressure and Flow.
Typically, the Advanced Process Controller will write setpoints to each regulatory loop or, in rare cases, to the output of the regulatory loop. Therefore, the count of regulatory loops is an excellent estimate of the number of Manipulated Variables in the control application.
Keywords: Manipulated Variable, MV, Regulatory Loop
References: None |
Problem Statement: What causes Error 1067: the process terminated unexpectedly while starting the CIMIO Manager service? | Solution: While starting the CIMIO Manager service, it fails to start with the error message: Error 1067: the process terminated unexpectedly.
There could be multiple reasons for this error to occur. One of them is that the machine hosting the CIMIO Manager is not able to connect to, or acquire a license from, the license server.
Please follow the steps below to rectify this problem:
1. On the server, check the CIMIO_msg.log file in C:\Program Files (x86)\AspenTech\CIM-IO\log.
2. The log file will have error messages about failure to acquire a license from the server or inability to connect to the server.
3. Run the SLM Configuration Wizard to register all the licenses from the license server.
4. Attempt to start the CIMIO Manager service after the licenses are initialized. The service will now start.
Alternate solution - In case the above procedure does not solve the problem, it is possible that the .CSD file on the server is corrupted and causing this issue.
In this case, delete the .CSD file located in C:\Program Files (x86)\AspenTech\CIM-IO\management.
Now restart the CIMIO service from the Windows Services panel.
Please see KB 104913 and 120336, if the above steps do not resolve the problem.
Keywords: Error 1067
CIMIO Manager service
Windows service
.CSD file
References: None |
Problem Statement: When I add a Quoted Equipment item to my project, why is it only appearing in the TIC (Total Installed Cost), while the equipment cost shows zero? | Solution: Equipment costs are broken down into two basic categories: the cost of the equipment itself and the Total Installed Cost (which includes things like setting costs, civil, piping, instrumentation, etc.). When you use the Quoted Equipment component, it needs three things: COA, “Material cost per unit”, and “Labor hours per unit”. If you give it a COA from 100 to 299, the item will be treated as equipment. You will see in your report under “Equipment Cost” the cost you gave for “Material cost per unit”, and under “Total Cost” you will see the installed costs (basically the equipment cost and anything you added for setting and in the drop-down OPTIONS form). BUT, if you give it a COA from 300 to 999, it is not treated as equipment; it is considered a plant bulk item. In this case the “Equipment Cost” in the report will be zero, and all costs will appear in the “Total Direct Cost” field.
Keywords: COA, Quoted Equipment, TIC, Equipment
References: None |
Problem Statement: I noticed in Aspen Capital Cost Estimator (ACCE), Help, the option to Show Cost Basis for my current release. Where can I find the cost indices for all the past releases? | Solution: The list of all indices is in the Icarus Reference Manual, Chapter 33, titled Base Indices. You can get the manual from ACCE under “Help”, “Documentation”.
Keywords: Indices, cost basis
References: None |
Problem Statement: After applying Windows updates from July-August 2018 release users may experience Error 429: ActiveX components can´t create Object. The Aspen ACP Web Provider Data Service may not be running or is not registered properly. This Error will not allow the user to login in PCWS.
On the other hand, the user also can experience Error 500 – internal server error when trying to call the History Plots for a Variable
In most cases the problem occurs after applying the 2018-7 Quality and Security rollup for .NET Framework. Below is a list of the MS updates that may lead to this problem:
KB4340556
KB4338600
KB4345424
KB4284815
KB4338815
KB4338424
KB4338419
KB405456
KB4095875
KB4043763
KB4040981 | Solution: These errors were introduced by a Microsoft .NET Framework security update that affects the PCWS. The following MS article explains the solution:
https://support.microsoft.com/en-us/help/4345913
The recommended approach is to allow Windows Update to detect and apply the August 2018 “2018-8 Security and Quality Rollup for .NET Framework” applicable to your operating system and .NET Framework versions installed.
In addition, it has been reported that MS KB4338600 helped to solve the problem with Error 500.
If you already have the August 2018 (2018-8) Security and Quality Rollup for .NET Framework applied to your system, then that should resolve the problem.
Keywords: MS Updates, PCWS. Error 429, Error 500
References: None |
Problem Statement: What does the error message Unhandled Exception: Length cannot be less than zero mean when users are trying to print results in the Aspen Exchanger Design & Rating tool? | Solution: When the user tries to print data from an EDR file, the program checks that the formatting of the Headings area is maintained.
Specifically, it uses the following strings to parse out the data the user may have entered in between:
Our Reference:
Your Reference:
Rev No.:
Job No.:
The above error message shows up when the user removes one of these strings; as a result, the program can no longer parse the headings correctly.
Keywords: Exception Length; Headings and drawings;
References: None |
Problem Statement: Why is the system generating concrete costs when I have a MODULE Bargeable or Truckable area? How do I remove the concrete costs? | Solution: The system is generating the concrete costs because the default is to have the system set the Module on a pad at the delivery site. You can set the concrete thickness to 0 (zero) on the Module form and no concrete costs will be generated.
Keywords: Module, concrete, pad, Bargeable, Truckable
References: None |
Problem Statement: What effect does entering the duty have on my thermosiphon reboiler? | Solution: Entering the duty will not affect the reboiler equipment cost, but it will have an effect on the installation bulks such as piping and instrumentation.
The actual thermosiphon reboiler cost is determined by the heat transfer area and other inputs. The duty, and stream properties such as vaporization, specific gravity, and molecular weight of the tower bottoms, are used to size the piping lines for the reboiler. If you give a different duty than the default duty, this will affect the line sizes and therefore the piping costs and instrumentation costs (possible different control valve size and/or type on the piping lines).
Keywords: thermosiphon reboiler, reboiler, duty, thermosiphon
References: None |
Problem Statement: Why is my Rental Equipment Report empty? | Solution: Check to make sure you have not entered Rental as either a percentage of DFL or an actual amount. You can have the system calculate the rental equipment types, durations, and cost, OR you can enter the rental equipment as a percentage of DFL or an amount. If you enter a percentage or an amount, the system will use this value and will not calculate the rental equipment types, durations, and cost, and there will be nothing in the Indirects | Rental Equipment Report.
Keywords: Rental Equipment Report, Rental, Equipment, Indirects, Calculate
References: None |
Problem Statement: Is there a report that shows whether a piece of equipment's pricing is system generated or a Quoted Price? | Solution: Yes. There is an Excel report under Other Reports | Project | List of Equipment - Source of Quote.
Keywords: Quoted Equipment, Quoted, System Generated, Report
References: None |
Problem Statement: How to add Ductile Iron pipe in Aspen Capital Cost Estimator. | Solution: Ductile iron (DI) pipe is commonly used for water and wastewater applications. This model is available in Aspen Capital Cost Estimator V8.8.2 and onwards.
The new plant bulk model "BPIPDI PIPE" is added to the list of available piping plant bulks in all three Economic Evaluation products. The model is cataloged under "Plant Bulks > Piping > Ductile Iron Pipe" in the component palette and in the "Add component" dialog.
Keywords: Ductile Pipe, Waste water Pipe
References: None |
Problem Statement: I would like to have my Scaffolding reported as a Direct Cost instead of an Indirect Cost; how can I do that? | Solution: You cannot use the COA Manager to move an Indirect COA to a Direct COA. But for Scaffolding, it can be done using Customer External Files | Indirects - Proratables. Here is how it is done:
1.) There is a Direct COA for Scaffolding under Civil. It is COA 405 but nothing is put there by the system. You need to go to Customer External Files | Indirects - Proratables and add a field for 405. In the screen shot below 405 is being set up as 7 percent of the Total Direct Man-hours.
2.) The project now has Scaffolding cost appearing as a Direct Cost, but the system is still calculating Scaffolding for COA 15. So you need to visit your Contractor forms and zero out the system generated Scaffolding Cost.
Once you do these steps, when you do your next Project Evaluation you should see Scaffolding cost appearing under COA 405.
Keywords: scaffolding, direct, indirect
References: None |
Problem Statement: How to add seamless steel tubes in Heat Transfer Equipment. | Solution: Starting in V8.8.2, Heat Exchanger tubes now feature a Seamless option with a symbol for the 2205 (duplex) seamless tube of 2205S. This tube material is available for all shell and tube heat exchanger models, reboilers, air coolers, furnaces and heating units. For ASME design code, the max. design temperature for this material is 600 F [ 315 C ]. For EN 13445 design code, the max. design temperature is 572 F [ 300 C ].
Keywords: Seamless Tubes, Heat Transfer Units, Heat Exchangers
References: None |
Problem Statement: How do I remove module setting costs? | Solution: The setting COA 285 is for module setting. You can index an adjustment of 0 (zero) on that COA at the module area level to get specific module setting costs removed, or on the whole project to get all module setting costs removed.
Keywords: COA 285, module, setting
References: None |
Problem Statement: How to create a shared On-Demand calculation in Aspen Calc using Aspen SQLplus. | Solution: You can take advantage of the Aspen On-Demand Calculations.
Calculations can be executed at regularly scheduled time intervals, on user request, and for events such as AspenTech's InfoPlus.21 database changes. One feature of AspenTech's Aspen Calc involving user requests is On-Demand Calculations. This is a feature that allows you to create and execute on the fly Calculations on current or history data from multiple data sources for multiple client applications. The calculations are only executed when an application needs the results.
For example, you can drag calculations from the Tag Browser just like regular tags. On-Demand Calculations must include at least one tag name.
Here's an example of how to create a shared On-Demand Calculation using Aspen SQLplus.
-- On demand shared calculation
INSERT INTO IP_CalcDef (Name, IP_Description, IP_Eng_Units,IP_Value_Format, IP_Graph_Maximum, IP_Graph_Minimum, IP_Stepped, #Calc_Lines)
VALUES('ATCAI_Calc', 'Atcai On demand calculation', 'pounds', 'F7.3', '25', '0', 'Interpolated', '1');
INSERT INTO ATCAI_Calc (CALC_LINE) VALUES ('=({ATCAI}+{ATCL101}/1.5)');
However, each line in the calculation must be 80 characters or less, so if the calculation is longer than this you will need to add more lines to the script. In order to do this you must change the last parameter #Calc_Lines to the number of lines that will compose the calculation.
Here's an example of how to create a shared On-Demand Calculation that is more than 80 characters long.
INSERT INTO IP_CalcDef (Name, IP_Description, IP_Eng_Units,IP_Value_Format, IP_Graph_Maximum, IP_Graph_Minimum, IP_Stepped, #Calc_Lines)
VALUES('ATCAI_Test', 'Atcai On demand calculation', 'pounds', 'F7.3', '25', '0', 'Interpolated', '2');
INSERT INTO ATCAI_Test (CALC_LINE) VALUES ('=({ATCAI}+{ATCL101}/1.5)+({ATCAI}+{ATCL101}/1.5)+({ATCAI}+{ATCL101}/1.5)');
INSERT INTO ATCAI_Test (CALC_LINE) VALUES ('+({ATCAI}+{ATCL101}/1.5)+({ATCAI}+{ATCL101}/1.5)+({ATCAI}+{ATCL101}/1.5)');
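If a calculation expression is long, you can generate the required multi-line INSERT statements programmatically. Below is a minimal Python sketch (record name and expression are just examples, not part of the product):

# Split a long Aspen Calc expression into chunks of at most 80 characters
# and print the matching SQLplus INSERT statements.
# Note: a production splitter should avoid breaking inside a {tag} reference.
record = "ATCAI_Test"
expression = "=({ATCAI}+{ATCL101}/1.5)" * 6  # a deliberately long expression

chunks = [expression[i:i + 80] for i in range(0, len(expression), 80)]
print(f"-- set #Calc_Lines to {len(chunks)} when creating the record")
for chunk in chunks:
    print(f"INSERT INTO {record} (CALC_LINE) VALUES ('{chunk}');")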
And here's how you'll use this On-Demand Calculation in AspenTech's aspenONE Process Explorer product.
Keywords: Aspen Calc
load calculation
On-demand
References: None |
Problem Statement: How to use the BackDoorVariable method to get access to variables in Aspen HYSYS. | Solution: Some variables used in Aspen HYSYS have not been fully unwrapped and are not directly accessible as an automation interface. However, backdoor methods can be used to get access to them. We could think of monikers as URLs that we can input directly in the address bar of a web browser and give us access to certain content that otherwise is not accessible browsing through the parent website.
Moniker backdoor methods are only recommended when there is no other alternative, since the internal HYSYS monikers may not remain constant between versions. Whenever possible, access the variables by using the defined properties, starting from the simulation case object.
For example, the moniker for the pressure drop variable in an LNG exchanger is: FlowSht.1/UnitOpObject.400(2T536)/ExchSide.500.0:PressureDrop.300
Each part of the moniker identifies one component of the path (flowsheet, unit operation, variable). When creating the backdoor from a unit operation object, you should only input the part of the moniker that starts after the unit operation, as follows:
Dim LNG_BD as backdoor
Set LNG_BD = hyLNG 'the LNG unit operation object
Dim hyBDRVar as Object 'or as RealVariable
Set hyBDRVar = LNG_BD.BackDoorVariable("ExchSide.500.0:PressureDrop.300").Variable
If you create the backdoor object through the simulation case, you should include the moniker starting from the flowsheet, as follows:
Dim case_BD as backdoor
Set case_BD = hyCase 'the simulation case object
Set hyBDRVar = case_BD.BackDoorVariable _ ("FlowSht.1/UnitOpObject.400(2T536)/ExchSide.500.0:PressureDrop.300").Variable
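The same backdoor access works from other COM clients. Below is a minimal late-bound Python sketch, assuming the pywin32 package, an already-open simulation case, and a hypothetical unit operation name (LNG-100):

import win32com.client

hy = win32com.client.Dispatch("HYSYS.Application")  # attach to a running HYSYS
case = hy.ActiveDocument                            # the open simulation case
lng = case.Flowsheet.Operations.Item("LNG-100")     # operation name is hypothetical
var = lng.BackDoorVariable("ExchSide.500.0:PressureDrop.300").Variable
print(var.Value)                                    # current pressure drop value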
Keywords: Monikers, Backdoor method, Automation.
References: None |
Problem Statement: What units of measure are used in Aspen Plus for the Henry constant? Also, how is it possible to use data from Environmental Chemistry literature to obtain the Henry constant? | Solution: There are various forms of Henry's Law, as summarized in Table I.
Table I (summary): each form of the Henry's law constant has its own defining equation and dimensions, ranging from pressure-based forms to a dimensionless form.
Aspen Plus uses the form kH,px (pressure units).
There are multiple equations assessing the effect of temperature on the constant. A simple way to describe Henry's law constant as a function of temperature is:
k(T) = k(T0) * exp[ C * (1/T - 1/T0) ]
where T0 = 298.15 K.
In some literature the constant C may be regarded as C = -(Δsoln H)/R, the negative of the enthalpy of dissolution divided by the gas constant.
You can find this data (C and k(T0), also called k0) for over 900 organic and inorganic compounds dissolved in water in the compilation made by Sander [1].
In Aspen Plus, Henry's Law is described as a function of temperature as:
ln(Hij) = Aij + Bij/T + Cij*ln(T) + Dij*T + Eij/T^2
where Hij is the Henry constant in pressure units (kH,px), i is the solute and j is the solvent.
Using the equation above:
Aij = ln(kHpx0) + C/T0
Bij = -C
Cij = Dij = Eij = 0
For example, using Sander's data for acetaldehyde:
k0 = 130 (mol/L/atm)
C = 5800 (K)
First you need to convert k0 to pressure units (55.3 mol/L is the molar concentration of water):
kHpx0 = 55.3/k0 = 0.42538 atm
So:
Aij = 18.6
Bij = -5800
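For illustration, here is a minimal Python sketch of this conversion, using the acetaldehyde values above:

import math

# Convert Sander's values (k0 in mol/(L*atm), C in K) to the Aspen Plus
# HENRY parameters Aij and Bij (Henry constant in atm).
k0 = 130.0      # mol/(L*atm), solubility form of Henry's constant
C = 5800.0      # K, temperature-dependence constant
T0 = 298.15     # K, reference temperature

kHpx0 = 55.3 / k0               # pressure form (55.3 mol/L = molarity of water)
Aij = math.log(kHpx0) + C / T0
Bij = -C
print(f"Aij = {Aij:.1f}, Bij = {Bij:.0f}")   # Aij = 18.6, Bij = -5800

# Check: Henry constant at 25 C from ln(H) = Aij + Bij/T
T = 298.15
H = math.exp(Aij + Bij / T)
print(f"H(25 C) = {H:.3f} atm")              # ~0.425 atm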
Attached is a .bkp file showing how to input this information on the Properties | Parameters | Binary Interaction | HENRY form. In the file you can find the calculated kHpx by following the procedure shown in Solution 118624 (How do you calculate Henry's constant in Aspen Plus in the typical pressure/concentration units found in literature?).
Keywords: Henry, Acetaldehyde
References: R. Sander (1999)
Compilation of Henry's Law Constants for Inorganic and Organic Species of Potential Importance in Environmental Chemistry (Version 3)
http://www.henrys-law.org |
Problem Statement: In Aspen Mtell, when tuning an Agent, what is the significance of the post-failure interval? | Solution: The post-failure interval is an excluded portion of the data set. The machine learning algorithms do not consider the post-failure interval as part of either normal or abnormal behavior; it is entirely excluded.
The reason this exclusion exists is because once a failure occurs, it typically takes a few days to replace the part, start up the system, and resume normal operation. We do not want this to be learned as part of normal operation, so we exclude it.
The recommended post-failure interval can range from 1 day to 1 week, depending on the physical/chemical process, the asset, and the industry.
The default post-failure interval when creating new work orders is 2 days, but this can be changed in the failure library tab in the Agent Builder.
Once you know the time typically taken to complete the repair and resume normal operation for an asset, that time period is the ideal value for the post-failure interval; however, this type of information is often undocumented.
Hence we suggest keeping the default value and changing it once you have a better idea of the actual repair time.
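For illustration, a minimal Python sketch of the data window this setting excludes (timestamps are hypothetical):

from datetime import datetime, timedelta

failure_time = datetime(2024, 3, 1, 8, 0)   # hypothetical work order timestamp
post_failure_interval = timedelta(days=2)   # the default of 2 days
exclusion_end = failure_time + post_failure_interval
print("Data excluded from", failure_time, "to", exclusion_end)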
Keywords: None
References: None |
Problem Statement: I have a simple estimate where I am adding some equipment and piping; how can I turn off the Volumetric Modeling in Aspen Capital Cost Estimator (ACCE) to just get the added items' cost? | Solution: You can turn the Volumetric Model option off in ACCE by using either the M or N option in the General Project Data | Suppress default equipment/area/project bulks form. M will suppress the equipment volumetric model bulks and most of the system developed area and project bulks, and N suppresses the equipment volumetric model bulks but keeps most of the system developed bulks.
Keywords: Volumetric Model, turn off, off
References: None |
Problem Statement: Why am I not seeing Rental Costs in my Reports? | Solution: You are not seeing any Rental Costs because for your Workforces you have chosen, under Indirects:
“- Wage rates include all indirects”
One of the indirects included is the Rental Cost. So your Rental Costs are included in the wage rate.
Keywords: Rental Costs, Rental, Indirects, Workforce
References: None |
Problem Statement: When you open the Aspen InfoPlus.21 Configuration Wizard Excel addin and select the datasource, the fields are greyed out and not changeable. Shown below (fig 1):
Figure 1 | Solution: To resolve this issue, go to a cell (e.g., cell A1), enter a tag name or any character, and then reselect the Datasource in the Configuration Wizard. This will force the add-in to connect to the datasource and enable the fields. Shown below (fig 2)
Please note: The addin cannot work unless there is data present within the spreadsheet.
Figure 2
Keywords: Ip21 Configuration Wizard
Ip.21 addin
Infoplus.21
Excel
addin
Greyed
References: None |
Problem Statement: Is it possible to get the K values calculated for each section of a Pipe model? | Solution: Pipe assumes that the pressure drop due to valves and fittings is distributed evenly along the specified length of the pipe. The total length Pipe uses in calculations corresponds to the specified pipe length, plus any equivalent pipe length due to valves, fittings, pipe entrance and exit, a sudden enlargement and/or contraction, an orifice plate, and miscellaneous L/D or K.
Aspen Plus calculates pressure drop from most fittings (gate valves, butterfly valves, large 90 degree elbows, straight tees, branched tees) by calculating an L/D value for each fitting, multiplying that by the appropriate pipe diameter, and using the result as extra pipe length. Aspen Plus does not calculate a loss K factor for these fittings. These calculations are done internally and not reported for the individual fittings.
You can calculate the K factor from K = fL/D where f is the friction factor, D is the pipe diameter, and L is the equivalent length. The friction factor is shown on the Results | Profiles sheet. The total equivalent length (including the length of the pipe itself) appears on the Results | Summary sheet. The diameter is the specified diameter. If pipe schedules are used, view the input summary to find the diameter, which is labeled as IN-DIAM. To find the K factor used for a particular fitting, use this method with a pipe length of 0.
The steps are as follows:
Specify a pipe length of 0 on the Setup / Pipe Parameters form.
On the Fittings1 form, specify a single fitting, for example, 1 Gate Valve.
Run the simulation.
Find the friction factor from the Results / Profiles form.
Find the Equivalent Length from the Results / Summary form. This will be the pipe length used in the integration.
Get the Diameter from the user's specification. If pipe schedules are used, do View / Input summary and in the pipe block the actual diameter is IN-DIAM.
Use the equation K = f * (L/D) to calculate the K value for the single fitting.
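To check the arithmetic, here is a minimal Python sketch of the K-factor calculation for the gate-valve case (values taken from the table that follows; the friction factor and equivalent length come from the Aspen Plus results forms):

f = 0.0132236      # friction factor, from Results | Profiles
L = 7.64189        # equivalent length in ft, from Results | Summary
D = 0.948          # pipe diameter in ft (IN-DIAM in the input summary)

K = f * (L / D)
print(f"K = {K:.4f}")   # 0.1066 for one gate valve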
In the attached file, a PIPE block with 0 length is set up for each type of valve. This file can be opened in Aspen Plus V9 and higher.
Fitting               f           L (ft)      d (ft)    K = f * (L/D)
Gate valve            0.0132236    7.64189    0.948     0.1066
Butterfly valve       0.0132236   61.13510    0.948     0.8528
Large 90 deg. elbow   0.0132236   13.75540    0.948     0.1919
Straight tee          0.0132236    7.64189    0.948     0.1066
Branched tee          0.0132236   34.38849    0.948     0.4797
Correlations for Connection type (Flanged welded or Screwed) and fittings (Gate valves, Butterfly valves, Large 90 deg. elbows, Straight tees and Branched tees) are from a proprietary source.
For more information see the Help topic: Using the Simulation Environment -> Unit Operation Models -> Pressure Changers -> Pipe Reference -> Specifying Pipe -> Modeling Valves and Fittings.
Keywords: None
References: CQ00320063 |
Problem Statement: Why might I see differences between a regular case stack and a parametric analysis changing the same variables? | Solution: Sometimes users try to replicate the results of a Parametric Analysis by running a set of comparable cases and find there are small differences in the results. By design, Parametric Analysis runs are initialized differently than a regular case stack in order to speed up the solution time. The different approach to initialization can sometimes result in slight variations in the results. Below is a chart describing how each type of run is initialized.
Parametric Analysis:
- Designated input solution file used for the base run (about 10-digit precision); the rest of the runs use the solution of the base run as input solution (in memory, full precision)
- LP basis is created from the base run and is used for all subsequent PA runs
- No "cleaning" of noise in the input solution file

Case Stacking:
- Designated input solution file used for all runs (about 10-digit precision) unless superseded for a case by the LOADSOL keyword
- Each case is run independently - no LP basis unless specifically designated
- Input solution file is cleaned for noise (i.e., values < 1e-6 are set to zero)
So, what is the best way to replicate a Parametric Analysis run using regular case stacking? The steps below initialize a case stack in a manner that closely mimics how Parametric Analysis is initialized.
Run the base case and create an input solution file from it.
Run the cases that mimic the parametric analysis variable changes
Designate, in the Run Dialog box, the input solution file created in step 1
Turn on the "Start with Associated LP Basis" option located in Model Settings | Non-linear Solver (XSLP) | Advanced 2 tab
Slight differences could still arise from the "cleaning" of the input solution file data and from the difference in precision between input solution data coming from memory versus a file. These would be extremely small and unlikely to lead to significant differences in the results.
Keywords: None
References: None |
Problem Statement: What is the Planning Objective Function Formulation? | Solution: Use the Planning objective function formulation to consider the following factors during optimization:
· Product Shipment Revenue
· Component Shipment Revenue
· Product Receipt Cost
· Component Purchase Cost
· Inventory Buildup Value (excluding final blend debits if applicable)
· Inventory Drawdown Cost (excluding final blend debits if applicable)
Formula
The Planning objective function is based on the following formula (term symbols assigned for readability):

Objective = E + F - H - w * I

Where:
E : Economic Objective Contribution
F : Final Blend Contribution
H : Holding Cost Contribution
w : Overall infeasibility breaker weighting factor
I : Infeasibility Breaker Contribution
Economic Objective Contribution
The formula for the economic portion of the objective function is as follows:

E = Product Shipment Revenue
  + Component Shipment Revenue
  - Product Receipt Cost
  - Component Purchase/Receipt Cost
  - Component Production Cost
  + Inventory Buildup Value
  - Inventory Drawdown Cost
Where each term is computed from the following quantities:
- sales price for product shipment j (j = 1, ..., NPS)
- price of component shipment k (k = 1, ..., NCS)
- cost of product receipt l (l = 1, ..., NPR)
- cost for component receipt m (m = 1, ..., NCR)
- production cost of component i (i = 1, ..., NC)
- cost of inventory buildup associated with tank o (o = 1, ..., NTANK)
- price of inventory drawdown associated with tank o (o = 1, ..., NTANK)
- quantity (volume or weight) for product shipment j (j = 1, ..., NPS)
- quantity (volume or weight) for component shipment k (k = 1, ..., NCS)
- quantity (volume or weight) for product receipt l (l = 1, ..., NPR)
- quantity (volume or weight) for component receipt m (m = 1, ..., NCR)
- quantity (volume or weight) of component i produced in period t (i = 1, ..., NC; t = 1, ..., T)
- amount of inventory increase for tank o (o = 1, ..., NTANK)
- amount of inventory decrease for tank o (o = 1, ..., NTANK)

NPS : number of product shipments
NCS : number of component shipments
NPR : number of product receipts
NCR : number of component receipts
NC : number of components
T : number of periods
NTANK : number of component and product tanks

Each revenue or cost term above is the sum, over its index range, of the price (or cost) multiplied by the corresponding quantity.
Final Blend Contribution
The formula for the final blend portion of the objective function sums, over the products with final blends enabled, the sales price for product j multiplied by the final blended volume for product j:

F = sum over j of (sales price of product j) * (final blended volume of product j)
Holding Cost Contribution
The formula for the holding cost portion of the objective function sums, over components j and time periods t, the holding cost for component j at time t multiplied by the period duration at time t and by the inventory for component j at time t:

H = sum over j, t of (holding cost) * (period duration) * (inventory)
Infeasibility Breaker Contribution
The formula for the infeasibility breaker portion of the objective function is as follows:

I = Inventory Infeasibility Breakers + Property Specification Infeasibility Breakers

The inventory term sums, over tanks i and times t:
- (penalty for inventory excess in tank i at time t) * (inventory excess for tank i at time t)
- (penalty for inventory deficiency in tank i at time t) * (inventory deficiency for tank i at time t)

The property specification term sums, over blends b and properties k, scaled by the specification violation weighting factor:
- (penalty for property specification excess for blend b and property k) * (amount of property k exceeding specification for blend b)
- (penalty for property specification deficiency for blend b and property k) * (amount of property k below specification for blend b)
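To make the combined structure concrete, here is a minimal Python sketch (data layout and names are illustrative assumptions, not MBO internals):

def planning_objective(E, F, H, I, w):
    # E: economic contribution, F: final blend contribution,
    # H: holding cost contribution, I: infeasibility contribution,
    # w: overall infeasibility breaker weighting factor
    return E + F - H - w * I

def revenue_minus_cost(prices, qtys, costs, cost_qtys):
    # a generic price*quantity sum, as used by each economic term
    revenue = sum(p * q for p, q in zip(prices, qtys))
    cost = sum(c * q for c, q in zip(costs, cost_qtys))
    return revenue - cost

E = revenue_minus_cost([25.0, 30.0], [100.0, 50.0], [20.0], [120.0])
print(planning_objective(E, F=50.0, H=20.0, I=5.0, w=10.0))  # 1580.0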
Keywords: MBO
References: None |
Problem Statement: How do I zero out "Field Supervision" (COA 85) in my project? | Solution: You can zero out "Field Supervision" costs by going to the Engineering form, Engineering by Discipline. Choose F (Field Office Supervision), All Disciplines, and 0 (zero) and that will delete COA 85.
Keywords: Field Supervision, COA 85, COA, Engineering
References: None |
Problem Statement: Error database was created with non-FIPS validated cryptographic algorithms and is not accessible when FIPS compliant is enabled.
When trying to use Aspen Properties Custom database, user gets following error message:
SQLServer.com,1433.Custom_DB : (DataVersion - 23.0.0.0) OK
SQLServer.com,1433.Custom_DB : Custom_DB was created with non-FIPS validated cryptographic algorithms and is not accessible when FIPS compliant is enabled. | Solution: Users get this error message when the databases were created with non-FIPS-validated cryptographic algorithms and FIPS 140-2 compliance was later enabled on the machine. All new databases created with the Aspen Properties Enterprise Database in version V8.8 or later are FIPS 140-2 compliant.
Databases created with V8.6 or earlier are not FIPS 140-2 compliant.
Databases restored from backup copies of non-FIPS-compliant databases are also not compliant.
When FIPS mode is disabled in Windows, APED can use both the compliant and non-compliant databases.
When FIPS mode is enabled, only FIPS-compliant databases can be registered and used in APED.
In FIPS mode, if you restore a non-FIPS-compliant database from a backup file, you will be warned that you will not be able to register the database due to FIPS mode.
To Disable FIPS on Windows:
1. As an administrator, click Start | Run (or type run on the Start screen in Windows 10), then in the Run window type gpedit.msc and press Enter.
2. The Local Group Policy Editor window appears.
3. Expand Computer Configuration > Windows Settings > Security Settings > Local Policies > Security Options.
4. In the list of policies, locate System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing. The Security Setting column displays whether FIPS-compliant mode is enabled.
5. To change the setting, double-click System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing and select Disabled, then click Apply and OK.
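As an alternative to stepping through the Group Policy Editor, the FIPS setting can be read programmatically. Below is a minimal Python sketch that reads the registry value the policy toggles:

import winreg

# Read the FIPS policy flag directly from the registry.
key_path = r"SYSTEM\CurrentControlSet\Control\Lsa\FipsAlgorithmPolicy"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
    enabled, _ = winreg.QueryValueEx(key, "Enabled")
print("FIPS mode is", "enabled" if enabled else "disabled")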
Note: To migrate from older version of custom databases to FIPS-compliant databases, create a new database in APED V8.8 or later, using DFMS files.
Keywords: FIPS-compliant
FIPS 140-2
U.S. Government standard
Cryptographic modules.
References: None |
Problem Statement: Several users have had difficulty setting up volume-based purchases and sales in a global (M/XPIMS) model. | Solution: The goal of this solution document is to make it easier to set up the model and to provide a couple of tips to avoid common problems.
This particular solution document applies to MPIMS models in which the Global and all Local Models are WEIGHT based.
Volume Based Purchases
There are a couple of key things to remember when setting-up volume based purchases:
The VOL column in table GSUPPLY does not do anything and can be ignored.
The VOL column in table SUPPLY is used to tell PIMS that the global model will handle the sales on a volume basis. The way the program is designed, the value in VOL for the first row that contains the material tag will set the sales basis for that material tag in the entire model. For example, if table SUPPLY has the following:
        VOL
ANSA
ANSB    1
then ANS will be sold on a volume basis for all markets.
You need to supply a gravity in the global table UNITS in the form of either SPG, API, or VTW in order for PIMS to create the correct matrix coefficients. If you don't put in a gravity, then PIMS will assume an SPG of 1.0, which can be significantly different.
The gravity of the purchased stream will be fixed at the value in T.UNITS and will not be updated via the recursion process. This is the case even if the purchase stream is an internal pool (FCC Feed, for example).
You need to flag the VOL column in the local model table BUY. This will cause PIMS to generate the local model purchases report with the correct units description.
Volume Based Sales
The things to remember when setting up volume based sales are:
You need to flag column VOL in the local model table SELL in order for PIMS to correctly set-up the matrix structure. Other requirements depend on whether the product is a blend (specification or formula) or a submodel stream.
For a blended product, PIMS will set-up the structure for you. You just need to make sure that PIMS has a gravity for each of the blendstocks.
For a submodel stream, you must manually build the structure to convert the weight based stream (WBALxxx) to a volume based stream (VBALyyy). Note, the tag codes must be different.
You need to flag column VOL in the global table DEMAND. Otherwise PIMS will generate an error message.
I have attached an example model to help you with modeling this type of situation. If you have any questions, please contact PIMS support at [email protected].
(Editor's note: This tech tip is thanks to Brian Oakwood, our friend and former coworker. Apologies for the delay in making this public.)
Keywords: conversion
xpims
mpims
vol
References: None |
Problem Statement: What will prompt XLP to regenerate the matrix for a case? | Solution: Items that cause regeneration in XLP:
· GENERATE keyword
· EXPERT keyword
· MODEL keyword
· Current case modifies a case that uses the EXPERT keyword AND current case has changes
· Using keyword REPLACE or REPLACEALL instead of TABLE
· Adding an entire table
· Adding a new row to a table with the following exceptions:
· Not just a new periodic row for tables BLNSPEC
· Not a P-row in BLNSPEC
· Adding a new column to a table with the following exceptions:
· Not just a new periodic column
· Any change in Table BLNPROP
· Changing a submodel entry
· from a value to EMPTY
· from EMPTY to a value
· to or from +/- 999
· Changing any RFG parameters in the BLNSPEC tables
· Changes to table TRANSFER, column MAX to or from the value of 0
· Changing a DISABLE table or column entry
· from a value to EMPTY
· from EMPTY to a value
· Changes to existing column values in tables/columns other than those listed below. Note that even if the entry is not a change, the presence of a column other than those listed below may prompt a regeneration.
· BOUNDS (MIN, MAX, FIX)
· BUY (MIN, MAX, FIX, COST)
· CAPS (MIN, MAX, FIX)
· DEMAND (MIN, MAX, FIX, PRICE)
· GSUPPLY (MIN, MAX, FIX, COST)
· PROCLIM (MIN, MAX, PENALTY)
· SELL (MIN, MAX, FIX, PRICE)
· SUPPLY (MIN, MAX, FIX, COST)
· TRANSFER (MIN, MAX, FIX, COST)
· UTILBUY (MIN, MAX, FIX, COST)
· UTILSELL (MIN, MAX, FIX, PRICE)
Keywords: GENERATE, run time, PIMS-AO, PIMS Advanced Optimization, generate, generation
References: None |
Problem Statement: How do I resolve the issue of "Unexpected file size increase" in Aspen Plus V10? | Solution: Sometimes the file size increases greatly when saving a file in Aspen Plus V10; the resulting file is difficult to re-open and may crash on loading.
To resolve this issue, check the possible solutions below:
1. Check whether the recent patch Aspen Plus V10.0 Family - Cumulative Patch 3 (CP3) is installed. The issue of "Aspen Plus bkp file becomes very large after saving a simulation" is resolved in this patch.
2. Unnecessary parameters may be retrieved when the file contains pseudocomponents. The retrieved-parameters function did not support pseudocomponents, which is why error messages appear or the file size increases when you load the .bkp file.
If you remove all retrieved parameters from the .bkp file, the file size will be much smaller and the loading time will be reduced significantly.
3. Unwanted report options may also be selected, which in turn increases the file size. To reduce the size, go to the Properties mode, Setup, Report Options, and uncheck unwanted check boxes so that fewer property results are saved with the simulation results (the Setup | Report Options snapshot is shown below).
Keywords: None
References: None |
Problem Statement: What are the differences between all the matrix files in the model tree? | Solution: MPSBCD - This file contains the calculation results of a Distributive Recursion (DR) model.
MPSPROB - This file contains the initial values of a Distributive Recursion (DR) model. It does not contain calculation results.
Xmps001.xlp - This file contains the calculation results of a specific case for models using XNLP, or XNLP and XLP.
Xlp_new001.xlp and xmps_new001.xmps - These files are for models using XNLP and XLP; they are the same file in two different formats. They contain generated initial values, which do not match the converged solution values. These files are no longer generated if you clear the Save Case Specific XMPS Files option in the XNLP Model Settings.
Keywords: xlp
MPSBCD
MPSPROB
JP-
References: None |
Problem Statement: AspenTech recommends setting the affinity of Aspen InfoPlus.21 processes to the same CPU regardless of whether Aspen InfoPlus.21 is hosted on a virtual or physical server. One reason is the length of time required to acquire a lock against the Aspen InfoPlus.21 database when accessing data. Some Microsoft operating system calls run significantly more slowly if processes making the calls are executing on different CPU cores. As a result, processes that frequently access the Aspen InfoPlus.21 database will generally run much more quickly if bound to the same CPU.
This knowledge base article contains a program that measures the length of time needed to lock a resource using the same mechanism as used by the Aspen InfoPlus.21 API. This program does not actually lock the Aspen InfoPlus.21 database and may be run on a computer without any Aspen products installed. | Solution: Download the zip file attached to this article to a computer with multiple CPUs. The zip file contains a program named LockTest.exe that measures how long it takes to lock a resource 1,000,000 times using the same locking mechanism as the Aspen InfoPlus.21 API.
Also included in the zip file are several batch procedures.
Batch File Name    Purpose
Multi_lock1        Executes LockTest on a single CPU
Multi_lock2        Simultaneously executes two processes running LockTest without being bound to the same CPU
Multi_lock2A       Simultaneously executes two processes running LockTest bound to the same CPU via affinity
Multi_lock3        Simultaneously executes three processes running LockTest without being bound to the same CPU
Multi_lock3A       Simultaneously executes three processes running LockTest bound to the same CPU via affinity
Multi_lock4        Simultaneously executes four processes running LockTest without being bound to the same CPU
Multi_lock4A       Simultaneously executes four processes running LockTest bound to the same CPU via affinity
The tests demonstrate that in both physical and virtual servers it takes significantly more time to lock a shared resource (like the Aspen InfoPlus.21 database) with applications running on multiple CPUs than with applications running on the same CPU.
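For illustration, here is a minimal Python sketch of launching two LockTest processes pinned to one core, similar in spirit to the Multi_lock2A batch file (assumes the third-party psutil package, and LockTest.exe in the working directory):

import subprocess
import psutil  # pip install psutil

# Launch two copies of LockTest.exe and bind both to CPU core 0.
procs = [subprocess.Popen(["LockTest.exe"]) for _ in range(2)]
for proc in procs:
    psutil.Process(proc.pid).cpu_affinity([0])  # pin to core 0
for proc in procs:
    proc.wait()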
Keywords: affinity
performance
References: None |
Problem Statement: EO solution fails after giving a Measurement block variable a description. | Solution: After adding a Description, users need to update the EO variable names where they are referenced.
To set up a measurement, you add a Measurement block to the flowsheet and configure the block. Measurement processing automatically creates the three EO variables, using the following naming format:
blockid.BLK.tag_description_variable
Where:
blockid is the name of the measurement block.
tag is the tag specified for each measurement.
description is the description provided for each measurement.
variable is PLANT, MODEL, or OFFSET.
If the description is not provided, the variables have shorter names in the form blockid.BLK.tag_variable.
The behavior is similar to when a description is added to a EO variable in other places (e.g. Design Spec Define variables), it will change the local EO variable name. E.g. what used to be BLK.F.OFFSET, will now change to BLK.F_NEW_DESCRIPTION_OFFSET. The tooltip for the Description of a Define variable states A description string. Optional. If provided will be used to generate local EO variable name.
More information can be found in the Aspen OOMF Script Language Manual and in the Aspen Plus Getting Started Using Equation Oriented Modeling manual.
Keywords: None
References: VSTS 48901 |
Problem Statement: When a Windows Installer (MSI) package is installed from removable media, such as a CD-ROM or DVD, and the MSI file doesn't reside on the root folder of the media, the following error message may appear:
Error 1706: No valid source could be found.
This occurs if the Installer needs to query the source. Browsing to the MSI package does not resolve this error. | Solution: Option 1: Copy the source media to a local or mapped drive and re-run the update.
Option 2: See https://docs.microsoft.com/en-us/windows/desktop/msi/mediapackagepath for information on modifying the registry to properly set the MEDIAPACKAGEPATH property. MEDIAPACKAGEPATH is a public property that enables you to define where on the removable media the MSI file is located.
Keywords:
References: None |
Problem Statement: How to specify when PPM, PPB or trace will be used for component fractions in stream results? What does trace mean for component flowrates or fractions? | Solution: It is possible to specify that PPM, PPB or trace to be used for values in the new Stream Results form available in V9 and higher.
PPM is used when a fraction is in the range 1e-3 to 1e-6, PPB is used for the range 1e-6 to 1e-9, and trace is used for fractions less than 1e-9. Fractions in the Stream Summary can use PPM, PPB or trace for smaller concentrations.
The following steps can be used to specify when PPM, PPB or trace are used:
1. First go to Results Summary | Streams or some other Stream Results form.
2. Click on Display Options in the Property Sets area of the Stream Summary ribbon.
3. The display options tab can be used to specify the options for component fractions as show in the screen shot below.
Trace is the value below which the Stream Summary prints the Trace label string instead of a value.
The field for PPM (or PPB) can be used to set the threshold below which the letters PPM (or PPB) will appear after the affected numbers.
The Format can be used to select the format for the values. The default (if no format is specified) is general with 6 significant figures.
4. Save the new template using the button in the Template area of the Stream Summary ribbon.
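For illustration only, here is a minimal Python sketch that mimics the default thresholds described above (the actual formatting is performed by the Stream Summary itself):

def format_fraction(x):
    # thresholds assumed to be the defaults of 1e-3, 1e-6, and 1e-9
    if x < 1e-9:
        return "trace"
    if x < 1e-6:
        return f"{x * 1e9:.6g} PPB"
    if x < 1e-3:
        return f"{x * 1e6:.6g} PPM"
    return f"{x:.6g}"

for value in (0.25, 5e-4, 2e-7, 3e-10):
    print(value, "->", format_fraction(value))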
Keywords: Component fractions, stream summary
References: None |
Problem Statement: When designing a top-mounted condenser with one vapor inlet from the bottom and single liquid and vapor outlets, the error below appears in V9, V10 and V11: | Solution: If you specify a normal condenser, the program will issue a warning stating that the inlet nozzle needs to be on the top. If the inlet nozzle needs to be at the bottom, you can model it as a knockback reflux condenser. See the screenshot below.
Keywords: Top Mounted Condenser; Knockback Reflux Condenser
References: None |
Problem Statement: There are two methods for trending a calculated tag, the Ad-Hoc method and the Stored Calculations method.
This Tech Tip addresses the Stored Calculations method which is used in conjunction with the AspenCalc client. | Solution: The Aspen Calc client must be installed onto the Process Explorer client node in order to use the Stored Calculations method.
Create an ADSA data source for the Aspen Calc connection. The only ADSA Service component that should be added to this data source is the Aspen Process Data (Calc) component. This ADSA data source should be created on the Process Explorer client node as a local User Data Source.
Next, in AspenCalc you need to create an On Demand calculation. There is an On Demand button on the main frame of the AspenCalc GUI. Use this button to define your Stored Calculation. The name of this calculation will be used later as the 'tagname' of the pen inserted into the Process Explorer trend plot.
Once an On Demand calculation has been created within Aspen Calc on the local client machine, Process Explorer can trend this as if it were a standard tag. Simply insert the name of the On Demand calculation into the legend of a trend plot (or use the Tag Browser to drag and drop). The Data Source name used must simply be the name of the data source created for the Aspen Process Data (Calc) component.
Keywords:
References: None |
Problem Statement: The Aspen SQLplus task 'TSK_SQL_SERVER' will not start and exits without error. | Solution: This is usually caused by another program utilizing the default Aspen SQLplus port '10014'.
Check if the port is being utilized by another program by executing the command 'netstat -a'. Look for lines similar to the following;
TCP    <yourhost>:10014    0.0.0.0:0    LISTENING
Or
TCP    <yourhost>:sqlplus    0.0.0.0:0    LISTENING
If either of these two lines shows a LISTENING state, then you will need to configure the external task 'TSK_SQL_SERVER' to run on a different port.
To configure 'TSK_SQL_SERVER' to run on a different port.
1. Modify the 'sqlplus' entry in '%windir%\system32\drivers\etc\services' to use a different port number for example.
sqlplus 15003/tcp
Make this change to the 'services' file on all systems that connect to the Aspen InfoPlus.21 server
2. In the command line parameters for the external task add/change the parameter to 'sqlplus'
3. When configuring ODBC data sources or Query Builder clients specify 'sqlplus' as the service.
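Before editing the services file, you can verify that a candidate port is actually free. Here is a minimal Python sketch:

import socket

def port_is_free(port, host="0.0.0.0"):
    # Try to bind the port; failure means something is already listening.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

print(port_is_free(10014))  # False if SQLplus (or another program) holds it
print(port_is_free(15003))  # candidate replacement port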
After you make this change, you'll need to match the new port number in the Aspen SQLplus Service component in the Aspen Data Source Administrator (ADSA).
Go to your Aspen Data Source Administrator server, and run the ADSA Client Configuration Tool.
Select your Data Source, User or Public.
Select Edit.
Select Aspen SQLplus Service component and click on the Configure button.
Change the Port Number to 15003, as above.
Once you make the change to the Aspen SQLplus service component, you need to restart the Aspen Data Source Directory in the ADSA server for the clients to see the change.
(This solution existed previously as number 102652.)
Keywords: SQLplus port
SQLplus will not start
References: None |
Problem Statement: How do you create a Custom type reaction, available starting in V10? | Solution: Attached is an example of how to create a custom reaction where the custom term is simply a power-law rate expression, for comparison. Block RCSTR uses Reaction POWERLAW, which uses a forward and reverse power-law reaction. Reaction GENERAL has the same reaction specified as a reversible POWERLAW reaction class. Block RCUST uses reaction CUSTRXS, which specifies the same reaction as forward and reverse custom reactions using the custom term.
Reaction Rate = Power Law Rate x Custom Term
The pre-exponential factor, activation energy, and concentration exponents for the power law expression can be specified in the custom rate expression. However, for maximum flexibility, the Power Law Rate can be set to 1 by specifying
Pre-exponential Factor (k) = 1
Activation Energy (E) = 0
Concentration Exponents = 0 for each component
Then, the custom term will be the reaction rate:
Reaction Rate = Custom Term
In the Custom rate expression, constants are defined and variables are accessed to define the rate expression. You can use the custom terms defined here in other custom expressions provided that you avoid circular references. You can use the standard mathematical operators (+ - / *), ^ for exponentiation, and parentheses, as well as numeric constants. For each equation, the Status column verifies that the equation follows proper syntax.
For each constant, variable, and custom term, the name must adhere to the following rules:
Begin with a letter
Contain only letters and numbers
Not begin with ZZ
Be no more than 8 characters long
The constants are accessible variables within this reaction block, and can be manipulated in operations such as Calculator and Data Fit as a React-Var. Choose variable PARM-VALUE and in ID1 select the name of the constant.
The variable types available are the following:
Temperature
Pressure
Saturation pressure of a component
Molarity
Mole Fraction
Mass Fraction
Mass Concentration
Mole-Gamma
Partial Pressure
Pure component saturation pressure
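For illustration only (all names hypothetical): with the power-law part set to 1 as described above, you could define a constant KFWD on the constants table, reference a Molarity variable CA for the reactant, and enter the custom term as:

KFWD * CA ^ 2

This would make the reaction rate second order in the reactant molarity, with rate constant KFWD.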
Keywords: None
References: None |
Problem Statement: Vapor Pressure for component C4H11NO-1 is incorrect when using Electrolytes. | Solution: The incorrect Vapor Pressure comes from the Antoine Vapor Pressure (PLXANT) parameters for C4H11NO-1 (C4H11NO, 2-AMINO-2-METHYL-1-PROPANOL) that are in the system definition files (SDF) for electrolytes. Note that even if ElecNRTL is selected and not used, the parameters in the SDF are retrieved and used. The Pure Component databanks (PURExx) have included parameters that accurately predict the vapor pressure of this component for many releases.
This component C4H11NO-1 is in the electrolyte system (SDF) tables to support C4H12NO+ which is the result of the amine-acid reaction. The GMELCx parameters related to molecule-pair interactions are not regressed but are nominal values. It may seem as if there is no reason to keep the pure data for this species in the SDF tables; however, these pure parameters have been used by AspenTech and by customers since regressed pair parameters depend on them because the parameters in the SDF have priority over any databank by default.
To get the correct Vapor Pressure you need to uncheck the box for Require Engine to use special parameters for electrolyte method on the Setup | Calculation Options | Calculations sheet.
Another solution is to use NIST and save the data and parameters for Vapor Pressure. Once the parameters are on the forms as user input, they will be used before any other parameters.
Keywords: electrolytes
References: VSTS 21522, CQ00766463 |
Problem Statement: This knowledge base article describes the various places where debug logging can be enabled for Aspen Production Record Manager. | Solution: There are three places where various types of debug logging can be enabled for Aspen Production Record Manager.
Through the Aspen Production Record Manager Administrator:
1. Open Aspen Production Record Manager Administrator
2. Expand your data source
3. Under the Server Administration section right click on Logging
Note that when activating logging in the Administrator it takes effect immediately. There is no need to restart any Services or tools. Logging adds to the load on your system. It should be turned on to gather troubleshooting information, then immediately turned back off.
Through the BCU Server Manager:
1. Open the Aspen BCU Server Manager,
2. Click on the Configure button then click on the Logging tab.
Through the Scheduling Table for each Unit:
1. Open the Aspen Production Record Manager BCU Administrator
2. Go to Server | Scheduling Table
3. Right click on a Unit then select Disable (the logging options cannot be changed while the Unit is enabled)
4. Right click on the Unit again then select Scheduling Properties
5. Click on the Logging tab
When running version prior to V7.1, all log files are written to the following folder:
<Drive>:\Program Files\AspenTech\Batch.21\Data\Log
For V7.1 and later:
When Aspen Production Record Manager is running on Windows 2003 Server the log files are located under:
<Drive>:\Documents and Settings\All Users\Application Data\AspenTech\DiagnosticLogs\ProductionRecordManager
When running Aspen Production Record Manager server on Windows 2008 server (both 32-bit and 64-bit) the log files can be found under:
<Drive>:\ProgramData\AspenTech\DiagnosticLogs\ProductionRecordManager
The general log file for Aspen Production Record Manager activity is called Batch21Services.AtlServer.<Username>.log. This is available from the above-noted folders, but can also be accessed directly from the Aspen Production Record Manager Administrator tool, by right-clicking on Logging and choosing Show Log . . .
Note: Starting with V8.5, enabling logging no longer makes entries in the Windows registry. Instead, the Production Record Manager.Profile.xml file found in C:\ProgramData\AspenTech\Production Record Manager is updated. Logging enabled through the Aspen Production Record Manager Administrator and the BCU Server Manager is recorded under AltServer Scheduling and BcuServer Scheduling respectively. Additional logging, such as for the Aspen Batch Extractor, can be enabled by manually editing the XML file in Notepad.
Keywords: Logging
Aspen Production Record Manager
Batch.21
BCU
References: None |
Problem Statement: The data regression feature of Aspen Plus allows you to fit time-profile data in RBATCH or axial-profile data in RPLUG. Several data types are supported. Component attribute profiles are not one of the data types supported by the profile-data feature. | Solution: See solution document 111707. Click this link to go to the download page:
http://support.aspentech.com/webteamcgi/SolutionDisplay_view.cgi?key=111707
Keywords: RPLUG
RBATCH
data regression
data fit*
Component attribute
comp-attr
user prop-set property
property profile*
attribute profile*
CAT
UPP
References: None |
Problem Statement: Is there a way to change unit sets without the stream and block inputs changing as well? I would like only the calculated results to change units. | Solution: When you change the Global unit of measure, all of the input units of measure change and values are converted. If you then go to a form and manually change a value and unit of measure, it will stay with that specified unit of measure until there is a global change.
For output units, it is possible to specify the units of measure on the Display Options tab of the Edit Stream Summary Template for the stream properties.
Templates should be saved using the Save as New button on the Template section of the Stream Summary ribbon.
In the attached file, FULL_V8_SI uses SI units of measure, FULL_V8_ENG used ENG units, and FULL_V8_MET uses MET units.
Keywords: None
References: None |
Problem Statement: Error HTTP Error 403.14 - Forbidden The Web server is configured to not list the contents of this directory. when Launching Aspen Mtell System Manager
Detailed Error Information when browsing http://localhost:80/AspenTech/AspenMtell/InteropServer/MIMOSA/:
Module DirectoryListingModule
Notification ExecuteRequestHandler
Handler StaticFile
Error Code 0x00000000
Requested URL http://localhost:80/AspenTech/AspenMtell/InteropServer/MIMOSA/
Physical Path C:\inetpub\wwwroot\AspenTech\AspenMtell\InteropServer\MIMOSA\
Logon Method Anonymous
Logon User Anonymous | Solution: This article outlines the instruction to address IIS Error HTTP Error 403.14 - Forbidden The Web server is configured to not list the contents of this directory. when Launching Aspen Mtell System Manager.
Enable Directory Browsing on IIS:
1. Click Start and type inetmgr to launch Internet Information Services (IIS) Manager
2. Expand Sites -> Default Web Site -> AspenTech -> AspenMtell -> InteropServer and select MIMOSA
3. Double click Directory Browsing
4. Click Enable
Restart IIS to apply this changes:
1. Launch CMD as Administrator
2. Type: iisreset
Keywords: HTTP 403, IIS error
References: None |
Problem Statement: Properties for streams in an Aspen PIMS model are important data pieces in driving the optimization. There are multiple sources for properties in Aspen PIMS and multiple ways to handle properties.
Here is a summary of how properties are defined, what tables are used for that purpose and the order of precedence if multiple sources are found for the same property for a stream. | Solution: There are two main types of properties: static and calculated. The tables involved in property calculations of both types are summarized below:
Static Properties Tables: BLNPROP, ASSAYS, BUY, UNITS
Calculated Properties Tables: PGUESS, PCALC, INDEX and the Property Calculation Formula facility, ABML (Aspen Blending Model Library)
Static properties
This type of properties are used when we can assume that they are independent of the operation of the process units. Here is a short description of each of the main tables and their common uses. For a detailed description of the tables, please refer to the Aspen PIMS Help file.
Table BLNPROP
This is the main table for static properties. It is called BLNPROP because most of the streams that get their properties through this table are Blending Components. It is common to attach several Excel files to this table, to segregate the data by stream type, e.g. Gasoline components, Fuel Oil components, Diesel components, etc.
Table ASSAYS
This table provides crude assay data for the model in the form of crude cut yields and crude cut properties. The properties of each cut for each crude (i.e. Light Naphtha from crude ARL) are assumed to be static; the crude cuts, however, are pools where each crude contributes its fraction of the total, therefore the properties of the crude cuts should be recursed, i.e. their properties will be calculated.
Table BUY
This table allows you to provide SPG and API data for feedstocks, mainly for purchased crudes, so that the reports show the correct volume-to-weight conversions. No other properties can be entered here.
Table UNITS
This table allows you to provide SPG and API data for streams. We recommend using tables BLNPROP or BUY instead to define these properties.
Calculated properties
The most common way of calculating properties in PIMS is by using recursion. Solution 103882 describes this technique. While the main purpose of non-linear equations is not necessarily to define properties, it is possible to use them for this. Details of this method are not discussed in this article.
Table PGUESS
Table PGUESS is used to identify which stream properties are to be recursed and to provide initial estimates for the property values.
For crude cuts identified in table ASSAYS, an entry in table PGUESS provides the initial value for the recursed pool and triggers the creation of the necessary recursion structure. A 999 can be used as an initial value in these cases.
For pools created outside of the crude units, the user must provide the recursion structure. The value entered in PGUESS is the initial estimate for the recursion.
Blends can also be recursed by entering an initial estimate for the required property. The recursion structure is built automatically by PIMS.
Table PCALC and PCALCB
Use the PCALC table, if distributive recursion is used, to calculate properties of a target stream (e.g. stream TRG) in terms of properties of a recursed source stream (e.g. SRC). The following linear equation is used, where A is the factor entered in table PCALC:
P(TRG) = A * P(SRC) + B
Table PCALCB can be used to provide the B factor in the same equation.
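Assuming the linear form above, here is a one-line Python illustration (values hypothetical):

def pcalc_property(src_value, A, B=0.0):
    # A from table PCALC, B from table PCALCB
    return A * src_value + B

print(pcalc_property(0.85, A=1.02, B=0.01))  # 0.877, e.g. an SPG-like property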
Table INDEX and Property Calculation Formula facility
Property indices are one way of modelling non-linear blending of properties. The index is defined in such a way that it blends linearly in volume. At report time, the original property is reported in the blending report based on the information provided in table INDEX in the form of smooth curves defined by sets of coordinates.
The Property Calculation Formula facility is used for the same purpose, except that the curve is provided explicitly instead of just providing discrete points as in table INDEX.
Table ABML (Aspen Blending Model Library)
This library provides a set of linear and non-linear blending correlations to use for blending specifications. Solution 121223 provides a good introduction on how to use this table.
Order of Precedence when redundant properties are found
It is best to remove redundant property values from your model. In case there are more than one source of data for a property for a stream, the following precedence order applies:
1. Recursed
Property values resulting from recursion structure are always used over redundant properties in other tables.
2. BLNPROP
Property values in table BLNPROP override redundant property values in tables ASSAYS, BUY, and UNITS.
3. ASSAYS
Property values in table ASSAYS override redundant property values in tables BUY and UNITS.
4. BUY
Property values in table BUY override redundant property values in table UNITS.
5. UNITS
The property values in table UNITS override redundant property values in table PCALC.
6. PCALC
Property values in table PCALC are only used if they do not appear anywhere else.
Keywords: Property
Properties
Static, Recursed
References: None |
Problem Statement: How do I model sour water stripping with a pumparound system in Aspen Plus without mass balance errors? | Solution: To reduce convergence issues due to the presence of ions in the system, the user may consider the apparent component approach instead of the true component approach. With this, the material balance is done based on components and not on ions, which helps resolve material balance issues for ions in complex systems without affecting the results.
If the user would like to modify any pumparound values, small increments or decrements will help the simulation converge more easily (i.e. if the recycle flow is 10000 kg/hr, raise it to 15000 kg/hr rather than directly to 30000 or 40000 kg/hr).
Keywords: None
References: None |
Problem Statement: How to stop Aspen Plus from showing the message "You are opening a file from network drive, would you like to open a copy locally?" when launching a model from a network location. | Solution: Users who do not want to see the above message can select the "Don't show this message again" check box to stop seeing this message when opening a model from a network location.
To roll out this setting to all users' machines, apply the registry setting below on each machine.
Windows Registry Editor Version 5.00
[HKEY_CURRENT_USER\Software\AspenTech\Aspen Plus\36.0\mmgini]
ndcaseopendontshowmsgagain=dword:00000001
You can also download the attached registry key and apply it on each user's machine.
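As a sketch of how this could be scripted for rollout (the file name NetworkMsgSetting.reg is a hypothetical placeholder; save the four registry lines above into it first), the setting can be applied from a command prompt with:

reg import NetworkMsgSetting.reg

Because the key is under HKEY_CURRENT_USER, the import must run in each user's context (for example via a logon script). Note also that the registry path contains the version-specific key 36.0, so the .reg file may need adjusting for other Aspen Plus versions.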
Keywords: network location
do not show this message
multiple users settings
References: None |
Problem Statement: When trying to duplicate an existing record (or creating a new record based on a definition record), the ensuing dialog box has the Name field grayed out and an ID of zero. | Solution: The problem is most likely because there are no Undefined records available. To check this, right-click on the data source name in the Aspen InfoPlus.21 Administrator and select Properties. Then select the Record Utilization tab of the server Properties dialog. If the number of Undefined records is zero, increase the number of Total records (the only field that allows input). This will then increase the number of Undefined records and allow the creation of new records. Note that you do NOT have to stop/restart the IP.21 database for these changes to take effect.
Keywords: duplicate
duplicating
create
creating
References: None |
Problem Statement: Sometimes it may be necessary to view Aspen InfoPlus.21 history data that is older than the earliest on-line file set (and therefore not visible to tools like Process Explorer). For example, there may be 6 file sets available, each holding 1 month of data, and there is a need to view data from 8 months ago. How can one make this older data available for viewing? | Solution: If there is a need to make historical data which is not currently online available to be viewed, it will be necessary to mark at least one file set as 'Reserved' (and NOT 'Mounted'). This could be done to an existing file set, or a new one could be created with those settings. Making it 'Reserved' and NOT 'Mounted' will prevent the file set from being shifted into, leaving it available to have old data copied into it. In the example above, it would be necessary to add a 7th file set and mark that file set as 'Reserved'.
To restore the old data and make it available to users, copy the old file set contents, which consist of these three files:
arc.dat arc.byte arc.key
from its backup location into the directory pointed to by the reserved file set. Do not leave any of these three files out. Next, change the file set properties to 'Mounted'. The reserved file set will now be marked as 'Mounted/Reserved' and the old data will be accessible to users.
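As an illustration, the copy could be done from a command prompt (the paths below are hypothetical placeholders; substitute your actual backup location and the directory shown in the file set's Properties dialog):

copy D:\ArchiveBackup\arc12\arc.dat E:\InfoPlus.21\history\arc12\
copy D:\ArchiveBackup\arc12\arc.byte E:\InfoPlus.21\history\arc12\
copy D:\ArchiveBackup\arc12\arc.key E:\InfoPlus.21\history\arc12\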
On rare occasions, marking the fileset as 'Mounted' results in an error, Disk history write error 0. If this error is received, restart the Aspen InfoPlus.21 database. After the restart the old data will be accessible.
Note: From the Aspen InfoPlus.21 Administrator one can right-click on the file sets and select 'Properties' in order to be able to change the 'Mounted/Dismounted', 'Read-Only', and 'Reserved' status. This Properties dialog also displays the location of the fileset in the Windows directory structure.
Keywords: reserved
References: None |
Problem Statement: When running Aspen Production Record Manager (APRM) Excel COM Add-in on a client machine, users may sometimes receive the following error message:
Application Error in dlgSelectDataSource FillAreaList
This knowledge base article shows how to resolve this error. | Solution: The Aspen Production Record Manager Excel COM Add-in relies on DCOM to fetch data from the APRM server. The users running the Add-in on their client machines must be properly authenticated within DCOM. The authentication is achieved by adding the Authenticated Users group to the Distributed COM Users group in Computer Management on the APRM server.
Additionally, please ensure that the Distributed COM Users group on the APRM server is added to both 'Access Permissions' and 'Launch and Activation Permissions' in DCOM.
The steps are described in Aspen KB article 121762.
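As a convenience, the same group membership change can be made from an elevated command prompt on the APRM server (a sketch; the Computer Management steps in the referenced article remain the authoritative procedure):

net localgroup "Distributed COM Users" "Authenticated Users" /add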
Keywords: MES Addin
References: None |
Problem Statement: This Knowledge Base article provides steps to resolve the following error:
Connect Error: 0x800706BF
which may be encountered while using the Atpm Client Test Tool. | Solution: The above-mentioned error message may be related to DCOM permissions on the computer reporting the error.
In all recent releases of MS Windows operating systems, Microsoft has made a number of changes to enhance DCOM security:
1) The Windows firewall
2) Limits that can be configured to prevent a COM application from programmatically requesting unlimited access to the system
3) Local and remote access can be specified for launch of and access to COM objects per user
The following steps resolved the error on most systems in the past:
1) Turn the Windows firewall off
2) Click Start | Run | type DCOMCNFG and make the following changes:
A -- On the Default Properties tab make sure that Enable Distributed COM on this Computer is checked
B -- Set Default Authentication Level to None
C -- Set Default Impersonation Level to Identify
3) On the COM Security tab
A -- In the Access Permissions group box
-- Press Edit Limits and give Administrators, Anonymous login, Distributed COM Users, and Everyone local and remote access permissions
-- Press Edit Default and give local and remote access permissions to Administrators, Distributed COM Users, Anonymous login, Interactive, System, and Network
B -- In the Launch and Activation Permissions group box
-- Press Edit Limits and give Administrators, Anonymous login, Distributed COM Users, and Everyone remote and local launch and activation permissions (allow all four check boxes)
-- Press Edit Default and give remote and local launch and activation permissions (allow all four check boxes) to Administrators, Distributed COM Users, Anonymous login, Interactive, System, Network, IUSR_<computername>, and IWAM_<computername>
4) Save your changes (click Apply and OK in each case) and exit DCOMCNFG
5) On your desktop Right click on My Computer, select Manage, Local Users and Groups, Groups and double-click on Distributed COM Users group
6) Add Authenticated Users group
7) Save your changes (Apply and OK) and exit Computer Management
8) Reboot your computer. If that's not possible then any related services, such as Aspen Enterprise Model Server and IIS, that were previously running need to be restarted to pick up these changes.
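For steps 1 and 6 above, equivalent elevated command-prompt commands are sketched below. The netsh advfirewall syntax assumes Windows Vista/Server 2008 or later (older systems use the netsh firewall context), and disabling the firewall should only be done in accordance with your site's security policy:

netsh advfirewall set allprofiles state off
net localgroup "Distributed COM Users" "Authenticated Users" /add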
Keywords: AtpmClient,
AtpmClientTest
References: None |
Problem Statement: How do the different License States impact the functionality of Aspen Production Record Manager V8 and higher? | Solution: This knowledge base article describes the various License States for Aspen Production Record Manager and the impact in functionality for each state.
License Granted: occurs when a license has been granted successfully. In License Granted state:
· Aspen Production Record Manager provides complete functionality.
· No licensing messages are displayed.
License Denied: occurs when either the license server denies the request checkout or the license server cannot be contacted and the license grace period, if any, has elapsed. In License Denied state:
· Aspen Production Record Manager will not allow any new configurations.
· Participating client applications cannot connect to the Aspen Production Record Manager Server.
License Timeout Period: occurs if the license has been granted but the Aspen Production Record Manager Server is no longer able to contact the license server. If contact with the license server is not restored, then the License Timeout period can last for a timeout period specified in the license key. The default timeout period is 900 minutes (15 hours). In License Timeout Period state:
· Aspen Production Record Manager provides complete functionality for a timeout grace period (15 hours).
Extended Grace Period: occurs if the license has been granted at least once, the license server cannot be contacted and the license timeout, if any, has elapsed. In Extended Grace Period state:
· Aspen Production Record Manager will continue to run, with reduced functionality, for an extended grace period (30 days).
· Aspen Production Record Manager will not allow any new configurations.
NOTE: The license behavior may differ with other MES products. Click on the product below to review its license behavior.
InfoPlus.21
Process Explorer
aspenONE Process Explorer
CIM-IO
Keywords: APRM, License Behavior, License State, Granted, Timeout, Grace Period, Denied
References: None |
Problem Statement: In some situations, it may be desirable to determine the oldest allowable timestamps for InfoPlus.21 tags. | Solution: Aspen SQLplus can access the xoldestok utility to return the desired information. For more information on xoldestok, consult knowledge base item 103040 using the link below.
http://support.aspentech.com/webteamasp/KB.asp?ID=103040
The following SQLplus query can be used to return the timestamp value used by xoldestok. In this query, timestamp information will be returned for the tag ATCAI from the demo database.
set output 't.inp';
write 'atcai 1 ip_trend_time', '', '';
set output default;
select substring(2 of line)||' '||substring(3 of line)
from (system '%SETCIMCODE%\xoldestok <t.inp')
where substring(line from 1 for 2) = 'is';
The XOLDESTOK function in SQLplus works the same as the xoldestok command-line utility and the HISOLDESTOK API function. It can be issued directly from SQLplus and takes two parameters: the first is the record and field to be checked; the optional second parameter specifies a new oldest timestamp.
The following example returns the oldest time stamp for atcai IP_TREND_VALUE.
WRITE XOLDESTOK('atcai 1 IP_TREND_VALUE ');
The following example sets the oldest timestamp for atcai IP_TREND_VALUE to be 7-may-03, 10:20.
XOLDESTOK('atcai 1 IP_TREND_VALUE ','7-may-03 10:20');
Keywords: date
xoldestok
timestamp
References: None |
Problem Statement: Can Aspen Plus handle azeotropes of more than 3 components? | Solution: Yes, Aspen Plus can handle azeotropes of more than 3 components using Aspen Distillation Synthesis. A quick search on the internet uncovered the acetone-chloroform-methanol-ethanol-benzene system which has a number of azeotropes including one quaternary azeotrope. This will be used as an example. The file is attached.
In Aspen Plus, define the components you want to search for azeotropes, i.e., the components mentioned above:
Choose an appropriate property method, e.g., NRTL. To do the azeotrope search, first, in the Properties environment, click on either Ternary Diag or Residue Curves in the Analysis section of the Home ribbon. Then, click on Find Azeotropes.
On the Azeotrope Search | Input form check the boxes for your defined components.
The results for the found azeotropes will be displayed after clicking on Output | Singular Points. The temperature units and basis (mole or mass) can be modified. The grid can be copied and pasted to a spreadsheet.
A report will be displayed after clicking on Output | Report. The report can be saved as a .html file.
NOTE: This feature requires an additional license for launching the Azeotrope Search Feature.
Keywords: Components, azeotropes.
References: None |
Problem Statement: The Test API (sometimes called the T-A-P-I utility) is a valuable tool to assist in troubleshooting CIM-IO problems. This document explains the proper procedure for running the utility. | Solution: The utility can be found by typing Test API in the Windows Search box.
If you have difficulty locating the utility, simply search for the cimio_t_api.exe file, and run it.
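For example, assuming a default installation root of C:\Program Files (x86)\AspenTech (adjust this folder to match your actual installation), the executable can be located from a command prompt with:

where /r "C:\Program Files (x86)\AspenTech" cimio_t_api.exe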
If you suspect a problem with CIM-IO (e.g. data loss, no new data coming in, etc.), first check whether data is coming into the DCS or PLC. If no new data is coming into the DCS or PLC, you will have to solve that problem yourself or with your DCS/OPC server vendor before contacting AspenTech Support.
If there's no problem with the DCS/PLC, run the Test API on the CIM-IO server. This will verify whether the CIM-IO interface is able to retrieve data from the DCS/PLC successfully. The procedure to perform the test is as follows:
At the Test API menu screen, press 9 and hit Enter to start the Get test.
(Notice that at every user prompt on the following screen, there is a value within square brackets (e.g. Please enter unit number [1]: ). The value in brackets is the default value, and you can simply hit Enter to accept that value instead of typing it again. This value is set to the last (most recent) value which was entered at that prompt.)
When prompted for the logical device name, type in the logical device name and hit Enter. You can also find the logical device name in the cimio_logical_devices.def file. If there is more than 1 logical device listed, select the one that corresponds to the nodename of the CIMIO server you are at.
Type 1 and hit Enter when prompted for the unit number.
Type 1 and hit Enter when prompted for the number of tags.
Type 1 and hit Enter when prompted for the priority.
Type 10 and hit Enter when prompted for the timeout value.
Type 1 and hit Enter when prompted for the access type.
Type 100 and hit Enter when prompted for the frequency.
Type -1 and hit Enter when prompted for the list id. This will refresh the list of tags to be scanned by test API.
Type 1 and hit Enter at the Tagname entry options, to enter the tagname one at a time.
Type in the name of the DCS/PLC tag that you wish to read from, and hit Enter.
Select the correct data type for the tag and hit Enter.
Hit Enter for the Device data type.
After that, the results of the Get test will appear. Regardless of the result, you should repeat the test but provide the positive version of the number you used for the list id (e.g. if previously you used -1, this time use 1); this will cause the list created previously to be reused. Note any error codes and messages which may appear, and send them to AspenTech Support for analysis.
If the Get test is successful, then the problem may lie with the InfoPlus.21 server. To verify this, you should run the Test API again, this time, on the InfoPlus.21 server. Just follow the same procedure as described above. The tagname to be read should again be the DCS/PLC tagname, with the same logical device name. Send any error codes and messages to AspenTech Support for analysis.
If you have problems writing a value to the DCS/PLC, you can also use this utility to do a Put test. Simply select 'a' and hit Enter at the test API menu screen.
To exit the test API utility, hit Enter to return to the main menu, and type 'x' and Enter to exit.
For further information and more details on each cimio_t_api choice, please refer to the CIM-IO Users Guide, which devotes an entire chapter to more detailed information about the Test API utility.
Keywords: cimio_t_api
interface
test API
CIMIO
References: None |
Problem Statement: This knowledge base article describes how to perform basic checks to ensure that Aspen InfoPlus.21 has been installed successfully. | Solution: Checklist to verify the successful installation of Aspen InfoPlus.21
1. Aspen MES Family installation should complete with no error messages.
2. Start the Aspen InfoPlus.21 database from the Start menu. Verify that the InfoPlus.21 Started Successfully message appears at the bottom of the Aspen InfoPlus.21 Manager screen.
3. Verify that the following services are listed and running in Windows Services (a quick command-line check is shown after this list):
Aspen CIM-IO Manager
Aspen Data Source Directory
Aspen InfoPlus.21 Access Service
Aspen InfoPlus.21 Task Service
Aspen Process Data Service
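As a sketch, the running Aspen services can be listed from a command prompt; net start prints started services by display name, and findstr filters the output for those containing Aspen:

net start | findstr /i "Aspen"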
4. Launch Aspen Process Explorer. Click the Trend button and trend any historian tag. Verify that the trend is visible and new values are plotted.
5. Use Aspen InfoPlus.21 Health Monitor to verify InfoPlus.21 System health is good.
Launch Aspen InfoPlus.21 Administrator
Right click on InfoPlus.21 and select Aspen InfoPlus.21 Health Monitor.
Select your Aspen InfoPlus.21 System from the list then verify all tests show the green success indicator.
Keywords: IP21 validation
verify database
verify after install
check IP21 system
References: None |
Problem Statement: This Knowledge Base article explains new features available in Aspen Data Source Architecture (ADSA) V9.0. | Solution: The following dialog allows the user to configure the way ADSA information is accessed on a client computer.
The first tab, ADSA, contains the following User and System settings:
· Use Public Data Sources checkbox: When checked, Public Data Sources will be used. When unchecked, User Data Sources will be used. If both User Data Source and Public Data Source are configured under User Settings, the User Data Source will be used.
· Directory Server: Specifies which directory server to use.
· Protocol: Specifies which protocol to use (DCOM or Web Service).
· Timeout: Specifies the timeout in seconds.
There are two sets of these configuration parameters, one for a particular user and one for the whole system. If the configuration parameters for a particular user are set, they will be used when that user requests ADSA information. Otherwise, the system configuration parameters will be used. When a non-privileged user runs the ADSA Client Config tool, the System Settings section is grayed out. This is also true for a privileged user who runs the ADSA Client Config tool without elevating it to Administrator (Run as Administrator).
Note that these two sets of configuration parameters already existed in previous versions of ADSA; however, they were not explicitly shown in the dialog, which caused a lot of confusion.
The second tab, Configuration, allows the user to configure the Public and/or User data sources. This works the same as before.
Keywords: None
References: None |
Problem Statement: Is it possible to adjust some parameter to achieve a desired heat of reaction? | Solution: Attached is an example for adjusting the standard enthalpy of formation of a component to achieve a desired heat of reaction. It is possible to set the heat of reaction in an RStoic block (see Solution 3025), but adjusting the heat of formation keeps all of the enthalpy values consistent, since the heat of formation is used to calculate both stream enthalpies and the heat of reaction. When the heat of reaction is set in the block, the outlet stream enthalpy, as calculated from the heats of formation, etc., will not be consistent with the specified reactor duty unless the heat duty is set to zero.
For more information see the Aspen Plus Help topic Simulation and Analysis Tools -> Sequential Modular Flowsheeting Tools -> Design Specifications: Feedback Control.
See file - Dhform.bkp
The heat of reaction for the hydrogenation of ethylene is known to be -32700 cal/mol at 298 K. Aspen Plus predicts a value of -32570. Since it is possible to access physical property parameters (see the Aspen Plus Help topic Simulation and Analysis Tools -> Sequential Modular Flowsheeting Tools -> Accessing Flowsheeting Variables), a design specification is used to adjust the Standard Enthalpy of Formation to achieve the desired heat of reaction. In Aspen Plus, the heat of reaction is calculated as the difference in enthalpy of the pure components. Since the Standard Enthalpy of Formation (unary parameter DHFORM) is used to calculate vapor and liquid enthalpies, adjusting DHFORM will similarly adjust the heat of reaction in any other Aspen Plus reactor block.
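The thermochemical identity the design specification exploits is worth stating explicitly. For the ethylene hydrogenation example (C2H4 + H2 -> C2H6), the standard heat of reaction at 298 K is

$$\Delta H^{\circ}_{rxn} = \Delta H^{\circ}_{f}(\mathrm{C_2H_6}) - \Delta H^{\circ}_{f}(\mathrm{C_2H_4}) - \Delta H^{\circ}_{f}(\mathrm{H_2})$$

so shifting DHFORM of one participating component shifts the computed heat of reaction by that amount times its stoichiometric coefficient. In this example, lowering DHFORM of ethane by 130 cal/mol would in principle move the predicted value from -32570 to the target -32700 cal/mol, which is what the design specification converges to automatically.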
Keywords: dhform
References: None |
Problem Statement: How to upgrade an existing Aspen Petroleum Scheduler / Aspen Refinery Multi-Blend Optimizer / Olefins Scheduler database (Access, SQL or Oracle) | Solution: Use this procedure to migrate an existing Aspen Petroleum Scheduler (APS) / Aspen Refinery Multi-Blend Optimizer (MBO) / Olefins Scheduler database:
1. Before updating your existing model it is advised to make a backup of the current model.
2. Run DBUpdate.exe. This program is located in the folder where APS, MBO or Olefins Scheduler is installed, typically C:\Program Files\AspenTech\Aspen Petroleum Scheduler.
3. Select the TYPE of database to be updated: APS, MBO or Olefins Scheduler.
4. Enter the path for the correct DSN file (for SQL Server or Oracle) or the correct database file (for Access) for the database you want to update in the Client Model field.
If you store your database tables in multiple Access databases (e.g., Assays, Model, Schedule, Results), select the Multiple Database Model option and then complete the following fields:
Assays: Enter the location of the input tables that store assay data.
Model: Enter the location of the input tables that store model data.
Schedule: Enter the location of the input tables that store scheduling data.
Results: Enter the location of the output tables that store published data.
5. As a good practice, before updating the database, click Validate to compare your current client database to the definition database and identify differences (this validates the structure of the current database). This step is optional.
6. For Access based models skip to step 11.
7. Click Update to generate an update script. The script is placed in the Aspen Petroleum Scheduler/MBO/Olefins Scheduler working folder and is called OrionDbUpdate.sql, or it is given the same name as the DSN file with a .sql extension. The output window of the DBUpdate program displays the exact location of this file. The script should be saved in the working folder.
8. Run the script against the desired database (an example command is shown after these steps). This typically requires administrator rights to the database, so this step must be performed by a database administrator. The administrator must be logged in as a DB owner (for SQL Server) for the script to work properly.
9. Run DBUpdate.exe again if you closed it. Otherwise go to step 10.
10. Click Update from the DBUpdate program menu.
11. Select the Migrate Data to Normalized Tables on Update option (applies only if you are upgrading a database from V2004.1 or earlier to V2006.5 or V7.3). Select this option to always migrate UNITS data to the ATORIONUnits table. If this option is not selected, UNITS data will only be migrated to the ATORIONUnits table if a corresponding record does not exist. If UNITS data already exists in the ATORIONUnits table, UNITS information will not be migrated on update.
12. Click Update to migrate the data for the UNITS, PARAMS, and EVENTS tables (applies only if you are upgrading a database from V2004.1 or earlier to V2006.5 or V7.3). Select this option to always migrate the EVENTS data to the ATORIONEvents table. If this option is not selected, EVENTS data will only be migrated to the ATORIONEvents table if a corresponding record does not exist. If EVENTS data already exists in the ATORIONEvents table, EVENTS information will not be migrated on update.
13. When the validation has completed its run, you will be prompted to run the update again. This starts the Pipeline Conversion Wizard, which walks through the conversion of previously configured pipeline data to the new framework for the pipeline scheduling functionality. Before running the update, select the Migrate Pipeline tables on Update check-box in the DBUpdate window. Applies only if you are upgrading a database from a version earlier than V8.7.
14. Click Update ATORIONEventTanks FRAC for weight based events. Select this option to convert FRAC values in the ATOrionEventTanks table from VOL to WGT. This applies to the following WGT-based events:
Crude Receipts
Pipeline Crude Receipts
Pipeline Shipments.
The model should now be ready to run in the newer version.
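As an illustration of step 8 for a SQL Server database, the update script can be executed with Microsoft's sqlcmd utility. This is a sketch, not the only supported method; the server name MySqlServer and database name APSModel are hypothetical placeholders, and a DBA with db_owner rights should run it:

sqlcmd -S MySqlServer -d APSModel -i OrionDbUpdate.sql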
Keywords: DBUpdate program, Access, Oracle server, Update, Convert Database
References: None |
Problem Statement: How to verify Aspen InfoPlus.21 Desktop applications were installed successfully? | Solution: Checklist to verify the successful installation of Aspen InfoPlus.21 Desktop Application.
1. Validate Aspen Process Explorer
Launch Aspen Process Explorer.
Click the Trend button and trend any historian tag.
Verify that the trend is visible and new values are plotted.
2. Validate Aspen Process Data Excel Addins
Launch Excel and select Aspen Process Data tab
Click Current Value button
Type the Tag name
Select Data Source
Click Apply and OK button.
Values for the tag will show up in the cells.
3. Validate Aspen SQLPlus
Launch Aspen SQLPlus
Type: Select * from IP_AnalogDef and click Execute button
It will display all the tags in the IP_AnalogDef definition record.
4. Validate Aspen Calc
Launch Aspen Calc
Select your calculation
Right click the calculation and select Execute Calculation
The values will get updated.
Keywords: Validate SQLplus
validate Calc
validate Excel addin
validate process explorer
References: None |
Problem Statement: How can I efficiently edit numerous non-linear equations? | Solution: In Aspen PIMS-AO V11, users can now import and export their PIMS-AO non-linear equations from/to Excel. This allows for easy editing as Excel functions like “Find and Replace” can be used to quickly modify many equations.
To export existing nonlinear equations into Excel, Right click on Non-Linear Equations and select Export to Excel as shown below
You will be prompted for your desired file name, and an Excel file will be created that has not only the equations but also additional information about them. Equation groups and variables are also listed in the file; an excerpt is shown below.
You can now modify, add or delete equations as desired. Once you are finished, simply import the updated file and the PIMS model tree will be updated accordingly.
NOTE: When an Excel file is imported for non-linear equations, it will replace ALL existing equations on the tree. When adding a new equation, export the existing equations first and then add the new equation to the file before importing to ensure that existing equations are retained.
Keywords: None
References: None |
Problem Statement: In PIMS-AO when a single case is run, the iteration details are shown in the execution log. However when multiple cases are run on multiple processors, I do not see these details. I just see the final objective function for each case. Where can I see the detailed iteration information for cases run in parallel? | Solution: PIMS-AO creates files called XSLP_Control1.log, XSLP_Control2.log, etc. There is a file for each processor that ran cases. Inside those files are the iteration details for the cases run on that processor. These files are created in the model directory.
Keywords: None
References: None |
Problem Statement: How do I turn on the Swing Cut Gradients option and what does it do? | Solution: PIMS-AO has an option called Swing Cut Gradients that alters how PIMS handles crude unit swing cuts. Traditional structure allows PIMS to optimize how much of a swing cut goes up or down. The entire swing cut has one set of properties, and those constant properties are used to adjust the streams above and below as appropriate for the swing allocation.
With Swing Cut Gradients active, PIMS adjusts the properties of the up and down portions of the swing cut. Additional information may be required in Table ASSAYS to support this because PIMS uses the cut points of the streams around the swing cut to determine the gradient for a property. Other than ensuring the required data is in ASSAYS, the only thing necessary is to turn on the feature in the model settings as shown below.
Keywords: None
References: None |
Problem Statement: In Aspen Plus, it is possible to select units such as Mcum/hr and Mcuft/hr. However, the M for the first one stands for Million, while for the second it stands for kilo, because million would be MM. How do you know whether M stands for million or thousand? | Solution: M is Used as a prefix meaning thousand with English units, such as Mlb (thousand pounds) and Mscf (thousand standard cubic feet). With metric/SI units, m and M prefixes have their standard metric/SI meanings.
For more information see the help topic Aspen Plus Units Abbreviations.
Prefixes and suffixes
Symbol: Meaning
sq: Square. Used primarily with length units to represent area, such as sqft (square feet) and sqm (square meters).
cu: Cubic. Used with length units to represent volume, such as cuft (cubic feet) and cum (cubic meters). The standard abbreviation cc is used for cubic centimeters.
**.5: Suffix meaning square root of the preceding unit. Used in a few types of units such as dipole moment.
M: Used as a prefix meaning thousand with English units, such as Mlb (thousand pounds) and Mscf (thousand standard cubic feet). With metric/SI units, m and M prefixes have their standard metric/SI meanings.
MM: Used as a prefix meaning million.
G: The standard metric/SI prefix giga, meaning 10^9.
delta: Prefix on units used for temperature change, used for emphasizing that a value is a temperature change and not a temperature. Units without delta can also be specified for temperature change and are equivalent to the ones with delta.
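For example, 5 Mcuft/hr is 5,000 cuft/hr and 5 MMscfd is 5,000,000 scf per day, since cuft and scf are English units (M = thousand, MM = million); however, 5 Mcum/hr is 5,000,000 cum/hr, since cum is a metric unit and the M prefix takes its standard SI meaning of mega (10^6).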
Keywords: units of measure
References: : VSTS 426889 |
Problem Statement: When specifying the Particle Size Distribution (PSD) for a solid component in an inlet stream, the Conventional Inert CI Solid tab still appears as incomplete and shows the error:
SUBS-ATTR not complete: SATTSUB = CIPSD
Upper limit of last interval of selected PSD mesh cannot be greater than upper limit of last interval of PSD mesh for this substream (Setup Solids Substreams). | Solution: In some simulations, you might want to have two or more particle size distribution definitions, with different size ranges. This is useful if different sections in your flowsheet have very different particle sizes. You can add and modify particle size distributions using the Setup | Solids | PSD sheet, or directly on the form for a stream.
On the CI Solid or NC Solid sheet of a stream containing solids with a particle size distribution, in the Particle Size Distribution section, you can select what particle size distribution to use for the substream. You can select to define a new particle size distribution and click Edit Mesh to modify the size limits of the intervals in the PSD.
If you use a different mesh than the one defined in this stream's stream class, the data you enter are mapped into the mesh for the stream class. If any particles fall outside the limits of the mesh for the stream class, a warning will be generated. If the fractions do not add to 1 (within a tolerance) after mapping, an error will be generated.
If you select a PSD mesh for a component in the CI Solid sheet of a stream as follows:
Then, consistent with the error message, the upper limit of the last interval of this PSD mesh must not be greater than the upper limit of the last interval of the mesh selected in the Solids | Substreams menu:
Keywords: solid
conventional
CI Solid
References: None |
Problem Statement: How to calculate a Utility Balance that changes with the Total Feed Rate? | Solution: If working with PIMS-DR, there is an example in our Volume Sample PIMS model where the user has to calculate the consumption of energy UBALKWH (Power, KWH) according to the total feed rate in the Cat Cracking Unit (SCCUTOT), because this utility consumption changes with the amount of total feed to the unit. The column TOT located in table SCCU represents the total feed to the unit (please refer to the SCCU table located in the Volume Sample PIMS model for more details).
Therefore, to calculate UBALKWH, the user needs to go to table NONLIN and identify that this variable will be calculated through a non-linear relationship; consequently, PIMS will use the data located in table CURVE for that purpose.
1.- The user defines this non-linear relationship in Table NONLIN.
2.- The independent variable (Cat Cracking Unit feed, XCCU) and dependent variable (power consumption, YCCU) are plotted in Table CURVE. For this example, six points on the curve have been provided. Then, for a specific Cat Cracking feed rate, the energy consumption will be interpolated according to the range identified in the table, as sketched below.
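Between adjacent points in table CURVE the dependent value is obtained by interpolation. As a sketch of the general formula (this assumes linear interpolation between the supplied coordinates; it is not an excerpt from the PIMS documentation), for a feed rate $X$ between table points $(X_i, Y_i)$ and $(X_{i+1}, Y_{i+1})$:

$$Y = Y_i + \frac{Y_{i+1} - Y_i}{X_{i+1} - X_i}\,(X - X_i)$$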
If working with PIMS-AO, the above method can be used, or a nonlinear equation can be added to the model tree to establish this nonlinear relationship. See PIMS Help for additional information about using Non-linear equations in PIMS-AO.
Keywords: None
References: None |
Problem Statement: Electrolytes can be modeled with either the True or Apparent approach. How do I choose which to use? Where do I input my choice? | Solution: In terms of computed results (temperatures, duties, phase equilibria, etc.), the two approaches give equivalent results. Both true and apparent compositions can be reported for streams and blocks regardless of the approach. Still, there are a number of reasons for selecting one approach over the other.
1. User preference. The true component approach reports all results in terms of the ions, molecules, and salts actually present after considering solution chemistry. The apparent approach reports results in terms of the base species present before considering reactions. Results often look simpler using the apparent approach, but that is because the true component approach gives more details. For the simple NaCl-water system, the components reported would be:
Apparent: NaCl, H2O
True: Na+, Cl-, NaCl(S), H2O.
2. Restrictions on unit operation blocks. Certain unit operation models do not support true species electrolyte chemistry. True components can be used with RadFrac and RateFrac, but not with any other distillation column models. The models that do not support true approach chemistry are DSTWU, Distl, SCFrac, MultiFrac, PetroFrac, BatchSep, Extract, REquil, RGibbs, and RBatch. In addition, the solids models do not support salt formation within the block from true species electrolytes (though the liquid compositions will be resolved), except for Crystallizer when using the option that salt precipitation is calculated from the chemistry. RPlug and RCSTR can be used with true species electrolyte chemistry provided that the reactions in the Reactions object do not contain electrolytic equilibrium reactions (such as H2O ⇔ H+ + OH-) and no species participates in both reactions and chemistry. For a further discussion of using electrolyte chemistry in reactors, see Solution 3416.
3. Presence of two liquid phases. This sometimes requires that the Apparent approach be used. See Solution 4402.
4. Formation of volatile species. If reactions form a volatile species such as CO2 or HCN, these components are true species; they will never be present using the apparent approach. The true component approach is needed for the flash to be calculated correctly.
5. Separation of salts. Salts are a product of the chemistry and hence are only reported when using the true approach. If a salt separation step is needed downstream, the true approach should be used.
6. Convergence issues. In general, models using true components will converge more easily. See Solution 104394 for more information about tear stream convergence with ionic systems.
7. Ease of use. The approach used controls what information is made available to the unit operation blocks. If using the apparent approach, only apparent component information is available to blocks, design specs, and other tools. Specifications cannot be made in terms of ions, solids, or neutral salts that are not apparent components. Similarly, in the true component approach only the ions, molecules and solids actually present are available. In data regression, for example, it is often better to use the apparent approach when fitting data given on an apparent basis. (It is possible to access flows and compositions on one basis when using another by using Prop-sets. See Solution 104398.)
Where do I input my choice of basis?
If using the Electrolyte Wizard, you are asked which approach to use. You can change this choice on the main Properties/Specifications form. You can also specify an approach in a block's Block Options form or in the properties for a flowsheet section.
If using different approaches in different parts of a flowsheet, you may need to take additional steps. See Solution 102338.
Keywords: None
References: None |