Problem Statement: SQL Express shows the message "GDOTOnlineHistory database is exceeding licensed limit". Is there a way to keep the database size below a specified size and delete any older records? | Solution: The configuration parameter MaxDataBaseSize in GDOTOnlineWebHistorianConfig.txt sets a maximum database size. When this size is reached, the service starts deleting the oldest data in the database. See the Post configuration section in the GDOT installation guide.
However, the Delete privilege needs to be granted in Microsoft SQL Server to facilitate the database pruning functionality. To do so, follow these steps:
Right-click the database name and select Properties.
Select the Permissions tab, click Search, and select NT AUTHORITY\SYSTEM.
Tick the Delete box (in the Grant column).
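The size-capped pruning behavior described above can be illustrated with a small sketch. This is not GDOT's actual implementation (the service handles pruning internally); SQLite stands in for SQL Server, and the table and column names (history, ts) are hypothetical. Real size-based logic would compare the database file size against MaxDataBaseSize rather than a row count.

```python
import sqlite3

def prune_oldest(conn, table, ts_col, max_rows):
    """Delete the oldest rows until the table holds at most max_rows.

    A conceptual stand-in for GDOT's size-based pruning: here a row
    count plays the role of MaxDataBaseSize.
    """
    excess = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0] - max_rows
    if excess > 0:
        # Remove the oldest records first, as the service does
        conn.execute(
            f"DELETE FROM {table} WHERE rowid IN "
            f"(SELECT rowid FROM {table} ORDER BY {ts_col} ASC LIMIT ?)",
            (excess,),
        )
        conn.commit()
```

Note that this is exactly why the Delete privilege matters: without it, the DELETE statement issued by the pruning service fails and the database keeps growing.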
Fixed in
V12.1 documentation.
Keywords: GDOT, SQL server, MaxDataBaseSize
References: None |
Problem Statement: Why does Aspen Plus seem to regress parameter elements that are not specified or excluded in the Data Regression (DRS) input? | Solution: When some elements of a property parameter are regressed using DRS (Data Regression System), the elements that are not regressed default to the values in the databank or specified by the user (i.e., NOT zero). All elements that are excluded or not explicitly specified to be regressed default to the databank value or the value entered on the Properties / Parameters form (in other words, these values are fixed to the databank value or the value entered on that form).
There is a logical sequence that sets the values:
Locally specified value on the DRS form
Locally specified values on the parameter forms
Database values
Property model default values shown in documentation
To avoid confusion, it is best practice to set the elements that are not regressed to fixed values. It is also important to set the temperature limits, since properties are often extrapolated outside the temperature range.
For example, to regress only the first three elements of PLXANT (PLXANT/1, PLXANT/2, PLXANT/3) for water when PLXANT parameters for water are available in the databank, the user has to fix elements 4, 5, 6, and 7 of PLXANT to zero while specifying the regression input; otherwise these elements will be fixed to the databank values.
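The defaulting sequence above can be sketched as a simple resolver. This is only an illustration of the precedence order described in this article (first non-missing value wins), not Aspen Plus code; the argument names are hypothetical.

```python
def resolve_element(drs_value=None, form_value=None, databank_value=None,
                    model_default=0.0):
    """Return the value used for a non-regressed parameter element,
    following the precedence described above: DRS form value, then
    parameter form value, then databank value, then model default."""
    for value in (drs_value, form_value, databank_value):
        if value is not None:
            return value
    return model_default
```

This makes the PLXANT example concrete: leaving element 4 unspecified yields the databank value, while explicitly entering 0.0 on the DRS form fixes it to zero.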
See the attached file for an example:
Select Retrieve Parameter Results from the Tools menu to see the databank values of PLXANT on the Properties / Parameters / Results / Pure Component / T-dependent sheet.
In regression case R-1, only the first three elements of PLXANT (PLXANT/1, PLXANT/2, PLXANT/3) are specified to be regressed. If you run only that case, and look at the results of the regression (on the Properties/Parameters/Pure Component/PLXANT-1 form), you will see that the values of elements 4, 5, 6, and 7 (PLXANT/4, PLXANT/5, PLXANT/6 and PLXANT/7) have defaulted to the databank values.
To fix PLXANT/4, PLXANT/5, PLXANT/6 and PLXANT/7 to zero, these elements have to be explicitly specified to be zero in the regression input specification as in regression case R-2. If you run this case, you will see that elements 4, 5, 6 and 7 are now zero.
Keywords: Data regression, Exclude, Fix, Regress
DRS
VSTS 116677, 112237
References: None |
Problem Statement: How to configure an Aspen Cim-IO for OPC interface using the Cim-IO Interface Manager. | Solution: On the Aspen Cim-IO interface server, start the Cim-IO Interface Manager with Administrator privileges (Run as administrator).
Click Create a new one or Add Cim-IO Interface to begin the configuration.
On the Welcome screen, select the Cim-IO interface to be configured (Cim-IO for OPC) from the pull-down menu and click Next.
On the next screen provide an Interface name and Description.
If the interface connection to the OPC server requires an account different from the account that starts the Aspen Cim-IO Manager service, then check the box Allow entry of user name and password and populate the fields with the user name and password.
Click Next
On the Interface Configuration screen provide the OPC DA server computer name. This is the node name where the OPC server is running.
Click the Discover button to populate the list of registered OPC servers
Select the OPC server name to populate the OPC DA server dialog
Click Next to the configuration Summary screen
Clicking Next again will create and start the interface.
Click Finish to go back to the main Cim-IO Interface Manager screen and see that all the Cim-IO processes are started.
Now that the interface is configured and started, it is a good time to review the ..\AspenTech\CIM-IO\Log\CimIO_MSG.log file to verify that the interface connected to the OPC server and wrote an OPC READY message.
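The log check above can be scripted for a quick post-configuration sanity test. This is a sketch: the exact wording of the log entry may vary between Cim-IO versions, so confirm the text against your own CimIO_MSG.log first.

```python
from pathlib import Path

def opc_ready(log_path):
    """Return True if the Cim-IO message log contains an OPC READY entry.

    errors="ignore" tolerates any stray non-UTF-8 bytes in the log.
    """
    text = Path(log_path).read_text(errors="ignore")
    return "OPC READY" in text
```

Usage would be, e.g., opc_ready(r"C:\Program Files\AspenTech\CIM-IO\Log\CimIO_MSG.log") (path assumed; use your actual install location).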
Additional OPC interface arguments are available by selecting OPC Interface from the left Cim-IO Interface tree.
The properties are available in the Cim-IO for OPC DA User's Guide and are listed below for convenience.
Connection
Timeout and Retries: If a connection to the OPC server takes more than Timeout seconds, a new connection will be attempted. This will continue until the number specified by Retries has been exhausted.
Delayed OPC Servers: If you want a delay between OPC server startup and when the first OPC group is added, click Edit and then Add your OPC server and timeout in the Timeout window.
Force Update (secs): This parameter specifies the number of seconds until a change is forced and reported by the Cim-IO for OPC DA server for unsolicited values that have not been updated from the OPC server since the last time a change was reported to Cim-IO clients. The forced update reports the same value sent last time, but with the current time. If zero, values are only updated when received from the OPC server. The default value is 3600 secs.
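The Timeout/Retries behavior described above can be sketched as a generic retry loop. This is an illustration of the described policy, not Cim-IO source code; attempt_connect is a hypothetical callable that either returns a connection or raises TimeoutError when the attempt exceeds the timeout.

```python
def connect_with_retries(attempt_connect, timeout, retries):
    """Attempt a connection, retrying on timeout until Retries is exhausted.

    One initial attempt plus `retries` further attempts; the last
    TimeoutError propagates if every attempt fails.
    """
    last_err = None
    for _ in range(retries + 1):
        try:
            return attempt_connect(timeout)
        except TimeoutError as err:
            last_err = err
    raise last_err
```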
General
Timezone: If you are reading or writing values of OPC type VT_DATE, you may wish to specify your time zone so that the conversion to and from UTC is performed appropriately. The setting defaults to your computer’s current time zone; if you wish to select another, choose one from the pull-down menu. Note that time for Distributed Control Systems is absolute: a timestamp identifies a particular absolute second of time, not the result of a conversion to a time-of-day.
The Configure Smart Data Types window allows you to add and remove Smart Data Type associations. To add a server, click Add; to remove a server, click Remove. If you click Add, the Adding a Smart Data Type window appears. When you are finished adding and removing associations, click Close. Enter your OPC server’s progid, and the description and engineering units fields with the appropriate starting delimiter. All characters from this delimiter onward will be replaced by the string you have entered. For example, server “Aspentech.OPC.Sample.1” might use “/ID” for its description and “/UNITS” for its units. If the item “Boiler1.Valve25/PNT” was entered into an IP.21 Get record, and the selected data type was “Engineering Units”, the item ID would be modified to “Boiler1.Valve25/UNITS”, since all characters from the “/” delimiter onward would be replaced by the Engineering Units string. Note that if the delimiter exists in multiple locations in the tag name, all characters starting with the last occurrence of the delimiter will be replaced. Click OK to add the association, or click Cancel.
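The delimiter substitution described above is easy to state precisely in code. This is a sketch of the documented rule (replace from the last occurrence of the delimiter onward), not AspenTech's implementation:

```python
def apply_smart_data_type(item_id, delimiter, replacement):
    """Replace everything from the LAST occurrence of `delimiter` onward
    with `replacement`, per the Smart Data Type rule described above."""
    pos = item_id.rfind(delimiter)
    if pos == -1:
        return item_id  # no delimiter present: item ID unchanged
    return item_id[:pos] + replacement
```

For instance, the article's example maps "Boiler1.Valve25/PNT" with delimiter "/" and replacement "/UNITS" to "Boiler1.Valve25/UNITS".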
OPC
The Ignore No Valid Items in List error check box, when selected, will suppress the “No Valid Items” log message when there are no valid items in a Cim-IO request.
Perform Initial Synchronous CACHE Read determines the first action Cim-IO for OPC DA will perform to get data from a point. Generally, when an OPC client connects, the data in the OPC cache is of bad quality. The actual data is obtained by doing a DEVICE read (reading directly from the device). This operation can take a significant amount of time, so you are offered the option of selecting this check box to do a CACHE read instead of a DEVICE read as the first read operation.
Perform Synchronous Writes determines whether writes will be done using SyncIO (checked) or AsyncIO (unchecked).
Allow Output of Values with Bad Status? Determines whether values with a Bad Status in the transfer record or list of values to be output should be output. If the option is selected, values with a Bad Status are output, otherwise the value will be skipped.
Allow Output of Values with a Suspect Status? Determines whether values with a Suspect Status in the transfer record or list of values to be output should be output. If the option is selected, values with a Suspect Status are output; otherwise the value will be skipped.
SKIPPED BAD and SUSPECT values return a Bad Status? Determines whether values that were skipped due to settings above should return a Bad Status to the client. If the option is selected, values skipped due to a Bad or Suspect Status will report a Bad Status back to the client, otherwise they will be reported as Good.
Timestamp Origin determines how the Cim-IO timestamp will be obtained: either from the OPC server (OPC Server Time), or from the Cim-IO for OPC DA server computer’s time when an OPC message is received (System Time).
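The three output-filtering options described above interact in a small decision table, which can be sketched as follows. This is an illustration of the documented behavior, not Cim-IO code; the (value, status) tuple representation is an assumption for the sketch.

```python
def filter_output(values, allow_bad=False, allow_suspect=False,
                  skipped_return_bad=True):
    """Sketch of the Bad/Suspect output options described above.

    values: list of (value, status) with status 'Good', 'Bad' or 'Suspect'.
    Returns (values_to_output, statuses_reported_back_for_skipped_values).
    """
    out, skipped_status = [], []
    for value, status in values:
        if (status == "Bad" and not allow_bad) or \
           (status == "Suspect" and not allow_suspect):
            # Skipped value: report Bad or Good back to the client
            skipped_status.append("Bad" if skipped_return_bad else "Good")
        else:
            out.append((value, status))
    return out, skipped_status
```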
After the interface is configured the next step is to configure the Cim-IO client side communication. KB article 145464 outlines this configuration using the Cim-IO IP21 Connection Manager included with Aspen InfoPlus.21.
Keywords:
References: None |
Problem Statement: Aspen ADSA system configuration settings cannot be changed because they are greyed out.
How to delete the ADSA cache if System settings are greyed out after an ADSA server migration? | Solution: This article explains the detailed procedure to delete the ADSA cache when the system configuration in the ADSA Client Config tool is greyed out and cannot be changed.
First, it is necessary to make sure that Aspen Data Source Architecture is installed on this server (if the plan is to use this local machine as the directory server).
It is recommended to run ADSA Client Config Tool As Administrator.
The directory server host is defined in the registry; the location for 64-bit is below.
And below is the location for 32 bit:
NOTE: the Client Config tool should store the information in both locations if it is running properly.
There is also a HKEY_CURRENT_USER location that should be DELETED.
The Caches node under the following location can safely be deleted, since only the information in Public Data Sources is required.
This Caches node is automatically re-created from the above registry information when ADSA Client Config tool (either version) is run.
NOTE: After fixing the directory server problem, if available data sources don’t seem to be working properly, it is also recommended to DELETE the DataSources nodes under ANY of these locations and then manually re-define your Public Data Sources if needed.
Once the above procedure is completed, it should be now possible to edit the System configuration of ADSA client config tool.
Keywords: Delete cache
ADSA system configuration
Editing ADSA registry
ADSA configuration
References: None |
Problem Statement: How do I get rid of the following error message SOLID VOLUME MODEL VS0POLY HAS MISSING PARAMETERS: VSPOLY/1ST ELEMENT (DATA SET 1) MISSING FOR COMPONENT GUMARAB in Aspen Plus? | Solution: In order to get rid of the error message, Solid Molar Volume parameters must be defined for calculations to complete for this solid component. The help can be used to search for the definition of any parameters that may be unfamiliar.
To enter the parameter values, go to Properties | Methods | Parameters | Pure Components and create a new Temperature dependent correlation and select VSPOLY from the Solid molar volume folder.
Values for the Molar Volume (Solid) parameters should be entered using some experimental data or literature.
Keywords: Aspen Plus, VSPOLY, MISSING
References: None |
Problem Statement: After attempting to perform an action on an Aspen Unified model (GDOT, Petroleum Scheduler or PIMS) an error message shows up and the model closes suddenly after clicking OK on the message window.
The message says: “Agent Disconnected – Model workspace has been disconnected, please re-open model to restart session” | Solution: The root cause of this issue is a lack of RAM on the Aspen Unified host server. Increasing the number of processors to a minimum of 4 and the RAM to 8 GB is recommended to improve the performance of Aspen Unified functionalities and continue working on models.
Keywords: Aspen Unified, GDOT, PIMS, Petroleum Scheduler
References: None |
Problem Statement: 32-bit version: (V8, V9, V10)
Gas Properties Version 1.5.2 - (Wobbe Index, Higher and Lower Heating Values, Dewpoints, Water Content, Cp, Cv)
64-bit version: (V11 and newer)
Gas Properties Version 1.6.0 - (Wobbe Index, Higher and Lower Heating Values, Dewpoints, Water Content, Cp, Cv) | Solution: The attached unit operation extension calculates the Wobbe Index, Higher and Lower Heating Values, Dewpoints, Water Content, Cp, Cv for a selected stream.
HYSYS versions 3.1+ have all the Gasprops properties built in, hence the extension is no longer needed. To add these new properties to a stream, go to the Worksheet ... Properties page of the stream window. Press the green cross button (flyby says Append New Correlation) in the Property Correlation Controls group. Expand the Gas branch of the tree in the window that appears, and then choose the correlation you require.
Note: This Automation application has been created by AspenTech as an example of what can be achieved through the object architecture of HYSYS. This application is provided for academic purposes only and as such is not subject to the quality and support procedures of officially released AspenTech products. Users are strongly encouraged to check performance and results carefully and, by downloading and using, agree to assume all risk related to the use of this example. We invite any feedback through the normal support channel at [email protected].
Keywords: Gas Properties Extension
References: None |
Problem Statement: Is there any documentation available for Spyro and its use with PIMS? | Solution: Attached is a copy of the Spyro manual.
Keywords:
References: None |
Problem Statement: Some customers using Aspen's Process Explorer Ad-hoc calculation feature have noticed that not all Aspen Calc functions work successfully in the Ad-hoc formulas. This gives rise to the question, Which functions are supported for use in Ad-hoc calculations? | Solution: The supported functions are the Time-Based Functions, Math Functions, Timestamp Functions and Character Functions.
For example, any of the functions listed in the Time-Based Functions area of the Aspen Calc Help file's On-Demand Calculations section may be used. These are:
TimeShift
RateOfChange
RateOfAvgChange
MovingAverage
LeadLag
The Time-Based Functions require the AspenCalc client to be installed on the Process Explorer machine in order to be used within an Ad-hoc calculated pen.
Also, the standard Math Functions can be used:
Abs
Acos
Acosh
Asin
Asinh
Atan
Atanh
Ceiling
Cos
Cosh
Degrees
Even
Exp
Fact
Float
Floor
Int
Ln
Log
Log10
Max
Min
Mod
Odd
Pi
Radians
Rand
Round
Rounddown
Roundup
Sign
Sin
Sinh
Sqrt
Tan
Tanh
Trunc
Value
For examples of the proper syntax required to use these functions in Aspen Process Explorer and aspenONE Process Explorer, refer to KB #69073.
Here is a list of the Timestamp Functions:
Day
Hour
LastExecutedTime
Minute
Month
Now
ScheduledTime
Second
UTCNow
Weekday
Weeknum
Year
Note: None of the functions listed in the History Functions section of the Aspen Calc help file can be used in ad-hoc calculations. These are:
TagAddNewHistory
TagAverage
TagAverageArray
TagHistory
TagHistoryArray
TagMaximum
TagMaximumArray
TagMinimum
TagMinimumArray
TagModifyHistory
TagSetValue
TagStatistics
TagStatisticsArray
TagSum
TagSumArray
TagUpdateHistory
UtcTagAddNewHistory
UtcTagAverage
UtcTagAverageArray
UtcTagHistory
UtcTagHistoryArray
UtcTagMaximum
UtcTagMaximumArray
UtcTagMinimum
UtcTagMinimumArray
UtcTagModifyHistory
UtcTagSetValue
UtcTagStatistics
UtcTagStatisticsArray
UtcTagSum
UtcTagSumArray
UtcTagUpdateHistory
Keywords: Process Explorer
Ad-hoc
Ad Hoc
Ad hoc
History Functions
Aspen Calc
115368-2
115368
References: None |
Problem Statement: The Aspen Automation Interface Help file does not contain an example of how to use the WriteAttribute method of the Tag Object (ProcessData) to write to a timestamp field. | Solution: Please download the attached zip file containing a code example.
Keywords:
References: None |
Problem Statement: What options are there to model salt water in HYSYS? | Solution: There are different options to model salt water in HYSYS:
a) OLI or Aspen Properties: The OLI is a rigorous electrolyte package developed by OLI Systems (http://www.olisystems.com/) that can be used to rigorously model oil-water-salt systems. HYSYS features an interface to the OLI property package, but you have to purchase the engine from OLI Systems. Also, starting in V7.0, you can use an Aspen Properties package to model rigorously an electrolyte system. Select the Aspen properties option in the Fluid Package window when you select the property methods.
b) Hypothetical components: Another approach is to create a hypothetical component for sea water and another for produced water. Go to the Simulation Basis Manager > Hypotheticals tab and click on Clone comps. You can choose the library H2O component as a starting point and then modify the properties to suit your needs.
c) Tabular properties: this is another approach, where you can actually modify the properties of a library component. You can access this feature on the 'Tabular' tab in the fluid package. This allows you to specify tabular data points for certain properties (such as density and heat capacity) to which HYSYS does a curve fit and the fitted data is then used in the model. One thing to note with this approach is that the entire fluid package would use Tabular Properties if enabled, so it might be best to make a separate fluid package just for the sea water streams.
More information on options b and c can be found in the Simulation Basis User Guide attached.
Keywords: Aspen HYSYS, Sea Water, Model
References: None |
Problem Statement: What are the steps to configure the Aspen Production Control Web Server (PCWS) to use HTTPS rather than HTTP? | Solution: A Server Certificate is required to publish a website using a Secure Sockets Layer (SSL). You would either have to purchase such a certificate from a trusted 3rd party (such as Verisign) or use a self-signed certificate and install it on your web server. The PCWS has been tested using self-signed certificates so this is supported.
The first step is to create an SSL binding, for which the instructions are provided by Microsoft in this article: https://www.iis.net/learn/manage/configuring-security/how-to-set-up-ssl-on-iis
Once the SSL binding has been created, please follow these steps to configure HTTPS specifically for the PCWS web page:
1. Open IIS Manager and navigate to Sites. Verify that the bindings for Default Web Site includes HTTPS. If this is missing, please add the binding as per the instructions provided in the above Microsoft article link.
2. Navigate to ATControl on the left side navigation tree:
3. With ATControl selected, double click on the node for SSL Settings under the IIS section:
4. Select the checkbox for Require SSL and select the radio button for Ignore client certificates:
5. Click Apply Changes on the right side to save the settings:
You should now be able to open the Web Browser and navigate to the PCWS web page using the HTTPS protocol in the URL. Attempting to use HTTP in the URL after configuring the above should result in an error message like this:
HTTP Error 403.4 - Forbidden - The page you are trying to access is secured with Secure Sockets Layer (SSL).
Keywords: HTTPS, PCWS, SSL, certificate
References: None |
Problem Statement: How to configure an OptiRouter project with an S3D model. | Solution: Create a new Project Folder and give it an appropriate name without using any spaces or special characters.
Update the OptiPlantS3DProject.config file present in “C:\Program Files (x86)\AspenTech\Aspen OptiPlant V12.1\S3D” with the information below, all separated by commas.
SQL Server Name: In the screenshot below, <SystemName> is the name.
S3D Model Database Name: In the screenshot below, OptiPlantRouter_MDB is the S3D model database name.
S3D Model Name: In the screenshot below, OptiPlant Router is the S3D model name.
OptiRouter S3D Project Location: In the screenshot below, <ProjectFolderPath> is the OptiRouter project path.
Provide the Plant and Building names separated by a comma. In the screenshot below, the plant name is “plnt” and the building name is “bld”.
Save this file.
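For illustration, a single comma-separated configuration line following the fields above might look like this. The field ordering is an assumption based on the list in this article, and <SystemName> and <ProjectFolderPath> are the placeholders from the steps above; confirm the exact format against the OptiPlantS3DProject.config file shipped with your installation.

```
<SystemName>,OptiPlantRouter_MDB,OptiPlant Router,<ProjectFolderPath>,plnt,bld
```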
Keywords: OptiRouter
References: None |
Problem Statement: What is the best way to model the CO2 Capture Process by K2CO3 using Aspen Plus? | Solution: In 2006.5, new Amines property packages for MEA and MDEA with H2S and CO2 were developed and delivered as application examples. These models are being improved, updated, and extended to other amines and solvents. They are posted on the support web site as soon as they are reviewed and ready for public use.
These examples include the relevant components, electrolyte reaction and chemistry, property methods, and data. Both equilibrium and kinetics reactions are considered. Properties were compared to literature data and parameters were re-regressed where needed. These property packages are now our recommended standard for modeling these systems rather than our older data packages or electrolyte inserts.
The applicability of the property packages is demonstrated by modeling the CO2 capture process using our rate-based distillation model RateSep within RadFrac. These CO2 capture columns are generally rate-limited rather than at equilibrium; hence, RateSep rather than RadFrac was used for accurate modeling. A valid RateSep license is needed to run RateSep. Process results are compared to literature data. Details of these models are fully documented. Even if a RateSep license is not available, the user can still leverage the data in other equilibrium-based calculations.
This file describes an Aspen Plus rate-based model of the CO2 capture process by K2CO3 (Potassium Carbonate) from a gas mixture of N2, H2O, CO2, and H2S. The model consists of an absorber and a stripper. The operation data from a pilot plant at TU Berlin were used to specify feed conditions and unit operation block specifications in the model. Thermophysical property models and reaction kinetic models are based on the works of Aspen Technology (2007) and Pinsent (1956). Transport property models and model parameters have been validated against experimental data from open literature.
The model presented here includes the following key features:
True species including ions
Electrolyte NRTL method for liquid and RK equation of state for vapor
Concentration-based reaction kinetics
Electrolyte transport property models
Rate-based models for absorber and stripper with packing
The most recent simulation file and documentation can be found in the examples directory for Aspen Plus, e.g.:
C:\Program Files\AspenTech\Aspen Plus Vxx.x\GUI\Examples\Carbon Capture\Amines ELECNRTL
Documentation and an Aspen Plus V10.0 backup (.bkp) file are attached.
Keywords: None
References: None |
Problem Statement: How to connect multiple simulations to ASW? | Solution: In some cases it might be necessary to connect more than one simulation to ASW. To do so, follow the steps below:
Add the simulations needed to the Organizer using the + button located in the Configuration | Simulations form. These must be added one by one repeating this step.
Notice that the Status of the simulations is Disconnected, thus the next step must be to connect the simulations to ASW
Connect the simulations using any of the following options:
From the Organizer by changing the field Active from False to True
-or-
From the Simulations group in the Aspen Simulation Workbook ribbon by switching between simulations using the drop-down list and clicking on the Connect button
After this is done, variables from all the connected simulations can be called into Excel, and information can even be transferred from one simulation to another.
Keywords: Connect, two, simulation, model, models, multiple, link
References: None |
Problem Statement: How to activate the Aspen Properties and/or the Aspen Simulation Workbook tabs in Excel? | Solution: Sometimes, the Aspen Properties and/or the Aspen Simulation Workbook tabs are not displayed in the Excel worksheet. To be able to activate these Add-ins follow the steps below:
Open the Excel Add-In Manager (located in the Aspen Engineering Tools folder) and activate the Aspen Excel Add-ins that you want to use.
Note: You will see an option for each version that you have installed, make sure that the selection matches the AspenTech software version that you will call from Excel.
Inside Excel go to File | Options, select COM Add-ins in the Manage field, and click on the Go… button.
In the COM Add-ins window activate the Aspen Properties Excel Calculator and/or the Aspen Simulation Workbook add-ins that you want to use and click OK.
The Aspen Properties and/or the Aspen Simulation Workbook tabs should display now.
If after all these steps the tab still doesn't appear, refer to Solution 97705 to fix the registry of the Add-In files. If the issue persists, contact AspenTech Support.
Keywords: Add-in, Add-ins, Aspen Properties, Aspen Simulation Workbook
References: None |
Problem Statement: Setting up and sharing the User Database on Remote SQL Server and Local SQL Server | Solution: The procedure for migrating your Custom Reports will vary based on whether you are using a remote SQL Server or a local SQL Server. In the attached PDF file, you will find the step-by-step procedure for:
Setting up User Database (Icarus_User120) on Remote SQL Server
Sharing Icarus_User120 Database Attached to a Remote SQL Server
Setting up User Database (Icarus_User120) on Local SQL Server or LocalDB
Sharing Icarus_User120 Database with a user on Local SQL Server or SQL Server LocalDB
Keywords: Migrate, set-up, share, economics
References: None |
Problem Statement: How to import Custom Reports from Microsoft Access Database to SQL Server Database? | Solution: In the Economics Suite v11, it’s still possible to choose between the New Reporter (SQL Server Database) or the Old Reporter (Microsoft Access Database), but in v12 the New reporter is the only one that can be used. In other words, Access databases don’t work with this version anymore, thus any Custom Reports created in Access must be imported so that they can be used in the SQL Databases.
The attached PDF file covers the step-by-step process to import data and tables from Microsoft Access Database (Icarus_User.mdb) to SQL Server Database (Icarus_User120.mdf)
Keywords: Migrate, import, convert, transfer
References: None |
Problem Statement: When adding a compressor a message for an intercooler appears in the Scan Messages window, why? | Solution: When adding a compressor, a message for an intercooler is displayed in the Scan Messages window such as the one below:
Component Item Description: Feed Compressor Unit
User Tag Number: CP-1003
*Component Ref #: 685
INFO > 'GC - 685' ESTIMATED AREA FOR REQUIRED INTERCOOLER
INTERCOOLER 1: 1413.56 SF
This is an informational message only, so it can be ignored. However, it is important to note that it does not mean an intercooler is being estimated automatically when adding a reciprocating compressor; rather, ACCE knows that such a compressor usually comes with an intercooler and therefore provides an intercooler area in case the user decides to add it separately.
Note: By default, intercoolers are cooled using water as the process fluid.
Keywords: Compressor, intercooler, comp, int
References: None |
Problem Statement: There is an excessive epoxy grout quantity calculated for the compressors in the project. Why is this happening and how can I fix it? | Solution: Epoxy grout quantity is calculated based on the estimated footprint dimensions for the compressor, using a grout height of 6 inches for small pumps (< 500 HP) and a grout height of 12 inches for large pumps and compressors. The footprint dimensions are used as an approximation for the base plate dimensions since the grout is typically poured into the base plate. In some cases, for compressors, the footprint dimensions estimated might be excessive, in such cases, change the footprint dimensions by specifying the footprint X and Y values in the form instead of specifying the grout quantity.
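The grout quantity logic described above can be sketched as follows. This is a hedged approximation of the stated rule (6 in grout height below 500 HP, 12 in for large pumps and compressors, footprint used as the base plate area); ACCE's actual estimating rules may include further adjustments, and the function name and units are assumptions for illustration.

```python
def grout_volume_cy(footprint_x_ft, footprint_y_ft, driver_hp):
    """Approximate epoxy grout quantity in cubic yards:
    footprint area times grout height (0.5 ft below 500 HP, 1 ft above)."""
    height_ft = 0.5 if driver_hp < 500 else 1.0
    return footprint_x_ft * footprint_y_ft * height_ft / 27.0  # ft^3 -> cy
```

This also shows why overriding the footprint X and Y values directly fixes an excessive quantity: the volume scales linearly with the footprint area.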
Keywords: Footprint, epoxy grout, unit man-hour, install, installation, cubic yard, cy
References: None |
Problem Statement: In furnaces, how should I define my process type if this is a mixture? | Solution: Furnaces only allow either GAS or LIQUID as the process type, but in some cases the process fluid is a mixture of both. If your process flow has:
More gas and less liquid, then specify a gas process type.
Less gas and more liquid, then specify a liquid process type.
50% gas and 50% liquid, then specify a gas process type and decrease the fluid flow rate to account for this difference in the equipment size.
You can use this logic for any type of process flow you may have in your project for the furnace.
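The rule of thumb above can be sketched as a small helper. The halving of the flow rate in the 50/50 case is an illustrative assumption only — the article just says to "decrease the fluid flow rate", without giving a factor — so treat the 0.5 as a placeholder to be replaced by your own engineering judgment.

```python
def furnace_spec(gas_fraction, flow_rate):
    """Return (process_type, flow_rate_to_enter) per the rule of thumb above.

    gas_fraction: fraction of the process flow that is gas (0.0 to 1.0).
    """
    if gas_fraction > 0.5:
        return "GAS", flow_rate
    if gas_fraction < 0.5:
        return "LIQUID", flow_rate
    # 50/50 mixture: gas type, reduced flow (factor is an assumption)
    return "GAS", flow_rate * 0.5
```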
Note: The process fluid flow rate in CFM units stands for ACFM, not SCFM, since the estimate is for real equipment.
Keywords: Standard gas flow rate, actual cubic feet per minute, standard, units
References: None |
Problem Statement: The Cv is reducing as the Valve Opening % increases, why? | Solution: In Aspen HYSYS, the valve flow coefficient reduces as the valve opening percentage increases because the Cv shown is the full Cv, not the effective Cv.
The Cv that appears in Rating | Sizing (whether provided by you or calculated by Aspen HYSYS) is the maximum Cv of the valve, indicating the volumetric flow through the valve at 100% valve opening. Do not confuse this value with the effective Cv that is obtained according to the current valve opening and the valve operating characteristics. In other words, Aspen HYSYS reports the full Cv at 100% valve opening.
Note: This is also true for the Cg of a valve
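The distinction above can be made concrete with a sketch of how an effective Cv would follow from the full Cv, the opening, and the valve operating characteristic. Only a linear characteristic is shown, as a simplifying assumption; real valves may use equal-percentage or quick-opening characteristics, and this is not the HYSYS calculation itself.

```python
def effective_cv(full_cv, opening_pct, characteristic="linear"):
    """Effective Cv at a given valve opening, from the full (100%) Cv.

    Only the linear characteristic is sketched here.
    """
    if characteristic != "linear":
        raise NotImplementedError("only the linear characteristic is sketched")
    return full_cv * opening_pct / 100.0
```

So a valve whose Rating | Sizing form shows a full Cv of 200 passes an effective Cv of 100 at 50% opening under a linear characteristic, even though HYSYS keeps reporting 200.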
Keywords: Cv, sizing, flow coefficient, opening, increase, decrease
References: None |
Problem Statement: Is it possible to convert an Equation Oriented sub-flowsheet back to a conventional flowsheet? | Solution: If a part of your simulation has been converted to an Equation Oriented sub-flowsheet it is not going to be possible to take out those objects from it. However, the EO sub-flowsheet can run in a Sequential Modular mode as a workaround.
In order to do this, go inside of the EO sub-flowsheet and activate the Sequential Modular mode located in the Run group.
Otherwise, that part of the process would need to be reconstructed outside of the EO sub-flowsheet.
Note: It is a best practice to save the SM model and EO model as different files.
Keywords: Sequential Modular, SM, Equation Oriented, EO, convert, transfer, remove
References: None |
Problem Statement: It is possible to force a Bad Value on a variable using DMC3 Builder calculations. This is a useful strategy for testing the behavior of the controller when corrupted data or communication issues occur between the DCS and the RTE controller. | Solution: To simulate a Bad Value on the desired variable, follow this procedure:
On DMC3 Builder in Controllers, go to the Calculations tab on the desired controller.
Create an input Calculation.
For this example, we are going to set the Measurement of the CV AI-2020 to Bad Value. The calculation lines must be similar to these:
if BAD=1 then
AI2020.level=qlevel_bad
else
AI2020.level=qlevel_good
end if
NOTE: The variables BAD and AI2020 can be given any name, but it is important to leave the “.level” suffix at the end of the variable so that it points at the status of the variable.
Create a user defined entry in the General tree for the variable named BAD in this KB Article; the preferred data type is OnOff. Check that the DefaultIOFlags are IsInput and IsTuningValue.
Map the variables to the Measurement of the desired variable and to the User defined entry created.
To make sure the calculation is working as expected, turn On the Test Mode and click on Initialize Inputs.
Click on Run and check that the Measurement value turns red when the User defined entry value is On and returns to its normal color when it is Off.
Click on Apply to save all changes.
To test the Offline simulation:
Go to the Simulation tab.
Perform a step on the Controller to initialize the simulation.
Go to Actions and Application details to modify the value of the User entry.
Modify the value of the Entry. In this case, it was changed to On.
Perform another step on the simulation and you will be able to see the Bad Value on the desired variable.
To test the Online simulation:
Go to Deployment tab and click on Deploy.
Check the Enable Online simulation box and deploy the controller.
Wait for the controller to Deploy and Start it.
Open the Production Control Web server and look that the application is working.
Modify the value of the User entry.
The status of the variable will change to Bad Value when the controller has been executed again.
Keywords: DMC3 Builder, RTE controller, Bad Value, offline simulation, online simulation
References: None |
Problem Statement: Aspen HYSYS V12.1 Stream Reporter (HSR 1.7.4) | Solution: HYSYS Stream Reporter (HSR) is an Excel spreadsheet utility that allows you to import material stream information, such as conditions, properties and compositions, into a spreadsheet, and to compare streams from different cases.
HSR can report properties from the following phases: Overall, Vapour, Light and Heavy (Aqueous) Liquid, Combined Liquid and Solid. It also allows stream user variables and property correlations to be reported. It is also possible to create formulae in the output table. The user can save sets of properties or use one of the pre-built property sets. Streams from different HYSYS cases can be reported in the same stream table. Once a stream table has been generated it can be updated by pressing a single button. Stream tables can be moved to another Excel workbook whilst maintaining the ability to be updated.
HSR takes the form of an Excel spreadsheet file with embedded Visual Basic for Applications (VBA) code that demonstrates how HYSYS can be accessed programmatically. The VBA source code is freely accessible and users are encouraged to learn from it and adapt it to their own needs.
For V12.0 please see the KB Article 098349.
For V11.0 please see the KB Article 056528.
For V10.0 please see the KB Article 056331.
For V9.0 please see the KB Article 057415.
For V8.0 - V8.8 please see the KB Article 057412.
For older versions see the KB Article 054553.
Note
This Automation application has been created by AspenTech as an example of what can be achieved through the object architecture of HYSYS. This application is provided for academic purposes only and as such is not subject to the quality and support procedures of officially released AspenTech products. Users are strongly encouraged to check performance and results carefully and, by downloading and using, agree to assume all risk related to the use of this example. We invite any feedback through the normal support channel at [email protected].
Keywords: HYSYS Stream Reporter, HSR
References: None |
Problem Statement: Running the Aspen OnLine V11 and up Service as Local System | Solution: The V11/V12 versions of the AspenTech Engineering suite products now run as 64-bit applications. This change allows the software to take advantage of modern hardware to solve larger models. Aspen OnLine and all its components are now 64-bit programs. The 64-bit versions of simulators delivered in V11/V12 work with Aspen OnLine V11/V12.
Aspen OnLine includes a service which performs some tasks, including accessing a database which is used to cache data retrieved from a historian. To use the Aspen Properties Enterprise Database with several Aspen Engineering products on a Windows Server operating system, an SQL server is required. The installer for SQL Express 2014 64-bit is included in the <\3rd Party Redistributables\Microsoft SQL Express 2014 SP2> subfolder of the installation USB drive. The 32-bit SQL Express 2014 can also be used, but you must download it from Microsoft. Aspen OnLine requires the SQL Express server to be installed.
Because Aspen OnLine is a 64-bit product, if you are using it with SQL Express 2014 (see Running the Aspen OnLine Service as Local System), it must be a 64-bit version of SQL Express 2014.
The Aspen OnLine service, by default, runs under an account with Administrator access. It is possible to configure the service to use the Local System account instead, which will allow any user to use the service. There are two options:
1. Run the service as a user with local administrator privileges. The database will use the LocalDB built into Windows without any additional installation.
2. Let the service run as Local System. The Local System user cannot use LocalDB, so you must install SQL Express 2014 (64-bit) for this case.
Note: If the Aspen OnLine service runs as any account other than yours, then it will not be able to run any projects stored in your user-specific folders (anything under C:\Users\<username>, including the Desktop and Documents folders); doing so will lead to Access Denied errors. Avoid storing projects in these locations.
To confirm and if necessary change the account which runs the Aspen OnLine service:
Using an account with administrative privileges, open the Control Panel. Search for Services and run it.
In the Name column, locate and click Aspen OnLineV11.0.
In the Status column in this row, the word Started will appear if it is started. If it is already started, on the left side of the window click Stop to stop the service.
In the Log On As column in this row, Local System appears if the service is running as Local System. To change this, right-click anywhere in the row and click Properties.
Click the Log On tab, and click This account, then either enter the account name or click Browse and search for an account. The account selected must have local administrator privileges.
Enter the password for the account in the Password and Confirm Password boxes to confirm it, and click OK.
At the left side of the window, click Start to start the service.
In the Startup Type column, the word Automatic should appear. If it does not, double click it and select Automatic or Automatic (Delayed Start) in the Startup type field in the dialog box that appears.
If you run the service as Local System, install SQL Express 2014 (64-bit) and configure Aspen OnLine to use it. (To confirm and change the account if needed, follow the steps above, but in step 5, choose Local System account.)
Ensure SQL Express 2014 (64-bit) is installed. This is available in the Aspen Engineering V11 installation media in the 3rd Party Redistributables folder. Contact AspenTech Support if you have any problems.
Ensure the user account performing this procedure has access to the SQL installation and has Administrator access.
NOTE: Do not use the Default Instance name when installing SQL. Create a unique instance name when installing SQL. The following steps will not work with the default instance name: MSSQLSERVER.
Open a command prompt with Administrator access and navigate to C:\ProgramData\AspenTech\AspenOnLineVXX.X\system\.
Run the batch file SelectInstance.bat.
A list of valid SQL instances will be displayed. Select the instance you want to use and press Enter.
To complete the configuration of the service running as Local System, you may need to:
Register model extensions under the Local System account
Ensure the Local System account can access a network folder hosting the Offline directory
Keywords: Online, v11, local, system, sql
References: None |
Problem Statement: Configuring SQLA tags using an Oracle DB connection isn't documented clearly. | Solution: This Knowledge Base article outlines the important configuration details for connecting to an Oracle DB.
On the IP.21 Server, check the Oracle client connection to the relational database:
Start | Programs | Oracle - OraHome92 | Configuration and Migration Tools | Net Configuration Assistant
On the Net Configuration Assistant: Welcome screen, select the radio button:
Local Net Service Name configuration - then click the Next button
On the Oracle Net Configuration Assistant: Net Service Name Configuration screen,
select the Test radio button, then click the Next button.
On the next screen, use the Drop-down menu to select the net service name of interest. This will be the link to the Oracle database where the SQLA tag data will reside. Click the Next button.
If the test does not succeed, try changing the Login information used. This test must be successful in order for SQLA tags to work properly.
After the Oracle client connection has been verified, the SQLConnectionDef record can be configured.
Using the Aspen InfoPlus.21 Administrator, expand the SQLConnectionDef record and select the Default SQLA Connection record. Right click on the IP_CONNECTION_STRING field and select the Edit SQL Connection String... option.
On the Provider tab, select Microsoft OLE DB Provider for Oracle, then click the Next button.
On the Connection tab, configure the Enter a server name: field using the same Oracle Net Service name used when testing the Oracle client connection. Then, fill in the User name and Password fields with appropriate information and check the Allow saving password box. Use the Test Connection button to verify that the link works as expected. Click the OK button to finish.
Now that the SQLConnectionDef record is configured, the IP_SQLADef records must be configured.
The IP_SQLADef records in the InfoPlus.21 database have several key fields:
IP_TABLE - specify the name of the table in the relational database containing the tag data.
IP_TAG_TYPE - SQLA
IP_COLUMN_NAME- specify the column name within the table where tag names are stored.
IP_COLUMN_VALUE - specify the column name within the table where the value data is stored.
IP_COLUMN_STATUS - specify the column within the table where the status is stored. The status must be set to zero for data to display within Process Explorer.
IP_COLUMN_TIME - specify the column name within the table where the timestamp data is stored.
Default values or blanks will work for the remaining fields in the IP_SQLADef records.
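The table layout that the IP_SQLADef fields map to can be sketched as follows. Here sqlite3 stands in for Oracle, and the table and column names (SQLA_DATA, TAGNAME, TAGVALUE, TAGSTATUS, TAGTIME) are hypothetical examples; use whatever names exist in your Oracle schema:

```python
# Sketch of a relational table layout that IP_SQLADef records could map to.
# sqlite3 stands in for Oracle here, and all table/column names are
# hypothetical illustrations, not required names.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE SQLA_DATA (
    TAGNAME   TEXT,    -- IP_COLUMN_NAME points at this column
    TAGVALUE  REAL,    -- IP_COLUMN_VALUE points at this column
    TAGSTATUS INTEGER, -- IP_COLUMN_STATUS points here (must be 0 for display)
    TAGTIME   TEXT     -- IP_COLUMN_TIME points at this column
)""")
conn.execute("INSERT INTO SQLA_DATA VALUES ('FIC101', 42.5, 0, '2021-07-20 16:05:21')")
conn.execute("INSERT INTO SQLA_DATA VALUES ('FIC102', 13.1, 1, '2021-07-20 16:05:21')")

# Only rows whose status is zero will display within Process Explorer.
good = conn.execute(
    "SELECT TAGNAME, TAGVALUE FROM SQLA_DATA WHERE TAGSTATUS = 0").fetchall()
print(good)  # [('FIC101', 42.5)]
```

The second row (status 1) illustrates why IP_COLUMN_STATUS matters: a non-zero status keeps the value from appearing in Process Explorer.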
Keywords:
References: None |
Problem Statement: New features for Aspen Production Record Manager with V10, V11, and V12 | Solution: Please find the attached presentation, which explains the new features of the APRM & APEM applications in V10, V11, and V12.
Also find the V12 release notes PDF attached.
Keywords: None
References: None |
Problem Statement: SLM services may show issues such as not starting or frequently stopping due to an incorrect environment variable. This article shows the steps to edit the environment variable to address such issues. | Solution: An incorrect environment variable can cause different errors in SLM services, such as “All licenses are currently in use” even though there are unused tokens available in the pool.
If you have followed all the steps from https://esupport.aspentech.com/S_Article?id=000081282 and are still getting the error, then you may need to check the environment variable as follows.
To edit the incorrect environment variable:
Go to This PC > Properties > Advanced System Settings > Environment Variables.
Copy the existing variable value and save it to Notepad as a backup.
Edit the system variable to -l C:\Program Files (x86)\AspenTech\SLMServerLogs\lserv.log -z 5m -lfe 2 -sbm 4
Restart the Sentinel RMS License Manager Service
Check the folder SLMServerLogs from Program Files (x86)>AspenTech>SLMServerLogs. The log file needs to be updated with the current date
Open SLM License Manager and now it should run without an error.
Keywords: Environmental Variable, SLM
References: None |
Problem Statement: It is possible to connect the Lab Sample parameters of an Aspen Inferential Qualities sensor to an OPC interface in order to enter those using a DCS or another human-machine interface different than the Production Control Web Server, or as in this example, InfoPlus.21. | Solution: The parameters for Lab Samples can be found opening the .iqf file using IQconfig in the left column, specifically in Lab Data Collection (LDC) tree.
There are four main parameters that are taken into account each time an operator enters a Lab Sample through the PCWS interface.
NEWLAB (New lab value): Reads the new lab value from this point when the AVLFLAG is set to one (1).
NEWLABSTMSTR (Sample time corresponding to new lab value): Shows the sample time which will be associated with the next sample value. When there are no entries waiting in the lab, it corresponds to the latest lab value stored.
The other two parameters are not shown in the PCWS interface, however for the sample to be registered in the IQ engine these are necessary as well.
SAMPFLAG (Sample taken flag): Records the sample time for this lab property. The LDC module sets the status of the lab sample to Waiting or In Lab depending on the number of samples waiting. Logical parameter can only handle values of zero (0) or one (1). It needs some logic to be set to one (1) after a new value is entered into the OPC interface.
AVLFLAG (New lab value available flag): The operator sets it to one (1) after entering the lab value from PCWS, it needs some logic to be set to one (1) after a new value is entered into the OPC interface. The LDC module responds by changing the status of the corresponding lab sample from Waiting to Stored and resets the AVLFLAG back to zero (0). Logical parameter can only handle values of zero (0) or one (1).
NOTE: It is possible to change the NEWLAB value with a NEWLABSTMSTR, but it will not be registered as a Lab Sample unless both SAMPLAG and AVLFLAG parameters are set to one (1).
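The handshake between the entry point and the LDC module can be sketched as follows. The tag dictionary is a hypothetical stand-in for the OPC write interface; only the parameter names (NEWLAB, NEWLABSTMSTR, SAMPFLAG, AVLFLAG) come from the IQ Lab Data Collection module:

```python
# Minimal sketch of the lab-sample handshake described above.
# The dictionary below is a hypothetical stand-in for the OPC interface.

def enter_lab_sample(tags, value, sample_time):
    """Write a new lab value the way PCWS entry (or custom DCS logic) would."""
    tags["NEWLAB"] = value
    tags["NEWLABSTMSTR"] = sample_time
    # Both flags must be set to 1 for the IQ engine to register the sample.
    tags["SAMPFLAG"] = 1
    tags["AVLFLAG"] = 1

def ldc_cycle(tags, stored):
    """Sketch of the LDC response: store the sample and reset the flags."""
    if tags["AVLFLAG"] == 1 and tags["SAMPFLAG"] == 1:
        stored.append((tags["NEWLABSTMSTR"], tags["NEWLAB"]))
        tags["AVLFLAG"] = 0   # LDC resets the flags back to zero
        tags["SAMPFLAG"] = 0

tags = {"NEWLAB": 0.0, "NEWLABSTMSTR": "", "SAMPFLAG": 0, "AVLFLAG": 0}
stored = []
enter_lab_sample(tags, 200.0, "07-20-2021 16:05:21")
ldc_cycle(tags, stored)
print(stored, tags["AVLFLAG"])  # [('07-20-2021 16:05:21', 200.0)] 0
```

If only NEWLAB and NEWLABSTMSTR are written (flags left at 0), the `ldc_cycle` sketch stores nothing, mirroring the NOTE above.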
For this example, we are going to show the steps to connect the IQ GASOEP to different tags of InfoPlus.21 as an analog of the OPC interface using a specific Cim-IO device associated to an IP.21 Cim-IO interface (IOIP21):
Available_Flag_IQ: RDWRT. IP_DiscreteDef record. DBDV source.
IQ_TEST_LAB: READ. IP_AnalogDef record. DBVL source.
Sample_Taken_Flag: RDWRT. IP_DiscreteDef record. DBDV source.
ATextResult: READ. IP_TextDef record. ASC source.
AVLFLAG and SAMPFLAG need to be specified in the Keyword as RDWRT for the IQ engine to set them back to zero (0) once the sample has been registered for the Bias calculation.
It is important to mention that Inferential Qualities uses a specific time format for Analyzers and Lab Sample times. If the tag does not have the specific format, IQ won’t be able to read the value. One solution is to transfer the Lab Sample or Analyzer value to IP.21 and change the time format using an SQL query.
NOTE: The IP.21 IP_INPUT_TIME timestamp also needs to be changed to the format %m-%d-%Y %H:%M:%S.
Useful information about IQ timestamps can be found in the next KB article: How to change Lab sample time format to be used on IQ
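As a quick illustration of the %m-%d-%Y %H:%M:%S layout mentioned above, a script that transfers values could format the timestamp like this (using the sample time from the test below):

```python
# Format a timestamp into the %m-%d-%Y %H:%M:%S layout that IQ expects,
# using the sample time from the example below (20-JUL-21 at 16:05:21).
from datetime import datetime

sample_time = datetime(2021, 7, 20, 16, 5, 21)
iq_stamp = sample_time.strftime("%m-%d-%Y %H:%M:%S")
print(iq_stamp)  # 07-20-2021 16:05:21
```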
A test performed using an input value of 200 with the timestamp of 20-JUL-21 at 16:05:21 is shown below to illustrate the expected behavior of this functionality:
Keywords: Lab Sample, Inferential Qualities, OPC interface, InfoPlus.21
References: None |
Problem Statement: After upgrading Aspen Calc, the following error is returned when attempting to edit or execute an existing calculation: Failed to edit calculation. Error Getting Calculation Object
All the calculations display with a red circle/slash
If the external task Tsk_CLC1 within the InfoPlus.21 Manager is used, the task will stop running.
Also, if a user tries to work with libraries (e.g. create a new one, edit an existing one, or export one) the following error will be returned: Failed to edit library. Invalid library name. | Solution: After the upgrade, the environment variable ASPEN_CALC_BASE is missing.
The solution is to create a system variable named ASPEN_CALC_BASE and set the value to the installed location of Aspen Calc. The default path is: C:\Program Files\AspenTech\Aspen Calc
Reboot the PC or restart the AspenTech Calculator Engine service for the change to take effect.
If you are using the external task Tsk_CLC1 you will also need to restart the task using the InfoPlus.21 Manager.
NOTE: If the above does not resolve the issue, please check the Aspen Calc log file located at ...\Program Files\AspenTech\Aspen Calc\Log\AspenCalcMessageLog.txt for any error messages.
Keywords: Failed to edit calculation
Failed to edit library
Invalid library name
References: None |
Problem Statement: A deployed IQ application is not collecting data on Aspen Watch. Data collection cannot be started, and its status is “Not supported” | Solution: This message indicates that the current configuration for the IQ application does not support data collection on Aspen Watch.
To enable data collection for IQ application from Aspen Watch please follow the steps below:
Add the AspenIQ node to the Online Connection settings on Aspen Watch. Default port used for IQ data collection is 12350. This port should be open to allow data collection.
Make sure that the connection described above shows a green icon
Go to your .iqf Application file, right click on your IQ application and click on Properties, on the Properties window, check the option to allow Aspen Watch Performance Monitoring (AWENB). Do this for each IQ application that you want to monitor.
Go to Watch Maker and check that the collection status for your IQ application is now “Success”
Only IQ applications that include the Prediction module can have the AWENB parameter set to ON and be collected in Aspen Watch Performance Monitor.
Note: Standard IQ token consumption is 1 token for 4 inferentials, however IQ applications that have Aspen Watch collection enabled consume 1 token per inferential.
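The token arithmetic in the note above can be made concrete; rounding the standard case up to a whole token is an assumption here, not a licensing statement:

```python
# Token arithmetic from the note above: 1 token per 4 inferentials normally,
# 1 token per inferential when Aspen Watch collection is enabled.
# Rounding a partial group of 4 up to a whole token is an assumption.
import math

def iq_tokens(n_inferentials, watch_enabled):
    if watch_enabled:
        return n_inferentials
    return math.ceil(n_inferentials / 4)

# e.g. 8 inferentials: 2 tokens normally, 8 with Aspen Watch collection.
print(iq_tokens(8, False), iq_tokens(8, True))  # 2 8
```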
Keywords: Inferential Qualities
Aspen Watch Performance Monitor
References: None |
Problem Statement: MSC installation runs into the following pop-up error: NobleNet service fails to start.
Cause:
A specific version of the VC++ runtime, which is required to run the NobleNet service, was corrupted. | Solution: Go to Control Panel | Programs & Features and uninstall the following versions of the VC++ runtime:
Microsoft Visual C++ 2008 Redistributable – x64 xxxxxxxx
Microsoft Visual C++ 2008 Redistributable – x86 xxxxxxxx
Microsoft Visual C++ 2010 Redistributable – x64 xxxxxxxx
Microsoft Visual C++ 2010 Redistributable – x86 xxxxxxxx
Microsoft Visual C++ 2013 Redistributable – x64 xxxxxxxx
Microsoft Visual C++ 2013 Redistributable – x86 xxxxxxxx
Go to Aspen installation package, and install the following redistributables:
\aspenonesuite\core\Microsoft_vcredist_x64\vcredist_x64.exe
\aspenonesuite\core\vcredist_x86_VS2008SP1\vcredist_x86.exe
\aspenonesuite\core\vcredist_VC2010\vcredist_x64.exe
\aspenonesuite\core\vcredist_VC2010\vcredist_x86.exe
\aspenonesuite\core\vcredist_x64_VS2013SP1\vcredist_x64.exe
\aspenonesuite\core\vcredist_x86_VS2013SP1\vcredist_x86.exe
Click the “Retry” button on the error dialog. If this still does not fix the problem, click “Cancel”, uninstall the already installed products, and then reinstall.
Keywords: NobleNet Portmapper, PORTSERV.exe
References: None |
Problem Statement: How to include the tag name into the node ID for InfoPlus.21 OPC UA Server? | Solution: The NodeIds of tags in InfoPlus.21 OPC UA Server are opaque type. It is a serialized byte stream that is generated on various properties of the tag.
It cannot be easily constructed by merely using the tag name or hierarchy.
Hence it is recommended to use browsepath instead of NodeIds.
The browsepath is the hierarchical path of the tag in the OPC UA Server address space, which is built using the browse name attribute of the tag/node. The browsepaths are built using exactly the same approach used for building the NodeIds. The OPC UA Client then has to translate the browsepath into a NodeId using the TranslateBrowsePathsToNodeIds service/API of the OPC UA server.
A tag in the InfoPlus.21 OPC UA Server can be read using either the DA branch or the RAW branch.
1. Under DA branch of InfoPlus.21 OPC UA Server, all records have a Measurement node that contains the tag value. In other words, it is basically the value of IP_INPUT_VALUE field of the record.
The browsepath for reading a tag, for example ATCAI or ATCDAI, using DA branch will be:
/Objects/2:DA/2:IP_AnalogDef/2:ATCAI/2:Measurement
/Objects/2:DA/2:IP_AnalogDef/2:ATCDAI/2:Measurement
2. The tag value can also be accessed directly using IP_INPUT_VALUE field of a record from the RAW branch.
The browsepath for reading a tag, for example ATCAI or ATCDAI, using RAW branch will be:
/Objects/3:RAW/3:IP_AnalogDef/3:ATCAI/3:IP_INPUT_VALUE
/Objects/3:RAW/3:IP_AnalogDef/3:ATCDAI/3:IP_INPUT_VALUE
So is it possible to create a node ID like: ns=2;s=RAW:IPAnalogDef:ATCAI and will it work?
Answer: No, it will not work. NodeIds created using this approach are invalid.
The NodeIds of tags in the InfoPlus.21 OPC UA Server are opaque: although they always use namespace 2 or 3 (ns=2 or ns=3), the identifier part is a serialized byte stream rather than a readable string.
Hence it is recommended to use browsepath instead of using NodeIds.
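The browsepaths shown above follow a regular pattern, so they can be assembled programmatically. The helper below only builds the path string; an actual OPC UA client would then pass it to the server's TranslateBrowsePathsToNodeIds service to obtain the NodeId. Reading RAW via IP_INPUT_VALUE matches the examples above; other record fields would substitute a different leaf name:

```python
# Build the browsepath strings shown above for the DA and RAW branches.
# This helper only assembles the string; it does not talk to a server.

def browsepath(branch, definition, tag):
    # DA elements live in namespace 2, RAW elements in namespace 3.
    ns = {"DA": 2, "RAW": 3}[branch]
    # DA exposes the value through a Measurement node; via RAW the
    # IP_INPUT_VALUE field is read directly (per the examples above).
    leaf = "Measurement" if branch == "DA" else "IP_INPUT_VALUE"
    parts = [branch, definition, tag, leaf]
    return "/Objects/" + "/".join(f"{ns}:{p}" for p in parts)

print(browsepath("DA", "IP_AnalogDef", "ATCAI"))
print(browsepath("RAW", "IP_AnalogDef", "ATCAI"))
```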
Keywords: OPCUA
IP21
Cim-IO for OPC-UA
References: None |
Problem Statement: Enabling an application—after it is deployed to an online applications server—to operate in online simulation mode. | Solution: The online simulation mode remains in effect the entire time the application is online, until the application is either removed from the online applications server or redeployed with the online simulation feature disabled.
Online simulation is available only if the application is completed to the Controller stage and ready for deployment (that is, completed to the Plant stage) so that the application is ready for online operation as a controller application.
Online simulation is available for all model types of applications: FIR, MIMO, and MISO.
Online simulation is especially useful for (but not limited to) training and testing the constituent controller applications that are part of a Composite suite. The feature can also be used for operator and engineer training, and system verification during commissioning.
During online simulation mode, an application does not engage in active control of the plant. The application runs in a true closed loop simulation, with measurement data received as follows:
For manipulated variables (MVs), the setpoint calculated by the move plan is transferred to the measurement value.
For controlled variables (CVs), the prediction for the next cycle is transferred to the measurement value.
The measurements of all variables do not become stale.
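The measurement transfers described above can be sketched as a single simulation cycle. The dictionary structure and tag names here are illustrative only; the two assignments are what the source describes:

```python
# Sketch of how measurements are fed back during online simulation:
# MV measurements take the move plan's setpoint, CV measurements take the
# next-cycle prediction. The dictionaries are an illustrative structure only.

def simulation_cycle(mvs, cvs):
    for mv in mvs:
        mv["measurement"] = mv["setpoint"]     # move plan -> measurement
    for cv in cvs:
        cv["measurement"] = cv["prediction"]   # next-cycle prediction -> measurement

mvs = [{"name": "FC101.SP", "setpoint": 55.0, "measurement": 50.0}]
cvs = [{"name": "AI2020", "prediction": 98.2, "measurement": 97.5}]
simulation_cycle(mvs, cvs)
print(mvs[0]["measurement"], cvs[0]["measurement"])  # 55.0 98.2
```

Because both loops always overwrite the measurement each cycle, no variable's measurement ever goes stale, which is the closed-loop behavior described above.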
During online simulation mode, IO (input / output) communications and current values are managed as follows:
All of the IO connections are disabled.
The application does not read any values from the DCS.
Values of predictions for the CVs, calculated by the software, are the ones that replace the measurement values.
The simulation starts at the current values that the engineer inputs during the tuning of an online application.
Note: Unlike an offline simulation, an online simulation runs at the real execution time of the controller cycles; this is a notable difference between the two modes.
As shown in the thumbnail illustration below, an application operating in online simulation mode is clearly marked and distinguishable when displayed in the Online tab of the APC Web Server.
Dark backgrounds with light-colored foreground text are used to emphasize that the current display is for simulation.
To deploy an application so that it is enabled for online simulation:
Access the Deploy dialog box.
Select the check box for Enable online simulation.
Complete steps for deployment by clicking Deploy, Redeploy, or OK, as applicable.
Fixed
Documentation added for DMC3 Builder V14.
Keywords: DMC3, Online simulation, RTE application, IO communications
References: None |
Problem Statement: If a 'Cim-IO for IP.21' interface is created, the Test API can be used to validate the setup by doing a Get to retrieve data from the data source (in this case, another Aspen InfoPlus.21 database). If a message like the one below appears, what can be done to fix the problem?
GET successful
Tagname : OMA311
Type: REAL Device Data Type: REAL
Value= 0.000000
Timestamp: Fri Dec 11 15:11:00 2020
Status is 'Bad Tag'
Facility=19
Driver Status=19003 | Solution: When doing a Cim-IO Get via the Test API it asks a series of questions. One of the questions is:
Tagname entry options
Keywords: None
References: None |
Problem Statement: After a new installation of the MES Web Server, the aspenONE Process Explorer home URL displays a blank page on all browsers.
This KB provides steps to resolve this error | Solution: First try to access the IIS homepage on the Web Server with the following URL:
http://localhost/iisstart.htm
If the above URL also returns a blank page, most likely the Static Content feature was not installed on the Web Server while installing IIS.
To enable Static Content,
Open Server Manager -> Add roles and Features -> Select Role-based or feature-based installation -> Select the Server and open the Server Roles -> Locate Web Server(IIS) -> Expand Web Server and then Common HTTP Features -> Check Static Content
Complete the installation and then restart IIS.
Keywords: A1PE
Blank Home page
References: None |
Problem Statement: Customers may face some product issues after installing Windows Updates without following best practices. | Solution: In order to prevent installation errors that may affect Aspentech Software, the user will need to stop all the Aspentech applications and services, before applying Windows updates
Keywords: Windows updates
Aspentech
Software
Patches
Microsoft
MS
References: None |
Problem Statement: This knowledge base article describes how to obtain the locking information for AspenTech licenses on v8 and above when the SLM License Manager has already been installed. | Solution: 1. First search for AspenONE SLM License manager on the start menu:
2. Then Select the Locking Info Option on the top menu:
3. It should display the Locking information window:
4. Click on the Copy to Clipboard button then copy the content to an email or Microsoft Word document. Send the email:
Keywords: None
References: None |
Problem Statement: When you load an IQ or DMCplus application in the ACO platform, it creates a set of memory-mapped files that are stored on disk and maintained using a directory system called the MPF directory. In some cases, this directory system can get corrupted and may need to be re-built. Some symptoms of this condition are when you are unable to load or start a controller for unexplained reasons or if controller values do not update as expected. In this case, you may need to clean and re-generate the shared memory files associated with the online applications.
Note: This procedure does not apply to the RTE platform. | Solution: Procedure to clean and re-create the MPF shared memory for the APC ACO platform (not applicable to the RTE platform):
1. Stop all DMCplus and IQ applications using APCManage
2. Save the CCF or IQF for each of the loaded DMCplus and IQ applications to get the most up-to-date values in their configuration files (unless there are issues with the current system and you cannot reliably do this)
3. Stop the ACO Utility Server service. This should stop all the related services as well.
4. Delete all files in this folder: C:\ProgramData\AspenTech\APC\Online\sys\mpf
Note: If a message is generated Cannot delete, locked by another user for the mpf_dir file, then try renaming that file. Renaming the file will allow you to move it to another location. You can then empty the directory and recreate the MPF region.
5. Open a command prompt As Administrator and run the following commands (make sure each one completes before running the next).
Note: It is normal to see errors like “MPF: Region or file does not exist: DMC_DIRECT” and “StartService failed - An instance of the service is already running.”, etc.
mpf_manage create 200
mpf_manage msgcreate 15000
cd C:\ProgramData\AspenTech\APC\Online\sys\etc
SetupDMCplus.cmd
SetupAspenIQ.cmd
SetupDMCplusComposite.cmd
6. If any of these windows services are not running, then start them (the above procedure normally will start all these for you):
- ACO Utility Server
- Aspen APC DMCplus Data Service
- Aspen Inferential Qualities Data Service
- Aspen APC Message Log Service
- DMCplus Context Service
7. Load all of your DMCplus and IQ applications (see NOTE below).
NOTE: it is recommended that you DO NOT load applications using a batch or command file. There are known issues related to doing this that cause MPF corruptions. It is best to load each one manually to guarantee that no errors occur before loading the next one.
Keywords: MPF, ACO, corrupted
References: None |
Problem Statement: Are there any physical property parameters available for Therminol components? | Solution: Parameters for the following Therminol components are provided for all the Aspen Plus users by Eastman Chemical. These components and parameters have been added to the PURE38 databank in V12.
In the attached file, the data for the following components has been entered and regressed:
Therminol 55
Therminol 59
Therminol 62
Therminol 72
Therminol 75
Therminol D-12
Therminol LT
Therminol VLT
Therminol VP-1
Therminol VP-3
Therminol XP
Using the data from the technical bulletins, the scalar parameters MW (Molecular Weight), TB (Normal Boiling Point), TC (Critical Temperature), and PC (Critical Pressure) have been entered for all components. In addition, VL (Liquid Molar Volume), CPL (Liquid Heat Capacity), HVAP (Heat of Vaporization), KL (Liquid Thermal Conductivity), MUL (Liquid Viscosity), and PL (Vapor Pressure) have been entered and used to regress parameters for these models. See Solution 127338 for an example of how to regress parameters for a heating fluid.
The tech bulletins with property data that can be regressed are provided for a few additional components:
Therminol 66
Therminol 68
Therminol RD
Therminol SP
Keywords: Therminol, Parameters
References: None |
Problem Statement: After registering an application on the GDOT web viewers, GDOT-Simulation or GDOT-Online, it is not possible to see the trends for MVs or CVs, however, it is possible to see the application connecting correctly to the web viewer.
The following error message can show up related to the variable which has been opened in the web viewer:
In SQL Management Studio, it is possible to see that in the respective database, GDOTSimulationHistory or GDOTOnlineHistory (depending on the web viewer in which the problem is being experienced), the dbo.GDOThistory table is not being filled: | Solution: To solve this issue, perform the following procedure:
Go to the SQL Management Studio and in the left column expand Databases and then expand GDOTSimulationHistory or GDOTOnlineHistory database, expand Tables and locate dbo.GDOThistory table.
Right click on the dbo.GDOThistory table and click on Delete. Once the Delete Object window shows up, click on OK.
Once the table has been deleted, left click on the respective database, in this example we selected GDOTOnlineHistory.
Go to File | Open | File, and navigate to one of the following paths:
C:\Program Files (x86)\AspenTech\GDOT Online\V12.1\WebBackEnd\sql – For GDOTOnlineHistory
C:\Program Files (x86)\AspenTech\GDOT Offline\V12.1\WebBackEnd\sql – For GDOTSimulationHistory
Open the file located in the desired path:
or
Edit the AppKey and VarName VARCHAR definitions and change the number inside the parentheses to 255.
Make sure that the database selected is the desired one, this is a very important step.
Click on Execute. Ignore the error message related to dropping the table.
Right click on the database and click on Refresh, make sure that the dbo.GDOThistory table has been successfully created.
To confirm that the data is being collected correctly in the database right click on the dbo.GDOThistory and select Edit Top 200 Rows, you will expect that the table is being filled now.
NOTE: Make sure that the GDOT application has been successfully registered on the Web Core and Web Historian text files and that it is connected to the console.
If the database is still not being filled, try restarting the following services:
For GDOT-Online:
For GDOT-Simulation:
Trends in GDOT web viewer should work now.
Keywords: GDOT web viewer, GDOT Simulation, GDOT Online, trends
References: None |
Problem Statement: How is the COSMO-SAC model used in Aspen Plus? | Solution: COSMO-SAC is a solvation model that describes the electric fields on the molecular surface of species that are polarizable. It requires a fairly complicated quantum mechanical calculation, but this calculation needs to be done only once for a particular molecule and then the results can be stored. In its final form, it has individual atoms as the building blocks for predicting phase equilibria instead of functional groups. This attribute provides a considerably larger range of applicability than group-contribution methods. The calculation for liquid nonideality is only slightly more computationally intensive than activity-coefficient models such as NRTL or UNIQUAC. COSMO-SAC is complementary to the UNIFAC group-contribution method because it is applicable to virtually any mixture and its accuracy seems to be independent of the chemicals involved, making it a robust tool for application to new technologies where few data are available.
Implementation
COSMO-SAC is implemented in Aspen Plus as a system liquid activity coefficient model. The activity coefficient model name is called COSMOSAC, and the Property Method that uses this model is called COSMOSAC.
Parameters
For each component, there are six input parameters. CSACVL is the component volume parameter. SGPRF1 to SGPRF5 are five component sigma profile parameters; each can store up to 12 points of sigma profile values.
The table below lists the parameters in this model. The attached Word files contains details of the COSMO-SAC model equations and parameters.
Parameters in COSMOSAC model
CSACVL Component volume in COSMO-SAC gamma model
SGPRF1 Elements 1-12 of Component Sigma Profile in COSMO-SAC gamma model
SGPRF2 Elements 13-24 of Component Sigma Profile in COSMO-SAC gamma model
SGPRF3 Elements 25-36 of Component Sigma Profile in COSMO-SAC gamma model
SGPRF4 Elements 37-48 of Component Sigma Profile in COSMO-SAC gamma model
SGPRF5 Elements 49-51 of Component Sigma Profile in COSMO-SAC gamma model
The units for CSACVL are Å3 and for sigma profiles are Å2. The summation over all sigma profile points is equal to the component surface area. These units cannot be changed in the input.
Option codes
The primary version of COSMO-SAC implemented is the model by Lin and Sandler (2002). Two other different versions are also available by using Option Codes in Aspen Plus Interface. The table below lists the option codes.
Option Codes in COSMOSAC model
1 COSMO-SAC model by Lin and Sandler (2002)
2 COSMO-RS model by Klamt and Eckert (2000)
3 Same as the model by Lin and Sandler, except the modification in the exchange energy
(Mathias et al., 2002).
Parameters input
The parameters in COSMO-SAC have to be computed from a fairly complicated quantum mechanical calculation, but this calculation must be done only once for a particular molecule. The Aspen Physical Property System includes a database of sigma profiles for over 1400 compounds from Mullins et al. (2006). The parameters were obtained by permission from the Virginia Tech Sigma Profile Database website (https://apps.che.vt.edu/Liu-2013/VT-Databases.html). Aspen Technology, Inc. does not claim proprietary rights to these parameters. Alternatively, there are other commercial software tools available to perform COSMO quantum calculations. See Knowledge document 113674 for a selected list of references for details on how to calculate these parameters and where you can get these parameters.
From Properties | Parameters | Pure Component, users can input a single value for CSACVL and the sigma profile values into the five parameters SGPRF1, SGPRF2, SGPRF3, SGPRF4, and SGPRF5 (each can store up to 12 points of the sigma profile; SGPRF5 normally stores only the last three points, 49-51).
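To illustrate the bookkeeping, the sketch below (with placeholder numbers, not real sigma-profile data) shows how a 51-point profile is split across the five 12-element parameters:

```python
# Sketch: how a 51-point sigma profile maps onto the five Aspen Plus
# parameters SGPRF1..SGPRF5, each of which can hold up to 12 elements.
def split_sigma_profile(profile, block_size=12):
    """Split a list of sigma-profile values into consecutive blocks."""
    return [profile[i:i + block_size] for i in range(0, len(profile), block_size)]

# Placeholder values standing in for the 51 sigma-profile points
profile = list(range(1, 52))
blocks = split_sigma_profile(profile)
# blocks[0] -> SGPRF1 (points 1-12), ..., blocks[4] -> SGPRF5 (points 49-51)
```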
An example using COSMO-SAC model
The attached bkp file is set up to perform some calculations using COSMO-SAC and compare the results with two other activity coefficient models in Aspen Plus, NRTL and UNIFAC, for a two-phase flash calculation of a water/acetone system. Six parameters must be entered for each component, and these parameters should all come from the same COSMO calculation for consistency.
Keywords: COSMO-SAC
VSTS 585510
References:
Lin, S.-T.; Sandler, S. I. A Priori Phase Equilibrium Prediction from a Segment Contribution Solvation Model. Ind. Eng. Chem. Res. 2002, 41, 899.
Klamt, A.; Eckert, F. COSMO-RS: a Novel and Efficient Method for the a Priori Prediction of Thermophysical Data of Liquids. Fluid Phase Equilibria 2000, 172, 43. |
Problem Statement: This | Solution: provides a sample VBScript calculation that uses information as an array and sends the array to an Aspen Calc formula.
Solution
First create the formula using VBScript, named VbAryInForm. Add an input parameter named arrayParamIn. Use the following sample script:
' Read the input array parameter
arrayIn = arrayParamIn.Value
' Size the output array to match the input
ReDim arrayOut(UBound(arrayIn))
For i = 0 To UBound(arrayIn)
    arrayOut(i) = arrayIn(i) + (10 * i)
Next
' Return the modified array to the caller
ReturnValue = arrayOut
Create a function using VBScript named VbArrayIn. Use the following script:
' Build a two-element input array
Dim arrayIn(1)
arrayIn(0) = 5
arrayIn(1) = 10
' Pass the array to the formula created above
arrayOut = VbAryInForm(arrayIn)
' Sum the returned elements: 5 + 20 = 25
ReturnValue = arrayOut(0) + arrayOut(1)
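For reference, the same element-wise calculation can be sketched in Python (purely illustrative; Aspen Calc itself runs the VBScript above):

```python
def vb_ary_in_form(array_in):
    """Mirror of the VbAryInForm formula above: add 10*i to element i."""
    return [value + 10 * i for i, value in enumerate(array_in)]

array_out = vb_ary_in_form([5, 10])         # [5, 20]
return_value = array_out[0] + array_out[1]  # 25
```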
Keywords: VBScript
Array
parameter
References: None |
Problem Statement: How do I get rid of the AspenSplash application error in the Event Viewer logs when trying to launch APEA or ACCE and the software does not respond? | Solution: When there is an AspenSplash application error on the system, first try to register AspenSplash on the machine as follows:
1. Run Command line Prompt CMD as admin
2. Under the prompt, run: "C:\Program Files (x86)\Common Files\AspenTech Shared\AspenSplash.exe" /regserver (the quotes are needed because the path contains spaces)
Then launch the application and go to Event Viewer | Windows Logs to find the error. If the AspenSplash error still exists, you may need to disable AspenSplash registration as follows:
1. Start regedit, go to HKCU\Software\AspenTech and
2. create a string value DoNotRegister and set it to 1
Keywords: APEA, ACCE, Installation
References: None |
Problem Statement: Why do the Aspen OnLine V12 services not appear in the Task Manager Services window after installation? | Solution: First, check whether the interactive service works. Check the installation folder for AOL and see if the following file exists (AOLServiceInteractive3xx.exe):
C:\Program Files\AspenTech\Aspen OnLine V12.0\AOLServiceInteractive380.exe
If the file exists, double-click to launch this .exe; a window like the screenshot below will appear:
Leave it open and then launch AspenOnLine.exe in the same folder. Then try running an example project (e.g. AOLSample) to see if that works.
If the interactive service works, it's possible that something went wrong during the installation that prevented the AOL service from being installed.
You can try installing the service manually:
1. Open Command Prompt as Administrator
2. Navigate to C:\Program Files\AspenTech\Aspen OnLine V12.0\
3. Input the following into the command prompt and press enter to run it:
sc.exe create AOLSvc390 binPath= "C:\Program Files\AspenTech\Aspen OnLine V12.0\AOLService390.exe"
(Note the space after binPath= and the quotes around the path, which contains spaces.)
This should create a new Service named AOLSvc390 in the services list - this will be the AOL V12.0 Service.
4. Make sure to change the service's user log on to the same user (ideally with administrator privileges) that installed the SQL Server Express 2014. If you are using Local System account, make sure to check the checkbox Allow service to interact with desktop.
Keywords: AOL, Services, Installation
References: None |
Problem Statement: Does Aspen Mtell support TLS 1.2 security protocol? | Solution: Yes, Aspen Mtell supports TLS 1.2 security protocol in versions V12 and later. Instructions for configuring TLS 1.2 can be found in the Aspen Mtell TLS 1.2 Configuration documentation.
Keywords: Transport Layer Security
Certificate
References: None |
Problem Statement: I would like to change the password for the 'Admin' user which was set during initial configuration. | Solution: When security is enabled in Aspen Mtell, the 'Admin' username and the password created during initial configuration are available to be used as a last resort to login to Aspen Mtell if all other users are disabled or otherwise unable to access the application. The 'Admin' user will always be able to access Aspen Mtell, so it is important to ensure that the password is kept confidential. If the password has been compromised or needs to be changed for another reason follow the steps below.
1. Open Aspen Mtell, click on the Configurations tab and then click on Settings
2. Click on Security Settings
3. Click on Set Admin Password
4. In the dialogue box that pops up enter the old Admin password, enter a new password and then confirm the new password. Click OK.
Your new 'Admin' password will be saved and you will need to use it the next time you use the 'Admin' username.
Note: If you do not have the old Admin password, please refer to KB 000097443 for steps for resetting a forgotten password.
Keywords: Administrator
Reset password
Reset admin password
Reset administrator password
Resetting password
Resetting admin password
Resetting administrator password
References: None |
Problem Statement: How to resolve “libmmd.dll” error messages when installing v12? | Solution: When installing the v12 Engineering Suite, you may experience error messages similar to these.
These errors indicate that the compilers the installation requires cannot be initialized or found by the installer. To resolve this error message, follow the directions below; they apply to both manual and silent installations.
Close out of the installation.
Reboot the machine.
Open file explorer and go to these locations.
C:\Program Files (x86)\Common Files\Intel\Shared Libraries\redist\ia32_win\compiler
C:\Program Files (x86)\Common Files\Intel\Shared Libraries\redist\intel64_win\compiler
Confirm that these folders/files exist in these locations. These are the compiler files that the installation needs to continue.
Go to your Environment Variables. You can access that menu by going to your Start Menu and typing Environment Variables, then clicking the button at the bottom. You may need administrative privileges to access this menu; if so, contact the appropriate IT personnel.
In this menu, you will need to edit your PATH variable in both the User and System sections. Click on the Edit button.
Click New to add a new entry, then add these two paths as separate entries.
C:\Program Files (x86)\Common Files\Intel\Shared Libraries\redist\ia32_win\compiler
C:\Program Files (x86)\Common Files\Intel\Shared Libraries\redist\intel64_win\compiler
Click Ok to save. Retry the Installation.
If you continue to experience errors, please contact AspenTech support.
There is also an alternative way to set the environment variables using the attached executable.
Download the exe to a folder
Open CMD and run as administrator.
Change the directory to the folder in step 1.
Run SetEnvironmentVariable.exe intelfortranx86 to set the 32-bit Intel Fortran environment variable
Run SetEnvironmentVariable.exe intelfortranx64 to set the 64-bit Intel Fortran environment variable
Double-check the environment variables to confirm the changes are present.
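As a quick sanity check, a small script like the sketch below (illustrative, not an AspenTech tool) can confirm that both compiler folders appear in a Windows-style PATH string:

```python
# The two folders the V12 installer expects on PATH (from the steps above)
COMPILER_DIRS = [
    r"C:\Program Files (x86)\Common Files\Intel\Shared Libraries\redist\ia32_win\compiler",
    r"C:\Program Files (x86)\Common Files\Intel\Shared Libraries\redist\intel64_win\compiler",
]

def missing_from_path(path_value, required_dirs):
    """Return the required directories absent from a semicolon-separated PATH string."""
    entries = {e.strip().rstrip("\\").lower() for e in path_value.split(";") if e.strip()}
    return [d for d in required_dirs if d.rstrip("\\").lower() not in entries]

# Example (run on the affected machine):
#   import os
#   print(missing_from_path(os.environ.get("PATH", ""), COMPILER_DIRS))
```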
Keywords: ApSparse.EXE DistillationAnalysis.exe libmmd.dll v12 Engineering
References: None |
Problem Statement: You might get the following error when trying to import a data source or link a historian in Aspen Event Analytics: Validation Failed.
When you click on the Validation Failed message, you may see the error message Unexpected load failed exception. | Solution: This error is most likely due to tag names or descriptions with Non-Alphanumeric characters. It can be resolved by manually editing the Tag Names or Tag Descriptions to remove any special characters.
Users can also follow the instructions below to use a VBA macro to remove all non-alphanumeric characters from a CSV file or Excel spreadsheet if the number of tags is too large for manual intervention:
1. Open your excel workbook, click on the DEVELOPER Tab, and then click on Visual Basic in the ribbon
If you don't see the DEVELOPER tab, you will need to enable it by going to File > Options > Customize Ribbon > Main Tabs and then checking the box next to Developer
2. In the Visual Basic Editor window click Insert > Module to create a new module.
3. Paste the below VBA code into the code window.
Function RemoveNonAlpha(str As String) As String
    Dim ch, bytes() As Byte: bytes = str
    For Each ch In bytes
        ' Keep letters, digits, periods and spaces; drop everything else
        If Chr(ch) Like "[A-Z.a-z 0-9]" Then RemoveNonAlpha = RemoveNonAlpha & Chr(ch)
    Next ch
End Function
Sub RemoveNonAlphaMacro()
    Dim rng As Range
    Dim MyRange As Range
    Set MyRange = Application.Selection
    Set MyRange = Application.InputBox("Select One Range:", "RemoveNonAlphaMacro", MyRange.Address, Type:=8)
    For Each rng In MyRange
        rng.Value = RemoveNonAlpha(rng.Value)
    Next
End Sub
4. Click the Save button
5. Go back to the current worksheet and press the Macros button
6. Select your new macro, then click the Run button
7. Select one range which contains non-alphanumeric characters that you want to remove and then click Ok button
8. You should see that all non-alphanumeric characters have been removed from the selected range of cells.
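If the tag list lives in a CSV file, the same cleanup can be sketched in Python (illustrative only; the file names are hypothetical):

```python
import re

def remove_non_alphanumeric(text):
    """Keep letters, digits, periods and spaces, mirroring the VBA function above."""
    return re.sub(r"[^A-Za-z0-9. ]", "", text)

# Example: clean every field of a CSV file (file names are hypothetical)
# import csv
# with open("tags.csv", newline="") as src, open("tags_clean.csv", "w", newline="") as dst:
#     writer = csv.writer(dst)
#     for row in csv.reader(src):
#         writer.writerow([remove_non_alphanumeric(field) for field in row])
```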
Keywords: Aspen EA
AEA
special characters
References: None |
Problem Statement: How do I restore a SQL database from backup using SQL Server Management Studio (SSMS)? | Solution: You can restore a database from backup to overwrite the one that already exists on your server, or you can restore a fresh instance of the database.
Restoring a database from backup to overwrite the one that already exists
Ensure that all your Aspen Mtell services are stopped and then follow the steps below
1. Log into SQL Server Management Studio and select your server in the Object Explorer panel
2. Click on the + next to Databases to expand it
3. Right-click on the database you would like to overwrite and select Tasks --> Restore --> Database...
4. Click on the Device radio button and then click on the ... button
5. Click Add
6. Navigate to the location where you stored your backup file, select your backup file and click OK
7. Click OK
8. Click OK
9. Click OK when you get a message saying the database was restored successfully.
Restoring a fresh instance of the database
Follow the steps below
1. Log into SQL Server Management Studio and select your server in the Object Explorer panel
2. Right-click on Databases and select Restore Database...
3. Click on the Device radio button and then click on the ... button
4. Click Add
5. Navigate to the location where you stored your backup file, select your backup file and click OK
6. Click OK
7. If necessary, edit the Database Name (this is often useful when restoring a backup of your Production database to use in Test)
8. Click OK
9. Click OK when you get a message saying the database was restored successfully.
Keywords: Back up
Back-up
References: None |
Problem Statement: How do you create an SLM silent installation? I am unable to record only the SLM tools portion of the installation. | Solution: The aspenONE Installer does not support the record XML file option for the SLM Tools. Users who want to set up only the SLM Tools can follow this article to install them silently.
Please follow the procedure below to perform an SLM Tools installation silently:
· Download the attached file Silent_SLM_Tools.zip, which contains the files SLM TOOLS.XML, SilentExecute.bat, and ATRunUnattended.exe to a particular location from where you would like to execute the installation.
· Modify the XML and batch files parameter as follows:
- Replace the \\MEDIALOCATION parameter with the location of the media (from the downloaded file, it should be on the following location: .\aspenONE_V11_SLM)
- Replace the \\SCRIPTLOCATION parameter with the location of the script where you will be executing the batch files.
· Open a Command Prompt (CMD.exe) window as an Administrator
· Call the SilentExecute.bat file through the Command Prompt window.
Keywords: install, SLM, silent, tools, record
References: None |
Problem Statement: The BatchSep block reports the Cumulative pot duty on the Results | Summary tab; however, it is different from the Net duty value on the Results | Pot tab. | Solution: The cumulative pot duty result reported on the Results | Summary sheet of the BatchSep block is the amount of heat added or removed from the reactor.
Cumulative Duty is defined by the following equation:
Cumulative_duty = H_reactor_contents - H_feeds_streams - H_initial
Net Duty is calculated by the equation:
Net Duty = Externally added duty - Heat loss to the environment
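As a numeric illustration of the two definitions (all enthalpy and duty values below are made-up numbers, not results from the model):

```python
# Hypothetical enthalpies, e.g. in GJ
h_reactor_contents = 12.0
h_feed_streams = 8.5
h_initial = 2.0
cumulative_duty = h_reactor_contents - h_feed_streams - h_initial  # 1.5 GJ

# Hypothetical duty terms, e.g. in GJ
externally_added_duty = 2.0
heat_loss_to_environment = 0.5
net_duty = externally_added_duty - heat_loss_to_environment  # 1.5 GJ
```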
Keywords: Cumulative pot duty, pot duty, cumulative duty, BatchSep
References: None |
Problem Statement: Installation of Microsoft SQL Server Express guide for aspenONE products. | Solution: As part of the aspenONE Suites, the user may be required to install Microsoft SQL Server Express as a prerequisite. AspenTech provides Microsoft SQL Server Express within the 3rd Party Redistributables folder on the aspenONE Media. Attached is a step-by-step guide to install the software.
Keywords: SQL Server Express, Installation guide
References: None |
Problem Statement: GDOT applications can access IP.21 data using OPC DA. Some additional configuration steps are required to enable that communication. Changes are required on both the IP.21 machine and the GDOT machine. This KB explains the procedure | Solution: Prerequisites
Be ready with a list of the userids that will be used to run GDOT applications. These users are referred to as the “GDOT application userids” in the rest of this document.
Changes to the IP.21 machine
You will need an account with Administrator privilege to perform these configuration changes. IP.21 should already be installed.
Launch the Computer Management utility (compmgmt.msc from a command line) and add the GDOT application userids to the “Distributed COM Users” group.
Run the DCOMCNFG utility and navigate to “My Computer”, right click on My Computer and click “Properties”. In the Properties dialog window, make the following changes:
In the COM Security tab, Under Access Permissions, click Edit Defaults. If “Distributed COM Users” is not in the list of users and groups, then add it. Select Distributed COM Users in the list and then enable both Local and Remote Access. Click OK.
Still in the COM Security tab, but under Launch and Activation Permissions, click Edit Defaults. Select Distributed COM Users in the list, then enable Local and Remote Launch, and enable Local and Remote Activation. Click OK.
Click OK again, to close the Properties dialog.
Still in the DCOMCNFG utility, navigate to “DCOM Config” (it’s below My Computer), find the component IP21 OPC Data Access .NET Server Wrapper and open the Properties dialog. Make sure that:
The Security tab has the Launch and Activate permissions set to use the default
The Security tab has the Access permissions set to use the default
The Identity tab is set to use the launching user.
Click OK to close the Properties dialog.
Open the Windows Firewall with Advanced Security utility (wf.msc from a command line) and take the following actions:
Find the Inbound Rule COM+ Network Access (DCOM-In) and enable it.
Create an Inbound Rule to enable access to the program %SystemRoot%\SysWOW64\OpcEnum.exe; name the rule OPC Enum.
Create an Inbound Rule to enable access to the IP.21 OPC DA server. This program path is %ProgramFiles%\AspenTech\InfoPlus.21\db21\code\IP21DA_Server.exe. Name the rule Aspen IP.21 OPC DA Server.
Open the Services utility (services.msc from a command line) and take the following actions:
Find the service named “OPCEnum” and open its Properties dialog
Change the Startup Type to “Automatic (Delayed Start)”
Start the service
You are done configuring the IP.21 system.
Changes to the GDOT machine
You will need an account with Administrator privilege to perform these configuration changes. GDOT should already be installed.
Run the DCOMCNFG utility and navigate to “My Computer”, right click on My Computer and click “Properties”. In the Properties dialog window, make the following changes:
In the COM Security tab, Under Access Permissions, click Edit Defaults. If “ANONYMOUS LOGON” is not in the list of users and groups, then add it. Select ANONYMOUS LOGON in the list and then enable both Local and Remote Access. Click OK.
In the COM Security tab, Under Access Permissions, click Edit Limits. If “ANONYMOUS LOGON” is not in the list of users and groups, then add it. Select ANONYMOUS LOGON in the list and then enable both Local and Remote Access. Click OK.
Click OK to close the Properties dialog
Open the Windows Firewall with Advanced Security utility (wf.msc from a command line) and take the following actions:
Find the Inbound Rule COM+ Network Access (DCOM-In) and enable it.
You are done configuring the GDOT machine.
Keywords: …GDOT, IP21 OPC DA, DCOM, Distributed COM Users
References: None |
Problem Statement: What is the purpose of the Allow partial light ends input checkbox? | Solution: The Allow partial light ends input checkbox can be found on the Light-Ends Handling & Bulk Fitting Options window as shown below. This form is instantiated via the Assay view available in the Aspen HYSYS oil manager.
Activate the Allow Partial Light Ends Input checkbox to achieve a better fit between generated curves and input curves for partial light ends analysis data. In situations where either a full light ends analysis is not available or you do not want to identify part of the analyzed light ends components, HYSYS can generate overlapping hypothetical components to compensate for the missing portion of the light ends, making the output stream match both the partial light ends input and the other input curves.
Keywords: Oil Manager, Assay, Light-Ends Handling & Bulk Fitting
References: None |
Problem Statement: How can someone modify the correlations used to calculate the properties of the hypo components created by the Oil Manager? | Solution: HYSYS lets you choose from a wide variety of correlations to determine the properties of a generated hypocomponent.
1. Select the Oil Manager option from the Properties Environment Home Ribbon.
2. Select the Correlation Sets tab and click Add. Here you can define additional specialized correlation sets available to Oil Manager calculations. A correlation set is a collection of methods associated with and used to calculate the Oil Manager properties: low and high end temperature, MW, SG, Tc, Pc, acentric factor, and ideal H.
3. Here you can modify the correlation methods to use for the calculation of each property by the Oil Manager.
4. You can select which Assay or Blend to use with this new correlation method.
5. To create a new correlation method for another assay/blend, just follow the same procedure to add a new correlation set and append it to the desired assay/blend. (For example, you may want to use Lee-Kesler to estimate molecular weight for Assays 1 and 2 but Peng-Robinson for Assay 4.)
Keywords: Oil manager, Hypo components, Assay, Blend, Properties, Correlations
References: None |
Problem Statement: Customer is able to use the Legacy Add-in with no problem, but when trying to use the MES COM Add-in functions (Current Value, Historical Value, Historical Values, etc.), the following error is received:
Error: The server has rejected the client credentials
There's no Database security configured on the Aspen InfoPlus.21 system, and customer can use the new COM add-in correctly on the IP.21 system.
Customer states that the client system and the IP.21 system are in different networks. | Solution: When the MES COM add-in uses the Process Data service to communicate with IP.21, it uses WCF, which requires the user to be authenticated on the server. If the client and server are on different networks, there is a problem authenticating the remote user. Process Explorer and the legacy add-in do not use WCF; therefore, they do not have this problem.
The solution for the COM add-in is to use the Aspen DA for IP.21 service instead of the Aspen Process Data service. Through DA, the COM add-in should work the same way as Process Explorer. To use DA instead of the Aspen Process Data service, open the ADSA Administrator and check whether the Aspen Process Data service is configured there. If it is, remove it and make sure Aspen DA for IP.21 is still there.
Also, please make sure the Aspen Process Data Service service is stopped and disabled on the Aspen InfoPlus.21 server.
Before removing Aspen Process Data Service:
After removing Aspen Process Data Service:
Keywords: credentials
reject
COM add-in
References: None |
Problem Statement: After deploying a packaged | Solution: a Developers Community appears that should not be there.
Solution
Review the Definitions for Developer Communities with links to pipeline location/tree node security.
Remove the Developer Communities in Definitions
Remove any security set for Developer Communities in Pipelines.
Repackage the solution.
Keywords:
References: None |
Problem Statement: When configuring the Aspen Mtell database through the Database Wizard, you might get the following error message:
Save Error
There was an error saving the database connection settings to the registry. Requested registry access is not allowed. | Solution: This error shows up when Aspen Mtell is not being run as an administrator. To resolve it follow the steps below.
1. Close the Database Wizard.
2. Right click on either Aspen Mtell System Manager or Aspen Mtell Agent Builder.
3. Select Run as Administrator.
4. Progress through the Database Wizard, then press Finish. You should not receive the error message.
Keywords: cannot configure database
Installation
New install
SQL server
DB
References: None |
Problem Statement: Is it possible to find and move to a block in the sequence of the Control Panel? The sequence is very long, so it is difficult to locate particular blocks. | Solution: Find in Sequence was available in earlier versions of the Control Panel in Aspen Plus; however, this feature was removed in V9. It has now been re-implemented in V12. If you right mouse click in the left hand pane of the Control Panel with the sequence (in a place where there is not a block name), it is possible to select Find Node.
After clicking Find Node, a dialog box pops up where it is possible to select or search for a block by typing in the Search Text box (text is case sensitive).
After clicking OK the system highlights the block in the sequence panel. Once the block is highlighted, it is possible to use the right mouse button to 'Move To' or add a stop point.
Keywords: None
References: VSTS 428216
Problem Statement: How can a fuel cell be modeled in Aspen Plus? | Solution: Attached is a fuel cell example using a User2 block with a Fortran subroutine. This example has two product streams representing the anode and cathode outlets. An example of a fuel cell with only one product stream is given in Solution document 102568.
The User2 model has two feed streams (anode and cathode feed), two product material streams (anode and cathode outlet), a work output stream, and a heat output stream.
This model is based on a paper in J. of Power Source 71 (1998) 337-347.
The Fortran routine is an illustration of using an Aspen Plus user-defined unit operation (USER2 block) to model a fuel cell with H2 and air, with two inlets and two outlets.
The Fortran routine also illustrates:
The use of component indices for conventional substream.
Calculation of the information streams (work and heat streams).
The use of the IFCMNC function to retrieve physical property and simulation parameters from the Aspen Plus data structure.
The FLASH physical property routine for a TP and a PQ flash.
The SHS_CPACK and DMS_KFORMC ASPEN PLUS utilities and the AVEMW ASPEN PLUS function.
Writing to the GUI Control Panel using the Terminal Writer utility DMS_WRTTRM.
Notes:
This routine supports two material feed streams, two material product streams, a single product work stream, and a single heat product stream.
Keywords: user unit operation
user fortran subroutine
References:
Aspen Plus User Models Reference Manual |
Problem Statement: This example shows how to use BatchOp unit operation under Batch Model in Aspen Plus for the simulation of batch salt | Solution: preparation and buffer tank simulation for continuous process feed. Solution
The salt solution needs to be prepared as one batch per day to fill up the buffer tank for the continuous process feed.
In this example, the BatchOp model will be used for both a salt DISSOLVE unit operation and a BUFFER tank unit operation.
The process of salt solution preparation is specified as below:
There is an initial charge of 500 kg of Solid Salt into the DISSOLVE block.
In Unit Procedure WATERF Operation Step O-1, water is fed continuously until the total mass equals 2000 kg, with the flow ramping up over 30 minutes to a maximum of 2000 kg/hr.
The solution in the reactor is given some time before discharge to the buffer tank begins. Discharge to the buffer tank starts after 1.5 hr of the operation, through the side draw SALTS stream of the DISSOLVE BatchOp unit.
In Unit Procedure DISCHARG, the discharge of the solution to the buffer tank is ramped up to a maximum flowrate of 2000 kg/hr until the total mass in the reactor is about 100 kg, after which the discharge flowrate ramps down to 100 kg/hr to remove the remaining solution in the reactor. A small quantity will be left over in the reactor.
The buffer tank process is specified as below:
There is an initial charge of solution at 200 kg/hr in the BUFFER unit while waiting for new feed from the solution preparation.
The continuous feed DRAW, drawn through the side draw of the buffer tank, is set to 83 kg/hr in Unit Procedure BUFFER.
The whole cycle for the batch flowsheet is 24 hrs.
The attached example file is in BKP format (Batch Sequence_R2.bkp) and created using V12.
Keywords: None
References: None |
Problem Statement: What is a dividing wall column? | Solution: The dividing wall column is just a Petlyuk column in a single physical column, with the pre-fractionator embedded into the main column. Since the pre-fractionator is physically inside the main column, there will be some heat transfer between it and the main column. The user can model the influence of this heat transfer, if desired, by modifying the flowsheet inside the hierarchy.
When the dividing wall column is modeled in Aspen Plus, it is the same as the Petlyuk column; there is not much difference between the dividing wall column and the Petlyuk column in the first place. We can use the apm as a starting point and modify the flowsheet inside the hierarchy to model the heat transfer between the pre-fractionator and the main column if necessary. We can also use MultiFrac to model a dividing wall column if we know how to connect the streams between the pre-fractionator and the main column.
You can download an example file in the attachment.
Keywords: Dividing wall column, Petlyuk, Aspen Plus, Multi-frac
References: None |
Problem Statement: When simulating oil mixtures, users usually use Assay input to represent the composition. In Simulation environment, distillation curves could be included in stream summary table to report the results of streams that contain Assay type components. However, the default curves only contain limited predefined cuts. How to add additional points to the curves (D86, D1160, TBP, VAC)? | Solution: Users are allowed to add additional points to enhance distillation curves. The steps are shown below with D86 curve as the example.
Add a new property-set in Property Sets folder.
Select D86T in Properties tab.
Input new cut points in Qualifiers tab.
In Stream Summary table, add the property set which was created in step 1. Then the new cut points will be displayed in the table.
Note:
In step 2, D86T and D86TWT could be selected to report temperatures at given liquid volume percent or given weight percent respectively.
D86LV or D86WT could be selected to report liquid volume percent or weight percent at given temperatures.
Similar properties are available for other curves.
Keywords: Aspen Plus, Assay, Distillation curve, D86, D1160, True Boiling point, Vacuum Curve
References: None |
Problem Statement: Example for modeling LNG Exchanger in HYSYS Dynamics | Solution: In this model the two LNG exchangers are modeled in HYSYS Dynamics using the Heat Transfer Co-efficient's and other geometry data generated from the rigorous simulation models using Aspen Plate Fin Exchanger.
Keywords: LNG, dynamics, exchanger
References: None |
Problem Statement: In versions prior to V14, the Aspen Watch Maker Help File documentation on this procedure is not very clear for users, so this document provides more comprehensive guidance. As part of Documentation Defect 603365, the Help File is targeted to be improved in V14 with the detailed procedure below. | Solution: SQL Procedure for Custom DCS Types
The SQL procedure for a Custom DCS Type is used to manage the different operations related to a PID record. The PID record name is the only argument that is needed for the SQL procedure to perform the supporting logic.
The procedure must perform the following logic:
First, it checks the AW_PROCESSING field of the PID record to decide which logic to perform. The values for the AW_PROCESSING field are as follows:
-2 = INITIAL (not yet configured; you must first go into CONFIG mode before switching to ON or OFF states)
-1 = CONFIG (perform one-time configuration logic)
0 = OFF (don't process this PID record)
1 = ON (perform normal cyclic logic)
The first step for any PID record is to execute the CONFIG (or configuration) logic. If the configuration logic succeeds, AW_PROCESSING should be set to ON mode. If configuration fails, it should be set to OFF to disable processing until the problem is fixed.
Each processing mode is described below. The best way to understand each processing section is to look at the existing examples for different DCS Types. Examine the .CFG file and the corresponding .SQL file for the DCS type.
Note that the CFG file creates mappings that read raw DCS parameters for PID loops and stores them into fields in the AW_PIDDef record. Depending on the value of the DCS parameter, you may need to write logic in this SQL procedure to convert that value into something that Aspen Watch recognizes. Some examples are given below for this type of scenario.
The Aspen PID Watch Help topic “Configuring PID Loops in Aspen Watch” below the “Aspen PID Watch Concepts” section includes general information for DCS vendor support, CFG file scan rate considerations, and Cim-IO logical devices configuration.
Also refer to the following topics in the Aspen Watch Maker Help document for additional supporting information:
PID Record Information
Configuration File for Custom DCS Types
Below is a description of each of the processing sections of the .SQL procedure with examples:
-2 = INITIAL (initialize intermediate values) Logic
This condition occurs when the PID tag is first created.
The code in this section should only initialize the data quality fields used in the processing of the record. The data quality fields (AW_<fieldname>Q) treat a value of zero as the initial quality. You only need to initialize the quality fields that are used to hold raw DCS parameters (check the .CFG file for those field mappings).
It should also set the AW_IOERROR to -1 (Initial).
Example:
WHEN -2 THEN -- INITIAL
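A sketch of how the rest of this section might look. The quality field names (AW_PVQ, AW_SPQ, AW_OPQ) and the pid_name argument are illustrative only; use the AW_&lt;fieldname&gt;Q fields mapped in your own .CFG file, and follow the shipped .SQL examples for your DCS type for the exact syntax:

```sql
WHEN -2 THEN -- INITIAL
    -- Initialize the data-quality fields used by cyclic processing
    -- (zero = initial quality). AW_PVQ/AW_SPQ/AW_OPQ are illustrative
    -- names; use the AW_<fieldname>Q fields mapped in your .CFG file.
    UPDATE AW_PIDDef SET AW_PVQ = 0, AW_SPQ = 0, AW_OPQ = 0
        WHERE NAME = pid_name;
    -- Mark the record as Initial (-1) until CONFIG logic has run
    UPDATE AW_PIDDef SET AW_IOERROR = -1 WHERE NAME = pid_name;
```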
Keywords: None
References: None |
Problem Statement: A procedure for copying a local security configuration from one local security server to another. | Solution: It is possible to copy the Local Security configuration, including the defined roles and applications, from one local security server to another. This would be helpful in a situation where a second security server is set up for redundancy purposes* or to migrate the server node to new hardware.
With Local Security, the security configuration is contained in the AFWDB.mdb file in the C:\Program Files (x86)\AspenTech\Local Security\Access97 directory. This file can be copied from one server and pasted into the corresponding Access97 directory on the new server. As long as both servers have access to the same Active Directory domain accounts built into the local security roles, they will authenticate clients correctly.
To point client machines to the new security server, run the AFWTools utility found under the Start Menu at Start | Programs | AspenTech | AFW Tools. On the Client Registry tab, double click the URL parameter to open the Update Registry Window. From here, you can change the host name in the HTTP address, which points to the pfwauthz.asp page on the current server to the node name of the new security server. For example, if your new security server is named LSSERVER2, the URL parameter would read: http://LSSERVER2/Aspentech/AFW/Security/pfwauthz.aspx.
*There is no automatic failover built into the security server. As a result, having redundant servers will only be beneficial if, in the event that the primary server fails, users know to change the URL parameter on their client machines to the secondary security server. Furthermore, it is important to note that there is no replication between the security servers. Modifying a role or application on one server will not automatically update the redundant server.
Keywords: afw
security
manager
References: None |
Problem Statement: How do I upgrade the SLM Client on application machines? | Solution: AspenTech supports using a newer SLM Client version with machines running older versions of a product. You can standardize on a higher version of the SLM client to simplify the deployment of a patch, or if a higher version of SLM client is needed to use new functionality. The following conditions must be met as part of this upgrade:
SLM Server is the same version or higher than the SLM Client. Please see the following for SLM Compatibility: https://esupport.aspentech.com/S_Article?id=000081689
The Machine adheres to the platform specifications and requirements of the SLM Client. Please see the following for platform support requirements: https://www.aspentech.com/en/platform-support
Upgrade must be completed using the SLM Tools install package from the media. If you want to perform a silent install, please see the following on how to perform a silent install of the SLM Tools: https://esupport.aspentech.com/S_Article?id=000022161
Scenario 1: If a machine is running an aspenONE V11 product and upgrading to V12 SLM Client Tools, the application machine must meet the platform requirements for V12 and the SLM Server must be from V12 or higher.
Keywords: Software License Manager
aspenONE SLM License Manager
Sentinel
References: None |
Problem Statement: Aspen InfoPlus.21 fails to start at boot, although it can be started normally from the Aspen InfoPlus.21 Manager.
One possible explanation is that some dependent services have not fully started. | Solution: Windows waits 120 seconds for all services with startup type Automatic to start completely before starting services with startup type Automatic (Delayed Start). This knowledge base article provides instructions on how to extend the time before services configured with the Automatic (Delayed Start) startup type are started.
Launch the Windows Registry Editor by entering regedit in the Run dialog box.
Browse to the registry path HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\CimTskSrvgroup200.
Add a new DWORD (32-bit) value named “AutoStartDelay” and enter the value in seconds.
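The same value can also be applied with a .reg file; a sketch (the delay here, 300 seconds = hex 12c, is just an example):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\CimTskSrvgroup200]
"AutoStartDelay"=dword:0000012c
```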
Keywords: Delayed Start, Extend, Prolong, startup time
References: None |
Problem Statement: When working in a DMC3 Builder APC Project for Nonlinear Models, it has been observed that sometimes the data plots in the Model view thumbnail do not all show up. For example in the image below, we are expecting to see 4 data plots (3 inputs, 1 output) but there are only 3 data plots in the thumbnail view on the right side.
Data Plot for FIC-2001SP does not show up in the thumbnail: | Solution: The root cause of this issue is that the thumbnail view is case sensitive, so unless the name of the vector in the dataset exactly matches the name of the model variable, the plot will not show up. In the example shown above, the reason FIC-2001SP is not appearing is because the vector name in the dataset is lower case fic-2001SP:
As a workaround, you can rename the vector in the Datasets view to be all upper case, and it should then show up in the Models view automatically. However, please note that the rename will not register a change from lower case to upper case of the same name, so the change will not be accepted directly; you will need to rename it with an additional character first and then change it back. For example:
Right-click on vector name and select properties
Change the vector name from lower case fic-2001SP to upper case FIC-2001SP1 (note the additional character 1 added) and click OK
Then right-click and select properties again, rename to FIC-2001SP (removed the 1) and click OK
Click on Models view to see that it shows up automatically
Renamed vector in Dataset:
Models View now shows FIC-2001 SP data plot in thumbnail:
This is a defect that is targeted to be fixed in V14. Defect ID: VSTS 631313.
Keywords: nonlinear, model, plot, thumbnail, rte, dmc3, builder
References: None |
Problem Statement: How do you add a user-defined property to the Global Data shown on the flowsheet for a stream in Aspen Plus? | Solution: Up to six property set properties can be displayed for a stream on the flowsheet in addition to Temperature, Pressure, Molar vapor fraction, Mole flow rate, Mass flow rate, Volume flow rate, and Heat/Work. You can specify the label, property set, variable format, color, and box shape for each property. If the property cannot be calculated for a stream, it will not appear on the flowsheet for that stream.
Adding a custom property set to global data will be demonstrated using Solid Mass Fraction in a Belt Dryer Case as an example.
Create a Property Set with the properties you would like to report. In this case, solid mass fraction is MASSSFRA of ALL substreams.
Note: make sure you select the appropriate substream so that the property can be shown.
In the Modify ribbon, click on Stream Results button. This ribbon only appears when the Main Flowsheet is at the front.
In the Flowsheet Display Options page, add a label for how you would like the property to be named in the legend. In this case, “Mass Solid Fraction” was used. Then, choose the appropriate Property Set.
Click Apply and OK. The properties will be displayed on the Main Flowsheet and labeled accordingly. (See file attached).
Keywords: Global Custom Data; Solids;
References: None |
Problem Statement: Why does OptiPlant give an error message for equipment coordinates every time the line list is opened? | Solution: This error message appears because the decimal (.) value in the .LLS and .ELS files is being replaced with a comma (,) due to OS regional settings. To resolve this issue, open the Control Panel and go to Clock and Region settings.
Thereafter, Go to Region settings and open Additional Settings
From the additional settings, choose the decimal Symbol as Point (.) and apply the changes.
Finally, restart the system for the settings to be updated in your OS. After making these changes, the issue will go away.
Keywords: Line List, OptiPlant
References: None |
Problem Statement: How do you create a Hot Link to a URL in the Aspen Process Graphics Editor? | Solution: Open the Aspen Process Graphics Editor and create a new (or open an existing) graphic. Create a Hot Link. Right-click on the Hot Link and go to Properties. In the File: text box, enter the path to Internet Explorer (in double quotes) followed by a space and the URL.
Example: "C:\Program Files\Internet Explorer\IEXPLORE.EXE" www.aspentech.com
Note that this also works for a hotlink in a process graphic to a specific document in another application. For example, you may have an Excel worksheet with multiple bar graphs containing specific process data. If you want an operator to be able to go to that document with a click of a button from a Process Graphic:
First, create the document and save it on a common network drive (e.g. drive S), so it is accessible to others. Then in the File: text box, enter the file path to Excel, a space, and the path to the document:
Example: "C:\Program Files\Microsoft Office\Office\Excel.exe" S:\Aspentech\Desktop\Graphics\test.xls
Keywords: url
file
path
website
hotlink
process graphic
References: None |
Problem Statement: The IP_DC_MAX_TIME_INT field in an IP_AnalogDef record only works when the record is updated on a regular basis. If the record is updated on an irregular basis, such as values from a CIM-IO IoUnsolDef record, then this field does not always ensure that there is a value every IP_DC_MAX_TIME_INT interval. | Solution: The solution is to use an SQLplus script (see attachment) to check the records updated by the CIM-IO IoUnsolDef and to update the record if it has not been updated within the IP_DC_MAX_TIME_INT. This script should be run on a scheduled basis at a frequency of 10 minutes. If you have a large number of IoUnsolDef records, to avoid peak CPU load you can stagger several SQLplus scripts so they only process a subset of the IoUnsolDef records at any given time.
Keywords: IoUnsolDef
IP_DC_MAX_TIME_INT
Compression
References: None |
Problem Statement: This Knowledge Base article provides the answer to the following question: How does IP_DC_MAX_TIME_INT work? The question is often asked when data is not being written to history at the specified time interval. | Solution: The common misconception is that you can configure the tag field IP_DC_MAX_TIME_INT to write data to history after a certain time period. It does do this, HOWEVER, you need to have two data points to compare for IP_DC_MAX_TIME_INT to be effective.
For example, if you have two values, and the second value does not break compression, but it took a long time for it to come in, IP_DC_MAX_TIME_INT looks at the timestamp difference between the values at T1 and at T2. T2 may not have been stored because of compression, however, IP_DC_MAX_TIME_INT looks at T2 - T1 and if it is greater than its value, it will write the T2 point to history anyway.
It does NOT, however, go out and scan for data at that IP_DC_MAX_TIME_INT interval, and if no data has been stored, store a point to history. This would be analogous to giving the tag itself the ability to scan data, or communicate with the DCS. That is not possible, and is controlled by CIM-IO and its records and tasks.
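The T1/T2 comparison can be sketched as follows. This is a simplified illustration only, not InfoPlus.21's actual implementation: a plain absolute deadband stands in for the full compression test.

```python
def store_point(last_stored, candidate, significance, max_time_int):
    """Decide whether a candidate point is written to history.

    last_stored and candidate are (timestamp_seconds, value) pairs.
    Simplified illustration of the rules described above -- not
    InfoPlus.21's actual code.
    """
    t1, v1 = last_stored
    t2, v2 = candidate
    # A sufficiently large value change breaks compression outright
    if abs(v2 - v1) >= significance:
        return True
    # IP_DC_MAX_TIME_INT: if T2 - T1 exceeds the maximum interval,
    # store T2 even though compression would otherwise drop it.
    return (t2 - t1) > max_time_int
```

Note that nothing in this logic fetches data: if no candidate point arrives, no comparison happens, which is exactly why a sparsely scanned tag can go longer than IP_DC_MAX_TIME_INT without a stored value.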
One solution for a tag that doesn't get scanned very often, or whose value does not change much, would be to write an SQLplus query and save it as a scheduled QueryDef record. You could activate it every 30 minutes, or whatever interval you deem appropriate, and if a new value hasn't come in, just write the last data value again.
NOTE: The common misconception described above results from poor documentation. The following new information has been recently added to the Aspen InfoPlus.21 documentation:
The tag field IP_DC_MAX_TIME_INT specifies a maximum time interval parameter that is considered by the data compression processing. For more information, see History Repository Tag Fields.
-AND -
The time that has elapsed since the last value was stored in the InfoPlus.21 history repository must be more than the number of seconds specified by IP_DC_MAX_TIME_INT.
Keywords: IP_DC_Max_Time
IP_DC_MAX_TIME_INT
scan rate
References: None |
Problem Statement: How should you set data compression parameters to record values for a discrete tag (i.e. tags storing integer data like records defined against IP_DiscreteDef) only when the tag changes? | Solution: The following data compression settings will cause values for a discrete tag to be written to history only when the tag changes:
· Set IP_DC_SIGNIFICANCE to 1
· Set IP_DC_MAX_TIME_INT to +000:00:00.0
· Set IP_STEPPED to Stepped
Keywords:
References: None |
Problem Statement: What is the numeric format of the IP_DC_Significance field in the IMS tagset? Is it a percent? | Solution: It is an absolute value based on the IP_Input_Value field. For example if the IP_Input_Value of a tag is 95 (which could represent a percent, a degree, etc), and the IP_DC_Significance is set to 1, then if the next IP_Input_Value is <=94 or >=96, it will bypass compression (i.e. value will be recorded.) The number set in the IP_DC_Significance field is not a percent, but an absolute value based on the IP_Input_Value field.
Keywords: data compression; definition records
References: None |
Problem Statement: As described in knowledge base solutions 103093 and 103491, the field IP_DC_Significance is used in both IP_AnalogDef and IP_DiscreteDef records for controlling Data Compression. It is an ABSOLUTE value. What happens if you set that value to 0.0? | Solution:
It is as if Data Compression is turned off - ALL values will be stored in History.
Keywords: None
References: None |
Problem Statement: Is there anything in Aspen Watch to help understand the dynamic disturbance rejection rather than the steady state? | Solution:
Two useful KPIs in the Aspen Watch history are LSTRBI (TRI) for MVs and LSTRBD (TRD) for CVs. These are the contributions to the dynamic move plan. High values of LSTRBI tell you which MVs are contributing least to the dynamic move plan, and high values of LSTRBD tell you which CVs are being controlled most by the dynamic move plan. As an example, ramp variables often have high LSTRBDs. Looking at the values and at what the controller is doing can provide insight into why the controller is moving dynamically in a certain way.
Both values can be positive or negative and are typically in the range of -5% to 105%. An LSTRBD of zero indicates the CV has zero weighting in the dynamic move calculation – an effective ECE of 10^6. The LSTRBI can be changed by tuning move suppression and LSTRBD can be changed by tuning dynamic ECEs and transition zones.
Below is a general description of what values LSTRBD and LSTRBI roughly mean.
LSTRBI for MVs
· From 20% to 100%: move suppression is an important priority for this MV; the controller is attempting to minimize movement of this MV, sacrificing dynamic control of CVs or moving other MVs to do so.
· From -3% to 3%: the dynamic response of this MV is not an important priority; the controller will move this MV as needed to control the CVs.
· Negative: the move plan for this MV involves at least one change of direction as it moves from its current value to steady state; the MV is moving aggressively to improve the response of a higher priority variable.
LSTRBD for CVs
· From 20% to 100%: the dynamic response of this variable is important.
· From -3% to 3%: the dynamic response of this variable is not important.
· Negative: the dynamic response of this variable is made worse to improve the response and get tighter control of a higher priority CV.
Below is a screenshot of the PCWS showing where to find these parameters; they can be plotted in the PCWS by setting the attribute to Contribution.
Keywords: None
References: None |
Problem Statement: A customer is seeing the following error in the Batch21Services_AtlServer log file:
(10092) 01/08/2021 17:50:01.632 AtApplicationInterfaceMonitor::DoWork() - Getting the reporting datasource failed - refreshing: E_FAIL
(10092) 01/08/2021 17:50:01.634 AtApplicationInterfaceMonitor::DoWork - refreshing datasources: S_OK
It happens every minute. This started showing up after the customer switched data source names.
This Knowledge Base article shows how to resolve this error. | Solution: The error message indicates that the reporting datasource name is wrong in the following file…
C:\ProgramData\AspenTech\Production Record Manager\Production Record Manager.Profile.xml
Check the <Reporting> section, correct the datasource name and save the file. The error should disappear after restarting the Aspen Production Record Manager Services service.
Keywords:
References: None |
Problem Statement: This KB article explains the basic Excel configuration for GDOT Model Update and GDOT Excel Add-In. This configuration is encouraged to assure proper functioning. | Solution: Excel settings are per-user settings, and so need to be set for each user that uses the GDOT Add-in or that is associated with a registered Model Update application in DCOMCNFG).
To change the settings:
First, open Excel to a new, empty workbook. Then use the File menu and go to Options.
In the Options dialog, choose “General” and scroll down to the bottom and uncheck the options for prompting about Excel not being the default and about displaying the Start page:
Then choose “Formulas” and set it to manual calculation:
Then choose “Save” and uncheck both Autosave and AutoRecovery
Finally, choose “Trust Center” and then click “Trust Center settings” and make sure these three folders are in the Trusted Locations, with “allow subfolders” enabled on each.
Keywords: Excel add-in, GDOT, Model Update
References: None |
Problem Statement: How can I track the velocity of a fluid in a pipe segment in dynamic mode? | Solution: In steady-state, it is possible to track the velocity profiles of the vapor and liquid phases of a fluid being transported through a pipe segment by going to the Pipe Segment | Performance | Profiles | View Profile… | Table (or Plot) form:
However, once in dynamic mode, the Performance tab of the pipe segment looks different from its steady-state version and there is no option to access the vapor and liquid phases velocities anymore:
There are two easy ways to work around this situation:
1) Right-click on the pipe segment operation, click on the ‘Show Table’ option, double-click on the pipe segment table and use the Variable Navigator to access the different velocity variables available for the pipe segment:
2) Create a strip chart and similarly to what is described above, use the Variable Navigator to access the velocity variables:
With this approach, you can either display the strip chart to graphically track the velocity behavior over time in the pipe segment, or consult the strip chart's historical data to have the data ready in a tabular format (for this, please refer to KB Article 000094756).
Keywords: Velocity, Pipe Segment, Dynamic Mode, Performance Tab, Show Table, Strip Chart, Variable Navigator.
References: None |
Problem Statement: A solution is needed to manage the size and number of MESDXT log files. | Solution:
Aspen Batch and Event Extractor stores information, debug and error messages in the MESDXT.log file. The maximum size this file can grow to is defined by the maxFileSize parameter in the file C:\Program Files\AspenTech\Batch.21\Server\MESDXTLogger.Config.xml, as shown in this picture:
Once this file's size exceeds the maxFileSize parameter specified in the configuration file, the MESDXT.log file is renamed, adding the current date and time to its extension, as shown below:
The maximum number of log files to keep around can be specified using the maxNumberOfLogFiles parameter in the configuration file. The most current maxNumberOfLogFiles files are maintained by deleting older files. If the value of this parameter is 0, no file is deleted and all the log files are kept in this folder until the user deletes them manually.
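For reference, a hypothetical fragment of MESDXTLogger.Config.xml showing the two parameters. Only the parameter names are taken from the product; the surrounding element layout and values here are illustrative, so check the installed file for the real structure:

```xml
<!-- Illustrative layout only -->
<Configuration>
  <!-- Rename MESDXT.log once it exceeds this size -->
  <maxFileSize>10485760</maxFileSize>
  <!-- Keep the 10 most recent rotated logs; 0 = keep all -->
  <maxNumberOfLogFiles>10</maxNumberOfLogFiles>
</Configuration>
```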
To make it easier to set the Global Debug Level, an MESDXT Logger Configuration utility, called AspenTech.MESDXT.LgrCfgTool.exe, can be used. The utility is located in the same directory as the XML file mentioned above: C:\Program Files\AspenTech\Batch.21\Server.
Keywords: log full
service
crash
References: None |
Problem Statement: How do I resolve the error Prediction model execution period does not match the application execution period while updating the .iqf file in IQConfig? | Solution: IQR is the runtime form of the model file, generated from an IQM file. To prepare an IQM model file, two sets of data are required: process data and lab data.
The data frequency of process data is the input data frequency for an .iqm file.
The prediction execution period (P1PERIOD) for an IQ is based on the input data frequency. By P1PERIOD, we mean execution period for prediction process. When we create a prediction in IQConfig, the default value of P1PERIOD is 60 seconds.
An execution period mismatch error appears when the execution period in the runtime file (.iqr) does not match the one in the config file (.iqf); in other words, the input data frequency does not match the execution period for the prediction.
You can find the P1PERIOD as the Application Execution Period, and IQR's as the Model Execution Period at the PR Properties dialog:
To overcome this error, there are two resolutions:
1. Re-identify the model in IQModel with a 1-minute process data frequency and then export the IQR file, or
2. Set the P1PERIOD in IQConfig > Config to 360 sec in this example, i.e. to match the process data frequency used in IQModel.
Note: P1PERIOD is a global parameter for the IQF; keep the IQR files at the same period if there are many IQs in the IQF file.
Keywords: Execution period does not match, Execution cycle error, Prediction model execution period, Application execution period, Execution period error
References: None |
Problem Statement: A Process Simulator File (PSF) file is a text file giving process and properties data, which can be read by both current Exchanger Design and Rating (EDR) programs and heritage HTFS programs, to provide inputs needed by those programs. Process data relates to stream flows and inlet and outlet conditions. Property data consist of arrays of isobaric properties covering a range of temperatures.
The file is made up of blocks of data. For each stream there is one process data block, followed by one or more (max 5) properties blocks. [An EDR “stream” is a “side” in Process Simulator terminology] Each property blocks consists of arrays of properties at a reference pressure for the block. The data within a block is in the form of lines. Each line begins with a line number, followed by (up to) six items of data, separated by spaces. Omitted data items can be identified by a ‘*’. The line number occupies the first three characters, and is followed by a space, or by an A, B or C to indicate continuation of a line.
Process data blocks have line numbers in the 200 series, Properties in the 300 series. Within a block the lines are usually in line number order. A blank line between blocks of data is useful, but not mandatory.
When referring to individual data items, an identifier comprising the line and item number is often used: for example, 202.1 is a stream number and 202.2 its flowrate. (These identifiers are used in EDR mdb's.)
The files could be created by hand with a text editor; this solution describes the format and structure of the file. | Solution:
A process block comprises
201 stream-name
202 stream-number flowrate inlet-X outlet-X blank blank
204 inlet-T outlet T blank blank blank blank
Where:
T refers to temperature
X to quality (vapour mass fraction)
The stream name (optional) can be up to 40 characters
The individual values can be in E or F format (in Fortran terminology) or can be integers. The number of characters occupied by each item is not prescribed, but 13 or 14 characters for each item (and adjoining space(s)) is often used for real values. Allowance should be made for minus signs (for specific enthalpies). Terminal blanks can either simply be omitted, or be represented by '*'. Completely blank lines can be omitted entirely (except 201 or 301).
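Putting the process block together, a hypothetical condensing stream (stream number 1, flowrate 12.5 kg/s, quality 1.0 in and 0.35 out, 370.15 K in and 345.15 K out; all values illustrative) would read:

```
201 EXAMPLE-CONDENSING-STREAM
202 1 12.5 1.0 0.35 * *
204 370.15 345.15 * * * *
```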
All data in a PSF file are in pure SI units (m, kg, s, J, K, Pa, W and combinations thereof), EXCEPT (for obscure historical reasons):
Pressures are in kPa (absolute, not gauge)
Specific heats and specific enthalpies in kW/kg /K and kW/kg,
Thermal conductivities are in kW/m K.
Enthalpies and specific heats are mass-based, not molar.
Properties data blocks begin as follows (blanks not referred to):
301 stream-name
302 stream-number
303 pressure
The name should be the same as the process name (or may be omitted). There are then up to three sub-blocks of data, liquid, main and vapour, in the 310, 320, 330 series respectively, each comprising an array of temperatures, followed by corresponding arrays of other properties.
The line numbers are:
311 temperatures for liquid properties
312 liquid densities
313 liquid specific heats
314 liquid (kinematic)viscosities
315 liquid thermal conductivites
316 surface tension
321 temperatures for enthalpies and qualities
322 specific enthalpies
323 qualities (vapour mass fractions)
331 temperatures for vapour properties
332 vapour densities
333 vapour specific heats
334 vapour (kinematic) viscosities
335 vapour thermal conductivities
Each array may have up to 24 items, so continuation indicators A, B, C will be used. For example, suppose there are 20 points for liquid; the liquid temperatures on line 311 will appear as:
311 Tliq-1 Tliq-2 Tliq-3 Tliq-4 Tliq-5 Tliq-6
311A Tliq-7 Tliq-8 Tliq-9 Tliq-10 Tliq-11 Tliq-12
311B Tliq-13 Tliq-14 Tliq-15 Tliq-16 Tliq-17 Tliq-18
311C Tliq-19 Tliq-20
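As an illustration of this layout, here is a small hypothetical Python helper (not part of EDR or any AspenTech tool) that splits a property array into numbered lines with continuation markers:

```python
def psf_array_lines(line_no, values):
    """Format one property array into PSF lines with continuation markers.

    Illustrates the layout described above: up to six items per line;
    the 3-character line number is followed by a space on the first
    line and by 'A', 'B' or 'C' on continuation lines; omitted items
    are written as '*'.  Arrays may hold at most 24 items.
    """
    lines = []
    for block, start in enumerate(range(0, len(values), 6)):
        marker = ' ABC'[block]  # ' ' first line, then A, B, C
        items = ''.join('*'.rjust(14) if v is None else f'{v:14.6E}'
                        for v in values[start:start + 6])
        lines.append(f'{line_no:3d}{marker}{items}')
    return lines
```

For the 20-point example above it produces four lines, numbered 311, 311A, 311B and 311C.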
The temperatures should preferably, but not necessarily, be in order (increasing preferred, decreasing allowed).
If a phase is absent, all the lines for that phase may be omitted. If there is only one phase, then the T-h-x lines 321-3 may also be omitted, but this is not recommended.
The three sets of temperatures must be identical. For liquid and vapour phases, at temperatures beyond the equilibrium range of the phase, properties may be omitted (use '*'), but the corresponding temperature should be included. For example, suppose that there are 12 temperatures in increasing order, and there is no vapour present until the eighth temperature. Lines 331 and 331A will appear with vapour temperatures, but for vapour densities line 332 need not appear; line 332A will appear, with the first item blank, followed by vapour density values as items 2 onwards on this line.
Selecting Properties Points
The general rules for selecting properties points and pressure levels are the same for PSF files as for other property generators for EDR. There should normally be two pressure levels, typically at the inlet and outlet pressures. More pressure levels should be included if the inlet and outlet differ by more than 20%. The pressures must be in increasing or decreasing order (but not mixed); no two pressures should be less than 1% different.
The general rules for selecting points are:
There should be good spacing in both temperature and enthalpy.
The temperatures should cover the range of potential temperatures in the exchanger but need not actually match inlet and outlet temperatures.
Bubble and dew point temperatures in the range should be included.
Both liquid and vapour properties should be given at the bubble and dew points, but not beyond these for the phase which is absent.
For pure components (or other substances which boil or condense isothermally), no data points should be given between the bubble and dew points, and no vapour properties should be given at the bubble point, nor liquid at the dew point.
Selecting points well is non-trivial for two-phase streams, and simple approaches like equal enthalpy intervals are problematic. An algorithm is available which, given half a dozen points plus the dew and bubble points, will predict a set of further temperatures where flashes should be performed.
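A simple illustration of the general rules above (even temperature spacing, with bubble and dew points inserted when they fall in range) is sketched below; this is not the prediction algorithm mentioned in the text, just an illustrative starting grid:

```python
def select_temperatures(t_in, t_out, t_bub=None, t_dew=None, n=8):
    """Evenly spaced temperatures covering the exchanger range, with
    bubble and dew points added when they lie inside the range."""
    lo, hi = sorted((t_in, t_out))
    pts = {lo + k * (hi - lo) / (n - 1) for k in range(n)}
    for t in (t_bub, t_dew):
        if t is not None and lo < t < hi:
            pts.add(t)
    return sorted(pts)
```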
Extension of PSF files
The following input arrays have not traditionally been part of PSF file definition, but need to be added to the PSF reader in EDR, and to PSF file generators.
324 specific enthalpy of (1st) liquid phase
325 specific enthalpy of vapour phase
326 specific enthalpy of 2nd liquid phase
327 molecular weight of (1st) liquid phase
328 molecular weight of vapour phase
329 molecular weight of 2nd liquid phase
330 mass fraction 2nd liquid phase (fraction of whole stream, NOT of total liquid)
342 2nd liquid densities
343 2nd liquid specific heats
344 2nd liquid (kinematic) viscosities
345 2nd liquid thermal conductivities
346 surface tension of 2nd liquid
Phase enthalpies and molecular weights are useful, but not fundamental to most heat exchanger calculations. 2nd liquid phase properties should be provided whenever a second liquid phase is present. Note that there is no line 341 with separate set of temperatures for the second liquid phase. All of the above properties should be provided at the line 321 temperatures for stream enthalpies and qualities. A blank ( ‘*’ ) should be used when a phase is absent.
Note that currently the PSF file importer in EDR loses all data for subsequent streams when a line number higher than 335 is present.
Future Development of PSF files
The basic definition, and extension described above, will enable PSF files to supply the properties information used by all HTFS programs. There is more information relating to compositions, which could be used for condensing streams in Shell&Tube (and potentially in future in AirCooled).
Lines 304, 350 and 351 contain constant properties for all components in the mixture. They are defined once per stream.
304 blank blank blank blank Mass/mol.composition No.ofComponents
350 Component.No Mol.wt. Crit-T Crit-p Crit-mol.vol Diff.param
351 Frac Acentric.fac Dipole-mom. Norm.boil.pt [email protected] blank
Then, at each pressure level, there is a series of lines giving the compositions at each of the temperatures used for enthalpies and qualities, for each of 50 components. Fractions are mass or mole based, depending on the input for item 5 of line 304 (Mass/mol.composition).
401 fraction of component 1 in liquid phase
402 fraction of component 2 in liquid phase
403 fraction of component 3 in liquid phase
….
450 fraction of component 50 in liquid phase
then
451 fraction of component 1 in vapour phase
452 fraction of component 2 in vapour phase
453 fraction of component 3 in vapour phase
….
500 fraction of component 50 in vapour phase
And if needed
501 fraction of component 1 in 2nd liquid phase
502 fraction of component 2 in 2nd liquid phase
503 fraction of component 3 in 2nd liquid phase
….
550 fraction of component 50 in 2nd liquid phase
If compositions are introduced, it probably makes sense to move from a 6 items per line format to a 24 items per line format for these new items.
Notes:
An example file is enclosed to see the format, where hot and cold streams are supplied at multiple pressure levels.
In the following article the user can find the steps to use the PSF file: How do I import a PSF file into Aspen Exchanger Design and Rating programs?
Keywords: PSF, EDR, Properties, HTFS, Line Code
References: None |
Problem Statement: The purpose of Future Trajectories is to configure the future movement of one or more feedforward, or disturbance, variables (DVs) according to any one of the following prediction algorithms: First Order, Ramp, or Piecewise Linear.
In this document, we show the basics of enabling and configuring the Future Trajectories feature in V11 | Solution: To enable Future Trajectories, follow these steps:
1. In DMC3 Builder, go to Options ---> Advanced Options and set Enable Feedforward trajectories to True.
2. In the Controller Model, go to Model Operations ----> Edit and select Future Trajectories.
3. Edit the required parameters for the future trajectories in the FT edit box:
First, select the desired option for the Type of prediction model to be applied to the feedforward variable:
None (default) - no future trajectory model is applied to the feedforward variable.
First Order - For additional guidance, see the Help topic First Order Model.
Ramp - For additional guidance, see the Help topic Pure Ramp Model.
Piecewise Linear - For additional guidance, see the Help topic Piecewise Linear Transform.
Model parameters include the following:
Time Constant - Applicable only to First Order. Roughly one-third of the time (following delay, or Deadtime) required for the process output to line out after a step-change in the process input.
Deadtime - Applicable to First Order or Ramp. Also called a delay. The time required for the process output to change, following a change to the process input.
Steady State Value - Applicable only to First Order. Anticipated value where the process lines out after a step-change in the process input.
Ramp (units per minute)- Applicable only to Ramp. Rate of linear change occurring after a step-change in the process input.
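As an illustration of how the First Order parameters combine (this is a sketch, not GDOT's or DMC3's exact implementation), the predicted trajectory stays flat during the deadtime and then approaches the steady state value exponentially:

```python
import math

def first_order_trajectory(y0, ss, tau, deadtime, t):
    """Predicted DV value t minutes after a step change.
    y0: current value; ss: Steady State Value parameter;
    tau: Time Constant; deadtime: Deadtime parameter."""
    if t <= deadtime:
        return y0                     # no response during the delay
    return ss + (y0 - ss) * math.exp(-(t - deadtime) / tau)
```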
4. The parameters that enable and control future trajectories for the DV can be configured and accessed through simulation; once the controller is deployed, these parameters can be modified in PCWS.
A user guide of the feature is available to download from this Knowledge base article.
Keywords: DMC3 Builder, Future Trajectories, Feedforward
References: None |
Problem Statement: During the Gain Update (V11) process, after the switch is turned ON the status does not change from Initializing (INIT) to RUNNING, but no error message is visible. | Solution: This situation may occur on V11. The problem happens because Microsoft Excel displays a “Product Activation Failed” message, and the activation pop-up interrupts the communication between GDOT and Excel. The Microsoft Office license must be up to date and the account activated in order to continue successfully with the Gain Update operation.
Keywords: GDOT, Excel, Gain Update.
References: None |
Problem Statement: A license is required to install CIMIO and start it for the first time. Some CIMIO environments are not connected to a license server. This article provides an unlocked standalone CIMIO license so you can install CIMIO and start it without being connected to a license server. This license will work on any CIMIO machine. | Solution: This file contains instructions for installing the CIMIO unlocked license file on any machines
Licenses contained in the CIMIO unlocked license file:
CIMIO_BaileysemAPI
CIMIO_Core
CIMIO_Fisher_Chip
CIMIO_Fisher_RNI
CIMIO_FoxAPI
CIMIO_InfoPlus21
CIMIO_Measurex_ODX
CIMIO_OPC
CIMIO_PI
CIMIO_RSLinx
CIMIO_Westinghouse
CIMIO_Yok_ACG10s
IN_FNC_MSCTKUTL
Instructions:
1) Copy the CIMIO standalone license file to the CIMIO machine.
***Please note that this license can’t be installed using the AspenTech License File Installer. We have created this file with the .txt extension to prevent opening the file in AspenTech License File Installer ***
2) Copy the SLMLockinfo.exe to the CIMIO machine
3) Run the SLMLockinfo.exe
Note down the file name for Hostname, IP Address, Ethernet Address or DiskID. It will look similar to this format: 008_26230
You will need to rename the standalone license file to include the “File Name”.
4) Rename the CIMIO standalone license file to lservrc_File_Name.lic
In the example, the file name will be lservrc_008_26230.lic
5) Rename any existing license files in the following folders to *.lic_old or *.slf_old
V8 and later: C:\Program Files (x86)\Common Files\Aspentech Shared\
V7.3 and earlier: C:\Program Files (x86)\Common Files\Hyprotech\Shared\
6) Copy the license file to the following folders
V8 and later: Copy the file to C:\Program Files (x86)\Common Files\Aspentech Shared\
V7.3 and earlier: Copy the file to C:\Program Files (x86)\Common Files\Hyprotech\Shared\
Note: This solution is not applicable if your CIMIO software is running on a virtual machine; please request a standalone CIMIO license file for a virtual machine by submitting a license key request online: https://esupport.aspentech.com/S_LicenseRequest
Keywords: CIMIO
License
References: None |
Problem Statement: Sometimes PIMS model run results differ from the AUP run of the migrated model, caused by discrepancies in the matrix structure. This workflow will help you investigate. | Solution: Here’s a brief how-to for comparing matrices from PIMS and AUP:
Run PIMS and AUP and locate the “Xlp_new.xlp” files from each run.
Xlp_new.xlp is the pre-solve matrix. AUP and PIMS use the same solver, so it’s likely that differences in the solved state are caused by differences in the pre-solved state.
You could copy-paste these files somewhere and rename them “Xlp_new_AUP.xlp” and “Xlp_new_PIMS.xlp” so you don’t mix them up.
Open the Matrix Comparison program (MatComp.exe, located in PlanningAnalysisTools) and load up the two files.
Matrix Comparison will tell you if a variable or equation is missing, among other things.
Differences between AUP and PIMS often come down to missing or added matrix structure.
It may also be useful to have the two matrix files open in the Matrix Analysis Program
The name of the variable or equation that is missing or added should give some direction as to what part of the model is misbehaving.
Keywords: migration
matrix
comparison
xlp
References: None |
Problem Statement: How is the Dynamic Unbiased Prediction (UAZ) calculated for an IQ with an FIR Model? | Solution: Attached Example: There is an Excel spreadsheet attached with a concrete example that you may find helpful in understanding the mechanism. It is a simple FIR model with input changes and IQ predictions calculated.
How UAZ is Calculated:
The dynamic unbiased prediction is calculated in the following way:
y(t+1) = y(t) + g1 [u(t) – u(t-1)] + (g2 – g1) [u(t-1) – u(t-2)] + …
The new value of the CV, y(t+1), equals the past value, y(t), plus the first FIR model coefficient, g1, multiplied by the most recent input change, u(t) – u(t-1). The next term is the difference between the second and first FIR coefficients, g2 – g1, multiplied by the input change one cycle earlier, u(t-1) – u(t-2). These terms continue, each one being a FIR coefficient minus the previous coefficient, times the corresponding earlier input change, working back one TTSS, so that the final term is the FIR model gain minus the coefficient before it, times the input change one TTSS in the past. The prediction is essentially a convolution between the differences of the FIR model coefficients and the differences in the inputs.
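The recursion above can be written as a short sketch (names and list layout are illustrative; it mirrors the equation with g0 = 0):

```python
def uaz_step(y_prev, u_history, g):
    """One step of the dynamic unbiased prediction for an FIR model.
    y_prev: previous unbiased prediction y(t)
    u_history: input values newest first, [u(t), u(t-1), ...],
               length at least len(g) + 1
    g: FIR step-response coefficients [g1, g2, ..., gN]"""
    y = y_prev
    g_prev = 0.0
    for i, gi in enumerate(g):
        du = u_history[i] - u_history[i + 1]   # u(t-i) - u(t-i-1)
        y += (gi - g_prev) * du                # (g_i - g_{i-1}) * du
        g_prev = gi
    return y
```

This makes the second point below concrete: a model whose coefficients overshoot and come back down produces large (g_i - g_{i-1}) terms even when the inputs move smoothly.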
Why you might be seeing a large value for UAZ:
The values of the unbiased prediction depend on how the input value changes and how the model coefficients change. Since the unbiased prediction is not biased, its current prediction, y(t), depends on the past prediction, y(t-1), instead of the actual measurement. Therefore, there are two possible reasons that you are seeing a bigger value for UAZ than expected:
A large input change; i.e. the result of [u(t) – u(t-1)], [u(t-1) – u(t-2)], etc.
A large change in model coefficients, i.e. the result of (g2 – g1). This can happen if the FIR model has a dynamic overshoot and comes back down. With this type of model, it may explain the behavior you see since the FIR model is not just the steady state model and we can't just rely on the gain in knowing what the next value of the output will be.
Keywords: UAZ, IQ, prediction, FIR
Internal
References: : VSTS 563148 |
Problem Statement: By default, there is no entry that resets the number-of-steps counter automatically, but this can be worked around using a custom calculation. This article describes the steps to set up this calculation. | Solution: Create an Input Calculation in DMC3 Builder with the following logic:
if (stepsw = 1 and mastersw = 0) then
    Nstep = 0
end if
where Stepsw is a user entry created under the general variables that acts as a “reset switch”. This user entry has two states, 1 meaning ON and 0 meaning OFF, and it triggers the reset of the steps performed.
Mastersw is the Master ON/OFF status of the controller. With this entry we force the reset to happen only when the controller is turned OFF, and avoid the counter being reset on every cycle. (This entry needs to be mapped to the Master ON/OFF status.)
Nstep is the step counter for the variable or variables that need to be reset. You can use * for all variables, or create a list of Nstep entries if different variables are required, as in the example below:
Nstep = 0 ‘Counter for MV1
Nstep1 = 0 ‘Counter for MV2
To activate the calculation, open the controller again in DMC3 Builder, create the calculation, and then redeploy the controller. As a side effect of redeploying, the counter may be reverted to 0, but this is known not to happen in all cases.
After that, with the calculation in place you can reset the counter without having to redeploy the controller again.
Keywords: DMC3 Builder, Testing, Calculation.
References: None |
Problem Statement: In the Simulation environment, a controller shows the calculation error “Can only compare two strings, two numbers, or a numeric string and a number.” This article briefly explains what this means. | Solution: If a controller shows this error in the simulation environment, it is likely that the controller will not be able to turn on even if the rest of the validation filters pass (min. good CVs, MVs, subcontrollers, etc.).
The way to resolve this is by checking that all variables in the calculation have been mapped correctly and have the proper data type.
It is important to note that the Calculation test will not show this error, as this feature only tests the calculation logic; it does not validate the mapping or the data types.
Keywords: DMC3 Builder, Calculations, Simulation
References: None |
Problem Statement: Custom menus are one-click URL links that call specific pages from any APC application, for example a specific History Plot for Variable X. Instead of navigating Controller X > Variable X > Open AspenWatch Menu > History Plots,
you can simply locate a custom menu on the PCWS tabs and click directly on the URL for that specific History Plot.
Note: This can be customized for any page that contains an URL (Internal or External if you have internet access) | Solution: The Steps to create the Custom Menu are the following:
On PCWS go to Configuration, then Custom Menus, and click Add
This will pop up a new window with some information required.
For Application, you can select any (it does not matter if it is ACO, RTE, IQ, or composite); the page will be added to the top ribbon of the selected application.
Menu Item will be the name of your custom menu. In the URL section, enter the URL of the page you want to call (a History Plot, PCWS Plots, A1PE, the new APC interface, any subcontroller, etc.).
You can click Test to verify that the URL points to the right page.
Finally, go back to the application and you will see the new link on the top ribbon under the custom tab
Keywords: PCWS, Custom pages, URL
References: None |
Problem Statement: It has been reported that after installation (or V10 patch installation) IQConfig fails to open and returns the error “Unexpected error; quitting”. | Solution: This is a reported problem that may occur on V10 (it has not been reported or reproduced on V11 or V12). To fix the issue, simply run the application as administrator. After that, close IQ Config and open the application with any user (it does not require admin rights after that); this time the application should open without any further problem.
Keywords: IQconfig, Installation.
References: None |
Problem Statement: How do I create a PSV Excel Datasheet in Aspen HYSYS or Aspen Plus V12? | Solution: Since V11, PSV datasheets are being created using Aspen Basic Engineering (ABE). In V12 there were major changes to the workflow and functionalities. Here are the steps to generate a Datasheet:
In Safety Analysis, go to Safety Datasheets ribbon.
Connect to a Server
For ABE Local Server configuration, choose Use Personal Workspace
For ABE Enterprise server use Join a Project Team and type the IP address.
Once connected to a Server, the options will be active now. Select an existing Workspace or create a new one in the Workspace drop-down menu.
Go to Mapper and click on Update Mapping and Transfer button.
Go to explorer and click on the Create Datasheet button.
Filter the list by selecting Safety as the Category.
From the list, select the appropriate template.
Choose the PSV and click on Create button.
The datasheet will be displayed. Finally, it can be printed as a PDF or Microsoft Excel document.
Keywords: Datasheets, PSV, Safety Analysis, Aspen Basic Engineering
References: None |
Problem Statement: How do I enable verbose logging to record issues related to the installation of Aspen applications? | Solution: Verbose logging is a type of computer logging method that involves more information than the standard or typical logging process. In software, verbose logging is the practice of recording to a persistent medium as much information as you possibly can about events that occur while the software runs. Typically, users can turn on verbose logging features to get more information about a system.
One of the most useful scenarios for enabling verbose logging is when there is an issue with the installation of Aspen software. Installation logs are generated by default, but when the cause of a failure is not clear from the installation log, we can enable verbose logging and repeat the installation to get exact information on the events that occur and to find the cause of the issue.
This is applicable to all Aspen software suites.
Follow the steps below, provided that you use a domain account with Administrator privileges on that computer.
1. Uninstall all the existing software.
2. Turn off any antivirus configured on the machine.
3. Enable verbose logging.
To do this, you will need to set a registry setting: New STRING, HKLM\Software\AspenTech\Setup\EnableLogging with value =1.
An easy way to do this is to copy the attached file to your desktop, ensuring that the file name is ‘enableLogging.reg’.
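For reference, the attached enableLogging.reg should contain the equivalent of the following standard registry script, matching the string value described in step 3 (shown for illustration; use the attached file itself):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup]
"EnableLogging"="1"
```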
4. Once done, open the registry editor > Right click Run as Admin
5. Then double click the ‘enableLogging.reg’ file from the Desktop
6. Once done in your registry editor, you should be able to see this registry entry under [HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup]
So please check and confirm.
7. Once confirmed, run the Aspen installer (right-click > Run as Admin) and install the desired products. Because of the logging, installation may take a long time, so please be patient.
8. If any Aspen application components fail, capture the error message with a screenshot and finish the installation wizard.
9. On your machine, the verbose log file should have gotten generated in the C:\Users\YOURUSERID\AppData\Local\Temp folder.
The log file created could give us more information about the error message that you are receiving. If you would like a consultant to review it, please send it to [email protected].
NOTE: Verbose logging options are usually enabled specifically for troubleshooting because they create large log files and can slow down performance. So, when you are done logging the error, you will need to delete the registry entry (not just change its value) by opening Registry Editor as Administrator, navigating to the path [HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup], and deleting the EnableLogging string.
The log will be at least a few MB in size and could be much bigger and can impact machine performance.
Keywords: None
References: None |
Problem Statement: How do I manually provide multiple heat curves in the Aspen Heat Exchanger Design & Rating program under user-defined properties using heat loads? | Solution: If the user wants to provide user-defined properties for multiple heat curves using the option “user specified properties using heat load”, note that this option does not allow multiple heat curves; only a single set of values can be provided.
There is no way to have more than one pressure level while specifying the properties data with heat loads.
What could be done:
If you know the composition of the stream, you can generate new properties using Aspen Properties.
If the user has property data (density, viscosity, surface tension, specific heat, thermal conductivity, molecular weight) for multiple pressure sets, select “user specified properties”; with this option the user can add multiple pressure sets using the Add Set button.
Keywords: Multiple heat curves, user specified properties using heat load
References: None |
Problem Statement: How to map Plant Tags to Model Variables in Aspen Simulation Workbook? | Solution: Plant data tags can be mapped to model variables. This allows you to automate workflows and leverage the power of process simulation in the plant operations environment. You can:
Map tags to model variables (e.g., populate model inputs with measured data from the plant).
Map model variables to tags (e.g., send model predictions back to the plant data server through tags, which allows the model to act as a virtual analyzer and/or provide predictions of unmeasured variables for operator decision support applications).
Create two-way flow of information between the model and plant data.
Note: It is possible to map multiple tags to the same variable, and multiple variables to the same tag. If information is flowing from the one to multiple, all mappings will work. If two tags are associated with the same model variable with type Tag to model, or two model variables are associated with the same tag with type Model to tag, then the tag-model pairing with the higher ID number in the tag->Model table will succeed, overwriting the one with the lower ID number.
To map variables to tags:
Open the ASW Organizer and switch to the Variable Mapping | Tag -> Model view in the left pane. This opens a grid in which each row displays a link between a model variable and a plant tag. The first time the grid is opened, it will be empty.
Right-click the variable grid pane to open the pop-up menu.
Select Add Unreferenced Tags to pull a list of tags into the variable grid, or to update the list with recently added tags.
Aspen Simulation Workbook will automatically map tags and variables to each other if the tag name and variable name are identical. If a matching variable is not found for a tag, the 'null' symbol will show up in the Model Variable column.
To map a variable to an unreferenced tag, click the 'null' symbol next to the tag name. This brings up a list of the unmapped model variables.
Scroll through the list to locate the desired model variable.
Click on the variable name to map it to the tag.
Note: It is not necessary to map every plant tag to a model variable. Use the Delete button to remove any tags that you do not want to map. Alternatively, you can remove all the unreferenced tags together by right-clicking in the variable grid and selecting Remove Incomplete Tags from the pop-up menu.
Keywords: Mapping, Tags, Model Variables, ASW, etc
References: None |
Problem Statement: What is the petroleum BMCI value and how can it be calculated within the HYSYS simulation environment? | Solution: The petroleum BMCI (Bureau of Mines Correlation Index) is an index used to judge the suitability of heavy feedstocks for the production of olefins, often referred to as petrochemical potential. The BMCI is based only on boiling range and density.
BMCI is specified for an assay in the Properties environment and cannot be seen in the Simulation environment. From HYSYS V11, the user can add BMCI to the assay conventional results in the Properties environment.
In the Simulation environment, the user can calculate it using a spreadsheet with the formula given in the attached PDF.
In the formula, VABP is the volume-average boiling point temperature and d15 is the specific gravity (SG 60/60).
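A commonly quoted form of the correlation, with VABP in kelvin, is BMCI = 48640/VABP + 473.7*d15 - 456.8; verify this against the formula in the attached PDF before use. A sketch:

```python
def bmci(vabp_k, d15):
    """Bureau of Mines Correlation Index (commonly quoted form).
    vabp_k: volume-average boiling point in kelvin
    d15: specific gravity (SG 60/60)"""
    return 48640.0 / vabp_k + 473.7 * d15 - 456.8
```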
Keywords: BMCI, Properties Environment, Assay
References: None |