Problem Statement: How to install Software License Manager (SLM) License Server? | Solution: This knowledge base article provides the installation steps for the SLM License Server application. AspenTech recommends setting up a separate server (physical, virtual, or cloud) to host the SLM License Server. Please refer to the software and hardware specifications here.
First, download the latest version of the SLM Tools from the Download Center.
Once you have downloaded the SLM Tools, extract the zip file to a local folder on your C drive and then run setup.exe with the Run as Administrator option.
Next, follow these steps to install the SLM License Server:
1) At the Welcome screen, click the Install and configure SLM button.
2) Accept the aspenONE Software License Terms by checking the I accept the terms of this agreement checkbox and click Next.
3) At the aspenONE Product screen, expand SLM License Server and SLM Tools and select the following minimum requirements to be installed:
a. SLM License Server
i. Auto Upload Tool
Note: HTTP Server is optional. If this license server will also act as a usage log collection server, then this option will need to be installed.
b. SLM Tools
i. SLM Server Admin Tools (WLMAdmin for Administrators)
ii. SLM Client Tools
Note: Dongle Driver and Aspen Licensing Dashboard are optional. The Dongle Driver will need to be installed if your license is locked to an Aspen USB Dongle.
4) At the Specify license file screen, click on the Browse button and select your license. Click Next.
Note: The license will be loaded automatically after the install. You may choose not to load your license during the install, but you will need to load it manually after the install. See knowledge base article 22408 to learn how to load a network license.
5) At the Specify Window services account information screen, provide a domain account username and password that has administrative privileges. Click Next.
Note: If your computer is not on a domain, you can use the local Administrator account by typing computer_name\Administrator in the username field and using the Administrator password.
You may also use the computer’s SYSTEM account by typing ‘SYSTEM’ for the username and leaving the password blank.
6) At the Verify your installation screen, click the Install Now button.
7) During the installation, at around 91% completion of the SLM License Server, the Auto Upload Tool will display its Configuration Tool. You may close the window and configure this at a later date, or configure it now and click OK.
Visit knowledge base article 22451 for more information about configuring the Auto Upload Tool.
Note: The installation will pause until an action is made. When the Configuration Tool is displayed, it may be hidden behind the installer.
8) Once the installation is finished, click the Finish button.
Related Articles:
Video: How to install Software License Manager (SLM) Server
Keywords: Sentinel RMS License Manager, SLM Server, Gemalto
References: None |
Problem Statement: Is it possible to connect the GDOT Offline applications to the plant operations OPC server using GDOT Simulation Console? | Solution: Yes, it is possible.
GDOT Offline is a subset of the capabilities of GDOT Online; nevertheless, GDOT Offline supports using OPC I/O tags and APC Gateway I/O tags.
The purpose of GDOT Offline is for project engineers to test the behavior of the GDOT applications online. It is expected that the OPC servers and APC servers would be test servers (e.g. in a customer's testing lab or training lab).
However, when GDOT Offline is installed on the online system, the OPC servers that can be reached are the same as for GDOT Online and GDOT Offline applications can connect to them if configured to do so.
Keywords: GDOT Offline, GDOT Simulation Console, OPC connection
References: None |
Problem Statement: How to bulk change the Data Source for Aspen Process Explorer trend files? | Solution: Attached is a file called UpdateDataSourceTool.apx, which needs to be run on the machine where all the .apx files are located.
1.- Double click on UpdateDataSourceTool.apx to open the file.
2.- Enter the folder path containing the .apx files to update, and enter the new Data Source name. Finally, click on Update Data Source.
Note: Before using the tool, it is recommended to back up the .apx files.
Example:
* The following trend is called Document1.apx and the Data Source is MES.
* The trend is saved under the folder Test located on the Desktop (C:\Users\Student\Desktop\Test).
* It is desired to change the Data Source to IP21.
1.- Open UpdateDataSourceTool.apx and fill in the Folder Path and the New Data Source Name.
2.- Once Update Data Source is clicked, a confirmation message is received.
3.- As a result, the Data Source has been changed from MES to IP21 for all the .apx files within the folder Test (C:\Users\Student\Desktop\Test).
Keywords: Aspen Process Explorer
Trend
Data Source
References: None |
Problem Statement: How to resolve the problem where a variable changed in Excel is not displayed in Aspen Plus? | Solution: This can occur for one of two reasons:
If you put an equation in a cell, the equation will take priority over any attempt to enter a new value on the form in Aspen Plus.
Aspen Simulation Workbook expects you to change specified variables from Excel.
To fix, close the form/dialog box in Aspen Plus; then, open it back up; you will now see the value is consistent with Excel.
Another way to force an update is to click the Refresh button in the Workbook group of the ASW ribbon.
Keywords: Aspen Plus, ASW, Variables, Shown
References: None |
Problem Statement: What is the difference between the Model Summary Grid and the Flowsheet Summary in Aspen HYSYS? | Solution: There are two result reporting options:
The Model Summary Grid displays input and results for material streams and unit operations in one view.
- Includes an embedded, real-time link to Excel via Aspen Simulation Workbook.
- In the main Model Summary Grid, the data appears split over separate tabs for each model type.
- The Model Summary Grid within individual blocks displays a summary of the variables for that block.
The Flowsheet Summary contains important material/energy balance details.
- Also shows results from included Process Utilities and CO2 emissions tracking.
- In the Model Summary Grid, the data appears split over separate tabs based on unit operation type.
- The Flowsheet Summary view contains the following tabs:
1. Mass Energy Balance
2. Utility Summary
3. Process CO2 Emissions
4. Stream Diagnostics
5. Pressure Balances
6. Convergence Monitor
Keywords: Aspen HYSYS, Model, Flowsheet, Summary
References: None |
Problem Statement: How to change the form from T-XY diagram to YX diagram in Aspen Plus? | Solution: In order to change the following T-XY diagram:
Please go to the Results tab of the Analysis. There you will find the results table. In the upper right corner there is a plot section; scroll down and you can change the plot type from T-XY to Y-X.
The Y-X plot will then be generated based on the analysis results.
Keywords: YX diagram, Plot, Analysis, Aspen Plus
References: None |
Problem Statement: How to import/export unit operations and flowsheet objects in Aspen HYSYS? | Solution: In HYSYS we can copy/paste flowsheet objects within the same file using the option below:
- Cut/Copy/Paste flowsheet object(s).
Only user specified data is transferred.
Accessed by right-clicking a highlighted object/objects.
Most useful when doing a one-time transfer of data within a HYSYS simulation.
Also, in HYSYS we can import/export flowsheet objects to another file:
- Import/Export portions of a flowsheet.
To export, right-click and choose Export to File.
Saved externally as a *.hfl file.
Can be imported into any HYSYS case.
Most useful when repeatedly transferring data within a HYSYS simulation, or when transferring data to another simulation.
Keywords: Aspen HYSYS, Transfer, Object
References: None |
Problem Statement: Why does Aspen Plus/Aspen Properties freeze/hang when you initialize/run a model? | Solution: The investigation showed that all the affected users did not have access to cmd.exe (command prompt), which is required for Aspen Plus/Aspen Properties to initialize/run the simulation in the background.
Affected users will need to request permission to run cmd.exe (command prompt) in order to initialize/run the simulation.
Keywords: CMD, Permission, Model
References: None |
Problem Statement: All orders changed to Initiated status post Aspen Production Execution Manager (APEM) database upgrade | Solution: Post APEM database upgrade (done via database wizard) and recompilation of orders, we see all the past orders migrated from old system to new system in Initiated status.
In the old flags.m2r_cfg config file the following key assignment is present “FINISH_ORDER_BY_PFC_FLOW = 0” but in the new one it is commented out. This explains the different status values for the orders.
The following key assignment in the flags.m2r_cfg file should fix the problem (it’s already present but commented out, so just uncomment it and run codify_all).
FINISH_ORDER_BY_PFC_FLOW = 0
The key only impacts the way order status is calculated and displayed on the screen. It will not change anything for existing completed orders in the database. There is no need to execute CompileProc again.
Keywords: parameter
configuration
References: None |
Problem Statement: This best practices article summarizes the recommendations given in KB Article 129045-2 (Configuring History Repositories for Optimal Performance). | Solution: 1. Create a dedicated Aspen InfoPlus.21 history repository for each distinct data source (i.e., every CIMIO server, Aspen Calc, calculated values from queries or external tasks, etc) to avoid placing data in older archives when forwarding stored Aspen Cim-IO data.
2. Create enough repositories so that no single history repository handles more than ten thousand history repeat areas.
3. Configure the archive shift criteria ('File Sets Size' and 'Time Span') so that archive shift will not occur more than once a week for a given history repository. Increase the file-set size to the desired maximum archive size (typically 1000 MB or more).
4. Set the 'Cache Buckets' specified on the Repository tab of the Properties dialog of the Aspen InfoPlus.21 Administrator to be greater than the number of history repeat areas assigned to the history repository.
5. Increase the history event queue 'Buffer Size' in the Advanced tab of the Properties dialog of the selected history repository to at least 1000000 (1 million bytes) on Aspen InfoPlus.21 systems that must handle large surges of process data to be archived (i.e. five thousand points per second or where Aspen Cim-IO store-and-forward is enabled).
6. AspenTech recommends placing file-sets on local drives. SAN devices with dedicated high-speed fiber-optic cables may also be used. Do not use network drives or NAS devices.
Keywords:
References: None |
Problem Statement: Will Aspen Calc V11 (and later versions) work with the 64-bit version of Microsoft Excel 2016? | Solution: No. Aspen Calc is a 32-bit application and as a result can only work with other 32-bit applications. If the 64-bit version of Excel has been installed, the best course of action is to uninstall Excel and the Aspen products, then install the 32-bit version of Excel followed by the Aspen products.
Keywords: None
References: None |
Problem Statement: SQL Return: SQL_ERROR
Error Code: S1000
Error Message: [ORACLE] [ODBC] [Ora] ORA-28001: the password has expired | Solution: This error means that the AORA database account password used for the client application ODBC connections has expired. Contact the database administrator to reset the password.
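For reference, a minimal sketch of what the database administrator would typically run on the Oracle instance hosting the AORA schema (the account name AORA_USER and the new password are placeholders, not the actual account used at your site):
-- Run as a DBA; resets the expired password and unlocks the account
ALTER USER AORA_USER IDENTIFIED BY "NewStrongPassword1" ACCOUNT UNLOCK;
After the password is reset, update any saved credentials in the ODBC Data Source used by the client applications.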
Keywords: AORA
Password
References: None |
Problem Statement: How to extract all instrument tags used by AORA? | Solution: You can extract the information using the Excel Add-In or with SQL queries written and executed either from AORA DBTools or from Query Writer within the SQL Server or Oracle RDB.
Query example that will bring all the basic information (Tag Names, Tag Aliases and Instrument Type Identification):
Select Distinct 'FCC Coke Burn' as 'INSTR_TYPE', PLOINSTR.TAG, PCOICOKE.TAG_ALIAS from PLOINSTR, PCOICOKE where PLOINSTR.DBINDEX = PCOICOKE.IND2OBJECT
UNION ALL
Select Distinct 'Flow Mass' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIFLOM.TAG_ALIAS from PLOINSTR, PCOIFLOM where PLOINSTR.DBINDEX = PCOIFLOM.IND2OBJECT
UNION ALL
Select Distinct 'Flow Volume' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIFLOV.TAG_ALIAS from PLOINSTR, PCOIFLOV where PLOINSTR.DBINDEX = PCOIFLOV.IND2OBJECT
UNION ALL
Select Distinct 'Generic and Orifice Flow Meters' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIMETR.TAG_ALIAS from PLOINSTR, PCOIMETR where PLOINSTR.DBINDEX = PCOIMETR.IND2OBJECT
UNION ALL
Select Distinct 'Tank Inventory Mass' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIMASS.TAG_ALIAS from PLOINSTR, PCOIMASS where PLOINSTR.DBINDEX = PCOIMASS.IND2OBJECT
UNION ALL
Select Distinct 'Tank Inventory Volume' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIVOLM.TAG_ALIAS from PLOINSTR, PCOIVOLM where PLOINSTR.DBINDEX = PCOIVOLM.IND2OBJECT
UNION ALL
Select Distinct 'Tank Decimal Gauge' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIGAUG.TAG_ALIAS from PLOINSTR, PCOIGAUG where PLOINSTR.DBINDEX = PCOIGAUG.IND2OBJECT
UNION ALL
Select Distinct 'Tank Fraction Gauge' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIGFRC.TAG_ALIAS from PLOINSTR, PCOIGFRC where PLOINSTR.DBINDEX = PCOIGFRC.IND2OBJECT
UNION ALL
Select Distinct 'Lab Pipe' as 'INSTR_TYPE', PLOINSTR.TAG, PCOILABP.TAG_ALIAS from PLOINSTR, PCOILABP where PLOINSTR.DBINDEX = PCOILABP.IND2OBJECT
UNION ALL
Select Distinct 'Lab Vessel' as 'INSTR_TYPE', PLOINSTR.TAG, PCOILABU.TAG_ALIAS from PLOINSTR, PCOILABU where PLOINSTR.DBINDEX = PCOILABU.IND2OBJECT
UNION ALL
Select Distinct 'Multi-Product' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIMPRD.TAG_ALIAS from PLOINSTR, PCOIMPRD where PLOINSTR.DBINDEX = PCOIMPRD.IND2OBJECT
UNION ALL
Select Distinct 'Product Adjustment' as 'INSTR_TYPE', PLOINSTR.TAG, PCOIPRAJ.TAG_ALIAS from PLOINSTR, PCOIPRAJ where PLOINSTR.DBINDEX = PCOIPRAJ.IND2OBJECT;
If you want to use Excel Add-In:
Using the Add-In's General Interface, start by Logging into the AORA Model Database.
Click the Read Model Information Checkbox to pull up the Information Data Retrieval Extract Form.
On the resulting dialog, in the section labeled Model Configuration Read, check the checkboxes next to all of the Instrument Type selections.
Click the OK button to extract the configuration data to Excel and return it onto separate worksheets for each Instrument Type selected.
Keywords: AORA
Instrument Name
References: None |
Problem Statement: When running AORA the following error appears:
SQL Return: SQL_ERROR Error Code: S0002 | Solution: This error is normally caused by connection problems. Verify the following:
The Windows user has rights to read the AORA Database.
The ODBC connection points to the correct AORA database.
The AORA database is not empty.
Keywords: AORA
S0002
References: None |
Problem Statement: What is the naming convention for the AORA reports? | Solution: All of the report filenames follow a specific naming convention, and this naming convention allows you to quickly determine the basic format of the report from the file name. The naming convention has the following format: abbccddeef.rpt (The import table views have the same names as the corresponding import tables).
The first character describes the type of report: daily, monthly, graphical, custom, or other miscellaneous type of report. The first character of the report will be one of the following characters:
Character Description
d Daily Report
m Monthly Report
g Graphical Report
c Custom User Report
i Import Report
o Other Miscellaneous Report
v View Report
The next characters are grouped in pairs and any report may have as many as three pairs to further describe the report. These pairs will be one of the following character pairs:
Character Description
bl Balance
cp Components
cv Cover page
cy Charge and Yields
ev Event
in Instrument
mt Meter
nt Notes
pd Product
pi Pipe
pr Properties
rf Refinery
sr Sales and Receipts
tk Tank
tr Transfer
vs Vessels
The last character in the file name indicates whether the report is using the measurement of volume or mass, or it describes the presentation of the report as detailed, landscape, or portrait. The last character of the report will be one of the following characters:
Character Description
v Volume
m Mass
d Detailed
l Landscape
p Portrait
Example: The dsrpdd.rpt report will serve as an example; simply break down the name as follows:
The first character 'd' means it will be a daily report.
The second and third characters form the pair 'sr' which means it is a sales and receipt report.
The fourth and fifth characters form the pair 'pd' which means it has product information in it.
The last character 'd' means that it is a detailed report.
When all of this is combined together, it results in a daily sales and receipt product details report.
Keywords: AORA
AORA Reports
References: None |
Problem Statement: What is the description of the Standard Reports provided by AORA? | Solution: The list of standard reports is subdivided into the following categories:
Daily Reports
Graphic Reports
Monthly Reports
Other Reports
Views Reports
Import Reports
ERP Reports
Planning Reports
The attached pdf document contains the description of all the Standard Reports.
Keywords: AORA
AORA Reports
References: None |
Problem Statement: When trying to run an SQLplus Report, the following error is reported in Event Viewer:
Event code: 4010
Event message: An unhandled security exception has occurred. | Solution: Go to C:\ProgramData\AspenTech\, right-click on the SQLplus folder, select Properties, and go to the Security tab. Make sure that the user has the correct security permissions.
If the user does not have Read & Execute permission, he/she will not be able to run a report.
If the user does not have Modify permission, he/she will not be able to modify an existing report.
If the user does not have Write permission, he/she will not be able to create a new report.
Once you have identified which permission the user is missing, select the user name and allow the required permissions.
Note: If you enable the Modify permission, Read & Execute and Write will be allowed by default. If you enable Read & Execute, Read will be allowed by default.
Keywords: SQLplus Reporting
Security
References: None |
Problem Statement: How to create shortcuts in AORA? | Solution: In AORA, go to Tools and select Configure Tools.
Click on Add, then enter a name in Tag and also in Hint. Afterwards, click on the three dots […] to select a command or a batch script.
For this example, DbTools.exe was selected.
Finally click OK.
Select Close and go back to Tools. You will see that the shortcut now has been added.
Keywords: AORA
Shortcut
References: None |
Problem Statement: Where can I find information about the batch commands used in AORA? | Solution: You can find more information about the batch commands in the Help File “aspeniiauhist.chm”. It is located in C:\Program Files (x86)\AspenTech\Advisor.
Once you have opened that Help File, please go to “Running Aspen Operations Accounting Connect in Batch Mode” > “Batch Commands”.
Keywords: AORA
Batch Command
References: None |
Problem Statement: When trying to connect to PI Historian, the following error appears:
Error - PI Interface XXXXX: Historian connect failed: PI Interface: -1
Where XXXXX is the name of your interface. | Solution: PI Errors with a negative number, like “-1”, usually indicate communication problems between PI Server and AORA Connect machines.
For these cases it is recommended to check:
PI Trust Relationship between PI Server and AORA Connect machines.
The PI-SDK software is installed correctly.
Security Settings or Firewall that is blocking the communication.
If the previous points are configured properly, restart the PI Interface as this might also help.
Keywords: AORA
PI Historian
References: None |
Problem Statement: Query example to move Transfer Records from one Logical Device to another. | Solution: The steps required to move a Transfer Record to another Logical Device are the following:
Turn OFF the Transfer Record.
Set the Transfer Record to Unusable.
Change the IO_MAIN_TASK to the corresponding task in the second Logical Device.
Set the Transfer Record to Usable.
Turn On the Transfer Record.
In order to do it for all Transfer Records that share the same Definition Record, the following query can be used:
UPDATE IOGETDEF SET IO_RECORD_PROCESSING = 'OFF' WHERE IO_MAIN_TASK = 'TSK_M_1';
UPDATE IOGETDEF SET USABLE = 0 WHERE IO_MAIN_TASK = 'TSK_M_1';
UPDATE IOGETDEF SET IO_MAIN_TASK = 'TSK_M_2' WHERE IO_MAIN_TASK = 'TSK_M_1';
UPDATE IOGETDEF SET USABLE = 1 WHERE IO_MAIN_TASK = 'TSK_M_2';
UPDATE IOGETDEF SET IO_RECORD_PROCESSING = 'ON' WHERE IO_MAIN_TASK = 'TSK_M_2';
Where TSK_M_1 belongs to the Source Logical Device and TSK_M_2 to the Receiving Logical Device.
Note: This example query will work for the Transfer Records defined under IOGETDEF. If it is desired to use it for another Definition Record, it is required to change that part of the query.
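As a sketch of that substitution, the same sequence written against the long-tag definition record IoLongTagGetDef would look like the following (this assumes your records are defined under IOLONGTAGGETDEF; verify the actual definition record name in your database before running it):
UPDATE IOLONGTAGGETDEF SET IO_RECORD_PROCESSING = 'OFF' WHERE IO_MAIN_TASK = 'TSK_M_1';
UPDATE IOLONGTAGGETDEF SET USABLE = 0 WHERE IO_MAIN_TASK = 'TSK_M_1';
UPDATE IOLONGTAGGETDEF SET IO_MAIN_TASK = 'TSK_M_2' WHERE IO_MAIN_TASK = 'TSK_M_1';
UPDATE IOLONGTAGGETDEF SET USABLE = 1 WHERE IO_MAIN_TASK = 'TSK_M_2';
UPDATE IOLONGTAGGETDEF SET IO_RECORD_PROCESSING = 'ON' WHERE IO_MAIN_TASK = 'TSK_M_2';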
Example:
The Get Records Example 1 and Example 2 are within the Logical Device CIOOPC21, and it is required to move them to the Logical Device CIMIO_2.
The group of Transfer Records share the same IO_MAIN_TASK.
Note: If after the initial configuration nothing is changed, the name structure for the TSK's is the following: TSK_M_XXXX, TSK_A_XXXX and TSK_U_XXXX, where XXXX is the name of the Logical Device.
IO_MAIN_TASK for the Source Logical Device:
IO_MAIN_TASK for the Receiving Logical Device:
Therefore the query will be the following:
UPDATE IOGETDEF SET IO_RECORD_PROCESSING = 'OFF' WHERE IO_MAIN_TASK = 'TSK_M_CIOOPC21';
UPDATE IOGETDEF SET USABLE = 0 WHERE IO_MAIN_TASK = 'TSK_M_CIOOPC21';
UPDATE IOGETDEF SET IO_MAIN_TASK = 'TSK_M_CIMIO_2' WHERE IO_MAIN_TASK = 'TSK_M_CIOOPC21';
UPDATE IOGETDEF SET USABLE = 1 WHERE IO_MAIN_TASK = 'TSK_M_CIMIO_2';
UPDATE IOGETDEF SET IO_RECORD_PROCESSING = 'ON' WHERE IO_MAIN_TASK = 'TSK_M_CIMIO_2';
After running the query the result is:
Keywords: Logical Device
References: None |
Problem Statement: When using Excel Add-In the following error appears:
Error: Invalid URI: The hostname could not be parsed. | Solution: Normally this happens after a migration of a server where the ADSA name was changed. This error means that the cells within the spreadsheet cannot find the Data Source for which they were built originally.
Verify on your spreadsheet that the function is pointing to the right Data Source defined on ADSA.
If there are differences between your defined Data Source and the spreadsheet's references, change them. After this, the error should no longer appear.
Note: If the error is caused by a wrong ADSA reference and multiple cells within the spreadsheet have the problem, you can use Excel's Find and Replace option to make all the changes at once.
Keywords: Excel
Invalid URI
References: None |
Problem Statement: How to determine the Maximum Number of Components in a Principal Component Analysis? | Solution: The exact total number of components is not important for most methods, including Principal Component Analysis (PCA); for this method it is important that you have enough components to explain the main variability in the data. Once noise starts to appear in your results, that is an indication of unimportant components and that your model is adequately described with the number of components already used.
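One common way to quantify "the main variability" (a general PCA relation, not specific to any particular AspenTech tool) is the cumulative explained variance of the first k components, where lambda_i are the eigenvalues of the data covariance matrix sorted in decreasing order and p is the total number of variables:

R^{2}(k) = \frac{\sum_{i=1}^{k} \lambda_i}{\sum_{i=1}^{p} \lambda_i}

Plotting R^2(k) against k and looking for the point where additional components add little explained variance complements the validation approach described below.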
The most important thing when deciding on the optimal number of components to use in your final model is to use proper validation. This can be either a separate test set or a cross-validation where all replicates are kept out in the same segments.
Note: All components from 1 to the optimal selected (validated) number can and should be interpreted to understand the underlying variability being explained by the model.
Keywords: Maximum Components
PCA
References: None |
Problem Statement: Where can I find Sustainability sample cases in Aspen HYSYS V12.1? | Solution: Aspen HYSYS V12.1 includes five new sample files to facilitate this objective. The new sample cases can be found by clicking the Examples button on the Resources ribbon and navigating to the Sustainability folder or by navigating to the examples folder (C:\Program Files\AspenTech\Aspen HYSYS V12.1\Samples\Sustainability). A PDF file is included with each example to illustrate the process.
Each of the following five carbon capture models demonstrates how carbon is captured from different sources of flue gases using the Acid Gas fluid package in Aspen HYSYS. It is a typical two-column process: CO2 is absorbed by using a solvent in the absorber, and the solvent is then regenerated in the stripper. The acid gas distillation columns are used to model the absorber and stripper. The column geometries and internals are identified, and the rate-based distillation column in Advanced Modeling mode is used for accurate prediction of the mass transfer in absorption and desorption. Finally, the utility feature is used in the example for a quick estimate of operating cost.
CO2 Capture from Coal Power Plant using MEA
This Aspen HYSYS example models CO2 capture from a coal power plant flue gas at the rate of 1 Million TPY by using MEA.
CO2 Capture from Natural Gas Power Plant using MEA
This Aspen HYSYS example models CO2 capture from a natural gas power plant flue gas at the rate of 1 Million TPY by using MEA.
CO2 Capture from Syngas for IGCC using DEPG
This Aspen HYSYS example models CO2 capture from syngas for the IGCC process at the rate of 1 Million TPY by using DEPG.
CO2 capture from Syngas for IGCC using MEA
This Aspen HYSYS example models CO2 pre-combustion capture of the IGCC process at the rate of 1 Million TPY by using MEA.
Carbon Capture and Storage
This Aspen HYSYS example elaborates how carbon can be captured at the rate of 500,000 TPY from the exhaust gases coming from gas turbines, boilers and other sources using MEA. This model includes the modeling of sources, pre-treatment of flue gases, carbon capture, post-treatment and storage. The flue gases coming from different sources are at high temperature and must be cooled before they are sent for carbon capture. Water is knocked out in knockout drums, and O2 is removed using O2 removal units. The carbon capture is performed using a two-column process using Acid Gas package in HYSYS. The rate-based distillation column is modeled in efficiency modeling mode for accurate prediction of the mass transfer in absorption and desorption. After the carbon dioxide is captured, the pressure of the gases is increased before they are sent for injection into geological storage sites or for EOR in oil fields. That is done by using compressors and other pressure increasing equipment. Full description of the model can be found in the PDF provided with the model.
Keywords: Sustainability, Carbon Capture
References: None |
Problem Statement: The problem is related to version V12.1. It occurs when you create a DMC3 project on a server or network location that is going to be modified by different users. In this case, take for example User A, who is the owner of the DMC3proj file, which is saved on location X. If User B accesses location X and tries to open the project (even if the project was properly closed and User A is logged out), DMC3 Builder returns the message Project Is Already Open, like the picture below: | Solution: The solution to this problem is related to the user permissions on the location folder, subfolders, and files. Basically, the solution is to create projects in public folders in which multiple users have read/write permissions.
If one of the users only has Read access, the message will pop up and will not allow the user to open the project. In the next example, the user gundam has only Read access; thus the message pops up when trying to open the project.
In the second case, the user gundam has read/write access, which allows the user to open the project.
In both cases, the user Student was the owner and creator of the project.
Keywords: DMC3, Users, DMC3proj.
References: None |
Problem Statement: By default, it is not possible to historize Controller User Entries from the DMCplus or DMC3 Controller. However, this | Solution: presents a workaround using the capabilities of the IP21 Database to get a User Entry historized.
Solution
The following example uses an example DMCplus controller named DEMCOL12. The procedure below just shows the steps that must be done to have a User Entry historized; the variables used in this example do not represent a real application:
1.- The first step is to create Two Different User entries on the Controller, in this case, we have created the Variables, TEST_ENTRY and WRITE_BACK:
It can be noticed that the variable TEST_ENTRY has been configured as READ (but the variable can be configured with any other IO Flag). TEST_ENTRY will READ the values from the measurement of an MV named SSFLOW, which belongs to the same controller, but will be displayed as one of the parameters of the CV variable SSCOMP. The logic behind this is just to verify that the User Entry is getting the READ values from a dynamic source that changes with time (in a normal case this would come from the DCS).
On the other hand, WRITE_BACK is configured as AWRITE, as we will use this variable to Always Write the value back to the IP21 Database. In this case, we are pointing to a tag created in the IP21 Database named WRBACK (we will come back to this tag later).
2.- As the User Entries can only use one IO Flag at a time, we will use an input calculation to send the reading values from TEST_ENTRY to WRITE_BACK; then WRITE_BACK will write the values to IP21.
The Calculation is just the Following:
WRITE_BACK = TEST_ENTRY
3.- Once Step 2 is complete, we will check the IP21 Database. In this case, I have created an analog tag named WRBACK (in this particular case I just duplicated an analog tag predefined in my IP21 Database and saved it under a folder named USER_ENTRIES that I created for this example).
As you can see in the first picture, the User Entry WRITE_BACK is pointing to the IO “WRBACK IP_INPUT_VALUE”. This is done because I want the written value from WRITE_BACK to end up in this specific field, so every time WRITE_BACK sends values, they will be written to the field IP_INPUT_VALUE (which I can later plot in Process Explorer or retrieve using Aspen SQL).
Take into account that the correct configuration to historize values is to set the fields IP_REPOSITORY, IP_ARCHIVING and IP_#_OF_TREND_VALUES.
4.- Once everything is correctly configured, the next step will be to deploy the controller and start the collection in AW. I have run a few tests to show the historization of the User Entry in the AW IP21 Database.
Display on PCWS
History on AW
Plot of the Values on Process Explorer
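As a quick check from Aspen SQLplus, a minimal query sketch to retrieve the historized values of the example tag (WRBACK is the hypothetical tag created above; substitute your own tag name):
SELECT ip_trend_time, ip_trend_value FROM wrback ORDER BY ip_trend_time DESC;
This returns the timestamped history collected for the User Entry, which should match the values shown in Process Explorer.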
Keywords: DMCplus, IP21, User Entry
References: None |
Problem Statement: How to modify the fixed area of a record by importing data from a .CSV file? | Solution: This query helps you to modify the fixed area of a record by a template in a .CSV file.
First you have to create a .CSV in Excel with this structure:
<Tag Name> <Field 1> <Field 2> .... <Field n>
For this example, the description, engineering units, High high limit, high limit, low limit and low low limit fields will be modified:
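As an illustration, a hypothetical test.csv with that structure could look like the following (tag names and values are placeholders; the first line is a header, which the query below skips via linenum > 1):
NAME,DESCRIPTION,ENG UNITS,HIGH HIGH,HIGH,LOW,LOW LOW
ATCAI,Reactor feed temperature,DEG C,95,90,10,5
ATCL101,Column overhead flow,M3/H,120,110,20,10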
Then, in the query, the columns of the .CSV file must be defined as local variables, and in the for loop you must enter the name and path of the .CSV file.
local nameoftag InfoPlus.21 Data Type;
local Field1 InfoPlus.21 Data Type;
.........
local Fieldn InfoPlus.21 Data Type;
for (select line as tagname from '.CSV path and name' where linenum > 1) do;
nameoftag = substring (1 of tagname between ',');
Field1 = substring (2 of tagname between ',');
......
Fieldn = substring (n+1 of tagname between ',');
Update IP_analogdef
set Field of record fixed area = Field1, ..... Field of record fixed area 2 = Fieldn where name = nameoftag;
end;
For this example the code would be this:
local description char(25);
local nameoftag char(25);
local engunit char(10);
local highhigh real;
local high real;
local low real;
local lowlow real;
for (select line as tagname from 'C:\Users\Student\Desktop\test.csv' where linenum > 1) do;
nameoftag = substring (1 of tagname between ',');
description = substring (2 of tagname between ',');
engunit = substring (3 of tagname between ',');
highhigh = substring (4 of tagname between ',');
high = substring (5 of tagname between ',');
low = substring (6 of tagname between ',');
lowlow = substring (7 of tagname between ',');
Update IP_analogdef
set IP_DESCRIPTION = description, IP_ENG_UNITS = engunit, IP_HIGH_HIGH_LIMIT = highhigh, IP_HIGH_LIMIT = high, IP_LOW_LIMIT = low, IP_LOW_LOW_LIMIT = lowlow where name = nameoftag;
end;
Finally check in the IP.21 Administrator that the changes were made.
Keywords: SQLplus
Query
Fixed Area
Fields
References: None |
Problem Statement: When you redeploy an RTE controller in V12.1, the Master Request and Status go to OFF without giving any warning or message when using the Warm Initialization option. | Solution: In previous versions, DMC3 Builder shows a warning message saying that the controller must be turned OFF before redeploying when using the Warm Initialization option.
In V12.1 you can redeploy with the MasterOnOffStatus ON, but you need to have an IO tag linked with the MasterOnOffRequest.
In V14.0 the defect fix removes the requirement of having an I/O tag as a work-around for keeping the controller with the MasterOnOffRequest ON.
Fixed in V14.0.
Keywords: DMC3 Builder, RTE controller, MasterOnOffRequest
References: None |
Problem Statement: The Merge action allows joining two or more DMC3 applications into one application. This action keeps the applications' Models, Cases, and Configurations from the previous applications in the new one. However, there are a few tips to remember when you need to perform this action. | Solution: 1.- Import the applications into one project, which is considered the main application. To avoid affecting this main application, consider keeping a backup copy of this project.
2.- Check the list of options that you consider correct for the application. By default, the Merge Controllers window will allow you to Merge Models, Merge Tuning values, Calculations and User Defined Entries, and Merge Identification Cases. If one of these is not needed, consider unchecking the box. This helps to keep only the information you need for the new merged application.
3.- Consider the following case. Application A has a model and a basic configuration, and Application B is a more advanced application that has a model, a robust configuration, and optimization values. In this case, you need to conserve the configuration and tuning of Application B. When you click on Merge, the Merge Controller window will appear and ask you to select a primary controller and a secondary controller. In this case, as priority must be given to conserving the parameters of Application B, B needs to be set as the secondary controller and A as the primary controller. Otherwise, the configuration of Application A will be given priority and the configuration of Application B will be lost in the new merged application.
4.- The Merge action only allows merging two applications at a time. If you require more than two applications, then you will need to merge the product of the first merge with the next application. Let's say that you need to merge A, B and C. Then you need to merge A and B first, and then the AB application needs to be merged with C. (Note: keep in mind the previous consideration for this last action.)
Keywords: DMC3 Builder, DMC3 applications, Merge
References: None |
Problem Statement: Aspen HYSYS Dynamics – CCC Prodigy & Series 5 Emulator link | Solution: A HYSYS extension has been developed to create a link between Aspen HYSYS Dynamics and the CCC Emulator from Compressor Controls Corporation (links to both CCC Prodigy and S5 OPC Emulator are supported).
The communication occurs through the OPC server that comes with the CCC Emulator. Further requirements and a description may be found in the help document AspenHYSYS_CCC_Link.pdf.
Aspen HYSYS CCC Link - Installation Instructions
Download the CCClink_Dec2021.zip file and extract the files.
Run the self-installer file CCCLink.exe and install the package under default folder C:\ProgramData\AspenTech\HYSYS CCC Link, or specify the current HYSYS installation directory, e.g. C:\Program Files\AspenTech\Aspen HYSYS VXX.X\Extensions\HYSYS Prodigy Link
A sample case, COMPRESSOR CONTROLLER PROJECT.hsc, is provided.
RegisterOPC.bat is a batch file that registers OPCDA.dll (it is available after the .exe file is run). This registration is required for the link to run and should be run as an administrator.
Before using the link, the user needs to launch HYSYS and register the extension using Register Extension in the Customize ribbon tab.
Keywords: CCC Controller Emulator Link, CCC Prodigy, CCC Series 5, Dynamic Extension, Controller, Extension.
References: None |
Problem Statement: How can the equilibrium of a reaction be controlled/specified in a RGibbs reactor? How do you restrict the equilibrium in the simulation?
What is the temperature approach option in the equilibrium reaction, and how can it be used to modify the effluent composition from an equilibrium reactor?
This option is useful in modelling applications such as fired-reactor reforming furnaces with low-activity catalyst. | Solution: In RGibbs, you can restrict the chemical equilibrium with two Calculation options on the Setup | Specifications sheet: Restrict chemical equilibrium - specify temperature approach or reaction extents or Restrict chemical equilibrium - specify duty and temperature, and calculate temperature approach
If you select Restrict chemical equilibrium - specify temperature approach or reaction extents, you must specify one of these two options on the Setup | Restricted Equilibrium sheet:
Temperature approach for the entire system: select the Entire System With Temperature Approach option, and specify the temperature difference between the chemical equilibrium and reactor temperatures. Use a negative value if the reactor temperature is greater than the temperature at which RGibbs calculates chemical equilibrium.
Temperature approach or molecular extent for individual reactions.
The equilibrium can be restricted in the following ways:
Specifying the molar extent of the reaction (EXTENT-SPEC) for any individual reaction defined by a STOIC statement. This should not be entered for a reaction that has a temperature approach specified by a TAPP-SPEC statement.
Set the temperature approach to chemical equilibrium for any of the individual reactions defined by the STOIC statement. A temperature approach (TAPP-SPEC) should not be entered for a reaction that has an extent fixed by an EXTENT-SPEC statement.
Select the Individual Reactions Option, when you want to restrict one or more chemical reactions from reaching chemical equilibrium by specifying a reaction extent or temperature approach. To specify new reactions, click the New button and specify the stoichiometry and equilibrium restriction in the Edit Stoichiometry dialog box. After specifying one or more reactions, you can use the Edit button to modify and the Delete button to remove existing reactions.
Note 1: If either molar extent or temperature approach is specified, then the stoichiometry for a set of linearly independent reactions involving all components present in the system must be specified. Only these reactions are used. In addition, the set of reactions needs to be complete and satisfy the degrees of freedom of the system. In REquil, you can have a partial set of reactions.
Note 2: If you do not specify molar extent or temperature approach (that is, all reactions are set to the default 0 temperature approach) then RGibbs ignores the reactions. In this case, no restrictions are enforced on the reactions specified.
If you select Restrict chemical equilibrium - specify duty and temperature, and calculate temperature approach, both temperature and heat duty must be specified. RGibbs calculates the temperature approach to chemical equilibrium internally. You can provide an estimate of the temperature approach on the Advanced | Estimates sheet and energy balance convergence parameters on the Advanced | Convergence sheet. The default value for temperature approach estimate is 0.
When Restrict chemical equilibrium - specify duty and temperature, and calculate temperature approach is specified, the chemical equilibrium constant is evaluated at the temperature T + DeltaT, where T is the actual reactor temperature specified by the user and DeltaT is the desired temperature approach. If a temperature approach for the entire system is desired, the Restrict chemical equilibrium - specify temperature approach or reaction extents option should be used instead. For this case, the chemical equilibrium is computed at T + DeltaT, while the phase equilibrium is computed at T.
Temperature Approach
The temperature approach option in the equilibrium reaction allows the user to adjust the equilibrium constant of a reaction by offsetting the temperature at which it is calculated. The approach value is an empirical adjustment that is used to modify the extent of reaction at equilibrium when the value of the equilibrium constant is not well established.
Using the temperature approach option offsets the effective temperature used to calculate the equilibrium constant:
ln Keq = -DG / [R (T + DeltaT)]
In Aspen Plus, it is possible to identify if a reaction is endothermic by viewing its reaction heat. If the reaction heat is positive, the reaction is endothermic, if it's negative, the reaction is exothermic.
The temperature approach is usually used to determine how far the real reaction is from the equilibrium state.
If the reaction is exothermic, a positive value will make the conversion of the reaction decrease and a negative value will make the conversion increase.
If the reaction is endothermic, a positive value will make the conversion of the reaction increase and a negative value will make the conversion decrease.
Having a conversion increase from the predicted equilibrium value will not occur in practice, but the option is available.
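Why the conversion shifts in these directions follows from the van't Hoff relation, a general thermodynamic result quoted here for reference (DeltaH is the heat of reaction):

\frac{d \ln K_{eq}}{dT} = \frac{\Delta H}{R\,T^{2}}

When the effective temperature used to evaluate Keq is raised, Keq increases for an endothermic reaction (DeltaH > 0) and decreases for an exothermic one (DeltaH < 0), which shifts the predicted conversion accordingly.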
In the following example:
CH4 + H2O <==> CO + 3 H2
The forward reaction (CH4 + H2O to produce CO + 3 H2) is endothermic; hence the reverse reaction (CO + 3 H2 to produce CH4 + H2O) is exothermic. In Aspen Plus, if a positive DeltaT is specified as the temperature approach for this reaction, the CH4 composition in the product stream will increase. This is because the reverse reaction (producing CH4 + H2O) is being favored since the equilibrium temperature is effectively being decreased to T - DeltaT.
The specified approach temperature is an approach to the calculated reactor outlet temperature, rather than the approach to the chemical equilibrium temperature.
The approach to chemical equilibrium temperature is NOT an approach to reactor outlet temperature.
Keywords: rgibbs
t-app
References: None |
Problem Statement: How to obtain the estimate for the equilibrium constant for an ionic reaction?
When ionic reactions are specified in the chemistry folder, the equilibrium constant can be specified by the user or it can be calculated internally by the Aspen Plus solver using the Gibbs free energies for reactants and products, which are available in the databanks. In this case, the equilibrium constant page is displayed blank and there is no report of the value of equilibrium constant that is used in the simulation. | Solution: The equilibrium constant, Keq, can be calculated using the activities and true mole fractions for the reactants and products as follows:
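In general terms, for a reaction with stoichiometric coefficients nu_i (negative for reactants, positive for products), the equilibrium constant written in terms of the liquid-phase activities a_i, activity coefficients gamma_i, and true mole fractions x_i is:

K_{eq} = \prod_i a_i^{\nu_i} = \prod_i \left( \gamma_i \, x_i \right)^{\nu_i}

This general relation is what the calculator block described below evaluates from the retrieved activity coefficients and true mole fractions.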
In the attached file the following reaction has been generated in the chemistry folder
If there are no specified values for the equilibrium constant, by default the equilibrium constant will be calculated internally by the Aspen Plus solver based on the Gibbs Free Energies for the reactants and products.
If values are specified for A (and B, C, D, and E), these will be used to calculate the equilibrium constant Keq.
In the attached file, a calculator block has been set up in the simulation environment to retrieve the activity coefficients and true mole fractions in the liquid phase for the reactants and products.
After running the file, the equilibrium constant has been calculated to be 0.0099, or approximately 0.01 when A is specified as -4.60517 since exp(-4.60517 ) = 0.01.
If no value is specified for A, the equilibrium constant will be calculated as 1.08e7.
Keywords: Ionic Reactions, equilibrium constant.
References: None |
Problem Statement: Docker Engine is the industry’s de facto container runtime that runs on various Linux and Windows Server operating systems. Aspen Mtell Maestro is a containerized application and requires Docker Engine on the environment on which it will be deployed.
We have identified that many antivirus products (e.g. Symantec, Kaspersky, Malwarebytes) with their default configuration break Docker Engine. When this happens, Aspen Mtell Maestro containers cannot be started, and Maestro Failure Agents cannot be created, trained, or deployed in Aspen Mtell.
Example:
Aspen Mtell Maestro will not go past the following step during startup:
After about 5 minutes, you may find the following warning message related to Docker in the Event Viewer app:
“Syscall did not complete within operation timeout. This may indicate a platform issue. If it appears to be making no forward progress, obtain the stacks and see if there is a syscall stuck in the platform API for a significant length of time.” | Solution: This article contains some guidelines on how to approach this problem; however, the final resolution will most likely depend on the antivirus software, and it may be necessary to contact the antivirus vendor to request information about how the antivirus needs to be configured on the machine running Docker Engine.
General Recommendations
Please, follow the general recommendations made by Docker:
When antivirus software scans files used by Docker, these files may be locked in a way that causes Docker commands to hang.
One way to reduce these problems is to add the Docker data directory (/var/lib/docker on Linux, %ProgramData%\docker on Windows Server, or $HOME/Library/Containers/com.docker.docker/ on Mac) to the antivirus’s exclusion list. However, this comes with the trade-off that viruses or malware in Docker images, writable layers of containers, or volumes are not detected. If you do choose to exclude Docker’s data directory from background virus scanning, you may want to schedule a recurring task that stops Docker, scans the data directory, and restarts Docker.
Uninstalling the antivirus software
Uninstalling the antivirus software might not be the final solution, but it can be done to quickly confirm that the antivirus is the cause of the issue you are observing. After this test, the antivirus can be installed again, and the investigation can proceed on which antivirus exceptions and settings need to be configured for the machine on which Docker Engine needs to be running.
Note: Disabling the antivirus is not a recommended approach to confirm whether or not the antivirus is the root cause of the issue, as disabling the antivirus may not remove the filter drivers created by it.
Symantec Endpoint Protection
Please, follow the recommendations by Symantec titled: Cannot create or launch Docker containers on Windows Server 2016 when Symantec Endpoint Protection (SEP) is installed.
To work around this issue, you will need to upgrade to SEP 14 RU1, or newer, and add the following paths as Windows File Exceptions to the Exceptions Policy at the SEPM.
Prefix Variable File and Path (Exclude child processes)
%[SYSTEM]% lsass.exe
%[SYSTEM]% svchost.exe
%[SYSTEM]% cexecsvc.exe
%[SYSTEM]% oobe\windeploy.exe
Ensure that you Choose Application Control (for the type of scan that excludes the file) and select also Exclude child processes. The new Exceptions Policy should then be deployed to the affected clients.
Note: if you were experiencing a Docker installation failure before putting these exceptions into place, you may need to uninstall the failed package before retrying.
For situations where you will be adding Windows Features to a live container, or installing a service, additional exceptions may be needed. The following example shows the exceptions to both run an MSI install and run the DNS service (Not all of these are necessary for all situations):
Prefix Variable File and Path (Exclude child processes)
%[WINDOWS]% servicing\trustedinstaller.exe
%[SYSTEM]% msiexec.exe
%[SYSTEM]% dns.exe
Kaspersky
We identified the following Kaspersky service as conflicting with Docker Engine in one instance: “Kaspersky Security Exploit Prevention Service”. When that service was disabled, Docker Engine ran successfully.
Keywords: Compose
Swarm
Deployment
References: None |
Problem Statement: How to delete the items in mMDM when the checkbox ‘show only deleted definitions’ is greyed out in mMDM Editor? | Solution: In some cases, it is noticed that the checkbox to show deleted definitions is not enabled. Normally this checkbox should be enabled.
One case where it might not be enabled is when the user account is viewing the mMDM Editor as launched from the mMDM Bulk Load application. Please ensure that the user launches the mMDM Editor directly, and not from the Bulk Load application.
Here are the steps to resolve this issue:
Launch the mMDM Editor directly from the Start menu.
Navigate to the Unit of Measure | Unit of Measures collection.
Click the View | Filter menu option.
From the Filter dialog, click the Show only deleted definitions button.
Click OK when prompted to review the warning, then click OK again to view the deleted items.
Right-click on the deleted UOM, then click Undelete.
Click OK to use the default undelete settings.
Now the UOM is active again. Visit all locations where the UOM is referenced and remove the reference. The references can be determined by using the Validate button on the Tools | Publish menu in the mMDM Editor.
After all references are removed, the UOM can be deleted.
Note: When the UOM is deleted, it may still be kept in reference by a class attribute value on another mMDM definition. The correct process is to attempt to undelete the UOM, then remove it from referenced locations, such as class values. The procedure to delete the items is listed in this KB: https://esupport.aspentech.com/S_Article?id=000049562
Keywords: Deleted definition, greyed out, UOM, mMDM Editor, checkbox
References: None |
Problem Statement: aspenONE SLM License Manager Overview | Solution: The aspenONE SLM License Manager allows you to access various SLM features from a central location.
To open the aspenONE SLM License Manager, from the Start menu, select aspenONE SLM License Manager
On the aspenONE SLM License Manager, you can click the following buttons:
Click the Configuration Wizard button to access the SLM Configuration Wizard. You can click the Help button on the Welcome to the SLM Configuration Wizard screen to access help for this tool. The SLM Configuration Wizard is a utility that guides you through a series of steps that configures the SLM to generate license keys.
Click the Commute button to access SLM Commute. You can use SLM Commute to manage commuted licenses. Commuted licenses are borrowed licenses that allow a client computer to run the licensed product while disconnected from the network without the use of an SLM dongle.
Click the License Profiler button to access the SLM License Profiler. You can click Help | Contents on the SLM License Profiler screen to access help for this tool.
The SLM License Profiler lets you obtain specific information about the licenses available on an SLM server or license file. Typically, you will use the SLM License Profiler to verify licenses on a license server or license file and to diagnose license related problems.
Click the Auto Upload Tool button to access the Auto Upload Tool. The Auto Upload Tool lets you systematically transmit usage log files to AspenTech, either by secure http, secure ftp transmission, or as an attachment to an email sent to the ALC mailbox. You can select the secure method that best meets your needs. For more information, refer to the Auto Upload Tool Installation Guide.
Click the License File Installer button to install a license file. The aspenONE License File Installer lets you install either a standalone or network license file on your machine.
Click the Locking Info button to access SLM Locking & Configuration Information. The Locking Information and Configuration Information sections display SLM and system information, including the configuration settings set using the SLM Configuration Wizard. This information helps you understand how SLM is configured on your computer and is required by AspenTech to generate your license files.
Click the Dashboard button to access the Aspen Licensing Dashboard. You can use Aspen Licensing Dashboard to increase productivity by monitoring denial of licenses, license expiration dates, and connection to your license server. For more information, refer to the Aspen Licensing Dashboard Getting Started Guide.
Click the WLM Admin button to access WLM Admin. WLMAdmin is the primary network license administration tool. It is designed to provide access to most of the SLM licensing features and full information on licensing activities at the following levels: server, feature/license, and user.
Click the Baseload Tokens button to access the Base Load Service Report. The Base Load Token Service provides a way to track and report on product token consumption.
The aspenONE SLM License Manager opens automatically connected to the currently configured license server.
Depending on your setup, a summary may appear, displaying the following information:
· Server system name
· Total tokens in license file
· Number of tokens currently checked out
· Number of unique users currently using licenses
Keywords: aspenONE License Manager
aspenONE SLM
aspenONE License Manager Overview
References: None |
Problem Statement: How to manually back up the Aspen Mtell database using SQL Server Management Studio | Solution: We recommend customers schedule a full Aspen Mtell database backup every week. This article shows the steps to take a manual backup of the Aspen Mtell database.
1. Stop all Aspen Mtell services and close all Aspen Mtell applications.
2. Open Microsoft SQL Server Management Studio
3. On the left, expand the Databases section
4. Right click on the database you are interested in backing up, hover over the Tasks option, and select Back Up
5. In the Back Up Database window, make sure you are on the General page and check the backup type; make sure it is Full
6. Click the Options page on the left and select Overwrite all existing backup sets
7. Return to the General page and press Remove to eliminate the default backup name
8. Click add to open the select backup destination window
9. Press the three dots and navigate to the location you wish to back up the database to.
10. In the file name section write the name you wish to save the database as
11. Click OK to close this window
12. Click OK to close the selected backup destination window
13. Click OK to start the back up process.
14. When the completed successfully message appears, the backup is done. If not edited, the backup file will be called Mtellsuite.bak by default
15. When the backup is finished, if the location was left at the default, the database backup will be available as shown below
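Equivalent to the steps above, a minimal T-SQL sketch that can be run in a query window in SQL Server Management Studio (the database name MtellSuite and the destination path are placeholders; adjust them to your environment):
-- Full backup of the Aspen Mtell database to a single .bak file
BACKUP DATABASE [MtellSuite]
TO DISK = N'C:\Backups\MtellSuite.bak'
WITH INIT,  -- overwrite existing backup sets in the file
     NAME = N'MtellSuite full backup';
Stop the Aspen Mtell services and close the Aspen Mtell applications before running it, as described in step 1.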
Keywords: Back up
Database
SQL
References: None |
Problem Statement: How to back up a SQL database on a regular schedule using Microsoft SQL Server Management Studio | Solution: This article will help you take SQL database backups on a regular schedule. These steps will create new backup files every time the job runs.
Create a SQL Server Agent Job
Launch Microsoft SQL Server Management Studio and connect to the database
Expand Management and right click Maintenance Plans and select Maintenance Plan Wizard
Click Next on Maintenance Plan Wizard
Type Name and Description
Select Single schedule for the entire plan or no schedule
Click Change to setup a schedule
Select the Frequency of the backup and click OK
Click Next after selecting the Schedule
Select Back Up Database (Full) option and click Next
Click Next
Click Database dropdown and select the database to backup and click OK
Click Destination Tab, select Create a backup file for every database and check Create a sub-directory for each database
Select the folder to backup the database and click Next
Click the Options tab, select Compress backup in the Set backup compression dropdown, and click Next
Select the same folder as your backup for the log file and click Next
Click Finish
Make sure all actions are successful and click Close
A new Maintenance Plan should be created as per the given name
Test SQL Server Agent Backup Job to confirm it runs successfully
Go to Services.msc
Update the SQL Server Agent to Startup Type Automatic and Start the service
Go back to Microsoft SQL Server Management Studio
Expand SQL Server Agent -> Jobs and right click Maintenance job you created and select Start Job at Step
Confirm both the actions are successful
Confirm backup is created in the folder you specified in the wizard
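If preferred, the same kind of full backup can also be scripted as a single SQL Server Agent job step instead of a maintenance plan. The T-SQL below is only a sketch; the database name and backup folder are placeholders, and the file name is stamped with the current date and time so each run creates a new backup file:
-- Example job-step script: full backup with a timestamped file name (adjust names and paths)
DECLARE @file nvarchar(260) =
    N'C:\Backups\MtellSuite\MtellSuite_' +
    REPLACE(REPLACE(CONVERT(nvarchar(19), GETDATE(), 120), ':', '-'), ' ', '_') + N'.bak';
BACKUP DATABASE [MtellSuite]
TO DISK = @file
WITH COMPRESSION,                   -- same effect as the Compress backup option in the wizard
     STATS = 10;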
Keywords: Database backup
Mtell backup
Back up
Schedule backup
References: How to restore a SQL database from backup using SSMS
How to manually back up Aspen Mtell database using SQL Server Management Studio |
Problem Statement: SQLplus takes a very long time to start (10-15 minutes). Once it starts, it works normally. The same thing occurs with AspenCalc. | Solution: This is a Framework Security configuration problem. To test this out, try adding n to the command line of TSK_SQL_SERVER in the InfoPlus.21 manager. The full command line should then be 10014 n. Then restart TSK_SQL_SERVER. This disables security in SQLplus.
If this fixes the problem, you need to check the Framework security setup. Start with the HKEY_LOCAL_MACHINE\Software\AspenTech\AFW\URL registry key. Check that this is pointing at the correct security server and try to view the URL with Internet Explorer. Also open Control Panel | Services and check that the AFW Security Client Service is running under a domain account, not a local account.
Keywords: slow startup
startup
sqlplus
aspencalc
References: None |
Problem Statement: IQ Applications disappeared from PCWS and from the APCmanage application. From iq_manage.exe command prompt, it is possible to see that IQ applications were still running and writing values to the OPC.
It is possible to have only the ACO and RTE applications running and visible in PCWS and the ACO applications visible on APCmanage.
There is an error message at the bottom of the APCmanage window in the IQ Applications section: Application data services not running. | Solution: The root cause of this issue is that the Aspen APC Inferential Qualities Data Service was stopped for some reason. This service handles communication between the IQ server and the PCWS, but it is not responsible for the calculation engine of the IQ sensors themselves.
The Aspen APC Inferential Qualities Data Service needs to be running in order to see IQ applications on PCWS and APCmanage.
If the service is stopped and the IQ application was started, then the calculations of the IQ sensor will continue executing but the application will stop being visible.
This can be confirmed by keeping an IQ turned ON and manually stopping the mentioned service, causing the issue described previously; in the Task Manager it is then possible to see that the tasks related to the IQ calculations are still running.
When the service is restarted, the IQ application should be visible again in APCmanage as well as in PCWS.
Keywords: Inferential Qualities, APCmanage, PCWS
References: None |
Problem Statement: Starting in V12.1, DMC3 Builder has a new feature of creating Deep Learning cases for CVs. However, after taking a snapshot of the online controller, you will notice that Deep learning case information is missing from the snapshot file. How can this case information be recovered? | Solution: This is a defect that will be fixed in the future release. This article provides the workaround for this issue:
1. Create a new Deep Learning case in the snapshot application
2. Then select current nonlinear CV as Master model output like the screenshot below and click OK.
This will recover the current Master Model's Deep learning case information and the modeling result is also repopulated automatically as shown below.
Keywords: Deep learning, CV, DMC3, V12.1, Snapshot, missing, empty, case, model, training
References: None |
Problem Statement: How can I define the Rated Capacity of a Centrifugal Pump, given its calculated Normal capacity data, via a Method script in ABE? | Solution: The Rated Capacity of a Centrifugal Pump is not populated by simulators, but the user can calculate it via a Method that runs when the Normal capacity of a Centrifugal Pump is available. Under the KBs folder, add the Method .azkbs file under the directory: C:\AspenZyqadServer\Basic EngineeringXX.X\WorkspaceLibraries\KBs\ExampleScripts. A script code file is attached.
Keywords: None
References: None |
Problem Statement: Is it possible to install the GDOT Web Viewer on a different server than the GDOT Online one? | Solution: Using the installation media, it is possible to install the GDOT Web Viewer components on a different server than the GDOT Online Server.
The supported architecture for this functionality is having the SQL Server in the GDOT Online machine with the GDOT Databases, specifically GDOTOnlineHistory, already configured.
GDOT Web Viewer Server is usually located in a Demilitarized Zone or a Business Network depending on the implemented architecture.
Note that for V12.1 and earlier versions, the Web Viewer services exist only on the GDOT Online Server:
That is the main reason the SQL database must be installed on the same server as the GDOT Online components.
For the post-configuration of the Web Server, you must modify the GDOTOnlineWebViewerConfig.json located in C:\ProgramData\AspenTech\GDOT Online\V12.1\WebFrontEnd using a text editor to change the default IP Address to the GDOT Online Server one:
The default port for this service is 8000. If the default needs to be changed to a different port, modify the port in this file as well as in GDOTOnlineWebCoreConfig.txt, located in C:\ProgramData\AspenTech\GDOT Online\V12.1\WebBackEnd on the GDOT Online Server:
Restart the AspenTech GDOT Online V12.1 Web Core service for these changes to take effect.
NOTE: Remember that the GDOT Online Server and GDOT Web Viewer Server must be reachable from each other (verify with a ping test in a Command Prompt window), and firewall rules must allow the port specified for the web service.
Keywords: GDOT Web Viewer, GDOT Online Server, post-configuration
References: None |
Problem Statement: How do I characterize a crude assay in Aspen HYSYS? | Solution: Characterization refers to the process in which limited input assay data is used in conjunction with estimation methods and other mathematical techniques to generate a “model” of the assay. The model allows properties to be calculated outside of the range of the available data to satisfy the needs of the planning model. It also allows assay data to be recut at any cut points required by Aspen HYSYS.
There are two options available in Aspen HYSYS for assay characterization:
Assay Manager
Oil Manager
The difference between Assay Manager and Oil Manager is explained in the following article: What are the main differences between Aspen HYSYS Oil Manager and Petroleum Assay Manager?
Note: One of the advantages of Aspen Assay Manager is that it automatically creates a Component List with Hypothetical Components and Fluid Package once an assay is entered.
Aspen HYSYS Petroleum Refining assays are added via the Assay Manager. There are three ways to add it:
Import it from a file,
Import from Aspen Library (pre-characterized assays)
Enter the assay manually.
To add the assay manually, follow these steps:
In the Properties Environment, go to Petroleum Assays folder,
Click on Add | “Manually Entered” option.
Select one of the components list from the drop-down Assay Component Selection menu.
Select the assay type: Multi Cut Properties, Single Stream Properties, or Back Blending.
Multi Cut Properties: lets you specify mass, molar, or volume cuts of the crude assay sample with user defined initial and final boiling points.
Single Stream: Lets you define the distillation percent and temperature of individual streams.
Back Blending: Back blending lets you define feed streams by blending their associated products.
Provide the necessary information and click on OK. The Input Assay form will be created. It is used to enter and edit the experimental assay data of a crude assay and consists of the following tabs:
Input Summary: Enter the property data for the whole crude and product cuts.
Pure Component: Define the pure components percentage in the overall assay.
Distillation Data: Define the distillation curve data for the whole crude and any cuts that have been defined on the Input Summary form.
After all the specifications are entered, click on Characterize Assay button in the bottom-right corner of the Assay window and wait a moment for the assay to fully compute (if manually enter option was selected, otherwise characterized results will be available immediately). To see the results, go to Conventional Results form. You can also create plots of your characterized assay from Plot Gallery in the Assay Management ribbon.
The last step is to attach the assay to the material stream in the Simulation Environment. Add a new stream, double click on it and go to Worksheet | Petroleum Assay form. Select the “Attach Existing” option and use the drop-down menu to choose the assay. Use the Conditions form to specify all the required stream properties.
Keywords: Assay Manager, Oil, Characterize, HYSYS
References: None |
Problem Statement: Why am I getting a decreasing trend in the outlet moisture content when I increase the inlet feed moisture content, when ideally the opposite should happen? | Solution: Various issues, such as wrong inputs or wrong specifications in the sensitivity analysis, may cause such errors. If everything else is correct, then modify the Sherwood Number; with a very small Sherwood Number (0.01), you can get the expected behavior. The user should check that the mass transfer rate is calculated correctly based on the method selected and the data provided.
After selecting this, the user may observe the expected behavior, as shown below (increasing moisture content).
Keywords: None
References: None |
Problem Statement: Does EDR calculate the weights of structures and walkways for an air cooler? If not, why are these details available under the Cost/Weight Table? | Solution: The focus of Aspen Air Cooler is thermal design; it does not estimate the weight of the structure, walkways, etc. It provides the estimated weight of the bundle and side frames with tube supports only. The user should check all estimated weights thoroughly before finalizing the cost of the exchanger.
(Structure and walkway options are available, as shown above; they take part only in defining geometry, not in weight estimation.)
The user can use aspenOne Exchange to search HTFS reports for various topics. Below are some examples:
DR61 Part 1: Air Cooled technical manual
AE3: Fans for Air-Cooled Heat exchangers
Keywords: None
References: None |
Problem Statement: The value of the Aeration factor multiplier in a PetroFrac column is not changing.
The default value is 1.
When the user tries to change this value, it cannot be changed and the following error message is shown:
** ERROR IN THE BLOCK PARAGRAPH WHICH BEGINS ON LINE 117
BLOCK NAME: CDU MODEL NAME: PETROFRAC SKW: TRAY-RATE1
VALUE IS REQUIRED FOR TERTIARY KEYWORD: SECNO
BUT WAS NOT ENTERED. PARAGRAPH IGNORED.
*** SEVERE ERROR
BLOCK: CDU APPEARS IN THE FLOWSHEET, BUT
NO BLOCK PARAGRAPH HAS BEEN ENTERED FOR IT. | Solution: The aeration factor is used to tune or adjust the pressure drop.
This is an adjustment factor to the aeration parameter β used in the calculation of pressure drop through the aerated liquid on a tray. Adjust this parameter to get better agreement between calculated and measured tray pressure drops. Default value is 1.
The option is available in Rating mode, under Tray Rating | Design/DP.
To access this parameter, turn on column targeting hydraulic analysis. With hydraulic analysis enabled, the input language is accepted when the aeration factor is changed.
(Note: This issue has been raised to the development team to make this variable directly accessible; it is expected to be resolved in the next release, V12.)
Keywords: None
References: None |
Problem Statement: This KB article explains why the GDOT DR solution is affected by changing the QP target for an MV even if the QP weight is zero for that MV.
Solution
For DR, the “QP Target” field on an MV is actually used to wire in the process value, or in simulation mode the currently specified value (as if it were coming from the process). For the MV penalty term we are penalizing the difference between the reconciled MV value (which in the console is displayed in the “Current” value field) and this process value (in the “QP Target” field). And yes, if the “QP Weight” is zero then this penalty term is zero, as expected. However, when this “QP Target” (actually the process value or simulated process value) changes, this causes all the predicted CV values to change, which in turn alters the objective function value through the CV penalty term. This is the cause of the DR solution changing when the “QP Target” on an MV changes (even with a zero “QP Weight”), and it is the correct behavior.
Keywords: GDOT, QP Target, QP Weight
References: None |
Problem Statement: What does Aspen Capital Cost Estimator considers as Support Personnel in the *.ccp evaluation report? | Solution: Support Personnel includes the costs assigned for secretarial, clerical, administrative, and accounting support in each engineering category.
This concept can be found in the *.ccp report in the following fields:
Procurement report
Home office construction services
Field office construction supervision
Keywords: Support Personnel, Report
References: None |
Problem Statement: What is the Kinetic correlation option in a Sulsim Incinerator? | Solution: The Kinetic correlation requires that you enter the kinetic correlation.
Aspen HYSYS uses this value to calculate the conversion of CO, H2, H2S, COS and CS2 in the incinerator because HYSYS includes the oxidation reactions of these components.
So HYSYS uses the conversion equation when you specify the kinetic parameter. If you don't have this parameter, it is recommended to use the Include Stacks check box.
In this second option, HYSYS determines the kinetic parameter based on the delta P and delta T.
So you have two options: either define the kinetic parameter, or let HYSYS evaluate it if you don't have that information.
The attached pdf document describes the reactions and equations used for the calculation.
Keywords: extension, incinerator & stack, claus
References: None |
Problem Statement: I would like to view / customise the VBA code related to the Aspen Utilities Planner (AUP) plug-in for MS Excel. Where do I find this? | Solution: Like most plug-ins for MS Excel, the AUP plug-in is an *.xla attachment to the MS Excel application. When AUP is installed on a PC with MS Excel, this association should be made automatically. AUP will be available as a tab in the MS Excel ribbon.
If this tab is not shown, it can be manually added to MS Excel (from the Add-in Manager, usually from the Program options) from the following location:
C:\Program Files (x86)\AspenTech\Aspen Utilities Planner V10.0\bin\Utilities360.xla
Note: the above *.xla refers to V10 so the number (360) will be different based on the version of the AspenTech Engineering suite installed.
Each of the commands in the drop-down menu refer to subroutines located in the *.xla reference. The table below shows the location of each of these subroutines in their respective modules.
Note: The *.xla is not locked and can be viewed/edited without a password.
Button Text Sub Button [Parent module].Subroutine
Open Aspen Utilities - AddInModule.OpenButton_Click
Close Aspen Utilities - AddInModule.CloseAM
Show Aspen Utilities - AddInModule.ShowAM
Hide Aspen Utilities - AddInModule.HideAM
Keywords: None
References: None |
Problem Statement: How can I find the Equipment Rental estimated for my Aspen Capital Cost Estimator project? | Solution: Aspen Capital Cost Estimator (ACCE) can generate and estimate the cost of Equipment Rental, such as cranes and trucks, which will be required during the construction of the facility.
The Equipment Rental results can be found on the CCP report, searching for the label “EQUIPMENT RENTAL SUMMARY”:
And on the Standard Interactive Report Capital Cost Reports | Indirect Costs | Equipment Rental Summary by Contractor:
Keywords: Rental, Equipment, Indirects, Proratables, Rental Rate, Duration, Fee
References: None |
Problem Statement: What does the following information message mean: INFO > ' - 0' SOIL TYPE NOT SPECIFIED ? | Solution: The INFO you are receiving is an informational message and it won't prevent your estimate from completing. The Civil/Steel specs in the Project Basis View is where you may set the soil type.
Keywords: Info, message, soil, type
References: None |
Problem Statement: Why does the Equilibrium reactor predict carbon formation while the Gibbs Reactor does not? | Solution: Aspen HYSYS Component Database has only one kind of carbon. Carbon presents itself as different allotropes, such as C1 (Carbon-Graphite), C2 (Carbon-Diatomic Gas), C3 (Carbon Triatomic Gas), C-D (Carbon-Diamond) etc. Each one corresponds to a different free energy state. The carbon atom contained in Aspen HYSYS Component Database has a huge free energy value relative to other carbon states you may have experienced in your case.
The reason for the Gibbs free energy for carbon not to equal zero is related to the state of carbon used in Aspen HYSYS which, by default, is in the form of activated carbon. Therefore, a change in the enthalpy of formation for carbon will result in a change in the Gibbs free energy (G) for the same element. Since G=0 is assumed for all the substances at their reference state, the carbon structures away from that state will have a non-zero Gibbs free energy value.
Usually, this problem can be fixed by cloning the Carbon component, altering its first default Gibbs free energy value, and setting all other coefficients to zero (refer to Figure 1).
Figure 1. Gibbs Free Energy
We have had positive feedback from clients who have modified carbon component this way and could match literature data.
Please follow the following steps to modify carbon to use Gibbs reactor
1) From the Simulation Properties Environment, add the carbon component from HYSYS Database (as a conventional component).
2) From the Ribbon, in the Hypotheticals tab, click on “Convert”.
3) From the left column, select the component Carbon and click on “Convert to Hypo(s)”. Your component now should be already converted into a Hypothetical with the same properties as the original conventional one. (see Figure 2)
Note: In this step, the original component disappears, so if you’d like to keep the activated carbon on your component list, you will need to select and add it again as a new component.
4) Finally, double-click the cloned component. You can now edit its property data. The Gibbs free energy should now be available under the TDep (temperature dependent) tab.
Figure 2. Convert to Hypo(s)
For an example file, please refer to KB: How to predict carbon deposition in a steam reformer unit in Aspen HYSYS?
Keywords: Equilibrium; Gibbs; Carbon; Hypothetical;
References: None |
Problem Statement: Can I enter pressures greater than 1000 bar or 14503.8 psi for my heat exchangers in Aspen Exchanger Design and Rating (EDR)? | Solution: Currently, Aspen Exchanger Design and Rating does not support simulations with pressures larger than 1000 bar (14503.8 psi).
Reason: EDR must be generic with respect to different shell and tube exchanger configurations and include a variety of physical effects such as entry length effects, natural convection, transition to turbulence, and most importantly two-phase flow modeling. For all these effects, a limit must be applied, since the models used for these effects may not be valid for values significantly greater than those for which they are tested/validated. In addition to this, obtaining physical properties under these conditions can be quite challenging. Furthermore, ASME regulations extend only up to 690 bar (10000 psi).
For these reasons, EDR currently does not support very high pressures (>1000 bar). This is currently being considered as an enhancement for a possible future version but is not available right now.
Key Words:
Exchanger Design and Rating (EDR), high pressure, greater than, 1000 bar, 14503.8 psi, Input Error 1122
Keywords: None
References: None |
Problem Statement: How to create a Put Transfer record in Aspen InfoPlus.21? | Solution: 1. Open the Aspen InfoPlus.21 Administrator.
2. Expand your Logical Device name under Cim-IO and highlight Put Transfers
3. Right click on Put Transfers and select New record defined by IoPutDef:
4. Name the new Put Transfer record (up to 24 characters). Put_Test in this example.
5. You will receive the error message Processing inconsistent with value in field IO_MAIN_TASK. This means that the main task has not yet been set up for the transfer record; this will be done in a later step. Click OK on the message.
6. Use the Find InfoPlus.21 Record tool to navigate to the new transfer record (Put_test in this example).
7. By default the record would be marked as Unusable.
8. Select the Main Task that corresponds to the Logical Device of the Interface in the field IO_MAIN_TASK.
9. Select a priority for the Transfer Record. You can select a number between 1 to 9 in the IO_PRIORITY field.
10. Make sure that IO_ASYNC? field is set to No if you want to disable Asynchronous communication with the Logical Device.
11. Set up a Timeout value in the field IO_TIMEOUT_VALUE (10 seconds for this example).
12. Right click on the new transfer record in the left side pane and select Make Usable
13. Increase IO_#TAGS by 1 or more, depending on how many tags will be added.
14. Double click on IO_#TAGS in order to access to the repeat area of the record. There should be as many new occurrences as defined in the last step.
15. Enter the following in the respective columns:
IO_TAGNAME (The name of the tag as it is in the DCS/OPC). <NEWTAG.PV> in this example.
IO_VALUE_RECORD&FLD (The name of the IP21 tag, then a space character and the field of that tag that will be sent by the transfer record. <ATCAI IP_INPUT_VALUE> in this example:
16. Set IO_DATA_PROCESSING to ON:
17. Go back to the fixed area of the transfer record and set IO_RECORD_PROCESSING to ON.
18. If you want to send data every time an IP21 tag changes (Change of state), put the name of the tag and its field that will be monitored, separated by a space character in the field IO_ACTIVATION_COS:
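Optionally, once the record is usable and record processing is ON, the key fixed-area fields of the new transfer record can be checked from an Aspen SQLplus query window. This is only a sketch and assumes the example record name Put_Test used above:
-- Verify the fixed-area configuration of the new Put transfer record
SELECT NAME, IO_MAIN_TASK, IO_RECORD_PROCESSING, "IO_#TAGS"
FROM IoPutDef
WHERE NAME = 'Put_Test';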
Keywords: Put Transfer
Cim-IO
COS
References: None |
Problem Statement: When configuring a new Cim-IO interface manually and trying to configure a transfer record in the Aspen InfoPlus.21 Administrator, the error No Such Record is displayed when the Main Task from that new interface (TSK_M_XXX) is set up in the IO_MAIN_TASK field. | Solution: The root cause of the issue is that the Main Task has not been added to the IoExternalTskDef Definition Record.
In order to fix the issue, go to the record IoExternalTskDef, then right click on it and select New Record Defined by IoExternalTskDef. Name it with its corresponding Main Task name (TSK_M_XXX).
Once the record is created, put the name of the new interface in the field IO_DEVICE. Press F5 to refresh the Aspen InfoPlus.21 Administrator and go back to the Transfer Record. Now you should be able to assign the main task to the Transfer Record in the IO_MAIN_TASK field.
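To quickly verify that the external task record now exists and points at the correct interface, you can also run a short Aspen SQLplus query (a sketch, assuming the TSK_M_XXX naming convention used above):
-- List the external task records and the Cim-IO device each one points to
SELECT NAME, IO_DEVICE
FROM IoExternalTskDef
WHERE NAME LIKE 'TSK_M_%';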
Keywords: No Such Record
Transfer Record
Cim-IO
External Task
References: None |
Problem Statement: Aspen Mtell client is unable to load and view tags from an online historian through System Manager--> Sensor Data Sources--> Map Sensors, whereas the Aspen Mtell application server is connected and able to browse tags. Users get the error message below:
Unable to load tags.
Unexpected character encountered while parsing value: <.Path, line0, position 0.
This error often occurs when the Historian OLEDB Driver service is not running. | Solution: This generally occurs in newly installed Aspen Mtell client applications. The Aspen Mtell application server may be connected to the historian and able to browse tags, but the client machine gives the above error. When this happens it is likely because the Adapter URL on the client is pointing to the wrong machine. To change the Adapter URL, follow the steps below:
1) Open Aspen Mtell System Manager on the client machine and click on the Configuration (1) tab
2) Click on Settings and then Sensor Data Sources (2)
3) Select the Sensor Data Source you are pulling the tags from and make sure the Adapter URL (3) is pointing towards a machine which has the adapter installed and configured on it (most likely the Aspen Mtell application Production or Test server). In this example, APM is my Aspen Mtell Application Server name.
Note: The client machine name should not be a part of the Adapter URL since no adapters have been configured on the client side. The adapter configuration is usually on the main Aspen Mtell server.
4) Test (4) The connection once you have changed the Adapter URL to point to a machine with configured adapter. The test should be successful if it is now pointing to the correct machine.
5) If the test is successful Save (5) the changes.
Keywords: Aspen Mtell Client Tags
Unable to load Tags
Client connection
Client Adapter
Production Server
Test Server
References: None |
Problem Statement: How do I estimate an incinerator? | Solution: You may use the Quoted Equipment component to estimate an incinerator. A built-in incinerator component is not available.
Please note the system does have a default Code of account (COA) value of 253 for incinerators.
You may also create and store cost data for incinerators in the Equipment Model Library.
Keywords: Incinerator, 253, estimate, Quoted Equipment, Equipment, Quote, model, library
References: None |
Problem Statement: How is electrical tracing of piping calculated? | Solution: The length of the tracing cable is calculated based on the pipe length and diameter. There are two types of electrical tracing:
1) E-PRO to maintain at the process temperature, and
2) E-AMB to maintain at ambient temperature.
The heat loss of the pipe is compensated by using electrical tracing cable. ACCE determines the number of circuits/ft that is needed to compensate the calculated heat loss, which is then multiplied by the length of pipe to give the total length of the tracer cable.
Note the options E-PRO and E-AMB are available from the Special pipe description drop-down selection. The electrical tracing bulks will be listed in the report under COA 792.
Keywords: Electrical tracing, piping
References: None |
Problem Statement: What is the maximum number of steel structures in an area? | Solution: The maximum number of steel structures in an area is 10.
The 11th will give you the following error message:
If you have more than 10 steel structures, then note that you may add additional areas as needed.
As a reminder, you may add steel structures to areas from the Components tab | Plant bulks | Steel | Open steel structure component.
Keywords: Steel, structures, areas
References: None |
Problem Statement: When attempting to import an Excel spreadsheet into Aspen Flare System Analyzer, I am receiving an import error. I then get a message, Sorry, this function doesn't support excel file which is generated or edited by Excel2010. I am not using Excel2010, so why am I getting this error? How do I fix it? | Solution: Before exporting the original Flare file to Excel (.xls or .xlsx), first export to .xml and then import the .xml file to generate a fresh copy of the Flare model. This procedure can, in effect, clean up the Flare file. Once the .xml file has been successfully imported and the fresh flare model is available, save it and then export the Excel file from it again. Once the Excel file has been generated, try to import it to see if the error message has been resolved.
If this workflow does not fix the issue, then please contact AspenTech Support.
Keywords: Excel2010, Error, spreadsheet, export, import, Flare
References: None |
Problem Statement: Where can I find the default Construction Workforce Wage Rates? | Solution: The default Construction Workforce Wage Rates may be found in the Aspen Icarus Reference Manual, Chapter 30, Field Manpower Titles and Wage Rates. The reference manual may be accessed from Help | Documentation | Icarus Reference.
Keywords: Wage Rates, Craft, Construction
References: None
Problem Statement: Why am I getting the warning, MABP exceeds the relieving pressure when I click the Check Model button? | Solution: This warning alerts the user that the Allowable Backpressure (Control Valve Editor | Conditions tab) is set higher than the Inlet (Relieving) Pressure.
To resolve this warning, ensure the Inlet Pressure is higher than the Allowable Backpressure.
Keywords: MABP, inlet, relieving, pressure, exceed, allowable, backpressure, control, valve, warning, message
References: None |
Problem Statement: How does the system handle separate productivity sets for craft and workforce? | Solution: The workforce productivity establishes the default for all crafts, and may be useful for high-level estimates. The craft productivity overrides this default and may be specified craft-by-craft, improving accuracy for a detailed estimate. The system does not compound the workforce and craft productivity.
Productivity may be adjusted from the Project Basis View | Construction Workforce | Wage Rates form.
Please remember the default productivity of 100% is equivalent to 42 mins of labor/1 work hour (with 18 minutes of downtime automatically assumed). For example, a productivity of 90% would be:
0.9 * 42 mins/hr = 37.8 mins of labor/1 work hour.
Keywords: craft productivity, productivity, workforce
References: None |
Problem Statement: How do I specify a variable speed (frequency) drive (VFD)? | Solution: Option 1: For pumps and other applicable components, you will find a Driver type option which you can change from Motor (default) to VFD.
Option 2: Under Process Equipment | Motors, there is a TEFC Motor with variable speed drive.
Option 3: Lastly, you could create a variable speed drive as a quoted equipment or Equipment Model / Unit cost library item.
Note: In ACCE, if you go to Help | Documentation | Icarus Reference, and then search for variable, you may find additional information.
Keywords: Variable, speed, frequency, drive
References: None
Problem Statement: Why are the K values empty on an Aspen HYSYS material stream? | Solution: In Aspen HYSYS V11, the Worksheet tab | K Value page on the Material Stream property view displays the K values, or distribution coefficients, for each component in the stream.
A distribution coefficient is a ratio between the mole fraction of component i in the vapor phase and the mole fraction of component i in the liquid phase:
Ki = yi/xi
where,
Ki = distribution coefficient
yi = mole fraction of component i in the vapor phase
xi = mole fraction of component i in the liquid phase
If the material stream shows that vapor and liquid phases are detected, you can see that the K values are calculated.
Whether the K values are calculated depends upon the flash calculation.
Keywords: K Values
References: None |
Problem Statement: When installing GDOT Unified for the first time, it requires creating some databases and checking them in SQL Server. Additionally, SQL Server is required for the history and data collection of the GDOT application. However, there can be connectivity problems with the database, and the connection to the database may affect the startup of the Unified Agent Service. | Solution: The Unified Agent Service requires another important service to be running: the SQL Server service (the exact service name depends on the SQL Server version). Every time the server is started or rebooted, make sure that the SQL Server service is running.
This service requires a Local Administrator account to run and can be set to Automatic so it can start at boot.
When the service is stopped, you will most likely see two different symptoms:
1.- The Unified Agent Service will not be able to start and returns Error 1503
2.- Trying to access your SQL database will fail to connect, displaying the following error:
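Once the SQL Server service is running again, a quick check from SQL Server Management Studio can confirm that the instance responds and that the GDOT databases are online. This is only a sketch; the GDOT database names (for example GDOTOnlineHistory) depend on what was created during your installation:
-- Confirm the instance responds and list the state of the GDOT databases
SELECT @@SERVERNAME AS ServerName;
SELECT name, state_desc
FROM sys.databases
WHERE name LIKE 'GDOT%';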
Keywords: GDOT, Unified, SQL
References: None |
Problem Statement: When launching Aspen GDOT Unified for the first time, or after an upgrade when all databases have already been configured, Error 403.1 may appear instead of the Home page or the login prompt. | Solution: The way to fix this problem is simply to change the feature permissions of the Handler Mappings feature in IIS.
To apply the change, follow these steps:
1.- From Windows Start, look for the IIS Manager application
2.- Once it is open, in the tree on the left expand Default Web Site > AspenTech > AspenUnified.
3.- In the right-hand display, look for Handler Mappings under IIS
4.- Open the feature by double-clicking it, and on the right-hand side go to Actions > Edit Feature Permissions
5.- Once it is open, check the boxes for Read and Script only
6.- Close the internet browser and launch GDOT Unified again. This time the error should disappear
Keywords: GDOT, Unified, IIS
References: None |
Problem Statement: GDOT can present different solution statuses; please find a quick description in the table below:
However, the table does not mention what Solution Status 4 could mean. This solution provides a description of what Solution Status 4 stands for.
Solution
A Solution Status of 4 in GDOT usually means Skipped, which means that the execution cycle was skipped. Typically this is because it is waiting for some I/O which wasn't updated yet, or in the case of GDOT DR because Model Update is running and hasn't provided new gains yet. Either way, it can sometimes be resolved by changing the different GDOT execution offsets and making sure that the machine has adequate resources for running the different GDOT applications.
Keywords: GDOT,
References: None |
Problem Statement: The CLC files are basically Data Collection files that contain MV and CV vector information generated during normal operation or step testing. This information is required to run model identification in DMC3 Builder or a DMCplus model. This solution describes the information contained in the CLC header.
Solution
A Normal CLC file would look like this:
The file is divided basically into three sections: the header, the Vector names and description, and the Data collecting showing Data/ Value/ Quality Status.
The header section displays the following information (Based on the above example):
Fractionator =====> CLC Name
FracData_Fractionator =======> CLC Description
18 =====> Number of Tags
28 =====> Invalid number of tags per section (Recommended to leave unchanged, or make sure this is at least 4 times larger than the number of tags)
10-01-2009 08:14:00 =====> Data Collection start time
60 =====> Sampling Period on seconds
5820 =====> Total number of samples collected (Number of Tags X Number of Samples X Total time of collection)
Keywords: CLC, DMC3 Builder, Data Collection
References: None |
Problem Statement: Additional filesets can be created in the AW IP21 Administrator. However, it should be noted that the default location for the AW fileset may be different from the one used for a standalone IP21 Administrator, and this location may require additional security privileges. When these privileges do not exist, the error “Failed to create a directory for fileset” is returned. | Solution: The solution requires granting the correct permissions to the account used to manage the IP21 Administrator. There are some important things to check:
1.- Check the directory where the filesets will be added. You can refer to the current filesets in the IP21 Administrator to check the correct path. In that directory, right-click the directory name and select Properties, go to the Security tab, and make sure the user account used to add the filesets has Modify, Read & Execute, Read, Write, and Full Control permissions.
2.- On the same path as in point 1, choose the Sharing tab > Permissions and make sure the user account has Change, Read, and Full Control permissions.
3.- Make sure the user account defined above is the same account that is used to start the InfoPlus.21 Task Service.
It is also important to stress that the user account name should not contain any special characters such as @ / ? !, as these characters can cause the account validation by IP21 to fail. If your account contains such characters, consider changing them or using a different account to successfully create the new fileset.
Keywords: AspenWatch, IP21, Fileset
References: None |
Problem Statement: Docker Engine Enterprise and Docker Compose quick installation and upgrade guide for Windows Server when internet connection is not available on the host machine. | Solution: This article will explain how to install and upgrade Docker Engine Enterprise and Docker Compose on a machine without internet connection.
Prerequisites Installation:
A second machine with internet connection to download attached files.
Host machine should have Windows Server 2019 OS (version 1809 or higher).
Phase 1: Download required installation files to a machine with internet connection
On a machine with internet connection, download attached zip file Docker install scripts.zip. Do not modify the name of the zip file after download. This zip file contains PowerShell scripts and files to install Docker Engine 19.03.5 and Docker Compose 1.26.2. Maestro V11.1.2 and V12 can work with these versions.
Phase 2: Move required installation files to the target Aspen Mtell Maestro Server
Move zip file Docker install scripts.zip to C:\ProgramData\AspenTech\Aspen Mtell Maestro on the target machine. Create such path if it does not exist on the target machine.
Phase 3: Install Docker Engine Enterprise and Docker Compose on target machine
Unzip file Docker install scripts.zip (do not modify the default folder where the contents will be extracted).
Go to C:\ProgramData\AspenTech\Aspen Mtell Maestro\ Docker install scripts\ and verify the files and folder are present (Do not rename files or folder)
Go to C:\ProgramData\AspenTech\Aspen Mtell Maestro\ Docker install scripts\DockerOfflineInstallFiles\ and verify the files and folder are present (Do not rename files or folder)
Launch PowerShell using run as Administrator option
Move to directory where the PowerShell scripts are located. To do this, execute the following command.
cd C:\ProgramData\AspenTech\Aspen Mtell Maestro\Docker install scripts
Example:
Run “install_driver.ps1” specifying “offline” installation option.
.\install_driver.ps1 -OfflineInstall
Example:
After the installation process is completed, restart the machine.
Launch PowerShell using Run as administrator option
Check successful Docker Engine install by running the following command:
docker version
Expected result:
Check successful Docker Compose install by running the following command:
docker-compose version
Expected result:
Keywords: Maestro install
Xray install
Docker
Container
with out internet
no internet
offline
References: Aspen Mtell Maestro quick deployment guide (without internet connection / offline)
Aspen Mtell Maestro quick deployment guide
Docker Engine Enterprise and Docker Compose quick installation guide for Aspen Mtell Maestro |
Problem Statement: This solution provides a way to run multiple selected cases in a DMC3 Builder project.
Solution
The model structure of a DMC3 Builder project is basically divided into two parts: the Master Model and Case Folders. Inside these Case Folders, we can create multiple cases with different vectors, parameter trials, datasets, etc.
Using this structure, DMC3 Builder can run a single case that exists inside a Case Folder just by selecting the case and clicking Run Identify:
or run Identify for all cases inside a Case Folder by selecting the Case Folder icon and clicking Run Identify:
However, by default this structure does not allow you to select and run only some specific cases at the same time; for example, we might want to run just CASE_1 and CASE_3. Unfortunately, there is no built-in function for that, but we can take advantage of the described structure to perform this task.
The proposed workaround is basically to create a new folder and place a copy of the cases that you would like to run inside that folder.
Again, by selecting the folder and clicking Identify we will be able to identify just CASE_1 and CASE_3.
Notice that Copy of CASE_1 is essentially a different case from CASE_1 (even though they have the same structure), so whatever you change in Copy of CASE_1 will not be reflected in CASE_1.
Keywords: DMC3, Cases, Model
References: None |
Problem Statement: This solution describes the process of importing Tag Templates. Attached are DMC3 Builder Tag Templates equivalent to the ones that exist in DMCplus Build.
Solution
Tag Templates are basically used to modify and create tag names for multiple variables at once. They are particularly useful to update and change the tag names of controllers.
In the case of DMC3 Builder, tag templates must be .xml files. In particular, these files are named xxxx.apctagtemplate.xml (xxxx = Tag Template name). This is different from the file type used in DMCplus Build, which is a .tcc file. Unfortunately, the structure of the .tcc file is not supported in DMC3 Builder.
Manage Tag Templates is the feature used to manage everything related to tag templates (create, delete, import, export, etc.). The Import and Export actions are useful because the XML versions are compatible across versions and can be moved between servers. For DMC3 Builder, the .xml files do not need to be saved in any specific location; once they are imported into DMC3 Builder, the information stays there.
To import an XML file, go to Deployment > Manage Tag Templates > Import. Then select the XML file to be imported, and it will automatically appear in the Manage Tag Templates window.
Attached to this solution are some DMC3 Builder Tag Templates equivalent to the pre-defined DMCplus Build ones. This information can be found by default in V11 and above.
Keywords: DMC3 Builder, Tag Templates, DMC+
References: None |
Problem Statement: This solution presents a quick guide on how IP21 tags can be read into the IQconfig file.
Solution
Please find in the attached PDF the details on how to set up the IQconfig file to read values from IP21.
Keywords: IQ, IP21
References: None |
Problem Statement: The scheduled “Autorun” operations for PID groups explicitly use the default “SQLplus on localhost” ODBC connection, which is created by the AspenWatch server software installation.
When this ODBC connection is not set up properly, it can lead to problems such as PCWS not showing results under PID loop analysis after setting up group configurations.
In addition, by checking the PID watch logs located in the following path:
C:\ProgramData\AspenTech\APC\V10.0\Builder\PidResults
the following messages can be seen: “PID watch AutonRun ODBC Server Connection Error” and “PID watch auto-scheduled End Program”. | Solution: To solve this issue, it is necessary to create an additional ODBC connection on the AspenWatch Server. This connection can be identical to the existing one, except that the ODBC Data Source name must be “SQLplus on localhost”.
The new ODBC connection should look as it follows:
ODBC Data Source Name: SQLplus on localhost
Description (Optional): PIDWatch SQLplus ODBC Data Source
Aspen Data Source: This should be the same as the one used for your current “SQLplus on Aspen AW server” connection.
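After creating the new DSN, you can optionally confirm it works by running any trivial query through it from an ODBC client. The statement below is only a sketch of such a connectivity check; if CURRENT_TIMESTAMP is not available in your SQLplus version, any simple query against an existing record will do:
-- Simple connectivity test through the new “SQLplus on localhost” data source
SELECT CURRENT_TIMESTAMP;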
Keywords: PIDwatch, PCWS, ODBC
References: None |
Problem Statement: How to develop a First Principles Driven Hybrid Model for Heat Exchanger Fouling? | Solution: Heat exchanger fouling can change with time and process conditions. Building a customized model can be challenging and often requires significant time and effort, and additional costs.
One solution is to build a predictive heat exchanger model and incorporate historical data from the heat exchanger to capture the fouling factors from the process using a First Principles Driven Hybrid Model.
In this article, we highlight the different steps required to create such a First Principles Driven Hybrid Model to accurately capture these effects.
Please see the attached PDF and zip files for Aspen Plus. Similar workflows may be implemented in Aspen HYSYS.
Key Words:
Heat Exchanger, Fouling, First Principles Driven Hybrid Models, Aspen Plus
Keywords: None
References: None |
Problem Statement: Retrieve Parameters option is not working as designed in V12.1.
In V12.1:
Retrieve parameters with Copy pure component parameter to input CHECKED, the source will be PURE39 and USER-PURE39 is in the source list.
Retrieve parameters with Copy pure component parameter to input UNCHECKED, the source will be PURE39 and USER-PURE39 is NOT in the source list. | Solution: The correct behavior is as follows:
When Retrieve parameters with Copy pure component parameter to input CHECKED, the source will be USER-PUREXX, and it will apply to all components in the list, not just newly added components.
When Retrieve parameters with Copy pure component parameter to input UNCHECKED, the source will be DB-PUREXX, and it will apply to all components in the list, not just newly added components.
E.g., in V12.0 and V11.0:
Retrieve parameters with Copy pure component parameter to input CHECKED, the source will be USER-PURE38
Retrieve parameters with Copy pure component parameter to input UNCHECKED, Nothing happened.
Fixed in Version
Fixed in the next release.
Keywords: None
References: VSTS 696000
Problem Statement: Does aspenONE software support Office 2021? | Solution: Office 2021 has been tested on all aspenONE V12.x software and will support aspenONE V12.x released software.
Keywords: Office 2021 support
References: None |
Problem Statement: Does aspenONE software support Windows 11 Operating System? | Solution: Windows 11 operating system has been tested on all aspenONE V12.x and V11.x software and will support aspenONE V12.x and aspenONE V11.x released software.
Note: Aspen Unified in V11 does not support Windows 11.
Keywords: Windows 11 support
References: Known Issue: Windows 11 with V11.x with Aspen Properties: Button apply in run settings does not activate in V11, V11.1 |
Problem Statement: What is the arc.byte file used for? It always seems to be 128 KB in size. | Solution: The file arc.byte is part of an Aspen InfoPlus.21 history file set. A file set consists of three files - the other two are arc.key and arc.dat. (See Solution 000077763 for an explanation of arc.key.)
An occurrence of trend data in a record defined by IP_AnalogDef or IP_DiscreteDef is 15 bytes in size holding the value, timestamp, and two quality fields. This data is stored in the file arc.dat. If a definition record has a history occurrence that is 2048 bytes or larger, this data is stored in the file arc.byte. In this case, the file arc.byte will grow larger than 128 KB.
Even if there is no data stored in arc.byte, it is an integral part of the file set and must not be separated at any time from the files arc.dat and arc.key.
Keywords: arc.byte
arc.dat
References: None |
Problem Statement: This knowledge base article describes the behavior of Aspen InfoPlus.21 when the historian file, tune.dat, has become corrupted, but is still in a usable state, or if the parameters in the tune.dat have been set to zero. | Solution: 1) The following Aspen SQLplus statement returns no rows.
Select occnum, fs_start_time from DiskHistoryDef where name = 'TSK_DHIS';
2) In the Aspen InfoPlus.21 Administrator the column % Full shows 0.00 for each existing fileset.
3) In the Aspen InfoPlus.21 Administrator the History Parameters shows 0 and 00000:00:00.0 respectively for all values in the columns Max, Min and Default.
To correct a situation in which a tune.dat has become corrupt:
a) Stop Aspen InfoPlus.21
b) Delete the tune.dat
c) Replace the tune.dat with a tune.dat from a system backup or from a new Aspen InfoPlus.21 installation
Keywords: Zero
Zeroed
Corrupt
Missing
Empty
References: None |
Problem Statement: Thermodynamics states the relation between enthalpy (h), Gibbs energy (g), and entropy (s): g = h - T * s (where T is the absolute temperature).
When checking the stream results in the conventional solid substream, one can observe that the relation is not satisfied. Why? | Solution: When reporting the default stream property solid SMX for a stream, whether in the MIXED or CISOLID substream, we calculate the property as a mole-average of the solid species, because Aspen Plus handles solids as pure phases, not as solid solutions. The value is correct.
When reporting the prop-set property solid GMX (and most likely also solid SMX) for a stream, we report the value calculated from SMTHRM as if the solid mixture were a solid solution. This value is therefore not consistent with the entropy.
This calculation will be corrected in a future version, and a mole-average quantity will be reported instead, as is done for the entropy.
This applies to GMX and the related MASSGMX and GMX-FL, and to SMX and the related MASSSMX and SMX-FL.
The only exception is when the solid mixture is calculated from an RGIBBS block where solid solution phases were considered, and the mixture is a solid solution. In that case, the stream's solid SMX will be the value for a solid solution, which includes the mixing term (the quantity calculated from SMTHRM).
There's a fundamental difference in the handling of the solid substream compared to the liquid or vapor phases, for which there is no problem. The problem is that the solid may be a solid solution or a mixture of pure solids. The difference in entropy and Gibbs energy is due to the fact that the mixing term (R * sum[Xi * ln(Xi)]) is omitted in the entropy reported in the stream results, but the term RT * sum[Xi * ln(Xi)] is included in the Gibbs energy.
The documented solid mixture property monitor, SMTHRM, returns SMX and GMX which include the mixing term. For this monitor, the solid mixture is a homogeneous phase, a solid solution. There is no way for the user to indicate that a solid mixture in CISOLID is truly a solution.
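In equation form (a sketch for an ideal solid mixture, using the mixing terms quoted above):
s(reported) = sum[Xi * si]                              (mixing term omitted)
g(reported) = sum[Xi * gi] + R * T * sum[Xi * ln(Xi)]   (mixing term included)
h = sum[Xi * hi]
so that g(reported) - (h - T * s(reported)) = R * T * sum[Xi * ln(Xi)], which is exactly the amount by which the relation g = h - T * s appears to be violated in the stream results.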
The attached example (Aspen Plus V12) illustrates this problem.
Keywords: None
References: None |
Problem Statement: What does equipment material cost include? What does it not include? | Solution: Equipment material cost includes the following:
Cost of the parts of the equipment, such as internals, shell, nozzles, manholes, covers, etc.
Fabrication labor (shop and/or field)
Shop and office overhead
Vendor engineering, shop drawings, shop testing, certification, typical manuals, small tools, and accessories
Packing for shipment by land
FOB vendor
Vendor profit
It does not include:
Owner/contractor indirects (engineering, shop inspection)
Packaging for shipment by air or sea, modularization
Freight, insurance, taxes/duties
Field setting costs (storage, transportation, setting, testing)
Installation bulks (piping, steel, instrumentation, electrical) Note: these are listed separately as Installation Bulks
Keywords: Equipment material cost
References: None |
Problem Statement: Why is there no effect on my piping costs when I add a Rupture Disk to my Above grade or buried pipe component? | Solution: A Rupture Disk (RD) is considered an instrumentation cost (COA 683), so its cost is associated with instrumentation.
Keywords: piping, rupture disk, RD, instrumentation, COA 683
References: None |
Problem Statement: What are the Design gauge pressure Inlet and Outlet fields for the compressor setup? What information are these fields requesting? | Solution: These fields are requesting the design (not operating) gauge pressure of the gas entering the compressor (low pressure) and exiting the compressor (high pressure). Typically, the design inlet and outlet pressures include a safety factor over the operating pressures.
For instance, for the compressor modeled in Aspen HYSYS as shown below, the values 100 and 300 would be entered in the inlet and outlet fields above, respectively.
Keywords: Design, gauge, pressure, inlet, outlet
References: None |
Problem Statement: How can I estimate Orifice Plates? | Solution: One way to estimate an orifice plate is to add an Above grade or buried pipe component and include the fitting OP | Orifice plate and union and its quantity. Set the pipe diameter as needed and note the minimum pipe length is 1 ft.
Alternatively, you may use the Equipment or Unit Cost Libraries to store the orifice plate cost data to be used in projects, as needed.
Keywords: Orifice, plate
References: None |
Problem Statement: Why is there no line item for Construction Management Supervision (COA 82) in my report? | Solution: Construction management cost is part of the Indirects. By default, the Engineering management and Construction management fields are blank. To drive generation of estimated engineering and construction management costs, a contractor MUST be specified. If there is no line item in the output report for Code of Account 82 (COA 82), it means that no cost was calculated for Construction Management Supervision in the project.
To specify Construction Management Supervision in Aspen Capital Cost Estimator (ACCE), please see article 79697.
Keywords: Construction Management, Engineering Management, Construction Management Supervision
References: None |
Problem Statement: Can you link cost variables from Aspen Process Economic Analyzer (APEA) to Aspen PLUS Dynamics or Aspen HYSYS Dynamics? | Solution: No, you cannot link cost variables from APEA to Aspen PLUS Dynamics or Aspen HYSYS Dynamics. Cost analysis using APEA is supported for steady state mode only.
Keywords: Cost Variables, Cost Variables Linking
References: None |
Problem Statement: On the General Wage Rate/Productivity-IP box under the Project Basis View, I manually entered all the Craft Wage Rates for all the craft codes. Now I need to reduce the wages by a percentage (%). Instead of revising all the wages, is it possible to just reduce them on a percentage basis? | Solution: Yes, it is possible to reduce the wage rates in one field. However, this requires creating a new construction workforce (CWF).
Let's say, CWF #1 was created previously, where the craft wage rates are defined.
Now, create a second CWF and adjust the craft wage rates as a percentage of CWF #1 as follows:
Note: Make sure CWF#2 is now linked properly with the contractor as opposed CWF#1.
See below:
Keywords: Wage Rates, Wage Rate Adjustment, Capital Cost
References: None |
Problem Statement: I have been trying to access the Aspen Custom Modeler (ACM) example files. But every time I open any file, I get a blank flowsheet and a message similar to this: Failed to load file C:\Program Files (x86)\AspenTech\Aspen Custom Modeler V10.0\Examples\Absorber\Absorber.acmf, working directory is in use | Solution: This problem happens because the Allow setting of working folder location option is unchecked.
To resolve this error, check the Allow setting of working folder location option under Tools > Settings > Preferences.
Keywords: Working folder, Example files, ACM
References: None |
Problem Statement: Why do changes to the Total Construction Weeks change the Total Project Costs? | Solution: The construction schedule is integrated with the cost estimate to provide the basis for estimation of schedule-dependent costs such as equipment rental requirements, field supervision, and construction management. Because of this dependence, changing construction weeks will impact the project total costs.
Additional discussion on the interdependency of project schedule and cost is available in the ICARUS reference guide (Help > Documentation) under chapter 36.
Keywords: Total Construction Weeks, Total Project Costs, Project Schedule.
References: None |
Problem Statement: How do I create a Code of Account (COA) exception and when should it be used? | Solution: Consider a scenario where you want to allocate fittings on a pipe, such as valves and bolt-up connections, to a different COA other than pipe erection. This can be accomplished by creating exceptions in the Code of Account. In this example, to allocate these to a different COA, you will have to create exceptions to COA 317 (CS Pipe Erection) using subtypes 10 (Erect valves) and 11 (Bolt-up connections). Before creating a COA exception, you will notice that the CCP (or CCI) report indicates that costs associated with these fittings are reflected in COA 317.
How to create a COA exception?
Select the COA file where you would like to add an exception, and then choose Allocations. The COA Allocations window will pop up.
You first need the generic COA allocation line, and then another line that defines the COA exception to that generic allocation.
Click Add at the top to insert a line below it and create the exception to the generic rule. Let's say we want to allocate these costs to a different COA, in this case 392 (Pipeline-Valves, Fittings). The COA exception flag has to be set on the exception line. Also on the exception line, you have to define what the exception applies to; in this case it applies to subtypes 10 to 11. You can similarly create exceptions that apply to a specific material, COA modifier, diameter, etc. The exception line doesn't have to be right below the COA allocation line; you can add it all the way at the end and it will still work.
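Conceptually, after adding the exception the COA Allocations window contains two lines along these lines (an illustrative sketch based on the example above; the actual column names and layout of the window may differ):
Line 1 (generic allocation): COA 317 (CS Pipe Erection), no exception flag
Line 2 (exception): COA 392 (Pipeline-Valves, Fittings), COA exception flag set, subtype from 10 (Erect valves) to 11 (Bolt-up connections)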
To see the different subtypes where you can apply exceptions, please refer to the Icarus Reference Guide (see References below).
Evaluate the plant bulk pipe, and notice that the erect valves and bolt-up connections now go to COA 392. The other line items in CS Pipe Erection still go to COA 317.
Keywords: COA exception, COA, exception
References: Icarus Reference Guide, Chapter 35 on Database Relations, topic Attribute Descriptions |
Problem Statement: How can I override Procurement and Home Office expenses in Basic Engineering? | Solution: Overriding Procurement in Basic Engineering
1. Go to Project Basis - Engineering Workforce - By Phase.
2. Add another column.
3. Engineering Workforce number - Specify EWF 1.
4. Engineering Phase - Specify P (Procurement).
5. Engineering hours - specify 0.
6. Engineering cost - specify 0.
Overriding Home Office in Basic Engineering
1. Go to Project Basis - Engineering Workforce - By Phase.
2. Add another column.
3. Engineering Workforce number - Specify EWF 1.
4. Engineering Phase - Specify H (Home Office).
5. Engineering hours - specify 0.
6. Engineering cost - specify 0.
If you wish to do this for other Engineering Workforce numbers that you have specified, you will have to add more columns and make your adjustments as you see fit.
You can also use these procedures to override procurement and home office for Detail Engineering.
Note: If you want to change Percent calculated hours for Home office construction services, under Engineering Workforce | By Phase, please see article 98746 for more details.
Keywords: procurement, home office, override, workforce, phase
References: None |
Problem Statement: Is there anything to consider when leaving a piece of equipment as it is? This is not demolition; the component is already present in the plant, but it does impact the cost because the connections and related items will be different. | Solution: For existing equipment, select “EXIS” in the Installation option field of the component.
For any additional cost associated with this component, you can then use the “Mat’l/man-hours additions” form from the “Options” drop-down menu.
Other installation options are explained in article 97099.
Keywords: Existing Equipment, Installation Option
References: None |
Problem Statement: How do you send all welds to shop for all equipment at once? | Solution: There is no direct way to send all welds to shop. You can select “RMT” in the Project Basis view, but if the length exceeds the limit, some of the welds will be done in the field.
However, for each piping component, you can enter 100% in the “Percent of welds in Shop” field under Options > Pipe - Item Details.
Keywords: Remote Shop welds, RMT, Pipe fabrication
References: None |
Problem Statement: When I change Percent calculated hours for Home office construction services, under Engineering Workforce | By Phase, it seems to have no effect. | Solution: Home office (HO) construction services are calculated based on the engineering workforce (EWF) performing the task. To capture any customization of costs for home office construction services, the EWF needs to be linked to the contractors doing the installation (see below).
Otherwise, the system is going to use the Out of the Box (OOTB) HO model and will not apply any adjustments.
For more information on overriding procurement and home office expenses, please see article 80459.
Keywords: Home Office costs, Home office construction services, engineering workforce, EWF
References: None |
Problem Statement: How to get equipment footprint data from Aspen Capital Cost Estimator (ACCE)? | Solution: To get the equipment footprint data in ACCE, please follow the steps below:
1. Turn on the equipment footprint option on the Equipment Specs form:
2. Evaluate the project and run the Design and Basis Reports > Components > (component type: pumps, compressors, etc.) report to get the equipment footprints:
Keywords: Equipment footprints, Design and Basis Reports
References: None |
Problem Statement: How do I restore the missing tabs in the palette window and Project Basis View forms? | Solution: After installing Aspen Capital Cost Estimator (ACCE) V8.0 and above, the user is required to check whether the resiliency (configuration for first-time use) is successful. Default folders and a shortcut under My Documents will be created, and if resiliency fails to set up these folders properly, users may end up losing some tabs and looking at a blank Project Basis view.
There are a few possible reasons why this could happen:
1. Insufficient disk space, which prevents resiliency from completing successfully. The software will give a warning if this step is not successful, and the user will not be able to use ACCE.
2. Insufficient write permissions when My Documents is redirected to the drive where the ACCE default folders would be created.
3. A corrupted registry due to a previously installed version that does not co-exist with the version being installed (e.g., V8.7.1 and V8.8).
The workaround for each scenario follows:
1. Free up disk space to allow resiliency to complete.
2. Some users have very limited permission settings. We suggest requesting that IT grant write permissions for the folders where the ACCE default folders will be created. Refer to the Word document for the location of the default folders.
3. If users have installed V8.7.1, they have to uninstall it and clean up the registry before installing V8.8.
Alternatively, rename or remove the registry folder 40.0 for V12.0 (this number will be different for earlier versions, i.e. 39.0 for V11.1, 38.0 for V11.0, 34.0 for V8.8, and so on), then restart ACCE to run resiliency again. This 40.0 folder is located at HKEY_CURRENT_USER\Software\AspenTech\Aspen Icarus Products. Users with limited permission rights might need IT with admin rights to perform this task.
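As a minimal, read-only sketch (assuming Python with the standard winreg module and the registry path quoted above), the following only checks whether the version subkey exists; it does not rename or delete anything, which should still be done in regedit or by IT:
# Checks whether the ACCE resiliency registry key for V12.0 exists under HKEY_CURRENT_USER.
# Adjust the trailing version subkey (40.0, 39.0, 38.0, ...) to match your ACCE version.
import winreg
path = r"Software\AspenTech\Aspen Icarus Products\40.0"
try:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, path):
        print("Key found:", path, "- rename or remove it (e.g., in regedit) to re-run resiliency.")
except FileNotFoundError:
    print("Key not found:", path)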
Keywords: Missing palette window, blank project basis view
References: None |
Problem Statement: Why does the Datasheet Definer tab disappear in Excel?
On some occasions, when I open the Datasheet Definer, I get the following error message:
Note: If the Datasheet Definer ribbon is unavailable after a recent installation of Aspen Basic Engineering, the ABEDefinerAddIn may NOT be loaded yet. To find more details on this issue and how to resolve it, please see Knowledge Base (KB) article 97593. | Solution: If you are seeing this error message or are not able to display the Datasheet Definer tab, make sure that the AZDatasheetDefiner Add-In is not disabled in Excel Options. Excel can disable an add-in if the trust or security settings enabling it to run were not previously set, or if Excel is terminated by Task Manager or crashes while the add-in is running.
You can check it by following these steps:
1. Start Microsoft Excel
2. Go to File > Options > Add-ins
3. In the Excel Options window, choose Add-ins on the left-hand side, select Disabled Items in the Manage drop-down box as shown in the screenshot below, and click the Go button. This will show the disabled add-ins, and you can enable the AZDatasheetDefiner Add-In if it is in this list.
Once you have completed the above steps, restart Excel by closing all Excel files on your computer, and verify in Task Manager that the Excel.exe process is not running.
Additionally, the Datasheet Definer may not have been installed properly due to a known issue with Office 2016. In this case, you won't find the add-in under the Disabled Items list. To correct the issue with Office 2016, please refer to the following article:
https://esupport.aspentech.com/S_Article?id=000048951
Keywords: Application-defined or object-defined error, error in AzDatasheetDefiner, object defined error, application defined error
References: None |