Problem Statement: After the Report Writer is installed, you try to run a template from within Excel. When you click on the Run button, you get the following error messages:
ReportExecutor Object could not be created
the ATReportExecutor instance was not been created
Report Writer doesn't run, nor can you edit a template.
What can you do to solve this problem? | Solution: This error appears because the AspenRpt.dll was not properly registered during the installation.
Follow this procedure, which has solved the problem in several cases:
1) In Excel, hit Alt + F11, which will open the VBA editor. Go to Tools | References and check whether there is one line that says Missing: AspenRpt. This would mean that Excel is not seeing the dll.
2) Go to the C:\Program Files\Common Files\AspenTech Shared folder, and check whether the AspenRpt.dll is found there.
3) Unregister and Reregister the dll:
Open an MS-DOS session:
Start | Run | cmd |
Change to the directory where the dll resides (adjust the path to whatever it is on your machine):
c:\> cd C:\Program Files\Common Files\AspenTech Shared
First unregister the .dll file:
C:\Program Files\Common Files\AspenTech Shared\> regsvr32 /u AspenRpt.dll
For W7, 64Bit
C:\Program Files (x86)\Common Files\AspenTech Shared\> regsvr32 /u AspenRpt.dll
Then re-register it:
C:\Program Files\Common Files\AspenTech Shared\> regsvr32 AspenRpt.dll
For W7, 64Bit
C:\Program Files (x86)\Common Files\AspenTech Shared\> regsvr32 AspenRpt.dll
The confirmation message "DllRegisterServer in AspenRpt.dll succeeded" should appear.
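The unregister/re-register sequence can also be expressed as data for a small script. A hedged Python sketch follows; the folder paths are the install defaults quoted above and may differ on your machine, and the commands are shown only as data (on Windows they would be run with subprocess):

```python
# Sketch: build the regsvr32 unregister + re-register commands for
# AspenRpt.dll.  Paths below are the article's defaults -- adjust them
# to match your machine before actually running anything.
def regsvr32_commands(dll_dir, dll_name="AspenRpt.dll"):
    """Return the unregister + re-register command lines for the dll."""
    dll_path = dll_dir + "\\" + dll_name
    return [
        ["regsvr32", "/u", dll_path],  # unregister first
        ["regsvr32", dll_path],        # then re-register
    ]

DLL_DIR_32 = r"C:\Program Files\Common Files\AspenTech Shared"
DLL_DIR_64 = r"C:\Program Files (x86)\Common Files\AspenTech Shared"  # Win7 64-bit

commands = regsvr32_commands(DLL_DIR_32)
```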
4) Check in VBA whether the AspenRpt is now a valid reference.
5) Recheck the Report Writer Add-In in Excel, and then try to run a template again. |
Problem Statement: The Flowsheet report offers several functionalities that may not be obvious and can be very helpful in finding and analyzing data.
We present here a list of tips that can help to use the Flowsheet report in a more efficient way:
How to see only a subset of streams (or one stream, if you want to track sources and dispositions)
How to combine process units reports directly in the flowsheet
How to see the Blends individually | Solution: How to see only a subset of streams (or one stream, if you want to track sources and dispositions)
Go to the menu, View | Stream Visibility | Hide Individual Streams
Keep the stream(s) that you want to view in the flowsheet on the left window (Visible Streams), and move the rest of the streams to the right window (Hidden Streams).
The resulting flowsheet will only show the stream(s) that you selected. This makes it easy to follow sources and dispositions of individual streams.
How to combine process units reports directly in the flowsheet
You can combine process units directly in the flowsheet by dragging a unit and dropping it into another unit.
It will ask you whether you want to group the nodes; after you answer Yes, it will create a new unit called C00Y, where Y is the number of the grouped nodes (including the grouped submodels from table SUBMODS).
You can ungroup the submodels by going to View | Grouping | Group Nodes and deleting the groups that you don't want to see in the flowsheet. The members of the group will be shown instead.
How to see Blends individually
By default, all blends are grouped together, which makes it difficult to track components per blend.
To see the blends individually, go to View | Grouping | Group Nodes and delete the group called BLND. This group is automatically created by PIMS.
Keywords: None
References: None |
Problem Statement: Why am I getting the error message Square brackets not in pair when I define a formula for a tag? | Solution: To troubleshoot this error message, make sure that:
The tag name is enclosed in square brackets
Square brackets come in pairs: make sure you close the square bracket after the name of the tag
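The pairing rule the error enforces can be sketched as a small check. This is an illustrative Python sketch of the rule, not the product's actual parser:

```python
# Sketch: verify that every "[" in a tag formula has a matching "]",
# mirroring the "Square brackets not in pair" check described above.
def brackets_paired(formula):
    depth = 0
    for ch in formula:
        if ch == "[":
            depth += 1
        elif ch == "]":
            depth -= 1
            if depth < 0:          # closing bracket with no opener
                return False
    return depth == 0              # every opener was closed

print(brackets_paired("[TAG1] + [TAG2]"))  # True  -> formula is OK
print(brackets_paired("[TAG1 + [TAG2]"))   # False -> unclosed bracket
```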
Keywords: bracket, tag name
References: None |
Problem Statement: The information that you need to track for a given stream in an Aspen PIMS model may include the material balance, the properties, blending information, marginal values and pricing, etc.
All this data is available in different reports or in different sections of the same report (e.g. the Full Solution report).
However, there is no report that shows all relevant information for a stream consolidated in one single page. | Solution: The attached Report Writer template allows you to track all the relevant information for a given stream on one page.
It will identify what type of stream (feedstock, final product, blend, intermediate stream) it is and display the relevant information.
Please see the Limitations section below to make sure you are applying it as intended.
Functionality
The information shown includes:
Overall Material Balance
Material Balance by Process Units (Submodels)
Stream Properties
For a blend: composition of the blend, properties, blending specifications, marginal values
For a purchased or a sold stream: activity, MIN, MAX, Marginal Value, Pricing, Expense or Revenue
For a stream using Tiered Pricing: list of tiers, activity, MIN, MAX, Marginal Value, Pricing, Expense or Revenue
For a component to blending: activity in destination blends, percentage in blend
How to run the report
You need to have Report Writer installed.
Run the Aspen PIMS model
Copy the attached Report Writer template in the Aspen PIMS model folder
From MS Excel, go to the Report Writer Add-In and select the template
Type the 3-character tag of the stream that you want to report, hit Enter and then Enter again, or hit OK
The mouse cursor does not change from the hourglass, but the report works normally.
Limitations
The current limitations of this report are:
It can be used with standard PIMS and PPIMS (multiperiod) only.
It will not report information properly for MPIMS or XPIMS.
It will display the Activity, Marginal Values and Bounds information in Volume only (it still can be used with Weight based models).
It will report information for Stream Tags only. It will not display information for Utility tags or Group Tags.
It will display information for the first case only, and for all periods.
If you want to be able to select which case to report, change the behavior of the database connection function in LookUpStreamInfo.XLT: open it from within MS Excel, go to sheet DataSetUp, cell E3, and change the last parameter from FALSE to TRUE, i.e. from =ATDBPMDB(.\Results.Mdb,60,FALSE) to =ATDBPMDB(.\Results.Mdb,60,TRUE).
Keywords:
References: None |
Problem Statement: Occasionally, after an Aspen InfoPlus.21 (IP.21) and Aspen Production Record Manager (APRM) server restart, units in the BCU Scheduling Table will show up as Load Failed. This is most likely because the BCU Server unit processing began before the Infoplus.21 tag data was available. The Process Data components in InfoPlus.21 need to be running before the BCU can verify tags used in the BCU unit scripts. If these components have not been started, the units will fail to start.
When the units show as Load Failed, they will need to be removed and re-added to the Scheduling Table in the BCU Administrator before they can be rescheduled. This can be especially time-consuming when a Batch area has multiple units. | Solution: This situation can be avoided by ensuring that the BCU unit processing does not start before the necessary process data components are available in InfoPlus.21. To do this, take the following steps:
1. In the BCU Server Manager, uncheck the box to Begin unit scheduling automatically when the BCU Service starts
2. In the InfoPlus.21 Manager under Defined Tasks, highlight the TSK_BCU_START task. Uncheck the option to Skip during Startup. Move the task to the bottom of the Defined Task list by clicking the move down button. It should be the last task listed.
This will ensure that Unit Scheduling is started after InfoPlus.21 tag data is available. Next time the server is restarted, the units should start correctly in the BCU.
If the above does not resolve the issue, make sure that TSK_BATCH21_SERVER is running in the IP.21 Manager (is NOT set to skip during startup) and that both APRM services are running in the Services snap-in.
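The reordering in step 2 is simple list manipulation; a hedged Python sketch of its effect (the other task names are just example placeholders, not a real startup list):

```python
# Illustrative sketch of step 2: move TSK_BCU_START to the bottom of
# the Defined Tasks list so it starts only after the data tasks.
def move_to_end(tasks, name):
    """Return the task list with `name` moved to the last position."""
    return [t for t in tasks if t != name] + [name]

defined_tasks = ["TSK_DBCLOCK", "TSK_BCU_START", "TSK_BATCH21_SERVER"]
reordered = move_to_end(defined_tasks, "TSK_BCU_START")
```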
Keywords: Load Failed
References: None |
Problem Statement: How do I build MPIMS Tables using the 'Generate Global Model Tables' feature? | Solution: A global model links multiple local models together and allows for interaction between them. MPIMS is a separately licensed feature that must be purchased in order to use this capability. When setting up a global model, there are several required tables and many optional ones. Many of these tables have entries that are based on the content of tables in the local model. The 'Generate Global Model Tables' feature creates the global model tables which are based on the entries existing in the local models.
Before using this feature, make sure the local models are saved in the desired parent directory and are running and converging well. All execution log warnings should also be addressed.
Next, create a new PIMS model (MODEL | NEW). Save this new model into the same location as the local models. By default, PIMS makes new models STANDARD models, so once the new model is created, right-click on the model name, select 'Model Type', and then select 'Global'.
The tables required for a global model are listed below (TABLE: PURPOSE).
MODELS: Defines the local models to be linked
MARKETS: Defines the markets to which products will be sold
MODES: Defines modes of transportation between plants/markets
GSUPPLY: Defines global supply constraints
SUPPLY: Defines local supply constraints
DEMAND: Defines product sales
DEMALLOC: Defines which products can be sold in which markets
The 'Generate Global Model Tables' feature cannot generate all of the required tables because some introduce data that is new and not represented in the local models. This feature will generate GSUPPLY, SUPPLY, and DEMAND. It will also generate the optional tables CAPS, PROCLIM, GROUPS, and PLANTGRP.
Before this feature can be used, Table MODELS must be built and attached to the model tree. This table defines a model identifier and the model names of the local models which are to be linked together. Below is an example of this table.
*TABLE   MODELS
*        List of Local Models
*        TEXT          CASE
***
A        M_Volsamp A
B        M_Volsamp B
C        M_Ethylene C
***
Once table MODELS is attached to the model tree, you can proceed with making the global tables. To do this, go to the RUN menu and select 'Generate Global Model Tables'.
PIMS will review the local model tables and map the data to the global tables as follows:
Local table BUY to global table SUPPLY: PIMS ensures that material tags and material group tags present in the local model table BUY are also present in the global model table SUPPLY, with the exclusion of disabled materials. PIMS extends the local model grouping of BUY tags into global model table SUPPLY.
Global table GSUPPLY: PIMS ensures that unique tags contained in global model table SUPPLY are present in global model table GSUPPLY.
Global table GROUPS: PIMS ensures that any groups present in global model tables SUPPLY and GSUPPLY are present in global model table GROUPS.
Local table SELL to global table DEMAND: PIMS ensures that material tags and material group tags present in the local model table SELL are also present in global model table DEMAND, with the exclusion of disabled materials.
Global table PLANTGRP: PIMS ensures that the group tags present in local model table SELL are also present in global model table PLANTGRP.
Local table CAPS to global table CAPS: PIMS ensures that process capacities present in local model table CAPS are also present in global model table CAPS.
Local table PROCLIM to global table PROCLIM: PIMS ensures that process capacity limits present in local model table PROCLIM are also present in global model table PROCLIM.
When PIMS finishes, it will display a confirmation message.
If you now look in the model directory, you will see files that PIMS created. The names will be AUTO_GLOBAL_GSUPPLY, AUTO_GLOBAL_DEMAND, etc. Each of these tables must be attached to the model tree. This provides the required tables GSUPPLY, SUPPLY, and DEMAND. It also generates some optional tables CAPS, GROUPS, PROCLIM, and PLANTGRP which should also be attached to the model tree. Note that you may need to update and modify the automatically generated tables as you construct the rest of your model depending on the changes that you make.
You will now need to complete the Global Model setup by manually creating the rest of the required tables (MARKETS, MODES, DEMALLOC). These cannot be automatically created because they define data that is not represented in the local models.
For detailed information about each individual table and its required formats, please refer to the PIMS Help system.
Keywords: MPIMS
global
References: None |
Problem Statement: A customer recently reported that he experienced problems every time he ran Aspen PIMS. There were frequent warning messages about not being able to delete work files, occasional aborted runs and even PIMSWIN crashes.
These problems were encountered under Aspen PIMS versions 2006 and 2006.5. | Solution: This particular company uses an automatic backup utility, Veritas Backup Exec, that runs in the background and backs up files from the computers to a folder on a company server. When the backup utility was turned off, the problems stopped. If you are experiencing similar problems, check if your machine is utilizing such a backup tool and try turning it off.
Keywords: work file
aborted
crash
References: None |
Problem Statement: This procedure documents how to configure the CIMIO for Setcim/IP.21 Interface when connecting to multiple IP.21 servers. | Solution: For each IP.21 server that you are connecting to, assign a unique logical device name (e.g. CIMIOSETCIM_200, CIMIOSETCIM_300, etc.) and services port number (e.g. 22001/tcp, 22002/tcp, etc.). In IP.21 v4.1 and higher, the group number is no longer relevant; you can assign any number to each IP.21 server, it does not have to be 200.
On each IP.21 server (i.e. the CIMIO server), amend the cimio_logical_devices.def and services files. To configure store and forward, make sure that the service name entries are 15 characters or less, e.g. IOSETCIM_200_SC, IOSETCIM_200_ST and IOSETCIM_200_FW.
Cimio_logical_devices.def
CIMIOSETCIM_200 SERVER01 CIMIOSETCIM_200
Services (including entries for Store and Forward)
CIMIOSETCIM_200 21001/tcp
IOSETCIM_200_SC 21002/tcp
IOSETCIM_200_ST 21003/tcp
IOSETCIM_200_FW 21004/tcp
Note that the service name must be called CIMIOSETCIM_xxx, where xxx is the assigned group number. There is no restriction in the logical device name.
Configure the cimio_setcim_start.bat file to look like the following:
START /B /MIN cimio_setcim_dlgp -t 30 -n 200
START /B /MIN cimio_setcim_hist_dlgp -t 30 -n 200
This batch file would be used to startup the CIMIO interface. The -t parameter defines the timeout value (eg. 30 secs), and the -n parameter defines the group number (eg. 200). This group number must match that in the service name.
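The two START lines follow a fixed pattern, so they can be generated for any group. A hedged Python sketch (group and timeout values taken from the example above):

```python
# Sketch: build the cimio_setcim_start.bat lines for one interface
# group, following the pattern shown in the article.
def setcim_start_lines(group, timeout=30):
    """Return the two START lines for the given group number."""
    opts = f"-t {timeout} -n {group}"
    return [
        f"START /B /MIN cimio_setcim_dlgp {opts}",
        f"START /B /MIN cimio_setcim_hist_dlgp {opts}",
    ]

lines = setcim_start_lines(200)
```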
Configure the cimio_setcim_shutdown.bat file as follows:
START /B cimio_setcim_shutdown.exe CIMIOSETCIM_200
If you wish to configure the CIMIO interface processes to startup automatically from the CIMIO Manager service, you will need to amend the following files:
cimio_autostart.bat
call %cimiorootq%\commands\cimio_sf_start CIMIOSETCIM_200
%cimioroot%\io\cio_set_cim\cimio_setcim_start
cimio_autostop.bat
%cimioroot%\code\cimio_sf_shutdown /S=CIMIOSETCIM_200
%cimioroot%\io\cio_set_cim\cimio_setcim_shutdown.bat
On the CIMIO client, amend the cimio_logical_devices.def and services file to look like this:
Cimio_logical_devices.def
CIMIOSETCIM_200 SERVER01 CIMIOSETCIM_200
CIMIOSETCIM_300 SERVER02 CIMIOSETCIM_300
CIMIOSETCIM_400 SERVER03 CIMIOSETCIM_400
Services
CIMIOSETCIM_200 21001/tcp
IOSETCIM_200_SC 21002/tcp
IOSETCIM_200_ST 21003/tcp
IOSETCIM_200_FW 21004/tcp
CIMIOSETCIM_300 22001/tcp
IOSETCIM_300_SC 22002/tcp
IOSETCIM_300_ST 22003/tcp
IOSETCIM_300_FW 22004/tcp
CIMIOSETCIM_400 23001/tcp
IOSETCIM_400_SC 23002/tcp
IOSETCIM_400_ST 23003/tcp
IOSETCIM_400_FW 23004/tcp
Make sure that each port number is unique.
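The port numbering above follows a regular scheme (one base port per server, then consecutive ports for the store-and-forward entries). A hedged Python sketch that generates such a services block; the 21001/22001/23001 bases and the shortened IOSETCIM_ prefix are taken from the example above:

```python
# Sketch: generate unique services-file entries for several IP.21
# servers, one base port per server, +1/+2/+3 for the SC/ST/FW
# store-and-forward entries (15-character names, per the article).
def services_entries(groups, first_base=21001):
    lines = []
    for i, group in enumerate(groups):
        base = first_base + 1000 * i
        lines.append(f"CIMIOSETCIM_{group} {base}/tcp")
        for offset, suffix in enumerate(("SC", "ST", "FW"), start=1):
            lines.append(f"IOSETCIM_{group}_{suffix} {base + offset}/tcp")
    return lines

entries = services_entries([200, 300, 400])
```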
Keywords: multiple
group
References: None |
Problem Statement: After a data reconciliation run with Aspen Utilities model loaded via Aspen Online, Aspen Online cannot initiate an optimization run. Attempting to run optimization results in:
Err description: The process cannot access the file 'C:\WINDOWS\TEMP\AUEditorsLog.log' because it is being used by another | Solution: To resolve the issue you should carry out the following checks:
Check to make sure that you do not have a minus (-) sign in the name of any block variable in your Aspen Utilities model.
Make sure that the first character of each variable name in each Aspen Utilities block is not a number or a symbol.
Check to make sure that all contract Peak Prices are set to non-zero values in the contract editor (instead of 0 try 0.0001). When converting the price, the optimizer uses the inverse of the value so zero will cause math overflow error.
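The third check exists because the optimizer takes the inverse of the price, so an exact zero divides by zero. An illustrative Python sketch of the substitution suggested above (0.0001 in place of 0):

```python
# Sketch: why a zero Peak Price overflows -- the optimizer computes
# 1/price when converting it.  Substituting a tiny non-zero value
# (0.0001, as the article suggests) keeps the inverse finite.
def safe_peak_price(price, epsilon=0.0001):
    """Replace a zero Peak Price with a small non-zero value."""
    return price if price != 0 else epsilon

inverse = 1.0 / safe_peak_price(0)   # finite, instead of dividing by zero
```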
Keywords: Aspen Online, Optimization, AUEditor
References: None |
Problem Statement: Is it possible to localize Batch.21 to languages other than those provided with Service Pack 2 for version 4.1? | Solution: The attached kit allows users to localize Batch.21 software and help files to languages other than the ones provided by AspenTech.
Keywords: language
translate
local
localize
international
References: None |
Problem Statement: Which ini file enables the stored procedures (psoft or psoftG)? Some users report GUI problems when UseSP=1 is in both ini files. | Solution: Here is a breakdown of the USESP setting. This setting belongs in the psoftg.ini file. It can be used only if the client has run the IO package upgrade on their database: USESP stands for 'use stored procedures', and if the IO scripts have not been run, the stored procedures do not exist in the database, which causes the GUI problems. The main goal of the stored procedures is to increase the speed of the services, which make many redundant database calls. Most of the performance gains will be seen in the services, but overall, using stored procedures should increase the speed of the whole system, even outside the services.
Keywords:
References: None |
Problem Statement: Some customers generate and save Batch.21 Web Reports with preset queries so the user only has to select them and run the query. The user shouldn't have to see the Datasource and Area and the Batch Query Criteria options maximized. These options can be hidden manually by pressing the button on the right side of the page but this only keeps the options hidden for that particular user session and does not remain hidden once the user opens that report again. Is there a way to set the default settings for these options to start auto hidden instead of auto show? | Solution: If you are just trying to have your users automatically launch a previously generated report, you can bypass the Reporting.aspx page altogether and use the RunReport.aspx page by passing in the following information pertaining to the saved report:
- batch data source name that report is saved under
- batch area name that report is saved under
- storage location (atStoragePrivate or atStoragePublic) of saved report
- name of saved report
Here is an example of such a URL:
http://batchserver/pages/Reporting/RunReport.aspx?datasource=PRODUCTION&area=LINE1&location=atStoragePublic&name=last5summary
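Such URLs can be assembled programmatically so the parameter values stay properly encoded. A hedged Python sketch; the server name, page path and parameter values are the examples from this article, so substitute your own:

```python
# Sketch: build the saved-report URL from its four parameters.
from urllib.parse import urlencode

def report_url(server, datasource, area, location, name,
               page="pages/Reporting/RunReport.aspx"):
    """Assemble the URL that launches a previously saved report."""
    params = urlencode({
        "datasource": datasource,
        "area": area,
        "location": location,   # atStoragePublic or atStoragePrivate
        "name": name,
    })
    return f"http://{server}/{page}?{params}"

url = report_url("batchserver", "PRODUCTION", "LINE1",
                 "atStoragePublic", "last5summary")
```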
Keywords:
References: None |
Problem Statement: Sometimes, when a user installs PIMS on top of an existing version (for example, installing a new build over another build within the same major release), he/she may encounter problems later when trying to uninstall the application. | Solution: There are three ways to uninstall the PIMS application:
1. Use the Aspen uninstall tool from Start | All Programs | AspenTech | Uninstall Aspen Tech Software.
2. Use Windows Add/Remove from Control Panel | Add or Remove Programs.
3. If neither one works, use the Windows Installer cleanup utility. Here is a link from the Microsoft website: http://support.microsoft.com/kb/290301
The cleanup utility is called msicuu2.exe. After downloading, run the file and follow the procedure to clean up.
At the end, always make sure the application folder is clean.
Keywords: uninstall
utility
References: None |
Problem Statement: Is the timestamp for a Aspen Process Explorer aggregate value set at the beginning, the middle or the end of the aggregate period? | Solution: The Aspen Process Explorer default timestamp anchor is the beginning of the aggregate period. If this is not desirable, then the timestamp anchor can be changed in the trend properties.
1. Right Click on the Trend and select Properties.
2. On the Sampling tab select the aggregate type which will make the Settings button available.
3. Click the Settings button to enter the Aggregates settings.
4. Set the Timestamp Anchor as desired [Begin, Middle, End].
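What the three anchor choices mean can be shown with a small calculation. This is an illustrative Python sketch, not Process Explorer code:

```python
# Sketch: the timestamp reported for a one-hour aggregate period under
# each Timestamp Anchor choice (Begin is the default per the article).
from datetime import datetime, timedelta

def anchor_timestamp(period_start, period_length, anchor="Begin"):
    """Return the timestamp reported for an aggregate period."""
    if anchor == "Begin":
        return period_start                      # the default
    if anchor == "Middle":
        return period_start + period_length / 2
    if anchor == "End":
        return period_start + period_length
    raise ValueError(f"unknown anchor: {anchor}")

start = datetime(2024, 1, 1, 8, 0)
hour = timedelta(hours=1)
```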
Keywords:
References: None |
Problem Statement: When saving a report or plot created with the Aspen Production Record Manager (APRM) Web Reporting tool, you are not given an option to choose the directory location to save those files. Where are these reports or plots saved? | Solution: By default the reports are saved to the following directory location on the APRM server
Windows 2003: <drive>\Program Files\AspenTech\Batch.21\Data\Reports\Queries
Windows 2008: <drive>\ProgramData\AspenTech\Production Record Manager\Data\Reports\Queries
Similarly plots are saved under the following directory location on the APRM server
Windows 2003: <drive>\Program Files\AspenTech\Batch.21\Data\Plots
Windows 2008: <drive>\ProgramData\AspenTech\Production Record Manager\Data\Plots
Within both the Queries and Plots directories are Public and Private folders. Under the Public and Private folders, are user's folders, and within the user's folders are Area folders. The reports and plots are saved as XML files in the Area folders.
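The folder layout described above can be sketched as a path builder. A hedged Python sketch: the roots and the Queries/Public/user/Area nesting come from this article, while "jsmith" and "Area1" are hypothetical user and area names:

```python
# Sketch: compose the full path of a saved report, given the directory
# conventions described above ("jsmith"/"Area1" are made-up examples).
def saved_report_path(os_year, storage, user, area, report):
    root = (r"C:\ProgramData\AspenTech\Production Record Manager"
            if os_year >= 2008 else
            r"C:\Program Files\AspenTech\Batch.21")
    scope = "Public" if storage == "atStoragePublic" else "Private"
    parts = [root, "Data", "Reports", "Queries", scope, user, area,
             report + ".xml"]
    return "\\".join(parts)

path = saved_report_path(2008, "atStoragePublic", "jsmith", "Area1", "last5")
```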
Keywords: Reports, Plots, Batch.21, folder
References: None |
Problem Statement: Batch.21 triggers can be configured to run command line programs. Often times someone will want to run a system command, or perhaps execute an SQL Query. You find this option in the BCU Administrator on the trigger command's general tab.
What is the correct syntax for a post trigger command that runs an sql query? | Solution: The following is an example of a command that would execute a saved SQL script.
%SETCIMCODE%\IQ %b21%\batch.sql
Also note in the BCU Administrator GUI when configuring this type of trigger that command line arguments can also be specified.
Keywords: trigger
command
batch
References: None |
Problem Statement: How do I know when my PFD is converged? | Solution: The status bar is situated near the bottom left-hand corner of the application window and indicates convergence status. A green color indicates that the simulation is converged, a yellow color means the simulation converged with warning(s), and a red color means the solution has not converged and there are error(s).
Keywords: convergence, converged, status
References: None |
Problem Statement: When I create the tags using GUI I get the tag descriptions from DCS. When I create the tags using a text file (CSV file) I do not get the tag descriptions. I checked the text file to make sure that I am not loading the tag descriptions. How do I get the tag descriptions? | Solution: RTExec performs tag validation during importing. Aspen Online (AOL) made this an on-demand event.
In the Client GUI, if you go to Tools menu you should see the Tag validation menu item. There are two sub-menu items for tag validation. One option is to validate all relevant tags. The other option is to validate relevant tags that are not marked as Valid.
In AOL, a DCS tag is marked as Valid if it passed validation. However, this only means that it is valid at the time of validation. A tag can be deleted or renamed on the plant database side later on.
After you perform tag validation, a dialog box will pop up summarizing the validation results. There is a check box labeled Update description and unit string retrieved during validation. If this box is checked and you click the Update/Delete button, you will get the description and DCS unit from the plant database in your project.
There are two other options about deleting tags that failed validation. By default, these boxes are not checked.
Keywords: None
References: None |
Problem Statement: The following problem has been reported for several different models using the XPRESS optimizer:
You get an optimal solution with a given capacity not bounded (no MAX).
In a CASE, you put a MAX higher than the solution value for this capacity, and now you get an unbounded solution.
What is the reason for this behavior and how can it be fixed? | Solution: This is related to a problem with the matrix Presolve feature of the XPRESS optimizer version 18.00 (used in PIMS 2006.5) and version 18.10 (used in PIMS 7.1). It has been fixed in XPRESS 19, incorporated in Aspen PIMS 7.2.
If you are still running a previous version, this problem goes away by turning off the IFPRES (Pre-solve enabled) setting from:
Model Settings | L.P. | Options | IFPRES
Note: The Presolve feature in general allows for a faster solution time, so it should remain turned on unless you see this kind of problem.
Keywords: IFPRES
Unbounded
XPRESS
References: None |
Problem Statement: As explained in Knowledgebase Solution 108040, Batch.21 has undergone many major design changes between versions, particularly in the relational database structures. Section 3 of the document Batch.21 V4.1 Installation Manual for Windows 98, 2000 and NT is entitled Upgrading Batch.21 from V4.0. It is intended that the version 5.x installation manual will describe how to upgrade from version 4.1 to 5.x.
What path should one follow when upgrading from version 4.0 to 5.x, or from version 2.x or 3.x to a higher version? | Solution: Unfortunately, there is no simple single-step option. As mentioned in the 4.x and 5.x installation manuals, AspenTech recommends the following sequence:
Upgrade from version 2.x (or 3.x) to version 4.0.
Upgrade from version 4.0 to 4.0.1 (meaning install version 4.0 SP1.)
Upgrade from version 4.0.1 to 4.1.
Upgrade from version 4.1 to 5.x.
You cannot and must not skip any of these steps.
A list of necessary documentation is also listed in the above mentioned installation manuals.
Caution: If you have previously installed more than one Aspen Manufacturing Suite product on the same computer, you must upgrade all previously installed products at the same time. If you are also installing an additional Aspen Manufacturing Suite product, you may install it at the same time as you perform the upgrades.
Knowledgebase Solution 108040 describes how to confirm that your relational database has been correctly upgraded.
Keywords:
References: None |
Problem Statement: Oracle treats empty strings as null values. The CHAR_BATCH_DATA table has a constraint requiring the characteristic's value to be non-null. If an InfoPlus.21 tag that is used as the source for a Batch.21 characteristic has an empty value (as opposed to a single space, which is accepted by Oracle), the characteristic cannot be written to Batch.21. The BCU subsequently reports a relational database error indicating that a NULL value is not acceptable. | Solution: Implementing a database trigger which detects the presence of a NULL value and transforms it into a single space provides a workaround to this Oracle limitation. The trigger can be attached to the CHAR_BATCH_DATA table and will only execute when the value is determined to be NULL.
The following script can be run from the Oracle SQL*Plus utility:
CREATE OR REPLACE TRIGGER ASPENBATCH21.BLANK2SPACE
BEFORE INSERT OR UPDATE
ON ASPENBATCH21.CHAR_BATCH_DATA
FOR EACH ROW
WHEN (NEW.CHAR_VALUE IS NULL)
BEGIN
:NEW.CHAR_VALUE := ' ';
END;
/
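For completeness, the same substitution the trigger performs can also be applied client-side before the row is sent to Oracle. An illustrative Python sketch of the rule:

```python
# Sketch: the NULL/empty-to-space substitution done on the client,
# mirroring what the BLANK2SPACE trigger above does in the database.
def char_value_for_oracle(value):
    """Oracle stores '' as NULL; send a single space instead."""
    if value is None or value == "":
        return " "
    return value
```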
Keywords:
References: None |
Problem Statement: What's the difference between archive and backup? | Solution: Backup:
The Batch.21 Backup Tool allows you to backup batch AND configuration data for specified batches into a single file. This is useful if you want to copy data from an online system to an offline system for testing purposes. The backup file can be restored using the restore procedure.
Archive:
The Batch.21 Backup Tool allows you to archive selected batch data from the Batch.21 database. Archiving is used to save online data for long-term storage. When data is archived, the state for each piece of batch data is set to archived, thereby allowing that data to be purged from the system (see the purge procedure). The archive file can be restored using the restore procedure.
Keywords: purge
archive
backup
restore
References: None |
Problem Statement: Error running the Aspen Database Wizard on a laptop: error enumerating data servers; also, user not associated with a trusted SQL Server connection. | Solution: Within SQL Server, it is necessary to define the Authentication mode as SQL Server and Windows, also known as Mixed Mode in SQL 2000.
To set up Mixed Mode security from Enterprise Manager:
Find the server in question in the Enterprise Manager.
Right-click the server, and then click Properties.
Click the Security tab.
Under Authentication, click SQL Server and Windows.
Keywords:
References: None |
Problem Statement: No new batch data is being generated. The BCU log reports:
Error: Relational Database Error -2147217900: [Microsoft][ODBC SQL Server Driver][SQL Server]Violation of PRIMARY KEY constraint Cannot insert duplicate key in object 'batches'.: Execute stored procedure BATCH21_SP_INSERT_BATCH Failed! | Solution: Examine the SQL script referenced in the error message (in this case it is BATCH21_SP_INSERT_BATCH) and the (2) Batch.21 tables: ID_COUNTERS and BATCHES.
Make sure that in the ID_COUNTERS table, the value for the counter corresponds to the number of batches within the BATCHES table. If not, set the ID_COUNTERS to the appropriate number shown in the BATCHES table; the counter and number of batches should be synchronized (i.e. equal).
Restart all BCU scripts and allow them to catch up.
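The counter resynchronization can be demonstrated on an in-memory SQLite database. This is a hedged sketch: the real Batch.21 schema is richer, and the table/column names below are simplified stand-ins for ID_COUNTERS and BATCHES:

```python
# Sketch: detect and fix a batch-id counter that lags behind the
# BATCHES table, which is the root cause of the duplicate-key error.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE batches (batch_id INTEGER PRIMARY KEY)")
con.execute("CREATE TABLE id_counters (name TEXT, value INTEGER)")
con.executemany("INSERT INTO batches VALUES (?)", [(1,), (2,), (3,)])
con.execute("INSERT INTO id_counters VALUES ('batch_id', 1)")  # stale counter

def resync_counter(con):
    """Set the counter to the highest batch id already in use."""
    (max_id,) = con.execute("SELECT MAX(batch_id) FROM batches").fetchone()
    con.execute("UPDATE id_counters SET value = ? WHERE name = 'batch_id'",
                (max_id,))
    return max_id

new_value = resync_counter(con)
```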
Keywords: Violation of PRIMARY KEY constraint
Can not insert duplicate key
BCU
References: None |
Problem Statement: Modelers frequently wish to represent a loss term in an Aspen PIMS submodel. Such a loss term might represent a waste stream, or an extra term to force the submodel to balance on a weight basis.
Aspen PIMS provides a reserved tag, LOS, for this purpose. This solution describes the advantages of using the tag LOS for a loss term. | Solution: To demonstrate a loss term, we added an artificial stream to a submodel. Two versions of the model were created: the first version uses the tag XXX for the loss term; the second version uses the tag LOS. We will show the Aspen PIMS output for the two versions of the model.
The model was run under Aspen PIMS version 17.1.13, a beta release of version 2006.5.
First version: Use Tag XXX to represent loss term
The figure below shows the SNHT submodel table from a PIMS training model. We added an extra material balance row, VBALXXX, which will create a significant material imbalance in the model.
The figure below shows an excerpt from the Execution Log after the model is run. Note the large imbalance for material XXX.
The figure below shows an excerpt from the Full Solution Report cover page. Again, the large imbalance of material XXX is prominently displayed.
Also in the Full Solution Report, the Stream Disposition Summary flags the imbalanced XXX.
The figure below shows the SNHT submodel report from the Full Solution Report. Note that stream XXX is included in both the volume and weight totals.
In the Summary Solution Report, material XXX is reported in the Materials Out Of Balance section.
Second version: Use Tag LOS to represent loss term
The figure below shows the SNHT submodel table. The loss term is now represented by the material balance row, VBALLOS.
The figure below shows an excerpt from the Execution Log after the model is run. Note that LOS is not reported as out of balance.
The figure below shows an excerpt from the FullSolution Report cover page. LOS is not reported as out of balance.
In the Stream Disposition Summary of the FullSolution Report, LOS is shown, but it is not flagged with an asterisk.
The figure below shows the SNHT submodel report from the FullSolution Report. Note that stream LOS is reported, but is not included in the weight total.
In the SummarySolution Report, LOS is not included in Materials Out Of Balance section.
Conclusion
The reserved tag LOS has the following advantages over other tags for representing loss terms:
LOS is not reported as out of balance on the Execution Log.
LOS is not reported as out of balance on the cover page of the FullSolution Report.
LOS is not flagged as out of balance in the Stream Disposition Summary of the FullSolution Report.
LOS is not included in the weight balance of the submodel in the FullSolution Report.
LOS is not included in the Materials Out Of Balance section of the SummarySolution Report.
Keywords:
References: None |
Problem Statement: This knowledge base article describes AspenTech's position regarding the hosting of multiple Batch.21 databases on a single Oracle server. | Solution: Multiple Batch.21 databases can reside on the same Oracle server as long as the Batch.21 databases are hosted in separate Oracle instances. It is not possible to have two Batch.21 databases in the same Oracle instance.
Keywords: Relational
Support
References: None |
Problem Statement: Are there any symbol libraries or sample graphics available for use in the Aspen Process Graphics Editor for valves, pumps, tanks, pipes, etc.? | Solution: There are a variety of example graphics and symbol library graphics located at:
(32-bit systems)
<drive>:\Program Files\AspenTech\APEx\Samples\GraphicsEditor\Graphics
(64-bit systems)
<drive>:\Program Files (x86)\AspenTech\APEx\Samples\GraphicsEditor\Graphics
These two files in particular have symbols that can be copied and used in other graphics:
symbols.atgraphic
3Dsymbols.atgraphic
There are many sample graphics including the following:
atmospheric.atgraphic
crudeover.atgraphic
furnace.atgraphic
tertiarywatertreatment.atgraphic
Keywords: None
References: None |
Problem Statement: What is the purpose of the Recent Batch Cache in the BCU? | Solution: At any particular trigger time, the BCU must acquire the values of a unit's designator tags and figure out what batch ID those designator values represent.
It could do this by always simply going to the Batch.21 server and doing a query, but as there are many trigger firings that occur for each batch, caching this information (batch ID vs designator values) will greatly reduce traffic between the BCU and the batch database.
This caching is done in the Recent Batch Cache in the BCU.
When the BCU gets a trigger it takes the designator values and checks the cache to see if it already knows about a batch ID with those values. Here are the possible outcomes:
If the current designator values already exist in the Recent Batch Cache, then no action is taken, and the characteristics associated with the trigger are recorded.
If the current designator values are not found in the Recent Batch Cache, then the BCU goes to the server and does a query. If a batch ID is found that matches the designator values, then (1) that Batch ID is added to the Recent Batch Cache, and (2) the characteristics associated with the trigger are recorded.
If the current designator values are not found in the Recent Batch Cache OR on the Batch server, then (1) a new batch ID is created, (2) the new batch ID is added to the Recent Batch Cache, and (3) the characteristics associated with the trigger are recorded.
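The three outcomes above can be summarized in a short sketch; the class and method names below are illustrative only and do not reflect the actual BCU implementation.

```python
# Sketch of the Recent Batch Cache decision logic described above.
class BatchCache:
    def __init__(self, server_query, create_batch):
        self.cache = {}                   # designator values -> batch ID
        self.server_query = server_query  # lookup on the Batch.21 server
        self.create_batch = create_batch  # creates a new batch, returns its ID

    def resolve(self, designators):
        key = tuple(sorted(designators.items()))
        if key in self.cache:                  # outcome 1: cache hit, no query
            return self.cache[key]
        batch_id = self.server_query(designators)
        if batch_id is None:                   # outcome 3: not on server either
            batch_id = self.create_batch(designators)
        self.cache[key] = batch_id             # outcomes 2 and 3: remember it
        return batch_id

# Example: only one batch exists on the "server".
server = {(('lot', 'A'),): 7}
cache = BatchCache(lambda d: server.get(tuple(sorted(d.items()))),
                   lambda d: 99)
print(cache.resolve({'lot': 'A'}))  # 7, found via server query, now cached
print(cache.resolve({'lot': 'B'}))  # 99, new batch created
```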
It is possible for the user to pull the rug out from underneath the cache by deleting batches from the Batch.21 database, or by manually changing designator values. If either of these happen, the information in the cache will be out of date (in the former, because the BCU thinks it knows about a batch that actually doesn't exist any more; and in the latter, because the BCU thinks a certain batch ID has different designators than it really does.)
There are two ways to wipe out the cache (and subsequently rebuild it via individual queries to the Batch.21 database as needed): (a) manually, or (b) restart the BCU.
In versions prior to 4.1, it could be done only via (a), because the contents of the Recent Batch Cache were preserved in a file during a restart.
Keywords: cache
Batch.21
InfoPlus.21
References: None |
Problem Statement: Batch Query Tool returns No key value for list item xx error messages for every row it returns for a query. User is likely to be using a localized version of the Windows operating system (e.g. Japanese or Korean). | Solution: The No key value for list item xx error appears because the Batch.21 query tool cannot understand foreign character strings for 'AM' and 'PM'. This is a problem with some Microsoft functions apparently not handling foreign character strings correctly.
A few workarounds exist:
display the time without the AM and PM designations (24 hour time seems to work fine).
change the am/pm designator from the foreign characters to AM and PM (in English).
This problem occurs because Batch.21's display format is based on your Windows Regional Settings. To change this, you have to go to Control Panel -> Regional Options -> Time. After getting rid of the foreign characters, open the Batch Query Tool again and do a query for batch data. The error should not reappear.
Keywords: No key value
Batch Query Tool
References: None |
Problem Statement: After installing Batch.21 version 4.1, the Batch.21 Administrator launches without error, but the batch area is created with an error message:
B21BSC-50187: Failed to get area attributes
Each time the area is clicked on to select it, the same error appears. | Solution: Install ER BA032902A. The ER is available from our Support website as Knowledge BaseSolution 108129.
This problem is caused by an Oracle service. TheSolution is to stop the service OracleHTTPServer which is installed during a standard ORACLE installation. This service is started automatically.
Keywords:
References: None |
Problem Statement: How do you properly set the Subbatch Level for the Unit Characteristic on the Characteristics tab of the area properties dialog box?
Before you can do that, however, you will first have to define the number of subbatch levels on the Subbatch Levels tab, because the available subbatch level choices are based on that number.
The Subbatch Level number for Unit Characteristic must correspond to the highest subbatch level that contains an entire unit procedure. In other words, it must correspond to the subbatch level at which the batch moves from one processing unit to the next.
Example:
Please see attached MS Word document.
Keywords:
References: None |
Problem Statement: This solution shows how to upgrade the Batch.21 database from versions 4.1, 5.x and 6.x to the aspenONE versions (2004 and above).
Solution
In the past there was a migration table that needed to be followed when upgrading the various versions of Aspen Batch.21. With aspenONE there is an upgrade wizard that allows the system to be upgraded from versions 4.1.x, 5.x and 6.x directly to v. 2004 and above.
The upgrade wizard is accessed from the Start menu:
Start | Programs | AspenTech | Common Utilities | Aspen Database Wizard
The screen capture below shows the drop down menu for choosing the database version to upgrade from.
For additional information regarding the use of the Aspen Database Wizard, please see the Aspen Batch.21 Configuration Guide.
Please note that the Configuration Guide states that the Wizard allows you to upgrade the Batch.21 database from v5.0 or higher, to the aspenONE format. This is not correct, as the Wizard allows the Batch.21 database to be upgraded from v4.1 or higher, to the new database schema.
A documentation enhancement defect (CQ0029128) has been introduced to correct this.
Keywords: Upgrade
Database
Wizard
References: None |
Problem Statement: When starting Aspen Process Explorer, the following error is displayed:
Failed to update the system registry. Please try using REGEDIT. | Solution: This error is caused because the application was installed with a non-privileged user. ProcessExplorer.exe must be installed and registered by a privileged user. If it has been, and it is then run by a non-privileged user, the registry error message appears, but the error is resolved, and the program should continue to execute normally.
If ProcessExplorer.exe does not continue to execute normally, it needs to be registered on that machine by a privileged user - one with local Administrative privileges. This is done from a command prompt navigating to:
<drive>:\Program Files\AspenTech\APEx\Pe and typing:
ProcessExplorer.exe -regserver.
This issue is due to a known Microsoft defect with Windows 2000 documented in Microsoft Knowledge Base Article 254957:
When an underprivileged (for example, a non-administrator or a non-power) user runs a standard Microsoft Foundation Classes (MFC) OLE server on Windows 2000, the registry update fails and displays the following error message: Failed to update the system registry. Please try using REGEDIT. The error occurs because each time an MFC OLE executable server runs, it calls the UpdateRegistry function in the COleTemplateServer class. The MFC UpdateRegistry function updates the registry entries in HKEY_CLASSES_ROOT for both the application and its document types. In Windows 2000, access to HKEY_CLASSES_ROOT is restricted to administrators and power users.
Process Explorer already contains the only known resolution to this problem, which allows it to register itself when it is executed by a privileged user.
Keywords: Registry
Failed to Update
Install
References: None |
Problem Statement: Sometimes, due to a system crash or some other undetermined reason, there may be a period of several days when the interface will return the same timestamp for all the data values each time the samples are taken. This could result in thousands of points in history with the same timestamp but different values. This data would typically be considered bad by process engineers even though the IP_TREND_QSTATUS would show status as GOOD for each occurrence. Since historical data cannot be deleted, the reports generated by the database would interpolate all the data, including those occurrences regarded as bad, thus creating a false picture of the events. When viewed in Process Explorer, the interpolated trends could be rather confusing and generally undesirable to the process control engineers.
There also may be a situation where no data is gathered at all in InfoPlus.21, for some period of time. Again, when viewed in Process Explorer, a straight line, or interpolated line, would be drawn between the last value, and the next value, regardless of the length of time passed between them.
For either of the reasons above, you may wish to see a gap in the trend line, rather than a straight line drawn between the data points. | Solution: OneSolution to the above problem would be to run an SQLplus query which would change the IP_TREND_QSTATUS of the affected points in history from GOOD to BAD, thus inserting a gap in history which would prevent undesirable data from being displayed in Process Explorer. Process Explorer does not display data with a bad Quality status. You would see a small x at the last good value, a gap, and another small x at the first good value.
Here is an example of a query which would change the status in the IP_TREND_QSTATUS field of the affected data points from GOOD to BAD. This example query is only an example, and will need modification to meet the specific needs of your data.
Please save query1 and query2 as separate files and then run query1. Make sure that query2 is saved in the group 200 directory.
Example:
query1
FOR (SELECT name n FROM IP_AnalogDef)
DO
START 'query2.sql',n;
END
query2
UPDATE &1 SET IP_TREND_QSTATUS = 'BAD'
WHERE IP_TREND_TIME BETWEEN '18-OCT-00 09:37:00' AND '18-OCT-00 09:38:00';
The above query combo will change the IP_TREND_QSTATUS to BAD for all your affected data records defined by ip_analogdef.
NOTE: For more information on using the UPDATE statement, see the SQLplus Users Manual or the online help. Also, seeSolutions #102623 and #102892 for information about inserting occurrences into history, which might come in handy if you wanted to mark, for future reference, the beginning and the end of the altered block of points in history by inserting a couple of extra points.
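The query1/query2 pattern above (an outer loop dispatching a parameterized inner UPDATE for each record) can be illustrated with a small sqlite3 sketch. The table layout here is a simplified assumption, not the real IP_AnalogDef repeat-area structure.

```python
import sqlite3

# sqlite3 stands in for SQLplus here; two "tag" tables with three history
# occurrences each, all initially GOOD.
con = sqlite3.connect(":memory:")
for name in ("tag1", "tag2"):
    con.execute(f'CREATE TABLE "{name}" '
                "(ip_trend_time REAL, ip_trend_qstatus TEXT)")
    con.executemany(f'INSERT INTO "{name}" VALUES (?, ?)',
                    [(t, "GOOD") for t in (1.0, 2.0, 3.0)])

# Outer loop, like query1's FOR ... DO START 'query2.sql',n
records = [r[0] for r in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
for name in records:
    # Inner UPDATE, like query2: mark occurrences in the window BAD
    con.execute(f'UPDATE "{name}" SET ip_trend_qstatus = ? '
                "WHERE ip_trend_time BETWEEN ? AND ?", ("BAD", 1.5, 2.5))

print(con.execute('SELECT ip_trend_qstatus FROM "tag1"').fetchall())
# only the occurrence inside the time window is now BAD
```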
Keywords: None
References: None |
Problem Statement: How can I view all database records in the Aspen Tag Browser? Why can't I list (and then plot in Aspen Process Explorer) the custom records I defined?
Customized definition records can be used to create user-specified record types in Aspen InfoPlus.21. These records can be developed to include more plant- or process-specific record fields than the default Aspen InfoPlus.21 definition records such as IP_AnalogDef or IP_DiscreteDef.
This tech tip provides suggestions on why the Aspen Tag Browser will return no, or fewer than expected, records for searches on customized record types.
An example can be seen in the two pictures below: the first one shows the results of an advanced search on standard AnalogDef tags, while the second one shows that the Aspen Tag Browser displays no records (although they exist in the database) when performing a search on custom-defined records (here the definition record is PIDVacioHornoDef). | Solution: In order for custom records to be found when using the Aspen Tag Browser they must have a corresponding map record. To create a map record, it is necessary to create and configure a record defined by AtMapDef. Map records were conceived to tell Aspen Process Explorer which values (from which field in the record) are to be plotted. You can find more information on maps and map records in the Aspen Process Explorer online help and in the Aspen InfoPlus.21 Database Developer's Manual (Map Records chapter).
If the MAP record has been configured, but the Aspen Tag Browser still does not return the correct list of tags, please check the following points:
1. Make sure the Aspen Tag Browser is configured to search for customized tags.
Open the Aspen Tag Browser and select View | Options from the menu, select the Search tab and uncheck the IMS Tag Set Only checkbox.
2. Verify the map record is pointing to the correct definition record.
In the InfoPlus.21 Administrator, open the map record under AtMapDef that corresponds to the custom definition record and verify field MAP_DefinitionRecord. For example, the map record for IP_AnalogDef, called IP_AnalogMap, points to IP_AnalogDef inside field MAP_DefinitionRecord. If this pointer is changed to a different definition record, then tags defined under IP_AnalogDef cannot be found in the Aspen Tag Browser and trended in Aspen Process Explorer.
3. Verify that field MAP_CurrentValue in the map record points to a valid field.
If the field pointed to by MAP_CurrentValue is invalid, then tags defined by your custom definition record cannot be returned by a tag search. The SQL query developed and used by Aspen Tag Browser searches will fail and return an internal error. Externally, all the user will see is the message no record was found, indicating that no tag names matched the search.
An invalid field name for MAP_CurrentValue would be any field that is not listed in the fixed area of the target record. For example, say the MAP_CurrentValue field is set to FILE_NAME. Since this is not a field in the fixed area of the standard IP_AnalogDef record, Aspen Tag Browser searches will not be able to recognize tags defined by IP_AnalogDef. On the other hand, you could point the MAP_CurrentValue field to IP_Value_Format. Even though this is not the field that contains the current tag value, it is a field listed in the fixed area of the IP_AnalogDef record. Subsequent tag searches will find IP_AnalogDef tags; however, the searches will now return current values of the type F6.3, F7.2, etc.
An invalid MAP_CurrentValue field will cause more damage than just affecting searches on the map record's corresponding tag type; it will also interfere with searches on tags defined under different definition records. In fact, one incorrectly formatted map record can cause the Aspen Tag Browser to give misleading and confusing results for all the tags in the Aspen InfoPlus.21 database.
To explain, say that there are two custom definition records, IP_CustDefA and IP_CustDefB, with the following map record configuration. In addition, there are six tags defined in the database, three for each definition record.
Definition Record    Map Record    MAP_CurrentValue    Defined Tags
IP_CustDefA          IP_AMap       FILE_NAME           t1, t2, t3
IP_CustDefB          IP_BMap       IP_INPUT_VALUE      t4, tt4, b1
While the IP_BMap record is correctly set up, the MAP_CurrentValue field for IP_AMap points to the field FILE_NAME, which does not exist in the fixed area of the IP_CustDefA records. Consequently, the IP_AMap record is incorrectly configured.
A tag search of t* would potentially find the five tags that begin with t: t1, t2, t3, t4, and tt4. Since IP_AMap is incorrect it is expected that tags t1, t2, and t3 would not be returned to the Aspen Tag Browser results, but tags t4 and tt4 would be returned as they have a correct map record. However, with the setup given above, this search will actually return no tags to the Aspen Tag Browser. Why is this?
The SQL query used by the Aspen Tag Browser will step through the database and look at each tag to see if it meets the search criteria. If it comes across a single tag, which meets the criteria, but has a faulty map record, the entire query is aborted.
In the above example, tags t4 and tt4 are not returned because tags t1, t2 and t3 have already caused the query to end prematurely. To find these tags with the Aspen Tag Browser the search would need to be narrowed so that it filtered out any tags of IP_CustDefA type. For example, searches of t4* and tt* would return tags t4 and tt4, respectively.
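The abort behavior described above can be modeled with a short sketch. This is an illustration of the described behavior only, not the Tag Browser's actual SQL query.

```python
import fnmatch

# tag name -> (definition record, whether its map record is valid)
tags = {
    't1': ('IP_CustDefA', False), 't2': ('IP_CustDefA', False),
    't3': ('IP_CustDefA', False), 't4': ('IP_CustDefB', True),
    'tt4': ('IP_CustDefB', True), 'b1': ('IP_CustDefB', True),
}

def browse(pattern):
    """If any matching tag has a faulty map record, the whole query aborts."""
    results = []
    for name, (_defrec, map_ok) in sorted(tags.items()):
        if fnmatch.fnmatch(name, pattern):
            if not map_ok:
                return []   # one bad map record ends the query prematurely
            results.append(name)
    return results

print(browse('t*'))    # [] -- aborted by t1's bad map record
print(browse('t4*'))   # ['t4'] -- narrowed search avoids IP_CustDefA tags
print(browse('tt*'))   # ['tt4']
```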
With a larger database and more custom tag types, this error can create a great deal of confusion with users.
Keywords: BROWSING
PLOTS
PLOTTING
CANNOT PLOT
TAGBROWSER
CUSTOM DEFINITION RECORD
References: None |
Problem Statement: The programmable characteristics available in Aspen's VBA environment can also be made available in the VB environment (functions such as AT_GETAGG, for instance). | Solution: In Visual Basic, choose Add Components, and then select Aspen Tag (non-visible). If this component is not available, then search for the underlying ActiveX file on your system, which is at_tag.ocx.
Add this tag to a VB form to make available the properties and methods you want to use.
Keywords: VB6
VB5
get aggregate
References: None |
Problem Statement: If the interface is installed onto an InfoPlus.21 system that runs a non-default group number (anything other than group200), there is an undocumented command line parameter needed for starting the cimio_setcim_dlgp.exe process. | Solution: If your InfoPlus.21 system is being run under group400 for example, the following entries will be needed in the cimio_logical_devices.def file:
IOSETCIM <nodename> CIMIOSETCIM_400 (where <nodename> is the name of the InfoPlus.21 server being used as the CIM-IO server)
There would then need to be an entry in the services file for CIMIOSETCIM_400.
When starting the DLGP process, the following parameter needs to be passed to the DLGP process: -n XXX (where XXX corresponds to the group number used)
Following the example above, the command used to start the DLGP process would be:
%CIMIOROOT%\io\cio_set_cim\cimio_setcim_dlgp -n 400
If the DLGP process is being started as an external task in the InfoPlus.21 Manager, the Command line parameters field should be updated to include:
-n 400
Keywords:
References: None |
Problem Statement: If you attempt to install Batch.21 to a root directory (such as C:\) or move an existing installation to a root directory, the warning message
Use of root directory for Batch files is not recommended
is returned. | Solution: This message stems from the fact that the root directory which is shown in the Batch Query Tool scope window must normally be above the saved query file you want to use. If you put the saved files somewhere else, sometimes the the Batch Query Tool can't figure out where the query file is.
Keywords:
References: None |
Problem Statement: In the BCU it is possible to execute a custom command when a trigger fires. There are restrictions (see Solution #108033) and recommendations on how to specify these.
Solution
For complex commands, follow either of these recommendations:
1. cmd /C custom_command where custom_command equals your command(s)
2. Call a batch file from the BCU, where the batch file executes your complex command(s)
Keywords: Batch.21
BCU
custom
command
References: None |
Problem Statement: In the FullSolution Report, under the Process Submodel section for the crude units, I can find the Swing Cut End Points reported.
Is there a way to suppress the logical crude units reporting of end point assays?
Solution
Aspen PIMS automatically generates this report if it finds the FVT property in table ASSAYS for the swing cuts.
From Table ASSAYS:
Into FullSolution
The only way to suppress the report would be to rename this property to something else, i.e. change the tag name from FVT to a different 3-character tag.
Keywords: Process Parameters
References: None |
Problem Statement: Mapping a large number of model variables to tags in the Aspen OnLine client user interface forms is a difficult process since the mapping is done one variable at a time. Is there a more efficient way of entering the mapping for a large number of variables? | Solution: When there are more than just a few variables to link to tags in Aspen OnLine, the recommended procedure is -
1. Manually add one or two tags in the Aspen OnLine client graphical interface (GUI)
2. Manually add one or two variables in the GUI as well.
3. Manually link those tags and variables in the GUI.
4. Export a .CSV file for the tags from the GUI (File > Export > Tags)
5. Export a .CSV file for the variables from the GUI (File > Export > Variables). Please note that this file includes the tag-variable linkage definitions.
6. Using the exported tags file as a template, prepare a similar file which can be imported to add the remaining tags to the Aspen OnLine project. There could be hundreds or thousands of tags, and it can be done in one or multiple files. (File > Import > Tags)
7. Using the exported variables file as a template, prepare a similar file which can be imported to add the remaining variables to the Aspen OnLine project and subsequently link these variables to the appropriate tags as well (File > Import > Variables). Please note that the linkages between variables and tags are defined in the variables file as exported in step 5. There could be hundreds or thousands of variables. This process can be carried on in one or multiple files.
Please note that this is a recommended practice. Select only those tag and variable attributes in the export and import files that are necessary. The rest of the attributes will get the default values, which is appropriate.
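Step 6 can be scripted. The sketch below generates additional tag rows from an exported template file. The column names and tag names are assumptions for illustration; always reuse the exact header row from your own exported file.

```python
import csv, io

# A minimal stand-in for an exported tags file (step 4); real exports
# will have more columns.
template = "TagName,DataSource,Description\nTI100,IP21,Reactor temp\n"

reader = csv.reader(io.StringIO(template))
header = next(reader)  # reuse the exported header verbatim

out = io.StringIO()
writer = csv.writer(out, lineterminator="\n")
writer.writerow(header)
for i in range(101, 104):  # hypothetical tag names to bulk-add
    writer.writerow([f"TI{i}", "IP21", f"Temperature {i}"])

print(out.getvalue())  # ready to import via File > Import > Tags
```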
Keywords: size, mapping, tags, variables
References: None |
Problem Statement: When trying to add a tag to a unit in the Batch.21 Administrator, the following error occurs:
Unable to CCreateNewTagDlg error message | Solution: To resolve this problem re-register the AtBatch21Administrator.dll file.
Keywords:
References: None |
Problem Statement: When importing an XML file that contains batch areas from another batch system, the following error message may appear:
B21SVR-50201 Permission Denied : Batches | Solution: The above situation was encountered when batch areas from a production server were imported into an identical batch test server. If there are existing batch areas on the test server then the import of unique batch areas via the XML file will not be possible. The error message: B21SVR-50201 Permission Denied: Batches will appear.
The error message indicates that the user lacks the permission to view, merge or modify the batch areas.
On a system with this problem, looking at the Batch.21 tables in the Oracle database should indicate a datatype of ORABLOB - capitalized. By setting the ACL (Access Control List) information in the database structure to NULL, this datatype is changed to lowercase orablob, indicating to Oracle that security is no longer enforced on this Batch.21 database. Here's an example script for either Oracle 8i or 9i (the problem can exist in both) that resolves the issue:
- The affected database schema is AspenBatch21
- The affected tables are AREAS, CHARACTERISTICS, SB2_NAMES, SB3_NAMES, SB4_NAMES and SB5_NAMES
- The affected columns are: ACL, BATCH_ACL
Here is the sql used to update the AspenBatch21 Batch area:
update areas set acl = null;
update areas set batch_acl = null;
update characteristics set acl = null;
update sb2_names set acl = null;
update sb3_names set acl = null;
update sb4_names set acl = null;
update sb5_names set acl = null;
IMPORTANT NOTE: After running the above SQL, the same error may result again if you immediately try to re-import the XML. Allow 30 to 60 minutes before trying to re-import so that all security information has been updated.
Keywords: Orablob
blob
permission
50201
B21SVR-50201
References: None |
Problem Statement: Aspen OnLine may fail with the following message in the EngineLog.log file when trying to load any model for execution:
Monday, Sep-16-2002 14:16:30 Client 0 encountered error loading Aspen Utilities model (Example.auf) execution. Err description: (0) | Solution: One cause for the error message is that NT services do not have sufficient memory available. This can be modified by changing a registry key:
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Subsystems\Windows
Old value - %SystemRoot%\system32\csrss.exe ObjectDirectory=\Windows SharedSection=1024,3072,512,1024 Windows=On SubSystemType=Windows ServerDll=basesrv,1 ServerDll=winsrv:UserServerDllInitialization,3 ServerDll=winsrv:ConServerDllInitialization, 2 ProfileControl=Off MaxRequestThreads=16
New value - %SystemRoot%\system32\csrss.exe ObjectDirectory=\Windows SharedSection=2048,3072,2048,1024 Windows=On SubSystemType=Windows ServerDll=basesrv,1 ServerDll=winsrv:UserServerDllInitialization,3 ServerDll=winsrv:ConServerDllInitialization, 2 ProfileControl=Off MaxRequestThreads=16
After modifying the registry key, the PC must be rebooted for the new setting to take effect.
This registry key may be modified by the installation of any programs. Although you may have already modified the SharedSection resources, subsequent installation of a third party program may have reset it back to the default, smaller values.
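As a sketch of the edit involved, the snippet below only manipulates the SharedSection portion of the value as text; actually writing it back to the registry would require winreg and administrative rights, and should be done with care and a backup.

```python
import re

# Abbreviated form of the registry value shown above.
old = ("%SystemRoot%\\system32\\csrss.exe ObjectDirectory=\\Windows "
       "SharedSection=1024,3072,512,1024 Windows=On")

def set_shared_section(value, sizes):
    """Replace the SharedSection=... numbers with the given sizes."""
    return re.sub(r"SharedSection=[\d,]+",
                  "SharedSection=" + ",".join(str(s) for s in sizes), value)

new = set_shared_section(old, (2048, 3072, 2048, 1024))
print("SharedSection=2048,3072,2048,1024" in new)  # True
```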
Keywords:
References: None |
Problem Statement: Is it possible to localize Batch.21 to languages other than those provided with Service Pack 1 for version 5.0? | Solution: The attached kit allows users to localize Batch.21 software and help files to languages other than the ones provided by AspenTech.
Keywords: language
translate
local
localize
international
References: None |
Problem Statement: When using the Aspen Blend Model Library (ABML), when am I required to recurse the properties of specification blends? | Solution: When implementing ABML, the modeler should recurse properties for a blend under the following circumstances:
1. When using a component-based transform, the index property of the blend must be recursed so that it is visible to the reverse transform. For example, in Solution 121223, which demonstrates the use of RVPINDEX, the RVI (Reid Vapor Pressure Index) property of the gasoline blends must be recursed so the RVP can be calculated via the reverse transform.
2. When using a second-order correlation, the properties that are input to the correlation must be recursed. For example, when using the DON correlation to compute the road octane of a gasoline blend, the Research (RON) and Motor (MON) Octane Numbers must be recursed.
3. When daisy chaining correlations, the input properties to the cascaded correlations must be recursed. For example, when using the TVL correlation, the RVP, T10, and T50 properties must be recursed. However, when daisy chaining correlations, the user only needs to provide initial guesses for the first set of inputs. Aspen PIMS automatically propagates the guesses down the daisy chain.
In general, ABML requires properties of specification blends to be recursed so they will be visible to correlations that are applied to the blends.
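As an illustration of point 1, a component-based transform blends an index linearly and then recovers the blend property through the reverse transform, which is why the index property must be recursed. The 1.25 exponent below is a commonly used RVP blending-index assumption, not a value taken from this article; ABML's actual correlations may differ.

```python
EXP = 1.25  # assumed blending-index exponent, for illustration only

def rvp_to_index(rvp):   # forward transform
    return rvp ** EXP

def index_to_rvp(rvi):   # reverse transform (operates on the recursed RVI)
    return rvi ** (1.0 / EXP)

def blend_rvp(components):
    """components: [(volume fraction, component RVP), ...]"""
    rvi = sum(frac * rvp_to_index(rvp) for frac, rvp in components)
    return index_to_rvp(rvi)

# 60/40 blend of 8.0 and 12.0 psi components: the index blends linearly,
# so the blended RVP differs from a simple volume-weighted average.
print(round(blend_rvp([(0.6, 8.0), (0.4, 12.0)]), 3))
```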
Keywords: None
References: None |
Problem Statement: In the BCU Administrator, the Scheduler Table shows the Executing Units, and a Progress Time column for each. How is the Progress Time computed and reported? | Solution: Processing Data
When the BCU starts up, each trigger requests a chunk of history data for each tag in its conditions, beginning at whatever time point it left off last time. After this data gets processed, it automatically goes out and gets some more.
In a unit that has Synchronized triggers, the unit will strobe across all the triggers, asking each one what is the first history data time point that it cares about. (This timepoint may or may not indicate a trigger firing -- it could simply be one of a sequence of 0's, for example.) Then the unit figures out which of these timepoints needs to be analyzed first (i.e. the earliest one), and tells the relevant trigger to analyze that timepoint (and execute its actions if the conditions are met).
There can be basically four results of that analysis: (1) an error was encountered in requesting new data or executing the actions, (2) analyzed successfully / no firing found, (3) analyzed successfully / fired successfully, (4) analyzed successfully / waiting on more characteristic data to arrive.
Saving State and Reporting Progress Time
If a trigger has fired, the unit saves its state (and that of its triggers) to the database.
If a trigger has not fired, the unit continues on without saving its state, because it has not done anything irreversible -- should the plug get kicked out of the wall accidentally, the unit could pick back up from its last saved state and recreate this work.
But at the end of one complete processing pass (state goes from Executing back to Enabled), the unit saves its state to the database no matter what.
Additional Note regarding timestamp formatting
When observing Progress Times they appear as formatted timestamps that are correct for the timezone where your Batch.21 system is located. But internally all timestamps are stored in UTC format. This format is ideal, because it allows times to be stored and retrieved without regard to issues such as timezone and Daylight Savings Time. When the UTC time value is retrieved from the database, the last step before it is displayed is a conversion using the local client information (Regional Settings) to apply the correct timezone and Daylight Savings information.
This is normally something you do not have to think about. However if using the BCU API to build your own BCU application, you will find the timestamps returned from the BCU server are UTC. This is because UTC time to local time conversion occurs at the very last layer, the one the client applications interact with. This layer is most often the Batch Application Interface, which lives on the client machine. Having a layer that lives on the client machine means that system calls can be made to find out what time zone the client machine is in. The conversion is made on the client and the times are passed to the server in UTC.
In the case of the BCU API, the design is different. No layer lives on the client machine -- the client applications are interacting directly with objects that live on the server machine -- so there is no opportunity to perform a UTC-to-local translation on the client, and no opportunity to find out what time zone the client is in. Therefore programmers using the BCU API need to compensate in their code by making conversion calls for timestamps.
Using SQLplus 6.0 or later, for example, you can define functions like these; the code below will then return correct timestamps from the UTC values supplied by the BCU API:
function to_utc_date(x timestamp);
return delta_time(x, '1980-01-01T00:00:00Z')/24:00 +29221;
end
function from_utc_date(x double);
return cast('1980-01-01T00:00:00Z' as timestamp)+(x -29221)*24:00;
end
-- Example using these functions
local a vardate;
local t timestamp;
t = getdbtime;
write t;
a = to_utc_date(t);
write a;
write from_utc_date(a);
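The 29221-day constant in the SQLplus functions above implies that the vardate serial counts days from 1899-12-30 (an OLE Automation-style date, since 1980-01-01 falls exactly 29221 days after that epoch). Under that assumption, the same conversion can be sketched in Python:

```python
from datetime import datetime, timedelta

# Day 0 of the vardate serial; 1980-01-01 is 29221 days later, matching the
# constant in the SQLplus code above. (OLE Automation epoch -- an inference
# from the constant, not documented here.)
EPOCH = datetime(1899, 12, 30)

def to_serial(ts):
    """Convert a datetime to a fractional-day serial."""
    return (ts - EPOCH) / timedelta(days=1)

def from_serial(serial):
    """Convert a fractional-day serial back to a datetime."""
    return EPOCH + timedelta(days=serial)

print(to_serial(datetime(1980, 1, 1)))  # 29221.0
print(from_serial(29221.5))             # 1980-01-01 12:00:00
```

As in the SQLplus version, the two functions are inverses of each other; only the epoch constant carries the calendar knowledge.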
Keywords:
References: None |
Problem Statement: Which 4.x versions of Batch.21 client tools work with what 4.x versions of Batch.21 server components? | Solution: The version 4.1.1 client tools are not backwards-compatible to the 4.0.1 server components.
The version 4.0.1 client tools are compatible with the 4.1.1 server components.
This means that the server can be upgraded first, and the clients later. However, the reverse scenario is not supported. A customer cannot upgrade the client tools first, and then upgrade the server later.
Keywords: install
migrate
setup
References: None |
Problem Statement: The model is set-up to sell a stream on a volume basis in a weight based model. When the model is run, it generates an optimizer error of the following type:
For XPRESS Optimizer:
98 Error: At line 729 no match for row VBALaaa
For CPLEX Optimizer:
CPLEX Error 1448: Line 729: 'VBALaaa' is not a row name
These messages mean that the matrix reader encountered one or more rows in the columns section of the MPSPROB.MPS file that were not contained in the rows section of that file. | Solution: This is typically caused when trying to setup the model to sell a product on a different basis (ie, volume), than the model basis (ie, weight). There are two ways to specify that the material is sold on a volume basis in a weight-based model. The first is to use the PRICE and VOL columns in Table SELL. The second is to use the VPRICE column in Table SELL. The difference is that if VPRICE is used, the material balance within the model is still handled on a weight basis. If PRICE and VOL are used, then the entire material balance is done on a volume basis.
The No Match... error is typically generated when the user has specified WBALaaa in a submodel (or ROWS) and the model is expecting to find VBALaaa (or vice versa). When this happens, the user needs to verify that the parameters set in Table SELL are consistent with how the material balance is referenced in other parts of the model (submodels or table ROWS).
When using PRICE and VOL in SELL, the submodel where such a material is created would likely bring feed in using a WBAL row, but this product would be produced using a VBAL row. The submodel coefficients would need to account for the necessary conversions.
When using VPRICE in SELL, the submodel entries for the product production would be in a WBAL row.
Another possible cause of this problem is that the model has several ALTTAGS tiers and at least one of them has not been flagged as VOL/PRICE (or VPRICE) in table sell. Correcting the entry so it is consistent with the other alternates will solve the problem.
Note that in Table BUY there is also the option of the VOL column to specify that a stream is purchased on a volume basis in a weight-based model. However, when VOL is used in Table BUY it does not affect the basis of the material balance. That stream's material balance will still be handled on a weight basis.
Keywords: alttags
98
1448
VPRICE
VOL
WBAL
VBAL
SELL
BUY
References: None |
Problem Statement: The Aspen Batch.21 manual and various solutions describe characteristics, subbatches and instances of them separately. This Knowledge Base article provides an answer to the following question: What is the relationship between characteristics, subbatches and instances, and what are the differences between them? | Solution:
Instances are for repeated actions and subbatches are for different actions:
A batch can have characteristics which hold, for example, measurement values. These characteristics can have instances, for repeated measurements.
A batch can have sub-batches, one for every DIFFERENT sub-process. Such a sub-batch can have instances, for repeated sub-processes.
A sub-batch can have characteristics which hold, for example, measurement values related to the sub-batch. These characteristics can have instances for repeated measurements, related to the SUBbatch.
For example:
Let me reveal some of the 'structured' private life of an Anonymous Support Consultant as an example for these Batch.21 topics, both in text and a scheme below.
My weekend starts on Friday evening. I count my money (characteristic-instance Money[1]) and see that I'm broke for the moment. So, I go to the money-dispenser (subbatch 'ATM'), get some money (subbatch-characteristic MoneySpend) and the fun can start! I count my money again (mainbatch-characteristic-instance Money[2]) and go for a dinner (SUBBATCH 'RESTAURANT'), where I order two courses (subbatch-characteristic-instance Food[1] and Food[2]) and of course some fine Belgian drinks (subbatch-characteristic-instance Drink[1] [2] [3]). After I'm done I pay (subbatch-characteristic MoneySpend).
When I walk to my favorite bar I count my money (mainbatch-characteristic-instance Money[3]). Upon arrival at this bar (start a new subbatch BAR [1]) with the name (subbatch-characteristic name) 'Corbeau', I drink another couple of great drinks (subbatch-characteristic-instance Drink[1] [2] and [3]), and leave.
When I walk to my second favorite bar I count my money (mainbatch-characteristic-instance Money[4]) again. Upon arrival at this bar (additional instance of subbatch BAR[2]) with the name (subbatch-characteristic name) 'Celtica', I drink another couple of great drinks (subbatch-characteristic-instance Drink[1] [2] and [3]), and leave.
Just before I go home I count my money (mainbatch-characteristic-instance Money[5]) and have a midnight snack (start a new subbatch SNACK) at a place called (subbatch-characteristic name) 'SJ's' where I order a (subbatch-characteristic FastFood) tasty and dripping 'Kebab with Garlic Sauce'.
Finally I count my money again (mainbatch-characteristic-instance Money[6]), get a taxi which costs me (subbatch-characteristic MoneySpend) 7 Euro, count my money again (mainbatch-characteristic-instance Money[7]) and go to bed.
(Scheme columns: BATCH, characteristic, SUBBATCH, subbatch-characteristic -- the original diagram is not reproduced in this text version.)
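The hierarchy in this story can also be sketched as nested data. The following is a hypothetical illustration in plain Python (made-up values, abridged to three subbatches; this is not the Batch.21 API):

```python
# Batch -> characteristics (with instances) and subbatches (with instances),
# where each subbatch instance has its own characteristics and instances.
weekend = {
    "characteristics": {
        # repeated measurements -> instances of one characteristic
        "Money": [50, 120, 95, 80, 60, 45, 38],   # Money[1]..Money[7]
    },
    "subbatches": {
        # a DIFFERENT sub-process -> a different subbatch
        "ATM":        [{"MoneySpend": [100]}],
        "RESTAURANT": [{"MoneySpend": [45], "Food": [1, 2], "Drink": [1, 2, 3]}],
        # a REPEATED sub-process -> instances of the same subbatch
        "BAR": [
            {"name": ["Corbeau"], "Drink": [1, 2, 3]},   # BAR[1]
            {"name": ["Celtica"], "Drink": [1, 2, 3]},   # BAR[2]
        ],
    },
}
print(len(weekend["subbatches"]["BAR"]))          # 2
print(len(weekend["characteristics"]["Money"]))   # 7
```

The key distinction is visible in the shape of the data: subbatches distinguish different actions, while instances (list positions) distinguish repetitions of the same action.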
Keywords: None
References: None |
Problem Statement: There have been a few cases in which the Batch21Services process consumes a large CPU load when a custom XML-based batch import application imports data to the Batch.21 database. This knowledgebase article provides several suggestions to minimize the load put on the Batch21Services process during such an import process. | Solution: A significant performance enhancement to the XML batch data import process was put into the cumulative ER for v.6.0.1 and subsequently rolled into v2004.1. This enhancement was tracked as:
CQ00197054: Improve performance for XML batch data import
There was roughly a 3X overall performance improvement for importing batches using the XML interface on a test system after this enhancement was implemented (results may vary on other systems.)
Make sure your custom code combines as many characteristics as possible for the XML string so that many characteristics are added at once. This is one of the benefits of using the XML interface to add data to batches as opposed to directly calling the Batch.21 API. The Batch.21 API add method only allows you to add one characteristic at a time.
Move the Batch.21 software to a dedicated server. The InfoPlus.21 server and Cim-IO processes can use an average of 30% of the CPU. If more of the CPU resources were free then the Batch.21 services would have more CPU resources to use.
Oracle databases need to be 'tuned' by a qualified Oracle DBA in order to extract the maximum amount of performance from the database. If using Oracle, make sure that the Oracle database has been properly tuned.
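As an illustration of the first suggestion -- combining many characteristics into a single XML submission rather than one API call per characteristic -- the following Python sketch builds one document carrying several values at once. The element and attribute names here are hypothetical; consult the actual Batch.21 XML schema for the real ones.

```python
import xml.etree.ElementTree as ET

def build_batch_xml(batch_id, characteristics):
    """Build one XML document carrying many characteristic values at once.

    Element/attribute names are illustrative only -- the point is batching
    values into a single submission instead of one call per characteristic.
    """
    batch = ET.Element("Batch", id=str(batch_id))
    for name, value in characteristics.items():
        ET.SubElement(batch, "Characteristic", name=name).text = str(value)
    return ET.tostring(batch, encoding="unicode")

doc = build_batch_xml(1200, {"Temperature": "78.5", "Operator": "Smith"})
print(doc)
```

Each additional characteristic adds only one element to the document, so the per-submission overhead on the Batch21Services process is paid once for the whole set.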
Keywords: slow
cpu
load
References: None |
Problem Statement: If you are in a position where you have to restore a Batch database from a Batch.21 archive file created with the Backup/Restore utility and the Batch.21 Area does not exist, the archive will not successfully recover unless you prepare the database environment correctly. | Solution: The attempt to restore fails because the Batch.21 area does not exist in the relational database.
Follow these steps to work around the problem:
Open the Batch.21 Administrator.
Create an area named EXACTLY the same as what you want to restore (you do not need to fill out any configuration info.)
Open the Batch.21 Backup/Restore Tool and select the area you just created.
Delete the newly created area from the Batch.21 Administrator.
Restore the area (and data) that you have selected in the Backup/Restore Tool.
This will restore the area (including all of its configuration and security-related data.)
Keywords: data loss
dataloss
crash
References: None |
Problem Statement: This Knowledge Base article provides an Aspen SQLPlus query example showing how to query for batches by Batch Handle (Batch_ID). | Solution: LOCAL batch_sources, batch_ds, batch_area, batch;
local Batch_ID INT;
Batch_ID = 1200;
batch_sources = createobject('AspenTech.Batch21.BatchDataSources');
batch_ds = batch_sources('Your_Data_Source');
batch_area = batch_ds.areas('Your_Area');
batch = batch_area.GetBatch(Batch_ID);
write 'BATCH ID: ' || BATCH_ID;
write 'Batch_id: ' || batch.ID;
write 'Batch Handle: ' || batch.GetBatchHandle;
write 'Batch NO: ' || batch.Characteristics('Batch No');
Keywords: Batch_Handle
BatchHandle
References: None |
Problem Statement: Knowledge base article 115049 describes how to force Aspen Process Explorer to connect to Aspen InfoPlus.21 through synchronous communication by either stopping the Noblenet portmapper or by adding a specific Windows Registry key on the client PC. This knowledge base article describes the differences between synchronous and asynchronous communication. | Solution: For queries with a small set of data, the synchronous call is preferred since there is less computational overhead with the synchronous communication process. However, for large queries, the asynchronous calls are more reliable since the caller will not be blocked. Hence there will be no timeouts with asynchronous communication (unless the tcp/ip ports used for communication are blocked - then communication will fail).
Another benefit of asynchronous communication is that a query to Aspen InfoPlus.21 can be cancelled by the client application after it has been initiated. Once a synchronous query is executed it cannot be cancelled.
Over the years, many issues with ports have been reported and have resulted in some changes to the default behavior:
As of V9.1, asynchronous was only used when running in Aspen Process Explorer (and only if DisableAsync registry key was not specifically set).
As of V14.0, asynchronous was turned off by default. If it is desired, you need to EnableAsync (similar to DisableAsync).
Keywords: API
References: None |
Problem Statement: This knowledge base article describes why the Aspen Batch.21 Application Interface (API) may not return data when the function calls are executed through the Aspen SQLplus Query Writer - even though the other Aspen Batch.21 client tools return data when the same batch or time frame is queried. | Solution: When a user without sufficient privileges attempts to query a secured batch area, no data will be returned to the user. The Aspen SQLplus Query Writer executes all queries under the account which starts the Aspen InfoPlus.21 Task service. Therefore, the account which starts the Aspen InfoPlus.21 Task service must be in a role which has privileges to access all Aspen Batch.21 areas if the Aspen Batch.21 API calls are executed from the Aspen SQLplus Query Writer.
Keywords: Access
Deny
Denied
Permissions
Allow
Blank
Empty
References: None |
Problem Statement: A ComboBox is a standard Microsoft ActiveX control which allows a user to view and select from a list of items. See below for an example.
This knowledge base article describes how to populate a ComboBox with a list of Aspen InfoPlus.21 records & their corresponding descriptions using an Aspen Process Explorer VBForm. | Solution: Place the following code in the VBForm's UserForm_Initialize() event so that the ComboBox is populated when the form is loaded. In this example the ODBC data source to the Aspen InfoPlus.21 database is called hklap. Please make sure to change the data source name when you use the code in your environment.
-----
Dim cn As New ADODB.Connection
Dim rs As New ADODB.Recordset
Dim strSQL As String

' "hklap" is the ODBC data source name for the Aspen InfoPlus.21 database
cn.Open "hklap"
strSQL = "SELECT Name, IP_Description FROM ip_analogdef"
rs.Open strSQL, cn, adOpenForwardOnly, adLockPessimistic, adCmdText

ComboBox1.Clear
Do Until rs.EOF
    ComboBox1.AddItem rs.Fields("Name").Value & " - " & rs.Fields("IP_Description").Value
    rs.MoveNext
Loop

rs.Close
cn.Close
-----
Note: It is necessary to add a reference to Microsoft ActiveX Data Objects in
Tools | References... or else the ADO objects (which connect to the Aspen InfoPlus.21 database to run the SQLplus query) won't be recognized.
A sample Aspen Process Explorer VBForm which uses this code is attached.
Keywords: Listbox
Combo
List
References: None
Problem Statement: When using the Batch Reporting tool in Web.21, the web browser session expires after a certain period of inactivity.
To recreate the situation:
With the Batch Reporting tool in Web.21, select 5 most recent batches.
View batch detail for selected batches
Don't touch any of the browser sessions for longer than the timeout period
Press View query results to obtain the most recent 5 batches again; you will get the following error message:
Batch.21 Application Error:
Reporting::Page_Load-The session has timed out or has been terminated. | Solution: There is a way to configure the timeout value for Batch Reports.
Here is a step-by-step procedure to adjust the browser timeout value for Batch Reports:
1. Open the web.config file located in C:\inetpub\wwwroot\aspentech\batch.21 directory on the web server machine with notepad or some other document editor.
2. Inside the file is an XML element called sessionState with an attribute called timeout. Change the attribute value (the default is 20; the value is in minutes, so 20 = 20 minutes) to whatever you want, keeping in mind that user session state will not be released until this timeout expires, which can cause high web server memory usage.
3. Save the file as a 'text' file and restart IIS to load the new configuration.
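For reference, the relevant fragment of web.config looks similar to the following minimal sketch (your actual file will contain many other elements, which should be left untouched; the timeout value of 60 here is just an example):

```xml
<configuration>
  <system.web>
    <!-- Session timeout in minutes; the default shipped value is 20 -->
    <sessionState timeout="60" />
  </system.web>
</configuration>
```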
Keywords:
References: None |
Problem Statement: What is the correct product name, Aspen Online or Aspen Plus Online? | Solution: The product Aspen Plus Online was the online interface for Aspen Plus EO RTO models for Version 11 ONLY.
Since Aspen Plus 12.1, RTO functionality of Aspen Plus Online was merged into the existing Aspen Online product. Refer to the table below.
Simulator Version                  Online RTO Interface Product   Comments on Online Product
RT-Opt 10.0                        RT-Exec
Aspen Plus 11.0, 11.1              Aspen Plus Online 11           Updated version of old RT-Exec product
Aspen Plus 12.1                    Aspen Online 12.1              RTO functionality of Aspen Plus Online was 'merged' into the existing Aspen Online product
Aspen Plus 2004, 2004.1, 2004.2    Aspen Online 2004
Aspen Plus 2006                    Aspen Online 2006
Aspen Plus 2006.5                  Aspen Online 2006.5
Aspen Plus V7.0                    Aspen Online V7.0
Aspen Plus V7.1                    Aspen Online V7.1
Keywords: Aspen Plus Online, Aspen Online
References: None |
Problem Statement: When adding the 12th condition to a query, the following error is displayed:
B21BSC-50191: Too many characteristic conditions in one query
This error occurs regardless of the method used for submitting the query. For example, it could appear when using the Batch Query Tool, or when submitting a query via XML using the Web Service. | Solution: There is a hard-coded limit to the number of conditions (12) that has existed in all version of Batch.21 up to version 2004. The Batch Server will be enhanced to handle an unlimited number of query conditions starting with the v2004.1 release. Keep in mind that as more conditions are added, performance is diminished.
Keywords:
References: None |
Problem Statement: Database Wizard fails on Batch.21 Database creation with the error: ORA-00988: missing or invalid password(s) | Solution: An Oracle user cannot be created when the password begins with a number. Choose a database password that does not start with a digit and re-run the Database Wizard.
Keywords: batch
oracle
fail
References: None |
Problem Statement: Getting data server not responding when trying to start Aspen Advanced Tag Browser within Process Explorer. The Advanced Tag Browser uses SQLplus to connect to the IP.21 database, therefore you need to ensure that the connection to SQLplus can be made. | Solution: Check to make sure TSK_SQL_SERVER task is running
Check to make sure that the Command Line Parameter for TSK_SQL_SERVER in the InfoPlus.21 Manager is set to a correct service number (e.g. 10014)
Check to make sure that InfoPlus.21 system is up and running
Check to make sure that the Servers area of the Configure Servers is correctly setup. (i.e. Name is localhost and System is InfoPlus.21 V2+)
Keywords: data server not responding
References: None |
Problem Statement: Is there an example of a Global model in which Parametric Analysis is being used? | Solution: Parametric Analysis requires a model that is using the XNLP solver and a PIMS Advanced Optimization license.
The attached Global model has Table PARAOBJ and performs Parametric Analysis on the sale of LPG and the purchase of ARL. You can download this model and select a case to run, then run Parametric analysis on that case. Below is the structure of the PARAOBJ table for this model:
*TABLE
PARAOBJ        Objective Function Parametrics
*
*  INITIAL = Initial Objective Function Value
*  FINAL   = Final Objective Function Value
*  DELTA   = Delta Objective Function Value
*
ROWNAMES   TEXT   INITIAL   FINAL   DELTA   GROUP
***
SELLPGA            34        36      1       1
SELLPGAA           0.02      0.04    0.01    1
PURCARL            49        51      1
PURCARLA           0.04      0.06    0.01
***
Keywords: Parametric
PARAOBJ
References: None |
Problem Statement: What languages does Aspen OnLine support? | Solution: Aspen OnLine supports English operating systems only.
Keywords: language
References: None |
Problem Statement: The Microsoft Excel spreadsheet contains the entire path to the process data add-in (e.g.='C:\apps\OFFICE97.E01\OFFICE\LIBRARY\AtData.xla'!ATGetTrend). As a result the Excel Add-in may not function as expected. This problem is related to spreadsheets created on one machine and accessed on another.
It's also been found that a spreadsheet created on one PC and then copied to a network drive, cannot be opened by the same PC that created it (gets error saying it can't find the atdata.xla or atdata.xlam). However, if the spreadsheet is Saved As to the network drive, then it is able to be opened successfully. | Solution: Excel requires that the ATData.xla or ATData.xlam be installed in the proper location. For the sheets to work, the Process data add-in (ATData.xla) must be installed in the Excel library directory (\program files\microsoft office\office\library).
This is critical for the machine creating the original sheet. Placing the add-in anywhere else has the results described above.
To ensure that the process data add-in is installed in the proper location, Microsoft Excel must be installed on the system before Aspen Process Explorer.
Changes
For the later Microsoft Excel versions XP, 2003 and 2007, the library paths are as follows:
Excel Version XP: '\program files\microsoft office\office10\library'
Excel Version 2003: '\program files\microsoft office\office11\library'
Excel Version 2007: '\program files\microsoft office\office12\library'
It is also important to note that if the existing AtData.xla or ATData.xlam path differs from the above, users will need to remove the Excel Add-In and re-Add from the paths stated above.
Keywords: path
directory
share
References: None |
Problem Statement: IP.21 clients are unable to communicate: Unknown Security Error: Authenticate(6) | Solution: NT networking is a prerequisite for communication between APEx, ADSA and IP.21. On a workstation, the main NT service for this is the Workstation service, which has to be running correctly.
You can see whether this service is running with: Start | Settings | Control Panel | Services. Look for Workstation. It should have a status of Started. If it is not started, error messages can be found in the NT Event Viewer.
Keywords: workstation service
NT networking
Unknown Security Error
Authenticate(6)
References: None |
Problem Statement: The message:
An error occured getting the XML in the DOM
appears when accessing a particular Batch.21 Area. | Solution: This error may occur due to an Oracle database error returned from the oledb driver to ado and then to the Batch.21 service. Same situation occurs if the detail display attempts to show the contents of a specific batch. This happens for batches containing characteristic values over e+28.
There is a limitation in the Oracle oledb driver that allows values reasonably at or above e+28 to be stored in the Oracle database using ADO, but when the ADO record set field object attempts to retrieve the values from the char_value field in the num_batch_data database table the driver returns Multiple-step operation generated errors. The only way to work around this limitation to retrieve characteristic values up to e+205 is to configure the ADSA Batch.21 service to use an Oracle ODBC connection. If you change the characteristic values in the database to at or below e+28, both the batch query and detail display succeed.
Use ODBC connection method if it is necessary to store numbers in the range e+28 to e+205.
Here are two queries that will help you find, and then cap, large values that are out of range of the OLE DB driver:
select * from num_batch_data where char_value >= 1.0E+29
update num_batch_data set char_value = 1.0E+28 where char_value >= 1.0E+29
Keywords:
References: None |
Problem Statement: Here are a few helpful tips as you migrate Batch.21 from version 4.1 to 2004. | Solution: From software version 4.1 to software version 4.1.2:
Make sure the BCU server is turned off.
All services associated with InfoPlus.21 software are turned off.
Take a snapshot.
Remove check box on the InfoPlus.21 manager that allows InfoPlus.21 to automatically startup on reboot.
Install AMSSolution 108236 to upgrade software to version 4.1.2.
Reboot the computer.
From software version 4.1.2 to software version 6.0:
Make sure the BCU server is turned off.
All services associated with InfoPlus.21 software are turned off.
Remove check box on the InfoPlus.21 manager that allows InfoPlus.21 to automatically startup on reboot.
Install AMS version 6.0 software to upgrade all AMS software required products.
Reboot the computer.
Upgrade the snapshot using the upgrade wizard in the IP21 manager.
Upgrade the Batch.21 database using the Database Upgrade wizard.
From software version 6.0 to software version 6.0.1:
Make sure the BCU server is turned off.
All services associated with InfoPlus.21 software are turned off.
Remove check box on the InfoPlus.21 manager that allows InfoPlus.21 to automatically startup on reboot.
Install AMSSolution 112429 to upgrade only Batch.21 products. You will have to select ADSA as well as Batch.21 to get Batch.21 to version 6.0.1. You will not need to select any other AMS software layered products to get to version 6.0.1.
You will not be required to upgrade the snapshot using the upgrade wizard or upgrade the Batch.21 database using the Database Upgrade wizard.
Reboot the computer.
From software version 6.0.1 to software version 2004:
Make sure the BCU server is turned off.
All services associated with InfoPlus.21 software are turned off.
Remove check box on the InfoPlus.21 manager that allows InfoPlus.21 to automatically startup on reboot.
Install all AMS version 2004 required software products.
Reboot the computer.
Upgrade the snapshot using the upgrade wizard in the IP21 manager.
Upgrade the Batch.21 database using the Database Upgrade wizard.
InfoPlus.21, Process Explorer, Batch.21, SQL+ and Aspencalc should now be at level 2004 from version 4.1. Verify that all software layered products are working, all services are started and data is going into history.
Plan on this upgrade process taking two full days to perform.
Keywords: Batch.21
InfoPlus.21
References: None |
Problem Statement: When attempting to access the Aspen Batch.21 Application Programming Interface (API) from the Aspen SQLplus Query Writer, type library errors may appear. The errors can be of two forms.
1) From the SQLplus Query Writer tool, when a user tries to execute the code which accesses the Batch.21 API, the following error appears.
Type Library AtBatch21BCUApplicationInterface not found at line 31
2) From the SQLplus Query Writer tool, when a user goes to View | Object Browser, and attempts to expand the Aspen Batch.21 Application Interface, the following error appears.
Error executing method TypeLibInfoFromRegistry: No matching typelib is registered, or the registered information is invalid at line 1 | Solution: AspenTech periodically updates type libraries. As a result, it may be necessary to delete and re-add the references to the Batch objects (BCU or Batch.21 API) using View | References... in the SQLplus Query Writer.
Additional Information
COM objects are updated with each new release of Aspen Batch.21. At the same time, DLL and type library versions are also updated. SQLplus can reference the Batch.21 objects via the type library reference, hence the version number dependency.
When upgrading AspenTech software, the installer removes the old Aspen Batch.21 API and installs the new one. Because the reference in SQLplus refers to a specific type library version, the existing link is broken. To resolve the problem, it is necessary to reference the updated type library within SQLplus. The interfaces remain the same and no code changes are required.
SQLplus does have a CreateObject call that can create COM objects without reference to a type library. However, this method is not recommended because using the type library reference is more efficient and can access more methods.
Keywords: method
type
library
References: None
Problem Statement: Periodic Aspen PIMS, or PPIMS, is included with the standard Aspen PIMS license starting with the release of version 2006. Since many users are not experienced with PPIMS and its requirements, this is a review of how to convert a nonperiodic model into a periodic model. | Solution: There are two basic requirements for a periodic Aspen PIMS model: Table PERIODS and Table PINV. Below are the steps to change the model type and define these tables.
1) Start Aspen PIMS and open the model to be converted.
2) Right-click on the name of the model on the model tree. Select Model Type, then Periodic.
3) A new branch will appear on the model tree called PERIODIC. Open this to the PERIODS table and right click to ADD a new PERIODS table. An example of Table PERIODS is below. Column A defines the single character designation for each period, Column TEXT defines the period's name and Column LEN defines the period length in the time units of the model (usually days). For detailed information on Table PERIODS, please see the Aspen PIMS Help system on this topic.
* TABLE
PERIODS
* Time Period Definitions
           TEXT      LEN
*
1          JAN       31
2          FEB       28
3          MAR       31
4) Right click the PINV branch and add your new PINV table. An example of Table PINV is below. For detailed information about table PINV, please refer to the Aspen PIMS Help system. Remember that even if you do not want to model inventory, this table is required and must contain at least one stream name. You can create the table and enter a dummy stream name in column A of the table. For more information about modeling inventory and how to set up your desired inventory policy, please refer to Solution #113149.
* TABLE
PINV
* Periodic Inventories, '000 bbls
           TEXT                 OPEN     MIN      MIN3     TARG     MAX
URG        Unleaded Regular     70.0     20.0     60.0     60.0     200.0
UPR        Unleaded Premium     40.0     15.0     40.0     45.0     150.0
5) Now that your model is configured as a periodic model, you will notice that the reports are different for periodic models. Instead of the FullSolution report, periodic Aspen PIMS generates ACROSS and DOWN reports. The ACROSS report displays the periods beside one another with less detail than the DOWN report. The DOWN report has a similar format as the FullSolution report, but each section repeats for subsequent periods.
Keywords: None
References: None |
Problem Statement: After an Aspen Process Explorer installation, errors appear when attempting to use the PE Add-ins in Excel: Error creating aspen tag control.obj | Solution: Run the file TagUpdate.reg (in C:\AspenTech\Desktop) by double-clicking on it.
Keywords: tagcontrol.obj
References: None |
Problem Statement: By design, the BCU can only be connected to a single InfoPlus.21 database at a time. But what if there is Process Data on other InfoPlus.21 systems the BCU implementer would like to access as part of the configuration? | Solution: By using the On-Demand Calculation functionality built into Process Data (IP.21), during BCU implementation tags from other InfoPlus.21 systems can be accessed. For example, while sourcing a BCU characteristic to a tag, define the tag as follows:
=(MyDataSourceName:MyTagName)
where MyDataSourceName is the IP.21 datasource name defined in the ADSA, and MyTagName is a standard tag.
IMPORTANT NOTE ONE: There is much more overhead in resolving an On-Demand tag reference than a standard tag reference from the default datasource for the BCU. This work-around method should only be used as part of an implementation to add a small number of tags from additional datasource(s).
IMPORTANT NOTE TWO: An error may result if the mapping record is specified for an On Demand reference, like:
=(ATCAI*3/20)
Typically, in the Batch Administrator, if a specific map record is desired, it can be specified. However, if the tag (or alias) reference is changed to an On Demand calculation, do not specify the mapping record. Let Batch.21 resolve the reference itself. This is necessary because an On Demand calculation may reference tags from different definition families, so specifying a map record for the On Demand Calculation would prevent it from using whatever map records it needs to for all the tags involved in the calculation.
Keywords:
References: None |
Problem Statement: Our refinery exports naphtha. The contract specifies an absolute minimum NPA (volume % naphthenics plus aromatics) of 40, but we have to pay a penalty if the NPA is less than 47. How can we model this in Aspen PIMS? | Solution: This is a problem with two minimum specifications for the same quality:
An absolute minimum specification. This will be modeled in the customary way in Table BLNSPEC.
A soft specification: the product quality can be below this specification, but the seller will have to pay a penalty for doing so. This will be modeled with structure added to Table ROWS.
Example Problem Setup:
We will demonstrate the technique with an example model. The product tag is XNP, for export naphtha. XNP is a blended product that appears in Table SELL. The figures below show the definition of this product in Tables BLENDS and SELL.
Product XNP is blended from three naphtha streams: NA1, NA2, and NA3. This is shown in the excerpt from Table BLNMIX, below.
In this example, these streams are purchased. However, they could just as well be produced in the refinery. The figures below show the definitions of these materials in Tables BUY and BLNPROP.
Modeling the Specifications and Penalty:
The minimum NPA specification of 40 is modeled in Table BLNSPEC, as shown below.
The soft specification is modeled in Table ROWS as shown below.
Row NNPAXNP is the minimum NPA specification constraint for product XNP. The entry of 1 under column SLACK causes Aspen PIMS to change the sense of the row to equality and to implement a slack variable, also called NNPAXNP. The activity of the slack variable will be the volume of the blend multiplied by the number of NPA points by which the blend exceeds the specification.
Row GPENXNP calculates the penalty. Column BVBLXNP represents the total volume of the blend. The coefficient under this column is -1 * (Threshold NPA for penalty - Absolute minimum NPA limit) = -1 * (47 - 40) = -7. Column UXNPPEN represents the number of NPA-barrels requiring penalty. Because the sense of this row is greater-than-or-equal-to, UXNPPEN will take on a non-zero activity only if the NPA of the blend is below the threshold value.
UBALNPP models the consumption of a pseudo-utility that represents the penalty for the NPA of the blend below the threshold value.
Utility NPP is purchased via Table UTILBUY, shown below. The table shows a penalty cost of $0.10 per NPA-barrel.
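The arithmetic implemented by rows NNPAXNP and GPENXNP can be checked outside the LP with a short sketch (Python here for illustration; the function names are ours, not PIMS syntax):

```python
# Sketch of the penalty arithmetic behind rows NNPAXNP and GPENXNP.
# slack   = activity of slack variable NNPAXNP = volume * (NPA - 40)
# UXNPPEN >= 7 * volume - slack = volume * (47 - NPA), floored at zero by the LP.
def penalty_barrels(volume, npa, abs_min=40.0, threshold=47.0):
    """NPA-barrels subject to penalty for a blend of given volume and NPA."""
    slack = volume * (npa - abs_min)  # NPA-barrels above the hard minimum
    return max(0.0, (threshold - abs_min) * volume - slack)

def penalty_cost(volume, npa, cost_per_npa_bbl=0.10):
    """Penalty payment, modeled in the LP as a purchase of pseudo-utility NPP."""
    return cost_per_npa_bbl * penalty_barrels(volume, npa)

# A 1000-bbl blend at NPA = 42 is 5 points below the 47 threshold:
print(penalty_barrels(1000.0, 42.0))  # 5000.0 NPA-bbl
print(penalty_cost(1000.0, 42.0))     # 500.0 dollars
```

A blend at or above NPA 47 yields zero penalty barrels, matching the greater-than-or-equal-to sense of row GPENXNP.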
Results:
The attached model, Quality Penalties Example.zip, demonstrates the technique and has cases that correspond to several penalty costs. For example, if the penalty cost is $0.10 per NPA-barrel, the NPA of the product is 42, which will require payment of a penalty.
The penalty payment is shown as a utility purchase.
Keywords: None
References: None |
Problem Statement: In the BCU, under the Server dropdown menu, there is an option called Update Batch Configuration. What does this do? | Solution: Perhaps the most precise labelling for this option would be Refresh Batch Configuration; Update could be understood that information is being sent to the server.
In more detail, if you have the Batch Administrator and the BCU Administrator running and have pre-existing characteristics A and B, and add a new third one X in the Batch Administrator, you will only see two in the dropdowns in the BCU Admin unless/until you select the Update menu item.
Note, however, that you could manually type in the new characteristic name X in the BCU Admin without doing an Update, and when you tried to Verify or Install your unit, it would succeed. (The BCU server updates its copy of the batch configuration before starting the Verify or Install.)
Note also that installed BCU units do not have their configuration re-checked when the BCU server is restarted, so if you have modified or deleted characteristics A or B, a BCU unit will continue processing until it actually tries to record one of them, when it will encounter an error from the Batch server that the characteristic does not exist. This would put the BCU unit into a failed state in the BCU Scheduling Table display.
A good troubleshooting idea at that point is to delete and re-add the unit, since the enhanced error checking built into the BCU starting with Version 6 will give you more information to help understand why the failure occurred.
Keywords:
References: None |
Problem Statement: Using BatchConnect, event journal files created from Rockwell RSBatch are processed. Occasionally, an event journal will be created from a batch that never started. These journals create errors which will repeat indefinitely.
What manual steps are required to stop processing these certain event journal files?
If I delete the OpenBatchScanner_1.mem and OBInterface.mem files, how will the BatchConnect know to NOT re-process the event journals from the journal_directory? | Solution: Answer to Question #1:
To determine which journal file is causing the errors, go to the Logs directory (..\Program Files\AspenTech\Batch.21\Data\BatchConnect\Logs) and proceed as follows:
Turn ON debug by editing BatchConnectOpenBatch.def
View the log file that corresponds to the error file where the errors are being noticed.
Restart the Batch Connect service.
Fix the problem.
Turn OFF debug by editing BatchConnectOpenBatch.def
Restart the Batch Connect service.
To get the scanner to quit processing a certain event journal file:
Stop the Batch Connect service (making sure it is not processing a journal file first)
If it exists, remove the event journal from ..\Program Files\AspenTech\Batch.21\Data\BatchConnect\Data.
Move the event journal from the location specified by the journal_directory keyword in the OpenBatchScanner_N.def file.
Start the Batch Connect service
Answer to Question #2:
The .mem file keeps track of the status of whether the event files are processed or currently being processed. The event files left in the directory will be reprocessed if the .mem file is deleted. This is a last resort since there is no processing of the evt file and it is better to correct the evt file so it can be processed normally.
Keywords:
References: None |
Problem Statement: Is it possible to localize Batch.21 to languages other than those provided with Service Pack 1 for version 5.0? | Solution: The attached kit allows users to localize Batch.21 software and help files to languages other than the ones provided by AspenTech.
Keywords: language
translate
local
localize
international
References: None |
Problem Statement: How to set up a pass-through query from Microsoft Access Visual Basic to the pseudo HISTORY table. | Solution: First, the Microsoft ActiveX Data Objects 2.0 (or higher) Library needs to be loaded.
FYI: The MSDN Online Library is a valuable reference for all types of Microsoft software problems. http://msdn.microsoft.com/library/
Open the Visual Basic Code which uses the reference.
Select Tools > References
Find the reference on the list or find it using Browse
Click OK
Second, the Visual Basic code is written. It is shown below. Please note the string manipulation required for the returned timestamp.

Sub Get_Production_Data()
On Error GoTo Err_Get_Production_Data

    'Valid string format temp variable
    Dim tempdate As String
    'Position of "." in Date/Time data historian string
    Dim dotpos As Long

    'Set up ODBC connection
    Dim adoCon As New ADODB.Connection
    Dim adoRecords As ADODB.Recordset
    adoCon.Open ("saslk007 data historian")

    'OEM database
    Dim OEM As Database
    Set OEM = CurrentDb

    'Tab-FW recordset
    Dim rst_tabfw As Recordset
    Set rst_tabfw = OEM.OpenRecordset("Tab-FW")
    rst_tabfw.MoveLast

    'Recordset for Access table
    Dim rst_Weekly26Rate As Recordset
    Set rst_Weekly26Rate = OEM.OpenRecordset("Tab-Weekly26Rate")

    'Query name to delete old data
    Dim stDocName As String

    'Production tag name
    Dim prodtag As String
    prodtag = "TOT26PROD"

    'SQL string to pass
    Dim sqlstr As String

    'DoCmd.SetWarnings False

    'Delete old data
    stDocName = "Qry-NewWeekRateReady"
    DoCmd.OpenQuery stDocName, acNormal, acEdit

    'Set up sqlstr
    'Start and end times come from Tab-StartEndDates table
    'Collect hourly interpolated data from the pseudo HISTORY table
    sqlstr = "SELECT TS, VALUE AS RATE FROM HISTORY WHERE NAME = '" & _
        prodtag & _
        "' AND TS BETWEEN '" & _
        Format(rst_tabfw!StartDate, "dd-mmm-yy hh:nn") & _
        "' AND '" & _
        Format(rst_tabfw!EndDate, "dd-mmm-yy hh:nn") & _
        "' AND PERIOD = 1:00"

    'Retrieve new data
    Set adoRecords = adoCon.Execute(sqlstr)

    'Put data into table
    Do While Not adoRecords.EOF
        rst_Weekly26Rate.AddNew
        'The format of the date coming back from the data historian
        'is 10-JAN-00 00:00:00.0 and the CDate function does not
        'interpret the .0 part of the time, so it must be removed.
        'The CDate function is required to convert the string expression
        'into a valid date expression to store in the table.
        dotpos = InStr(1, adoRecords!TS, ".")
        tempdate = Left(adoRecords!TS, dotpos - 1)
        rst_Weekly26Rate!DateTime = CDate(tempdate)
        rst_Weekly26Rate!Rate = adoRecords!Rate
        rst_Weekly26Rate.Update
        adoRecords.MoveNext
    Loop

    rst_Weekly26Rate.Close
    adoRecords.Close
    rst_tabfw.Close

Exit_Get_Production_Data:
    Exit Sub

Err_Get_Production_Data:
    MsgBox Err.Description
    Resume Exit_Get_Production_Data
End Sub
Problem Statement: Getting #ERROR 50103: Bad attribute 'VAL' in Excel Add-In (Aspen -> Process Data -> Get Data -> Current Values) | Solution: First check that the history task (h21archive) is not paused, stopped, or crashed. Then check whether history has been turned off for that tag. If so, be sure to turn IP_ARCHIVING to ON, set the IP_REPOSITORY, and set IP_#_OF_TREND_VALUES to two or more.
Keywords:
References: None |
Problem Statement: Can Aspen Process Explorer display the timestamps in military time format? | Solution: Yes, military time can be displayed in Aspen Process Explorer on all Windows platforms.
On Windows 2000:
In the Control Panel, select Regional Settings.
Within Regional Settings select the Time tab.
In this tab, find the Time Style field and use the drop-down menu to select either H:mm:ss or HH:mm:ss. In general, a capital H indicates that the format will be in military time. Click OK. This will configure the time settings of the operating system (as well as Process Explorer) to display military time.
On XP and Windows 2003:
In the Control Panel, select Regional and Language Options.
Within Regional Settings select the Customize button to display the Customize Regional Options window.
Select the Time tab.
In this tab, find the Time format field and use the drop-down menu to select either H:mm:ss or HH:mm:ss. In general, a capital H indicates that the format will be in military time. Click OK. This will configure the time settings of the operating system (as well as Process Explorer) to display military time.
On Windows 7 and Windows 2008:
Click the Start button (usually in the lower left corner of the screen) and in the box that appears at the base of the Start menu type Region and Language. One of the choices that appears will indicate that it is a Region and Language Setting in the Control Panel. Select that one. On the dialog box that appears simply adjust the time settings as desired (or as described above). This will configure the time settings of the operating system (as well as Process Explorer) to display time in the chosen format.
Keywords: timestamp
References: None |
Problem Statement: Because there is no linking between levels at v2.x, the migration tool builds the linking automatically. It does this by adding every instance of the next lower level as a link for each instance. In other words, my new batch structure would include Fill, Mix, Heat1, Heat2 and Dump as steps in the Mix phase, in the React phase and in the Store phase. So my batch tree would look like this:
Batch
    Phase    Step
    Mix      Fill
    Mix      Mix
    Mix      Heat1
    Mix      Heat2
    Mix      Dump
    React    Fill
    React    Mix
    React    Heat1
    React    Heat2
    React    Dump
    Store    Fill
    Store    Mix
    Store    Heat1
    Store    Heat2
    Store    Dump
My migrated Batch.21 system has 15 steps defined instead of the 5 I had in version 2.x and the 6 that I want in version 3.x. This is not much of a problem with this simple example but it can be for larger batch processes. For example, take a process that has Batch, Lot, Operation, Phase and Step (assume that every instance at every level has a unique name) and a tree structure including:
Batch
    Level        Instances                        Total
    Lot          5 instances                      5
    Operation    5 instances for each Lot         25
    Phase        5 instances for each Operation   125
    Step         2 instances for each Phase       250
This structure would create 5 Lots and 125 Operations (5 x 25) because each lot would have all 25 Operations. It would create 15,625 Phases (5 x 25 x 125) because each of the 125 Operations would have all 125 Phases. And it would create 3,906,250 Steps (5 x 25 x 125 x 250) because each of the 15,625 Phases would have all 250 Steps. This creates 3,906,250 steps instead of the 250 steps desired. This happens because there is no way for the migration to tell which instances need to be linked to the level above. To make sure none are left out, they are all created.
Keywords:
References: None |
Problem Statement: Aspen Online 2006.5 fails to start up on Windows Server 2003 with Office 2003 | Solution: For the combination of Aspen Online 2006.5, Windows Server 2003, and MS Office 2003: if the Office 2003 patch level is SP2, Aspen Online may fail to start up. The Aspen Online Service runs under the system account, and the Microsoft known issue at http://support.microsoft.com/kb/912448 prevents it from working properly.
To fix, apply Office 2003 SP3 (http://support.microsoft.com/?kbid=923618). Please note that this link is from Microsoft technical support and may change in the future. You can also search for Office 2003 SP3.
Keywords: 2003 Server, Office 2003, Excel, Start, Fail, Fails
References: None |
Problem Statement: I have liquid in the pipe; why is the reported Mach number so high? | Solution: A large Mach number may be reported if you have liquid present and you use Compressible Gas as your property method. Select Calculations | Options from the main menu and change the property method from Compressible Gas to anything else (Peng Robinson for example) on the Methods tab.
Keywords: Mach, high, liquid, water
References: None |
Problem Statement: After a silent install, the Batch tools cannot be selected from within Process Explorer even when a Batch plot is being displayed | Solution: Manually create the following key:
[HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup]
BATCH21ROOTDir=C:\\Program Files\\AspenTech\\Batch.21\\
This known issue will be addressed in AMS Release 7.0.
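One way to create this value without opening the Registry Editor is to double-click a small .reg file with the following contents (a sketch; the path assumes a default installation):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup]
"BATCH21ROOTDir"="C:\\Program Files\\AspenTech\\Batch.21\\"
```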
Additionally, the following registry entries are not created by the Silent Install. These additional entries can be added manually also, but there is no documented loss of functionality if they are not added:
Here are the reg values needed:
[HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup]
ASPENBPEDIRDir=C:\\Program Files\\AspenTech\\BPE\\
ASPENROOT=C:\\Program Files\\AspenTech\\
ASPENCOMMON=C:\\Program Files\\Common Files\\AspenTech Shared\\
ASPENCOMMONDir=C:\\Program Files\\Common Files\\AspenTech Shared
IP21BASEdir=C:\\Program Files\\AspenTech\\InfoPlus.21\\
APEXROOTdir=C:\\Program Files\\AspenTech\\APEx\\
ASPENROOTdir=C:\\Program Files\\AspenTech\\
CALCROOTdir=C:\\Program Files\\AspenTech\\Aspen Calc\\
IP21ADMINROOTdir=C:\\Program Files\\AspenTech\\IP21Admin\\
GCSROOTdir=C:\\GCS33\\
DEFROOTdir=C:\\Program Files\\AspenTech\\Definition Editor\\
PROCESSDATAROOTdir=C:\\Program Files\\AspenTech\\ProcessData\\
Keywords: gray
grey
focus
disable
disabled
References: None |
Problem Statement: Is it possible to create multiple DLGPs when using CIM-IO for Setcim/InfoPlus-X/InfoPlus.21? | Solution: Yes. The following procedure can be used to create multiple DLGPs in version 7.5 of the interface.
Add services to the services file for the new DLGP's. The following example uses the range 10021 to 10024.
WINNT\system32\drivers\etc\services file
CIMIOSETCIM_10021 10021/tcp
CIMIOSETCIM_10022 10022/tcp
CIMIOSETCIM_10023 10023/tcp
CIMIOSETCIM_10024 10024/tcp
Add logical devices to the cimio_logical_devices.def file. The following format can be used.
cimio_logical_devices.def file
IOSETCIM1
machinename
CIMIOSETCIM_10021
IOSETCIM2
machinename
CIMIOSETCIM_10022
IOSETCIM3
machinename
CIMIOSETCIM_10023
IOSETCIM4
machinename
CIMIOSETCIM_10024
Add CIM-IO client tasks to the IP.21 Manager for each of the new devices. The client tasks start the cimio_setcim_dlgp.exe executable. In the example, four client tasks are necessary.
TSK_IP21_SERV1
set the executable to %CIMIOROOT%\io\cimio_setcim_dlgp.exe
set the command line parameter to -n 10021
TSK_IP21_SERV2
set the executable to %CIMIOROOT%\io\cimio_setcim_dlgp.exe
set the command line parameter to -n 10022
TSK_IP21_SERV3
set the executable to %CIMIOROOT%\io\cimio_setcim_dlgp.exe
set the command line parameter to -n 10023
TSK_IP21_SERV4
set the executable to %CIMIOROOT%\io\cimio_setcim_dlgp.exe
set the command line parameter to -n 10024
Start the new CIM-IO client tasks.
Test the new DLGP's using cimio_t_api.
Keywords: InfoPlus
InfoPlus.21
CIMIO
processes
References: None |
Problem Statement: For crudes where MIN = 0 in Table BUY and there is no activity at solution, a Marginal Value is expected because the variable is at the lower limit.
However, in many cases, this Marginal Value for this crude is missing (i.e. it is reported as 0).
If you force 1 bbl of crude instead of using a MIN=0, you get marginal values reported.
Why is it not showing the Marginal Values when Crude is at MIN=0, and what can you do to see this Marginal Value reported?
Solution
This situation is not a reporting problem as could be the first impression. We will analyze this situation based on the attached sample model, for crude BAC.
In case 1, for example, there is no marginal value (DJ) on PURCBAC (see FullSolution below), although the purchase variable is at its minimum. The cost of PURCBAC is $43.38.
By expecting that the purchase variable of the crude (PURCBAC for crude BAC) should show a DJ because it is at zero, we are assuming that the LP knows which variables are the independent ones. However, in this case, the purchase column is in the solution basis at zero activity. When a variable is in the basis, it has no DJ.
The DJ we are looking for is on the single column that processes that crude, SCD2BAC (which is related to the PURCBAC column through the material balance row, VBALBAC). This column is also at the MIN =0 (because by default all variables are bounded MIN = 0). See the matrix screen shot for details below.
That DJ is -0.9832, and it is the value we were looking for. The DJ just as easily could have been on the purchase vector, but because of the solution path taken by the solver, it isn't. SCD2BAC could have been in the basis with zero activity and the DJ shown on the purchase column.
You only have 1 degree of freedom here. If there were TWO SCD columns and ONE purchase column, there would be two marginal values on any two of those three columns - either the two processing columns or one processing column and the purchase column.
There is no way of directing that a marginal value appear on a specific column.
The best and quickest way to see the Marginal Values for the crudes left out of the solution is to force one bbl of crude, i.e. MIN = 0.001. In this case, the DJ of the crude will always appear on the Purchase vector and you can see it directly in the Purchases sector of the FullSolution report, as shown below. The DJ has the same value as was seen before on the SCD2BAC variable.
Keywords: Marginal Value
DJ
Bound
Bounds
References: None |
Problem Statement: Getting error setcim trend rqst failed with error -24 after creating a new unit. | Solution: One of the designator tags may have IP_ARCHIVING set to OFF. Often archiving doesn't get turned back on after renaming the tag.
Keywords: ip_archiving
historical data
BCU
References: None |
Problem Statement: When importing an Area XML from an older version of Aspen Batch.21 to a newer version of Batch.21 Administrator, the following error occurs:
B21BSC-50401: Line 1: (-1072896763) A name contained an invalid character. | Solution: In recent versions of Batch.21, the XML file must be imported using Unicode format. To change the XML file format, open the file using Notepad and select Save As. Specify the Unicode format under the Encoding dropdown. Once the file format has been changed, it can be imported into the latest versions of Batch.21.
Note: If the above suggestion does not resolve the issue, please review the list of invalid characters available in Solution 126889.
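Notepad's "Unicode" encoding is UTF-16 (little-endian, with a byte-order mark). If many area files need converting, the same re-save can be scripted; a sketch in Python (file names are hypothetical):

```python
def to_unicode(src_path, dst_path):
    """Re-save an exported Area XML file as UTF-16, the format Notepad
    calls "Unicode" and that recent Batch.21 versions expect on import."""
    with open(src_path, "r", encoding="utf-8") as src:
        text = src.read()
    with open(dst_path, "w", encoding="utf-16") as dst:
        dst.write(text)  # Python prepends the UTF-16 LE byte-order mark

# Example: to_unicode("Area1.xml", "Area1_unicode.xml")
```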
Keywords: None
References: None |
Problem Statement: What is the proper syntax for creating a Batch programatically using Visual C++? | Solution: The code attached to thisSolution comes with no warranty. It is meant to provide the programmer with a starting point for writing their own code.
Keywords:
References: None |
Problem Statement: What does Operation Outside Miller Chart.... mean? | Solution: Flow at junctions (particularly two phase flow) is extremely complex and there are many variables that can affect the pressure drop. Miller charts attempt to take these variables into account based on certain parameters; if a K value is outside the range covered by the Miller charts, then the default or maximum value is taken.
In general, it has been observed that with the maximum or default value, predictions are more accurate than with other methods.
Keywords: miller, chart, warning, outside
References: None |
Problem Statement: Export wizard is not exporting all pipe lengths to Excel using the customized definition file. | Solution: This problem is due to the customized definition file being set to export the pipe lengths to columns instead of rows.
A Microsoft Office Excel 2003 worksheet contains 65,536 rows and 256 columns, so data in cells outside this column and row limit is lost in earlier versions of Excel. In Excel 2007 the worksheet size is 16,384 columns and 1,048,576 rows, which avoids this data loss.
If using Excel 2003, set the definition file to export the data to rows when creating a customized definition file. This will ensure that all the data is exported.
Keywords: data, loss, missing, columns, definition, file, excel, limit
References: None |
Problem Statement: How do I change the number of decimal places in the flare tip pressure drop curve? | Solution: The decimal places in Aspen Flare System Analyzer can be changed on the Formatting page, which can be accessed from the File menu as
File || Preferences || Formatting
From the list of variables, select DpCurvePt to increase the number of decimal places for pressure drop in the FlareTip curve.
The number of decimal places for the pressure drop points can be changed in variable Static pressure drop.
This formatting option for the curve does not work in versions V7.3 and V8.0 (Defect reference CQ00471177). Users requiring this flexibility should upgrade the version to V8.2.
Keywords: FlareTip, Pressure Drop Curve, Decimal Places
References: None |
Problem Statement: Flow bleed calculates off take when a case has one flare tip but doesn't calculate off take when there are two tips | Solution: Aspen Flare System Analyzer currently does not support off take from flow bleeds in loop/split cases where the looped solver is a linear solver type, e.g. Newton-Raphson, which is the default solver type.
Off take for looped/split configurations (e.g. 2 flare tips) is only calculated using the Force Convergent solver type.
Keywords: Off take, Flow bleeds, Newton-Raphson, Force convergent, Loop
References: None |
Problem Statement: How do I select a length multiplier for several selected pipes in Aspen Flare System Analyzer? | Solution: The easiest way to accomplish this goal is to use the bulk pipe editing feature. This procedure allows us to globally replace or update certain pipe parameters for the whole network at one time.
Launch the Pipe Manager via the Pipes shortcut in the Build toolbar on the Home tab.
From the list of pipes displayed, highlight all the pipe names either by marking with the mouse and dragging or using the shift key and mouse to mark all the pipes. Click the Edit button to reveal the global Pipe Editor. Note that in the Pipe Editor, fields containing an asterisk can be updated to a value that can be applied to all the highlighted pipes.
Keywords: Flare System Analyzer, Editor, Global
References: None |
Problem Statement: How is vapor fraction defined in Aspen FLARENET? Why does it not match with the Aspen HYSYS result? | Solution: Vapour Fraction in Aspen FLARENET corresponds to the value HYSYS reports for the Vapour / Phase Fraction on the Worksheet | Conditions page of the stream view.
Other definitions of vapor fraction used in Aspen HYSYS are listed below:
Vap. Frac (molar basis) = Vapor Phase Mole Flow / Total Mole Flow (i.e. Aspen FLARENET definition)
Vap. Frac (mass basis) = Phase Mass Flow / Total Mass Flow
Vap. Frac. (Volume basis) = Phase Std Ideal Liq Vol Flow / Total Std Ideal Liq Vol Flow
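As a quick numeric illustration of the three bases (all flows below are invented for the example):

```python
# Invented phase flows for a single stream
vap_mole, liq_mole = 60.0, 40.0        # kmol/h
vap_mass, liq_mass = 1500.0, 3200.0    # kg/h
vap_vol,  liq_vol  = 3.0,   5.0        # std ideal liq m3/h

frac_molar = vap_mole / (vap_mole + liq_mole)  # 0.6 -- the Aspen FLARENET basis
frac_mass  = vap_mass / (vap_mass + liq_mass)  # ~0.319
frac_vol   = vap_vol  / (vap_vol  + liq_vol)   # 0.375

print(frac_molar, round(frac_mass, 3), frac_vol)
```

The three numbers differ for the same stream, which is why a FLARENET vapor fraction matches only the molar-basis value in HYSYS.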
Keywords: vapor, vapour, fraction, vapor fraction
References: None |
Problem Statement: My Flarenet case with many heavy components does not solve. What should I do? | Solution: In Aspen Flare System Analyzer, if the source data has many hypothetical components with very heavy molecular weight (MW), e.g. 180 or higher, the program may have difficulty solving the scenario. The recommended workarounds are:
1) Specify the source composition in MW rather than in mole/mass fraction.
By specifying MW directly, two components with adjacent MWs will be used which can facilitate the calculation.
2) If mole/mass fractions have to be used, then try to lump the heavy components to reduce the total component number. Trial and error can be used to determine the appropriate number for the specific case.
Keywords: Aspen Flare System Analyzer, Molecular Weight, Hypothetical Components
References: None |
Problem Statement: How does enthalpy balance consider kinetic energy across the valve? | Solution: The general energy balance equation for fluid systems:
dU + d(p/rho) + v*dv/gc + g*dz + dW + dq = 0
which accounts for internal energy, pressure energy, kinetic energy, potential energy, work done on the fluid, and heat flux.
Here, U is internal energy, J; p is pressure, Pa; rho is density, kg/m^3; v is velocity, m/s; g is gravity, m/s^2; gc is the gravitational conversion constant; z is elevation, m; W is work, J; q is heat duty, J.
Now consider, for simplicity, components with no storage and no heat or work added from the outside. The balance equation is then
dU + d(p/rho) + v*dv/gc + g*dz = 0
For fluid flowing through a valve, we assume no elevation change, no work, and no heat. The balance equation becomes:
dU + d(p/rho) + v*dv/gc = 0
Since enthalpy is defined as h = U + p/rho, we can substitute
dU = dh - d(p/rho)
so the energy balance reduces to:
dh + v*dv/gc = 0
As expected, it looks as if the kinetic energy (KE) is drawn from the enthalpy change. Strictly speaking, it is internal energy released by expansion, even though that term is eliminated mathematically.
In most cases the KE term is negligible (because of very low velocity), but in a flare system the velocity can be very high, close to Mach 1. Flarenet gives you the option to include this term under the Calculations | Options | General tab.
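A quick order-of-magnitude check of when the KE term matters (a sketch in SI units so gc = 1; the velocities are illustrative assumptions, not from any particular case):

```python
def kinetic_energy_change(v1, v2):
    """Specific kinetic-energy change, the v*dv term integrated from v1 to v2, J/kg (SI, gc = 1)."""
    return (v2**2 - v1**2) / 2.0

# Ordinary low-velocity pipe flow: negligible next to typical enthalpy changes
low = kinetic_energy_change(2.0, 10.0)     # 48 J/kg, i.e. ~0.05 kJ/kg

# Flare header approaching Mach 1 (~300 m/s for many gases): no longer negligible
high = kinetic_energy_change(10.0, 300.0)  # 44950 J/kg, i.e. ~45 kJ/kg

print(low, high)
```

This is why the option matters mainly for high-velocity flare networks and not for routine hydraulic calculations.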
Keywords: Kinetic energy, enthalpy balance, PSV, control valve
References: None |
Problem Statement: What are the equations used for rated flow for relief valve? | Solution: Aspen FLARENET uses API 520 sizing equation for relief valve.
The General sizing equation for API 520:
A = W*SQRT(T*Z/Mw) / (C*Kd*P1*Kb)
where:
A = valve orifice area (in^2)
W = mass flow rate (lbm/hr)
T = temperature (R)
Z = compressibility
Mw = molecular weight (lb/lbmol)
C = coefficient based on the ratio of Cp/Cv
Kd = discharge coefficient = 0.975
P1 = relieving pressure, e.g. MAWP x 121% for the fire case (psia)
Kb = back pressure correction based on the ratio of MABP (gauge) to MAWP (gauge)
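The sizing equation can be evaluated directly; a sketch in Python (the numbers in the example, including the C value, are illustrative assumptions, not from any standard case):

```python
import math

def api520_vapor_area(W, T, Z, Mw, C, P1, Kd=0.975, Kb=1.0):
    """API 520 vapor relief area A (in^2) per the equation above.
    W in lbm/hr, T in deg R, Mw in lb/lbmol, P1 in psia, C from API 520 tables."""
    return W * math.sqrt(T * Z / Mw) / (C * Kd * P1 * Kb)

# Illustrative only: 50,000 lb/hr of an Mw-44 vapor at 560 R, Z = 0.9,
# an assumed C = 335, relieving at 264.7 psia, no backpressure correction.
A = api520_vapor_area(W=50000.0, T=560.0, Z=0.9, Mw=44.0, C=335.0, P1=264.7)
print(round(A, 2))  # about 1.96 in^2
```

The computed area would then be compared against standard orifice designations to pick a valve size.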
Keywords: equation, size, sizing, API
References: None |
Problem Statement: I have trouble lining up the objects in the PFD. Is there any button that helps in lining them up? | Solution: If you press the Toggle Grid Display button on the PFD toolbar, you can line up the pipe segments easily. This button toggles the grid on and off. When the grid is on, it is easy to line up the pipe segments.
Keywords: None
References: None |
Problem Statement: Is the Copy/Paste function available in Aspen Flare System Analyzer V7.3? How do I use it? | Solution: Yes, the Copy/Paste function is available from Aspen Flare System Analyzer V7.3. Select the branches using the mouse, right-click, and select the Copy command in the menu. Then move your mouse to empty space, right-click, and click Paste.
Or you may use the clipboard on Home tab of Ribbon.
Keywords: Copy/Paste, clipboard, V7.3
References: None |
Problem Statement: How do I incorporate heat gain from solar radiation in Aspen Flare System Analyzer? | Solution: In Aspen Flare System Analyzer, external radiative HTC is used to model the pipe surface and surrounding temperature. Aspen Flare System Analyzer can not model heat gain from solar radiation.
However, if you only need to model the heating effect of the Sun, you can use the heating option on the Pipe | Heat Transfer tab and give a heat duty for each pipe receiving solar radiation. When you specify a heat duty, you will not be able to enable heat transfer under Calculations | Options.
Keywords: solar radiation, heat transfer, heat duty.
References: None |
Problem Statement: How to include Therminol-55 in Aspen Flare System Analyzer | Solution: This component is not available in Aspen Flare System Analyzer (AFSA) database. You can import it from the Aspen HYSYS database.
Follow these steps to import Therminol-55 in AFSA from HYSYS:
1. Open a New HYSYS case and add Therminol-55 as a component. You can find Therminol-55 as THEOL-55.
2. Add Peng Robinson as a fluid package.
3. Go to the Simulation environment and create a material stream. Define the material stream conditions and composition.
4. Save the HYSYS case and close the program.
5. Open a new Aspen Flare System Analyzer case, then go to File | Import Sources | HYSYS Stream Sources…
6. Click on Browse to select the HYSYS case just created and click on Open. Remember to keep the selection "Use HYSYS Components" under Component Data. Finally click OK.
7. At this point, the component should be added to your AFSA case. You can confirm by checking the component list under Home | Components.
Note: If you want to include this component in an existing AFSA case, you need to merge both files. For further details, please review Solutions 143238 and 119474.
Keywords: Therminol-55, THEOL-55, Component, Import, HYSYS, Database
References: None |