Problem Statement: Is TSK_MKOB still needed in current versions of InfoPlus.21? | Solution: No, the TSK_MKOB is an obsolete task that has not been used since version 2.5.1 of InfoPlus.21. Therefore, if this task is still listed in the Defined Tasks list in the InfoPlus.21 Manager, it should be set to Skip during startup.
To do this, double-click on TSK_MKOB in the list of Defined Tasks, check the Skip during startup box, and click the Update button.
Keywords:
References: None |
Problem Statement: In the very early versions of Aspen InfoPlus.21, there was a requirement that the Windows directories holding the Historian Repositories and Filesets should be set as shared folders.
Now that this is no longer a restriction, are there any recommendations? | Solution: Testing in preparation for the release of v2006 has shown that it is better NOT to make these directories Windows shares.
This was discovered during performance testing and was attributed to a small Microsoft overhead that causes a slight slowdown when accessing shared folders.
The only disadvantage of not using Shared folders is that History management, such as adding more repositories or filesets, must be done on the InfoPlus.21 Server itself.
Keywords:
References: None |
Problem Statement: Generally, the RSLinx tagname specification is a string containing 8 parameters separated by commas. The tag is passed directly to RSLinx for processing, so the format of the tag is the same as in the RSLinx DTL_C_DEFINE function. It is not uncommon for users to encounter difficulty in determining the correct parameters necessary for the tagname. Information on the tag format can be found at the following locations:
CIMIO for RSLinx User's Manual
RSLinx documentation on the DTL_C_DEFINE function
the Rockwell customer support website:
http://support.software.rockwell.com/supportlibrary/
The CIMIO for RSLinx User's Manual contains example tagnames for several different hardware configurations. However, there are no tagname examples for some newer devices such as Power Monitor based on SLC503+ communication.
This knowledge base article provides such an example. | Solution:
The RSLinx string has the following format (described in detail in the CIM-IO for RSLinx Interface User's Manual):
Data Address, Elements, Data type, Access, Port ID, Station, Proc Type, DriverID
The following tagname syntax should work for Power Monitor based on SLC503+ communication:
F15:4,1,FLOAT,READ,AB:LOCAL,0,SLC500,0
NOTE: Please double-check the position of the AB_ETH-2 driver in your driver list. The driver ID shown above is 0, which represents the first entry in the list of drivers seen by the RSLinx software.
Keywords:
References: None |
Problem Statement: With Aspen Cim-IO for RSLinx it is possible to configure aliases for the records, but sometimes an end user wishes to know what these aliases really mean. | Solution: In order to get the alias information you must run a utility called CIMIO_RSlinx.exe, which is located in:
C:\Program Files\AspenTech\CIM-IO\io\cio_abd_rsl
You will see a GUI called Alias Definition, as shown in the figure below, where you can find the alias information.
Keywords: alias
Record alias
RSlinx
References: None |
Problem Statement: When writing a value of a text tag to a PLC using CIM-IO for RSLinx and a PUT record, a change in the length of the string is not recognized. For instance, if you start out by sending ABCDJFK, the PLC gets a length of 7 characters; but if in the next pass you send OPRS, you still see on the PLC a length of 7 and a value of OPRSJFK - keeping the last three characters of the previous send. The question then becomes: how do you get a PUT record character string length shorter than a previous PUT record to be recognized when writing a value of a text tag to a PLC? | Solution: This problem can be resolved by cycling the PUT record OFF/ON.
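If the record cannot conveniently be toggled by hand, the OFF/ON cycle can be scripted in Aspen SQLplus. This is a minimal sketch only; it assumes a transfer record named MyPutRec and uses the standard IO_RECORD_PROCESSING on/off switch carried by records defined by IoPutDef:
UPDATE MyPutRec SET io_record_processing = 'OFF'; -- turn the PUT record off (record name is hypothetical)
UPDATE MyPutRec SET io_record_processing = 'ON'; -- turn it back on so the new, shorter string length is recognized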
Keywords: IoPutDef Record
String
Character
PUT
References: None |
Problem Statement: Even though most InfoPlus.21 servers are located in secure locations, some users may still have access to the server. In such situations, it will be wise to secure your InfoPlus.21 server from users making unauthorized changes which are detrimental to the proper functioning of InfoPlus.21 (accidental or otherwise). Examples of such crippling changes are:
-- Changing the system date, even without applying the changes. The changed system time will affect InfoPlus.21 immediately. If data with future timestamps get written into history, the historian will not function properly.
-- Shutting down the server from the Start menu. Users may not know that InfoPlus.21 should be properly shut down first before shutting down the server, or any recent changes made to the database will not be saved.
To prevent any of these from happening, you can use local policies (NT and Windows 2000) to secure your InfoPlus.21 server from unauthorized changes. | Solution: To access local policy settings for:
Win NT servers - go to Start | Programs | Administrative Tools | User Manager | Policies.
Win 2000 servers - go to Start | Programs | Administrative Tools | Local Security Policy, or Control Panel | Administrative Tools | Local Security Policy.
Under Local Policies | User Rights Assignment, locate Change the system time and Shut down the system. Double-click on these policies and uncheck the check boxes for all groups and individual accounts, except for the Administrator group and/or individual Administrator accounts.
Care must be taken if you wish to make changes to other local policies. In general, Administrators should also have the rights for the following policies: Access this computer from the network, Log on as a service, and Log on locally.
Some policies cannot be implemented from the local policies settings. Some of these desirable security features include:
-- Restricting access to the system registry for non-Administrator accounts. Unauthorized changes made to registry settings may cause irreparable damage to system software.
-- Preventing non-Administrator accounts from accessing the System panel in the Control Panel. From the System panel, users can make changes to environment variables, which may cause InfoPlus.21 to crash if it cannot locate certain key files.
-- Locking out all other accounts except Administrators from the InfoPlus.21 server.
These domain level policies may be implemented from the domain controller, either via Group policies or as Profiles. You may need the assistance of your IT department to do this. Also note that domain level policies will override local policies.
Therefore, it is important to coordinate security efforts with your IT department. Having good domain level and local policies are important in securing your InfoPlus.21 server from unauthorized changes.
Keywords: policy
security
References: None |
Problem Statement: What is the purpose of the IP_STEPPED field in records defined by IP_AnalogDef and IP_DiscreteDef ? | Solution: The purpose of the IP_STEPPED field is to specify the type of data compression used to record incoming values.
The standard form of data compression has always been the Boxcar with Backslope algorithm. This algorithm is ideal for analog type data. (See Solution 103491 for an understanding of how this algorithm is calculated.) However, for discrete or digital type data, this is not necessarily the case. For these types of data, Boxcar type data compression (or Stepped compression) is used.
By default, the IP_STEPPED field is set to INTERPOLATED. When the field is set to STEPPED, the stepped compression logic, Boxcar, is enabled. With this type of compression, whenever a new incoming value exceeds the deviation limit (IP_DC_SIGNIFICANCE) or the maximum time interval (IP_DC_MAX_TIME_INT), the system records the new value with the new timestamp into history. That is, instead of creating a slope between points, Stepped compression creates a stair-step pattern between values. In other words, when IP_STEPPED is set to STEPPED, the STEPPED compression occurs INSTEAD of the regular Boxcar with Backslope compression.
It is important to understand the difference between STEPPED and INTERPOLATED. Stepped compression was designed to be used with discrete or digital type data. It is possible, however, to use BOXCAR or Stepped compression with analog type data by simply changing INTERPOLATED to STEPPED in the IP_STEPPED field.
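For example, switching an existing analog tag between the two compression modes can be done with a one-line Aspen SQLplus update (the tag name below is hypothetical):
UPDATE ip_analogdef SET ip_stepped = 'STEPPED' WHERE name = 'MYTAG01'; -- enable Boxcar (stepped) compression
UPDATE ip_analogdef SET ip_stepped = 'INTERPOLATED' WHERE name = 'MYTAG01'; -- revert to Boxcar with Backslope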
Keywords: boxcar
stepped
interpolated
data compression
ip_stepped
References: None |
Problem Statement: AspenTech provides a local API to allow customers to write their own C/C++ or even Fortran programs incorporating the standard AspenTech function calls used by all of our own software. We have had several inquiries on connecting to multiple databases when running such a home-written program. This document attempts to describe your options for connecting to:
1. A single database
2. Multiple databases at the same time
3. Making run-time decisions on switching from one database to another. | Solution:
Keywords: multiple servers
inisetc
endsetc
setcimrpc.cfg
References: None |
Problem Statement: This knowledge base article provides a template to troubleshoot the following error which can be returned from the Aspen InfoPlus.21 API function call, RHIS21DATA()
ERRCODE = -33
Fields are not in same repeat area | Solution: RHIS21DATA() is a function which retrieves historical data from an Aspen InfoPlus.21 record's repeat area. The aforementioned error can be returned if:
-- The history repeat area field referenced within the custom program is not referenced properly (not spelled correctly, etc.)
-- A fixed area field is specified within the custom program instead of a repeat area field
Keywords: Application
Customized
API
References: None |
Problem Statement: Can I connect a version 3.x (or higher) Aspen Calc client to an IP.21 / Aspen Calc server version 2.5.1 database? | Solution: If the Aspen Calc client and the InfoPlus.21 / Aspen Calc server are on different machines, this will work. Follow the Aspen Calc installation manual's directions on the DCOM configuration. Be sure to make the DCOM changes to BOTH the server and the IP.21 machine. This will allow you to create the calculations and formulas on the client and store them on the IP.21 / Aspen Calc server machine.
Keywords:
References: None |
Problem Statement: The error message Please enter a valid UNC pathname \\nodename\folder\folder is seen on a client PC when using the Administrator to view the repository properties and the path contains a drive letter instead of a UNC pathname. If this drive is not mapped on the client PC and the user clicks the OK button to exit, the message is displayed. | Solution: To avoid this error message, click Cancel when on a client PC.
Keywords: administrator
property
repositories
References: None |
Problem Statement: Is it normal to have File Sets that are over 100% full? | Solution: This is a valid scenario where fs_percentage_used is greater than 100%. This can be caused when there is a file set shift at 95% full, followed by a manual/programmatic insertion of older data which falls into the fileset's time boundaries.
There is no harm from having a fileset which is greater than 100% full, as long as the archive file doesn't get larger than 2GB.
Keywords: 100%
File sets
References: None |
Problem Statement: When applying security on the IP.21 database using the IP.21 Administrator, the following message is received:
Unknown Error from IP.21 database component
or
Unknown Error = -200 | Solution: This is caused by a DCOM configuration issue.
To resolve, run dcomcnfg from Start | Run. On the Applications tab locate the Aspen.IP21.Security entry and double click. On the General tab, set the Authentication Level to Connect. Click OK, and then OK again.
Keywords: unknown error
-200
References: None |
Problem Statement: The No History in the Database message appears when attempting to view Control or Master recipe Edit History. | Solution: The message No History in the Database indicates that no edit history data was saved during the times specified in the History dialog. Edit history data is generated anytime a Control or Master recipe is edited so the likely cause of this message is that no edits occurred during the time range.
Edit history is stored in the Process Recipe database in the Change_Log table. The contents of the Change_Log table can be viewed in Enterprise Manager by expanding the database tables, right-clicking on the table, and selecting Open Table (SQL Server) or View Edit Contents (Oracle). The contents of the table can also be viewed by selecting Query and returning all columns.
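Because the edit history is held in an ordinary database table, it can also be examined with a plain query; a minimal sketch (the exact column set depends on your Process Recipe schema):
SELECT * FROM Change_Log; -- returns every recorded Control/Master recipe edit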
Keywords:
References: None |
Problem Statement: Is it possible to make a strategy into a .pdf file? | Solution: Though this is not officially supported, it is possible IF you have an Adobe PDF writer installed on the client machine. When this is done you have extra 'printers', a PDF Writer and a PDF Distiller. When you print something you can then choose to write it to a PDF file.
Keywords: pdf
References: None |
Problem Statement: This knowledge base article describes how to remove the text Group 200 on from the name of the database that appears in the Aspen InfoPlus.21 Administrator. | Solution: The name of the database when viewed from the Aspen InfoPlus.21 Administrator is taken directly from the name of the ADSA data source for the server to which you are connected. For the case where the name of the database is shown as Group 200 on XXXX (where XXXX is the name of the Aspen InfoPlus.21 server) the cause is due to the fact that either:
-- You do not yet have an ADSA data source configured for this server
-- Your ADSA data source does not contain the following two service components:
Aspen DA for IP.21
Aspen Process Data (IP.21)
When the Aspen InfoPlus.21 Administrator detects an Aspen InfoPlus.21 server on the local machine it will automatically add the local Aspen InfoPlus.21 server to the server tree while appending Group 200 on to the name of the server if either of the above is true.
Keywords: group200
References: None |
Problem Statement: The IP.21 Administrator tool running against an older version of an IP.21 database does not show CIM-IO details.
For example:
When you use the v6.0.1 IP.21 Administrator tool to look at your v4.x or v5.x IP.21 database, it's not possible to expand the Logical Devices beneath the CIM-IO area of the Administrator tool. It will show information for each of the records defined by IoDeviceRecDef, but nothing beneath these Logical Devices will display (such as the External tasks, or the GET/PUT records) | Solution: Here are the steps to resolve this problem.
Obtain a CIMIO.RLD file from an IP.21 v6.0 system.
Save a database snapshot on the IP.21 v5.0 system (just in case).
Delete the record CIMIOExtensionRecords from the IP.21 v5.0 system.
Then delete the record CIMIOExtRecDef from the IP.21 v5.0 system.
Recload the v6.0 CIMIO.RLD file into the IP.21 v5.0 system.
Start a v6.0 Administrator and verify that it can see the CIMIO records in the v5.0 database.
NOTE:
If you are connecting to a v4.1.2 database with a v6.0.1 IP.21 Administrator, then replace all references to v5.0 above, to v4.1.2
A copy of the CIMIO.RLD file from v6.0.1 system is attached for you to download
This fix will be rolled into the release accompanying AMS 7.0.
Keywords: IoDeviceRecDef
CIMIOExtensionRecords
CIMIOExtRecDef
References: None |
Problem Statement: What process is modeled in the InfoPlus.21 demo database simulation? | Solution: Liquid Raw materials are collected in three Raw Materials tanks, units ATC 101, 102, and 103.
Outlet valves from the three Raw Materials tanks allow liquid to flow into a Mixer unit (M201). This marks the beginning of a batch as far as Aspen Batch.21 is concerned.
The three raw materials are agitated in the Mixer unit.
An outlet valve from the Mixer Unit opens, admitting the product to a Reactor Unit (R301), which begins the React Stage.
When the product is in the Reactor Unit, it is subjected to two heat treatments.
At the end of the second heat treatment, a Vent valve in the Reactor opens to allow excess gases to escape. This concludes the React Stage.
An outlet valve from the Reactor opens, and the product is admitted to any one of three Storage Units (S401, S402, and S403). Each storage unit stores a different grade of product, which can be Regular, Super, or Premium. This is called the Dump Stage.
A batch has to run all the way through the system before another batch can begin.
A second mixer and reactor exist, but the simulation does not use them.
The simulation causes tags in the Aspen InfoPlus.21 database to acquire values as though the values were being transmitted from plant automation. This is achieved by a sequence of Aspen SQLplus scripts, each of which automatically triggers the next.
The Aspen InfoPlus.21 definition record IP_AnalogDef defines all of the tags used in the simulation.
Some of the tags, which are named according to a naming convention, store measured data such as levels, temperatures, pressures, flows, and valve positions.
Examples:
ATCL201 and ATCL301 store the levels in the Mixer and Reactor, respectively
ATCF201 and ATCF301 store the flows into the Mixer and Reactor, respectively
ATCPH101 stores the pH of the Raw material in Raw Tank 101
ATCTIC301 stores the measured temperature in the Reactor
ATCP301 stores the pressure in the reactor
ATCV401, ATCV402, and ATCV403 store the closed (0) or open (1) positions of the valves between the Reactor and the three storage units
Other tags store data from other sources.
Examples:
ATCBATCH stores the batch identification number
ATCPRODN stores the product identification number
ATCLOST stores the quantity of product lost
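As a quick check, all of the simulation tags described above can be listed with a simple Aspen SQLplus query that relies on the ATC naming convention:
SELECT name, ip_description FROM ip_analogdef WHERE name LIKE 'ATC%'; -- list the demo simulation tags and their descriptions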
Keywords: atc
demo
database
simulation
References: None |
Problem Statement: Unable to connect to the PI server. The cimio_t_api returns the error:
Status is 'Bad Tag'
Facility=39
Driver Status=39203
From the CIMIO for PI Interface user's Manual:
CIMIO_PI_PS_INV_POINT
Error # 39203
Error Message: Invalid Point Number
User Action: The specified point does not exist in PI or the DIOP does not have Point read permission for the point.
Verify the tag name and the tag's permissions. | Solution: Aside from the obvious solutions (i.e. the specified point does not exist in PI, or the DIOP does not have Point read permission for the point), another possible cause is that in a domain with multiple PI servers, the PI home node is not set up as the DEFAULT server in the PILOGIN.INI file.
Configure the PI home node as the default server as follows:
A portion of the PILOGIN.INI file...
^^^^^^^^^^^^^^^
[SERVICES]
PI1=PI
[DEFAULTS]
HELPFILE=f:\pi\library\pilogin.hlp
piserver=<pi home node name> ; This must be the name of the default server
; The user names for the servers above
pi1user=piadmin
PI2USER=piadmin
PI3USER=piadmin
PI4USER=piadmin
Keywords:
References: None |
Problem Statement: When starting IP.21, an error occurs for TSK_H21_INIT: ERROR: Cannot create shared memory block. | Solution: This has been seen by several customers who have uninstalled a previous version of IP.21 and then installed a later version. In each case, the customer used the uninstall procedure, but the problem is likely caused by not having everything removed completely. Therefore, the solution is to uninstall the old version, delete registry keys (HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech), delete or rename old files (C:\Program Files\AspenTech), then install the new version.
Keywords: startup
start-up
un-install
re-install
References: None |
Problem Statement: Can I pull IP.21 process data into a Microsoft Access database? | Solution: In order to pull data through the ODBC link from IP.21 into MS Access, all you need to do is make sure you have installed SQLPlus on the client. This installs and configures the ODBC driver needed by MS Access. It is necessary to install SQLPlus because Desktop ODBC is no longer available as a separately licensed component.
In Access, go to File --> Get External Data --> Import.
Select the Machine Data Sources tab and find the SQLPlus driver.
You will be presented with the list of tables (definition records) in IP.21.
You can get the historical information by selecting the definition record with the _1 behind it. The _1 represents the repeat area, then you can choose the ip_trend_value, ip_trend_time, status, etc.
NOTE: Keep in mind this pulls the entire table into Access. You would then use Access queries to limit and use the data efficiently.
Keywords: access
data
tables
References: None |
Problem Statement: This Knowledge Base article provides some general troubleshooting tips for the Aspen InfoPlus.21 history subsystem. | Solution: If you suspect a history problem, first check the list of running processes using the Windows Task Manager on the server running the Aspen InfoPlus.21 database. There should be one h21archive.exe process for each repository that exists. If you have three repositories, you should have three copies of the h21archive.exe process. If one or more is missing, this confirms a history problem.
Next, check the Queue State of each repository. Using the Aspen InfoPlus.21 Administrator, navigate to each repository, right-click on the repository name, and choose the Check Queue option.
In a correctly operating system, the queue state will be Normal. The other possibilities are Overflow Queue and Disk Overflow. Overflow Queue indicates that the memory buffer holding historical data that has not yet been placed in a fileset has filled up. This data is now buffering to a different location in memory - the Overflow Queue. Disk Overflow indicates that the Overflow Queue has reached its 16 KB limitation and now data is being buffered in a disk file called the EVENT.DAT. This event.dat file is a Windows file located in the same directory as the root files for the repository. If an event.dat file exists, there is definitely a history problem.
Important history troubleshooting information is in a log file named ERROR.LOG. There is a log file for each repository located in the same directory as the root files for the repository. Use either Notepad or WordPad to examine the history error log file. All history related activities are recorded in this log file: normal program startups/shutdowns, fileset shifts, and abnormal program crashes. Check the log file for the most recent entries to help diagnose the problem.
Some of the more common error messages include:
comp_mean() cache / key file mismatch error (See Solution 103767 for a complete description.)
save_cache() open failure (See Solution 104466 for a complete description.)
archive <archive #> corrupt, dismounted (This fileset is completely unreadable and needs to be deleted and a backup of the fileset restored.)
archive <archive #> key file open failure (See Solution 104575 for a complete description.)
After reading the log file and determining the root of the problem, make the necessary adjustments to the history files/configuration and restart InfoPlus.21.
If you have an error message that is not documented here or is not understandable, do not hesitate to contact AspenTech Support.
Keywords: error.log
queue state
h21archive
history
historian
References: None |
Problem Statement: How do I specify a repeat area field when running changeauditattribute.exe? | Solution: Add a 1 in front of the field name to specify the first repeat area. Example: 1 io_data_status
Keywords: changeauditattribute
repeat
References: None |
Problem Statement: How can you insert values with a specific timestamp in history, using VB?
Specifically, using ip_input_value writes data to the specified tag, but the timestamp supplied is ignored - no matter what time is put there, the value is assigned the current time as the timestamp. | Solution: Below is some sample code - be sure to include a reference to Aspen Process Data and Aspen Time Components. If the latter is not in the available references list, you can add it via the browse button (typically in C:\Program Files\Common Files\AspenTech Shared\Apex\AtTime.dll).
Dim AtPdDs As New AtProcessData.DataSources
Dim AtTime As New AtTime.AbsoluteTime
Dim myDS As AtProcessData.DataSource
Dim myTag As AtProcessData.Tag
Dim curTimeUTC As Date
Dim curValue As Double
Set myDS = AtPdDs.Item("localhost") 'name of data source
Set myTag = myDS.Tags.Add("TestTag") 'name of tag
AtTime.Parse CDate("12/01/01 11:44:22")
curTimeUTC = AtTime.ValueUTC
curValue = 98.6
'NOTE: WriteAttribute(Attribute As String, TimeUTC As Date, UseCurrentTime As Boolean, Value, Level As Long, Status As Long)
myTag.WriteAttribute "Val", curTimeUTC, False, curValue, 0, 0
myDS.Tags.RemoveAll
Solution #100949 describes how to insert values with a specific timestamp, using SQLplus.
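For comparison, the same write can be done directly in SQLplus as an ordinary history insert; a minimal sketch in the spirit of that solution (the tag name is hypothetical):
INSERT INTO TestTag (ip_trend_time, ip_trend_value) VALUES ('01-DEC-01 11:44:22', 98.6); -- writes a value with an explicit timestamp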
For more information, please see the Aspen Process Explorer OLE Automation Interface Manual and AtProcessData.hlp (in C:\Program Files\Common Files\AspenTech Shared\Apex).
Keywords: vb
visual basic
writeattribute
References: None |
Problem Statement: How do I resolve the IOGetDef / IOLongTagGetDef issue when trying to create new tags in Aspen Watch Maker? The error is:
Tag already exists | Solution: The issue is the architecture we have in place for IO Get records to prevent a mismatch between IOGetDef and IOLongTagGetDef records being accessed by the same IO-Group tag (IODEV1, IODEV2, etc.). We did not begin enforcing this rule until recently.
If you've been using the IOLongTagGetDef records since before this rule began being enforced, it may be that the offending IOGetDef record is: D-IOGet
This record is a required template record that is used by the database. It does not affect Aspen Watch record processing and can be safely ignored. This record is probably using IODEV1 in its IO_GROUP field. This is what is causing the problem. Our check for IOGet record type association is not filtering out this template record during the check.
To work around this problem, temporarily assign the D-IOGet record's IO_GROUP field to some other unused IODEVn group. Alternatively, you could create a dummy IO_GROUP entry and assign the record to that one (in the attached screenshot, a dummy group called DontUse was created).
For the long term, we will fix the query that we use to do this check and ignore D-IOGet.
You can run the following query on the system to identify the IO_GROUP with the GetDef and send to us if necessary.
select NAME, IO_GROUP from IoLongTagGetDef;
select NAME, IO_GROUP from IoGetDef;
Keywords:
References: None |
Problem Statement: A new custom definition record has been created to allow change-of-state (COS) activation via a COS field pointer. The creator intends to limit the range of trigger values which will activate the record by making the WAIT_FOR_COS_FIELD dependent on COS_RECOGNITION, basically using the functionality of a QueryDef record as a model.
The user expects that tags subsequently created from this new definition record would only activate with changes in the trigger record/field that are allowed by the value in the associated COS_RECOGNITION field; however, it is observed that the COS activation occurs with any change in the trigger record/field. | Solution: InfoPlus.21 does not recognize the COS_RECOGNITION field as anything other than a field with eight allowable values (and that, only because it was configured with the COS-OPTIONS selector record). InfoPlus.21 only knows activate or don't activate, as determined by the COS field pointer, scheduled, or manual activation.
It is the external task, activated by a change in the field pointed to by the COS pointer field, that recognizes and uses the values in COS_RECOGNITION to limit the further results of the record activation.
Take the example of a record defined by the standard QueryDef definition:
The record/field pointed to by the COS field pointer changes.
InfoPlus.21 sees this and triggers the external task (pointed to by the EXTERNAL_TASK_RECORD field) to run.
The external task, iqtask.exe, is specifically designed to go back to the query record in the IP.21 database and examine the COS_RECOGNITION field for the associated WAIT_FOR_COS_FIELD field that triggered it. If the trigger value is within the range allowed by COS_RECOGNITION, then the task continues to run -- it goes on to process the SQL script found in the repeat area #QUERY_LINES and place any output in #OUTPUT_LINES.
In summary, it is the external task, not InfoPlus.21 that actually uses the COS_RECOGNITION field in its functionality.
Keywords:
References: None |
Problem Statement: Is it possible to access tag attributes using Visual Basic? | Solution: The following code allows for an SQLplus call to be made to the database via Visual Basic. In this case, names for all tags beginning with atcl are selected. Similar calls can be made to return other tag attributes from the database. In the example the name of the InfoPlus.21 server is wesleyr1.
To run the code successfully, the ActiveX Data Object 2.5 Library and Aspen Tag Browser SQLplus references must be added to Visual Basic.
Private Sub Command1_Click()
Dim variable As ADODB.Recordset
Dim bla As ADODB.Record
Dim MyTagBrowser As New AtTagBrowserSQLplus.TagBrowser
mycon = MyTagBrowser.Connect("wesleyr1", "wesleyr1", 10014, , )
Set variable = MyTagBrowser.Query("select name from ip_analogdef where name like 'atcl%'", False)
test = variable.GetRows
For i = 0 To variable.RecordCount - 1
List1.AddItem (test(0, i))
Next
End Sub
Keywords: None
References: None |
Problem Statement: Data in InfoPlus.21 can sometimes appear as a flatline or show a stair-step pattern. A view of the data table will show the same value appearing with new timestamps at the same frequency as the get record is processed.
The cause of the pattern is that CIMIO does not recognize a control file shutdown or when the RNI closes a DDS session. This means that the RNI API was not updating the data, and the DIOP continued scanning the last updated value, thinking it was current.
Here is an example of what the data table might look like:
ip_trend_time         ip_trend_value   IP_TREND_QLEVEL   IP_TREND_QSTATUS
19-SEP-01 01:59:00.4  100.161          Good              Good
19-SEP-01 01:58:00.6  100.161          Good              Good
19-SEP-01 01:57:00.5  97.819           Good              Good
19-SEP-01 01:56:00.5  97.819           Good              Good
19-SEP-01 01:55:00.5  97.819           Good              Good
19-SEP-01 01:54:00.5  95.506           Good              Good
19-SEP-01 01:53:00.5  95.506           Good              Good
19-SEP-01 01:52:00.5  95.506           Good              Good
19-SEP-01 01:51:00.5  93.438           Good              Good
19-SEP-01 01:50:00.6  93.438           Good              Good
19-SEP-01 01:49:00.5  93.438           Good              Good
19-SEP-01 01:48:00.5  91.083           Good              Good
19-SEP-01 01:47:00.5  91.083           Good              Good
19-SEP-01 01:46:00.5  91.083           Good              Good
19-SEP-01 01:45:00.5  102.331          Good              Good
19-SEP-01 01:44:00.5  102.331          Good              Good
This tag was configured to be scanned every minute and the interface was configured to attempt to reestablish communications when no messages have been received after three minutes. This interface interprets a lack of messages to mean that none of the tags have changed. It continues to report the last known value to the CIMIO client. If the lack of messages lasts too long, the interface assumes a loss of communication with the RNI and attempts to reestablish it. So the pattern is that there is a loss of communications with the RNI which results in no messages received by the interface. The one minute scan returns the last value reported until the three minutes elapses and the interface attempts to reestablish communications. The interface then gets a new value but communications fail and the cycle repeats itself. | Solution: This is actually a problem with the configuration of the RNI. Increase the settings for block_scale_cache_size and point_scale_cache_size in the RNI's SB.CFG. Consult your RNI documentation or contact Fisher-Rosemount for further information.
Keywords:
References: None |
Problem Statement: History values are not propagating into history for records defined by IP_TextDef. | Solution: Here are several possible causes for this behavior.
1. IP_Archiving is set to OFF
2. The History repository is not running.
3. The records are configured for history compression yet the values are not changing by more than the DC_Significance value.
4. The IP_TextCompress record is not configured correctly.
If values for records defined by IP_AnalogDef and IP_DiscreteDef are propagating into history, the problem may be with IP_TextCompress, which is the processing record associated with IP_TextDef. This record must be configured properly in order for data to successfully propagate from IP_INPUT_VALUE to IP_VALUE and into history, whether or not compression is being used.
The correct configuration for the record IP_TextCompress is shown in the example screenshot that accompanies this article.
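Causes 1 and 2 above are quick to rule out; for example, the archiving switch on every text tag can be checked with a short Aspen SQLplus query (a sketch; adjust to your tag names):
SELECT name, ip_archiving FROM ip_textdef; -- IP_ARCHIVING should be ON for any tag expected to historize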
Keywords: IP_TextDef
Historian
History
IP_Input_Value
IP_Value
References: None |
Problem Statement: After application of Microsoft hotfix KB828035 to an NT4 system, the Aspen InfoPlus.21 tasks may not start and clients secured through Aspen FrameWork / Aspen Local Security may receive authentication errors when connecting to a secured Aspen InfoPlus.21 server. | Solution: This is due to a defect in Microsoft hotfix KB828035. This hotfix can cause group/role authentication not to work in environments which use an NT4 domain controller with a large number of NT4 clients. Microsoft has published a fix for this defect. Information regarding this fix can be found at:
http://support.microsoft.com/default.aspx?scid=kb;en-us;831579
Note: If you don't wish to apply Microsoft's fix for this problem, you can always:
Stop InfoPlus.21
Uninstall hotfix KB828035 through the 'Add or Remove Programs' utility.
Restart InfoPlus.21
Uninstalling KB828035 should resolve any problems caused by the hotfix.
Keywords: None
References: None |
Problem Statement: How to create an .RLD file for custom definition records that also saves additional records required by the custom record. | Solution: The InfoPlus.21 Manager Utilities application saves the selected records to the specified .RLD file. If a target database is specified, the application also saves records required by the custom record if the records are not already in the reference snapshot.
Open the IP21 Manager and go to Actions, Utilities, Records.
Add a file name and records to save in the .RLD file.
Click the Use Snapshot option and identify the snapshot to use as a reference. Note: The reference snapshot will need to be copied to the system where the utility is running.
Click Execute to create the .RLD file.
Keywords:
References: None |
Problem Statement: When entering a value into IP_INPUT_VALUE, duplicate records appear in trend history with the same value and time as the original entry. That is, rather than one occurrence in trend history, you receive two or three. | Solution: Either turn off compression or change IP_STEPPED from Stepped to Interpolated. See Solution 106625 for a description of this field's usage. For analog values, having IP_STEPPED set to INTERPOLATED (Boxcar with Backslope algorithm) is ideal. Having IP_STEPPED set to STEPPED (Boxcar algorithm) is typically used for discrete values. With the INTERPOLATED setting, occurrences are stored producing a linear representation of the data. With the STEPPED setting, Aspen InfoPlus.21 will store values to provide a stepped representation of the data. To do this, extra points are added around the actual data to provide the STEPPED effect. This is the cause of the additional points.
Keywords: IP_STEPPED
duplicate history
duplicate record
trend history
References: None |
Problem Statement: At almost the last part of an installation process of Aspen InfoPlus.21, there is an optional step called Upgrade. The Upgrade step allows you to take a snapshot from a previous version of the software, and upgrade it, so that it will work with the new version. There are two terms that upgrade uses:-
Source Database - This is a pointer to the snapshot that was saved from the previous, older version.
Target Database - This is a pointer to the snapshot that comes with the new version. The Source Database Snapshot is upgraded INTO the Target Database Snapshot.
Our installation manuals are very good and give explicit step-by-step procedures on how and what to answer for all questions. The questions that are usually the most confusing are the ones that ask... O.K. to modify ALL records in the Target database (Y/N) ?
and the combination of..
O.K. to modify SOME records in the Target database (Y/N) ?
Enter name of file containing names of existing records to be modified, or (c/r) for oktomodify.inp
Why would you want to modify some or all records during Upgrade, and what is the difference between Some and All related to OKtoModify? | Solution: As you probably know, the (target) database that comes with the new version contains all of our standard records. For example, it contains all the standard definition records such as Ip_AnalogDef, Ip_DiscreteDef, etc. It contains all the standard Format records like F7.3, I5, etc. It contains all the standard Selector records like Ip_Eng_Units, IP_Plant_Areas, etc. It contains all the standard Field Name records, and so on. Basically, it contains all of our standard records but of course NOT your Data records.
Now suppose, again for example, that we made some simple, subtle changes to Ip_AnalogDef. Therefore, Ip_AnalogDef in the Target Database is newer than Ip_AnalogDef in the Source Database. What Upgrade will do is leave the final upgraded database with the new version of the definition record. It will then copy all of your Data records defined by Ip_AnalogDef and modify them to take on the appearance of the new copy of Ip_AnalogDef.
So this is an example of 'modifying' records, but it is NOT related to the questions regarding OKtoModify
In the example of the data records, BEFORE upgrade, they only exist in the Source database - not in the Target. There are however some other records that may be in both the target and Source database - but different. Two examples of this are Ip_Eng_Units and IP_Plant_Areas.
In this case the Target will contain the default record whereas the Source will contain the copy that you customized. The critical thing here is that you want any subtle changes that we have made to be performed, but you want to keep all your own customization. These cases are the times when OKtoModify is important.
You have three choices.
If you answer that you do not want any records modified, then you will lose your customization. If you answer that you want ALL records modified, then ANY record that appears in both the Target and the Source will be modified and all customization retained. If you answer that you want SOME records modified, then a text file called OKTOMODIFY.INP will be analyzed. This file can be modified by users, and only those records that exist in both Target and Source AND in OKTOMODIFY.INP will be modified and customization retained.
When going through the Custom Upgrade path, a Browse button takes you to the default OKTOMODIFY.INP file, which is located in the Group200 directory (for V7.2 and later, refer to KB Article 129813 if you have questions about where your Group200 directory is located.) The file itself contains notes and comments about its structure to help you configure it up successfully. Conveniently, there is a View button in the Wizard that lets you see the contents of the file when running through an upgrade.
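For illustration only (the authoritative description is in the file's own comments), an OKTOMODIFY.INP is essentially a list of the record names that Upgrade is allowed to modify, for example one name per line:
Ip_Eng_Units
IP_Plant_Areas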
Keywords: OKTOMODIFY.INP
upgrade
References: None |
Problem Statement: Getting an error Invalid item in Error_Type field within an AspenCalcDef record when executing the calculation by COS | Solution: This error happens when the calculation within Aspen Calc is created under a folder and the folder name is not appended to the Calculation_Name within the AspenCalcDef record. In order to resolve this error, and for the calculation to execute properly on COS, the Calculation_Name field should be set to <folder name>\<calculation name>
For example, if the calculation within Aspen Calc is created under a folder called 'Aspen' and the name of the calculation is 'Test', the calculation name in the AspenCalcDef record needs to be set to Aspen\Test
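The field can be set from the Aspen InfoPlus.21 Administrator, or with a one-line Aspen SQLplus update; a sketch, where the record name MYCALCREC is hypothetical:
UPDATE aspencalcdef SET calculation_name = 'Aspen\Test' WHERE name = 'MYCALCREC'; -- folder name prepended with a backslash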
Keywords: Calculation
Aspen Calc
COS
Folder
References: None |
Problem Statement: What are the different ADSA configurations that can be used to view multiple ADSA Data Sources? | Solution: User Data Source Configuration (ADSA Client/Workstation based configuration)
To allow a single user (only) access to view multiple Aspen InfoPlus.21 Servers, the user can configure a private or User Data Source in the ADSA Properties on their local workstation.
Public Data Source Configuration (ADSA Server based configuration)
To allow multiple users throughout the organization the ability to see multiple Aspen InfoPlus.21 servers, the Server Administrator must include all ADSA server names in the Public Data Source list of all ADSA servers.
Keywords: ADSA, Public Data Source, Private Data Source
References: None |
Problem Statement: What is the difference between XOLDESTOK and CLROLDEST? | Solution: XOLDESTOK is for resetting the creation date for an individual tag or group of tags. This allows for entering data that is older than the creation date. With XOLDESTOK, you specify the new date and the repeat area. Please see Solution #103040 for more information on how to use XOLDESTOK.
CLROLDEST is for resetting the creation date for all tags. It resets the oldestok field to an undefined time, so that any timestamp can be used with data input. Also, it assumes a repeat area of IP_#_OF_TREND_VALUES. Other repeat areas must be reset with XOLDESTOK. This utility is only used with migrations, e.g. from CIM/21, Setcim, or PI data historians to InfoPlus.21.
Keywords: xoldestok
clroldest
migrate
history
clear date
References: None |
Problem Statement: A user databank can be created in Aspen Plate Fin Exchanger that contains, along with geometry information, the Reynolds number and the f and Cj factors used to determine the pressure drop and heat transfer for a particular fin.
Each fin is identified by a number that the user chooses, in the range of 101 to 9999. Within Plate Fin Exchanger, to identify a fin in the exchanger geometry input you simply give the fin number used in the databank as the fin code number in the program. There is then no need to provide other information in the input about the fin's geometry or performance.
Below is described the format of the user databank and the location where it is stored. | Solution: The user databank is called FinData.txt and is stored in the directory where the file platefin.dll is located, typically C:\Program Files\AspenTech\Aspen Exchanger Design and Rating V7.1\XEQ.
The first three lines of the databank identify the fin and its geometry, after which the remaining data gives the Re-f-Cj performance. The databank should be in metric units (mm, fins/m).
The format of the FinData.txt file is described below:
For each fin, the data begins with a 701 line, followed by a 702 line giving the fin number (which uniquely identifies the fin) and the fin type: 1 - plain, 2 - perforated, 3 - serrated and 4 - wavy (herringbone).
The data continues with a 703 line giving the fin geometry, where the first item is the fin height, the second the fin thickness, the third the fin frequency, the fourth the fin porosity, and the last the fin serration length. Items 4 and 5 are only appropriate for certain fin types; where not relevant, a zero can be entered.
Lastly is a series of 710 lines giving Re-f-Cj performance data. It is wise to set up data for a wide range of Reynolds number, to cover all potential uses of the bank.
An example of the databank might be:
COMPANY NAME (SURFACE 11.1)
701 FINS S KAYS/LON 11.1
702 136 1
703 6.35 0.15 437.01 0
710 500 0.03500 0.00840 800 0.02280 0.00599
710 1200 0.01690 0.00471 2000 0.01390 0.00436
710 2500 0.01190 0.00424 3000 0.01120 0.00412
710 4000 0.01030 0.00390 5000 0.00991 0.00372
710 6000 0.00971 0.00356 10000 0.00878 0.00314
The fin databank may contain null lines, beginning with two blanks or an asterisk, which are ignored when the data are read. These can add comments and improve the readability of the databank.
Note: If you are transferring from MUSE, then this is the same file that was previously called FinDat (no extension).
Keywords: Fin Databank, FinDat, FinData
References: None |
Problem Statement: Creating an Aspen Calc formula generates the error: Formula <formula name> found in multiple libraries at line X.
A user cannot copy or type a pre-defined formula name into calcscript directly. Doing this generates the error Formula <formulaname> found in multiple libraries, because the formula already exists in the built-in library, and Aspen Calc does not allow the creation of formulas using the pre-defined names. | Solution: The solution is to click Insert Formula in the Formula Wizard, select the Built-in Library, choose the appropriate formula, highlight it, and click the Insert button.
Keywords: libraries
multiple
formula
References: None |
Problem Statement: In Aspen PlateFin, when in Simulation or Checking mode, there are two ways to enter the flow direction in the layer:
· Input | Problem Definition | Process Data | Process Options tab, the “Flow direction” may be entered as End A to B (down) or End B to A (up). In general the default is for hot streams to flow up and cold streams down.
· Input | Exchanger Geometry | Layer Types the sequence of the components that make the layer are entered. If the inlet distributor is first (near end A) then the flow will be downwards if a vertical unit is set.
If these inputs are inconsistent, input error 1212 will be given. | Solution: Users should check that the flow descriptions specified above are consistent.
Keywords: Input error 1212
References: None |
Problem Statement: The Aspen Calc application will not open successfully, and instead freezes. | Solution: AspenTech Calculator Engine should appear in the list of applications in dcomcnfg. If it isn't there, you can register it again using the following commands:
cd %ASPEN_CALC_BASE%\bin
CalcScheduler/Remove
CalcScheduler/Install username password
Where username and passwords are the user name and password of the account assigned to the service.
Keywords: calc
aspen calc
References: None |
Problem Statement: Can Aspen Calc be used across a firewall? | Solution: Aspen Calc uses DCOM to communicate. As a result, Aspen Calc is difficult to configure for use across a firewall, since DCOM is designed to dynamically assign a TCP port and UDP port at run-time. Consequently, you can't predict the port that will be used and specify it beforehand on the firewall.
Microsoft does describe how this can be set up in an article detailing the use of DCOM with firewalls at http://www.microsoft.com/com/wpaper/dcomfw.asp
Please note that using Aspen Calc in this way is untested and so is not an officially supported implementation.
Keywords: port
fire
wall
calc
References: None |
Problem Statement: This knowledge base article explains how to use Aspen Calc to retrieve the last two values of a tag historized in Aspen InfoPlus.21. | Solution: The example code below was provided by Wayne Bylsma of TransAlta Corporation, Calgary, Canada.
' Begin Example Calculation
dim conn,rs, stat
set conn = CreateObject("ADODB.Connection")
conn.Open "SERVER" ' name of the ODBC data source for the InfoPlus.21 server
stat = "SELECT ip_trend_value FROM ta_analogdef WHERE name = '" & tag & "';"
set rs = conn.execute(stat)
i = 0
do while not rs.EOF and i < num
r1 = rs(0).value
rs.moveNext
i = i+1
loop
returnvalue = r1
where num is the number of values you want to go back.
Keywords:
References: None |
Problem Statement: Error when trying to add roles to the IP.21 Administrator to assign permissions:
Function GetLocalRolesList failed with empty roles list | Solution: Verify that the AFW Security Client service is running in the Control Panel | Services. Most likely it is. But, chances are, the AFW Service process does not have sufficient privileges to instantiate the Authorization object - causing security type functions to fail.
Run DCOMCNFG and verify that:
1. DCOM is enabled on the computer
2. Default Authentication Level is set to Connect.
3. Default Impersonation Level is set to Identity. (This, in error, may be set to anonymous.)
Restart the AFW Security Client service.
Also, make sure that the Infoplus.21 Browser service is started with an account that can access the Aspen Security Server.
Security should now work.
Keywords: GetLocalRolesList
empty roles
References: None |
Problem Statement: During installation of Service Pack 1 for Aspen Manufacturing Suite 6.0, the installer may give disk space errors. | Solution: Search for directories having the name PFT*.TMP, where * is a wild card. If many of these directories are found in the default %TEMP% directory (<User Profile>\Local Settings\Temp), the temporary directories should be deleted.
If the location specified by the default %TEMP% environment variable does not contain these directories, it may be necessary to create a new %TEMP% environment variable. Set the location of the new %TEMP% environment variable to C:\TEMP.
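For example, the variable can be pointed at C:\TEMP for the current command-prompt session before launching the installer (the permanent change is made through the System control panel, per the documents below):
set TEMP=C:\TEMP
set TMP=C:\TEMP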
For more information on specifying operating system environment variables, please see the following Microsoft documents.
WindowsXP:
http://www.microsoft.com/windowsxp/home/using/productdoc/en/default.asp?url=/windowsxp/home/using/productdoc/en/environment_variables.asp
Windows 2000:
http://support.microsoft.com/default.aspx?scid=kb;en-us;311843
Keywords: AMS
full
6.0.1
install
installation
References: None |
Problem Statement: When using SQLplus to insert data into the InfoPlus.21 database with a bad status, the data gets historized as bad data (instead of good data with a bad status). This results in Aspen Process Explorer not being able to display the data because it is INVALID. | Solution: For every data point in history, you want to have qstatus set to 'Bad' (the status of the data) and qlevel set to 'Good' (the quality of the data). This can be accomplished by temporarily modifying the Quality-Statuses selector record when entering the data:
IN MORE DETAIL:
Quality Statuses and Quality Levels of data are controlled through a selector record called Quality-Statuses. Each of over 70 Statuses has a Level of Bad, Suspect, or Good. Process Explorer displays only data with a Level of Good in trends. To trend all data, edit the Quality-Statuses selector record in InfoPlus.21 to change all Bad and Suspect statuses to Good.
This will affect the translation from IP_INPUT_QUALITY to IP_TREND_QLEVEL for new incoming data at the moment the data (with qlevel and qstatus) will get stored in history. (NOTE: You must undo the temporary changes afterwards).
FOR INSTANCE:
begin -- insert of data
quality-statuses.quality_level[53] = 0; -- temporarily change to Good
update testtag set ip_input_value = 684, qstatus(ip_input_value) = -20; -- -20 is the status that belongs to line 53
quality-statuses.quality_level[53] = 2; -- change back to the original Bad
end -- insert of data
select ip_trend_value, ip_trend_qlevel, cast(ip_trend_qlevel as integer), ip_trend_qstatus, cast(ip_trend_qstatus as integer),
ip_trend_time from testtag where ip_trend_value = 684;
BACKGROUND:
Solution #104180: What is the difference between IP_TREND_QLEVEL and IP_TREND_QSTATUS?
Solution #104090: What the 'VALID' column represents and how data is determined as valid from other databases.
Keywords: bad
good
status
level
quality
APEx
References: None |
Problem Statement: This knowledge base article provides an example program which illustrates how to use the RHIS21DATA routine to read repeat area data from IP_TextDef records. | Solution: The RHIS21DATA function can extract data from the repeat area of IP_TextDef records in the same way as it does from IP_AnalogDef records - the difference being the datatype used in the program. The following code is written to extract text values from the LOGS_IN_MEMORY repeat area of the IoLog record (defined by LogDef). The IoLog record exists in the Aspen InfoPlus.21 demo snapshot.
/********************************************************************************************************************
This program reads history from a record defined by LogDef
*********************************************************************************************************************/
#include <stdio.h>
#include <setcim.h> /* Aspen InfoPlus.21 API header (name assumed; the original include targets were elided) */
#define MAXOCCS 10L
main()
{
ERRBLOCK err;
ERRARRAY errar;
XUSTS timeold; /* Start of time window */
XUSTS timenew; /* End of time window */
XUSTS keytimes[MAXOCCS];
short keylevels[MAXOCCS];
char vals[MAXOCCS][112];
char asctime[25];
void *ptdatas[1] = {(void *)vals};
long occs_ok;
short fts_ok;
long recid = 2027; /* ID of record defined by LogDef*/
long ft = 0x78310000; /* Field tag of LOG_ENTRY field */
short dt = 112; /* Datatype equals length of character field */
short errsz;
char error;
int ii;
if (INISETC())
{
/*
Set timenew to current time; Set timeold to 24 hours ago
*/
GETDBUSTS(&timenew); // Set timenew to current time
timeold.secs = timenew.secs - (60 * 60 * 24); // Set timeold to 24 hours ago
timeold.usecs = 0;
/*
Read and print history
*/
RHIS21DATA(H21_GET_ACTUALS, 0, 0, recid, ft+1, &timeold, &timenew, 1, &ft, &dt,
MAXOCCS, keylevels, keytimes, ptdatas, &occs_ok, &fts_ok, &err);
if (err.ERRCODE != SUCCESS)
{
ERRMESS (&err, errar, &errsz);
printf("**RHIS21DATA Error: %.*s**\n", errsz, errar);
}
else
{
for (ii = 0; ii < occs_ok; ii++)
{
USTS2ASCII(&keytimes[ii], asctime, 25, &error);
if (error)
printf("Cannot convert key time for occurrence %d\n", ii);
else
printf("%.*s %.*s\n", 25, asctime, 112, vals[ii]);
}
}
}
return 0;
}
/********************************************************************************************************************/
Note: The DataTypes argument (dt in this example) should be specified as the length of the character field you need to read. In this example, the line of code which declares the data type is:
short dt = 112; /* Datatype equals length of character field */
Keywords: datatype
data
type
types
References: None |
Problem Statement: How are point counts related to the word count for the IP.21 database? | Solution: A point is considered a record with history. The size of those points can be measured in words.
One would use the point count to determine how many tags they could create and record data for.
The word count is used by the InfoPlus.21 Manager task TSK_DBCLOCK: its command line parameter is set to DOUBLE the size of shared memory in database words.
If you are unsure of what this number should be, the ratio is 8,000 points to 16,000,000 words. For more information on TSK_DBCLOCK and memory utilization, please see Solution 103866.
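As a worked example, assuming the stated ratio scales linearly (2,000 words per point), a 24,000-point database would correspond to roughly 48,000,000 database words for the TSK_DBCLOCK command line parameter.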
Keywords: point
word
count
dbclock
References: None |
Problem Statement: Standard IP_Analog and IP_Discrete tags don't allow for clamping of upper and/or lower values (e.g. forcing all input values to be stored in history and greater than or equal to zero). | Solution: Use IP_SetDef to configure a clamped limit for the target definition. Note that the filter applies to every tag defined by the base definition being altered.
View IP_SetLimits (defined by ValidRealDef)
Create a new record [test_set] defined by IP_SetDef:
Use an SQLscript to input values to test_set from 22222 down to -22222:
You can do the same thing with IP_AnalogDef records but need to make a small change to the definition record.
In my example below I duplicated IP_AnalogDef (created the definition #IP_AnalogDef) and assigned the record IP_AnalogLimits to the 'Process Record' field of IP_INPUT_VALUE.
Of course, first I had to create IP_AnalogLimits:
I then made the new definition usable and created the records D-Anatest and AnaTest:
In AnaTest I set the IP_LOW_LOW_LIMIT field to 0 and ran the same SQLplus input script (from 33333 to -33333) as before. Negative values were clamped at zero:
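The input script itself can be a short Aspen SQLplus loop; a minimal sketch, assuming the test record is named AnaTest as above:
LOCAL v REAL;
v = 33333;
WHILE v >= -33333 DO
  UPDATE AnaTest SET ip_input_value = v; -- inputs below the configured limit are clamped to 0
  v = v - 11111;
END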
Keywords: Negative values
IP_INPUT_VALUE
Historian
References: None |
Problem Statement: Repository won't start and the following message appears in the affected repository's error.log file:
sequence update failure in active archive | Solution: The active archive is corrupt in this repository.
To address this issue perform the following steps.
1. Verify that the repository is not running.
2. Open a command window and cd to %h21%\bin
3. Type: h21mon
4. In the h21mon utility type: CHECK -a<fileset #> -d -o -r<repository name> (Note that the repository name is case sensitive)
5. When the check is completed, restart the repository.
Keywords:
References: None |
Problem Statement: If you have deleted a role in the AFW Security Server and have not removed the permissions for this role in the InfoPlus.21 Administrator, the IP.21 permissions for the non-existing role will still be shown in the Administrator, and the role name will appear as a long hex number. This might cause security problems when trying to access the InfoPlus.21 database. | Solution: Note: If you have MS Access available, you could use it in place of steps 1, 2 and 3.
1. Setup an ODBC DSN connection on the IP.21 server pointing to the AFWDB database.
2. Get into SQLPlus Query Writer and setup a Database Link using the DSN setup in step 1.
3. Insert a new role entry specifying a unique RoleName and the RoleID that is orphaned.
INSERT INTO AspenRoles."C:\Program Files\AspenTech\Local Security\Access97\AFWDB".PfwRole
(RoleID, RoleName) VALUES ('{83F45E8B-0506-11D6-AC01-0002A507CA52}', 'OrphanedRole');
4. Remove all database security using the RoleName specified in step 3 (OrphanedRole).
REVOKE Read ON IP_AnalogDef FROM OrphanedRole;
REVOKE Read ON IP_DiscreteDef FROM OrphanedRole;
REVOKE Read ON IP_TextDef FROM OrphanedRole;
REVOKE Read ON (SELECT NAME FROM IP_AnalogDef) FROM OrphanedRole;
REVOKE Read ON (SELECT NAME FROM IP_DiscreteDef) FROM OrphanedRole;
REVOKE Read ON (SELECT NAME FROM IP_TextDef) FROM OrphanedRole;
5. Remove the RoleName from the security database setup in step 3 using AFW Security Manager.
6. Remove the Database Link setup in step 2.
7. Remove the DSN setup in step 1.
Keywords: Role
Non-existing role
Security
References: None |
Problem Statement: How can I import and model the Process/Property Data from a HYSYS LNG block into PlateFin? With the case model in PlateFin, how can I then run the model within the HYSYS simulation? | Solution: Importing a HYSYS LNG Block Process & Property Data into PlateFin
Open an Aspen PlateFin Exchanger case from scratch and then, on the Input | Problem Definition | Application Options tab, set the Calculation Mode to Design and set the required number of streams to match the HYSYS case (4 in this example).
Now, from the main menu of the PlateFin diagram, go to File | Import from | Aspen HYSYS and call up the HYSYS case that contains an LNG block. The HYSYS case will be opened. After minimizing the HYSYS window, you will see a window containing all the available LNG blocks in the HYSYS case, as below.
It is best to double-check pressure level 2. If it is not consistent with the corresponding data in the HYSYS case, you may need to manually enter pressure level 2 as the outlet pressure level (or based on the required pressure drop) used inside the HYSYS simulation. If required, a third pressure level can also be added.
The pressure drops may be found in HYSYS on Design | Connections tab of the LNG block.
Click OK on the Exchanger List window. Now an Import PSF Data screen should show up, where you need to set the stream (Strm) number that corresponds to the inlet/outlet streams as given in HYSYS.
Click OK
The Process Data and Physical Properties information will be transferred to PlateFin. However, if you want to run this PlateFin case in Design mode, as described at the very beginning of this Solution, you need to make sure the "Allowed pressure drops" are consistent with the "Estimated pressure drops". If the two pressure drops are not the same, manually amend the "Allowed pressure drops" based on the "Estimated pressure drops". This is because in Design mode of the Plate Fin Exchanger program, the heat-release curves and UA value are determined by pressure changes based on the "Allowed pressure drops" rather than the "Estimated pressure drops". For other modes, like Simulation or Checking, it is not necessary to manually amend the "Allowed pressure drops", because the heat release curves and UA values will be determined by the "Estimated pressure drops".
Go to the Input | Physical Properties Data | Stream * Properties | Properties tab, where the property information of each stream can be found.
Now the PlateFin case can be run to produce a design, and from the Design a Simulation case can also be created by referring to Run | Update file with Geometry - PlateFin.
Keywords: HYSYS, LNG block, Plate Fin Exchanger, Link, Integration, Import, Export
References: None |
Problem Statement: Attempting to load a snapshot into the Aspen InfoPlus.21 database can result in the error message:
Invalid when 21CFR11 enabled | Solution: This message is most frequently encountered for 2 reasons:
1. When one attempts to load a snapshot which was taken before the 21CFR11 feature was enabled for the respective Aspen InfoPlus.21 database. This message indicates that only 21CFR11 enabled snapshots can be loaded into an Aspen InfoPlus.21 database after the 21CFR11 feature has been enabled. Since the 21CFR11 feature is irreversible after it is enabled on a given Aspen InfoPlus.21 database, the only way to load a snapshot taken before the 21CFR11 feature was enabled is to completely uninstall, then reinstall Aspen InfoPlus.21 so that the database itself no longer has the 21CFR11 feature enabled.
2. When in 21CFR11 mode, Aspen InfoPlus.21 does not allow a database snapshot to be loaded into memory while the Aspen InfoPlus.21 database is running. This is to prevent the accidental loading of snapshots, because loading a wrong snapshot can damage the operation of the database. To load a different snapshot, you will have to stop the database, specify the location path and the name of this snapshot on the command line of the loaddb task in the Aspen InfoPlus.21 Manager, and then start the Aspen InfoPlus.21 database.
Keywords: None
References: None |
Problem Statement: The IP.21 client patch (IP060420Y) generates the following message if neither the IP.21 Administrator or the Definition Editor are installed:
Failed to read the key HKLM\Software\AspenTech\Setup\IP21 with error = 2. Proceed anyway
For example, this error can occur when installing the patch on the Web.21 server if no IP.21 client tools are installed. | Solution: This message can be ignored if no IP.21 client tools exist on the machine.
Keywords:
References: None |
Problem Statement: How do I delete a plot saved via the plot tool? | Solution: If a tag list plot is saved via the 'save as' on the file menu of the plot tool in this location:
C:\inetpub\wwwroot\AspenTech\ACOView\plots
The plot appears in the web viewer on the history tab in the plot area of the navigation pane.
The only way to delete this plot is to actually navigate to the above location and delete the XML file.
There is no way within the web viewer to delete the plot.
Keywords: None
References: None |
Problem Statement: A customer reported the following scenario:
I am trying to get data from an OPC server into Aspen Calc. To do this I have setup an Aspen OPC-DA user data source in ADSA config. In Aspen Calc, when I assign one of my variables as OPC, I can browse for the point etc, so it appears to be connecting to the OPC server. However, when I go to run the calc, I get the error: 'Get ADSA property: failed to get ProgID property from <node>' where <node> is the ADSA User Data Source name for my OPC server. | Solution: It turns out that the problem was with DCOM security on the OPC server. The server had to have default access changed to CONNECT, and change IDENTITY to a specific account because of how the OPC server was written.
Keywords: dcomcnfg
References: None |
Problem Statement: How do I add values with the same daily timestamp? | Solution: Calc Formula:
t1 = Month(Now()) & "/" & Day(Now()) & "/" & Year(Now()) & " 10:00:00"
sumvals = (TagHistory(tag1, t1, 1)) + (TagHistory(tag2, t1, 1))
The TagHistory function reads values from Aspen InfoPlus.21 history at a specified timestamp. The t1 variable builds the timestamp using a combination of built-in date functions and a fixed time of day. This example adds the values of Tag1 and Tag2 at 10:00 today. In the calculation, bind Tag1 and Tag2 to records in IP.21 that contain the values to be read from history and added together. The output of the calculation, SumVals, can be bound to an IP.21 tag to store the result of the calculation. Schedule the calculation daily, any time after 10:00.
Additional information on builtin functions can be found in the Aspen Calc help file.
Keywords: Taghistory
Now
Month
Day
Year
References: None |
Problem Statement: This knowledge base article describes an automated way to create multiple calculations in Aspen Calc. | Solution: The Aspen Calc product includes a CalcCreate.xls Excel spreadsheet that allows you to create multiple calculations and bind the parameters at once. There are two variations of the CalcCreate spreadsheet. They are the CalcCreate.xls and CalcCreateExample.xls. These files are located in Drive:\Program Files\AspenTech\Aspen Calc\Excel folder. The CalcCreateExample.xls contains the same macro as the CalcCreate.xls but also includes some example data to create calculations.
Example: The following procedure describes how to create a calculation from a formula which sums the value of two tags. The formula is called Sum2 and it is part of the library MyLibrary. The formula is the following:
ReturnValue = Tag1 + Tag2
1. Open the CalcCreate.xls by double clicking on the file in the Windows Explorer.
2. Click to Enable Macros.
3. Enter the Aspen Calc Server, Aspen InfoPlus.21 Server, and Group: the name of the Calc Server in cell A2 (under the cell CalcServer), the name of the IP.21 Server in cell B2 (under the cell Ip21Server), and the IP.21 Group number in cell C2 (under the cell Group).
4. Enter the name of the calculation in cell A4 (under the cell Calculation). As an example call it 'SumofTag1andTag2'.
5. Enter the name of the Library in cell B4(under the cell Library). The name of the library in the example is 'MyLibrary'.
6. Enter the name of the Formula in cell C4 (under the cell Formula). The name of the formula in the example is 'Sum2'.
7. Create the heading for the name of each parameter involved in the formula. Start at cell D3 for the first Parameter heading in the formula and continue with cell E3, F3, and so forth. In the example, the Parameter headings are Return Value, Tag1, and Tag2. So enter 'Return Value' in cell D3, 'Tag1' in cell E3, and 'Tag2' in cell F3.
8. Under each Parameter heading and in the same row with a Formula, enter the tag and associated field, such as atcai ip_input_value or ATCL103. Start at cell D4 for the first tag and field and continue with cell E4, F4, and so forth. In the example, cell D4 will contain the corresponding tag for the 'Return Value', cell E4 will contain the corresponding tag for 'Tag1', and cell F4 will contain the corresponding tag for 'Tag2'.
9. Click Start to run the calculation(s). As the calculations are processed, starting at the cell F2, a status message displays, such as Processing Calculation 'SumofTag1andTag2'.
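Putting steps 3 through 8 together, the populated sheet looks roughly like this (server names, group number, and tag bindings are placeholders illustrating the tag-and-field format from step 8):
     A                  B            C        D                      E                     F
1    CalcServer         Ip21Server   Group
2    MYCALCSERVER       MYIP21       200
3    Calculation        Library      Formula  Return Value           Tag1                  Tag2
4    SumofTag1andTag2   MyLibrary    Sum2     atcsum ip_input_value  atcai ip_input_value  atcl103 ip_input_value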
Keywords:
References: None |
Problem Statement: How to interpret a zig-zag diagram obtained from Aspen Plate Fin Exchanger. | Solution: The Zig-zag diagram is a traditional way of assessing a layer pattern. It can be drawn up based on a specified (or calculated) duty for each stream, without the need for more detailed calculations. You can view the diagram by going to Thermal Performance | Zigzag diagram
The Zig-zag is a plot of the cumulative heat load in all the layers up to a given point in the pattern. Cold streams have a positive heat load and are shown as increases; hot streams have a negative load and are shown as decreases. Because hot and cold layers typically alternate, the plot goes up and then down, forming a zig-zag line. When there is double banking, there will be two adjacent ups or two adjacent downs.
Since different layer types will have different heat loads per layer, the size of the ups and downs will be different, and the zig-zag may wander away from the zero line on which it starts. A good layer pattern will keep the zig-zag line as close as possible to the zero line. When a zig-zag line is persistently on one side or other of the zero line, it means there is a heat imbalance with an excess of heat load in one part of the pattern, and a deficit in another part. This heat must be conducted via the fins, distorting both the stream and metal temperature profiles from the ideal predicted by the common wall temperature assumption. The temperature gradients through the layer depth can lead to degraded performance, and more importantly, thermal stresses which in extreme cases could threaten the integrity of the exchanger.
After a layer-by-layer calculation, the individual heat loads in each layer are known, but the mean heat load per layer type is used, so the basis of the Zig-zag is the same as for stream by stream calculations.
In addition to the main Zig-zag, a number of details graphs are shown, applying the zig-zag concept separately to a number of equal length regions along the exchanger. These can indicate potential problems that could occur in certain parts of the exchanger length, even when in the main overall Zig-zag the layer pattern appears to be good.
The Zig-zag is valuable as a simple graphical tool, but it should be remembered that the Zig-zag simply provides an indication of when there might be a problem with the layer pattern. It does not show how that problem will manifest itself. For that, you should look at the results of a layer-by-layer calculation.
Keywords:
References: None |
Problem Statement: How does Aspen Plate Fin Exchanger calculate heat transfer in distributors? | Solution: The distributors at the inlet and outlet of each layer consist of one or two pads of finning, at least one of which is at an angle to the main flow direction, to direct the flow in the main fin to or from the stream header.
Each distributor pad has a defined fin type and flow direction, so the mass flux (perpendicular to the flow direction) is well defined. This means that pressure gradients and heat transfer coefficients can readily be calculated for each pad, using Reynolds number, friction factors and Colburn-J factors. Since the pads are (usually) triangular, they have two extreme flow lengths, one along one side of the triangle, the other (at the vertex) being zero. The mean pressure change in each pad is calculated using a mean flow length.
Keywords: Distributor Calculation, Plate Fin exchanger, Pads
References: None |
Problem Statement: Migrating 2004 app to 2006.5 new server / invalid label | Solution: Edit the iqf (by hand) and change the offending labels to all upper case letters.
Keep in mind that when using input calculations the labels can get word-wrapped; be careful not to miss any of the variables that have been wrapped.
A solution for this issue is now available from Solution 125426.
Keywords: invalid, labels, iqf, validation, error
References: None |
Problem Statement: Getting error: this computer does not have write permission to InfoPlus.21 database. | Solution: Changes have been made to both AspenCalc 3.0 and AspenCalc 2.5 which restricts writing to an InfoPlus.21 database. This validation is the same validation used by Process Explorer to determine whether it can write to InfoPlus.21.
Add the following registry entry :
HKEY_CLASSES_ROOT\CLSID\{710B32A1-7277-11d1-932C-00805F0F1C84}\Update = Y
OR
Double click on the file: c:\aspentech\desktop\CsdsHistroyDA.reg.
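If neither method is convenient, the same entry can be applied from a hand-written .reg file. A minimal sketch, assuming Update is a string value stored directly under the CLSID key (the notation above does not distinguish between a value and a subkey):
Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\CLSID\{710B32A1-7277-11d1-932C-00805F0F1C84}]
"Update"="Y"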
Keywords: calc
aspen calc
permission
References: None |
Problem Statement: When trying to save an On Demand calculation as a Shared calculation using 'Save as Shared' button, Aspen Calc displays 'error write plant area'. What does this error mean? | Solution: When you create an On Demand calculation in Aspen Calc, you have the option to make this calculation available to other clients by storing the calculation as a Shared calculation. A Shared calculation is stored in the Aspen InfoPlus.21 database as an IP_CalcDef record. When viewing the fields of an IP_CalcDef record there is a field for the plant area (IP_Plant_Area). The Aspen Calc On Demand creation display does not present this field for entry, therefore when the 'Saved as Shared' button is selected the value for IP_Plant_Area is set to blank.
When Aspen Calc goes to create and populate the IP_CalcDef record it validates the entry for IP_Plant_Area against the contents of the Plant-Areas selector record. If there is not an occurrence that has been set to have a Select_Description value of blank then the above error will be displayed.
To resolve this error, create an occurrence in the Plant-Areas record that has a description of blank.
Keywords: error write plant area
References: None |
Problem Statement: Aspen Calc doesn't allow the creation of an Excel-based formula on a remote node. | Solution: The general procedure to follow is:
1. Create the Excel spreadsheet and save it to an accessible location.
2. Create a formula which references the correct spreadsheet and copy it to the remote node.
3. Finish modifying the formula and then create any calculations requiring the Excel-based formula.
Creating the Excel spreadsheet
It is recommended that the AspenCalc program is used to create the Excel spreadsheet which will ultimately be used for the formula. The formula can be copied from an existing spreadsheet. This will ensure the spreadsheet properties are properly set.
1. Create a new Excel based formula.
2. Create a new Excel workbook -- saved on the local hard drive. It will be transferred to the proper place later.
3. Create any named cells that you wish to use. Create the spreadsheet formulas and test that the spreadsheet works as desired.
4. When satisfied with the spreadsheet's contents, save the spreadsheet and cancel the formula creation.
5. Locate the spreadsheet using Windows Explorer. The default save location is C:\ProgramData\AspenTech\Aspen Calc\Excel\Formula.
6. Move the spreadsheet to a location where the path used to access it is the same from the remote AspenCalc node as it is on the local machine. This may be a network drive or on the remote node. If it is on the remote node, you will need to mimic the path in order to configure the formula. It is recommended that you use a UNC path name if at all possible. (e.g. \\ServerName\Sharename\...) Mapped network drives will likely NOT work, since the drive mapping may not exist (or could be different) on the remote node.
Creating the formula/calculations
1. Create a new Excel formula on the local node.
2. Specify the path to the file from the perspective of the remote node. (This is where a UNC path is very handy)
3. Do not modify the formula any further at this point. Save it.
4. Copy the formula to the remote node.
5. You can now manipulate the formula on the remote node, including (auto-)adding the named parameters of the file.
6. At this point, the normal process can be used to create calculations based on this formula.
Note: you cannot modify the Excel spreadsheet specification directly through the remote formula, but you can modify the spreadsheet using any other available method including opening the spreadsheet directly in Excel.
Keywords: AspenCalc
Excel
Formula
remote
References: None |
Problem Statement: In Aspen Calc, when using a calculation like ReturnValue = a+b will the Store & Forward functionality work? | Solution: Yes, Aspen Calc supports S&F, but in a calculation the ReturnValue should be left blank, and the output should be used instead. The ReturnValue is only used for AspenCalc Formulas.
A tag needs to be created for the Output, which will be used from historical data as well.
Example: calculate X = A + B (output will be X)
Create a tag called SF_TEST and tie it to the output. This way, every calculation result will be stored in the SF_TEST tag.
Please make sure you use the correct fields for the data binding.
In a standard tag, FIELD NAME = IP_INPUT_VALUE (which is the current OUTPUT), and HISTORY FIELD = IP_TREND_VALUES (which are the historical values of your calculations).
When forwarding, you will see that the calculation history is filled in and there won't be any gap, as shown in the test plot below, where the light blue trace is SF_TEST, the tag tied to the output of the calc.
Screenshots of the steps taken:
Keywords: Aspen Calc, Store & Forward, SF
References: None |
Problem Statement: How do I delete a reference to an Aspen Calc formula/calculation that is no longer being used in a formula? | Solution: 1. Export the library as .XML file
2. Close the Aspen Calc application.
3. Remove or rename the library from Documents and Settings\All Users\Application Data\Aspentech\Aspen Calc\lib folder ( Program Files\Aspentech\Aspen Calc\lib folder for versions prior to 2006.5)
4. Edit the .XML file using Notepad or Wordpad. Locate the formula or calculation that you are not interested in using anymore.
5. Locate the <FormulaRefs> section of this formula or calculate.
6. Remove the text between <FormulaRefs> and </FormulaRefs> and replace with <FormulaRefs/>.
7. Save the .XML file.
8. Start Aspen Calc and import the XML.
An example of the XML code is shown below:
<LibFormula>
  <Description>this is junk but I cannot delete it</Description>
  <Value xmlns:dt="urn:schemas-microsoft-com:datatypes" dt:dt="char" />
  <Status>-1</Status>
  <FormulaType xmlns:dt="urn:schemas-microsoft-com:datatypes" dt:dt="i4">3</FormulaType>
  <ScriptText>a</ScriptText>
  <Name>get_hist_val_actual</Name>
  <Parameters>
    <Parameter>
      <Default xmlns:dt="urn:schemas-microsoft-com:datatypes" dt:dt="char" />
      <Name>ReturnValue</Name>
      <ParameterType>6</ParameterType>
      <Status>-1</Status>
      <Type xmlns:dt="urn:schemas-microsoft-com:datatypes" dt:dt="i4">12</Type>
      <Value xmlns:dt="urn:schemas-microsoft-com:datatypes" dt:dt="char" />
    </Parameter>
  </Parameters>
  <DataSets />
  <FormulaRefs>
    <CalcFormula>
      <LibraryName>webMonitor</LibraryName>
      <Name>get_hist_val_actual</Name>
    </CalcFormula>
  </FormulaRefs>
  <OleMethods />
</LibFormula>
In the example above, the <FormulaRefs> ... </FormulaRefs> section at the end of the listing is the code that should be replaced with <FormulaRefs/>.
Keywords: None
References: None |
Problem Statement: IQModel would not start using the Excel Add-in 2006.5 | Solution: Open and minimize IQmodel before using the Excel Add-in.
Keywords: Microsoft Excel
IQmodel
References: None |
Problem Statement: After install, attempting to create an On Demand tag in Aspen Calc produces the error: | Solution: Add any sub-folders that are missing in the ...\Aspentech\Aspen Calc\ directory. Below is how the structure should appear:
Keywords: aspencalc
ondemand
References: None |
Problem Statement: The Excel Add-In fails to correctly retrieve data in an Excel spreadsheet created in Aspen Calc | Solution: When Aspen Calc creates a spreadsheet for you, it is created as a shared file. Because of a Microsoft limitation in how shared files can be accessed and manipulated, it keeps the Process Data Add-In from working correctly. Before adding or attempting to update Process Data Add-In calculations in an Aspen Calc spreadsheet, it is first necessary to remove the shared flag from the file in Excel.
Since this is a Microsoft limitation that cannot be resolved by Aspentech, we recommend not using Process Data Add-In calculations within an Aspen Calc spreadsheet.
Keywords: Aspen Calc
freeze
fail
hang
References: None |
Problem Statement: It is possible to call a method from the Aspen Process Data automation interface within an Aspen Calc calculation even though:
1. The Aspen Process Data automation interface objects do not appear in the list of available COM objects when creating a COM object calculation
and
2. There is no way to add in a 'reference' to the Aspen Process Data automation interface within Aspen Calc.
'====================
' "tiger" is the ADSA data source (server) name and "atcai" the tag used in this example
dim x, i, ds, query, hist, samples
set x = createobject("AtProcessData.DataSources")
apdSum = 16
apdHour = 1
set ds = x("tiger")
set hist = ds.tags.add("atcai").history
set query = hist.query
query.begintime = now() - 1
query.endtime = now()
query.period = 24
query.periodunits = apdHour
query.type = apdSum
ds.readhistory(false)
set samples = hist.samples
c = samples.count
for i = 1 to samples.count
    a = samples(i).value
next
'===================
Keywords: None
References: None
Problem Statement: AspenCalc includes a comprehensive list of formulas that are pre-defined for use in calculations. These can be found in the Builtin library. Formulas from this library can be combined to create more advanced, complicated calculations. One such calculation would be to evaluate a monthly average for an individual tag. The following TechTip provides instructions on how such a monthly average calculation can be created and also provides a template on how the Builtin library formulas can be used to create larger, more personalized formulas. | Solution: The first step is to build and save a monthly average formula, which is created by referencing standard functions included in the Builtin library. The function can be created with the Formula Wizard, with CalcScript as the Formula Type. The completed function should look like the following:
A = TagAverage(Tag, (Now() - (Day(V1))), Now(), Actual, Stepped)
Use the Insert Formula button to access the TagAverage() formula. Replace the default variables T1 and T2 in the TagAverage() function with the correct combination of the Now() and Day() functions.
Note: It is important to use the Insert Formula button to insert the Built-in functions.
This function can calculate the monthly average because the Day() function looks at the input date of the tag it is mapped to and returns the day of the month. So if the input date is 06/13/01, it returns 13. If the calculation is scheduled to execute on the last day of the month, it will subtract the correct number of days from the current date. This means it doesn't matter whether it is calculating the monthly average over a 31-day month or a 30-day month; the correct number of days will be subtracted.
The next step would be to create a calculation and bind the correct values. In the Monthly Average function above, the necessary variables to bind to IP.21 values are as follows:
A -- Output tag
Tag -- Tag you want to calculate the average for. The Field Name should be set to IP_INPUT_VALUE and the History Field should be set to IP_TREND_VALUE.
V1 -- Tag you want to calculate the average for. The Field Name should be set to IP_INPUT_TIME and the History Field should be set to IP_TREND_TIME.
The last step is to schedule the calculation so that it executes on a once-a-month schedule. For the calculation to work correctly, it should execute on the last day of each month. To do this, create a new schedule group with the Group Type set to Periodic Group. The Group period should be set to Day of Month with End of Month selected from the drop-down list.
Keywords: monthly average
tagaverage
References: None |
Problem Statement: The CalcScript predefined constants for quality statuses and levels aren't available in VBScript. | Solution: Use the InfoPlus.21 level and status values as defined in the setcim.h header file rather than the predefined constants. The pertinent section of this file is copied below.
An example using the demo database:
someValue = (tag1 - tag2) * tag3
ReturnValue = 100 * someValue
' This updates the status value in the database:
ReturnValue.status = -3
'This doesn't work:
'ReturnValue.status = QSTATUS_BAD
From the setcim.h file:
************************************************************************************
#define QSTATUS_INITIAL 0 /* Process value has never been updated */
#define QSTATUS_GOOD (-1) /* Quality status is GOOD */
#define QSTATUS_NO_STATUS (-2) /* Calling program did not provide a quality status */
#define QSTATUS_SUSPECT (-3) /* Quality status is SUSPECT */
#define QSTATUS_CLAMPED_HI (-4) /* SETCIM has clamped the process value to a high entry limit */
#define QSTATUS_CLAMPED_LO (-5) /* SETCIM has clamped the process value to a low entry limit */
#define QSTATUS_BAD (-6) /* Quality status is BAD */
#define QSTATUS_NO_FIELD (-32768) /* There is no quality field */
and the following values for quality level:
#define QLEVEL_GOOD 0 /* Quality level is GOOD */
#define QLEVEL_SUSPECT 1 /* Quality level is SUSPECT */
#define QLEVEL_BAD 2 /* Quality level is BAD */
#define NO_QLEVEL (-1) /* Quality level is not available */
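For example, to branch on bad quality inside a VBScript calculation, compare against the numeric value directly. A small sketch, assuming the tag parameter's quality status can be read through the same .status property the example above writes to:
If tag1.status = -6 Then    ' -6 corresponds to QSTATUS_BAD in setcim.h
    ReturnValue = 0
Else
    ReturnValue = tag1
End If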
Keywords: quality
status
level
QSTATUS
QLEVEL
References: None |
Problem Statement: How to interpret the Layer Pattern results in EDR Plate Fin Exchanger? | Solution: Let’s say the following is the layer pattern for which the exchanger is designed.
BAB(ABAC*18)ABABAB
With Layer by Layer Simulation, the detailed heat transfer and pressure drop calculations are performed.
The Results -> Mechanical Summary -> Exchanger Diagram -> Layer Pattern can be interpreted as
Line 1: A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A
Line 2:B B B C B C B C B C B C B C B C B C B C B C B C B C B C B C B C B C B C B C B B B
Line 3:
Line 4:----v----1----v----2----v----3----v----4----v----5----v----6----v----7----v----8-
Line 5: A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A A
Line 6:B B B B B B B B B B B B B B B B B B B B B B B
Line 7: C C C C C C C C C C C C C C C C C C
Line 1 indicates the hot stream, which is Layer A.
Line 2 indicates the relatively cold streams, which are Layers B and C.
The layer pattern can be read from the first two lines, starting with BAB(ABAC*18)ABABAB.
Line 3 is either empty or shows the presence of hot and cold streams.
Line 4 counts the total number of layers; the integers mark multiples of 10 (1 indicates 10, while 8 indicates 80).
Lines 5, 6 and 7 indicate the A, B and C layers respectively, showing the individual number of layers of each type in the exchanger.
Keywords: Plate Fin Exchanger, Layer Pattern, Lines.
References: None |
Problem Statement: The Aspen PlateFin can generate two types of files;
· A Process Simulation File (PSF) that contains the process data and properties information in the current PlateFin case. The PSF files can be imported into other EDR programs using the File | Import option.
· A MUI file that is the input file for the heritage HTFS MUSE program for plate fin heat exchangers. Since PlateFin can open (import) a MUI file, the capability to generate an MUI file for use as input to MUSE facilitates cross-checking the two programs.
Described below is how these files may be generated. | Solution: Go to the Input | Program Options | Calculation Options | Calculation tab and from “Create PSF or MUI output file” select the appropriate action.
The MUI and PSF files are created in the same directory and with the same name as the current case file, but with the extension *.mui or *.psf instead of EDR.
Since PlateFin can handle more geometrically complex exchangers than MUSE, there might be inconsistencies between the PlateFin and MUSE cases. It is worthwhile checking that the areas and number of layers for each stream are as expected. In addition, MUI files can only be generated for axial flow exchangers, not crossflow exchangers.
Keywords: Process Simulator File, MUSE, MULE, MUSC, PFIN, mui
References: None |
Problem Statement: Password change causes inability to run Aspen Calc on NT. | Solution: The account specified in the AspenTech Calculator Engine service defaults to the account used when installing Aspen Calc. If you change the system username and/or the password for the account specified in the Aspen Calc services, you must also change them in the Services setup. To do so, perform the following steps:
1 On the Start menu, click Settings|Control Panel|Services.
2 Select AspenTech Calculator Engine and click Startup...
3 Under This Account, enter and confirm the new username and password for a valid user account.
4 Click OK.
5 Click Start to start the AspenTech Calculator Engine with the new settings.
Keywords: AspenCalc
calc
calcularor engine
References: None |
Problem Statement: How to import or export Aspen Calc calculations and formulas to XML | Solution: Aspen Calc can import and export calculations and libraries using XML which offers several benefits.
Share Aspen Calc objects with another application.
View and Create Aspen Calc objects using a text or XML editor.
Import calculations and formulas from other systems into Aspen Calc.
Display Aspen Calc objects via the Web using XSL.
Efficiently backup Aspen Calc data into text files.
Query Aspen Calc XML data using XML query language.
Export
Single calculation or library:
1. Right-click on the calculation or library item to be exported.
2. Select the menu item Export XML.
3. Name the file and designate a save location in the Export XML file system window.
All calculations or libraries:
1. Right-click on the calculation or library tree to be exported.
2. Select the menu item Export XML.
3. Name the file and designate a save location in the Export XML file system window.
Import
1. Right-click on the server node of a calculation or library tree.
2. Select the menu item Import XML.
3. Select the desired XML file from the Import XML file system window.
As soon as the import process is finished, the imported object will be visible in the object tree.
Keywords: import
export
xml
calc
aspen calc
References: None |
Problem Statement: To simplify the creation of multiple Schedule Groups, you can copy a Schedule Group on the same server node or to a different server node. You can then rename the Schedule Group. | Solution: To copy a Schedule Group:
In the Schedule Group view, right-click the Schedule Group to be copied. A context menu is displayed.
Click Copy.
Right-click the server node into which you want the Schedule Group to be copied. A context menu is displayed.
Click Paste. The Schedule Group is copied to the selected node. If the selected server node is different from the original server node, the copy will have the same name as the original Schedule Group. If you are copying into the original server node, the Schedule Group name will be CopyOfxxx (where xxx is the name of the original Schedule Group).
Note: You can also copy a Schedule Group into a server node by selecting it and dragging it onto the server node.
To rename a Schedule Group:
Right-click the Schedule Group to be renamed. A context menu is displayed.
Click Rename. The Schedule Group name is made editable.
Enter the new name.
Keywords: calc
aspencalc
References: None |
Problem Statement: What are the calculation modes of Aspen Plate Fin Exchanger? | Solution: There are four calculation modes in Aspen Plate Fin Exchanger. They are (1) Stream by stream simulation, (2) Layer by layer simulation, (3) Checking and (4) Design.
(1) Stream by stream simulation
Both Simulations use specified inlet conditions of each stream, and predict the outlet conditions. The exchanger geometry must be specified. In calculating stream outlet conditions for each stream, the program uses an overall metal temperature profile along the length of the exchanger. This is the common wall temperature assumption. Specification of the layer pattern is optional; the program just needs to know the number of layers of each type. It effectively assumes that the layer pattern is good.
(2) Layer by layer simulation
Both Simulations use specified inlet conditions of each stream, and predict the outlet conditions. In calculating stream outlet conditions for each stream, the program does separate calculations for every layer in the exchanger and derives the metal temperature profile along every parting sheet between layers. The layer pattern (stacking pattern) must be specified. The calculation effectively evaluates how good the layer pattern is. The detailed metal temperatures it produces can be used to assess thermal stresses.
Layer by layer calculations are currently restricted to axial flow exchangers. Stream by stream calculations can be used for axial and crossflow exchangers. Layer by layer calculations can take significantly longer than stream by stream calculations, particularly for exchangers with a large number of layers.
(3) Checking
You specify all the streams conditions (or inlet conditions and duty) and the exchanger geometry. Checking mode tells you if the exchanger is adequate for the service or not.
(4) Design
Design mode tries to find a first shot design. These calculations provide an initial estimate of the size of exchanger(s) needed to perform a specified duty. They predict the number of layers for each stream and the size of all layers, together with the finning to be used in each, and distributor and header locations and dimensions. They do not determine a layer pattern. The term first shot is used because an experienced designer may well be able to significantly improve the design, and because (unless proprietary fin performance data is used) a manufacturer will almost certainly produce a different design for the duty.
Design calculations are restricted to axial flow exchangers. They cannot be used for crossflow exchangers at present.
No geometry information is needed for a Design calculation, but any that is provided will be incorporated when possible. This could for example include specification of some or all of the fins to be used.
Design calculations are based on the common stream temperature assumption.
Keywords: Calculation mode, PlateFin
References: None |
Problem Statement: Some calculations may appear with the red checkmark, indicating they are experiencing an error. The error message that appears under the Properties of the calculation may be:
Error reading InfoPlus.21 data | Solution: Several customers have reported this problem, and for all of them, a simple stop and restart of the Aspen Calc Service has eliminated the error message.
Keywords:
References: None |
Problem Statement: Excel calculations return error message (Windows NT only).
Error executing calculation: Error Getting Excel Server Dispatch, | Solution: The account you are using should have DCOM default launch permissions.
To do so, perform the following steps:
1 Run dcomcnfg. The Distributed COM Configuration Properties dialog box is displayed.
2 Select the Default Security tab and click Edit Default in the Default Launch Permissions frame. The Registry Values Permissions dialog box is displayed.
3 Click Add and select your user name from the correct domain.
4 Click OK on the dcomcnfg dialogs.
5 Restart the Aspentech Calculator Service from Control Panel.
Keywords: AspenCalc
calc
excel
References: None |
Problem Statement: This KB article shows how to access the repeat area field IO_Device for TSK_DETECT using a calculation developed in Aspen Calc. | Solution: The following two examples demonstrate how to access the repeat area field IO_Device for TSK_DETECT with use of an Aspen Calc calculation. Specifically, they show how to configure Data Binding properties of the calculation.
Example 1)
Example 2)
Keywords: TSK_Detect;
Aspen Calc;
IO_#_of_Tags;
References: None |
Problem Statement: A user wanted to create a SQLplus script to automatically capture Aspen Calc calculation errors and send them in an email. | Solution: Here's a possibleSolution
local CalcCmd;
local list;
local i int;
local calc;
local str char(75);
CalcCmd = createobject('CalcScheduler.CalcCommands');
list = CalcCmd.GetCalculationList(); --Get a list of all the calculations
FOR i = lbound(list) to ubound(list) DO --Loop through all the calculations in the list
calc = CalcCmd.GetCalculationObject(list[i]); --Retrieve calculation parameters
str=calc.errortext; --Assigns error string to variable
IF str <>'' THEN --If there's a calculation error write it out, if not exit loop
WRITE 'Calculation Name:'||list[i];
WRITE str; -- Write the error message
-- Put email code in here.
END;
End;
The user could include additional code within the If block to send an email or page. See Knowledge Base Solutions 103878 or 115270 for guidance on how to send an email from SQLplus. This query could be saved as a CompQueryDef record and put on an appropriate schedule. The query would then send an email whenever it ran and found any Aspen Calc calculation errors.
Keywords: Aspen Calc, SQLplus, Aspen Calc errors, calculation errors, Aspen Calc error reports, Aspen Calc COM, Aspen Calc API, Aspen Calc VB
References: None |
Problem Statement: This knowledge base article discusses whether or not it is possible to change the Parameter Type in a formula or calculation. | Solution: It is not possible to change the Parameter Type of a parameter in a formula or calculation. Aspen Calc sets the parameter type automatically in the formula. In more complicated formulas some parameters are set to input/output. If your formula calls for a different parameter type the best way to accomplish this is to add an additional parameter.
Example:
To set a parameter type to Input, add an additional parameter to the CalcScript:
b = a
b = b + 1
In this calculation, a is an Input and b is an Output. You can bind a to a tag, and the calculation will not write a value back to that tag.
Keywords: None
References: None |
Problem Statement: SQLA tags are special records inside an IP.21 Database. They allow you to use Aspen Process Explorer to 'directly' display values held inside tables inside a Relational Database (Oracle or SQL-Server).
Suppose you add an SQLA tag to a Process Explorer plot and see nothing but asterisks for the values. How can you troubleshoot the problem? | Solution: Configuration for connecting to the Relational Database is done via the R21 section of a file called C21_Config (see Knowledge Base Solution 105127 for more details).
There is a parameter inside the R21 section called DEBUG. The default is '0'. Setting it to '1' causes Debug messages to be written to an out file. Tsk_PD_Server needs to be restarted to cause the C21_Config file to be re-read. The file containing the debug messages is called pd_server.out.
HOWEVER, it is NOT the out file located in the normal group200 directory. The correct pd_server.out for viewing any SQLA messages is in the directory ...\AspenTech\InfoPlus.21\c21\k21\log.
The sequence of troubleshooting events should then be:
Turn on Debug via the C21_Config file
Restart Tsk_pd_Server
Restart Process Explorer
Add an SQLA tag to Process Explorer
Look in ...\AspenTech\InfoPlus.21\c21\k21\log\pd_server.out
Remember to disable the debug option when the problem is resolved. If debug is not disabled, the resulting log file can become very large.
NOTE: Process Explorer interacts with the Relational Database by sending SQL Queries to the Relational Database via TSK_PD_SERVER. Inside the pd_server.out you will see the SQL queries. These queries are Oracle or SQL-Server Queries, they are not AspenTech SQL queries. Therefore, the final troubleshooting would be to try using the Oracle or SQL Server tools with these queries to better understand why Process Explorer may be having problems.
Keywords: SQLA
no
data
process
explorer
References: None |
Problem Statement: This knowledge base article answers the question of whether or not the Aspen InfoPlus.21 Application Programming Interface (API) is thread-safe. | Solution: The term thread-safe is used to describe subroutines and functions which can be called from multiple threads without unwanted interaction between the various threads. Functions and subroutines are deemed thread-safe if they are protected from multiple simultaneous execution. The Aspen InfoPlus.21 API routines are thread-safe.
Keywords: thread
safe
threadsafe
References: None |
Problem Statement: What is TSK_POP used for? | Solution: The TSK_POP task is used for popup window support for GCS displays. If you are not using GCS, you can check the Skip during startup box for this task.
Keywords: IP.21 manager
References: None |
Problem Statement: How can I view the configured Repository Parameters including the existing Buffer and Cache sizes without having to Start the Aspen InfoPlus.21 database and/or Load the History fileset data?
For Example, I have the following scenario:
I am performing an Aspen InfoPlus.21 upgrade to a newer aspenONE software version or preparing for some version regression testing prior to upgrading.
I have moved over or copied in place to the Aspen InfoPlus.21 database server my database snapshot and the corresponding *.dat files (config.dat, map.dat, tune.dat), but I have not yet copied over any of the actual History filesets for the Repositories.
Now, before moving over any History filesets and officially starting up the Aspen InfoPlus.21 database to load the existing History and begin testing and collecting new data, I want to investigate my current Buffer and Cache Size for each Repository to see if I need to make changes for any reason before starting up with the History in place. | Solution: There are two ways to view the currently configured Repository parameters, including the Buffer and Cache sizes, on all Aspen InfoPlus.21 systems (including V2006.5 and higher) without having to start the Aspen InfoPlus.21 database and/or have the History filesets loaded.
1. The easiest way to view the existing system parameters configured for a specified Repository, including its current Buffer and Cache sizes, is to use the SYSTEM command at the h21param> prompt of the H21MON utility in DOS.
Open a command (DOS) prompt window (Click Start | Run and type in cmd and then click OK).
Execute a Change Directory to switch into the \h21\bin directory as noted below.
C:\Program Files\AspenTech\InfoPlus.21\c21\h21\bin
At the directory prompt now type in h21mon.exe and hit the enter key on your keyboard.
At the resulting h21mon> prompt now type in the command text parameters and hit the enter key on your keyboard.
At the resulting h21param> prompt now type in the command text setrep <repository_name>, where the value for repository_name is equal to the actual name of the Repository whose configured parameter settings you wish to examine.
Finally, after setting the specified name of desired Aspen InfoPlus.21 Repository to examine, then at the secondary h21param> prompt type in the command text system and again hit the enter key on your keyboard to continue.
The results for the configured parameter settings for the selected Repository will now be returned.
Repeat the setrep <repository_name> and system command procedures as needed to select and view the configured parameter settings for other Repositories.
(** See the first example screen shot attached below)
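For reference, the keystroke sequence for Method 1 looks like this (the repository name TSK_DHIS is just an example):
C:\Program Files\AspenTech\InfoPlus.21\c21\h21\bin>h21mon
h21mon> parameters
h21param> setrep TSK_DHIS
h21param> system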
2. For newer software versions (Applicable ONLY for aspenONE versions V2006.5 and Higher) you can also start the Aspen InfoPlus.21 database WITHOUT starting up the History, and check the configured Repository parameter settings using the Aspen InfoPlus.21 Administrator:
IMPORTANT: The actual arc.byte, arc.dat, and arc.key files for the Repository filesets do not have to be in place, but the physical Arc folders listed in the config.dat for the Repository filesets do have to exist.
First use the Aspen InfoPlus.21 Manager to manually start TSK_DBCLOCK.
Next use the Aspen InfoPlus.21 Manager to manually start TSK_H21T.
Last use the Aspen InfoPlus.21 Manager to manually start TSK_ADMIN_SERVER.
Now open the Aspen InfoPlus.21 Administrator.
** The Aspen InfoPlus.21 database should now be started, and in the Administrator the Repositories should be stopped with their icons displayed in RED. The icons for the Repository Filesets displayed in the Administrator should also show as RED, and if you examine any of the tags you will only see the last data points that are still in cached memory.
Right-click on the Repository whose properties you wish to view and select Properties.
NOTE: AspenTech KB Article 125266 has more detail on variations of the selective task/database startup solution to exclude history as noted above for aspenONE versions V2006.5 and higher.
Screen Shot of Solution Method 1:
Screen Shots of Solution Method 2:
Aspen InfoPlus.21 Manager screen shot showing only the TSK_DBCLOCK, TSK_H21T, and TSK_ADMIN_SERVER started as noted in theSolution procedure previously provided above. Please note that in the Aspen InfoPlus.21 Manager the Running Tasks are listed in alphabetical order rather than the order in which they were started up.
Aspen InfoPlus.21Administrator Screen Shot showing that Repositories are stopped and the icon color of Repositories and Filesets is RED. Also shows that TSK_DHIS contains 10 Filesets as indicated by the example h21mon test.
Aspen InfoPlus.21 Administrator screen shot showing the Properties on TSK_DHIS3 for the Buffer and Cache Size as shown earlier on the screen shot also provided above for the h21mon example.
Keywords: buffer
cache
h21mon
h21mon.exe
h21param
repository
References: None |
Problem Statement: What is the Field_Length field used for in definition records? | Solution: Values set in the Field_Length field represent the size of the actual values stored in IP21. In most cases, signed integers are represented by the number of bits, and unsigned integers by the number of bits plus 64. There are exceptions to this rule, as seen in the example below. Data displayed in the IP21 Administrator or other client tools is formatted by the value in the IP_Value_Format field and does not affect the actual data stored in IP21.
Examples:
Field               Data_Type                Field_Length
Ip_Input_Value      signed_integer-32bit     32
Ip_Input_Quality    signed_integer-16bit     16
Ip_Tag_Type         unsigned_integer-1bit    65 (64+1)
Ip_Archiving        signed_integer-2bit      1026 (1024+2) -- exception to the rule
Keywords: field_length
infoplus21
definition records
definition editor
References: None |
Problem Statement: During the installation of Process Explorer on a client system the install fails after entering the account information in the NT Service Configuration screen with the following error.
(Login Failure unknown user name and password) | Solution: Access.....Control Panel\ Administrative Tools\ Local Security Settings\ .....Under Local Policies\User Rights Assignment\... The Administrators group needs to be added to the policy Access This Computer From the Network.
Keywords: login failure
install
References: None |
Problem Statement: When you attempt to stop InfoPlus.21, you may encounter the following error: The users access permission does not allow this operation. | Solution: If you encounter this error and cannot shut down InfoPlus.21, go to Control Panel | Services and find the InfoPlus.21 Task Service for the group running on the system. Example:
InfoPlus.21 Task Service for Group 200
Stop and restart the task.
Then you should be able to stop InfoPlus.21 through the InfoPlus.21 Manager.
Keywords: stop
ip21
access
permissions
References: None |
Problem Statement: This knowledge base article provides an example VBScript calculation which computes the average value of one tag (Tag1) when the value of another tag (Tag2) has reached a certain value. | Solution: The sample script is below.
-------------
If Tag2.Value = 0 then
    ' BTime is the end of the averaging window (now); ETime is 24 hours earlier
    BTime = NOW()
    ETime = NOW() - 1
    Tag3.Value = TagAverage.function(Tag1, ETime, BTime, 0, 0)
else
    ' Add logic for the case when the Else clause is executed. For example, set Tag3.Value to zero if
    ' you wish to display a zero when no average is available.
end if
---------------
This VBScript calculation computes the tag average of Tag1 if the value of Tag2 is zero. The result is stored in Tag3. Tag1 and Tag2 are input parameters. Tag3 is an output parameter.
Keywords: Conditional
References: None |
Problem Statement: I have a local variable in an Input Calculation (IC) or Output Calculation (OC) module that is used as both an input and later in the same script as an output. This makes it show up under both Input Variables and Output Variables. In Aspen Inferential Qualities (IQ) Config, the calculation works in test mode, but once loaded online the variable never changes values.
Some pseudo-code examples are:
IF ( XX <> XX_last ) THEN
END IF
XX_last = XX
- or -
IF XX > 0 THEN
XX = 0
END IF | Solution: The problem is that there is nothing to update the input value for variable ?XX?. The variable ?XX? is a single entity within the Aspen Calc script, but it is stored as both an Input and an Output variable in Aspen IQ. That means it is treated as two separate variables.
The solution is to hook the Input variable to the Output variable outside of the IC or OC module. There are two methods to do this: with a Cim-IO tag, or with a Map From and a Map To the same variable in a different module.
The two work-arounds:
1. Cim-IO tag: connect both the Input variable XX as a Read and the Output variable XX as a Write to the same Cim-IO tag.
2. Map From and Map To a variable in a different module: similar to the Cim-IO solution above, but use Map From and Map To to connect both the Input variable XX and the Output variable XX to the same variable in another module. It could be in the same IQ or a different one; candidate modules are the Analyzer Update (AZU), Steady-State Detection (SSD), Lab Bias Update (LBU), Lab Data Collection (LDC), or Prediction (PR) modules. You could pick an unused built-in parameter, or create a variable in one of those modules explicitly for the use of the IC or OC script.
Keywords: Aspen IQconfig
References: None |
Problem Statement: What type of filters can be configured for predictions generated by Aspen IQ? | Solution: The IQ prediction (PR module) input variables (MVARs) for algebraic calculation, *.MDL, or *.IQR model files can optionally include a first-order time constant filter, INFILTT1 (input filter time constant, in minutes), a second-order time constant filter, INFILTT2, and a dead time, INFILTDT. Additionally, IQ allows filtering of the calculated result as well, using a similar set of parameters: a first-order time constant, OUTFILTT1, a second-order time constant, OUTFILTT2, and a dead time, OUTFILTDT.
Input filtering is disabled when both INFILTT1 and INFILTT2 are set as 0 (zero), which is the default value. Input filtering is typically used to reduce the noises in measurements and thus increase the stability of the prediction. The output filtering is primarily to introduce dynamics into the prediction because the prediction is a steady state estimation.
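For reference, a first-order filter of this kind is conventionally a discrete lag. A minimal sketch of the update equation, assuming a textbook first-order implementation with execution interval dt (not necessarily IQ's exact internals):
y(k) = y(k-1) + [dt / (INFILTT1 + dt)] * (x(k) - y(k-1))
where x is the raw input and y the filtered value. Note that as INFILTT1 goes to 0 the filter passes the input through unchanged, consistent with filtering being disabled at 0.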
Keywords: IQ filter
MVAR
CVAR
References: None |
Problem Statement: How can I extend Aspen Inferential Qualities (IQ) trends on Aspen Production Control Web Server (PCWS) beyond 3 days? | Solution: Note that the default is 48 hours and it is set in the exe.config file for the Web Provider Data service. This can be changed up to 240 hours (ten days). See instructions below...
Next, you can modify your individual Aspen IQ applications and change both the Plot Hours and Max Plot Hours entries to the desired settings. Do that from the Aspen IQ Detail page for the entire IQ application (access it from the Overview page of PCWS).
WARNING: increasing the Plot Hours Aspen IQ entry also increases the amount of memory consumed by the application. This can be a large amount of memory for IQ applications. You should keep an eye on the memory size of the WebProviderDataSvc.exe process when changing these parameters.
Keywords: None
References: None |
Problem Statement: You can write a VB script to get the binding parameters of an Aspen Calc calculation. For example, take a simple calculation like a=b+c where the two parameters a and b are bound to IP.21 values. If you are not careful about the variable types that you are using in the code, you may get errors. This | Solution: shows one such error and how you can write your code to avoid it.
Excerpts from the code:
.....
Set calccmd = CreateObject("CalcScheduler.CalcCommands", servername)
list = calccmd.GetCalculationList()
.....
For i = LBound(list) To UBound(list)
If UCase(list(i)) = UCase(calculation_name) Then
Set calc = calccmd.GetCalculationObject(list(i))
For j = 1 To calc.Parameters.Count
Set param = calc.Parameters(j)
If param.Name = parameter_name_calculation Then
.....
End If
Next
End If
Next
.....
This code generates the following error:
Invalid data type for index parameter, code 80004005
at line Set param = calc.Parameters(j)
Solution
Code 80004005 just stands for a generic error. Invalid data type for index parameter simply indicates that an integer is expected. You get this error, for example, if you are using a Variant type for j.
To make sure the right type is used, cast the j variable to an integer by replacing the instruction (highlighted above)
Set param = calc.Parameters(j)
with
Set param = calc.Parameters(Cint(j))
This avoids the error.
Keywords: 80004005
binding
VB script
References: None |
Problem Statement: What is the procedure to allow any AspenTech application to Read/Write data directly from an Aspen InfoPlus21 database? | Solution: Create a Cimio device using Cimio for IP21 to allow cimio to read/write data from the IP21 database. The configuration listed below would need to be performed on the machine running the IP21 server.
1. An external task has to be added for the Cim-IO Server for InfoPlus.21, called TSK_IP21_SERVER. The name of the executable is %CIMIOROOT%\io\cio_set_cim\cimio_setcim_dlgp.exe.
2. Add CIMIOSETCIM_200 and the corresponding port number to the services file (C:\Windows\System32\drivers\etc\services).
CIMIOSETCIM_200 13013/tcp # Cim-IO to InfoPlus.21
3. Assuming that the CIM-IO logical device we want to add is IOIP21, the entry in the CIM-IO logical device definition file has to be as shown below:
IOIP21 <IP21_Server_Name> CIMIOSETCIM_200
Once the above configuration has been performed on the IP21 server, do the following on the machine running the Aspen application that needs to communicate with the IP21 server. It is mandatory that the entries in the Windows services file and the CIM-IO logical device definition file match exactly on both the IP21 server and the APC server.
1. Add CIMIOSETCIM_200 entry in the services file.
CIMIOSETCIM_200 13013/tcp # Cim-IO to Infoplus
2. Add the logical device to the Cimio Logical Devices definition file.
IOIP21 <IP21_Server_Name> CIMIOSETCIM_200
This new logical device can now be used to read/write data to IP.21 from any of the compliant Aspen tools such as DMCplus, IQ, APC Builder, RTO, etc.
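As a quick check once both machines are configured, you can test the new device from the client side with the interactive CIM-IO Test API utility (cimio_t_api.exe, typically found under %CIMIOROOT%\code in a standard installation; verify the location on your system). Issue a test GET against the IOIP21 device for a known IP.21 tag; if the read returns data, the services and logical device entries are consistent on both ends.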
Keywords: IP21 connection
Aspen application
References: None |
Problem Statement: The purpose of this Knowledge Base Article is to discuss a common error that users might encounter after an upgrade of Aspen Calc. Under some circumstances, the following error occurs when attempting to open a Calculation listed under a folder: | Solution: The error Failed to edit calculation. Error Saving Edit Object. See file AspenCalcMessageLog.txt for details occurs when a folder is missing under the Aspen Calc Edit folder.
For different versions of Aspen Calc, the path to the Edit folder is different. For example, in version 2006, the path is C:\Program Files\AspenTech\Aspen Calc\Lib\Edit.
In Aspen Calc version 7.1 the path is C:\Documents and Settings\All Users\Application Data\AspenTech\Aspen Calc\Calc\Edit.
In Aspen Calc versions 7.2 and 7.3 running on Windows Server 2008 (32- and 64-bit), the path is C:\ProgramData\AspenTech\Aspen Calc\Calc\Edit.
In the example below, Folder1 is missing. When trying to open the calculation Auto2 in Aspen Calc you will receive the error, since the Folder1 folder does not exist in the expected path.
To fix this issue, make sure Folder1 exists under the corresponding Aspen Calc Edit folder as described above.
Another possible cause of this error is folder NTFS permissions. Verify that the current user has Read and Write access to the Edit folder and its subfolders.
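If it is more convenient than creating the folder by hand, a minimal VBScript sketch along these lines can recreate it; it assumes the version 7.2/7.3 path and the example folder name Folder1, so adjust both for your installation:
Option Explicit
' Recreate a missing calculation folder under the Aspen Calc Edit directory.
' editRoot assumes Aspen Calc 7.2/7.3; substitute the path for your version.
Dim fso, editRoot, subFolder
Set fso = CreateObject("Scripting.FileSystemObject")
editRoot = "C:\ProgramData\AspenTech\Aspen Calc\Calc\Edit"
subFolder = editRoot & "\Folder1"   ' the folder reported missing
If Not fso.FolderExists(editRoot) Then
    WScript.Echo "Edit folder not found: " & editRoot
ElseIf fso.FolderExists(subFolder) Then
    WScript.Echo subFolder & " already exists"
Else
    fso.CreateFolder subFolder
    WScript.Echo "Created " & subFolder
End If
Run it with cscript from a command prompt under an account with write access to the Edit folder.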
Keywords: Failed to edit calculation.
Error Saving Edit Object.
AspenCalcMessageLog.txt
Aspen Calc
Calculation
References: None |
Problem Statement: Why is the timestamp of my scheduled calculation using Aspen Calc Store and Forward (S&F) capability showing seconds values > :00?
If you have a calculation scheduled to run every minute, you would expect the execution timestamps in the resultant tag's history repeat area (viewed in the Aspen InfoPlus.21 Administrator) to look like HH:MM:00.0. Instead, you see timestamps like HH:MM:51.2. | Solution: It is recommended that the calculation be created and tested via Aspen Calc first. When you are completely satisfied with the results, add it to the desired schedule group within Aspen Calc and let the calculation execute at its scheduled time. This gives a timestamp with a seconds value close to :00.0 (it might vary by a few tenths of a second).
Now you can turn the S&F option ON for this calculation via Aspen Calc and let the calculation cycle through the normal scheduled processing. When viewing the resultant tag's data via the Aspen InfoPlus.21 Administrator, each historical timestamp should be the original timestamp plus the schedule interval.
For instance, if the calculation is scheduled to execute every minute and first executed at 11:33:00.1, then the subsequent history entries for the tag storing the result should show 11:34:00.1, 11:35:00.1, and so on.
Keywords: S&F
Store and Forward
timestamp
References: None |
Problem Statement: When the ABAP program is recompiled on SAP and a workflow is executed, the workflow session fails. | Solution: Check whether Informatica and SAP are both set to send and receive Unicode in UTF-8 format. There is an issue between SAP and Informatica where SAP ABAP programs generate output in Unicode UTF-16 format, but Informatica can only process UTF-8. Customers migrating from SAP 4.6 to SAP ECC 6.0 will encounter this problem.
Informatica cannot process data in UTF-16 format; the data must be converted to UTF-8. To resolve the issue:
1. Specify UTF-8 encoding of Unicode for both the source and the target in the connection browser.
2. Convert all the PowerCenter ABAP programs to Unicode.
Keywords: SAP
Informatica
UTF-16
UTF-8
ABAP
Unicode
References: None |
Problem Statement: Bias not updating; bias had a bad value | Solution: Occasionally the lab bias (LBIASNEW) can pick up a bad value. You have two choices.
Option 1: Delete the history
1. In Manage, delete the application and answer Y to delete the history.
2. Load the application.
3. Start it.
4. Wait for enough lab samples, or enter a few bogus ones, to get it going again.
Option 2: Keep the history and set up IQ to calculate the bias based on the next lab sample only
1. Set the update source (UPDSRC) to manual and enter a reasonable value for the bias.
2. Set LBIASFRAC in the LBU section to 1.0 (remember the old value).
3. Set LBUNUMVALS in the LBU section to 1 (remember the old value).
4. Enter a bogus lab value. IQ should use that one sample to update LBIASNEW.
5. Once LBIASNEW holds a good value, reset LBIASFRAC and LBUNUMVALS to their old values.
Keywords: Aspen IQ, Bias, Bad, Bad Value, Update
References: None |
Problem Statement: Running Aspen IQ and Aspen FCC on the same machine creates a Dais installation conflict | Solution: Here is the procedure for getting Aspen FCC 10.2-3 and Aspen IQ 1.2.1 to work together.
1. Uninstall Aspen IQ, ACO Base, and Dais.
2. Reboot.
3. Install Aspen FCC.
4. Find Dais on your computer (it is likely in C:\Program Files\Aspentech\Dais).
5. Right-click the My Computer icon and select Properties, go to the Environment Variables tab, and find the system variable Path.
6. At the beginning of the Path, add the Dais bin directory (e.g. C:\progra~1\aspent~1\dais\bin;). Note that the 8.3 short names (the first 6 characters of each long directory name followed by ~1) were used here; it is not known whether this is necessary, but it does work. See the command sketch after this list.
7. Open a DOS window and type 'trman'. If the Trader Manager does not come up, go back to steps 4-6 and try again.
8. Install ACO Base, then reboot.
9. Start ACO Base, then install Aspen IQ online.
If your machine is already in a bad state, uninstall IQ Online, ACO Base, Dais, and Aspen FCC. Delete all Dais directories. Remove Dais from the Path environment variable. Run regedit, search for 'dais', and remove any relevant entries (ignore Legacy entries). Don't be afraid to call support before you modify your registry. Reboot and go back to step 3 above.
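For reference, a quick way to test the Path change in the current session is sketched below (assuming the default Dais install location from step 6). The set command only affects the open command prompt, so the permanent change must still be made through the Environment Variables dialog as in steps 5-6:
set PATH=C:\progra~1\aspent~1\dais\bin;%PATH%
trman
If trman launches the Trader Manager, the bin directory is being resolved correctly.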
Keywords: Dais, Aspen IQ, Aspen FCC
References: None |