Problem Statement: Some VBA applications, when connecting to a database using SQLplus ODBC driver, require that a user submit his username and password for authentication. Moreover, the operator using the application may not be the same person as the one who is logged in to the network on that computer. Does SQLplus ODBC driver use the UID and PWD connection parameters?
Solution: The SQLplus ODBC driver does not use the UID and PWD connection parameters. It just uses the user that is logged in. To work around this problem you could define a second SQLplus network server on a secret port number with security disabled and use this from the program. To do this you would:
1. Start the InfoPlus.21 Manager.
2. Select TSK_SQL_SERVER.
3. Change the task name (e.g. to TSK_SQL_SERVER2).
4. Change the command line to have a different port number and add the 'n' option to disable security (e.g. 11014 n).
5. Click the 'Add' button.
6. You could then specify the port number in the connection string (e.g. DRIVER={AspenTech SQLplus};HOST=myhost;PORT=11014).
You will also have to create a new ADSA connection and point the SQL component to the new port, and create an ODBC System DSN entry pointing to that new ADSA connection.
IMPORTANT: Before choosing to implement this approach, consider whether essentially creating a work-around to the security model is appropriate for your business.
NOTE: Starting with aspenONE.1, SQLplus ODBC driver will support the use of the UID and PWD connection parameters. For details please see Desktop ODBC User's Guide for Infoplus.21.
Keywords: userID Username Password
References: None
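As a rough illustration of using the alternate port, a hedged sketch of opening a connection with that connection string through ADO is shown below; the same ADODB calls work from a VBA application. The host name and port are just the examples from the steps above, and the test statement is arbitrary:
local conn, rs;
conn = createobject('ADODB.Connection');
-- connection string pointing at the security-disabled server on the alternate port
conn.open('DRIVER={AspenTech SQLplus};HOST=myhost;PORT=11014');
rs = conn.execute('!write getdbtime;');  -- any simple statement confirms the connection works
write rs(0);
conn.close;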
Problem Statement: Sometimes users get the following error message when trying to create an ODBC connection using Aspen SQLplus 64-bit ODBC driver: The module ip21odbc.dll may not be compatible with the version of Windows that you're running. Check if the module is compatible with an x86 (32-bit) or x64 (64-bit) version of regsvr32.exe. This Knowledge Base article shows how to resolve the above-mentioned error.
Solution: Check the Path environment variable to make sure both of the paths listed below are present:
C:\Program Files\Common Files\AspenTech Shared
C:\Program Files (x86)\Common Files\AspenTech Shared
If the path C:\Program Files\Common Files\AspenTech Shared is not in the Path environment variable, you will need to add it to the Path manually. After that you can confirm that both paths are present by running the set path command at the Command Prompt. Once it is confirmed that the above paths exist in the Path environment variable, the error message should no longer occur.
Keywords: None
References: None
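A quick way to run the same check from the Aspen SQLplus Query Writer on the server is sketched below; this assumes the SYSTEM command is permitted for your user:
SYSTEM 'set path';   -- lists the PATH entries; both AspenTech Shared folders should appear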
Problem Statement: How to access the value of the estimation objective function with automation?
Solution: This can be obtained using the LeastSquaresObjective property available on the simulation object, e.g. Application.simulation.LeastSquaresObjective. This property will be documented in the next version, as follows:
Simulation.LeastSquaresObjective
This read-only property allows you to access the result of an estimation run, giving the final value of the least squares objective function.
Example (running external Microsoft Visual Basic):
Label1.Caption = ACMApp.simulation.LeastSquaresObjective
Keywords:
References: None
Problem Statement: Have you ever wanted to scan an IoGetDef record, look for all the Analog (or Discrete) data records that this IoGetDef record points to, and then retrieve data from fixed area fields of those Analog (or Discrete) records?
Solution: For example, let's suppose that you have an IoGetDef record that has over 80 occurrences in the IO_#TAGS repeat area. Let's call this IoGetDef record Aspen_Get. Inside each occurrence of Aspen_Get there is a field called IO_Value_Record&Fld. This field points to the Analog (or Discrete) data record that will be populated by the Cim-IO interface, for that tag. Now let's suppose you wanted to retrieve the contents of the Name, IP_Description and IP_Value fields for those Analog (or Discrete) data records. The way to do this is via the SQLplus INDIRECTION option as indicated by the -> sign. The query would therefore be as simple as:
SELECT io_value_record&&Fld -> Name, io_value_record&&Fld -> IP_Description, io_value_record&&Fld -> IP_Value FROM Aspen_Get
Especially note that although the field is called io_value_record&fld, the ampersand (&) is a special character and thus the need for a double ampersand in the above example. Finally, you could put a WHERE clause to limit the tags in your list. For example, to only find tags that begin with the string abc:
WHERE io_value_record&&Fld -> name like 'abc%'
(again the double ampersand!)
Keywords: sql indirection ioget ip_analog ip_discrete
References: None
Problem Statement: When using the NOT IN function of Aspen SQLplus, it is possible to get No rows selected as an output even when there are records in a folder that match the search criteria. For example, if the folder named folder1 contains three IP_AnalogDef records (rec_1, rec_2, and rec_3), then it would be expected that the following query would return all the tags defined by IP_AnalogDef that are not in folder1: SELECT NAME from IP_AnalogDef where NAME NOT IN (Select RECORD_NAME from folder1); The problem is, sometimes this query will not return any data, and instead will display No rows selected, even when you know that there are records that meet the search criteria. In the cases when no data is returned, the problem is due to a NULL value being selected for one or more of the RECORD_NAMES. This is caused when the folder has had records removed from it using the Aspen InfoPlus.21 Administrator tool. Consider the following example. A folder has four records in it, and one record is removed using the Aspen InfoPlus.21 Administrator. Often times the folder's #DATA_BASE_RECORDS field will remain four, instead of being reduced to three. For this reason, the fourth occurrence is now NULL.
Solution: To work around this behavior, add the clause, where record_name is not null to the query. The entire query would then look like: SELECT NAME from IP_AnalogDef where NAME NOT IN (Select RECORD_NAME from folder1 where RECORD_NAME is not NULL); Keywords: References: None
Problem Statement: I want to model unsteady diffusion in a sphere. The diffusion term is (written as an equation, not in ACM syntax): dc/dt = D/r^2 d/dr (r^2 dc/dr). How can I program this in ACM?
Solution: If you would like to use the compact form, here's the correct way to write it:
r*$c(r.interior) = D*(2*c.ddx(r.interior) + r*c.d2dx2(r.interior));
But if C is an array and not a scalar distribution, such as for example an array of concentrations, you need to control the expansion of equations with for loops:
for j in componentlist do
  for ir in r.interior do
    r(ir) * $c(j, ir) = D*(2*c(j).ddx(ir) + r(ir)*c(j).d2dx2(ir));
    c(j,ir) : initial;
  endfor
endfor
The reason is the way ACM matches the indices of elements of arrays. The complete model would be:
Model Sphere
  r as domain (highestorderderivative : 2);
  C(componentlist) as distribution1D (xdomain is r) of conc_mole;
  D as realvariable (fixed, 1);
  Cex(componentlist) as conc_mole (fixed);
  for j in componentlist do
    for ir in r.interior do
      r(ir) * $c(j, ir) = D*(2*c(j).ddx(ir) + r(ir)*c(j).d2dx2(ir));
      c(j,ir) : 0, initial;
    endfor
  endfor
  // BC
  for j in componentlist do
    C(j, 0).ddx = 0;
    C(j, r.endnode) = Cex(j);
  endfor
end
Keywords:
References: None
Problem Statement: After configuring a report in the Aspen SQLplus web reporting tool to send a report to a specific email address, the email never actually arrives in the Inbox; instead it just sits in the local machine queue under the folder C:\Inetpub\mailroot\Queue.
Solution: When reports in Aspen SQLplus are set to be sent by email, they go through the Default SMTP Virtual Server located in IIS. Make sure this service is started and running. If it is running but you are still not getting emails, there is a possibility that they are just sitting in the local machine queue. In order to clear this queue and start receiving emails, add a Smart host entry to the IIS Default SMTP Virtual Server properties. Following is the procedure on how to add the Smart host field:
1. Open IIS Manager and go to Default SMTP Virtual Server.
2. Right-click on Default SMTP Virtual Server and go to Properties.
3. Go to the Delivery tab in Properties and click on Advanced, which opens the Advanced Delivery dialog box.
4. Specify your Microsoft Exchange server name in the Smart host field and click OK.
After this, restart the Default SMTP Virtual Server and verify that all the messages in the email queue are cleared and you start to receive email as expected.
Keywords: None
References: None
Problem Statement: Sometimes it would be desirable to not have any informational messages appear in your output. For instance, if you're writing a report or preparing a text file for someone and you have your output directed to a file, you wouldn't want messages such as No Rows Selected or 4 rows inserted to appear in your output. There is a way to suppress those messages.
Solution: Using the SET command of LOG_ROWS is the way to accomplish this. It is a toggle of 0 or 1 where 1 is the default to show the messages. To suppress them, simply put SET LOG_ROWS 0; at the beginning of your query. You could also turn it off and on within your query if there's a certain statement's informational output that you do or do not want. Keywords: None References: None
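A minimal sketch of the toggle in context; the output path and the SELECT are just placeholders:
SET LOG_ROWS 0;                        -- suppress 'n rows selected/inserted' messages
SET OUTPUT 'c:\reports\mytags.txt';    -- hypothetical output file
SELECT name, ip_description FROM IP_AnalogDef WHERE name LIKE 'ATC%';
SET LOG_ROWS 1;                        -- restore the default for later statements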
Problem Statement: Is there a way to know how many times in a day a tag is below a certain value?
Solution: Open Aspen SQLPlus and run the following query to know how many times the value has been below or equal to 10,000. NOTE: Modify the query as needed. (replace TagName with the name of your tag and the value field name) Select count (IP_Trend_Value) from ATCAI where IP_Trend_Time between '21-AUG-12 12:00:00' and '22-AUG-12 12:00:00' and IP_Trend_Value <=10000; Keywords: SQLPlus References: None
Problem Statement: How can I display Aspen InfoPlus.21 tag names that are greater than 24 characters in length using an SQL Plus query?
Solution: In order to display tag names that are more than 24 characters use the WIDTH function. The following query will display all the tag names that have a character length of 64 or less. Select Name WIDTH 64 from IP_Analogdef; Note: More details about how WIDTH function is used can be found in the Aspen SQL Plus help file. Keywords: Query SQL Plus WIDTH 24 characters References: None
Problem Statement: Is there a debug file created for errors generated via the automated reporting option of Aspen SQLplus Reporting?
Solution: If an automated report that was scheduled to run at a particular time fails, you will possibly see an error message in TSK_SQLR.out like 'Failed to run the report, check additional log file for more info.'. There will also be a log file generated in the Group200 folder (located either in Program Files\Aspentech\InfoPlus.21\db21\group200 or ProgramData\Aspentech\InfoPlus.21\db21\group200) named SQLplusReportScheduler_<username>.log which should contain more verbose messages.
Keywords: debug, automated reports
References: None
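A quick, hypothetical way to view that log from the Query Writer is sketched below; replace <username> with the actual account name and adjust the folder to match your installation:
SELECT line FROM 'C:\ProgramData\AspenTech\InfoPlus.21\db21\group200\SQLplusReportScheduler_<username>.log';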
Problem Statement: Unable to get multiple-line headers to output correctly when using the CROSS BY function.
Solution: You can only use multiple-line headers if you know in advance what the headers will be. This isn't the case with a general Cross By query. However, if you do know the headers you can use a Cross By within an inner query as follows:
Select ts, atcai a|b, atcbatch c|d from (select ts, avg(value) by name from history where name in ('atcai', 'atcbatch') group by ts);
The output from the above query looks like this:
ts b d
18-OCT-01 06:23:29.2 5.54636 1164
18-OCT-01 06:06:29.2 3.70026 1164
18-OCT-01 05:49:29.2 6.00695 1164
18-OCT-01 05:32:29.2 5.41838 1164
18-OCT-01 06:19:29.2 6.6937 1164
18-OCT-01 06:02:29.2 6.77012 1164
18-OCT-01 05:45:29.2 4.71293 1164
18-OCT-01 06:15:29.2 8.97233 1164
18-OCT-01 05:58:29.2 9.45282 1164
Keywords: multiple header cross by
References: None
Problem Statement: During the ATIMUS WIN2K session, there was some interest in a small SQL script to count the number of user connections on an InfoPlus.21 server. So Frank van de Pol wrote this script to allow counting the number of connections in InfoPlus.21 v2.51. He added: Use it at your own risk. It didn't eat my machine, but I can't guarantee it won't on yours.
Solution:
-- **************************************************
-- record:    UserCountQry
-- calculate user connections to the ip21 database
-- 18 sept 2001 - Frank van de Pol, Cargill Bergen op Zoom
-- **************************************************
-- Modification log:
-- $FP01 18 sep 2001 Initial version
-- **************************************************
-- Description:
--
local users integer;
local port integer;

-- First find out at which port InfoPlus is listening for clients
port = (select port from
        select count(line) as use, substring(2 of substring(3 of line) between ':') as port
        from (system 'netstat -n') group by port
        where use > 5);
write 'tcpip listening port = ' || port;

-- list ip user connections to infoplus database
select distinct(substring(1 of substring(4 of line) between ':')) as remote_ip
from (system 'netstat -n')
where substring(2 of substring(3 of line) between ':') = port;

-- count users
users = (select count(remote_ip) from
         select distinct(substring(1 of substring(4 of line) between ':')) as remote_ip
         from (system 'netstat -n')
         where substring(2 of substring(3 of line) between ':') = port);
write users || ' IP.21 users';

-- store result in database
update IP_DiscreteDef set
    qtimestamp(ip_input_value) = CURRENT_TIMESTAMP,
    qstatus(ip_input_value) = 'Good',
    ip_input_value = users
where name='INFOPLUS_USERS';
Keywords: count counting script SQL script
References: None
Problem Statement: You may create additional TSK_IQ tasks (i.e. tasks that start an occurrence of the executable iqtask.exe). This is beneficial when you are using Aspen SQLplus heavily and the existing (default) TSK_IQ1 task has too many queries to process.
Solution: To add an additional occurrence of iqtask.exe, take the following steps:
1) In the Aspen InfoPlus.21 Manager double click on the TSK_IQ1 task from your list of tasks in the Defined Tasks window. The task detail information will be displayed on the right side of the Aspen InfoPlus.21 Manager window, in the NEW TASK DEFINITION area.
2) Everywhere there is a 1 (i.e. task name, .err files, etc.), edit it to reflect a new task. For example, change TSK_IQ1 to TSK_IQ2, TSK_IQ1.OUT to TSK_IQ2.OUT, etc. (Note that you can change the task name only, and when clicking ADD, it will automatically rename the error and output files to match the task name.)
3) Click on the ADD button at the bottom of the screen. This will add that task.
4) Create a new record based on the definition record ExternalTaskDef.
5) Name the ExternalTaskDef record the same name that you gave the TSK_IQx task in step #2. Make the record USABLE.
NOTES:
1) Each TSK_IQx generates a separate process, called iqtask.exe, in the Windows Task Manager, and each process can be cross-referenced to the TSK_IQx that spawned it by its unique process ID number (PID), which can be seen when you double click on the TSK_IQx task from the list of tasks in the Running Tasks window of the Aspen InfoPlus.21 Manager. The task detail information, including PID, will be displayed in the NEW TASK DEFINITION area.
2) There is only one TSK_SQL_SERVER, which does not affect your TSK_IQx tasks. You do NOT have to set up multiple occurrences of the TSK_SQL_SERVER task.
3) To associate a QueryDef record with your new iqtask, enter the name of the TSK_IQx in the EXTERNAL_TASK_RECORD field of the QueryDef record (see the sketch below).
Keywords: TSK_IQ; iqtask.exe; external tasks;
References: None
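A minimal sketch of note 3, done from a query rather than the Administrator; MyQuery is a hypothetical QueryDef record, and the field name is the one given above:
UPDATE MyQuery SET external_task_record = 'TSK_IQ2';  -- route MyQuery's executions to the new task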
Problem Statement: We have had several people ask whether they can setup an ODBC connection between an InfoPlus.21 database and a Setcim database.
Solution: You can use the ODBC Open Access Module (OAM) on InfoPlus.21 to connect to Desktop ODBC served by a SETCIM system. It is possible using the ODBC OAM to do INSERTs, DELETEs and UPDATEs on a remote Setcim 4.8 system. For this you need:
SQLplus Version on InfoPlus.21 (Windows NT) with SQLplus and ODBC OAM licenses (SQLplus includes the Desktop ODBC driver)
SQLplus Version on SETCIM with the SQLplus and DESK licenses (the DESK license covers the Desktop ODBC server). You don't need to select any OAMs or DESK on installation.
The configuration looks like:
SQLplus on InfoPlus.21
ODBC OAM on InfoPlus.21
Desktop ODBC driver on InfoPlus.21 (Data Source set up for SETCIM node)
   |  SQLplus Network
SQLplus Network Server on SETCIM
No SequeLink software is needed. The SETCIM system can be VMS or UNIX. The setup of the SQLplus network server on SETCIM is different for UNIX and VMS but both are documented in the SETCIM SQLplus user's manual.
NOTE: There is not a supported way to do the opposite and connect from SQLplus on SETCIM to an InfoPlus.21 system.
Keywords: ODBC Setcim Infoplus.21
References: None
Problem Statement: Customers would like an automated way to check the status of their history. One way to do this is to write an SQL+ query that monitors a repository, save it as an InfoPlus.21 record and schedule the activation of that record.
Solution: The following script checks to see if the archive process for a repository is running. It will send a response of Running or Not Running to an output file called HistoryD_HIST.out. (Please note, it will report a status of 'Not Running' if the archiver is stopped OR paused.) Some people have used third party pager software (not available through AspenTech) to page them when the content of this out file reports Not Running. The script is as follows:
system '%h21%\bin\h21qwatch -r TSK_DHIS > HistoryD_HIST.out';
wait 10;
IF (select count (substring (2 of line between ';')) from 'HistoryD_HIST.out' where substring (2 of line between ':') like 'Run / Process%') = 1 THEN
  write 'Running';
ELSE
  write 'Not Running';
END
If you do not have environment variables set up on your computer, then use the script below. Note: 'C' is the drive letter where InfoPlus.21 is installed.
system 'C:\Progra~1\AspenTech\InfoPlus.21\c21\h21\bin\h21qwatch -r TSK_DHIS > HistoryD_HIST.out';
wait 10;
IF (select count (substring (2 of line between ';')) from 'HistoryD_HIST.out' where substring (2 of line between ':') like 'Run / Process%') = 1 THEN
  write 'Running';
ELSE
  write 'Not Running';
END
Keywords: sqlplus; fileset
References: None
Problem Statement: Values displayed in Excel using ODBC not displaying with value format associated with record. For instance, if you were using ODBC connection with Excel to pull in the NAME, IP_VALUE_FORMAT, IP_INPUT_VALUE, and IP_INPUT_TIME. The results were displayed but the data was not displayed using the value format specified in the records. In most cases, there were many more decimal places returned. It seems as though the results are ignoring the value format configured in the record itself and simply returning what was in the memory location. For example, create a new record and set the value format to F10.3. Input 0.0 using the IP.21 Administrator; Excel displayed 0. If you then input in 1.234; Excel displayed 1.233999968. If you then input 9.9 in the record; Excel displayed 9.899999619. Steps to setup this test: Set up an Aspen SQLplus ODBC data source in the Control Panel | Administrative Tools | Data Sources (ODBC). Then in Excel go to Data | Get External Data | New Database Query. Choose your data source just created and put a check in the box for use query wizard to create/edit queries. Choose IP_AnalogDef for the table and NAME, IP_VALUE_FORMAT, IP_INPUT_TIME, and IP_INPUT_VALUE for the columns. Narrowed the search down to simply one record, Chrystina. Chrystina is set up with an F10.3 format. Once you get the results returned in Excel, start changing the value of the record(s) and refresh the data in the spreadsheet in Excel (Data | Refresh Data).
Solution: One may ask if this is a problem with Excel or the ODBC driver? The answer is neither. The default for the SQLplus ODBC driver is to return floating point data directly and allow the application (Excel in this case) to format it. If you want SQLplus to do the formatting, you need to check the Reals sent as Character box on the Advanced page of the ODBC data source configuration. Keywords: ODBC Excel References: None
Problem Statement: SQLplus 2004 and above includes a Web Service that allows users to access Aspen InfoPlus.21 information using the SQLplus query language. This knowledge base article contains a sample VB.NET application that uses the SQLplus Web Service.
Solution: The attached zip file includes a sample Microsoft Visual Studio 2003 solution. Comments in the source code explain how the functionality has been implemented. NOTE: This sample is provided as is and will not be supported by AspenTech development. The application can be launched by running the executable contained in the zip file (TalkToSQLplusWebService/bin/TalkToSQLplusWebService.exe). Microsoft Visual Studio is not necessary to run the executable.
Keywords: web service SQL query
References: None
Problem Statement: Aspen SQLplus query executed from a Web page via an ODBC connection returns ERROR [HYT00] [AspenTech][SQLplus] Query timeout. Where is the timeout value that needs to be set to allow a query to execute to completeness?
Solution: The error code [HYT00] [Aspentech][SQLplus] at first appears to originate from an AspenTech product, but on closer inspection the error code [HYT00] is generic to ODBC drivers built with a Microsoft library or template, where a default timeout of 30 seconds is set. The following link describes how to change the timeout value; a value of 0 means indefinite. http://msdn.microsoft.com/en-us/library/system.data.idbcommand.commandtimeout(VS.80).aspx
Keywords: ODBC .NET query timeout
References: None
Problem Statement: Finding a single data point at a particular timestamp in IP.21 history is not straightforward, due to the high timestamp resolution used in IP.21 (microsecond precision).
Solution: The following query:
select IP_TREND_TIME, IP_TREND_VALUE from ATCAI where IP_TREND_TIME = '28-JUL-04 08:00:00';
will only return a value if the timestamp in history is exactly 28-JUL-04 08:00:00.000000. If the value is instead at timestamp 28-JUL-04 08:00:00.000001, no data is returned.
One workaround is to search between an upper and lower timestamp bound. For example:
select IP_TREND_TIME using 'TS25', IP_TREND_VALUE from ATCAI where IP_TREND_TIME > '28-JUL-04 07:59:55' and IP_TREND_TIME < '28-JUL-04 08:00:05';
Another option is to use the HISTORY table to find an interpolated value at any given point in time:
local value_time timestamp;
local start_time timestamp;
local delta integer;
local end_time timestamp;
value_time = '28-JUL-04 08:00:00.0'; -- MODIFY THIS LINE
delta = 50; -- in tenths of seconds
CAST(delta as TIMESTAMP);
start_time = value_time - delta;
end_time = value_time + delta;
write 'Start Time : '|| start_time;
write 'Value Time : '|| value_time;
write 'End Time : '|| end_time;
SELECT ts as TIME_STAMP width 25, value as VALUE FROM HISTORY
WHERE name='ATCAI' -- MODIFY THIS LINE
and ts > start_time and ts < end_time
and period= delta; -- in tenths of seconds
Keywords: single data point time
References: None
Problem Statement: How do I create an output line > 600 characters?
Solution: Create a local variable whose data type is Variant. Set the variant equal to a string of any length. The fact that the variable is an Variant overcomes the normal 600 character string limit. set output 'c:\out.txt'; local var1 char(100); local i int; local var2; var1 = '1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890'; for i = 1 to 20 do var2 = var2 || var1; write character_length(var2); end; write var2; Resulting Text file out.txt... 100 200 300 400 500 600 700 800 900 1000 1100 1200 1300 1400 1500 1600 1700 1800 1900 2000 12345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890 Keywords: output text file, text file, set output, line length, character length, variant, 600 References: None
Problem Statement: The Aspen Custom Modeler COM interface behaves oddly with C# even though I have written the interface code correctly.
Solution: When developing a C# program to work with the ACM Automation interface, it is necessary to declare C# object variables that reference ACM objects as dynamic to ensure that late binding is used to resolve these objects. Here is an example:
static void Main(string[] args)
{
    dynamic App = new AspenCustomModelerLibrary.AspenCustomModeler();
    App.OpenDocument("C:\\temp\\FiveTank.acmf");
    dynamic Sim = App.Simulation;
    dynamic Flow = Sim.Flowsheet;
    dynamic Input_Flow = Flow.Resolve("Tank1.FlowIn.Flow");
    Console.WriteLine(Input_Flow.Value);
}
Keywords: Late Binding, ACM objects
References: None
Problem Statement: When multiple data types are used in an Aspen SQLplus update statement, the order of updates is dependent upon the data type. For example, if an Aspen SQLplus update statement contains both integer and character values, the integer value will update before the character value. This knowledge base article provides the order in which the updates will occur.
Solution: This list provides the sequence of updates by data type. If the order in which fields are written is significant then separate UPDATE statements are recommended (see the sketch below).
DTYPDUBL
DTYPQUDUBL
DTYPREAL
DTYPQUREAL
DTYPUSTS
DTYPIDFT
DTYPQULONG
DTYPQUTEXT
DTYPREID
DTYPLONG
DTYPSHRT
CHARACTER
Keywords: None
References: None
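A minimal sketch of that recommendation; MyRec is a hypothetical record with a real field and a character field, and splitting the statement guarantees the real value is written before the description:
-- one UPDATE per data type, executed in the order you need
UPDATE MyRec SET ip_input_value = 42.0;
UPDATE MyRec SET ip_description = 'Updated after the value';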
Problem Statement: This knowledge base article shows the query syntax for setting a timestamp variable to the current date but always at a specific time. This is useful when the user wants to initialize a variable every day at a specific hour, regardless of the query execution time.
Solution: Sample query that will set a local variable to the current day but always at midnight:
local StartingTime timestamp;
StartingTime = Cast(current_timestamp as char format 'DD-MON-YY')||' 00:00:00';
write StartingTime;
Keywords: cast as char format concatenate
References: None
Problem Statement: Aspen SQLplus timestamp calculations do not calculate correctly when using timestamps with the D option specified. Note: The D option is used to ensure that an event will reschedule at the same time(s) every day, regardless of Daylight Saving Time changes. Example:
select name,last_executed,schedule_time[1],reschedule_interval[1], last_executed - (schedule_time[1] - reschedule_interval[1]) as duration from querydef where name = 'dtimes'
name     last_executed          schedule_time[1]       reschedule_interval[1]  duration
DTimes   28-MAR-11 14:10:00.5   29-MAR-11 14:10:00.0   D00024:00:00.0          -047:59:59.5
Solution: Adding the ABS() function to parameters that use the timestamp D option allows the time to calculate correctly. Example:
select name,last_executed,schedule_time[1],reschedule_interval[1], last_executed - (schedule_time[1] - ABS(reschedule_interval[1])) as duration from querydef where name = 'dtimes'
name     last_executed          schedule_time[1]       reschedule_interval[1]  duration
DTimes   28-MAR-11 14:10:00.5   29-MAR-11 14:10:00.0   D00024:00:00.0          +000:00:00.5
Keywords:
References: None
Problem Statement: When deleting records, it is first necessary to make them unusable. Is it possible to make records unusable using SQLplus?
Solution: Records can be made unusable by setting USABLE=0. USABLE is 0 if a record is unusable and 1 if it is usable. It can be specified as an indirect field. Here are two examples where a record called Tester-A is made unusable: Update IP_AnalogDef set USABLE=0 where name like 'Tester-A'; update Tester-A set usable = 0; Keywords: remove delete References: None
Problem Statement: How to find tags where the difference between two consecutive IP_TREND_VALUEs is 50. For example, if the IP_TREND_VALUEs range from 1000 to 10000 and you want to find the tags where the difference between two consecutive IP_TREND_VALUEs is 50, the method is as follows.
Solution:
1. Run a SQLplus script like the one below to find tags whose IP_TREND_VALUE is between 1000 and 10000:
Select name, value from HISTORY where value > 1000 and value < 10000;
2. Copy the results from step 1 into Microsoft Excel.
3. In column C, calculate the difference between each pair of consecutive IP_TREND_VALUE values using a formula (shown in a screenshot in the original article).
4. In column D, use a formula (shown in a screenshot in the original article) to identify the tags where that difference is 50.
Keywords: Aspen SQLplus select IP_TREND_VALUE range Excel KR-
References: None
Problem Statement: Many try to use a local variable to hold a record name in the table position of the FROM clause so that they can, within a loop, change the record with every iteration. This will not work: SQLplus does not allow a local variable in the FROM clause.
Solution: You can START a second query from the first query and pass it a parameter. SQLplus does allow this parameter to be in the FROM clause because parameters are treated as macros. For example: we want to create a report that contains a selector record name, and all the possible values for that selector record, for all the existing selector definition records in the database. Because there is not one single definition record for selector records (i.e. select2def, select3def, etc.), we need to cycle through all of these in order to get our desired results. The way to do this uses 2 queries. The first one uses a FOR loop to select the name of the records which are selector defs:
FOR (SELECT name n FROM definitiondef WHERE field_name_record = 'select_description') DO
  START 'c:\aspentech\desktop\sqlplus\query2.sql', n; -- Only 'query2.sql' would be needed if in default directory
END
The second query would be:
SELECT definition, name, select_description FROM &1;
The first query finds the names of all definition records that are selector defs. It then passes the names of these definition records to the second query as a parameter. Parameters in queries are treated as macros and so can be used as part of the FROM clause. The second query then pulls all the data from the relevant record. The results obtained are quite large (one grouping for each selector definition record). Below is just a sampling showing the first 3 selector definition records' worth of data:
DEFINITION name select_description
Select3Def NO/YES NO
Select3Def NO/YES YES
Select3Def OFF/ON OFF
Select3Def OFF/ON ON
Select3Def YES/NO YES
Select3Def YES/NO NO
Select3Def Q_ON/OFF ON
Select3Def Q_ON/OFF OFF
DEFINITION name select_description
Select1Def Q_BLANK/COMMENT
Select1Def Q_BLANK/COMMENT #
Select1Def Q_LABELS_1 B
DEFINITION name select_description
Select2Def Q_RELATIONS =
Select2Def Q_RELATIONS !=
Select2Def Q_RELATIONS <
Select2Def Q_RELATIONS >
Select2Def Q_RELATIONS <=
Select2Def Q_RELATIONS >=
Select2Def Io-Priorities 1
Select2Def Io-Priorities 2
Select2Def Io-Priorities 3
Select2Def Io-Priorities 4
Select2Def Io-Priorities 5
Select2Def Io-Priorities 6
Select2Def Io-Priorities 7
Select2Def Io-Priorities 8
Select2Def Io-Priorities 9
Keywords: local variable parameter loop
References: None
Problem Statement: Is it possible to save a snapshot of the database from within SQLplus?
Solution: Yes, use the following command, which will force the SAVE_SNAP record to execute:
UPDATE Save_Snap SET schedule_time=CURRENT_TIMESTAMP;
Note that this will now become the time basis for when your regularly scheduled snapshot will be saved, unless you issue a second command that resets schedule_time back to the original setting (see the sketch below). For instance, if this Save_Snap record is scheduled for a 24 hour interval, happening every morning at 1:00am, and you execute the above query at 3:05pm, your daily snapshot will now start happening at 3:05pm every day.
Keywords: Snapshot TSK_SAVE
References: None
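A rough sketch (not from the original article) of capturing and restoring the schedule so the regular cycle is preserved; the wait time is arbitrary and may need adjusting so the snapshot has time to trigger before the schedule is restored:
local orig_time timestamp;
orig_time = (select schedule_time from Save_Snap);       -- remember the original setting
UPDATE Save_Snap SET schedule_time = CURRENT_TIMESTAMP;  -- force an immediate snapshot
wait 30;                                                 -- give the snapshot time to trigger
UPDATE Save_Snap SET schedule_time = orig_time;          -- restore the original schedule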
Problem Statement: How can I time how long my query or a portion of my query is taking to run?
Solution: There may be many ways to accomplish this, but here are 3 good examples.
1. Creating standard records in the database
A. Create a standard record in the database which has a history repeat area sized to at least 2.
B. Write to the raw value field each time you wish to set a timer for an operation.
C. At the end of the query, SELECT rows from the Trend Time of your history repeat area and subtract them from each other.
For example:
-- setting the timer
TimerPV.value = 1;
-- code that you want to time would go here
TimerPV.value = 2;
-- report duration
WRITE 'Query took: ' || TimerPV.TrendTime[2] - TimerPV.TrendTime[1];
2. Using a SYSTEM command to show the start & end times
For example (VMS):
SYSTEM 'sho time';
-- code that you want to time would go here
SYSTEM 'sho time';
(If running on a UNIX system, the command is 'date' and on NT it is 'time'.)
3. Using the GETDBTIME function
The function GETDBTIME should be used for this instead of the CURRENT_TIMESTAMP function, as all instances of CURRENT_TIMESTAMP in a query are parsed and resolved at the same time. Therefore, you'll get the same time every time you use it in a single query.
For example:
LOCAL t TIMESTAMP;
t = GETDBTIME;
-- code that you want to time would go here
WRITE GETDBTIME - t;
Keywords: time GETDBTIME CURRENT_TIMESTAMP
References: None
Problem Statement: When accessing IP.21 using ODBC from an ASP page it fails with access denied error.
Solution: Assign a domain user account with permissions to read from the IP.21 database to the anonymous login in IIS. Steps:
1. Open Internet Services Manager.
2. Right-click and select Properties.
3. Select the Directory Security tab.
4. Click the Edit button by the anonymous access box.
5. When Allow Anonymous Access is checked you can click the Edit button.
6. Specify a domain account / password with the appropriate access to IP.21.
Keywords: Desktop ODBC SQLPlus ASP IIS
References: None
Problem Statement: How to search for repeat area fields by using the NXTFTDEF function within SQLPlus.
Solution: By default, the NXTFTDEF function only searches the fixed area of each definition record, to see if it contains the field you specify. But what if the field you are looking for resides within a definition record's repeat area? In order to search for fields in a record's repeat area, one must specify an occurrence number in the query. For example, the following query finds all the definition records that have an IP_TREND_VALUE field (or a field with the same field ID):
LOCAL cur_record record;
LOCAL search_field character(30);
search_field = '1 ip_trend_value'; -- NOTE: replace search_field with the field you are searching for
cur_record = NXTFTDEF(NULL, search_field);
WHILE cur_record IS NOT NULL DO
  WRITE cur_record;
  cur_record = NXTFTDEF(cur_record, search_field, 0);
END
Note: This uses '1 ip_trend_value' rather than just 'ip_trend_value'.
Result:
Analogdef
GroupMembersDef
Q_ParetoNoKeyDef
Q_XBARDef
Q_XBARSDef
Q_XBARCDef
Q_XBARCSDEF
Q_HistScratchdef
IP_AnalogDef
IP_DiscreteDef
IP_SetDef
IP_TextDef
Q_XBAR21Def
Q_XBARS21Def
PMCAnalogDef
PMCDiscreteDef
Keywords: NXTFTDEF
References: None
Problem Statement: This knowledge base article explains how to join tables from two different Aspen InfoPlus.21 databases.
Solution: Create Aspen SQLplus External Links for each Aspen InfoPlus.21 database (each External Link references an ODBC data source pointing to one of the Aspen InfoPlus.21 databases). Then create a join query referencing those External Links and select from the Aggregates or History table, or from any tag record. For example, you could select the average of a product quality tag from Aspen InfoPlus.21 database number 1 and the average reactor temperature from Aspen InfoPlus.21 database number 2, referencing the Aggregates table in both (see the sketch below). The user could also select from the History table or actual tag records if desired.
Keywords: None
References: None
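A rough sketch of such a join, assuming correlation (alias) names are accepted as in standard SQL; db1 and db2 are hypothetical External Link names, and the tag names, time range and period are placeholders:
SELECT a.ts, a.avg AS product_quality_avg, b.avg AS reactor_temp_avg
FROM db1.aggregates a, db2.aggregates b
WHERE a.name = 'QualityTag' AND b.name = 'ReactorTempTag'
  AND a.ts = b.ts
  AND a.ts BETWEEN (CURRENT_TIMESTAMP - 24:00:00) AND CURRENT_TIMESTAMP
  AND a.period = 1:00:00 AND b.period = 1:00:00;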
Problem Statement: This knowledge base article provides suggestions to minimize relational database disconnection errors when multiple Aspen SQLplus queries are processed by multiple instances of iqtask.exe (TSK_IQ#).
Solution: Each TSK_IQ# process (an instance of iqtask.exe) is an independent task. If multiple instances of iqtask.exe process queries which connect to a relational database, they each make separate connections to the database. Executing a query or a disconnect command on one instance of iqtask.exe has no effect on the other instance. Therefore, if a query processed by TSK_IQ1 has disconnected then reconnected to the database, there is still a possibility that a query processed by TSK_IQ2 will not have reconnected to the database. Aspen SQLplus keeps the connection to the relational database open to improve performance. This means that, if the connection is broken, it is only reported as an error on the next execution. For frequently executed queries, this is not a problem because the error causes Aspen SQLplus to disconnect the connection and the next execution restores the connection. However, it is not good for infrequent queries because it is a long time before the next execution. For such queries it is recommended to use the DISCONNECT command. The DISCONNECT command explicitly closes the connection so that the next execution opens it again. In particular, if it seems that queries on one instance of iqtask.exe (TSK_IQ1) have reconnected but queries which run on another instance of iqtask.exe (TSK_IQ2) do not reconnect to the database, keep the following in mind: 1) The DISCONNECT command must be executed on the same task which processes the query which does not reconnect. 2) The DISCONNECT command should be executed after the last statement that accesses the remote database. 3) There may be multiple queries on TSK_IQ2 that access the remote database. All of them should include a DISCONNECT command at the end. The database connection is shared between all queries on a given instance of iqtask.exe. Keywords: connect reconnect disconnect on any error References: None
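A minimal sketch of the recommended pattern for an infrequently executed query; mylink is a hypothetical database link, and the exact form of the DISCONNECT statement should be checked against the SQLplus help:
-- infrequently scheduled query: copy some values to the relational database, then close the link
FOR (SELECT name, ip_input_time t, ip_input_value v FROM IP_AnalogDef WHERE name LIKE 'ATC%') DO
  INSERT INTO mylink.targettable VALUES (name, t, v);
END
DISCONNECT mylink;  -- placed after the last remote statement, on the same TSK_IQ# task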
Problem Statement: Are there any connectivity components, other than ODBC, that can be used to connect to an application called Statistica?
Solution: At the time of this writing (January 2007) there are no mechanisms available, other than ODBC, to allow connectivity to Statistica. However, it is possible to use the Statistica COM API from within Aspen SQLplus like any other COM API. This would allow the developer to combine Aspen SQLplus's agility in reading and writing data to and from Aspen InfoPlus.21 with the powerful statistical capabilities of Statistica. The quote from the Statistica website below clearly indicates that Statistica is a COM based application. Virtually every aspect of STATISTICA is exposed as a set of COM interfaces that are registered on a machine when STATISTICA is installed. Since .NET-based languages cannot communicate with COM directly, a wrapper class called the COM Interop can be utilized to integrate the STATISTICA libraries into your .NET project. The COM Interop layer is created automatically by the Visual Studio .NET IDE when you import a COM interface. The COM Interop layer handles all of the details regarding interacting with the COM libraries in .NET. With the COM Interop layer in place, the STATISTICA COM interfaces behave like any other .NET object. For more information about Statistica, please follow the links below: http://en.wikipedia.org/wiki/Statistica http://www.statsoft.com/
Keywords: None
References: None
Problem Statement: How do you connect to an Excel spreadsheet that's set up as an ODBC data source?
Solution: First of all, you must set up the Excel spreadsheet as an ODBC data source:
1. Start -> Settings -> Control Panel -> Administrative Tools -> Data Sources (ODBC)
2. Click the System DSN tab, then click the Add button
3. Select Microsoft Excel Driver (*.xls) and click the Finish button
4. Enter a name for the Data Source, then click the Select Workbook button and browse to the Excel file
Then, in SQLplus, set up the connection to this new ODBC source:
1. Click the Tables button (or Record -> Paste Fields), then click the Add Link button
2. Enter a Link Name, select the Data Source Name (from step 4 above) from the drop-down list, and click OK. (Note that you do not normally need a username and password.)
3. Back in the SQLplus Records and Tables dialog box, click the Options button and select System Tables
4. Now you can expand the Database Link and see the column names. Note that these column names are whatever entries you have in the first row (A1, B1, C1, etc.).
If you want to access this Excel file while it's open, see Solution #108544: Error opening open Excel spreadsheet: failed to connect to link. It is already opened exclusively by another user.
Keywords: query syntax workbook sheet
References: None
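Once the link exists, a hedged sketch of querying it is shown below; MyExcelLink and Sheet1 are hypothetical, and the actual table name should be taken from the expanded Database Link in the Records and Tables dialog (the Excel ODBC driver typically exposes each worksheet as a table named like Sheet1$):
-- read the whole sheet through the link; adjust the link and table names to what the dialog shows
SELECT * FROM MyExcelLink."Sheet1$";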
Problem Statement: If Excel is installed on the machine on which the Aspen SQLplus script is run, the query output can be sent directly to an Excel spreadsheet using a COM object as detailed in knowledge base article 117590 (Sending SQLplus query output directly to an Excel spreadsheet using a COM object). Alternatively, the query output can be saved as a csv file which can be opened with Excel as detailed in knowledge base article 114314 (Saving query results to a csv file or Excel sheet). This knowledge base article provides ways to generate an Excel (XLS) file which Excel can open even when Excel is not installed on the machine.
Solution: There are 2 methods to generate an Excel XLS file that can be opened by Microsoft Excel.
1. By using the horizontal tab character. The Aspen SQLplus query below will generate an Excel XLS file that is tab-delimited.
SET OUTPUT = 'C:\Output.xls';
WRITE 'IP_TREND_TIME' || CHR(9) || 'IP_TREND_VALUE';
FOR (SELECT ip_trend_time, ip_trend_value FROM ATCAI WHERE IP_TREND_TIME BETWEEN (CURRENT_TIMESTAMP - 0:30) AND CURRENT_TIMESTAMP) DO
  WRITE ' ' || ip_trend_time || CHR(9) || ip_trend_value;
END
2. By using HTML. However, do note that the version of Excel used must recognize HTML. Using this method has the advantage of formatting, as it allows the assignment of colours, bold or italic to values that need to be stressed. Furthermore, even if Excel is not installed, the file can still be opened with Internet Explorer by simply changing the .XLS extension to .HTM or .HTML. The Aspen SQLplus query below will generate an Excel XLS file using HTML.
SET OUTPUT = 'C:\Output.xls';
WRITE '<HTML><HEAD><Title>Testing report</Title></HEAD><BODY><TABLE border=1>';
WRITE '<TH>IP_TREND_TIME</TH><TH>IP_TREND_VALUE</TH>';
FOR (SELECT ip_trend_time, ip_trend_value FROM ATCAI WHERE IP_TREND_TIME BETWEEN (CURRENT_TIMESTAMP - 0:30) AND CURRENT_TIMESTAMP) DO
  WRITE '<TR>';
  WRITE '<TD>&&nbsp;' || IP_TREND_TIME || '</TD>';
  WRITE '<TD>' || TRIM(IP_TREND_VALUE) || '</TD>';
  WRITE '</TR>';
END
WRITE '</TABLE></BODY></HTML>';
A little trick is employed in both of the Aspen SQLplus scripts above. For timestamp values, Excel will try to auto-format them, resulting in an undesirably formatted timestamp. Therefore, before outputting the timestamp value, a single blank space is output first. For HTML, the non-breaking space (&nbsp;) is used.
Keywords: SQLplus SQL Excel XLS
References: None
Problem Statement: The following error is reported when Automating a report in Web Sqlplus Reporting: Failed to get list of printers: ERROR [HY00][Aspentech][SQLplus] SYSTEM commands, CREATEOBJECT and file writes disable at line 1
Solution: The user who is getting this error must be added to the role that has full permission (read, write, System Command and Monitor) to the Sqlplus application in AFW Security Manager Keywords: Web Reports Automated References: None
Problem Statement: What formulas does SQLplus use for the Aggregates table's time weighted average calculations?
Solution: The SQLplus Aggregates table retrieves the average data through the same IP.21 API call that Process Explorer uses. There are 2 request types which can be used for a time weighted average. The first (stepped=0) treats data as though it were continuous, the other method (stepped=1) treats data as discrete/stepped. Below are the time weighted average formulas used for both cases, where c1, c2, c3, ... are the consecutive values and t1, t2, ... are the time intervals between consecutive points. For continuous data the time weighted average assumes a linear change between one point and the next.
For STEPPED=0:
TWA = [ (c1+c2)*t1/2 + (c2+c3)*t2/2 + ... ] / (t1 + t2 + ...)
For STEPPED=1, each value is treated as held constant until the next point, so:
TWA = (c1*t1 + c2*t2 + ...) / (t1 + t2 + ...)
Keywords: None
References: None
Problem Statement: UPDATE Statement in Aspen SQLplus is not allowed to use a column list. For example, the following query will report Cannot update expression list : update MODULE.vals set (maxval, minval, avgval, stdevval) = (select trunc(max,2) as maxval, trunc(min,2) as minval, trunc(avg,2) as avgval, trunc(std,2) as stdevval from aggregates where ts between t1 and t2 and period = 00:45:00 and name = tagname)
Solution: The SQLplus help file has the following statement on the ANSI standard: SQLplus is based on the American National Standards Institute (ANSI) standard X3.135-1992, which is also an International Standard ISO/IEC 9075:1992. SQLplus is primarily a data manipulation tool, and has some data definition capabilities. In addition, SQLplus provides a number of extensions for dealing specifically with the InfoPlus.21 database. The ISO/IEC 9075:1992 Standard has the following description of the UPDATE statement:
<update Statement searched> ::=
    UPDATE <table name>
    SET <set clause list>
    [ WHERE <search condition> ]
<set clause list> ::= <object column> <equals operator> <update source>
<object column> ::= <column name>
So, the standard does not allow UPDATE to use a column list. SQLplus does provide a way to do an UPDATE based on a SELECT, and that is to use a FOR loop. For example:
for (select trunc(max,2) as new_maxval, trunc(min,2) as new_minval, trunc(avg,2) as new_avgval, trunc(std,2) as new_stdevval
     from aggregates where ts between t1 and t2 and period = 00:45:00 and name = tagname) do
  update MODULE.vals set maxval = new_maxval, minval = new_minval, avgval = new_avgval, stdevval = new_stdevval;
end;
Keywords: Update Column List
References: None
Problem Statement: The GetDBTime function returns the timestamp from the local server. This example query makes an ADO connection to a remote server, then queries for the current timestamp of the remote server using GetDBTime. The user will need to change the hostname for this query to work.
Solution:
local conn, rs;
conn = createobject('ADODB.Connection');
conn.open('DRIVER={AspenTech SQLplus};HOST=YourHostNameGoesHere');
rs = conn.execute('!write getdbtime;');
write rs(0);
conn.close;
Keywords:
References: None
Problem Statement: This knowledge base article provides an example query which returns data timestamped at the beginning of each hour instead of data timestamped when the query was executed.
Solution: The CAST and FORMAT functions can be used to convert the timestamp to the beginning of the hour. For example:
local k, begintime;
k = getdbtime - 500:00:00.0;
begintime = cast(k as timestamp format('MM/DD/YYYY HH'));
SELECT ts, avg FROM aggregates WHERE name='TagName' AND ts > begintime and ts < current_timestamp AND period = 1:00
This query will return results as follows:
21-MAR-07 18:00:00.0 6.47409
21-MAR-07 19:00:00.0 6.68552
21-MAR-07 20:00:00.0 6.60208
21-MAR-07 21:00:00.0 6.53467
21-MAR-07 22:00:00.0 6.37668
21-MAR-07 23:00:00.0 6.6583
22-MAR-07 00:00:00.0 6.65019
22-MAR-07 01:00:00.0 6.67604
22-MAR-07 02:00:00.0 6.5059
22-MAR-07 03:00:00.0 6.37533
22-MAR-07 04:00:00.0 6.62406
22-MAR-07 05:00:00.0 6.34709
22-MAR-07 06:00:00.0 6.41541
22-MAR-07 07:00:00.0 6.61303
Another version of a script which returns history at the start of each hour for a given day is shown below.
local begintime timestamp;
begintime = cast(current_timestamp-8:00:00 as character format 'HH:00');
SELECT ts, avg FROM aggregates WHERE name='TagName' AND (ts between begintime and current_timestamp) AND period = 01:00:00;
Keywords: Top of the hour Average
References: None
Problem Statement: This knowledge base article describes how to apply Left, Right or Center justification to output columns in Aspen SQLplus. This article also describes whether or not the Justification statements can be combined with the Using and Width statements.
Solution: Use the words Left, Center or Right following any column name in a Select statement. The default justification is to right justify all Integer and Real data columns and left justify all other columns. The Using and Width syntaxes can be combined with justification. The Aspen SQLplus Using syntax temporarily overrides the display format for any numeric field in a record (e.g. IP_INPUT_VALUE). Any existing RealFormatDef or IntegerFormatDef record can be specified in the Using statement. The format record is enclosed in single quotes, 'F5.2', for example. Each field has its own particular default width when displayed in Aspen SQLplus. The Width syntax temporarily overrides the default column width for any field. The Width statement can be applied to any field in the record. If combining the Using, Width and Justification statements, the Using statement needs to come immediately after the column name, the Width statement follows that, and finally the Left, Right or Center justification statement (see the sketch below).
Keywords: None
References: None
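A minimal sketch combining the three modifiers in the order described above; IP_AnalogDef fields are used as placeholders, and 'F5.2' is assumed to be an existing RealFormatDef record:
SELECT name WIDTH 30 CENTER,
       ip_input_value USING 'F5.2' WIDTH 12 RIGHT,
       ip_description WIDTH 40 LEFT
FROM IP_AnalogDef;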
Problem Statement: This knowledge base article describes why the Aspen SQLplus Query Writer can sometimes hang when GetObject is called from an Aspen SQLplus query to open an Excel sheet.
Solution: This has been seen to occur when an instance of the Excel workbook is already open in another application. In this case, Excel tries to issue a dialog box to ask if it should open a read-only copy. However, the background Excel process does not have access to create windows, so it hangs. This causes Aspen SQLplus to hang as well. A way around this is to explicitly open the workbook as read-only and tell Excel not to display dialog boxes. For example:
local xlApp, xlBook;
xlApp = createobject('excel.application');
xlApp.DisplayAlerts = False;
xlBook = xlApp.Workbooks.Open ('c:\test.xls', 0, true);
write xlBook.Worksheets(1).range('a1').value;
xlApp.Quit;  -- the Quit command referred to below
xlBook = null;
xlApp = null;
The Quit command closes the Excel sheet (it stops Excel.exe if this is the only sheet open). Before trying this, stop any Excel.exe processes on the Aspen SQLplus server using the Windows Task Manager (these may be left over from previous queries).
Keywords: Hang Hung Freeze Froze Not responding
References: None
Problem Statement: Aspen Custom Modeler has access to property methods, especially polymer specific methods such as PC-SAFT that are not found inside HYSYS. When importing an Aspen Custom Modeler (ACM) user model into HYSYS, will HYSYS inherit the property method used inside ACM flowsheet?
Solution: When a user model is created inside ACM (see the attached PDF for instructions on creating an ACM user model for HYSYS), the user model contains all the infrastructure for deploying the property method used in the ACM flowsheet. In the 2004.2 version of HYSYS, you will need to specify the APPDF (problem definition) file that is associated with the ACM user model, and the property method used inside ACM will be used inside the user model in HYSYS. However, note that the property model inherited from ACM can only be used inside the imported user model, not in the rest of the HYSYS flowsheet. There are also some compatibility issues when an ACM user model is brought into HYSYS: 1) You will need to map the components between the HYSYS property system and the ACM flowsheet using the Aspen Properties product embedded inside the ACM model. Users may have some problems using polymer-type components which exist only in ACM and not in HYSYS. The polymer components used inside ACM and Aspen Properties may not have equivalent attributes in HYSYS. For example, ACM and Aspen Properties' polymer components have attributes such as a variable molecular weight, frequencies of segments, etc. 2) HYSYS uses its property system and Aspen Properties / ACM uses another property system, and the two are not usually compatible. ACM models in HYSYS will face this issue. The thermodynamic calculations are different and these discrepancies need to be handled / compensated for inside the HYSYS flowsheet.
Keywords: ACM, Aspen Custom Modeler, ACM user model, polymers, polymers plus
References: None
Problem Statement: I have developed a model in ACM and have exported it to Aspen Plus. The model runs ok in ACM, but in Aspen Plus the flowrate of the output streams is always zero, even if the model is converged.
Solution: You need to check the port properties in ACM. To be able to connect a stream correctly in Aspen Plus, the port type in ACM must be one which is recommended (see the on-line help for details; in general you need to use the MoleFractionPort). When you set the export model properties, you need to check that the correct port type is detected. The type should be Mole Fraction. Otherwise, the variables of the input and output streams will simply be ignored, and nothing will work correctly. To fix the problem, open the acmf file in ACM, right-click the model and select Model package properties, click Next to go to the Port sheet, set the port type to Mole Fraction as described above, then click Next until the end of the wizard. Then re-export and re-install the model.
Keywords:
References: None
Problem Statement: When you browse to http://<servername>/SQLplusWebService/SQLplusWebService.asmx, you get a Server Application Unavailable error message. Furthermore, you find the following error message logged in Event Viewer: It is not possible to run two different versions of ASP.NET in the same IIS process. Please use the IIS Administration Tool to reconfigure your server to run the application in a separate process.
Solution: This is because there is a conflict between the ASP.NET versions used by the applications assigned to the same application pool. Create at least 2 application pools, one dedicated to ASP.NET 2.0 applications and the other to ASP.NET 1.1 applications:
1. Right-click on the Application Pools node.
2. Select New -> Application Pool...
3. Fill in Application pool ID with a name, e.g. ASP_NET_20.
4. Expand the Web Sites node until you reach the applications.
5. Right-click on an application and select Properties.
6. Select the ASP.NET tab to check the ASP.NET version used.
7. Select the Virtual Directory tab.
8. Assign the application to the correct application pool by selecting from the drop-down list beside Application pool.
9. Click on the OK button to confirm the change.
Further checks to ensure proper configuration of the application:
1. Right-click on an application and select Properties.
2. Select the Virtual Directory tab.
3. Click on the Configuration button.
4. Select the ASP.NET tab to check the ASP.NET version used.
5. Check the .aspx and .asmx extensions, ensuring that the same aspnet_isapi.dll from the same folder is used and it is the same as the version specified on the ASP.NET tab.
To illustrate, if the .asmx is using C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_isapi.dll, then the .aspx should be using the same one. Furthermore, on the ASP.NET tab, the ASP.NET version should also be 2.0.50727.
Keywords: Server Application Unavailable IIS process ASP.NET
References: None
Problem Statement: This knowledge base article describes how to work around the following error Connection is busy with results for another hstmt which can be returned when executing a query against a remote Microsoft SQL Server database from within Aspen SQLplus.
Solution: This error message is returned from Microsoft SQL Server. The error is described in the following Microsoft knowledge base article, http://support.microsoft.com/default.aspx?scid=kb;EN-US;q143032 This error is returned because Microsoft SQL Server cannot process more than one statement on a connection at once. For example, if you open a SELECT statement, you must read all the results or close the statement before executing another statement. One way that you can get this error in Aspen SQLplus is to use a FOR loop on a SELECT statement from a Microsoft SQL Server database and then access the database from inside the loop. For example: FOR (SELECT * FROM mylink.mytable) DO INSERT INTO mylink.anothertable VALUES (x,y); END One workaround for the case above is to add an ORDER BY clause to the SELECT statement so that it reads all the results before executing the loop. Another workaround is to execute the queries using different database links which point to the same server. If the queries are executed through different database links the query would look like this: FOR (SELECT * FROM mylink.mytable) DO INSERT INTO mylink2.anothertable VALUES (x,y); END Keywords: None References: None
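As a sketch of the first workaround described above (the link and table names are the same example names used in the article), adding an ORDER BY clause makes Aspen SQLplus read the complete result set before the loop body runs, so the SELECT statement is no longer open when the INSERT executes:

FOR (SELECT * FROM mylink.mytable ORDER BY 1) DO
   INSERT INTO mylink.anothertable VALUES (x,y);
END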
Problem Statement: Under certain conditions you may need to force the memory data in a record to be written to disk. A side effect of doing this is that the create date of the record gets changed to the current date and time, so the data in the repeat area for the record is no longer visible in the Aspen InfoPlus.21 Administrator, nor when using SQLplus to try to get data from a previous date and time.
Solution: There are two ways to resolve the problem: one uses the XOLDESTOK command in SQLplus, available in Aspen InfoPlus.21 version 6.x and higher, and the second uses the xoldestok command-line utility. First solution: using XOLDESTOK in the Aspen SQLplus Query Writer. First pick a record that was created at the same time as the record you're trying to fix. Then use the XOLDESTOK function on that record to get its creation date, and use this creation date to fix the record with the problem. For example, if the record with the problem is ATCTEMP2, and ATCTEMP1 was created at the same time, then you can perform these steps in SQLplus to correct the creation date for ATCTEMP2. WRITE XOLDESTOK('ATCTEMP1 1 IP_TREND_VALUE '); This returns the oldest time stamp for ATCTEMP1, say '7-may-07 10:20'; with this information you can set the creation date for ATCTEMP2 using the following command: XOLDESTOK('ATCTEMP2 1 IP_TREND_VALUE ','7-may-07 10:20'); This sets the oldest timestamp for ATCTEMP2 to be 7-may-07, 10:20. Second solution: using the xoldestok command utility. There's a utility that you can use to change the creation date of a record to a prior date, to be able to see the data saved to it. The utility is called xoldestok.exe and it's found in the C:\Program Files\AspenTech\InfoPlus.21\db21\code folder. For a full description on how to use the command utility see Solution 103040, How to insert data older than creation time of a record; how to use xoldestok. You'll need to run this from a CMD or DOS window. The operation to flush a record's data from memory to disk involves changing two field values in several steps: Set the IP_#_OF_TREND_VALUES field value to zero (0) Set the IP_ARCHIVING field value to OFF Set back the IP_ARCHIVING field value to ON Set back the IP_#_OF_TREND_VALUES value to two (2) After you perform these steps the create date of the record gets changed to the current date and time, and the data in the repeat area for the record is not visible anymore. First pick a record that was created at the same time as the record you're trying to fix. Run the utility on that record to get its creation date, and use this creation date to fix the record with the problem. For example, if the record with the problem is ATCTEMP2, and ATCTEMP1 was created at the same time, then you can perform these steps to correct the creation date for ATCTEMP2. Open a CMD window. Change directory to the C:\Program Files\AspenTech\InfoPlus.21\db21\code folder. Run the command xoldestok. When prompted to enter the record name, enter ATCTEMP1, followed by the word IP_#_OF_TREND_VALUES, like: ATCTEMP1 IP_#_OF_TREND_VALUES This gives you a date; make note of this date and its format, as this is the date you'll use to correct the creation date for record ATCTEMP2. At the Oldest allowed time prompt hit <Enter> to exit. Now run the program again, this time providing the record name you want to fix followed by the word IP_#_OF_TREND_VALUES, like: ATCTEMP2 IP_#_OF_TREND_VALUES At the Oldest allowed time prompt enter the date you saved from record ATCTEMP1 using the same format, and hit <Enter> twice to finish. If you run the command again for record ATCTEMP2 you'll see the new date you just set; this confirms that the creation date for the record was changed. This allows you to see the data collected for record ATCTEMP2 in the IP21 Administrator and SQLplus. For further reference see also Solutions: 103040, How to insert data older than creation time of a record; how to use xoldestok. 
120295, How is the oldest allowed time set when records are created? 100949, How to properly add old data to a new InfoPlus.21 record using SQLplus Keywords: create date flush memory References: None
Problem Statement: Can you explain the meaning of the solver messages when one changes in Run, Solver Options, Non-linear tab, Diagnostics, the values of Highest variable steps and Highest residuals? These messages are issued when running the attached simulation. Solving steady state ... Decomposition: Total number of equations = 2 Total number of groups = 1 Number of nonlinear groups = 1 Largest group size = 2 Solving group 1, size 2, type Nonlinear Newton iteration 0: Highest 2 residuals above tolerance of 1.000000e-005 are -1.000000e+001 B1.AM_Eqn2 -5.000000e+000 B1.AM_Eqn1 The 2 unknown variables for equation B1.AM_Eqn2: Value Name 3.000000e+000 B1.x 1.000000e+000 B1.y Newton iteration 0: Var. Norm=0.000e+000, Eqn. Norm=1.000e+001 (best 1.000e+001) Newton iteration 1: Highest 2 variable steps above 1.000000e-005 are: Value Vnorm component Absolute step Variable name 1.00000e+000 6.25000e-001 1.25000e+000 B1.y 3.00000e+000 3.12500e-001 1.25000e+000 B1.x Newton iteration 1: Highest 2 residuals above tolerance of 1.000000e-005 are 3.125000e+000 B1.AM_Eqn2 1.562500e+000 B1.AM_Eqn1 The 2 unknown variables for equation B1.AM_Eqn2: Value Name 4.250000e+000 B1.x 2.250000e+000 B1.y Newton iteration 1: Var. Norm=6.250e-001, Eqn. Norm=3.125e+000 (best 3.125e+000) Newton iteration 2: Highest 2 variable steps above 1.000000e-005 are: Value Vnorm component Absolute step Variable name 2.25000e+000 7.39645e-002 -2.40385e-001 B1.y 4.25000e+000 4.57875e-002 -2.40385e-001 B1.x Newton iteration 2: Highest 2 residuals above tolerance of 1.000000e-005 are 1.155695e-001 B1.AM_Eqn2 5.778476e-002 B1.AM_Eqn1 The 2 unknown variables for equation B1.AM_Eqn2: Value Name 4.009615e+000 B1.x 2.009615e+000 B1.y Newton iteration 2: Var. Norm=7.396e-002, Eqn. Norm=1.156e-001 (best 1.156e-001) Newton iteration 3: Highest 2 variable steps above 1.000000e-005 are: Value Vnorm component Absolute step Variable name 2.00962e+000 3.18978e-003 -9.60002e-003 B1.y 4.00962e+000 1.91632e-003 -9.60002e-003 B1.x Newton iteration 3: Highest 2 residuals above tolerance of 1.000000e-005 are 1.843209e-004 B1.AM_Eqn2 9.216047e-005 B1.AM_Eqn1 The 2 unknown variables for equation B1.AM_Eqn2: Value Name 4.000015e+000 B1.x 2.000015e+000 B1.y Newton iteration 3: Var. Norm=3.190e-003, Eqn. Norm=1.843e-004 (best 1.843e-004) Newton iteration 4: Var. Norm=5.120e-006, Eqn. Norm=4.719e-010 (best 4.719e-010) Newton: Function convergence after 4 iterations Steady state
Solution: Let's review the output section by section. The message below is issued when the non-linear solver diagnostics option Highest residuals is set to a non-zero value (we suggest a value in the range of 2 to 5, to show the 2 to 5 highest residuals, that is, the equations that are usually preventing the system from converging). Newton iteration 0: Highest 2 residuals above tolerance of 1.000000e-005 are -1.000000e+001 B1.AM_Eqn2 -5.000000e+000 B1.AM_Eqn1 The 2 unknown variables for equation B1.AM_Eqn2: Value Name 3.000000e+000 B1.x 1.000000e+000 B1.y Note that only residuals actually higher than the equation tolerance (as set in the solver options, absolute equation tolerance, 1e-5) are displayed. This explains why sometimes fewer than the specified number of highest residuals are actually displayed, as in iteration 4 in the example. The message shows the value of the (unscaled) residual for these equations, which is simply the difference between the left hand side and the right hand side expressions in the equation. When no equation names are specified in the models, ACM gives them a default name of the form AM_EqnNNN, where NNN is the equation number (1 for the first equation, 2 for the second, etc.). Note that if the model inherits from another model, the equation numbers continue from the parent model; in the case of submodels, each submodel has its own equation number counter. In general it is better to give names to equations to avoid having to count equations in a model. The message also shows the calculated variables involved in the equation that has the highest residual (in absolute value), with the current value of the variables, i.e. the values that have been used to evaluate the residuals. The values are displayed in the base units of measurement (typically, Metric). When variable equivalencing is active (this is the default), all variables in the equivalence are displayed. Note that fixed variables, and variables already solved in previous groups, are not displayed. This information is useful because you can see which equation(s) are preventing the system from converging, and, for the highest residual, the current value of the variables involved. The message below is issued when the value of Highest variable steps is set to a non-zero value. Newton iteration 1: Highest 2 variable steps above 1.000000e-005 are: Value Vnorm component Absolute step Variable name 1.00000e+000 6.25000e-001 1.25000e+000 B1.y 3.00000e+000 3.12500e-001 1.25000e+000 B1.x The threshold value displayed on the first line (1e-5) is in fact the relative variable tolerance, as specified on the Solver Options, Tolerances sheet. Value shows the current value of the variable at which the residuals (and Jacobian matrix) have been evaluated; this is the value before the solver step is applied, displayed in the base units of measurement of the variable. Vnorm component shows the value of the variable norm for this variable. Note that the variable norm is the maximum of the variable norm components over all variables - i.e. the first variable shown in the highest variable steps. Absolute step shows the step to be taken by the variable. This step size is the one calculated by the solver, before any step cutting is applied due to bounds or the step range option. 
The definition of Vnorm is the absolute value of the step size, divided by the absolute value of the variable plus the ratio of the absolute tolerance over the relative tolerance: Vnorm = abs(step) / (abs(x) + abstol/reltol) This information is useful because it shows the variables that are changed by the largest amount. Large variable changes may indicate a very sensitive problem, which can be caused by the non-linear nature of the problem, by poor variable scaling, or by poor starting values for the non-linear iterations. When you see that the variable steps are very small but the residuals are still larger than the equation tolerance, you should also suspect a scaling issue. As the variables are not changed by a significant amount, the solver is unlikely to make any further progress. Sometimes this can be caused by numerical noise in procedures, or incorrect equation scaling. It is also possible that the system actually has no solution. A practical solution is to change the solver convergence criteria to Variables or Residuals (on the non-linear solver sheet). The variable step convergence criterion is satisfied when the largest vnorm is smaller than the relative tolerance. This test corresponds to: abs(step) < (abs(x) * reltol + abstol) Keywords: vnorm References: None
Problem Statement: Your custom SQLplus web-based reports, built for example using an HTML editor and connecting to Aspen InfoPlus.21 via a system DSN in the ODBC data sources with AspenTech SQLplus as the driver, time out when they take longer than 30 seconds to generate.
Solution: The timeout can be set in the .asp page itself by changing the CommandTimeout value for your connection. The default command timeout within ADO is 30 seconds. CommandTimeout tells ADO how long to wait, in seconds, for completion of any command sent to the data source. This value is editable before and after the connection has been opened. The default is 30 seconds, but you can override it like this: Set conn = CreateObject("ADODB.Connection") conn.Open "<connectionString>" conn.CommandTimeout = 120 Keywords: ASP ADO TIMEOUT REPORT References: None
Problem Statement: The Begin/Exception statement does not write if the query fails due to an invalid record name. Query1 (Q1): BEGIN SELECT ip_input_value from ATCAI3; EXCEPTION WRITE 'record name not found'; END If the record name ATCAI3 does not exist, the query will fail and will not write the exception message.
Solution: The Begin/Exception only traps run-time errors, not parse-time errors. A way to work around this is to put the main query in a separate record or file and use a Start command to run it from within the Begin/Exception. As an example: Save the query in Q2 as Select.SQL. Then use a start statement in Q3 to run the query. If the record does not exist, the exception message will write in the output. Q2: SELECT ip_input_value from ATCAI3; Q3: BEGIN START 'Select.SQL'; EXCEPTION WRITE 'record name not found'; END Keywords: References: None
Problem Statement: Is there a way to make Aspen SQLplus query records read only without using record level security?
Solution: Records defined by QueryDef and CompQueryDef have a field called READ_ONLY. If this field is set to YES, any query lines that perform a database update or a write will be disabled. Keywords: Querydef ComQueryDef read only References: None
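For example, assuming a query record named MyQuery defined by QueryDef, the field can be changed from the Aspen InfoPlus.21 Administrator or with a short Aspen SQLplus statement such as:

UPDATE QueryDef SET READ_ONLY = 'YES' WHERE NAME = 'MyQuery';

After this, any lines in MyQuery that perform a database update or a write are disabled until READ_ONLY is set back to NO.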
Problem Statement: What is a good way to get the value nearest to a specific time?
Solution: You can find the nearest value before the time by selecting from the record: SET MAX_ROWS=1; select ip_trend_time, ip_trend_value from mytag where ip_trend_time < mytime; You can find the nearest value after the time by selecting from the history pseudo table: SET MAX_ROWS=1; select ts, value from history where name = 'mytag' and ts between mytime and current_timestamp and request=4; You would then have to compare the timestamps to find which of these two is closer to the time designated as mytime (see the sketch below). Keywords: Nearest value Specific time References: None
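The comparison can also be done in a single script. The sketch below assumes a tag named atcai and an example target time, and that SET MAX_ROWS applies to the SELECT statements used inside the FOR loops:

local target timestamp;
local t1 timestamp, v1 real, t2 timestamp, v2 real;
target = '10-OCT-07 12:00:00';
SET MAX_ROWS = 1;
-- nearest value before the target time
FOR (SELECT ip_trend_time, ip_trend_value FROM atcai WHERE ip_trend_time < target) DO
   t1 = ip_trend_time;
   v1 = ip_trend_value;
END
-- nearest actual value at or after the target time
FOR (SELECT ts, value FROM history WHERE name = 'atcai' AND request = 4 AND ts BETWEEN target AND current_timestamp) DO
   t2 = ts;
   v2 = value;
END
-- report whichever timestamp is closer to the target
IF (target - t1) <= (t2 - target) THEN
   WRITE t1 || ' ' || v1;
ELSE
   WRITE t2 || ' ' || v2;
END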
Problem Statement: Sometimes it is necessary to call a stored procedure on a remote SQL Server or Oracle database. This knowledge base article suggests 2 ways to call a stored procedure on a remote database.
Solution: If calling a stored procedure on a remote database that has only input parameters, the SQLplus EXECUTE statement can be used. If this statement needs to be split over multiple lines, the quote at the end of the first line can be closed and then opened again at the start of the second line. E.g. execute '{call my_proc(123,456' ',789,111,323)}' on my_link; However, if one needs to use output parameters, then the SQLplus EXECUTE statement can't be used. In this case, one needs to use ADO with the SQLplus COM objects. The COM objects are available in SQLplus v.4.0 and later. Below is an example which uses the SQLplus COM objects through an ADO connection with an input parameter and a return value: local conn, rs, adoJohn, adoParam; local adInteger int, adChar int, adParamReturnValue int, adParamInput int, adCmdStoredProc int; adInteger = 3; adChar = 129; adParamReturnValue = 4; adParamInput = 1; adCmdStoredProc = 4; conn = createobject('ADODB.Connection'); conn.open('sqlplus on localhost'); adoJohn = createobject('ADODB.Command'); adoJohn.CommandText = 'com_ado_sp1'; adoJohn.CommandType = adCmdStoredProc; adoJohn.ActiveConnection = conn; adoParam = adoJohn.CreateParameter('returnvalue', adChar, adParamReturnValue, 32); adoJohn.Parameters.Append(adoParam); adoParam = adoJohn.CreateParameter('a', adChar, adParamInput, 32); adoJohn.Parameters.Append(adoParam); adoJohn.Parameters(1) = 'end'; adoJohn.Execute; write adoJohn.Parameters(0); Keywords: References: None
Problem Statement: The following SQLplus example query reads a bit string and then extracts the seventh bit (counting from right to left), showing how to write the extracted bit to an Aspen InfoPlus.21 record.
Solution: write current_timestamp; local mybitstring integer; local mydevicestate integer; -- -- Here is an example of how to read the seventh bit (counting from right to left) -- mybitstring = 64; -- This would be 0000000001000000 in binary -- -- Shift the bits six places, then and with 1 to filter out the bit -- mydevicestate = BIT_AND(1,BIT_SHIFT(mybitstring,-6)); write mydevicestate; -- -- To write it to the record, use a statement like: -- update mydevicerec set ip_input_value = mydevicestate; -- -- As another example, if you want to read the second bit, -- shift the bits one place, then and with 1 -- mydevicestate = BIT_AND(1,BIT_SHIFT(mybitstring,-1)); write mydevicestate; Keywords: None References: None
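Going the other way, the same functions can be used to test a bit and then set it before writing the value back. A minimal sketch, reusing the example record name mydevicerec and assuming its IP_INPUT_VALUE holds the packed bit string:

local current integer;
current = (SELECT ip_input_value FROM mydevicerec);
-- turn on the seventh bit (value 64) only if it is not already set
IF BIT_AND(current, 64) = 0 THEN
   UPDATE mydevicerec SET ip_input_value = current + 64;
END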
Problem Statement: The error maximum string length exceeded is returned when attempting to execute a query containing a WRITE command.
Solution: The maximum string length for a WRITE is 600 characters when using a CHAR data type. Using a variant data type, it is possible to write strings longer than 600 characters. In the following example, a variable of undeclared type is treated as a variant. A string 8000 characters in length will be returned. local x, i int; x = ''; for i = 1 to 1000 do x = x||'abcdefgh'; end write x; Keywords: string character write maximum string length exceeded text Aspen SQL Plus References: None
Problem Statement: When a person is in Excel, MS Query, or Access and accessing the IP.21 database via Desktop ODBC, the pseudo tables available in SQLplus aren't available to them. Many would like to have access to the AGGREGATES and HISTORY pseudo tables in particular.
Solution: In the SQLplus Query Writer, create a view of these tables. Those views can then be accessed in ODBC-aware applications. For example, to create a view of the AGGREGATES table, use a query of one of the following forms. CREATE VIEW aggreg AS SELECT * FROM AGGREGATES; -OR- CREATE VIEW aggreg AS AGGREGATES; Each of the above queries creates a view named aggreg. Once you check the option in your ODBC-aware application to show tables AND views, aggreg will show up in your list. A similar procedure can be used to create a view for the HISTORY pseudo table. CREATE VIEW hist AS SELECT * FROM HISTORY; -OR- CREATE VIEW hist AS HISTORY; The following steps can be used to establish a link to the aggreg view from MS Access. Create a System DSN for the ODBC connection. I specified AspenTech SQLplus as the driver and my local machine as the data source name. Open MS Access and create or open a database. Go to File | Get External Data | Link Tables. Select ODBC Databases as the file type to link. Go to the Machine Data Sources tab and select the Aspen SQLplus Data Source and click OK. You should see a list of available tables. Select the aggreg table/view you created in step 1. Select an appropriate unique record identifier. Name is generally the best option. Once the link is established, it is possible to generate queries using the link to the aggreg view. Keywords: AGGREGATES HISTORY pseudo table ODBC References: None
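Once the view exists, it can be queried from an ODBC client just as you would query a table. A small example against the aggreg view created above (the tag name, period, and time range are illustrative):

SELECT ts, avg FROM aggreg
WHERE name = 'atcai'
AND period = 01:00
AND ts BETWEEN '01-JAN-07 00:00' AND '02-JAN-07 00:00';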
Problem Statement: Creating a delta timespan by subtracting variables defined as timestamps from each other does not work.
Solution: In the first example (shown as a screen capture in the original article), tsIndex is defined as a timestamp, and the attempt to define several successive timespan deltas fails. In the second example, the only significant change is declaring tsIndex as type INT instead of TIMESTAMP, and the delta timespans are defined successfully. Why does one approach work and the other not? The reason is that time deltas are measured in 10ths of seconds. The statements tsToday0h - 800:00:00 and tsToday0h - 28800000 are equivalent. In the statement tsToday0h - 800:00:00, SQL converts the 800:00:00 (800 hours) delta to 28800000 10ths of a second before subtracting. Keywords: delta time References: None
Problem Statement: CompQueryDef or QueryDef records with a reschedule interval of 1 month do not execute as expected.
Solution: When 1 Month is specified as the RESCHEDULE_INTERVAL for a QueryDef or CompQueryDef, the following logic is used. If the day of the month is less than or equal to 14, the rescheduled date uses the same day number. For example, if the reschedule date is set on 08-AUG-06 with a RESCHEDULE_INTERVAL of 1 Month, the resulting date for the SCHEDULE_TIME will be 08-SEP-06. However, if the day of the month is greater than or equal to 15, the date for the SCHEDULE_TIME will use the number of days from the end of the month. For example, if the reschedule date is set on 15-FEB-06 (13 days before the end of the month) with a RESCHEDULE_INTERVAL of 1 Month, the resulting date for the SCHEDULE_TIME will be 13-MAR-06 (31 - 13), the April date will be computed as the 17th (30 - 13), and so on. To ensure that the SCHEDULE_TIME has the same day every month, it is necessary to use the fixed interval setting, 1 Mon-F. In the example above, if the RESCHEDULE_INTERVAL is set on 15-FEB-06 with a value of 1 Mon-F, the SCHEDULE_TIME will be on the 15th day of every month (15-MAR-06, 15-APR-06, etc.) from that point forward. For additional information, see the Aspen InfoPlus.21 Installation Guide. Keywords: month query schedule reschedule interval QueryDef CompQueryDef References: None
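As an illustration, the fixed monthly interval can be set from Aspen SQLplus as well as from the Aspen InfoPlus.21 Administrator. The record name below is hypothetical, and it is assumed that the field accepts its formatted text:

UPDATE QueryDef SET RESCHEDULE_INTERVAL = '1 Mon-F' WHERE NAME = 'MyMonthlyQuery';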
Problem Statement: Hints on using Optimization in Aspen Custom Modeler and Aspen Dynamics
Solution: Once you have developed a flowsheet of your plant, it is easy to implement a basic set of steady state optimization parameters. For example, define an objective function such as the value of the plant product made, less the cost of energy required to create the product. The design variable in this case could be, for example, the temperature of the reaction. By optimizing on these fundamental parameters, you can learn much about the operation of your plant. Once you have run a simple optimization case, you may want to increase the sophistication of your optimization study. For example, you may consider the by-products and a full energy balance of your plant, or the effects of increasing the number of design variables. As the complexity of the problem increases, so does the work the optimization solvers have to do. In this case, you can often benefit from using some of the more advanced facilities and techniques available for optimization. Solution Domain Testing: Before an optimization simulation, it is often useful to use homotopy to ensure that steady state solutions exist within the bounds of the design variables and the constraints. For more information about homotopy, look in the on-line help. You can also test the optimization domain by running multiple steady state runs using automation and Visual Basic, for example. Information about where the optimum solution is likely to be, and whether there are any local minimum or maximum values, can also be obtained. The Aspen Custom Modeler example MethReactorSSEst contains an Excel spreadsheet ReactorSSEst.xls that runs multiple Estimation runs - the Visual Basic in this spreadsheet could be modified to run multiple Steady State simulations instead. Effective constraints: Constrain the design variables by giving them reasonable upper and lower bounds. This can be done in the Constraints definition under Flowsheet in the Simulation Explorer. Scaling: Use scaling to improve both the speed and robustness of your optimization runs. There are two forms of scaling you can use: Objective function scaling - Optimization routines work best when the typical value of the objective function is neither a very large nor a very small number. Note that particularly small numbers tend to result in rounding errors, and a less accurate solution. Values between 1 and 100 work well. If your objective function value lies outside this range, simply multiply the objective function by a value to bring it into this range. Decision variable scaling - At each optimization iteration, the size of the steps taken for the decision variables is proportional to the scaling applied to that decision variable. The larger the scale factor, the larger the step. Scaling can be applied to a decision variable by displaying the Scale property of a variable in a form (using right mouse button / properties and adding Scale to the visible properties list) and adding a scale factor. Alternatively a scale factor can be applied to a variable in the Constraints definition under Flowsheet, or in the model definition itself: Vessel.FlowIn.F.scale : 10; Vessel.FlowIn.T.scale : 0.1; In this example, steps in the flowrate Vessel.FlowIn.F are 10 times the default, and steps in the temperature Vessel.FlowIn.T are 0.1 times the default. In general, if the optimum is very sensitive to the value of a Decision variable, give that variable a small scale factor. If the optimum is insensitive to a Decision variable, give it a larger scale factor. 
Tuning Parameters: There are two tolerance values that directly affect optimization runs. Variable Change Tolerance - The FEASOPT optimizer repeatedly solves a steady state sub-problem for different values of the Decision variables. The Variable Change tolerance determines the precision to which this is done. If the Variable Change tolerance is too large, the optimizer will see noise in the steady state results it gets, and will have difficulty locating an optimum. Reducing the Variable Change Tolerance sometimes helps to reach an optimum for such problems. Variable Change Tolerance has no effect with SRQP. Optimization Tolerance - You find the optimum value when the estimated difference between the objective function at the current test point and the true optimum is less than the Optimization Tolerance. Usually the default value of 1E-4 works well, particularly if the objective function is scaled well. Do not reduce the Optimization Tolerance too much; the precision of the steady state results or physical property calculations may be too low to obtain a solution. You can also use the Non-Linear solver's iterations option for optimization runs. This solver option determines the maximum number of optimization iterations taken before a run terminates. The default of 50 works well, but you can increase this value if the objective function results are still improving after 50 iterations. For SRQP simulations there are some additional options. See the description in the Solver Options panel for SRQP, and read the on-line help for the SRQP panel. (SRQP is available from version 11.1 on.) Keywords: optimization optimisation feasopt srqp References: None
Problem Statement: An SQLplus Web Report may not run and error out with this message: Error generating report test Failed to transform XML No stylesheet was loaded The corresponding SQLReportDef record may report the following information in its ERROR_TYPE field: Failed to load report What can one do to get past this problem?
Solution: Please try the suggestions in Solution 116430, especially if the underlying query has changed. If that does not fix the problem, try restarting the IIS Admin Service on the machine which is acting as the web server. You may be prompted that other services need to be restarted in order to do this; please indicate that those services should be restarted as well. Keywords: None References: None
Problem Statement: An Aspen InfoPlus.21 QueryDef or CompQueryDef record that contains a link to a remote relational database does not execute successfully. The query definition record's #OUTPUT_LINES repeat area contains the following error information: error type=Remote Remote database error on link <link_name> Where <link_name> is the name of the link configured within Aspen SQLplus (SQLplus) for accessing the relational database. During this time, running the query manually from the SQLplus Query Writer works successfully. The reason that the SQLplus client tool works during this time is because each time the client is opened, a new connection to the relational database link is established for that session. This is not the case for QueryDef records where the connection to the relational database link is made once, when the associated EXTERNAL_TASK_RECORD is started (TSK_IQ1 for example). If a database error on the link occurs after this initial connection is made, the QueryDef or CompQueryDef record may remain in an error state.
Solution: To work around this behavior, select the Disconnect on any error check box within the Database Link configuration to make Aspen SQLplus reconnect after any type of database error. If this option is not used, then Aspen SQLplus disconnects only on network errors. This option is useful for databases that do not report network problems in a standard way. Use the steps below to resolve this behavior: 1. Open the Aspen SQLplus Query Writer and click on the Tables button. 2. Expand the Database Links. 3. Select the link of interest and click the Edit Link button. 4. Check the box for Disconnect on any error. 5. Click OK After following these steps, use the InfoPlus.21 Manager to stop/restart the QueryDef record's associated EXTERNAL_TASK_RECORD (TSK_IQ1 for example). Keywords: error on link References: None
Problem Statement: What function should I use in Aspen SQLplus to evaluate a number raised to a certain exponent?
Solution: If you need to evaluate a number (such as 5) raised to a certain power (in this case 3), the proper numeric function to use is POW. POW(x, y) calculates x to the power y. Example: POW(5, 3) = 125 The Aspen SQLplus numeric function EXP(x) calculates the mathematical constant e (approximately 2.71828) raised to the power x. Example: EXP(1) = 2.71828 In mathematics, the logarithm of a number to a given base is the power or exponent to which the base must be raised in order to produce the number. For example, the logarithm of 1000 to the base 10 is 3, because 10 raised to the power of 3 is 1000; the base 2 logarithm of 32 is 5 because 2 to the power 5 is 32. The most widely used bases for logarithms are 10, the mathematical constant e (approximately 2.71828), and 2. For logarithms, Aspen SQLplus provides the numeric function LN, which calculates the natural logarithm (log base e) and is the inverse of EXP. An example of the LN syntax is: LN(2.71828) = 1 You can calculate the LOG function (log to base 10) by using LN (log to base e) and dividing by LN of 10. Example: FUNCTION LOG(X) RETURN LN(X)/LN(10); END WRITE LOG(1000000); Keywords: POW LN EXP LOG exponent logarithm References: None
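Following the same pattern, a general-purpose helper for an arbitrary base can be written (the function name LOGB is just an example):

FUNCTION LOGB(X, BASE)
   RETURN LN(X)/LN(BASE);
END
WRITE LOGB(32, 2); -- writes 5, the base 2 logarithm of 32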
Problem Statement: This knowledge base article explains how text data is handled when queried through the Aspen SQLplus History table.
Solution: The boxcar data compression algorithm used to reduce the amount of storage needed for Aspen InfoPlus.21 history means that data is not stored at regular time intervals even if it is scanned to regular intervals. The History table interpolates the data to present Aspen InfoPlus.21 history data as if it was recorded at regular intervals. Interpolating uses a mathematical algorithm which is only valid for numbers. However, starting with version 6.0, Aspen SQLplus allows text data to be available from the History table for the actual values request types (REQUEST = 3, 4 & 5). The following query is a working example: SELECT ts, value FROM HISTORY(80) WHERE name='ATextRecord' and request = 5 and ts between '05-jul-06 14:00:00' and '05-jul-06 15:00:00'; This yields the following output: ts value -------------------- ------------ 05-JUL-06 14:36:24.5 Hi 05-JUL-06 14:36:29.0 There 05-JUL-06 14:36:34.1 !!! 05-JUL-06 14:36:38.0 You 05-JUL-06 14:36:40.5 are 05-JUL-06 14:36:43.5 nice! Keywords: Text TextDef IP_TextDef Ascii Letters Alphabet history table References: None
Problem Statement: My simulation is correctly specified when my model has only one section. However, if I change the number of sections, I observe that my simulation is no longer square. Why?
Solution: Here's a little example to show what is happening. Model test X as LengthDomain (DiscretizationMethod:CFD2, HighestOrderDerivative:2, numsections:2) ; y as Distribution1D (XDomain is X); y = 123; End If you set X.numsections : 1, then the simulation is square. If you set X.numsections : 2, then the simulation is overspecified by 1 (more generally, it will be overspecified by x.numsections - 1). You can avoid the overspecification by setting highestorderxderivative:0 for y. Another approach is to use x.interior in a systematic way and write: for i in [0] + x.interior + [x.endnode] do y(i) = 123; endfor The overspecification with the original formulation is caused by the continuity equations created by the distribution model when the highest order derivative for the distribution is 2, at the element where the next section starts. At the internal element boundaries a continuity equation is used, so the user should not write equations for those nodes. This is unfortunately what happens automatically when you write the equation without explicitly setting the index (e.g. y = 123). Using the Interior of the domain takes care of excluding these boundaries. The recommendations when using the ACM PDEs are: Specify the highest order derivative to be expected from all dependent distributions in the given domain. For a given distribution, if no spatial derivatives are required then indicate in the distribution definition what maximum order is otherwise required (must be <= domain highest order). IMPORTANT: Do not make use of loop constructs based on [0:x.EndNode] or [1:x.EndNode-1]; instead make use of the Interior set in the domain (plus [0] and EndNode if the boundaries are needed). Using the Interior set ensures you only write equations for the nodes in the domain that require them. It will avoid under/over-specification problems when using 2nd order equations over multiple sections and OCFE, where continuity equations are used at section/finite element boundaries. There is an analogous issue with OCFE when using 2nd order derivatives with 2D or 3D distributions and mixed finite elements and OCFE of varying order. Again this is caused by the internal rule, which in these circumstances leads to continuity equations not being created at certain nodes. If the model is suffering from this type of problem (underspecification), your own continuity equations (or linear interpolations) will need to be added. For example you may need code such as: Cp_ As Distribution2D(XDomain Is x_,HighestOrderXDerivative:1,YDomain Is r_) Of RealVariable; //Special interpolation at finite element boundaries when using OCFE in axial direction If x_.DiscretizationMethod == OCFE2 Or x_.DiscretizationMethod == OCFE3 Or x_.DiscretizationMethod == OCFE4 Then For ax In DIFFERENCE([0:x_.EndNode], x_.Interior)- [0] - x_.EndNode Do Cp_(ax,[0:r_.EndNode]) = (Cp_(ax+1,[0:r_.EndNode])+Cp_(ax-1,[0:r_.EndNode]))/2; EndFor EndIf This code writes the interpolation for the element boundaries, which are all the nodes not in the interior, not the first node, and not the end node. Keywords: References: None
Problem Statement: Flash efficiency does not seem to apply to a Procedure type flash when using any of the submodels below: Props_flash2, Props_flash2_entr, Props_flash2w, Props_flash3, Props_flash3_entr, Props_lle, Props_lwe
Solution: Flash efficiency is only relevant when FlashBasis is Equation or SmoothEquation; it is not used in a Procedure type flash. There is no easy way to support efficiencies for procedural flashes because the Aspen Plus flash routine does not support efficiencies. Keywords: References: None
Problem Statement: How can I determine the data type of a field within an Aspen InfoPlus.21 record?
Solution: It's possible to do this with the Aspen InfoPlus.21 Administrator tool, where each field is displayed using a special icon (if the field is in the fixed area of the record). The most common data types for any field in the database are as follows: Real (single or double precision), Integer, Character, Record Pointer, Field Pointer, Timestamp (Normal, Extended or Scheduled), Repeat Area. You can also navigate to the Fields object under each Definition record and look in the first repeat area to see a list of fields and their data types. SQLplus has a VARTYPE function you can use to determine the data type of a field: SELECT VARTYPE(IP_MESSAGE_SWITCH) FROM ATCL101 Returns 3, which is the data type for (Long) Integers. A complete list of values returned by VarType is available in the SQLplus help (accessible from the Query Writer). This query will return a list of all the fields in a (Definition) record and their data types: SELECT FIELD_NAME_RECORD, FIELD_DATA_TYPE FROM DEFINITIONDEF WHERE Name = 'VIEWDEF' If you know the name of the field you are looking for, then modify the query like so: SELECT FIELD_NAME_RECORD, FIELD_DATA_TYPE FROM DEFINITIONDEF WHERE Name = 'COSACTDEF' AND FIELD_NAME_RECORD = 'Query_Line' Wrap it all up in a PROMPT statement to make it more generic: MACRO def = PROMPT('Which Definition record?'); SELECT FIELD_NAME_RECORD, FIELD_DATA_TYPE FROM DEFINITIONDEF WHERE Name = '&def' or present it like this: MACRO def = PROMPT('Which Definition record?'); MACRO fld = PROMPT('Which field?'); SELECT FIELD_NAME_RECORD, FIELD_DATA_TYPE FROM DEFINITIONDEF WHERE Name = '&def' AND FIELD_NAME_RECORD = '&fld' The following query allows you to see the field data types of an individual record: MACRO recdef = PROMPT('The fields of which record?','ip_analog'); for select name as rec from all_records where name like '&recdef' do if rec->definition = 'DefinitionDef' then select field_name_record, field_data_type, field_length, field_format_record from definitiondef where name like '&recdef'; else select field_name_record, field_data_type, field_length, field_format_record from definitiondef where name like rec->definition; end end Keywords: None References: None
Problem Statement: How can you ensure that Aspen InfoPlus.21 is not scanning the same device tagname from multiple IOGetDef records?
Solution: The following SQLplus query lists the duplicates: select name, occnum, a.IO_TAGNAME, a.IO_VALUE_RECORD&&FLD from iogetdef a, (select count(IO_TAGNAME) as ct, IO_TAGNAME, IO_VALUE_RECORD&&FLD from iogetdef group by IO_TAGNAME, IO_VALUE_RECORD&&FLD) b where b.ct > 1 and a.IO_TAGNAME = b.IO_TAGNAME; Sample output: name OCCNUM IO_TAGNAME IO_VALUE_RECORD&&FLD IOOPC_Get 1 Random.Real8 OPC_Random IP_INPUT_VALUE RemoteGet1 1 Random.Real8 RemoteOPCTAG IP_INPUT_VALUE OPC_Get 1 Random.Real8 OPC_Random IP_INPUT_VALUE OPC_Bensen 1 Random.Real8 dummy IP_INPUT_VALUE OPC_Get 2 Saw-toothed Waves.Real8 (R/W) OPC_Saw IP_INPUT_VALUE IOOPC_Get 2 Saw-toothed Waves.Real8 (R/W) OPC_Saw IP_INPUT_VALUE OPC_Get2 3 Triangle Waves.Real8 OPC_Triangle IP_INPUT_VALUE IOOPC_Get 3 Triangle Waves.Real8 OPC_Triangle IP_INPUT_VALUE Keywords: References: None
Problem Statement: This knowledge base article explains why when populating tags in an Aspen Infoplus.21 database from a text file using Aspen SQLplus, you might encounter the following error: Error creating record: incorrect buffer size at line n
Solution: This error occurs when the text file contains tag names that are more than 24 characters long. A simple length check before each record is created avoids the problem (see the sketch below). Keywords: Incorrect buffer size References: None
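A minimal sketch of such a check (the candidate name is an example; the actual record creation step is omitted):

local newname;
newname = 'A_VERY_LONG_TAG_NAME_FROM_THE_TEXT_FILE';
IF character_length(newname) > 24 THEN
   WRITE 'Skipping ' || newname || ': name is longer than 24 characters';
ELSE
   WRITE 'OK to create ' || newname;
END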
Problem Statement: This knowledge base article describes how to use the AGGREGATES table in Aspen SQLplus when querying for multiple time ranges.
Solution: The AGGREGATES table does not support multiple time ranges, so you will need to use separate queries in conjunction with the UNION operation. For example: SELECT TS, AVG(AVG) BY NAME FROM (SELECT NAME, TS, AVG FROM AGGREGATES WHERE NAME IN ('atcai', 'atcl101', 'atcph101') AND PERIOD = 00:10 AND TS BETWEEN '25-SEP-07 09:00' AND '25-SEP-07 10:00' UNION ALL SELECT NAME, TS, AVG FROM AGGREGATES WHERE NAME IN ('atcai', 'atcl101', 'atcph101') AND PERIOD = 00:10 AND TS BETWEEN '25-SEP-07 11:00' AND '25-SEP-07 12:00') GROUP BY TS ORDER BY TS DESC; Keywords: AGGREGATES UNION time ranges References: None
Problem Statement: QueryDef or CompQueryDef records can have multiple records defined in the #Wait_for_COS_Fields repeat area. Any of these records may activate the query record, but the record itself does not provide an obvious way to determine which one caused a COS activation. This Solution contains a query which writes the name of the record that activated the query.
Solution: The query dereferences the activation_field and casts it as a record: cast(*activation_field as record); which returns the name of the record that activated the QueryDef or CompQueryDef. --declare variables local faultrec record; --who activated me if character_length(activation_record) > 0 then write 'cos activation ' || cast(*activation_field as record); faultrec = activation_record; else faultrec = prompt('Enter a record name'); end Keywords: References: None
Problem Statement: Excel can use a time format consisting of <number>.<number>, like 39018.50767500000. Is there a way to convert an Excel double timestamp into an InfoPlus.21 timestamp? Yes, the function below will convert the Excel double timestamp and return an InfoPlus.21 timestamp.
Solution: Save this function as a ProcedureDef and then call this function (CDate) in SELECT and INSERT statements within your own scripts. Function CDate(x double) local t timestamp, i VarDate; --Sample input Value x=38904.50767500000; i = trunc(x * 24)/24.0; return cast(i as timestamp)+(x-cast(i as real))*24:00; End; write cast(CDate(39018.50767500000) as char using 'ts22') Keywords: References: None
Problem Statement: How to catch parse-time errors that occur in the middle of query execution.
Solution: BEGIN/EXCEPTION only traps run-time errors, not parse-time errors. One way around this is to put the main query in a separate record or file and use a START statement to run it from within the BEGIN/EXCEPTION. For example, the outer query will be: BEGIN START 'updatequery.sql'; EXCEPTION Write 'Error: ' || ERROR_TEXT; END; where updatequery.sql contains the main query. Keywords: Error EXCEPTION References: None
Problem Statement: It is observed that a rather large simulation using partial differential equations is solving slowly. Is there any setting that could improve the speed?
Solution: You could try the Transpose option under Run, Solver Options, Linear Solver, with the MA48 solver. When checked, MA48 uses the transpose of the matrix instead of the original matrix when solving linear systems. This has the effect of changing the direction in which MA48 searches for pivots. For some models, using the transpose will be faster than using the original matrix. You may wish to use this option if your simulation is very slow (particularly if each nonlinear iteration is taking a long time) to improve performance. Note that in this case, the improvement is both in terms of speed and memory requirements. Keywords: References: None
Problem Statement: How can I call the physical property subroutines from my own subroutines which implement my procedures?
Solution: The subroutines that can be called are documented in the ACM on-line help, under Aspen Custom Modeler, Physical Properties reference. We illustrate how this can be done with an example, where we call the liquid density subroutine (GPILMX version without analytical derivatives, GPIDLMX version with analytical derivatives) and return the value of the density in mol/m3 instead of kmol/m3 (this is done purely for illustration purposes). In version 12.1 and below, you had to modify the MakeUserCode file that is generated to add the GPP library. In version 2004.1 and above, we include the required library so you no longer need to modify the MakeUserCode. The example attached is created for version 2004.1 and 2006. Keywords: References: None
Problem Statement: How do you get an interpolated value for an exact time?
Solution: Use the HISTORY table with the REQUEST type set to 2. For example: SELECT value FROM HISTORY where name = 'atcai' and request = 2 and ts = '23-OCT-05 23:59'; For more information about the History table and request types, please see the SQLplus User's Guide and/or the SQLplus Help files. Keywords: References: None
Problem Statement: When you create Automated reports in Aspen Web.21 SQLplus Reports, they should be saved under the \Documents and Settings\All Users\Application Data\AspenTech\SQLplus\automated folder, but the folder is empty.
Solution: When the Aspen Web.21 server runs on a different system from the Aspen InfoPlus.21 database, the automated reports and their output are saved on the Aspen InfoPlus.21 database system, not on the Aspen Web.21 server. The default paths on the Aspen InfoPlus.21 database system are: \Documents and Settings\All Users\Application Data\AspenTech\SQLplus\automated. \Documents and Settings\All Users\Application Data\AspenTech\SQLplus\output. Refer to Solution 114122, SQLPlus Reports web server: Where are private and public reports saved? How can this location be changed? Keywords: None References: None
Problem Statement: This knowledge base article explains why a custom application (e.g. a VB.NET or C# application) based on Aspen SQLplus (that connects to Aspen InfoPlus.21 ODBC with the AspenTech SQLplus driver) terminates with the following exception: ERROR [HY000] [AspenTech][SQLplus] No free cursor This can happen when you execute a lot of queries in a short amount of time. See sample code below: Dim ConnectionString As String = "DRIVER={AspenTech SQLplus};HOST=localhost" Dim OdbcConn As New System.Data.Odbc.OdbcConnection(ConnectionString) OdbcConn.Open() For I As Integer = 1 To 10000 Dim cmd As New System.Data.Odbc.OdbcCommand("SELECT COUNT(*) FROM IP_AnalogDef;", OdbcConn) cmd.ExecuteScalar() Next I After 100 loops, .ExecuteScalar() raises the above exception.
Solution: 1. Call the Dispose method on OdbcCommand objects when you have finished with them. ...... Dim cmd As New System.Data.Odbc.OdbcCommand("SELECT COUNT(*) FROM IP_AnalogDef;", OdbcConn) cmd.ExecuteScalar() cmd.Dispose() ...... The problem is with closing ODBC statements. This error happens when a program opens many (>100) statements on a single ODBC connection but does not close them. With COM, clearing the last reference to an object releases the object and all its resources. With .NET this is not the case; you have to call Dispose. So a program like the one above should be changed to include cmd.Dispose() statements after each use of cmd. Note that even though the Garbage Collector will eventually release these resources, when you execute lots of commands in a short period it does not run often enough, hence the utility of calling the Dispose method explicitly. 2. The best practice for the code above is to use the Using statement instead of Dim, which, in the example provided, gives: ...... Using cmd As New System.Data.Odbc.OdbcCommand(....) cmd.ExecuteScalar() End Using ...... The Using statement in VB.NET and C# automatically disposes of resources by calling .Dispose() at the end of the Using statement block. This alleviates the need for the developer to remember to call .Dispose(). See http://msdn2.microsoft.com/en-us/library/htd05whh(VS.80).aspx for details. Moreover, if you create the connection object (OdbcConnection) in a Using statement, the connection is also disposed of when it is no longer needed. Keywords: EXCEPTIONS QUERIES CUSTOM CODE References: None
Problem Statement: How to retrieve the list of variables associated with a block using a script?
Solution: The FindMatchingVariables method can be used to list the variables associated with a block in ACM. See the example code below: Set flowsheet = Application.simulation.flowsheet 'Assuming there is a block named B1 Set vars = flowsheet.B1.FindMatchingVariables("~", "fixed free", , False, True, True, False) For Each s In vars application.msg "name " & s.Name & " value " & s.Value Next Please see the on-line help for details of script syntax and usage instructions. Keywords: FindMatchingVariables, variables, enumeration, listing, matching, find, lookup References: None
Problem Statement: This knowledge base article describes how to resolve the errors Server Not Available and Tag Not Found which can be returned when Aspen SQLplus web reports are executed even though connectivity to the Aspen InfoPlus.21 server has been verified through ping and nslookup commands.
Solution: These errors can potentially occur if another application which uses IIS has been installed on a server which is currently running the Aspen SQLplus web server. In a few cases, installation of Microsoft Sharepoint has corrupted the Aspen SQLplus web reporting components. The solution is to uninstall the offending application (e.g., Microsoft Sharepoint) and then reinstall Aspen SQLplus according to the procedure below. 1. Uninstall Microsoft Office and Microsoft Sharepoint 2. Reboot the server 3. After the reboot uninstall Aspen SQLplus 4. Reboot the server again 5. Reinstall Aspen SQLplus Keywords: SQLplus Web Reports SQlplus Web Reporting References: None
Problem Statement: This knowledge base article contains an example Aspen SQLplus query that monitors scheduled QueryDef and CompQueryDef records and sends an e-mail if the records are not executing on schedule. This query could be modified for other schedule record types.
Solution: If the reschedule interval is set to 1 month, the query must treat the record differently: instead of computing schedule_time - reschedule_interval, you will have to use subtract_months as shown in the example below: select name, last_executed t, reschedule_interval intrvl, subtract_months(t, intrvl) NextSched from compquerydef where position('Month' in reschedule_interval) > 0; A minimal sketch of the basic monitoring idea is also shown below. Attachments: This Solution contains two attachments: QueryReport.sql - the example query; query_report.txt - an example output file. Keywords: stuck hanging iq task tsk_iq query last_executed scheduled_time reschedule_interval References: None
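A minimal sketch of the basic monitoring idea, using only the fields referenced above (the 30-minute grace period is arbitrary, the e-mail step is omitted, and records with month-based intervals would still need the subtract_months handling shown above):

SELECT name, last_executed, reschedule_interval
FROM compquerydef
WHERE last_executed IS NOT NULL
AND (current_timestamp - last_executed) > (reschedule_interval + 0:30);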
Problem Statement: How do I access domain resources and Aspen InfoPlus.21 tasks with a local account?
Solution: Using a local account for the Aspen InfoPlus.21 task service is a common method to decrease the start-up time of InfoPlus.21. However, by default a local account cannot access domain resources. There are many solutions to overcome this problem: 1. Create a local user account on the domain resource with the same username/password used for the InfoPlus.21 task service. 2. Specify a login script for the local user account that automatically connects domain resources. 3. Use the Aspen SQLplus system command to establish connections to domain resources: system 'net use p: \\server\share <password> /user:domain\username'; system 'dir p:'; Keywords: ip.21_domain local_account References: None
Problem Statement: How can I make my queries more generic using the PROMPT statement?
Solution: By using the PROMPT statement and a MACRO, it is possible to make your queries generic. It's great for user input as well, without the need for a VB or VBA application. MACRO recname = PROMPT ('Please enter a record name'); SELECT IP_Input_Value, IP_Input_Time, IP_Eng_Units FROM &recname A second parameter to the PROMPT statement is a default value that appears in the entry box. MACRO recname = PROMPT ('Enter a record name', 'ATCL101'); SELECT IP_Input_Value, IP_Input_Time, IP_Eng_Units FROM &recname If you have a second PROMPT statement, the default value is carried over to the second dialog. To prevent this, enter an empty parameter. In the example below, the blank default value is the ' ' after 'Please enter a product'. MACRO bat = PROMPT ('Please enter a batch number', 'B115'); MACRO prod = PROMPT ('Please enter a product',' '); SELECT Start_Time, End_Time FROM Batch_Log WHERE batchid = '&bat' AND product = '&prod' Keywords: None References: None
Problem Statement: ANSI Standard SQL has a key word called 'TOP' to retrieve the TOP 'n' records from a table. How do I retrieve the TOP 'n' records from Aspen InfoPlus.21 using Aspen SQLplus?
Solution: Aspen SQLplus does not have a key word like 'TOP', but there is a similar key word MAX_ROWS. Using this we can achieve the same goal. Use the following statement to retrieve TOP 'n' numbers of records from Aspen SQLplus: SET MAX_ROWS=10; -- retrieving top 10 records SELECT NAME FROM IP_ANALOGDEF The above query is written using the SQLplus query language. However you may want to retrieve the top 10 records through your custom applications like VB / ASP etc. In such a situation, please use the ADO/DAO record set property named 'MaxRecords' to retrieve top 'n' number of records from Aspen InfoPlus21 database tables. AdoRs.MaxRecords=10 (here we are limiting the records through ADO/DAO record object property not through Aspen SQLplus.) Keywords: top max References: None
Problem Statement: Is there a way for a QueryDef record to access its own output lines from the previous execution? Also, is it possible to access a specific output line (line #3 for example)?
Solution: SQLplus clears the output of the query before the query execution starts. So, having a query select its own output does work but it returns what the query has produced in this execution not what it produced in the last execution. For example: write 'ok'; Select output_line from QueryDef where name = 'Test'; produces the output: ok output_line ok Getting a specific line is easy. You can use: Select output_line from QueryDef where name = 'Test' and occnum=3; or Select output_line[3] from QueryDef where name = 'Test'; Keywords: References: None
Problem Statement: Using Aspen SQLplus, is it possible to list all tags on a system that receive updates from Aspen Cim-IO via an I/O transfer record?
Solution: The following sample query demonstrates one way to display all such tags and their corresponding IO_GetDef record. Query1 SELECT io_value_record&&fld->name AS TAGNAME, name AS GET_RECORD, OCCNUM AS OCCURRENCE_NUMBER FROM iogetdef ORDER BY 1 You can also add a WHERE condition to display only the occurrences with a 'Good' status. Query2 SELECT io_value_record&&fld->name AS TAGNAME, name AS GET_RECORD, OCCNUM AS OCCURRENCE_NUMBER FROM iogetdef WHERE IO_DATA_PROCESSING='ON' AND IO_DATA_STATUS = 'Good' ORDER BY 1 Keywords: cross reference transfer record get record tag References: None
Problem Statement: This sample program shows how to create a simple console application that uses the ADO.NET DataReader object to access data from IP.21
Solution: Follow the steps below to create the application. The code is also attached to the knowledge base article. Launch Microsoft Visual Studio .NET Select File | New | Project Select a Project type of Visual C# Projects Select the Console Application Template Type a Project Name and click OK An empty program template will be displayed. Inside the static Main() block, add the following code: // Specify the name of the ODBC data source which uses the AspenTech SQLplus driver // Substitute the IP21 portion of the string with the data source name configured on your system string strConn = "dsn=IP21"; // Create the command object System.Data.Odbc.OdbcCommand oCmd = new System.Data.Odbc.OdbcCommand(); oCmd.Connection = new System.Data.Odbc.OdbcConnection(strConn); // Open the connection oCmd.Connection.Open(); // Assign the SQL to the Command Object oCmd.CommandText = "Select name from IP_AnalogDef"; // Execute the SQL System.Data.Odbc.OdbcDataReader oDR = oCmd.ExecuteReader(); while (oDR.Read()) { System.Console.WriteLine(oDR.GetValue(0)); } //Close Connection oCmd.Connection.Close(); } Compile and run the application; it will generate a list of the tag names defined by the IP_AnalogDef definition record family. Keywords: References: None
Problem Statement: How can I get the 'real' value of the quotient when dividing two integers in SQLplus?
Solution: Suppose you wanted to calculate the percentage of IP_AnalogDef records that currently have IP_INPUT_QUALITY = 'Bad'. The COUNT(*) function in SQLplus returns an integer value. The following SQLplus script demonstrates the result you would get:

local #_bad_tags integer, total_tags integer, percent_bad real;

#_bad_tags = (select count(*) from ip_analogdef where ip_input_quality = 'bad');
total_tags = (select count(*) from ip_analogdef);

write 'No of bad tags = ' || #_bad_tags;
write 'Total no of analog tags = ' || total_tags;

percent_bad = (#_bad_tags / total_tags) * 100;
write 'Percent Bad doing normal divide gives ' || percent_bad;

-- The result was 0 because integer division returns an integer, even if the result is assigned to a real variable.
-- To get the desired result, either declare #_bad_tags and total_tags as real, or cast the numerator as real:

percent_bad = round((cast(#_bad_tags as real) / total_tags) * 100, 2);
write 'Percent Bad using cast function gives ' || percent_bad;

The output is:

No of bad tags = 14
Total no of analog tags = 329
Percent Bad doing normal divide gives 0
Percent Bad using cast function gives 4.26

Keywords: integer division cast real
References: None
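For completeness, the same percentage can be computed in a single statement by applying CAST directly inside the expression. This is a sketch using the same ip_analogdef table and ip_input_quality field as the script above:

write 'Percent Bad = ' ||
      round((cast((select count(*) from ip_analogdef where ip_input_quality = 'bad') as real)
             / (select count(*) from ip_analogdef)) * 100, 2);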
Problem Statement: This article attempts to clarify:
- The basic capabilities of the product sub-component 'Aspen Desktop ODBC'.
- How it is installed.
- Whether it needs to be purchased as a standalone product.
Solution: Aspen Desktop ODBC is a 32-bit ODBC driver and server that allows you to use ODBC-compliant applications with Aspen SQLplus to retrieve data from the Aspen InfoPlus.21 database. For a full list of AspenTech-recommended ODBC-compliant applications, as well as the ODBC application versions currently supported, please refer to the Desktop ODBC User's Guide available from the AspenTech Support website. At the time of writing, the 2006 version of the Aspen Desktop ODBC User's Guide for InfoPlus.21 can be found at http://support.aspentech.com/webteamasp/KB.asp?ID=119776. Because Desktop ODBC is no longer a stand-alone product (see below), the latest User Guide is the Version 2006 guide referred to above.

- Installation: Aspen Desktop ODBC is optionally installed as a subcomponent of the Aspen SQLplus installation by selecting SQLplus ODBC. A separate license is not required.

- Purchasing: Aspen SQLplus is now part of Aspen InfoPlus.21. Therefore a purchase of Aspen InfoPlus.21 also includes Aspen SQLplus, which in turn includes Aspen Desktop ODBC.

Keywords: None
References: None
Problem Statement: Users of Pocket PCs would like to read and write data to and from InfoPlus.21. AspenTech does not support the SQLplus ODBC driver on a Pocket PC, so users cannot write their own ODBC-based client applications.
Solution: SQLplus version 2004 and later provides a Web Service that allows remote queries from any SOAP-compliant client. This example was written using Visual Studio 2005 and .NET v2.0, and the application was compiled for Pocket PC 2003. The complete Visual Studio 2005 solution is attached, and there are comments in the source code. The attached sample Test executable can also run on a normal Windows PC that has .NET v2.0 installed. The application is supplied as an example to help get you started and should not be considered an application supported by AspenTech development.

Keywords: Pocket PC 2003 PPC Windows CE
References: None
Problem Statement: This knowledge base article explains how to make the results of a subquery, which has been executed through the START command, available to the main query.
Solution: The START command does not return values to the calling query the way a procedure or function does. Rather, the START command executes an Aspen SQLplus query that resides in a text file and then returns the results of any select statements to the Aspen SQLplus Query Writer as if the subquery had been executed normally. If you want to make data selected in the subquery available to the main query, you can insert the results of the subquery's select statement into a temporary table or write the results to a text file. You can then use another select statement later in the main query to obtain the results from your subquery. The following query illustrates how this can be done.

-- ----
-- Main program query
-- ----
DECLARE LOCAL TEMPORARY TABLE MODULE.TEMP(Name CHAR(24));

-- Function
FUNCTION callsubquery(rname)
  SET COLUMN_HEADERS 0;
  START 'mysubtemptable.sql', rname;
END

-- Call the function
callsubquery('iogetdef');

-- Query the temp table for results
SELECT * FROM MODULE.TEMP;

-- ----
-- File mysubtemptable.sql
-- This file is saved in the Group200 folder
-- It contains the following statement:
-- ----
INSERT INTO MODULE.TEMP SELECT name FROM &1;

The results of this query are:

4 rows inserted.
AspenRefinry_Get
AspenChem_Get
IoGetSimul
D-IoGet

Keywords: None
References: None
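The &1 substitution used above extends to additional parameters, so one subquery file can serve several definition records and fields. The following is a hypothetical sketch (the file name mysubtemptable2.sql and the second parameter are illustrative) that reuses the MODULE.TEMP table from the example:

-- Main query
FUNCTION callsubquery2(rname, fname)
  SET COLUMN_HEADERS 0;
  START 'mysubtemptable2.sql', rname, fname;
END

callsubquery2('iogetdef', 'name');
SELECT * FROM MODULE.TEMP;

-- File mysubtemptable2.sql contains the single statement:
INSERT INTO MODULE.TEMP SELECT &2 FROM &1;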
Problem Statement: Driver not capable message appears after querying a remote database via an ODBC Driver.
Solution: Run an ODBC trace (see knowledge base article 114134). If the message (EXIT SQLTransact with return code -1 (SQL_ERROR)) appears in the log file, add the ROLLBACK command to the end of the query. Example:

SELECT * FROM Informix.table;
ROLLBACK;

SQLplus always calls SQLTransact at the end of a query if remote statements have been executed since the last COMMIT or ROLLBACK call. The error above may indicate an incompatibility with the remote database. ROLLBACK also calls SQLTransact but ignores any errors that are returned.

Keywords:
References: None
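The fix above works because ROLLBACK (and COMMIT) resets the condition that triggers the automatic end-of-query SQLTransact call. The same idea applies when writing to the remote database: end the query with an explicit COMMIT or ROLLBACK after the remote statements. A sketch with a hypothetical remote table and column on the same Informix link:

-- mytable and tagname are hypothetical; substitute your own remote table and columns
INSERT INTO Informix.mytable (tagname) VALUES ('ATCAI');
COMMIT;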
Problem Statement: This ER fixes the following problem: The MSI install of 6.0.1 does not upgrade the SQLplus ODBC driver (CQ00148904).
Solution: Version 6.0.1 of the SQLplus ODBC driver allows DSN-less connections from ADO.NET. Apply this Engineering Release to your system to resolve the problem described above. The Release Notes contain detailed installation instructions. Keywords: References: None
Problem Statement: Is it possible to launch the Aspen SQLplus Query Writer from a query saved to the desktop?
Solution: Yes, it is possible to run an Aspen SQLplus script from a desktop icon.

1. Save the query to the desktop.
2. Right click on the query icon (on the desktop) and select 'Open With'.
3. If the Aspen SQLplus Query Writer is not on the list of programs to associate, browse to SQLplus.exe on the hard drive: select Choose Program, click the Browse button, and browse to the location of SQLplus.exe (by default, the <Install Drive>\Program Files\AspenTech\InfoPlus.21\db21\code directory). Double click SQLplus.exe once it is located. Aspen SQLplus Query Writer should now appear in the 'Open With' dialog box.
4. Once you have added the SQLplus Query Writer to your 'Open With' list, single click it and check the box Always use the selected program to open this kind of file.
5. Click OK to close the Open With dialog box.
6. Now, when you double click your query icon on the desktop, it will open the query in the Aspen SQLplus Query Writer. However, it is still necessary to execute it using the Execute button (!), by pressing the F9 key, or by selecting Execute from the Query menu.

It is also possible to run an SQLplus script directly from a DOS batch file using sqlplusx.exe. The following batch file runs sqlplusx.exe, executes the script ListBatches_web_rq, and outputs the results to a text file, CurrentLots.txt:

=== Start of bat file ===
cd C:\Program Files\AspenTech\InfoPlus.21\db21\code
sqlplusx ListBatches_web_rq > c:\CurrentLots.txt
type C:\currentlots.txt
=== End of bat file ===

1. Create a file and add the example above.
2. Save the file with a .bat extension (i.e. test.bat).
3. Open the DOS prompt and browse to where the .bat file was saved (i.e. the C:\ drive).
4. Execute the .bat file from the directory it is saved in (i.e. when at the C:\ drive, type test.bat and press ENTER).

As this is just a DOS .bat file, it can be run directly from a desktop icon or can be set to run at PC startup by placing a shortcut to the .bat file in the Start | Programs | Startup folder. This runs faster than starting an Aspen SQLplus session and does not require any user interaction.

Keywords: icon desktop query
References: None
Problem Statement: Since version 2004.1 I can use multiple experiments with data reconciliation (a special case of estimation), but the results are no longer copied back into the flowsheet. The values can only be seen in the Estimation tool. Is there a way to access the predicted values using automation?
Solution: This is possible with version 2004.1 and higher, but it is currently undocumented. We will document the method. You can access the predicted value for a measured variable in an experiment using the GetEstimationPredictedValues method (from the Simulation object). The syntax is:

Application.Simulation.GetEstimationPredictedValues(ExptName, MeasVar)

ExptName - The name of the experiment
MeasVar - The name of the measured variable

The method returns a zero-based array with the predicted values.

Example of Accessing the Value of an Estimated Variable

This example shows how to access the predicted value of a measured variable following a successful estimation simulation:

Pred = Application.Simulation.GetEstimationPredictedValues("SteadyStateExp_1", "streams(""cold-in"").F")
Application.PrintToMessageWindow "Estimated value for F is " & CStr(Pred(0))

Note that if the variable name includes quotation marks (") you must use double quotation marks ("") in the argument list.

The method can also be used for dynamic experiments. The array returned gives the predicted value for each time. Example to retrieve the predicted values for the DynamicExp_1 dynamic experiment and the variable Ca in Rct (assuming we have 11 data points):

v = Application.Simulation.GetEstimationPredictedValues("DynamicExp_1", "Rct.Ca")
for i = 1 to 11
  application.msg v(i-1)
next

Finally, if you wish to copy the predicted values into the flowsheet, you can use a similar script:

' list of variables estimated
dim list_var(10)
list_var(1) = "streams(""cold-in"").T"
list_var(2) = "streams(""hot-in"").T"
list_var(3) = "streams(""cold-in"").F"
list_var(4) = "streams(""hot-in"").F"
list_var(5) = "streams(""cold-out"").T"
list_var(6) = "streams(""hot-out"").T"
list_var(7) = "streams(""cold-out"").F"
nvar = 7

' get results from estimation and copy to flowsheet
for i = 1 to nvar
  x = application.simulation.GetEstimationPredictedValues("SteadyStateExp_1", list_var(i))
  set var = resolve(list_var(i))
  var.value = x(0)
next

' do steady state run to solve all variables
application.simulation.runmode = "Steady State"
application.simulation.run true

Keywords: automation VBscript Visual Basic Excel
References: None
Problem Statement: How can I update the number of license points for an Aspen InfoPlus.21 database without stopping the database?
Solution: Using the Aspen SQLplus Query Writer you can execute the following query to report the number of points that your Aspen InfoPlus.21 database is licensed for:

write 'Current number of license points = ' || getpointlicensecount;

and the output will display something like this:

Current number of license points = 10500

Now execute the following Aspen SQLplus query to update the number of license points:

setpointlicensecount(15000);
write 'Current number of license points = ' || getpointlicensecount;

and the output will be:

Current number of license points = 15000

Please note: The above change will not take effect until the Aspen InfoPlus.21 database is restarted.

Keywords: snapshot license points setpointlicensecount getpointlicensecount
References: None
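The check and the update can also be combined so that the count is only raised when it is below a target value. This is a sketch using only the getpointlicensecount and setpointlicensecount functions shown above; the target of 15000 is illustrative:

IF getpointlicensecount < 15000 THEN
   setpointlicensecount(15000);
   write 'License point count raised to ' || getpointlicensecount;
ELSE
   write 'License point count already at ' || getpointlicensecount;
END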
Problem Statement: This knowledge base article provides troubleshooting steps to resolve the error 'Failed to create object: tli:tliapplication: Access denied. at line 1', which can be generated by the Aspen SQLplus Query Writer.
Solution: The tli.tliapplication object is contained in tlbinf32.dll. This is a Microsoft type library information object which allows applications to programmatically extract information from type libraries. This file may be corrupted, missing, or may have been overwritten by a Microsoft patch or by the installation of another application. The file is usually installed in the System32 folder. Verify that the file exists, is registered, and has file permissions that allow Aspen SQLplus to access it. If this problem only occurs on one PC, compare the date of this file on the working system and on the non-working system to ensure the file versions match.

- This error could also be caused by a registry access problem, since references to COM objects are stored in the registry. There is a known defect in Crystal Reports 8.0 which locks the part of the registry needed to add references to programs such as Visual Basic. The same problem could also occur with Aspen SQLplus if access to the registry is restricted by another program.

- In some cases re-installing the Aspen Manufacturing Suite of products has resolved the problem. In some cases it has been necessary to un-install and then re-install the Aspen Manufacturing Suite products along with other applications (such as Microsoft Visual Studio).

- There was one case where creating a second instance of TSK_SQL_SERVER (named TSK_SQL_SERVER2) worked around the problem. When the Aspen SQLplus Query Writer used TSK_SQL_SERVER2 the problem did not occur.

Keywords: Error tlb type library tliapplication tli
References: None
Problem Statement: This knowledge base article provides troubleshooting steps for when the following errors are returned while trying to connect to an Aspen InfoPlus.21 database through a custom ASP.NET web application which goes through an Aspen SQLplus ODBC connection:

ERROR[HY000][AspenTech]SQLplus Access Denied - No read access
ERROR [01000][Microsoft][ODBC Driver Manager] The driver doesn't support the version of ODBC behavior that the application requested (see SQLSetEnvAttr).

These error messages can be returned even if Aspen security has not been implemented. The problem does not occur when the -n parameter is specified as a command line parameter for TSK_SQL_SERVER.
Solution:
- Restart the AFW Security Client service.
- Check the identity that the application pool runs under in IIS to make sure that this account has permission to access the Aspen InfoPlus.21 server.
- Enable ASP.NET impersonation. If ASP.NET impersonation is not enabled, the ASPNET account will be used for authentication purposes. If you wish to use the ASPNET account for authentication, you will need to set the passwords of the ASPNET accounts on the client and server machines to be the same. If ASP.NET impersonation is enabled, the user's credentials will be used.
- If none of these suggestions work, check the IIS logs to see which account is being used to run the web application. To ensure that this account has no problems connecting to the Aspen InfoPlus.21 database, try mapping to a network share on the Aspen InfoPlus.21 server under this account. Also, try connecting to Aspen InfoPlus.21 through other Aspen client applications such as Aspen SQLplus or Aspen Process Explorer.

Keywords: deny revoke sql asp .net
References: None
Problem Statement: This knowledge base article provides the syntax required to reference a table in SQL Server from Aspen SQLplus.
Solution: SQL Server tables which are accessed through a database link can be referenced using the following syntax:

Linkname.DBName.Username.Tablename

For example, an appropriate select statement for the following database information is shown below.

Item                 Example
Database Linkname    AdvisorTraining - SQL Server
DBName               AdvTrain
Username             dbo
Tablename            DOREVENT

SELECT * FROM "AdvisorTraining - SQL Server".AdvTrain.dbo.DOREVENT;

(Because this link name contains spaces, it is enclosed in double quotes.)

Tip: Once you have located the table by browsing through your database link in the Aspen SQLplus Query Writer, you can have a general select statement generated for you by pressing the Paste Query button. (See image below.)

Keywords: ODBC remote
References: None
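The same four-part name works for write operations against the linked table. The sketch below is purely illustrative: the columns EventTime and TagName are hypothetical and must be replaced with the actual columns of the remote table.

-- EventTime and TagName are hypothetical column names
INSERT INTO "AdvisorTraining - SQL Server".AdvTrain.dbo.DOREVENT (EventTime, TagName)
VALUES (CURRENT_TIMESTAMP, 'ATCAI');
COMMIT;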