Problem Statement: Is there a way to retrieve sales and inventory data for a day other than today?
Solution: The Aspen Retail history list (located under the Utilities menu) keeps track of the last 21 days of inventory and sales information (to aid in order forecasting). You can go back and look at any of those days as follows: 1. Go to the Utilities pull-down menu in the main Aspen Retail screen. 2. Click on History List. 3. In the History List form, browse to find the customer/station number or enter a customer number. 4. Click OK and view the list. If you want to view a certain date: 1. Go to the Utilities pull-down menu in the main Aspen Retail screen. 2. Click on History Utility. 3. In the History Utility form, browse to find the customer/station number or enter a customer number. 4. Toggle the date to the appropriate date (see the second attachment for a screen snapshot of this form, AspenUtilityHistory.jpg). Keywords: History List History Utility Sales and Inventory Information Update log References: None
Problem Statement: The Dispatched Orders Report consistently returns the message No results to print.
Solution: If you are trying to print the Dispatched Orders Report and getting a message that says No results to print even though you have dispatched a group, make sure writedispatchedorders is set to 1 in your INI file. Keywords: Report Dispatch order report References: None
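As a sketch, the INI entry would look like the fragment below. The section name is an assumption for illustration only; the article does not say which section of the Aspen Retail INI file the flag lives in, so check your own file for its actual location.

```ini
; [Settings] is a hypothetical section name -- locate the real
; section containing this flag in your Aspen Retail INI file
[Settings]
writedispatchedorders=1
```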
Problem Statement: How do I add a material of construction using the Databank Search?
Solution: Step 1: Click Databank Search in the Materials of Construction -> Vessel Materials form. Step 2: Either type a few letters of the word in the search option, or scroll through the material list and select the required material, for example, seamless tube of carbon steel class. Step 3: Click the item (say, tube material) in the third dialog box and click the 'Set' button; the seamless tube selected in Step 2 is assigned as the tube material. Step 4: After clicking the 'Set' button, the tube material is updated with the seamless tube. Step 5: Click OK to return to the Vessel Materials form. Keywords: Shell and tube heat exchanger, material of construction, heat exchanger, data bank References: None
Problem Statement: I would like to know how rigorous the costing in Aspen Shell & Tube Mechanical is.
Solution: S&T Mechanical simulates the fabrication of the equipment, including all the relevant shop operations (cutting, shearing, welding, drilling, beveling, etc.) per equipment part (tubesheets, flanges, nozzles, etc.). It also includes assembly, testing and other miscellaneous shop functions. Hence the cost estimate is very rigorous. For material costing, since S&T Mechanical has all the ASME materials (~3500) plus a good number of EN, JIS (Japanese), AFNOR (French) and DIN (German) materials, it is difficult to keep up with current pricing for all the materials, product forms (tubes, plates, forgings, pipes) and regions (US, Europe, Asia, Latin America, etc.). However, the prices are reasonable and are updated annually for accurate relative pricing (between carbon steel, nickel alloys, copper alloys, etc.). Keywords: Pricing Costing Heat exchanger References: None
Problem Statement: In the Aspen InfoPlus.21 Administrator a node will not expand (no plus sign to the left of the node icon) and clicking on the name will produce a dialog box with this message (after waiting for a minute or so): Group 200 on <name of the node> (Title of dialog box) The network path was not found. What can be done to fix this problem so that the InfoPlus.21 database is accessible?
Solution: An entry for this server in the Aspen Data Source Architecture (ADSA) configuration has a typographical error. Locate the data source in the ADSA Client Config Tool, make a note of the entries that exist for that data source, and either check them thoroughly for errors or remove them and re-add them. Keywords: References: None
Problem Statement: When configuring the Public Data Sources using the ADSA Client Config Tool you find that the following popup dialog window appears when you click on the OK button to save any changes made: This error does not occur if you instead attempt to configure User Data Sources. You then notice that the Aspen Data Source Directory service is run under a local Windows user account that does not belong to the local Administrators group.
Solution: Either: · add the local Administrators group membership to the Windows user account used to run the Aspen Data Source Directory service, or · switch the Aspen Data Source Directory service to log on as the Local System account. After restarting the service you will again be able to save changes to the ADSA Public Data Sources without error. Keywords: Configuration error OnOkOrSave::SetDataSourcesInXML Public Data Sources Create Data Source Delete Data Source Edit Data Source References: None
Problem Statement: ADSA Client Config Tool is not working the way it used to. The main window opens but I am not able to view the Data Sources. I expect to see the appropriate window when I click either the User Data Sources or Public Data Sources button: Instead, nothing happens when I click either button. The defined data sources are displayed in the Tag Browser and the Test button is also working. I have restarted the ADSA Services but that did not seem to help.
Solution: Clicking the buttons is supposed to launch this 32-bit executable: C:\Program Files (x86)\Common Files\AspenTech Shared\Adsa\DsaConfig.exe Assuming that the file has not been moved or deleted, the problem you are having is caused by a missing registry key. Open the Registry Editor and search for the AspenTech Setup key in the HKLM hive. Depending on your operating system and whether you are using the 32-bit or 64-bit version of the ADSA configuration tool, you must ensure the ASPENCOMMONDir value is present and correctly specifies the path to the parent of the Adsa folder that contains DsaConfig.exe. Assuming you have installed to the C: drive: 64-bit ADSA configuration tool: [HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup] ASPENCOMMONDir = C:\Program Files (x86)\Common Files\AspenTech Shared 32-bit ADSA configuration tool on 64-bit Windows: [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\AspenTech\Setup] ASPENCOMMONDir = C:\Program Files (x86)\Common Files\AspenTech Shared 32-bit Windows: [HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup] ASPENCOMMONDir = C:\Program Files\Common Files\AspenTech Shared You should be able to copy the existing value of the companion Setup key ASPENCOMMON to the empty/missing ASPENCOMMONDir. If you cannot find any of these registry keys, it is recommended that you repair your AspenTech software using the repair tool on the installation DVD. Keywords: ADSA Client Config Tool Fix References: None
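As a sketch, the fix for the 64-bit ADSA configuration tool case could be captured in a .reg file like the one below. The path assumes a default installation to the C: drive, exactly as in the article; adjust it if your AspenTech Shared folder lives elsewhere.

```reg
Windows Registry Editor Version 5.00

; Restores the missing ASPENCOMMONDir value for the 64-bit
; ADSA configuration tool (default C: drive installation)
[HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\Setup]
"ASPENCOMMONDir"="C:\\Program Files (x86)\\Common Files\\AspenTech Shared"
```

Backslashes inside .reg string values must be doubled, as shown.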
Problem Statement: What is special handling and how do I change it in the costing database?
Solution: Special handling is a percentage of total labor hours and material handling. The default value, 5% for labor hours and material cost, can be changed using the Costing Database Maintenance form. The Costing Database is accessed using Tools > Data Maintenance > Costing Database. Keywords: special handling, costing, final assembly, shell and tube mechanical, heat exchanger References: None
Problem Statement: There seem to be new values for RTE_APOLSTAT that we're seeing in AOL 12.1. I reviewed the AOL documentation (Aspen Online 12.1 user's guide), but did not see the new values (17, 19) inquired about below. Can you please send updated documentation?
Solution: Here is the complete list of status values and corresponding descriptions. Items 17-19 were added for Aspen OnLine 12.1. They are listed under Appendix | Aspen OnLine Status Values in the online help.
Value  Description
-4  Steady state aborted
-3  Aspen OnLine aborted
-1  Shutdown
0  Startup
1  Startup initialization
2  Waiting for steady state
3  AOL optimization off
4  Waiting on other plant
5  Initializing implementation
6  Getting input data
7  Calculating parameters
8  Storing parameters
9  Getting opt case bounds
10  Calculating setpoints
11  Storing targets
12  Implementing setpoints
13  Full opt case fetch
14  Full optimize case
15  Full opt case put
16  Waiting for line out
17  Model execution started
18  Output data to IMS (Used by non-A+ EO runs)
19  Model execution ended
26  Waiting for non implementation delay
27  Implementation data problem
28  Fatal implementation problem
29  Get/put data problem
30  Fatal get/put problem - aborting
31  Parameter case did not solve
32  Parameter case aborting
33  Optimize case did not solve
34  Optimize case aborting
35  Full optimize case did not solve
36  Full optimize case aborting
37  AOL time out detected
38  AOL process failure detected
Keywords: aol References: None
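If a client application needs to translate RTE_APOLSTAT values into readable text, a minimal lookup could be sketched as below. This is an illustration only (the dictionary and function names are my own, not part of any Aspen OnLine API), and the table is a subset of the full list in the article.

```python
# Sketch: map a few RTE_APOLSTAT status values to descriptions.
# Subset of the full table above; extend with the remaining values.
AOL_STATUS = {
    -1: "Shutdown",
    0: "Startup",
    16: "Waiting for line out",
    17: "Model execution started",                       # new in 12.1
    18: "Output data to IMS (Used by non-A+ EO runs)",   # new in 12.1
    19: "Model execution ended",                         # new in 12.1
}

def describe_status(value: int) -> str:
    """Return the description for a status value, or a fallback."""
    return AOL_STATUS.get(value, f"Unknown status ({value})")
```

For example, `describe_status(17)` returns "Model execution started".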
Problem Statement: After changing the password for the account your Batch.21 BCU Server runs under, the BCU does not start up. Instead, when you click OK, an error is generated: Unable to grant the logon as service right to user <user_account> No mapping between account name and User ID You may also find the BCU Server starts up OK, but when you try to add the units to the scheduler there is a problem. This is caused because the password also needs to be changed in the Microsoft Transaction Server component.
Solution: See Solution 108124: Batch.21 BCU Server can lock up due to Microsoft Transaction Server components. Applicable Version(s): InfoPlus.21 versions 3.1.x and 4.0.x only. Later versions do not use the Microsoft Transaction Server. Keywords: password BCU InfoPlus.21 References: None
Problem Statement: Should relational database tables be compressed/shrunk after a purge?
Solution: AspenTech recommends that database tables be compressed after the relational database is purged. If the database is not compressed after a purge, the deleted tables still take up disk space. Batch/Event systems do not need to be shut down during database compression. For documentation on compression, consult the Microsoft SQL Server documentation. Manual compression/shrinking is only necessary for versions prior to MS SQL Server 7.0. In version 7.0 and above, you may use the autoshrink function, which causes the empty table space to be reused automatically; its use is recommended. For additional information on the SQL Server 7.0 autoshrink function, see: http://msdn.microsoft.com/library/psdk/sql/8_ar_da_24.htm Keywords: database administration memory shrink compress Event References: None
Problem Statement: When you add a condition to a trigger and then you want to get rid of it, blanking out the value and the tag name fields does not blank out the condition on the summary line; the word AND > (or OR > as the case may be) stays there and there is no obvious way to remove it. If not removed, this remnant condition can affect the Verify stage of the script.
Solution: To solve the problem of the additional condition not being removed from the trigger summary line after both the tag name and value have been deleted from the Operand 1 and 2 fields, simply right click on the cell in the grid containing the remnant of the deleted condition and select Delete selected cells from the context menu. That will remove the unwanted condition from the summary line. Keywords: References: None
Problem Statement: In the Specify Query Results section of the Create Batch Report page, the ability to select characteristics does not respond. After pressing the ... button, the characteristics tree is displayed. Opening up either the stage or characteristic branch yields nothing other than loading. Note that subbatch and characteristic information can still be typed in manually.
Solution: Verify that, at a minimum, you are running IE 5.5 SP2, in accordance with the Aspen Manufacturing Suite Supported Environments Matrix v6.0. Keywords: hang web.21 References: None
Problem Statement: A SQLplus script that uses the Batch.21 API to update the Batch.21 database gets this error when trying to add a characteristic to a subbatch: Error executing method add: B21BAI-60147: Subbatch does not exist
Solution: The code below now works after adding the single line: area.Refresh

local data_sources, batch, chars, batchdefs;
local batchlist, query, area, subbatch, characteristic;
local i integer;
local rs char(1);

data_sources = createobject('AspenTech.Batch21.BatchDataSources');
area = data_sources('BATCH-600-W2K').areas('demo');
area.Refresh;
query = area.BatchQuery;
batchlist = query.GetByDesignatorValues('E','TAC2508','43D455');
if batchlist.count = 1 then
  batch = batchlist.item(1);
  subbatch = batch.subbatches('MIX');
  chars = subbatch.characteristics;
  chars.add('start_time', 0, '4/11/03 10:00:00');
end;

Keywords: api program References: None
Problem Statement: This knowledge base article describes why the error Restore Failed. Area name changed is returned when restoring a Batch.21 backup file to another Batch.21 database.
Solution: The restore functionality restores into an area based on the internal area ID. If it does not find an area with the same ID, it creates it. The idea of the tool is to back up and restore into the same database OR into a new database used for offline analysis. This last scenario usually arises because the configuration has changed so much that the administrator does not want to reintroduce the configuration changes just to see the old data. If someone creates an empty database and then manually creates an area, user name, etc. with the same name, there is no guarantee that the internal database ID will be the same as in the records in the backup file. If the internal IDs are different but the names are the same, the restore tool fails, as Batch.21 enforces name uniqueness within a database. The same solution also applies to the scenario where you have batch data on an old server, you have created a new area with the same name on a migrated server, and you then attempt to restore batches to it. This can result in the error: B21BSC-50019: Area already exists In this case, follow the advice of this solution and delete the existing area, then try the restore again; this will create the area AND populate it with data at the same time, avoiding any conflict. Keywords: References: None
Problem Statement: The table UNITS is used to identify materials that are not measured in the standard units of volume (e.g. BBLS) or weight (e.g. MTONS). In addition, this table is used in a weight-basis model to provide volume-to-weight conversion factors for materials that are purchased or sold on a volume-basis. In some cases, people using this table notice unwanted results with the total of crude purchases or in the submodels. What is the cause?
Solution: The entry under column TEXT in table UNITS is not used as in most other tables (i.e., as a description of the material); instead it defines the unit that will be used for the given material. If you inadvertently enter a description of the material under column TEXT in table UNITS, the report will come out with wrong totals for these materials in the Purchases, Sales and Submodel sections. Make sure that you either leave column TEXT empty or identify new units of measure if necessary, but do not use it to enter a description of a material. For example, when table UNITS is configured incorrectly for the crudes, Aspen PIMS interprets the entries under column TEXT as new units of measure, and in the Purchases section of the Full Solution report the crude totals are reported as zero, because non-standard units are not included in that total. Keywords: Table UNITS Unit of Measure Report References: None
Problem Statement: In some situations, the crude slate that is purchased has a predefined composition (for example, when the crudes come through a pipeline). However, the total amount of crude that is purchased can vary. How do you model this situation?
Solution: This situation can be modeled with the help of table RATIO (under the Miscellaneous branch of the model tree), which allows you to define ratios between matrix columns. In table RATIO, we set up a ratio called RT1 (any 3-character tag that is not used somewhere else is OK); the variables that participate in it are the purchase variables (PURCabc). Each crude that is purchased in table BUY and will be part of the restriction is added here. The number under column RT1 is the percentage of the total, i.e. ANS will be 33% of the total amount of crude purchased. If a crude is not part of the ratio, then it is not restricted and will not be part of the total amount considered in the ratio calculations. Important note: you need to relax the MIN and MAX in table BUY for all the participants of the ratio to avoid potential infeasibilities. The results of this run show that the composition of the crude slate is as specified. You can control the total amount of crude independently or leave it free to be optimized. Keywords: Table RATIO Fix Crude Slate Crude Slate Composition References: None
Problem Statement: This knowledge base article describes how to work around the problem that causes the following error to be logged in the Windows Event Viewer: The HTTP Filter DLL C:\inetpub\wwwroot\AspenTech\AFW\Security\AtWebSSO.dll failed to load. The data is the error.
Solution: This error usually occurs because AtWebSSO.dll is not in the path noted in the error message. The usual root cause is that Microsoft IIS has been installed on a drive other than C:\. If this is the case with your system, you can work around the error by following this procedure: 1. Copy AtWebSSO.dll to the path mentioned in the error message. 2. Restart the IIS Admin service on the web server. If you are not using Aspen WebSSO, you can work around the problem by disabling Aspen WebSSO, as follows: 1. Add the value Disable under HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\AFW\WebSSO and set it to 1. The data type is REG_DWORD. 2. On the machine having this message in the event log, type the following in the web browser's Address field and press Enter: http://localhost/AtWebSSO_Reset If the steps outlined above do not disable Aspen WebSSO, the Aspen WebSSO HTTP filter can be removed by executing the Delete command using the IIS 6.0 administration script, Adsutil.vbs, as shown below: cscript //nologo adsutil.vbs DELETE /W3SVC/Filters/AtWebSSO Keywords: Event Log References: None
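As a sketch, step 1 of the disable procedure could be captured in a .reg file like the one below, using the registry path quoted in the article:

```reg
Windows Registry Editor Version 5.00

; Disables Aspen WebSSO by adding the Disable value (REG_DWORD = 1)
[HKEY_LOCAL_MACHINE\SOFTWARE\AspenTech\AFW\WebSSO]
"Disable"=dword:00000001
```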
Problem Statement: How does the Dynamic Buffer ini setting affect the customer setup buffers and safety stock?
Solution: The dynamic buffers setting should be set to 0 in most cases. Dynamic buffers are designed to keep a constant level of buffer product in the ground. The current buffer system is designed in hours: each station has a runout buffer in hours and a retain buffer in hours. If dynamic buffers are not turned on, this buffer will always be a constant time. For example, if a customer has a 12-hour runout buffer, the system will use 12 hours at the busiest sales point of the day and at the slowest sales point of the day. If a station's 12-hour buffer falls between 6AM and 6PM and the station is selling 200 gallons per hour, then for this time period the buffer in terms of gallons is 2400 gallons (200 gallons per hour x 12 hours). If the station's 12-hour buffer falls during the time period of 6PM to 6AM and that same station is now selling 100 gallons per hour, then the buffer amount in terms of gallons is 1200 gallons (100 gallons per hour x 12 hours). With dynamic buffers turned on, the system will adjust the buffer that is used, based on average daily sales and sales segments, to achieve the same buffered gallons in the tank. For example, if the station on average sells 3600 gallons per day, then a 12-hour buffer would loosely translate to an 1800-gallon buffer. During the 6AM to 6PM period with dynamic buffers turned on, this would create a 9-hour runout buffer (1800 gallons / 200 gallons per hour), while during the 6PM to 6AM period this would create a 15-hour buffer (1200 gallons / 100 gallons per hour covers the first 12 hours, then the remaining 600 gallons / 200 gallons per hour covers 3 more hours). The dynamic buffers are designed to keep the same amount of buffered product in the tank at any point. If a client wants to turn dynamic buffers on, they need to be fully aware of how they work, and that at some points using the dynamic buffer can actually reduce the buffer time in terms of hours (as in the above example, where a 12-hour buffer turns into a 9-hour buffer). 
Dynamic buffers do work very well if they are configured correctly; however, it takes time and effort to figure out exactly what the buffer should be for each station. Without proper setup, the dynamic buffer setting should not be turned on. Keywords: References: None
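The dynamic-buffer arithmetic described in this article can be sketched in a few lines. This is a toy illustration of the described behavior; the function names and the two-segment sales profile are my own assumptions, not Aspen Retail internals.

```python
# Sketch of the dynamic-buffer arithmetic from the article.

def buffered_gallons(buffer_hours: float, avg_daily_sales: float) -> float:
    """Translate an hours-based buffer into a target gallon amount."""
    return avg_daily_sales * buffer_hours / 24.0

def dynamic_buffer_hours(target_gallons: float, hourly_rates: list) -> float:
    """Walk forward through hourly sales rates until the target gallon
    amount has been consumed; return the elapsed hours."""
    remaining = target_gallons
    hours = 0.0
    for rate in hourly_rates:
        if remaining <= rate:
            return hours + remaining / rate
        remaining -= rate
        hours += 1.0
    return hours  # buffer outlasts the supplied profile

# Worked example from the article: 3600 gal/day average,
# 200 gal/h from 6AM-6PM and 100 gal/h from 6PM-6AM.
target = buffered_gallons(12, 3600)          # 1800 gallons
day_profile = [200.0] * 12 + [100.0] * 12    # starting at 6AM
night_profile = [100.0] * 12 + [200.0] * 12  # starting at 6PM
```

With these inputs, the daytime buffer comes out to 9 hours and the overnight buffer to 15 hours, matching the article's example.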
Problem Statement: Can I run the new version of Aspen Shell & Tube Mechanical with old ASME design code?
Solution: The latest version of Aspen Shell & Tube Mechanical (formerly Aspen Teams) uses the latest edition of the ASME code, i.e. V7.1 uses ASME 2007 A08, V7.2 ASME 2007 A09, V7.3 ASME 2010, etc. To run earlier ASME code editions, run the appropriate Shell & Tube Mechanical version. However, you can use any annual material database, going back to 1998 (when the allowable stresses were changed), with any version of the program. For example, you can run V7.1 with 1998 materials or any materials between 1998 and 2008, V7.2 with any materials between 1998 and 2009, V7.3 with any materials between 1998 and 2010, etc. Keywords: ASME, design code References: None
Problem Statement: In some instances, an incorrect Can-Hold value can be displayed on the What If screen. In a particular case, when you open the Optimizer, look at a load, and click on the tank Info button, the What If screen appears. If the station shows negative values in the CAN-HOLD field, this means that inventory > safe fill level and the tank is full, so you would not expect a shipment in the near future. But in the Optimizer the load is RED, indicating a must-go load, even though the quantity of the products for that load does not fit in the tank at the retain point. However, if you go out of the Optimizer and go to manual information collection (or the history list), you may see that the real inventory is much lower than the one shown on the What If screen. So the user might see a load in red indicating that it is urgent, but when looking at the tank info screen he/she sees that the tank is full, and may wonder which information is correct.
Solution: This problem occurred when several updates were provided to the inventory info. The What If screen was reading the inventory data from the FIRST INV fields in the YESTINFO table. This is okay until a customer updates the information, which is stored in the INV field. When the info was updated, the data in the FIRST INV field and the INV field became different. The system was using the INV field to do the forecast and calculate the Can-Hold value, but displaying the FIRST INV data, thus causing the inconsistency in the displayed data. Currently the FIRST INV goes through a true cycle of DQM, whereas the second and third INV updates do not go through the same DQM procedures as the FIRST INV. The problem has been fixed in 7.04.00.04: the displayed data is now read from the INV field, which contains the latest inventory data. Keywords: Inventory Can Hold What If FIRST INV INV References: None
Problem Statement: How to secure a Cim-IO for Aspen InfoPlus.21 server in order to allow Cim-IO clients to only read data
Solution: It is possible to restrict write or put access from a client (like cimio_t_api). The way to do this is with Aspen InfoPlus.21 and Local Security enabled for the machines you are reading the data from. Of course, this means it is available only for Aspen InfoPlus.21 version 3.1 and higher. You will need to create an ALS Role containing the login that starts the dlgp service, and then restrict write access to that Role. This way, the Cim-IO client task or cimio_t_api utility cannot write to the database. See step 4 of Solution ID 112090 for details on how to start the dlgp process as a service. Keywords: Security Put records References: None
Problem Statement: Can the entry that determines the number of rows in the BCU XREF table be increased after the BCU has started collecting batch data? Is there any disadvantage to making the number much larger than necessary?
Solution: The entry can be increased; however, you must stop and restart InfoPlus.21 for the change to be recognized. The disadvantages are memory size and speed. In size, each record is at least 400 bytes, so a table of 1000 entries uses 400K of memory. In speed, it really only matters each time you create a batch for the first time: the BCU has to search through the whole table to determine whether a new batch needs to be created, so it must search through 1000 entries rather than 100, for example. The cost in speed is not really that great, because each entry in the XREF table is a fairly simple data structure and creating a new batch isn't done very often (relatively speaking). The recommended size, as given in the Batch.21 System Administrator's Manual, is three or four rows for each unit defined in the Batch Conversion Utility conversion file. You should also take store and forward into consideration: if it is configured, multiply the above number by the maximum number of days you might store data. For more information please see the Batch.21 System Administrator's Manual. Keywords: BCU_XREF_rows Batch.21 References: None
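The sizing rules above can be sketched as follows. The 400-byte figure and the three-or-four-rows-per-unit rule come from the article; the function names are my own, for illustration only.

```python
# Sketch: estimate BCU XREF table sizing from the rules quoted above.
BYTES_PER_ROW = 400  # minimum per-record size cited in the article

def recommended_rows(num_units: int, rows_per_unit: int = 4,
                     store_and_forward_days: int = 1) -> int:
    """Three or four rows per unit, multiplied by the maximum number
    of days of store-and-forward data, if configured."""
    return num_units * rows_per_unit * store_and_forward_days

def memory_bytes(rows: int) -> int:
    """Lower bound on memory used by the XREF table."""
    return rows * BYTES_PER_ROW
```

For example, a 1000-row table uses at least 400,000 bytes (about 400K), as stated in the article.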
Problem Statement: This article addresses a situation of not being able to see the contents of the Aspen Local Security Server Database. In other words, the user is able to open the Aspen AFW Security Manager - They can 'Add a Role', but they cannot see the Roles or Applications that have been added to the Aspen Local Security configuration.
Solution: The most common cause is a configuration problem in Internet Information Services (IIS). In the IIS manager, look at: ServerName | Web Sites | Default Web Site | AspenTech | Afw | Security. Right-click on 'Security' | Properties | Directory Security, then click the Edit button against Anonymous access and authentication control. At this (security) level it is a requirement that the Anonymous Access box has a check-mark (tick). Integrated Windows Authentication can also be checked, but Anonymous Access must be checked. Next: the user name defined in the Anonymous Access setting must be given Read and Write access to the directory ..\AspenTech\Local Security\Access97. Finally: Read and Write access for the Anonymous User account must also be given to the AFWDB.mdb file in that directory. NOTE: A restart of IIS may be needed after making such changes. Keywords: None References: None
Problem Statement: When logging into Aspen AFW Security Server, a user gets an Invalid user name or password error.
Solution: This error is related to Windows authentication problems; it does not have anything to do with securing the AFW Security Server snap-in. In one previous case, a domain-wide policy was issued that tightened the security settings and ended up disabling NTLM (LAN Manager) authentication. This might not always be the cause, but you will definitely need to involve your IS team to track down any kind of authentication issue at the operating system level. Keywords: Invalid user name or password AFW Security Manager References: None
Problem Statement: How is class of trade set up and what are valid values for this area? How does class of trade work?
Solution: Class of Trade is used to differentiate the status of a customer with regards to channel of trade. The purpose of this is to integrate the Aspen Retail product with the Aspen Bulk product. The level of the integration consists of a passing of the demand data from the Aspen Retail application to the Aspen Bulk application. This will require a rolling up of the Aspen Retail data by terminal, product, and channel of trade. The Aspen Bulk application will then accept this demand data as an override to the existing forecast for whatever period of time is made available from the Aspen Retail software. Keywords: Class of Trade References: None
Problem Statement: How does safety stock work?
Solution: This describes how safety stock is designed. First, safety stock is not treated the same way as pump stop: the system treats safety stock as extra inventory in the ground and not as an absolute runout point. This means that the system will try to use the safety stock number, but if the delivery window is too tight or the system cannot allow for it to be used, it will still revert to the actual runout point or pump stop. Secondly, safety stock does not affect the runout point at all. The displayed runout point in Aspen Retail will always be the actual non-buffered runout point; safety stock only affects the must-go shift of the load. Here is how the safety stock functionality works. First, the system takes the volume in the safety stock field and translates it into an hours number based on projected sales and sales segments. Once it has this hours number for safety stock, it compares it to the runout buffer hours and applies the bigger number as the new runout buffer. For example: Runout buffer = 12 hours; Safety Stock = 5000 liters / 5000 liters average daily sales = 24 hours. The system will apply a 24-hour buffer. Runout buffer = 12 hours; Safety Stock = 5000 liters / 15000 liters average daily sales = 8 hours. The system will apply a 12-hour buffer. The system treats this runout buffer as a normal buffer; therefore, if the 24-hour buffer from our first example cannot be applied to the load because of window constraints, the system will cut this buffer in half and try a 12-hour buffer. Keywords: Tanks Safety Stock References: None
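The buffer-selection rule above can be sketched as follows. This is a toy illustration: the simple volume-to-hours conversion assumes a flat daily sales rate, whereas the real system also uses sales segments, and the function names are my own.

```python
# Sketch: choose the effective runout buffer from safety stock,
# per the rule described above (the larger of the two wins).

def safety_stock_hours(safety_stock_volume: float,
                       avg_daily_sales: float) -> float:
    """Translate a safety-stock volume into hours, assuming a flat
    sales rate over the day (real system also uses sales segments)."""
    return safety_stock_volume / avg_daily_sales * 24.0

def effective_runout_buffer(runout_buffer_hours: float,
                            safety_stock_volume: float,
                            avg_daily_sales: float) -> float:
    """Apply whichever is bigger: the runout buffer or the
    hours-equivalent of the safety stock."""
    return max(runout_buffer_hours,
               safety_stock_hours(safety_stock_volume, avg_daily_sales))

# Example 1: 5000 L stock / 5000 L daily sales -> 24 h, beats 12 h
# Example 2: 5000 L stock / 15000 L daily sales -> 8 h, 12 h kept
```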
Problem Statement: How do I get ASME 2015 edition codes?
Solution: ASME 2015 edition is available in Aspen Exchanger Design and Rating V8.8.2. In Results > Code Calculations, ASME 2015 edition codes are shown. Keywords: ASME 2013 edition, ASME 2015 edition, Shell&Tube Mechanical References: None
Problem Statement: Why do SB-171 C36500 and C44300 Cls materials return a Yield Strength Error with ASME 2011 database?
Solution: The ASME Section II, Part D, Table Y-1UNS does not have yields listed for C36500 and C44300. Aspen Shell & Tube Mechanical can only show what ASME has provided. Please note that not all SB-171 materials have yields provided by ASME. But many do, such as SB-171 C61400 Cls O25 Plate(2 <t<=5). Keywords: yield strength error, SB 171, C36500, C44300 References: None
Problem Statement: Aspen OnLine issues an error message indicating there was a failure loading Aspen Plus, APWN.exe. While Aspen Plus can run its test problems successfully, Aspen OnLine cannot run its test problems without this error.
Solution: There is a registry setting that may have a value that is too low. Open the Registry Editor, REGEDIT (from START | RUN, type REGEDIT). Navigate to the following key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems Find the value named Windows. The value data is: %SystemRoot%\system32\csrss.exe ObjectDirectory=\Windows SharedSection=1024,3072,2048 Windows=On SubSystemType=Windows ServerDll=basesrv,1 ServerDll=winsrv:UserServerDllInitialization,3 ServerDll=winsrv:ConServerDllInitialization,2 ProfileControl=Off MaxRequestThreads=16 It is the 3rd number in the SharedSection that must be modified. The number should be between 1024 and 2048, in increments of 128. On some installations the default value is 512, and this value is too low to start Aspen Plus. Keywords: References: None
Problem Statement: How to resume dispatch operations after a database disaster, assuming the hot-swappable disk drives aren't usable. Are any of the following options available for operating Aspen Retail: Can Aspen Retail be operated in a limited-function mode? If so, what are the options? Can any programs operate stand-alone? What tables must exist and be populated? What tables must exist but can be empty? What tables do not have to exist?
Solution: Can Aspen Retail be operated in a limited-function mode? No. Can any programs operate stand-alone? Yes (they are all executables). What tables must exist and be populated? All tables are used by one program or another, but a good number of them do not need to be populated. What tables must exist but can be empty? Those that are not used to model this particular client. What tables do not have to exist? None; they all have to exist, and even some of the TCIF tables are used internally. It is safe to back up all tables: even if a table is not used internally in this version, it may be in the next version. Keywords: References: None
Problem Statement: Some customers who try to regenerate batch data (start BCU scheduling from an earlier date) after purging all batch data from the database may obtain the trigger error: Invalid Batch ID (B21SVR-50104).
Solution: To resolve this problem, you will need to delete the recent batches cache: disable the Batch units in the Scheduler Table remove (delete) the units from the Scheduler Table delete the recent batches cache file (right click within the Recent Batches Cache window and select Delete All) purge the data in the batch database re-install and re-schedule the units The following information is obtained from the Batch.21 help file: The RecentBatchesCache report displays a list of all batch IDs and associated designators that were cached into memory on the BCU server. Caching mainly improves the processing speed. To display the RecentBatchesCache report: On the BCU Administrator, Main Window, use the Main Menu to select RecentBatchesCache from the Server menu to open the RecentBatchesCache report. Refresh Refresh the RecentBatchesCache report with the contents of the recent batches cache. Delete Remove the selected batch from the cache. Delete All Deletes all batches from the cache. Details Opens the Batch Detail Display for the selected batch. Keywords: B21SVR-50104 Batch ID Invalid 50104 purge regenerate References: None
Problem Statement: Sometimes an Oracle administrator needs to grant the appropriate roles and privileges to an account (note: granting ALL PRIVILEGES is not appropriate). Below is an explanation of the privileges and roles required by the Oracle account used by Batch.21.
Solution: Here are the privileges the Oracle account will need: CONNECT, RESOURCE, CREATE PUBLIC SYNONYM, CREATE ANY TRIGGER, CREATE ANY SYNONYM, SELECT ANY TABLE, SELECT ANY SEQUENCE, EXECUTE ANY PROCEDURE, INSERT ANY TABLE, UPDATE ANY TABLE, DELETE ANY TABLE Keywords: oracle privileges batch.21 References: None
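As a sketch, a DBA could grant the privileges listed above with statements like the following (B21USER is a placeholder for the actual Batch.21 account name, not something created by the product):

```sql
-- Grant the roles and system privileges Batch.21 requires
-- (B21USER is a placeholder account name)
GRANT CONNECT, RESOURCE TO B21USER;
GRANT CREATE PUBLIC SYNONYM TO B21USER;
GRANT CREATE ANY TRIGGER TO B21USER;
GRANT CREATE ANY SYNONYM TO B21USER;
GRANT SELECT ANY TABLE TO B21USER;
GRANT SELECT ANY SEQUENCE TO B21USER;
GRANT EXECUTE ANY PROCEDURE TO B21USER;
GRANT INSERT ANY TABLE TO B21USER;
GRANT UPDATE ANY TABLE TO B21USER;
GRANT DELETE ANY TABLE TO B21USER;
```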
Problem Statement: When solving a model under XNLP, sometimes the following message appears at the end of the Execution Log, even though the solution is successful: Variable Qquapol at bound xxx with DJ = yyy for 'Variable Name' = zzz What does it mean, and what can be done to get rid of it?
Solution: In the XNLP formulation, the variables Qquapol represent the value of quality qua from pool pol. Ideally, Q variables should not be at their bounds with a non-zero DJ at the solution. If that happens, it may be a local optimum. It is better to check the bound on that particular Q variable and see why it is at the bound. It could be a modeling problem, or it could be related to the starting point. You may note in some instances that the message is not recreated when you run the model again, or that it appears for another Q variable. One way to overcome it (assuming that checking the model did not help) is to relax the bound on the Q variable. You can either use Table SCALE to relax it for all recursed variables of that quality, or you can use Table XBOUNDS (a new table introduced in Aspen PIMS version 17.1.6, located under the Miscellaneous branch) to relax a specific Q variable. The XBOUNDS table is used to impose bounds on XNLP-created variables, such as the quality variables (Q columns). XBOUNDS has the same format as Table BOUNDS, as shown below. Keywords: Quality variable DJ Table XBOUNDS References: None
Problem Statement: This knowledge base article describes the options for setting the LogInfoToServer key in AFWTools.
Solution: The LogInfoToServer key in AFWTools, located in the registry under HKLM | Software | AspenTech | AFW, is a client-side setting used to enable logging of the authentication and authorization operations to the security database. These logs can be displayed when you use the AFW Security Manager. Through Aspen Manufacturing Suite (AMS) 6.0, authentication is always logged and is not affected by the value of this registry key. With AMS 6.0 Service Pack 1 (AFW/ALS ver 2.6.1.410), the logging has been made more configurable. The configurable registry values are listed below: 0 - No logging 1 - Log both authentication and authorization (the authorization CheckAccess() call is also logged) 2 - Log authentication (this is the default value created by the install) 3 - Log authorization Keywords: AFW Tools LogInfoToServer References: None
Problem Statement: What could cause the Aspen Chromatography icon to launch Aspen Custom Modeler?
Solution: This icon problem, where launching defaults to Aspen Custom Modeler, occurs because the registry does not have an entry for the Aspen Chromatography parameter /CHM. HKLM\Software\Wow6432Node\Aspentech\AMSystem\Options is the location of these parameters; if there is not an entry representing one of the Advanced products, the default is taken and Aspen Custom Modeler opens instead. Keywords: Aspen Adsorption, Aspen Model Runner, Aspen Chromatography, Aspen Custom Modeler References: None
Problem Statement: What exactly does the load confirmation service do?
Solution: The Load Confirmation NT Service processes records that contain information regarding the status of dispatched shipments. The TCIF_LOADCONFIRM table is used to pass loading or delivery confirmation information about shipments from the mainframe system to Aspen Retail. This applies only to inventory-managed accounts. The Load Confirmation Service also accommodates Terminal-to-Terminal transfers and Transport-to-Transport transfers. If the Load Confirmation service receives a load confirmation record and the delivery source differs from the original source on the shipment, the service will update the shipment with the new delivery source. Similarly, if a load confirmation record is received and the transport differs from the original transport on the shipment, the shipment is updated with the new transport. The service also has the capability to accept load confirmations for order entry customers. The mainframe system inserts records into the table with the new information flag (N_INFO_FLG) set to 'T'. As the Load Confirmation NT Service runs, it selects records from the TCIF_LOADCONFIRM table where N_INFO_FLG = 'T'. When processed, the service sets each record's N_INFO_FLG to 'F'. A mainframe process is responsible for removing the records that have been read. The TCIF_LOADCONFIRM table contains a unique key using the customer id and the unique Aspen shipment number; there is a unique constraint on each combination of uniquenum, statnum, and entry_date. Keywords: Load Confirmation References: None
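The record life cycle described above can be sketched in SQL (table and column names are as given in the article; the exact statements the service issues internally are an assumption):

```sql
-- Records the mainframe has flagged as new and unprocessed
SELECT * FROM TCIF_LOADCONFIRM
 WHERE N_INFO_FLG = 'T';

-- After processing a record, the service marks it as read;
-- a mainframe process later removes the read records
UPDATE TCIF_LOADCONFIRM
   SET N_INFO_FLG = 'F'
 WHERE N_INFO_FLG = 'T';
```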
Problem Statement: Getting HTTP Error 404.17 - Not Found (The requested content appears to be script and will not be served by the static file handler) when trying to connect to the Aspen Local Security Server, http:///AspenTech/AFW/Security/pfwauthz.aspx. The 404.17 error indicates that dynamic content in Windows Internet Information Services (IIS) is mapped to the static file handler. This could be caused by a bad installation of IIS, a reinstallation, or a machine-specific IIS setting.
Solution: Check the following IIS configurations on the Security server to resolve the problem: * Check that there are no restrictions under ISAPI and CGI Restrictions in the IIS section for the computer. Go to IIS Manager => Computer Name (System Name), go to ISAPI and CGI Restrictions, and allow any Not Allowed restrictions. * Revert to parent in the Handler Mappings under the AspenTech site on the Default Web Site; the AspenTech website may have incorrect request handlers defined in IIS. Go to IIS Manager => Computer Name (System Name) => Sites => Default Web Site, select the AspenTech site, and double-click Handler Mappings; the list of handler mappings for the AspenTech site is displayed. If the handler mapping list for the AspenTech site is shorter than the list for the Default Web Site, click Revert To Parent from the Actions panel on the right-hand side of the AspenTech site. * Manually register the ASP.NET Framework v4. Open a Windows CMD screen using Run as Administrator and enter the following command to register the ASP.NET v4 framework, then restart IIS: C:\Windows\Microsoft.NET\Framework\v4.0.30319\aspnet_regiis -i Keywords: http 404 not found StaticFile References: None
Problem Statement: Why is the truckstop id written to the TCIF_Export_Orders table instead of the loading terminal id?
Solution: When a load is manually put on a truck (the truck is based at a truck stop), it is possible that the loading terminal is not open yet. In this case, the terminal ID of the truck stop is written into the export orders table. Keywords: References: None
Problem Statement: How do I solve Error 102 in Aspen Shell & Tube Mechanical?
Solution: A cone on the channel side needs a nozzle for the small diameter of the cone, placed at position 1, angle 360 deg, and/or position 9, angle 360 deg. This nozzle diameter then becomes the small diameter of the cone. To specify this in EDR Mechanical, go to Input | Exchanger Geometry | Nozzles-General | Nozzles and input the correct location and position for the nozzle. Now your exchanger should look like this: Keywords: Error 102, nozzle, conical head References: None
Problem Statement: How is the Total Cost [Reg Wage + O/T Wage + Op Costs] calculated in this report, and how is operating cost calculated? Operating cost appears to use the Edit Cost Type form's entries: Volume * Cost/Vol from the volume chart, plus Distance * Cost/Distance, and also the minimums for time and distance, if entered. Since the minimum charge values (distance and hours) are applied to each delivery on the truck driver's shift, it may not be possible to reproduce the operating costs using only the report's calculated totals for distance and volume; the volume and distance of each delivery must be compared against the minimum hours and distance. The volume charts, when used, also make it impossible to calculate the operating costs using only the report's summed totals for hours, distance, and volume. Hand calculations do not match the columnar data in this report; the report and screen captures of the Edit Cost Type forms will be provided.
Solution: In this report, all carriers are proprietary and each made 2 deliveries. Aspen Retail assumes that the proprietary trucks will charge you for 12 hours of labor, even if they worked fewer hours. This would explain why all the operating costs were the same despite the different hours worked each shift. Assuming, as with your own carriers, that there is no cost-per-volume charge or distance charge, this works out to $17.30/hour for the proprietary carriers. It could also be that your proprietary carrier charges a lower hourly rate but adds a fixed charge per delivery. Keywords: References: None
Problem Statement: Scripts used to clean up the interface tables in Oracle and in MS SQL
Solution: For cleaning up Oracle, please download the attachment ClearInterfaceTableProcedure.sql, a script that creates a procedure in Oracle to clear the interface tables. For cleaning up MS SQL, please download the attachment ClearInterfaceTableProcedure.sql, a script that creates the equivalent procedure in MS SQL. By default, each procedure deletes data that is more than 30 days old. Keywords: Interface Tables Clean up MS SQL Oracle Script References: None
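As a rough, hedged sketch of what such a cleanup procedure does (the table name TCIF_LOADCONFIRM and the ENTRY_DATE column are used for illustration only; the actual attached scripts cover all interface tables):

```sql
-- Illustrative Oracle form: purge interface rows older than 30 days
DELETE FROM TCIF_LOADCONFIRM
 WHERE ENTRY_DATE < SYSDATE - 30;
COMMIT;

-- Illustrative MS SQL form of the same purge:
-- DELETE FROM TCIF_LOADCONFIRM
--  WHERE ENTRY_DATE < DATEADD(day, -30, GETDATE());
```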
Problem Statement: What does the INI Setting DisplayMovedOrders do?
Solution: The DisplayMovedOrders ini setting allows you to specify whether or not to display (in RSO) the dispatched orders for a station that was moved out of the selected group, for the selected shift. Keywords: RSO References: None
Problem Statement: AFW Security Manager content suddenly becomes empty. All previously configured Applications and Roles have disappeared. While observing this behavior, all related InfoPlus 21 applications run normally without any security access issues.
Solution: Reset IIS. From a command prompt, type iisreset and press Enter. Once this completes, all Application and Role settings in AFW Security Manager should display correctly. Keywords: AFW Security manager empty disappear Vanish References: None
Problem Statement: What would cause an account to show FORECASTED in the status report in the morning after DONE FOR THE DAY has run? 1. Verify that DONE FOR THE DAY actually ran at night. This can be done by looking at the TCIF_APPSTATUS table to see what time DONE FOR THE DAY last ran. 2. In addition, check the ODBC and DBERR logs to make sure there was not a database disconnect or error. If DONE FOR THE DAY did run correctly, check the INFOUPDATE log for the customer to see whether any new information was entered for the customer after the DONE FOR THE DAY service ran. If new information is entered for the client, it is possible for the status report to display the account as FORECASTED.
Solution: The solution is to make sure that the Done For The Day service has run successfully. Keywords: Done for the Day References: None
Problem Statement: Is there a way to capture the overflow orders in the optimizer?
Solution: The Overflow Orders Report is a list of the overflow shipments grouped by terminal and provides the reason each order is in the overflow list. Knowing why each order is in overflow reduces the amount of time needed to research why the optimizer put the load in overflow. Keywords: Reports Overflow Orders Optimizer References: None
Problem Statement: On the Replenishment Planner's Order forms, there is a field at the bottom called Private Notes. What is this used for and where is it stored?
Solution: The Private Notes field allows the user to place a comment in this field. The comment will only be displayed when viewing that particular order; it will not be displayed on any reports or on the dispatch tender. The private note data is stored in the Orders2 table, in the PCOMMENTS field. Keywords: Private Notes, Replenishment Planner, Orders2 table References: None
Problem Statement: While using Aspen Retail v7.04, in some cases the total shift hours for some trucks (which in general should be 11.50 hours) are different (e.g., 2.02 hours). When we double-click the truck, the weekly setup for the truck is correct. Note that in both cases the trucks are of Type 2, fixed. Whatever we do, we cannot change the setting to get the correct 11.50 hours.
Solution: This can happen when the pre-time setting inside the truck has been set to 999 for whatever reason. The problem can be removed by simply going into the truck screen and removing these numbers. The way the system treats pre- and post-time is that it subtracts them from the entire drive time the truck is set up for. For example, if you have a 2-hour preload time set up for a driver and place the driver on duty for 10 hours, the system will only place the driver on duty for 8 driving hours. Correctly set the regional settings, then delete the 999 pre-trip time and enter the correct one. Once the 999 pre-trip time is cleared, you should be able to correctly set the hours on the truck. Keywords: References: None
Problem Statement: When opening an existing case, the program shows a message indicating problems with the calculation of the yield stresses for bolts, as shown below. Also, when running a design with body flanges, the yield stresses are not reported for the bolts, and they do not show up in the database either.
Solution: Unlike the previous databases, the ASME 2011 database is the only material database that includes the bolt information. However, this database is not selected as the default database for the program. In order to have the bolt information available and let the software calculate the bolt yield strengths, the ASME 2011 database needs to be selected. To do so, go to Tools | Program Settings and, under the Files tab, select the proper database. Keywords: ASME 2011 Bolt yield stresses References: None
Problem Statement: Every truck has 5 free restriction fields that can be used to restrict stations and trucks. However, sometimes the system was not respecting the truck restrictions when a user manually placed a load on a truck using the Automatic Fit function. For example, a station is restricted from Truck Type 9999. If one pulls the order into storage inside the optimizer, then manually places the order on a Truck Type 9999 and pushes Automatic Fit, the system places this order on a truck the customer is supposed to be restricted from. No check is made on this restriction field, so the dispatcher has to check this manually.
Solution: Fixed in Release 7.04.00.04 (Service Pack) Keywords: Optimizer Restriction Truck Type References: None
Problem Statement: In the 7.4 manual, page 57, the descriptions for the Diesel Compartment Group and the Full Diesel Group appear the same; the Full Diesel Group's last line even refers to the Full Diesel Group. Can you please give a better description of both of these functions, and also describe what 1-3 and 4-4 mean if these are displayed by the system?
Solution: See the attached document for detailed explanation Keywords: References: None
Problem Statement: How can I specify the exact nozzle locations in Aspen Shell & Tube Mechanical?
Solution: The exact nozzle locations can be specified on the Input -> Exchanger Geometry -> Nozzles-Details-Ext.Loads -> Penetr/Proj/Distances tab. This sheet can be used to specify the nozzle projection, elevation, and distance details as well. For instance, if you would like to define the distance from the nozzle center line to the tubesheet gasket face, specify those values under the Distance from nozzle centerline, Gasket column. Keywords: nozzle location, Penetr/Proj/Distances, Distance Nozzle CL References: None
Problem Statement: How can I call up a list (either printed or on screen) of all the outlets that are on credit hold? The reason for this is that the dispatcher can check, before starting the optimizer, which customers are on credit hold and check with the people responsible whether one or more of these credit holds might be lifted while he is dispatching.
Solution: Inside the folder that contains the Aspen Retail applications is an application called Credrep. This utility gives you the option of controlling your credit-hold accounts. You can release orders from this application, and you can put sites on hold and take sites off hold. If you go to the File menu and select Print, Customer Hold Report, it will print out all customers that are on hold, a list of pending orders, and any associated comments. Then you can take whatever action is needed. Keywords: Credrep Credit Hold References: None
Problem Statement: Why are the must-go orders (red) being sent to the overflow, while the can-go (yellow) orders are being dispatched by the optimizer?
Solution: There are many scenarios in which a can-go (yellow) order would go out before a must-go (red) order. When this occurs, it is usually because there is a problem with the must-go load: the red load has some constraint that can't be met, so the load is put into the overflow area. The optimizer, which is focused on reducing trucking costs, tries to fill the hole in the schedule created by the problem red load with a can-go order. Some of our customers use the overflow area as a sort of 'diagnostic memo pad': you can try to make the problem order go manually and troubleshoot the cause of that particular problem load. Keywords: References: None
Problem Statement: What tables are used for Sales and Inventory input?
Solution: The basic flow of sales and inventory information is:
Unreconciled sales and inventory data ==> CURINFO
Unreconciled sales-only data ==> NEWINFO
Unreconciled inventory-only data ==> NEWINFO
Unreconciled 2nd inventory data ==> UPINFO
Reconciled data ==> YESTINFO
The only data that either the Replenishment Manager or the Optimizer uses is in the YESTINFO table. All other data is pre-Exception-Processing and is considered unreliable. Keywords: Sales Inventory Sales and Inventory References: None
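Since only YESTINFO is trusted, a quick way to spot-check what the Replenishment Manager and Optimizer will actually see is a query like the following (the STATNUM filter column is an assumption used for illustration):

```sql
-- Illustrative: reconciled sales/inventory data for one customer
SELECT *
  FROM YESTINFO
 WHERE STATNUM = 12345;
```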
Problem Statement: When using Retail 7.5, the DSN path and ODBC name are not displayed in the login screen.
Solution: There are two possible causes of this problem. First, check that the data source name and the net login path are correct in the Customize.ini settings. Please see the example below. If these settings are correctly filled in, please see step 2. [RetailTCIF] dataBase=ORACLE DataSourceName=at8i netLoginPath=at8i.world Password=rdrlg29 UserName=custom741 Version=7 [RetailTPS] dataBase=ORACLE DataSourceName=at8i netLoginPath=at8i.world Password=rdrlg29 UserName=custom741 Version=7 A second cause is that the registry settings are not correct. You will need to adjust the SETTINGS_DIRECTORY path for Aspen Retail. To do this, go to the RUN command in the Start menu and type REGEDIT to bring up the Registry Editor. Expand the HKEY_LOCAL_MACHINE folder and find the SOFTWARE folder. Expand the SOFTWARE folder to get to a folder called ASPENTECH. Under the ASPENTECH folder you should see a folder called COMMON. Inside the COMMON folder, double-click SETTINGS_DIRECTORY and type in the path where the application is installed. The normal default setting for this is C:\Program Files\AspenTech\Aspen Retail Suite\Aspen Retail. Please refer to the attached screen shot for this procedure. Keywords: database ODBC name DSN path References: None
Problem Statement: Can Aspen Framework Security and Aspen Local Security be installed on the same server?
Solution: Aspen Local Security and Aspen Framework Security cannot coexist on the same system. For example, if you already have a system with Aspen Local Security configured, you cannot install Framework Security on the same system. In order to install Aspen Framework Security, the Aspen Local Security Component needs to be un-installed first. Keywords: Local Security Framework Security ALS, AFW References: None
Problem Statement: Why does Aspen Shell and Tube Mechanical give both calculation results for Integral and Loose Flange in Body Flange code calculation?
Solution: This is probably because the user selected an Optional type flange as the code type. ASME Section VIII-1 defines Optional Type Flanges as follows: "Optional Type Flanges. This type covers designs where the attachment of the flange to the nozzle neck, vessel, or pipe wall is such that the assembly is considered to act as a unit, which shall be calculated as an integral flange, except that for simplicity the designer may calculate the construction as a loose type flange." Optional type flanges can therefore be calculated as integral or loose. The program performs both calculations and selects the type that results in the smaller flange thickness. If the user wants an integral or loose flange type specifically (not optional), the user can change the flange type to integral or loose. Keywords: Code Calculation, Body Flange, Integral Flange, Loose Flange. References: None
Problem Statement: The ATReplicateSheet(VarParm, MDBFileName) function is used to create multiple copies of the worksheet it is located in. It supports two methods of determining how many copies of the worksheet to create. The first method involves passing an SQL query string as a parameter. The function will execute the SQL query and create a copy of the worksheet for each row of data it returns. In addition, an Excel worksheet-level defined name is created in each copy that will contain the values of each column in the row, with the names of the columns being used as the defined names. This is done so that other functions in the worksheet can use these values as arguments to other functions or as part of cell expressions. The second method involves passing an array of strings as a parameter. The function will create a copy of the worksheet for each member of the array and create the Excel worksheet-level defined name REPLNAME containing the value of the array element. This is done so that other functions in the worksheet can use this value as an argument to other functions or as part of cell expressions.
Solution: This document shows how to use the SQL query string as a parameter, along with the Excel worksheet-level defined name. It is used when you need to filter the data by some criteria, instead of specifying a fixed array of data. For comparison purposes, the array-of-strings method is also shown. Usually this function is used in conjunction with ATRenameSheet(), which is needed to give a distinct name to the replicated sheets. Also, the Excel name is used as a parameter in most of the functions used in the template. For this purpose, an SQL query that returns only one column of data is entered as the first parameter to the ATReplicateSheet(VarParm, MDBFileName) function. The name of the column, defined in the SQL query, will automatically be the name of the Excel worksheet-level defined name. This Excel worksheet-level defined name is a variable that references a specific cell, a range of cells, or a value; in this case, it references a value. Necessary steps: Create the query that will select the data that you want to report: for example, UNITS, BLENDS, STREAMS, etc. Create the Excel name. To do so, go to Insert | Name | Define. At the top of the window that appears, enter the name you want to define; at the bottom, enter the value. In this case you can leave it empty by writing =. Another possibility is to enter the same name as the value; this is not a problem, as it will be overwritten anyway when Report Writer executes. In the attached example, both methods are shown: In sheet "Unit - Excel Parameter", in cell B2 the following SQL query is used: =ATReplicateSheet("select distinct UNITOP from PMUNITOP, PMGROUPS where UNITOP like 'SCD*' and UNITOP=STRMID and GRPID= '#UNIT'",) This will select all the UNITOP data that begins with 'SCD'. UNITOP, the column name of the data brought back, will also be the Excel worksheet-level defined name: in each of the replicated sheets, the value of that name will be one of those returned by the query. 
This name UNITOP is used in the same template as a parameter for renaming the sheet as well as for selecting data for the report itself. Sheet "Unit - RW Group" shows how to pass an array of strings, as well as the use of the Excel name REPLNAME, created by Report Writer. In this case, this name is fixed; in the previous case, the name is just the selected column name of the query. As an example of how to use this Excel name as a parameter, note that the function =ATRenameSheet("UNIT " & UNITOP) would rename the first sheet as UNIT SCD1 if the first retrieved value for the query is SCD1. The syntax is also shown for combining a string ("UNIT ") with an Excel name value (UNITOP); the & symbol is used for this. Keywords: Excel worksheet level defined name ATReplicateSheet() ATRenameSheet() SQL Query Parameter References: None
Problem Statement: Script to be used for creating new users in MS SQL
Solution: Please download the attached script (create_MSSQL_user.sql) for creating a new user in MS SQL. You need to log in with the SA account (or equivalent) and point to the database where the user will be created. You will need to edit the username and database name to suit your needs. Keywords: Script MS SQL New User References: None
Problem Statement: How to use the Holiday Storm Planner for next year and subsequent years?
Solution: The Holiday Storm Planner allows you to specify a reference period, and it now has a new notes feature. Go to the Utilities pull-down menu and click Holiday Storm Utility. Click the NOTES button to create a new note, or List All Holiday Notes to review and modify existing notes. Keywords: Holiday Storm Planner References: None
Problem Statement: The Aspen Framework (AFW) Security Client Tool is used to monitor and administer the AFW Security Client Service. The polling cycle frequency and time threshold to wait before reissuing a Security Server request are both specified on the main dialog screen. The Advanced Screen can be used to show how often requests are being reissued. Common security functions using both the in-process and out-of-process implementation approaches can be exercised on the Test Security Functions form. The Test Security Functions form contains two radio buttons: Out of Process (Authorization Service) In Process (Authorization DLL) What is the user context when choosing either one of them?
Solution: If you run the Test Security Functions tool and choose the "Out of Process" button, it uses the logon account of the AFW Security Client service. If you pick the "In Process" radio button, it will test based on the logged-on user. See Solution 110390 for more details about this configuration option. Keywords: None References: None
Problem Statement: Is there a way to keep the dispatchers from entering manual orders for all customers associated with one terminal? Oftentimes, when running Aspen Retail in parallel with the customer's legacy system, dispatchers will enter orders into Aspen Retail before it is live (the legacy system is still live at this point).
Solution: Two possible solutions: Modify the stations table and set OrdersAhead to 0 for each customer associated with the particular terminal. When the terminal goes live, modify the table so that OrdersAhead equals some acceptable default value, such as 5. OR Add a prefix number in front of each customer number - a prefix that would be unique, like 999 or 888, etc. This way the dispatcher cannot find the customer. When the terminal goes live, do a search to remove that numeric prefix from each affected customer at that terminal. Keywords: References: None
Problem Statement: In the customer and terminal setup screens, the user has the possibility to enter a phone and/or fax number. The number is split up into a prefix and the actual number. In some European countries the prefix is longer than 3 digits.
Solution: The workaround is simple: put everything in the second field. Keywords: Terminal Setup Prefix field References: None
Problem Statement: How do I know if I'm using AtWebSSO?
Solution: Web-Based Single Sign On (AtWebSSO) is installed automatically with Aspen Local Security, or as one of the Aspen Framework components. AtWebSSO may be necessary for some of the Aspen Role-Based Visualization Portal Content Packs to work. There are two tools you can use to determine whether you are using AtWebSSO: 1. WebSSO Configuration Utility. The Aspen Security component installs the WebSSO Configuration Utility, which can be found at: <drive>:\Program Files\AspenTech\BPE\WebSSOConfigTool.hta This configuration tool allows you to see all registered applications and elements of IIS. It allows you to add web applications in secured or unsecured mode, as well as remove previously registered WebSSO applications. 2. WebSSO Diagnostics. The Aspen Security component installs the WebSSO diagnostic application <http://support.aspentech.com/webteamcgi/SolutionDisplay_view.cgi?key=113994>, which you can find at: <drive>:\Program Files\AspenTech\BPE\SSD.hta This diagnostic tool will analyze your Aspen Web-Based Single Sign On configuration and perform dynamic tests. It can be run on the Common Logon Server or any IIS server that is secured with Aspen Web-Based Single Sign On. It checks your Single Sign On (SSO) and IIS configurations, determines the responsiveness of key components, performs general network tests, and offers known solutions to identified problems. Windows Management Instrumentation (WMI), Active Directory Services Interface (ADSI), and Windows Script Host (WSH) are used to perform these tests. All results are echoed to SSDResults.txt in the folder used to launch the diagnostic. Keywords: Web-Based Single Sign On IIS References: None
Problem Statement: I cannot add a station to a cluster. I receive a dialog box stating the station already belongs to one, but I know that it doesn't.
Solution: When deleting a cluster it is important to do it in sequence: remove the sites from the cluster, then delete the cluster. When a site is added to a cluster, the stations table (column name SPLITLOAD) is marked with the cluster number. If the cluster is removed from the tps_clusters table without removing the sites, this column remains unchanged, and therefore the site cannot be re-assigned to a new cluster. If a cluster was removed from tps_clusters without removing the sites first, go to the tps_stations table and edit the column SPLITLOAD to '0'. This will allow you to re-assign the site to a different cluster. Keywords: Clusters Split Loads References: None
Problem Statement: Why is the system forecasting large amounts of side products that the station is not going to sell?
Solution: Aspen Retail is an inventory management system that stresses both proportional loads and lowest-cost dispatch. In some rare cases, stations that historically sell low volumes of side products can slowly become out of balance on these side products. This occurs because the system will always choose a less proportional, cheaper full load over a more costly, more proportional short load. One current option to minimize this issue is to put maximum load sizes on the affected stations. The best way to do this is to establish which product is the controlling product (i.e., which product is driving the delivery schedule). In almost all cases this product will be regular unleaded. Next, look at the forecasted orders for the stations and see on average how much of the controlling product is being delivered on each load. This is the quantity that the maximum load size should be set at. In the above example, assuming UNLEADED is the controlling product, we would probably want to set the maximum fill level around 6000 gallons. The maximum fill level field is located in the Customer Set under Restrictions. Future versions of Aspen Retail will put more weight on carry costs to help relieve this issue. Keywords: Inventory controlling product References: None
Problem Statement: It is sometimes necessary to modify existing batch data using XML. Below are brief examples illustrating how to do so.
Solution: NOTE: The examples in this KB article are for sending data to the data source object. If you are using the web service, please remember that it sends the XML to the data sources object, which would require the code samples below to be modified to include the data source name.

1. Specifying the batch to modify
There are two possible methods available to specify the batch for which data will be added or modified. The first involves batch handles; the second involves batch designators. The batch handle can be used if the exact internal batch ID of the batch is known.

<BatchData xmlns="Aspentech.Batch21.BatchData">
  <Area name="Demo2">
    <Batch>
      <Handle>calufrax.14</Handle>
[...]

The batch designator can be used if the batch ID is not known, but the designator value is known.

<BatchData xmlns="Aspentech.Batch21.BatchData">
  <Area name="Demo2">
    <Batch>
      <Designator name="Batch No">976</Designator>
[...]

If the batch area uses multiple designators, it is necessary to specify all of them:

<Designator name="Batch No">976</Designator>
<Designator name="Campaign">2003</Designator>

2. Specifying items to add to the batch
New subbatches and characteristics can be added to the batch in the same way as is done for a new batch. Specifying an instance value of "next" will add an instance one larger than the largest that already exists. The following XML will add instance 1 of Batch Cost if none already exists, or instance 2 if one already exists:

<BatchData xmlns="Aspentech.Batch21.BatchData">
  <Area name="Demo2">
    <Batch>
      <Designator name="Batch No">976</Designator>
      <Characteristic name="BATCH COST" instance="next">
        <Value>1500</Value>
        <Timestamp isUTC="1">2002-10-07T23:01:25</Timestamp>
      </Characteristic>
[...]

3. Adding new items to existing subbatches
New subbatches and characteristics can be added to an existing subbatch by specifying the desired subbatch instance -- in this case, "first", "last", "auto", or n (any number greater than or equal to 1).
A subbatch instance value of "auto" will select the highest instance if one exists, or will create instance 1 if no instances exist. Instance values of "first" or "last" will select the appropriate instance if it exists; these instance values will return an error if no instances exist. An instance value of n will select that instance if it exists, or will create that instance if it does not exist. For example, the following XML will add the Start Time characteristic to the first instance of the Mix subbatch.

<BatchData xmlns="Aspentech.Batch21.BatchData">
  <Area name="Demo2">
    <Batch>
      <Designator name="Batch No">976</Designator>
      <Subbatch name="Mix" instance="first">
        <Characteristic name="Start Time" instance="auto" isUTC="1">
          <Value>2002-10-07T23:01:25</Value>
          <Timestamp isUTC="1">2002-10-07T23:01:25</Timestamp>
        </Characteristic>
      </Subbatch>
    </Batch>
  </Area>
</BatchData>

4. Modifying existing characteristics
Existing characteristics can be modified by specifying the forceOverwrite attribute. The following XML example will update the Batch Cost characteristic if it already exists, or create it if it does not exist:

<BatchData xmlns="Aspentech.Batch21.BatchData">
  <Area name="Demo2">
    <Batch>
      <Designator name="Batch No">976</Designator>
      <Characteristic name="Batch Cost" instance="auto" forceOverwrite="1">
        <Value>1500</Value>
        <Timestamp isUTC="1">2002-10-07T23:01:25</Timestamp>
      </Characteristic>
    </Batch>
  </Area>
</BatchData>

The following XML will update the highest instance of the Batch Cost characteristic if one already exists, or return an error if the characteristic does not exist:

<BatchData xmlns="Aspentech.Batch21.BatchData">
  <Area name="Demo2">
    <Batch>
      <Designator name="Batch No">976</Designator>
      <Characteristic name="Batch Cost" instance="last" forceOverwrite="1">
        <Value>1500</Value>
        <Timestamp isUTC="1">2002-10-07T23:01:25</Timestamp>
      </Characteristic>
    </Batch>
  </Area>
</BatchData>

Additional examples are attached to this KB article. Keywords: sample References: None
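As an illustrative sketch (not part of the original article), the forceOverwrite payload shown above can be generated programmatically. The area, designator, and characteristic values below are the sample values from this article; the helper function name is our own.

```python
import xml.etree.ElementTree as ET

NS = "Aspentech.Batch21.BatchData"

def build_overwrite_xml(area, batch_no, char_name, value, timestamp):
    # Build a BatchData payload that updates an existing characteristic,
    # or creates it if it does not exist (instance="auto", forceOverwrite="1").
    root = ET.Element("BatchData", {"xmlns": NS})
    batch = ET.SubElement(ET.SubElement(root, "Area", {"name": area}), "Batch")
    ET.SubElement(batch, "Designator", {"name": "Batch No"}).text = batch_no
    char = ET.SubElement(batch, "Characteristic",
                         {"name": char_name, "instance": "auto",
                          "forceOverwrite": "1"})
    ET.SubElement(char, "Value").text = value
    ET.SubElement(char, "Timestamp", {"isUTC": "1"}).text = timestamp
    return ET.tostring(root, encoding="unicode")

xml_doc = build_overwrite_xml("Demo2", "976", "Batch Cost", "1500",
                              "2002-10-07T23:01:25")
```

The resulting string is what would be handed to the data source object (or, with a data source name added, to the web service).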
Problem Statement: How do I correct the following errors when using the Aspen Timezone Information Service in ADSA?
- Failed call to Connect To Source:<SourceName>: Check to make sure AtTimezoneSvc_ps.dll is registered
- Failed call to send: http://<ServerName>/Web21/ProcessData/AtProcessDataREST.dll/TimeInfo/Short
- ComError: Access is denied when configuring the Aspen Timezone comp
- Invalid Bias Response: < !DOCTYPE html PUBLIC -//W3C//DTD XHTML 1.0 Strict//EN http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd>
Solution: The way you configure the Aspen Timezone Information Service in ADSA depends on the version of Aspen InfoPlus.21 you are using and whether or not you are using the 64-bit version of Aspen InfoPlus.21.
InfoPlus.21 32-bit:
Pre-V8.0: The DCOM option requires that the Aspen Timezone Information Service be installed. AtTimezoneSvc.exe is located in: C:\Program Files (x86)\AspenTech\InfoPlus.21\db21\code
V8.0 and later: Two protocols are available: DCOM (as in previous versions) and Web Service (new). The Web Service option allows better firewall handling. The DCOM protocol requires that the Aspen Timezone Information service be installed, and the Web Service requires that the Process Data REST service be installed. AtProcessDataREST.dll is found in C:\inetpub\wwwroot\AspenTech\Web21\ProcessData.
InfoPlus.21 64-bit:
Two protocols are available: DCOM and Web Service. However, the Aspen Timezone Information service is no longer distributed (installed). The workaround is to use Web Service, which uses the Process Data REST service (AtProcessDataREST.dll is found in C:\inetpub\wwwroot\AspenTech\Web21\ProcessData).
To correct the errors:
- Make sure that ADSA \ Public Data Sources are correctly configured.
- Re-register AtTimezoneSvc_ps.dll (where the error message refers to that component).
- If the Aspen InfoPlus.21 web server is V7.3 or earlier, use DCOM in ADSA \ Aspen Timezone Information Service.
- If using the DCOM option, make sure that there are sufficient DCOM logon privileges between the Aspen InfoPlus.21 / web server and the client machines.
- If using Web Service, make sure Process Data REST is installed.
Keywords: ADSA, Aspen Timezone Information Service, DCOM, Web Service, AtProcessDataREST.dll References: None
Problem Statement: When I run a simulation in v8.8.2, the code calculations have the same format as in previous versions. How can I see the calculations in the new format?
Solution: The new report format is currently not the default. To turn it on, go to File | Options | Execution and check the Use Rich Text display for code calculations box as shown below: Keywords: code calculations, rich text format, output, equations References: None
Problem Statement: How to specify a Pneumatic pressure test for shell and tube heat exchanger?
Solution: The Aspen Shell & Tube Mechanical program allows the designer to specify a pneumatic pressure test instead of a hydrostatic test. Calculations for the required pneumatic test pressure will be done in accordance with ASME UG-100. To specify a pneumatic pressure test in Aspen Shell & Tube Mechanical, click on Program Options and go to the Test Case tab under the Change Codes / Test Case / Combined Loads page. Under the Test Case tab, the user can specify a pneumatic pressure test for the heat exchanger (please refer to the screenshot below). The test pressure calculation will be shown under Design Summary > MAWP/MDMT/Test/PWHT. The detailed test pressure results will be shown under Code Calculations > MAWP/MDMT/Test P/Static P, as shown below. Please note that the pneumatic pressure test option is available in EDR V9.1 and onwards. Keywords: Pneumatic pressure test, ASME UG-100 References: None
Problem Statement: How can someone specify the maximum stresses on the bolt, flanges and gasket?
Solution: You can enter an allowable material strength (stress) that is different from that obtained from the built-in material database. This option is available under Input | Exchanger Geometry | Body Flanges | Nubbin/Recess/Gasket. The screenshot below shows the option for modifying the allowable stresses for the bolt. The options to define the maximum flange and gasket stresses are on the adjacent rows. Note that there is an input for each flange position typically found in a shell & tube heat exchanger. Keywords: Flange Gasket Bolt Maximum References: None
Problem Statement: Aspen Data Source Architecture (ADSA) configuration includes the ability to set Permissions and Enable Security on Public Data Sources using the Permission and Advanced buttons respectively. Note, this is standard Windows user and group security - not Aspen Local Security Roles. ADSA also has the ability to Export and Import both Public and User Data Sources. However, when exporting Public Data Sources, any accompanying Permission setting is not included in the XML that would be loaded onto a new machine. How do I restore those permissions?
Solution: ADSA permissions associated with each Public Data Source are stored in the registry under: HKEY_LOCAL_MACHINE\Software\(Wow6432Node)\AspenTech\ADSA\DataSources - where (Wow6432Node) only exists on 64-bit machines such as Microsoft Windows 7 or Microsoft Windows 2008 Server. Therefore, ADSA permissions can be restored by using Registry Export on the old machine and Registry Import on the new machine. NOTES:
- A backup of the registry being modified should always be created before any editing (Import) is attempted.
- Registry editing should only be attempted by somebody familiar with the procedures and impacts.
- If exporting from a 32-bit machine and importing to a 64-bit machine, then before the import you should edit the .reg file, adding the Wow6432Node to the registry path for the 64-bit machine.
Keywords: None References: None
Problem Statement: You receive the error message unable to load registry or load public list upon running ADSA Client Config Tool and clicking on Public Data Source button. This happens while running on the ADSA Server itself:
Solution: Make sure the DCOM settings are properly set on the server. 1. Run the DCOMCNFG command and locate the Aspen Data Source Directory icon under DCOM Config. Right-click on it and select Properties. 2. Select the Security tab. Check the Customize button for Launch and Activation Permissions and click on the Edit button. 3. Make sure your user account or group has all permissions set to Allowed. Repeat this step for Access Permission and Configuration Permission. Keywords: ADSA DCOM registry public list References: None
Problem Statement: Can I get a clear definition of what DaySupplyThreshold setting does? Am I correct in assuming this will look at the average sales on a certain product and then calculate towards whatever the INI setting is at. For example, if a product sells 200 gallons a day and we have the setting at 25, it would then be allowed to deliver up to 5000 gallons per load. Or does this mean the maximum in the ground at any given time could not exceed 5000 plus the pumpstop.
Solution: This setting lets the user specify the maximum number of days supply the software is allowed to deliver to a customer. The software calculates the days supply spread after the delivery by subtracting the smallest days supply by product from the largest days supply by product, and compares it to the DaySupplyThreshold. This setting only looks at the products being delivered on that particular delivery. Here is an example, with DaySupplyThreshold=5:

                       Unleaded  Mid-grade  Premium
Current days supply        3         4         6
Projected days supply     10        12        14

The software subtracts the smallest projected days supply (Unleaded, 10) from the largest (Premium, 14) to get 4 (14-10). It then compares that spread of 4 days supply to the DaySupplyThreshold of 5. Since the spread after the delivery, 4, is within the DaySupplyThreshold of 5 days, the delivery is within the tolerance and is legal. Keywords: Day Supply Threshold INI setting References: None
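A minimal sketch of the check described above (our own illustrative code, not the Aspen Retail implementation):

```python
def day_supply_check(projected_days_supply, threshold):
    # Spread = largest projected days supply minus smallest, for the
    # products on the delivery; the delivery is legal when the spread
    # is within the DaySupplyThreshold INI setting.
    spread = max(projected_days_supply) - min(projected_days_supply)
    return spread, spread <= threshold

# Worked example from the article (projected days supply after delivery):
spread, legal = day_supply_check([10, 12, 14], threshold=5)
```

With the article's numbers, the spread is 4 days and the delivery is legal.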
Problem Statement: Does the Forc Only option in the exception processing screen ever write back to either the psoftg ini settings or the psoft ini settings so it would always appear checked on?
Solution: The Forc Only is not stored in any .ini file. Keywords: Forc Only References: None
Problem Statement: The Aspen DA for IP.21 Service contains a field called Code Page: What is this field, and what may be entered into the field?
Solution: The Code Page option allows clients to read/write strings into Aspen InfoPlus.21 using the Aspen InfoPlus.21 Administrator on a foreign-language Operating System. For example, if you have installed Aspen InfoPlus.21 on a Chinese Operating System and use the Aspen InfoPlus.21 Administrator, you can put in a description for a tag using Chinese characters. Client applications that use the Aspen InfoPlus.21 Data Server (i.e. Aspen Process Explorer, aspenONE Process Explorer, etc.) cannot read/write the description if the correct code page is not set. Once you specify the correct code page in the Aspen Data Source Architecture (ADSA) Directory Server, the client applications can successfully see the data. Once the code page has been set for an InfoPlus.21 system, it needs to remain that way. The default Code Page 0 works for the following languages: English, French, German, Italian, Portuguese, Spanish. The supported code page values for other language character sets are: 936 - Chinese; 932 - Japanese; 949 - Korean; 866 - Russian. Keywords: ADSA, code page References: None
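To illustrate what a code page mapping does, the sketch below uses Python's standard codecs as an analogy (this is not the Aspen API; the code page numbers are the ones listed above):

```python
# Windows code page numbers and the corresponding Python codec names.
CODE_PAGES = {936: "cp936",  # Chinese
              932: "cp932",  # Japanese
              949: "cp949",  # Korean
              866: "cp866"}  # Russian

def roundtrip(description, code_page):
    # Encode a tag description with the given code page and decode it
    # back; a lossless round trip means the code page can represent
    # the text, which is what the correct ADSA setting guarantees.
    codec = CODE_PAGES[code_page]
    return description.encode(codec).decode(codec)

chinese_ok = roundtrip("\u4e2d\u6587", 936) == "\u4e2d\u6587"
```

A description written under the wrong code page would fail to encode (or decode to mojibake), which is the symptom clients see when the Code Page field is not set.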
Problem Statement: How to prevent forecasts to a customer product tank that is leaking?
Solution: If a tank is going down, how can I stop the site from forecasting loads, to be sure there won't be any deliveries for this site for a particular product? The options are:
1. Holiday/storm planner - knock the sales down to zero or near zero. The problem with this is that those zero or near-zero sales are stored in the trend data used for forecasting, and there is the possibility that a delivery might occur.
2. Raise the pump stop so that the tank's available inventory is less than the smallest truck compartment. Orders will still be generated, as problem loads, in the optimizer, and hence there is a probability the order might be OK'ed by the dispatcher. This method also leaves bad (extremely low) sales data in the new trend table that will affect future forecasts.
3. Delete the tank. The best way to take a tank offline is to save the trend data (with the customer setup report) and then delete the tank.
The best option is to delete the tank and re-enter the trend data when you add the tank back in. The other options do not protect your new trend table, and deliveries might accidentally occur. Keywords: References: None
Problem Statement: The OpenBatch Interface error log files can sometimes grow very large very quickly.
Solution: To avoid logging large amounts of data to the OpenBatch Interface log files, edit the OpenBatchBCI.def file and change the debug_messages setting to OFF. Keywords: debug_messages OpenBatchBCI.def References: None
Problem Statement: When setting up a custom command in the BCU, the format string for the time-of-trigger argument is not documented.
Solution: The format is a C-language strftime-style format string. The most common format string is %d-%b-%y %H:%M:%S, which produces a string of the form 20-Apr-00 00:00:00. Keywords: BCU, custom command, format string, arguments. References: None
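Because the same strftime conventions exist in most languages, the format string can be previewed outside the BCU; a quick, illustrative check in Python:

```python
from datetime import datetime

FMT = "%d-%b-%y %H:%M:%S"  # the most common BCU format string

# Under an English/C locale this renders like "20-Apr-00 00:00:00"
# (%b, the abbreviated month name, is locale-dependent).
stamp = datetime(2000, 4, 20, 0, 0, 0).strftime(FMT)
```

This makes it easy to confirm how a custom format string will look before wiring it into a BCU custom command.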
Problem Statement: The VB Application for creating the Demo Batch area is not localized and will not accept Kanji
Solution: AspenTech ships the source code for the VB application which creates the Demo area with Batch.21. After installation it can be found in this directory (if your install drive is C): c:\Program Files\AspenTech\Batch.21\Demo\atcBatchArea The source files are included so that a programmer can adapt the code in order to create a localized VB application, if so desired. Keywords: References: None
Problem Statement: How to delete unwanted batches, unused characteristics, sub-batch levels etc., from the Batch.21 database.
Solution: Any batch data associated with deletions will have to first be archived and purged before anything can be deleted. The recommended steps are:
1. Stop, disable and delete all relevant units from the scheduling table.
2. Delete any references to the unwanted characteristics/sub-batches from the unit scripts.
3. Query the database to find all batches that reference the characteristics/sub-batches you want to delete.
4. Archive and purge the above-mentioned batch data using the Batch Range method (NOT the Time Range method) - see below for more details.
5. Proceed to delete the unwanted characteristics/sub-batches etc. in the Batch.21 Administrator.
6. Install, schedule, and execute the previously modified unit scripts.
For more detail on performing archiving and/or purging of batch data, the recommended procedure would be:
1. Go to Start | Programs | AspenTech | Aspen Manufacturing Suite | Batch.21 | Batch.21 Administrator.
2. In the left-hand pane, expand 'Batch.21', followed by 'Data Sources', followed by the required data source.
3. Now expand 'Areas' and right-click on the specific Area in question.
4. Select 'Database' and a new GUI will appear with the label Aspen Batch.21 Backup.
5. On the top left-hand corner of the GUI you will see a box labelled 'Task'. It has four choices: Archive, Backup, Purge and Restore.
6. Underneath the choice for 'Task', there is another multi-choice box labelled Criteria 'Type'. It has three choices: Time Range, Batch Range and Batch IDs.
NOTE: Our knowledge base article 123389 has an excellent viewlet video of how to perform steps 1-6 above.
Keywords: Purge Delete Archive References: None
Problem Statement: Error: Invalid startup file... when running a saved query in the Batch Query Tool. The above-mentioned error message is usually received when clicking on the saved query in the recent files list (under the File menu item) when the query was saved in a directory which was NOT currently visible in the scope pane of the BQT.
Solution: The solution to this problem is either to change the default root directory for the BQT to the directory where the query was saved, which will then clear the recent files list so the problem will no longer manifest itself, or to delete and recreate the query and save it in the default BQT directory, which is \Program Files\AspenTech\Working Folders\Batch.21\QueryTool\Data\ The default batch root directory can be set in BQT by clicking Tools | Options | General tab. The latter option is recommended: always save your queries in the default BQT directory. NOTE: The root directory which is shown in the BQT scope window must normally be above the saved query file you want to use. If you put the saved files somewhere else, sometimes the system can't figure out where the query is. This is because the scope window is automatically driven by the directory structure it is showing, and it is not possible to select something in the scope window that you cannot see. Some of the functions (including automatically running a saved query) require that the query is selected in the scope window. Just run the query by highlighting it in the scope window and pressing the ! button. Keywords: References: None
Problem Statement: At first glance, it seems necessary to grant System rights to a user account before SQLplus scripts that run under that account can successfully access Batch.21 COM objects.
Solution: It is possible to configure access to COM objects without granting the System privilege to the user. In fact, the System Command permission is only needed to execute the CreateObject call. The New call can also create objects, but only requires the security access specified in the type library reference. So, to use the Batch.21 API from SQLplus without requiring users to have System permission: In the SQLplus Query Writer, select View | References... from the menu. Select Aspen Batch.21 Application Interface. Check the box and select Read for Security. Click OK. In the SQLplus query, use the New function rather than CreateObject, e.g. New(AtBatch21ApplicationInterface.BatchDataSources) instead of: CreateObject('AspenTech.Batch21.BatchDataSources.1') The query can then be executed by users that only have SQLplus read permission. Keywords: security violation access References: None
Problem Statement: When client applications are separated from the InfoPlus.21 server by a firewall, the Aspen Production Record Manager (APRM) client applications may not work.
Solution: It is possible, using the steps in Solution 104056-2, to configure the server tasks in InfoPlus.21 to only use specific ports. There is even a TSK_BATCH21_SERVER in Version 5 and later, to which this solution applies. However, even with the port address locked, and those same ports available through the firewall, APRM clients may not connect. This is because the real client to the TSK_BATCH21_SERVER task is not the client tools on the desktop, but the Aspen Production Record Manager Service itself. When a client tool makes a request to the APRM Server, the Server service itself then becomes the client that either (a) requests information from the APRM relational database, or (b) requests information from InfoPlus.21 via the TSK_BATCH21_SERVER service. The APRM client tools use the DCOM method of connection. The architecture of this method requires that a range of ports be open on the firewall, and a range of ports be specified as available on the server. The application will then connect using this range information.
How does DCOM work? DCOM dynamically allocates one port per process. You need to decide how many ports you want to allocate to DCOM processes, which is equivalent to the number of simultaneous DCOM processes through the firewall. You must open all of the UDP and TCP ports corresponding to the port numbers you choose. You also need to open TCP/UDP 135, which is used for RPC End Point Mapping, among other things. In addition, you must tell DCOM which ports you reserved. There are two different ways to accomplish this: via configuration in DCOMCNFG, or by editing the registry. We recommend the DCOMCNFG approach, since registry edits by their nature carry more risk, and the registry instructions below apply to one particular version of Windows; the actual keys and locations can vary between different Windows versions.
1. Via DCOMCNFG: Run DCOMCNFG, open Component Services, Computers, select the server, right-click, and click Properties. Select the Default Protocols tab, select Connection-oriented TCP/IP, and click the Properties... button. Here you can specify port ranges for DCOM.
2. Editing the registry: use the HKEY_LOCAL_MACHINE\Software\Microsoft\Rpc\Internet registry key, which you will probably have to create using the Registry Editor. The following example tells DCOM to restrict its port range to 10 ports:
Named Value: Ports Type: REG_MULTI_SZ Setting: Range of ports. Can be multiple lines, such as: 3001-3010 135
Named Value: PortsInternetAvailable Type: REG_SZ Setting: Y
Named Value: UseInternetPorts Type: REG_SZ Setting: Y
The next step is to make sure that the ports specified (in this example, ports 3001-3010) are available on the firewall. Reboot the server to make all changes take effect. Now start up an APRM client tool on the other side of the firewall. It will randomly pick one of the available 3001-3010 ports to connect. If you have many Batch clients, you should open a range of ports large enough to anticipate the number of potential simultaneous connections. A firewall monitoring tool will show what port the Batch application is using. Keywords: Batch Query Tool BQT References: None
Problem Statement: Unit scheduling stops when IP.21 is stopped and does not resume when IP.21 is restarted.
Solution: Look for TSK_BCU_START in the IP.21 Manager's Defined Tasks list. If it is present, ensure that Skip during startup is not selected. If it is not in the list of defined tasks, create a new task with the following conditions: 1) Task name: TSK_BCU_START 2) Subsystem: Batch21 3) Executable: %b21%\bcustart.exe 4) All check boxes should be cleared. 5) Select ADD 6) Place the task near the latter part of the defined tasks list. The default install places it after TSK_BGCSNET. Keywords: batch tsk bcu batch.21 References: None
Problem Statement: How can IP_INPUT_TIME and IP_INPUT_QUALITY be transferred to a local InfoPlus.21 system, along with IP_INPUT_VALUE, from a remote InfoPlus.21 using Aspen CIM-IO for SETCIM/InfoPlus-X/InfoPlus.21?
Solution: IP_INPUT_VALUE, IP_INPUT_TIME and IP_INPUT_QUALITY are handled as a group. If the transfer is configured in a GET record as follows, the remote InfoPlus.21's IP_INPUT_TIME and IP_INPUT_QUALITY are written to the corresponding fields of the local InfoPlus.21:
IO_TAGNAME IO_VALUE_RECORD&FLD REMOTE_TAGNAME IP_INPUT_VALUE LOCAL_TAGNAME IP_INPUT_VALUE
Similarly, IP_VALUE, IP_VALUE_TIME and IP_VALUE_QUALITY are also handled as a group. Therefore, if the transfer is configured in a GET record as follows, IP_VALUE_TIME and IP_VALUE_QUALITY are transferred from the remote InfoPlus.21 to the local InfoPlus.21 along with IP_VALUE:
IO_TAGNAME IO_VALUE_RECORD&FLD REMOTE_TAGNAME IP_VALUE LOCAL_TAGNAME IP_INPUT_VALUE
In this case, the local tag's IP_INPUT_VALUE, IP_INPUT_TIME and IP_INPUT_QUALITY are updated. Keywords: IP_INPUT_QUALITY IP_INPUT_TIME IP_VALUE_QUALITY IP_VALUE_TIME JP- References: None
Problem Statement: How can I generate the steady state detection file called 'ssd status.txt'?
Solution: In the control panel, go to the Options tab, select the check box for SSD Print, and make sure the steady state program is running. Within a minute or two you should then be able to see the ssdstatus.txt file. Keywords: None References: None
Problem Statement: There are times when PIMS is closed unexpectedly, freezes or crashes and Excel sessions can be left open in memory. This can cause problems when subsequently trying to open Excel files from the PIMS model tree. It is best in such cases to close all Excel sessions to clear the memory.
Solution: The attached file is a .bat file that will close all Excel sessions open in memory. This gives the PIMS user an easy way to clean up Excel sessions that may be lingering after a PIMS freeze. Keep in mind that ALL Excel sessions will be closed, whether they were initiated by PIMS or not; so if any open Excel files have changes that need to be saved, do that before using this tool. Save the killit.xxx file to your hard drive (for example in the folder C:\Util). Change the file name to killit.bat. In PIMS, click on Tools | Edit User Tools Menu. This will open the User Tools dialog box. Click on Add and browse to the .bat file. Then name the utility as desired. Once completed, click OK to close the dialog box. Now when you select Tools in PIMS you will see a new menu item for this function. To execute the utility, select Tools, then Kill All Excel Instances (or whatever name you have chosen). One downside to this utility is that sometimes after running it, PIMS may have to be restarted so that the hidden Excel session can be restarted. Keywords: Excel References: None
Problem Statement: When the Batch.21 Web Reporting page is opened, the page is blank. All that is shown is an empty page with the words done at the bottom left hand corner of the page.
Solution: Sometimes, .aspx web pages will not display correctly when IIS (or another web server) is installed after the .NET Framework. Since the Batch web tools are .aspx web pages, they will appear blank if IIS was reinstalled after the Microsoft .NET components. In this case, the solution is to re-register ASP.NET using aspnet_regiis.exe. This can be done with the following steps: Open a command prompt by typing cmd in the Start | Run dialog box. At the command prompt, use the cd command to change to the directory of aspnet_regiis.exe. The location of this directory can depend on the version of the .NET Framework installed; by default, the location is C:\Windows\Microsoft.NET\Framework\<versionNumber> In that directory, type: aspnet_regiis.exe -i at the command prompt to re-register ASP.NET in IIS. Restart the Batch.21 web reporting pages, which should now display correctly. Keywords: References: None
Problem Statement: A tag is incrementing at a near-constant rate until it is reset at around 1000. How can you create a trigger to start a batch when the tag value goes above 50? A problem occurs when the data compression and near-constant rate result in IP.21 only recording the first and last points (0 and 1000). Batch.21 consistently records the start time as that of the first point outside of the data compression limits after 50 (which is usually 1000). Setting Extrapolate to Current Time to ON and OFF has no effect on tracking the value. Is there a way to accurately trigger on 50 without eliminating data compression?
Solution: While the data is flowing in, the IP_INPUT_VALUE field of the IP.21 tag will see the values that come in every 5 minutes. If the BCU is running DURING this period, it will be able to see these data points (see the example below), and will trigger at the first time point at which the value actually exceeded 50 (at 14:15 in this example). However, these values are getting compressed and not going into history, so if the BCU is NOT running during this period, it will never be able to see these interim data points, only the ones finally laid down in history. The BCU does NOT attempt to retroactively linearly interpolate to figure out where the conditions MIGHT have been met. For example, with a condition of > 50, interpolating the following history:
12/30/02 14:00 0
12/30/02 14:45 227
would imply that the conditions were met at 14:09:54.714. But operating off of the real data would have given you a timestamp of 14:10:00. Nobody likes discrepancies, especially in regulated environments. For example, on two successive runs of the BCU, it might see these data points coming back from IP.21:
run 1, at 14:07:
12/30/02 14:00 0 <-- last point laid down in history
12/30/02 14:05 25 <-- exists only in IP_INPUT_VALUE / IP_INPUT_TIME
run 2, at 14:12:
12/30/02 14:00 0 <-- still last point laid down in history
12/30/02 14:10 50 <-- note that the 14:05 data point has been compressed out
Keywords: trigger batch.21 infoplus.21 References: None
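The interpolation arithmetic in this article can be reproduced with a short, illustrative calculation (our own sketch; as noted above, the BCU itself does not interpolate):

```python
from datetime import datetime

def crossing_time(t0, v0, t1, v1, limit):
    # Linear interpolation: the time at which the value would have
    # crossed `limit`, given only the two points laid down in history.
    frac = (limit - v0) / (v1 - v0)
    return t0 + (t1 - t0) * frac

# The two points that survive compression in the article's example.
interpolated = crossing_time(datetime(2002, 12, 30, 14, 0), 0.0,
                             datetime(2002, 12, 30, 14, 45), 227.0, 50.0)
# The live (uncompressed) data would have triggered at 14:10:00 instead.
discrepancy = (datetime(2002, 12, 30, 14, 10) - interpolated).total_seconds()
```

The interpolated crossing lands at 14:09:54.714, about 5.3 seconds before the timestamp obtained from the real 5-minute data, which is exactly the discrepancy the article warns about.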
Problem Statement: When configuring Batch.21 in the ADSA utility, you now have the option to use two different connection methods to the relational database: OLE DB and ODBC Data Source. What is the difference between OLE DB and ODBC Data Source, and when would you use each? Does this mean that each Batch.21 user would need to be listed as a user in the relational database in order to be authorized to access the information?
Solution: In version 5.0, the method Batch.21 uses to connect to the relational database was enhanced. A Server Connection String method was added in addition to ODBC to allow customers more flexibility. The main concern was security. A secure connection via ODBC requires that a username and password be embedded in the connection string. This is a potential security risk, since the username and password are passed every time a connection between the Batch Server and the relational database is established. Also, the username and password are contained by necessity in the ADSA configuration, which is yet another point where the username/password information is vulnerable. A Server Connection String method can take advantage of Integrated Windows Authentication. This means password information does not have to be handled directly in Aspen software. Instead, when a request is made to the relational database, all verification of the user making the request, and their privileges, are determined via Integrated Windows Authentication. The connection method choices do not add any features to the product; the method being used will be transparent to the end user. Aspen also does not recommend one method over the other for performance. Also, each Batch.21 user does not need to be listed as a user in the relational database. The only account that needs to have access to the AspenBatch db is the one specified when setting up either the OLE connection string or the ODBC data source. When you create the Batch.21 database through the Batch.21 Repository setup wizard, it will automatically create a dbo account called AspenBatch21. You can use this account when creating the ODBC data source/connection string. Keywords: OLE DB ODBC Data Source Batch.21 InfoPlus.21 References: None
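As an illustration, a Server Connection String that takes advantage of Integrated Windows Authentication against SQL Server might look like the following. The server name is hypothetical, and the provider name should be checked against your own environment:

```
Provider=SQLOLEDB;Data Source=MYSERVER;Initial Catalog=AspenBatch;Integrated Security=SSPI;
```

Because `Integrated Security=SSPI` is used, no username or password appears in the string, which is the security advantage described above.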
Problem Statement: The following error is returned when inserting a tagname from the Aspen Batch.21 administrator or attempting to create a new batch: B21BSC-50002 Relational Database error. -2146825023: batch21_sp_insert_user
Solution: Check that ADSA | Public Data sources | Data source in question | Aspen Batch.21 Service | Setup | Provider= is pointing to the appropriate OLE DB provider. In cases where the Oracle driver is used, make sure (for v5.0.1 and higher systems) to use at least Oracle client 8.1.7.3. Prior versions of the 8.1.7 series have an Oracle ODBC driver that will deadlock when talking to the Oracle database. Effectively, the Aspen Batch.21 service would then lock up, requiring either a reboot or a restart of the service. We advise getting Patchset 3 for 8.1.7.0 (giving you, effectively, 8.1.7.3). Keywords: Oracle Drivers Batch.21 Administrator References: None
Problem Statement: In the Validation.lst report, the Miscellaneous section reports the model input tables designated in table PRNTABS. The purpose of this report is to see the data as Aspen PIMS sees it, which helps to track errors in the input tables (e.g. hidden columns that designate the end of the table, broken links to other sheets, etc.). In this table you enter the Aspen PIMS names of the tables as rows; for example, tables BOUNDS, PGUESS, PDIST and PCALC will be reported this way. In the case of the Assay tables, there are two options for reporting, as described below. For information on how to use table PRNTABS to report static properties from table BLNPROP, please refer to Solution 126033.
Solution: To report table Assay information, there are two options: 1. Report only the assays used in each crude unit. 2. Report each complete Assay table. For the first option, use the table name ZASSXYZ, where XYZ is the name of the logical crude unit, e.g. ZASSCD1 for crude unit CD1. For the second option, use the table name under the ASSAYS branch, e.g. table ASSAY2. In the Validation report, table ZASSCD1 shows only the crudes that go to unit CD1, while table ASSAY2 shows all crudes from that Assay table. Keywords: Table PRNTABS Validation report Model Validation References: None
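As a sketch, the two reporting options might be requested together in table PRNTABS as below. The layout is illustrative only (row names in the first column, following the usual PIMS table convention); the table names ZASSCD1 and ASSAY2 are the ones used in the example above:

```
*TABLE     PRNTABS
*          TEXT
ZASSCD1    Assays used in crude unit CD1 only
ASSAY2     Complete assay table ASSAY2
```

Each row name is echoed in the Miscellaneous section of Validation.lst with the table contents as PIMS read them.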
Problem Statement: The error "Get failed to verify error 80010105" is returned when trying to verify a unit in the BCU Administrator.
Solution: The most likely scenario leading to this error is that you changed the names of several phases and characteristics in the Batch Administrator for a unit running in the BCU, and then missed one or more of them when updating that unit's script in the BCU Administrator with the new names. The result is an old characteristic name in the BCU unit script that does not match the updated name in the Batch Administrator. To correct the problem, verify that all of the phase and characteristic names defined in the Batch Administrator for a particular unit match the names of the phases and characteristics defined in your unit script in the BCU Administrator. Keywords: References: None
Problem Statement: How can the IP_INPUT_TIME and IP_INPUT_QUALITY fields of a remote tag be read, in addition to IP_INPUT_VALUE, when using Aspen CIM-IO for SetCim/InfoPlus-X/InfoPlus.21?
Solution: IP_INPUT_VALUE, IP_INPUT_TIME and IP_INPUT_QUALITY are stored in the same structure. An occurrence in the Get record configured like this reads the IP_INPUT_TIME and IP_INPUT_QUALITY fields of the remote tag and stores them into the same fields of the local tag:

IO_TAGNAME                        IO_VALUE_RECORD&FLD
REMOTE_TAGNAME IP_INPUT_VALUE     LOCAL_TAGNAME IP_INPUT_VALUE

Likewise, IP_VALUE, IP_VALUE_TIME and IP_VALUE_QUALITY are stored in another structure. An occurrence in the Get record configured like this reads the IP_VALUE_TIME and IP_VALUE_QUALITY fields of the remote tag and stores them into the IP_INPUT_TIME and IP_INPUT_QUALITY fields of the local tag:

IO_TAGNAME                        IO_VALUE_RECORD&FLD
REMOTE_TAGNAME IP_VALUE           LOCAL_TAGNAME IP_INPUT_VALUE

Finally, using this occurrence entry will not change the local IP_INPUT_VALUE, IP_INPUT_TIME and IP_INPUT_QUALITY fields, but directly updates the local IP_VALUE and IP_VALUE_TIME fields:

IO_TAGNAME                        IO_VALUE_RECORD&FLD
REMOTE_TAGNAME IP_VALUE           LOCAL_TAGNAME IP_VALUE

Keywords: IP_INPUT_QUALITY IP_INPUT_TIME IP_VALUE_QUALITY IP_VALUE_TIME References: None
Problem Statement: In the base model you set up a structure in table ROWS that accesses existing model variables (e.g. SCD1ARL, the ARL crude consumption in crude unit SCD1, or BRFTURG, the RFT component going to blend URG), for example to limit the total amount. What happens if you disable the model structure that generates the variables used in the table ROWS structure?
Solution: If you disable the corresponding variables in table CASE (for example, by disabling the purchase of this crude), the structure in table ROWS will not be affected (it will not be disabled). The variable SCD1ARL will now be treated as a user-created variable, and since it is not restricted by any other equations, it is just a dummy variable that is not constraining. To avoid potential problems and unintended results, you need to enter an EMPTY coefficient under the disabled column in the table ROWS structure. You will not get warnings related to this problem. The attached model illustrates the problem: In table ROWS, we introduce a constraint for the crude consumption in SCD1: the amount of ARL consumed must equal ANS + NSF. In table CASE we DISABLE crude ARL. The variable SCD1ARL, which previously was created by PIMS because of the crude lineup to this unit, is now treated as a user-created variable: it is not constrained by any other equation, and can take any value it needs to satisfy the equation. However, the structure created by table ROWS, and the matrix generated from it, look the same as before. To prevent undesired outcomes, you must modify table ROWS in table CASE by either entering the keyword EMPTY under the disabled variable or by using the keyword REPLACE or REPLACEALL to replace the entire table ROWS. Keywords: DISABLE EMPTY REPLACE REPLACEALL Disabling ROWS References: None
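As a hypothetical sketch of the EMPTY fix in a case table (the row name ECRDBAL, the coefficients, and the exact column layout are illustrative, not taken from the attached model), the disabled variable's coefficient is blanked so the constraint no longer references SCD1ARL:

```
*TABLE    ROWS
*         TEXT                     SCD1ARL   SCD1ANS   SCD1NSF
ECRDBAL   ARL = ANS + NSF          EMPTY     -1        -1
```

Alternatively, the whole table can be swapped out in the case with REPLACE or REPLACEALL, as described above.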
Problem Statement: When a characteristic that contains a single quote is created with a custom VB application, it is entered into the database as two single quotes. (e.g. Cuve d'alimentation becomes Cuve d''alimentation)
Solution: A fix is expected in v7.0.1. Until then, enter the characteristic manually in the Batch.21 Administrator or edit the erroneous entries. If regional settings allow, another option is to use ´ (acute accent, ascii 180) or ’ (right single quote, ascii 146) in place of ' (quote, ascii 39). Keywords: References: None
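The doubling comes from SQL-style quote escaping being applied to a value whose escaped form is then stored verbatim. A minimal sketch of the mechanism (not the actual Batch.21 code):

```python
def sql_escape(s):
    # Standard SQL literal escaping: each single quote becomes two.
    return s.replace("'", "''")

name = "Cuve d'alimentation"
literal = sql_escape(name)   # correct only inside a SQL string literal
print(literal)               # Cuve d''alimentation
# If this escaped form is written into the column as-is (instead of the
# original value), the doubled quote ends up in the data, which is the
# symptom described above.
```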
Problem Statement: When using the Batch Application Interface to insert batch data into 21CFR11 compliant systems, every characteristic created will cause a creation event message to be added to Alarm and Event. This is problematic, because it fills up the Alarm and Event database with messages that are not necessary. Typically, you only want an A&E message to be logged when one of these characteristic values are modified, but not when the value is initially recorded. How can creation event messages be bypassed when they are generated by a custom application created with the Application Interface tools?
Solution: There are two steps to bypassing creation event messages:

1. Grant the user account running the application the Batch.21 Administer permission in AFW or Local Security.
2. Programmatically set the API BatchDataSource Source property to atSourceCustomApplication in your application.

The first step is accomplished by adding the user account running the custom application to a Role in the AFW Security Manager that is granted the Batch.21 Administer right. This right is granted by opening the Properties window for the Batch.21 Application in the AFW Security Manager, selecting the role from the drop-down list, and checking the Administer box.

The second step requires that you modify the Source property of the BatchDataSource object in the application code. The Help file lists four possible settings for the Source value:

atSourceBatchConnect -- The source of the value is Batch Connect.
atSourceBCU -- The source of the value is the BCU.
atSourceManual -- The source of the value is a manual entry.
atSourceUnknown -- The source of the value is unknown; the data was created when the system was noncompliant.

There are actually eight possible settings. The additional options are atSourceSystem = 4, atSourceServices = 5, atSourceEBRS = 6, and atSourceCustomApplication = 7. Set the BatchDataSource Source attribute to 7 so that Batch recognizes the source as a custom application and creation event messages are not logged. Keywords: References: None
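For quick reference, the Source enumeration can be tabulated as below. The values 4 through 7 are stated in the text; the mapping of the first four names to 0 through 3 is an assumption based on the order in which the Help file lists them:

```python
# atSource* enumeration for the BatchDataSource Source property.
# Values 4-7 are stated in the solution text; 0-3 are assumed.
AT_SOURCE = {
    "atSourceBatchConnect": 0,       # assumed value
    "atSourceBCU": 1,                # assumed value
    "atSourceManual": 2,             # assumed value
    "atSourceUnknown": 3,            # assumed value
    "atSourceSystem": 4,
    "atSourceServices": 5,
    "atSourceEBRS": 6,
    "atSourceCustomApplication": 7,
}

# Set Source to this value so creation events from the custom
# application are not logged:
print(AT_SOURCE["atSourceCustomApplication"])  # 7
```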