Problem Statement: The SQLplus Help file does not make any comment on using nested exception handlers for execution code flow. Nested exception blocks are a supported construct, and an example of their usage is shown below. Run the example code to see that the execution path leads from the inner exception block straight to the outer exception block by re-raising a custom error.
Solution: SQLplus code example:

LOCAL MyErrorCode string;
MyErrorCode = 'MyErrorCode';
WRITE('Starting!!');
BEGIN
  write('outer block');
  BEGIN
    write('inner block');
    ERROR MyErrorCode, 'error originally raised in inner block';
  EXCEPTION
    write('inner exception block: ('||ERROR_CODE||') - '||ERROR_TEXT);
    IF ERROR_CODE <> MyErrorCode THEN
      --This unexpected real error should be raised
      ERROR ERROR_CODE, ERROR_TEXT;
    ELSE
      --let's re-raise the error to show execution path
      --rerun this and see what happens if you comment out next line...
      ERROR ERROR_CODE, 're-raised '||ERROR_TEXT;
    END;
  END;
  --if we re-raised the error in the exception block above,
  --we won't execute the remainder of this outer block's code
  write(' #outer block continues');
  write(' #these lines would not have been run if error was re-raised');
EXCEPTION
  WRITE('outer block exception: ('||ERROR_CODE||') - '||ERROR_TEXT);
  IF ERROR_CODE <> MyErrorCode THEN
    --This unexpected real error should be raised
    ERROR ERROR_CODE, ERROR_TEXT;
  END;
END;
WRITE('Finished!!');

Keywords: None References: None
Problem Statement: When opening the Aspen Weigh and Dispense tool you get the message that the user has insufficient permission.
Solution: The Weigh and Dispense tool, configured through the Aspen Weigh and Dispense Management website, uses the Roles created and configured in the Aspen Framework (AFW). There are several steps that you will need to take to ensure that a user has access.
1. The account in Active Directory must have the full name and follow AFW guidelines for using accounts (i.e., no nested accounts).
2. The AFW Role must be configured to use the Active Directory account.
3. In most cases you will need to configure the user and workstation in Manufacturing Operations/Control (MOC). Typical installations will have MOC locked to user and workstation (machine name), although in some situations, such as a testing / development system, you may use local accounts.
4. Open the Weigh and Dispense Administration page and log on with an Administrator account: http://localhost/WeighDispense Note: localhost stands for your server or node name. Note: The URL is case sensitive.
5. Click on the Administration tab and select Permissions from the left menu.
6. Select the Role. If this is the first time configuring, make sure you select the Administrator account and give it full rights.
7. Select the access you want by putting a tick into the correct checkboxes and click the Apply button.
Keywords: None References: None
Problem Statement: What is the difference between Paste and Paste as in Web Explorer?
Solution: In Web Explorer, Paste and Paste as Keywords: Right-click menu, paste, by Folder, view References: None
Problem Statement: The online-help of Aspen Custom Modeler provides an example for the use of the Results.CopyValues method. The command shown in the example has the same function as the Advanced Copy dialog available from the Snapshot Management dialog box. That means it requires the specification of a flowsheet object (e.g., a block ID), and a variable name. How can one copy all variable values of the simulation at once? In other words, what is the automation equivalent to Tools | Snapshots... | Copy Values?
Solution: The required syntax to copy all variable values from a snapshot is: Simulation.Results.CopyValues aResult, False, True, True, False, True, True, , ~ Keywords: Automation Results Snapshot Copy References: None
Problem Statement: What is a project conflict and how do I resolve it?
Solution: It is possible for the master workspace to change after a project has been created from it. Before a project can be committed, any conflicts with its master must be resolved. If the master has changed, when the Project Management application is accessed in a project, it indicates that the parent has changed. Clicking the text shows the Update Report, indicating where the changes have occurred. The user can then select whether or not to update the project with the changes. Note that a project CANNOT be committed back to its master until all conflicts have been resolved. Keywords: Parent project, sub project, project update References: None
Problem Statement: What is the difference between a linear, explicit, nonlinear and torn group in the block decomposition?
Solution: Decomposition information can be accessed in the All Items pane of the Simulation Explorer --> Diagnostics. This diagnostic information only becomes available after a simulation is run; if there are structural errors in the simulation, this information is not available. The group types are:
- Linear: a group consisting of a system of linear equations
- Explicit: a linear group consisting of one single equation
- Non-linear: a group involving non-linear equations
- Torn: a block that can contain any of the above types (used with procedures)
To find out the block types and variables involved, go to the Simulation Explorer --> Diagnostics --> click on Initialization decomposition or Dynamic decomposition. To view information on the type of blocks in your simulation, right-click in the Contents pane of the Simulation Explorer and choose Details. The type of group is useful for understanding some convergence problems. Essentially, explicit groups can only fail if the value is outside the bounds of the variable. Linear groups will fail if the system is singular, and they may cause trouble when the system is nearly singular (ill-conditioned). Numerical issues with the linear solver are quite rare; some problems have a special mathematical structure which makes them solve better with MA38. The vast majority of convergence problems are due to non-linear groups. It is useful to note that for non-linear groups, single-equation groups are solved using a different non-linear solver than multiple-equation groups. Finally, tearing of procedures is done very rarely, but it may help to dramatically improve the calculation speed for dynamic simulations. It is also important to understand that the decomposition depends on the way the simulation is specified (fixed/free variables). One technique is to try to figure out how to get many small groups, so that hopefully convergence will be obtained, and then modify the equations or the specifications to reach the desired configuration.
As free variables will be closer to the solution, it is more likely convergence will be obtained. It is also easier to troubleshoot convergence problems on small groups. Keywords: block, decomposition, linear, explicit, nonlinear, torn, run, diagnostics References: None
Problem Statement: This knowledge base article describes the Good Only and Suspect Only sampling methods available on the Aspen Process Explorer (APEx) Tag Properties Sampling tab. Note, prior to APEx V10 the help file did not include a description of these methods of sampling.
Solution: Good Only - Number of good data points in each time interval (only good, no suspect samples). Suspect Only - Number of suspect data points in each time interval (not good and not bad samples). Keywords: missing Suspect samples Bad samples References: None
Problem Statement: Experiments for hydrogenation of furfural to furfuryl-alcohol (FA) have been performed in a batch reactor. The composition of FA has been measured at different times. The objective is to use those data to estimate the kinetic parameters. Composition data: reactor was operated at 3 different temperatures for each experiment Flowsheet: feed is a batch charge. H2 is fed continuously.
Solution: The complete example is attached. The attachment also contains step-by-step instructions to set up the estimation case from a starter file. Keywords: workshop, estimation, kinetics References: None
Problem Statement: What is the recommended method for Water Dew Point Calculation for a Dry Gas Stream in Aspen HYSYS?
Solution: Dry gas water content and water dew point are two key product specifications for the dehydration process. Excess water can cause corrosion, pipeline degradation, or hydrate plugs. Aspen HYSYS provides a method to predict the water dew point on the material stream. However, the water dew point calculated for a dry gas stream can be adversely impacted by the presence of trace amounts of glycols, amines, or methanol. ASTM D1142 is the standard test method for determining the dew point temperature for water in a gas stream. The dew point temperature is the temperature, at a given pressure, at which liquid water first appears from the sample gas. However, the presence of glycols, amines, or methanol will interfere with the water dew point calculation. Any trace of polar compounds that have a strong affinity with water will alter the phase equilibria significantly and cause a liquid to form at a different temperature than if that liquid were just water. Thus, when determining the water dew point in Aspen HYSYS, the best practice is to remove these species from the stream using the Aspen HYSYS Component Splitter before evaluating the water dew point. This aligns with the industry practice of demisting these species to achieve a more accurate measurement. Keywords: Dehydration, Water Dew Point, Dew Point, Dry Gas References: None
Problem Statement: An adiabatic plug flow reactor has been used to study the conversion of furfural to furan. The temperature profile along the length has been measured. The objective is to use this information to estimate the kinetic parameters (pre-exponential factor and activation energy). The green dots show the experimental temperature. The blue line shows the calculated temperature with the current parameters. The x axis is the location along the length of the reactor.
Solution: The attachment contains the solution file. We also provide step-by-step instructions to set up the estimation case from the starter files. The plot below shows the results after the estimation. Keywords: estimation, PFR, PDE, activation energy References: None
Problem Statement: Why do non condensable gases reduce heat transfer capacity?
Solution: Steam releases its latent energy (change of state) to condensate on the heat transfer surface, which is where heat is being transferred due to the temperature difference (steam to the process). As the steam gives up its latent energy, it condenses to a liquid (condensate); the condensate is drained away by gravity, but the non-condensable gases and air remain. The non-condensable gases form a stagnant film on the walls of the heat transfer surface, which creates a resistance. Heat energy transmitting through the heat transfer surface has to pass by conduction through these films of resistance. A film of air or non-condensable gases that is only one thousandth of an inch thick has the resistance of a three-inch wall of iron. Keywords: Non-condensable gases, heat transfer efficiency, heat transfer coefficient References: None
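The air-film claim above can be sanity-checked with the plane-wall conduction formula R = L/k. This is a rough sketch only; the thermal conductivity values are assumed handbook figures, not numbers from the article:

```python
# Thermal resistance of a plane film (per unit area) is R = L / k.
k_air  = 0.026   # W/(m*K), still air near room temperature (assumed handbook value)
k_iron = 80.0    # W/(m*K), pure iron (assumed handbook value)

L_air = 0.001 * 0.0254          # one thousandth of an inch, in metres
R_air = L_air / k_air           # conduction resistance of the gas film

# Iron wall thickness with the same conduction resistance:
L_iron_equiv = R_air * k_iron   # metres
inches = L_iron_equiv / 0.0254  # roughly 3 inches, consistent with the article
```

With these assumed conductivities the equivalent iron wall comes out at about three inches, in line with the article's rule of thumb.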
Problem Statement: In Aspen Plus V9, for the interactive Binary Analysis there is a new option for pseudo-binary system. What does this calculation do?
Solution: In older versions of Aspen Plus, you had to create a Sensitivity analysis to perform this simple calculation. Let's say we have 3 components, A, B and E, where E is the entrainer component. For a simple TXY calculation, we vary the mole fraction xA from 0 to 1, with xB = 1 - xA. The bubble temperature Tb, as well as the bubble vapor phase composition yA, is calculated for the mixture, and we plot Tb vs xA as the bubble curve and Tb vs yA as the dew point curve. For the pseudo-binary calculation, we set the entrainer mole fraction xE to the specified value and vary the mole fraction of component A as xA = x * (1-xE), xB = (1-x) * (1-xE). We calculate the bubble temperature Tb as well as the bubble vapor phase composition (yA, yB, yE) for the mixture, and plot Tb vs x as the bubble curve and Tb vs yA/(yA+yB) as the dew point curve. The attached simulation for Aspen Plus V9 illustrates this calculation with a sensitivity analysis. The calculator block C-1 is used to set the feed composition of a Flash2 block, which calculates the bubble temperature and the vapor phase composition. The Sensitivity analysis varies the composition (component A is water, component B is methanol and component E is ethanol). The plot was created with BTEMP on the x axis and X and YWATER/(YWATER+YMEOH) on the y axis, and then the axes were flipped. Keywords: interactive, binary, property, analysis References: None
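The composition mapping described above can be sketched in a few lines. The function names are ours for illustration; the bubble-point calculation itself still requires a property package, so only the composition transform is shown:

```python
def pseudo_binary_feed(x, xE):
    """Map the pseudo-binary coordinate x (0..1) and the fixed
    entrainer mole fraction xE to true mole fractions (xA, xB, xE),
    per xA = x*(1-xE), xB = (1-x)*(1-xE)."""
    xA = x * (1.0 - xE)
    xB = (1.0 - x) * (1.0 - xE)
    return xA, xB, xE

def pseudo_binary_vapor(yA, yB):
    """Project the calculated vapor composition back onto the
    entrainer-free (pseudo-binary) basis: yA / (yA + yB)."""
    return yA / (yA + yB)

# Example: 20 mol% entrainer, pseudo-binary coordinate x = 0.25
xA, xB, xE = pseudo_binary_feed(0.25, 0.20)
# The true mole fractions (xA, xB, xE) always sum to 1.
```

In the attached V9 simulation, this transform is what the calculator block C-1 performs before feeding the Flash2 block.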
Problem Statement: The business proposition for simulating the dehydration process is to ensure that process objectives can be met while optimizing operating costs and/or CAPEX:
- Optimize dry gas water content to meet product spec
- Optimize required glycol rates to meet product spec at minimum OPEX
- Quantitatively evaluate different configurations, glycols, and process conditions in design
- Make confident decisions in troubleshooting activities
- Understand process impact on regulated benzene emissions
- Predict hydrate formation temperatures to reduce plugging risk
Summary: This application example utilizes the Aspen HYSYS CPA Package functionality to rigorously model a TEG dehydration system for natural gas, with the goal of optimizing operating costs to meet key sales gas product specifications and column performance metrics. In this example, you will learn how to:
- Solve a common dehydration problem through this application example: evaluate various operational decisions that can be made to meet specs
- Leverage the power of Aspen HYSYS integrated simulation models: utilize the hydrate formation utility and Column Analysis functionality
- Use Aspen HYSYS CPA Package features: tuned kinetic parameters, accurate prediction of key properties
- Run a Case Study: quickly evaluate the effect of different operating conditions
Solution: Users will use the Glycol Dehydration Feature-Set in Aspen HYSYS to make common operations and design decisions. Features in the HYSYS CPA Package include:
- Familiar HYSYS Environment: optimize the entire gas plant in HYSYS and predict key dehydration targets and emissions
- CPA (Cubic-Plus-Association) Package: based on the Soave-Redlich-Kwong (SRK) cubic EOS, with fitted interaction parameters for common components; accurately represents phase behavior of components
- Hydrate Formation Utility: predict the formation temperature of hydrates
- Accurate Prediction: the equation contains terms for association effects (e.g., H-bonding) and was developed to be capable of describing complex (VLLE) applications; supports TEG, DEG, and MEG dehydration
- Layered functionality, such as Column Analysis
Users will use these features to optimize operations to:
- Achieve > 98.4% lean TEG in the regenerator bottoms (operational best practice)
- Achieve a water dew point of 5 F in the dry gas (product specification)
- Ensure no hydrate formation in the process
- Minimize the amount of purged lean TEG as much as possible (OPEX savings)
- Evaluate whether new absorber column internals to be installed at the next turnaround are limiting for a max rating case
At the end of the example, users will have:
- Run a case study to determine operational impacts on product dew point
- Solved a common dehydration problem: increased glycol circulation and reboiler temperature to meet specs
- Identified hydraulic limitations of the absorber: determined the upper limit of feed rate until flooding occurs
- Confirmed that no hydrates will form in the system, by using the Hydrate Formation utility and conditional formatting
Keywords: Column Hydraulics, Column Analysis, gas plant, gas processing, hydraulics, flooding, weeping, base case, max rating, turndown, example, reboiler duty, solvent, packing, trays, column, sales gas, sweet gas, TEG, dehydration, glycol, hydrates, hydrate formation, CPA, dew point, lean, rich References: None
Problem Statement: After installing Aspen Production Execution Manager for the first time you will have to configure the server to connect to the database. This article describes how this is achieved.
Solution: The AeBRSInstaller must be run if the server has been migrated or installed for the first time. The user running the installer MUST have full domain or full administrator rights, otherwise the configuration may fail or become corrupted. Note: Run the Command Prompt as an Administrator by right-clicking it and selecting 'Run as Administrator'. Do this even if logged in as an administrator, to avoid any security policies that may block such actions.
1. Navigate to the Production Execution Manager folder using a Command Prompt. A typical path would be: C:\Program Files\AspenTech\AeBRS Note: Use cd and cd\ to navigate the folders in Command Prompt.
2. Run this command: AeBRS AeBRSInstaller This launches the first Aspen Production Execution Manager Configuration dialog box. The command is case sensitive, so make sure you enter it correctly.
3. Type or paste the following information to configure the database:
o Select either MSSQL or Oracle to use as the Production Execution Manager database. Note: The database option is only enabled during the first installation procedure; it is disabled during the upgrade procedure.
o Account name of the Production Execution Manager database owner in the Account name of the Aspen Production Execution Manager database owner box
o Database password in the Password and Enter password again boxes Note: Alternatively, select the Allow blank password check box to override those two password boxes.
o Database host name in the Database host name box
o SQL server database name/Oracle SID in the SQL server database name/Oracle SID box Note: If you are using an Oracle database, the name of the database is in reality the name of the Oracle SID.
o Database port in the Database port box
4. Type or paste the following information to configure the Apache Tomcat server:
o Apache host name in the Apache host name box
o Apache host port in the Apache host port box
5. Click OK. The Update Configuration dialog box appears.
6. Type or paste the installation/system UID in the UID used to uniquely identify Templates exported from this system box.
7. Set the company name and logo icon that are included in reports:
o the company name
o the company logo icon
o click Change logo… to change the company logo icon; otherwise the default “AspenTechnology” logo is used. Note: You can also change the company name and logo icon for reports by changing COMPANY_NAME and LOGO_NAME in the path.m2r_cfg file, then double-clicking the codify_all.cmd file to run it. The two files are normally located at C:\Program Files\AspenTech\AeBRS\cfg_source.
8. Determine whether you want to migrate system-resources used as RUDOs. If yes, select the Migrate system-resources used as RUDOs (very time consuming) check box.
9. Determine whether you want to recompile designs during the upgrade procedure. If yes, select Compile designs. If unchecked, you can run CompileProc later to compile the designs. Note: The Compile designs option is disabled during the first installation procedure; it is only enabled during the upgrade procedure.
10. Determine whether you want to customize which designs to recompile. If yes, click Compilation options.
11. Determine whether you want to import the General Manufacturing Library while updating. If yes, select Import GML template.
12. Click OK. The Production Execution Manager Configuration message box displays.
13. Click OK at the Configuration process has completed successfully message box that appears when the database is configured.
14. Click Close. A Close the application message box displays.
The installer will need to be run again if you change the database name. Keywords: None References: None
Problem Statement: Is there a way in ACCE to zero out the substation cost? We need to have at least the main substation in the Power Distribution section. So, how do I exclude the substation cost for a retro-fit project?
Solution: The cost of the substation can be changed by accessing the options for the substation. The steps are:
1. Go to Power Distribution
2. Right-click on the substation to access the options
3. Set the required capacity to 0
Keywords: Substation, Power Distribution References: None
Problem Statement: Opening a trend plot gets the following error message: Could not create grid control VSFlex7L.ocx.
Solution: The error message indicates that the VSFlex7L.ocx control is probably not registered. Register it by following the steps below:
1. Search for VSFlex7L.ocx (Start | Find | Files or Folders...); it should be located in the %SystemRoot%\system32 directory
2. Right-click on it and select Open With... from the context menu
3. Click Other... and navigate to the %SystemRoot%\system32 directory
4. Find regsvr32.exe and click on it to select it
5. Click Open and then OK; your control should now be registered
Alternatively, run regsvr32 %SystemRoot%\system32\VSFlex7L.ocx from an elevated Command Prompt. If you are unable to register VSFlex7L.ocx because it cannot be found, copy it from another system and paste it into the %SystemRoot%\system32 directory before registering. In some cases this file was reported missing after installing Microsoft updates. Keywords: VSFlex7L.ocx VSFlex8l.ocx Process Explorer InfoPlus.21 Microsoft References: None
Problem Statement: It is very useful to package certain calculations within submodels for modularity, easier maintenance and re-usability of code. When modelling partial differential equation systems, distributed variables may need to be used in submodels, and they need to be defined in as general a way as possible so they can be easily handled by the calling model.
Solution: We describe below the procedure to implement a submodel using distributed variables and include an example to illustrate its use. The example is a modified version of the Heated Slab example included in the Aspen Custom Modeler installation. It was modified to contain a temperature-dependent thermal conductivity term instead of a constant diffusion coefficient. The thermal conductivity calculation is implemented in a submodel called Conductivity, which is then instanced in the MetalPlate model and used in the heat transfer equations. The example is illustrative only; the data for heat capacity, density and thermal conductivity was actually obtained from Aspen Plus for water at 25 C. The steps required are summarised as follows:
1 - Create the problem as usual in ACM, defining all required types.
2 - Create a submodel to calculate the thermal conductivity. As thermal conductivity is a temperature-dependent function and temperature is a distributed variable, we also want to calculate the conductivity in every node.
3 - Instance your submodel and use the variable calculated by it in another model.

Implementation of step 2
We have two domains defined in the main model, so the submodel should also have two domains defined. However, the submodel domains should be defined as generally as possible and then mapped to the right domain when the submodel is instanced. Hence we create two external domains in a submodel called Conductivity:

Model conductivity
  XDomain as external Domain;
  YDomain as external Domain;

The variables required in the Conductivity model are temperature, T, and thermal conductivity, K. For simplicity we also define T as external, which means we do not need to map it when instancing the model and the variable T in the main model will be used directly.

  T as external Distribution2D;

As the temperature is distributed over two domains, we also need to create a Distribution2D for the thermal conductivity, with highest-order derivatives of zero since there is no derivative term in the conductivity expression.

  K as Distribution2D(value:1, HighestOrderXDerivative:0, HighestOrderYDerivative:0);

Now we can write the equation for K as a function of T. Using FOR loops makes the code much easier to read, so the expression for conductivity is written as below.

  for iX in [0:XDomain.endNode] do
    for iY in [0:YDomain.endNode] do
      conductivity: K(iX,iY) = 0.487653 + 1.487E-3*T(iX,iY) - 5.6346E-6*T(iX,iY)^2 + 0.16E-8*T(iX,iY)^3;
    endfor
  endfor

When finished typing it in, compile the model and correct any errors.

Implementation of step 3
Now we can instance the submodel in the main model, so we can use K in the equations. We will concentrate on the use of the submodel only in this example. For more information on the original Heated Slab example please consult the Examples PDF manual. In the main model, MetalPlate, we have two domains defined:

  X as LengthDomain(DiscretizationMethod:BFD1, HighestOrderDerivative:2, Length:1, SpacingPreference:0.1);
  Y as LengthDomain(DiscretizationMethod:BFD1, HighestOrderDerivative:2, Length:1, SpacingPreference:0.1);

And the distributed variable T:

  T as Distribution2D (XDomain is X, YDomain is Y) of Temperature(10);

The Conductivity model uses this variable T directly in its calculations because we declared it as external there. We also define two extra variables in the usual way for the equations:

  cp as cp_mol_liq (69.47, fixed);
  rho as dens_mol_liq (55.216, fixed);

Now we need to instance the Conductivity submodel to use it here. We instance the model in the usual way and map the domains XDomain and YDomain from the submodel to the domains we want to use from the model MetalPlate, which are X and Y. These have been defined as LengthDomain, see above.

  cond as conductivity(XDomain is X, YDomain is Y);

I have chosen not to map any variables between the model and submodel because I am using T as external in the submodel and I will use K from the submodel directly in the heat transfer equations below. Again I re-arranged these equations in two nested FOR loops to make them easier to write and read. To refer to variable K in the submodel we use model_name.variable_name as usual. However, don't forget that K is a distributed variable, so you should always include the index(es); otherwise you will find your problem over- or underspecified by an enormous number of variables. That is a good indicator that you have something wrong in the equations.

  // Heat transfer equations
  for iX in X.Interior do
    for iY in Y.Interior do
      HeatTransfer: cp*rho*$T(iX,iY) = cond.k(iX,iY)*T(iX,iY).d2dx2 + cond.k(iX,iY)*T(iX,iY).d2dy2;
    endfor;
  endfor;

Once finished writing the equations and including any initial and boundary conditions, the model is ready to be compiled. Keywords: domain, submodel, parameter References: None
Problem Statement: When attempting to open a project, I get an access denied error message. How do I open my project, and what happened to it?
Solution: When the project is accessed, it opens the initial project information. If these files are missing or corrupted, the project will return an 'Access Denied' error message. This often happens when the application was terminated unexpectedly while the project was being worked on or closed (either by ending the task manually or because power to the computer was lost). To recover the project, follow these steps: It may be that the SESSION.LCK file was not automatically removed from the system when the last scenario was closed. The file resides in the C:\Users\[username]\AppData\Local\AspenTech\Economic Evaluation Vx.x\Projects directory. Normally, the file exists only while a scenario is open on the system. In rare circumstances, however, the file is not removed automatically by the system. Close the open Aspen Economic Evaluation software application and delete SESSION.LCK. Reopen the Aspen Economic Evaluation software. All scenarios should be accessible now. If there is no SESSION.LCK file, it is very likely that important files are missing from the project scenario, which is why the project scenario cannot be opened and the 'Access Denied' error message results. Check whether you have a backup of your project in the C:\Users\[username]\AppData\Local\AspenTech\Economic Evaluation Vx.x\Backup directory. If there does not seem to be a viable project to recover, please send the IZP and SZP files to [email protected]. The IZP file contains all the necessary files to open a project. The SZP files contain database files, reports, etc., which are not essential to opening the project but are useful in attempting to repair the project file. A technical support consultant will attempt to repair the project file and send it back to you. Keywords: Access denied, SESSION.LCK References: None
Problem Statement: How to redesign the equipment using the MAWP as the new design pressure?
Solution: Aspen Shell and Tube Mechanical will calculate the maximum allowable working pressure (MAWP) for each component of the vessel. However, the ASME BPVC VIII-1 does not have an explicit procedure to calculate the MAWP for a multi-chambered vessel (i.e. a shell-and-tube heat exchanger) for those components subject to both the shell side and the tube side pressures. So there are 3 different approaches:
Option 1: Use the design pressure as the MAWP
Option 2: Use the calculated MAWP as the new design pressures
Option 3: Iterative method if option 2 fails
If you want to redesign the equipment using the calculated MAWP as the new design pressure, one method is to manually increase, by a small amount, the design pressure on each side simultaneously until a component is overstressed. The procedure is as follows:
1. Once the design has been finalized using the original design pressures, click on 'Update Geometry' (result thicknesses and diameters are automatically entered as new inputs)
2. Increase the design pressure by a small amount on each side and re-run
3. Continue until you get a warning message on a component
4. Now use the previous run as the new MAWP for each side
This must be done as trial-and-error because there are many variables on each side that affect the overall stresses of components subject to both pressures. UHX is an example of this complexity. Keywords: Re-design, MAWP, maximum allowable working pressure. References: None
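As a sketch only, the trial-and-error procedure can be expressed as a simple loop. Here is_overstressed stands in for re-running the mechanical design case at the trial pressure and checking for a component warning; the function, its threshold, and the single-pressure simplification (the real procedure raises shell-side and tube-side pressures together) are all assumptions for illustration:

```python
def find_mawp(p_design, step, is_overstressed, p_max=None):
    """Raise the design pressure in small increments until the next
    step would overstress a component; the last passing pressure is
    taken as the MAWP. is_overstressed(p) is a hypothetical stand-in
    for re-running the case at pressure p and checking for warnings."""
    p = p_design
    while not is_overstressed(p + step):
        p += step
        if p_max is not None and p >= p_max:
            break  # safety cap on the search
    return p

# Toy check: pretend the first component warning appears above 163.2 psi
mawp = find_mawp(150.0, 1.0, lambda p: p > 163.2)
```

The smaller the step, the closer the result approaches the true MAWP, at the cost of more re-runs.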
Problem Statement: This Knowledge Base article answers the following question: How can I change the type of a batch characteristic that has already been stored in a relational database?
Solution: There is no easy way to change the type of a characteristic that has already been recorded in the Aspen Production Record Manager (APRM) database, short of writing a custom script that would make the change (see the experienced-users section below). Your best option using the AspenTech client tools provided is to go to the APRM Administrator and rename the characteristic that has an incorrect type to something like CHAR_NAME_do_not_use, create a new one and assign it the proper type, purge the batches that contain the bad characteristic from your APRM database, and re-run the BCU scripts over the corresponding period, making sure that the correct characteristic is used this time around. Database procedures for experienced users: There are generic stored procedures attached to this solution which can be modified and run on your APRM relational database to change the type of a batch characteristic. The scripts are relational database specific (MS SQL Server and Oracle). Please make sure that you have a current backup of your APRM database before attempting to run the scripts. Important Note: The Aspen Production Record Manager Server service must be stopped while running the scripts. Keywords: Batch.21 Convert Numeric References: None
Problem Statement: How does “Pipe length adjustment” specification affect the pipe length calculated by ACCE?
Solution: Pipe length adjustment, accessed in the Area piping specs, is a factor used to increase or decrease the overall length of every installation pipe calculated for the equipment placed in that area. Each piece of equipment has a formula to calculate each line reported for it. For example, a trayed tower has a specific formula to calculate Line Number 3 (Overhead Vapor), and this formula is different from the ones used to calculate Line Numbers 1, 2, etc. The formula used for line number 3 is as follows: 12 ft + 0.5*Pipe diameter (ft) + 0.4*Column T/T height (ft) If we use a pipe length adjustment of 200%, then the calculation for the length would be as follows: [12 ft + 0.5*Pipe diameter (ft) + 0.4*Column T/T height (ft)] * 2 doubling the overall calculated length. To further exemplify this, let's assume a column of 56 ft height and 12-inch diameter. The calculated length would be: 12 ft + (0.5*1 ft) + (0.4*56 ft) = 34.9 ft, rounded up to 35.0 ft. If the user set the pipe length adjustment to 200%, the final length would be 70 ft. Notes: Pipe length adjustment is affected by the area dimensions. If the area is big enough, the factor is applied without a problem; if the area is too short, the overall calculated length will be truncated because the area dimensions are too small to contain the calculated pipe length. Pipe length adjustment is also affected by the pipe envelope; if the pipe envelope is too small, it will truncate the calculated length. This description is only valid if the default volumetric model is applied. If the user has toggled off the volumetric model for the equipment / project, the user must manually enter the lines and their lengths. Attached to this Solution is an izp project file with four areas: one with the default pipe length adjustment of 100%, and three areas for different scenarios - a big area with an adjustment, a small area with an adjustment, and a small envelope with an adjustment. 
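As a sketch, the Line Number 3 formula and the adjustment factor combine as follows. The function name is illustrative, and ACCE's own rounding (34.9 ft reported as 35.0 ft) is not reproduced here.

```python
def overhead_vapor_line_length(pipe_diameter_ft, tt_height_ft, adjustment_pct=100.0):
    """Line Number 3 (Overhead Vapor) length for a trayed tower:
    12 ft + 0.5 * pipe diameter (ft) + 0.4 * column T/T height (ft),
    scaled by the area's pipe length adjustment (a percentage)."""
    base_length = 12.0 + 0.5 * pipe_diameter_ft + 0.4 * tt_height_ft
    return base_length * (adjustment_pct / 100.0)
```

For the worked example in the text (56 ft column, 12-inch = 1 ft diameter pipe), this returns 34.9 ft at the default 100% adjustment and 69.8 ft at 200%, matching the 35 ft and 70 ft rounded values above.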
If you would like further clarification on this, please contact AspenTech Support. Keywords: Pipe, length, adjustment, volumetric, model, envelope References: None
Problem Statement: The HYSYS workbook is currently unable to report Flow Assurance data (e.g. Erosion Velocity). As a workaround, VBA can be used to access the Pipe Segment results and report it to an MS Excel spreadsheet.
Solution: An MS Excel file is provided. See the VBA editor for the relevant VBA code. In this case, backdoor variables are used to access the pipe segment properties and results. Keywords: VBA, Flow Assurance, Erosion Velocity References: None
Problem Statement: How to return derivatives from a procedure.
Solution: During the resolution of non-linear equations, the Newton solver needs to evaluate the values of the residuals and the Jacobian matrix. This is done analytically for equations implemented using the Aspen Custom Modeler language. For external code implemented in procedures, the user has the option to return derivatives. Returning correct derivatives from a procedure may help the solver; otherwise, they are evaluated using numerical perturbations. Coding derivatives is tedious. For simple procedures, numerical perturbations work fine, and better than a procedure returning incorrect derivatives. However, there are some cases where numerical derivatives will be poor. This is the case for input variables with extreme orders of magnitude or special non-linear effects (e.g. when increasing the value above or below some value is non-physical, or if you want to use a higher order approximation for the derivatives). In this case, you can specify that the procedure returns derivatives and code an ad-hoc perturbation method inside the subroutine itself. Finally, you can evaluate analytical derivatives or use third party code to generate such derivatives. To specify that a procedure is returning derivatives, you need to add the keyword DERIVATIVES in the OPTIONS specification of the procedure. The style of procedure may be traditional or new (using the option COMPATIBILITY: ACM2004;). For the traditional style, the derivatives are returned in a matrix DERIVS. For the ACM2004 style, the derivatives are returned in a 1-dimensional array. The order of the derivatives is specified by ACM. The argument ICALL specifies what the procedure is supposed to do: it may request the value of the output variables (ICALL=0), or the value of the derivatives (ICALL=3), or both (ICALL=4). The attached example shows three implementations of the same calculation: pProc1 returns the area and perimeter of a rectangle of specified length and height. 
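The ICALL dispatch described above can be sketched as follows. This is a Python illustration of the idea, not actual ACM procedure code, and the flat derivative ordering shown is illustrative rather than ACM's exact layout.

```python
def rect_proc(icall, length, height):
    """Sketch of a pProc2-style procedure for a rectangle.

    Outputs: (area, perimeter). The analytical Jacobian entries are
    d(area)/dL = H, d(area)/dH = L, d(perim)/dL = 2, d(perim)/dH = 2.
    ICALL selects outputs only (0), derivatives only (3), or both (4).
    """
    outputs = (length * height, 2.0 * (length + height))
    derivs = (height, length, 2.0, 2.0)  # illustrative flat ordering
    if icall == 0:
        return outputs, None
    if icall == 3:
        return None, derivs
    if icall == 4:
        return outputs, derivs
    raise ValueError("unsupported ICALL value")
```

Returning the analytical entries (H, L, 2, 2) is what saves the solver from perturbing the inputs numerically on each Jacobian evaluation.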
The pProc2 procedure does the same but also returns derivatives. The pProc3 procedure does the same but uses the ACM2004 style. The example was created in Aspen Custom Modeler v9, but it can be opened without any problem in older versions such as v8, v8.4 and v8.8. You can experiment with the simulation and create a new block using test2. This model sets the area and perimeter, which requires the back-calculation of length and height and so triggers the calculation of derivatives. You can see that with Mixed Newton or Newton as the solver, the procedure is called with ICALL=0 and ICALL=3 at each iteration. If you select the Fast Newton method, the procedure is called with ICALL=0, and with ICALL=3 only when the solver detects a slow rate of convergence. Note that Fast Newton actually requires more iterations to converge (12 instead of 6), as not updating the Jacobian matrix at each iteration breaks the quadratic rate of convergence of the Newton method. Note that Mixed Newton simply means that the full Newton method is used for steady state and initialization runs, and Fast Newton for dynamic runs. Finally, it is important to point out that such a simple example is used only for illustration; the best implementation would be to simply write the equations directly in the Aspen Custom Modeler language (model test3). Keywords: procedure, derivatives, derivs References: None
Problem Statement: It is possible to generate text files that define the structure of records in an Aspen InfoPlus.21 (IP21) database. Such files can be modified in a text editor and read back into the database (or a completely different IP21 database). This
Solution: explains the different options available and the codes used in the files themselves. Solution RECSAVE and RECLOAD are utilities accessible in the InfoPlus.21 Administrator tool, which can save and load records in human-readable ASCII text format. The mechanism is useful for: Mass database population by creating a record, saving it, then “cutting and pasting” inside a text editor. Transferring record systems between databases. To start generating RECLOAD files based on existing records, right click the data source name in the navigation tree of Aspen InfoPlus.21 Administrator and select Save Records from the context menu: Select existing records with the Find tool. Once the records are selected, then select a suitable filename and path by clicking the (…) box located in the file panel. As shown in the example below with the ReactorTags.rld file generated using the different (Save Fields As) options. Names - This option saves the field names as saved in the record's name field. Tags - The default behavior is to save the field names as field numbers (Tags). This is the format used internally by IP21. The field ID numbers are not logical unless you match the record IDs within IP21. It is recommended to use the ID number format for loading very large ASCII files, but it is not recommended for reference. Tags plus Include Names as Comments – This option saves the field names as field numbers but includes the readable field name as comments. A RECLOAD file makes use of different symbols to control the way the file is loaded back into IP21. See Solution 103522 for a full list of the symbols that can be used in a recload file. How to load a RECLOAD file into IP21 It is recommended that you always save the IP21 snapshot before loading records in case it is necessary to return to a clean database after any failure. 
Right click on the data source name within the IP21 administrator and select the Load Records menu to open the Load Records dialog window: There is a choice of options to use when loading records: Available in this system Disregards any record IDs listed in the file, instead, creating a record using the first available ID within the database. You may use the example used below which provides a new free ID to be created within your IP21 database. If a pipe character is found followed by a number the RECLOAD program will use the first available ID after this number: Example: | 3000 #1208 @IP_AnalogDef *NewRec Saved in File If this option is selected, the IDs in the file will be used. This option should be chosen if it is important for the records to occupy certain IDs within IP21. A dialogue box gives information on the success or failure of the load process, any errors should be investigated based on the error message presented. NOTE It is possible for a Load operation to be partially successful, in which case records may be created but only partially populated. If this occurs but it is important to create a reliable recload set, the records must be manually deleted, adjustments made and the operation repeated until completely successful. Keywords: custom RECSAVE RECLOAD References: None
Problem Statement: I need to calculate the specific gravity for the conversion of pump head to pressure. There are several different density data in the HYSYS stream result: Mass Density Std. Ideal Liq. Mass Density Liq. Mass Density (Std. Cond) Which density should I use?
Solution: To calculate the specific gravity (SG) used to convert pump head to pressure, you need to consider the real stream condition and the mixing effects of different fluids. Mass Density is calculated rigorously at the flowing conditions of the stream (i.e., at stream temperature and pressure). This includes mixing effects. Std. Ideal Liq. Mass Density is calculated based on ideal mixing of the pure components at standard conditions. Each component has an ideal liquid density at standard conditions; these are then combined, weighted by component fraction (xi), to create an ideal liquid density for the stream: Ideal density of a stream = 1 / sum (xi / ideal density (i) ) Consequently, ideal density does not include any actual mixing effects. Liq. Mass Density (Std. Cond) is calculated rigorously at standard conditions. This includes mixing effects. The standard conditions used here depend on the temperature units in use by Aspen HYSYS. If C or K is used, it is 15 degrees C and 1 atm. If F or R is used, it is 60 F and 1 atm. Therefore, you should usually use Mass Density to calculate the SG used to convert pump head to pressure. Keywords: Specific gravity Mass Density Std. Ideal Liq. Mass Density Liq. Mass Density (Std. Cond) References: None
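The ideal-mixing formula in the Solution above can be sketched as follows. This assumes xi are mass fractions, so the harmonic (volume-additive) mixing returns a mass density; the function name and units are illustrative.

```python
def ideal_liquid_density(mass_fractions, pure_densities):
    """Std. Ideal Liq. Mass Density sketch: ideal (volume-additive) mixing
    of pure-component standard-condition densities, with no mixing effects.

        rho_ideal = 1 / sum(x_i / rho_i)

    mass_fractions must sum to 1; pure_densities in any consistent unit.
    """
    assert abs(sum(mass_fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return 1.0 / sum(x / rho for x, rho in zip(mass_fractions, pure_densities))
```

For a 50/50 (by mass) blend of 1000 and 500 kg/m3 components, the ideal density is 2000/3 ≈ 666.7 kg/m3, below the arithmetic mean because the lighter component contributes more volume.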
Problem Statement: What is the void fraction parameter used for in calculations?
Solution: The pressure drop in a fixed bed is very sensitive to bed void fraction. The void fraction can be controlled by particle size, shape, and the method of catalyst loading. A range of particle sizes is not desirable because it results in smaller void fractions and can cause a significant pressure drop. A high void fraction results in low pressure drop and is desirable in some reactions. The void fraction is also used to calculate the reactor volume, but that doesn't have much effect, since it is the catalyst weight that affects conversion. Catalyst density and bed voidage are vendor-related parameters. Generally, the void fraction varies from 0.2 to 0.75. The actual void fraction can be as high as 0.5 for sock-loaded catalyst, and as low as 0.35 for dense-loaded spherical catalyst. Keywords: void fraction, hydrocracker References: None
Problem Statement: The Aspen Production Execution Manager Design Guide contains a guide to commonly used flags that can be used to control application behavior, but it does not include all possible flags.
Solution: This Knowledgebase Article exists to document additional flags. Most of these flags are for special use-cases. About Flags All user-defined flags will be found in one of the files in the following directory on the server or on a client: C:\Program Files\Aspentech\AeBRS\cfg_source In that folder there are three files meant primarily to hold flags: config.m2r_cfg path.m2r_cfg flags.m2r_cfg Not all possible flags are explicitly defined. Many flags exist internally in Production Execution Manager with default values. To change that default value, the flag must be added to one of these files with the desired value. Whenever a change is made to any of these files, it is necessary to run codify_all.cmd (located in the same directory) which codifies the content of these text files (i.e. writes compiled *.cfg files to the Program Files\AeBRS directory.) The next time MOC is opened, or in the case of a server flag, Apache is restarted, the updated flag values will be taken into account. Flags related to Automatic Basic Phases Flag Name: AUTO_START_FIRST_ONLY Description: Modifies behavior of auto-start Basic Phases. Applicable Version: Version 6.0.1 and later; related CQ00209119 Server or Client: This flag affects client behavior, but it should be set the same for ALL clients, so set it in flags.m2r_cfg on the server. (A separate Solution will be written that explains configuration for shared and local flags) Details: If the auto-start checkbox is checked at design time, a Basic Phase will start automatically when the flow of execution reaches it. Once started, if the user cancels the Basic Phase before it reaches a Finished State, default behavior is that it will restart again the next time condition evaluation occurs (i.e. value 0.) If explicitly defined as: AUTO_START_FIRST_ONLY=1 this ensures that once canceled the Basic Phase will not restart. 
Note that if the Basic Phase is part of a loop in the RPL (for example held inside an Operation with a repetition count) it is completely reset, so that it once again has initial autostart behavior as designed. Flag Name: DISABLE_SERVER_RETRY_AUTOSTART_MESSAGES Description: Modifies behavior of auto-start Basic Phases. Applicable Version: Version 6.0.1 and later. Server or Client: Server only. Details: This flag controls whether or not the server will try to restart an auto-start BP elsewhere in the Production Execution Manager system if the auto-start Basic Phase fails to start on the workstation initially triggering its availability. The default behavior typically seen is that an auto-start Basic Phase starts on the workstation which triggers its availability. For example, if three auto-start Basic Phases are strung together in a serial relationship, and the first auto-start Basic Phase in that RPL section starts on Workstation Alpha, the second and third auto-start phases will also likely launch themselves on Workstation Alpha as each preceding Basic Phase finishes. However, there is no 100% guarantee, because of the architecture of the mechanism. Much like the internet sends packets by multiple routes to ensure robustness, Production Execution Manager is dynamic in its approach to get an auto-start Basic Phase up and running once it is triggered. In fact when an auto-start Basic Phase is triggered, a start message is sent to ALL workstations except the triggering one, with an accompanying .5 second timeout. That default .5 second timeout can be modified by explicitly declaring and setting a value for the MSG_PROCESSOR_READY_AUTO_PAUSE flag.) During this .5 second timeout window, the only workstation not forced to pause for .5 seconds is the triggering one, so most likely it will start the auto-start Basic Phase. To help increase the likelihood that the auto-start Phase only stays on the triggering workstation, increase the timeout flag. 
For example if the triggering workstation has low RAM, heavy processor usage, etc., the clean-up work from the previous execution may take longer than .5 seconds, and then the auto-start Basic Phase will pop up (seemingly randomly) on some other available workstation. Default value of this flag, if it is not explicitly defined in a config file, is 0, which means the server will try to autostart BP's which fail to launch on their triggering workstation on any other available workstation. It will continue this attempt during every condition evaluation cycle (See KB Article 000013450 for more information about CONDITION_EVALUATION_PERIOD.) By explicitly defining it as true: DISABLE_SERVER_RETRY_AUTOSTART_MESSAGES=1 it means if the first attempt to start the Basic Phase on the triggering workstation fails, it will not auto-start anywhere else. But it will show up as available on every workstation module in the plant, waiting to be started manually. In summary, this flag could be considered a corner-case that is rarely needed, since the triggering of auto-start Basic Phases should work as designed. In those cases where auto-start Basic Phases seem to pop up randomly, and not on the intended workstation, first try increasing the MSG_PROCESSOR_READY_AUTO_PAUSE flag. If a larger timeout still does not address the issue, set DISABLE_SERVER_RETRY_AUTOSTART_MESSAGES to 1, so that auto-start phases which fail to start the first time do NOT try a second auto-start, and instead simply wait for the operator on the desired workstation Flag Name: PHASE_REMOTE_START Description: Modifies behavior of auto-start Basic Phases. Applicable Version: Version 6.0.1 and later. Server or Client: This flag affects client behavior, but it should be set the same for ALL clients, so set it in flags.m2r_cfg on the server. 
Details: This flag determines what conditions trigger Apache to start condition evaluation By default RPL evaluation is triggered to start only when (1) a MOC session is open on a workstation in the system, and (2) Workstation or Order Tracking module is opened within that MOC session. At that point Apache will start evaluating all conditions in the RPL every 10 seconds (see KB Article 000013450 for more information about CONDITION_EVALUATION_PERIOD.) PHASE_REMOTE_START=0 Default value. Condition evaluation only starts as described above. PHASE_REMOTE_START=1 Condition evaluation only happens when a Recipe is being displayed in Order Tracking. PHASE_REMOTE_START=2 Just MOC needs to be open for Apache to start condition evaluation. It is not necessary to have Workstation or Order Tracking modules specifically open. Flag Name: MSG_PROCESSOR_READY_AUTO_PAUSE Description: The timeout period that helps determine where an automatic Basic Phase will execute. Applicable Version: Version 5.0.1 and later Server or Client: This flag affects client behavior, but it should be set the same for ALL clients, so set it in flags.m2r_cfg on the server. Details: See KB Article 000013311 for more information. GENERAL FLAGS Flag Name: ARCHIVE_ROOT Description: Controls destination folder of external archive packages created with the Production Execution Manager Administrator tool. Applicable Version: Version 2004 and later. Server or Client: Server Details: See KB Article 000013650 for more information. Flag Name: CANCEL_PENDDING_WORKSTATION_OPER Description: Determines how Production Execution Manager handles workstations that hang or otherwise get disconnected while executing a Basic Phase Applicable Version: Version 5.0.1 and later Server or Client: Client Details: When a workstation is executing a Basic Phase, that Basic Phase has an Executing status in the system. 
If the client environment crashes (for example the Windows PC itself, or the Java Virtual Machine handling the execution environment), or if that workstation somehow gets disconnected from the system, the Production Execution Manager Server continues to maintain the status of the Basic Phase as Executing. However, when that same client attempts to log back into the system, the server will then handle the status of the still executing BP status according to the flag value. With the default value of 1, upon log-in of the workstation, the Basic Phase that was executing goes to a cancelled status, and the entire order gets the logical status Cancel by Phase. If the flag is set to 0, then when the workstation logs in, the Basic Phase will go back to an Enabled/Ready status, and the Order maintains its Initiated (i.e. in progress) status. KB Article 000013491 has more detail on this flag and steps for managing Cancelled by Phase status. As a final note, make sure not to try to spell the flag any other way, it is indeed pendding, not pending. Flag Name: CONDITION_EVALUATION_PERIOD Description: The interval at which the Production Execution Manager server checks for conditions to evaluate Applicable Version: Version 5.0.1 and later Server or Client: Server Details: See KB Article 000013450 for more information. Flag Name: DIALOG_MESSAGE_WIDTH Description: Specifies a limit to how long a text string can be included in a pop-up dialog. Applicable Version: Version 6.0.1 and later; related CQ00211174. Server or Client: Client Details: This flag was created to help program Production Execution ManagerSolutions for hand-held devices. The default limit of this flag is 100 (unit of measure is number of characters.) Hand-held displays have limited usable space, so messages in pop-up dialogs cannot be very long. Flag Name: DISABLE_OPER_CACHE Description: By default, as a workstation loads Basic Phases for execution, they are cached for reuse, to help performance. 
Applicable Version: Version 5.0.1 and later. Server or Client: Client Details: Especially when using heavy parameterization, the same Basic Phase may be used many times in the same Recipe. To take advantage of this RPL design approach, as Basic Phases are loaded for use, they are kept in memory in case they are needed again. They persist until the MOC interface is closed. If MOC is left open for weeks, or the RPL does not reuse the same Basic Phases, this flag can be set to true: DISABLE_OPER_CACHE=1 causing each BP to be dropped from memory after use. Flag Name: MUST_AUTHENTICATE_WINNT_USER Description: Controls whether or not authentication is required when opening MOC. Applicable Version: Version 2004.1 and later. Server or Client: Client Details: See KB Article 000013577 for more information. Flag Name: NO_USER_IN_SIGNATURE Description: Related to 21CFR11 compliance Applicable Version: Version 6.0.1 and later; related CQ00121185. Server or Client: Client Details: By default, operations within MOC that require username/password validation supply the username of the logged-in user (for example to certify an RPL.) One interpretation of 21CFR11 requirements can be that the user taking an action should be required to explicitly supply their username and password, not just the password. By setting: NO_USER_IN_SIGNATURE=1 username fields will be blank, with the intent of providing a higher degree of accountability. When viewing the audit trail, you will know that actions taken by a user required manual entry of both fields, not just password. Flag Name: SCREEN_SHOT_KEEP_SIZE_IMAGE Description: Controls whether or not scaling is applied to screenshot images that exceed A4 paper width. Applicable Version: Version 6.0.1 and later; related CQ00145915. Server or Client: Client Details: By default screenshot images are scaled to fit default A4 paper width. In the case of large images, this can result in loss of resolution. 
By setting: SCREEN_SHOT_KEEP_SIZE_IMAGE=1 the original image is preserved in the screenshot report. This preserves all original information from the screenshot, but may result in a cut-off image if the report is printed. Flag Name: SCREEN_SHOT_MAX_WIDTH Description: Sets an absolute threshold (in pixels) above which scaling applies Applicable Version: Version 6.0.1 and later; related CQ00153265. Server or Client: Client Details: Even if using the SCREEN_SHOT_KEEP_SIZE_IMAGE flag as described above to avoid scaling based on A4 paper width, this flag allows the administrator to set an absolute pixel width above which scaling will be applied so that the resulting screenshot never exceeds it, like: SCREEN_SHOT_MAX_WIDTH=700 Flag Name: WORKSTATION_SHOWS_READY_PHASES_ONLY Description: Provides a filtered view in the Workstation module Applicable Version: Version 6.0.1 and later. Server or Client: Client Details: By default, the Workstation module shows all Basic Phases for all active orders. Depending on the Production Execution Manager application this could quickly result in hundreds of Basic Phases being listed. Local text filtering makes it easy to filter down to a particular order, but even then the list may be unmanageably long. If the desired usage is to only see the Basic Phases that are actually ready to execute, set this flag with a value of 1. Keywords: None References: None
Problem Statement: In Aspen Plus, how could I know which databank is used for each component's physical property data?
Solution: To understand which databank is used for each component's property data, go to the Navigation Pane and click Properties. Then expand Setup, click Report Options, then click the Property tab. Check the box Property parameters' descriptions, equations and source of data. Then run your simulation and click Report under Summary. A text file named “Aspen Plus Calculation Report” will be generated. In the calculation report, you will find the property data for all the components and the corresponding data source (PURE35, HYSYS, AQUEOUS, SOLIDS, etc.). Keywords: None References: None
Problem Statement: I need to simulate a pipeline with a nonzero initial elevation in Aspen HYSYS. Is it possible to change the reference point for pipe elevation?
Solution: The default initial elevation of a pipe segment is zero. It is not possible to change the reference point for pipe elevation in Aspen HYSYS. HYSYS only allows you to input the elevation change of each pipe segment. For example, suppose I need to simulate a 5-segment pipeline with an initial elevation of 100 ft:

Distance (ft)   Elevation (ft)
0               100
500             119.685
910             98.68504
1235            99.68504
1560            149.685
1970            119.685

The resulting elevation of each pipe segment in HYSYS will be the relative elevation compared to the start point. However, the initial elevation won't make any difference to the simulation results (flow rate, pressure, temperature, etc.), because it is only a reference point. Keywords: Pipeline, Elevation, reference point References: None
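Converting an absolute elevation profile like the one above into the per-segment elevation changes HYSYS expects is just successive differencing, as sketched here:

```python
def segment_elevation_changes(profile):
    """Convert an absolute elevation profile, given as a list of
    (distance_ft, elevation_ft) points, into the per-segment elevation
    changes that Aspen HYSYS takes as input (relative to the inlet)."""
    elevations = [elev for _distance, elev in profile]
    return [nxt - cur for cur, nxt in zip(elevations, elevations[1:])]
```

The absolute starting elevation (100 ft in the example) drops out entirely: only the differences survive, which is why the reference point does not affect the simulation results.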
Problem Statement: In MOC (Manufacturing Operations & Control) and the Web client you can change the timeout settings so that the application will time out and require the authentication screen to continue.
Solution: The timeout settings can be set so that MOC or the Web application will force the user to log back into the application. 1. On the APEM server, open the path.m2r_cfg file with Notepad. This is typically located in: C:\Program Files (x86)\AspenTech\AeBRS\cfg_source 2. Look for the flags under INACTIVITY PERIOD: INACTIVITY_PERIOD = 300 OPERATION_INACTIVITY_PERIOD = 300 WEB_INACTIVITY_PERIOD = 300 3. Enter a value in seconds; the default is 300 seconds. 4. Enter 0 if you do not want the application to time out. 5. Save the file and, in the same folder, run the file codify_all.cmd 6. Close MOC or the Web application and re-open it for the changes to take effect. Keywords: None References: None
Problem Statement: This Knowledge Base article shows how to retain the units installed in the Scheduling Table when upgrading from one Aspen Production Record Manager (APRM) version to another and at the same time moving the database from one Relational Database Management System (RDBMS) server to another. Here is a possible scenario: Migrate Aspen InfoPlus.21 and APRM v8.0 from Server1 to AMS v9.0 on Server2. Export APRM (v8.0) database from database1 to database2 (v9.0)
Solution: Upgrading the database without moving it to a different relational database server maintains the existing units. However, when migrating to another database server you need to export the units as XML files and re-import them into the new system. You will lose execution context in that case. Below is a workaround that will let you retain the units installed in the Scheduling Table when your upgrade scenario matches the one presented in the Problem Statement above. Keywords: Batch.21 References: None
Problem Statement: In Aspen Process Explorer how can I hide the timeline by default? Normally when you start Process Explorer it includes the timeline at the bottom of the display. This can be turned off by clicking View from the action bar and then unchecking the Global Timeline option. But how could the user automatically start with this option turned off?
Solution: You can hide the timeline by adding a key to the registry on the user's machine. [HKEY_CURRENT_USER\Software\Aspentech\Aspen Process Explorer\Workspace] Add a DWORD value called TimelineVisible. When this is set to 0, the timeline does not appear at startup. You still have the option to turn it on manually using the View option. Keywords: Global timeline timespan hidden References: None
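Assuming the registry path given above, a minimal .reg file that creates the value could look like the sketch below; verify the path on your own system before importing it.

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Aspentech\Aspen Process Explorer\Workspace]
"TimelineVisible"=dword:00000000
```

Double-clicking this file (or running reg import on it) sets TimelineVisible to 0 for the current user, so Process Explorer starts with the timeline hidden.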
Problem Statement: When you load a Golden Batch Profile plot you get the error: B21BAI-60244: Invalid data source name (servernamexxx) Unspecified error (0x80004005)
Solution: This error refers to the Aspen Data Source Administrator (ADSA) data source name you have configured for Aspen Process Explorer and Aspen Production Record Manager. To resolve this issue, check the following: 1. Open the ADSA and ensure that you have the correct Directory Server. 2. In the ADSA public data source, ensure that you have the service for Aspen Production Record Manager, that it points to the correct server, and that Save Password has been ticked. 3. In Process Explorer, open the profile for which you received the error message and edit the profile tag. Check that the data source is correct. 4. If your data source is correct, try to ping the data source by name and by IP address. If these do not resolve to the same address, refer this issue to your IT or network administrator. 5. Open Process Explorer with an administrator user account; if the plot then works with no error, refer to your IT or network administrator for a DCOM permissions issue. Keywords: Invalid data source B21BAI-60244 Golden Batch Profile Profile Process Explorer References: None
Problem Statement: This Knowledge Base article provides steps to resolve the following error: Error: Relational Database Error -2147217833: Arithmetic overflow error for data type smallint, value = 32768. which may be recorded in the Aspen Batch and Event Extractor log.
Solution: Below is an example of the error block from the Aspen Batch and Event Extractor log: 05/11/17 22:52:46 Error : Configuration: PolyABC_ABB has failed 1 time(s) 05/11/17 22:52:46 Error : Error processing configuration: PolyABC_ABB->Config error: ProcessData Result Reader Error while processing table (b21Batch_Transitions) configuration (PolyABC_ABB): Relational Database Error: Relational Database Error -2147217833: Arithmetic overflow error for data type smallint, value = 32768. 05/11/17 23:26:33 Warning: Forcing Bundle size to 0 for next pass - Relational Database Error: Relational Database Error -2147217833: Arithmetic overflow error for data type smallint, value = 32768. In this example, the above-mentioned error message is thrown by relational database table 'b21Batch_Transitions'. To resolve the error, you will need to ask your Relational Database Server Administrator (DBA) to perform the following steps: Find the table in your database that is throwing the error (such as b21Batch_Transitions in our example above) Right-click on it and select 'Design' from the Context menu Under Column Name find a column that has Data Type ‘smallint’ and change it to ‘int’, as shown in the screen capture below Save the changes (this may require a DBA to perform further steps that are outside of the scope of this KB Article) Keywords: BXE References: None
Problem Statement: Diagnostic logging is produced by the Aspen.InfoPlus21_DA Server, but what options are available to configure it?
Solution: The Aspen.InfoPlus21_DA Server provides a sophisticated diagnostic logging system. It can be used to send log messages to one or more destination media, such as to a file, a database table, an email, and so on. The logging behavior is controlled by a configuration file named IP21DaServer.dll.config. The file is in XML format and can be edited using any text editor. By default, it is configured to log warning and error messages to a file that rolls over when it reaches 1024 KB in size.
The application logs messages of various severity levels, such as FATAL, ERROR, WARN, INFO, or DEBUG (ordered from most severe to least). There is a filter that controls which log severity levels should be included. This is controlled by the LevelMin and LevelMax values of the filter in the LogFileAppender section. By default, it filters to include levels WARN through FATAL.
The level value under the root section also controls the filtering of messages. It controls the global severity level. For example, if additional appenders were added to the configuration file, they might have their own severity filters, but all appenders are limited by the root level. The current default for the root level is INFO. But you will notice that no INFO messages will appear in the file log. Again, this is because the LevelMin filter is set to WARN. To include INFO messages in the file, you must change the LevelMin to INFO (or a lower severity, such as DEBUG). If you want to view the most detailed logging, then set the root level to DEBUG and LevelMin to DEBUG, also.
When using the INFO level, there is one message that appears too often. We recommend adding the following filter to the LogFileAppender section (just after the </filter> line) to exclude that one message:
<filter type="log4net.Filter.StringMatchFilter">
  <stringToMatch value="ChangeCount" />
  <acceptOnMatch value="false" />
</filter>
The location of the output file to be written is controlled by the conversionPattern value. 
You may find that this value is misconfigured in the default log config file. It will write to the current user's roaming profile location but uses a bad file path. This results in the log file being placed somewhere like:
C:\Users\<user>\AppData\RoamingAspenTech\Logs\InfoPlus.21\IP21OPCServer.log
Please change this value to a direct path to the desired location, such as:
<conversionPattern value="C:\ProgramData\AspenTech\DiagnosticLogs\InfoPlus.21\Ip21OpcServer\IP21OPCServer.log" />
The logging system is based on Apache's log4net logging component. It is very flexible and powerful. Please visit the web site at https://logging.apache.org/log4net/release/manual/introduction.html to learn about the various configuration options. There are many types of appenders that can be configured to write log messages to many types of output destinations. You can also learn how to configure filters to eliminate unwanted messages.
Keywords: Aspen.InfoPlus21_DA.1 IP21 OPC Server diagnostic logging IP.21 DA Server log4net LevelMin References: None
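As a reference sketch (element and attribute names follow log4net's documented configuration schema; the shipped IP21DaServer.dll.config may differ in detail), a LogFileAppender with the severity-range filter described above would look roughly like this:

```xml
<appender name="LogFileAppender" type="log4net.Appender.RollingFileAppender">
  <!-- roll over when the log reaches 1024 KB -->
  <maximumFileSize value="1024KB" />
  <!-- include only WARN..FATAL; lower levelMin to INFO or DEBUG for more detail -->
  <filter type="log4net.Filter.LevelRangeFilter">
    <levelMin value="WARN" />
    <levelMax value="FATAL" />
  </filter>
</appender>
```

Remember that the root-level setting still caps what any appender can receive, so lowering levelMin to DEBUG only helps if the root level is DEBUG as well.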
Problem Statement: How can I specify specific sets of conditions when running a case study?
Solution: The default configuration for the Case Study tool in Aspen HYSYS uses Nested input to direct calculations. This means that all combinations of variables will be run based on an equal division between the high and low bounds using the specified step size. The figures below show a sample of the setup and results for this type of configuration: If the case study type is changed from Nested to Discrete, the user is given the freedom to choose which combinations of variables the case study will use to generate the output. In this configuration, the user provides the number of states to be run as well as the specific combination of values to be used for each variable. Note that the number of states must be entered directly by the user prior to specifying the specific combination of values to be used for each variable. The figures below show a sample of the setup and results for this type of configuration: The attached simulation demonstrates how these two options may be used to construct a case study. Keywords: Case Study, Type, State, Nested, Discrete References: None
Problem Statement: When opening the Aspen Weigh and Dispense tool you get the error message: No Booth found for workstation WORKSTATION.
Solution: The Weigh and Dispense tool is configured through the Aspen Weigh and Dispense Management website.
1. This message can occur if you are trying to access Weigh and Dispense for the first time and do not have a booth configured for the machine you are running on.
2. If this is the first time, open the Aspen Weigh and Dispense Management website: http://localhost/WeighDispense Note: Where localhost is equal to your server or node name. Note: The URL is case sensitive.
3. You will need to create a booth and add a workstation to the booth. In the Booth Characteristics area, click next to the Machine Name. The Workstation dialog box appears. Enter the necessary information in the empty line at the end and click OK. Check the Machine Name drop-down list to make sure the newly added workstation is available.
4. If you already have a booth created, open the Equipment tab and click edit (the pencil icon) for your booth. You can add or remove the workstations that have access. Note: This list is also in the MOC config and Workstation Module.
Keywords: None References: None
Problem Statement: How to control expander inlet guide vane (IGV) in Aspen HYSYS Dynamics?
Solution: If you are using an IGV Curve in your expander and want to control the Current IGV, you can use a PID controller. Unfortunately, the Current IGV is not listed as a Process Variable in the controller. In order to connect this variable to your controller, use the Spreadsheet operation. Locate the Current IGV variable in your expander and right-click on it. In the menu, select Send To | Spread Sheet | SPRDSHT-1 (in case you already have this operation added in your case; otherwise select Create New). Select the cell in which you want to include this variable (i.e. B1) and press OK. Open your PID controller and click on Select PV… to navigate through the flowsheet variables. Locate the Spreadsheet (SPRDSHT-1) and the cell in which you included the Current IGV (cell B1). Keywords: Expander, IGV, Position, Control, PID, Controller, Spreadsheet References: None
Problem Statement: I want to know if there is a way to copy/backup the users and roles from a workspace into a new ones without having to create role by role or adding user by user.
Solution: There are two options to back up the list of users and roles in order to use it for new workspaces: the Access Control - Aspen Basic Engineering Policy file (azpf file), and the Privileges.xml file.
· Access Control - azpf file
1. In the Administration tool, select the source workspace.
2. From the Action menu, select Export Access Control, and define the location to save the azpf file.
3. Select the new workspace into which you want to import the list of users and roles (azpf file).
4. Pick Import Access Control from the same Action menu, and look for the azpf file created in the previous step. With this, users and roles will be included, and Privileges.xml will be automatically modified.
· Privileges.xml
1. To take a backup of roles and users for any particular workspace, go to that workspace folder and copy the privileges.xml file to a safe location. (i.e. if you are using the default location, you can find this file at C:\AspenZyqadServer\Basic Engineering1X.1\Workspaces\workspace name)
2. If you want to use this privileges.xml file for a new workspace, first open the file with Notepad and modify the workspace name inside it to match the new workspace.
3. Once you have modified the file with the new workspace name, overwrite the existing file in your new workspace folder with this new one.
4. Make sure that you are using a privileges.xml file from the same ABE version.
Keywords: roles, users, privileges, azpf, backup, Access Control References: None
Problem Statement: How can I change the fluid package of an Aspen HYSYS stream via automation?
Solution: The following VBA code will change a stream's associated fluid package to Basis-2. The HYSYS.ProcessStream interface contains a FluidPackage property. However, this property only returns the fluid package object that is attached to the stream and does not return the underlying attachment field. Thus, it cannot be used to change the fluid package of the stream. The underlying fluid package attachment field can only be accessed via a BackDoor method. You can obtain the moniker by running the script recorder and performing an action which changes the selection in this field, such as by changing the selected fluid package from Basis-1 to Basis-2:
AttachObject FlowSht.1/StreamObject.400(1) :CompList.300 FluidPkgMgr.300/CompList.300(Basis-2) NoCreate
The code to perform this action is as follows:
'Assume that this object has been assigned a value
Dim stream As HYSYS.ProcessStream
'Cast the HYSYS.ProcessStream object into a HYSYS.BackDoor object
Dim streamBd As HYSYS.BackDoor
Set streamBd = stream
'Access the fluid package attachment of the stream
Dim attachedFp As HYSYS.ObjectVariable
Set attachedFp = streamBd.BackDoorVariable(":CompList.300").Variable
'Create a HYSYS.FluidPackage object (note that Basis-2 must already be present in the Properties Environment)
Dim newFp As HYSYS.FluidPackage
Set newFp = hyCase.BasisManager.FluidPackages.Item("Basis-2")
'Set the fluid package of the stream to Basis-2
attachedFp.Object = newFp
Keywords: HYSYS, Automation, COM, ActiveX, fluid package References: None
Problem Statement: The Simulation Importer is not listed under ABE | Explorer | Tools. How can I resolve this issue?
Solution: If the Simulation Importer is not listed under the Tools menu, then it must be added to ABE | Explorer. To do so, follow the next steps: 1) Go to Tools | Customize. 2) On the Customize Tools dialog, click on the Add button and then browse to the following location: C:\Program Files\AspenTech\Basic Engineering VX.X\UserServices\bin\AZSimImporter.exe (‘X.X’ refers to the ABE version you are using (i.e. V9.0, V8.8, etc.)). 3) Back on the Customize Tools dialog, make sure ‘AZSimImporter’ is listed under Menu contents. Then, left-click on it and select ‘Workspace Moniker’ for the Arguments field. 4) The Customize Tools dialog should look like this: 5) Refresh the Workspace and verify that the AZSimImporter does appear now listed under the Tools menu. If it does, then it will be possible to successfully launch the Simulation Importer. Keywords: Simulation Importer, AZSimImporter, Explorer, Tools. References: None
Problem Statement: How can a user create additional reports for rate-based column calculations?
Solution: Rate-Based Technology is a powerful tool for analyzing the performance of a column with a more rigorous approach. Users can create additional reports for the calculation besides the normal Profile and Results. To generate those reports, go to the Rate-Based Modeling folder in the RadFrac unit operation and select the Rate-based Report sheet. You can then select Property Options and Efficiency Options. For Property Options, one can select interfacial compositions, interfacial area, binary diffusion coefficients, heat transfer rates/coefficients, mass transfer rates/coefficients, bubble and dew points, reaction rates, and scalar dimensionless numbers. For Efficiency Options, one can select Murphree Efficiencies and Tray Efficiencies. After checking the options and rerunning the simulation, the data appears as interface profiles, transfer coefficients, dimensionless numbers, efficiencies, and HETP. These are special calculation results for rate-based modeling.
Keywords: Rate-Based Column, Reports, interface profiles, transfer coefficients, dimensionless numbers, efficiencies, HETP References: None
Problem Statement: The MS Excel xlGetFilePathName function is declared as a variant, even though the result is almost always a string. [This function is almost always used to find the name of the Aspen Plus model to open]. Is there a way to always use a string variable for the result of the xlGetFilePathName function?
Solution: The reason the xlGetFilePathName was defined as a variant is that the end-user can cause two types of results - a string variable by selecting a file from the open dialogue, OR a value of FALSE (numeric = 0) by hitting the CANCEL button. So, by default, Microsoft set this function as a VARIANT to handle either a string (the filename and path) OR a False/numeric result. Tom Miller of UNDEO Nalco shares the following tip: The function xlGetFilePathName will work just fine if the variable lv_FilePathName is a string and the returned type of the function is also a string. A little more bulletproofing changes the statement in the calling program from:
If lv_FilePathName <> False Then
to either:
If UCase(lv_FilePathName) <> "FALSE" Then ''ALTERNATE ONE
or:
If InStr(lv_FilePathName, ".") > 0 Then ''ALTERNATE TWO
In ALTERNATE ONE, we convert the retrieved filename to upper case using the UCASE intrinsic (because a false value is usually returned as "False" or "false"). The IF statement checks whether the xlGetFilePathName result equals the string "FALSE". In ALTERNATE TWO, we know that a path/filename should contain a . in the filename (i.e. c:\Temp\ActiveX.Bkp). If the value is False or zero, there will be no period. The InStr intrinsic is used to scan the returned string to see if it contains a period (.). If it does, the result of this function is the column number in the string where the period was found. We don't have to know the exact location of the period for this check, only that if a period is found the value of the InStr intrinsic will be greater than zero.
Keywords: References: None
Problem Statement: Users should be able to access Aspen InfoPlus.21 (IP.21) from any client application if the security role that they belong to permits. However, some users cannot access IP.21 from any client application - even though they are a member of an Administrator role and have permissions assigned appropriately. For example: The IP.21 Administrator tools gives a User access permission... type error. Aspen Process Explorer does not show any data. Aspen SQLplus gives an Error reading definition record error. The Aspen Tag Browser does not return any tags. All applications behave as if the user does not have any privileges.
Solution: Sometimes the IP.21 cache file does not get updated with the current Aspen AFW Security Manager information. Therefore, though the user's role indicates that the user has permission to access the database, the IP.21 cache file is not aware of these permissions. To verify, log into the IP.21 machine as the account that runs the Aspen InfoPlus.21 Task Service in services.msc. Go into the IP.21 Manager to the menu Actions -> User Roles. Check for the specific user in the InfoPlus.21 Users section. Does this user show a red head icon or a green head icon? If there is a red head icon, this indicates that the user does not have permission to access the database. To force the local cache to get recreated, stop the Aspen InfoPlus.21 Task Service in the Control Panel | Services. Delete the IP21AFWCACHE.DAT file from the group200 folder usually located at: %ProgramData%\AspenTech\InfoPlus.21\db21\group200\ Restart the Aspen InfoPlus.21 Task Service. A new cache file should be generated with the current role information. Now, the user should be listed in IP.21 Manager's menu Actions -> User Roles with a green head icon. Keywords: User access permission Error reading definition record Security References: None
Problem Statement: This is a simplified version of a real application for reactor modelling with ACM. A structure is used to store the kinetic data. Structure KinDataStructure K as realvariable (fixed, 42); End A reaction model is then used for the calculation of reaction rates: Model reaction kin as external KinDataStructure; r as realvariable; r = kin.K; End and finally, the reactor model is: Model reactor n as integerparameter (10); reac([1:n]) as reaction; r as realvariable; r = sigma(reac.r); End How can one ensure that each instance of the reaction submodel in reactor uses the very same structure, and can be specified just once instead of having to specify it for each instance of reaction individually?
Solution: To provide the same reference in sub-models as in the parent model declare a structure reference in the sub-model, and give the parent reference as an argument to the sub-model instance. For this specific example: two changes are required in the reactor model, as shown below. Model reactor kin as external KinDataStructure; // change 1 n as integerparameter (10); reac([1:n]) as reaction (kin: kin); // change 2 r as realvariable; r = sigma(reac.r); End Keywords: structure, submodel, model References: None
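The same idea can be sketched outside ACM. In this hypothetical Python analogy (class and attribute names invented to mirror the ACM example), the parent passes its single KinDataStructure object down to every Reaction instance, so all submodels read the very same K:

```python
class KinDataStructure:
    """Analogy of the ACM structure holding the kinetic data."""
    def __init__(self, K=42):
        self.K = K

class Reaction:
    """Analogy of the reaction submodel: holds an external reference."""
    def __init__(self, kin):
        self.kin = kin  # a reference to the shared structure, not a copy

    @property
    def r(self):
        return self.kin.K

class Reactor:
    """Analogy of the reactor model: one structure shared by n submodels."""
    def __init__(self, n=10):
        self.kin = KinDataStructure()
        # every Reaction instance receives the same KinDataStructure object,
        # mirroring 'reac([1:n]) as reaction (kin: kin)' in the ACM model
        self.reac = [Reaction(self.kin) for _ in range(n)]

    @property
    def r(self):
        return sum(x.r for x in self.reac)
```

Because a reference (not a copy) is shared, changing kin.K once is immediately seen by every reaction instance - the Python counterpart of declaring the structure external in both the parent and the submodel.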
Problem Statement: I want to increase the length of my installation piping and I want to know if it will increase the fittings quantity.
Solution: Pipe length adjustment will not affect the fittings count of your pipe. This option will only increase or decrease the length of your piping as described in KB number: 000044676 Fittings in the equipment installation piping are preset and defined with the volumetric model or in P&ID diagram. Changing the length of the pipe does not change the default fittings specifications. Keywords: Pipe, length, adjustment, volumetric, model, envelope References: None
Problem Statement: With the Process Data Add-ins, when referencing a tag name that could also be interpreted as an Excel cell reference (eg. AB123), the following error would be generated: COM Add-in - Error:Tag name is either empty or referencing empty cell(s). Legacy Add-in - #ERROR 50110: Invalid Tags The above example assumes that the referenced cell (cell AB123) is in fact empty. The scope of this problem increased with the arrival of Excel 2007. In this version, Microsoft increased the 256 column limit within Excel worksheets so that now three characters (up to XFD) can potentially be used to identify a column rather than a maximum of two characters (up to IV) in the days of Excel 2003. A consequence of this change is that a tag name which was correctly treated as such in Excel 2003 may now be treated as a cell reference on upgrading to Office 2007. eg. IW1 is not a cell reference in Excel 2003 but it is in Excel 2007. When using the Aspen Process Data Add-in, a user may define the tag being analyzed in various ways: Type the tag name directly into the Process Data Add-in GUI. Type the tag name into an Excel cell and then within the GUI use the cell reference pointer to identify the location of the tag name. COM Add-in - Use the Aspen Tag Browser to drag and drop the tag name into an Excel cell and then reference the cell in the Add-in GUI’s tag field. Legacy Add-in - Use the Aspen Tag Browser to drag and drop the tag name directly into the Process Data dialog box. Only by typing the tag name into the Add-in GUI should you ever experience a valid tag name being incorrectly interpreted as an Excel cell reference, the other methods are considered safe.
Solution:
Solution 1: Adopt the approach used by the Tag Browser drag and drop method and type double quotes around the tag name; this forces Excel to view the tag as a name label rather than a cell reference. Note, it is important to use double quotes rather than single quotes around a tag name, since single quotes would be treated as part of the tag name and the formula is likely to generate the following error:
COM Add-in - Error:(<tag name>) Tag Name <tag name> is invalid
Legacy Add-in - #ERROR 50102: Bad tag '<tag name>'
Solution 2: It is worth remembering that tag names can be separated with space characters and still be evaluated as the same name. You can use this fact to prevent a tag ever being treated as a cell reference. eg. AB 123 would be a valid alternative tag name to type into the Add-in GUI (instead of AB123). Within Excel, AB 123 would not be confused with a cell reference and you should receive the results you originally expected. Keywords: None References: None
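To make the ambiguity concrete, here is a small hypothetical Python sketch (function names invented) that mimics how Excel decides whether a token such as AB123 or IW1 is a cell reference, using the column limits mentioned above (256 columns, ending at IV, in Excel 2003; 16384 columns, ending at XFD, in Excel 2007). Row limits are ignored for simplicity:

```python
import re

def excel_max_column(version):
    # Last usable column per version, per the article:
    # IV = 256 columns (Excel 2003), XFD = 16384 columns (Excel 2007+)
    return {"2003": 256, "2007": 16384}[version]

def col_letters_to_number(letters):
    # Convert a column name like "AB" to its 1-based column number
    n = 0
    for ch in letters.upper():
        n = n * 26 + (ord(ch) - ord("A") + 1)
    return n

def looks_like_cell_reference(token, version):
    # A token such as "AB123" is treated as a cell reference only if
    # its letter part maps to a column within the version's limit
    m = re.fullmatch(r"([A-Za-z]{1,3})([1-9][0-9]*)", token)
    if not m:
        return False
    return col_letters_to_number(m.group(1)) <= excel_max_column(version)
```

The same check shows why Solution 2 works: a space inside the name (AB 123) fails the letters-then-digits pattern, so Excel can never mistake it for a cell. Likewise, IW1 is a plain tag name under the 2003 limit but becomes a cell reference under the 2007 limit.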
Problem Statement: Is pipe envelope useful to increase or decrease the length of my equipment installation piping?
Solution: Pipe envelope is an option the software offers in the Area piping specs, to specify a tight arrangement in the equipment installation piping and adjust the pipe length calculated by the volumetric model. This specification will only be used to shorten the calculated length and not to increase it. Should you wish to increase the pipe length calculated you must do it with the pipe length adjustment (KB article: 000044676) Default loose arrangement Tight arrangement with pipe envelope Note: This description is only valid if the default volumetric model is applied. If the user has toggled off the volumetric model for the equipment / project, user must manually enter the lines and their length. If you would like further clarification on this, please contact Aspen Tech Support. Keywords: Pipe, length, volumetric, model, envelope References: None
Problem Statement: How to access the rigorous Coil Wound Heat Exchanger model within Aspen HYSYS?
Solution: The Aspen Coil Wound Heat Exchanger (CWHE) is the newest addition to the Aspen Exchanger Design & Rating (EDR) family, available since the V9.1 release. Like other Aspen EDR programs, the Aspen Coil Wound Heat Exchanger can be integrated in Aspen HYSYS simulations. We use the Aspen HYSYS LNG block to model the CWHE, since the operation is similar in principle to that of a Plate Fin exchanger (except that there are several different fluids in the CWHE tubes). To access the rigorous CWHE EDR model within Aspen HYSYS, you must have installed at least Cumulative Patch 1 for Aspen HYSYS V9. Then follow this procedure: Open the LNG block, and change the Rating Method to EDR – Coil Wound (Design | Parameters (SS) | Exchanger Parameters). The EDR CoilWound tab presents a basic set of input/output data. For more specialized operation, you can click the View EDR Browser button to launch the EDR CoilWound Browser from within the LNG page. More information on the CWHE simple rating method can be found in ArticleID 000031568 How do I model a Wound Coil Exchanger using Aspen HYSYS? Keywords: Coil Wound Heat Exchanger, CWHE, Activated EDR. References: None
Problem Statement: How can I use automation to add an internal stream to an Aspen HYSYS column?
Solution: The following VBA code will add an internal stream to a column in Aspen HYSYS. The internal streams will be named testStrm - #, with # representing the number of internal streams that are currently present in the column. Creating an internal stream requires the use of several BackDoor monikers.
'Assume that this ColumnOp object has been assigned a value
Dim col As HYSYS.ColumnOp
'Link to the corresponding column subflowsheet
Dim colFs As HYSYS.ColumnFlowsheet
Set colFs = col.ColumnFlowsheet
'Cast the column subflowsheet into a HYSYS.BackDoor object
Dim colFsBd As HYSYS.BackDoor
Set colFsBd = colFs
'Count the current number of internal streams
'The moniker "ColumnIntrnlStream.500.[]" returns the entire array of internal streams as an ObjectFlexVariable
Dim numStrms As Integer
numStrms = 1 + UBound(colFsBd.BackDoorVariable("ColumnIntrnlStream.500.[]").Variable.Values)
'Create a ColumnIntStream inside the column subflowsheet
Dim strmName As String
strmName = "testStrm - " & numStrms 'Gives a unique name to each internal stream being added
Dim newIntStrm As ProcessStream
Set newIntStrm = colFs.Streams.Add(strmName, "ColumnIntStream")
'Add a new internal stream, same as clicking on Add from Flowsheet | Internal Streams
colFsBd.SendBackDoorMessage "CreateInternalStream"
'Access the internal stream connection
Dim intStrm As ObjectVariable
Set intStrm = colFsBd.BackDoorVariable("ColumnIntrnlStream.500." & numStrms & ":ColumnIntStream.300").Variable
'Note that "ColumnIntrnlStream.500.0" refers to the first internal stream,
'"ColumnIntrnlStream.500.1" to the second, "ColumnIntrnlStream.500.2" to the third, etc.
'The code accesses the last stream created, whose index is stored in numStrms
'Set the stream connection
intStrm.Object = newIntStrm
'Choose a stage to connect to the internal stream
Dim stageNum As Integer
stageNum = 1
'Numberings for stageNum:
'0 for the Condenser
'1 - n for #_MainTower
'n + 1 for the Reboiler
Dim stageOp As SeparationStage
Set stageOp = colFs.SeparationColumnStages.Item(stageNum).SeparationStage
'Assign stageOp to the internal stream
Dim strmStage As ObjectVariable
Set strmStage = colFsBd.BackDoorVariable("ColumnIntrnlStream.500." & numStrms & ":SepStage.300").Variable
strmStage.Object = stageOp
'Set the Draw phase type
Dim phaseTypeVar As RealVariable
Set phaseTypeVar = colFsBd.BackDoorVariable("ColumnIntrnlStream.500." & numStrms & ":ExtraData.302").Variable
phaseTypeVar.Value = 1 '1 for vapor, 2 for liquid, 3 for aqueous
'Set the Export checkbox
Dim exportVar As RealVariable
Set exportVar = colFsBd.BackDoorVariable("ColumnIntrnlStream.500." & numStrms & ":Boolean.301").Variable
exportVar.Value = 1 '0 for false, 1 for true
Keywords: HYSYS, Automation, COM, ActiveX, column internal stream References: None
Problem Statement: Which volumetric unit set corresponds to which volumetric variable on Aspen HYSYS?
Solution: When you attempt to change the units of volumetric variables, you will notice that the names assigned in the unit set are different from the name of the variables shown in the HYSYS objects (streams, operations, tables, etc.). The following table states the relationship between the name designated to unit set variables and the stream variables names: Keywords: Volumetric Flow, Unit Set, Variable Name, Units. References: None
Problem Statement: In Aspen Petroleum Scheduler, how do I resolve “Failed to create the temporary directories. Please specify a different Local working directory”, which results in APS crashing?
Solution: The Local Working Directory path can be set from the APS model Settings option. This is the location where the files associated with the current model are saved. To change the working directory:
1. From the menu bar, click View | Settings to display the Settings dialog box, and then click the User Settings tab.
2. Alternatively, click on the toolbar and, in the Local Working Directory field, enter the desired location.
In case APS crashes without allowing you to set the working directory from the GUI, giving an error like the one in the problem description, try the following registry change to resolve the problem:
1. Open Regedit.exe (Registry Editor) on the machine where this problem is encountered and APS is installed.
2. Navigate to the following registry entry: HKEY_CURRENT_USER\Software\ASPENTECH\Orion Scheduling\Settings
3. Delete the value specified for “Working Directory” on the right-hand side of the pane and leave it blank.
4. Once done, launch APS again. It will ask you to specify the working directory. Specify the working folder as per your requirement.
You should now be able to specify the working directory for APS, which resolves the error; APS no longer crashes.
Keywords: APS, Crash, Working directory, stopped working References: None
Problem Statement: How do I modify the fluid package associated to an existing Petroleum Assay?
Solution: When you add a new Petroleum Assay, it is always associated with the default fluid package (i.e. Basis-1). If you want to associate an existing assay with a different fluid package, go to the Petroleum Assay folder (under the Properties environment) and open the drop-down list in the Fluid Package column. Remember to re-characterize the assay so HYSYS can properly update the assay’s properties with the new component list. Keywords: Petroleum Assay, Fluid Package, Component List References: None
Problem Statement: How is the Energy Balance performed for a Heat Exchanger unit operation in Aspen HYSYS?
Solution: Aspen HYSYS does not use Q = m·Cp·ΔT for the duty calculations, since the Cp value is not constant during the heat transfer process, and moreover the involved material streams may be at a vapor-liquid mixture condition or undergo a phase change. That is the reason why, using the equation Q = m·Cp·ΔT, users will not match the results reported by Aspen HYSYS. The heat duty is calculated based on the following energy balance equation (for steady state):
M_hot · (H_in − H_out)_hot − Q_loss = M_cold · (H_out − H_in)_cold − Q_leak
together with the rating equation:
Q = U · A · ΔT_ML · Ft
Where:
M = Fluid mass flow rate
H = Enthalpy
Qleak = Heat leak
Qloss = Heat loss
hot and cold = hot and cold fluids
in and out = inlet and outlet stream
Q = Heat transferred between shell and tube sides
U = Overall heat transfer coefficient
ΔTML = Log mean temperature difference (LMTD)
Ft = LMTD correction factor
For the simple heat exchanger models where there is a single unknown and the energy balance is the only constraint to solve the exchanger (for example, where three temperatures and both flow rates are known and it is necessary to solve for temperature/enthalpy in the one unknown stream), the energy balance calculation performed by Aspen HYSYS follows these steps:
1) First, the enthalpy balance is done on the side where it is possible (either shell or tube side), thanks to the user specification. Note that the pressure drop across the unit has a strong influence (more than the Cp variation), especially in cases where almost all the process fluid is gas.
2) Second, it meets the enthalpy balance on the other side.
3) Finally, the rest of the properties are calculated using the thermodynamic package (i.e. the Property Package). This way it is not necessary to account for the property changes along the process, and knowing the enthalpy balance on one side makes it possible to calculate the other side.
Aspen HYSYS works on a molar basis; therefore, once the stream conditions are specified, the rest of the properties are calculated via the thermodynamic package, and mass properties (such as Cp and enthalpy) are calculated from the molar properties using the molecular weight. Now, for cases with multiple unknowns (for example, where both outlet temperatures are unknown, but there is a UA specification), an iterative calculation process is performed to find the solution that meets both the UA and energy balance constraints, but the energy balance calculation is still the same. The Dynamic Rating and Rigorous Shell&Tube (EDR) models have their own solving procedures, but the energy balance should still be calculated using the same energy balance equation.
Keywords: Energy Balance, Heat Duty, Enthalpy, Heat Exchanger, Heat Capacity. References: None
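As a numeric illustration of the steady-state enthalpy balance described above (all flows, enthalpies, and loss terms here are hypothetical round numbers, not HYSYS output):

```python
def duty_hot(M, H_in, H_out, Q_loss=0.0):
    # Heat released by the hot fluid, less heat lost to the surroundings:
    # M_hot * (H_in - H_out)_hot - Q_loss
    return M * (H_in - H_out) - Q_loss

def duty_cold(M, H_in, H_out, Q_leak=0.0):
    # Heat absorbed by the cold fluid, less any heat leak into it:
    # M_cold * (H_out - H_in)_cold - Q_leak
    return M * (H_out - H_in) - Q_leak

# Hot side: 10 kg/s cooling from 500 to 400 kJ/kg with a 50 kW heat loss
q_hot = duty_hot(10.0, 500.0, 400.0, Q_loss=50.0)
# Cold side: 19 kg/s heating from 250 to 300 kJ/kg, no heat leak
q_cold = duty_cold(19.0, 250.0, 300.0)
# At a converged steady-state solution the two duties agree
balance_error = q_cold - q_hot
```

Note that Cp never appears: the balance is written entirely in terms of stream enthalpies from the property package, which is exactly why Q = m·Cp·ΔT is not needed and would not reproduce the reported duty when Cp varies or the fluid changes phase.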
Problem Statement: How does the Sizeable option affect my pipe calculations in Aspen Flare System Analyzer?
Solution: In the pipe specification tab, you will see the Sizeable option, which affects the calculations depending on the calculation mode that you are running. There are three calculation modes in Aspen Flare System Analyzer (AFSA): Design, Rating and Debottlenecking. The key difference between these modes is that, in Design mode, AFSA will change the specified pipe diameters such that the design constraints are satisfied (the diameter change can be an increase or a decrease). The design constraints are defined on the Constraints tab of the Scenario Editor. Debottlenecking mode is similar to Design mode except that pipe diameters will only ever be increased (not decreased). Hence, if you do not violate your design constraints, there is really no need to run your case in Debottlenecking mode. You might want to tune your model manually to match the actual plant situation. Note that, even in Design/Debottlenecking mode, only pipes that are tagged Sizeable (on the Pipe Editor | Dimensions tab) will be modified. To summarize:
Rating - performs no diameter changes
Design - increases or decreases the diameters of sizeable pipes in order to satisfy the design constraints
Debottlenecking - only increases the diameters of sizeable pipes in order to satisfy the design constraints
Keywords: Sizeable, Pipe, Rating, Design, Debottleneck References: None
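As a rough illustration of the mode behavior described above, here is a toy sketch (plain Python, not AFSA logic; the function and parameter names are invented for illustration):

```python
# Toy sketch of how the three calculation modes treat a pipe diameter.
# "required" stands for the diameter the solver would need in order to
# satisfy the design constraints (illustrative only, not AFSA's solver).

def adjust_diameter(mode, sizeable, current, required):
    if not sizeable or mode == "rating":
        return current              # Rating / non-sizeable: never resized
    if mode == "design":
        return required             # Design: may increase or decrease
    return max(current, required)   # Debottlenecking: only ever increased
```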
Problem Statement: Is it possible to use Aspen SQLplus to access those fields?
Solution: The following queries can be used to access those fields. In these examples IP_AnalogDef is used (please substitute the name of the appropriate definition accordingly): select * from definitiondef where name like 'ip_analogdef' select * from definitiondef.1 where name like 'ip_analogdef' The .1 after definitiondef above designates that the query should access the first repeat area. Note that it is also possible to request specific fields rather than do a select *: select field_name_record, field_data_type, field_length from definitiondef.1 where name like 'ip_analogdef' select * from definitiondef.2 where name like 'ip_analogdef' The .2 after definitiondef above designates that the query should access the second repeat area. Keywords: DEFINED_REC DETAIL_DISPLAY_REC DISK_HISTORY_RECORD EXTERNAL_TASK_RECORD MAP_RECORD REC_ACTIVATION_CODE RECORD_FIXED_LENGTH ALIAS_FIELD_PNTR #_OF_REPEAT_AREAS #_OF_FIELDS_IN_REC MAX_FIELD_POINTER FIELD_NAME_RECORD REPEAT_AREA_INDEX FIELD_DATA_TYPE FIELD_LENGTH FIELD_LOCATION FIELD_BIT_POSITION FIELD_FORMAT_RECORD SEARCH_KEY_RECORD FIELD_CHANGEABILITY OP_CHANGE_REF_FIELD OP_CHANGE_REF_VALUES SUMMARY_LINE_RECORD PROCESS_FIELD_PNTR HISTORY_FIELD_PNTR LIMIT_FIELD_POINTER LIMIT_NUMBER MAXIMUM_LIMIT_STATE BI-DIRECTION_LIMITS? LIMIT_DEADBAND? ACTIVATION_CRITERIA ACTIVATION_TYPE PROCESSING_RECORD QUALITY_STATUS_FIELD TIMESTAMP_FIELD AUXILIARY_FIELD_PNTR FIELD_WRITE_LEVEL AUDIT_PROPERTY INCL_IN_TAG_CREATION PUBLISH_UPDATES References: None
Problem Statement: On your Aspen InfoPlus.21 server you may notice increased CPU usage for the tsk_server.exe process. In fact, even with InfoPlus.21 in a stopped state, CPU usage continues unchanged.
Solution: Tsk_server.exe is the executable for the Aspen InfoPlus.21 Task Service. One of its tasks is to verify InfoPlus.21 database security, accessing the IP21AFWCACHE.dat file. The IP21AFWCACHE.dat file can become corrupted, and this can cause the increased CPU usage. Below are the steps to correct this:
1. Stop the InfoPlus.21 database.
2. Stop the Aspen InfoPlus.21 Task Service in services.msc.
3. Delete the IP21AFWCACHE.dat file from the ...\AspenTech\InfoPlus.21\db21\Group200 folder.
4. Restart the Aspen InfoPlus.21 Task Service.
5. Restart the InfoPlus.21 database. It may already have been restarted by the service if the Start@Boot option is ticked in the InfoPlus.21 Manager.
If you notice that the CPU usage is still high, then check the AFW or local security with these steps:
1. Delete the AFW (Aspen Framework) or ALS (Aspen Local Security) cache and restart InfoPlus.21 from the InfoPlus.21 Manager. Please see article 000019056 for the AFW/ALS cache files: https://esupport.aspentech.com/S_Article?id=000019056
2. Check the connection status if you are using an AFW server such as MS-SQL Server or Oracle. If you have high ping times and/or packet loss, the Task Service can encounter delays or errors when trying to authenticate.
3. Check the MS-SQL Server or Oracle database for deadlocking conditions. If the database is in a busy or crashed state, this can be an issue for tsk_server.exe. Refer to your Database Administrator for further testing.
Keywords: None References: None
Problem Statement: Which model is being used when a method field is set with 'Model Default'?
Solution: When a field is set with ‘Model Default’, it means that the pipe, relief valve or control valve will use the model selected for such calculation method. This can be set up on the Calculation Settings Editor. Under the Methods tab, the specified models will be used for all unit operations in the flowsheet unless a different model is specified for a single unit operation. For example, if the user specifies on the Calculation Settings Editor | Methods tab the ‘Beggs & Brill Homog’ pressure drop correlation for the three pipe orientations (horizontal, inclined and vertical), then when a pipe segment uses the ‘Model Default’ for pressure drop calculations, it will be using the ‘Beggs & Brill Homog’ correlation. Keywords: Model Default, Calculation Settings, Options, Methods. References: None
Problem Statement: How does Aspen HYSYS predict weeping in Column Internals Analysis tool?
Solution: One of the factors affecting distillation column operation is weeping, caused by low vapour flow: the pressure exerted by the vapour is insufficient to hold up the liquid on the tray, so liquid starts to leak through the perforations. Aspen HYSYS has the capability to predict this phenomenon using the Column Hydraulic Internals tool in V9.0. The Hydraulic Plots will show the problematic areas in red. If you review the plots by tray, you will be able to see more details, with the error message indicating that weeping was detected. The program takes as reference the book Distillation Design by Henry Kister (ISBN 0-07-034909-6) to predict weeping; specifically, look for the weeping methods of Hsieh & McNulty, and Lockett. The basic correlations involved, as used by Wallis (Wallis, G.B., One-Dimensional Two-Phase Flow, p. 339, McGraw-Hill, New York (1969)), together with the equation for the calculation of the characteristic length Z, are not reproduced here; please find them in the attached paper with the above reference for further details. Keywords: Weeping, Column, Internals, Error, Hydraulic Plot, Trays. References: None
Problem Statement: When using the pseudo table, ALL_RECORDS, you can only select those columns that every record in the database contains, those being: NAME RECID DEFINITION USABLE ALL_RECORDS can still be used when requiring more useful reporting information for Aspen Infoplus21, for instance a report that will provide all saved tag records for definition records that are IP_AnalogDef or IP_DiscreteDef.
Solution: Use record indirection via the name field on the ALL_RECORDS pseudo table. This allows you to select any fixed-area field of any record in the database. If a record does not contain the appropriate field, SQLplus returns a null value, which shows up by default as a blank or space. (Be aware that SQLplus retrieves the field number of the indirected field at parse time and uses that instead of the textual name supplied in the query; this is an optimisation step. This means that records containing fields using the same field number, possibly with different field names, will also appear in the results.) An example query obtaining the desired data mentioned above would look like this:

SELECT NAME, NAME->IP_DESCRIPTION, NAME->IP_VALUE, DEFINITION
FROM ALL_RECORDS
WHERE DEFINITION in ('IP_AnalogDef', 'IP_DiscreteDef');

The output contains the columns NAME, NAME->IP_DESCRIPTION, NAME->IP_VALUE and DEFINITION. Keywords: None References: None
Problem Statement: Which is the best property method to use in a system containing acetic acid and water in liquid and vapor phases, taking into account the dimerization of acetic acid in the vapor phase?
Solution: For selecting a suitable property method for this system, we suggest using the Method Assistant, which one can open from the Home ribbon: Tools | Method Assistant. Please see the screenshot below. Through the Method Assistant, one should select the proper options:
From Getting started, select Specify component type
From Component type, select Special (water only, amines, sour water, carboxylic acid, HF, electrolyte)
From Special Components, select carboxylic acids (such as acetic acid) in the mixture
With this selection, one should arrive at the Method Assistant page below. Then, in order to take into consideration the acetic acid dimerization in the vapor phase, one should use an activity coefficient method with the Nothnagel or Hayden-O'Connell model, so suitable methods are NRTL-HOC or WILS-NTH. These property methods are recommended for systems that strongly associate in the vapor phase, such as organic acids (acetic acid). These systems are known to form dimers, which affect both VLE and enthalpy.
The NRTL-HOC property method uses: the NRTL activity coefficient model for the liquid phase; the Hayden-O'Connell equation of state for the vapor phase, incorporating chemical theory of dimerization; the Rackett model for liquid molar volume; Henry's law for supercritical components.
The WILS-NTH property method uses: the Wilson activity coefficient model for the liquid phase; the Nothnagel equation of state for the vapor phase, incorporating chemical theory of dimerization; the Rackett model for liquid molar volume; Henry's law for supercritical components.
For details, see Physical Property Methods, Chapter 2. Keywords: acetic acid, CH3COOH, carboxylic acid, dimers, Method Assistant References: None
Problem Statement: When using VBA/Excel to operate Aspen Plus, occasionally a fatal error in the spawned Aspen Plus session will cause the VBA code to crash also. Is there a way to code the VBA procedures to recover when this occurs?
Solution: Visual Basic and VBA have an 'On Error' handler that recovers from most types of errors. It is a good idea to trap for errors whenever VBA or VB launches another process, such as Aspen Plus. The On Error handler can be enabled before the new process is launched, and then disabled as soon as the process is successfully started and control is returned to the calling subroutine. In the example below, no method exists to check for the availability of an additional license from the license manager. So, an attempt to open an Aspen Plus session is made. If it fails to open, we can assume it is because no licenses are available. We use the VBA On Error handler to see if an error occurred as we tried to spawn a new Aspen Plus session. Example: Add an On Error statement before the InitFromArchive2 method call:

'Enable the On-Error checking
On Error GoTo NoLic

'Code to attempt opening a model in Aspen Plus
'Note: the code will cause a VBA error if Aspen Plus fails to open the model
Set go_Simulation = CreateObject("Apwn.Document")
Call go_Simulation.InitFromArchive2(FilePathName, HostType, NodeName, _
    UserName, Password, WorkingDir)

'Disable error checking
On Error GoTo 0
GoTo GoodRun

NoLic:
'An error occurred in the opening of the Aspen Plus model.
'Translate the error code into a text message and display the text.
Range("A1").Select
MsgBox "Unable to Start Aspen Plus Simulation Engine" & Chr(13) _
    & "Reason: " & Err.Description, vbCritical
Err.Clear
Application.Cursor = xlDefault
Exit Sub

GoodRun:
'No error occurred; proceed as normal.
'Add your code for handling a successful model opening by Aspen Plus below.
End Sub

For more details on the On Error handler, see the VBA online help for the On Error statement, and also the Err object for obtaining details of the error that occurred. The On Error handler does slow the program execution. When the error checking is no longer needed, turn it off with the following command:

On Error GoTo 0

Keywords: VBA, ON-ERROR, error, ActiveX, Visual Basic, COM, Automation References: None
Problem Statement: What is the difference between the RadFrac feed stage conventions and when should each convention be used?
Solution: The feed stage convention allows you to specify how the feed stream is introduced into a column. The feed stream conventions are:

Above-stage - Feed enters between stages: the liquid goes to the designated stage while the vapor goes to the stage above it (default).
On-stage - Both the liquid and vapor phases are introduced on the designated stage; a feed flash is performed only if hydraulic calculations are done or Murphree efficiency is used.
Liquid - Feed enters on the designated stage; the all-liquid feed is never flashed.*
Vapor - Feed enters on the designated stage; the all-vapor feed is never flashed.*
Decanter - Feed enters on the decanter attached to the designated stage (for three-phase calculations only).

* The feed is treated as being entirely in the phase specified. This avoids unnecessary flash calculations when Murphree efficiency or tray/packing sizing/rating calculations are requested.

Equations where feed F could be flashed: f' is the vapor portion of the feed and f'' is the liquid portion of the feed. Doing a material balance around these stages produces these equations:

Above-Stage:
Stage i: f'' + L(i-1) + V(i+1) - L(i) - V(i) = 0
Stage i-1: f' + L(i-2) + V(i) - L(i-1) - V(i-1) = 0

On-Stage:
Stage i: F + L(i-1) + V(i+1) - L(i) - V(i) = 0
Stage i-1: L(i-2) + V(i) - L(i-1) - V(i-1) = 0

Tips: To save time by avoiding flash calculations, make the following specifications:
If you know that your feed is all liquid, then choose On-Stage or Liquid. If the feed is all vapor, choose Vapor.
If there is an all-vapor feed to the bottom of a column, choose Vapor or On-Stage, or the bottom stage will be dry.
You should also specify On-Stage when the feed is a supercritical fluid, because when a feed is made up of supercritical components the flash calculations can be difficult.
Keywords: RadFrac, column, tower, stage, distillation, on-stage, above stage, feed stage References: None
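The stage-i balances above can be checked numerically. A minimal sketch (plain Python with toy flows, not an Aspen Plus calculation):

```python
# Toy check (illustrative numbers only) of the stage-i material balances
# above. A feed F flashes into a vapor portion f' and liquid portion f''.

def residual_above_stage(f_liq, L_im1, V_ip1, L_i, V_i):
    """Above-Stage: only the liquid portion f'' enters stage i."""
    return f_liq + L_im1 + V_ip1 - L_i - V_i

def residual_on_stage(F, L_im1, V_ip1, L_i, V_i):
    """On-Stage: the whole feed F enters stage i."""
    return F + L_im1 + V_ip1 - L_i - V_i

# Toy flows chosen so each balance closes exactly (residual = 0):
r_above = residual_above_stage(60.0, 200.0, 150.0, 260.0, 150.0)
r_on = residual_on_stage(100.0, 200.0, 150.0, 260.0, 190.0)
```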
Problem Statement: What is the bracket option within the secant algorithm?
Solution: The secant method is used for converging single design specifications, and a bracketing option can be selected inside it. To specify this, go to Convergence | Conv Options | Methods. Bracketing is used whenever a function is discontinuous, non-monotonic, or flat over a region; the algorithm goes back to the secant method after eliminating the flat region. The following switch control options are available:
Bracket=No (default) - Bracketing is not used. If bracketing is used, it typically takes more iterations to solve, so the default is to not use it.
Bracket=Yes - When the design spec function is not changing, the function is evaluated at the bounds of the manipulated variable. The interval halving algorithm is used to find the solution. As soon as the function starts to change, the interval halving algorithm is replaced by the standard secant algorithm.
Bracket=Check bounds - Bracketing is tried when the function is not changing or the variable hits its bounds. It is useful whenever a function is flat or non-monotonic. This option makes sure that whenever the secant algorithm is stuck at a variable bound, another variable bound will be tried.
Keywords: Secant, Bracket References: None
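The Bracket=Yes behavior described above (secant steps while the function is changing, interval halving while it is flat) can be sketched as follows; this is a simplified illustration, not the Aspen Plus algorithm:

```python
# Simplified sketch of a bracketed secant search (not Aspen Plus code).
# A sign-change interval [a, b] is maintained; a secant step is taken
# when the last two residuals differ, otherwise (flat function) the
# interval is halved until the function starts to change again.

def solve(f, a, b, tol=1e-9, max_iter=200):
    fa, fb = f(a), f(b)
    assert fa * fb <= 0, "bounds must bracket the root"
    x0, f0 = a, fa
    x1, f1 = b, fb
    for _ in range(max_iter):
        if abs(f1) < tol:
            return x1
        if abs(f1 - f0) > tol:
            x2 = x1 - f1 * (x1 - x0) / (f1 - f0)  # secant step
            if not (a < x2 < b):
                x2 = 0.5 * (a + b)                # step left the bracket
        else:
            x2 = 0.5 * (a + b)                    # flat: interval halving
        f2 = f(x2)
        if fa * f2 <= 0:
            b, fb = x2, f2                        # keep the sign change
        else:
            a, fa = x2, f2
        x0, f0, x1, f1 = x1, f1, x2, f2
    return x1
```

On a flat-then-linear test function the flat region is escaped by halving the bracket, after which the secant step converges in a few iterations.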
Problem Statement: In AORA, When trying to open a report from the AORA reporting tool How to resolve error, SQL_Error error code: S1000 error message: [Microsoft] [ODBC Microsoft Access Driver]? (refer screenshot of the error) The error highlights that the ODBC driver is trying to access a MS Access Model.
Solution: Aspen Operations Accounting Reporter is a standard Windows application that enables you to manage and print multiple reports. It comes complete with Aspen Operations Accounting's set of standard reports that can be easily printed to a preview window, a printer, or an external file format. To select or view reports, you must first open a model (refer to the screenshots for the process). Once the model is opened, you can select from any of the standard report templates available. This issue is encountered when a user tries to open an MS Access model with the Reporting tool: the Advisor Reporting Tool is compatible only with SQL Server based models. To resolve this problem, open a SQL Server based model and then try launching the report template. Once the correct SQL model is opened, the Reporter will allow you to open and use the standard reports. Keywords: AORA Reporter, S1000, SQL_Error, ODBC Microsoft Access Driver, AORA, Advisor References: None
Problem Statement: How do I render all indirect costs to zero in ACCE?
Solution: Users have control over most indirect costs in Aspen Capital Cost Estimator through the contractor form; however, engineering costs are also considered to be indirect costs. Because of this, you must first go to Project Basis View | Engineering Workforce | By Phase and enter the info as shown below:
Engineering workforce number 1
Engineering phase *
Engineering hours 0
Engineering cost 0
Once this is done, go to the contractor form (Project Basis View | Contracts | Contractors | Contractor 1 | Edit) and input zero in the following fields:
Engineering G and A 0
Engineering fee 0
Engineering contingency 0
Material G and A 0
Material fee 0
Construction contingency 0
Handling fee 0
Total indirects cost (USD) 0
All the remaining information fields in both Contractors and Engineering By Phase can be left with the default values. Lastly, right click on the contractor, select Link to Engg.. Work Force, and add workforce number one. Apply all changes and evaluate the project, and you will notice that the only costs reported will be Total Direct Costs. Note: If the goal is to render all indirect costs to zero, there is no need to have a contract structure or multiple engineering workforces. Working with a single contractor and a single engineering work force will make zeroing indirects easier. Keywords: Indirects, Contractor, Zero, Cost References: None
Problem Statement: Is there a recommended Bromine Number for different feed types when no measurements are available?
Solution: Bromine Number is an indicator of the degree of unsaturation of a specific feed, and it is used to calculate olefin content. However, measurements are not always available for this input, so estimated values have to be employed to incorporate them into the characterization process. In general:
- For a virgin feedstock, where olefin content is very low, users can enter 1 as a default Bromine Number.
- For a cracked feedstock (from an FCC, Coker, or Visbreaker), a typical range of Bromine Number would be 20 to 30. The suggestion is to use 25 as input.
Unless a high olefin content is expected (from a Visbreaker product, for example), the impact on results of using estimated Bromine Number values should be minimal. Keywords: Bromine Number, Estimate, Olefin Content. References: None
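The rules of thumb above could be encoded in a small helper like this (hypothetical Python, not part of any Aspen product API; the function name and feed-type keys are invented):

```python
# Hypothetical helper encoding the defaults above (not an Aspen API):
# virgin feeds -> Bromine Number 1; cracked feeds (FCC, Coker,
# Visbreaker) -> 25, the midpoint of the typical 20-30 range.

def default_bromine_number(feed_type):
    defaults = {"virgin": 1.0, "cracked": 25.0}
    return defaults[feed_type.lower()]
```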
Problem Statement: Can aspenONE Process Explorer do filtering? That is, can it conditionally display and trend some data and not others?
Solution: Yes - filtering can be achieved by substituting an Ad-Hoc query in place of the tag name. Once a tag has been added to a plot: Click the check box to the left of the pen color box: This will cause a 'Menu Option Flyout' to appear above the legend: Select the Pencil icon to edit the tag: On the screen that appears switch to the 'Source' option on the left: Which brings up this screen: Then change the tag entry to an Ad-Hoc query which would filter the data appropriately. In this example it will now ONLY plot the value if it is greater than 10. Click the OK button to commit the changes. For illustrative purposes an additional tag (ATCAI) has been added to the plot. On this screen the red line represents the unfiltered values for the record ATCAI while the blue line only plots the filtered values (greater than 10): Keywords: None References: None
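Conceptually, the Ad-Hoc query's condition acts like a filter over the sample series; a rough plain-Python analogy (not aspenONE code, toy data only):

```python
# Rough analogy of what the Ad-Hoc query's condition does to a sample
# series: samples failing the filter are simply not plotted.
values = [4.0, 12.5, 9.9, 15.0, 30.2, 8.1]
samples = [(t, v) for t, v in enumerate(values)]

# Keep only samples with value > 10, as in the example filter above:
plotted = [(t, v) for t, v in samples if v > 10]
```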
Problem Statement: While upgrading/creating the Aspen Production Record Manager (APRM) database using the Aspen Database Wizard, if the wizard fails it is helpful to have a log file for troubleshooting.
Solution: This article explains where to find the log file for the Aspen Database Wizard.
1. In V10 the log file for the Aspen Database Wizard is written to: %ProgramData%\AspenTech\DiagnosticLogs\DatabaseWizard\DatabaseWizard.log
2. In earlier versions the log file is written to: %ProgramFiles%\Common Files\AspenTech Shared\DatabaseWizard -or- %ProgramFiles(x86)%\Common Files\AspenTech Shared\DatabaseWizard
In earlier versions, if this folder does not exist then you will not see a log file generated. Please manually create the folder if it is missing.
Note: The log files exist in the same locations mentioned above when upgrading/creating databases for Aspen Production Execution Manager, Aspen Audit and Compliance Manager, and Aspen Tank and Operations Manager.
Keywords: Upgrade Fails Aspen Production Record Manager APRM Aspen Production Execution Manager Aspen Audit and Compliance Manager Aspen Task and Operations Manager Aspen Batch Extractor References: None
Problem Statement: Is there a way to recover my changes after Aspen Plus crashes?
Solution: When Aspen Plus crashes, the program tries to save a file called $filename$backup.bk$. When Aspen Plus is re-opened, it should check for a file with this type of name, and there will be a dialog asking if you want to restore the previous run. Alternatively, this file can be renamed to BKP and opened. If you are not sure where your working directory is located, you can search for $backup or .bk$. The working directory is generally the folder where the file was opened or the default working directory set on the File menu Options dialog Files sheet. By default, the working directory is C:\ProgramData\AspenTech\Aspen Plus VX.X Keywords: Crash, Recovery File, BK$ References: None
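The search for recovery files described above can be scripted; a small sketch (plain Python; the folder path shown is only an example and is version-dependent):

```python
# Sketch: list Aspen Plus crash-recovery files (*.bk$) in a working
# directory. The directory used below is only an example location.
import glob
import os

def find_recovery_files(workdir):
    """Return any *.bk$ recovery files in workdir, sorted by name."""
    return sorted(glob.glob(os.path.join(workdir, "*.bk$")))

# Example (version-dependent default working directory):
# find_recovery_files(r"C:\ProgramData\AspenTech\Aspen Plus V10.0")
```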
Problem Statement: An operator training simulator (OTS) provides a detailed emulation of the operator interface and behavior of a plant and its control system. It consists of a detailed dynamic model connected to actual hardware or software emulations for the operator stations and control; and instructor functionality to initiate malfunctions and monitor operator performance. Utilizing an OTS provides many benefits across a wide range of applications. Operator efficiency will be improved by empowering them with more knowledge and hands on experience. A better product quality can be achieved by knowing the control parameters. Operators can practice on recovering from various plant scenarios and can gain more familiarity with plant controls. With this increased familiarity, operators are more equipped to handle unforeseen plant events and can achieve a smoother startup and shutdown process. Aspen OTS Framework enables large dynamic simulations to run in real time across multiple models using parallel processing. By utilizing partitions, models with fast and slow dynamics can be separated with varying time steps to ensure optimal performance and accuracy. Integrators between the various models are synchronized and stream data is exchanged between the models. All communication links are based on OPC standards so it is easy to connect and communicate with other OPC compliant devices (e.g. DCS, control emulators, etc.). Models built in HYSYS Dynamics, Aspen Plus Dynamics, and Aspen Customer Modeler are all supported within Aspen OTS Framework.
Solution: This example will leverage a completed model built in Aspen HYSYS Dynamics and OTS Framework for model deployment. Keywords: Dynamic simulation, dynamics, transient, HYSYS, HYSYS Dynamics, OTS Framework, operator training, model deployment, partitions, OPC References: None
Problem Statement: When modelling a reactive distillation column based on a USER-type kinetic subroutine, Aspen Plus may return the following error message: *** SEVERE ERROR WHILE CHECKING INPUT SPECIFICATIONS BLOCK NAME: B1 MODEL NAME: RADFRAC REACTIONS/CHEMISTRY PARAGRAPH: STOIC AND SALT AND DISSOCIATE SENTENCES ARE MISSING.
Solution: The cause of this error message is missing input on the User-Define Stoichiometry sheet. While in principle all input specifications with regard to reactions can be specified within the Fortran subroutine, Aspen Plus expects at least one reaction on this sheet. The advantage of specifying the reaction stoichiometry on this sheet (i.e., inside the Aspen Plus GUI) is that it is visible to any user of the model, regardless of their knowledge of Fortran or the accessibility of the Fortran code. Another advantage is that certain arguments of the subroutine are set and can be used in the Fortran code, such as the array of stoichiometric coefficients (STOIC). If you prefer to specify all relevant information in the actual Fortran code of the kinetic subroutine, then the solution is to enter a dummy reaction. For example, specify the same component twice on the Components Specifications Selection sheet, with different component IDs. To avoid any interference with the reacting system in the column, this can be a component which does not belong to that system at all. For the reaction, specify one of the IDs to be the reactant, while the other acts as the product: 1 KINETIC DUMMY1 --> DUMMY2 The requirement for at least one stoichiometric reaction on the User-Define Stoichiometry sheet is imposed by the input checker of the RadFrac model. In other words, use of the very same reaction ID in a reactor model (e.g., RCSTR, RPLUG) lifts this requirement. Also, the same input requirement (i.e., the need for at least one stoichiometric reaction) is imposed when using a kinetic subroutine of type REAC-DIST. The Reactive Distillation Stoichiometry sheet, however, will be incomplete when no reaction is specified, thereby reflecting the higher input demands of RadFrac compared to reactor models. Keywords: reactive distillation kinetic subroutine user stoichiometry References: None
Problem Statement: The Aspen Properties Excel Calculator gives me an error when I try to open a Property Package. In Excel, when going to the Aspen Properties toolbar menu and choosing Select Property Package to select a property package, there is a VB run-time error, making it unable to proceed. The Visual Basic dialog box reads: Run-time error '-21478467259': Method 'GetPropertyPackage' of object 'IAspnMaterialTemplate' failed The CAPE-OPEN file opens without problems, and I can run Aspen Properties as a standalone application. Is there something else that needs to be set in Excel?
Solution: Sometimes you may need to re-register some DLLs for Aspen Properties, since it uses the Aspen Plus version. To do this, use Start | Programs | AspenTech | Aspen Engineering Suite | Aspen Properties 10.2 | CAPE-OPEN Registry Fixup Utility, then click OK. Keywords: Aspen Properties, Excel calculator, Property Package, Cape-Open References: None
Problem Statement: Why am I getting the following error during project evaluation even though I defined the open steel structure as a project component and provided a structure tag for all process equipment in the same area?
Solution: In Aspen Capital Cost Estimator, when you define the open steel structure as a project component, make sure that it is the first component in the component list of the Area level specification. ACCE evaluates all components in order, and the error comes up when it cannot find the structure referenced by the structure tag of a component hanging onto it. Make sure the open steel structure component is always at the top position when specifying the equipment hung on that structure. Keywords: Open Steel Structure, Structure Tag, Aspen Capital Cost Estimator References: None
Problem Statement: What's the command in Excel VBA to reconcile a stream in Aspen Plus?
Solution: The following command will reconcile stream RECYCLE:
Public go_Simulation As HappLS
go_Simulation.Tree.Data.Streams.Elements("RECYCLE").Reconcile(1)
-OR-
Call go_Simulation.Tree.Data.Streams.Elements("RECYCLE").Reconcile(1)
More information on the Reconcile method can be found in the Object Browser located on the Visual Basic Editor's View pulldown menu (narrow the search to the Happ library). You can also reconcile all of the streams within any hierarchical level, or the top level, in a single command. Following the previous naming convention:
Call go_Simulation.Tree.Data.Streams.Reconcile(HAPP_RECONCILE_INPUT)
-OR-
Call go_Simulation.Tree.Data.Streams.Reconcile(1)
The passed argument for Reconcile, in this case HAPP_RECONCILE_INPUT, can be any one of a number of enumerated constant values of the HAPP_RECONCILE_CODE class, which allows VBA to reconcile the streams in different ways. These constant values are all described in the Visual Basic Editor's Object Browser (under View | Object Browser) for the Happ library. For example, to reconcile the streams in a hierarchy block, extend the command as follows:
go_Simulation.Tree.Data.Blocks.Elements("hierarchy blockname").Elements("Data").Elements("Streams").Reconcile(1)
A mix of HAPP_RECONCILE_CODEs can be used with the Or operator between the different options:
StrmRecDefault = HAPP_RECONCILE_INPUT Or HAPP_RECONCILE_TP Or _
    HAPP_RECONCILE_CF Or HAPP_RECONCILE_MICFMOLE Or _
    HAPP_RECONCILE_MITFMOLE Or HAPP_RECONCILE_CICFMOLE Or _
    HAPP_RECONCILE_CITFMOLE
Call go_Simulation.Tree.Data.Streams.Reconcile(StrmRecDefault Or _
    HAPP_RECONCILE_ONLY Or HAPP_RECONCILE_QUIET)
Keywords: Visual Basic Application, VBA, VB, visual basic, activeX, reconcile References: None
Problem Statement: How does one specify the rate of change of a conventional component attribute within a user kinetic routine?
Solution: If you want to express your rate in terms of attributes/sec for a conventional component attribute, a class 2 attribute (CAUSRA through CAUSRE) must be used. Many users will select a class 1 attribute (CAUSR1 through CAUSR5) since they are listed first in the drop-down list on the Components | Attr-Comps form. The units for class 2 attributes are mass*attribute/time (kg*attribute/sec). The RATCAT vector can be used to specify the rate. All of the attributes for the first substream will be listed first, followed by the next substream, etc. To set the rate of change for the second attribute belonging to the mixed substream for a conventional component, set RATCAT(2). For class 1 attributes, RATCAT will be a chain-rule sum, i.e., kg*d(attribute)/dt + attribute*d(kg)/dt (for kg mass units). Comp-Attr must be set to YES on the Kinetic form inside the reactor block for Aspen Plus to converge the attribute flow based on the RATCAT specification. The attribute can be explicitly set by modifying the SOUT vector instead of specifying the rate of change using the RATCAT vector. Declared attributes for all substreams are appended to the normal stream structure. For example, to set the second attribute associated with the mixed substream, set SOUT(NC+11), since SOUT(NC+9) is the last element in the normal mixed stream structure. Comp-Attr must be set to NO on the Kinetic form inside the reactor block when using this approach. If Comp-Attr is not set appropriately, to NO or YES consistent with the approach used to set the attribute, a material balance convergence error will result. Keywords: Component Attribute Classes Kinetic User Routines RATCAT Attribute Units References: None
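The chain-rule form of RATCAT for class 1 attributes can be illustrated numerically (plain Python with illustrative numbers, not Aspen Plus code):

```python
# Numerical illustration of the chain-rule sum for a class 1 attribute:
#   d(kg * attribute)/dt = kg * d(attribute)/dt + attribute * d(kg)/dt

def ratcat_class1(mass_kg, dattr_dt, attr, dmass_dt):
    """kg * d(attribute)/dt + attribute * d(kg)/dt."""
    return mass_kg * dattr_dt + attr * dmass_dt

# e.g. 100 kg of material, attribute rising at 0.2 attr/s, current
# attribute value 3.0, and mass growing at 1.5 kg/s:
rate = ratcat_class1(100.0, 0.2, 3.0, 1.5)
```

With these illustrative numbers the two terms contribute 20.0 and 4.5, giving a total rate of 24.5 kg*attribute/s.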
Problem Statement: What is the Aspen Plus Summary File Toolkit?
Solution: The summary file toolkit is a set of Fortran subroutines that retrieve results information from the Aspen Plus summary file and backup file. (You can use the backup file with results as well). The subroutines are organized around the logical structure of the data. You can retrieve selected results. Or you can retrieve all the results of a simulation or simulation object (such as unit operation blocks or streams). The Fortran source code for the subroutines is provided so that you can build applications on any computer. The summary file is an ASCII file produced by Aspen Plus for every simulation run. This file contains the summary of simulation results, such as block results, stream values, tray profiles, heating/cooling curves, and property tables. The information in a summary file is also contained within the Aspen Plus backup file. Summary files are named according to the form runid.SUM. There are 4 example files documented in detail in the Summary File Toolkit reference manual, chapter 8. These files are a good starting point to familiarize yourself with the features and functionality of the Summary File Toolkit. The Summary File Toolkit can also be used for retrieving all blocks, property sets and other flowsheet variables and passing them to external programs. http://support.aspentech.com/Public/Documents/Engineering/Aspen%20Plus/2006/AspenPlusSummFileTools2006-Ref.pdf Keywords: None References: None
Problem Statement: Some questions on CAPE OPEN.
Solution: Q1: In the CAPE-OPEN mixnsplit example problem, I specified a different property method on the Property Global tab. How can I retrieve the specified property method name in a CAPE-OPEN compliant unit operation model? Also, how can I say for sure it is using the selected property method rather than some CAPE-OPEN property package? It is not possible to retrieve the name of the property method from the code for a CAPE-OPEN unit operation because the interfaces defined by the standard do not allow it. We agree it is hard to tell whether your property method is being used, but if it is the property method assigned to the block then it should be used. The simplest solution is to make the property method a parameter of the unit operation, which the user can define along with any other configuration data. You could implement it so that validation would fail if the property method had not been specified. The dialog that you implement in your CAPE-OPEN unit operation for configuration can allow the user to specify the property method name along with values of whatever other configuration data a user has to provide to configure the unit. This solution has the advantage that it is very simple. It has the disadvantage that it is divorced from the Aspen Plus defaulting mechanism - in other words you have to specify the property method at the block level. An alternative, which would require development from us, would be to make the property method name available as a property on the material object connected to the ports of your block. Then you would get it by making a GetProp call in the same way that you will do for stream conditions, something like this: ReDim v(0 To 0) As String On Error GoTo GetPropError v = CapeMO.GetProp("PropertyMethod", vbNullString, Empty, vbNullString, vbNullString) This would be an extension to the standard that would not be supported by other CAPE-OPEN software, so relying on it would reduce the compatibility of your component. 
This solution has the advantage that it would work with the Aspen Plus defaulting mechanism. It has the disadvantage that it requires development work from us and could therefore only be available in the 12.1 release. Q2: For a CAPE-OPEN block, we always get Block-Options -> Properties tab and the column Property set 1. Would it be possible to access these variables in Visual Basic? If yes, how? It is not possible for the code for a CAPE-OPEN unit operation to access the name of the property method associated with the block because the CAPE-OPEN standard makes no provision for it. The code for the unit operation can only access Aspen Plus data through CAPE-OPEN interfaces, and the only interface that Aspen Plus provides to the unit is the ICapeThermoMaterialObject interface that the unit uses to access the stream conditions at each of its ports. Q3: Can you access the Material Template object from a Unit Operation? At the moment you can't access the Material Template object from a unit operation. There has been a proposal to extend the standard to allow such access but we have not implemented it. There is an interface which is actually used by our Excel Add-In for Aspen Properties and was not intended to be used by unit operation writers. Keywords: References: None
Problem Statement: When automating an existing report in SQLplus web-based reporting, what can be done to overcome the following error message when clicking OK on the automation screens: (Text of error is 'Failed to save automated report')
Solution: When a report is automated a record defined by SQLReportDef is created behind the scenes and is filled out using the automation details provided on the configuration screens (things like when the report will be run, who it goes to if e-mailed, which printer will be used if printed, etc.). The limit on the length of the name of an SQLReportDef record is 24 characters. If the report name which is being used as the basis for the automation is longer than 24 characters then this message will appear, as the SQLReportDef record will not be created. To fix the issue please save the original report using a name which is less than or equal to 24 characters and proceed through the automation steps again. Keywords: None References: None
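The 24-character limit can be checked before attempting automation. The sketch below is illustrative Python (not part of SQLplus); the limit itself comes from the article.

```python
# SQLReportDef record names are limited to 24 characters, so a report name
# used as the basis for automation must not exceed that length.
SQLREPORTDEF_NAME_LIMIT = 24

def can_automate_report(report_name):
    """Return True if the report name fits in an SQLReportDef record name."""
    return len(report_name) <= SQLREPORTDEF_NAME_LIMIT

print(can_automate_report("Daily_Unit_Summary"))                  # True (18 chars)
print(can_automate_report("Quarterly_Production_Report_Detail"))  # False (34 chars)
```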
Problem Statement: Can I get information on the meaning of the symbols that appear on the Aspen Plus flowsheet or data browser?
Solution: Status indicators display the completion status for the entire simulation as well as for individual forms and sheets. The status indicators appear: Next to sheet names on the tabs of a form As symbols representing forms in the Data Browser menu tree On the flowsheet This table shows the meaning of the symbols that appear: (table of status icons omitted) * The required input complete icons appear on some forms and folders in the Data Browser where no input is required as soon as you enter the form. You can restore these to the no data entered icons by right-clicking the icon and selecting Delete. In this case, the form or folder will not be deleted, but it will be restored to its original status in a blank simulation. Keywords: Status indicators, symbols, data browser, flowsheet etc; References: None
Problem Statement: What is the definition of the total, effective and required area and the difference between U clean, U dirty and U service?
Solution: · The total area of the heat exchanger is given by π * Tube Outer Diameter * Tube Length * Number of Tubes * Number of Shells. · The effective heat transfer area is the area available for heat transfer (this is also often referred to as the actual area). This is the total area minus areas which are not available for heat transfer, which include tube length within tubesheets, tube projections beyond tubesheets and tube length beyond blanking baffles, etc. · The required area is that heat transfer area which would be required in order to achieve the required duty, given the calculated overall coefficients and the temperature difference. U clean: Overall heat transfer coefficient (excluding fouling resistances) predicted by Aspen Shell & Tube based on the fouling resistances specified, and referred to the required area. Please note that the specified fouling resistances may affect the wall surface temperature and therefore the coefficient. U dirty: Overall heat transfer coefficient predicted by Aspen Shell & Tube with fouling resistances included, and referred to the required area. U service: In Checking and Design modes, this is the overall heat transfer coefficient which corresponds to the heat load specified by the user and the effective heat transfer area, i.e. U service = (Specified Heat Load) / (MTD * Effective heat transfer area). In Simulation mode the duty (and outlet temperatures) are determined by Aspen Shell & Tube, therefore the required area is the same as the effective heat transfer area (and actual area). The area ratio is given by: Area ratio = Effective heat transfer area / Required area and also Area ratio = U dirty / U service Keywords: Duty, total area, effective heat transfer area, required area, clean coefficient, dirty coefficient, service coefficient, area ratio References: None
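The definitions above tie together numerically. The sketch below uses made-up exchanger data (the geometry, duty, MTD and dirty coefficient are all hypothetical, not from any Aspen Shell & Tube run); only the formulas come from the article.

```python
import math

def total_area(tube_od, tube_length, n_tubes, n_shells):
    """Total area = pi * tube OD * tube length * number of tubes * number of shells."""
    return math.pi * tube_od * tube_length * n_tubes * n_shells

def u_service(heat_load, mtd, effective_area):
    """U service = specified heat load / (MTD * effective heat transfer area)."""
    return heat_load / (mtd * effective_area)

# Hypothetical single-shell exchanger: 19.05 mm OD tubes, 6 m long, 200 tubes.
a_total = total_area(0.01905, 6.0, 200, 1)   # m2
a_eff = 0.95 * a_total                       # assume 5% lost to tubesheets etc.
u_svc = u_service(500e3, 40.0, a_eff)        # 500 kW duty, 40 K MTD -> W/(m2 K)
u_dirty = 250.0                              # assumed dirty coefficient, W/(m2 K)
area_ratio = u_dirty / u_svc                 # also equals A_effective / A_required
print(round(a_total, 2), round(u_svc, 1), round(area_ratio, 2))  # 71.82 183.2 1.36
```

An area ratio above 1 indicates the exchanger has more surface than the duty requires.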
Problem Statement: Can a Object Linking and Embedding for Process Control Data Access (OPC-DA) server for Aspen InfoPlus.21 (IP.21) be restricted to read only access for all saved records within the database?
Solution: Data Access (DA) servers do not provide additional security functionality for IP.21. They simply provide read or write access for the values being requested from the target source. The DA server does not offer a secure write configuration and does not authenticate the OPC client connection. Preventing writes to the IP.21 database is achieved by securing the database itself from being written to from any external application using the Aspen Framework Security Manager. For more information on Aspen Framework Security Manager, please contact your local support centre. Keywords: OPC DA Read only References: None
Problem Statement: How do I simulate the pressure drop of a fluid flowing in the annulus of 2 pipes?
Solution: The Pipe Segment unit operation is not currently designed to directly model this piping geometry. If the fluid being modeled is single phase, the Hydraulic Diameter could be used (for the Inner Diameter input): DH = Do - Di where DH - Hydraulic Diameter Do - Outer Pipe Diameter Di - Inner Pipe Diameter Keywords: Annulus, hydraulic diameter References: None
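For a concentric annulus, the hydraulic diameter 4*A/P reduces to (Do² − Di²)/(Do + Di) = Do − Di. A minimal sketch, with hypothetical pipe sizes:

```python
# Hydraulic diameter of the annular gap between two concentric pipes:
# D_H = 4*A/P = (Do**2 - Di**2) / (Do + Di) = Do - Di

def hydraulic_diameter(outer_pipe_d, inner_pipe_d):
    """Hydraulic diameter of the annulus, in the same units as the inputs."""
    return outer_pipe_d - inner_pipe_d

# Hypothetical geometry: 0.1023 m outer pipe ID around a 0.0603 m inner pipe OD.
print(round(hydraulic_diameter(0.1023, 0.0603), 4))  # 0.042
```

The result would then be entered as the Inner Diameter of the Pipe Segment, as suggested above.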
Problem Statement: Sometimes it may be necessary to access the database file created for a project to customize reports.
Solution: This document contains information on installing and using SQL Anywhere 5.0 (or 5.5) Runtime. Also contained in this bulletin are step-by-step instructions on making the ODBC connection between an Aspen ICARUS Estimate Results Database (*.IDB file) and Microsoft Access. Section I - Installing SQL Anywhere Aspen ICARUS Project Manager creates an SQL format database when an estimate is run. This database can be used, manipulated and changed using any commercial database package that supports SQL database files. Any changes made to the database would be reflected in reports run from Aspen ICARUS Project Manager prior to re-estimating your project. You can also create customized reports using the database package. SQL Anywhere disks, for WinNT and Win95 only, are attached to this document. If you need them for Windows 2000, please go to the Sybase Web site to download the evaluation version of Sybase's SQL Anywhere Studio 8.0.1 (http://login.sybase.com/detail/1%2C6904%2C1016644%2C00.html). This evaluation version is only good for 60 days. The installation and use of the software will not be supported by Aspen Technology...please use your IT department for assistance or contact Sybase for help. To install SQL Anywhere 5.0 (or 5.5) Runtime: Download the appropriate SQL Anywhere Runtime disks attached to this document to your hard drive. Unzip the file. Click on the Start button, then select Run. Click the BROWSE button, and move to where you unzipped the ZIP file on your hard drive. Check off Install/Reinstall software when prompted. Copy files to C:\SQLANY50 or a directory of your choice. Check off installation of DLLs for version WATCOM SQL 3.2 and 4.0, then click OK. Setup will copy all files to the designated directory. Insert disk 2 when prompted. If notified of an existing copy of a file, select Install New File and Keep Existing File. Allow SQL Anywhere to make all changes to CONFIG.SYS and AUTOEXEC.BAT. 
If SQL Anywhere 5.0 Runtime was successfully installed, click OK several times to return to Windows, then REBOOT your computer. Section II - Using an Aspen ICARUS Project Manager Database with SQL Anywhere 5.0 (or 5.5) Runtime and Microsoft ACCESS Prepare an estimate in Aspen ICARUS Project Manager. Preparing an estimate creates the CST_RSLT.IDB file that will be used in MS ACCESS. For example purposes, the test file, REFIT will be used to demonstrate setting up a database file for use with MS ACCESS. All ICARUS systems ship with this project. Open REFIT project in Aspen ICARUS Project Manager. Prepare an estimate. Preparing an estimate will create the CST_RSLT.IDB file. Exit Aspen ICARUS Project Manager. Setup an ICON (WIN 3.x) or Shortcut (WIN95) for the project database. Open the SQL Anywhere Program Group or use Explorer. Create an ICON (WIN 3.x) or Shortcut (WIN95) for the project database (we will use CST_RSLT.IDB for this example). NOTE: You must create an SQL ICON for each database you want to use with MS ACCESS. For more information on creating an Icon(WIN 3.x) or setting up a Shortcut (WIN95), see Windows Help. For ICONS created in Windows 3.x, Windows 95, and Windows NT : In the description field you will put in a description of the database you are using: Description: CST_RSLT.IDB In the command line field you will enter the command that runs the SQL Anywhere 5.0 Runtime program with the associated database file: Command Line (Win 3.x): C:\SQLANY50\WIN\RTDSK50S.EXE C:\PROJMGR\PROJECTS\CST_RSLT.IDB Target (Win95, Win NT): C:\SQLANY50\WIN32\RTDSK50.EXE C:\PROJMGR\PROJECTS\CST_RSLT.IDB The format for the command line is as follows: program path and name (RTDSK50S.EXE) space database path and file name (CST_RSLT.IDB) program path and name (RTDSK50.EXE) space database path and filename (CST_RSLT.IDB) In the working directory field you will enter the working directory for your program, which is SQL Anywhere 50. 
Working Directory: C:\SQLANY50\WIN Start In(Win95): C:\SQLANY50\WIN32 Click OK to save your ICON information. The ICON will automatically be created. Create an ODBC (Open Database Connectivity) Connection to your project database. Open the Main Program Group, then open the Control Panel. In Windows 95, click on Start, Settings, Control Panel. Double Click on the ODBC Administrator ICON. (In Win95, use ODBC32 Administrator Icon.) Click on Add to Add Data Source. From dialog box, select: Sybase SQL Anywhere 5.0 then click OK. Enter the following information into the SQL Anywhere ODBC Configuration dialog box. Enter the data source name. This is the name of your project database: CST_RSLT.IDB Enter the description of the database: REFIT Project Do not enter any information into the USER ID field and the PASSWORD field. You will fill this information in when you attach to the database through Microsoft Access. Enter the database path and file name. This will be the full path to the location of the database file. C:\PROJMGR\PROJECTS\CST_RSLT.IDB Enter the Database File as: C:\PROJMGR\PROJECTS\CST_RSLT.IDB Click on the Microsoft Applications (Keys in SQLStatistics) check box at the bottom of the window. Click on the Prevent Driver not Capable errors check box at the bottom of the window. Click OK. Your database description should appear in the list of Data Sources. Click CLOSE to exit the ODBC Administrator. You have just completed the setup necessary to make the connection between an Aspen ICARUS Project Manager database and Microsoft Access (or any other commercial database package such as Borland Paradox or Lotus Approach). Next you will create a new database in Microsoft Access, creating a connection between an Aspen ICARUS Project Manager database and Microsoft Access. Opening a Database File in MS ACCESS Double click on the newly created project ICON (WIN 3.x) or Shortcut (WIN95) to start the SQL Anywhere 5.0 Runtime engine. 
Double click on the CST_RSLT.IDB ICON Open MS ACCESS Select File, New from the menu. Enter a file name for your new database: REFTPROJ Click OK. In ACCESS 2.0, select File, Attach Table... from the menu. A dialog box appears. In ACCESS for Windows 95, 7.0, select File, Get External Data, Link? 1. Highlight (see Figure 2) 2. Click OK. 3. The SQL Data Source dialog box appears, select your project database: REFT.IDB (see Figure 3) 4. Click OK. 5. You will be prompted to connect to the SQL Anywhere 5.0 Database. a. Enter the USER ID as typed below. Icarus b. Tab to the PASSWORD field or click the mouse in the field. Enter the PASSWORD as typed below. star%&vacuum NOTE: All databases you use with SQL Anywhere 5.0 Runtime will use the same User ID and Password. c. Click OK to make the connection to the SQL Anywhere 5.0 Runtime database. 6. The Attach Tables dialog box appears displaying all the tables in the CST_RSLT.IDB. a. Check the box at the bottom of the screen to save login ID and password locally. Highlight the desired table (usually the DBA.DETAILS table). b. Click on Attach (see Figure 4). c. Click on OK when successfully attached (see Figure 5). d. Continue attaching (connecting) to all other tables desired in the database following steps 6a, 6b and 6c. e. When finished, click on the CLOSE button. Your tables will appear in the list of tables accessible through MS ACCESS. Have fun using ACCESS with your REFT.DB database. The possibilities are endless. You can report, query, change (using an UPDATE Query), etc. all of the information in the database. Any changes you make will appear in your Aspen ICARUS Project Manager reports. You need to run these reports through Aspen ICARUS Project Manager before the project is re-estimated! Try any query to make sure that you have the connection setup properly. If the connection is not setup properly you will get some type of ODBC failure message. 
- Remember that these steps are cookbook instructions; just insert the name of your Aspen ICARUS Project Manager database file wherever you see CST_RSLT.IDB. You can create as many different database icons as you desire! - Caution - If you re-estimate your project, any changes made to the Aspen ICARUS Project Manager database through MS Access will be lost! Keywords: SQL Anywhere ODBC Database References: None
Problem Statement: In the AspenTech Business Process Explorer application, how do you resolve the error "BPE Application Creation failed due to VBA Setup problems. hr=0x80040562"? In BPE, after installation, trying to launch a module gives an error stating that there is a VBA setup problem, and the application quits opening. This error occurs when the VBA libraries are corrupted.
Solution: This document describes a procedure for recovering from this problem. Microsoft VBA (Visual Basic for Applications) is a required component for Business Process Explorer (BPE) client/server solutions. These solutions are often used within MIMI Model Management Applications. The Microsoft Installer (MSI) is used to install the VBA core component during the installation of BPE. The recommended steps to resolve the error message are as follows: 1. On the AspenTech Installation CD, navigate to the Core\VBA folder and run Setup to re-install only the VBA component. 2. After the VBA installation, repair the VBA setup: locate the file called VBA6.msi (it may be in Program Files\Common Files\Microsoft Shared\VBA\VBA6), right-click it and select 'Repair'. BPE should start normally after repairing the VBA installation. 3. If the above steps do not resolve the error, uninstall BPE and then perform a re-installation of the application with Administrator privileges. These steps will resolve the error encountered about the VBA setup failure. Keywords: BPE, VBA6, VBA, hr=0x80040562 References: None
Problem Statement: How do you configure a pressure relief application?
Solution: Please review the attached Animated Tutorial. After creating a dynamic simulation from Aspen Plus, you can add pressure relief equipment to your simulation from within Aspen Dynamics. You can then simulate various failure scenarios, size and rate the safety equipment and simulate the maximum relief rates and pressures. Play Viewlet Viewlets require the Macromedia Flash player and will appear in a separate browser window. Look for them in help topics with a Show me! link. Keywords: viewlet References: None
Problem Statement: In the Drawing Editor, the instrument label only has the d (differential) and - options for the Modifier field. How can I add more options in the Modifier field of the instrument label (i.e. Integrate/Totalizer)?
Solution: The symbol list of the Modifier field is located in eLoopModifier inside the GraphicDefinerBubbles class view. To access this list and modify it, follow these steps: 1. Open the class library file associated with the workspace with the Class Library Editor. If you are using the StandardLibrary, you may find it in its original location (C:\AspenZyqadServer\Basic EngineeringX.X\WorkspaceLibraries\DataModel\StandardModel). 2. Look for the GraphicDefinerBubbles class view and open it. 3. Locate the Modifier attribute and right click on eLoopModifier under the Type column. Select the Open eLoopModifier option. 4. In the eLoopModifier window, right click on the first row to add a new symbol. You can add a name, description, and value to the new line. 5. When finished, remember to compile the class library into a class store and reload the workspace in the Administration tool. 6. In the Modifier field of the Instrument Label, you should be able to see the new symbol. Keywords: Instrument label, Modifier, eLoopModifier, GraphicDefinerBubbles class view References: None
Problem Statement: In some simulations where a very large number of results from Results Summary | Streams is generated (i.e. 300+ streams), copying and pasting these streams from Aspen Plus to Microsoft Excel can cause Microsoft Excel to run into memory issues and crash, due to a capacity limitation in the clipboard, which is unable to manage such a large amount of data.
Solution: In order to avoid this issue, instead of using the Copy All option, it is suggested to select and copy a maximum of 205 columns at a time, then proceed to paste these into your Microsoft Excel spreadsheet. It does not matter how many components have been reported in the stream results. Keywords: Copy and paste, Stream results tables References: None
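The column ranges to copy in each pass can be worked out ahead of time. The helper below is an illustrative sketch (not part of Aspen Plus); only the 205-column limit comes from the article.

```python
# Split a wide stream-results table into clipboard-sized pieces of at most
# 205 columns, to be copied and pasted one chunk at a time.

def column_chunks(n_columns, chunk_size=205):
    """Yield (start, end) column index pairs, end exclusive."""
    for start in range(0, n_columns, chunk_size):
        yield start, min(start + chunk_size, n_columns)

# A 500-stream results table needs three copy/paste passes:
print(list(column_chunks(500)))  # [(0, 205), (205, 410), (410, 500)]
```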
Problem Statement: RadFrac fails with the message that the VAPOR or LIQUID is drying-up on certain stages.
Solution: The error appears when the flow rate of either vapor or liquid on a tray falls below a pre-set limit. This limit is set as a fraction of the sum of the feeds. The default value is 1.0e-5, so when a stage flow rate falls below 1.0e-5 * SUM(FEED FLOWRATES), the error message will appear and convergence will stop. The limiting fraction is controlled by the parameter FMINFAC, which is the minimum allowed value for stage flow as a fraction of the total feed to the column. This parameter can be found on the Advanced sheet of the RadFrac Convergence form. If simulating a tower that really does have a section with very low vapor or liquid flow rates (as a fraction of feed to the tower), you can avoid this error by reducing this parameter. On the other hand, few columns have a section with flow rates this low, and the most common cause of this message is that the column is not converging normally. If this is the case, adjusting the value of FMINFAC will not help to solve your convergence problem. If you are having convergence difficulties, please check the following points: Are the column specifications correct? Are they the best ones? In general, our algorithms work better with flow specifications. Does the convergence algorithm used match the characteristics of the process fluid? Are the physical properties correct? If everything else checks out, then it may be necessary to tune some of the convergence parameters. The iteration history with increased diagnostics will be the best clue to see what parameters should be adjusted. When modelling absorbers where the gas phase is very soluble in the liquid and the inert gas flow is low (e.g. HCl scrubbers), try selecting the Absorber=Yes option on the Convergence \ Advanced sheet with the Standard algorithm. Please see the document 3491 What to do if RadFrac does not converge? for more details. Keywords: column convergence Radfrac drying up References: None
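The drying-up test itself is simple to state. The sketch below illustrates the check with the default FMINFAC of 1.0e-5 described above; the flow values are hypothetical.

```python
# A stage triggers the drying-up error when its vapor or liquid flow falls
# below FMINFAC times the sum of the feed flows (default FMINFAC = 1.0e-5).

def stage_dries_up(stage_flow, feed_flows, fminfac=1.0e-5):
    """Return True if the stage flow is below the minimum allowed fraction."""
    return stage_flow < fminfac * sum(feed_flows)

feeds = [100.0, 50.0]                 # kmol/hr; total feed = 150, limit = 1.5e-3
print(stage_dries_up(2.0e-3, feeds))  # False: flow is above the limit
print(stage_dries_up(1.0e-3, feeds))  # True: the error would be raised
```

Lowering fminfac widens the allowed range, which is exactly what reducing FMINFAC does for a column that genuinely runs this dry.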
Problem Statement: What is CAPE-OPEN?
Solution: CAPE-OPEN is a cooperation project aimed at defining software interfaces for allowing plug-and-play simulation components for the various process simulators in the market. AspenTech is a key member of the CAPE-OPEN initiative, being the first simulation vendor to have a prototype version of a CAPE-OPEN compliant simulator. During the CAPE-OPEN project, Aspen HYSYS was used as the validation platform to ensure correctness of the developed interfaces. This has allowed AspenTech to gain a deep knowledge of implementing and using CAPE-OPEN interfaces. The Aspen HYSYS open architecture, particularly the existing solutions for integrating external simulation models (Aspen HYSYS extensions), has facilitated supporting the CAPE-OPEN interfaces. Available proprietary Aspen HYSYS sockets for Unit Operations and Property Package extensions have been adapted to handle the CAPE-OPEN interface specifications in addition to the traditional Aspen HYSYS extensions, leading to the Aspen HYSYS CAPE-OPEN sockets. Aspen HYSYS.CAPE-OPEN consists of Aspen HYSYS (v 2.2 or later) and the Aspen HYSYS CAPE-OPEN sockets. The conversion of existing Aspen HYSYS extensions to the CAPE-OPEN standard is facilitated by this architecture. All Aspen HYSYS users may download the required software to enable the Aspen HYSYS interfaces to CAPE-OPEN Unit Operations and Property Packages. Consult the site http://www.colan.org to find more information about the CAPE-OPEN standard. The Global CAPE-OPEN Mid-term Meeting demonstrated successful interoperability between HYSYS.Process 2.2 and Aspen+ 10.2. Version 1.0.3 of the CAPE-OPEN sockets allows converting Aspen HYSYS.Process or Aspen HYSYS.Plant version 2.2 into a CAPE-OPEN compliant simulator, and also reproducing these interoperability tests. Keywords: CAPE-OPEN, Aspen HYSYS, interface, sockets References: None
Problem Statement: When using ActiveX automation with Aspen Plus for large simulations, it is critical to retrieve convergence loop information such as status (converged, not converged, etc.) and the type of convergence loop.
Solution: Access the collection of convergence loops and use: .AttributeValue(HAP_COMPSTATUS) to retrieve the calculation status .AttributeValue(HAP_RECORDTYPE) to retrieve the convergence loop type The attached example will open an Aspen Plus simulation, run the simulation and display convergence loop information. If no convergence loop is used to solve the flowsheet, the 'For Each lo_Loop In lo_Conv' loop will not display any convergence loop information. There is then no need to check whether there is a convergence loop used in the simulation. Keywords: Convergence Loop ActiveX COM Automation Visual Basic References: None
Problem Statement: In this example, you will learn how Aspen HYSYS V9 can be used by process engineers or planners to generate simulation data required to calculate base and shift vectors needed to update a PIMS reactor submodel. This example follows the workflow of using a pre-calibrated and pre-configured model to generate the data.
Solution: You will learn to: Prepare a simulation model for Planning Model Update: Establish appropriate scope & operating range for the study Run a Case Study to generate Simulation Data for PIMS: Ensure all required variables are included in the study Use an Excel Front End as an alternative workflow: Setup & Run the same analysis in Aspen Simulation Workbook Present simulation data in desired format for PIMS: Appropriately map & link variables to PIMS submodel tables Note: This example uses Aspen HYSYS V9 CP1 (35.0.1.271) Keywords: Aspen Petroleum Refining, Planning Model, PIMS, ASW References: None
Problem Statement: Can a Visual Basic application import a Summary file into a blank simulation? It is possible in the graphical user interface (GUI) to open a new simulation based on a template then import a summary (.sum) file. The blocks and streams that have reported results will automatically be created and drawn in the simulation diagram. All input specifications including the component list and property specifications will be incomplete. However, all results will be available and can be viewed. This is a convenient way to review simulation results when the backup (.bkp) file is not available. How can you do this from an ActiveX Visual Basic application?
Solution: The following Visual Basic code can be used: 1. Create an Aspen Plus object: Set go_Simulation = CreateObject("Apwn.Document") 2. Import an application template. Any template can be used: Call go_Simulation.InitFromTemplate2("d:\aes\GUI\Templates\Simulations\General with English Units.apt") 3. Import the summary file. The On Error statement is required since Aspen Plus will object to being initialized twice. The second On Error statement resets the first statement: On Error Resume Next Call go_Simulation.InitFromFile2("d:\examples\ApVBA\DialogBoxes\Test.sum") On Error GoTo 0 The Visual Basic ActiveX interface is documented in Chapter 38 of the Aspen Plus User Guide. The variable go_Simulation is a global object variable that represents the Aspen Plus simulation. It is defined by a Dim statement as an IHapp type. NOTE: There are some limitations to importing a summary file. The summary file contains results data and a flowsheet topology (the streams and their connections to each unit operation). It is best to import a summary file into a fully specified Aspen Plus model where the components, property method, streams, and blocks have been fully specified on the input forms. Aspen Plus will allow you to import a summary file into a blank simulation template, but after the import, there will be several incomplete input forms - Components, Properties, Streams and Blocks. This is because the summary file wants to load results, but if, for example, there are no components specified in the host model, the summary file import will not be able to load or display any stream results. Keywords: VBA, ActiveX, Summary File, import References: None
Problem Statement: Can Custom GUI forms be designed for User2 Models with variable length arrays?
Solution: Yes. Any form control used in Aspen Plus can be applied to custom User2 forms created with Visual Basic. A simple example is attached to illustrate this technique.

When working with variable-length user-defined parameters, it is best to create Configured Variables for the User2 model with the Model Variable Configuration Editor. In this example, two variables were created: an Integer called NTubes and a Real array called X. The dimension of X is set equal to NTubes so that the array length is fully configurable by the user.

Templates.vbp should be used to build the Visual Basic project. This file is stored in the Xeq subfolder of the Release 10.1 GUI installation directory. The form and tab should be based on FormTemplate.Ctl and TabTemplate.Ctl, which also reside in the Xeq subfolder.

The Visual Basic project must be given the same root name as the model. The model name is defined when the User2 model is inserted into the Aspen Plus Model Library. In this example, the User2 model was named Example, so the project is stored as Example.vbp and Name is set to Example in the Project Properties. The FormTemplate was renamed to Scroll and the TabTemplate was renamed to Tab.

The AspenTech Tab Strip control, MMTabStrip (T), was used to insert the Tab. In the Tab properties, reduce the Tab Count to 1, specify Input for the Tab Caption, and specify Example.Scroll for FormName. The FormName field ties the Aspen Plus model to the Visual Basic input form.

Place a Frame on the input form using the AspenTech MMFrame (F) control and set its caption to Input. Insert an AspenTech MMTextBox (TB) control onto the form for entry of NTubes. The Custom Properties for this control can be opened by choosing Properties from the right-click pop-up menu. In the Custom Properties, select RealNumber and set the maximum number of characters on the General subform. On the Variable subform, create the variable NTubes that will be displayed in this field. This variable is a Node type; set the path to main/Input/User Tree/NTUBES. An optional text label can be placed on the form for identification.

Scrolling is accomplished by overlaying a prototype field on top of the AspenTech MMScrollArea (SA) control. Place the grid first, then place an MMTextBox control onto the first data cell. The prototype forms the link between the form and the Aspen Plus data; the grid repeats this field according to the value of NTubes. Make the same General-subform custom property specifications for the prototype control as were made for the NTubes control. Define a Node-type variable on the Variable subform and set the path to main/Input/User Tree/X/@rowid. @rowid serves as the VB offset into X and is defined later in the grid Custom Properties. Set the Left property of the prototype to 260. (The properties appear to the right of the form once the control is selected.)

To open the grid's Custom Properties, select the grid with a left click, click Custom in the properties pane to the right of the form, and then click the pop-up button (marked with three dots, ...) next to Custom. In the Custom Properties, define the rowid and NumRows variables on the Variable subform: select the Row data type for rowid and Node for NumRows, and specify main/Input/User Tree/NTUBES for NumRows. On the ScrollArea subform, enter NumRows as the Variable for Virtual Rows to define the length of the scrolled region. In the Row Specifications, clear the Index Ordered Rows box and set the Number of Prototype Rows to 1. In the Column Specifications, clear the Index Ordered Rows box and set the Number of Prototype Rows to 0.

Select the Project Properties form from the Project pull-down menu and set the Binary Compatibility option on the Component subform. Set the OCX file pathname to the GUI\Forms subdirectory in the next field. (The location of the GUI subfolder is defined when Aspen Plus is installed.) Select the Make OCX option from the File pull-down menu.

Create an OCR file for the OCX. The format of the OCR file is described in the Customizing Unit Operation Models Getting Started guide. Copy the OCR file to the GUI\Forms directory. Run ApwnSetup, located in the GUI\Xeq folder, to register the control. Since Binary Compatibility was selected for the control, ApwnSetup only needs to be run once. The tailored form will automatically be used in Aspen Plus each time the input form for the Example model is opened.

Keywords: Visual Basic VB forms customized customizing unit operations
References: None
Problem Statement: We observe slow performance for a simulation using scripts. What are the possible causes for this?
Solution:

Files written by a script: For a client/server configuration, make sure the text files written by scripts are local to the client machine; otherwise there will be network traffic when writing from the client to the server machine.

Batch update of assignments: Use the DisableIncrementalUpdate and EnableIncrementalUpdate keywords in the scripts. This enables changes to be batched up and applied all at once, which saves a lot of time. Below is an extract from the help.

DisableIncrementalUpdate and EnableIncrementalUpdate Methods

By default, whenever a change is made to the simulation that requires the update of assignments, connections, or equations, the update happens immediately. For example, the following script performs two incremental updates, one after each assignment:

'Incremental update happens after this assignment
B1.PropMode = Local
'and again after this assignment
B2.NStages = 10

To batch together a number of simulation changes, the following methods are provided on the Flowsheet object:

Method                        Description
DisableIncrementalUpdate      Call this to stop incremental updates
EnableIncrementalUpdate       Call this to enable incremental updates

The methods DisableIncrementalUpdate and EnableIncrementalUpdate operate on a lock counter, which is incremented by DisableIncrementalUpdate and decremented by EnableIncrementalUpdate. This allows scripts called within scripts to use DisableIncrementalUpdate and EnableIncrementalUpdate without overriding the lock on incremental updating that may have been established in the calling script. When a script is invoked via the User Interface, the lock count is automatically restored to its original state when the script completes. If these methods are used from external automation, ensure that there are matching calls to DisableIncrementalUpdate and EnableIncrementalUpdate.
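The lock-counter behaviour described above can be mimicked with a small stand-alone model. This is a Python sketch, purely illustrative — it does not drive ACM, and the class and method names are invented for the illustration:

```python
class FlowsheetModel:
    """Toy model of the incremental-update lock counter (illustrative only)."""

    def __init__(self):
        self.lock_count = 0    # incremented by Disable..., decremented by Enable...
        self.updates_run = 0   # how many incremental updates have fired

    def disable_incremental_update(self):
        self.lock_count += 1

    def enable_incremental_update(self):
        self.lock_count = max(0, self.lock_count - 1)
        if self.lock_count == 0:
            self.updates_run += 1    # batched changes are applied in one update

    def assign(self):
        """Simulate an assignment such as B1.PropMode = Local."""
        if self.lock_count == 0:
            self.updates_run += 1    # no lock held: update fires immediately


fs = FlowsheetModel()
fs.assign()                      # immediate update
fs.disable_incremental_update()  # outer script takes the lock
fs.assign()
fs.disable_incremental_update()  # nested script takes the lock again
fs.assign()
fs.enable_incremental_update()   # nested script releases; count still > 0
fs.assign()
fs.enable_incremental_update()   # outer release: one batched update fires
print(fs.updates_run)            # -> 2
```

Four assignments trigger only two updates: the three made under the lock are applied together when the outermost EnableIncrementalUpdate releases the lock — the same saving the Disable/Enable pair gives in an ACM script, and the same reason nested scripts can safely take and release the lock again.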
Example

To batch the operations of the above script:

'do not update until we have completed the edits
DisableIncrementalUpdate
B1.PropMode = Local
B2.NStages = 10
'incremental update happens after this command
EnableIncrementalUpdate

This applies to changes of the simulation structure, whether the simulation is steady-state or intended for dynamics.

Reading lots of variables: When you read the values of many variables, the UpdateVariables command allows you to retrieve them from server to client in one operation, which can make things faster. Performance always improves when using the UpdateVariables method, but the effect becomes noticeable from roughly 100 variables upward, and the improvement is larger in a client/server configuration. The bottleneck is that every time a variable is retrieved individually there is a wait; with UpdateVariables there is a single wait for the entire list of variables instead of one per variable.

This is an extract from the online Help:

UpdateVariables Method

When it is necessary to have up-to-date data for a variable, it is retrieved on demand from the server. If you know in advance that you are going to access a number of variables, it is more efficient to update all the relevant variables in a single operation. The UpdateVariables method enables you to do this. When used with no arguments, UpdateVariables updates all the variables in the scope of the object on which it is invoked. It can also update a list of variable paths, variable objects, or variable collections.
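Why batching round trips helps can be seen with a toy cost model. This Python sketch is not ACM code; the per-request latency is an invented constant used only to make the arithmetic concrete:

```python
ROUND_TRIP_MS = 50   # hypothetical client/server latency per request (milliseconds)

def fetch_one_by_one(variables):
    """Each variable retrieved on demand: one round trip (one wait) per variable."""
    return ROUND_TRIP_MS * len(variables)

def fetch_batched(variables):
    """UpdateVariables-style retrieval: a single round trip for the whole list."""
    return ROUND_TRIP_MS if variables else 0

names = ["B%d.Height" % i for i in range(100)]
print(fetch_one_by_one(names))  # 5000 ms of waiting
print(fetch_batched(names))     # 50 ms of waiting
```

With 100 variables and a 50 ms round trip, on-demand retrieval spends 5 seconds waiting while a single batched request spends 50 ms — which is why the improvement grows with the variable count and with the client/server latency.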
Example

Running a script at Flowsheet level:

B1.UpdateVariables              'Updates all the variables in the block B1
Set var = B2.Height
UpdateVariables(B2.Vol, var)    'Updates the values of B2.Vol and B2.Height

From external Microsoft® Visual Basic®:

ACMApp.Simulation.Flowsheet.B1.UpdateVariables
    'Updates all the variables in the block B1
Set var = ACMApp.Simulation.Flowsheet.B2.Height
ACMApp.Simulation.Flowsheet.UpdateVariables(B2.Vol, var)
    'Updates the values of B2.Vol and B2.Height

Client/Server configuration: You could consider running the Aspen Custom Modeler client and server on one PC, and using Windows Terminal Server or Citrix Metaframe to access it from another PC. This can prove to be a useful and flexible client/server setup, and it avoids the performance issues caused by communication delays between the client and the server when they are not running on the same computer.

Keywords: VB VBA OLE ActiveX
References: None
Problem Statement: Automation in Aspen Plus: how to access different variables and present the information to the user through dropdown menus.
Solution: This solution provides example VBA code that illustrates how to retrieve information from an Aspen Plus simulation file using dropdown menus, including access to variables with two identifiers, such as the component profiles of a RadFrac column.

The automation program allows you to:
· Connect to a simulation file through a file-picker dialog box.
· Retrieve the list of blocks and streams in the simulation.
· Identify RadFrac blocks.
· Access variables with one and two identifiers.
· Select between categories through dropdown menus.

An Excel file with VBA code providing these functionalities is attached. To open the VBA editor, press Alt + F11. For more information on locating variables in your simulation, see Solution ID 140584.

Keywords: Automation, example code, VBA, Aspen Plus, dropdown menus, combobox, accessing variables.
References: None
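As a complement to the attached VBA workbook, the same connect-and-list step can be sketched in Python via COM (pywin32). This is an assumption-laden sketch, not the attached example: it requires Aspen Plus installed locally, the ProgID "Apwn.Document" and the attribute index used for the record type follow common AspenTech automation examples and should be checked against your installed version, and the file path shown is illustrative:

```python
# Sketch of connecting to Aspen Plus and listing blocks (hypothetical names;
# requires Aspen Plus and the pywin32 package on Windows).

def node_path(*parts):
    """Build an Aspen Plus tree path such as \\Data\\Blocks."""
    return "\\" + "\\".join(parts)

def list_blocks(bkp_file):
    """Open an archive and return (block name, model type) pairs."""
    import win32com.client                   # lazy import: COM only needed here
    aspen = win32com.client.Dispatch("Apwn.Document")
    aspen.InitFromArchive2(bkp_file)         # open the .bkp/.apw archive
    blocks = aspen.Tree.FindNode(node_path("Data", "Blocks"))
    result = []
    for element in blocks.Elements:
        # AttributeValue(6) is commonly the record/model type (e.g. "RadFrac");
        # verify the index against your version's automation documentation.
        result.append((element.Name, element.AttributeValue(6)))
    # Release the COM object when done (e.g. aspen.Quit or del aspen,
    # depending on the version).
    return result

# Usage (requires Aspen Plus installed):
#   for name, model in list_blocks(r"C:\temp\mysim.bkp"):   # path is illustrative
#       print(name, "<- RadFrac" if model == "RadFrac" else model)
```

The RadFrac test above mirrors the workbook's step of identifying RadFrac blocks before offering their two-identifier profile variables in a dropdown.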
Problem Statement: How to use a Dynamic Link Options (DLOPT) file (*.opt) to specify the files to link during a simulation run?
Solution: The simplest method of supplying user models to Aspen Plus is to put the user model's object module files (the results of the aspcomp command) in the run directory. By default, whenever Aspen Plus spawns a subprocess to link a run-specific shared library, it includes all object module files from the run directory.

Alternatively, you can write a Dynamic Linking Options (DLOPT) file (*.opt) which specifies the objects to use when creating the run-specific shared library. The DLOPT file is also needed to specify shared libraries (*.DLL files) created by the asplink procedure, for use when resolving user model symbols instead of, or in addition to, linking a run-specific shared library.

DLOPT files can be used to alter the linking of shared libraries. A DLOPT file can be specified to:
· asplink, when creating shared libraries before an Aspen Plus run
· Aspen Plus, when making a run

In the Aspen Plus GUI, specify the Dynamic Link Options (.opt) file in the Linker options field on the Run\Settings menu. With the Aspen Plus simulation engine, use the option /dlopt=name.

DLOPT files can contain:
· DLOPT commands
· File specifications referring to object module files, object module libraries (archives), or shared libraries

Observe these rules when writing DLOPT files:
· Only one DLOPT command or file specification per line
· File specifications may contain an asterisk (*) as a wildcard for matching a list of files, for example *.obj
· File specifications may contain environment variables (UNIX and Windows only)
· Comments may be used anywhere, begin with # or !, and terminate at the end of the line

Example DLOPT file for Windows:

! This is an example DLOPT file for Windows
:no_local           ! Do not include object module files from the run directory
D:\USEROBJS\*.OBJ   ! Include all object module files from the
                    ! D:\USEROBJS directory
%USRLIB%\XYZ.LIB    ! Include object module library XYZ.LIB from the
                    ! directory pointed to by the USRLIB environment variable
D:\USERDLL\*.DLL    ! Use the shared libraries in the D:\USERDLL directory
                    ! when resolving user model symbols

For more information, see the Aspen Plus User Models manual.

Keywords: DLL, FORTRAN, Subroutine, DLOPT, ASPLINK
References: Manual, Chapter 1, page 6
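Addendum: because a DLOPT file is just one command or file specification per line with ! or # comments, it is easy to generate programmatically. A hypothetical Python sketch (the directory names are illustrative, matching the example file above):

```python
import os

def write_dlopt(opt_path, obj_dirs, dlls=(), no_local=True):
    """Write a DLOPT (.opt) file listing object files and shared libraries.

    One command or file specification per line; comments start with ! or #.
    """
    lines = ["! Generated DLOPT file"]
    if no_local:
        lines.append(":no_local   ! Do not include object files from the run directory")
    for d in obj_dirs:
        lines.append(os.path.join(d, "*.OBJ"))  # wildcard matches a list of files
    for dll in dlls:
        lines.append(dll)                       # shared library for symbol resolution
    with open(opt_path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines

# Example: write_dlopt("user.opt", [r"D:\USEROBJS"], dlls=[r"D:\USERDLL\*.DLL"])
# The resulting user.opt can then be given to asplink, entered in the GUI's
# Linker options field, or passed to the engine with /dlopt=user.opt.
```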