Adding a random parameter to your request will not sabotage your PHP in any way: var loader:URLLoader = new URLLoader(); var request:URLRequest = new URLRequest( "http://www.mySite.com/getTime.php?action=getTime&bogus=" + (Math.random() * 10000)); loader.load(request); Your PHP code will GET the action parameter and will ignore the bogus parameter, with no effect on your code. You can also use a time-based random number to avoid being unlucky and getting the same value twice.
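For contrast with the ActionScript above, here is the same cache-busting idea sketched in Python; the function name and nonce format are illustrative, not part of the original code.

```python
import random
import time

def cache_busted_url(base, action):
    """Build a request URL with a throwaway 'bogus' parameter.

    The server reads only 'action'; the time-based nonce merely makes the
    URL unique so a URL-keyed cache cannot return a stale response.
    """
    nonce = f"{int(time.time() * 1000)}-{random.randint(0, 9999)}"
    return f"{base}?action={action}&bogus={nonce}"

url = cache_busted_url("http://www.mySite.com/getTime.php", "getTime")
```

Combining the clock with a random component makes a collision between two requests in the same millisecond unlikely.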
Good day, I've encountered problems with the cache of the URLLoader in ActionScript 3. I make a URLRequest to a PHP page to get a timestamp. When I instantiate the class (which contains the function) a second time, the result is the same. I have to close the application and restart it to get a new request. I have tried "loader = new URLLoader()" and also using headers. The option of creating a unique URL for every request, as suggested elsewhere, does not work for me since it would sabotage my PHP action. loader = new URLLoader(); var request:URLRequest = new URLRequest("http://www.mySite.com/getTime.php?action=getTime"); loader.load(request);
AS3 Resetting UrlLoader Cache
Consider disabling client-side caching in the DataService object and see if that helps. I had the same problem, and setting dataService.MergeOption = MergeOption.OverwriteChanges kept the data service refreshing the object on each change and get.
We were using System.Data.Services.Client (version 4, I guess) of Microsoft WCF Data Services. When we updated to version 5.2 (the Microsoft.Data.Services.Client dll), it seems that some caching mechanism was introduced in the new version of WCF Data Services: when we query the data services (OData) through the browser, fresh data is returned, but when we add a service reference to our UI project and use that reference (proxy) to retrieve data, fresh data is shown only after 10 minutes or so. By resetting IIS (iisreset.exe) fresh data becomes available, which probably means that somewhere in the UI project caching is in place. We don't do anything extraordinary in our code; we use the OData service reference in its simplest form: public List<Customer> GetCustomers() { CustomersODataModel customersData = new CustomersODataModel("Url"); return customersData.ToList(); }
Does Microsoft.Data.Services.Client cache data?
As of Yii 1.1.13 I'm afraid you will have to use an SQL statement, and there is no direct way of using CDbCriteria. However there is an indirect way of using CDbCriteria, but ultimately you will have to use it to generate an SQL command, which you'll pass as your dependency. This technique uses CDbCommandBuilder and its methods. Sample (see comments for understanding): $criteria=new CDbCriteria; // ... replace with code to set up your criteria ... // ... // first create commandBuilder instance $commandBuilder = new CDbCommandBuilder(Yii::app()->db->schema); // then create command using criteria $command = $commandBuilder->createFindCommand('table_name', $criteria); // then get sql statement text $sql = $command->text; // then set your dependency $dependency = new CDbCacheDependency($sql); // if you have params in the criteria, set the params for dependency $dependency->params = $criteria->params; // now your dependency is usable In the above I have used createFindCommand, there are other methods like createCountCommand in CDbCommandBuilder, use one which generates the sql statement that you have to run. Alternatively you could have used Query Builder, but of course there wouldn't be any CDbCriteria there.
When we define the cache dependency with CDbCacheDependency, we have to provide an SQL statement as the dependency. When we are using CDbCriteria, it's not possible to provide the SQL, as the SQL is built from the criteria with proper parameter tokens. Is there any way we can use the CDbCriteria as the cache dependency? Is it right to ask such a thing? I'm only interested in the SQL being built by the CDbCriteria; otherwise I would have to build the SQL manually, and I think that is not right. Thank you
Yii:: How to use CDbCriteria as cache dependency
Got it to work. Here's what it took: def expire_index cache_key = "views/#{request.host_with_port}/posts" Rails.cache.delete(cache_key) end More details on this gist -> https://gist.github.com/4400728
tl;dr My expire_index method below is getting called; I see the puts in the logs. However, when I refresh, the page is the stale version. Note: I am using rails_admin for updating the models, but I have also noticed the same behavior using the Rails console directly. Thanks for your help. Much appreciated! details app/controllers/posts_controller.rb class PostsController < ApplicationController caches_action :index cache_sweeper :post_sweeper def index @posts = Post.published end def show @post = Post.find(params[:id]) end end app/sweepers/post_sweeper.rb class PostSweeper < ActionController::Caching::Sweeper observe Post def after_save(post) puts "======================" puts " AFTER SAVE " puts "======================" expire_index end private def expire_index puts "======================" puts " EXPIRING INDEX " puts "======================" expire_action(:controller => '/posts', :action => 'index') end end config/environments/production.rb config.action_controller.perform_caching = true config.cache_store = :dalli_store # using memcachier on heroku
Unable to expire_action on Rails
I think this will work for you. Assuming you are using the latest version of CakePHP, add this to your core.php, below the line where you configure Cache.check. Example and code: /** * Enable cache checking. * * If set to true, for view caching you must still use the controller * public $cacheAction inside your controllers to define caching settings. * You can either set it controller-wide by setting public $cacheAction = true, * or in each action using $this->cacheAction = true. * */ // Configure::write('Cache.check', true); $UAs = array( 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_2) AppleWebKit/537.11 (KHTML, like Gecko) Chrome/23.0.1271.101 Safari/537.11' ); if (in_array(env('HTTP_USER_AGENT'), $UAs)) { define('USE_CACHE', '1 hour'); Configure::write('Cache.check', true); } else { define('USE_CACHE', false); Configure::write('Cache.check', false); } $UAs refers to the user agents of bots. This is a sample controller that you can use to test the code: <?php App::uses('AppController', 'Controller'); class HomeController extends AppController { public $name = 'Home'; public $uses = array(); public $helpers = array( 'Cache' ); public $cacheAction = USE_CACHE; public function index() {} }
I can cache controller actions in my CakePHP 2 application by using CacheHelper, and this helper lets me select the cache duration, "nocache" parts of the page, etc. But is it possible to serve cached actions based on the visitor's user agent? For instance, I plan to show the cached page to crawlers/bots, but construct the page if the visitor is not a bot. I don't want to select which parts of the page will be cached/nocached; I'm taking the page as a whole.
CakePHP serving cached controller actions based on user agent
I faced the same issue. So I think the problem is that we don't have a cacheable template in this case. If you cache it the way you did (and as I did, too), you end up with a base64-encoded list of links in your cache file. To validate that, I uncompressed the files in var/full_page_cache - and there we go: the cart count is cached and won't change even if your cart changes, and it is not replaceable on the server side (at least not in a clean way). The reason for this behaviour is a simple one: for FPC, you render templates only, passing some values. But in this special case the template only renders a list, accessing only one block method (getLinks). In your layout xml files you will find some calls to "addLink", which feed that block; that's why all the results become base64-encoded and end up in your cache file. They are not accessible by your container. But I think there is a way to fix it: just collect the links you want rendered and create a custom template and a custom block for them. You'll then be able to cache it in a proper way. (answered Jan 24, 2013 by quafzi) Comment: "Can you please explain in a bit more detail how to collect and render those links in a custom block? And what PageCache container should I provide? Thank you" - ermannob, Jun 10, 2015
I'm using Magento EE version 1.12 with full page cache enabled. a) My product detail page is cached. b) As a result, the shopping cart on this page doesn't show a dynamic item count. c) So I am not able to show a valid cart item count on my product detail page. Steps I followed: 1) I created a block and called it from header.phtml. 2) I am trying to make that topcart.phtml block not be cached. As I'm a newbie in Magento, I found some links about cache hole punching. I followed the links below but had no success. My file structure: app - code - local - Enterprise - PageCache - etc - cache.xml, and PageCache - model - container - TopCart.php. Code as shown below. help link one help link two link three I created the cache.xml and the TopCart.php container file: <page_html_topcart> <block>page/html_topcart</block> <name>topcart</name> <placeholder>PAGE_HTML_HEADER_CART</placeholder> <container>Enterprise_PageCache_Model_Container_TopCart</container> <cache_lifetime>36400</cache_lifetime> </page_html_topcart> This is what my TopCart.php container file looks like: protected function _getIdentifier() { $cacheId = $this->_getCookieValue(Enterprise_PageCache_Model_Cookie::COOKIE_CUSTOMER, '') . '_' . $this->_getCookieValue(Enterprise_PageCache_Model_Cookie::COOKIE_CUSTOMER_LOGGED_IN, ''); return $cacheId; } protected function _getCacheId() { return 'CONTAINER_TOPCART_' . md5($this->_placeholder->getAttribute('cache_id') . $this->_getIdentifier()); } protected function _renderBlock() { $block = $this->_getPlaceHolderBlock(); //('page/html_header_cart'); Mage::dispatchEvent('render_block', array('block' => $block, 'placeholder' => $this->_placeholder)); return $block->toHtml(); } Kindly help me out with useful links and steps.
How to implement magento cache hole punching for shopping cart block
Generally, you'll want to prevent caching when old (even a few seconds old!) pages are likely to be stale. The longer you expect a page's content to be relevant, the more good caching does (and the less it gets in the way). Conversely, the shorter the information's shelf life, the more likely that caching will actually get in the way and keep people from getting the most up-to-date content. The cost of prohibiting caching, of course, is that you don't get the benefits of caching. Every hit to the page will generate a request to your server, even if you're just serving the same page all the time. That could mean a significant increase in server load, which a big site (or a rinky-dink web server) would find undesirable. So in order to answer the question, "To cache or not to cache?", you'll need to balance your bandwidth and server capabilities (and your willingness to potentially max them out) against the requirement that you have the absolute freshest bits. If you don't have such a requirement, then no-cache might be overkill. As for future effects? The only real issue is that caching still happens to some degree until all the cached copies expire. Once that happens, there's no real issue. You're not prevented from re-enabling caching later. In fact, if you do re-enable it, proxies will see the change pretty quickly and start caching the page again the next time someone requests it.
The W3.org protocol says: "By default, a response is cacheable if the requirements of the request method, request header fields, and the response status indicate that it is cacheable." When they say "a response", does that mean that everything is cached all the time? So when I use Cache-Control: no-cache, will that stop the page from caching? And will that have any ill effects in future?
When should I use Cache-Control: no-cache? [closed]
What you are looking for is a way to access data across your app. This is typically the role a Model plays in MVC. Core Data and NSUserDefaults are ways to save data so it is not lost when your app closes or is quit. They can be parts of a Model, but do not help in making that data accessible throughout your app. If you want an object that stores data and can be accessed anywhere in your code, you are probably looking for a Singleton. As this excellent Stack Overflow answer explains: Use a singleton class; I use them all the time for global data manager classes that need to be accessible from anywhere inside the application. The author provides some sample code you might find helpful. This would allow you to create a simple object, accessible throughout your program, that holds your NSDictionaries. Because it is a singleton, other classes in your program can easily access it - meaning they can also easily access the NSDictionaries you've stored in it. If you do decide you want to save data, that singleton object would also be an ideal location for any load and save code. Good luck! Other good resources are: Wikipedia's entry on Singletons; What Should My Objective-C Singleton Look Like?; Singletons and ARC/GCD
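A minimal sketch of the singleton idea in Python (rather than Objective-C); the DataManager name and its records attribute are hypothetical stand-ins for the shared dictionaries.

```python
class DataManager:
    """App-wide store for parsed JSON dictionaries (singleton)."""
    _instance = None

    def __init__(self):
        self.records = {}  # parsed dictionary objects, keyed by id

    @classmethod
    def shared(cls):
        # Lazily create the one instance on first access, then reuse it.
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

# Any part of the app reaches the same object through the class method.
DataManager.shared().records["user:1"] = {"name": "Alice"}
```

Because every caller goes through shared(), data stored by one part of the program is visible to every other part without any globals being passed around.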
I am making a request for an array of perhaps 10-100 objects, all of which are JSON objects that I parse into NSDictionary instances. I want to cache this and use the data across the entire application. Is NSCache useful for this, or is it better to use NSUserDefaults, or what is actually the most accepted way of persisting data across an entire app? Core Data? I'm an iOS newb and don't have too much experience in this.
Best way to cache an NSArray of text/dictionaries and have it usable across the entire app?
The "ram" used by memcache is not from your application, it's from a pool of memcached memory generic to GAE that is shared. All instances of your application "see" the same memcache. What you put into memcache does not count towards your applications ram usage. However the contents of memcache can be evicted at any time with no notice. So really there is no reason (apart from the upper limit on the size of the object you can put into memcache) not to cache everything, as long as you can fallback to the datastore if it's not in cache at the time you ask for it.
Context: using memcached (with Google App Engine and Objectify, but this is irrelevant). I will create a simple example: a game with two entities, Player and Game. In a single request, users will often consult the games that are open, or player profiles. There may be 100 open games, or maybe 3,000,000 games. (1) Is it a good idea to use caching for all entities? If I have unused RAM, why not use it for games or players? Is there a downside to this (other than the cache access itself, which takes almost no time)? (2) Another question: when loading objects, should I partition them to optimize what is stored and cached? For example: player { email pass punctuation // This data will change quite frequently numGamesClosed // This data will change quite frequently } Maybe better: @Entity //DataStore entity player { email pass } @Cache //The entity will be cached into Memcached @Entity //DataStore entity DatosJugador { @Parent Key <Player> owner; punctuation numGamesClosed } Thanks a lot
Memcached: Caching objects can be always good? (GAE Objectify)
No, the URL you see there is not used to load another page. There are AJAX requests in the JavaScript code of the website that load the new content to display and update the URL bar. You can read more about it in this article and in the following questions asked in the past: Modify the URL without reloading the page; Updating address bar with new URL without hash or reloading the page
What has been known for a while is that "fast navigation" works easily for http://example.com/#1 --> http://example.com/#2. However, there is a new technique out there that enables fast navigation between http://example.com/1 --> http://example.com/2. EXAMPLE: http://rageslide.com/ As you can see in the example, the navigation between http://rageslide.com/1 and http://rageslide.com/2 etc. via swiping apparently DOES NOT FORCE THE ENTIRE SITE TO RELOAD. I'd like to do the same for my site, but I have no idea how to do this. All pages served by my site are dynamic (via PHP and MySQL). I have this idea: Cache the generated output of a page (http://example.com/2) for 60 seconds. When the user is on http://example.com/1, preload http://example.com/2 via JavaScript. The user navigates from http://example.com/1 to http://example.com/2. Since the content is preloaded and cached, it will be served to the user instantly. Different idea: Somehow http://example.com/1 is interpreted as http://example.com/content.php#1 through a .htaccess rule, but I have no idea if this is possible or not. Will this work? Or what would be the best way to solve this problem?
Make a URL Navigation load instantly
The Parse JavaScript SDK is open source, so you could look at the implementation of Parse.User.current and Parse.User._saveCurrentUser. Maybe you could do something similar. http://www.parsecdn.com/js/parse-1.1.11.js
Currently using a setup that follows: Backbone, Parse, Require, and Marionette. I've found through my application that I often need to reuse objects I've already pulled down from Parse. Parse already does this through Parse.User.current(), However it would be great to store other entities locally rather than retrieving them over and over again. Does anyone have any suggestions in terms of good practices or libraries to use for caching these objects locally or would having global variables that hold the information while the application runs be enough?
Caching objects locally for reuse (Parse.com / Backbone)
You can use OutputCacheAttribute to control output caching on a controller or per-action basis, and use VaryByCustom. [OutputCache(Duration = 60, VaryByParam = "*", VaryByCustom = "userName")] Place that on the controllers, then go into your Global.asax.cs and override GetVaryByCustomString: public override string GetVaryByCustomString(HttpContext context, string arg) { if (arg.ToLower() == "username" && context.User.Identity.IsAuthenticated) return context.User.Identity.Name; return base.GetVaryByCustomString(context, arg); }
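Outside ASP.NET, the same per-user variation can be sketched as a cache whose key includes the user name; this hypothetical Python decorator mirrors what VaryByCustom="userName" achieves with a 60-second TTL:

```python
import time

def cached_per_user(ttl, clock=time.time):
    """Cache a render function's output per user name for ttl seconds."""
    store = {}  # user name -> (cached output, timestamp)

    def decorate(render):
        def wrapper(user):
            now = clock()
            hit = store.get(user)
            if hit is not None and now - hit[1] < ttl:
                return hit[0]           # fresh enough: reuse cached output
            html = render(user)
            store[user] = (html, now)   # (re)fill this user's slot
            return html
        return wrapper
    return decorate

renders = []

@cached_per_user(ttl=60)
def render_partial(user):
    renders.append(user)                # count real renders
    return f"<div>hello {user}</div>"

first = render_partial("alice")
repeat = render_partial("alice")        # within the TTL: no second render
other = render_partial("bob")           # different user: separate cache entry
```

The key point is that the cache key varies by user, so one user's cached partial is never served to another.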
I have an MVC application with approx 20 controllers. In this application I want to cache certain views (mostly partials) for 60 seconds, i.e. the result would only change once per minute, even if the underlying data changed during that minute. Seems simple enough. The complication is that the partials show different data dependent on the currently logged-in user. How can I make sure that the cache is per user using MVC3?
Caching partials based on logged in user
I would keep in mind that Solr already has a lot of caching built into it in order to speed up common queries. I'd advise you to look into the inherent capabilities in Solr/Lucene before you go off and reinvent the wheel with your own query cache. Here is a good place to start.
We are developing a search engine web application that will enable users to search the content of about 200 portals. Our business partner is taking care of maintaining and feeding a Solr/Lucene instance that does the workhorse job of indexing the data. Our application queries Solr and presents the results in a human-friendly way. However, we are wondering how we could limit the number of queries, perhaps using some form of caching. The results could be cached for a few hours. What we are wondering is: what could be a good strategy for caching the query results? Obviously we expect the method invocations to vary a lot... Does it make sense at all to do caching? Is there some caching system that is particularly suitable for this use case? We are using Spring 3 for the development.
What caching strategy for search queries
Yes and yes; these are common race conditions. You can avoid them by simply writing the code as var test = _cache.Get("Test"); if (test != null) { return test as string; }
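The failure mode is a classic check-then-act race. A small Python sketch makes it reproducible without threads, using a test double whose entry "expires" right after the containment check (all names are invented for illustration):

```python
def racy_get(cache, key):
    if key in cache:           # Contains(...)
        return cache.get(key)  # ...the entry may be gone by now
    return None

def safe_get(cache, key):
    return cache.get(key)      # single read; test the value you actually hold

class ExpiresAfterContains(dict):
    """Test double: the entry expires inside the check-then-act window."""
    def __contains__(self, key):
        present = dict.__contains__(self, key)
        self.pop(key, None)    # simulate expiry/removal right after the check
        return present

racy = racy_get(ExpiresAfterContains(Test="hello"), "Test")
safe = safe_get(ExpiresAfterContains(Test="hello"), "Test")
```

racy_get sees the containment check succeed and then reads nothing, exactly the null surprise the question describes; safe_get performs one read and inspects the value it actually received.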
MemoryCache is a thread-safe class, according to this article. But I don't understand how it will behave in a specific situation. For example, I have this code: static private MemoryCache _cache = MemoryCache.Default; ... if (_cache.Contains("Test")) { return _cache.Get("Test") as string; } Can the element's lifetime expire just after I call Contains(), so that a null value will be returned? Can another thread remove the item just after I call Contains(), so that a null value will be returned?
MemoryCache Contains() thread-safety
Have a look here: Entity Framework - Second Level Caching with DbContext SOURCE: http://www.codeproject.com/KB/aspnet/435142/SecondLevelCachingExample.zip and here: Implementing second level caching in EF code first SOURCE: http://www.dotnettips.info/File/UserFile?name=EfSecondLevelCaching.zip
We are running our application in Windows Azure. We are experiencing performance problems with SQL Azure, so we are looking into implementing a second-level cache. With the ORM that we are currently using this is quite difficult to accomplish. What about an Entity Framework second-level cache, specifically when using Windows Azure Cache(*)? I know that it is currently not supported out of the box and that there are some wrappers available. But are there any future plans to support this out of the box? (*) The idea is to use a certain percentage of the Web roles' memory for caching. For example, using 5 medium Web roles and 20% memory for cache will mean a consistent cache of 3.5 GB.
Entity Framework second level cache and Windows Azure cache
The best way to do this is likely with LISTEN and NOTIFY. Have your app maintain a background worker with a persistent connection to the DB. In that connection, issue a LISTEN name_changed, then wait for notifications. If npgsql supports it, it might offer a callback; otherwise you'll have to poll. Add a trigger to the name table that issues a NOTIFY name_changed. When your background worker gets the notification, it can flush the cache. You can even use the NOTIFY payload to invalidate only changed entries selectively.
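The listener/invalidate loop can be sketched with an in-process queue standing in for the LISTEN connection; the NOTIFY payload is modeled as the cache key to drop (hypothetical names throughout, no real database involved):

```python
import queue
import threading

class InvalidatingCache:
    """A cache whose background worker drops entries on change notices."""

    def __init__(self, channel):
        self.data = {}
        self._channel = channel
        threading.Thread(target=self._listen, daemon=True).start()

    def _listen(self):
        while True:
            key = self._channel.get()     # blocks, like waiting on NOTIFY
            if key is not None:
                self.data.pop(key, None)  # payload-selective invalidation
            self._channel.task_done()
            if key is None:               # shutdown sentinel
                break

channel = queue.Queue()
cache = InvalidatingCache(channel)
cache.data["Name"] = "old value"
channel.put("Name")  # what the trigger's NOTIFY would deliver
channel.join()       # wait for the listener to process the notice
```

In the real setup the blocking get() is replaced by the persistent connection waiting on the name_changed channel, and the put() is performed by the database trigger.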
In an ASP.NET/Mono MVC2 application, the standard ASP.NET web cache is used to speed up database access: string GetName() { // todo: detect if data has changed and invalidate cache var name = (string)HttpContext.Current.Cache["Name"]; if (name!=null) return name; name = db.Query("SELECT name from mydata"); HttpContext.Current.Cache.Insert("Name", name); return name; } mydata can be changed by another application; in this case this method returns stale data. How can I detect whether the data has changed, and return fresh data from the PostgreSQL database in that case? It is OK to clear the whole web cache if mydata has changed.
How to invalidate ASP.NET cache if data changes in PostgreSql database
The symbol this is always a local reference, so there's no need to "cache" it for performance reasons. There may be other reasons to preserve its value in another local variable however. When there's a local function that needs access to the this value from its containing function, then the containing function must make a copy of the value, since this is always set upon any function invocation. (It may not be purely accurate to call this a "local reference"; the point is that the keyword always references a value pertinent to the local function activation record.)
It's recommended to cache globals locally for better performance like so: function showWindowSize() { var w = window; var width = w.innerWidth; var height = w.innerHeight; alert("width: " + width + " height: " + height); } Is the same true when using the "this" keyword, or is it cached already? Example: Game.prototype.runGameLoop = function() { var self = this; self.update(); self.draw(); };
Is "this" cached locally?
The /data/data/<packagename>/files folder you get e.g. via Context#getFilesDir() is not cleared when the cache is cleared. The files there will only be deleted when your app is uninstalled or the user hits the "delete data" button in system settings. The /data/data/<packagename>/cache folder (Context#getCacheDir()), on the other hand, can be cleared automatically (or through the "clear cache" button in system settings). As the documentation states: These files will be ones that get deleted first when the device runs low on storage. There is no guarantee when these files will be deleted.
Is "data/data//files" deleted when application cache cleared in android from setting page?
Does clearing the application cache delete the "data/data/<package_name>/files" files?
The meta tag only instructs the browser not to cache the HTML page, not its assets. To avoid caching, instead of embedding the SWF like "foo.swf", try it like "foo.swf?c=" + new Date().getTime() The same trick goes for any other embedded dynamic assets, including videos, audio, and dynamic images.
I have found that to reload the .swf file every time, so the user has the latest version, you can add the following code to eliminate caching by the browser: <meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" /> <meta http-equiv="Pragma" content="no-cache" /> <meta http-equiv="Expires" content="0" /> Anyone have other suggestions? Thanks!
Adobe Flex .swf Cache
I used Akamai about a year ago and never found such a tool. I wish I had, though; it would have been quite useful. You could use Charles or a similar tool, and return the 304 Not Modified HTTP code for whatever assets you want to test as being cached. It's a pretty manual way of doing it, but depending on the scale of your tests it could work.
I am trying to test some code and see how it would work when cached by Akamai. Can anyone recommend a tool or add-on that would allow me to emulate a page load as it would be when cached by Akamai?
Emulating an Akamai caching environment
You've said in the method call that it needs to return type int, which isn't nullable (and so returns its default value, 0, instead). Try changing the second line to: count = cachClient.Get<int?>(myKey); and see if it returns null then.
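The same default-vs-null distinction can be sketched in Python, where dict.get plays the cache client (function names are illustrative):

```python
def get_int(cache, key):
    """Like Get<int>: a missing key collapses to the value type's default, 0."""
    return cache.get(key, 0)

def get_nullable_int(cache, key):
    """Like Get<int?>: a missing key stays distinguishable as None."""
    return cache.get(key)

cache = {"hits": 0}  # a stored zero is indistinguishable from Get<int>'s miss
```

With the non-nullable accessor, a cached value of 0 and a missing key look identical, which is exactly why the question's count is always 0.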
I am using the ServiceStack cache client with Redis to cache integers. I am calling the Get method on a key I know does not exist, like this: int? count; count = cachClient.Get<int>(myKey); count always has a value of 0 after this call. From the documentation, I am expecting the Get method to return null for a non-existent key. Am I doing something wrong, or am I understanding this incorrectly?
Why does the Redis cache client in ServiceStack return 0 instead of null for a non-existent key of an integer type?
You need a pre-deploy build step that merges all of your .js files together into a single one for download to the client. (It will also likely minify the resulting file.) Then you can serve a single file with HTTP cache headers. Comments: "And I must use JS Builder for that, right?" - Kabeer. "I don't know what JS Builder is, but there are many tools that can do this sort of thing, including basic cat." - Domenic. "You should use Sencha SDK Tools for this. Check the forum for the help guide, as it is finicky." - Neil McGuigan
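The merge step itself is trivial; here's a Python sketch of such a build step that concatenates sources into one bundle (a real pipeline would also minify, and the file names here are throwaway):

```python
import tempfile
from pathlib import Path

def bundle(sources, out_path):
    """Concatenate .js sources into one file served with long cache headers."""
    merged = "\n".join(Path(p).read_text() for p in sources)
    Path(out_path).write_text(merged)  # one file -> one HTTP request
    return merged

workdir = Path(tempfile.mkdtemp())
(workdir / "a.js").write_text("var a = 1;")
(workdir / "b.js").write_text("var b = 2;")
merged = bundle([workdir / "a.js", workdir / "b.js"], workdir / "bundle.js")
```

One merged file means one request, which the server can then cache aggressively; dedicated bundlers add minification and dependency ordering on top of this.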
In order to achieve modularity and reuse of my custom elements/classes (extended from Ext JS classes/widgets), I am following the approach suggested in The Class System. Since I'm using most of the simple and complex widgets as well as layout containers, I am likely to end up with scores of .js files with 3 to 4 levels of namespace hierarchy (and therefore folder structure). I am a little nervous with this approach because traditionally (with raw JavaScript) I have tried to minimize the number of .js files. Since my page may use many of these custom elements, it will request quite a few of these .js files. Am I right to assume that this will create a huge performance bottleneck, or am I thinking too much? Next, how can I ensure that the .js files, once requested from the server, remain cached in the browser at least for the session? I have observed that the .js files are always requested with a dc attribute, each time with a random number. This doesn't help the situation, since all I am requesting is a class definition file that does not change with each request. How can I address this?
Ext JS best practice for classes ends up with too many .js files. What about performance?
The most granular action you can do is to retrieve a particular key. If you store a list of items under that key, you get the whole list. To get exactly one item, you'll need to store an item per key. There are various strategies for this; however, note that atomicity is at the key level, so actions across multiple keys can result in race conditions. Personally, I would custom-serialize your list (instead of relying on default .NET serialization) as this will be fairly quick. But this depends on how many times "sometimes" is when talking about updating or requesting individual keys. The common-sense "store it as you will use it" applies here. For the custom serialization, we use protobuf-net and store the resulting byte[] into the cache manually - as in, the cache takes and gives a byte[], and we do the serialization ourselves.
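Item-per-key storage with explicit serialization might look like this in Python, where JSON stands in for the protobuf-net step and the key scheme is invented for illustration:

```python
import json

def item_key(user_id):
    # One cache key per item, instead of one big list under a single key.
    return f"user:{user_id}"

def put_user(cache, user):
    # Serialize to bytes ourselves; the cache only sees opaque byte[] values.
    cache[item_key(user["id"])] = json.dumps(user).encode("utf-8")

def get_user(cache, user_id):
    raw = cache.get(item_key(user_id))
    return None if raw is None else json.loads(raw)

cache = {}
put_user(cache, {"id": 7, "name": "dana"})
```

Fetching or updating one user now touches exactly one key, so there is no need to load and rewrite the whole list, though, as the answer notes, operations spanning several keys are not atomic.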
I need to store a list of data in the cache. Sometimes I'll need the full list; sometimes I'll only need to query the results and return one item from the list. The current logic for one item is this: User result; var cachedUsers = _cache.Get<List<User>>(Constants.Cache.UsersKey); if( cachedUsers != null) { result = cachedUsers.FirstOrDefault(u => u.UserID == userID); if (result != null) return result; } result = GetUserFromDb(userID); if (cachedUsers != null) { cachedUsers.Add(result); _cache.Store(StoreMode.Set, Constants.Cache.UsersKey, cachedUsers); } return result; Can I access one element directly from the cache without loading the full list of users? I'd also need logic to add a user into the cache without loading the full list and overwriting it. EDIT: I have the option to store items separately from the list, or to rewrite the cache logic from scratch with a different approach. What is the best way to implement this?
Caching and retrieving list of data using memcached
You can try setting the background with css and only change the className in code. This jsfiddle shows an idea to do that. In the Network tab of the Chrome developer tools you'll see that the images, after initial load, are subsequently loaded from cache. The code becomes as simple as: div.className = yellowBG ? 'yellowBG' : ''; In this jsfiddle the images are preloaded.
When the user hovers over a particular area of screen, I create a div, then set the background: div= document.createElement('div'); if (yellowBg) { div.style.backgroundImage = 'url(\'../partHoverBgYellow.png\')'; } else { div.style.backgroundImage = 'url(\'../partHoverBg.png\')'; } parent.appendChild(div) In Firefox and IE, the background image gets cached after it is fetched for the first time. But in chrome, it appears that it is not cached. The result being that the div appears before its background is set every time. I have checked using Fiddler, and the image is indeed fetched every time. Is there a way I can prevent this from happening?
How can I force Chrome to stop fetching the same images?
It just means that the SP itself was found in the execution plan cache. This means that the SP doesn't need to be re-compiled. http://msdn.microsoft.com/en-us/library/aa173892(v=sql.80).aspx Line 3 onward, in your example, shows that the database is itself being interrogated to complete the query.
I would like to clear something up in my understanding on the "SP:CacheHit" event in SQL Profiler. Is it safe to assume that whenever a "SP:CacheHit" event is shown against an execution of a stored procedure that no hit is being made to the database? The reason I ask is because I currently have a query (using Entity Framework/LINQ) that selects one random record out of 4000 rows in a table. Has SQL Server truly cached 4000 records of data from my table, so any subsequent queries will not hit the database? The series of events are as follows: RPC:Starting SP:CacheHit SP:StmtStarting SP:StmtCompleted --> This is where I see the number of reads and row counts RPC:Completed --> This is where I see the number of reads and row counts I found this useful article that somewhat clarified my understanding, but confirmation from one of my fellow experts would be great.
Clarification on SP:CacheHit Event in SQL Profiler
Stream the input to a file. Really, there is no other choice: it comes in faster than you can process it. You could create one file per second of input data. That way you can start processing old files while new files are still being streamed to disk.
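The one-file-per-second idea can be sketched in a few lines (file naming and helper names are illustrative, not from the original post):

```python
import os
import time

def chunk_path(directory, t=None):
    """Name the output file after the second it belongs to."""
    t = int(t if t is not None else time.time())
    return os.path.join(directory, "capture-%d.bin" % t)

def write_chunk(directory, payload, t=None):
    """Append incoming bytes to the file for the current second."""
    path = chunk_path(directory, t)
    with open(path, "ab") as f:
        f.write(payload)
    return path

def completed_chunks(directory, now=None):
    """Files for past seconds are complete and safe for the slow processor."""
    now = int(now if now is not None else time.time())
    current = os.path.basename(chunk_path(directory, now))
    return sorted(p for p in os.listdir(directory)
                  if p.startswith("capture-") and p != current)
```

The capture thread only ever appends to the file for the current second, so the slow processing stage can safely pick up any older file without coordination.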
I am working on a project where we can have an input data stream at 100 Mbps. My program can be used overnight for capturing these data and thus will generate a huge data file. My program logic, which interprets these data, is complex and can process only 1 Mb of data per second. We also dump the bytes to a log file after processing. We do not want to lose any incoming data and at the same time want the program to work in real time. So we maintain a circular buffer which acts as a cache. Right now the only way to save incoming data from getting lost is to increase the size of this buffer. Please suggest a better way to do this, and also what alternative ways of caching I could try.
How to handle 100 Mbps input stream when my program can process data only at 1 Mbps rate
I suggest adding a delegate class which handles the caching. The delegate class could look like this: public class Delegate { private static SomeServiceAsync service = SomeServiceAsync.Util.getInstance(); private static List<MyData> data; public static void getData(final Callback callback) { if (data != null) { callback.onSuccess(data); } else { service.getData(new Callback() { public void onSuccess(List<MyData> result) { data = result; callback.onSuccess(result); } }); } } } Of course this is a crude sample; you have to refine the code to make it reliable.
I'm thinking of introducing some kind of caching mechanism (like HTML5 local storage) to avoid frequent RPC calls whenever possible. I would like to get feedback on how caching can be introduced in the below piece of code without changing much of the architecture (like using gwt-dispatch). void getData() { /* Loading indicator code skipped */ /* Below is a gwt-maven plugin generated singleton for SomeServiceAsync */ SomeServiceAsync.Util.getInstance().getDataBySearchCriteria(searchCriteria, new AsyncCallback<List<MyData>>() { public void onFailure(Throwable caught) { /* Loading indicator code skipped */ Window.alert("Problem : " + caught.getMessage()); } public void onSuccess(List<MyData> dataList) { /* Loading indicator code skipped */ } }); } One way I can think of to deal with this is to have a custom MyAsyncCallback class defining onSuccess/onFailure methods and then do something like this - void getData() { AsyncCallback<List<MyData>> callback = new MyAsyncCallback<List<MyData>>; // Check if data is present in cache if(cacheIsPresent) callback.onSuccess(dataRetrievedFromCache); else // Call RPC and same as above and of course, update cache wherever appropriate } Apart from this, I had one more question. What is the maximum size of storage available for LocalStorage for popular browsers and how do the browsers manage the LocalStorage for different applications / URLs? Any pointers will be appreciated.
Enabling caching for GWT RPC asynchronous calls
If you implement this yourself using php, you are responsible for also sending the 304 Not Modified. So compare the If-None-Match header with your ETag, and use header() to send back the 304.
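The comparison itself is tiny; here is a language-agnostic sketch of the logic (in real PHP you would read $_SERVER['HTTP_IF_NONE_MATCH'] and emit header('HTTP/1.1 304 Not Modified') before exiting):

```python
# Sketch of the conditional-request logic the PHP script must implement:
# if the client's If-None-Match equals our ETag, answer 304 with no body.
def respond(request_headers, etag, render_body):
    if request_headers.get("If-None-Match") == etag:
        return 304, {"ETag": etag}, b""  # client's cached copy is still fresh
    return 200, {"ETag": etag}, render_body()
```

Note that the body callback only runs on a miss, which is the whole point: a 304 saves both the transfer and the rendering work.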
I'm trying to understand local caching with ETags and an nginx (1.2.1) server which forwards PHP requests to a php-cgi daemon. Here is my simple index.php: <?php header('Cache-Control: public'); header('Etag:"5954c6-10f4-449d11713aac0"'); echo microtime(true); On a second request, my browser sends an If-None-Match header: Accept:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Charset:ISO-8859-1,utf-8;q=0.7,*;q=0.3 Accept-Encoding:gzip,deflate,sdch Accept-Language:fr-FR,fr;q=0.8,en-US;q=0.6,en;q=0.4 Cache-Control:max-age=0 Connection:keep-alive Host:cache.loc If-None-Match:"5954c6-10f4-449d11713aac0" But my web server doesn't return a 304: HTTP/1.1 200 OK Server: nginx/1.2.2 Date: Thu, 12 Jul 2012 11:46:03 GMT Content-Type: text/html Transfer-Encoding: chunked Connection: keep-alive X-Powered-By: PHP/5.3.12 Cache-Control: public Etag: "5954c6-10f4-449d11713aac0" Cache-Control: public Unless I've misunderstood, my server should compare the ETag with the If-None-Match sent and return a 304 response because they're the same. Where am I wrong? Should I compare the ETag with If-None-Match in my PHP script, because nginx (or Apache) will not do the job itself? Regards, Manu
Nginx/PHP : Nginx is not returning a 304 response when sending Etag header
A file cache is fine, you just have to be smart about it. I'd aim to keep directories to, say, 500 entries or less. With 40k entries, just hashing the URL and using the first two hex characters of the hash will give you 256 folders, each of which should contain on average ~150 files.
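The sharding scheme above can be sketched like this (the base directory, hash choice, and file naming are illustrative assumptions):

```python
import hashlib
import os

def shard_path(base_dir, url):
    """Spread cache files over 256 folders keyed on the first two hex
    characters of the URL's MD5, so no single directory grows huge."""
    digest = hashlib.md5(url.encode("utf-8")).hexdigest()
    return os.path.join(base_dir, digest[:2], digest + ".html")
```

Two hex characters give at most 256 shard directories, so 40k pages average roughly 150 files per directory, well under the ~500-entry target.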
I'm looking for the best solution for caching thousands of web pages. Right now I am using flat files, which works great until there are many thousands of flat files; then the entire file system slows down (a lot) when accessing the cache of files (running on CentOS with EXT3 under OpenVZ). I'd like to explore other options such as Redis or MongoDB as a substitute, but would they be any faster? And if not, what would be the best suggestion? My system dynamically creates over 40K pages per website, so it's not feasible to do a memory cache either. Thanks!!
Suggestion for fastest PHP web page caching for many thousands of pages? [closed]
From a cache point of view, the problem isn't the way you are accessing the elements: in this case using a pointer or the array index is equivalent. BTW, Node* p[] is an array of pointers, so you could have allocated your Node objects in distant memory areas (for example by calling ptr = new Node() repeatedly). You get the best cache performance if: your Node objects are stored contiguously in memory, and the Node size doesn't exceed the cache size.
void foo(Node* p[], int size){ _uint64 arr_of_values[_MAX_THREADS]; for (int i=0 ; i < size ; i++ ){ arr_of_values[i] = p[i]->....; // much code here // } } vs void foo(Node* p[], int size){ _uint64 arr_of_values[_MAX_THREADS]; Node* p_end = p[size]; for ( ; p != p_end ; ){ arr_of_values[i] = (*p)->.....; p++; // much code here // } } I created this function to demonstrate what i am asking: what is more efficient from the cache efficiency aspect : taking p[i] or using *p++? (i'll never use the p[i-x] in the rest of the code, but i may use p[i] or *p in the following calculation)
array traversal vs pointer, cache efficiency aspect
If you follow the link below you will see that Windows Azure Cache API does not support dataCache.CreateRegion(): Caching API Support in Windows Azure Caching But it sure is supported in Windows Azure Caching (Preview) so you can use either dedicated or co-located cache with your project if you want to create regions and handle cache respectively. If you want to know the difference between Windows Azure Shared Cache and Windows Azure Caching (Preview) please take a look at my blog: Difference between Windows Azure Cache(Preview) and Windows Azure Shared Cache
I'm trying to add a region using Azure Datacache object but I'm getting such error "This operation is not supported by the cache." Dim dataCache As DataCache = dataCacheFactory.GetDefaultCache() Dim tagList As New List(Of DataCacheTag)() From {New DataCacheTag("tagList")} dataCache.CreateRegion("region-1") dataCache.Put("key-name", "key-value", New Timespan(0, 10, 0), tagList, "region-1")
Azure CreateRegion "This operation is not supported by the cache."
Today, after months of working on this matter, I figured out that you need to shut down XPCOM cleanly, otherwise the "dirty flag" in the cache is set and the Gecko framework will clear the cache on startup. So you need to call Gecko.Xpcom.Shutdown() to get a clean shutdown; the "dirty flag" is not set if you exit your program this way (e.g. on closing the form or something similar). Now I need to refactor my code, because I currently check for still-open windows and kill them without giving the XPCOM framework any chance of a clean shutdown. sigh Maybe this hint will help other people... Regards, Markus
I'm using GeckoFx v1.9.1.0 in VB and found a way to activate the cache with the following code (just to be sure it is activated; I know it is the default): Skybound.Gecko.GeckoPreferences.User.Item("browser.cache.disk.enable") = True Skybound.Gecko.GeckoPreferences.User.Item("browser.cache.memory.enable") = True Skybound.Gecko.GeckoPreferences.User.Item("Browser.cache.check doc frequency") = 3 Skybound.Gecko.GeckoPreferences.User.Item("Browser.cache.disk.capacity") = 50000 Skybound.Gecko.GeckoPreferences.User.Item("Browser.cache.memory.capacity()") = -1 I can see that the cache at "user/Geckofx/1.9/cache" is filled during the first load of a page, but on restart of my application EVERYTHING is reloaded (although the cache is activated). So I think there's another option missing to tell it that nothing should be reloaded on each start. Can you help me find this option? Thx Markus
How to say GeckoFX to use old Cache-files instead of reloading them?
document.getElementsByTagName returns a "live" NodeList, which isn't what you think at all. When you access the list, the DOM is traversed (the implementation may cache it) every time to get the result. This gives the illusion of the list being live. document.getElementsByTagName("div") === document.getElementsByTagName("div") //true To do what you want, simply convert it to an array: DOM = [].slice.call(DOM)
So I tried to build a cache of the DOM: var DOM = document.getElementsByTagName('*'); However, the DOM variable seems to be a dynamic reference, so that if I change an element in the DOM, the DOM variable changes as well. I tried iterating through the DOM variable and using the cloneNode method to create a deep copy of each node. This works in that it does not change when I change the DOM. However, the problem is that a cloned node does not equal its original DOM node when you compare them with the === operator. So to sum up, I'm looking to create a cache of the DOM that does not change but whose nodes are still equal to the original DOM nodes.
How to build a cache of the DOM that does not change
Add a new property public int EmailALertInterval { get { return Current.EmailALertInterval; } }
I have set up a configuration system that will grab the configuration from a MySQL database. And I have got it working, but now I realize that to get the Value from the Cache, I have to use this long messy code. CacheLayer.Instance.Caches["app_config"].Current.EmailALertInterval I would like to remove Current, but I haven't managed to figure out if it's possible to have the class do this directly. At the moment the implementation looks like this. public T Current { get { if (_current == null) { ReloadConfiguration(); } return _current; } } But I would like to simply it to this: CacheLayer.Instance.Caches["app_config"].EmailALertInterval I am looking at something like this, but this only works with indexers. public T this[T key] EDIT: Just to add some more context I will add some more code. This is CacheLayer. It essentially allows me to store multiple configurations. I may have one that is for example the general app config, but I can also grab an array of Emails used. public Dictionary<String,IGenericCache<dynamic>> Caches { get { return _caches; } } public void AddCache(String config) { _caches.Add(config,new GenericCache<dynamic>(config)); } Inside my GenericCache I load the configuration using a JSON string stored in a MySQL db. _current = JsonConvert.DeserializeObject<T>(db.dbFetchConfig(config)); The reason that GenericConfig is T and not dynamic is because I want to be able to make a custom implmentation outside of the Current0, one that does not necessarily use Current1. ANother example on how I want this to be used. Current2 This would in fact grab an JSON Array from the MySQL containing a list of Emails. Any ideas?
Simplifying class structure
It's nice if mongodb is managing the index and ram on its own. MongoDB does not manage the RAM at all. It uses Memory-Mapped files and basically "pretends" that everything is RAM all of the time. Instead, the operating system is responsible for managing which objects are kept in RAM. Typically on a LRU basis. You may want to check the sizes of your indexes. If you cannot keep all of those indexes in RAM, then MongoDB will likely perform poorly. However, I am not sure if it does and about the way mongodb can speed up these queries on indexs-only best. MongoDB can use Covered Indexes to retrieve directly from the DB. However, you have to be very specific about the fields returned. If you include fields that are not part of the index, then it will not return "index-only" queries. The default behavior is to include all fields, so you will need to look at the specific queries and make the appropriate changes to allow "index-only". Note that these queries do not include the _id, which may cause issues down the line.
After using MyISAM for years now, with 3 indexes and around 500 columns over millions of rows, I wonder how to "force" MongoDB to keep indexes in memory for fast read performance. In general it is a simply structured table, and all queries are WHERE index1=.. or index2=.. or index3=.. (MyISAM) and pretty simple in MongoDB as well. It would be nice if MongoDB managed the indexes and RAM on its own. However, I am not sure whether it does, nor about how MongoDB can best speed up these index-only queries. Thanks
mongodb: force in-memory
It's difficult to understand your intention in the controller here, since both your show and index methods have the same implementation. With that being said, you likely want to move any caching logic into the model, where it will be easier to isolate your problem. Please consider the following refactor: stocks_controller: def index @stocks = Stock.active_for_restaurant(params[:restaurant_id]) end def show @stock = Stock.fetch_from_cache(params[:id]) end stock.rb: def active_for_restaurant(restaurant_id) Rails.cache.fetch(custom_cache_path(restaurant_id, Const::ACTIVE_STOCKS)) do Stock.only_active_stocks(restaurant_id) end end def fetch_from_cache(id) Rails.cache.fetch(id) { find(id) } end Note that the block form matters: Rails.cache.fetch(id, find(id)) would hit the database on every call regardless of a cache hit, because the second argument is evaluated eagerly. For more info on fetch see: http://api.rubyonrails.org/classes/ActiveSupport/Cache/Store.html#method-i-fetch
How to activerecord cache in Rails 3.2.3 stocks_controller.rb: def index @stocks = Rails.cache.read custom_cache_path(@res.uuid, Const::ACTIVE_STOCKS) if @stocks.blank? @stocks = Stock.only_active_stocks(params[:restaurant_id]) Rails.cache.write custom_cache_path(@res.uuid, Const::ACTIVE_STOCKS), @stocks end end def show @stocks = Rails.cache.read custom_cache_path(@res.uuid, Const::ACTIVE_STOCKS) if @stocks.blank? @stocks = Stock.only_active_stocks(params[:restaurant_id]) Rails.cache.write custom_cache_path(@res.uuid, Const::ACTIVE_STOCKS), @stocks end end When does a request to the show action cache return nil?
How to activerecord cache in Rails 3.2.3
There are two active pull requests on the GitHub Issue Tracker which pertain to this problem: https://github.com/EllisLab/CodeIgniter/pull/661 https://github.com/EllisLab/CodeIgniter/pull/1403 They are both a little different and while one of them looks pretty old (I didn't notice it) the second more recent issue might be right up your alley. If you can verify either of the fixes then give the pull request a +1 and I'll merge it. I had an issue a while back where I wanted to use Memcache on a server which didn't have it installed correctly, and instead of it falling back to Files it just bitched at me, but I never found the time to fix it. If this is the same issue then great, I can get it merged.
I'm trying to use the cache driver which CodeIgniter supplies in their PHP framework. However, it seems to fully ignore the backup adapter. If I use: $this->load->driver('cache', array('adapter' => 'apc', 'backup' => 'dummy')); Then I assume that it will use APC if available, otherwise it'll fall back to the dummy cache (do nothing). This is obviously very handy since not everyone will have APC installed. This doesn't seem to be the case, since I get an error when testing the following code: if(!$config = $this->cache->get('config')) { //Get config from database $config = $this->db->from('core')->get()->row_array(); //Cache it $this->cache->save('config', $config, 600); } (Fatal error: Call to undefined function apc_cache_info())
CodeIgniter cache driver ignoring backup adapter [closed]
Using the cache template tag is your best bet when dealing with dynamic responses. Any time you're varying on things like logged in users, session stores, etc, you're simply not going to be able to cache the whole response. Caching the non-changing bits of your template with the cache template tag is the next best thing. Then, at least, only the actual dynamic parts need to be processed.
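What the cache template tag buys you can be shown with a toy model (a plain dict stands in for Django's cache backend, and all names are made up; the real tag computes a key from the fragment name and its vary-on arguments):

```python
# Toy model of template-fragment caching: the expensive, shared fragment is
# rendered once and reused, while the per-user parts render on every request.
fragment_cache = {}

def render_page(user_name, render_fragment):
    fragment = fragment_cache.get("sidebar")
    if fragment is None:
        fragment = render_fragment()       # expensive shared part, cached
        fragment_cache["sidebar"] = fragment
    # dynamic per-user part, never cached
    return "Hello %s\n%s" % (user_name, fragment)
```

Different users get different greetings, but the heavy fragment is rendered only once, which is exactly the trade-off for a semi-dynamic view.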
I have a view in Django which is "semi-dynamic". In my case, it serves different content to first time visitors and returning visitors. I know that my view can run all sorts of logic before rendering the response, but that would make caching in the view (and beyond) level impossible, because different types of users (according to cookie, session data or user data) will get a different response. I'm thinking of several options to implement this: Redirect from my view to another view which is cached. I don't like this approach because it affects the UX (changes the URL). I know that if I'd ever want to cache in the HTTP server level, I'll need to opt for that approach. Cache in the template level using the cache tag (can practically cache the entire template, head to toe). That way, I can still use different templates for each case. It's an OK approach, I guess, it still involves running the template engine which is something I'd rather avoid. Cache the ready HttpResponse objects myself in the view using the caching framework directly. Practically this sounds like it would offer the best performance, but it feels a bit like "reinventing the wheel". Any other ideas? Any standard way of doing that which I'm missing?
Caching a semi-dynamic view in Django
You should get the file (from disk or WS) and render the content in a classic Action, then set the cache with an annotation : @Cached(key="sitemap", duration=86400) public static Result index() { // ... set sitemap variable from your file return ok(siteMap); } http://www.playframework.org/documentation/2.0/JavaCache Or you can achieve the same behavior with a job.
I'm using playframework v2 and I have my sitemap files being re-created once a day by an external process. They're all in the assets folder/sitemap How do I force playframework to return the file directly from disk?
Playframework 2.x - prevent assets caching
Question 1: From my experience the ASP.NET forms authentication would be enough. There is no reason to send credentials as POST, and certainly not GET. You can use that for a change-password or account-info method. You might want to look into Membership and Roles. Question 2: I would stick with the ASP.NET session. Rolling your own might make your application more prone to issues and vulnerabilities in the end, and I see it as unnecessary. Note that the in-process session is lost on application recycle; if you don't want to lose data across recycles, you can use SQL Server session state (support.microsoft.com/kb/317604).
I need to place my app business logic into a WCF service. The service shouldn't be dependent on ASP.NET and there is a lot of data regarding the authenticated user which is frequently used in the business logic hence it's supposed to be cached (probably using a distributed cache). As for authentication - I'm going to use two level authentication: Front-End - forms authentication back-end (WCF Service) - message username authentication. For both authentications the same custom membership provider is supposed to be used. To cache the authenticated user data, I'm going to implement two service methods: 1) Authenticate - will retrieve the needed data and place it into the cache(where username will be used as a key) 2) SignOut - will remove the data from the cache Question 1. Is correct to perform authentication that way (in two places) ? Question 2. Is this caching strategy worth using or should I look at using aspnet compatible service and asp.net session ? Maybe, these questions are too general. But, anyway I'd like to get any suggestions or recommendations. Any Idea
asp.net, wcf authentication and caching
Beware of HTTP caching; I looked into this some time ago (see my blog article on HTTP caching). In particular: if you use a "current" symlink and then update where it points, browsers will still use the cached version for quite some time, possibly days in my experience. I'd recommend not using the "current" symlink approach. Keep URLs keyed on a meaningful version number in the path name; this way the caching will behave the way you want.
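One way to get "meaningful version" URLs without hand-maintaining numbers is to derive the asset's name from its content; a hedged sketch (the naming convention and hash length are illustrative assumptions, not part of the answer):

```python
import hashlib

def versioned_name(filename, content):
    """Derive a cache-busting name from the file's content, e.g.
    app.css -> app.3a5b0c1d.css. Unchanged files keep their old URL
    (so they stay cached 'forever'); any change yields a new URL."""
    digest = hashlib.md5(content).hexdigest()[:8]
    stem, dot, ext = filename.rpartition(".")
    if not stem:
        return filename + "." + digest
    return "%s.%s%s%s" % (stem, digest, dot, ext)
```

This plays well with far-future cache headers: the URL changes exactly when the bytes do, so no "current" symlink or manual cache policy juggling is needed.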
For an (enterprise) web project i want to keep previous versions of the static files so that projects can decide for themselves when they are ready to implement design changes. My initial plan is to provide folders for static content like so: company.com/static/1.0.0/ company.com/static/1.0.0/css/ company.com/static/1.0.0/js/ company.com/static/1.0.0/images/ company.com/static/2.0.0/ company.com/static/2.0.0/css/ company.com/static/2.0.0/js/ company.com/static/2.0.0/images/ Each file in these folders should then have a cache-policy to cache "forever" -- one year at least. I also plan to concatenate css files and js files into one, in order to minimize number of requests. Then i would also provide a current folder (which symlinks to the latest released version) company.com/static/current/ company.com/static/current/css/ company.com/static/current/js/ company.com/static/current/images/ This will solve my first problem (that projects and sub websites can lock their code to a certain version and can upgrade whenever they are ready). But then I can see some caching issue. Now i cannot "just" cache current folder, since it will change for each release. What should my caching policies be on that folder. Also, for each release, most of the static files will never change anyway. Is it relevant to cache them forever, and rename if there are changes? I am looking for advice here, since i want to know about your best trade-off between caching and changing the files.
Versioning and caching static files: CSS, JS, images -- What to consider
I would do something like this: function GetServerStatus($site, $port){ $fp = @fsockopen($site, $port, $errno, $errstr, 2); if (!$fp) { return false; } else { fclose($fp); return true; } } $tempfile = '/some/temp/file/path.txt'; if(GetServerStatus('ServerB',80)){ $content = file_get_contents("http://serverB/somePage.aspx?someParameter"); file_put_contents($tempfile,$content); echo $content; }else{ echo file_get_contents($tempfile); }
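The same fall-back shape, as a language-agnostic sketch (the fetch callable and cache-file path are stand-ins for the HTTP call and temp file):

```python
import os

def fetch_with_fallback(fetch, cache_file):
    """Try the live backend first; on success refresh the cached copy,
    on failure serve whatever was cached last. `fetch` is any callable
    that returns the page text or raises on failure."""
    try:
        content = fetch()
    except Exception:
        if os.path.exists(cache_file):
            with open(cache_file) as f:
                return f.read()
        raise  # no live server and no cached copy: nothing we can do
    with open(cache_file, "w") as f:
        f.write(content)
    return content
```

One design note: trying the fetch and falling back on the error avoids the race in probe-then-fetch approaches, where the server can go down between the port check and the actual request.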
I have a website with the following architecture: End user ---> Server A (PHP) ---> Server B (ASP.NET & Database) web file_get_contents browser Server A is a simple web server, mostly serving static HTML pages. However, some content is dynamic, and this content is fetched from Server B. Example: someDynamicPageOnServerA.php: <html> ...static stuff... <?php echo file_get_contents("http://serverB/somePage.aspx?someParameter"); ?> ...more static stuff... </html> This works fine. However, if server B is down (maintainance, unexpected crash, etc.), those dynamic pages on server A will fail. Thus, I'd like to cache the last result of file_get_contents and show this result if file_get_contents timeouted. Now, it shouldn't be too hard to implement something like this; however, this seems to be a common scenario and I'd like to avoid re-inventing the wheel. Is there some PHP library or built-in feature that helps which such a scenario?
Fault-tolerant file_get_contents
If this is ASP.NET then the easiest way to do the caching would be to use the HttpContext.Current.Cache object. It would work something like this in your code. You can find more information on the Cache class on MSDN. if (HttpContext.Current.Cache.Get("ef_results") == null) { var results = null; // todo: get results from EF HttpContext.Current.Cache.Add("ef_results", // cache key results, // cache value null, // dependencies System.Web.Caching.Cache.NoAbsoluteExpiration, // absolute expiration TimeSpan.FromMinutes(30)); // sliding expiration } myDropDown.DataSource = HttpContext.Current.Cache.Get("ef_results"); If this is WPF/WinForms then the easiest way would be to just add a static field to your class and 'cache' the results of your EF query in that static field using the same logic as above.
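The sliding-expiration behaviour that Cache.Add sets up can be modelled in a few lines (a toy sketch, not the ASP.NET implementation; the injectable clock exists only to make the behaviour easy to demonstrate):

```python
import time

class SlidingCache:
    """Toy model of a sliding-expiration cache entry: every successful
    read pushes the expiry forward by the full TTL."""
    def __init__(self, ttl, clock=time.time):
        self.ttl = ttl
        self.clock = clock
        self.store = {}

    def set(self, key, value):
        self.store[key] = (value, self.clock() + self.ttl)

    def get(self, key):
        item = self.store.get(key)
        if item is None:
            return None
        value, expires = item
        if self.clock() >= expires:
            del self.store[key]  # entry went unused too long
            return None
        self.store[key] = (value, self.clock() + self.ttl)  # slide forward
        return value
```

For 20 rows this is arguably overkill either way, but sliding expiration means a frequently used dropdown list stays cached indefinitely while unused data eventually falls out.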
I have a drop down list that I am populating with the result set of a Entity Framework call to a SQL Server database. Currently, there are 20 records returned in that call, and I am thinking of using caching. I have never setup caching for a specific control before, can someone point me to a tutorial. Is this also kind of overkill, for that small data set?
Caching result of SQL Call
You can force the cache expiration for the BLOB, read this doc: http://msdn.microsoft.com/en-us/library/windowsazure/gg680306.aspx
I have a Silverlight project, and on Windows Azure storage I upload an image here: https://**.blob.core.windows.net/profilepicture/3d5978a1-3e51-4212-b129-9ff401149bc0 I can see my picture, but when I update it I still see the old picture (I think because of caching). When I check with "Azure Storage Explorer", the picture has changed... How can I force a refresh in my Silverlight application so it shows the latest update? Thank you very much; if you have the same question, ask me.
Update picture on my Windows Azure Storage (Refresh)
AFAIK, you can't serialize PDO objects at all: they are tightly coupled with the underlying database driver and the connection that is currently open. You would have to cache an array containing the database call's results. That may not necessarily help your performance though: fetching a lot of data from a cache (or storing it all in the PHP script's memory) may take just as long as making the database call, especially if the table is properly indexed. Using a normal database connection may be the right way to go here; if you have full control over your database and you're using MySQL, you could also consider looking into MySQL query caching.
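The workable variant, caching the fetched rows rather than the statement handle, sketched generically (the dict cache and fetch callable are stand-ins for Zend_Cache and PDO's fetchAll):

```python
import pickle

def cache_query_results(fetch_rows, cache, key):
    """You can't pickle a statement handle, but you can pickle the rows:
    fetch them all into plain data structures first, then serialize."""
    cached = cache.get(key)
    if cached is not None:
        return pickle.loads(cached)
    rows = fetch_rows()            # e.g. a list of dicts from the DB
    cache[key] = pickle.dumps(rows)
    return rows
```

In the Zend case that means caching serialize($sth->fetchAll()) instead of the PDOStatement itself; the fetched array is plain data and serializes without complaint.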
I am trying to cache the results of a query which won't change very often, if at all. In my class I have a private class variable private $_cache and in my constructor I initialize it the way I do with most of my caching: // Setup caching $frontendOptions = array('lifeTime' => (strtotime('+1 week') - time())); $backendOptions = array('cache_dir' => '../application/cache'); $this->_cache = Zend_Cache::factory('Core', 'File', $frontendOptions, $backendOptions); Later, in a function I attempt to cache a query's results: $cache_id = 'all_station_results'; if ( ($results = $this->_cache->load($cache_id)) === false ) { // Get all data from stations table $sql="SELECT * FROM locations"; $sth = $this->_db->query($sql); // Serialize query results $data = serialize($sth); // Write to cache $this->_cache->save($data, $cache_id); } else { // Return results from cache return unserialize($results); } This throws an exception: You cannot serialize or unserialize PDOStatement instances So I tried without serializing and I get this exception thrown: Datas must be string or set automatic_serialization = true Now, obviously a PDOStatement isn't a string and I don't see the difference between setting automatic_serialization = true and manual serialization. How can I cache this PDOStatement object?
Caching PDOStatement Object in Zend Framework
You can turn the RAM cache on/off on a per-ruleset basis. Go to the Caching control panel, open the Detailed Settings tab, and use "View/edit/clear per-ruleset parameters" for each operation to find the setting.
I'd like to use plone.app.caching for front-end proxy caching only (Varnish). By default, plone.app.caching comes with RAM cache settings. To make caching simpler and easier to diagnose, I'd like to cache stuff only in the front-end cache. Can the plone.app.caching RAM cache be disabled by setting the cached object count to zero? Does this have any known bad effects on the site? Are there any other ways to disable the RAM cache? Is the RAM cache enabled on vanilla Plone installations (no plone.app.caching installed)?
plone.app.caching and disabling RAM cache
Add a random number to the GET request so that IE will not identify it as "the same" request in its cache. This number could be a timestamp: new Date().getTime(). The requested URL would then be:

var _url = "/path/updated-data.htm?" + new Date().getTime()

This shouldn't cause any errors, I believe. Edit: I just read your post a bit better and saw that this is not an option for you. You say the file "is hosted on Akamai and cannot accept querystrings", but why not? I've never heard of a page that won't accept an additional "?blabla", even when it's HTML. (The asker later clarified: the data file is hosted on Akamai NetStorage, which does not allow query strings; passing one makes Akamai pull from the origin server, and they want to avoid a heavy traffic load on origin.)
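Where query strings are acceptable, the timestamp approach above can be wrapped in a small helper. This is just a sketch; the buildUrl function name is made up, not part of any library:

```javascript
// Minimal sketch of the cache-busting approach: append a timestamp
// parameter so IE treats every request as a distinct URL.
function buildUrl(base) {
  // Use "?" if the URL has no query string yet, "&" otherwise.
  const sep = base.indexOf("?") === -1 ? "?" : "&";
  return base + sep + "_ts=" + new Date().getTime();
}

const url = buildUrl("/path/updated-data.htm");
console.log(url); // e.g. /path/updated-data.htm?_ts=1700000000000
```

When query strings are ruled out (as with Akamai NetStorage), the remaining options are the response-header and POST approaches listed in the question.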
I'm having the classic IE-caches-everything-in-Ajax issue. I have a bit of data that refreshes every minute. Having researched the forums the solutions boil down to these options (http://stackoverflow.com/questions/5997857/grails-best-way-to-send-cache-headers-with-every-ajax-call): add a cache-busting token to the query string (like ?time=[timestamp]) send a HTTP response header that specifically forbids IE to cache the request use an ajax POST instead of a GET Unfortunately the obvious querysting or "cache: false" setting will not work for me as the updated data file is hosted on Akamai Netstorage and cannot accept querystrings. I don't want to use POST either. What I want to do is try send an HTTP response header that specifically forbids IE to cache the request or if anyone else knows another cache busting solution?? Does anyone know how this might be done? Any help would be much appreciated. Here is my code: (function ($) { var timer = 0; var Browser = { Version: function () { var version = 999; if (navigator.appVersion.indexOf("MSIE") != -1) version = parseFloat(navigator.appVersion.split("MSIE")[1]); return version; } } $.fn.serviceboard = function (options) { var settings = { "refresh": 60}; return this.each(function () { if (options) $.extend(settings, options); var obj = $(this); GetLatesData(obj, settings.refresh); if (settings.refresh > 9 && Browser.Version() > 6) { timer = setInterval(function () { GetLatestData(obj, settings.refresh) }, settings.refresh * 1000); } }); }; function GetLatestData(obj, refresh) { var _url = "/path/updated-data.htm"; $.ajax({ url: _url, dataType: "html", complete: function () {}, success: function (data) { obj.empty().append(data); } } }); } })(jQuery);
Clear IE cache when using AJAX without a cache busting querystring, but using http response header
Rails provides an easy-to-use caching system:

Rails.cache.fetch('some_key', :expires_in => 24.hours) do
  ...
end

If the cache store doesn't contain the key, the block is evaluated and the result is stored in the cache. There are several stores you can choose from, such as memcache, in-memory, or the file system.
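To make the fetch semantics concrete, here is a toy, framework-free illustration. SimpleCache and its internals are hypothetical, not the Rails API; this only mirrors the compute-on-miss, expire-after behavior of Rails.cache.fetch:

```ruby
# Toy illustration of the Rails.cache.fetch pattern: the block runs
# only on a cache miss, and the stored value is returned until the
# entry expires. Not the Rails API, just the semantics.
class SimpleCache
  Entry = Struct.new(:value, :expires_at)

  def initialize
    @store = {}
  end

  def fetch(key, expires_in: 3600)
    entry = @store[key]
    return entry.value if entry && Time.now < entry.expires_at
    value = yield                                   # cache miss: compute
    @store[key] = Entry.new(value, Time.now + expires_in)
    value
  end
end

cache = SimpleCache.new
cache.fetch("yaml_data", expires_in: 60) { "parsed document" }
```

For the YAML case in the question, the block body would be the YAML::load call, so parsing happens at most once per expiry window.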
My Rails application has a simple yaml file that's downloaded relatively frequently (5x per second). The file is updated very infrequently (at most once per day). I don't want to YAML::load the file every time it's requested. What's the best way to cache this result?
Caching yaml files in Rails
You may use ApcBundle to do this.
Is there a way to extend Symfony 2 cache:clear command to clear APC as well or to do some other logic?
Extend Symfony 2 cache:clear command to clear APC as well
You actually have to start Visual Studio as administrator to be able to create the cache programmatically.
I want to add named data cache programmatically in appfabric. I kept following code for this :- try { //This can also be kept in a config file var config = new DataCacheFactoryConfiguration(); config.SecurityProperties = new DataCacheSecurity(); config.Servers = new List<DataCacheServerEndpoint> { new DataCacheServerEndpoint(Environment.MachineName, 22233) }; DataCacheFactory dcf = new DataCacheFactory(config); if (dcf != null) { var state = InitialSessionState.CreateDefault(); state.ImportPSModule(new string[] { "DistributedCacheAdministration", "DistributedCacheConfiguration" }); state.ThrowOnRunspaceOpenError = true; var rs = RunspaceFactory.CreateRunspace(state); rs.Open(); var pipe = rs.CreatePipeline(); pipe.Commands.Add(new Command("Use-CacheCluster")); var cmd = new Command("New-Cache"); cmd.Parameters.Add(new CommandParameter("Name", "Vaibhav")); cmd.Parameters.Add(new CommandParameter("Expirable", false)); pipe.Commands.Add(cmd); var output = pipe.Invoke(); } } catch (Exception e) { //throw new Exception } But this is not working as expected when i try to access the DataCache (using: dcf.GetCache("Vaibhav");) it is giving Cache not found error. When i created the Cache using powershell it worked fine and i was able to access the Cache, but i want to implement this programmactically and not by Command Prompt(powershell) Please suggest a proper way to implement this.... Thanks in Advance Vaibhav
Adding Named Data Cache Programmatically in Appfabric
Snapshots are treated in Gradle as changing modules. By default, they are refreshed every 24 hours. There is an explanation of how to configure this in the Gradle documentation.
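Per the Gradle documentation referenced above, the TTL for changing modules can be tuned through the resolution strategy. A sketch; the 10-minute value is just an example:

```groovy
// build.gradle: shorten the snapshot (changing-module) cache TTL
// from the 24-hour default to 10 minutes.
configurations.all {
    resolutionStrategy.cacheChangingModulesFor 10, 'minutes'
}
```

Setting the value to 0 effectively forces a check on every resolution, at the cost of more network traffic.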
I am interested in reducing the time between refresh of SNAPSHOT versions of dependencies in Gradle. How can I configure it and what is the default value?
How to change in Gradle time to live for artifact caches?
I used RNCachingURLProtocol, a way to do drop-in offline caching for UIWebView. Reference: http://robnapier.net/blog/offline-uiwebview-nsurlprotocol-588
I'm looking to cache HTML pages inside an application that uses UIWebView. Is there a framework that will allow me to do this and if not what method would you recommend? On the server side these files are being seen as PHP so I'd just be caching whatever the file gives back out.
Is there a framework to cache HTML pages on iOS?
After playing around with EhCache for a few weeks it is still not perfectly clear what they mean by the term "multi-tier" cache. I will follow up with what I interpret to be the implied meaning; if at any time down the road someone comes along and knows otherwise, please feel free to answer and I'll remove this one. A multi-tier cache appears to be a replicated and/or distributed cache that lives on 1+ tiers in an n-tier architecture. It allows components on multiple tiers to gain access to the same cache(s). In EhCache, using a replicated or distributed cache architecture in conjunction with simply referring to the same cache servers from multiple tiers achieves this.
I've recently come across the phrase "multi-tier cache" relating to multi-tiered architectures, but without a meaningful explanation of what such a cache would be (or how it would be used). Relevant online searches for that phrase don't really turn up anything either. My interpretation would be a cache servicing all tiers of some n-tier web app. Perhaps a distributed cache with one cache node on each tier. Has SO ever come across this term before? Am I right? Way off?
What is a multi-tier cache?
I wouldn't worry about the speed of ConcurrentHashMap:

Map<Integer, Integer> map = new ConcurrentHashMap<>();
long start = System.nanoTime();
int runs = 200 * 1000 * 1000;
for (int r = 0; r < runs; r++) {
    map.put(r & 127, r & 127);
    map.get((~r) & 127);
}
long time = System.nanoTime() - start;
System.out.printf("Throughput of %.1f million accesses per second%n",
        2 * runs / 1e6 / (time / 1e9));

prints

Throughput of 72.6 million accesses per second

This is far beyond the access rate you appear to be using. If you want to reduce garbage you can use mutable objects and primitives. For this reason I would avoid using String (as you appear to have far more strings than data entries).
I wrote a stock market simulator which uses a ConcurrentHashMap as a cache. The cache holds about 75 elements but they are updated and retrieved very quickly (~ 500 times a second). Here is what I did: Thread 1: Connected to an outside system which provides me with streaming quotes for a given stock symbol. Thread 2 (callback thread): Waits till data is delivered to it by the outside system. Once it gets the data, it parses it, creates an immutable DataEntry object, caches it and sends a signal to thread3. Thread 3 (Consumer thread): Upon receiving the signal, retrieve the DataEntry from the cache and uses it. (It is part of the task to not let thread2 push data directly to thread3). public final class DataEntry{ private final String field1; private final String field2; //... private final String field25; // Corresponding setters and getters } public final class Cache{ private final Map<String, DataEntry> cache; public Cache( ){ this.cache = new ConcurrentHashMap<String, DataEntry> ( 65, 0.75, 32 ); } // Methods to update and retrieve DataEntry from the cache. } After running it through a profiler, I noticed that I am creating a lot of DataEntry object. And therefore eden is filling up very quickly. So, I am thinking of tweaking the design a bit by: a) Making the DataEntry class mutable. b) Pre-populating the cache with empty DataEntry objects. c) When the update arrives, retrieve the DataEntry object from the map and populate the fields. This way, number of DataEntry object will be constant and equal to the number of elements. My questions are: a) Does this design have any concurrency issues that I may have introduced by making the DataEntry mutable. b) Is there anything else I can do to optimize the cache? Thanks.
Writing a highly performant Cache
Based on the comments we've exchanged above, I'd say that the client-side caching works. Your server sends:

Cache-Control: max-age=36000

which means the client should cache it for 10 hours (60 * 60 * 10 == 36000). If you actually want 10 days, the configuration is:

cacheControlMaxAge="10.00:00:00"

Remember that the client might decide to retrieve the resource again regardless of your cache headers, for any number of reasons (the client-side cache has been purged, the user requested a full refresh, the client doesn't implement client-side caching properly, etc.). How did you arrive at the conclusion that it doesn't work? (From the follow-up comments: the asker checked with YSlow, Fiddler and the Chrome dev tools and saw an HTTP 200 for the static content on every page refresh; the answerer pointed out that refreshing the page also refreshes the cached resources, whereas simply navigating to the page should use the cached version.)
I want to cache static content of my asp.net mvc 3 app. I added this tag in web.config to cache for 10 days: <staticContent> <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="10:00:00" /> </staticContent> but it seens doesn't work (checked using YSlow and Fiddler). Any ideas why?
Asp.NET MVC static content caching doesn't work
This blog post, although not specifically addressing your topic, does have some good information about what might cause your app pool to get recycled. http://haacked.com/archive/2011/10/16/the-dangers-of-implementing-recurring-background-tasks-in-asp-net.aspx IIS will itself recycle the entire w3wp.exe process every 29 hours. It’ll just outright put a cap in the w3wp.exe process and bring down all of the app domains with it. It also contains some information that may solve your problem (HostingEnvironment.RegisterObject)
I have created a WCF method that runs an infinite loop, polling every 5 minutes. It writes 2 items to cache, the main item I'm retrieving and the LAST_POLLED. This is successfully called on app start of our web site: //infinite loop while (true) { try { _cache.RefreshCache(); WcfCache.SetCache(LAST_POLLED, DateTime.Now); } catch(Exception ex) { //logging exception to database } //essentially polling interval, though dependent on how long it takes to complete task Thread.Sleep(Polling_DELAY); } The LAST_POLLED datetime should have the time it was polled. Occasionally, though, it's storing Jan 1, 0001. I imagine this must be due to the cache disappearing. The app pool for this WCF site is NOT set to Recycle NOR are the worker processes set to shutdown after being idle. This will work for days at a time, but hasn't worked a full week yet. Why is this cache getting reset? Is there a better way to code this? I didn't want to create a separate windows service for this to keep down the number of projects/apps, but I can.
Why is my cached data disappearing?
If by stochastic you mean stochastic sampling (for simulating effects like DOF or motion blur), the answer is probably yes. Two sample rays for the same pixel could take two very different paths through your acceleration structure, leading to potential cache misses. One of the best ways to accelerate this is simply not to use raytracing for primary rays at all, but rasterization combined with stochastic sampling of your polygons (look up Reyes rendering). That's what software like Pixar's RenderMan® does, for instance.
Specifically in the context of a real-time raytracer where view updates are frequent? The obvious answer would seem to be "yes" and yet I wonder if any methods have been found to accelerate Monte Carlo methods given their usefulness.
Is stochastic raytracing inherently cache-unfriendly?
I think you need to specify a LoadOption on the data. I can't remember exactly, and I can't find the documentation, but I believe you need to handle the LoadingData event on the DomainDataSource and set the args.MergeOption. Try 'RefreshCurrent'.
I use a DomainDataSource with filter descriptors, but it seems that the DomainDataSource (or DomainContext) is caching old data and not replacing it with fresh data from the database. <riacontrols:DomainDataSource AutoLoad="True" LoadSize="5" Name="employeeDomainDataSource" QueryName="GetEmployeeQuery" Width="0" DomainContext="{Binding EmployeeContext}"> <riacontrols:DomainDataSource.FilterDescriptors> <riacontrols:FilterDescriptor IsCaseSensitive="False" PropertyPath="Name" Operator="Contains" Value="{Binding ElementName=NameFilter, Path=Text}"/> </riacontrols:DomainDataSource.FilterDescriptors> </riacontrols:DomainDataSource> I also have a DataPager control. Suppose user A and B load the data. User B changes the Employee's Name in edit mode. User A types in that new name as filter, the data will be fetched but the old (cached) name is displayed. I used fiddler and I can see that the correct data is returned from the database and the webservice. Is there any option where I can switch this off?
Disable caching of DomainContext / DomainDataSource in Silverlight RIA Services
Run varnishd in debug mode and it should show you where the error is. Example: # varnishd -d -f /etc/varnish/default.vcl Message from VCC-compiler: Expected an action, 'if', '{' or '}' ('input' Line 32 Pos 6) resp.http.Cache-Control = "max-age=60"; -----#######################----------------
After adding a simple acl to the top of default.vcl to restrict purge requests to localhost, Varnish fails to restart. My default VCL is otherwise unmodified. default.vcl is as follows: backend default { .host = "127.0.0.1"; .port = "8080"; } acl purge { "localhost"; } It doesn't matter if I place the acl declaration above or below the backend directive. Varnish version 3.0.2. This should be an otherwise exteremely simple configuration.
Varnish default VCL access control list
After looking at the source code, I found this in BaseMemcachedCache:

@property
def _cache(self):
    """
    Implements transparent thread-safe access to a memcached client.
    """
    if getattr(self, '_client', None) is None:
        self._client = self._lib.Client(self._servers)
    return self._client

So I would say that this will work:

c._cache.cas

Try it and let me know! For more details: https://code.djangoproject.com/svn/django/trunk/django/core/cache/backends/memcached.py
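For clarity, here is a toy in-memory illustration of compare-and-set semantics. It is not the Django or memcached client API (TinyCas and its methods are made up); it only shows the read-version/write-only-if-unchanged idea that c._cache.cas relies on:

```python
# Toy illustration of compare-and-set (CAS): a write succeeds only
# if the value has not changed since it was read. Real memcached
# clients do this with an opaque CAS token returned by gets().
class TinyCas:
    def __init__(self):
        self._data = {}  # key -> (value, version)

    def gets(self, key):
        """Return (value, version); version plays the role of the CAS token."""
        return self._data.get(key, (None, 0))

    def set(self, key, value):
        _, version = self._data.get(key, (None, 0))
        self._data[key] = (value, version + 1)

    def cas(self, key, value, version):
        _, current = self._data.get(key, (None, 0))
        if current != version:
            return False  # someone else wrote in between; caller must retry
        self._data[key] = (value, current + 1)
        return True
```

With a real memcached backend, the client's gets() records the token internally and cas() uses it, so the calling code never sees the version explicitly.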
Over here at the Django groups Tom Evans explains the method to do compare and set in Django as shown below You can access the memcached client via django though: >>> from django.core import cache >>> c=cache.get_cache('default') >>> help(c._client.cas) But somehow I couldn’t get it to work. >>> from django.core import cache >>> c=cache.get_cache('memcache') >>> help(c._client.cas) Traceback (most recent call last): File "<console>", line 1, in <module> AttributeError: 'MemcachedCache' object has no attribute '_client' How can I get to do a compare and set in Django, if not the method shown above? I use Django version 1.3.
Django Memcache : Compare and Set
Enterprise Library doesn't support an out of the box distributed multi-server cache solution. From Chapter 5 of the Developer's Guide: Out of the box, the Caching Application Block does not provide the features required for distributed caching across multiple servers. Based on what you are trying to do I would take a look at Windows Server AppFabric Caching Services. Windows Server AppFabric provides a distributed in-memory application cache platform for developing scalable, available, and high-performance applications.
Can I Use Microsoft Enterprise Library Caching in Load Balance servers?, My case is I have a web service located in 2 load balance servers, and I use Database Cache "backingStores" with configuration like below <cachingConfiguration defaultCacheManager="Cache Manager"> <cacheManagers> <add name="Cache Manager" type="Microsoft.Practices.EnterpriseLibrary.Caching.CacheManager, Microsoft.Practices.EnterpriseLibrary.Caching, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" expirationPollFrequencyInSeconds="60" maximumElementsInCacheBeforeScavenging="1000" numberToRemoveWhenScavenging="10" backingStoreName="MyDataCacheStorage11"/> </cacheManagers> <backingStores> <add name="MyDataCacheStorage11" type="Microsoft.Practices.EnterpriseLibrary.Caching.Database.DataBackingStore, Microsoft.Practices.EnterpriseLibrary.Caching.Database, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" encryptionProviderName="" databaseInstanceName="EntLib1ConnectionString111" partitionName="CAPwiki Cache"/> <add type="Microsoft.Practices.EnterpriseLibrary.Caching.BackingStoreImplementations.NullBackingStore, Microsoft.Practices.EnterpriseLibrary.Caching, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="NullBackingStore"/> </backingStores> </cachingConfiguration> and when I set cache with key "_testorderstatus3" from server1 and try to get it from server2, it returns Null value!!! also when I set value with same key from server1 and set it again from server2, and I check the DB, I found it set 2 times as shown below any idea about that? i need to set and get in both servers. Thank you
EnterpriseLibrary Caching in Load Balance
It is stored in memory according to MSDN An application can often increase performance by storing data in memory Which could possibly be stored on the hard drive depending on the utilization of the Page file. Be especially careful to not create references to objects you don't want to store.
I was thinking about implementing some caching in my ASP.NET web application to improve performance. I read about HttpRuntime.Cache and it seems like that is what it was made for. The only issue is that I want to potentially cache a lot of data (mainly SQL query results). So, I wanted to know where HttpRuntime.Cache stores data...is it in memory or on the HD somewhere? If it stores in memory, then I am potentially overloading the server's resources, so I would need an alternative cache store. I was thinking MongoDB might be a good option, unless anyone else has a better suggestion...?
Where does HttpRuntime.Cache store data?
A database that small is already in memory on your server, no need to do it twice.
My web application is using a SQL Server database that has around 2000 rows. I wanted to know if its better to load all the DB to the RAM (Store it in a static var) and query it using ASP.NET LINQ. I need it only for read operations. I think that the amount of RAM needed is not very high at all. I think it can speed up the application considerably. I wanted to know if it's a good option, instead of caching the data?
In-memory vs Caching for an ASP.net application
You shouldn't have to do your own locking with The Caching Application Block because "you are assured that the block performs in a thread-safe manner". If you take a look at the source code for Cache you will see that the Add, Remove, and GetData methods all obtain locks on the in-memory cache before performing any operations.
I'm using Enterprise Lib 5.0 caching app block and I'm trying to figure out the best way to handle read/writes in a multi threaded scenario. Is taking a lock on write and not taking a lock on read the recommended approach? Update(string key, object value) { lock(syncLock) { cacheManager.Add(key,value); } } object Read(string key) { object o = cacheManager.GetData(key); return o; } //OR is the following Read recommended using a lock object Read(string key) { lock(syncLock) { object o = cacheManager.GetData(key); return o; } } My concern is that if one thread is about to update the item for a specific key at the same time as another thread is about to read the item for the same key.Can this cause races? Would it make sense to have a dictionary of "keys" to "ReaderWriterLockSlim"s so that you would essentially take a lock for only a specifc key as opposed to a "common" lock in a multi threaded scenario like a web app: Basically something like this: Dictionary<string,ReaderWriterLockSlim> dict = new Dictionary<string,ReaderWriterLockSlim> (); void Update(string key, object value) { dict[key].EnterWriteLock(); cacheManager.Add(key,value); dict[key].ExitWriteLock(); } object Read(string key) { dict[key].EnterReadLock(); object o = cacheManager.GetData(key); dict[key].ExitReadLock(); return o; }
Reading/Writing to and from a shared Cache based on Enterprise Library
Personally I use os.path.expanduser to find a good place for caches; it's quite common in Unix environments, where most of the current user's config/cache is saved under the home directory, using a directory name starting with a dot to make it a "hidden" directory. I would do something like:

directory = os.path.join(os.path.expanduser("~"), ".my_cache")

As for the modification date of the distant file, you can use urllib:

import urllib
u = urllib.urlopen("http://www.google.com")
u.info().get("last-modified")

However, you should check that your HTTP server provides the Last-Modified HTTP header and that it is a coherent value! (This is not always the case.)
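The two ideas above combine into a small sketch. The directory name, helper names and max-age value are illustrative assumptions, not a fixed convention:

```python
# Sketch of the caching scheme from the question: files named
# md5(url) under a hidden per-user directory, considered stale once
# older than a configurable age.
import hashlib
import os
import time

CACHE_DIR = os.path.join(os.path.expanduser("~"), ".my_cache")
MAX_AGE = 24 * 3600  # one day, per the question's cache policy

def cache_path(url):
    """Deterministic local path for a given URL."""
    return os.path.join(CACHE_DIR, hashlib.md5(url.encode("utf-8")).hexdigest())

def is_fresh(path, max_age=MAX_AGE):
    """True if the cache file exists and is younger than max_age seconds."""
    return os.path.exists(path) and (time.time() - os.path.getmtime(path)) < max_age
```

On a miss (or stale entry) you would download with urllib, write the bytes to cache_path(url), and optionally compare the server's Last-Modified header first.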
My pygtk program is an editor for XML-based documents which reference other documents, possibly online, that may in turn reference further documents. When I load a file, the references are resolved and the documents loaded (already asynchronously). However, this process repeats every time I start the editor, so I want some local caching to save bandwidth and time for both the user and the server hosting the referenced documents. Are there any typical ways this is done? My idea so far would be: Get a path to a cache directory somehow (platform-independent) Any ideas? Put a file named md5(url) there. If there is a cache file already existing and it's not older than $cache_policy_age take it, otherwise use HTTP (can urllib do that?) to check if it has been modified since it was downloaded.
Good way (and/or platform-independent place) to cache data from web
It depends on whether you have Varnish installed in front of Symfony2 or not. To be clear: the Symfony2 ESI proxy won't call any external resources; it will only call your app's controllers. Varnish is able to handle ESI includes from any source.
Here is my situation: I am using Symfony2 as a middle layer here, when web client ask for a webpage to Symfony2(the layer), the layer is going to request single/multiple data/image to another backend remote resource server by http, combine them and return to web client. And I also wish to have caching in order to reduce requests to the backend server. I found that the ESI has similar manner, however, could I include another server resource in Symfony2? Is there any proper way to implement this? Thank you!
Using Symfony2 as a dynamic View proxy
Latency between a backend and frontend instance is extremely low. If you think about it, all App Engine RPC's are fulfilled with "backend instances". The backends for the Datastore and Memcache are just run by Google for your convenience. Most requests, according to the App Engine team, stay within the same datacenter - meaning latency is inter-rack and much lower than outside URLFetches. A simple request handler and thin API layer for coordinating the in memory storage is all you need - in projects where I've set up backend caching, it's done a good job of fulfilling the need for more flexible in-memory storage - centralizing things definitely helps. The load balancing doesn't hurt either ;)
I've just finished watching the Google IO 2011 presentation on AppEngine backends (http://www.google.com/events/io/2011/sessions/app-engine-backends.html) which piqued my curiosity about using a backend instance for somewhat more reliable and configurable in-memory caching. It could be an interesting option as a third layer of cache, under in-app caching and memcache, or perhaps as a substitute for some cases where higher reliability is desirable. Can anyone share any experience with this? Googling around doesn't reveal much experimentation here. Does the latency of a URLfetch to retrieve a value from a backend's in-memory dictionary render it less attractive, or is it not much worse than a memcache RPC? I am thinking of whipping up some tests to see for myself, but if I can build on the shoulder of giants...thanks for any help :)
Feasibility/value of caching with AppEngine backends?
Yes, some pages will need different combinations of style sheets. Each combination must be cached individually. Unfortunately, the browser won't know that there isn't a difference between ?stylesheets=a.css,b.css and ?stylesheets=b.css,a.css so both will need to be cached. That's used to make sure the browser doesn't accidentally cache the dynamically generated stylesheet. It's unnecessary if you are using a decent minifier. Usually, the GUID is found by hashing the last-modified times of each file in the list. Like I said, most minifiers will automatically check for new versions of files and discard the old cached version. I would suggest PHP Minify. Installation is as easy as copying the folder into your doc root. It also supports JavaScript compression with the Google Closure Compiler.
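A sketch of the ID scheme described above (hash the file list together with each file's last-modified time). The cache_id helper is hypothetical and shown in Python for brevity; the same logic translates directly to PHP with md5() and filemtime():

```python
# Derive a stable cache ID from a list of stylesheet paths plus their
# modification times: the ID changes whenever any file changes.
import hashlib
import os

def cache_id(paths):
    parts = []
    for path in sorted(paths):  # sort so order of the list doesn't matter
        mtime = os.path.getmtime(path) if os.path.exists(path) else 0
        parts.append("%s:%s" % (path, mtime))
    return hashlib.md5("|".join(parts).encode("utf-8")).hexdigest()
```

Sorting means ?stylesheets=a.css,b.css and ?stylesheets=b.css,a.css map to one server-side cache file, even though, as noted above, the browser will still cache the two URLs separately.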
In my project CSS files can be pre-processed and then optionally minified (depending upon configuration). Q1. Should multiple cache variations be generated for different combinations of CSS files? res.php?stylesheets=test.css,test2.css,test3.css res.php?stylesheets=test.css,test3.css Q2. In the past I have noticed that such cache files were given some sort of GUID. How can I generate such an ID based upon the request? res.php?stylesheets=test.css,test2.css,test3.css => cache/css/a3cadfeA33afADSFwwrzgnzzddsveraeAE res.php?stylesheets=test.css,test3.css => cache/css/ergope4t5oiyk0sfb9x0fHkg04mm04tjzG Please excuse the naivety of the above IDs! Somehow I need to be able to regenerate the same ID from the stylesheets specified. My question is only about caching of multiple variations and ID generation.
Automatic minify and cache CSS
Lots of hints on how to achieve that here: http://msdn.microsoft.com/en-us/library/dd997357.aspx. A cache entry can be fitted with an expiry date, so the data gets re-fetched after a set amount of time without you dealing with timers etc.
I have an application that uses data from a table which is almost always not changing in many parts of it. This seems a right place to make a cache of it. So: i need to make cached list of that data to work with it, but have some expiration timeout after which my cached list should update itself from database(thats why global static list is not for this situation). PS im sure thats not that difficult, but im new to caching and help will save my time, thank you. At least i can create static list that will be updated after some timeout with timer in another thread, but i think such solution is too ugly.
Cached list of data from database in .NET Framework 4
You can disable caches using URLConnection.setUseCaches(boolean)
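A minimal sketch of that call in context. NoCacheDemo is a made-up class name, and the file: URL is only there so the snippet runs without a network; in the applet the URL would be the web service endpoint:

```java
// Disable caching on a URLConnection, per the answer above.
import java.net.URL;
import java.net.URLConnection;

public class NoCacheDemo {
    public static URLConnection open(String spec) throws Exception {
        URLConnection conn = new URL(spec).openConnection();
        conn.setUseCaches(false);         // don't use a cached copy for this request
        conn.setDefaultUseCaches(false);  // default for future connections too
        // Belt-and-braces: ask intermediaries not to serve cached content.
        conn.setRequestProperty("Cache-Control", "no-cache");
        return conn;
    }

    public static void main(String[] args) throws Exception {
        URLConnection conn = open("file:///");
        System.out.println("useCaches = " + conn.getUseCaches());
    }
}
```

Note that setDefaultUseCaches(false) affects subsequently created connections as well, which is usually what you want in an applet that must never serve stale responses.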
How can i disable http caching of all my http connections in my japplet? I dont want any of my http (request/response)to be cached.Regardless of the user settings in control panel\java\Temporary File Settings. Applet is signed and compiled with java1.6. I am using URLConnection.class and my request to an 3rd party web service is being cached. ie: I can see my request url in Java Cache Viewer. http://www.service.com?param1=232&param2=2323232 Also i can find the responses in application ....users\data\sun\java\deployment\cache responseline1 responseline2
How to disable http caching in applet
Is there a way we could plug another cache implementation? Did you investigate the use of the EclipseLink shared object cache that comes with EclipseLink? Going by the description, the shared object cache is not confined to a single EntityManager alone, and is available across the lifecycles of several Entity managers, i.e. across several transactions. It is of course, constrained to the lifecycle of an EntityManagerFactory, which may be as live as long as the application is running in the container. The EclipseLink shared object cache is different from Oracle Coherence, and I believe it is not licensed and packaged separately, thereby making it available on all containers.
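If you do investigate the EclipseLink shared object cache, it is switched on through persistence-unit properties. A sketch of a persistence.xml fragment; the unit name and size are illustrative, and keep in mind this cache lives inside one JVM, so on its own it does not give you a cache shared between multiple servers:

```xml
<!-- persistence.xml sketch: enable the EclipseLink shared object
     cache for all entities (values are illustrative). -->
<persistence-unit name="myPU">
  <properties>
    <property name="eclipselink.cache.shared.default" value="true"/>
    <property name="eclipselink.cache.size.default" value="5000"/>
  </properties>
</persistence-unit>
```

Individual entities can override the defaults with eclipselink.cache.shared.EntityName-style properties or the EclipseLink @Cache annotation.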
We'd like to use another L2 cache for our big JPA application. We are trying to achieve a shared cache between multiple servers. We use Eclipselink as JPA implementation, and some legacy codes uses internal Eclipselink API's, so switching is not an option. Coherence/Toplink Grid seems too expensive (4000$/cpu?). Is there a way we could plug another cache implementation? Is something specified in JPA 2 (I can't find anything in the specs, but maybe I just misread it)? Proprietary (=Eclipselink specific) solutions are ok, as long as they are somewhat documented or simple enough (we don't want that to break).
how to use another implementation for JPA 2 level 2 cache?
Query caching with the Hibernate JPA implementation is not possible without the org.hibernate.cacheable hint, up to the current version 4.1. Of course you may store it in the XML descriptors in named query definitions and change it for different JPA providers (or define hints for all variants). But in our projects we implemented an additional utility that adds hints to named queries from an external config (props). E.g.:

ru.citc.jpa.queryhint.[query_name]=org.hibernate.cacheable\=true, javax.persistence.cache.retrieveMode\=USE, etc...

Entity caching may be configured by external properties. See there. Example:

<prop key="hibernate.ejb.classcache.ru.citc.migcredit.csrfront.model.Form">read-write,ScriptExecution</prop>
<prop key="hibernate.ejb.collectioncache.ru.citc.migcredit.csrfront.model.Form.fields">read-write,ScriptExecution</prop>
<prop key="hibernate.ejb.collectioncache.ru.citc.migcredit.csrfront.model.Form.constraints">read-only,ScriptDesignCollections</prop>

As a result we have pure JPA code with external config for caching and other provider-specific features. This works perfectly for Hibernate and EclipseLink. As an additional bonus we have different caching strategies for different application modules (e.g. the admin web app does not cache metadata tables, but the operator web app caches them read-only).
I am trying to get JPA 2.0 caching working in my Spring 3.0.5 application, which uses Hibernate and EhCache. I do not wish to have my application bound to Hibernate and EhCache, and want it to use pure JPA code only, as far as possible. I managed to get caching working with EhCache and Hibernate by setting the Hibernate-specific @Cache annotation on top of my entity classes and specifying org.hibernate.cacheable as a query hint for my named queries. However, when I try to switch those to @Cacheable(true) and set the query hint javax.persistence.cache.retrieveMode to "CacheRetrieveMode.USE" (I also tried just "USE"), it doesn't work, and my named query which should be cached is just retrieved again from the database. I am specifying these within the annotation of the NamedQuery itself, using the hints = ... attribute. I tried various combinations of <shared-cache-mode>, ENABLE_SELECTIVE, DISABLE_SELECTIVE, etc., but none seems to have any effect. I am starting to suspect that this functionality is not available on J2SE. Am I missing something? Should I enable some extra annotation handler from the Spring application context? Thanks.
Using JPA 2.0 @Cacheable in J2SE with Spring, EHCache and Hibernate, without Hibernate specific annotations
Upgrade to the latest Spring 3.1 milestone - it has built-in cache support through annotations - see here. Apart from that, you can always use the EhCacheFactoryBean.
I have an application in which I use Spring 3.0.2 and iBatis. Now I need to integrate Ehcache with my code. I tried this link but couldn't get it working. I would prefer someone to give me the details of the jars required, the XML configuration to be done, and code changes if required.
Integrating ehcache with spring 3.0
You seem to have an XML-to-object tool that creates an object model from the XML. What usually takes most of the time is not the parsing but creating all the objects that represent the data. So you might want to extract only the part of the XML data you need, which will be faster than systematically creating a big object tree just to extract part of it. You could use XPath to extract the pieces you need from the XML file, for example. I have used a nice XML parsing tool in the past that focuses on performance. It is called vtd-xml (see http://vtd-xml.sourceforge.net/). It supports XPath and other XML technologies. There is a C# version. I have used the Java version, but I am sure that the C# version has the same qualities. LINQ to XML is also a nice tool and might do the trick for you.
I have a WCF service which reads data from XML. The data in the XML changes every minute. The XML is very big; it has about 16k records. Parsing it takes about 7 seconds, which is definitely too long. Now it works this way: ASP.NET calls WCF; WCF parses the XML; ASP.NET waits for the WCF callback; WCF returns the data to ASP.NET. Of course there is caching for 1 minute, but after that WCF must load the data again. Is there any way to refresh the data without stalling the site? Something like... I don't know, double buffering? Something that returns the old data while there is no new data yet? Maybe you know a better solution? Best regards.

EDIT: the statement which takes the longest time:

XDocument = XDocument.Load(XmlReader.Create(uri)); // takes 7 sec.

Parsing takes 70 ms, which is OK, so that is not the problem. Is there a better solution so the website doesn't block? :)

EDIT 2: OK, I have found a better solution. Simply, I download the XML to disk and read the data from there. Then another process starts downloading the new version of the XML and replaces the old one. Thanks for the engagement.
problem with huge data
WCF REST caching is done by the ASP.NET caching module, which is registered to handle the HttpApplication.ResolveRequestCache and HttpApplication.UpdateRequestCache events. You just need to handle an event raised before ResolveRequestCache (the most suitable would be BeginRequest) and access the request directly via HttpContext.Current.Request (you can use its InputStream property to read the raw HTTP request). These events are typically handled in the Global.asax file. Be aware that caching can happen on multiple levels: the client can have data in its own cache, a proxy server can cache data, etc., so not all requests that use a cached response will necessarily hit your server. You can control where the data can be cached by setting Location in the cache profile.
When a WCF REST service has caching enabled, the underlying code (of course) is not run on subsequent calls to the same URI. However, I'm wondering if there is a way we could hook into the caching provider to see the request coming in and then log it. This would be for analysis purposes or to track API usage.
Is it possible to hook into the WCF cache provider?
You will want to take advantage of the lock keyword to make sure that the data is loaded and added to the cache in an atomic manner.

Update: I modified the example to hold the lock on the Cache for as short a time as possible. Instead of storing the data directly in the cache, a proxy is stored instead. The proxy is created and added to the cache atomically, and then uses its own locking to make sure that the data is only loaded once.

protected void Page_Load(object sender, EventArgs e)
{
    string key = "dtMine" + "_" + Session["UserID"].ToString();
    DataProxy proxy = null;
    lock (Cache)
    {
        proxy = (DataProxy)Cache[key];
        if (proxy == null)
        {
            proxy = new DataProxy();
            Cache[key] = proxy;
        }
    }
    object data = proxy.GetData();
}

private class DataProxy
{
    private object data = null;

    public object GetData()
    {
        lock (this)
        {
            if (data == null)
            {
                data = LoadData(); // This is what actually loads the data.
            }
            return data;
        }
    }
}
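The proxy pattern in this answer is language-neutral. Here is a minimal Python rendering of the same idea, offered only as an illustrative sketch: threading.Lock stands in for C#'s lock statement, and the expensive LoadData step is simulated by a caller-supplied loader function.

```python
import threading

class DataProxy:
    """Cheap placeholder stored in the cache atomically; the expensive load
    happens at most once, under the proxy's own lock."""
    def __init__(self, loader):
        self._loader = loader
        self._lock = threading.Lock()
        self._data = None

    def get_data(self):
        with self._lock:
            if self._data is None:
                self._data = self._loader()  # expensive load, runs only once
            return self._data

cache = {}
cache_lock = threading.Lock()

def get_or_create_proxy(key, loader):
    # Hold the cache-wide lock only long enough to install the proxy,
    # so concurrent requests for other keys are not blocked by the load.
    with cache_lock:
        proxy = cache.get(key)
        if proxy is None:
            proxy = DataProxy(loader)
            cache[key] = proxy
    return proxy
```

The point of the two-level locking is visible here: the global lock protects only the tiny check-and-insert, while the slow work is serialized per key inside the proxy.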
I have following scenario: Lets say we have two different webparts operating on the same data - one is a pie chart, another is a data table. in their Page_Load they asynchronously load data from the database and when loaded place it in application cache for further use or use by other web parts. So each o the web parts has code similar to this: protected void Page_Load(object sender, EventArgs e) { if (Cache["dtMine" + "_" + Session["UserID"].ToString()]==null) { ... Page.RegisterAsyncTask(new PageAsyncTask( new BeginEventHandler(BeginGetByUserID), new EndEventHandler(EndGetByUserID), null, args, true)); } else { get data from cache and bind to controls of the webpart } } Since both webparts operate on the same data it does not make sense for me to execute the code twice. What is the best approach to have one web part communicate to the other "i am already fetching data so just wait until i place it in cache"? I have been considering mutex, lock, assigning temporary value to the cache item and waiting until that temporary value changes... many options - which one should I use.
asp.net cache multithreading locks webparts
The only posts I could find (by searching on Google) are the following: Mobile Browser Cache Limits: Android, iOS, and webOS; Mobile Browser Cache Limits, Revisited; How Mobile Browser Cache Affects Browsing on iOS, Android, and More.
Is anyone familiar with a comprehensive list of mobile devices and their browser's cache limits for images? I found one reference for iPhone: http://www.niallkennedy.com/blog/2008/02/iphone-cache-performance.html But it's 3 years old. It does state, however, that the iPhone won't cache images over 25k. I'd like to know if that's still true and if anyone has similar info for Nokia, BlackBerry and Android devices.
Mobile image cache size limitations?
You could try adding the OutputCache attribute to the action and setting the duration (in seconds) to a small value. Ex:

[OutputCache(Duration = 1, VaryByParam = "none")]
I have an ASP.NET MVC 3 site running on IIS 7.5 and I can't prevent it from caching. I have disabled output caching in IIS by adding a '* Do Not Cache' entry to the website, and also added an action filter on the controller that prevents caching on result executing (see code below), but when I use the site, it's still caching. I have deleted all history and cookies etc. from Internet Explorer and Firefox but I still see old data. Does anyone have any ideas or suggestions on what else I can do to try and prevent this? Thanks in advance, James

UPDATE: I have dug further using the SQL profiler and it seems to be SQL Server caching the query. Could this be the case?

2nd UPDATE: I now know it is definitely NOT SQL caching; now looking at IIS and MVC.

SOLVED!!! It was NHibernate! It was using the same session for every call rather than a session per request.

public override void OnResultExecuting(ResultExecutingContext filterContext)
{
    filterContext.HttpContext.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(-1));
    filterContext.HttpContext.Response.Cache.SetValidUntilExpires(false);
    filterContext.HttpContext.Response.Cache.SetRevalidation(HttpCacheRevalidation.AllCaches);
    filterContext.HttpContext.Response.Cache.SetCacheability(HttpCacheability.NoCache);
    filterContext.HttpContext.Response.Cache.SetNoStore();
    base.OnResultExecuting(filterContext);
}
Prevent caching on ASP MVC site running on IIS 7.5
Your key & val variables are in JavaScript so that isn't going to work with the Url Helper. You could change your script to look something like this: EDIT: fixed bug - changed {id = null} to { id = String.Empty } $.post('<%= Url.Action("GetLatest", "News") %>', function (data) { $.each(data, function (key, val) { $('#latestNews').append('<li><a href="<%= Url.Action("Details", "Article", new { id = String.Empty}) %>/' + key +'">' + val + '</a></li>'); }); $('#news').show(); }, "json"); So, the MVC Url.Action() method is just giving you the first part of the url. Then jQuery will add key onto the end of the url and add val as the text for the anchor at runtime. I think that's the easiest way of doing it without refactoring the code too much.
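The client-side half of this answer can be isolated: the server renders only the base path, and JavaScript appends the key at runtime. A small sketch of just that string-building step (the base path literal here is assumed; on the real page it would come from Url.Action on the server):

```javascript
// Build "/En/Article/Details/<key>" from a server-rendered base path.
// The base path would be emitted by Url.Action("Details", "Article",
// new { id = String.Empty }) on the server; here it is an assumed string.
function buildDetailsUrl(basePath, key) {
  // Ensure exactly one slash between the base path and the id.
  return basePath.replace(/\/+$/, "") + "/" + encodeURIComponent(key);
}

function buildListItem(basePath, key, text) {
  return '<li><a href="' + buildDetailsUrl(basePath, key) + '">' + text + "</a></li>";
}
```

Normalizing the trailing slash makes the helper safe whether the server-rendered fragment ends in "/" or not, which is exactly the ambiguity that produced the duplicated "/6/6" in the question.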
I'm generating the list of links like this: $('#latestNews').append('<li><a href="<%= Url.Action("Details", "Article") %>/' + key + '">' + val + '</a></li>'); And on the Index page link looks like: <a href="/En/Article/Details/6">Title</a> But, on the /En/Article/Details/6 page - generated link looks like: <a href="/En/Article/Details/6/6">Title</a> I've tried $('#latestNews').append('<li><a href="<%= Url.Action("Details", "Article") %>?id=' + key + '">' + val + '</a></li>'); It works ok, but then caching does not work. My controller code: [OutputCache(Duration = Int32.MaxValue, VaryByParam = "id,language", SqlDependency = "database:Articles")] //Articles will be added rarely so i think it'll be nice to cache them public ActionResult Details(string id, string language) {..} My route: routes.MapRoute( "Default", "{language}/{controller}/{action}/{id}", new { language = "En", controller = "Home", action = "Index", id = UrlParameter.Optional } ); So how to generate Url in a better way? UPDATED: $.post('<%= Url.Action("GetLatest", "News") %>', function (data) { $.each(data, function (key, val) { $('#latestNews').append('<li><%= Url.ActionLink(val, "Details", "Article", new { id = key }, null) %></li>'); }); $('#news').show(); }, "json");
Generate link in JQuery using ASP.NET MVC Url.Action
The output cache is Out or _oh in your user namespace, so you can call Out.clear(). Edit: This might be a different output cache from self.outputcache. I'm not so familiar with 0.10. If you need a reference to your IPython shell, in IPython 0.10, use __IP or __IPYTHON__. In 0.11 (the development version), use get_ipython().
I am having memory issues with iPython, and I find that calling %clear out occasionally clears this out. It seems to be caching output somewhere within some of the functions that I'm calling. I would like to build this into my function. clear out calls: self.outputcache.flush() How can I get a reference to the iPython shell (self in the above)? In other words: how can I flush the outputcache in iPython, without using clear?
iPython clear cache from within function call?
Cache the data instead of caching the drop-down. So, instead of putting the SelectList into the ViewData, put the contents for it: if (HttpContext.Current.Cache["ComboList"] == null) { HttpContext.Current.Cache["ComboList"] = repo.GetComboitems(id); //use Add instead so that you can specify the cache duration. } ViewData["ComboList"] = HttpContext.Current.Cache["ComboList"]; //take from cache. Note, code is not accurate, but it is an example only. Then, in your view, render the combo. I hope this helps.
Is it possible to cache a drop-down list? I'm using a Telerik MVC Window and ComboBox, and the contents of the window (including the ComboBox) are returned with a partial view. The contents of the partial view depend on a list of parameters, but on every div in this window there is a combo box whose contents are usually unchanged and contain ~2000 records. I'm thinking about returning ViewData["ComboContent"] using a separate controller with caching before returning the window itself, but maybe there is a better way... TIA

updated (my controller code):

[AcceptVerbs("GET")]
[OutputCache(Duration = int.MaxValue, VaryByParam = "id")] //Some custom param??
public ActionResult LoadTimeOffset(int id)
{
    String error;
    IEnumerable<MyModel> model = repo.GetSomeVariableStuff(id, 10, out error); //always varies
    ViewData["ComboList"] = new SelectList(repo.GetComboitems(id), "Key", "Value", -1); //changes only on id
    if (model.Count() > 0)
    {
        return PartialView("Partial", model);
    }
    return Content(error);
}
ASP.NET MVC: Caching combobox
Sure, just add an attribute called "vote_score" and store it. If you're looking for a "magic" way to do this, there is none. If you simply don't want to run the calculation every time the method is called, memoize the result: def vote_score @vote_score ||= heavy_calculation(votes_count) end Or, via ActiveSupport::Memoizable: def vote_score heavy_calculation(votes_count) end memoize :vote_score
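Outside Rails, the ||= idiom from this answer works in plain Ruby too. A self-contained sketch, where the scoring formula is just a stand-in and a counter makes the "runs once" behaviour observable:

```ruby
# Plain-Ruby sketch of ||= memoization: the expensive body runs once,
# later calls return the stored value. The formula itself is a stand-in.
class Post
  attr_reader :votes_count, :calculations

  def initialize(votes_count)
    @votes_count = votes_count
    @calculations = 0
  end

  def vote_score
    @vote_score ||= begin
      @calculations += 1   # count how often the "heavy" path actually runs
      votes_count * 10     # stand-in for the real scoring formula
    end
  end
end
```

One caveat: ||= re-runs the body whenever the memoized value is nil or false, so use an explicit defined? check (or a sentinel) if those are legitimate results of the calculation.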
Is it possible to create a counter cache for a virtual attribute in Rails? Background: I have Posts that get voted on, and have a counter_cache for votes (votes_count) in the Post database. Objective: I have a "vote score" I calculate as a virtual attribute using the current votes_count along with other variables, is there a way to store this "vote score" as a cached value in my Post database as well?
counter_cache for a virtual attribute in Rails 3
Django has a cache framework that you could use: http://docs.djangoproject.com/en/dev/topics/cache/. It has a low-level caching API that does what you want:

from django.core.cache import cache
cache.set('my_key', 'hello, world!', 30)
cache.get('my_key')

To use it, you'd do something like:

if cache.get("key"):
    return cache.get("key")
else:
    value = some_expensive_operation()
    cache.set("key", value)
    return value

Using something like this will give you more flexibility in the future.

Comments:
- "I have over 15 of these 'constants' defined at the module level. The question is about managing the initialisation and caching/memoisation of all of them in a uniform way. I do not want all 6 lines of code for each item." (Erik Kaplun)
- "If these are truly constants that don't change, why are they even in the database? Just do a Django settings.py style configuration file." (Marc Hughes)
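The if/else idiom in this answer generalizes into a small reusable helper, which also addresses the "6 lines per item" complaint. A framework-free sketch, where a plain dict stands in for Django's cache backend and the helper name is my own:

```python
# Framework-free sketch of the get-or-compute pattern from the answer above.
# A plain dict stands in for Django's cache backend.
_store = {}

def get_or_compute(key, compute, timeout=None):
    """Return the cached value for key, computing and storing it on a miss."""
    sentinel = object()               # distinguishes "missing" from cached None/False
    value = _store.get(key, sentinel)
    if value is sentinel:
        value = compute()
        _store[key] = value           # a real backend would also honor `timeout`
    return value
```

Later Django versions ship essentially this pattern as cache.get_or_set(key, default, timeout), so with a recent Django each "constant" really is one line.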
How do you normally load and store stuff from the DB in global constants for caching during initialisation? The global constants will not change again later. Do you just make the DB query during load time and put it in a constant, or use a lazy loading mechanism of some sort? What I have in mind is code in the global scope like this: SPECIAL_USER_GROUP = Group.objects.get(name='very special users') OTHER_THING_THAT_DOESNT_CHANGE = SomeDbEnum.objects.filter(is_enabled=True) # several more items like this I ran into issues doing that when running tests using an empty test database. An option would be to put all the needed data in fixtures, but I want to avoid coupling each individual test with irrelevant data they don't need. Would the following be considered good style? @memoize def get_special_user_group(): return Group.objects.get(name='very special users') Or would a generic reusable mechanism be preferred?
What's a good way to load and store data in global constants for caching in Django?
Yes, it will work just fine. If you look at the code, EntityManagerImpl delegates to a SessionImpl, so everything will work as with pure hibernate. Also check this article about caching in JPA 2.0
I have a very simple question; I want to make sure that I don't have any confusion. I saw in the spec that caching is not part of the spec and is provided by the specific ORM tool providers. I'm using Hibernate as the ORM tool in my application. But to be vendor-independent I'm using everything (annotations, classes, etc.) from JPA (javax.persistence) and nothing specifically provided by Hibernate. I'm using EntityManager and EntityManagerFactory instead of SessionFactory and Session. My question is this: in the blogs I saw that the cache providers and caching mechanism provided by Hibernate are taken care of (indirectly) by the Session. So will the EntityManager also be able to use the cache providers and cache configuration, so that the entities and queries marked as cacheable can use the caching features? (I think they should.) Also, is there any API provided by JPA (like the Statistics API provided by Hibernate) to measure and view caching statistics? Please help me in this regard.
Will hibernate cache (EHCache for eg) will work with jpa specific code (if I use EntityManager/EM Factory instead of Session/SessionFactory)?
If you're developing on OSX (I haven't tested on Linux), you can use the purge command to force the disk cache to be purged (flushed and emptied). Then run your SQL query again and it should execute as if it were the first time.
Hi. Sometimes I need to do SQL tuning tasks, which I usually do on my test DB. After I execute a SQL statement, I want to flush the buffer cache containing the SQL statements and results, just like the command in Oracle: "ALTER SYSTEM FLUSH BUFFER_CACHE". Does the PostgreSQL server provide a command for this?
Does Postgres provide a command to flush buffer cache?
The purpose of the #{cache} tag is to cache the output that the server sends to the client. JavaScript, images, and any other resources referenced by the code sent to the client are not cached by it, unless the browser is specifically told to cache them by the headers set in the <head> of your HTML. By default, Play (if you extend main.html) does not specify any cache-control headers, so your scripts will be cached based on the browser's standard caching policy. This should be "no-cache" according to the HTTP spec, but I am doubtful of whether this is the case.
I'm using Play Framework (v1.1.1) and I have a doubt about the #{cache} tag. I suppose the question would be "when should I use it?" but I think it's quite generic. So besides that, I would like to know if someone has checked its behaviour with Javascript. I understand that it will cache the output of other tags embedded in its body, but it will also cache Javascript? More specifically, if I include some script tags that reference external resources (like a CDN), the file will get cached too or only the tag?
Play Framework: caching on templates
ESI is fairly unknown outside the CDN context. However, I think there are a number of interesting use cases for its use 'closer' to the origin servers, now that we see a lot more interest in splitting the applications that deliver Web sites into several self-contained services. An ESI-enabled cache is a good means to integrate these services into a single Web site.
Is it a good practice to use ESI's or is that an older technology? are there alternatives that would be better. I am familiar with the use of CDN's and cache servers but this ESI was typically for applications that do more than just load the front end.
Edge Server Includes (ESI)
The page is still cached. You need to add the following response header:

Cache-Control: no-cache

which doesn't actually prevent caching. The Cache-Control response header's no-cache directive means that the browser MUST NOT use the response to satisfy a subsequent request without successful revalidation with the origin server. If you really want to prevent caching, specify the no-store directive. That tells the browser that it MUST NOT store any part of either this response or the request that elicited it. This directive applies to both non-shared and shared caches. "MUST NOT store" in this context means that the cache MUST NOT intentionally store the information in non-volatile storage, and MUST make a best-effort attempt to remove the information from volatile storage as promptly as possible after forwarding it. See the HTTP 1.1 specs for details on Cache-Control and its directives.

Comments:
- "Hello, I've tried to add HttpContext.Current.Response.CacheControl = "no-cache"; or HttpContext.Current.Response.Cache.SetCacheability(HttpCacheability.NoCache), but it didn't help." (jwaliszko)
- "Install Firefox (if you haven't). Install the Firefox plug-in Tamper Data. Start tampering: Tamper Data lets you inspect and tweak the HTTP[S] request/response stream; you can modify/inject/delete both request and response headers, etc. It's an essential web development tool." (Nicholas Carey)
It's quite a common topic, I think, but I can't resolve my problem. In my application, built with ASP.NET MVC 3, I'm using forms authentication along with output caching:

<authentication mode="Forms">
  <forms loginUrl="~/Account/LogOn" name=".CMS" protection="All" timeout="43200" cookieless="UseCookies"/>
</authentication>
<caching>
  <outputCacheSettings>
    <outputCacheProfiles>
      <add name="Dynamic" duration="3600" location="Client" varyByParam="id" />
    </outputCacheProfiles>
  </outputCacheSettings>
</caching>

My LogOff action looks like the following:

public ActionResult LogOff()
{
    _formsService.SignOut();
    return RedirectToAction("Index", "Dynamic");
}

This action uses a simple SignOut method:

public void SignOut()
{
    FormsAuthentication.SignOut();
    HttpContext.Current.Session.Abandon();

    // clean auth cookie
    HttpCookie authCookie = new HttpCookie(FormsAuthentication.FormsCookieName, string.Empty);
    authCookie.Expires = DateTime.Now.AddDays(-1);
    HttpContext.Current.Response.Cookies.Add(authCookie);

    // clean session cookie
    HttpCookie sessionCookie = new HttpCookie("ASP.NET_SessionId", string.Empty);
    sessionCookie.Expires = DateTime.Now.AddDays(-1);
    HttpContext.Current.Response.Cookies.Add(sessionCookie);
}

But the problem is the following: the page http://localhost/app/dynamic/page is protected. I cannot enter this page until I log in. After login, I have access to browse the page. After logout, and then entering the page again, unfortunately I can still view its content. How do I prevent access to protected pages after logout, when caching is enabled and I was previously visiting those pages? What am I doing wrong? Should the cookies be cleaned in another way? Regards
Issue with cleaning browser cache and cookies on logout in ASP.NET MVC 3
I don't think there'd be a noticeable performance difference in storing your parameters in the HttpCache versus a Singleton object. Either way, you need to load the parameters when the app starts up. The advantage of using the HttpCache is that it is already built to handle an expiration and refresh, which I assume you would want. If you never want to refresh the parameters, then I suppose you could use a Singleton due to the simplicity. The advantage of building your own custom class is that you can get some static typing for your parameters, since everything you fetch from HttpCache will be an object. However, it would be trivial to build your own wrapper for the HttpCache that will return a strongly typed object.
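The "strongly typed wrapper" idea at the end of this answer can be sketched in a few lines. Python is used here purely for illustration (a real ASP.NET version would wrap HttpCache and cast stored objects to the requested type); the class and method names are my own:

```python
# Sketch of a small typed wrapper over an untyped cache (everything stored
# as a bare object), as suggested above. All names are illustrative.
class TypedCache:
    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, key, expected_type, default=None):
        """Return the cached value only if it has the expected type."""
        value = self._store.get(key, default)
        if value is default:
            return default
        if not isinstance(value, expected_type):
            raise TypeError(f"cache entry {key!r} is not {expected_type.__name__}")
        return value
```

The benefit is that a type mismatch fails loudly at the cache boundary instead of surfacing later as a confusing cast error deep in page code.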
I have an app that goes through a web service to access data in a database. For performance purposes, I store all the app's parameters in the cache; otherwise I would call the web service on each page request. Some examples of these parameters are the number of search results to display, or which info should be displayed or not. The parameters are stored in the database because they are edited through a Windows management application. So here comes my question: since these parameters don't have to expire (I store them for a couple of hours), would it be more efficient to store them in a static variable, like a singleton? What do you think?
Performance : asp.net Cache versus singleton
You can get around this by doing something like this:

User if Rails.env == 'development'
@user = Rails.cache.fetch("key") { User.find(0) }

This will force the User model to be re-loaded before the cache statement. If you have a class with multiple cache statements you can do this:

class SomeController
  [User, Profile, Project, Blog, Post] if Rails.env == 'development'

  def show
    @user = Rails.cache.fetch("user/#{params[:user_id]}") do
      User.find(params[:user_id])
    end
  end
end

If you are in Rails 2.x and Rails.env does not work, you can always use RAILS_ENV or ENV['RAILS_ENV'] instead. Of course, your other option is to simply disable caching in your development environment; then you don't have to deal with this issue at all.
What is the right way to preload a Rails model in development mode? Background: Rails 2.2, memcached as the cache store. When Rails starts in production mode, it first preloads and caches all models. In development mode it uses lazy loading. That's why, when we store a model instance in the Rails cache, for example Rails.cache.write("key", User.find(0)), then on the next load of the app, when we try Rails.cache.read("key"), memcached raises an error that User is an unknown class/module. What is the right way to preload the class in this situation?
Rails. Preloading class in Development mode
I figured out how to do it:

CacheManager.getInstance().getEhcache("CacheName").removeAll();

It gets the singleton CacheManager, then gets the Ehcache by cache name, then removes its elements. On the next request to that cached page, the filter looks for the Ehcache, finds it (but without elements), and updates the elements!
We are using web-ehcache's net.sf.ehcache.constructs.web.filter.SimplePageCachingFilter, configured via XML, to cache a page containing a JSON message, but this message can be changed by an administrator. How do we invalidate the cache when the administrator changes the JSON message?
How to renew or invalidate cache using SimpleCachingPageFilter in Ehcache?
You can clean out the buffer cache and the sproc cache with

DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE

respectively.
When I am trying to examine the performance of a given query in my application I'm generally uninterested in the effect my code is having. I want to be able to watch the time taken in Sql Management Studio. Unfortunately, I'm finding that some sort of caching must be going on, because one query that returns 10,000 results from a table with 26 or so columns, many of them large varchars, takes 12 seconds the first time I run it in a while and takes 6 seconds the following times unless I don't re-run it for a few minutes. Is there any way to instruct it to bypass the cache and pretend like it had never run it before? I'm using SQL Server 10.0.
How to bypass SQL Server caching
Food for thought. How many records are you caching, and how big are the tables? How much mid-tier resource can be reserved for caching? How many of each type of data exist? How fast will filtering on the client side be? How often does the data change? How often is it changed by the same application instance? How often is it changed by other applications or server-side jobs? What is your cache invalidation policy? What happens if you return stale data? Can you/should you leverage active cache invalidation, like SqlDependency or LinqToCache? If the dataset is large then filtering on the client side will be slow and you'll need to cache two separate results (no need for a third if ALL is the union of the other two). If the data changes often then caching will return stale items frequently without proactive cache invalidation in place. Active cache invalidation is achievable in the mid-tier if you control all the update paths and there is only one mid-tier application instance, but it becomes really hard if either of those prerequisites is not satisfied.
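To make the trade-off concrete, here is a rough Python sketch of the "cache ALL, filter per mode" option from the question, with a dict standing in for the cache and a counter tracking database round-trips. All names and the sample data are illustrative:

```python
# Option A: cache the full FAQ list once and derive the filtered views in
# memory. Option B would instead store three keys (FAQ_ALL / FAQ_ANSWERED /
# FAQ_UNANSWERED), each filled by its own database query.
cache = {}
db_hits = []

def load_all_faqs():
    db_hits.append(1)  # simulate one database round-trip
    return [
        {"id": 1, "question": "Q1", "answer": "A1"},
        {"id": 2, "question": "Q2", "answer": None},
    ]

def get_faqs(mode="all"):
    faqs = cache.get("FAQ_ALL")
    if faqs is None:
        faqs = cache["FAQ_ALL"] = load_all_faqs()
    if mode == "answered":
        return [f for f in faqs if f["answer"] is not None]
    if mode == "unanswered":
        return [f for f in faqs if f["answer"] is None]
    return faqs
```

With this shape, all three modes are served from one DB hit and one cache entry to invalidate; the cost is the in-memory filter per request, which only matters when the list is large.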
I have method in my BLL that interacts with the database and retrieves data based on the defined criteria. The returned data is a collection of FAQ objects which is defined as follows: FAQID, FAQContent, AnswerContent I would like to cache the returned data to minimize the DB interaction. Now, based on the user selected option, I have to return either of the below: ShowAll: all data. ShowAnsweredOnly: faqList.Where(Answercontent != null) ShowUnansweredOnly: faqList.Where(AnswerContent != null) My Question: Should I only cache all data returned from DB (e.g. FAQ_ALL) and filter other faqList modes from cache (= interacting with DB just once and filter the data from the cache item for the other two modes)? Or should I have 3 cache items: FAQ_ALL, FAQ_ANSWERED and FAQ_UNANSWERED (=interacting with database for each mode [3 times]) and return the cache item for each mode? I'd be pleased if anyone tells me about pros/cons of each approach.
ASP.NET data caching design
Here's a cache viewer app, built as a Firefox add-on: https://addons.mozilla.org/en-US/firefox/addon/cacheviewer/ For writing your own program, the Firefox/Mozilla cache format is documented here: http://www.pyflag.net/cgi-bin/moin.cgi/Mozilla_Cache_Format That documentation is part of pyflag, a forensic analysis tool. Here's the Python source code that reads the cache info: http://www.pyflag.net/pyflag/src/FileFormats/MozCache.py
I have a set of images which I'd like to extract from my Firefox cache (about:cache?device=disk). There is a pattern in the Key field that I can use to query the files I want, but the problem is: how can I write a program to read this content? Any suggestions? It can be in Python, Java, or C++; the language doesn't really matter to me, I'm just looking for suggestions. Thanks!
how to get content from about:cache?device=disk
I also came across this question. It is too bad that almost every blog post and article I find about this subject dutifully replicates the MSDN example without really explaining how it works. I don't have a definite answer, but I think this works because the page life cycle is invoked at least once: namely, when the page is requested for the first time and thus isn't cached yet. During that first request, Page_Load is called and the HttpCacheValidateHandler is registered with the Cache object. During all subsequent requests for that page, the Cache object is able to call your ValidateCacheOutput() method, and because this method is static, the page life cycle doesn't have to be invoked. I hope that someone who knows more about this can comment, but in my opinion this also implies the following:

1. In the given example, the HttpCacheValidateHandler doesn't need to be a static method of the page, because it doesn't use any properties of the Page object. It can be a static method on any other object you like.
2. The ValidateCacheOutput() method will probably be called for every page request, not just for the page which is (ab)used to call Response.Cache.AddValidationCallback(). Maybe I'm missing something obvious, but I don't see how the Cache "knows" which HttpCacheValidateHandler belongs to which page.

Comment: "About 2, wouldn't that mean that if you return HttpValidationStatus.Invalid from ValidateCacheOutput, the Page_Load would be fired again and the validation callback would be added a second time?" (user3700562)
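One way to picture the mechanics being discussed here: a cache that runs a registered validator before serving each stored entry, where an "invalid" verdict evicts the entry so the page gets regenerated (which, per the answer, re-registers the callback). This is only an illustrative sketch, not the ASP.NET implementation; all names are assumptions:

```python
# Sketch: a cache that consults a validation callback before serving a
# stored entry, mirroring the AddValidationCallback idea.
VALID, INVALID, IGNORE = "valid", "invalid", "ignore"

class ValidatingCache:
    def __init__(self):
        self._entries = {}   # key -> (value, validator)

    def store(self, key, value, validator):
        self._entries[key] = (value, validator)

    def fetch(self, key, request):
        entry = self._entries.get(key)
        if entry is None:
            return None
        value, validator = entry
        status = validator(request)
        if status == VALID:
            return value              # serve straight from the cache
        if status == INVALID:
            del self._entries[key]    # evict: the page must be regenerated
        return None                   # INVALID or IGNORE: cache miss this time

def status_validator(request):
    """Mimics the MSDN example: a Status query parameter drives the verdict."""
    return request.get("Status", VALID)
```

Note how IGNORE skips the cache for this one request without evicting the entry, while INVALID both misses and evicts, which matches the three HttpValidationStatus outcomes.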
I'm reading how to programmatically invalidate cached pages on the server in ASP.NET, and the book ("MCTS Self-Paced Training Kit (Exam 70-515)") says:

To directly control whether a cached version of a page is used or whether the page is dynamically generated, respond to the ValidateCacheOutput event and set a valid value for the HttpValidationStatus attribute.

The code segments look like the following:

public static void ValidateCacheOutput(HttpContext context, Object data,
    ref HttpValidationStatus status)
{
    if (context.Request.QueryString["Status"] != null)
    {
        string pageStatus = context.Request.QueryString["Status"];
        if (pageStatus == "invalid")
            status = HttpValidationStatus.Invalid;
        else if (pageStatus == "ignore")
            status = HttpValidationStatus.IgnoreThisRequest;
        else
            status = HttpValidationStatus.Valid;
    }
    else
        status = HttpValidationStatus.Valid;
}

protected void Page_Load(object sender, EventArgs e)
{
    Response.Cache.AddValidationCallback(
        new HttpCacheValidateHandler(ValidateCacheOutput), null);
}

Could someone please explain to me what this code is doing? Also, the main question I have is that I thought cached pages were simply returned from the server, but the code above indicates that the page life cycle is being invoked (the Page_Load event). I'm confused because the page life cycle isn't invoked if a cached page is returned, so how would the code in the Page_Load event even fire?

Note: Here's the same example that the book has
ASP.NET Caching - Programatically Invalidating Server Cache...confusion on example
Most PHP builds don't have a caching mechanism built in. There are extensions, though, that can take care of caching for you. Have a look at APC or Memcache.
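If you can't install an extension, a plain file-based cache covers both use cases in the question (HTML output and variables). Here is a sketch of the pattern in Python; a PHP version is structurally identical (serialize/unserialize plus filemtime instead of json and getmtime):

```python
import json
import os
import time

def cache_get(path, max_age):
    """Return the cached value, or None if the file is missing or stale."""
    try:
        if time.time() - os.path.getmtime(path) > max_age:
            return None  # entry is older than max_age seconds
        with open(path) as f:
            return json.load(f)
    except OSError:
        return None  # no cache file yet

def cache_put(path, value):
    """Store any JSON-serializable value (HTML string, dict of variables...)."""
    with open(path, "w") as f:
        json.dump(value, f)
```

The usual call pattern is: try `cache_get`; on a miss, compute or render the value, `cache_put` it, and return it.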
What are the available cache methods I could use in PHP? I want to:

- Cache HTML output
- Cache some variables

It would be great to implement more than one caching method, so I need all the available options out there. (I currently do caching with files; any other ideas?)
cache methods in php?
I would set up a cronjob to run nightly that parses the access log file and updates the counts. I'm not sure about the memcache method (I have not tried it), but if you resorted to updating the database on every single request, I don't think it would be very efficient: update queries are expensive, and updating a count column involves locking, at minimum, the row. Alternatively, you could insert a record into a "views" table for each view, then run a cronjob nightly to aggregate the view counts, add them to a "views" column in the pages table, and afterwards purge the records it aggregated. Then there is always Google Analytics as well, if you are OK with resorting to a third party.
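The nightly aggregation step can be sketched as follows: parse the access log, count successful GETs per path, and then issue one bulk UPDATE per page. A minimal Python sketch of the parsing/counting half (the combined log format and the database step are assumptions about your setup):

```python
import collections
import re

# Matches the request and status fields of a common/combined-format log line.
LOG_LINE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_views(lines):
    """Aggregate successful GET requests per path from access log lines."""
    counts = collections.Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "200":
            counts[m.group("path")] += 1
    return counts
```

The resulting counter maps each cached page's path to its view count for the period, which the cronjob can fold into the pages table with one UPDATE per row (or a single bulk statement).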
My application caches pages so no unnecessary requests to the database are made. They are cached as files in the filesystem with appropriate names (their unique identifiers). I need to be able to keep view counts for those cached pages, and one way is storing them in memcached and incrementing the values by one every time. When the cache is cleaned, the values are collected and updated in the database using a bulk query. I'm not sure this is a good idea, and I've noticed that accessing memcached slows things down. Are there any better solutions? Edit: I'm not caching small bits of data; I am caching HTML pages, and a lot of them. There are about 30 pages per user, and with a million users the amount of data that needs caching will be massive.
Best place to keep a view count for a cached object in PHP?
The format of a URI has nothing to do with cacheability, other than noting that requests with query parameters are not cacheable by default. Everything about the cacheability of a GET request is driven by the Cache-Control, Expires, and Last-Modified (for heuristic caching) headers on the server response, and these don't have anything to do with whether the resource is dynamically or statically generated (or rather, your browser doesn't care and can't tell the difference). URL opacity is meant to promote one of the primary principles of REST, which is that services should be hypermedia-driven, and that really, clients should only "know" a few well-known entry-point URLs and get everywhere else by navigating links and forms (or their API equivalents). (From the comments: the worry that a single URI responding to different Accept headers, i.e. HTML, XML, JSON, can't be cached is unfounded; you can set 'Vary: Accept' on the server response, creating the notion of multiple cache "variants". This lets you have multiple cache entries for the same URI and have the right one served to each client based on its Accept header.)
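The Vary mechanism mentioned above is easy to see in code: a cache keys each stored response on the URI plus the request headers named by the response's Vary header, so one URI can hold several variants. A minimal Python sketch (the normalization here is simplified relative to what RFC 7234 requires):

```python
def cache_key(url, request_headers, vary_header):
    """Build a cache key for a response that carried `Vary: <vary_header>`.

    `request_headers` is assumed to be a dict with lowercase header names.
    Each header listed in Vary is folded into the key, so the same URI
    maps to a separate cache entry per variant.
    """
    parts = [url]
    for name in (h.strip().lower() for h in vary_header.split(",")):
        parts.append("%s=%s" % (name, request_headers.get(name, "").lower()))
    return "|".join(parts)
```

With this key function, a JSON client and an HTML client hitting the same URI land on different cache entries, which is exactly the behaviour the comment describes.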
I am curious how other developers reconcile http://www.w3.org/DesignIssues/Axioms.html#opaque in the context of web caching. I prefer Rails' approach of suffixing resource requests with the format I want, i.e. .json or .xml, rather than relying on the Accept header, despite the fact that it is not URI-opaque. The same issue rears its head with XHRs: without the addition of a query param to differentiate them from standard HTTP requests, caching must be disabled. I have personally concluded that the purest interpretation of URI opacity may be more academic than practical. Opinions?
Axiom of URI Opacity and Caching
See http://api.jquery.com/jQuery.ajax/: when you use AJAX to fetch the data source, you can tell jQuery whether or not to cache it via the cache option.
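For reference, jQuery's cache: false works by appending a throwaway timestamp parameter (named _) to the request URL, so each request has a unique URL and caches are skipped. The same trick, sketched in Python:

```python
import time
from urllib.parse import urlencode

def bust_cache(url):
    """Append a `_` timestamp parameter, as jQuery's `cache: false` does,
    so every request gets a unique URL and any cached copy is bypassed."""
    sep = "&" if "?" in url else "?"
    return url + sep + urlencode({"_": int(time.time() * 1000)})
```

Conversely, leaving cache enabled (the default for XML data types in $.ajax) is what you want here, since the goal is to reuse the remote XML document between autocomplete lookups.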
I'm using a remote XML source. Can someone point me in the right direction to cache this document? Thanks
jQuery ui autocomplete cache xml source
To answer my own question: HttpWebRequest does make use of the IE cache, and RequestCacheLevel.BypassCache is referring to that cache. The setting works by adding cache-control headers. Compared to NoCacheNoStore: both avoid the cache; however, BypassCache will still write the response to the cache, so if you make a future request it will be there. Reload does something similar, but its headers allow proxies to return responses from their caches, whereas BypassCache tells proxies to ignore their caches too. Yes, it is the best option ;)
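On the last point, the strongest client-side effort is to send the no-cache headers explicitly yourself. The header mechanics are the same regardless of the HTTP client, so here is an equivalent sketch in Python's urllib (the header set shown is the conventional one for forcing revalidation; it is an approximation of what the cache policies add, not a verified dump of HttpWebRequest's behaviour):

```python
import urllib.request

def fresh_request(url):
    """Build a request asking every cache along the way to revalidate
    with the origin server before answering (best effort from the client)."""
    req = urllib.request.Request(url)
    req.add_header("Cache-Control", "no-cache")
    req.add_header("Pragma", "no-cache")  # for HTTP/1.0 caches and proxies
    return req
```

Note that, as with BypassCache, this does not forbid storing the response; for that you would send Cache-Control: no-store as well, matching NoCacheNoStore.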
The documentation on RequestCacheLevel.BypassCache doesn't explain much about the internal workings of setting this on an HttpWebRequest. Some questions I have on it: What cache is this referring to? (A previous comment implies it shares a cache with Internet Explorer.) Is this setting adding any cache-control headers? How does this compare with NoCacheNoStore and Reload, which do set headers? If I want to ensure (as much as possible from the client side) that the content does not come from a cache, is this the best option?
What is RequestCacheLevel.BypassCache internally?