What you're looking for is a database. Database engines are built to keep key tables and object tables separate. (If you're not familiar with that concept, search the web for information on database indexing, primary keys, etc.) They're also built to keep pages of the key tables in RAM, along with recently used objects, and many database engines let you configure how much of each to keep in RAM. I think you should use SQLite with a lightweight ORM (like Dapper or ServiceStack.OrmLite or one of a bazillion others). SQLite has a pragma to turn off disk synchronization and others to adjust how much data it keeps in RAM. See the info here: http://www.sqlite.org/pragma.html
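A minimal sketch of that setup, assuming the Microsoft.Data.Sqlite provider and Dapper; the Person table, connection string, and pragma values are illustrative, not prescriptive:

    using Dapper;
    using Microsoft.Data.Sqlite;

    // Open (or create) the cache database.
    using var conn = new SqliteConnection("Data Source=object-cache.db");
    conn.Open();

    // Trade durability for speed, and give SQLite a larger page cache
    // (a negative cache_size means "this many KiB of RAM").
    conn.Execute("PRAGMA synchronous = OFF; PRAGMA cache_size = -65536;");

    conn.Execute(@"CREATE TABLE IF NOT EXISTS Person (
                       PersonId INTEGER PRIMARY KEY,
                       FirstName TEXT, LastName TEXT)");

    // Dapper maps rows to objects, so the 'cache' is just keyed SQL access:
    // all keys live in the index, only fetched rows become objects in memory.
    conn.Execute("INSERT OR REPLACE INTO Person VALUES (@PersonId, @FirstName, @LastName)",
                 new { PersonId = 1L, FirstName = "Ada", LastName = "Lovelace" });
    var person = conn.QuerySingle<Person>(
        "SELECT PersonId, FirstName, LastName FROM Person WHERE PersonId = @id",
        new { id = 1L });

    public record Person(long PersonId, string FirstName, string LastName);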
I am looking for a C# object cache library that can implement the following patterns:

- The cache is used to cache objects of a certain type T that have a primary key. Example: a Person class (with first name, last name etc.) where the key is PersonId.
- The cache can store an unlimited # of keys. The keys are of type int or long.
- The cache, however, can store only a limited # of objects of type T. T objects take a lot of memory and I can't have lots of them in the cache at a time.
- On overflow the cache can serialize the objects to a database or file etc. (a fast medium), but the cache would still keep the keys around.

I basically need to process more T objects than I can keep in memory, and I want to use the cache to retrieve them quickly before I save the results to the database. So, I was thinking of using the Proxy pattern and having the cache store proxy objects that can get/serialize my real objects. Do you know of any C# caching library that can be used with these patterns? I haven't been able to find anything myself. Thanks
Object cache with unlimited # of keys but a limited # of objects
Try using cattr_accessor; it'll add the accessor method onto the class as well. This worked for me.
When I add an attr_accessor to my model without the column in the database, I can add temporary data to an array of class objects. My example:

    class User < ActiveRecord::Base
      attr_accessor :score
    end

The problem, however, is that if I cache an array of users with scores in memcache, the array goes from:

    [< User >, < User >, < User >]

to:

    [< User >, :@score, 100, < User >, :@score, 200, < User >, :@score, 300]

Is there any way to cache this information without breaking the array?

EDIT: As requested, the actual code that puts the data in the cache:

    def users_scoreboard
      Rails.cache.fetch("special_scoreboard_#{self.cache_key}", :expires_in => 1.hour) do
        users = Photo.missions(self.missions).created(self.start_at, self.end_at).map(&:user).uniq!
        users = [] if users.nil?
        users.each do |u|
          u.score = u.score_for_special(self)
        end
        users.sort! { |a,b| b.score <=> a.score }
      end
    end

EDIT: What I'm using:

    ruby -v: ruby 1.9.2p290 (2011-07-09 revision 32553) [x86_64-darwin11.0.1]
    rails -v: Rails 3.2.0
    memcached -i: memcached 1.4.5
    dalli: 1.8.8

But the problem appears with either memcache or the file store. Thanks for the help!
Cache Model array with attr_accessor?
There's a useful piece here on the AppFabric blog that talks about this error, with some diagnosis and workarounds which might be helpful for you, particularly as you reference your cache server by IP address and not by name.

Comment: It's not working; we have our PCs in different domains, so we cannot add the client PC on the server. What can we do to solve it? – Fernando Urkijo
I have the following code:

    static void Main(string[] args)
    {
        var config = new DataCacheFactoryConfiguration()
        {
            Servers = new List<DataCacheServerEndpoint>
            {
                new DataCacheServerEndpoint("192.168.129.118", 22233)
            },
            TransportProperties = new DataCacheTransportProperties()
            {
                ConnectionBufferSize = 99999,
                ChannelInitializationTimeout = TimeSpan.FromSeconds(2),
                MaxBufferPoolSize = 99999,
                MaxBufferSize = 99999,
                ReceiveTimeout = TimeSpan.FromSeconds(2)
            },
            SecurityProperties = new DataCacheSecurity(DataCacheSecurityMode.Transport,
                                                       DataCacheProtectionLevel.EncryptAndSign),
        };

        DataCacheFactory factory = new DataCacheFactory(config);
        var cache = factory.GetCache("Maestro_del_mambo");
        cache.Put("123", "que tal andamios");
        var cities = cache.Get("123");
        Console.Read();
    }

When executing it, it fails on the cache.Put and cache.Get with the following error message:

    ErrorCode<ERRCA0016>:SubStatus<ES0001>:The connection was terminated, possibly due to server or network problems or serialized Object size is greater than MaxBufferSize on server. Result of the request is unknown.

The server-side cache cluster has granted access to my client account, so... what are we doing wrong?
Appfabric cache ErrorCode<ERRCA0016>:SubStatus<ES0001>
This removes the link for all roles; put it in functions.php:

    function remove_purge_from_page_cache_link($actions, $post) {
        unset($actions['pgcache_purge']);
        return $actions;
    }
    add_filter('post_row_actions', 'remove_purge_from_page_cache_link', 1000, 2);
    add_filter('page_row_actions', 'remove_purge_from_page_cache_link', 1000, 2);

To make it remove the link only for authors, you'll want to use something like this inside the function:

    if (!current_user_can('publish_posts')) {
        unset($actions['pgcache_purge']);
    }

You can tweak the logic to target the user group(s) you want.
With the plugin W3 Total Cache, in the overview of WordPress posts there is the ability to "Purge from Page Cache" for each post. This function is also available to users with the role "author". That would be nothing to worry about if it were possible only on their own posts, but as an "author" you can do this on other users' posts too. So, is there a way to configure W3TC to not allow this for specific user groups?
disable "Purge from Page Cache" for specific roles on w3-total-cache
It is better to test with your own code/data than to try to guess without full information. Write sample code that generates data (the data should be as close as possible to your real samples; it could be stored in a database or sent to your app using messages, depending on its workflow). Then write a simple test that exercises the read/write methods your application uses, and run it against all three strategies.
EhCache comes with the ability to choose an eviction policy for when a cache fills up to its maximum size. This eviction policy is used to determine which elements to "evict" from the cache so that it does not overflow. The three eviction policy options for on-heap memory stores are:

- LFU (Least Frequently Used), the default
- LRU (Least Recently Used)
- FIFO (First In, First Out)

My question is: how does one determine which one of these policies is most effective for a particular application? Obviously, each will have its own strengths and weaknesses, and different applications will fare better or worse with each of these depending on numerous factors. Is there a benchmark that can be set up? I'd love to write a performance test, but wouldn't know where to start.
EhCache: Choosing Eviction Policy
Ideally, if your last-modified timestamp changes, everything should take care of itself. Note that with no explicit expiry headers, browsers fall back to a heuristic (commonly a fraction of the time elapsed since Last-Modified), so there is no guaranteed refresh time. You can set up Cache-Control headers to avoid this: http://blogs.msdn.com/b/rahulso/archive/2006/02/02/suppress-caching-of-certain-mime-types-like-gif-jpg-etc.aspx
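If you'd rather not rely on the heuristic, you can set an explicit expiry on the image response yourself. A minimal sketch, assuming classic ASP.NET where the image is served through a handler (the handler name and file path are made up; you would map it to image requests in web.config):

    using System;
    using System.Web;

    public class CachedImageHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // An explicit one-week lifetime: the browser caches the image but
            // knows exactly when to re-request it, instead of guessing.
            context.Response.ContentType = "image/png";
            context.Response.Cache.SetCacheability(HttpCacheability.Public);
            context.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(7));
            context.Response.Cache.SetMaxAge(TimeSpan.FromDays(7));
            context.Response.WriteFile(context.Server.MapPath("~/images/logo.png"));
        }
    }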
I pushed some changes to one of our images to the web server, but I found it's not updated in IE and Firefox due to browser caching; the browsers didn't even request the image from the server. I have two choices: one is to update the image file name and re-deploy; the other is to make our users' browsers think the image has expired. I haven't set anything in the headers or in code. Here is the response:

    Content-Length: 32042
    Content-Type: image/png
    Last-Modified: Wed, 09 Nov 2011 18:41:50 GMT
    Accept-Ranges: bytes
    Server: Microsoft-IIS/6.0
    X-Powered-By: ASP.NET
    Date: Fri, 09 Dec 2011 01:26:03 GMT
    Connection: keep-alive

So I am wondering how soon it can expire?
default time for an image to expire
Your best option for file-system cache/search is probably a Lucene index. However, if your content is quite static, could you not look into using output caching? A trivial example:

    [OutputCache(Duration = 10, VaryByParam = "none")]
    public ActionResult Index()
    {
        return View();
    }

This will cache the result for 10 seconds. Check here: http://www.asp.net/mvc/tutorials/improving-performance-with-output-caching-cs

Comment: Thanks for the quick reply. What we see as the deficiency with output caching is that sporadically a user will have to endure the delay while the data is pulled and the output cache is refreshed. We were looking to cache the information to the file system, set the expiry, and then create a behind-the-scenes process to load this cache on a weekly basis by creating a web service entry point and consuming it out of hours to refresh the cache. – trevorgk
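If a file-backed store is still the goal, the core of it is small. A minimal sketch (the FileCache class is hypothetical; JSON serialization via JavaScriptSerializer and a flat cache directory are assumptions, and it ignores locking and expiry for brevity):

    using System;
    using System.IO;
    using System.Web.Script.Serialization; // System.Web.Extensions assembly

    public class FileCache
    {
        private readonly string _root;
        private readonly JavaScriptSerializer _json = new JavaScriptSerializer();

        public FileCache(string root)
        {
            _root = root;
            Directory.CreateDirectory(root);
        }

        // Key becomes the file name; value becomes its serialized contents.
        public void Set<T>(string key, T value)
        {
            File.WriteAllText(PathFor(key), _json.Serialize(value));
        }

        public T Get<T>(string key)
        {
            string path = PathFor(key);
            if (!File.Exists(path)) return default(T);
            return _json.Deserialize<T>(File.ReadAllText(path));
        }

        private string PathFor(string key)
        {
            return Path.Combine(_root, Uri.EscapeDataString(key) + ".json");
        }
    }

Usage would then be a Set call from the weekly refresh job and a Get call from the services tier, with the web front end never waiting on the ATDW service directly.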
We are working on an ASP.Net MVC/Jquery mobile project for a chain of Australian holiday parks. The bulk of park related data is stored in a SQL database. One of the sections currently in development is 'local attractions'. The data for this section is pulled from a webservice owned by the Australian Tourism Data Warehouse (ATDW). Though it can take up to 45 seconds for the view to load, fortunately for us the data is reasonably static. We are planning to cache it, and refresh it on a weekly basis. This data is also relatively large. I consider that it is not a suitable candidate for in-memory caching. I am interested in a file based implementation of the System.Runtime.ObjectCache. I have scoured the internet and I haven't managed to find any examples of this. My question is general, and I am looking for advice on reading/writing to a file based cache. The data will be retrieved by the class CachedLocalAttractionsService.cs in a services tier which is called directly from the controller. Please advise on: Storing name/value pairs on disk, Serializing large amounts of data retrievable in a Cache[key] fashion, Any examples that I might have missed in my 6 hours of searching so far.
Extending ObjectCache to create a file-based cache alternative
You can use MS AppFabric caching, a free component for Windows Server, which solves this by decoupling the caching from the IIS app domain. See the question and answers here: AppFabric vs System.Runtime.Caching. If you keep using ASP.NET caching you are exposed to the issues you have described above, and to my knowledge there is no solution.
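If switching cache stores isn't an option, the closest workaround is to snapshot selected entries around the recycle. A rough sketch using Global.asax events (the file path and key list are made up; Application_End is not guaranteed to run on every kind of recycle, and JavaScriptSerializer round-trips values as loosely typed dictionaries, so a real version would serialize each key to its concrete type):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Web;
    using System.Web.Script.Serialization;

    public class Global : HttpApplication
    {
        private static readonly string SnapshotPath =
            Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "App_Data", "cache-snapshot.json");
        private static readonly string[] KeysToPersist = { "lookups", "settings" };

        protected void Application_End()
        {
            // Fired when the AppDomain unloads (recompile, idle timeout, etc.).
            var snapshot = new Dictionary<string, object>();
            foreach (var key in KeysToPersist)
                if (HttpRuntime.Cache[key] != null)
                    snapshot[key] = HttpRuntime.Cache[key];
            File.WriteAllText(SnapshotPath, new JavaScriptSerializer().Serialize(snapshot));
        }

        protected void Application_Start()
        {
            if (!File.Exists(SnapshotPath)) return;
            var snapshot = new JavaScriptSerializer()
                .Deserialize<Dictionary<string, object>>(File.ReadAllText(SnapshotPath));
            foreach (var pair in snapshot)
                HttpRuntime.Cache.Insert(pair.Key, pair.Value);
        }
    }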
When code changes are published to .NET websites the re-compilation process involves restarting the AppDomain, this in turn wipes the application cache. Are there any events raised when this happens? Is there any way to manually serialise portions of my cached data and save it to disk, then subsequently re-initialise the cache when the application is loaded again?
ASP.NET persist Cache when AppDomain is refreshed
Storing millions of objects sounds like a database application.

Comments:
Yes, of course, but at some point I need to use a lot of objects at the same time in my software to compute them together in a batch. These objects are images, by the way. ;) – user984161
I assume you already have a data structure for your images that contains the filename etc., but not the image data itself. Why not store the files (image data) on the disk/database and retrieve them when needed? – michael667
Actually I have both, image data and structure. But you are right, this is exactly my question: how to determine whether I need to serialize an object or not (according to the free memory). Are there intelligent ways? E.g. using a soft reference would mean that I always need to serialize an object when I create or change it, as I don't know when Java kicks it off the heap. Or can I listen to that somehow? – user984161
I've come to a point where neither Google nor my own knowledge gets me any further. Consider the following situation: I read in a lot (up to millions) of large objects (up to 500 MB each), and sometimes I read in millions of objects of only 500 KB; it completely depends on the user of my software. Each object is going to be processed in a pipeline, so they don't all need to be in memory all the time; only a reference is needed to find the objects again on my hard disk after serializing them, so that I can deserialize them again. So it's something like a persistent cache for large objects. Here come my questions:

- Is there a solution (any framework) which does exactly what I need, including arbitrary serialization of large objects after somehow determining that the cache is full?
- If there isn't: is there a way to intelligently check whether an object should be serialized or not, e.g. by checking the available memory somehow? Or something like a listener on a SoftReference (for when it gets released)?

Thanks a lot, Christian
Caching large objects and de/serializing them if needed (Java)
I guess it depends on what kind of cache you are looking for. ICS adds a built-in HTTP response cache; see this blog post for more info. If you are looking for a more general disk cache, then the ICS source code has a nice class called DiskLruCache (not available in the public APIs though). It has been back-ported and released by a third party: DiskLruCache.
I see that CacheManager is a deprecated class in the Android SDK. So what is the substitute for CacheManager? I see a way with the data storage techniques, but how is the cache algorithm supposed to be implemented with those? Does everyone have to write their own CacheManager?
What is the alternative to the deprecated CacheManager class in Android?
If an exception occurs in the method where the caching is declared, there is nothing to cache and everything is fine. There is no easy way to bind the Spring cache to the transaction, and I think it would not be OK to do so. Try using something like the Hibernate second-level cache (e.g. with Ehcache) or a similar DB cache. They automatically remove or add data depending on the state of the DB, and they assure consistency with the DB.
I have a question regarding caching with Spring 3.1: is it possible to configure Spring to automatically roll back cache modifications when a JDBC rollback occurs? I'm talking here about consistency between the cache and the DB.
Rolling back Spring 3.1 Cache or Evict
You will still get the head.js speed benefit during the initial cache manifest download and during cache manifest updates.
There are large speed benefits to using head.js on my own sites. Now I am considering the HTML5 cache manifest to improve offline access to the sites and to improve speed (more things loaded from cache). Are the benefits of head.js still there (parallel script loading in particular) if I use the HTML5 cache manifest?
html5 cache manifest compatibility with head.js?
One common use case for Hibernate's second-level cache is to cache static or very infrequently changing reference data, for example a list of states/provinces and their attributes. Spring's @Cacheable is useful anywhere you have a method that returns a value that is expensive to compute. In both the @Cacheable and Hibernate second-level cache scenarios, if the data will change over time then you should think hard about which cache implementation to use (e.g., should it be a distributed cache? What about cache invalidation?).

Comment: It should be a distributed cache, so I'm considering memcached, redis, infinispan, and so on. – Whiteship
I have read Hibernate's reference documentation about the 2nd level cache here, and I've read about Spring 3.1's cache abstraction here. I've understood that Hibernate's 2nd level cache is very similar to the first level cache, but its scope is extended to the SessionFactory. And Spring 3.1's @Cacheable looks good for the service layer. I want to hear more detailed use cases for each: when and where should I use the Hibernate 2nd level cache, and when and where should I use Spring 3.1's cache abstraction? Thanks for reading.
What's the suitable use-case for the Hibernate 2nd level cache and the Spring 3.1's @Cacheable?
First, your AJAX might be loading so quickly that you aren't seeing the loading message. Second, you can add a check to see if the tab has been loaded already, and if it has, not switch to the loading message:

    $(function() {
      var cached = {};
      $( "#tabs" ).tabs({
        cache: true,
        select: function(event, ui) {
          if ( !cached[ui.index] ) {
            $(ui.panel).html('Loading tab content...');
            cached[ui.index] = true;
          }
        }
      });
    });
I am using jQuery 1.6.2 and jQuery UI. I would like to show a custom message during tab AJAX loading. I have two tabs (as in the official documentation) and I'd like to display a "spinner" message inside the HTML div being loaded, instead of near the tab title. The problem with the following code is that I have enabled the cache, so it doesn't work as expected:

    $jQ(function() {
      $jQ( 'tabs' ).tabs({
        cache: true,
        select: function(event, ui) {
          var currentId = ui.index + 1;
          $jQ('#' + currentId).html('Loading tab content...');
        }
      });
    });

Here is what happens: the first time I click on the second tab, I don't get any loading message, but the tab content is loaded. The second time I click on the second tab, I get the loading message but no tab content (that is, the tab content is not displayed and it stays stuck on the loading message). How can I make it work?
Spinner message with cache enabled
I think your problem is due to some missing or wrong indexes. I have a sf1.4 project for a large soccer site (i.e. 2M pages/day) and the pagers aren't that slow, even though our database has more than 1M rows these days. Take a look at your query with EXPLAIN and check where it goes bad...

Comments:
I tried that with EXPLAIN and optimized the indexes. We upgraded our hardware and that made things better. However, you didn't answer my question. I asked whether the concept of cache swapping is possible. Why must users wait while the new cache entry is built? Old cache entries are still available. – fishbone
I don't see the point. Yes, cache swapping is possible, but this solution is only for some very critical scenarios. And I think you can solve it another way by working on your DB stuff. – dlondero
I run a Symfony 1.4 project with a very large amount of data. The main page and category pages use pagers which need to know how many rows are available. I'm passing a query which contains joins to the pager, which leads to a loading time of 1 minute on these pages. I configured cache.yml for the respective actions. But I think the workaround is insufficient, and here are my assumptions: Symfony rebuilds the cache within a single request which is made by a user; let's call this user the "cache victim" to simplify things. In our case, the data needs to be up to date; a lifetime of 10 minutes would be sufficient. Obviously, the cache won't be rebuilt if no user is willing to be the "cache victim" and therefore just cancels the request. Are these assumptions correct? So, I came up with this idea: Symfony should fake the HTTP request after rebuilding the cache. The new cache entries should be written to a temporary file/directory and swapped with the previous cache entries as soon as cache rebuilding has finished. Is this possible? In my opinion, this is similar to the concept of double buffering. Wouldn't it be silly if there were a single "GPU victim" in a multiplayer game who sees the screen building up line by line? (This is a lop-sided comparison, I know... ;) ) Edit: There is no "cache victim"; every 10 minutes, page reloading takes 1 minute for every user.
Automatically rebuild cache
Small files don't compress especially well, so you may not gain much compression. While loading the files will be fast because they are smaller, decompression adds time. You'd have to experiment to see which is faster. I would think the real issues would relate to the efficiency of the file system when it comes to iterating over all the little files, especially if they are all in one folder. Windows is notorious for being pretty inefficient when folders contain lots of files. I would consider doing something like writing them out into one file, uncompressed, that could be streamed into memory -- maybe not necessarily contiguous memory, as that might be a problem. But the idea would be to put them all in one file. Then write some kind of index that ties a file name or other identifier to an offset from which the location of the image in memory could be determined. New images could be added at the end, and the index updated appropriately. It isn't fancy but that's what you're trying to avoid. An archive or even a file system gives you lots of power and flexibility but at the cost of efficiency. When you know what you want to do, sometimes simple is better. I would consider implementing a solution that reads files from a folder, another that divides the files into subfolders and subsubfolders so there are no more than 100 or so files in any given folder, then time those solutions so you have something to compare to. I would think a simple indexed file would be fast enough that you wouldn't even need to pre-load the images like you're suggesting -- just retrieve them as you need them and keep them around once they're in memory.
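A minimal C# sketch of that single-file-plus-index idea (the format is hypothetical: the index maps an image name to its offset and length in one blob file; it only appends, and a real version would also persist the index to disk so it survives restarts):

    using System;
    using System.Collections.Generic;
    using System.IO;

    // One growing blob file plus an in-memory index of (offset, length) per name.
    public class ThumbnailStore : IDisposable
    {
        private readonly FileStream _blob;
        private readonly Dictionary<string, (long Offset, int Length)> _index =
            new Dictionary<string, (long, int)>();

        public ThumbnailStore(string path)
        {
            _blob = new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite);
        }

        // Append the JPEG bytes and remember where they landed.
        public void Add(string name, byte[] jpeg)
        {
            _blob.Seek(0, SeekOrigin.End);
            _index[name] = (_blob.Position, jpeg.Length);
            _blob.Write(jpeg, 0, jpeg.Length);
        }

        // Random access: seek to the recorded offset, read the recorded length.
        public byte[] Get(string name)
        {
            var (offset, length) = _index[name];
            var buffer = new byte[length];
            _blob.Seek(offset, SeekOrigin.Begin);
            _blob.Read(buffer, 0, length);
            return buffer;
        }

        public void Dispose()
        {
            _blob.Dispose();
        }
    }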
I am creating an application that requires a lot of image thumbnails (~3000, 5-25 KB each). Because speed is essential, I plan on loading these images into memory when the application starts. At runtime, new thumbnails will be downloaded and added to the collective. I could store them all in a folder, but reading thousands of files into memory when a program starts hardly seems efficient. My second option would be to save them in some kind of (compressed) archive. This would make storage itself and loading more efficient (I think). However, new files will be added regularly, and that will probably not go as smoothly as just saving them in a folder. Is storing a cache of small files in a (compressed) archive a bad idea or not? Are ZIP files the way to go? Would I be better off using uncompressed archives (and if so, what kind)? All image files will be JPEGs. Thanks in advance! EDIT: I am considering dropping the "load everything into memory on application start" idea. This simplifies my question a little. My initial idea to put everything in one big file now seems less beneficial, since the problem of many files in one directory can be solved by hashing into subdirectories.
Storing lots of small files: archive vs. filesystem
URLConnection supports caching, but there is no default cache implementation. See this article for more info: http://download.oracle.com/javase/1.5.0/docs/guide/net/http-cache.html If you plug in your own cache implementation, you can just use a regular URLConnection to download the item, and your cache will be used.
I'm working on this program that needs to download files from a network. The same file is downloaded many times, so we need to cache them to the local file system. Is there a java library that, given a cache directory, will manage the cache and the downloading of the resources? The library needs to have some kind of open source license. Thank you.
Java caching network files
The general way of doing this is generating a random number and adding it as a GET/POST variable. For example: http://example.com/myfile.html?r=189818273 Just my two cents...

Comment: That stops it getting cached, though, as the number would be different each time. I want it to be cached; I just want the ability to update it too. – Ian Chilton
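The comment points at the usual fix: derive the query string from the file's modification time (or a content hash) rather than a random value, so the URL stays stable (and cacheable) until the file actually changes. A small sketch, assuming an ASP.NET MVC site (the helper name is made up):

    using System;
    using System.IO;
    using System.Web;
    using System.Web.Hosting;

    public static class AssetVersion
    {
        // Stamp the asset URL with its last-write time: stable while the file
        // is unchanged (so it caches), different as soon as you deploy a new one.
        public static string Versioned(string virtualPath)
        {
            string physical = HostingEnvironment.MapPath(virtualPath);
            long stamp = File.GetLastWriteTimeUtc(physical).Ticks;
            return VirtualPathUtility.ToAbsolute(virtualPath) + "?v=" + stamp;
        }
    }

    // In a view: <script src="@AssetVersion.Versioned("~/scripts/site.js")"></script>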
I'm interested in how people deal with updating images/CSS/JS with regard to the browser cache. It's obviously good to use mod_expires and have a far-future expiry, but how can you then update those files if you don't want to rename the file all the time? Does anyone have any good tricks with a version number which will not risk the browser (or a proxy) refusing to cache the file, but will still guarantee the user sees the new version when it's updated?
Updating cached images/css/js without renaming
You could set up two cache zones in nginx with two proxy_cache_path directives in your nginx http block (see http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_cache_path for specifics), then refer to the defined zone names in proxy_cache directives in your location blocks. You can then clear just the cache for the YAML-generated stuff.
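A sketch of what that could look like (the paths, zone names, locations, and the `app` upstream are all illustrative):

    http {
        # Two independent cache zones, each backed by its own directory.
        proxy_cache_path /var/cache/nginx/static keys_zone=static:10m;
        proxy_cache_path /var/cache/nginx/yaml   keys_zone=yaml:10m;

        server {
            location /public/ {
                proxy_pass  http://app;      # 'app' upstream assumed to exist
                proxy_cache static;          # untouched when you clear the YAML cache
            }
            location /pages/ {
                proxy_pass  http://app;
                proxy_cache yaml;            # clear by wiping /var/cache/nginx/yaml
            }
        }
    }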
I have a modular Sinatra app running on nginx with Phusion Passenger. When I alter my app (and in particular, some YAML files which are used to generate pages), I'd like to be able to clear only the parts of my cache that are affected (and leave everything else in /public alone; I know I can just clean out the whole cache, but I was hoping not to). Thanks!
How can I selectively clear the cache for Sinatra + Nginx + Phusion Passenger?
It is possible if you're using IIS 7.x, though I'm not sure whether this will work for an MVC project. Basically all you have to do is add this to the system.webServer section in your web.config file:

    <caching>
      <profiles>
        <add extension=".aspx" policy="CacheForTimePeriod" kernelCachePolicy="DontCache"
             duration="00:00:05" varyByHeaders="host" />
      </profiles>
    </caching>

If you have access to IIS, you can click the "Output Cache" icon under your site and configure it with the GUI, but all that will do is update your web.config to something like the above code.
I'm trying to do something similar to this question, I have a multitenant application and want to configure the output cache to be per tenant. However I'd rather not have to use a custom OutputCacheAttribute or have an OutputCache profile and remember to use this everywhere. Is it possible to change the default OutputCache profile settings, adding the host to the VaryByHeader attribute?
Is it possible to modify the default outputcache settings
It is possible to use it, and there is no need to start a thread or the like. Sending class instances around requires a jar containing the message class in the Tomcat lib directory. Cheers, Michael
I'm currently working on migrating a web application to run in a cluster. This application uses caches, some of which are reloaded when the user saves something. I'd like to inform the other nodes of the cluster about this, so that all nodes refresh their caches. It seems that the Tomcat server has group messaging built in (Tribes). I'm wondering whether I can use this messaging for my task, and how to keep the event listener running all day. With kind regards, Michael
How to implement cache synchronization in tomcat 6.0 cluster environment?
Try adding a debug statement (console.log(), or even a simple alert() will do) at the beginning of the handler function to make sure that it is being invoked at all. Are you adding the click handler in a document.ready() handler? Also, the $.post function will do an AJAX POST but will not refresh the page, so you may not see anything happening at all. Try using Firebug / Chrome developer tools to examine outgoing requests to make sure. You may want to make the click() handler explicitly return a value too (true means continue processing the click, false means stop). Maybe that's the root of the problem: in Firefox the handler sometimes returns true, so the link is followed, while in Chrome it returns false, so the POST is executed but the link is not followed.

Comment: Yes, the click handler is in a document.ready() handler. I'm using Firebug and DevTools, and I've tried to add some logs: it looks like the function is invoked and runs to the end, with the right variable values. After the click, the page does refresh, in the chosen language with Firefox, but with Chrome it is in the same language as before. – dolma33
For the internationalization of my Django project I'm using Django's i18n, and I love it. For setting the language in the template, instead of using forms like in this example:

    <form action="{{site_url}}i18n/setlang/" method="post">
      <input name="next" type="hidden" value="" />
      <select name="language">
        {% for language in languages %}
        <option value="{{language.0}}">{{language.1}}</option>
        {% endfor %}
      </select>
      <input type="submit" value="Ok" />
    </form>

I would like to use simple plain-text links; something like this:

    {% for language in languages %}
      {% ifnotequal language.0 lang %}
        <a href="{{site_url}}i18n/setlang/">{{language.1}}</a>
      {% else %}
        {{language.1}}
      {% endifnotequal %}
      ...
    {% endfor %}

To let the previous template snippet do its work, I've created the following jQuery function:

    var languageLink = $('#language-choser > a');
    languageLink.click(function(e){
      var languageURL = languageLink.attr('href');
      var languageNow = languageLink.text();
      var lang = (languageNow=='English') ? 'en' : 'es';
      $.post(languageURL, {next: "", language:lang});
    });

This function works with Firefox but not with Chrome: Chrome will simply reload the page without changing the language. Can someone tell me what's wrong? I've been playing around with it for a long time without finding a way out. EDIT: It looks like it could be a caching problem. In my click function, should I clear the cached page? But how? Or should I disable browser caching for the whole site? I don't think so...
Django internationalization with i18n: choosing language in template using jQuery
The solution to complex reporting is to pre-calculate, so you're on the right path if you're looking at OLAP.
We have millions and millions of records in a SQL table, and we run really complex analytics on that data to generate reports. As the table grows and additional records are added, the computation time increases and the user has to wait a long time before the web page loads. We were thinking of using a distributed cache like AppFabric to load the data in memory when the application loads, and then running our reports off that in-memory data. This should improve the response time a little, since the data is now in memory rather than on disk. Before we take the plunge and implement this, I wanted to check what others are doing and what some of the best techniques and practices are for loading data into memory, caching, etc. Surely you don't just load an entire table with hundreds of millions of records into memory...?? I was also looking into OLAP / data warehousing, which might give us better performance than caching.
Data caching techniques / Tips / AppFabric
As of iOS 5.1 you can't save video or audio files to the application cache; this is true for both the iPhone and the iPad. Desktop Safari, however, can store and play offline video and audio files since version 5.1 of the browser. This is probably a precaution against consuming unnecessary network data (3G data is usually limited and expensive), because every file declared in the manifest gets downloaded and cached even if the user doesn't visit the portion of the site having that content. In my opinion, if the user has Wi-Fi enabled, Safari on iOS should be able to cache video and audio files.
I have seen a few (discouraging) questions related to this subject, but I am still not clear on the answer. Is it possible to cache video content for immediate playback in an offline web app on the iPhone or iPad? (I believe there is a 5 MB limit for any cached file.) Can videos be cached the same way other files can, using the manifest? Are there alternatives?
cache video content for offline iphone/ipad web apps in mobile safari (html5)?
Files listed for caching in the manifest will always be served from the application cache, whether you're online or offline. The browser will always look first in the application cache for any resource requested from a page covered by the manifest, hence the terminology "bypass the cache". The network whitelist can then be seen as a set of files for which the browser will skip the step where it checks the application cache for the resource. The only way to expire items in the application cache is to change your manifest file. Future expiration of files in the application cache is not possible; you will always be dependent on the user connecting to your website after you've updated your manifest file.
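For concreteness, a small example manifest (the file names are illustrative). Everything under CACHE: is served from the application cache; everything under NETWORK: always goes to the server. There is no declarative expiry syntax, so "expire at midnight on December 31, 2010" is only achievable by pushing an updated manifest at that time:

    CACHE MANIFEST
    # v2010-12-31 -- change this comment to force every cached file to re-download

    CACHE:
    index.html
    app.js
    style.css

    NETWORK:
    api/live-data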
It's not clear to me from the descriptions of the cache manifest that I've read (e.g. http://www.w3.org/TR/offline-webapps/#offline and elsewhere) what this file does. I'll explain what I find to be unclear. The heading name ("Offline Web Applications") suggests that the cache manifest is relevant only for offline scenarios; the cache manifest is defined as "The mechanism for ensuring Web applications are available even when the user is not connected to their network" But does the cache-manifest have any implications for online use? It seems so. The file contains a NETWORK section, and the files listed there (sometimes I've seen it described as the last file listed there) do not go into the cache; they "...should never be cached, so that any attempt to access that file will bypass the cache." What would "bypass the cache" mean in an offline context? But if the user is online, are files listed in the NETWORK section always obtained from the server, even if they had been previously cached? The files added to the NETWORK section are said to be "white-listed". Normally, to whitelist something means to allow it. Actually, these files are being black-listed -- i.e. not allowed into the cache. This page is being copied verbatim or paraphrased lemming-like in many other documents, so the original's imperfect clarity is being perpetuated. So, my question: could someone please provide an authoritative, clear, and concise definition of the purposes the cache-manifest serves, giving examples of how one would set cache-expiration policies, such as expiring content at midnight on December 31, 2010. Is future-expiration even possible to do, declaratively, in HTML5? Thanks
HTML5 cache manifest: what exactly does it do? the documentation is unclear
You can specify the directory by adding the following to your environment.rb config block:

    config.cache_store = :file_store, "#{RAILS_ROOT}/public/projects"

A couple of other things: the cache settings have changed some across Rails versions, so it would be helpful to know which version you are running. In general, you're probably better off using the memory cache store instead of the file store (if you have sufficient RAM in your box).

Comment: I'm using Rails 2.3.5. I might give memcaching a go, but RAM is likely to be tight, and I trust Apache to serve my small, low-traffic site efficiently. However, I'm not trying to change the cache root, just the caching behaviour of the index action. – Andy
I have a controller Projects in my Rails app with:

    caches_page :index

However, instead of the cached file being generated at /public/projects/index.html, it is located at /public/projects.html. The web server (currently Mongrel) looks for */ directories before *.html files, so the http://…/projects request is routed through Rails and my index cache file is never served. How can I tell caches_page :index to generate the file at /public/projects/index.html instead?
Rails caches_page :index in Wrong Location
Trying to do the same. The best I have so far is this...

    [TestCase]
    public void ResponseNotFromCache()
    {
        System.Net.WebRequest rq = System.Net.WebRequest.Create("testmethod");
        System.Net.HttpWebResponse rs = rq.GetResponse() as System.Net.HttpWebResponse;
        Assert.IsFalse(rs.IsFromCache);
    }

There must be a better way! Update: see "How can I test an event of a MVC controller".
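A more direct route is to invoke the filter method itself against mocked context objects and verify the cache-policy calls. A sketch assuming NUnit and Moq, where BaseController is the base controller from the question (MVC's context classes expose parameterless constructors and virtual members precisely so they can be mocked like this):

    using System.Web;
    using System.Web.Mvc;
    using Moq;
    using NUnit.Framework;

    [TestFixture]
    public class BaseControllerCachingTests
    {
        [Test]
        public void OnActionExecuting_DisablesCaching()
        {
            // Arrange: a mock cache policy behind a mock response/context chain.
            var cachePolicy = new Mock<HttpCachePolicyBase>();
            var response = new Mock<HttpResponseBase>();
            response.SetupGet(r => r.Cache).Returns(cachePolicy.Object);
            var httpContext = new Mock<HttpContextBase>();
            httpContext.SetupGet(c => c.Response).Returns(response.Object);
            var filterContext = new Mock<ActionExecutingContext>();
            filterContext.SetupGet(f => f.HttpContext).Returns(httpContext.Object);

            // Act: exercise the protected override through a test subclass.
            new TestableController().InvokeOnActionExecuting(filterContext.Object);

            // Assert: the no-cache calls were made.
            cachePolicy.Verify(c => c.SetCacheability(HttpCacheability.NoCache));
            cachePolicy.Verify(c => c.SetNoStore());
        }

        private class TestableController : BaseController
        {
            public void InvokeOnActionExecuting(ActionExecutingContext ctx)
            {
                OnActionExecuting(ctx);
            }
        }
    }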
Scenario: I have a base controller which disables caching within the OnActionExecuting override:

    protected override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        filterContext.HttpContext.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(-1));
        filterContext.HttpContext.Response.Cache.SetValidUntilExpires(false);
        filterContext.HttpContext.Response.Cache.SetRevalidation(HttpCacheRevalidation.AllCaches);
        filterContext.HttpContext.Response.Cache.SetCacheability(HttpCacheability.NoCache); // IE
        filterContext.HttpContext.Response.Cache.SetNoStore(); // Firefox
    }

How can I create a unit test to test this behavior?
Help create a unit test for response headers, specifically Cache-Control, to determine whether caching has been disabled
If you're looking to do something outside the bounds of the functionality available in MGTwitterEngine, you'll probably have to use the raw Twitter API; try this page for some help. Caching should probably be done in NSCachesDirectory though. It may have a longer lifetime than NSTemporaryDirectory and is the recommended place to store cached data. Efficiency, in this case, probably means downloading once, storing the image so you can easily identify it later, determining whether it exists, and determining whether a download is needed or not.
What is the most efficient way to get and cache a profile image for a user using MGTwitter? Our problem currently is that there must be a call to getUserInformationFor to get the url of the image, then getImageAtURL resulting in two calls to the server. Currently, we just need the image information, so its redundant to have to download all the other information. When you bear in mind that we might do 20-30 of these calls at once (e.g. to get a list of user profile images), this become quite slow. Secondly, what is the most efficient way to cache that image so that it doesn't have to be downloaded every time (we don't mind assuming that the profile image is never going to change), currently we're just writing to NSTemporaryDirectory() with the Twitter username as the filename, and then for each Twitter request checking if the filename exists in that directory. Is there a better, more efficient approach?
MGTwitterEngine - efficient way to get and cache profile images
What do you mean by "updating the cache"? Are you concerned that users can modify the List returned by methodWithCaching()? If so, I would propose that the method return an unmodifiable collection. Or perhaps the cache could detect that the result is a Collection and wrap it with an unmodifiable wrapper.

Comments:
I want the users to do whatever they want with the list. But imagine this scenario: method invocation (1), cache miss, user updates the list and the cache gets updated too. Method invocation (2), cache hit (now the list contains 1 element), user updates the list and the cache gets updated too. Method invocation (3), cache hit (now the list contains 2 elements), and so on. – Pablo Fernandez
Do you want the user to have a local copy of the cached object and not a shared instance? – Kevin
That's right, and possibly without modifying the client, just the interceptor. – Pablo Fernandez
I would try to implement something using Cloneable as akhnaten suggested. You need to either return a copy of the data or a different data structure altogether. – Kevin
I can't. Implementing Cloneable does not bring the clone() method to public visibility; I would have to implement that myself in every single object I want to cache. – Pablo Fernandez
I have an application that uses ehcache for caching (but I think this problem is framework-agnostic), with a method interceptor, so basically if I mark my method for caching something like this happens:

    public Object invoke(MethodInvocation mi) throws Throwable {
        Object result = cache.get(key); // key comes from MethodInvocation processing
        if (result == null) {
            result = mi.proceed();
            cache.put(key, result);
        }
        return result;
    }

So far so good. The thing is, I'm caching a method that returns a List, and it gets called like this:

    List<Object> result = methodWithCaching();
    result.add(new Object()); // !

As you can imagine, the line marked with ! also updates the cached instance, and this is not what I want. Can someone think of a way to stop this behavior without modifying the client, only the interceptor?
Cache integrity problem
After some additional Googling, it looks like I might just need to create a web.config in my Scripts and Images folders (and any others I'd like to cache):

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <staticContent>
          <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
        </staticContent>
      </system.webServer>
    </configuration>

Does this seem right? Did I miss something?
We are creating a large secure back office web application in ASP.NET. All access to the site is over https connections, and we'd like to either turn off caching for pages or set caches to expire quickly. However, the site uses quite a few images and largish javascript files/libraries. Is there a way to selectively cache certain files or file types so they are not being reloaded all the time?
Selectively cache .js and .png files over https?
A local database really would be a nice way to handle this, especially because queries would aid your investigation of the logs. Plus, your UDP-receiving program could just be a separate thread that spits information at the database (if your data is REALLY fast-paced, you could have two buffers and alternate between them: flush the full buffer to the database while the other one is filling up). It really depends on the scale of your project, though. You could always use your third option (storing to a file right away) and have a separate "log investigation" tool that reads that file in without running into OOM exceptions.
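A sketch of the two-buffer idea in C# (the record type, flush target, and timing are illustrative): the writer thread swaps buffers under a short lock, so the UDP receive thread never blocks on I/O.

    using System.Collections.Generic;
    using System.IO;

    public class DoubleBufferedLog
    {
        private readonly object _gate = new object();
        private List<byte[]> _active = new List<byte[]>();
        private List<byte[]> _flushing = new List<byte[]>();

        // Called from the UDP receive thread: just queue the record.
        public void Append(byte[] record)
        {
            lock (_gate) _active.Add(record);
        }

        // Called periodically from a writer thread: swap buffers and flush.
        public void Flush(Stream destination)
        {
            lock (_gate)
            {
                var tmp = _active;      // swap under the lock...
                _active = _flushing;
                _flushing = tmp;
            }
            foreach (var record in _flushing)   // ...write outside of it
                destination.Write(record, 0, record.Length);
            destination.Flush();
            _flushing.Clear();
        }
    }

A timer or dedicated thread calling Flush once a second keeps memory bounded to roughly one second of traffic, about 2 MB at the stated worst case.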
I've written an application that logs trace data from an embedded system via UDP. Currently I receive datagrams, parse out the variable-length records, and store them in a list. The front end can access the list and present the data (graphs, text lists, etc.). The problem I'm running into is that sometimes I need to log an exceptional amount of data, so much that my list implementation causes an out-of-memory exception. My requirements are:

- Allow multithreaded reading and writing of the data (can't just post-process)
- Handle large amounts of data (worst case ~2 MB/s ... 7.2 GB/hr of logging)
- Allow storage of the data set
- Random, index-based read access

Does anyone have suggestions about how to attack this? Here are a few thoughts I had:

- I'd like a nifty disk-backed, memory-cached list. It seems like something that should exist, but I haven't found one.
- A local database? I don't know too much about databases, but it seems like overkill.
- Store the data to a file right away, and keep a list in memory that holds the byte offset of each record index. Can my reader thread access this simultaneously?
C# high speed data logging data handling
There's no plugin that provides such a feature; "selective caching" has to be implemented on the server side. Check this link: Squid Cache can solve your problem. Also, you can write the following:

    <script type="text/javascript" src="scripts/ext.js"></script>
    <script type="text/javascript" src="scripts/custom_script.js?<?php echo time(); ?>"></script>

The second JS file won't be cached. Technically, the browser will cache a different version of the file each time, so you'll have the latest version every time you refresh the page. The ExtJS file will be cached. If you want to use HTML5, this solution will let you choose which files should be cached and which should be requested from the server: http://gregsramblings.com/2012/05/28/html5-application-cache-how-to/
Is there a plugin or method for performing selective caching in Firefox? I can disable caching entirely, but I'd like to still be able to cache some large JavaScript libraries (ExtJS) which take several seconds to load.
Selective caching in Firefox
The last service pack of VS 2008 seems to have magically solved my issue, although I see no note of it.
For the last couple of months I have been having some issues with my app.config. I will add an AppSettings key and run my project with no real issue; it reads the config file and all is good. Then at a later date I will change the value of that same key, and when I run my project I will get the old value of the key. It seems like it isn't saving the file properly or detecting that there was a change in the app.config file. If I clean or rebuild my solution it is fine. Has anyone else seen this issue? Is there a hotfix? This is a real problem to have to double-check all the time, especially when the keys are usually the difference between production and testing environments. Imagine my surprise when I started publishing test messages to my production environment. Scary. Thanks in advance
Visual Studio 2008 App.config Caching
It turns out that calling nsICacheSession.openCacheEntry() with ACCESS_READ_WRITE will create the cache entry.
IE has the WinInet API, with functions such as GetUrlCacheEntryInfo, to read and manipulate the IE browser cache. Is there a similar API for non-IE browsers such as Firefox or Chrome? If so, where can I get more info? Thanks. Update: According to the following (Accessing Firefox cache from an XPCOM component), what the WinInet function GetUrlCacheEntryInfo() does can be accomplished with nsICacheSession.openCacheEntry() to get an nsICacheEntryDescriptor. Is there an equivalent of the WinInet function CreateUrlCacheEntry() which will create a cache entry?
Browser Cache API for non IE browsers
I'm making some assumptions here for this answer... I'm assuming the client app is using one of the .NET caching classes to store your application's options. When you say 'flush', do you mean flush them back to a configuration file or DB table? Because the cache objects and data won't be shared between processes, you need a mechanism to signal to the code running in the other worker process that it needs to re-read its options into its cache, or you must force the process to restart (which is not exactly convenient and most likely undesirable). If you don't have access to the client source to modify it to watch the options config file or DB table (say using a SqlCacheDependency), I think you're kind of stuck with this behaviour.
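A sketch of the file-watch variant, assuming you can change how the Client app inserts options into its cache (the trigger file path and key scheme are made up): every worker process inserts the entry with a dependency on the same file, and rewriting that file evicts the entry in all of them at once.

    using System;
    using System.IO;
    using System.Web;
    using System.Web.Caching;

    public static class OptionsCache
    {
        private static readonly string TriggerFile =
            HttpRuntime.AppDomainAppPath + @"App_Data\options.trigger";

        public static object GetOptions(string accountId, Func<object> load)
        {
            string key = "options:" + accountId;
            object cached = HttpRuntime.Cache[key];
            if (cached != null) return cached;

            object options = load();
            // Every worker process ties the entry to the same trigger file.
            HttpRuntime.Cache.Insert(key, options, new CacheDependency(TriggerFile));
            return options;
        }

        // Called by the web service the Admin app hits after saving options:
        // rewriting the file invalidates the entry in ALL worker processes.
        public static void FlushAll()
        {
            File.WriteAllText(TriggerFile, DateTime.UtcNow.ToString("O"));
        }
    }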
OK, strange setup, strange question. We've got a Client and an Admin web application for our SaaS app, running on ASP.NET 2.0 / IIS 6. The Admin application can change options displayed in the Client application. When those options are saved in the Admin app, we call a web service on the Client app to flush our cache of the options for that specific account. Recently we started giving our Client application more than one worker process, which causes the cache of options to be cleared on only one of the currently running worker processes. So, I obviously have other avenues for fixing this problem (input is appreciated), but my question is: is there any way to target/iterate through each worker process via a web request?
Target IIS Worker Processes on Request
FIX:

    // Opt out of caching for an individual fetch request
    fetch(`https://...`, { cache: 'no-store' })

    // Opt out of caching for all data requests in the route segment
    export const dynamic = 'force-dynamic'

See https://nextjs.org/docs/app/building-your-application/caching
With Next.js, the cache limit for a fetch is 2 MB. However, some of my fetches are larger than 2 MB, and the data is therefore not cached, which is very annoying because I make calls to my API again when I navigate to my component via Link (dynamic routes). The fetch is called as a preload, but the API is also called a second time when the component itself renders, for all the routes concerned where I need this data (so it makes 2 calls to my API instead of 0 for every such route). So I have hundreds of calls to the API because of the cache limit... Do you have a solution to my problem? Maybe use other fetching methods? Thanks, everyone. I tried to reduce the size of my fetched JSON, but that's not a good solution in my case.
Next.js fetch cache max size problem, too many API calls
On my side, I had the same issue, but I realized that when I need to reload my extension so that it updates its service worker, I have to close the tabs that are running the extension first. I didn't have to do this before, but for some reason this is how it works on my side. I don't have a source that says what I just said is correct; this is just from experience, and I would be happy to know if doing this worked for you too. My guess is that the service worker is inactive when the extension is not being used, and therefore that's the only time you can reload it. Again, I have no proof other than experience.
I'm working on a Chrome extension in React that communicates with Firebase. My workflow: make some changes, run the build script, and then press the reload button on the extension's card on the Chrome extensions developer page. Everything I change on the frontend shows up right away, but changes to background.js are not reflected in the files the extension is serving. I initially was trying to add new fields to my Firebase collections by adding key/value pairs sent with the updateDoc() method. I thought there was an issue with Firebase until I followed an error in the console to the rendered background.js file and did not see any of my changes. Then I tried putting more console logs in the file, including at the beginning, directly under another console log that does print, and none of the new logs print. Is the background.js file being stored in cache, preventing me from updating it, similar to PWA version control? Thank you
When reloading my unpacked chrome extension, background.js is not updating to the latest version
I think your issue can be solved using dash-extensions, specifically its server-side callback caches; it might be worth a shot to implement: https://community.plotly.com/t/show-and-tell-server-side-caching/42854

Comment: This is no longer available. – Sam
I have a problem with my Dash application deployed on a remote office server. When two users run the app, they experience interactions with each other due to a table import followed by table pricing (the pricing code is around 10,000 lines and produces 8 tables). While looking on the internet, I saw that to solve this problem it should be enough to create an html.Div holding the dataframes converted to JSON. However, this solution is not workable because I have to store 9 tables totaling 200,000 rows and 500 columns. So I looked into the cache solution. That option does not create errors, but it increases the execution time of the program considerably: going from a table of 20,000 vehicles to 200,000 increases the compute time by almost 1,000x, and it is horrible every time I change the graph settings. I use the filesystem cache and followed example 4 of this: https://dash.plotly.com/sharing-data-between-callbacks. By timing things, I noticed that it is not accessing the cache that is the problem (about 1 second) but converting the JSON tables back to dataframes (almost 60 seconds per callback). About 60 seconds is also the time the pricing itself takes, so calling the cache in a callback costs the same as pricing in a callback.

1/ Do you have an approach that would cache a dataframe rather than JSON, whether via the invisible html.Div trick, a cookie system, or any other method?
2/ With Redis or Memcached, do we have to return JSON? If so, how do we set it up, following example 4 from the previous link? I get the error "redis.exceptions.ConnectionError: Error 10061 connecting to localhost:6379. No connection could be made because the target machine actively refused it."
3/ Do you also know whether shutting down the application automatically deletes the cache without waiting for the default_timeout?
How to store BIG DATA as global variables in Dash Python?
From the documentation and source code, we can find:

Response Caching Middleware determines when responses are cacheable, stores responses, and serves responses from cache.

The ResponseCache attribute specifies the parameters necessary for setting appropriate headers in response caching. It is used to configure and create (via IFilterFactory) a ResponseCacheFilter. The ResponseCacheFilter performs the work of updating the appropriate HTTP headers and features of the response. The filter:

- Removes any existing headers for Vary, Cache-Control, and Pragma.
- Writes out the appropriate headers based on the properties set in the ResponseCacheAttribute.
- Updates the response caching HTTP feature if VaryByQueryKeys is set.

For more information, please check the ResponseCachingMiddleware source code and the ResponseCacheAttribute source code.
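In other words, they cooperate: the attribute writes the cache headers, and the middleware (if registered) uses those headers to store responses and serve them from the server-side cache before the request reaches MVC. A minimal sketch wiring both up (minimal-hosting style; the route and durations are illustrative):

    using System;
    using Microsoft.AspNetCore.Builder;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Extensions.DependencyInjection;

    var builder = WebApplication.CreateBuilder(args);
    builder.Services.AddControllers();
    builder.Services.AddResponseCaching();   // server-side cache store

    var app = builder.Build();
    app.UseResponseCaching();                // serves cacheable responses before MVC runs
    app.MapControllers();
    app.Run();

    public class TimeController : ControllerBase
    {
        // The attribute only emits Cache-Control: public,max-age=30 (plus Vary info);
        // without UseResponseCaching, caching happens only in clients and proxies.
        // Note that VaryByQueryKeys actually requires the middleware to be registered.
        [HttpGet("/time")]
        [ResponseCache(Duration = 30, VaryByQueryKeys = new[] { "tz" })]
        public string Get(string tz) => $"{tz}: {DateTime.UtcNow:O}";
    }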
I know the ResponseCache attribute can cache a page on the client side through the Cache-Control HTTP header. And the ResponseCaching middleware caches pages on the server (with the same HTTP headers as the ResponseCache attribute). Comparing these, there seems to be no difference: same features, same conditions. Is server-side caching really no different from the ResponseCache attribute? Do they both keep the request from reaching the controller action, or do they have different request pipelines? So, in what kinds of scenarios would you choose the ResponseCache middleware versus the ResponseCache attribute?
What's different between "ResponseCache attribute" and "ResponseCache middleware" in asp.net core?
-1 If you are using SimpleCache, then you can retrieve all URLs as cache keys like below: getCache(context).getKeys(); This returns all the URLs whose data is cached in your app cache. If you want the video itself to be cached, see Google's ExoPlayer video from 2018: https://www.youtube.com/watch?v=svdq1BWl4r8. (edited Dec 17, 2019 by Prabindh; answered Dec 17, 2019 by Waqas Yousaf)
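For the caching side of the question, a minimal sketch using ExoPlayer 2.x APIs (the cache directory, size, and user agent are placeholders): media read through the factory below fills a disk cache, so the same Uri later plays without the network.

import android.content.Context;
import com.google.android.exoplayer2.upstream.DataSource;
import com.google.android.exoplayer2.upstream.DefaultDataSourceFactory;
import com.google.android.exoplayer2.upstream.cache.CacheDataSourceFactory;
import com.google.android.exoplayer2.upstream.cache.LeastRecentlyUsedCacheEvictor;
import com.google.android.exoplayer2.upstream.cache.SimpleCache;
import java.io.File;

DataSource.Factory buildCachingFactory(Context context) {
    // One SimpleCache per directory for the whole app; it locks the folder.
    SimpleCache cache = new SimpleCache(
            new File(context.getCacheDir(), "media"),               // placeholder dir
            new LeastRecentlyUsedCacheEvictor(100L * 1024 * 1024)); // 100 MB cap

    DataSource.Factory upstream =
            new DefaultDataSourceFactory(context, "example-agent");

    // Reads go through the cache: misses are fetched upstream and
    // written to disk as they stream.
    return new CacheDataSourceFactory(cache, upstream);
}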
I have a question about ExoPlayer and caching. I have been struggling with this for 2 days already and searched the internet, but I can't assemble all the information into one picture; I don't even know whether it is possible. So my question is: using ExoPlayer components like CacheDataSource, CacheUtil, etc., can a video be cached and then the Uri of that video retrieved? I tried with DataSpec, CacheUtil, and SimpleCache independently of ExoPlayer, but didn't find a way to do this. I don't know whether this is a simple question or not possible at all. Any help would be appreciated.
Get uri from cached video ExoPlayer
-1 By default, most browsers cache images, styles and scripts automatically. The easiest way to bypass this for development environments is to set the caching headers detailed here. Another common way to bypass caching is to set a random query parameter (usually ?v=<random value here>). Chromium based browsers also have a disable cache option in the dev tools. (answered May 27, 2019 by user3674603)
I am facing a weird issue with file uploads. When I upload a new file to a publicly visible folder, I can see it instantly in anonymous mode. But if I try to access it in non-anonymous mode, the server responds with 404 unless I do a hard refresh (i.e. Ctrl+F5 in Mozilla). I have already disabled cache-control headers for that folder in Apache, but that did not resolve the issue. It seems to me that Apache is storing the information that "there is actually no file at the requested URL" and serves that to the user unless the user clears the cache, even after the file has been uploaded to that location. Has anyone run into a similar issue in the past?
Uploaded file is not visible in browser unless I force no cache browser reload
0 It appears to be C:\Users\%username%\AppData\Local\Google\Chrome\User Data\Default\Cache\data_3. The DNS cache is definitely in there, and there might be some other data as well, so be careful. (answered Nov 30, 2020 by Severin Paar)
In Windows there are two DNS cache repositories: the operating system and the browser. Flushing the operating system DNS does not clear the browser DNS cache (running ipconfig /flushdns does not remove entries from Chrome's DNS cache). I understand you can view and clear the Google Chrome DNS cache by navigating to chrome://net-internals/#dns in the browser. Since this data is persistent (if you close the browser and re-open it, the data populates again), it must be written to disk. Which file contains this data? Use-case: I want to access a remote client's Chrome DNS cache to detect users running proxies or tunnels (SOCKS5 will use the proxy's DNS, so a significant difference between the OS DNS cache and the browser cache will indicate proxy/tunnel usage). As a bonus I'd also like to know where Firefox and Edge/Internet Explorer store this data. It's not as easy to view because Firefox doesn't have the convenient net-internals page.
Which files contain DNS cache for Google Chrome?
0 Normally the browser caches image, JS and CSS files. If you are using a fixed image path or URL for the image src, then you should add a random nonce (usually a random number) to the image URL every time, so your image path becomes something like: <img src='//path/to/foo.img?1234567'> This ensures that the previous, cached image doesn't load and a new image is always fetched from the server, no matter how often the page is refreshed. (answered Aug 2, 2017 by Ataur Rahman Munna. Comment: "This is not what I am looking for. My issue isn't that I don't know how to force the browser to fetch the latest version of a file. My issue is that I need to prefetch the new version of a file asynchronously so that when the page is refreshed, the new file is retrieved from the cache instead of being fetched over the network." - kjh)
This is a question about browsers in general, but I'm primarily concerned with Chrome. Let's say I have the following snippet in a file, index.html: <img src='//path/to/foo.img'></img> and foo.img changes on my server every hour. I want to prefetch this image on the hour so that when the user refreshes the page, the updated image //path/to/foo.img is read from the browser's HTTP cache. There are a few things I'm uncertain about: 1. Are the responses for XHRs cached at all by default? 2. If so, do they use a separate cache from the one the browser uses when fetching things like img, css, js, etc.? 3. If the answer to #2 is no, is it sufficient to send an XHR for //path/to/foo.img in order to cause the response to be cached, and then re-used by the browser when the page is refreshed?
Is a browser's HTTP cache ever used to store XMLHttpRequest responses?
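A sketch of the prefetch the asker describes, using the newer fetch API (XHR behaves similarly, since both go through the browser's HTTP cache). The cache: 'reload' mode bypasses the cached copy, fetches from the network, and stores the fresh response in the HTTP cache for the next page load; this only helps if the server sends cacheable headers for the image.

// Re-fetch the image over the network and overwrite the HTTP cache entry,
// so a subsequent page refresh serves the new version from cache.
function warmCache(url) {
  return fetch(url, { cache: 'reload' });
}

// Hypothetical hourly schedule matching the server's update cadence.
setInterval(function () { warmCache('//path/to/foo.img'); }, 60 * 60 * 1000);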
-1 You can use a crontab job that looks for a specific file extension and deletes those files. You can even filter based on time and leave the files created in the last n minutes. If you are OK with this, let me know and I will add more details here. (edited Jul 16, 2017 by macfij; answered Jul 14, 2017 by Mohammed Salauddin. Comment: "That's a fairly terrible solution which involves polling the file system regularly to see if it's full. This is functionality which should be triggered by the file system becoming full, at the kernel level." - Philip Withnall)
When writing software which needs to cache data on disk, is there a way in libc, or a way which is specific to a certain file system (such as ext4), to create a file and flag it as suitable to be deleted automatically (by the kernel) if the partition becomes almost full? There’s something similar for memory pages: madvise(…, MADV_FREE). Some systems achieve this by writing a daemon which monitors the partition fullness, and which manually deletes certain pre-determined paths once it exceeds a certain fill level. I’d like to avoid this if possible, as it’s not very scalable: each application would have to notify the daemon of new cache paths as they are created, which may be frequently. If this were in-kernel, a single flag could be held on each inode indicating whether it’s a cache file. Having a standardised daemon for this would be acceptable as well. At the moment it seems like different major systems integrators all invent their own.
Create a cache file which is automatically deleted when partition is nearly full in libc (all file systems) or ext4?
-1 They are stored under %USERPROFILE%\AppData\Local\ in Windows. Enter %TEMP% in the Run dialog box to visit the folder. (answered Feb 23, 2016 by Rubysmith)
I have a few questions: Where are PDF files saved when opened in the browser? When I open a PDF in Chrome, is it possible to access that file somewhere on my PC?
Where are pdf files saved when opened in the browser?
0 You could add a kind of version number, for example like this: _worker = new Worker('js/toy-cpu.js?v=' + new Date().getTime()); (answered Oct 26, 2015 by trincot)
This is incredibly annoying. I was wondering why my changes weren't reflected, when I noticed that the JavaScript file for my Web Worker always gets loaded from cache. I have disabled the cache, and hitting Ctrl+F5 does not work either. How can I make sure that this file does not get loaded from cache? _worker = new Worker('js/toy-cpu.js');
Web Worker: How to prevent that file gets loaded from cache?
After a long search I found the solution. The problem was: replace: true, After I removed this line the problem was solved.
I created a directive with a dynamic template and it is working well. The problem is that I am getting an error. In Chrome: TypeError: undefined is not a function at forEach.attr. In Firefox: Error: element.setAttribute is not a function. Here is the code:

return {
    restrict: 'E',
    replace: true,
    scope: {
        content: '@'
    },
    controller: function($scope) {
        $scope.getTemplateUrl = function() {
            if ($scope.content.match(/<img/i) && !$scope.content.match(/icon-subs/i)) {
                return 'app/templates/_image.html';
            } else if ($scope.content.match(/<a/i)) {
                return 'app/templates/_link.html';
            } else if ($scope.content.match(/<iframe/i)) {
                return 'app/templates/_video.html';
            } else {
                return 'app/templates/_minutes.html';
            }
        }
    },
    template: '<div ng-include="getTemplateUrl()"></div>'
};

The directive tag:

<div ng-repeat="lance in contentsArray">
    <icone-timeline data-content="{{lance.content}}"></icone-timeline>
    <div ng-bind-html="lance.content"></div>
</div>
Angular - Directive with dynamic template
0 Try this, works for me.

<?php
namespace YourBundle;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpKernel\KernelEvents;
use Symfony\Component\HttpKernel\Event\FilterResponseEvent;

class KernelSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents()
    {
        return array(
            KernelEvents::RESPONSE => array(
                array('clearBrowserCache', 434255),
            ),
        );
    }

    public function clearBrowserCache(FilterResponseEvent $event)
    {
        $response = $event->getResponse();
        $response->headers->addCacheControlDirective('no-cache', true);
        $response->headers->addCacheControlDirective('max-age', 0);
        $response->headers->addCacheControlDirective('must-revalidate', true);
        $response->headers->addCacheControlDirective('no-store', true);
    }
}

services.yml
kernel_subscriber:
    class: YourBundle\KernelSubscriber
    tags:
        - { name: kernel.event_subscriber }

(answered Jan 10, 2017 by Yuriy Yakubskiy)
I am trying to kill the browser cache when the user logs out. I implemented the LogoutSuccessHandlerInterface to extend the onLogoutSuccess method. There is no error, but when I log out I can press the back button in the browser and still see my profile page. If I refresh this page, I am automatically redirected, so I am correctly logged out.

security.yml
logout:
    path: /logout
    target: /
    invalidate_session: true
    success_handler: project_user.handler.logout_handler

services.yml
project_user.handler.logout_handler:
    class: Project\UserBundle\Handler\LogoutHandler

Project/UserBundle/Handler/LogoutHandler.php
<?php
namespace Project\UserBundle\Handler;

use Symfony\Component\Security\Http\Logout\LogoutSuccessHandlerInterface;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\RedirectResponse;

class LogoutHandler implements LogoutSuccessHandlerInterface
{
    public function onLogoutSuccess( Request $request )
    {
        $response = new RedirectResponse( '/' );
        $response->headers->addCacheControlDirective( 'no-cache', true );
        $response->headers->addCacheControlDirective( 'max-age', 0 );
        $response->headers->addCacheControlDirective( 'must-revalidate', true );
        $response->headers->addCacheControlDirective( 'no-store', true );
        return $response;
    }
}

I tried this solution and it works perfectly, but that method is called for every request (many times per page) and causes slowdowns. Please help! Thanks
Symfony2 - Logout and clear cache + prevent back button
I resolved it by not adding the Google Maps API to the app cache.
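For anyone hitting the same errors: by default, an Application Cache manifest blocks every resource it does not list, which is why the Maps API fails to load when left out. The usual fix is a NETWORK whitelist so online-only resources can still be fetched. A minimal sketch (file names are placeholders):

CACHE MANIFEST
# files to cache for offline use
index.html
app.js

NETWORK:
# everything not listed above (including maps.google.com and
# maps.gstatic.com) must be fetched from the network
*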
I am creating an HTML5 mobile application and right now I have a problem with app caching. In my application I show a Google map on my pages, but I do not want to add the Google API to the app cache. However, when I do not add the Google API to the cache manifest file, the application doesn't run; Chrome shows me errors that it cannot get: http://maps.gstatic.com/cat_js/intl/en_us/mapfiles/api-3/9/8/%7Bcommon,util%7D.js and http://maps.google.com/maps/api/jsv2/AuthenticationService.Authenticate?1shttp%3A%2F%2Flocalhost%2FmmtFinal%2Ftrunk%2F&4sAIzaSyA4H0NCYZ49_bwl9AkwViEGzU3gEen7-4I&5e0&callback=_xdc_._0h4avmibb&token=15142 I am clueless right now; please tell me what I am doing wrong. Thanks
App cache with google map API
-1 Hopefully the comments answered your questions about caching in jQuery. As they said, your code seems good, so it's probably an issue with the browser. If you want to display a loading image while your page loads, you can add a loading image at the beginning of your page:

<div id="loader">
    <img src="loader.gif" alt="Loading..." />
</div>

Then style it using CSS, something like:

#loader {
    z-index: 100;
    position: fixed;
    top: 50%;
    left: 50%;
    margin-left: -10px; /* or half the width of your loader image */
}

And then add the jQuery to hide it after your page finishes loading:

jQuery(window).load(function() {
    jQuery('#loader').hide();
});

(answered Mar 21, 2011 by seangeng. Comment: "This answer didn't have anything to do with the question." - android.nick)
I know that you can cache selectors in jQuery/JavaScript by using "var = $xxx". I am already doing this with all selectors that will be used more than once. Problem: the JavaScript animation is slow the first time the visitor activates the function by clicking. The next time they click, it works without any hesitation. Since I don't know much about JavaScript, I wonder if this is because of A or B below. A: The browser caches the selector only once the visitor has clicked. B: The browser remembers the function/animation. Question if A is true: Is there a way to cache all selectors before the click functions? Is there a way to make the browser remember the cached selectors until the next time they visit the site? Question if B is true: Can I somehow cache the functions in JavaScript? Or can I run all the functions when visitors arrive (for example, first pop up a loading div with z-index 10000 and run all functions behind it)? Here's some example code: $(document).ready(function(){ var $selector1 = $('#div1'), $selector2 = $('#div2'); $selector1.click(function(){ $selector2.animate({height:'toggle'},350) }); }); Sorry for my bad English.
Speed up JavaScript/jQuery function
-1 As suggested in the question, timestamping the url helps. I used: var url = "https://api.myurl.com/" + param1 + "?" + new Date().getTime() (answered Feb 19, 2016 by Marek)
I'm working on an app using Visual Studio 2015 Cordova tools on Windows 8.1. The target is also Windows 8.1. The app is caching HTTP GET requests, so a second GET request to the same resource returns a cached response. I have tested after disabling the network adapter and I still get a response with the cached results. I am using the jsforce library to connect to salesforce.com. I know I can add a timestamp to the url, but I would like to find a fix, not a workaround. Any ideas? [UPDATE] The issue is not related to jsforce, as it works well on Android. The error is specific to Windows 8.1 and Cordova.
Cordova App 8.1 Caching HTTP request
0 There are a few solutions available to you. When one node gets the request to clear the cache, you can put a message on a JMS topic and let all nodes read that message and clear their own caches. (answered Feb 10, 2015 by John Ament)
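A minimal sketch of that JMS idea (assuming JMS 1.1 and a JNDI-registered topic; the lookup names are placeholders): the node that handles the PUT publishes the key, and every node, including itself, consumes the message and evicts the entry.

import javax.jms.*;
import javax.naming.InitialContext;

public class CacheInvalidator {
    public static void publish(String cacheKey) throws Exception {
        InitialContext jndi = new InitialContext();
        ConnectionFactory cf =
                (ConnectionFactory) jndi.lookup("jms/ConnectionFactory"); // placeholder
        Topic topic = (Topic) jndi.lookup("jms/cacheInvalidation");       // placeholder

        Connection conn = cf.createConnection();
        try {
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(topic);
            // Each node's subscriber receives this and evicts the key locally.
            producer.send(session.createTextMessage(cacheKey));
        } finally {
            conn.close();
        }
    }
}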
We have a RESTful web service. We are using JAX-RS's CacheControl to cache the response XML for a GET endpoint. Example: GET - https://api.apiway.com/v1/users/12345 To clear the cache, I need to hit the same endpoint but with a PUT (instead of a GET), and the cache will be cleared for that endpoint. Example: PUT - https://api.apiway.com/v1/users/12345 This works fine when I run it locally, where I only have one WebSphere instance running. But our QA environment has two WebSphere instances running behind a load balancer. So, when I call the PUT endpoint to refresh the cache, it only clears the cache on one WebSphere instance, while the other WebSphere instance responds with outdated data. How do I use CacheControl to refresh the cache on multiple instances behind a load balancer?
clearing cache from multiple servers behind load balancer
2 Premature optimization is the root of all evil. If you don't need a cache, don't use a cache. That being said, if you are content to not serve up dynamic content per request, you might want to look into using a caching proxy such as varnish and cutting out PHP and the webserver entirely. There's quite a bit of overhead to get to even your first line of PHP, and serving static files through PHP is a little dirty. If you just want to cache elements, something like memcached or APC's cache is the way to go. APC has the advantage of being more readily available (you should have APC installed on your servers for the opcode cache if you care at all about performance) and memcached has the option of letting you have a cache that's accessible by multiple webservers (and/or multiple caches). (answered Mar 26, 2010 by Daniel Papasian)
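If element caching is the route taken, a minimal APC sketch (the key, TTL, and the expensive query function are placeholders): apc_fetch avoids the MySQL work entirely on a hit, which is where the speedup over filemtime-plus-readfile usually comes from, since the data never leaves shared memory.

<?php
function get_posts_cached() {
    $key = 'recent_posts';                // placeholder cache key
    $posts = apc_fetch($key, $hit);
    if (!$hit) {
        $posts = load_posts_from_mysql(); // hypothetical expensive query
        apc_store($key, $posts, 300);     // cache for 5 minutes
    }
    return $posts;
}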
I wrote a rather small skeleton for my web apps and thought I would also add a small cache to it. It is rather simple: if the current page exists as a file in the cache and the file isn't too old, read it out and exit instead of rebuilding the page; if the current page isn't cached or is outdated, recalculate the page and save it. However, the bad thing about it is: my performance tests with a page that receives 40 relatively long posts via a MySQL query showed that, with the cache, it took even longer to handle a single request (1000 tests each). How can that happen? How can doing a MySQL query, looping through the results a first time, passing the results to the template, and then looping through the results a second time be faster than a filemtime() check and a readout? Should I just remove the complete raw-PHP cache and rely on the availability of a PHP cache like memcached?
Is a PHP-only "cache engine" ever worth it?
-2 It's possible to vary the output caching based on pretty much anything you want by using VaryByCustom and providing a function that returns the cache key string. For your case, try a directive like this:

<%@ OutputCache Duration="30" VaryByParam="myParam" VaryByCustom="mySessionVar" %>

Then in Global.asax, override the GetVaryByCustomString function for your application:

public override string GetVaryByCustomString(HttpContext context, string arg)
{
    if (arg == "mySessionVar" && Session["mySessionVar"] != null)
    {
        return Session["mySessionVar"].ToString();
    }
    return base.GetVaryByCustomString(context, arg);
}

(answered Mar 3, 2010 by womp. Comment: "-1 You can't access session at this time. 'Session state is not available in this context.'" - Zote)
I would like to know if it's possible to use OutputCache with a querystring parameter AND a session parameter together. I'm serving location-based content, and the country id is stored in a session, while other parameters such as categoryid and pageindex are stored in the querystring.
Basing ASP.NET Outputcache on querystring parameter AND session
0 Since the question you're asking is a general-approach one, I will put in my two cents. On your approaches: Option 4 - You could use some offline software or an external site for compression, but it seems tedious; if I needed to upload one image per day, I would probably choose this option. Option 2 - I would rather not compress on upload, since you lose the original image, and image compression can ruin some images very badly. As for options 1 & 3 - I think it depends on the resources of your server, the number of images, the traffic of your site, etc. Generally, I prefer compressing/caching on request, not on upload, but for a smaller site it shouldn't make much difference. As for the API - generally, you have two options: do the work on your server/site or use an external service. When it comes to services, we use CloudImage; it has a very simple API and it helps a lot with the compression process (and resizing if you need it). Also, you get the benefits of the CDN, which will boost performance. Since you are using Pydio, I assume you need data security and privacy, so CloudImage may be a good option for you, since they take the privacy stuff really seriously. If you prefer to do this yourself, and given that you use PHP, I would recommend ImageMagick and the PHP library IMagick. You can control every parameter of the compression and the documentation is pretty good. The only downside is that achieving the best compression without losing quality takes a bit of trial and error at first. Good luck! (answered Jun 25, 2018 by Philip Michaylov)
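For the ImageMagick route, a minimal IMagick sketch (the quality value and paths are placeholders to tune by trial and error, as the answer notes); writing to a separate destination keeps the high-quality original intact, matching the site's current behavior:

<?php
function compressJpeg($src, $dest, $quality = 82) {
    $img = new Imagick($src);
    $img->setImageCompressionQuality($quality); // 0-100; lower = smaller file
    $img->stripImage();                         // drop EXIF and other metadata
    $img->writeImage($dest);                    // original at $src is untouched
    $img->destroy();
}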
I am receiving control of a website and I need to take care of an image compression process. Right now, when uploading an image, it gets stored on the server in high quality, and when the website is being cached, the image is compressed for the cache. So the cache has a compressed copy of the image while the original, high-quality image is still stored on the server. The tool responsible for what I have just described was developed by the current owner of the website, and since I am not getting that tool, I will need another one. The site currently uses Pydio and I have not seen any compression option there. Since it seems I need a new tool for the image compression process, I first want to know the best practice, performance-wise, for handling the compression, and I know there are some good, experienced developers here. I thought about some options: 1. Keep it the way it is now: store the original image on the server and compress it for the cache when caching (best compatibility with the website, since this is what the current tool does). 2. Compress all images the moment they are uploaded, so only the compressed images are stored on the server and used for the cache (saves storage space, but I don't know how to combine it with Pydio). 3. Have a cron job which compresses all images that are not already compressed (gives me the ability to upload images freely without worrying about compressing them, though the images will not be immediately compressed). 4. Upload the image to a website which compresses it, then take the output image and upload that (really, it sounds stupid and like a lot of messing around just to upload an image). What do you think is the best practice, and why? Also, is there a better practice for compressing the images? Plus, if you know any tool which has an API for this, I will be thankful to hear about it. The website is built using PHP.
What is the best practice for image compression
1 Make sure you provision the memcachier add-on on Heroku and that you have its credentials in your environment (run: heroku config). Also make sure your memcachier & dalli gems are not nested in any gem group, so that they are available in production. (answered Sep 10, 2014 by hammady)
I am using: Rails 4.1.0rc2 Heroku gem 'memcachier' gem 'dalli' If I use caching from the console, it works: irb(main):010:0> Rails.cache.write("foo", "bar") => 1297036692682702848 irb(main):011:0> Rails.cache.read("foo") => "bar" But if I set the cache using Rails.cache.fetch in the application, and attempt to read via the console, I get this. Rails.cache.read([School, "California", [], School.where(state: "California").all.map(&:updated_at).max, "city_filters"]) Dalli::Server#connect mc3.dev.ec2.memcachier.com:11211 Dalli/SASL authenticating as 451265 Dalli/SASL authenticating as 451265 Dalli/SASL: 451265 Dalli/SASL: 451265 => [{:type=>"city", :value=>"San Francisco", :count=>11, :current=>false}] But when I run this in the app, it does a new search each time. Completed 200 OK in 8481ms (Views: 1151.4ms | ActiveRecord: 246.6ms) Caching works locally (it's not a full cache). Completed 200 OK in 655ms (Views: 244.5ms | ActiveRecord: 74.5ms) How can I get memcached/dalli working? I had this working in a different app; seems like the same set up to me.
Rails 4.1 w/ Heroku: DalliError: No server available
This was an issue with local storage support in IE8. Clearing the cache does not remove local storage, so you instead need to call the following from the developer tools console: $.jStorage.flush()
I'm using IE8 and the jStorage library to store data in place of cookies. This is all good until I want to clear the stored values. In Chrome this is possible by navigating to the content settings page. However, IE8 only provides an option to clear cookies, which doesn't clear the values I've stored in local storage. Any ideas how I can clear this data? I don't want to display a "clear cache" button.
How to clear local storage values in Internet Explorer 8
You shouldn't. NHibernate sessions are there to help you work in an ACID environment, which means that one transaction is not aware of any concurrent transactions. You should be using short sessions which do small sets of actions. You should not be holding sessions open for long periods of time. If you do need long periods of time for working with domain objects, then you should be detaching and then re-attaching the domain objects from and to different sessions. Once you open a new session, any changes done to the database before the session was opened will be made available through NHibernate.
I am implementing NHibernate into an existing web application. However, we have some other processes that do bulk inserting and updating on the database. How can I make NHibernate aware that changes are occurring on the backend db that were not initiated through NHibernate? Most of the info that I have read about NHibernate use in asp.net has mentioned storing the Session object in the HttpContext or CallContext. This stores the session object for the duration of the application lifecycle, which is what I have implemented, because I was afraid of the cost of initializing NHibernate on each request. Isn't there a significant performance hit from initializing the Session object on each request? Also, would it make more sense to store the SessionFactory in the HttpContext or CallContext so that the mappings don't have to be regenerated on each request?
How to force NHibernate to recognize db changes not made through NHibernate
5 I simply deleted the .gradle folder, then restarted Android Studio, and the problem was solved. (answered Jul 7, 2015 by Andread Sze)
Error: java.io.FileNotFoundException: C:\Users\Petrusic\AndroidStudioProjects\askramarnovi.gradle\2.2.1\taskArtifacts\cache.properties (The system cannot find the file specified)
Error in Android Studio cache.properties
You should take a look at memcached: access it through PHP's memcache or memcached extension. "Memcache module provides handy procedural and object oriented interface to memcached, highly effective caching daemon, which was especially designed to decrease database load in dynamic web applications." It is designed exactly for what you need and is used by many high-performance web applications. "Memcached was developed to enhance the speed of LiveJournal.com, a site which was already doing 20 million+ dynamic page views per day for 1 million users with a bunch of webservers and a bunch of database servers. The introduction of memcached dropped the database load enormously." Note: there are two (2) client libraries in PHP. Some more discussion on this can be found on serverfault.com: memcache-vs-memcached, and here is a comparison.
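A minimal sketch of how this would look for the asker's case (server address, key, and the feed builder are placeholders; note the asker is on Windows/IIS, where the PECL extension builds can be harder to come by, so treat this as illustrative): the three MySQL queries run at most once per day, and every other request is served straight from memory.

<?php
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);          // placeholder server

$payload = $mc->get('daily_feed');           // placeholder key
if ($mc->getResultCode() === Memcached::RES_NOTFOUND) {
    $payload = build_feed_from_mysql();      // hypothetical: the 3 MySQL queries
    $mc->set('daily_feed', $payload, 86400); // expire after one day
}
echo $payload;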
I'm writing a web service for my application and want to know the best way to handle the potentially huge number of requests I might get. A lot of the data probably won't change throughout the day, but the particular script I'm writing makes 3 MySQL queries, which seems a little excessive considering the data will probably be the same as in the last request to the script, and if it's not the same then it's no big deal. Will performance be much better if I save the output XML/JSON to a file, serve it to requesters throughout the day, and then overwrite it with the first request of the following day? What's the best way of doing this? I know Joomla, phpBB and other MySQL-intensive applications use caching so as not to make as many MySQL queries, which is what got me thinking. EDIT: Forgot to mention I'm on Windows/IIS 7.0
Caching with PHP to take stress away from MySQL
6 Do you actually need them removed from the cache at that time? Or just that future requests to the cache for that item should return null after a given time? To do the former, you would need some sort of background thread periodically purging the cache. This would only be needed if you were worried about memory consumption or something. If you just want the data to expire, that would be easy to do; it is trivial to create such a class:

class CachedObject<TValue>
{
    public DateTime Expires { get; set; }
    public TValue Cached { get; set; }
}

class Cache<TKey, TValue> : Dictionary<TKey, CachedObject<TValue>>
    where TValue : class
{
    public new TValue this[TKey key]
    {
        get
        {
            if (!ContainsKey(key)) return null;
            var entry = base[key];
            if (entry.Expires < DateTime.UtcNow)  // compare dates
            {
                Remove(key);                      // expired: remove from cache
                return null;                      // and return null
            }
            return entry.Cached;                  // else return the cached item
        }
        set
        {
            // wrap the value with its expiry time and add to the dictionary
            base[key] = new CachedObject<TValue>
            {
                Expires = DateTime.UtcNow.AddHours(1),
                Cached = value,
            };
        }
    }
}

(answered Apr 9, 2009 by Jason Coyne)
I am writing a Console Application in C# in which I want to cache certain items for a predefined time (let's say 1 hour). I want items that have been added into this cache to be automatically removed after they expire. Is there a built-in data structure that I can use? Remember this is a Console App not a web app.
C# Collection whose items expire
Yes, Volley caches every response, unless setShouldCache is set to false. BUT, it does so according to the HTTP cache headers of the response. This means that if there are no cache headers, or they have expired, the JSON response (or any response for that matter) will NOT be cached. setShouldCache is true by default so you don't have to set it to true manually. It's actually used to explicitly ask for the response not to be cached. Also, the tutorial you're looking at is wrong. You do not need to manually interact with Volley's cache. Volley does that automatically.
I've followed a few tutorials; one in particular shows how to do the following if you want to use the cached result for an HTTP call:

Cache cache = MyApplication.getInstance().getRequestQueue().getCache();
Cache.Entry entry = cache.get(url);
if (entry != null) {
    // means there is a cached result; to use it we have to do something like this...
    new JSONObject(new String(entry.data, "UTF-8"))
} else {
    // no cached result found, so make the full request
    JsonObjectRequest jsonObjReq = new JsonObjectRequest(Request.Method.GET, url, null,
        new Response.Listener<JSONObject>() {
            @Override
            public void onResponse(JSONObject response) {
                //stuff
            }
        },
        new Response.ErrorListener() {
            @Override
            public void onErrorResponse(VolleyError error) {
                //stuff
            }
        });
    // Adding request to request queue
    MyApplication.getInstance().addToRequestQueue(jsonObjReq, TAG);
}

I was under the impression that the default behaviour was to cache the result and retrieve the cached result automatically, without having to explicitly get the cached entry (i.e. entry = cache.get(url)), so I'm basically asking whether or not this is the default behaviour. Thanks
Are Android Volley Requests Automatically Cached?
I'd recommend you follow these rules to improve performance in general: http://developer.yahoo.com/performance/rules.html If you install YSlow for Firebug, it will validate all these rules for you. And regarding caching in particular, I recommend reading this tutorial. Caching is a very extensive topic and it's not easy to explain everything in 10 lines :-) http://www.mnot.net/cache_docs/#CONTROL Specifically talking about the output cache directive for ASP.NET pages, it's quite simple to use. Here you have the reference: http://msdn.microsoft.com/en-us/library/hdxfb6cy.aspx But please take into account that it is important to use the cache for pages and also for other resources such as CSS, JS and images.
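For the page-output part, the Web Forms directive is a one-liner at the top of the .aspx page; a sketch caching the output for one day for all users, regardless of query string:

<%@ OutputCache Duration="86400" VaryByParam="None" %>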
I am a web designer and usually design corporate web sites which often do not require updates. So I want to cache the output for one day. How can I do this? Also, any suggestions for better asp.net performance on slow servers are welcome.
How to cache asp.net web site for better performance
It won't matter how long you tell items to stay in the cache if the app pool is set to recycle every xx minutes. On most servers the app pool recycles by default every 1440 minutes (24 hours). Your cache can't survive that app pool recycle unless you persist the data. I don't persist that long (it's almost never practical). I also set the app up so that, if the cache has been purged, it rebuilds itself as it needs the data, not all at once. You could also be using more RAM than the app pool is configured to allow, too much CPU, etc. And if this is running on a developer machine, every time you reset the app or reload Visual Studio everything is erased as well. Look at your app pools and you will probably see the culprit for the resets.
I'm currently developing a site in ASP.NET Web Forms. I'm caching things where it makes sense, adding them using High / Normal / Low priority and telling them to stay in the cache for 2 weeks, 1 week, and 4 hours respectively. I'm showing the current number of cached items on every page (for debugging reasons). Sometimes, if I travel through the site quickly, the number of items in the cache can get up to 2000 items, but if I wait 5 minutes and refresh the page, the cache goes down to 20 items (just what was cached on that page). Is there any way I can find out what's going on? And is there a reason for this that I'm missing? I'm running Win7, 4 GB RAM, 64-bit, VS10, .NET 4. Since I have 4 gigs of RAM, why should my cache completely empty? I would say 10% of the cached items are about 4 KB each; the rest are strings about 100 chars long. EDIT: I'm using sliding expiration. EDIT: I sorted it out; there were one or two items that were VERY BIG and set to High priority. That and some other smaller changes fixed my problem.
I'm losing all my cache... (Items are disappearing from my cache)
Um APC is current tech and almost a must for any performant PHP site. Not only that but it will ship as standard in PHP 6 (rather than being an optional module like it is now). I don't know what your issue is/was but it's not APC being outdated or old tech.
I have started having problems with my VPS, in that it would fail to serve the pages on all the websites. It just showed a blank page or offered to download the PHP file (luckily the code was not in the downloaded file :) ). The server was still running, but this seemed to be a problem with PHP, since I could still log into WHM. If I restarted Apache, the sites would work again. After some talks with the server support, they told me this is a problem with the APC extension, which they considered old and not recommended for production servers. So they removed it for now, to see if the same kind of failures continue to appear. I haven't read anywhere that APC could have such problems or that it's not always recommended to use; quite the contrary, everywhere people say to always use it. The APC extension was installed via ssh and is the latest version. Edit: They also don't recommend Memcache and say that a more reliable extension would be eAccelerator.
APC not recommended for production?
For a DB, 10K rows is nothing. You're not seeing much difference because the actual calculation time is minimal, with most of it consumed by other, constant, overhead. It's difficult to predict when you'd start noticing a difference, but it would probably be at around a million rows. If you've already set up caching and it's not detrimental, you may as well leave it in.
We have about 10K rows in a table. We want to have a form where we have a select drop down that contains distinct values of a given column in this table. We have an index on the column in question. To increase performance I created a little cache table that contains the distinct values so we didn't need to do a select distinct field from table against 10K rows. Surprisingly it seems doing select * from cachetable (10 rows) is no faster than doing the select distinct against 10K rows. Why is this? Is the index doing all the work? At what number of rows in our main table will there be a performance improvement by querying the cache table?
DB Index speed vs caching
In Sitecore 6, the CacheManager class has a static method that will clear all caches. The ClearAll() method is obsolete. Sitecore.Caching.CacheManager.ClearAllCaches();
I am trying to publish programmatically in Sitecore. Publishing works fine, but doing so programmatically doesn't clear the Sitecore cache. What is the best way to clear the cache programmatically? I am trying to use the web service that comes with the staging module, but I am getting a bad request exception (Exception: The remote server returned an unexpected response: (400) Bad Request.). I tried to increase the service receivetimeout and sendtimeout in the client-side config file, but that didn't fix the problem. Any pointers would be greatly appreciated. I am using the following code:

CacheClearService.StagingWebServiceSoapClient client = new CacheClearService.StagingWebServiceSoapClient();
CacheClearService.StagingCredentials credentials = new CacheClearService.StagingCredentials();
credentials.Username = "sitecore\adminuser";
credentials.Password = "***********";
credentials.isEncrypted = false;
bool s = client.ClearCache(true, dt, credentials);

I am using the following code to publish:

Database master = Sitecore.Configuration.Factory.GetDatabase("master");
Database web = Sitecore.Configuration.Factory.GetDatabase("web");
string userName = "default\adminuser";
Sitecore.Security.Accounts.User user = Sitecore.Security.Accounts.User.FromName(userName, true);
user.RuntimeSettings.IsAdministrator = true;
using (new Sitecore.Security.Accounts.UserSwitcher(user))
{
    Sitecore.Publishing.PublishOptions options = new Sitecore.Publishing.PublishOptions(
        master, web, Sitecore.Publishing.PublishMode.Full,
        Sitecore.Data.Managers.LanguageManager.DefaultLanguage, DateTime.Now);
    options.RootItem = master.Items["/sitecore/content/"];
    options.Deep = true;
    options.CompareRevisions = true;
    options.RepublishAll = true;
    options.FromDate = DateTime.Now.AddMonths(-1);
    Sitecore.Publishing.Publisher publisher = new Sitecore.Publishing.Publisher(options);
    publisher.Publish();
}
Sitecore Clear Cache Programmatically
You have to add a MIME type for .manifest files, serving them as text/cache-manifest. In .htaccess: AddType text/cache-manifest .manifest
I've been writing a simple text-editor in HTML5 that is supposed to work offline. I can't, however, get the offline application cache to work, and I can't work out why not. My manifest file is like this: CACHE MANIFEST application.html options.html ... And it is being invoked as follows: <!DOCTYPE html> <html manifest="cache.manifest"> <head> ... I'm using Google App Engine to host the web application. I've put the webpage through the W3C HTML validator (http://validator.w3.org/check?uri=https%3A%2F%2Fwrite-space.appspot.com%2F) and it comes out fine. I've tested it in Chrome and Firefox. In Chrome nothing is added to the cache storage (and window.applicationCache.status returns 0). In Firefox the notification bar asking to cache the files does not appear. Basically, the files are not being cached. I've looked at various demos that do cache for offline viewing, and cannot work out why my code does not work. Can anyone help?
HTML5 Application Cache Not Working
Just add a timestamp to the link getting the Excel File, e.g. printf('<a href="file.xls?%d">Excel File</a>', time()); Because the timestamp is always different, their browser won't cache the file.
I have a PHP script that simply takes some data, separates it into tab-delimited format, saves it as an .xls file, and then gives the user a link to download it. It works well most of the time, but some people are getting cached versions of the exported .xls file. What I am thinking I need to do is, instead of giving the user a direct link to the .xls document, give them a link to a PHP page something like this: deliver_excel_doc.php?file=some_excel_file.xls and then deliver_excel_doc.php pulls in the data from the Excel doc, does something with the headers so the Excel doc is not cached, and then outputs it as xls so the file will be downloaded (or rendered inside Excel). Any ideas on how I can do this (is this concept viable)?
Eliminate Caching of Excel Document
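The deliver_excel_doc.php concept from the question above is also viable; a minimal sketch (the export directory is a placeholder, and the filename check is deliberately crude - a real script should whitelist allowed names):

<?php
// deliver_excel_doc.php?file=some_excel_file.xls
$file = basename($_GET['file']);        // crude traversal guard; validate properly
$path = __DIR__ . '/exports/' . $file;  // placeholder export directory
if (!is_file($path)) { http_response_code(404); exit; }

header('Content-Type: application/vnd.ms-excel');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');
header('Pragma: no-cache');
readfile($path);

The no-store/no-cache headers tell the browser never to reuse a stale copy, which addresses the caching problem without changing the download URL.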
5 You can cache: query results; the HTML output of a PHP script/request; variables; parts of a page; and the code itself (an opcode cache speeds things up by skipping recompilation to bytecode). Each of those is a different subject with different methods. (answered Jul 17, 2009 by Itay Moav -Malimovka)
I have read a few things here and there about PHP being able to "cache" things. I'm not super familiar with the concept of caching from a computer science point of view. How does it work, and where would I use it in a PHP website and/or application? Thanks!
Can I do caching with PHP?
I had the same problem. I wanted to create a bot which sends an image taken by a webcam of a ski slope (webcam.example.com/image.jpg). Unfortunately, the filename, and so the URL, never changes, and Telegram always sends the cached image. So I decided to alter the URL passed to the API. To achieve this, I wrote a simple PHP page (example.com/photo.php) which redirects to the original URL of the photo. After that, I created a folder (example.com/getphoto/) on my webspace with a .htaccess file inside. The .htaccess redirects all requests in this folder to photo.php, which in turn redirects to the image (webcam.example.com/image.jpg). So you can append anything to the folder URL and still get the picture (e.g. example.com/getphoto/42 or example.com/getphoto/hrte8437g). The Telegram API seems to cache photos by URL, so if you always add a different ending to the URL passed to the API, Telegram doesn't use the cached version and sends the current image instead. The easiest way to always change the URL is to append the current date.

example.com/photo.php
<?php
header("Location: http://webcam.example.com/image.jpg");
die();
?>

example.com/getphoto/.htaccess
RewriteEngine on
RewriteRule ^(.*)$ http://example.com/photo.php

in python:
bot.sendPhoto(chat_id, 'example.com/getphoto/' + strftime("%Y-%m-%d_%H-%M-%S", gmtime()))

This workaround should also work in other languages like Java or PHP. You just need to change how you get the current date.
I am working on a Telegram bot that displays images from several webcams upon request. I fetch the images from URLs and then send them to the user (using bot.sendPhoto()). My problem is that for any given webcam the filename does not change, and it seems that the photo is sent from Telegram's cache, so it will display the image from the first time that image was requested. I have thought about downloading the image from the URL, saving it with a variable name (like a name with a timestamp in it), and then sending it to the chat, but this seems like an inelegant solution and I was hoping for something better, like forcing the image not to be cached on the Telegram server. I am using the python-telegram-bot wrapper, but I am not sure that this is specific to it. Any ideas? I have tried searching but so far am turning up little. Thanks in advance.
telegram bot image from url - undesired cache
Keep these lines at the top of the login page. They clear the cache and prevent the back button from showing the page: <?php header("Cache-Control: no-store, no-cache, must-revalidate, max-age=0"); header("Cache-Control: post-check=0, pre-check=0", false); header("Pragma: no-cache"); header('Content-Type: text/html'); ?>
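In Laravel 5 specifically, the idiomatic place for those headers is middleware rather than the top of a view; a minimal sketch (the class name is arbitrary; register it in app/Http/Kernel.php and attach it to the authenticated routes):

<?php
namespace App\Http\Middleware;

use Closure;

class NoCacheHeaders
{
    public function handle($request, Closure $next)
    {
        $response = $next($request);
        // Tell the browser never to serve these pages from its history cache,
        // so Back after logout forces a fresh (and now unauthenticated) request.
        $response->headers->set('Cache-Control', 'no-store, no-cache, must-revalidate, max-age=0');
        $response->headers->set('Pragma', 'no-cache');
        $response->headers->set('Expires', 'Fri, 01 Jan 1990 00:00:00 GMT');
        return $response;
    }
}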
My problem: once I have logged out, clicking the back button in the browser should never go back to the admin home page. Currently, when I click the back button it shows the admin's secret pages, but when I refresh the page it goes back to the login page. I solved it in Laravel 4.2 by writing this in filters.php: App::after(function($request, $response) { $response->headers->set('Cache-Control','no-cache, no-store, max-age=0, must-revalidate'); $response->headers->set('Pragma','no-cache'); $response->headers->set('Expires','Fri, 01 Jan 1990 00:00:00 GMT'); }); But now, how can I prevent logging back in via the back button after logout in L5?
Prevent Back Login After Logout by hitting the Back button on Browser in L5?
This is a trick, but it works. Put a variable and random number in the image url. Something like: <img src="photo.jpg?xxx=987878787"> Maybe there's a better way, but it works for me.
I have an asp.net-mvc site. On one of the pages I have an image, and I use Jcrop to allow users to crop it. When they click submit, I crop the image on the server side and then reload the page. The issue is that the image looks the same as before; if I hit F5 and refresh the page, then the updated image shows up. Is there any way to avoid having to force an F5 refresh in this case?
What is the best way to force an image refresh on a webpage?
5 header("Cache-Control: no-cache, must-revalidate"); header("Expires: Mon, 26 Jul 1997 05:00:00 GMT"); //header("Content-Type: application/xml; charset=utf-8"); for clearing browser cache Share Improve this answer Follow edited Mar 7, 2012 at 8:20 answered Mar 7, 2012 at 8:03 GhostmanGhostman 6,05499 gold badges3535 silver badges5353 bronze badges 1 For clarity, this does not clear the browser cache. This tells the browser not to cache the page in the first place. And the Content-Type header probably shouldn't be there (at least not statically like that) – Leigh Mar 7, 2012 at 8:05 Add a comment  | 
I want to clear the browser cache on each page when it loads in the browser. I used the clearcache() PHP function, but it did not work for me. Please help. Thanks.
How to clear cache on each php page when it loads in the browser?
7 Quoting the Hibernate documentation, section 3.4.6, Hibernate statistics: "If you enable hibernate.generate_statistics, Hibernate exposes a number of metrics that are useful when tuning a running system via SessionFactory.getStatistics(). Hibernate can even be configured to expose these statistics via JMX. Read the Javadoc of the interfaces in org.hibernate.stats for more information." You'll find the org.hibernate.stats package summary there. For the JMX part, have a look at Publishing statistics through JMX. For more advanced stuff, you'll have to rely on specific features of your cache provider. (answered Dec 3, 2009 by Pascal Thivent)
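If the application cannot be modified, JMX is the way to monitor from the outside; but for reference, reading the counters in code is only a few lines (assuming hibernate.generate_statistics=true and a Hibernate 3.x/4.x-era Statistics API; the region name is a placeholder):

import org.hibernate.SessionFactory;
import org.hibernate.stat.Statistics;

class CacheMonitor {
    static void logSecondLevelCacheStats(SessionFactory sessionFactory) {
        Statistics stats = sessionFactory.getStatistics();
        System.out.printf("2nd level cache: %d hits, %d misses, %d puts%n",
                stats.getSecondLevelCacheHitCount(),
                stats.getSecondLevelCacheMissCount(),
                stats.getSecondLevelCachePutCount());

        // Per-region detail, e.g. for one entity's cache region:
        // stats.getSecondLevelCacheStatistics("com.example.Customer");
    }
}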
Is there any tool which would allow for monitoring Hibernate 2nd level cache usage? I know that I could use Hibernate API for retrieving such information. But what should I do when I have application which doesn't read the information itself, and I can't modify it? Is there any way to read cache statistics from the outside of the application?
Tool for monitoring Hibernate cache usage
The Hibernate 1st level cache can not be disabled (see How to disable hibernate caching). You need to understand Hibernate's session cache if you want to force Hibernate to query the database. Lokesh Gupta has a good tutorial at http://howtodoinjava.com/2013/07/01/understanding-hibernate-first-level-cache-with-example/ : The first level cache is associated with the "session" object, and other session objects in the application can not see it. The scope of cached objects is the session; once the session is closed, cached objects are gone forever. The first level cache is enabled by default and you can not disable it. When we query an entity the first time, it is retrieved from the database and stored in the first level cache associated with the Hibernate session. If we query the same object again with the same session object, it will be loaded from the cache and no SQL query will be executed. A loaded entity can be removed from the session using the evict() method; the next load of this entity will again make a database call if it has been removed using evict(). The whole session cache can be removed using the clear() method, which removes all the entities stored in the cache. You should therefore use either the evict() or clear() method to force a query to the database. To verify this, you can turn on SQL output using the hibernate.show_sql configuration property (see https://docs.jboss.org/hibernate/orm/5.0/manual/en-US/html/ch03.html#configuration-optional).
I have an app that retrieves data from a database, and I monitor the time my app takes to retrieve it. But I have an issue: when I use the same input data set, the second retrieval takes much less time. I assume Java or Hibernate has some cache or temp file that saves the data so the second run is fast, but I don't want that to happen. I need to monitor the time it actually takes, not the time to retrieve from a cache or temp file. I tried to forbid cache and temp file generation in the Java Control Panel, and I tried to disable the Hibernate cache (first level and second level), but these did not solve my problem. The second run still takes less time than it should. Any idea what causes the second run to be faster? It is just a simple app that retrieves data from the DB.
How can I stop Java or Hibernate Caching
Building on @justrohu's answer, you could have a method you wrap all of your raw queries in:

public function cacheQuery($sql, $timeout = 60)
{
    return Cache::remember(md5($sql), $timeout, function () use ($sql) {
        // DB::select actually runs the query; DB::raw alone only builds an expression
        return DB::select(DB::raw($sql));
    });
}

$results = $this->cacheQuery("SELECT * FROM stuff INNER JOIN more_stuff");

This caches your queries by using an MD5 hash of the SQL as the cache key.
I have a repository with a number of raw queries, for example: DB::select(DB::raw( 'SELECT stuffFields FROM stuffTable A NUMBER OF COMPLEX JOINS, ETC' )); I would like to cache the results from this query, but I encountered a couple of issues: 1) I cannot do ->remember(60), as the Fluent query is not started with the table() method. 2) I cannot do DB::table('stuffTable')->select(DB::raw('stuffFields A NUMBER OF COMPLEX JOINS, ETC'))->get(); because there are those joins and the FROM clause gets appended at the end of the query (after the joins), and this throws an SQL syntax error. I also cannot move the joins out into a join() method, as they contain nested queries (is there a way to perform a rawJoin()? I couldn't find anything like that). Can anyone suggest a way to either restructure the Fluent calls or a common way to cache such raw queries?
Laravel Cache Raw Queries
4 Using code splitting will aid you in chunking up the code. You'll need to look at how your app is set up to take advantage of it, but you'll definitely benefit from it if your app is that huge. (edited Oct 1, 2012 by Thomas Broyer; answered Oct 1, 2012 by checketts)
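For reference, a split point in GWT is just a GWT.runAsync call; everything reachable only from inside onSuccess is compiled into a separate fragment that is downloaded on demand, shrinking the initial *.cache.html. A sketch (ReportsScreen is a hypothetical rarely-used feature):

import com.google.gwt.core.client.GWT;
import com.google.gwt.core.client.RunAsyncCallback;

public void openReportsScreen() {
    GWT.runAsync(new RunAsyncCallback() {
        public void onFailure(Throwable reason) {
            // The fragment download failed (e.g. offline); tell the user.
            GWT.log("Could not load reports module", reason);
        }

        public void onSuccess() {
            // Code referenced only from here lands in its own fragment,
            // keeping it out of the initial download.
            new ReportsScreen().show(); // hypothetical widget
        }
    });
}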
I have a huge GWT application; after I compile it, it generates a file XXX.cache.html. This file is very large, about 3 MB, and it requires the user to spend 30 seconds loading the webpage. Is there any way to reduce this file other than using code splitting? I tried using the compress command in GZip, but apparently there's a bug filed to the effect that files larger than 1.5 MB won't be compressed. Has anyone faced this issue before? How were you able to solve it?
GWT generated HTML file too large
One problem with HttpRuntime.Cache in a server farm is that much of the data in the cache will be duplicated across servers. For example, lets say you have an e-commerce site and you store the product catalog in the cache (since it rarely changes) and that takes 20M of memory. If you have 5 web servers, each of them will have their own copy of the product catalog in their cache, using a total of 100M of memory (20M * 5 servers). Using a shared cache like memcached eliminates that waste by storing only one copy of the cached items instead of 5 copies (one for each server). Another problem is cache synchronization. Again, let's say you cache your product catalog for 1 hour after retrieving it from the database. Each server could have a copy of the catalog in its cache that expires at different times. When the cache expires on one server, it could get updated data that the other servers don't have yet. This could result in a strange user experience where they might see 50 products in a category, then refresh the page, are sent to a different server, and see 51, refresh again and see 50 again, etc. With a shared cache, you don't have to worry about that.
I've heard that HttpRuntime.Cache is not suitable for use with web farms. Why is that? What are some good example workarounds to this problem? I have heard about Memcached and similar offerings but I cannot find any good recent sample showing basic usage.
HttpRuntime.Cache not suitable for web farms?
You have a problem with the casing of your variable name: PHP variable names are case-sensitive. Change $cacheFile to $cachefile (with a lowercase f). That is, change this:

    $cached = fopen($cacheFile, 'w');

to this:

    $cached = fopen($cachefile, 'w');
I'm using this tutorial http://papermashup.com/caching-dynamic-php-pages-easily/ for caching a page:

    <?php
    $cachefile = $_SERVER['DOCUMENT_ROOT'].'cache.html';
    $cachetime = 4 * 60;

    // Serve from the cache if it is younger than $cachetime
    if (file_exists($cachefile) && time() - $cachetime < filemtime($cachefile)) {
        include($cachefile);
    } else {
        ob_start(); // Start the output buffer
    ?>
    /* Here's where you put your page content */
    <?php
        // Cache the contents to a file
        $cached = fopen($cacheFile, 'w');
        fwrite($cached, ob_get_contents());
        fclose($cached);
        ob_end_flush(); // Send the output to the browser
    }
    ?>

but I get the following errors:

    Warning: fopen() [function.fopen]: Filename cannot be empty in
    Warning: fwrite(): supplied argument is not a valid stream resource in
    Warning: fclose(): supplied argument is not a valid stream resource in

The path to the file is right, and if I edit the file myself it is included, but again I get the errors.
Warning: fopen() [function.fopen]: Filename cannot be empty in
"There are only two hard things in Computer Science: cache invalidation and naming things." — Phil Karlton
There are many questions on here about caching that doesn't work properly, or asking how to implement caches properly, for all sorts of things from HTTP to SQL queries, L1/L2 memory caching, etc. Is it generally considered a difficult problem in computer-science terms?
Is implementing a cache considered a difficult problem?
OK, so I feel like an idiot for answering my own question, but hopefully it helps someone one day. This was not an API Gateway caching issue. The problem was a pymysql connection and Lambda container caching issue. My Lambda was using pymysql to query the MySQL RDS. For the recommended performance reasons, I reused the connection across invocations (meaning I did not close the connection each time). The solution was to call conn.commit() after I did my fetchall(). What was happening was that my subsequent calls were returning a cached query result, termed a "consistent read" (thanks @Michael - sqlbot). I believe I probably had more than one Lambda container, so when I was inactive for a while (i.e. busy reading Stack Overflow posts), the container would unload. Then my next API Gateway attempt would initialize a fresh Lambda handler, and a brand-new connection would be created (without the stale snapshot). This is why it seemed to "sometimes work, then stop". Apologies if I wasted anyone's time.
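A sketch of the pattern described above; the connection details, table, and column names are placeholders:

    import os
    import pymysql

    # Created once per container and reused across warm invocations.
    conn = pymysql.connect(
        host=os.environ["DB_HOST"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        db=os.environ["DB_NAME"],
    )

    def handler(event, context):
        with conn.cursor() as cur:
            cur.execute("SELECT id, name FROM users")
            rows = cur.fetchall()
        # End the read transaction; without this, a reused connection keeps
        # returning the same consistent-read snapshot (the "stale" values).
        conn.commit()
        return [{"id": r[0], "name": r[1]} for r in rows]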
Update: I figured it out; please see the answer post. I have an AWS API Gateway API defined with various resources and various GET and POST methods. Everything works mostly fine: POSTs go through, and GETs return a response (JSON payload), except that the returned value seems to be a cached value. My GET API calls a Lambda function that runs a query against RDS. I can confirm my responses are stale because:

- When I manually query the RDS, I get the updated value.
- I have CloudWatch logs enabled and the Lambda function does not get called (I believe I have it set up correctly, because when I test-invoke the Lambda I do get CloudWatch logs).

It did refresh once, but I think that was because I crossed some caching threshold (like 1 hour) or something. I understand that API Gateway generates a CloudFront distribution behind the scenes, and I feel that this is what is doing the caching, maybe some kind of default caching TTL? But that's just a guess and I have no proof. I obviously have caching turned off on my API Gateway stage. I even tried enabling it, setting the TTL to 1, flushing the cache, and disabling the cache again; each stage of that testing still returned the stale values.

I do not know if it is relevant, but some additional details:

- I have CORS enabled ("*").
- I have Cognito authorizers enabled.
- I pass in the JWT token via the Authorization header (this is all working fine).

Is there some header I'm supposed to pass to request an uncached value? I went to CloudFront, but there are no configurations there. All the other posts on API Gateway caching seem to be about caching not working, or people asking about cache key specificity; I haven't seen anything about the value ALWAYS being cached no matter what. So I feel like I'm missing something obvious... Any help or debugging tips would be much appreciated!
AWS API Gateway GET response always cached
Because it makes no sense, and if you think through all the cases where it makes no sense, you won't need to ask. This is not so much a "does it sometimes make sense" question as an "are there side effects that make it bad" one. Next time you evaluate something like this, think about the negatives:

Memory consumption goes up, as the results HAVE to be cached even when that isn't wanted.

On the next run, the results may be different, because the incoming data may have changed. Your simple example (Enumerable.Range) has no issue with that, but filtering a list of customers may: the customers may have been updated in the meantime.

Side effects like that make it very hard to sensibly take the choice away from the developer. Want a buffer? Make one (easily).
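To illustrate the "make one yourself" point: a minimal Cache() extension sketch that materializes the sequence on first enumeration and replays the buffer afterwards. It deliberately accepts the memory-consumption and staleness trade-offs described above:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    static class EnumerableExtensions
    {
        public static IEnumerable<T> Cache<T>(this IEnumerable<T> source)
        {
            // Lazy: nothing runs until the first enumeration; after that,
            // every enumeration replays the buffered list.
            var buffer = new Lazy<List<T>>(() => source.ToList());
            return Iterate(buffer);
        }

        static IEnumerable<T> Iterate<T>(Lazy<List<T>> buffer)
        {
            foreach (var item in buffer.Value)
                yield return item;
        }
    }

Against the question's own test:

    var list = Enumerable.Range(1, 10)
        .Where(i => { Console.WriteLine("Enumerating: " + i); return true; })
        .Cache();
    var all = list.All(i => true);   // prints each "Enumerating: i" once
    var any = list.Any(i => false);  // replays the buffer; prints nothing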
So it is my understanding that LINQ does not execute everything immediately; it simply stores the information needed to get at the data. So if you do a Where, nothing actually happens to the list; you just get an IEnumerable that knows how to become the list. One can "collapse" this information to an actual list by calling ToList. Now I am wondering: why did the LINQ team implement it like this? It would be pretty easy to add a List at each step (or a Dictionary) to cache the results that have already been calculated, so I guess there must be a good reason. This can be checked with this code:

    var list = Enumerable.Range(1, 10).Where(i => {
        Console.WriteLine("Enumerating: " + i);
        return true;
    });

    var list2 = list.All(i => { return true; });
    var list3 = list.Any(i => { return false; });

If the cache were there, it would only output each "Enumerating: i" line once; the second time, it would get the items from the cache.

Edit: an additional question: why does LINQ not include a cache option, like .Cache(), to cache the result of the previous enumerable?
Why does LINQ not cache enumerations?
For starters, this isn't caching:

    Route::get('/users', function() {
        $users = User::all();
        Cache::put('users', $user, 60);
        if (Cache::has('users')) {
            return Cache::get('users');
        }
    });

You're running the query every time, then doing extra work to put it into the cache every time. You've replaced:

- database call

with:

- database call
- save the results of the call to the cache
- ask if the cache has the key we just saved
- get the key we just saved back from the cache

You want to be doing something like this:

    Route::get('/users', function() {
        return Cache::remember('users', 60, function() {
            return User::all();
        });
    });

You may not notice a huge difference with caching on such a simple query, especially on an unloaded system with just a few test users. Caching is much more significant on heavy queries (particularly anything involving joins/relationships) under heavy database load.
Querying directly from the database:

    Route::get('/users', function() {
        $user = User::all();
        return $users;
    });

Caching:

    Route::get('/users', function() {
        $users = User::all();
        Cache::put('users', $user, 60);
        if (Cache::has('users')) {
            return Cache::get('users');
        }
    });

Result: comparing both of these in the browser on page load, I don't notice any difference at all; they both return the list of users from my database. Is there any tool or way to measure and compare their performance?
Caching vs. querying directly from the database in Laravel
I have found the answer. I had to add Location = OutputCacheLocation.Server; otherwise it caches on the client side, which is wrong. So the OutputCache attribute should look like this:

    [OutputCache(Duration = 600, VaryByParam = "*", VaryByCustom = "User",
        Location = OutputCacheLocation.Server)]
    public ActionResult Index(<my parameters>)
I spent the whole day trying to figure out the problem, but I couldn't. Here it is: on the action I have an output cache attribute:

    [OutputCache(Duration = 600, VaryByParam = "*", VaryByCustom = "User")]

I've also overridden GetVaryByCustomString in Global.asax like this:

    public override string GetVaryByCustomString(HttpContext context, string arg)
    {
        if (arg == "User")
        {
            return "User=" + context.User.Identity.Name;
        }
        return base.GetVaryByCustomString(context, arg);
    }

But when I log in the first time it caches the values; then when I log out and log in again as a different user, I see the previously cached value. While debugging I checked that Identity.Name returns the correct results: for the first user it is "admin", for the second user it is "kate".
Is VaryByCustom not working when I use it with OutputCache, or is it caching wrong?
Performance tuning through caching can be organized into multiple layers:

Client-side (JS and CSS): adding an Expires or a Cache-Control header will get it done for you. But note that there is more to client-side performance than caching alone; for details check Best Practices for Speeding Up Your Web Site.

Server-side: this can happen on many levels: web server, scripting language, database, operating system, network, etc. A good introduction with practical code examples can be found in Chapter 9 (Performance) of Developing Large Web Applications. It talks about caching CSS, JavaScript, modules, pages, Ajax, and Expires headers.

If we need to keep things simple on the server side, do the following:

1. Install the APC extension, which will make PHP faster through so-called opcode caching. No special configuration is needed; it works silently for you.

2. Cache the full page for two hours using the simple PEAR library PEAR::Cache_Lite.

3. For each database SELECT query, cache the result in APC with a TTL of 5 minutes; md5-hash the SELECT statement and use the hash as the APC cache key (a sketch of this helper follows below). Docs

In the future, if you have multiple servers and performance becomes crucial, you will need to look at:

Shared memory caching between servers: check Memcached or even Membase.

A reverse proxy solution: basically a layer between your users and your server that serves the HTTP requests instead of your server. You can use Varnish, Squid, or Apache Traffic Server for that.

MySQL's InnoDB engine may be too slow for you, so you may need to go for a faster engine such as XtraDB.

Then maybe you will find that relational databases are still too slow for you, at which point you would go for a key-value solution such as MongoDB.

Finally, as references on web application performance, check:

Front-end performance: High Performance Web Sites, Even Faster Web Sites, and High Performance JavaScript.

Back-end performance: Pro PHP Application Performance and High Performance MySQL.
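As a concrete sketch of step 3 above: the helper name and the mysqli usage are illustrative; apc_fetch/apc_store are the APC calls doing the caching:

    <?php
    // Cache a SELECT's rows in APC for $ttl seconds, keyed by the md5 of the SQL.
    function cached_select(mysqli $db, $sql, $ttl = 300)
    {
        $key = 'sql_' . md5($sql);
        $rows = apc_fetch($key, $hit);
        if ($hit) {
            return $rows; // served from shared memory, no database round trip
        }
        // fetch_all() requires the mysqlnd driver
        $rows = $db->query($sql)->fetch_all(MYSQLI_ASSOC);
        apc_store($key, $rows, $ttl);
        return $rows;
    }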
I have two websites that will share some resources, let's say index.php, functions.js, and style.css, and these scripts will be used on almost all pages of both sites. I have two audiences to cater for (in terms of download speed): users within the same network that the sites are hosted on (approx. 100 Mb/s) and external users. I am looking for the best way to cache each kind of script (.js, .css, .php), with examples of how this would be done and the pros and cons over other methods if possible. By caching I mean local, network, and server caching. Note: index.php is a dynamic page which should be refreshed from cache every 2 hours. It would be nice if you start your answer with .js, .css, .php, or a combination so I can easily see what type of script you are talking about caching. Thanks all!
Different file caching methods, pros & cons [closed]
Sequences will not and cannot generate gap-free values, so you should expect numbers to be skipped occasionally; that is perfectly normal when you're using sequences. As you've surmised, the most likely scenario is that the sequence cache is aging out of the shared pool overnight, when the APEX application isn't being used. You can reduce the frequency of gaps by declaring your sequence NOCACHE, but that will decrease performance, and it will not eliminate gaps, just make them less frequent.
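For reference, switching the existing sequence to NOCACHE is a one-liner, with the performance trade-off just described:

    ALTER SEQUENCE seq_increment NOCACHE;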
I have created a table in APEX that has a PK incremented by a sequence:

    CREATE SEQUENCE seq_increment
      MINVALUE 1
      START WITH 880
      INCREMENT BY 1
      CACHE 10

This seems to work perfectly. The issue is that sometimes, usually when I come in in the morning and run a process to enter a new row, it skips a bunch of numbers. I care because these numbers are used as the ID# of documents in my company, and losing/skipping blocks of numbers will not be acceptable when this tool goes live. It does seem to jump to the next multiple of 10: yesterday my last test assigned 883, and this morning it assigned 890 as the next number. Looking at my sequence definition, I notice that I set it up to cache 10 values so that it processes quicker. Is it possible that this cache is getting dumped overnight, and that it pulled 890 because it had 880-889 in the cache when the cache was dumped? Are there other potential causes and solutions?
SQL Auto-Increment in Oracle APEX occasionally skips a chunk of numbers when incrementing?
When? Not too quickly. It is generally very cheap to have short-lived objects. For a cache to be profitable there would have to be (very) many candidates, and they should live long enough to make it to the next generation.

How can you diagnose if it's an issue? With a profiler. I'm not so sure the author of the article did that.

How much of a problem is managed heap fragmentation in a managed language like C#? As far as I know it is rare. .NET has a compacting garbage collector, which prevents most forms of fragmentation. There are occasional issues with the Large Object Heap.

Edit: when you go through the comments below the article, you will find that someone measured it and found the cache to be a lot slower than creating new EventArgs each time. Conclusion: measure before you start optimizing. This was not a good idea/example.
I was reading a blog entry by Josh Smith where he used a cache mechanism in order to "reduce managed heap fragmentation". His caching reduces the number of short-lived objects being created at the cost of slightly slower execution speed. How much of a problem is managed heap fragmentation in a managed language like C#? How can you diagnose if it's an issue? In what situations would you typically need to address it?
When to address managed heap fragmentation
As @Eilon asked, is this data user-specific or site-wide? If the data is user-specific you can use session state to store it; however, it will expire when the user's session ends, and in some cases it can still incur a round trip to your database server (if, for instance, session state is backed by SQL Server instead of being in-proc). If the data is application-wide you can use the Application Cache instead. It is site-wide, resides in the process's app domain, and is therefore available to everyone who has a session on that server. Special care must be taken when using it in a multi-server scenario, but it is easily doable. Note that the Application Cache (and any other global setup) can make the site load slowly for the first user to hit it if the setup takes time. IIS 7 and ASP.NET have attempted to address this with a recently released module that periodically wakes your app up to ensure that the global cache is either pre-populated or remains alive.
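A minimal sketch of both options from inside a Web Forms page; the Person type, the keys, and the loader method are illustrative:

    using System;
    using System.Collections.Generic;
    using System.Web.UI;

    public class Person { /* first name, last name, etc. */ }

    public partial class PeoplePage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                List<Person> people = LoadPeopleFromDatabase(); // hypothetical loader

                // Per-user: survives postbacks for the lifetime of this user's session.
                Session["People"] = people;

                // Site-wide: one copy shared by all sessions, expires after 20 minutes.
                Cache.Insert("People", people, null,
                    DateTime.UtcNow.AddMinutes(20),
                    System.Web.Caching.Cache.NoSlidingExpiration);
            }

            // On postback, read the list back instead of hitting the database again.
            var fromSession = (List<Person>)Session["People"];
            var fromAppCache = (List<Person>)Cache["People"];
        }

        private List<Person> LoadPeopleFromDatabase()
        {
            return new List<Person>(); // stand-in for the real query
        }
    }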
I have a List of objects in an ASP.NET page. When the page loads in the browser for the first time, I fetch data from the database, create objects from those data, and populate the list. All of this is done inside the Page_Load event handler. But when the page is posted back, the previous list is gone, since the variables were all freed. How can I cache that List, so that my objects are still available when the page is posted back?
How to cache a List of objects in ASP.NET
HTTP resources expire based on their own expiration settings. An HTML document is cached according to the headers served with that document; an image referenced by that document is cached according to the headers served with that image. See the Caching Tutorial for Web Authors and Webmasters.
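For example, assuming your static files are served by Apache with mod_expires enabled, each resource type can get its own expiry, independent of the HTML page's:

    # .htaccess / vhost sketch: per-type expiries
    ExpiresActive On
    ExpiresByType text/html       "access plus 7 days"
    ExpiresByType text/css        "access plus 30 days"
    ExpiresByType text/javascript "access plus 30 days"
    ExpiresByType image/png       "access plus 30 days"

With this, the HTML can expire in 7 days while its images, CSS, and JS expire in 30; none of them inherit the page's setting.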
I understand the basics of HTML page caching. My uncertainty relates to how caching works on images, included external scripts, and included CSS stylesheets that the HTML page uses. For example, let's say I have an HTML page that is set to expire in 7 days. The page has 10 images on it, 2 included external CSS (.css) stylesheets, and 2 external included javascript (.js) files. Do all of these also expire in 7 days and follow what I implement in the HTML page? Any way to specify individually when these external items expire? I seem to get mixed results on different browsers and/or with a reload or SHIFT+RELOAD action. Perhaps there is an article somewhere that explains how this works (or should work)? Thanks -
HTML Page Caching Question
You can build an LRU cache using only the standard JDK library, via LinkedHashMap:

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class MyLRUCache<K, V> extends LinkedHashMap<K, V> {

        private final int maxEntries;

        public MyLRUCache(int maxEntries) {
            // initial capacity 16, load factor 0.75f, accessOrder = true
            // (you can be a bit fancier with capacity and load factor)
            super(16, 0.75f, true);
            this.maxEntries = maxEntries;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            // evict the least recently used entry once we exceed the limit
            return size() > maxEntries;
        }
    }

You may want to play with WeakReferences as well.
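Usage is then just a Map; the key/value types and the size here are illustrative:

    Map<Long, byte[]> cache = new MyLRUCache<Long, byte[]>(1000);
    cache.put(42L, payload);     // past 1000 entries, the LRU entry is evicted
    byte[] hit = cache.get(42L); // get() also refreshes recency (accessOrder = true)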
Are there any implementations of a fixed-size hashtable that limits its entries to the most recently or most frequently used metadata? I would prefer not to keep track of this information myself. I know most caching components keep track of this, but I would rather not introduce a lot of new dependencies.
Frequently Used metadata Hashmap
If you just want a quick roll-your-own caching solution, have a look at this article on JavaSpecialists, which reviews the book Java Concurrency in Practice by Brian Goetz. It talks about implementing a basic thread-safe cache using a FutureTask and a ConcurrentHashMap. The way this is done ensures that only one concurrent thread triggers the long-running computation (in your case, your database calls in the DAO). You'd have to modify this solution to add cache expiry if you need it.

The other thought about caching it yourself is garbage collection. Without using a WeakHashMap for your cache, the GC cannot release the memory used by the cache if it is needed elsewhere. If you are caching infrequently accessed data (but data that is still worth caching because it is hard to compute), you might want to help the garbage collector out in low-memory situations by using a WeakHashMap.
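A condensed sketch of that Memoizer pattern from the article/book; error handling around cancellation is omitted for brevity, and the Callable stands in for your DAO's database call:

    import java.util.concurrent.Callable;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.Future;
    import java.util.concurrent.FutureTask;

    public class Memoizer<K, V> {

        private final ConcurrentMap<K, Future<V>> cache =
                new ConcurrentHashMap<K, Future<V>>();

        public V compute(K key, Callable<V> loader)
                throws InterruptedException, ExecutionException {
            Future<V> f = cache.get(key);
            if (f == null) {
                FutureTask<V> task = new FutureTask<V>(loader);
                f = cache.putIfAbsent(key, task); // only one thread wins the race
                if (f == null) {
                    f = task;
                    task.run(); // the winner performs the (slow) load
                }
            }
            return f.get(); // losers block here until the load finishes
        }
    }

A DAO would then call memoizer.compute("allLocations", loaderCallable) instead of synchronizing the whole getter; concurrent readers never see a half-loaded list, and the load runs at most once.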
I often need to implement DAOs for reference data that doesn't change very often. I sometimes cache this in a collection field on the DAO, so that it is loaded only once and explicitly updated when required. However, this brings in many concurrency issues: what if another thread attempts to access the data while it is loading or being updated? Obviously this can be handled by making both the getters and setters of the data synchronized, but for a large web application that is quite an overhead.

I've included a trivially flawed example of what I need as a strawman. Please suggest alternative ways to implement this.

    public class LocationDAOImpl implements LocationDAO {

        private List<Location> locations = null;

        public List<Location> getAllLocations() {
            if (locations == null) {
                loadAllLocations();
            }
            return locations;
        }
    }

For further information: I'm using Hibernate and Spring, but this requirement applies across many technologies.

Some further thoughts: should this not be handled in code at all, and instead left to ehcache or similar? Is there a common pattern for this that I'm missing? There are obviously many ways this can be achieved, but I've never found a pattern that is simple and maintainable.

Thanks in advance!
How to cache information in a DAO in a threadsafe manner
Even though the factory pattern is a good option for this kind of problem, you normally don't need it for Symfony's cache system. Type-hint CacheItemPoolInterface instead:

    use Psr\Cache\CacheItemPoolInterface;

    public function __construct(CacheItemPoolInterface $cache)

Symfony then automatically injects the current cache.app service depending on the active environment, so it does the job for you! Just make sure to configure framework.cache.app in each environment's config file:

    # app/config/config_test.yml
    imports:
        - { resource: config_dev.yml }

    framework:
        #...
        cache:
            app: cache.adapter.null

    services:
        cache.adapter.null:
            class: Symfony\Component\Cache\Adapter\NullAdapter
            arguments: [~] # small trick to avoid argument errors at compile time

As the cache.adapter.null service isn't available by default, you might need to define it manually.
I am trying to use a different cache system on each of my environments: for example, Filesystem for dev and Memcached for prod. I am using Symfony 3.3.10. To achieve this, I would like to autowire CacheInterface as follows:

    use Psr\SimpleCache\CacheInterface;

    class Api
    {
        public function __construct(CacheInterface $cache)
        {
            $this->cache = $cache;
        }
    }

Here are my configuration files:

    # config_dev.yml
    framework:
        cache:
            app: cache.adapter.filesystem

    # config_prod.yml
    framework:
        cache:
            app: cache.adapter.memcached
            ...

Here is the error I get: (error screenshot omitted). The error disappears when FilesystemCache is declared as a service:

    services:
        Symfony\Component\Cache\Simple\FilesystemCache: ~

But then I cannot have another cache system, like NullCache, for the test environment. In fact, I have to declare exactly one service implementing CacheInterface, which is not possible as config_test imports config_dev too. This is the beginning of services.yml, in case it helps:

    services:
        _defaults:
            autowire: true
            autoconfigure: true
            public: false

Any idea how to autowire a different cache system depending on the environment?

EDIT: here is the working configuration:

    use Psr\Cache\CacheItemPoolInterface;

    class MyApi
    {
        /**
         * @var CacheItemPoolInterface
         */
        private $cache;

        public function __construct(CacheItemPoolInterface $cache)
        {
            $this->cache = $cache;
        }
    }

config.yml:

    framework:
        # ...
        cache:
            pools:
                app.cache.api:
                    default_lifetime: 3600

services.yml:

    # ...
    Psr\Cache\CacheItemPoolInterface:
        alias: 'app.cache.api'
Autowire symfony CacheInterface depending on environment
Ehcache 2.x

Programmatic:

    CacheManager cacheManager = initCacheManager();
    CacheConfiguration cacheConfiguration = new CacheConfiguration().name("myCache")
        .maxEntriesLocalHeap(100)
        .timeToLiveSeconds(20);
    cacheManager.addCache(new Cache(cacheConfiguration));

XML:

    <cache name="myCache"
           maxEntriesLocalHeap="100"
           timeToLiveSeconds="20"/>

Override per Element: Ehcache 2.x allows you to override expiry settings per Element:

    Element element = new Element("key", "value");
    element.setTimeToLive(10);
    cache.put(element);

Ehcache 3.x

Programmatic:

    CacheManager cacheManager = initCacheManager();
    CacheConfigurationBuilder<Long, String> configuration =
        CacheConfigurationBuilder.newCacheConfigurationBuilder(Long.class, String.class,
                ResourcePoolsBuilder.heap(100))
            .withExpiry(Expirations.timeToLiveExpiration(new Duration(20, TimeUnit.SECONDS)));
    cacheManager.createCache("myCache", configuration);

In Ehcache 3, the builder is immutable and can be safely shared to create multiple caches from a similar configuration. The code will also be more compact if you use static imports, which I did not do here to ease pasting this snippet into an IDE.

XML:

    <cache alias="myCache">
        <expiry>
            <ttl unit="seconds">20</ttl>
        </expiry>
        <heap>100</heap>
    </cache>

Override through custom Expiry: in Ehcache 3.x, Expiry is an interface which users can implement:

    public interface Expiry<K, V> {
        Duration getExpiryForCreation(K key, V value);
        Duration getExpiryForAccess(K key, ValueSupplier<? extends V> value);
        Duration getExpiryForUpdate(K key, ValueSupplier<? extends V> oldValue, V newValue);
    }

time-to-live matches the getExpiryForCreation invocation, which receives the key and value of the mapping, allowing you to implement different expirations depending on the mapping itself.
I need a simple cache for storing tuples in memory with a certain time-to-live. I couldn't find a way to do that on EHcache website, which contains mostly complex usage scenarios. Can anyone help me out? P.S. I don't use Spring.
EHcache simple example with time-to-live
While it's true that HTTP and the network add latency, generally you need a cache because the actual operation takes significantly longer. So the question is: if you add 1-2 milliseconds to each cache access, does that still shorten the un-cached operation time significantly? If the answer is yes, and you follow some common best practices, having a centralized cache could be a good idea. You might want to look into low-latency, high-throughput server-side frameworks for the HTTP service, like Node.js or Go. You will also probably benefit from implementing proper ETag support in your cache HTTP API.

Another alternative might be centralizing the cache server(s) without wrapping them in an HTTP layer. Standard cache provider implementations for all the technologies you mentioned are available for most modern web frameworks.

From the comments: the increased performance of dropping HTTP is not necessarily about the protocol itself (though binary protocols such as Memcached's typically have smaller overhead and quicker processing than text-based ones), but rather about eliminating the middleman: the HTTP gateway typically runs in a different process and communicates over some kind of socket/IPC with the actual cache.
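To make the ETag point concrete, a sketch in Go (one of the options suggested above) of how the HTTP layer can answer revalidation requests without re-sending the body; lookup() is a hypothetical stand-in for your cache read returning a value plus a version tag:

    package main

    import "net/http"

    func lookup(key string) (body []byte, etag string) {
        // hypothetical: read value + version from the underlying cache cluster
        return []byte(`{"hello":"world"}`), `"v42"`
    }

    func cacheGet(w http.ResponseWriter, r *http.Request) {
        body, etag := lookup(r.URL.Path)
        if r.Header.Get("If-None-Match") == etag {
            w.WriteHeader(http.StatusNotModified) // client copy still valid: no body sent
            return
        }
        w.Header().Set("ETag", etag)
        w.Write(body)
    }

    func main() {
        http.HandleFunc("/", cacheGet)
        http.ListenAndServe(":8080", nil)
    }

A 304 with no body costs far less bandwidth than re-sending the cached value, which softens the HTTP overhead you're worried about.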
We have many web services and web applications with caching needs, so we are trying to come up with a caching strategy that can help all the teams irrespective of their technology choices. We have used Memcached (not replicated) and Couchbase (multi-master) running locally on each server node, with applications connecting to them locally via the Memcached protocol. Going forward, we are planning a centralized cache cluster exposed via REST APIs, usable by all applications running on any server node in a datacenter. The reasons behind this thought process:

1. Easy maintenance of a cluster without worrying about app server nodes.

2. A single protocol (HTTP) used to access the cache, without worrying about the underlying implementation (we might use a Redis, Couchbase, or Aerospike cluster).

But we are not sure about this strategy, because we are worried about the performance impact of the network and HTTP overhead. Has anyone tried this strategy? Is it a good idea to make the cache a centralized service, or are local caches best?
Is it good idea to provide cache as service?
The dependencies are resolved on a need-by-need basis:

The compile-time dependencies for the application's code are downloaded as the very first step when you start the build.

Any plugins required are downloaded afterwards, during their respective phases. This also includes their transitive dependencies.

The dependencies for the tests are downloaded when the tests are compiled and executed. These are the dependencies with <scope>test</scope>.

Therefore, by the time you're at the test phase, you must already have the latest dependencies, unless you have them cached locally and installed and you're in offline mode.

To resolve all your dependencies, you can run:

    mvn dependency:go-offline

To resolve all the plugins, you can run:

    mvn dependency:resolve-plugins
How can I warm up the dependency cache for my Maven tests? E.g. mvn test -DskipTests downloads some of the dependencies, but not all; for example, some of the Maven Surefire plugin dependencies are only downloaded by mvn test. I want to create a snapshot of my filesystem to speed up the execution of my tests, so I'd like to have all the dependencies downloaded without executing the tests themselves. Some of the dependencies that are only downloaded during mvn test and not by mvn test -DskipTests:

    Downloading: http://repo1.maven.org/maven2/org/apache/maven/surefire/surefire-junit47/2.14/surefire-junit47-2.14.pom
    Downloaded: http://repo1.maven.org/maven2/org/apache/maven/surefire/surefire-junit47/2.14/surefire-junit47-2.14.pom (4 KB at 13.9 KB/sec)
    Downloading: http://repo1.maven.org/maven2/org/apache/maven/surefire/surefire-providers/2.14/surefire-providers-2.14.pom
    Downloaded: http://repo1.maven.org/maven2/org/apache/maven/surefire/surefire-
Warm up maven dependency cache
If you have a problem with caching in the browser itself, you need to set headers that prevent caching in your server-side code (e.g. PHP, Java, Python, etc.). But if you want to avoid caching in your Ajax calls, use this:

    $.ajax({
        url: scriptUrl,
        type: 'get',   // or 'post'
        cache: false,  // avoids caching of Ajax requests
        // ...etc
    });

Notice: give more details in your question to get a sharper answer.
I have an Ajax function (in jQuery) which updates the HTML, and it works. The problem: when I press Enter in the address bar (to refresh), the HTML is not updated (old content), whereas when I do Cmd+R, the HTML is always up to date. What is the problem? Why does the page show old content when reloaded from the address bar? Edit: I use Chrome, and load() in jQuery.
How to disable browser cache?
From the Oracle Coherence Developer's Guide:

    When a lock is in place, it is the responsibility of the caller (either in the
    same thread or the same cluster node, depending on the lease-granularity
    configuration) to release the lock.

By default Coherence uses thread lock-ownership granularity, so that is probably why the lock is not being released. A value of thread means that locks are held by the thread that obtained them and can only be released by that thread. A value of member means that locks are held by a cluster node, and any thread running on the cluster node that obtained the lock can release it. See http://docs.oracle.com/cd/E24290_01/coh.371/e22837/api_transactionslocks.htm#BEIIEEBB and http://docs.oracle.com/cd/E24290_01/coh.371/e22837/appendix_operational.htm#BAGJBCEF for more details.
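If you do want any thread on the owning member to be able to release the lock, the cache scheme can be configured accordingly. A hedged sketch of the cache-config element (element names are per the Coherence docs linked above; the scheme and service names are illustrative):

    <distributed-scheme>
      <scheme-name>example-locking</scheme-name>
      <service-name>DistributedCache</service-name>
      <!-- member: any thread on the node that acquired the lock may unlock it -->
      <lease-granularity>member</lease-granularity>
      <backing-map-scheme>
        <local-scheme/>
      </backing-map-scheme>
      <autostart>true</autostart>
    </distributed-scheme>

With the default thread granularity, your second thread can never release a lock taken on the main thread, which is exactly the false you are seeing.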
I have a test sample for the Coherence lock/unlock mechanism:

    public class Test {

        public static void main(String[] args) throws InterruptedException, IOException,
                IllegalArgumentException, IllegalAccessException {
            Trt test = new Trt();
            test.lock();
            Thread a = new Thread(test);
            a.start();
        }

        public static class Trt implements Runnable {

            NamedCache cache = null;

            @Override
            public void run() {
                System.out.println(cache.unlock("asd"));
            }

            public void lock() {
                cache = CacheFactory.getCache(Globals.REGISTRY_CACHE_NAME);
                System.out.println(cache.lock("asd"));
            }
        }
    }

The result is:

    true
    false

The result I am expecting is:

    true
    true

The thing is, I have only one Trt instance, and it holds a single NamedCache reference, so I assumed that instance owned the lock. Why can't it release the lock; why does unlock return false in the end? Thanks, Ali
Coherence lock-unlock usage
There are 3 ways of loading a page:

1. Putting the URL in the address bar and pressing Enter, which is equivalent to navigating from a hyperlink (the default browsing behaviour). This honours the Expires headers: the browser first checks whether the cached copy of the static content is still valid, and if the Expires time is in the future it loads the content directly from the cache, making no request to the server at all. Only if the cached content is invalid does it make a request to the server.

2. Pressing F5 to refresh the page. This sends a conditional (If-Modified-Since) request to the server to verify whether the content has changed: a 200 response if it has, a 304 if it hasn't. In both cases the image is not loaded on the page until a response is received from the server.

3. Pressing Ctrl+F5, which forcefully clears the cache and reloads every static item on the page, without spending time validating anything against the headers.

I guess the behaviour you are expecting is the first kind. The only thing you should look at is the way you are loading the page. Normally people do not press F5 or Ctrl+F5, so your static content will not be revalidated every time. In short, just remember: reload the page by pressing Enter in the address bar instead. The browser will honour the headers you have provided. This is not specific to Chrome; it's a W3C standard.
I have a web application that contains a few hundred small images, and it performs quite badly on load. To combat this, I would like to cache static files in the browser. Using a servlet filter on Tomcat 7, I now set the Expires header correctly on static files, and can see that this is returned to Chrome:

    Accept-Ranges: bytes
    Cache-Control: max-age=3600
    Content-Length: 40284
    Content-Type: text/css
    Date: Sat, 14 Apr 2012 09:37:04 GMT
    ETag: W/"40284-1333964814000"
    Expires: Sat, 14 Apr 2012 10:37:05 GMT
    Last-Modified: Mon, 09 Apr 2012 09:46:54 GMT
    Server: Apache-Coyote/1.1

However, I notice that Chrome still does a round trip to the server for each static resource on reloads, sending an If-Modified-Since header and getting a correct 304 Not Modified response from Tomcat. Is there any way to make Chrome avoid these 100+ requests to the server until the expiry has genuinely passed?
Chrome & Expires Header - Image Caching
(Following up on @aclark's answer:) in production mode, CSS files are all merged into a single big stylesheet for performance reasons. There's a little trick to force a CSS refresh though: go to the Zope Management Interface -> portal_css, toggle the selection of any CSS entry (just to simulate a change in the configuration), and then click "Save" at the end of the page. This makes Zope think you made a change and forces it to refresh the CSS digest.
Basically, I'm the admin of a Plone website and I want to try out changes to plone.css, plus other stuff in Base Properties and ploneCustom.css for my additional elements. I want to be able to quickly switch from my custom CSS to the default, to try out different versions of plone.css. What's the best way? Is it about the cache, or should I try CSS-manager-style switching? If so, how? When I "save" the contents of plone.css or other style properties, it either takes ages to show up or ages to disappear... Thanks.
How do I load/unload CSS changes quickly on a Plone + Zope site? (noob)