python - Is there a way to cache the fetch output?



I'm working on a closed system running in the cloud.

What I need is a search function that uses a user-typed-in regexp to filter the rows in a dataset:

    phrase = re.compile(request.get("query"))
    info = Entry.all().fetch(50000)  # this takes around 10 s when there are 6000 records
    result = [x for x in info if phrase.search(x.title)]

Now, the database won't change much, and there are no more than 200-300 searches a day.

Is there a way to somehow cache all the entries (I expect there to be no more than 50,000 of them, each no bigger than 500 bytes), so that retrieving them won't take more than 10 seconds? Or perhaps to parallelize it? I don't mind 10 CPU seconds, but I do mind the 10 seconds the user has to wait.

To address the answers suggesting "index and use .filter()": the query is a regexp, and I don't know of any indexing mechanism that would allow the use of a regexp.

Since there is a bounded number of entries, you can memcache all of them and do the filtering in memory as you've outlined. However, note that each memcache entry cannot exceed 1 MB, while you can fetch up to 32 MB of memcache entries in parallel.

So split the entries into subsets, memcache the subsets, and read them back in parallel by precomputing the memcache keys.
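
A minimal sketch of that approach, assuming a db.Model called Entry with a title property; the names CHUNK_SIZE, KEY_PREFIX, refresh_cache(), load_rows() and search() are illustrative, not from the original answer:

    import re

    from google.appengine.api import memcache
    from google.appengine.ext import db


    class Entry(db.Model):
        title = db.StringProperty()


    CHUNK_SIZE = 1000              # ~500 bytes/record keeps each chunk well under 1 MB
    KEY_PREFIX = 'entry_chunk:'    # keys are precomputable: entry_chunk:0, entry_chunk:1, ...
    COUNT_KEY = 'entry_chunk_count'


    def refresh_cache():
        """Fetch everything once and store it in memcache as chunks of plain tuples."""
        rows = [(str(e.key()), e.title) for e in Entry.all().fetch(50000)]
        chunks = [rows[i:i + CHUNK_SIZE] for i in range(0, len(rows), CHUNK_SIZE)]
        memcache.set_multi(dict((str(i), c) for i, c in enumerate(chunks)),
                           key_prefix=KEY_PREFIX)
        memcache.set(COUNT_KEY, len(chunks))
        return rows


    def load_rows():
        """Read all chunks back with a single batched get_multi call."""
        count = memcache.get(COUNT_KEY)
        if count is None:
            return refresh_cache()
        cached = memcache.get_multi([str(i) for i in range(count)],
                                    key_prefix=KEY_PREFIX)
        if len(cached) != count:      # a chunk was evicted; rebuild the cache
            return refresh_cache()
        rows = []
        for i in range(count):
            rows.extend(cached[str(i)])
        return rows


    def search(query):
        phrase = re.compile(query)
        return [row for row in load_rows() if phrase.search(row[1])]

With roughly 50,000 records of ~500 bytes each, 1000-record chunks come to about 500 KB per memcache value (under the 1 MB per-entry limit), and the resulting ~50 chunks together stay within the 32 MB that can be fetched in parallel.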

More on the memcache functions here:

http://code.google.com/appengine/docs/python/memcache/functions.html

python google-app-engine datastore
