@erwan-lemonnier
Last active November 1, 2020 06:17
Bulk fetch from Datastore only if objects aren't already in the cache
# Here is a more complicated endpoint that retrieves objects
# from Datastore in multiple phases, and builds up a queue of objects
# to bulk commit at the end

from pymacaron.auth import get_userid

def do_get_recommended_new_friends(data):
    cache = ObjectCache()
    saver = ObjectSaver(cache=cache)

    # Retrieve the viewer's profile, as well as some other profiles whose
    # IDs are in data.user_ids
    viewer_id = get_userid()
    cache.fetch_missing_objects([viewer_id] + data.user_ids)

    # Queue up an update of the viewer's profile
    viewer = cache.get_object(viewer_id)
    viewer.update_some_things()
    saver.put_object(viewer)

    # Query Elasticsearch for new users that the viewer should see in her
    # friends feed
    suggested_friend_ids = es.search_new_friends()

    # Now we need to get the profiles of those new friends from Datastore. But
    # some of them may already be in the cache, so we let the cache do the
    # Datastore bulk get after first filtering out the profiles it already holds:
    cache.fetch_missing_objects(suggested_friend_ids)

    # And queue up more objects to bulk insert
    for o in get_objects_to_save():
        saver.put_object(o)

    # Finally, bulk write all of these objects in one go:
    saver.commit()
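The gist does not define ObjectCache or ObjectSaver. Below is a minimal sketch of what they could look like on top of the official google-cloud-datastore client: the class and method names mirror the gist, but the 'User' kind, the key layout, and the internals are assumptions made for illustration, not the author's actual implementation.

# A hedged sketch of ObjectCache / ObjectSaver built on google-cloud-datastore.
from google.cloud import datastore

client = datastore.Client()

KIND = 'User'  # assumed entity kind


class ObjectCache:
    """In-memory cache of Datastore entities, keyed by ID."""

    def __init__(self):
        self.objects = {}

    def fetch_missing_objects(self, ids):
        # Only bulk-get the IDs we don't already hold in the cache
        missing = [i for i in ids if i not in self.objects]
        if not missing:
            return
        keys = [client.key(KIND, i) for i in missing]
        for entity in client.get_multi(keys):
            self.objects[entity.key.id_or_name] = entity

    def get_object(self, id):
        return self.objects[id]


class ObjectSaver:
    """Queue of entities to write back to Datastore in one bulk call."""

    def __init__(self, cache=None):
        self.cache = cache
        self.queue = []

    def put_object(self, obj):
        self.queue.append(obj)

    def commit(self):
        # One put_multi call instead of one put per object
        if self.queue:
            client.put_multi(self.queue)
            self.queue = []

The point of this split is that every fetch_missing_objects() call turns into at most one get_multi(), and all writes queued during the request are flushed with a single put_multi() at the end.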