active_model_serializers: Cannot override default cache key with fragment caching
Expected vs Actual Behavior
I have a model, say User, that I would like to present in two representations, say a UserSerializer and a ProfileSerializer. My serializers are defined as follows:
class UserSerializer < ApplicationSerializer
  cache skip_digest: true, except: [:num_following]
  type 'users'

  attributes :first_name, :last_name, :num_following

  # Namespace the cache key for the users representation.
  def object_cache_key
    "api:users:#{super}"
  end
end

class ProfileSerializer < UserSerializer
  cache skip_digest: true, except: [:num_following]
  type 'profiles'

  attributes :personal_info_attribute

  # Namespace the cache key for the profiles representation.
  def object_cache_key
    "api:profiles:#{super}"
  end
end
The generated cache keys do not follow the overridden object_cache_key and fall back to the AMS default, which is based on the model name. As a result, ProfileSerializer responses overwrite UserSerializer responses in the cache, since both serializers act on the same model.
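For illustration, the default Rails cache_key is derived purely from the model, so both serializers resolve to the same key. A sketch (the exact timestamp format depends on cache_timestamp_format):

user = User.find(1)
user.cache_key
# => "users/1-20160304193553" or similar: model name + id + updated_at.
# Both UserSerializer and ProfileSerializer fragments key off this one
# value, so whichever renders last overwrites the other's cached JSON.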
Is there a way to provide a cache prefix or override the default cache key generated in fragment caching? I have also tried cache key: 'some_prefix', skip_digest: true, except: [:num_following], and that doesn't work either.
This issue only occurs with fragment caching, due to the dynamic creation of Cached vs. NonCached variants of the serializer. Everything works fine without fragment caching.
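One possible workaround, sketched under the assumption (confirmed later in this thread) that fragment caching keys off object.cache_key: wrap the model in a per-representation decorator. The decorator class and usage below are illustrative, not from the report.

require 'delegate'

# Illustrative decorator: gives the profiles representation its own
# cache namespace without touching the User model itself.
class ProfileCacheDecorator < SimpleDelegator
  def cache_key
    "api:profiles:#{__getobj__.cache_key}"
  end
end

# Usage sketch: render the decorated object with an explicit serializer.
# render json: ProfileCacheDecorator.new(user), serializer: ProfileSerializer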
Steps to reproduce
See examples above
Environment
Rails 4.2.2, Grape 0.16
About this issue
- Original URL
- State: open
- Created 8 years ago
- Comments: 30 (22 by maintainers)
@lserman sorry to hear that you had trouble with that. I'll try to dig in and see whether there is a deeper reason for that specific change. The issue right now is that changing this may imply a breaking change, so I'll see what we can do.
If we are explicitly overriding the cache key in the serializer, I would think that would take precedence. In Rails, every model implements cache_key from ActiveRecord::Base, so #1642 essentially removes the ability to cache the same model in different serializers without something like what @groyoh commented.

Also, the current implementation requires the model's cache_key to be updated whenever anything regarding serialization changes, such as adding a field to the JSON. I don't think this should be the model's concern; the serializer cache key should be able to bust its own cache without the model caring about how it is serialized.

If the current implementation is really preferred, I would suggest at least some warning when the serializer cache key is ignored. We were just burned by this in production because we added an existing field to the JSON and thought an update to the serializer cache key would fix the issue.
Having one serializer for User and one serializer for UserWithAuthToken is a pretty common use case for multiple serializers on one model. I would not be surprised if there are applications out there that are unknowingly caching UserWithAuthToken and then serving that JSON to other requests that use the normal UserSerializer. You can imagine how bad this would be, but the developer would never know about it unless they inspected the JSON responses of their own application! Seems dangerous.
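A minimal sketch of that scenario (serializer and attribute names are illustrative, not from the thread):

class UserSerializer < ActiveModel::Serializer
  cache key: 'user', skip_digest: true
  attributes :id, :email
end

class UserWithAuthTokenSerializer < ActiveModel::Serializer
  cache key: 'user_with_auth_token', skip_digest: true
  attributes :id, :email, :auth_token
end

# Because object.cache_key wins over the key: option, both serializers
# share one cache entry, and a cached auth_token payload can be served
# to a request that only asked for the plain UserSerializer.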
cache:compact_user_users/ keys look way better!

@rromanchuk probably due to #1642. The object.cache_key is preferred to the serializer cache key: option. I personally think the priority should be switched so that we would first check whether the user has defined the key option, then fall back to object.cache_key. @bf4 any thought about this?
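A rough sketch of that proposed lookup order (a hypothetical helper, not current AMS code; _cache_key stands in for whatever the cache key: option sets):

def fragment_cache_key(serializer)
  # Prefer the explicit serializer-level key: option...
  if (prefix = serializer.class._cache_key)
    "#{prefix}/#{serializer.object.cache_key}"
  # ...and only fall back to the model's cache_key when none is given.
  else
    serializer.object.cache_key
  end
end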
@onomated github thinks I'm a robot