In some cases, one would only want to keep the result in cache as long
as there is any other reference to the result. By trac ticket #12215, this is
enabled for UniqueRepresentation,
which is used to create unique parents: If an algebraic structure, such
as a finite field, is only temporarily used, then it will not stay in
cache forever. That behaviour is implemented using weak_cached_function,
which behaves the same as cached_function, except that it uses a
CachedWeakValueDictionary for storing the results.
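
The idea can be sketched in plain Python with the standard library's weakref.WeakValueDictionary as a simplified stand-in for CachedWeakValueDictionary (the decorator name memoize_weak and the class Result are hypothetical):

```python
import weakref

class Result:
    """A dummy value type; weakref needs class instances, not ints or strings."""
    def __init__(self, value):
        self.value = value

def memoize_weak(f):
    # Hypothetical sketch: as with weak_cached_function, results stay
    # cached only while some other strong reference to them exists.
    cache = weakref.WeakValueDictionary()
    def wrapper(*args):
        try:
            return cache[args]
        except KeyError:
            result = f(*args)
            cache[args] = result
            return result
    wrapper.cache = cache
    return wrapper

@memoize_weak
def compute(x):
    return Result(x * 2)

a = compute(3)   # computed and cached
b = compute(3)   # cache hit: the very same object
assert a is b
del a, b         # no strong references left: the weak-value entry
                 # is (eventually) dropped, so a later call recomputes
```

On CPython the entry disappears as soon as the last strong reference is dropped; other implementations may only reclaim it at the next garbage collection.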

Cython cdef functions do not allow arbitrary decorators.
However, one can wrap a Cython function and turn it into
a cached function, by trac ticket #11115. We need to provide
the name that the wrapped method or function should have,
since otherwise the name of the original function would
be used:
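
In plain Python the same pattern looks as follows (memoize is a hypothetical stand-in for cached_function; the point is passing an explicit name for the wrapper):

```python
from functools import update_wrapper

def memoize(f, name=None):
    # Hypothetical stand-in for cached_function(f, name=...): wrap an
    # existing function into a cached one, giving the wrapper an explicit
    # name, since otherwise the original function's name would be used.
    cache = {}
    def wrapper(*args):
        if args not in cache:
            cache[args] = f(*args)
        return cache[args]
    update_wrapper(wrapper, f)   # copies __name__, __doc__, ...
    if name is not None:
        wrapper.__name__ = name  # override with the requested name
    return wrapper

def negate(x):
    return -x

wrapped = memoize(negate, name='wrapped_negate')
assert wrapped.__name__ == 'wrapped_negate'
assert wrapped(5) == -5
```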

We can proceed similarly for cached methods of Cython classes,
provided that they allow attribute assignment or have a public
attribute __cached_methods of type <dict>. Since
trac ticket #11115, this is the case for all classes inheriting from
Parent. See below for a more explicit
example. By trac ticket #12951, cached methods of extension classes can
be defined by simply using the decorator. However, an indirect
approach is still needed for cpdef methods:


By trac ticket #11115, even if a parent does not allow attribute
assignment, it can inherit a cached method from the parent class of a
category (previously, the cache would have been broken):

In order to keep the memory footprint of elements small, it was
decided to not support the same freedom of using cached methods
for elements: If an instance of a class derived from
Element does not allow attribute
assignment, then a cached method inherited from the category of
its parent will break, as in the class MyBrokenElement below.

However, there is a class ElementWithCachedMethod
that generally has slower attribute access, but fully supports
cached methods. We remark, however, that cached methods are
much faster if attribute access works. So, we expect that
ElementWithCachedMethod will
hardly be used.

By trac ticket #12951, the cached_method decorator is also supported on non-c(p)def
methods of extension classes, as long as they either support attribute assignment
or have a public attribute of type <dict> called __cached_methods. The
latter is easy:
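
A plain-Python sketch of "the latter" (all names here are illustrative, and since Sage's attribute is literally named __cached_methods, we use _cached_methods to sidestep Python name mangling): the class exposes a public dict, and the cached-method machinery stores its per-instance caches there instead of in the (absent) instance __dict__:

```python
class cached_method_sketch:
    # Hypothetical sketch of a cached-method descriptor that keeps its
    # cache in a public dict on the instance, as needed for extension
    # classes that do not support attribute assignment.
    def __init__(self, f):
        self.f = f
    def __get__(self, obj, cls):
        f = self.f
        def caller(*args):
            cache = obj._cached_methods.setdefault(f.__name__, {})
            if args not in cache:
                cache[args] = f(obj, *args)
            return cache[args]
        return caller

class MyClass:
    # __slots__ removes the instance __dict__, mimicking an extension class
    __slots__ = ('_cached_methods',)
    def __init__(self):
        self._cached_methods = {}
    @cached_method_sketch
    def square(self, x):
        return x * x

m = MyClass()
assert m.square(4) == 16
assert m._cached_methods['square'][(4,)] == 16   # the value is cached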

Providing attribute access is a bit more tricky, since an attribute
inherited by the instance from its class must be overridable
on the instance. That is why providing a __getattr__ method would not be
enough in the following example:
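
The following illustrative sketch shows the difference: __getattr__ is only consulted when ordinary lookup fails, so it can never shadow a method that already exists on the class, whereas __getattribute__ intercepts every lookup and can consult a per-instance store first:

```python
class WithGetattr:
    # __getattr__ only fires when normal lookup FAILS, so it cannot
    # override a method that already exists on the class.
    def method(self):
        return 'class version'
    def __getattr__(self, name):
        return lambda: 'fallback'

w = WithGetattr()
assert w.method() == 'class version'   # __getattr__ never fires here
assert w.missing() == 'fallback'       # only unknown names reach __getattr__

class WithGetattribute:
    # __getattribute__ intercepts every lookup, so a per-instance
    # replacement (e.g. a cached-method caller) can shadow the class method.
    def __init__(self):
        self._overrides = {}
    def method(self):
        return 'class version'
    def __getattribute__(self, name):
        overrides = object.__getattribute__(self, '_overrides')
        if name in overrides:
            return overrides[name]
        return object.__getattribute__(self, name)

v = WithGetattribute()
assert v.method() == 'class version'
v._overrides['method'] = lambda: 'per-instance caller'
assert v.method() == 'per-instance caller'
```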

Some immutable objects (such as \(p\)-adic numbers) cannot implement a
reasonable hash function because their == operator has been
modified to return True for objects which might behave differently
in some computations:

If such objects defined a non-trivial hash function, this would break
caching in many places. However, such objects should still be usable
in caches. This can be achieved by defining an appropriate method
_cache_key:
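
A hedged sketch of the idea (the class and parent names are invented; real \(p\)-adics carry much more structure): an element whose == is too coarse for hashing defines _cache_key instead, returning its exact data with the parent first:

```python
class InexactElement:
    # Hypothetical sketch: == compares only up to the coarser precision
    # (like p-adic numbers), so a reasonable __hash__ is impossible and
    # the object defines _cache_key instead.
    __hash__ = None          # explicitly unhashable
    def __init__(self, parent, approx, prec):
        self.parent = parent
        self.approx = approx
        self.prec = prec
    def __eq__(self, other):
        p = min(self.prec, other.prec)
        return self.approx % 10**p == other.approx % 10**p
    def _cache_key(self):
        # include the parent first, then the exact defining data
        return (self.parent, self.approx, self.prec)

x = InexactElement('Zp(10)', 123, 2)
y = InexactElement('Zp(10)', 23, 3)
assert x == y                            # == is coarse...
assert x._cache_key() != y._cache_key()  # ...but the cache keys distinguish them
```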

This attribute will only be accessed if the object itself
is not hashable.

An implementation must make sure that for elements a and b,
if a != b, then also a._cache_key() != b._cache_key().
In practice this means that the _cache_key should always include
the parent as its first argument:

sage: S.<a> = Qq(4)
sage: d = a.add_bigoh(1)
sage: b._cache_key() == d._cache_key()  # this would be True if the parents were not included
False

For proper behavior, the method must be a pure function (no side effects).
If this decorator is used on a method, it will have identical output on
equal elements. This is because the element is part of the hash key.
Arguments to the method must be hashable or define
sage.structure.sage_object.SageObject._cache_key(). The instance it
is assigned to must be hashable.

Since trac ticket #8611, a cached method is an attribute
of the instance (provided that it has a __dict__).
Hence, when pickling the instance, an attempt would be made
to pickle that attribute as well. This is a problem,
since functions currently cannot be pickled. Therefore,
we replace the actual cached method by a placeholder
that kills itself as soon as any attribute is requested,
after which the original cached attribute is reinstated.
The cached values are in fact saved, however (if do_pickle is set).
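
The strategy can be sketched with Python's ordinary pickling hooks (the class Cached and its attributes are invented for illustration; a lambda stands in for the unpicklable cached-method caller):

```python
import pickle

class Cached:
    # Hypothetical sketch of the pickling strategy: strip the unpicklable
    # caller from the pickled state, but keep the cached values when
    # do_pickle is set.
    def __init__(self, do_pickle=False):
        self.do_pickle = do_pickle
        self.cache = {}
        self.caller = lambda: 42       # stands in for the cached-method caller
    def __getstate__(self):
        state = self.__dict__.copy()
        del state['caller']            # drop the unpicklable attribute
        if not state['do_pickle']:
            state['cache'] = {}        # discard cached values as well
        return state
    def __setstate__(self, state):
        self.__dict__.update(state)
        self.caller = lambda: 42       # conceptually, reinstated on first access

c = Cached(do_pickle=True)
c.cache[()] = c.caller()
d = pickle.loads(pickle.dumps(c))
assert d.cache == {(): 42}             # cached values survived the round trip
```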

For new style classes C, it is not possible to override a special
method, such as __hash__, in the __dict__ of an instance c of
C, because Python will for efficiency reasons always use what is
provided by the class, not by the instance.
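
This implicit special-method lookup is easy to observe in plain Python:

```python
class C:
    def __hash__(self):
        return 1

c = C()
# Attempting to override __hash__ on the instance has no effect:
c.__dict__['__hash__'] = lambda: 2
assert hash(c) == 1   # Python uses C.__hash__, not the instance attribute
```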

Consequently, if __hash__ were wrapped using
CachedMethod, then hash(c) would access C.__hash__ and bind
it to c, which means that the __get__ method of
CachedMethod would be called. But there, we assume that Python has
already inspected __dict__, and thus a CachedMethodCaller
would be created over and over again.

Here, the __get__ method will explicitly access the __dict__, so that
hash(c) will rely on a single CachedMethodCaller stored in
the __dict__.
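
The mechanism can be sketched with a hypothetical descriptor whose __get__ consults the instance __dict__ first and stores the caller there on first access, so every later access yields the same caller object:

```python
class CachedMethodDescriptor:
    # Hypothetical sketch: __get__ explicitly checks the instance
    # __dict__ and stores the caller there, so only a single caller is
    # ever created per instance.
    def __init__(self, f):
        self.f = f
    def __get__(self, obj, cls):
        if obj is None:
            return self
        try:
            return obj.__dict__[self.f.__name__]   # reuse the stored caller
        except KeyError:
            pass
        f, cache = self.f, {}
        def caller(*args):
            if args not in cache:
                cache[args] = f(obj, *args)
            return cache[args]
        obj.__dict__[self.f.__name__] = caller     # store for later lookups
        return caller

class D:
    @CachedMethodDescriptor
    def double(self, x):
        return 2 * x

d = D()
first = d.double
assert first is d.double     # a single caller, stored in the __dict__
assert d.double(3) == 6
```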

FileCache is a dictionary-like class which stores keys
and values on disk. The keys take the form of a tuple (A, K) where:

A is a tuple of objects t, where each t is an
exact object which is uniquely identified by a short string.

K is a tuple of tuples (s, v), where s is a valid
variable name and v is an exact object which is uniquely
identified by a short string with letters [a-zA-Z0-9-._]

The primary use case is the DiskCachedFunction. If
memory_cache==True, we maintain a cache of objects seen
during this session in memory – but we don’t load them from
disk until necessary. The keys and values are stored in a
pair of files:

prefix-argstring.key.sobj contains the key only,

prefix-argstring.sobj contains the tuple (key,val)

where self[key]==val.
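
A minimal sketch of this file layout, using pickle in place of Sage's .sobj serialization (the class MiniFileCache and the argstring scheme are invented; the real FileCache derives the argstring from the key itself):

```python
import os
import pickle
import tempfile

class MiniFileCache:
    # Hypothetical sketch of the FileCache layout: for each key we write
    # prefix-argstring.key.sobj (the key alone) and prefix-argstring.sobj
    # (the pair (key, val)).
    def __init__(self, directory, prefix='cache'):
        self.dir, self.prefix = directory, prefix
    def _base(self, argstring):
        return os.path.join(self.dir, '%s-%s' % (self.prefix, argstring))
    def __setitem__(self, key, val):
        argstring = str(abs(hash(key)))   # simplistic stand-in for the short strings
        with open(self._base(argstring) + '.key.sobj', 'wb') as fh:
            pickle.dump(key, fh)
        with open(self._base(argstring) + '.sobj', 'wb') as fh:
            pickle.dump((key, val), fh)
    def __getitem__(self, key):
        argstring = str(abs(hash(key)))
        with open(self._base(argstring) + '.sobj', 'rb') as fh:
            stored_key, val = pickle.load(fh)
        assert stored_key == key          # self[key] == val by construction
        return val

with tempfile.TemporaryDirectory() as d:
    fc = MiniFileCache(d)
    fc[('A', (('s', 1),))] = 99
    assert fc[('A', (('s', 1),))] == 99
```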

Note

We assume that each FileCache lives in its own directory.
Use extreme caution if you wish to break that assumption.

However, if there are no strong references left, the result is
deleted, and thus a new computation takes place:

sage: del a
sage: del b
sage: a = f()
doing a computation

Above, we used the cache=0 keyword. With a larger value, the
most recently computed values are cached anyway, even if they are
not referenced:

sage: @weak_cached_function(cache=3)
....: def f(x):
....:     print("doing a computation for x={}".format(x))
....:     return A()
sage: a = f(1); del a
doing a computation for x=1
sage: a = f(2), f(1); del a
doing a computation for x=2
sage: a = f(3), f(1); del a
doing a computation for x=3
sage: a = f(4), f(1); del a
doing a computation for x=4
doing a computation for x=1
sage: a = f(5), f(1); del a
doing a computation for x=5

The parameter key can be used to ignore parameters for
caching. In this example we ignore the parameter algorithm:
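
A plain-Python sketch of this key mechanism (the decorator memoize_with_key is hypothetical): the key function maps the call arguments to the cache key, so any parameter it drops, here algorithm, is ignored for caching:

```python
def memoize_with_key(key):
    # Hypothetical sketch of the `key` parameter: `key` computes the
    # cache key from the call arguments, so parameters it drops are
    # ignored for caching.
    def decorator(f):
        cache = {}
        def wrapper(*args, **kwds):
            k = key(*args, **kwds)
            if k not in cache:
                cache[k] = f(*args, **kwds)
            return cache[k]
        wrapper.cache = cache
        return wrapper
    return decorator

calls = []

@memoize_with_key(key=lambda n, algorithm='default': n)
def fib(n, algorithm='default'):
    calls.append(algorithm)
    return n if n < 2 else fib(n - 1) + fib(n - 2)

assert fib(10) == 55
before = len(calls)
assert fib(10, algorithm='naive') == 55   # cache hit: algorithm is ignored
assert len(calls) == before               # no new computation happened
```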

Helper function to return a hashable key for o which can be used for
caching.

This function is intended for objects which are not hashable, such as
\(p\)-adic numbers. The difference from calling an object’s _cache_key
method directly is that it also works for tuples and unpacks them
recursively (if necessary, i.e., if they are not hashable).
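
This recursive unpacking can be sketched as follows (the function name cache_key matches the helper described above, but the implementation is a simplified, hypothetical version without the unhashable_key marker):

```python
def cache_key(o):
    # Hypothetical sketch: hashable objects are used as-is, tuples are
    # unpacked recursively, and anything else must provide _cache_key().
    try:
        hash(o)
        return o
    except TypeError:
        if isinstance(o, tuple):
            return tuple(cache_key(item) for item in o)
        return o._cache_key()

class Unhashable:
    __hash__ = None
    def __init__(self, data):
        self.data = data
    def _cache_key(self):
        return ('Unhashable', self.data)

assert cache_key(5) == 5   # hashable: returned unchanged
assert cache_key((1, Unhashable(2))) == (1, ('Unhashable', 2))
```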

For proper behavior, the method must be a pure function (no side effects).
If this decorator is used on a method, it will have identical output on
equal elements. This is because the element is part of the hash key.
Arguments to the method must be hashable or define
sage.structure.sage_object.SageObject._cache_key(). The instance it
is assigned to must be hashable.

This is different from cache_key, since the cache_key might
be confused with the key of a hashable object. Therefore, such keys
include unhashable_key, which acts as a unique marker that is
certainly not otherwise stored in the dictionary.
