Source code for google.appengine.ext.ndb.model

#
# Copyright 2008 The ndb Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""Model and Property classes and associated stuff.

A model class represents the structure of entities stored in the
datastore. Applications define model classes to indicate the
structure of their entities, then instantiate those model classes
to create entities.

All model classes must inherit (directly or indirectly) from Model.
Through the magic of metaclasses, straightforward assignments in the
model class definition can be used to declare the model's structure::

  class Person(Model):
    name = StringProperty()
    age = IntegerProperty()

We can now create a Person entity and write it to Cloud Datastore::

  p = Person(name='Arthur Dent', age=42)
  k = p.put()

The return value from put() is a Key (see the documentation for
ndb/key.py), which can be used to retrieve the same entity later::

  p2 = k.get()
  p2 == p  # Returns True

To update an entity, simply change its attributes and write it back
(note that this doesn't change the key)::

  p2.name = 'Arthur Philip Dent'
  p2.put()

We can also delete an entity (by using the key)::

  k.delete()

The property definitions in the class body tell the system the names
and the types of the fields to be stored in Cloud Datastore, whether
they must be indexed, their default value, and more.

Many different Property types exist.
Most are indexed by default, with the
exceptions indicated in the list below:

- StringProperty: a short text string, limited to 500 bytes

- TextProperty: an unlimited text string; unindexed

- BlobProperty: an unlimited byte string; unindexed

- IntegerProperty: a 64-bit signed integer

- FloatProperty: a double precision floating point number

- BooleanProperty: a bool value

- DateTimeProperty: a datetime object. Note: App Engine always uses
  UTC as the timezone

- DateProperty: a date object

- TimeProperty: a time object

- GeoPtProperty: a geographical location, i.e. (latitude, longitude)

- KeyProperty: a Cloud Datastore Key value, optionally constrained to
  referring to a specific kind

- UserProperty: a User object (for backwards compatibility only)

- StructuredProperty: a field that is itself structured like an
  entity; see below for more details

- LocalStructuredProperty: like StructuredProperty but the on-disk
  representation is an opaque blob; unindexed

- ComputedProperty: a property whose value is computed from other
  properties by a user-defined function. The property value is
  written to Cloud Datastore so that it can be used in queries, but
  the value from Cloud Datastore is not used when the entity is read
  back

- GenericProperty: a property whose type is not constrained; mostly
  used by the Expando class (see below) but also usable explicitly

- JsonProperty: a property whose value is any object that can be
  serialized using JSON; the value written to Cloud Datastore is a
  JSON representation of that object

- PickleProperty: a property whose value is any object that can be
  serialized using Python's pickle protocol; the value written to
  Cloud Datastore is the pickled representation of that object,
  using the highest available pickle protocol

Most Property classes have similar constructor signatures. They
accept several optional keyword arguments:

- name=<string>: the name used to store the property value in the
  datastore.
  Unlike the following options, this may also be given as
  a positional argument

- indexed=<bool>: indicates whether the property should be indexed
  (allowing queries on this property's value)

- repeated=<bool>: indicates that this property can have multiple
  values in the same entity

- write_empty_list=<bool>: for repeated properties, controls whether
  a property with no elements (the empty list) is written to
  Datastore. If true, the empty list is written; if false, nothing
  is written to Datastore

- required=<bool>: indicates that this property must be given a value

- default=<value>: a default value if no explicit value is given

- choices=<list of values>: a list or tuple of allowable values

- validator=<function>: a general-purpose validation function. It
  will be called with two arguments (prop, value) and should either
  return the validated value or raise an exception. It is also
  allowed for the function to modify the value, but calling it again
  on the modified value should not modify the value further. (For
  example: a validator that returns value.strip() or value.lower() is
  fine, but one that returns value + '$' is not.)

- verbose_name=<value>: a human readable name for this property. This
  human readable name can be used for html form labels.

The repeated and required/default options are mutually exclusive: a
repeated property cannot be required nor can it specify a default
value (the default is always an empty list and an empty list is
always an allowed value), but a required property can have a default.

Some property types have additional arguments. Some property types
do not support all options.

Repeated properties are always represented as Python lists; if there
is only one value, the list has only one element. When a new list is
assigned to a repeated property, all elements of the list are
validated.
Since it is also possible to mutate lists in place,
repeated properties are re-validated before they are written to the
datastore.

No validation happens when an entity is read from Cloud Datastore;
however, property values read that have the wrong type (e.g. a string
value for an IntegerProperty) are ignored.

For non-repeated properties, None is always a possible value, and no
validation is called when the value is set to None. However, for
required properties, writing the entity to Cloud Datastore requires
the value to be something other than None (and valid).

The StructuredProperty is different from most other properties; it
lets you define a sub-structure for your entities. The substructure
itself is defined using a model class, and the attribute value is an
instance of that model class. However, it is not stored in the
datastore as a separate entity; instead, its attribute values are
included in the parent entity using a naming convention (the name of
the structured attribute followed by a dot followed by the name of
the subattribute). For example::

  class Address(Model):
    street = StringProperty()
    city = StringProperty()

  class Person(Model):
    name = StringProperty()
    address = StructuredProperty(Address)

  p = Person(name='Harry Potter',
             address=Address(street='4 Privet Drive',
                             city='Little Whinging'))
  k = p.put()

This would write a single 'Person' entity with three attributes (as
you could verify using the Datastore Viewer in the Admin Console)::

  name = 'Harry Potter'
  address.street = '4 Privet Drive'
  address.city = 'Little Whinging'

Structured property types can be nested arbitrarily deep, but in a
hierarchy of nested structured property types, only one level can
have the repeated flag set.
It is fine to have multiple structured
properties referencing the same model class.

It is also fine to use the same model class both as a top-level
entity class and as a structured property; however, queries for the
model class will only return the top-level entities.

The LocalStructuredProperty works similarly to StructuredProperty on
the Python side. For example::

  class Address(Model):
    street = StringProperty()
    city = StringProperty()

  class Person(Model):
    name = StringProperty()
    address = LocalStructuredProperty(Address)

  p = Person(name='Harry Potter',
             address=Address(street='4 Privet Drive',
                             city='Little Whinging'))
  k = p.put()

However, the data written to Cloud Datastore is different; it writes
a 'Person' entity with a 'name' attribute as before and a single
'address' attribute whose value is a blob which encodes the Address
value (using the standard "protocol buffer" encoding).

Sometimes the set of properties is not known ahead of time. In such
cases you can use the Expando class. This is a Model subclass that
creates properties on the fly, both upon assignment and when loading
an entity from Cloud Datastore. For example::

  class SuperPerson(Expando):
    name = StringProperty()
    superpower = StringProperty()

  razorgirl = SuperPerson(name='Molly Millions',
                          superpower='bionic eyes, razorblade hands',
                          rasta_name='Steppin\' Razor',
                          alt_name='Sally Shears')
  elastigirl = SuperPerson(name='Helen Parr',
                           superpower='stretchable body')
  elastigirl.max_stretch = 30  # Meters

You can inspect the properties of an expando instance using the
_properties attribute:

  >>> print razorgirl._properties.keys()
  ['rasta_name', 'name', 'superpower', 'alt_name']
  >>> print elastigirl._properties
  {'max_stretch': GenericProperty('max_stretch'),
   'name': StringProperty('name'),
   'superpower': StringProperty('superpower')}

Note: this property exists for plain Model instances too; it is just
not as interesting for those.

The Model class offers basic query support.
You can create a Query
object by calling the query() class method. Iterating over a Query
object returns the entities matching the query one at a time.

Query objects are fully described in the docstring for query.py, but
there is one handy shortcut that is only available through
Model.query(): positional arguments are interpreted as filter
expressions which are combined through an AND operator. For
example::

  Person.query(Person.name == 'Harry Potter', Person.age >= 11)

is equivalent to::

  Person.query().filter(Person.name == 'Harry Potter',
                        Person.age >= 11)

Keyword arguments passed to .query() are passed along to the Query()
constructor.

It is possible to query for field values of structured properties.
For example::

  qry = Person.query(Person.address.city == 'London')

A number of top-level functions also live in this module:

- transaction() runs a function inside a transaction

- get_multi() reads multiple entities at once

- put_multi() writes multiple entities at once

- delete_multi() deletes multiple entities at once

All these have a corresponding ``*_async()`` variant as well.
The ``*_multi_async()`` functions return a list of Futures.

And finally these (without async variants):

- in_transaction() tests whether you are currently running in a
  transaction

- @transactional decorates functions that should be run in a
  transaction

There are many other interesting features. For example, Model
subclasses may define pre-call and post-call hooks for most
operations (get, put, delete, allocate_ids), and Property classes
may be subclassed to suit various needs.
Documentation for writing a
Property subclass is in the docstring for the Property class.
"""

__author__ = 'guido@google.com (Guido van Rossum)'

import collections
import copy
import cPickle as pickle
import datetime
import logging
import zlib

from .google_imports import datastore
from .google_imports import datastore_errors
from .google_imports import datastore_query
from .google_imports import datastore_rpc
from .google_imports import datastore_types
from .google_imports import users
from .google_imports import entity_pb

from . import key as key_module  # NOTE: 'key' is a common local variable name.
from . import utils

Key = key_module.Key  # For export.

# NOTE: Property and Error classes are added later.
__all__ = ['Key', 'BlobKey', 'GeoPt', 'Rollback',
           'Index', 'IndexState', 'IndexProperty',
           'ModelAdapter', 'ModelAttribute',
           'ModelKey', 'MetaModel', 'Model', 'Expando',
           'transaction', 'transaction_async', 'in_transaction',
           'transactional', 'transactional_async', 'transactional_tasklet',
           'non_transactional',
           'get_multi', 'get_multi_async',
           'put_multi', 'put_multi_async',
           'delete_multi', 'delete_multi_async',
           'get_indexes', 'get_indexes_async',
           'make_connection',
           ]

BlobKey = datastore_types.BlobKey
GeoPt = datastore_types.GeoPt
Rollback = datastore_errors.Rollback
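The validator contract described in the docstring above (called as `validator(prop, value)`, returning the validated value or `None`, and idempotent on its own output) can be sketched without the ndb runtime. `checked_set` and `strip_validator` below are hypothetical names used only for illustration:

```python
def checked_set(validator, value, prop=None):
    """Apply a property validator per the documented contract.

    The validator may return None to mean 'no conversion needed',
    in which case the original value is kept.
    """
    new_value = validator(prop, value)
    return value if new_value is None else new_value


def strip_validator(prop, value):
    # Returns a normalized value; running it a second time changes nothing.
    return value.strip()


v = checked_set(strip_validator, '  Arthur Dent  ')
assert v == 'Arthur Dent'
# Idempotence: validating the already-validated value is a no-op.
assert checked_set(strip_validator, v) == v
```

A validator like `lambda prop, v: v + '$'` would violate the contract, because revalidating its output keeps appending characters.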

class KindError(datastore_errors.BadValueError):
  """Raised when an implementation for a kind can't be found.

  Also raised when the Kind is not an 8-bit string.
  """

class InvalidPropertyError(datastore_errors.Error):
  """Raised when a property is not applicable to a given use.

  For example, a property must exist and be indexed to be used in a
  query's projection or group by clause.
  """

# Mapping for legacy support.
BadProjectionError = InvalidPropertyError

class UnprojectedPropertyError(datastore_errors.Error):
  """Raised when getting a property value that's not in the projection."""

class ReadonlyPropertyError(datastore_errors.Error):
  """Raised when attempting to set a property value that is read-only."""

class ComputedPropertyError(ReadonlyPropertyError):
  """Raised when attempting to set or delete the value of a computed property."""

# Various imported limits.
_MAX_LONG = key_module._MAX_LONG
_MAX_STRING_LENGTH = datastore_types._MAX_STRING_LENGTH

# Map index directions to human-readable strings.
_DIR_MAP = {
    entity_pb.Index_Property.ASCENDING: 'asc',
    entity_pb.Index_Property.DESCENDING: 'desc',
}

# Map index states to human-readable strings.
_STATE_MAP = {
    entity_pb.CompositeIndex.ERROR: 'error',
    entity_pb.CompositeIndex.DELETED: 'deleting',
    entity_pb.CompositeIndex.READ_WRITE: 'serving',
    entity_pb.CompositeIndex.WRITE_ONLY: 'building',
}


class _NotEqualMixin(object):
  """Mix-in class that implements __ne__ in terms of __eq__."""

  def __ne__(self, other):
    """Implement self != other as not(self == other)."""
    eq = self.__eq__(other)
    if eq is NotImplemented:
      return NotImplemented
    return not eq


class _NestedCounter(object):
  """A recursive counter for StructuredProperty deserialization.

  Deserialization has some complicated rules to handle
  StructuredPropertys that may or may not be empty. The simplest case
  is a leaf counter, where the counter will return the index of the
  repeated value that last had this leaf property written. When a
  non-leaf counter is requested, this will return the max of all its
  leaf values. This is due to the fact that the next index that a full
  non-leaf property may be written to comes after all indices that
  have part of that property written (otherwise, a partial entity
  would be overwritten).
  Consider an evaluation of the following structure:

    class B(model.Model):
      c = model.IntegerProperty()
      d = model.IntegerProperty()

    class A(model.Model):
      b = model.StructuredProperty(B)

    class Foo(model.Model):
      # top-level model
      a = model.StructuredProperty(A, repeated=True)

    Foo(a=[A(b=None),
           A(b=B(c=1)),
           A(b=None),
           A(b=B(c=2, d=3))])

  This will result in a serialized structure:

    1) a.b   = None
    2) a.b.c = 1
    3) a.b.d = None
    4) a.b   = None
    5) a.b.c = 2
    6) a.b.d = 3

  The counter state should be the following:

         a | a.b | a.b.c | a.b.d
    0)   -     -      -       -
    1)  @1     1      -       -
    2)  @2    @2      2       -
    3)  @2    @2      2       2
    4)  @3    @3      3       3
    5)  @4    @4      4       3
    6)  @4    @4      4       4

  Here, @ indicates that this counter value is actually a calculated
  value. It is equal to the MAX of its sub-counters.

  Counter values may get incremented multiple times while
  deserializing a property. This will happen if a child counter falls
  behind, for example in steps 2 and 3.

  During an increment of a parent node, all child node values are
  incremented to match that of the parent, for example in step 4.
  """

  def __init__(self):
    self.__counter = 0
    self.__sub_counters = collections.defaultdict(_NestedCounter)

  def get(self, parts=None):
    if parts:
      return self.__sub_counters[parts[0]].get(parts[1:])
    if self.__is_parent_node():
      return max(v.get() for v in self.__sub_counters.itervalues())
    return self.__counter

  def increment(self, parts=None):
    if parts:
      self.__make_parent_node()
      return self.__sub_counters[parts[0]].increment(parts[1:])
    if self.__is_parent_node():
      # Move all children forward.
      value = self.get() + 1
      self._set(value)
      return value
    self.__counter += 1
    return self.__counter

  def _set(self, value):
    """Updates all descendants to a specified value."""
    if self.__is_parent_node():
      for child in self.__sub_counters.itervalues():
        child._set(value)
    else:
      self.__counter = value

  def _absolute_counter(self):
    # Used only for testing.
    return self.__counter

  def __is_parent_node(self):
    return self.__counter == -1

  def __make_parent_node(self):
    self.__counter = -1
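The "parent = max of its leaves" semantics above can be exercised in isolation. This is a Python 3 re-implementation sketch of `_NestedCounter` (the original is Python 2 and uses `itervalues()`), with `NestedCounter` as a stand-in name:

```python
import collections


class NestedCounter:
    """Python 3 sketch of the _NestedCounter semantics described above."""

    def __init__(self):
        self._counter = 0
        self._subs = collections.defaultdict(NestedCounter)

    def _is_parent(self):
        # A counter of -1 marks this node as a parent of sub-counters.
        return self._counter == -1

    def get(self, parts=None):
        if parts:
            return self._subs[parts[0]].get(parts[1:])
        if self._is_parent():
            # A parent's value is the max of its children.
            return max(v.get() for v in self._subs.values())
        return self._counter

    def increment(self, parts=None):
        if parts:
            self._counter = -1  # Become a parent node.
            return self._subs[parts[0]].increment(parts[1:])
        if self._is_parent():
            value = self.get() + 1  # Move all children forward.
            self._set(value)
            return value
        self._counter += 1
        return self._counter

    def _set(self, value):
        """Update all descendants to a specified value."""
        if self._is_parent():
            for child in self._subs.values():
                child._set(value)
        else:
            self._counter = value


c = NestedCounter()
assert c.increment(['b', 'c']) == 1   # leaf a.b.c advances to 1
assert c.increment(['b', 'd']) == 1   # leaf a.b.d advances to 1
assert c.get(['b']) == 1              # parent value = max of leaves
assert c.increment(['b', 'c']) == 2   # a.b.c pulls ahead
assert c.get(['b']) == 2              # parent follows the max
assert c.increment(['b']) == 3        # incrementing the parent...
assert c.get(['b', 'd']) == 3         # ...drags every child forward
```

The last two assertions show the behavior `_set()` exists for: writing a whole sub-entity must advance every partially-written child to the same index.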

class IndexProperty(_NotEqualMixin):
  """Immutable object representing a single property in an index."""

  @utils.positional(1)
  def __new__(cls, name, direction):
    """Constructor."""
    obj = object.__new__(cls)
    obj.__name = name
    obj.__direction = direction
    return obj

  @property
  def name(self):
    """The property name being indexed, a string."""
    return self.__name

  @property
  def direction(self):
    """The direction in the index for this property, 'asc' or 'desc'."""
    return self.__direction

  def __repr__(self):
    """Return a string representation."""
    return '%s(name=%r, direction=%r)' % (self.__class__.__name__,
                                          self.name,
                                          self.direction)

  def __eq__(self, other):
    """Compare two index properties for equality."""
    if not isinstance(other, IndexProperty):
      return NotImplemented
    return self.name == other.name and self.direction == other.direction

  def __hash__(self):
    return hash((self.name, self.direction))
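IndexProperty pairs `_NotEqualMixin` with a matching `__eq__`/`__hash__`, so equal instances compare and hash alike while `!=` stays consistent. The pattern can be sketched standalone; `NotEqualMixin` and `IndexProp` here are simplified stand-ins (the real class also uses `__new__` and `utils.positional`):

```python
class NotEqualMixin:
    """Derive __ne__ from __eq__, mirroring ndb's _NotEqualMixin."""

    def __ne__(self, other):
        eq = self.__eq__(other)
        if eq is NotImplemented:
            # Let Python try the reflected operation on `other`.
            return NotImplemented
        return not eq


class IndexProp(NotEqualMixin):
    """Simplified immutable (name, direction) value object."""

    def __init__(self, name, direction):
        self._name = name
        self._direction = direction

    def __eq__(self, other):
        if not isinstance(other, IndexProp):
            return NotImplemented
        return (self._name, self._direction) == (other._name, other._direction)

    def __hash__(self):
        # Equal objects must hash equal, so hash the same tuple __eq__ uses.
        return hash((self._name, self._direction))


a, b = IndexProp('age', 'asc'), IndexProp('age', 'asc')
assert a == b
assert not (a != b)            # __ne__ supplied by the mix-in
assert hash(a) == hash(b)      # usable as dict keys / set members
assert (a != 'not an index')   # NotImplemented falls back gracefully
```

Defining `__ne__` once in a mix-in mattered on Python 2, where `!=` was not automatically derived from `__eq__`.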

class Index(_NotEqualMixin):
  """Immutable object representing an index."""

  @utils.positional(1)
  def __new__(cls, kind, properties, ancestor):
    """Constructor."""
    obj = object.__new__(cls)
    obj.__kind = kind
    obj.__properties = properties
    obj.__ancestor = ancestor
    return obj

  @property
  def kind(self):
    """The kind being indexed, a string."""
    return self.__kind

  @property
  def properties(self):
    """A list of IndexProperty objects giving the properties being indexed."""
    return self.__properties

  @property
  def ancestor(self):
    """Whether this is an ancestor index, a bool."""
    return self.__ancestor

  def __repr__(self):
    """Return a string representation."""
    parts = []
    parts.append('kind=%r' % self.kind)
    parts.append('properties=%r' % self.properties)
    parts.append('ancestor=%s' % self.ancestor)
    return '%s(%s)' % (self.__class__.__name__, ', '.join(parts))

  def __eq__(self, other):
    """Compare two indexes."""
    if not isinstance(other, Index):
      return NotImplemented
    return (self.kind == other.kind and
            self.properties == other.properties and
            self.ancestor == other.ancestor)

  def __hash__(self):
    return hash((self.kind, self.properties, self.ancestor))

class ModelAdapter(datastore_rpc.AbstractAdapter):
  """Conversions between 'our' Key and Model classes and protobufs.

  This is needed to construct a Connection object, which in turn is
  needed to construct a Context object.

  See the base class docstring for more info about the signatures.
  """

  def __init__(self, default_model=None, id_resolver=None):
    """Constructor.

    Args:
      default_model: If an implementation for the kind cannot be
        found, use this model class. If none is specified, an
        exception will be thrown (default).
      id_resolver: A datastore_pbs.IdResolver that can resolve
        application ids. This is only necessary when running on the
        Cloud Datastore v1 API.
    """
    # TODO(user): Remove this once AbstractAdapter's constructor makes
    # it into production.
    try:
      super(ModelAdapter, self).__init__(id_resolver)
    except:
      pass
    self.default_model = default_model
    self.want_pbs = 0

  # Make this a context manager to request setting _orig_pb.
  # Used in query.py by _MultiQuery.run_to_queue().

  def __enter__(self):
    self.want_pbs += 1

  def __exit__(self, *unused_args):
    self.want_pbs -= 1
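ModelAdapter's `__enter__`/`__exit__` pair implements a small reference-counting context manager: nested `with` blocks bump `want_pbs` so any caller in the stack can request raw protobufs. The idea can be sketched standalone (`PbWanter` is a hypothetical stand-in name):

```python
class PbWanter:
    """Sketch of ModelAdapter's counted context-manager protocol."""

    def __init__(self):
        self.want_pbs = 0  # >0 means some caller wants original protobufs.

    def __enter__(self):
        self.want_pbs += 1

    def __exit__(self, *unused_args):
        self.want_pbs -= 1


adapter = PbWanter()
with adapter:
    assert adapter.want_pbs == 1
    with adapter:               # Nesting stacks rather than toggles.
        assert adapter.want_pbs == 2
    assert adapter.want_pbs == 1
assert adapter.want_pbs == 0    # Fully unwound after the outer block.
```

Using a counter instead of a boolean means two independent callers can overlap without the inner one clearing the outer one's request.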

def make_connection(config=None, default_model=None,
                    _api_version=datastore_rpc._DATASTORE_V3,
                    _id_resolver=None):
  """Create a new Connection object with the right adapter.

  Optionally you can pass in a datastore_rpc.Configuration object.
  """
  return datastore_rpc.Connection(
      adapter=ModelAdapter(default_model, id_resolver=_id_resolver),
      config=config,
      _api_version=_api_version)

class ModelAttribute(object):
  """A base class signifying the presence of a _fix_up() method."""

  def _fix_up(self, cls, code_name):
    pass

class _BaseValue(_NotEqualMixin):
  """A marker object wrapping a 'base type' value.

  This is used to be able to tell whether ent._values[name] is a
  user value (i.e. of a type that the Python code understands) or a
  base value (i.e. of a type that serialization understands).
  User values are unwrapped; base values are wrapped in a
  _BaseValue instance.
  """

  __slots__ = ['b_val']

  def __init__(self, b_val):
    """Constructor. Argument is the base value to be wrapped."""
    assert b_val is not None, "Cannot wrap None"
    assert not isinstance(b_val, list), repr(b_val)
    self.b_val = b_val

  def __repr__(self):
    return '_BaseValue(%r)' % (self.b_val,)

  def __eq__(self, other):
    if not isinstance(other, _BaseValue):
      return NotImplemented
    return self.b_val == other.b_val

  def __hash__(self):
    raise TypeError('_BaseValue is not immutable')
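The marker-wrapper idea can be shown in miniature: the wrapper's type, not the value's type, tells the infrastructure whether conversion is still needed. `BaseValue` and `opt_unwrap` below are simplified stand-ins for `_BaseValue` and `_opt_call_from_base_type`:

```python
class BaseValue:
    """Sketch of _BaseValue: marks a value as already in 'base' form."""
    __slots__ = ('b_val',)

    def __init__(self, b_val):
        assert b_val is not None, 'Cannot wrap None'
        assert not isinstance(b_val, list), repr(b_val)
        self.b_val = b_val

    def __repr__(self):
        return 'BaseValue(%r)' % (self.b_val,)


def opt_unwrap(value):
    # Mirrors _opt_call_from_base_type: only wrapped (base) values
    # need conversion; user values pass through untouched.
    return value.b_val if isinstance(value, BaseValue) else value


assert opt_unwrap('user value') == 'user value'         # already a user value
assert opt_unwrap(BaseValue(b'serialized')) == b'serialized'  # unwrapped
```

Because the marker is a distinct type rather than a flag on the entity, user and base values can coexist in the same `_values` dict and be converted lazily, one at a time.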

class Property(ModelAttribute):
  """A class describing a typed, persisted attribute of a Cloud Datastore entity.

  Not to be confused with Python's 'property' built-in.

  This is just a base class; there are specific subclasses that
  describe Properties of various types (and GenericProperty which
  describes a dynamically typed Property).

  All special Property attributes, even those considered 'public',
  have names starting with an underscore, because StructuredProperty
  uses the non-underscore attribute namespace to refer to nested
  Property names; this is essential for specifying queries on
  subproperties (see the module docstring).

  The Property class and its predefined subclasses allow easy
  subclassing using composable (or stackable) validation and
  conversion APIs. These require some terminology definitions:

  - A 'user value' is a value such as would be set and accessed by
    the application code using standard attributes on the entity.

  - A 'base value' is a value such as would be serialized to and
    deserialized from Cloud Datastore.

  The values stored in ent._values[name] and accessed by
  _store_value() and _retrieve_value() can be either user values or
  base values. To retrieve user values, use _get_user_value(). To
  retrieve base values, use _get_base_value(). In particular,
  _get_value() calls _get_user_value(), and _serialize() effectively
  calls _get_base_value().

  To store a user value, just call _store_value(). To store a base
  value, wrap the value in a _BaseValue() and then call
  _store_value().

  A Property subclass that wants to implement a specific
  transformation between user values and serializable values should
  implement two methods, _to_base_type() and _from_base_type().
  These should *NOT* call their super() method; super calls are taken
  care of by _call_to_base_type() and _call_from_base_type(). This
  is what is meant by composable (or stackable) APIs.
  The API supports 'stacking' classes with ever more sophisticated
  user<-->base conversions: the user-->base conversion goes from more
  sophisticated to less sophisticated, while the base-->user
  conversion goes from less sophisticated to more sophisticated. For
  example, see the relationship between BlobProperty, TextProperty
  and StringProperty.

  In addition to _to_base_type() and _from_base_type(), the
  _validate() method is also a composable API.

  The validation API distinguishes between 'lax' and 'strict' user
  values. The set of lax values is a superset of the set of strict
  values. The _validate() method takes a lax value and if necessary
  converts it to a strict value. This means that when setting the
  property value, lax values are accepted, while when getting the
  property value, only strict values will be returned. If no
  conversion is needed, _validate() may return None. If the argument
  is outside the set of accepted lax values, _validate() should raise
  an exception, preferably TypeError or
  datastore_errors.BadValueError.

  Example/boilerplate:

    def _validate(self, value):
      'Lax user value to strict user value.'
      if not isinstance(value, <top type>):
        raise TypeError(...)  # Or datastore_errors.BadValueError(...).

    def _to_base_type(self, value):
      '(Strict) user value to base value.'
      if isinstance(value, <user type>):
        return <base type>(value)

    def _from_base_type(self, value):
      'base value to (strict) user value.'
      if not isinstance(value, <base type>):
        return <user type>(value)

  Things that _validate(), _to_base_type() and _from_base_type() do
  *not* need to handle:

  - None: They will not be called with None (and if they return
    None, this means that the value does not need conversion).

  - Repeated values: The infrastructure (_get_user_value() and
    _get_base_value()) takes care of calling _from_base_type() or
    _to_base_type() for each list item in a repeated value.
  - Wrapping values in _BaseValue(): The wrapping and unwrapping is
    taken care of by the infrastructure that calls the composable
    APIs.

  - Comparisons: The comparison operations call _to_base_type() on
    their operand.

  - Distinguishing between user and base values: the infrastructure
    guarantees that _from_base_type() will be called with an
    (unwrapped) base value, and that _to_base_type() will be called
    with a user value.

  - Returning the original value: if any of these return None, the
    original value is kept. (Returning a different value not equal to
    None will substitute the different value.)
  """

  # TODO: Separate 'simple' properties from base Property class.

  _code_name = None
  _name = None
  _indexed = True
  _repeated = False
  _required = False
  _default = None
  _choices = None
  _validator = None
  _verbose_name = None
  _write_empty_list = False

  __creation_counter_global = 0

  _attributes = ['_name', '_indexed', '_repeated', '_required', '_default',
                 '_choices', '_validator', '_verbose_name',
                 '_write_empty_list']
  _positional = 1  # Only name is a positional argument.

  @utils.positional(1 + _positional)  # Add 1 for self.
  def __init__(self, name=None, indexed=None, repeated=None,
               required=None, default=None, choices=None, validator=None,
               verbose_name=None, write_empty_list=None):
    """Constructor.
    For arguments see the module docstring.
    """
    if name is not None:
      if isinstance(name, unicode):
        name = name.encode('utf-8')
      if not isinstance(name, str):
        raise TypeError('Name %r is not a string' % (name,))
      if '.' in name:
        raise ValueError('Name %r cannot contain period characters' % (name,))
      self._name = name
    if indexed is not None:
      self._indexed = indexed
    if repeated is not None:
      self._repeated = repeated
    if required is not None:
      self._required = required
    if default is not None:
      # TODO: Call _validate() on default?
      self._default = default
    if verbose_name is not None:
      self._verbose_name = verbose_name
    if write_empty_list is not None:
      self._write_empty_list = write_empty_list
    if self._repeated and (self._required or self._default is not None):
      raise ValueError('repeated is incompatible with required or default')
    if choices is not None:
      if not isinstance(choices, (list, tuple, set, frozenset)):
        raise TypeError(
            'choices must be a list, tuple or set; received %r' % choices)
      # TODO: Call _validate() on each choice?
      self._choices = frozenset(choices)
    if validator is not None:
      # The validator is called as follows:
      #   value = validator(prop, value)
      # It should return the value to be used, or raise an exception.
      # It should be idempotent, i.e. calling it a second time should
      # not further modify the value.
      # So a validator that returns e.g.
      # value.lower() or value.strip() is fine, but one that returns
      # value + '$' is not.
      if not hasattr(validator, '__call__'):
        raise TypeError('validator must be callable or None; received %r' %
                        validator)
      self._validator = validator
    # Keep a unique creation counter.
    Property.__creation_counter_global += 1
    self._creation_counter = Property.__creation_counter_global

  def __repr__(self):
    """Return a compact unambiguous string representation of a property."""
    args = []
    cls = self.__class__
    for i, attr in enumerate(self._attributes):
      val = getattr(self, attr)
      if val is not getattr(cls, attr):
        if isinstance(val, type):
          s = val.__name__
        else:
          s = repr(val)
        if i >= cls._positional:
          if attr.startswith('_'):
            attr = attr[1:]
          s = '%s=%s' % (attr, s)
        args.append(s)
    s = '%s(%s)' % (self.__class__.__name__, ', '.join(args))
    return s

  def _datastore_type(self, value):
    """Internal hook used by property filters.

    Sometimes the low-level query interface needs a specific data
    type in order for the right filter to be constructed. See
    _comparison().
    """
    return value

  def _comparison(self, op, value):
    """Internal helper for comparison operators.

    Args:
      op: The operator ('=', '<' etc.).

    Returns:
      A FilterNode instance representing the requested comparison.
    """
    # NOTE: This is also used by query.gql().
    if not self._indexed:
      raise datastore_errors.BadFilterError(
          'Cannot query for unindexed property %s' % self._name)
    from .query import FilterNode  # Import late to avoid circular imports.
    if value is not None:
      value = self._do_validate(value)
      value = self._call_to_base_type(value)
      value = self._datastore_type(value)
    return FilterNode(self._name, op, value)

  # Comparison operators on Property instances don't compare the
  # properties; instead they return FilterNode instances that can be
  # used in queries.
  # See the module docstrings above and in query.py
  # for details on how these can be used.

  def __eq__(self, value):
    """Return a FilterNode instance representing the '=' comparison."""
    return self._comparison('=', value)

  def __ne__(self, value):
    """Return a FilterNode instance representing the '!=' comparison."""
    return self._comparison('!=', value)

  def __lt__(self, value):
    """Return a FilterNode instance representing the '<' comparison."""
    return self._comparison('<', value)

  def __le__(self, value):
    """Return a FilterNode instance representing the '<=' comparison."""
    return self._comparison('<=', value)

  def __gt__(self, value):
    """Return a FilterNode instance representing the '>' comparison."""
    return self._comparison('>', value)

  def __ge__(self, value):
    """Return a FilterNode instance representing the '>=' comparison."""
    return self._comparison('>=', value)

  # pylint: disable=invalid-name
  def _IN(self, value):
    """Comparison operator for the 'in' comparison operator.

    The Python 'in' operator cannot be overloaded in the way we want
    to, so we define a method. For example::

      Employee.query(Employee.rank.IN([4, 5, 6]))

    Note that the method is called ._IN() but may normally be invoked
    as .IN(); ._IN() is provided for the case you have a
    StructuredProperty with a model that has a Property named IN.
    """
    if not self._indexed:
      raise datastore_errors.BadFilterError(
          'Cannot query for unindexed property %s' % self._name)
    from .query import FilterNode  # Import late to avoid circular imports.
    if not isinstance(value, (list, tuple, set, frozenset)):
      raise datastore_errors.BadArgumentError(
          'Expected list, tuple or set, got %r' % (value,))
    values = []
    for val in value:
      if val is not None:
        val = self._do_validate(val)
        val = self._call_to_base_type(val)
        val = self._datastore_type(val)
      values.append(val)
    return FilterNode(self._name, 'in', values)
  IN = _IN

  def __neg__(self):
    """Return a descending sort order on this Property.
    For example::

      Employee.query().order(-Employee.rank)
    """
    return datastore_query.PropertyOrder(
        self._name, datastore_query.PropertyOrder.DESCENDING)

  def __pos__(self):
    """Return an ascending sort order on this Property.

    Note that this is redundant but provided for consistency with
    __neg__. For example, the following two are equivalent::

      Employee.query().order(+Employee.rank)
      Employee.query().order(Employee.rank)
    """
    return datastore_query.PropertyOrder(self._name)

  def _do_validate(self, value):
    """Call all validations on the value.

    This calls the most derived _validate() method(s), then the
    custom validator function, and then checks the choices. It
    returns the value, possibly modified in an idempotent way, or
    raises an exception.

    Note that this does not call all composable _validate() methods.
    It only calls _validate() methods up to but not including the
    first _to_base_type() method, when the MRO is traversed looking
    for _validate() and _to_base_type() methods. (IOW if a class
    defines both _validate() and _to_base_type(), its _validate() is
    called and then the search is aborted.)

    Note that for a repeated Property this function should be called
    for each item in the list, not for the list as a whole.
    """
    if isinstance(value, _BaseValue):
      return value
    value = self._call_shallow_validation(value)
    if self._validator is not None:
      newvalue = self._validator(self, value)
      if newvalue is not None:
        value = newvalue
    if self._choices is not None:
      if value not in self._choices:
        raise datastore_errors.BadValueError(
            'Value %r for property %s is not an allowed choice' %
            (value, self._name))
    return value

  def _fix_up(self, cls, code_name):
    """Internal helper called to tell the property its name.

    This is called by _fix_up_properties() which is called by
    MetaModel when finishing the construction of a Model subclass.
    The name passed in is the name of the class attribute to which
    the Property is assigned (a.k.a. the code name). Note that this
    means that each Property instance must be assigned to (at most)
    one class attribute.
to declare three strings, you must call StringProperty() three times, you cannot write foo = bar = baz = StringProperty() """self._code_name=code_nameifself._nameisNone:self._name=code_namedef_store_value(self,entity,value):"""Internal helper to store a value in an entity for a Property. This assumes validation has already taken place. For a repeated Property the value should be a list. """entity._values[self._name]=valuedef_set_value(self,entity,value):"""Internal helper to set a value in an entity for a Property. This performs validation first. For a repeated Property the value should be a list. """ifentity._projection:raiseReadonlyPropertyError('You cannot set property values of a projection entity')ifself._repeated:ifnotisinstance(value,(list,tuple,set,frozenset)):raisedatastore_errors.BadValueError('Expected list or tuple, got %r'%(value,))value=[self._do_validate(v)forvinvalue]else:ifvalueisnotNone:value=self._do_validate(value)self._store_value(entity,value)def_has_value(self,entity,unused_rest=None):"""Internal helper to ask if the entity has a value for this Property."""returnself._nameinentity._valuesdef_retrieve_value(self,entity,default=None):"""Internal helper to retrieve the value for this Property from an entity. This returns None if no value is set, or the default argument if given. For a repeated Property this returns a list if a value is set, otherwise None. No additional transformations are applied. """returnentity._values.get(self._name,default)def_get_user_value(self,entity):"""Return the user value for this property of the given entity. This implies removing the _BaseValue() wrapper if present, and if it is, calling all _from_base_type() methods, in the reverse method resolution order of the property's class. It also handles default values and repeated properties. """returnself._apply_to_values(entity,self._opt_call_from_base_type)def_get_base_value(self,entity):"""Return the base value for this property of the given entity. 
This implies calling all _to_base_type() methods, in the method resolution order of the property's class, and adding a _BaseValue() wrapper, if one is not already present. (If one is present, no work is done.) It also handles default values and repeated properties. """returnself._apply_to_values(entity,self._opt_call_to_base_type)# TODO: Invent a shorter name for this.def_get_base_value_unwrapped_as_list(self,entity):"""Like _get_base_value(), but always returns a list. Returns: A new list of unwrapped base values. For an unrepeated property, if the value is missing or None, returns [None]; for a repeated property, if the original value is missing or None or empty, returns []. """wrapped=self._get_base_value(entity)ifself._repeated:ifwrappedisNone:return[]assertisinstance(wrapped,list)return[w.b_valforwinwrapped]else:ifwrappedisNone:return[None]assertisinstance(wrapped,_BaseValue)return[wrapped.b_val]def_opt_call_from_base_type(self,value):"""Call _from_base_type() if necessary. If the value is a _BaseValue instance, unwrap it and call all _from_base_type() methods. Otherwise, return the value unchanged. """ifisinstance(value,_BaseValue):value=self._call_from_base_type(value.b_val)returnvaluedef_value_to_repr(self,value):"""Turn a value (base or not) into its repr(). This exists so that property classes can override it separately. """# Manually apply _from_base_type() so as not to have a side# effect on what's contained in the entity. Printing a value# should not change it!val=self._opt_call_from_base_type(value)returnrepr(val)def_opt_call_to_base_type(self,value):"""Call _to_base_type() if necessary. If the value is a _BaseValue instance, return it unchanged. Otherwise, call all _validate() and _to_base_type() methods and wrap it in a _BaseValue instance. """ifnotisinstance(value,_BaseValue):value=_BaseValue(self._call_to_base_type(value))returnvaluedef_call_from_base_type(self,value):"""Call all _from_base_type() methods on the value. 
This calls the methods in the reverse method resolution order of the property's class. """methods=self._find_methods('_from_base_type',reverse=True)call=self._apply_list(methods)returncall(value)def_call_to_base_type(self,value):"""Call all _validate() and _to_base_type() methods on the value. This calls the methods in the method resolution order of the property's class. """methods=self._find_methods('_validate','_to_base_type')call=self._apply_list(methods)returncall(value)def_call_shallow_validation(self,value):"""Call the initial set of _validate() methods. This is similar to _call_to_base_type() except it only calls those _validate() methods that can be called without needing to call _to_base_type(). An example: suppose the class hierarchy is A -> B -> C -> Property, and suppose A defines _validate() only, but B and C define _validate() and _to_base_type(). The full list of methods called by _call_to_base_type() is:: A._validate() B._validate() B._to_base_type() C._validate() C._to_base_type() This method will call A._validate() and B._validate() but not the others. """methods=[]formethodinself._find_methods('_validate','_to_base_type'):ifmethod.__name__!='_validate':breakmethods.append(method)call=self._apply_list(methods)returncall(value)@classmethoddef_find_methods(cls,*names,**kwds):"""Compute a list of composable methods. Because this is a common operation and the class hierarchy is static, the outcome is cached (assuming that for a particular list of names the reversed flag is either always on, or always off). Args: *names: One or more method names. reverse: Optional flag, default False; if True, the list is reversed. Returns: A list of callable class method objects. 
"""reverse=kwds.pop('reverse',False)assertnotkwds,repr(kwds)cache=cls.__dict__.get('_find_methods_cache')ifcache:hit=cache.get(names)ifhitisnotNone:returnhitelse:cls._find_methods_cache=cache={}methods=[]forcincls.__mro__:fornameinnames:method=c.__dict__.get(name)ifmethodisnotNone:methods.append(method)ifreverse:methods.reverse()cache[names]=methodsreturnmethodsdef_apply_list(self,methods):"""Return a single callable that applies a list of methods to a value. If a method returns None, the last value is kept; if it returns some other value, that replaces the last value. Exceptions are not caught. """defcall(value):formethodinmethods:newvalue=method(self,value)ifnewvalueisnotNone:value=newvaluereturnvaluereturncalldef_apply_to_values(self,entity,function):"""Apply a function to the property value/values of a given entity. This retrieves the property value, applies the function, and then stores the value back. For a repeated property, the function is applied separately to each of the values in the list. The resulting value or list of values is both stored back in the entity and returned from this method. """value=self._retrieve_value(entity,self._default)ifself._repeated:ifvalueisNone:value=[]self._store_value(entity,value)else:value[:]=map(function,value)else:ifvalueisnotNone:newvalue=function(value)ifnewvalueisnotNoneandnewvalueisnotvalue:self._store_value(entity,newvalue)value=newvaluereturnvaluedef_get_value(self,entity):"""Internal helper to get the value for this Property from an entity. For a repeated Property this initializes the value to an empty list if it is not set. """ifentity._projection:ifself._namenotinentity._projection:raiseUnprojectedPropertyError('Property %s is not in the projection'%(self._name,))returnself._get_user_value(entity)def_delete_value(self,entity):"""Internal helper to delete the value for this Property from an entity. 
Note that if no value exists this is a no-op; deleted values will not be serialized but requesting their value will return None (or an empty list in the case of a repeated Property). """ifself._nameinentity._values:delentity._values[self._name]def_is_initialized(self,entity):"""Internal helper to ask if the entity has a value for this Property. This returns False if a value is stored but it is None. """return(notself._requiredor((self._has_value(entity)orself._defaultisnotNone)andself._get_value(entity)isnotNone))def__get__(self,entity,unused_cls=None):"""Descriptor protocol: get the value from the entity."""ifentityisNone:returnself# __get__ called on classreturnself._get_value(entity)def__set__(self,entity,value):"""Descriptor protocol: set the value on the entity."""self._set_value(entity,value)def__delete__(self,entity):"""Descriptor protocol: delete the value from the entity."""self._delete_value(entity)def_serialize(self,entity,pb,prefix='',parent_repeated=False,projection=None):"""Internal helper to serialize this property to a protocol buffer. Subclasses may override this method. Args: entity: The entity, a Model (subclass) instance. pb: The protocol buffer, an EntityProto instance. prefix: Optional name prefix used for StructuredProperty (if present, must end in '.'). parent_repeated: True if the parent (or an earlier ancestor) is a repeated Property. projection: A list or tuple of strings representing the projection for the model instance, or None if the instance is not a projection. 
"""values=self._get_base_value_unwrapped_as_list(entity)name=prefix+self._nameifprojectionandnamenotinprojection:returnifself._indexed:create_prop=lambda:pb.add_property()else:create_prop=lambda:pb.add_raw_property()ifself._repeatedandnotvaluesandself._write_empty_list:# We want to write the empty listp=create_prop()p.set_name(name)p.set_multiple(False)p.set_meaning(entity_pb.Property.EMPTY_LIST)p.mutable_value()else:# We write a list, or a single propertyforvalinvalues:p=create_prop()p.set_name(name)p.set_multiple(self._repeatedorparent_repeated)v=p.mutable_value()ifvalisnotNone:self._db_set_value(v,p,val)ifprojection:# Projected properties have the INDEX_VALUE meaning and only contain# the original property's name and value.new_p=entity_pb.Property()new_p.set_name(p.name())new_p.set_meaning(entity_pb.Property.INDEX_VALUE)new_p.set_multiple(False)new_p.mutable_value().CopyFrom(v)p.CopyFrom(new_p)def_deserialize(self,entity,p,unused_depth=1):"""Internal helper to deserialize this property from a protocol buffer. Subclasses may override this method. Args: entity: The entity, a Model (subclass) instance. p: A Property Message object (a protocol buffer). depth: Optional nesting depth, default 1 (unused here, but used by some subclasses that override this method). 
"""ifp.meaning()==entity_pb.Property.EMPTY_LIST:self._store_value(entity,[])returnval=self._db_get_value(p.value(),p)ifvalisnotNone:val=_BaseValue(val)# TODO: replace the remainder of the function with the following commented# out code once its feasible to make breaking changes such as not calling# _store_value().# if self._repeated:# entity._values.setdefault(self._name, []).append(val)# else:# entity._values[self._name] = valifself._repeated:ifself._has_value(entity):value=self._retrieve_value(entity)assertisinstance(value,list),repr(value)value.append(val)else:# We promote single values to lists if we are a list propertyvalue=[val]else:value=valself._store_value(entity,value)def_prepare_for_put(self,entity):passdef_check_property(self,rest=None,require_indexed=True):"""Internal helper to check this property for specific requirements. Called by Model._check_properties(). Args: rest: Optional subproperty to check, of the form 'name1.name2...nameN'. Raises: InvalidPropertyError if this property does not meet the given requirements or if a subproperty is specified. (StructuredProperty overrides this method to handle subproperties.) """ifrequire_indexedandnotself._indexed:raiseInvalidPropertyError('Property is unindexed %s'%self._name)ifrest:raiseInvalidPropertyError('Referencing subproperty %s.%s ''but %s is not a structured property'%(self._name,rest,self._name))def_get_for_dict(self,entity):"""Retrieve the value like _get_value(), processed for _to_dict(). Property subclasses can override this if they want the dictionary returned by entity._to_dict() to contain a different value. The main use case is StructuredProperty and LocalStructuredProperty. NOTES:: - If you override _get_for_dict() to return a different type, you must override _validate() to accept values of that type and convert them back to the original type. - If you override _get_for_dict(), you must handle repeated values and None correctly. (See _StructuredGetForDictMixin for an example.) 
However, _validate() does not need to handle these. """returnself._get_value(entity)
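The `_find_methods()`/`_apply_list()` machinery above composes every `_validate()` and `_to_base_type()` found along the class's MRO into one callable. A minimal, self-contained Python 3 sketch of that pattern (not the ndb implementation; the class and method names here are hypothetical):

```python
# Illustrative sketch of MRO-based method composition, as used by
# Property._find_methods() and Property._apply_list() above.

class Base:
    @staticmethod
    def step(value):
        return value.strip()        # normalize

class Derived(Base):
    @staticmethod
    def step(value):
        return value.upper()        # transform first (most derived wins order)

def find_steps(cls):
    # Walk the MRO and collect each class's *own* 'step', most derived first.
    return [c.__dict__['step'].__func__
            for c in cls.__mro__ if 'step' in c.__dict__]

def apply_list(methods):
    # Compose into one callable; a None return keeps the previous value,
    # mirroring _apply_list().
    def call(value):
        for method in methods:
            newvalue = method(value)
            if newvalue is not None:
                value = newvalue
        return value
    return call

pipeline = apply_list(find_steps(Derived))
print(pipeline('  hello  '))  # HELLO
```

As in ndb, the composed pipeline applies the most derived class's method first, and the result of each step feeds the next.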
class ModelKey(Property):
  """Special property to store the Model key."""

  def __init__(self):
    super(ModelKey, self).__init__()
    self._name = '__key__'

  def _datastore_type(self, value):
    return datastore_types.Key(value.urlsafe())

  def _comparison(self, op, value):
    if value is not None:
      return super(ModelKey, self)._comparison(op, value)
    raise datastore_errors.BadValueError(
        "__key__ filter query can't be compared to None")

  # TODO: Support IN().

  def _validate(self, value):
    return _validate_key(value)

  def _set_value(self, entity, value):
    """Setter for key attribute."""
    if value is not None:
      value = _validate_key(value, entity=entity)
      value = entity._validate_key(value)
    entity._entity_key = value

  def _get_value(self, entity):
    """Getter for key attribute."""
    return entity._entity_key

  def _delete_value(self, entity):
    """Deleter for key attribute."""
    entity._entity_key = None

class BooleanProperty(Property):
  """A Property whose value is a Python bool."""
  # TODO: Allow int/long values equal to 0 or 1?

  def _validate(self, value):
    if not isinstance(value, bool):
      raise datastore_errors.BadValueError('Expected bool, got %r' %
                                           (value,))
    return value

  def _db_set_value(self, v, unused_p, value):
    if not isinstance(value, bool):
      raise TypeError('BooleanProperty %s can only be set to bool values; '
                      'received %r' % (self._name, value))
    v.set_booleanvalue(value)

  def _db_get_value(self, v, unused_p):
    if not v.has_booleanvalue():
      return None
    # The booleanvalue field is an int32, so booleanvalue() returns an
    # int, hence the conversion.
    return bool(v.booleanvalue())

class IntegerProperty(Property):
  """A Property whose value is a Python int or long (or bool)."""

  def _validate(self, value):
    if not isinstance(value, (int, long)):
      raise datastore_errors.BadValueError('Expected integer, got %r' %
                                           (value,))
    return int(value)

  def _db_set_value(self, v, unused_p, value):
    if not isinstance(value, (bool, int, long)):
      raise TypeError('IntegerProperty %s can only be set to integer values; '
                      'received %r' % (self._name, value))
    v.set_int64value(value)

  def _db_get_value(self, v, unused_p):
    if not v.has_int64value():
      return None
    return int(v.int64value())

class FloatProperty(Property):
  """A Property whose value is a Python float.

  Note: int, long and bool are also allowed.
  """

  def _validate(self, value):
    if not isinstance(value, (int, long, float)):
      raise datastore_errors.BadValueError('Expected float, got %r' %
                                           (value,))
    return float(value)

  def _db_set_value(self, v, unused_p, value):
    if not isinstance(value, (bool, int, long, float)):
      raise TypeError('FloatProperty %s can only be set to integer or float '
                      'values; received %r' % (self._name, value))
    v.set_doublevalue(float(value))

  def _db_get_value(self, v, unused_p):
    if not v.has_doublevalue():
      return None
    return v.doublevalue()

# A custom 'meaning' for compressed properties.
_MEANING_URI_COMPRESSED = 'ZLIB'


class _CompressedValue(_NotEqualMixin):
  """A marker object wrapping compressed values."""

  __slots__ = ['z_val']

  def __init__(self, z_val):
    """Constructor.  Argument is a string returned by zlib.compress()."""
    assert isinstance(z_val, str), repr(z_val)
    self.z_val = z_val

  def __repr__(self):
    return '_CompressedValue(%s)' % repr(self.z_val)

  def __eq__(self, other):
    if not isinstance(other, _CompressedValue):
      return NotImplemented
    return self.z_val == other.z_val

  def __hash__(self):
    raise TypeError('_CompressedValue is not immutable')
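Conceptually, a compressed blob property just wraps `zlib.compress()` output in a marker object so later code knows to decompress on read. A minimal Python 3 sketch of that round trip (illustrative only; `CompressedValue`, `to_base_type`, and `from_base_type` are hypothetical stand-ins for the ndb internals):

```python
import zlib

class CompressedValue:
    """Marker wrapper for compressed bytes, analogous to _CompressedValue."""
    def __init__(self, z_val):
        assert isinstance(z_val, bytes), repr(z_val)
        self.z_val = z_val

def to_base_type(raw):
    # What BlobProperty(compressed=True)._to_base_type() does conceptually.
    return CompressedValue(zlib.compress(raw))

def from_base_type(value):
    # And the inverse, as in BlobProperty._from_base_type().
    return zlib.decompress(value.z_val)

data = b'spam and eggs ' * 100
wrapped = to_base_type(data)
assert from_base_type(wrapped) == data   # lossless round trip
assert len(wrapped.z_val) < len(data)    # repetitive data shrinks
```

The marker type matters: on read, only values still wrapped in the marker are decompressed, so compressed and uncompressed values can coexist.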

class BlobProperty(Property):
  """A Property whose value is a byte string.  It may be compressed."""

  _indexed = False
  _compressed = False

  _attributes = Property._attributes + ['_compressed']

  @utils.positional(1 + Property._positional)
  def __init__(self, name=None, compressed=False, **kwds):
    super(BlobProperty, self).__init__(name=name, **kwds)
    self._compressed = compressed
    if compressed and self._indexed:
      # TODO: Allow this, but only allow == and IN comparisons?
      raise NotImplementedError('BlobProperty %s cannot be compressed and '
                                'indexed at the same time.' % self._name)

  def _value_to_repr(self, value):
    long_repr = super(BlobProperty, self)._value_to_repr(value)
    # Note that we may truncate even if the value is shorter than
    # _MAX_STRING_LENGTH; e.g. if it contains many \xXX or \uUUUU
    # escapes.
    if len(long_repr) > _MAX_STRING_LENGTH + 4:
      # Truncate, assuming the final character is the closing quote.
      long_repr = long_repr[:_MAX_STRING_LENGTH] + '...' + long_repr[-1]
    return long_repr

  def _validate(self, value):
    if not isinstance(value, str):
      raise datastore_errors.BadValueError('Expected str, got %r' %
                                           (value,))
    if (self._indexed and
        not isinstance(self, TextProperty) and
        len(value) > _MAX_STRING_LENGTH):
      raise datastore_errors.BadValueError(
          'Indexed value %s must be at most %d bytes' %
          (self._name, _MAX_STRING_LENGTH))

  def _to_base_type(self, value):
    if self._compressed:
      return _CompressedValue(zlib.compress(value))

  def _from_base_type(self, value):
    if isinstance(value, _CompressedValue):
      return zlib.decompress(value.z_val)

  def _datastore_type(self, value):
    # Since this is only used for queries, and queries imply an
    # indexed property, always use ByteString.
    return datastore_types.ByteString(value)

  def _db_set_value(self, v, p, value):
    if isinstance(value, _CompressedValue):
      self._db_set_compressed_meaning(p)
      value = value.z_val
    else:
      self._db_set_uncompressed_meaning(p)
    v.set_stringvalue(value)

  def _db_set_compressed_meaning(self, p):
    # Use meaning_uri because setting meaning to something else that is not
    # BLOB or BYTESTRING will cause the value to be decoded from utf-8 in
    # datastore_types.FromPropertyPb.  That would break the compressed string.
    p.set_meaning_uri(_MEANING_URI_COMPRESSED)
    p.set_meaning(entity_pb.Property.BLOB)

  def _db_set_uncompressed_meaning(self, p):
    if self._indexed:
      p.set_meaning(entity_pb.Property.BYTESTRING)
    else:
      p.set_meaning(entity_pb.Property.BLOB)

  def _db_get_value(self, v, p):
    if not v.has_stringvalue():
      return None
    value = v.stringvalue()
    if p.meaning_uri() == _MEANING_URI_COMPRESSED:
      value = _CompressedValue(value)
    return value

class TextProperty(BlobProperty):
  """An unindexed Property whose value is a text string of unlimited length."""

  def _validate(self, value):
    if isinstance(value, str):
      # Decode from UTF-8 -- if this fails, we can't write it.
      try:
        length = len(value)
        value = value.decode('utf-8')
      except UnicodeError:
        raise datastore_errors.BadValueError('Expected valid UTF-8, got %r' %
                                             (value,))
    elif isinstance(value, unicode):
      length = len(value.encode('utf-8'))
    else:
      raise datastore_errors.BadValueError('Expected string, got %r' %
                                           (value,))
    if self._indexed and length > _MAX_STRING_LENGTH:
      raise datastore_errors.BadValueError(
          'Indexed value %s must be at most %d bytes' %
          (self._name, _MAX_STRING_LENGTH))

  def _to_base_type(self, value):
    if isinstance(value, unicode):
      return value.encode('utf-8')

  def _from_base_type(self, value):
    if isinstance(value, str):
      try:
        return unicode(value, 'utf-8')
      except UnicodeDecodeError:
        # Since older versions of NDB could write non-UTF-8 TEXT
        # properties, we can't just reject these.  But _validate() now
        # rejects these, so you can't write new non-UTF-8 TEXT
        # properties.
        # TODO: Eventually we should close this hole.
        pass

  def _db_set_uncompressed_meaning(self, p):
    if not self._indexed:
      p.set_meaning(entity_pb.Property.TEXT)

class GeoPtProperty(Property):
  """A Property whose value is a GeoPt."""

  def _validate(self, value):
    if not isinstance(value, GeoPt):
      raise datastore_errors.BadValueError('Expected GeoPt, got %r' %
                                           (value,))

  def _db_set_value(self, v, p, value):
    if not isinstance(value, GeoPt):
      raise TypeError('GeoPtProperty %s can only be set to GeoPt values; '
                      'received %r' % (self._name, value))
    p.set_meaning(entity_pb.Property.GEORSS_POINT)
    pv = v.mutable_pointvalue()
    pv.set_x(value.lat)
    pv.set_y(value.lon)

  def _db_get_value(self, v, unused_p):
    if not v.has_pointvalue():
      return None
    pv = v.pointvalue()
    return GeoPt(pv.x(), pv.y())

def _unpack_user(v):
  """Internal helper to unpack a User value from a protocol buffer."""
  uv = v.uservalue()
  email = unicode(uv.email().decode('utf-8'))
  auth_domain = unicode(uv.auth_domain().decode('utf-8'))
  obfuscated_gaiaid = uv.obfuscated_gaiaid().decode('utf-8')
  obfuscated_gaiaid = unicode(obfuscated_gaiaid)

  federated_identity = None
  if uv.has_federated_identity():
    federated_identity = unicode(uv.federated_identity().decode('utf-8'))

  value = users.User(email=email,
                     _auth_domain=auth_domain,
                     _user_id=obfuscated_gaiaid,
                     federated_identity=federated_identity)
  return value

class JsonProperty(BlobProperty):
  """A property whose value is any Json-encodable Python object."""

  _json_type = None

  @utils.positional(1 + BlobProperty._positional)
  def __init__(self, name=None, compressed=False, json_type=None, **kwds):
    super(JsonProperty, self).__init__(name=name, compressed=compressed,
                                       **kwds)
    self._json_type = json_type

  def _validate(self, value):
    if self._json_type is not None and not isinstance(value, self._json_type):
      raise TypeError('JSON property must be a %s' % self._json_type)

  # Use late import so the dependency is optional.

  def _to_base_type(self, value):
    try:
      import json
    except ImportError:
      import simplejson as json
    return json.dumps(value, separators=(',', ':'))

  def _from_base_type(self, value):
    try:
      import json
    except ImportError:
      import simplejson as json
    return json.loads(value)
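The `separators=(',', ':')` argument matters: it drops the spaces `json.dumps()` would otherwise insert, so the stored blob is as compact as possible. A quick self-contained demonstration (Python 3; `sort_keys=True` is added here only to make the output deterministic, ndb does not use it):

```python
import json

value = {'rank': 4, 'tags': ['a', 'b']}

# Compact form, as JsonProperty._to_base_type() produces (modulo key order).
compact = json.dumps(value, sort_keys=True, separators=(',', ':'))
assert compact == '{"rank":4,"tags":["a","b"]}'

# Default separators include spaces after ',' and ':'.
assert len(json.dumps(value)) > len(compact)

# _from_base_type() is simply json.loads(); the round trip is lossless
# for JSON-encodable values.
assert json.loads(compact) == value
```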

class UserProperty(Property):
  """A Property whose value is a User object.

  Note: this exists for backwards compatibility with existing
  Cloud Datastore schemas only; we do not recommend storing User
  objects directly in Cloud Datastore, but instead recommend storing
  the user.user_id() value.
  """

  _attributes = Property._attributes + ['_auto_current_user',
                                        '_auto_current_user_add']

  _auto_current_user = False
  _auto_current_user_add = False

  @utils.positional(1 + Property._positional)
  def __init__(self, name=None, auto_current_user=False,
               auto_current_user_add=False, **kwds):
    super(UserProperty, self).__init__(name=name, **kwds)
    # TODO: Disallow combining auto_current_user* and default?
    if self._repeated:
      if auto_current_user:
        raise ValueError('UserProperty could use auto_current_user and be '
                         'repeated, but there would be no point.')
      elif auto_current_user_add:
        raise ValueError('UserProperty could use auto_current_user_add and be '
                         'repeated, but there would be no point.')
    self._auto_current_user = auto_current_user
    self._auto_current_user_add = auto_current_user_add

  def _validate(self, value):
    if not isinstance(value, users.User):
      raise datastore_errors.BadValueError('Expected User, got %r' %
                                           (value,))

  def _prepare_for_put(self, entity):
    if (self._auto_current_user or
        (self._auto_current_user_add and not self._has_value(entity))):
      value = users.get_current_user()
      if value is not None:
        self._store_value(entity, value)

  def _db_set_value(self, v, p, value):
    datastore_types.PackUser(p.name(), value, v)

  def _db_get_value(self, v, unused_p):
    if not v.has_uservalue():
      return None
    return _unpack_user(v)

class KeyProperty(Property):
  """A Property whose value is a Key object.

  Optional keyword argument: kind=<kind>, to require that keys
  assigned to this property always have the indicated kind.  May be a
  string or a Model subclass.
  """

  _attributes = Property._attributes + ['_kind']

  _kind = None

  @utils.positional(2 + Property._positional)
  def __init__(self, *args, **kwds):
    # Support several positional signatures:
    # ()  =>  name=None, kind from kwds
    # (None)  =>  name=None, kind from kwds
    # (name)  =>  name=arg 0, kind from kwds
    # (kind)  =>  name=None, kind=arg 0
    # (name, kind)  =>  name=arg 0, kind=arg 1
    # (kind, name)  =>  name=arg 1, kind=arg 0
    # The positional kind must be a Model subclass; it cannot be a string.
    name = kind = None
    for arg in args:
      if isinstance(arg, basestring):
        if name is not None:
          raise TypeError('You can only specify one name')
        name = arg
      elif isinstance(arg, type) and issubclass(arg, Model):
        if kind is not None:
          raise TypeError('You can only specify one kind')
        kind = arg
      elif arg is not None:
        raise TypeError('Unexpected positional argument: %r' % (arg,))
    if name is None:
      name = kwds.pop('name', None)
    elif 'name' in kwds:
      raise TypeError('You can only specify name once')
    if kind is None:
      kind = kwds.pop('kind', None)
    elif 'kind' in kwds:
      raise TypeError('You can only specify kind once')
    if kind is not None:
      if isinstance(kind, type) and issubclass(kind, Model):
        kind = kind._get_kind()
      if isinstance(kind, unicode):
        kind = kind.encode('utf-8')
      if not isinstance(kind, str):
        raise TypeError('kind must be a Model class or a string')
    super(KeyProperty, self).__init__(name, **kwds)
    self._kind = kind

  def _datastore_type(self, value):
    return datastore_types.Key(value.urlsafe())

  def _validate(self, value):
    if not isinstance(value, Key):
      raise datastore_errors.BadValueError('Expected Key, got %r' % (value,))
    # Reject incomplete keys.
    if not value.id():
      raise datastore_errors.BadValueError('Expected complete Key, got %r' %
                                           (value,))
    if self._kind is not None:
      if value.kind() != self._kind:
        raise datastore_errors.BadValueError(
            'Expected Key with kind=%r, got %r' % (self._kind, value))

  def _db_set_value(self, v, unused_p, value):
    if not isinstance(value, Key):
      raise TypeError('KeyProperty %s can only be set to Key values; '
                      'received %r' % (self._name, value))
    # See datastore_types.PackKey
    ref = value.reference()
    rv = v.mutable_referencevalue()  # A Reference
    rv.set_app(ref.app())
    if ref.has_name_space():
      rv.set_name_space(ref.name_space())
    for elem in ref.path().element_list():
      rv.add_pathelement().CopyFrom(elem)

  def _db_get_value(self, v, unused_p):
    if not v.has_referencevalue():
      return None
    ref = entity_pb.Reference()
    rv = v.referencevalue()
    if rv.has_app():
      ref.set_app(rv.app())
    if rv.has_name_space():
      ref.set_name_space(rv.name_space())
    path = ref.mutable_path()
    for elem in rv.pathelement_list():
      path.add_element().CopyFrom(elem)
    return Key(reference=ref)

class BlobKeyProperty(Property):
  """A Property whose value is a BlobKey object."""

  def _validate(self, value):
    if not isinstance(value, datastore_types.BlobKey):
      raise datastore_errors.BadValueError('Expected BlobKey, got %r' %
                                           (value,))

  def _db_set_value(self, v, p, value):
    if not isinstance(value, datastore_types.BlobKey):
      raise TypeError('BlobKeyProperty %s can only be set to BlobKey values; '
                      'received %r' % (self._name, value))
    p.set_meaning(entity_pb.Property.BLOBKEY)
    v.set_stringvalue(str(value))

  def _db_get_value(self, v, unused_p):
    if not v.has_stringvalue():
      return None
    return datastore_types.BlobKey(v.stringvalue())

class DateTimeProperty(Property):
  """A Property whose value is a datetime object.

  Note: Unlike Django, auto_now_add can be overridden by setting the
  value before writing the entity.  And unlike classic db, auto_now
  does not supply a default value.  Also unlike classic db, when the
  entity is written, the property values are updated to match what
  was written.  Finally, beware that this also updates the value in
  the in-process cache, *and* that auto_now_add may interact weirdly
  with transaction retries (a retry of a property with auto_now_add
  set will reuse the value that was set on the first try).
  """

  _attributes = Property._attributes + ['_auto_now', '_auto_now_add']

  _auto_now = False
  _auto_now_add = False

  @utils.positional(1 + Property._positional)
  def __init__(self, name=None, auto_now=False, auto_now_add=False, **kwds):
    super(DateTimeProperty, self).__init__(name=name, **kwds)
    # TODO: Disallow combining auto_now* and default?
    if self._repeated:
      if auto_now:
        raise ValueError('DateTimeProperty %s could use auto_now and be '
                         'repeated, but there would be no point.' % self._name)
      elif auto_now_add:
        raise ValueError('DateTimeProperty %s could use auto_now_add and be '
                         'repeated, but there would be no point.' % self._name)
    self._auto_now = auto_now
    self._auto_now_add = auto_now_add

  def _validate(self, value):
    if not isinstance(value, datetime.datetime):
      raise datastore_errors.BadValueError('Expected datetime, got %r' %
                                           (value,))

  def _now(self):
    return datetime.datetime.utcnow()

  def _prepare_for_put(self, entity):
    if (self._auto_now or
        (self._auto_now_add and not self._has_value(entity))):
      value = self._now()
      self._store_value(entity, value)

  def _db_set_value(self, v, p, value):
    if not isinstance(value, datetime.datetime):
      raise TypeError('DatetimeProperty %s can only be set to datetime values; '
                      'received %r' % (self._name, value))
    if value.tzinfo is not None:
      raise NotImplementedError('DatetimeProperty %s can only support UTC. '
                                'Please derive a new Property to support '
                                'alternative timezones.' % self._name)
    dt = value - _EPOCH
    ival = dt.microseconds + 1000000 * (dt.seconds + 24 * 3600 * dt.days)
    v.set_int64value(ival)
    p.set_meaning(entity_pb.Property.GD_WHEN)

  def _db_get_value(self, v, unused_p):
    if not v.has_int64value():
      return None
    ival = v.int64value()
    return _EPOCH + datetime.timedelta(microseconds=ival)
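The storage encoding above is a single int64: microseconds since the Unix epoch, computed from the `timedelta` fields. A self-contained Python 3 sketch of that conversion and its inverse (the helper names here are hypothetical; only the arithmetic mirrors `_db_set_value()`/`_db_get_value()`):

```python
import datetime

_EPOCH = datetime.datetime(1970, 1, 1)  # naive datetime, interpreted as UTC

def datetime_to_int64(value):
    # Mirrors _db_set_value(): flatten the timedelta into microseconds.
    dt = value - _EPOCH
    return dt.microseconds + 1000000 * (dt.seconds + 24 * 3600 * dt.days)

def int64_to_datetime(ival):
    # Mirrors _db_get_value(): timedelta accepts the microsecond count back.
    return _EPOCH + datetime.timedelta(microseconds=ival)

when = datetime.datetime(2012, 3, 4, 5, 6, 7, 89)
assert int64_to_datetime(datetime_to_int64(when)) == when  # lossless
assert datetime_to_int64(_EPOCH) == 0
```

Because the value is naive, the round trip is exact; this is why the property rejects any `tzinfo`-carrying datetime rather than silently converting it.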

def _date_to_datetime(value):
  """Convert a date to a datetime for Cloud Datastore storage.

  Args:
    value: A datetime.date object.

  Returns:
    A datetime object with time set to 0:00.
  """
  if not isinstance(value, datetime.date):
    raise TypeError('Cannot convert to datetime expected date value; '
                    'received %s' % value)
  return datetime.datetime(value.year, value.month, value.day)


def _time_to_datetime(value):
  """Convert a time to a datetime for Cloud Datastore storage.

  Args:
    value: A datetime.time object.

  Returns:
    A datetime object with date set to 1970-01-01.
  """
  if not isinstance(value, datetime.time):
    raise TypeError('Cannot convert to datetime expected time value; '
                    'received %s' % value)
  return datetime.datetime(1970, 1, 1,
                           value.hour, value.minute, value.second,
                           value.microsecond)
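These helpers widen `date` and `time` values to full datetimes so the date- and time-only properties can reuse `DateTimeProperty`'s int64 storage. A quick Python 3 check of the widening logic (standalone; it restates the two constructions above rather than importing ndb):

```python
import datetime

# _date_to_datetime: midnight of the given date.
d = datetime.date(2012, 3, 4)
d_dt = datetime.datetime(d.year, d.month, d.day)
assert d_dt.date() == d and d_dt.time() == datetime.time()

# _time_to_datetime: the given time on 1970-01-01.
t = datetime.time(5, 6, 7, 89)
t_dt = datetime.datetime(1970, 1, 1, t.hour, t.minute, t.second,
                         t.microsecond)
assert t_dt.time() == t and t_dt.date() == datetime.date(1970, 1, 1)
```

The inverse conversions are then trivial: `DateProperty` reads back `value.date()` and `TimeProperty` reads back `value.time()`, as in `_from_base_type()` below.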

class TimeProperty(DateTimeProperty):
  """A Property whose value is a time object."""

  def _validate(self, value):
    if not isinstance(value, datetime.time):
      raise datastore_errors.BadValueError('Expected time, got %r' %
                                           (value,))

  def _to_base_type(self, value):
    assert isinstance(value, datetime.time), repr(value)
    return _time_to_datetime(value)

  def _from_base_type(self, value):
    assert isinstance(value, datetime.datetime), repr(value)
    return value.time()

  def _now(self):
    return datetime.datetime.utcnow().time()

class _StructuredGetForDictMixin(Property):
  """Mixin class so *StructuredProperty can share _get_for_dict().

  The behavior here is that sub-entities are converted to
  dictionaries by calling to_dict() on them (also doing the right
  thing for repeated properties).

  NOTE: Even though the _validate() method in StructuredProperty and
  LocalStructuredProperty are identical, they cannot be moved into
  this shared base class.  The reason is subtle: _validate() is not a
  regular method, but treated specially by _call_to_base_type() and
  _call_shallow_validation(), and the class where it occurs matters
  if it also defines _to_base_type().
  """

  def _get_for_dict(self, entity):
    value = self._get_value(entity)
    if self._repeated:
      value = [v._to_dict() for v in value]
    elif value is not None:
      value = value._to_dict()
    return value

class StructuredProperty(_StructuredGetForDictMixin):
  """A Property whose value is itself an entity.

  The values of the sub-entity are indexed and can be queried.

  See the module docstring for details.
  """

  _modelclass = None

  _attributes = ['_modelclass'] + Property._attributes
  _positional = 1 + Property._positional  # Add modelclass as positional arg.

  @utils.positional(1 + _positional)
  def __init__(self, modelclass, name=None, **kwds):
    super(StructuredProperty, self).__init__(name=name, **kwds)
    if self._repeated:
      if modelclass._has_repeated:
        raise TypeError('This StructuredProperty cannot use repeated=True '
                        'because its model class (%s) contains repeated '
                        'properties (directly or indirectly).' %
                        modelclass.__name__)
    self._modelclass = modelclass

  def _get_value(self, entity):
    """Override _get_value() to *not* raise UnprojectedPropertyError."""
    value = self._get_user_value(entity)
    if value is None and entity._projection:
      # Invoke super _get_value() to raise the proper exception.
      return super(StructuredProperty, self)._get_value(entity)
    return value

  def __getattr__(self, attrname):
    """Dynamically get a subproperty."""
    # Optimistically try to use the dict key.
    prop = self._modelclass._properties.get(attrname)
    # We're done if we have a hit and _code_name matches.
    if prop is None or prop._code_name != attrname:
      # Otherwise, use linear search looking for a matching _code_name.
      for prop in self._modelclass._properties.values():
        if prop._code_name == attrname:
          break
      else:
        # This is executed when we never execute the above break.
        prop = None
    if prop is None:
      raise AttributeError('Model subclass %s has no attribute %s' %
                           (self._modelclass.__name__, attrname))
    prop_copy = copy.copy(prop)
    prop_copy._name = self._name + '.' + prop_copy._name
    # Cache the outcome, so subsequent requests for the same attribute
    # name will get the copied property directly rather than going
    # through the above motions all over again.
    setattr(self, attrname, prop_copy)
    return prop_copy

  def _comparison(self, op, value):
    if op != '=':
      raise datastore_errors.BadFilterError(
          'StructuredProperty filter can only use ==')
    if not self._indexed:
      raise datastore_errors.BadFilterError(
          'Cannot query for unindexed StructuredProperty %s' % self._name)
    # Import late to avoid circular imports.
    from .query import ConjunctionNode, PostFilterNode
    from .query import RepeatedStructuredPropertyPredicate
    if value is None:
      from .query import FilterNode  # Import late to avoid circular imports.
      return FilterNode(self._name, op, value)
    value = self._do_validate(value)
    value = self._call_to_base_type(value)
    filters = []
    match_keys = []
    # TODO: Why not just iterate over value._values?
    for prop in self._modelclass._properties.itervalues():
      vals = prop._get_base_value_unwrapped_as_list(value)
      if prop._repeated:
        if vals:
          raise datastore_errors.BadFilterError(
              'Cannot query for non-empty repeated property %s' %
              prop._name)
        continue
      assert isinstance(vals, list) and len(vals) == 1, repr(vals)
      val = vals[0]
      if val is not None:
        altprop = getattr(self, prop._code_name)
        filt = altprop._comparison(op, val)
        filters.append(filt)
        match_keys.append(altprop._name)
    if not filters:
      raise datastore_errors.BadFilterError(
          'StructuredProperty filter without any values')
    if len(filters) == 1:
      return filters[0]
    if self._repeated:
      pb = value._to_pb(allow_partial=True)
      pred = RepeatedStructuredPropertyPredicate(match_keys, pb,
                                                 self._name + '.')
      filters.append(PostFilterNode(pred))
    return ConjunctionNode(*filters)

  def _IN(self, value):
    if not isinstance(value, (list, tuple, set, frozenset)):
      raise datastore_errors.BadArgumentError(
          'Expected list, tuple or set, got %r' % (value,))
    from .query import DisjunctionNode, FalseNode
    # Expand to a series of == filters.
    filters = [self._comparison('=', val) for val in value]
    if not filters:
      # DisjunctionNode doesn't like an empty list of filters.
      # Running the query will still fail, but this matches the
      # behavior of IN for regular properties.
      return FalseNode()
    else:
      return DisjunctionNode(*filters)
  IN = _IN

  def _validate(self, value):
    if isinstance(value, dict):
      # A dict is assumed to be the result of a _to_dict() call.
      return self._modelclass(**value)
    if not isinstance(value, self._modelclass):
      raise datastore_errors.BadValueError('Expected %s instance, got %r' %
                                           (self._modelclass.__name__, value))

  def _has_value(self, entity, rest=None):
    # rest: optional list of attribute names to check in addition.
    # Basically, prop._has_value(self, ent, ['x', 'y']) is similar to
    #   (prop._has_value(ent) and
    #    prop.x._has_value(ent.x) and
    #    prop.x.y._has_value(ent.x.y))
    # assuming prop.x and prop.x.y exist.
    # NOTE: This is not particularly efficient if len(rest) > 1,
    # but that seems a rare case, so for now I don't care.
    ok = super(StructuredProperty, self)._has_value(entity)
    if ok and rest:
      lst = self._get_base_value_unwrapped_as_list(entity)
      if len(lst) != 1:
        raise RuntimeError('Failed to retrieve sub-entity of '
                           'StructuredProperty %s' % self._name)
      subent = lst[0]
      if subent is None:
        return True
      subprop = subent._properties.get(rest[0])
      if subprop is None:
        ok = False
      else:
        ok = subprop._has_value(subent, rest[1:])
    return ok

  def _serialize(self, entity, pb, prefix='', parent_repeated=False,
                 projection=None):
    # entity -> pb; pb is an EntityProto message
    values = self._get_base_value_unwrapped_as_list(entity)
    for value in values:
      if value is not None:
        # TODO: Avoid re-sorting for repeated values.
        for unused_name, prop in sorted(value._properties.iteritems()):
          prop._serialize(value, pb, prefix + self._name + '.',
                          self._repeated or parent_repeated,
                          projection=projection)
      else:
        # Serialize a single None
        super(StructuredProperty, self)._serialize(
            entity, pb, prefix=prefix, parent_repeated=parent_repeated,
            projection=projection)

  def _deserialize(self, entity, p, depth=1):
    if not self._repeated:
      subentity = self._retrieve_value(entity)
      if subentity is None:
        subentity = self._modelclass()
        self._store_value(entity, _BaseValue(subentity))
      cls = self._modelclass
      if isinstance(subentity, _BaseValue):
        # NOTE: It may not be a _BaseValue when we're deserializing a
        # repeated structured property.
        subentity = subentity.b_val
      if not isinstance(subentity, cls):
        raise RuntimeError('Cannot deserialize StructuredProperty %s; value '
                           'retrieved not a %s instance %r' %
                           (self._name, cls.__name__, subentity))
      # _GenericProperty tries to keep compressed values as unindexed, but
      # won't override a set argument.  We need to force it at this level.
      # TODO(user): Remove this hack by passing indexed to _deserialize.
      # This cannot happen until we version the API.
      indexed = p.meaning_uri() != _MEANING_URI_COMPRESSED
      prop = subentity._get_property_for(p, depth=depth, indexed=indexed)
      if prop is None:
        # Special case: kill subentity after all.
        self._store_value(entity, None)
        return
      prop._deserialize(subentity, p, depth + 1)
      return

    # The repeated case is more complicated.
    # TODO: Prove we won't get here for orphans.
    name = p.name()
    parts = name.split('.')
    if len(parts) <= depth:
      raise RuntimeError('StructuredProperty %s expected to find properties '
                         'separated by periods at a depth of %i; '
                         'received %r' % (self._name, depth, parts))
    next = parts[depth]
    rest = parts[depth + 1:]
    prop = self._modelclass._properties.get(next)
    prop_is_fake = False
    if prop is None:
      # Synthesize a fake property.  (We can't use Model._fake_property()
      # because we need the property before we can determine the subentity.)
      if rest:
        # TODO: Handle this case, too.
        logging.warn('Skipping unknown structured subproperty (%s) '
                     'in repeated structured property (%s of %s)',
                     name, self._name, entity.__class__.__name__)
        return
      # TODO: Figure out the value for indexed.  Unfortunately we'd
      # need this passed in from _from_pb(), which would mean a
      # signature change for _deserialize(), which might break valid
      # end-user code that overrides it.
      compressed = p.meaning_uri() == _MEANING_URI_COMPRESSED
      prop = GenericProperty(next, compressed=compressed)
      prop._code_name = next
      prop_is_fake = True

    # Find the first subentity that doesn't have a value for this
    # property yet.
    if not hasattr(entity, '_subentity_counter'):
      entity._subentity_counter = _NestedCounter()
    counter = entity._subentity_counter
    counter_path = parts[depth - 1:]
    next_index = counter.get(counter_path)
    subentity = None
    if self._has_value(entity):
      # If an entire subentity has been set to None, we have to loop
      # to advance until we find the next partial entity.
      while next_index < self._get_value_size(entity):
        subentity = self._get_base_value_at_index(entity, next_index)
        if not isinstance(subentity, self._modelclass):
          raise TypeError('sub-entities must be instances '
                          'of their Model class.')
        if not prop._has_value(subentity, rest):
          break
        next_index = counter.increment(counter_path)
      else:
        subentity = None
    # The current property is going to be populated, so advance the counter.
    counter.increment(counter_path)
    if not subentity:
      # We didn't find one.  Add a new one to the underlying list of
      # values.
      subentity = self._modelclass()
      values = self._retrieve_value(entity, self._default)
      if values is None:
        self._store_value(entity, [])
        values = self._retrieve_value(entity, self._default)
      values.append(_BaseValue(subentity))
    if prop_is_fake:
      # Add the synthetic property to the subentity's _properties
      # dict, so that it will be correctly deserialized.
      # (See Model._fake_property() for comparison.)
      subentity._clone_properties()
      subentity._properties[prop._name] = prop
    prop._deserialize(subentity, p, depth + 1)

  def _prepare_for_put(self, entity):
    values = self._get_base_value_unwrapped_as_list(entity)
    for value in values:
      if value is not None:
        value._prepare_for_put()

  def _check_property(self, rest=None, require_indexed=True):
    """Override for Property._check_property().

    Raises:
      InvalidPropertyError if no subproperty is specified or if something
      is wrong with the subproperty.
    """
    if not rest:
      raise InvalidPropertyError(
          'Structured property %s requires a subproperty' % self._name)
    self._modelclass._check_properties([rest],
                                       require_indexed=require_indexed)

  def _get_base_value_at_index(self, entity, index):
    assert self._repeated
    value = self._retrieve_value(entity, self._default)
    value[index] = self._opt_call_to_base_type(value[index])
    return value[index].b_val

  def _get_value_size(self, entity):
    values = self._retrieve_value(entity, self._default)
    if values is None:
      return 0
    return len(values)
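The dotted-name scheme used by StructuredProperty._serialize() and _deserialize() above (sub-entity fields stored flat under names like `address.city`) can be illustrated with a small pure-Python sketch. This is not ndb code; `flatten` and `unflatten` are hypothetical helpers operating on plain dicts instead of EntityProto messages.

```python
def flatten(entity, prefix=''):
    """Flatten a nested dict into {'outer.inner': value} pairs,
    mirroring how StructuredProperty prefixes subproperty names."""
    flat = {}
    for name, value in entity.items():
        full = prefix + name
        if isinstance(value, dict):
            flat.update(flatten(value, full + '.'))
        else:
            flat[full] = value
    return flat

def unflatten(flat):
    """Rebuild the nested dict from dotted names (the deserialize side)."""
    entity = {}
    for full, value in flat.items():
        parts = full.split('.')
        node = entity
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return entity

person = {'name': 'Arthur', 'address': {'city': 'Cottington', 'zip': 'AB1'}}
flat = flatten(person)
assert flat == {'name': 'Arthur', 'address.city': 'Cottington',
                'address.zip': 'AB1'}
assert unflatten(flat) == person
```

This is also why `StructuredProperty.__getattr__()` returns a property copy whose `_name` has the parent name and a dot prepended: queries on `Person.address.city` filter on the flattened name `address.city`.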

class LocalStructuredProperty(_StructuredGetForDictMixin, BlobProperty):
  """Substructure that is serialized to an opaque blob.

  This looks like StructuredProperty on the Python side, but is
  written like a BlobProperty in Cloud Datastore.  It is not indexed
  and you cannot query for subproperties.  On the other hand, the
  on-disk representation is more efficient and can be made even more
  efficient by passing compressed=True, which compresses the blob
  data using gzip.
  """

  _indexed = False
  _modelclass = None
  _keep_keys = False

  _attributes = ['_modelclass'] + BlobProperty._attributes + ['_keep_keys']
  _positional = 1 + BlobProperty._positional  # Add modelclass as positional.

  @utils.positional(1 + _positional)
  def __init__(self, modelclass,
               name=None, compressed=False, keep_keys=False,
               **kwds):
    super(LocalStructuredProperty, self).__init__(name=name,
                                                  compressed=compressed,
                                                  **kwds)
    if self._indexed:
      raise NotImplementedError('Cannot index LocalStructuredProperty %s.' %
                                self._name)
    self._modelclass = modelclass
    self._keep_keys = keep_keys

  def _validate(self, value):
    if isinstance(value, dict):
      # A dict is assumed to be the result of a _to_dict() call.
      return self._modelclass(**value)
    if not isinstance(value, self._modelclass):
      raise datastore_errors.BadValueError('Expected %s instance, got %r' %
                                           (self._modelclass.__name__, value))

  def _to_base_type(self, value):
    if isinstance(value, self._modelclass):
      pb = value._to_pb(set_key=self._keep_keys)
      return pb.SerializePartialToString()

  def _from_base_type(self, value):
    if not isinstance(value, self._modelclass):
      pb = entity_pb.EntityProto()
      pb.MergePartialFromString(value)
      if not self._keep_keys:
        pb.clear_key()
      return self._modelclass._from_pb(pb)

  def _prepare_for_put(self, entity):
    # TODO: Using _get_user_value() here makes it impossible to
    # subclass this class and add a _from_base_type().  But using
    # _get_base_value() won't work, since that would return the
    # serialized (and possibly compressed) blob.
    value = self._get_user_value(entity)
    if value is not None:
      if self._repeated:
        for subent in value:
          if subent is not None:
            subent._prepare_for_put()
      else:
        value._prepare_for_put()

  def _db_set_uncompressed_meaning(self, p):
    p.set_meaning(entity_pb.Property.ENTITY_PROTO)
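The opaque-blob round trip that LocalStructuredProperty performs (`_to_base_type` serializes, `_from_base_type` parses back, with optional compression) can be sketched in plain Python. This is an illustrative analogue only: JSON stands in for the EntityProto wire format, and zlib stands in for the gzip-style compression that `compressed=True` enables.

```python
import json
import zlib

def to_blob(subentity, compressed=False):
    """Serialize a sub-entity (a plain dict here) to an opaque blob."""
    raw = json.dumps(subentity, sort_keys=True).encode('utf-8')
    return zlib.compress(raw) if compressed else raw

def from_blob(blob, compressed=False):
    """Parse the blob back into a sub-entity."""
    raw = zlib.decompress(blob) if compressed else blob
    return json.loads(raw.decode('utf-8'))

ent = {'street': '155 Country Lane', 'planet': 'Earth'}
assert from_blob(to_blob(ent)) == ent
assert from_blob(to_blob(ent, compressed=True), compressed=True) == ent
```

The trade-off mirrors the docstring above: the blob is cheap to read and write, but because Cloud Datastore sees only an opaque byte string, none of the inner fields can be indexed or queried.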

class GenericProperty(Property):
  """A Property whose value can be (almost) any basic type.

  This is mainly used for Expando and for orphans (values present in
  Cloud Datastore but not represented in the Model subclass) but can
  also be used explicitly for properties with dynamically-typed
  values.

  This supports compressed=True, which is only effective for str
  values (not for unicode), and implies indexed=False.
  """

  _compressed = False

  _attributes = Property._attributes + ['_compressed']

  @utils.positional(1 + Property._positional)
  def __init__(self, name=None, compressed=False, **kwds):
    if compressed:  # Compressed implies unindexed.
      kwds.setdefault('indexed', False)
    super(GenericProperty, self).__init__(name=name, **kwds)
    self._compressed = compressed
    if compressed and self._indexed:
      # TODO: Allow this, but only allow == and IN comparisons?
      raise NotImplementedError('GenericProperty %s cannot be compressed and '
                                'indexed at the same time.' % self._name)

  def _to_base_type(self, value):
    if self._compressed and isinstance(value, str):
      return _CompressedValue(zlib.compress(value))

  def _from_base_type(self, value):
    if isinstance(value, _CompressedValue):
      return zlib.decompress(value.z_val)

  def _validate(self, value):
    if self._indexed:
      if isinstance(value, unicode):
        value = value.encode('utf-8')
      if isinstance(value, basestring) and len(value) > _MAX_STRING_LENGTH:
        raise datastore_errors.BadValueError(
            'Indexed value %s must be at most %d bytes' %
            (self._name, _MAX_STRING_LENGTH))

  def _db_get_value(self, v, p):
    # This is awkward but there seems to be no faster way to inspect
    # what union member is present.  datastore_types.FromPropertyPb(),
    # the undisputed authority, has the same series of if-elif blocks.
    # (We don't even want to think about multiple members... :-)
    if v.has_stringvalue():
      sval = v.stringvalue()
      meaning = p.meaning()
      if meaning == entity_pb.Property.BLOBKEY:
        sval = BlobKey(sval)
      elif meaning == entity_pb.Property.BLOB:
        if p.meaning_uri() == _MEANING_URI_COMPRESSED:
          sval = _CompressedValue(sval)
      elif meaning == entity_pb.Property.ENTITY_PROTO:
        # NOTE: This is only used for uncompressed LocalStructuredProperties.
        pb = entity_pb.EntityProto()
        pb.MergePartialFromString(sval)
        modelclass = Expando
        if pb.key().path().element_size():
          kind = pb.key().path().element(-1).type()
          modelclass = Model._kind_map.get(kind, modelclass)
        sval = modelclass._from_pb(pb)
      elif meaning != entity_pb.Property.BYTESTRING:
        try:
          sval.decode('ascii')
          # If this passes, don't return unicode.
        except UnicodeDecodeError:
          try:
            sval = unicode(sval.decode('utf-8'))
          except UnicodeDecodeError:
            pass
      return sval
    elif v.has_int64value():
      ival = v.int64value()
      if p.meaning() == entity_pb.Property.GD_WHEN:
        return _EPOCH + datetime.timedelta(microseconds=ival)
      return ival
    elif v.has_booleanvalue():
      # The booleanvalue field is an int32, so booleanvalue() returns
      # an int, hence the conversion.
      return bool(v.booleanvalue())
    elif v.has_doublevalue():
      return v.doublevalue()
    elif v.has_referencevalue():
      rv = v.referencevalue()
      app = rv.app()
      namespace = rv.name_space()
      pairs = [(elem.type(), elem.id() or elem.name())
               for elem in rv.pathelement_list()]
      return Key(pairs=pairs, app=app, namespace=namespace)
    elif v.has_pointvalue():
      pv = v.pointvalue()
      return GeoPt(pv.x(), pv.y())
    elif v.has_uservalue():
      return _unpack_user(v)
    else:
      # A missing value implies null.
      return None

  def _db_set_value(self, v, p, value):
    # TODO: use a dict mapping types to functions
    if isinstance(value, str):
      v.set_stringvalue(value)
      # TODO: Set meaning to BLOB or BYTESTRING if it's not UTF-8?
      # (Or TEXT if unindexed.)
    elif isinstance(value, unicode):
      v.set_stringvalue(value.encode('utf8'))
      if not self._indexed:
        p.set_meaning(entity_pb.Property.TEXT)
    elif isinstance(value, bool):  # Must test before int!
      v.set_booleanvalue(value)
    elif isinstance(value, (int, long)):
      # pylint: disable=superfluous-parens
      if not (-_MAX_LONG <= value < _MAX_LONG):
        raise TypeError('Property %s can only accept 64-bit integers; '
                        'received %s' % (self._name, value))
      v.set_int64value(value)
    elif isinstance(value, float):
      v.set_doublevalue(value)
    elif isinstance(value, Key):
      # See datastore_types.PackKey
      ref = value.reference()
      rv = v.mutable_referencevalue()  # A Reference
      rv.set_app(ref.app())
      if ref.has_name_space():
        rv.set_name_space(ref.name_space())
      for elem in ref.path().element_list():
        rv.add_pathelement().CopyFrom(elem)
    elif isinstance(value, datetime.datetime):
      if value.tzinfo is not None:
        raise NotImplementedError('Property %s can only support UTC. '
                                  'Please derive a new Property to support '
                                  'alternative timezones.' % self._name)
      dt = value - _EPOCH
      ival = dt.microseconds + 1000000 * (dt.seconds + 24 * 3600 * dt.days)
      v.set_int64value(ival)
      p.set_meaning(entity_pb.Property.GD_WHEN)
    elif isinstance(value, GeoPt):
      p.set_meaning(entity_pb.Property.GEORSS_POINT)
      pv = v.mutable_pointvalue()
      pv.set_x(value.lat)
      pv.set_y(value.lon)
    elif isinstance(value, users.User):
      datastore_types.PackUser(p.name(), value, v)
    elif isinstance(value, BlobKey):
      v.set_stringvalue(str(value))
      p.set_meaning(entity_pb.Property.BLOBKEY)
    elif isinstance(value, Model):
      set_key = value._key is not None
      pb = value._to_pb(set_key=set_key)
      value = pb.SerializePartialToString()
      v.set_stringvalue(value)
      p.set_meaning(entity_pb.Property.ENTITY_PROTO)
    elif isinstance(value, _CompressedValue):
      value = value.z_val
      v.set_stringvalue(value)
      p.set_meaning_uri(_MEANING_URI_COMPRESSED)
      p.set_meaning(entity_pb.Property.BLOB)
    else:
      raise NotImplementedError('Property %s does not support %s types.' %
                                (self._name, type(value)))
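One subtlety of the isinstance chain in `_db_set_value()` is worth calling out: `bool` must be tested before `int`, because in Python `bool` is a subclass of `int` and would otherwise be stored as an integer. The sketch below (not ndb code; `tag_value` is a hypothetical stand-in for the dispatch) demonstrates the ordering and the 64-bit range check.

```python
def tag_value(value):
    """Classify a value the way a type-dispatch chain like
    _db_set_value() does.  Order matters: bool before int."""
    if isinstance(value, bool):   # bool is a subclass of int; test first!
        return 'boolean'
    if isinstance(value, int):
        if not (-2**63 <= value < 2**63):  # 64-bit signed range
            raise TypeError('only 64-bit integers are supported')
        return 'int64'
    if isinstance(value, float):
        return 'double'
    if isinstance(value, str):
        return 'string'
    raise NotImplementedError('unsupported type %s' % type(value))

assert tag_value(True) == 'boolean'   # not 'int64'
assert tag_value(42) == 'int64'
assert tag_value(3.14) == 'double'
```

Swapping the first two tests would silently store `True` as the integer 1, which is exactly the bug the "Must test before int!" comment in the original guards against.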

class ComputedProperty(GenericProperty):
  """A Property whose value is determined by a user-supplied function.

  Computed properties cannot be set directly, but are instead generated by
  a function when required.  They are useful to provide fields in Cloud
  Datastore that can be used for filtering or sorting without having to
  manually set the value in code - for example, sorting on the length of a
  BlobProperty, or using an equality filter to check if another field is
  not empty.

  ComputedProperty can be declared as a regular property, passing a
  function as the first argument, or it can be used as a decorator for the
  function that does the calculation.

  Example:

  >>> class DatastoreFile(Model):
  ...   name = StringProperty()
  ...   name_lower = ComputedProperty(lambda self: self.name.lower())
  ...
  ...   data = BlobProperty()
  ...
  ...   @ComputedProperty
  ...   def size(self):
  ...     return len(self.data)
  ...
  ...   def _compute_hash(self):
  ...     return hashlib.sha1(self.data).hexdigest()
  ...   hash = ComputedProperty(_compute_hash, name='sha1')
  """

  def __init__(self, func, name=None, indexed=None,
               repeated=None, verbose_name=None):
    """Constructor.

    Args:
      func: A function that takes one argument, the model instance, and
        returns a calculated value.
    """
    super(ComputedProperty, self).__init__(name=name, indexed=indexed,
                                           repeated=repeated,
                                           verbose_name=verbose_name)
    self._func = func

  def _set_value(self, entity, value):
    raise ComputedPropertyError("Cannot assign to a ComputedProperty")

  def _delete_value(self, entity):
    raise ComputedPropertyError("Cannot delete a ComputedProperty")

  def _get_value(self, entity):
    # About projections and computed properties: if the computed
    # property itself is in the projection, don't recompute it; this
    # prevents raising UnprojectedPropertyError if one of the
    # dependents is not in the projection.  However, if the computed
    # property is not in the projection, compute it normally -- its
    # dependents may all be in the projection, and it may be useful to
    # access the computed value without having it in the projection.
    # In this case, if any of the dependents is not in the projection,
    # accessing it in the computation function will raise
    # UnprojectedPropertyError which will just bubble up.
    if entity._projection and self._name in entity._projection:
      return super(ComputedProperty, self)._get_value(entity)
    value = self._func(entity)
    self._store_value(entity, value)
    return value

  def _prepare_for_put(self, entity):
    self._get_value(entity)  # For its side effects.
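The compute-on-read, reject-on-write behavior of ComputedProperty can be reproduced with a plain Python descriptor, without the datastore machinery. This is a minimal illustrative analogue, not ndb itself; the `Computed` and `File` names are hypothetical.

```python
class ComputedPropertyError(AttributeError):
    """Raised on attempts to assign to a computed attribute."""

class Computed:
    """Descriptor analogue of ComputedProperty: the value is derived
    from other attributes on every read and cannot be assigned."""
    def __init__(self, func):
        self.func = func

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return self.func(obj)  # Recompute from the instance's state.

    def __set__(self, obj, value):
        raise ComputedPropertyError('Cannot assign to a ComputedProperty')

class File:
    name_lower = Computed(lambda self: self.name.lower())
    size = Computed(lambda self: len(self.data))

    def __init__(self, name, data):
        self.name = name
        self.data = data

f = File('README.TXT', b'hello')
assert f.name_lower == 'readme.txt'
assert f.size == 5
```

Unlike the real ComputedProperty, this sketch never stores the computed value; ndb additionally writes it to Cloud Datastore at put() time so it can participate in queries.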

class MetaModel(type):
  """Metaclass for Model.

  This exists to fix up the properties -- they need to know their name.
  This is accomplished by calling the class's _fix_up_properties() method.
  """

  def __init__(cls, name, bases, classdict):
    super(MetaModel, cls).__init__(name, bases, classdict)
    cls._fix_up_properties()

  def __repr__(cls):
    props = []
    for _, prop in sorted(cls._properties.iteritems()):
      props.append('%s=%r' % (prop._code_name, prop))
    return '%s<%s>' % (cls.__name__, ', '.join(props))
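The fix-up trick MetaModel performs (telling each Property object the attribute name it was assigned to) can be shown with a self-contained sketch. This is not ndb code: `Prop` is a hypothetical stand-in for Property, and the metaclass below does inline what the real MetaModel delegates to Model._fix_up_properties().

```python
class Prop:
    """Stand-in for ndb's Property; _fix_up() tells it its name."""
    def __init__(self):
        self._name = None       # datastore name
        self._code_name = None  # Python attribute name

    def _fix_up(self, code_name):
        self._code_name = code_name
        if self._name is None:
            self._name = code_name

class FixUpMeta(type):
    """At class-creation time, find Prop attributes and fix them up."""
    def __init__(cls, name, bases, classdict):
        super().__init__(name, bases, classdict)
        cls._properties = {}
        for attr_name, attr in classdict.items():
            if isinstance(attr, Prop):
                attr._fix_up(attr_name)
                cls._properties[attr._name] = attr

class Person(metaclass=FixUpMeta):
    name = Prop()
    age = Prop()

assert set(Person._properties) == {'name', 'age'}
assert Person._properties['age']._code_name == 'age'
```

This is also why the Model docstring below warns against sharing one Property object between two attributes: fix-up stamps a single name onto the object, so the second assignment would clobber the first.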

class Model(_NotEqualMixin):
  """A class describing Cloud Datastore entities.

  Model instances are usually called entities.  All model classes
  inheriting from Model automatically have MetaModel as their
  metaclass, so that the properties are fixed up properly after the
  class is defined.

  Because of this, you cannot use the same Property object to describe
  multiple properties -- you must create separate Property objects for
  each property.  E.g. this does not work::

    wrong_prop = StringProperty()
    class Wrong(Model):
      wrong1 = wrong_prop
      wrong2 = wrong_prop

  The kind is normally equal to the class name (exclusive of the
  module name or any other parent scope).  To override the kind,
  define a class method named _get_kind(), as follows::

    class MyModel(Model):
      @classmethod
      def _get_kind(cls):
        return 'AnotherKind'
  """

  __metaclass__ = MetaModel

  # Class variables updated by _fix_up_properties()
  _properties = None
  _has_repeated = False
  _kind_map = {}  # Dict mapping {kind: Model subclass}

  # Defaults for instance variables.
  _entity_key = None
  _values = None
  _projection = ()  # Tuple of names of projected properties.

  # Hardcoded pseudo-property for the key.
  _key = ModelKey()
  key = _key

  def __init__(*args, **kwds):
    """Creates a new instance of this model (a.k.a. an entity).

    The new entity must be written to Cloud Datastore using an explicit
    call to .put().

    Keyword Args:
      key: Key instance for this model.  If key is used, id and parent
        must be None.
      id: Key id for this model.  If id is used, key must be None.
      parent: Key instance for the parent model or None for a top-level
        one.  If parent is used, key must be None.
      namespace: Optional namespace.
      app: Optional app ID.
      **kwds: Keyword arguments mapping to properties of this model.

    Note: you cannot define a property named key; the .key attribute
    always refers to the entity's key.  But you can define properties
    named id or parent.  Values for the latter cannot be passed through
    the constructor, but can be assigned to entity attributes after the
    entity has been created.
    """
    if len(args) > 1:
      raise TypeError('Model constructor takes no positional arguments.')
    # self is passed implicitly through args so users can define a property
    # named 'self'.
    (self,) = args
    get_arg = self.__get_arg
    key = get_arg(kwds, 'key')
    id = get_arg(kwds, 'id')
    app = get_arg(kwds, 'app')
    namespace = get_arg(kwds, 'namespace')
    parent = get_arg(kwds, 'parent')
    projection = get_arg(kwds, 'projection')
    if key is not None:
      if (id is not None or parent is not None or
          app is not None or namespace is not None):
        raise datastore_errors.BadArgumentError(
            'Model constructor given key= does not accept '
            'id=, app=, namespace=, or parent=.')
      self._key = _validate_key(key, entity=self)
    elif (id is not None or parent is not None or
          app is not None or namespace is not None):
      self._key = Key(self._get_kind(), id,
                      parent=parent, app=app, namespace=namespace)
    self._values = {}
    self._set_attributes(kwds)
    # Set the projection last, otherwise it will prevent _set_attributes().
    if projection:
      self._set_projection(projection)

  @classmethod
  def __get_arg(cls, kwds, kwd):
    """Internal helper method to parse keywords that may be property names."""
    alt_kwd = '_' + kwd
    if alt_kwd in kwds:
      return kwds.pop(alt_kwd)
    if kwd in kwds:
      obj = getattr(cls, kwd, None)
      if not isinstance(obj, Property) or isinstance(obj, ModelKey):
        return kwds.pop(kwd)
    return None

  def __getstate__(self):
    return self._to_pb().Encode()

  def __setstate__(self, serialized_pb):
    pb = entity_pb.EntityProto(serialized_pb)
    self.__init__()
    self.__class__._from_pb(pb, set_key=False, ent=self)

  def _populate(self, **kwds):
    """Populate an instance from keyword arguments.

    Each keyword argument will be used to set a corresponding
    property.  Keywords must refer to valid property names.  This is
    similar to passing keyword arguments to the Model constructor,
    except that no provisions for key, id or parent are made.
    """
    self._set_attributes(kwds)
  populate = _populate

  def _set_attributes(self, kwds):
    """Internal helper to set attributes from keyword arguments.

    Expando overrides this.
    """
    cls = self.__class__
    for name, value in kwds.iteritems():
      prop = getattr(cls, name)  # Raises AttributeError for unknown properties.
      if not isinstance(prop, Property):
        raise TypeError('Cannot set non-property %s' % name)
      prop._set_value(self, value)

  def _find_uninitialized(self):
    """Internal helper to find uninitialized properties.

    Returns:
      A set of property names.
    """
    return set(name
               for name, prop in self._properties.iteritems()
               if not prop._is_initialized(self))

  def _check_initialized(self):
    """Internal helper to check for uninitialized properties.

    Raises:
      BadValueError if it finds any.
    """
    baddies = self._find_uninitialized()
    if baddies:
      raise datastore_errors.BadValueError(
          'Entity has uninitialized properties: %s' % ', '.join(baddies))

  def __repr__(self):
    """Return an unambiguous string representation of an entity."""
    args = []
    for prop in self._properties.itervalues():
      if prop._has_value(self):
        val = prop._retrieve_value(self)
        if val is None:
          rep = 'None'
        elif prop._repeated:
          reprs = [prop._value_to_repr(v) for v in val]
          if reprs:
            reprs[0] = '[' + reprs[0]
            reprs[-1] = reprs[-1] + ']'
            rep = ', '.join(reprs)
          else:
            rep = '[]'
        else:
          rep = prop._value_to_repr(val)
        args.append('%s=%s' % (prop._code_name, rep))
    args.sort()
    if self._key is not None:
      args.insert(0, 'key=%r' % self._key)
    if self._projection:
      args.append('_projection=%r' % (self._projection,))
    s = '%s(%s)' % (self.__class__.__name__, ', '.join(args))
    return s

  @classmethod
  def _get_kind(cls):
    """Return the kind name for this class.

    This defaults to cls.__name__; users may override this to give a
    class a different on-disk name than its class name.
    """
    return cls.__name__

  @classmethod
  def _class_name(cls):
    """A hook for polymodel to override.

    For regular models and expandos this is just an alias for
    _get_kind().  For PolyModel subclasses, it returns the class name
    (as set in the 'class' attribute thereof), whereas _get_kind()
    returns the kind (the class name of the root class of a specific
    PolyModel hierarchy).
    """
    return cls._get_kind()

  @classmethod
  def _default_filters(cls):
    """Return an iterable of filters that are always to be applied.

    This is used by PolyModel to quietly insert a filter for the
    current class name.
    """
    return ()

  @classmethod
  def _reset_kind_map(cls):
    """Clear the kind map.  Useful for testing."""
    # Preserve "system" kinds, like __namespace__
    keep = {}
    for name, value in cls._kind_map.iteritems():
      if name.startswith('__') and name.endswith('__'):
        keep[name] = value
    cls._kind_map.clear()
    cls._kind_map.update(keep)

  @classmethod
  def _lookup_model(cls, kind, default_model=None):
    """Get the model class for the kind.

    Args:
      kind: A string representing the name of the kind to lookup.
      default_model: The model class to use if the kind can't be found.

    Returns:
      The model class for the requested kind.

    Raises:
      KindError: The kind was not found and no default_model was provided.
    """
    modelclass = cls._kind_map.get(kind, default_model)
    if modelclass is None:
      raise KindError(
          "No model class found for kind '%s'. Did you forget to import it?" %
          kind)
    return modelclass

  def _has_complete_key(self):
    """Return whether this entity has a complete key."""
    return self._key is not None and self._key.id() is not None
  has_complete_key = _has_complete_key

  def __hash__(self):
    """Dummy hash function.

    Raises:
      Always TypeError to emphasize that entities are mutable.
    """
    raise TypeError('Model is not immutable')

  # TODO: Reject __lt__, __le__, __gt__, __ge__.

  def __eq__(self, other):
    """Compare two entities of the same class for equality."""
    if other.__class__ is not self.__class__:
      return NotImplemented
    if self._key != other._key:
      # TODO: If one key is None and the other is an explicit
      # incomplete key of the simplest form, this should be OK.
      return False
    return self._equivalent(other)

  def _equivalent(self, other):
    """Compare two entities of the same class, excluding keys."""
    if other.__class__ is not self.__class__:  # TODO: What about subclasses?
      raise NotImplementedError('Cannot compare different model classes. '
                                '%s is not %s' %
                                (self.__class__.__name__,
                                 other.__class__.__name__))
    if set(self._projection) != set(other._projection):
      return False
    # It's all about determining inequality early.
    if len(self._properties) != len(other._properties):
      return False  # Can only happen for Expandos.
    my_prop_names = set(self._properties.iterkeys())
    their_prop_names = set(other._properties.iterkeys())
    if my_prop_names != their_prop_names:
      return False  # Again, only possible for Expandos.
    if self._projection:
      my_prop_names = set(self._projection)
    for name in my_prop_names:
      if '.' in name:
        name, _ = name.split('.', 1)
      my_value = self._properties[name]._get_value(self)
      their_value = other._properties[name]._get_value(other)
      if my_value != their_value:
        return False
    return True

  def _to_pb(self, pb=None, allow_partial=False, set_key=True):
    """Internal helper to turn an entity into an EntityProto protobuf."""
    if not allow_partial:
      self._check_initialized()
    if pb is None:
      pb = entity_pb.EntityProto()

    if set_key:
      # TODO: Move the key stuff into ModelAdapter.entity_to_pb()?
      self._key_to_pb(pb)

    for unused_name, prop in sorted(self._properties.iteritems()):
      prop._serialize(self, pb, projection=self._projection)

    return pb

  def _key_to_pb(self, pb):
    """Internal helper to copy the key into a protobuf."""
    key = self._key
    if key is None:
      pairs = [(self._get_kind(), None)]
      ref = key_module._ReferenceFromPairs(pairs, reference=pb.mutable_key())
    else:
      ref = key.reference()
      pb.mutable_key().CopyFrom(ref)
    group = pb.mutable_entity_group()  # Must initialize this.
    # To work around an SDK issue, only set the entity group if the
    # full key is complete.  TODO: Remove the top test once fixed.
    if key is not None and key.id():
      elem = ref.path().element(0)
      if elem.id() or elem.name():
        group.add_element().CopyFrom(elem)

  @classmethod
  def _from_pb(cls, pb, set_key=True, ent=None, key=None):
    """Internal helper to create an entity from an EntityProto protobuf."""
    if not isinstance(pb, entity_pb.EntityProto):
      raise TypeError('pb must be a EntityProto; received %r' % pb)
    if ent is None:
      ent = cls()

    # A key passed in overrides a key in the pb.
    if key is None and pb.key().path().element_size():
      key = Key(reference=pb.key())
    # If set_key is not set, skip a trivial incomplete key.
    if key is not None and (set_key or key.id() or key.parent()):
      ent._key = key

    # NOTE(user): Keep a map from (indexed, property name) to the property.
    # This allows us to skip the (relatively) expensive call to
    # _get_property_for for repeated fields.
    _property_map = {}
    projection = []
    for indexed, plist in ((True, pb.property_list()),
                           (False, pb.raw_property_list())):
      for p in plist:
        if p.meaning() == entity_pb.Property.INDEX_VALUE:
          projection.append(p.name())
        property_map_key = (p.name(), indexed)
        if property_map_key not in _property_map:
          _property_map[property_map_key] = ent._get_property_for(p, indexed)
        _property_map[property_map_key]._deserialize(ent, p)

    ent._set_projection(projection)
    return ent

  def _set_projection(self, projection):
    by_prefix = {}
    for propname in projection:
      if '.' in propname:
        head, tail = propname.split('.', 1)
        if head in by_prefix:
          by_prefix[head].append(tail)
        else:
          by_prefix[head] = [tail]
    self._projection = tuple(projection)
    for propname, proj in by_prefix.iteritems():
      prop = self._properties.get(propname)
      subval = prop._get_base_value_unwrapped_as_list(self)
      for item in subval:
        assert item is not None
        item._set_projection(proj)

  def _get_property_for(self, p, indexed=True, depth=0):
    """Internal helper to get the Property for a protobuf-level property."""
    parts = p.name().split('.')
    if len(parts) <= depth:
      # Apparently there's an unstructured value here.
      # Assume it is a None written for a missing value.
      # (It could also be that a schema change turned an unstructured
      # value into a structured one.  In that case, too, it seems
      # better to return None than to return an unstructured value,
      # since the latter doesn't match the current schema.)
      return None
    next = parts[depth]
    prop = self._properties.get(next)
    if prop is None:
      prop = self._fake_property(p, next, indexed)
    return prop

  def _clone_properties(self):
    """Internal helper to clone self._properties if necessary."""
    cls = self.__class__
    if self._properties is cls._properties:
      self._properties = dict(cls._properties)

  def _fake_property(self, p, next, indexed=True):
    """Internal helper to create a fake Property."""
    self._clone_properties()
    if p.name() != next and not p.name().endswith('.' + next):
      prop = StructuredProperty(Expando, next)
      prop._store_value(self, _BaseValue(Expando()))
    else:
      compressed = p.meaning_uri() == _MEANING_URI_COMPRESSED
      prop = GenericProperty(next,
                             repeated=p.multiple(),
                             indexed=indexed,
                             compressed=compressed)
    prop._code_name = next
    self._properties[prop._name] = prop
    return prop

  @utils.positional(1)
  def _to_dict(self, include=None, exclude=None):
    """Return a dict containing the entity's property values.

    Args:
      include: Optional set of property names to include, default all.
      exclude: Optional set of property names to skip, default none.
        A name contained in both include and exclude is excluded.
    """
    if (include is not None and
        not isinstance(include, (list, tuple, set, frozenset))):
      raise TypeError('include should be a list, tuple or set')
    if (exclude is not None and
        not isinstance(exclude, (list, tuple, set, frozenset))):
      raise TypeError('exclude should be a list, tuple or set')
    values = {}
    for prop in self._properties.itervalues():
      name = prop._code_name
      if include is not None and name not in include:
        continue
      if exclude is not None and name in exclude:
        continue
      try:
        values[name] = prop._get_for_dict(self)
      except UnprojectedPropertyError:
        pass  # Ignore unprojected properties rather than failing.
    return values
  to_dict = _to_dict

  @classmethod
  def _fix_up_properties(cls):
    """Fix up the properties by calling their _fix_up() method.

    Note: This is called by MetaModel, but may also be called manually
    after dynamically updating a model class.
    """
    # Verify that _get_kind() returns an 8-bit string.
    kind = cls._get_kind()
    if not isinstance(kind, basestring):
      raise KindError('Class %s defines a _get_kind() method that returns '
                      'a non-string (%r)' % (cls.__name__, kind))
    if not isinstance(kind, str):
      try:
        kind = kind.encode('ascii')  # ASCII contents is okay.
      except UnicodeEncodeError:
        raise KindError('Class %s defines a _get_kind() method that returns '
                        'a Unicode string (%r); please encode using utf-8' %
                        (cls.__name__, kind))
    cls._properties = {}  # Map of {name: Property}
    if cls.__module__ == __name__:  # Skip the classes in *this* file.
      return
    for name in set(dir(cls)):
      attr = getattr(cls, name, None)
      if isinstance(attr, ModelAttribute) and not isinstance(attr, ModelKey):
        if name.startswith('_'):
          raise TypeError('ModelAttribute %s cannot begin with an underscore '
                          'character. _ prefixed attributes are reserved for '
                          'temporary Model instance values.' % name)
        attr._fix_up(cls, name)
        if isinstance(attr, Property):
          if (attr._repeated or
              (isinstance(attr, StructuredProperty) and
               attr._modelclass._has_repeated)):
            cls._has_repeated = True
          cls._properties[attr._name] = attr
    cls._update_kind_map()

  @classmethod
  def _update_kind_map(cls):
    """Update the kind map to include this class."""
    cls._kind_map[cls._get_kind()] = cls

  def _prepare_for_put(self):
    if self._properties:
      for _, prop in sorted(self._properties.iteritems()):
        prop._prepare_for_put(self)

  @classmethod
  def _check_properties(cls, property_names, require_indexed=True):
    """Internal helper to check the given properties exist and meet
    specified requirements.

    Called from query.py.

    Args:
      property_names: List or tuple of property names -- each being a
        string, possibly containing dots (to address subproperties of
        structured properties).

    Raises:
      InvalidPropertyError if one of the properties is invalid.
      AssertionError if the argument is not a list or tuple of strings.
    """
    assert isinstance(property_names, (list, tuple)), repr(property_names)
    for name in property_names:
      assert isinstance(name, basestring), repr(name)
      if '.' in name:
        name, rest = name.split('.', 1)
      else:
        rest = None
      prop = cls._properties.get(name)
      if prop is None:
        cls._unknown_property(name)
      else:
        prop._check_property(rest, require_indexed=require_indexed)

  @classmethod
  def _unknown_property(cls, name):
    """Internal helper to raise an exception for an unknown property name.

    This is called by _check_properties().  It is overridden by
    Expando, where this is a no-op.

    Raises:
      InvalidPropertyError.
    """
    raise InvalidPropertyError('Unknown property %s' % name)

  def _validate_key(self, key):
    """Validation for _key attribute (designed to be overridden).

    Args:
      key: Proposed Key to use for entity.

    Returns:
      A valid key.
    """
    return key

  # Datastore API using the default context.
  # These use local import since otherwise they'd be recursive imports.

  @classmethod
  def _query(cls, *args, **kwds):
    """Create a Query object for this class.

    Args:
      distinct: Optional bool, short hand for group_by = projection.
      *args: Used to apply an initial filter
      **kwds: are passed to the Query() constructor.

    Returns:
      A Query object.
    """
    # Validating distinct.
    if 'distinct' in kwds:
      if 'group_by' in kwds:
        raise TypeError(
            'cannot use distinct= and group_by= at the same time')
      projection = kwds.get('projection')
      if not projection:
        raise TypeError(
            'cannot use distinct= without projection=')
      if kwds.pop('distinct'):
        kwds['group_by'] = projection

    # TODO: Disallow non-empty args and filter=.
    from .query import Query  # Import late to avoid circular imports.
    qry = Query(kind=cls._get_kind(), **kwds)
    qry = qry.filter(*cls._default_filters())
    qry = qry.filter(*args)
    return qry
  query = _query

  @classmethod
  def _gql(cls, query_string, *args, **kwds):
    """Run a GQL query."""
    from .query import gql  # Import late to avoid circular imports.
    return gql('SELECT * FROM %s %s' % (cls._class_name(), query_string),
               *args, **kwds)
  gql = _gql

  def _put(self, **ctx_options):
    """Write this entity to Cloud Datastore.

    If the operation creates or completes a key, the entity's key
    attribute is set to the new, complete key.

    Returns:
      The key for the entity.  This is always a complete key.
    """
    return self._put_async(**ctx_options).get_result()
  put = _put

  def _put_async(self, **ctx_options):
    """Write this entity to Cloud Datastore.

    This is the asynchronous version of Model._put().
    """
    if self._projection:
      raise datastore_errors.BadRequestError('Cannot put a partial entity')
    from . import tasklets
    ctx = tasklets.get_context()
    self._prepare_for_put()
    if self._key is None:
      self._key = Key(self._get_kind(), None)
    self._pre_put_hook()
    fut = ctx.put(self, **ctx_options)
    post_hook = self._post_put_hook
    if not self._is_default_hook(Model._default_post_put_hook, post_hook):
      fut.add_immediate_callback(post_hook, fut)
    return fut
  put_async = _put_async

  @classmethod
  def _get_or_insert(*args, **kwds):
    """Transactionally retrieves an existing entity or creates a new one.

    Positional Args:
      name: Key name to retrieve or create.

    Keyword Args:
      namespace: Optional namespace.
      app: Optional app ID.
      parent: Parent entity key, if any.
      context_options: ContextOptions object (not keyword args!) or None.
      **kwds: Keyword arguments to pass to the constructor of the model
        class if an instance for the specified key name does not already
        exist.  If an instance with the supplied key_name and parent
        already exists, these arguments will be discarded.

    Returns:
      Existing instance of Model class with the specified key name and
      parent or a new one that has just been created.
    """
    cls, args = args[0], args[1:]
    return cls._get_or_insert_async(*args, **kwds).get_result()
  get_or_insert = _get_or_insert

  @classmethod
  def _get_or_insert_async(*args, **kwds):
    """Transactionally retrieves an existing entity or creates a new one.

    This is the asynchronous version of Model._get_or_insert().
    """
    # NOTE: The signature is really weird here because we want to support
    # models with properties named e.g.
'cls' or 'name'.from.importtaskletscls,name=args# These must always be positional.get_arg=cls.__get_argapp=get_arg(kwds,'app')namespace=get_arg(kwds,'namespace')parent=get_arg(kwds,'parent')context_options=get_arg(kwds,'context_options')# (End of super-special argument parsing.)# TODO: Test the heck out of this, in all sorts of evil scenarios.ifnotisinstance(name,basestring):raiseTypeError('name must be a string; received %r'%name)elifnotname:raiseValueError('name cannot be an empty string.')key=Key(cls,name,app=app,namespace=namespace,parent=parent)@tasklets.taskletdefinternal_tasklet():@tasklets.taskletdeftxn():ent=yieldkey.get_async(options=context_options)ifentisNone:ent=cls(**kwds)# TODO: Use _populate().ent._key=keyyieldent.put_async(options=context_options)raisetasklets.Return(ent)ifin_transaction():# Run txn() in existing transaction.ent=yieldtxn()else:# Maybe avoid a transaction altogether.ent=yieldkey.get_async(options=context_options)ifentisNone:# Run txn() in new transaction.ent=yieldtransaction_async(txn)raisetasklets.Return(ent)returninternal_tasklet()get_or_insert_async=_get_or_insert_async@classmethoddef_allocate_ids(cls,size=None,max=None,parent=None,**ctx_options):"""Allocates a range of key IDs for this model class. Args: size: Number of IDs to allocate. Either size or max can be specified, not both. max: Maximum ID to allocate. Either size or max can be specified, not both. parent: Parent key for which the IDs will be allocated. **ctx_options: Context options. Returns: A tuple with (start, end) for the allocated range, inclusive. """returncls._allocate_ids_async(size=size,max=max,parent=parent,**ctx_options).get_result()allocate_ids=_allocate_ids@classmethoddef_allocate_ids_async(cls,size=None,max=None,parent=None,**ctx_options):"""Allocates a range of key IDs for this model class. This is the asynchronous version of Model._allocate_ids(). 
"""from.importtaskletsctx=tasklets.get_context()cls._pre_allocate_ids_hook(size,max,parent)key=Key(cls._get_kind(),None,parent=parent)fut=ctx.allocate_ids(key,size=size,max=max,**ctx_options)post_hook=cls._post_allocate_ids_hookifnotcls._is_default_hook(Model._default_post_allocate_ids_hook,post_hook):fut.add_immediate_callback(post_hook,size,max,parent,fut)returnfutallocate_ids_async=_allocate_ids_async@classmethod@utils.positional(3)def_get_by_id(cls,id,parent=None,**ctx_options):"""Returns an instance of Model class by ID. This is really just a shorthand for Key(cls, id, ...).get(). Args: id: A string or integer key ID. parent: Optional parent key of the model to get. namespace: Optional namespace. app: Optional app ID. **ctx_options: Context options. Returns: A model instance or None if not found. """returncls._get_by_id_async(id,parent=parent,**ctx_options).get_result()get_by_id=_get_by_id@classmethod@utils.positional(3)def_get_by_id_async(cls,id,parent=None,app=None,namespace=None,**ctx_options):"""Returns an instance of Model class by ID (and app, namespace). This is the asynchronous version of Model._get_by_id(). """key=Key(cls._get_kind(),id,parent=parent,app=app,namespace=namespace)returnkey.get_async(**ctx_options)get_by_id_async=_get_by_id_async# Hooks that wrap around mutations. Most are class methods with# the notable exception of put, which is an instance method.# To use these, override them in your model class and call# super(<myclass>, cls).<hook>(*args).# Note that the pre-hooks are called before the operation is# scheduled. The post-hooks are called (by the Future) after the# operation has completed.# Do not use or touch the _default_* hooks. 
These exist for# internal use only.@classmethoddef_pre_allocate_ids_hook(cls,size,max,parent):pass_default_pre_allocate_ids_hook=_pre_allocate_ids_hook@classmethoddef_post_allocate_ids_hook(cls,size,max,parent,future):pass_default_post_allocate_ids_hook=_post_allocate_ids_hook@classmethoddef_pre_delete_hook(cls,key):pass_default_pre_delete_hook=_pre_delete_hook@classmethoddef_post_delete_hook(cls,key,future):pass_default_post_delete_hook=_post_delete_hook@classmethoddef_pre_get_hook(cls,key):pass_default_pre_get_hook=_pre_get_hook@classmethoddef_post_get_hook(cls,key,future):pass_default_post_get_hook=_post_get_hookdef_pre_put_hook(self):pass_default_pre_put_hook=_pre_put_hookdef_post_put_hook(self,future):pass_default_post_put_hook=_post_put_hook@staticmethoddef_is_default_hook(default_hook,hook):"""Checks whether a specific hook is in its default state. Args: cls: A ndb.model.Model class. default_hook: Callable specified by ndb internally (do not override). hook: The hook defined by a model class using _post_*_hook. Raises: TypeError if either the default hook or the tested hook are not callable. """ifnothasattr(default_hook,'__call__'):raiseTypeError('Default hooks for ndb.model.Model must be callable')ifnothasattr(hook,'__call__'):raiseTypeError('Hooks must be callable')returndefault_hook.im_funcishook.im_func
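The `_is_default_hook` check above compares the functions wrapped by the bound methods (`im_func` in Python 2) so that a hook inherited unchanged from `Model` is recognized as the default and the post-hook callback registration can be skipped. A minimal, standalone Python 3 sketch of the same identity trick (the class and hook names here are illustrative, not ndb's):

```python
class Base(object):
  @classmethod
  def _pre_put_hook(cls):
    pass


class Unhooked(Base):
  pass  # Inherits the default hook unchanged.


class Hooked(Base):
  @classmethod
  def _pre_put_hook(cls):
    print('about to put')


def is_default_hook(default_hook, hook):
  # Bound (class)methods compare unequal even when they wrap the same
  # function object, so compare the wrapped functions by identity.
  # __func__ is the Python 3 spelling of Python 2's im_func.
  if not callable(default_hook) or not callable(hook):
    raise TypeError('Hooks must be callable')
  return (getattr(default_hook, '__func__', default_hook) is
          getattr(hook, '__func__', hook))


print(is_default_hook(Base._pre_put_hook, Unhooked._pre_put_hook))  # True
print(is_default_hook(Base._pre_put_hook, Hooked._pre_put_hook))    # False
```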

class Expando(Model):
  """Model subclass to support dynamic Property names and types.

  See the module docstring for details.
  """

  # Set this to False (in an Expando subclass or entity) to make
  # properties default to unindexed.
  _default_indexed = True

  # Set this to True to write [] to Cloud Datastore instead of no property
  _write_empty_list_for_dynamic_properties = None

  def _set_attributes(self, kwds):
    for name, value in kwds.iteritems():
      setattr(self, name, value)

  @classmethod
  def _unknown_property(cls, name):
    # It is not an error as the property may be a dynamic property.
    pass

  def __getattr__(self, name):
    if name.startswith('_'):
      return super(Expando, self).__getattr__(name)
    prop = self._properties.get(name)
    if prop is None:
      return super(Expando, self).__getattribute__(name)
    return prop._get_value(self)

  def __setattr__(self, name, value):
    if (name.startswith('_') or
        isinstance(getattr(self.__class__, name, None),
                   (Property, property))):
      return super(Expando, self).__setattr__(name, value)
    # TODO: Refactor this to share code with _fake_property().
    self._clone_properties()
    if isinstance(value, Model):
      prop = StructuredProperty(Model, name)
    elif isinstance(value, dict):
      prop = StructuredProperty(Expando, name)
    else:
      # TODO: What if it's a list of Model instances?
      prop = GenericProperty(
          name, repeated=isinstance(value, list),
          indexed=self._default_indexed,
          write_empty_list=self._write_empty_list_for_dynamic_properties)
    prop._code_name = name
    self._properties[name] = prop
    prop._set_value(self, value)

  def __delattr__(self, name):
    if (name.startswith('_') or
        isinstance(getattr(self.__class__, name, None),
                   (Property, property))):
      return super(Expando, self).__delattr__(name)
    prop = self._properties.get(name)
    if not isinstance(prop, Property):
      raise TypeError(
          'Model properties must be Property instances; not %r' % prop)
    prop._delete_value(self)
    if prop in self.__class__._properties:
      raise RuntimeError('Property %s still in the list of properties for the '
                         'base class.' % name)
    del self._properties[name]
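The Expando machinery above hinges on `__setattr__`/`__getattr__` routing non-underscore attribute access through a per-instance `_properties` dict, creating a Property on the fly for each new name. A stripped-down Python 3 sketch of that routing (plain value storage, no real Property objects; `TinyExpando` is illustrative, not part of ndb):

```python
class TinyExpando(object):
  def __init__(self, **kwds):
    # Bypass our own __setattr__ hook for the bookkeeping dict itself.
    object.__setattr__(self, '_properties', {})
    for name, value in kwds.items():
      setattr(self, name, value)

  def __setattr__(self, name, value):
    if name.startswith('_'):
      return object.__setattr__(self, name, value)
    # Dynamic property: record it instead of setting a normal attribute.
    self._properties[name] = value

  def __getattr__(self, name):
    # Only called when normal lookup fails, i.e. for dynamic names.
    try:
      return self._properties[name]
    except KeyError:
      raise AttributeError(name)

  def to_dict(self):
    return dict(self._properties)


e = TinyExpando(foo=1)
e.bar = 'baz'      # Creates a new "property" on the fly.
print(e.to_dict())  # {'foo': 1, 'bar': 'baz'}
```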

@utils.positional(1)
def transaction(callback, **ctx_options):
  """Run a callback in a transaction.

  Args:
    callback: A function or tasklet to be called.
    **ctx_options: Transaction options.

  Useful options include:
    retries=N: Retry up to N times (i.e. try up to N+1 times)
    propagation=<flag>: Determines how an existing transaction should be
      propagated, where <flag> can be one of the following:
      TransactionOptions.NESTED: Start a nested transaction (this is the
        default; but actual nested transactions are not yet implemented,
        so effectively you can only use this outside an existing transaction).
      TransactionOptions.MANDATORY: A transaction must already be in progress.
      TransactionOptions.ALLOWED: If a transaction is in progress, join it.
      TransactionOptions.INDEPENDENT: Always start a new parallel transaction.
    xg=True: On the High Replication Datastore, enable cross-group
      transactions, i.e. allow writing to up to 5 entity groups.

  WARNING: Using anything other than NESTED for the propagation flag
  can have strange consequences.  When using ALLOWED or MANDATORY, if
  an exception is raised, the transaction is likely not safe to
  commit.  When using INDEPENDENT it is not generally safe to return
  values read to the caller (as they were not read in the caller's
  transaction).

  Returns:
    Whatever callback() returns.

  Raises:
    Whatever callback() raises; datastore_errors.TransactionFailedError
    if the transaction failed.

  Note:
    To pass arguments to a callback function, use a lambda, e.g.
      def my_callback(key, inc):
        ...
      transaction(lambda: my_callback(Key(...), 1))
  """
  fut = transaction_async(callback, **ctx_options)
  return fut.get_result()
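As the docstring's note says, `transaction()` invokes its callback with no arguments, so arguments are bound with a lambda (or, equivalently, `functools.partial`). A hedged, standalone illustration of the pattern; `run_in_txn` is a stub standing in for the real `transaction()`, which requires the App Engine runtime:

```python
import functools


def run_in_txn(callback):
  # Stand-in for transaction(): just invokes the zero-argument callback.
  # The real function would retry it inside a datastore transaction.
  return callback()


def increment(counter, amount):
  counter['value'] += amount
  return counter['value']


counter = {'value': 41}
result = run_in_txn(lambda: increment(counter, 1))              # lambda form
result2 = run_in_txn(functools.partial(increment, counter, 1))  # partial form
print(result, result2)  # 42 43
```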

@utils.positional(1)
def transaction_async(callback, **ctx_options):
  """Run a callback in a transaction.

  This is the asynchronous version of transaction().
  """
  from . import tasklets
  return tasklets.get_context().transaction(callback, **ctx_options)

def in_transaction():
  """Return whether a transaction is currently active."""
  from . import tasklets
  return tasklets.get_context().in_transaction()

@utils.decorator
def transactional(func, args, kwds, **options):
  """Decorator to make a function automatically run in a transaction.

  Args:
    **ctx_options: Transaction options (see transaction(), but propagation
      defaults to TransactionOptions.ALLOWED).

  This supports two forms:

  (1) Vanilla:
    @transactional
    def callback(arg):
      ...

  (2) With options:
    @transactional(retries=1)
    def callback(arg):
      ...
  """
  return transactional_async.wrapped_decorator(
      func, args, kwds, **options).get_result()


@utils.decorator
def transactional_async(func, args, kwds, **options):
  """The async version of @ndb.transactional."""
  options.setdefault('propagation', datastore_rpc.TransactionOptions.ALLOWED)
  if args or kwds:
    return transaction_async(lambda: func(*args, **kwds), **options)
  return transaction_async(func, **options)


@utils.decorator
def transactional_tasklet(func, args, kwds, **options):
  """The async version of @ndb.transactional.

  Will return the result of the wrapped function as a Future.
  """
  from . import tasklets
  func = tasklets.tasklet(func)
  return transactional_async.wrapped_decorator(func, args, kwds, **options)


@utils.decorator
def non_transactional(func, args, kwds, allow_existing=True):
  """A decorator that ensures a function is run outside a transaction.

  If there is an existing transaction (and allow_existing=True), the
  existing transaction is paused while the function is executed.

  Args:
    allow_existing: If false, throw an exception if called from within
      a transaction.  If true, temporarily re-establish the previous
      non-transactional context.  Defaults to True.

  This supports two forms, similar to transactional().

  Returns:
    A wrapper for the decorated function that ensures it runs outside a
    transaction.
  """
  from . import tasklets
  ctx = tasklets.get_context()
  if not ctx.in_transaction():
    return func(*args, **kwds)
  if not allow_existing:
    raise datastore_errors.BadRequestError(
        '%s cannot be called within a transaction.' % func.__name__)
  save_ctx = ctx
  while ctx.in_transaction():
    ctx = ctx._parent_context
    if ctx is None:
      raise datastore_errors.BadRequestError(
          'Context without non-transactional ancestor')
  save_ds_conn = datastore._GetConnection()
  try:
    if hasattr(save_ctx, '_old_ds_conn'):
      datastore._SetConnection(save_ctx._old_ds_conn)
    tasklets.set_context(ctx)
    return func(*args, **kwds)
  finally:
    tasklets.set_context(save_ctx)
    datastore._SetConnection(save_ds_conn)

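The decorators above all accept both the vanilla form (`@transactional`) and the called-with-options form (`@transactional(retries=1)`); ndb's `utils.decorator` handles that dispatch. A self-contained Python 3 sketch of the same dual-form trick (`dual_form` and `logged` are hypothetical names for illustration, not ndb's `utils.decorator`):

```python
import functools


def dual_form(wrapper):
  """Make wrapper(func, args, kwds, **options) usable as @deco or @deco(...)."""
  def outer(*deco_args, **options):
    if deco_args:
      # Vanilla form: the decorated function arrives positionally.
      func = deco_args[0]
      @functools.wraps(func)
      def inner(*args, **kwds):
        return wrapper(func, args, kwds)
      return inner

    def with_options(func):
      # Options form: options arrive first, then the function.
      @functools.wraps(func)
      def inner(*args, **kwds):
        return wrapper(func, args, kwds, **options)
      return inner
    return with_options
  return outer


@dual_form
def logged(func, args, kwds, prefix='call'):
  # The "wrapper": receives the target plus its call and decorator options.
  return '%s:%r' % (prefix, func(*args, **kwds))


@logged                  # Vanilla form.
def double(x):
  return 2 * x


@logged(prefix='txn')    # With options.
def triple(x):
  return 3 * x


print(double(2))  # call:4
print(triple(2))  # txn:6
```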
def get_multi_async(keys, **ctx_options):
  """Fetches a sequence of keys.

  Args:
    keys: A sequence of keys.
    **ctx_options: Context options.

  Returns:
    A list of futures.
  """
  return [key.get_async(**ctx_options) for key in keys]

def get_multi(keys, **ctx_options):
  """Fetches a sequence of keys.

  Args:
    keys: A sequence of keys.
    **ctx_options: Context options.

  Returns:
    A list whose items are either a Model instance or None if the key wasn't
    found.
  """
  return [future.get_result()
          for future in get_multi_async(keys, **ctx_options)]
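The shape of `get_multi()` (and of `delete_multi()` below) is "fan out all the async calls, then collect the results": every RPC is issued before the first blocking wait. The same shape can be sketched with `concurrent.futures` standing in for ndb futures; `fetch_one` and the in-memory `DATASTORE` are hypothetical stand-ins for a single-key datastore lookup:

```python
from concurrent.futures import ThreadPoolExecutor

DATASTORE = {'k1': 'alpha', 'k2': 'beta'}


def fetch_one(key):
  return DATASTORE.get(key)  # None when the key is missing


def get_multi_async(executor, keys):
  # Start all lookups before waiting on any, like ndb's version.
  return [executor.submit(fetch_one, key) for key in keys]


def get_multi(executor, keys):
  # Block on the already-running futures, preserving key order.
  return [future.result() for future in get_multi_async(executor, keys)]


with ThreadPoolExecutor(max_workers=4) as pool:
  results = get_multi(pool, ['k1', 'missing', 'k2'])

print(results)  # ['alpha', None, 'beta']
```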

def delete_multi_async(keys, **ctx_options):
  """Deletes a sequence of keys.

  Args:
    keys: A sequence of keys.
    **ctx_options: Context options.

  Returns:
    A list of futures.
  """
  return [key.delete_async(**ctx_options) for key in keys]

def delete_multi(keys, **ctx_options):
  """Deletes a sequence of keys.

  Args:
    keys: A sequence of keys.
    **ctx_options: Context options.

  Returns:
    A list whose items are all None, one per deleted key.
  """
  return [future.get_result()
          for future in delete_multi_async(keys, **ctx_options)]