def __new__(cls, obj, recursive=False):
    self = object.__new__(cls)
    obj = aq_base(obj)
    connection = obj._p_jar
    ObjectReader.__init__(self, connection, connection._cache,
                          connection._db.classFactory)
    ObjectWriter.__init__(self, obj)
    migrated_oid_set = set()
    oid_set = {obj._p_oid}
    while oid_set:
        oid = oid_set.pop()
        obj = ObjectReader.load_oid(self, oid)
        obj._p_activate()
        klass = obj.__class__
        self.lazy = None
        if not recursive:
            # Folders with the stock OFS _setOb keep their subobjects as
            # ghosts; BTrees (except Length) get a lazy loader instead.
            _setOb = getattr(klass, '_setOb', None)
            if _setOb:
                if isinstance(_setOb, WorkflowMethod):
                    _setOb = _setOb._m
                import six
                if six.get_unbound_function(_setOb) is \
                        six.get_unbound_function(OFS_Folder._setOb):
                    self.lazy = Ghost
            elif klass.__module__[:7] == 'BTrees.' \
                    and klass.__name__ != 'Length':
                self.lazy = LazyBTree()
        self.oid_dict = {}
        self.oid_set = set()
        p, serial = self._conn._storage.load(oid, '')
        unpickler = self._get_unpickler(p)
        def find_global(*args):
            # Migrate unless the class is unchanged or an old-style BTree.
            self.do_migrate = args != (klass.__module__, klass.__name__) and \
                not isOldBTree('%s.%s' % args)
            unpickler.find_global = self._get_class
            return self._get_class(*args)
        unpickler.find_global = find_global
        unpickler.load()  # class
        state = unpickler.load()
        if isinstance(self.lazy, LazyPersistent):
            self.oid_set.update(self.lazy.getOidList(state))
        migrated_oid_set.add(oid)
        oid_set |= self.oid_set - migrated_oid_set
        self.oid_set = None
        if self.do_migrate:
            log.debug('PickleUpdater: migrate %r (%r)', obj, klass)
            self.setGhostState(obj, self.serialize(obj))
            obj._p_changed = 1
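The `migrated_oid_set` / `oid_set` pair in `__new__` above is a standard worklist traversal: pop an oid, migrate it, then queue any newly discovered references that have not been migrated yet, so each oid is visited exactly once even across reference cycles. A minimal, self-contained sketch of that pattern (the names `traverse` and `get_references` are invented for illustration; they are not part of this code):

```python
def traverse(start, get_references):
    """Visit every node reachable from start, processing each exactly once."""
    migrated = set()       # corresponds to migrated_oid_set
    pending = {start}      # corresponds to oid_set
    order = []
    while pending:
        oid = pending.pop()
        order.append(oid)  # stand-in for the per-oid migration work
        migrated.add(oid)
        # Queue newly discovered references, skipping already-migrated ones,
        # just like: oid_set |= self.oid_set - migrated_oid_set
        pending |= set(get_references(oid)) - migrated
    return order

# A reference graph with a cycle: 1 -> 2 -> 3 -> 1
refs = {1: [2], 2: [3], 3: [1]}
visited = traverse(1, lambda oid: refs.get(oid, []))
```

Subtracting `migrated` before the union is what terminates the loop on cyclic graphs: a popped oid can be rediscovered later but is never requeued.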
def NewObjectReader_load_multi_oid(self, database_name, oid):
    conn = self._conn.get_connection(database_name)
    # TODO: make Connection's _cache attribute public
    reader = ObjectReader(conn, conn._cache, classfactory.ClassFactory)
    return reader.load_oid(oid)
def load_oid(self, oid):
    if self.oid_set is not None:
        if self.lazy:
            return self.lazy(oid)
        self.oid_set.add(oid)
    return ObjectReader.load_oid(self, oid)
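This `load_oid` override does two jobs while a record is being unpickled: with a `lazy` factory set it hands back a placeholder instead of loading the real object, and otherwise it records the oid for the outer traversal before delegating to the real loader. A runnable sketch under assumed names (`Ghost`, `RecordingReader`, and `real_load` are hypothetical stand-ins, not the actual ZODB classes):

```python
class Ghost:
    """Placeholder standing in for a not-yet-loaded persistent object."""
    def __init__(self, oid):
        self.oid = oid

class RecordingReader:
    def __init__(self, real_load, lazy=None):
        self.real_load = real_load  # the underlying loader being wrapped
        self.lazy = lazy            # optional placeholder factory, e.g. Ghost
        self.oid_set = set()        # oids seen while unpickling one record

    def load_oid(self, oid):
        if self.oid_set is not None:
            if self.lazy:
                return self.lazy(oid)  # defer: return a placeholder
            self.oid_set.add(oid)      # eager: remember for later traversal
        return self.real_load(oid)

# Lazy mode: references come back as placeholders, nothing is loaded.
lazy_reader = RecordingReader(real_load=lambda oid: ('obj', oid), lazy=Ghost)
ghost = lazy_reader.load_oid(42)

# Eager mode: the oid is recorded and the real loader is called.
eager_reader = RecordingReader(real_load=lambda oid: ('obj', oid))
loaded = eager_reader.load_oid(7)
```

In the original, `oid_set` is set to `None` once a record has been processed, which is why the guard checks `is not None` rather than truthiness: an empty set must still record oids.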