class Entity(db.Model, UuidModel, SoftDeleteModel):
    THING = 'Thing'

    name = db.Column(db.Unicode)
    schema = db.Column(db.String(255), index=True)
    foreign_ids = db.Column(ARRAY(db.Unicode()))
    data = db.Column('data', JSONB)

    collection_id = db.Column(db.Integer, db.ForeignKey('collection.id'), index=True)  # noqa
    collection = db.relationship(Collection, backref=db.backref('entities', lazy='dynamic'))  # noqa

    @property
    def model(self):
        return model.get(self.schema)

    @property
    def names(self):
        names = set([self.name])
        for name, prop in self.model.properties.items():
            if prop.type_name not in ['name']:
                continue
            names.update(ensure_list(self.data.get(name)))
        return names

    def delete_matches(self):
        pq = db.session.query(Match)
        pq = pq.filter(or_(
            Match.entity_id == self.id,
            Match.match_id == self.id))
        pq.delete(synchronize_session=False)
        db.session.refresh(self)

    def delete(self, deleted_at=None):
        self.delete_matches()
        deleted_at = deleted_at or datetime.utcnow()
        for alert in self.alerts:
            alert.delete(deleted_at=deleted_at)
        super(Entity, self).delete(deleted_at=deleted_at)

    @classmethod
    def delete_by_collection(cls, collection_id, deleted_at=None):
        from aleph.model import Alert
        deleted_at = deleted_at or datetime.utcnow()
        entities = db.session.query(cls.id)
        entities = entities.filter(cls.collection_id == collection_id)
        entities = entities.subquery()

        pq = db.session.query(Alert)
        pq = pq.filter(Alert.entity_id.in_(entities))
        pq.update({Alert.deleted_at: deleted_at},
                  synchronize_session=False)

        pq = db.session.query(Match)
        pq = pq.filter(Match.entity_id.in_(entities))
        pq.delete(synchronize_session=False)

        pq = db.session.query(Match)
        pq = pq.filter(Match.match_id.in_(entities))
        pq.delete(synchronize_session=False)

        pq = db.session.query(cls)
        pq = pq.filter(cls.collection_id == collection_id)
        pq = pq.filter(cls.deleted_at == None)  # noqa
        pq.update({cls.deleted_at: deleted_at},
                  synchronize_session=False)

    def merge(self, other):
        if self.id == other.id:
            raise ValueError("Cannot merge an entity with itself.")
        if self.collection_id != other.collection_id:
            raise ValueError("Cannot merge entities from different collections.")  # noqa
        self.schema = model.precise_schema(self.schema, other.schema)
        self.foreign_ids = string_set(self.foreign_ids, other.foreign_ids)
        self.created_at = min((self.created_at, other.created_at))
        self.updated_at = datetime.utcnow()
        data = merge_data(self.data, other.data)
        if self.name != other.name:
            data = merge_data(data, {'alias': [other.name]})
        self.data = data

        # update alerts
        from aleph.model.alert import Alert
        q = db.session.query(Alert).filter(Alert.entity_id == other.id)
        q.update({Alert.entity_id: self.id})

        # delete the source entity
        other.delete()
        db.session.add(self)
        db.session.commit()
        db.session.refresh(other)

    def update(self, entity):
        self.schema = entity.get('schema')
        data = entity.get('properties')
        if is_mapping(data):
            data['name'] = [entity.get('name')]
            self.data = self.model.validate(data)
        elif self.data is None:
            self.data = {}
        self.data.pop('name', None)
        self.name = entity.get('name')
        # TODO: should this be mutable?
        # self.foreign_ids = string_set(entity.get('foreign_ids'))
        self.updated_at = datetime.utcnow()
        db.session.add(self)

    @classmethod
    def create(cls, data, collection):
        foreign_ids = string_set(data.get('foreign_ids'))
        ent = cls.by_foreign_ids(foreign_ids, collection.id, deleted=True)
        if ent is None:
            ent = cls()
            ent.id = make_textid()
            ent.collection = collection
            ent.foreign_ids = foreign_ids
        ent.update(data)
        ent.deleted_at = None
        return ent

    @classmethod
    def by_foreign_ids(cls, foreign_ids, collection_id, deleted=False):
        if not len(foreign_ids):
            return None
        q = cls.all(deleted=deleted)
        q = q.filter(Entity.collection_id == collection_id)
        foreign_id = func.cast(foreign_ids, ARRAY(db.Unicode()))
        q = q.filter(cls.foreign_ids.contains(foreign_id))
        q = q.order_by(Entity.deleted_at.desc().nullsfirst())
        return q.first()

    @classmethod
    def all_ids(cls, deleted=False, authz=None):
        q = super(Entity, cls).all_ids(deleted=deleted)
        if authz is not None and not authz.is_admin:
            q = q.join(Permission,
                       cls.collection_id == Permission.collection_id)
            q = q.filter(Permission.deleted_at == None)  # noqa
            q = q.filter(Permission.read == True)  # noqa
            q = q.filter(Permission.role_id.in_(authz.roles))
        return q

    @classmethod
    def latest(cls):
        q = db.session.query(func.max(cls.updated_at))
        q = q.filter(cls.deleted_at == None)  # noqa
        return q.scalar()

    def __repr__(self):
        return '<Entity(%r, %r)>' % (self.id, self.name)
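`string_set` is a helper imported from elsewhere in the codebase; its exact implementation is not shown here. As a hypothetical sketch of the contract `merge()` and `create()` appear to rely on (an order-preserving, deduplicating union of optional string lists), something like the following would fit:

```python
# Hypothetical sketch only -- the real string_set lives elsewhere in aleph.
def string_set(*lists):
    """Union of several optional string lists, deduplicated,
    preserving first-seen order. None inputs are treated as empty."""
    seen = []
    for values in lists:
        for value in values or []:
            if value not in seen:
                seen.append(value)
    return seen
```

Under this reading, `merge()` must pass both `self.foreign_ids` and `other.foreign_ids` so the surviving entity keeps the identifiers of both records.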
from sqlalchemy import (Boolean, Column, DefaultClause, Float, Index, Integer,
                        String, Table, Text)
from sqlalchemy.schema import PrimaryKeyConstraint
from sqlalchemy.dialects.postgresql import ARRAY, DOUBLE_PRECISION

from credovi.schema import metadata, schema
from credovi.util.sqlalchemy import Vector3D

# RAW CHAINS
raw_chains = Table('raw_chains', metadata,
                   Column('pdb', String(4), nullable=False),
                   Column('assembly_serial', Integer, nullable=False, autoincrement=False),
                   Column('entity_serial', Integer, nullable=False, autoincrement=False),
                   Column('pdb_chain_id', String(1), nullable=False),
                   Column('pdb_chain_asu_id', String(1), nullable=False),
                   Column('chain_type', String(50)),
                   Column('rotation', ARRAY(DOUBLE_PRECISION)),
                   Column('translation', ARRAY(DOUBLE_PRECISION)),
                   Column('is_at_identity', Boolean(create_constraint=False),
                          DefaultClause('false'), nullable=False),
                   schema=schema, prefixes=['unlogged'])

PrimaryKeyConstraint(raw_chains.c.pdb, raw_chains.c.assembly_serial,
                     raw_chains.c.entity_serial,
                     deferrable=True, initially='deferred')

# RAW LIGANDS
raw_ligands = Table('raw_ligands', metadata,
                    Column('pdb', String(4), nullable=False, primary_key=True),
                    Column('assembly_serial', Integer, nullable=False, primary_key=True, autoincrement=False),
                    Column('entity_serial', Integer, nullable=False, autoincrement=False, primary_key=True),
                    Column('pdb_chain_id', String(1), nullable=False),
                    Column('res_num', Integer),
                    Column('ligand_name', String(64), nullable=False),
class DebcheckIssue(Base):
    '''
    Data for a package migration excuse, as emitted by Britney
    '''

    __tablename__ = 'debcheck_issues'

    uuid = Column(UUID(as_uuid=True), primary_key=True, default=uuid4)
    time = Column(DateTime(), default=datetime.utcnow)  # Time when this excuse was created

    package_type = Column(Enum(PackageType))

    repo_id = Column(Integer, ForeignKey('archive_repositories.id'))
    repo = relationship('ArchiveRepository')

    suite_id = Column(Integer, ForeignKey('archive_suites.id', ondelete='cascade'))
    suite = relationship('ArchiveSuite',
                         backref=backref('debcheck_issues', passive_deletes=True))

    # Architectures this issue affects; may be a wildcard like "any"
    # or a (list of) architecture expressions
    architectures = Column(ARRAY(Text()), default=['any'])

    package_name = Column(String(256))  # Name of the package this issue affects
    package_version = Column(DebVersion())  # Version of the package this issue affects

    _missing_json = Column('missing', JSON)  # information about missing packages
    _conflicts_json = Column('conflicts', JSON)  # information about conflicts

    _missing = None
    _conflicts = None

    @property
    def missing(self):
        if self._missing is not None:
            return self._missing
        if not self._missing_json:
            return []
        jlist = json.loads(self._missing_json)
        schema = PackageIssue()
        self._missing = [schema.load(d) for d in jlist]
        return self._missing

    @missing.setter
    def missing(self, v):
        self._missing = None
        schema = PackageIssue()
        self._missing_json = json.dumps([schema.dump(e) for e in v])

    @property
    def conflicts(self):
        if self._conflicts is not None:
            return self._conflicts
        if not self._conflicts_json:
            return []
        jlist = json.loads(self._conflicts_json)
        schema = PackageConflict()
        self._conflicts = [schema.load(d) for d in jlist]
        return self._conflicts

    @conflicts.setter
    def conflicts(self, v):
        self._conflicts = None
        schema = PackageConflict()
        self._conflicts_json = json.dumps([schema.dump(e) for e in v])
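The `missing`/`conflicts` properties above both follow the same pattern: deserialize a JSON column lazily on first access, cache the result, and invalidate the cache whenever the setter rewrites the raw column. Stripped of the ORM and the marshalling schemas, the pattern can be sketched in plain Python (the class name here is invented for illustration):

```python
import json


class JsonBackedField:
    """Minimal sketch of the cache-behind-a-JSON-column pattern."""

    def __init__(self, raw=None):
        self._raw = raw      # stands in for the JSON database column
        self._cache = None   # deserialized value, filled lazily

    @property
    def items(self):
        # Serve from cache if we already deserialized once.
        if self._cache is not None:
            return self._cache
        # Empty/NULL column means "no data".
        if not self._raw:
            return []
        self._cache = json.loads(self._raw)
        return self._cache

    @items.setter
    def items(self, values):
        # Drop the stale cache; the next read re-parses the new JSON.
        self._cache = None
        self._raw = json.dumps(values)
```

The cache reset in the setter mirrors `self._missing = None` / `self._conflicts = None` in the model, ensuring a subsequent read reflects the freshly written JSON.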
def create_sql_filter(self, data_list):
    return RegistrationData.data.has_any(
        db.func.cast(data_list, ARRAY(db.String)))
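`has_any` compiles to the Postgres JSONB `?|` operator: the row matches if its value set shares at least one key with the supplied array. A pure-Python analogue (name invented here for illustration) makes the semantics explicit:

```python
def matches_any(row_values, data_list):
    """Pure-Python analogue of the JSONB ?| (has_any) operator:
    True if the row's values share at least one element with data_list."""
    return bool(set(row_values) & set(data_list))
```

The `cast(..., ARRAY(db.String))` in the filter exists because `?|` requires a text-array operand on the right-hand side.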
class Task(Base):
    """Class to store a task."""

    __tablename__ = 'tasks'
    __table_args__ = (
        UniqueConstraint('contest_id', 'num'),
        UniqueConstraint('contest_id', 'name'),
        ForeignKeyConstraint(
            ("id", "active_dataset_id"),
            ("datasets.task_id", "datasets.id"),
            onupdate="SET NULL", ondelete="SET NULL",
            # Use an ALTER query to set this foreign key after
            # both tables have been CREATEd, to avoid circular
            # dependencies.
            use_alter=True,
            name="fk_active_dataset_id"),
        CheckConstraint("token_gen_initial <= token_gen_max"),
    )

    # Auto increment primary key.
    id = Column(
        Integer,
        primary_key=True,
        # Needed to enable autoincrement on integer primary keys that
        # are referenced by a foreign key defined on this table.
        autoincrement='ignore_fk')

    # Number of the task for sorting.
    num = Column(Integer, nullable=True)

    # Contest (id and object) owning the task.
    contest_id = Column(
        Integer,
        ForeignKey(Contest.id,
                   onupdate="CASCADE", ondelete="CASCADE"),
        nullable=True,
        index=True)
    contest = relationship(Contest, back_populates="tasks")

    # Short name and long human readable title of the task.
    name = Column(Codename, nullable=False, unique=True)
    title = Column(Unicode, nullable=False)

    # The names of the files that the contestant needs to submit (with
    # language-specific extensions replaced by "%l").
    submission_format = Column(FilenameSchemaArray, nullable=False, default=[])

    # The language codes of the statements that will be highlighted to
    # all users for this task.
    primary_statements = Column(ARRAY(String), nullable=False, default=[])

    # The parameters that control task-tokens follow. Note that their
    # effect during the contest depends on the interaction with the
    # parameters that control contest-tokens, defined on the Contest.

    # The "kind" of token rules that will be active during the contest.
    # - disabled: The user will never be able to use any token.
    # - finite: The user has a finite amount of tokens and can choose
    #   when to use them, subject to some limitations. Tokens may not
    #   be all available at start, but given periodically during the
    #   contest instead.
    # - infinite: The user will always be able to use a token.
    token_mode = Column(
        Enum(TOKEN_MODE_DISABLED, TOKEN_MODE_FINITE, TOKEN_MODE_INFINITE,
             name="token_mode"),
        nullable=False,
        default=TOKEN_MODE_DISABLED)

    # The maximum number of tokens a contestant is allowed to use
    # during the whole contest (on this task).
    token_max_number = Column(
        Integer, CheckConstraint("token_max_number > 0"), nullable=True)

    # The minimum interval between two successive uses of tokens for
    # the same user (on this task).
    token_min_interval = Column(
        Interval,
        CheckConstraint("token_min_interval >= '0 seconds'"),
        nullable=False,
        default=timedelta())

    # The parameters that control generation (if mode is "finite"):
    # the user starts with "initial" tokens and receives "number" more
    # every "interval", but their total number is capped to "max".
    token_gen_initial = Column(
        Integer, CheckConstraint("token_gen_initial >= 0"),
        nullable=False, default=2)
    token_gen_number = Column(
        Integer, CheckConstraint("token_gen_number >= 0"),
        nullable=False, default=2)
    token_gen_interval = Column(
        Interval,
        CheckConstraint("token_gen_interval > '0 seconds'"),
        nullable=False,
        default=timedelta(minutes=30))
    token_gen_max = Column(
        Integer, CheckConstraint("token_gen_max > 0"), nullable=True)

    # Maximum number of submissions or user_tests allowed for each user
    # on this task during the whole contest, or None to not enforce
    # this limitation.
    max_submission_number = Column(
        Integer, CheckConstraint("max_submission_number > 0"), nullable=True)
    max_user_test_number = Column(
        Integer, CheckConstraint("max_user_test_number > 0"), nullable=True)

    # Minimum interval between two submissions or user_tests for this
    # task, or None to not enforce this limitation.
    min_submission_interval = Column(
        Interval,
        CheckConstraint("min_submission_interval > '0 seconds'"),
        nullable=True)
    min_user_test_interval = Column(
        Interval,
        CheckConstraint("min_user_test_interval > '0 seconds'"),
        nullable=True)

    # What information users can see about the evaluations of their
    # submissions. Offering full information might help some users to
    # reverse engineer task data.
    feedback_level = Column(
        Enum(FEEDBACK_LEVEL_FULL, FEEDBACK_LEVEL_RESTRICTED,
             name="feedback_level"),
        nullable=False,
        default=FEEDBACK_LEVEL_RESTRICTED)

    # The scores for this task will be rounded to this number of
    # decimal places.
    score_precision = Column(
        Integer, CheckConstraint("score_precision >= 0"),
        nullable=False, default=0)

    # Score mode for the task.
    score_mode = Column(
        Enum(SCORE_MODE_MAX_TOKENED_LAST, SCORE_MODE_MAX,
             SCORE_MODE_MAX_SUBTASK, name="score_mode"),
        nullable=False,
        default=SCORE_MODE_MAX_TOKENED_LAST)

    # Active Dataset (id and object) currently being used for scoring.
    # The ForeignKeyConstraint for this column is set at table-level.
    active_dataset_id = Column(Integer, nullable=True)
    active_dataset = relationship(
        'Dataset',
        foreign_keys=[active_dataset_id],
        # Use an UPDATE query *after* an INSERT query (and *before* a
        # DELETE query) to set (and unset) the column associated to
        # this relationship.
        post_update=True)

    # These one-to-many relationships are the reversed directions of
    # the ones defined in the "child" classes using foreign keys.
    statements = relationship(
        "Statement",
        collection_class=attribute_mapped_collection("language"),
        cascade="all, delete-orphan",
        passive_deletes=True,
        back_populates="task")

    attachments = relationship(
        "Attachment",
        collection_class=attribute_mapped_collection("filename"),
        cascade="all, delete-orphan",
        passive_deletes=True,
        back_populates="task")

    datasets = relationship(
        "Dataset",
        # Due to active_dataset_id, SQLAlchemy cannot unambiguously
        # figure out by itself which foreign key to use.
        foreign_keys="[Dataset.task_id]",
        cascade="all, delete-orphan",
        passive_deletes=True,
        back_populates="task")

    submissions = relationship(
        "Submission",
        cascade="all, delete-orphan",
        passive_deletes=True,
        back_populates="task")

    user_tests = relationship(
        "UserTest",
        cascade="all, delete-orphan",
        passive_deletes=True,
        back_populates="task")
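The comments above describe the "finite" token rule: start with `token_gen_initial` tokens, gain `token_gen_number` more every `token_gen_interval`, cap the generated total at `token_gen_max`, and subtract the tokens already spent. The accounting logic itself lives elsewhere in the scoring service; as a hedged sketch of what those columns imply (function name and signature invented here), it might look like:

```python
from datetime import timedelta


def tokens_available(elapsed, initial=2, number=2,
                     interval=timedelta(minutes=30), maximum=None, used=0):
    """Sketch of the "finite" token-generation rule: start with
    `initial` tokens, gain `number` every `interval`, cap the
    generated total at `maximum`, then subtract tokens spent."""
    # Whole intervals elapsed since the contest started.
    generated = initial + number * (elapsed // interval)
    if maximum is not None:
        generated = min(generated, maximum)
    return max(generated - used, 0)
```

The defaults mirror the column defaults (`token_gen_initial=2`, `token_gen_number=2`, `token_gen_interval=30min`); the table-level `CheckConstraint("token_gen_initial <= token_gen_max")` guarantees the cap never cuts below the starting allowance.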
class Project(db.Model):
    """ Describes a HOT Mapping Project """

    __tablename__ = "projects"

    # Columns
    id = db.Column(db.Integer, primary_key=True)
    status = db.Column(db.Integer, default=ProjectStatus.DRAFT.value, nullable=False)
    created = db.Column(db.DateTime, default=timestamp, nullable=False)
    priority = db.Column(db.Integer, default=ProjectPriority.MEDIUM.value)
    default_locale = db.Column(
        db.String(10), default="en"
    )  # The locale that is returned if the requested locale is not available
    author_id = db.Column(
        db.BigInteger, db.ForeignKey("users.id", name="fk_users"), nullable=False
    )
    mapper_level = db.Column(
        db.Integer, default=2, nullable=False, index=True
    )  # Mapper level the project is suitable for
    mapping_permission = db.Column(db.Integer, default=MappingPermission.ANY.value)
    validation_permission = db.Column(
        db.Integer, default=ValidationPermission.ANY.value
    )  # Means only users with validator role can validate
    enforce_random_task_selection = db.Column(
        db.Boolean, default=False
    )  # Force users to edit at random to avoid mapping "easy" tasks
    private = db.Column(db.Boolean, default=False)  # Only allowed users can validate
    featured = db.Column(
        db.Boolean, default=False
    )  # Only PMs can set a project as featured
    entities_to_map = db.Column(db.String)
    changeset_comment = db.Column(db.String)
    osmcha_filter_id = db.Column(
        db.String
    )  # Optional custom filter id for filtering on OSMCha
    due_date = db.Column(db.DateTime)
    imagery = db.Column(db.String)
    josm_preset = db.Column(db.String)
    id_presets = db.Column(ARRAY(db.String))
    last_updated = db.Column(db.DateTime, default=timestamp)
    license_id = db.Column(db.Integer, db.ForeignKey("licenses.id", name="fk_licenses"))
    geometry = db.Column(Geometry("MULTIPOLYGON", srid=4326))
    centroid = db.Column(Geometry("POINT", srid=4326))
    country = db.Column(ARRAY(db.String), default=[])
    task_creation_mode = db.Column(
        db.Integer, default=TaskCreationMode.GRID.value, nullable=False
    )
    organisation_id = db.Column(
        db.Integer,
        db.ForeignKey("organisations.id", name="fk_organisations"),
        index=True,
    )

    # Tags
    mapping_types = db.Column(ARRAY(db.Integer), index=True)

    # Editors
    mapping_editors = db.Column(
        ARRAY(db.Integer),
        default=[
            Editors.ID.value,
            Editors.JOSM.value,
            Editors.POTLATCH_2.value,
            Editors.FIELD_PAPERS.value,
            Editors.CUSTOM.value,
        ],
        index=True,
        nullable=False,
    )
    validation_editors = db.Column(
        ARRAY(db.Integer),
        default=[
            Editors.ID.value,
            Editors.JOSM.value,
            Editors.POTLATCH_2.value,
            Editors.FIELD_PAPERS.value,
            Editors.CUSTOM.value,
        ],
        index=True,
        nullable=False,
    )

    # Stats
    total_tasks = db.Column(db.Integer, nullable=False)
    tasks_mapped = db.Column(db.Integer, default=0, nullable=False)
    tasks_validated = db.Column(db.Integer, default=0, nullable=False)
    tasks_bad_imagery = db.Column(db.Integer, default=0, nullable=False)

    # Mapped Objects
    tasks = db.relationship(
        Task, backref="projects", cascade="all, delete, delete-orphan", lazy="dynamic"
    )
    project_info = db.relationship(ProjectInfo, lazy="dynamic", cascade="all")
    project_chat = db.relationship(ProjectChat, lazy="dynamic", cascade="all")
    author = db.relationship(User)
    allowed_users = db.relationship(User, secondary=project_allowed_users)
    priority_areas = db.relationship(
        PriorityArea,
        secondary=project_priority_areas,
        cascade="all, delete-orphan",
        single_parent=True,
    )
    custom_editor = db.relationship(CustomEditor, uselist=False)
    favorited = db.relationship(User, secondary=project_favorites, backref="favorites")
    organisation = db.relationship(Organisation, backref="projects")
    campaign = db.relationship(Campaign, secondary=campaign_projects, backref="projects")
    interests = db.relationship(Interest, secondary=project_interests, backref="projects")

    def create_draft_project(self, draft_project_dto: DraftProjectDTO):
        """ Creates a draft project
        :param draft_project_dto: DTO containing draft project details
        :param aoi: Area of Interest for the project (eg boundary of project)
        """
        self.project_info.append(
            ProjectInfo.create_from_name(draft_project_dto.project_name)
        )
        self.status = ProjectStatus.DRAFT.value
        self.author_id = draft_project_dto.user_id
        self.last_updated = timestamp()

    def set_project_aoi(self, draft_project_dto: DraftProjectDTO):
        """ Sets the AOI for the supplied project """
        aoi_geojson = geojson.loads(json.dumps(draft_project_dto.area_of_interest))
        aoi_geometry = GridService.merge_to_multi_polygon(aoi_geojson, dissolve=True)
        valid_geojson = geojson.dumps(aoi_geometry)
        self.geometry = ST_SetSRID(ST_GeomFromGeoJSON(valid_geojson), 4326)
        self.centroid = ST_Centroid(self.geometry)

    def set_default_changeset_comment(self):
        """ Sets the default changeset comment """
        default_comment = current_app.config["DEFAULT_CHANGESET_COMMENT"]
        self.changeset_comment = (
            f"{default_comment}-{self.id} {self.changeset_comment}"
            if self.changeset_comment is not None
            else f"{default_comment}-{self.id}"
        )
        self.save()

    def set_country_info(self):
        """ Sets the default country based on the project centroid """
        lat, lng = (
            db.session.query(
                cast(ST_Y(Project.centroid), sqlalchemy.String),
                cast(ST_X(Project.centroid), sqlalchemy.String),
            )
            .filter(Project.id == self.id)
            .one()
        )
        url = "https://nominatim.openstreetmap.org/reverse?format=jsonv2&lat={0}&lon={1}".format(
            lat, lng
        )
        country_info = requests.get(url)
        country_info_json = country_info.content.decode("utf8").replace("'", '"')
        # Parse the JSON response into a Python dict
        data = json.loads(country_info_json)
        if data["address"].get("country") is not None:
            self.country = [data["address"]["country"]]
        else:
            self.country = [data["address"]["county"]]
        self.save()

    def create(self):
        """ Creates and saves the current model to the DB """
        db.session.add(self)
        db.session.commit()

    def save(self):
        """ Save changes to db """
        db.session.commit()

    @staticmethod
    def clone(project_id: int, author_id: int):
        """ Clone project """
        cloned_project = Project.get(project_id)

        # Remove the clone from the session so we can reinsert it as a new object
        db.session.expunge(cloned_project)
        make_transient(cloned_project)

        # Re-initialise counters and meta-data
        cloned_project.total_tasks = 0
        cloned_project.tasks_mapped = 0
        cloned_project.tasks_validated = 0
        cloned_project.tasks_bad_imagery = 0
        cloned_project.last_updated = timestamp()
        cloned_project.created = timestamp()
        cloned_project.author_id = author_id
        cloned_project.status = ProjectStatus.DRAFT.value
        cloned_project.id = None  # Reset ID so we get a new ID when inserted
        cloned_project.geometry = None
        cloned_project.centroid = None

        db.session.add(cloned_project)
        db.session.commit()

        # Now add the project info; this has to be a two-stage commit
        # because we need to know the new project id
        original_project = Project.get(project_id)

        for info in original_project.project_info:
            db.session.expunge(info)
            # Must remove the object from the session or it will be
            # updated rather than inserted
            make_transient(info)
            info.id = None
            info.project_id_str = str(cloned_project.id)
            cloned_project.project_info.append(info)

        # Now that we know the new project id, add any allowed users
        for user in original_project.allowed_users:
            cloned_project.allowed_users.append(user)

        # Add other project metadata
        cloned_project.priority = original_project.priority
        cloned_project.default_locale = original_project.default_locale
        cloned_project.mapper_level = original_project.mapper_level
        cloned_project.mapping_permission = original_project.mapping_permission
        cloned_project.validation_permission = original_project.validation_permission
        cloned_project.enforce_random_task_selection = (
            original_project.enforce_random_task_selection
        )
        cloned_project.private = original_project.private
        cloned_project.entities_to_map = original_project.entities_to_map
        cloned_project.due_date = original_project.due_date
        cloned_project.imagery = original_project.imagery
        cloned_project.josm_preset = original_project.josm_preset
        cloned_project.license_id = original_project.license_id
        cloned_project.mapping_types = original_project.mapping_types

        # Try to remove the changeset comment referencing the old project. This
        # assumes the default changeset comment has not changed between the old
        # project and the clone, so it is done on a best-effort basis.
        default_comment = current_app.config["DEFAULT_CHANGESET_COMMENT"]
        changeset_comments = []
        if original_project.changeset_comment is not None:
            changeset_comments = original_project.changeset_comment.split(" ")
        if f"{default_comment}-{original_project.id}" in changeset_comments:
            changeset_comments.remove(f"{default_comment}-{original_project.id}")
        cloned_project.changeset_comment = " ".join(changeset_comments)

        db.session.add(cloned_project)
        db.session.commit()

        return cloned_project

    @staticmethod
    def get(project_id: int):
        """ Gets specified project
        :param project_id: project ID in scope
        :return: Project if found otherwise None
        """
        return Project.query.get(project_id)

    def update(self, project_dto: ProjectDTO):
        """ Updates project from DTO """
        self.status = ProjectStatus[project_dto.project_status].value
        self.priority = ProjectPriority[project_dto.project_priority].value
        self.default_locale = project_dto.default_locale
        self.enforce_random_task_selection = project_dto.enforce_random_task_selection
        self.private = project_dto.private
        self.mapper_level = MappingLevel[project_dto.mapper_level.upper()].value
        self.entities_to_map = project_dto.entities_to_map
        self.changeset_comment = project_dto.changeset_comment
        self.due_date = project_dto.due_date
        self.imagery = project_dto.imagery
        self.josm_preset = project_dto.josm_preset
        self.id_presets = project_dto.id_presets
        self.last_updated = timestamp()
        self.license_id = project_dto.license_id

        if project_dto.osmcha_filter_id:
            # Support simple extraction of the OSMCha filter id from an OSMCha URL
            match = re.search(r"aoi=([\w-]+)", project_dto.osmcha_filter_id)
            self.osmcha_filter_id = (
                match.group(1) if match else project_dto.osmcha_filter_id
            )
        else:
            self.osmcha_filter_id = None

        if project_dto.organisation:
            org = Organisation.get(project_dto.organisation)
            if org is None:
                raise NotFound("Organisation does not exist")
            self.organisation = org

        # Cast MappingType strings to an int array
        type_array = []
        for mapping_type in project_dto.mapping_types:
            type_array.append(MappingTypes[mapping_type].value)
        self.mapping_types = type_array

        # Cast Editor strings to int arrays
        mapping_editors_array = []
        for mapping_editor in project_dto.mapping_editors:
            mapping_editors_array.append(Editors[mapping_editor].value)
        self.mapping_editors = mapping_editors_array

        validation_editors_array = []
        for validation_editor in project_dto.validation_editors:
            validation_editors_array.append(Editors[validation_editor].value)
        self.validation_editors = validation_editors_array

        self.country = project_dto.country_tag

        # Add the list of allowed users, meaning the project can only be
        # mapped by users in this list
        if hasattr(project_dto, "allowed_users"):
            self.allowed_users = []  # Clear existing relationships then re-insert
            for user in project_dto.allowed_users:
                self.allowed_users.append(user)

        # Update teams and projects relationship
        self.teams = []
        if hasattr(project_dto, "project_teams") and project_dto.project_teams:
            for team_dto in project_dto.project_teams:
                team = Team.get(team_dto.team_id)
                if team is None:
                    raise NotFound("Team not found")
                role = TeamRoles[team_dto.role].value
                ProjectTeams(project=self, team=team, role=role)

        # Set Project Info for all returned locales
        for dto in project_dto.project_info_locales:
            project_info = self.project_info.filter_by(locale=dto.locale).one_or_none()
            if project_info is None:
                # Can't find the info, so it must be a new locale
                new_info = ProjectInfo.create_from_dto(dto)
                self.project_info.append(new_info)
            else:
                project_info.update_from_dto(dto)

        self.priority_areas = []  # Always clear Priority Areas prior to updating
        if project_dto.priority_areas:
            for priority_area in project_dto.priority_areas:
                pa = PriorityArea.from_dict(priority_area)
                self.priority_areas.append(pa)

        if project_dto.custom_editor:
            if not self.custom_editor:
                new_editor = CustomEditor.create_from_dto(
                    self.id, project_dto.custom_editor
                )
                self.custom_editor = new_editor
            else:
                self.custom_editor.update_editor(project_dto.custom_editor)
        else:
            if self.custom_editor:
                self.custom_editor.delete()

        self.campaign = [Campaign.query.get(c.id) for c in project_dto.campaigns]

        if project_dto.mapping_permission:
            self.mapping_permission = MappingPermission[
                project_dto.mapping_permission.upper()
            ].value

        if project_dto.validation_permission:
            self.validation_permission = ValidationPermission[
                project_dto.validation_permission.upper()
            ].value

        # Update Interests
        self.interests = []
        if project_dto.interests:
            self.interests = [Interest.query.get(i.id) for i in project_dto.interests]

        db.session.commit()

    def delete(self):
        """ Deletes the current model from the DB """
        db.session.delete(self)
        db.session.commit()

    def is_favorited(self, user_id: int) -> bool:
        user = User.query.get(user_id)
        return user in self.favorited

    def favorite(self, user_id: int):
        user = User.query.get(user_id)
        self.favorited.append(user)
        db.session.commit()

    def unfavorite(self, user_id: int):
        user = User.query.get(user_id)
        if user not in self.favorited:
            raise ValueError("Project has not been favorited by user")
        self.favorited.remove(user)
        db.session.commit()

    def set_as_featured(self):
        if self.featured is True:
            raise ValueError("Project is already featured")
        self.featured = True
        db.session.commit()

    def unset_as_featured(self):
        if self.featured is False:
            raise ValueError("Project is not featured")
        self.featured = False
        db.session.commit()

    def can_be_deleted(self) -> bool:
        """ Projects can be deleted if they have no mapped work """
        task_count = self.tasks.filter(
            Task.task_status != TaskStatus.READY.value
        ).count()
        return task_count == 0

    @staticmethod
    def get_projects_for_admin(
        admin_id: int, preferred_locale: str, search_dto: ProjectSearchDTO
    ) -> PMDashboardDTO:
        """ Get projects for admin """
        query = Project.query.filter(Project.author_id == admin_id)
        # Do filtering here
        if search_dto.order_by:
            if search_dto.order_by_type == "DESC":
                query = query.order_by(desc(search_dto.order_by))
            else:
                query = query.order_by(search_dto.order_by)
        admins_projects = query.all()

        if admins_projects is None:
            raise NotFound("No projects found for admin")

        admin_projects_dto = PMDashboardDTO()
        for project in admins_projects:
            pm_project = project.get_project_summary(preferred_locale)
            project_status = ProjectStatus(project.status)

            if project_status == ProjectStatus.DRAFT:
                admin_projects_dto.draft_projects.append(pm_project)
            elif project_status == ProjectStatus.PUBLISHED:
                admin_projects_dto.active_projects.append(pm_project)
            elif project_status == ProjectStatus.ARCHIVED:
                admin_projects_dto.archived_projects.append(pm_project)
            else:
                current_app.logger.error(f"Unexpected state project {project.id}")

        return admin_projects_dto

    def get_project_user_stats(self, user_id: int) -> ProjectUserStatsDTO:
        """ Compute project-specific stats for a given user """
        stats_dto = ProjectUserStatsDTO()
        stats_dto.time_spent_mapping = 0
        stats_dto.time_spent_validating = 0
        stats_dto.total_time_spent = 0

        query = """SELECT SUM(TO_TIMESTAMP(action_text, 'HH24:MI:SS')::TIME)
                   FROM task_history
                   WHERE (action='LOCKED_FOR_MAPPING' or action='AUTO_UNLOCKED_FOR_MAPPING')
                   and user_id = :user_id and project_id = :project_id;"""
        total_mapping_time = db.engine.execute(
            text(query), user_id=user_id, project_id=self.id
        )
        for time in total_mapping_time:
            total_mapping_time = time[0]
        if total_mapping_time:
            stats_dto.time_spent_mapping = total_mapping_time.total_seconds()
            stats_dto.total_time_spent += stats_dto.time_spent_mapping

        query = """SELECT SUM(TO_TIMESTAMP(action_text, 'HH24:MI:SS')::TIME)
                   FROM task_history
                   WHERE (action='LOCKED_FOR_VALIDATION' or action='AUTO_UNLOCKED_FOR_VALIDATION')
                   and user_id = :user_id and project_id = :project_id;"""
        total_validation_time = db.engine.execute(
            text(query), user_id=user_id, project_id=self.id
        )
        for time in total_validation_time:
            total_validation_time = time[0]
        if total_validation_time:
            stats_dto.time_spent_validating = total_validation_time.total_seconds()
            stats_dto.total_time_spent += stats_dto.time_spent_validating

        return stats_dto

    def get_project_stats(self) -> ProjectStatsDTO:
        """ Create Project Stats model for postgis project object """
        project_stats = ProjectStatsDTO()
        project_stats.project_id = self.id
        project_area_sql = (
            "select ST_Area(geometry, true)/1000000 as area "
            "from public.projects where id = :id"
        )
        project_area_result = db.engine.execute(text(project_area_sql), id=self.id)
        project_stats.area = project_area_result.fetchone()["area"]
        project_stats.total_mappers = (
            db.session.query(User).filter(User.projects_mapped.any(self.id)).count()
        )
        project_stats.total_tasks = self.total_tasks
        project_stats.total_comments = (
            db.session.query(ProjectChat)
            .filter(ProjectChat.project_id == self.id)
            .count()
        )
        project_stats.percent_mapped = Project.calculate_tasks_percent(
            "mapped",
            self.total_tasks,
            self.tasks_mapped,
            self.tasks_validated,
            self.tasks_bad_imagery,
        )
        project_stats.percent_validated = Project.calculate_tasks_percent(
            "validated",
            self.total_tasks,
            self.tasks_mapped,
            self.tasks_validated,
            self.tasks_bad_imagery,
        )
        project_stats.percent_bad_imagery = Project.calculate_tasks_percent(
            "bad_imagery",
            self.total_tasks,
            self.tasks_mapped,
            self.tasks_validated,
            self.tasks_bad_imagery,
        )
        centroid_geojson = db.session.scalar(self.centroid.ST_AsGeoJSON())
        project_stats.aoi_centroid = geojson.loads(centroid_geojson)

        unique_mappers = (
            TaskHistory.query.filter(
                TaskHistory.action == "LOCKED_FOR_MAPPING",
                TaskHistory.project_id == self.id,
            )
            .distinct(TaskHistory.user_id)
            .count()
        )
        unique_validators = (
            TaskHistory.query.filter(
                TaskHistory.action == "LOCKED_FOR_VALIDATION",
                TaskHistory.project_id == self.id,
            )
            .distinct(TaskHistory.user_id)
            .count()
        )
        project_stats.total_time_spent = 0
        project_stats.total_mapping_time = 0
        project_stats.total_validation_time = 0
        project_stats.average_mapping_time = 0
        project_stats.average_validation_time = 0

        query = """SELECT SUM(TO_TIMESTAMP(action_text, 'HH24:MI:SS')::TIME)
                   FROM task_history
                   WHERE (action='LOCKED_FOR_MAPPING' or action='AUTO_UNLOCKED_FOR_MAPPING')
                   and project_id = :project_id;"""
        total_mapping_time = db.engine.execute(text(query), project_id=self.id)
        for row in total_mapping_time:
            total_mapping_time = row[0]
        if total_mapping_time:
            total_mapping_seconds = total_mapping_time.total_seconds()
            project_stats.total_mapping_time = total_mapping_seconds
            project_stats.total_time_spent += project_stats.total_mapping_time
            if unique_mappers:
                average_mapping_time = total_mapping_seconds / unique_mappers
                project_stats.average_mapping_time = average_mapping_time

        query = """SELECT SUM(TO_TIMESTAMP(action_text, 'HH24:MI:SS')::TIME)
                   FROM task_history
                   WHERE (action='LOCKED_FOR_VALIDATION' or action='AUTO_UNLOCKED_FOR_VALIDATION')
                   and project_id = :project_id;"""
        total_validation_time = db.engine.execute(text(query), project_id=self.id)
        for row in total_validation_time:
            total_validation_time = row[0]
        if total_validation_time:
            total_validation_seconds = total_validation_time.total_seconds()
            project_stats.total_validation_time = total_validation_seconds
            project_stats.total_time_spent += project_stats.total_validation_time
            if unique_validators:
                average_validation_time = total_validation_seconds / unique_validators
                project_stats.average_validation_time = average_validation_time

        return project_stats

    def get_project_summary(self, preferred_locale) -> ProjectSummary:
        """ Create Project Summary model for postgis project object """
        summary = ProjectSummary()
        summary.project_id = self.id
        priority = self.priority
        if priority == 0:
            summary.priority = "URGENT"
        elif priority == 1:
            summary.priority = "HIGH"
        elif priority == 2:
            summary.priority = "MEDIUM"
        else:
            summary.priority = "LOW"
        summary.author = User.get_by_id(self.author_id).username
        summary.default_locale = self.default_locale
        summary.country_tag = self.country
        summary.changeset_comment = self.changeset_comment
        summary.due_date = self.due_date
        summary.created = self.created
        summary.last_updated = self.last_updated
        summary.osmcha_filter_id = self.osmcha_filter_id
        summary.mapper_level = MappingLevel(self.mapper_level).name
        summary.mapping_permission = MappingPermission(self.mapping_permission).name
        summary.validation_permission = ValidationPermission(
            self.validation_permission
        ).name
        summary.random_task_selection_enforced = self.enforce_random_task_selection
        summary.private = self.private
        summary.license_id = self.license_id
        summary.status = ProjectStatus(self.status).name
        summary.entities_to_map = self.entities_to_map
        summary.imagery = self.imagery
        if self.organisation_id:
            summary.organisation = self.organisation_id
            summary.organisation_name = self.organisation.name
            summary.organisation_logo = self.organisation.logo

        if self.campaign:
            summary.campaigns = [i.as_dto() for i in self.campaign]

        # Cast MappingType values to the related string array
        mapping_types_array = []
        if self.mapping_types:
            for mapping_type in self.mapping_types:
                mapping_types_array.append(MappingTypes(mapping_type).name)
        summary.mapping_types = mapping_types_array

        if self.mapping_editors:
            mapping_editors = []
            for mapping_editor in self.mapping_editors:
                mapping_editors.append(Editors(mapping_editor).name)
            summary.mapping_editors = mapping_editors

        if self.validation_editors:
            validation_editors = []
            for validation_editor in self.validation_editors:
                validation_editors.append(Editors(validation_editor).name)
            summary.validation_editors = validation_editors

        if self.custom_editor:
            summary.custom_editor = self.custom_editor.as_dto()

        # If the project is private, fetch the list of allowed users
        if self.private:
            allowed_users = []
            for user in self.allowed_users:
                allowed_users.append(user.username)
            summary.allowed_users = allowed_users

        centroid_geojson = db.session.scalar(self.centroid.ST_AsGeoJSON())
        summary.aoi_centroid = geojson.loads(centroid_geojson)

        summary.percent_mapped = Project.calculate_tasks_percent(
            "mapped",
            self.total_tasks,
            self.tasks_mapped,
            self.tasks_validated,
            self.tasks_bad_imagery,
        )
        summary.percent_validated = Project.calculate_tasks_percent(
            "validated",
            self.total_tasks,
            self.tasks_mapped,
            self.tasks_validated,
            self.tasks_bad_imagery,
        )
        summary.percent_bad_imagery = Project.calculate_tasks_percent(
            "bad_imagery",
            self.total_tasks,
            self.tasks_mapped,
            self.tasks_validated,
            self.tasks_bad_imagery,
        )

        summary.project_teams = [
            ProjectTeamDTO(
                dict(
                    team_id=t.team.id,
                    team_name=t.team.name,
role=TeamRoles(t.role).name, )) for t in self.teams ] project_info = ProjectInfo.get_dto_for_locale(self.id, preferred_locale, self.default_locale) summary.project_info = project_info return summary def get_project_title(self, preferred_locale): project_info = ProjectInfo.get_dto_for_locale(self.id, preferred_locale, self.default_locale) return project_info.name @staticmethod def get_project_total_contributions(project_id: int) -> int: project_contributors_count = (TaskHistory.query.with_entities( TaskHistory.user_id).filter( TaskHistory.project_id == project_id, TaskHistory.action != "COMMENT").distinct( TaskHistory.user_id).count()) return project_contributors_count def get_aoi_geometry_as_geojson(self): """ Helper which returns the AOI geometry as a geojson object """ aoi_geojson = db.engine.execute(self.geometry.ST_AsGeoJSON()).scalar() return geojson.loads(aoi_geojson) def get_project_teams(self): """ Helper to return teams with members so we can handle permissions """ project_teams = [] for t in self.teams: project_teams.append({ "name": t.team.name, "role": t.role, "members": [m.member.username for m in t.team.members], }) return project_teams @staticmethod @cached(active_mappers_cache) def get_active_mappers(project_id) -> int: """ Get count of Locked tasks as a proxy for users who are currently active on the project """ return (Task.query.filter( Task.task_status.in_(( TaskStatus.LOCKED_FOR_MAPPING.value, TaskStatus.LOCKED_FOR_VALIDATION.value, ))).filter(Task.project_id == project_id).distinct( Task.locked_by).count()) def _get_project_and_base_dto(self): """ Populates a project DTO with properties common to all roles """ base_dto = ProjectDTO() base_dto.project_id = self.id base_dto.project_status = ProjectStatus(self.status).name base_dto.default_locale = self.default_locale base_dto.project_priority = ProjectPriority(self.priority).name base_dto.area_of_interest = self.get_aoi_geometry_as_geojson() base_dto.aoi_bbox = 
shape(base_dto.area_of_interest).bounds base_dto.mapping_permission = MappingPermission( self.mapping_permission).name base_dto.validation_permission = ValidationPermission( self.validation_permission).name base_dto.enforce_random_task_selection = self.enforce_random_task_selection base_dto.private = self.private base_dto.mapper_level = MappingLevel(self.mapper_level).name base_dto.entities_to_map = self.entities_to_map base_dto.changeset_comment = self.changeset_comment base_dto.osmcha_filter_id = self.osmcha_filter_id base_dto.due_date = self.due_date base_dto.imagery = self.imagery base_dto.josm_preset = self.josm_preset base_dto.id_presets = self.id_presets base_dto.country_tag = self.country base_dto.organisation_id = self.organisation_id base_dto.license_id = self.license_id base_dto.created = self.created base_dto.last_updated = self.last_updated base_dto.author = User.get_by_id(self.author_id).username base_dto.active_mappers = Project.get_active_mappers(self.id) base_dto.task_creation_mode = TaskCreationMode( self.task_creation_mode).name base_dto.percent_mapped = Project.calculate_tasks_percent( "mapped", self.total_tasks, self.tasks_mapped, self.tasks_validated, self.tasks_bad_imagery, ) base_dto.percent_validated = Project.calculate_tasks_percent( "validated", self.total_tasks, self.tasks_mapped, self.tasks_validated, self.tasks_bad_imagery, ) base_dto.percent_bad_imagery = Project.calculate_tasks_percent( "bad_imagery", self.total_tasks, self.tasks_mapped, self.tasks_validated, self.tasks_bad_imagery, ) base_dto.project_teams = [ ProjectTeamDTO( dict( team_id=t.team.id, team_name=t.team.name, role=TeamRoles(t.role).name, )) for t in self.teams ] if self.custom_editor: base_dto.custom_editor = self.custom_editor.as_dto() if self.private: # If project is private it should have a list of allowed users allowed_usernames = [] for user in self.allowed_users: allowed_usernames.append(user.username) base_dto.allowed_usernames = allowed_usernames if 
self.mapping_types: mapping_types = [] for mapping_type in self.mapping_types: mapping_types.append(MappingTypes(mapping_type).name) base_dto.mapping_types = mapping_types if self.campaign: base_dto.campaigns = [i.as_dto() for i in self.campaign] if self.mapping_editors: mapping_editors = [] for mapping_editor in self.mapping_editors: mapping_editors.append(Editors(mapping_editor).name) base_dto.mapping_editors = mapping_editors if self.validation_editors: validation_editors = [] for validation_editor in self.validation_editors: validation_editors.append(Editors(validation_editor).name) base_dto.validation_editors = validation_editors if self.priority_areas: geojson_areas = [] for priority_area in self.priority_areas: geojson_areas.append(priority_area.get_as_geojson()) base_dto.priority_areas = geojson_areas base_dto.interests = [ InterestDTO(dict(id=i.id, name=i.name)) for i in self.interests ] return self, base_dto def check_draft_project_visibility(self, authenticated_user_id: int): """" Check if a User is allowed to see a Draft Project """ is_allowed_user = False if authenticated_user_id: is_team_manager = False user = User.get_by_id(authenticated_user_id) user_orgs = Organisation.get_organisations_managed_by_user( authenticated_user_id) if self.teams: for project_team in self.teams: team_members = Team.get( project_team.team_id)._get_team_members() for member in team_members: if (user.username == member["username"] and member["function"] == TeamMemberFunctions.MANAGER.name): is_team_manager = True break if (UserRole(user.role) == UserRole.ADMIN or authenticated_user_id == self.author_id or self.organisation in user_orgs or is_team_manager): is_allowed_user = True return is_allowed_user def as_dto_for_mapping(self, authenticated_user_id: int = None, locale: str = "en", abbrev: bool = True) -> Optional[ProjectDTO]: """ Creates a Project DTO suitable for transmitting to mapper users """ # Check for project visibility settings is_allowed_user = True if 
self.status == ProjectStatus.DRAFT.value: if not self.check_draft_project_visibility(authenticated_user_id): is_allowed_user = False if self.private: is_allowed_user = False if authenticated_user_id: user = User.get_by_id(authenticated_user_id) if (UserRole(user.role) == UserRole.ADMIN or authenticated_user_id == self.author_id): is_allowed_user = True for user in self.allowed_users: if user.id == authenticated_user_id: is_allowed_user = True break if is_allowed_user: project, project_dto = self._get_project_and_base_dto() if abbrev is False: project_dto.tasks = Task.get_tasks_as_geojson_feature_collection( self.id, None) else: project_dto.tasks = Task.get_tasks_as_geojson_feature_collection_no_geom( self.id) project_dto.project_info = ProjectInfo.get_dto_for_locale( self.id, locale, project.default_locale) if project.organisation_id: project_dto.organisation = project.organisation.id project_dto.organisation_name = project.organisation.name project_dto.organisation_logo = project.organisation.logo project_dto.project_info_locales = ProjectInfo.get_dto_for_all_locales( self.id) return project_dto def tasks_as_geojson(self, task_ids_str: str, order_by=None, order_by_type="ASC", status=None): """ Creates a geojson of all areas """ project_tasks = Task.get_tasks_as_geojson_feature_collection( self.id, task_ids_str, order_by, order_by_type, status) return project_tasks @staticmethod def get_all_countries(): query = (db.session.query( func.unnest(Project.country).label("country")).distinct().order_by( "country")) tags_dto = TagsDTO() tags_dto.tags = [r[0] for r in query] return tags_dto @staticmethod def get_all_organisations_tag(preferred_locale="en"): query = (db.session.query( Project.id, Project.organisation_tag, Project.private, Project.status).join(ProjectInfo).filter( ProjectInfo.locale.in_( [preferred_locale, "en"])).filter(Project.private is not True).filter( Project.organisation_tag.isnot(None)).filter( Project.organisation_tag != "")) query = 
query.distinct(Project.organisation_tag)
        query = query.order_by(Project.organisation_tag)
        tags_dto = TagsDTO()
        tags_dto.tags = [r[1] for r in query]
        return tags_dto

    @staticmethod
    def get_all_campaign_tag(preferred_locale="en"):
        query = (db.session.query(
            Project.id, Project.campaign_tag, Project.private,
            Project.status).join(ProjectInfo).filter(
                ProjectInfo.locale.in_([preferred_locale, "en"])).filter(
                    # `Project.private is not True` compares object identity and is
                    # always truthy; use isnot() so the filter reaches the SQL layer.
                    Project.private.isnot(True)).filter(
                        Project.campaign_tag.isnot(None)).filter(
                            Project.campaign_tag != ""))
        query = query.distinct(Project.campaign_tag)
        query = query.order_by(Project.campaign_tag)
        tags_dto = TagsDTO()
        tags_dto.tags = [r[1] for r in query]
        return tags_dto

    @staticmethod
    def calculate_tasks_percent(target, total_tasks, tasks_mapped,
                                tasks_validated, tasks_bad_imagery):
        """ Calculates percentages of contributions """
        # Bad-imagery tasks are excluded from the mappable total; guard against
        # a zero denominator when every task is bad imagery (or there are none).
        mappable_tasks = total_tasks - tasks_bad_imagery
        if target == "mapped":
            if mappable_tasks == 0:
                return 0
            return int((tasks_mapped + tasks_validated) / mappable_tasks * 100)
        elif target == "validated":
            if mappable_tasks == 0:
                return 0
            return int(tasks_validated / mappable_tasks * 100)
        elif target == "bad_imagery":
            if total_tasks == 0:
                return 0
            return int((tasks_bad_imagery / total_tasks) * 100)

    def as_dto_for_admin(self, project_id):
        """ Creates a Project DTO suitable for transmitting to project admins """
        project, project_dto = self._get_project_and_base_dto()
        if project is None:
            return None
        project_dto.project_info_locales = ProjectInfo.get_dto_for_all_locales(
            project_id)
        return project_dto

    def create_or_update_interests(self, interests_ids):
        self.interests = []
        objs = [Interest.get_by_id(i) for i in interests_ids]
        self.interests.extend(objs)
        db.session.commit()
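The percentage helper at the end of the class is plain arithmetic over task counters. A minimal standalone sketch of the same logic (the zero-denominator guard is an addition here for safety, not necessarily present upstream):

```python
def calculate_tasks_percent(target, total_tasks, tasks_mapped,
                            tasks_validated, tasks_bad_imagery):
    # Bad-imagery tasks are excluded from the mappable denominator.
    mappable = total_tasks - tasks_bad_imagery
    if target == "mapped":
        return int((tasks_mapped + tasks_validated) / mappable * 100) if mappable else 0
    if target == "validated":
        return int(tasks_validated / mappable * 100) if mappable else 0
    if target == "bad_imagery":
        return int(tasks_bad_imagery / total_tasks * 100) if total_tasks else 0
```

Note that validated tasks count toward the mapped percentage too, so "mapped" is monotonically at least "validated".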
class UserSIP(Base): __tablename__ = 'usersip' id = Column(Integer, nullable=False) tenant_uuid = Column(String(36), ForeignKey('tenant.uuid', ondelete='CASCADE'), nullable=False) name = Column(String(40), nullable=False) type = Column(Enum('friend', 'peer', 'user', name='useriax_type', metadata=Base.metadata), nullable=False) username = Column(String(80)) secret = Column(String(80), nullable=False, server_default='') md5secret = Column(String(32), nullable=False, server_default='') context = Column(String(39)) language = Column(String(20)) accountcode = Column(String(20)) amaflags = Column(Enum('default', 'omit', 'billing', 'documentation', name='useriax_amaflags', metadata=Base.metadata), nullable=False, server_default='default') allowtransfer = Column(Integer) fromuser = Column(String(80)) fromdomain = Column(String(255)) subscribemwi = Column(Integer, nullable=False, server_default='0') buggymwi = Column(Integer) call_limit = Column('call-limit', Integer, nullable=False, server_default='10') callerid = Column(String(160)) fullname = Column(String(80)) cid_number = Column(String(80)) maxcallbitrate = Column(Integer) insecure = Column( Enum('port', 'invite', 'port,invite', name='usersip_insecure', metadata=Base.metadata)) nat = Column( Enum('no', 'force_rport', 'comedia', 'force_rport,comedia', 'auto_force_rport', 'auto_comedia', 'auto_force_rport,auto_comedia', name='usersip_nat', metadata=Base.metadata)) promiscredir = Column(Integer) usereqphone = Column(Integer) videosupport = Column( Enum('no', 'yes', 'always', name='usersip_videosupport', metadata=Base.metadata)) trustrpid = Column(Integer) sendrpid = Column(String(16)) allowsubscribe = Column(Integer) allowoverlap = Column(Integer) dtmfmode = Column( Enum('rfc2833', 'inband', 'info', 'auto', name='usersip_dtmfmode', metadata=Base.metadata)) rfc2833compensate = Column(Integer) qualify = Column(String(4)) g726nonstandard = Column(Integer) disallow = Column(String(100)) allow = Column(Text) autoframing = 
Column(Integer) mohinterpret = Column(String(80)) useclientcode = Column(Integer) progressinband = Column( Enum('no', 'yes', 'never', name='usersip_progressinband', metadata=Base.metadata)) t38pt_udptl = Column(Integer) t38pt_usertpsource = Column(Integer) rtptimeout = Column(Integer) rtpholdtimeout = Column(Integer) rtpkeepalive = Column(Integer) deny = Column(String(31)) permit = Column(String(31)) defaultip = Column(String(255)) host = Column(String(255), nullable=False, server_default='dynamic') port = Column(Integer) regexten = Column(String(80)) subscribecontext = Column(String(80)) vmexten = Column(String(40)) callingpres = Column(Integer) parkinglot = Column(Integer) protocol = Column(enum.trunk_protocol, nullable=False, server_default='sip') category = Column(Enum('user', 'trunk', name='useriax_category', metadata=Base.metadata), nullable=False) outboundproxy = Column(String(1024)) transport = Column(String(255)) remotesecret = Column(String(255)) directmedia = Column(String(20)) callcounter = Column(Integer) busylevel = Column(Integer) ignoresdpversion = Column(Integer) session_timers = Column( 'session-timers', Enum('originate', 'accept', 'refuse', name='usersip_session_timers', metadata=Base.metadata)) session_expires = Column('session-expires', Integer) session_minse = Column('session-minse', Integer) session_refresher = Column( 'session-refresher', Enum('uac', 'uas', name='usersip_session_refresher', metadata=Base.metadata)) callbackextension = Column(String(255)) timert1 = Column(Integer) timerb = Column(Integer) qualifyfreq = Column(Integer) contactpermit = Column(String(1024)) contactdeny = Column(String(1024)) unsolicited_mailbox = Column(String(1024)) use_q850_reason = Column(Integer) encryption = Column(Integer) snom_aoc_enabled = Column(Integer) maxforwards = Column(Integer) disallowed_methods = Column(String(1024)) textsupport = Column(Integer) commented = Column(Integer, nullable=False, server_default='0') _options = Column("options", 
ARRAY(String, dimensions=2),
                      nullable=False,
                      default=list,
                      server_default='{}')

    __table_args__ = (
        PrimaryKeyConstraint('id'),
        UniqueConstraint('name'),
        Index('usersip__idx__category', 'category'),
        CheckConstraint(
            directmedia.in_(
                ['no', 'yes', 'nonat', 'update', 'update,nonat', 'outgoing'])),
    )
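The `CheckConstraint` on `directmedia` enforces the allowed values inside Postgres. An application-side mirror of the same rule can fail fast before a round trip to the database; this is a sketch, not part of the model:

```python
# Mirrors the usersip CheckConstraint on directmedia.
ALLOWED_DIRECTMEDIA = {'no', 'yes', 'nonat', 'update', 'update,nonat', 'outgoing'}

def validate_directmedia(value):
    """Return value unchanged if it satisfies the constraint, else raise."""
    # NULL is allowed by the constraint (CHECK only rejects non-NULL violations).
    if value is not None and value not in ALLOWED_DIRECTMEDIA:
        raise ValueError(
            f"directmedia must be one of {sorted(ALLOWED_DIRECTMEDIA)}, got {value!r}")
    return value
```

The database constraint remains the source of truth; this only improves error messages.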
class Project(db.Model, DomainObject): '''A microtasking Project to which Tasks are associated. ''' __tablename__ = 'project' #: ID of the project id = Column(Integer, primary_key=True) #: UTC timestamp when the project is created created = Column(Text, default=make_timestamp) #: UTC timestamp when the project is updated (or any of its relationships) updated = Column(Text, default=make_timestamp, onupdate=make_timestamp) #: Project name name = Column(Unicode(length=255), unique=True, nullable=False) #: Project slug for the URL short_name = Column(Unicode(length=255), unique=True, nullable=False) #: Project description description = Column(Unicode(length=255), nullable=False) #: Project long description long_description = Column(UnicodeText) #: Project webhook webhook = Column(Text) #: If the project allows anonymous contributions allow_anonymous_contributors = Column(Boolean, default=True) #: If the project is published published = Column(Boolean, nullable=False, default=False) # If the project is featured featured = Column(Boolean, nullable=False, default=False) # Secret key for project secret_key = Column(Text, default=make_uuid) # Zip download zip_download = Column(Boolean, default=True) # If the project owner has been emailed contacted = Column(Boolean, nullable=False, default=False) #: Project owner_id owner_id = Column(Integer, ForeignKey('user.id'), nullable=False) #: Project Category category_id = Column(Integer, ForeignKey('category.id'), nullable=False) #: Project info field formatted as JSON info = Column(MutableDict.as_mutable(JSONB), default=dict()) # Custom. 
# Whether it is visible to users (who have privileges) visible = Column(Boolean, nullable=False, default=True) # Maximum number of tasks per user n_allowed_tasks = Column(Integer, nullable=False, default=0) tasks = relationship(Task, cascade='all, delete, delete-orphan', backref='project') task_runs = relationship(TaskRun, backref='project', cascade='all, delete-orphan', order_by='TaskRun.finish_time.desc()') category = relationship(Category) blogposts = relationship(Blogpost, cascade='all, delete-orphan', backref='project') owners_ids = Column(MutableList.as_mutable(ARRAY(Integer)), default=list()) def needs_password(self): return self.get_passwd_hash() is not None def get_passwd_hash(self): return self.info.get('passwd_hash') def get_passwd(self): if self.needs_password(): return signer.loads(self.get_passwd_hash()) return None def set_password(self, password): if len(password) > 1: self.info['passwd_hash'] = signer.dumps(password) return True self.info['passwd_hash'] = None return False def check_password(self, password): if self.needs_password(): return self.get_passwd() == password return False def has_autoimporter(self): return self.get_autoimporter() is not None def get_autoimporter(self): return self.info.get('autoimporter') def set_autoimporter(self, new=None): self.info['autoimporter'] = new def delete_autoimporter(self): del self.info['autoimporter'] def has_presenter(self): if current_app.config.get('DISABLE_TASK_PRESENTER') is True: return True else: return self.info.get('task_presenter') not in ('', None) @classmethod def public_attributes(self): """Return a list of public attributes.""" return [ 'id', 'description', 'info', 'n_tasks', 'n_volunteers', 'name', 'overall_progress', 'short_name', 'created', 'category_id', 'long_description', 'last_activity', 'last_activity_raw', 'n_task_runs', 'n_results', 'owner', 'updated', 'featured', 'owner_id', 'n_completed_tasks', 'n_blogposts', 'owners_ids' ] @classmethod def public_info_keys(self): """Return a 
list of public info keys."""
        default = [
            'container', 'thumbnail', 'thumbnail_url', 'task_presenter',
            'tutorial', 'sched'
        ]
        extra = current_app.config.get('PROJECT_INFO_PUBLIC_FIELDS')
        if extra:
            return list(set(default).union(set(extra)))
        return default
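The key-merging logic in `public_info_keys` is worth isolating: it unions the built-in keys with any configured extras, which means the result order is unspecified. A standalone sketch (the function name is mine, for illustration):

```python
def merged_public_info_keys(default, extra=None):
    # Same shape as Project.public_info_keys: union built-in keys with any
    # configured extras. Set union deduplicates but loses ordering, so
    # callers must not rely on key order.
    if extra:
        return list(set(default) | set(extra))
    return default
```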
        return value.value

    def process_result_value(self, value: str, dialect):
        for __, case in self.python_enum_type.__members__.items():
            if case.value == value:
                return case
        raise TypeError(f"Cannot map Enum value {value!r} "
                        f"to Python's {self.python_enum_type}")

    def copy(self):
        return PythonEnum(self.python_enum_type, **self.kwargs)


tbl = Table('sa_tbl_types', meta,
            Column('id', Integer, nullable=False, primary_key=True),
            Column('json_val', JSON),
            Column('array_val', ARRAY(Integer)),
            Column('hstore_val', HSTORE),
            Column('pyt_enum_val', PythonEnum(SimpleEnum, name='simple_enum')),
            Column('enum_val', ENUM('first', 'second', name='simple_enum')))

tbl2 = Table(
    'sa_tbl_types2', meta,
    Column('id', Integer, nullable=False, primary_key=True),
    Column('json_val', JSON),
    Column('array_val', ARRAY(Integer)),
    Column('pyt_enum_val', PythonEnum(SimpleEnum, name='simple_enum')),
    Column('enum_val', ENUM('first', 'second', name='simple_enum')))


@pytest.fixture
def connect(make_engine):
    async def go(**kwargs):
class DeathSearch(db.Model):
    """Death certificate search request.

    Columns (matching the definitions below):
        last_name -- String(25)
        first_name -- String(40)
        middle_name -- String(40)
        num_copies -- String(40)
        cemetery -- String(40)
        month -- String(20)
        day -- String(2)
        years -- ARRAY(String(4))
        death_place -- String(40)
        borough -- ARRAY(String(20))
        father_name -- String(30)
        mother_name -- String(30)
        letter -- Boolean
        comment -- String(255)
        delivery_method -- Enum('mail', 'email', 'pickup')
        suborder_number -- String(32), ForeignKey('suborders.id')
    """
    __tablename__ = 'death_search'
    id = db.Column(db.Integer, primary_key=True)
    last_name = db.Column(db.String(25), nullable=False)
    first_name = db.Column(db.String(40), nullable=True)
    middle_name = db.Column(db.String(40), nullable=True)
    num_copies = db.Column(db.String(40), nullable=False)
    cemetery = db.Column(db.String(40), nullable=True)
    month = db.Column(db.String(20), nullable=True)
    day = db.Column(db.String(2), nullable=True)
    _years = db.Column(ARRAY(db.String(4), dimensions=1),
                       nullable=False,
                       name='years')
    death_place = db.Column(db.String(40), nullable=True)
    _borough = db.Column(ARRAY(db.String(20), dimensions=1),
                         nullable=False,
                         name='borough')
    father_name = db.Column(db.String(30), nullable=True)
    mother_name = db.Column(db.String(30), nullable=True)
    letter = db.Column(db.Boolean, nullable=True)
    comment = db.Column(db.String(255), nullable=True)
    delivery_method = db.Column(db.Enum(delivery_method.MAIL,
                                        delivery_method.EMAIL,
                                        delivery_method.PICKUP,
                                        name='delivery_method'),
                                nullable=True)
    suborder_number = db.Column(db.String(32),
                                db.ForeignKey('suborders.id'),
                                nullable=False)

    def __init__(self, last_name, first_name, middle_name, num_copies,
                 cemetery, month, day, years, death_place, borough,
                 father_name, mother_name, letter, comment, _delivery_method,
                 suborder_number):
        self.last_name = last_name
        self.first_name = first_name
        self.middle_name = middle_name
        self.num_copies = num_copies or None
        self.cemetery = cemetery or None
        self.month = month or None
        self.day = day or None
        self._years = years or None
        self.death_place = death_place or None
        self._borough = borough
        self.father_name = father_name
        self.mother_name = mother_name
        self.letter = letter or None
        self.comment = comment or None
        self.delivery_method = _delivery_method
        self.suborder_number = suborder_number

    @property
    def years(self):
        if isinstance(self._years, list):
            if len(self._years) > 1:
                return ",".join(self._years)
            return self._years[0]
        return None

    @years.setter
    def years(self, value):
        self._years = value

    @property
    def borough(self):
        if len(self._borough) > 1:
            return ", ".join(self._borough)
        return self._borough[0]

    @borough.setter
    def borough(self, value):
        self._borough = value

    @property
    def serialize(self):
        """Return object data in easily serializable format"""
        return {
            'first_name': self.first_name,
            'last_name': self.last_name,
            'middle_name': self.middle_name,
            'num_copies': self.num_copies,
            'cemetery': self.cemetery,
            'month': self.month,
            'day': self.day,
            'years': self.years,
            'death_place': self.death_place,
            'borough': self.borough,
            'father_name': self.father_name,
            'mother_name': self.mother_name,
            'letter': self.letter,
            'comment': self.comment,
            'delivery_method': self.delivery_method,
            'suborder_number': self.suborder_number
        }
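The `years` property flattens the stored array into a display string. The shape of that mapping, isolated as a plain function (name is mine, for illustration):

```python
def format_years(years):
    # Mirrors DeathSearch.years: a multi-element list is comma-joined, a
    # single-element list is unwrapped, and a non-list (e.g. None) maps to
    # None. As in the model, an empty list would raise IndexError.
    if isinstance(years, list):
        if len(years) > 1:
            return ",".join(years)
        return years[0]
    return None
```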
class CommitteeReports(FecFileNumberMixin, PdfMixin, CsvMixin, BaseModel): __abstract__ = True committee_id = db.Column(db.String, index=True, doc=docs.COMMITTEE_ID) committee = utils.related('CommitteeHistory', 'committee_id', 'committee_id', 'report_year', 'cycle') #These columns derived from amendments materializeds view amendment_chain = db.Column(ARRAY(db.Numeric), doc=docs.AMENDMENT_CHAIN) previous_file_number = db.Column(db.Numeric) most_recent_file_number = db.Column(db.Numeric) cycle = db.Column(db.Integer, index=True, doc=docs.CYCLE) file_number = db.Column(db.Integer) amendment_indicator = db.Column('amendment_indicator', db.String) amendment_indicator_full = db.Column(db.String) beginning_image_number = db.Column(db.BigInteger, doc=docs.BEGINNING_IMAGE_NUMBER) cash_on_hand_beginning_period = db.Column( db.Numeric(30, 2), doc=docs.CASH_ON_HAND_BEGIN_PERIOD) #P cash_on_hand_end_period = db.Column('cash_on_hand_end_period', db.Numeric(30, 2), doc=docs.CASH_ON_HAND_END_PERIOD) #P coverage_end_date = db.Column(db.DateTime, index=True, doc=docs.COVERAGE_END_DATE) #P coverage_start_date = db.Column(db.DateTime, index=True, doc=docs.COVERAGE_START_DATE) #P debts_owed_by_committee = db.Column('debts_owed_by_committee', db.Numeric(30, 2), doc=docs.DEBTS_OWED_BY_COMMITTEE) #P debts_owed_to_committee = db.Column(db.Numeric(30, 2), doc=docs.DEBTS_OWED_TO_COMMITTEE) #P end_image_number = db.Column(db.BigInteger, doc=docs.ENDING_IMAGE_NUMBER) other_disbursements_period = db.Column(db.Numeric(30, 2), doc=docs.add_period( docs.OTHER_DISBURSEMENTS)) #PX other_disbursements_ytd = db.Column(db.Numeric(30, 2), doc=docs.add_ytd( docs.OTHER_DISBURSEMENTS)) #PX other_political_committee_contributions_period = db.Column( db.Numeric(30, 2), doc=docs.add_period(docs.OTHER_POLITICAL_COMMITTEE_CONTRIBUTIONS)) #P other_political_committee_contributions_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd(docs.OTHER_POLITICAL_COMMITTEE_CONTRIBUTIONS)) #P 
political_party_committee_contributions_period = db.Column( db.Numeric(30, 2), doc=docs.add_period(docs.POLITICAL_PARTY_COMMITTEE_CONTRIBUTIONS)) #P political_party_committee_contributions_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd(docs.POLITICAL_PARTY_COMMITTEE_CONTRIBUTIONS)) #P report_type = db.Column(db.String, doc=docs.REPORT_TYPE) report_type_full = db.Column(db.String, doc=docs.REPORT_TYPE) report_year = db.Column(db.Integer, doc=docs.REPORT_YEAR) total_contribution_refunds_period = db.Column( db.Numeric(30, 2), doc=docs.add_period(docs.CONTRIBUTION_REFUNDS)) #P total_contribution_refunds_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd(docs.CONTRIBUTION_REFUNDS)) #P refunded_individual_contributions_period = db.Column( db.Numeric(30, 2), doc=docs.add_period(docs.REFUNDED_INDIVIDUAL_CONTRIBUTIONS)) #P refunded_individual_contributions_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd(docs.REFUNDED_INDIVIDUAL_CONTRIBUTIONS)) #P refunded_other_political_committee_contributions_period = db.Column( db.Numeric(30, 2), doc=docs.add_period( docs.REFUNDED_OTHER_POLITICAL_COMMITTEE_CONTRIBUTIONS)) #P refunded_other_political_committee_contributions_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd( docs.REFUNDED_OTHER_POLITICAL_COMMITTEE_CONTRIBUTIONS)) #P refunded_political_party_committee_contributions_period = db.Column( db.Numeric(30, 2), doc=docs.add_period( docs.REFUNDED_POLITICAL_PARTY_COMMITTEE_CONTRIBUTIONS)) #P refunded_political_party_committee_contributions_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd( docs.REFUNDED_POLITICAL_PARTY_COMMITTEE_CONTRIBUTIONS)) #P total_contributions_period = db.Column('total_contributions_period', db.Numeric(30, 2), doc=docs.add_period( docs.CONTRIBUTIONS)) #P total_contributions_ytd = db.Column(db.Numeric(30, 2), doc=docs.add_ytd( docs.CONTRIBUTIONS)) #P total_disbursements_period = db.Column('total_disbursements_period', db.Numeric(30, 2), doc=docs.add_period( docs.DISBURSEMENTS)) #P 
total_disbursements_ytd = db.Column(db.Numeric(30, 2), doc=docs.add_ytd( docs.DISBURSEMENTS)) #P total_receipts_period = db.Column('total_receipts_period', db.Numeric(30, 2), doc=docs.add_period(docs.RECEIPTS)) #P total_receipts_ytd = db.Column(db.Numeric(30, 2), doc=docs.add_ytd(docs.RECEIPTS)) #P offsets_to_operating_expenditures_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd(docs.OFFSETS_TO_OPERATING_EXPENDITURES)) #P offsets_to_operating_expenditures_period = db.Column( db.Numeric(30, 2), doc=docs.add_period(docs.OFFSETS_TO_OPERATING_EXPENDITURES)) #P total_individual_contributions_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd(docs.INDIVIDUAL_CONTRIBUTIONS)) #P total_individual_contributions_period = db.Column( db.Numeric(30, 2), doc=docs.add_period(docs.INDIVIDUAL_CONTRIBUTIONS)) #P individual_unitemized_contributions_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd(docs.INDIVIDUAL_UNITEMIZED_CONTRIBUTIONS)) #P individual_unitemized_contributions_period = db.Column( db.Numeric(30, 2), doc=docs.add_period(docs.INDIVIDUAL_UNITEMIZED_CONTRIBUTIONS)) #P individual_itemized_contributions_ytd = db.Column( db.Numeric(30, 2), doc=docs.add_ytd(docs.INDIVIDUAL_ITEMIZED_CONTRIBUTIONS)) #P individual_itemized_contributions_period = db.Column( db.Numeric(30, 2), doc=docs.add_period(docs.INDIVIDUAL_ITEMIZED_CONTRIBUTIONS)) #P is_amended = db.Column( 'is_amended', db.Boolean, doc= 'False indicates that a report is the most recent. True indicates that the report has been superseded by an amendment.' ) receipt_date = db.Column('receipt_date', db.Date, doc=docs.RECEIPT_DATE) means_filed = db.Column('means_filed', db.String, doc=docs.MEANS_FILED) fec_url = db.Column(db.String) html_url = db.Column(db.String, doc='HTML link to the filing.') most_recent = db.Column('most_recent', db.Boolean) @property def document_description(self): return utils.document_description( self.coverage_end_date.year, clean_report_type(str(self.report_type_full)), None, None, )
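Per the `is_amended` column's documented meaning (False marks the most recent version; True marks a report superseded by an amendment), filtering down to current reports is a one-liner. A hypothetical sketch over plain dicts, not the ORM:

```python
def most_recent_reports(reports):
    # Keep only reports not superseded by an amendment, following the
    # documented semantics of the is_amended flag.
    return [r for r in reports if not r['is_amended']]
```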
class Category(SearchableTitleMixin, DescriptionMixin, ProtectionManagersMixin, AttachedItemsMixin, db.Model): """An Indico category""" __tablename__ = 'categories' disallowed_protection_modes = frozenset() inheriting_have_acl = True possible_render_modes = {RenderMode.markdown} default_render_mode = RenderMode.markdown allow_no_access_contact = True ATTACHMENT_FOLDER_ID_COLUMN = 'category_id' @strict_classproperty @classmethod def __auto_table_args(cls): return (db.CheckConstraint( "(icon IS NULL) = (icon_metadata::text = 'null')", 'valid_icon'), db.CheckConstraint( "(logo IS NULL) = (logo_metadata::text = 'null')", 'valid_logo'), db.CheckConstraint("(parent_id IS NULL) = (id = 0)", 'valid_parent'), db.CheckConstraint("(id != 0) OR NOT is_deleted", 'root_not_deleted'), db.CheckConstraint( "(id != 0) OR (protection_mode != {})".format( ProtectionMode.inheriting), 'root_not_inheriting'), db.CheckConstraint('visibility IS NULL OR visibility > 0', 'valid_visibility'), { 'schema': 'categories' }) @declared_attr def __table_args__(cls): return auto_table_args(cls) id = db.Column(db.Integer, primary_key=True) parent_id = db.Column(db.Integer, db.ForeignKey('categories.categories.id'), index=True, nullable=True) is_deleted = db.Column(db.Boolean, nullable=False, default=False) position = db.Column(db.Integer, nullable=False, default=_get_next_position) visibility = db.Column(db.Integer, nullable=True, default=None) icon_metadata = db.Column(JSONB, nullable=False, default=lambda: None) icon = db.deferred(db.Column(db.LargeBinary, nullable=True)) logo_metadata = db.Column(JSONB, nullable=False, default=lambda: None) logo = db.deferred(db.Column(db.LargeBinary, nullable=True)) timezone = db.Column(db.String, nullable=False, default=lambda: config.DEFAULT_TIMEZONE) default_event_themes = db.Column(JSONB, nullable=False, default=_get_default_event_themes) event_creation_restricted = db.Column(db.Boolean, nullable=False, default=True) event_creation_notification_emails = 
db.Column(ARRAY(db.String), nullable=False, default=[]) event_message_mode = db.Column(PyIntEnum(EventMessageMode), nullable=False, default=EventMessageMode.disabled) _event_message = db.Column('event_message', db.Text, nullable=False, default='') suggestions_disabled = db.Column(db.Boolean, nullable=False, default=False) notify_managers = db.Column(db.Boolean, nullable=False, default=False) default_ticket_template_id = db.Column( db.ForeignKey('indico.designer_templates.id'), nullable=True, index=True) children = db.relationship( 'Category', order_by='Category.position', primaryjoin=(id == db.remote(parent_id)) & ~db.remote(is_deleted), lazy=True, backref=db.backref('parent', primaryjoin=(db.remote(id) == parent_id), lazy=True)) acl_entries = db.relationship('CategoryPrincipal', backref='category', cascade='all, delete-orphan', collection_class=set) default_ticket_template = db.relationship( 'DesignerTemplate', lazy=True, foreign_keys=default_ticket_template_id, backref='default_ticket_template_of') # column properties: # - deep_events_count # relationship backrefs: # - attachment_folders (AttachmentFolder.category) # - designer_templates (DesignerTemplate.category) # - events (Event.category) # - favorite_of (User.favorite_categories) # - legacy_mapping (LegacyCategoryMapping.category) # - parent (Category.children) # - roles (CategoryRole.category) # - settings (CategorySetting.category) # - suggestions (SuggestedCategory.category) @hybrid_property def event_message(self): return MarkdownText(self._event_message) @event_message.setter def event_message(self, value): self._event_message = value @event_message.expression def event_message(cls): return cls._event_message @return_ascii def __repr__(self): return format_repr(self, 'id', is_deleted=False, _text=text_to_repr(self.title, max_length=75)) @property def protection_parent(self): return self.parent if not self.is_root else None @locator_property def locator(self): return {'category_id': self.id} @classmethod 
def get_root(cls): """Get the root category""" return cls.query.filter(cls.is_root).one() @property def url(self): return url_for('categories.display', self) @property def has_only_events(self): return self.has_events and not self.children @hybrid_property def is_root(self): return self.parent_id is None @is_root.expression def is_root(cls): return cls.parent_id.is_(None) @property def is_empty(self): return not self.deep_children_count and not self.deep_events_count @property def has_icon(self): return self.icon_metadata is not None @property def has_effective_icon(self): return self.effective_icon_data['metadata'] is not None @property def has_logo(self): return self.logo_metadata is not None @property def tzinfo(self): return pytz.timezone(self.timezone) @property def display_tzinfo(self): """The tzinfo of the category or the one specified by the user""" return get_display_tz(self, as_timezone=True) def can_create_events(self, user): """Check whether the user can create events in the category.""" # if creation is not restricted anyone who can access the category # can also create events in it, otherwise only people with the # creation role can return user and ( (not self.event_creation_restricted and self.can_access(user)) or self.can_manage(user, permission='create')) def move(self, target): """Move the category into another category.""" assert not self.is_root old_parent = self.parent self.position = (max(x.position for x in target.children) + 1) if target.children else 1 self.parent = target db.session.flush() signals.category.moved.send(self, old_parent=old_parent) @classmethod def get_tree_cte(cls, col='id'): """Create a CTE for the category tree. 
The CTE contains the following columns: - ``id`` -- the category id - ``path`` -- an array containing the path from the root to the category itself - ``is_deleted`` -- whether the category is deleted :param col: The name of the column to use in the path or a callable receiving the category alias that must return the expression used for the 'path' retrieved by the CTE. """ cat_alias = db.aliased(cls) if callable(col): path_column = col(cat_alias) else: path_column = getattr(cat_alias, col) cte_query = (select([ cat_alias.id, array([path_column]).label('path'), cat_alias.is_deleted ]).where(cat_alias.parent_id.is_(None)).cte(recursive=True)) rec_query = (select([ cat_alias.id, cte_query.c.path.op('||')(path_column), cte_query.c.is_deleted | cat_alias.is_deleted ]).where(cat_alias.parent_id == cte_query.c.id)) return cte_query.union_all(rec_query) @classmethod def get_protection_cte(cls): cat_alias = db.aliased(cls) cte_query = (select([cat_alias.id, cat_alias.protection_mode]).where( cat_alias.parent_id.is_(None)).cte(recursive=True)) rec_query = (select([ cat_alias.id, db.case( {ProtectionMode.inheriting.value: cte_query.c.protection_mode}, else_=cat_alias.protection_mode, value=cat_alias.protection_mode) ]).where(cat_alias.parent_id == cte_query.c.id)) return cte_query.union_all(rec_query) def get_protection_parent_cte(self): cte_query = (select([ Category.id, db.cast(literal(None), db.Integer).label('protection_parent') ]).where(Category.id == self.id).cte(recursive=True)) rec_query = (select([ Category.id, db.case( { ProtectionMode.inheriting.value: func.coalesce(cte_query.c.protection_parent, self.id) }, else_=Category.id, value=Category.protection_mode) ]).where(Category.parent_id == cte_query.c.id)) return cte_query.union_all(rec_query) @classmethod def get_icon_data_cte(cls): cat_alias = db.aliased(cls) cte_query = (select([ cat_alias.id, cat_alias.id.label('source_id'), cat_alias.icon_metadata ]).where(cat_alias.parent_id.is_(None)).cte(recursive=True)) 
rec_query = (select([ cat_alias.id, db.case({'null': cte_query.c.source_id}, else_=cat_alias.id, value=db.func.jsonb_typeof(cat_alias.icon_metadata)), db.case({'null': cte_query.c.icon_metadata}, else_=cat_alias.icon_metadata, value=db.func.jsonb_typeof(cat_alias.icon_metadata)) ]).where(cat_alias.parent_id == cte_query.c.id)) return cte_query.union_all(rec_query) @property def deep_children_query(self): """Get a query object for all subcategories. This includes subcategories at any level of nesting. """ cte = Category.get_tree_cte() return (Category.query.join(cte, Category.id == cte.c.id).filter( cte.c.path.contains([self.id]), cte.c.id != self.id, ~cte.c.is_deleted)) @staticmethod def _get_chain_query(start_criterion): cte_query = (select([ Category.id, Category.parent_id, literal(0).label('level') ]).where(start_criterion).cte('category_chain', recursive=True)) parent_query = (select([ Category.id, Category.parent_id, cte_query.c.level + 1 ]).where(Category.id == cte_query.c.parent_id)) cte_query = cte_query.union_all(parent_query) return Category.query.join(cte_query, Category.id == cte_query.c.id).order_by( cte_query.c.level.desc()) @property def chain_query(self): """Get a query object for the category chain. The query retrieves the root category first and then all the intermediate categories up to (and including) this category. """ return self._get_chain_query(Category.id == self.id) @property def parent_chain_query(self): """Get a query object for the category's parent chain. The query retrieves the root category first and then all the intermediate categories up to (excluding) this category. """ return self._get_chain_query(Category.id == self.parent_id) def nth_parent(self, n_categs, fail_on_overflow=True): """Return the nth parent of the category. 
:param n_categs: the number of categories to go up :param fail_on_overflow: whether to fail if we try to go above the root category :return: `Category` object or None (only if ``fail_on_overflow`` is not set) """ if n_categs == 0: return self chain = self.parent_chain_query.all() assert n_categs >= 0 if n_categs > len(chain): if fail_on_overflow: raise IndexError("Root category has no parent!") else: return None return chain[::-1][n_categs - 1] def is_descendant_of(self, categ): return categ != self and self.parent_chain_query.filter( Category.id == categ.id).has_rows() @property def visibility_horizon_query(self): """Get a query object that returns the highest category this one is visible from.""" cte_query = (select([ Category.id, Category.parent_id, db.case([(Category.visibility.is_(None), None)], else_=(Category.visibility - 1)).label('n'), literal(0).label('level') ]).where(Category.id == self.id).cte('visibility_horizon', recursive=True)) parent_query = (select([ Category.id, Category.parent_id, db.case([ (Category.visibility.is_(None) & cte_query.c.n.is_(None), None) ], else_=db.func.least(Category.visibility, cte_query.c.n) - 1), cte_query.c.level + 1 ]).where( db.and_(Category.id == cte_query.c.parent_id, (cte_query.c.n > 0) | cte_query.c.n.is_(None)))) cte_query = cte_query.union_all(parent_query) return db.session.query(cte_query.c.id, cte_query.c.n).order_by( cte_query.c.level.desc()).limit(1) @property def own_visibility_horizon(self): """Get the highest category this one would like to be visible from (configured visibility).""" if self.visibility is None: return Category.get_root() else: return self.nth_parent(self.visibility - 1) @property def real_visibility_horizon(self): """Get the highest category this one is actually visible from (as limited by categories above).""" horizon_id, final_visibility = self.visibility_horizon_query.one() if final_visibility is not None and final_visibility < 0: return None # Category is invisible return 
Category.get(horizon_id) @staticmethod def get_visible_categories_cte(category_id): """ Get a sqlalchemy select for the visible categories within the given category, including the category itself. """ cte_query = (select([ Category.id, literal(0).label('level') ]).where((Category.id == category_id) & (Category.visibility.is_(None) | (Category.visibility > 0))).cte(recursive=True)) parent_query = (select([Category.id, cte_query.c.level + 1]).where( db.and_( Category.parent_id == cte_query.c.id, db.or_(Category.visibility.is_(None), Category.visibility > cte_query.c.level + 1)))) return cte_query.union_all(parent_query) @property def visible_categories_query(self): """ Get a query object for the visible categories within this category, including the category itself. """ cte_query = Category.get_visible_categories_cte(self.id) return Category.query.join(cte_query, Category.id == cte_query.c.id) @property def icon_url(self): """Get the HTTP URL of the icon.""" return url_for('categories.display_icon', self, slug=self.icon_metadata['hash']) @property def effective_icon_url(self): """Get the HTTP URL of the icon (possibly inherited).""" data = self.effective_icon_data return url_for('categories.display_icon', category_id=data['source_id'], slug=data['metadata']['hash']) @property def logo_url(self): """Get the HTTP URL of the logo.""" return url_for('categories.display_logo', self, slug=self.logo_metadata['hash'])
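The recursive CTE built by `get_tree_cte` gives every category a `path` array running from the root down to the category itself, which is what lets `deep_children_query` match all descendants with a single `path.contains([self.id])` filter. The same computation can be sketched in plain Python over a parent-pointer map (the tree below is made up for illustration):

```python
# Pure-Python sketch of what `get_tree_cte` computes: for each node, the
# path of ids from the root down to the node itself.

def build_paths(parents):
    """Map each node id to its root-to-node path, given {id: parent_id}."""
    paths = {}

    def path_of(node):
        if node not in paths:
            parent = parents[node]
            paths[node] = [node] if parent is None else path_of(parent) + [node]
        return paths[node]

    for node in parents:
        path_of(node)
    return paths


# 0 is the root; 1 and 2 are children of 0; 3 is a child of 2.
parents = {0: None, 1: 0, 2: 0, 3: 2}
paths = build_paths(parents)
# paths[3] is [0, 2, 3], so a deep-children query for category 0 would
# match 1, 2 and 3 via a `path.contains([0])`-style filter.
```

The SQL version does the same thing bottom-up: the anchor `SELECT` seeds the roots with single-element paths, and the recursive member appends each child's id (`cte_query.c.path.op('||')(path_column)`) while OR-ing the deletion flags down the tree.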
class Project(db.Model): """ Describes a HOT Mapping Project """ __tablename__ = 'projects' # Columns id = db.Column(db.Integer, primary_key=True) status = db.Column(db.Integer, default=ProjectStatus.DRAFT.value, nullable=False) created = db.Column(db.DateTime, default=timestamp, nullable=False) priority = db.Column(db.Integer, default=ProjectPriority.MEDIUM.value) default_locale = db.Column( db.String(10), default='en' ) # The locale that is returned if requested locale not available author_id = db.Column(db.BigInteger, db.ForeignKey('users.id', name='fk_users'), nullable=False) mapper_level = db.Column( db.Integer, default=1, nullable=False, index=True) # Mapper level project is suitable for enforce_mapper_level = db.Column(db.Boolean, default=False) enforce_validator_role = db.Column( db.Boolean, default=False) # Means only users with validator role can validate private = db.Column(db.Boolean, default=False) # Only allowed users can validate entities_to_map = db.Column(db.String) changeset_comment = db.Column(db.String) due_date = db.Column(db.DateTime) imagery = db.Column(db.String) josm_preset = db.Column(db.String) last_updated = db.Column(db.DateTime, default=timestamp) license_id = db.Column(db.Integer, db.ForeignKey('licenses.id', name='fk_licenses')) geometry = db.Column(Geometry('MULTIPOLYGON', srid=4326)) centroid = db.Column(Geometry('POINT', srid=4326)) # Tags mapping_types = db.Column(ARRAY(db.Integer), index=True) organisation_tag = db.Column(db.String, index=True) campaign_tag = db.Column(db.String, index=True) # Stats total_tasks = db.Column(db.Integer, nullable=False) tasks_mapped = db.Column(db.Integer, default=0, nullable=False) tasks_validated = db.Column(db.Integer, default=0, nullable=False) tasks_bad_imagery = db.Column(db.Integer, default=0, nullable=False) # Mapped Objects tasks = db.relationship(Task, backref='projects', cascade="all, delete, delete-orphan", lazy='dynamic') project_info = db.relationship(ProjectInfo, lazy='dynamic', 
cascade='all') author = db.relationship(User) allowed_users = db.relationship(User, secondary=project_allowed_users) priority_areas = db.relationship(PriorityArea, secondary=project_priority_areas, cascade="all, delete-orphan", single_parent=True) def create_draft_project(self, draft_project_dto: DraftProjectDTO): """ Creates a draft project :param draft_project_dto: DTO containing draft project details :param aoi: Area of Interest for the project (eg boundary of project) """ self.project_info.append( ProjectInfo.create_from_name(draft_project_dto.project_name)) self.status = ProjectStatus.DRAFT.value self.author_id = draft_project_dto.user_id self.last_updated = timestamp() def set_project_aoi(self, draft_project_dto: DraftProjectDTO): """ Sets the AOI for the supplied project """ aoi_geojson = geojson.loads( json.dumps(draft_project_dto.area_of_interest)) aoi_geometry = GridService.merge_to_multi_polygon(aoi_geojson, dissolve=True) valid_geojson = geojson.dumps(aoi_geometry) self.geometry = ST_SetSRID(ST_GeomFromGeoJSON(valid_geojson), 4326) self.centroid = ST_Centroid(self.geometry) def set_default_changeset_comment(self): """ Sets the default changeset comment""" default_comment = current_app.config['DEFAULT_CHANGESET_COMMENT'] self.changeset_comment = f'{default_comment}-{self.id}' self.save() def create(self): """ Creates and saves the current model to the DB """ db.session.add(self) db.session.commit() def save(self): """ Save changes to db""" db.session.commit() @staticmethod def clone(project_id: int, author_id: int): """ Clone project """ cloned_project = Project.get(project_id) # Remove clone from session so we can reinsert it as a new object db.session.expunge(cloned_project) make_transient(cloned_project) # Re-initialise counters and meta-data cloned_project.total_tasks = 0 cloned_project.tasks_mapped = 0 cloned_project.tasks_validated = 0 cloned_project.tasks_bad_imagery = 0 cloned_project.last_updated = timestamp() cloned_project.created = 
timestamp() cloned_project.author_id = author_id cloned_project.status = ProjectStatus.DRAFT.value cloned_project.id = None # Reset ID so we get a new ID when inserted cloned_project.geometry = None cloned_project.centroid = None db.session.add(cloned_project) db.session.commit() # Now add the project info, we have to do it in a two stage commit because we need to know the new project id original_project = Project.get(project_id) for info in original_project.project_info: db.session.expunge(info) make_transient( info ) # Must remove the object from the session or it will be updated rather than inserted info.id = None info.project_id_str = str(cloned_project.id) cloned_project.project_info.append(info) # Now add allowed users now we know new project id, if there are any for user in original_project.allowed_users: cloned_project.allowed_users.append(user) db.session.add(cloned_project) db.session.commit() return cloned_project @staticmethod def get(project_id: int): """ Gets specified project :param project_id: project ID in scope :return: Project if found otherwise None """ return Project.query.get(project_id) def update(self, project_dto: ProjectDTO): """ Updates project from DTO """ self.status = ProjectStatus[project_dto.project_status].value self.priority = ProjectPriority[project_dto.project_priority].value self.default_locale = project_dto.default_locale self.enforce_mapper_level = project_dto.enforce_mapper_level self.enforce_validator_role = project_dto.enforce_validator_role self.private = project_dto.private self.mapper_level = MappingLevel[ project_dto.mapper_level.upper()].value self.entities_to_map = project_dto.entities_to_map self.changeset_comment = project_dto.changeset_comment self.due_date = project_dto.due_date self.imagery = project_dto.imagery self.josm_preset = project_dto.josm_preset self.last_updated = timestamp() self.license_id = project_dto.license_id if project_dto.organisation_tag: org_tag = 
Tags.upsert_organistion_tag(project_dto.organisation_tag)
            self.organisation_tag = org_tag
        else:
            # Set to None, for cases where a tag could have been removed
            self.organisation_tag = None

        if project_dto.campaign_tag:
            camp_tag = Tags.upsert_campaign_tag(project_dto.campaign_tag)
            self.campaign_tag = camp_tag
        else:
            # Set to None, for cases where a tag could have been removed
            self.campaign_tag = None

        # Cast MappingType strings to an int array
        type_array = []
        for mapping_type in project_dto.mapping_types:
            type_array.append(MappingTypes[mapping_type].value)
        self.mapping_types = type_array

        # Add the list of allowed users, meaning the project can only be
        # mapped by users in this list
        if hasattr(project_dto, 'allowed_users'):
            self.allowed_users = []  # Clear existing relationships then re-insert
            for user in project_dto.allowed_users:
                self.allowed_users.append(user)

        # Set Project Info for all returned locales
        for dto in project_dto.project_info_locales:
            project_info = self.project_info.filter_by(
                locale=dto.locale).one_or_none()
            if project_info is None:
                # Can't find info, so it must be a new locale
                new_info = ProjectInfo.create_from_dto(dto)
                self.project_info.append(new_info)
            else:
                project_info.update_from_dto(dto)

        # Always clear Priority Areas prior to updating
        self.priority_areas = []
        if project_dto.priority_areas:
            for priority_area in project_dto.priority_areas:
                pa = PriorityArea.from_dict(priority_area)
                self.priority_areas.append(pa)

        db.session.commit()

    def delete(self):
        """ Deletes the current model from the DB """
        db.session.delete(self)
        db.session.commit()

    def can_be_deleted(self) -> bool:
        """ Projects can be deleted if they have no mapped work """
        task_count = self.tasks.filter(
            Task.task_status != TaskStatus.READY.value).count()
        return task_count == 0

    def get_locked_tasks_for_user(self, user_id: int):
        """ Gets IDs of tasks on the project locked by the specified user """
        tasks = self.tasks.filter_by(locked_by=user_id)
        locked_tasks = []
        for task in tasks:
            locked_tasks.append(task.id)
        return locked_tasks

    def get_locked_tasks_details_for_user(self, user_id: int):
        """ Gets tasks on the project locked by the specified user """
        tasks = self.tasks.filter_by(locked_by=user_id)
        locked_tasks = []
        for task in tasks:
            locked_tasks.append(task)
        return locked_tasks

    @staticmethod
    def get_projects_for_admin(admin_id: int,
                               preferred_locale: str) -> PMDashboardDTO:
        """ Get projects for admin """
        admins_projects = Project.query.filter_by(author_id=admin_id).all()
        if admins_projects is None:
            raise NotFound('No projects found for admin')

        admin_projects_dto = PMDashboardDTO()
        for project in admins_projects:
            pm_project = project.get_project_summary(preferred_locale)
            project_status = ProjectStatus(project.status)
            if project_status == ProjectStatus.DRAFT:
                admin_projects_dto.draft_projects.append(pm_project)
            elif project_status == ProjectStatus.PUBLISHED:
                admin_projects_dto.active_projects.append(pm_project)
            elif project_status == ProjectStatus.ARCHIVED:
                admin_projects_dto.archived_projects.append(pm_project)
            else:
                current_app.logger.error(
                    f'Unexpected state project {project.id}')
        return admin_projects_dto

    def get_project_summary(self, preferred_locale) -> ProjectSummary:
        """ Create a ProjectSummary model for a PostGIS project object """
        summary = ProjectSummary()
        summary.project_id = self.id
        summary.campaign_tag = self.campaign_tag
        summary.created = self.created
        summary.last_updated = self.last_updated
        summary.mapper_level = MappingLevel(self.mapper_level).name
        summary.organisation_tag = self.organisation_tag
        summary.status = ProjectStatus(self.status).name

        centroid_geojson = db.session.scalar(self.centroid.ST_AsGeoJSON())
        summary.aoi_centroid = geojson.loads(centroid_geojson)

        summary.percent_mapped = int(
            (self.tasks_mapped /
             (self.total_tasks - self.tasks_bad_imagery)) * 100)
        summary.percent_validated = int(
            ((self.tasks_validated + self.tasks_bad_imagery) /
             self.total_tasks) * 100)

        project_info = ProjectInfo.get_dto_for_locale(self.id,
preferred_locale, self.default_locale) summary.name = project_info.name summary.short_description = project_info.short_description return summary def get_aoi_geometry_as_geojson(self): """ Helper which returns the AOI geometry as a geojson object """ aoi_geojson = db.engine.execute(self.geometry.ST_AsGeoJSON()).scalar() return geojson.loads(aoi_geojson) @staticmethod @cached(active_mappers_cache) def get_active_mappers(project_id) -> int: """ Get count of Locked tasks as a proxy for users who are currently active on the project """ return Task.query \ .filter(Task.task_status == TaskStatus.LOCKED_FOR_MAPPING.value) \ .filter(Task.project_id == project_id) \ .count() def _get_project_and_base_dto(self): """ Populates a project DTO with properties common to all roles """ base_dto = ProjectDTO() base_dto.project_id = self.id base_dto.project_status = ProjectStatus(self.status).name base_dto.default_locale = self.default_locale base_dto.project_priority = ProjectPriority(self.priority).name base_dto.area_of_interest = self.get_aoi_geometry_as_geojson() base_dto.enforce_mapper_level = self.enforce_mapper_level base_dto.enforce_validator_role = self.enforce_validator_role base_dto.private = self.private base_dto.mapper_level = MappingLevel(self.mapper_level).name base_dto.entities_to_map = self.entities_to_map base_dto.changeset_comment = self.changeset_comment base_dto.due_date = self.due_date base_dto.imagery = self.imagery base_dto.josm_preset = self.josm_preset base_dto.campaign_tag = self.campaign_tag base_dto.organisation_tag = self.organisation_tag base_dto.license_id = self.license_id base_dto.last_updated = self.last_updated base_dto.author = User().get_by_id(self.author_id).username base_dto.active_mappers = Project.get_active_mappers(self.id) if self.private: # If project is private it should have a list of allowed users allowed_usernames = [] for user in self.allowed_users: allowed_usernames.append(user.username) base_dto.allowed_usernames = allowed_usernames 
if self.mapping_types: mapping_types = [] for mapping_type in self.mapping_types: mapping_types.append(MappingTypes(mapping_type).name) base_dto.mapping_types = mapping_types if self.priority_areas: geojson_areas = [] for priority_area in self.priority_areas: geojson_areas.append(priority_area.get_as_geojson()) base_dto.priority_areas = geojson_areas return self, base_dto def as_dto_for_mapping(self, locale: str) -> Optional[ProjectDTO]: """ Creates a Project DTO suitable for transmitting to mapper users """ project, project_dto = self._get_project_and_base_dto() project_dto.tasks = Task.get_tasks_as_geojson_feature_collection( self.id) project_dto.project_info = ProjectInfo.get_dto_for_locale( self.id, locale, project.default_locale) return project_dto def all_tasks_as_geojson(self): """ Creates a geojson of all areas """ project_tasks = Task.get_tasks_as_geojson_feature_collection(self.id) return project_tasks def as_dto_for_admin(self, project_id): """ Creates a Project DTO suitable for transmitting to project admins """ project, project_dto = self._get_project_and_base_dto() if project is None: return None project_dto.project_info_locales = ProjectInfo.get_dto_for_all_locales( project_id) return project_dto
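`Project.clone` relies on a SQLAlchemy idiom: expunge the loaded row from the session, call `make_transient` so the ORM forgets its persisted identity, reset the primary key, then re-add it so the flush performs an INSERT instead of an UPDATE. A minimal standalone sketch of that pattern, using a throwaway SQLite model (the `Widget` table and names are invented for illustration; assumes SQLAlchemy 1.4+):

```python
# Sketch of the expunge / make_transient cloning pattern used by
# Project.clone, against an in-memory SQLite database.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, make_transient, sessionmaker

Base = declarative_base()


class Widget(Base):
    __tablename__ = 'widgets'
    id = Column(Integer, primary_key=True)
    name = Column(String)


engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add(Widget(name='original'))
session.commit()

original = session.query(Widget).first()
session.expunge(original)   # Detach the instance from the session...
make_transient(original)    # ...and forget its persisted identity.
original.id = None          # Reset the PK so the INSERT assigns a new one.
original.name = 'clone'
session.add(original)
session.commit()

names = sorted(w.name for w in session.query(Widget))
# names == ['clone', 'original']: the table now holds both rows.
```

This is also why `clone` needs a second commit for `project_info`: the child rows can only reference the clone's new id after the first INSERT has assigned it.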
class Observation(Base):
    """Observation data class.

    Derive all surveys from this class.  See ExampleSurvey for an example.

    """

    # Override the following in derived classes.
    __tablename__: str = 'observation'
    __data_source_name__ = 'All data sources'
    __obscode__ = '500'  # MPC observatory code

    # Required attributes for basic sbsearch functionality.
    observation_id: int = Column(BigInteger, primary_key=True)
    source: str = Column(String(64), default='observation',
                         doc='source survey')
    mjd_start: float = Column(Float(32), nullable=False, index=True,
                              doc='shutter open, modified Julian date, UTC')
    mjd_stop: float = Column(Float(32), nullable=False, index=True,
                             doc='shutter close, modified Julian date, UTC')
    fov: str = Column(
        String(128), nullable=False,
        doc=('field of view as set of comma-separated RA:Dec pairs in degrees,'
             'e.g., "1:1, 1:2, 2:1" (tip: see set_fov)'))
    spatial_terms: List[str] = Column(ARRAY(Text), nullable=False,
                                      doc='spatial index terms')

    # Common attributes.  Additional attributes and data-set-specific
    # attributes are defined in each data set object.
    filter: str = Column(String(16), doc='filter/bandpass')
    exposure: float = Column(Float(32), doc='exposure time, s')
    seeing: float = Column(Float(32), doc='point source FWHM, arcsec')
    airmass: float = Column(Float(32), doc='observation airmass')
    maglimit: float = Column(Float(32), doc='detection limit, mag')

    # Class methods.
    def set_fov(self, ra: Union[List[float], np.ndarray],
                dec: Union[List[float], np.ndarray]) -> None:
        """Set ``fov`` with these vertices.

        Parameters
        ----------
        ra, dec : arrays of float
            Vertices expressed in degrees.

        """
        if len(ra) > 4:
            raise ValueError('No more than 4 vertices are allowed.')

        values = []
        for i in range(len(ra)):
            values.append(f'{ra[i]:.6f}:{dec[i]:.6f}')
        self.fov = ','.join(values)

    def __repr__(self) -> str:
        return (f'<{self.__class__.__name__}'
                f' observation_id={self.observation_id},'
                f' source={repr(self.source)}, fov={repr(self.fov)}'
                f' mjd_start={self.mjd_start} mjd_stop={self.mjd_stop}>')

    __mapper_args__ = {
        "polymorphic_identity": "observation",
        "polymorphic_on": source
    }

    __table_args__ = (
        Index('ix_observation_source_mjd_start', 'source', 'mjd_start'),
        Index('ix_observation_source_mjd_stop', 'source', 'mjd_stop'),
    )
class SourcePackage(Base):
    '''
    Data of a source package.
    '''
    __tablename__ = 'archive_src_packages'

    uuid = Column(UUID(as_uuid=True), primary_key=True, default=None,
                  nullable=False)
    # The unique identifier for the whole source packaging project
    # (stays the same even if the package version changes)
    source_uuid = Column(UUID(as_uuid=True), default=None, nullable=False)

    name = Column(String(256))  # Source package name
    version = Column(DebVersion())  # Version of this package

    repo_id = Column(Integer, ForeignKey('archive_repositories.id'))
    repo = relationship('ArchiveRepository')

    suites = relationship('ArchiveSuite', secondary=srcpkg_suite_assoc_table,
                          back_populates='src_packages')  # Suites this package is in

    component_id = Column(Integer, ForeignKey('archive_components.id'))
    component = relationship('ArchiveComponent')  # Component this package is in

    # List of architectures this source package can be built for
    architectures = Column(ARRAY(String(64)))

    _binaries_json = Column('binaries', JSON)

    standards_version = Column(String(256))
    format_version = Column(String(64))

    homepage = Column(Text())
    vcs_browser = Column(Text())

    maintainer = Column(Text())
    uploaders = Column(ARRAY(Text()))

    build_depends = Column(ARRAY(Text()))

    files = relationship('ArchiveFile', back_populates='srcpkg',
                         cascade='all, delete, delete-orphan')
    directory = Column(Text())

    _binaries = None

    @property
    def binaries(self):
        # Lazily deserialize the JSON column and cache the result.
        if self._binaries:
            return self._binaries
        data = json.loads(self._binaries_json)
        res = []
        for e in data:
            info = PackageInfo()
            info.deb_type = e.get('deb_type', DebType.DEB)
            info.name = e.get('name')
            info.version = e.get('version')
            info.section = e.get('section')
            info.priority = e.get('priority', PackagePriority.UNKNOWN)
            info.architectures = e.get('architectures')
            res.append(info)
        self._binaries = res
        return res

    @binaries.setter
    def binaries(self, value):
        if not isinstance(value, list):
            value = [value]
        data = []
        for v in value:
            d = {'deb_type': v.deb_type,
                 'name': v.name,
                 'version': v.version,
                 'section': v.section,
                 'priority': v.priority,
                 'architectures': v.architectures}
            data.append(d)
        self._binaries_json = json.dumps(data)
        self._binaries = None  # Force the data to be re-loaded from JSON

    @staticmethod
    def generate_uuid(repo_name, name, version):
        return uuid.uuid5(UUID_NS_SRCPACKAGE,
                          '{}::source/{}/{}'.format(repo_name, name, version))

    @staticmethod
    def generate_source_uuid(repo_name, name):
        return uuid.uuid5(UUID_NS_SRCPACKAGE,
                          '{}::source/{}'.format(repo_name, name))

    def update_uuid(self):
        if not self.repo:
            raise Exception(
                'Source package is not associated with a repository!')

        self.update_source_uuid()
        self.uuid = SourcePackage.generate_uuid(self.repo.name, self.name,
                                                self.version)
        return self.uuid

    def update_source_uuid(self):
        if not self.repo:
            raise Exception(
                'Source package is not associated with a repository!')

        self.source_uuid = SourcePackage.generate_source_uuid(
            self.repo.name, self.name)
        return self.source_uuid

    def __str__(self):
        repo_name = '?'
        if self.repo:
            repo_name = self.repo.name
        return '{}::source/{}/{}'.format(repo_name, self.name, self.version)
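`generate_uuid` and `generate_source_uuid` derive deterministic, name-based identifiers with `uuid.uuid5`, so the same repository/name/version triple always maps to the same primary key. A minimal sketch (the real `UUID_NS_SRCPACKAGE` namespace constant is not shown above, so a made-up namespace stands in here):

```python
# Sketch of the deterministic uuid5 scheme used by generate_uuid.
# UUID_NS_SRCPACKAGE below is an invented stand-in for the real constant.
import uuid

UUID_NS_SRCPACKAGE = uuid.uuid5(uuid.NAMESPACE_DNS, 'srcpackage.example.org')


def generate_uuid(repo_name, name, version):
    return uuid.uuid5(UUID_NS_SRCPACKAGE,
                      '{}::source/{}/{}'.format(repo_name, name, version))


a = generate_uuid('master', 'hello', '2.10-1')
b = generate_uuid('master', 'hello', '2.10-1')
c = generate_uuid('master', 'hello', '2.10-2')
# a == b (same inputs hash identically); a != c (the version is part of
# the hashed name, so a new upload gets a new uuid).
```

This is why `update_uuid` must be re-run whenever the version or repository changes, while `source_uuid`, which omits the version from the hashed name, stays stable across uploads of the same package.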
class Item(BaseMetadata, LocalRolesMixin, Mixin, VersionMixin, Base): """Model class to be used as base for all first level class models.""" __tablename__ = 'items' path = sa.Column(ARRAY(UUID(as_uuid=True)), nullable=False, index=True, info={ 'colanderalchemy': { 'title': 'Path', 'missing': colander.drop, 'typ': colander.List } }) """List of all parent objects including itself.""" type = sa.Column(sa.String(50), index=True, info={ 'colanderalchemy': { 'title': 'type', 'missing': colander.drop, 'typ': colander.String } }) """Polymorphic type.""" @declared_attr def __mapper_args__(cls) -> dict: """Return polymorphic identity.""" cls_name = cls.__name__.lower() args = { 'polymorphic_identity': cls_name, } if cls_name == 'item': args['polymorphic_on'] = cls.type return args @classmethod def create(cls, payload: dict) -> 'Item': """Factory that creates a new instance of this object. :param payload: Dictionary containing attributes and values :type payload: dict """ # we are going to change the payload so we need to avoid side effects payload = deepcopy(payload) # add local roles can_view using payload, actors and special attribute from the class can_view = payload.get('can_view', []) payload['can_view'] = list( set(can_view).union(cls._default_can_view())) actors_data = { actor: payload.pop(actor) for actor in cls.__actors__ if actor in payload } obj_id = payload.setdefault('id', uuid.uuid4()) if isinstance(obj_id, str): obj_id = uuid.UUID(obj_id) # look for a parent id get the parent instance parent_attr = getattr(cls, '__parent_attr__', None) path = [] parent_id = payload.get(parent_attr, None) if parent_attr else None if parent_id: parent = Item.get(parent_id) path = list(parent.path) path.append(obj_id) payload['path'] = path # create and add to the session the new instance obj = cls(**payload) session = obj.__session__ session.add(obj) session.flush() # add local roles using update method if actors_data: obj.update(actors_data) # TODO: fire object created event 
here? return obj def update(self, values: dict): """Update the object with given values. This implementation take care of update local role attributes. :param values: Dictionary containing attributes and values :type values: dict """ actors = self.__class__.__actors__ for key, value in values.items(): if key not in actors: setattr(self, key, value) else: set_local_roles_by_role_name(self, key, value) def to_dict(self, excludes: Attributes = None, includes: Attributes = None) -> dict: """Return a dictionary with fields and values used by this Class. :param excludes: attributes to exclude from dict representation. :param includes: attributes to include from dict representation. :returns: Dictionary with fields and values used by this Class """ data = super().to_dict(excludes=excludes, includes=includes) roles = {} for lr in self._all_local_roles.all(): principal_id = lr.principal_id if lr.role_name not in roles: roles[lr.role_name] = [principal_id] else: roles[lr.role_name].append(principal_id) data['_roles'] = roles return data @declared_attr def _all_local_roles(cls): """All local roles for this Item using all parent objects in path.""" return sa.orm.relationship( 'LocalRole', foreign_keys='LocalRole.item_id', primaryjoin='LocalRole.item_id==any_(Item.path)', order_by='asc(LocalRole.role_name)', cascade='all, delete-orphan', lazy='dynamic', info={ 'colanderalchemy': { 'title': 'All local roles: including from parent objects.', 'missing': colander.drop, } }) def __repr__(self) -> str: """Representation model Item.""" template = "<{0}(id='{1}' state='{2}' created='{3}' updated='{4}' type='{5}')>" return template.format(self.__class__.__name__, self.id, self.state, self.created_at, self.updated_at, self.type)
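`Item.create` maintains a materialized path: each object's `path` column holds the ids of all its ancestors plus its own id, built by appending the new id to the parent's path. A pure-Python sketch of that bookkeeping and the descendant test it enables (helper names are for illustration only):

```python
# Sketch of the materialized-path arrays Item.create stores in `path`.
import uuid


def child_path(parent_path, obj_id):
    """Extend a parent's path with the new object's id."""
    return list(parent_path) + [obj_id]


def is_descendant(path, ancestor_id):
    """An item descends from `ancestor_id` iff that id occurs among its ancestors."""
    return ancestor_id in path[:-1]


root_id, folder_id, doc_id = (uuid.uuid4() for _ in range(3))
root_path = [root_id]
folder_path = child_path(root_path, folder_id)
doc_path = child_path(folder_path, doc_id)
# doc_path == [root_id, folder_id, doc_id]
```

Storing the full ancestor list per row is what lets `_all_local_roles` join with `LocalRole.item_id==any_(Item.path)`: one query picks up roles granted on the item itself and on every object above it.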
class BinaryPackage(Base):
    '''
    Data of a binary package.
    '''
    __tablename__ = 'archive_bin_packages'

    uuid = Column(UUID(as_uuid=True), primary_key=True, default=None, nullable=False)
    deb_type = Column(Enum(DebType))  # Deb package type
    name = Column(String(256))  # Package name
    version = Column(DebVersion())  # Version of this package

    repo_id = Column(Integer, ForeignKey('archive_repositories.id'))
    repo = relationship('ArchiveRepository')

    suites = relationship('ArchiveSuite', secondary=binpkg_suite_assoc_table,
                          back_populates='bin_packages')  # Suites this package is in

    component_id = Column(Integer, ForeignKey('archive_components.id'))
    component = relationship('ArchiveComponent')  # Component this package is in

    architecture_id = Column(Integer, ForeignKey('archive_architectures.id'))
    architecture = relationship('ArchiveArchitecture')  # Architecture this binary was built for

    size_installed = Column(Integer())  # Size of the installed package

    description = Column(Text())
    description_md5 = Column(CHAR(32))

    source_name = Column(String(256))
    source_version = Column(DebVersion())

    priority = Column(Enum(PackagePriority))
    section = Column(String(64))

    depends = Column(ARRAY(Text()))
    pre_depends = Column(ARRAY(Text()))

    maintainer = Column(Text())
    homepage = Column(Text())

    bin_file = relationship('ArchiveFile', uselist=False, back_populates='binpkg',
                            cascade='all, delete, delete-orphan')
    sw_cpts = relationship('SoftwareComponent', secondary=swcpt_binpkg_assoc_table,
                           back_populates='bin_packages')

    __ts_vector__ = create_tsvector(
        cast(func.coalesce(name, ''), TEXT),
        cast(func.coalesce(description, ''), TEXT),
        cast(func.coalesce(source_name, ''), TEXT))
    __table_args__ = (Index('idx_bin_package_fts', __ts_vector__, postgresql_using='gin'), )

    @staticmethod
    def generate_uuid(repo_name, name, version, arch_name):
        return uuid.uuid5(UUID_NS_BINPACKAGE,
                          '{}::{}/{}/{}'.format(repo_name, name, version, arch_name))

    def update_uuid(self):
        if not self.repo:
            raise Exception('Binary package is not associated with a repository!')
        self.uuid = BinaryPackage.generate_uuid(self.repo.name, self.name,
                                                self.version, self.architecture.name)
        return self.uuid

    def __str__(self):
        repo_name = '?'
        if self.repo:
            repo_name = self.repo.name
        arch_name = 'unknown'
        if self.architecture:
            arch_name = self.architecture.name
        return '{}::{}/{}/{}'.format(repo_name, self.name, self.version, arch_name)
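`BinaryPackage` derives its primary key deterministically from the repository, name, version, and architecture via `uuid.uuid5`, so re-importing the same package is idempotent. A minimal stdlib sketch, with a placeholder namespace (the real `UUID_NS_BINPACKAGE` constant is project-specific):

```python
import uuid

# Hypothetical namespace; the real UUID_NS_BINPACKAGE is defined elsewhere.
UUID_NS_BINPACKAGE = uuid.uuid5(uuid.NAMESPACE_DNS, 'binpackage.example.org')

def generate_uuid(repo_name: str, name: str, version: str, arch_name: str) -> uuid.UUID:
    """Derive a stable UUIDv5 from the package's identifying fields."""
    key = '{}::{}/{}/{}'.format(repo_name, name, version, arch_name)
    return uuid.uuid5(UUID_NS_BINPACKAGE, key)

# Same inputs always map to the same UUID; any field change yields a new one.
a = generate_uuid('main', 'bash', '5.1-2', 'amd64')
b = generate_uuid('main', 'bash', '5.1-2', 'amd64')
c = generate_uuid('main', 'bash', '5.1-3', 'amd64')
```

Name-based UUIDs avoid a round-trip to the database to discover whether a row already exists: the key can be computed client-side before the insert.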
class SousChef(db.Model):

    __tablename__ = 'sous_chefs'
    __module__ = 'newslynx.models.sous_chef'

    id = db.Column(db.Integer, unique=True, index=True, primary_key=True)
    org_id = db.Column(db.Integer, db.ForeignKey('orgs.id'), index=True)
    name = db.Column(db.Text, index=True, unique=True)
    slug = db.Column(db.Text, index=True, unique=True)
    description = db.Column(db.Text)
    runs = db.Column(db.Text)
    filepath = db.Column(db.Text)
    is_command = db.Column(db.Boolean)
    creates = db.Column(ENUM(*SOUS_CHEF_CREATES, name='sous_chef_creates_enum'), index=True)
    option_order = db.Column(ARRAY(db.Text))
    requires_auths = db.Column(ARRAY(db.Text))
    requires_settings = db.Column(ARRAY(db.Text))
    options = db.Column(JSON)
    metrics = db.Column(JSON)

    def __init__(self, **kw):
        # set columns
        self.org_id = kw.get('org_id')
        self.name = kw.get('name')
        self.slug = slug(kw.get('slug', kw['name']))
        self.description = kw.get('description')
        self.runs = kw.get('runs')
        self.filepath = kw.get('filepath')
        self.is_command = kw.get('is_command')
        if self.is_command:
            self.filepath = kw.get('runs')
        self.creates = kw.get('creates', 'null')
        if not self.creates:
            self.creates = 'null'
        # NB: these must match the column names above ('requires_*', not
        # 'required_*'); otherwise to_dict() and config always see None.
        self.requires_auths = kw.get('requires_auths', [])
        self.requires_settings = kw.get('requires_settings', [])
        self.option_order = kw.get('option_order', [])
        self.options = kw.get('options', {})
        self.metrics = kw.get('metrics', {})

    def to_dict(self, **kw):
        incl_options = kw.get('incl_options', True)
        d = {
            'id': self.id,
            'name': self.name,
            'slug': self.slug,
            'filepath': self.filepath,
            'description': self.description,
            'runs': self.runs,
            'is_command': self.is_command,
            'creates': self.creates,
            'requires_auths': self.requires_auths,
            'requires_settings': self.requires_settings
        }
        if incl_options:
            d['options'] = self.ordered_options
            d['option_order'] = self.option_order
        if self.creates:
            if 'metrics' in self.creates:
                d['metrics'] = self.metrics
        return d

    @property
    def config(self):
        """
        The original configuration representation.
        """
        return {
            'name': self.name,
            'slug': self.slug,
            'description': self.description,
            'runs': self.runs,
            'creates': self.creates,
            'requires_auths': self.requires_auths,
            'requires_settings': self.requires_settings,
            'option_order': self.option_order,
            'options': self.options,
            'metrics': self.metrics,
        }

    @property
    def ordered_options(self):
        """
        Optionally order options by specific keys.
        """
        if len(self.option_order):
            sort_order = {k: i for i, k in enumerate(self.option_order)}
            # keys missing from option_order sort last; a None default here
            # would raise a TypeError under Python 3
            return OrderedDict(
                sorted(self.options.items(),
                       key=lambda k: sort_order.get(k[0], len(sort_order))))
        return self.options

    def __repr__(self):
        return '<SousChef %r >' % (self.slug)
class SoftwareComponent(Base):
    '''
    Description of a software component as described by the
    AppStream specification.
    '''
    __tablename__ = 'archive_sw_components'

    uuid = Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
    kind = Column(Integer())  # The component type

    cid = Column(Text())  # The component ID of this software
    gcid = Column(Text())  # The global component ID as used by appstream-generator

    name = Column(Text())  # Name of this component
    summary = Column(Text())  # Short description of this component
    description = Column(Text())  # Description of this component

    icon_name = Column(String(256))  # Name of the primary cached icon of this component

    is_free = Column(Boolean(), default=False)  # Whether this component is "free as in freedom" software
    project_license = Column(Text())  # License of this software
    developer_name = Column(Text())  # Name of the developer of this software

    supports_touch = Column(Boolean(), default=False)  # Whether this component supports touch input

    categories = Column(ARRAY(String(256)))  # Categories this component is in

    bin_packages = relationship('BinaryPackage', secondary=swcpt_binpkg_assoc_table,
                                order_by='desc(BinaryPackage.version)',
                                back_populates='sw_cpts')  # Packages this software component is contained in

    flatpakref_uuid = Column(UUID(as_uuid=True), ForeignKey('flatpak_refs.uuid'))
    flatpakref = relationship('FlatpakRef')

    xml = Column(Text())  # XML representation in AppStream collection XML for this component

    __ts_vector__ = create_tsvector(
        cast(func.coalesce(name, ''), TEXT),
        cast(func.coalesce(summary, ''), TEXT),
        cast(func.coalesce(description, ''), TEXT))
    __table_args__ = (Index('idx_sw_components_fts', __ts_vector__, postgresql_using='gin'), )

    cpt = None

    def update_uuid(self):
        '''
        Update the unique identifier for this component.
        '''
        if not self.gcid and not self.xml:
            raise Exception('Global component ID is not set for this component, '
                            'and no XML data was found for it. Can not create UUID.')
        self.uuid = uuid.uuid5(UUID_NS_SWCOMPONENT,
                               self.gcid if self.gcid else self.xml)
        return self.uuid

    def load(self, context=None):
        '''
        Load the actual AppStream component from stored XML data.
        An existing AppStream Context instance can be reused.
        '''
        # return the AppStream component if we already have it
        if self.cpt:
            return self.cpt

        # set up the context
        if not context:
            import gi
            gi.require_version('AppStream', '1.0')
            from gi.repository import AppStream
            context = AppStream.Context()
            context.set_style(AppStream.FormatStyle.COLLECTION)

        if not self.xml:
            raise Exception('Can not load AppStream component from empty data.')

        self.cpt = AppStream.Component()
        self.cpt.load_from_xml_data(context, self.xml)
        self.cpt.set_active_locale('C')
        return self.cpt
from schematics.exceptions import ValidationError

metadata = sa.MetaData()
app = Flask(__name__)


# sa #
users_table = sa.Table(
    'users', metadata,
    sa.Column('id', sa.Integer, primary_key=True),
    sa.Column('email', sa.String(255), unique=True),
    sa.Column('access_token', sa.String(30), unique=True))

posts_table = sa.Table(
    'posts', metadata,
    sa.Column('id', sa.Integer, primary_key=True),
    sa.Column('title', sa.String(255), unique=True),
    sa.Column('markdown_body', sa.String()),
    sa.Column('tags', ARRAY(sa.String(), dimensions=1)),
    sa.Column('post_by', sa.ForeignKey("users.id")),
    sa.Column('created_at', sa.DateTime()))


# Validations #
class PostValidator(Model):
    title = StringType(max_length=255, min_length=5, required=True)
    markdown_body = StringType(required=True)
    tags = ListType(StringType)


# Redis #
def is_token_in_cache(token):
    """Check token is in cache. """
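The constraints that `PostValidator` declares (title required and 5–255 characters, body required, tags an optional list of strings) can be expressed without the schematics dependency. The following is a hypothetical stdlib-only stand-in, not the schematics API itself, returning the names of failing fields:

```python
def validate_post(data: dict) -> list:
    """Return the names of fields that violate PostValidator's constraints."""
    errors = []
    title = data.get('title')
    if not isinstance(title, str) or not (5 <= len(title) <= 255):
        errors.append('title')  # required, min_length=5, max_length=255
    if not isinstance(data.get('markdown_body'), str):
        errors.append('markdown_body')  # required
    tags = data.get('tags')
    if tags is not None and not all(isinstance(t, str) for t in tags):
        errors.append('tags')  # optional list of strings
    return errors

ok = validate_post({'title': 'Hello world', 'markdown_body': '# hi', 'tags': ['a']})
bad = validate_post({'title': 'Hi'})
```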
class Collection(db.Model, IdModel, SoftDeleteModel):
    """A set of documents and entities against which access control is
    enforced."""

    # Category schema for collections.
    # TODO: should this be configurable?
    CATEGORIES = {
        "news": lazy_gettext("News archives"),
        "leak": lazy_gettext("Leaks"),
        "land": lazy_gettext("Land registry"),
        "gazette": lazy_gettext("Gazettes"),
        "court": lazy_gettext("Court archives"),
        "company": lazy_gettext("Company registries"),
        "sanctions": lazy_gettext("Sanctions lists"),
        "procurement": lazy_gettext("Procurement"),
        "finance": lazy_gettext("Financial records"),
        "grey": lazy_gettext("Grey literature"),
        "library": lazy_gettext("Document libraries"),
        "license": lazy_gettext("Licenses and concessions"),
        "regulatory": lazy_gettext("Regulatory filings"),
        "poi": lazy_gettext("Persons of interest"),
        "customs": lazy_gettext("Customs declarations"),
        "census": lazy_gettext("Population census"),
        "transport": lazy_gettext("Air and maritime registers"),
        "casefile": lazy_gettext("Investigations"),
        "other": lazy_gettext("Other material"),
    }
    CASEFILE = "casefile"

    # How often a collection is updated:
    FREQUENCIES = {
        "unknown": lazy_gettext("not known"),
        "never": lazy_gettext("not updated"),
        "daily": lazy_gettext("daily"),
        "weekly": lazy_gettext("weekly"),
        "monthly": lazy_gettext("monthly"),
        "annual": lazy_gettext("annual"),
    }
    DEFAULT_FREQUENCY = "unknown"

    label = db.Column(db.Unicode)
    summary = db.Column(db.Unicode, nullable=True)
    category = db.Column(db.Unicode, nullable=True)
    frequency = db.Column(db.Unicode, nullable=True)
    countries = db.Column(ARRAY(db.Unicode()), default=[])
    languages = db.Column(ARRAY(db.Unicode()), default=[])
    foreign_id = db.Column(db.Unicode, unique=True, nullable=False)
    publisher = db.Column(db.Unicode, nullable=True)
    publisher_url = db.Column(db.Unicode, nullable=True)
    info_url = db.Column(db.Unicode, nullable=True)
    data_url = db.Column(db.Unicode, nullable=True)

    # This collection is marked as super-secret:
    restricted = db.Column(db.Boolean, default=False)

    # Run xref on entity changes:
    xref = db.Column(db.Boolean, default=False)

    creator_id = db.Column(db.Integer, db.ForeignKey("role.id"), nullable=True)
    creator = db.relationship(Role)

    def touch(self):
        # https://www.youtube.com/watch?v=wv-34w8kGPM
        self.updated_at = datetime.utcnow()
        db.session.add(self)

    def update(self, data, authz):
        self.label = data.get("label", self.label)
        self.summary = data.get("summary", self.summary)
        self.publisher = data.get("publisher", self.publisher)
        self.publisher_url = data.get("publisher_url", self.publisher_url)
        if self.publisher_url is not None:
            self.publisher_url = stringify(self.publisher_url)
        self.info_url = data.get("info_url", self.info_url)
        if self.info_url is not None:
            self.info_url = stringify(self.info_url)
        self.data_url = data.get("data_url", self.data_url)
        if self.data_url is not None:
            self.data_url = stringify(self.data_url)
        countries = ensure_list(data.get("countries", self.countries))
        self.countries = [registry.country.clean(val) for val in countries]
        languages = ensure_list(data.get("languages", self.languages))
        self.languages = [registry.language.clean(val) for val in languages]
        self.frequency = data.get("frequency", self.frequency)
        self.restricted = data.get("restricted", self.restricted)
        self.xref = data.get("xref", self.xref)
        # Some fields are editable only by admins in order to have
        # a strict separation between source evidence and case
        # material.
        if authz.is_admin:
            self.category = data.get("category", self.category)
            creator = ensure_dict(data.get("creator"))
            creator_id = data.get("creator_id", creator.get("id"))
            creator = Role.by_id(creator_id)
            if creator is not None:
                self.creator = creator
        self.touch()
        db.session.flush()

    @property
    def team_id(self):
        role = aliased(Role)
        perm = aliased(Permission)
        q = db.session.query(role.id)
        q = q.filter(role.type != Role.SYSTEM)
        q = q.filter(role.id == perm.role_id)
        q = q.filter(perm.collection_id == self.id)
        q = q.filter(perm.read == True)  # noqa
        q = q.filter(role.deleted_at == None)  # noqa
        return [stringify(i) for (i, ) in q.all()]

    @property
    def secret(self):
        q = db.session.query(Permission.id)
        q = q.filter(Permission.role_id.in_(Role.public_roles()))
        q = q.filter(Permission.collection_id == self.id)
        q = q.filter(Permission.read == True)  # noqa
        return q.count() < 1

    @property
    def casefile(self):
        # A casefile is a type of collection which is used to manage the state
        # of an investigation. Unlike normal collections, cases do not serve
        # as source material, but as a mechanism of analysis.
        return self.category == self.CASEFILE

    @property
    def ns(self):
        if not hasattr(self, "_ns"):
            self._ns = Namespace(self.foreign_id)
        return self._ns

    def to_dict(self):
        data = self.to_dict_dates()
        data["category"] = self.CASEFILE
        if self.category in self.CATEGORIES:
            data["category"] = self.category
        data["frequency"] = self.DEFAULT_FREQUENCY
        if self.frequency in self.FREQUENCIES:
            data["frequency"] = self.frequency
        countries = ensure_list(self.countries)
        countries = [registry.country.clean(c) for c in countries]
        data["countries"] = [c for c in countries if c is not None]
        languages = ensure_list(self.languages)
        languages = [registry.language.clean(c) for c in languages]
        data["languages"] = [c for c in languages if c is not None]
        data.update({
            "id": stringify(self.id),
            "collection_id": stringify(self.id),
            "foreign_id": self.foreign_id,
            "creator_id": stringify(self.creator_id),
            "team_id": self.team_id,
            "label": self.label,
            "summary": self.summary,
            "publisher": self.publisher,
            "publisher_url": self.publisher_url,
            "info_url": self.info_url,
            "data_url": self.data_url,
            "casefile": self.casefile,
            "secret": self.secret,
            "xref": self.xref,
            "restricted": self.restricted,
        })
        return data

    @classmethod
    def by_foreign_id(cls, foreign_id, deleted=False):
        if foreign_id is None:
            return
        q = cls.all(deleted=deleted)
        return q.filter(cls.foreign_id == foreign_id).first()

    @classmethod
    def _apply_authz(cls, q, authz):
        if authz is not None and not authz.is_admin:
            q = q.join(Permission, cls.id == Permission.collection_id)
            q = q.filter(Permission.read == True)  # noqa
            q = q.filter(Permission.role_id.in_(authz.roles))
        return q

    @classmethod
    def all_authz(cls, authz, deleted=False):
        q = super(Collection, cls).all(deleted=deleted)
        return cls._apply_authz(q, authz)

    @classmethod
    def all_casefiles(cls, authz=None):
        q = super(Collection, cls).all()
        q = q.filter(Collection.category == cls.CASEFILE)
        return cls._apply_authz(q, authz)

    @classmethod
    def all_by_ids(cls, ids, deleted=False, authz=None):
        q = super(Collection, cls).all_by_ids(ids, deleted=deleted)
        return cls._apply_authz(q, authz)

    @classmethod
    def create(cls, data, authz, created_at=None):
        foreign_id = data.get("foreign_id") or make_textid()
        collection = cls.by_foreign_id(foreign_id, deleted=True)
        if collection is None:
            collection = cls()
            collection.created_at = created_at
            collection.foreign_id = foreign_id
            collection.category = cls.CASEFILE
            collection.creator = authz.role
        collection.update(data, authz)
        collection.deleted_at = None
        if collection.creator is not None:
            Permission.grant(collection, collection.creator, True, True)
        return collection

    def __repr__(self):
        fmt = "<Collection(%r, %r, %r)>"
        return fmt % (self.id, self.foreign_id, self.label)

    def __str__(self):
        return self.foreign_id
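`Collection.to_dict` applies a whitelist-with-default pattern: category and frequency values outside the known vocabularies fall back to a default rather than leaking arbitrary strings into API output. A minimal sketch with an abridged vocabulary (the helper names are illustrative, not part of the model):

```python
# Abridged vocabularies standing in for Collection.CATEGORIES / FREQUENCIES.
CATEGORIES = {"news", "leak", "casefile", "other"}
FREQUENCIES = {"unknown", "never", "daily", "weekly", "monthly", "annual"}

def clean_category(value, default="casefile"):
    """Unknown or missing categories collapse to the default, as in to_dict()."""
    return value if value in CATEGORIES else default

def clean_frequency(value, default="unknown"):
    """Same pattern for the update frequency."""
    return value if value in FREQUENCIES else default
```

Doing the normalization at serialization time means legacy rows with stale values never need a migration: they are simply rendered with the default.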
class Notification(db.Model, IdModel, DatedModel):
    GLOBAL = 'Global'

    _event = db.Column('event', db.String(255), nullable=False)
    channels = db.Column(ARRAY(db.String(255)), index=True)
    params = db.Column(JSONB)

    actor_id = db.Column(db.Integer, db.ForeignKey('role.id'), nullable=True)
    actor = db.relationship(Role)

    @hybrid_property
    def event(self):
        return Events.get(self._event)

    @event.setter
    def event(self, event):
        self._event = event.name

    @property
    def recipients(self):
        q = db.session.query(Role)
        q = q.join(Subscription, Subscription.role_id == Role.id)
        q = q.filter(Subscription.channel.in_(self.channels))
        q = q.filter(Role.email != None)  # noqa
        q = q.filter(Role.deleted_at == None)  # noqa
        q = q.filter(Subscription.deleted_at == None)  # noqa
        q = q.distinct()
        return q

    def iterparams(self):
        if self.actor_id is not None:
            yield 'actor', Role, self.actor_id
        if self.event is None:
            return
        for name, clazz in self.event.params.items():
            value = self.params.get(name)
            if value is not None:
                yield name, clazz, value

    @classmethod
    def publish(cls, event, actor_id=None, channels=[], params={}):
        notf = cls()
        notf.event = event
        notf.actor_id = actor_id
        notf.params = params
        notf.channels = list(set([c for c in channels if c is not None]))
        db.session.add(notf)
        return notf

    @classmethod
    def by_role(cls, role, since=None):
        columns = array_agg(Subscription.channel).label('channels')
        sq = db.session.query(columns)
        sq = sq.filter(Subscription.deleted_at == None)  # noqa
        sq = sq.filter(Subscription.role_id == role.id)
        sq = sq.cte('sq')
        q = cls.all()
        q = q.filter(or_(
            cls.actor_id != role.id,
            cls.actor_id == None  # noqa
        ))
        q = q.filter(cls.channels.overlap(sq.c.channels))
        q = q.filter(cls._event.in_(Events.names()))
        if since is not None:
            q = q.filter(cls.created_at >= since)
        if role.notified_at is not None:
            q = q.filter(cls.created_at >= role.notified_at)
        q = q.order_by(cls.created_at.desc())
        q = q.order_by(cls.id.desc())
        return q

    @classmethod
    def by_channel(cls, channel):
        q = cls.all()
        q = q.filter(cls.channels.any(channel))
        q = q.filter(cls._event.in_(Events.names()))
        q = q.order_by(cls.created_at.desc())
        q = q.order_by(cls.id.desc())
        return q

    @classmethod
    def delete_by_channel(cls, channel):
        q = cls.all()
        q = q.filter(cls.channels.any(channel))
        q.delete(synchronize_session=False)
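The fan-out rule that `by_role` expresses in SQL (via `ARRAY.overlap` against an aggregated CTE of the role's subscriptions) reduces to set intersection: a notification reaches a role when their channel sets overlap and the role is not the actor. A plain-Python sketch over dicts, with hypothetical names:

```python
def notifications_for_role(notifications, subscribed_channels, role_id):
    """Select notifications whose channels intersect the role's subscriptions,
    excluding the role's own actions (mirrors the by_role() SQL filters)."""
    subscribed = set(subscribed_channels)
    return [n for n in notifications
            if set(n['channels']) & subscribed      # channels overlap
            and n.get('actor_id') != role_id]       # skip self-notifications

feed = notifications_for_role(
    [{'channels': ['collection:1'], 'actor_id': 7},    # own action: excluded
     {'channels': ['collection:1'], 'actor_id': 3},    # overlap: included
     {'channels': ['collection:9'], 'actor_id': None}],  # no overlap: excluded
    subscribed_channels=['collection:1'],
    role_id=7)
```

Pushing this into PostgreSQL's GIN-indexed `&&` (overlap) operator is what keeps the feed query fast; the Python version is only the semantics.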
class ItemModel(db.Model):
    __abstract__ = True

    id = db.Column('id', db.Text, nullable=False, primary_key=True)
    title = db.Column('title', db.Text, nullable=False, index=True)
    abstract = db.Column('abstract', db.Text, index=True)
    type = db.Column('type', db.Text, index=True)
    spatial_data_service_type = db.Column('spatial_data_service_type', db.Text, index=True)
    spatial_data_service_version = db.Column('spatial_data_service_version', db.Text, index=True)
    spatial_data_service_operations = db.Column(
        'spatial_data_service_operations', ARRAY(db.Text), index=True)
    spatial_data_service_queryables = db.Column(
        'spatial_data_service_queryables', ARRAY(db.Text), index=True)
    format = db.Column('format', db.Text, index=True)
    keywords = db.Column('keywords', ARRAY(JSONB))
    publisher_name = db.Column('publisher_name', db.Text, index=True)
    publisher_email = db.Column('publisher_email', db.Text, index=True)
    publisher_id = db.Column('publisher_id', db.Text, index=True)
    language = db.Column('language', db.Text, index=True)
    date_start = db.Column('date_start', db.Date, index=True)
    date_end = db.Column('date_end', db.Date, index=True)
    creation_date = db.Column('creation_date', db.Date, index=True)
    publication_date = db.Column('publication_date', db.Date, index=True)
    revision_date = db.Column('revision_date', db.Date, index=True)
    geographic_location = db.Column(
        'geographic_location', Geometry(geometry_type='POLYGON'), index=True)
    resource_locator = db.Column('resource_locator', db.Text, index=True)
    license = db.Column('license', db.Text, index=True)
    open_dataset = db.Column('open_dataset', db.Boolean, default=False)
    topic_category = db.Column('topic_category', ARRAY(db.Text), index=True)
    reference_system = db.Column('reference_system', db.Text, index=True)
    spatial_resolution = db.Column('spatial_resolution', db.Integer, index=True)
    scales = db.Column('scale', ARRAY(JSONB))
    version = db.Column('version', db.Text, index=True)
    conformity = db.Column('conformity', db.Text, index=True)
    additional_resources = db.Column('additional_resources', ARRAY(JSONB))
    public_access_limitations = db.Column('public_access_limitations', db.Text, index=True)
    metadata_language = db.Column('metadata_language', db.Text, index=True)
    metadata_point_of_contact_name = db.Column(
        'metadata_point_of_contact_name', db.Text, index=True)
    metadata_point_of_contact_email = db.Column(
        'metadata_point_of_contact_email', db.Text, index=True)
    metadata_date = db.Column('metadata_date', db.Date, index=True)
    metadata_version = db.Column('metadata_version', db.Text, index=True)
    resources = db.Column('resources', ARRAY(JSONB))
    lineage = db.Column('lineage', db.Text, index=True)
    parent_id = db.Column('parent_id', db.Text, index=True)
    parent_data_source_id = db.Column('parent_data_source_id', db.Text, index=True)
    item_geojson = db.Column('item_geojson', JSONB)
    contract_template_id = db.Column('contract_template_id', db.Integer, index=True)
    contract_template_version = db.Column('contract_template_version', db.Text, index=True)
    contract_template_type = db.Column('contract_template_type', db.Text, index=True)
    pricing_models = db.Column('pricing_models', JSONB)
    statistics = db.Column('statistics', JSONB)
    delivery_method = db.Column('delivery_method', db.Text, index=True)
    responsible_party = db.Column('responsible_party', JSONB)
    automated_metadata = db.Column('automated_metadata', ARRAY(JSONB))
    harvested_from = db.Column('harvested_from', db.Text, index=True)
    harvest_json = db.Column('harvest_json', JSONB)
    use_only_for_vas = db.Column('use_only_for_vas', db.Boolean, default=False)
    vetting_required = db.Column('vetting_required', db.Boolean, default=False)
    ingestion_info = db.Column('ingestion_info', ARRAY(JSONB))
    created_at = db.Column('created_at', db.Date, index=True)
    submitted_at = db.Column('submitted_at', db.Date, index=True)
    accepted_at = db.Column('accepted_at', db.Date, index=True)
    visibility = db.Column('visibility', ARRAY(db.Text), index=True)
    suitable_for = db.Column('suitable_for', ARRAY(db.Text), index=True)
    extensions = db.Column('extensions', JSONB)

    ts_vector = func.to_tsvector('english', item_geojson)

    def update(self, id, data):
        properties = data.get('properties')
        if data.get('geometry') is not None:
            geom = data['geometry']
            self.geographic_location = shape(geom).wkt
        for key in properties:
            if properties[key] is not None:
                setattr(self, key, properties[key])
        self.id = id
        item_geojson = self.serialize()
        self.item_geojson = item_geojson
        return

    # convert to geojson format
    def serialize(self):
        # build properties object
        p = {}
        geom = None
        for c in inspect(self).attrs.keys():
            attr = getattr(self, c)
            if c == 'geographic_location' and attr is not None:
                # Convert to a shapely Polygon object to get mapping
                if isinstance(attr, str):
                    g = shapely.wkt.loads(attr)
                    geom = mapping(g)
                else:
                    g = to_shape(attr)
                    geom = mapping(g)
            elif isinstance(attr, datetime.date):
                p[c] = attr.isoformat()
            elif c == 'id' or c == 'item_geojson':
                continue
            else:
                p[c] = attr
        # build geojson
        item_geojson = {
            "id": self.id,
            "type": "Feature",
            "geometry": geom,
            "properties": p
        }
        return item_geojson
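Stripped of the SQLAlchemy/shapely machinery, `serialize` builds a GeoJSON Feature: dates become ISO-8601 strings, and `id` plus the cached `item_geojson` column stay out of `properties`. A stdlib-only sketch (the `to_feature` helper is hypothetical; geometry conversion from WKT is omitted):

```python
import datetime

def to_feature(obj_id, geometry, attrs):
    """Build a GeoJSON Feature in the style of ItemModel.serialize()."""
    props = {}
    for key, value in attrs.items():
        if key in ('id', 'item_geojson'):
            # the id lives at the top level; the cached geojson would recurse
            continue
        if isinstance(value, datetime.date):
            props[key] = value.isoformat()  # JSON has no native date type
        else:
            props[key] = value
    return {"id": obj_id, "type": "Feature",
            "geometry": geometry, "properties": props}

feature = to_feature(
    'item-1',
    {"type": "Polygon", "coordinates": [[[0, 0], [1, 0], [1, 1], [0, 0]]]},
    {'title': 'Sample', 'creation_date': datetime.date(2021, 5, 1), 'id': 'item-1'})
```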
class Event(SearchableTitleMixin, DescriptionMixin, LocationMixin, ProtectionManagersMixin,
            AttachedItemsMixin, AttachedNotesMixin, PersonLinkDataMixin, db.Model):
    """An Indico event

    This model contains the most basic information related to an event.

    Note that the ACL is currently only used for managers but not for
    view access!
    """
    __tablename__ = 'events'
    disallowed_protection_modes = frozenset()
    inheriting_have_acl = True
    allow_access_key = True
    allow_no_access_contact = True
    location_backref_name = 'events'
    allow_location_inheritance = False
    possible_render_modes = {RenderMode.html}
    default_render_mode = RenderMode.html
    __logging_disabled = False

    ATTACHMENT_FOLDER_ID_COLUMN = 'event_id'

    @strict_classproperty
    @classmethod
    def __auto_table_args(cls):
        return (db.Index('ix_events_start_dt_desc', cls.start_dt.desc()),
                db.Index('ix_events_end_dt_desc', cls.end_dt.desc()),
                db.Index('ix_events_not_deleted_category', cls.is_deleted, cls.category_id),
                db.Index('ix_events_not_deleted_category_dates',
                         cls.is_deleted, cls.category_id, cls.start_dt, cls.end_dt),
                db.Index('ix_uq_events_url_shortcut', db.func.lower(cls.url_shortcut),
                         unique=True, postgresql_where=db.text('NOT is_deleted')),
                db.CheckConstraint("category_id IS NOT NULL OR is_deleted", 'category_data_set'),
                db.CheckConstraint("(logo IS NULL) = (logo_metadata::text = 'null')",
                                   'valid_logo'),
                db.CheckConstraint("(stylesheet IS NULL) = (stylesheet_metadata::text = 'null')",
                                   'valid_stylesheet'),
                db.CheckConstraint("end_dt >= start_dt", 'valid_dates'),
                db.CheckConstraint("url_shortcut != ''", 'url_shortcut_not_empty'),
                db.CheckConstraint("cloned_from_id != id", 'not_cloned_from_self'),
                db.CheckConstraint('visibility IS NULL OR visibility >= 0', 'valid_visibility'),
                {'schema': 'events'})

    @declared_attr
    def __table_args__(cls):
        return auto_table_args(cls)

    #: The ID of the event
    id = db.Column(db.Integer, primary_key=True)
    #: If the event has been deleted
    is_deleted = db.Column(db.Boolean, nullable=False, default=False)
    #: If the event is locked (read-only mode)
    is_locked = db.Column(db.Boolean, nullable=False, default=False)
    #: The ID of the user who created the event
    creator_id = db.Column(db.Integer, db.ForeignKey('users.users.id'),
                           nullable=False, index=True)
    #: The ID of immediate parent category of the event
    category_id = db.Column(db.Integer, db.ForeignKey('categories.categories.id'),
                            nullable=True, index=True)
    #: The ID of the series this events belongs to
    series_id = db.Column(db.Integer, db.ForeignKey('events.series.id'),
                          nullable=True, index=True)
    #: If this event was cloned, the id of the parent event
    cloned_from_id = db.Column(db.Integer, db.ForeignKey('events.events.id'),
                               nullable=True, index=True)
    #: The ID of the label assigned to the event
    label_id = db.Column(db.ForeignKey('events.labels.id'), index=True, nullable=True)
    label_message = db.Column(db.Text, nullable=False, default='')
    #: The creation date of the event
    created_dt = db.Column(UTCDateTime, nullable=False, index=True, default=now_utc)
    #: The start date of the event
    start_dt = db.Column(UTCDateTime, nullable=False, index=True)
    #: The end date of the event
    end_dt = db.Column(UTCDateTime, nullable=False, index=True)
    #: The timezone of the event
    timezone = db.Column(db.String, nullable=False)
    #: The type of the event
    _type = db.Column('type', PyIntEnum(EventType), nullable=False)
    #: The visibility depth in category overviews
    visibility = db.Column(db.Integer, nullable=True, default=None)
    #: A list of tags/keywords for the event
    keywords = db.Column(ARRAY(db.String), nullable=False, default=[])
    #: The URL shortcut for the event
    url_shortcut = db.Column(db.String, nullable=True)
    #: The metadata of the logo (hash, size, filename, content_type)
    logo_metadata = db.Column(JSONB, nullable=False, default=lambda: None)
    #: The logo's raw image data
    logo = db.deferred(db.Column(db.LargeBinary, nullable=True))
    #: The metadata of the stylesheet (hash, size, filename)
    stylesheet_metadata = db.Column(JSONB, nullable=False, default=lambda: None)
    #: The stylesheet's raw image data
    stylesheet = db.deferred(db.Column(db.Text, nullable=True))
    #: The ID of the event's default page (conferences only)
    default_page_id = db.Column(db.Integer, db.ForeignKey('events.pages.id'),
                                index=True, nullable=True)
    #: The url to a map for the event
    own_map_url = db.Column('map_url', db.String, nullable=False, default='')
    #: The last user-friendly registration ID
    _last_friendly_registration_id = db.deferred(db.Column(
        'last_friendly_registration_id', db.Integer, nullable=False, default=0))
    #: The last user-friendly contribution ID
    _last_friendly_contribution_id = db.deferred(db.Column(
        'last_friendly_contribution_id', db.Integer, nullable=False, default=0))
    #: The last user-friendly session ID
    _last_friendly_session_id = db.deferred(db.Column(
        'last_friendly_session_id', db.Integer, nullable=False, default=0))

    #: The category containing the event
    category = db.relationship(
        'Category',
        lazy=True,
        backref=db.backref(
            'events',
            primaryjoin='(Category.id == Event.category_id) & ~Event.is_deleted',
            order_by=(start_dt, id),
            lazy=True
        )
    )
    #: The user who created the event
    creator = db.relationship(
        'User',
        lazy=True,
        backref=db.backref(
            'created_events',
            lazy='dynamic'
        )
    )
    #: The event this one was cloned from
    cloned_from = db.relationship(
        'Event',
        lazy=True,
        remote_side='Event.id',
        backref=db.backref(
            'clones',
            lazy=True,
            order_by=start_dt
        )
    )
    #: The event's default page (conferences only)
    default_page = db.relationship(
        'EventPage',
        lazy=True,
        foreign_keys=[default_page_id],
        post_update=True,
        # don't use this backref. we just need it so SA properly NULLs
        # this column when deleting the default page
        backref=db.backref('_default_page_of_event', lazy=True)
    )
    #: The ACL entries for the event
    acl_entries = db.relationship(
        'EventPrincipal',
        backref='event',
        cascade='all, delete-orphan',
        collection_class=set
    )
    #: External references associated with this event
    references = db.relationship(
        'EventReference',
        lazy=True,
        cascade='all, delete-orphan',
        backref=db.backref(
            'event',
            lazy=True
        )
    )
    #: Persons associated with this event
    person_links = db.relationship(
        'EventPersonLink',
        lazy=True,
        cascade='all, delete-orphan',
        backref=db.backref(
            'event',
            lazy=True
        )
    )
    #: The series this event is part of
    series = db.relationship(
        'EventSeries',
        lazy=True,
        backref=db.backref(
            'events',
            lazy=True,
            order_by=(start_dt, id),
            primaryjoin='(Event.series_id == EventSeries.id) & ~Event.is_deleted',
        )
    )
    #: The label assigned to the event
    label = db.relationship(
        'EventLabel',
        lazy=True,
        backref=db.backref(
            'events',
            lazy=True
        )
    )

    # relationship backrefs:
    # - abstract_email_templates (AbstractEmailTemplate.event)
    # - abstract_review_questions (AbstractReviewQuestion.event)
    # - abstracts (Abstract.event)
    # - agreements (Agreement.event)
    # - all_attachment_folders (AttachmentFolder.event)
    # - all_legacy_attachment_folder_mappings (LegacyAttachmentFolderMapping.event)
    # - all_legacy_attachment_mappings (LegacyAttachmentMapping.event)
    # - all_notes (EventNote.event)
    # - all_room_reservation_links (ReservationLink.event)
    # - all_vc_room_associations (VCRoomEventAssociation.event)
    # - attachment_folders (AttachmentFolder.linked_event)
    # - clones (Event.cloned_from)
    # - contribution_fields (ContributionField.event)
    # - contribution_types (ContributionType.event)
    # - contributions (Contribution.event)
    # - custom_pages (EventPage.event)
    # - designer_templates (DesignerTemplate.event)
    # - editing_file_types (EditingFileType.event)
    # - editing_review_conditions (EditingReviewCondition.event)
    # - editing_tags (EditingTag.event)
    # - layout_images (ImageFile.event)
    # - legacy_contribution_mappings (LegacyContributionMapping.event)
    # - legacy_mapping (LegacyEventMapping.event)
    # - legacy_session_block_mappings (LegacySessionBlockMapping.event)
    # - legacy_session_mappings (LegacySessionMapping.event)
    # - legacy_subcontribution_mappings (LegacySubContributionMapping.event)
    # - log_entries (EventLogEntry.event)
    # - menu_entries (MenuEntry.event)
    # - note (EventNote.linked_event)
    # - paper_competences (PaperCompetence.event)
    # - paper_review_questions (PaperReviewQuestion.event)
    # - paper_templates (PaperTemplate.event)
    # - persons (EventPerson.event)
    # - registration_forms (RegistrationForm.event)
    # - registrations (Registration.event)
    # - reminders (EventReminder.event)
    # - requests (Request.event)
    # - roles (EventRole.event)
    # - room_reservation_links (ReservationLink.linked_event)
    # - session_types (SessionType.event)
    # - sessions (Session.event)
    # - settings (EventSetting.event)
    # - settings_principals (EventSettingPrincipal.event)
    # - static_list_links (StaticListLink.event)
    # - static_sites (StaticSite.event)
    # - surveys (Survey.event)
    # - timetable_entries (TimetableEntry.event)
    # - track_groups (TrackGroup.event)
    # - tracks (Track.event)
    # - vc_room_associations (VCRoomEventAssociation.linked_event)

    start_dt_override = _EventSettingProperty(event_core_settings, 'start_dt_override')
    end_dt_override = _EventSettingProperty(event_core_settings, 'end_dt_override')
    organizer_info = _EventSettingProperty(event_core_settings, 'organizer_info')
    additional_info = _EventSettingProperty(event_core_settings, 'additional_info')
    contact_title = _EventSettingProperty(event_contact_settings, 'title')
    contact_emails = _EventSettingProperty(event_contact_settings, 'emails')
    contact_phones = _EventSettingProperty(event_contact_settings, 'phones')

    @classmethod
    def category_chain_overlaps(cls, category_ids):
        """
        Create a filter that checks whether the event has any of the
        provided category ids in its parent chain.

        :param category_ids: A list of category ids or a single category id
        """
        from indico.modules.categories import Category
        if not isinstance(category_ids, (list, tuple, set)):
            category_ids = [category_ids]
        cte = Category.get_tree_cte()
        return (cte.c.id == Event.category_id) & cte.c.path.overlap(category_ids)

    @classmethod
    def is_visible_in(cls, category_id):
        """
        Create a filter that checks whether the event is visible in the
        specified category.
        """
        cte = Category.get_visible_categories_cte(category_id)
        return (db.exists(db.select([1]))
                .where(db.and_(cte.c.id == Event.category_id,
                               db.or_(Event.visibility.is_(None),
                                      Event.visibility > cte.c.level))))

    @property
    def event(self):
        """Convenience property so all event entities have it"""
        return self

    @property
    def has_logo(self):
        return self.logo_metadata is not None

    @property
    def has_stylesheet(self):
        return self.stylesheet_metadata is not None

    @property
    def theme(self):
        from indico.modules.events.layout import layout_settings, theme_settings
        theme = layout_settings.get(self, 'timetable_theme')
        if theme and theme in theme_settings.get_themes_for(self.type):
            return theme
        else:
            return theme_settings.defaults[self.type]

    @property
    def locator(self):
        return {'confId': self.id}

    @property
    def logo_url(self):
        return url_for('event_images.logo_display', self,
                       slug=self.logo_metadata['hash'])

    @property
    def external_logo_url(self):
        return url_for('event_images.logo_display', self,
                       slug=self.logo_metadata['hash'], _external=True)

    @property
    def participation_regform(self):
        return next((form for form in self.registration_forms if form.is_participation),
                    None)

    @property
    @memoize_request
    def published_registrations(self):
        from indico.modules.events.registration.util import get_published_registrations
        return get_published_registrations(self)

    @property
    def protection_parent(self):
        return self.category

    @property
    def start_dt_local(self):
        return self.start_dt.astimezone(self.tzinfo)

    @property
    def end_dt_local(self):
        return self.end_dt.astimezone(self.tzinfo)

    @property
    def start_dt_display(self):
        """
        The 'displayed start dt', which is usually the actual start dt,
        but may be overridden for a conference.
        """
        if self.type_ == EventType.conference and self.start_dt_override:
            return self.start_dt_override
        else:
            return self.start_dt

    @property
    def end_dt_display(self):
        """
        The 'displayed end dt', which is usually the actual end dt,
        but may be overridden for a conference.
        """
        if self.type_ == EventType.conference and self.end_dt_override:
            return self.end_dt_override
        else:
            return self.end_dt

    @property
    def type(self):
        # XXX: this should eventually be replaced with the type_
        # property returning the enum - but there are too many places
        # right now that rely on the type string
        return self.type_.name

    @hybrid_property
    def type_(self):
        return self._type

    @type_.setter
    def type_(self, value):
        old_type = self._type
        self._type = value
        if old_type is not None and old_type != value:
            signals.event.type_changed.send(self, old_type=old_type)

    @property
    def url(self):
        return url_for('events.display', self)

    @property
    def external_url(self):
        return url_for('events.display', self, _external=True)

    @property
    def short_url(self):
        id_ = self.url_shortcut or self.id
        return url_for('events.shorturl', confId=id_)

    @property
    def short_external_url(self):
        id_ = self.url_shortcut or self.id
        return url_for('events.shorturl', confId=id_, _external=True)

    @property
    def map_url(self):
        if self.own_map_url:
            return self.own_map_url
        elif not self.room:
            return ''
        return self.room.map_url or ''

    @property
    def tzinfo(self):
        return pytz.timezone(self.timezone)

    @property
    def display_tzinfo(self):
        """The tzinfo of the event as preferred by the current user"""
        return get_display_tz(self, as_timezone=True)

    @property
    def editable_types(self):
        from indico.modules.events.editing.settings import editing_settings
        return editing_settings.get(self, 'editable_types')

    @property
    @contextmanager
    def logging_disabled(self):
"""Temporarily disables event logging This is useful when performing actions e.g. during event creation or at other times where adding entries to the event log doesn't make sense. """ self.__logging_disabled = True try: yield finally: self.__logging_disabled = False @hybrid_method def happens_between(self, from_dt=None, to_dt=None): """Check whether the event takes place within two dates""" if from_dt is not None and to_dt is not None: # any event that takes place during the specified range return overlaps((self.start_dt, self.end_dt), (from_dt, to_dt), inclusive=True) elif from_dt is not None: # any event that starts on/after the specified date return self.start_dt >= from_dt elif to_dt is not None: # any event that ends on/before the specifed date return self.end_dt <= to_dt else: return True @happens_between.expression def happens_between(cls, from_dt=None, to_dt=None): if from_dt is not None and to_dt is not None: # any event that takes place during the specified range return db_dates_overlap(cls, 'start_dt', from_dt, 'end_dt', to_dt, inclusive=True) elif from_dt is not None: # any event that starts on/after the specified date return cls.start_dt >= from_dt elif to_dt is not None: # any event that ends on/before the specifed date return cls.end_dt <= to_dt else: return True @hybrid_method def starts_between(self, from_dt=None, to_dt=None): """Check whether the event starts within two dates""" if from_dt is not None and to_dt is not None: return from_dt <= self.start_dt <= to_dt elif from_dt is not None: return self.start_dt >= from_dt elif to_dt is not None: return self.start_dt <= to_dt else: return True @starts_between.expression def starts_between(cls, from_dt=None, to_dt=None): if from_dt is not None and to_dt is not None: return cls.start_dt.between(from_dt, to_dt) elif from_dt is not None: return cls.start_dt >= from_dt elif to_dt is not None: return cls.start_dt <= to_dt else: return True @hybrid_method def ends_after(self, dt): """Check whether the 
event ends on/after the specified date""" return self.end_dt >= dt if dt is not None else True @ends_after.expression def ends_after(cls, dt): return cls.end_dt >= dt if dt is not None else True @hybrid_property def duration(self): return self.end_dt - self.start_dt def can_lock(self, user): """Check whether the user can lock/unlock the event""" return user and (user.is_admin or user == self.creator or self.category.can_manage(user)) def get_relative_event_ids(self): """Get the first, last, previous and next event IDs. Any of those values may be ``None`` if there is no matching event or if it would be the current event. :return: A dict containing ``first``, ``last``, ``prev`` and ``next``. """ subquery = (select([Event.id, db.func.first_value(Event.id).over(order_by=(Event.start_dt, Event.id)).label('first'), db.func.last_value(Event.id).over(order_by=(Event.start_dt, Event.id), range_=(None, None)).label('last'), db.func.lag(Event.id).over(order_by=(Event.start_dt, Event.id)).label('prev'), db.func.lead(Event.id).over(order_by=(Event.start_dt, Event.id)).label('next')]) .where((Event.category_id == self.category_id) & ~Event.is_deleted) .alias()) rv = (db.session.query(subquery.c.first, subquery.c.last, subquery.c.prev, subquery.c.next) .filter(subquery.c.id == self.id) .one() ._asdict()) if rv['first'] == self.id: rv['first'] = None if rv['last'] == self.id: rv['last'] = None return rv def get_verbose_title(self, show_speakers=False, show_series_pos=False): """Get the event title with some additional information :param show_speakers: Whether to prefix the title with the speakers of the event. :param show_series_pos: Whether to suffix the title with the position and total count in the event's series. 
""" title = self.title if show_speakers and self.person_links: speakers = ', '.join(sorted([pl.full_name for pl in self.person_links], key=unicode.lower)) title = '{}, "{}"'.format(speakers, title) if show_series_pos and self.series and self.series.show_sequence_in_title: title = '{} ({}/{})'.format(title, self.series_pos, self.series_count) return title def get_label_markup(self, size=''): label = self.label if not label: return '' return Markup(render_template('events/label.html', label=label, message=self.label_message, size=size)) def get_non_inheriting_objects(self): """Get a set of child objects that do not inherit protection""" return get_non_inheriting_objects(self) def get_contribution(self, id_): """Get a contribution of the event""" return get_related_object(self, 'contributions', {'id': id_}) def get_sorted_tracks(self): """Return tracks and track groups in the correct order""" track_groups = self.track_groups tracks = [track for track in self.tracks if not track.track_group] return sorted(tracks + track_groups, key=attrgetter('position')) def get_session(self, id_=None, friendly_id=None): """Get a session of the event""" if friendly_id is None and id_ is not None: criteria = {'id': id_} elif id_ is None and friendly_id is not None: criteria = {'friendly_id': friendly_id} else: raise ValueError('Exactly one kind of id must be specified') return get_related_object(self, 'sessions', criteria) def get_session_block(self, id_, scheduled_only=False): """Get a session block of the event""" from indico.modules.events.sessions.models.blocks import SessionBlock query = SessionBlock.query.filter(SessionBlock.id == id_, SessionBlock.session.has(event=self, is_deleted=False)) if scheduled_only: query.filter(SessionBlock.timetable_entry != None) # noqa return query.first() def get_allowed_sender_emails(self, include_current_user=True, include_creator=True, include_managers=True, include_contact=True, include_chairs=True, extra=None): """ Return the emails of people 
who can be used as senders (or rather Reply-to contacts) in emails sent from within an event. :param include_current_user: Whether to include the email of the currently logged-in user :param include_creator: Whether to include the email of the event creator :param include_managers: Whether to include the email of all event managers :param include_contact: Whether to include the "event contact" emails :param include_chairs: Whether to include the emails of event chairpersons (or lecture speakers) :param extra: An email address that is always included, even if it is not in any of the included lists. :return: An OrderedDict mapping emails to pretty names """ emails = {} # Contact/Support if include_contact: for email in self.contact_emails: emails[email] = self.contact_title # Current user if include_current_user and has_request_context() and session.user: emails[session.user.email] = session.user.full_name # Creator if include_creator: emails[self.creator.email] = self.creator.full_name # Managers if include_managers: emails.update((p.principal.email, p.principal.full_name) for p in self.acl_entries if p.type == PrincipalType.user and p.full_access) # Chairs if include_chairs: emails.update((pl.email, pl.full_name) for pl in self.person_links if pl.email) # Extra email (e.g. 
the current value in an object from the DB) if extra: emails.setdefault(extra, extra) # Sanitize and format emails emails = {to_unicode(email.strip().lower()): '{} <{}>'.format(to_unicode(name), to_unicode(email)) for email, name in emails.iteritems() if email and email.strip()} own_email = session.user.email if has_request_context() and session.user else None return OrderedDict(sorted(emails.items(), key=lambda x: (x[0] != own_email, x[1].lower()))) @memoize_request def has_feature(self, feature): """Checks if a feature is enabled for the event""" from indico.modules.events.features.util import is_feature_enabled return is_feature_enabled(self, feature) @property @memoize_request def scheduled_notes(self): from indico.modules.events.notes.util import get_scheduled_notes return get_scheduled_notes(self) def log(self, realm, kind, module, summary, user=None, type_='simple', data=None): """Creates a new log entry for the event :param realm: A value from :class:`.EventLogRealm` indicating the realm of the action. :param kind: A value from :class:`.EventLogKind` indicating the kind of the action that was performed. :param module: A human-friendly string describing the module related to the action. :param summary: A one-line summary describing the logged action. :param user: The user who performed the action. :param type_: The type of the log entry. This is used for custom rendering of the log message/data :param data: JSON-serializable data specific to the log type. :return: The newly created `EventLogEntry` In most cases the ``simple`` log type is fine. For this type, any items from data will be shown in the detailed view of the log entry. You may either use a dict (which will be sorted alphabetically) or a list of ``key, value`` pairs which will be displayed in the given order.
""" if self.__logging_disabled: return entry = EventLogEntry(user=user, realm=realm, kind=kind, module=module, type=type_, summary=summary, data=data or {}) self.log_entries.append(entry) return entry def get_contribution_field(self, field_id): return next((v for v in self.contribution_fields if v.id == field_id), '') def move_start_dt(self, start_dt): """Set event start_dt and adjust its timetable entries""" diff = start_dt - self.start_dt for entry in self.timetable_entries.filter(TimetableEntry.parent_id.is_(None)): new_dt = entry.start_dt + diff entry.move(new_dt) self.start_dt = start_dt def iter_days(self, tzinfo=None): start_dt = self.start_dt end_dt = self.end_dt if tzinfo: start_dt = start_dt.astimezone(tzinfo) end_dt = end_dt.astimezone(tzinfo) duration = (end_dt.replace(hour=23, minute=59) - start_dt.replace(hour=0, minute=0)).days for offset in xrange(duration + 1): yield (start_dt + timedelta(days=offset)).date() def preload_all_acl_entries(self): db.m.Contribution.preload_acl_entries(self) db.m.Session.preload_acl_entries(self) def move(self, category): from indico.modules.events import EventLogRealm, EventLogKind old_category = self.category self.category = category sep = ' \N{RIGHT-POINTING DOUBLE ANGLE QUOTATION MARK} ' old_path = sep.join(old_category.chain_titles) new_path = sep.join(self.category.chain_titles) db.session.flush() signals.event.moved.send(self, old_parent=old_category) self.log(EventLogRealm.management, EventLogKind.change, 'Category', 'Event moved', session.user, data={'From': old_path, 'To': new_path}) def delete(self, reason, user=None): from indico.modules.events import logger, EventLogRealm, EventLogKind self.is_deleted = True signals.event.deleted.send(self, user=user) db.session.flush() logger.info('Event %r deleted [%s]', self, reason) self.log(EventLogRealm.event, EventLogKind.negative, 'Event', 'Event deleted', user, data={'Reason': reason}) @property @memoize_request def cfa(self): from 
indico.modules.events.abstracts.models.call_for_abstracts import CallForAbstracts return CallForAbstracts(self) @property @memoize_request def cfp(self): from indico.modules.events.papers.models.call_for_papers import CallForPapers return CallForPapers(self) @property def reservations(self): return [link.reservation for link in self.all_room_reservation_links] @property def has_ended(self): return self.end_dt <= now_utc() @return_ascii def __repr__(self): return format_repr(self, 'id', 'start_dt', 'end_dt', is_deleted=False, is_locked=False, _text=text_to_repr(self.title, max_length=75))
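`get_relative_event_ids` above leans on the `first_value`/`last_value`/`lag`/`lead` window functions and then nulls out `first`/`last` when they equal the current event. The same logic over an in-memory, already-ordered list of ids can be sketched as follows (a hypothetical helper for illustration, not part of the model):

```python
def relative_ids(ordered_ids, current):
    """Return first/last/prev/next ids relative to `current`.

    Mirrors the window-function query: `first`/`last` become None
    when they would point at the current id itself.
    """
    i = ordered_ids.index(current)
    rv = {
        'first': ordered_ids[0],
        'last': ordered_ids[-1],
        'prev': ordered_ids[i - 1] if i > 0 else None,
        'next': ordered_ids[i + 1] if i < len(ordered_ids) - 1 else None,
    }
    if rv['first'] == current:
        rv['first'] = None
    if rv['last'] == current:
        rv['last'] = None
    return rv
```

Doing this in SQL keeps it a single round-trip; the subquery computes all four columns at once and the outer query filters to the current row.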
class OAuthApplication(db.Model): """OAuth applications registered in fossir""" __tablename__ = 'applications' @declared_attr def __table_args__(cls): return (db.Index('ix_uq_applications_name_lower', db.func.lower(cls.name), unique=True), db.Index(None, cls.system_app_type, unique=True, postgresql_where=db.text( 'system_app_type != {}'.format( SystemAppType.none.value))), { 'schema': 'oauth' }) #: the unique id of the application id = db.Column(db.Integer, primary_key=True) #: human readable name name = db.Column(db.String, nullable=False) #: human readable description description = db.Column(db.Text, nullable=False, default='') #: the OAuth client_id client_id = db.Column(UUID, unique=True, nullable=False, default=lambda: unicode(uuid4())) #: the OAuth client_secret client_secret = db.Column(UUID, nullable=False, default=lambda: unicode(uuid4())) #: the OAuth default scopes the application may request access to default_scopes = db.Column(ARRAY(db.String), nullable=False) #: the OAuth absolute URIs that an application may use to redirect to after authorization redirect_uris = db.Column(ARRAY(db.String), nullable=False, default=[]) #: whether the application is enabled or disabled is_enabled = db.Column(db.Boolean, nullable=False, default=True) #: whether the application can access user data without asking for permission is_trusted = db.Column(db.Boolean, nullable=False, default=False) #: the type of system app (if any).
system apps cannot be deleted system_app_type = db.Column(PyIntEnum(SystemAppType), nullable=False, default=SystemAppType.none) # relationship backrefs: # - tokens (OAuthToken.application) @property def client_type(self): return 'public' @property def default_redirect_uri(self): return self.redirect_uris[0] if self.redirect_uris else None @property def locator(self): return {'id': self.id} @return_ascii def __repr__(self): # pragma: no cover return '<OAuthApplication({}, {}, {})>'.format(self.id, self.name, self.client_id) def reset_client_secret(self): self.client_secret = unicode(uuid4()) logger.info("Client secret for %s has been reset.", self) def validate_redirect_uri(self, redirect_uri): """Called by flask-oauthlib to validate the redirect_uri. Uses a logic similar to the one at GitHub, i.e. protocol and host/port must match exactly and if there is a path in the whitelisted URL, the path of the redirect_uri must start with that path. """ uri_data = url_parse(redirect_uri) for valid_uri_data in map(url_parse, self.redirect_uris): if (uri_data.scheme == valid_uri_data.scheme and uri_data.netloc == valid_uri_data.netloc and uri_data.path.startswith(valid_uri_data.path)): return True return False
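`validate_redirect_uri` compares scheme and host/port exactly and only requires the path to start with a whitelisted path. The same check as a standalone function (plain `urllib`, outside the model, for illustration):

```python
from urllib.parse import urlsplit

def validate_redirect_uri(redirect_uri, whitelist):
    """GitHub-style redirect_uri check: scheme and netloc must match
    exactly; the path must start with the whitelisted path."""
    uri = urlsplit(redirect_uri)
    for valid in map(urlsplit, whitelist):
        if (uri.scheme == valid.scheme
                and uri.netloc == valid.netloc
                and uri.path.startswith(valid.path)):
            return True
    return False
```

Note that the prefix match means whitelisting `/cb` also accepts `/cb/anything`; whitelist the most specific path you can.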
    def lquery(self, other):  # type: ignore
        """Return whether the array matches the lquery/lqueries in `other`."""
        if isinstance(other, list):
            return self.op("?")(cast(other, ARRAY(LQUERY)))
        else:
            return self.op("~")(cast(other, LQUERY))
class TradeAgreement(ModelBase): """ TODO """ __tablename__ = 'agreements_agreement' __table_args__ = ( sa.UniqueConstraint('public_id'), sa.CheckConstraint( "(amount_percent IS NULL) OR (amount_percent >= 1 AND amount_percent <= 100)", name="amount_percent_is_NULL_or_between_1_and_100", ), sa.CheckConstraint( "(limit_to_consumption = 'f' and amount is not null and unit is not null) or (limit_to_consumption = 't')", name="limit_to_consumption_OR_amount_and_unit", ), ) # Meta id = sa.Column(sa.Integer(), primary_key=True, autoincrement=True, index=True) public_id = sa.Column(sa.String(), index=True, nullable=False) created = sa.Column(sa.DateTime(timezone=True), server_default=sa.func.now()) declined = sa.Column(sa.DateTime(timezone=True)) cancelled = sa.Column(sa.DateTime(timezone=True)) # Involved parties (users) user_proposed_id = sa.Column(sa.Integer(), sa.ForeignKey('auth_user.id'), index=True, nullable=False) user_proposed = relationship('User', foreign_keys=[user_proposed_id], lazy='joined') user_from_id = sa.Column(sa.Integer(), sa.ForeignKey('auth_user.id'), index=True, nullable=False) user_from = relationship('User', foreign_keys=[user_from_id], lazy='joined') user_to_id = sa.Column(sa.Integer(), sa.ForeignKey('auth_user.id'), index=True, nullable=False) user_to = relationship('User', foreign_keys=[user_to_id], lazy='joined') # Outbound facilities facility_gsrn = sa.Column(ARRAY(sa.Integer())) # Agreement details state = sa.Column(sa.Enum(AgreementState), index=True, nullable=False) date_from = sa.Column(sa.Date(), nullable=False) date_to = sa.Column(sa.Date(), nullable=False) technologies = sa.Column(ARRAY(sa.String()), index=True) reference = sa.Column(sa.String()) # Max. amount to transfer (per begin) amount = sa.Column(sa.Integer()) unit = sa.Column(sa.Enum(Unit)) # Transfer percentage (though never exceed max. amount - "amount" above) amount_percent = sa.Column(sa.Integer()) # Limit transferred amount to recipient's consumption? 
    limit_to_consumption = sa.Column(sa.Boolean())

    # Lowest number = highest priority.
    # Is set when user accepts the agreement, otherwise None
    transfer_priority = sa.Column(sa.Integer())

    # Sender's proposal note to recipient
    proposal_note = sa.Column(sa.String())

    @property
    def user_proposed_to(self):
        """
        :rtype: User
        """
        if self.user_from_id == self.user_proposed_id:
            return self.user_to
        else:
            return self.user_from

    @property
    def transfer_reference(self):
        """
        :rtype: str
        """
        return self.public_id

    @property
    def calculated_amount(self):
        """
        :rtype: int
        """
        return self.amount * self.unit.value

    def is_proposed_by(self, user):
        """
        :param User user:
        :rtype: bool
        """
        return user.id == self.user_proposed_id

    def is_inbound_to(self, user):
        """
        :param User user:
        :rtype: bool
        """
        return user.id == self.user_to_id

    def is_outbound_from(self, user):
        """
        :param User user:
        :rtype: bool
        """
        return user.id == self.user_from_id

    def is_pending(self):
        """
        :rtype: bool
        """
        return self.state == AgreementState.PENDING

    def decline_proposal(self):
        self.state = AgreementState.DECLINED
        self.declined = func.now()

    def cancel(self):
        self.state = AgreementState.CANCELLED
        self.cancelled = func.now()
        self.transfer_priority = None
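`calculated_amount` multiplies the stored integer `amount` by the unit's value, i.e. the `Unit` enum's values act as multipliers to a common base unit. A minimal sketch of that convention (the Wh-based enum values here are hypothetical, chosen only to illustrate the pattern):

```python
from enum import Enum

class Unit(Enum):
    # hypothetical multipliers normalizing everything to Wh
    Wh = 1
    KWh = 10**3
    MWh = 10**6

def calculated_amount(amount, unit):
    """Mirror of TradeAgreement.calculated_amount: amount * unit.value."""
    return amount * unit.value
```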
class Event(db.Model): """ An event is a significant moment in the life of a thing / org. """ query_class = SearchQuery __tablename__ = 'events' __module__ = 'newslynx.models.event' id = db.Column(db.Integer, unique=True, primary_key=True, index=True) # the unique id from the source. source_id = db.Column(db.Text, index=True) org_id = db.Column( db.Integer, db.ForeignKey('orgs.id'), index=True) recipe_id = db.Column(db.Integer, db.ForeignKey('recipes.id'), index=True) status = db.Column( ENUM(*EVENT_STATUSES, name='event_status_enum'), index=True) provenance = db.Column( ENUM(*EVENT_PROVENANCES, name='event_provenance_enum'), index=True) url = db.Column(db.Text, index=True) domain = db.Column(db.Text, index=True) img_url = db.Column(db.Text) thumbnail = db.Column(db.Text) created = db.Column(db.DateTime(timezone=True), default=dates.now) updated = db.Column(db.DateTime(timezone=True), onupdate=dates.now, default=dates.now) title = db.Column(db.Text) description = db.Column(db.Text) body = db.Column(db.Text) authors = db.Column(ARRAY(String)) meta = db.Column(JSON) # search vectors title_search_vector = db.Column(TSVectorType('title')) description_search_vector = db.Column(TSVectorType('description')) body_search_vector = db.Column(TSVectorType('body')) authors_search_vector = db.Column(TSVectorType('authors')) meta_search_vector = db.Column(TSVectorType('meta')) # relations tags = db.relationship('Tag', secondary=relations.events_tags, backref=db.backref('events', lazy='dynamic'), lazy='joined') # relations __table_args__ = ( db.UniqueConstraint( 'source_id', 'org_id', name='event_unique_constraint'), Index('events_title_search_vector_idx', 'title_search_vector', postgresql_using='gin'), Index('events_description_search_vector_idx', 'description_search_vector', postgresql_using='gin'), Index('events_body_search_vector_idx', 'body_search_vector', postgresql_using='gin'), Index('events_authors_search_vector_idx', 'authors_search_vector', postgresql_using='gin'), 
Index('events_meta_search_vector_idx', 'meta_search_vector', postgresql_using='gin') ) def __init__(self, **kw): self.source_id = str(kw.get('source_id')) self.recipe_id = kw.get('recipe_id') self.org_id = kw.get('org_id') self.status = kw.get('status', 'pending') self.provenance = kw.get('provenance', 'recipe') self.url = kw.get('url') self.domain = kw.get('domain', url.get_domain(kw.get('url', None))) self.img_url = kw.get('img_url') self.thumbnail = kw.get('thumbnail') self.created = kw.get('created', dates.now()) self.title = kw.get('title') self.description = kw.get('description') self.body = kw.get('body') self.authors = kw.get('authors', []) self.meta = kw.get('meta', {}) @property def simple_content_items(self): content_items = [] for t in self.content_items: content_items.append({ 'id': t.id, 'title': t.title, 'url': t.url }) return content_items @property def content_item_ids(self): return [t.id for t in self.content_items] @property def tag_ids(self): return [t.id for t in self.tags] @property def tag_count(self): return len(self.tags) def to_dict(self, **kw): d = { 'id': self.id, 'recipe_id': self.recipe_id, 'source_id': self.source_id, 'status': self.status, 'provenance': self.provenance, 'url': self.url, 'created': self.created, 'updated': self.updated, 'title': self.title, 'description': self.description, 'authors': self.authors, 'meta': self.meta, 'tag_ids': self.tag_ids, 'content_items': self.simple_content_items, } if kw.get('incl_body', False): d['body'] = self.body if kw.get('incl_img', False): d['thumbnail'] = self.thumbnail d['img_url'] = self.img_url return d def __repr__(self): return '<Event %r>' % (self.title)
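One subtlety in `__init__` above: `kw.get('domain', url.get_domain(kw.get('url', None)))` computes its fallback even when `'domain'` is supplied, because `dict.get` evaluates its default argument eagerly. A lazy variant, using a stand-in parser in place of the project's `url.get_domain` (both helper names here are illustrative):

```python
from urllib.parse import urlsplit

def get_domain(u):
    """Stand-in for url.get_domain: netloc of the URL, or None."""
    return urlsplit(u).netloc if u else None

def resolve_domain(kw):
    # only compute the fallback when 'domain' is absent
    domain = kw.get('domain')
    return domain if domain is not None else get_domain(kw.get('url'))
```

With the original eager form this is merely wasted work; it would matter if the fallback had side effects or were expensive.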
class Organisation(ModelBase): """ Establishes organisation object that resources can be associated with. """ __tablename__ = 'organisation' is_master = db.Column(db.Boolean, default=False, index=True) name = db.Column(db.String) external_auth_username = db.Column(db.String) valid_roles = db.Column(ARRAY(db.String, dimensions=1)) _external_auth_password = db.Column(db.String) default_lat = db.Column(db.Float()) default_lng = db.Column(db.Float()) # 0 means don't shard, units are kilometers card_shard_distance = db.Column(db.Integer, default=0) _timezone = db.Column(db.String, default='UTC') _country_code = db.Column(db.String, nullable=False) _default_disbursement_wei = db.Column(db.Numeric(27), default=0) require_transfer_card = db.Column(db.Boolean, default=False) # TODO: Create a mixin so that both user and organisation can use the same definition here # This is the blockchain address used for transfer accounts, unless overridden primary_blockchain_address = db.Column(db.String) # This is the 'behind the scenes' blockchain address used for paying gas fees system_blockchain_address = db.Column(db.String) auto_approve_externally_created_users = db.Column(db.Boolean, default=False) users = db.relationship("User", secondary=organisation_association_table, back_populates="organisations") token_id = db.Column(db.Integer, db.ForeignKey('token.id')) org_level_transfer_account_id = db.Column( db.Integer, db.ForeignKey('transfer_account.id', name="fk_org_level_account")) _minimum_vendor_payout_withdrawal_wei = db.Column(db.Numeric(27), default=0) # We use this weird join pattern because SQLAlchemy # doesn't play nice when doing multiple joins of the same table over different declerative bases org_level_transfer_account = db.relationship( "TransferAccount", post_update=True, primaryjoin= "Organisation.org_level_transfer_account_id==TransferAccount.id", uselist=False) @hybrid_property def timezone(self): return self._timezone @timezone.setter def timezone(self, val): # Make 
the timezone case insensitive lower_zones = dict( zip([tz.lower() for tz in pendulum.timezones], pendulum.timezones)) if val is None or val.lower() not in lower_zones: raise Exception(f"{val} is not a valid timezone") self._timezone = lower_zones[val.lower()] @hybrid_property def country_code(self): return self._country_code @hybrid_property def country(self): if self._country_code not in ISO_COUNTRIES: raise Exception(f"{self._country_code} is not a valid country code") return ISO_COUNTRIES[self._country_code] @country_code.setter def country_code(self, val): if val is not None: val = val.upper() if len(val) != 2: # will try handle 'AD: Andorra' val = val.split(':')[0] if val not in ISO_COUNTRIES: raise Exception(f"{val} is not a valid country code") self._country_code = val @property def default_disbursement(self): return Decimal((self._default_disbursement_wei or 0) / int(1e16)) @default_disbursement.setter def default_disbursement(self, val): if val is not None: self._default_disbursement_wei = int(val) * int(1e16) @property def minimum_vendor_payout_withdrawal(self): return Decimal( (self._minimum_vendor_payout_withdrawal_wei or 0) / int(1e16)) @minimum_vendor_payout_withdrawal.setter def minimum_vendor_payout_withdrawal(self, val): if val is not None: self._minimum_vendor_payout_withdrawal_wei = int(val) * int(1e16) # TODO: This is a hack to get around the fact that org level TAs don't always show up.
Super not ideal @property def queried_org_level_transfer_account(self): if self.org_level_transfer_account_id: return server.models.transfer_account.TransferAccount\ .query.execution_options(show_all=True).get(self.org_level_transfer_account_id) return None @hybrid_property def external_auth_password(self): return decrypt_string(self._external_auth_password) @external_auth_password.setter def external_auth_password(self, value): self._external_auth_password = encrypt_string(value) credit_transfers = db.relationship( "CreditTransfer", secondary=organisation_association_table, back_populates="organisations") transfer_accounts = db.relationship( 'TransferAccount', backref='organisation', lazy=True, foreign_keys='TransferAccount.organisation_id') blockchain_addresses = db.relationship( 'BlockchainAddress', backref='organisation', lazy=True, foreign_keys='BlockchainAddress.organisation_id') email_whitelists = db.relationship( 'EmailWhitelist', backref='organisation', lazy=True, foreign_keys='EmailWhitelist.organisation_id') kyc_applications = db.relationship( 'KycApplication', backref='organisation', lazy=True, foreign_keys='KycApplication.organisation_id') attribute_maps = db.relationship( 'AttributeMap', backref='organisation', lazy=True, foreign_keys='AttributeMap.organisation_id') custom_welcome_message_key = db.Column(db.String) @staticmethod def master_organisation() -> "Organisation": return Organisation.query.filter_by(is_master=True).first() def _setup_org_transfer_account(self): transfer_account = server.models.transfer_account.TransferAccount( bound_entity=self, is_approved=True) db.session.add(transfer_account) self.org_level_transfer_account = transfer_account # Back setup for delayed organisation transfer account instantiation for user in self.users: if AccessControl.has_any_tier(user.roles, 'ADMIN'): user.transfer_accounts.append(self.org_level_transfer_account) def bind_token(self, token): self.token = token self._setup_org_transfer_account() def 
__init__(self, token=None, is_master=False, valid_roles=None, timezone=None, **kwargs): super(Organisation, self).__init__(**kwargs) self.timezone = timezone if timezone else 'UTC' chain = self.token.chain if self.token else current_app.config[ 'DEFAULT_CHAIN'] self.external_auth_username = '******' + self.name.lower().replace( ' ', '_') self.external_auth_password = secrets.token_hex(16) self.valid_roles = valid_roles or list(ASSIGNABLE_TIERS.keys()) if is_master: if Organisation.query.filter_by(is_master=True).first(): raise Exception("A master organisation already exists") self.is_master = True self.system_blockchain_address = bt.create_blockchain_wallet( private_key=current_app.config['CHAINS'][chain] ['MASTER_WALLET_PRIVATE_KEY'], wei_target_balance=0, wei_topup_threshold=0, ) self.primary_blockchain_address = self.system_blockchain_address or bt.create_blockchain_wallet( ) else: self.is_master = False self.system_blockchain_address = bt.create_blockchain_wallet( wei_target_balance=current_app.config['CHAINS'][chain] ['SYSTEM_WALLET_TARGET_BALANCE'], wei_topup_threshold=current_app.config['CHAINS'][chain] ['SYSTEM_WALLET_TOPUP_THRESHOLD'], ) self.primary_blockchain_address = bt.create_blockchain_wallet() if token: self.bind_token(token)
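The `default_disbursement` and `minimum_vendor_payout_withdrawal` properties above divide an integer by `int(1e16)` in float arithmetic before wrapping the result in `Decimal`, which can silently lose precision for large wei values. An exact-arithmetic sketch of the same conversion (helper names hypothetical):

```python
from decimal import Decimal

WEI_PER_UNIT = 10**16

def wei_to_units(wei):
    """Exact conversion: divide as Decimals, never as floats."""
    return Decimal(wei or 0) / Decimal(WEI_PER_UNIT)

def units_to_wei(val):
    """Inverse conversion; accepts str/int/Decimal amounts."""
    return int(Decimal(val) * WEI_PER_UNIT)
```

Routing string or Decimal amounts through `Decimal(val)` also avoids the truncation in the setters above, where `int(val) * int(1e16)` discards any fractional part of `val` before multiplying.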
    def _proc_array(self, arr, itemproc, dim, collection):
        if dim is None:
            if isinstance(self.item_type, CompositeType):
                arr = [itemproc(a) for a in arr]
            return arr
        return ARRAY._proc_array(self, arr, itemproc, dim, collection)
class BaseConcreteCommittee(BaseCommittee):
    __tablename__ = 'ofec_committee_detail_mv'

    candidate_ids = db.Column(ARRAY(db.Text))