class IEventTimingRelay(IEventLogic):
    """ timing relay with trigger- and reset-input and one delayed output """

    timeStart = Datetime(title=_("start time"),
                         description=_("last trigger received at"),
                         default=datetime.datetime(1901, 1, 1, 0, 0),
                         readonly=True,
                         required=False)

    timeDelta = Timedelta(title=_("delay time"),
                          description=_("delayed time for output event"),
                          default=datetime.timedelta(days=1),
                          required=True)

    isRunning = Bool(title=_("timer is running"),
                     default=False)
class IEventTimer(IEventLogic):
    """ timer with start- and stop-input and one pulse output """

    timeNext = Datetime(title=_("next pulse"),
                        description=_("next pulse at"),
                        default=datetime.datetime(1901, 1, 1, 0, 0),
                        readonly=True,
                        required=False)

    timePulse = Timedelta(title=_("pulse time"),
                          description=_("pulse width for output event"),
                          default=datetime.timedelta(seconds=60),
                          required=True)

    isRunning = Bool(title=_("timer is running"),
                     default=False)

    def removeInvalidOidFromInpOutObjects(self):
        """ delete all invalid oids """
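# A minimal sketch (not part of the interfaces above) of the pulse logic
# suggested by IEventTimer: while the timer is running, an output event is
# emitted whenever timeNext is reached and the next pulse is scheduled
# timePulse later.  IEventTimingRelay works analogously with a single
# one-shot delay of timeDelta after the last trigger.  The class and method
# names below (EventTimer, checkTime, pulse) are assumptions for
# illustration only.
import datetime


class EventTimer(object):

    timePulse = datetime.timedelta(seconds=60)
    timeNext = datetime.datetime(1901, 1, 1, 0, 0)
    isRunning = False

    def start(self, now=None):
        now = now or datetime.datetime.utcnow()
        self.isRunning = True
        self.timeNext = now + self.timePulse

    def stop(self):
        self.isRunning = False

    def checkTime(self, now=None):
        # called periodically; emits a pulse whenever timeNext is reached
        now = now or datetime.datetime.utcnow()
        if self.isRunning and now >= self.timeNext:
            self.pulse()
            self.timeNext = now + self.timePulse

    def pulse(self):
        # assumed hook that raises the output event
        pass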
class ISiteRecord(Interface):
    """ describes a record of a site purchase """

    id = TextLine(title=u'Site ID')
    template = TextLine(title=u'Template ID used to create this site')
    purchased = Datetime(title=u'Initial purchase date')
    paid = Datetime(title=u'Last payment date')
    transaction_id = TextLine(title=u'Most recent transaction id')
    payment_agent = TextLine(title=u'Payment system used')
    term = Timedelta(title=u'Purchase duration')
    created = Bool(title=u'Site has been created')

    def mark_paid():
        """ record datetime.datetime.utcnow as most recent payment datetime """

    def mark_created():
        """ record that this site has been created """

    def is_expired():
        """ return true if the expiration date for this site has passed """
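# A minimal sketch of the bookkeeping implied by ISiteRecord, assuming the
# expiration date is the last payment date plus the purchase duration
# (`paid + term`).  SiteRecord is a hypothetical implementation class and
# omits the fields not needed here.
import datetime


class SiteRecord(object):

    def __init__(self, term):
        self.purchased = datetime.datetime.utcnow()
        self.paid = self.purchased
        self.term = term
        self.created = False

    def mark_paid(self):
        # record datetime.datetime.utcnow as most recent payment datetime
        self.paid = datetime.datetime.utcnow()

    def mark_created(self):
        self.created = True

    def is_expired(self):
        # true once the assumed expiration date (paid + term) has passed
        return datetime.datetime.utcnow() > self.paid + self.term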
class IJobRecord(Interface):
    """ """

    jobid = TextLine(
        title=u"Job ID",
        description=u"The Job's unique identifier",
    )
    name = TextLine(
        title=u"Name",
        description=u"The full class name of the job",
    )
    summary = TextLine(
        title=u"Summary",
        description=u"A brief and general summary of the job's function",
    )
    description = TextLine(
        title=u"Description",
        description=u"A description of what this job will do",
    )
    userid = TextLine(
        title=u"User ID",
        description=u"The user that created the job",
    )
    logfile = TextLine(
        title=u"Logfile",
        description=u"Path to this job's log file.",
    )
    status = Choice(
        title=u"Status",
        description=u"The current status of the job",
        vocabulary=SimpleVocabulary.fromValues(states.ALL_STATES),
    )
    created = Datetime(title=u"Created", description=u"When the job was created")
    started = Datetime(title=u"Started", description=u"When the job began executing")
    finished = Datetime(title=u"Finished", description=u"When the job finished executing")
    duration = Timedelta(title=u"Duration", description=u"How long the job has run")
    complete = Bool(
        title=u"Complete",
        description=u"True if the job has finished running",
    )

    def abort():
        """Abort the job."""

    def wait(timeout=10.0):
        """Wait until the job has completed or the timeout duration has
        elapsed.
        """
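# A hedged sketch of the wait() semantics described above: poll the job's
# `complete` flag until it is set or the timeout elapses.  The helper name,
# the polling interval, and the boolean return value are assumptions, not
# dictated by IJobRecord.
import time


def wait_for_job(job, timeout=10.0, poll=0.5):
    deadline = time.time() + timeout
    while time.time() < deadline:
        if job.complete:
            return True
        time.sleep(poll)
    # report whether the job managed to finish before the deadline
    return job.complete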
class IContract(Interface):
    """A Contract object."""

    type = Choice(title=_(u'Contract type'),
                  description=_(u"Contract type"),
                  required=True,
                  vocabulary="ContractTypes")
    startDate = Date(title=_(u'start date'),
                     description=_(u"contract start date"),
                     required=False)
    state = Choice(title=_(u'State'),
                   description=_(u"State"),
                   required=False,
                   vocabulary="ContractState")
    expirationDate = Date(title=_(u'expiration date'),
                          description=_(u"expiration of contract"),
                          required=False)
    annualCharges = PhysicalQuantity(title=_(u'annual charges'),
                                     description=_(u"annual charges"),
                                     required=False)
    internalContractNumber = TextLine(
        title=_(u'internal contract number'),
        description=_(u"internal contract number"),
        required=False)
    externalContractNumber = TextLine(
        title=_(u'external contract number'),
        description=_(u"external contract number"),
        required=False)
    periodOfNotice = Timedelta(title=_(u'period of notice'),
                               description=_(u"period of notice"),
                               required=False)
    minimumTerm = Timedelta(title=_(u'minimum term'),
                            description=_(u"minimum term"),
                            required=False)
    contractors = List(title=_(u'Contractors'),
                       description=_(u"Contractors"),
                       value_type=Choice(vocabulary='AllContactItems'),
                       default=[],
                       required=False)
    responsibles = List(title=_(u'Responsibles'),
                        description=_(u"Responsibles"),
                        value_type=Choice(vocabulary='AllContactItems'),
                        required=False)
    component = Choice(title=_(u'Component'),
                       vocabulary='AllComponents',
                       required=False)

    @invariant
    def ensureAnnualChargesUnit(contract):
        if contract.annualCharges is not None:
            annualCharges = convertQuantity(contract.annualCharges)
            if not annualCharges.isCurrency():
                raise Invalid(
                    "No currency specification: '%s'." %
                    (contract.annualCharges))

    def trigger_online():
        """ """
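# Invariants declared with @invariant, such as ensureAnnualChargesUnit above,
# are not enforced on attribute assignment; they are checked explicitly via
# zope.interface's validateInvariants, typically by a form framework.
# A minimal sketch of such a check, assuming `contract` provides IContract:
from zope.interface import Invalid


def check_contract(contract):
    try:
        IContract.validateInvariants(contract)
    except Invalid as e:
        # e.g. "No currency specification: '...'"
        return False, str(e)
    return True, None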
class ISamlAuthoritySchema(IItemSchema):
    """Parameters configuring a Saml authority (aka entity)."""

    entity_id = IriRef(
        title=_(u'entity_id_title', u'Entity id'),
        description=_(
            u'entity_id_description',
            u"""The id identifying this entity/authority.""",
        ),
        required=True,
    )

    certificate = FilesystemPath(
        title=_(u'certificate_title', u'Certificate'),
        description=_(
            u'certificate_description',
            u"""`clienthome` relative or absolute path to the (DER) file
            containing the certificate corresponding to 'Private key'.""",
        ),
        required=False,
    )

    future_certificate = FilesystemPath(
        title=_(u'future_certificate_title', u'Future certificate'),
        description=_(
            u'future_certificate_description',
            u"""`clienthome` relative or absolute path to the (DER) file
            containing the certificate you plan to use in the near future.

            As other SAML2 authorities rely on the certificates published in
            your metadata to verify your signatures, you cannot simply change
            your private/public key pair and publish a new certificate: until
            the other parties have updated their metadata for you, they would
            be unable to verify signatures made with your new private key.
            This field allows you to publish in advance the certificate for a
            new private key you plan to use in the near future. If you ensure
            a sufficient delay, they can be prepared for the key change.
            """,
        ),
        required=False,
    )

    private_key = FilesystemPath(
        title=_(u'private_key_title', u'Private key'),
        description=_(
            u'private_key_description',
            u"""`clienthome` relative or absolute path to the (PEM) file
            containing the private key used for signing/encryption."""
        ),
        required=False,
    )

    private_key_password = Password(
        title=_(u'private_key_password_title', u'Private key password'),
        description=_(u'private_key_password_description',
                      u"""Password used to encrypt the private key"""),
        required=False)

    base_url = URI(
        title=_(u'base_url_title', u'Base url'),
        description=_(
            u'base_url_description',
            u"""A Zope system is often used via different urls (e.g. directly
            versus indirectly via a Web server). The urls of internal objects
            change accordingly. The authority generates and distributes
            metadata involving urls. These must remain reliable and must not
            change inadvertently. Therefore, the base url is not derived
            automatically from the (varying) urls but specified by this
            attribute."""
        ),
        required=True,
    )

    metadata_validity = Timedelta(
        title=_(u'metadata_validity_title', u'Metadata validity'),
        description=_(u'metadata_validity_description',
                      u"""Validity period of generated metadata."""),
        required=True,
        default=timedelta(1),  # 1 day
    )
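# A sketch of how a metadata validity period like the one above is typically
# consumed: generated metadata carries a validity timestamp computed from
# "now" plus the configured Timedelta.  make_valid_until is a hypothetical
# helper, not part of the schema.
from datetime import datetime, timedelta


def make_valid_until(metadata_validity=timedelta(1)):
    # with the default of 1 day, metadata expires 24 hours after generation
    return datetime.utcnow() + metadata_validity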
class ISimpleSpssoPluginSchema(IItemSchema):
    """schema describing a PAS Plugin working together with a SimpleSpsso."""

    idp_cookie_name = ASCIILine(
        title=_(u"idp_cookie_name_title", u"Identity provider cookie name"),
        description=_(
            u"idp_cookie_name_description",
            u"The name of the cookie used to remember the user's identity provider"
        ),
        required=False,
        default="idp_id",
    )

    idp_cookie_path = ItemPath(
        title=_(u"idp_cookie_path_title", u"Identity provider cookie path"),
        description=_(
            u"idp_cookie_path_description",
            u"The path of the cookie used to remember the user's identity provider; defaults to the portal"
        ),
        required=False,
        default="/",
    )

    idp_cookie_domain = ASCIILine(
        title=_(u"idp_cookie_domain_title", u"Identity provider cookie domain"),
        description=_(
            u"idp_cookie_domain_description",
            u"The domain of the cookie used to remember the user's identity provider"
        ),
        required=False,
    )

    idp_cookie_lifetime = Timedelta(
        title=_(u"idp_cookie_lifetime_title",
                u"Identity provider cookie lifetime"),
        description=_(
            u"idp_cookie_lifetime_description",
            u"The lifetime of the cookie used to remember the user's identity provider. "
            u"If not specified, this is a session cookie. "
            u"Example values are `1d` (1 day), `3600s` (1 hour)."
        ),
        required=False,
        default=timedelta(360),  # 360 days, roughly 1 year
    )

    default_idp = IriRef(
        title=_(u"default_idp_title", u"Default identity provider"),
        description=_(u"default_idp_description",
                      u"The idp used as default in idp selection"),
        required=False,
    )

    failure_view = ASCIILine(
        title=_(u"failure_view_title", u"Failure view"),
        description=_(u"failure_view_description",
                      u"The view presenting SAML problems."),
        required=True,
        default="@@saml_failure",
    )

    select_idp_view = ASCIILine(
        title=_(u"select_idp_view_title", u"Select idp view"),
        description=_(u"select_idp_view_description",
                      u"The view used to select an identity provider."),
        required=True,
        default="@@saml_select_idp",
    )
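# A sketch of the cookie-lifetime behaviour described above: when
# idp_cookie_lifetime is set, the idp cookie gets an explicit expiry derived
# from the configured Timedelta; otherwise it is left as a session cookie.
# set_idp_cookie and the `response` object (assumed to offer Zope's
# setCookie API) are assumptions for illustration.
from datetime import datetime


def set_idp_cookie(response, idp_id, name="idp_id", path="/",
                   domain=None, lifetime=None):
    kw = dict(path=path)
    if domain:
        kw["domain"] = domain
    if lifetime is not None:
        # explicit expiry derived from the configured cookie lifetime
        kw["expires"] = (datetime.utcnow() + lifetime).strftime(
            "%a, %d %b %Y %H:%M:%S GMT")
    response.setCookie(name, idp_id, **kw)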
class ICodeImport(Interface):
    """A code import to a Bazaar Branch."""

    export_as_webservice_entry()

    id = Int(readonly=True, required=True)

    date_created = Datetime(
        title=_("Date Created"), required=True, readonly=True)

    branch = exported(
        ReferenceChoice(title=_('Branch'), required=True, readonly=True,
                        vocabulary='Branch', schema=IBranch,
                        description=_("The Bazaar branch produced by the "
                                      "import system.")))

    registrant = PublicPersonChoice(
        title=_('Registrant'), required=True, readonly=True,
        vocabulary='ValidPersonOrTeam',
        description=_("The person who initially requested this import."))

    review_status = exported(
        Choice(title=_("Review Status"), vocabulary=CodeImportReviewStatus,
               default=CodeImportReviewStatus.REVIEWED, readonly=True,
               description=_("Only reviewed imports are processed.")))

    rcs_type = exported(
        Choice(title=_("Type of RCS"), readonly=True, required=True,
               vocabulary=RevisionControlSystems,
               description=_("The version control system to import from. "
                             "Can be CVS or Subversion.")))

    url = exported(
        URIField(title=_("URL"), required=False, readonly=True,
                 description=_("The URL of the VCS branch."),
                 allowed_schemes=["http", "https", "svn", "git", "bzr", "ftp"],
                 allow_userinfo=True,
                 allow_port=True,
                 allow_query=False,     # Query makes no sense in Subversion.
                 allow_fragment=False,  # Fragment makes no sense in Subversion.
                 trailing_slash=False))  # See http://launchpad.net/bugs/56357.

    cvs_root = exported(
        TextLine(title=_("Repository"), required=False, readonly=True,
                 constraint=validate_cvs_root,
                 description=_(
                     "The CVSROOT. "
                     "Example: :pserver:[email protected]:/cvs/gnome")))

    cvs_module = exported(
        TextLine(title=_("Module"), required=False, readonly=True,
                 constraint=validate_cvs_module,
                 description=_("The path to import within the repository."
                               " Usually, it is the name of the project.")))

    date_last_successful = exported(
        Datetime(title=_("Last successful"), required=False, readonly=True))

    update_interval = Timedelta(
        title=_("Update interval"), required=False,
        description=_(
            "The user-specified time between automatic updates of this import. "
            "If this is unspecified, the effective update interval is a default "
            "value selected by Launchpad administrators."))

    effective_update_interval = Timedelta(
        title=_("Effective update interval"), required=True, readonly=True,
        description=_(
            "The effective time between automatic updates of this import. "
            "If the user did not specify an update interval, this is a default "
            "value selected by Launchpad administrators."))

    def getImportDetailsForDisplay():
        """Get a one-line summary of the location this import is from."""

    import_job = Choice(
        title=_("Current job"),
        readonly=True, vocabulary='CodeImportJob',
        description=_(
            "The current job for this import, either pending or running."))

    results = Attribute("The results for this code import.")

    consecutive_failure_count = Attribute(
        "How many times in a row this import has failed.")

    def updateFromData(data, user):
        """Modify attributes of the `CodeImport`.

        Creates and returns a MODIFY `CodeImportEvent` if changes were made.

        This method preserves the invariant that a `CodeImportJob` exists
        for a given import if and only if its review_status is REVIEWED,
        creating and deleting jobs as necessary.

        :param data: dictionary whose keys are attribute names and values
            are attribute values.
        :param user: user who made the change, to record in the
            `CodeImportEvent`.  May be ``None``.
        :return: The MODIFY `CodeImportEvent`, if any changes were made, or
            None if no changes were made.
        """

    def tryFailingImportAgain(user):
        """Try a failing import again.

        This method sets the review_status back to REVIEWED and requests the
        import be attempted as soon as possible.

        The import must be in the FAILING state.

        :param user: the user who is requesting the import be tried again.
        """

    @call_with(requester=REQUEST_USER)
    @export_write_operation()
    def requestImport(requester, error_if_already_requested=False):
        """Request that an import be tried soon."""
class IBuildFarmJob(Interface):
    """Operations that jobs for the build farm must implement."""

    export_as_webservice_entry(as_of='beta')

    id = Attribute('The build farm job ID.')

    build_farm_job = Attribute('Generic build farm job record')

    processor = Reference(
        IProcessor, title=_("Processor"), required=False, readonly=True,
        description=_(
            "The Processor required by this build farm job. "
            "This should be None for processor-independent job types."))

    virtualized = Bool(
        title=_('Virtualized'), required=False, readonly=True,
        description=_(
            "The virtualization setting required by this build farm job. "
            "This should be None for job types that do not care whether "
            "they run virtualized."))

    date_created = exported(
        Datetime(
            title=_("Date created"), required=True, readonly=True,
            description=_(
                "The timestamp when the build farm job was created.")),
        ("1.0", dict(exported_as="datecreated")),
        as_of="beta",
        )

    date_started = exported(
        Datetime(
            title=_("Date started"), required=False, readonly=True,
            description=_(
                "The timestamp when the build farm job was started.")),
        as_of="devel")

    date_finished = exported(
        Datetime(
            title=_("Date finished"), required=False, readonly=True,
            description=_(
                "The timestamp when the build farm job was finished.")),
        ("1.0", dict(exported_as="datebuilt")),
        as_of="beta",
        )

    duration = exported(
        Timedelta(
            title=_("Duration"), required=False, readonly=True,
            description=_(
                "Duration interval, calculated when the "
                "result gets collected.")),
        as_of="devel")

    date_first_dispatched = exported(
        Datetime(
            title=_("Date first dispatched"), required=False, readonly=True,
            description=_(
                "The actual build start time. Set when the build "
                "is dispatched the first time and not changed in "
                "subsequent build attempts.")))

    builder = exported(
        Reference(
            title=_("Builder"), schema=IBuilder, required=False,
            readonly=True,
            description=_("The builder assigned to this job.")))

    buildqueue_record = Reference(
        # Really IBuildQueue, set in _schema_circular_imports to avoid
        # circular import.
        schema=Interface, required=True,
        title=_("Corresponding BuildQueue record"))

    status = exported(
        Choice(
            title=_('Status'), required=True, vocabulary=BuildStatus,
            description=_("The current status of the job.")),
        ("1.0", dict(exported_as="buildstate")),
        as_of="beta",
        )

    log = Reference(
        schema=ILibraryFileAlias, required=False,
        title=_(
            "The LibraryFileAlias containing the entire log for this job."))

    log_url = exported(
        TextLine(
            title=_("Build Log URL"), required=False,
            description=_("A URL for the build log. None if there is no "
                          "log available.")),
        ("1.0", dict(exported_as="build_log_url")),
        as_of="beta",
        )

    is_private = Bool(
        title=_("is private"), required=False, readonly=True,
        description=_("Whether the build should be treated as private."))

    job_type = Choice(
        title=_("Job type"), required=True, readonly=True,
        vocabulary=BuildFarmJobType,
        description=_("The specific type of job."))

    build_cookie = Attribute(
        "A string which uniquely identifies the job in the build farm.")

    failure_count = Int(
        title=_("Failure Count"), required=False, readonly=True,
        default=0,
        description=_("Number of consecutive failures for this job."))

    def setLog(log):
        """Set the `LibraryFileAlias` that contains the job log."""

    def updateStatus(status, builder=None, slave_status=None,
                     date_started=None, date_finished=None,
                     force_invalid_transition=False):
        """Update job metadata when the build status changes.

        This automatically handles setting status, date_finished, builder,
        dependencies. Later it will manage the denormalised search schema.

        date_started and date_finished override the default (now).

        Only sensible transitions are permitted unless
        force_invalid_transition is set. The override only exists for tests
        and as an escape hatch for buildd-manager's failure counting. You do
        not want to use it.
        """

    def gotFailure():
        """Increment the failure_count for this job."""

    def calculateScore():
        """Calculate the build queue priority for this job."""

    def estimateDuration():
        """Estimate the build duration."""

    def queueBuild(suspended=False):
        """Create a BuildQueue entry for this build.

        :param suspended: Whether the associated `Job` instance should be
            created in a suspended state.
        """

    title = exported(TextLine(title=_("Title"), required=False),
                     as_of="beta")

    was_built = Attribute("Whether or not modified by the build farm.")

    # This doesn't belong here. It really belongs in IPackageBuild, but
    # the TAL assumes it can read this directly.
    dependencies = exported(
        TextLine(
            title=_('Dependencies'), required=False,
            description=_(
                'Debian-like dependency line that must be satisfied before '
                'attempting to build this request.')),
        as_of="beta")

    # Only really used by IBinaryPackageBuild, but
    # get_sources_list_for_building looks up this attribute for all build
    # types.
    external_dependencies = Attribute(
        "Newline-separated list of repositories to be used to retrieve any "
        "external build-dependencies when performing this build.")
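# A minimal sketch of the bookkeeping that updateStatus describes: record the
# start and finish timestamps and derive `duration` from them when the result
# is collected.  The helper below is an illustration only; it ignores the
# status-transition checks and the builder/dependencies handling.
from datetime import datetime


def record_finish(build, status, date_started=None, date_finished=None):
    build.status = status
    # date_started and date_finished override the default (now)
    build.date_started = (date_started or build.date_started
                          or datetime.utcnow())
    build.date_finished = date_finished or datetime.utcnow()
    # duration interval, calculated when the result gets collected
    build.duration = build.date_finished - build.date_started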