def parseTimeStr(self, key):
    time = Time()
    value = self._args[key][0]
    for fmt in RequestOptions.TimeFormats:
        if time.fromString(value, fmt):
            break
    if not time.valid():
        raise ValueError("invalid date format in parameter: %s" % key)
    return time
def cleanup(self):
    self._cleanupCounter += 1
    if self._cleanupCounter < 5:
        return

    debug("before cleanup:")
    debug("  _state          %d" % len(self._state))
    debug("  _origin         %d" % len(self._origin))
    debug("  _magnitude      %d" % len(self._magnitude))
    debug("  _focalmechanism %d" % len(self._focalmechanism))
    debug("  public object count %d" % (PublicObject.ObjectCount()))

    # we first remove those origins and magnitudes, which are
    # older than one hour and are not preferred anywhere.
    limit = Time.GMT() + TimeSpan(-self._cleanup_interval)

    # iterate over a copy of the keys so entries can be deleted safely
    originIDs = list(self._origin.keys())
    for oid in originIDs:
        if self._origin[oid].creationInfo().creationTime() < limit:
            del self._origin[oid]

    magnitudeIDs = list(self._magnitude.keys())
    for oid in magnitudeIDs:
        if self._magnitude[oid] is None:
            # This should actually never happen!
            error("Magnitude %s is None!" % oid)
            del self._magnitude[oid]
            continue
        if self._magnitude[oid].creationInfo().creationTime() < limit:
            del self._magnitude[oid]

    focalmechanismIDs = list(self._focalmechanism.keys())
    for oid in focalmechanismIDs:
        if self._focalmechanism[oid].creationInfo().creationTime() < limit:
            del self._focalmechanism[oid]

    # finally remove all remaining objects older than two hours
    limit = Time.GMT() + TimeSpan(-2 * self._cleanup_interval)
    to_delete = []
    for evid in self._state:
        org = self._state[evid].origin
        if org and org.time().value() > limit:
            continue  # nothing to do with this event
        to_delete.append(evid)
    for evid in to_delete:
        del self._state[evid]

    debug("after cleanup:")
    debug("  _state          %d" % len(self._state))
    debug("  _origin         %d" % len(self._origin))
    debug("  _magnitude      %d" % len(self._magnitude))
    debug("  _focalmechanism %d" % len(self._focalmechanism))
    debug("  public object count %d" % (PublicObject.ObjectCount()))
    debug("-------------------------------")
    self._cleanupCounter = 0
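The age-based pruning above can be reduced to a self-contained sketch: entries whose timestamp falls before a cutoff are deleted, and iterating over a copy of the keys keeps the deletion safe. The names here are illustrative, not part of the module.

```python
from datetime import datetime, timedelta

def prune_older_than(cache, cutoff):
    # iterate over a key snapshot so deleting entries is safe
    for key in list(cache.keys()):
        if cache[key] < cutoff:
            del cache[key]
    return cache
```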
def parseTimeStr(self, keys):
    key, value = self.getFirstValue(keys)

    if value is None:
        return None

    time = Time()
    for fmt in RequestOptions.TimeFormats:
        if time.fromString(value, fmt):
            break
    if not time.valid():
        raise ValueError("invalid date format in parameter: %s" % key)
    return time
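The try-each-format loop used by parseTimeStr can be illustrated with the standard library instead of the SeisComP Time class; the format list below is an assumption modeled on typical FDSN date inputs, and the function name is hypothetical.

```python
from datetime import datetime

TIME_FORMATS = ["%Y-%m-%dT%H:%M:%S.%f", "%Y-%m-%dT%H:%M:%S", "%Y-%m-%d"]

def parse_time_str(value):
    # try each accepted format in order; first successful parse wins
    for fmt in TIME_FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError("invalid date format: %s" % value)
```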
def _checkTimes(self, realtimeGap):
    maxEndTime = Time(self.accessTime)
    if realtimeGap is not None:
        maxEndTime -= Time(realtimeGap, 0)

    for ro in self.streams:
        # create time if none was specified
        if ro.time is None:
            ro.time = RequestOptions.Time()
        # restrict time to 1970 - now
        if ro.time.start is None or ro.time.start < self.MinTime:
            ro.time.start = self.MinTime
        if ro.time.end is None or ro.time.end > maxEndTime:
            ro.time.end = maxEndTime

    # remove items with start time >= end time
    self.streams = [x for x in self.streams if x.time.start < x.time.end]
def _addStream(self, ro, streams, toks, lastFileName):
    start, end = Time(), Time()
    if start.fromString("%s.%s" % (toks[4], toks[5]), "%Y.%j") and \
       end.fromString(lastFileName[-8:] + "23:59:59", "%Y.%j%T"):
        # match time
        if ro.time.start > end or \
           (ro.time.end and ro.time.end < start):
            return

        # limit time to requested time
        if ro.time.start > start:
            start = ro.time.start
        if ro.time.end and ro.time.end < end:
            end = ro.time.end

        streams.append((toks[1], toks[2], start, end))
    else:
        Logging.warning("invalid stream information: %s%s.%s" % (
            toks[0], toks[1], toks[2]))
def __init__(self):
    self.service = ""
    self.accessTime = Time.GMT()
    self.userName = None
    self.time = None
    self.channel = None
    self.geo = None
    self.noData = http.NO_CONTENT
    self.format = None
    self._args = {}
    self.streams = []  # 1 entry for GET, multiple entries for POST
def __init__(self, req, ro, code, length, err):
    # user agent
    agent = req.getHeader("User-Agent")
    if agent is None:
        agent = ""
    else:
        agent = agent[:100].replace('|', ' ')

    if err is None:
        err = ""

    service, user, accessTime, procTime = "", "", "", 0
    net, sta, loc, cha = "", "", "", ""
    if ro is not None:
        # processing time in milliseconds
        procTime = int((Time.GMT() - ro.accessTime).length() * 1000.0)
        service = ro.service
        if ro.userName is not None:
            user = ro.userName
        accessTime = str(ro.accessTime)

        if ro.channel is not None:
            if ro.channel.net is not None:
                net = ",".join(ro.channel.net)
            if ro.channel.sta is not None:
                sta = ",".join(ro.channel.sta)
            if ro.channel.loc is not None:
                loc = ",".join(ro.channel.loc)
            if ro.channel.cha is not None:
                cha = ",".join(ro.channel.cha)

    # The host name of the client is resolved in the __str__ method by the
    # logging thread so that a long running DNS reverse lookup may not slow
    # down the request
    self.msgPrefix = "%s|%s|%s|" % (service, req.getRequestHostname(),
                                    accessTime)

    xff = req.requestHeaders.getRawHeaders("x-forwarded-for")
    if xff:
        self.userIP = xff[0].split(",")[0].strip()
    else:
        self.userIP = req.getClientIP()
    self.clientIP = req.getClientIP()

    self.msgSuffix = "|%s|%i|%i|%s|%s|%i|%s|%s|%s|%s|%s||" % (
        self.clientIP, length, procTime, err, agent, code, user, net, sta,
        loc, cha)
def __init__(self, args=None):
    self.service = ""
    self.accessTime = Time.GMT()
    self.userName = None
    self.time = None
    self.channel = None
    self.geo = None
    self.noData = http.NO_CONTENT
    self.format = None

    # transform keys to lower case
    self._args = {}
    if args is not None:
        for key in args.keys():
            self._args[key.lower()] = args[key]

    self.streams = []  # 1 entry for GET, multiple entries for POST
def __init__(self, args=None):
    self.service = ""
    self.accessTime = Time.GMT()
    self.userName = None
    self.time = None
    self.channel = None
    self.geo = None
    self.noData = http.NO_CONTENT
    self.format = None

    # transform keys to lower case
    self._args = {}
    if args is not None:
        for k, v in iteritems(args):
            self._args[py3ustr(k.lower())] = py3ustrlist(v)

    self.streams = []  # 1 entry for GET, multiple entries for POST
class _StationRequestOptions(RequestOptions):

    Exporters = {
        'xml': 'fdsnxml',
        'fdsnxml': 'fdsnxml',
        'stationxml': 'staxml',
        'sc3ml': 'trunk'
    }
    MinTime = Time(0, 1)

    VText = ['text']
    OutputFormats = list(Exporters.keys()) + VText

    PLevel = ['level']
    PIncludeRestricted = ['includerestricted']
    PIncludeAvailability = ['includeavailability']
    PUpdateAfter = ['updateafter']
    PMatchTimeSeries = ['matchtimeseries']

    # non standard parameters
    PFormatted = ['formatted']

    POSTParams = RequestOptions.POSTParams + RequestOptions.GeoParams + \
        PLevel + PIncludeRestricted + PIncludeAvailability + \
        PUpdateAfter + PMatchTimeSeries + PFormatted

    #---------------------------------------------------------------------------
    def __init__(self, args=None):
        RequestOptions.__init__(self, args)
        self.service = 'fdsnws-station'

        self.includeSta = True
        self.includeCha = False
        self.includeRes = False
        self.restricted = None
        self.availability = None
        self.updatedAfter = None
        self.matchTimeSeries = None

        # non standard parameters
        self.formatted = None

    #---------------------------------------------------------------------------
    def parse(self):
        self.parseTime(True)
        self.parseChannel()
        self.parseGeo()
        self.parseOutput()

        # level: [network, station, channel, response]
        key, value = self.getFirstValue(self.PLevel)
        if value is not None:
            value = value.lower()
            if value == 'network' or value == 'net':
                self.includeSta = False
            elif value == 'channel' or value == 'cha' or value == 'chan':
                self.includeCha = True
            elif value == 'response' or value == 'res' or value == 'resp':
                self.includeCha = True
                self.includeRes = True
            elif value != 'station' and value != 'sta':
                self.raiseValueError(key)

        # includeRestricted (optional)
        self.restricted = self.parseBool(self.PIncludeRestricted)

        # includeAvailability (optional)
        self.availability = self.parseBool(self.PIncludeAvailability)

        # updatedAfter (optional), currently not supported
        self.updatedAfter = self.parseTimeStr(self.PUpdateAfter)

        # matchTimeSeries (optional)
        self.matchTimeSeries = self.parseBool(self.PMatchTimeSeries)

        # format XML
        self.formatted = self.parseBool(self.PFormatted)

    #---------------------------------------------------------------------------
    def networkIter(self, inv, matchTime=False):
        for i in xrange(inv.networkCount()):
            net = inv.network(i)
            for ro in self.streams:
                # network code
                if ro.channel and not ro.channel.matchNet(net.code()):
                    continue
                # start and end time
                if matchTime and ro.time:
                    try:
                        end = net.end()
                    except ValueError:
                        end = None
                    if not ro.time.match(net.start(), end):
                        continue
                yield net
                break

    #---------------------------------------------------------------------------
    def stationIter(self, net, matchTime=False):
        for i in xrange(net.stationCount()):
            sta = net.station(i)
            # geographic location
            if self.geo:
                try:
                    lat = sta.latitude()
                    lon = sta.longitude()
                except ValueError:
                    continue
                if not self.geo.match(lat, lon):
                    continue
            for ro in self.streams:
                # station code
                if ro.channel and (not ro.channel.matchSta(sta.code()) or
                                   not ro.channel.matchNet(net.code())):
                    continue
                # start and end time
                if matchTime and ro.time:
                    try:
                        end = sta.end()
                    except ValueError:
                        end = None
                    if not ro.time.match(sta.start(), end):
                        continue
                yield sta
                break

    #---------------------------------------------------------------------------
    def locationIter(self, net, sta, matchTime=False):
        for i in xrange(sta.sensorLocationCount()):
            loc = sta.sensorLocation(i)
            for ro in self.streams:
                # location code
                if ro.channel and (not ro.channel.matchLoc(loc.code()) or
                                   not ro.channel.matchSta(sta.code()) or
                                   not ro.channel.matchNet(net.code())):
                    continue
                # start and end time
                if matchTime and ro.time:
                    try:
                        end = loc.end()
                    except ValueError:
                        end = None
                    if not ro.time.match(loc.start(), end):
                        continue
                yield loc
                break

    #---------------------------------------------------------------------------
    def streamIter(self, net, sta, loc, matchTime, dac):
        for i in xrange(loc.streamCount()):
            stream = loc.stream(i)
            for ro in self.streams:
                # stream code
                if ro.channel and (not ro.channel.matchCha(stream.code()) or
                                   not ro.channel.matchLoc(loc.code()) or
                                   not ro.channel.matchSta(sta.code()) or
                                   not ro.channel.matchNet(net.code())):
                    continue
                # start and end time
                if matchTime and ro.time:
                    try:
                        end = stream.end()
                    except ValueError:
                        end = None
                    if not ro.time.match(stream.start(), end):
                        continue
                # match data availability extent
                if dac is not None and ro.matchTimeSeries:
                    extent = dac.extent(net.code(), sta.code(), loc.code(),
                                        stream.code())
                    if extent is None or (ro.time and not ro.time.match(
                            extent.start(), extent.end())):
                        continue
                yield stream
                break
class _DataSelectRequestOptions(RequestOptions):

    MinTime = Time(0, 1)

    PQuality = ['quality']
    PMinimumLength = ['minimumlength']
    PLongestOnly = ['longestonly']

    QualityValues = ['B', 'D', 'M', 'Q', 'R']
    OutputFormats = ['miniseed', 'mseed']

    POSTParams = RequestOptions.POSTParams + \
        PQuality + PMinimumLength + PLongestOnly

    #---------------------------------------------------------------------------
    def __init__(self, args=None):
        RequestOptions.__init__(self, args)
        self.service = 'fdsnws-dataselect'

        self.quality = self.QualityValues[0]
        self.minimumLength = None
        self.longestOnly = None

    #---------------------------------------------------------------------------
    def _checkTimes(self, realtimeGap):
        maxEndTime = Time(self.accessTime)
        if realtimeGap is not None:
            maxEndTime -= Time(realtimeGap, 0)

        for ro in self.streams:
            # create time if none was specified
            if ro.time is None:
                ro.time = RequestOptions.Time()
            # restrict time to 1970 - now
            if ro.time.start is None or ro.time.start < self.MinTime:
                ro.time.start = self.MinTime
            if ro.time.end is None or ro.time.end > maxEndTime:
                ro.time.end = maxEndTime

        # remove items with start time >= end time
        self.streams = [x for x in self.streams if x.time.start < x.time.end]

    #---------------------------------------------------------------------------
    def parse(self):
        # quality (optional), currently not supported
        key, value = self.getFirstValue(self.PQuality)
        if value is not None:
            value = value.upper()
            if value in self.QualityValues:
                self.quality = value
            else:
                self.raiseValueError(key)

        # minimumlength (optional), currently not supported
        self.minimumLength = self.parseFloat(self.PMinimumLength, 0)

        # longestonly (optional), currently not supported
        self.longestOnly = self.parseBool(self.PLongestOnly)

        # generic parameters
        self.parseTime()
        self.parseChannel()
        self.parseOutput()
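The time-window clamping in _checkTimes can be sketched standalone with datetime: missing bounds are filled with 1970 and "now minus realtimeGap", and windows that end before they start are dropped. All names here are illustrative, not the module's API.

```python
from datetime import datetime, timedelta

MIN_TIME = datetime(1970, 1, 1)

def check_times(windows, access_time, realtime_gap=None):
    # latest allowed end time: the access time, minus an optional gap
    max_end = access_time
    if realtime_gap is not None:
        max_end -= timedelta(seconds=realtime_gap)
    clamped = []
    for start, end in windows:
        if start is None or start < MIN_TIME:
            start = MIN_TIME
        if end is None or end > max_end:
            end = max_end
        if start < end:  # drop empty or inverted windows
            clamped.append((start, end))
    return clamped
```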
def parsePOST(self, content):
    nLine = 0

    for line in content:
        nLine += 1
        line = line.strip()

        # ignore empty and comment lines
        if len(line) == 0 or line[0] == '#':
            continue

        # collect parameter (non stream lines)
        toks = line.split("=", 1)
        if len(toks) > 1:
            key = toks[0].strip().lower()

            isPOSTParam = False
            for p in self.POSTParams:
                if p == key:
                    if key not in self._args:
                        self._args[key] = []
                    self._args[key].append(toks[1].strip())
                    isPOSTParam = True
                    break

            if isPOSTParam:
                continue

            # time parameters not allowed in POST header
            for p in self.TimeParams:
                if p == key:
                    raise ValueError("time parameter in line %i not "
                                     "allowed in POST request" % nLine)

            # stream parameters not allowed in POST header
            for p in self.StreamParams:
                if p == key:
                    raise ValueError("stream parameter in line %i not "
                                     "allowed in POST request" % nLine)

            raise ValueError("invalid parameter in line %i" % nLine)

        else:
            # stream parameters
            toks = line.split()
            nToks = len(toks)
            if nToks != 5 and nToks != 6:
                raise ValueError("invalid number of stream components "
                                 "in line %i" % nLine)

            ro = RequestOptions()

            # net, sta, loc, cha
            ro.channel = RequestOptions.Channel()
            ro.channel.net = toks[0].split(',')
            ro.channel.sta = toks[1].split(',')
            ro.channel.loc = toks[2].split(',')
            ro.channel.cha = toks[3].split(',')

            msg = "invalid %s value in line %i"
            for net in ro.channel.net:
                if ro.ChannelChars(net):
                    raise ValueError(msg % ('network', nLine))
            for sta in ro.channel.sta:
                if ro.ChannelChars(sta):
                    raise ValueError(msg % ('station', nLine))
            for loc in ro.channel.loc:
                if loc != "--" and ro.ChannelChars(loc):
                    raise ValueError(msg % ('location', nLine))
            for cha in ro.channel.cha:
                if ro.ChannelChars(cha):
                    raise ValueError(msg % ('channel', nLine))

            # start/end time
            ro.time = RequestOptions.Time()
            ro.time.start = Time()
            for fmt in RequestOptions.TimeFormats:
                if ro.time.start.fromString(toks[4], fmt):
                    break
            logEnd = "-"
            if len(toks) > 5:
                ro.time.end = Time()
                for fmt in RequestOptions.TimeFormats:
                    if ro.time.end.fromString(toks[5], fmt):
                        break
                logEnd = ro.time.end.iso()

            Logging.debug("ro: %s.%s.%s.%s %s %s" % (
                ro.channel.net, ro.channel.sta, ro.channel.loc,
                ro.channel.cha, ro.time.start.iso(), logEnd))
            self.streams.append(ro)

    if len(self.streams) == 0:
        raise ValueError("at least one stream line is required")
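The POST body grammar accepted by parsePOST, "key=value" parameter lines mixed with "NET STA LOC CHA START [END]" stream lines, can be shown with a minimal line classifier; this simplified sketch only splits lines into the two groups and is not the module's parser.

```python
def classify_post_lines(content):
    # params: key=value lines; streams: whitespace-separated stream lines
    params, streams = {}, []
    for line in content:
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blank and comment lines
        key, sep, value = line.partition('=')
        if sep:
            params.setdefault(key.strip().lower(), []).append(value.strip())
        else:
            toks = line.split()
            if len(toks) not in (5, 6):
                raise ValueError("invalid number of stream components")
            streams.append(toks)
    return params, streams
```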