from gwpy.segments import Segment, SegmentList


def divide_segmentlist(start, end, bins=4096, write=True, **kwargs):
    ''' Divide the given period into a segmentlist.

    Parameters
    ----------
    start : `int`
        GPS start time of the given period.
    end : `int`
        GPS end time of the given period.
    bins : `int`, optional
        Duration of each segment in seconds. Default is 4096 (= 2**12).
    write : `bool`, optional
        If True, write the segmentlist to ./segmentlist/total.txt.
        Default is True.

    Returns
    -------
    segmentlist : `gwpy.segments.SegmentList`
        Divided segmentlist.
    '''
    if (end - start) % bins != 0:
        raise ValueError('Duration is not divisible by the bin size.')
    _start = range(start, end, bins)
    _end = range(start + bins, end + bins, bins)
    segmentlist = SegmentList([Segment(s, e) for s, e in zip(_start, _end)])
    log.debug(segmentlist[0])
    log.debug(segmentlist[-1])
    if write:
        segmentlist.write('./segmentlist/total.txt')
    return segmentlist
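A minimal sketch of the division logic above, using plain `(start, end)` tuples instead of gwpy `Segment` objects, so it runs without gwpy installed:

```python
# Sketch of divide_segmentlist's core: split [start, end) into
# fixed-width bins, refusing periods that do not divide evenly.
def divide(start, end, bins=4096):
    if (end - start) % bins != 0:
        raise ValueError('duration must be divisible by bins')
    return [(s, s + bins) for s in range(start, end, bins)]
```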
def locked(start, end):
    ''' Fetch the lock-state channel over [start, end) and write the
    locked segments, sorted by duration (longest first). '''
    lockstate = TimeSeries.fetch('K1:GRD-LSC_LOCK_OK', start, end,
                                 host='10.68.10.121', port=8088, pad=np.nan)
    fs = (1. / lockstate.dt).value
    # *1: keep only stretches locked for at least 2**10 seconds
    locked = (lockstate == 1.0 * u.V).to_dqflag(round=False,
                                                minlen=2**10 * fs)
    ok = locked.active
    # *2: sort segments by duration, longest first
    ok = SegmentList(sorted(ok, key=lambda x: x.end - x.start, reverse=True))
    myprint(ok)
    ok.write('segments_locked.txt')
    print('Finished segments_locked.txt')
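Conceptually, `to_dqflag(minlen=...)` turns a sampled 0/1 state vector into segments of consecutive "on" samples, dropping runs shorter than the minimum length. A hedged pure-Python sketch of that idea (the names here are illustrative, not the gwpy API):

```python
# Convert a sampled state vector into (start, end) time segments where
# the state equals 1, discarding runs shorter than minlen samples.
def state_to_segments(states, t0, dt, minlen=1):
    segs, run_start = [], None
    for i, s in enumerate(states):
        if s == 1 and run_start is None:
            run_start = i                       # run begins
        elif s != 1 and run_start is not None:
            if i - run_start >= minlen:         # run ends; keep if long enough
                segs.append((t0 + run_start * dt, t0 + i * dt))
            run_start = None
    if run_start is not None and len(states) - run_start >= minlen:
        segs.append((t0 + run_start * dt, t0 + len(states) * dt))
    return segs
```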
def check_nodata(segmentlist, prefix='./data', write=True, skip=False,
                 **kwargs):
    ''' Check which segments have no data, and split the list into
    existing and no-data segments. '''
    from gwpy.segments import SegmentList
    if not skip:
        log.debug('Find segments')
        # find unchecked segments
        exists = iofunc.existance(segmentlist, ftype='gwf')
        not_checked = [segmentlist[i] for i, exist in enumerate(exists)
                       if not exist]
        log.debug('{0}(/{1}) are not checked'.format(len(not_checked),
                                                     len(segmentlist)))
        n = len(not_checked)
        #ans = [_check_nodata(segment,**kwargs)[1] for segment in not_checked]
        ans = [_check_nodata(segment,
                             headder='{0:04d}(/{1:04d})'.format(i, n),
                             **kwargs)[1]
               for i, segment in enumerate(not_checked)]
        # no-data segments
        nodata = SegmentList([not_checked[i] for i, _ans in enumerate(ans)
                              if 'NoData' in _ans])
    else:
        nodata = SegmentList.read('./segmentlist/nodata.txt')
    # existing segments
    exist = diff(segmentlist, nodata)
    if len(exist) == 0:
        log.debug('No data exist...')
        raise ValueError('No data error.')
    if write:
        exist.write('./segmentlist/exist.txt')
        nodata.write('./segmentlist/nodata.txt')
        log.debug('./segmentlist/exist.txt Saved')
        log.debug('./segmentlist/nodata.txt Saved')
    log.debug('{0} segments exist'.format(len(exist)))
    return exist, nodata
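The split performed above is a partition by predicate. A self-contained sketch with a stand-in checker in place of `_check_nodata`:

```python
# Partition a segment list into (exist, nodata) using a predicate that
# reports whether data exist for a segment. has_data is a hypothetical
# stand-in for the real frame-file check.
def partition_segments(segmentlist, has_data):
    exist, nodata = [], []
    for seg in segmentlist:
        (exist if has_data(seg) else nodata).append(seg)
    return exist, nodata
```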
def random_segments(start, end, bins=4096, nseg=10, seed=3434, **kwargs):
    ''' Return a segmentlist drawn randomly from the designated period.

    Parameters
    ----------
    start : `float`
        Start time.
    end : `float`
        End time.
    bins : `int`, optional
        Duration of each segment in seconds. Default is 4096.
    nseg : `int`, optional
        The number of segments. Default is 10.
    seed : `int`, optional
        Random seed. Default is 3434.
    write : `bool`, optional
        If True, the segmentlist is written to the local directory.
        Default is True.

    Returns
    -------
    segmentlist : `gwpy.segments.SegmentList`
        SegmentList.
    '''
    from numpy.random import randint
    write = kwargs.get('write', True)
    np.random.seed(seed=seed)
    ini = range(start, end, bins)
    _start = np.array([ini[randint(0, len(ini))] for i in range(0, nseg)])
    _end = _start + bins
    segmentlist = SegmentList([Segment(s, e) for s, e in zip(_start, _end)])
    if write:
        segmentlist.write('./segmentlist/random.txt')
    return segmentlist
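The same idea with the stdlib instead of `numpy.random`: pick `nseg` bin-aligned start times at random (duplicates are possible, as in the function above):

```python
import random

# Seeded random-segment sketch: each segment starts on a bin boundary
# inside [start, end) and lasts exactly `bins` seconds.
def random_segments_sketch(start, end, bins=4096, nseg=10, seed=3434):
    rng = random.Random(seed)
    starts = [rng.choice(range(start, end, bins)) for _ in range(nseg)]
    return [(s, s + bins) for s in starts]
```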
def check_baddata(segmentlist, prefix='./data', write=True, plot=True,
                  **kwargs):
    ''' Check each segment for bad data, and split the list into
    available, lack-of-data, and glitch (earthquake) segments. '''
    log.debug('Checking bad segments')
    exists = iofunc.existance(segmentlist, ftype='png_ts')
    checked = [segmentlist[i] for i, exist in enumerate(exists) if exist]
    not_checked = [segmentlist[i] for i, exist in enumerate(exists)
                   if not exist]
    from gwpy.segments import SegmentList
    bad = SegmentList()
    eq = SegmentList()
    for i, segment in enumerate(segmentlist):
        data, bad_status = _check_baddata(segment, **kwargs)
        if bad_status - 16 >= 0:
            eq.append(segment)
        elif bad_status:
            bad.append(segment)
        elif not bad_status:
            pass
        else:
            log.debug(bad_status)
            raise ValueError('Unexpected bad_status: {0}'.format(bad_status))
        start, end = segment
        fname_img = iofunc.fname_png_ts(start, end, prefix)
        log.debug('{0:03d}/{1:03d} {2} {3}'.format(i, len(segmentlist),
                                                   fname_img, bad_status))
        chname = get_seis_chname(start, end)
        #if plot and not os.path.exists(fname_img):
        #    plot_timeseries(data,start,end,bad_status,fname_img)
    # drop segments flagged as bad or as earthquakes
    new = SegmentList([segment for segment in segmentlist
                       if segment not in bad and segment not in eq])
    if write:
        new.write('./segmentlist/available.txt')
        bad.write('./segmentlist/lackofdata.txt')
        eq.write('./segmentlist/glitch.txt')
    return new, bad, eq
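The final filtering step amounts to a set difference; a compact equivalent on plain `(start, end)` tuples:

```python
# Remove from segmentlist every segment flagged in either of the two
# exclusion lists (bad data, earthquake glitches).
def drop_segments(segmentlist, bad, eq):
    flagged = set(bad) | set(eq)
    return [seg for seg in segmentlist if seg not in flagged]
```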
triggers = SegmentList([])
# grabbing the trigger files
for filename in glob.glob(wildcard_trigs_hveto):
    # loading the triggers in
    data = SegmentList.read(filename)
    print(data)
    triggers += data
start_time_utc += datetime.timedelta(days=1)
#triggers.coalesce()
# clip the triggers to the requested analysis window
start_end_seg = Segment(args.gps_start_time, args.gps_end_time)
triggers = triggers & SegmentList([start_end_seg])
#print(triggers)
triggers.write("total_hveto_trigs.txt")
start_time_utc += datetime.timedelta(days=1)

# SEGMENT HANDLING: begin for loop that loops over the range of all
# days/months/years
f = open("total_hveto_segs.txt", "w")  # file that will hold collection of all segments
pattern_segs_hveto = os.path.join(args.directory_path, '{}{:02}',
                                  '{}{:02}{:02}', '*86400-DARM', 'segs.txt')
for day in range(start_day, end_day + 1):
    for month in range(start_month, end_month + 1):
        for year in range(start_year, end_year + 1):
            wildcard_segs_hveto = pattern_segs_hveto.format(year, month,
                                                            year, month, day)
            # grabbing segment files
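The `triggers & SegmentList([start_end_seg])` step clips every trigger segment to the analysis window. The same operation on plain tuples:

```python
# Intersect each segment with a single (w0, w1) window, keeping only the
# non-empty overlaps.
def clip_segments(segments, window):
    w0, w1 = window
    out = []
    for s, e in segments:
        lo, hi = max(s, w0), min(e, w1)
        if lo < hi:
            out.append((lo, hi))
    return out
```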
# Get the SNR threshold to use for this channel.
omic_trigs = omic_trigger_tables[i]  # get triggers from one channel at a time!
snr_thresh = get_percentile(omic_trigs, percentile[j])
# Apply the SNR filter to the triggers and get their peak times
selection = filter_threshold(omic_trigs, snr_thresh)
peaktime = np.array(selection.getColumnByName('peak_time')[:], dtype=float) + \
    np.array(selection.getColumnByName('peak_time_ns')[:], dtype=float) * 1.0e-9
# Now we can calculate the offsets for this channel
veto_segs = []
t0 = time.time()
get_vetotimes(peaktime, end_times, veto_segs)
t1 = time.time()
print("This took %f seconds to run to completion\n" % (t1 - t0))
# Coalesce the segments and write them out to the file
print("Offsets calculated. Coalesce the segments now\n")
veto_segs = SegmentList(veto_segs)
# Merge contiguous veto sections and sort the list of segments
veto_segs.coalesce()
# Write all the segments to disk. Read in the [0, 2] columns to recover
# the data. We use h5py to write out all the segments in groups organized
# by the name of the channel.
print("Write the segments to file\n")
grp = thresh_grp.create_group('%s' % channels[i])
SegmentList.write(veto_segs, grp, 'vetosegs')
print("All done!")
f.close()
print("Finished computing the offsets for all the channels!!!! Wooohoo!!!\n")
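A sketch of the percentile-threshold step: compute the p-th percentile of the SNR values and keep only triggers at or above it. A simple nearest-rank percentile is used here; the real `get_percentile` may interpolate differently.

```python
# Keep triggers whose SNR is at or above the p-th percentile
# (nearest-rank definition). Returns the threshold and the survivors.
def filter_by_percentile(snrs, p):
    ordered = sorted(snrs)
    k = max(0, min(len(ordered) - 1,
                   int(round(p / 100.0 * len(ordered))) - 1))
    thresh = ordered[k]
    return thresh, [s for s in snrs if s >= thresh]
```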