# Import paths for numpy, matplotlib and the sito helpers are assumed
# from the surrounding module.
import numpy as np
import obspy.signal
from matplotlib.dates import date2num
from obspy.core import UTCDateTime as UTC
from sito.data import IPOC
from sito.util import daygen


def analyze():
    #stations = 'PB01 PB02 PB03 PB04 PB05 PB06 PB07 PB08 PB09 PB10 PB11 PB12 PB13 PB14 PB15 PB16 HMBCX MNMCX PATCX PSGCX LVC TAIQ'
    #stations = 'PB01 PB02 PATCX'
    stations = 'PATCX'
    t1 = UTC('2009-01-01')
    t2 = UTC('2010-01-01')
    data = IPOC()
    for station in stations.split():
        hours = [[] for i in range(24)]
        times = []
        levels = []
        for t_day in daygen(t1, t2):
            try:
                stream = data.getRawStreamFromClient(t_day, t_day + 24 * 3600,
                                                     station, component='Z')
            except ValueError:
                continue
            for tr in stream:
                tr.stats.filter = ''
            stream.demean()
            stream.detrend()
            stream.filter2(4, 6)
            stream.downsample2(5)
            stream.merge()
            tr = stream[0]
            startt = tr.stats.starttime
            endt = tr.stats.endtime
            if endt - startt < 12 * 3600:
                continue
            tr.data = obspy.signal.cpxtrace.envelope(tr.data)[1][:len(tr.data)]
            for hour in range(24):
                tr2 = tr.slice(t_day + hour * 3600, t_day + (hour + 1) * 3600)
                if tr2.stats.endtime - tr2.stats.starttime < 1800:
                    continue
                num_stds = 60  # =^ every minute
                len_parts = len(tr2.data) // 60  # =^ 1min
                len_stds = len_parts // 6  # =^ 10s
                stds = np.array([np.std(tr2.data[i:i + len_stds])
                                 for i in np.arange(num_stds) * len_parts])
                stds = stds[stds != 0.]
                num_stds = len(stds)
                if num_stds < 50:
                    continue
                # trim the lowest and highest 20% and remaining outliers
                stds = np.sort(stds)[num_stds // 5:-num_stds // 5]
                stds = stds[stds < np.min(stds) * 2.]
                val = np.mean(stds)
                levels.append(val)
                times.append(date2num(t_day + (0.5 + hour) * 3600))
                hours[hour].append(val)
        errors = np.array([np.std(hours[i], ddof=1) / len(hours[i]) ** 0.5
                           for i in range(24)])
        hours = np.array([np.mean(hours[i]) for i in range(24)])
        times = np.array(times)
        levels = np.array(levels)
        np.savez('/home/richter/Results/IPOC/xcorr/'
                 'noise_apmlitudes_%s_4-6Hz.npz' % station,
                 hours=hours, errors=errors, times=times, levels=levels)
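The per-hour noise level in `analyze()` is a trimmed, outlier-resistant mean of the standard deviations of short sub-windows. A standalone numpy sketch of that estimator (the function name `robust_noise_level` and its defaults are mine, not from the script):

```python
import numpy as np


def robust_noise_level(data, num_windows=60, min_windows=50):
    """Robust amplitude level of `data`, mimicking the estimator above.

    Splits the trace into `num_windows` parts, takes the std of the first
    sixth of each part (10 s of a 1 min part at the script's sampling rate),
    trims the lowest and highest 20%, drops values larger than twice the
    minimum, and averages the rest.
    """
    len_parts = len(data) // num_windows   # one "minute"
    len_stds = len_parts // 6              # one "10 s" sub-window
    stds = np.array([np.std(data[i:i + len_stds])
                     for i in np.arange(num_windows) * len_parts])
    stds = stds[stds != 0.]                # ignore gaps filled with zeros
    n = len(stds)
    if n < min_windows:
        return None                        # too many gaps -> no estimate
    stds = np.sort(stds)[n // 5:-n // 5]   # trim lowest/highest 20%
    stds = stds[stds < np.min(stds) * 2.]  # drop remaining outliers
    return np.mean(stds)
```

With a strong transient in one window, the estimate stays near the background level instead of being pulled up by the outlier.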
def main():
    stations = 'PB01 PB02 PB03 PB04 PB05'
    component = 'Z'
    t1 = UTC('2009-06-01')
    t2 = UTC('2009-07-01')
    data = IPOC('test', use_local_LVC=False)
    data.setXLogger('_test')
    period = 24 * 3600
    ax = None
    plt.ion()
    for station in stations.split():
        pxxs = []
        freqs_old = None
        i = 0
        for t in timegen(t1, t2, period):
            st = data.getRawStreamFromClient(t, t + period, station,
                                             component)
            st.merge(method=1, interpolation_samples=10,
                     fill_value='interpolate')
            print(st)
            pxx, freqs = st.plotPSD(just_calculate=True)
            assert freqs_old is None or np.all(freqs == freqs_old)
            freqs_old = freqs
            # discard days whose spectrum is dominated by transients
            if max(pxx[4:]) > 1e7:
                print('discard')
                i += 1
                continue
            pxxs.append(pxx)
        pxx = sum(pxxs) / len(pxxs)
        del pxxs
        tr = Trace(data=pxx,
                   header=dict(is_fft=True, sampling_rate=2 * max(freqs),
                               freq_min=min(freqs), freq_max=max(freqs)))
        ax = tr.plotPSD(ax=ax, label='%s-%d' % (st[0].stats.station, i),
                        figtitle=None)
        plt.draw()
        # embed()
    ax.legend()
    fig = ax.get_figure()
    fig.suptitle('%s %s %s to %s' % (stations, component,
                                     t1.strftime('%Y-%m-%d'),
                                     t2.strftime('%Y-%m-%d')))
    plt.ioff()
    plt.show()
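`main()` averages the daily PSD estimates per station and skips days whose spectrum contains strong transients above the lowest-frequency bins. The discard-and-average logic can be sketched with plain numpy (the function name, `threshold` and `skip_bins` defaults are illustrative, taken from the constants above):

```python
import numpy as np


def average_psds(psds, threshold=1e7, skip_bins=4):
    """Average PSD segments, discarding any segment whose maximum above
    the first `skip_bins` frequency bins exceeds `threshold`.

    Returns the averaged spectrum and the number of discarded segments.
    """
    kept = [p for p in psds if np.max(p[skip_bins:]) <= threshold]
    if not kept:
        raise ValueError('all segments discarded')
    return np.mean(kept, axis=0), len(psds) - len(kept)
```

The segment count of discarded days is what the script appends to the legend label (`'%s-%d'`).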
#!/usr/bin/env python
# by TR
import matplotlib.pyplot as plt
from obspy.core import UTCDateTime as UTC
from progressbar import ProgressBar
from sito.data import IPOC
from sito.stream import Stream
from sito.util.main import streamtimegen

data = IPOC()
t_day = UTC('2008-01-01')
station = 'PB01'
stream = data.getRawStreamFromClient(t_day, t_day + 24 * 3600, station,
                                     component='Z')
stream.setHI('filter', '')
stream.demean()
stream.filter2(0.5, 5)
stream.trim2(0, 5 * 3600)

auto = Stream()
for st in streamtimegen(stream, dt=60, start=None, shift=30, use_slice=True):
    tr = st[0].copy()
    tr.addZeros(60)
    tr.acorr(60)
    auto.append(tr)
print(auto)
auto.plotXcorr()
stream.plot(type='dayplot')
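The script autocorrelates overlapping 60 s windows via sito's `acorr`. A minimal numpy version of a normalized autocorrelation for a single window might look like this (the function name is mine; sito's implementation additionally zero-pads via `addZeros`):

```python
import numpy as np


def autocorr(x, max_lag):
    """Normalized autocorrelation of `x` for lags 0..max_lag samples."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                    # demean, as the script does
    norm = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - lag], x[lag:]) / norm
                     for lag in range(max_lag + 1)])
```

For a periodic signal the autocorrelation is 1 at zero lag, near -1 at half a period, and recovers toward 1 at a full period.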
# Excerpt from a loop over stations and days (fig, pic_dir, redraw,
# scale_range, scale_LVC and the axis coordinates are defined outside
# this fragment).
fig.text(0.02, y0 + (0.98 / 2) * dy, station, va='center', ha='center',
         rotation=90)
t_day = UTC(day)
fname = '%s/%s-%s.png' % (pic_dir, station, day)
try:
    if redraw:
        raise IOError
    img = mpimg.imread(fname)
except IOError:
    try:
        stream = data.getRawStreamFromClient(t_day, t_day + 24 * 3600,
                                             station, component='Z')
    except ValueError:
        continue
    fig2 = stream.plot(type='dayplot',
                       vertical_scaling_range=(scale_range
                                               if station != 'LVC'
                                               else scale_LVC),
                       interval=30, handle=True)
    fig2.axes[0].set_position((0, 0, 1, 1))
    fig2.axes[1].set_visible(False)
    fig2.texts[0].set_visible(False)
    fig2.canvas.draw()
    fig2.savefig(fname)
    img = mpimg.imread(fname)
ax = fig.add_axes([x0, y0, dx * 0.98, dy * 0.98])
if args.absolute_scale is None and args.relative_scale is None:
    kwargs['scale'] = 1.
else:
    station = args.file_station
    if (station.startswith('PB') or station == 'LVC'
            or station.endswith('CX')):
        from sito.data import IPOC
        data = IPOC(xcorr_append=args.xcorr_append)
    elif station == 'PKD':
        from sito.data import Parkfield
        data = Parkfield(xcorr_append=args.xcorr_append)
    else:
        raise ValueError('Not a valid station name')
    day = UTC(args.date)
    if args.xcorr_append is None:
        #stream = data.getRawStream(day, station, component=args.component)
        stream = data.getRawStreamFromClient(day, day + 24 * 3600, station,
                                             component=args.component,
                                             channel=args.channel)
    else:
        stream = data.getStream(day, station, component=args.component)
    if args.absolute_scale is None and args.relative_scale is None:
        kwargs['absolutescale'] = 0.0005
    tr = stream[0]
    tr.stats.filter = ''
    if args.frequency is False:
        if tr.stats.is_fft:
            tr.ifft()
        print(tr)
        #tr.plotTrace(component=args.component, **kwargs)
        tr.plotTrace(**kwargs)
    else:
        # if not tr.stats.is_fft:
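The station-to-data-source dispatch above can be sketched with a plain argparse parser; the strings `'IPOC'`/`'Parkfield'` stand in for constructing the corresponding sito classes:

```python
import argparse


def select_data_source(station):
    """Pick a data source by station name, mirroring the dispatch above."""
    if station.startswith('PB') or station == 'LVC' or station.endswith('CX'):
        return 'IPOC'
    elif station == 'PKD':
        return 'Parkfield'
    raise ValueError('Not a valid station name: %s' % station)


parser = argparse.ArgumentParser()
parser.add_argument('file_station')
args = parser.parse_args(['PATCX'])
source = select_data_source(args.file_station)
```

Unknown station names raise `ValueError` instead of the original `argparse.ArgumentError`, which requires an argument object as its first parameter and would fail with a `TypeError` when called with a message alone.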