def Add(conf):
    MAX = 2             # non-standard Dylos firmware might generate 4 numbers
    PM25 = 0; PM10 = 1  # array index defs
    try:
        # Request Dylos data, only needed for non-standard firmware:
        # conf['fd'].write(bytes("R\r\n",'ascii'))
        # line = ''
        # while 1:
        #     chr = conf['fd'].read(1)
        #     line += chr
        #     if chr == '\n': break
        try:
            line = conf['fd'].readline()
            if not conf['file']:
                while conf['fd'].inWaiting():       # skip to latest record
                    line = conf['fd'].readline()
            conf['Serial_Errors'] = 0
        except SerialException:
            conf['Serial_Errors'] += 1
            MyLogger.log(modulename,'ATTENT',"Serial exception. Close/Open serial.")
            try:
                conf['fd'].close()
            except:
                pass
            conf['fd'] = None
            if conf['Serial_Errors'] > 10:
                MyLogger.log(modulename,'ERROR',"Too many serial errors. Disabled.")
                conf['output'] = False
                return {}
            return conf['getdata']()
        except:
            sleep(10)
            return {}
        line = str(line.strip().decode('utf-8'))
        bin_data = []
        try:
            bin_data = [int(x.strip()) for x in line.split(',')]
        except:
            # Dylos error
            MyLogger.log(modulename,'WARNING',"Dylos Data: Error - Dylos bin data")
            bin_data = [None] * MAX
        if len(bin_data) != MAX:
            MyLogger.log(modulename,'WARNING',"Data error")
            while len(bin_data) > MAX:      # drop extra fields
                del bin_data[-1]
            while len(bin_data) < MAX:      # pad missing fields
                bin_data.append(None)
    except Exception as error:
        # some other Dylos error
        MyLogger.log(modulename,'WARNING',error)
    # take notice: index 0 is PM2.5, index 1 is PM10 values
    if ('raw' in conf.keys()) and (conf['raw'] != None):
        conf['raw'].publish(
            tag='dylos',
            data="pm25=%.1f,pm10=%.1f" % (bin_data[PM25]*1.0,bin_data[PM10]*1.0))
    return {
        "time": int(time()),
        conf['fields'][PM25]: calibrate(PM25,conf,bin_data[PM25]),
        conf['fields'][PM10]: calibrate(PM10,conf,bin_data[PM10])
    }
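# The Dylos record is a comma-separated pair of particle counts. A minimal
# standalone sketch of the parse/truncate/pad logic above (function name is
# illustrative, not part of this module):

```python
def parse_dylos_line(line, expected=2):
    """Parse a Dylos serial record like '123,45' into a fixed-size list
    of ints; on malformed input return None placeholders, and truncate
    or pad when the field count differs from the expected length."""
    try:
        counts = [int(x.strip()) for x in line.split(',')]
    except ValueError:
        counts = [None] * expected
    counts = counts[:expected]                      # drop extra fields
    counts += [None] * (expected - len(counts))     # pad missing fields
    return counts

print(parse_dylos_line("123,45"))       # [123, 45]
print(parse_dylos_line("1,2,3,4"))      # [1, 2]
print(parse_dylos_line("garbage"))      # [None, None]
```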
def getdata():
    global Conf, MyThread, ErrorCnt
    if not Conf['input']:
        sleep(10)
        return {}
    if not registrate():        # start up input readings
        return {}
    try:
        return GetNextRecord()
    except IOError:
        raise IOError("Unable to contact InFlux server %s" % Conf['host'])
    except:
        MyLogger.log(modulename,'DEBUG',
            'InFlux sub should wait for %d seconds' % Conf['waiting'])
        shouldWait()
        return {}
def show_ident(ident):
    global Conf, CSV
    IDtxt = None; new = True
    ID = CSV[ident['serial']]
    row1 = ['date', 'project', 'serial']
    row2 = [
        datetime.datetime.fromtimestamp(time()).strftime('%b %d %Y %H:%M:%S') + "\n",
        ident['project'], ident['serial']
    ]
    for key in ("label", "description", "geolocation", "street", "village",
                "province", "municipality", 'fields', 'units', 'types',
                'extern_ip', 'version'):
        if (key in ident.keys()) and (ident[key] != None):
            values = ident[key]
            if key == 'geolocation':
                if ident[key] == '0,0,0':
                    continue
                else:
                    values += '(lat,lon,alt)'
            if type(ident[key]) is list:
                values = ', '.join(ident[key])
            row1.append(key)
            row2.append("%15s: %s\n" % (key, values))
    try:
        IDtxt = ID['fd'].add_worksheet(title='info', rows="20",
                                       cols="%d" % len(row2))
        if new:
            # would be nice to give this row a color
            IDtxt.append_rows(row1)
            IDtxt.append_rows(row2)
    except:
        MyLogger.log(modulename, 'WARNING', "Unable to create/update info sheet")
        return False
    MyLogger.log(modulename, 'DEBUG',
        'Gspread added identification to %s_%s.info sheet.' %
        (ident['project'], ident['serial']))
    return True
def InFluxConnect():
    """ Connect to InFlux database and save filehandler """
    global Conf
    for item in ['fd', 'updated']:
        if not item in Conf.keys():
            Conf[item] = None
    if Conf['fd'] != None:
        return True
    waitReset()
    for key in ['user', 'password', 'hostname']:
        if (not key in Conf.keys()) or (Conf[key] == None):
            Conf['input'] = False
            MyLogger.log(modulename, 'FATAL', "Missing login %s credentials." % key)
            return False
    try:
        Conf['fd'] = InfluxDBClient(
            Conf['hostname'], Conf['port'],
            Conf['user'], Conf['password'], timeout=2 * 60)
        return db_list()
    except:
        MyLogger.log(modulename, 'FATAL', "InFlux subscription failed")
    return True
def Add(conf):
    bin_data = {}
    try:
        try:
            if (not conf['file']) and (conf['interval'] == 0):
                # request a json string
                conf['fd'].write("\n")
            line = conf['fd'].readline()
            if not conf['file']:
                while conf['fd'].inWaiting():       # skip to latest record
                    line = conf['fd'].readline()
            conf['Serial_Errors'] = 0
        except IOError as er:
            conf['Serial_Errors'] += 1
            MyLogger.log(modulename,'ATTENT',
                "Serial error: %s, retry close/open serial." % er)
            try:
                conf['fd'].close()
            except:
                pass
            conf['fd'] = None
            if conf['Serial_Errors'] > 10:
                MyLogger.log(modulename,'ERROR',"Too many serial errors. Disabled.")
                sleep(1)
                conf['output'] = False
                return {}
            return conf['getdata']()
        except StandardError as er:
            MyLogger.log(modulename,'ATTENT',"Serial access error: %s" % er)
        line = str(line.strip().decode('utf-8'))
        s = line.find('{'); l = line.find('}')
        if (s < 0) or (l < 0):
            # MyLogger.log(modulename,'WARNING','Not a json input value: %s' % line)
            sleep(1)
            return bin_data
        try:
            bin_data = json.loads(line[s:(l+1)])
        except:
            # Arduino error
            MyLogger.log(modulename,'WARNING',"Arduino Data: Error - json data load")
        if len(bin_data) <= 0:
            sleep(1)
            return bin_data
    except Exception as error:
        # some other Arduino error
        MyLogger.log(modulename,'WARNING','Error in Add routine: %s' % error)
        sleep(1)
        return {}
    if ('firmware' in conf.keys()) and ('version' in bin_data.keys()):
        if conf['firmware'][0] != bin_data['version'][0]:
            MyLogger.log(modulename,'FATAL','Version/firmware incompatible')
        del bin_data['version']
    if 'type' in bin_data.keys():
        if conf['type'] != bin_data['type']:
            MyLogger.log(modulename,'ATTENT','%s sensor(s) type: %s' %
                (conf['type'],bin_data['type']))
            conf['type'] = bin_data['type']
        del bin_data['type']
    if len(conf['keys']):
        for key in list(bin_data.keys()):
            if not key in conf['keys']:
                del bin_data[key]
                if not 'keys_skipped' in conf.keys():
                    conf['keys_skipped'] = []
                if not key in conf['keys_skipped']:
                    MyLogger.log(modulename,'ATTENT','Skip value with name %s.' % key)
                    conf['keys_skipped'].append(key)
    else:
        # at start: denote all record names/units from Arduino firmware
        conf['keys'] = []; conf['names'] = []; conf['units'] = []
        if not 'calibrations' in conf.keys():
            conf['calibrations'] = []
        if not 'fields' in conf.keys():
            conf['fields'] = []
        nr = 0
        for key in bin_data.keys():
            conf['keys'].append(key)
            id_un = key.split('_'); nr += 1
            if nr > len(conf['fields']):
                conf['fields'].append(id_un[0])
            if nr > len(conf['calibrations']):
                conf['calibrations'].append([0,1])
            conf['names'].append(id_un[0])
            conf['units'].append(id_un[1].replace('#',''))
            MyLogger.log(modulename,'ATTENT','New nr %d sensor added: %s units %s' %
                (nr,id_un[0],id_un[1]))
    raw = []
    for i in range(0,len(conf['fields'])):
        dataKey = '%s_%s' % (conf['names'][i],conf['units'][i])
        if (type(bin_data[dataKey]) is float) or (type(bin_data[dataKey]) is int):
            raw.append('%s=%.1f' %
                (dataKey.replace('/','').replace('.',''),bin_data[dataKey]*1.0))
        if dataKey in bin_data.keys():
            bin_data.update({conf['fields'][i]: calibrate(i,conf,bin_data[dataKey])})
            del bin_data[dataKey]
    if ('raw' in conf.keys()) and (conf['raw'] != None) and len(raw):
        conf['raw'].publish(
            tag='%s' % conf['type'].lower(),
            data='%s' % ','.join(raw))
    bin_data.update({"time": int(time())})
    return bin_data
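# The Arduino firmware emits one json object per serial line, possibly with
# noise around it; a minimal standalone sketch of the brace-slicing step
# above (name is illustrative; like the original it assumes a flat object,
# since it cuts at the first closing brace):

```python
import json

def extract_json(line):
    """Pull the first {...} json object out of a raw serial line;
    return {} when no braces are found or the payload does not parse."""
    s = line.find('{'); e = line.find('}')
    if s < 0 or e < 0:
        return {}
    try:
        return json.loads(line[s:e+1])
    except ValueError:
        return {}

print(extract_json('noise {"temp_C": 21.5, "rh_pct": 48} \r\n'))
print(extract_json('no json here'))     # {}
```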
def registrate(args):
    global Conf, CSV
    for key in ['ident']:
        if not key in args.keys():
            MyLogger.log(modulename, 'FATAL', "%s argument missing." % key)
    ID = args['ident']['serial']
    if not ID in CSV.keys():
        if len(CSV) >= 20:
            for id in list(CSV.keys()):     # garbage collection
                if not 'fd' in CSV[id].keys():
                    del CSV[id]
                elif ('last' in CSV[id]) and \
                     (int(time()) - CSV[id]['last'] > 2 * 60 * 60):
                    CSV[id]['fd'].close()
                    del CSV[id]
        if len(CSV) >= 20:
            MyLogger.log(modulename, 'WARNING',
                "Too many CSV id's in progress. Skipping %s" % ID)
            return False
        CSV[ID] = {'file': Conf['file'], 'renew': 0}
    ID = CSV[ID]
    if ('fd' in ID.keys()) and ID['fd']:
        if time() > ID['renew']:
            ID['renew'] = int((time() - 1) / (24 * 60 * 60)) * 24 * 60 * 60 \
                          + Conf['ttl']
            ID['fd'].close()
            del ID['fd']
    if 'fd' in ID.keys():
        return True
    if not ID['file']:
        try:
            ID['file'] = args['ident']['project'] + '/'
            os.mkdir(args['ident']['project'], 0o770)
        except OSError:
            pass
        except:
            ID['file'] = './'
        ID['file'] += args['ident']['serial']
    ID['cur_name'] = ID['file'] + '_' + \
        datetime.datetime.now().strftime("%Y-%m-%d") + '.csv'
    try:
        show_ident(args['ident'], ID['file'] + '.txt')
    except:
        pass
    new = True
    if os.path.exists(ID['cur_name']):
        if not os.access(ID['cur_name'], os.W_OK):
            MyLogger.log(modulename, 'FATAL',
                "Cannot write to %s. Abort." % ID['cur_name'])
        try:
            try:
                ID['fd'] = open(ID['cur_name'], 'at', newline='', encoding="utf-8")
            except TypeError:   # Python 2 open() has no newline/encoding args
                ID['fd'] = open(ID['cur_name'], 'at')
        except:
            MyLogger.log(modulename, 'FATAL',
                "Cannot write to %s. Abort." % ID['cur_name'])
        new = False
    else:   # otherwise: create it
        try:
            try:
                ID['fd'] = open(ID['cur_name'], 'wt', newline='', encoding="utf-8")
            except TypeError:
                ID['fd'] = open(ID['cur_name'], 'wt')
        except:
            MyLogger.log(modulename, 'ERROR',
                "Unable to create file %s. Abort CSV." % ID['cur_name'])
            Conf['output'] = False
            raise IOError
    # write csv header
    try:
        ID['writer'] = csv.writer(ID['fd'], dialect='excel', delimiter=';',
                                  quoting=csv.QUOTE_NONNUMERIC)
        if new:
            Row = []
            cells = args['ident']
            for i in range(0, len(cells['fields'])):
                # this needs some more thought
                if type(args['data'][cells['fields'][i]]) is list:
                    for j in range(1, len(args['data'][cells['fields'][i]])):
                        Row.append("%s_%d(%s)" %
                            (cells['fields'][i], j, cells['units'][i]))
                else:
                    Row.append("%s(%s)" % (cells['fields'][i], cells['units'][i]))
            ID['writer'].writerow(Row)
    except:
        MyLogger.log(modulename, 'ERROR',
            "Failed to open file %s for writing." % ID['cur_name'])
        raise IOError
    MyLogger.log(modulename, 'INFO',
        "Created and will add records to file: " + os.getcwd() + '/' +
        ID['cur_name'])
    return True
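# The header-row construction above expands list-valued fields into numbered
# columns. A standalone sketch (field/unit names invented for illustration;
# note the inner loop starts at index 1, mirroring the original, so an
# n-element list yields n-1 columns):

```python
def csv_header(fields, units, data):
    """Build CSV column headers: scalar fields become 'field(unit)',
    list-valued fields become 'field_1(unit)', 'field_2(unit)', ..."""
    row = []
    for field, unit in zip(fields, units):
        if isinstance(data[field], list):
            for j in range(1, len(data[field])):
                row.append("%s_%d(%s)" % (field, j, unit))
        else:
            row.append("%s(%s)" % (field, unit))
    return row

print(csv_header(['pm25', 'grain'], ['ug/m3', 'um'],
                 {'pm25': 12.5, 'grain': [0, 1, 2]}))
# ['pm25(ug/m3)', 'grain_1(um)', 'grain_2(um)']
```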
def Add(conf, cnt=0):
    global Conf
    # ref: aqicn.org/faq/2015-09-06/ozone-aqi-using-concentrations-in-milligrams-or-ppb/
    # ref: samenmetenaanluchtkwaliteit.nl/zelf-meten
    # ToDo: ppb clearly is < 0: what does that mean? For now: no measurement

    def PPB2ugm3(gas, ppb, temp):
        mol = {
            'so2': 64.0,    # RIVM 2.71 using fixed 15 oC
            'no2': 46.0,    # RIVM 1.95 using fixed 15 oC
            'no':  30.01,   # RIVM 1.27 using fixed 15 oC
            'o3':  48.0,    # RIVM 2.03 using fixed 15 oC
            'co':  28.01,   # RIVM 1.18 using fixed 15 oC
            'co2': 44.01,   # RIVM 1.85 using fixed 15 oC
            'nh3': 17.031,
        }
        if not gas in mol.keys():
            raise ValueError("%s unknown gas" % gas)
        if ppb < 0:
            return 0
        return round(ppb * 12.187 * mol[gas] / (273.15 + temp), 2)

    def Close(conf):
        if conf['fd'] == None:
            return
        conf['fd'].close()
        conf['fd'] = None

    def Open(conf):
        if conf['fd'] != None:
            return True
        try:
            conf['fd'] = serial.Serial(
                conf['device'],
                baudrate=9600,
                parity=serial.PARITY_NONE,
                stopbits=serial.STOPBITS_ONE,
                bytesize=serial.EIGHTBITS,
                writeTimeout=1,
                timeout=15)
        except:
            MyLogger.log(modulename, 'ERROR',
                'failed to open %s for %s' % (conf['device'], conf['gas']))
            return False
        sleep(1)
        return True

    if cnt > 5:     # stop recursion
        Close(conf)
        return {}
    cnt += 1
    # serial line data: SN,PPB,temp oC,RH%,ADCraw,Traw,RHraw,day,hour,min,sec
    MAX = 11        # len(Conf['dataFlds'])
    if not 'Serial_Errors' in conf.keys():
        conf['Serial_Errors'] = 0
    if conf['Serial_Errors'] > 10:
        Close(conf)
        MyLogger.log(modulename, 'FATAL', "Too many serial errors. Disabled.")
        sleep(0)    # to do: improve this!
        return {}
    line = ''
    bin_data = []
    try:
        now = time()
        if conf['start'] - now > 0:
            Close(conf)
            sleep(conf['start'] - now)
        try:
            if not Open(conf):
                raise Exception
            while conf['fd'].inWaiting():   # skip to latest record
                conf['fd'].flushInput()
                sleep(1)
            conf['Serial_Errors'] = 0
            for i in range(3, -1, -1):
                if not i:
                    return Add(conf, cnt)
                conf['fd'].write(bytes("\r"))   # request measurement
                line = conf['fd'].readline()
                line = str(line.strip())
                if Conf['debug']:
                    print("Got %s sensor line: %s" % (conf['gas'].upper(), line))
                bin_data = []
                try:
                    bin_data = [int(x.strip()) for x in line.split(',')]
                except:
                    # Spec error
                    MyLogger.log(modulename, 'WARNING',
                        "Spec Data: Error - Spec bin data")
                if len(bin_data) != MAX:
                    MyLogger.log(modulename, 'WARNING', "Data length error")
                    conf['Serial_Errors'] += 1
                    continue
                break
        # except SerialException:
        #     conf['Serial_Errors'] += 1
        #     MyLogger.log(modulename,'ATTENT',"Serial exception. Close/Open serial.")
        #     return {}
        except Exception as error:
            MyLogger.log(modulename, 'WARNING', "Serial read %s" % str(error))
            conf['Serial_Errors'] += 1
            Close(conf)
            sleep(10)
            return Add(conf, cnt)
        conf['Serial_Errors'] = 0
    except Exception as error:
        # some other sensor error
        MyLogger.log(modulename, 'WARNING', str(error))
    values = {"time": int(time())}
    rawData = []
    for i in range(0, MAX):
        if Conf['dataFlds'][i] == None:
            continue
        try:
            rawData.append('%s=%.1f' % (
                conf['gas'].lower() if i == 0 else Conf['dataFlds'][i],
                bin_data[i]))
        except:
            MyLogger.log(modulename, 'WARNING', "Error on index %d" % i)
            # print("dataFlds",Conf['dataFlds'])
            # print("bin_data",bin_data)
        if Conf['dataFlds'][i] == 'ppb':
            try:
                idx = Conf['fields'].index(conf['gas'])
                values[conf['gas']] = calibrate(idx, Conf['calibrations'],
                                                bin_data[i])
                if values[conf['gas']] < 0:
                    # on negative the Spec sensor has seen no gas
                    values[conf['gas']] = None
                    continue
            except:
                # print("conf",conf)
                MyLogger.log(modulename, 'WARNING', "index error on index %d" % i)
                # print("bin_data",bin_data)
            if Conf['units'][idx] == 'ug/m3':
                try:
                    tempVal = float(bin_data[Conf['dataFlds'].index('temp')])
                except:
                    tempVal = 25.0
                if not conf['gas'] in values.keys():
                    MyLogger.log(modulename, 'WARNING',
                        "Error on index %d in values on %s" % (i, conf['gas']))
                    # print(values)
                values[conf['gas']] = PPB2ugm3(conf['gas'], values[conf['gas']],
                                               tempVal)
        elif Conf['dataFlds'][i][-3:] == 'raw':
            values[conf['gas'] + Conf['dataFlds'][i]] = int(bin_data[i])
        else:
            try:
                values[Conf['dataFlds'][i]] = calibrate(
                    Conf['fields'].index(Conf['dataFlds'][i]),
                    Conf['calibrations'], bin_data[i])
            except:
                values[Conf['dataFlds'][i]] = int(bin_data[i])
    if ('raw' in conf.keys()) and (conf['raw'] != None) and \
       (not type(conf['raw']) is bool):
        Conf['raw'].publish(tag='Spec%s' % conf['gas'].upper(),
                            data=','.join(rawData))
    # print("Got values: ", values)
    # Close(conf)
    return values
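# The ppb -> ug/m3 conversion in PPB2ugm3 follows the ideal-gas molar volume:
# at T oC one mole of gas occupies 22.414 * (273.15 + T) / 273.15 litres, so
# ug/m3 = ppb * MW * 273.15 / 22.414 / (273.15 + T) ~= ppb * MW * 12.187 / (273.15 + T).
# A standalone check of that arithmetic (helper name is illustrative):

```python
def ppb_to_ugm3(mol_weight, ppb, temp_c):
    """Convert a gas concentration from ppb to ug/m3 at a given
    temperature; 12.187 = 273.15 K / 22.414 l/mol (molar volume at 0 oC)."""
    if ppb < 0:     # negative readings mean "no gas seen"
        return 0
    return round(ppb * 12.187 * mol_weight / (273.15 + temp_c), 2)

# NO2 (46.0 g/mol), 100 ppb at 20 oC
print(ppb_to_ugm3(46.0, 100, 20.0))     # 191.23
```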
def GetNextRecord():
    ''' get next record from DataRecords and InfoRecords.
        If DataRecords is empty, find the next database and download
        DataRecords from the last time downloaded.
        On first download try from the InfoRecord time. '''
    global DataRecords, InfoRecords, lastDB, DBlist, Conf
    identTried = 0
    while True:
        if Conf['updated'] + Conf['update'] < time():   # (re)download list of DBs
            db_list()   # get list of databases
            # update internal cache of idents
            for item in list(InfoRecords.keys()):
                if not item in Conf['databases']:
                    if lastDB == item:
                        lastDB = None
                        DataRecords = []
                    del InfoRecords[item]
            for item in Conf['databases']:
                if not item in InfoRecords.keys():  # initialise info struct
                    initInfo(item)
                    if not len(DataRecords):
                        lastDB = item
        if not len(DataRecords):    # download data records to bufsize
            if not len(DBlist):
                DBlist = list(Conf['databases'])    # init working list of waiting DBs
            if not len(DBlist):
                MyLogger.log(modulename, 'ERROR',
                    'No Influx database to subscribe to')
                Conf['input'] = False
                return {}
            for item in DBlist:     # pick up oldest in row waiting
                if lastDB == None:
                    lastDB = item
                if not item in InfoRecords.keys():  # initialise info struct
                    initInfo(item)
                if InfoRecords[item]['lastTime'] < InfoRecords[lastDB]['lastTime']:
                    lastDB = item
            # not handled: buffer empty and new ident records just collected
            if not len(InfoRecords[lastDB]['ident']):
                if not getIdent(lastDB, 0):     # nothing there yet
                    DBlist.remove(lastDB)
                    lastDB = None
                    continue
            # there is at least one db with identification info
            if not getData(lastDB):
                DBlist.remove(lastDB)
                lastDB = None
                continue
        if 'InFluxStamp' in DataRecords[0].keys():
            InfoRecords[lastDB]['dataStamp'] = DataRecords[0]['InFluxStamp']
            del DataRecords[0]['InFluxStamp']
        else:
            InfoRecords[lastDB]['dataStamp'] = 0
        RememberInfo()
        return {
            'register': InfoRecords[lastDB]['ident'],
            'data': DataRecords.pop(0)
        }
def doQuery(database, query):
    global InfoRecords
    # expect info response as:
    # [{
    #   u'time': u'2017-05-23T14:13:38Z',
    #   u'geolocation': u'41.134\\,15.121\\,23',
    #   u'fields': u'time\\,pm25\\,pm10\\,spm25\\,spm10',
    #   u'extern_ip': u'38.116.115.20',
    #   u'project': u'BdP',
    #   u'units': u's\\,pcs/qf\\,pcs/qf\\,pcs/qf\\,pcs/qf',
    #   u'serial': u'33004d45',
    #   u'types': u'PPD42NS\\,Nova SDS011'
    # }]
    # expect data response as:
    # [{u'spm25': 866, u'geolocation': u'"31.1234,30.112,23"', u'pm10': None,
    #   u'timestamp': None, u'spm10': 6.5, u'time': u'2017-05-19T21:24:15Z',
    #   u'new': u'1', u'pm25': 375},
    #  {u'spm25': 882, u'geolocation': u'"31.1234,30.112,26"', u'pm10': None,
    #   u'timestamp': 1.495998546e+9, u'spm10': 6.667,
    #   u'time': u'2017-05-19T21:30:36Z', u'new': u'0', u'pm25': 162}, ... ]
    try:
        response = list(Conf['fd'].query(
            query, database=database, expected_response_code=200).get_points())
        waitReset()
    except:
        MyLogger.log(modulename, 'ERROR',
            "Query data request error: %s; value: %s" %
            (sys.exc_info()[0], sys.exc_info()[1]))
        wait()
        return []
    timing = None
    if not database in InfoRecords.keys():
        MyLogger.log(modulename, 'FATAL', 'InFlux sub script internal error')
    if len(response):   # no explode needed
        for i in range(0, len(response)):
            if 'time' in response[i].keys():
                timing = int(mktime(strptime(
                    response[i]['time'].replace('Z', 'UTC'),
                    '%Y-%m-%dT%H:%M:%S%Z')))
                response[i]['InFluxStamp'] = timing  # keep track of InFlux timestamp
                del response[i]['time']
            if 'timestamp' in response[i].keys():
                if response[i]['timestamp'] == None:
                    response[i]['timestamp'] = timing
                if not type(response[i]['timestamp']) is str:
                    response[i]['time'] = int(response[i]['timestamp'])
                if not 'time' in response[i].keys():
                    response[i]['time'] = timing
                del response[i]['timestamp']
            if 'new' in response[i].keys():
                if response[i]['new'] == '1':
                    if (not i) and (timing - 10 > InfoRecords[database]['infoStamp']):
                        getIdent(database, timing - 10)
                    elif i:
                        del response[i:]
                        break
                del response[i]['new']
    if timing == None:
        timing = time()
    InfoRecords[database]['lastTime'] = int(timing)
    return response
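# A note on the timestamp parsing above: strptime + mktime interprets the
# parsed struct_time in the *local* timezone, so the epoch value shifts with
# the host's TZ setting. calendar.timegm is the UTC-safe variant; a sketch
# (function name is illustrative, not part of this module):

```python
import calendar
from time import strptime

def influx_time_to_epoch(ts):
    """Convert an InfluxDB RFC3339 timestamp like '2017-05-23T14:13:38Z'
    into a Unix epoch, treating the value as UTC regardless of host TZ."""
    return calendar.timegm(strptime(ts, '%Y-%m-%dT%H:%M:%SZ'))

print(influx_time_to_epoch('2017-05-23T14:13:38Z'))
```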
def registrate(args):
    global Conf, CSV, gspread
    for key in ['ident']:
        if not key in args.keys():
            MyLogger.log(modulename, 'FATAL', "%s argument missing." % key)
    if (Conf['fd'] != None) and (not Conf['fd']):
        if ('waiting' in Conf.keys()) and \
           ((Conf['waiting'] + Conf['last']) >= time()):
            CSV = {}
            raise IOError
        return False
    ID = args['ident']['serial']
    if not ID in CSV.keys():
        if len(CSV) >= 20:
            for id in list(CSV.keys()):     # garbage collection
                if not 'sheet' in CSV[id].keys():
                    del CSV[id]
                elif ('last' in CSV[id]) and \
                     (int(time()) - CSV[id]['last'] > 2 * 60 * 60):
                    del CSV[id]
        if len(CSV) >= 20:
            MyLogger.log(modulename, 'WARNING',
                "Too many GSPREAD id's in progress. Skipping %s" % ID)
            return False
        CSV[ID] = {'worksheet': Conf['sheet'], 'renew': 0}
    ID = CSV[ID]
    # authenticate
    if not 'auth' in ID.keys():
        try:
            ID['auth'] = authenticate_google_docs()
        except:
            ID['auth'] = False
    if ID['auth'] == False:
        MyLogger.log(modulename, 'ERROR',
            'Unable to get access to gspread. Disabled.')
        Conf['output'] = False
        return False
    # get the spreadsheet
    if not ID['worksheet']:
        ID['worksheet'] = args['ident']['project'] + '_' + args['ident']['serial']
    ID['sheet'] = Conf['sheet']
    ID['fd'] = None
    try:
        if ID['worksheet'] in ID['auth'].worksheets():
            ID['fd'] = ID['auth'].open(ID['worksheet'])
            new = False
        else:
            ID['fd'] = ID['auth'].create(ID['worksheet'])
            if (Conf['user'] == None) and (Conf['hostname'] == None):
                ID['fd'].share(None, perm_type='anyone', role='reader')
            else:
                ID['fd'].share('%s@%s' % (Conf['user'], Conf['hostname']),
                               perm_type='user', role='reader')
    except:
        Conf['output'] = False
        MyLogger.log(modulename, 'ERROR',
            'Unable to authorize or access spreadsheet.')
        return False
    try:
        show_ident(args['ident'], ID['fd'])
    except:
        pass
    # get the sheet sorted by ttl (dflt per month)
    newNme = Conf['sheet'] + datetime.datetime.now().strftime(Conf['ttl'])
    new = False
    if (ID['sheet'] != newNme) or (not 'cur_sheet' in ID.keys()):
        new = True
        ID['sheet'] = newNme
        ID['cur_sheet'] = None
    if new:
        Row = []
        cells = args['ident']
        for i in range(0, len(cells['fields'])):
            # this needs some more thought
            if type(args['data'][cells['fields'][i]]) is list:
                for j in range(1, len(args['data'][cells['fields'][i]])):
                    Row.append("%s_%d(%s)" %
                        (cells['fields'][i], j, cells['units'][i]))
            else:
                Row.append("%s(%s)" % (cells['fields'][i], cells['units'][i]))
        try:
            ID['cur_sheet'] = ID['fd'].add_worksheet(
                title=newNme, rows="%d" % (32 * 24 * 60), cols="%d" % len(Row))
            # would be nice to give them a color
            ID['cur_sheet'].append_rows(Row)
        except:
            MyLogger.log(modulename, 'ERROR',
                'Unable to create new %s sheet in %s.' % (newNme, ID['worksheet']))
            Conf['last'] = time()
            Conf['fd'] = 0
            Conf['waitCnt'] += 1
            if not (Conf['waitCnt'] % 5):
                Conf['waiting'] *= 2
            raise IOError
    MyLogger.log(modulename, 'INFO',
        "Created and can add gspread records to file: " + ID['sheet'])
    Conf['waiting'] = 5 * 30
    Conf['last'] = 0
    Conf['waitCnt'] = 0
    return True
    # 'rssi': None,   # if wifi provide signal strength
    # 'last': None,   # last time checked to throttle connection info
    'fields': ['rssi'],     # strength or signal level
    'units': ['dBm'],       # per field the measurement unit
    'raw': False,           # no raw measurements displayed by dflt
}

try:
    import MyLogger
    import re
    # import subprocess, threading
    import subprocess
    from threading import Timer
    from time import time
except ImportError as e:
    MyLogger.log(modulename, 'FATAL', "Unable to load module %s" % e)

class Command(object):
    def __init__(self, cmd):
        self.cmd = cmd
        self.process = None

    def run(self, timeout):
        stdout = ''
        def target():
            global stdout
            print 'Thread started'
            self.process = subprocess.Popen(self.cmd, shell=True,