def normalize_freqs(cur):
    global upgraded
    global be_quiet
    # select freqs from messages and ensure there are three decimal places
    tables = ["messages", "messages_saved", "freqs"]
    for table in tables:
        acarshub_logging.log(
            f"Normalizing frequencies in {table}",
            "db_upgrade",
            level=LOG_LEVEL["INFO"],
        )
        cur.execute(f"SELECT freq, count(*) as cnt FROM {table} GROUP BY freq")
        freqs = cur.fetchall()
        for freq in freqs:
            freq_in_table = freq[0]
            if len(freq_in_table) != 7:
                upgraded = True
                adjusted_freq = freq_in_table.ljust(7, "0")
                cur.execute(
                    f"UPDATE {table} SET freq = ? WHERE freq = ?",
                    (adjusted_freq, freq_in_table),
                )
        acarshub_logging.log(
            f"Normalizing frequencies in {table} complete",
            "db_upgrade",
            level=LOG_LEVEL["INFO"],
        )
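The padding rule above can be exercised in isolation. This is a standalone sketch; `pad_freq` is a hypothetical helper, not part of the upgrade script:

```python
def pad_freq(freq: str) -> str:
    # Right-pad a frequency string with zeros until it is 7 characters long,
    # i.e. three decimal places, matching normalize_freqs above.
    return freq.ljust(7, "0")

print(pad_freq("131.55"))   # -> 131.550
print(pad_freq("130.025"))  # -> 130.025 (already normalized, unchanged)
```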
def request_graphs(message, namespace):
    pt = time.time()
    requester = request.sid
    socketio.emit(
        "alert_terms",
        {"data": get_cached(acarshub_helpers.acarshub_database.get_alert_counts, 30)},
        to=requester,
        namespace="/main",
    )
    socketio.emit(
        "signal",
        {"levels": get_cached(acarshub_helpers.acarshub_database.get_signal_levels, 30)},
        namespace="/main",
    )
    pt = time.time() - pt
    acarshub_logging.log(
        f"request_graphs took {pt * 1000:.0f}ms",
        "request_graphs",
        level=LOG_LEVEL["DEBUG"],
    )
def add_triggers(cur, db: Connection, table: str, columns: List[str]):
    global upgraded
    column_list_without = ",".join(
        f"{c}" for c in columns if c.find(" UNINDEXED") == -1
    )
    triggers = [
        i[1]
        for i in cur.execute("select * from sqlite_master where type = 'trigger';")
    ]
    execute_script = ""

    if f"{table}_fts_insert" not in triggers:
        execute_script += """
        CREATE TRIGGER {table}_fts_insert AFTER INSERT ON messages BEGIN
            INSERT INTO {table}_fts (rowid, {column_list})
            VALUES (new.id, {new_columns});
        END;
        """.format(
            table=table,
            column_list=column_list_without,
            new_columns=",".join(
                f"new.{c}" for c in columns if c.find(" UNINDEXED") == -1
            ),
        )

    if f"{table}_fts_delete" not in triggers:
        execute_script += """
        CREATE TRIGGER {table}_fts_delete AFTER DELETE ON messages BEGIN
            INSERT INTO {table}_fts ({table}_fts, rowid, {column_list})
            VALUES ('delete', old.id, {old_columns});
        END;
        """.format(
            table=table,
            column_list=column_list_without,
            old_columns=",".join(
                f"old.{c}" for c in columns if c.find(" UNINDEXED") == -1
            ),
        )

    if f"{table}_fts_update" not in triggers:
        execute_script += """
        CREATE TRIGGER {table}_fts_update AFTER UPDATE ON messages BEGIN
            INSERT INTO {table}_fts ({table}_fts, rowid, {column_list})
            VALUES ('delete', old.id, {old_columns});
            INSERT INTO {table}_fts (rowid, {column_list})
            VALUES (new.id, {new_columns});
        END;
        """.format(
            table=table,
            column_list=column_list_without,
            new_columns=",".join(
                f"new.{c}" for c in columns if c.find(" UNINDEXED") == -1
            ),
            old_columns=",".join(
                f"old.{c}" for c in columns if c.find(" UNINDEXED") == -1
            ),
        )

    if execute_script != "":
        upgraded = True
        acarshub_logging.log(
            "Inserting FTS triggers", "db_upgrade", level=LOG_LEVEL["INFO"]
        )
        db.executescript(execute_script)
        # rebuild the FTS index so it reflects the current table contents
        db.executescript('INSERT INTO messages_fts(messages_fts) VALUES ("rebuild")')
def handle_message(message, namespace):
    import time

    start_time = time.time()
    # We are going to send the result over in one blob
    # search.js will only maintain the most recent blob we send over
    total_results, serialized_json, search_term = acarshub_helpers.handle_message(message)

    # grab the socket id for the request
    # This stops the broadcast of the search results to everyone
    # in the search namespace.
    requester = request.sid

    socketio.emit(
        "database_search_results",
        {
            "num_results": total_results,
            "msghtml": serialized_json,
            "search_term": str(search_term),
            "query_time": time.time() - start_time,
        },
        to=requester,
        namespace="/main",
    )

    pt = time.time() - start_time
    acarshub_logging.log(
        f"query took {pt * 1000:.0f}ms: {str(search_term)}",
        "query_search",
        level=LOG_LEVEL["DEBUG"],
    )
def update_db(vdlm=0, acars=0, error=0):
    total = vdlm + acars
    try:
        rrdtool.update("/run/acars/acarshub.rrd", f"N:{acars}:{vdlm}:{total}:{error}")
        acarshub_logging.log(
            f"rrdtool.update: N:{acars}:{vdlm}:{total}:{error}",
            "rrdtool",
            level=LOG_LEVEL["DEBUG"],
        )
    except Exception as e:
        acarshub_logging.acars_traceback(e, "rrdtool")
def database_search(search_term, page=0):
    result = None
    session = None
    try:
        acarshub_logging.log(
            f"[database] Searching database for {search_term}",
            "database",
            level=LOG_LEVEL["DEBUG"],
        )

        # Build an FTS5 MATCH expression, quoted for inclusion in the SQL below
        match_string = ""
        for key in search_term:
            if search_term[key] is not None and search_term[key] != "":
                if match_string == "":
                    match_string += f'\'{key}:"{search_term[key]}"*'
                else:
                    match_string += f' AND {key}:"{search_term[key]}"*'

        if match_string == "":
            return [None, 0]
        match_string += "'"

        session = db_session()
        result = session.execute(
            f"SELECT * FROM messages WHERE id IN "
            f"(SELECT rowid FROM messages_fts WHERE messages_fts MATCH {match_string} "
            f"ORDER BY rowid DESC LIMIT 50 OFFSET {page * 50})"
        )
        count = session.execute(
            f"SELECT COUNT(*) FROM messages_fts WHERE messages_fts MATCH {match_string}"
        )

        processed_results = []
        final_count = 0
        for row in count:
            final_count = row[0]

        if final_count == 0:
            session.close()
            return [None, 0]

        for row in result:
            processed_results.append(dict(row))

        session.close()
        return (processed_results, final_count)
    except Exception as e:
        acarshub_logging.acars_traceback(e, "database")
        if session is not None:
            session.close()
        return [None, 0]
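The MATCH-expression construction above can be illustrated without a database. This sketch uses a hypothetical `build_match_string` helper and omits the outer SQL quoting that `database_search` adds:

```python
def build_match_string(search_term: dict) -> str:
    # Build an FTS5 MATCH expression such as 'flight:"UAL123"* AND icao:"A1B2C3"*'
    # from a dict of field -> value, skipping None and empty values.
    parts = [
        f'{key}:"{value}"*'
        for key, value in search_term.items()
        if value is not None and value != ""
    ]
    return " AND ".join(parts)

print(build_match_string({"flight": "UAL123", "tail": "", "icao": "A1B2C3"}))
# -> flight:"UAL123"* AND icao:"A1B2C3"*
```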
def prune_database():
    session = None
    try:
        acarshub_logging.log("Pruning database", "database")
        cutoff = (
            datetime.datetime.now()
            - datetime.timedelta(days=acarshub_configuration.DB_SAVE_DAYS)
        ).timestamp()
        session = db_session()
        result = session.query(messages).filter(messages.time < cutoff).delete()
        acarshub_logging.log("Pruned %s messages" % result, "database")
        session.commit()

        acarshub_logging.log("Pruning alert database", "database")
        cutoff = (
            datetime.datetime.now()
            - datetime.timedelta(days=acarshub_configuration.DB_ALERT_SAVE_DAYS)
        ).timestamp()
        result = (
            session.query(messages_saved).filter(messages_saved.time < cutoff).delete()
        )
        acarshub_logging.log("Pruned %s messages" % result, "database")
        session.commit()
    except Exception as e:
        acarshub_logging.acars_traceback(e, "database")
    finally:
        if session:
            session.close()
def reset_alert_counts(message, namespace):
    if message["reset_alerts"]:
        acarshub_helpers.acarshub_database.reset_alert_counts()
        try:
            socketio.emit(
                "alert_terms",
                {"data": acarshub_helpers.acarshub_database.get_alert_counts()},
                namespace="/main",
            )
        except Exception as e:
            acarshub_logging.log(f"Main Connect: Error sending alert_terms: {e}", "webapp")
            acarshub_logging.acars_traceback(e, "webapp")
def request_count(message, namespace):
    pt = time.time()
    requester = request.sid
    socketio.emit(
        "signal_count",
        {"count": get_cached(acarshub_helpers.acarshub_database.get_errors, 30)},
        to=requester,
        namespace="/main",
    )
    pt = time.time() - pt
    acarshub_logging.log(
        f"request_count took {pt * 1000:.0f}ms",
        "request_count",
        level=LOG_LEVEL["DEBUG"],
    )
def de_null(cur):
    # we need to ensure the columns don't have any NULL values
    # Legacy db problems...
    acarshub_logging.log(
        "Ensuring no columns contain NULL values",
        "db_upgrade",
        level=LOG_LEVEL["INFO"],
    )
    columns = [
        "toaddr", "fromaddr", "depa", "dsta", "eta", "gtout", "gtin",
        "wloff", "wlin", "lat", "lon", "alt", "msg_text", "tail", "flight",
        "icao", "freq", "ack", "mode", "label", "block_id", "msgno",
        "is_response", "is_onground", "error", "libacars", "level",
    ]
    for column in columns:
        cur.execute(f'UPDATE messages SET {column} = "" WHERE {column} IS NULL')
    acarshub_logging.log("done with de-nulling", "db_upgrade", level=LOG_LEVEL["INFO"])
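The de-null pattern can be demonstrated on a toy table with Python's built-in `sqlite3` module. The two-column schema here is hypothetical, not the real ACARS Hub `messages` schema:

```python
import sqlite3

# In-memory database with a toy table containing a NULL value
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE messages (tail TEXT, flight TEXT)")
cur.execute("INSERT INTO messages VALUES (NULL, 'UAL123')")

# Replace NULLs with empty strings, column by column, as de_null does
for column in ("tail", "flight"):
    cur.execute(f"UPDATE messages SET {column} = '' WHERE {column} IS NULL")

row = cur.execute("SELECT tail, flight FROM messages").fetchone()
print(row)  # -> ('', 'UAL123')
```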
def htmlListener():
    import time
    import sys

    # Run while requested...
    while not thread_html_generator_event.isSet():
        sys.stdout.flush()
        time.sleep(1)
        while len(que_messages) != 0:
            message_source, json_message = que_messages.popleft()
            client_message = generateClientMessage(message_source, json_message)
            socketio.emit("acars_msg", {"msghtml": client_message}, namespace="/main")
            # acarshub_logging.log(f"EMIT: {client_message}", "htmlListener", level=LOG_LEVEL["DEBUG"])

    acarshub_logging.log(
        "Exiting HTML Listener thread", "htmlListener", level=LOG_LEVEL["DEBUG"]
    )
def create_db():
    try:
        if not os.path.exists("/run/acars/acarshub.rrd"):
            acarshub_logging.log("creating the RRD Database", "rrdtool")
            rrdtool.create(
                "/run/acars/acarshub.rrd",
                "--start",
                "N",
                "--step",
                "60",
                "DS:ACARS:GAUGE:120:U:U",
                "DS:VDLM:GAUGE:120:U:U",
                "DS:TOTAL:GAUGE:120:U:U",
                "DS:ERROR:GAUGE:120:U:U",
                "RRA:AVERAGE:0.5:1:1500",  # 25 hours at 1 minute resolution
                "RRA:AVERAGE:0.5:5:8640",  # 1 month at 5 minute resolution
                "RRA:AVERAGE:0.5:60:4320",  # 6 months at 1 hour resolution
                "RRA:AVERAGE:0.5:360:4380",  # 3 years at 6 hour resolution
            )
        else:
            acarshub_logging.log("Database found", "rrdtool")
    except Exception as e:
        acarshub_logging.acars_traceback(e, "rrdtool")
def enable_fts(db: Connection, table: str, columns: List[str]):
    column_list_without = ",".join(
        f"{c}" for c in columns if c.find(" UNINDEXED") == -1
    )
    acarshub_logging.log("Creating new FTS table", "db_upgrade", level=LOG_LEVEL["INFO"])
    db.executescript(
        """
        CREATE VIRTUAL TABLE {table}_fts USING fts5 (
            {column_list},
            content={table},
            content_rowid={rowid}
        )""".format(table=table, column_list=column_list_without, rowid="id")
    )
    acarshub_logging.log("Creating new triggers", "db_upgrade", level=LOG_LEVEL["INFO"])
    db.executescript(
        """
        CREATE TRIGGER {table}_fts_insert AFTER INSERT ON messages BEGIN
            INSERT INTO {table}_fts (rowid, {column_list})
            VALUES (new.id, {new_columns});
        END;
        CREATE TRIGGER {table}_fts_delete AFTER DELETE ON messages BEGIN
            INSERT INTO {table}_fts ({table}_fts, rowid, {column_list})
            VALUES ('delete', old.id, {old_columns});
        END;
        CREATE TRIGGER {table}_fts_update AFTER UPDATE ON messages BEGIN
            INSERT INTO {table}_fts ({table}_fts, rowid, {column_list})
            VALUES ('delete', old.id, {old_columns});
            INSERT INTO {table}_fts (rowid, {column_list})
            VALUES (new.id, {new_columns});
        END;
        """.format(
            table=table,
            column_list=column_list_without,
            new_columns=",".join(
                f"new.{c}" for c in columns if c.find(" UNINDEXED") == -1
            ),
            old_columns=",".join(
                f"old.{c}" for c in columns if c.find(" UNINDEXED") == -1
            ),
        )
    )
    acarshub_logging.log(
        "Populating new FTS table with data", "db_upgrade", level=LOG_LEVEL["INFO"]
    )
    db.executescript('INSERT INTO messages_fts(messages_fts) VALUES ("rebuild")')
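A minimal sketch of the external-content FTS5 pattern used above, on a toy one-column `messages` table (assumes an SQLite build with the FTS5 extension, which standard CPython builds normally include):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# External-content FTS5 table: the index stores no copy of the text,
# it reads content from "messages" keyed by rowid, so triggers must
# keep it in sync with inserts/updates/deletes.
conn.executescript("""
CREATE TABLE messages (id INTEGER PRIMARY KEY, msg_text TEXT);
CREATE VIRTUAL TABLE messages_fts USING fts5 (
    msg_text, content=messages, content_rowid=id
);
CREATE TRIGGER messages_fts_insert AFTER INSERT ON messages BEGIN
    INSERT INTO messages_fts (rowid, msg_text) VALUES (new.id, new.msg_text);
END;
""")
conn.execute("INSERT INTO messages (msg_text) VALUES ('POSITION REPORT')")
rows = conn.execute(
    "SELECT rowid FROM messages_fts WHERE messages_fts MATCH 'position'"
).fetchall()
print(rows)  # -> [(1,)]
```

The default FTS5 tokenizer folds case, which is why the lowercase query matches the uppercase message text.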
def message_listener(message_type=None, ip="127.0.0.1", port=None):
    import time
    import socket
    import json

    global error_messages_last_minute
    if message_type == "VDLM2":
        global vdlm_messages_last_minute
    elif message_type == "ACARS":
        global acars_messages_last_minute

    disconnected = True
    receiver = socket.socket(family=socket.AF_INET, type=socket.SOCK_STREAM)
    acarshub_logging.log(
        f"message_listener starting: {message_type.lower()}",
        "message_listener",
        level=LOG_LEVEL["DEBUG"],
    )
    partial_message = None

    # Run while requested...
    while not thread_message_listener_stop_event.isSet():
        data = None
        # acarshub_logging.log(f"recv_from ...", "message_listener", level=LOG_LEVEL["DEBUG"])
        try:
            if disconnected:
                receiver = socket.socket(family=socket.AF_INET, type=socket.SOCK_STREAM)
                # Set socket timeout to 1 second
                receiver.settimeout(1)
                # Connect to the sender
                receiver.connect((ip, port))
                disconnected = False
                acarshub_logging.log(
                    f"{message_type.lower()}_receiver connected to {ip}:{port}",
                    f"{message_type.lower()}Generator",
                    level=LOG_LEVEL["DEBUG"],
                )
            if acarshub_configuration.LOCAL_TEST is True:
                data, addr = receiver.recvfrom(65527)
            else:
                data, addr = receiver.recvfrom(65527, socket.MSG_WAITALL)
        except socket.timeout:
            continue
        except socket.error as e:
            acarshub_logging.log(
                f"Error to {ip}:{port}. Reattempting...",
                f"{message_type.lower()}Generator",
                level=LOG_LEVEL["ERROR"],
            )
            acarshub_logging.acars_traceback(e, f"{message_type.lower()}Generator")
            disconnected = True
            receiver.close()
            time.sleep(1)
            continue
        except Exception as e:
            acarshub_logging.acars_traceback(e, f"{message_type.lower()}Generator")
            disconnected = True
            receiver.close()
            time.sleep(1)
            continue

        # acarshub_logging.log(f"{message_type.lower()}: got data", "message_listener", level=LOG_LEVEL["DEBUG"])
        if data is not None:
            decoded = data.decode()
        else:
            decoded = ""

        if decoded == "":
            disconnected = True
            receiver.close()
            continue

        # Decode json
        # There is a rare condition where we'll receive two messages at once
        # We will cover this condition off by ensuring each json message is
        # broken apart and handled individually

        # acarsdec or vdlm2dec single message ends with a newline so no additional processing required
        # acarsdec or vdlm2dec multi messages end with a newline and each message has a newline but the decoder
        # breaks with more than one JSON object

        # in case of back to back objects, add a newline to split on
        decoded = decoded.replace("}{", "}\n{")
        # split on newlines
        split_json = decoded.splitlines()

        # try and reassemble messages that were received separately
        if partial_message is not None and len(split_json) > 0:
            combined = partial_message + split_json[0]
            try:
                # check if we can decode the json
                json.loads(combined)
                # no exception, json decoded fine, reassembly succeeded
                # replace the first string in the list with the reassembled string
                split_json[0] = combined
                acarshub_logging.log(
                    "Reassembly successful, message not skipped after all!",
                    f"{message_type.lower()}Generator",
                    1,
                )
            except Exception as e:
                # reassembly didn't work, don't do anything but print an error when debug is enabled
                acarshub_logging.log(
                    f"Reassembly failed {e}: {combined}",
                    f"{message_type.lower()}Generator",
                    level=LOG_LEVEL["DEBUG"],
                )
        # forget the partial message, it can't be useful anymore
        partial_message = None

        for part in split_json:
            # acarshub_logging.log(f"{message_type.lower()}: part: {part}", "message_listener", level=LOG_LEVEL["DEBUG"])
            if len(part) == 0:
                continue
            msg = None
            try:
                msg = json.loads(part)
            except ValueError as e:
                if part == split_json[-1]:
                    # last element in the list, could be a partial json object
                    partial_message = part
                acarshub_logging.log(
                    f"JSON Error: {e}", f"{message_type.lower()}Generator", 1
                )
                acarshub_logging.log(
                    f"Skipping Message: {part}", f"{message_type.lower()}Generator", 1
                )
                continue
            except Exception as e:
                acarshub_logging.log(
                    f"Unknown Error with JSON input: {e}",
                    f"{message_type.lower()}Generator",
                    level=LOG_LEVEL["ERROR"],
                )
                acarshub_logging.acars_traceback(e, f"{message_type.lower()}Generator")
                continue

            que_type = getQueType(message_type)
            if message_type == "VDLM2":
                vdlm_messages_last_minute += 1
            elif message_type == "ACARS":
                acars_messages_last_minute += 1
            if "error" in msg:
                if msg["error"] > 0:
                    error_messages_last_minute += msg["error"]

            que_messages.append((que_type, acars_formatter.format_acars_message(msg)))
            que_database.append((que_type, acars_formatter.format_acars_message(msg)))

            if len(list_of_recent_messages) >= list_of_recent_messages_max:
                # Keep the que size down
                del list_of_recent_messages[0]

            if not acarshub_configuration.QUIET_MESSAGES:
                print(f"MESSAGE:{message_type.lower()}Generator: {msg}")

            client_message = generateClientMessage(
                que_type, acars_formatter.format_acars_message(msg)
            )
            # add to recent message que for anyone fresh loading the page
            list_of_recent_messages.append(client_message)
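The back-to-back-object splitting step above can be shown on its own. This sketch uses a hypothetical `split_stream` helper wrapping the same `replace`/`splitlines` idiom:

```python
import json

def split_stream(decoded: str):
    # Insert a newline between back-to-back JSON objects ("}{" boundaries),
    # then split on lines, as message_listener does before parsing each part.
    return decoded.replace("}{", "}\n{").splitlines()

parts = split_stream('{"a": 1}{"b": 2}')
print([json.loads(p) for p in parts])  # -> [{'a': 1}, {'b': 2}]
```

Note the idiom is intentionally simple: it would mis-split if the literal sequence `}{` ever appeared inside a JSON string value, which the decoders' output apparently avoids.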
# msgno = Column("msgno", String(32), index=True, nullable=False)
# is_response = Column("is_response", String(32), nullable=False)
# is_onground = Column("is_onground", String(32), nullable=False)
# error = Column("error", String(32), nullable=False)
# libacars = Column("libacars", Text, nullable=False)
# level = Column("level", String(32), nullable=False)
# term = Column("term", String(32), nullable=False)
# type_of_match = Column("type_of_match", String(32), nullable=False)

if (
    os.getenv("LOCAL_TEST", default=False)
    and str(os.getenv("LOCAL_TEST", default=False)).upper() == "TRUE"
):
    path_to_db = os.getenv("DB_PATH")
else:
    path_to_db = "/run/acars/messages.db"

acarshub_logging.log("Checking to see if database needs upgrades", "db_upgrade")
upgraded = False
exit_code = 0

count_table = (
    'CREATE TABLE "count" ("id" INTEGER NOT NULL, "total" INTEGER, '
    '"errors" INTEGER, "good" INTEGER, PRIMARY KEY("id"));'
)
freq_table = (
    'CREATE TABLE "freqs" ("it" INTEGER NOT NULL, "freq" VARCHAR(32), '
    '"freq_type" VARCHAR(32), "count" INTEGER, PRIMARY KEY("it"));'
)
level_table = (
    'CREATE TABLE "level" ("id" INTEGER NOT NULL, "level" INTEGER, '
    '"count" INTEGER, PRIMARY KEY("id"));'
)
messages_table = (
    'CREATE TABLE "messages" ("id" INTEGER NOT NULL, "message_type" VARCHAR(32) NOT NULL, '
    '"msg_time" INTEGER NOT NULL, "station_id" VARCHAR(32) NOT NULL, "toaddr" VARCHAR(32) NOT NULL, '
    '"fromaddr" VARCHAR(32) NOT NULL, "depa" VARCHAR(32) NOT NULL, "dsta" VARCHAR(32) NOT NULL, '
    '"eta" VARCHAR(32) NOT NULL, "gtout" VARCHAR(32) NOT NULL, "gtin" VARCHAR(32) NOT NULL, '
    '"wloff" VARCHAR(32) NOT NULL, "wlin" VARCHAR(32) NOT NULL, "lat" VARCHAR(32) NOT NULL, '
    '"lon" VARCHAR(32) NOT NULL, "alt" VARCHAR(32) NOT NULL, "msg_text" TEXT NOT NULL, '
    '"tail" VARCHAR(32) NOT NULL, "flight" VARCHAR(32) NOT NULL, "icao" VARCHAR(32) NOT NULL, '
    '"freq" VARCHAR(32) NOT NULL, "ack" VARCHAR(32) NOT NULL, "mode" VARCHAR(32) NOT NULL, '
    '"label" VARCHAR(32) NOT NULL, "block_id" VARCHAR(32) NOT NULL, "msgno" VARCHAR(32) NOT NULL, '
    '"is_response" VARCHAR(32) NOT NULL, "is_onground" VARCHAR(32) NOT NULL, '
    '"error" VARCHAR(32) NOT NULL, "libacars" TEXT NOT NULL, "level" VARCHAR(32) NOT NULL, '
    'PRIMARY KEY("id"));'
)
def check_tables(conn, cur):
    global upgraded
    columns = [
        "message_type UNINDEXED",
        "msg_time UNINDEXED",
        "station_id UNINDEXED",
        "toaddr UNINDEXED",
        "fromaddr UNINDEXED",
        "depa",
        "dsta",
        "eta UNINDEXED",
        "gtout UNINDEXED",
        "gtin UNINDEXED",
        "wloff UNINDEXED",
        "wlin UNINDEXED",
        "lat UNINDEXED",
        "lon UNINDEXED",
        "alt UNINDEXED",
        "msg_text",
        "tail",
        "flight",
        "icao",
        "freq",
        "ack UNINDEXED",
        "mode UNINDEXED",
        "label",
        "block_id UNINDEXED",
        "msgno UNINDEXED",
        "is_response UNINDEXED",
        "is_onground UNINDEXED",
        "error UNINDEXED",
        "libacars UNINDEXED",
        "level UNINDEXED",
    ]
    tables = [
        i[0]
        for i in cur.execute(
            'SELECT name FROM sqlite_master WHERE type ="table" AND name NOT LIKE "sqlite_%"'
        )
    ]
    triggers = [
        i[1]
        for i in cur.execute("select * from sqlite_master where type = 'trigger';")
    ]

    if "text_fts" in tables:
        upgraded = True
        acarshub_logging.log("Removing old FTS table", "db_upgrade", level=LOG_LEVEL["INFO"])
        cur.execute('DROP TABLE "main"."text_fts";')
    if "message_ad" in triggers:
        upgraded = True
        acarshub_logging.log("Removing AD trigger", "db_upgrade", level=LOG_LEVEL["INFO"])
        cur.execute('DROP TRIGGER "main"."message_ad";')
    if "message_ai" in triggers:
        upgraded = True
        acarshub_logging.log("Removing AI trigger", "db_upgrade", level=LOG_LEVEL["INFO"])
        cur.execute('DROP TRIGGER "main"."message_ai";')
    if "message_au" in triggers:
        upgraded = True
        acarshub_logging.log("Removing AU trigger", "db_upgrade", level=LOG_LEVEL["INFO"])
        cur.execute('DROP TRIGGER "main"."message_au";')

    if "messages_fts" not in tables:
        upgraded = True
        acarshub_logging.log(
            "Adding in text search tables....may take a while",
            "db_upgrade",
            level=LOG_LEVEL["INFO"],
        )
        acarshub_logging.log("creating virtual table", "db_upgrade", level=LOG_LEVEL["INFO"])
        enable_fts(conn, "messages", columns)

    add_triggers(cur, conn, "messages", columns)
def main_connect():
    pt = time.time()
    import sys

    # need visibility of the global thread object
    global thread_html_generator
    global thread_adsb
    global thread_adsb_stop_event

    recent_options = {"loading": True, "done_loading": False}
    requester = request.sid

    try:
        socketio.emit(
            "features_enabled",
            {
                "vdlm": acarshub_configuration.ENABLE_VDLM,
                "acars": acarshub_configuration.ENABLE_ACARS,
                "arch": acarshub_configuration.ARCH,
                "adsb": {
                    "enabled": acarshub_configuration.ENABLE_ADSB,
                    "lat": acarshub_configuration.ADSB_LAT,
                    "lon": acarshub_configuration.ADSB_LON,
                    "url": acarshub_configuration.ADSB_URL,
                    "bypass": acarshub_configuration.ADSB_BYPASS_URL,
                    "range_rings": acarshub_configuration.ENABLE_RANGE_RINGS,
                },
            },
            to=requester,
            namespace="/main",
        )
        socketio.emit(
            "terms",
            {
                "terms": acarshub_helpers.acarshub_database.get_alert_terms(),
                "ignore": acarshub_helpers.acarshub_database.get_alert_ignore(),
            },
            to=requester,
            namespace="/main",
        )
    except Exception as e:
        acarshub_logging.log(f"Main Connect: Error sending features_enabled: {e}", "webapp")
        acarshub_logging.acars_traceback(e, "webapp")

    try:
        socketio.emit(
            "labels",
            {"labels": acarshub_helpers.acarshub_database.get_message_label_json()},
            to=requester,
            namespace="/main",
        )
    except Exception as e:
        acarshub_logging.log(f"Main Connect: Error sending labels: {e}", "webapp")
        acarshub_logging.acars_traceback(e, "webapp")

    msg_index = 1
    for json_message in list_of_recent_messages:
        if msg_index == len(list_of_recent_messages):
            recent_options["done_loading"] = True
        msg_index += 1
        try:
            socketio.emit(
                "acars_msg",
                {"msghtml": json_message, **recent_options},
                to=requester,
                namespace="/main",
            )
        except Exception as e:
            acarshub_logging.log(f"Main Connect: Error sending acars_msg: {e}", "webapp")
            acarshub_logging.acars_traceback(e, "webapp")

    try:
        socketio.emit(
            "system_status",
            {"status": acarshub_helpers.get_service_status()},
            to=requester,
            namespace="/main",
        )
    except Exception as e:
        acarshub_logging.log(f"Main Connect: Error sending system_status: {e}", "webapp")
        acarshub_logging.acars_traceback(e, "webapp")

    try:
        rows, size = get_cached(
            acarshub_helpers.acarshub_database.database_get_row_count, 30
        )
        socketio.emit(
            "database",
            {"count": rows, "size": size},
            to=requester,
            namespace="/main",
        )
    except Exception as e:
        acarshub_logging.log(f"Main Connect: Error sending database: {e}", "webapp")
        acarshub_logging.acars_traceback(e, "webapp")

    try:
        socketio.emit(
            "signal",
            {"levels": get_cached(acarshub_helpers.acarshub_database.get_signal_levels, 30)},
            to=requester,
            namespace="/main",
        )
        socketio.emit(
            "alert_terms",
            {"data": get_cached(acarshub_helpers.acarshub_database.get_alert_counts, 30)},
            to=requester,
            namespace="/main",
        )
        send_version()
    except Exception as e:
        acarshub_logging.log(f"Main Connect: Error sending signal levels: {e}", "webapp")
        acarshub_logging.acars_traceback(e, "webapp")

    # Start the htmlGenerator thread only if the thread has not been started before.
    if not thread_html_generator.is_alive():
        sys.stdout.flush()
        thread_html_generator_event.clear()
        thread_html_generator = socketio.start_background_task(htmlListener)

    pt = time.time() - pt
    acarshub_logging.log(
        f"main_connect took {pt * 1000:.0f}ms", "htmlListener", level=LOG_LEVEL["DEBUG"]
    )
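`get_cached` is used throughout to throttle expensive database calls with a time-to-live. Its implementation is not shown in this chunk; the following is a hypothetical minimal sketch of such a helper, not the project's actual code:

```python
import time

_cache = {}

def get_cached(fn, ttl_seconds):
    # Return fn()'s cached result until ttl_seconds have elapsed,
    # then call fn() again and refresh the cache entry.
    now = time.time()
    entry = _cache.get(fn)
    if entry is None or now - entry[0] >= ttl_seconds:
        entry = (now, fn())
        _cache[fn] = entry
    return entry[1]

calls = []
def expensive():
    calls.append(1)
    return "result"

print(get_cached(expensive, 30))  # -> result
print(get_cached(expensive, 30))  # -> result (cached; expensive ran once)
```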
def init():
    global list_of_recent_messages
    # grab recent messages from db and fill the most recent array
    # then turn on the listeners
    acarshub_logging.log("Grabbing most recent messages from database", "init")
    results = None
    try:
        results = acarshub_helpers.acarshub_database.grab_most_recent(
            list_of_recent_messages_max
        )
    except Exception as e:
        acarshub_logging.log(
            f"Startup Error grabbing most recent messages {e}",
            "init",
            level=LOG_LEVEL["ERROR"],
        )
        acarshub_logging.acars_traceback(e, "init")

    if not acarshub_configuration.LOCAL_TEST:
        try:
            acarshub_logging.log("Initializing RRD Database", "init")
            acarshub_rrd_database.create_db()  # make sure the RRD DB is created / there
        except Exception as e:
            acarshub_logging.log(f"Startup Error creating RRD Database {e}", "init")
            acarshub_logging.acars_traceback(e, "init")

    if results is not None:
        for json_message in results:
            try:
                que_type = getQueType(json_message["message_type"])
                client_message = generateClientMessage(que_type, json_message)
                list_of_recent_messages.insert(0, client_message)
            except Exception as e:
                acarshub_logging.log(
                    f"Startup Error adding message to recent messages {e}", "init"
                )
                acarshub_logging.acars_traceback(e, "init")

    acarshub_logging.log(
        "Completed grabbing messages from database, starting up rest of services",
        "init",
    )
    init_listeners()
def init_listeners(special_message=""):
    # This function both starts the listeners and is used with the scheduler
    # to restart errant threads
    global thread_acars_listener
    global thread_vdlm2_listener
    global thread_database
    global thread_scheduler
    global thread_html_generator
    global thread_adsb_listner
    global thread_adsb
    # REMOVE AFTER AIRFRAMES IS UPDATED ####
    global vdlm2_feeder_thread
    global acars_feeder_thread
    # REMOVE AFTER AIRFRAMES IS UPDATED ####

    # show log message if this is container startup
    acarshub_logging.log(
        "Starting Data Listeners" if special_message == "" else "Checking Data Listeners",
        "init",
        level=LOG_LEVEL["INFO"] if special_message == "" else LOG_LEVEL["DEBUG"],
    )
    if not thread_database.is_alive():
        acarshub_logging.log(
            f"{special_message}Starting Database Thread",
            "init",
            level=LOG_LEVEL["INFO"] if special_message == "" else LOG_LEVEL["ERROR"],
        )
        thread_database = Thread(target=database_listener)
        thread_database.start()
    if not thread_scheduler.is_alive():
        acarshub_logging.log(
            f"{special_message}starting scheduler",
            "init",
            level=LOG_LEVEL["INFO"] if special_message == "" else LOG_LEVEL["ERROR"],
        )
        thread_scheduler = Thread(target=scheduled_tasks)
        thread_scheduler.start()
    if not thread_html_generator.is_alive():
        acarshub_logging.log(
            f"{special_message}Starting htmlListener",
            "init",
            level=LOG_LEVEL["INFO"] if special_message == "" else LOG_LEVEL["ERROR"],
        )
        thread_html_generator = socketio.start_background_task(htmlListener)
    if not thread_acars_listener.is_alive() and acarshub_configuration.ENABLE_ACARS:
        acarshub_logging.log(
            f"{special_message}Starting ACARS listener",
            "init",
            level=LOG_LEVEL["INFO"] if special_message == "" else LOG_LEVEL["ERROR"],
        )
        thread_acars_listener = Thread(
            target=message_listener,
            args=("ACARS", acarshub_configuration.LIVE_DATA_SOURCE, 15550),
        )
        thread_acars_listener.start()
    if not thread_vdlm2_listener.is_alive() and acarshub_configuration.ENABLE_VDLM:
        acarshub_logging.log(
            f"{special_message}Starting VDLM listener",
            "init",
            level=LOG_LEVEL["INFO"] if special_message == "" else LOG_LEVEL["ERROR"],
        )
        thread_vdlm2_listener = Thread(
            target=message_listener,
            args=("VDLM2", acarshub_configuration.LIVE_DATA_SOURCE, 15555),
        )
        thread_vdlm2_listener.start()

    status = acarshub_helpers.get_service_status()  # grab system status
    # emit to all namespaces
    for page in acars_namespaces:
        socketio.emit("system_status", {"status": status}, namespace=page)
import json
import acarshub_configuration
import acarshub_logging
from acarshub_logging import LOG_LEVEL
import re
import os

groundStations = dict()  # dictionary of all ground stations
alert_terms = list()  # list of all alert terms monitored
# list of all alert terms that should flag a message as a non-alert, even if an alert matched
alert_terms_ignore = list()
overrides = {}

# Download station IDs
try:
    acarshub_logging.log("Downloading Station IDs", "database")
    with open("./data/ground-stations.json", "r") as f:
        groundStations_json = json.load(f)
    for station in groundStations_json["ground_stations"]:
        stationId = station.get("id")
        if stationId:
            groundStations[stationId] = {
                "icao": station["airport"]["icao"],
                "name": station["airport"]["name"],
            }
    acarshub_logging.log("Completed loading Station IDs", "database")
except Exception as e:
    acarshub_logging.acars_traceback(e, "database")

# Load Message Labels
def service_check():
    import re

    global decoders
    global servers
    global receivers
    global system_error
    global stats
    global start_time
    global external_formats

    if os.getenv("LOCAL_TEST", default=False):
        healthcheck = subprocess.Popen(
            ["../../tools/healthtest.sh"],
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
        )
    else:
        healthcheck = subprocess.Popen(
            ["/scripts/healthcheck.sh"], stdout=subprocess.PIPE, stderr=subprocess.PIPE
        )
    stdout, stderr = healthcheck.communicate()
    healthstatus = stdout.decode()

    decoders = dict()
    servers = dict()
    receivers = dict()
    stats = dict()
    external_formats = dict()
    system_error = False

    for line in healthstatus.split("\n"):
        try:
            match = re.search("(?:acarsdec|dumpvdl2)-.+ =", line)
            if match:
                if match.group(0).strip(" =") not in decoders:
                    decoders[match.group(0).strip(" =")] = dict()
                continue
            else:
                for decoder in decoders:
                    if line.find(decoder) != -1:
                        # find() == 0 means the line starts with "Decoder <name>".
                        # Check UNHEALTHY before HEALTHY: "UNHEALTHY".endswith("HEALTHY")
                        # is True, so the order of these branches matters.
                        if line.find(f"Decoder {decoder}") == 0 and line.endswith(
                            "UNHEALTHY"
                        ):
                            decoders[decoder]["Status"] = "Bad"
                            system_error = True
                        elif line.find(f"Decoder {decoder}") == 0 and line.endswith(
                            "HEALTHY"
                        ):
                            decoders[decoder]["Status"] = "Ok"
                        elif line.find(f"Decoder {decoder}") == 0:
                            system_error = True
                            decoders[decoder]["Status"] = "Unknown"
                        continue

            match = re.search("^(?:acars|vdlm2)_server", line)
            if match:
                if match.group(0) not in servers:
                    servers[match.group(0)] = dict()
                if line.find("listening") != -1 and line.endswith("UNHEALTHY"):
                    servers[match.group(0)]["Status"] = "Bad"
                    system_error = True
                elif line.find("listening") != -1 and line.endswith("HEALTHY"):
                    servers[match.group(0)]["Status"] = "Ok"
                elif line.find("listening") != -1:
                    system_error = True
                    servers[match.group(0)]["Status"] = "Unknown"
                elif line.find("python") != -1 and line.endswith("UNHEALTHY"):
                    system_error = True
                    servers[match.group(0)]["Web"] = "Bad"
                elif line.find("python") != -1 and line.endswith("HEALTHY"):
                    servers[match.group(0)]["Web"] = "Ok"
                elif line.find("python") != -1:
                    system_error = True
                    servers[match.group(0)]["Web"] = "Unknown"
                continue

            match = re.search(r"\d+\s+(?:ACARS|VDLM2) messages", line)
            if match:
                if line.find("ACARS") != -1 and "ACARS" not in receivers:
                    receivers["ACARS"] = dict()
                    receivers["ACARS"]["Count"] = line.split(" ")[0]
                    if line.endswith("UNHEALTHY"):
                        # give the receiver five minutes after startup before
                        # flagging the lack of messages as an error
                        if time.time() - start_time > 300.0:
                            system_error = True
                            receivers["ACARS"]["Status"] = "Bad"
                        else:
                            receivers["ACARS"]["Status"] = "Waiting for first message"
                    elif line.endswith("HEALTHY"):
                        receivers["ACARS"]["Status"] = "Ok"
                    else:
                        system_error = True
                        receivers["ACARS"]["Status"] = "Unknown"
                if line.find("VDLM2") != -1 and "VDLM2" not in receivers:
                    receivers["VDLM2"] = dict()
                    receivers["VDLM2"]["Count"] = line.split(" ")[0]
                    if line.endswith("UNHEALTHY"):
                        if time.time() - start_time > 300.0:
                            system_error = True
                            receivers["VDLM2"]["Status"] = "Bad"
                        else:
                            receivers["VDLM2"]["Status"] = "Waiting for first message"
                    elif line.endswith("HEALTHY"):
                        receivers["VDLM2"]["Status"] = "Ok"
                    else:
                        system_error = True
                        receivers["VDLM2"]["Status"] = "Unknown"
                continue

            match = re.search("^(acars|vdlm2)_stats", line)
            if match:
                if match.group(0) not in stats:
                    stats[match.group(0)] = dict()
                if line.endswith("UNHEALTHY"):
                    system_error = True
                    stats[match.group(0)]["Status"] = "Bad"
                elif line.endswith("HEALTHY"):
                    stats[match.group(0)]["Status"] = "Ok"
                else:
                    system_error = True
                    stats[match.group(0)]["Status"] = "Unknown"

            match = re.search("^planeplotter", line)
            if match:
                if line.find("vdl2") != -1:
                    pp_decoder = "VDLM2"
                else:
                    pp_decoder = "ACARS"
                if pp_decoder not in external_formats:
                    external_formats[pp_decoder] = []
                if line.endswith("UNHEALTHY"):
                    system_error = True
                    external_formats[pp_decoder].append(
                        {"type": "planeplotter", "Status": "Bad"}
                    )
                elif line.endswith("HEALTHY"):
                    external_formats[pp_decoder].append(
                        {"type": "planeplotter", "Status": "Ok"}
                    )
                else:
                    system_error = True
                    external_formats[pp_decoder].append(
                        {"type": "planeplotter", "Status": "Unknown"}
                    )

            match = re.search("dumpvdl2 and planeplotter", line)
            if match:
                if line.find("vdl2") != -1:
                    pp_decoder = "VDLM2"
                else:
                    pp_decoder = "ACARS"
                if pp_decoder not in external_formats:
                    external_formats[pp_decoder] = []
                if line.endswith("UNHEALTHY"):
                    system_error = True
                    external_formats[pp_decoder].append(
                        {"type": "dumpvdl2 to planeplotter", "Status": "Bad"}
                    )
                elif line.endswith("HEALTHY"):
                    external_formats[pp_decoder].append(
                        {"type": "dumpvdl2 to planeplotter", "Status": "Ok"}
                    )
                else:
                    system_error = True
                    external_formats[pp_decoder].append(
                        {"type": "dumpvdl2 to planeplotter", "Status": "Unknown"}
                    )
        except Exception as e:
            acarshub_logging.log(e, "service_check", level=LOG_LEVEL["ERROR"])
            acarshub_logging.acars_traceback(e)
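The first branch above keys the `decoders` dict off a regex match against the healthcheck output. A minimal standalone sketch of just that extraction step; the sample line is hypothetical, since the real format comes from `/scripts/healthcheck.sh`:

```python
import re


def extract_decoder_key(line):
    """Pull a decoder name like "acarsdec-<name>" out of a healthcheck line.

    Returns None when the line does not mention a decoder.
    """
    match = re.search("(?:acarsdec|dumpvdl2)-.+ =", line)
    if match:
        # strip the trailing " =" to get the bare decoder key
        return match.group(0).strip(" =")
    return None


# Hypothetical healthcheck output line; the real script's format is assumed here.
print(extract_decoder_key("acarsdec-feeder1 = HEALTHY"))  # acarsdec-feeder1
```

The key returned here is what the later `line.find(decoder)` checks match against when assigning a status.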
def search_alerts(icao=None, tail=None, flight=None):
    result = None
    global alert_terms
    if (
        icao is not None
        or tail is not None
        or flight is not None
        or alert_terms is not None
    ):
        try:
            session = db_session()
            search_term = {
                "icao": icao,
                # "msg_text": alert_terms,
                "flight": flight,
                "tail": tail,
            }
            query_string = ""
            for key in search_term:
                if search_term[key] is not None and search_term[key] != "":
                    for term in search_term[key]:
                        if query_string == "":
                            query_string += f'{key}:"{term}"*'
                        else:
                            query_string += f' OR {key}:"{term}"*'

            if query_string != "":
                query_string = (
                    "SELECT * FROM messages WHERE id IN "
                    f"(SELECT rowid FROM messages_fts WHERE messages_fts MATCH '{query_string}')"
                )

            if alert_terms is not None:
                terms_string = """SELECT id, message_type, msg_time, station_id, toaddr, fromaddr,
                depa, dsta, eta, gtout, gtin, wloff, wlin, lat, lon, alt, msg_text, tail, flight,
                icao, freq, ack, mode, label, block_id, msgno, is_response, is_onground, error,
                libacars, level FROM messages_saved"""
            else:
                terms_string = ""

            if query_string != "" and terms_string != "":
                joiner = " UNION "
            else:
                joiner = ""

            if query_string != "" or terms_string != "":
                result = session.execute(
                    f"{query_string}{joiner}{terms_string} ORDER BY msg_time DESC LIMIT 50 OFFSET 0"
                )
            else:
                acarshub_logging.log("Skipping alert search", "database")
                return None

            processed_results = [dict(row) for row in result]
            session.close()
            if len(processed_results) == 0:
                return None
            return processed_results
        except Exception as e:
            acarshub_logging.acars_traceback(e, "database")
            return None
    else:
        return None
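The inner loop above assembles an FTS5 `MATCH` expression from whichever search fields are populated. A standalone sketch of just that string-building step (the field names mirror the `messages_fts` columns; the sample values are made up):

```python
def build_match_query(search_term: dict) -> str:
    """Build an FTS5 MATCH expression such as: icao:"A1B2C3"* OR tail:"N123AB"*

    Each key maps to a list of terms; None/empty fields are skipped, and every
    term is quoted and suffixed with * for prefix matching.
    """
    query_string = ""
    for key, terms in search_term.items():
        if terms is not None and terms != "":
            for term in terms:
                if query_string == "":
                    query_string += f'{key}:"{term}"*'
                else:
                    query_string += f' OR {key}:"{term}"*'
    return query_string


print(build_match_query({"icao": ["A1B2C3"], "flight": None, "tail": ["N123AB"]}))
# icao:"A1B2C3"* OR tail:"N123AB"*
```

The resulting string is what gets embedded in the `messages_fts MATCH '...'` subquery above.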
def create_db_safe_params(message_from_json):
    params = {
        "time": "",
        "station_id": "",
        "toaddr": "",
        "fromaddr": "",
        "depa": "",
        "dsta": "",
        "eta": "",
        "gtout": "",
        "gtin": "",
        "wloff": "",
        "wlin": "",
        "lat": "",
        "lon": "",
        "alt": "",
        "text": "",
        "tail": "",
        "flight": "",
        "icao": "",
        "freq": "",
        "ack": "",
        "mode": "",
        "label": "",
        "block_id": "",
        "msgno": "",
        "is_response": "",
        "is_onground": "",
        "error": 0,
        "libacars": "",
        "level": "",
    }

    # keys that are copied straight through to params unchanged
    passthrough_keys = {
        "station_id", "toaddr", "fromaddr", "depa", "dsta", "eta", "gtout",
        "gtin", "wloff", "wlin", "lat", "lon", "alt", "text", "tail", "flight",
        "icao", "ack", "mode", "label", "block_id", "msgno", "is_response",
        "is_onground", "error", "level",
    }

    for index, value in message_from_json.items():
        if index in passthrough_keys:
            params[index] = value
        elif index == "timestamp":
            params["time"] = value
        elif index == "data":
            params["text"] = value
        elif index == "freq":
            # normalize the frequency to seven characters, i.e. three decimal
            # places for MHz values (e.g. "136.95" becomes "136.950")
            params["freq"] = str(value).ljust(7, "0")
        elif index == "libacars":
            try:
                params["libacars"] = json.dumps(value)
            except Exception as e:
                acarshub_logging.acars_traceback(e, "database")
        elif index in ("channel", "end"):
            # skip these
            pass
        # FIXME: acarsdec now appears to support message reassembly?
        # https://github.com/TLeconte/acarsdec/commit/b2d0a4c27c6092a1c38943da48319a3406db74f2
        # do we need to do anything here for reassembled messages?
        elif index == "assstat":
            acarshub_logging.log(
                f"assstat key: {index}: {value}", "database", level=LOG_LEVEL["DEBUG"]
            )
            acarshub_logging.log(message_from_json, "database", level=LOG_LEVEL["DEBUG"])
        else:
            # we have a key that we aren't saving to the database; log it
            acarshub_logging.log(
                f"Unidentified key: {index}: {value}",
                "database",
                level=LOG_LEVEL["DEBUG"],
            )
            acarshub_logging.log(message_from_json, "database", level=LOG_LEVEL["DEBUG"])

    return params
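The `ljust` padding in the `freq` branch above is the same normalization that `normalize_freqs` applies retroactively during a DB upgrade: frequencies are stored as seven-character strings. A quick illustration of the padding behavior:

```python
def normalize_freq(value) -> str:
    # pad to seven characters so e.g. 136.95 MHz is stored as "136.950";
    # values that are already seven characters pass through unchanged
    return str(value).ljust(7, "0")


print(normalize_freq(136.95))     # 136.950
print(normalize_freq("130.025"))  # 130.025
```

Keeping every stored frequency at a fixed width means the `GROUP BY freq` queries and equality comparisons elsewhere never have to worry about "136.95" and "136.950" being treated as different channels.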
def update_keys(json_message):
    # Sanitize the message of any empty/None values.
    # This won't occur for live messages, but if the message originates from a DB
    # query it will contain all keys, even ones the original message had no value for.
    stale_keys = []
    for key in json_message:
        if not has_specified_key_not_none(json_message, key):
            stale_keys.append(key)

    for key in stale_keys:
        del json_message[key]

    # Now we process individual keys, if that key is present.

    # The database table name for the message text doesn't match up with the
    # typescript-decoder (which needs it to be "text"), so we rewrite the key.
    if has_specified_key(json_message, "msg_text"):
        json_message["text"] = json_message["msg_text"]
        del json_message["msg_text"]

    if has_specified_key(json_message, "time"):
        json_message["timestamp"] = json_message["time"]
        del json_message["time"]

    if has_specified_key(json_message, "libacars"):
        json_message["libacars"] = libacars_formatted(json_message["libacars"])

    if has_specified_key(json_message, "icao"):
        try:
            json_message["icao_hex"] = format(int(json_message["icao"]), "X")
        except Exception as e:
            acarshub_logging.log(
                f"Unable to convert icao to hex: {json_message['icao']}",
                "update_keys",
                LOG_LEVEL["WARNING"],
            )
            acarshub_logging.acars_traceback(e, "update_keys")

    if has_specified_key(json_message, "flight") and has_specified_key(
        json_message, "icao_hex"
    ):
        json_message["flight"], json_message["icao_flight"] = flight_finder(
            callsign=json_message["flight"], hex_code=json_message["icao_hex"]
        )
    elif has_specified_key(json_message, "flight"):
        json_message["flight"], json_message["icao_flight"] = flight_finder(
            callsign=json_message["flight"], url=False
        )
    elif has_specified_key(json_message, "icao_hex"):
        json_message["icao_url"] = flight_finder(hex_code=json_message["icao_hex"])

    if has_specified_key(json_message, "toaddr"):
        json_message["toaddr_hex"] = format(int(json_message["toaddr"]), "X")

        toaddr_icao, toaddr_name = acarshub_database.lookup_groundstation(
            json_message["toaddr_hex"]
        )

        if toaddr_icao is not None:
            json_message["toaddr_decoded"] = f"{toaddr_name} ({toaddr_icao})"

    if has_specified_key(json_message, "fromaddr"):
        json_message["fromaddr_hex"] = format(int(json_message["fromaddr"]), "X")

        fromaddr_icao, fromaddr_name = acarshub_database.lookup_groundstation(
            json_message["fromaddr_hex"]
        )

        if fromaddr_icao is not None:
            json_message["fromaddr_decoded"] = f"{fromaddr_name} ({fromaddr_icao})"

    if has_specified_key(json_message, "label"):
        label_type = acarshub_database.lookup_label(json_message["label"])

        if label_type is not None:
            json_message["label_type"] = label_type
        else:
            json_message["label_type"] = "Unknown Message Label"
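The `format(int(...), "X")` calls above convert a decimal address (as it arrives in the JSON) into the upper-case hexadecimal form used for the groundstation and airframe lookups. A minimal sketch of that conversion in isolation:

```python
def addr_to_hex(addr) -> str:
    """Decimal ICAO/ground-station address -> upper-case hex string (no 0x prefix).

    Accepts either an int or a decimal string, matching how the value may
    appear in an incoming JSON message.
    """
    return format(int(addr), "X")


print(addr_to_hex(11259375))  # ABCDEF
print(addr_to_hex("255"))     # FF
```

Note that the conversion raises `ValueError` for non-numeric input, which is why the `icao` branch above wraps it in a try/except and logs a warning rather than dropping the message.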
def add_indexes(cur):
    global upgraded
    indexes = [i[1] for i in cur.execute("PRAGMA index_list(messages)")]

    # (index name, indexed column, human-readable name used in the log message)
    wanted_indexes = [
        ("ix_messages_msg_text", "msg_text", "text"),
        ("ix_messages_icao", "icao", "icao"),
        ("ix_messages_flight", "flight", "flight"),
        ("ix_messages_tail", "tail", "tail"),
        ("ix_messages_depa", "depa", "depa"),
        ("ix_messages_dsta", "dsta", "dsta"),
        ("ix_messages_msgno", "msgno", "msgno"),
        ("ix_messages_freq", "freq", "freq"),
        ("ix_messages_label", "label", "label"),
        ("ix_messages_msgtime", "msg_time", "msg time"),
    ]

    for index_name, column, friendly_name in wanted_indexes:
        if index_name not in indexes:
            acarshub_logging.log(
                f"Adding {friendly_name} index", "db_upgrade", level=LOG_LEVEL["INFO"]
            )
            upgraded = True
            cur.execute(f'CREATE INDEX "{index_name}" ON "messages" ("{column}" DESC)')
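The `PRAGMA index_list` check above is what makes the upgrade idempotent: an index is only created when it is not already present, so re-running the upgrade is harmless. A self-contained sketch of the pattern against an in-memory database (the table is reduced to a single column for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE messages (msg_time INTEGER)")


def existing_indexes(cur):
    # column 1 of each PRAGMA index_list row is the index name
    return [row[1] for row in cur.execute("PRAGMA index_list(messages)")]


# create the index only if it is missing; a second run is a no-op
if "ix_messages_msgtime" not in existing_indexes(cur):
    cur.execute('CREATE INDEX "ix_messages_msgtime" ON "messages" ("msg_time" DESC)')

print(existing_indexes(cur))  # ['ix_messages_msgtime']
```

The same check-then-create approach could also be written with `CREATE INDEX IF NOT EXISTS`, but querying `PRAGMA index_list` first lets the upgrade code log which indexes it actually added and set the `upgraded` flag accordingly.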