async def query(
    collection: str,
    fields: Optional[List[str]] = None,
    filt: Optional[Dict[str, Any]] = None,
    sort: Optional[Union[str, List[str]]] = None,
    limit: Optional[int] = None,
) -> Iterable[Record]:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug(
            'querying %s (%s) where %s (sort=%s, limit=%s)',
            collection,
            json_utils.dumps(fields) if fields else 'all fields',
            json_utils.dumps(filt, allow_extended_types=True),
            json_utils.dumps(sort),
            json_utils.dumps(limit),
        )

    filt = filt or {}
    sort = sort or []
    if isinstance(sort, str):
        sort = [sort]

    # Transform '-field' into (field, reverse)
    sort = [(s[1:], True) if s.startswith('-') else (s, False) for s in sort]

    return await _get_driver().query(collection, fields, filt, sort, limit)
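The `'-field'` convention above turns a leading minus sign into a descending sort. The normalization can be sketched in isolation; `normalize_sort` is an illustrative name, not part of the module:

```python
from typing import List, Tuple, Union

def normalize_sort(sort: Union[str, List[str]]) -> List[Tuple[str, bool]]:
    # Accept a single field name or a list; a leading '-' means descending
    if isinstance(sort, str):
        sort = [sort]
    return [(s[1:], True) if s.startswith('-') else (s, False) for s in sort]

# '-ts' sorts by 'ts' descending; plain 'id' sorts ascending
```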
async def _handle_api_call_exception(self, func: Callable, kwargs: dict, error: Exception) -> None:
    kwargs = dict(kwargs)
    params = kwargs.pop('params', None)
    args = json_utils.dumps(kwargs)
    body = json_utils.dumps(params) if params else '{}'

    if isinstance(error, core_responses.HTTPError):
        error = core_api.APIError.from_http_error(error)

    if isinstance(error, core_api.APIError):
        logger.error('api call %s failed: %s (args=%s, body=%s)', func.__name__, error, args, body)
        self.set_status(error.status)
        if not self._finished:  # Avoid finishing an already finished request
            await self.finish_json(error.to_json())
    elif isinstance(error, core_api.APIAccepted):
        self.set_status(202)
        if not self._finished and error.response is not None:  # Avoid finishing an already finished request
            await self.finish_json(error.response)
    elif isinstance(error, StreamClosedError) and func.__name__ == 'get_listen':
        logger.debug('api call get_listen could not complete: stream closed')
    else:
        logger.error('api call %s failed: %s (args=%s, body=%s)', func.__name__, error, args, body, exc_info=True)
        self.set_status(500)
        if not self._finished:  # Avoid finishing an already finished request
            await self.finish_json({'error': str(error)})
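The original computed the body with the `params and dumps(params) or '{}'` idiom. That idiom only works when the middle operand can never be falsy; a conditional expression states the intent directly and is safe unconditionally. A minimal sketch, with the stdlib `json` standing in for `json_utils`:

```python
import json

params = None

# and/or chain: falls through to the default when params is falsy
body_old = params and json.dumps(params) or '{}'

# conditional expression: same result here, but correct even if the
# middle operand could itself be a falsy value
body_new = json.dumps(params) if params else '{}'

assert body_old == body_new == '{}'
```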
async def load_from_data(self, data: GenericJSONDict) -> None:
    attrs_start = ['enabled']  # These will be loaded first, in this order
    attrs_end = ['expression']  # These will be loaded last, in this order

    attr_items = data.items()
    attr_items = [a for a in attr_items if (a[0] not in attrs_start) and (a[0] not in attrs_end)]

    attr_items_start = []
    for n in attrs_start:
        v = data.get(n)
        if v is not None:
            attr_items_start.append((n, v))

    # Sort the rest of the attributes alphabetically
    attr_items.sort(key=lambda i: i[0])

    attr_items_end = []
    for n in attrs_end:
        v = data.get(n)
        if v is not None:
            attr_items_end.append((n, v))

    attr_items = attr_items_start + attr_items + attr_items_end

    for name, value in attr_items:
        if name in ('id', 'value'):
            continue  # Value is also among the persisted fields

        try:
            self.debug('loading %s = %s', name, json_utils.dumps(value))
            await self.set_attr(name, value)
        except Exception as e:
            self.error('failed to set attribute %s = %s: %s', name, json_utils.dumps(value), e)

    # Value
    if await self.is_persisted() and data.get('value') is not None:
        self._value = data['value']
        self.debug('loaded value = %s', json_utils.dumps(self._value))

        if await self.is_writable():
            # Write the just-loaded value to the port
            value = self._value
            if self._transform_write:
                value = await self.adapt_value_type(self._transform_write.eval())

            await self.write_value(value)
    elif self.is_enabled():
        try:
            value = await self.read_transformed_value()
            if value is not None:
                self._value = value
                self.debug('read value = %s', json_utils.dumps(self._value))
        except Exception as e:
            self.error('failed to read value: %s', e, exc_info=True)
def query(
    collection: str,
    fields: Optional[List[str]] = None,
    filt: Optional[Dict[str, Any]] = None,
    limit: Optional[int] = None,
) -> Iterable[Record]:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug(
            'querying %s (%s) where %s',
            collection,
            json_utils.dumps(fields) if fields else 'all fields',
            json_utils.dumps(filt),
        )

    return _get_driver().query(collection, fields, filt or {}, limit)
def update(collection: str, record_part: Record, filt: Optional[Dict[str, Any]] = None) -> int:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug('updating %s where %s with %s', collection, json_utils.dumps(filt or {}), json_utils.dumps(record_part))

    count = _get_driver().update(collection, record_part, filt or {})
    logger.debug('modified %s records in %s', count, collection)

    return count
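The `getEffectiveLevel() <= logging.DEBUG` guard that recurs in these functions exists so that the JSON serialization of the arguments is never paid for when debug logging is off. A self-contained sketch of the effect, with a counter standing in for `json_utils.dumps`:

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger('persist')

calls = []

def expensive_dumps(value):
    # Stand-in for json_utils.dumps(); records each invocation
    calls.append(value)
    return json.dumps(value)

record = {'val': 42}

# At INFO level the guard is False, so serialization never runs at all;
# without the guard, expensive_dumps would be evaluated before logger.debug
# could decide to drop the message
if logger.getEffectiveLevel() <= logging.DEBUG:
    logger.debug('updating with %s', expensive_dumps(record))

assert calls == []
```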
async def insert(collection: str, record: Record) -> Id:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug('inserting %s into %s', json_utils.dumps(record, allow_extended_types=True), collection)

    return await _get_driver().insert(collection, record)
async def set_value(name: str, value: Any) -> None:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug('setting %s to %s', name, json_utils.dumps(value, allow_extended_types=True))

    driver = _get_driver()
    record = {'value': value}

    records = list(await driver.query(name, fields=['id'], filt={}, sort=[], limit=2))
    if len(records) > 1:
        logger.warning('more than one record found in single-value collection %s', name)
        id_ = records[0]['id']
    elif len(records) > 0:
        id_ = records[0]['id']
    else:
        id_ = None

    if id_ is None:
        await driver.insert(name, record)
    else:
        await driver.replace(name, id_, record)
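Querying with `limit=2` is just enough to distinguish "exactly one record" from "more than one" without fetching the whole collection. The insert-vs-replace decision can be sketched standalone; `pick_record_id` is a hypothetical name, not part of the module:

```python
from typing import Any, Dict, List, Optional

def pick_record_id(records: List[Dict[str, Any]]) -> Optional[Any]:
    # records comes from a query with limit=2, so it has 0, 1 or 2 entries
    if not records:
        return None  # nothing stored yet: caller should insert
    if len(records) > 1:
        print('warning: more than one record in single-value collection')
    return records[0]['id']  # caller should replace this record
```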
async def write_transformed_value(self, value: PortValue, reason: str) -> None:
    value_str = json_utils.dumps(value)

    if self._transform_write:
        # Temporarily set the port value to the new value, so that the write transform expression takes the new
        # value into consideration when evaluating the result
        prev_value = self._last_read_value
        self._last_read_value = value
        value = await self.adapt_value_type(await self._transform_write.eval(self._make_expression_context()))
        self._last_read_value = prev_value
        value_str = f'{value_str} ({json_utils.dumps(value)} after write transform)'

    try:
        await self._write_value_queued(value, reason)
        self.debug('wrote value %s (reason=%s)', value_str, reason)
    except Exception:
        self.error('failed to write value %s (reason=%s)', value_str, reason, exc_info=True)
        raise
async def _eval_and_write(self, context: Dict[str, Any]) -> None:
    self.debug('evaluating expression')
    expression = self.get_expression()

    try:
        value = await expression.eval(context)
    except core_expressions.ExpressionEvalError:
        return
    except Exception as e:
        self.error('failed to evaluate expression "%s": %s', expression, e)
        return

    value = await self.adapt_value_type(value)
    if value is None:
        return

    if value != self.get_last_read_value():  # Value changed after evaluation
        self.debug('expression "%s" evaluated to %s', expression, json_utils.dumps(value))

        try:
            await self.write_transformed_value(value, reason=CHANGE_REASON_EXPRESSION)
        except Exception as e:
            self.error('failed to write value: %s', e)
async def set_attr(self, name: str, value: Attribute) -> None:
    old_value = await self.get_attr(name)
    if old_value is None:
        return  # Refuse to set an unsupported attribute

    method = getattr(self, 'attr_set_' + name, None)
    if method:
        try:
            await method(value)
        except Exception:
            self.error('failed to set attribute %s = %s', name, json_utils.dumps(value), exc_info=True)
            raise
    elif hasattr(self, '_' + name):
        setattr(self, '_' + name, value)

    if self.is_loaded():
        await main.update()

    # New attributes might have been added or removed after setting an attribute; therefore new definitions might
    # have appeared or disappeared
    self.invalidate_attrdefs()

    self._attrs_cache[name] = value

    if old_value != value:
        await self.handle_attr_change(name, value)

    if not self.is_loaded():
        return

    # Skip an IO loop iteration, allowing setting multiple attributes before triggering a port-update
    await asyncio.sleep(0)
    self.trigger_update()
async def set_attr(attr_name: str, attr_value: Attribute) -> None:
    core_api.logger.debug('setting attribute %s = %s on %s', attr_name, json_utils.dumps(attr_value), port)

    try:
        await port.set_attr(attr_name, attr_value)
    except Exception as e1:
        errors_by_name[attr_name] = e1
def finish_json(self, data: Any) -> asyncio.Future:
    self._response_body_json = data
    data = json_utils.dumps(data)
    data += '\n'

    self.set_header('Content-Type', 'application/json; charset=utf-8')

    return self.finish(data)
def parse_labels(self) -> None:
    for area in self._paradox.storage.data['partition'].values():
        self.debug('detected area id=%s, label=%s', area['id'], json_utils.dumps(area['label']))

    for zone in self._paradox.storage.data['zone'].values():
        self.debug('detected zone id=%s, label=%s', zone['id'], json_utils.dumps(zone['label']))

    for output in self._paradox.storage.data['pgm'].values():
        self.debug('detected output id=%s, label=%s', output['id'], json_utils.dumps(output['label']))

    for type_, entries in self._paradox.storage.data.items():
        for entry in entries.values():
            if 'label' in entry:
                self._properties.setdefault(type_, {}).setdefault(entry['id'], {})['label'] = entry['label']
async def request(
    self,
    method: str,
    path: str,
    body: Any = None,
    admin_password: Optional[str] = None,
    no_log: bool = False,
) -> Any:
    http_client = httpclient.AsyncHTTPClient()

    if admin_password:
        password_hash = hashlib.sha256(admin_password.encode()).hexdigest()
    else:
        password_hash = core_api_auth.EMPTY_PASSWORD_HASH

    headers = {
        'Content-Type': json_utils.JSON_CONTENT_TYPE,
        'Authorization': core_api_auth.make_auth_header(
            core_api_auth.ORIGIN_CONSUMER, username='******', password_hash=password_hash
        ),
    }

    # TODO: this only tries standard port 80; ideally it should try connecting on a list of known ports
    timeout = settings.slaves.discover.request_timeout
    url = f'http://{self.ip_address}:80{path}'

    body_str = None
    if body is not None:
        body_str = json_utils.dumps(body)

    request = httpclient.HTTPRequest(
        url=url,
        method=method,
        headers=headers,
        body=body_str,
        connect_timeout=timeout,
        request_timeout=timeout,
    )

    if not no_log:
        self.debug('requesting %s %s', method, path)

    try:
        response = await http_client.fetch(request, raise_error=True)
    except Exception as e:
        if not no_log:
            self.error('request %s %s failed: %s', method, path, e, exc_info=True)
        raise

    if response.body:
        return json_utils.loads(response.body)
def remove(collection: str, filt: Optional[Dict[str, Any]] = None) -> int:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug('removing from %s where %s', collection, json_utils.dumps(filt or {}))

    count = _get_driver().remove(collection, filt or {})
    logger.debug('removed %s records from %s', count, collection)

    return count
def _request(self, request: WebhooksRequest) -> None:
    if not self._enabled:
        return

    url = self.get_url()
    body = json_utils.dumps(request.body)

    def on_response(response: HTTPResponse) -> None:
        try:
            core_responses.parse(response)
        except core_responses.Error as e:
            logger.error('%s: call failed: %s', self, e)

            # Retry mechanism
            if not self._retries:
                return self._check_pending()

            if request.retries <= self._retries:
                request.retries += 1
                logger.debug('%s: resending request (retry %s/%s)', self, request.retries, self._retries)
                self._request(request)
            else:
                self._check_pending()
        else:
            logger.debug('%s: call succeeded', self)
            self._check_pending()

    http_client = AsyncHTTPClient()

    # TODO use webhooks password
    headers = {
        'Content-Type': json_utils.JSON_CONTENT_TYPE,
        'Authorization': core_api_auth.make_auth_header(
            core_api_auth.ORIGIN_DEVICE, username=None, password_hash=core_device_attrs.normal_password_hash
        ),
    }

    # Use a distinct name here: rebinding `request` would also change what the
    # on_response closure sees when it consults request.retries on retry
    http_request = HTTPRequest(
        url,
        'POST',
        headers=headers,
        body=body,
        connect_timeout=self._timeout,
        request_timeout=self._timeout,
        follow_redirects=True,
    )

    logger.debug('%s: calling', self)

    http_client.fetch(http_request, callback=on_response)  # TODO implement me using await
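The retry mechanism above keeps a per-request counter and resends until it exceeds a configured maximum. The shape of that loop can be sketched standalone; all names here are illustrative, and every call is forced to fail so the retry path is exercised:

```python
from dataclasses import dataclass

@dataclass
class FakeRequest:
    retries: int = 0

def send(req: FakeRequest, max_retries: int, attempts: list) -> None:
    attempts.append(req.retries)
    failed = True  # pretend every call fails, to exercise the retry path
    if failed and req.retries < max_retries:
        req.retries += 1
        send(req, max_retries, attempts)  # resend, like self._request(request)

attempts = []
send(FakeRequest(), max_retries=3, attempts=attempts)
assert len(attempts) == 4  # the initial call plus 3 retries
```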
def _save(self, data: UnindexedData) -> None:
    if self._use_backup and os.path.exists(self._file_path):
        backup_file_path = self._get_backup_file_path()
        logger.debug('backing up %s to %s', self._file_path, backup_file_path)
        os.rename(self._file_path, backup_file_path)

    logger.debug('saving to %s', self._file_path)
    with open(self._file_path, 'wb') as f:
        data_str = json_utils.dumps(data, allow_extended_types=True, indent=4 if self._pretty_format else None)
        f.write(data_str.encode())
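The rename-before-write pattern keeps the previous file as a backup, so a crash mid-write still leaves a recoverable copy on disk. A minimal standalone sketch of the same idea using only the stdlib (the `.bak` suffix is an assumption; the module computes its backup path via `_get_backup_file_path`):

```python
import json
import os
import tempfile

def save_with_backup(file_path: str, data: dict) -> None:
    # Keep the previous version around before overwriting
    if os.path.exists(file_path):
        os.rename(file_path, file_path + '.bak')
    with open(file_path, 'w') as f:
        json.dump(data, f, indent=4)

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, 'data.json')
    save_with_backup(path, {'v': 1})
    save_with_backup(path, {'v': 2})
    with open(path) as f:
        current = json.load(f)
    with open(path + '.bak') as f:
        backup = json.load(f)

assert current == {'v': 2}
assert backup == {'v': 1}
```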
async def ensure_index(collection: str, index: Union[str, List[str]]) -> None:
    if isinstance(index, str):
        index = [index]

    # Transform '-field' into (field, reverse)
    index = [(i[1:], True) if i.startswith('-') else (i, False) for i in index]

    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug('ensuring index %s in %s', json_utils.dumps(index), collection)

    await _get_driver().ensure_index(collection, index)
async def write_value(self, value: bool) -> None:
    self._val_file.seek(0)

    value_str = '1' if value else '0'

    self.debug('writing %s to "%s"', json_utils.dumps(value_str), self._val_file.name)
    self._val_file.write(value_str)
    self._val_file.flush()
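Keeping the value file open across writes and doing seek(0)/write/flush on each change avoids reopening a sysfs-style file for every update. A standalone sketch against a regular temp file:

```python
import os
import tempfile

# Simulate a sysfs-style value file kept open across writes
f = tempfile.NamedTemporaryFile('w+', delete=False)

def write_bool(value: bool) -> None:
    f.seek(0)                       # always rewrite from the start
    f.write('1' if value else '0')
    f.flush()                       # push the change out immediately

write_bool(True)
write_bool(False)
f.close()

with open(f.name) as check:
    content = check.read()
os.unlink(f.name)

assert content == '0'  # last write wins; single character, no reopen per write
```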
async def save_sample(port: core_ports.BasePort, timestamp: int) -> None:
    value = port.get_last_read_value()
    if value is None:
        logger.debug('skipping null sample of %s (timestamp = %s)', port, timestamp)
        return

    logger.debug('saving sample of %s (value = %s, timestamp = %s)', port, json_utils.dumps(value), timestamp)

    record = {'pid': port.get_id(), 'val': value, 'ts': timestamp}
    await persist.insert(PERSIST_COLLECTION, record)
def handle_paradox_property_change(self, change: Any) -> None:
    from .paradoxport import ParadoxPort

    info = self._paradox.storage.data[change.type].get(change.key)
    if info and ('id' in info):
        id_ = info['id']
        self.debug(
            'property change: %s[%s].%s: %s -> %s',
            change.type,
            id_,
            change.property,
            json_utils.dumps(change.old_value, allow_extended_types=True),
            json_utils.dumps(change.new_value, allow_extended_types=True),
        )

        obj = self._properties.setdefault(change.type, {}).setdefault(id_, {})
        obj[change.property] = change.new_value
        obj['label'] = info['label']
    else:
        id_ = None
        self.debug(
            'property change: %s.%s: %s -> %s',
            change.type,
            change.property,
            json_utils.dumps(change.old_value, allow_extended_types=True),
            json_utils.dumps(change.new_value, allow_extended_types=True),
        )

        self._properties.setdefault(change.type, {})[change.property] = change.new_value

    for port in self.get_ports():
        pai_port = cast(ParadoxPort, port)
        try:
            pai_port.on_property_change(change.type, id_, change.property, change.old_value, change.new_value)
        except Exception as e:
            self.error('property change handler execution failed: %s', e, exc_info=True)
async def _session_loop(self) -> None:
    api_response_dict = None
    api_request_dict = {}  # TODO this does not work
    sleep_interval = 0

    while True:
        if not self._enabled:
            break

        if sleep_interval:
            await asyncio.sleep(sleep_interval)
            sleep_interval = 0

        try:
            api_request_dict = await self._wait(api_request_dict, api_response_dict)  # TODO properly implement me
        except UnauthorizedConsumerRequestError:
            api_response_dict = {'status': 401, 'body': json_utils.dumps({'error': 'authentication required'})}
            continue
        except Exception as e:
            logger.error('wait failed: %s, retrying in %s seconds', e, settings.reverse.retry_interval, exc_info=True)
            sleep_interval = settings.reverse.retry_interval
            continue

        # The reverse mechanism has been disabled while waiting
        if not self._enabled:
            break

        try:
            api_response_dict = await self._process_api_request(api_request_dict)
        except Exception as e:
            logger.error('reverse API call failed: %s', e, exc_info=True)
            sleep_interval = settings.reverse.retry_interval
            api_response_dict = None
            continue
def load() -> None:
    data = persist.get_value('device', {})

    # Attributes
    persisted_attrs = []
    for name, value in device_attrs.ATTRDEFS.items():
        persisted = value.get('persisted', value.get('modifiable'))
        if callable(persisted):
            persisted = persisted()
        if persisted:
            persisted_attrs.append(name)

    for name in persisted_attrs:
        if name.endswith('password') and hasattr(device_attrs, name + '_hash'):
            name += '_hash'

        if name not in data:
            continue

        value = data[name]

        # A few attributes may carry sensitive information, so treat them separately and do not log their values
        if name.count('password'):
            logger.debug('loaded %s', name)
        elif name == 'network_wifi':
            logger.debug('loaded %s = [hidden]', name)
        else:
            logger.debug('loaded %s = %s', name, json_utils.dumps(value))

        setattr(device_attrs, name, value)

    # Device name
    if settings.device_name.get_cmd:
        result = run_get_cmd(settings.device_name.get_cmd, cmd_name='device name', required_fields=['name'])
        device_attrs.name = result['name']

    # Hash empty passwords
    if not device_attrs.admin_password_hash:
        device_attrs.admin_password_hash = device_attrs.EMPTY_PASSWORD_HASH
    if not device_attrs.normal_password_hash:
        device_attrs.normal_password_hash = device_attrs.EMPTY_PASSWORD_HASH
    if not device_attrs.viewonly_password_hash:
        device_attrs.viewonly_password_hash = device_attrs.EMPTY_PASSWORD_HASH
def replace(collection: str, _id: Id, record: Record, upsert: bool = True) -> int:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug('replacing record with id %s with %s in %s', _id, json_utils.dumps(record), collection)

    record = dict(record, id=_id)  # Make sure the new record contains the id field
    replaced = _get_driver().replace(collection, _id, record, upsert)
    if replaced:
        logger.debug('replaced record with id %s in %s', _id, collection)

    return replaced
async def replace(collection: str, id_: Id, record: Record) -> bool:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug(
            'replacing record with id %s with %s in %s',
            id_,
            json_utils.dumps(record, allow_extended_types=True),
            collection,
        )

    record = dict(record, id=id_)  # Make sure the new record contains the id field
    replaced = await _get_driver().replace(collection, id_, record)
    if replaced:
        logger.debug('replaced record with id %s in %s', id_, collection)
        return True
    else:
        await _get_driver().insert(collection, record)
        logger.debug('inserted record with id %s in %s', id_, collection)
        return False
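The replace-then-insert fallback gives upsert semantics on top of a driver whose `replace` only succeeds for existing ids, with the returned bool reporting which path was taken. A dict-backed standalone sketch (all names illustrative):

```python
store = {}  # id -> record, standing in for the driver's collection

def driver_replace(id_, record) -> bool:
    # Only succeeds when the id already exists, like the driver call
    if id_ in store:
        store[id_] = record
        return True
    return False

def upsert(id_, record) -> bool:
    # Returns True when an existing record was replaced, False when inserted
    if driver_replace(id_, record):
        return True
    store[id_] = record
    return False

assert upsert('p1', {'value': 1}) is False  # first write inserts
assert upsert('p1', {'value': 2}) is True   # second write replaces
```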
async def _process_api_request(self, request_dict: GenericJSONDict) -> GenericJSONDict:
    from qtoggleserver.web import server as web_server

    if self._request_is_black_listed(request_dict):
        return {'status': 404, 'body': json_utils.dumps({'error': 'no such function'})}

    request = httputil.HTTPServerRequest(
        method=request_dict['method'], uri=request_dict['path'], body=request_dict['body']
    )

    dispatcher = web_server.get_application().find_handler(request)
    await dispatcher.execute()

    status = dispatcher.handler.get_status()
    body = dispatcher.handler.get_response_body()

    return {'status': status, 'body': body}
def set_attrs(attrs: Attributes, ignore_extra: bool = False) -> bool:
    core_device_attrs = sys.modules[__name__]

    reboot_required = False
    attrdefs = get_attrdefs()

    wifi_attrs = {}
    ip_attrs = {}

    for n, value in attrs.items():
        # A few attributes may carry sensitive information, so treat them separately and do not log their values
        if n.count('password') or n == 'wifi_key':
            logger.debug('setting device attribute %s', n)
        else:
            logger.debug('setting device attribute %s = %s', n, json_utils.dumps(value))

        try:
            attrdef = attrdefs[n]
        except KeyError:
            if ignore_extra:
                continue
            else:
                raise

        if not attrdef.get('modifiable'):
            if not ignore_extra:
                raise DeviceAttributeError('attribute-not-modifiable', n)

        # Treat passwords separately, as they are not persisted as given, but hashed first
        if n.endswith('_password') and hasattr(core_device_attrs, f'{n}_hash'):
            # Call password set command, if available
            if settings.core.passwords.set_cmd:
                run_set_cmd(
                    settings.core.passwords.set_cmd,
                    cmd_name='password',
                    log_values=False,
                    username=n[:-9],
                    password=value,
                )

            value = hashlib.sha256(value.encode()).hexdigest()
            n += '_hash'
            setattr(core_device_attrs, n, value)
            continue
        elif n.endswith('_password_hash') and hasattr(core_device_attrs, n):
            # FIXME: Password set command cannot be called with hash and we don't have clear-text password here.
            # A solution would be to use sha256 crypt algorithm w/o salt for Unix password (watch for the special
            # alphabet and for number of rounds defaulting to 5000)
            setattr(core_device_attrs, n, value)
            continue

        persisted = attrdef.get('persisted', attrdef.get('modifiable'))
        if persisted:
            setattr(core_device_attrs, n, value)

        if n == 'name' and settings.core.device_name.set_cmd:
            run_set_cmd(settings.core.device_name.set_cmd, cmd_name='device name', name=value)
        elif n == 'date' and system.date.has_set_date_support():
            date = datetime.datetime.utcfromtimestamp(value)
            system.date.set_date(date)
        elif n == 'timezone' and system.date.has_timezone_support():
            system.date.set_timezone(value)
        elif n in ('wifi_ssid', 'wifi_key', 'wifi_bssid') and system.net.has_wifi_support():
            k = n[5:]
            k = {'key': 'psk'}.get(k, k)
            wifi_attrs[k] = value
        elif n in ('ip_address', 'ip_netmask', 'ip_gateway', 'ip_dns') and system.net.has_ip_support():
            k = n[3:]
            ip_attrs[k] = value

    if wifi_attrs:
        wifi_config = system.net.get_wifi_config()
        for k, v in wifi_attrs.items():
            wifi_config[k] = v

        wifi_config = {k: v for k, v in wifi_config.items() if not k.endswith('_current')}
        system.net.set_wifi_config(**wifi_config)
        reboot_required = True

    if ip_attrs:
        ip_config = system.net.get_ip_config()
        for k, v in ip_attrs.items():
            ip_config[k] = v

        ip_config = {k: v for k, v in ip_config.items() if not k.endswith('_current')}
        ip_config['netmask'] = str(ip_config['netmask'])
        system.net.set_ip_config(**ip_config)
        reboot_required = True

    return reboot_required
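Passwords are stored as SHA-256 hex digests rather than in clear text. The hashing step, including the empty-password case, in isolation (the `EMPTY_PASSWORD_HASH` association is an assumption based on how empty passwords are handled elsewhere in this code):

```python
import hashlib

def hash_password(password: str) -> str:
    # Same transformation applied before persisting a *_password attribute
    return hashlib.sha256(password.encode()).hexdigest()

# The well-known SHA-256 digest of the empty string, which is what a
# constant like EMPTY_PASSWORD_HASH would typically hold
assert hash_password('') == (
    'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855'
)
assert len(hash_password('hunter2')) == 64  # hex digest is always 64 chars
```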
def set_value(name: str, value: Any) -> None:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug('setting %s to %s', name, json_utils.dumps(value))

    record = {'value': value}
    _get_driver().replace(name, '', record, upsert=True)
def _value_to_db(value: Any) -> str:
    return json_utils.dumps(value, allow_extended_types=True)
def insert(collection: str, record: Record) -> Id:
    if logger.getEffectiveLevel() <= logging.DEBUG:
        logger.debug('inserting %s into %s', json_utils.dumps(record), collection)

    return _get_driver().insert(collection, record)