def process_message(
    self, message: Mapping[str, Any], metadata: KafkaMessageMetadata
) -> Optional[ProcessedMessage]:
    # Some old relays accidentally emit rows without a release.
    if message["release"] is None:
        return None

    if message["duration"] is None:
        duration = None
    else:
        duration = _collapse_uint32(int(message["duration"] * 1000))

    # Since duration is not nullable, the max duration means "no duration".
    if duration is None:
        duration = MAX_UINT32

    errors = _collapse_uint16(message["errors"]) or 0
    quantity = _collapse_uint32(message.get("quantity")) or 1

    # If a session ends in crashed or abnormal we want to make sure it
    # counts as errored too, so that the numbers of healthy and errored
    # sessions come out correctly.
    if message["status"] in ("crashed", "abnormal"):
        errors = max(errors, 1)

    received = _ensure_valid_date(datetime.utcfromtimestamp(message["received"]))
    started = _ensure_valid_date(datetime.utcfromtimestamp(message["started"]))

    if started is None:
        metrics.increment("empty_started_date")
    if received is None:
        metrics.increment("empty_received_date")

    processed = {
        "session_id": str(uuid.UUID(message["session_id"])),
        "distinct_id": str(uuid.UUID(message.get("distinct_id") or NIL_UUID)),
        "quantity": quantity,
        "seq": message["seq"],
        "org_id": message["org_id"],
        "project_id": message["project_id"],
        "retention_days": message["retention_days"],
        "duration": duration,
        "status": STATUS_MAPPING[message["status"]],
        "errors": errors,
        "received": received if received is not None else datetime.now(),
        "started": started if started is not None else datetime.now(),
        "release": message["release"],
        "environment": message.get("environment") or "",
    }
    return InsertBatch([processed], None)
def extract_common(
    self,
    output: MutableMapping[str, Any],
    event: InsertEvent,
    metadata: KafkaMessageMetadata,
) -> None:
    # Properties we get from the top level of the message payload.
    output["platform"] = _unicodify(event["platform"])

    # Properties we get from the "data" dict, which is the actual event body.
    data = event.get("data", {})
    received = _collapse_uint32(int(data["received"]))
    output["received"] = (
        datetime.utcfromtimestamp(received) if received is not None else None
    )
    output["version"] = _unicodify(data.get("version", None))
    output["location"] = _unicodify(data.get("location", None))

    module_names = []
    module_versions = []
    modules = data.get("modules", {})
    if isinstance(modules, dict):
        for name, version in modules.items():
            module_names.append(_unicodify(name))
            # Being extra careful about a stray (incorrect by spec) `null`
            # value blowing up the write.
            module_versions.append(_unicodify(version) or "")

    output["modules.name"] = module_names
    output["modules.version"] = module_versions
def process_message(self, message, metadata=None) -> Optional[ProcessedMessage]:
    # Some old relays accidentally emit rows without a release.
    if message["release"] is None:
        return None

    if message["duration"] is None:
        duration = None
    else:
        duration = _collapse_uint32(int(message["duration"] * 1000))

    # Since duration is not nullable, the max duration means "no duration".
    if duration is None:
        duration = MAX_UINT32

    processed = {
        "session_id": str(uuid.UUID(message["session_id"])),
        "distinct_id": str(uuid.UUID(message.get("distinct_id") or NIL_UUID)),
        "seq": message["seq"],
        "org_id": message["org_id"],
        "project_id": message["project_id"],
        "retention_days": message["retention_days"],
        "duration": duration,
        "status": STATUS_MAPPING[message["status"]],
        "errors": _collapse_uint16(message["errors"]) or 0,
        "received": _ensure_valid_date(
            datetime.utcfromtimestamp(message["received"])
        ),
        "started": _ensure_valid_date(
            datetime.utcfromtimestamp(message["started"])
        ),
        "release": message["release"],
        "environment": message.get("environment") or "",
    }
    return ProcessedMessage(action=ProcessorAction.INSERT, data=[processed])
def extract_common(self, output, message, data):
    # Properties we get from the top level of the message payload.
    output['platform'] = _unicodify(message['platform'])
    output['primary_hash'] = _hashify(message['primary_hash'])

    # Properties we get from the "data" dict, which is the actual event body.
    received = _collapse_uint32(int(data['received']))
    output['received'] = (
        datetime.utcfromtimestamp(received) if received is not None else None
    )

    output['culprit'] = _unicodify(data.get('culprit', None))
    output['type'] = _unicodify(data.get('type', None))
    output['version'] = _unicodify(data.get('version', None))
    output['title'] = _unicodify(data.get('title', None))
    output['location'] = _unicodify(data.get('location', None))

    # The following concerns the change to message/search_message.
    # There are two scenarios:
    #
    # Pre-rename:
    # - Payload contains:
    #       "message": "a long search message"
    # - Does NOT contain a `search_message` property
    # - "message" value saved in the `message` column
    # - `search_message` column nonexistent or Null
    #
    # Post-rename:
    # - Payload contains:
    #       "search_message": "a long search message"
    # - Optionally the payload's "data" dict (event body) contains:
    #       "message": "short message"
    # - "search_message" value stored in the `search_message` column
    # - "message" value stored in the `message` column
    output['search_message'] = _unicodify(message.get('search_message', None))
    if output['search_message'] is None:
        # Pre-rename scenario: we expect to find "message" at the top level.
        output['message'] = _unicodify(message['message'])
    else:
        # Post-rename scenario: check whether the optional "message" is
        # present in the event body.
        output['message'] = _unicodify(data.get('message', None))

    module_names = []
    module_versions = []
    modules = data.get('modules', {})
    if isinstance(modules, dict):
        for name, version in modules.items():
            module_names.append(_unicodify(name))
            # Being extra careful about a stray (incorrect by spec) `null`
            # value blowing up the write.
            module_versions.append(_unicodify(version) or '')

    output['modules.name'] = module_names
    output['modules.version'] = module_versions
def extract_stacktraces(self, output, stacks):
    stack_types = []
    stack_values = []
    stack_mechanism_types = []
    stack_mechanism_handled = []

    frame_abs_paths = []
    frame_filenames = []
    frame_packages = []
    frame_modules = []
    frame_functions = []
    frame_in_app = []
    frame_colnos = []
    frame_linenos = []
    frame_stack_levels = []

    stack_level = 0
    for stack in stacks:
        if stack is None:
            continue

        stack_types.append(_unicodify(stack.get('type', None)))
        stack_values.append(_unicodify(stack.get('value', None)))

        mechanism = stack.get('mechanism', None) or {}
        stack_mechanism_types.append(_unicodify(mechanism.get('type', None)))
        stack_mechanism_handled.append(_boolify(mechanism.get('handled', None)))

        frames = (stack.get('stacktrace', None) or {}).get('frames', None) or []
        for frame in frames:
            if frame is None:
                continue

            frame_abs_paths.append(_unicodify(frame.get('abs_path', None)))
            frame_filenames.append(_unicodify(frame.get('filename', None)))
            frame_packages.append(_unicodify(frame.get('package', None)))
            frame_modules.append(_unicodify(frame.get('module', None)))
            frame_functions.append(_unicodify(frame.get('function', None)))
            frame_in_app.append(frame.get('in_app', None))
            frame_colnos.append(_collapse_uint32(frame.get('colno', None)))
            frame_linenos.append(_collapse_uint32(frame.get('lineno', None)))
            frame_stack_levels.append(stack_level)

        stack_level += 1

    output['exception_stacks.type'] = stack_types
    output['exception_stacks.value'] = stack_values
    output['exception_stacks.mechanism_type'] = stack_mechanism_types
    output['exception_stacks.mechanism_handled'] = stack_mechanism_handled
    output['exception_frames.abs_path'] = frame_abs_paths
    output['exception_frames.filename'] = frame_filenames
    output['exception_frames.package'] = frame_packages
    output['exception_frames.module'] = frame_modules
    output['exception_frames.function'] = frame_functions
    output['exception_frames.in_app'] = frame_in_app
    output['exception_frames.colno'] = frame_colnos
    output['exception_frames.lineno'] = frame_linenos
    output['exception_frames.stack_level'] = frame_stack_levels
def extract_stacktraces(
    self, output: MutableMapping[str, Any], stacks: Sequence[Any]
) -> None:
    stack_types = []
    stack_values = []
    stack_mechanism_types = []
    stack_mechanism_handled = []

    frame_abs_paths = []
    frame_filenames = []
    frame_packages = []
    frame_modules = []
    frame_functions = []
    frame_in_app = []
    frame_colnos = []
    frame_linenos = []
    frame_stack_levels = []

    if output["project_id"] not in settings.PROJECT_STACKTRACE_BLACKLIST:
        stack_level = 0
        for stack in stacks:
            if stack is None:
                continue

            stack_types.append(_unicodify(stack.get("type", None)))
            stack_values.append(_unicodify(stack.get("value", None)))

            mechanism = stack.get("mechanism", None) or {}
            stack_mechanism_types.append(_unicodify(mechanism.get("type", None)))
            stack_mechanism_handled.append(_boolify(mechanism.get("handled", None)))

            frames = (stack.get("stacktrace", None) or {}).get("frames", None) or []
            for frame in frames:
                if frame is None:
                    continue

                frame_abs_paths.append(_unicodify(frame.get("abs_path", None)))
                frame_filenames.append(_unicodify(frame.get("filename", None)))
                frame_packages.append(_unicodify(frame.get("package", None)))
                frame_modules.append(_unicodify(frame.get("module", None)))
                frame_functions.append(_unicodify(frame.get("function", None)))
                frame_in_app.append(frame.get("in_app", None))
                frame_colnos.append(_collapse_uint32(frame.get("colno", None)))
                frame_linenos.append(_collapse_uint32(frame.get("lineno", None)))
                frame_stack_levels.append(stack_level)

            stack_level += 1

    output["exception_stacks.type"] = stack_types
    output["exception_stacks.value"] = stack_values
    output["exception_stacks.mechanism_type"] = stack_mechanism_types
    output["exception_stacks.mechanism_handled"] = stack_mechanism_handled
    output["exception_frames.abs_path"] = frame_abs_paths
    output["exception_frames.filename"] = frame_filenames
    output["exception_frames.package"] = frame_packages
    output["exception_frames.module"] = frame_modules
    output["exception_frames.function"] = frame_functions
    output["exception_frames.in_app"] = frame_in_app
    output["exception_frames.colno"] = frame_colnos
    output["exception_frames.lineno"] = frame_linenos
    output["exception_frames.stack_level"] = frame_stack_levels