def __init__(self):
    Repr.__init__(self)
    # Note: These levels can get adjusted dynamically!  My goal is to get
    # more info when printing important debug stuff like exceptions and
    # stack traces and less info when logging normal events.
    # --Zooko 2000-10-14
    self.maxlevel = 6
    self.maxdict = 6
    self.maxlist = 6
    self.maxtuple = 6
    self.maxstring = 300
    self.maxother = 300
def test_tuple(self):
    eq = self.assertEquals
    eq(r((1,)), "(1,)")
    t3 = (1, 2, 3)
    eq(r(t3), "(1, 2, 3)")
    r2 = Repr()
    r2.maxtuple = 2
    expected = repr(t3)[:-2] + "...)"
    eq(r2.repr(t3), expected)
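# A minimal standalone sketch (not part of the test above; Python 3's
# reprlib assumed) showing the truncation behaviour the test verifies:
# once a container or string exceeds its max* limit, Repr substitutes
# "..." for the omitted portion.
from reprlib import Repr

short = Repr()
short.maxtuple = 2
short.maxstring = 8
print(short.repr((1, 2, 3)))   # (1, 2, ...)
print(short.repr("x" * 100))   # quoted string shortened around '...'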
def __init__(self, subsequentIndent=""):
    Repr.__init__(self)
    self.maxtuple = 20
    self.maxset = 160
    self.maxlist = 20
    self.maxdict = 20
    self.maxstring = 1600
    self.maxother = 160
    self.maxLineLen = 160
    self.subsequentIndent = subsequentIndent
    # Pretty-print?
    self._pretty = True
class FrameViewer:

    def __init__(self, root, frame):
        self.root = root
        self.frame = frame
        self.top = Toplevel(self.root)
        self.repr = Repr()
        self.repr.maxstring = 60
        self.load_variables()

    def load_variables(self):
        row = 0
        if self.frame.f_locals is not self.frame.f_globals:
            l = Label(self.top, text="Local Variables",
                      borderwidth=2, relief="raised")
            l.grid(row=row, column=0, columnspan=2, sticky="ew")
            row = self.load_names(self.frame.f_locals, row+1)
        l = Label(self.top, text="Global Variables",
                  borderwidth=2, relief="raised")
        l.grid(row=row, column=0, columnspan=2, sticky="ew")
        row = self.load_names(self.frame.f_globals, row+1)

    def load_names(self, dict, row):
        names = dict.keys()
        names.sort()
        for name in names:
            value = dict[name]
            svalue = self.repr.repr(value)
            l = Label(self.top, text=name)
            l.grid(row=row, column=0, sticky="w")
            l = Entry(self.top, width=60, borderwidth=0)
            l.insert(0, svalue)
            l.grid(row=row, column=1, sticky="w")
            row = row+1
        return row
def repr_instance(self, obj, level):
    """
    If it is an instance of Exception, format it nicely (trying to emulate
    the format that you see when an exception is actually raised, plus
    bracketing '<''s).  If it is an instance of dict call self.repr_dict()
    on it.  If it is an instance of list call self.repr_list() on it.
    Else call Repr.repr_instance().
    """
    if isinstance(obj, exceptions.Exception):
        # Don't cut down exception strings so much.
        tms = self.maxstring
        self.maxstring = max(512, tms * 4)
        tml = self.maxlist
        self.maxlist = max(12, tml * 4)
        try:
            if hasattr(obj, 'args'):
                if len(obj.args) == 1:
                    return '<' + obj.__class__.__name__ + ': ' + \
                           self.repr1(obj.args[0], level-1) + '>'
                else:
                    return '<' + obj.__class__.__name__ + ': ' + \
                           self.repr1(obj.args, level-1) + '>'
            else:
                return '<' + obj.__class__.__name__ + '>'
        finally:
            self.maxstring = tms
            self.maxlist = tml

    if isinstance(obj, dict):
        return self.repr_dict(obj, level)
    if isinstance(obj, list):
        return self.repr_list(obj, level)
    return Repr.repr_instance(self, obj, level)
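# A minimal Python 3 sketch (reprlib assumed; ExcRepr is a hypothetical
# class, not the original) illustrating the same idea as repr_instance
# above: temporarily raise the string limit while formatting an exception
# so its message is not truncated, then restore the old limit.
from reprlib import Repr

class ExcRepr(Repr):
    def repr_instance(self, obj, level):
        if isinstance(obj, Exception):
            saved = self.maxstring
            self.maxstring = max(512, saved * 4)
            try:
                # bracketed form similar to "<ValueError: (...)>"
                return '<%s: %s>' % (obj.__class__.__name__,
                                     self.repr1(obj.args, level - 1))
            finally:
                self.maxstring = saved
        return Repr.repr_instance(self, obj, level)

print(ExcRepr().repr(ValueError("something went wrong " * 10)))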
from repr import Repr
Debugger that can be used instead of the pdb module.  This one provides a
nicer command interface by using the CLI module.
"""

import sys, os
import linecache
import bdb
from repr import Repr
import re

import CLI

# Create a custom safe Repr instance and increase its maxstring.
# The default of 30 truncates error messages too easily.
_repr = Repr()
_repr.maxstring = 200
_repr.maxother = 50
_saferepr = _repr.repr

# set environment from interactive config file
#RCFILE = os.path.expandvars("$HOME/.pyinteractiverc")
#try:
#    env = {}
#    execfile(RCFILE, env, env)
#except:
#    pass
#else:
#    for name, val in env.items():
#        if type(val) is str:
#            os.environ[name] = val
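# A small self-contained sketch (Python 3 / reprlib assumed; the values
# shown are illustrative) of the "_saferepr" pattern used above: one
# module-level Repr instance with a raised maxstring, whose bound repr
# method is reused wherever a value has to appear in an error message
# without flooding the output.
from reprlib import Repr

_repr = Repr()
_repr.maxstring = 200   # the default of 30 chops error text too aggressively
_saferepr = _repr.repr

err = "disk full while writing " + "x" * 500
print(_saferepr(err))   # shortened to roughly 200 characters instead of 500+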
def __init__(self):
    Repr.__init__(self)
    self.maxstring = 100
    self.maxother = 100
class CommandProcessor(Mprocessor.Processor): def __init__(self, core_obj, opts=None): get_option = lambda key: Mmisc.option_set(opts, key, DEFAULT_PROC_OPTS) Mprocessor.Processor.__init__(self, core_obj) self.continue_running = False # True if we should leave command loop self.event2short = dict(EVENT2SHORT) self.event2short["signal"] = "?!" self.event2short["brkpt"] = "xx" self.optional_modules = ("ipython", "bpy") self.cmd_instances = self._populate_commands() # command argument string. Is like current_command, but the part # after cmd_name has been removed. self.cmd_argstr = "" # command name before alias or macro resolution self.cmd_name = "" self.cmd_queue = [] # Queued debugger commands self.completer = lambda text, state: Mcomplete.completer( self, text, state) self.current_command = "" # Current command getting run self.debug_nest = 1 self.display_mgr = Mdisplay.DisplayMgr() self.file2file_remap = {} self.intf = core_obj.debugger.intf self.last_command = None # Initially a no-op self.precmd_hooks = [] self.location = lambda: print_location(self) self.preloop_hooks = [] self.postcmd_hooks = [] self.remap_file_re = None self._populate_cmd_lists() # Note: prompt_str's value set below isn't used. It is # computed dynamically. The value is suggestive of what it # looks like. self.prompt_str = "(trepan2) " # Stop only if line/file is different from last time self.different_line = None # These values updated on entry. Set initial values. self.curframe = None self.event = None self.event_arg = None self.frame = None self.list_lineno = 0 # last list number used in "list" self.list_offset = -1 # last list number used in "disassemble" self.list_obj = None self.list_filename = None # last filename used in list self.list_orig_lineno = 0 # line number of frame or exception on setup self.list_filename = None # filename of frame or exception on setup self.macros = {} # Debugger Macros # Create a custom safe Repr instance and increase its maxstring. # The default of 30 truncates error messages too easily. self._repr = Repr() self._repr.maxstring = 100 self._repr.maxother = 60 self._repr.maxset = 10 self._repr.maxfrozen = 10 self._repr.array = 10 self.stack = [] self.thread_name = None self.frame_thread_name = None initfile_list = get_option("initfile_list") for init_cmdfile in initfile_list: self.queue_startfile(init_cmdfile) return def add_remap_pat(self, pat, replace, clear_remap=True): pyficache.main.add_remap_pat(pat, replace, clear_remap) if clear_remap: self.file2file_remap = {} pyficache.file2file_remap = {} def _saferepr(self, str, maxwidth=None): if maxwidth is None: maxwidth = self.debugger.settings["width"] return self._repr.repr(str)[:maxwidth] def add_preloop_hook(self, hook, position=-1, nodups=True): if hook in self.preloop_hooks: return False self.preloop_hooks.insert(position, hook) return True # To be overridden in derived debuggers def defaultFile(self): """Produce a reasonable default.""" filename = self.curframe.f_code.co_filename # Consider using is_exec_stmt(). I just don't understand # the conditions under which the below test is true. 
if filename == "<string>" and self.debugger.mainpyfile: filename = self.debugger.mainpyfile pass return filename def set_prompt(self, prompt="trepan2"): if self.thread_name and self.thread_name != "MainThread": prompt += ":" + self.thread_name pass self.prompt_str = "%s%s%s" % ( "(" * self.debug_nest, prompt, ")" * self.debug_nest, ) highlight = self.debugger.settings["highlight"] if highlight and highlight in ("light", "dark"): self.prompt_str = colorize("underline", self.prompt_str) self.prompt_str += " " def event_processor(self, frame, event, event_arg, prompt="trepan2"): "command event processor: reading a commands do something with them." self.frame = frame self.event = event self.event_arg = event_arg filename = frame.f_code.co_filename lineno = frame.f_lineno if sys.version_info[0] == 2 and sys.version_info[1] <= 4: line = None else: line = linecache.getline(filename, lineno, frame.f_globals) pass if not line: opts = { "output": "plain", "reload_on_change": self.settings("reload"), "strip_nl": False, } line = pyficache.getline(filename, lineno, opts) self.current_source_text = line if self.settings("skip") is not None: if Mbytecode.is_def_stmt(line, frame): return True if Mbytecode.is_class_def(line, frame): return True pass self.thread_name = Mthread.current_thread_name() self.frame_thread_name = self.thread_name self.set_prompt(prompt) self.process_commands() if filename == "<string>": pyficache.remove_remap_file("<string>") return True def forget(self): """ Remove memory of state variables set in the command processor """ self.stack = [] self.curindex = 0 self.curframe = None self.thread_name = None self.frame_thread_name = None return def eval(self, arg): """Eval string arg in the current frame context.""" try: return eval(arg, self.curframe.f_globals, self.curframe.f_locals) except: t, v = sys.exc_info()[:2] if isinstance(t, str): exc_type_name = t pass else: exc_type_name = t.__name__ self.errmsg(str("%s: %s" % (exc_type_name, arg))) raise return None # Not reached def exec_line(self, line): if self.curframe: local_vars = self.curframe.f_locals global_vars = self.curframe.f_globals else: local_vars = None # FIXME: should probably have place where the # user can store variables inside the debug session. # The setup for this should be elsewhere. Possibly # in interaction. global_vars = None try: code = compile(line + "\n", '"%s"' % line, "single") exec(code, global_vars, local_vars) except: t, v = sys.exc_info()[:2] if type(t) == str: exc_type_name = t else: exc_type_name = t.__name__ self.errmsg("%s: %s" % (str(exc_type_name), str(v))) pass return def get_an_int(self, arg, msg_on_error, min_value=None, max_value=None): """Like cmdfns.get_an_int(), but if there's a stack frame use that in evaluation.""" ret_value = self.get_int_noerr(arg) if ret_value is None: if msg_on_error: self.errmsg(msg_on_error) else: self.errmsg("Expecting an integer, got: %s." % str(arg)) pass return None if min_value and ret_value < min_value: self.errmsg("Expecting integer value to be at least %d, got: %d." % (min_value, ret_value)) return None elif max_value and ret_value > max_value: self.errmsg("Expecting integer value to be at most %d, got: %d." % (max_value, ret_value)) return None return ret_value def get_int_noerr(self, arg): """Eval arg and it is an integer return the value. 
Otherwise return None""" if self.curframe: g = self.curframe.f_globals l = self.curframe.f_locals else: g = globals() l = locals() pass try: val = int(eval(arg, g, l)) except (SyntaxError, NameError, ValueError, TypeError): return None return val def get_int(self, arg, min_value=0, default=1, cmdname=None, at_most=None): """If no argument use the default. If arg is a an integer between least min_value and at_most, use that. Otherwise report an error. If there's a stack frame use that in evaluation.""" if arg is None: return default default = self.get_int_noerr(arg) if default is None: if cmdname: self.errmsg( ("Command '%s' expects an integer; " + "got: %s.") % (cmdname, str(arg))) else: self.errmsg("Expecting a positive integer, got: %s" % str(arg)) pass return None pass if default < min_value: if cmdname: self.errmsg(("Command '%s' expects an integer at least" + " %d; got: %d.") % (cmdname, min_value, default)) else: self.errmsg( ("Expecting a positive integer at least" + " %d; got: %d") % (min_value, default)) pass return None elif at_most and default > at_most: if cmdname: self.errmsg(("Command '%s' expects an integer at most" + " %d; got: %d.") % (cmdname, at_most, default)) else: self.errmsg(("Expecting an integer at most %d; got: %d") % (at_most, default)) pass pass return default def getval(self, arg, locals=None): if not locals: locals = self.curframe.f_locals try: return eval(arg, self.curframe.f_globals, locals) except: t, v = sys.exc_info()[:2] if isinstance(t, str): exc_type_name = t else: exc_type_name = t.__name__ self.errmsg(str("%s: %s" % (exc_type_name, arg))) raise return def ok_for_running(self, cmd_obj, name, nargs): """We separate some of the common debugger command checks here: whether it makes sense to run the command in this execution state, if the command has the right number of arguments and so on. """ if hasattr(cmd_obj, "execution_set"): if not (self.core.execution_status in cmd_obj.execution_set): part1 = "Command '%s' is not available for execution status:" % name mess = Mmisc.wrapped_lines(part1, self.core.execution_status, self.debugger.settings["width"]) self.errmsg(mess) return False pass if self.frame is None and cmd_obj.need_stack: self.intf[-1].errmsg("Command '%s' needs an execution stack." % name) return False if nargs < cmd_obj.min_args: self.errmsg( ("Command '%s' needs at least %d argument(s); " + "got %d.") % (name, cmd_obj.min_args, nargs)) return False elif cmd_obj.max_args is not None and nargs > cmd_obj.max_args: self.errmsg( ("Command '%s' can take at most %d argument(s);" + " got %d.") % (name, cmd_obj.max_args, nargs)) return False return True def process_commands(self): """Handle debugger commands.""" if self.core.execution_status != "No program": self.setup() self.location() pass leave_loop = run_hooks(self, self.preloop_hooks) self.continue_running = False while not leave_loop: try: run_hooks(self, self.precmd_hooks) # bdb had a True return to leave loop. # A more straight-forward way is to set # instance variable self.continue_running. leave_loop = self.process_command() if leave_loop or self.continue_running: break except EOFError: # If we have stacked interfaces, pop to the next # one. If this is the last one however, we'll # just stick with that. FIXME: Possibly we should # check to see if we are interactive. and not # leave if that's the case. Is this the right # thing? investigate and fix. 
if len(self.debugger.intf) > 1: del self.debugger.intf[-1] self.last_command = "" else: if self.debugger.intf[-1].output: self.debugger.intf[-1].output.writeline("Leaving") raise Mexcept.DebuggerQuit pass break pass pass return run_hooks(self, self.postcmd_hooks) def process_command(self): # process command if len(self.cmd_queue) > 0: current_command = self.cmd_queue[0].strip() del self.cmd_queue[0] else: current_command = self.intf[-1].read_command( self.prompt_str).strip() if "" == current_command and self.intf[-1].interactive: current_command = self.last_command pass pass # Look for comments if "" == current_command: if self.intf[-1].interactive: self.errmsg("No previous command registered, " + "so this is a no-op.") pass return False if current_command is None or current_command[0] == "#": return False try: args_list = arg_split(current_command) except: self.errmsg("bad parse %s" % sys.exc_info()[0]) return False for args in args_list: if len(args): while True: if len(args) == 0: return False macro_cmd_name = args[0] if macro_cmd_name not in self.macros: break try: current_command = self.macros[macro_cmd_name][0]( *args[1:]) except TypeError: t, v = sys.exc_info()[:2] self.errmsg("Error expanding macro %s" % macro_cmd_name) return False if self.settings("debugmacro"): print(current_command) pass if isinstance(types.ListType, type(current_command)): for x in current_command: if bytes != type(x): self.errmsg( ("macro %s should return a List " + "of Strings. Has %s of type %s") % ( macro_cmd_name, x, repr(current_command), type(x), )) return False pass first = current_command[0] args = first.split() self.cmd_queue + [current_command[1:]] current_command = first elif type(current_command) == str: args = current_command.split() else: self.errmsg(("macro %s should return a List " + "of Strings or a String. Got %s") % (macro_cmd_name, repr(current_command))) return False pass self.cmd_name = args[0] cmd_name = resolve_name(self, self.cmd_name) self.cmd_argstr = current_command[len(self.cmd_name):].lstrip() if cmd_name: self.last_command = current_command cmd_obj = self.commands[cmd_name] if self.ok_for_running(cmd_obj, cmd_name, len(args) - 1): try: self.current_command = current_command result = cmd_obj.run(args) if result: return result except ( Mexcept.DebuggerQuit, Mexcept.DebuggerRestart, SystemExit, ): # Let these exceptions propagate through raise except: self.errmsg("INTERNAL ERROR: " + traceback.format_exc()) pass pass pass elif not self.settings("autoeval"): self.undefined_cmd(current_command) else: self.exec_line(current_command) pass pass pass return False def remove_preloop_hook(self, hook): try: position = self.preloop_hooks.index(hook) except ValueError: return False del self.preloop_hooks[position] return True def setup(self): """Initialization done before entering the debugger-command loop. In particular we set up the call stack used for local variable lookup and frame/up/down commands. 
We return True if we should NOT enter the debugger-command loop.""" self.forget() if self.settings("dbg_trepan"): self.frame = inspect.currentframe() pass if self.event in ["exception", "c_exception"]: exc_type, exc_value, exc_traceback = self.event_arg else: _, _, exc_traceback = ( None, None, None, ) # NOQA pass if self.frame or exc_traceback: self.stack, self.curindex = get_stack(self.frame, exc_traceback, None, self) self.curframe = self.stack[self.curindex][0] self.thread_name = Mthread.current_thread_name() if exc_traceback: self.list_lineno = traceback.extract_tb(exc_traceback, 1)[0][1] self.list_offset = self.curframe.f_lasti self.list_object = self.curframe else: self.stack = self.curframe = self.botframe = None pass if self.curframe: self.list_lineno = (max( 1, inspect.getlineno(self.curframe) - int(self.settings("listsize") / 2), ) - 1) self.list_offset = self.curframe.f_lasti self.list_filename = self.curframe.f_code.co_filename self.list_object = self.curframe else: if not exc_traceback: self.list_lineno = None pass # if self.execRcLines()==1: return True # FIXME: do we want to save self.list_lineno a second place # so that we can do 'list .' and go back to the first place we listed? return False def queue_startfile(self, cmdfile): """Arrange for file of debugger commands to get read in the process-command loop.""" expanded_cmdfile = osp.expanduser(cmdfile) is_readable = Mfile.readable(expanded_cmdfile) if is_readable: self.cmd_queue.append("source " + expanded_cmdfile) elif is_readable is None: self.errmsg("source file '%s' doesn't exist" % expanded_cmdfile) else: self.errmsg("source file '%s' is not readable" % expanded_cmdfile) pass return def undefined_cmd(self, cmd): """Error message when a command doesn't exist""" self.errmsg('Undefined command: "%s". Try "help".' % cmd) return def read_history_file(self): """Read the command history file -- possibly.""" histfile = self.debugger.intf[-1].histfile try: import readline readline.read_history_file(histfile) except IOError: pass except ImportError: pass return def write_history_file(self): """Write the command history file -- possibly.""" settings = self.debugger.settings histfile = self.debugger.intf[-1].histfile if settings["hist_save"]: try: import readline try: readline.write_history_file(histfile) except IOError: pass except ImportError: pass pass return def _populate_commands(self): """ Create an instance of each of the debugger commands. Commands are found by importing files in the directory 'command'. Some files are excluded via an array set in __init__. For each of the remaining files, we import them and scan for class names inside those files and for each class name, we will create an instance of that class. The set of DebuggerCommand class instances form set of possible debugger commands.""" import trepan.processor.command as Mcommand if hasattr(Mcommand, "__modules__"): return self.populate_commands_easy_install(Mcommand) else: return self.populate_commands_pip(Mcommand) def populate_commands_pip(self, Mcommand): cmd_instances = [] eval_cmd_template = "command_mod.%s(self)" for mod_name in Mcommand.__dict__.keys(): if mod_name.startswith("__"): continue import_name = "trepan.processor.command." 
+ mod_name imp = __import__(import_name) if imp.__name__ == "trepan": command_mod = imp.processor.command else: if mod_name in ( "info_sub", "set_sub", "show_sub", ): pass try: command_mod = getattr(__import__(import_name), mod_name) except: # Don't need to warn about optional modules if mod_name not in self.optional_modules: print("Error importing %s: %s" % (mod_name, sys.exc_info()[0])) pass continue pass classnames = [ tup[0] for tup in inspect.getmembers(command_mod, inspect.isclass) if ("DebuggerCommand" != tup[0] and tup[0].endswith("Command")) ] for classname in classnames: eval_cmd = eval_cmd_template % classname try: instance = eval(eval_cmd) cmd_instances.append(instance) except ImportError: pass except: print("Error loading %s from %s: %s" % (classname, mod_name, sys.exc_info()[0])) pass pass pass return cmd_instances def populate_commands_easy_install(self, Mcommand): cmd_instances = [] for mod_name in Mcommand.__modules__: if mod_name in ( "info_sub", "set_sub", "show_sub", ): pass import_name = "trepan.processor.command." + mod_name try: command_mod = __import__(import_name, None, None, ["*"]) except: if mod_name not in self.optional_modules: print("Error importing %s: %s" % (mod_name, sys.exc_info()[0])) pass continue classnames = [ tup[0] for tup in inspect.getmembers(command_mod, inspect.isclass) if ("DebuggerCommand" != tup[0] and tup[0].endswith("Command")) ] for classname in classnames: if False: instance = getattr(command_mod, classname)(self) cmd_instances.append(instance) else: try: instance = getattr(command_mod, classname)(self) cmd_instances.append(instance) except: print("Error loading %s from %s: %s" % (classname, mod_name, sys.exc_info()[0])) pass pass pass pass return cmd_instances def _populate_cmd_lists(self): """ Populate self.lists and hashes: self.commands, and self.aliases, self.category """ self.commands = {} self.aliases = {} self.category = {} # self.short_help = {} for cmd_instance in self.cmd_instances: if not hasattr(cmd_instance, "aliases"): continue alias_names = cmd_instance.aliases cmd_name = cmd_instance.name self.commands[cmd_name] = cmd_instance for alias_name in alias_names: self.aliases[alias_name] = cmd_name pass cat = getattr(cmd_instance, "category") if cat and self.category.get(cat): self.category[cat].append(cmd_name) else: self.category[cat] = [cmd_name] pass # sh = getattr(cmd_instance, 'short_help') # if sh: # self.short_help[cmd_name] = getattr(c, 'short_help') # pass pass for k in list(self.category.keys()): self.category[k].sort() pass return pass
    from collections.abc import Mapping, Iterable
except ImportError:
    from collections import Mapping, Iterable

from six import string_types

from ._types import NumericString, NumericByteString, IntegerString, IntegerByteString

USING_PYTHON2 = True if sys.version_info < (3, 0) else False

try:
    from repr import Repr
except ImportError:
    from reprlib import Repr

_repr = Repr().repr
unittest_case = TestCase(methodName='__init__')


def _format(message, *args):
    return message.format(*map(_repr, args))


class EnsureError(AssertionError):
    # TODO: preserve original error type, define API for introspecting EnsureError
    pass


class Inspector(object):
    def __init__(self,
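# A quick hedged sketch of how a helper like _format above behaves: every
# positional argument is passed through the shared Repr instance, so huge
# values show up truncated inside assertion messages.  (Standalone demo
# with reprlib's defaults; the real library's limits may differ.)
from reprlib import Repr

_repr = Repr().repr

def _format(message, *args):
    return message.format(*map(_repr, args))

print(_format("expected {} to contain {}", list(range(1000)), "needle"))
# -> expected [0, 1, 2, 3, 4, 5, ...] to contain 'needle'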
globalses = set()
for module in modules:
    globalses.add(id(module.__dict__))

def trace(frame, event, arg):
    if event == 'line':
        print(frame.f_code.co_filename, frame.f_code.co_name, frame.f_lineno)
    if event == 'call':
        if id(frame.f_globals) in globalses:
            return trace

sys.settrace(trace)

_repr_obj = Repr()
_repr_obj.maxstring = 50
_repr_obj.maxother = 50
brief_repr = _repr_obj.repr

# events: call, return, get, set, del, raise
def trace_print_hook(event, label, obj, attr_name,
                     args=(), kwargs={}, result=_UNSET):
    fargs = (event.ljust(6), time.time(), label.rjust(10),
             obj.__class__.__name__, attr_name)
    if event == 'get':
        tmpl = '%s %s - %s - %s.%s -> %s'
        fargs += (brief_repr(result),)
    elif event == 'set':
        tmpl = '%s %s - %s - %s.%s = %s'
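# A tiny hedged demo (standalone, reprlib assumed; the payload value is
# made up) of why brief_repr is handy in a trace hook: attribute values of
# arbitrary size collapse to a bounded one-line string, so each log line
# stays readable.
from reprlib import Repr

_repr_obj = Repr()
_repr_obj.maxstring = 50
_repr_obj.maxother = 50
brief_repr = _repr_obj.repr

payload = {"body": "A" * 10000, "headers": ["x"] * 200}
print(brief_repr(payload))   # one short line, not thousands of characters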
def DXHTTPRequest(resource, data, method='POST', headers=None, auth=True, timeout=DEFAULT_TIMEOUT, use_compression=None, jsonify_data=True, want_full_response=False, decode_response_body=True, prepend_srv=True, session_handler=None, max_retries=DEFAULT_RETRIES, always_retry=False, **kwargs): ''' :param resource: API server route, e.g. "/record/new". If *prepend_srv* is False, a fully qualified URL is expected. If this argument is a callable, it will be called just before each request attempt, and expected to return a tuple (URL, headers). Headers returned by the callback are updated with *headers* (including headers set by this method). :type resource: string :param data: Content of the request body :type data: list or dict, if *jsonify_data* is True; or string or file-like object, otherwise :param headers: Names and values of HTTP headers to submit with the request (in addition to those needed for authentication, compression, or other options specified with the call). :type headers: dict :param auth: Controls the ``Authentication`` header or other means of authentication supplied with the request. If ``True`` (default), a token is obtained from the ``DX_SECURITY_CONTEXT``. If the value evaluates to false, no action is taken to prepare authentication for the request. Otherwise, the value is assumed to be callable, and called with three arguments (method, url, headers) and expected to prepare the authentication headers by reference. :type auth: tuple, object, True (default), or None :param timeout: HTTP request timeout, in seconds :type timeout: float :param config: *config* value to pass through to :meth:`requests.request` :type config: dict :param use_compression: Deprecated :type use_compression: string or None :param jsonify_data: If True, *data* is converted from a Python list or dict to a JSON string :type jsonify_data: boolean :param want_full_response: If True, the full :class:`requests.Response` object is returned (otherwise, only the content of the response body is returned) :type want_full_response: boolean :param decode_response_body: If True (and *want_full_response* is False), the response body is decoded and, if it is a JSON string, deserialized. Otherwise, the response body is uncompressed if transport compression is on, and returned raw. :type decode_response_body: boolean :param prepend_srv: If True, prepends the API server location to the URL :type prepend_srv: boolean :param session_handler: Deprecated. :param max_retries: Maximum number of retries to perform for a request. A "failed" request is retried if any of the following is true: - A response is received from the server, and the content length received does not match the "Content-Length" header. - A response is received from the server, and the response has an HTTP status code in 5xx range. - A response is received from the server, the "Content-Length" header is not set, and the response JSON cannot be parsed. - No response is received from the server, and either *always_retry* is True or the request *method* is "GET". :type max_retries: int :param always_retry: If True, indicates that it is safe to retry a request on failure - Note: It is not guaranteed that the request will *always* be retried on failure; rather, this is an indication to the function that it would be safe to do so. :type always_retry: boolean :returns: Response from API server in the format indicated by *want_full_response* and *decode_response_body*. 
:raises: :exc:`exceptions.DXAPIError` or a subclass if the server returned a non-200 status code; :exc:`requests.exceptions.HTTPError` if an invalid response was received from the server; or :exc:`requests.exceptions.ConnectionError` if a connection cannot be established. Wrapper around :meth:`requests.request()` that makes an HTTP request, inserting authentication headers and (by default) converting *data* to JSON. .. note:: Bindings methods that make API calls make the underlying HTTP request(s) using :func:`DXHTTPRequest`, and most of them will pass any unrecognized keyword arguments you have supplied through to :func:`DXHTTPRequest`. ''' if headers is None: headers = {} global _UPGRADE_NOTIFY url = APISERVER + resource if prepend_srv else resource method = method.upper( ) # Convert method name to uppercase, to ease string comparisons later if _DEBUG >= 3: try: formatted_data = json.dumps(data, indent=2) except UnicodeDecodeError: formatted_data = "<binary data>" print(method, url, "=>\n" + formatted_data, file=sys.stderr) elif _DEBUG == 2: try: formatted_data = json.dumps(data) except UnicodeDecodeError: formatted_data = "<binary data>" print(method, url, "=>", formatted_data, file=sys.stderr) elif _DEBUG > 0: from repr import Repr print(method, url, "=>", Repr().repr(data), file=sys.stderr) if auth is True: auth = AUTH_HELPER if auth: auth(_RequestForAuth(method, url, headers)) pool_args = { arg: kwargs.pop(arg, None) for arg in ("verify", "cert_file", "key_file") } if jsonify_data: data = json.dumps(data) if 'Content-Type' not in headers and method == 'POST': headers['Content-Type'] = 'application/json' # If the input is a buffer, its data gets consumed by # requests.request (moving the read position). Record the initial # buffer position so that we can return to it if the request fails # and needs to be retried. rewind_input_buffer_offset = None if hasattr(data, 'seek') and hasattr(data, 'tell'): rewind_input_buffer_offset = data.tell() try_index = 0 while True: success, time_started = True, None response = None try: if _DEBUG > 0: time_started = time.time() _method, _url, _headers = _process_method_url_headers( method, url, headers) # throws BadStatusLine if the server returns nothing response = _get_pool_manager(**pool_args).request(_method, _url, headers=_headers, body=data, timeout=timeout, retries=False, **kwargs) if _UPGRADE_NOTIFY and response.headers.get( 'x-upgrade-info', '').startswith('A recommended update is available' ) and '_ARGCOMPLETE' not in os.environ: logger.info(response.headers['x-upgrade-info']) try: with file(_UPGRADE_NOTIFY, 'a'): os.utime(_UPGRADE_NOTIFY, None) except: pass _UPGRADE_NOTIFY = False # If an HTTP code that is not in the 200 series is received and the content is JSON, parse it and throw the # appropriate error. Otherwise, raise the usual exception. if response.status // 100 != 2: # response.headers key lookup is case-insensitive if response.headers.get('content-type', '').startswith('application/json'): content = response.data.decode('utf-8') try: content = json.loads(content) except ValueError: # The JSON is not parsable, but we should be able to retry. 
raise exceptions.BadJSONInReply( "Invalid JSON received from server", response.status) try: error_class = getattr(exceptions, content["error"]["type"], exceptions.DXAPIError) except (KeyError, AttributeError, TypeError): error_class = exceptions.HTTPError raise error_class(content, response.status) raise exceptions.HTTPError("{} {}".format( response.status, response.reason)) if want_full_response: return response else: if 'content-length' in response.headers: if int(response.headers['content-length']) != len( response.data): range_str = ( ' (%s)' % (headers['Range'], )) if 'Range' in headers else '' raise exceptions.ContentLengthError( "Received response with content-length header set to %s but content length is %d%s" % (response.headers['content-length'], len(response.data), range_str)) content = response.data if decode_response_body: content = content.decode('utf-8') if response.headers.get('content-type', '').startswith('application/json'): try: content = json.loads(content) except ValueError: # The JSON is not parsable, but we should be able to retry. raise exceptions.BadJSONInReply( "Invalid JSON received from server", response.status) if _DEBUG > 0: t = int((time.time() - time_started) * 1000) if _DEBUG >= 3: print(method, url, "<=", response.status, "(%dms)" % t, "\n" + json.dumps(content, indent=2), file=sys.stderr) elif _DEBUG == 2: print(method, url, "<=", response.status, "(%dms)" % t, json.dumps(content), file=sys.stderr) elif _DEBUG > 0: print(method, url, "<=", response.status, "(%dms)" % t, Repr().repr(content), file=sys.stderr) return content raise AssertionError( 'Should never reach this line: expected a result to have been returned by now' ) except Exception as e: # Avoid reusing connections in the pool, since they may be # in an inconsistent state (observed as "ResponseNotReady" # errors). _get_pool_manager(**pool_args).clear() success = False exception_msg = _extract_msg_from_last_exception() if isinstance(e, _expected_exceptions): if response is not None and response.status == 503: seconds_to_wait = _extract_retry_after_timeout(response) logger.warn( "%s %s: %s. Waiting %d seconds due to server unavailability...", method, url, exception_msg, seconds_to_wait) time.sleep(seconds_to_wait) # Note, we escape the "except" block here without # incrementing try_index because 503 responses with # Retry-After should not count against the number of # permitted retries. continue # Total number of allowed tries is the initial try + up to # (max_retries) subsequent retries. total_allowed_tries = max_retries + 1 ok_to_retry = False # Because try_index is not incremented until we escape this # iteration of the loop, try_index is equal to the number of # tries that have failed so far, minus one. Test whether we # have exhausted all retries. # # BadStatusLine --- server did not return anything # BadJSONInReply --- server returned JSON that didn't parse properly if try_index + 1 < total_allowed_tries: if response is None or \ isinstance(e, (exceptions.ContentLengthError, BadStatusLine, exceptions.BadJSONInReply)): ok_to_retry = always_retry or ( method == 'GET') or _is_retryable_exception(e) else: ok_to_retry = 500 <= response.status < 600 if ok_to_retry: if rewind_input_buffer_offset is not None: data.seek(rewind_input_buffer_offset) delay = min(2**try_index, DEFAULT_TIMEOUT) logger.warn( "%s %s: %s. 
Waiting %d seconds before retry %d of %d...", method, url, exception_msg, delay, try_index + 1, max_retries) time.sleep(delay) try_index += 1 continue # All retries have been exhausted OR the error is deemed not # retryable. Print the latest error and propagate it back to the caller. if not isinstance(e, exceptions.DXAPIError): logger.error("%s %s: %s", method, url, exception_msg) raise finally: if success and try_index > 0: logger.info("%s %s: Recovered after %d retries", method, url, try_index) raise AssertionError( 'Should never reach this line: should have attempted a retry or reraised by now' ) raise AssertionError( 'Should never reach this line: should never break out of loop')
def DXHTTPRequest(resource, data, method='POST', headers=None, auth=True, timeout=DEFAULT_TIMEOUT, use_compression=None, jsonify_data=True, want_full_response=False, decode_response_body=True, prepend_srv=True, session_handler=None, max_retries=DEFAULT_RETRIES, always_retry=False, **kwargs): ''' :param resource: API server route, e.g. "/record/new". If *prepend_srv* is False, a fully qualified URL is expected. If this argument is a callable, it will be called just before each request attempt, and expected to return a tuple (URL, headers). Headers returned by the callback are updated with *headers* (including headers set by this method). :type resource: string :param data: Content of the request body :type data: list or dict, if *jsonify_data* is True; or string or file-like object, otherwise :param headers: Names and values of HTTP headers to submit with the request (in addition to those needed for authentication, compression, or other options specified with the call). :type headers: dict :param auth: Overrides the *auth* value to pass through to :meth:`requests.request`. By default a token is obtained from the ``DX_SECURITY_CONTEXT``. :type auth: tuple, object, True (default), or None :param timeout: HTTP request timeout, in seconds :type timeout: float :param config: *config* value to pass through to :meth:`requests.request` :type config: dict :param use_compression: "snappy" to use Snappy compression, or None :type use_compression: string or None :param jsonify_data: If True, *data* is converted from a Python list or dict to a JSON string :type jsonify_data: boolean :param want_full_response: If True, the full :class:`requests.Response` object is returned (otherwise, only the content of the response body is returned) :type want_full_response: boolean :param decode_response_body: If True (and *want_full_response* is False), the response body is decoded and, if it is a JSON string, deserialized. Otherwise, the response body is uncompressed if transport compression is on, and returned raw. :type decode_response_body: boolean :param prepend_srv: If True, prepends the API server location to the URL :type prepend_srv: boolean :param max_retries: Maximum number of retries to perform for a request. A "failed" request is retried if any of the following is true: - A response is received from the server, and the content length received does not match the "Content-Length" header. - A response is received from the server, and the response has an HTTP status code in 5xx range. - A response is received from the server, the "Content-Length" header is not set, and the response JSON cannot be parsed. - No response is received from the server, and either *always_retry* is True or the request *method* is "GET". :type max_retries: int :param always_retry: If True, indicates that it is safe to retry a request on failure - Note: It is not guaranteed that the request will *always* be retried on failure; rather, this is an indication to the function that it would be safe to do so. :type always_retry: boolean :returns: Response from API server in the format indicated by *want_full_response* and *decode_response_body*. :raises: :exc:`exceptions.DXAPIError` or a subclass if the server returned a non-200 status code; :exc:`requests.exceptions.HTTPError` if an invalid response was received from the server; or :exc:`requests.exceptions.ConnectionError` if a connection cannot be established. 
Wrapper around :meth:`requests.request()` that makes an HTTP request, inserting authentication headers and (by default) converting *data* to JSON. .. note:: Bindings methods that make API calls make the underlying HTTP request(s) using :func:`DXHTTPRequest`, and most of them will pass any unrecognized keyword arguments you have supplied through to :func:`DXHTTPRequest`. ''' if session_handler is None: session_handler = SESSION_HANDLERS[os.getpid()] if headers is None: headers = {} global _UPGRADE_NOTIFY url = APISERVER + resource if prepend_srv else resource method = method.upper( ) # Convert method name to uppercase, to ease string comparisons later if _DEBUG >= 3: print(method, url, "=>\n" + json.dumps(data, indent=2), file=sys.stderr) elif _DEBUG == 2: print(method, url, "=>", json.dumps(data), file=sys.stderr) elif _DEBUG > 0: from repr import Repr print(method, url, "=>", Repr().repr(data), file=sys.stderr) if auth is True: auth = AUTH_HELPER if 'verify' not in kwargs and 'DX_CA_CERT' in os.environ: kwargs['verify'] = os.environ['DX_CA_CERT'] if os.environ['DX_CA_CERT'] == 'NOVERIFY': kwargs['verify'] = False from requests.packages import urllib3 urllib3.disable_warnings() if jsonify_data: data = json.dumps(data) if 'Content-Type' not in headers and method == 'POST': headers['Content-Type'] = 'application/json' headers['DNAnexus-API'] = API_VERSION headers['User-Agent'] = USER_AGENT if use_compression == 'snappy': if not snappy_available: raise exceptions.DXError( "Snappy compression requested, but the snappy module is unavailable" ) headers['accept-encoding'] = 'snappy' # If the input is a buffer, its data gets consumed by # requests.request (moving the read position). Record the initial # buffer position so that we can return to it if the request fails # and needs to be retried. rewind_input_buffer_offset = None if hasattr(data, 'seek') and hasattr(data, 'tell'): rewind_input_buffer_offset = data.tell() try_index = 0 while True: success, streaming_response_truncated = True, False response = None try: _method, _url, _headers = _process_method_url_headers( method, url, headers) response = session_handler.request(_method, _url, headers=_headers, data=data, timeout=timeout, auth=auth, **kwargs) if _UPGRADE_NOTIFY and response.headers.get( 'x-upgrade-info', '').startswith( 'A recommended update is available' ) and not os.environ.has_key('_ARGCOMPLETE'): logger.info(response.headers['x-upgrade-info']) try: with file(_UPGRADE_NOTIFY, 'a'): os.utime(_UPGRADE_NOTIFY, None) except: pass _UPGRADE_NOTIFY = False # If an HTTP code that is not in the 200 series is received and the content is JSON, parse it and throw the # appropriate error. Otherwise, raise the usual exception. 
if response.status_code // 100 != 2: # response.headers key lookup is case-insensitive if response.headers.get('content-type', '').startswith('application/json'): content = json.loads(response.content.decode('utf-8')) error_class = getattr(exceptions, content["error"]["type"], exceptions.DXAPIError) raise error_class(content, response.status_code) response.raise_for_status() if want_full_response: return response else: if 'content-length' in response.headers: if int(response.headers['content-length']) != len( response.content): range_str = ( ' (%s)' % (headers['Range'], )) if 'Range' in headers else '' raise exceptions.ContentLengthError( "Received response with content-length header set to %s but content length is %d%s" % (response.headers['content-length'], len(response.content), range_str)) if use_compression and response.headers.get( 'content-encoding', '') == 'snappy': # TODO: check if snappy raises any exceptions on truncated response content content = snappy.uncompress(response.content) else: content = response.content if decode_response_body: content = content.decode('utf-8') if response.headers.get('content-type', '').startswith('application/json'): try: content = json.loads(content) t = int(response.elapsed.total_seconds() * 1000) if _DEBUG >= 3: print(method, url, "<=", response.status_code, "(%dms)" % t, "\n" + json.dumps(content, indent=2), file=sys.stderr) elif _DEBUG == 2: print(method, url, "<=", response.status_code, "(%dms)" % t, json.dumps(content), file=sys.stderr) elif _DEBUG > 0: print(method, url, "<=", response.status_code, "(%dms)" % t, Repr().repr(content), file=sys.stderr) return content except ValueError: # If a streaming API call (no content-length # set) encounters an error it may just halt the # response because it has no other way to # indicate an error. Under these circumstances # the client sees unparseable JSON, and we # should be able to recover. streaming_response_truncated = 'content-length' not in response.headers raise HTTPError( "Invalid JSON received from server") return content raise AssertionError( 'Should never reach this line: expected a result to have been returned by now' ) except Exception as e: success = False exception_msg = _extract_msg_from_last_exception() if isinstance(e, _expected_exceptions): if response is not None and response.status_code == 503: seconds_to_wait = _extract_retry_after_timeout(response) logger.warn( "%s %s: %s. Waiting %d seconds due to server unavailability...", method, url, exception_msg, seconds_to_wait) time.sleep(seconds_to_wait) # Note, we escape the "except" block here without # incrementing try_index because 503 responses with # Retry-After should not count against the number of # permitted retries. continue # Total number of allowed tries is the initial try + up to # (max_retries) subsequent retries. total_allowed_tries = max_retries + 1 ok_to_retry = False # Because try_index is not incremented until we escape this # iteration of the loop, try_index is equal to the number of # tries that have failed so far, minus one. Test whether we # have exhausted all retries. if try_index + 1 < total_allowed_tries: if response is None or isinstance(e, exceptions.ContentLengthError) or \ streaming_response_truncated: ok_to_retry = always_retry or ( method == 'GET') or _is_retryable_exception(e) else: ok_to_retry = 500 <= response.status_code < 600 if ok_to_retry: if rewind_input_buffer_offset is not None: data.seek(rewind_input_buffer_offset) delay = min(2**try_index, DEFAULT_TIMEOUT) logger.warn( "%s %s: %s. 
Waiting %d seconds before retry %d of %d...", method, url, exception_msg, delay, try_index + 1, max_retries) time.sleep(delay) try_index += 1 continue # All retries have been exhausted OR the error is deemed not # retryable. Print the latest error and propagate it back to the caller. if not isinstance(e, exceptions.DXAPIError): logger.error("%s %s: %s", method, url, exception_msg) raise finally: if success and try_index > 0: logger.info("%s %s: Recovered after %d retries", method, url, try_index) raise AssertionError( 'Should never reach this line: should have attempted a retry or reraised by now' ) raise AssertionError( 'Should never reach this line: should never break out of loop')
def DXHTTPRequest(resource, data, method='POST', headers=None, auth=True, timeout=600, use_compression=None, jsonify_data=True, want_full_response=False, prepend_srv=True, session_handler=None, max_retries=DEFAULT_RETRIES, always_retry=False, **kwargs): ''' :param resource: API server route, e.g. "/record/new" :type resource: string :param data: Content of the request body :type data: list or dict, if *jsonify_data* is True; or string or file-like object, otherwise :param headers: Names and values of HTTP headers to submit with the request (in addition to those needed for authentication, compression, or other options specified with the call). :type headers: dict :param auth: Overrides the *auth* value to pass through to :meth:`requests.request`. By default a token is obtained from the ``DX_SECURITY_CONTEXT``. :type auth: tuple, object, True (default), or None :param timeout: HTTP request timeout, in seconds :type timeout: float :param config: *config* value to pass through to :meth:`requests.request` :type config: dict :param use_compression: "snappy" to use Snappy compression, or None :type use_compression: string or None :param jsonify_data: If True, *data* is converted from a Python list or dict to a JSON string :type jsonify_data: boolean :param want_full_response: If True, the full :class:`requests.Response` object is returned (otherwise, only the content of the response body is returned) :type want_full_response: boolean :param prepend_srv: If True, prepends the API server location to the URL :type prepend_srv: boolean :param max_retries: Maximum number of retries to perform for a request. A "failed" request is retried if any of the following is true: - A response is received from the server, and the content length received does not match the "Content-Length" header. - A response is received from the server, and the response has an HTTP status code in 5xx range. - A response is received from the server, the "Content-Length" header is not set, and the response JSON cannot be parsed. - No response is received from the server, and either *always_retry* is True or the request *method* is "GET". :type max_retries: int :param always_retry: If True, indicates that it is safe to retry a request on failure - Note: It is not guaranteed that the request will *always* be retried on failure; rather, this is an indication to the function that it would be safe to do so. :type always_retry: boolean :returns: Response from API server in the format indicated by *want_full_response*. Note: if *want_full_response* is set to False and the header "content-type" is found in the response with value "application/json", the body of the response will **always** be converted from JSON to a Python list or dict before it is returned. :raises: :exc:`DXAPIError` if the server returned a non-200 status code; :exc:`requests.exceptions.HTTPError` if an invalid response was received from the server; or :exc:`requests.exceptions.ConnectionError` if a connection cannot be established. Wrapper around :meth:`requests.request()` that makes an HTTP request, inserting authentication headers and (by default) converting *data* to JSON. .. note:: Bindings methods that make API calls make the underlying HTTP request(s) using :func:`DXHTTPRequest`, and most of them will pass any unrecognized keyword arguments you have supplied through to :func:`DXHTTPRequest`. 
''' if session_handler is None: session_handler = SESSION_HANDLERS[os.getpid()] if headers is None: headers = {} global _UPGRADE_NOTIFY url = APISERVER + resource if prepend_srv else resource method = method.upper( ) # Convert method name to uppercase, to ease string comparisons later if _DEBUG == '2': print >> sys.stderr, method, url, "=>\n" + json.dumps(data, indent=2) elif _DEBUG: from repr import Repr print >> sys.stderr, method, url, "=>", Repr().repr(data) if auth is True: auth = AUTH_HELPER # When *data* is bytes but *headers* contains Unicode strings, httplib tries to concatenate them and decode *data*, # which should not be done. Also, per HTTP/1.1 headers must be encoded with MIME, but we'll disregard that here, and # just encode them with the Python default (ascii) and fail for any non-ascii content. headers = {k.encode(): v.encode() for k, v in headers.iteritems()} if jsonify_data: data = json.dumps(data) if 'Content-Type' not in headers and method == 'POST': headers['Content-Type'] = 'application/json' # If the input is a buffer, its data gets consumed by # requests.request (moving the read position). Record the initial # buffer position so that we can return to it if the request fails # and needs to be retried. rewind_input_buffer_offset = None if hasattr(data, 'seek') and hasattr(data, 'tell'): rewind_input_buffer_offset = data.tell() headers['DNAnexus-API'] = API_VERSION headers['User-Agent'] = USER_AGENT if use_compression == 'snappy': if not snappy_available: raise DXError( "Snappy compression requested, but the snappy module is unavailable" ) headers['accept-encoding'] = 'snappy' if 'verify' not in kwargs and 'DX_CA_CERT' in os.environ: kwargs['verify'] = os.environ['DX_CA_CERT'] if os.environ['DX_CA_CERT'] == 'NOVERIFY': kwargs['verify'] = False response, last_error = None, None for retry in range(max_retries + 1): streaming_response_truncated = False try: response = session_handler.request(method, url, data=data, headers=headers, timeout=timeout, auth=auth, **kwargs) if _UPGRADE_NOTIFY and response.headers.get( 'x-upgrade-info', '').startswith( 'A recommended update is available' ) and not os.environ.has_key('_ARGCOMPLETE'): logger.info(response.headers['x-upgrade-info']) try: with file(_UPGRADE_NOTIFY, 'a'): os.utime(_UPGRADE_NOTIFY, None) except: pass _UPGRADE_NOTIFY = False if _DEBUG == '2': print >> sys.stderr, method, url, "<=", response.status_code, "\n" + json.dumps( response.content, indent=2) elif _DEBUG: print >> sys.stderr, method, url, "<=", response.status_code, Repr( ).repr(response.content) # If HTTP code that is not 200 (OK) is received and the content is # JSON, parse it and throw the appropriate error. Otherwise, # raise the usual exception. 
if response.status_code != requests.codes.ok: # response.headers key lookup is case-insensitive if response.headers.get('content-type', '').startswith('application/json'): content = json.loads(response.content) raise DXAPIError(content, response.status_code) response.raise_for_status() if want_full_response: return response else: if 'content-length' in response.headers: if int(response.headers['content-length']) != len( response.content): raise ContentLengthError( "Received response with content-length header set to %s but content length is %d" % (response.headers['content-length'], len(response.content))) if use_compression and response.headers.get( 'content-encoding', '') == 'snappy': # TODO: check if snappy raises any exceptions on truncated response content decoded_content = snappy.uncompress(response.content) else: decoded_content = response.content if response.headers.get('content-type', '').startswith('application/json'): try: return json.loads(decoded_content) except ValueError: # If a streaming API call (no content-length # set) encounters an error it may just halt the # response because it has no other way to # indicate an error. Under these circumstances # the client sees unparseable JSON, and we # should be able to recover. streaming_response_truncated = 'content-length' not in response.headers raise HTTPError("Invalid JSON received from server") return decoded_content except (DXAPIError, ConnectionError, HTTPError, Timeout, httplib.HTTPException) as e: last_error = e # TODO: support HTTP/1.1 503 Retry-After # TODO: if the socket was dropped mid-request, ConnectionError or httplib.IncompleteRead is raised, # but non-idempotent requests can be unsafe to retry # Distinguish between connection initiation errors and dropped socket errors if retry < max_retries: if (response is None) or isinstance(e, ContentLengthError): ok_to_retry = always_retry or (method == 'GET') else: ok_to_retry = (response.status_code >= 500 and response.status_code < 600 ) or streaming_response_truncated if ok_to_retry: if rewind_input_buffer_offset is not None: data.seek(rewind_input_buffer_offset) delay = 2**(retry + 1) logger.warn( "%s %s: %s. Waiting %d seconds before retry %d of %d..." % (method, url, str(e), delay, retry + 1, max_retries)) time.sleep(delay) continue break if last_error is None: last_error = DXError("Internal error in DXHTTPRequest") raise last_error
# XXX TO DO:
# - popup menu
# - support partial or total redisplay
# - more doc strings
# - tooltips

# object browser

# XXX TO DO:
# - for classes/modules, add "open source" to object browser

from TreeWidget import TreeItem, TreeNode, ScrolledCanvas
from repr import Repr

myrepr = Repr()
myrepr.maxstring = 100
myrepr.maxother = 100


class ObjectTreeItem(TreeItem):
    def __init__(self, labeltext, object, setfunction=None):
        self.labeltext = labeltext
        self.object = object
        self.setfunction = setfunction

    def GetLabelText(self):
        return self.labeltext

    def GetText(self):
        return myrepr.repr(self.object)

    def GetIconName(self):
        if not self.IsExpandable():
            return "python"

    def IsEditable(self):
        return self.setfunction is not None

    def SetText(self, text):
        try:
            value = eval(text)
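# Hedged illustration (standalone, reprlib assumed; FakeItem is a
# hypothetical stand-in, not the real TreeItem) of what GetText above buys
# the object browser: whatever object a tree node wraps, its label is a
# bounded one-line repr rather than the object's full, possibly huge repr.
from reprlib import Repr

myrepr = Repr()
myrepr.maxstring = 100
myrepr.maxother = 100

class FakeItem:
    def __init__(self, obj):
        self.object = obj

    def GetText(self):
        return myrepr.repr(self.object)

print(FakeItem(list(range(10000))).GetText())   # "[0, 1, 2, 3, 4, 5, ...]"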
def __init__(self):
    Repr.__init__(self)
    self.maxstring = 76
    self.maxother = 76
# THE SOFTWARE.
#
"""
Tools for printing out extended information about frame variables
"""
import inspect
import smtplib
import sys
import string
import tempfile
import traceback
import xmlrpclib

from repr import Repr

_repr = Repr()
_repr.maxstring = 3000
_saferepr = _repr.repr


def printTraceBack(tb=None, output=sys.stderr, exc_type=None, exc_msg=None):
    if isinstance(output, str):
        output = open(output, 'w')
    exc_info = sys.exc_info()
    if tb is None:
        tb = exc_info[2]
    if exc_type is None:
        exc_type = exc_info[0]
    if exc_msg is None:
# Embedded file name: scripts/common/Lib/idlelib/ObjectBrowser.py
from idlelib.TreeWidget import TreeItem, TreeNode, ScrolledCanvas
from repr import Repr

myrepr = Repr()
myrepr.maxstring = 100
myrepr.maxother = 100


class ObjectTreeItem(TreeItem):

    def __init__(self, labeltext, object, setfunction=None):
        self.labeltext = labeltext
        self.object = object
        self.setfunction = setfunction

    def GetLabelText(self):
        return self.labeltext

    def GetText(self):
        return myrepr.repr(self.object)

    def GetIconName(self):
        if not self.IsExpandable():
            return 'python'

    def IsEditable(self):
        return self.setfunction is not None

    def SetText(self, text):
        try:
            value = eval(text)
            self.setfunction(value)
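The 100-character caps above are what keep `GetText()` to a single short tree label even for very large objects; a standalone sketch of that effect (no Tkinter needed, the sample value is invented):

from repr import Repr  # `reprlib` in Python 3

myrepr = Repr()
myrepr.maxstring = 100
myrepr.maxother = 100

# A value that would produce tens of thousands of characters under plain repr()
big_value = {'payload': 'A' * 10000, 'rows': range(1000)}
label_text = myrepr.repr(big_value)
print(label_text)       # one-line summary with '...' where data was cut
print(len(label_text))  # bounded, so it fits a single tree row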
        if covered[it] == cover:
            left.remove(it)
    return reps, lazydict([("thres", rating),
                           ("prop", lambda: len(reps)/float(len(cand))),
                           ("cover", lambda: min(covered.itervalues()))])

###############################################################################
# IO, data formats, etc
###############################################################################

from ast import literal_eval
from repr import Repr

CL = "\r\033[0K"

repr_s = Repr()
repr_s.maxlevel = 3
repr_s.maxdict = 2
repr_s.maxlist = 2
repr_s.maxtuple = 2
repr_s.maxset = 2
repr_s.maxfrozenset = 2
repr_s.maxdeque = 2
repr_s.maxarray = 2
repr_s.maxlong = 20
repr_s.maxstring = 20
repr_s.maxother = 15

TMP_RAM = "/dev/shm" if os.access("/dev/shm", os.W_OK) else None


def repr_call(fname, *args, **kwargs):
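A sketch of the effect of a few of these limits (sample values invented): only two items per container and three levels of nesting survive, which keeps progress output to short one-liners.

from repr import Repr  # `reprlib` in Python 3

repr_s = Repr()
repr_s.maxlevel = 3
repr_s.maxlist = 2
repr_s.maxdict = 2

print(repr_s.repr(range(50)))                            # '[0, 1, ...]'
print(repr_s.repr(dict((i, i * i) for i in range(10))))  # two pairs, then '...'
print(repr_s.repr([[[['deep']]]]))                       # nesting past maxlevel shows '[...]'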
class CommandProcessor(Mprocessor.Processor): def __init__(self, core_obj, opts=None): get_option = lambda key: \ Mmisc.option_set(opts, key, DEFAULT_PROC_OPTS) Mprocessor.Processor.__init__(self, core_obj) self.continue_running = False # True if we should leave command loop self.event2short = dict(EVENT2SHORT) self.event2short['signal'] = '?!' self.event2short['brkpt'] = 'xx' self.optional_modules = ('ipython', 'bpy', 'deparse') self.optional_modules = ('ipython', 'bpy') self.cmd_instances = self._populate_commands() # command argument string. Is like current_command, but the part # after cmd_name has been removed. self.cmd_argstr = '' # command name before alias or macro resolution self.cmd_name = '' self.cmd_queue = [] # Queued debugger commands self.completer = lambda text, state: \ Mcomplete.completer(self, text, state) self.current_command = '' # Current command getting run self.debug_nest = 1 self.display_mgr = Mdisplay.DisplayMgr() self.intf = core_obj.debugger.intf self.last_command = None # Initially a no-op self.precmd_hooks = [] self.location = lambda: print_location(self) self.preloop_hooks = [] self.postcmd_hooks = [] self._populate_cmd_lists() # Note: prompt_str's value set below isn't used. It is # computed dynamically. The value is suggestive of what it # looks like. self.prompt_str = '(trepan2) ' # Stop only if line/file is different from last time self.different_line = None # These values updated on entry. Set initial values. self.curframe = None self.event = None self.event_arg = None self.frame = None self.list_lineno = 0 # last list number used in "list" self.list_filename = None # last filename used in list self.macros = {} # Debugger Macros # Create a custom safe Repr instance and increase its maxstring. # The default of 30 truncates error messages too easily. self._repr = Repr() self._repr.maxstring = 100 self._repr.maxother = 60 self._repr.maxset = 10 self._repr.maxfrozen = 10 self._repr.array = 10 self.stack = [] self.thread_name = None self.frame_thread_name = None initfile_list = get_option('initfile_list') for init_cmdfile in initfile_list: self.queue_startfile(init_cmdfile) return def _saferepr(self, str, maxwidth=None): if maxwidth is None: maxwidth = self.debugger.settings['width'] return self._repr.repr(str)[:maxwidth] def add_preloop_hook(self, hook, position=-1, nodups=True): if hook in self.preloop_hooks: return False self.preloop_hooks.insert(position, hook) return True # To be overridden in derived debuggers def defaultFile(self): """Produce a reasonable default.""" filename = self.curframe.f_code.co_filename # Consider using is_exec_stmt(). I just don't understand # the conditions under which the below test is true. if filename == '<string>' and self.debugger.mainpyfile: filename = self.debugger.mainpyfile pass return filename def set_prompt(self, prompt='trepan2'): if self.thread_name and self.thread_name != 'MainThread': prompt += ':' + self.thread_name pass self.prompt_str = '%s%s%s' % ('(' * self.debug_nest, prompt, ')' * self.debug_nest) highlight = self.debugger.settings['highlight'] if highlight and highlight in ('light', 'dark'): self.prompt_str = colorize('underline', self.prompt_str) self.prompt_str += ' ' def event_processor(self, frame, event, event_arg, prompt='trepan2'): 'command event processor: reading a commands do something with them.' 
self.frame = frame self.event = event self.event_arg = event_arg filename = frame.f_code.co_filename lineno = frame.f_lineno if sys.version_info[0] == 2 and sys.version_info[1] <= 4: line = None else: line = linecache.getline(filename, lineno, frame.f_globals) pass if not line: opts = { 'output': 'plain', 'reload_on_change': self.settings('reload'), 'strip_nl': False } line = pyficache.getline(filename, lineno, opts) self.current_source_text = line if self.settings('skip') is not None: if Mbytecode.is_def_stmt(line, frame): return True if Mbytecode.is_class_def(line, frame): return True pass self.thread_name = Mthread.current_thread_name() self.frame_thread_name = self.thread_name self.set_prompt(prompt) self.process_commands() if filename == '<string>': pyficache.remove_remap_file('<string>') return True def forget(self): """ Remove memory of state variables set in the command processor """ self.stack = [] self.curindex = 0 self.curframe = None self.thread_name = None self.frame_thread_name = None return def eval(self, arg): """Eval string arg in the current frame context.""" try: return eval(arg, self.curframe.f_globals, self.curframe.f_locals) except: t, v = sys.exc_info()[:2] if isinstance(t, str): exc_type_name = t pass else: exc_type_name = t.__name__ self.errmsg(str("%s: %s" % (exc_type_name, arg))) raise return None # Not reached def exec_line(self, line): if self.curframe: local_vars = self.curframe.f_locals global_vars = self.curframe.f_globals else: local_vars = None # FIXME: should probably have place where the # user can store variables inside the debug session. # The setup for this should be elsewhere. Possibly # in interaction. global_vars = None try: code = compile(line + '\n', '"%s"' % line, 'single') exec(code, global_vars, local_vars) except: t, v = sys.exc_info()[:2] if type(t) == str: exc_type_name = t else: exc_type_name = t.__name__ self.errmsg('%s: %s' % (str(exc_type_name), str(v))) pass return def parse_position(self, arg, old_mod=None): """parse_position(self, arg)->(fn, name, lineno) Parse arg as [filename:]lineno | function | module Make sure it works for C:\foo\bar.py:12 """ colon = arg.rfind(':') if colon >= 0: # First handle part before the colon arg1 = arg[:colon].rstrip() lineno_str = arg[colon + 1:].lstrip() (mf, filename, lineno) = self.parse_position_one_arg(arg1, old_mod, False) if filename is None: filename = self.core.canonic(arg1) # Next handle part after the colon val = self.get_an_int(lineno_str, "Bad line number: %s" % lineno_str) if val is not None: lineno = val else: (mf, filename, lineno) = self.parse_position_one_arg(arg, old_mod) pass return mf, filename, lineno def parse_position_one_arg(self, arg, old_mod=None, show_errmsg=True): """parse_position_one_arg(self, arg, show_errmsg) -> (module/function, file, lineno) See if arg is a line number, function name, or module name. Return what we've found. None can be returned as a value in the triple. 
""" modfunc, filename, lineno = (None, None, None) if self.curframe: g = self.curframe.f_globals l = self.curframe.f_locals else: g = globals() l = locals() pass try: # First see if argument is an integer lineno = int(eval(arg, g, l)) if old_mod is None: filename = self.curframe.f_code.co_filename pass except: try: modfunc = eval(arg, g, l) except: modfunc = arg pass msg = ('Object %s is not known yet as a function, module, ' 'or is not found along sys.path, ' 'and not a line number.') % str(repr(arg)) try: # See if argument is a module or function if inspect.isfunction(modfunc): pass elif inspect.ismodule(modfunc): filename = pyficache.pyc2py(modfunc.__file__) filename = self.core.canonic(filename) return (modfunc, filename, None) elif hasattr(modfunc, 'im_func'): modfunc = modfunc.im_func pass else: if show_errmsg: self.errmsg(msg) return (None, None, None) code = modfunc.func_code lineno = code.co_firstlineno filename = code.co_filename except: if show_errmsg: self.errmsg(msg) return (None, None, None) pass return (modfunc, self.core.canonic(filename), lineno) def get_an_int(self, arg, msg_on_error, min_value=None, max_value=None): """Like cmdfns.get_an_int(), but if there's a stack frame use that in evaluation.""" ret_value = self.get_int_noerr(arg) if ret_value is None: if msg_on_error: self.errmsg(msg_on_error) else: self.errmsg('Expecting an integer, got: %s.' % str(arg)) pass return None if min_value and ret_value < min_value: self.errmsg('Expecting integer value to be at least %d, got: %d.' % (min_value, ret_value)) return None elif max_value and ret_value > max_value: self.errmsg('Expecting integer value to be at most %d, got: %d.' % (max_value, ret_value)) return None return ret_value def get_int_noerr(self, arg): """Eval arg and it is an integer return the value. Otherwise return None""" if self.curframe: g = self.curframe.f_globals l = self.curframe.f_locals else: g = globals() l = locals() pass try: val = int(eval(arg, g, l)) except (SyntaxError, NameError, ValueError, TypeError): return None return val def get_int(self, arg, min_value=0, default=1, cmdname=None, at_most=None): """If no argument use the default. If arg is a an integer between least min_value and at_most, use that. Otherwise report an error. 
If there's a stack frame use that in evaluation.""" if arg is None: return default default = self.get_int_noerr(arg) if default is None: if cmdname: self.errmsg( ("Command '%s' expects an integer; " + "got: %s.") % (cmdname, str(arg))) else: self.errmsg('Expecting a positive integer, got: %s' % str(arg)) pass return None pass if default < min_value: if cmdname: self.errmsg(("Command '%s' expects an integer at least" + ' %d; got: %d.') % (cmdname, min_value, default)) else: self.errmsg( ("Expecting a positive integer at least" + ' %d; got: %d') % (min_value, default)) pass return None elif at_most and default > at_most: if cmdname: self.errmsg(("Command '%s' expects an integer at most" + ' %d; got: %d.') % (cmdname, at_most, default)) else: self.errmsg(("Expecting an integer at most %d; got: %d") % (at_most, default)) pass pass return default def getval(self, arg): try: return eval(arg, self.curframe.f_globals, self.curframe.f_locals) except: t, v = sys.exc_info()[:2] if isinstance(t, str): exc_type_name = t else: exc_type_name = t.__name__ self.errmsg(str("%s: %s" % (exc_type_name, arg))) raise return def ok_for_running(self, cmd_obj, name, nargs): """We separate some of the common debugger command checks here: whether it makes sense to run the command in this execution state, if the command has the right number of arguments and so on. """ if hasattr(cmd_obj, 'execution_set'): if not (self.core.execution_status in cmd_obj.execution_set): part1 = ( "Command '%s' is not available for execution status:" % name) mess = Mmisc.wrapped_lines(part1, self.core.execution_status, self.debugger.settings['width']) self.errmsg(mess) return False pass if self.frame is None and cmd_obj.need_stack: self.intf[-1].errmsg("Command '%s' needs an execution stack." % name) return False if nargs < cmd_obj.min_args: self.errmsg( ("Command '%s' needs at least %d argument(s); " + "got %d.") % (name, cmd_obj.min_args, nargs)) return False elif cmd_obj.max_args is not None and nargs > cmd_obj.max_args: self.errmsg( ("Command '%s' can take at most %d argument(s);" + " got %d.") % (name, cmd_obj.max_args, nargs)) return False return True def process_commands(self): """Handle debugger commands.""" if self.core.execution_status != 'No program': self.setup() self.location() pass leave_loop = run_hooks(self, self.preloop_hooks) self.continue_running = False while not leave_loop: try: run_hooks(self, self.precmd_hooks) # bdb had a True return to leave loop. # A more straight-forward way is to set # instance variable self.continue_running. leave_loop = self.process_command() if leave_loop or self.continue_running: break except EOFError: # If we have stacked interfaces, pop to the next # one. If this is the last one however, we'll # just stick with that. FIXME: Possibly we should # check to see if we are interactive. and not # leave if that's the case. Is this the right # thing? investigate and fix. 
if len(self.debugger.intf) > 1: del self.debugger.intf[-1] self.last_command = '' else: if self.debugger.intf[-1].output: self.debugger.intf[-1].output.writeline('Leaving') raise Mexcept.DebuggerQuit pass break pass pass return run_hooks(self, self.postcmd_hooks) def process_command(self): # process command if len(self.cmd_queue) > 0: current_command = self.cmd_queue[0].strip() del self.cmd_queue[0] else: current_command = (self.intf[-1].read_command( self.prompt_str).strip()) if '' == current_command and self.intf[-1].interactive: current_command = self.last_command pass pass # Look for comments if '' == current_command: if self.intf[-1].interactive: self.errmsg("No previous command registered, " + "so this is a no-op.") pass return False if current_command is None or current_command[0] == '#': return False try: args_list = arg_split(current_command) except: self.errmsg("bad parse %s" < sys.exc_info()[0]) return False for args in args_list: if len(args): while True: if len(args) == 0: return False macro_cmd_name = args[0] if macro_cmd_name not in self.macros: break try: current_command = \ self.macros[macro_cmd_name][0](*args[1:]) except TypeError: t, v = sys.exc_info()[:2] self.errmsg("Error expanding macro %s" % macro_cmd_name) return False if self.settings('debugmacro'): print(current_command) pass if isinstance(types.ListType, type(current_command)): for x in current_command: if bytes != type(x): self.errmsg(("macro %s should return a List " + "of Strings. Has %s of type %s") % (macro_cmd_name, x, repr(current_command), type(x))) return False pass first = current_command[0] args = first.split() self.cmd_queue + [current_command[1:]] current_command = first elif type(current_command) == str: args = current_command.split() else: self.errmsg(("macro %s should return a List " + "of Strings or a String. Got %s") % (macro_cmd_name, repr(current_command))) return False pass self.cmd_name = args[0] cmd_name = resolve_name(self, self.cmd_name) self.cmd_argstr = current_command[len(self.cmd_name):].lstrip() if cmd_name: self.last_command = current_command cmd_obj = self.commands[cmd_name] if self.ok_for_running(cmd_obj, cmd_name, len(args) - 1): try: self.current_command = current_command result = cmd_obj.run(args) if result: return result except (Mexcept.DebuggerQuit, Mexcept.DebuggerRestart, SystemExit): # Let these exceptions propagate through raise except: self.errmsg("INTERNAL ERROR: " + traceback.format_exc()) pass pass pass elif not self.settings('autoeval'): self.undefined_cmd(current_command) else: self.exec_line(current_command) pass pass pass return False def remove_preloop_hook(self, hook): try: position = self.preloop_hooks.index(hook) except ValueError: return False del self.preloop_hooks[position] return True def setup(self): """Initialization done before entering the debugger-command loop. In particular we set up the call stack used for local variable lookup and frame/up/down commands. 
We return True if we should NOT enter the debugger-command loop.""" self.forget() if self.settings('dbg_trepan'): self.frame = inspect.currentframe() pass if self.event in ['exception', 'c_exception']: exc_type, exc_value, exc_traceback = self.event_arg else: _, _, exc_traceback = ( None, None, None, ) # NOQA pass if self.frame or exc_traceback: self.stack, self.curindex = \ get_stack(self.frame, exc_traceback, None, self) self.curframe = self.stack[self.curindex][0] self.thread_name = Mthread.current_thread_name() if exc_traceback: self.list_lineno = traceback.extract_tb(exc_traceback, 1)[0][1] else: self.stack = self.curframe = \ self.botframe = None pass if self.curframe: self.list_lineno = \ max(1, inspect.getlineno(self.curframe) - int(self.settings('listsize') / 2)) - 1 self.list_filename = self.curframe.f_code.co_filename else: if not exc_traceback: self.list_lineno = None pass # if self.execRcLines()==1: return True return False def queue_startfile(self, cmdfile): """Arrange for file of debugger commands to get read in the process-command loop.""" expanded_cmdfile = os.path.expanduser(cmdfile) is_readable = Mfile.readable(expanded_cmdfile) if is_readable: self.cmd_queue.append('source ' + expanded_cmdfile) elif is_readable is None: self.errmsg("source file '%s' doesn't exist" % expanded_cmdfile) else: self.errmsg("source file '%s' is not readable" % expanded_cmdfile) pass return def undefined_cmd(self, cmd): """Error message when a command doesn't exist""" self.errmsg('Undefined command: "%s". Try "help".' % cmd) return def read_history_file(self): """Read the command history file -- possibly.""" histfile = self.debugger.intf[-1].histfile try: import readline readline.read_history_file(histfile) except IOError: pass except ImportError: pass return def write_history_file(self): """Write the command history file -- possibly.""" settings = self.debugger.settings histfile = self.debugger.intf[-1].histfile if settings['hist_save']: try: import readline try: readline.write_history_file(histfile) except IOError: pass except ImportError: pass pass return def _populate_commands(self): """ Create an instance of each of the debugger commands. Commands are found by importing files in the directory 'command'. Some files are excluded via an array set in __init__. For each of the remaining files, we import them and scan for class names inside those files and for each class name, we will create an instance of that class. The set of DebuggerCommand class instances form set of possible debugger commands.""" import trepan.processor.command as Mcommand if hasattr(Mcommand, '__modules__'): return self.populate_commands_easy_install(Mcommand) else: return self.populate_commands_pip(Mcommand) def populate_commands_pip(self, Mcommand): cmd_instances = [] eval_cmd_template = 'command_mod.%s(self)' for mod_name in Mcommand.__dict__.keys(): if mod_name.startswith('__'): continue import_name = "trepan.processor.command." 
+ mod_name imp = __import__(import_name) if imp.__name__ == 'trepan': command_mod = imp.processor.command else: if mod_name in ( 'info_sub', 'set_sub', 'show_sub', ): pass try: command_mod = getattr(__import__(import_name), mod_name) except: # Don't need to warn about optional modules if mod_name not in self.optional_modules: print('Error importing %s: %s' % (mod_name, sys.exc_info()[0])) pass continue pass classnames = [ tup[0] for tup in inspect.getmembers(command_mod, inspect.isclass) if ('DebuggerCommand' != tup[0] and tup[0].endswith('Command')) ] for classname in classnames: eval_cmd = eval_cmd_template % classname try: instance = eval(eval_cmd) cmd_instances.append(instance) except ImportError: pass except: print('Error loading %s from %s: %s' % (classname, mod_name, sys.exc_info()[0])) pass pass pass return cmd_instances def populate_commands_easy_install(self, Mcommand): cmd_instances = [] for mod_name in Mcommand.__modules__: if mod_name in ( 'info_sub', 'set_sub', 'show_sub', ): pass import_name = 'trepan.processor.command.' + mod_name try: command_mod = __import__(import_name, None, None, ['*']) except: if mod_name not in self.optional_modules: print('Error importing %s: %s' % (mod_name, sys.exc_info()[0])) pass continue classnames = [ tup[0] for tup in inspect.getmembers(command_mod, inspect.isclass) if ('DebuggerCommand' != tup[0] and tup[0].endswith('Command')) ] for classname in classnames: if False: instance = getattr(command_mod, classname)(self) cmd_instances.append(instance) else: try: instance = getattr(command_mod, classname)(self) cmd_instances.append(instance) except: print('Error loading %s from %s: %s' % (classname, mod_name, sys.exc_info()[0])) pass pass pass pass return cmd_instances def _populate_cmd_lists(self): """ Populate self.lists and hashes: self.commands, and self.aliases, self.category """ self.commands = {} self.aliases = {} self.category = {} # self.short_help = {} for cmd_instance in self.cmd_instances: if not hasattr(cmd_instance, 'aliases'): continue alias_names = cmd_instance.aliases cmd_name = cmd_instance.name self.commands[cmd_name] = cmd_instance for alias_name in alias_names: self.aliases[alias_name] = cmd_name pass cat = getattr(cmd_instance, 'category') if cat and self.category.get(cat): self.category[cat].append(cmd_name) else: self.category[cat] = [cmd_name] pass # sh = getattr(cmd_instance, 'short_help') # if sh: # self.short_help[cmd_name] = getattr(c, 'short_help') # pass pass for k in list(self.category.keys()): self.category[k].sort() pass return pass
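The class above pairs the Repr limits set in `__init__` with `_saferepr()`'s final slice to the configured width. A rough standalone sketch of that two-stage capping (not the trepan method itself; `maxwidth=80` stands in for `self.debugger.settings['width']`):

from repr import Repr  # `reprlib` in Python 3

_repr = Repr()
_repr.maxstring = 100
_repr.maxother = 60


def safe_repr(obj, maxwidth=80):
    # Stage 1: Repr trims long strings/containers.
    # Stage 2: hard-cap the final text so it never exceeds the display width.
    return _repr.repr(obj)[:maxwidth]


print(safe_repr('z' * 1000))
print(len(safe_repr('z' * 1000)))  # never more than maxwidth characters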
def log_repr(result):
    r = Repr()
    r.maxstring = 60
    r.maxother = 60
    return r.repr(result)
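A possible call site for a helper like `log_repr()` (the logging setup and payload are illustrative; the helper is restated so the sketch runs on its own):

import logging
from repr import Repr  # `reprlib` in Python 3


def log_repr(result):
    r = Repr()
    r.maxstring = 60
    r.maxother = 60
    return r.repr(result)


logging.basicConfig(level=logging.DEBUG)
result = {'status': 'ok', 'body': 'B' * 5000}
# Each logged value stays around 60 characters instead of dumping the whole
# payload into the log file.
logging.debug("handler returned %s", log_repr(result))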
      sys.excepthook = sys.__excepthook__
    """
    import pdb
    import sys
    import traceback

    def pdb_excepthook(exc_type, exc_val, exc_tb):
        traceback.print_tb(exc_tb, limit=limit)
        pdb.post_mortem(exc_tb)

    sys.excepthook = pdb_excepthook
    return


_repr_obj = Repr()
_repr_obj.maxstring = 50
_repr_obj.maxother = 50
brief_repr = _repr_obj.repr


# events: call, return, get, set, del, raise
def trace_print_hook(event, label, obj, attr_name,
                     args=(), kwargs={}, result=_UNSET):
    fargs = (event.ljust(6), time.time(), label.rjust(10),
             obj.__class__.__name__, attr_name)
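Presumably `brief_repr` keeps each traced value to roughly 50 characters so a trace event stays on one line; an illustrative standalone use (the argument values are invented):

from repr import Repr  # `reprlib` in Python 3

_repr_obj = Repr()
_repr_obj.maxstring = 50
_repr_obj.maxother = 50
brief_repr = _repr_obj.repr

args = ('GET', '/very/long/path/' + 'x' * 200, {'retries': 3})
# Each traced argument is individually capped, so the joined line stays short.
print(' '.join(brief_repr(a) for a in args))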
import linecache
from repr import Repr
import re

# this may change back to the "stock" bdb module once IronPython and/or
# that module are fixed to work together.
from pycopia.fepy import bdb

from pycopia.fepy import IO
from pycopia.fepy import UI
from pycopia.fepy import CLI
from pycopia.aid import IF

# Create a custom safe Repr instance and increase its maxstring.
# The default of 30 truncates error messages too easily.
_repr = Repr()
_repr.maxstring = 200
_repr.maxother = 50
_saferepr = _repr.repr

DebuggerQuit = bdb.BdbQuit


def find_function(funcname, filename):
    cre = re.compile(r'def\s+%s\s*[(]' % funcname)
    try:
        fp = open(filename)
    except IOError:
        return None
    # consumer of this info expects the first line to be 1
    lineno = 1
    answer = None
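One reason these debuggers route output through a Repr-based `_saferepr` rather than plain `repr()`: besides truncating oversized text, `Repr.repr_instance` catches exceptions raised by an object's own `__repr__` and substitutes a placeholder. A small sketch:

from repr import Repr  # `reprlib` in Python 3

_repr = Repr()
_repr.maxstring = 200
_saferepr = _repr.repr


class Broken(object):
    def __repr__(self):
        raise RuntimeError("repr exploded")


print(_saferepr(Broken()))     # falls back to something like '<Broken instance at 0x...>'
print(_saferepr('E' * 10000))  # long strings are cut down near maxstring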
                         diff.microseconds / 1000)
    except Exception, e:
        problem_serializing(d, e)


def unicode_key(key):
    """
    CONVERT PROPERTY VALUE TO QUOTED NAME OF SAME
    """
    if not isinstance(key, basestring):
        from pyLibrary.debugs.logs import Log
        Log.error("{{key|quote}} is not a valid key", key=key)
    return quote(unicode(key))


_repr_ = Repr()
_repr_.maxlevel = 2


def _repr(obj):
    return _repr_.repr(obj)


# OH HUM, cPython with uJSON, OR pypy WITH BUILTIN JSON?
# http://liangnuren.wordpress.com/2012/08/13/python-json-performance/
# http://morepypy.blogspot.ca/2011/10/speeding-up-json-encoding-in-pypy.html
if use_pypy:
    json_encoder = pypy_json_encode
else:
    json_encoder = cPythonJSONEncoder().encode
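With `maxlevel = 2`, the `_repr()` helper above only shows the outer two levels of a document when building error messages; a sketch of that behaviour (the sample document is invented):

from repr import Repr  # `reprlib` in Python 3

_repr_ = Repr()
_repr_.maxlevel = 2

doc = {'hits': [{'source': {'id': 1, 'payload': {'a': 1}}}]}
# Everything below the top two levels collapses to '{...}' / '[...]',
# e.g. "{'hits': [{...}]}" for the document above.
print(_repr_.repr(doc))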
                value = func(self, *args)
                _cache[args] = (now + cache_store.timeout, args, value, None)
                return value
            except Exception, e:
                e = Except.wrap(e)
                _cache[args] = (now + cache_store.timeout, args, None, e)
                raise e
        else:
            raise exception
    else:
        return value

    return output


_repr = Repr()
_repr.maxlevel = 2


def repr(obj):
    """
    JUST LIKE __builtin__.repr(), BUT WITH SOME REASONABLE LIMITS
    """
    return _repr.repr(obj)


class _FakeLock():

    def __enter__(self):
        pass
import cmd
import bdb
from repr import Repr
import os
import re
import pprint
import traceback


class Restart(Exception):
    """Causes a debugger to be restarted for the debugged python program."""
    pass

# Create a custom safe Repr instance and increase its maxstring.
# The default of 30 truncates error messages too easily.
_repr = Repr()
_repr.maxstring = 200
_saferepr = _repr.repr

__all__ = ["run", "pm", "Pdb", "runeval", "runctx", "runcall", "set_trace",
           "post_mortem", "help"]


def find_function(funcname, filename):
    cre = re.compile(r'def\s+%s\s*[(]' % re.escape(funcname))
    try:
        fp = open(filename)
    except IOError:
        return None
    # consumer of this info expects the first line to be 1
    lineno = 1
    answer = None
from __future__ import absolute_import, print_function, unicode_literals

import itertools
import re
import sys
from collections import Iterable
from math import isnan
from repr import Repr

# Pretty printers to make sure our object representations are modestly readable.
repr_data = Repr().repr
repr_hooks = Repr().repr
repr_data.im_self.maxlevel = 1
repr_data.im_self.maxother = 60
repr_hooks.im_self.maxlevel = 2


class SentinelType(object):
    """A singleton object which is used for nothing other than a sentinel value."""

    def __repr__(self):
        return '<Sentinel>'


Sentinel = SentinelType()


class LazyJSObjectType(object):
    """A singleton object which when used as a value, causes an empty JSObject
    to be created only when needed."""
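The `im_self` indirection above works because a Python 2 bound method exposes its underlying instance as `im_self` (Python 3 spells it `__self__`), so the limits can be tuned after grabbing the bound `repr`. A small sketch:

from repr import Repr  # Python 2; use `reprlib` and `__self__` in Python 3

repr_data = Repr().repr          # bound method of a fresh Repr instance
repr_data.im_self.maxlevel = 1   # tune the instance through the bound method

# Nesting is cut at maxlevel=1: e.g. '[[...], [...]]'
print(repr_data([[1, 2], [3, 4]]))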