def main(argv):
  """Main entry point."""
  if len(argv) > 1:
    raise app.UsageError("Unrecognized arguments")

  # Parse flags and instantiate testing objects.
  if not FLAGS.interesting_results_dir:
    raise app.UsageError("--interesting_results_dir must be set")
  interesting_results_dir = pathlib.Path(FLAGS.interesting_results_dir)
  if interesting_results_dir.exists() and not interesting_results_dir.is_dir():
    raise app.UsageError("--interesting_results_dir must be a directory")
  app.Log(1, "Recording interesting results in %s.", interesting_results_dir)

  for path in interesting_results_dir.iterdir():
    result = pbutil.FromFile(path, deepsmith_pb2.Result())
    print(f"=== BEGIN INTERESTING RESULT {path.stem} ===")
    print("Outcome:", deepsmith_pb2.Result.Outcome.Name(result.outcome))
    print()
    print("OpenCL kernel")
    print("-------------")
    print(fmt.Indent(2, result.testcase.inputs["src"]))
    print()
    # Note: this section prints the stderr output, so label it accordingly.
    print("Stderr")
    print("------")
    print(fmt.Indent(2, result.outputs["stderr"]))
    print()
def RunTestcase(
  opencl_environment: env.OpenCLEnvironment,
  testbed: deepsmith_pb2.Testbed,
  testcase: deepsmith_pb2.Testcase,
  opts: typing.List[str],
) -> deepsmith_pb2.Result:
  """Run a testcase."""
  if testcase.toolchain != "opencl":
    raise ValueError(f"Unsupported testcase toolchain: '{testcase.toolchain}'")
  if testcase.harness.name != "cl_launcher":
    raise ValueError(f"Unsupported testcase harness: '{testcase.harness.name}'")

  # Set up additional command line flags for cl_launcher. We always run with
  # debugging output enabled.
  opts.append("---debug")
  if testbed.opts["opencl_opt"] == "disabled":
    opts.append("---disable_opts")

  start_time_epoch_ms = labdate.MillisecondsTimestamp()
  process = cl_launcher.ExecClsmithSource(
    opencl_environment,
    testcase.inputs["src"],
    driver.NDRange.FromString(testcase.inputs["gsize"]),
    driver.NDRange.FromString(testcase.inputs["lsize"]),
    *opts,
    timeout_seconds=testcase.harness.opts.get("timeout_seconds", "60"),
  )
  wall_time = labdate.MillisecondsTimestamp() - start_time_epoch_ms

  result = deepsmith_pb2.Result()
  result.testcase.CopyFrom(testcase)
  result.testbed.CopyFrom(testbed)
  result.returncode = process.returncode
  result.outputs["stdout"] = process.stdout
  result.outputs["stderr"] = process.stderr
  prof = result.profiling_events.add()
  prof.client = socket.gethostname()
  prof.type = "runtime"
  prof.duration_ms = wall_time
  prof.event_start_epoch_ms = start_time_epoch_ms
  result.outcome = GetResultOutcome(result)
  return result
def ToProto(self) -> deepsmith_pb2.Result:
  """Create protocol buffer representation.

  Returns:
    A Result message.
  """
  proto = deepsmith_pb2.Result()
  return self.SetProto(proto)
def dummy_result() -> deepsmith_pb2.Result:
  """A test fixture which returns a dummy result."""
  return deepsmith_pb2.Result(
    testcase=deepsmith_pb2.Testcase(
      harness=deepsmith_pb2.Harness(name="name"),
      inputs={
        "src": "Kernel source.",
        "gsize": "1,1,1",
        "lsize": "2,2,2",
      },
    ),
    outputs={
      "stdout": "Standard output.",
      "stderr": "Standard error.",
    },
  )
def test_duplicate_testcase_testbed_ignored(session):
  """Test that result is ignored if testbed and testcase are not unique."""
  proto = deepsmith_pb2.Result(
    testcase=deepsmith_pb2.Testcase(
      toolchain='cpp',
      generator=deepsmith_pb2.Generator(name='generator'),
      harness=deepsmith_pb2.Harness(name='harness'),
      inputs={
        'src': 'void main() {}',
        'data': '[1,2]',
      },
      invariant_opts={
        'config': 'opt',
      },
      profiling_events=[
        deepsmith_pb2.ProfilingEvent(
          client='localhost',
          type='generate',
          duration_ms=100,
          event_start_epoch_ms=1123123123,
        ),
      ],
    ),
    testbed=deepsmith_pb2.Testbed(
      toolchain='cpp',
      name='clang',
      opts={'arch': 'x86_64'},
    ),
    returncode=0,
    outputs={'stdout': 'Hello, world!'},
    profiling_events=[
      deepsmith_pb2.ProfilingEvent(
        client='localhost',
        type='exec',
        duration_ms=100,
        event_start_epoch_ms=1123123123,
      ),
    ],
    outcome=deepsmith_pb2.Result.PASS,
  )
  r1 = deeplearning.deepsmith.result.Result.GetOrAdd(session, proto)
  session.add(r1)
  session.flush()

  # Attempt to add a new result which is identical to the first in all fields
  # except for the outputs.
  proto.outputs['stdout'] = '!'
  r2 = deeplearning.deepsmith.result.Result.GetOrAdd(session, proto)
  session.add(r2)
  session.flush()

  # Check that only one result was added.
  assert session.query(deeplearning.deepsmith.result.Result).count() == 1
  # Check that only the first result was added.
  r3 = session.query(deeplearning.deepsmith.result.Result).first()
  assert r3.outputs['stdout'] == 'Hello, world!'
def ProtoFromFile(cls, path: pathlib.Path) -> deepsmith_pb2.Result:
  """Instantiate a protocol buffer result from file.

  Args:
    path: Path to the result proto file.

  Returns:
    Result message instance.
  """
  return pbutil.FromFile(path, deepsmith_pb2.Result())
def test_Generator_GetOrAdd_ToProto_equivalence(session):
  proto_in = deepsmith_pb2.Result(
    testcase=deepsmith_pb2.Testcase(
      toolchain="cpp",
      generator=deepsmith_pb2.Generator(name="generator"),
      harness=deepsmith_pb2.Harness(name="harness"),
      inputs={
        "src": "void main() {}",
        "data": "[1,2]",
      },
      invariant_opts={
        "config": "opt",
      },
      profiling_events=[
        deepsmith_pb2.ProfilingEvent(
          client="localhost",
          type="generate",
          duration_ms=100,
          event_start_epoch_ms=1123123123,
        ),
        deepsmith_pb2.ProfilingEvent(
          client="localhost",
          type="foo",
          duration_ms=100,
          event_start_epoch_ms=1123123123,
        ),
      ],
    ),
    testbed=deepsmith_pb2.Testbed(
      toolchain="cpp",
      name="clang",
      opts={
        "arch": "x86_64",
        "build": "debug+assert",
      },
    ),
    returncode=0,
    outputs={
      "stdout": "Hello, world!",
      "stderr": "",
    },
    profiling_events=[
      deepsmith_pb2.ProfilingEvent(
        client="localhost",
        type="exec",
        duration_ms=500,
        event_start_epoch_ms=1123123123,
      ),
      deepsmith_pb2.ProfilingEvent(
        client="localhost",
        type="overhead",
        duration_ms=100,
        event_start_epoch_ms=1123123123,
      ),
    ],
    outcome=deepsmith_pb2.Result.PASS,
  )
  result = deeplearning.deepsmith.result.Result.GetOrAdd(session, proto_in)
  # NOTE: We have to flush so that SQLAlchemy resolves all of the object IDs.
  session.flush()
  proto_out = result.ToProto()
  assert proto_in == proto_out
  proto_out.ClearField("outputs")
  assert proto_in != proto_out  # Sanity check.
def ResultProtoFromFlag(flag: typing.Optional[str]) -> deepsmith_pb2.Result:
  """Read a result proto from a --flag path.

  Args:
    flag: The value of the flag which points to a result proto.

  Returns:
    The Result proto.

  Raises:
    UsageError: If the flag is not set or the flag does not point to a Result
      proto.
  """
  if not flag:
    raise app.UsageError("Path is not set.")
  path = pathlib.Path(flag)
  if not path.is_file():
    raise app.UsageError(f"File not found: '{path}'.")
  if not pbutil.ProtoIsReadable(path, deepsmith_pb2.Result()):
    raise app.UsageError(f"Cannot read Result proto: '{path}'.")
  return pbutil.FromFile(path, deepsmith_pb2.Result())
def test_ResultIsInteresting_unknown():
  """An unknown outcome is not interesting."""
  gs_harness = MockHarness()
  filters = MockFilters()
  result = opencl_fuzz.ResultIsInteresting(
    deepsmith_pb2.Result(outcome=deepsmith_pb2.Result.UNKNOWN),
    difftests.UnaryTester(),
    difftests.GoldStandardDiffTester(difftests.NamedOutputIsEqual('stdout')),
    gs_harness,
    filters,
  )
  assert not result
  # Only the unary tester was called, no differential test was required.
  assert not gs_harness.RunTestcases_call_requests
  assert len(filters.PreDifftest_call_args) == 0
def test_ResultIsInteresting_build_timeout():
  """A build timeout is interesting."""
  gs_harness = MockHarness()
  filters = MockFilters()
  result = opencl_fuzz.ResultIsInteresting(
    deepsmith_pb2.Result(outcome=deepsmith_pb2.Result.BUILD_TIMEOUT),
    difftests.UnaryTester(),
    difftests.GoldStandardDiffTester(difftests.NamedOutputIsEqual('stdout')),
    gs_harness,
    filters,
  )
  assert result
  assert result.outputs['difftest_outcome'] == 'ANOMALOUS_BUILD_FAILURE'
  # Only the unary tester was called, no differential test was required.
  assert not gs_harness.RunTestcases_call_requests
  assert len(filters.PreDifftest_call_args) == 0
def RunTestcase(
  opencl_environment: env.OpenCLEnvironment,
  testbed: deepsmith_pb2.Testbed,
  testcase: deepsmith_pb2.Testcase,
  cflags: typing.List[str],
) -> deepsmith_pb2.Result:
  """Run a testcase."""
  if testcase.toolchain != "opencl":
    raise ValueError(f"Unsupported testcase toolchain: '{testcase.toolchain}'")
  if testcase.harness.name != "cldrive":
    raise ValueError(f"Unsupported testcase harness: '{testcase.harness.name}'")
  result = deepsmith_pb2.Result()
  result.testbed.CopyFrom(testbed)
  platform_id, device_id = opencl_environment.ids()
  driver = MakeDriver(testcase, testbed.opts["opencl_opt"] == "enabled")
  # MakeDriver() annotates the testcase, so we must only set the testcase field
  # of the output result after we have called it.
  result.testcase.CopyFrom(testcase)
  # Get a temporary file to write and run the driver from.
  with tempfile.NamedTemporaryFile(prefix="deepsmith_", delete=False) as f:
    path = pathlib.Path(f.name)
    try:
      CompileDriver(driver, path, platform_id, device_id, cflags=cflags)
      timeout = testcase.harness.opts.get("timeout_seconds", "60")
      cmd = ["timeout", "-s9", timeout, f.name]
      start_time = labdate.GetUtcMillisecondsNow()
      proc = opencl_environment.Exec(cmd)
      end_time = labdate.GetUtcMillisecondsNow()
      # Build result message.
      result.returncode = proc.returncode
      result.outputs["stdout"] = proc.stdout
      result.outputs["stderr"] = proc.stderr
      runtime = result.profiling_events.add()
      runtime.client = system.HOSTNAME
      runtime.type = "runtime"
      runtime.duration_ms = int(
        round((end_time - start_time).total_seconds() * 1000))
      runtime.event_start_epoch_ms = labdate.MillisecondsTimestamp(start_time)
      result.outcome = GetResultOutcome(result)
    except DriverCompilationError as e:
      app.Warning("%s", e)
      result.outcome = deepsmith_pb2.Result.UNKNOWN
    finally:
      fs.rm(path)
  return result
def test_ResultIsInteresting_build_crash():
  """A build crash is interesting."""
  gs_harness = MockHarness()
  filters = MockFilters()
  result = opencl_fuzz.ResultIsInteresting(
    deepsmith_pb2.Result(outcome=deepsmith_pb2.Result.BUILD_CRASH),
    difftests.UnaryTester(),
    difftests.GoldStandardDiffTester(difftests.NamedOutputIsEqual("stdout")),
    gs_harness,
    filters,
  )
  assert result
  assert result.outputs["difftest_outcome"] == "ANOMALOUS_BUILD_FAILURE"
  # Only the unary tester was called, no differential test was required.
  assert not gs_harness.RunTestcases_call_requests
  assert len(filters.PreDifftest_call_args) == 0
def GetDeviceUnderTestHarness() -> base_harness.HarnessBase:
  """Instantiate the device under test harness.

  Uses the global FLAGS to determine the harness to instantiate.

  Returns:
    A Harness instance.
  """
  if FLAGS.rerun_result:
    result = pbutil.FromFile(
      pathlib.Path(FLAGS.rerun_result), deepsmith_pb2.Result())
    if result.testcase.harness.name == "cldrive":
      harness_class = cldrive.CldriveHarness
      config_class = harness_pb2.CldriveHarness
    elif result.testcase.harness.name == "cl_launcher":
      harness_class = cl_launcher.ClLauncherHarness
      config_class = harness_pb2.ClLauncherHarness
    else:
      raise app.UsageError(
        f"Unrecognized harness: '{result.testcase.harness.name}'")
  elif FLAGS.generator == "clgen":
    harness_class = cldrive.CldriveHarness
    config_class = harness_pb2.CldriveHarness
  elif FLAGS.generator == "clsmith":
    harness_class = cl_launcher.ClLauncherHarness
    config_class = harness_pb2.ClLauncherHarness
  else:
    raise app.UsageError(
      f"Unrecognized value for --generator: '{FLAGS.generator}'")
  app.Log(1, "Preparing device under test.")
  config = GetBaseHarnessConfig(config_class)
  config.opencl_env.extend([FLAGS.dut])
  config.opencl_opt.extend([FLAGS.opencl_opt])
  dut_harness = harness_class(config)
  assert len(dut_harness.testbeds) == 1
  return dut_harness
def RunTestcaseOnTestbed(
  self,
  testcase: deepsmith_pb2.Testcase,
  testbed: deepsmith_pb2.Testbed,
) -> deepsmith_pb2.Result:
  start_time = labdate.GetUtcMillisecondsNow()
  # ~~~ Begin 'exec' timed region. ~~~
  # TODO: Popen something.
  # ~~~ End 'exec' timed region. ~~~
  end_time = labdate.GetUtcMillisecondsNow()
  # Create the result. Protobuf message fields cannot be assigned directly;
  # use CopyFrom().
  result = deepsmith_pb2.Result()
  result.testcase.CopyFrom(testcase)
  result.testbed.CopyFrom(testbed)
  result.returncode = 0
  # Create profiling events. duration_ms and event_start_epoch_ms are integer
  # fields, so convert the datetime values.
  exec_time = result.profiling_events.add()
  exec_time.client = system.HOSTNAME
  exec_time.type = 'exec'
  exec_time.duration_ms = int(
    round((end_time - start_time).total_seconds() * 1000))
  exec_time.event_start_epoch_ms = labdate.MillisecondsTimestamp(start_time)
  return result
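The 'exec' timed region above is left as a TODO. A minimal sketch of what such a region might look like using only the standard library (the `TimedExec` helper name and the use of plain `subprocess` rather than the harness API are illustrative assumptions, not the project's implementation):

```python
import subprocess
import sys
from datetime import datetime, timezone


def TimedExec(cmd):
  """Run a command, timing the 'exec' region.

  Returns (returncode, stdout, stderr, duration_ms, start_epoch_ms), mirroring
  the fields a ProfilingEvent would record.
  """
  start = datetime.now(timezone.utc)
  # ~~~ Begin 'exec' timed region. ~~~
  proc = subprocess.run(cmd, capture_output=True, universal_newlines=True)
  # ~~~ End 'exec' timed region. ~~~
  end = datetime.now(timezone.utc)
  duration_ms = int(round((end - start).total_seconds() * 1000))
  start_epoch_ms = int(start.timestamp() * 1000)
  return proc.returncode, proc.stdout, proc.stderr, duration_ms, start_epoch_ms


rc, out, err, duration_ms, start_ms = TimedExec(
  [sys.executable, "-c", "print('ok')"])
```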
def _ExportSolidityResults(cursor, start_id, proto_dir):
  batch_size = 1000
  result_id = start_id
  while True:
    cursor.execute(
      """
      SELECT results.id, platforms.platform, platforms.version,
        platforms.host, testbeds.optimizations, programs.generator,
        programs.date, programs.generation_time, programs.src,
        testcases.harness, testcases.timeout, results.date,
        results.returncode, results.runtime, stdouts.stdout, stderrs.stderr
      FROM results
      LEFT JOIN testbeds ON results.testbed_id = testbeds.id
      LEFT JOIN platforms ON testbeds.platform_id = platforms.id
      LEFT JOIN testcases ON results.testcase_id = testcases.id
      LEFT JOIN programs ON testcases.program_id = programs.id
      LEFT JOIN stdouts ON results.stdout_id = stdouts.id
      LEFT JOIN stderrs ON results.stderr_id = stderrs.id
      WHERE results.id >= %s
      ORDER BY results.id
      LIMIT %s
      """, (result_id, batch_size))
    i = 0
    for row in cursor:
      i += 1
      (result_id, platform_name, platform_version, host_os, optimizations,
       generator_id, program_date, program_generation_time, program_src,
       harness_id, harness_timeout, result_date, returncode, runtime, stdout,
       stderr) = row
      assert harness_id == 2
      proto = deepsmith_pb2.Result(
        testcase=deepsmith_pb2.Testcase(
          toolchain='solidity',
          generator=_GetSolidityGenerator(generator_id),
          harness=deepsmith_pb2.Harness(
            name='solc',
            opts={
              'timeout_seconds': str(int(harness_timeout)),
              'url': 'https://github.com/ChrisCummins/dsmith/blob/5181c7c95575d428b5144a25549e5a5a55a3da31/dsmith/sol/harnesses.py#L117',
            },
          ),
          inputs={
            'src': program_src,
          },
          invariant_opts={},
          profiling_events=[
            deepsmith_pb2.ProfilingEvent(
              client='cc1',
              type='generation',
              duration_ms=int(program_generation_time * 1000),
              event_start_epoch_ms=dateutil.MillisecondsTimestamp(
                program_date),
            ),
          ],
        ),
        testbed=deepsmith_pb2.Testbed(
          toolchain='solidity',
          name=platform_name,
          opts={
            'version': platform_version,
            'optimizations': 'enabled' if optimizations else 'disabled',
          },
        ),
        returncode=returncode,
        outputs={
          'stdout': stdout,
          'stderr': stderr,
        },
        profiling_events=[
          deepsmith_pb2.ProfilingEvent(
            client='cc1',
            type='runtime',
            duration_ms=int(runtime * 1000),
            event_start_epoch_ms=dateutil.MillisecondsTimestamp(result_date),
          ),
        ],
      )
      with open(proto_dir / 'sol' / 'results' / str(result_id), 'wb') as f:
        f.write(proto.SerializeToString())
    if i < batch_size:
      return
def _ExportOpenCLResults(cursor, start_id, proto_dir):
  batch_size = 1000
  result_id = start_id
  while True:
    cursor.execute(
      """
      SELECT results.id, platforms.platform, platforms.device,
        platforms.driver, platforms.opencl, platforms.devtype, platforms.host,
        testbeds.optimizations, programs.generator, programs.date,
        programs.generation_time, programs.src, testcases.harness,
        testcases.timeout, results.date, results.returncode, results.runtime,
        stdouts.stdout, stderrs.stderr, stderrs.truncated, threads.gsize_x,
        threads.gsize_y, threads.gsize_z, threads.lsize_x, threads.lsize_y,
        threads.lsize_z, clsmith_testcase_metas.oclverified,
        dsmith_testcase_metas.gpuverified, dsmith_testcase_metas.oclverified,
        dsmith_program_metas.contains_floats,
        dsmith_program_metas.vector_inputs,
        dsmith_program_metas.compiler_warnings
      FROM results
      LEFT JOIN testbeds ON results.testbed_id = testbeds.id
      LEFT JOIN platforms ON testbeds.platform_id = platforms.id
      LEFT JOIN testcases ON results.testcase_id = testcases.id
      LEFT JOIN programs ON testcases.program_id = programs.id
      LEFT JOIN threads ON testcases.threads_id = threads.id
      LEFT JOIN stdouts ON results.stdout_id = stdouts.id
      LEFT JOIN stderrs ON results.stderr_id = stderrs.id
      LEFT JOIN clsmith_testcase_metas ON testcases.id=clsmith_testcase_metas.id
      LEFT JOIN dsmith_testcase_metas ON testcases.id=dsmith_testcase_metas.id
      LEFT JOIN dsmith_program_metas ON programs.id=dsmith_program_metas.id
      WHERE results.id >= %s
      ORDER BY results.id
      LIMIT %s
      """, (result_id, batch_size))
    i = 0
    for row in cursor:
      i += 1
      (result_id, platform_name, device_name, driver_version, opencl_version,
       devtype, host_os, cl_opt, generator_id, program_date,
       program_generation_time, program_src, harness_id, harness_timeout,
       result_date, returncode, runtime, stdout, stderr, truncated_stderr,
       gsize_x, gsize_y, gsize_z, lsize_x, lsize_y, lsize_z,
       clsmith_oclverified, dsmith_gpuverified, dsmith_oclverified,
       dsmith_program_contains_floats, dsmith_program_vector_inputs,
       dsmith_program_compiler_warnings) = row
      inputs = {
        'src': program_src,
      }
      if harness_id != -1:
        inputs['gsize'] = f'{gsize_x},{gsize_y},{gsize_z}'
        inputs['lsize'] = f'{lsize_x},{lsize_y},{lsize_z}'
      testbed_name = OPENCL_DEVICE_MAP[device_name]
      testbed_opts = {}
      _SetIf(testbed_opts, 'opencl_device', device_name.strip())
      _SetIf(testbed_opts, 'opencl_version', opencl_version.strip())
      _SetIf(testbed_opts, 'host', HOSTS_MAP.get(host_os, host_os))
      if testbed_name == "clang":
        _SetIf(testbed_opts, 'llvm_version', driver_version.strip())
      else:
        _SetIf(testbed_opts, 'driver_version', driver_version.strip())
      _SetIf(testbed_opts, 'opencl_devtype',
             OPENCL_DEVTYPE_MAP.get(devtype, devtype))
      _SetIf(testbed_opts, 'opencl_platform', platform_name.strip())
      _SetIf(testbed_opts, 'opencl_opt', 'enabled' if cl_opt else 'disabled')
      invariant_opts = {}
      if clsmith_oclverified == 0:
        invariant_opts['oclverify'] = 'fail'
      elif clsmith_oclverified == 1:
        invariant_opts['oclverify'] = 'pass'
      elif dsmith_oclverified == 0:
        invariant_opts['oclverify'] = 'fail'
      elif dsmith_oclverified == 1:
        invariant_opts['oclverify'] = 'pass'
      if dsmith_gpuverified == 0:
        invariant_opts['gpuverify'] = 'fail'
      elif dsmith_gpuverified == 1:
        invariant_opts['gpuverify'] = 'pass'
      if dsmith_program_contains_floats == 0:
        invariant_opts['kernel_uses_floats'] = 'false'
      elif dsmith_program_contains_floats == 1:
        invariant_opts['kernel_uses_floats'] = 'true'
      if dsmith_program_vector_inputs == 0:
        invariant_opts['kernel_has_vector_inputs'] = 'false'
      elif dsmith_program_vector_inputs == 1:
        invariant_opts['kernel_has_vector_inputs'] = 'true'
      if dsmith_program_compiler_warnings == 0:
        invariant_opts['kernel_throws_compiler_warning'] = 'false'
      elif dsmith_program_compiler_warnings == 1:
        invariant_opts['kernel_throws_compiler_warning'] = 'true'
      testbed = deepsmith_pb2.Testbed(
        toolchain='opencl',
        name=testbed_name,
        opts=testbed_opts,
      )
      proto = deepsmith_pb2.Result(
        testcase=deepsmith_pb2.Testcase(
          toolchain="opencl",
          generator=_GetOpenCLGenerator(generator_id),
          harness=_GetOpenCLHarness(harness_id, harness_timeout),
          inputs=inputs,
          invariant_opts=invariant_opts,
          profiling_events=[
            deepsmith_pb2.ProfilingEvent(
              client="cc1",
              type="generation",
              duration_ms=int(program_generation_time * 1000),
              event_start_epoch_ms=dateutil.MillisecondsTimestamp(
                program_date),
            ),
          ],
        ),
        testbed=testbed,
        returncode=returncode,
        outputs={
          "stdout": stdout,
          "stderr": stderr,
        },
        profiling_events=[
          deepsmith_pb2.ProfilingEvent(
            client={
              'Ubuntu 16.04 64bit': 'cc1',
              'CentOS Linux 7.1.1503 64bit': 'fuji',
              'openSUSE 13.1 64bit': 'kobol',
            }[host_os],
            type="runtime",
            duration_ms=int(runtime * 1000),
            event_start_epoch_ms=dateutil.MillisecondsTimestamp(result_date),
          ),
        ],
      )
      with open(proto_dir / 'opencl' / 'results' / str(result_id), 'wb') as f:
        f.write(proto.SerializeToString())
    if i < batch_size:
      return
def main(argv):
  """Main entry point."""
  if len(argv) > 1:
    raise app.UsageError('Unrecognized arguments')

  if FLAGS.ls_env:
    env.PrintOpenClEnvironments()
    return

  start_time = time.time()

  if FLAGS.rerun_result:
    result_to_rerun_path = pathlib.Path(FLAGS.rerun_result)
    if not result_to_rerun_path.is_file():
      raise app.UsageError('--rerun_result must be the path of a Result proto.')
    if not pbutil.ProtoIsReadable(result_to_rerun_path, deepsmith_pb2.Result()):
      # Note: the format string requires an f-prefix to interpolate the path.
      raise app.UsageError(
        f"Cannot read Result proto: '{result_to_rerun_path}'.")
    result_to_rerun = pbutil.FromFile(result_to_rerun_path,
                                      deepsmith_pb2.Result())
    # harness_class = cldrive.CldriveHarness if result_to_rerun.

  # Parse flags and instantiate testing objects.
  if not FLAGS.interesting_results_dir:
    raise app.UsageError('--interesting_results_dir must be set')
  interesting_results_dir = pathlib.Path(FLAGS.interesting_results_dir)
  if interesting_results_dir.exists() and not interesting_results_dir.is_dir():
    raise app.UsageError('--interesting_results_dir must be a directory')
  logging.info('Recording interesting results in %s.', interesting_results_dir)

  logging.info('Preparing generator.')
  if FLAGS.generator == 'clgen':
    generator = GeneratorFromFlag(generator_pb2.ClgenGenerator,
                                  clgen_pretrained.ClgenGenerator)
    harness_class = cldrive.CldriveHarness
    config_class = harness_pb2.CldriveHarness
    filters = opencl_filters.ClgenOpenClFilters()
  elif FLAGS.generator == 'clsmith':
    generator = GeneratorFromFlag(generator_pb2.ClsmithGenerator,
                                  clsmith.ClsmithGenerator)
    harness_class = cl_launcher.ClLauncherHarness
    config_class = harness_pb2.ClLauncherHarness
    # TODO(cec): Replace with CLSmith filters.
    filters = difftests.FiltersBase()
  else:
    raise app.UsageError(
      f"Unrecognized value for --generator: '{FLAGS.generator}'")
  logging.info('%s:\n  %s', type(generator).__name__, generator.config)

  logging.info('Preparing device under test.')
  config = GetBaseHarnessConfig(config_class)
  config.opencl_env.extend([FLAGS.dut])
  config.opencl_opt.extend([FLAGS.opencl_opt])
  dut_harness = harness_class(config)
  assert len(dut_harness.testbeds) == 1

  logging.info('Preparing gold standard testbed.')
  config = GetBaseHarnessConfig(config_class)
  config.opencl_env.extend([gpu.cldrive.env.OclgrindOpenCLEnvironment().name])
  config.opencl_opt.extend([True])
  gs_harness = harness_class(config)
  assert len(gs_harness.testbeds) >= 1

  TestingLoop(FLAGS.min_interesting_results, FLAGS.max_testing_time_seconds,
              FLAGS.batch_size, generator, dut_harness, gs_harness, filters,
              interesting_results_dir, start_time=start_time)
def _ExportOpenCLResults(cursor, start_id, proto_dir):
  batch_size = 1000
  result_id = start_id
  while True:
    cursor.execute(
      """
      SELECT results.id, platforms.platform, platforms.device,
        platforms.driver, platforms.opencl, platforms.devtype, platforms.host,
        testbeds.optimizations, programs.generator, programs.date,
        programs.generation_time, programs.src, testcases.harness,
        testcases.timeout, results.date, results.returncode, results.runtime,
        stdouts.stdout, stderrs.stderr, stderrs.truncated, threads.gsize_x,
        threads.gsize_y, threads.gsize_z, threads.lsize_x, threads.lsize_y,
        threads.lsize_z
      FROM results
      LEFT JOIN testbeds ON results.testbed_id = testbeds.id
      LEFT JOIN platforms ON testbeds.platform_id = platforms.id
      LEFT JOIN testcases ON results.testcase_id = testcases.id
      LEFT JOIN programs ON testcases.program_id = programs.id
      LEFT JOIN threads ON testcases.threads_id = threads.id
      LEFT JOIN stdouts ON results.stdout_id = stdouts.id
      LEFT JOIN stderrs ON results.stderr_id = stderrs.id
      WHERE results.id >= %s
      ORDER BY results.id
      LIMIT %s
      """, (result_id, batch_size))
    i = 0
    for row in cursor:
      i += 1
      (result_id, platform_name, device_name, driver_version, opencl_version,
       devtype, host_os, cl_opt, generator_id, program_date,
       program_generation_time, program_src, harness_id, harness_timeout,
       result_date, returncode, runtime, stdout, stderr, truncated_stderr,
       gsize_x, gsize_y, gsize_z, lsize_x, lsize_y, lsize_z) = row
      inputs = {
        "src": program_src,
      }
      if harness_id != -1:
        inputs["gsize"] = f"{gsize_x},{gsize_y},{gsize_z}"
        inputs["lsize"] = f"{lsize_x},{lsize_y},{lsize_z}"
      testbed_name = OPENCL_DEVICE_MAP[device_name]
      testbed_opts = {}
      _SetIf(testbed_opts, 'opencl_device', device_name.strip())
      _SetIf(testbed_opts, 'opencl_version', opencl_version.strip())
      _SetIf(testbed_opts, 'host', HOSTS_MAP.get(host_os, host_os))
      if testbed_name == "clang":
        _SetIf(testbed_opts, 'llvm_version', driver_version.strip())
      else:
        _SetIf(testbed_opts, 'driver_version', driver_version.strip())
      _SetIf(testbed_opts, 'opencl_devtype',
             OPENCL_DEVTYPE_MAP.get(devtype, devtype))
      _SetIf(testbed_opts, 'opencl_platform', platform_name.strip())
      _SetIf(testbed_opts, 'opencl_opt', 'enabled' if cl_opt else 'disabled')
      testbed = deepsmith_pb2.Testbed(
        toolchain='opencl',
        name=testbed_name,
        opts=testbed_opts,
      )
      proto = deepsmith_pb2.Result(
        testcase=deepsmith_pb2.Testcase(
          toolchain="opencl",
          generator=_GetOpenCLGenerator(generator_id),
          harness=_GetOpenCLHarness(harness_id, harness_timeout),
          inputs=inputs,
          invariant_opts={},
          profiling_events=[
            deepsmith_pb2.ProfilingEvent(
              client="cc1",
              type="generation",
              duration_seconds=program_generation_time,
              date_epoch_seconds=int(program_date.strftime('%s')),
            ),
          ],
        ),
        testbed=testbed,
        returncode=returncode,
        outputs={
          "stdout": stdout,
          "stderr": stderr,
        },
        profiling_events=[
          deepsmith_pb2.ProfilingEvent(
            client={
              'Ubuntu 16.04 64bit': 'cc1',
              'CentOS Linux 7.1.1503 64bit': 'fuji',
              'openSUSE 13.1 64bit': 'kobol',
            }[host_os],
            type="runtime",
            duration_seconds=runtime,
            date_epoch_seconds=int(result_date.strftime('%s')),
          ),
        ],
      )
      with open(proto_dir / 'opencl' / 'results' / str(result_id), 'wb') as f:
        f.write(proto.SerializeToString())
    if i < batch_size:
      return