def linuxSteps():
    # 'build_steps' instead of 'list' so the builtin is not shadowed
    build_steps = [
        steps.ShellCommand(
            command=['git', 'clean', '-xdf'],
            doStepIf=lambda step: isLinux(step),
            name='clean for Linux',
            description='clean old build data',
        ),
        steps.ShellCommand(
            command=['git', 'submodule', 'foreach', '--recursive',
                     'git', 'clean', '-xdf'],
            doStepIf=lambda step: isLinux(step),
            name='clean submodule for Linux',
            description='clean submodule old build data',
        ),
        steps.ShellCommand(
            command=getLinuxConfigOptions,
            haltOnFailure=True,
            doStepIf=lambda step: isLinux(step),
            name='configure Linux',
            description='create make files for projects',
        ),
        steps.Compile(
            command=base.makeCommand,
            name='Build Qt for Linux',
            haltOnFailure=True,
            doStepIf=lambda step: isLinux(step),
            description='run make for project',
        ),
        steps.Compile(
            command=['make', 'install', '-j2'],
            name='Install Qt for Linux',
            haltOnFailure=True,
            doStepIf=lambda step: isLinux(step),
            description='run make install for project',
        ),
        steps.ShellCommand(
            command=cpIcuLinux,
            haltOnFailure=True,
            doStepIf=lambda step: isLinux(step),
            name='Copy ICU libs for Linux',
            description='copy extra libs',
        ),
        steps.ShellCommand(
            command=lsLinux,
            haltOnFailure=True,
            doStepIf=lambda step: isLinux(step),
            name='Create ls links for Linux',
            description='deploy qt',
        ),
    ]
    return build_steps
def LinuxSteps():
    build_steps = [
        steps.ShellCommand(
            command=['qmake-linux', "QMAKE_CXX='ccache g++'", "-r",
                     "CONFIG+=qtquickcompiler", 'ONLINE="~/repo"'],
            haltOnFailure=True,
            doStepIf=lambda step: isLinux(step),
            name='QMake Linux',
            description='create make files for projects',
        ),
        steps.ShellCommand(
            command=['make', 'clean'],
            doStepIf=lambda step: isClean(step) and isLinux(step),
            name='clean Linux',
            description='clean old build data',
        ),
        steps.Compile(
            command=base.makeCommand,
            name='Build Linux',
            doStepIf=lambda step: isLinux(step),
            haltOnFailure=True,
            description='run make for project',
        ),
        steps.ShellCommand(
            command=['make', 'deploy'],
            doStepIf=lambda step: isDeploy(step) and isLinux(step),
            name='deploy Linux',
            haltOnFailure=True,
            description='deploy project',
        ),
        steps.Compile(
            command=['make', 'test'],
            doStepIf=lambda step: isTest(step) and isLinux(step),
            name='tests Linux',
            haltOnFailure=True,
            description='run autotests of project',
        ),
        steps.ShellCommand(
            command=['make', 'release'],
            doStepIf=lambda step: isRelease(step) and isLinux(step),
            name='release Linux',
            haltOnFailure=True,
            description='release project, e.g. push to store or online repository',
        ),
        steps.ShellCommand(
            command=['make', 'distclean'],
            doStepIf=lambda step: isLinux(step),
            name='clean Linux makefiles',
            description='clean old makefiles',
        ),
    ]
    return build_steps
def steps_build_boot_adjust_config(builder_name, env, slaves, config):
    st = []
    if not config:
        raise ValueError('Missing config for booting')
    if not env['KBUILD_OUTPUT']:
        raise ValueError('Missing KBUILD_OUTPUT path in environment')
    st.append(steps.ShellCommand(
        command=['scripts/config', '--file', env['KBUILD_OUTPUT'] + '.config',
                 # Enable IPV6 for Odroid systemd, AUTOFS4_FS/NFS_V4 will be
                 # in exynos_defconfig around v4.5
                 '-e', 'IPV6', '-e', 'NFS_V4', '-e', 'AUTOFS4_FS',
                 # Enable fan so it won't be spinning on full speed on multi_v7
                 # (PWM_FAN will be in multi_v7 around v4.5-4.6 but both
                 # won't be in older stables)
                 '-e', 'SENSORS_PWM_FAN', '-e', 'PWM_SAMSUNG',
                 # s5p-sss tests need status of selftest
                 '-d', 'CRYPTO_MANAGER_DISABLE_TESTS',
                 # Want DMATEST and TCRYPT for tests
                 '-e', 'DMATEST', '-m', 'CRYPTO_TEST',
                 # Enable lockdep and other non-heavy debugging tools
                 '-e', 'SCHED_STACK_END_CHECK',
                 '-e', 'DEBUG_LOCK_ALLOC',
                 '-e', 'DEBUG_ATOMIC_SLEEP',
                 '-e', 'DEBUG_LIST',
                 # Enable build-time debugging
                 '-e', 'DEBUG_SECTION_MISMATCH',
                 '-d', 'SECTION_MISMATCH_WARN_ONLY',
                 # SECCOMP is required by newer Arch ARM systemd
                 '-e', 'SECCOMP',
                 ],
        haltOnFailure=True,
        env=env,
        name='Toggle config options'))
    st.append(steps.Compile(command=[util.Interpolate(CMD_MAKE), 'olddefconfig'],
                            haltOnFailure=True,
                            env=env,
                            name='Make olddefconfig'))
    return st
def build_coverage():
    remove_build = steps.RemoveDirectory("build")
    create_build = steps.MakeDirectory("build")
    cmake_step = steps.CMake(path=util.Property("src_dir"),
                             definitions=util.Property("cmake_defs", {}),
                             options=util.Property("cmake_opts", []),
                             workdir="build",
                             env=env)

    @util.renderer
    def join_make_opts(props):
        make_opts = props.getProperty("make_opts", [])
        return ["make"] + make_opts

    make_step = steps.Compile(command=join_make_opts, workdir="build", env=env)
    test_coverage = steps.ShellCommand(command=["make", "coverage"],
                                       workdir="build")
    upload_coverage_data = steps.ShellCommand(
        command=[
            "bash", "-c",
            util.Interpolate("bash <(curl -s https://codecov.io/bash) -t " +
                             tokens.codecovToken +
                             " -C %(prop:revision)s -f coverage.info.cleaned")
        ],
        workdir="build")

    factory = util.BuildFactory()
    factory.addStep(remove_build)
    factory.addStep(create_build)
    factory.addStep(cmake_step)
    factory.addStep(make_step)
    factory.addStep(test_coverage)
    factory.addStep(upload_coverage_data)
    return factory
def steps_build_mem_ctrl_adjust_config(builder_name, env):
    st = []
    if not env['KBUILD_OUTPUT']:
        raise ValueError('Missing KBUILD_OUTPUT path in environment')
    st.append(steps.ShellCommand(
        command=['scripts/config', '--file', env['KBUILD_OUTPUT'] + '.config',
                 '-e', 'COMPILE_TEST', '-e', 'OF', '-e', 'SRAM',
                 '-e', 'MEMORY', '-e', 'PM_DEVFREQ',
                 '-e', 'ARM_PL172_MPMC', '-e', 'ATMEL_SDRAMC',
                 '-e', 'ATMEL_EBI', '-e', 'BRCMSTB_DPFE',
                 '-e', 'BT1_L2_CTL', '-e', 'TI_AEMIF',
                 '-e', 'TI_EMIF', '-e', 'OMAP_GPMC',
                 '-e', 'TI_EMIF_SRAM', '-e', 'FPGA_DFL_EMIF',
                 '-e', 'MVEBU_DEVBUS', '-e', 'FSL_CORENET_CF',
                 '-e', 'FSL_IFC', '-e', 'JZ4780_NEMC',
                 '-e', 'MTK_SMI', '-e', 'DA8XX_DDRCTL',
                 '-e', 'PL353_SMC', '-e', 'RENESAS_RPCIF',
                 '-e', 'STM32_FMC2_EBI', '-e', 'SAMSUNG_MC',
                 '-e', 'EXYNOS5422_DMC', '-e', 'EXYNOS_SROM',
                 '-e', 'TEGRA_MC', '-e', 'TEGRA20_EMC',
                 '-e', 'TEGRA30_EMC', '-e', 'TEGRA124_EMC',
                 '-e', 'TEGRA210_EMC_TABLE', '-e', 'TEGRA210_EMC',
                 ],
        haltOnFailure=True,
        env=env,
        name='Toggle memory controller drivers compile test config options'))
    st.append(steps.Compile(command=[util.Interpolate(CMD_MAKE), 'olddefconfig'],
                            haltOnFailure=True,
                            env=env,
                            name='Make olddefconfig'))
    return st
def steps_build_linux_kernel(env, build_step_name='Build kernel',
                             skip_warnings=True):
    st = []
    if skip_warnings:
        # Plain ShellCommand: build output is not scanned for warnings
        st.append(steps.ShellCommand(command=[util.Interpolate(CMD_MAKE)],
                                     haltOnFailure=True,
                                     env=env,
                                     name=build_step_name))
    else:
        st.append(steps.Compile(command=[util.Interpolate(CMD_MAKE)],
                                haltOnFailure=True,
                                warnOnWarnings=True,
                                suppressionList=BUILD_WARN_IGNORE,
                                env=env,
                                name=build_step_name))
    return st
def addBuildSteps(self, f, platform, *, env, **kwargs):
    # ScummVM builds are longer
    timeout = kwargs.pop('timeout', 0)
    timeout = max(3600, timeout)
    super().addBuildSteps(f, platform, **kwargs, env=env, timeout=timeout)

    # Build devtools
    if platform.build_devtools:
        f.addStep(
            steps.Compile(
                command=["make", "-j{0}".format(max_jobs), "devtools"],
                name="compile devtools",
                env=env,
                **kwargs))
@defer.inlineCallbacks  # required: this run() is a generator using defer.returnValue
def run(self):
    command = yield self.makeRemoteShellCommand()
    yield self.runCommand(command)
    result = command.results()
    if result == util.SUCCESS:
        mergecheck_repo = self.getProperty('mergecheck_repo')
        current_branch = self.observer.getStdout().strip()
        #default_branch = REPOS[mergecheck_repo]['default_branch']
        repo_subdir = REPOS[mergecheck_repo]['checkout_subdir']
        upstream_merge_base = ''
        upstream_remote_url = ''
        if ('upstream_merge_base' not in REPOS[mergecheck_repo]
                or 'upstream_remote_url' not in REPOS[mergecheck_repo]):
            # This repository has no remote to compare against,
            # so no mergecheck has to be done.
            defer.returnValue(result)
        else:
            upstream_merge_base = REPOS[mergecheck_repo]['upstream_merge_base']
            upstream_remote_url = REPOS[mergecheck_repo]['upstream_remote_url']
        self.build.addStepsAfterCurrentStep([
            steps.Compile(
                command=[
                    '/local/hdd/buildbot/mergecheck/build/bin/mergecheck',
                    'rebase',
                    '--repo', '.' + repo_subdir,
                    '--remote-url', upstream_remote_url,
                    '--remote-name', 'upstream',
                    '--onto', 'refs/remotes/upstream/master',
                    '--upstream', upstream_merge_base,
                    '--branch', current_branch,
                    '-v', '--print-conflicts',
                ],
                workdir=ip(CHECKOUT_BASE_DIR),
                name='Mergecheck "' + mergecheck_repo + '"',
                warnOnWarnings=False,
                warningPattern=r'^CONFLICT \((content|add\/add)\).*'),
        ])
    defer.returnValue(result)
def WinSteps():
    build_steps = [
        steps.ShellCommand(
            command=['qmake-windows', '-spec', 'win32-g++',
                     "QMAKE_CXX='ccache x86_64-w64-mingw32-g++'", "-r",
                     "CONFIG+=qtquickcompiler", 'ONLINE="~/repo"'],
            name='QMake Windows',
            haltOnFailure=True,
            doStepIf=lambda step: isWin(step),
            description='create make files for projects',
        ),
        steps.ShellCommand(
            command=['make', 'clean'],
            doStepIf=lambda step: isClean(step) and isWin(step),
            name='clean Windows',
            description='clean old build data',
        ),
        steps.Compile(
            command=base.makeCommand,
            name='Build Windows',
            haltOnFailure=True,
            doStepIf=lambda step: isWin(step),
            description='run make for project',
        ),
        steps.ShellCommand(
            command=['make', 'deploy'],
            doStepIf=lambda step: isDeploy(step) and isWin(step),
            name='deploy Windows',
            haltOnFailure=True,
            description='deploy project',
        ),
        steps.ShellCommand(
            command=['make', 'release'],
            doStepIf=lambda step: isRelease(step) and isWin(step),
            name='release Windows',
            haltOnFailure=True,
            description='release project, e.g. push to store or online repository',
        ),
        steps.ShellCommand(
            command=['make', 'distclean'],
            doStepIf=lambda step: isWin(step),
            name='clean Windows makefiles',
            description='clean old makefiles',
        ),
    ]
    return build_steps
def build_coverity():
    remove_build = steps.RemoveDirectory("build")
    remove_src = steps.RemoveDirectory("src")
    create_build = steps.MakeDirectory("build")
    download_src_archive = steps.FileDownload(
        mastersrc=util.Property("src_archive"),
        workerdest="src.tar.xz",
        workdir="src")
    extract_src_archive = steps.ShellCommand(
        name="Extract source archive",
        command=["tar", "xJf", "src.tar.xz"],
        workdir="src")
    cmake_step = steps.CMake(path="../src/",
                             definitions=util.Property("cmake_defs", {}),
                             options=util.Property("cmake_opts", []),
                             workdir="build",
                             env=env)
    make_step = steps.Compile(
        command=["cov-build", "--dir", "cov-int",
                 "make", "-j", "16", "-l", "32"],
        workdir="build",
        env=env)
    compress = steps.ShellCommand(
        command=["tar", "czvf", "gnuradio.tgz", "cov-int"],
        workdir="build")
    upload = steps.ShellCommand(
        command=[
            "curl",
            "--form", "token=" + tokens.coverityToken,
            "--form", "[email protected]",
            "--form", "[email protected]",
            "--form", util.Interpolate("version=%(prop:revision)s"),
            "--form", util.Interpolate(
                "description=\"Weekly Buildbot submission for "
                "%(prop:branch)s branch \""),
            "https://scan.coverity.com/builds?project=GNURadio"
        ],
        workdir="build")

    factory = util.BuildFactory()
    factory.addStep(remove_build)
    factory.addStep(remove_src)
    factory.addStep(create_build)
    factory.addStep(download_src_archive)
    factory.addStep(extract_src_archive)
    factory.addStep(cmake_step)
    factory.addStep(make_step)
    factory.addStep(compress)
    factory.addStep(upload)
    return factory
def AndroidSteps():
    build_steps = [
        steps.ShellCommand(
            command=androidQmake,
            haltOnFailure=True,
            doStepIf=lambda step: isAndroid(step),
            name='QMake Android',
            description='create make files for projects',
        ),
        steps.ShellCommand(
            command=['make', 'clean'],
            doStepIf=lambda step: isClean(step) and isAndroid(step),
            name='clean Android',
            description='clean old build data',
        ),
        steps.Compile(
            command=base.makeCommand,
            name='Build Android',
            doStepIf=lambda step: isAndroid(step),
            haltOnFailure=True,
            description='run make for project',
        ),
        steps.ShellCommand(
            command=['make', 'deploy'],
            doStepIf=lambda step: isDeploy(step) and isAndroid(step),
            name='deploy Android',
            haltOnFailure=True,
            description='deploy project',
        ),
        steps.ShellCommand(
            command=['make', 'release'],
            doStepIf=lambda step: isRelease(step) and isAndroid(step),
            name='release Android',
            haltOnFailure=True,
            description='release project, e.g. push to store or online repository',
        ),
        steps.ShellCommand(
            command=['make', 'distclean'],
            doStepIf=lambda step: isAndroid(step),
            name='clean Android makefiles',
            description='clean old makefiles',
        ),
    ]
    return build_steps
def build_and_test():
    remove_build = steps.RemoveDirectory("build")
    remove_src = steps.RemoveDirectory("src")
    create_build = steps.MakeDirectory("build")
    download_src_archive = steps.FileDownload(
        mastersrc=util.Property("src_archive"),
        workerdest="src.tar.xz",
        workdir="src")
    extract_src_archive = steps.ShellCommand(
        name="Extract source archive",
        command=["tar", "xJf", "src.tar.xz"],
        workdir="src")
    cmake_step = steps.CMake(path="../src/",
                             definitions=util.Property("cmake_defs", {}),
                             options=util.Property("cmake_opts", []),
                             workdir="build",
                             env=env)

    @util.renderer
    def join_make_opts(props):
        make_opts = props.getProperty("make_opts", [])
        return ["make"] + make_opts

    make_step = steps.Compile(command=join_make_opts, workdir="build", env=env)

    @util.renderer
    def parse_test_excludes(props):
        command = ["ctest", "--output-on-failure", "--timeout", "120"]
        excludes = props.getProperty("test_excludes", [])
        excludes.append("qtgui")
        # excludes always contains at least "qtgui" here, so the -E filter
        # is unconditional (the original None check was dead code)
        command += ["-E", "|".join(excludes)]
        return command

    test_step = steps.Test(command=parse_test_excludes, workdir="build")

    factory = util.BuildFactory()
    factory.addStep(remove_build)
    factory.addStep(remove_src)
    factory.addStep(create_build)
    factory.addStep(download_src_archive)
    factory.addStep(extract_src_archive)
    factory.addStep(cmake_step)
    factory.addStep(make_step)
    factory.addStep(test_step)
    return factory
def build_and_test():
    remove_build = steps.RemoveDirectory("build")
    create_build = steps.MakeDirectory("build")
    cmake_step = steps.CMake(path=util.Property("src_dir"),
                             definitions=util.Property("cmake_defs", {}),
                             options=util.Property("cmake_opts", []),
                             workdir="build",
                             env=env)

    @util.renderer
    def join_make_opts(props):
        make_opts = props.getProperty("make_opts", [])
        return ["make"] + make_opts

    make_step = steps.Compile(command=join_make_opts, workdir="build", env=env)

    def parse_exclude_file(rc, stdout, stderr):
        exclude_tests = json.loads(stdout)
        return {"test_excludes": exclude_tests}

    load_exclude_file = steps.SetPropertyFromCommand(
        command=["cat", os.path.join("/config", "test_excludes.json")],
        extract_fn=parse_exclude_file,
        # parameter renamed from 'steps' to 'step' so it no longer shadows
        # the buildbot.plugins.steps module
        doStepIf=lambda step: step.getProperty("exclude_file", False))

    @util.renderer
    def parse_test_excludes(props):
        command = ["ctest", "--output-on-failure", "--timeout", "10"]
        excludes = props.getProperty("test_excludes", None)
        if excludes is not None:
            command += ["-E", "|".join(excludes)]
        return command

    test_step = steps.Test(command=parse_test_excludes, workdir="build")

    factory = util.BuildFactory()
    factory.addStep(remove_build)
    factory.addStep(create_build)
    factory.addStep(load_exclude_file)
    factory.addStep(cmake_step)
    factory.addStep(make_step)
    factory.addStep(test_step)
    return factory
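The exclude-list handling in the factory above is plain data transformation, so it can be sketched and exercised outside Buildbot. This is a minimal stand-alone sketch: `parse_exclude_file` mirrors the `extract_fn` above, and the hypothetical helper `build_ctest_command` mirrors the renderer logic with a plain dict standing in for the Buildbot `props` object.

```python
import json

def parse_exclude_file(rc, stdout, stderr):
    # Mirrors the SetPropertyFromCommand extract_fn: the command's stdout
    # is a JSON list of test names to skip.
    exclude_tests = json.loads(stdout)
    return {"test_excludes": exclude_tests}

def build_ctest_command(properties):
    # Mirrors the @util.renderer: join all excludes into one ctest -E regex.
    command = ["ctest", "--output-on-failure", "--timeout", "10"]
    excludes = properties.get("test_excludes", None)
    if excludes is not None:
        command += ["-E", "|".join(excludes)]
    return command

props = parse_exclude_file(0, '["qtgui", "qa_throttle"]', "")
print(build_ctest_command(props))
# ['ctest', '--output-on-failure', '--timeout', '10', '-E', 'qtgui|qa_throttle']
```

Because `-E` takes a single regex, joining the names with `|` skips any test whose name matches any entry; with no `test_excludes` property set, the base ctest command is returned unchanged.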
def steps_checkdtbs(env, config=None, git_reset=True):
    st = []
    if git_reset:
        st += steps_build_common(env, config)
    else:
        st.append(step_make_config(env, config))
    step_name_cfg = str(config) + ' config' if config else 'defconfig'

    step_name = 'make dtbs baseline for ' + env['ARCH'] + '/' + step_name_cfg
    st.append(steps.ShellCommand(command=[util.Interpolate(CMD_MAKE),
                                          'dtbs', 'W=1'],
                                 haltOnFailure=True,
                                 env=env,
                                 name=step_name))
    st.append(step_touch_commit_files())

    step_name = 'make dtbs warnings for ' + env['ARCH'] + '/' + step_name_cfg
    st.append(steps.Compile(command=[util.Interpolate(CMD_MAKE),
                                     'dtbs', 'W=1'],
                            haltOnFailure=True,
                            warnOnWarnings=True,
                            suppressionList=BUILD_WARN_IGNORE,
                            env=env,
                            name=step_name))
    return st
def steps_build_selected_folders(builder_name, env):
    st = []
    if not env['KBUILD_OUTPUT']:
        raise ValueError('Missing KBUILD_OUTPUT path in environment')
    st.append(steps.ShellCommand(
        command=[util.Interpolate(CMD_MAKE),
                 'arch/arm/',
                 # make won't build DTBs but include it for completeness
                 'arch/arm64/boot/dts/',
                 'drivers/clk/samsung/',
                 'drivers/pinctrl/samsung/',
                 'drivers/memory/',
                 'drivers/soc/samsung/'],
        haltOnFailure=True,
        env=env,
        name='Build selected paths'))
    st.append(step_touch_commit_files())
    st.append(steps.Compile(
        command=[util.Interpolate(CMD_MAKE),
                 'arch/arm/',
                 # make won't build DTBs but include it for completeness
                 'arch/arm64/boot/dts/',
                 'drivers/clk/samsung/',
                 'drivers/pinctrl/samsung/',
                 'drivers/memory/',
                 'drivers/soc/samsung/'],
        haltOnFailure=True,
        warnOnWarnings=True,
        suppressionList=BUILD_WARN_IGNORE,
        env=env,
        name='Rebuild selected paths'))
    return st
@defer.inlineCallbacks  # required: this run() is a generator using defer.returnValue
def run(self):
    command = yield self.makeRemoteShellCommand()
    yield self.runCommand(command)
    result = command.results()
    if result == util.SUCCESS:
        mergecheck_repo = self.getProperty('mergecheck_repo')
        current_branch = self.observer.getStdout().strip()
        default_branch = REPOS[mergecheck_repo]['default_branch']
        repo_subdir = REPOS[mergecheck_repo]['checkout_subdir']
        if default_branch == current_branch.replace('refs/heads/', ''):
            # This repository has no feature branch,
            # so nothing has to be merged.
            defer.returnValue(result)
        self.build.addStepsAfterCurrentStep([
            steps.Compile(
                command=[
                    '/local/hdd/buildbot/mergecheck/build/bin/mergecheck',
                    'rebase',
                    '--repo', '.' + repo_subdir,
                    '--upstream', 'refs/remotes/origin/' + default_branch,
                    '--branch', current_branch,
                    '-v', '--print-conflicts',
                ],
                workdir=ip(CHECKOUT_BASE_DIR),
                name='Mergecheck "' + mergecheck_repo + '"',
                warnOnWarnings=False,
                warningPattern=r'^CONFLICT \((content|add\/add)\).*'),
        ])
    defer.returnValue(result)
def windowsSteps():
    build_steps = [
        steps.ShellCommand(
            command=['git', 'clean', '-xdf'],
            doStepIf=lambda step: isWin(step),
            name='clean for Windows',
            description='clean old build data',
        ),
        steps.ShellCommand(
            command=['git', 'submodule', 'foreach', '--recursive',
                     'git', 'clean', '-xdf'],
            doStepIf=lambda step: isWin(step),
            name='clean submodule for Windows',
            description='clean submodule old build data',
        ),
        steps.ShellCommand(
            command=getWindowsConfigOptions,
            haltOnFailure=True,
            doStepIf=lambda step: isWin(step),
            name='configure Windows',
            description='create make files for projects',
        ),
        steps.Compile(
            command=base.makeCommand,
            name='Build Qt for Windows',
            haltOnFailure=True,
            doStepIf=lambda step: isWin(step),
            description='run make for project',
        ),
        steps.Compile(
            command=['make', 'install', '-j2'],
            name='Install Qt for Windows',
            haltOnFailure=True,
            doStepIf=lambda step: isWin(step),
            description='run make install for project',
        ),
        steps.ShellCommand(
            command=cpGCCWindows,
            haltOnFailure=True,
            doStepIf=lambda step: isWin(step),
            name='Copy gcc libs for Windows',
            description='copy extra libs',
        ),
        steps.ShellCommand(
            command=cpThreadWindows,
            haltOnFailure=True,
            doStepIf=lambda step: isWin(step),
            name='Copy thread libs for Windows',
            description='copy extra libs',
        ),
        steps.ShellCommand(
            command=lsWindows,
            haltOnFailure=True,
            doStepIf=lambda step: isWin(step),
            name='Create ls links for Windows',
            description='deploy qt',
        ),
    ]
    return build_steps
def configure(c):
    f = util.BuildFactory()

    # TODO Check if this can be done without a dummy command
    #f.addStep(GenerateGitCloneCommand())
    f.addStep(GenerateGitCloneCommand(name="Dummy_1",
                                      command=['true'],
                                      haltOnFailure=True,
                                      hideStepIf=True))
    f.addStep(define('UCHROOT_SRC_ROOT', UCHROOT_SRC_ROOT))
    f.addStep(define('UCHROOT_BUILD_DIR', UCHROOT_BUILD_DIR))
    f.addStep(GenerateGitCheckoutCommand(
        name="Get branch names",
        command=['./tools/VaRA/utils/buildbot/bb-get-branches.sh'],
        workdir=ip(CHECKOUT_BASE_DIR),
        haltOnFailure=True,
        hideStepIf=True))

    # CMake
    for step in get_uchroot_workaround_steps():
        f.addStep(step)
    f.addStep(ucompile('../tools/VaRA/utils/vara/builds/' + BUILD_SCRIPT,
                       env={'PATH': '/opt/cmake/bin:/usr/local/bin:/usr/bin:/bin'},
                       name='cmake',
                       description=BUILD_SCRIPT,
                       workdir=UCHROOT_SRC_ROOT + '/build'))
    f.addStep(GenerateMakeCleanCommand(name="Dummy_2",
                                       command=['true'],
                                       haltOnFailure=True,
                                       hideStepIf=True))

    # use mergecheck tool to make sure the 'upstream' remote is present
    for repo in ['vara-llvm', 'vara-clang']:
        f.addStep(steps.Compile(
            command=['/local/hdd/buildbot/mergecheck/build/bin/mergecheck',
                     'rebase',
                     '--repo', '.' + REPOS[repo]['checkout_subdir'],
                     '--remote-url', REPOS[repo]['upstream_remote_url'],
                     '--remote-name', 'upstream',
                     '--upstream', 'refs/remotes/upstream/master',
                     '--branch', 'refs/remotes/upstream/master',
                     '-v'],
            workdir=ip(CHECKOUT_BASE_DIR),
            name='Add upstream remote to repository.',
            hideStepIf=True))

    # Prepare project file list to filter out compiler warnings
    f.addStep(cmd("../../tools/VaRA/utils/vara/getVaraSourceFiles.sh",
                  "--vara", "--clang", "--llvm", "--include-existing",
                  "--relative-to", ip(BUILD_DIR),
                  "--output", "buildbot-source-file-list.txt",
                  workdir=ip(BUILD_DIR),
                  hideStepIf=True))

    # Compile step
    f.addStep(GenerateBuildStepCommand(
        name="Dummy_3",
        command=['cat', 'buildbot-source-file-list.txt'],
        workdir=ip(BUILD_DIR),
        haltOnFailure=True,
        hideStepIf=True))

    # Regression test step
    for step in get_uchroot_workaround_steps():
        f.addStep(step)
    f.addStep(ucompile('ninja', 'check-vara',
                       name='run VaRA regression tests',
                       workdir=UCHROOT_BUILD_DIR,
                       haltOnFailure=False,
                       warnOnWarnings=True))

    # Clang-Tidy
    for step in get_uchroot_workaround_steps():
        f.addStep(step)
    f.addStep(ucompile('python3', 'tidy-vara.py', '-p', UCHROOT_BUILD_DIR,
                       '-j', '8', '--gcc',
                       workdir='vara-llvm/tools/VaRA/test/',
                       name='run Clang-Tidy',
                       haltOnFailure=False,
                       warnOnWarnings=True,
                       timeout=3600))

    # ClangFormat
    f.addStep(GenerateClangFormatStepCommand(
        name="Dummy_4",
        command=['opt/clang-format-static/clang-format', '-version'],
        workdir=ip('%(prop:uchroot_image_path)s'),
        haltOnFailure=True,
        hideStepIf=True))

    c['builders'].append(builder(PROJECT_NAME, None, ACCEPTED_BUILDERS,
                                 tags=['vara'], factory=f))
def addBuildSteps(self, f, platform, *, env, **kwargs):
    f.addStep(
        steps.Compile(command=["make", "-j{0}".format(max_jobs)],
                      env=env,
                      **kwargs))
def make_builder_config(repo_url, name, worker_name, config, lock,
                        snapshots_dir, snapshots_url, snapshots_default_max):
    builder = util.BuildFactory()
    builder.addStep(
        steps.SetProperties(name="Worker Config File",
                            properties=config,
                            hideStepIf=True))
    builder.addStep(
        steps.SetPropertiesFromEnv(
            variables=["WORKER_HOST", "WORKER_REPO_DIR"],
            hideStepIf=True))
    # TODO: use `reference` to a common volume instead? or make the builder
    # dependent on another fetch-only builder so only one builder tries to pull it?
    builder.addStep(
        steps.GitHub(repourl=repo_url,
                     workdir=Property("WORKER_REPO_DIR", None),
                     logEnviron=False,
                     getDescription={
                         "always": True,
                         "tags": True
                     }))
    builder.addStep(
        FileExistsSetProperty(name="config.mk Existence Check",
                              property="already_configured",
                              file="%s/config.mk" % builder.workdir,
                              hideStepIf=True))

    compilation_environment = Property("env", {})
    builder.addStep(
        steps.Configure(command=compute_configure,
                        env=compilation_environment,
                        doStepIf=ConfigChecker().needs_configuration))
    builder.addStep(
        steps.SetPropertyFromCommand(name="Python (Worker)",
                                     property="cpu_count",
                                     command=["python", "-c", GET_CPU_COUNT],
                                     flunkOnFailure=False,
                                     warnOnFailure=True,
                                     hideStepIf=True,
                                     description="getting CPU count",
                                     descriptionDone="got CPU count"))

    # In at least Buildbot 0.9.12, warningPattern and suppressionList are not
    # renderable, so just get the properties from the config file immediately
    compiler_warning_pattern = config.get(
        "compiler_warning_pattern",
        r"^([^:]+):(\d+):(?:\d+:)? [Ww]arning: (.*)$")
    compiler_warning_extractor = steps.Compile.warnExtractFromRegexpGroups
    compiler_suppression_file = Property("compiler_suppression_file", None)
    compiler_suppression_list = config.get("compiler_suppression_list", None)

    builder.addStep(
        steps.Compile(command=["make",
                               Interpolate("-j%(prop:cpu_count:~1)s")],
                      env=compilation_environment,
                      warningPattern=compiler_warning_pattern,
                      warningExtractor=compiler_warning_extractor,
                      suppressionFile=compiler_suppression_file,
                      suppressionList=compiler_suppression_list))
    builder.addStep(
        steps.Test(command=["make",
                            Interpolate("%(prop:can_run_tests:"
                                        "#?|test|test/runner)s")],
                   env=compilation_environment,
                   warningPattern=compiler_warning_pattern,
                   warningExtractor=compiler_warning_extractor,
                   suppressionFile=compiler_suppression_file,
                   suppressionList=compiler_suppression_list,
                   haltOnFailure=True,
                   flunkOnFailure=True))

    if snapshots_dir is not None and snapshots_url is not None:
        # Note: string comparison with != (the original used `is not`, which
        # compares identity and is not reliable for strings)
        if snapshots_dir and snapshots_dir[-1] != "/":
            snapshots_dir += "/"
        if snapshots_url and snapshots_url[-1] != "/":
            snapshots_url += "/"
        snapshots_dir = "%s%%(prop:branch)s/" % snapshots_dir
        snapshots_url = "%s%%(prop:branch)s/" % snapshots_url

        builder.addStep(
            steps.SetProperty(name="Computed By %s" % path.basename(__file__),
                              property="package_name",
                              value=compute_package_name,
                              hideStepIf=True,
                              doStepIf=should_package))
        builder.addStep(
            Package(package_name=Property("package_name"),
                    package_files=Property("package_files", None),
                    package_format=Property("package_archive_format"),
                    make_target=Property("package_make_target"),
                    split_debug_package=Property("split_debug_package", True),
                    extra_files=Property("package_extra_files", None),
                    package_script=Interpolate(config.get("package_script", "")),
                    env=compilation_environment,
                    doStepIf=should_package))

        latest_link = Interpolate("%s%%(prop:buildername)s-latest."
                                  "%%(prop:package_archive_format:-tar.xz)s"
                                  % snapshots_dir)
        make_uploader_steps(builder=builder,
                            snapshots_dir=snapshots_dir,
                            snapshots_url=snapshots_url,
                            publish_name="archive",
                            property_name="package_filename",
                            latest_link=latest_link,
                            do_step_if=should_package)

        latest_link = Interpolate("%s%%(prop:buildername)s"
                                  "-latest-debug-symbols.tar.xz"
                                  % snapshots_dir)
        make_uploader_steps(builder=builder,
                            snapshots_dir=snapshots_dir,
                            snapshots_url=snapshots_url,
                            publish_name="debug archive",
                            property_name="debug_package_filename",
                            latest_link=latest_link,
                            do_step_if=should_package_debug)

        builder.addStep(
            MasterCleanSnapshots(
                name="clean old snapshots",
                workdir=Interpolate(snapshots_dir),
                file_prefix=Interpolate("%(prop:buildername)s-"),
                num_to_keep=Property("num_snapshots_to_keep",
                                     snapshots_default_max),
                secondary_file_suffix="-debug-symbols",
                file_extensions=r"\.(?:tar(?:\.[xg]z)?|[a-z]{3,4})$",
                doStepIf=should_package,
                hideStepIf=True))

    return util.BuilderConfig(name=name,
                              workername=worker_name,
                              collapseRequests=True,
                              factory=builder,
                              nextBuild=pick_next_build,
                              locks=[lock.access("counting")])
def make_builder_config(repo_url, name, worker_name, config, lock,
                        snapshots_dir, snapshots_url, snapshots_default_max):
    # Note: string comparison with != (the original used `is not`, which
    # compares identity and is not reliable for strings)
    if snapshots_dir and snapshots_dir[-1] != "/":
        snapshots_dir += "/"
    if snapshots_url and snapshots_url[-1] != "/":
        snapshots_url += "/"

    builder = util.BuildFactory()
    builder.addStep(
        steps.SetProperties(name="Worker Config File",
                            properties=config,
                            hideStepIf=True))
    builder.addStep(
        steps.SetPropertiesFromEnv(
            variables=["WORKER_HOST", "WORKER_REPO_DIR"],
            hideStepIf=True))
    # TODO: use `reference` to a common volume instead? or make the builder
    # dependent on another fetch-only builder so only one builder tries to pull it?
    builder.addStep(
        steps.GitHub(repourl=repo_url,
                     workdir=Property("WORKER_REPO_DIR", None),
                     logEnviron=False,
                     getDescription={
                         "always": True,
                         "tags": True
                     }))
    builder.addStep(
        FileExistsSetProperty(name="config.mk Existence Check",
                              property="already_configured",
                              file="%s/config.mk" % builder.workdir,
                              hideStepIf=True))

    compilation_environment = Property("env", {})
    builder.addStep(
        steps.Configure(command=compute_configure,
                        env=compilation_environment,
                        doStepIf=is_not_configured))
    builder.addStep(
        steps.SetPropertyFromCommand(name="Python (Worker)",
                                     property="cpu_count",
                                     command=["python", "-c", GET_CPU_COUNT],
                                     flunkOnFailure=False,
                                     warnOnFailure=True,
                                     hideStepIf=True,
                                     description="getting CPU count",
                                     descriptionDone="got CPU count"))

    # In at least Buildbot 0.9.12, warningPattern and suppressionList are not
    # renderable, so just get the properties from the config file immediately
    compiler_warning_pattern = config.get(
        "compiler_warning_pattern",
        r"^([^:]+):(\d+):(?:\d+:)? [Ww]arning: (.*)$")
    compiler_warning_extractor = steps.Compile.warnExtractFromRegexpGroups
    compiler_suppression_file = Property("compiler_suppression_file", None)
    compiler_suppression_list = config.get("compiler_suppression_list", None)

    builder.addStep(
        steps.Compile(command=["make",
                               Interpolate("-j%(prop:cpu_count:~1)s")],
                      env=compilation_environment,
                      warningPattern=compiler_warning_pattern,
                      warningExtractor=compiler_warning_extractor,
                      suppressionFile=compiler_suppression_file,
                      suppressionList=compiler_suppression_list))
    builder.addStep(
        steps.Test(command=["make",
                            Interpolate("%(prop:can_run_tests:"
                                        "#?|test|test/runner)s")],
                   env=compilation_environment,
                   warningPattern=compiler_warning_pattern,
                   warningExtractor=compiler_warning_extractor,
                   suppressionFile=compiler_suppression_file,
                   suppressionList=compiler_suppression_list,
                   haltOnFailure=True,
                   flunkOnFailure=True))

    if snapshots_dir is not None and snapshots_url is not None:
        builder.addStep(
            steps.SetProperty(name="Computed By %s" % path.basename(__file__),
                              property="package_name",
                              value=compute_package_name,
                              hideStepIf=True,
                              doStepIf=should_package))
        builder.addStep(
            Package(package_name=Property("package_name"),
                    package_format=Property("package_archive_format"),
                    make_target=Property("package_make_target"),
                    package_directory=Property("package_directory", None),
                    strip_binaries=Property("package_strip_binaries", None),
                    env=compilation_environment,
                    doStepIf=should_package))

        source_path = Property("package_filename")
        target_path = Interpolate("%s%%(prop:package_filename)s" % snapshots_dir)
        target_url = Interpolate("%s%%(prop:package_filename)s" % snapshots_url)
        # This is not an ideal target link calculation since the archive format
        # in package_filename might be fixed up by the Package step, but here
        # only None is converted into tar.xz, which is not exactly the same
        target_link = Interpolate("%s%%(prop:buildername)s-latest."
                                  "%%(prop:package_archive_format:-tar.xz)s"
                                  % snapshots_dir)

        builder.addStep(
            CleaningFileUpload(name="publish",
                               workersrc=source_path,
                               masterdest=target_path,
                               url=target_url,
                               clean=True,
                               doStepIf=should_package))
        builder.addStep(
            steps.MasterShellCommand(
                name="update latest archive",
                command=["ln", "-sf", target_path, target_link],
                logEnviron=False,
                doStepIf=should_package))
        builder.addStep(
            MasterCleanSnapshots(
                name="clean old snapshots",
                workdir=snapshots_dir,
                file_prefix=Interpolate("%(prop:buildername)s-"),
                num_to_keep=Property("num_snapshots_to_keep",
                                     snapshots_default_max),
                doStepIf=should_package))

    return util.BuilderConfig(name=name,
                              workername=worker_name,
                              collapseRequests=True,
                              factory=builder,
                              nextBuild=pick_next_build,
                              locks=[lock.access("exclusive")])
doc = ServoFactory([
    # This is not dynamic because a) we need to pass the logEnviron kwarg
    # and b) changes to the documentation build are already encapsulated
    # in the upload_docs.sh script; any further changes should go through
    # the saltfs repo to avoid leaking the token.
    steps.ShellCommand(command=["etc/ci/upload_docs.sh"],
                       env=envs.doc,
                       # important not to leak token
                       logEnviron=False),
])


def make_win_command(command):
    cd_command = "cd /c/buildbot/slave/windows/build; " + command
    return ["bash", "-l", "-c", cd_command]


windows = ServoFactory([
    # TODO: convert this to use DynamicServoFactory
    # We need to run each command in a bash login shell, which breaks the
    # heuristics used by DynamicServoFactory.make_step
    steps.Compile(command=make_win_command("./mach build -d -v"),
                  env=envs.build_windows),
    steps.Test(command=make_win_command("./mach test-unit"),
               env=envs.build_windows),
    # TODO: run lockfile_changed.sh and manifest_changed.sh scripts
])
build_factory = util.BuildFactory()

# check out the source
checkout_step = steps.GitHub(
    repourl="git://github.com/scummvm/scummvm.git",
    mode="incremental",
    **default_step_kwargs,
)
build_factory.addStep(checkout_step)

# configure a minimal build with only the Director engine enabled
build_factory.addStep(
    steps.Configure(
        command=[
            "./configure",
            "--disable-all-engines",
            "--enable-engine=director",
        ],
        env={"CXX": "ccache g++"},
        **default_step_kwargs,
    ))
build_factory.addStep(steps.Compile(command=["make"], **default_step_kwargs))

master_dir = os.path.dirname(os.path.dirname(__file__))
master_file = os.path.join(master_dir, "scummvm-binary")
worker_file = "scummvm"
build_factory.addStep(
    steps.FileUpload(workersrc=worker_file, masterdest=master_file))
build_factory.addStep(
    steps.Trigger(schedulerNames=["Director Tests"], waitForFinish=True))
def step_make_config(env, config=None):
    step_name = str(config) + ' config' if config else 'defconfig'
    step_name = 'make ' + step_name
    return steps.Compile(command=cmd_make_config(config),
                         haltOnFailure=True,
                         env=env,
                         name=step_name)
windows_nightly = ServoFactory([
    # TODO same comments as windows builder
    steps.Compile(command=make_win_command("./mach build --release"),
                  env=envs.build_windows),
    steps.Test(command=make_win_command("./mach package --release"),
               env=envs.build_windows),
    steps.Compile(
        command=make_win_command("./etc/ci/upload_nightly.sh windows"),
        env=envs.upload_nightly,
        # important not to leak token
        logEnviron=False),
])
from buildbot.plugins import util, steps
from buildbot.process.results import FAILURE

f = util.BuildFactory()
f.addSteps([
    steps.SVN(repourl="http://svn.example.org/trunk/"),
    steps.ClangTidy(command=["clang-tidy", "src/*", "--", "-std=c++11"]),
    # Run the tests only if the compile (second-to-last executed step)
    # did not fail
    steps.Compile(command=["make", "test"],
                  doStepIf=lambda step:
                      step.build.executedSteps[-2].results != FAILURE),
    steps.ShellCommand(command=["rm", "-rf", "build/*"], alwaysRun=True)
])
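The `doStepIf` predicate above can be exercised without a running Buildbot master. The sketch below uses hypothetical stand-in classes (`FakeStep`, `FakeBuild`, `StepUnderTest` are not Buildbot names) to show what the lambda checks: the step runs only when the second-to-last already-executed step did not end in `FAILURE`. Buildbot encodes step results as small integers; `FAILURE` is 2 in `buildbot.process.results`, which is hard-coded here to keep the sketch dependency-free.

```python
FAILURE = 2  # stand-in for buildbot.process.results.FAILURE

class FakeStep:
    """Minimal stand-in for an executed build step."""
    def __init__(self, results):
        self.results = results

class FakeBuild:
    """Minimal stand-in for a build with a list of executed steps."""
    def __init__(self, executed_results):
        self.executedSteps = [FakeStep(r) for r in executed_results]

class StepUnderTest:
    """Minimal stand-in for the step whose doStepIf is being evaluated."""
    def __init__(self, build):
        self.build = build

# Same predicate as in the factory above
def should_run(step):
    return step.build.executedSteps[-2].results != FAILURE

ok = StepUnderTest(FakeBuild([0, 0]))   # compile succeeded (SUCCESS == 0)
bad = StepUnderTest(FakeBuild([2, 0]))  # compile failed
print(should_run(ok), should_run(bad))  # True False
```

The index `-2` rather than `-1` matters: by the time the predicate runs, the current step is already the last entry in `executedSteps`, so the step being checked is the one before it.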