def apply_preset(
    cls, config: V1CompiledOperation, preset: Union[Dict, str] = None
) -> V1CompiledOperation:
    if not preset:
        return config
    preset = OperationSpecification.read(preset, is_preset=True)  # type: V1Operation
    if preset.run_patch:
        config.run = config.run.patch(
            validate_run_patch(preset.run_patch, config.run.kind),
            strategy=preset.patch_strategy,
        )
    patch_compiled = V1CompiledOperation(
        name=preset.name,
        description=preset.description,
        tags=preset.tags,
        presets=preset.presets,
        queue=preset.queue,
        cache=preset.cache,
        hooks=preset.hooks,
        events=preset.events,
        plugins=preset.plugins,
        termination=preset.termination,
        matrix=preset.matrix,
        joins=preset.joins,
        schedule=preset.schedule,
        dependencies=preset.dependencies,
        trigger=preset.trigger,
        conditions=preset.conditions,
        skip_on_upstream_skip=preset.skip_on_upstream_skip,
    )
    return config.patch(patch_compiled, strategy=preset.patch_strategy)
def apply_params(
    cls,
    config: V1CompiledOperation,
    params: Dict = None,
    context: Dict = None,
) -> V1CompiledOperation:
    config.apply_params(params, context)
    return config
def apply_run_contexts(cls, config: V1CompiledOperation, contexts=None):
    if config.has_pipeline:
        raise PolyaxonSchemaError(
            "This method is not allowed on this specification."
        )
    params = config.validate_params(is_template=False, check_runs=True)
    params = {param.name: param for param in params}
    params = cls._update_params_with_contexts(params, contexts)
    parsed_data = Parser.parse_run(config.to_dict(), params)
    return cls.CONFIG.read(parsed_data)
def compile_operation(
    cls, config: V1Operation, override: Dict = None
) -> V1CompiledOperation:
    if override:
        preset = OperationSpecification.read(override, is_preset=True)
        config = config.patch(preset, preset.patch_strategy)
    # Patch run
    component = config.component  # type: V1Component
    if config.run_patch:
        component.run = component.run.patch(
            validate_run_patch(config.run_patch, component.run.kind),
            strategy=config.patch_strategy,
        )
    # Gather contexts io
    config_params = config.params or {}
    contexts = [
        V1IO(name=p) for p in config_params if config_params[p].context_only
    ]
    patch_compiled = V1CompiledOperation(
        name=config.name,
        description=config.description,
        contexts=contexts,
        tags=config.tags,
        presets=config.presets,
        queue=config.queue,
        cache=config.cache,
        hooks=config.hooks,
        actions=config.actions,
        events=config.events,
        plugins=config.plugins,
        termination=config.termination,
        matrix=config.matrix,
        schedule=config.schedule,
        dependencies=config.dependencies,
        trigger=config.trigger,
        conditions=config.conditions,
        skip_on_upstream_skip=config.skip_on_upstream_skip,
    )
    values = [
        {cls.VERSION: config.version},
        component.to_dict(),
        {cls.KIND: kinds.COMPILED_OPERATION},
    ]
    compiled = V1CompiledOperation.read(values)  # type: V1CompiledOperation
    return compiled.patch(patch_compiled, strategy=config.patch_strategy)
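`compile_operation` layers the operation's metadata over the compiled component via `patch(..., strategy=...)`. A rough sketch of strategy-driven merging on plain dicts follows; the strategy names and exact semantics here are assumptions chosen for illustration, not Polyaxon's actual `patch` implementation:

```python
def patch_dict(base, overlay, strategy="post_merge"):
    """Merge overlay into base.

    'replace' overrides base keys wholesale with non-None overlay values;
    'post_merge' merges nested dicts recursively, overlay winning on conflicts.
    """
    if strategy == "replace":
        return {**base, **{k: v for k, v in overlay.items() if v is not None}}
    if strategy == "post_merge":
        result = dict(base)
        for k, v in overlay.items():
            if isinstance(v, dict) and isinstance(result.get(k), dict):
                result[k] = patch_dict(result[k], v, strategy)
            elif v is not None:
                result[k] = v
        return result
    raise ValueError("Unknown strategy: %s" % strategy)


base = {"name": "op", "termination": {"timeout": 60}}
overlay = {"termination": {"maxRetries": 3}, "queue": "default"}
print(patch_dict(base, overlay))
# → {'name': 'op', 'termination': {'timeout': 60, 'maxRetries': 3}, 'queue': 'default'}
```

The recursive branch is what lets a preset contribute, say, `termination.maxRetries` without clobbering the component's existing `termination.timeout`.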
def test_specification_with_quotes(self):
    run_config = CompiledOperationSpecification.read(
        [
            os.path.abspath("tests/fixtures/plain/polyaxonfile_with_quotes.yaml"),
            {"kind": "compiled_operation"},
        ]
    )
    run_config = CompiledOperationSpecification.apply_operation_contexts(run_config)
    expected_run = {
        "kind": V1RunKind.JOB,
        "container": {
            "image": "continuumio/miniconda3",
            "command": ["python"],
            "args": ["-c \"print('Tweet tweet')\""],
            "name": "polyaxon-main",
        },
    }
    assert run_config.run.to_dict() == expected_run
    run_config = V1CompiledOperation.read(run_config.to_dict())
    run_config = CompiledOperationSpecification.apply_operation_contexts(run_config)
    assert run_config.run.to_dict() == expected_run
def resolve(run: BaseRun, compiled_at: datetime = None, resolver_cls=None):
    resolver_cls = resolver_cls or CorePlatformResolver
    try:
        project = run.project
        return resolver.resolve(
            run=run,
            compiled_operation=V1CompiledOperation.read(run.content),
            owner_name=project.owner.name,
            project_name=project.name,
            project_uuid=project.uuid.hex,
            run_uuid=run.uuid.hex,
            run_name=run.name,
            run_path=run.subpath,
            resolver_cls=resolver_cls,
            params=None,
            compiled_at=compiled_at,
            created_at=run.created_at,
            cloning_kind=run.cloning_kind,
            original_uuid=run.original.uuid.hex if run.original_id else None,
        )
    except (
        AccessNotAuthorized,
        AccessNotFound,
        MarshmallowValidationError,
        PolyaxonSchemaError,
        ValidationError,
    ) as e:
        raise PolyaxonCompilerError("Compilation Error: %s" % e) from e
def test_resolver_default_service_ports(self):
    compiled_operation = V1CompiledOperation.read(
        {
            "version": 1.1,
            "kind": kinds.COMPILED_OPERATION,
            "plugins": {
                "auth": False,
                "shm": False,
                "collectLogs": False,
                "collectArtifacts": True,
                "collectResources": True,
            },
            "run": {
                "kind": V1RunKind.SERVICE,
                "ports": [1212, 1234],
                "container": {"image": "test", "command": "{{ ports[0] }}"},
            },
        }
    )
    spec = resolve_contexts(
        namespace="test",
        owner_name="user",
        project_name="project",
        project_uuid="uuid",
        run_uuid="uuid",
        run_name="run",
        run_path="test",
        compiled_operation=compiled_operation,
        artifacts_store=None,
        connection_by_names={},
        iteration=12,
        created_at=None,
        compiled_at=None,
    )
    assert spec == {
        "globals": {
            "owner_name": "user",
            "project_name": "project",
            "project_unique_name": "user.project",
            "project_uuid": "uuid",
            "run_info": "user.project.runs.uuid",
            "name": "run",
            "uuid": "uuid",
            "context_path": "/plx-context",
            "artifacts_path": "/plx-context/artifacts",
            "run_artifacts_path": "/plx-context/artifacts/test",
            "run_outputs_path": "/plx-context/artifacts/test/outputs",
            "namespace": "test",
            "iteration": 12,
            "ports": [1212, 1234],
            "base_url": "/services/v1/test/user/project/runs/uuid",
            "created_at": None,
            "compiled_at": None,
            "cloning_kind": None,
            "original_uuid": None,
        },
        "init": {},
        "connections": {},
    }
def test_get_from_spec(self):
    compiled_operation = V1CompiledOperation.read(
        {
            "version": 1.1,
            "kind": kinds.COMPILED_OPERATION,
            "plugins": {
                "auth": False,
                "shm": False,
                "collectLogs": False,
                "collectArtifacts": False,
                "syncStatuses": False,
                "externalHost": True,
            },
            "run": {"kind": V1RunKind.JOB, "container": {"image": "test"}},
        }
    )
    spec = PluginsContextsSpec.from_config(compiled_operation.plugins)
    assert spec.auth is False
    assert spec.docker is False
    assert spec.shm is False
    assert spec.collect_artifacts is False
    assert spec.collect_logs is False
    assert spec.sync_statuses is False
    assert spec.external_host is True
def test_sequential_pipeline(self):
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath(
                "tests/fixtures/pipelines/simple_sequential_pipeline.yml"
            ),
            {"kind": "compiled_operation"},
        ]
    )
    run_config = CompiledOperationSpecification.apply_context(run_config)
    assert run_config.run is not None
    assert len(run_config.run.operations) == 4
    assert run_config.run.operations[0].name == "job1"
    assert run_config.run.operations[1].name == "job2"
    assert run_config.run.operations[1].dependencies == ["job1"]
    assert run_config.run.operations[2].name == "experiment1"
    assert run_config.run.operations[2].dependencies == ["job2"]
    assert run_config.run.operations[3].name == "experiment2"
    assert run_config.run.operations[3].dependencies == ["experiment1"]
    dag_strategy = run_config.run
    assert dag_strategy.sort_topologically(dag_strategy.dag) == [
        ["job1"],
        ["job2"],
        ["experiment1"],
        ["experiment2"],
    ]
    assert run_config.schedule is None
def test_parallel_pipeline(self):
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath(
                "tests/fixtures/pipelines/simple_parallel_pipeline.yml"
            ),
            {"kind": "compiled_operation"},
        ]
    )
    run_config = CompiledOperationSpecification.apply_context(run_config)
    assert len(run_config.run.operations) == 4
    assert run_config.run.operations[0].name == "job1"
    assert run_config.run.operations[0].dependencies is None
    assert run_config.run.operations[1].name == "job2"
    assert run_config.run.operations[1].dependencies is None
    assert run_config.run.operations[2].name == "experiment1"
    assert run_config.run.operations[2].dependencies is None
    assert run_config.run.operations[3].name == "experiment2"
    assert run_config.run.operations[3].dependencies is None
    dag_strategy = run_config.run
    assert set(dag_strategy.sort_topologically(dag_strategy.dag)[0]) == {
        "job1",
        "job2",
        "experiment1",
        "experiment2",
    }
    assert run_config.run.concurrency == 2
    assert run_config.schedule is None
def test_dag_pipeline(self):
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath("tests/fixtures/pipelines/simple_dag_pipeline.yml"),
            {"kind": "compiled_operation"},
        ]
    )
    run_config = CompiledOperationSpecification.apply_context(run_config)
    assert len(run_config.run.operations) == 5
    assert run_config.run.operations[0].name == "job1"
    assert run_config.run.operations[1].name == "experiment1"
    assert run_config.run.operations[1].dependencies == ["job1"]
    assert run_config.run.operations[2].name == "experiment2"
    assert run_config.run.operations[2].dependencies == ["job1"]
    assert run_config.run.operations[3].name == "experiment3"
    assert run_config.run.operations[3].dependencies == ["job1"]
    assert run_config.run.operations[4].name == "job2"
    assert run_config.run.operations[4].dependencies == [
        "experiment1",
        "experiment2",
        "experiment3",
    ]
    dag_strategy = run_config.run
    sorted_dag = dag_strategy.sort_topologically(dag_strategy.dag)
    assert sorted_dag[0] == ["job1"]
    assert set(sorted_dag[1]) == {"experiment1", "experiment2", "experiment3"}
    assert sorted_dag[2] == ["job2"]
    assert run_config.run.concurrency == 3
    assert run_config.schedule is None
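The grouped ordering asserted above (independent operations batched into the same layer) matches a layered Kahn-style topological sort. A minimal self-contained sketch of that idea on plain adjacency dicts; the names and the `node -> dependency set` representation are illustrative, not Polyaxon's internal `dag` structure:

```python
def sort_topologically(dag):
    """Layered topological sort: each output list holds the nodes whose
    dependencies are all satisfied by earlier layers."""
    # dag maps node -> set of nodes it depends on
    pending = {node: set(deps) for node, deps in dag.items()}
    layers = []
    while pending:
        # Nodes with no unsatisfied dependencies form the next layer
        ready = sorted(n for n, deps in pending.items() if not deps)
        if not ready:
            raise ValueError("Cycle detected in DAG")
        layers.append(ready)
        for n in ready:
            del pending[n]
        for deps in pending.values():
            deps.difference_update(ready)
    return layers


dag = {
    "job1": set(),
    "experiment1": {"job1"},
    "experiment2": {"job1"},
    "experiment3": {"job1"},
    "job2": {"experiment1", "experiment2", "experiment3"},
}
print(sort_topologically(dag))
# → [['job1'], ['experiment1', 'experiment2', 'experiment3'], ['job2']]
```

This reproduces the shape the DAG tests check: a sequential chain yields one node per layer, a fully parallel pipeline yields a single layer.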
def test_matrix_file_passes_int_float_types(self):
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath(
                "tests/fixtures/pipelines/matrix_file_with_int_float_types.yml"
            ),
            {"kind": "compiled_operation"},
        ]
    )
    run_config = CompiledOperationSpecification.apply_context(run_config)
    assert run_config.version == 1.05
    assert run_config.is_dag_run is True
    assert run_config.has_pipeline is True
    assert run_config.schedule is None
    assert run_config.run.concurrency == 4
    assert isinstance(run_config.run, V1Dag)
    assert run_config.run.early_stopping is None
    assert run_config.run.kind == V1Dag.IDENTIFIER
    assert len(run_config.run.operations) == 2
    assert len(run_config.run.components) == 1
    template_grid = run_config.run.operations[1].parallel
    assert isinstance(template_grid, V1GridSearch)
    assert isinstance(template_grid.params["param1"], V1HpChoice)
    assert isinstance(template_grid.params["param2"], V1HpChoice)
    assert template_grid.params["param1"].to_dict() == {
        "kind": "choice",
        "value": [1, 2],
    }
    assert template_grid.params["param2"].to_dict() == {
        "kind": "choice",
        "value": [3.3, 4.4],
    }
    assert template_grid.concurrency == 2
    assert template_grid.early_stopping is None
def resolve_contexts(
    namespace: str,
    owner_name: str,
    project_name: str,
    project_uuid: str,
    run_uuid: str,
    run_name: str,
    run_path: str,
    compiled_operation: V1CompiledOperation,
    artifacts_store: V1ConnectionType,
    connection_by_names: Dict[str, V1ConnectionType],
    iteration: int,
    created_at: datetime,
    compiled_at: datetime,
    schedule_at: datetime = None,
    started_at: datetime = None,
    finished_at: datetime = None,
    duration: int = None,
    cloning_kind: V1CloningKind = None,
    original_uuid: str = None,
) -> Dict:
    run_kind = compiled_operation.get_run_kind()
    if run_kind not in CONTEXTS_MANAGERS:
        raise PolyaxonCompilerError(
            "Contexts manager Error. "
            "Specification with run kind: {} is not supported in this deployment version".format(
                run_kind
            )
        )
    resolved_contexts = resolve_globals_contexts(
        namespace=namespace,
        owner_name=owner_name,
        project_name=project_name,
        project_uuid=project_uuid,
        run_uuid=run_uuid,
        run_name=run_name,
        run_path=run_path,
        iteration=iteration,
        created_at=created_at,
        compiled_at=compiled_at,
        schedule_at=schedule_at,
        started_at=started_at,
        finished_at=finished_at,
        duration=duration,
        plugins=compiled_operation.plugins,
        artifacts_store=artifacts_store,
        cloning_kind=cloning_kind,
        original_uuid=original_uuid,
    )
    return CONTEXTS_MANAGERS[run_kind].resolve(
        namespace=namespace,
        owner_name=owner_name,
        project_name=project_name,
        run_uuid=run_uuid,
        contexts=resolved_contexts,
        compiled_operation=compiled_operation,
        connection_by_names=connection_by_names,
    )
def apply_run_connections_params(
    cls,
    config: V1CompiledOperation,
    artifact_store: str = None,
    contexts: Dict = None,
) -> V1CompiledOperation:
    params = config.validate_params(is_template=False, check_runs=True)
    params = {param.name: param for param in params}
    params = cls._update_params_with_contexts(params, contexts)
    if config.run.kind in {V1RunKind.JOB, V1RunKind.SERVICE}:
        if config.run.connections:
            config.run.connections = Parser.parse_section(
                config.run.connections, params=params, parse_params=True
            )
        if config.run.init:
            init = []
            for i in config.run.init:
                if i.artifacts and not i.connection:
                    i.connection = artifact_store
                resolved_i = V1Init.from_dict(
                    Parser.parse_section(
                        i.to_dict(), params=params, parse_params=True
                    )
                )
                init.append(resolved_i)
            config.run.init = init
    return config
def resolve(
    owner_name: str,
    project_name: str,
    project_uuid: str,
    run_name: str,
    run_uuid: str,
    run_path: str,
    compiled_operation: V1CompiledOperation,
    params: Optional[Dict[str, Dict]],
    run=None,
    resolver_cls=None,
):
    resolver_cls = resolver_cls or CoreResolver
    run_kind = compiled_operation.get_run_kind()
    if run_kind not in resolver_cls.KINDS:
        raise PolyaxonCompilerError(
            "Specification with run kind: {} is not supported in this deployment version.".format(
                run_kind
            )
        )
    resolver = resolver_cls(
        run=run,
        compiled_operation=compiled_operation,
        owner_name=owner_name,
        project_name=project_name,
        project_uuid=project_uuid,
        run_name=run_name,
        run_path=run_path,
        run_uuid=run_uuid,
        params=params,
    )
    if resolver:
        return resolver, resolver.resolve()
def test_resolver_default_contexts(self):
    context_root = container_contexts.CONTEXT_ROOT
    compiled_operation = V1CompiledOperation.read(
        {
            "version": 1.1,
            "kind": kinds.COMPILED_OPERATION,
            "plugins": {
                "auth": False,
                "shm": False,
                "collectLogs": False,
                "collectArtifacts": False,
                "collectResources": False,
            },
            "run": {"kind": V1RunKind.JOB, "container": {"image": "test"}},
        }
    )
    spec = resolve_contexts(
        namespace="test",
        owner_name="user",
        project_name="project",
        project_uuid="uuid",
        run_uuid="uuid",
        run_name="run",
        run_path="test",
        compiled_operation=compiled_operation,
        artifacts_store=None,
        connection_by_names={},
        iteration=None,
        created_at=None,
        compiled_at=None,
    )
    assert spec == {
        "globals": {
            "owner_name": "user",
            "project_unique_name": "user.project",
            "project_name": "project",
            "project_uuid": "uuid",
            "run_info": "user.project.runs.uuid",
            "context_path": context_root,
            "artifacts_path": "{}/artifacts".format(context_root),
            "name": "run",
            "uuid": "uuid",
            "namespace": "test",
            "iteration": None,
            "created_at": None,
            "compiled_at": None,
            "schedule_at": None,
            "started_at": None,
            "finished_at": None,
            "duration": None,
            "cloning_kind": None,
            "original_uuid": None,
        },
        "init": {},
        "connections": {},
    }
def test_matrix_file_passes(self):
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath("tests/fixtures/pipelines/matrix_file.yml"),
            {"kind": "compiled_operation"},
        ]
    )
    run_config = CompiledOperationSpecification.apply_operation_contexts(run_config)
    assert run_config.version == 1.1
    assert run_config.is_dag_run is True
    assert run_config.has_pipeline is True
    assert run_config.schedule is None
    assert run_config.run.concurrency == 4
    assert isinstance(run_config.run, V1Dag)
    assert run_config.run.early_stopping is None
    assert run_config.run.kind == V1Dag.IDENTIFIER
    assert len(run_config.run.operations) == 2
    assert len(run_config.run.components) == 1
    template_hyperband = run_config.run.operations[1].matrix
    assert isinstance(template_hyperband.params["lr"], V1HpLinSpace)
    assert isinstance(template_hyperband.params["loss"], V1HpChoice)
    assert template_hyperband.params["lr"].to_dict() == {
        "kind": "linspace",
        "value": {"start": 0.01, "stop": 0.1, "num": 5},
    }
    assert template_hyperband.params["loss"].to_dict() == {
        "kind": "choice",
        "value": ["MeanSquaredError", "AbsoluteDifference"],
    }
    assert template_hyperband.params["normal_rate"].to_dict() == {
        "kind": "normal",
        "value": {"loc": 0, "scale": 0.9},
    }
    assert template_hyperband.params["dropout"].to_dict() == {
        "kind": "qloguniform",
        "value": {"high": 0.8, "low": 0, "q": 0.1},
    }
    assert template_hyperband.params["activation"].to_dict() == {
        "kind": "pchoice",
        "value": [["relu", 0.1], ["sigmoid", 0.8]],
    }
    assert template_hyperband.params["model"].to_dict() == {
        "kind": "choice",
        "value": ["CDNA", "DNA", "STP"],
    }
    assert template_hyperband.concurrency == 2
    assert isinstance(template_hyperband, V1Hyperband)
def is_kind_supported(cls, compiled_operation: V1CompiledOperation):
    run_kind = compiled_operation.get_run_kind()
    if run_kind not in cls.KINDS:
        raise PolyaxonCompilerError(
            "Resolver Error. "
            "Specification with run kind: {} "
            "is not supported in this deployment version.".format(run_kind)
        )
def test_pipeline_with_no_ops_raises(self):
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath("tests/fixtures/pipelines/pipeline_with_no_ops.yml"),
            {"kind": "compiled_operation"},
        ]
    )
    with self.assertRaises(PolyaxonSchemaError):
        CompiledOperationSpecification.apply_context(run_config)
def test_run_simple_file_passes(self):
    run_config = V1CompiledOperation.read(
        [
            reader.read(
                os.path.abspath("tests/fixtures/typing/run_cmd_simple_file.yml")
            ),
            {"kind": "compiled_operation"},
        ]
    )
    assert run_config.inputs[0].value == "MeanSquaredError"
    assert run_config.inputs[1].value is None
    validated_params = run_config.validate_params()
    assert run_config.inputs[0].value == "MeanSquaredError"
    assert run_config.inputs[1].value is None
    assert {
        "loss": V1Param(value="MeanSquaredError"),
        "num_masks": V1Param(value=None),
    } == {p.name: p.param for p in validated_params}
    with self.assertRaises(ValidationError):
        CompiledOperationSpecification.apply_context(run_config)
    validated_params = run_config.validate_params(
        params={"num_masks": {"value": 100}}
    )
    assert {
        "loss": V1Param(value="MeanSquaredError"),
        "num_masks": V1Param(value=100),
    } == {p.name: p.param for p in validated_params}
    assert run_config.run.container.args == [
        "video_prediction_train",
        "--num_masks={{num_masks}}",
        "--loss={{loss}}",
    ]
    with self.assertRaises(ValidationError):
        # Applying context before applying params
        CompiledOperationSpecification.apply_context(run_config)
    run_config.apply_params(params={"num_masks": {"value": 100}})
    run_config = CompiledOperationSpecification.apply_context(run_config)
    run_config = CompiledOperationSpecification.apply_run_contexts(run_config)
    assert run_config.version == 1.05
    assert run_config.tags == ["foo", "bar"]
    container = run_config.run.container
    assert isinstance(container, k8s_schemas.V1Container)
    assert container.image == "my_image"
    assert container.command == ["/bin/sh", "-c"]
    assert container.args == [
        "video_prediction_train",
        "--num_masks=100",
        "--loss=MeanSquaredError",
    ]
def test_resolver_init_and_connections_contexts(self):
    store = V1ConnectionType(
        name="test_claim",
        kind=V1ConnectionKind.VOLUME_CLAIM,
        schema=V1ClaimConnection(
            mount_path="/claim/path", volume_claim="claim", read_only=True
        ),
    )
    compiled_operation = V1CompiledOperation.read(
        {
            "version": 1.05,
            "kind": kinds.COMPILED_OPERATION,
            "plugins": {
                "auth": False,
                "shm": False,
                "collectLogs": False,
                "collectArtifacts": False,
                "collectResources": False,
            },
            "run": {
                "kind": V1RunKind.JOB,
                "container": {"image": "test"},
                "connections": [store.name],
                "init": [{"connection": store.name}],
            },
        }
    )
    spec = resolve_contexts(
        namespace="test",
        owner_name="user",
        project_name="project",
        project_uuid="uuid",
        run_uuid="uuid",
        run_name="run",
        run_path="test",
        compiled_operation=compiled_operation,
        artifacts_store=store,
        connection_by_names={store.name: store},
        iteration=12,
    )
    assert spec == {
        "globals": {
            "owner_name": "user",
            "project_unique_name": "user.project",
            "project_name": "project",
            "project_uuid": "uuid",
            "name": "run",
            "uuid": "uuid",
            "artifacts_path": "/claim/path/test",
            "namespace": "test",
            "iteration": 12,
            "run_info": "user.project.runs.uuid",
        },
        "init": {"test_claim": store.schema.to_dict()},
        "connections": {"test_claim": store.schema.to_dict()},
    }
def convert(
    namespace: str,
    owner_name: str,
    project_name: str,
    run_name: str,
    run_uuid: str,
    run_path: str,
    compiled_operation: V1CompiledOperation,
    artifacts_store: Optional[V1ConnectionType],
    connection_by_names: Optional[Dict[str, V1ConnectionType]],
    secrets: Optional[Iterable[V1K8sResourceType]],
    config_maps: Optional[Iterable[V1K8sResourceType]],
    polyaxon_sidecar: V1PolyaxonSidecarContainer = None,
    polyaxon_init: V1PolyaxonInitContainer = None,
    default_sa: str = None,
    converters: Dict[str, Any] = CORE_CONVERTERS,
    internal_auth: bool = False,
    default_auth: bool = False,
) -> Dict:
    if compiled_operation.has_pipeline:
        raise PolyaxonCompilerError(
            "Converter Error. "
            "Specification with matrix/dag/schedule section is not supported in this function."
        )
    run_kind = compiled_operation.get_run_kind()
    if run_kind not in converters:
        raise PolyaxonCompilerError(
            "Converter Error. "
            "Specification with run kind: {} is not supported in this deployment version.".format(
                run_kind
            )
        )
    converter = converters[run_kind](
        owner_name=owner_name,
        project_name=project_name,
        run_name=run_name,
        run_uuid=run_uuid,
        namespace=namespace,
        polyaxon_init=polyaxon_init,
        polyaxon_sidecar=polyaxon_sidecar,
        internal_auth=internal_auth,
        run_path=run_path,
    )
    if converter:
        resource = converter.get_resource(
            compiled_operation=compiled_operation,
            artifacts_store=artifacts_store,
            connection_by_names=connection_by_names,
            secrets=secrets,
            config_maps=config_maps,
            default_sa=default_sa,
            default_auth=default_auth,
        )
        api = k8s_client.ApiClient()
        return api.sanitize_for_serialization(resource)
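`convert` looks the run kind up in a converter mapping and fails fast on unsupported kinds. That dispatch pattern, reduced to a self-contained sketch (the class and registry names here are illustrative, not Polyaxon's real converters):

```python
class CompilerError(Exception):
    """Raised when no converter is registered for a run kind."""


class JobConverter:
    """Toy converter: wraps a spec in a resource-like dict."""

    def get_resource(self, spec):
        return {"kind": "Job", "spec": spec}


# Registry mapping run kind -> converter class
CONVERTERS = {"job": JobConverter}


def convert_resource(run_kind, spec, converters=CONVERTERS):
    if run_kind not in converters:
        raise CompilerError(
            "Run kind: {} is not supported in this deployment version.".format(run_kind)
        )
    # Instantiate the registered converter and delegate to it
    return converters[run_kind]().get_resource(spec)


print(convert_resource("job", {"image": "test"}))
# → {'kind': 'Job', 'spec': {'image': 'test'}}
```

Passing the registry as a parameter (as `convert` does with `converters=CORE_CONVERTERS`) lets deployments swap in an extended mapping without touching the dispatch logic.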
def test_apply_params_extends_connections_and_init(self):
    content = {
        "version": 1.1,
        "kind": "compiled_operation",
        "inputs": [
            {"name": "docker_image", "type": types.IMAGE},
            {"name": "git_repo", "type": types.GIT},
        ],
        "run": {
            "kind": V1RunKind.JOB,
            "connections": ["{{ params.docker_image.connection }}"],
            "container": {
                "name": "polyaxon-main",
                "image": "{{ docker_image }}",
                "command": "train",
            },
        },
    }
    run_config = V1CompiledOperation.read(content)
    # no params
    with self.assertRaises(ValidationError):
        CompiledOperationSpecification.apply_operation_contexts(run_config)
    params = {
        "docker_image": {
            "value": "destination:tag",
            "connection": "docker-registry",
        },
        "git_repo": {
            "value": V1GitType(revision="foo"),
            "connection": "repo-connection",
        },
    }
    assert run_config.inputs[0].value is None
    assert run_config.inputs[1].value is None
    validated_params = run_config.validate_params(params=params)
    run_config.apply_params(params=params)
    assert params == {p.name: p.param.to_dict() for p in validated_params}
    assert run_config.inputs[0].connection == "docker-registry"
    assert run_config.inputs[1].connection == "repo-connection"
    run_config = CompiledOperationSpecification.apply_operation_contexts(run_config)
    run_config = CompiledOperationSpecification.apply_params(run_config)
    run_config = CompiledOperationSpecification.apply_runtime_contexts(run_config)
    assert run_config.run.connections == ["docker-registry"]
    assert run_config.run.container.image == "destination:tag"
def test_cyclic_pipeline_raises(self):
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath("tests/fixtures/pipelines/cyclic_pipeline.yml"),
            {"kind": "compiled_operation"},
        ]
    )
    assert run_config.is_dag_run is True
    assert run_config.has_pipeline is True
    with self.assertRaises(PolyaxonSchemaError):
        CompiledOperationSpecification.apply_context(run_config)
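A cyclic pipeline has no valid topological order, which is why `apply_context` raises here. Kahn-style elimination surfaces a cycle as a round in which no node is dependency-free; a standalone sketch of that check on plain dicts (not Polyaxon's actual validator):

```python
def has_cycle(dag):
    """Return True if the dependency graph contains a cycle.

    dag maps node -> set of nodes it depends on.
    """
    pending = {n: set(deps) for n, deps in dag.items()}
    while pending:
        ready = [n for n, deps in pending.items() if not deps]
        if not ready:
            # Every remaining node waits on another remaining node: a cycle
            return True
        for n in ready:
            del pending[n]
        for deps in pending.values():
            deps.difference_update(ready)
    return False


print(has_cycle({"job1": {"job2"}, "job2": {"job1"}}))
# → True
```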
def test_validation_for_required_inputs_outputs_raises(self):
    # Get compiled_operation data
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath("tests/fixtures/typing/required_inputs.yml"),
            {"kind": "compiled_operation"},
        ]
    )
    # Inputs don't have delayed validation by default
    with self.assertRaises(ValidationError):
        run_config.validate_params(is_template=False, check_runs=True)
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath("tests/fixtures/typing/required_outputs.yml"),
            {"kind": "compiled_operation"},
        ]
    )
    # Outputs have delayed validation by default
    run_config.validate_params(is_template=False, check_runs=True)
def test_no_params_for_required_inputs_outputs_raises(self):
    # Get compiled_operation data
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath("tests/fixtures/typing/required_inputs.yml"),
            {"kind": "compiled_operation"},
        ]
    )
    # Inputs don't have delayed validation by default
    with self.assertRaises(ValidationError):
        CompiledOperationSpecification.apply_context(run_config)
    run_config = V1CompiledOperation.read(
        [
            os.path.abspath("tests/fixtures/typing/required_outputs.yml"),
            {"kind": "compiled_operation"},
        ]
    )
    # Outputs have delayed validation by default
    CompiledOperationSpecification.apply_context(run_config)
def _apply_runtime_contexts(
    cls,
    config: V1CompiledOperation,
    contexts: Dict = None,
    param_spec: Dict[str, ParamSpec] = None,
) -> V1CompiledOperation:
    if not param_spec:
        param_spec = cls.calculate_context_spec(
            config=config, contexts=contexts, should_be_resolved=True
        )
    parsed_data = Parser.parse_runtime(config.to_dict(), param_spec)
    return cls.CONFIG.read(parsed_data)
def test_pipeline_ops_not_corresponding_to_components(self):
    run_config = V1CompiledOperation.read(
        [
            reader.read(
                os.path.abspath(
                    "tests/fixtures/pipelines/pipeline_ops_not_corresponding_to_components.yml"
                )
            ),
            {"kind": "compiled_operation"},
        ]
    )
    with self.assertRaises(PolyaxonSchemaError):
        CompiledOperationSpecification.apply_context(run_config)
def test_apply_context_raises_with_required_inputs(self):
    content = {
        "version": 1.1,
        "kind": "component",
        "inputs": [
            {"name": "lr", "type": types.FLOAT},
            {"name": "num_steps", "type": types.INT},
        ],
        "run": {
            "kind": V1RunKind.JOB,
            "container": {
                "name": "polyaxon-main",
                "image": "test/test:latest",
                "command": "train",
            },
        },
    }
    component_config = V1Component.read(content)
    assert component_config.to_dict() == content
    content = {
        "version": 1.1,
        "kind": "compiled_operation",
        "inputs": [
            {"name": "lr", "type": types.FLOAT},
            {"name": "num_steps", "type": types.INT},
        ],
        "run": {
            "kind": V1RunKind.JOB,
            "container": {
                "name": "polyaxon-main",
                "image": "test/test:latest",
                "command": "train",
            },
        },
    }
    run_config = V1CompiledOperation.read(content)
    # Raise because required inputs are not met
    with self.assertRaises(ValidationError):
        CompiledOperationSpecification.apply_operation_contexts(run_config)
    # Validation for template should pass
    validated_params = run_config.validate_params()
    assert {"lr": None, "num_steps": None} == {
        p.name: p.param.value for p in validated_params
    }
    # Validation for non template should raise
    with self.assertRaises(ValidationError):
        run_config.validate_params(is_template=False)
def test_spec_without_io_and_params_raises(self):
    content = {
        "version": 1.1,
        "kind": "component",
        "run": {
            "kind": V1RunKind.JOB,
            "container": {
                "name": "polyaxon-main",
                "image": "test/test:latest",
                "command": "train",
            },
        },
    }
    config = V1Component.read(content)
    assert config.to_dict() == content
    content = {
        "version": 1.1,
        "kind": "compiled_operation",
        "run": {
            "kind": V1RunKind.JOB,
            "container": {
                "name": "polyaxon-main",
                "image": "test/test:latest",
                "command": "train",
            },
        },
    }
    config = V1CompiledOperation.read(content)
    config = CompiledOperationSpecification.apply_operation_contexts(config)
    assert config.to_dict() == content
    # Add params
    content["params"] = {"lr": 0.1}
    with self.assertRaises(ValidationError):
        V1CompiledOperation.read(content)