comet_ml.integration.kubeflow
initialize_comet_logger
comet_ml.integration.kubeflow.initialize_comet_logger(experiment, workflow_uid, pod_name)
Logs the Kubeflow task identifiers needed to track your pipeline status in Comet.ml. You need to call this function from every component you want to track with Comet.ml.
Args:
- experiment: An already created Experiment object.
- workflow_uid: string. The Kubeflow workflow UID; see the example below for how to get it automatically from Kubeflow.
- pod_name: string. The Kubeflow pod name; see the example below for how to get it automatically from Kubeflow.
For example:
def my_component() -> None:
    import comet_ml.integration.kubeflow

    experiment = comet_ml.Experiment()

    workflow_uid = "{{workflow.uid}}"
    pod_name = "{{pod_name}}"

    comet_ml.integration.kubeflow.initialize_comet_logger(experiment, workflow_uid, pod_name)
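The function above only defines the tracked component; it still has to be turned into a Kubeflow task. The following is a minimal sketch, assuming KFP v1 and kfp.components.create_component_from_func; the op name, pipeline name, and package list are illustrative assumptions, not part of the Comet API.

import kfp.components
from kfp import dsl

# Hypothetical wiring: build a reusable op from the lightweight component
# defined above. "comet_ml" is installed into the task container so the
# import inside my_component resolves at run time.
my_component_op = kfp.components.create_component_from_func(
    my_component,
    packages_to_install=["comet_ml"],
)

@dsl.pipeline(name="Comet-tracked pipeline")
def my_pipeline():
    # Each call becomes a Kubeflow task; my_component calls
    # initialize_comet_logger internally, so Comet can track its status.
    my_component_op()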
comet_logger_component
comet_ml.integration.kubeflow.comet_logger_component(api_key=None, project_name=None, workspace=None, packages_to_install=None, base_image=None)
Injects the Comet Logger component, which continuously tracks and reports the current pipeline status to Comet.ml.
Args:
api_key: string, optional. Your Comet API Key. If not provided, the value set in the configuration system is used.
project_name: string, optional. The project name where all of the pipeline tasks are logged. If not provided, the value set in the configuration system is used.
workspace: string, optional. The workspace name where all of the pipeline tasks are logged. If not provided, the value set in the configuration system is used.
Example:
from kfp import dsl

@dsl.pipeline(name='ML training pipeline')
def ml_training_pipeline():
    import comet_ml.integration.kubeflow

    comet_ml.integration.kubeflow.comet_logger_component()
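As a follow-up, here is a hedged sketch that passes the documented arguments explicitly instead of relying on the configuration system, alongside the my_component_op from the earlier sketch; the credential, project, and workspace values are placeholders, not real settings.

from kfp import dsl

import comet_ml.integration.kubeflow

@dsl.pipeline(name='ML training pipeline')
def ml_training_pipeline():
    # Inject the logger component with explicit settings; the values below
    # are placeholders.
    comet_ml.integration.kubeflow.comet_logger_component(
        api_key="YOUR_COMET_API_KEY",
        project_name="my-kubeflow-project",
        workspace="my-workspace",
    )

    # Regular pipeline tasks (such as the op from the earlier sketch) run
    # alongside the logger component and are tracked in the same project.
    my_component_op()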