lakehouse.databricks.common.create_type_two_condition

lakehouse.databricks.common.create_type_two_condition(spark, type_two_keys) → str

Creates the Type 2 condition based on the 'type_two_keys' provided and logs the function's start and end times using 'post_la_data'.

Parameters

spark : spark context

Spark context object passed from the calling Spark instance.

type_two_keys : List[str]

List of columns to be used as Type 2 condition keys.

Returns

str

A string in the format "source.col1 <> dest.col1 or source.col2 <> dest.col2 ...", where "col1", "col2", etc. are the columns in 'type_two_keys'.
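The string construction described above can be sketched as follows. This is a hypothetical illustration, not the actual implementation: the real function also receives a Spark context and logs its start and end times via 'post_la_data', which is omitted here.

```python
from typing import List


def create_type_two_condition_sketch(type_two_keys: List[str]) -> str:
    """Build a Type 2 change-detection predicate from key columns.

    Joins one inequality comparison per key column with 'or', producing
    a string such as "source.col1 <> dest.col1 or source.col2 <> dest.col2".
    """
    return " or ".join(f"source.{col} <> dest.{col}" for col in type_two_keys)


# Example usage:
condition = create_type_two_condition_sketch(["name", "address"])
# "source.name <> dest.name or source.address <> dest.address"
```

A predicate in this form is typically passed to a merge statement's match clause to detect rows whose Type 2 attributes have changed and therefore require a new history record.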