lakehouse.databricks.common.vacumm_and_optimize_all_tables
- lakehouse.databricks.common.vacumm_and_optimize_all_tables(spark, delta_db, retention_period=None)
A function to vacuum and optimize all Delta tables in a specific Hive database/schema
Parameters
- spark: spark context
Spark context passed from the calling Spark instance
- delta_db: string
The name of the Delta database containing the tables
- retention_period: int, in hours, default=None
The number of hours of history to retain (7 days is 168 hours). If no value is provided, the default retention is used; the Databricks default is 7 days
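A minimal sketch of how such a helper might work, assuming it lists the tables in the database with `SHOW TABLES` and then issues `OPTIMIZE` and `VACUUM` statements per table (the SQL-building step is split into its own function here so the generated statements are visible; this is an illustration, not the library's actual implementation):

```python
def build_maintenance_sql(delta_db, table, retention_period=None):
    """Return the OPTIMIZE and VACUUM statements for one Delta table.

    If retention_period (in hours) is None, VACUUM omits the RETAIN
    clause and falls back to the default retention (7 days on Databricks).
    """
    optimize = f"OPTIMIZE {delta_db}.{table}"
    if retention_period is None:
        vacuum = f"VACUUM {delta_db}.{table}"
    else:
        vacuum = f"VACUUM {delta_db}.{table} RETAIN {retention_period} HOURS"
    return [optimize, vacuum]


def vacuum_and_optimize_all_tables(spark, delta_db, retention_period=None):
    """Sketch: run OPTIMIZE and VACUUM on every table in delta_db."""
    # SHOW TABLES returns one row per table; tableName holds the table name.
    rows = spark.sql(f"SHOW TABLES IN {delta_db}").collect()
    for row in rows:
        for stmt in build_maintenance_sql(delta_db, row.tableName,
                                          retention_period):
            spark.sql(stmt)
```

For example, `build_maintenance_sql("sales", "orders", 168)` yields `OPTIMIZE sales.orders` followed by `VACUUM sales.orders RETAIN 168 HOURS`, while passing no retention period yields a bare `VACUUM sales.orders`.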