Run starting. Expected test count is: 0
ChangeDataFlowStreamingSpec:
+ https://docs.databricks.com/en/structured-streaming/delta-lake.html 
A dataset that is updated
- should write its deltas to another table as a stream
  + Given a table created with the SQL: 
CREATE TABLE ChangeDataFlowStreamingSpec (
  id INT,
  label STRING,
  partitionKey LONG,
  date DATE,
  timestamp TIMESTAMP
) USING DELTA TBLPROPERTIES (delta.enableChangeDataFeed = true) 
  + And a sink table created with SQL: 
CREATE TABLE streamsink (
  id INT,
  label STRING,
  partitionKey LONG,
  date DATE,
  timestamp TIMESTAMP
) USING DELTA 
  + When we start streaming from ChangeDataFlowStreamingSpec to streamsink with a watermark of 4 seconds and a trigger processing time of 4000 ms 
  + And the initial count in streamsink is 0 
  + And we append 100 rows with a timestamp ranging from 2023-12-11 13:43:35.293 to 2023-12-11 13:45:14.293 
  + And we wait 4000 ms 
  + Then the final row count at Mon Dec 11 13:43:45 UTC 2023 in streamsink is 100 rows 
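The pipeline exercised by this spec can be sketched in Spark Scala roughly as follows. The table names (`ChangeDataFlowStreamingSpec`, `streamsink`), the 4-second watermark, and the 4000 ms trigger come from the steps above; the checkpoint location, the insert-only filter, and the object/app names are assumptions for illustration. This is not the spec's actual implementation and it needs a running Spark session with Delta Lake on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.streaming.Trigger

object ChangeDataFeedStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ChangeDataFlowStreamingSpec-sketch") // assumed name
      .getOrCreate()

    // Stream the change data feed of the source table.
    // Requires TBLPROPERTIES (delta.enableChangeDataFeed = true).
    val changes = spark.readStream
      .format("delta")
      .option("readChangeFeed", "true")
      .table("ChangeDataFlowStreamingSpec")

    // The spec only appends rows, so keep inserts and drop the CDF
    // metadata columns, which the sink's schema does not have.
    val inserts = changes
      .filter(col("_change_type") === "insert")
      .drop("_change_type", "_commit_version", "_commit_timestamp")
      .withWatermark("timestamp", "4 seconds") // watermark from the spec

    val query = inserts.writeStream
      .format("delta")
      .trigger(Trigger.ProcessingTime("4000 milliseconds")) // trigger from the spec
      .option("checkpointLocation", "/tmp/streamsink_checkpoint") // assumed path
      .toTable("streamsink")

    query.awaitTermination()
  }
}
```

With this shape, the 100 appended rows surface in `streamsink` after the next trigger fires, which is why the spec waits 4000 ms before asserting the final count.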