I have a streaming job that consumes event-time-sensitive data from multiple sources.
It aggregates the data and writes out the result for an aggregate when it completes or times out.
Since there are multiple data sources, some results may be for last week, some for today, some even for future dates.
I need to take my "result" data frame and append its rows to Parquet files, one file per day (based on event time). How can I write a single data frame out to multiple Parquet files like this?
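To make the desired behavior concrete, here is a minimal plain-Python sketch of the routing I'm after: group each record by the date portion of its event time and append each group to that day's file. (This uses JSON lines and a hypothetical `write_results_per_day` helper purely for illustration; the real job would use a data-frame writer's partitioning support, e.g. Spark's `DataFrameWriter.partitionBy`, to produce Parquet.)

```python
import json
import os
from collections import defaultdict
from datetime import datetime

def write_results_per_day(records, out_dir):
    """Group records by the date part of their event time and append
    each group to that day's output file (one file per event-time day)."""
    by_day = defaultdict(list)
    for rec in records:
        # Event time is assumed to be an ISO-8601 string; its date
        # portion becomes the per-day partition key.
        day = datetime.fromisoformat(rec["event_time"]).date().isoformat()
        by_day[day].append(rec)

    os.makedirs(out_dir, exist_ok=True)
    for day, recs in by_day.items():
        # Append mode, so later batches add to an existing day's file
        # (a day may receive late data from last week or early data
        # for future dates).
        path = os.path.join(out_dir, f"day={day}.jsonl")
        with open(path, "a") as f:
            for rec in recs:
                f.write(json.dumps(rec) + "\n")
    return sorted(by_day)

records = [
    {"event_time": "2023-05-01T10:00:00", "value": 1},
    {"event_time": "2023-05-08T09:30:00", "value": 2},
    {"event_time": "2023-05-01T23:59:00", "value": 3},
]
days = write_results_per_day(records, "out")
print(days)  # -> ['2023-05-01', '2023-05-08']
```

The `day=...` naming mirrors Hive-style partition directories, which is how partitioned Parquet output is typically laid out on disk.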