I'd like to implement a system where data lands in a 'source' volume holding the initial raw data (in an initial set of MapR-DB JSON tables), and is then split up and transferred to different sub-volumes for different use cases, each containing MapR-DB tables for that split-out data.

How is this kind of data transfer usually done? From the documentation it looks like this is done through MapR Streams, but I'm not exactly sure how it would be implemented. It seems like I'd write a Java or Spark app using the Streams API to transfer the data between tables, but is there another way to have this run on a schedule (e.g. some configuration in the MCS)? Is this even the normal way to set up this kind of data hierarchy? Thanks.
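For context, here's a rough sketch of the splitting step I had in mind. The table paths and the `use_case` field are made up for illustration, and the MapR-specific calls are stubbed with in-memory maps: in a real app I'd expect `consume` to be a Kafka-API consumer subscribed to a stream topic and `writeTo` to be an OJAI `Table.insertOrReplace()` against the destination JSON table, but I haven't verified the exact APIs.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class SplitRouter {
    // Hypothetical sub-volume table paths, one per use case.
    static final Map<String, String> ROUTES = Map.of(
            "clickstream", "/data/analytics/clicks_table",
            "billing", "/data/billing/invoices_table");

    // In-memory stand-in for the destination MapR-DB JSON tables.
    static final Map<String, List<Map<String, String>>> SINKS = new HashMap<>();

    // Stub: a real app would call Table.insertOrReplace(document) here.
    static void writeTo(String tablePath, Map<String, String> doc) {
        SINKS.computeIfAbsent(tablePath, k -> new ArrayList<>()).add(doc);
    }

    // Route one raw record to the table backing its use case.
    static void route(Map<String, String> rawDoc) {
        String tablePath = ROUTES.get(rawDoc.get("use_case"));
        if (tablePath != null) {
            writeTo(tablePath, rawDoc);
        }
        // else: unknown use case -- leave it in the source volume (or dead-letter it)
    }

    public static void main(String[] args) {
        // Stand-in for records consumed from the source stream/table.
        route(Map.of("use_case", "clickstream", "url", "/home"));
        route(Map.of("use_case", "billing", "amount", "42.00"));
        route(Map.of("use_case", "unknown", "x", "y"));
        System.out.println(SINKS.get("/data/analytics/clicks_table").size());
        System.out.println(SINKS.get("/data/billing/invoices_table").size());
    }
}
```

Basically a consume-route-write loop; the open question for me is whether this has to live in a custom app run from cron, or whether MCS has some built-in way to schedule it.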