UPDATING ONE OR MORE DATABASES BASED ON DATAFLOW EVENTS
| Main authors: | , , , , |
|---|---|
| Format: | Patent |
| Language: | eng |
| Subjects: | |
| Online access: | Order full text |
Summary: Database environments may choose to schedule complex analytics processing to be performed by specialized processing environments by caching source datasets or other data needed for the analytics and then outputting results back to customer datasets. It is complex to schedule user database operations, such as running dataflows, recipes, scripts, rules, or the like that may rely on output from the analytics, when the user database operations are on one schedule while the analytics is on another. User/source datasets may become out of sync, and one or both environments may operate on stale data. One way to resolve this problem is to define triggers that, for example, monitor for changes to datasets (or other items of interest) made by analytics or other activity and automatically run the dataflows, recipes, or the like that are related to the changed datasets (or other items of interest).
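The following Python sketch is a loose illustration of the trigger idea described in the summary, not the patent's implementation; the names `TriggerRegistry`, `on_change`, `notify_change`, and `rebuild_customer_recipe` are all hypothetical. It shows how triggers registered against datasets could automatically run the dataflows or recipes that depend on them when an analytics environment writes changes back.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Hypothetical sketch: triggers watch datasets for changes (e.g. results
# written back by an analytics environment) and automatically run the
# dataflows/recipes that depend on those datasets, so the two schedules
# stay in sync instead of operating on stale data.

@dataclass
class TriggerRegistry:
    # Maps a dataset name to the dataflow callbacks to run when it changes.
    _triggers: Dict[str, List[Callable[[str], None]]] = field(default_factory=dict)

    def on_change(self, dataset: str, dataflow: Callable[[str], None]) -> None:
        """Register a dataflow/recipe to run whenever `dataset` changes."""
        self._triggers.setdefault(dataset, []).append(dataflow)

    def notify_change(self, dataset: str) -> None:
        """Invoked when analytics or other activity updates a dataset."""
        for dataflow in self._triggers.get(dataset, []):
            dataflow(dataset)


# Example usage with a hypothetical downstream recipe.
def rebuild_customer_recipe(dataset: str) -> None:
    print(f"Re-running customer recipe because {dataset} changed")

registry = TriggerRegistry()
registry.on_change("analytics_results", rebuild_customer_recipe)

# The analytics environment finishes writing results back to the dataset:
registry.notify_change("analytics_results")
```

In a real database environment the change notification would presumably come from the platform detecting writes to the dataset rather than from an explicit call, but the registry-plus-callback structure captures the basic trigger pattern.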