Data360 Analyze

  • 1.  Master Switch/Trigger Feature

    Posted 09-26-2023 03:48

    Hi,

    I'm exploring the possibility of having some sort of master switch that would act as a general trigger for all flows. Use case: hundreds of flows are scheduled, and due to instability or other problems with the ETL at the source, some of the flows fail to run. A master switch would act on different possible triggers (an email or some other form of alert from IT or a DBA) and would delay all flows, running them after a 'green' alert or after x hours. Integrating meta checks into every flow is possible, but it's a long task given the number of flows, so an umbrella/parent switch would be more efficient and robust. Has anybody come across or implemented a solution like this before?

    Thanks,

    IC



    ------------------------------
    Irakli Chitishvili
    Data Trust Associates
    ------------------------------


  • 2.  RE: Master Switch/Trigger Feature

    Posted 09-27-2023 01:45
    Edited by Toby Harkin 09-27-2023 01:47

    Have you considered having a master dataflow and using the "Execute Data Flow" node?

    For example, we have some scenarios where we poll a DB, waiting for an update before triggering the next step, which is the dataflow. We have also implemented this by watching a SharePoint site and waiting for a file to land; again, this is based on polling.
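
    A rough sketch of that polling pattern in Python (untested; the connection string, the ETL_STATUS table and its columns are just examples, not anything specific to Data360 Analyze):

        import time
        import pyodbc

        POLL_INTERVAL_SECS = 300      # check every 5 minutes
        MAX_WAIT_SECS = 4 * 60 * 60   # give up after 4 hours

        def wait_for_green(conn_str):
            """Block until the source ETL reports 'GREEN', or time out."""
            conn = pyodbc.connect(conn_str)
            try:
                deadline = time.time() + MAX_WAIT_SECS
                while time.time() < deadline:
                    row = conn.cursor().execute(
                        "SELECT STATUS FROM ETL_STATUS WHERE JOB_NAME = ?",
                        "source_load").fetchone()
                    if row and row[0] == "GREEN":
                        return True   # safe to trigger the next dataflow
                    time.sleep(POLL_INTERVAL_SECS)
                return False          # still not green: alert rather than run
            finally:
                conn.close()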

    ------------------------------
    Toby Harkin
    Telstra Corporation Limited
    Sydney NSW
    ------------------------------



  • 3.  RE: Master Switch/Trigger Feature

    Posted 09-28-2023 04:21
    Edited by Irakli Chitishvili 09-28-2023 04:25

    Thanks Toby, this is indeed an interesting idea. Attaching the DB status to a meta check and then using Execute Data Flow should work, as it supports multiple flows simultaneously.



    ------------------------------
    Irakli Chitishvili
    Data Trust Associates
    ------------------------------



  • 4.  RE: Master Switch/Trigger Feature

    Employee
    Posted 09-27-2023 05:27
    Edited by Adrian Williams 09-27-2023 06:13

    Toby's suggestion to use a parent data flow containing some logic and the Execute Data Flow node is a good solution if you want to be able to specify a range of prerequisites that must be satisfied before running a child data flow. The logic in the parent data flow could, for instance, check the status of services in a DB table that is updated by IT or by a trigger from other monitoring software. If all checks pass, the Execute Data Flow node then runs the child data flow. Note that the schedule would be provisioned on the parent data flow, not the child.
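
    As a sketch of what that prerequisite check might look like (illustrative only; the SERVICE_STATUS table and its columns are assumed, and in the flow itself this logic would sit in a node upstream of the Execute Data Flow node, which would only fire once a record reaches it):

        import pyodbc

        def all_prerequisites_met(conn_str):
            """Return True only if every monitored service reports 'UP'."""
            conn = pyodbc.connect(conn_str)
            try:
                rows = conn.cursor().execute(
                    "SELECT SERVICE_NAME, STATUS FROM SERVICE_STATUS").fetchall()
            finally:
                conn.close()
            failing = [name for name, status in rows if status != "UP"]
            if failing:
                # log what is holding things up before blocking the child flow
                print("Blocked by: " + ", ".join(failing))
                return False
            return True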

    If you want a mechanism that provides coarse control to enable/disable all schedules, then you could use an external system to leverage the AnalyzeCli tool to pause/resume the scheduler.
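
    For illustration, that external system could be as simple as a watchdog script that flips the scheduler based on load. The exact AnalyzeCli arguments for pausing and resuming the scheduler are covered in the product documentation, so the command lists below are deliberately left as placeholders:

        import subprocess
        import time

        PAUSE_CMD = ["AnalyzeCli", "..."]   # placeholder: see the AnalyzeCli docs
        RESUME_CMD = ["AnalyzeCli", "..."]  # placeholder: see the AnalyzeCli docs

        def watchdog(db_is_overloaded, interval_secs=600):
            """Pause the scheduler while the DB is overloaded; resume when it recovers."""
            paused = False
            while True:
                overloaded = db_is_overloaded()   # caller supplies the load check
                if overloaded and not paused:
                    subprocess.run(PAUSE_CMD, check=True)
                    paused = True
                elif not overloaded and paused:
                    subprocess.run(RESUME_CMD, check=True)
                    paused = False
                time.sleep(interval_secs)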



    ------------------------------
    Adrian Williams
    Precisely Software Inc.
    ------------------------------



  • 5.  RE: Master Switch/Trigger Feature

    Posted 09-28-2023 04:22

    Thanks Adrian, I think AnalyzeCLI could be a viable option, as we have to pause/resume around 400 flows based on the database load status.



    ------------------------------
    Irakli Chitishvili
    Data Trust Associates
    ------------------------------