I agree that AnalyzeCli is the way to go.
I've been able to use the following code in a Transform/Generate Data node to run AnalyzeCli from inside Analyze.
import subprocess, json
p = subprocess.Popen([analyzeCli, '-f', 'json', 'runs'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()  # wait for AnalyzeCli to finish and capture its output
runs = json.loads(out)      # parse the JSON list of run records
The analyzeCli variable holds the path to the installed analyzeCli executable, or to the analyzeCli.bat file on Windows.
The output in this example is a JSON list of objects of this form:
{
"scheduledTaskName" : "The Schedule Name",
"status" : "COMPLETED",
"duration" : 2031,
"startTime" : "2023-01-06T23:04:10.699Z",
"modifiedByName" : "username here"
}
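To group runs by flow for the kind of analysis Irakli describes, the parsed records can be aggregated and written to a CSV that PowerBI imports directly. A minimal sketch, using hypothetical sample data in the shape shown above (in practice, runs would come from parsing the AnalyzeCli output):

import csv
import json
from collections import defaultdict

# Hypothetical sample records in the shape returned by `analyzeCli -f json runs`
runs = json.loads("""[
  {"scheduledTaskName": "Task A", "status": "COMPLETED", "duration": 2031,
   "startTime": "2023-01-06T23:04:10.699Z", "modifiedByName": "user"},
  {"scheduledTaskName": "Task A", "status": "COMPLETED", "duration": 1969,
   "startTime": "2023-01-07T23:04:11.120Z", "modifiedByName": "user"},
  {"scheduledTaskName": "Task B", "status": "COMPLETED", "duration": 540,
   "startTime": "2023-01-07T23:10:00.000Z", "modifiedByName": "user"}
]""")

# Sum durations (ms) and count runs per scheduled task, completed runs only
totals = defaultdict(lambda: [0, 0])  # name -> [total_ms, run_count]
for r in runs:
    if r["status"] == "COMPLETED":
        t = totals[r["scheduledTaskName"]]
        t[0] += r["duration"]
        t[1] += 1

# Write one row per task with its average duration, ready for PowerBI
with open("run_times.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["scheduledTaskName", "avgDurationMs", "runs"])
    for name, (total_ms, count) in sorted(totals.items()):
        w.writerow([name, total_ms / count, count])

Scheduling this as a periodic job (or wrapping it in the same Transform node) would keep the CSV current without manual exports.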
Ernest
------------------------------
Ernest Jones
Precisely Software Inc.
PEARL RIVER NY
------------------------------
Original Message:
Sent: 01-09-2023 05:08
From: Irakli Chitishvili
Subject: Accessing flow runtime/statistics outside of Analyze
Hi,
I'm exploring the possibility of accessing the data flows' run times and exporting or visualizing them in PowerBI, in order to see which flows take more or less time to run. I know statistics are available in Analyze for individual flows, but with a couple of hundred flows I'm looking for a more versatile way to have them exported automatically (or retrieved via an API call), so I can group them by run time and analyze and optimize further.
Thank you
------------------------------
Irakli Chitishvili
Degroof Petercam SA/NV
------------------------------