Start and stop dataflows on command
When working with dataflows, most of us simply schedule them or run them manually to get fresh data (assuming the data sync has completed). But you can also trigger a dataflow via the API – so let's see how to do that with the Salesforce CLI and Mohan's plugin. If you haven't used Mohan's plugin before, please check out this blog for details on how to install or update it.
Note: this blog uses sfdx-mohanc-plugins version 0.0.131.
The dataflow start command
One of the main commands for this blog is the dataflow start command. Let's have a look at its options by running the following:
sfdx mohanc:ea:dataflow:start -h
Let's take a closer look at the options for this command.
Username
Use the -u option to specify a username to use for your command.
--The option
sfdx mohanc:ea:dataflow:start -u <insert username>
--Example
sfdx mohanc:ea:dataflow:start -u rikke@demo.org
Dataflow id
Use the -i option to specify a dataflow id to use for your command.
--The option
sfdx mohanc:ea:dataflow:start -u <insert username> -i <insert dataflow id>
--Example
sfdx mohanc:ea:dataflow:start -u rikke@demo.org -i 02K3h000000MtyuEAC
The dataflow jobs stop command
The other main command for this blog is the dataflow jobs stop command. Let's have a look at its options by running the following:
sfdx mohanc:ea:dataflow:jobs:stop -h
Let's take a closer look at the options for this command.
Username
Use the -u option to specify a username to use for your command.
--The option
sfdx mohanc:ea:dataflow:jobs:stop -u <insert username>
--Example
sfdx mohanc:ea:dataflow:jobs:stop -u rikke@demo.org
Dataflow job id
Use the -i option to specify a dataflow job id to use for your command.
--The option
sfdx mohanc:ea:dataflow:jobs:stop -u <insert username> -i <insert dataflow job id>
--Example
sfdx mohanc:ea:dataflow:jobs:stop -u rikke@demo.org -i 03CB000000383oAMAQ
The dataflow list command
To use the dataflow start command we need a dataflow id, which we can get by using the dataflow list command. To see the options for this command, enter the following:
sfdx mohanc:ea:dataflow:list -h
Let's take a closer look at the options for this command.
Username
Use the -u option to specify a username to use for your command.
--The option
sfdx mohanc:ea:dataflow:list -u <insert username>
--Example
sfdx mohanc:ea:dataflow:list -u rikke@demo.org
The dataflow jobs list command
To use the dataflow jobs stop command we need a dataflow job id, which we can get by using the dataflow jobs list command. To see the options for this command, enter the following:
sfdx mohanc:ea:dataflow:jobs:list -h
Let's take a closer look at the options for this command.
Username
Use the -u option to specify a username to use for your command.
--The option
sfdx mohanc:ea:dataflow:jobs:list -u <insert username>
--Example
sfdx mohanc:ea:dataflow:jobs:list -u rikke@demo.org
Start a dataflow
Okay, having covered the different commands, let's have a look at how we can start a dataflow using the Salesforce CLI.
Note: Before using the commands below you have to log in to the desired org by using the command sfdx force:auth:web:login, which will launch the login window in a browser.
Step 1 – use the dataflow:list command to find the list of dataflows and their ids.
sfdx mohanc:ea:dataflow:list
Step 2 – define the username for the target org by adding the -u option.
sfdx mohanc:ea:dataflow:list -u rikke@discovery.gs0
Step 3 – press enter.
Step 4 – in the list find the dataflow you want to start and copy the associated id.
Step 5 – use the dataflow:start command to start a dataflow.
sfdx mohanc:ea:dataflow:start
Step 6 – define the username for the target org by adding the -u option.
sfdx mohanc:ea:dataflow:start -u rikke@discovery.gs0
Step 7 – define the dataflow id by adding the -i option. Use the id you copied in step 4.
sfdx mohanc:ea:dataflow:start -u rikke@discovery.gs0 -i 02KB0000000nkUqMAI
Step 8 – press enter.
You will now get a message that your dataflow has been queued to run. You can always check the progress in the data monitor.
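Under the hood the plugin calls the Analytics REST API (linked at the end of this post). If you would rather script the start call yourself, here is a minimal Python sketch of building that request: to my understanding it is a POST to the wave/dataflowjobs resource with a {"dataflowId": …, "command": "start"} body, but treat the endpoint shape as an assumption and check the linked docs; the instance URL, API version, and access token are placeholders you would supply yourself.

```python
import json
import urllib.request

API_VERSION = "v56.0"  # placeholder – adjust to your org's API version

def build_start_request(instance_url, dataflow_id, access_token):
    """Build the POST request that asks Analytics to queue a dataflow run."""
    url = f"{instance_url}/services/data/{API_VERSION}/wave/dataflowjobs"
    body = json.dumps({"dataflowId": dataflow_id, "command": "start"}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
    )

# Build (but don't send) a request; sending would be:
#   urllib.request.urlopen(req)
req = build_start_request("https://demo.my.salesforce.com",
                          "02KB0000000nkUqMAI", "<token>")
print(req.full_url)
```

The request builder is kept separate from the network call so you can inspect (or test) the URL and payload before firing it at your org.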
Stop a dataflow job
We just ran our dataflow. Let's say that was a mistake, or maybe it's been running too long and we want to stop the run. We can also achieve this with the Salesforce CLI. Let's have a look at the steps to complete.
Note: Before using the commands below you have to log in to the desired org by using the command sfdx force:auth:web:login, which will launch the login window in a browser.
Step 1 – use the dataflow:jobs:list command to find the list of dataflow jobs and their ids.
sfdx mohanc:ea:dataflow:jobs:list
Step 2 – define the username for the target org by adding the -u option.
sfdx mohanc:ea:dataflow:jobs:list -u rikke@discovery.gs0
Step 3 – press enter.
Step 4 – in the list find the job you want to stop and copy the associated id.
Step 5 – use the dataflow:jobs:stop command to stop a dataflow.
sfdx mohanc:ea:dataflow:jobs:stop
Step 6 – define the username for the target org by adding the -u option.
sfdx mohanc:ea:dataflow:jobs:stop -u rikke@discovery.gs0
Step 7 – define the dataflow job id by adding the -i option. Use the id you copied in step 4.
sfdx mohanc:ea:dataflow:jobs:stop -u rikke@discovery.gs0 -i 03CB0000003DTAMMA4
Step 8 – press enter.
You will now get a message that your dataflow has been stopped. You can always confirm this in the data monitor.
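The stop call can be scripted the same way. As an assumption based on the Analytics REST API docs, stopping is a PATCH to the individual dataflow job resource with a {"command": "stop"} body; instance URL, API version, and token are again placeholders.

```python
import json
import urllib.request

API_VERSION = "v56.0"  # placeholder – adjust to your org's API version

def build_stop_request(instance_url, job_id, access_token):
    """Build the PATCH request that asks Analytics to stop a running dataflow job."""
    url = f"{instance_url}/services/data/{API_VERSION}/wave/dataflowjobs/{job_id}"
    body = json.dumps({"command": "stop"}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
    )

# Build (but don't send) the request for the job id from step 4:
req = build_stop_request("https://demo.my.salesforce.com",
                         "03CB0000003DTAMMA4", "<token>")
print(req.full_url)
```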
For anyone interested, here is the REST API that this plugin is using. With this, you could use any tool you wish to control the dataflows: https://developer.salesforce.com/docs/atlas.en-us.bi_dev_guide_rest.meta/bi_dev_guide_rest/bi_run_schedule_sync_data.htm
Rikke,
Is it possible to establish dependencies between Dataflows using this command line invocation? That is, start Dataflow 2 only after completion of Dataflow 1, instead of relying on the scheduled times for the Dataflows as you currently have in Data Manager.
Satish,
Since you’re using the API you can create a dependency in whatever method you’re using to trigger the API calls. You would use this API to get the current status of the running dataflow: https://developer.salesforce.com/docs/atlas.en-us.bi_dev_guide_rest.meta/bi_dev_guide_rest/bi_resources_dataflowjobs_nodes_id.htm
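Chaining Dataflow 2 after Dataflow 1, as discussed above, boils down to polling that dataflow job resource until the first run reaches a terminal status, then issuing the next start. A hedged sketch of just the waiting logic – the terminal status values are assumptions to verify against the dataflowjobs docs, and fetch_status stands in for a GET on the job endpoint:

```python
import time

# Assumed terminal statuses for a dataflow job; confirm the exact
# values against the dataflowjobs resource documentation.
TERMINAL = {"Success", "Warning", "Failure"}

def wait_for_job(fetch_status, job_id, poll_seconds=30, timeout_seconds=3600):
    """Poll fetch_status(job_id) until it returns a terminal status.

    fetch_status is any callable returning the job's status string
    (e.g. a wrapper around a GET on the dataflow job endpoint), which
    keeps this waiting logic testable offline.
    """
    waited = 0
    while waited <= timeout_seconds:
        status = fetch_status(job_id)
        if status in TERMINAL:
            return status
        time.sleep(poll_seconds)
        waited += poll_seconds
    raise TimeoutError(f"dataflow job {job_id} still running after {timeout_seconds}s")

# Offline demo with a stubbed status sequence:
statuses = iter(["Queued", "Running", "Running", "Success"])
result = wait_for_job(lambda _job: next(statuses),
                      "03CB0000003DTAMMA4", poll_seconds=0)
print(result)
```

Once wait_for_job returns a success status for Dataflow 1's job, you would issue the start call (CLI or REST) for Dataflow 2.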