What rules are evaluated?
- Cyclomatic Complexity
- Cyclomatic complexity is a measurement developed to determine the stability and level of confidence in a program. It measures the number of linearly independent paths through a program module. Programs with lower cyclomatic complexity are easier to understand and less risky to modify. (A sketch of this calculation appears after this list.)
- Hard Coded IDs
- Hard-coded IDs in a flow make it riskier to deploy. Record IDs will likely vary across environments, which will cause the flow to fail when it is deployed to other environments. (A detection sketch appears after this list.)
- Limit-Consuming Elements Inside Loops
- Certain elements count towards one or more Salesforce governor limits. If these elements exist within a loop, those limits can quickly be reached, which makes the flow likely to fail due to exceeding governor limits. (A detection sketch appears after this list.)
- The limit-consumption types evaluated by the flow analyzer are:
- SOQL Queries
- DML
- Single Email Deliveries configured to count against Salesforce general email limits
- Missing or Empty Fault Paths
- Certain elements, such as DML elements, can be prone to errors. Catching these errors with a fault path and taking action on them is recommended. Creating a fault path without any elements "swallows" the exception, which should instead be acted on or reported.
- Nested Loops
- Nested loops quickly increase the time complexity of a flow, which can lengthen its run time and make it more likely to run into CPU limits.
- Run in Mode is System
- While it can't always be avoided, running flows in system mode can bypass your permissions and sharing model.
- Unused Variables
- Unused variables cause clutter, can increase run time, and make flow metadata files larger than necessary.
- DML Between Screens
- Certain DML elements between screens can be dangerous if the Previous button is enabled on the second screen, because a user can unwittingly create multiple records or attempt to delete a record that was already deleted. This rule checks for any create or delete elements between two screens where the second screen has the Previous button enabled.
- Old API Version
- API versions more than 3 years (9 releases) old should be updated to ensure that new functionality is available.
- Context Mismatch
- Best practice is to execute certain actions in specific trigger contexts (for record-triggered flows, "Fast Field Updates" or "Actions and Related Records"). This rule catches updates to the context record that occur in an "Actions and Related Records" flow, since those updates should typically be reserved for "Fast Field Updates" flows so the change is made before the record is committed to the database.
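For illustration, below is a minimal sketch of how a cyclomatic complexity score could be computed for a flow, assuming the flow has already been parsed into a simple graph of elements and connectors. The element names and the graph representation are hypothetical, not the Flow Analyzer's internal model; the calculation uses the standard formula M = E - N + 2P (edges, nodes, connected components).

```python
# Minimal sketch: cyclomatic complexity M = E - N + 2P for a flow graph.
# The flow is assumed to be pre-parsed into {element_name: [next_element, ...]}.
# Element names below are hypothetical.

def cyclomatic_complexity(flow_graph: dict[str, list[str]]) -> int:
    nodes = set(flow_graph) | {t for targets in flow_graph.values() for t in targets}
    edges = sum(len(targets) for targets in flow_graph.values())
    components = _count_components(flow_graph, nodes)
    return edges - len(nodes) + 2 * components

def _count_components(flow_graph, nodes):
    # Treat connectors as undirected edges when counting connected components.
    neighbors = {n: set() for n in nodes}
    for source, targets in flow_graph.items():
        for target in targets:
            neighbors[source].add(target)
            neighbors[target].add(source)
    seen, components = set(), 0
    for node in nodes:
        if node in seen:
            continue
        components += 1
        stack = [node]
        while stack:
            current = stack.pop()
            if current in seen:
                continue
            seen.add(current)
            stack.extend(neighbors[current] - seen)
    return components

# Example: a decision with two outcomes that re-join raises complexity to 2.
example = {
    "Start": ["Check_Status"],
    "Check_Status": ["Update_Record", "Send_Email"],  # decision: 2 outcomes
    "Update_Record": ["End"],
    "Send_Email": ["End"],
}
print(cyclomatic_complexity(example))  # -> 2
```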
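In the same spirit, here is a hedged sketch of how hard-coded record IDs could be flagged. It scans string literals in raw flow XML for values shaped like 15- or 18-character Salesforce IDs; the file path is illustrative, and a real check would need to filter out legitimate values that happen to match the pattern.

```python
import re
from pathlib import Path

# Broad pattern for 15- or 18-character Salesforce IDs. The optional
# 3-character suffix of an 18-character ID is a checksum drawn from A-Z0-5.
# The pattern can over-match, so a real check would filter candidates further.
SALESFORCE_ID_PATTERN = re.compile(r"\b[a-zA-Z0-9]{15}(?:[A-Z0-5]{3})?\b")

def find_hard_coded_ids(flow_xml: str) -> list[str]:
    """Return candidate hard-coded IDs found in <stringValue> literals."""
    literals = re.findall(r"<stringValue>(.*?)</stringValue>", flow_xml, re.DOTALL)
    return [value for value in literals if SALESFORCE_ID_PATTERN.fullmatch(value.strip())]

# Illustrative usage against a local flow metadata file (hypothetical path).
if __name__ == "__main__":
    xml = Path("force-app/main/default/flows/My_Flow.flow-meta.xml").read_text()
    for candidate in find_hard_coded_ids(xml):
        print(f"Possible hard-coded ID: {candidate}")
```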
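Finally, a simplified sketch of how limit-consuming elements inside a loop might be found, using the same hypothetical graph representation as above. It walks the loop body (the path starting at the loop's "for each" connector) until the path returns to the loop, and flags any visited element known to consume limits.

```python
# Sketch: flag limit-consuming elements that sit inside a loop body.
# flow_graph maps element name -> list of next elements; loop_bodies maps each
# loop element -> the first element of its "for each" path. All names are
# hypothetical and do not reflect the Flow Analyzer's internal model.

def elements_inside_loop(flow_graph, loop_element, body_start):
    """Collect every element reachable in the loop body before returning to the loop."""
    inside, stack = set(), [body_start]
    while stack:
        current = stack.pop()
        if current == loop_element or current in inside:
            continue
        inside.add(current)
        stack.extend(flow_graph.get(current, []))
    return inside

def find_limit_violations(flow_graph, loop_bodies, limit_consuming):
    violations = []
    for loop_element, body_start in loop_bodies.items():
        for element in elements_inside_loop(flow_graph, loop_element, body_start):
            if element in limit_consuming:
                violations.append((loop_element, element))
    return violations

# Example: a Get Records element inside a loop is flagged.
graph = {
    "Loop_Contacts": ["End"],          # no-more-values path
    "Get_Account": ["Loop_Contacts"],  # loop body connects back to the loop
}
print(find_limit_violations(
    flow_graph=graph,
    loop_bodies={"Loop_Contacts": "Get_Account"},
    limit_consuming={"Get_Account"},
))  # -> [('Loop_Contacts', 'Get_Account')]
```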
What kinds of flows can be scanned?
The Flow Analyzer is currently built to scan any flow whose ProcessType is AutoLaunchedFlow or Flow in the flow metadata. Translated to the types of flows you see in the core user interface, this includes the types in the table below.
| Core UI Type | ProcessType |
|---|---|
| Record-Triggered Flow | AutoLaunchedFlow |
| Schedule-Triggered Flow | AutoLaunchedFlow |
| Platform Event-Triggered Flow | AutoLaunchedFlow |
| Autolaunched Flow (No Trigger) | AutoLaunchedFlow |
| Screen Flow | Flow |
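As a rough illustration of that filter, the sketch below walks a local source-format project and keeps only the flows whose processType is AutoLaunchedFlow or Flow. The directory path is an assumption; your flow files may live elsewhere.

```python
import xml.etree.ElementTree as ET
from pathlib import Path

METADATA_NS = "{http://soap.sforce.com/2006/04/metadata}"
SCANNABLE_PROCESS_TYPES = {"AutoLaunchedFlow", "Flow"}

def scannable_flows(project_dir: str) -> list[str]:
    """Return flow file paths whose processType can be scanned."""
    results = []
    for flow_file in Path(project_dir).rglob("*.flow-meta.xml"):
        root = ET.parse(flow_file).getroot()
        process_type = root.findtext(f"{METADATA_NS}processType")
        if process_type in SCANNABLE_PROCESS_TYPES:
            results.append(str(flow_file))
    return results

if __name__ == "__main__":
    for path in scannable_flows("force-app/main/default/flows"):
        print(path)
```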
Use a hub org to run your flow scanning
The Flow Analyzer can connect to any org in your pipeline from one central org. This allows you to run scans across your pipeline and collect the results in a single place. However, if you are wary of cloud-to-cloud integrations, you are not required to use a single hub org; you can instead install the app in each org in your pipeline and run scans there for that org only.
Configure the rules
I know that certain flow antipatterns are more important to you than others. For that reason, you can configure how many points are assigned for a violation of each rule, and, where it applies, the threshold above which a violation is logged. This lets you assign high point values to the rules that matter most to you so that their violations drive up run scores. If you install the Flow Analyzer as a quality gate in your devops process, this also lets you force the gate to fail when certain elements, such as hard-coded IDs, are present in a flow.
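To make that concrete, here is a hypothetical sketch of how configurable point values, thresholds, and a quality gate could interact. The rule names, point values, and gate threshold are illustrative only, not the Flow Analyzer's actual defaults.

```python
# Hypothetical rule configuration: points per violation and, where it applies,
# a threshold below which no violation is logged.
RULE_CONFIG = {
    "Hard Coded IDs":        {"points": 100},
    "Nested Loops":          {"points": 50},
    "Cyclomatic Complexity": {"points": 25, "threshold": 10},
    "Old API Version":       {"points": 10},
}
MAX_ALLOWED_SCORE = 99  # gate fails at 100+, so any hard-coded ID fails the build

def run_score(violations: list[dict]) -> int:
    """Sum configured points for every logged violation."""
    score = 0
    for violation in violations:
        rule = RULE_CONFIG[violation["rule"]]
        threshold = rule.get("threshold")
        if threshold is not None and violation.get("measured", 0) <= threshold:
            continue  # below the configured threshold: not logged
        score += rule["points"]
    return score

def quality_gate(violations: list[dict]) -> bool:
    """Return True if the run passes the gate."""
    return run_score(violations) <= MAX_ALLOWED_SCORE

# Example: a single hard-coded ID pushes the score past the gate.
example = [{"rule": "Hard Coded IDs"}, {"rule": "Cyclomatic Complexity", "measured": 8}]
print(run_score(example), quality_gate(example))  # -> 100 False
```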
Make Updates to Your Flows in Bulk
With the Flow Mechanic tool in the Flow Analyzer, you can make modifications to your flows in bulk, such as removing unused variables, in just a few clicks.
Templates
The Flow Analyzer can be included in your devops pipeline in many different capacities. For that reason, we wanted to make it easy for you to create templates: suites of scans that can be run quickly and easily.
Subflows
When you trigger a run, the Flow Analyzer will automagically identify any subflows of the selected flows and inject them into the run as well. The primary reason we do this is to provide a more complete picture of the selected flow. For instance, imagine a scenario where you have a DML operation in a flow. Now imagine another flow that calls that flow as a subflow within a loop. If the subflow were not scanned, there would be no way of knowing that we have an antipattern of limit-consuming elements within a loop. For this reason, we built subflow detection and scanning into our very first iteration!
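For a rough idea of how that discovery could work, the sketch below recursively follows subflow references in flow metadata XML and returns the full set of flows to include in a run. The directory layout and function names are assumptions for illustration.

```python
import xml.etree.ElementTree as ET
from pathlib import Path

METADATA_NS = "{http://soap.sforce.com/2006/04/metadata}"

def referenced_subflows(flow_file: Path) -> set[str]:
    """Names of flows referenced by <subflows><flowName> elements in one flow."""
    root = ET.parse(flow_file).getroot()
    return {
        name.text
        for subflow in root.findall(f"{METADATA_NS}subflows")
        if (name := subflow.find(f"{METADATA_NS}flowName")) is not None
    }

def expand_with_subflows(selected: set[str], flows_dir: str) -> set[str]:
    """Inject every (transitively) referenced subflow into the selected set."""
    flows_path = Path(flows_dir)
    to_visit, expanded = list(selected), set()
    while to_visit:
        flow_name = to_visit.pop()
        if flow_name in expanded:
            continue
        expanded.add(flow_name)
        flow_file = flows_path / f"{flow_name}.flow-meta.xml"
        if flow_file.exists():
            to_visit.extend(referenced_subflows(flow_file) - expanded)
    return expanded

# Example: scanning "Account_Handler" also pulls in any subflows it calls.
print(expand_with_subflows({"Account_Handler"}, "force-app/main/default/flows"))
```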