mirror of
https://github.com/github/awesome-copilot.git
synced 2026-03-13 20:55:13 +00:00
feat: add flowstudio-power-automate-debug and flowstudio-power-automate-build skills (#899)
* feat: add flowstudio-power-automate-debug and flowstudio-power-automate-build skills

  Two companion skills for the FlowStudio Power Automate MCP server:
  - flowstudio-power-automate-debug: Debug workflow for failed Power Automate cloud flow runs
  - flowstudio-power-automate-build: Build & deploy flows from natural language descriptions

  Both require a FlowStudio MCP subscription: https://flowstudio.app
  These complement the existing flowstudio-power-automate-mcp skill (merged in PR #896).

* fix: address all review comments — README, cross-refs, response shapes, step numbering

  - Add skills to docs/README.skills.md (fixes validate-readme CI check)
  - Update cross-skill references to use flowstudio- prefix (#1, #4, #7, #9)
  - Fix get_live_flow_run_action_outputs: returns array, index [0] (#2, #3)
  - Renumber Step 6→5, Step 7→6 — remove gap in build workflow (#8)
  - Fix connectionName note: it's the key, not the GUID (#10)
  - Remove invalid arrow function from Filter array expression (#11)

* feat: add flowstudio-power-automate plugin bundling all 3 skills

  Plugin bundles:
  - flowstudio-power-automate-mcp (core connection & CRUD)
  - flowstudio-power-automate-debug (debug failed runs)
  - flowstudio-power-automate-build (build & deploy flows)

  Install: copilot plugin install flowstudio-power-automate@awesome-copilot

  Per @aaronpowell's suggestion in review.
skills/flowstudio-power-automate-build/SKILL.md (Normal file, 460 lines)
@@ -0,0 +1,460 @@
---
name: flowstudio-power-automate-build
description: >-
  Build, scaffold, and deploy Power Automate cloud flows using the FlowStudio
  MCP server. Load this skill when asked to: create a flow, build a new flow,
  deploy a flow definition, scaffold a Power Automate workflow, construct a flow
  JSON, update an existing flow's actions, patch a flow definition, add actions
  to a flow, wire up connections, or generate a workflow definition from scratch.
  Requires a FlowStudio MCP subscription — see https://mcp.flowstudio.app
---

# Build & Deploy Power Automate Flows with FlowStudio MCP

Step-by-step guide for constructing and deploying Power Automate cloud flows
programmatically through the FlowStudio MCP server.

**Prerequisite**: A FlowStudio MCP server must be reachable with a valid JWT.
See the `flowstudio-power-automate-mcp` skill for connection setup.
Subscribe at https://mcp.flowstudio.app

---

## Source of Truth

> **Always call `tools/list` first** to confirm available tool names and their
> parameter schemas. Tool names and parameters may change between server versions.
> This skill covers response shapes, behavioral notes, and build patterns —
> things `tools/list` cannot tell you. If this document disagrees with `tools/list`
> or a real API response, the API wins.

---

## Python Helper

```python
import json, urllib.request, urllib.error

MCP_URL = "https://mcp.flowstudio.app/mcp"
MCP_TOKEN = "<YOUR_JWT_TOKEN>"

def mcp(tool, **kwargs):
    payload = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/call",
                          "params": {"name": tool, "arguments": kwargs}}).encode()
    req = urllib.request.Request(MCP_URL, data=payload,
        headers={"x-api-key": MCP_TOKEN, "Content-Type": "application/json",
                 "User-Agent": "FlowStudio-MCP/1.0"})
    try:
        resp = urllib.request.urlopen(req, timeout=120)
    except urllib.error.HTTPError as e:
        body = e.read().decode("utf-8", errors="replace")
        raise RuntimeError(f"MCP HTTP {e.code}: {body[:200]}") from e
    raw = json.loads(resp.read())
    if "error" in raw:
        raise RuntimeError(f"MCP error: {json.dumps(raw['error'])}")
    return json.loads(raw["result"]["content"][0]["text"])

ENV = "<environment-id>"  # e.g. Default-xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
```

---

## Step 1 — Safety Check: Does the Flow Already Exist?

Always look before you build to avoid duplicates:

```python
results = mcp("list_store_flows",
              environmentName=ENV, searchTerm="My New Flow")

# list_store_flows returns a direct array (no wrapper object)
if len(results) > 0:
    # Flow exists — modify rather than create
    # id format is "envId.flowId" — split to get the flow UUID
    FLOW_ID = results[0]["id"].split(".", 1)[1]
    print(f"Existing flow: {FLOW_ID}")
    defn = mcp("get_live_flow", environmentName=ENV, flowName=FLOW_ID)
else:
    print("Flow not found — building from scratch")
    FLOW_ID = None
```

---
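
The `id.split(".", 1)` trick above relies on the documented `envId.flowId` id format. A small helper (hypothetical name, a sketch of the same split) makes the contract explicit and fails loudly on unexpected ids:

```python
def split_store_flow_id(store_id: str) -> tuple[str, str]:
    """Split a list_store_flows id ("envId.flowId") into its parts.

    Splits at the FIRST dot only, mirroring split(".", 1) in Step 1,
    so a dot later in the flow id portion would be preserved.
    """
    env_id, _, flow_id = store_id.partition(".")
    if not flow_id:
        raise ValueError(f"unexpected store flow id format: {store_id!r}")
    return env_id, flow_id

env_id, flow_id = split_store_flow_id(
    "Default-11111111-2222-3333-4444-555555555555"
    ".aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee")
```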

## Step 2 — Obtain Connection References

Every connector action needs a `connectionName` that points to a key in the
flow's `connectionReferences` map. That key links to an authenticated connection
in the environment.

> **MANDATORY**: You MUST call `list_live_connections` first — do NOT ask the
> user for connection names or GUIDs. The API returns the exact values you need.
> Only prompt the user if the API confirms that required connections are missing.

### 2a — Always call `list_live_connections` first

```python
conns = mcp("list_live_connections", environmentName=ENV)

# Filter to connected (authenticated) connections only
active = [c for c in conns["connections"]
          if c["statuses"][0]["status"] == "Connected"]

# Build a lookup: connectorName → connectionName (id)
conn_map = {}
for c in active:
    conn_map[c["connectorName"]] = c["id"]

print(f"Found {len(active)} active connections")
print("Available connectors:", list(conn_map.keys()))
```

### 2b — Determine which connectors the flow needs

Based on the flow you are building, identify which connectors are required.
Common connector API names:

| Connector | API name |
|---|---|
| SharePoint | `shared_sharepointonline` |
| Outlook / Office 365 | `shared_office365` |
| Teams | `shared_teams` |
| Approvals | `shared_approvals` |
| OneDrive for Business | `shared_onedriveforbusiness` |
| Excel Online (Business) | `shared_excelonlinebusiness` |
| Dataverse | `shared_commondataserviceforapps` |
| Microsoft Forms | `shared_microsoftforms` |

> **Flows that need NO connections** (e.g. Recurrence + Compose + HTTP only)
> can skip the rest of Step 2 — omit `connectionReferences` from the deploy call.
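
For programmatic use, the table above can be mirrored as a lookup dict. This is a sketch: it covers only the connectors in the table, and any name not listed there must be confirmed via `list_live_connections` or `tools/list` before use.

```python
# Friendly-name → connector API-name lookup, mirroring the table above.
CONNECTOR_API_NAMES = {
    "SharePoint": "shared_sharepointonline",
    "Outlook / Office 365": "shared_office365",
    "Teams": "shared_teams",
    "Approvals": "shared_approvals",
    "OneDrive for Business": "shared_onedriveforbusiness",
    "Excel Online (Business)": "shared_excelonlinebusiness",
    "Dataverse": "shared_commondataserviceforapps",
    "Microsoft Forms": "shared_microsoftforms",
}

def connectors_for(*friendly_names):
    """Resolve friendly connector names to API names, failing on typos."""
    try:
        return [CONNECTOR_API_NAMES[n] for n in friendly_names]
    except KeyError as e:
        raise KeyError(f"Unknown connector {e.args[0]!r}; "
                       f"check the table above or list_live_connections") from e

connectors_needed = connectors_for("SharePoint", "Teams")
```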

### 2c — If connections are missing, guide the user

```python
connectors_needed = ["shared_sharepointonline", "shared_office365"]  # adjust per flow

missing = [c for c in connectors_needed if c not in conn_map]

if not missing:
    print("✅ All required connections are available — proceeding to build")
else:
    # ── STOP: connections must be created interactively ──
    # Connections require OAuth consent in a browser — no API can create them.
    print("⚠️ The following connectors have no active connection in this environment:")
    for c in missing:
        friendly = c.replace("shared_", "").replace("onlinebusiness", " Online (Business)")
        print(f"  • {friendly} (API name: {c})")
    print()
    print("Please create the missing connections:")
    print("  1. Open https://make.powerautomate.com/connections")
    print("  2. Select the correct environment from the top-right picker")
    print("  3. Click '+ New connection' for each missing connector listed above")
    print("  4. Sign in and authorize when prompted")
    print("  5. Tell me when done — I will re-check and continue building")
    # DO NOT proceed to Step 3 until the user confirms.
    # After user confirms, re-run Step 2a to refresh conn_map.
```

### 2d — Build the connectionReferences block

Only execute this after 2c confirms no missing connectors:

```python
connection_references = {}
for connector in connectors_needed:
    connection_references[connector] = {
        "connectionName": conn_map[connector],  # the GUID from list_live_connections
        "source": "Invoker",
        "id": f"/providers/Microsoft.PowerApps/apis/{connector}"
    }
```

> **IMPORTANT — `host.connectionName` in actions**: When building actions in
> Step 3, set `host.connectionName` to the **key** from this map (e.g.
> `shared_teams`), NOT the connection GUID. The GUID only goes inside the
> `connectionReferences` entry. The engine matches the action's
> `host.connectionName` to the key to find the right connection.

> **Alternative** — if you already have a flow using the same connectors,
> you can extract `connectionReferences` from its definition:
> ```python
> ref_flow = mcp("get_live_flow", environmentName=ENV, flowName="<existing-flow-id>")
> connection_references = ref_flow["properties"]["connectionReferences"]
> ```

See the `flowstudio-power-automate-mcp` skill's **connection-references.md**
reference for the full connection reference structure.

---
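
The 2d loop can be wrapped in a validating helper (a hypothetical name, sketching the same structure) so a missing connection fails before `update_live_flow` is ever called:

```python
def build_connection_references(connectors_needed, conn_map):
    """Build the connectionReferences block from active connections.

    conn_map maps connector API name -> connection id, as built in
    Step 2a. Raises if any required connector has no connection, so
    the failure happens before the deploy call.
    """
    missing = [c for c in connectors_needed if c not in conn_map]
    if missing:
        raise RuntimeError(f"No active connection for: {', '.join(missing)}")
    return {
        connector: {
            "connectionName": conn_map[connector],
            "source": "Invoker",
            "id": f"/providers/Microsoft.PowerApps/apis/{connector}",
        }
        for connector in connectors_needed
    }

refs = build_connection_references(
    ["shared_teams"], {"shared_teams": "shared-teams-abc123"})
```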

## Step 3 — Build the Flow Definition

Construct the definition object. See [flow-schema.md](references/flow-schema.md)
for the full schema and these action pattern references for copy-paste templates:

- [action-patterns-core.md](references/action-patterns-core.md) — Variables, control flow, expressions
- [action-patterns-data.md](references/action-patterns-data.md) — Array transforms, HTTP, parsing
- [action-patterns-connectors.md](references/action-patterns-connectors.md) — SharePoint, Outlook, Teams, Approvals

```python
definition = {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": { ... },  # see trigger-types.md / build-patterns.md
    "actions": { ... }    # see action-patterns-*.md / build-patterns.md
}
```

> See [build-patterns.md](references/build-patterns.md) for complete, ready-to-use
> flow definitions covering Recurrence+SharePoint+Teams, HTTP triggers, and more.

---
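
As a concrete shape for the placeholders above, here is a minimal illustrative definition: a daily Recurrence trigger plus one Compose action. It needs no connectors, so no `connectionReferences` block at deploy time; real flows substitute patterns from the references. The trigger/action structure follows the standard Logic Apps workflow definition language, not a FlowStudio-specific format.

```python
# Minimal illustrative definition: daily Recurrence + one Compose action.
minimal_definition = {
    "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
    "contentVersion": "1.0.0.0",
    "triggers": {
        "Recurrence": {
            "type": "Recurrence",
            "recurrence": {"frequency": "Day", "interval": 1},
        }
    },
    "actions": {
        "Compose_Greeting": {
            "type": "Compose",
            "runAfter": {},  # empty = runs first after the trigger
            "inputs": "Hello from an agent-built flow",
        }
    },
}
```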

## Step 4 — Deploy (Create or Update)

`update_live_flow` handles both creation and updates in a single tool.

### Create a new flow (no existing flow)

Omit `flowName` — the server generates a new GUID and creates via PUT:

```python
result = mcp("update_live_flow",
             environmentName=ENV,
             # flowName omitted → creates a new flow
             definition=definition,
             connectionReferences=connection_references,
             displayName="Overdue Invoice Notifications",
             description="Weekly SharePoint → Teams notification flow, built by agent")

if result.get("error") is not None:
    print("Create failed:", result["error"])
else:
    # Capture the new flow ID for subsequent steps
    FLOW_ID = result["created"]
    print(f"✅ Flow created: {FLOW_ID}")
```

### Update an existing flow

Provide `flowName` to PATCH:

```python
import datetime

result = mcp("update_live_flow",
             environmentName=ENV,
             flowName=FLOW_ID,
             definition=definition,
             connectionReferences=connection_references,
             displayName="My Updated Flow",
             description="Updated by agent on "
                         + datetime.datetime.now(datetime.timezone.utc).isoformat())

if result.get("error") is not None:
    print("Update failed:", result["error"])
else:
    print("Update succeeded:", result)
```

> ⚠️ `update_live_flow` always returns an `error` key.
> `null` (Python `None`) means success — do not treat the presence of the key as failure.
>
> ⚠️ `description` is required for both create and update.

### Common deployment errors

| Error message (contains) | Cause | Fix |
|---|---|---|
| `missing from connectionReferences` | An action's `host.connectionName` references a key that doesn't exist in the `connectionReferences` map | Ensure `host.connectionName` uses the **key** from `connectionReferences` (e.g. `shared_teams`), not the raw GUID |
| `ConnectionAuthorizationFailed` / 403 | The connection GUID belongs to another user or is not authorized | Re-run Step 2a and use a connection owned by the current `x-api-key` user |
| `InvalidTemplate` / `InvalidDefinition` | Syntax error in the definition JSON | Check `runAfter` chains, expression syntax, and action type spelling |
| `ConnectionNotConfigured` | A connector action exists but the connection GUID is invalid or expired | Re-check `list_live_connections` for a fresh GUID |

---
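
The "error key is always present, null means success" convention above is easy to get wrong with a naive membership check. A one-line predicate (hypothetical name) captures it:

```python
def deploy_ok(result: dict) -> bool:
    """True when update_live_flow succeeded.

    The tool always returns an "error" key; None (JSON null) means
    success, so only a non-null value is a real failure. A naive
    `"error" in result` check would wrongly flag every success.
    """
    return result.get("error") is None

assert deploy_ok({"error": None, "created": "new-flow-guid"})
assert not deploy_ok({"error": "Supply connectionReferences"})
```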

## Step 5 — Verify the Deployment

```python
check = mcp("get_live_flow", environmentName=ENV, flowName=FLOW_ID)

# Confirm state
print("State:", check["properties"]["state"])  # Should be "Started"

# Confirm the action we added is there
acts = check["properties"]["definition"]["actions"]
print("Actions:", list(acts.keys()))
```

---

## Step 6 — Test the Flow

> **MANDATORY**: Before triggering any test run, **ask the user for confirmation**.
> Running a flow has real side effects — it may send emails, post Teams messages,
> write to SharePoint, start approvals, or call external APIs. Explain what the
> flow will do and wait for explicit approval before calling `trigger_live_flow`
> or `resubmit_live_flow_run`.

### Updated flows (have prior runs)

The fastest path — resubmit the most recent run:

```python
runs = mcp("get_live_flow_runs", environmentName=ENV, flowName=FLOW_ID, top=1)
if runs:
    result = mcp("resubmit_live_flow_run",
                 environmentName=ENV, flowName=FLOW_ID, runName=runs[0]["name"])
    print(result)
```

### Flows already using an HTTP trigger

Fire directly with a test payload:

```python
schema = mcp("get_live_flow_http_schema",
             environmentName=ENV, flowName=FLOW_ID)
print("Expected body:", schema.get("triggerSchema"))

result = mcp("trigger_live_flow",
             environmentName=ENV, flowName=FLOW_ID,
             body={"name": "Test", "value": 1})
print(f"Status: {result['status']}")
```

### Brand-new non-HTTP flows (Recurrence, connector triggers, etc.)

A brand-new Recurrence or connector-triggered flow has no runs to resubmit
and no HTTP endpoint to call. **Deploy with a temporary HTTP trigger first,
test the actions, then swap to the production trigger.**

#### 6a — Save the real trigger, deploy with a temporary HTTP trigger

```python
# Save the production trigger you built in Step 3
production_trigger = definition["triggers"]

# Replace with a temporary HTTP trigger
definition["triggers"] = {
    "manual": {
        "type": "Request",
        "kind": "Http",
        "inputs": {
            "schema": {}
        }
    }
}

# Deploy (create or update) with the temp trigger
result = mcp("update_live_flow",
             environmentName=ENV,
             flowName=FLOW_ID,  # omit if creating new
             definition=definition,
             connectionReferences=connection_references,
             displayName="Overdue Invoice Notifications",
             description="Deployed with temp HTTP trigger for testing")

if result.get("error") is not None:
    print("Deploy failed:", result["error"])
else:
    if not FLOW_ID:
        FLOW_ID = result["created"]
    print(f"✅ Deployed with temp HTTP trigger: {FLOW_ID}")
```

#### 6b — Fire the flow and check the result

```python
# Trigger the flow
test = mcp("trigger_live_flow",
           environmentName=ENV, flowName=FLOW_ID)
print(f"Trigger response status: {test['status']}")

# Wait for the run to complete
import time; time.sleep(15)

# Check the run result
runs = mcp("get_live_flow_runs",
           environmentName=ENV, flowName=FLOW_ID, top=1)
run = runs[0]
print(f"Run {run['name']}: {run['status']}")

if run["status"] == "Failed":
    err = mcp("get_live_flow_run_error",
              environmentName=ENV, flowName=FLOW_ID, runName=run["name"])
    root = err["failedActions"][-1]
    print(f"Root cause: {root['actionName']} → {root.get('code')}")
    # Debug and fix the definition before proceeding
    # See the flowstudio-power-automate-debug skill for the full diagnosis workflow
```

#### 6c — Swap to the production trigger

Once the test run succeeds, replace the temporary HTTP trigger with the real one:

```python
# Restore the production trigger
definition["triggers"] = production_trigger

result = mcp("update_live_flow",
             environmentName=ENV,
             flowName=FLOW_ID,
             definition=definition,
             connectionReferences=connection_references,
             description="Swapped to production trigger after successful test")

if result.get("error") is not None:
    print("Trigger swap failed:", result["error"])
else:
    print("✅ Production trigger deployed — flow is live")
```

> **Why this works**: The trigger is just the entry point — the actions are
> identical regardless of how the flow starts. Testing via HTTP trigger
> exercises all the same Compose, SharePoint, Teams, etc. actions.
>
> **Connector triggers** (e.g. "When an item is created in SharePoint"):
> If actions reference `triggerBody()` or `triggerOutputs()`, pass a
> representative test payload in `trigger_live_flow`'s `body` parameter
> that matches the shape the connector trigger would produce.

---
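
The swap pattern hinges on the production trigger being trivially restorable. A small helper pair (hypothetical names, a sketch of the save/replace step) keeps it mechanical and avoids mutating the original definition, so the production trigger cannot be lost if a deploy fails midway:

```python
TEMP_HTTP_TRIGGER = {
    "manual": {
        "type": "Request",
        "kind": "Http",
        "inputs": {"schema": {}},
    }
}

def with_temp_http_trigger(definition: dict):
    """Return (test_definition, saved_production_trigger).

    Builds a shallow copy with the triggers swapped; the original
    definition object is left untouched.
    """
    saved = definition["triggers"]
    test_defn = {**definition, "triggers": TEMP_HTTP_TRIGGER}
    return test_defn, saved

defn = {"contentVersion": "1.0.0.0",
        "triggers": {"Recurrence": {"type": "Recurrence"}},
        "actions": {}}
test_defn, production_trigger = with_temp_http_trigger(defn)
```

Deploy `test_defn` for 6a/6b, then redeploy with `production_trigger` restored for 6c.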

## Gotchas

| Mistake | Consequence | Prevention |
|---|---|---|
| Missing `connectionReferences` in deploy | 400 "Supply connectionReferences" | Always call `list_live_connections` first |
| `"operationOptions"` missing on Foreach | Parallel execution, race conditions on writes | Always add `"Sequential"` |
| `union(old_data, new_data)` | Old values override new (first-wins) | Use `union(new_data, old_data)` |
| `split()` on potentially-null string | `InvalidTemplate` crash | Wrap with `coalesce(field, '')` |
| Checking `result["error"]` exists | Always present; true error is `!= null` | Use `result.get("error") is not None` |
| Flow deployed but state is "Stopped" | Flow won't run on schedule | Check connection auth; re-enable |
| Teams "Chat with Flow bot" recipient as object | 400 `GraphUserDetailNotFound` | Use plain string with trailing semicolon (see below) |

### Teams `PostMessageToConversation` — Recipient Formats

The `body/recipient` parameter format depends on the `location` value:

| Location | `body/recipient` format | Example |
|---|---|---|
| **Chat with Flow bot** | Plain email string with **trailing semicolon** | `"user@contoso.com;"` |
| **Channel** | Object with `groupId` and `channelId` | `{"groupId": "...", "channelId": "..."}` |

> **Common mistake**: passing `{"to": "user@contoso.com"}` for "Chat with Flow bot"
> returns a 400 `GraphUserDetailNotFound` error. The API expects a plain string.

---

## Reference Files

- [flow-schema.md](references/flow-schema.md) — Full flow definition JSON schema
- [trigger-types.md](references/trigger-types.md) — Trigger type templates
- [action-patterns-core.md](references/action-patterns-core.md) — Variables, control flow, expressions
- [action-patterns-data.md](references/action-patterns-data.md) — Array transforms, HTTP, parsing
- [action-patterns-connectors.md](references/action-patterns-connectors.md) — SharePoint, Outlook, Teams, Approvals
- [build-patterns.md](references/build-patterns.md) — Complete flow definition templates (Recurrence+SP+Teams, HTTP trigger)

## Related Skills

- `flowstudio-power-automate-mcp` — Core connection setup and tool reference
- `flowstudio-power-automate-debug` — Debug failing flows after deployment
@@ -0,0 +1,542 @@

# FlowStudio MCP — Action Patterns: Connectors

SharePoint, Outlook, Teams, and Approvals connector action patterns.

> All examples assume `"runAfter"` is set appropriately.
> Replace `<connectionName>` with the **key** you used in `connectionReferences`
> (e.g. `shared_sharepointonline`, `shared_teams`). This is NOT the connection
> GUID — it is the logical reference name that links the action to its entry in
> the `connectionReferences` map.

---

## SharePoint

### SharePoint — Get Items

```json
"Get_SP_Items": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "GetItems"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "table": "MyList",
      "$filter": "Status eq 'Active'",
      "$top": 500
    }
  }
}
```

Result reference: `@outputs('Get_SP_Items')?['body/value']`

> **Dynamic OData filter with string interpolation**: inject a runtime value
> directly into the `$filter` string using `@{...}` syntax:
> ```
> "$filter": "Title eq '@{outputs('ConfirmationCode')}'"
> ```
> Note the single quotes inside double quotes — correct OData string literal
> syntax. Avoids a separate variable action.

> **Pagination for large lists**: by default, GetItems stops at `$top`. To auto-paginate
> beyond that, enable the pagination policy on the action. In the flow definition this
> appears as:
> ```json
> "paginationPolicy": { "minimumItemCount": 10000 }
> ```
> Set `minimumItemCount` to the maximum number of items you expect. The connector will
> keep fetching pages until that count is reached or the list is exhausted. Without this,
> flows silently return a capped result on lists with >5,000 items.

---
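
When interpolating user-supplied values into `$filter`, remember that OData escapes a single quote inside a string literal by doubling it. A sketch of a safe clause builder (hypothetical helper, standard OData quoting rules assumed):

```python
def odata_eq(field: str, value) -> str:
    """Build an OData $filter equality clause.

    String values are single-quoted with embedded quotes doubled
    (OData's escape rule); non-string values are emitted bare.
    """
    if isinstance(value, str):
        escaped = value.replace("'", "''")
        return f"{field} eq '{escaped}'"
    return f"{field} eq {value}"

# Safe even for values like O'Brien that break naive interpolation
flt = odata_eq("Author", "O'Brien")
combined = f"{odata_eq('Status', 'Active')} and {odata_eq('Priority', 1)}"
```

Use the result as the `$filter` parameter value in the action above.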

### SharePoint — Get Item (Single Row by ID)

```json
"Get_SP_Item": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "GetItem"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "table": "MyList",
      "id": "@triggerBody()?['ID']"
    }
  }
}
```

Result reference: `@body('Get_SP_Item')?['FieldName']`

> Use `GetItem` (not `GetItems` with a filter) when you already have the ID.
> Re-fetching after a trigger gives you the **current** row state, not the
> snapshot captured at trigger time — important if another process may have
> modified the item since the flow started.

---

### SharePoint — Create Item

```json
"Create_SP_Item": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "PostItem"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "table": "MyList",
      "item/Title": "@variables('myTitle')",
      "item/Status": "Active"
    }
  }
}
```

---

### SharePoint — Update Item

```json
"Update_SP_Item": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "PatchItem"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "table": "MyList",
      "id": "@item()?['ID']",
      "item/Status": "Processed"
    }
  }
}
```

---

### SharePoint — File Upsert (Create or Overwrite in Document Library)

SharePoint's `CreateFile` fails if the file already exists. To upsert (create or overwrite)
without a prior existence check, use `GetFileMetadataByPath` on **both Succeeded and Failed**
from `CreateFile` — if create failed because the file exists, the metadata call still
returns its ID, which `UpdateFile` can then overwrite:

```json
"Create_File": {
  "type": "OpenApiConnection",
  "inputs": {
    "host": { "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
              "connectionName": "<connectionName>", "operationId": "CreateFile" },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "folderPath": "/My Library/Subfolder",
      "name": "@{variables('filename')}",
      "body": "@outputs('Compose_File_Content')"
    }
  }
},
"Get_File_Metadata_By_Path": {
  "type": "OpenApiConnection",
  "runAfter": { "Create_File": ["Succeeded", "Failed"] },
  "inputs": {
    "host": { "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
              "connectionName": "<connectionName>", "operationId": "GetFileMetadataByPath" },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "path": "/My Library/Subfolder/@{variables('filename')}"
    }
  }
},
"Update_File": {
  "type": "OpenApiConnection",
  "runAfter": { "Get_File_Metadata_By_Path": ["Succeeded", "Skipped"] },
  "inputs": {
    "host": { "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
              "connectionName": "<connectionName>", "operationId": "UpdateFile" },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "id": "@outputs('Get_File_Metadata_By_Path')?['body/{Identifier}']",
      "body": "@outputs('Compose_File_Content')"
    }
  }
}
```

> Because `Get_File_Metadata_By_Path` runs after **both** `Succeeded` and `Failed`,
> it executes either way. If `Create_File` succeeds, the metadata call finds the
> just-created file and `Update_File` harmlessly overwrites it; if `Create_File`
> fails (file exists), the metadata call retrieves the existing file's ID and
> `Update_File` overwrites that. Either way you end with the latest content.
> (`Update_File` also accepts `Skipped` so the chain still completes if the
> metadata step is skipped for another reason.)
>
> **Document library system properties** — when iterating a file library result (e.g.
> from `ListFolder` or `GetFilesV2`), use curly-brace property names to access
> SharePoint's built-in file metadata. These are different from list field names:
> ```
> @item()?['{Name}']                  — filename without path (e.g. "report.csv")
> @item()?['{FilenameWithExtension}'] — same as {Name} in most connectors
> @item()?['{Identifier}']            — internal file ID for use in UpdateFile/DeleteFile
> @item()?['{FullPath}']              — full server-relative path
> @item()?['{IsFolder}']              — boolean, true for folder entries
> ```

---

### SharePoint — GetItemChanges Column Gate

When a SharePoint "item modified" trigger fires, it doesn't tell you WHICH
column changed. Use `GetItemChanges` to get per-column change flags, then gate
downstream logic on specific columns:

```json
"Get_Changes": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "GetItemChanges"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "table": "<list-guid>",
      "id": "@triggerBody()?['ID']",
      "since": "@triggerBody()?['Modified']",
      "includeDrafts": false
    }
  }
}
```

Gate on a specific column:

```json
"expression": {
  "and": [{
    "equals": [
      "@body('Get_Changes')?['Column']?['hasChanged']",
      true
    ]
  }]
}
```

> **New-item detection:** On the very first modification (version 1.0),
> `GetItemChanges` may report no prior version. Check
> `@equals(triggerBody()?['OData__UIVersionString'], '1.0')` to detect
> newly created items and skip change-gate logic for those.

---

### SharePoint — REST MERGE via HttpRequest

For cross-list updates or advanced operations not supported by the standard
Update Item connector (e.g., updating a list in a different site), use the
SharePoint REST API via the `HttpRequest` operation:

```json
"Update_Cross_List_Item": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "HttpRequest"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/target-site",
      "parameters/method": "POST",
      "parameters/uri": "/_api/web/lists(guid'<list-guid>')/items(@{variables('ItemId')})",
      "parameters/headers": {
        "Accept": "application/json;odata=nometadata",
        "Content-Type": "application/json;odata=nometadata",
        "X-HTTP-Method": "MERGE",
        "IF-MATCH": "*"
      },
      "parameters/body": "{ \"Title\": \"@{variables('NewTitle')}\", \"Status\": \"@{variables('NewStatus')}\" }"
    }
  }
}
```

> **Key headers:**
> - `X-HTTP-Method: MERGE` — tells SharePoint to do a partial update (PATCH semantics)
> - `IF-MATCH: *` — overwrites regardless of current ETag (no conflict check)
>
> The `HttpRequest` operation reuses the existing SharePoint connection — no extra
> authentication needed. Use this when the standard Update Item connector can't
> reach the target list (different site collection, or you need raw REST control).

---
|
||||
|
||||
### SharePoint — File as JSON Database (Read + Parse)
|
||||
|
||||
Use a SharePoint document library JSON file as a queryable "database" of
|
||||
last-known-state records. A separate process (e.g., Power BI dataflow) maintains
|
||||
the file; the flow downloads and filters it for before/after comparisons.
|
||||
|
||||
```json
|
||||
"Get_File": {
|
||||
"type": "OpenApiConnection",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"host": {
|
||||
"apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
|
||||
"connectionName": "<connectionName>",
|
||||
"operationId": "GetFileContent"
|
||||
},
|
||||
"parameters": {
|
||||
"dataset": "https://mytenant.sharepoint.com/sites/mysite",
|
||||
"id": "%252fShared%2bDocuments%252fdata.json",
|
||||
"inferContentType": false
|
||||
}
|
||||
}
|
||||
},
|
||||
"Parse_JSON_File": {
|
||||
"type": "Compose",
|
||||
"runAfter": { "Get_File": ["Succeeded"] },
|
||||
"inputs": "@json(decodeBase64(body('Get_File')?['$content']))"
|
||||
},
|
||||
"Find_Record": {
|
||||
"type": "Query",
|
||||
"runAfter": { "Parse_JSON_File": ["Succeeded"] },
|
||||
"inputs": {
|
||||
"from": "@outputs('Parse_JSON_File')",
|
||||
"where": "@equals(item()?['id'], variables('RecordId'))"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
> **Decode chain:** `GetFileContent` returns base64-encoded content in
|
||||
> `body(...)?['$content']`. Apply `decodeBase64()` then `json()` to get a
|
||||
> usable array. `Filter Array` then acts as a WHERE clause.
|
||||
>
|
||||
> **When to use:** When you need a lightweight "before" snapshot to detect field
|
||||
> changes from a webhook payload (the "after" state). Simpler than maintaining
|
||||
> a full SharePoint list mirror — works well for up to ~10K records.
|
||||
>
|
||||
> **File path encoding:** In the `id` parameter, SharePoint URL-encodes paths
|
||||
> twice. Spaces become `%2b` (plus sign), slashes become `%252f`.
|
||||
|
||||
---
|
||||
|
||||
## Outlook
|
||||
|
||||
### Outlook — Send Email
|
||||
|
||||
```json
|
||||
"Send_Email": {
|
||||
"type": "OpenApiConnection",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"host": {
|
||||
"apiId": "/providers/Microsoft.PowerApps/apis/shared_office365",
|
||||
"connectionName": "<connectionName>",
|
||||
"operationId": "SendEmailV2"
|
||||
},
|
||||
"parameters": {
|
||||
"emailMessage/To": "recipient@contoso.com",
|
||||
"emailMessage/Subject": "Automated notification",
|
||||
"emailMessage/Body": "<p>@{outputs('Compose_Message')}</p>",
|
||||
"emailMessage/IsHtml": true
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### Outlook — Get Emails (Read Template from Folder)
|
||||
|
||||
```json
|
||||
"Get_Email_Template": {
|
||||
"type": "OpenApiConnection",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"host": {
|
||||
"apiId": "/providers/Microsoft.PowerApps/apis/shared_office365",
|
||||
"connectionName": "<connectionName>",
|
||||
"operationId": "GetEmailsV3"
|
||||
},
|
||||
"parameters": {
|
||||
"folderPath": "Id::<outlook-folder-id>",
|
||||
"fetchOnlyUnread": false,
|
||||
"includeAttachments": false,
|
||||
"top": 1,
|
||||
"importance": "Any",
|
||||
"fetchOnlyWithAttachment": false,
|
||||
"subjectFilter": "My Email Template Subject"
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Access subject and body:
|
||||
```
|
||||
@first(outputs('Get_Email_Template')?['body/value'])?['subject']
|
||||
@first(outputs('Get_Email_Template')?['body/value'])?['body']
|
||||
```
|
||||
|
||||
> **Outlook-as-CMS pattern**: store a template email in a dedicated Outlook folder.
|
||||
> Set `fetchOnlyUnread: false` so the template persists after first use.
|
||||
> Non-technical users can update subject and body by editing that email —
|
||||
> no flow changes required. Pass subject and body directly into `SendEmailV2`.
|
||||
>
|
||||
> To get a folder ID: in Outlook on the web, right-click the folder → open in
|
||||
> new tab — the folder GUID is in the URL. Prefix it with `Id::` in `folderPath`.
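
A minimal sketch of wiring the fetched template into a send action — the action name `Send_From_Template` is illustrative, and it assumes the `Get_Email_Template` action above:

```json
"Send_From_Template": {
  "type": "OpenApiConnection",
  "runAfter": { "Get_Email_Template": ["Succeeded"] },
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_office365",
      "connectionName": "<connectionName>",
      "operationId": "SendEmailV2"
    },
    "parameters": {
      "emailMessage/To": "recipient@contoso.com",
      "emailMessage/Subject": "@first(outputs('Get_Email_Template')?['body/value'])?['subject']",
      "emailMessage/Body": "@first(outputs('Get_Email_Template')?['body/value'])?['body']",
      "emailMessage/IsHtml": true
    }
  }
}
```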

---

## Teams

### Teams — Post Message

```json
"Post_Teams_Message": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_teams",
      "connectionName": "<connectionName>",
      "operationId": "PostMessageToConversation"
    },
    "parameters": {
      "poster": "Flow bot",
      "location": "Channel",
      "body/recipient": {
        "groupId": "<team-id>",
        "channelId": "<channel-id>"
      },
      "body/messageBody": "@outputs('Compose_Message')"
    }
  }
}
```

#### Variant: Group Chat (1:1 or Multi-Person)

To post to a group chat instead of a channel, use `"location": "Group chat"` with
a thread ID as the recipient:

```json
"Post_To_Group_Chat": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_teams",
      "connectionName": "<connectionName>",
      "operationId": "PostMessageToConversation"
    },
    "parameters": {
      "poster": "Flow bot",
      "location": "Group chat",
      "body/recipient": "19:<thread-hash>@thread.v2",
      "body/messageBody": "@outputs('Compose_Message')"
    }
  }
}
```

For 1:1 ("Chat with Flow bot"), use `"location": "Chat with Flow bot"` and set
`body/recipient` to the user's email address.
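
A minimal sketch of the 1:1 variant — the action name is illustrative, and the
parameters mirror the group-chat example above:

```json
"Post_To_User": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_teams",
      "connectionName": "<connectionName>",
      "operationId": "PostMessageToConversation"
    },
    "parameters": {
      "poster": "Flow bot",
      "location": "Chat with Flow bot",
      "body/recipient": "user@contoso.com",
      "body/messageBody": "@outputs('Compose_Message')"
    }
  }
}
```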

> **Active-user gate:** When sending notifications in a loop, check that the recipient's
> Azure AD account is enabled before posting — this avoids failed deliveries to departed
> staff:
>
> ```json
> "Check_User_Active": {
>   "type": "OpenApiConnection",
>   "inputs": {
>     "host": {
>       "apiId": "/providers/Microsoft.PowerApps/apis/shared_office365users",
>       "operationId": "UserProfile_V2"
>     },
>     "parameters": { "id": "@{item()?['Email']}" }
>   }
> }
> ```
>
> Then gate: `@equals(body('Check_User_Active')?['accountEnabled'], true)`

---

## Approvals

### Split Approval (Create → Wait)

The standard "Start and wait for an approval" is a single blocking action.
For more control (e.g., posting the approval link in Teams, or adding a timeout
scope), split it into two actions: `CreateAnApproval` (fire-and-forget) then
`WaitForAnApproval` (webhook pause).

```json
"Create_Approval": {
  "type": "OpenApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_approvals",
      "connectionName": "<connectionName>",
      "operationId": "CreateAnApproval"
    },
    "parameters": {
      "approvalType": "CustomResponse/Result",
      "ApprovalCreationInput/title": "Review: @{variables('ItemTitle')}",
      "ApprovalCreationInput/assignedTo": "approver@contoso.com",
      "ApprovalCreationInput/details": "Please review and select an option.",
      "ApprovalCreationInput/responseOptions": ["Approve", "Reject", "Defer"],
      "ApprovalCreationInput/enableNotifications": true,
      "ApprovalCreationInput/enableReassignment": true
    }
  }
},
"Wait_For_Approval": {
  "type": "OpenApiConnectionWebhook",
  "runAfter": { "Create_Approval": ["Succeeded"] },
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_approvals",
      "connectionName": "<connectionName>",
      "operationId": "WaitForAnApproval"
    },
    "parameters": {
      "approvalName": "@body('Create_Approval')?['name']"
    }
  }
}
```

> **`approvalType` options:**
> - `"Approve/Reject - First to respond"` — binary, first responder wins
> - `"Approve/Reject - Everyone must approve"` — requires all assignees
> - `"CustomResponse/Result"` — define your own response buttons
>
> After `Wait_For_Approval`, read the outcome:
> ```
> @body('Wait_For_Approval')?['outcome'] → "Approve", "Reject", or custom
> @body('Wait_For_Approval')?['responses'][0]?['responder']?['displayName']
> @body('Wait_For_Approval')?['responses'][0]?['comments']
> ```
>
> The split pattern lets you insert actions between create and wait — e.g.,
> posting the approval link to Teams, starting a timeout scope, or logging
> the pending approval to a tracking list.
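
Branching on the outcome can then use a standard Condition — a sketch, with illustrative action names, assuming the `Wait_For_Approval` action above:

```json
"Check_Outcome": {
  "type": "If",
  "runAfter": { "Wait_For_Approval": ["Succeeded"] },
  "expression": {
    "and": [{ "equals": ["@body('Wait_For_Approval')?['outcome']", "Approve"] }]
  },
  "actions": {
    "Handle_Approved": { "type": "Compose", "runAfter": {}, "inputs": "approved" }
  },
  "else": {
    "actions": {
      "Handle_Not_Approved": { "type": "Compose", "runAfter": {}, "inputs": "not approved" }
    }
  }
}
```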
# FlowStudio MCP — Action Patterns: Core
|
||||
|
||||
Variables, control flow, and expression patterns for Power Automate flow definitions.
|
||||
|
||||
> All examples assume `"runAfter"` is set appropriately.
|
||||
> Replace `<connectionName>` with the **key** you used in your `connectionReferences` map
|
||||
> (e.g. `shared_teams`, `shared_office365`) — NOT the connection GUID.
|
||||
|
||||
---
|
||||
|
||||
## Data & Variables
|
||||
|
||||
### Compose (Store a Value)
|
||||
|
||||
```json
|
||||
"Compose_My_Value": {
|
||||
"type": "Compose",
|
||||
"runAfter": {},
|
||||
"inputs": "@variables('myVar')"
|
||||
}
|
||||
```
|
||||
|
||||
Reference: `@outputs('Compose_My_Value')`
|
||||
|
||||
---
|
||||
|
||||
### Initialize Variable
|
||||
|
||||
```json
|
||||
"Init_Counter": {
|
||||
"type": "InitializeVariable",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"variables": [{
|
||||
"name": "counter",
|
||||
"type": "Integer",
|
||||
"value": 0
|
||||
}]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Types: `"Integer"`, `"Float"`, `"Boolean"`, `"String"`, `"Array"`, `"Object"`
|
||||
|
||||
---
|
||||
|
||||
### Set Variable
|
||||
|
||||
```json
|
||||
"Set_Counter": {
|
||||
"type": "SetVariable",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"name": "counter",
|
||||
"value": "@add(variables('counter'), 1)"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### Append to Array Variable
|
||||
|
||||
```json
|
||||
"Collect_Item": {
|
||||
"type": "AppendToArrayVariable",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"name": "resultArray",
|
||||
"value": "@item()"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### Increment Variable
|
||||
|
||||
```json
|
||||
"Increment_Counter": {
|
||||
"type": "IncrementVariable",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"name": "counter",
|
||||
"value": 1
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
> Use `IncrementVariable` (not `SetVariable` with `add()`) for counters inside loops —
|
||||
> it is atomic and avoids expression errors when the variable is used elsewhere in the
|
||||
> same iteration. `value` can be any integer or expression, e.g. `@mul(item()?['Interval'], 60)`
|
||||
> to advance a Unix timestamp cursor by N minutes.
|
||||
|
||||
---
|
||||
|
||||
## Control Flow
|
||||
|
||||
### Condition (If/Else)
|
||||
|
||||
```json
|
||||
"Check_Status": {
|
||||
"type": "If",
|
||||
"runAfter": {},
|
||||
"expression": {
|
||||
"and": [{ "equals": ["@item()?['Status']", "Active"] }]
|
||||
},
|
||||
"actions": {
|
||||
"Handle_Active": {
|
||||
"type": "Compose",
|
||||
"runAfter": {},
|
||||
"inputs": "Active user: @{item()?['Name']}"
|
||||
}
|
||||
},
|
||||
"else": {
|
||||
"actions": {
|
||||
"Handle_Inactive": {
|
||||
"type": "Compose",
|
||||
"runAfter": {},
|
||||
"inputs": "Inactive user"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Comparison operators: `equals`, `not`, `greater`, `greaterOrEquals`, `less`, `lessOrEquals`, `contains`
|
||||
Logical: `and: [...]`, `or: [...]`
|
||||
|
||||
---
|
||||
|
||||
### Switch
|
||||
|
||||
```json
|
||||
"Route_By_Type": {
|
||||
"type": "Switch",
|
||||
"runAfter": {},
|
||||
"expression": "@triggerBody()?['type']",
|
||||
"cases": {
|
||||
"Case_Email": {
|
||||
"case": "email",
|
||||
"actions": { "Process_Email": { "type": "Compose", "runAfter": {}, "inputs": "email" } }
|
||||
},
|
||||
"Case_Teams": {
|
||||
"case": "teams",
|
||||
"actions": { "Process_Teams": { "type": "Compose", "runAfter": {}, "inputs": "teams" } }
|
||||
}
|
||||
},
|
||||
"default": {
|
||||
"actions": { "Unknown_Type": { "type": "Compose", "runAfter": {}, "inputs": "unknown" } }
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### Scope (Grouping / Try-Catch)
|
||||
|
||||
Wrap related actions in a Scope to give them a shared name, collapse them in the
|
||||
designer, and — most importantly — handle their errors as a unit.
|
||||
|
||||
```json
|
||||
"Scope_Get_Customer": {
|
||||
"type": "Scope",
|
||||
"runAfter": {},
|
||||
"actions": {
|
||||
"HTTP_Get_Customer": {
|
||||
"type": "Http",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"method": "GET",
|
||||
"uri": "https://api.example.com/customers/@{variables('customerId')}"
|
||||
}
|
||||
},
|
||||
"Compose_Email": {
|
||||
"type": "Compose",
|
||||
"runAfter": { "HTTP_Get_Customer": ["Succeeded"] },
|
||||
"inputs": "@outputs('HTTP_Get_Customer')?['body/email']"
|
||||
}
|
||||
}
|
||||
},
|
||||
"Handle_Scope_Error": {
|
||||
"type": "Compose",
|
||||
"runAfter": { "Scope_Get_Customer": ["Failed", "TimedOut"] },
|
||||
"inputs": "Scope failed: @{result('Scope_Get_Customer')?[0]?['error']?['message']}"
|
||||
}
|
||||
```
|
||||
|
||||
> Reference scope results: `@result('Scope_Get_Customer')` returns an array of action
|
||||
> outcomes. Use `runAfter: {"MyScope": ["Failed", "TimedOut"]}` on a follow-up action
|
||||
> to create try/catch semantics without a Terminate.
|
||||
|
||||
---
|
||||
|
||||
### Foreach (Sequential)
|
||||
|
||||
```json
|
||||
"Process_Each_Item": {
|
||||
"type": "Foreach",
|
||||
"runAfter": {},
|
||||
"foreach": "@outputs('Get_Items')?['body/value']",
|
||||
"operationOptions": "Sequential",
|
||||
"actions": {
|
||||
"Handle_Item": {
|
||||
"type": "Compose",
|
||||
"runAfter": {},
|
||||
"inputs": "@item()?['Title']"
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
> Always include `"operationOptions": "Sequential"` unless parallel is intentional.
|
||||
|
||||
---
|
||||
|
||||
### Foreach (Parallel with Concurrency Limit)
|
||||
|
||||
```json
|
||||
"Process_Each_Item_Parallel": {
|
||||
"type": "Foreach",
|
||||
"runAfter": {},
|
||||
"foreach": "@body('Get_SP_Items')?['value']",
|
||||
"runtimeConfiguration": {
|
||||
"concurrency": {
|
||||
"repetitions": 20
|
||||
}
|
||||
},
|
||||
"actions": {
|
||||
"HTTP_Upsert": {
|
||||
"type": "Http",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"method": "POST",
|
||||
"uri": "https://api.example.com/contacts/@{item()?['Email']}"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
> Set `repetitions` to control how many items are processed simultaneously.
|
||||
> Practical values: `5–10` for external API calls (respect rate limits),
|
||||
> `20–50` for internal/fast operations.
|
||||
> Omit `runtimeConfiguration.concurrency` entirely for the platform default
|
||||
> (currently 50). Do NOT use `"operationOptions": "Sequential"` and concurrency together.
|
||||
|
||||
---
|
||||
|
||||
### Wait (Delay)
|
||||
|
||||
```json
|
||||
"Delay_10_Minutes": {
|
||||
"type": "Wait",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"interval": {
|
||||
"count": 10,
|
||||
"unit": "Minute"
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Valid `unit` values: `"Second"`, `"Minute"`, `"Hour"`, `"Day"`
|
||||
|
||||
> Use a Delay + re-fetch as a deduplication guard: wait for any competing process
|
||||
> to complete, then re-read the record before acting. This avoids double-processing
|
||||
> when multiple triggers or manual edits can race on the same item.
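
The guard can be sketched as follows — the action names are illustrative, and the SharePoint `GetItem` re-fetch is an assumption (any read-back of the triggering record works):

```json
"Debounce_Delay": {
  "type": "Wait",
  "runAfter": {},
  "inputs": { "interval": { "count": 2, "unit": "Minute" } }
},
"Refetch_Item": {
  "type": "OpenApiConnection",
  "runAfter": { "Debounce_Delay": ["Succeeded"] },
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "GetItem"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "table": "<list-id>",
      "id": "@triggerBody()?['ID']"
    }
  }
}
```

A follow-up Condition can then compare a version or modified-time field from the re-fetch against the trigger payload to confirm no competing edit landed in between.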

---

### Terminate (Success or Failure)

```json
"Terminate_Success": {
  "type": "Terminate",
  "runAfter": {},
  "inputs": {
    "runStatus": "Succeeded"
  }
},
"Terminate_Failure": {
  "type": "Terminate",
  "runAfter": { "Risky_Action": ["Failed"] },
  "inputs": {
    "runStatus": "Failed",
    "runError": {
      "code": "StepFailed",
      "message": "@{outputs('Get_Error_Message')}"
    }
  }
}
```

---

### Do Until (Loop Until Condition)

Repeats a block of actions until an exit condition becomes true.
Use when the number of iterations is not known upfront (e.g. paginating an API,
walking a time range, polling until a status changes).

```json
"Do_Until_Done": {
  "type": "Until",
  "runAfter": {},
  "expression": "@greaterOrEquals(variables('cursor'), variables('endValue'))",
  "limit": {
    "count": 5000,
    "timeout": "PT5H"
  },
  "actions": {
    "Do_Work": {
      "type": "Compose",
      "runAfter": {},
      "inputs": "@variables('cursor')"
    },
    "Advance_Cursor": {
      "type": "IncrementVariable",
      "runAfter": { "Do_Work": ["Succeeded"] },
      "inputs": {
        "name": "cursor",
        "value": 1
      }
    }
  }
}
```

> Always set `limit.count` and `limit.timeout` explicitly — the platform defaults are
> low (60 iterations, 1 hour). For time-range walkers use `limit.count: 5000` and
> `limit.timeout: "PT5H"` (ISO 8601 duration).
>
> The exit condition is evaluated **after** each iteration (do-while semantics), so the
> loop body always runs at least once. Initialise your cursor variable before the loop
> so the condition has a defined value on the first check.

---

### Async Polling with RequestId Correlation

When an API starts a long-running job asynchronously (e.g. Power BI dataset refresh,
report generation, batch export), the trigger call returns a request ID. Capture it
from the **response header**, then poll a status endpoint filtering by that exact ID:

```json
"Start_Job": {
  "type": "Http",
  "inputs": { "method": "POST", "uri": "https://api.example.com/jobs" }
},
"Capture_Request_ID": {
  "type": "Compose",
  "runAfter": { "Start_Job": ["Succeeded"] },
  "inputs": "@outputs('Start_Job')?['headers/X-Request-Id']"
},
"Initialize_Status": {
  "type": "InitializeVariable",
  "inputs": { "variables": [{ "name": "jobStatus", "type": "String", "value": "Running" }] }
},
"Poll_Until_Done": {
  "type": "Until",
  "expression": "@not(equals(variables('jobStatus'), 'Running'))",
  "limit": { "count": 60, "timeout": "PT30M" },
  "actions": {
    "Delay": { "type": "Wait", "inputs": { "interval": { "count": 20, "unit": "Second" } } },
    "Get_History": {
      "type": "Http",
      "runAfter": { "Delay": ["Succeeded"] },
      "inputs": { "method": "GET", "uri": "https://api.example.com/jobs/history" }
    },
    "Filter_This_Job": {
      "type": "Query",
      "runAfter": { "Get_History": ["Succeeded"] },
      "inputs": {
        "from": "@outputs('Get_History')?['body/items']",
        "where": "@equals(item()?['requestId'], outputs('Capture_Request_ID'))"
      }
    },
    "Set_Status": {
      "type": "SetVariable",
      "runAfter": { "Filter_This_Job": ["Succeeded"] },
      "inputs": {
        "name": "jobStatus",
        "value": "@coalesce(first(body('Filter_This_Job'))?['status'], variables('jobStatus'))"
      }
    }
  }
},
"Handle_Failure": {
  "type": "If",
  "runAfter": { "Poll_Until_Done": ["Succeeded"] },
  "expression": { "equals": ["@variables('jobStatus')", "Failed"] },
  "actions": { "Terminate_Failed": { "type": "Terminate", "inputs": { "runStatus": "Failed" } } },
  "else": { "actions": {} }
}
```

Access response headers: `@outputs('Start_Job')?['headers/X-Request-Id']`

> **Status variable initialisation**: set a sentinel value (`"Running"`, `"Unknown"`) before
> the loop. The exit condition tests for any value other than the sentinel, and the
> `coalesce()` in `Set_Status` keeps the sentinel when the filter comes back empty.
> This way an empty poll result (job not yet in history) leaves the variable unchanged
> and the loop continues — it doesn't accidentally exit on null.
>
> **Filter before extracting**: always `Filter Array` the history to your specific
> request ID before calling `first()`. History endpoints return all jobs; without
> filtering, status from a different concurrent job can corrupt your poll.

---

### runAfter Fallback (Failed → Alternative Action)

Route to a fallback action when a primary action fails — without a Condition block.
Simply set `runAfter` on the fallback to accept `["Failed"]` from the primary:

```json
"HTTP_Get_Hi_Res": {
  "type": "Http",
  "runAfter": {},
  "inputs": { "method": "GET", "uri": "https://api.example.com/data?resolution=hi-res" }
},
"HTTP_Get_Low_Res": {
  "type": "Http",
  "runAfter": { "HTTP_Get_Hi_Res": ["Failed"] },
  "inputs": { "method": "GET", "uri": "https://api.example.com/data?resolution=low-res" }
}
```

> Actions that follow can use `runAfter` accepting both `["Succeeded", "Skipped"]` to
> handle either path — see **Fan-In Join Gate** below.

---

### Fan-In Join Gate (Merge Two Mutually Exclusive Branches)

When two branches are mutually exclusive (only one can succeed per run), use a single
downstream action that accepts `["Succeeded", "Skipped"]` from **both** branches.
The gate fires exactly once regardless of which branch ran:

```json
"Increment_Count": {
  "type": "IncrementVariable",
  "runAfter": {
    "Update_Hi_Res_Metadata": ["Succeeded", "Skipped"],
    "Update_Low_Res_Metadata": ["Succeeded", "Skipped"]
  },
  "inputs": { "name": "LoopCount", "value": 1 }
}
```

> This avoids duplicating the downstream action in each branch. The key insight:
> whichever branch was skipped reports `Skipped` — the gate accepts that state and
> fires once. Only works cleanly when the two branches are truly mutually exclusive
> (e.g. one is `runAfter: [...Failed]` of the other).

---

## Expressions

### Common Expression Patterns

```
Null-safe field access:    @item()?['FieldName']
Null guard:                @coalesce(item()?['Name'], 'Unknown')
String format:             @{variables('firstName')} @{variables('lastName')}
Date today:                @utcNow()
Formatted date:            @formatDateTime(utcNow(), 'dd/MM/yyyy')
Add days:                  @addDays(utcNow(), 7)
Array length:              @length(variables('myArray'))
Filter array:              Use the "Filter array" action (no inline filter expression exists in PA)
Union (new wins):          @union(body('New_Data'), outputs('Old_Data'))
Sort:                      @sort(variables('myArray'), 'Date')
Unix timestamp → date:     @formatDateTime(addSeconds('1970-1-1', triggerBody()?['created']), 'yyyy-MM-dd')
Date → Unix milliseconds:  @div(sub(ticks(startOfDay(item()?['Created'])), ticks(formatDateTime('1970-01-01Z','o'))), 10000)
Date → Unix seconds:       @div(sub(ticks(item()?['Start']), ticks('1970-01-01T00:00:00Z')), 10000000)
Unix seconds → datetime:   @addSeconds('1970-01-01T00:00:00Z', int(variables('Unix')))
Coalesce as no-else:       @coalesce(outputs('Optional_Step'), outputs('Default_Step'))
Flow elapsed minutes:      @div(float(sub(ticks(utcNow()), ticks(outputs('Flow_Start')))), 600000000)
HH:mm time string:         @formatDateTime(outputs('Local_Datetime'), 'HH:mm')
Response header:           @outputs('HTTP_Action')?['headers/X-Request-Id']
Array max (by field):      @reverse(sort(body('Select_Items'), 'Date'))[0]
Integer day span:          @int(split(dateDifference(outputs('Start'), outputs('End')), '.')[0])
ISO week number:           @div(add(dayOfYear(addDays(subtractFromTime(date, sub(dayOfWeek(date),1), 'Day'), 3)), 6), 7)
Join errors to string:     @if(equals(length(variables('Errors')),0), null, concat(join(variables('Errors'),', '),' not found.'))
Normalize before compare:  @replace(coalesce(outputs('Value'),''),'_',' ')
Robust non-empty check:    @greater(length(trim(coalesce(string(outputs('Val')), ''))), 0)
```
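
A minimal sketch combining a few of the patterns above in one action — the action and field names are illustrative:

```json
"Compose_Report_Name": {
  "type": "Compose",
  "runAfter": {},
  "inputs": "@concat('report-', coalesce(item()?['Region'], 'unknown'), '-', formatDateTime(utcNow(), 'yyyy-MM-dd'), '.csv')"
}
```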

### Newlines in Expressions

> **`\n` does NOT produce a newline inside Power Automate expressions.** It is
> treated as a literal backslash + `n` and will either appear verbatim or cause
> a validation error.

Use `decodeUriComponent('%0a')` wherever you need a newline character:

```
Newline (LF):  decodeUriComponent('%0a')
CRLF:          decodeUriComponent('%0d%0a')
```

Example — multi-line Teams or email body via `concat()`:
```json
"Compose_Message": {
  "type": "Compose",
  "inputs": "@concat('Hi ', outputs('Get_User')?['body/displayName'], ',', decodeUriComponent('%0a%0a'), 'Your report is ready.', decodeUriComponent('%0a'), '- The Team')"
}
```

Example — `join()` with newline separator:
```json
"Compose_List": {
  "type": "Compose",
  "inputs": "@join(body('Select_Names'), decodeUriComponent('%0a'))"
}
```

> This is the only reliable way to embed newlines in dynamically built strings
> in Power Automate flow definitions (confirmed against Logic Apps runtime).

---

### Sum an array (XPath trick)

Power Automate has no native `sum()` function. Use XPath on XML instead:

```json
"Prepare_For_Sum": {
  "type": "Compose",
  "runAfter": {},
  "inputs": { "root": { "numbers": "@body('Select_Amounts')" } }
},
"Sum": {
  "type": "Compose",
  "runAfter": { "Prepare_For_Sum": ["Succeeded"] },
  "inputs": "@xpath(xml(outputs('Prepare_For_Sum')), 'sum(/root/numbers)')"
}
```

`Select_Amounts` must output a flat array of numbers (use a **Select** action to extract a single numeric field first). The result is a number you can use directly in conditions or calculations.

> This is the only way to aggregate (sum/min/max) an array without a loop in Power Automate.
# FlowStudio MCP — Action Patterns: Data Transforms

Array operations, HTTP calls, parsing, and data transformation patterns.

> All examples assume `"runAfter"` is set appropriately.
> `<connectionName>` is the **key** in `connectionReferences` (e.g. `shared_sharepointonline`), not the GUID.
> The GUID goes in the map value's `connectionName` property.

---

## Array Operations

### Select (Reshape / Project an Array)

Transforms each item in an array, keeping only the columns you need or renaming them.
Avoids carrying large objects through the rest of the flow.

```json
"Select_Needed_Columns": {
  "type": "Select",
  "runAfter": {},
  "inputs": {
    "from": "@outputs('HTTP_Get_Subscriptions')?['body/data']",
    "select": {
      "id": "@item()?['id']",
      "status": "@item()?['status']",
      "trial_end": "@item()?['trial_end']",
      "cancel_at": "@item()?['cancel_at']",
      "interval": "@item()?['plan']?['interval']"
    }
  }
}
```

Result reference: `@body('Select_Needed_Columns')` — returns a direct array of reshaped objects.

> Use Select before looping or filtering to reduce payload size and simplify
> downstream expressions. Works on any array — SP results, HTTP responses, variables.
>
> **Tips:**
> - **Single-to-array coercion:** When an API returns a single object but you need
>   Select (which requires an array), wrap it: `@array(body('Get_Employee')?['data'])`.
>   The output is a 1-element array — access results via `?[0]?['field']`.
> - **Null-normalize optional fields:** Use `@if(empty(item()?['field']), null, item()?['field'])`
>   on every optional field to normalize empty strings, missing properties, and empty
>   objects to explicit `null`. Ensures consistent downstream `@equals(..., null)` checks.
> - **Flatten nested objects:** Project nested properties into flat fields:
>   ```
>   "manager_name": "@if(empty(item()?['manager']?['name']), null, item()?['manager']?['name'])"
>   ```
>   This enables direct field-level comparison with a flat schema from another source.

---

### Filter Array (Query)

Filters an array to items matching a condition. Power Automate has no inline
`filter()` expression, so use the Filter array action — for multi-condition logic
it is also clearer and easier to maintain.

```json
"Filter_Active_Subscriptions": {
  "type": "Query",
  "runAfter": {},
  "inputs": {
    "from": "@body('Select_Needed_Columns')",
    "where": "@and(or(equals(item().status, 'trialing'), equals(item().status, 'active')), equals(item().cancel_at, null))"
  }
}
```

Result reference: `@body('Filter_Active_Subscriptions')` — direct filtered array.

> Tip: run multiple Filter Array actions on the same source array to create
> named buckets (e.g. active, being-canceled, fully-canceled), then use
> `coalesce(first(body('Filter_A')), first(body('Filter_B')), ...)` to pick
> the highest-priority match without any loops.
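
The bucket-priority tip can be sketched as follows — `Filter_Canceling` is an assumed second Filter Array action alongside `Filter_Active_Subscriptions` above, and the action names are illustrative:

```json
"Filter_Canceling": {
  "type": "Query",
  "runAfter": {},
  "inputs": {
    "from": "@body('Select_Needed_Columns')",
    "where": "@not(equals(item().cancel_at, null))"
  }
},
"Pick_Best_Match": {
  "type": "Compose",
  "runAfter": { "Filter_Canceling": ["Succeeded"] },
  "inputs": "@coalesce(first(body('Filter_Active_Subscriptions')), first(body('Filter_Canceling')))"
}
```

`coalesce()` returns the first non-null argument, so the bucket order in the expression is the priority order.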
|
||||
|
||||
---
|
||||
|
||||
### Create CSV Table (Array → CSV String)
|
||||
|
||||
Converts an array of objects into a CSV-formatted string — no connector call, no code.
|
||||
Use after a `Select` or `Filter Array` to export data or pass it to a file-write action.
|
||||
|
||||
```json
|
||||
"Create_CSV": {
|
||||
"type": "Table",
|
||||
"runAfter": {},
|
||||
"inputs": {
|
||||
"from": "@body('Select_Output_Columns')",
|
||||
"format": "CSV"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Result reference: `@body('Create_CSV')` — a plain string with header row + data rows.
|
||||
|
||||
```json
|
||||
// Custom column order / renamed headers:
|
||||
"Create_CSV_Custom": {
|
||||
"type": "Table",
|
||||
"inputs": {
|
||||
"from": "@body('Select_Output_Columns')",
|
||||
"format": "CSV",
|
||||
"columns": [
|
||||
{ "header": "Date", "value": "@item()?['transactionDate']" },
|
||||
{ "header": "Amount", "value": "@item()?['amount']" },
|
||||
{ "header": "Description", "value": "@item()?['description']" }
|
||||
]
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
> Without `columns`, headers are taken from the object property names in the source array.
|
||||
> With `columns`, you control header names and column order explicitly.
|
||||
>
|
||||
> The output is a raw string. Write it to a file with `CreateFile` or `UpdateFile`
|
||||
> (set `body` to `@body('Create_CSV')`), or store in a variable with `SetVariable`.
|
||||
>
|
||||
> If source data came from Power BI's `ExecuteDatasetQuery`, column names will be
|
||||
> wrapped in square brackets (e.g. `[Amount]`). Strip them before writing:
|
||||
> `@replace(replace(body('Create_CSV'),'[',''),']','')`
|
||||
|
||||
---

### range() + Select for Array Generation

`range(0, N)` produces an integer sequence `[0, 1, 2, …, N-1]`. Pipe it through
a Select action to generate date series, index grids, or any computed array
without a loop:

```json
// Generate 14 consecutive dates starting from a base date
"Generate_Date_Series": {
  "type": "Select",
  "inputs": {
    "from": "@range(0, 14)",
    "select": "@addDays(outputs('Base_Date'), item(), 'yyyy-MM-dd')"
  }
}
```

Result: `@body('Generate_Date_Series')` → `["2025-01-06", "2025-01-07", …, "2025-01-19"]`

```json
// Flatten a 2D array (rows × cols) into 1D using arithmetic indexing
"Flatten_Grid": {
  "type": "Select",
  "inputs": {
    "from": "@range(0, mul(length(outputs('Rows')), length(outputs('Cols'))))",
    "select": {
      "row": "@outputs('Rows')[div(item(), length(outputs('Cols')))]",
      "col": "@outputs('Cols')[mod(item(), length(outputs('Cols')))]"
    }
  }
}
```

> `range()` is zero-based. The Cartesian product pattern above uses `div(i, cols)`
> for the row index and `mod(i, cols)` for the column index — equivalent to a
> nested for-loop flattened into a single pass. Useful for generating time-slot ×
> date grids, shift × location assignments, etc.

---
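The div/mod indexing above is easiest to verify outside the flow. A minimal Python sketch of the same arithmetic (the function name `flatten_grid` is mine):

```python
def flatten_grid(rows, cols):
    """Cartesian product via arithmetic indexing, mirroring the
    range() + div()/mod() Select pattern: index i maps to
    (rows[i // len(cols)], cols[i % len(cols)])."""
    n = len(rows) * len(cols)
    return [{"row": rows[i // len(cols)], "col": cols[i % len(cols)]}
            for i in range(n)]

print(flatten_grid(["Mon", "Tue"], ["09:00", "10:00"]))
# [{'row': 'Mon', 'col': '09:00'}, {'row': 'Mon', 'col': '10:00'},
#  {'row': 'Tue', 'col': '09:00'}, {'row': 'Tue', 'col': '10:00'}]
```

Every row pairs with every column exactly once, in row-major order — the same ordering the Select action produces.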

### Dynamic Dictionary via json(concat(join()))

When you need O(1) key→value lookups at runtime and Power Automate has no native
dictionary type, build one from an array using Select + join + json:

```json
"Build_Key_Value_Pairs": {
  "type": "Select",
  "inputs": {
    "from": "@body('Get_Lookup_Items')?['value']",
    "select": "@concat('\"', item()?['Key'], '\":\"', item()?['Value'], '\"')"
  }
},
"Assemble_Dictionary": {
  "type": "Compose",
  "inputs": "@json(concat('{', join(body('Build_Key_Value_Pairs'), ','), '}'))"
}
```

Lookup: `@outputs('Assemble_Dictionary')?['myKey']`

```json
// Practical example: date → rate-code lookup for business rules
"Build_Holiday_Rates": {
  "type": "Select",
  "inputs": {
    "from": "@body('Get_Holidays')?['value']",
    "select": "@concat('\"', formatDateTime(item()?['Date'], 'yyyy-MM-dd'), '\":\"', item()?['RateCode'], '\"')"
  }
},
"Holiday_Dict": {
  "type": "Compose",
  "inputs": "@json(concat('{', join(body('Build_Holiday_Rates'), ','), '}'))"
}
```

Then inside a loop: `@coalesce(outputs('Holiday_Dict')?[item()?['Date']], 'Standard')`

> The `json(concat('{', join(...), '}'))` pattern works for string values. For numeric
> or boolean values, omit the inner escaped quotes around the value portion.
> Keys must be unique — duplicate keys silently overwrite earlier ones.
> This replaces deeply nested `if(equals(key,'A'),'X', if(equals(key,'B'),'Y', ...))` chains.

---
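The string-assembly trick is the same in any language: format each `"key":"value"` fragment, join with commas, wrap in braces, parse. A Python sketch (the helper name `assemble_dictionary` is mine):

```python
import json

def assemble_dictionary(items, key_field, value_field):
    """Mirror json(concat('{', join(pairs, ','), '}')): build each
    '"key":"value"' fragment, join, wrap in braces, then parse."""
    pairs = ['"{}":"{}"'.format(i[key_field], i[value_field]) for i in items]
    return json.loads("{" + ",".join(pairs) + "}")

holidays = [{"Date": "2025-12-25", "RateCode": "PH2.5"}]
d = assemble_dictionary(holidays, "Date", "RateCode")
print(d.get("2025-12-25", "Standard"))  # PH2.5
print(d.get("2025-12-26", "Standard"))  # Standard
```

`dict.get(key, default)` plays the role of `coalesce(dict?[key], 'Standard')` in the flow expression.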

### union() for Changed-Field Detection

When you need to find records where *any* of several fields has changed, run one
`Filter Array` per field and `union()` the results. This avoids a complex
multi-condition filter and produces a clean deduplicated set:

```json
"Filter_Name_Changed": {
  "type": "Query",
  "inputs": {
    "from": "@body('Existing_Records')",
    "where": "@not(equals(item()?['name'], item()?['dest_name']))"
  }
},
"Filter_Status_Changed": {
  "type": "Query",
  "inputs": {
    "from": "@body('Existing_Records')",
    "where": "@not(equals(item()?['status'], item()?['dest_status']))"
  }
},
"All_Changed": {
  "type": "Compose",
  "inputs": "@union(body('Filter_Name_Changed'), body('Filter_Status_Changed'))"
}
```

Reference: `@outputs('All_Changed')` — deduplicated array of rows where anything changed.

> `union()` deduplicates by object identity, so a row that changed in both fields
> appears once. Add more `Filter_*_Changed` inputs to `union()` as needed:
> `@union(body('F1'), body('F2'), body('F3'))`

---
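A sketch of the filter-per-field plus deduplicating union, assuming rows are flat dicts of hashable values (function names `filter_changed` / `union` are mine):

```python
def filter_changed(records, field):
    """One Filter Array per tracked field."""
    return [r for r in records if r[field] != r["dest_" + field]]

def union(*arrays):
    """Deduplicate by row content, keeping first-seen order —
    a row that changed in several fields still appears once."""
    seen, out = set(), []
    for arr in arrays:
        for r in arr:
            key = tuple(sorted(r.items()))  # assumes hashable values
            if key not in seen:
                seen.add(key)
                out.append(r)
    return out

rows = [
    {"name": "A", "dest_name": "A", "status": "open", "dest_status": "closed"},
    {"name": "B", "dest_name": "X", "status": "open", "dest_status": "open"},
]
all_changed = union(filter_changed(rows, "name"), filter_changed(rows, "status"))
print(len(all_changed))  # 2
```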

### File-Content Change Gate

Before running expensive processing on a file or blob, compare its current content
to a stored baseline. Skip entirely if nothing has changed — makes sync flows
idempotent and safe to re-run or schedule aggressively.

```json
"Get_File_From_Source": { ... },
"Get_Stored_Baseline": { ... },
"Condition_File_Changed": {
  "type": "If",
  "expression": {
    "not": {
      "equals": [
        "@base64(body('Get_File_From_Source'))",
        "@body('Get_Stored_Baseline')"
      ]
    }
  },
  "actions": {
    "Update_Baseline": { "...": "overwrite stored copy with new content" },
    "Process_File": { "...": "all expensive work goes here" }
  },
  "else": { "actions": {} }
}
```

> Store the baseline as a file in SharePoint or blob storage — `base64()`-encode the
> live content before comparing so binary and text files are handled uniformly.
> Write the new baseline **before** processing so a re-run after a partial failure
> does not re-process the same file again.

---

### Set-Join for Sync (Update Detection without Nested Loops)

When syncing a source collection into a destination (e.g. API response → SharePoint list,
CSV → database), avoid nested `Apply to each` loops to find changed records.
Instead, **project flat key arrays** and use `contains()` to perform set operations —
zero nested loops, and the final loop only touches changed items.

**Full insert/update/delete sync pattern:**

```json
// Step 1 — Project a flat key array from the DESTINATION (e.g. SharePoint)
"Select_Dest_Keys": {
  "type": "Select",
  "inputs": {
    "from": "@outputs('Get_Dest_Items')?['body/value']",
    "select": "@item()?['Title']"
  }
}
// → ["KEY1", "KEY2", "KEY3", ...]

// Step 2 — INSERT: source rows whose key is NOT in destination
"Filter_To_Insert": {
  "type": "Query",
  "inputs": {
    "from": "@body('Source_Array')",
    "where": "@not(contains(body('Select_Dest_Keys'), item()?['key']))"
  }
}
// → Apply to each Filter_To_Insert → CreateItem

// Step 3 — INNER JOIN: source rows that exist in destination
"Filter_Already_Exists": {
  "type": "Query",
  "inputs": {
    "from": "@body('Source_Array')",
    "where": "@contains(body('Select_Dest_Keys'), item()?['key'])"
  }
}

// Step 4 — UPDATE: one Filter per tracked field, then union them
"Filter_Field1_Changed": {
  "type": "Query",
  "inputs": {
    "from": "@body('Filter_Already_Exists')",
    "where": "@not(equals(item()?['field1'], item()?['dest_field1']))"
  }
}
"Filter_Field2_Changed": {
  "type": "Query",
  "inputs": {
    "from": "@body('Filter_Already_Exists')",
    "where": "@not(equals(item()?['field2'], item()?['dest_field2']))"
  }
}
"Union_Changed": {
  "type": "Compose",
  "inputs": "@union(body('Filter_Field1_Changed'), body('Filter_Field2_Changed'))"
}
// → rows where ANY tracked field differs

// Step 5 — Resolve destination IDs for changed rows (no nested loop)
"Select_Changed_Keys": {
  "type": "Select",
  "inputs": { "from": "@outputs('Union_Changed')", "select": "@item()?['key']" }
}
"Filter_Dest_Items_To_Update": {
  "type": "Query",
  "inputs": {
    "from": "@outputs('Get_Dest_Items')?['body/value']",
    "where": "@contains(body('Select_Changed_Keys'), item()?['Title'])"
  }
}

// Step 6 — Single loop over changed items only
"Apply_to_each_Update": {
  "type": "Foreach",
  "foreach": "@body('Filter_Dest_Items_To_Update')",
  "actions": {
    "Get_Source_Row": {
      "type": "Query",
      "inputs": {
        "from": "@outputs('Union_Changed')",
        "where": "@equals(item()?['key'], items('Apply_to_each_Update')?['Title'])"
      }
    },
    "Update_Item": {
      "...": "...",
      "id": "@items('Apply_to_each_Update')?['ID']",
      "item/field1": "@first(body('Get_Source_Row'))?['field1']"
    }
  }
}

// Step 7 — DELETE: destination keys NOT in source
"Select_Source_Keys": {
  "type": "Select",
  "inputs": { "from": "@body('Source_Array')", "select": "@item()?['key']" }
}
"Filter_To_Delete": {
  "type": "Query",
  "inputs": {
    "from": "@outputs('Get_Dest_Items')?['body/value']",
    "where": "@not(contains(body('Select_Source_Keys'), item()?['Title']))"
  }
}
// → Apply to each Filter_To_Delete → DeleteItem
```

> **Why this beats nested loops**: the naive approach (for each dest item, scan source)
> is O(n × m) and hits Power Automate's 100k-action run limit fast on large lists.
> This pattern is O(n + m): one pass to build key arrays, one pass per filter.
> The update loop in Step 6 only iterates *changed* records — often a tiny fraction
> of the full collection. Run Steps 2/4/7 in **parallel Scopes** for further speed.

---
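The whole seven-step plan reduces to a few set operations. A compact Python model of the same logic, assuming flat dicts with a shared key field (the helper name `plan_sync` is mine):

```python
def plan_sync(source, dest, key, fields):
    """O(n + m) insert/update/delete planning via flat key sets,
    mirroring the Select + contains() pattern — no nested loops."""
    dest_by_key = {d[key]: d for d in dest}          # Step 1
    src_keys = {s[key] for s in source}
    inserts = [s for s in source if s[key] not in dest_by_key]   # Step 2
    updates = [s for s in source                                  # Steps 3-4
               if s[key] in dest_by_key
               and any(s[f] != dest_by_key[s[key]][f] for f in fields)]
    deletes = [d for d in dest if d[key] not in src_keys]         # Step 7
    return inserts, updates, deletes

source = [{"key": "A", "field1": 1}, {"key": "B", "field1": 2}]
dest = [{"key": "B", "field1": 9}, {"key": "C", "field1": 3}]
ins, upd, dele = plan_sync(source, dest, "key", ["field1"])
print([r["key"] for r in ins], [r["key"] for r in upd], [r["key"] for r in dele])
# ['A'] ['B'] ['C']
```

Python's dict/set lookups are the in-memory equivalent of the flow's `contains()` over projected key arrays; both replace the O(n × m) nested scan with O(n + m) passes.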

### First-or-Null Single-Row Lookup

Use `first()` on the result array to extract one record without a loop.
Then null-check the output to guard downstream actions.

```json
"Get_First_Match": {
  "type": "Compose",
  "runAfter": { "Get_SP_Items": ["Succeeded"] },
  "inputs": "@first(outputs('Get_SP_Items')?['body/value'])"
}
```

In a Condition, test for no-match with the **`@null` literal** (not `empty()`):

```json
"Condition": {
  "type": "If",
  "expression": {
    "not": {
      "equals": [
        "@outputs('Get_First_Match')",
        "@null"
      ]
    }
  }
}
```

Access fields on the matched row: `@outputs('Get_First_Match')?['FieldName']`

> Use this instead of `Apply to each` when you only need one matching record.
> `first()` on an empty array returns `null`; `empty()` is for arrays/strings,
> not scalars — using it on a `first()` result causes a runtime error.

---
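The same first-or-null idiom in Python, for comparison — `next(iter(...), None)` gives the first element or `None`, and the guard compares against `None` rather than testing emptiness (the helper name `first_or_none` is mine):

```python
def first_or_none(items):
    """first() semantics: first element of the array, or None if empty."""
    return next(iter(items), None)

match = first_or_none([{"Title": "Row1"}])
if match is not None:           # null-check the scalar, not the array
    print(match["Title"])       # Row1
print(first_or_none([]))        # None
```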

## HTTP & Parsing

### HTTP Action (External API)

```json
"Call_External_API": {
  "type": "Http",
  "runAfter": {},
  "inputs": {
    "method": "POST",
    "uri": "https://api.example.com/endpoint",
    "headers": {
      "Content-Type": "application/json",
      "Authorization": "Bearer @{variables('apiToken')}"
    },
    "body": {
      "data": "@outputs('Compose_Payload')"
    },
    "retryPolicy": {
      "type": "Fixed",
      "count": 3,
      "interval": "PT10S"
    }
  }
}
```

Response reference: `@outputs('Call_External_API')?['body']`

#### Variant: ActiveDirectoryOAuth (Service-to-Service)

For calling APIs that require Azure AD client-credentials (e.g., Microsoft Graph),
use in-line OAuth instead of a Bearer token variable:

```json
"Call_Graph_API": {
  "type": "Http",
  "runAfter": {},
  "inputs": {
    "method": "GET",
    "uri": "https://graph.microsoft.com/v1.0/users?$search=\"employeeId:@{variables('Code')}\"&$select=id,displayName",
    "headers": {
      "Content-Type": "application/json",
      "ConsistencyLevel": "eventual"
    },
    "authentication": {
      "type": "ActiveDirectoryOAuth",
      "authority": "https://login.microsoftonline.com",
      "tenant": "<tenant-id>",
      "audience": "https://graph.microsoft.com",
      "clientId": "<app-registration-id>",
      "secret": "@parameters('graphClientSecret')"
    }
  }
}
```

> **When to use:** Calling Microsoft Graph, Azure Resource Manager, or any
> Azure AD-protected API from a flow without a premium connector.
>
> The `authentication` block handles the entire OAuth client-credentials flow
> transparently — no manual token acquisition step needed.
>
> `ConsistencyLevel: eventual` is required for Graph `$search` queries.
> Without it, `$search` returns 400.
>
> For PATCH/PUT writes, the same `authentication` block works — just change
> `method` and add a `body`.
>
> ⚠️ **Never hardcode `secret` inline.** Use `@parameters('graphClientSecret')`
> and declare it in the flow's `parameters` block (type `securestring`). This
> prevents the secret from appearing in run history or being readable via
> `get_live_flow`. Declare the parameter like:
>
> ```json
> "parameters": {
>   "graphClientSecret": { "type": "securestring", "defaultValue": "" }
> }
> ```
>
> Then pass the real value via the flow's connections or environment variables
> — never commit it to source control.

---

### HTTP Response (Return to Caller)

Used in HTTP-triggered flows to send a structured reply back to the caller.
Must run before the flow times out (default 2 min for synchronous HTTP).

```json
"Response": {
  "type": "Response",
  "runAfter": {},
  "inputs": {
    "statusCode": 200,
    "headers": {
      "Content-Type": "application/json"
    },
    "body": {
      "status": "success",
      "message": "@{outputs('Compose_Result')}"
    }
  }
}
```

> **PowerApps / low-code caller pattern**: always return `statusCode: 200` with a
> `status` field in the body (`"success"` / `"error"`). PowerApps HTTP actions
> do not handle non-2xx responses gracefully — the caller should inspect
> `body.status` rather than the HTTP status code.
>
> Use multiple Response actions — one per branch — so each path returns
> an appropriate message. Only one will execute per run.

---

### Child Flow Call (Parent→Child via HTTP POST)

Power Automate supports parent→child orchestration by calling a child flow's
HTTP trigger URL directly. The parent sends an HTTP POST and blocks until the
child returns a `Response` action. The child flow uses a `manual` (Request) trigger.

```json
// PARENT — call child flow and wait for its response
"Call_Child_Flow": {
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "https://prod-XX.australiasoutheast.logic.azure.com:443/workflows/<workflowId>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<SAS>",
    "headers": { "Content-Type": "application/json" },
    "body": {
      "ID": "@triggerBody()?['ID']",
      "WeekEnd": "@triggerBody()?['WeekEnd']",
      "Payload": "@variables('dataArray')"
    },
    "retryPolicy": { "type": "none" }
  },
  "operationOptions": "DisableAsyncPattern",
  "runtimeConfiguration": {
    "contentTransfer": { "transferMode": "Chunked" }
  },
  "limit": { "timeout": "PT2H" }
}
```

```json
// CHILD — manual trigger receives the JSON body
// (trigger definition)
"manual": {
  "type": "Request",
  "kind": "Http",
  "inputs": {
    "schema": {
      "type": "object",
      "properties": {
        "ID": { "type": "string" },
        "WeekEnd": { "type": "string" },
        "Payload": { "type": "array" }
      }
    }
  }
}

// CHILD — return result to parent
"Response_Success": {
  "type": "Response",
  "inputs": {
    "statusCode": 200,
    "headers": { "Content-Type": "application/json" },
    "body": { "Result": "Success", "Count": "@length(variables('processed'))" }
  }
}
```

> **`retryPolicy: none`** — critical on the parent's HTTP call. Without it, a child
> flow timeout triggers retries, spawning duplicate child runs.
>
> **`DisableAsyncPattern`** — prevents the parent from treating a 202 Accepted as
> completion. The parent will block until the child sends its `Response`.
>
> **`transferMode: Chunked`** — enable when passing large arrays (>100 KB) to the child;
> avoids request-size limits.
>
> **`limit.timeout: PT2H`** — raise the default 2-minute HTTP timeout for long-running
> children. Max is PT24H.
>
> The child flow's trigger URL contains a SAS token (`sig=...`) that authenticates
> the call. Copy it from the child flow's trigger properties panel. The URL changes
> if the trigger is deleted and re-created.

---

### Parse JSON

```json
"Parse_Response": {
  "type": "ParseJson",
  "runAfter": {},
  "inputs": {
    "content": "@outputs('Call_External_API')?['body']",
    "schema": {
      "type": "object",
      "properties": {
        "id": { "type": "integer" },
        "name": { "type": "string" },
        "items": {
          "type": "array",
          "items": { "type": "object" }
        }
      }
    }
  }
}
```

Access parsed values: `@body('Parse_Response')?['name']`

---

### Manual CSV → JSON (No Premium Action)

Parse a raw CSV string into an array of objects using only built-in expressions.
Avoids the premium "Parse CSV" connector action.

```json
"Delimiter": {
  "type": "Compose",
  "inputs": ","
},
"Strip_Quotes": {
  "type": "Compose",
  "inputs": "@replace(body('Get_File_Content'), '\"', '')"
},
"Detect_Line_Ending": {
  "type": "Compose",
  "inputs": "@if(equals(indexOf(outputs('Strip_Quotes'), decodeUriComponent('%0D%0A')), -1), if(equals(indexOf(outputs('Strip_Quotes'), decodeUriComponent('%0A')), -1), decodeUriComponent('%0D'), decodeUriComponent('%0A')), decodeUriComponent('%0D%0A'))"
},
"Headers": {
  "type": "Compose",
  "inputs": "@split(first(split(outputs('Strip_Quotes'), outputs('Detect_Line_Ending'))), outputs('Delimiter'))"
},
"Data_Rows": {
  "type": "Compose",
  "inputs": "@skip(split(outputs('Strip_Quotes'), outputs('Detect_Line_Ending')), 1)"
},
"Select_CSV_Body": {
  "type": "Select",
  "inputs": {
    "from": "@outputs('Data_Rows')",
    "select": {
      "@{outputs('Headers')[0]}": "@split(item(), outputs('Delimiter'))[0]",
      "@{outputs('Headers')[1]}": "@split(item(), outputs('Delimiter'))[1]",
      "@{outputs('Headers')[2]}": "@split(item(), outputs('Delimiter'))[2]"
    }
  }
},
"Filter_Empty_Rows": {
  "type": "Query",
  "inputs": {
    "from": "@body('Select_CSV_Body')",
    "where": "@not(equals(item()?[outputs('Headers')[0]], null))"
  }
}
```

Result: `@body('Filter_Empty_Rows')` — array of objects with header names as keys.

> **`Detect_Line_Ending`** handles CRLF (Windows), LF (Unix), and CR (old Mac) automatically
> using `indexOf()` with `decodeUriComponent('%0D%0A' / '%0A' / '%0D')`.
>
> **Dynamic key names in `Select`**: `@{outputs('Headers')[0]}` as a JSON key in a
> `Select` shape sets the output property name at runtime from the header row —
> this works as long as the expression is in `@{...}` interpolation syntax.
>
> **Columns with embedded commas**: if field values can contain the delimiter,
> use `length(split(row, ','))` in a Switch to detect the column count and manually
> reassemble the split fragments: `@concat(split(item(),',')[1],',',split(item(),',')[2])`

---
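The expression pipeline above is equivalent to this Python sketch: prefer CRLF, then LF, then CR when detecting line endings, strip quotes, split the header row, zip each data row against it, and drop empty rows (function names are mine; like the flow version, this does not handle delimiters embedded inside quoted fields):

```python
def detect_line_ending(text):
    """Same cascade as Detect_Line_Ending: CRLF, then LF, then CR."""
    if "\r\n" in text:
        return "\r\n"
    if "\n" in text:
        return "\n"
    return "\r"

def parse_csv(text, delimiter=","):
    text = text.replace('"', "")                  # Strip_Quotes
    lines = text.split(detect_line_ending(text))
    headers = lines[0].split(delimiter)           # Headers
    rows = []
    for line in lines[1:]:                        # Data_Rows + Select
        if not line:                              # Filter_Empty_Rows
            continue
        rows.append(dict(zip(headers, line.split(delimiter))))
    return rows

print(parse_csv("a,b\r\n1,2\r\n"))
# [{'a': '1', 'b': '2'}]
```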

### ConvertTimeZone (Built-in, No Connector)

Converts a timestamp between timezones with no API call or connector licence cost.
Format string `"g"` produces short locale date+time (`M/d/yyyy h:mm tt`).

```json
"Convert_to_Local_Time": {
  "type": "Expression",
  "kind": "ConvertTimeZone",
  "runAfter": {},
  "inputs": {
    "baseTime": "@{outputs('UTC_Timestamp')}",
    "sourceTimeZone": "UTC",
    "destinationTimeZone": "Taipei Standard Time",
    "formatString": "g"
  }
}
```

Result reference: `@body('Convert_to_Local_Time')` — **not** `outputs()`, unlike most actions.

Common `formatString` values: `"g"` (short), `"f"` (full), `"yyyy-MM-dd"`, `"HH:mm"`

Common timezone strings: `"UTC"`, `"AUS Eastern Standard Time"`, `"Taipei Standard Time"`,
`"Singapore Standard Time"`, `"GMT Standard Time"`

> This is `type: Expression, kind: ConvertTimeZone` — a built-in Logic Apps action,
> not a connector. No connection reference needed. Reference the output via
> `body()` (not `outputs()`), otherwise the expression returns null.
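For comparison, the same conversion in Python — note that the action takes Windows timezone identifiers ("Taipei Standard Time") while Python's `zoneinfo` takes IANA keys ("Asia/Taipei"):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Convert a UTC timestamp to Taipei local time, then format it.
# Windows ID "Taipei Standard Time" corresponds to IANA "Asia/Taipei".
utc_ts = datetime(2026, 1, 1, 8, 0, tzinfo=ZoneInfo("UTC"))
local = utc_ts.astimezone(ZoneInfo("Asia/Taipei"))
print(local.strftime("%Y-%m-%d %H:%M"))  # 2026-01-01 16:00  (UTC+8, no DST)
```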
@@ -0,0 +1,108 @@

# Common Build Patterns

Complete flow definition templates ready to copy and customize.

---

## Pattern: Recurrence + SharePoint list read + Teams notification

```json
{
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2026-01-01T08:00:00Z",
        "timeZone": "AUS Eastern Standard Time"
      }
    }
  },
  "actions": {
    "Get_SP_Items": {
      "type": "OpenApiConnection",
      "runAfter": {},
      "inputs": {
        "host": {
          "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
          "connectionName": "shared_sharepointonline",
          "operationId": "GetItems"
        },
        "parameters": {
          "dataset": "https://mytenant.sharepoint.com/sites/mysite",
          "table": "MyList",
          "$filter": "Status eq 'Active'",
          "$top": 500
        }
      }
    },
    "Apply_To_Each": {
      "type": "Foreach",
      "runAfter": { "Get_SP_Items": ["Succeeded"] },
      "foreach": "@outputs('Get_SP_Items')?['body/value']",
      "actions": {
        "Post_Teams_Message": {
          "type": "OpenApiConnection",
          "runAfter": {},
          "inputs": {
            "host": {
              "apiId": "/providers/Microsoft.PowerApps/apis/shared_teams",
              "connectionName": "shared_teams",
              "operationId": "PostMessageToConversation"
            },
            "parameters": {
              "poster": "Flow bot",
              "location": "Channel",
              "body/recipient": {
                "groupId": "<team-id>",
                "channelId": "<channel-id>"
              },
              "body/messageBody": "Item: @{items('Apply_To_Each')?['Title']}"
            }
          }
        }
      },
      "operationOptions": "Sequential"
    }
  }
}
```

---

## Pattern: HTTP trigger (webhook / Power App call)

```json
{
  "triggers": {
    "manual": {
      "type": "Request",
      "kind": "Http",
      "inputs": {
        "schema": {
          "type": "object",
          "properties": {
            "name": { "type": "string" },
            "value": { "type": "number" }
          }
        }
      }
    }
  },
  "actions": {
    "Compose_Response": {
      "type": "Compose",
      "runAfter": {},
      "inputs": "Received: @{triggerBody()?['name']} = @{triggerBody()?['value']}"
    },
    "Response": {
      "type": "Response",
      "runAfter": { "Compose_Response": ["Succeeded"] },
      "inputs": {
        "statusCode": 200,
        "body": { "status": "ok", "message": "@{outputs('Compose_Response')}" }
      }
    }
  }
}
```

Access body values: `@triggerBody()?['name']`
225
skills/flowstudio-power-automate-build/references/flow-schema.md
Normal file
@@ -0,0 +1,225 @@

# FlowStudio MCP — Flow Definition Schema

The full JSON structure expected by `update_live_flow` (and returned by `get_live_flow`).

---

## Top-Level Shape

```json
{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "$connections": {
      "defaultValue": {},
      "type": "Object"
    }
  },
  "triggers": {
    "<TriggerName>": { ... }
  },
  "actions": {
    "<ActionName>": { ... }
  },
  "outputs": {}
}
```

---

## `triggers`

Exactly one trigger per flow definition. The key name is arbitrary, but
conventional names are used (e.g. `Recurrence`, `manual`, `When_a_new_email_arrives`).

See [trigger-types.md](trigger-types.md) for all trigger templates.

---

## `actions`

Dictionary of action definitions keyed by unique action name.
Key names may not contain spaces — use underscores.

Each action must include:
- `type` — action type identifier
- `runAfter` — map of upstream action names → status conditions array
- `inputs` — action-specific input configuration

See [action-patterns-core.md](action-patterns-core.md), [action-patterns-data.md](action-patterns-data.md),
and [action-patterns-connectors.md](action-patterns-connectors.md) for templates.

### Optional Action Properties

Beyond the required `type`, `runAfter`, and `inputs`, actions can include:

| Property | Purpose |
|---|---|
| `runtimeConfiguration` | Pagination, concurrency, secure data, chunked transfer |
| `operationOptions` | `"Sequential"` for Foreach, `"DisableAsyncPattern"` for HTTP |
| `limit` | Timeout override (e.g. `{"timeout": "PT2H"}`) |

#### `runtimeConfiguration` Variants

**Pagination** (SharePoint Get Items with large lists):
```json
"runtimeConfiguration": {
  "paginationPolicy": {
    "minimumItemCount": 5000
  }
}
```
> Without this, Get Items silently caps at 256 results. Set `minimumItemCount`
> to the maximum rows you expect. Required for any SharePoint list over 256 items.

**Concurrency** (parallel Foreach):
```json
"runtimeConfiguration": {
  "concurrency": {
    "repetitions": 20
  }
}
```

**Secure inputs/outputs** (mask values in run history):
```json
"runtimeConfiguration": {
  "secureData": {
    "properties": ["inputs", "outputs"]
  }
}
```
> Use on actions that handle credentials, tokens, or PII. Masked values show
> as `"<redacted>"` in the flow run history UI and API responses.

**Chunked transfer** (large HTTP payloads):
```json
"runtimeConfiguration": {
  "contentTransfer": {
    "transferMode": "Chunked"
  }
}
```
> Enable on HTTP actions sending or receiving bodies >100 KB (e.g. parent→child
> flow calls with large arrays).

---

## `runAfter` Rules

The first action in a branch has `"runAfter": {}` (empty — runs after the trigger).

Subsequent actions declare their dependency:

```json
"My_Action": {
  "runAfter": {
    "Previous_Action": ["Succeeded"]
  }
}
```

Multiple upstream dependencies:
```json
"runAfter": {
  "Action_A": ["Succeeded"],
  "Action_B": ["Succeeded", "Skipped"]
}
```

Error-handling action (runs when upstream failed):
```json
"Log_Error": {
  "runAfter": {
    "Risky_Action": ["Failed"]
  }
}
```

---

## `parameters` (Flow-Level Input Parameters)

Optional. Define reusable values at the flow level:

```json
"parameters": {
  "listName": {
    "type": "string",
    "defaultValue": "MyList"
  },
  "maxItems": {
    "type": "integer",
    "defaultValue": 100
  }
}
```

Reference: `@parameters('listName')` in expression strings.

---

## `outputs`

Rarely used in cloud flows. Leave as `{}` unless the flow is called
as a child flow and needs to return values.

For child flows that return data:

```json
"outputs": {
  "resultData": {
    "type": "object",
    "value": "@outputs('Compose_Result')"
  }
}
```

---

## Scoped Actions (Inside Scope Block)

Actions that need to be grouped for error handling or clarity:

```json
"Scope_Main_Process": {
  "type": "Scope",
  "runAfter": {},
  "actions": {
    "Step_One": { ... },
    "Step_Two": { "runAfter": { "Step_One": ["Succeeded"] }, ... }
  }
}
```

---

## Full Minimal Example

```json
{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Week",
        "interval": 1,
        "schedule": { "weekDays": ["Monday"] },
        "startTime": "2026-01-05T09:00:00Z",
        "timeZone": "AUS Eastern Standard Time"
      }
    }
  },
  "actions": {
    "Compose_Greeting": {
      "type": "Compose",
      "runAfter": {},
      "inputs": "Good Monday!"
    }
  },
  "outputs": {}
}
```
@@ -0,0 +1,211 @@
|
||||
# FlowStudio MCP — Trigger Types
|
||||
|
||||
Copy-paste trigger definitions for Power Automate flow definitions.
|
||||
|
||||
---
|
||||
|
||||
## Recurrence
|
||||
|
||||
Run on a schedule.
|
||||
|
||||
```json
|
||||
"Recurrence": {
|
||||
"type": "Recurrence",
|
||||
"recurrence": {
|
||||
"frequency": "Day",
|
||||
"interval": 1,
|
||||
"startTime": "2026-01-01T08:00:00Z",
|
||||
"timeZone": "AUS Eastern Standard Time"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Weekly on specific days:
|
||||
```json
|
||||
"Recurrence": {
|
||||
"type": "Recurrence",
|
||||
"recurrence": {
|
||||
"frequency": "Week",
|
||||
"interval": 1,
|
||||
"schedule": {
|
||||
"weekDays": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"]
|
||||
},
|
||||
"startTime": "2026-01-05T09:00:00Z",
|
||||
"timeZone": "AUS Eastern Standard Time"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Common `timeZone` values:
|
||||
- `"AUS Eastern Standard Time"` — Sydney/Melbourne (UTC+10/+11)
|
||||
- `"UTC"` — Universal time
|
||||
- `"E. Australia Standard Time"` — Brisbane (UTC+10 no DST)
|
||||
- `"New Zealand Standard Time"` — Auckland (UTC+12/+13)
|
||||
- `"Pacific Standard Time"` — Los Angeles (UTC-8/-7)
|
||||
- `"GMT Standard Time"` — London (UTC+0/+1)
|
||||
|
||||
---

## Manual (HTTP Request / Power Apps)

Receive an HTTP POST with a JSON body.

```json
"manual": {
  "type": "Request",
  "kind": "Http",
  "inputs": {
    "schema": {
      "type": "object",
      "properties": {
        "name": { "type": "string" },
        "value": { "type": "integer" }
      },
      "required": ["name"]
    }
  }
}
```

Access values: `@triggerBody()?['name']`

Trigger URL available after saving: `@listCallbackUrl()`
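
To reply to the caller synchronously, pair this trigger with a `Response` action. A minimal sketch (standard `Response` inputs; the echoed `name` field assumes the schema above, and `runAfter: {}` makes it the first action):

```json
"Send_Response": {
  "type": "Response",
  "runAfter": {},
  "inputs": {
    "statusCode": 200,
    "body": {
      "received": "@triggerBody()?['name']"
    }
  }
}
```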

#### No-Schema Variant (Accept Arbitrary JSON)

When the incoming payload structure is unknown or varies, omit the schema to accept any valid JSON body without validation:

```json
"manual": {
  "type": "Request",
  "kind": "Http",
  "inputs": {
    "schema": {}
  }
}
```

Access any field dynamically: `@triggerBody()?['anyField']`

> Use this for external webhooks (Stripe, GitHub, Employment Hero, etc.) where the
> payload shape may change or is not fully documented. The flow accepts any
> JSON without returning 400 for unexpected properties.
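
Without a schema, downstream steps see untyped content. A `ParseJson` action can restore typing where needed; a sketch, assuming the standard `ParseJson` action shape (the `event` property is a placeholder for whatever the webhook actually sends):

```json
"Parse_Webhook_Body": {
  "type": "ParseJson",
  "runAfter": {},
  "inputs": {
    "content": "@triggerBody()",
    "schema": {
      "type": "object",
      "properties": {
        "event": { "type": "string" }
      }
    }
  }
}
```

Later actions can then reference the parsed fields, e.g. `@body('Parse_Webhook_Body')?['event']`.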

---

## Automated (SharePoint Item Created)

```json
"When_an_item_is_created": {
  "type": "OpenApiConnectionNotification",
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "OnNewItem"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "table": "MyList"
    },
    "subscribe": {
      "body": { "notificationUrl": "@listCallbackUrl()" },
      "queries": {
        "dataset": "https://mytenant.sharepoint.com/sites/mysite",
        "table": "MyList"
      }
    }
  }
}
```

Access trigger data: `@triggerBody()?['ID']`, `@triggerBody()?['Title']`, etc.
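
Trigger outputs usually feed a condition. A sketch of an `If` action gating on the new item's Title (standard workflow definition language; the branch body is a placeholder):

```json
"Check_Title": {
  "type": "If",
  "runAfter": {},
  "expression": {
    "and": [
      { "equals": ["@triggerBody()?['Title']", "Urgent"] }
    ]
  },
  "actions": {
    "Compose_Urgent": {
      "type": "Compose",
      "runAfter": {},
      "inputs": "Handle urgent item"
    }
  },
  "else": { "actions": {} }
}
```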

---

## Automated (SharePoint Item Modified)

```json
"When_an_existing_item_is_modified": {
  "type": "OpenApiConnectionNotification",
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline",
      "connectionName": "<connectionName>",
      "operationId": "OnUpdatedItem"
    },
    "parameters": {
      "dataset": "https://mytenant.sharepoint.com/sites/mysite",
      "table": "MyList"
    },
    "subscribe": {
      "body": { "notificationUrl": "@listCallbackUrl()" },
      "queries": {
        "dataset": "https://mytenant.sharepoint.com/sites/mysite",
        "table": "MyList"
      }
    }
  }
}
```

---

## Automated (Outlook: When New Email Arrives)

```json
"When_a_new_email_arrives": {
  "type": "OpenApiConnectionNotification",
  "inputs": {
    "host": {
      "apiId": "/providers/Microsoft.PowerApps/apis/shared_office365",
      "connectionName": "<connectionName>",
      "operationId": "OnNewEmail"
    },
    "parameters": {
      "folderId": "Inbox",
      "to": "monitored@contoso.com",
      "isHTML": true
    },
    "subscribe": {
      "body": { "notificationUrl": "@listCallbackUrl()" }
    }
  }
}
```

---

## Child Flow (Called by Another Flow)

```json
"manual": {
  "type": "Request",
  "kind": "Button",
  "inputs": {
    "schema": {
      "type": "object",
      "properties": {
        "items": {
          "type": "array",
          "items": { "type": "object" }
        }
      }
    }
  }
}
```

Access parent-supplied data: `@triggerBody()?['items']`

To return data to the parent, add a `Response` action:

```json
"Respond_to_Parent": {
  "type": "Response",
  "runAfter": { "Compose_Result": ["Succeeded"] },
  "inputs": {
    "statusCode": 200,
    "body": "@outputs('Compose_Result')"
  }
}
```
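
On the parent side, the child flow is invoked with a `Workflow` action. A sketch, assuming the usual `host.workflowReferenceName` call shape (the GUID is a placeholder for the child flow's ID, and `body` must match the child's trigger schema):

```json
"Run_Child_Flow": {
  "type": "Workflow",
  "runAfter": {},
  "inputs": {
    "host": {
      "workflowReferenceName": "00000000-0000-0000-0000-000000000000"
    },
    "body": {
      "items": []
    }
  }
}
```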