# MYOB Advanced — Admin Procedures (Developer View)
Engineering tasks for the MYOB ingest pipeline — adding extractors, rotating credentials, regenerating snapshots, and replaying failed loads.
Status: draft. This page captures the shape of each task. Exact ADF screen flow and Azure DevOps approval gates should be added by someone with day-to-day pipeline access before promoting to current.
## Add a new MYOB entity to the warehouse

Pulling a new MYOB entity through to the warehouse is a four-step job spanning two repos.
| Step | Where | What |
|---|---|---|
| 1 | This repo | Confirm the entity exists on Default/24.200.001 via `mcp__pwg-myob__myob_list_endpoints` and a sample `myob_query_entity(<entity>, top=5)`. Note the schema. |
| 2 | PWG-DataMart | Add `STAGING.MYOB_<entity>_PWG.sql` and `_JV.sql` (Tables/) with the columns from step 1. Match the `_PWG` / `_JV` suffix convention. |
| 3 | ADF (separate Azure repo) | Add Copy Data activities — one per company. Source = MYOB linked service + the entity. Sink = the staging tables from step 2. Source projection AND sink mapping must list every column you want. |
| 4 | PWG-DataMart | Add `BI_MYOB_ETL/Views/MYOB_<entity>_PWG.sql` (and `_JV`) with a SELECT projection — this is the seam where ETL columns can be renamed or filtered. Add `BI_MYOB_DATA/Views/MYOB_<entity>_PWG+JV.sql` if the entity is shared. Promote into CORE / MART as needed. |
Don’t skip step 4. Downstream MART views consume the `BI_MYOB_DATA` views, not staging directly.
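Step 1's schema sampling can be sketched as below. This is a sketch under assumptions: the real call runs through the `mcp__pwg-myob__*` tools rather than plain Python, the `staging_columns` helper and sample record are hypothetical, and the `{"value": ...}` wrapper is the contract-API shape MYOB Advanced typically returns for scalar fields.

```python
# Sketch: derive a candidate staging column list from one sampled entity
# record. Scalar fields arrive wrapped as {"value": ...}; linked entities
# nest another level; detail-line arrays get their own table and are skipped.

def staging_columns(record: dict) -> list[str]:
    cols = []
    for field, payload in record.items():
        if isinstance(payload, dict) and "value" in payload:
            cols.append(field)                      # plain scalar field
        elif isinstance(payload, dict):
            # linked entity: prefix its nested scalar fields
            cols.extend(
                f"{field}_{sub}"
                for sub, v in payload.items()
                if isinstance(v, dict) and "value" in v
            )
    return cols

# Illustrative record, not a real myob_query_entity response
sample_record = {
    "AccountCD": {"value": "1000"},
    "Description": {"value": "Cash at bank"},
    "Type": {"value": "Asset"},
}
print(staging_columns(sample_record))  # the columns step 2's DDL should list
```

The output is the column list to carry into the step-2 table scripts and the step-3 projection/mapping.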
## Add a column to an existing entity

| Step | Where | What |
|---|---|---|
| 1 | PWG-DataMart | Add the column to the `STAGING.MYOB_<entity>_PWG` and `_JV` table definitions. |
| 2 | ADF | Add the column to the Copy Data source projection and sink mapping for each company. |
| 3 | PWG-DataMart | Surface the column in the `BI_MYOB_ETL` views (and `BI_MYOB_DATA` if the entity is shared). |
| 4 | PWG-DataMart | Update CORE and any MART views that need the new field. |
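Step 1's paired DDL can be generated mechanically from the suffix convention on this page. A minimal sketch, assuming the new column's SQL type comes from the sampled MYOB schema (the `NVARCHAR(50)` in the usage line is only a placeholder):

```python
# Sketch: emit the matching ALTER TABLE statements for both company
# staging tables, following the _PWG / _JV suffix convention.

def alter_statements(entity: str, column: str, sql_type: str) -> list[str]:
    return [
        f"ALTER TABLE STAGING.MYOB_{entity}_{suffix} ADD {column} {sql_type};"
        for suffix in ("PWG", "JV")
    ]

for stmt in alter_statements("Account", "AccountClass", "NVARCHAR(50)"):
    print(stmt)
```

Keeping the two statements generated from one call makes it harder to update one company's table and forget the other.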
## Rotate OAuth credentials

| Step | Where | Notes |
|---|---|---|
| 1 | MYOB tenant — System → OAuth Apps | Generate new client secret on the relevant company’s app. Revoke the old secret only after the new one is verified working. |
| 2 | Local MCP | Update `~/.pwg-myob/credentials.json` (or equivalent env vars) and re-run `mcp__pwg-myob__myob_health` to force a token refresh. The refresh-token cache in `~/.pwg-myob/` will repopulate on next call. |
| 3 | Azure Key Vault | Update the linked-service credential reference for the ADF MYOB connection (one per company). |
| 4 | Verify | Trigger an on-demand ADF pipeline run; confirm rows land. |
The two companies have independent OAuth apps — rotate each separately and verify each separately.
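Step 2's file edit can be scripted so the per-company rotation stays mechanical. A sketch under assumptions: the JSON layout (one top-level key per company, each holding a `client_secret`) is hypothetical, so check the real `~/.pwg-myob/credentials.json` before relying on it.

```python
# Sketch: swap the client secret for one company in the local MCP
# credentials file. The file layout here is an assumed example.
import json
from pathlib import Path

def rotate_secret(path: Path, company: str, new_secret: str) -> None:
    creds = json.loads(path.read_text())
    creds[company]["client_secret"] = new_secret   # only touch one company
    path.write_text(json.dumps(creds, indent=2))
```

Run it once per company, then re-run the health check for that company before revoking the old secret in the MYOB tenant.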
## Regenerate the CSV snapshots in data/

The CSVs in the source repo’s `data/` folder are convenience snapshots, not the source of truth. To refresh:
- `chart_of_accounts_pwg.csv` ← `mcp__pwg-myob__myob_get_accounts(active_only=False, top=2000)`
- `chart_of_accounts_jv.csv` ← `mcp__pwg-myob-jv__myob_get_accounts(active_only=False, top=2000)`
- `taxes.csv` ← `mcp__pwg-myob__myob_query_entity("Tax", top=200)`
- `tax_categories.csv` ← `mcp__pwg-myob__myob_query_entity("TaxCategory", top=200)`

Tax codes / categories are tenant-wide, so either MCP server returns the same data — pick one consistently. Commit the regenerated files; the diff is the audit trail.
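The snapshot write itself can be sketched as below. Assumptions are flagged in the comments: the MCP tools return something reducible to a list of flat dicts, and the caller (an MCP client session, not plain Python) supplies those rows.

```python
# Sketch: write one MCP query result out as a snapshot CSV.
# `rows` stands in for the (flattened) result of an mcp__pwg-myob__* call.
import csv
from pathlib import Path

def write_snapshot(rows: list[dict], dest: Path) -> int:
    """Write rows to dest with a header row; return the row count."""
    with dest.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

Printing or logging the returned count gives a quick sanity check against the `top=` limit: a count equal to the limit suggests the snapshot was truncated.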
## Replay a failed ADF load

- Identify the affected pipeline + activity in ADF Monitoring.
- Confirm the failure cause — most common is a column added in MYOB that the source projection doesn’t list (silent drop) or a schema-strict failure on the sink.
- If the data already partially landed, truncate the staging table for the affected company and rerun rather than appending — `STAGING.MYOB_*` tables are full snapshots, not incremental feeds.
- Verify row counts against the MYOB side using the MCP servers before letting downstream ETL pick the changes up.
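The final row-count check can be sketched as a simple per-company comparison. Both inputs are stand-ins: `myob_counts` would come from the MCP servers and `staging_counts` from a warehouse query, neither of which is callable from plain Python here.

```python
# Sketch: flag companies whose staging row counts drift from MYOB.
# Inputs are assumed dicts of company -> row count from each side.

def count_mismatches(myob_counts: dict[str, int],
                     staging_counts: dict[str, int]) -> list[str]:
    """Return the companies whose counts disagree (or are missing)."""
    return sorted(
        company
        for company, n in myob_counts.items()
        if staging_counts.get(company) != n
    )
```

An empty result means the replay landed cleanly for every company; anything else names the company to truncate and rerun.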
## Manual user-access export

Until Users / Roles are exposed via API (see known-issues):
- Log into MYOB Advanced as an admin.
- Open screens SM.SM201010 (Users) and SM.SM201005 (Roles).
- Export each to CSV via the screen toolbar.
- Drop the files into the source repo’s
data/folder asusers.csvandroles.csv(kebab-case names, no company suffix — these are tenant-wide). - Commit so the access-review history is at least version- controlled.
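A small pre-commit sanity check keeps a half-finished export out of the access-review history. The required column names are an assumption about what the SM screens export; adjust them to match a real export before use.

```python
# Sketch: confirm an exported CSV has a header containing the columns we
# expect before committing it. Required columns here are assumed examples.
import csv
from pathlib import Path

def check_export(path: Path, required: set[str]) -> bool:
    """True if the CSV's header row contains every required column."""
    with path.open(newline="") as f:
        header = set(next(csv.reader(f)))
    return required <= header
```

Run it against both `users.csv` and `roles.csv` and only commit when both pass.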