MYOB Advanced — Admin Procedures (Developer View)

Engineer tasks for the MYOB ingest pipeline — adding extractors, rotating credentials, regenerating snapshots, and replaying failed loads.

Status: draft. This page captures the shape of each task. Exact ADF screen flow and Azure DevOps approval gates should be added by someone with day-to-day pipeline access before promoting to current.

Pulling a new MYOB entity through to the warehouse is a four-step job spanning three repos: this one, PWG-DataMart, and the ADF repo.

| Step | Where | What |
| --- | --- | --- |
| 1 | This repo | Confirm the entity exists on Default/24.200.001 via mcp__pwg-myob__myob_list_endpoints and a sample myob_query_entity(<entity>, top=5). Note the schema. |
| 2 | PWG-DataMart | Add STAGING.MYOB_<entity>_PWG.sql and _JV.sql (Tables/) with the columns from step 1. Match the _PWG / _JV suffix convention. |
| 3 | ADF (separate Azure repo) | Add Copy Data activities — one per company. Source = MYOB linked service + the entity. Sink = the staging tables from step 2. Source projection AND sink mapping must list every column you want. |
| 4 | PWG-DataMart | Add BI_MYOB_ETL/Views/MYOB_<entity>_PWG.sql (and _JV) with a SELECT projection — this is the seam where ETL columns can be renamed or filtered. Add BI_MYOB_DATA/Views/MYOB_<entity>_PWG+JV.sql if the entity is shared. Promote into CORE / MART as needed. |

Don’t skip step 4. Downstream MART views consume the BI_MYOB_DATA views, not staging directly.
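The naming conventions in steps 1–2 are mechanical enough to sketch as helpers. This is an illustrative sketch, not project code: entity_sample_url assumes the MCP tools wrap the MYOB Advanced contract-based REST API at the usual /entity/Default/24.200.001/ path, and both function names are hypothetical.

```python
def entity_sample_url(base_url: str, entity: str, top: int = 5) -> str:
    """URL to sample an entity on the Default/24.200.001 endpoint (step 1).

    Assumes the standard MYOB Advanced contract-based REST path layout.
    """
    return f"{base_url}/entity/Default/24.200.001/{entity}?$top={top}"


def staging_tables(entity: str) -> list[str]:
    """Staging table names following the _PWG / _JV suffix convention (step 2)."""
    return [f"STAGING.MYOB_{entity}_{company}" for company in ("PWG", "JV")]
```

For example, staging_tables("Bill") yields STAGING.MYOB_Bill_PWG and STAGING.MYOB_Bill_JV, matching the convention in step 2.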

Adding a new field to an existing entity follows the same path:

| Step | Where |
| --- | --- |
| 1 | PWG-DataMart |
| 2 | ADF |
| 3 | PWG-DataMart |
| 4 | Update CORE and any MART views that need the new field |
Rotating OAuth client credentials:

| Step | Where | Notes |
| --- | --- | --- |
| 1 | MYOB tenant — System → OAuth Apps | Generate a new client secret on the relevant company’s app. Revoke the old secret only after the new one is verified working. |
| 2 | Local MCP | Update ~/.pwg-myob/credentials.json (or equivalent env vars) and re-run mcp__pwg-myob__myob_health to force a token refresh. The refresh-token cache in ~/.pwg-myob/ will repopulate on next call. |
| 3 | Azure Key Vault | Update the linked-service credential reference for the ADF MYOB connection (one per company). |
| 4 | Verify | Trigger an on-demand ADF pipeline run; confirm rows land. |

The two companies have independent OAuth apps — rotate each separately and verify each separately.
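Verifying a new secret (step 4) amounts to proving a token grant succeeds with it. A minimal sketch of building that request, assuming MYOB Advanced exposes an Acumatica-style OAuth2 token endpoint at /identity/connect/token; the helper name and the exact endpoint path are assumptions, not confirmed project details.

```python
from urllib.parse import urlencode


def refresh_request(instance_url: str, client_id: str, client_secret: str,
                    refresh_token: str) -> tuple[str, str]:
    """Build the OAuth2 refresh-token POST used to prove a new secret works.

    The endpoint path is an assumption (Acumatica-style /identity/connect/token);
    check the tenant's actual OAuth configuration before relying on it.
    """
    url = f"{instance_url.rstrip('/')}/identity/connect/token"
    body = urlencode({
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,   # the NEW secret being verified
        "refresh_token": refresh_token,
    })
    return url, body
```

POST the body with Content-Type application/x-www-form-urlencoded; a 200 with an access_token means the new secret works and the old one can be revoked. Run it once per company, since the OAuth apps are independent.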

The CSVs in the source repo’s data/ folder are convenience snapshots, not the source of truth. To refresh:

- chart_of_accounts_pwg.csv ← mcp__pwg-myob__myob_get_accounts(active_only=False, top=2000)
- chart_of_accounts_jv.csv ← mcp__pwg-myob-jv__myob_get_accounts(active_only=False, top=2000)
- taxes.csv ← mcp__pwg-myob__myob_query_entity("Tax", top=200)
- tax_categories.csv ← mcp__pwg-myob__myob_query_entity("TaxCategory", top=200)

Tax codes / categories are tenant-wide so either MCP server returns the same data — pick one consistently. Commit the regenerated files; the diff is the audit trail.
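Since the diff is the audit trail, the snapshot files should serialize deterministically. A small sketch of one way to do that, sorting columns so unrelated MYOB-side reordering never pollutes the commit; the function name is hypothetical, not part of the MCP tooling.

```python
import csv
import io


def snapshot_csv(rows: list[dict]) -> str:
    """Render query results as CSV with a stable (sorted) column order.

    Deterministic output keeps the committed diff limited to real data
    changes, which is what makes the git history usable as an audit trail.
    """
    if not rows:
        return ""
    cols = sorted(rows[0])
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=cols, lineterminator="\n")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Write the result straight over the existing file in data/ and review the diff before committing.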

Replaying a failed load:

  1. Identify the affected pipeline + activity in ADF Monitoring.
  2. Confirm the failure cause — most common is a column added in MYOB that the source projection doesn’t list (silent drop) or a schema-strict failure on the sink.
  3. If the data already partially landed, truncate the staging table for the affected company and rerun rather than appending — STAGING.MYOB_* tables are full snapshots, not incremental feeds.
  4. Verify row counts against the MYOB side using the MCP servers before letting downstream ETL pick the changes up.
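The row-count check in step 4 can be sketched as a simple reconciliation: collect counts per entity from the staging tables and from the MCP servers, and only proceed when they match. This is an illustrative sketch; the function name and the dict-shaped inputs are assumptions, not existing tooling.

```python
def count_mismatches(staging: dict[str, int],
                     myob: dict[str, int]) -> dict[str, tuple[int, int]]:
    """Entities whose staging row count differs from the MYOB-side count.

    Returns {entity: (staging_count, myob_count)} for every discrepancy;
    an empty result means the replay is safe to hand to downstream ETL.
    """
    return {
        entity: (staging.get(entity, 0), myob.get(entity, 0))
        for entity in sorted(set(staging) | set(myob))
        if staging.get(entity, 0) != myob.get(entity, 0)
    }
```

Run it once per company, since the staging tables are suffixed per company and each is a full snapshot.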

Until Users / Roles are exposed via API (see known-issues):

  1. Log into MYOB Advanced as an admin.
  2. Open screens SM.SM201010 (Users) and SM.SM201005 (Roles).
  3. Export each to CSV via the screen toolbar.
  4. Drop the files into the source repo’s data/ folder as users.csv and roles.csv (lowercase names, no company suffix — these are tenant-wide).
  5. Commit so the access-review history is at least version-controlled.