The Auditor (OrgIQ) – End User Guide

Version: 2.0
Last Updated: March 25, 2026
Application: The Auditor powered by OrgIQ


Table of Contents

  1. Introduction
  2. Getting Started
  3. Dashboard Overview
  4. Technical Debt Management
  5. Data Quality Monitoring
  6. Integration Health
  7. Data Cloud Health
  8. Cost Savings Tracker
  9. Field Exporter & Data Cloud Mapping
  10. Running Org Health Scans
  11. Troubleshooting
  12. Glossary

1. Introduction

What is The Auditor?

The Auditor (powered by OrgIQ) is a comprehensive Salesforce intelligence platform that provides deep visibility into your org’s health, data quality, technical debt, integrations, and cost optimization opportunities. It helps you:

  • Assess Org Health: Get a complete picture of your Salesforce org’s configuration, security, and performance across 6 health dimensions
  • Track Technical Debt: Identify and prioritize issues that need attention with intelligent deduplication across scans
  • Monitor Data Quality: Ensure your data meets business standards with customizable rules
  • Manage Integrations: Track all external connections and their health
  • Audit Data Cloud: Monitor data streams, calculated insights, segments, and identity resolution
  • Find Cost Savings: Discover opportunities to optimize spending

Key Benefits

Benefit | Description
360-Degree Visibility | See everything happening in your org from one dashboard
6 Health Dimensions | Security, Data Quality, Tech Debt, Performance, Integration, and Data Cloud
Actionable Insights | Not just problems, but prioritized recommendations
Proactive Monitoring | Catch issues before they become critical
Smart Deduplication | Recurring issues tracked across scans without duplicates
ROI Tracking | Measure the value of improvements
Data Cloud Ready | Full Data Cloud health auditing and field metadata export

Who Should Use The Auditor?

The Auditor is designed for different roles with different needs:

Role | What They Get
Salesforce Administrator | Full visibility into technical debt, automated remediation, scheduled scans
CRM/Ops Manager | Business-friendly health dashboards, cost savings tracker, ROI metrics
Data Steward / Analyst | Data Quality Rule Builder, dimension-specific scores, evaluation history
Executive / Stakeholder | High-level health gauges, trend indicators, cost savings realization
Data Cloud Team | Data stream health, identity resolution monitoring, segment tracking
Integration Team | Integration registry, health summary, discovery of unregistered integrations

How to Get the Most from The Auditor

Week 1 — Discovery:

  • Install the package and assign permission sets
  • Run your first Full Scan (expect 10-20 minutes on large orgs)
  • Review the Dashboard and familiarize yourself with the health scores

Week 2 — Baseline:

  • Go through the Technical Debt list — triage Critical and High items
  • Mark obvious false positives and “Won’t Fix” items with notes
  • Register your known integrations in the Integrations tab

Week 3 — Data Quality:

  • Identify your top 5-10 most important data fields
  • Create Data Quality rules for those fields
  • Schedule daily evaluations

Week 4 — Automation:

  • Schedule a weekly Full Scan
  • Set up a recurring review meeting to check the dashboard
  • Start using the Cost Savings tracker to document optimization opportunities

Ongoing:

  • Review dashboard weekly
  • Use scan trend indicators to see if you’re improving
  • Remediate issues through the Technical Debt panel
  • Track realized cost savings

2. Getting Started

Installing The Auditor from the AppExchange

When installing The Auditor from the AppExchange, you will be prompted for an installation key (also referred to as a password).

Installation Steps:

  1. From the AppExchange listing page, click Get It Now
  2. Select the destination org (Production or Sandbox)
  3. Agree to the terms and conditions
  4. When prompted for the Installation Key, enter: Audit!2026
  5. Choose the install option:
  • Install for Admins Only (recommended) — only System Administrators get access initially
  • Install for All Users — grants access to all active users in the org
  • Install for Specific Profiles — admin-controlled per-profile access
  6. Click Install and wait for the installation to complete (typically 2-5 minutes)
  7. After installation, assign the appropriate OrgIQ permission set to each user (see Permission Requirements below)

Important: The installation key is case-sensitive. It must be entered exactly as Audit!2026 (including the capital A and the exclamation mark).

Accessing The Auditor

  1. Log into Salesforce
  2. Click the App Launcher (9-dot grid icon)
  3. Search for “The Auditor” or “OrgIQ”
  4. Click to open the application

Permission Requirements

You need one of these permission sets assigned:

Permission Set | Who Should Have It | What They Can Do
OrgIQ_Admin | Salesforce administrators, technical leads | Run scans, create/edit/delete data quality rules, execute remediation, manage all settings
OrgIQ_Analyst | Business analysts, data stewards | Run scans, create data quality rules, analyze results. Cannot execute remediation (no write-back to records)
OrgIQ_Viewer | Executives, auditors, non-technical stakeholders | Read-only access to dashboards and reports. Cannot run scans or modify anything
OrgIQ_API_User | External integrations, automation accounts | API-level access for programmatic use — not intended for human users

Contact your Salesforce Administrator if you need access.

Important: A permission set is required to use The Auditor. Without one of the above permission sets assigned, all application features will be blocked and you will see an “Insufficient access” error.

Assigning a Permission Set (Administrators)

To assign a permission set to a user:

  1. Go to Setup > Users > Permission Sets
  2. Click the permission set name (e.g., OrgIQ_Admin)
  3. Click Manage Assignments
  4. Click Add Assignments
  5. Select one or more users from the list
  6. Click Next, then Assign
  7. The user will have access the next time they log in or refresh the page

Tip: You can also assign permission sets via Setup > Users > Users, then edit a specific user and scroll to the “Permission Set Assignments” related list.

First Steps

After installation and permission set assignment, follow this recommended workflow:

  1. Run Your First Scan — Open The Auditor app, navigate to the Dashboard tab, and click Run Full Scan in the Quick Actions card. First scans on large orgs can take 10-20 minutes.
  2. Review Dashboard — Once the scan completes, the dashboard populates with seven health score gauges (Overall plus the six dimensions: Security, Data Quality, Tech Debt, Performance, Integration, Data Cloud) and metric cards (Objects, Fields, Apex Classes, Flows, Active Users).
  3. Investigate Issues — Click any metric card to drill down into a detailed list (e.g., click “Apex Classes” to see all classes with their API versions, lines of code, and test coverage).
  4. Review Technical Debt — Navigate to the Technical Debt tab to see all findings. Sort by severity and start with Critical items.
  5. Set Up Data Quality Rules — Go to the Data Quality tab and click “Create New Rule” to define your first data quality check.
  6. Schedule Recurring Scans — From the Dashboard, click “Schedule” in the Quick Actions card to set up daily or weekly automated scans.

3. Dashboard Overview

The main dashboard provides a comprehensive view of your org’s health at a glance.

Health Score Gauges

The dashboard displays circular gauges for key health metrics:

Gauge | Description | Score Range
Overall Health | Combined weighted score across all 6 dimensions | 0-100
Security | Profile, permission, and access configuration | 0-100
Data Quality | Completeness, validity, and consistency of data | 0-100
Tech Debt | Code and configuration quality | 0-100
Performance | Efficiency and optimization level | 0-100
Integration | Health and reliability of external connections | 0-100
Data Cloud | Data stream, segment, and identity resolution health | 0-100

Each gauge shows a trend indicator comparing the current score to the previous scan. Hover over any gauge to see the score rating and description.

Note: Dashboard data automatically refreshes when you navigate back to a tab. You don’t need to manually refresh after switching between tabs.

Score Interpretation:

  • 90-100: Excellent – Best practices followed
  • 70-89: Good – Minor improvements recommended
  • 50-69: Fair – Attention needed
  • Below 50: Poor – Immediate action required
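The interpretation bands above can be expressed as a small helper. The sketch below is illustrative only — the function name is not part of The Auditor; the boundaries come straight from the guide:

```python
def score_rating(score: float) -> str:
    """Map a 0-100 health score to its rating band per the guide."""
    if score >= 90:
        return "Excellent"   # best practices followed
    if score >= 70:
        return "Good"        # minor improvements recommended
    if score >= 50:
        return "Fair"        # attention needed
    return "Poor"            # immediate action required
```

For example, an org scoring 85 on Security falls in the "Good" band even though its Overall score might sit elsewhere — each gauge is rated independently.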

Metric Cards

The dashboard displays clickable cards with key counts from your org. Click any card to open a searchable detail modal with the full list of records and drill-down information:

Metric | What It Shows | Detail Modal Contents
Objects | Total number of objects in your org | Object label, API name, custom vs standard, field count, record count, last modified, last data added
Custom Objects | Custom objects only | Same as Objects, filtered to custom
Fields | Total fields across all custom objects | Field label, API name, parent object, data type, required status
Apex Classes | Number of Apex classes | Class name, API version, namespace, status, lines of code, test coverage, created/modified dates
Flows | Active and inactive flows | Flow API name, label, description, process type, trigger type, active status, version number, last modified
Active Users | Users who have logged in within the last 90 days | Name, username, email, profile, role, last login date, created date

In any detail modal you can:

  • Search using the search box at the top to filter by any visible column
  • Sort by clicking any column header (click again to reverse the sort direction)
  • See total count of records displayed at the bottom
  • Close the modal by clicking outside it, pressing Escape, or clicking the X button

Issue Counts

Color-coded badges show issues by severity:

  • Critical (Red): Requires immediate attention
  • High (Orange): Should be addressed soon
  • Medium (Yellow): Plan to fix in upcoming sprints
  • Low (Gray): Nice-to-have improvements

Recent Technical Debt

A list of the most recent or severe technical debt items needing attention. Click “View All” to navigate to the full Technical Debt tab.

Cost Savings Summary

Shows estimated and realized savings from optimization opportunities.


4. Technical Debt Management

What is Technical Debt?

Technical debt represents compromises in configuration, code, or data that create ongoing maintenance burdens or risks. Examples include:

  • Deprecated API versions in Apex classes
  • Unused custom fields consuming storage
  • Complex flows that could be simplified
  • Security configuration gaps

Accessing Technical Debt

  1. Navigate to the Technical Debt tab
  2. Or click on issue badges in the dashboard

Filtering Debt Items

Use the filters at the top:

Filter | Options
Category | Metadata, Code, Security, Data, Performance, Configuration, DataCloud
Severity | Critical, High, Medium, Low, Info
Status | Open, In Progress, Resolved, Deferred, Won’t Fix, False Positive

Debt Item Details

Each item shows:

  • Severity: Impact level (color-coded)
  • Issue: Title with link to the record
  • Category: Classification (Metadata, Code, Security, etc.)
  • Component: The affected component’s API name
  • Type: Component type (ApexClass, CustomField, Flow, etc.)
  • Status: Current workflow state

Taking Action on Debt Items

Each debt item has a row actions menu (a small down-arrow button at the right end of each row). Click it to see available actions.

Update Status (Manual Tracking):

Use this when you want to manually track that an issue is being worked on or has been addressed outside The Auditor.

  1. Click the row actions menu (down-arrow) on any item row
  2. Select the new status:
  • In Progress — Someone is actively working on this
  • Resolved — The issue has been fixed (you fixed it in the org directly)
  • Deferred — You’ll address this later
  • Won’t Fix — You’ve decided not to address this (keeps it out of future scans)
  • False Positive — The scanner was wrong about this
  3. Add Resolution Notes when prompted (recommended — explains what was done or why)

Note: Items marked “Won’t Fix” or “False Positive” will be preserved across future scans — they won’t be re-flagged. Items marked “Resolved” will auto-refresh to “Open” if the scanner detects the issue again in a future scan.
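The carry-forward behavior described in this note can be modeled as a small decision function. This is a simplified illustration of the documented behavior, not The Auditor’s actual code (the function name and signature are assumptions):

```python
def carry_forward_status(previous_status: str, redetected: bool) -> str:
    """Decide a debt item's status when a new scan runs.

    Per the guide: "Won't Fix" and "False Positive" items are preserved
    and never re-flagged; "Resolved" items return to "Open" only if the
    scanner detects the same issue again.
    """
    if previous_status in ("Won't Fix", "False Positive"):
        return previous_status  # preserved across scans, never re-flagged
    if previous_status == "Resolved":
        return "Open" if redetected else "Resolved"
    return previous_status      # Open / In Progress / Deferred carry over as-is
```

In practice this means your triage work is never lost between scans: a "Won't Fix" decision sticks, while a "Resolved" item silently reopens only when the underlying issue actually comes back.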

Remediate (Automated Fix):

For supported issue types, The Auditor can automatically fix the issue for you.

  1. Click Remediate from the row actions menu
  2. The Remediation Panel opens, showing available remediation options. Each option explains:
  • What it will do — the specific change (e.g., “Deactivate unused fields”)
  • Impact level — Low, Medium, or High risk
  • Whether approval is required — some changes require admin review first
  3. Select an option, then click Preview to see a sample of what will change (e.g., “5 fields will be deactivated: Account.UnusedField1__c, Account.UnusedField2__c, …”)
  4. Click Execute to apply the remediation
  5. Review the results — success count, any failures, and the rollback option

Not all issues have automated remediation — many require manual intervention (e.g., “Refactor complex flow”). For those, The Auditor will show specific recommendations instead.

Remediation Safety:

  • Most remediations create a rollback snapshot before making changes
  • You can rollback within 30 days by opening the remediation record and clicking “Rollback”
  • Some destructive actions (deletions) cannot be rolled back — these require explicit confirmation

Best Practices

  1. Start with Critical items — These have the highest risk. Sort the list by Severity descending.
  2. Group by category — Work on similar items together (e.g., fix all unused fields in one batch).
  3. Track trends over time — Compare your health scores between scans. If Tech Debt is trending down, your remediation efforts are working.
  4. Document resolutions — Always add Resolution Notes. This creates an audit trail and helps team members understand past decisions.
  5. Use Won’t Fix sparingly — Only mark items as “Won’t Fix” after team discussion. Add notes explaining the decision.
  6. Review monthly — Schedule a recurring review of your debt backlog to avoid accumulation.

5. Data Quality Monitoring

Accessing Data Quality

Navigate to the Data Quality tab. The page has two sub-tabs:

  • Overview: Overall quality score, dimension breakdown, and failing rules
  • Rules: Complete list of all data quality rules with management actions

Quality Dimensions

Data quality is measured across 5 dimensions. When you create a rule, you choose which dimension applies:

Dimension | What It Measures | Example Rule | When to Use
Completeness | Are important fields populated? | “Account.Industry must not be blank” | When missing data breaks reports or workflows
Validity | Do values match expected formats, ranges, or patterns? | “Contact.Email must match an email pattern” | When bad data format causes system errors
Consistency | Do related fields agree with each other? | “Opportunity.CloseDate must be after CreatedDate” | When cross-field logic matters for reporting
Timeliness | Is the data recent enough to be trusted? | “Opportunity updated within the last 30 days” | When stale data leads to bad decisions
Uniqueness | Are there duplicate records? | “Account.Website should be unique” | When duplicates inflate metrics or create confusion

Creating Data Quality Rules

Click “Create New Rule” to open the Rule Builder wizard:

Step 1 — Select Object:

  • Choose the Salesforce object you want to evaluate from the dropdown
  • Click Next to continue

Step 2 — Choose Field & Rule Type:

  • Select the specific field to check from a dropdown of all accessible fields
  • Choose a Rule Type: Completeness, Uniqueness, Validity, Consistency, or Timeliness
  • A description of the selected rule type is displayed to help you decide

For Consistency rules, additional configuration fields appear:

  • Comparison Type: Equals, Not Equals, or Contains
  • Compare To Field (same record): Select another field from the same object via dropdown to compare against
  • Related Object: Select a lookup/master-detail relationship from a dropdown (e.g., Account, Owner)
  • Field on Related Object: Once a relationship is selected, choose a field on the related object from a second dropdown
  • Expected Value: Optionally enter a literal value to compare against

A rule name is auto-generated (you can change it later). Click Next to continue.
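To make the Consistency options concrete, here is a simplified sketch of how a single record might be evaluated against such a rule. The record shape, function name, and exact comparison semantics are illustrative assumptions, not The Auditor’s implementation:

```python
def record_passes_consistency(record: dict, field: str, comparison: str,
                              compare_to_field: str = None,
                              expected_value=None) -> bool:
    """Evaluate one record against a Consistency rule.

    The rule's field is compared either to another field on the same
    record (Compare To Field) or to a literal Expected Value.
    """
    left = record.get(field)
    right = record.get(compare_to_field) if compare_to_field else expected_value
    if comparison == "Equals":
        return left == right
    if comparison == "Not Equals":
        return left != right
    if comparison == "Contains":
        # Direction assumed: the rule's field must contain the comparison value
        return right is not None and str(right) in str(left or "")
    raise ValueError(f"Unknown comparison type: {comparison}")
```

For instance, a rule “Account.BillingCountry Equals ShippingCountry” would pass a record where both fields hold the same value and fail one where they differ.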

Step 3 — Configure Thresholds:

How Thresholds Work:

When a rule evaluates, it calculates a score from 0-100% representing what percentage of records pass the rule. For example, if you have 1,000 Accounts and 950 of them have Industry populated, your Completeness score is 95%.

Thresholds determine when the system alerts you:

  • Warning Threshold — When the score falls below this, the rule is flagged as “Warning” (e.g., 90%)
  • Critical Threshold — When the score falls below this, the rule is flagged as “Critical” (e.g., 70%)

Example: If Warning=90 and Critical=70:

  • Score 95% → rule passes (green)
  • Score 85% → warning (yellow) — attention needed
  • Score 60% → critical (red) — urgent action required
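The worked example above can be sketched in code. The function below mirrors the documented scoring arithmetic (the names are illustrative, not The Auditor’s API):

```python
def evaluate_rule(passing: int, total: int,
                  warning: float, critical: float):
    """Score a rule as the percentage of records that pass, then flag it.

    Returns (score, status) where status is 'Pass', 'Warning', or
    'Critical'. A score below a threshold triggers that flag.
    """
    score = 100.0 * passing / total if total else 100.0
    if score < critical:
        status = "Critical"
    elif score < warning:
        status = "Warning"
    else:
        status = "Pass"
    return score, status
```

With 950 of 1,000 Accounts populated and thresholds of Warning=90 / Critical=70, the score is 95% and the rule passes; at 850 it would flag Warning, and at 600 it would flag Critical.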

Fields in Step 3:

  • Rule Name — auto-generated (e.g., “Completeness – Account.Industry”), but you can customize it
  • Warning Threshold — typically 85-95% for important rules
  • Critical Threshold — typically 60-75% (must be lower than Warning)
  • Business Impact — tells you how important this rule is for the business (different from Warning/Critical, which reflect rule performance):
  • Critical — Failing this rule directly impacts revenue, compliance, or customer experience
  • High — Failing impacts key business processes or reporting accuracy
  • Medium — Failing is a nuisance but doesn’t block business operations
  • Low — Failing is nice-to-have quality improvement
  • Business Impact Description — optional text explaining why this matters (e.g., “Required for monthly revenue reporting”)
  • Evaluation Frequency — how often the rule runs automatically:
  • Real-Time — on record save (use sparingly — can impact performance)
  • Hourly — every hour via batch job
  • Daily — once per day (most common)
  • Weekly — once per week
  • Monthly — once per month
  • On-Demand — only when manually triggered
  • Active toggle — when OFF, the rule is paused and won’t evaluate

Click Next to review your configuration.

Step 4 — Review & Save:

  • Review all settings in the summary
  • Click Save to create the rule
  • The rule will appear in your rules list and be ready for evaluation

Evaluating Rules

Evaluate Single Rule:

  1. Find the rule in the Rules tab
  2. Click the refresh icon next to the rule
  3. Results show the pass/fail score and evaluation result

Evaluate All Rules:

  1. Click “Evaluate All Rules” button on the Overview tab
  2. All active rules are evaluated
  3. Scores update with new results

Managing Rules

From the Rules tab you can:

  • Pause/Resume a rule using the toggle button
  • Delete a rule (with confirmation prompt — this also deletes evaluation history)
  • Evaluate an individual rule on demand

Viewing Results

The Overview tab displays:

  • Overall data quality score gauge
  • Rules summary (total, passing, failing, last evaluated)
  • Dimension score breakdown (Completeness, Validity, Consistency, Timeliness, Uniqueness)
  • Failing rules list with scores and quick-evaluate buttons

6. Integration Health

Accessing Integration Dashboard

Navigate to the Integrations tab to see:

  • Summary statistics across all integrations
  • Health breakdown by status
  • Individual integration cards with filtering
  • Integration discovery feature

Understanding the Dashboard

Summary Cards:

  • Total Integrations: Number of registered integrations
  • Active Integrations: Currently active connections
  • API Calls: Total monthly API call volume
  • Error Rate: Average error rate across all integrations

Health Summary:

  • Healthy: Working correctly
  • Degraded: Experiencing minor issues
  • Unhealthy: Immediate attention needed

Discovering Integrations

Click “Discover Integrations” to automatically scan your org for existing integrations (Connected Apps, Named Credentials, etc.). Discovered integrations can be registered or dismissed.

Adding an Integration

  1. Click Add Integration
  2. Complete the form:
Field | Description | Example
Integration Name | Descriptive name | “Marketing Cloud Contact Sync”
Integration Type | Architecture pattern (see below) | “Outbound API”
Direction | Data flow direction | “Bidirectional”
External System | Name of the system you’re integrating with | “Marketo”
Status | Current operational state | “Active”
Authentication Method | How the integration authenticates | “OAuth 2.0”
Data Sensitivity | Classification of the data being exchanged (see below) | “Confidential”
External System URL | Base URL of the external system | https://api.marketo.com
Notes | Purpose and additional details | “Syncs contacts every 15 minutes…”
  3. Click Save

Integration Type Definitions:

  • Inbound API — External system calls Salesforce APIs to push data IN
  • Outbound API — Salesforce (via Apex/Flows) calls external system APIs to push data OUT
  • Connected App — OAuth 2.0 connection for third-party apps
  • Named Credential — Salesforce’s way of storing auth for callouts
  • External Service — Declarative schema-based integration (like MuleSoft)
  • Middleware — Uses a middleware platform (e.g., Boomi, MuleSoft, Workato)
  • Data Sync — Regular batch data transfer (e.g., nightly ETL)
  • SSO — Single Sign-On identity integration
  • Other — Anything else (manual file transfers, custom scripts)

Data Sensitivity Classifications:

  • Public — Marketing content, publicly available data (lowest risk)
  • Internal — Business data for internal use only (e.g., internal reports)
  • Confidential — Customer data, financial records, non-public business data
  • Restricted — PII, payment card data, health records (highest risk — requires strong controls)

Why classify data? Data Sensitivity helps prioritize security reviews and monitoring. An integration handling “Restricted” data deserves more scrutiny than one handling “Public” data.
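As an illustration of that prioritization, you could rank an exported integration registry by sensitivity so the riskiest connections are reviewed first. The ordering and record shape below are assumptions for the sketch, not part of The Auditor:

```python
# Lower rank = reviewed first (ordering assumed for illustration)
SENSITIVITY_RANK = {"Restricted": 0, "Confidential": 1,
                    "Internal": 2, "Public": 3}

def review_order(integrations: list) -> list:
    """Order integrations for security review, most sensitive first."""
    return sorted(integrations,
                  key=lambda i: SENSITIVITY_RANK[i["sensitivity"]])
```

Run against your registry, an integration tagged “Restricted” (PII, payment data) would surface ahead of every “Public” marketing feed.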

Filtering Integrations

Use the status filter to view integrations by status: All, Active, Inactive, Testing, or Deprecated.

Integration Best Practices

  1. Register all integrations – Even manual file transfers
  2. Set appropriate sensitivity levels – For security classification
  3. Document endpoints and credentials – Keep information current
  4. Use Discovery – Regularly scan for unregistered integrations

7. Data Cloud Health

Accessing Data Cloud Dashboard

Navigate to the Data Cloud tab. If Data Cloud is not enabled in your org, you’ll see a message indicating it’s not available.

Overview Tab

The overview provides a summary of your Data Cloud environment at a glance:

Summary Cards:

Card | What It Shows
Data Streams | Total count with error alert if any are in error state
Calculated Insights | Number of calculated insights configured
Segments | Number of segments with member counts
Identity Resolution | Number of identity resolution rulesets

Health Sections:
Each section shows a status breakdown (Active, Error, Inactive counts) for the respective Data Cloud component type.

Data Streams Tab

Displays all configured data streams with:

  • Stream Name: The data stream identifier
  • Source Object: Salesforce object feeding the stream
  • Status: Active, Error, Disconnected, Inactive
  • Records Processed: Number of records synced
  • Last Refresh: When the stream was last updated

Tip: Data streams in error state are flagged on the overview card and contribute to a lower Data Cloud health score.

Calculated Insights Tab

Lists all calculated insights with name, status, and last refresh time.

Segments Tab

Shows all segments with:

  • Segment Name: The segment identifier
  • Status: Published, Draft, etc.
  • Member Count: Number of individuals in the segment
  • Last Refresh: When segment membership was last computed

Identity Resolution Tab

Displays identity resolution rulesets with their status and match statistics.

Required One-Time Setup: Named Credential

What is a Named Credential and why do I need one?

The Auditor needs to call Salesforce’s Data Cloud REST APIs to inspect your data streams, segments, and identity resolution rulesets. To make these API calls securely, Salesforce requires a Named Credential — a configuration record that stores the API endpoint (your org’s URL) and authenticates the calls using an OAuth token.

Named Credentials are org-specific (they contain your org’s URL and your admin’s authentication) and cannot be bundled into a managed package for security reasons. This is a one-time setup step the admin performs after installing The Auditor.

The Named Credential must be named Salesforce_API exactly — The Auditor looks for this specific name.

IMPORTANT: Without this setup, the Data Cloud tab will show “Data Cloud Not Available” even if Data Cloud is enabled in your org. Administrators must complete the steps below before Data Cloud auditing will work.

Follow these steps (administrators only, one-time setup):

Step A: Create an External Credential

  1. Go to Setup > Security > Named Credentials > External Credentials tab
  2. Click New
  3. Label: Salesforce API External Credential
  4. Name: Salesforce_API_EC
  5. Authentication Protocol: OAuth 2.0
  6. Authentication Flow Type: Browser Flow
  7. Scope: refresh_token full
  8. Identity Provider URL: leave blank (uses same org)
  9. Click Save

Step B: Add a Principal to the External Credential

  1. On the External Credential you just created, scroll to Principals section
  2. Click New
  3. Parameter Name: OrgIQ_User
  4. Identity Type: Named Principal
  5. Sequence Number: 1
  6. Authentication Flow Type: Browser Flow
  7. Click Save, then click Authenticate and sign in as an administrator
  8. Verify the authentication status shows “Configured”

Step C: Create the Named Credential

  1. Go back to Setup > Security > Named Credentials (Named Credentials tab)
  2. Click New
  3. Label: Salesforce API
  4. Name: Salesforce_API (must match exactly — case-sensitive)
  5. URL: https://<your-my-domain>.my.salesforce.com (your org’s My Domain URL — find it in Setup > My Domain)
  6. Enabled for Callouts: checked
  7. External Credential: select Salesforce_API_EC
  8. Generate Authorization Header: checked
  9. Allow Formulas in HTTP Header: leave unchecked
  10. Allow Formulas in HTTP Body: leave unchecked
  11. Click Save

Step D: Grant Access to the OrgIQ Permission Set

  1. Go to Setup > Users > Permission Sets
  2. Open the OrgIQ_Admin permission set (and any other OrgIQ permission sets that need Data Cloud access)
  3. Under Apps > External Credential Principal Access, click Edit
  4. Add Salesforce_API_EC - OrgIQ_User
  5. Click Save

Step E: Verify

  1. Navigate to The Auditor’s Data Cloud tab
  2. The dashboard should now load with Data Cloud status and metrics
  3. If still unavailable, the error message will indicate the specific issue

Data Cloud Prerequisites

In addition to the Named Credential, your org must have:

  1. Data Cloud enabled — Contact Salesforce to enable Data Cloud in your org
  2. CRM Connector configured — Go to Setup > Data Cloud Setup > CRM Data to connect your CRM data
  3. At least one Data Stream active — Configure data streams to sync Salesforce objects to Data Cloud
  4. Data Cloud Admin permission — The user viewing the dashboard needs Data Cloud permissions in addition to an OrgIQ permission set

Data Cloud Setup Guide

If you’re setting up Data Cloud for the first time:

Step 1: Enable Data Cloud

  1. Go to Setup > Data Cloud Setup
  2. Follow the activation wizard
  3. Accept the Data Cloud terms of service

Step 2: Configure the CRM Connector

  1. Go to Setup > Data Cloud Setup > Data Streams
  2. Click “New Data Stream”
  3. Select “CRM” as the connector type
  4. Choose which Salesforce objects to sync (e.g., Contact, Account, Lead)

Step 3: Map Fields to Data Model Objects (DMOs)

  1. For each data stream, map source fields to target DMO fields
  2. Standard DMOs include: Individual, Account, Contact Point Email, Contact Point Phone, Contact Point Address
  3. Set primary keys (typically the Salesforce record ID)

Tip: Use The Auditor’s Field Exporter (Data Cloud export formats) to generate mapping files that can be used as a reference or deployed via CLI.

Step 4: Activate Data Streams

  1. Review each data stream configuration
  2. Click “Activate” to start syncing data
  3. Monitor the initial sync on the Data Cloud tab in The Auditor

Step 5: Configure Identity Resolution (Optional)

  1. Go to Data Cloud Setup > Identity Resolution
  2. Create ruleset matching rules (e.g., match on Email, Phone, Name)
  3. Activate the ruleset

Step 6: Create Segments (Optional)

  1. Go to Data Cloud Setup > Segments
  2. Define audience criteria
  3. Publish segments for activation

Interpreting the Data Cloud Health Score

The Data Cloud health score (0-100) considers:

  • Data stream health: Percentage of streams in Active status vs Error/Disconnected
  • Data freshness: How recently streams were refreshed
  • Identity resolution: Whether rulesets are active and processing
  • Segment health: Whether segments are published and have members
  • Activation targets: Status of configured activation targets

A score of 90+ indicates a well-configured Data Cloud environment. Below 70 suggests issues that need attention (error streams, stale data, inactive rulesets).
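The guide does not publish the exact weighting, so as a mental model you can treat the score as a blend of the five factors. The sketch below assumes equal weights, which is an assumption, not The Auditor’s actual formula:

```python
def data_cloud_health(stream_pct: float, freshness_pct: float,
                      idres_pct: float, segment_pct: float,
                      activation_pct: float) -> float:
    """Blend the five Data Cloud factors (each 0-100) into one score.

    Equal weighting is assumed here for illustration only.
    """
    factors = [stream_pct, freshness_pct, idres_pct,
               segment_pct, activation_pct]
    return sum(factors) / len(factors)
```

Under this model, a single factor dropping to 50% (say, half your streams in Error status) pulls an otherwise-perfect score down to 90 — enough to move the gauge but still "Excellent", which is why sustained problems across factors are what push a score below 70.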


8. Cost Savings Tracker

Accessing Cost Savings

Navigate to the Cost Savings tab (data refreshes automatically when you return to this tab) to track:

  • Estimated annual savings
  • Realized savings to date
  • Realization rate (realized vs. total opportunities)
  • Savings by category and status breakdown
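The realization rate shown on this tab is a simple ratio; a minimal sketch of the arithmetic (the function is illustrative, not The Auditor’s code):

```python
def realization_rate(realized: float, total_estimated: float) -> float:
    """Realized savings as a percentage of total identified opportunity value."""
    return 100.0 * realized / total_estimated if total_estimated else 0.0
```

For example, if you have identified $100,000 in annual opportunities and realized $25,000 so far, the dashboard would show a 25% realization rate.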

Understanding Categories

Category | Examples
License Optimization | Deactivate unused users, downgrade license types
Storage Reduction | Archive old data, delete unused attachments
API Efficiency | Optimize API call patterns, reduce redundant calls
Automation Consolidation | Consolidate redundant flows, optimize triggers
Feature Rationalization | Remove unused features, simplify configurations
Third-Party Apps | Remove unused packages, renegotiate contracts
Infrastructure | Reduce sandbox count, optimize environments

Adding a Savings Opportunity

  1. Click Add Opportunity
  2. Complete the form:
Field | Description | Example
Opportunity Name | Clear, descriptive title | “Deactivate 47 inactive Salesforce users”
Category | Type of savings (see Categories table above) | “License Optimization”
Status | Current state (see Status Workflow below) | “Identified”
Priority | How urgent this is to pursue, based on value vs effort | “High” if >$10K/year with <1 day effort
Estimated Annual Savings | Projected yearly savings amount (USD) | 75000 (for 47 licenses × ~$1,600/year)
Estimated One-Time Savings | Any one-time savings | 0 (typical for license optimization)
Implementation Effort | How much work is required | “Small (1-4 hrs)” — run deactivation script
Confidence Level | How sure you are of the savings estimate | “High” if you’ve verified 47 users are truly inactive
Target Date | When you plan to realize the savings by | 2026-04-30
Description | Details and business justification | “Users haven’t logged in for 12+ months…”
Implementation Steps | The plan to achieve the savings | “1. Review list with HR. 2. Deactivate. 3. Request license credit.”
Risks | Potential risks or concerns | “Some users may be seasonal — verify first”
  3. Click Save

Tip: Priority should reflect value-to-effort ratio. A “Critical” priority means high savings AND easy to implement. An opportunity with $500K savings but requiring 6 months of work might be “Medium” priority in the short term.
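One way to make the value-to-effort comparison concrete is dollars saved per hour of effort. The sketch below is a hypothetical heuristic — the effort labels other than “Small (1-4 hrs)” and all the hour values are assumptions, not from The Auditor:

```python
# Hypothetical effort buckets mapped to rough hour estimates (assumed values)
EFFORT_HOURS = {"Small (1-4 hrs)": 4, "Medium (days)": 24,
                "Large (weeks)": 80, "XL (months)": 320}

def value_to_effort(annual_savings: float, effort: str) -> float:
    """Rough prioritization score: dollars saved per hour of work."""
    return annual_savings / EFFORT_HOURS[effort]
```

By this yardstick, the $75,000 license example at “Small (1-4 hrs)” scores $18,750/hour, while a $500,000 opportunity requiring months of work scores far lower per hour — matching the tip’s guidance to rate it only “Medium” in the short term.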

Tracking Progress

Status Workflow:

  1. Identified – Opportunity discovered
  2. Under Review – Being evaluated
  3. Approved – Approved for implementation
  4. In Progress – Implementation underway
  5. Realized – Savings achieved
  6. Declined – Not pursuing

Updating Actual Savings:
When savings are realized:

  1. Open the opportunity
  2. Update status to “Realized”
  3. Enter Actual Savings amount
  4. The realization rate on the dashboard will update automatically
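As a rough illustration, the realization rate can be thought of as actual savings divided by estimated savings across realized opportunities. The exact formula the dashboard uses may differ; this sketch assumes only “Realized” items count toward the rate:

```python
# Illustrative realization-rate calculation (assumed formula:
# sum of actual savings over sum of estimates, "Realized" items only).
opportunities = [
    {"status": "Realized",    "estimated": 75000, "actual": 71000},
    {"status": "Realized",    "estimated": 12000, "actual": 15000},
    {"status": "In Progress", "estimated": 40000, "actual": 0},
]

realized = [o for o in opportunities if o["status"] == "Realized"]
rate = sum(o["actual"] for o in realized) / sum(o["estimated"] for o in realized)
print(f"Realization rate: {rate:.0%}")  # 86000 / 87000 -> 99%
```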

9. Field Exporter & Data Cloud Mapping

Accessing Field Exporter

Navigate to the Field Exporter tab. This dual-mode tool supports:

  • Export Mode: Extract field metadata for documentation or Data Cloud mapping
  • Import Mode: Import mapping configurations for Data Cloud deployment

Export Mode

Step 1: Select Objects

  1. Use the dual listbox to select objects
  2. Move objects from “Available” to “Selected”
  3. Click “Next: Preview Fields”

Step 2: Preview Fields

  • Review all fields grouped by object
  • Click on an object to expand and see field details
  • Verify the field count shown

Step 3: Export Options

Which Export Type Should I Pick?

| If you want to… | Choose… | Output |
|---|---|---|
| Document all fields for an audit or handoff | Standard Field Export | CSV or JSON with full metadata |
| Plan a Data Cloud implementation — see how fields would map | Data Cloud DMO Mapping | CSV/JSON with suggested DMO mappings |
| Deploy data streams via Salesforce CLI in a CI/CD pipeline | Data Cloud Data Stream (YAML) | YAML files ready for sf project deploy |
| Manually configure Data Cloud in Setup using a spreadsheet | Data Cloud Import CSV | Simplified CSV matching Data Cloud’s import format |
| Build a complete Data Kit for portable Data Cloud deployment | Data Kit Manifest (JSON) | Full Data Kit JSON manifest |

Detailed Export Type Descriptions:

| Type | Description | Use Case |
|---|---|---|
| Standard Field Export | Raw field metadata (data type, length, required, etc.) | Documentation, auditing, compliance review |
| Data Cloud DMO Mapping | Fields pre-mapped to Data Cloud DMOs (Individual, Contact Point, etc.) | Data Cloud implementation planning |
| Data Cloud Data Stream (YAML) | YAML data stream definitions for CI/CD | Automated deployment via CLI |
| Data Cloud Import CSV | Simplified CSV format | Manual Data Cloud setup in Setup > Data Streams |
| Data Kit Manifest (JSON) | Complete Data Kit JSON with all metadata | Full deployment package for Data Kits |

For Data Cloud Exports, configure:

  • Data Kit Name — The name that will appear in the manifest (datakit-json only; e.g., “Sales Customer Data Kit”)
  • Data Source Type — Where the data is coming from:
      • CRM — Standard Salesforce CRM data (most common)
      • Ingestion API — Data from external systems via API
      • Marketing Cloud — Data from Marketing Cloud
  • Include Relationships — When checked, includes foreign key relationships (lookup/master-detail fields) in the export. Recommended for most use cases.
  • Include Custom DMO/Field Definitions — When checked, includes definitions for any custom DMOs or fields that don’t yet exist in Data Cloud. Use this when your source includes custom fields that need to be created in Data Cloud.

Data Stream Preview:
Shows how fields will be grouped into data streams. Note that one object (like Contact) may generate multiple streams:

  • Contact → Individual (name, birthdate, etc.)
  • Contact → Contact Point Email (email fields)
  • Contact → Contact Point Phone (phone fields)
  • Contact → Contact Point Address (address fields)
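The grouping behavior above can be pictured with a small sketch. The keyword rules here are illustrative assumptions for demonstration — the Field Exporter applies its own mapping logic:

```python
# Illustrative sketch: route Contact fields to Data Cloud streams.
# The keyword rules are assumptions, not the product's actual mapping.
STREAM_RULES = {
    "Email": "Contact Point Email",
    "Phone": "Contact Point Phone",
    "MailingAddress": "Contact Point Address",
}

def target_stream(field_name: str) -> str:
    for keyword, stream in STREAM_RULES.items():
        if keyword in field_name:
            return stream
    return "Individual"  # default DMO for person attributes

fields = ["FirstName", "LastName", "Birthdate", "Email",
          "Phone", "MobilePhone", "MailingAddress"]
streams: dict[str, list[str]] = {}
for f in fields:
    streams.setdefault(target_stream(f), []).append(f)

for stream, members in streams.items():
    print(f"Contact -> {stream}: {members}")
```

Running this shows one source object fanning out into four streams, which is exactly what the Data Stream Preview displays.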

Step 4: Download

Click “Export & Download” to generate and download your file.

Import Mode

Step 1: Upload File

  1. Switch to Import mode using the toggle
  2. Upload a mapping file (CSV or JSON from previous export)
  3. Enter a Configuration Name
  4. Select Data Source Type
  5. Click “Import File”

Or select a previously saved configuration from the list.

Step 2: Review Mappings

  • See configuration summary (objects, fields, status)
  • Expand each object to see field mappings
  • Verify source → target field mappings
  • Check primary key and required indicators

Step 3: Deploy

Option 1: Download Metadata Package (Recommended)

  1. Click “Download Metadata Package”
  2. A modal appears with individual XML files
  3. Download all files to a folder
  4. Deploy using Salesforce CLI:
   sf project deploy start --metadata-dir <folder>

Option 2: Validate Configuration

  1. Click “Validate Configuration”
  2. System checks that objects and fields exist
  3. Review validation results
  4. Use the guidance to complete manual setup

Option 3: Manual Data Cloud Setup

  1. Go to Setup > Data Cloud > Data Streams
  2. Create new data streams following the configuration
  3. Map fields as shown in the configuration

CSV Import Format

When importing a CSV file, use these column names:

| Column | Description |
|---|---|
| Source Object | Salesforce object API name |
| Source Field | Salesforce field API name |
| Target DMO | Data Cloud DMO name |
| Target Field / DMO Field Name | Data Cloud field name |
| Source Type / Salesforce Type | Salesforce data type |
| Target Type / Data Cloud Type | Data Cloud data type |
| Primary Key | true/false |
| Required | true/false |
| Foreign Key | true/false |
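A minimal file in this format can be generated with a few lines of Python. The two field mappings shown are illustrative examples, not required content:

```python
# Build a minimal mapping CSV using the import column names above.
# The example rows (Contact -> Individual / Contact Point Email)
# are illustrative only.
import csv
import io

COLUMNS = [
    "Source Object", "Source Field", "Target DMO",
    "Target Field / DMO Field Name", "Source Type / Salesforce Type",
    "Target Type / Data Cloud Type", "Primary Key", "Required", "Foreign Key",
]

rows = [
    ["Contact", "Id", "Individual", "Individual Id",
     "Id", "Text", "true", "true", "false"],
    ["Contact", "Email", "Contact Point Email", "Email Address",
     "Email", "Text", "false", "false", "false"],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(COLUMNS)  # header row must match the expected column names
writer.writerows(rows)
print(buf.getvalue())
```

Save the output as a .csv file and upload it in Import mode (Step 1 above).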

10. Running Org Health Scans

Starting a Scan

The Quick Actions card on the Dashboard is your starting point for all scans. It’s a compact panel with four action buttons:

| Button | When to Use | Typical Duration |
|---|---|---|
| Run Full Scan | Comprehensive analysis — use for baseline reviews or after major changes | 10-30 min (large orgs) |
| Quick Scan | Daily health check — covers the highest-value metrics only | 2-5 min |
| Custom Scan | When you want to focus on a specific area | Varies |
| Schedule | Set up automated recurring scans | One-time setup |

How to run a scan:

  1. Open the Dashboard tab
  2. Find the Quick Actions card (usually in the upper portion of the dashboard)
  3. Click the appropriate scan button
  4. A progress banner appears at the top of the dashboard — you can continue working while the scan runs

Only one scan can run at a time per org. If you click “Run Full Scan” while another scan is in progress, you’ll see a “Scan Already Running” message.

Scan Types

| Scan Type | Coverage | Best For |
|---|---|---|
| Full Scan | Everything: metadata, code, security, data quality, integrations, Data Cloud | Weekly baseline, post-release reviews |
| Quick Scan | Highest-value metrics only — key security issues, recent changes, critical rules | Daily health checks |
| Security Only | Profiles, permission sets, sharing rules, field-level security | Post-audit reviews, compliance checks |
| Data Only | Data quality rules and completeness evaluation | After data loads or cleanups |

During the Scan

  • A progress banner appears at the top of the dashboard showing scan status
  • Progress percentage and current step are displayed
  • Platform Events provide real-time updates
  • You can continue using the application while the scan runs

After the Scan

  1. Review health scores on the dashboard — they update automatically
  2. Check new debt items created by the scan
  3. Compare to previous scan — trend indicators on score gauges show improvement or regression

Intelligent Deduplication

The Auditor automatically handles recurring issues across scans:

  • Recurring issues: If the same issue is detected again, the existing finding is updated with an incremented Detection Count rather than creating a duplicate. The First Detected Date is preserved from the original finding.
  • Auto-resolved issues: If an issue from a previous scan is no longer detected, it is automatically marked as Resolved with the note “Auto-resolved: not detected in latest scan.” This means issues you fix between scans are automatically cleared.
  • User-dismissed items: Issues you marked as “Won’t Fix” or “False Positive” are not affected by subsequent scans — your dismissal is preserved.

Tip: Check the Detection Count on a debt item to see how many consecutive scans have found the same issue. A high count indicates a persistent problem that hasn’t been addressed.
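The three rules above can be summarized in a short sketch. This is a conceptual illustration in Python (the product implements deduplication in its own backend code); the record shape and function name are hypothetical:

```python
# Conceptual sketch of scan deduplication: recurring findings are
# updated in place, missing ones auto-resolve, dismissals are preserved.
def merge_scan(existing: dict, new_keys: set) -> dict:
    """existing: finding-key -> record; new_keys: findings in the latest scan."""
    for key, rec in existing.items():
        if rec["status"] in ("Won't Fix", "False Positive"):
            continue  # user dismissals are never touched by later scans
        if key in new_keys:
            rec["detection_count"] += 1  # recurring issue: no duplicate created
        elif rec["status"] != "Resolved":
            rec["status"] = "Resolved"   # auto-resolved: not seen this scan
            rec["note"] = "Auto-resolved: not detected in latest scan"
    for key in new_keys - existing.keys():
        existing[key] = {"status": "Open", "detection_count": 1}  # brand-new issue
    return existing
```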

Scheduling Regular Scans

Scheduled scans run automatically in the background and keep your dashboards and debt lists fresh without manual intervention.

To schedule a scan:

  1. From the Dashboard, click Schedule in the Quick Actions card
  2. Select the scan type (Full, Quick, Security Only, or Data Only)
  3. Choose frequency:
  • Daily — runs every day at the hour you specify (recommended for Quick or Security Only scans)
  • Weekly — runs once per week (recommended for Full scans on large orgs)
  4. For Weekly scans, select the day of the week
  5. Choose the hour to run (use a low-traffic time like 2:00 AM)
  6. Click Schedule to save

Recommended Schedule:

  • Small orgs (< 1000 users): Daily Full Scan at 2 AM
  • Medium orgs (1000-5000 users): Weekly Full Scan + Daily Quick Scan
  • Large orgs (5000+ users): Weekly Full Scan on Saturday night + Daily Security Only scan
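Behind the scenes, Salesforce scheduled jobs are driven by cron expressions in the form Seconds Minutes Hours Day_of_month Month Day_of_week. If you ever need to reason about a schedule's timing, the recommended schedules above translate roughly as follows (a sketch with hypothetical helper names; verify the exact expressions your org's jobs use in Scheduled Jobs):

```python
# Illustrative builders for Salesforce-style cron expressions
# (Seconds Minutes Hours Day_of_month Month Day_of_week).
def daily_cron(hour: int) -> str:
    return f"0 0 {hour} * * ?"

def weekly_cron(hour: int, day: str) -> str:
    # day uses three-letter abbreviations such as SAT
    return f"0 0 {hour} ? * {day}"

print(daily_cron(2))          # daily at 2 AM: 0 0 2 * * ?
print(weekly_cron(2, "SAT"))  # Saturday at 2 AM: 0 0 2 ? * SAT
```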

Managing Scheduled Scans:

To view, modify, or cancel scheduled scans:

  1. Go to Setup > Jobs > Scheduled Jobs (search “Scheduled Jobs” in Quick Find)
  2. Look for jobs named OrgIQ_Daily_*_Scan or OrgIQ_Weekly_*_Scan
  3. To cancel a scheduled job, click Del next to the job name
  4. To modify a schedule, cancel the existing job, then re-schedule from The Auditor with new settings
  5. To view a job’s next run time, look at the “Next Scheduled Run” column

Note: Only one schedule per scan type can exist at a time. If you try to schedule a Daily Full Scan when one already exists, you’ll get a “job already scheduled” message. Cancel the old one first, then schedule the new one.

Viewing Scan History

To review past scans and see trends:

  1. Navigate to Org Health Scans tab in the App Launcher (not the Dashboard — this is a standard record list)
  2. See all past scans with date, scan type, status (Completed, Failed, In Progress), and overall health score
  3. Click any scan record to see the full details including all findings from that scan
  4. Compare scores across scans to track your org’s improvement over time

11. Troubleshooting

Common Issues

“Insufficient access” error on any page

  • You must have one of the OrgIQ permission sets assigned (Admin, Analyst, Viewer, or API_User)
  • Contact your Salesforce Administrator to request a permission set assignment

“No data available” on dashboard

  • Ensure you have the correct permission set assigned
  • Run an initial org health scan
  • Check that the scan completed successfully

Scan fails to complete

  • Large orgs may timeout — try Quick Scan first
  • Check for governor limit issues
  • Contact admin if problem persists

Cannot export fields

  • Ensure you have selected at least one object
  • Check that you have access to the objects
  • Try a smaller selection first

Data Quality rules not evaluating

  • Verify the rule is marked Active (not Paused)
  • Check that the target object/field exists
  • Ensure you have access to query the object

Import fails

  • Verify CSV/JSON format matches expected structure
  • Check for special characters in field names
  • Ensure file is not empty

Cost savings opportunity fails to save

  • Ensure required fields are completed: Name, Category, Status
  • Verify the Implementation Effort value is valid (Trivial, Small, Medium, Large, or XLarge)

Data Cloud tab shows “Not Available”

  • Most common cause: The Salesforce_API Named Credential hasn’t been created in your org. See Section 7 — “Required One-Time Setup: Named Credential” for step-by-step instructions.
  • The error message on the Data Cloud tab will specify the exact reason:
      • “Named Credential ‘Salesforce_API’ is not configured…” → Follow the Named Credential setup in Section 7
      • “Authentication failed. Verify the Named Credential…” → Re-authenticate the External Credential principal
      • “Data Cloud is not enabled in this org.” → Data Cloud is not provisioned — contact Salesforce
  • Ensure you have Data Cloud Admin permissions in addition to an OrgIQ permission set
  • Ensure your OrgIQ permission set has access to the External Credential principal (Setup > Permission Sets > External Credential Principal Access)

Data Cloud dashboard shows no data streams

  • Data Cloud is enabled but no data streams are configured
  • Go to Setup > Data Cloud Setup > Data Streams to create your first data stream
  • Use the Field Exporter’s Data Cloud export formats to generate mapping files

Data Cloud health score is low

  • Check for data streams in Error or Disconnected status
  • Verify identity resolution rulesets are active
  • Ensure segments have been published and have members
  • Check that activation targets are properly configured

Duplicate issues appearing in Technical Debt list

  • This should not happen with the current version — scans automatically deduplicate
  • If you see duplicates from before this feature was added, they are from older scans
  • Run a new scan to consolidate: recurring issues will be deduplicated and old copies auto-resolved

Getting Help

  1. Documentation: Refer to this guide
  2. Support: Contact your Salesforce Administrator

12. Glossary

| Term | Definition |
|---|---|
| Activation Target | A Data Cloud destination for segment data (e.g., Marketing Cloud, Ads) |
| Calculated Insight | A Data Cloud computed metric or aggregation derived from unified data |
| Contact Point | Data Cloud objects for email, phone, and address (e.g., Contact Point Email) |
| Data Kit | A packaged Data Cloud configuration for deployment |
| Data Stream | A mapping from a Salesforce object to a Data Cloud DMO that syncs data |
| Detection Count | Number of consecutive scans in which a technical debt item has been found |
| DMO (Data Model Object) | Data Cloud’s standard data objects like Individual, Account, Contact Point |
| Governor Limits | Salesforce platform limits on resources like queries and CPU time |
| Health Dimension | One of the 6 scoring areas: Security, Data Quality, Performance, Integration, Data Cloud, Metadata |
| Health Score | A 0-100 rating of org quality across the 6 health dimensions |
| Identity Resolution | Data Cloud feature that matches and merges records across data sources into unified profiles |
| Metadata | Configuration data that defines your Salesforce setup |
| Permission Set | A Salesforce security feature that grants access to specific application features |
| Remediation | The process of fixing identified issues, either manually or through automated actions |
| Scan | An analysis of your org’s metadata and configuration across all health dimensions |
| Segment | A Data Cloud audience definition based on criteria applied to unified profiles |
| Technical Debt | Accumulated inefficiencies in code, configuration, or data requiring future work |
| YAML | A human-readable data format used for configuration files |

Quick Reference Card

Status Meanings

| Status | Meaning |
|---|---|
| Open | Issue identified, not yet addressed |
| In Progress | Work underway |
| Resolved | Issue fixed |
| Deferred | Postponed for later |
| Won’t Fix | Decided not to address |
| False Positive | Not actually an issue |

Severity Levels

| Level | Action Timeframe |
|---|---|
| Critical | Immediate (within 24 hours) |
| High | Soon (within 1 week) |
| Medium | Planned (within 1 month) |
| Low | Backlog (when convenient) |
| Info | No action required |

Document Version: 2.0
Last Updated: March 25, 2026