The Auditor (OrgIQ) – End User Guide
Version: 2.0
Last Updated: March 25, 2026
Application: The Auditor powered by OrgIQ
Table of Contents
- Introduction
- Getting Started
- Dashboard Overview
- Technical Debt Management
- Data Quality Monitoring
- Integration Health
- Data Cloud Health
- Cost Savings Tracker
- Field Exporter & Data Cloud Mapping
- Running Org Health Scans
- Troubleshooting
- Glossary
1. Introduction
What is The Auditor?
The Auditor (powered by OrgIQ) is a comprehensive Salesforce intelligence platform that provides deep visibility into your org’s health, data quality, technical debt, integrations, and cost optimization opportunities. It helps you:
- Assess Org Health: Get a complete picture of your Salesforce org’s configuration, security, and performance across 6 health dimensions
- Track Technical Debt: Identify and prioritize issues that need attention with intelligent deduplication across scans
- Monitor Data Quality: Ensure your data meets business standards with customizable rules
- Manage Integrations: Track all external connections and their health
- Audit Data Cloud: Monitor data streams, calculated insights, segments, and identity resolution
- Find Cost Savings: Discover opportunities to optimize spending
Key Benefits
| Benefit | Description |
|---|---|
| 360-Degree Visibility | See everything happening in your org from one dashboard |
| 6 Health Dimensions | Security, Data Quality, Performance, Integration, Data Cloud, and Metadata |
| Actionable Insights | Not just problems, but prioritized recommendations |
| Proactive Monitoring | Catch issues before they become critical |
| Smart Deduplication | Recurring issues tracked across scans without duplicates |
| ROI Tracking | Measure the value of improvements |
| Data Cloud Ready | Full Data Cloud health auditing and field metadata export |
Who Should Use The Auditor?
The Auditor is designed for different roles with different needs:
| Role | What They Get |
|---|---|
| Salesforce Administrator | Full visibility into technical debt, automated remediation, scheduled scans |
| CRM/Ops Manager | Business-friendly health dashboards, cost savings tracker, ROI metrics |
| Data Steward / Analyst | Data Quality Rule Builder, dimension-specific scores, evaluation history |
| Executive / Stakeholder | High-level health gauges, trend indicators, cost savings realization |
| Data Cloud Team | Data stream health, identity resolution monitoring, segment tracking |
| Integration Team | Integration registry, health summary, discovery of unregistered integrations |
How to Get the Most from The Auditor
Week 1 — Discovery:
- Install the package and assign permission sets
- Run your first Full Scan (expect 10-20 minutes on large orgs)
- Review the Dashboard and familiarize yourself with the health scores
Week 2 — Baseline:
- Go through the Technical Debt list — triage Critical and High items
- Mark obvious false positives and “Won’t Fix” items with notes
- Register your known integrations in the Integrations tab
Week 3 — Data Quality:
- Identify your top 5-10 most important data fields
- Create Data Quality rules for those fields
- Schedule daily evaluations
Week 4 — Automation:
- Schedule a weekly Full Scan
- Set up a recurring review meeting to check the dashboard
- Start using the Cost Savings tracker to document optimization opportunities
Ongoing:
- Review dashboard weekly
- Use scan trend indicators to see if you’re improving
- Remediate issues through the Technical Debt panel
- Track realized cost savings
2. Getting Started
Installing The Auditor from the AppExchange
When installing The Auditor from the AppExchange, you will be prompted for an installation key (also referred to as a password).
Installation Key: Audit!2026
Installation Steps:
- From the AppExchange listing page, click Get It Now
- Select the destination org (Production or Sandbox)
- Agree to the terms and conditions
- When prompted for the Installation Key, enter: Audit!2026
- Choose the install option:
- Install for Admins Only (recommended) — only System Administrators get access initially
- Install for All Users — grants access to all active users in the org
- Install for Specific Profiles — admin-controlled per-profile access
- Click Install and wait for the installation to complete (typically 2-5 minutes)
- After installation, assign the appropriate OrgIQ permission set to each user (see Permission Requirements below)
Important: The installation key is case-sensitive. It must be entered exactly as Audit!2026 (including the capital A and the exclamation mark).
Accessing The Auditor
- Log into Salesforce
- Click the App Launcher (9-dot grid icon)
- Search for “The Auditor” or “OrgIQ”
- Click to open the application
Permission Requirements
You need one of these permission sets assigned:
| Permission Set | Who Should Have It | What They Can Do |
|---|---|---|
| OrgIQ_Admin | Salesforce administrators, technical leads | Run scans, create/edit/delete data quality rules, execute remediation, manage all settings |
| OrgIQ_Analyst | Business analysts, data stewards | Run scans, create data quality rules, analyze results. Cannot execute remediation (no write-back to records) |
| OrgIQ_Viewer | Executives, auditors, non-technical stakeholders | Read-only access to dashboards and reports. Cannot run scans or modify anything |
| OrgIQ_API_User | External integrations, automation accounts | API-level access for programmatic use — not intended for human users |
Contact your Salesforce Administrator if you need access.
Important: A permission set is required to use The Auditor. Without one of the above permission sets assigned, all application features will be blocked and you will see an “Insufficient access” error.
Assigning a Permission Set (Administrators)
To assign a permission set to a user:
- Go to Setup > Users > Permission Sets
- Click the permission set name (e.g., OrgIQ_Admin)
- Click Manage Assignments
- Click Add Assignments
- Select one or more users from the list
- Click Next, then Assign
- The user will have access the next time they log in or refresh the page
Tip: You can also assign permission sets via Setup > Users > Users, then edit a specific user and scroll to the “Permission Set Assignments” related list.
First Steps
After installation and permission set assignment, follow this recommended workflow:
- Run Your First Scan — Open The Auditor app, navigate to the Dashboard tab, and click Run Full Scan in the Quick Actions card. First scans on large orgs can take 10-20 minutes.
- Review Dashboard — Once the scan completes, the dashboard populates with 7 health score gauges (Overall, Security, Data Quality, Tech Debt, Performance, Integration, Data Cloud) and metric cards (Objects, Fields, Apex Classes, Flows, Active Users).
- Investigate Issues — Click any metric card to drill down into a detailed list (e.g., click “Apex Classes” to see all classes with their API versions, lines of code, and test coverage).
- Review Technical Debt — Navigate to the Technical Debt tab to see all findings. Sort by severity and start with Critical items.
- Set Up Data Quality Rules — Go to the Data Quality tab and click “Create New Rule” to define your first data quality check.
- Schedule Recurring Scans — From the Dashboard, click “Schedule” in the Quick Actions card to set up daily or weekly automated scans.
3. Dashboard Overview
The main dashboard provides a comprehensive view of your org’s health at a glance.
Health Score Gauges
The dashboard displays circular gauges for key health metrics:
| Gauge | Description | Score Range |
|---|---|---|
| Overall Health | Combined weighted score across all 6 dimensions | 0-100 |
| Security | Profile, permission, and access configuration | 0-100 |
| Data Quality | Completeness, validity, and consistency of data | 0-100 |
| Tech Debt | Code and configuration quality | 0-100 |
| Performance | Efficiency and optimization level | 0-100 |
| Integration | Health and reliability of external connections | 0-100 |
| Data Cloud | Data stream, segment, and identity resolution health | 0-100 |
Each gauge shows a trend indicator comparing the current score to the previous scan. Hover over any gauge to see the score rating and description.
Note: Dashboard data automatically refreshes when you navigate back to a tab. You don’t need to manually refresh after switching between tabs.
Score Interpretation:
- 90-100: Excellent – Best practices followed
- 70-89: Good – Minor improvements recommended
- 50-69: Fair – Attention needed
- Below 50: Poor – Immediate action required
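The rating bands above can be expressed as a simple lookup. This is an illustrative sketch in Python; the function name `score_rating` is ours, not part of the product:

```python
def score_rating(score: float) -> str:
    """Map a 0-100 health score to The Auditor's rating bands."""
    if score >= 90:
        return "Excellent"
    if score >= 70:
        return "Good"
    if score >= 50:
        return "Fair"
    return "Poor"

print(score_rating(95))  # Excellent
print(score_rating(72))  # Good
print(score_rating(49))  # Poor
```

Note that the boundaries are inclusive at the bottom of each band: a score of exactly 70 rates "Good", and exactly 50 rates "Fair".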
Metric Cards
The dashboard displays clickable cards with key counts from your org. Click any card to open a searchable detail modal with the full list of records and drill-down information:
| Metric | What It Shows | Detail Modal Contents |
|---|---|---|
| Objects | Total number of objects in your org | Object label, API name, custom vs standard, field count, record count, last modified, last data added |
| Custom Objects | Custom objects only | Same as Objects, filtered to custom |
| Fields | Total fields across all custom objects | Field label, API name, parent object, data type, required status |
| Apex Classes | Number of Apex classes | Class name, API version, namespace, status, lines of code, test coverage, created/modified dates |
| Flows | Active and inactive flows | Flow API name, label, description, process type, trigger type, active status, version number, last modified |
| Active Users | Users who have logged in within the last 90 days | Name, username, email, profile, role, last login date, created date |
In any detail modal you can:
- Search using the search box at the top to filter by any visible column
- Sort by clicking any column header (click again to reverse the sort direction)
- See total count of records displayed at the bottom
- Close the modal by clicking outside it, pressing Escape, or clicking the X button
Issue Counts
Color-coded badges show issues by severity:
- Critical (Red): Requires immediate attention
- High (Orange): Should be addressed soon
- Medium (Yellow): Plan to fix in upcoming sprints
- Low (Gray): Nice-to-have improvements
Recent Technical Debt
A list of the most recent or severe technical debt items needing attention. Click “View All” to navigate to the full Technical Debt tab.
Cost Savings Summary
Shows estimated and realized savings from optimization opportunities.
4. Technical Debt Management
What is Technical Debt?
Technical debt represents compromises in configuration, code, or data that create ongoing maintenance burdens or risks. Examples include:
- Deprecated API versions in Apex classes
- Unused custom fields consuming storage
- Complex flows that could be simplified
- Security configuration gaps
Accessing Technical Debt
- Navigate to the Technical Debt tab
- Or click on issue badges in the dashboard
Filtering Debt Items
Use the filters at the top:
| Filter | Options |
|---|---|
| Category | Metadata, Code, Security, Data, Performance, Configuration, DataCloud |
| Severity | Critical, High, Medium, Low, Info |
| Status | Open, In Progress, Resolved, Deferred, Won’t Fix, False Positive |
Debt Item Details
Each item shows:
- Severity: Impact level (color-coded)
- Issue: Title with link to the record
- Category: Classification (Metadata, Code, Security, etc.)
- Component: The affected component’s API name
- Type: Component type (ApexClass, CustomField, Flow, etc.)
- Status: Current workflow state
Taking Action on Debt Items
Each debt item has a row actions menu (a small down-arrow button at the right end of each row). Click it to see available actions.
Update Status (Manual Tracking):
Use this when you want to manually track that an issue is being worked on or has been addressed outside The Auditor.
- Click the row actions menu (down-arrow) on any item row
- Select the new status:
- In Progress — Someone is actively working on this
- Resolved — The issue has been fixed (you fixed it in the org directly)
- Deferred — You’ll address this later
- Won’t Fix — You’ve decided not to address this (keeps it out of future scans)
- False Positive — The scanner was wrong about this
- Add Resolution Notes when prompted (recommended — explains what was done or why)
Note: Items marked “Won’t Fix” or “False Positive” will be preserved across future scans — they won’t be re-flagged. Items marked “Resolved” will auto-refresh to “Open” if the scanner detects the issue again in a future scan.
Remediate (Automated Fix):
For supported issue types, The Auditor can automatically fix the issue for you.
- Click Remediate from the row actions menu
- The Remediation Panel opens, showing available remediation options. Each option explains:
- What it will do — the specific change (e.g., “Deactivate unused fields”)
- Impact level — Low, Medium, or High risk
- Whether approval is required — some changes require admin review first
- Select an option, then click Preview to see a sample of what will change (e.g., “5 fields will be deactivated: Account.UnusedField1__c, Account.UnusedField2__c, …”)
- Click Execute to apply the remediation
- Review the results — success count, any failures, and the rollback option
Not all issues have automated remediation — many require manual intervention (e.g., “Refactor complex flow”). For those, The Auditor will show specific recommendations instead.
Remediation Safety:
- Most remediations create a rollback snapshot before making changes
- You can roll back within 30 days by opening the remediation record and clicking “Rollback”
- Some destructive actions (deletions) cannot be rolled back — these require explicit confirmation
Best Practices
- Start with Critical items — These have the highest risk. Sort the list by Severity descending.
- Group by category — Work on similar items together (e.g., fix all unused fields in one batch).
- Track trends over time — Compare your health scores between scans. If Tech Debt is trending down, your remediation efforts are working.
- Document resolutions — Always add Resolution Notes. This creates an audit trail and helps team members understand past decisions.
- Use Won’t Fix sparingly — Only mark items as “Won’t Fix” after team discussion. Add notes explaining the decision.
- Review monthly — Schedule a recurring review of your debt backlog to avoid accumulation.
5. Data Quality Monitoring
Accessing Data Quality
Navigate to the Data Quality tab. The page has two sub-tabs:
- Overview: Overall quality score, dimension breakdown, and failing rules
- Rules: Complete list of all data quality rules with management actions
Quality Dimensions
Data quality is measured across 5 dimensions. When you create a rule, you choose which dimension applies:
| Dimension | What It Measures | Example Rule | When to Use |
|---|---|---|---|
| Completeness | Are important fields populated? | “Account.Industry must not be blank” | When missing data breaks reports or workflows |
| Validity | Do values match expected formats, ranges, or patterns? | “Contact.Email must match an email pattern” | When bad data format causes system errors |
| Consistency | Do related fields agree with each other? | “Opportunity.CloseDate must be after CreatedDate” | When cross-field logic matters for reporting |
| Timeliness | Is the data recent enough to be trusted? | “Opportunity updated within the last 30 days” | When stale data leads to bad decisions |
| Uniqueness | Are there duplicate records? | “Account.Website should be unique” | When duplicates inflate metrics or create confusion |
Creating Data Quality Rules
Click “Create New Rule” to open the Rule Builder wizard:
Step 1 — Select Object:
- Choose the Salesforce object you want to evaluate from the dropdown
- Click Next to continue
Step 2 — Choose Field & Rule Type:
- Select the specific field to check from a dropdown of all accessible fields
- Choose a Rule Type: Completeness, Uniqueness, Validity, Consistency, or Timeliness
- A description of the selected rule type is displayed to help you decide
For Consistency rules, additional configuration fields appear:
- Comparison Type: Equals, Not Equals, or Contains
- Compare To Field (same record): Select another field from the same object via dropdown to compare against
- Related Object: Select a lookup/master-detail relationship from a dropdown (e.g., Account, Owner)
- Field on Related Object: Once a relationship is selected, choose a field on the related object from a second dropdown
- Expected Value: Optionally enter a literal value to compare against
A rule name is auto-generated (you can change it later). Click Next to continue.
Step 3 — Configure Thresholds:
How Thresholds Work:
When a rule evaluates, it calculates a score from 0-100% representing what percentage of records pass the rule. For example, if you have 1,000 Accounts and 950 of them have Industry populated, your Completeness score is 95%.
Thresholds determine when the system alerts you:
- Warning Threshold — When the score falls below this, the rule is flagged as “Warning” (e.g., 90%)
- Critical Threshold — When the score falls below this, the rule is flagged as “Critical” (e.g., 70%)
Example: If Warning=90 and Critical=70:
- Score 95% → rule passes (green)
- Score 85% → warning (yellow) — attention needed
- Score 60% → critical (red) — urgent action required
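The scoring and flagging logic described above can be sketched as follows (illustrative Python; `rule_status` is our name for the calculation, not an API in the product):

```python
def rule_status(passing: int, total: int,
                warning: float = 90.0, critical: float = 70.0):
    """Score a rule as the percentage of records that pass,
    then flag it against the Warning/Critical thresholds."""
    score = 100.0 * passing / total if total else 100.0
    if score < critical:
        return score, "Critical"
    if score < warning:
        return score, "Warning"
    return score, "Passing"

# 950 of 1,000 Accounts have Industry populated:
print(rule_status(950, 1000))  # (95.0, 'Passing')
print(rule_status(850, 1000))  # (85.0, 'Warning')
print(rule_status(600, 1000))  # (60.0, 'Critical')
```

As in the worked example, a score equal to or above the Warning threshold passes; only scores strictly below a threshold trip that flag.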
Fields in Step 3:
- Rule Name — auto-generated (e.g., “Completeness – Account.Industry”), but you can customize it
- Warning Threshold — typically 85-95% for important rules
- Critical Threshold — typically 60-75% (must be lower than Warning)
- Business Impact — tells you how important this rule is for the business (different from Warning/Critical, which reflect rule performance):
- Critical — Failing this rule directly impacts revenue, compliance, or customer experience
- High — Failing impacts key business processes or reporting accuracy
- Medium — Failing is a nuisance but doesn’t block business operations
- Low — Failing is nice-to-have quality improvement
- Business Impact Description — optional text explaining why this matters (e.g., “Required for monthly revenue reporting”)
- Evaluation Frequency — how often the rule runs automatically:
- Real-Time — on record save (use sparingly — can impact performance)
- Hourly — every hour via batch job
- Daily — once per day (most common)
- Weekly — once per week
- Monthly — once per month
- On-Demand — only when manually triggered
- Active toggle — when OFF, the rule is paused and won’t evaluate
Click Next to review your configuration.
Step 4 — Review & Save:
- Review all settings in the summary
- Click Save to create the rule
- The rule will appear in your rules list and be ready for evaluation
Evaluating Rules
Evaluate Single Rule:
- Find the rule in the Rules tab
- Click the refresh icon next to the rule
- Results show the pass/fail score and evaluation result
Evaluate All Rules:
- Click “Evaluate All Rules” button on the Overview tab
- All active rules are evaluated
- Scores update with new results
Managing Rules
From the Rules tab you can:
- Pause/Resume a rule using the toggle button
- Delete a rule (with confirmation prompt — this also deletes evaluation history)
- Evaluate an individual rule on demand
Viewing Results
The Overview tab displays:
- Overall data quality score gauge
- Rules summary (total, passing, failing, last evaluated)
- Dimension score breakdown (Completeness, Validity, Consistency, Timeliness, Uniqueness)
- Failing rules list with scores and quick-evaluate buttons
6. Integration Health
Accessing Integration Dashboard
Navigate to the Integrations tab to see:
- Summary statistics across all integrations
- Health breakdown by status
- Individual integration cards with filtering
- Integration discovery feature
Understanding the Dashboard
Summary Cards:
- Total Integrations: Number of registered integrations
- Active Integrations: Currently active connections
- API Calls: Total monthly API call volume
- Error Rate: Average error rate across all integrations
Health Summary:
- Healthy: Working correctly
- Degraded: Experiencing minor issues
- Unhealthy: Immediate attention needed
Discovering Integrations
Click “Discover Integrations” to automatically scan your org for existing integrations (Connected Apps, Named Credentials, etc.). Discovered integrations can be registered or dismissed.
Adding an Integration
- Click Add Integration
- Complete the form:
| Field | Description | Example |
|---|---|---|
| Integration Name | Descriptive name | “Marketing Cloud Contact Sync” |
| Integration Type | Architecture pattern (see below) | “Outbound API” |
| Direction | Data flow direction | “Bidirectional” |
| External System | Name of the system you’re integrating with | “Marketo” |
| Status | Current operational state | “Active” |
| Authentication Method | How the integration authenticates | “OAuth 2.0” |
| Data Sensitivity | Classification of the data being exchanged (see below) | “Confidential” |
| External System URL | Base URL of the external system | https://api.marketo.com |
| Notes | Purpose and additional details | “Syncs contacts every 15 minutes…” |
- Click Save
Integration Type Definitions:
- Inbound API — External system calls Salesforce APIs to push data IN
- Outbound API — Salesforce (via Apex/Flows) calls external system APIs to push data OUT
- Connected App — OAuth 2.0 connection for third-party apps
- Named Credential — Salesforce’s way of storing auth for callouts
- External Service — Declarative schema-based integration (like MuleSoft)
- Middleware — Uses a middleware platform (e.g., Boomi, MuleSoft, Workato)
- Data Sync — Regular batch data transfer (e.g., nightly ETL)
- SSO — Single Sign-On identity integration
- Other — Anything else (manual file transfers, custom scripts)
Data Sensitivity Classifications:
- Public — Marketing content, publicly available data (lowest risk)
- Internal — Business data for internal use only (e.g., internal reports)
- Confidential — Customer data, financial records, non-public business data
- Restricted — PII, payment card data, health records (highest risk — requires strong controls)
Why classify data? Data Sensitivity helps prioritize security reviews and monitoring. An integration handling “Restricted” data deserves more scrutiny than one handling “Public” data.
Filtering Integrations
Use the status filter to view integrations by status: All, Active, Inactive, Testing, or Deprecated.
Integration Best Practices
- Register all integrations – Even manual file transfers
- Set appropriate sensitivity levels – For security classification
- Document endpoints and credentials – Keep information current
- Use Discovery – Regularly scan for unregistered integrations
7. Data Cloud Health
Accessing Data Cloud Dashboard
Navigate to the Data Cloud tab. If Data Cloud is not enabled in your org, you’ll see a message indicating it’s not available.
Overview Tab
The overview provides a summary of your Data Cloud environment at a glance:
Summary Cards:
| Card | What It Shows |
|---|---|
| Data Streams | Total count with error alert if any are in error state |
| Calculated Insights | Number of calculated insights configured |
| Segments | Number of segments with member counts |
| Identity Resolution | Number of identity resolution rulesets |
Health Sections:
Each section shows a status breakdown (Active, Error, Inactive counts) for the respective Data Cloud component type.
Data Streams Tab
Displays all configured data streams with:
- Stream Name: The data stream identifier
- Source Object: Salesforce object feeding the stream
- Status: Active, Error, Disconnected, Inactive
- Records Processed: Number of records synced
- Last Refresh: When the stream was last updated
Tip: Data streams in error state are flagged on the overview card and contribute to a lower Data Cloud health score.
Calculated Insights Tab
Lists all calculated insights with name, status, and last refresh time.
Segments Tab
Shows all segments with:
- Segment Name: The segment identifier
- Status: Published, Draft, etc.
- Member Count: Number of individuals in the segment
- Last Refresh: When segment membership was last computed
Identity Resolution Tab
Displays identity resolution rulesets with their status and match statistics.
Required One-Time Setup: Named Credential
What is a Named Credential and why do I need one?
The Auditor needs to call Salesforce’s Data Cloud REST APIs to inspect your data streams, segments, and identity resolution rulesets. To make these API calls securely, Salesforce requires a Named Credential — a configuration record that stores the API endpoint (your org’s URL) and authenticates the calls using an OAuth token.
Named Credentials are org-specific (they contain your org’s URL and your admin’s authentication) and cannot be bundled into a managed package for security reasons. This is a one-time setup step the admin performs after installing The Auditor.
The Named Credential must be named Salesforce_API exactly — The Auditor looks for this specific name.
IMPORTANT: Without this setup, the Data Cloud tab will show “Data Cloud Not Available” even if Data Cloud is enabled in your org. Administrators must complete the steps below before Data Cloud auditing will work.
Follow these steps (administrators only, one-time setup):
Step A: Create an External Credential
- Go to Setup > Security > Named Credentials > External Credentials tab
- Click New
- Label: Salesforce API External Credential
- Name: Salesforce_API_EC
- Authentication Protocol: OAuth 2.0
- Authentication Flow Type: Browser Flow
- Scope: refresh_token full
- Identity Provider URL: leave blank (uses same org)
- Click Save
Step B: Add a Principal to the External Credential
- On the External Credential you just created, scroll to Principals section
- Click New
- Parameter Name: OrgIQ_User
- Identity Type: Named Principal
- Sequence Number: 1
- Authentication Flow Type: Browser Flow
- Click Save, then click Authenticate and sign in as an administrator
- Verify the authentication status shows “Configured”
Step C: Create the Named Credential
- Go back to Setup > Security > Named Credentials (Named Credentials tab)
- Click New
- Label: Salesforce API
- Name: Salesforce_API (must match exactly — case-sensitive)
- URL: https://<your-my-domain>.my.salesforce.com (your org’s My Domain URL — find it in Setup > My Domain)
- Enabled for Callouts: checked
- External Credential: select Salesforce_API_EC
- Generate Authorization Header: checked
- Allow Formulas in HTTP Header: leave unchecked
- Allow Formulas in HTTP Body: leave unchecked
- Click Save
Step D: Grant Access to the OrgIQ Permission Set
- Go to Setup > Users > Permission Sets
- Open the OrgIQ_Admin permission set (and any other OrgIQ permission sets that need Data Cloud access)
- Under Apps > External Credential Principal Access, click Edit
- Add Salesforce_API_EC - OrgIQ_User
- Click Save
Step E: Verify
- Navigate to The Auditor’s Data Cloud tab
- The dashboard should now load with Data Cloud status and metrics
- If still unavailable, the error message will indicate the specific issue
Data Cloud Prerequisites
In addition to the Named Credential, your org must have:
- Data Cloud enabled — Contact Salesforce to enable Data Cloud in your org
- CRM Connector configured — Go to Setup > Data Cloud Setup > CRM Data to connect your CRM data
- At least one Data Stream active — Configure data streams to sync Salesforce objects to Data Cloud
- Data Cloud Admin permission — The user viewing the dashboard needs Data Cloud permissions in addition to an OrgIQ permission set
Data Cloud Setup Guide
If you’re setting up Data Cloud for the first time:
Step 1: Enable Data Cloud
- Go to Setup > Data Cloud Setup
- Follow the activation wizard
- Accept the Data Cloud terms of service
Step 2: Configure the CRM Connector
- Go to Setup > Data Cloud Setup > Data Streams
- Click “New Data Stream”
- Select “CRM” as the connector type
- Choose which Salesforce objects to sync (e.g., Contact, Account, Lead)
Step 3: Map Fields to Data Model Objects (DMOs)
- For each data stream, map source fields to target DMO fields
- Standard DMOs include: Individual, Account, Contact Point Email, Contact Point Phone, Contact Point Address
- Set primary keys (typically the Salesforce record ID)
Tip: Use The Auditor’s Field Exporter (Data Cloud export formats) to generate mapping files that can be used as a reference or deployed via CLI.
Step 4: Activate Data Streams
- Review each data stream configuration
- Click “Activate” to start syncing data
- Monitor the initial sync on the Data Cloud tab in The Auditor
Step 5: Configure Identity Resolution (Optional)
- Go to Data Cloud Setup > Identity Resolution
- Create ruleset matching rules (e.g., match on Email, Phone, Name)
- Activate the ruleset
Step 6: Create Segments (Optional)
- Go to Data Cloud Setup > Segments
- Define audience criteria
- Publish segments for activation
Interpreting the Data Cloud Health Score
The Data Cloud health score (0-100) considers:
- Data stream health: Percentage of streams in Active status vs Error/Disconnected
- Data freshness: How recently streams were refreshed
- Identity resolution: Whether rulesets are active and processing
- Segment health: Whether segments are published and have members
- Activation targets: Status of configured activation targets
A score of 90+ indicates a well-configured Data Cloud environment. Below 70 suggests issues that need attention (error streams, stale data, inactive rulesets).
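Conceptually, the factors above combine into a single weighted score. The sketch below uses hypothetical equal weights for illustration; The Auditor's actual weighting formula is not documented here:

```python
def data_cloud_score(components, weights=None):
    """Weighted average of per-component health values (each 0-100).

    Equal weights are an illustrative assumption, not the product's
    actual formula.
    """
    weights = weights or {k: 1.0 for k in components}
    total_weight = sum(weights[k] for k in components)
    return sum(components[k] * weights[k] for k in components) / total_weight

score = data_cloud_score({
    "streams": 100.0,     # all data streams Active
    "freshness": 90.0,    # mostly recent refreshes
    "identity": 100.0,    # rulesets active and processing
    "segments": 80.0,     # some segments unpublished
    "activation": 100.0,  # activation targets healthy
})
print(round(score, 1))  # 94.0
```

With this toy weighting, a single component dropping sharply (for example, half the streams in error) pulls the overall score below 90 even when everything else is healthy, which matches the guidance that error streams lower the Data Cloud health score.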
8. Cost Savings Tracker
Accessing Cost Savings
Navigate to the Cost Savings tab (data refreshes automatically when you return to this tab) to track:
- Estimated annual savings
- Realized savings to date
- Realization rate (realized vs. total opportunities)
- Savings by category and status breakdown
Understanding Categories
| Category | Examples |
|---|---|
| License Optimization | Deactivate unused users, downgrade license types |
| Storage Reduction | Archive old data, delete unused attachments |
| API Efficiency | Optimize API call patterns, reduce redundant calls |
| Automation Consolidation | Consolidate redundant flows, optimize triggers |
| Feature Rationalization | Remove unused features, simplify configurations |
| Third-Party Apps | Remove unused packages, renegotiate contracts |
| Infrastructure | Reduce sandbox count, optimize environments |
Adding a Savings Opportunity
- Click Add Opportunity
- Complete the form:
| Field | Description | Example |
|---|---|---|
| Opportunity Name | Clear, descriptive title | “Deactivate 47 inactive Salesforce users” |
| Category | Type of savings (see Categories table above) | “License Optimization” |
| Status | Current state (see Status Workflow below) | “Identified” |
| Priority | How urgent this is to pursue, based on value vs effort | “High” if >$10K/year with <1 day effort |
| Estimated Annual Savings | Projected yearly savings amount (USD) | 75000 (for 47 licenses × ~$1,600/year) |
| Estimated One-Time Savings | Any one-time savings | 0 (typical for license optimization) |
| Implementation Effort | How much work is required | “Small (1-4 hrs)” — run deactivation script |
| Confidence Level | How sure you are of the savings estimate | “High” if you’ve verified 47 users are truly inactive |
| Target Date | When you plan to realize the savings by | 2026-04-30 |
| Description | Details and business justification | “Users haven’t logged in for 12+ months…” |
| Implementation Steps | The plan to achieve the savings | “1. Review list with HR. 2. Deactivate. 3. Request license credit.” |
| Risks | Potential risks or concerns | “Some users may be seasonal — verify first” |
- Click Save
Tip: Priority should reflect value-to-effort ratio. A “Critical” priority means high savings AND easy to implement. An opportunity with $500K savings but requiring 6 months of work might be “Medium” priority in the short term.
Tracking Progress
Status Workflow:
- Identified – Opportunity discovered
- Under Review – Being evaluated
- Approved – Approved for implementation
- In Progress – Implementation underway
- Realized – Savings achieved
- Declined – Not pursuing
Updating Actual Savings:
When savings are realized:
- Open the opportunity
- Update status to “Realized”
- Enter Actual Savings amount
- The realization rate on the dashboard will update automatically
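In essence, the realization rate compares actual savings to estimated savings for opportunities marked "Realized." The exact formula the dashboard uses is internal; this sketch shows one plausible way to compute it:

```python
# Sketch of a savings realization rate: actual savings as a percentage of
# estimated savings, over opportunities marked "Realized". The exact formula
# used by the dashboard is an assumption here.

def realization_rate(opportunities: list[dict]) -> float:
    realized = [o for o in opportunities if o["status"] == "Realized"]
    estimated = sum(o["estimated_annual"] for o in realized)
    actual = sum(o["actual"] for o in realized)
    return round(100 * actual / estimated, 1) if estimated else 0.0

opps = [
    {"status": "Realized", "estimated_annual": 75000, "actual": 68000},
    {"status": "In Progress", "estimated_annual": 40000, "actual": 0},
]
print(realization_rate(opps))  # 90.7
```

Note that only "Realized" opportunities count toward the rate; in-flight work does not drag the number down.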
9. Field Exporter & Data Cloud Mapping
Accessing Field Exporter
Navigate to the Field Exporter tab. This dual-mode tool supports:
- Export Mode: Extract field metadata for documentation or Data Cloud mapping
- Import Mode: Import mapping configurations for Data Cloud deployment
Export Mode
Step 1: Select Objects
- Use the dual listbox to select objects
- Move objects from “Available” to “Selected”
- Click “Next: Preview Fields”
Step 2: Preview Fields
- Review all fields grouped by object
- Click on an object to expand and see field details
- Verify the field count shown
Step 3: Export Options
Which Export Type Should I Pick?
| If you want to… | Choose… | Output |
|---|---|---|
| Document all fields for an audit or handoff | Standard Field Export | CSV or JSON with full metadata |
| Plan a Data Cloud implementation — see how fields would map | Data Cloud DMO Mapping | CSV/JSON with suggested DMO mappings |
| Deploy data streams via Salesforce CLI in a CI/CD pipeline | Data Cloud Data Stream (YAML) | YAML files ready for sf project deploy |
| Manually configure Data Cloud in Setup using a spreadsheet | Data Cloud Import CSV | Simplified CSV matching Data Cloud’s import format |
| Build a complete Data Kit for portable Data Cloud deployment | Data Kit Manifest (JSON) | Full Data Kit JSON manifest |
Detailed Export Type Descriptions:
| Type | Description | Use Case |
|---|---|---|
| Standard Field Export | Raw field metadata (data type, length, required, etc.) | Documentation, auditing, compliance review |
| Data Cloud DMO Mapping | Fields pre-mapped to Data Cloud DMOs (Individual, Contact Point, etc.) | Data Cloud implementation planning |
| Data Cloud Data Stream (YAML) | YAML data stream definitions for CI/CD | Automated deployment via CLI |
| Data Cloud Import CSV | Simplified CSV format | Manual Data Cloud setup in Setup > Data Streams |
| Data Kit Manifest (JSON) | Complete Data Kit JSON with all metadata | Full deployment package for Data Kits |
For Data Cloud Exports, configure:
- Data Kit Name — The name that will appear in the manifest (datakit-json only; e.g., “Sales Customer Data Kit”)
- Data Source Type — Where the data is coming from:
- CRM — Standard Salesforce CRM data (most common)
- Ingestion API — Data from external systems via API
- Marketing Cloud — Data from Marketing Cloud
- Include Relationships — When checked, includes foreign key relationships (lookup/master-detail fields) in the export. Recommended for most use cases.
- Include Custom DMO/Field Definitions — When checked, includes definitions for any custom DMOs or fields that don’t yet exist in Data Cloud. Use this when your source includes custom fields that need to be created in Data Cloud.
Data Stream Preview:
Shows how fields will be grouped into data streams. Note that one object (like Contact) may generate multiple streams:
- Contact → Individual (name, birthdate, etc.)
- Contact → Contact Point Email (email fields)
- Contact → Contact Point Phone (phone fields)
- Contact → Contact Point Address (address fields)
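To illustrate why one object can fan out into several streams, here is a rough sketch of field-to-stream classification. The keyword rules below are a simplification invented for this example; The Auditor's actual mapping logic is more sophisticated and internal to the product:

```python
# Illustrative sketch of splitting one source object's fields into multiple
# Data Cloud streams. The keyword-based classification is an assumption for
# demonstration only, not The Auditor's real mapping logic.

def classify_field(field_name: str) -> str:
    name = field_name.lower()
    if "email" in name:
        return "Contact Point Email"
    if "phone" in name or "fax" in name:
        return "Contact Point Phone"
    if any(part in name for part in ("street", "city", "state", "postal", "country")):
        return "Contact Point Address"
    return "Individual"

def preview_streams(obj: str, fields: list[str]) -> dict[str, list[str]]:
    streams: dict[str, list[str]] = {}
    for f in fields:
        streams.setdefault(f"{obj} -> {classify_field(f)}", []).append(f)
    return streams

preview = preview_streams("Contact", ["FirstName", "Birthdate", "Email", "MobilePhone", "MailingCity"])
for stream, fields in preview.items():
    print(stream, fields)
```

Running this groups `FirstName` and `Birthdate` under the Individual stream while routing `Email`, `MobilePhone`, and `MailingCity` to their respective Contact Point streams, mirroring the Contact example above.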
Step 4: Download
Click “Export & Download” to generate and download your file.
Import Mode
Step 1: Upload File
- Switch to Import mode using the toggle
- Upload a mapping file (CSV or JSON from previous export)
- Enter a Configuration Name
- Select Data Source Type
- Click “Import File”
Or select a previously saved configuration from the list.
Step 2: Review Mappings
- See configuration summary (objects, fields, status)
- Expand each object to see field mappings
- Verify source → target field mappings
- Check primary key and required indicators
Step 3: Deploy
Option 1: Download Metadata Package (Recommended)
- Click “Download Metadata Package”
- A modal appears with individual XML files
- Download all files to a folder
- Deploy using Salesforce CLI:
`sf project deploy start --metadata-dir <folder>`
Option 2: Validate Configuration
- Click “Validate Configuration”
- System checks that objects and fields exist
- Review validation results
- Use the guidance to complete manual setup
Option 3: Manual Data Cloud Setup
- Go to Setup > Data Cloud > Data Streams
- Create new data streams following the configuration
- Map fields as shown in the configuration
CSV Import Format
When importing a CSV file, use these column names:
| Column | Description |
|---|---|
| Source Object | Salesforce object API name |
| Source Field | Salesforce field API name |
| Target DMO | Data Cloud DMO name |
| Target Field / DMO Field Name | Data Cloud field name |
| Source Type / Salesforce Type | Salesforce data type |
| Target Type / Data Cloud Type | Data Cloud data type |
| Primary Key | true/false |
| Required | true/false |
| Foreign Key | true/false |
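A quick pre-flight check can catch malformed files before you upload them. This sketch validates a sample against the column names in the table above; the `validate_import_csv` helper and the sample rows are illustrative, not part of the product:

```python
import csv
import io

# Minimal pre-flight validation of an import CSV against the expected column
# names from the table above. The helper and sample data are illustrative.

REQUIRED = {"Source Object", "Source Field", "Target DMO"}

sample = """Source Object,Source Field,Target DMO,Target Field / DMO Field Name,Primary Key,Required
Contact,Email,Contact Point Email,EmailAddress,false,true
Contact,Id,Individual,IndividualId,true,true
"""

def validate_import_csv(text: str) -> list[dict]:
    rows = list(csv.DictReader(io.StringIO(text)))
    if not rows:
        raise ValueError("File is empty")
    missing = REQUIRED - set(rows[0].keys())
    if missing:
        raise ValueError(f"Missing required columns: {sorted(missing)}")
    return rows

rows = validate_import_csv(sample)
print(len(rows))  # 2
```

An empty file and missing columns are the two failure modes called out in the Troubleshooting section, so checking both up front saves a failed import round-trip.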
10. Running Org Health Scans
Starting a Scan
The Quick Actions card on the Dashboard is your starting point for all scans. It’s a compact panel with four action buttons:
| Button | When to Use | Typical Duration |
|---|---|---|
| Run Full Scan | Comprehensive analysis — use for baseline reviews or after major changes | 10-30 min (large orgs) |
| Quick Scan | Daily health check — covers the highest-value metrics only | 2-5 min |
| Custom Scan | When you want to focus on a specific area | Varies |
| Schedule | Set up automated recurring scans | One-time setup |
How to run a scan:
- Open the Dashboard tab
- Find the Quick Actions card (usually in the upper portion of the dashboard)
- Click the appropriate scan button
- A progress banner appears at the top of the dashboard — you can continue working while the scan runs
Only one scan can run at a time per org. If you click “Run Full Scan” while another scan is in progress, you’ll see a “Scan Already Running” message.
Scan Types
| Scan Type | Coverage | Best For |
|---|---|---|
| Full Scan | Everything: metadata, code, security, data quality, integrations, Data Cloud | Weekly baseline, post-release reviews |
| Quick Scan | Highest-value metrics only — key security issues, recent changes, critical rules | Daily health checks |
| Security Only | Profiles, permission sets, sharing rules, field-level security | Post-audit reviews, compliance checks |
| Data Only | Data quality rules and completeness evaluation | After data loads or cleanups |
During the Scan
- A progress banner appears at the top of the dashboard showing scan status
- Progress percentage and current step are displayed
- Platform Events provide real-time updates
- You can continue using the application while the scan runs
After the Scan
- Review health scores on the dashboard — they update automatically
- Check new debt items created by the scan
- Compare to previous scan — trend indicators on score gauges show improvement or regression
Intelligent Deduplication
The Auditor automatically handles recurring issues across scans:
- Recurring issues: If the same issue is detected again, the existing finding is updated with an incremented Detection Count rather than creating a duplicate. The First Detected Date is preserved from the original finding.
- Auto-resolved issues: If an issue from a previous scan is no longer detected, it is automatically marked as Resolved with the note “Auto-resolved: not detected in latest scan.” This means issues you fix between scans are automatically cleared.
- User-dismissed items: Issues you marked as “Won’t Fix” or “False Positive” are not affected by subsequent scans — your dismissal is preserved.
Tip: Check the Detection Count on a debt item to see how many consecutive scans have found the same issue. A high count indicates a persistent problem that hasn’t been addressed.
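The deduplication rules above can be sketched as a simple merge step. The field names and issue keys here are hypothetical illustrations; the real fingerprinting and storage are internal to The Auditor:

```python
# Sketch of the scan deduplication behavior described above. Issue keys and
# field names are illustrative; the actual implementation is internal.

from datetime import date

def merge_scan_results(existing: dict, found_keys: set) -> dict:
    """Update existing debt items in place given the keys found by a new scan."""
    for key, item in existing.items():
        if item["status"] in ("Won't Fix", "False Positive"):
            continue  # user dismissals are preserved, never reopened or resolved
        if key in found_keys:
            item["detection_count"] += 1  # recurring issue: no duplicate created
            # item["first_detected"] is deliberately left untouched
        elif item["status"] != "Resolved":
            item["status"] = "Resolved"
            item["note"] = "Auto-resolved: not detected in latest scan"
    return existing

items = {
    "apex:TooManySoql": {"status": "Open", "detection_count": 3, "first_detected": date(2026, 1, 2)},
    "fls:ProfileGap":   {"status": "Open", "detection_count": 1, "first_detected": date(2026, 2, 9)},
    "ui:OldTheme":      {"status": "False Positive", "detection_count": 2, "first_detected": date(2026, 1, 2)},
}
merge_scan_results(items, found_keys={"apex:TooManySoql"})
print(items["apex:TooManySoql"]["detection_count"])  # 4
print(items["fls:ProfileGap"]["status"])             # Resolved
```

Notice the three behaviors from the list above: the recurring item's count goes up while its First Detected Date is untouched, the no-longer-detected item is auto-resolved, and the dismissed item is skipped entirely.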
Scheduling Regular Scans
Scheduled scans run automatically in the background and keep your dashboards and debt lists fresh without manual intervention.
To schedule a scan:
- From the Dashboard, click Schedule in the Quick Actions card
- Select the scan type (Full, Quick, Security Only, or Data Only)
- Choose frequency:
- Daily — runs every day at the hour you specify (recommended for Quick or Security Only scans)
- Weekly — runs once per week (recommended for Full scans on large orgs)
- For Weekly scans, select the day of the week
- Choose the hour to run (use a low-traffic time like 2:00 AM)
- Click Schedule to save
Recommended Schedule:
- Small orgs (< 1000 users): Daily Full Scan at 2 AM
- Medium orgs (1000-5000 users): Weekly Full Scan + Daily Quick Scan
- Large orgs (5000+ users): Weekly Full Scan on Saturday night + Daily Security Only scan
Managing Scheduled Scans:
To view, modify, or cancel scheduled scans:
- Go to Setup > Jobs > Scheduled Jobs (search “Scheduled Jobs” in Quick Find)
- Look for jobs named `OrgIQ_Daily_*_Scan` or `OrgIQ_Weekly_*_Scan`
- To cancel a scheduled job, click Del next to the job name
- To modify a schedule, cancel the existing job, then re-schedule from The Auditor with new settings
- To view a job’s next run time, look at the “Next Scheduled Run” column
Note: Only one schedule per scan type can exist at a time. If you try to schedule a Daily Full Scan when one already exists, you’ll get a “job already scheduled” message. Cancel the old one first, then schedule the new one.
Viewing Scan History
To review past scans and see trends:
- Navigate to Org Health Scans tab in the App Launcher (not the Dashboard — this is a standard record list)
- See all past scans with date, scan type, status (Completed, Failed, In Progress), and overall health score
- Click any scan record to see the full details including all findings from that scan
- Compare scores across scans to track your org’s improvement over time
11. Troubleshooting
Common Issues
“Insufficient access” error on any page
- You must have one of the OrgIQ permission sets assigned (Admin, Analyst, Viewer, or API_User)
- Contact your Salesforce Administrator to request a permission set assignment
“No data available” on dashboard
- Ensure you have the correct permission set assigned
- Run an initial org health scan
- Check that the scan completed successfully
Scan fails to complete
- Large orgs may time out — try a Quick Scan first
- Check for governor limit issues
- Contact admin if problem persists
Cannot export fields
- Ensure you have selected at least one object
- Check that you have access to the objects
- Try a smaller selection first
Data Quality rules not evaluating
- Verify the rule is marked Active (not Paused)
- Check that the target object/field exists
- Ensure you have access to query the object
Import fails
- Verify CSV/JSON format matches expected structure
- Check for special characters in field names
- Ensure file is not empty
Cost savings opportunity fails to save
- Ensure required fields are completed: Name, Category, Status
- Verify the Implementation Effort value is valid (Trivial, Small, Medium, Large, or XLarge)
Data Cloud tab shows “Not Available”
- Most common cause: The `Salesforce_API` Named Credential hasn’t been created in your org. See Section 7 — “Required One-Time Setup: Named Credential” for step-by-step instructions.
- The error message on the Data Cloud tab will specify the exact reason:
- “Named Credential ‘Salesforce_API’ is not configured…” → Follow the Named Credential setup in Section 7
- “Authentication failed. Verify the Named Credential…” → Re-authenticate the External Credential principal
- “Data Cloud is not enabled in this org.” → Data Cloud is not provisioned — contact Salesforce
- Ensure you have Data Cloud Admin permissions in addition to an OrgIQ permission set
- Ensure your OrgIQ permission set has access to the External Credential principal (Setup > Permission Sets > External Credential Principal Access)
Data Cloud dashboard shows no data streams
- Data Cloud is enabled but no data streams are configured
- Go to Setup > Data Cloud Setup > Data Streams to create your first data stream
- Use the Field Exporter’s Data Cloud export formats to generate mapping files
Data Cloud health score is low
- Check for data streams in Error or Disconnected status
- Verify identity resolution rulesets are active
- Ensure segments have been published and have members
- Check that activation targets are properly configured
Duplicate issues appearing in Technical Debt list
- This should not happen with the current version — scans automatically deduplicate
- If you see duplicates from before this feature was added, they are from older scans
- Run a new scan to consolidate: recurring issues will be deduplicated and old copies auto-resolved
Getting Help
- Documentation: Refer to this guide
- Support: Contact your Salesforce Administrator
12. Glossary
| Term | Definition |
|---|---|
| Activation Target | A Data Cloud destination for segment data (e.g., Marketing Cloud, Ads) |
| Calculated Insight | A Data Cloud computed metric or aggregation derived from unified data |
| Contact Point | Data Cloud objects for email, phone, and address (e.g., Contact Point Email) |
| Data Kit | A packaged Data Cloud configuration for deployment |
| Data Stream | A mapping from a Salesforce object to a Data Cloud DMO that syncs data |
| Detection Count | Number of consecutive scans in which a technical debt item has been found |
| DMO (Data Model Object) | Data Cloud’s standard data objects like Individual, Account, Contact Point |
| Governor Limits | Salesforce platform limits on resources like queries and CPU time |
| Health Dimension | One of the 6 scoring areas: Security, Data Quality, Performance, Integration, Data Cloud, Metadata |
| Health Score | A 0-100 rating of org quality across the 6 health dimensions |
| Identity Resolution | Data Cloud feature that matches and merges records across data sources into unified profiles |
| Metadata | Configuration data that defines your Salesforce setup |
| Permission Set | A Salesforce security feature that grants access to specific application features |
| Remediation | The process of fixing identified issues, either manually or through automated actions |
| Scan | An analysis of your org’s metadata and configuration across all health dimensions |
| Segment | A Data Cloud audience definition based on criteria applied to unified profiles |
| Technical Debt | Accumulated inefficiencies in code, configuration, or data requiring future work |
| YAML | A human-readable data format used for configuration files |
Quick Reference Card
Status Meanings
| Status | Meaning |
|---|---|
| Open | Issue identified, not yet addressed |
| In Progress | Work underway |
| Resolved | Issue fixed |
| Deferred | Postponed for later |
| Won’t Fix | Decided not to address |
| False Positive | Not actually an issue |
Severity Levels
| Level | Action Timeframe |
|---|---|
| Critical | Immediate (within 24 hours) |
| High | Soon (within 1 week) |
| Medium | Planned (within 1 month) |
| Low | Backlog (when convenient) |
| Info | No action required |
Document Version: 2.0
Last Updated: March 25, 2026
