Troubleshoot: My Data Looks Wrong


Estimated reading time: 6 minutes 

Prerequisites: Familiarity with creating queries, adding metrics and dimensions, and navigating the Explore or Storyboard interface


 

What you'll learn

This guide helps you to:

  • Narrow down why a metric result doesn't match expectations.
  • Validate whether the issue is in the time period selection, calculation, filters, or source data.
  • Use Drill-Through and ID-level analysis to pinpoint the discrepancy.
  • Escalate to Support with the right information, if needed.

     


Overview

If a metric result or data value doesn't look right, the cause is typically one of four things: the time period selection; a filter applied at the query, storyboard, or metric level; the metric calculation itself; or the underlying source data. This guide walks you through a structured process to narrow down which one is responsible.

 

There are two troubleshooting paths depending on what you're seeing:

 

Path A - Metric result looks wrong

To identify what's causing an unexpected result, you need to progressively simplify the query until you can isolate the factor responsible. Work through these steps in order.

 

Step 1. Isolate a single metric

If your query has multiple metrics reporting unexpected results, focus on one at a time.

  1. Copy the tile or save the query as a new version to avoid unplanned changes to a published visualization.
  2. Remove all metrics except the one you're investigating.

Step 2. Test the time period and filters

  1. Change the time period in the query to a range where you know the expected result should appear.
  2. Remove any dimension filters one at a time to see whether a specific filter is causing the discrepancy.
  3. If the result corrects itself after a change, you've found the cause. If not, continue to the next step.

Step 3. Review the metric calculation

  1. Click on the metric result to open the Drill-Through pop-up and review the calculation summary.
  2. If you have access, open the full metric definition via Create / Edit to inspect the complete configuration.
  3. Check for:
    • Filter inclusions or exclusions applied within the metric definition
    • The base field(s) or input metrics the calculation references

Tip: If the metric is a calculated metric, review each of its inputs. For example, Termination Rate = Terminations ÷ Average Headcount. An input may itself be a calculated metric, so keep digging until you reach the table-based metrics. Take note of each input; you'll validate them individually in the next step.

Step 4. Validate calculated metric inputs

  1. Add each input metric to your test query alongside the calculated metric.
    • For example, if investigating Termination Rate, add both Terminations and Average Headcount.
  2. Review whether the input values produce the expected result when you apply the calculation manually or compare to source data.
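If you have SQL Explorer access, the manual check can also be sketched as a query. This is a minimal illustration only: the table names (employee_events, monthly_headcount) and column names are placeholders, not your actual schema.

```sql
-- Sketch: manually recompute Termination Rate for 2025.
-- All table and column names are placeholders; substitute your schema's names.
WITH term AS (
  SELECT COUNT(*) AS terminations
  FROM employee_events
  WHERE event_type = 'Termination'
    AND event_date BETWEEN '2025-01-01' AND '2025-12-31'
),
hc AS (
  SELECT AVG(headcount) AS avg_headcount
  FROM monthly_headcount
  WHERE snapshot_month BETWEEN '2025-01' AND '2025-12'
)
SELECT term.terminations,
       hc.avg_headcount,
       100.0 * term.terminations / hc.avg_headcount AS termination_rate_pct
FROM term, hc;
```

If the hand-computed rate matches the query result in Explore, the calculation is applying the inputs correctly and the inputs themselves are the next thing to check.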

Step 5. Drill down to ID level

  1. Add a related ID field to your test query; for example, Employee ID or Person ID from the relevant table. Effective dates are also very helpful for event metrics. 
  2. Keep or add any related dimensions (such as event reason or business unit).
  3. Compare the line-by-line results against your source data. Look for:
    • Missing records - IDs present in the source but absent from the query
    • Additional records - IDs appearing in the query that shouldn't be there
    • Mismatched details - dimension values or dates that don't align with the source
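If both the source extract and the query's ID-level output are available as tables, the line-by-line comparison can be sketched in SQL. The table names below (source_events, metric_events) are placeholders for illustration:

```sql
-- Sketch: find IDs present in the source but absent from the metric's event set.
-- source_events and metric_events are placeholder names, not real schema tables.
SELECT s.person_id, s.event_date, s.event_reason
FROM source_events s
LEFT JOIN metric_events m
  ON  m.person_id  = s.person_id
  AND m.event_date = s.event_date
WHERE m.person_id IS NULL;   -- missing records
-- Swap the join direction (metric LEFT JOIN source) to find additional records.
```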

Step 6. Compare with an unfiltered version of the metric

If your metric has filters applied and records appear to be missing:

  1. Add an unfiltered version of the same metric to your test query; for example, use Events - All with the same related dimensions.
  2. Compare the filtered and unfiltered results side by side to identify where the filtering is excluding records.
  3. This can help surface ID mapping issues or filter logic that's narrower than intended.
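The filtered-versus-unfiltered comparison can likewise be sketched in SQL. The table name (events_all) and the filter condition are assumptions for illustration; substitute whatever filter your metric actually applies:

```sql
-- Sketch: per-ID counts with and without the metric's filter,
-- keeping only IDs where the filter excludes at least one record.
SELECT person_id,
       COUNT(*) AS unfiltered_events,
       SUM(CASE WHEN event_reason = 'Voluntary' THEN 1 ELSE 0 END) AS filtered_events
FROM events_all
GROUP BY person_id
HAVING COUNT(*) <> SUM(CASE WHEN event_reason = 'Voluntary' THEN 1 ELSE 0 END);
```

IDs returned here are the records the filter is dropping; checking a few against the source tells you whether the filter logic is narrower than intended.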

Still stuck?

If you've worked through these steps and the result still doesn't look right, raise a Support ticket. Include:

  • A link to your test query or storyboard (the simplified version you used for troubleshooting)
  • A link to the original query showing the unexpected result
  • A brief description of what you expected to see versus what you're seeing

The Support team will investigate from there.

 

Path B - Record-level data looks wrong

If you've drilled down to the ID level and the raw, unfiltered data shows unexpected values — a wrong dimension value, an incorrect effective date, or a field that doesn't match the source — the issue is likely in the data sourcing or transformation logic rather than the metric calculation.

Step 1. Document what you're seeing

  1. Identify specific ID examples where the data doesn't match expectations.
  2. Note the field or dimension that looks wrong, what value you see in One Model, and what value you expect based on the source system.

Step 2. Investigate with SQL Explorer

SQL Explorer (available via the Data menu for permissioned users) lets you query the underlying schema tables directly, including the source data being sent to One Model.

  1. Navigate to Data > SQL Explorer.
  2. Write a query to look up the specific IDs you identified in Step 1.
  3. Compare the values in the One Model schema tables against the source data tables to determine where the discrepancy originates.

Note: The Support team uses SQL extensively for this type of investigation. For more on using SQL Explorer, see SQL Explorer documentation.
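A comparison query might look like the following sketch. The table names (one_model_employee, source_employee), column names, and IDs are placeholders; the actual schema and source table names in your instance will differ:

```sql
-- Sketch: compare flagged IDs across a One Model schema table and its source table.
-- All names and ID values are placeholders for illustration.
SELECT om.person_id,
       om.business_unit   AS model_business_unit,
       src.business_unit  AS source_business_unit,
       om.effective_date  AS model_effective_date,
       src.effective_date AS source_effective_date
FROM one_model_employee om
JOIN source_employee src
  ON src.person_id = om.person_id
WHERE om.person_id IN ('12345', '67890');
```

Rows where the model and source columns disagree show you exactly which field diverges and for which IDs, which is the detail Support needs.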

Still stuck?

If you need help understanding why a value differs from what you expect, raise a Support ticket. Include:

  • The specific ID examples you investigated
  • The field or dimension with the unexpected value
  • What you're seeing in One Model versus what you're expecting from the source
  • Any SQL Explorer queries you ran (if applicable)


Worked example - Investigating an unexpected metric result

This example walks through the diagnostic process for a metric that doesn't look right in a table visualization. Each step maps to the procedure in Path A.

Starting point

You notice one result in a table that looks wrong.

Isolate the metric (Step 1)

Copy the tile and focus on the single metric. The result is the same (40.0%), confirming the issue isn't caused by interaction with other metrics. Assume in this example that 2025 is the correct time period, so we'll move to Step 3.

 

Review the calculation (Steps 3–4)

The metric is a calculated metric. Check the calculation via Drill-Through or Create / Edit, then add the input metrics to the query. The calculation logic checks out: the inputs reproduce the displayed result when combined, so the input values themselves warrant investigation.

 

Drill to ID level (Step 5)

Add the related ID, in this case Person ID from the event table, plus Event Date. At least one ID now shows an unexpected count that needs investigation.

 

Investigate with SQL Explorer (Path B, Step 2)


Run a basic SQL Explorer query to check the employee-level data for the flagged ID.
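Such a query might look like this sketch, where the table name, columns, and ID value are all placeholders:

```sql
-- Sketch: pull every event row for the flagged ID.
-- employee_events and '12345' are placeholders for illustration.
SELECT person_id, event_date, event_type, event_reason
FROM employee_events
WHERE person_id = '12345'
ORDER BY event_date;
```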

Compare unfiltered results (Step 6)

If records are still missing after the ID-level check, go to a fully unfiltered list to confirm whether the issue is filter-related or data-related.

 

 

