One Model delivers people analytics infrastructure. It pulls together data from all your HR systems and related tools, then intelligently reorganizes and joins that data, presenting it back as if it all came from a single source.

This article is for those of you who want to know what's going on "under the hood," so to speak. We'll break that data processing journey into four parts, each with a video demonstration, and mix in links to related content along the way.

Part One: Getting Data into One Model

In this first video, I'll set up a new flat file data source and upload its data into One Model. To keep things somewhat real-world, I'll use a report extract from our friends over at SmartRecruiters. After you watch this video, you'll know how to add a new data source and how to perform a file upload.

Here are some related links covering the content in this first step:

Adding a File Data Source:  http://help.onemodel.co/data-pipeline/file-data-sources/adding-a-file-data-source

Setting up a Delimited Input:  http://help.onemodel.co/data-pipeline/file-data-sources/setting-up-a-delimited-input

Delimited Input - Date and Timestamp columns:  http://help.onemodel.co/data-pipeline/file-data-sources/delimited-input-date-and-timestamp-columns

Incremental File Upload:  http://help.onemodel.co/data-pipeline/incremental-file-upload
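To make the delimited-input setup concrete, here's a minimal sketch in Python of the same idea: declaring the delimiter and flagging which columns hold dates and timestamps before loading. This is a generic pandas illustration, not One Model's actual input definition, and the SmartRecruiters-style column names are invented for the example.

```python
import io

import pandas as pd

# Hypothetical recruiting extract; the column names are illustrative only.
raw = io.StringIO(
    "application_id,candidate_name,job_title,applied_on,last_updated\n"
    "A-001,Ada Lovelace,Data Engineer,2023-01-15,2023-01-20 09:30:00\n"
    "A-002,Grace Hopper,Analyst,2023-02-01,2023-02-03 14:05:00\n"
)

# Declare the delimiter and the date/timestamp columns up front,
# much as a delimited-input definition would.
df = pd.read_csv(raw, sep=",", parse_dates=["applied_on", "last_updated"])

print(df.dtypes)
```

Declaring date and timestamp columns at load time (rather than leaving them as strings) is what makes time-based joins and breakouts possible later in the pipeline.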

Part Two:  Processing Script to Transform that Data

In this second video below, I'll add a processing script to transform our raw data import into metric and dimension tables that we can hook into the One Model query engine.
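The transformation in this step can be sketched as follows: split the raw import into a dimension table (one row per distinct value, with a surrogate key) and a fact table keyed to it. This is a generic pandas sketch under assumed column names, not the actual processing script shown in the video.

```python
import pandas as pd

# Illustrative raw applications table (a stand-in for the uploaded extract).
raw = pd.DataFrame({
    "application_id": ["A-001", "A-002", "A-003"],
    "application_state": ["Hired", "Rejected", "Hired"],
    "applied_on": pd.to_datetime(["2023-01-15", "2023-02-01", "2023-02-10"]),
})

# Dimension table: one row per distinct state, with a surrogate key.
dim_state = (
    raw[["application_state"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("state_key")
    .reset_index()
)

# Fact table: application records keyed to the dimension.
fact_applications = raw.merge(dim_state, on="application_state")[
    ["application_id", "state_key", "applied_on"]
]
```

Splitting the data this way is what lets the query engine count facts while slicing by any attribute of the dimension.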

Part Three:  Configuring the Data Tables

Below, in the third video, Josh adds the configuration necessary for the One Model application to find our new tables, understand the relationships between them, and display our dimension.
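Conceptually, this configuration step amounts to declaring which tables exist, what grain each is at, and how they join. The sketch below is hypothetical metadata in Python — One Model's actual configuration is done in the application, not in code — and it only illustrates the shape of what gets declared.

```python
# Hypothetical table/relationship metadata; names are assumptions for
# illustration, not One Model's real configuration format.
table_config = {
    "tables": {
        "fact_applications": {"grain": "application_id"},
        "dim_application_state": {"grain": "state_key"},
    },
    "relationships": [
        {
            "from": ("fact_applications", "state_key"),
            "to": ("dim_application_state", "state_key"),
            "type": "many_to_one",
        }
    ],
}

# A relationship is only valid if both endpoints name configured tables.
tables = table_config["tables"]
for rel in table_config["relationships"]:
    assert rel["from"][0] in tables and rel["to"][0] in tables
```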

Part Four:  Building a Metric Based on Our New Data Source

Finally, in video four below, I'll define a new metric based on our new data source. I'll also use the time dimension and our application state dimension to do a couple of different breakouts of that data.
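The idea of a metric with breakouts can be sketched in pandas: a count of applications, grouped once by month (the time dimension) and once by application state. The column names are assumptions carried over from the earlier illustrative examples, not the actual One Model schema.

```python
import pandas as pd

# Illustrative application records; column names are assumptions.
apps = pd.DataFrame({
    "applied_on": pd.to_datetime(
        ["2023-01-05", "2023-01-20", "2023-02-02", "2023-02-15"]
    ),
    "application_state": ["Hired", "Rejected", "Hired", "In Review"],
})

# Metric: application count, broken out by month and by application state.
by_month = apps.groupby(apps["applied_on"].dt.to_period("M")).size()
by_state = apps.groupby("application_state").size()

print(by_month)
print(by_state)
```

The same underlying metric (a row count) yields different views depending on which dimension you break it out by — exactly what the query engine does once the tables and relationships from the previous steps are in place.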

And there you have it, folks! One complete trip through the One Model data processing pipeline. It's a basic example, but it gives you a firsthand view of the steps necessary to ingest, model, and deliver data in One Model.

We will do our best to update these videos as we continue to improve our data processing tooling. These elements are the backbone of the rest of the work we do in One Model, so expect improvements as we build out our vision of people analytics infrastructure.
