
In our first blog on Microsoft’s Power Platform, we discussed the Data Collection stage of the ecosystem, drawing on the work carried out by Andrew Welch and Lee Baker. In this blog, we’ll address the Data Ingestion stage and the part it plays in an organisation’s journey to Power Platform success.
What is Ingestion?
Alongside the initial Data Collection stage, Ingestion is the second element of what we call “data in flight”.
It’s at this stage we begin to address data that flows into the Microsoft ecosystem through services such as Azure Data Factory and Azure Event Grid.
Ingestion can be thought of in two ways: migration, where data flows one way from source to target, or integration, where data flows two ways and is kept in sync between its source and its new target destination.
What are events?
‘Events’, or triggers, can come in the form of an online application submission or any other form of data submission.
Events can come from any source, and although organisations are becoming more and more Microsoft-centric, it’s often the case that not all data feeds come from Microsoft technology. This can cause a headache when it comes to daily data ingestion and ongoing data management.
However, powerful tools such as Azure Data Factory and Event Grid are the painkiller for this headache. Data Factory gives organisations of all sizes the ability to extract, transform, and load (ETL) data from source to target.
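To make that concrete, here’s a minimal Python sketch of the extract-transform-load pattern; the CSV source, field names, and SQLite target are illustrative assumptions rather than a real Data Factory pipeline, which would orchestrate the same three steps across cloud-scale sources and sinks.

```python
# A minimal sketch of the extract-transform-load (ETL) pattern.
# The CSV source, column names, and SQLite target are illustrative
# assumptions -- not a Data Factory pipeline definition.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    """Extract: read raw records from a source system (here, a CSV export)."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def transform(records: list[dict]) -> list[tuple]:
    """Transform: normalise each record into the shape the target expects."""
    return [
        (
            r["email"].strip().lower(),   # canonical key
            r["name"].strip().title(),    # tidied display name
        )
        for r in records
    ]

def load(rows: list[tuple], db_path: str) -> None:
    """Load: upsert the transformed rows into the target database."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS contacts (email TEXT PRIMARY KEY, name TEXT)"
    )
    con.executemany(
        "INSERT INTO contacts (email, name) VALUES (?, ?) "
        "ON CONFLICT(email) DO UPDATE SET name = excluded.name",
        rows,
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("source_export.csv")), "target.db")
```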
Unlike typical monitoring approaches, which must poll for changes, Azure Event Grid gives organisations the ability to receive push notifications when a change occurs.
With this information, data can then be processed appropriately and pushed to the correct location in near real time, and with far less computing power.
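As a rough sketch of what receiving those pushes looks like, the Flask webhook below subscribes to Event Grid events; the endpoint path and the process function are illustrative assumptions, while the subscription-validation handshake follows the documented Event Grid webhook contract.

```python
# A minimal sketch of an Event Grid webhook subscriber using Flask.
# The /events path and process() routing are illustrative assumptions;
# the validation handshake is part of the Event Grid webhook contract.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/events", methods=["POST"])
def handle_events():
    events = request.get_json()
    for event in events:
        # Event Grid proves endpoint ownership with a one-off validation event.
        if event.get("eventType") == "Microsoft.EventGrid.SubscriptionValidationEvent":
            code = event["data"]["validationCode"]
            return jsonify({"validationResponse": code})
        # Otherwise a change has been pushed to us -- no polling required.
        process(event)
    return "", 200

def process(event: dict) -> None:
    """Placeholder: route the pushed record to the correct target system."""
    print(event.get("eventType"), event.get("subject"))

if __name__ == "__main__":
    app.run(port=8000)
```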
Data isn’t Microsoft Exclusive
As mentioned above, customer data does not, and should not, belong to an application. Applications are simply a way of surfacing data and providing an interface that encourages good data processes and procedures.
With a mixture of various systems, software, and CRMs all collecting data and pushing it towards the Dataverse, ensuring its quality becomes more difficult, increasing the likelihood of poor data entering the Dataverse and creating customer insults rather than insights.
So, how do we bridge the gap between these data streams?
What DQ Global Can Do
Our range of data management products and services are the perfect solution when faced with this issue.
When data enters the ecosystem, we start by cleansing it according to a set of pre-defined rules. Once this has happened, we search and score records to determine whether they already exist in the target database. If we find duplicates, the records are either updated, mastered, or, where there is ambiguity, sent to a review screen for further assessment by one of our team.
The beauty of this is that wherever the data comes from, we can convert it into a common form and search it against the database, ensuring the correct processes are in place to prevent downstream pollution and data decay within the Power Platform ecosystem.
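As an illustration of the search-and-score step described above, here’s a small Python sketch of the general pattern; the difflib scorer, thresholds, and field names are assumptions for the example, not DQ Global’s actual matching engine.

```python
# An illustrative sketch of the cleanse / search-and-score / triage pattern.
# The difflib scorer, thresholds, and fields are assumptions for the example,
# not DQ Global's matching engine.
from difflib import SequenceMatcher

AUTO_MATCH = 0.92   # confident duplicate: update or master automatically
REVIEW = 0.75       # ambiguous: send to a review screen

def cleanse(record: dict) -> dict:
    """Apply pre-defined rules: trim, collapse whitespace, lower-case."""
    return {k: " ".join(str(v).split()).lower() for k, v in record.items()}

def score(a: dict, b: dict) -> float:
    """Score similarity on name and email (equal weights, for illustration)."""
    name = SequenceMatcher(None, a["name"], b["name"]).ratio()
    email = SequenceMatcher(None, a["email"], b["email"]).ratio()
    return (name + email) / 2

def triage(incoming: dict, existing: list[dict]) -> tuple[str, dict | None]:
    """Search the target records for the best match, then decide."""
    incoming = cleanse(incoming)
    best, best_score = None, 0.0
    for candidate in existing:
        s = score(incoming, cleanse(candidate))
        if s > best_score:
            best, best_score = candidate, s
    if best_score >= AUTO_MATCH:
        return "update", best    # merge into the surviving master record
    if best_score >= REVIEW:
        return "review", best    # ambiguous: a human takes a look
    return "insert", None        # genuinely new record

# Example: an incoming lead scored against the target database
db = [{"name": "Jane Smith", "email": "jane.smith@example.com"}]
print(triage({"name": " JANE  SMITH ", "email": "jane.smith@example.com"}, db))
```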
Once we have set up these ongoing processes, you can trust the analysis and insights derived from the data that will go on to inform the important decisions being made within your organisation. This comes full circle when it comes to our Leaning Tower of Data blog.
If you’d like to discuss your data quality needs and find out how we can help, you can contact us today by clicking here.
You can also find out more about our solutions, here.