
DATA --> INFORMATION --> INTELLIGENCE

The emergence of Big Data in recent years has made it essential for companies to think about a smart Data Lifecycle Management (DLM) strategy. No digital transformation is possible without one.

Data must be accessible quickly and at the right time. More than ever, Business Intelligence feeds on information coming from ever-growing sources. In this profusion of collected data, some items have a short lifecycle (an immediate need) while others will be useful in the long run. Take, for instance, the life of a patient: a medical record helps the doctor prescribe the right medicines at the right dose, schedule follow-up visits, and so on, throughout the patient's life.

All of this information matters, whether the organization needs it frequently or not; its level of importance determines where it is stored so that it can be accessed as required. At the end of its lifecycle, the information is archived or deleted.
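
To make that idea a little more concrete, here is a minimal Python sketch of such a tiering rule; the importance flag, tier names and age thresholds are purely hypothetical and not a prescribed policy.

```python
from datetime import date, timedelta

# Hypothetical thresholds, for illustration only -- a real DLM policy is richer.
HOT_WINDOW = timedelta(days=90)        # frequently accessed: fast, expensive storage
WARM_WINDOW = timedelta(days=365 * 2)  # occasionally accessed: cheaper storage
RETENTION = timedelta(days=365 * 10)   # beyond this, archive or delete

def storage_tier(last_accessed: date, is_critical: bool, today: date) -> str:
    """Decide where a piece of information belongs, based on age and importance."""
    age = today - last_accessed
    if age <= HOT_WINDOW:
        return "hot"
    if age <= WARM_WINDOW:
        return "warm"
    if is_critical or age <= RETENTION:
        return "archive"   # e.g. a patient record kept throughout the patient's life
    return "delete"        # end of lifecycle

print(storage_tier(date(2016, 5, 1), is_critical=False, today=date.today()))
```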

Smart data lifecycle management requires much more than applications: you first have to put a structured way of thinking in place, one that results in a globally intelligent management of the organization's data.

Let's take a look at design. When you start building a data management project, each step must be carefully thought out. Long before you start collecting data, you have to structure your project according to your needs, your means and the risks involved. The overall budget must anticipate the exponential growth of data collection (and of its use) over time, as well as the human and machine costs required to keep everything running.
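
As a rough illustration of what anticipating that growth can look like, the short sketch below projects storage volume and cost over five years; the starting volume, growth rate and price per terabyte are assumptions, and human and machine operating costs would come on top.

```python
# Illustrative projection only: every number here is an assumption.
volume_tb = 5.0            # current volume, in terabytes (assumed)
growth_per_year = 0.6      # 60 % yearly growth (assumed)
price_per_tb_year = 250    # yearly storage cost per terabyte (assumed)

for year in range(1, 6):
    volume_tb *= 1 + growth_per_year
    print(f"Year {year}: ~{volume_tb:.1f} TB, "
          f"~{volume_tb * price_per_tb_year:,.0f} per year in storage alone")
```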

Some of the basic questions you have to answer:

  • Why do I need to collect data?
  • For what purpose?
  • To what benefit?
  • How do I ensure data quality in the long term?
  • Who will have access to the data?
  • Which tools should I choose?

Regardless of the methodology you select, the first step to think about is planning.

Then, of course, you must decide where all that information is going to be stored and how you will give access to it. The storage solution must also meet an absolute requirement: an adequate level of data protection. Not just to comply with the applicable laws, but also to protect your data stores against ever-increasing attack risks (think data breaches).
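
As a small illustration of protection at rest, the sketch below encrypts a record before it is stored, using the Fernet recipe from the third-party cryptography package; a real deployment would add key management, storage-layer encryption, backups and access logging.

```python
from cryptography.fernet import Fernet  # third-party package: pip install cryptography

# Illustration only: encrypt a record before it lands in storage.
key = Fernet.generate_key()    # in practice, kept in a secrets manager, never in code
cipher = Fernet(key)

record = b"patient_id=42;diagnosis=...;dosage=..."
stored = cipher.encrypt(record)            # what is actually written to disk
assert cipher.decrypt(stored) == record    # readable only with the key
```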

Once you have thoughtfully considered and set up all these steps, the long-term work can begin: collecting, using and continuously controlling the quality of your information.
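
By way of illustration, here is a minimal quality check over a batch of collected records; the field names, the sample data and the checks themselves (duplicate identifiers, missing values) are hypothetical.

```python
# Illustrative quality checks on a batch of collected records.
records = [
    {"id": 1, "email": "a@example.com", "country": "FR"},
    {"id": 2, "email": None,            "country": "FR"},
    {"id": 2, "email": "b@example.com", "country": "DE"},  # duplicate id
]

seen, duplicates, missing_email = set(), 0, 0
for rec in records:
    if rec["id"] in seen:
        duplicates += 1
    seen.add(rec["id"])
    if not rec["email"]:
        missing_email += 1

print(f"{duplicates} duplicate id(s), {missing_email} record(s) without email "
      f"out of {len(records)}")
```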


 

Governance to the rescue

The gigantic size of today's data silos (and the trend shows no sign of slowing) threatens the very value of the information you collect. Big Data makes it difficult to contain the data, to keep it consistently accurate, and sometimes even simply to use it. Costs are skyrocketing; is it still worth the effort?

Here you are now with several years of data collection behind you: your storage solution has stood the test of time, the data stakeholders in your organization have followed the quality rules with varying degrees of success, and out-of-date information is archived on a regular basis. But here's the thing: a certain percentage of the data is inevitably inaccurate, duplicated or even inaccessible, because we forgot a major element in our DLM project: data governance. The "detail" that, among other things, guarantees you can find the right information at the right time and in the right place.

How?

By centralizing access and by ensuring that the established rules are respected, the governance officer inside the organization (CIO or CDO) allows the information systems to evolve and the content to remain usable over time (e.g. for data migration, extraction or re-use needs).
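
Here is a toy sketch of what such centralized rules can look like; the roles, datasets and permissions are invented for the example.

```python
# Illustrative role-based access rules, kept in one central place.
ACCESS_RULES = {
    "analyst":   {"sales", "web_traffic"},
    "marketing": {"web_traffic"},
    "dpo":       {"sales", "web_traffic", "customers"},
}

def can_access(role: str, dataset: str) -> bool:
    """Single place where the established rules are checked."""
    return dataset in ACCESS_RULES.get(role, set())

print(can_access("marketing", "customers"))  # False: the rule is enforced centrally
print(can_access("dpo", "customers"))        # True
```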

From there, it's almost simply a matter of reaping the fruits of a long preparation and starting to extract the value from all the collected data. Almost.

Indeed, for analytics to be possible, the tools made available inside the organization have to be effective. Once more, the governance officer is responsible for choosing analytics tools that suit all the stakeholders, tools that can evolve to meet the ever-changing needs of Business Intelligence.

Data preparation, warehousing, integration, ETL, visualization... The market offers many solutions, but the current trend is towards ease of use, DaaS (Data as a Service) and "all-in-one" tools. Could data virtualization be the way to achieve our objectives quickly, easily and efficiently, regardless of the size of the organization and the complexity of its information systems?

We'll let you be the judge!
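
For readers who want a concrete feel for the idea, here is a toy sketch of data virtualization: a single query interface joining two sources on the fly, without copying them into a central warehouse. The sources and fields are invented for the example.

```python
# Two independent sources: a CRM extract and a billing lookup
# (the latter could just as well be an API call at query time).
crm = [{"customer": "Acme", "country": "FR"}, {"customer": "Globex", "country": "DE"}]
billing = {"Acme": 12_500, "Globex": 8_300}

def customers_with_revenue(min_revenue: int):
    """Join the two sources on the fly, at query time, without materializing them."""
    for row in crm:
        revenue = billing.get(row["customer"], 0)
        if revenue >= min_revenue:
            yield {**row, "revenue": revenue}

print(list(customers_with_revenue(10_000)))
```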

Writer: Muriel Adamski