zData Perspectives
by Craig S. Mullins  

 

The Next Tipping Point: Regulatory Compliance and Data Management

A tipping point, as explained by Malcolm Gladwell in his bestseller The Tipping Point: How Little Things Can Make a Big Difference, occurs when a series of changes causes organizations to behave in fundamentally different ways. After a tipping point has occurred it is usually quite obvious that something significant has changed, but predicting one can be difficult. Nevertheless, I’m going to give it a go.

The last tipping point in IT occurred during the late ’80s and early ’90s, when the industry transitioned from the age of the batch window to the age of non-stop availability. With the advent of the Internet, it was no longer acceptable for technology folks to tell the business that their applications and systems could not support round-the-clock availability.

Oh, it took some time for this tipping point to become reality. At first, businesses clamored for continual up-time while IT struggled to squeeze as much availability as possible out of the technology of the day. The hardware and software needed time to catch up to the need, but the industry adapted to the post-tipping point requirements: we added online reorganizations, autonomic management features, and hardware with rapid failover capabilities. Today 24x7 availability is an accepted business practice, and many businesses will accept nothing less.

And now, the next tipping point is on the horizon. It goes by the name of regulatory compliance. Legislation is being passed by government bodies at breakneck speed these days; no sooner do we begin to understand and comply with one law than another is passed that requires our attention. The IT Compliance Institute tracks more than 150 regulations in its Universal Compliance Project (UCP), the first independent initiative dedicated exclusively to supporting IT compliance management. You can view the UCP online at http://www.itcinstitute.com/ucp/index.aspx.

So, slowly but surely we have amassed an avalanche of regulations that dictate how we must treat corporate data. As organizations react to comply with these regulations, we will see this new tipping point. I believe it will manifest itself in the form of companies beginning to treat data as a valuable corporate asset.

The imminent change is that we will not just be saying we treat data as an asset, but actually doing it. Just about every high-level executive mouths the platitude that his or her company already treats data as a corporate asset. But do they? Think about how we treat other important assets. Our finances, that is, our monetary assets, are handled much more rigorously than our data. If a financial statement is one penny out of balance we will not stop working until we track the discrepancy down and correct it. Do we do the same thing for data quality? What about human resources? Every company has an organization chart that maps its personnel to their departments and jobs. Do we have a corporate data model that accomplishes the same task for our data? No, we do not treat data like we treat our other assets, at least not yet.

But the laws are ahead of reality, so once again, we are in that period where the software must catch up to the requirement. What types of software innovation will be required to enable us to actually treat data as a corporate asset?

Well, we need advanced algorithms and techniques for ensuring data quality. Today we see data profiling and cleansing tools, but these still require too much manual intervention to be successful. Advances will be required to automate data quality through pattern detection and real-time data profile management.
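
To make the idea of data profiling concrete, here is a minimal sketch of the kind of per-column profile such a tool would build automatically and compare against expected patterns. It is purely illustrative; the database, table, column names, and threshold are hypothetical, and no particular vendor's approach is implied.

    # Minimal data-profiling sketch (illustrative only; the database,
    # table, and column names are hypothetical). It computes per-column
    # null rates and distinct-value counts -- the kind of profile a
    # quality tool would track automatically and alert on.
    import sqlite3

    def profile_table(conn, table, columns):
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        total = cur.fetchone()[0]
        profile = {}
        for col in columns:
            cur.execute(f"SELECT COUNT({col}), COUNT(DISTINCT {col}) FROM {table}")
            non_null, distinct = cur.fetchone()
            profile[col] = {
                "null_rate": 1 - (non_null / total) if total else 0.0,
                "distinct_values": distinct,
            }
        return profile

    # Hypothetical usage: flag columns whose null rate exceeds a tolerance.
    # conn = sqlite3.connect("warehouse.db")
    # for col, stats in profile_table(conn, "customer", ["name", "ssn", "zip"]).items():
    #     if stats["null_rate"] > 0.01:
    #         print(f"Data quality alert: {col} is {stats['null_rate']:.1%} null")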

Additionally, robust database archiving solutions are just now entering the market. These solutions enable data to be maintained in an authentic manner and queried over long durations – decades, and even centuries in some cases.
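
As a rough illustration of what such archiving involves, the sketch below copies aged rows to an archive table along with a checksum that can later be used to show the data has not been altered, and then removes them from the active table. The table names, columns, and retention rule are hypothetical; commercial archiving products add far more, such as immutable storage, retention policies, and e-discovery support.

    # Minimal archiving sketch (illustrative; the orders/orders_archive
    # tables and the retention rule are hypothetical). Rows older than
    # the cutoff are copied to an archive table with a checksum so their
    # authenticity can be verified years later, then removed from the
    # active table.
    import hashlib
    import sqlite3

    def archive_old_orders(conn, cutoff_date):
        cur = conn.cursor()
        cur.execute(
            "SELECT id, customer, amount, order_date FROM orders WHERE order_date < ?",
            (cutoff_date,),
        )
        for row in cur.fetchall():
            checksum = hashlib.sha256(repr(row).encode()).hexdigest()
            conn.execute(
                "INSERT INTO orders_archive (id, customer, amount, order_date, checksum) "
                "VALUES (?, ?, ?, ?, ?)",
                (*row, checksum),
            )
            conn.execute("DELETE FROM orders WHERE id = ?", (row[0],))
        conn.commit()

    # Hypothetical usage: archive everything more than seven years old.
    # archive_old_orders(sqlite3.connect("orders.db"), "2000-06-30")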

Furthermore, the manner by which we protect the data in our enterprise databases needs to be improved. This includes, but is not necessarily limited to, better and more efficient encryption and decryption techniques, label-based access security to support more granular authorization, and, perhaps most importantly, improved database auditing. Knowing “who did what to which piece of data when” is a prime focus of many regulations, but today’s software offerings do not yet provide the full range of capabilities required to be in compliance.
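
To illustrate the kind of record regulators are asking for, here is a minimal sketch of an application-level audit trail that captures who did what to which piece of data, and when. It is an assumption-laden illustration, not how any particular DBMS implements auditing; production auditing should be enforced inside the DBMS so it cannot be bypassed, but the captured record looks essentially like this.

    # Minimal "who did what to which data, and when" audit sketch
    # (illustrative; the audit_log table and this helper are hypothetical).
    import sqlite3
    from datetime import datetime, timezone

    def audited_update(conn, user, table, key, column, new_value):
        # Apply the change...
        conn.execute(f"UPDATE {table} SET {column} = ? WHERE id = ?", (new_value, key))
        # ...and record who changed which piece of data, and when.
        conn.execute(
            "INSERT INTO audit_log (who, action, target, at) VALUES (?, ?, ?, ?)",
            (user, f"UPDATE {column}", f"{table}:{key}",
             datetime.now(timezone.utc).isoformat()),
        )
        conn.commit()

    # Hypothetical usage:
    # audited_update(conn, "cmullins", "customer", 42, "credit_limit", 5000)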

Finally, not every change will be technology-focused. Organizations will need to adopt a data governance practice to ensure that data is managed, or governed if you will, appropriately for the corporation and in compliance with the pertinent regulations. Data governance is the practice of managing the availability, usability, integrity, and security of the data in use within your organization. A sound data governance program includes a governing body or council, a defined set of procedures, and a plan to execute those procedures. Without a data governance practice, regulatory compliance is impractical, if not impossible.

Indeed, IT in 1995 looked nothing like it did in 1980, and the predominant driver of change was the move to non-stop availability. Likewise, the IT world of 2010 will look nothing like that of 1995, because of the safeguards we will put in place to better protect and manage our data.

Some people rue the onerous burden that governmental regulations impose on their data, but I applaud these regulations. After all, in most cases all they are doing is forcing businesses to do the things they should have been doing anyway. Too bad it takes legislation to make that happen.

 

From zJournal, June / July 2007.

© 2007 Craig S. Mullins. All rights reserved.
