Warning: Your new IT budget could bust quickly if you’re not careful
Government budget season has come and gone, and it seems the dust is finally settling. Even California, well known for its months-long delay in getting its budgets passed, is now operating under a signed appropriations bill.
For IT departments, it means they can now begin executing initiatives such as data integration and interconnectivity between legacy and newer systems. But beware: if these projects are not thought out well, budgets will skyrocket out of control and leave even bigger problems behind.
The true costs of lifecycle management
Most large state and federal IT projects are addressed in a vacuum in order to meet a short-term, singular business need. As a result, integrating new components with existing data is usually treated as a secondary, "one-off" aspect of the solution, scoped only to that project and designed simply to "get the thing to work."
However, many IT shops discover that, over time, they have accumulated many platforms, multiple technology stacks and varying tools, with no singular framework to manage all their expanding systems and applications. No enterprise view or mechanism for managing their data assets exists. Worse still, when it comes to addressing the "next project," previous IT investments cannot be reused from the last solution, and the cycle of "building silos" begins yet again.
The costs are real and staggering. Consider this: each year, new projects are developed and implemented without regard for their ability to leverage data across multiple platforms with ease and efficiency. That means the organization must absorb higher levels of licensing and infrastructure overhead, new and varying tools, greater support and maintenance demands (including additional IT staff) and the perpetual "reinvention of the wheel" for solving data integration within each project. The result, according to published sources: as much as half the IT budget is cannibalized by infrastructure and operational requirements.
To be fair, so-called "data mapping tools" have been around for some time as an attempt to solve this issue. In reality, though, the most popular ones introduce little more than a complex script file (additional hard-coded custom software) for the specific integration issue at hand, in the same "one-off" fashion in which the network was developed and implemented in the first place. Such solutions are still "brittle" and require additional IT staff to maintain and evolve them over time, costing even more in labor and hard-dollar resources for an unwieldy result that is less than elegant or robust.
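The difference between a brittle one-off script and a reusable framework can be sketched in a few lines. The field names and mapping below are hypothetical illustrations, not drawn from any actual agency exchange:

```python
# A hard-coded "one-off" transform bakes both schemas into code,
# so every new exchange needs a new script to write and maintain:
def convert_arrest_record(row):
    return {"SubjectName": row["name"], "ArrestDate": row["dt"]}

# A reusable, declarative approach keeps the schema knowledge as data,
# so one generic engine can serve every exchange in the enterprise:
def transform(row, mapping):
    """Apply a field-name mapping (dict of target key -> source key)."""
    return {target: row[source] for target, source in mapping.items()}

# The mapping is configuration, not code; adding an exchange means
# adding a table like this, not another custom script.
ARREST_MAPPING = {"SubjectName": "name", "ArrestDate": "dt"}

legacy_row = {"name": "DOE, JANE", "dt": "2011-10-05"}
print(transform(legacy_row, ARREST_MAPPING))
```

Both functions produce the same result for this record; the point is that the second one still works, unchanged, when the next exchange comes along.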
“Future Proofing” data integration through automation
Government organizations are starting to see this as a mission-critical issue; the biggest indicator of this trend is how the terms “XML” and “cloud computing” are becoming more mainstream in these environments through such initiatives as the National Information Exchange Model (NIEM), Global Justice XML Data Model (GJXDM) and the Logical Entity eXchange Specification (LEXS). This is because there’s an increasing move afoot to make data integration “future proof” by leveraging standards to automate as much as possible the mapping and transformation of existing and forthcoming data exchanges. To do so requires establishing a common and reusable integration framework (or “data integration tier”) that holistically serves the IT enterprise and its full host of applications, data and services.
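In concrete terms, "leveraging standards" means emitting exchange data in a shared, namespaced XML vocabulary rather than each system's private format. The sketch below is a simplified, hypothetical illustration: the element names loosely follow NIEM naming conventions, and the namespace URI is illustrative, not an official NIEM namespace:

```python
import xml.etree.ElementTree as ET

# Illustrative namespace; real NIEM exchanges use the official
# NIEM Core namespace and schemas published with the model.
NC = "http://example.org/niem-core"
ET.register_namespace("nc", NC)

def to_niem_person(record):
    """Wrap a flat legacy record in a NIEM-style Person element."""
    person = ET.Element(f"{{{NC}}}Person")
    name = ET.SubElement(person, f"{{{NC}}}PersonName")
    ET.SubElement(name, f"{{{NC}}}PersonGivenName").text = record["first"]
    ET.SubElement(name, f"{{{NC}}}PersonSurName").text = record["last"]
    return ET.tostring(person, encoding="unicode")

print(to_niem_person({"first": "Jane", "last": "Doe"}))
```

Because every participating system reads and writes the same standardized vocabulary, the mapping from each legacy schema only has to be defined once, which is what makes large-scale automation of these exchanges feasible.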
Several elements of these systems must be present, including:
•That it embrace and facilitate the use of standards relevant to the domain, specifically NIEM and the Justice Reference Architecture (JRA)
•That it be built on open source industry standards and commodity frameworks.
•That it leverage the Java/J2EE “run anywhere” platform.
•That its data interoperability be developed using existing and evolving XML standards.
•That it be architected according to Service-Oriented Architecture (SOA) principles.
•That it leverage a reusable design methodology that includes data governance, user permissions and business use cases.
In evaluating solution components and tools for enterprise data integration, certain benefits must be present to ensure an ongoing return on investment.
CIOs should ensure that the framework they choose, and in particular the design and integration tools they purchase, are able to:
•Mitigate the impact of transforming legacy information systems.
•Reduce development and implementation time from months to days.
•Provide rich and intuitive user interfaces that facilitate ease of use.
•Remove the need to be an XML or standards expert, or the cost of expensive “expert” consulting services.
•Provide on-demand artifact generation from user-defined compositions.
•Allow for the sharing and reuse of defined artifacts.
•Support full lifecycle management and provide a foundation for SOA governance.
IT departments of all shapes and sizes share one common trait: management wants to significantly improve network efficiency and enhance services through data sharing, within the confines of a limited and closely scrutinized budget that does not look kindly on spending money later just to replace work done only a few years before. Organizations that make it a priority to build such convergence systems on open-source platforms that conform to evolving standards will gain greater automation and data integration across the entire network, a network that requires fewer resources to maintain and remains flexible and configurable however the communications environment evolves.
Thomas Kooy is the Vice President of Business Development for Patriot Data Solutions Group, a technology company with deep domain expertise in developing middleware platforms for local, state and federal government agencies in order to share information across multiple, disparate databases. He can be reached at email@example.com.
Published in Public Safety IT, Nov/Dec 2011