IT architecture, infrastructure, and outsourcing
By Matt Brodacki
Regardless of an agency’s mission, computer networks have become a critical resource in allowing an organization to achieve goals and complete objectives. It is for this reason that department heads and network administrators are working to gain an understanding of how technology is affecting their departments. Executives and network administrators not only have to embody the foresight to balance a myriad of daily technical issues to continue operations, they must also be capable of evolving to meet the future needs of their organization.
An agency’s inability to evolve or meet organizational and security needs can prove extremely costly. Perhaps the best-known example is the TJX security breach of 2007. CNN reported that TJX, which operates discount stores such as T.J. Maxx and Marshalls, had to explain to traders that its profit was severely undercut by a $118 million charge stemming from a dramatic breach of customer data. The breach reached epic proportions: 45.7 million credit and debit card numbers were stolen over an 18-month period, and another 455,000 customers who returned merchandise without receipts had their personal data stolen, including driver’s license numbers. The breach was a direct result of combined architecture and infrastructure flaws, which empowered computer criminals to steal from unsuspecting victims. Although it is easy to speculate in hindsight, many blame TJX for not having taken better precautions. By not operating with security practices in mind, the company suffered a catastrophic breach of data totaling millions of dollars in losses.
The manner in which data is stored and transported can also present a problem for companies. CNN reported another extremely large case in 2006, when a tape was lost while being shipped via UPS to the credit reporting bureau TransUnion in Bridgeport, CT. The missing tape contained confidential information about customers, including bank employees, such as Social Security and checking account numbers. Although the tape did not hold checking account balances, debit card numbers, personal identification numbers (PINs), or birth dates, the loss still affected more than 90,000 bank customers.
So the real question is: how does an administrator oversee a computer network and balance a budget with so many technical variables? Unfortunately, there is no easy answer. Most network administrators will tell you that the only safe computer is one that is off and unplugged. However, an organization can considerably lower its chances of becoming a victim by taking a proactive stance and evaluating its architecture, infrastructure, and outsourcing.
Architecture refers to the configuration and overall blueprint of the computers within a network. How computers are designed and connected to one another can change as quickly as technology itself, so it is imperative to take a systematic approach that lets departments adjust as those changes arrive. A solid architecture optimizes communication and provides the foundation on which an infrastructure is built. Because implementing an architecture is a complex project that often requires many steps, administrators should establish an outline or guide to follow; a guide lets them focus on the key elements without the worry of missing a critical step. If you are creating a new architecture from scratch, it may be beneficial to evaluate the following:
1. Number of users accessing the system
2. Anticipated growth
3. The performance needs of the users
4. The federal, state or local mandates
5. Costs associated with the new type of communication framework
6. Connection method
If you happen to be an administrator or new department head taking over an existing network, you may also want to consider the following:
1. Determine how the computers on the network currently communicate with each other (wired networks, wireless routers, wireless modem cards, VPNs, etc.) and document a basic map for reference.
2. Determine if regulations are being met, and develop a strategy to gain compliance as soon as possible if they are not being met.
3. Perform a risk assessment to evaluate possible connection flaws in the system, and make corrections to the architecture to reflect an acceptable standard.
4. Have a contingency plan for an alternative connection method if the network encounters a disruption of service.
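The first two steps above, documenting how machines connect and keeping that map available for compliance and contingency planning, lend themselves to a machine-readable inventory. The sketch below is a minimal illustration in Python; the host names, roles, and connection methods are hypothetical, and a real inventory would live in a database or asset-management tool rather than a script.

```python
# A minimal sketch of a network map kept as a machine-readable inventory.
# All host names and connection methods here are hypothetical examples.
network_map = {
    "fileserver01": {"connection": "wired", "role": "server"},
    "dispatch-ws1": {"connection": "wired", "role": "workstation"},
    "patrol-laptop3": {"connection": "vpn", "role": "laptop"},
    "records-ws2": {"connection": "wireless", "role": "workstation"},
}

def connections_in_use(inventory):
    """Return the set of connection methods currently in use."""
    return {host["connection"] for host in inventory.values()}

def hosts_by_connection(inventory, method):
    """List hosts using a given connection method -- useful when planning
    a contingency for that method encountering a disruption of service."""
    return sorted(name for name, info in inventory.items()
                  if info["connection"] == method)

print(connections_in_use(network_map))
print(hosts_by_connection(network_map, "wired"))
```

Even a simple map like this makes the later steps easier: a risk assessment can iterate over every host, and a contingency plan can be checked against each connection method actually in use.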
Infrastructure is slightly different from architecture in that it focuses more on the components of a network rather than the connection method or design of a network. Key infrastructure components are the backbone of a network and can include servers, firewalls, workstations, and other various pieces of equipment. Components of an infrastructure are limited only by the imagination and serve as the most critical part of a network. The infrastructure utilizes the architecture of a network to carry out its mission of providing users the ability to achieve their goal through the use of a computer.
Infrastructures also demand constant attention because hardware and software are evolving at an unprecedented pace. It is not uncommon for companies or organizations to implement critical components without first placing them in a test environment for an extended period. Many organizations’ financial constraints simply will not allow for a test environment, and components are often outdated by the time they leave testing because technology evolves so rapidly.
So how do administrators roll out new components into the backbone of their computer networks? Many times administrators must be able to integrate new critical hardware and software “on the fly” and have the skill set or support system to troubleshoot problems as they arise. The days of having one phone number to call when things go wrong have long since dissolved. With the growing complexity of infrastructures, even network administrators must rely on a team of skilled people, each of whom may have a specialty within the network. One employee may be responsible for the firewall settings while another may strictly handle backup sessions for data.
A well-instituted infrastructure maintains a symbiotic relationship with its architecture with a goal of allowing users to perform tasks seamlessly. It strives to pull all of the network components together to communicate without hesitation. Department heads must be able to utilize all of the exhaustive efforts of training staff, purchasing hardware, maintaining software costs and managing personnel to allow public safety personnel to use technology to perform their jobs more efficiently.
Something as easy as sending an e-mail represents a monumental feat of combined and layered technology completing a single multi-faceted task. Although the task may seem as simple as clicking a button, it is actually an extremely complex series of commands, and if one critical component of the infrastructure fails in the process, the communication is not carried out. Administrators must minimize disruptions of service and optimize all network hardware and software to perform that single task. Achieving complex technological goals is possible and done every day throughout public agencies, but the process is greatly enhanced and appreciated when people are well informed about what resources are necessary to perform even the simplest of tasks.
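To make the layering concrete, the sketch below uses Python’s standard-library email module to assemble the message that a single click creates. The addresses are hypothetical; the comments note the infrastructure steps that must all succeed after the message is built.

```python
# A minimal sketch of what "clicking send" actually assembles,
# using Python's standard-library email module. Addresses are hypothetical.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "officer@example.gov"
msg["To"] = "records@example.gov"
msg["Subject"] = "Monthly activity report"
msg.set_content("Report attached in the next revision.")

# Behind the single click, the infrastructure must then: resolve the
# recipient's mail server via DNS, open a TCP connection through any
# firewalls, speak SMTP to relay the message, and store it on the
# destination server. If any one component fails, the e-mail never arrives.
raw = msg.as_string()
print(raw)
```

Each header and the body are only the visible surface; DNS servers, firewalls, and mail relays in the infrastructure all participate in delivering them.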
Perhaps one of the most sensitive subjects within the technology community is the question of outsourcing. Salaries are typically the greatest expenditure for companies and public agencies, so it is for obvious reasons that executives wrestle with the concept of outsourcing IT work. Administrators are trained to seek the greatest level of work for the least amount of money, so should outsourcing be a viable solution for most public sector workplaces? Not always, particularly if you are dealing with sensitive information.
Federal, state, and local data stores are most often interconnected, which enables any end user to have instant access to millions of sensitive records. Although outsourcing appears to save an exorbitant amount of money, public sector leaders must approach this topic with the greatest level of caution.
A prime illustration of how quickly things can deteriorate might be as simple as a small agency purchasing a new computer with on-site support. Upon encountering a problem with the computer, the administrator contacts that support to repair the workstation. Most support contracts now have the technician take remote control of the machine to optimize the technician’s time. By allowing this, the administrator has just given the remote technician access to sensitive records. There has been no background check or verification of who the technician actually is, and there is a good chance the technician will be difficult to locate after the call ends. Whether the technician resides in the United States or in a foreign country, the administrator has just compromised the network.
All outsourced vendors should have a complete background check before performing tasks on workstations that have access to a server with sensitive information. Outsourcing can still be a useful tool if utilized in the proper environment, but administrators should consider the risks before using outsourcing as a solution.
Many proactive organizations are taking measures to properly secure and access sensitive information. By utilizing technology, public service organizations can now limit the amount of data being stored on workstations or laptop computers that might not be located in the office.
One of the premier software packages cropping up throughout the public sector is Citrix. Most people are familiar with its sister product, GoToMyPC, which does essentially the same thing by allowing users to view a particular desktop from a remote location. Citrix and GoToMyPC let remote users log in and conduct any necessary work as if they were sitting at their workstations; only keystrokes and screen updates travel across the wire. The technology has become so seamless that the only difference you may notice is your dog running past your desk while you type that overdue report. Essentially, the user jumps across the wire and works inside the secure environment. Once the work is complete, the user logs off, and the information is saved in the secure environment rather than on the local machine.
This is a simple way for organizations to expand operations to a remote location while keeping security principles in mind. Furthermore, when a laptop is stolen from the home of a public servant who may be using it to access sensitive data, administrators can have the comfort of knowing that no data resides locally on the machine that is now in the hands of a criminal.
Leaders of an agency must realize the importance of managing a network and have a working knowledge of concepts that will help protect department resources. Network administrators should be technically proficient in managing those resources and remain vigilant in evolving as security measures change. Spending money to properly design a network will save agencies a great deal by avoiding the spending otherwise needed to correct mistakes that could have been prevented.
It only takes one breach of security to jeopardize the integrity of a network and expose records to a computer criminal. While sharing information between the federal, state, and local levels has immense benefits, it also presents the added burden of agencies having to protect other organizations’ records. These records can number in the millions, and department executives must determine whether they are willing to take the precautionary steps necessary to protect our nation’s databases of sensitive information.
Matt Brodacki has testified before the Senate on emerging technologies as a police officer and has completed an M.S. program in computer forensics. He is also the owner and founder of a computer security company and can be reached at firstname.lastname@example.org.
Published in Public Safety IT, Jan/Feb 2009