Companies face a dilemma. To maximize information’s value, they must make it available to employees, business partners, and customers. That availability makes it difficult for a company to control access and limit how many times information is copied. As a result, proprietary data gets scattered throughout the organization and other entities, increasing the chance that it will fall into the wrong hands. To avoid that problem, businesses need defense-in-depth strategies for protecting their sensitive data. They should start at the network perimeter and go from there to the operating system and applications. The final layer concerns the data itself.
The network is the first layer of protection for information. Although protection options have been around for a long time, this layer often remains porous because of misconfiguration and inadequate coverage of external connectivity.
Basic protection typically consists of an outer firewall; additional ones are added for more granular protection. Certain databases, for example, will likely need to be protected from all but a few applications. Having these layers can be expensive, however. Enterprise firewalls can range in price from a few thousand dollars to more than $100,000.
Installation is only the first step. The firewall is useful only if it is configured to deny anything that is not specifically allowed. Configuring the network this way can be challenging, but default deny is considered an industry best practice that most companies follow.
The person configuring the system must determine what constitutes legitimate network traffic and shape the filtering parameters to match. The biggest challenge for system administrators typically lies in researching and identifying which types of Internet traffic the business allows and requires.
Once traffic is understood, anything unexpected should trigger an alert, be blocked, or both. Content filtering should also be in place to help keep unwanted programs, malware, and data from entering the network, usually via Web browsing.
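The default-deny approach described above can be sketched in a few lines: traffic is permitted only if it matches an explicit allow rule, and everything else is blocked. The rule set and function below are illustrative assumptions, not the configuration syntax of any particular firewall product.

```python
# Minimal sketch of default-deny filtering: only explicitly allowed
# (protocol, destination port) pairs pass; all other traffic is denied.
# The rules below are hypothetical examples of "legitimate" traffic.

ALLOW_RULES = {
    ("tcp", 443),  # HTTPS to the company's Web servers
    ("tcp", 25),   # SMTP to the mail gateway
    ("udp", 53),   # DNS lookups
}

def filter_packet(protocol: str, dst_port: int) -> str:
    """Return 'allow' for explicitly permitted traffic, 'deny' otherwise."""
    if (protocol, dst_port) in ALLOW_RULES:
        return "allow"
    # Default deny: unexpected traffic is blocked (and would also be alerted on).
    return "deny"

print(filter_packet("tcp", 443))  # expected traffic -> allow
print(filter_packet("tcp", 23))   # Telnet was never allowed -> deny
```

In a real deployment the allow list is the product of the traffic research described above; the point of the sketch is that the deny branch is the default, not an afterthought.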
To ensure that the firewall is providing its intended protection, the perimeter should be constantly monitored for attacks and routinely tested for vulnerabilities.
In addition to blocking traffic not specifically expected, the company may establish a quarantined network segment that limits connectivity for unauthenticated and guest users.
Hybrid protection solutions, such as network access control (NAC), add another protective layer. They can ensure that systems seeking network access meet certain security criteria. This usually includes checking for updated antivirus software, current patches, restricted browser settings, and functioning personal firewalls. Cisco (Network Admission Control), Microsoft (Network Access Protection), and other single-solution providers check systems for these requirements before granting them network access.
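The posture checks listed above amount to an all-or-nothing admission decision. The sketch below models that logic; the field names and the grant/quarantine outcomes are assumptions for illustration, not the behavior of any specific NAC product.

```python
# Hedged sketch of a NAC admission decision: a host is granted access
# only if it passes every posture check; otherwise it is quarantined.

def admission_decision(host: dict) -> str:
    """Grant access only if the host meets every security criterion."""
    checks = [
        host.get("antivirus_updated", False),     # AV signatures current
        host.get("patches_current", False),       # OS patches applied
        host.get("browser_restricted", False),    # browser settings locked down
        host.get("personal_firewall_on", False),  # host firewall running
    ]
    return "grant" if all(checks) else "quarantine"

healthy = {"antivirus_updated": True, "patches_current": True,
           "browser_restricted": True, "personal_firewall_on": True}
stale = dict(healthy, patches_current=False)  # one failed check

print(admission_decision(healthy))  # grant
print(admission_decision(stale))    # quarantine
```

Note that a missing attribute counts as a failed check, mirroring the default-deny posture of the firewall layer.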
The implementation difficulty and cost for NAC can vary considerably depending on the network environment in which it is deployed. Older network infrastructures may have to be upgraded to accommodate NAC’s ability to inspect, quarantine, or deny a system access. Most NAC solutions depend on using network routing and services along with authentication resources.
The entry-point cost of a basic NAC solution is around $20,000, but it can range into the hundreds of thousands of dollars or more, depending on the size of the network environment. Most organizations need a minimum of three months for implementation and solution tuning.
Beyond the additional hardware and software, there is another cost: the time and effort required to define the policies that a system must meet to gain network access. While NAC is designed to protect the organization, it must not inhibit normal operations; well-researched and validated policies minimize NAC's impact and maximize its protection. NAC solutions are used in conjunction with firewalls. Firewalls mainly filter traffic, but they don't assess how the systems sending that traffic are configured.
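One way to keep NAC from inhibiting normal operations is to express policy as reviewable data rather than code, distinguishing hard requirements (which quarantine a host) from advisory checks (which only warn). The policy structure and check names below are assumptions sketched for illustration.

```python
# Sketch of NAC policy defined as data, so it can be researched,
# validated, and tuned without code changes. Check names are illustrative.

POLICY = {
    "required": ["antivirus_updated", "patches_current"],  # failure -> quarantine
    "advisory": ["browser_restricted"],                    # failure -> warn only
}

def evaluate(host: dict) -> tuple:
    """Return (decision, warnings) for a host evaluated against POLICY."""
    failed_required = [c for c in POLICY["required"] if not host.get(c)]
    warnings = [c for c in POLICY["advisory"] if not host.get(c)]
    decision = "quarantine" if failed_required else "grant"
    return decision, warnings

host = {"antivirus_updated": True, "patches_current": True,
        "browser_restricted": False}
print(evaluate(host))  # ('grant', ['browser_restricted'])
```

Splitting requirements this way lets an organization phase in stricter checks: a rule starts as advisory while its operational impact is measured, then moves to required once validated.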