Securing data across a large network means dealing with challenges such as combating ransomware infections, phishing and spam schemes, and threats that come from within. On the surface these threats are not much different from one environment to the next in the dangers they present. Even the type of data being protected may be similar: Social Security numbers, credit card information, and other personal records.
Where we find the deep differences is in how a strategy for protecting that data is created, due to the unique nature of each environment.
Corporate environments…
often follow a more centralized network model, with business information concentrated at a primary data center. This allows for robust administrative control, the ability to bring departments together to share best practices, and the deployment of strict data security policies, including, for example, a clean and straightforward backup schedule. IT resources are more easily consolidated and technologies can be standardized, making costs more predictable. Of course, with fewer potential points of failure, the stakes at each one are very high.
Smaller academic institutions…
may also find that their limited resources guide them toward this same centralized model, with the entire IT team wearing multiple hats but with change windows for rolling out new hardware and software considerably shorter than in corporate environments.
Larger academic institutions…
find themselves with various departments managing their own data, making it much more complicated to protect the network. There are multiple reasons for this, including different groups and departments needing distinct levels of freedom and autonomy to function. Often these departments are spread across different physical locations, and students, faculty and guests must be able to bring their own devices and have connectivity. How is a school to grapple with such an unruly beast?
Creating a decentralized environment by segmenting the network into different zones can be an effective way to restrict access and limit the effects of a security threat. By moving critical information into a central core and focusing on that infrastructure, best-practice policies can then be created for the departments that must access it. From the user's perspective, the individual needs of each network segment must be thoroughly analyzed. It is this early phase of careful pre-planning of a distributed environment that is so crucial to its manageability. Trust levels can then be established for the segmented zones, letting peripheral departments govern the data that is important to them under their own IT administration.
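To make the idea concrete, here is a minimal sketch of zone-based access rules with trust levels. All zone names, trust values, and the access policy itself are hypothetical illustrations, not a description of any particular product or campus network.

```python
# Hypothetical trust levels per network zone; higher = more trusted.
# The protected core holds the critical records.
TRUST_LEVELS = {
    "core": 3,        # central store for critical information
    "registrar": 2,   # a department that must access the core
    "faculty": 1,
    "guest_byod": 0,  # open wireless for students and guests
}

def allowed(src_zone: str, dst_zone: str) -> bool:
    """Permit traffic only toward an equally or less trusted zone,
    with the core reachable only by designated departments."""
    src = TRUST_LEVELS[src_zone]
    if dst_zone == "core":
        return src >= 2  # only vetted departments reach critical data
    return src >= TRUST_LEVELS[dst_zone]

print(allowed("registrar", "core"))   # True
print(allowed("guest_byod", "core"))  # False
```

In a real deployment these rules would live in firewall or router ACL configuration rather than application code; the point is that a breach in a low-trust zone cannot reach the core directly.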
When we look at educational environments we must conclude that, with the varying degrees of openness required and the likely need for BYOD (Bring Your Own Device), uniformly blocking out the world is simply not realistic. Open networks must of course be firewalled, but we must also consider how easy it might be for someone to simply walk into a campus building and bypass the firewall. Schools must therefore operate under the assumption that they are always under attack, and that any network segment may be exposed at any time. With this frame of mind of limiting the damage a threat can cause to the entire network via segmentation, a greater emphasis naturally moves toward network monitoring.
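As a simple illustration of what that monitoring emphasis might look like, the sketch below flags zones generating an unusual number of denied cross-segment requests. The event format, zone names, and threshold are invented for the example; real monitoring would parse actual firewall or IDS logs.

```python
from collections import Counter

def flag_noisy_zones(events, threshold=3):
    """Given (source_zone, action) pairs from parsed firewall logs,
    return zones whose denied requests exceed the threshold --
    candidates for closer inspection under an assume-breach mindset."""
    denials = Counter(src for src, action in events if action == "deny")
    return sorted(zone for zone, count in denials.items() if count > threshold)

# Invented sample data: a guest device repeatedly probing other segments.
events = [("guest_byod", "deny")] * 5 + [("faculty", "deny")] * 2 + [("faculty", "allow")]
print(flag_noisy_zones(events))  # ['guest_byod']
```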
As institutions move to a more distributed environment, there is a direct impact on how data must be secured. As a backup strategy is created, the current and future state of the network should be carefully considered. Solutions should remain hardware-agnostic for maximum flexibility and seamlessly handle multiple locations and operating systems. NovaStor’s “Guide to Building a Better Backup Strategy” can be an excellent tool as you create or reevaluate your own data protection policies.
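One way to keep such a strategy explicit is to record the backup policy per segment in a form that can be reviewed and checked. The sketch below is purely illustrative; the segment names, schedules, and retention figures are assumptions, not recommendations.

```python
# Hypothetical per-segment backup policy: more critical segments get
# more frequent backups and longer retention.
BACKUP_POLICY = {
    "core":      {"full": "weekly",  "incremental": "daily",  "retention_days": 365},
    "registrar": {"full": "weekly",  "incremental": "daily",  "retention_days": 180},
    "faculty":   {"full": "monthly", "incremental": "weekly", "retention_days": 90},
}

def most_protected(policy: dict) -> str:
    """Return the segment with the longest retention, i.e. the one
    this policy treats as most critical."""
    return max(policy, key=lambda seg: policy[seg]["retention_days"])

print(most_protected(BACKUP_POLICY))  # core
```

Expressing the policy as data rather than tribal knowledge makes it easy to audit when the network changes, and keeps the strategy independent of any particular backup hardware.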
What unique methods does your academic institution incorporate to protect critical data?
We’d like to hear about it.