Secure systems design concepts help ensure that computing systems are deployed and maintained in a secure state. If you’re planning to take the SY0-601 version of the Security+ exam, you should understand different methods used to implement systems securely. This includes several steps used to secure hosts.
For example, can you answer this question?
Q. The BizzFad organization develops and sells software. Occasionally they update the software to fix security vulnerabilities and/or add new features. However, before releasing these updates to customers, they test them in different environments. Which of the following solutions provides the BEST method to test the updates?
A. Baseline configuration
B. BYOD
C. Sandbox
D. Change management
More importantly, do you know why the correct answer is correct and the incorrect answers are incorrect? The answer and explanation are available at the end of this post.
Using Master Images for Baseline Configurations
One of the most common methods of deploying systems is imaging, starting with a master image. An image is a snapshot of a single system that administrators deploy to multiple other systems. Imaging has become an important practice for many organizations because it streamlines deployments while ensuring systems are deployed securely. The figure and the following text identify the overall process of capturing and deploying an image:

Capturing and deploying images
- Administrators start with a blank source system. They install and configure the operating system, install and configure any desired applications, and modify security settings. Administrators perform extensive testing to ensure the system works as desired and that it is secure before going to the next step.
- Next, administrators capture the image, which becomes their master image. Symantec Ghost is a popular imaging application, and Windows Server versions include free tools many organizations use to capture and deploy images. The captured image is simply a file stored on a server or copied to external media, such as a DVD or external USB drive.
- In step 3, administrators deploy the image to multiple systems. When used within a network, administrators can deploy the same image to dozens of systems during initial deployment or to just a single system to rebuild it. The image installs the same configuration on the target systems as the original source system created in step 1.
Administrators will often take a significant amount of time to configure and test the source system. They follow the same hardening practices discussed earlier and often use security and configuration baselines. If they’re deploying the image to just a few systems, such as in a classroom setting, they may create the image in just a few hours. However, if they’re deploying it to thousands of systems within an organization, they may take weeks or months to create and test the image. Once they’ve created the image, they can deploy it relatively quickly with minimal administrative effort.
Imaging provides two important benefits:
- Secure starting point. The image includes mandated security configurations for the system. Personnel who deploy the system don’t need to remember or follow extensive checklists to ensure that new systems are set up with all the detailed configuration and security settings. The deployed image retains all the settings of the original image. Administrators will still configure some settings, such as the computer name, after deploying the image.
- Reduced costs. Deploying imaged systems reduces the overall maintenance costs and improves reliability. Support personnel don’t need to learn several different end-user system environments to assist end users. Instead, they learn just one. When troubleshooting, support personnel spend their time helping the end user rather than learning the system configuration. Managers understand this as reducing the total cost of ownership (TCO) for systems.
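The two benefits above can be sketched as a toy model. This is only an illustration of the concept, not any real imaging tool; the setting names and values are hypothetical. Deploying an image copies every baseline setting unchanged, and only the few per-machine settings, such as the computer name, are configured afterward:

```python
import copy

# Hypothetical master image: a tested security and configuration baseline.
# (These settings are illustrative examples, not from a real baseline.)
MASTER_IMAGE = {
    "os_version": "10.0.19044",
    "firewall_enabled": True,
    "guest_account_disabled": True,
    "password_min_length": 14,
}

def deploy_image(master, computer_name):
    """Copy every baseline setting, then apply the per-machine settings."""
    system = copy.deepcopy(master)           # all baseline settings carry over
    system["computer_name"] = computer_name  # configured after deployment
    return system

# The same image deploys (or rebuilds) any number of systems.
fleet = [deploy_image(MASTER_IMAGE, name) for name in ("PC-01", "PC-02", "PC-03")]
print(all(s["firewall_enabled"] for s in fleet))  # → True
```

Every deployed system inherits the full baseline, which is why imaging gives both a secure starting point and a single environment for support personnel to learn.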
Imaging isn’t limited to only desktop computers. You can image any system, including servers. For example, consider an organization that maintains 50 database servers in a large data center. The organization can use imaging to deploy new servers or as part of its disaster recovery plan to restore failed servers. It is much quicker to deploy an image to rebuild a failed server than rebuild a server from scratch. If administrators keep the images up to date, this also helps ensure the recovered server starts in a secure state.
Patch Management
Software is not secure. There. I said it. As someone who has written a few programs over the years, that’s not easy to say. In a perfect world, extensive testing would discover all the bugs, exploits, and vulnerabilities that cause so many problems.
However, because operating systems, applications, and firmware include millions of lines of code, testing simply doesn’t find all the problems. Instead, most companies attempt to create secure and bug-free software as they’re developing it and then make a best effort to test the software before releasing it. Later, as problems crop up, companies write and release patches or updates. Administrators must apply these patches to keep their systems up to date and protected against known vulnerabilities.
Some smaller organizations enable auto-updates. Systems regularly check for updates, download them when they’re available, and automatically apply them.
Patch management ensures that systems and applications stay up to date with current patches. This is one of the most efficient ways to reduce operating system and application vulnerabilities because it protects systems from known vulnerabilities. Patch management encompasses a group of methodologies for identifying, downloading, testing, deploying, and verifying patches.
Administrators often test updates in a sandbox, such as a virtual machine, which provides an isolated environment. After testing the patches, administrators deploy them. They don’t typically deploy the patches manually. Instead, they use systems management tools to deploy the patches in a controlled manner. For example, Microsoft Endpoint Configuration Manager is a systems management tool used for many purposes, including patch management. It examines endpoints to determine if patches are installed.
In addition to deploying patches, systems management tools also include a verification component that verifies patch deployment. They periodically query the systems and retrieve a list of installed patches and updates. They then compare the retrieved list with the list of deployed patches and updates, providing reports for discrepancies. In some networks, administrators combine this with network access control (NAC) technologies and isolate unpatched systems in quarantined networks until they are patched.
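The verification logic described above can be sketched with simple set arithmetic. This is a toy model, not the API of any real systems management tool; the host and patch names are made up. Comparing the deployed patch list against what each endpoint reports yields a discrepancy report, and any host with missing patches can be flagged for NAC quarantine:

```python
# Hypothetical verification sketch: compare deployed patches against what
# each endpoint reports as installed (names are illustrative, not real KBs).
DEPLOYED = {"KB5001", "KB5002", "KB5003"}

reported = {
    "HOST-A": {"KB5001", "KB5002", "KB5003"},  # fully patched
    "HOST-B": {"KB5001"},                      # missing two patches
}

def verify_patches(deployed, reported):
    """Return the set of missing patches per host; empty means compliant."""
    return {host: deployed - installed for host, installed in reported.items()}

report = verify_patches(DEPLOYED, reported)

# Hosts with any discrepancy are candidates for NAC quarantine until patched.
quarantine = [host for host, missing in report.items() if missing]
print(quarantine)  # → ['HOST-B']
```

A real tool queries the endpoints over the network and produces the installed-patch lists automatically; the comparison step is essentially this set difference.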
Improper or weak patch management results in preventable vulnerabilities that attackers can exploit. This includes vulnerabilities in operating systems, applications, and firmware.
Q. The BizzFad organization develops and sells software. Occasionally they update the software to fix security vulnerabilities and/or add new features. However, before releasing these updates to customers, they test them in different environments. Which of the following solutions provides the BEST method to test the updates?
A. Baseline configuration
B. BYOD
C. Sandbox
D. Change management
Answer C is correct. A sandbox provides a simple method of testing updates. It provides an isolated environment and is often used for testing.
A baseline configuration is a starting point of a computing environment.
Bring your own device (BYOD) refers to allowing employee-owned mobile devices in a network and is not related to this question.
Change management practices ensure changes are not applied until they are approved and documented.
See Chapter 5 of the CompTIA Security+: Get Certified Get Ahead: SY0-601 Study Guide for more information on implementing secure systems.