Practice Free SY0-701 Exam Online Questions
Which of the following types of vulnerabilities is primarily caused by improper use and management of cryptographic certificates?
- A . Misconfiguration
- B . Resource reuse
- C . Insecure key storage
- D . Weak cipher suites
C
Explanation:
Insecure key storage refers to vulnerabilities caused by improper handling of cryptographic keys and certificates, such as storing them in plaintext or without access controls.
Reference: CompTIA Security+ SY0-701 Study Guide, Domain 2: Threats, Vulnerabilities, and Mitigations, Section: "Cryptographic Vulnerabilities and Mitigation".
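The two findings described above (plaintext storage and missing access controls) can be spot-checked programmatically. The sketch below is a minimal, illustrative heuristic for PEM-style key files, not a complete audit tool; the function name and checks are assumptions for this example:

```python
import os
import stat

def check_key_permissions(path: str) -> list[str]:
    """Flag common insecure-storage issues for a private key file."""
    issues = []
    mode = os.stat(path).st_mode
    # Group- or world-readable private keys are a classic audit finding.
    if mode & (stat.S_IRGRP | stat.S_IROTH):
        issues.append("key is readable by group or others")
    with open(path, "rb") as f:
        header = f.read(64)
    # An unencrypted PEM key has a plain BEGIN ... PRIVATE KEY header and
    # lacks the ENCRYPTED marker an encrypted key would carry (heuristic).
    if b"PRIVATE KEY" in header and b"ENCRYPTED" not in header:
        issues.append("key appears to be stored unencrypted")
    return issues
```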
During a SQL update of a database, a temporary field used as part of the update sequence was modified by an attacker before the update completed in order to allow access to the system.
Which of the following best describes this type of vulnerability?
- A . Race condition
- B . Memory injection
- C . Malicious update
- D . Side loading
A
Explanation:
A race condition occurs when two or more processes attempt to access and modify a shared resource simultaneously, leading to unintended behavior. In this scenario, the attacker was able to modify a temporary field before the SQL update completed, indicating a time-of-check to time-of-use (TOCTOU) vulnerability, which is a type of race condition.
Memory injection (B) refers to inserting malicious code into a running process's memory, but that is not what is happening here.
Malicious update (C) is too broad and does not specifically describe this scenario.
Side loading (D) is a technique where malicious software is loaded via a trusted application, unrelated to this case.
Reference: CompTIA Security+ SY0-701 Official Study Guide, Threats, Vulnerabilities, and Mitigations domain.
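The check-then-use gap behind a TOCTOU race can be illustrated with a filesystem example (the SQL scenario in the question follows the same shape: a value is checked, then changed before it is used). Function names here are illustrative, not from the exam material:

```python
import os

# Time-of-check (os.access) and time-of-use (open) are separate steps,
# so an attacker who swaps the file in between wins the race.
def read_if_allowed_unsafe(path: str) -> str:
    if os.access(path, os.R_OK):      # time of check
        # ... window during which the file can be replaced ...
        with open(path) as f:         # time of use
            return f.read()
    raise PermissionError(path)

# Safer pattern: open first, then operate on the handle you actually hold,
# closing the gap between check and use.
def read_if_allowed(path: str) -> str:
    fd = os.open(path, os.O_RDONLY | getattr(os, "O_NOFOLLOW", 0))
    with os.fdopen(fd) as f:
        return f.read()
```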
A security analyst scans a company’s public network and discovers a host is running a remote desktop that can be used to access the production network.
Which of the following changes should the security analyst recommend?
- A . Changing the remote desktop port to a non-standard number
- B . Setting up a VPN and placing the jump server inside the firewall
- C . Using a proxy for web connections from the remote desktop server
- D . Connecting the remote server to the domain and increasing the password length
B
Explanation:
A VPN is a virtual private network that creates a secure tunnel between two or more devices over a public network, encrypting and authenticating the data and hiding the devices' IP addresses. A jump server acts as an intermediary between a user and a target server, such as a production server, providing an additional layer of access control along with logging and auditing capabilities. A firewall filters and blocks unwanted network traffic based on predefined rules, protecting the internal network from external threats and limiting the exposure of sensitive services and ports.
The security analyst should recommend setting up a VPN and placing the jump server inside the firewall. This removes the remote desktop service from the public network: only authorized users with VPN credentials can reach the jump server, and only from the jump server can they access the production server.
Reference: CompTIA Security+ Study Guide: Exam SY0-701, 9th Edition, Chapter 8: Secure Protocols and Services, pages 382-383; Chapter 9: Network Security, pages 441-442.
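The access-control logic the recommended design enforces can be sketched as a simple allow-list check: remote desktop traffic is accepted only from the VPN's address range. The subnet below is a hypothetical example, not part of the question:

```python
import ipaddress

# Hypothetical VPN client subnet; in the recommended design, only
# addresses from this range (authenticated VPN users) may reach the
# jump server's remote desktop port.
VPN_SUBNET = ipaddress.ip_network("10.8.0.0/24")

def may_reach_jump_server(source_ip: str) -> bool:
    """Mirror the firewall rule: allow access only from the VPN subnet."""
    return ipaddress.ip_address(source_ip) in VPN_SUBNET
```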
A security analyst determines that a security breach will have a financial impact of $15,000 and is expected to occur twice within a three-year period.
Which of the following is the ALE for this risk?
- A . $7,500
- B . $10,000
- C . $15,000
- D . $30,000
B
Explanation:
The correct answer is $10,000 because Annualized Loss Expectancy (ALE) represents the expected yearly financial loss from a specific risk.
According to Security+ SY0-701 risk management principles, ALE is calculated using the formula:
ALE = Single Loss Expectancy (SLE) × Annualized Rate of Occurrence (ARO)
In this scenario, the Single Loss Expectancy (SLE) is clearly defined as $15,000, which represents the financial impact of a single security breach. The challenge lies in determining the Annualized Rate of Occurrence (ARO).
The breach is expected to occur twice over a three-year period, which means the ARO is:
ARO = 2 ÷ 3 ≈ 0.67 occurrences per year
Using the ALE formula:
ALE = $15,000 × 0.67 ≈ $10,000
This calculation aligns with standard quantitative risk assessment techniques emphasized in the SY0-701 study guide. ALE allows organizations to compare the cost of potential losses against the cost of implementing security controls, helping leadership make informed, financially sound risk management decisions.
Option A, $7,500, would be correct only if the event occurred once every two years.
Option C, $15,000, reflects the SLE but does not account for frequency.
Option D, $30,000, incorrectly represents the total loss over three years rather than an annualized value.
The SY0-701 objectives highlight ALE as a critical metric for prioritizing risks, justifying security investments, and communicating risk in business terms to executives. By converting risk into an annual expected cost, ALE bridges the gap between technical security concerns and organizational financial planning.
In summary, when frequency is spread across multiple years, the loss must be annualized. Doing so correctly results in an ALE of $10,000, making option B the correct answer.
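The annualization step above can be checked with a short calculation. This helper simply encodes the ALE = SLE × ARO formula from the explanation:

```python
def annualized_loss_expectancy(sle: float, occurrences: int, years: int) -> float:
    """ALE = SLE x ARO, where ARO annualizes the expected frequency."""
    aro = occurrences / years
    return sle * aro

# $15,000 per breach, expected twice over three years:
# ARO = 2 / 3, so ALE = 15000 * (2 / 3) = $10,000 per year.
```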
The analyst wants to move data from production to the UAT server for testing the latest release.
Which of the following strategies to protect data should the analyst use?
- A . Data masking
- B . Data tokenization
- C . Data obfuscation
- D . Data encryption
A
Explanation:
Data masking replaces sensitive data with realistic but fictional data, preserving format and usability while protecting confidential information during testing in environments like UAT (User Acceptance Testing).
Tokenization (B) replaces data with tokens but is more common for payment data in production. Obfuscation (C) scrambles data but is less structured than masking. Encryption (D) protects data confidentiality but may not be practical for testing.
Data masking is a best practice for protecting sensitive data in non-production environments, covered in the General Security Concepts domain of the CompTIA Security+ Study Guide.
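A minimal sketch of format-preserving masking follows: the masked values keep their original shape so UAT tests still work, while the sensitive digits and characters are replaced. The field formats chosen here are illustrative assumptions:

```python
import re

def mask_ssn(ssn: str) -> str:
    """Mask all but the last four digits, preserving the dashed format."""
    return re.sub(r"\d", "*", ssn[:-4]) + ssn[-4:]

def mask_email(email: str) -> str:
    """Keep the first character and the domain so the value still parses as an email."""
    local, _, domain = email.partition("@")
    return local[0] + "*" * (len(local) - 1) + "@" + domain
```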
During a routine audit, an analyst discovers that a department uses software that was not vetted.
Which threat is this?
- A . Espionage
- B . Data exfiltration
- C . Shadow IT
- D . Zero-day
C
Explanation:
Shadow IT refers to software, hardware, cloud services, or applications deployed without approval from the IT or security department. In this scenario, a department is using software that was never vetted, which is classic Shadow IT behavior.
Security+ SY0-701 explains that Shadow IT:
Introduces unknown vulnerabilities
Bypasses security controls
Creates compliance risks
Leads to data exposure
Interferes with standard configuration management
Espionage (A) involves intelligence gathering, not unauthorized software use. Data exfiltration (B) involves data theft, not unauthorized software deployment. Zero-day (D) refers to unknown vulnerabilities, not unapproved systems.
Thus, Shadow IT is the correct answer.
A Chief Information Security Officer would like to conduct frequent, detailed reviews of systems and procedures to track compliance objectives.
Which of the following is the best method to achieve this objective?
- A . Third-party attestation
- B . Penetration testing
- C . Internal auditing
- D . Vulnerability scans
C
Explanation:
Internal auditing allows an organization to conduct frequent, detailed reviews of its own systems and procedures and track them against compliance objectives. Third-party attestation (A) is periodic and externally driven, penetration testing (B) evaluates exploitability rather than compliance, and vulnerability scans (D) identify technical weaknesses but do not review procedures.
An organization has been experiencing issues with deleted network share data and improperly assigned permissions.
Which of the following would best help track and remediate these issues?
- A . DLP
- B . EDR
- C . FIM
- D . ACL
C
Explanation:
File Integrity Monitoring (FIM) is the best tool for detecting unauthorized file deletions, modifications, or improper permission changes within network shares. FIM works by creating cryptographic hashes and baselines for protected files or directories and then continuously monitoring for deviations. Any unauthorized deletion, modification, or permission change triggers alerts.
Security+ SY0-701 identifies FIM as a foundational integrity control used in compliance frameworks (PCI-DSS, HIPAA) and operational security monitoring. Because the organization is experiencing unpredictable changes to shared files and permissions, FIM provides visibility and accountability for who changed what and when.
DLP (A) prevents data leakage but does not detect permission changes.
EDR (B) focuses on endpoint threat behavior, not file integrity on network shares.
ACLs (D) define permissions but do not track changes or detect unauthorized modifications.
Therefore, C: FIM is the correct choice.
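The baseline-and-compare mechanism FIM relies on can be sketched in a few lines: hash every file once to establish the baseline, then rescan and label each deviation. This is a minimal illustration of the technique, not a production FIM tool (real products also track permissions and ownership):

```python
import hashlib
import os

def build_baseline(root: str) -> dict[str, str]:
    """Record a SHA-256 hash for every file under root."""
    baseline = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                baseline[path] = hashlib.sha256(f.read()).hexdigest()
    return baseline

def detect_changes(root: str, baseline: dict[str, str]) -> dict[str, str]:
    """Compare the current state against the baseline and label each deviation."""
    current = build_baseline(root)
    changes = {}
    for path in baseline.keys() - current.keys():
        changes[path] = "deleted"
    for path in current.keys() - baseline.keys():
        changes[path] = "added"
    for path in baseline.keys() & current.keys():
        if baseline[path] != current[path]:
            changes[path] = "modified"
    return changes
```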
Which of the following is the best way to consistently determine on a daily basis whether security settings on servers have been modified?
- A . Automation
- B . Compliance checklist
- C . Attestation
- D . Manual audit
A
Explanation:
Automation is the best way to consistently determine on a daily basis whether security settings on servers have been modified. Automation is the process of using software, hardware, or other tools to perform tasks that would otherwise require human intervention or manual effort. Automation can help to improve the efficiency, accuracy, and consistency of security operations, as well as reduce human errors and costs. Automation can be used to monitor, audit, and enforce security settings on servers, such as firewall rules, encryption keys, access controls, patch levels, and configuration files. Automation can also alert security personnel of any changes or anomalies that may indicate a security breach or compromise12.
The other options are not the best ways to consistently determine on a daily basis whether security settings on servers have been modified:
Compliance checklist: This is a document that lists the security requirements, standards, or best practices that an organization must follow or adhere to. A compliance checklist can help to ensure that the security settings on servers are aligned with the organizational policies and regulations, but it does not automatically detect or report any changes or modifications that may occur on a daily basis3.
Attestation: This is a process of verifying or confirming the validity or accuracy of a statement, claim, or fact. Attestation can be used to provide assurance or evidence that the security settings on servers are correct and authorized, but it does not continuously monitor or audit any changes or modifications that may occur on a daily basis4.
Manual audit: This is a process of examining or reviewing the security settings on servers by human inspectors or auditors. A manual audit can help to identify and correct any security issues or discrepancies on servers, but it is time-consuming, labor-intensive, and prone to human errors. A manual audit may not be feasible or practical to perform on a daily basis.
Reference: CompTIA Security+ SY0-701 Certification Study Guide, pages 97-99 and 102; "Automation and Scripting", CompTIA Security+ SY0-701 5.1, video by Professor Messer.
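A daily automated check of the kind described above can be as simple as comparing each server's reported settings against an approved baseline and reporting any drift. The setting names and values below are illustrative assumptions; in practice the current values would be collected by a configuration-management or monitoring agent:

```python
# Approved security baseline (illustrative settings).
APPROVED_BASELINE = {
    "password_min_length": "14",
    "firewall_enabled": "true",
    "ssh_root_login": "no",
}

def find_drift(current_settings: dict[str, str]) -> dict[str, tuple[str, str]]:
    """Return {setting: (expected, actual)} for every modified or missing value."""
    return {
        key: (expected, current_settings.get(key, "<missing>"))
        for key, expected in APPROVED_BASELINE.items()
        if current_settings.get(key) != expected
    }
```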
A software developer released a new application and is distributing application files via the developer’s website.
Which of the following should the developer post on the website to allow users to verify the integrity of the downloaded files?
- A . Hashes
- B . Certificates
- C . Algorithms
- D . Salting
A
Explanation:
Posting hashes allows users to verify the integrity of downloaded files. As outlined in Security+ SY0-701, a cryptographic hash (e.g., SHA-256) produces a fixed-length digest unique to the file’s contents. Users can compute the hash of the downloaded file and compare it to the published value; a match confirms the file has not been altered.
Certificates (B) establish identity and trust but do not directly verify file integrity post-download unless combined with signing workflows. Algorithms (C) are general methods, not verification artifacts. Salting (D) is used with password hashing and is irrelevant here.
Therefore, A: Hashes is the correct choice.
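The verification step described above amounts to recomputing the digest locally and comparing it with the published value. A minimal sketch using SHA-256:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_download(path: str, published_hash: str) -> bool:
    """Integrity holds only if the computed digest matches the published one."""
    return sha256_of(path) == published_hash.lower()
```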
