How to automate SCAP testing with AWS Systems Manager and Security Hub
US government agencies use the National Institute of Standards and Technology (NIST) framework to provide security and compliance guidance for their IT systems. The US Department of Defense (DoD) also requires its IT systems to follow the Security Technical Implementation Guides (STIGs) produced by the Defense Information Systems Agency (DISA). To help with managing compliance with these programs, IT managers can use OpenSCAP, an open-source NIST-certified security and compliance tool. OpenSCAP uses the Security Content Automation Protocol (SCAP) to automate continuous monitoring, vulnerability management, and reporting of security policy compliance data. Although OpenSCAP is aimed at DoD security requirements, it can be applied to set security baselines for any industry.
This blog post will walk you through how to automate OpenSCAP's STIG testing and integrate the findings with AWS Security Hub to improve your view of your IT systems' compliance status. Security Hub is a centralized location that gives you a comprehensive view of your security posture across your Amazon Web Services (AWS) accounts. Security Hub natively integrates with Amazon GuardDuty (https://aws.amazon.com/guardduty/), Amazon Inspector, Amazon Macie, AWS Identity and Access Management (IAM) Access Analyzer, and AWS Firewall Manager. Security Hub also integrates with a number of AWS Partner solutions. For accounts that don't have Security Hub enabled, the results of the STIG checks are still available, because the results are also stored in Amazon DynamoDB, and Amazon CloudWatch metrics are generated.
This post uses Red Hat Enterprise Linux (RHEL) for the example code, but the same techniques can also be used for Windows.
The solution we present in this blog uses a number of AWS services to automate the testing and reporting of OpenSCAP, as shown in Figure 1.
The workflow of the solution is as follows:
- The process starts with AWS Systems Manager, where a Run Command association is used to run OpenSCAP on the RHEL instances that you've configured to use the solution.
- After the OpenSCAP scanning process is complete, the Amazon Elastic Compute Cloud (Amazon EC2) instance puts the output, an XML results file and an HTML report, into an Amazon Simple Storage Service (Amazon S3) bucket.
- The S3 bucket has an event notification on the s3:ObjectCreated:* event that invokes an AWS Lambda function.
- The Lambda function parses the OpenSCAP results file to extract and count all of the low, medium, and high severity findings.
- The function pushes the counts as custom metrics to CloudWatch.
- The function saves the detailed information for each finding in a DynamoDB table.
- If Security Hub is enabled on the account, the function also sends the detailed information for each finding to Security Hub, with a link to the HTML results file that provides remediation guidance.
Deploy the solution
This blog covers the process of setting up your AWS account to support automated SCAP testing and reporting. It also walks you through all of the code that is used in this solution so that you can customize it. The walkthrough is divided into three steps:
- Step 1: Deploy the AWS CloudFormation template
- Step 2: Set up the Red Hat Enterprise Linux instances
- Step 3: Test the process
You'll need a Red Hat Enterprise Linux (RHEL) 7 or RHEL 8 Amazon EC2 instance. This solution can be used for other operating systems that are supported by OpenSCAP, but in this post, we only discuss an implementation for RHEL. The instance type must also have enough memory to run the OpenSCAP tool; for the full STIG testing, at least 8 GB of RAM is required.
For the Security Hub integration to work, you must enable Security Hub. If you run the automated testing but aren't integrated with Security Hub, then the solution publishes the results only to DynamoDB.
Step 1: Deploy the AWS CloudFormation template
You can access the CloudFormation template for this solution from here. The CloudFormation template provides two parameters for you to set. The first parameter tells the Lambda script whether you want to enable Security Hub integration; this value is saved in the AWS Systems Manager Parameter Store under /SCAPTesting/EnableSecurityHub. The second parameter is the name of the S3 bucket you want to store the SCAP results in. This bucket will be created for you by the template; its name must be unique and must follow the S3 bucket naming rules. The template then creates resources for you, which we describe in the next section.
Run the following command line interface (CLI) command to deploy the CloudFormation template.
aws cloudformation create-stack --stack-name scaptest \
--template-url https://awsiammedia.s3.amazonaws.com/public/sample/703_Automate_SCAP_testing/create-scap-automation.yaml \
--capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM \
--parameters ParameterKey=ReportS3BucketName,ParameterValue=<bucket name> ParameterKey=EnableSecurityHubFindings,ParameterValue=<true|false>
Step 2: Set up the Red Hat Enterprise Linux instances
There are a few things that you'll need to do to the RHEL instances that need to be scanned, so that they can work with Systems Manager and S3.
First, associate the IAM role created by the CloudFormation template with every instance you want the scanning performed on. If you already have an instance role associated with your instance, add the policies in the SCAPEC2InstanceRole role that was created by the CloudFormation template to the role that you're currently using.
aws ec2 associate-iam-instance-profile --instance-id <instance id> --iam-instance-profile Name=SCAPEC2InstanceRole
Next, add the RunSCAP tag to your instances so that the Run Command association will know to run the command on the instances.
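The tag itself is simple. As a sketch (the key and value come from the association described later in this post; the helper below is purely illustrative):

```python
# The tag that the Run Command association targets. On the CLI it is
# applied with:  aws ec2 create-tags --resources <instance-id> \
#                    --tags Key=RunSCAP,Value=True
RUNSCAP_TAG = {"Key": "RunSCAP", "Value": "True"}

def opted_in(instance_tags):
    """Return True if an instance's tag list opts it into SCAP scanning."""
    return any(t == RUNSCAP_TAG for t in instance_tags)
```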
For the instance to interact with S3, it needs to have the AWS Command Line Interface (AWS CLI) installed. To do this, install the unzip tool, then connect to your RHEL server by using SSH and run the following commands.
Next, install and start the Systems Manager Agent for RHEL so that the instance can make use of Systems Manager. The install command must indicate the AWS Region that the instances are running in. In the following command, you will need to replace <REGION> with the right Region (for example, us-east-1).
Step 3: Test the process
To make sure that the template created all of the resources successfully and that the automation is running, view the automation in the AWS Management Console.
To validate the solution deployment
- Sign in to the AWS Management Console, and navigate to AWS Systems Manager.
- In the left-hand menu, choose State Manager to bring up the State Manager page in the console. In the Associations table, you should see an entry with an association name of SCAPRunCommandAssociation. The initial run should not have had any instances to scan, because the EC2 instance wasn't configured yet.
- Choose the association ID to bring up the association details page.
- Select Apply association now to run the association against the instances that you configured in Step 2: Set up the Red Hat Enterprise Linux instances. After the association shows a status of Success again, you should see the Resource status count field show Success:1 if you set up a single instance.
- Navigate to Security Hub in the console and view the findings generated by the OpenSCAP tool. If you didn't enable Security Hub, you'll be able to see the results in the SCAPScanResults DynamoDB table.
Deep dive into the solution
Now we'll walk you through all of the resources created by the CloudFormation template. We describe how each resource is created and why the solution needs it. We also explain some of the settings that you might want to modify for your environment. This includes walking through the parts of the Lambda function that handle the parsing of the OpenSCAP results and putting the findings into Security Hub.
The Parameter Store parameter
The CloudFormation template uses the AWS::SSM::Parameter resource to create the /SCAPTesting/EnableSecurityHub parameter in the Parameter Store and put the value that you selected into it. The Lambda function then uses this parameter to determine whether findings should be sent to Security Hub.
The scan results S3 bucket
The template creates an S3 bucket to hold all OpenSCAP testing results and the HTML reports. Each EC2 instance will have its own folder in the bucket, keyed by its instance-id, and the output files are named <timestamp>-scap-results.xml and <timestamp>-scap-results.html. The S3 bucket is created with the NotificationConfiguration event s3:ObjectCreated:* and the suffix filter .xml. This tells Amazon S3 to notify the Lambda function of all files created in the S3 bucket whose file name ends in .xml, so that only the XML files are processed.
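A minimal sketch of that key layout, assuming the folder-per-instance convention just described (the timestamp format is illustrative):

```python
from datetime import datetime, timezone

def scap_result_keys(instance_id, timestamp=None):
    """Build the S3 keys for one scan: an XML results file and an HTML
    report, both stored under a folder named after the instance ID."""
    ts = timestamp or datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S")
    return {
        "xml": f"{instance_id}/{ts}-scap-results.xml",
        "html": f"{instance_id}/{ts}-scap-results.html",
    }

keys = scap_result_keys("i-0123456789abcdef0", "20240101T000000")
# Only the .xml key matches the bucket's suffix filter, so only it
# triggers the Lambda function.
```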
For Amazon S3 to be able to invoke the Lambda function, the S3 bucket needs permission. The CloudFormation template creates an AWS::Lambda::Permission resource for this purpose. The permission allows only the S3 bucket that was created earlier to call the Lambda function. The permission also requires the call to come from the source account where the stack is created, to further enhance security.
The template creates two DynamoDB tables. The first is a table to hold the names of OpenSCAP tests to ignore. This table is loaded by the Lambda function, and the function ignores any results from a test that matches the SCAP_Rule_Name element from the XML results in the DynamoDB table. This allows you to customize your reports so they don't show any results that you want ignored; this table is empty by default.
The second DynamoDB table holds information about each of the OpenSCAP findings. The Lambda function sets the InstanceId, SCAP rule name, time, severity of the finding, the result of the test, and a link to the HTML report in S3 for that test run. This table is useful if you don't use Security Hub but still want the results in a data store. Note that the ReadCapacityUnits and WriteCapacityUnits parameters are used for instructional purposes only; you must set these appropriately based on the number of servers you're scanning and the frequency of scans.
EC2 instance permissions
The EC2 instance needs permissions that allow it to access Systems Manager and S3. To achieve this, Systems Manager Run Command needs permission to run commands on your EC2 instances, and your instances need access to the S3 bucket. For all of these reasons, the template creates an AWS::IAM::Role resource that gives the EC2 instance the permissions from the AWS managed policy AmazonEC2RoleforSSM. Systems Manager must have this policy in place so that it can use Run Command on the instances. The template also creates an inline policy that allows the EC2 instance the s3:PutObject action so that the instance can upload its scan results to the S3 bucket created earlier. To enable an EC2 instance to use the role that the template created, the instance needs an AWS::IAM::InstanceProfile resource. This allows the EC2 instance to assume the role that was created, so that the instance can do what's needed by Run Command and can put the results in the S3 bucket.
The Lambda function definition
The Process SCAP Results Lambda function is loaded from the same S3 bucket that the CloudFormation script is in, and you can view it from that S3 bucket.
In the Lambda function definition in the CloudFormation template, a Timeout value of 360 seconds and a MemorySize value of 1024 megabytes are provided. The memory is required to process the large XML files that contain the results, and also gives the Lambda function more central processing unit (CPU) capacity to work with.
A Lambda execution role is also assigned to the Lambda function in this solution. The Lambda function needs to interact with CloudWatch (both logs and metrics), DynamoDB, S3, Security Hub, and Systems Manager Parameter Store. This role provides least-privilege access to those services with inline policies.
The following table shows the API calls permitted by the role and what they're used for.
|cloudwatch:PutMetricData||Set to allowed so that the function can push the count of low, medium, and high severity findings to CloudWatch.|
|logs:CreateLogGroup, logs:CreateLogStream, logs:PutLogEvents||Set to allowed so that the function can write log events to CloudWatch.|
|dynamodb:Scan||Set to allowed for the Ignore List table, because the function needs to scan the table to get all of the rules to ignore.|
|dynamodb:BatchWriteItem||Set to allowed for the results table so that the function can save the results to DynamoDB.|
|s3:GetObject||Set to allowed so that the function can access the scan results.|
|securityhub:BatchImportFindings||Set to allowed so that the function can push findings to Security Hub.|
|ssm:GetParameters, ssm:DescribeParameters, ssm:GetParameter||Set to allowed so that the function can get the parameters that were created by the CloudFormation template.|
The Lambda function code
We'll walk through some of the important parts of the Process SCAP Results Lambda function code here. You can view the full function by following the link provided earlier for the Lambda function.
The first thing the function does is get the file from S3 and parse it as XML, as follows.
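A minimal sketch of this step, with the S3 call shown only in comments and a tiny inline stand-in for the results document (the element layout is simplified; the namespace URI is the standard XCCDF 1.2 one):

```python
import xml.etree.ElementTree as ET

# In the Lambda function, the document comes from S3, for example:
#   obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
#   xml_bytes = obj["Body"].read()
# Here we use an inline stand-in so the parsing step is self-contained.
xml_bytes = b"""<?xml version="1.0"?>
<Benchmark xmlns="http://checklists.nist.gov/xccdf/1.2">
  <TestResult>
    <rule-result idref="xccdf_example_rule_1" severity="high">
      <result>fail</result>
    </rule-result>
  </TestResult>
</Benchmark>"""

# Map a prefix to the default namespace so find() queries can use it.
ns = {"xccdf": "http://checklists.nist.gov/xccdf/1.2"}
root = ET.fromstring(xml_bytes)
test_result = root.find("xccdf:TestResult", ns)
```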
After the function has the XML, it checks the Parameter Store to see if it should push the results to Security Hub.
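A sketch of that check, assuming the parameter stores the strings true or false; the boto3 call is shown in comments so the helper stays self-contained:

```python
# In the Lambda function, the value is read with boto3, for example:
#   value = boto3.client("ssm").get_parameter(
#       Name="/SCAPTesting/EnableSecurityHub")["Parameter"]["Value"]

def security_hub_enabled(value):
    """Interpret the stored string value as a boolean flag."""
    return value.strip().lower() == "true"
```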
To get the findings of the test, the function navigates through the XML and finds the TestResult section.
The Lambda function then loops through each finding in the TestResult section of the XML, and checks whether the finding is in the ignore list, whether the result is "fail," and whether the severity of the result is "high," "medium," or "low." The function then adds the finding to a list of items to persist to DynamoDB, adds the finding to another list of items to send to Security Hub, and updates counts for the different severity types to send to CloudWatch.
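The filtering and counting logic can be sketched on plain dictionaries (the field names here are illustrative, not the function's actual variable names):

```python
def triage(findings, ignore_list):
    """Split failed findings into DynamoDB items, Security Hub candidates,
    and per-severity counts, skipping any rule on the ignore list."""
    counts = {"high": 0, "medium": 0, "low": 0}
    dynamo_items, hub_findings = [], []
    for f in findings:
        if f["rule"] in ignore_list or f["result"] != "fail":
            continue
        sev = f["severity"]
        if sev not in counts:
            continue
        counts[sev] += 1
        dynamo_items.append(f)
        hub_findings.append(f)
    return dynamo_items, hub_findings, counts

sample = [
    {"rule": "rule_a", "result": "fail", "severity": "high"},
    {"rule": "rule_b", "result": "pass", "severity": "low"},
    {"rule": "rule_c", "result": "fail", "severity": "medium"},
]
items, hub, counts = triage(sample, ignore_list={"rule_c"})
```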
After all of the results are processed, the function submits the counts to CloudWatch.
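A sketch of shaping those counts into a put_metric_data payload; the namespace and metric names here are assumptions, not taken from the solution code:

```python
def severity_metrics(counts):
    """Shape per-severity counts as a CloudWatch MetricData list."""
    return [
        {"MetricName": f"{sev.capitalize()}SeverityCount",
         "Value": n,
         "Unit": "Count"}
        for sev, n in counts.items()
    ]

metric_data = severity_metrics({"high": 3, "medium": 5, "low": 2})
# boto3.client("cloudwatch").put_metric_data(Namespace="SCAPTesting",
#                                            MetricData=metric_data)
```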
Next, the function uses the batch_writer method of the DynamoDB table to insert all of the records in a single batch.
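A sketch of the items such a batch write might insert; the attribute names are illustrative, and the batch_writer call is shown in comments:

```python
def to_dynamo_item(instance_id, finding, report_url):
    """Shape one failed finding as a DynamoDB item."""
    return {
        "InstanceId": instance_id,
        "RuleName": finding["rule"],
        "Severity": finding["severity"],
        "Result": finding["result"],
        "ReportUrl": report_url,
    }

# table = boto3.resource("dynamodb").Table("SCAPScanResults")
# with table.batch_writer() as batch:
#     for f in failed_findings:
#         batch.put_item(Item=to_dynamo_item(instance_id, f, report_url))

item = to_dynamo_item(
    "i-0123456789abcdef0",
    {"rule": "rule_a", "severity": "high", "result": "fail"},
    "https://example.com/report.html",
)
```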
Lastly, the function submits the findings to Security Hub. The Security Hub API provides a batch_import_findings method that supports the import of 100 findings at a time. The following code breaks up the findings array into groups of 100 and pushes them to Security Hub.
# Loop through the findings, sending 100 at a time to Security Hub
startIndex = 0
findingsLeft = len(securityHubFindings) > 0
while findingsLeft:
    stopIndex = startIndex + 100
    if stopIndex >= len(securityHubFindings):
        stopIndex = len(securityHubFindings)
        findingsLeft = False
    myfindings = securityHubFindings[startIndex:stopIndex]
    try:
        # submit the findings to Security Hub
        result = securityHub.batch_import_findings(Findings=myfindings)
        startIndex = startIndex + 100
        # print results to CloudWatch
        print(result)
    except Exception as e:
        print("An error has occurred saving to Security Hub: " + str(e))
        findingsLeft = False
The Systems Manager Run Command association
The last section of the CloudFormation template creates the Systems Manager Run Command association (AWS::SSM::Association). This association creates an AWS-RunShellScript Run Command and passes it a script that will be explained in the next section. The association also uses the ScheduleExpression parameter with a cron definition to run the command once each day. Lastly, it associates the command with all EC2 instances that have the tag RunSCAP set to True.
Systems Manager Run Command
With Systems Manager Run Command, you can run a command on an instance that is using the Systems Manager Agent. The Systems Manager Agent allows Systems Manager to run commands on your behalf. This solution uses a built-in Run Command called AWS-RunShellScript, which will run a script on a Linux operating system.
The first thing that the script does is a yum install to make sure that the OpenSCAP tool is installed and up to date prior to running the scanner.
Next, the script checks to see which version of RHEL the operating system is, so that it can save the correct OpenSCAP testing file to the scriptFile variable.
else
  echo "System is not running RHEL 7.x or RHEL 8.x!"
fi
Next, the script checks that it found the configuration script for the version of RHEL that the instance is using as its operating system.
OpenSCAP has the ability to run several checks for the same rule. For example, when it checks to see whether your system is patched, it will scan several packages. If you want OpenSCAP to list each package that failed, you must have the multi-check parameter set to true. For this solution, you will just want the rolled-up results, because when OpenSCAP formats the results for the multiple checks, it uses the same check ID for each check. This would result in only the last check being recorded, but by rolling the results up, you'll know if the system update failed for any package. The remediation has you perform an upgrade of all packages, so you set the multi-check parameter to false in the OpenSCAP tests file.
Now that the correct configuration script is in place and the multi-check values are set in it, the OpenSCAP tool can be run to perform the check and scan for STIG compliance.
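As a sketch, the invocation looks roughly like the following; the content path, profile ID, and output names are assumptions, not copied from the solution's script:

```python
def build_oscap_command(content_file, timestamp):
    """Assemble an `oscap xccdf eval` command line for a STIG scan."""
    return [
        "oscap", "xccdf", "eval",
        "--profile", "xccdf_org.ssgproject.content_profile_stig",
        "--results", f"{timestamp}-scap-results.xml",
        "--report", f"{timestamp}-scap-results.html",
        content_file,
    ]

cmd = build_oscap_command(
    "/usr/share/xml/scap/ssg/content/ssg-rhel7-ds.xml", "20240101T000000")
# On the instance this would be executed with, for example,
# subprocess.run(cmd, check=False).
```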
After the scan is complete, the script gets the instance ID and a timestamp, and saves them to variables so that it can use these values when it pushes the results to S3.
Lastly, the script pushes the results XML file and the report HTML file to a folder in the S3 bucket. The name of the folder is the ID of the EC2 instance. The script sets both file names to be a timestamp followed by -scap-results.xml and -scap-results.html.
In this post, you've learned how to automate your OpenSCAP testing process for RHEL instances by using Systems Manager Run Command associations. You can apply this same strategy to any operating system supported by OpenSCAP. You also learned how you can use Lambda to send OpenSCAP results to Security Hub in order to centralize security findings across your infrastructure. I hope you can use these techniques to improve your compliance and security automation within your own organization.
For more security automation content on AWS, see our blogs at https://aws.amazon.com/blogs/security/tag/automation/
If you have feedback about this post, submit comments in the Comments section below. If you have questions about this post, start a new thread on the AWS Security Hub forum or contact AWS Support.
Want more AWS Security how-to content, news, and feature announcements? Follow us on Twitter.