
Amazon EC2 Simple Systems Manager

This week I’ll talk about Amazon EC2 Simple Systems Manager (SSM). AWS SSM is an EC2 tool that helps us manage and configure our instances. We can send commands and see the results without logging in to each instance in our fleet. The tool has built-in command documents, and we can also write our own custom documents. (Command documents are JSON files that describe the commands we want to run.) It supports both Windows and Linux instances, but there are some prerequisites:

First of all, we need two sets of permissions: one for the EC2 instance that will run the command (I assume the instance’s operating system is supported and that it has internet access), and one for the user who will execute the commands. For the instance, we create a role with the appropriate permissions and assign this role when we launch the instance.

Second, we need to install the SSM agent; we can bootstrap our instances with userdata. Then we are ready to run our commands. Let’s begin configuring it.

1- We need to create a role:

ssm_create_role

2- Then we need to select and attach a policy to our role. Here I select “AmazonEC2RoleforSSM”.

ssm_policy
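Steps 1 and 2 can also be done with the AWS CLI. A rough sketch (the role name, instance profile name, and trust-policy file are my own example names):

```shell
# Hypothetical CLI equivalent of steps 1 and 2; names are examples only.

# trust-policy.json allows EC2 to assume the role:
# {
#   "Version": "2012-10-17",
#   "Statement": [
#     {
#       "Effect": "Allow",
#       "Principal": { "Service": "ec2.amazonaws.com" },
#       "Action": "sts:AssumeRole"
#     }
#   ]
# }

aws iam create-role --role-name ssm-demo-role \
    --assume-role-policy-document file://trust-policy.json

aws iam attach-role-policy --role-name ssm-demo-role \
    --policy-arn arn:aws:iam::aws:policy/service-role/AmazonEC2RoleforSSM

# Unlike the console, the CLI does not create an instance profile for you:
aws iam create-instance-profile --instance-profile-name ssm-demo-profile
aws iam add-role-to-instance-profile --instance-profile-name ssm-demo-profile \
    --role-name ssm-demo-role
```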

3- Now I launch my instance; while launching, I select the role and also install the SSM agent via userdata.

ssm_launch_ins
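The userdata I use looks roughly like the following sketch. The agent download URL and bucket region are assumptions; match them to your instance’s region:

```shell
#!/bin/bash
# Hypothetical userdata sketch: install and start the SSM agent on Amazon Linux.
# The bucket region (eu-west-1) in the URL is an assumption; use your own region.
cd /tmp
curl -O https://amazon-ssm-eu-west-1.s3.amazonaws.com/latest/linux_amd64/amazon-ssm-agent.rpm
yum install -y /tmp/amazon-ssm-agent.rpm
```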

4- Once our instance is ready, we can start sending our commands.

I select “Commands” and click “Run a command” from the Instances menu. I select the “AWS-RunShellScript” document and choose my instance as the target. I enter my script as “df -h”.

 

ssm_run

If you want to use the AWS CLI, the console automatically generates the command for you. Here, executionTimeout is the timeout for the script itself, while timeout-seconds is how long SSM will try to reach the instance before the command times out. You can also add a comment describing the command.

ssm_cli
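For this example, the generated command looks roughly like this (the instance ID is from my own instance; yours will differ):

```shell
# Sketch of the console-generated CLI command for AWS-RunShellScript.
# executionTimeout limits the script; --timeout-seconds limits reaching the instance.
aws ssm send-command --document-name "AWS-RunShellScript" \
    --instance-ids "i-d79df35a" \
    --parameters '{"commands":["df -h"],"executionTimeout":["3600"]}' \
    --timeout-seconds 600 --region eu-west-1
```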

 

I run my command and it finishes successfully.

 

ssm_success

 

Let’s see the result.

ssm_output

And view the output.

 

ssm_output1

 

As you can see, it is very easy to run commands and configure instances using this tool.

We can also use the AWS CLI to interact with SSM and work with our own custom documents.

First, let’s examine the sample document provided by AWS. This document lets us pass our shell script as a parameter.

{
    "schemaVersion":"1.2",
    "description":"Run a Linux shell script or specify the paths to scripts to run.",
    "parameters":{
        "commands":{
            "type":"StringList",
            "description":"(Required) Specify the commands to run or the paths to existing scripts on the instance.",
            "minItems":1,
            "displayType":"textarea"
        },
        "workingDirectory":{
            "type":"String",
            "default":"",
            "description":"(Optional) The path to the working directory on your instance.",
            "maxChars":4096
        },
        "executionTimeout":{
            "type":"String",
            "default":"3600",
            "description":"(Optional) The time in seconds for a command to be completed before it is considered to have failed. Default is 3600 (1 hour). Maximum is 28800 (8 hours).",
            "allowedPattern":"([1-9][0-9]{0,3})|(1[0-9]{1,4})|(2[0-7][0-9]{1,3})|(28[0-7][0-9]{1,2})|(28800)"
        }
    },
    "runtimeConfig":{
        "aws:runShellScript":{
            "properties":[
                {
                    "id":"0.aws:runShellScript",
                    "runCommand":"{{ commands }}",
                    "workingDirectory":"{{ workingDirectory }}",
                    "timeoutSeconds":"{{ executionTimeout }}"
                }
            ]
        }
    }
}

Here we define the parameters to be used when executing our command. I save it as custom.json and create a document named “custom”:

aws ssm create-document --content file://custom.json --name "custom" --region eu-west-1

Then I send my command to my instance. I send “uname -a”, “df -h” and “ifconfig” as my parameters.

aws ssm send-command --document-name "custom" --instance-ids "i-d79df35a" --parameters '{"commands":["uname -a","df -h","ifconfig"],"executionTimeout":["3600"]}' --timeout-seconds 600 --region eu-west-1

And its status is pending:

{
    "Command": {
        "Status": "Pending",
        "ExpiresAfter": 1450694153.041,
        "Parameters": {
            "commands": [
                "uname -a",
                "df -h",
                "ifconfig"
            ],
            "executionTimeout": [
                "3600"
            ]
        },
        "DocumentName": "custom",
        "InstanceIds": [
            "i-d79df35a"
        ],
        "CommandId": "6f1e139b-77d1-440f-83e4-4eef3d94a9c8",
        "RequestedDateTime": 1450693553.041
    }
}

Now I request the output of my command using the “CommandId”:

aws ssm list-command-invocations --command-id "6f1e139b-77d1-440f-83e4-4eef3d94a9c8" --details
{
    "CommandInvocations": [
        {
            "Status": "Success",
            "CommandPlugins": [
                {
                    "Status": "Success",
                    "Name": "aws:runShellScript",
                    "ResponseCode": 0,
                    "Output": "Linux ip-172-31-6-250 4.1.10-17.31.amzn1.x86_64 #1 SMP Sat Oct 24 01:31:37 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux\nFilesystem      Size  Used Avail Use% Mounted on\n/dev/xvda1      7.8G  1.2G  6.6G  15% /\ndevtmpfs        237M   56K  237M   1% /dev\ntmpfs           246M     0  246M   0% /dev/shm\neth0      Link encap:Ethernet  HWaddr 0A:B5:0B:46:AE:B7  \n          inet addr:172.31.6.250  Bcast:172.31.15.255  Mask:255.255.240.0\n          inet6 addr: fe80::8b5:bff:fe46:aeb7/64 Scope:Link\n          UP BROADCAST RUNNING MULTICAST  MTU:9001  Metric:1\n          RX packets:2572 errors:0 dropped:0 overruns:0 frame:0\n          TX packets:2359 errors:0 dropped:0 overruns:0 carrier:0\n          collisions:0 txqueuelen:1000 \n          RX bytes:424390 (414.4 KiB)  TX bytes:384353 (375.3 KiB)\n\nlo        Link encap:Local Loopback  \n          inet addr:127.0.0.1  Mask:255.0.0.0\n          inet6 addr: ::1/128 Scope:Host\n          UP LOOPBACK RUNNING  MTU:65536  Metric:1\n          RX packets:2 errors:0 dropped:0 overruns:0 frame:0\n          TX packets:2 errors:0 dropped:0 overruns:0 carrier:0\n          collisions:0 txqueuelen:0 \n          RX bytes:140 (140.0 b)  TX bytes:140 (140.0 b)\n\n",
                    "ResponseFinishDateTime": 1450693553.296,
                    "ResponseStartDateTime": 1450693553.293
                }
            ],
            "InstanceId": "i-d79df35a",
            "DocumentName": "custom",
            "CommandId": "6f1e139b-77d1-440f-83e4-4eef3d94a9c8",
            "RequestedDateTime": 1450693553.041
        }
    ]
}
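When running commands across many instances, it helps to script over this JSON rather than reading it by hand. A minimal Python sketch, using an abbreviated sample of the response above (the field names match the real output; the Output value is truncated):

```python
import json

# Abbreviated sample of `aws ssm list-command-invocations --details` output
raw = """
{
  "CommandInvocations": [
    {
      "Status": "Success",
      "InstanceId": "i-d79df35a",
      "CommandId": "6f1e139b-77d1-440f-83e4-4eef3d94a9c8",
      "CommandPlugins": [
        {
          "Status": "Success",
          "Name": "aws:runShellScript",
          "ResponseCode": 0,
          "Output": "Linux ip-172-31-6-250 4.1.10-17.31.amzn1.x86_64"
        }
      ]
    }
  ]
}
"""

response = json.loads(raw)

# Print one line per plugin: instance ID, status, and the first line of output
for invocation in response["CommandInvocations"]:
    for plugin in invocation["CommandPlugins"]:
        first_line = plugin["Output"].splitlines()[0]
        print(invocation["InstanceId"], plugin["Status"], first_line)
```

In a real script you would feed it the JSON printed by the CLI (or call SSM through an SDK) instead of the embedded sample.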

Also remember that you can send the command output logs to an S3 bucket, so you can debug your commands if they fail.
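On send-command, the S3 destination is given with the --output-s3-bucket-name and --output-s3-key-prefix options. A sketch (the bucket name and key prefix are my own examples):

```shell
# Send the command output to S3; bucket and prefix names are examples only.
aws ssm send-command --document-name "custom" --instance-ids "i-d79df35a" \
    --parameters '{"commands":["df -h"],"executionTimeout":["3600"]}' \
    --output-s3-bucket-name "my-ssm-logs" \
    --output-s3-key-prefix "run-command" \
    --timeout-seconds 600 --region eu-west-1
```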

This was a basic overview of AWS Simple Systems Manager. I hope you find it useful. If you have any questions or comments, please feel free to write, and don’t forget to share.

Onur SALK

AWS Cloud & DevOps Consultant, AWS Certified Solutions Architect, AWS Community Hero


8 thoughts on “Amazon EC2 Simple Systems Manager”

  1. Can I save my results in DynamoDB instead of S3? As I run SSM on more than 1000 servers, it is difficult to get a consolidated report.

    1. Hi Suditi,

      I couldn’t see an option to store the results in DynamoDB. However, you can create a Lambda function that writes the data to DynamoDB and trigger it when SSM saves the result in S3. I hope this helps.

      1. Hello Onur, thank you for replying. I was wondering if there is another solution apart from the one you mentioned. Is there a way I can call DynamoDB APIs from the custom SSM document directly?

  2. Hello Onur, I want to know how we can automate user and password creation on multiple machines (instances) through a bash script. I mean, I want to create Linux users on multiple servers through Systems Manager, so that I can manage user creation and set passwords remotely.

    1. Hi Deepti,

      After you have installed the agent, you can send the required command to your Linux instances. If you want to automate this, you can create a Lambda function and trigger it when an instance goes into the “running” state. When triggered, it can run the required EC2 Systems Manager command.

      1. Thanks Onur for replying. I actually want to do this via the AWS CLI, but whenever I run the bash script for user creation on different servers, the command does not execute. As for your Lambda suggestion, I don’t know much about coding, but thanks once again for replying. I am sad that I did not get confirmation for your DevOps workshop.
