
How to convert ElasticBeanstalk application to Lambda

The goal - take the code that has been running in an ElasticBeanstalk environment and run it as a Lambda function, triggered whenever a file is dropped into an S3 bucket.

The requirement - to properly deploy it into our prod environment, all resources must be deployed via CloudFormation. Note that we are not the development team, so we are assuming that some code has already been written and uploaded as a .war/.zip file to an S3 bucket. This means that, at a high level, we need three deployments:

Deployments

  1. Deployment
    • Create an IAM role that uses the same policy as the EB role, but whose trust policy lets lambda.amazonaws.com assume it. Also attach the managed policy the Lambda function needs in order to run (for a VPC Lambda, the one that lets it create its network interfaces)
    • Create a Lambda function, loading its code from a .war file uploaded to S3. Assign it the role
    • Create an S3 bucket for sourcing files
  2. Deployment
    • Create a Lambda permission (a resource in the Lambda namespace, not IAM) that allows the S3 bucket to invoke the Lambda function. This cannot be done until the Lambda function and the S3 bucket have been created (deployment 1)
  3. Deployment
    • Update the S3 bucket from deployment 1 to notify the Lambda function. This cannot be done until the Lambda function and the Lambda permission exist, since attaching the notification configuration sends a test notification that must succeed for the update to be successful.

CloudFormation Samples [1]

Lambda Role

This is the IAM role given to the running Lambda function. The example here launches the Lambda inside an existing VPC, so it needs the AWS-managed VPC access execution policy. If you are running outside a VPC, a different managed policy is needed (a variant is sketched after the sample).

    "LambdaRole": {
      "Type": "AWS::IAM::Role",
      "Properties": {
        "RoleName" : "LambdaRole",
        "ManagedPolicyArns" : [
          "arn:aws:iam::aws:policy/service-role/AWSLambdaVPCAccessExecutionRole"
        ],
        "AssumeRolePolicyDocument": {
          "Version" : "2012-10-17",
          "Statement": [  
            {
              "Effect": "Allow",
              "Principal": {
                "Service": [ "lambda.amazonaws.com" ]
              },
              "Action": [ "sts:AssumeRole" ]
            }
          ]
        },
        "Path": "/"
      }
    },
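
As mentioned above, a Lambda that does not run inside a VPC needs a different managed policy. A minimal sketch of that variant (the resource name is hypothetical; only the managed policy changes) swaps the VPC policy for AWSLambdaBasicExecutionRole, which grants just the CloudWatch Logs permissions:

    "LambdaRoleNoVpc": {
      "Type": "AWS::IAM::Role",
      "Properties": {
        "RoleName" : "LambdaRoleNoVpc",
        "ManagedPolicyArns" : [
          "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
        ],
        "AssumeRolePolicyDocument": {
          "Version" : "2012-10-17",
          "Statement": [
            {
              "Effect": "Allow",
              "Principal": { "Service": [ "lambda.amazonaws.com" ] },
              "Action": [ "sts:AssumeRole" ]
            }
          ]
        },
        "Path": "/"
      }
    },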

Lambda Function

The actual Lambda function definition. The code must already be uploaded to S3 before this can deploy. This can be run in parallel with the IAM role creation. This example builds a Lambda on the Java 8 runtime, but Node.js and Python would be similar (a hypothetical Node.js variant is sketched after the sample). In this sample the Lambda is given a SecurityGroup so it can reach back-end services (RDS, etc.) where access is granted by source security group.

    "SearchLambda": {
      "Type": "AWS::Lambda::Function",
      "Properties": {
        "Description" : "Description Text",
        "FunctionName" : { "Fn::Join" : ["-", [{"Ref" : "EnvTag"}, "import", "lambda01"]]  },
        "Handler": "org.hbsp.common.lambda.pim2cs.S3EventHandler",
        "Role": { "Fn::GetAtt" : ["LambdaRole", "Arn"] },
        "MemorySize" : "512",
        "Code": {
          "S3Bucket": "Sourcecode-BucketName",
          "S3Key": { "Fn::Join" : ["/", ["directory/path", {"Ref" : "EnvTag"}, "artifact-name-version.zip"]]}
        },
        "Runtime": "java8",
        "Timeout": "300",
        "VpcConfig" : {
          "SecurityGroupIds" : [
            {"Ref" : "AppServerSG"}
          ],
          "SubnetIds" : [
            { "Ref" : "PriSubnet1" },
            { "Ref" : "PriSubnet2" },
            { "Ref" : "PriSubnet3" }
          ]
        }
      }
    }
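
Since Node.js and Python templates would look much the same, here is a hypothetical Node.js variant of the same resource for comparison (the resource name, handler, and artifact are placeholders, not part of the original project). Only the Handler, Runtime, and uploaded artifact change; the VpcConfig block would be identical to the one above and is omitted here:

    "SearchLambdaNode": {
      "Type": "AWS::Lambda::Function",
      "Properties": {
        "Description" : "Description Text",
        "FunctionName" : { "Fn::Join" : ["-", [{"Ref" : "EnvTag"}, "import", "lambda01"]] },
        "Handler": "index.handler",
        "Role": { "Fn::GetAtt" : ["LambdaRole", "Arn"] },
        "MemorySize" : "512",
        "Code": {
          "S3Bucket": "Sourcecode-BucketName",
          "S3Key": { "Fn::Join" : ["/", ["directory/path", {"Ref" : "EnvTag"}, "node-artifact-version.zip"]]}
        },
        "Runtime": "nodejs4.3",
        "Timeout": "300"
      }
    },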

S3 Bucket (without Notifications)

Initial deployment of the S3 bucket, just to create it. The bucket is needed before the Lambda permission can be created, but it cannot have notifications attached yet.

    "PlatformBucketQA":{
     "Type": "AWS::S3::Bucket",
     "Properties" : {
       "BucketName" : "sp-transfer-qa",
       "Tags" : [
           <Many Tags Go Here>
       ],
       "LoggingConfiguration" : {
         "DestinationBucketName" : "logbucket",
         "LogFilePrefix" : "s3/"
       }
     }
   },

Lambda Permission

This attaches a permission to the Lambda function that allows the source S3 bucket to invoke it. Both the function and the bucket must already exist before this can be deployed. It is possible that this would work in a single stack with a "DependsOn" clause (a sketch follows the sample below), but I find it easier to simply deploy this as a separate step from the Lambda and the bucket.

    "SearchLambdaPerm": {
      "Type": "AWS::Lambda::Permission",
      "Properties" : {
        "Action": "lambda:InvokeFunction",
        "FunctionName": {"Ref": "SearchLambda"},
        "Principal": "s3.amazonaws.com",
        "SourceAccount": {"Ref": "AWS::AccountId"},
        "SourceArn": { "Fn::Join": [":", [
            "arn", "aws", "s3", "" , "", {"Ref" : "PlatformBucketQA"}]]
        }
      }
    },
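
For completeness, here is a rough sketch of the "DependsOn" approach mentioned above; I have not deployed it this way. The catch is that the permission above references the bucket with a Ref, so putting DependsOn on the bucket would create a circular reference. In this hypothetical single-stack variant the SourceArn is built from the literal bucket name instead, and the bucket (shown here without its tags, logging, or key filter) declares DependsOn so the permission exists before the test notification fires:

    "SearchLambdaPerm": {
      "Type": "AWS::Lambda::Permission",
      "Properties" : {
        "Action": "lambda:InvokeFunction",
        "FunctionName": {"Ref": "SearchLambda"},
        "Principal": "s3.amazonaws.com",
        "SourceAccount": {"Ref": "AWS::AccountId"},
        "SourceArn": "arn:aws:s3:::sp-transfer-qa"
      }
    },

    "PlatformBucketQA": {
      "Type": "AWS::S3::Bucket",
      "DependsOn": "SearchLambdaPerm",
      "Properties" : {
        "BucketName" : "sp-transfer-qa",
        "NotificationConfiguration": {
          "LambdaConfigurations": [
            {
              "Event" : "s3:ObjectCreated:*",
              "Function" : { "Fn::GetAtt" : ["SearchLambda", "Arn"] }
            }
          ]
        }
      }
    },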

S3 Bucket (with Notifications)

This is the same S3 bucket definition as before, with the notification configuration added. In this setup, only object-created events (where "created" includes renaming/moving files within the bucket) whose keys match asset/incoming/*xml trigger the Lambda. The "Event" parameter can be changed to trigger on different S3 actions, as sketched after the sample.

    "PlatformBucketQA":{
     "Type": "AWS::S3::Bucket",
     "Properties" : {
       "BucketName" : "sp-transfer-qa",
       "Tags" : [
           <Many Tags Go Here>
       ],
       "NotificationConfiguration": {
         "LambdaConfigurations": [
           {
             "Event" : "s3:ObjectCreated:*",
             "Function" : { "Fn::GetAtt" : ["SearchLambda", "Arn"] },
             "Filter" : {
               "S3Key" : {
                 "Rules" : [
                   {
                     "Name" : "prefix",
                     "Value" : "asset/incoming"
                   },
                   {
                     "Name" : "suffix",
                     "Value" : "xml"
                   }
                 ]
               }
             }
           }
         ]
       },
       "LoggingConfiguration" : {
         "DestinationBucketName" : "logbucket",
         "LogFilePrefix" : "s3/"
       }
     }
   },
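
The "Event" value above only covers object creation. As a hypothetical illustration of triggering on a different S3 action, an additional entry in the same LambdaConfigurations array could also fire the function when matching objects are deleted:

            {
              "Event" : "s3:ObjectRemoved:*",
              "Function" : { "Fn::GetAtt" : ["SearchLambda", "Arn"] },
              "Filter" : {
                "S3Key" : {
                  "Rules" : [
                    { "Name" : "prefix", "Value" : "asset/incoming" },
                    { "Name" : "suffix", "Value" : "xml" }
                  ]
                }
              }
            }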

  1. The samples are in JSON; the equivalent YAML templates have the same fields, just a different structure
