
I am trying to string together Lambda functions. The first Lambda function is on a 30-minute timer to scrape data and put it in an S3 bucket; the next Lambda function retrieves and parses that data and puts it in a separate S3 bucket; and the last function performs analysis on that data and sends the user (in this case myself) an email of the results via Python's smtplib module.

Instead of having the last two Lambda functions running on timers, I want the second function to be triggered when the first function is done, and the last function to be triggered when the second function is done. I also want to delete the two folders in the first S3 bucket and the contents of the second S3 bucket to save on storage and processing time.

Is there a way to do this entirely in the AWS web interface, rather than rewriting the Python code I already have?

5 Answers


This sounds like a use case that can be fulfilled by Step Functions.

You would define a workflow that calls the first Lambda function and then, if it succeeds, calls the second Lambda function, passing input/output between them.

You would then update your scheduled action to execute the state machine instead of the first Lambda.
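If it helps to see the shape of it, here is a minimal sketch of such a state machine created with boto3 (the function names, account ID, and IAM role ARN are placeholder assumptions; you can equally build the same flow visually in the Step Functions console):

import json
import boto3

sfn = boto3.client('stepfunctions')

# Amazon States Language definition: run the three Lambdas in sequence,
# passing each function's output as the next function's input.
definition = {
    "StartAt": "Scrape",
    "States": {
        "Scrape": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:scraper",
            "Next": "Parse"
        },
        "Parse": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:parser",
            "Next": "Analyze"
        },
        "Analyze": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:analyzer",
            "End": True
        }
    }
}

# Create the state machine once; your schedule then targets this
# state machine instead of the first Lambda.
sfn.create_state_machine(
    name='scrape-parse-analyze',
    definition=json.dumps(definition),
    roleArn='arn:aws:iam::123456789012:role/StepFunctionsExecutionRole'
)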


1 Comment

Hmm, interesting. I will take a look, thanks!

It can be done by using an S3 PUT event as a trigger for Lambda.

The second and third Lambda functions can each have an S3 PUT event trigger.

 Timer                  Put Event                    Put Event          
-------Lambda_1 -> S3 ------------> Lambda_2 --> S3 ------------> Lambda_3 ---> Email  
  1. The timer triggers the first Lambda, which puts files in an S3 bucket.
  2. The PUT event on that S3 bucket triggers Lambda_2, which puts a file in the second S3 bucket.
  3. The PUT event on that bucket triggers Lambda_3, which sends the email.
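As a sketch, the handler for Lambda_2 could look like this (the destination bucket name and the parsing step are placeholder assumptions):

import json
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # The S3 PUT event carries the bucket and key of the object that
    # triggered this invocation.
    record = event['Records'][0]['s3']
    src_bucket = record['bucket']['name']
    key = record['object']['key']

    # Fetch and parse the scraped data (real parsing logic goes here).
    body = s3.get_object(Bucket=src_bucket, Key=key)['Body'].read()
    parsed = json.loads(body)

    # Write the result to the second bucket; that PUT in turn
    # triggers Lambda_3.
    s3.put_object(
        Bucket='my-parsed-bucket',  # placeholder for the second bucket
        Key=key,
        Body=json.dumps(parsed).encode('utf-8')
    )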

Check the links below for more info:

URL-1

URL-2

5 Comments

The thing with S3 is that the initial function puts more than one file in the bucket during its execution, so would the PUT trigger fire prematurely?
In that case, once all the files are uploaded to the S3 bucket, add code in the Lambda to create one file like "complete.txt" and upload it to the bucket. Set the S3 PUT event trigger to match this file using the suffix .txt.
There are two folders in the S3 bucket, comments and articles, that get filled with JSON data during the execution. I am assuming this complete.txt can just be put in the S3 bucket itself, and the code that puts that file can be called at the end of the lambda_handler?
Yup. You can add the code at the end of lambda_handler; it should be the last operation in the Lambda. You can PUT the 'complete.txt' file anywhere in the S3 bucket. Just make sure it is added after all the files are uploaded to both folders.
JFYI, when you add a trigger there is also a Prefix option along with Suffix; there you can specify the folder the file is written to. Just keep that in mind in case you are planning to write complete.txt inside a folder.
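Putting the thread together, a minimal sketch of the sentinel-file approach at the end of the first function (the bucket name is a placeholder; the folder uploads stand in for the existing scraping code):

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # ... existing code that uploads JSON files to the comments/ and
    # articles/ folders ...

    # Last operation: upload the sentinel file. The second Lambda's
    # trigger is filtered on suffix ".txt", so it fires on this PUT only,
    # not on the earlier JSON uploads.
    s3.put_object(Bucket='my-scrape-bucket', Key='complete.txt', Body=b'')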

Another solution, which would involve rewriting your Python code, but only slightly, is to have each function invoke the next one directly:

import boto3

lambda_client = boto3.client('lambda')

# The payload becomes the event passed to the invoked function.
payload = b'{"a": 1, "b": 2, "c": 3}'

# InvocationType='Event' makes the call asynchronous: Lambda queues
# the event and returns immediately rather than waiting for the result.
response = lambda_client.invoke(
    FunctionName="my-second-function",
    InvocationType='Event',
    Payload=payload
)
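Note that for this to work, the calling function's execution role needs lambda:InvokeFunction permission on my-second-function. Because InvocationType='Event' is asynchronous, the response only confirms that Lambda queued the event (status code 202); it does not wait for the second function to finish.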

By contrast, I find Step Functions to be way more complicated.



Another way to do it would be Lambda destinations.

This is a feature that provides visibility into Lambda function invocations and routes the execution results to other AWS services, simplifying event-driven applications and reducing code complexity. For asynchronous invocations, you can choose one of four destination types: another Lambda function, SNS, SQS, or EventBridge.

You can bind the first Lambda's on-success destination to the next Lambda function, and chain that function's result to the third one the same way.
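A sketch of wiring that up with boto3 (the function names, account ID, and SNS topic are placeholder assumptions); note that destinations fire only for asynchronous invocations:

import boto3

lambda_client = boto3.client('lambda')

# On success, route the first function's result to the second function;
# on failure, publish to an SNS topic for alerting.
lambda_client.put_function_event_invoke_config(
    FunctionName='my-first-function',
    DestinationConfig={
        'OnSuccess': {
            'Destination': 'arn:aws:lambda:us-east-1:123456789012:function:my-second-function'
        },
        'OnFailure': {
            'Destination': 'arn:aws:sns:us-east-1:123456789012:my-alerts-topic'
        }
    }
)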



All the answers here work. To summarize, you could:

  • Trigger the Lambdas when a file arrives in an S3 bucket
  • Use Step Functions to coordinate the 3 Lambda functions, one after another
  • Use Lambda destinations (on success) to orchestrate the workflow

All of these accomplish what you wish for, and here's some extra info.

  • For the S3 option, you can specify which events trigger the Lambda (e.g. PutObject), and even the prefix and suffix of the file(s); see the sketch after this list.
  • For Step Functions: this is my favorite option. It's a more expensive option (typically 5-10x the cost of plain Lambda), but it actually orchestrates everything, and the view you get from the console is amazing. It's very easy to triage issues if they occur.
  • For Lambda destinations, you can also set an onFailure destination, so that you can get alerted if the function fails. onSuccess will of course lead to the next function in the chain.
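For instance, the prefix/suffix filtering on the S3 trigger could be configured like this (the bucket, function ARN, and filter values are placeholder assumptions):

import boto3

s3 = boto3.client('s3')

# Invoke the Lambda only for objects created via PUT under comments/
# with a .json extension; the filters keep unrelated uploads from
# firing the function.
s3.put_bucket_notification_configuration(
    Bucket='my-scrape-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:parser',
            'Events': ['s3:ObjectCreated:Put'],
            'Filter': {'Key': {'FilterRules': [
                {'Name': 'prefix', 'Value': 'comments/'},
                {'Name': 'suffix', 'Value': '.json'}
            ]}}
        }]
    }
)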

Hope that helps.
