
I tried to use a Lambda function to write a file to S3. The test shows "succeeded", but nothing appeared in my S3 bucket. What happened? Can anyone give me some advice or a solution? Thanks a lot. Here's my code.

import json
import boto3

def lambda_handler(event, context):
    string = "dfghj"

    file_name = "hello.txt"
    lambda_path = "/tmp/" + file_name
    s3_path = "/100001/20180223/" + file_name

    with open(lambda_path, 'w+') as file:
        file.write(string)

    s3 = boto3.resource('s3')
    s3.meta.client.upload_file(lambda_path, 's3bucket', s3_path)
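
A quick way to rule out a console-refresh issue is to check for the object from code right after the upload. A minimal sketch, reusing the bucket and key from the code above (head_object raises a ClientError if the key is absent):

import boto3
from botocore.exceptions import ClientError

def object_exists(bucket, key):
    # returns True if S3 can find the object, False on a ClientError (e.g. 404)
    try:
        boto3.client('s3').head_object(Bucket=bucket, Key=key)
        return True
    except ClientError:
        return False

print(object_exists('s3bucket', '/100001/20180223/hello.txt'))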

3 Answers


I've had success streaming data to S3; it has to be encoded to do this:

import boto3

def lambda_handler(event, context):
    string = "dfghj"
    encoded_string = string.encode("utf-8")

    bucket_name = "s3bucket"
    file_name = "hello.txt"
    s3_path = "100001/20180223/" + file_name

    s3 = boto3.resource("s3")
    s3.Bucket(bucket_name).put_object(Key=s3_path, Body=encoded_string)

If the data is in a file, you can read that file and send it up the same way:

with open(filename) as f:
    string = f.read()

encoded_string = string.encode("utf-8")
s3.Bucket(bucket_name).put_object(Key=s3_path, Body=encoded_string)

Comments

Thanks a lot. My code was also valid; I had simply forgotten to refresh the S3 bucket view. I also tried your method and it works too. Thank you so much.
I found that the S3 path shouldn't have a leading /, otherwise an empty folder is created (the key essentially becomes //100001), so I think the line should read: s3_path = "100001/20180223/" + file_name
Note that this isn't streaming, but buffering to disk and then sending.
This solution worked perfectly. I needed to write a CSV file, so I wrote to an io.StringIO, encoded the buffer content to UTF-8, and saved it to an S3 object (see the sketch after these comments).
Why do you need a lambda_path variable?
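As an illustration of the CSV comment above, a minimal in-memory sketch (the bucket name and key are just placeholders, not part of the original answer) could look like this:

import csv
import io
import boto3

def write_csv_to_s3(rows, bucket_name, key):
    # build the CSV in memory instead of writing to /tmp
    buffer = io.StringIO()
    csv.writer(buffer).writerows(rows)

    # encode the buffered text and upload it as a single object
    s3 = boto3.resource("s3")
    s3.Bucket(bucket_name).put_object(Key=key, Body=buffer.getvalue().encode("utf-8"))

write_csv_to_s3([["id", "name"], ["1", "test"]], "s3bucket", "100001/20180223/hello.csv")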

My response is very similar to Tim B's, but the most important part is:

1. Go to S3 and create the bucket you want to write to.

2. Follow the steps below, otherwise your Lambda will fail due to permission/access errors. I've copied the linked content here as well, in case the URL changes or the page moves.

a. Open the roles page in the IAM console.

b. Choose Create role.

c. Create a role with the following properties.

-Trusted entity – AWS Lambda.

-Permissions – AWSLambdaExecute.

-Role name – lambda-s3-role.

The AWSLambdaExecute policy has the permissions that the function needs to manage objects in Amazon S3 and write logs to CloudWatch Logs.
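
If you prefer to script the role creation instead of using the console, a rough boto3 equivalent of steps a–c would be something like the sketch below (the role name matches the steps above; the script itself is not part of the original instructions):

    import json
    import boto3

    iam = boto3.client("iam")

    # trust policy that lets the Lambda service assume the role
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

    iam.create_role(
        RoleName="lambda-s3-role",
        AssumeRolePolicyDocument=json.dumps(trust_policy),
    )

    # attach the AWS managed policy mentioned above
    iam.attach_role_policy(
        RoleName="lambda-s3-role",
        PolicyArn="arn:aws:iam::aws:policy/AWSLambdaExecute",
    )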

  3. Copy and paste this into your Lambda Python function:

    import json
    import boto3

    def lambda_handler(event, context):
        # put the bucket name you created in step 1
        bucket_name = "my_buck_name"
        file_name = "my_test_file.csv"
        # /tmp is the only writable filesystem path inside Lambda
        lambda_path = "/tmp/" + file_name
        s3_path = "output/" + file_name

        # write a small test file to /tmp
        with open(lambda_path, "w") as f:
            f.write("testing...\n")

        # upload it under the output/ prefix so the key matches the response body
        s3 = boto3.resource("s3")
        s3.meta.client.upload_file(lambda_path, bucket_name, s3_path)

        return {
            'statusCode': 200,
            'body': json.dumps('file is created in: ' + s3_path)
        }
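
One follow-up note on the role steps above: when you create the Lambda function, select lambda-s3-role (from step 2) as its execution role, otherwise the upload_file call will be rejected with a permission error.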
    


from os import path
import json
import boto3
# requests is not bundled with the Lambda Python runtime; package it with
# your deployment or add it via a layer
import requests

def lambda_handler(event, context):
    bucket_name = "mybucket"
    url = "https://i.imgur.com/ExdKOOz.png"

    # download the image and derive the object key from the URL
    response = requests.get(url)
    filename = get_filename(url)
    img = response.content

    # upload the raw bytes to S3
    s3 = boto3.resource("s3")
    s3.Bucket(bucket_name).put_object(Key=filename, Body=img)

    return {'statusCode': 200, 'body': json.dumps('file is created in: ' + filename)}

def get_filename(url):
    # strip the fragment, query string and scheme, then take the last path segment
    fragment_removed = url.split("#")[0]
    query_string_removed = fragment_removed.split("?")[0]
    scheme_removed = query_string_removed.split("://")[-1].split(":")[-1]
    if scheme_removed.find("/") == -1:
        return ""
    return path.basename(scheme_removed)

