[solved] AWS S3 - File uploads

I’m sorry I can’t help you with your problem @brunopop. But what you’re trying to do is something I’m sure many others will want to do as well, so it would be awesome if we could figure out how best to do this, and then share it with the community.

I have implemented downloading from an s3 bucket in a Redwood Function. I’ll try to do a write-up about it, maybe it can give you some ideas. Hopefully I can find some time later today, or tomorrow, but no promises.


@brunopop is S3 a hard requirement for you? If so, a couple of questions: have you checked that the mimetype of the images is correct when they reach the server? So it uploads, but when you download it, it’s not recognized as an image?

Can you show the code where the front-end sends over the form data? Please also explain the various functions being used, because it’s not clear how they tie together.

My other suggestion would be to use something like Cloudinary or Filestack. Both have much nicer SDKs and are easier to work with.


@brunopop I second @viperfx on using Cloudinary or Filestack (which I have used before and been very happy with) or ImageKit for uploading.

They integrate well with a number of storage providers, provide security to ensure only certain IP addresses or clients can upload, upload directly in a streamed way so that very large files can be uploaded, and their SDKs and widgets are user friendly.

And they can transform images on upload to make thumbnails, convert to other formats, or apply other transforms. When fetching the image, they can also provide transformations: cropping, scaling, fitting, and a number of other options.

A few things you do need to know about the web-to-function-to-S3 upload:

  • You are doing this in memory, and functions have a memory limit
  • Functions (on Netlify) have a 10s runtime limit, so you will need to consume the image data and send to S3 in under that time. You may want to consider an async process with a background function on Netlify.
  • Your function is public. You should provide some mechanism to ensure the upload is legitimate otherwise someone could upload lots of files to your S3 bucket which can incur costs in bandwidth and storage space.

If you use Netlify you could also try their Forms file uploads: https://docs.netlify.com/forms/setup/#file-uploads. Based on the plan there are some limits. You could then use a webhook to be notified when a form upload happens, and use a background function to fetch the file and send it to an S3 bucket you own.


Writeup, as promised :slight_smile:

https://tlundberg.com/blog/2020-11-28/redwood-download-files-from-protected-s3/


Thanks! It is not a hard requirement, but I thought it could be nice to integrate it.

The mimetype is correct on the server. But if you try to open the image using the object URL, it shows just a black image. If you download the file and open it, Windows Photo Viewer shows an unsupported-format error.

I think it is something about the binary data or something with the headers, but I couldn’t figure it out. Maybe it is how I parsed the body.
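For what it’s worth, one common cause of exactly this symptom (correct mimetype, but the stored object is unreadable): with the AWS Lambda proxy integration, binary request bodies arrive base64-encoded and `event.isBase64Encoded` is set, so writing `event.body` straight to S3 stores base64 text rather than the image bytes. A sketch of the decode step, assuming that event shape:

```javascript
// Decode a Lambda proxy-integration body into real bytes before writing to S3.
// API Gateway base64-encodes binary payloads and flags them with isBase64Encoded.
function bodyToBuffer(event) {
  return event.isBase64Encoded
    ? Buffer.from(event.body, 'base64') // binary upload: decode base64 text
    : Buffer.from(event.body)           // plain text body: use as-is
}
```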

No front-end yet. I am just testing with Postman, sending a POST request.

@dthyresson @viperfx I have tried Filestack and I didn’t like it. That was one of the reasons to try with AWS S3 directly.

After your recommendation, I’m trying ImageKit and I really like it! Looks great. Actually, I think I’m gonna go with that solution. @dthyresson Have you ever tried the ImageKit Uppy plugin?

Thanks!!!

To be honest, I have not used it for images, but other traditional file uploads.

Great! It was @peterp who mentioned it to me as he uses it – I’ve been using imgix for image processing, but it does not offer uploads.

I have not – but uppy has 24k :star2:s and is from transloadit.

I remember almost using them for a project 6 or so years ago when we needed to upload video and transcode it – but the client decided to just use a simple aws pipeline.

Uppy looks really good. I’ll have to think about using it in the future.

Hello,

I do this all the time with S3 (or S3-compatible APIs). My suggestion is to use a signed URL to upload.

Here’s a tutorial: https://www.serverless.com/blog/s3-one-time-signed-url

The flow could be:

  1. Generate a signed URL on the backend and return it
  2. POST/PUT your file contents to this URL from your client

This incidentally gives you security as well, because you don’t need to make uploads to your bucket public.
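Step 1 of that flow can be sketched with the aws-sdk v2 client (the bucket, key, and region values below are placeholders, and `getSignedUrlPromise` is the v2 API):

```javascript
// Build the params for a presigned PUT URL. Pure helper, kept separate from
// the SDK call so the shape is easy to see (and test) on its own.
function buildPresignParams({ bucket, key, contentType, expiresSecs = 300 }) {
  return {
    Bucket: bucket,
    Key: key,
    Expires: expiresSecs,     // URL validity window in seconds
    ContentType: contentType, // client must send the same Content-Type header
  }
}

// Backend-side: generate the presigned URL (requires the aws-sdk package;
// required lazily so the helper above works without it).
async function getUploadUrl(opts) {
  const S3 = require('aws-sdk/clients/s3')
  const s3 = new S3({ region: 'us-east-1' }) // assumed region
  return s3.getSignedUrlPromise('putObject', buildPresignParams(opts))
}
```

The client then PUTs the raw file bytes to the returned URL with the same `Content-Type` the URL was signed with; a mismatch fails the signature check.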


@dthyresson Filestack claims to be the best way to manage files. I had an issue in my first 5 minutes of use, I contacted them, and they never responded.

In my opinion, ImageKit has a low profile but it works. The documentation is much better and it’s developer-oriented. They really focus on best practices for managing your media. For example, image optimization to improve the load time of your website’s images while maintaining acceptable image quality.

In addition, if you don’t need to add your files programmatically, there is an easy-to-use dashboard with a practical media library. It is possible to upload any kind of static file, not only images. Finally, it’s interesting that they only charge you for bandwidth, not storage.

On the other hand, Filestack includes a filepicker and a machine learning product for object recognition, explicit content detection, etc. ImageKit provides a plugin for Uppy, so it is not a big difference. Machine learning sounds great! But I don’t think it is usually a hard requirement, and other integrations should be around soon :slight_smile:

This is a great writeup and super valuable info. I wish I’d had it a few weeks ago :wink:

I’m using them now because I wanted some fine control over what files are uploaded (i.e. only certain mimetypes, in this case zip files), and some of them could be very large (50-100MB). I will likely try to set up some type of pipeline to look inside the zips and check that certain items exist, i.e. that it’s a valid file.

But I’m definitely going to consider Uppy.

This post was very helpful for me – thank you @brunopop for the write up and @dthyresson for the imagekit recommendation!


Thanks for all the informative posts! I think I can piece together what I need from all of this.

Having read all of this, my hope is: the user identifies the files to be uploaded and then clicks upload; normal Lambda code generates a secure S3 upload URL to a dynamic bucket name that is good for a few minutes; upon receipt of the upload URL, files are uploaded from the browser (maybe using Uppy). When a download is requested, a secure URL to a dynamic bucket name that is good for a few minutes is generated and returned; upon receipt of the download URL, files are downloaded to the browser.

Any thoughts/samples would be appreciated – I’ll leave code samples here after I get it working.

Cheers!

Hey @ajoslin103! If I remember correctly I ran into a lot of issues getting AWS S3 bucket policies working as expected, and eventually switched my implementation over to simply using Cloudinary, which dramatically simplified the implementation and also made it very simple to optimize the image size & file format (not affiliated with them, just a fan.)

Below is my Cloudinary implementation. I remember this working without issue, but it’s been many months since I took a look at the project that used this code, so no guarantees.

Image uploading function:

export const uploadPhoto = async (file) => {
  const url = 'https://api.cloudinary.com/v1_1/[YOUR_ACCOUNT_NAME]/image/upload'
  const preset = '[YOUR_PRESET]'
  const config = {
    headers: { 'X-Requested-With': 'XMLHttpRequest' },
  }
  const fd = new FormData()

  fd.append('upload_preset', preset)
  fd.append('file', file)

  const response = await axios.post(url, fd, config)

  return response
}

TS component using some Tailwind for styling (don’t think we need to be using Label here, though):

import { Label } from '@redwoodjs/forms'
import { uploadPhoto } from 'src/lib/utils'

interface Props {
  photoSrc: string | null
  setPhotoSrc: (src: string | null) => void
  name: string
}

const LargePhotoUpload: React.FunctionComponent<Props> = ({
  photoSrc,
  setPhotoSrc,
  name,
}) => {
  const onChange = async (e) => {
    const file = e.target.files[0]
    const response = await uploadPhoto(file)
    const src = response.data.url
    setPhotoSrc(src)
  }

  return (
    <div className="mt-6 flex-grow lg:mt-0 lg:ml-6 lg:flex-grow-0 lg:flex-shrink-0">
      <p className="text-sm font-medium text-gray-700" aria-hidden="true">
        Photo
      </p>
      <div className="mt-1 lg:hidden">
        <div className="flex items-center">
          <div
            className="flex-shrink-0 inline-block rounded-full overflow-hidden h-12 w-12"
            aria-hidden="true"
          >
            {photoSrc ? (
              <img
                className="rounded-full h-full w-full"
                src={photoSrc}
                alt={`${name}-logo`}
              />
            ) : (
              <span className="inline-flex items-center justify-center w-full h-full rounded-full bg-teal-400">
                <span className="text-xs font-medium leading-none text-white text-center"></span>
              </span>
            )}
          </div>
          <div className="ml-5 rounded-md shadow-sm">
            <div className="group relative border border-gray-300 rounded-md py-2 px-3 flex items-center justify-center hover:bg-gray-50 focus-within:ring-2 focus-within:ring-offset-2 focus-within:ring-teal-400">
              <Label
                name="user_photo"
                className="relative text-sm leading-4 font-medium text-gray-700 pointer-events-none"
              >
                <span>Change</span>
                <span className="sr-only"> user photo</span>
              </Label>
              <input
                id="user_photo"
                name="user_photo"
                type="file"
                className="absolute w-full h-full opacity-0 cursor-pointer border-gray-300 rounded-md"
                onChange={onChange}
              />
            </div>
          </div>
        </div>
      </div>

      <div className="hidden relative rounded-full overflow-hidden lg:block">
        {photoSrc ? (
          <img
            className="relative rounded-full w-40 h-40"
            src={photoSrc}
            alt={`${name}-logo`}
          />
        ) : (
          <span className="inline-flex items-center justify-center w-40 h-40 rounded-full bg-teal-400">
            <span className="text-xl font-medium leading-none text-white text-center">
              {name}
            </span>
          </span>
        )}
        <Label
          name="user-photo"
          className="absolute inset-0 w-full h-full bg-black bg-opacity-75 flex items-center justify-center text-sm font-medium text-white opacity-0 hover:opacity-100 focus-within:opacity-100"
        >
          <span>Change</span>
          <span className="sr-only"> user photo</span>
          <input
            type="file"
            id="user-photo"
            name="user-photo"
            className="absolute inset-0 w-full h-full opacity-0 cursor-pointer border-gray-300 rounded-md"
            onChange={onChange}
          />
        </Label>
      </div>
    </div>
  )
}

export default LargePhotoUpload

@tctrautman !! I’m hip !! I had more trouble than it’s worth fighting policies for creating buckets until I figured out the error was misleading: “forbidden” actually meant “not found” in my case…

I am hoping to upload from the client’s browser – but it’s not a dealbreaker for round one !!

I’ll give this a try – thanks!!


Oh, actually the uploadPhoto function is on the web side, so this might be what you want if you’re open to using Cloudinary :slight_smile:

I had to stick with S3 – it should be done in < 24hrs – I’ll post success & code

I’ve got things uploading & downloading via signed URLs & Cells


That’s great – nicely done @ajoslin103! Looking forward to seeing your solution.

Ok, here we go! This explains how I got the users’ browser to do the work of uploading and downloading files to S3 for me.

TL;DR: using my /api side to create the bucket & request signed upload/download URLs from AWS Lambda functions, I then executed uploads/downloads against those URLs from the React code.

I used some library to accept the images and convert them to DataURLs – data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA5MAAAP2CAMAAABNCDlSAAAD…

Then I used some old Lambda code I had to create signed upload and download URLs (after creating a bucket with a widely permissive CORS configuration) – at some point I’ll have to go back and tighten things up…

We’ll start with the Redwood /web side, from component to cell, thence to /api, and finally to the Lambda. If you’d like to start with the Lambda and go backwards, you can read this from the bottom up.

Note: I’ve trimmed the Redwood code for brevity, and stripped out some error checking, which may introduce slight bugs – but we’ve quite a ways to go.

Big Note: I stripped the DataURL prefix (i.e. data:image/png;base64,), storing just the base64 data in S3 – someone smarter than me could figure it out without that rigamarole
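The prefix-stripping described above can be captured in a tiny helper (a sketch for illustration, not the code used below):

```javascript
// Split a DataURL into its prefix, mime type, and base64 payload.
// "data:image/png;base64,AAAA..." -> { prefix, mime: 'image/png', base64: 'AAAA...' }
function parseDataUrl(dataUrl) {
  const comma = dataUrl.indexOf(',')
  const prefix = dataUrl.slice(0, comma)     // "data:image/png;base64"
  const mime = prefix.slice(5).split(';')[0] // drop "data:", keep "image/png"
  return { prefix, mime, base64: dataUrl.slice(comma + 1) }
}
```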

Side Note: I’ve only been doing React for a year on the side, so I’d welcome any tips on that stuff!

--------- Redwood /web component code ---------

Gather an image for uploading

// S3ImageUpload /web component
import { Button } from '@material-ui/core'
import TheS3UploadCell from 'src/cells/TheS3UploadCell'
const S3ImageUpload = ({ bucketName, keyFile, duration, setUploadComplete }) => {
    const [uploadRequested, setUploadRequested] = React.useState(false)
    const [image, setImage] = React.useState<any>(undefined)
    const setUploadCompleted = result => {
        setTimeout(() => setUploadComplete(result))
    }
    return uploadRequested ? (
        <TheS3UploadCell bucket={bucketName} s3key={keyFile} secs={duration} imgData={image.dataURL} exfiltrate={setUploadCompleted} /> 
    ) : (
        // use your favorite library to receive images and expose the image.dataURL via setImage
        <Button variant="contained" color="primary" onClick={() => setUploadRequested(true)}>
            Upload
        </Button>
    )
}
export default S3ImageUpload

Display a downloaded image and legend

// S3Image /web component
import { Box, CardMedia, Typography } from '@material-ui/core'
import useSharedClasses from 'src/hooks/shared-styles'
import TheS3DownloadCell from 'src/cells/TheS3DownloadCell'
const S3Image = ({ bucketName, prefix, keyFile, duration, legend }) => {
    const classes = useSharedClasses()
    const [imageSrc, setImageSrc] = React.useState('')
    return imageSrc ? (
        <Box className={classes.s3Image}>
            <Box className={classes.comfortable1}>
            <Typography variant="inherit">{legend}</Typography>
            </Box>
            <Box className={classes.fourFifthsHeight}>
            <CardMedia className={classes.fullHeight} classes={{ root: classes.containBackground }} image={`${prefix},${imageSrc}`} />
            </Box>
        </Box>
    ) : (
        <TheS3DownloadCell bucket={bucketName} s3key={keyFile} secs={duration} exfiltrate={setImageSrc} />
    )
}
export default S3Image

--------- Redwood /web cell code ---------

Request a signed Url & download the image data from S3

// TheS3DownloadCell /web cell
import ky from 'ky'
export const beforeQuery = (props) => {
    const variables = { ...props }
    return { variables, fetchPolicy: 'no-cache' }
}
export const QUERY = gql`
    query TheS3DownloadCell($bucket: String!, $s3key: String!, $secs: String!) {
        s3DownloadUrl(bucket: $bucket, key: $s3key, secs: $secs) {
            doGetSignedUrlResult
        }
    }
`
export const Empty = () => null
export const Loading = () => null
export const Failure = ({ error }) => <div>Error: {error.message}</div>
export const Success = ({ s3DownloadUrl, variables }) => {
    const { doGetSignedUrlResult: awsUrl } = s3DownloadUrl;
    const { exfiltrate } = variables
    ky.get(awsUrl)
        .then(response => response.arrayBuffer())
        .then(arrayBuffer => {
            const enc = new TextDecoder("utf-8");
            const imgData = enc.decode(arrayBuffer)
            setTimeout(()=>exfiltrate(imgData))
        })
    return null
}

Request a signed Url & upload the image data to S3

// TheS3UploadCell /web cell
import ky from 'ky'
import ImageUpdateCell from 'src/cells/ImageUpdateCell' 
import Progress from 'src/components/Progress'
export const beforeQuery = (props) => {
    const { bucket, s3key, secs, imgData } = props
    const [ prefix ] = imgData.split(',')
    const [ ,,contentType] = prefix.split(/[^a-z]/)
    const mimeType = `image/${contentType}` // enforce images only
    const variables = { type: contentType, key: s3key, mimeType, ...props }
    return { variables, fetchPolicy: 'no-cache' }
}
export const QUERY = gql`
    query TheS3UploadCell($bucket: String!, $key: String!, $type: String!, $secs: String!) {
        s3UploadUrl(bucket: $bucket, key: $key, type: $type, secs: $secs) {
            doGetSignedUrlResult
        }
    }
`
export const Empty = () => null
export const Loading = () => <Progress />
export const Failure = ({ error }) => <div>Error: {error.message}</div>
export const Success = ({ s3UploadUrl, variables }) => {
    const { doGetSignedUrlResult: awsUrl } = s3UploadUrl;
    const { s3key, type, exfiltrate, imgData } = variables;
    const [ prefix, ...base64Parts ] = imgData.split(',')
    const base64 = base64Parts.join(',')
    // header names must match the params the URL was signed with
    const headers = { 'Content-Encoding': 'base64', 'Content-Type': `image/${type}` }
    ky.put(awsUrl, { body: base64, headers })
        .then((result) => {
            setTimeout(()=>exfiltrate(result))
        })
    return (
        // this marks the image as uploaded & stores the prefix
        <ImageUpdateCell id={s3key} name={s3key} prefix={prefix} />
    )
}

--------- Redwood /api code ---------

Get the signed download Url

// s3DownloadUrl.ts /api service (calls lambda)
const fetch = require('node-fetch')
import { requireAuth } from 'src/lib/auth'
import { logger } from 'src/lib/logger'
export const s3DownloadUrl = ({ bucket, key, secs }) => {
    return fetch(`${process.env.storageDownloadPath}/${bucket}?key=${key}&secs=${secs}`, {
        method: 'get', headers: { 
            Authorization: `Bearer ${process.env.bearerToken}`, 
            'Content-Type': 'application/json', 
        }
    })
    .then((res) => res.json())
    .catch(err => logger.error(`s3download.js threw: ${err}`))
}

Get a signed upload Url

// s3UploadUrl.ts /api service (calls lambda)
const fetch = require('node-fetch')
import { logger } from 'src/lib/logger'
export const s3UploadUrl = ({ bucket, key, type, secs }) => {
    return fetch(`${process.env.storageUploadPath}/${bucket}?key=${key}&type=${type}&secs=${secs}`, {
        method: 'get', headers: { 
            Authorization: `Bearer ${process.env.bearerToken}`,
            'Content-Type': 'application/json',
        },
    })
    .then((res) => res.json())
    .catch(err => logger.error(`s3upload.js threw: ${err}`))
}

Create the bucket with a permissive CORS config

// createBucket /api utility (calls lambda)
const fetch = require('node-fetch')
import { logger } from 'src/lib/logger'
export const createBucket = async ({ bucketName }) => {
    return fetch(`${process.env.storageCreatePath}/${bucketName}`, {
        method: 'get', headers: {
            Authorization: `Bearer ${process.env.bearerToken}`,
            'Content-Type': 'application/json',
        },
    })
        .then((res) => res.json())
        .catch((err) => logger.error(err))
};

For your interest, shared classes that work w/Material-UI

// shared-styles.ts /web utility 
import { makeStyles, createStyles } from '@material-ui/core'
export default makeStyles((theme)=>createStyles({
  comfortable1: {
    padding: theme.spacing(1),
    margin: theme.spacing(1),
  },
  containBackground: {
    backgroundSize: 'contain !important',
  },
  fourFifthsHeight: {
    height: '80%',
    margin: theme.spacing(1),
    width: '50%',
  },
  fullHeight: {
    height: '100%',
  },
  s3Image: {
    padding: theme.spacing(2),
    height: '40vh',
    width: '60vw',
  },
}));

--------- AWS Lambda code ---------

The Lambda is a separate serverless.com project, something I made a while back and just keep adding to

here’s the serverless.yml

# https://www.serverless.com/framework/docs/providers/aws/guide/serverless.yml

service: s3-browser-lambda
configValidationMode: error
frameworkVersion: '>=2'
plugins:
    - serverless-plugin-typescript
custom:
    account: ${opt:account, '<omitted>'}
    region: ${opt:region, 'us-east-1'}
    stage: ${opt:stage, 'dev'}
provider:
    stage: ${opt:stage, 'dev'}
    region: ${opt:region, 'us-east-1'}
    apiGateway:
        shouldStartNameWithService: true
    name: aws
    runtime: nodejs12.x
    logRetentionInDays: 120
    deploymentBucket:
        blockPublicAccess: true
    tracing:
        lambda: true
    environment:
        <omitted>
        <omitted>
    iamRoleStatements:
        - Effect: 'Allow'
          Action:
              - 'logs:*'
          Resource: '*'
        - Effect: Allow
          Action:
              - 's3:*'
          Resource: '*'
functions:
    # ensure a storage area w/create
    storage:
        handler: handlers/storage.create
        events:
            - http:
                  path: storage/{bucket}
                  method: get
                  cors: true
    # get an upload url for a storage area w/upload
    upload:
        handler: handlers/storage.upload
        events:
            - http:
                  path: upload/{bucket}
                  method: get
                  cors: true
    # get a download url for a storage area w/download
    download:
        handler: handlers/storage.download
        events:
            - http:
                  path: download/{bucket}
                  method: get
                  cors: true
package:
    include:
        - ./handlers/*
        - ./utils/*

Here’s the handler file

'use strict';

import {
  configForAWS,
  reportResults,
  CallbackErrorResponse,
  CallbackSuccessWithText,
} from '../utils/aws-sdk-lib';

import {
  doEnsureBucket,
  doGetSignedUrl,
  doPutBucketCors,
} from '../utils/aws-sdk-lib-s3';

import { authenticateBusiness } from '../utils/jwtAuth';

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

const gleanFromParameters = (parameters) => {
  const { environment = {}, context = {}, event = {} } = { ...parameters };

  const { invokedFunctionArn } = context;
  
  const [ ,,,region, account ] = invokedFunctionArn.split(':');
  environment.account = account; 
  environment.region = region; 

  const { pathParameters } = event;
  const { bucket: bucketName = 'missing' } = (pathParameters || {});

  const { queryStringParameters } = event;
  const { key: fileName = 'no-file.missing', type: contentType = 'missing', secs: duration = '15' } = (queryStringParameters || {})

  return { ...parameters, bucketName, fileName, contentType: `image/${contentType}`, duration };
};

// ----------------------------------------------------------------------------------------------------------------
const buildS3HeadBucketName = parameters => {
  const { bucketName } = parameters;
  return bucketName;
}

// ----------------------------------------------------------------------------------------------------------------
const handleCreateStorageRequests = async (event = {}, context = {}) => {

  return new Promise(async (resolve, reject) => {

    let parameters = {};
    const environment = { deepFail: true, deepResults: true, deepDebug: true };
    try {
      console.info(`Lambda.handler event`, JSON.stringify(event));
      console.info(`Lambda.handler context`, JSON.stringify(context));

      parameters = await configForAWS({ environment, event, context, gleanFromParameters });
      console.info(`running environment`, environment);
      console.info(`starting parameters`, parameters);

      const buildS3CreateBucketName = buildS3HeadBucketName
      const buildS3VersioningBucketName = buildS3HeadBucketName
      parameters = { ...parameters, buildS3HeadBucketName, buildS3CreateBucketName, buildS3VersioningBucketName };
      parameters = await doEnsureBucket(parameters);

      const buildS3CorsBucketName = buildS3HeadBucketName
      const buildS3CorsBucketConfig = () => ({
        CORSRules: [{
          AllowedOrigins: ['*'],
          AllowedMethods: ['PUT', 'GET'],
          AllowedHeaders: ['*']
        }]
      })
      parameters = { ...parameters, buildS3CorsBucketName, buildS3CorsBucketConfig };
      parameters = await doPutBucketCors(parameters);

      parameters = await reportResults(parameters);
      resolve(parameters);

    } catch (err) {
      console.error(`handleCreateStorageRequests threw: ${err.message || err.error}`);
      if (environment.deepFail) {
        console.debug(`with deepFail, parameters:`, JSON.stringify(parameters, null, 2));
      }

      reject(err);
    }
  });

};

// ----------------------------------------------------------------------------------------------------------------
const handleUploadStorageRequests = async (event = {}, context = {}) => {

  return new Promise(async (resolve, reject) => {

    let parameters = {};
    const environment = { deepFail: true, deepResults: true, deepDebug: true };
    try {
      console.info(`Lambda.handler event`, JSON.stringify(event));
      console.info(`Lambda.handler context`, JSON.stringify(context));

      parameters = await configForAWS({ environment, event, context, gleanFromParameters });
      console.info(`running environment`, environment);
      console.info(`starting parameters`, parameters);

      const buildS3GetSignedUrlParams = (parameters) => {
        const { bucketName, fileName, contentType, duration } = parameters
        return {
          Key: fileName,
          Bucket: bucketName,
          Expires: +duration,
          ContentType: contentType,
          ContentEncoding: 'base64',
        }
      }
      const buildS3GetSignedUrlOperation = (parameters) => `putObject`
      parameters = { ...parameters, buildS3GetSignedUrlOperation, buildS3GetSignedUrlParams };
      parameters = await doGetSignedUrl(parameters);

      parameters = await reportResults(parameters);
      resolve(parameters);

    } catch (err) {
      console.error(`handleUploadStorageRequests threw: ${err.message || err.error}`);
      if (environment.deepFail) {
        console.debug(`with deepFail, parameters:`, JSON.stringify(parameters, null, 2));
      }

      reject(err);
    }
  });

};

// ----------------------------------------------------------------------------------------------------------------
const handleDownloadStorageRequests = async (event = {}, context = {}) => {

  return new Promise(async (resolve, reject) => {

    let parameters = {};
    const environment = { deepFail: true, deepResults: true, deepDebug: true };
    try {
      console.info(`Lambda.handler event`, JSON.stringify(event));
      console.info(`Lambda.handler context`, JSON.stringify(context));

      parameters = await configForAWS({ environment, event, context, gleanFromParameters });
      console.info(`running environment`, environment);
      console.info(`starting parameters`, parameters);

      const buildS3GetSignedUrlParams = (parameters) => {
        const { bucketName, fileName, duration } = parameters
        return {
          Expires: +duration,
          Bucket: bucketName,
          Key: fileName,
        }
      }
      const buildS3GetSignedUrlOperation = (parameters) => `getObject`
      parameters = { ...parameters, buildS3GetSignedUrlOperation, buildS3GetSignedUrlParams };
      parameters = await doGetSignedUrl(parameters);

      parameters = await reportResults(parameters);
      resolve(parameters);

    } catch (err) {
      console.error(`handleDownloadStorageRequests threw: ${err.message || err.error}`);
      if (environment.deepFail) {
        console.debug(`with deepFail, parameters:`, JSON.stringify(parameters, null, 2));
      }

      reject(err);
    }
  });

};

// ----------------------------------------------------------------------------------------------------------------
const create = async (event, context, callback) => {
  try {
    if (await authenticateBusiness(event, callback)) {
      const parameters = await handleCreateStorageRequests(event, context);
      const { bucketName } = parameters as any;
      CallbackSuccessWithText(callback, JSON.stringify({ bucketName }))
    }
  } catch (err) {
    CallbackErrorResponse({ err, fnName: `handler(get/storage/{bucket})`, callback });
  }
}

// ----------------------------------------------------------------------------------------------------------------
const upload = async (event, context, callback) => {
  try {
    if (await authenticateBusiness(event, callback)) {
      const parameters = await handleUploadStorageRequests(event, context);
      const { doGetSignedUrlResult } = parameters as any;
      CallbackSuccessWithText(callback, JSON.stringify({ doGetSignedUrlResult }))
    }
  } catch (err) {
    CallbackErrorResponse({ err, fnName: `handler(get/upload/{bucket})`, callback });
  }
}

// ----------------------------------------------------------------------------------------------------------------
const download = async (event, context, callback) => {
  try {
    if (await authenticateBusiness(event, callback)) {
      const parameters = await handleDownloadStorageRequests(event, context);
      const { doGetSignedUrlResult } = parameters as any;
      CallbackSuccessWithText(callback, JSON.stringify({ doGetSignedUrlResult }))
    }
  } catch (err) {
    CallbackErrorResponse({ err, fnName: `handler(get/download/{bucket})`, callback });
  }
}

// ----------------------------------------------------------------------------------------------------------------
module.exports = { 
    create, 
    upload, 
    download 
};

--------- AWS Lambda code ---------

Here’s the utils part of my sdk library

'use strict';
const AWS = require('aws-sdk');
// ----------------------------------------------------------------------------------------------------------------

const gleanFromParameters_missing = (parameters) => {
    console.info(`NOTE: you may pass a function 'gleanFromParameters(parameters) 
    into 'configForAWS({ environment, event, context, gleanFromParameters }) 
    to extract info from the object: { environment, event, context }`);
}

// create a configuration for our task 
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Config.html#update-property
const configForAWS = ({
    environment = { deepFail: false, deepResults: false, deepDebug: false },
    event = {},
    context = {},
    gleanFromParameters = gleanFromParameters_missing
}) => {
    try {
        let parameters = { environment, event, context };
        parameters = gleanFromParameters(parameters);
        sanityCheckEnvironment(environment);
        const awsConfig = { region: environment.region };
        console.debug(`configForAWS, call: AWS.config.update`, awsConfig);
        AWS.config.update(awsConfig);
        return Promise.resolve({
            ...parameters,
            awsConfig
        });
    } catch (err) {
        throw new Error(`configForAWS threw: ${err.message || err}`);
    }
}

// sanity check environment -- nothing can be empty
const sanityCheckEnvironment = (env) => {
    for (const fldName in env) {
        if (env[fldName] === null || env[fldName] === undefined) {
            throw new Error(`configForAWS, ${fldName} is empty in environment`);
        }
    }
}
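To make the glean hook concrete: `configForAWS` hands `{ environment, event, context }` to your `gleanFromParameters` and expects the (possibly enriched) parameters object back. Here is a hypothetical example -- the request path and the `name` field are made up for illustration, nothing here touches AWS:

```javascript
// A hypothetical gleanFromParameters: pull the object name out of a
// request path like /storage/photo.png and return enriched parameters.
const gleanFromParameters = (parameters) => {
    const { event } = parameters;
    const name = (event.path || '').split('/').pop();
    return { ...parameters, name };
};

const gleaned = gleanFromParameters({
    environment: { region: 'us-east-1' },
    event: { path: '/storage/photo.png' },
    context: {},
});
console.log(gleaned.name); // photo.png
```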

// ----------------------------------------------------------------------------------------------------------------

// log our results, with the return value from S3
const reportResults = (parameters) => {
    const { environment } = { ...parameters };
    const { deepResults } = { ...environment };
    try {
        if (deepResults) {
            console.debug(`reportResults parameters:`, JSON.stringify(parameters, null, 2));
        }
        console.log(`completed with success`);
        return parameters;
    } catch (err) {
        throw new Error(`reportResults threw: ${err.message || err}`);
    }
}

// ----------------------------------------------------------------------------------------------------------------

const CallbackSuccessWithText = function (callback, message) {
    const response = {
        statusCode: 200,
        body: message
    };
    return callback(null, response);
};

// ----------------------------------------------------------------------------------------------------------------

// NOTE: HTMLError and InternalError are custom error classes assumed to be
// required from elsewhere in the library -- they are not defined in this file
const CallbackErrorResponse = ({ err, fnName, callback }) => {
    console.error(`${fnName} threw:`, err);
    if (err instanceof HTMLError) {
        return callback(null, err.htmlResponse());
    }
    const unspecifiedErr = new InternalError(() => err.message || err);
    return callback(null, unspecifiedErr.htmlResponse());
};

// ----------------------------------------------------------------------------------------------------------------

module.exports = {
    configForAWS,
    reportResults,
    CallbackSuccessWithText,
    CallbackErrorResponse,
};
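For orientation, this is roughly how those two callback helpers sit inside a Lambda handler. It's a minimal sketch: `CallbackSuccessWithText` is repeated inline so the snippet runs standalone, the handler name and payload are hypothetical, and the catch branch stands in for a real `CallbackErrorResponse({ err, fnName, callback })` call:

```javascript
// Inline copy of the success helper so this snippet is self-contained
const CallbackSuccessWithText = (callback, message) =>
    callback(null, { statusCode: 200, body: message });

// A hypothetical handler using the helper pattern
const handler = (event, context, callback) => {
    try {
        CallbackSuccessWithText(callback, JSON.stringify({ ok: true }));
    } catch (err) {
        // the real library would call CallbackErrorResponse here
        callback(null, { statusCode: 500, body: String(err.message || err) });
    }
};

// exercise it with a captured callback, the way Lambda would invoke it
let captured;
handler({}, {}, (e, response) => { captured = response; });
console.log(captured); // { statusCode: 200, body: '{"ok":true}' }
```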

Here’s the S3 part of my sdk library

'use strict';
const S3 = require('aws-sdk/clients/s3');
// how long to wait between tests 
const SECONDS_BETWEEN_S3_CHECKS = 5;
// how long to wait before failure
const SECONDS_BEFORE_S3_FAIL = 20;
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#getSignedUrl-property
const doGetSignedUrl = (parameters) => {
    try {
        const { awsConfig } = { ...parameters };
        const s3 = new S3(awsConfig);
        return new Promise((resolve, reject) => {
            try {
                const params = paramsForDoGetSignedUrl(parameters);
                logParamsForDoGetSignedUrl(params, parameters);
                s3.getSignedUrl(params.Operation, params.Params, (err, getSignedUrlResult) => {
                    if (err) {
                        return reject({ message: err });
                    }
                    resolve({
                        ...parameters,
                        doGetSignedUrlResult: getSignedUrlResult
                    });
                });
            }
            catch (err) {
                reject(err.message || err);
            }
        });
    } catch (err) {
        throw new Error(`doGetSignedUrl threw: ${err.message || err}`);
    }
}

// log safely
const logParamsForDoGetSignedUrl = (paramBlock, parameters) => {
    const { environment } = { ...parameters };
    const { deepDebug } = { ...environment };
    if (deepDebug) {
        console.debug(`---------> doGetSignedUrl, call: s3.getSignedUrl`, JSON.stringify(paramBlock, null, 2));
    } else {
        const loggableParamBlock = JSON.parse(JSON.stringify(paramBlock));
        console.debug(`doGetSignedUrl, call: s3.getSignedUrl, paramBlock:`, loggableParamBlock);
    }
}

const buildS3GetSignedUrlOperation_missing = () => {
    throw new Error(`ERROR: a function 'buildS3GetSignedUrlOperation(parameters)' is missing from the parameters to build the S3 operation`);
}

const buildS3GetSignedUrlParams_missing = () => {
    throw new Error(`ERROR: a function 'buildS3GetSignedUrlParams(parameters)' is missing from the parameters to build the S3 params`);
}

// create S3 params for writing the variable
const paramsForDoGetSignedUrl = (parameters) => {
    const {
        buildS3GetSignedUrlOperation = buildS3GetSignedUrlOperation_missing,
        buildS3GetSignedUrlParams = buildS3GetSignedUrlParams_missing,
    } = { ...parameters };
    return {
        Operation: buildS3GetSignedUrlOperation(parameters),
        Params: buildS3GetSignedUrlParams(parameters)
    };
}
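To show what a caller actually supplies to `doGetSignedUrl`, here is a hypothetical `parameters` object for a presigned upload URL. The bucket and key names are made up, and the snippet only demonstrates the shape that `paramsForDoGetSignedUrl` composes -- it never calls AWS:

```javascript
// Builder functions the caller injects; names/values are hypothetical
const parameters = {
    environment: { region: 'us-east-1', deepDebug: false },
    buildS3GetSignedUrlOperation: () => 'putObject',
    buildS3GetSignedUrlParams: () => ({
        Bucket: 'my-upload-bucket',
        Key: 'uploads/avatar.png',
        ContentType: 'image/png',
        Expires: 300 // presigned URL valid for five minutes
    }),
};

// paramsForDoGetSignedUrl(parameters) would then produce:
const built = {
    Operation: parameters.buildS3GetSignedUrlOperation(parameters),
    Params: parameters.buildS3GetSignedUrlParams(parameters),
};
console.log(built.Operation, built.Params.Bucket); // putObject my-upload-bucket
```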

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

// connect to s3 and check to see if a bucket exists
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#headBucket-property
const doHeadBucket = (parameters) => {
    try {
        const { awsConfig } = { ...parameters };
        const s3 = new S3(awsConfig);
        return new Promise((resolve, reject) => {
            try {
                const params = paramsForDoHeadBucket(parameters);
                logParamsForDoHeadBucket(params, parameters);
                s3.headBucket(params, (err, s3Result) => {
                    if (err) {
                        // NotFound/Forbidden just means there is no usable bucket --
                        // resolve with no result so the caller can create one;
                        // anything else is a real failure
                        if (/NotFound/.test(err) || /Forbidden/.test(err)) {
                            return resolve(parameters);
                        }
                        return reject({ message: err });
                    }
                    resolve({
                        ...parameters,
                        doHeadBucketResult: {
                            ...s3Result,
                            bucketName: params.Bucket
                        }
                    });
                });
            }
            catch (err) {
                reject(err.message || err);
            }
        });
    } catch (err) {
        throw new Error(`doHeadBucket threw: ${err.message || err}`);
    }
}

// log safely
const logParamsForDoHeadBucket = (paramBlock, parameters) => {
    const { environment } = { ...parameters };
    const { deepDebug } = { ...environment };
    if (deepDebug) {
        console.debug(`---------> doHeadBucket, call: s3.headBucket`, JSON.stringify(paramBlock, null, 2));
    } else {
        const loggableParamBlock = JSON.parse(JSON.stringify(paramBlock));
        console.debug(`doHeadBucket, call: s3.headBucket, paramBlock:`, loggableParamBlock);
    }
}

const buildS3HeadBucketName_missing = () => {
    throw new Error(`ERROR: a function 'buildS3HeadBucketName(parameters)' is missing from the parameters to build the S3 bucket you wish to head`);
}

// create S3 params for writing the variable
const paramsForDoHeadBucket = (parameters) => {
    const {
        buildS3HeadBucketName = buildS3HeadBucketName_missing,
    } = { ...parameters };
    return {
        Bucket: buildS3HeadBucketName(parameters),
    };
}

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

// connect to s3 and create a bucket
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#createBucket-property
const doCreateBucket = (parameters) => {
    try {
        const { awsConfig } = { ...parameters };
        const s3 = new S3(awsConfig);
        return new Promise((resolve, reject) => {
            try {
                const params = paramsForDoCreateBucket(parameters);
                logParamsForDoCreateBucket(params, parameters);
                s3.createBucket(params, (err, s3Result) => {
                    if (err) {
                        return reject({ message: err });
                    }
                    resolve({
                        ...parameters,
                        doCreateBucketResult: s3Result
                    });
                });
            }
            catch (err) {
                reject(err.message || err);
            }
        });
    } catch (err) {
        throw new Error(`doCreateBucket threw: ${err.message || err}`);
    }
}

// log safely
const logParamsForDoCreateBucket = (paramBlock, parameters) => {
    const { environment } = { ...parameters };
    const { deepDebug } = { ...environment };
    if (deepDebug) {
        console.debug(`---------> doCreateBucket, call: s3.createBucket`, JSON.stringify(paramBlock, null, 2));
    } else {
        const loggableParamBlock = JSON.parse(JSON.stringify(paramBlock));
        console.debug(`doCreateBucket, call: s3.createBucket, paramBlock:`, loggableParamBlock);
    }
}

const buildS3CreateBucketName_missing = () => {
    throw new Error(`ERROR: a function 'buildS3CreateBucketName(parameters)' is missing from the parameters to build the S3 bucket you wish to create`);
}

// create S3 params for creating the bucket; us-east-1 is AWS' original region and must not be given a LocationConstraint
const paramsForDoCreateBucket = (parameters) => {
    const {
        environment, buildS3CreateBucketName = buildS3CreateBucketName_missing,
    } = { ...parameters };
    return (environment.region === 'us-east-1') ? {
        Bucket: buildS3CreateBucketName(parameters),
    } : {
            Bucket: buildS3CreateBucketName(parameters),
            CreateBucketConfiguration: {
                LocationConstraint: environment.region
            }
        };
}

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

// connect to s3 and enable versioning on a bucket
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putBucketVersioning-property
const doPutBucketVersioning = (parameters) => {
    try {
        const { awsConfig } = { ...parameters };
        const s3 = new S3(awsConfig);
        return new Promise((resolve, reject) => {
            try {
                const params = paramsForDoPutBucketVersioning(parameters);
                logParamsForDoPutBucketVersioning(params, parameters);
                s3.putBucketVersioning(params, (err, s3Result) => {
                    if (err) {
                        return reject({ message: err });
                    }
                    resolve({
                        ...parameters,
                        doPutBucketVersioningResult: s3Result
                    });
                });
            }
            catch (err) {
                reject(err.message || err);
            }
        });
    } catch (err) {
        throw new Error(`doPutBucketVersioning threw: ${err.message || err}`);
    }
}

// log safely
const logParamsForDoPutBucketVersioning = (paramBlock, parameters) => {
    const { environment } = { ...parameters };
    const { deepDebug } = { ...environment };
    if (deepDebug) {
        console.debug(`---------> doPutBucketVersioning, call: s3.putBucketVersioning`, JSON.stringify(paramBlock, null, 2));
    } else {
        const loggableParamBlock = JSON.parse(JSON.stringify(paramBlock));
        console.debug(`doPutBucketVersioning, call: s3.putBucketVersioning, paramBlock:`, loggableParamBlock);
    }
}

const buildS3VersioningBucketName_missing = () => {
    throw new Error(`ERROR: a function 'buildS3VersioningBucketName(parameters)' is missing from the parameters to build the S3 bucket you wish to enable versioning on`);
}

// create S3 params for writing the variable
const paramsForDoPutBucketVersioning = (parameters) => {
    const {
        buildS3VersioningBucketName = buildS3VersioningBucketName_missing,
    } = { ...parameters };
    return {
        Bucket: buildS3VersioningBucketName(parameters),
        VersioningConfiguration: {
            MFADelete: "Disabled",
            Status: "Enabled"
        }
    };
}

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putBucketCors-property
const doPutBucketCors = (parameters) => {
    try {
        const { awsConfig } = { ...parameters };
        const s3 = new S3(awsConfig);
        return new Promise((resolve, reject) => {
            try {
                const params = paramsForDoPutBucketCors(parameters);
                logParamsForDoPutBucketCors(params, parameters);
                s3.putBucketCors(params, (err, s3Result) => {
                    if (err) {
                        return reject({ message: err });
                    }
                    resolve({
                        ...parameters,
                        doPutBucketCorsResult: s3Result
                    });
                });
            }
            catch (err) {
                reject(err.message || err);
            }
        });
    } catch (err) {
        throw new Error(`doPutBucketCors threw: ${err.message || err}`);
    }
}

// log safely
const logParamsForDoPutBucketCors = (paramBlock, parameters) => {
    const { environment } = { ...parameters };
    const { deepDebug } = { ...environment };
    if (deepDebug) {
        console.debug(`---------> doPutBucketCors, call: s3.putBucketCors`, JSON.stringify(paramBlock, null, 2));
    } else {
        const loggableParamBlock = JSON.parse(JSON.stringify(paramBlock));
        console.debug(`doPutBucketCors, call: s3.putBucketCors, paramBlock:`, loggableParamBlock);
    }
}

const buildS3CorsBucketName_missing = () => {
    throw new Error(`ERROR: a function 'buildS3CorsBucketName(parameters)' is missing from the parameters to build the S3 bucket you wish to configure CORS on`);
}

const buildS3CorsBucketConfig_missing = () => {
    throw new Error(`ERROR: a function 'buildS3CorsBucketConfig(parameters)' is missing from the parameters to build the S3 bucket cors config`);
}

// create S3 params for writing the variable
const paramsForDoPutBucketCors = (parameters) => {
    const {
        buildS3CorsBucketName = buildS3CorsBucketName_missing,
        buildS3CorsBucketConfig = buildS3CorsBucketConfig_missing,
    } = { ...parameters };
    return {
        Bucket: buildS3CorsBucketName(parameters),
        CORSConfiguration: buildS3CorsBucketConfig(parameters)
    };
}

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

// connect to s3 and block all public access on a bucket
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putPublicAccessBlock-property
const doPutPublicAccessBlock = (parameters) => {
    try {
        const { awsConfig } = { ...parameters };
        const s3 = new S3(awsConfig);
        return new Promise((resolve, reject) => {
            try {
                const params = paramsForDoPutPublicAccessBlock(parameters);
                logParamsForDoPutPublicAccessBlock(params, parameters);
                s3.putPublicAccessBlock(params, (err, s3Result) => {
                    if (err) {
                        return reject({ message: err });
                    }
                    resolve({
                        ...parameters,
                        doPutPublicAccessBlockResult: s3Result
                    });
                });
            }
            catch (err) {
                reject(err.message || err);
            }
        });
    } catch (err) {
        throw new Error(`doPutPublicAccessBlock threw: ${err.message || err}`);
    }
}

// log safely
const logParamsForDoPutPublicAccessBlock = (paramBlock, parameters) => {
    const { environment } = { ...parameters };
    const { deepDebug } = { ...environment };
    if (deepDebug) {
        console.debug(`---------> doPutPublicAccessBlock, call: s3.putPublicAccessBlock`, JSON.stringify(paramBlock, null, 2));
    } else {
        const loggableParamBlock = JSON.parse(JSON.stringify(paramBlock));
        console.debug(`doPutPublicAccessBlock, call: s3.putPublicAccessBlock, paramBlock:`, loggableParamBlock);
    }
}

const buildS3PublicAccessBucketName_missing = () => {
    throw new Error(`ERROR: a function 'buildS3PublicAccessBucketName(parameters)' is missing from the parameters to build the S3 bucket you wish to apply the public access block to`);
}

// create S3 params for writing the variable
const paramsForDoPutPublicAccessBlock = (parameters) => {
    const {
        buildS3PublicAccessBucketName = buildS3PublicAccessBucketName_missing,
    } = { ...parameters };
    return {
        Bucket: buildS3PublicAccessBucketName(parameters),
        PublicAccessBlockConfiguration: { 
            BlockPublicAcls: true,
            BlockPublicPolicy: true,
            IgnorePublicAcls: true,
            RestrictPublicBuckets: true
        }
    };
}

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

// ensure the bucket exists and is ready
const doEnsureBucket = (parameters) => {
    try {
        return new Promise((resolve, reject) => {
            doHeadBucket(parameters) // if bucket exists
                .then(parameters => {
                    const { doHeadBucketResult } = { ...parameters };
                    const { bucketName } = { ...doHeadBucketResult };
                    if (bucketName) {
                        resolve(parameters); // then the bucket is there
                    } else {
                        doCreateBucket(parameters) // otherwise create it, which we have to wait for
                            .then(parameters => {
                                const bucketCreateFailedAfter = new Date().getTime() + (1000 * SECONDS_BEFORE_S3_FAIL); // we won't wait too long...
                                const myWatcher = setInterval(() => {
                                    doHeadBucket(parameters) // is it there yet?
                                        .then(parameters => {
                                            const { doHeadBucketResult } = { ...parameters };
                                            const { bucketName } = { ...doHeadBucketResult };
                                            if (bucketName) {
                                                clearInterval(myWatcher);
                                                resolve(parameters); // then we succeeded
                                            }
                                        })
                                        .catch(err => {
                                            if (new Date().getTime() > bucketCreateFailedAfter) {
                                                clearInterval(myWatcher);
                                                reject(`doEnsureBucket call rejected: ${err.message || err}`); // we waited, and it never showed up
                                            }
                                        })
                                }, 1000 * SECONDS_BETWEEN_S3_CHECKS);
                            })
                            .catch(err => {
                                reject(`doCreateBucket call threw: ${err.message || err}`); // we could not create it
                            });
                    }
                })
                .catch(err => {
                    reject(`doHeadBucket call threw: ${err.message || err}`); // we could not find it
                });
        });
    } catch (err) {
        throw new Error(`doEnsureBucket threw: ${err.message || err}`);
    }
}

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

// connect to s3 and write the prepared data into an object as JSON
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putObject-property
const doPutObject = (parameters) => {
    try {
        const { awsConfig } = { ...parameters };
        const s3 = new S3(awsConfig);
        return new Promise((resolve, reject) => {
            try {
                const params = paramsForDoPutObject(parameters);
                logParamsForDoPutObject(params, parameters);
                s3.putObject(params, (err, s3Result) => {
                    if (err) {
                        return reject({ message: err });
                    }
                    resolve({
                        ...parameters,
                        doPutObjectResult: s3Result
                    });
                });
            }
            catch (err) {
                reject(err.message || err);
            }
        });
    } catch (err) {
        throw new Error(`doPutObject threw: ${err.message || err}`);
    }
}

// log safely
const logParamsForDoPutObject = (paramBlock, parameters) => {
    const { environment } = { ...parameters };
    const { deepDebug } = { ...environment };
    if (deepDebug) {
        console.debug(`---------> doPutObject, call: s3.putObject`, JSON.stringify(paramBlock, null, 2));
    } else {
        const loggableParamBlock = JSON.parse(JSON.stringify(paramBlock));
        loggableParamBlock.Body = `<redacted>`; // deepDebug is false on this branch, so never log the body
        console.debug(`doPutObject, call: s3.putObject, paramBlock:`, loggableParamBlock);
    }
}

const buildS3PutObjectBucketName_missing = () => {
    throw new Error(`ERROR: a function 'buildS3PutObjectBucketName(parameters)' is missing from the parameters to build the S3 bucket you wish to write to`);
}

const buildS3PutObjectS3Key_missing = () => {
    throw new Error(`ERROR: a function 'buildS3PutObjectS3Key(parameters)' is missing from the parameters to build the S3 key you wish to write`);
}

const buildS3PutObjectS3ReadyData_missing = () => {
    throw new Error(`ERROR: a function 'buildS3PutObjectS3ReadyData(parameters)' is missing from the parameters to build the data you wish to write`);
}

// create S3 params for writing the variable
const paramsForDoPutObject = (parameters) => {
    const {
        buildS3PutObjectS3ReadyData = buildS3PutObjectS3ReadyData_missing,
        buildS3PutObjectBucketName = buildS3PutObjectBucketName_missing,
        buildS3PutObjectS3Key = buildS3PutObjectS3Key_missing,
    } = { ...parameters };
    const s3ReadyData = buildS3PutObjectS3ReadyData(parameters)
    return {
        Body: Buffer.from(s3ReadyData, 'utf8'),
        Bucket: buildS3PutObjectBucketName(parameters),
        Key: buildS3PutObjectS3Key(parameters),
    };
}

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

// connect to s3 and read the object 
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#getObject-property
const doGetObject = (parameters) => {
    try {
        const { awsConfig } = { ...parameters };
        const s3 = new S3(awsConfig);
        return new Promise((resolve, reject) => {
            try {
                const params = paramsForGetObject(parameters);
                logParamsForGetObject(params, parameters);
                s3.getObject(params, (err, s3Response) => {
                    if (err) {
                        console.error(`AWS.clients.s3.getObject errored:`, err.stack);
                        return reject({
                            ...parameters,
                            error: err.message,
                            stack: err.stack
                        });
                    }
                    resolve({
                        ...parameters,
                        s3Response
                    });
                });
            }
            catch (err) {
                reject(err.message || err);
            }
        });
    } catch (err) {
        throw new Error(`doGetObject threw: ${err.message || err}`);
    }
}

// log safely
const logParamsForGetObject = (paramBlock, parameters) => {
    console.debug(`doGetObject, call: s3.getObject`, paramBlock);
}

const buildS3GetObjectBucketName_missing = () => {
    throw new Error(`ERROR: a function 'buildS3GetObjectBucketName(parameters)' is missing from the parameters to build the S3 bucket you wish to read from`);
}

const buildS3GetObjectS3Key_missing = () => {
    throw new Error(`ERROR: a function 'buildS3GetObjectS3Key(parameters)' is missing from the parameters to build the S3 key you wish to read`);
}

// create S3 params for reading the params file
const paramsForGetObject = (parameters) => {
    const {
        buildS3GetObjectBucketName = buildS3GetObjectBucketName_missing,
        buildS3GetObjectS3Key = buildS3GetObjectS3Key_missing,
    } = { ...parameters };
    return {
        Bucket: buildS3GetObjectBucketName(parameters),
        Key: buildS3GetObjectS3Key(parameters),
    };
}


// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

module.exports = {
    doCreateBucket,
    doEnsureBucket,
    doGetObject,
    doGetSignedUrl,
    doHeadBucket,
    doPutBucketCors,
    doPutObject,
    doPutPublicAccessBlock,
    doPutBucketVersioning,
};
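And to tie it together, here is a hedged sketch of how the exports are meant to chain: every `do*` step takes the accumulated `parameters` object and resolves with it plus its own result field. The S3-touching steps are stubbed below so the promise threading is visible without credentials; the bucket and key names are hypothetical:

```javascript
// Stubs mimicking the real functions' resolve shapes -- no AWS calls here
const configForAWS = (parameters) =>
    Promise.resolve({ ...parameters, awsConfig: { region: parameters.environment.region } });
const doEnsureBucket = (parameters) =>
    Promise.resolve({ ...parameters, doHeadBucketResult: { bucketName: 'my-upload-bucket' } });
const doPutObject = (parameters) =>
    Promise.resolve({ ...parameters, doPutObjectResult: { ETag: '"stub-etag"' } });

// the pipeline: configure, ensure the bucket exists, write the object
const run = () =>
    configForAWS({
        environment: { region: 'us-east-1' },
        buildS3PutObjectBucketName: () => 'my-upload-bucket',
        buildS3PutObjectS3Key: () => 'uploads/report.json',
        buildS3PutObjectS3ReadyData: () => JSON.stringify({ ok: true }),
    })
        .then(doEnsureBucket)
        .then(doPutObject);

run().then((result) => console.log(result.doPutObjectResult.ETag));
```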

Cheers, you made it this far !!

Hope it all helps !!

The End.
