[solved] AWS S3 - File uploads

Hi!

I’m a newbie to Jamstack but trying to get into it. I’m impressed with RedwoodJS, and that’s the reason for doing it!

Currently, I am trying to implement file uploads. I considered AWS S3 for it (it would be interesting to hear some opinions about that choice).

I implemented a Custom Function where I post the file via form-data and upload it to the S3 bucket using aws-sdk.

It works correctly with text files, but with images the files upload to the bucket and then don’t display correctly (the data looks corrupted). Any help on how to fix it?

Also, I would appreciate any recommendations about practices I should implement.

I added the Custom Function code below. Thanks!!

const aws = require('aws-sdk')
const Busboy = require('busboy') 

export const handler = async (event, context) => {
  //Parse body
  parse(event)
    .then((formData) => {
      const file = formData.image

      // AWS config
      aws.config.update({
        accessKeyId: process.env.REDWOOD_ENV_PEPE_ID,
        secretAccessKey: process.env.REDWOOD_ENV_PEPE_LLAVE,
        region: 'eu-central-1',
      })
      const s3 = new aws.S3({ params: { Bucket: 'eu-fr-momentum' } })

      //S3 upload
      upload(s3, file)
    })
    .catch((err) => {
      console.log(err)
    })
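  // NOTE: this returns before the parse()/upload() chain above has finished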
  return {
    statusCode: 200,
    body: 'Request sent',
  }
}

const upload = (s3, file) => {
  s3.createBucket(() => {
    const params = {
      Key: Date.now().toString() + '-' + file.name,
      Body: file.data,
      ContentType: file.type,
      ContentEncoding: file.encoding,
      ACL: 'public-read',
    }
    s3.upload(params, (err, data) => {
      err ? console.log(err) : console.log(data)
    })
  })
}

//PRE: event
//POS: Return event contentType
const getContentType = (event) => {
  let contentType = event.headers['content-type']
  if (!contentType) {
    return event.headers['Content-Type']
  }
  return contentType
}

//PRE: event
//POS: Return parsed body form-data -> {object} with fieldNames as keys
const parse = async (event) =>
  new Promise((resolve, reject) => {
    const busboy = new Busboy({
      headers: {
        'content-type': getContentType(event),
      },
    })
    const result = {}
    busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
      file.on('data', (data) => {
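        // NOTE: busboy may emit 'data' more than once for larger files,
        // so this overwrites earlier chunks instead of concatenating them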
        result[fieldname] = {
          name: filename,
          data: data,
          encoding: encoding,
          type: mimetype,
          size: data.length,
        }
      })
      file.on('end', () => {})
    })
    busboy.on(
      'field',
      (fieldname, val, fieldnameTruncated, valTruncated, encoding, mimetype) => {
        result[fieldname] = val
      }
    )
    busboy.on('finish', () => {
      resolve(result)
    })
    busboy.write(event.body, event.isBase64Encoded ? 'base64' : 'binary')
    busboy.end()
  })

I’m sorry I can’t help you with your problem @brunopop. But what you’re trying to do is something I’m sure many others will want to do as well, so it would be awesome if we could figure out how best to do this, and then share it with the community.

I have implemented downloading from an S3 bucket in a Redwood Function. I’ll try to do a write-up about it; maybe it can give you some ideas. Hopefully I can find some time later today or tomorrow, but no promises.


@brunopop is S3 a hard requirement for you? If so, a couple of questions: have you checked that the mimetype of the images is correct when they reach the server? And it uploads fine, but when you download, it’s not recognized as an image?

Can you show the code where the front-end is sending over the form-data? And could you explain the various functions being used as well? It’s not clear how they tie together.

My other suggestion would be to use something like Cloudinary or Filestack. Both have much nicer SDKs and are easier to work with.


@brunopop I second @viperfx on using Cloudinary or Filestack (which I have used before and been very happy with) or ImageKit for uploading.

They integrate well with a number of storage providers, provide security to ensure only certain IP addresses or clients can upload, upload in a streamed way so that very large files can be handled, and their SDKs and widgets are user friendly.

They can also transform images on upload to make thumbnails or convert them to other formats. And when fetching an image, they can apply transformations – cropping, scaling, fitting, and a number of other options.

A few things you do need to know about the web-to-function-to-S3 upload flow:

  • You are doing this in memory, and functions have a memory limit
  • Functions (on Netlify) have a 10s runtime limit, so you will need to consume the image data and send it to S3 in under that time. You may want to consider an async process with a background function on Netlify.
  • Your function is public. You should provide some mechanism to ensure the upload is legitimate – otherwise someone could upload lots of files to your S3 bucket, which can incur costs in bandwidth and storage space (see the sketch after this list).
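For that last point, even a minimal shared-secret check helps. A sketch – the x-upload-token header and UPLOAD_SECRET env var are made-up names for illustration:

// Sketch: reject uploads that don't carry a shared secret.
// 'x-upload-token' and UPLOAD_SECRET are hypothetical names.
export const handler = async (event) => {
  if (event.headers['x-upload-token'] !== process.env.UPLOAD_SECRET) {
    return { statusCode: 401, body: 'Unauthorized' }
  }
  // ...parse the form-data and upload to S3 as before...
  return { statusCode: 200, body: 'Request sent' }
}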

If you use Netlify you could also try their Forms file uploads (see the Forms setup page in the Netlify Docs). Based on the plan there are some limits. You could then use a webhook to get notified when a form upload is done, and use a background function to fetch the file and send it to your own S3.


Writeup, as promised :slight_smile:

https://tlundberg.com/blog/2020-11-28/redwood-download-files-from-protected-s3/


Thanks! It is not a hard requirement, but I thought it could be nice to integrate.

The mimetype is correct on the server. But if you try to open the image using the object URL, it shows just a black image. If you download the file and open it, Windows Photo Viewer shows an unsupported-format error.

I think it is something about the binary data or the headers, but I couldn’t figure it out. Maybe it is how I parsed the body – an untested idea is sketched below.
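One thing I wonder about (untested): if the body arrives as a plain UTF-8 string instead of base64, writing it out with the 'binary' (latin1) encoding may already have lost bytes. Something like this, plus forcing the gateway to base64-encode binary bodies, might be the fix:

// Untested idea: always hand busboy a Buffer instead of a string.
// If event.isBase64Encoded is false for images, the bytes may already be
// mangled (decoded as UTF-8) before this point, so the gateway would also
// need to be configured to base64-encode binary bodies.
busboy.write(
  Buffer.from(event.body, event.isBase64Encoded ? 'base64' : 'binary')
)
busboy.end()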

No front-end yet. I am just testing with Postman, sending POST requests.

@dthyresson @viperfx I have tried Filestack and I didn’t like it. That was one of the reasons to try AWS S3 directly.

After your recommendation, I’m trying ImageKit and I really like it! Looks great. Actually, I think I’m going to take that solution. @dthyresson Have you ever tried the ImageKit Uppy plugin?

Thanks!!!

To be honest, I have not used it for images, only for other traditional file uploads.

Great! It was @peterp who mentioned it to me as he uses it – I’ve been using imgix for image processing, but it does not offer uploads.

I have not – but Uppy has 24k :star2:s and is from Transloadit.

I remember almost using them for a project 6 or so years ago when we needed to upload video and transcode it – but the client decided to just use a simple AWS pipeline.

Uppy looks really good. I’ll have to think about using it in the future.

Hello,

I do this all the time with S3 (or S3-compatible APIs). My suggestion is to use a signed URL to upload.

Here’s a tutorial: https://www.serverless.com/blog/s3-one-time-signed-url

The flow could be:

  1. Generate a signed URL on the backend and return it (see the sketch below)
  2. POST/PUT your file contents to this URL from your client

This incidentally gives you security as well, because you don’t need to make uploads to your bucket public.
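For the first step, a minimal sketch with the aws-sdk – the bucket name, key, and expiry are placeholders:

// Sketch: generate a short-lived signed PUT URL with aws-sdk v2.
// Bucket, key, and expiry values are placeholders.
const aws = require('aws-sdk')
const s3 = new aws.S3({ region: 'eu-central-1' })

export const signedUploadUrl = ({ key, contentType }) =>
  s3.getSignedUrlPromise('putObject', {
    Bucket: 'my-upload-bucket', // placeholder
    Key: key,
    ContentType: contentType,
    Expires: 300, // seconds
  })

The client then PUTs the raw file bytes to that URL with the same Content-Type header it was signed with.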


@dthyresson Filestack claims to be the best way to manage files, but I had an issue in my first 5 minutes of use, I contacted them, and they never responded.

In my opinion, ImageKit has a lower profile, but it works. The documentation is much better and it’s developer-oriented. They really focus on best practices for managing your media – for example, image optimization to improve the load time of your website’s images while maintaining acceptable image quality.

In addition, if you don’t need to add your files programmatically, there is an easy-to-use dashboard with a practical media library. It is possible to upload any kind of static file, not only images. Finally, it’s interesting that they only charge you for bandwidth, not storage.

On the other hand, Filestack includes a file picker and a machine learning product for object recognition, explicit content detection, etc. ImageKit provides a plugin for Uppy, so it is not a big difference. Machine learning sounds great! But I don’t think it is usually a hard requirement, and other integrations should be around soon :slight_smile:

This is a great writeup and super valuable info. I wish I’d had it a few weeks ago :wink:

I’m using them now because I wanted some fine control over what files were uploaded (i.e. only certain mimetypes – in this case zip files), and some of them could be very large (50-100MB). I will likely try to set up some type of pipeline to look inside the zips and check that certain items exist – i.e., that each is a valid file.

But I’m definitely going to consider Uppy.

This post was very helpful for me – thank you @brunopop for the write-up and @dthyresson for the ImageKit recommendation!


Thanks for all the informative posts! I think I can piece together what I need from all of this.

Having read all of this, my hope is: the user identifies the files to be uploaded and then clicks upload; normal Lambda code generates a secure S3 upload URL, against a dynamic bucket name, that is good for a few minutes; upon receipt of the upload URL, the files are uploaded from the browser (maybe using Uppy). When a download is requested, a secure URL to a dynamic bucket name that is good for a few minutes is generated and returned; upon receipt of the download URL, the files are downloaded to the browser.

Any thoughts/samples would be appreciated – I’ll leave code samples here after I get it working.

Cheers!

Hey @ajoslin103! If I remember correctly I ran into a lot of issues getting AWS S3 bucket policies working as expected, and eventually switched my implementation over to simply using Cloudinary, which dramatically simplified the implementation and also made it very simple to optimize the image size & file format (not affiliated with them, just a fan.)

Below is my Cloudinary implementation. I remember this working without issue, but it’s been many months since I took a look at the project that used this code, so no guarantees.

Image uploading function:

import axios from 'axios'

export const uploadPhoto = async (file) => {
  const url = 'https://api.cloudinary.com/v1_1/[YOUR_ACCOUNT_NAME]/image/upload'
  const preset = '[YOUR_PRESET]'
  const config = {
    headers: { 'X-Requested-With': 'XMLHttpRequest' },
  }
  const fd = new FormData()

  fd.append('upload_preset', preset)
  fd.append('file', file)

  const response = await axios.post(url, fd, config)

  return response
}

TS component using some Tailwind for styling (don’t think we need to be using Label here, though):

import { Label } from '@redwoodjs/forms'
import { uploadPhoto } from 'src/lib/utils'

interface Props {
  photoSrc: string | null
  setPhotoSrc: (src: string | null) => void
  name: string
}

const LargePhotoUpload: React.FunctionComponent<Props> = ({
  photoSrc,
  setPhotoSrc,
  name,
}) => {
  const onChange = async (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files[0]
    const response = await uploadPhoto(file)
    const src = response.data.url
    setPhotoSrc(src)
  }

  return (
    <div className="mt-6 flex-grow lg:mt-0 lg:ml-6 lg:flex-grow-0 lg:flex-shrink-0">
      <p className="text-sm font-medium text-gray-700" aria-hidden="true">
        Photo
      </p>
      <div className="mt-1 lg:hidden">
        <div className="flex items-center">
          <div
            className="flex-shrink-0 inline-block rounded-full overflow-hidden h-12 w-12"
            aria-hidden="true"
          >
            {photoSrc ? (
              <img
                className="rounded-full h-full w-full"
                src={photoSrc}
                alt={`${name}-logo`}
              />
            ) : (
              <span className="inline-flex items-center justify-center w-full h-full rounded-full bg-teal-400">
                <span className="text-xs font-medium leading-none text-white text-center"></span>
              </span>
            )}
          </div>
          <div className="ml-5 rounded-md shadow-sm">
            <div className="group relative border border-gray-300 rounded-md py-2 px-3 flex items-center justify-center hover:bg-gray-50 focus-within:ring-2 focus-within:ring-offset-2 focus-within:ring-teal-400">
              <Label
                name="user_photo"
                className="relative text-sm leading-4 font-medium text-gray-700 pointer-events-none"
              >
                <span>Change</span>
                <span className="sr-only"> user photo</span>
              </Label>
              <input
                id="user_photo"
                name="user_photo"
                type="file"
                className="absolute w-full h-full opacity-0 cursor-pointer border-gray-300 rounded-md"
                onChange={onChange}
              />
            </div>
          </div>
        </div>
      </div>

      <div className="hidden relative rounded-full overflow-hidden lg:block">
        {photoSrc ? (
          <img
            className="relative rounded-full w-40 h-40"
            src={photoSrc}
            alt={`${name}-logo`}
          />
        ) : (
          <span className="inline-flex items-center justify-center w-40 h-40 rounded-full bg-teal-400">
            <span className="text-xl font-medium leading-none text-white text-center">
              {name}
            </span>
          </span>
        )}
        <Label
          name="user-photo"
          className="absolute inset-0 w-full h-full bg-black bg-opacity-75 flex items-center justify-center text-sm font-medium text-white opacity-0 hover:opacity-100 focus-within:opacity-100"
        >
          <span>Change</span>
          <span className="sr-only"> user photo</span>
          <input
            type="file"
            id="user-photo"
            name="user-photo"
            className="absolute inset-0 w-full h-full opacity-0 cursor-pointer border-gray-300 rounded-md"
            onChange={onChange}
          />
        </Label>
      </div>
    </div>
  )
}

export default LargePhotoUpload

@tctrautman !! I’m hip !! I had more trouble than it’s worth fighting policies for creating buckets, until I figured out the error was misleading – “forbidden” actually meant “not found” in my case…

I am hoping to upload from the client’s browser – but it’s not a dealbreaker for round one !!

I’ll give this a try – thanks!!


Oh, actually the uploadPhoto function is on the web side, so this might be what you want if you’re open to using Cloudinary :slight_smile:

I had to stick with S3 – it should be done in < 24hrs – I’ll post success & code

I’ve got things uploading & downloading via signed URLs & Cells


That’s great – nicely done @ajoslin103! Looking forward to seeing your solution.

Ok, here we go! This explains how I got the user’s browser to do the work of uploading and downloading files to S3 for me.

TL;DR: I use my /api side to create the bucket & request signed upload/download URLs from AWS Lambda functions – then I execute uploads/downloads against those URLs from the React code.

I used some library to accept the images and convert them to DataURLs – data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA5MAAAP2CAMAAABNCDlSAAAD…
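(If you don’t want a library, the browser’s FileReader can do that conversion – rough sketch:)

// Rough sketch: read a File into a DataURL with the browser's FileReader.
const fileToDataURL = (file) =>
  new Promise((resolve, reject) => {
    const reader = new FileReader()
    reader.onload = () => resolve(reader.result)
    reader.onerror = reject
    reader.readAsDataURL(file)
  })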

Then I used some old Lambda code I had to create signed upload and download URLs (after creating a bucket with a widely permissive CORS configuration) – at some point I’ll have to go back and tighten things up…

We’ll start with the Redwood /web side, from component to cell, thence to /api, and finally to the Lambda. If you’d like to start with the Lambda and go backwards, you can read this from the bottom up.

Note: I’ve trimmed the Redwood code for brevity, and stripped out some error checking, which may introduce slight bugs – but we’ve quite a ways to go.

Big Note: I stripped the DataURL prefix (i.e. data:image/png;base64,), storing just the base64 data in S3 – someone smarter than me could figure out how to skip that rigamarole (one untested idea just below).
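For instance (untested): a DataURL can be turned back into a binary Blob right in the browser, and the Blob could be PUT to the signed URL directly:

// Untested idea: convert the DataURL back to a binary Blob and PUT that,
// skipping the base64 strip/re-prefix dance entirely.
const dataUrlToBlob = async (dataUrl) => {
  const res = await fetch(dataUrl) // fetch() accepts data: URLs in browsers
  return res.blob()
}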

Side Note: I’ve only been doing React for a year on the side, so I’d welcome any tips on that stuff!

--------- Redwood /web component code ---------

Gather an image for uploading

// S3ImageUpload /web component
import { Button } from '@material-ui/core'
import TheS3UploadCell from 'src/cells/TheS3UploadCell'
const S3ImageUpload = ({ bucketName, keyFile, duration, setUploadComplete }) => {
    const [uploadRequested, setUploadRequested] = React.useState(false)
    const [image, setImage] = React.useState<any>(undefined)
    const setUploadCompleted = result => {
        setTimeout(() => setUploadComplete(result))
    }
    return uploadRequested ? (
        <TheS3UploadCell bucket={bucketName} s3key={keyFile} secs={duration} imgData={image.dataURL} exfiltrate={setUploadCompleted} /> 
    ) : (
        // use your favorite library to receive images and expose the image.dataURL via setImage
        <Button variant="contained" color="primary" onClick={() => setUploadRequested(true)}>
            Upload
        </Button>
    )
}
export default S3ImageUpload

Display a downloaded image and legend

// S3Image /web component
import { Box, CardMedia, Typography } from '@material-ui/core'
import useSharedClasses from 'src/hooks/shared-styles'
import TheS3DownloadCell from 'src/cells/TheS3DownloadCell'
const S3Image = ({ bucketName, prefix, keyFile, duration, legend }) => {
    const classes = useSharedClasses()
    const [imageSrc, setImageSrc] = React.useState('')
    return imageSrc ? (
        <Box className={classes.s3Image}>
            <Box className={classes.comfortable1}>
            <Typography variant="inherit">{legend}</Typography>
            </Box>
            <Box className={classes.fourFifthsHeight}>
            <CardMedia className={classes.fullHeight} classes={{ root: classes.containBackground }} image={`${prefix},${imageSrc}`} />
            </Box>
        </Box>
    ) : (
        <TheS3DownloadCell bucket={bucketName} s3key={keyFile} secs={duration} exfiltrate={setImageSrc} />
    )
}
export default S3Image

--------- Redwood /web cell code ---------

Request a signed Url & download the image data from S3

// TheS3DownloadCell /web cell
import ky from 'ky'
export const beforeQuery = (props) => {
    const variables = { ...props }
    return { variables, fetchPolicy: 'no-cache' }
}
export const QUERY = gql`
    query TheS3DownloadCell($bucket: String!, $s3key: String!, $secs: String!) {
        s3DownloadUrl(bucket: $bucket, key: $s3key, secs: $secs) {
            doGetSignedUrlResult
        }
    }
`
export const Empty = () => null
export const Loading = () => null
export const Failure = ({ error }) => <div>Error: {error.message}</div>
export const Success = ({ s3DownloadUrl, variables }) => {
    const { doGetSignedUrlResult: awsUrl } = s3DownloadUrl;
    const { exfiltrate } = variables
    ky.get(awsUrl)
        .then(response => response.arrayBuffer())
        .then(arrayBuffer => {
            const enc = new TextDecoder("utf-8");
            const imgData = enc.decode(arrayBuffer)
            setTimeout(()=>exfiltrate(imgData))
        })
    return null
}

Request a signed Url & upload the image data to S3

// TheS3UploadCell /web cell
import ky from 'ky'
import ImageUpdateCell from 'src/cells/ImageUpdateCell' 
import Progress from 'src/components/Progress'
export const beforeQuery = (props) => {
    const { bucket, s3key, secs, imgData } = props
    const [ prefix ] = imgData.split(',')
    const [ ,,contentType] = prefix.split(/[^a-z]/)
    const mimeType = `image/${contentType}` // enforce images only
    const variables = { type: contentType, key: s3key, mimeType, ...props }
    return { variables, fetchPolicy: 'no-cache' }
}
export const QUERY = gql`
    query TheS3UploadCell($bucket: String!, $key: String!, $type: String!, $secs: String!) {
        s3UploadUrl(bucket: $bucket, key: $key, type: $type, secs: $secs) {
            doGetSignedUrlResult
        }
    }
`
export const Empty = () => null
export const Loading = () => <Progress />
export const Failure = ({ error }) => <div>Error: {error.message}</div>
export const Success = ({ s3UploadUrl, variables }) => {
    const { doGetSignedUrlResult: awsUrl } = s3UploadUrl;
    const { s3key, type, exfiltrate, imgData } = variables;
    const [ prefix, ...base64Parts ] = imgData.split(',')
    const base64 = base64Parts.join(',')
    const headers = { 'Content-Encoding': 'base64', 'Content-Type': `image/${type}` }
    ky.put(awsUrl, { body: base64, headers })
        .then((result) => {
            setTimeout(()=>exfiltrate(result))
        })
    return (
        // this marks the image as uploaded & stores the prefix
        <ImageUpdateCell id={s3key} name={s3key} prefix={prefix} />
    )
}

--------- Redwood /api code ---------

Get the signed download Url

// s3DownloadUrl.ts /api service (calls lambda)
const fetch = require('node-fetch')
import { logger } from 'src/lib/logger'
import { requireAuth } from 'src/lib/auth'
export const s3DownloadUrl = ({ bucket, key, secs }) => {
    return fetch(`${process.env.storageDownloadPath}/${bucket}?key=${key}&secs=${secs}`, {
        method: 'get', headers: { 
            Authorization: `Bearer ${process.env.bearerToken}`, 
            'Content-Type': 'application/json', 
        }
    })
    .then((res) => res.json())
    .catch(err => logger.error(`s3download.js threw: ${err}`))
}

Get a signed upload Url

// s3UploadUrl.ts /api service (calls lambda)
const fetch = require('node-fetch')
import { logger } from 'src/lib/logger'
export const s3UploadUrl = ({ bucket, key, type, secs }) => {
    return fetch(`${process.env.storageUploadPath}/${bucket}?key=${key}&type=${type}&secs=${secs}`, {
        method: 'get', headers: { 
            Authorization: `Bearer ${process.env.bearerToken}`,
            'Content-Type': 'application/json',
        },
    })
    .then((res) => res.json())
    .catch(err => logger.error(`s3upload.js threw: ${err}`))
}
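For completeness, the SDL these services assume would look roughly like this – a sketch; the field and type names only mirror the cells above:

// sketch of an api/src/graphql/s3.sdl.ts – names mirror the cells above
export const schema = gql`
  type SignedUrlPayload {
    doGetSignedUrlResult: String!
  }

  type Query {
    s3UploadUrl(bucket: String!, key: String!, type: String!, secs: String!): SignedUrlPayload
    s3DownloadUrl(bucket: String!, key: String!, secs: String!): SignedUrlPayload
  }
`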

Create the bucket with a permissive CORS config

// createBucket /api utility (calls lambda)
const fetch = require('node-fetch')
import { logger } from 'src/lib/logger'
export const createBucket = async ({ bucketName }) => {
    return fetch(`${process.env.storageCreatePath}/${bucketName}`, {
        method: 'get', headers: {
            Authorization: `Bearer ${process.env.bearerToken}`,
            'Content-Type': 'application/json',
        },
    })
        .then((res) => res.json())
        .catch((err) => logger.error(err))
};

For your interest, shared classes that work w/Material-UI

// shared-styles.ts /web utility 
import { makeStyles, createStyles } from '@material-ui/core'
export default makeStyles((theme)=>createStyles({
  comfortable1: {
    padding: theme.spacing(1),
    margin: theme.spacing(1),
  },
  containBackground: {
    backgroundSize: 'contain !important',
  },
  fourFifthsHeight: {
    height: '80%',
    margin: theme.spacing(1),
    width: '50%',
  },
  fullHeight: {
    height: '100%',
  },
  s3Image: {
    padding: theme.spacing(2),
    height: '40vh',
    width: '60vw',
  },
}));

--------- AWS Lambda code ---------

The Lambda is a separate serverless.com project – something I made a while back and just keep adding to.

Here’s the serverless.yml:

# https://www.serverless.com/framework/docs/providers/aws/guide/serverless.yml

service: s3-browser-lambda
configValidationMode: error
frameworkVersion: '>=2'
plugins:
    - serverless-plugin-typescript
custom:
    account: ${opt:account, '<omitted>'}
    region: ${opt:region, 'us-east-1'}
    stage: ${opt:stage, 'dev'}
provider:
    stage: ${opt:stage, 'dev'}
    region: ${opt:region, 'us-east-1'}
    apiGateway:
        shouldStartNameWithService: true
    name: aws
    runtime: nodejs12.x
    logRetentionInDays: 120
    deploymentBucket:
        blockPublicAccess: true
    tracing:
        lambda: true
    environment:
        <omitted>
        <omitted>
    iamRoleStatements:
        - Effect: 'Allow'
          Action:
              - 'logs:*'
          Resource: '*'
        - Effect: Allow
          Action:
              - 's3:*'
          Resource: '*'
functions:
    # ensure a storage area w/create
    storage:
        handler: handlers/storage.create
        events:
            - http:
                  path: storage/{bucket}
                  method: get
                  cors: true
    # get an upload url for a storage area w/upload
    upload:
        handler: handlers/storage.upload
        events:
            - http:
                  path: upload/{bucket}
                  method: get
                  cors: true
    # get a download url for a storage area w/download
    download:
        handler: handlers/storage.download
        events:
            - http:
                  path: download/{bucket}
                  method: get
                  cors: true
package:
    include:
        - ./handlers/*
        - ./utils/*
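
The stage and region map to the ${opt:...} entries above, so deploying is the standard Serverless CLI call, e.g. serverless deploy --stage dev --region us-east-1.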

Here’s the handler file

'use strict';

import {
  configForAWS,
  reportResults,
  CallbackErrorResponse,
  CallbackSuccessWithText,
} from '../utils/aws-sdk-lib';

import {
  doEnsureBucket,
  doGetSignedUrl,
  doPutBucketCors,
} from '../utils/aws-sdk-lib-s3';

import { authenticateBusiness } from '../utils/jwtAuth';

// ----------------------------------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------------------------------

const gleanFromParameters = (parameters) => {
  const { environment = {}, context = {}, event = {} } = { ...parameters };

  const { invokedFunctionArn } = context;
  
  const [ ,,,region, account ] = invokedFunctionArn.split(':');
  environment.account = account; 
  environment.region = region; 

  const { pathParameters } = event;
  const { bucket: bucketName = 'missing' } = (pathParameters || {});

  const { queryStringParameters } = event;
  const { key: fileName = 'no-file.missing', type: contentType = 'missing', secs: duration = '15' } = (queryStringParameters || {})

  return { ...parameters, bucketName, fileName, contentType: `image/${contentType}`, duration };
};

// ----------------------------------------------------------------------------------------------------------------
const buildS3HeadBucketName = parameters => {
  const { bucketName } = parameters;
  return bucketName;
}

// ----------------------------------------------------------------------------------------------------------------
const handleCreateStorageRequests = async (event = {}, context = {}) => {

  return new Promise(async (resolve, reject) => {

    let parameters = {};
    const environment = { deepFail: true, deepResults: true, deepDebug: true };
    try {
      console.info(`Lambda.handler event`, JSON.stringify(event));
      console.info(`Lambda.handler context`, JSON.stringify(context));

      parameters = await configForAWS({ environment, event, context, gleanFromParameters });
      console.info(`running environment`, environment);
      console.info(`starting parameters`, parameters);

      const buildS3CreateBucketName = buildS3HeadBucketName
      const buildS3VersioningBucketName = buildS3HeadBucketName
      parameters = { ...parameters, buildS3HeadBucketName, buildS3CreateBucketName, buildS3VersioningBucketName };
      parameters = await doEnsureBucket(parameters);

      const buildS3CorsBucketName = buildS3HeadBucketName
      const buildS3CorsBucketConfig = () => ({
        CORSRules: [{
          AllowedOrigins: ['*'],
          AllowedMethods: ['PUT', 'GET'],
          AllowedHeaders: ['*']
        }]
      })
      parameters = { ...parameters, buildS3CorsBucketName, buildS3CorsBucketConfig };
      parameters = await doPutBucketCors(parameters);

      parameters = await reportResults(parameters);
      resolve(parameters);

    } catch (err) {
      console.error(`handleCreateStorageRequests threw: ${err.message || err.error}`);
      if (environment.deepFail) {
        console.debug(`with deepFail, parameters:`, JSON.stringify(parameters, null, 2));
      }

      reject(err);
    }
  });

};

// ----------------------------------------------------------------------------------------------------------------
const handleUploadStorageRequests = async (event = {}, context = {}) => {

  return new Promise(async (resolve, reject) => {

    let parameters = {};
    const environment = { deepFail: true, deepResults: true, deepDebug: true };
    try {
      console.info(`Lambda.handler event`, JSON.stringify(event));
      console.info(`Lambda.handler context`, JSON.stringify(context));

      parameters = await configForAWS({ environment, event, context, gleanFromParameters });
      console.info(`running environment`, environment);
      console.info(`starting parameters`, parameters);

      const buildS3GetSignedUrlParams = (parameters) => {
        const { bucketName, fileName, contentType, duration } = parameters
        return {
          Key: fileName,
          Bucket: bucketName,
          Expires: +duration,
          ContentType: contentType,
          ContentEncoding: 'base64',
        }
      }
      const buildS3GetSignedUrlOperation = (parameters) => `putObject`
      parameters = { ...parameters, buildS3GetSignedUrlOperation, buildS3GetSignedUrlParams };
      parameters = await doGetSignedUrl(parameters);

      parameters = await reportResults(parameters);
      resolve(parameters);

    } catch (err) {
      console.error(`handleUploadStorageRequests threw: ${err.message || err.error}`);
      if (environment.deepFail) {
        console.debug(`with deepFail, parameters:`, JSON.stringify(parameters, null, 2));
      }

      reject(err);
    }
  });

};

// ----------------------------------------------------------------------------------------------------------------
const handleDownloadStorageRequests = async (event = {}, context = {}) => {

  return new Promise(async (resolve, reject) => {

    let parameters = {};
    const environment = { deepFail: true, deepResults: true, deepDebug: true };
    try {
      console.info(`Lambda.handler event`, JSON.stringify(event));
      console.info(`Lambda.handler context`, JSON.stringify(context));

      parameters = await configForAWS({ environment, event, context, gleanFromParameters });
      console.info(`running environment`, environment);
      console.info(`starting parameters`, parameters);

      const buildS3GetSignedUrlParams = (parameters) => {
        const { bucketName, fileName, duration } = parameters
        return {
          Expires: +duration,
          Bucket: bucketName,
          Key: fileName,
        }
      }
      const buildS3GetSignedUrlOperation = (parameters) => `getObject`
      parameters = { ...parameters, buildS3GetSignedUrlOperation, buildS3GetSignedUrlParams };
      parameters = await doGetSignedUrl(parameters);

      parameters = await reportResults(parameters);
      resolve(parameters);

    } catch (err) {
      console.error(`handleDownloadStorageRequests threw: ${err.message || err.error}`);
      if (environment.deepFail) {
        console.debug(`with deepFail, parameters:`, JSON.stringify(parameters, null, 2));
      }

      reject(err);
    }
  });

};

// ----------------------------------------------------------------------------------------------------------------
const create = async (event, context, callback) => {
  try {
    if (await authenticateBusiness(event, callback)) {
      const parameters = await handleCreateStorageRequests(event, context);
      const { bucketName } = parameters as any;
      CallbackSuccessWithText(callback, JSON.stringify({ bucketName }))
    }
  } catch (err) {
    CallbackErrorResponse({ err, fnName: `handler(get/storage/{bucket})`, callback });
  }
}

// ----------------------------------------------------------------------------------------------------------------
const upload = async (event, context, callback) => {
  try {
    if (await authenticateBusiness(event, callback)) {
      const parameters = await handleUploadStorageRequests(event, context);
      const { doGetSignedUrlResult } = parameters as any;
      CallbackSuccessWithText(callback, JSON.stringify({ doGetSignedUrlResult }))
    }
  } catch (err) {
    CallbackErrorResponse({ err, fnName: `handler(get/upload/{bucket})`, callback });
  }
}

// ----------------------------------------------------------------------------------------------------------------
const download = async (event, context, callback) => {
  try {
    if (await authenticateBusiness(event, callback)) {
      const parameters = await handleDownloadStorageRequests(event, context);
      const { doGetSignedUrlResult } = parameters as any;
      CallbackSuccessWithText(callback, JSON.stringify({ doGetSignedUrlResult }))
    }
  } catch (err) {
    CallbackErrorResponse({ err, fnName: `handler(get/download/{bucket})`, callback });
  }
}

// ----------------------------------------------------------------------------------------------------------------
module.exports = { 
    create, 
    upload, 
    download 
};
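
For reference, calling the deployed endpoints looks roughly like this – the host is a placeholder, and the paths/query params match the serverless.yml and gleanFromParameters above:

// placeholder host; 'type' is the image subtype, 'secs' the expiry
const base = 'https://<api-id>.execute-api.us-east-1.amazonaws.com/dev'
const auth = { Authorization: `Bearer ${process.env.bearerToken}` }

// ask for a signed upload URL
const { doGetSignedUrlResult } = await fetch(
  `${base}/upload/my-bucket?key=photo-1.png&type=png&secs=60`,
  { headers: auth }
).then((res) => res.json())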