I was actually interested in trying this myself, so here is how it went. [Spoiler alert]: it did not work.

I used this repo to create a function. The first issue was getting things zipped (to commit to GitHub under the LFS size limit) and unzipped during the build so they could be copied into the deploy function folder with their dependencies. All good so far, or at least I thought so. Then it turned out the bundle created to deploy to AWS is too large, due to the size of the dependencies and of headless_chromium itself.

If I really needed this, I would just do it on AWS Lambda rather than have Netlify's Lambda functions handle it. The build and transfer would add too much overhead and time to my builds anyway.

Lambda functions are new to me, since I was only exposed to them through Gatsby and Netlify. I have read a lot about people trying WebGL with AWS Lambda, and it seems hopeful in that direction. Speaking of building these functions, why must they build each time? I was also thinking this might get out of hand. It seems silly to build on every deploy: for me this only needs to be built once, since it will rarely, if ever, change.

It would be nice if it didn't have to rebuild the entire project. For instance, if it just kept a cached version of the functions that are going to be exactly the same, it shouldn't need to rebuild all that.

The other option for me is to build a Node server on DigitalOcean. At least then I'd have full control, but at that point I might want to just move my entire project over to DigitalOcean. And Netlify does such a good job of handling the building.

Well, I had not checked out AWS until now, but their free tier might be all I need. It's crazy what is available these days. I'm sure AWS will solve the WebGL problem, which will keep Netlify focused and rebuild times extremely speedy.

To be clear, there is no requirement to build functions every time, because they can be submitted pre-built in their target folder.
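For anyone reading later: the relevant setting is just the functions folder in `netlify.toml`; whatever you put there is what gets deployed. A minimal sketch (folder name is an example):

```toml
# netlify.toml — point Netlify at a folder of already-built functions.
[build]
  functions = "functions"
```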

The issue with the solution I mentioned above was the size of the zipped-up assets that needed pre-handling, which forced a pre-process on every build to make sure nothing had changed. Sorry if I confused the issue… This just happens to be one of those times where AWS is a better place to build out the functions.