Questions tagged [aws-lambda]

AWS Lambda is a compute service that lets you run code without the overhead of managing servers.

AWS Lambda is a serverless compute service from Amazon Web Services (AWS) that runs code in response to events and automatically manages the underlying infrastructure for you.
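As a quick illustration of the event-driven model described above, here is a minimal sketch of a Python handler. The event field and response shape are illustrative assumptions for an API Gateway-style invocation, not taken from any question below.

    import json

    # Minimal Lambda handler sketch: Lambda invokes the configured handler
    # with the triggering event (a dict) and a runtime context object.
    def lambda_handler(event, context):
        name = event.get("name", "world")  # assumed field, for illustration only
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"}),
        }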


6 questions
3 votes • 2 answers

sklearn and pandas in AWS Lambda

I made a front end from which I would like to make REST calls to an AWS Lambda function interfaced with AWS API Gateway. I dumped my model (and my encoders) as pickle files, which I initially trained locally. I then stored these files in an S3 bucket. The…
3nomis • 531 • 6 • 17
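The question above describes a common pattern: a scikit-learn model pickled locally, uploaded to S3, and served from a Lambda function behind API Gateway. A minimal sketch of such a handler, assuming hypothetical bucket/key names and request shape (and that scikit-learn and pandas are packaged via a layer or container image), might look like this:

    import io
    import json
    import pickle

    import boto3

    s3 = boto3.client("s3")
    _model = None  # cached at module scope so S3 is only read on a cold start


    def _load_model():
        global _model
        if _model is None:
            buf = io.BytesIO()
            # Bucket and key names are assumptions, for illustration only.
            s3.download_fileobj("my-model-bucket", "model.pkl", buf)
            buf.seek(0)
            _model = pickle.load(buf)
        return _model


    def lambda_handler(event, context):
        model = _load_model()
        features = json.loads(event["body"])["features"]  # assumed request shape
        prediction = model.predict([features])[0]
        return {"statusCode": 200, "body": json.dumps({"prediction": str(prediction)})}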
1 vote • 1 answer

Alternative to EC2 for running ML batch training jobs on AWS

We are building an ML pipeline on AWS, which will obviously require some heavy-compute components, including preprocessing and batch training. Most of the pipeline is on Lambda, but Lambda is known to have time limits on how long a job can be run…
1 vote • 1 answer

Deploying ML/Deep Learning on AWS Lambda for Long-Running Training, not just Inference

Serverless technology can be used to deploy ML models to production, since the deployment package sizes can be compressed if too large (or built from source with unneeded dependencies stripped). But there is also the use case of deploying ML for…
Cybernetic • 770 • 1 • 4 • 10
1 vote • 0 answers

How to speed up PyTorch model loading time

I have deployed a background-removal model (a PyTorch pre-trained U2-Net) on AWS using Lambda, an EFS file system, and API Gateway. I store my model on EFS and load it into the Lambda function; the model is around 170 MB. The API Gateway response time is…
suri • 11 • 1
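For the loading-time question above (a ~170 MB PyTorch model on EFS behind API Gateway), the usual first step is to load the model once at module scope, outside the handler, so warm invocations reuse it instead of re-reading it on every request. A hedged sketch, assuming a hypothetical EFS mount path and that the file is a fully serialized module rather than a bare state_dict:

    import torch

    # Assumed EFS access-point mount path and filename, for illustration only.
    MODEL_PATH = "/mnt/efs/u2net.pth"

    _model = None  # loaded once per execution environment, reused while warm


    def _get_model():
        global _model
        if _model is None:
            _model = torch.load(MODEL_PATH, map_location="cpu")
            _model.eval()
        return _model


    def lambda_handler(event, context):
        model = _get_model()
        # ... decode the input image from the event and run inference here ...
        return {"statusCode": 200, "body": "ok"}

Provisioned concurrency can additionally keep initialized execution environments around, so the first (slow) load happens off the request path.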
0 votes • 0 answers

Deploying a model with GPU and pay-per-inference

I may have the wrong Stack Exchange site. If that's the case, could someone point me to one that could help with this? Anyway, my backend employs a sentence transformer model from Hugging Face. Since the number of requests per day is small,…
0 votes • 0 answers

Hosting Models in a Serverless-like Format, AWS-specific but not exclusively

Similar question here: Deploy ML Model on AWS, but more detail is provided below. If I understand correctly, models such as LLaMA and Stable Diffusion are all types of artificial neural networks (ANNs). These ANNs all have different ways to interact…
timhc22 • 101