How to increase Cloud Run RAM

I have a Cloud Run service that runs an nginx + PHP backend for compressing files. The problem is that I need to pull files from GCS to local storage before compressing them. Once all the files are downloaded, I use 7za to compress them and then upload the archive back to GCS. The issue is that the maximum RAM for Cloud Run is 32 GB, but I might need to compress up to 50 GB of files. If I need more space to temporarily store these 50 GB, what can I do?
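Roughly, the current workflow looks like this (bucket and path names below are just placeholders):

# Placeholder bucket/paths; the general shape of what the service does today.
gsutil -m cp -r gs://my-bucket/folderA /tmp/work/        # 1. pull the files from GCS to local storage
7za a /tmp/work/archive.7z /tmp/work/folderA             # 2. compress everything with 7za
gsutil cp /tmp/work/archive.7z gs://my-bucket/archives/  # 3. upload the archive back to GCS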


Hello @roceil, welcome to the Google Cloud Community.

Yep, there is no way to get more than 32 GB of memory on Cloud Run. However, have you tried either splitting the data into smaller parts and processing the smaller files, or mounting a Cloud Storage bucket as a Cloud Run volume? According to the documentation:

"Mounting the bucket as a volume in Cloud Run presents the bucket content as files in the container file system, which allows use of standard file system operations and libraries to access that file system" 

Documentation URL: https://cloud.google.com/run/docs/configuring/services/cloud-storage-volume-mounts
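As a rough sketch (service, volume, and bucket names below are placeholders, and I believe volume mounts need the second-generation execution environment; please check the documentation above for the exact flags):

# Placeholder names; mounts the bucket into the container via Cloud Storage FUSE.
gcloud run services update my-compress-service \
  --execution-environment=gen2 \
  --add-volume=name=gcs-data,type=cloud-storage,bucket=my-bucket \
  --add-volume-mount=volume=gcs-data,mount-path=/mnt/gcs

Note that the mount gives you file-system access to the bucket, but it is not extra local disk.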

--
cheers,
DamianS

Thank you for your response. I have tried adding a 50 GB in-memory disk in the disk settings and mounting it at the /app/public path. However, once the size of the downloaded files reaches 32 GB, Cloud Run returns an error.

Hmm, if you can't split those files into smaller parts, the only idea I see is to use a VM and perform those operations (downloading, compressing, uploading) on a GCE VM; Cloud Run could trigger that operation. Keep in mind that an in-memory disk counts against the instance's memory limit, which is why you hit the error at 32 GB.
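For example, a Cloud Run handler could launch a one-shot VM that does the whole job. A rough sketch (every name, zone, machine type, and disk size below is a placeholder, just to show the idea):

# Placeholder names throughout; a throwaway VM with enough disk for the job.
gcloud compute instances create compress-job-1 \
  --zone=us-central1-a \
  --machine-type=e2-standard-4 \
  --boot-disk-size=200GB \
  --scopes=storage-rw \
  --metadata=startup-script='#!/bin/bash
set -e
apt-get update && apt-get install -y p7zip-full
mkdir -p /work && cd /work
gsutil -m cp -r gs://my-bucket/folderA .        # download
7za a folderA.7z folderA                        # compress
gsutil cp folderA.7z gs://my-bucket/archives/   # upload
shutdown -h now'                                # stop the VM when finished

Your Cloud Run service could kick this off through the Compute Engine API (or gcloud) and then watch for the result object in GCS.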

Hi roceil, I'd like to know: are your individual files >32 GB, or is the total size across all files >32 GB?

If it's >32GB, does your compression library require that the entire file be read into memory at the same time? If we supported streaming reads and writes to GCS, would that address the issue?
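To make the streaming idea concrete, here is a rough sketch with today's tooling (placeholder names; it assumes a streamable format like tar.gz would be acceptable instead of .7z, and that the bucket is mounted as a volume at, say, /mnt/gcs as DamianS described):

# Placeholder bucket/paths; streams the archive straight back to GCS, so the
# full data set never has to sit in local storage or memory all at once.
tar -czf - -C /mnt/gcs/folderA . | gsutil cp - gs://my-bucket/archives/folderA.tar.gz

This only helps if the compressor can work as a pipe; as far as I know, 7za can't stream a .7z archive to stdout, which is why the sketch switches to tar.gz.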

DamianS is correct in recommending a VM for this use case when the files exceed 32 GB; sadly, Cloud Run cannot help you here today. I'm trying to figure out what improvements we need to make to change that.

 

Thank you for your reply. Currently, the total size of all the files exceeds 25 GB. What I do now is first copy the files from folder A in GCS to folder B, and then use the 7za command to compress folder B. However, I've noticed that compressing folder B seems to trigger a download operation, which makes Cloud Run run out of memory and crashes the whole service. I wonder how this can be handled better, or whether I should consider changing the architecture?
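Concretely, the sequence is roughly this (placeholder names; the folder B path is wherever the container sees that copy):

# Placeholder names; the current sequence as described above.
gsutil -m cp -r gs://my-bucket/folderA gs://my-bucket/folderB    # copy folder A to folder B within GCS
7za a /tmp/folderB.7z /path/to/folderB                           # compressing folder B is where the download seems to happen
gsutil cp /tmp/folderB.7z gs://my-bucket/archives/               # upload the result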