
@ahartzog
Created June 2, 2021 18:58
Alek's Multipart Upload
import apiFetch from 'modules/api-fetch';
import Toast from 'modules/toast/index.native';
import Sentry from 'modules/sentry';

// 5 MiB – the minimum part size S3 allows for multipart uploads
const FILE_CHUNK_SIZE = 5242880;
const uploadMultipart = async (
  file: string,
  setIsUploading = (set: boolean) => {},
  setUploadProgress = (set: number) => {},
  destinationBucket = 'ClientFileBucket',
): Promise<string> => {
  const parts = file.split('.');
  const fileExtension = parts[parts.length - 1].toLowerCase();
  setIsUploading(true);
  try {
    Sentry.addBreadcrumb({
      message: 'Beginning multipart upload',
      category: 'multipartUpload',
    });

    // Load the file into a blob (note: this pulls the whole file into memory)
    const fetchedFile = await fetch(file);
    const blob = await fetchedFile.blob();
    Sentry.addBreadcrumb({
      message: 'Fetched file successfully',
      category: 'multipartUpload',
    });

    const numParts = Math.ceil(blob.size / FILE_CHUNK_SIZE);
    // Ask the backend for one upload URL per part
    const result = await apiFetch({
      url: 'Client/GetMultipartUploadUrls',
      method: 'GET',
      query: { fileExtension, destinationBucket, parts: numParts },
    });
    const {
      uploadUrls,
      uploadId,
      fullBucketName,
      filename,
    }: {
      uploadUrls: string[];
      uploadId: string;
      fullBucketName: string;
      filename: string;
    } = result as any;
    Sentry.addBreadcrumb({
      message: 'Retrieved multipart upload URLs',
      category: 'multipartUpload',
    });
    // Build one deferred PUT per part; the last part takes the remainder
    const apiCallsList = uploadUrls.map((url, index) => {
      const start = index * FILE_CHUNK_SIZE;
      const end = (index + 1) * FILE_CHUNK_SIZE;
      const blobPart =
        index < uploadUrls.length - 1
          ? blob.slice(start, end)
          : blob.slice(start);
      const promise = () =>
        fetch(url, {
          method: 'PUT',
          body: blobPart,
        });
      return promise;
    });
    const uploadResults: Response[] = [];
    let retryCount = 0;

    // Upload parts sequentially, retrying each part up to 3 times
    // with an increasing delay before giving up
    const uploadSingleChunk = async (index: number) => {
      Sentry.addBreadcrumb({
        message: `Executing upload single chunk for index #${index}, retry #${retryCount}`,
        category: 'multipartUpload',
      });
      if (index >= apiCallsList.length) {
        // Await so completion errors propagate and we don't resolve early
        await fireAllPartUploadsCompleted();
        return;
      }
      setUploadProgress(index / apiCallsList.length);
      try {
        const callResult = await apiCallsList[index]();
        retryCount = 0;
        uploadResults.push(callResult);
        await uploadSingleChunk(index + 1);
      } catch (e) {
        if (retryCount < 3) {
          Sentry.addBreadcrumb({
            message: `Retrying for index #${index} and retry number ${retryCount}`,
            category: 'multipartUpload',
          });
          retryCount += 1;
          const delay = (retryCount + 1) * 2000;
          console.devLog.verbose(`Delaying: ${delay}`);
          await new Promise((resolve) => setTimeout(resolve, delay));
          await uploadSingleChunk(index);
        } else {
          Sentry.addBreadcrumb({
            message: `No more retries, index #${index}`,
            category: 'multipartUpload',
          });
          Sentry.captureException(e);
          throw new Error('File upload failed.');
        }
      }
    };
    // Tell the backend every part is uploaded so it can finalize the file
    const fireAllPartUploadsCompleted = async () => {
      Sentry.addBreadcrumb({
        message: 'Firing all parts uploaded',
        category: 'multipartUpload',
      });
      // S3 needs each part's ETag to assemble the object; this reaches into
      // React Native's fetch Headers internals (`map`) to read it
      const resParts = uploadResults.map((part, index) => ({
        ETag: (part as any).headers.map.etag,
        PartNumber: index + 1,
      }));
      await apiFetch({
        url: 'Client/CompleteMultipartUpload',
        method: 'POST',
        query: {
          uploadId,
          parts: resParts,
          bucketName: fullBucketName,
          guid: filename,
        },
        body: resParts,
      });
      Sentry.addBreadcrumb({
        message: 'Multipart upload completed, HUZZAH!',
        category: 'multipartUpload',
      });
    };

    await uploadSingleChunk(0);
    return filename;
  } catch (e) {
    console.devLog.info(e);
    console.log('Error setting up somewhere in the multipart higher chain');
    Toast.error('Error uploading file', {
      position: 'bottom',
      duration: 15000,
    });
    Sentry.captureException(e);
    throw e;
  } finally {
    setUploadProgress(0);
    setIsUploading(false);
  }
};

export { uploadMultipart };
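
For context, the two backend endpoints the client calls are not included in the gist. A minimal sketch of what they might look like with AWS SDK v3 follows; the function names and response shapes are inferred from the client code above, not taken from the author's actual implementation.

import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
} from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
import { randomUUID } from 'crypto';

const s3 = new S3Client({});

// GET Client/GetMultipartUploadUrls (hypothetical implementation)
const getMultipartUploadUrls = async (
  fileExtension: string,
  bucket: string,
  parts: number,
) => {
  const filename = `${randomUUID()}.${fileExtension}`;
  const { UploadId } = await s3.send(
    new CreateMultipartUploadCommand({ Bucket: bucket, Key: filename }),
  );
  if (!UploadId) throw new Error('Failed to start multipart upload');
  // One presigned PUT URL per part; S3 part numbers are 1-based
  const uploadUrls = await Promise.all(
    Array.from({ length: parts }, (_, i) =>
      getSignedUrl(
        s3,
        new UploadPartCommand({
          Bucket: bucket,
          Key: filename,
          UploadId,
          PartNumber: i + 1,
        }),
      ),
    ),
  );
  return { uploadUrls, uploadId: UploadId, fullBucketName: bucket, filename };
};

// POST Client/CompleteMultipartUpload (hypothetical implementation)
const completeMultipartUpload = async (
  bucketName: string,
  guid: string,
  uploadId: string,
  parts: { ETag: string; PartNumber: number }[],
) =>
  s3.send(
    new CompleteMultipartUploadCommand({
      Bucket: bucketName,
      Key: guid,
      UploadId: uploadId,
      MultipartUpload: { Parts: parts },
    }),
  );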
@1mike12 commented Jan 12, 2024

This isn't actually a good way to deal with uploading large files, contrary to what the original article suggests. You will still run into problems with files that are multiple GB in size, since the blob call loads everything into memory at once:
const blob = await fetchedFile.blob()

@ahartzog (Author)
The upload portion would still be correct though, yeah? You could fetch the file in pieces if desired.
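
For reference, reading the file in pieces on React Native might look roughly like the untested sketch below. It assumes expo-file-system (whose readAsStringAsync accepts position and length options when reading Base64) plus the buffer polyfill, and reuses FILE_CHUNK_SIZE from the gist; each entry in apiCallsList would then PUT the result of readChunk(file, index) instead of a blob slice. Note that typed-array fetch bodies are not supported on every React Native version.

import * as FileSystem from 'expo-file-system';
import { Buffer } from 'buffer';

// Hypothetical chunked reader: pulls one 5 MiB part into memory at a time
// instead of blob()-ing the whole file
const readChunk = async (
  fileUri: string,
  index: number,
): Promise<Uint8Array> => {
  const base64 = await FileSystem.readAsStringAsync(fileUri, {
    encoding: FileSystem.EncodingType.Base64,
    position: index * FILE_CHUNK_SIZE, // byte offset of this part
    length: FILE_CHUNK_SIZE, // the final part just reads what's left
  });
  return Buffer.from(base64, 'base64');
};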

@kveerapen
@ahartzog this looks great, do you know if this logic will continue to run if the app is backgrounded during the upload?
