How to upload big files (2GB+) to a .NET Core API controller from a form?
When uploading a large file via Postman (I hit the same problem from a frontend form written in PHP), I get a 502 Bad Gateway error from my Azure Web App:
502 - Web server received an invalid response while acting as a
gateway or proxy server. There is a problem with the page you are
looking for, and it cannot be displayed. When the Web server (while
acting as a gateway or proxy) contacted the upstream content server,
it received an invalid response from the content server.
The error I see in Azure Application Insights:
Microsoft.AspNetCore.Connections.ConnectionResetException: The client
has disconnected <--- An operation was attempted on a nonexistent network connection. (Exception from HRESULT: 0x800704CD)
This happens when trying to upload a 2GB test file. With a 1GB file it works fine, but it needs to go up to ~5GB.
I have already optimized the part that writes the file stream to Azure Blob Storage to use a block-write approach (adapted from https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/azure-blob-storage-part-4-uploading-large-blobs/), but to me it looks like the connection to the client (Postman, in this case) is being closed: since this is a single HTTP POST request, the underlying Azure network stack (e.g. the load balancer) apparently closes the connection because it takes too long until my API returns HTTP 200 OK for that POST.
Is my assumption correct? If so, how can I make the upload from my frontend (or Postman) happen in chunks (of, say, 15MB), so the API can acknowledge each chunk much faster than the whole 2GB? Even creating a SAS URL for uploading to the Azure blob and returning that URL to the browser would be fine, but I'm not sure how to integrate that easily. AFAIK there is also a maximum block size, so for 2GB I would probably have to create multiple blocks. If that is the recommendation, it would be great to get a good sample here, but other ideas are welcome too!
Here is the relevant part of my API controller endpoint in C# .NET Core 2.2:
```csharp
[AllowAnonymous]
[HttpPost("DoPost")]
public async Task<IActionResult> InsertFile([FromForm]List<IFormFile> files, [FromForm]string msgTxt)
{
    ...
    // use generated container name
    CloudBlobContainer container = blobClient.GetContainerReference(SqlInsertId);

    // create container within blob
    if (await container.CreateIfNotExistsAsync())
    {
        await container.SetPermissionsAsync(
            new BlobContainerPermissions
            {
                // PublicAccess = BlobContainerPublicAccessType.Blob
                PublicAccess = BlobContainerPublicAccessType.Off
            }
        );
    }

    // loop through all files for upload
    foreach (var asset in files)
    {
        if (asset.Length > 0)
        {
            // replace invalid chars in filename
            CleanFileName = String.Empty;
            CleanFileName = Utils.ReplaceInvalidChars(asset.FileName);

            // get name and upload file
            CloudBlockBlob blockBlob = container.GetBlockBlobReference(CleanFileName);

            // START of block write approach
            //int blockSize = 256 * 1024; //256 kb
            //int blockSize = 4096 * 1024; //4MB
            int blockSize = 15360 * 1024; //15MB

            using (Stream inputStream = asset.OpenReadStream())
            {
                long fileSize = inputStream.Length;

                //block count is the number of blocks + 1 for the last one
                int blockCount = (int)((float)fileSize / (float)blockSize) + 1;

                //List of block ids; the blocks will be committed in the order of this list
                List<string> blockIDs = new List<string>();

                //starting block number - 1
                int blockNumber = 0;

                try
                {
                    int bytesRead = 0;         //number of bytes read so far
                    long bytesLeft = fileSize; //number of bytes left to read and upload

                    //do until all of the bytes are uploaded
                    while (bytesLeft > 0)
                    {
                        blockNumber++;
                        int bytesToRead;
                        if (bytesLeft >= blockSize)
                        {
                            //more than one block left, so put up another whole block
                            bytesToRead = blockSize;
                        }
                        else
                        {
                            //less than one block left, read the rest of it
                            bytesToRead = (int)bytesLeft;
                        }

                        //create a blockID from the block number, add it to the block ID list
                        //the block ID is a base64 string
                        string blockId = Convert.ToBase64String(ASCIIEncoding.ASCII.GetBytes(
                            string.Format("BlockId{0}", blockNumber.ToString("0000000"))));
                        blockIDs.Add(blockId);

                        //set up new buffer with the right size, and read that many bytes into it
                        byte[] bytes = new byte[bytesToRead];
                        inputStream.Read(bytes, 0, bytesToRead);

                        //calculate the MD5 hash of the byte array
                        string blockHash = Utils.GetMD5HashFromStream(bytes);

                        //upload the block, provide the hash so Azure can verify it
                        blockBlob.PutBlock(blockId, new MemoryStream(bytes), blockHash);

                        //increment/decrement counters
                        bytesRead += bytesToRead;
                        bytesLeft -= bytesToRead;
                    }

                    //commit the blocks
                    blockBlob.PutBlockList(blockIDs);
                }
                catch (Exception ex)
                {
                    System.Diagnostics.Debug.Print("Exception thrown = {0}", ex);
                    // return BadRequest(ex.StackTrace);
                }
            }
            // END of block write approach
    ...
```
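As a standalone sanity check on the block arithmetic and block-ID scheme used above (JavaScript here purely so it runs without the Azure dependencies; the function names are illustrative):

```javascript
// Number of blocks needed for a file. Note the C# "+ 1" float version above
// over-counts by one when fileSize is an exact multiple of blockSize
// (harmless there, since the loop is driven by bytesLeft, not blockCount).
function blockCount(fileSize, blockSize) {
  return Math.ceil(fileSize / blockSize);
}

// Base64 block ID: Azure requires every block ID within a blob to have the
// same encoded length, which is why the block number is zero-padded.
function makeBlockId(blockNumber) {
  const padded = String(blockNumber).padStart(7, '0');
  return Buffer.from('BlockId' + padded, 'ascii').toString('base64');
}

const MiB = 1024 * 1024;
console.log(blockCount(2 * 1024 * MiB, 15 * MiB)); // 137 blocks for a 2 GiB file
console.log(makeBlockId(1));                       // "QmxvY2tJZDAwMDAwMDE="
```

With 15MB blocks, even a 5GB file stays far below the 50,000-blocks-per-blob limit, so the block count itself is not the problem here.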
Here is an example of the HTTP POST via Postman:
I have already set maxAllowedContentLength.
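For reference, that IIS-level cap sits in web.config (a minimal fragment; the value is in bytes, and 4294967295 is the maximum this attribute accepts). On ASP.NET Core, Kestrel's own limit also applies (`[RequestSizeLimit]` on the action, or `Limits.MaxRequestBodySize`), and raising both still does not stop the gateway from cutting off a long-running POST:

```xml
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- ~4 GB, the maximum allowed value for this uint attribute -->
        <requestLimits maxAllowedContentLength="4294967295" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```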
If you want to upload a large blob to Azure Storage, getting a SAS token from your backend and uploading the file directly from the client is the better option, since it adds no load to your backend. You can use the code below to get a SAS token with write-only permission, valid for 2 hours, for your client:
```csharp
var containerName = "<container name>";
var accountName = "<storage account name>";
var key = "<storage account key>";
var cred = new StorageCredentials(accountName, key);
var account = new CloudStorageAccount(cred, true);
var container = account.CreateCloudBlobClient().GetContainerReference(containerName);

var writeOnlyPolicy = new SharedAccessBlobPolicy()
{
    SharedAccessStartTime = DateTime.Now,
    SharedAccessExpiryTime = DateTime.Now.AddHours(2),
    Permissions = SharedAccessBlobPermissions.Write
};

var sas = container.GetSharedAccessSignature(writeOnlyPolicy);
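On the client it can be handy to sanity-check the token before use. The sketch below (the `parseSas` helper is illustrative, not part of any SDK) pulls the expiry (`se`) and permissions (`sp`) fields out of a SAS query string shaped like the one `GetSharedAccessSignature` returns above:

```javascript
// Illustrative helper (not part of any SDK): decode the expiry ("se") and
// permissions ("sp") fields of a SAS query string before using it.
function parseSas(sas) {
  const params = new URLSearchParams(sas.replace(/^\?/, ''));
  return {
    expiry: new Date(params.get('se')), // when the token stops working
    permissions: params.get('sp'),      // "w" = write-only
  };
}

// Example token (signature redacted) matching the write-only policy above.
const sas = '?sv=2018-11-09&sr=c&sig=XXXX&st=2020-01-27T03%3A58%3A20Z' +
            '&se=2020-01-28T03%3A58%3A20Z&sp=w';
console.log(parseSas(sas).permissions); // "w"
```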
Once you have this SAS token, you can use it to upload files with the Storage JS SDK on the client. Here is an HTML sample:
```html
<!DOCTYPE html>
<html>
<head>
    <title>upload demo</title>
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
    <script src="./azure-storage-blob.min.js"></script>
</head>
<body>
    <form method="post" action="" enctype="multipart/form-data" id="myform">
        <input type="file" id="file" name="file" />
        <input type="button" class="button" value="Upload" id="but_upload">
    </form>
    <div id="status"></div>
    <script type="text/javascript">
        $(document).ready(function() {
            var sasToken = '?sv=2018-11-09&sr=c&sig=XXXXXXXXXXXXXXXXXXXXXXXXXOuqHSrH0Fo%3D&st=2020-01-27T03%3A58%3A20Z&se=2020-01-28T03%3A58%3A20Z&sp=w'
            var containerURL = 'https://stanstroage.blob.core.windows.net/container1/'

            $("#but_upload").click(function() {
                var file = $('#file')[0].files[0];
                const container = new azblob.ContainerURL(
                    containerURL + sasToken,
                    azblob.StorageURL.newPipeline(new azblob.AnonymousCredential()));
                try {
                    $("#status").wrapInner("uploading .... pls wait");
                    const blockBlobURL = azblob.BlockBlobURL.fromContainerURL(container, file.name);
                    var result = azblob.uploadBrowserDataToBlockBlob(
                        azblob.Aborter.none, file, blockBlobURL);

                    result.then(function(result) {
                        document.getElementById("status").innerHTML = "Done"
                    }, function(err) {
                        document.getElementById("status").innerHTML = "Error"
                        console.log(err);
                    });
                } catch (error) {
                    console.log(error);
                }
            });
        });
    </script>
</body>
</html>
```
I tested uploading a 3.6GB .zip file; it took 20 minutes and worked perfectly for me. The SDK opens multiple threads and uploads your large file in parts.
Note: in this case, make sure you have enabled CORS for your storage account, so that the static HTML page can post requests to the Azure Storage service.
Hope this helps.