It seems amazing to me that Microsoft did such a poor job of giving developers a good way to upload large documents to Microsoft SharePoint Online (aka Office 365 or SharePoint 365). Ideally I would like to use the
Microsoft.SharePoint.Client.File.SaveBinaryDirect
method that is part of the CSOM (Client Side Object Model), but it does not work with SharePoint Online and only seems to work on SharePoint 2010 (probably 2013 as well, but I have not tested it). I did get
Microsoft.SharePoint.Client.File uploadedFile = docLib.RootFolder.Files.Add(newFileFromComputer);
to work on SharePoint Online, but it is not very useful because it is limited to files of about 3MB in size.
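For reference, here is roughly what that CSOM approach looks like. This is a sketch, not code from this post: the library title, file path, and variable names are placeholders, and it assumes the same ClaimClientContext helper from the MSDN sample used further below.

using (ClientContext context = ClaimClientContext.GetAuthenticatedContext(webUrl))
{
    List docLib = context.Web.Lists.GetByTitle("Documents"); // placeholder library title
    FileCreationInformation newFileFromComputer = new FileCreationInformation
    {
        Content = System.IO.File.ReadAllBytes(@"C:\temp\report.docx"), // entire file goes into one CSOM payload
        Url = "report.docx",
        Overwrite = true
    };
    Microsoft.SharePoint.Client.File uploadedFile = docLib.RootFolder.Files.Add(newFileFromComputer);
    context.Load(uploadedFile);
    context.ExecuteQuery(); // fails once the payload exceeds the service's request size limit (about 3MB)
}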
After lots of trial and error, I figured out that the most reliable way to do it is using a standard PUT request and passing the claims authentication cookies that are required to make almost any request to SharePoint Online. This works well unless you are debugging and still have the default ContextSwitchDeadlock warning (a managed debugging assistant) enabled. In that case, for large files that take more than 60 seconds to upload, you will get a message similar to this:
The CLR has been unable to transition from COM context 0x1fe458 to COM context 0x1fe5c8 for 60 seconds. The thread that owns the destination context/apartment is most likely either doing a non pumping wait or processing a very long running operation without pumping Windows messages. This situation generally has a negative performance impact and may even lead to the application becoming non responsive or memory usage accumulating continually over time. To avoid this problem, all single threaded apartment (STA) threads should use pumping wait primitives (such as CoWaitForMultipleHandles) and routinely pump messages during long running operations.
Depending on what you are doing, this may be okay. In my case, I don't care if my command line application waits for a long request to finish. The warning is triggered because the application's Main method is marked with the STA threading attribute.
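For context, this is the attribute in question; as far as I can tell, the claims-auth sample needs an STA thread because its login dialog hosts a WebBrowser control:

[STAThread] // required for the WinForms login dialog; also what triggers the
            // ContextSwitchDeadlock warning under the debugger during long uploads
static void Main(string[] args)
{
    // ... long-running synchronous upload calls go here ...
}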
Below are two methods (one is just an overload that calls the other) that upload a document to a document library in SharePoint Online. They should also work on SharePoint 2010+ if you replace the ClaimClientContext with the standard ClientContext that on-premises SharePoint uses; I'll leave that for you to try on your own for now. You'll notice I have also added functionality to change the created and modified dates of the files after they are uploaded. For details and source code, click here. Or if you just want to understand more about claims-based authentication, sample code, options, etc., definitely check here. Lastly, the keyValues parameter is a dictionary whose keys are the actual internal field names (see CAML field names) and whose values are the values to set for those fields; a usage example follows the code below.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.SharePoint.Client;
using MSDN.Samples.ClaimsAuth;
using System.Xml.Linq;
using System.Net;
using System.IO;
...
public static void AddDocument(string fileToUpload, string webUrl, string docLibInternalName, string docLibUIName, string documentSetName, string filenameToSaveAs, DateTime? createdDate, DateTime? modifiedDate, Dictionary<string, object> keyValues, int timeoutInMilliseconds)
{
    string urlToSaveAs = webUrl + "/" + docLibInternalName + "/" + documentSetName + "/" + filenameToSaveAs;
    AddDocument(fileToUpload, webUrl, urlToSaveAs, createdDate, modifiedDate, keyValues, timeoutInMilliseconds);
}
// Uploads most any size file to SharePoint Online (O365) using claims authentication. It does NOT use CSOM; instead it uses a standard PUT request
// that has the cookies from the claims-based authentication added to it.
// This solution is based on http://stackoverflow.com/questions/15077305/uploading-large-files-to-sharepoint-365
// To get the claims authentication cookie, this solution requires: http://msdn.microsoft.com/en-us/library/hh147177.aspx#SPO_RA_Introduction
// or, if you want to get the cookies for claims authentication another way, you can use
// http://www.wictorwilen.se/Post/How-to-do-active-authentication-to-Office-365-and-SharePoint-Online.aspx
public static void AddDocument(string fileToUpload, string webUrl, string urlToSaveAs, DateTime? createdDate, DateTime? modifiedDate, Dictionary<string, object> keyValues, int timeoutInMilliseconds)
{
    // For example: byte[] data = System.IO.File.ReadAllBytes(@"C:\Users\me\Desktop\test.txt");
    byte[] data = System.IO.File.ReadAllBytes(fileToUpload);

    // get the cookies from the claims-based authentication and add them to the cookie container that we will then pass to the request
    CookieCollection cookies = ClaimClientContext.GetAuthenticatedCookies(webUrl, 200, 200);
    CookieContainer cookieContainer = new CookieContainer();
    cookieContainer.Add(cookies);

    // make a standard PUT request
    System.Net.ServicePointManager.Expect100Continue = false;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(urlToSaveAs);
    request.Method = "PUT";
    request.Accept = "*/*";
    request.ContentType = "multipart/form-data; charset=utf-8";
    request.CookieContainer = cookieContainer;
    request.AllowAutoRedirect = false;
    request.UserAgent = "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)";
    request.Headers.Add("Accept-Language", "en-us");
    request.Headers.Add("Translate", "F");
    request.Headers.Add("Cache-Control", "no-cache");
    request.ContentLength = data.Length;
    request.ReadWriteTimeout = timeoutInMilliseconds;
    request.Timeout = timeoutInMilliseconds;
    using (Stream req = request.GetRequestStream())
    {
        req.ReadTimeout = timeoutInMilliseconds;
        req.WriteTimeout = timeoutInMilliseconds;
        req.Write(data, 0, data.Length);
    }
    // get the response back (no catch block: a catch that does "throw ex;" would only reset the stack trace)
    HttpWebResponse response = null;
    try
    {
        response = (HttpWebResponse)request.GetResponse();
        Stream res = response.GetResponseStream();
        using (StreamReader rdr = new StreamReader(res))
        {
            string rawResponse = rdr.ReadToEnd();
        }
    }
    finally
    {
        if (response != null)
        {
            response.Close();
        }
    }
    // NOTE: the file that was uploaded is still checked out and must be checked in before it will be available to others.
    // ChangeCreatedModifiedInfo below includes a check-in call. If that call is removed for some reason,
    // a check-in call should be added here so that the file will be available to all (that have access).
    // NOTE: ChangeCreatedModifiedInfo also applies the keyValues passed in; that would need to be done here as well if the call is removed.
    UriBuilder urlBuilder = new UriBuilder(urlToSaveAs);
    string serverRelativeUrlToSaveAs = urlBuilder.Path;
    ChangeCreatedModifiedInfo(webUrl, serverRelativeUrlToSaveAs, createdDate, modifiedDate, keyValues);
}
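ChangeCreatedModifiedInfo is part of the full source linked above and is not shown here. Below is a minimal sketch of what it might look like, assuming the MSDN sample's ClaimClientContext.GetAuthenticatedContext helper; "Created" and "Modified" are the internal names of the built-in date fields.

// A minimal sketch, not the original implementation: sets Created/Modified and any
// extra field values from keyValues, then checks the file in so others can see it.
private static void ChangeCreatedModifiedInfo(string webUrl, string serverRelativeUrl, DateTime? createdDate, DateTime? modifiedDate, Dictionary<string, object> keyValues)
{
    using (ClientContext context = ClaimClientContext.GetAuthenticatedContext(webUrl))
    {
        Microsoft.SharePoint.Client.File file = context.Web.GetFileByServerRelativeUrl(serverRelativeUrl);
        ListItem item = file.ListItemAllFields;
        context.Load(item);
        context.ExecuteQuery();

        if (createdDate.HasValue)
        {
            item["Created"] = createdDate.Value;
        }
        if (modifiedDate.HasValue)
        {
            item["Modified"] = modifiedDate.Value;
        }
        if (keyValues != null)
        {
            foreach (KeyValuePair<string, object> kv in keyValues)
            {
                item[kv.Key] = kv.Value;
            }
        }
        item.Update();
        file.CheckIn("Uploaded programmatically", CheckinType.MajorCheckIn);
        context.ExecuteQuery();
    }
}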
…
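To tie it together, here is a hypothetical call to the first overload. Every path, URL, and field name below is made up for illustration, and the keyValues keys must be internal (CAML) field names.

Dictionary<string, object> keyValues = new Dictionary<string, object>
{
    { "Title", "Quarterly Report" },   // built-in Title field
    { "MyCustomColumn", "Some value" } // hypothetical custom field (internal name)
};

AddDocument(
    fileToUpload: @"C:\temp\QuarterlyReport.docx",
    webUrl: "https://contoso.sharepoint.com/teamsite",
    docLibInternalName: "Shared Documents",
    docLibUIName: "Documents",
    documentSetName: "2013 Reports",
    filenameToSaveAs: "QuarterlyReport.docx",
    createdDate: new DateTime(2013, 1, 15),
    modifiedDate: DateTime.Now,
    keyValues: keyValues,
    timeoutInMilliseconds: 30 * 60 * 1000); // generous timeout for large files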
7 comments:
Hi,
I am stuck in a similar scenario, where I am trying to give the user a seamless experience whether they use SharePoint 2013 or SharePoint Online.
The user needs to be able to upload some files using the client object model. Are you sure that this scenario cannot work?
I mean, cannot work with SaveBinaryDirect?
Hello,
this seems like exactly what I need. I've been trying to make it work in PowerShell against a SharePoint 2010 server.
I use my own way of getting the authentication cookie, and it works (I use it successfully for both uploading and downloading). I'm trying to implement your solution because of the file size limitations of the SharePoint client context.
So I created a web request as instructed, filled the request stream with file data (just 2 bytes), and sent the request. What I get back is 409 Conflict.
I also used Fiddler to analyze the resulting traffic, and it claims that my request has a conflict between the ContentLength property (set to 2) and the actual content, which is 0. However, when I view the request, the two characters from the file are there.
I'm using web requests successfully to get the authentication cookie. I'm at a loss as to what to try next. Any advice would be appreciated.
We are having no problem uploading files to SharePoint 2013 on sharepoint.com using SaveBinaryDirect.
We have absolutely uploaded files as large as 60GB.
Where we are having some challenges is getting a straight answer on some of the default settings.
For example, rumor has it that the timeout is set to 20 minutes on an upload and 250MB on the file size, so that if you can't upload your file in 20 minutes it will fail.
But I have nothing official from Microsoft to confirm this.
It's also unclear how the request timeout (defaults to 180000 milliseconds, CONFIRMED) and the SaveBinaryDirect timeout interact with each other.
I've tried the above code and it doesn't work for me :(
Do you know how to do this in PowerShell?
It's not a PowerShell script :(( Can you show us one, please?