by Martin Laukkanen | Aug 5, 2013 | Apps, Development, How to, Project 2013
After a few busy weeks working on my first 100% JavaScript 2013 App (watch this space for more!!) I’ve come to realise that the MSDN documentation on JSOM and CSOM is still pretty sparse!
A couple of simple examples exist in the usual place (e.g. JSOM CreateProjects), but when you get to the details you’ll find a lot missing. For example, updating Custom Fields: if you look at the MSDN page covering PS.DraftProject, the method you need (draftProject.setCustomFieldValue()) is not even listed! (UPDATE 7/08: It is covered here, but with no detail; PS.Project.setCustomFieldValue)
JavaScript PS.DraftProject.setCustomFieldValue Method
Here’s the missing method definition that you’ll see when using PS.debug.js:
PS.DraftProject.setCustomFieldValue(FieldName, Value);
Hey wow, that simple hey? No, unfortunately the definition is a bit misleading; it’s easy to assume that FieldName refers to the custom field name used elsewhere, such as in the OData feeds, but in fact it refers to the InternalName from the PS.CustomField object.
An example InternalName is: Custom_a1737ae3b4fce211940b00155d000a03
So the first thing you need to do is get that name. It is just the field GUID (without the dashes) prefixed with “Custom_”, but I like to do things more dynamically, so I’ll use projContext.get_customFields(); to cache that information.
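In case you want to skip the lookup and build the name directly, here’s a minimal sketch of my own (not from the original post); it assumes the InternalName is simply “Custom_” plus the field GUID with braces and dashes stripped, exactly as in the example above:

// Hypothetical helper: build a custom field InternalName from its GUID.
// Assumption: InternalName = "Custom_" + GUID without braces or dashes.
function buildInternalName(fieldGuid) {
    return "Custom_" + fieldGuid.replace(/[{}-]/g, "").toLowerCase();
}

// buildInternalName("a1737ae3-b4fc-e211-940b-00155d000a03")
// returns "Custom_a1737ae3b4fce211940b00155d000a03"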
Example JavaScript update of Custom Field Value
Firstly, let’s get those InternalName values into an array for later use.
Cache the field details with a GetCustomFields Function
var projContext;
var customFields;
var customFieldData = [];

SP.SOD.executeOrDelayUntilScriptLoaded(GetCustomFields, "PS.js");

function GetCustomFields() {
    // Initialize the current client context and get the custom fields collection
    projContext = PS.ProjectContext.get_current();
    customFields = projContext.get_customFields();
    projContext.load(customFields);

    // Run the request on the server.
    projContext.executeQueryAsync(getCFComplete, getCFFailed);
}
function getCFComplete(response) {
    var cfEnumerator = customFields.getEnumerator();

    // Save the details of each CF for later
    while (cfEnumerator.moveNext()) {
        var cf = cfEnumerator.get_current();
        customFieldData.push({
            Id: cf.get_id(),
            Name: cf.get_name(),
            InternalName: cf.get_internalName()
        });
    }

    // Now update the project
    updateProject();
}

function getCFFailed(sender, args) {
    // Simple failure handler for the query above
    alert("Failed to load custom fields: " + args.get_message());
}
Note the last line there, updateProject(): as this is all asynchronous, you need to call the update only once the customFieldData array is ready.
Update the Project Custom Field Function
function updateProject() {
    var projectId = "9C585CC0-3FC0-4133-9F2A-1FB96587CF0D";

    // Get the project collection, then check out the project we want to edit
    var projects = projContext.get_projects();
    var project = projects.getById(projectId);
    var draftProject = project.checkOut();

    var fieldName = "My Custom Field";

    // Update custom field using the cached InternalName
    var cfData = $.grep(customFieldData, function (val) {
        return val.Name === fieldName;
    });
    if (cfData.length > 0) {
        draftProject.setCustomFieldValue(cfData[0].InternalName, "Some new value");
    }

    // Publish the change
    var publishJob = draftProject.publish(true);

    // Monitor the job
    projContext.waitForQueueAsync(publishJob, 30, function (response) {
        if (response !== 4) {
            // handle errors
        }
    });
}
This simple example assumes the FieldType is text, but you get the idea. To work with Lookup Tables you’ll also need to look at the cf.get_lookupEntries() values in the getCFComplete() function, but hopefully the above will get you started.
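To illustrate the Lookup Table case, here’s a rough sketch of my own (not from the original post). It assumes the lookup entries were loaded along with the custom fields (e.g. by also loading cf.get_lookupEntries() in GetCustomFields()), and that a lookup field value is passed to setCustomFieldValue() as an array of lookup entry InternalNames; treat both as assumptions to verify:

// Hypothetical sketch: set a single-value lookup custom field on a draft project.
// 'cf' is a loaded PS.CustomField whose lookup entries were included in the query.
function setLookupFieldValue(draftProject, cf, entryText) {
    var entries = cf.get_lookupEntries().getEnumerator();
    while (entries.moveNext()) {
        var entry = entries.get_current();
        // Match the visible lookup value against the text we want to set
        if (entry.get_fullValue() === entryText) {
            // Lookup values are set as an array of entry InternalNames
            draftProject.setCustomFieldValue(cf.get_internalName(), [entry.get_internalName()]);
            break;
        }
    }
}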
by Martin Laukkanen | Jul 23, 2013 | Apps, Development, How to, SharePoint 2013
One of the noticeable gaps that comes up immediately when you start planning any significant SharePoint 2013 deployment with requirements such as multi-tenancy and SAML-based authentication (ADFS, ACS, etc.) is the set of limitations in some of the new 2013 features.
One such limitation is the new App Store, which only supported Windows Authentication and didn’t support host headers at RTM! Fortunately Microsoft fixed the host header limitation in the March PU release (KB2768001); however, the SAML limitation remains.
Steve Peschka wrote about one solution to part of this problem here: Using SharePoint Apps with SAML and FBA Sites in SharePoint 2013. However, that only covers Provider-hosted (or Autohosted) apps, which leaves a big gaping hole where the simplest Apps of all are not supported.
The Problem with SharePoint hosted Apps
Basically the problem is in the App Domain configuration and authentication requirements. Take a typical example:
- First tenant:
App Domain: https://tenant1-*******.contosoapps.com
- Second tenant:
App Domain: https://tenant2-*******.contosoapps.com
As you know, those App Domains are auto-created for each installed app with a unique name per app. (Note that you need the March PU to configure the above App Domains.)
So with that in mind: when a user is authenticated to tenant1.contoso.com, that does not automatically authenticate them to tenant1-******.contosoapps.com, forcing re-authentication. That’s where we hit our problem.
Azure ACS and ADFS – No Apps??
Unfortunately Azure ACS and ADFS don’t work at all in this scenario. The problem:
UseWReplyParameter, while supported by both, is restricted to sub-sites only (e.g. a reply to …); as a result, in our scenario above, what you end up with is this annoying authentication loop:
The reason for this is security: allowing any arbitrary WReply parameter could potentially allow an attacker from one Relying Party (RP) to be redirected with a valid authentication token to a second RP.
So in my scenario above, it might be possible for tenant1 users to authenticate not only to tenant1-*** apps, but also to tenant2 sites!
Solutions?
Unless Microsoft changes ACS and ADFS (unlikely) or modifies the SPTrustedIdentityTokenIssuer (maybe?), the only option right now is to roll your own STS Identity Provider based on the Windows Identity Foundation SDK!
Fortunately you’re not alone; plenty of examples of this exist:
And my favourite:
Lots of good examples to work with, but for this blog I’m going to extend the last one on the list, by Steve Peschka, using SAML with Microsoft Accounts (LiveID). So if you want to implement my changes below, you’ll need to start by reading that article and downloading and getting familiar with its source.
Changes to WindowsLiveOauthSts Solution
In summary we need to make the following changes to the solution:
- Firstly, we need to update the Custom STS to accept and use the WReplyParameter.
- Secondly, we need to capture the original request’s WReplyParameter in PassiveSTS.cs and send it to the Custom STS.
- Finally, we need to ensure that we maintain the security of our solution with all of these changes.
That’s it, simple hey?
Updates to CustomSecurityTokenService.cs
First add somewhere to store the parameter to the class:
protected String WReplyParameter { get; set; }
Next extend the constructor to accept the parameter:
public CustomSecurityTokenService(CustomSecurityTokenServiceConfiguration configuration,
    Dictionary<string, string> ClaimValues, String wReplyParameter = "")
    : base(configuration)
{
    this.oAuthValues = ClaimValues;
    this.WReplyParameter = wReplyParameter;
}
And finally use the saved value in the GetScope() method:
// Set the ReplyTo address for the WS-Federation passive protocol (wreply).
// Use the provided WReplyParameter if it exists.
if (String.IsNullOrEmpty(WReplyParameter))
    scope.ReplyToAddress = scope.AppliesToAddress;
else
    scope.ReplyToAddress = WReplyParameter;

return scope;
Updates to PassiveSTS.cs
First we have to get the Query String parameter from the request:
string wReplyParameter = HttpUtility.ParseQueryString(HttpUtility.UrlDecode(state))["wreply"];
Then we make sure to pass the parameter to the CustomSTS when instantiated:
//create an instance of our sts and pass in the dictionary of values we got from Windows Live oAuth
SecurityTokenService sts =
new CustomSecurityTokenService(CustomSecurityTokenServiceConfiguration.Current, values,
wReplyParameter);
Security Considerations
The risk arises if we allow any ‘wreply’ parameter to be used as the returned ReplyToAddress in the Custom STS. In this example implementation, a simple validation is included using an array of allowed URLs (unmodified code quoted here):
// TODO: Set enableAppliesToValidation to true to enable only the RP Url's specified in the
// PassiveRedirectBasedClaimsAwareWebApps array to get a token from this STS
static bool enableAppliesToValidation = false;
Obviously, for a production multi-tenanted environment you would need something more sophisticated, but I’ll leave that to you. Another thing that particular solution does not allow for is our dynamically created App Domains, so that too will require some changes.
But hang on..
However, it is worth saying here that in our simple example, when using only Live ID across all Web Apps, we have not exposed anything extra (yet) with this change! Think about it: if an attacker in Tenant1 wants to log in to Tenant2, then all they need to do is open the Tenant2 site and log in! Our STS is passing only identity and user-related claims, so all of the securing of resources is left to the tenant. I.e. John from Tenant 1 has no site collection SPUser rights to Tenant 2 sites! So as it stands this scenario is similar to an equivalent Azure ACS implementation using multiple Relying Parties.
Final Words
It’s worth mentioning that this likely will become redundant after a future service pack, assuming Microsoft fixes this by changing SharePoint that is.
A good indication is to look at how Office 365 is doing it right now: a quick Fiddler trace shows that the two cookies (rtFa and FedAuth) are passed directly to the App Domain from the tenant Web App, so clearly MS is handling this in the initial auth.
That may be a topic for investigation another time.
Hope this is useful for someone out there.
by Martin Laukkanen | Jun 12, 2013 | Development, Project 2013
I’ve had the fun task of shoe-horning some 2010 PSI projects into Project Server 2013 recently, and the good news is that as promised the PSI is still there just as we remember it!
One particular exception I kept seeing in my code immediately when using SharePoint Permissions Mode (i.e. a default install) was the following:
Exception: Unhandled Communication Fault occurred
Which came with the unusually helpful accompanying ULS log errors:
06/11/2013 10:05:54.39 w3wp.exe (0x161C) 0x29B0 Project Server General ab3be Monitorable Unhandled exception in the process of routing the request to the app server: Target=https://host/PWA/_vti_bin/PSI/ProjectServer.svc, exception=[InvalidOperationException] Operation is not valid due to the current state of the object., StackTrace= at Microsoft.Office.Project.Server.WcfTrustedFacadeHelperMethods.
TryGetImpersonationContextInfo(String originalTargetUrl, OperationContext context, ImpersonationHeader& impersonationHeader) …
06/11/2013 10:05:54.39 w3wp.exe (0x161C) 0x29B0 Project Server Security aibp8 High Impersonation is not supported in SharePoint permission mode
The second log entry is the giveaway; I won’t be forgetting that limitation again!
by Martin Laukkanen | Mar 13, 2013 | Development, How to
One of the great features of Claims authentication in SharePoint (2010 or 2013) is the ability to use external authentication providers such as ADFS, Microsoft LiveID (Hotmail, Outlook.com etc) or even Google among others. Better yet, using Microsoft Azure ACS makes setting up and managing this for extranet sites or Cloud applications simple!
The Catch!
However, there’s one catch (isn’t there always?): Microsoft Live ID doesn’t give the email address in the claim. This sounds obscure, but in effect what it means is that once you configure all of this, this is what you get:
What a nice and friendly username!
(If you want instructions on configuring all this, I highly recommend Wictor Wilén’s indispensable series of articles on the topic: http://www.wictorwilen.se/Post/Visual-guide-to-Azure-Access-Controls-Services-authentication-with-SharePoint-2010-part-1.aspx)
Solutions?
Most people suggest using something like Google to authenticate instead to avoid this; however, with SharePoint Online, Project Online, EPMonDemand (ahem – excuse the blatant plug!) and Office365 all tied to my LiveID, I personally don’t think that’s an option!
So basically we need to populate that information ourselves. Sure, we could ask users to do it manually, but better yet, let’s find out how to do this the Microsoft way.
Live Connect API
In order to support Windows 8 apps, web apps and all manner of mobile devices, MS have this great resource available; head over to the Live Connect Developer Center for more information. Once you’ve registered, you can follow my steps below to get set up to auto-populate SharePoint user information via JavaScript!
Getting this going
Note: Getting to the business end here; you’ll need Visual Studio, some JavaScript, SharePoint Object Model and web part development experience to continue. (Or just skip to the end and download my example, like everyone else!)
Firstly you’ll need to create yourself an ‘application’ on Live Connect, something like the following:
The important thing is to note your Client ID, and then enter the correct URL under Redirect domain.
All done, easy!
If you want to see how all this works, the Interactive Live SDK is one of the best SDK implementations I’ve seen from MS; have a look at what kind of options are available to you.
For example, the following JavaScript will return the basic user information for the currently logged-in user, including name, email addresses and more:
WL.init({
    client_id: "***********",
    redirect_uri: "callback.aspx",
    response_type: "token"
});

WL.login({ "scope": "wl.emails" }).then(
    function (response) {
        updateUserData();
    },
    function (response) {}
);

function updateUserData() {
    WL.api({ path: "/me", method: "GET" }).then(
        function (response) {
            // Do something here!
        },
        function (response) {}
    );
}
Neat hey?
Bring it all together
Okay, not quite done yet; let’s put this into a nice simple SharePoint web part and then, using the good old Object Model, update the current user’s details.
What you need to do:
- Open Visual Studio and create a new SharePoint – Visual Web Part Project
- Create some ASP controls for your web part. I won’t bore you with all the details here (get the full solution below), in which I created the following:
- AccountTextBox
- NameTextBox
- EmailTextBox1
- UpdateButton
- Plus a bunch of labels to make things neat.
- We need a link to the Live Connect http://js.live.net/v5.0/wl.js file. To make this easier I have saved a copy in a Layouts folder in my solution, so my ScriptLink looks like this:
<SharePoint:ScriptLink ID="LiveIdscript" runat="server" Name="UpdateUserWebPart/wl.js"></SharePoint:ScriptLink>
- Also, for the Live ID bit to work we need a callback.aspx file in our solution, which is referenced in the “redirect_uri” parameter passed to WL.init in the JavaScript. This file should look something like the following:
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>Authentication Complete</title>
</head>
<body>
<!--
The script will handle passing oauth/authorize response back to the calling page.
-->
<script src="/_layouts/15/UpdateFromLiveID/js/wl.js" type="text/javascript">
</script>
<p>This window can be closed.</p>
</body>
</html>
- Now for the JavaScript, first we need to initialise and try to login to Windows Live.
<SharePoint:ScriptBlock ID="ScriptBlock1" runat="server">
    WL.init({
        client_id: "*************",
        redirect_uri: "<%=SPContext.Current.Web.Url %>/_layouts/15/UpdateUserWebPart/callback.aspx",
        response_type: "token"
    });

    if (document.getElementById("<%=EmailTextBox1.ClientID %>").value == '') {
        WL.login({ "scope": "wl.emails" }).then(
            function (response) {
                updateUserData();
            },
            function (response) {}
        );
    }
Above I’ve included an IF that checks for an existing email address; if none is found, it automatically tries to log in. (Pop-up blockers hate this, so you’ll need to do something nicer; see the sketch after the walkthrough below.)
- Next here’s my updateUserData() function which is called on successful login;
    function updateUserData() {
        WL.api({ path: "/me", method: "GET" }).then(
            function (response) {
                document.getElementById("<%=EmailTextBox1.ClientID %>").value = response.emails.preferred;
                if (!response.name) {
                    document.getElementById("<%=NameTextBox.ClientID %>").value = response.emails.preferred;
                }
                else {
                    document.getElementById("<%=NameTextBox.ClientID %>").value = response.name;
                }
                document.getElementById("<%=UpdateButton.ClientID %>").click();
            },
            function (response) {}
        );
    }
</SharePoint:ScriptBlock>
So once we have the user data, we update the ASP controls created previously, so we can use them in the code-behind.
- Lastly we need some code behind to get the details and update our SPUser object.
protected void Page_Load(object sender, EventArgs e)
{
    if (string.IsNullOrEmpty(AccountTextBox.Text))
    {
        AccountTextBox.Text = SPContext.Current.Web.CurrentUser.LoginName;
        NameTextBox.Text = SPContext.Current.Web.CurrentUser.Name;
        EmailTextBox1.Text = SPContext.Current.Web.CurrentUser.Email;
    }

    UpdateButton.Click += UpdateButton_Click;
}
- Here I’m updating the controls on Page_Load, but only once, then wiring up an event handler for our button.
- For the button I basically do the following:
void UpdateButton_Click(object sender, EventArgs e)
{
    [...]
    SPUser currentUser =
        elevatedWeb.SiteUsers.GetByID(SPContext.Current.Web.CurrentUser.ID);
    currentUser.Name = NameTextBox.Text;
    currentUser.Email = EmailTextBox1.Text;
    currentUser.Update();
    [...]
}
To keep this short(er) I have cut out the RunWithElevatedPrivileges bits and such; you might not actually need that, depending on your user permissions, but if you leave it in then I suggest reading this.
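On the pop-up blocker point from the walkthrough above, a friendlier pattern is to only call WL.login() when the user clicks something, so the popup is user-initiated. Here’s a rough sketch of my own (the “LoginButton” element is hypothetical and not part of the sample solution):

// Hypothetical: trigger the Live ID login from a button click rather than on
// page load, so the popup is user-initiated and far less likely to be blocked.
document.getElementById("LoginButton").onclick = function () {
    WL.login({ "scope": "wl.emails" }).then(
        function (response) {
            // Logged in: fetch and apply the user details as before
            updateUserData();
        },
        function (response) {
            // Login cancelled or failed; leave the existing values alone
        }
    );
};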
Are we done yet?
Now let’s deploy and have a look:
That’s what we see initially, but once we allow that popup:
Make sure you select Yes.
Nice.
Now if everything’s working, clicking Update should do just that!
Hurrah!
Get my full example solution here, built with Visual Studio 2012 for SharePoint 2013; just make sure you update it with your Client ID from Live Connect.
Conclusion
It’s not perfect; in fact it’s a way off (think application pages, modal dialogs, mmm), but hopefully this will get you started.
Any questions, post below, but I strongly recommend you play with the Interactive SDK on Live Connect and read the Getting Started documentation there if you have any problems with the JavaScript. It took me quite a bit of effort to get all that working together the first time!
Update: See my latest post on this topic for a better approach: http://nearbaseline.azurewebsites.net/blog/2013/07/sharepoint-hosted-apps-with-saml-authentication/