Jun 11, 2013
 

Taking a break from my code posts, because geeks love not paying for utilities when you can just get the same thing from the internet.

I just switched our home telephone service to Ooma. I bought an Ooma Telo 2 from Amazon for $150 (they're only $119 right now though) and the voice quality is great. I have Comcast internet, and now that's all I need from them; I'm dropping the bundled voice service at $45/month.

Here’s the Ooma Telo:

[Image: Ooma Telo device]

Get the Ooma from Amazon here, and check out the Ooma service here.

Plug one port of that bad boy into the internet and the other into your home phone wiring, create your Ooma account, and you're good to go. You could plug it in between your router and the rest of your home network to get better quality of service, but I just dropped it in wherever (going to go wireless) and I get good results. I splurged on Ooma Premium at $10/month and got a second line and a phone spam filter; I think it's worth it, and it's still way cheaper than $540/year for Comcast.

(We also moved from dish, to cable, to Hulu Plus and an antenna some time ago… that also seems to be a much better option for an internet-connected home with PCs and Xboxes at each television, and it saves another $1200+ a year!) I also spent $120 on the cool Ooma handsets since I had some Amazon funny money lying around from rewards cards, but you can just use standard phones.

I was a bit hesitant to switch to a DIY VoIP option like Ooma, but the quality is great; it sounds as good as or better than what I got from Comcast Voice.

*This is an unpaid endorsement; you really can save a CRAPLOAD of money using Ooma.

Jun 11, 2013
 

I found a great example of using Lucene.NET in the open source project Roadkill, a .NET wiki engine.

That is all. Yes I could have tweeted that, but I’m going old skool. Remember blogs? We used to use them prior to Facebook and Twitter.

Lucene seems to be Microsoft's search engine of choice for Azure, since full-text search for Azure isn't coming anytime soon. Unfortunately, there aren't a ton of great samples out there… having the whole thing in an open source project you can look at is quite handy!

Some other useful Lucene.NET links for Azure devs:

Combine those articles and Roadkill’s source code and you’ve got a pretty good start on getting Lucene running on Azure.
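
For a flavor of what the code looks like, here's a minimal sketch using the classic Lucene.NET 3.0.x API against an in-memory RAMDirectory (the field names and text are made up for illustration); for Azure you'd swap the RAMDirectory for a blob-backed directory such as the AzureDirectory project mentioned above.

using System;
using Lucene.Net.Analysis.Standard;
using Lucene.Net.Documents;
using Lucene.Net.Index;
using Lucene.Net.QueryParsers;
using Lucene.Net.Search;
using Lucene.Net.Store;
using Version = Lucene.Net.Util.Version;

class LuceneSketch
{
    static void Main()
    {
        // In-memory index for the sketch; use an Azure-aware Directory for real deployments.
        var directory = new RAMDirectory();
        var analyzer = new StandardAnalyzer(Version.LUCENE_30);

        // Index a document.
        using (var writer = new IndexWriter(directory, analyzer, true, IndexWriter.MaxFieldLength.UNLIMITED))
        {
            var doc = new Document();
            doc.Add(new Field("title", "Roadkill wiki page", Field.Store.YES, Field.Index.ANALYZED));
            doc.Add(new Field("body", "Lucene.NET running on Azure", Field.Store.YES, Field.Index.ANALYZED));
            writer.AddDocument(doc);
            writer.Commit();
        }

        // Search the index.
        using (var searcher = new IndexSearcher(directory, true))
        {
            var parser = new QueryParser(Version.LUCENE_30, "body", analyzer);
            var hits = searcher.Search(parser.Parse("azure"), 10);
            foreach (var scoreDoc in hits.ScoreDocs)
                Console.WriteLine(searcher.Doc(scoreDoc.Doc).Get("title"));
        }
    }
}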

Apr 19, 2013
 

(list came from the legendary Mark Simms… )

Best Practices for the Design of Large-Scale Services on Windows Azure Cloud Services
http://msdn.microsoft.com/en-us/library/jj717232.aspx

Resilient Cloud Apps
http://channel9.msdn.com/Events/Patterns-Practices-Symposium-Online/pattern-practices-symposium-2013/Resilient-Cloud-Apps

Building Big: Lessons learned from Windows Azure customers – Part I
http://channel9.msdn.com/Events/Build/2012/3-029

Building Big: Lessons learned from Windows Azure customers – Part II
http://channel9.msdn.com/Events/Build/2012/3-030

Reference code for Azure from Contoso Social
http://code.msdn.microsoft.com/windowsazure/ContosoSocial-in-Windows-8dd9052c

Failsafe: Building scalable, resilient cloud services
http://channel9.msdn.com/Series/FailSafe

Apr 19, 2013
 

Ok, I probably won't ever leave the SharePoint world… but I've been transitioning a bit of focus lately from SharePoint to cloud development. And if you drink the Microsoft kool-aid (and why wouldn't you, it's ridiculously good!!! I think they laced it with crack…) there's only one cloud. Personally, I will blissfully ignore the other clouds… Microsoft just makes it too easy to use their platform. I'm all about that: we want to leverage the work Microsoft is pouring into the cloud, and it just makes sense for Microsoft developers.

It's also about the sushi. Seattle has the best sushi in the states… and I just found my new favorite, "Tuna House" in Bellevue. Yes, I can be easily bought with sushi dinners. But back to Azure… (mmm… kool-aid!!!!) I've met with some ridiculously smart people over the last few days here in Redmond, and here are some of my raw takeaways. A lot of it is specific to our problem domain and I won't share that here, but these are some of the general things I've learned over the last few days and the last few weeks of diving into Azure:

There's really no full-text search for SQL Azure, and it's not coming anytime soon… everyone says "hey, just use Lucene." There's a company called Lucidworks that offers Lucene/Solr as a service, or you can try rolling your own. There's the AzureDirectory for Lucene project you can grab off of CodePlex, which could be a good start… but it just seems like one more thing to add to the solution. Search seems to be an awful gap in the Azure story. Everyone keeps saying "just use Lucene," but there aren't really any great samples out there, and the documentation for .NET developers is questionable.

Another topic we dove into is logging and telemetry. If you can't debug from your logs, you can't debug in the cloud, so log, log, log. There are also several levels of logging and telemetry you want for your app. There's the infrastructure level, but that really doesn't matter much these days. Then there's the app level: requests per second, latency, stuff like that. On top of that there are business metrics such as transactions per second and concurrent active users. And at the very top there are user metrics, which tell us adoption statistics and how our users are using the system.

One really important thing is to make sure the logging is async, which rules out log4net (which also has file-locking issues that can really bite you). In code, make sure you're not building strings before calling the logging interface; otherwise you're paying the overhead of building strings you may just throw away. For example, log(string.Format("foo {0} {1}", something, another)); is horrible since the string always gets built, while log("foo {0} {1}", something, another); is preferred because the string is only built if it's actually going to be logged. I'll post more about logging in another post…
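
Here's a minimal sketch of that last point; the ILogger interface and ConsoleLogger class are hypothetical stand-ins, not from any particular library, but they show the deferred-formatting idea:

using System;

public interface ILogger
{
    bool IsDebugEnabled { get; }
    void Debug(string format, params object[] args);
}

public class ConsoleLogger : ILogger
{
    public bool IsDebugEnabled { get; set; }

    public void Debug(string format, params object[] args)
    {
        // The string is only built when the level is enabled, so a disabled
        // Debug call costs almost nothing.
        if (IsDebugEnabled)
            Console.WriteLine(string.Format(format, args));
    }
}

// Preferred: pass the format and args, and let the logger decide whether to format.
//   logger.Debug("Processed {0} items in {1} ms", count, elapsed);
// Wasteful: the string is always built, even when Debug logging is off.
//   logger.Debug(string.Format("Processed {0} items in {1} ms", count, elapsed));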

Another interesting pattern we talked about is the cloud deployment pattern. The cloud software version needs to be compatible with n-1 (the previous version) of the database schema, so the application code gets updated first, and then the SQL code gets updated. It seems backwards, but it makes sense if you think about it and code to that standard. (Plus, it gives QA another thing to test; they love crap like that!) If the software update succeeds, you then update the database to the current schema. But if the software update fails, you can just roll back to the previous version pretty easily.
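
As a rough illustration (the version numbers and the notion of reading a schema version from a table in the database are my assumptions, not a prescribed implementation), a startup check for the n-1 rule might look like this:

using System;

public static class SchemaGuard
{
    // Schema version this build of the app was written against.
    public const int CurrentSchemaVersion = 12;

    // n-1: the previous schema version is still supported.
    public const int MinimumSchemaVersion = CurrentSchemaVersion - 1;

    // deployedSchemaVersion would typically be read from a version table in the database.
    public static void EnsureCompatible(int deployedSchemaVersion)
    {
        if (deployedSchemaVersion < MinimumSchemaVersion ||
            deployedSchemaVersion > CurrentSchemaVersion)
        {
            throw new InvalidOperationException(string.Format(
                "Database schema version {0} is outside the supported range {1}-{2}.",
                deployedSchemaVersion, MinimumSchemaVersion, CurrentSchemaVersion));
        }
    }
}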

There are a lot more patterns to talk about, but that's about all that fits into this post. Watch for more content on Azure dev as we move to the cloud. Maybe you can join me on this adventure!

Apr 19, 2013
 

*Note: This is a troubleshooting post… not all that interesting unless you have this issue. But I’ll try to make it interesting.

The backstory:

I'm writing a bit of Azure code lately, and caching is a big component of it. When I write code, I've found that I really hate debugging with the Azure emulator. Yeah, I'm really impatient… I hate that 15-second delay, and I also hate having my URL changed to the emulator's port (it causes issues with federated auth). So I chose to just host the site in IIS and debug by attaching to the w3wp process instead of going through the emulator. For that to work, we need flexible services we can swap out without taking dependencies on Azure components.

So I decided to wrap the caching library behind our own interface, so I can run a local cache using AppFabric Caching in dev and Azure Caching in staging/QA/production. (It's also interesting to note I could save $120/month running a Linux memcache server in an Azure VM… we may have to look at that instead of the Azure caching role. We will see!)
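
Here's roughly the shape of that wrapper; it's a sketch with made-up interface and class names, built on the AppFabric client API (DataCacheFactory/DataCache in Microsoft.ApplicationServer.Caching), which the Azure Caching client exposes in a very similar form:

using System;
using Microsoft.ApplicationServer.Caching;

// The app codes against this interface; the concrete implementation is chosen
// per environment (AppFabric locally, Azure Caching in staging/QA/production).
public interface ICacheProvider
{
    T Get<T>(string key) where T : class;
    void Put(string key, object value, TimeSpan timeout);
}

public class AppFabricCacheProvider : ICacheProvider
{
    private readonly DataCache _cache;

    public AppFabricCacheProvider(string cacheName)
    {
        // Reads the dataCacheClient section from app/web.config.
        var factory = new DataCacheFactory();
        _cache = factory.GetCache(cacheName);
    }

    public T Get<T>(string key) where T : class
    {
        return _cache.Get(key) as T;
    }

    public void Put(string key, object value, TimeSpan timeout)
    {
        _cache.Put(key, value, timeout);
    }
}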

So I ended up looking at AppFabric Caching, which is the same technology used by the Azure caching role. Unfortunately, the links posted in places like Hanselman's post "Installing, Configuring and Using Windows Server AppFabric and the 'Velocity' Memory Cache in 10 minutes" point to an older version that doesn't install on Windows Server 2012 or Windows 8. After a bit of googling, I found the download here: http://www.microsoft.com/en-us/download/details.aspx?id=27115

Unfortunately, it just flat wouldn't install the caching service. You'll get an error like "AppFabric installation failed because installer MSI returned with error code: 1603". There's a lot of googling about this going on… here's what worked for ME:

1. Run "WindowsServerAppFabricSetup_x64.exe" from the AppFabric package download.
2. Let it fail… then find the temp folder it created (some arbitrary number in the root of your C: drive) and copy it to a new folder called "appfabricinstall".
3. Go into the "packages" subfolder and run the following command from a command prompt:

msiexec /i appfabric-1.1-for-windows-server-64.msi /l*v Detailed.msi.log

4. If there are errors, check out the log file Detailed.msi.log.

If that doesn’t work out for you, sacrifice a goat (should have been your FIRST step, duh!!!!) and then ask the google again. This post may help as well, or any other post you find when googling “Appfabric installation failed because installer MSI returned with error code:1603”.

Sorry to both of my readers for this far-from-interesting post… until next time…

Nov 19, 2012
 

If you've been following SharePoint 2013, you'll know that the App Model is everything. If you've been a SharePoint developer for a prior release or two, however, you've got a lot of on-premise code that does really interesting things that you can't do with the App Model. But apps are so cool, aren't they? And they just make sense from an end user perspective. A user doesn't want to add a "list" or a "site", they want an app. Fortunately for you, if you've got on-premise SharePoint code that you can instantiate via a Feature, you can deploy it as an app.

To “surface” a Feature as an App, include the node AppDisplayData in the Feature.xml:

<?xml version="1.0" encoding="utf-8"?>
<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
         AlwaysForceInstall="TRUE"
         Description="Dan is awesome but he needs a Microsoft Surface."
         Id="13B1CA7A-75DA-47E3-A439-AE651658C56D"
         Scope="Web" Title="Dan needs a Surface">

  <AppDisplayData ThumbnailUrl="_layouts/15/images/Dan/Awesomeness.png"
                  LaunchUrl="_layouts/15/DansAwesomeApp/Entrypoint.aspx" />

</Feature>

ThumbnailUrl defines the URL of the thumbnail, and LaunchUrl defines the URL of the instantiated app for Site Contents once it’s activated.

When you include this magic XML in the Feature node, the SPSiteDefinition's internal ShowInStoreFront property will be true, and the Feature will be loaded into the Add an App page:

[Image: the Feature showing up on the Add an App page]

That’s all there is to it! Include an AppDisplayData node with ThumbnailUrl and LaunchUrl and you can ship Apps for your on-premise SharePoint 2013 Features.

Aug 29, 2012
 

Today we posted an update to the SharePoint AJAX Toolkit to support changes in IE10. The SharePoint AJAX Toolkit is a library designed to take an XML URL and an XSLT URL and spit the HTML out into a div. The heart of the control is the JavaScript XmlControl, which is wrapped by the AjaxXmlWebPart. At NewsGator we use the AjaxXmlWebPart for a MAJOR portion of our UI code; it's just so handy to take an XML data source, which we can update on the client for a client-side refresh, and point it at a simple XSLT transform. Add a little post-render jQuery code and you've got REALLY sexy functionality in very little time. A similar JSON-based approach is in the works, but the XML technique is a little more powerful (and REALLY easy if you keep it simple). But I digress… this post is about changes to AJAX processing for IE 10.

Over the years, we’ve found that the Microsoft AJAX Library (AKA “Atlas” for you old timers!) was a bad idea. Not long after it shipped, Microsoft acknowledged this and said something like “yeah, that was a bad idea, just use jQuery.” Which of course made me feel REALLY great about my book that covered a lot of their framework… ouch! Still a good book, just skip those chapters on the client side AJAX library and read a good jQuery reference. You can find the book pretty cheap over on Amazon.

But again, I digress.

Here’s the change you should know:

When you need to get back an XML DOM for doing XML things (like XSLT), you need to TELL the request that the response type is an MSXML document (that sounds so wrong); otherwise it ends up being a plain DOM document, which doesn't have the XSLT functions you need. Simply set the responseType to msxml-document and it's magically an XML object when it comes back.

Of course, the (not) awesome part about this is that this breaks all those silly JavaScript libraries (like Microsoft’s Sys.Net.WebRequest) and forces you to go old-school do-it-yourself for the request. Which isn’t all that hard, luckily.

Our old code looked like this:

var request = new Sys.Net.WebRequest();
request.get_headers()["X-SPAJAX"] = "XmlControl.LoadXml";
request.set_url(url);

request.requestdate = new String(new Date());
request.get_headers()["X-REQUESTDATE"] = request.requestdate;

request.add_completed(this.xmlLoaderDelegate);
request.invoke();

 

Our new code now looks like this:

var request = SharePointAjax.GetXmlRequester();
request.open("GET", url, true);
if (request.setRequestHeader)
    request.setRequestHeader("X-SPAJAX", "XmlControl.LoadXml");

var requestDate = new String(new Date());
request.setRequestHeader("X-REQUESTDATE", requestDate);

var xc = this;
request.onreadystatechange = function () {
    if (request.readyState === 4) {
        xc.LoadXmlComplete(request, requestDate);
    }
};

try { request.responseType = 'msxml-document'; } catch (e) { }

request.send();

 

…where the GetXmlRequester is a simple function that looks like this, and provides compatibility for those dinosaur versions of Internet Explorer out there. It’s almost not needed, but for your mom who’s stuck on Windows Me, it might just help out.

SharePointAjax.GetXmlRequester = function () {
    if (window.XMLHttpRequest)
        return new XMLHttpRequest();

    var progIDs = ['Microsoft.XMLHTTP', 'Msxml2.XMLHTTP.3.0', 'Msxml2.XMLHTTP'];
    for (var i = 0, l = progIDs.length; i < l; i++) {
        try {
            var xmlAx = new ActiveXObject(progIDs[i]);
            return xmlAx;
        } catch (ex) { }
    }
}

 

That’s about it! Thanks to SharePointJohn (part of our incredible NewsGator dev team) and the good folks at the IEBlog for assistance.

Until next time… happy Labor Day to everyone! :)

Jul 26, 2012
 

While the rest of the NewsGator Social Sites developer team is hard at work on our 3.0 release, I've been working on strategy and compatibility for SharePoint 2013. (Ok, I wrote SOME code for 3.0, but not that much. The rest of the team has added some really cool features that blow the doors off of the Social experience!) One goal was to be able to carry forward code from SharePoint 2010, as I've talked about here. This post adds a few additional details: with these techniques we're able to install our 2010 product onto SharePoint 2013, with additional WSPs for 2013 compatibility and added or modified functionality.

1. WSP Package Targeting

The package version for all 2010 WSPs should be set to 14.0. This ensures that the _layouts folder gets deployed to 14\template\layouts, which is served as "/_layouts/". The ISAPI (_vti_bin), WebServices and WebClients folders magically get deployed to the right place in 2013 (that is, into the 15 folder structure). Features deployed from 2010 WSPs get deployed to 14\template\features, show up in the Features lists, and continue to work just as they did in 2010. For most Features, this is what you want.

To add compatibility for SharePoint 2013, add an additional WSP with compatibility items. You can overwrite features deployed in your 2010 (14.0) compatible WSP with a specialized WSP for 2013 (15.0).

Site Features in SharePoint 2013 (as well as Web/Web App/Farm) will read from both 14\Template\Features AND 15\Template\Features.

2. Overriding Features for SharePoint 2013

Occasionally, you may find that a Feature needs specialized code for 2013, or it just needs to be redeployed. You may need to override a feature because the concepts have changed in 2013 and you need a different implementation, you may need to simply hide a feature from your 2010 WSP, or you might just need to redeploy files to the 15 folder structure, as will be the case for custom columns.

Custom columns need to be re-deployed to the 15 folder structure. Features that add custom columns will activate without error, but the custom columns will not be accessible, and the feature may fail to de-activate and re-activate. SharePoint requires columns to be deployed to the 15 structure in order to be used by code and to show up in the Site Columns dialog.

Side-note: If you’re not familiar with adding custom columns, check out the SharePoint “fields” Feature (15\TEMPLATE\FEATURES\fields).

To override a Feature from a 14.0 WSP, deploy a new Feature in a 15.0 WSP with the same Feature ID GUID and the same scope. Increase the version (version 15.0.0.1 is a good starting version for 15.0 scoped Features) and include the implementation.

It’s really that simple… just redeploy the Feature in the 15 structure, and it will override the previously deployed 14.0 Feature.

For site columns, if you have specific custom field types defined in the 14\template\XML folder, you'll need to redeploy those into the 15\template\XML folder. For site columns deployed in Features, you will also need to redeploy those to 15\template\Features. The code AND the Visual Studio project items should be compatible; you can simply copy them from your 14.0 Visual Studio 2010 project and include them in your 15.0 Visual Studio 11 project.

Summary

To override a Feature from a 14.0 WSP, deploy a new Feature in a 15.0 WSP with the same Feature ID and the same scope. Increase the Feature version and include the implementation.

Jul 26, 2012
 

Just in time for SharePoint 2013 Preview development, the kind folks at Red Gate have released Reflector 7.6. If you write code for SharePoint, you need this tool!!!! For those new to Reflector: this is how we perform the black art of SharePoint development when the folks in Redmond aren't willing to share with us, or when we just aren't sure of what a given API really does. From Visual Studio you can choose the assemblies to debug, and Reflector creates PDBs so you can step through their code. (You will need therapy after seeing Microsoft code though… it isn't pretty.)

My feature made it into the release: when you first fire up Reflector you'll be asked to choose a set of default DLLs to load. If SharePoint is installed it will let you choose the SharePoint DLLs, including Microsoft.SharePoint, Microsoft.Office.Server, Microsoft.Office.Server.Search, Microsoft.SharePoint.Client and Microsoft.Office.Server.UserProfile. This feature came from my product feedback; the Red Gate team listens to your feedback. This works with 2013 as well as 2010.

You can get the tool at http://www.reflector.net/.

DISCLAIMER: Red Gate has provided me with their tools for no charge, and I have been involved in product feedback which I also talk about.

 

Jul 23, 2012
 

We recently went through the work of converting our solution (NewsGator Social Sites) from SharePoint 2010 to SharePoint 2013. After a bit of research and trial and error, we were able to make our code compatible with both SharePoint 2010 and SharePoint 2013. In fact, our current codebase can be installed on SharePoint 2013. Here are some of the things we learned in making this possible.

To begin, we started with a dual developer environment. Both environments are built on Windows 2008 R2 with SQL 2012. The 2013 environment has SharePoint 2013 and Visual Studio 11, while the 2010 environment has SharePoint 2010 and Visual Studio 2010. To achieve backwards/forwards compatibility with a single code base, we never open the solution in VS 11; we work in VS 2010 and merely deploy onto the 2013 environment.

If you open the solution in Visual Studio 11 on the 2013 environment, the resulting WSP will map the Layouts mapped folder to /_layouts/15/. The Layouts mapped folder remains /_layouts/ when the WSP is created in Visual Studio 2010. So as long as you don't open the solution in Visual Studio 11, this isn't an issue and you don't need to change any URL references. Here's a summary of mapped folders and what happens to them:

The Layouts mapped folder is mapped to /_layouts/ in a version 14.0 WSP and to /_layouts/15/ in a 2013 (15.0) WSP. The rest of the mapped folders magically get deployed where they need to go (for the most part!), including ISAPI, WebServices, WebClients, and SQL. SQL is a problem if you're using an SPDatabase in your service application: you'll need to add an "if" statement to properly get the path of the installed SQL folder for your SPDatabase, otherwise it may try to use the 15 folder when it really should read the 14 folder, and vice versa. Test this out…

The other thing we started doing is adding a few if blocks for specialized code, and adding special styles, behaviors and even classes in factory patterns based on SPFarm.Local.BuildVersion. Specialized code for (and compiled against) SharePoint 2013 can be written on the 2013 dev box, packaged in a 2013 WSP, and loaded through a factory pattern from common code.
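
As a rough sketch (the helper class and usage are illustrative, not our actual code), branching on the installed version looks something like this:

using Microsoft.SharePoint.Administration;

public static class SharePointVersion
{
    // 14 = SharePoint 2010, 15 = SharePoint 2013.
    public static bool IsSharePoint2013
    {
        get { return SPFarm.Local.BuildVersion.Major >= 15; }
    }
}

// Example use in common code, e.g. when a factory needs to pick a
// 2013-specific implementation or resolve the right hive folder:
//   string hive = SharePointVersion.IsSharePoint2013 ? "15" : "14";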

In order to get our Central Admin pages to work, we had to add the following to Central Admin's web.config, under the assemblyBinding node. (This is a bug we've asked to get resolved for RTM):

<dependentAssembly>
  <assemblyIdentity name="Microsoft.SharePoint.ApplicationPages.Administration" publicKeyToken="71e9bce111e9429c" culture="neutral" />
  <bindingRedirect oldVersion="14.0.0.0" newVersion="15.0.0.0" />
</dependentAssembly>

Keep this in mind, and you should be able to maintain WSP compatibility with SharePoint 2013 while building against SharePoint 2010.