How AngularJS let me down!
This all started when I was asked to build a simple website for a startup food blog. When I first thought about the architecture, there wasn't much to decide. The requirement was a simple page with some images and recipes that could be maintained easily and edited frequently. If this requirement had been given to a consultancy, I'm pretty sure they would have started with a CMS without a second thought. But then again, they didn't ask a consultancy; they asked me. Not being very fond of CMS systems, I decided to go down the AngularJS path, which I was already accustomed to.
I was able to get the site hosted within days, with a nice JSON data repository set up for the client to work with. Everything was going smoothly until I was asked this question: why aren't my links working on social media sites the way they should? The reason, it turns out, is that social media crawlers fetch the raw HTML and don't execute JavaScript, so all they see of an AngularJS single-page app is the empty template.
Do I finally give in and build a WordPress blog? No way am I going to do that! I have more pride than that. So I decided to do a URL rewrite and serve pre-rendered HTML, compiled with PhantomJS, to the social media crawlers. So how did I do it?
If I see the user agent string of a crawler, I compile the page with PhantomJS and present the resulting HTML to the crawler. Sounds simple, but wait for it. The site was hosted on IIS, and even though running PhantomJS and doing the rewrite would have been much easier in Node, that was not an option due to some constraints. So I started walking down the dark alley of setting up IIS URL rewrite rules for the crawlers.
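The rewrite rule itself lives in web.config. Here is a minimal sketch of what such a rule can look like; the `/snapshot` handler path is a hypothetical endpoint standing in for wherever the PhantomJS-rendered HTML is served from, and the user-agent patterns are the commonly documented substrings for these crawlers, worth verifying against real crawler traffic:

```xml
<!-- Sketch only: /snapshot is a hypothetical pre-render endpoint,
     and the user-agent substrings are assumptions to verify. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="CrawlerPrerender" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <!-- Match the known social media crawler user agents -->
          <add input="{HTTP_USER_AGENT}"
               pattern="facebookexternalhit|Pinterest|StumbleUpon|Google \(\+https://developers\.google\.com" />
        </conditions>
        <!-- Hand the original path to the pre-render handler -->
        <action type="Rewrite" url="/snapshot?page={R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Regular visitors never hit this rule, so the Angular app keeps working exactly as before; only requests whose user agent matches one of the crawler patterns get rerouted to the snapshot.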
If you haven't already guessed by now, this is just the first part of the solution. Next, I started a proof of concept for the IIS URL rewrite on GitHub.
So to summarize...
I was building an AngularJS application for a client when I hit a roadblock. The project needed to support Facebook Open Graph properties, Pinterest Rich Pins, StumbleUpon links, and Google+ links, and all of them were failing.
This project is going to help me understand the user agent strings used by the above-mentioned crawlers and how to set up URL rewrites for them.
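The detection side can be sketched in a few lines of JavaScript. The user-agent substrings below are the commonly documented ones for each platform's crawler (they are assumptions here, not something guaranteed by the platforms, so they should be checked against live crawler requests):

```javascript
// Commonly documented user-agent substrings for each crawler
// (assumptions to verify against real traffic).
var CRAWLER_PATTERNS = [
  /facebookexternalhit/i,                          // Facebook Open Graph scraper
  /Pinterest/i,                                    // Pinterest Rich Pins fetcher
  /StumbleUpon/i,                                  // StumbleUpon link fetcher
  /Google \(\+https:\/\/developers\.google\.com/i  // Google+ snippet fetcher
];

// Returns true when the given user agent matches any known crawler pattern.
function isCrawler(userAgent) {
  return CRAWLER_PATTERNS.some(function (pattern) {
    return pattern.test(userAgent || '');
  });
}
```

The same check works whether the decision is made in a rewrite condition or in application code: a browser user agent falls through to the normal Angular app, while a matching crawler gets the pre-rendered snapshot.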
I have set up a URL rewrite for the above-mentioned crawlers and have tested it against the following rich object and schema handling platforms.