Twitter changing client-side MVC usage

Very interesting article from Twitter here about how they changed their client-side architecture from a single-page app running client-side JavaScript back to something more traditional.


"When we shipped #NewTwitter in September 2010, we built it around a web application architecture that pushed all of the UI rendering and logic to JavaScript running on our users’ browsers and consumed the Twitter REST API directly, in a similar way to our mobile clients."

But there were issues:
- initial page-load performance
- ensuring that only the JavaScript required is loaded, and that the application is interactive as soon as possible
- URLs are URLs, with no hashbangs (a point important to Dan Webb)


The defence
The riposte

By most accounts #NewTwitter did not do a good job of implementing the hashbang and there were performance issues. The ripostes had it, and so Twitter rearchitected to:
1. "render our page content on the server and deferring all JavaScript execution until well after that content has been rendered"
   i.e. moving rendering to the server to take back control of front-end performance.

2. "removing the need to handle routing on the client and no longer use the hashbang (#!)"
"With hashbang URLs, the browser needs to download an HTML page, download and execute some JavaScript, recognize the hashbang path (which is only visible to the browser), then fetch and render the content for that URL"

3. "On top of the rendered pages, asynchronously bootstrap a new modular JavaScript application to provide the fully-featured interactive experience our users expect"
"arranged all our code as CommonJS modules, delivered via AMD. This means that each piece of our code explicitly declares what it needs to execute"  "Our JavaScript bundles are built programmatically by a tool, we only download the code we need — and then only execute that code when required by the application."

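A loose sketch of the idea in point 3 (a toy registry, not Twitter's actual loader): each module explicitly declares its dependencies, and its factory runs only the first time something requires it, so code is executed on demand.

```javascript
// Minimal AMD-style module registry (illustrative only).
const registry = {}; // name -> { deps, factory, exports }

// Each piece of code explicitly declares what it needs to execute.
function define(name, deps, factory) {
  registry[name] = { deps, factory, exports: undefined };
}

// AMD calls this `require`; renamed here to avoid clashing with Node's.
function requireModule(name) {
  const mod = registry[name];
  if (mod.exports === undefined) {
    // Resolve dependencies first, then run the factory exactly once.
    mod.exports = mod.factory(...mod.deps.map(requireModule));
  }
  return mod.exports;
}

define('tweet', [], () => ({
  render: (t) => `<p>${t.text}</p>`,
}));
define('timeline', ['tweet'], (tweet) => ({
  render: (items) => items.map(tweet.render).join('\n'),
}));
```

Nothing in `timeline` runs until it is required, which is the "only execute that code when required by the application" part of the quote.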

Thoughts
Wow! So Twitter rearchitected in 2010 and then redid it approximately 1.5 years later. Clearly something was not right, and they've seen performance gains since their 2012 change.
An aside: I wonder how much of the improvement comes from using AMD to load just the JS needed, versus the server-side rendering?

What does it all mean?
Well, we want it all: high-performance, interactive, desktop-like web apps, sensible URLs and SEO support.
But there is no single solution that is backwards compatible with older browsers and provides it all.

So decisions have to be made and a mix of solutions considered.
There are some apps where it's not important or desired that the pages be crawled, and apps with views that are dynamically created based on user selections, such as a wizard. These can be good candidates for single-page apps because you have minimal or no SEO requirements.

But what about other kinds of apps?
There are solutions such as pushState, headless crawlers and Google's hashbang crawling support which can make a single-page app crawlable, but all are extra work because a crawlable page has to be provided as well as the interactive one. By all accounts, pushState is the future, but it is not supported in IE, not even IE9.
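Google's hashbang crawling support works by a URL-rewriting convention: the crawler turns the (server-invisible) hashbang fragment into a query parameter the server can see, and the server must answer with a pre-rendered HTML snapshot. A sketch of that rewrite (hypothetical function name):

```javascript
// Google's AJAX crawling convention: '#!path' becomes
// '?_escaped_fragment_=path' so the server can see the route
// and return a crawlable HTML snapshot for it.
function escapedFragmentUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url;
  const base = url.slice(0, i);
  const frag = url.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(frag);
}
```

This is exactly the "extra work" mentioned above: someone has to build the snapshot-serving side as well.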

If using pushState (a Backbone.js opt-in) then your web server also has to serve a non-single-page (and crawlable) version of the app's pushState URLs. One option I've heard of is using Node as a headless browser to hit the pages and generate HTML, but I have not tried it.
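Because pushState is an opt-in, the client has to make a capability decision at boot. A sketch of that decision (the `win` parameter is passed in only so the function can be exercised outside a browser):

```javascript
// Decide whether the client can do pushState routing, or whether
// navigation should fall back to full server round trips.
function routingMode(win) {
  if (win.history && typeof win.history.pushState === 'function') {
    return 'pushState'; // clean URLs, but the server must also answer them
  }
  return 'server'; // e.g. old IE: plain links and full page loads
}
```

In Backbone terms, the `'pushState'` branch is where you would call `Backbone.history.start({pushState: true})`.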


Here's an interesting post where the author creates an app which works without JavaScript, then enhances it to behave as a single-page app if pushState is supported in the browser. The hash is not used in the URL. If pushState is supported, view templates are loaded and handlers are attached to intercept clicks, handle them on the client, and put the app in the same state it would be in without JavaScript. What's key is that the same templates are used on server and client.
This is nice because it's still one app (important for SEO) but behaves differently depending on browser capability.
Another along similar lines here.
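The shared-template idea can be sketched like this (all names hypothetical): one template function, used by the server for the initial crawlable render and by the client when it intercepts navigation, so both paths land in the same state.

```javascript
// One template, shared by server and client (illustrative only).
function tweetTemplate(tweet) {
  return `<article class="tweet"><b>@${tweet.user}</b> ${tweet.text}</article>`;
}

// Server: render full HTML for the initial page load (and for crawlers).
function renderPage(tweets) {
  return `<main>${tweets.map(tweetTemplate).join('')}</main>`;
}

// Client: when pushState navigation is intercepted, re-render just the
// changed region with the very same template.
function renderInto(container, tweets) {
  container.innerHTML = tweets.map(tweetTemplate).join('');
}
```

Because both sides call `tweetTemplate`, there is only one rendering to keep correct, which is what makes the "1 app" property hold.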


A lesson for me is that using a single-page app (SPA) architecture needs forethought.
What kind of user experience is called for? Some may not be well suited to a single-page app.
What browsers need to be supported?
How much SEO is required?
What will the URLs look like?
What are your options for generating two versions of the app, one for crawlers and one as the SPA?

What's the budget?
What are the site's technologies and architecture?

Client-side MVC frameworks are the way to write non-trivial client-side JavaScript, for one page or many, but when to use a full-on single-page app needs some thought.


Aside: here's a note on how Twitter embeds JSON to bootstrap their page.
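The idea, in a sketch (the element id and data shape here are my assumptions, not Twitter's actual markup): the server embeds the initial state as an inert JSON script tag in the rendered page, and the client app parses it at boot instead of making an extra API round trip.

```javascript
// Server side: embed initial state as a non-executing JSON script tag.
function embedBootstrapData(data) {
  const json = JSON.stringify(data).replace(/</g, '\\u003c'); // avoid closing the tag early
  return `<script type="application/json" id="init-data">${json}</script>`;
}

// Client side: read it back when the app boots. In a browser this would be
// JSON.parse(document.getElementById('init-data').textContent); here we
// extract from a raw HTML string for illustration.
function readBootstrapData(html) {
  const m = html.match(/<script type="application\/json" id="init-data">(.*?)<\/script>/);
  return m ? JSON.parse(m[1]) : null;
}
```

Using `type="application/json"` means the browser never executes the payload as script; it is just data waiting for the app.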

Comments

  1. My opinion is that there is likely a hybrid solution where the server provides objects to the client but also renders the initial page load (ideally using the same template library as the client). So the client application's views/controllers would not need to render on the initial page load, but instead only initialize their events on ready.

    I found this today regarding web page performance relating to the build strategy for single page apps... a presentation on performance http://pagespeed-velocity2011.appspot.com/#14 ... It looks like combining all JavaScript files into a single file is a no-no.

    Also helpful for considering application architecture... http://gorban.org/post/32873465932/software-architecture-cheat-sheet

    The LinkedIn post provides some thoughts on this topic as well... http://highscalability.com/blog/2012/10/4/linkedin-moved-from-rails-to-node-27-servers-cut-and-up-to-2.html

    As for me, I am curious to see how quickly meteor.js is adopted, which basically abandons REST for reactive programming with event driven I/O and pushes down the wire to update clients.


