Friday, July 15, 2011

Fat clients are back - big time

Back in the days when I used to develop software (1992-1998), the client-server paradigm for software development ruled. Data was stored on the server side and, except for server-side calculations, it was manipulated locally on our PC desktops. We interacted with the data via fat but highly interactive, stable and speedy Windows applications.

When the world wide web arrived on our desktops, it was sort of implied that web sites would evolve into highly interactive applications run inside a web browser. The web was the future, and we assumed that it would mean we would interact with everything via a web browser. Yet, over the last couple of years we have seen things moving in quite the opposite direction. The client-server paradigm with fat native clients is back. What is new is that the servers are in a cloud somewhere and can be accessed via web applications in a web browser as well as via native apps.

Several trends interplay and pave the way for the comeback of fat clients, such as the following:
  • Our consumption of rich media such as video, photos and music increases
  • We store more of our information in the cloud (because we want to be able to access it from anywhere)
  • We want highly interactive, reliable and fast user experiences, also when it comes to Internet-based tools
  • We are becoming more and more mobile, using mobile devices such as smartphones and tablets
  • The capacity of the Internet and of broadband connections, especially mobile ones, has a hard time keeping up with ever-increasing traffic volumes
The bottom line is: we need the broadband for shuffling our content, not for downloading applications. Besides, it doesn't make much sense to download an application every time we need it, especially if it's easy to access and install the application locally on a device. App stores make these tasks really easy, nothing like the messy and error-prone installation procedures we have gotten used to with Microsoft Windows.

The risk of downloading malicious code and infecting devices also decreases if it is just content, and not applications, that is downloaded. No code except maybe style sheets and content markup would be downloaded. Security mechanisms, such as encryption and access rights, can be applied to the content itself so that it does not slip away and get into the wrong hands.

Fat clients are back big time, and there is no reason to think they aren't here to stay.


  1. Your post feels a bit contradictory to me.

    * Even if broadband is lagging, it's still developing fast. With 4G connectivity soon in every mobile device we get broadband speed wherever we go. It will keep getting much faster.

    * The development of HTML5 and faster javascript applications (with javascript server side as well as client side) makes up for a paradigm where some calculations are made server side and some client side, where absolute speed is needed.

    * Even native apps on iOS or Android heavily depend on connectivity and server-side calculations. Gowalla, Spotify, streaming video etc.

    * Strong forces, such as Google with its Chrome OS, are working to make clients very THIN again.

    One could rightfully argue that a modern HTML5 web app is thicker than a web app was a few years ago, but the simple fat/thin paradigm is blurred and complicated by all the above developments.

  2. Hi Björn, thanks for your comment.

    I don't see the contradiction. Although 4G is arriving, increasing the capacity of mobile broadband, it is not yet available to most consumers. Even 3G connectivity is far from available everywhere, and where it is available it often does not provide the desired capacity. I believe it is safe to say that consumption of video, music and other rich media via mobile devices is increasing far faster than mobile broadband capacity. 4G is a way to catch up with the increasing demand for mobile broadband capacity.

    Fat clients operating on the client-server paradigm in the 90s worked pretty much in the same ways as the apps you are describing in your comment. There were both lots of calculations happening on the server-side (in stored procedures) and data was also frequently retrieved and submitted to the server from the client. This was of course possible because of the relatively high capacity of LANs. The client usually fetched a dataset, typically a record or a set of records, which was manipulated on the client side and then submitted to the server when it was saved by the user (or automatically saved by the application). 
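    The fetch/edit/save cycle described above can be sketched in a few lines. This is a minimal illustration, not a reconstruction of any particular 90s client; the class and record names are hypothetical, and an in-memory dictionary stands in for the database reached over the LAN.

    ```python
    class RecordServer:
        """Stands in for the server-side data store (e.g. a SQL database)."""

        def __init__(self):
            self._rows = {1: {"name": "Alice", "balance": 100},
                          2: {"name": "Bob", "balance": 250}}

        def fetch(self, ids):
            # The client pulls a snapshot of the requested record set.
            return {i: dict(self._rows[i]) for i in ids}

        def save(self, changes):
            # The client submits all of its local edits in one round trip.
            self._rows.update(changes)


    class FatClient:
        """Manipulates records locally; talks to the server only on fetch/save."""

        def __init__(self, server):
            self.server = server
            self.local = {}

        def open_records(self, ids):
            self.local = self.server.fetch(ids)

        def edit(self, record_id, field, value):
            # All manipulation happens client-side, with no network traffic.
            self.local[record_id][field] = value

        def save(self):
            self.server.save(self.local)


    server = RecordServer()
    client = FatClient(server)
    client.open_records([1, 2])
    client.edit(1, "balance", 175)          # local edit only
    client.save()                           # one submit back to the server
    print(server.fetch([1])[1]["balance"])  # -> 175
    ```

    The point of the pattern is in the comments: the network is touched only at fetch and save, while all interactivity in between runs on the client.
    
    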

    The thing with the fat native clients is that they were rich in terms of functionality and interactivity, and very stable. Partly the stability came from being independent of much other software than the OS (so not depending on a browser) and optimized for the specific OS, and partly from it being compiled and each compiled version tested before it was deployed and installed on the users' computers. 

    People are also getting used to the concept of apps (as opposed to web sites/apps) on their mobile devices. I believe a main factor behind the success of mobile apps is that they provide quick access to rich and stable functionality without the need for good connectivity just to be able to start an application.

    With so much computing power available on the client side, I also think there's a very strong commercial reason for service providers to put more of the load on the client side and less on the server side and on the network.

  3. In my last paragraph, I meant to say there is a lot of money to be saved by service providers if they can put more of the load on the client side.

  4. Thanks Oscar for this enlightening perspective. In light of the commercial success of mobile apps, I've been wondering for some time whether there will be a resurgence of fat clients beyond just the mobile space. So far I haven't seen it...why is that? In the age of big data, using desktop-based applications to deliver highly responsive, interactive, reliable and stable platforms for presenting data, while saving network bandwidth for just piping data content, is an effective and efficient model for companies. Leveraging the computing power of client-side technology could save enormous amounts of resources (IT infrastructure, network traffic, staffing, etc.). Keep in mind that computing power on the client side increases at a faster rate than on the server side, as companies tend to maintain a shorter upgrade cycle for PCs than for servers.