
Hey, first of all, this is a conceptual question and I do not know if StackOverflow is the appropriate place - so my apologies if I am wrong.

Nowadays the web is not only used for passing raw information. Many complex web applications are in use, and they seem so complex that using the HTTP protocol appears irrational: it is built around very simple data exchange, and on top of that it is stateless.

Would it not make more sense to use remote invocations for these web applications? The big advantage of HTTP, to my mind, is a unified GUI through HTML. But there are applications which have no need for a graphical interface, and at that point the HTTP protocol becomes really cumbersome.

  • I think this belongs on Programmers SE, but I'm not sure. Anyone? – Jan 30 '11 at 23:14

4 Answers


Short answer: HTTP is allowed through firewalls where other protocols would be blocked.

Eric Giguere

A short, partial answer: first, for historical reasons - HTTP has been used since the dawn of the web as the protocol for requesting documents, and has since been put to many different purposes. One reason to keep using it is that it is generally served on port 80, which you can be fairly sure won't be blocked by firewalls between your client and the server. The statelessness of the protocol may not always be what you want, but it at least has the advantage of protecting the server side from very trivial overloading problems.
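To make the statelessness point concrete, here is a minimal sketch using the Python standard library (example.com is just a placeholder host): every HTTP request carries everything the server needs, so no session has to be kept between calls.

    # Minimal illustration of a stateless HTTP exchange over port 80.
    # "example.com" is only a placeholder host for this sketch.
    import http.client

    def fetch(path):
        # Every request opens its own connection and describes itself
        # completely; the server keeps no session between calls.
        conn = http.client.HTTPConnection("example.com", 80)
        conn.request("GET", path)
        resp = conn.getresponse()
        body = resp.read()
        conn.close()
        return resp.status, body

    # Two independent requests - neither relies on state left by the other.
    print(fetch("/")[0])
    print(fetch("/")[0])

Any state the application does need (a session ID, a cookie) has to be sent along explicitly with each request, which is exactly what keeps the server side simple and cheap to scale.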

sinelaw

  • OS independence
  • firewall passing
  • the web server is already a well-understood and mostly "solved" problem in terms of load balancing, server failover, etc.
  • don't have to reinvent the wheel
Paul Tomblin

Other protocols are being used more and more now, including remote invocations and (the one I am particularly familiar with) WCF (which allows binary TCP/IP data transfer).

This allows data to travel faster for applications which require more bandwidth. For example, an n-tier application may use WCF binary transfer between the application and presentation tiers. Public web services can also expose multiple protocols, including binary ones.

For data transfer protocols, firewalls should be configured (i.e. expose a port specifically for your application), not worked around; I would not recommend choosing a protocol just because firewalls do not block it.

The protocol you use really depends on who will consume it and how much control you have over that consumption - e.g. external third parties may need a plain-text version with a commonly agreed data interface. On the other hand, two tiers in a single web application may be able to utilise binary data transfer for performance and security.
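To make the plain-text versus binary trade-off concrete, here is a rough sketch in Python (not WCF; the length-prefixed field layout is invented purely for illustration) comparing the same record serialised as JSON text and as a compact binary frame, as one tier might send it to another:

    # Sketch: the same record serialised as plain JSON text versus a
    # compact, length-prefixed binary frame. The field layout (!IdH) is
    # invented for illustration, not any real wire format.
    import json
    import struct

    record = {"id": 42, "price": 19.99, "qty": 7}

    # Plain-text form: easy for external parties to read, debug and agree on.
    text_payload = json.dumps(record).encode("utf-8")

    # Binary form: unsigned int, double, unsigned short, network byte order,
    # prefixed with its own length so the receiver knows where it ends.
    body = struct.pack("!IdH", record["id"], record["price"], record["qty"])
    binary_payload = struct.pack("!I", len(body)) + body

    print(len(text_payload), "bytes as JSON text")
    print(len(binary_payload), "bytes as binary frame")

    # The receiving tier reverses the framing to recover the fields.
    (length,) = struct.unpack("!I", binary_payload[:4])
    print(struct.unpack("!IdH", binary_payload[4:4 + length]))

The binary form is smaller and cheaper to parse, but it is opaque to anyone who does not share the exact layout - which is why an agreed, plain-text interface tends to win for external consumers.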

Russell