24

I have an intriguing problem on an Azure Website. My site uses 4 script files and 3 style files, each minified. They are not large; the biggest is around 200 KB. The site has already started up and Azure's Always On option is enabled. When I call the Web API for data, it returns in under 50 ms.

[Screenshot: Azure website scripts/styles loading time]

Yet when the app is reloaded, it takes 250 ms just to get the first byte of the tiniest script, and the others take much longer. The initial HTML loads in 60 ms. The scripts/styles are cached, so their content is not re-downloaded, but the time to first byte (TTFB) is killing performance, and this repeats on every single reload. The app has no sophisticated configuration, so it should run much faster than this.

What can cause such problems?

Radosław Maziarka

6 Answers

3

Although your static files are cached, the browser still issues conditional requests with an If-Modified-Since header (which results in a 304).

While the browser doesn't need to download the actual content, it still has to wait for the RTT plus the server's think time before it can continue.

I would suggest two things:

  1. Add Cache-Control and Expires headers. This avoids the 304 round trip in most cases (pretty much unless you hit F5); a web.config sketch follows this list.
  2. Use a proper CDN, such as Incapsula or others, to minimize the RTT and think time. A CDN also makes it easy to control cache settings for the various resources.
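
On an Azure Website (IIS), one way to add those headers for static content is the <clientCache> element in web.config. This is only a minimal sketch; the 7-day max-age is an illustrative value, not a recommendation:

<configuration>
  <system.webServer>
    <staticContent>
      <!-- Send Cache-Control: max-age=604800 so the browser can reuse cached
           scripts/styles without revalidating (no 304 until the max-age expires). -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>

Combine this with versioned file names (e.g. appending a hash or version to the script name) so you can still force an update when a file changes.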

More good stuff here.

Good Luck!

ZigZag_IL
1

From here:

As you saw earlier, IIS 7 caches the compressed versions of static files. So, if a request arrives for a static file whose compressed version is already in the cache, it doesn’t need to be compressed again.

But what if there is no compressed version in the cache? Will IIS 7 then compress the file right away and put it in the cache? The answer is yes, but only if the file is being requested frequently. By not compressing files that are only requested infrequently, IIS 7 saves CPU usage and cache space.

By default, a file is considered to be requested frequently if it is requested two or more times per 10 seconds.

So, the reason your users are being served an uncompressed version of the JavaScript file is that it didn't meet the default threshold for being compressed; in other words, the JavaScript file was not requested twice within 10 seconds.

To control this, there is one attribute to change on the <serverRuntime> element, which controls compression: frequentHitThreshold. To have your file compressed the first time it is requested, change your <serverRuntime> element to look like this:

<serverRuntime enabled="true" frequentHitThreshold="1" />

This will slightly increase CPU usage if you serve many JavaScript files and have frequent traffic, but if your traffic is heavy enough for that compression to affect the CPU, the files are most likely already compressed and cached anyway!
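
For context, a minimal sketch of where that element sits in web.config is below. This assumes the serverRuntime section isn't locked at the host level; on Azure Websites it may be, in which case an applicationHost.xdt transform would be needed instead:

<configuration>
  <system.webServer>
    <!-- Compress static files on the first request instead of waiting for the
         default "two hits within 10 seconds" threshold. -->
    <serverRuntime enabled="true" frequentHitThreshold="1" />
  </system.webServer>
</configuration>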

CC Inc
0

My guess would be Azure's Always On.
If it works anything like the one CloudFlare provides, it essentially proxies the request and tries to cache it.
Depending on the exact implementation of this cache on Azure's side, it might wait for the script's output to complete in order to cache it/validate the cache before passing it on to the browser.

You might try checking the caching configuration and disabling Always On for your scripts, if that is possible.

T4cC0re
0

The scripts and styles are static files and are compressed by default before being sent to the client (you can check this via the HTTP response header Content-Encoding: gzip). So the TTFB consists of network latency, the browser's HTTP connection scheduling, and the server's time to compress the static file.

On the other hand, your Web API data is dynamic and is not compressed by default, so it is possible that its TTFB is lower than the TTFB for static files.

However, you shouldn't switch off static compression; that would minimize TTFB but lengthen the content transfer time. In fact, you don't need to worry much about TTFB by itself; see https://blog.cloudflare.com/ttfb-time-to-first-byte-considered-meaningles/ for more.
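
For reference, both behaviours are controlled by the <urlCompression> element in web.config. A minimal sketch, assuming you wanted dynamic (Web API) responses compressed as well; defaults vary by IIS version, so treat the values as illustrative:

<configuration>
  <system.webServer>
    <!-- Keep compressing static files; additionally compress dynamic responses
         such as Web API output (at some CPU cost per request). -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>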

Shuping
0

I ended up storing the files in Azure Storage and serving them through Azure CDN. It gives very fast responses and costs practically nothing. I upload them to blob storage on every publish, in a pre-build event, using Gulp.

Radosław Maziarka
-12

Well... there are two main problems with your site:

  1. You are using Azure, a high-priced service with poor performance. Don't ask me why people think this is a good service.

  2. You are storing client files side by side with the server files. While server files have to live on a specific server, client files can practically be served from anywhere.

So please use a CDN (or any other server) for your client-side files (mainly CSS and JS; you may consider moving fonts and images as well).

ymz