1

Currently I'm loading a few external sites via iframes to showcase content. However, since those sites are geographically far from my users and from my server, loading them is very slow for my users.

I recently came across the PHP function file_get_contents() and read that it would be faster since it runs server-side. My question is, from a user's perspective: will file_get_contents() load external sites faster for my users than an iframe, or is there some other PHP method faster than an iframe that I can use?

Thanks

  • 4
    Why not try it and see? – Hard worker Feb 26 '13 at 09:16
  • Because right now I'm travelling and not in front of my computer. I was hoping that if I could get some answers now, I could start planning which methods to use by the time I get back. – user2110516 Feb 26 '13 at 09:19

3 Answers

4

If you fetch all these sites on the server before sending anything to the client, the client will see nothing until your server has loaded every single page. If you use iframes, the client loads the sites asynchronously and will see something earlier. If your server periodically fetched these pages and cached them, you'd have an advantage.

But loading the sites on the server and embedding their HTML would also mean a lot of server-side processing to rewrite and fix all the external includes these sites have (stylesheets, JavaScript, images), since they'd now be served from a different domain.

In short: probably not.
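The periodic fetch-and-cache idea mentioned above can be sketched as a small script run from cron. Everything below (paths, names, schedule) is hypothetical; to keep the sketch runnable on its own it fetches a local file:// fixture, whereas a real job would use the external site's http(s) URL:

```shell
#!/bin/sh
# Hypothetical cron job: fetch a remote page and cache it locally, so the
# copy served to clients comes from your own (nearby) server.
# A real crontab line might look like: */10 * * * * /usr/local/bin/cache-remote.sh
mkdir -p /tmp/demo

# Stand-in for the far-away site, so this sketch runs without network access.
printf '<html><body>remote page</body></html>' > /tmp/demo/remote-src.html

URL="file:///tmp/demo/remote-src.html"   # real job: the external site's URL
CACHE="/tmp/demo/cached.html"            # real job: somewhere under your web root

# Download to a temp file, then move it into place atomically so a client
# request never sees a half-written cache file.
curl -fsS "$URL" -o "$CACHE.tmp" && mv "$CACHE.tmp" "$CACHE"
```

The per-request PHP code then just serves (or iframes) the cached local file instead of fetching the remote site on every hit.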

deceze
  • 471,072
  • 76
  • 664
  • 811
  • Regarding the server-side processing, the `<base>` tag should do the trick. See: http://stackoverflow.com/a/12521755/367529. – SunnyRed May 02 '14 at 17:42
3

Think about it, you would act as a proxy:

+---------------+                      +-------------+
| external site |<---far far away------| your server |
+---------------+                      +-------------+
                                              ^
                                              |
                                      +----------------+
                                      | client browser |
                                      +----------------+

This adds another layer but does not bring the external site nearer to anyone. Conclusion: If anything, it will be slower.

Fabian Schmengler
  • 22,450
  • 8
  • 70
  • 104
  • 1
    Well, the server has a better chance of having a better connection to these other servers... but yes, I agree. – deceze Feb 26 '13 at 09:21
0

Problems with using only file_get_contents()

  • With an iframe, the parent page's styling does not affect the iframe's content, while embedding the output of file_get_contents() directly into your page can result in messed-up styling.
  • If your site gets a good amount of traffic, continuously querying the external site from your server might get your server's IP blacklisted by the external server.

The best solution is:

  • Load and save the external file as a cron job.

  • Save the file along with the page requisites (CSS and JS) on your server.

  • Load this local file in an iframe, resulting in asynchronous loading and a fast user experience.
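The first two steps above can be sketched as a single cron entry using wget, which can fetch a page together with its requisites. The schedule, user, cache path and URL below are placeholders, not a recommendation:

```shell
# /etc/cron.d/mirror-external -- hypothetical entry; adjust schedule, user, path and URL.
# --page-requisites downloads the page together with its CSS, JS and images,
# --convert-links rewrites links so the saved copy points at the local assets,
# --adjust-extension saves files with proper .html/.css suffixes.
*/15 * * * * www-data wget --quiet --page-requisites --convert-links --adjust-extension --directory-prefix=/var/www/cache https://www.example.org/
```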

cnvzmxcvmcx
  • 954
  • 2
  • 12
  • 31
  • Thanks for the reply, this sounds like what I should be doing. However, I'm not quite sure what you mean by saving the file along with the page requisites on my server. How would I do that? – user2110516 Feb 26 '13 at 09:46
  • The iframe that loads in the user's browser might require some CSS and JavaScript from the original source. Loading them from the external source will take time, so cache them too and make sure the local HTML file links to them. – cnvzmxcvmcx Feb 26 '13 at 09:50
  • Do you mean I need to run file_get_contents() as a cron job and use it to periodically save the external sites on my server in a variable, and then when a user requests the site, load it from the variable? If I use file_get_contents(), will I need to parse the CSS, JS and other file paths? Thanks. – user2110516 Feb 26 '13 at 09:57
  • Variables cannot be used to save data across requests. You need to save it in a file. Preferably use curl, and overwrite the required files in the cron job. – cnvzmxcvmcx Feb 26 '13 at 10:03
  • Okay, let's say I use curl and save the target site www.mozilla.org. What about all the rest of the pages, though: does this only save the home page plus CSS/JS, or does it save the other pages as well? – user2110516 Feb 26 '13 at 10:07
  • It only saves the homepage, index.html. You need to download the other files like images, CSS and JS separately and make sure they link to each other. You can set a longer cron interval for these files, as they get updated less often than the HTML file. – cnvzmxcvmcx Feb 26 '13 at 10:14