33

I'm calling the php code from ajax like this:

ajaxRequest.open("GET", "func.php" + queryString, true);

Since it's a get request anyone can see it by simply examining the headers. The data being passed is not sensitive, but it could potentially be abused since it is also trivial to get the parameter names.

How do I prevent direct access to http://mysite/func.php yet allow my ajax page access to it?

Also, I have tried the solution posted here but it doesn't work for me - I always get the 'Direct access not premitted' message.

jriggs
  • So you basically want your script to be accessible via AJAX but not if I type in the URI? – F.P Nov 18 '09 at 15:09
  • Is there a reason why you can't use the .htaccess described in the link you posted? – Marek Karbarz Nov 18 '09 at 15:30
  • Zarembisty: .htaccess is NOT an option in this case. even if they were using apache. The page HAS to be available to http requests because it must be available through AJAX. If you blocked it, you'd block the AJAX too – Palantir Nov 18 '09 at 15:36
  • @Palantir - of course .htaccess is an option - you can block access from everything but your own server; obviously it's not an option with IIS, but with apache it very much is a viable alternative – Marek Karbarz Nov 18 '09 at 15:49
  • Your PHP backend [should be an API](http://devblog.supportbee.com/2011/08/10/the-pros-and-cons-of-developing-a-complete-javascript-ui/) to the web-based presentation layer. If it's a problem that users can access the API manually, then the API is broken. Otherwise, this is a non-issue and you can move on to work on something constructive. – Lightness Races in Orbit Aug 19 '11 at 21:44
  • I know this is an old thread but... @MarekKarbarz since AJAX requests come from the client, not the server, wouldn't your .htaccess solution in fact NOT be viable? – themerlinproject Jun 03 '12 at 14:50
  • @themerlinproject - old thread indeed. I don't know what was going through my head when I posted those comments, but I completely agree that .htaccess is not an option if AJAX is to be used from the client. – Marek Karbarz Jun 03 '12 at 19:07
  • Similar: [Prevent direct access to a PHP page](http://stackoverflow.com/questions/185483/prevent-direct-access-to-a-php-page) – Theraot Feb 28 '16 at 16:13

12 Answers

48

Most Ajax frameworks set a particular request header that you can use to distinguish Ajax from non-Ajax requests. I use this to help determine the response type (JSON/HTML) in plenty of projects:

if( isset( $_SERVER['HTTP_X_REQUESTED_WITH'] ) && ( $_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest' ) )
{
    // allow access....
} else {
    // ignore....
} 

edit: You can add this header yourself in your own Ajax requests with the following in your javascript code (note that setRequestHeader() must be called after open()):

var xhrobj = new XMLHttpRequest();
xhrobj.open("GET", "func.php" + queryString, true);
xhrobj.setRequestHeader("X-Requested-With", "XMLHttpRequest");
Ian Van Ness
  • Although not 100% effective (in the case headers are spoofed), this is a very quick and easy solution to only allow AJAX calls to the page. +1 – Corey Ballou Nov 18 '09 at 19:13
  • Ah...this is the type of solution I expected to get. Unfortunately, I'm not using any frameworks - just straight javascript/ajax. Any rate, the variable isn't set with my code so this doesn't work for me. – jriggs Nov 18 '09 at 19:57
  • I try to implement that with Google Maps calling a php script that returns XML data. However, GDownloadUrl("", function(data) { var xml = GXml.parse(data); (...)) } doesn't allow me to determine the response type that way in my PHP script. – richey Jan 08 '13 at 02:22
  • -1 This does not prevent any abuse, since the HTTP header can be trivially spoofed. – IMSoP Apr 23 '13 at 20:41
  • It's a shame that even after this many comments there's still no good, discreet and succinct solution posted in the answers. Where are all the gurus... we need you guys, please. – Mohd Abdul Mujib Dec 06 '13 at 12:32
  • @9kSoft — There isn't a solution because the question is trying to make the user's software hide information from the user. That isn't possible. – Quentin Jan 26 '15 at 22:56
  • Nice, quick, easy solution. Whilst it's still not 100% protected, it will block most attempts via a standard browser. – jaseeey May 01 '15 at 14:27
  • A good solution would be a server-generated token which is sent in the javascript request and validated on the server again. – Alex2php Aug 10 '16 at 13:59
  • The various header-based solutions are useful for preventing users from *accidentally* accessing pages they shouldn't but won't stop a knowledgeable hacker. So... It is useful for some scenarios but shouldn't be considered truly secure. Overall, this is a **great** question though. – Stephen R Jul 11 '19 at 16:43
  • Cannot use isset() on the result of an expression (you can use "null !== expression" instead) – Kamel Labiad Jul 13 '20 at 05:43

11

What I use is PHP sessions plus a hash that is sent with each request. The hash is generated server-side using some algorithm.
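For instance, a minimal sketch of that idea, assuming the hash is an HMAC of the session ID with a server-side secret (the "token" parameter name, the secret, and the derivation are my own illustration, not from the answer):

<?php
// func.php - sketch: the calling page computed the same HMAC and appended it
// to the Ajax request as a "token" parameter.
session_start();

$secret   = 'long-random-server-side-secret';            // known only to the server
$expected = hash_hmac('sha256', session_id(), $secret);  // same value embedded in the calling page

if (!isset($_GET['token']) || !hash_equals($expected, (string) $_GET['token'])) {
    http_response_code(403);
    exit;
}
// ...normal func.php work continues here...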

Gabriel Sosa

7

Mmm... you could generate a one-time password on session start, store it in $_SESSION, and add a parameter to your ajax call that re-transmits it (something like a captcha). It would be valid for that session only.

This would shield you from automated attacks, and although a human who has access to your site could still do it manually, it could be the base for devising something more complicated.
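A rough sketch of that approach, assuming PHP 7+ for random_bytes() and a query parameter called "token" (both illustrative choices):

<?php
// Page that renders the JavaScript: create the one-time/per-session code.
session_start();
if (!isset($_SESSION['ajax_otp'])) {                      // "ajax_otp" is an illustrative key name
    $_SESSION['ajax_otp'] = bin2hex(random_bytes(32));    // embed this in the page and append it to the Ajax call
}

<?php
// func.php: only honour requests that echo the code back for this session.
session_start();
if (!isset($_GET['token'], $_SESSION['ajax_otp'])
    || !hash_equals($_SESSION['ajax_otp'], (string) $_GET['token'])) {
    exit;
}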

Palantir
  • Thanks - this seems like overkill for my scenario. The page I call is just sending an email - the parameter passed is the email address. I really only asked this question because I'm still pretty new to ajax, and this seems like something that may be an issue in future applications. – jriggs Nov 18 '09 at 15:28
5

Anyone in this thread who suggested looking at headers is wrong in some way or other. Anything in the request (HTTP_REFERER, HTTP_X_REQUESTED_WITH) can be spoofed by an attacker who isn't entirely incompetent, including shared secrets [1].

You cannot prevent people from making an HTTP request to your site. What you want to do is make sure that users must authenticate before they make a request to some sensitive part of your site, by way of a session cookie. If a user makes unauthenticated requests, stop right there and give them an HTTP 403.
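For example, a minimal sketch of that early check, assuming your login code stores a $_SESSION['user_id'] entry (the key name is illustrative):

<?php
// Top of the sensitive script: refuse unauthenticated requests immediately.
session_start();
if (empty($_SESSION['user_id'])) {   // set by your login code; key name is illustrative
    http_response_code(403);
    exit;
}
// ...authenticated work continues here...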

Your example makes a GET request, so I guess you are concerned with the resource requirements of the request [2]. You can do some simple sanity checks on HTTP_REFERER or HTTP_X_REQUESTED_WITH headers in your .htaccess rules to stop new processes from being spawned for obviously fake requests (or dumb search-crawlers that won't listen to robots.txt), but if the attacker fakes those, you'll want to make sure your PHP process quits as early as possible for non-authenticated requests.

[1] It's one of the fundamental problems with client/server applications. Here's why it doesn't work: Say you had a way for your client app to authenticate itself to the server - whether it's a secret password or some other method. The information that the app needs is necessarily accessible to the app (the password is hidden in there somewhere, or whatever). But because it runs on the user's computer, that means they also have access to this information: All they need is to look at the source, or the binary, or the network traffic between your app and the server, and eventually they will figure out the mechanism by which your app authenticates, and replicate it. Maybe they'll even copy it. Maybe they'll write a clever hack to make your app do the heavy lifting (You can always just send fake user input to the app). But no matter how, they've got all the information required, and there is no way to stop them from having it that wouldn't also stop your app from having it.

[2] GET requests in a well-designed application have no side-effects, so nobody making them will be able to make a change on the server. Your POST requests should always be authenticated with session plus CSRF token, to let only authenticated users call them. If someone attacks this, it means they have an account with you, and you want to close that account.

Enno

4

I would question why you are so convinced that no-one should be able to visit that file directly. Your first action really should be to assume that people may visit the page directly and act around that eventuality. If you are still convinced you want to close access to this file, then you should know that you cannot trust $_SERVER variables for this, as the origins of $_SERVER can be difficult to determine and the values of the headers can be spoofed. In some testing I did, I found those headers ($_SERVER['HTTP_REFERER'] and $_SERVER['HTTP_X_REQUESTED_WITH']) to be unreliable as well.

robjmills
  • Because of bots and hackers, scrapers, DDoSers; the internet is a nasty place - you give them something and people will try to use it to destroy you. – Michael Rogers Jul 03 '17 at 10:19
2

Put the following code at the very top of your php file that is called by ajax. It will serve ajax requests, but will "die" if it is called directly from the browser.

define('AJAX_REQUEST', isset($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest');
if(!AJAX_REQUEST) {die();}

Personally, I choose not to output anything after "die()", as an extra security measure. That is, I prefer to show just a blank page to the "intruder", rather than giving out hints about whether or why this page is protected.

Theo Orphanos

2

I solved this problem by preparing a check function that does three things:

  1. check the referer, $_SERVER['HTTP_REFERER'];
  2. check the X-Requested-With header, $_SERVER['HTTP_X_REQUESTED_WITH'];
  3. check the origin via a bridge file

If all three pass, you succeed in reaching the php file called by ajax; if even one fails, you don't get it.

Points 1 and 2 were already explained; the bridge file solution works like this:

Bridge File

Imagine the following scenario:

Page A.php calls B.php via ajax, and you want to prevent direct access to B.php.

  • 1) when page A.php is loaded, it generates a complicated random code
  • 2) the code is copied into a file C.txt that is not directly accessible from the web (secured in the httpd configuration)
  • 3) at the same time, this code is embedded in plain text in the rendered html of the A.php page (for example as an attribute of body), e.g.:

    data-bridge="ehfwiehfe5435ubf37bf3834i"

  • 4) this embedded code is retrieved by javascript and sent via an ajax POST request to B.php

  • 5) B.php receives the code and checks whether it exists in the C.txt file
  • 6) if the code matches, it is popped out of C.txt and the B.php page is accessible
  • 7) if no code is sent (when you try to access the B page directly) or it doesn't match (when you supply an old, captured code or a made-up one), B.php dies.

In this way you can access the B page only via an ajax call generated from the parent page A. The key for pageB.php is only ever handed out by pageA.php.
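A minimal sketch of steps 5-7 in B.php, assuming C.txt stores one code per line in a folder outside the web root (the path and the "bridge" POST parameter name are illustrative):

<?php
// B.php - only serve requests that present a code currently stored in C.txt
$bridgeFile = '/var/www/private/C.txt';                 // not reachable over HTTP (illustrative path)
$code = isset($_POST['bridge']) ? trim($_POST['bridge']) : '';

$codes = is_file($bridgeFile)
    ? file($bridgeFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
    : array();
$index = array_search($code, $codes, true);

if ($code === '' || $index === false) {
    die();                                              // direct access, stale code, or invented code
}

// "pop" the code so it can only be used once (no file locking shown here)
unset($codes[$index]);
file_put_contents($bridgeFile, implode("\n", $codes));

// ...rest of B.php...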

Edoardo
  • Instead of generating `C.txt` file. Would it not be wiser to have a `$_SESSION['random_key']` variable which is generated upon `session_start()` and regenerated every time a check takes place? – Uzair Hayat Dec 30 '18 at 22:55
1

There is no point in doing this. It doesn't add any actual security.

All the headers that indicate that a request is being made via Ajax (like HTTP_X_REQUESTED_WITH) can be forged on the client side.

If your Ajax is serving sensitive data, or allowing access to sensitive operations, you need to add proper security, like a login system.

Pekka

0

I tried this

1) In the main php file (the one from which the ajax request is sent), create a session value with some random content, like $_SESSION['random_value'] = 'code_that_creates_something_random';. Make sure the session value is created above the $.post call.

2) then

$.post( "process_request.php", 
{ 
input_data:$(':input').serializeArray(),
random_value_to_check:'<?php echo htmlspecialchars( $_SESSION['random value'], ENT_QUOTES, "UTF-8"); ?>' 
}, function(result_of_processing) {
//do something with result (if necessary)
});

3) and in process_request.php

session_start(); // needed so $_SESSION is available here too

if (isset($_POST['random_value_to_check']) &&
    trim($_POST['random_value_to_check']) == trim($_SESSION['random_value'])) {
    // do what is necessary
}

Before, I defined the session value, then a hidden input field with the session value, and then sent the value of the hidden input field with ajax. But then I decided that the hidden input field is not necessary, because the value can be sent without it.

Andris

0

I have a simplified version of Edoardo's solution.

  1. Web page A creates a random string, a [token], and saves a file with that name on disk in a protected folder (e.g. with an .htaccess containing Deny from all on Apache).

  2. Page A passes the [token] along with the AJAX request to the script B (in OP's queryString).

  3. Script B checks if the [token] filename exists and if so it carries on with the rest of the script, otherwise exits.

  4. You will also need to set up some cleaning script, e.g. with cron, so the old tokens don't accumulate on disk.

It is also good to delete the [token] file right away in script B to limit repeated requests.

I don't think that HTTP headers check is necessary since it can be easily spoofed.
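A brief sketch of step 3 in script B, assuming the protected folder is called tokens/ and the token travels as a "token" query parameter (illustrative names):

<?php
// Script B: the token passed with the Ajax request must exist as a file on disk.
$tokenDir = __DIR__ . '/tokens/';                                   // protected, e.g. "Deny from all"
$token    = isset($_GET['token']) ? basename($_GET['token']) : '';  // basename() blocks path traversal

if ($token === '' || !is_file($tokenDir . $token)) {
    exit;
}
unlink($tokenDir . $token);   // delete right away to limit repeated requests

// ...rest of script B...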

Maciek Rek

-1

Based on your description, I assume you're trying to prevent outright rampant abuse, but don't need a rock-solid solution.

From that, I would suggest using cookies:

Just setcookie() on the page that is using the AJAX, and check $_COOKIE for the correct values on func.php. This will give you some reasonable assurance that anyone calling func.php has visited your site recently.

If you want to get fancier, you could set and verify unique session ids (you might do this already) for assurance that the cookie isn't being forged or abused.
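A rough sketch of the basic cookie check described above, with illustrative cookie name, value, and lifetime:

<?php
// Page that serves the Ajax-using HTML: drop a short-lived marker cookie.
setcookie('recent_visit', '1', time() + 3600, '/');

<?php
// func.php: refuse callers that don't carry the marker cookie.
if (!isset($_COOKIE['recent_visit']) || $_COOKIE['recent_visit'] !== '1') {
    http_response_code(403);
    exit;
}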

anschauung

-2

I tried many of the suggestions, but none of them solved the problem. In the end I protected the target php file's parameters, and that was the only way I found to limit direct access to the php file. Note: putting the php file behind an .htaccess restriction broke the ajax connection in the main Html page.