Using suexec for Apache


The challenge with securing a shared hosting server is protecting each website from attack, both from the outside and from the inside. PHP has built-in features to help, but ultimately it’s the wrong place to address the problem.

So what can Apache do to help?

It turns out that there are quite a few alternative ways that Apache can help. This article will look at what we can do with stock Apache, and the next few articles will look at what we can do with some interesting third-party Apache modules.

* suexec: Running CGI Programs As A Specified User
* Configuring Apache With PHP/CGI
* Configuring suexec For PHP/CGI
* Configuring suexec For Shared Servers
* Some Benchmarks
* Other Considerations
* Conclusions

suexec: Running CGI Programs As A Specified User

To secure a shared hosting server, we want to be able to run PHP as the user who owns that particular website. One way to do this with stock Apache is with suexec.

suexec is a standard part of Apache (the mod_suexec module plus the suexec helper binary) which allows you to run a CGI executable as a specified user and group. CGI executables date back to the very early days of the web, back when we all had to use Perl to create dynamic websites. Although PHP is commonly run as an Apache module, it still provides full support for running as a CGI executable.

Check with your Linux vendor to make sure that you have PHP/CGI installed on your box.
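
A quick way to check is to ask the CGI binary for its version; on most distributions it is installed as /usr/bin/php-cgi:

/usr/bin/php-cgi -v

The version banner should mention cgi or cgi-fcgi rather than cli.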

Configuring Apache With PHP/CGI

The first step for getting suexec working is to configure Apache to run PHP as a CGI executable, instead of using mod_php. Add the following configuration to your httpd.conf file:

ScriptAlias /php5-cgi /usr/bin/php-cgi
Action php5-cgi /php5-cgi
AddHandler php5-cgi .php
DirectoryIndex index.php

… and add the following line to your virtual host:

AddHandler php5-cgi .php

In your httpd.conf file (or in one of the files that httpd.conf includes), there will be a <Directory> entry for the directory on disk where your virtual host is stored. Inside that entry, there should be an “Options” line, which might look like this:

Options Indexes FollowSymLinks

Add “ExecCGI” to the end of your Options line.
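
With ExecCGI added, the line above ends up looking like this:

Options Indexes FollowSymLinks ExecCGI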

Make sure to comment out mod_php from Apache. Then, restart Apache, and do some testing to make sure that PHP 5 is working.
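
What that looks like varies by distribution, but roughly: comment out the LoadModule line for mod_php in httpd.conf (the module path below is illustrative), restart Apache, and load a phpinfo() page to confirm which SAPI is answering:

#LoadModule php5_module modules/libphp5.so

apachectl restart
echo '<?php phpinfo();' > /var/www/localhost/htdocs/info.php

The “Server API” line on that page should now read CGI (or CGI/FastCGI) rather than Apache 2.0 Handler.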

For reference, here is the Apache config from my test system:

ScriptAlias /php5-cgi /usr/bin/php-cgi
Action php5-cgi /php5-cgi
AddHandler php5-cgi .php
DirectoryIndex index.php index.phtml

<VirtualHost *:80>
    DocumentRoot /var/www/localhost/htdocs

    <Directory /var/www/localhost/htdocs>
        Options Indexes FollowSymLinks ExecCGI
        AllowOverride All
        Order allow,deny
        Allow from all
    </Directory>

    AddHandler php5-cgi .php
</VirtualHost>

Configuring suexec For PHP/CGI

With Apache now running PHP as a CGI executable, we’re ready to get Apache running PHP as the owner of each website.

In your test virtual host, add the following:

SuexecUserGroup stuart users

Replace “stuart” with the user who owns the website, and replace “users” with the group that the user belongs to. This sets the privileges that PHP will run as.

To ensure the security of your server, suexec is very particular about what conditions must be met before it will execute your PHP scripting engine. A full list of conditions can be found in the Apache docs. To make sense of the conditions, you’ll need to know what settings your copy of suexec has been compiled with. Run the command suexec -V to find out your system’s settings. This is the output from my Seed Linux LAMP Server system:

belal vhosts.d # suexec -V
-D AP_DOC_ROOT="/var/www"
-D AP_GID_MIN=100
-D AP_HTTPD_USER="apache"
-D AP_LOG_EXEC="/var/log/apache2/suexec_log"
-D AP_SAFE_PATH="/usr/local/bin:/usr/bin:/bin"
-D AP_SUEXEC_UMASK=077
-D AP_UID_MIN=1000
-D AP_USERDIR_SUFFIX="public_html"

The first condition (and one that isn’t obvious from the Apache manual!) is that the PHP CGI executable must be installed under AP_DOC_ROOT. Chances are that it isn’t installed there at the moment, so go ahead and copy it there.

mkdir /var/www/localhost/cgi-bin
cp /usr/bin/php-cgi /var/www/localhost/cgi-bin

The second condition is that the PHP CGI executable must be owned by the same user and group you listed in the SuexecUserGroup statement earlier. This causes problems for shared hosting; I’ll show you how to fix that later in this article.

chown stuart:users /var/www/localhost/cgi-bin/php-cgi

Update your Apache httpd.conf file to use this copy of PHP:

ScriptAlias /php5-cgi /var/www/localhost/cgi-bin/php-cgi

Restart Apache, and test to make sure that PHP 5 is still working. You should also start to see log messages appearing in AP_LOG_EXEC. This is the first place to look if PHP isn’t working (although the log messages can be a little terse and cryptic).
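
On my test system that log is the AP_LOG_EXEC path shown earlier, so a quick way to watch it while testing is:

tail -f /var/log/apache2/suexec_log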

For reference, here is the Apache config from my test system:

ScriptAlias /php5-cgi /var/www/localhost/cgi-bin/php-cgi
Action php5-cgi /php5-cgi
AddHandler php5-cgi .php
DirectoryIndex index.php index.phtml

<VirtualHost *:80>
    DocumentRoot /var/www/localhost/htdocs

    <Directory /var/www/localhost/htdocs>
        Options Indexes FollowSymLinks ExecCGI
        AllowOverride All
        Order allow,deny
        Allow from all
    </Directory>

    SuexecUserGroup stuart users
    AddHandler php5-cgi .php
</VirtualHost>

Configuring suexec For Shared Servers

I mentioned earlier that there was a problem with using suexec + PHP/CGI on shared servers - the very environment where suexec is needed the most :( In one of the steps above, we created a copy of the PHP CGI executable, and changed its ownership on disk to match the ownership of the website.

chown stuart:users /var/www/localhost/cgi-bin/php-cgi

What happens when we have two websites, each owned by a different user? Or five, or ten, or hundreds? Apache’s suexec will refuse to re-use this copy of the PHP CGI executable for each of the websites, because it isn’t owned by the right user and group.

Each website needs its own copy of the PHP CGI executable, owned by the user and group that owns the website itself. We don’t want to create hundreds of copies of the actual PHP CGI executable (it’s a large waste of space, and a pain for managing PHP upgrades), so instead we can point each website at its own copy of a simple bash script:

#!/bin/bash
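# Hand off to the central PHP CGI binary, passing through whatever arguments Apache supplied.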

/usr/bin/php-cgi "$@"

This script simply executes our central copy of the PHP CGI executable, passing through whatever parameters Apache has called the bash script with.
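
Each website then gets its own copy of this wrapper, under AP_DOC_ROOT and owned by that site’s user, just like the real php-cgi binary was earlier. Something along these lines, where the site name, the user “site1” and the wrapper’s source location are all made up for illustration:

mkdir -p /var/www/site1/cgi-bin
cp /usr/local/share/php-wrapper.sh /var/www/site1/cgi-bin/php-cgi
chown site1:users /var/www/site1/cgi-bin/php-cgi
chmod 755 /var/www/site1/cgi-bin/php-cgi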

To configure Apache to use this script, simply move the ScriptAlias statement from outside the VirtualHost config to inside it, and point it at that website’s own copy of the wrapper script.
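
As a sketch, a shared-hosting virtual host might then look something like this (the server name, paths, and user/group are hypothetical; the Action directive stays in httpd.conf as before):

<VirtualHost *:80>
    ServerName site1.example.com
    DocumentRoot /var/www/site1/htdocs

    <Directory /var/www/site1/htdocs>
        Options Indexes FollowSymLinks ExecCGI
        AllowOverride All
        Order allow,deny
        Allow from all
    </Directory>

    SuexecUserGroup site1 users
    ScriptAlias /php5-cgi /var/www/site1/cgi-bin/php-cgi
    AddHandler php5-cgi .php
</VirtualHost>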

Some Benchmarks

Because Apache has to execute a new suexec process for every page hit (and suexec in turn executes a new PHP CGI process for every page hit), it’s going to be slower than running mod_php. But how much slower? To find out, I used Apache’s ab benchmarking program to load a phpinfo() page 1000 times. I ran the benchmark five times and averaged the results.
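
If you want to reproduce this, an ab invocation along these lines will do (the URL is whatever your phpinfo() page is; -n sets the number of requests):

ab -n 1000 http://localhost/info.php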

* suexec: average of 127.219 seconds
* suexec + bash script: average of 134.836 seconds
* mod_php: average of 3.753 seconds

suexec on its own is some 34 times slower than using mod_php; suexec + the bash script needed for shared hosting environments is even worse, at 36 times slower than using mod_php.

This benchmark doesn’t provide the full picture. Once you take into account the extra memory used by the suexec method, and the extra memory and CPU (and process context switches!) required to transfer output from PHP/CGI to Apache to send back to the website’s user, the final cost of using suexec + PHP/CGI will be substantially higher.

Other Considerations

Performance isn’t the only thing to think about when evaluating suexec + PHP/CGI.

* suexec + PHP/CGI does solve the security challenge, without requiring your application to support safe_mode.
* HTTP authentication is only supported by mod_php, not PHP/CGI (see the sketch below). If your application relies on this, then suexec + PHP/CGI is not for you.
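
To make that second point concrete, here is the sort of HTTP Basic authentication code that works under mod_php but fails under PHP/CGI: Apache does not pass the Authorization header through to CGI programs, so PHP_AUTH_USER is never populated and every visitor stays stuck at the login prompt.

<?php
// Works under mod_php; under PHP/CGI $_SERVER['PHP_AUTH_USER'] is never set.
if (!isset($_SERVER['PHP_AUTH_USER'])) {
    header('WWW-Authenticate: Basic realm="Members Area"');
    header('HTTP/1.0 401 Unauthorized');
    echo 'Authentication required';
    exit;
}
echo 'Hello, ' . htmlspecialchars($_SERVER['PHP_AUTH_USER']);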

Conclusions

Apache’s suexec mechanism does secure a shared hosting server against attack from within. However, this comes at a heavy performance cost, which will inevitably translate into needing lots of extra servers - and that is expensive.

So, if Apache itself doesn’t come with a solution that’s worth a damn, maybe there are third-party solutions out there that can do a better job? The next article in the series will take a look at what others have done to try and plug this gap.