
[Solved] FastCGI blocks on the first CGI process

Added by realnc over 10 years ago

Hello.

I have a big problem. I configured fastcgi like this:

fastcgi.server = (
    ".pl" => ((
        "socket"   => "/var/run/lighttpd/perl-fcgi.socket",
        "bin-path" => "/usr/local/lib/cgi-bin/test_dispatcher.fcgi",
        "check-local"     => "disable",
        "max-procs"       => 10,
    ))
)

This does indeed spawn 10 processes. My test_dispatcher.fcgi is a very simple Perl script:

#!/usr/bin/perl

use CGI::Fast;

{
    # Accept FastCGI requests in a loop; run whichever script was requested.
    while (CGI::Fast->new) {
        do $ENV{SCRIPT_FILENAME};
    }
}

But, lighttpd only uses one of the spawned processes, even though there are 10 available. For example, if I put this Perl script in my document root (/var/www/test.pl):

for ( ; ; )
{ }

And then open that URL with my browser at home:

http://example.com/test.pl

then I can see that one of the spawned FCGI processes is now using 100% CPU. But when I open the same URL in a second browser tab, nothing happens: none of the other nine FCGI processes picks up the request; only the first one keeps running. All the other processes just sit around idling.

The only way to make another FCGI process run is to request a different URL. For example, if I copy test.pl to test2.pl and then open http://example.com/test2.pl, then, and only then, do I see two FCGI processes running, each using 50% CPU.

This is not good. I need all requests to the same URL to be handled concurrently.

I don't know what to do here. I'm surely doing something wrong, but I can't tell what.

I'm running Debian 6.0.10 with the lighttpd that comes with it, which is "lighttpd/1.4.28 (ssl)".

My full lighttpd configuration is:

server.modules = (
        "mod_access",
        "mod_alias",
        "mod_compress",
        "mod_redirect",
        "mod_fastcgi" 
)

server.document-root        = "/var/www" 
server.upload-dirs          = ( "/var/cache/lighttpd/uploads" )
server.errorlog             = "/var/log/lighttpd/error.log" 
server.pid-file             = "/var/run/lighttpd.pid" 
server.username             = "www-data" 
server.groupname            = "www-data" 

index-file.names            = ( "index.php", "index.html",
                                "index.htm", "default.htm",
                               " index.lighttpd.html" )

url.access-deny             = ( "~", ".inc" )

static-file.exclude-extensions = ( ".php", ".pl", ".fcgi" )

include_shell "/usr/share/lighttpd/use-ipv6.pl" 

dir-listing.encoding        = "utf-8" 
server.dir-listing          = "enable" 

compress.cache-dir          = "/var/cache/lighttpd/compress/" 
compress.filetype           = ( "application/x-javascript", "text/css", "text/html", "text/plain" )

include_shell "/usr/share/lighttpd/create-mime.assign.pl" 
include_shell "/usr/share/lighttpd/include-conf-enabled.pl" 

# Custom settings below

server.max-fds = 2048
server.event-handler = "linux-sysepoll" 
server.max-keep-alive-idle = 2
#fastcgi.debug = 1

fastcgi.server = (
    ".pl" => ((
        "socket"   => "/var/run/lighttpd/perl-fcgi.socket",
        "bin-path" => "/usr/local/lib/cgi-bin/perl-dispatcher.fcgi",
        "check-local"     => "disable",
        "max-procs"       => 10,
    )),
)

Replies (3)

RE: FastCGI blocks on the first CGI process - Added by stbuehler over 10 years ago

The only way to make another FCGI process run is to request a different URL. For example, if I copy test.pl to test2.pl and then open http://example.com/test2.pl, then, and only then, do I see two FCGI processes running, each using 50% CPU.

This actually sounds more like a client problem (or some proxy in between). It is possible our balancer is buggy, but it would be weird for that to depend on the URL like this.

So please test with curl instead of the browser as a start.
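For example, something along these lines fires several requests in parallel from the shell (URL taken from the report above; adjust to your host):

```shell
# Four concurrent requests, each over its own connection; with the
# busy-loop test.pl, several backend processes should hit 100% CPU.
for i in 1 2 3 4; do
    curl -s -o /dev/null "http://example.com/test.pl" &
done
wait   # block until all background curls have finished
```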

You can also try spawn-fcgi + multiwatch (instead of bin-path + max-procs): all forks then listen on the same socket, and the kernel does the balancing instead of lighty.
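A rough sketch of that setup, assuming the paths and worker count from the config in the original post (check the spawn-fcgi and multiwatch documentation before relying on the exact flags):

```
# 1) Spawn the backend yourself: multiwatch forks 10 workers that all
#    accept() on the same socket, so the kernel balances the requests.
#
#    spawn-fcgi -s /var/run/lighttpd/perl-fcgi.socket -- \
#        /usr/bin/multiwatch -f 10 /usr/local/lib/cgi-bin/perl-dispatcher.fcgi
#
# 2) Point lighttpd at the existing socket; drop "bin-path"/"max-procs"
#    so lighttpd does not spawn anything itself.
fastcgi.server = (
    ".pl" => ((
        "socket"      => "/var/run/lighttpd/perl-fcgi.socket",
        "check-local" => "disable",
    ))
)
```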

RE: FastCGI blocks on the first CGI process - Added by realnc over 10 years ago

OK, I was using Google Chrome and opened the same URL in multiple tabs. Now I tried wget instead, multiple times in multiple terminals.

This works OK!

I would have expected that the client used to access a URL wouldn't matter.

Everything is fine, though. I'm just curious :-)

RE: FastCGI blocks on the first CGI process - Added by HenrikHolst over 10 years ago

This is probably Chrome trying to limit the number of outgoing connections by using only one connection per identical URL, even though the requests come from different tabs.
