How to run in a container (with Docker)

Containers can be built and used in many different ways.
In these examples Docker is used, but other container tools such as Podman, containerd, CRI-O, Buildah, Kaniko, or img can be used interchangeably in most scenarios.

Why use containerization?

The power of containerization lies in its ability to isolate and package applications in a lightweight and reproducible way.
Especially for production environments, containers enable reliable, secure, and consistent builds across different systems and deployment targets.

For developers, containers also provide the flexibility to bundle lighttpd (or any other service) together with debugging tools, shells, and diagnostic utilities, allowing for quick testing, debugging, and prototyping.

In short, containerization offers a wide range of options for different use cases, all of which can be clearly separated and managed independently.

Examples

Two common use cases are demonstrated below:
  • Production builds – a minimal container image containing only what is necessary to run the application.
  • Development builds – an image including standard tools and debugging utilities for testing and troubleshooting.

Disclaimer:
These examples are provided for demonstration purposes and are optimized for specific scenarios.
If you want to try them yourself, make sure to adapt and test them within your own environment, workload, and configuration.
Results will vary depending on your infrastructure, traffic, and parameters, and cannot be generalized to all use cases.

Development build

Here is an example build, where we have the following directory structure:

 .
 ├─ Dockerfile
 ├─ lighttpd.conf
 └─ html
    └─ index.html

Inside lighttpd.conf, the only change from the stock configuration is to comment out the error log, so errors go to stderr and show up in the container logs:

#server.errorlog             = "/var/log/lighttpd/error.log" 

Inside Dockerfile:

FROM debian:trixie

RUN apt-get update && apt-get install -y lighttpd media-types nano

COPY html /var/www/html
COPY lighttpd.conf /etc/lighttpd/lighttpd.conf

EXPOSE 80

CMD ["lighttpd", "-D", "-f", "/etc/lighttpd/lighttpd.conf"]

When you now build the image with sudo docker build -t test . and run it with sudo docker run --rm -p 8083:80 -it test, you can see your website at http://localhost:8083/.
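
For reference, the same commands as a shell snippet (the final curl check is just a quick sanity test, run from another terminal while the container is up):

sudo docker build -t test .
sudo docker run --rm -p 8083:80 -it test
# in another terminal: should return the contents of html/index.html
curl -s http://localhost:8083/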

Production build

In this setup I use a static lighttpd build that serves pre-compressed files from a Node/Angular project.
When you pre-compress the files at build time, lighttpd does not need to compress them on the fly.
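
As a minimal illustration (index.html here is just an example file; the full find invocation used in the build stage appears below), pre-compressing a single file looks like this:

# brotli -k keeps the original and writes index.html.br
brotli -k --best index.html
# zopfli keeps the original and writes a gzip-compatible index.html.gz
zopfli --i1000 index.html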

In the Dockerfile you can use multiple stages to end up with a very small final image.

First we need to build the static binary with some extensions (Lua, PCRE) to enable mod_indexfile, mod_magnet, mod_staticfile and the scripts used below.

First stage Dockerfile:

FROM python:3.13.5 AS build

RUN apt update && \
  apt install -y lua5.4 liblua5.4-dev && \
  python -m pip install scons && \
  release=$(curl -s https://download.lighttpd.net/lighttpd/releases-1.4.x/latest.txt) && \
  curl -s -o $release.tar.xz https://download.lighttpd.net/lighttpd/releases-1.4.x/$release.tar.xz && \
  tar xvJf $release.tar.xz && \
  cd $release && \
  scons -c
RUN release=$(curl -s https://download.lighttpd.net/lighttpd/releases-1.4.x/latest.txt) && \
  cd "$release" && scons -j $(nproc) build_fullstatic=1 with_lua=1 with_pcre2=1 build_dynamic=0 prefix=/usr/local install && \
  mkdir -p /temp-dirs/run /temp-dirs/var/tmp /temp-dirs/scripts
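
If you want to sanity-check this stage on its own, one option (a sketch; the tag name is just an example, and it assumes all stages live in .docker/Dockerfile as shown further below) is to build only this stage and print the version and compiled-in features of the resulting binary:

sudo docker build --target build -t lighttpd-static -f .docker/Dockerfile .
sudo docker run --rm lighttpd-static /usr/local/sbin/lighttpd -V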

The second stage builds the node project and pre-compresses the static assets; both brotli -k and zopfli keep the original files and write .br and .gz variants next to them:

FROM node:24.2.0-alpine AS assets

RUN apk add --no-cache zopfli brotli
WORKDIR /app
COPY ["package-lock.json", "package.json", "."]
RUN npm ci
COPY ["angular.json", "tsconfig.app.json", "tsconfig.json", "tsconfig.spec.json", "."]
COPY src src
COPY public public
RUN npx ng build && \
  mv dist/lowcode-frontend/3rdpartylicenses.txt dist/lowcode-frontend/browser && \
  mv dist/lowcode-frontend/prerendered-routes.json dist/lowcode-frontend/browser && \
  find dist/lowcode-frontend/browser -type f \( \
    -name "*.html" -o -name "*.htm" -o -name "*.css" -o -name "*.csv" -o \
    -name "*.xml" -o -name "*.js" -o -name "*.json" -o -name "*.rss" -o \
    -name "*.atom" -o -name "*.xhtml" -o -name "*.ttf" -o -name "*.eot" -o \
    -name "*.otf" -o -name "*.woff" -o -name "*.woff2" -o -name "*.wasm" -o \
    -name "*.svg" -o -name "*.txt" -o -name "*.md" -o -name "*.map" \
  \) -exec sh -c 'echo "Compressing $1"; brotli -k --best "$1"; zopfli --i1000 "$1"' _ {} \;

The last stage packs everything together:

FROM scratch

COPY --from=build /usr/local/sbin/lighttpd /lighttpd

COPY .docker/lighttpd.conf /lighttpd.conf

COPY --from=build /temp-dirs/ /
COPY --from=assets /app/dist/lowcode-frontend/browser /var/www/html

COPY .docker/content-negotiation.lua /scripts/content-negotiation.lua

ENV PATH=$PATH:/
ENTRYPOINT ["lighttpd", "-D", "-f", "/lighttpd.conf"]

The files are structured as follows:

 .
 ├─ .docker
 │  ├─ content-negotiation.lua
 │  ├─ lighttpd.conf
 │  └─ Dockerfile
 └─ node project files (angular.json, package.json, src, public, etc.)

The content-negotiation.lua script, taken from AbsoLUAtion:

-- content-negotiation.lua
--
-- Summary: perform custom content negotiation via lighttpd mod_magnet
--
-- Notes: various filesystem naming conventions might be used to place
-- lang and/or encoding extensions prior to original file extension,
-- or afterwards.  This implementation places lang before and encoding after,
-- demonstrating how to implement either.
--
--
-- Copyright (c) 2017, Glenn Strauss (gstrauss () gluelogic.com), incremental
-- All rights reserved.
--
-- License: 3-clause BSD
--
-- Redistribution and use in source and binary forms, with or without
-- modification, are permitted provided that the following conditions are met:
-- 
-- - Redistributions of source code must retain the above copyright notice, this
--   list of conditions and the following disclaimer.
-- 
-- - Redistributions in binary form must reproduce the above copyright notice,
--   this list of conditions and the following disclaimer in the documentation
--   and/or other materials provided with the distribution.
-- 
-- - Neither the name of the 'incremental' nor the names of its contributors may
--   be used to endorse or promote products derived from this software without
--   specific prior written permission.
-- 
-- THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 
-- AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
-- IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
-- ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
-- LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-- CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-- SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
-- INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
-- CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
-- ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
-- THE POSSIBILITY OF SUCH DAMAGE.

function parse_HTTP_list (header)

    -- parse HTTP list from given request header
    -- There are *many* ways to parse strings in Lua, some more efficient than
    -- others, but none is a one-size-fits-all solution.  This is but one
    -- solution and has not been optimized.  Better solutions are welcomed.
    -- For HTTP list parsing with qvalues, the following assumes *no*
    -- quoted-strings containing the ',' or ';' delimiters, which is not fully
    -- RFC-compliant, but is also a reasonable simplification for headers
    -- including Accept-Language and Accept-Encoding, which generally have
    -- well-known sets of values.  Given that the following does not handle
    -- quoted-string, simplify string by removing all whitespace.  Then split
    -- on ',', and for each result, split on ';' and parse for qvalue.  Assume
    -- ';' is followed by "q=..." and nothing else, or ignore.  Ignore items
    -- with q=0 (or equivalent) since default content file will be provided if
    -- there is no better match.

    local tokens = {}, token
    if (not header) then return tokens end
    header = string.gsub(header, "%s+", "")
    for token in string.gmatch(header, "[^,]+") do
        local b, e = string.find(token, ";q=", 1, 1)
        local n = 1
        if b then
            n = tonumber(token:sub(e+1))
            token = (n ~= nil and n > 0) and token:sub(1, b-1) or "" 
        else
            b = string.find(token, ";", 1, 1)
            if b then token = token:sub(1, b-1) end
        end
        if (token ~= "") then
            table.insert(tokens, { n, token })
        else
            -- ignore (skip) if invalid (e.g. empty token or invalid qvalue)
        end
    end

    -- sort on qvalue and return simple ordered, indexed list
    table.sort(tokens, function (v1,v2) return v1[1] > v2[1] end)
    local v
    local list = {}
    for _, v in ipairs(tokens) do
        table.insert(list, v[2])
    end
    return list

end

function negotiate_language (path)

    local langs = parse_HTTP_list(lighty.request["Accept-Language"])
    if not langs[1] then return 0 end

    -- split path into basepath and ext
    -- basepath ends in '.' for use below, and if present, ext begins with '.'
    local ext = string.match(path, "(%.[^/]+)$")
    local basepath
    if (ext) then
        basepath = path:sub(1, -#ext)
    else
        basepath = path .. "." 
        ext = "" 
    end

    -- check if basepath .. lang .. ext exists
    local lang
    for _, lang in ipairs(langs) do
        local attr = nil
        if (string.find(lang, "/", 1, 1)) then
            -- skip lang containing '/'
            -- security: avoid path traversal
            -- since lang is used in filenames, lang must not contain '/'
        else
        if (lang == "en-US" or lang == "en") then
            -- (optional optimization; remove condition if should not apply)
            -- skip if default file is for "en-US" and "en" 
            -- (assumes Accept-Language contains only en-US and/or en,
            --  or they are last, i.e. other languages are preferred)
        else
            path = basepath .. lang .. ext
            attr = lighty.stat(path)
        end
        end
        if (attr and attr["is_file"]) then
            lighty.env["physical.path"] = path
            lighty.env["physical.rel-path"] = (#ext)
              and lighty.env["physical.rel-path"]:sub(1, -#ext) .. lang .. ext
              or lighty.env["physical.rel-path"] .. "." .. lang
            return 0
        end
    end

    return 0

end

local encoding_exts =
  {
    ["br"]    = ".br"                          -- brotli
   ,["gzip"]  = ".gz",  ["x-gzip"]  = ".gz"    -- gzip
-- ,["bzip2"] = ".bz2", ["x-bzip2"] = ".bz2"   -- bzip2
  }

function negotiate_encoding (path, content_type)

    local encs = parse_HTTP_list(lighty.request["Accept-Encoding"])
    if not encs[1] then return 0 end

    -- check if pre-encoded file exists with mapped extension
    local enc
    local basepath = path
    for _, enc in ipairs(encs) do
        local ext = encoding_exts[enc:gsub(".*", string.lower)]
        if (ext) then
            path = basepath .. ext
            local attr = lighty.stat(path)
            if (attr and attr["is_file"]) then
                lighty.env["physical.path"] = path
                lighty.env["physical.rel-path"] =
                  lighty.env["physical.rel-path"] .. ext
                lighty.header["Content-Encoding"] = enc
                if (content_type) then
                    lighty.header["Content-Type"] = content_type
                else
                    lighty.header["Content-Type"] = "application/octet-stream" 
                end
                return 0
            end
        end
    end

    return 0

end

--
-- content negotiation
--

-- check that default content file exists (or index-file)
local attr = lighty.stat(lighty.env["physical.path"])
if (not attr) then return 0 end
if (not attr["is_file"]) then
    if (attr["is_dir"]) then
        -- check for index file (code below checks only for index.html)
        local path = lighty.env["physical.path"]
        local indexfile =
          string.sub(path, -1) == "/" and "index.html" or "/index.html" 
        path = path .. indexfile
        attr = lighty.stat(path)
        if (not attr or not attr["is_file"]) then return 0 end
        -- (below assignments not required; merely shortcut mod_indexfile)
        lighty.env["physical.path"] = path
        lighty.env["physical.rel-path"] =
          lighty.env["physical.rel-path"] .. indexfile
    else
        return 0
    end
end

-- negotiate_language(lighty.env["physical.path"])

negotiate_encoding(lighty.env["physical.path"], attr["content-type"])

return 0
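
To illustrate what the script does, assume the document root contains the pre-compressed variants produced in the second stage (file names are examples):

 /var/www/html
 ├─ index.html
 ├─ index.html.br
 ├─ index.html.gz
 ├─ main-ABC123.js
 ├─ main-ABC123.js.br
 └─ main-ABC123.js.gz

A request for /index.html with Accept-Encoding: br is then mapped to index.html.br and answered with Content-Encoding: br; a client without that header gets the uncompressed file. The language branch (negotiate_language) is commented out in this setup.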

And the lighttpd.conf:

server.compat-module-load = "disable" 
server.modules = ( "mod_indexfile", "mod_magnet", "mod_staticfile" )
server.tag = "" 
server.upload-dirs = ( "/var/tmp" )
server.document-root = "/var/www/html" 
server.bind = "/run/lighttpd.sock" 
server.socket-perms = "0666" 
server.max-read-idle = 60
server.max-write-idle = 360
server.max-keep-alive-idle = 5 # in seconds
server.max-keep-alive-requests = 1000 # requests per connections
server.max-fds = 16384 # should be min 3x the max-connections, +100kb RAM per 4096 connections
server.max-connections = 4096 # max connections
server.feature-flags += (
    "server.h2c" => "disable",
    "server.h2proto" => "disable" 
) # behind a proxy: disable h2c (HTTP/2 without TLS) + http2
server.stat-cache-engine = "inotify" 

mimetype.assign = (
    ".html"   => "text/html",
    ".htm"    => "text/html",
    ".css"    => "text/css",
    ".csv"    => "text/csv",
    ".xml"    => "application/xml",
    ".js"     => "application/javascript",
    ".json"   => "application/json",
    ".rss"    => "application/rss+xml",
    ".atom"   => "application/atom+xml",
    ".xhtml"  => "application/xhtml+xml",
    ".ttf"    => "font/ttf",
    ".eot"    => "application/vnd.ms-fontobject",
    ".otf"    => "font/otf",
    ".woff"   => "font/woff",
    ".woff2"  => "font/woff2",
    ".wasm"   => "application/wasm",
    ".svg"    => "image/svg+xml",
    ".txt"    => "text/plain",
    ".md"     => "text/markdown",
    ".map"    => "application/json",
    ""        => "application/octet-stream" 
)

index-file.names = ( "index.html", "index.csr.html" )
magnet.attract-physical-path-to = ( "/scripts/content-negotiation.lua" )

When you now build the image with sudo docker build -t test -f .docker/Dockerfile ., you can use it as a lightweight container that serves the files behind a proxy.
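
Because lighttpd binds to a unix socket here, that socket has to be reachable by the proxy. A minimal sketch (the host path /run/lighttpd and the container name are examples, assuming the proxy runs on the same host):

sudo docker build -t test -f .docker/Dockerfile .
# bind-mount a host directory so the proxy can reach /run/lighttpd.sock
sudo docker run --rm -d --name frontend -v /run/lighttpd:/run test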

For example, with an nginx proxy:

    location / {
        proxy_pass http://unix:/run/lighttpd.sock;
    }
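
To verify the content negotiation without the proxy (a sketch, assuming the socket directory was bind-mounted to /run/lighttpd on the host as in the run command above):

curl -sI --unix-socket /run/lighttpd/lighttpd.sock -H 'Accept-Encoding: br' http://localhost/

The response headers should include Content-Encoding: br when a pre-compressed index.html.br exists next to index.html.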
