Bug #2579 (closed): svg and svgz content

Added by rdtennent over 10 years ago. Updated over 10 years ago.

Status: Fixed
Priority: Normal
Category: documentation
Target version: 1.4.36

Description

lighttpd with the distributed configuration files doesn't serve svg or svgz content with the appropriate headers. Please add the following lines to the mimetype.assign block in mime.conf:

".svg"          =>      "image/svg+xml",
".svgz" => "image/svg+xml",

and the following to modules.conf:

$HTTP["url"] =~ "\.svgz$" {
setenv.add-response-header = ( "Content-Encoding" => "gzip")
mimetype.assign = (".svgz" => "image/svg+xml" )

Also uncomment the mod_env line in the server.modules block.

Every modern browser supports svg and svgz natively. This issue (for svg) was reported seven years ago and it was suggested that it was up to "distributions" to set up the mime mapping. You distribute lighttpd sources so you are the most important distributor. You can't expect every distribution to figure out your configuration scheme, much less every sys admin. What possible reason do you have for not implementing these simple changes to the default configuration?

Actions #1

Updated by stbuehler over 10 years ago

  • Category set to documentation
  • Target version set to 1.4.36

Debian still includes create-mime.assign.pl, but you are right: our example file should include more mime types.

svgz / "Content-Encoding" => "gzip" is not going to happen though - this is an ugly hack which won't work with clients that are not supporting gzip.

Actions #2

Updated by rdtennent over 10 years ago

"this is an ugly hack" But it's what w3c expect: http://lists.w3.org/Archives/Public/www-svg/2014Mar/0090.html.

"Won't work with clients that are not supporting gzip" Every modern browser I've tried renders svgz content perfectly as long as both Content-Encoding and Content-Type headers are provided. Try any browser on the images at http://www.cs.queensu.ca/students/undergraduate/prerequisites/; if there's a problem I want to know about it.

Actions #3

Updated by stbuehler over 10 years ago

simple example: curl. unless given --compressed it won't decode a compressed response. And many clients (outside the browser world) are using libcurl in one way or another.

Actions #4

Updated by rdtennent over 10 years ago

Does curl not realize PDFs are compressed? I'd call this discrimination against svgz a bug in curl. For downloads, that web site provides PDFs.

Actions #5

Updated by darix over 10 years ago

rdtennent wrote:

Does curl not realize PDFs are compressed? I'd call this discrimination against svgz a bug in curl. For downloads, that web site provides PDFs.

you do realize that most people save the compressed PDF, right?
I would also argue that if someone clicks on a compressed file he wants to save it as a compressed file.
Personally I would rather let my users only click on .svg files, and send them the precompressed file when their browser announces/requests compression support. Exposing users to the detail that you have .svgz files seems wrong.

Actions #6

Updated by rdtennent over 10 years ago

most people save the compressed PDF. I would also argue that if someone clicks on a compressed file he wants to save it as a compressed file.

I agree; but it seems curl doesn't allow the compressed svg to be downloaded and saved. That's what's wrong.

let my users only click on .svg files, and send them the precompressed file when their browser announces/requests compression support.

Compressing static content every time it's requested seems inefficient. And in the case of that server (not mine to configure), it doesn't happen: the svg file itself gets transferred, which is certainly inefficient. What does lighttpd do (in the default configuration)? I'm using it locally and can't really tell whether there's automatic compression. And I doubt that it's any easier to configure automatic compression than to configure an encoding header.
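
For reference, lighttpd 1.4 ships mod_compress, which can gzip eligible static responses on the fly for clients that announce gzip support, caching the compressed copies on disk. A minimal sketch follows; the cache path and type list are illustrative, not taken from this thread:

server.modules += ( "mod_compress" )

# directory where mod_compress stores the compressed copies of static files
compress.cache-dir = "/var/cache/lighttpd/compress/"
# only responses with these mime types get compressed
compress.filetype  = ( "image/svg+xml", "text/css", "text/html", "text/plain" )

With something along these lines, a plain .svg file is delivered gzip-compressed to clients that send Accept-Encoding: gzip, without exposing .svgz URLs at all.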

Actions #7

Updated by Olaf-van-der-Spek over 10 years ago

stbuehler wrote:

simple example: curl. unless given --compressed it won't decode a compressed response. And many clients (outside the browser world) are using libcurl in one way or another.

Curl is being silly :(

Actions #8

Updated by darix over 10 years ago

Olaf-van-der-Spek wrote:

stbuehler wrote:

simple example: curl. unless given --compressed it won't decode a compressed response. And many clients (outside the browser world) are using libcurl in one way or another.

Curl is being silly :(

curl is behaving correctly.

Actions #9

Updated by stbuehler over 10 years ago

I think the RFC can be blamed for this. The current draft for semantics says (and RFC 2616 is similar):

A request without an Accept-Encoding header field implies that the
user agent has no preferences regarding content-codings. Although
this allows the server to use any content-coding in a response, it
does not imply that the user agent will be able to correctly process
all encodings.

This is basically not specifying anything at all. The server is allowed to send what it wants, but the client obviously may not be able to read it.

But it doesn't matter; we can't include a properly working default config for handling those pre-compressed files: setenv.add-response-header = ( "Content-Encoding" => "gzip" ) must only be used if the request is handled by mod_staticfile (no fastcgi, proxy, ...), and $HTTP["url"] can't be used reliably to check the file extension ($PHYSICAL["existing-path"] should work). A sketch of that more robust variant is shown below.
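
As a hypothetical sketch only (as the next note points out, $PHYSICAL["existing-path"] conditionals are not available in the 1.4.x line), such a conditional could look roughly like:

# hypothetical sketch - relies on $PHYSICAL["existing-path"] conditionals,
# which lighttpd 1.4.x does not offer
$PHYSICAL["existing-path"] =~ "\.svgz$" {
    setenv.add-response-header = ( "Content-Encoding" => "gzip" )
    mimetype.assign = ( ".svgz" => "image/svg+xml" )
}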

Actions #10

Updated by rdtennent over 10 years ago

> ($PHYSICAL["existing-path"] should work)

Not till 1.5.0 is available :+)

Actions #11

Updated by stbuehler over 10 years ago

rdtennent wrote:

($PHYSICAL["existing-path"] should work)

Not till 1.5.0 is available :+)

ah right, sorry. (1.5 is unlikely to see a release, 2.0 might some day)

Actions #12

Updated by stbuehler over 10 years ago

The "fix" will probably include the mapping for ".svgz" => "image/svg+xml" (but not the content-encoding header), due to a (imho buggy) line in /etc/mime.types.

https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=139333 says

image/svg+xml svg svgz

is the correct line, but ".svgz" is actually an alias for ".svg.gz" according to the registration.

It looks like /etc/mime.types was supposed to be used for the HTTP "Content-Type" header (and not MIME), together with a (non-existent) additional list for the "Content-Encoding" header.

The current patch at http://git.lighttpd.net/lighttpd/lighttpd-1.x.git/commit/?h=lighttpd-1.4.x-stbuehler-mimetypes also adds "; charset=utf-8" for text/css, text/csv, text/plain and a lot of source code "types"; these text types have no other standard way of specifying an encoding, and the "old" default of iso-8859-1 is imho not a good one :)

text/html is a mess: the charset in the HTTP "Content-Type" header overrides the <meta> information in an HTML document, so it seems like a bad idea to set a charset in the HTTP header for it.
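
A rough sketch of what such mappings look like in mimetype.assign (illustrative entries only, not the actual contents of the patch):

mimetype.assign = (
  # text types get an explicit charset, since they have no other standard way to declare one
  ".css"  => "text/css; charset=utf-8",
  ".csv"  => "text/csv; charset=utf-8",
  ".txt"  => "text/plain; charset=utf-8",
  # no charset for .html: the HTTP header would override a <meta> charset in the document
  ".html" => "text/html"
)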

Actions #13

Updated by stbuehler over 10 years ago

  • Status changed from New to Fixed