I have a few articles with lots of pictures that I've uploaded into my wiki and reference using image tags:
{{:test:test_picture.jpg}}
Whenever these pages are refreshed, they completely re-download all of the pictures each time. I figured I'd spend 20 minutes of my Sunday fixing this, but it's been about two hours and I still haven't worked it out.
Here's a capture of what I'm referring to:
[m]HTTP 686 0.000133000 GET /lib/tpl/dokuwiki/images/button-donate.gif HTTP/1.1
HTTP 287 0.028589000 HTTP/1.1 304 Not Modified
HTTP 683 0.000185000 GET /lib/tpl/dokuwiki/images/button-php.gif HTTP/1.1
HTTP 287 0.028928000 HTTP/1.1 304 Not Modified
HTTP 717 0.000187000 GET /lib/exe/fetch.php?media=tech:ruckus:23_zd_wlan2.png HTTP/1.1
HTTP 74 0.000001000 HTTP/1.1 200 OK (PNG)
HTTP 719 0.000130000 GET /lib/exe/fetch.php?media=tech:ruckus:20_gp_profile.png HTTP/1.1
HTTP 74 0.000001000 HTTP/1.1 200 OK (PNG)
[/m]
So it looks like the images with explicit paths (the template buttons; highlighted blue in my capture) are working as you'd expect, but the images served via fetch.php come back as full 200 OK responses instead of 304 Not Modified (highlighted red), no matter how many times I refresh.
My first attempt was to enable mod_expires and whitelist image files in an .htaccess file:
<IfModule mod_expires.c>
  <FilesMatch "\.(jpe?g|png|gif)$">
    ExpiresActive On
    ExpiresDefault "access plus 1 day"
  </FilesMatch>
</IfModule>
But that's obviously not working. I suspect it's because those images are served through the PHP script, so the request URL ends in fetch.php rather than .png and the FilesMatch pattern never applies. I'm not a web admin (surprise!), so let me know if this is possible and what I should be doing to accomplish this ostensibly simple task.
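One idea I've been wondering about, if FilesMatch really only ever sees the script name and not the ?media=... query string: target fetch.php itself. This is an untested sketch using mod_headers instead of mod_expires, and DokuWiki's fetch.php may already send its own caching headers from PHP that conflict with it.

```apache
# Untested sketch: set a caching header on fetch.php responses directly,
# since FilesMatch "\.(jpe?g|png|gif)$" never matches a URL that ends
# in fetch.php. The one-day max-age mirrors the mod_expires rule above.
<IfModule mod_headers.c>
  <Files "fetch.php">
    Header set Cache-Control "max-age=86400"
  </Files>
</IfModule>
```

I realize this would apply to everything fetch.php serves, not just images, and if DokuWiki is already emitting its own no-cache headers in PHP then the real fix presumably belongs in the wiki configuration rather than .htaccess.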