DokuWiki images load randomly

Status
Not open for further replies.

rejectli

New Member
Messages
25
Reaction score
0
Points
1
I have some images on a dokuwiki page. Most are around 10k in size and no more than 4 images per page.

Sometimes images load. Sometimes they don't. Individually. Every time I refresh the page it's totally random which images display. I have a fast connection.

What can I do to force the page to only display once all images are loaded?
 

essellar

Community Advocate
Community Support
Messages
3,295
Reaction score
227
Points
63
You'd pretty much have to re-write the client side of the software, or at least add enough JavaScript to amount to the same thing. It's definitely not something that has a quick fix.
 

rejectli

New Member
Messages
25
Reaction score
0
Points
1
So, does the problem originate with DokuWiki or with x10hosting?
I've searched for other people who've experienced the same problem with DokuWiki, but found no one.

Also, I'm wondering if it's related to an error I'm getting infrequently, saying that I've exceeded my data limit (or something similar). But the page I'm editing has very little text (about twice the size of this message) and a few icons of about 10 KB to 50 KB each. Can that meagre amount be overloading x10hosting? Since DokuWiki is installed on my website, it's not like anything is waiting on DokuWiki's own site to respond. It's all running on x10hosting.
 
Last edited:

essellar

Community Advocate
Community Support
Messages
3,295
Reaction score
227
Points
63
It depends on how your images are being stored/served. If they're just image files, there shouldn't be a problem - there is no data/bandwidth limit as such. But if they're stored in a database, or if you need to go through a PHP script to get to the image(s), then you may be running into the entry processes limit, because the browser is trying to request too many items (generating an entry process for each) at the same time.

You can only have 5 entry processes running simultaneously, so if each image request takes a non-zero amount of time to serve from a script, you can end up with more than 5 overlapping. Again, rewriting either the front end (browser side) to throttle requests or rewriting the back end to change the storage/service scheme would be your fixes.

(To answer the next question: no, the limits won't be lifted here, because that would interfere with other users on the same server. You'd need to be on a much less crowded server than the Free Hosting servers to make it practical to have higher per-user limits for entry processes, memory or CPU time.)
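For what it's worth, the browser-side throttling described above can be sketched as a small request queue: never let more than a few loads run at once, so the server never sees more than that many simultaneous entry processes. This is an illustrative sketch only - `runLimited` and the task wrappers are made-up names, not part of DokuWiki or x10hosting:

```javascript
// Sketch: run async "tasks" (e.g. image loads) with at most
// maxConcurrent in flight at once. runLimited() is a hypothetical
// helper, not a DokuWiki API.
function runLimited(tasks, maxConcurrent) {
  return new Promise(function (resolve) {
    var results = [], active = 0, started = 0, done = 0, peak = 0;
    function next() {
      while (active < maxConcurrent && started < tasks.length) {
        (function (i) {
          active++;
          peak = Math.max(peak, active); // track worst-case concurrency
          tasks[i]().then(function (value) {
            results[i] = value;
            active--;
            done++;
            if (done === tasks.length) resolve({ results: results, peak: peak });
            else next(); // a slot freed up; start the next queued task
          });
        })(started++);
      }
    }
    if (tasks.length === 0) resolve({ results: results, peak: peak });
    else next();
  });
}

// In a browser, each task would wrap one image load, e.g.:
// function loadTask(url) {
//   return function () {
//     return new Promise(function (resolve) {
//       var img = new Image();
//       img.onload = function () { resolve(img); };
//       img.src = url;
//     });
//   };
// }
// runLimited(urls.map(loadTask), 4); // stay under the 5-process limit
```

With a cap of 4, the browser would queue the fifth and later images until an earlier one finishes, instead of firing all requests at once.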
 

rejectli

New Member
Messages
25
Reaction score
0
Points
1
That must be it then, an entry process issue, because DokuWiki is totally flat (it has no database) and it's very light on PHP. So the problem is my webpages are too efficient lol.

I think I can fix the problem by turning my icons into animated gifs. That'll increase their filesizes and reduce the chance of the page flooding x10hosting with requests.

Never thought I'd have an issue with a webpage that loads too quickly lol. That's a new one on me.
 

essellar

Community Advocate
Community Support
Messages
3,295
Reaction score
227
Points
63
That wouldn't be the problem, then.
 

rejectli

New Member
Messages
25
Reaction score
0
Points
1
Making the website less efficient has reduced the problem somewhat. Hopefully as the site becomes fuller it'll be inefficient enough to avoid the following error:

Resource Limit Is Reached
The website is temporarily unable to service your request as it exceeded resource limit. Please try again later.

At least now it looks all animaty :)
 

rejectli

New Member
Messages
25
Reaction score
0
Points
1
If I had the time and skills it wouldn't be too difficult to write some code to extract each image on the client side from a single combined image (a sprite sheet). The idea is that you paste all the images for the whole page into one image, then use client-side code to crop each icon out of it. That way no resource limit would ever be reached, since there would only ever be one image request per webpage.

If anyone else more qualified wants to give this a go it might be worth it.

Something like this might work:
http://www.html5canvastutorials.com/tutorials/html5-canvas-image-crop/
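The crop arithmetic behind that tutorial's approach is just mapping a tile index to a source rectangle in the combined image. A minimal sketch, where `tileRect` is a hypothetical helper (not from the tutorial or DokuWiki):

```javascript
// Sketch: compute the source rectangle of tile i in a sprite sheet
// laid out as a row-major grid of cols columns. tileRect() is a
// hypothetical helper, not part of the linked tutorial or DokuWiki.
function tileRect(i, cols, tileW, tileH) {
  return {
    sx: (i % cols) * tileW,           // column offset in pixels
    sy: Math.floor(i / cols) * tileH, // row offset in pixels
    w: tileW,
    h: tileH
  };
}

// Browser usage (can't run outside a page): copy tile 5 of a
// 3-column sheet of 166x155 icons onto canvas #c6.
// var img = new Image();
// img.onload = function () {
//   var r = tileRect(5, 3, 166, 155);
//   document.getElementById("c6").getContext("2d")
//     .drawImage(img, r.sx, r.sy, r.w, r.h, 0, 0, r.w, r.h);
// };
// img.src = "sprites.png";
```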
 
Last edited:

rejectli

New Member
Messages
25
Reaction score
0
Points
1
After some fiddling with code, I managed to produce this. I just need to find a way to incorporate it into DokuWiki.

<!DOCTYPE html>
<html>
<body>

<p>This page only loads one image.</p>

<table>
<tr>
<td><canvas id="c1" width="166" height="155"></canvas></td>
<td><canvas id="c2" width="166" height="155"></canvas></td>
<td><canvas id="c3" width="166" height="155"></canvas></td>
<td><canvas id="c4" width="166" height="155"></canvas></td>
</tr>
<tr>
<td><canvas id="c5" width="166" height="155"></canvas></td>
<td><canvas id="c6" width="166" height="155"></canvas></td>
<td><canvas id="c7" width="166" height="155"></canvas></td>
<td><canvas id="c8" width="166" height="155"></canvas></td>
</tr>
</table>

<script>
// Load the sprite sheet once, then crop each tile onto its canvas.
// Drawing has to wait for the image's own load event, not window.onload,
// otherwise the canvases stay blank.
var img = new Image();
img.onload = function() {
  function draw(id, sx, sy) {
    document.getElementById(id).getContext("2d")
      .drawImage(img, sx, sy, 166, 155, 0, 0, 166, 155);
  }
  draw("c1", 0, 0);
  draw("c2", 167, 0);
  draw("c3", 334, 0);
  draw("c4", 0, 158);
  draw("c5", 167, 158);
  draw("c6", 334, 158);
  draw("c7", 0, 313);
  draw("c8", 167, 313);
};
img.src = "http://68.media.tumblr.com/1dee464f3179885291f003d73db09f6d/tumblr_ndnl78QUvS1r5vojso5_500.jpg";
</script>

</body>
</html>
 

rejectli

New Member
Messages
25
Reaction score
0
Points
1
essellar said: "You'd pretty much have to re-write the client side of the software, or at least add enough JavaScript to amount to the same thing. It's definitely not something that has a quick fix."

I've fixed the client side within DokuWiki itself using some plugins and jQuery code. Not bad considering that when I started I hadn't even heard of jQuery, let alone knew how to use it. It took two days for an absolute beginner to implement a permanent fix. Booya!

The base installation of DokuWiki on x10hosting is full of bugs. It keeps throwing NetworkError: 508 Unknown errors, even without any content added. This is not a coincidence, and it's partly why I've been facing problems displaying images with DokuWiki via x10hosting. It's a poor implementation basically, as DokuWiki itself is fine.
 

essellar

Community Advocate
Community Support
Messages
3,295
Reaction score
227
Points
63
For what it's worth, Softaculous isn't an x10Hosting thing, it's third-party, and not all of the offerings in Softaculous are usable within the restrictions of Free Hosting (or necessarily fully compatible with its config). It's there because it's a brain-free installer for things like WordPress and forums that would otherwise require error-prone database setup and software configuration, and offered via cPanel (which is also third-party, not x10Hosting).

Also FWIW, you mentioned in your feedback thread that a page called fetch.php is being thrashed. That's your entry process limiter right there - you're getting content via repeated calls to a PHP script rather than simply opening files. That's just wrong in so many ways...
 

rejectli

New Member
Messages
25
Reaction score
0
Points
1
I'd never heard of Softaculous, so I'm not sure what it has to do with this discussion; I'll need to read up on it.

I'm not entirely sure what fetch.php does. Perhaps I need to look at its code to understand it better. But the solution I've made works 100%. No more failed images - ever. Not bad for a rookie who in just two days turned a non-working DokuWiki site into a working one. I've already written a tutorial on how to install it, but I'm holding back on publishing it here until I secure my site. I may look at making my own DokuWiki plugin.

I'm not convinced this is a DokuWiki issue, since if it were, there would be other people posting the same issues with DokuWiki. I can't find any. I refuse to believe I'm the only person in the world to experience Resource Limit errors. If I am, then great - I was the first, and also the first to solve it >:) If I'm not the first, then even better, because hopefully my solution can be used to help others.

FETCH.PHP
PHP:

<?php
/**
* DokuWiki media passthrough file
*
* @license    GPL 2 (http://www.gnu.org/licenses/gpl.html)
* @author     Andreas Gohr <andi@splitbrain.org>
*/

if(!defined('DOKU_INC')) define('DOKU_INC', dirname(__FILE__).'/../../');
if (!defined('DOKU_DISABLE_GZIP_OUTPUT')) define('DOKU_DISABLE_GZIP_OUTPUT', 1);
require_once(DOKU_INC.'inc/init.php');
session_write_close(); //close session

require_once(DOKU_INC.'inc/fetch.functions.php');

if (defined('SIMPLE_TEST')) {
    $INPUT = new Input();
}

// BEGIN main
    $mimetypes = getMimeTypes();

    //get input
    $MEDIA  = stripctl(getID('media', false)); // no cleaning except control chars - maybe external
    $CACHE  = calc_cache($INPUT->str('cache'));
    $WIDTH  = $INPUT->int('w');
    $HEIGHT = $INPUT->int('h');
    $REV    = & $INPUT->ref('rev');
    //sanitize revision
    $REV = preg_replace('/[^0-9]/', '', $REV);

    list($EXT, $MIME, $DL) = mimetype($MEDIA, false);
    if($EXT === false) {
        $EXT  = 'unknown';
        $MIME = 'application/octet-stream';
        $DL   = true;
    }

    // check for permissions, preconditions and cache external files
    list($STATUS, $STATUSMESSAGE) = checkFileStatus($MEDIA, $FILE, $REV, $WIDTH, $HEIGHT);

    // prepare data for plugin events
    $data = array(
        'media'         => $MEDIA,
        'file'          => $FILE,
        'orig'          => $FILE,
        'mime'          => $MIME,
        'download'      => $DL,
        'cache'         => $CACHE,
        'ext'           => $EXT,
        'width'         => $WIDTH,
        'height'        => $HEIGHT,
        'status'        => $STATUS,
        'statusmessage' => $STATUSMESSAGE,
        'ispublic'      => media_ispublic($MEDIA),
    );

    // handle the file status
    $evt = new Doku_Event('FETCH_MEDIA_STATUS', $data);
    if($evt->advise_before()) {
        // redirects
        if($data['status'] > 300 && $data['status'] <= 304) {
            if (defined('SIMPLE_TEST')) return; //TestResponse doesn't recognize redirects
            send_redirect($data['statusmessage']);
        }
        // send any non 200 status
        if($data['status'] != 200) {
            http_status($data['status'], $data['statusmessage']);
        }
        // die on errors
        if($data['status'] > 203) {
            print $data['statusmessage'];
            if (defined('SIMPLE_TEST')) return;
            exit;
        }
    }
    $evt->advise_after();
    unset($evt);

    //handle image resizing/cropping
    if((substr($MIME, 0, 5) == 'image') && ($WIDTH || $HEIGHT)) {
        if($HEIGHT && $WIDTH) {
            $data['file'] = $FILE = media_crop_image($data['file'], $EXT, $WIDTH, $HEIGHT);
        } else {
            $data['file'] = $FILE = media_resize_image($data['file'], $EXT, $WIDTH, $HEIGHT);
        }
    }

    // finally send the file to the client
    $evt = new Doku_Event('MEDIA_SENDFILE', $data);
    if($evt->advise_before()) {
        sendFile($data['file'], $data['mime'], $data['download'], $data['cache'], $data['ispublic'], $data['orig']);
    }
    // Do something after the download finished.
    $evt->advise_after();  // will not be emitted on 304 or x-sendfile

// END DO main

//Setup VIM: ex: et ts=2 :
 

essellar

Community Advocate
Community Support
Messages
3,295
Reaction score
227
Points
63
Most people aren't using DokuWiki on servers with these constraints, and I'm sure the writers had a reason (that sounded good to them) for doing things the way they did. A sane system would manage files directly, at the cost of (perhaps) making direct file management at the filesystem level a little more difficult (by, say, modifying filenames while leaving them meaningful).

fetch.php basically translates the browser's requests into something the server's file system can handle. The problem with that approach is that every request has to go through a PHP script separately, and that's what's causing the entry process limiter to kick in. There wouldn't be that problem if the files were being requested directly.

Because they've decided to run everything through a script instead of using flat files as flat files, you've had to severely slow things down at the browser side to manage the request rate. Yes, it's a fix, and yes, you have good reason to be proud of having made the fix that you could make with limited knowledge and the next best thing to no documentation, but that doesn't change the fact that it's a fix for a problem that should never have been there in the first place.

Softaculous is the "thing that offers templates". It's a third-party software installer with its own list of software, and is in broad use on shared hosting servers, particularly those that are running cPanel as a user account management interface. Not everything in Softaculous's list can even run on the Free Hosting servers, never mind run poorly, but x10Hosting doesn't get to choose what Softaculous includes. As I said, it's there mostly to be a one-click installer for common blog and forum software for people who wouldn't know where to start installing from a download.
 

rejectli

New Member
Messages
25
Reaction score
0
Points
1
Looking at WikiMatrix, I can see about 436,715 people use DokuWiki per year. That's not very many, but still, you'd think that at least one of those people would have posted about the same issue. They haven't.

So, why am I the only known case getting Resource Limit errors? It's weird.
Either that, or Google hasn't indexed a single forum page where the issue was reported.
 

rejectli

New Member
Messages
25
Reaction score
0
Points
1
I realise it's unlikely that I'll post the full solution any time soon, so here's a version of it that doesn't identify my website:

Code:
<JS>
    // Load the whole sprite sheet once, then crop icons out of it
    // after it has finished loading.
    var img = new Image();
    img.addEventListener("load", function() {
       var offset = 5;
       drawImage("#home", img, 1, 1, offset);
       drawImage("#news", img, 1, 36, offset);
       drawImage("#help", img, 1, 58, offset);
       drawImage("#list", img, 1, 88, offset);
       drawImage("#faqs", img, 1, 118, offset);
    });
    img.src = "http://www.example.com/lib/exe/fetch.php?media=map.png";

    // Draw the tile at (sx, sy) onto every canvas whose id is the given
    // name followed by a number (#home1, #home2, ...).
    function drawImage(id, img, sx, sy, os) {
       for (var i = 1, n; jQuery(n = id + i).length > 0; i++) {
           var c = jQuery(n)[0];
           c.width += os; c.height += os;
           c.getContext("2d").drawImage(img, sx, sy, c.width, c.height, os, os, c.width, c.height);
       }
    }
</JS>


Then wherever you want an image just put something like
Code:
<canvas home1 30,33></canvas>

This uses a Canvas plugin to capture the <canvas> tags and an Inline JavaScript plugin to decouple the jQuery code from the <canvas> tags (basically making the DokuWiki formatting easier to understand).

It's not yet perfect. <canvas> has weird effects on HTML alignment that might be worth overcoming.
If you want to use the code, the format for the tags is <canvas idn width,height></canvas>. The n after the id is an integer starting from 1, so that ids with the same name can be uniquely identified. Currently the jQuery code needs to be edited per page to plug in the x and y offsets.

It works well. The map.png file gets cached by the browser, so once it's loaded you don't get any delay at all. My website currently uses a single 400 KB image for all images across the whole site, so once that image is loaded, the load time on pages with media is effectively zero, with practically no extra data usage at all.
 
Last edited: