learning_brain
Some of you may know I now run an image search engine that crawls for high quality pics and graphics.
Unfortunately, due to the somewhat unpredictable nature of my 'good' free host, my account has been deleted so I can't demonstrate the problem.
Essentially, there are two parts (well more but two critical ones) to the crawling process.
1) Find and store every <img src="whatever"> reference.
2) Create and store a small JPG thumbnail for reference.
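The two steps above can be sketched roughly like this. All the names here are my own, not from the actual site: step 1 uses Python's stdlib HTML parser to collect `<img src>` values, and step 2 only computes the 120x80-fitting thumbnail dimensions, since the actual JPG resizing would be done with an imaging library such as Pillow.

```python
# Sketch of the two crawl steps (hypothetical names, not the real code).
from html.parser import HTMLParser

class ImgCollector(HTMLParser):
    """Step 1: gather the src attribute of every <img> tag in a page."""
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def thumb_size(width, height, max_w=120, max_h=80):
    """Step 2 (dimensions only): largest size fitting in max_w x max_h
    that keeps the original aspect ratio. Actual JPEG encoding would be
    handed to a library like Pillow."""
    scale = min(max_w / width, max_h / height)
    return (int(width * scale), int(height * scale))

collector = ImgCollector()
collector.feed('<p><img src="/a.jpg"><img alt="x" src="http://example.com/b.png"></p>')
print(collector.srcs)        # the stored image references
print(thumb_size(1600, 1200))  # thumbnail dimensions for a 1600x1200 source
```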
Simple so far?
Yep, but my index page search has an interesting function. First, it loads all thumbnails related to the search, which is fast. It also hotlinks (i.e. loads the image directly from the originating site) so that a large preview can be shown when you hover over the thumbnail.
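One way the thumbnail-plus-hotlink result could be emitted (a hypothetical sketch, not the site's actual code): the local thumbnail loads immediately, while the full-size remote URL sits in a data attribute so client-side script can hotlink it only when the user actually hovers.

```python
# Hypothetical result-cell builder; function and attribute names are
# my own invention. The local thumb is the <img> src, the remote
# original goes into data-full for deferred, on-hover loading.
import html

def result_cell(thumb_url, full_url, source_url):
    """Build one search result: local thumbnail now, hotlinked preview deferred."""
    return (
        f'<a href="{html.escape(source_url)}">'
        f'<img src="{html.escape(thumb_url)}" '
        f'data-full="{html.escape(full_url)}" '
        'width="120" height="80"></a>'
    )

cell = result_cell("/thumbs/42.jpg",
                   "http://example.com/photo.jpg",
                   "http://example.com/gallery.html")
print(cell)
```

On hover, a few lines of JavaScript would copy `data-full` into an `<img>` src, so the originating site's bandwidth is only touched for previews the user actually opens, rather than for all ~16 large images on every page load.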
Sounds great, so what's the problem?
The problem is that "hotlinking" is frowned upon by many. It eats into the originating site's bandwidth and, as I'm using high-resolution images, this can be a severe hit to someone with limited resources. The secondary hang-up for my site is that the page takes a while to load, as each page of results has about 16 large images on it.
This practice is not always frowned upon, though. My site provides the originating site's address, hence creating a back link for them and driving traffic. Also, the ability to create direct links is kinda the point of the world wide web, is it not?
I have two options:
1) Drop the large image preview and stick to the 120x80px thumbs stored on my site... which will make it more like Google and lose part of my USP. This will also speed up my page loads.
2) Keep the hotlinked large image preview, maintain my site's uniqueness, and risk the consequences... if there are any :S (other than the known "switcheroo" problem, where the originating site may swap the image without warning for something unsuitable).
Any opinions?