Short URL catalog - Still just thinking...

BentFX

New Member
Hi All,

I'm just thinking about the possibility of creating my own short URL catalog for personal use on Twitter and such...

The way I'm thinking, something like... http://BentFX.com/6217
Should return a redirect to something like...
http://BigLongReallySuckyDomain.org/BigFolderOfMoreFolders/BigFolderOfFunnyStuff/FunnyPhoto.jpg?stuff=Lots&more=LotsMore

I'm thinking the way to do it would be to start with a MySQL database of long URLs keyed by link numbers or names, then a custom 404 handler that checks the requested doc name against the database. If it finds a match, it returns a redirect; otherwise it returns a standard 404 page.
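Roughly, I'm picturing something like the following. This is just a sketch and nothing is tested; the table name (links), the columns (slug, long_url), the database credentials, and the handler filename are all made up for illustration.

Code:
# .htaccess -- send any request that doesn't match a real file to the handler
ErrorDocument 404 /redirect.php

PHP:
<?php
// redirect.php -- hypothetical 404 handler, untested sketch.
// Assumes a MySQL table `links` with columns `slug` (e.g. "6217") and `long_url`.
$pdo = new PDO('mysql:host=localhost;dbname=shorturls', 'dbuser', 'dbpass');

// When Apache invokes an ErrorDocument script, it puts the originally
// requested path in REDIRECT_URL.
$slug = isset($_SERVER['REDIRECT_URL']) ? ltrim($_SERVER['REDIRECT_URL'], '/') : '';

$stmt = $pdo->prepare('SELECT long_url FROM links WHERE slug = ? LIMIT 1');
$stmt->execute(array($slug));
$longUrl = $stmt->fetchColumn();

if ($longUrl !== false) {
    // Found a match: redirect to the long URL.
    header('Location: ' . $longUrl, true, 301);
    exit;
}

// No match: serve a normal 404.
header('HTTP/1.1 404 Not Found');
echo 'Not found.';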

The big thing I'm wondering about is a secure method of adding URLs to the catalog... to ensure that people aren't adding their illegal links to my link database. I certainly don't want http://BentFX.com/Kids pointing toward someone's kiddie porn collection.

Like I said in the topic title... I'm still just thinking. A project like this would be challenging (read: fun) for me, but I'm concerned that I'm missing (or misunderstanding) simple stuff.

Would x10 even allow something like this?

Any input is appreciated!


Skip
 

lemon-tree

x10 Minion
Community Support
The first suggestion for limiting input would be to use a captcha to prevent a bot from endlessly adding hundreds or thousands of links. I would suggest looking at how other similar sites do it and taking inspiration from them.
Second, when a user adds a link, check whether you already have that link in your database; if so, don't create a new record, just return the existing one. This dramatically reduces the number of records if a popular link is submitted many times. Also store submission info such as the IP address in your database, so you can check for spammers and implement a blocking system. A similar system could limit the number of addresses each IP can add per day; in terms of user experience, a per-IP limit is a better solution than a captcha.
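For example, the duplicate check and a per-IP daily limit might look something like this (untested sketch; the table layout, column names, limit, and slug scheme are just examples):

PHP:
<?php
// add-link.php -- hypothetical submission handler, untested sketch.
// Assumes a `links` table with columns: slug, long_url, submitter_ip, created_at.
$pdo = new PDO('mysql:host=localhost;dbname=shorturls', 'dbuser', 'dbpass');
$ip  = $_SERVER['REMOTE_ADDR'];
$url = isset($_POST['url']) ? trim($_POST['url']) : '';

// Per-IP daily limit (the number 20 is an arbitrary example).
$stmt = $pdo->prepare('SELECT COUNT(*) FROM links
                       WHERE submitter_ip = ? AND created_at > NOW() - INTERVAL 1 DAY');
$stmt->execute(array($ip));
if ($stmt->fetchColumn() >= 20) {
    die('Daily limit reached.');
}

// Re-use the existing record if this long URL was already submitted.
$stmt = $pdo->prepare('SELECT slug FROM links WHERE long_url = ? LIMIT 1');
$stmt->execute(array($url));
$slug = $stmt->fetchColumn();

if ($slug === false) {
    $slug = substr(md5($url . uniqid()), 0, 6); // example slug scheme only
    $stmt = $pdo->prepare('INSERT INTO links (slug, long_url, submitter_ip, created_at)
                           VALUES (?, ?, ?, NOW())');
    $stmt->execute(array($slug, $url, $ip));
}

echo 'http://BentFX.com/' . $slug;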
Finally, however hard you try, you will find it very difficult to prevent people from posting links to sites you don't want associated with yours. If you really don't want that, then don't make the submission page public. You could try a blacklist that scans the URLs for common indicators, but you can't really do much better than that without fetching each page and checking it yourself.
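A blacklist check could be as simple as this; the terms are obviously just placeholders, and it only catches the most blatant cases:

PHP:
<?php
// Reject URLs containing any flagged substring before they reach the database.
$blacklist = array('flagged-term-1', 'flagged-term-2');

function url_is_blocked($url, $blacklist) {
    foreach ($blacklist as $term) {
        if (stripos($url, $term) !== false) {
            return true;
        }
    }
    return false;
}

if (url_is_blocked($url, $blacklist)) {
    die('That URL is not allowed.');
}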
 

BentFX

New Member
Thanks lemon-tree,

I really don't want the responsibility of maintaining a public list. And I really don't want my domain name associated with questionable content.

I figure I could bury the add-link page behind passwords, but I want adding links to be quick and easy.

I do run Apache on my local machine, so I was thinking I might host an add-link.php page locally and have it create some kind of cryptographic hash that would authenticate it to an add-link script on x10. As I think about it, that seems like the best security solution... a local script that calls a password-protected remote script.
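Something like this is roughly what I have in mind (just a sketch, nothing tested; the shared secret, endpoint URL, and field names are made up):

PHP:
<?php
// Local add-link.php (runs on my machine) -- sketch only.
// Signs the URL with a shared secret so the remote script can verify it came from me.
$secret  = 'change-me-shared-secret';   // placeholder
$longUrl = $_POST['url'];
$sig     = hash_hmac('sha256', $longUrl, $secret);

// POST the URL plus signature to the remote add-link script on x10.
$ch = curl_init('http://BentFX.com/add-link.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('url' => $longUrl, 'sig' => $sig)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($ch);
curl_close($ch);

...and on the x10 side the script would recompute the same HMAC and reject anything that doesn't match:

PHP:
<?php
// Remote add-link.php (on x10) -- verifies the signature before inserting.
$secret = 'change-me-shared-secret';    // same placeholder secret
$url    = isset($_POST['url']) ? $_POST['url'] : '';
$sig    = isset($_POST['sig']) ? $_POST['sig'] : '';

if (hash_hmac('sha256', $url, $secret) !== $sig) {
    header('HTTP/1.1 403 Forbidden');
    exit('Bad signature.');
}
// ...insert $url into the links table from here...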
 

lemon-tree

x10 Minion
Community Support
Alternatively, you could use a cookie to identify your computer, so you would only have to enter a password the first time you use the site. Or, if you have a static IP, you could use that to identify yourself.
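Something along these lines (the IP, token, and password are placeholders):

PHP:
<?php
// Sketch: allow access from a known static IP or a long-lived cookie;
// otherwise ask for the password once and set the cookie.
$myStaticIp  = '203.0.113.42';
$cookieToken = 'some-long-random-token';

$authorised = ($_SERVER['REMOTE_ADDR'] === $myStaticIp)
           || (isset($_COOKIE['addlink_auth']) && $_COOKIE['addlink_auth'] === $cookieToken);

if (!$authorised) {
    if (isset($_POST['password']) && $_POST['password'] === 'my-password') {
        // Remember this browser for 30 days.
        setcookie('addlink_auth', $cookieToken, time() + 30 * 24 * 3600);
        $authorised = true;
    } else {
        exit('Password required.');
    }
}
// ...continue with the add-link form / insert...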
 

xav0989

Community Public Relation
Community Support
I built a small piece of software that does basically exactly that, and it compiles stats about each click. You need to log in as admin to add links, so other people shouldn't be able to add any. If you want to check it out, see http://sourceforge.net/projects/linkdb-afrosoft/ for more info. As the website is currently down, you can PM me with any questions. My script needs some improvements, but if you want, you're welcome to join the team.
 