How can you leave "old" web pages (pages you no longer have links to, etc.) on your web site host, but not have search engines pick those "old" pages up? Is there something you can do to inactivate those "old" pages? ------------------ humble, but hungry.
Other than notifying the search engines? Do you have an example? ------------------ "No one gets out ALIVE!" SaveOurRockets.com
Let's say Clutch City had a search function on its own web site. Let's also say that you have an old web page reporting on Lewis Lloyd's play that you don't want anyone to ever see, but that you would like to keep in your files just in case he made a comeback. So now you've got this web page on Lewis Lloyd just sitting out there in your files. Someone runs a search on "Lewis Lloyd" and picks up the page in the results. Oops. You hoped nobody would ever find it.

The problem is that your search engine knows the page exists in your files because it can search them, so it picks up the document and allows someone to view and link to it. I'm wondering if there is a way to code the old web page so that it won't be picked up by the search engine. Can you rename it *.old or something? Or tag it somehow? Or anything like that?

On a larger scale, I've got the main search engines picking up these "old" pages too. Wondering how to keep them around, but keep people from finding them so easily. Don't know if that cleared up the question or made it mo' confusing, but there you go! If all I need to do is notify those search engines, where can I do that? ------------------ humble, but hungry.
If it is some type of spider search, you are out of luck. Your best bet is just to archive it on your hard drive instead of keeping it on the server. Spider searches look through all the text of the site. Anything that is up there is fair game. ------------------ "No one gets out ALIVE!" SaveOurRockets.com
Thanks. I'll see what I can figure out. ------------------ humble, but hungry. [This message has been edited by PhiSlammaJamma (edited July 11, 2000).]
Actually, can't you create some kind of file that tells spider indexes not to look at those files? Something about a robots.txt file or some crap ------------------ Kentucky or bust!
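[That's the robots exclusion idea. A minimal robots.txt placed at the root of the site might look roughly like this; the /old/ directory name is just a made-up example for wherever the archived pages live:

    # robots.txt at the root of the web site
    # asks well-behaved crawlers to skip anything under /old/
    User-agent: *
    Disallow: /old/

Note this only works for crawlers that choose to obey it; it doesn't actually hide or protect the files.]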
Thanks Charles. That's what I was looking for and, from a search engine naturally, I found it. Here is a nice FAQ about all of it. http://info.webcrawler.com/mak/projects/robots/faq.html ------------------ humble, but hungry.
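[The FAQ linked above also covers a per-page version, which gets at the "tag it somehow" part of the original question: a robots META tag in the page's HTML head. A rough sketch, with the surrounding page content hypothetical:

    <head>
    <!-- asks compliant search engines not to index this page or follow links from it -->
    <meta name="robots" content="noindex,nofollow">
    </head>

Like robots.txt, this is only a request to the search engine, not access control.]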