Step 5: Check + Control Shadow Domain™
Before the automatic search engine submission can take place, the phantom pages must be uploaded to the Shadow Domain™. You can either upload the tarball (compressed archive) generated in Step 4, or transfer the individual pages one by one via FTP or Telnet.
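If you upload the Step 4 tarball, it still has to be unpacked on the server. A minimal sketch of that unpacking step, using Python's standard library; the archive name and destination directory are placeholders for your own setup, not values prescribed by shadowMaker™:

```python
import tarfile

def extract_phantom_pages(archive_path, dest_dir):
    """Unpack the Step 4 tarball into the Shadow Domain's document root
    and return the names of the extracted page files."""
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(path=dest_dir)
        return [m.name for m in tar.getmembers() if m.isfile()]
```

Running this once after each upload leaves the phantom pages in place for the CKS script to serve.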
On each Shadow Domain™, a “Central Keyword Switch (CKS)” script will be installed. Every visitor arriving at your Shadow Domain™ (regardless of which specific page is actually being hit) will be checked by this script.
If the script determines that your visitor is a human surfer, that visitor will be redirected to your Core Domain. Should the visitor be a search engine spider, it will instead be fed the pertinent phantom page's content. (No redirect for search engine spiders!)
In Step 5, you can configure this script to suit your specific setup.
After implementing the Shadow Domain™, you should conduct a functionality test; this, too, is performed in Step 5.
The Shadow Domain™ Control Center
The user interface presented under Step 5 will also serve as a Control Center for your Shadow Domain™: it allows you to check the functionality of the CKS, review your list of phantom pages, and inspect the access logs, e.g. for search engine spider activity.
Automatic botBase Maintenance
The fantomas shadowMaker™ program relies for its search engine spider recognition on fantomas spiderSpy™
– the world's most comprehensive botBase. It is updated every six hours, seven days a week, to ensure optimum reliability and security for your Shadow Domain™ setup.
To facilitate automatic updates of the fantomas spiderSpy™ botBase on your Shadow Domain™, you should use our proprietary script fantomas spyFetcher™.
This script has to be installed as a separate process; it cannot be configured via the Step 5 web interface. Hence, make sure you read the manual section "The fantomas spyFetcher™ Module: Automatic botBase Maintenance", which explains in detail how to install and configure this script.
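The six-hour refresh cadence can be illustrated with a short sketch. This is not the spyFetcher™ implementation, only the scheduling logic a separate update process would follow; the botBase path is a hypothetical example:

```python
import os
import time

# Hypothetical local copy of the botBase; the real path depends on your setup.
BOTBASE_PATH = "/var/shadow/spiderspy.db"
UPDATE_INTERVAL = 6 * 60 * 60  # six hours, matching the botBase refresh cycle

def botbase_update_due(path=BOTBASE_PATH, interval=UPDATE_INTERVAL, now=None):
    """Return True when the local botBase copy is missing or older
    than one refresh cycle, i.e. when a new fetch should run."""
    if not os.path.exists(path):
        return True  # no local copy yet: fetch immediately
    now = time.time() if now is None else now
    return (now - os.path.getmtime(path)) >= interval
```

A cron job or daemon loop would call this check periodically and download a fresh botBase whenever it returns True.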
Central Keyword Switch (CKS) Functionality
All visitors' IP addresses are checked by the CKS. If an address is found to belong to a search engine spider, the phantom page will be read internally and fed to the spider. In this case, no redirection will take place and the spider will not notice the difference: it will crawl and index the phantom page just like any other web page.
If no established search engine spider IP is detected, the visitor's Referrer data will be parsed for keywords/search phrases. If keywords/search phrases are found for which redirection instructions have been defined in the Links List, the visitor will be redirected to the predefined target URL.
If no Referrer is detected, or if no specific target URL has been defined for the keywords/search phrases found, the visitor will be redirected to the defined standard URL.
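The three-way decision described above can be sketched as follows. The spider IP set, the Links List entries, and the URLs are illustrative placeholders, not the actual CKS data (which comes from the spiderSpy™ botBase and your own Links List):

```python
from urllib.parse import unquote_plus

# Illustrative data only: the real CKS draws spider IPs from the
# fantomas spiderSpy™ botBase and redirect targets from your Links List.
SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}
LINKS_LIST = {"blue widgets": "http://www.example.com/widgets.html"}
STANDARD_URL = "http://www.example.com/"

def cks_decision(ip, referrer):
    """Return ('serve', content) for spiders, ('redirect', url) for humans."""
    if ip in SPIDER_IPS:
        # Spider: feed the phantom page content internally, no redirect.
        return ("serve", "phantom page content")
    # Human visitor: look for a Links List keyword in the Referrer.
    if referrer:
        decoded = unquote_plus(referrer).lower()
        for phrase, target in LINKS_LIST.items():
            if phrase in decoded:
                return ("redirect", target)
    # No Referrer, or no matching keywords: fall back to the standard URL.
    return ("redirect", STANDARD_URL)
```

Note that the spider branch never issues a redirect: the phantom page is served in place, exactly as the manual describes.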
All hits are logged in the log file “hits.log”. Search engine spider hits are marked by two preceding exclamation marks: “!!”.
The file “human-hits.log” logs only hits from human visitors and spiders not assigned to search engines. (The latter may include whackers, extractor bots, etc.)
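The “!!” marker makes spider activity easy to tally when reviewing the logs. A minimal sketch; the exact field layout of hits.log lines shown here is an assumption for illustration, only the leading “!!” marker is documented:

```python
def count_spider_hits(lines):
    """Split hits.log entries into (spider hits, other hits)
    using the leading '!!' spider marker."""
    spiders = sum(1 for line in lines if line.startswith("!!"))
    return spiders, len(lines) - spiders

# Example log lines; the field layout after the marker is hypothetical.
sample = [
    "!! 192.0.2.10 GET /phantom-1.html",
    "203.0.113.5 GET /index.html",
    "!! 192.0.2.11 GET /phantom-2.html",
]
```

Run against a real hits.log, this gives a quick overview of how often search engine spiders have crawled the phantom pages versus ordinary traffic.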