Web spiders updating dynamic pages
It has long been known that Googlebot behaves much like a modern-day browser, rendering pages and crawling and indexing JavaScript and dynamically generated content rather well.
The SEO Spider is now able to render and crawl web pages in a similar way.
The main problem with any static site appears when you wish to update the content. Unless you are conversant with HTML and the design methods used in the site, you have to go back to the designer to have any content changes made.
A dynamic page, by contrast, is regenerated on each page load.
If you want to know when a dynamic page was last updated, you need to check the page itself or an RSS feed for the page.
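One low-overhead way to check the page itself is to issue an HTTP HEAD request and compare the server's Last-Modified header against the value saved from the previous visit. The sketch below is illustrative (the helper names are mine, not from the original, and not every server sends Last-Modified):

```python
import urllib.request
from email.utils import parsedate_to_datetime


def page_changed(server_last_modified: str, saved_last_modified: str) -> bool:
    """Return True if the server's Last-Modified is newer than the saved one.

    Both arguments are HTTP-date strings, e.g. "Wed, 21 Oct 2015 07:28:00 GMT".
    """
    return (parsedate_to_datetime(server_last_modified)
            > parsedate_to_datetime(saved_last_modified))


def fetch_last_modified(url: str):
    """HEAD request: retrieve only the headers, not the page body."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("Last-Modified")
```

Note that for a page generated anew on every request, Last-Modified may simply reflect the time of the request, so this check is only meaningful when the server reports a genuine content timestamp.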
When you update and save a Dynamic Web Template (DWT), Microsoft Expression Web prompts you to update the web pages attached to the DWT.
After the updates are complete, a dialog box reports how many web pages were updated.
Flexibility is the main advantage of a static site: every page can be different if desired, matching the layout to different content, and the designer is free to add any special effects a client may ask for in a unique way on different pages.

So, what is the least intrusive way to ask a site whether a page has changed since some arbitrary time, or when the page was last updated? I could obviously download the whole page and compare it with the content I have saved on file, but I want to reduce overhead.

We use a huge set of computers to fetch (or "crawl") billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider).
This is why we created the Screaming Frog Log File Analyser: a crawler will only ever be a simulation of search bot behaviour.