What is a good, simple-to-use program that will take a list of URLs and recover the HTML of the pages?

mike1942f

New member
I would like to get the HTML of a sampling of Yahoo! Answers pages, based on the URLs included in the e-mail sent to me about Best Answers, so I want a program that reads a text file of URLs, connects to the Internet, and saves the source code out to another file. Any suggestions? I'm working in Windows XP and am used to working in MS-DOS. I do some Visual Basic and a lot of older BASIC programming.
 
More and more, servers are checking whether there is a human on the other end. What used to work in the browser as View Source is now blocked at many sites.

I gained a lot of useful insight into some of the "why" of HTTP and HTML from playing around with Mr. Heaton's book on bots and spiders. The first two chapters were an incredible read, and it builds on them later with Java programs.

Just wanted to mention it in case none of the other answers gets you where you want to be.
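Since that book works through its examples in Java, here is a minimal sketch of the general idea (not taken from the book): read one URL per line from a text file, send a browser-style User-Agent so fewer servers turn the request away, and save each page's raw HTML to a numbered file. The file names (urls.txt, page0.html, ...) and the User-Agent string are my own choices for illustration.

import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.List;

public class FetchPages {
    public static void main(String[] args) throws IOException {
        // One URL per line in urls.txt (file name is an assumption, not a convention from the book).
        List<String> urls = Files.readAllLines(Paths.get("urls.txt"));

        int count = 0;
        for (String line : urls) {
            String address = line.trim();
            if (address.isEmpty()) {
                continue; // skip blank lines
            }

            HttpURLConnection conn = (HttpURLConnection) new URL(address).openConnection();
            // Many servers refuse requests that don't look like a browser,
            // so send an ordinary browser-style User-Agent header.
            conn.setRequestProperty("User-Agent", "Mozilla/5.0");

            // Save the raw HTML of each page as page0.html, page1.html, ...
            try (InputStream in = conn.getInputStream()) {
                Files.copy(in, Paths.get("page" + count + ".html"),
                        StandardCopyOption.REPLACE_EXISTING);
            }
            conn.disconnect();
            count++;
        }
    }
}

Compile and run it from the folder that holds urls.txt; each page's source lands in its own numbered HTML file.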
 
Try using a program called HTTrack. It downloads most of what is on a website and stores it locally on your computer, so you can modify the code if you like, or just browse the site offline.
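If you go the HTTrack route, the basic command-line form looks roughly like this (the URL and output folder are placeholders, and this assumes the command-line build of HTTrack is installed):

httrack "http://www.example.com/" -O "C:\mirrors\example"

Run httrack --help for the full option list, including how to feed it a text file of URLs instead of a single address.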
 