The actual implementation is a bit of a long answer, but I'll give you a short summary and you can pursue whichever strategy you want. See the links below for more information.
Many sites these days offer data via "web services" such as "RPC", "SOAP", "REST", etc. Search the target site for developer or "API" (application programming interface) documentation.
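If the site does have an API, the response is usually structured data (commonly JSON) that you can parse directly instead of scraping. Here's a minimal sketch in Python; the payload below is made up for illustration (in real use you'd fetch it from the site's documented endpoint with `urllib.request` or the third-party `requests` library):

```python
import json

# Hypothetical JSON payload, as a site's REST API might return it.
# In practice you would fetch this over HTTP from a documented endpoint.
payload = '{"items": [{"id": 1, "title": "First post"}, {"id": 2, "title": "Second post"}]}'

data = json.loads(payload)
titles = [item["title"] for item in data["items"]]
print(titles)  # ['First post', 'Second post']
```

Because the API promises a stable structure, this is far more robust than parsing HTML, which can change whenever the site redesigns.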
You can also "screen scrape": make an HTTP request from inside your program (essentially turning it into its own special-purpose browser), then pattern-match or otherwise parse the response for the data you want.
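As a sketch of the parsing step in Python, here's a scraper built on the standard library's `html.parser`. The HTML string and the `price` class are invented for the example; in a real scraper you'd fetch the page first, e.g. with `urllib.request.urlopen(url).read().decode()`:

```python
from html.parser import HTMLParser

# Sample HTML standing in for a fetched page (structure is hypothetical).
html = ('<html><body><h1>Prices</h1>'
        '<span class="price">$19.99</span>'
        '<span class="price">$4.50</span>'
        '</body></html>')

class PriceParser(HTMLParser):
    """Collects the text inside every <span class="price"> element."""
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data)
            self.in_price = False

parser = PriceParser()
parser.feed(html)
print(parser.prices)  # ['$19.99', '$4.50']
```

A regex would work for simple cases, but a real HTML parser handles nesting and attribute order more reliably.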
NOTE: Screen scraping may violate some sites' terms of service (TOS). Please be a good net citizen and check site policy first.
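Part of being a good citizen is honoring the site's `robots.txt`. Python ships a parser for it in `urllib.robotparser`; the rules and user-agent name below are made up for the example (normally you'd point the parser at `http://thesite/robots.txt` with `set_url()` and `read()`):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; real ones live at the site root.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check whether our (invented) user agent may fetch each URL.
print(rp.can_fetch("MyScraper/1.0", "http://example.com/public/page.html"))   # True
print(rp.can_fetch("MyScraper/1.0", "http://example.com/private/data.html"))  # False
```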
For PHP, consider the cURL extension.
For Perl, check out "LWP" or "WWW::Mechanize".
Search engine indexing spiders are a form of screen scraping. So are the email address harvesters used by spammers.