Subj : Re: Novice seeking advice
To   : comp.programming
From : Willem
Date : Mon Aug 08 2005 10:17 pm

programmernovice@yahoo.com wrote:
) I'd like to learn to extract information from websites automatically —
) information that is available from them manually. Sites such as ebay, etc.
) I have a little basic familiarity with logic, looping, etc., but
) unfortunately I learned to program in the Fortran days! I'd appreciate
) any and all recommendations on how to go about getting started on
) this, such as which language and which books are best to learn from.
) Many thanks for all help.

First, learn the HTTP protocol. Then find out how these websites work,
i.e. what kind of HTTP requests they generate from the information you
type in manually.

Then it should be a matter of forming an HTTP request that will get you
the information you need, sending it to the right server, receiving the
data you get in response (usually a text/html stream), and cutting out
the relevant bits.

So what you need is a language that makes it easy, and/or has libraries,
to parse and form text and to do TCP/IP. Perl comes to mind, as does
Tcl, and probably a host of other high-level scripting languages. It
really depends on what you already know and what you're comfortable with.

SaSW, Willem
-- 
Disclaimer: I am in no way responsible for any of the statements made in
the above text. For all I know I might be drugged or something.. No, I'm
not paranoid. You all think I'm paranoid, don't you!
#EOT
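[Editor's note: the HTTP exchange the reply describes can be sketched without any network access. The poster suggests perl or tcl; Python is used here purely for illustration, and the host, path, and response text are made-up placeholder values, not output from any real site.]

```python
# Sketch of the HTTP/1.1 wire format: a hand-built GET request and the
# parsing of a canned response.  All values below are placeholders.
request = (
    "GET /search?q=widgets HTTP/1.1\r\n"   # request line: method, path, version
    "Host: www.example.com\r\n"            # which server we want (required in 1.1)
    "Connection: close\r\n"
    "\r\n"                                 # a blank line ends the header block
)

# A typical text/html response, as it would arrive back on the socket.
response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html><body><b>42 items found</b></body></html>"
)

# Headers and body are separated by the first blank line.
head, _, body = response.partition("\r\n\r\n")
status_line = head.split("\r\n")[0]
print(status_line)   # HTTP/1.1 200 OK
print(body)
```

In practice the request string would be written to a TCP socket on port 80 and the response read back from it; the point here is only the framing that "learn the HTTP protocol" refers to.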