  1. #1

    Which language/technology for this task?

    First post here by the way. If I'm looking to have my site pull a string of numbers from another site and graph the results on a weekly basis, can you suggest which language/technology would be easiest to do this with?

  2. #2
    Join Date
    Dec 2007
    Location
    Lebanon
    Posts
    413
    Depends on where on the other site the data is coming from?

  3. #3
    Join Date
    Apr 2002
    Location
    Hollywood, CA
    Posts
    3,046
    Site scraping... Perl, should be simple with LWP::UserAgent and/or WWW::Mechanize. Plenty of articles online that document the process too!

  4. #4
    Quote Originally Posted by case View Post
    Site scraping... Perl, should be simple with LWP::UserAgent and/or WWW::Mechanize. Plenty of articles online that document the process too!
    Great. Thanks for giving me some leads to go with. Appreciate it. I'm sure I'll be back with more questions once I've read up on the articles.
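
    A minimal sketch of the fetch-and-extract step case describes, written in Python rather than Perl just for illustration; the URL and the number pattern are placeholders, not anything from this thread.
    Code:
    # Fetch a page and pull out every number on it.
    # (LWP::UserAgent / WWW::Mechanize do the same job in Perl.)
    import re
    import urllib.request

    URL = "http://example.com/stats.html"  # hypothetical source page

    with urllib.request.urlopen(URL, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")

    # Keep every integer or decimal that appears in the page text.
    numbers = [float(n) for n in re.findall(r"\d+(?:\.\d+)?", html)]
    print(numbers)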

  5. #5
    Join Date
    Feb 2003
    Location
    Canada
    Posts
    958
    Another option would be Ruby, hpricot, and Gruff Graphs.

  6. #6
    PHP with JpGraph is good too, or Artichow.
    Last edited by Renard Fin; 02-04-2008 at 11:28 AM.

  7. #7
    Join Date
    Oct 2002
    Location
    WWW
    Posts
    718
    Quote Originally Posted by Renard Fin View Post
    PHP with JpGraph is good too, or Artichow.
    I also recommend PHP.
    “We don’t see things as they are, we see them as we are.” - Anais Nin

  8. #8
    Join Date
    Nov 2005
    Location
    Palma de Mallorca, Spain
    Posts
    259

  9. #9
    Join Date
    Apr 2000
    Location
    California
    Posts
    3,051
    You can use any language out there to accomplish this and many other tasks. It's really about preference and which ones you know best. You can use Perl, PHP, Python, Ruby, C, C++, shell scripts (sh, ksh, etc.), and so on.

  10. #10
    PYTHON POWER!!!
    Very powerful, simple, productive, etc...
    You can use matplotlib to make amazing graphs, and several other libs to handle the web slurping...

    As said, most languages can do the job.
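
    For what it's worth, a minimal matplotlib sketch of the graphing half: one data point per week, saved as a PNG the site can serve. The dates, values, and file name below are made up.
    Code:
    from datetime import date, timedelta

    import matplotlib
    matplotlib.use("Agg")              # render straight to a file; no display needed on a server
    import matplotlib.pyplot as plt

    # Placeholder data standing in for the numbers scraped each week.
    weeks = [date(2008, 1, 7) + timedelta(weeks=i) for i in range(6)]
    values = [42, 48, 45, 51, 60, 58]

    plt.plot(weeks, values, marker="o")
    plt.title("Weekly values")
    plt.xlabel("Week")
    plt.ylabel("Value")
    plt.gcf().autofmt_xdate()          # tilt the date labels so they don't overlap
    plt.savefig("weekly_graph.png")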

  11. #11
    It should be easy to do in Perl, with LWP::Simple & HTML::TreeBuilder.

    Then again, is the data you want to pull in a table? The Windows version of Excel has a pretty nice feature that lets you grab external web data. From there you can use whatever other features you want (e.g., pivot tables, charts, etc.).

    -Bill
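
    A rough sketch of the tree-parsing idea in that first suggestion, using Python's standard library in place of HTML::TreeBuilder: walk the table cells and keep the ones that look like numbers. The sample HTML here is made up.
    Code:
    from html.parser import HTMLParser

    class CellCollector(HTMLParser):
        """Collect the text of every <td> cell in the page."""

        def __init__(self):
            super().__init__()
            self.in_cell = False
            self.cells = []

        def handle_starttag(self, tag, attrs):
            if tag == "td":
                self.in_cell = True

        def handle_endtag(self, tag):
            if tag == "td":
                self.in_cell = False

        def handle_data(self, data):
            if self.in_cell and data.strip():
                self.cells.append(data.strip())

    parser = CellCollector()
    parser.feed("<table><tr><td>Week 1</td><td>42</td></tr>"
                "<tr><td>Week 2</td><td>48</td></tr></table>")
    numbers = [float(c) for c in parser.cells if c.replace(".", "", 1).isdigit()]
    print(numbers)   # [42.0, 48.0]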

  12. #12
    I often develop tools that use server-side Java to pull data [including XML] from other sites. It is fast; however, you must have a web host that supports server-side Java.

    Q...

  13. #13
    Join Date
    Sep 2005
    Location
    Canada
    Posts
    645
    PHP (or a C++ CGI) using the cURL library will turn any web page into a file you can parse and do whatever with.

    PHP has some built-in graphing functions as well.

    That said, you can use just about any language to do this. If your roots are C/C++, PHP is probably easiest, but that's just an opinion, which is worth the paper it's written on.
    VPSVille.com
    Toronto, London, Dallas, Los Angeles
    Quality VPS hosting on Premium bandwidth

  14. #14
    Join Date
    Feb 2003
    Location
    L.A. C.A.
    Posts
    335
    Quote Originally Posted by Shikha View Post
    I also recommend PHP.
    I also recommend PHP.
    WLKNS.co - A collection of my programmer thoughts

  15. #15
    Join Date
    Apr 2000
    Location
    California
    Posts
    3,051
    Quote Originally Posted by arkin View Post
    I also recommend PHP. :)
    I also recommend Perl. :)

    Or PHP

    Or a dozen other languages.
