Problem Description
I'm looking for a library that has functionality similar to Perl's WWW::Mechanize, but for PHP. Basically, it should allow me to submit HTTP GET and POST requests with a simple syntax, and then parse the resulting page and return in a simple format all forms and their fields, along with all links on the page.
I know about cURL, but it's a little too bare-bones, and the syntax is pretty ugly (tons of curl_foo($curl_handle, ...) statements).
Clarification:
I want something more high-level than the answers so far. For example, in Perl, you could do something like:
# navigate to the main page
$mech->get( 'http://www.somesite.com/' );
# follow a link that contains the text 'download this'
$mech->follow_link( text_regex => qr/download this/i );
# submit a POST form, to log into the site
$mech->submit_form(
    with_fields => {
        username => 'mungo',
        password => 'lost-and-alone',
    }
);
# save the results as a file
$mech->save_content('somefile.zip');
To do the same thing using HTTP_Client or wget or cURL would be a lot of work: I'd have to manually parse the pages to find the links, find the form URL, extract all the hidden fields, and so on. The reason I'm asking for a PHP solution is that I have no experience with Perl; I could probably build what I need with a lot of work, but it would be much quicker if I could do the above in PHP.
Recommended Answer
SimpleTest's ScriptableBrowser can be used independently of the testing framework. I've used it for numerous automation jobs.
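For reference, a rough sketch of how the Perl example above might translate to SimpleTest's scriptable browser (the SimpleBrowser class). This is an untested sketch: the URL, link text, field names, and submit button label are placeholders carried over from the example, and note that clickLink() matches the visible link text rather than a regex.

// load SimpleTest's standalone browser class
require_once('simpletest/browser.php');

$browser = new SimpleBrowser();

// navigate to the main page
$browser->get('http://www.somesite.com/');

// follow a link by its visible text
$browser->clickLink('download this');

// fill in and submit the login form
$browser->setField('username', 'mungo');
$browser->setField('password', 'lost-and-alone');
$browser->clickSubmit('Log in');   // button label is a placeholder

// save the resulting content to a file
file_put_contents('somefile.zip', $browser->getContent());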