
Reference
#1

Hi,

I'm having some problems with the reference.
If I use cmtx_title or cmtx_h1 it really slows down my website.

If I use cmtx_url, the topic is empty.

This is my configuration:

$cmtx_parameters = 'id';
$cmtx_identifier = 'cmtx_url';
$cmtx_reference = 'cmtx_title';
$cmtx_path = 'comments/';
define('IN_COMMENTICS', 'true'); //no need to edit this line
require $cmtx_path . 'includes/commentics.php'; //no need to edit this line

Any suggestions?

Many thanks
#2

Hi,

The 'cmtx_title' and 'cmtx_h1' keywords are generally slow because they have to load your page using cURL in order to extract the relevant bits from the page's source code before allowing the original page to continue loading. There's very little that I can do to help with that. It works quickly on my websites, which is why I added it, but other websites seem to struggle. I can only suggest supplying the reference value directly, without using these keywords.

The topic is taken from the reference, so if the topic is empty then it's likely that the reference is empty too. Note that once the page is created, the topic uses the reference from the database and no longer the reference from the integration code. If you want to change the topic after the page is created, you can either use the $cmtx_set_topic variable in the integration code or edit the reference in the database using Manage -> Pages in the admin panel.
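For example, based on your configuration, the integration code could supply the value directly, something like this (the title text is just a placeholder, and the $cmtx_set_topic line is only needed if the page has already been created):

$cmtx_parameters = 'id';
$cmtx_identifier = 'cmtx_url';
$cmtx_reference = 'Your Page Title Here'; // placeholder: supply the actual page title directly
$cmtx_set_topic = 'Your Page Title Here'; // optional: updates the topic of an already created page
$cmtx_path = 'comments/';
define('IN_COMMENTICS', 'true'); //no need to edit this line
require $cmtx_path . 'includes/commentics.php'; //no need to edit this line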

Have you completed the interview?
#3

Ok thank you Steven

Then I have to find out why it is loading so slowly. Using the h1 or title would be awesome. I'm planning to use the 'Latest comments' add-on.
Adding the topic manually is too much work. It would be cool to have it fully automated.
#4

Actually my last post wasn't totally accurate. If 'allow_url_fopen' is enabled on your server then it will use the file_get_contents() function instead of cURL.

To find out whether it's using file_get_contents() or cURL, can you open /comments/includes/functions/page.php and add the echo lines marked "// add this line" below:

if ((bool)ini_get('allow_url_fopen')) {
    echo 'Using file_get_contents()'; // add this line
    $file = file_get_contents($path);
} else if (extension_loaded('curl') && cmtx_get_ip_address() != "127.0.0.1") { //if cURL is available and not on localhost
    echo 'Using cURL'; // add this line
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 5);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
    curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)");
    curl_setopt($ch, CURLOPT_URL, $path);
    $file = curl_exec($ch);
    curl_close($ch);
}

If it turns out it's not using cURL, you might want to comment out these lines so that it falls back to cURL instead:

//if ((bool)ini_get('allow_url_fopen')) {
// $file = file_get_contents($path);
//} else if (extension_loaded('curl') && cmtx_get_ip_address() != "127.0.0.1") { //if cURL is available and not on localhost
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)");
curl_setopt($ch, CURLOPT_URL, $path);
$file = curl_exec($ch);
curl_close($ch);
//}

If cURL is slow, try adding a var_dump() of curl_getinfo() straight after curl_exec() to get detailed information about why:

$file = curl_exec($ch);
var_dump(curl_getinfo($ch));
curl_close($ch);
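
If the full dump is too noisy, a smaller variation along the same lines prints just the timing figures (the key names are standard curl_getinfo() fields):

$file = curl_exec($ch);
$info = curl_getinfo($ch);
echo 'Total: ' . $info['total_time'] . 's, ';
echo 'Name lookup: ' . $info['namelookup_time'] . 's, ';
echo 'Connect: ' . $info['connect_time'] . 's, ';
echo 'Start transfer: ' . $info['starttransfer_time'] . 's';
curl_close($ch);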

Have you completed the interview?
#5

Thank you very much Steven,

It was indeed using file_get_contents().
I disabled allow_url_fopen on the server and now it is using cURL.

It is a bit faster, but still too slow.
But if I set CURLOPT_TIMEOUT to "1" it loads within that time. Is it safe to leave it like this?

The information I get using var_dump:

Using cURLarray(22) {
["url"]=> string(67) "http://www.domain.com"
["content_type"]=> NULL
["http_code"]=> int(0)
["header_size"]=> int(0)
["request_size"]=> int(175)
["filetime"]=> int(-1)
["ssl_verify_result"]=> int(0)
["redirect_count"]=> int(0)
["total_time"]=> float(10.012068)
["namelookup_time"]=> float(0.001062)
["connect_time"]=> float(0.001133)
["pretransfer_time"]=> float(0.001197)
["size_upload"]=> float(0)
["size_download"]=> float(0)
["speed_download"]=> float(0)
["speed_upload"]=> float(0)
["download_content_length"]=> float(-1)
["upload_content_length"]=> float(0)
["starttransfer_time"]=> float(0)
["redirect_time"]=> float(0)
["certinfo"]=> array(0) { }
["redirect_url"]=> string(0) "" }
#6

Is it working when you set that to "1"? You wouldn't want it to time out before it's done its job.

If it times out prematurely then you might get an identifier/reference of "Title tag not found", "H1 tag not found" or "Server incapable" when the page is created.

From the var_dump, it looks like it's taking 10 seconds in total, which matches the CURLOPT_TIMEOUT of 10. The name lookup, connection, pretransfer and redirect times aren't to blame, which is unfortunate in a way, because they would at least have pointed to the issue. I'm no expert on cURL, so it might be worth asking your host about it. It might be that they can tweak something, for example compression support, to make it faster.
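
One thing that might be worth trying on your side as well, although I can't say for sure it will help: asking cURL to accept compressed responses. That would be an extra curl_setopt() call alongside the others:

curl_setopt($ch, CURLOPT_ENCODING, ''); // empty string means accept any encoding cURL supports (e.g. gzip, deflate)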

Have you completed the interview?
#7

Hmm, sorry. The var_dump is from before I changed the timeout setting.

But I did some research on cURL and found a tip to use the server IP if cURL is slowing things down.
I changed cmtx_get_ip_address to my (dedicated) server IP and restored the timeout setting to "10".
It is indeed working faster now, even faster than setting the timeout to "1".
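
For anyone else reading, the general idea of the tip is to make cURL request the page by the server's IP while keeping the original host name, something like this (placeholder IP, and not necessarily the exact edit I made in page.php):

curl_setopt($ch, CURLOPT_URL, 'http://203.0.113.10/page.php'); // request by server IP (placeholder) instead of the domain
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Host: www.domain.com')); // keep the original host name for the web server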

All looking good now.

Thanks for the support so far!
#8

Okay, glad to hear it's working quickly now.

Have you completed the interview?
#9

Hi pascal,

I think I have identified and fixed the issue with the 'cmtx_title' and 'cmtx_h1' keywords being really slow.

Basically, because the script re-opens the same page that it is on in order to get its source code, it was getting into an infinite loop of re-opening itself, as there was no check to make sure it would only do so once. It therefore kept re-loading itself in the background until it timed out.

That's why it worked fine when you set the timeout to "1": by that point it had already done its job, and any further time it took was unnecessary.

To fix this, what I have done is set the user-agent of the script's own request to "Commentics". Then, at the very start of the script, it checks whether the user-agent is "Commentics" and, if so, exits straight away. This way, when a normal user views the page, everything proceeds as usual: the script still re-opens itself to get the source code, but that second request stops immediately because the script knows it's only Commentics loading the page.
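
In rough terms, the change looks like this (a simplified sketch; the actual code in the development version's page.php may differ):

// At the very top of the page, before anything else runs:
// stop immediately if it's Commentics itself requesting the page
if (isset($_SERVER['HTTP_USER_AGENT']) && $_SERVER['HTTP_USER_AGENT'] == 'Commentics') {
    exit;
}

// And in the cURL request inside page.php, identify the script as Commentics
curl_setopt($ch, CURLOPT_USERAGENT, 'Commentics');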

It's very fast now. If you want to test it out, just go to the development version and click the "Download Zip" button on the right-hand side. Then update your /comments/includes/functions/page.php file with the one from the download.

Have you completed the interview?

