Coding For Online Success

Simple script for connecting to Commission Junction’s Product Web Service and populating a local database

In today's post I'm going to get back to code; about time, I would say. I thought it would be cool to talk about a way to populate a local database with Commission Junction products, which you could then display on your store front. I like to pre-populate a store and serve from a local database instead of hitting a web service on every page view, so I can speed up the experience for the visitor. Web service calls always have a pause, and CJ's seemed a little slow.

Here's what we're going to use to pull this off. You'll need to make sure all of these requirements are met:

PHP

MySQL

NuSOAP client (for sending/receiving web service requests). PHP5 users are spoiled with better built-in SOAP functionality and won't strictly need it, but you're on your own for sending/receiving the XML; this tutorial covers PHP4 only. You can download NuSOAP here: http://dietrich.ganx4.com/nusoap/

This tutorial will discuss receiving and populating a products database (for example, every hour) to maintain a fresh product list from a CJ merchant. When I initially created this I wrote it as a CLI (command-line) script, with cron running it every hour. If you plan on using this as a straight-up web script, ensure you've limited access to it (usually to the web server or your IP only). Another security consideration: this script uses mysql_real_escape_string(); if your environment has magic_quotes_gpc enabled (which auto-escapes incoming data), you'll need to run stripslashes() on the value first so nothing gets double-escaped.
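One way to handle that, sketched as a small helper: undo magic_quotes' automatic backslashes first (when it's on), then escape exactly once yourself. The addslashes() fallback here is just a stand-in for environments where the mysql extension isn't loaded; mysql_real_escape_string() against a live connection is what you actually want.

```php
<?php
function db_escape($value) {
    // magic_quotes_gpc (PHP4 / early PHP5) auto-escapes incoming data;
    // strip that first so the value is never double-escaped
    if (function_exists('get_magic_quotes_gpc') && get_magic_quotes_gpc()) {
        $value = stripslashes($value);
    }
    // Escape once, ourselves
    return function_exists('mysql_real_escape_string')
        ? mysql_real_escape_string($value)
        : addslashes($value); // stand-in when the mysql extension is absent
}
```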

How CJ's web services work is pretty simple. Based on the PID you pass to them, you're able to search for products from all merchants associated with your account. Obviously you'll want to be able to search only a few merchants, so you might want to do an initial unrestricted search for products (please see http://webservices.cj.com/ and read the API Reference and Help section). From there you'll be able to see each merchant's 'AdvertiserID'; once you have those, you'll be able to narrow your search down. I haven't seen a better way to get this, and if you know one... please tell me!

Here we’re setting up some important web service variables on the products I’m looking for:

$Keywords = 'widgets';
$LowPrice = 0;
$HighPrice = 1000;
$AdvertiserIDs = '123,5566'; // these are the advertiser IDs I was talking about; comma-delimit them when adding more than one

Now it's time to make our call to the web service! The code below assembles a proper web service request and connects to CJ. It then receives a response and sets it up as local variables for us.
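A minimal sketch of what such a call might look like with NuSOAP follows. The parameter names, the 'search' method, and the response shape are assumptions based on CJ's API Reference of the era, and the WSDL URL and credentials are placeholders; verify all of them against http://webservices.cj.com/ before using.

```php
<?php
// Search parameters from the setup above
$Keywords      = 'widgets';
$LowPrice      = 0;
$HighPrice     = 1000;
$AdvertiserIDs = '123,5566';

// Credentials (placeholder values -- use your own)
$DeveloperKey = 'YOUR-DEVELOPER-KEY';
$PID          = 'YOUR-PID';

// Assemble the request; these parameter names are illustrative,
// confirm the real ones in CJ's API Reference
$params = array(
    'developerKey'  => $DeveloperKey,
    'token'         => $PID,
    'keywords'      => $Keywords,
    'advertiserIds' => $AdvertiserIDs,
    'lowPrice'      => $LowPrice,
    'highPrice'     => $HighPrice,
);

// The NuSOAP call itself (guarded so the sketch still runs without the library)
if (file_exists('nusoap.php')) {
    require_once('nusoap.php');
    $client   = new soapclient('URL-OF-THE-PRODUCT-SERVICE-WSDL', true); // see webservices.cj.com
    $response = $client->call('search', $params);
    if ($client->getError()) {
        die('SOAP error: ' . $client->getError());
    }
}
```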

If there are records returned, I empty the products table (you can do what you want here, e.g. UPDATE, check for changes, etc.). Once the table is empty, I re-populate it with the current data. I run this script once a day.
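Sketched out, that refresh step might look like the following. I'm assuming the response exposes the matched records under a 'products' key with fields like 'name', 'buyUrl', and 'price', and that tblProducts has matching columns; print_r($response) to see the real structure before relying on these names.

```php
<?php
// $response comes from the web service call above
$products = isset($response['products']) ? $response['products'] : array();
$count = 0;

if (count($products) > 0 && function_exists('mysql_query')) {
    // Start fresh each run; swap this for UPDATE/diff logic if you prefer
    mysql_query('TRUNCATE TABLE tblProducts');

    foreach ($products as $p) {
        $name  = mysql_real_escape_string($p['name']);
        $buy   = mysql_real_escape_string($p['buyUrl']);
        $price = (float) $p['price'];
        if (mysql_query("INSERT INTO tblProducts (name, buyUrl, price)
                         VALUES ('$name', '$buy', $price)")) {
            $count++;
        }
    }
}

echo "$count rows inserted\n";
```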

This script was intended to be a 'worker' script, not accessible by the public, run only internally from cron to re-populate my database table. You definitely could use some of the programming above to make real-time calls, etc. If you want real-time search, make sure you sanitize the search keyword with htmlspecialchars(strip_tags()). Keywords are somewhat quirky; you'll want to refer to the API documentation on this, since you can use positive and negative keywords to help refine your search.
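For a real-time search front end, that sanitizing step can live in a small helper like this (the 'keywords' parameter name in $_GET is just an example):

```php
<?php
// Strip any HTML, then escape what's left, so a visitor-supplied keyword
// can't inject markup into your page or your request
function clean_keyword($kw) {
    return htmlspecialchars(strip_tags(trim($kw)));
}

$raw      = isset($_GET['keywords']) ? $_GET['keywords'] : 'blue <b>widgets</b>';
$Keywords = clean_keyword($raw);
```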

Again, make sure that this script is not accessible to your visitors since you don’t want robots or malicious users hitting this script and potentially abusing CJ’s web service or corrupting your products database table.

Whoa. That is a weird error. Are you connecting to the version 1 WSDL? It's basically saying that search() is not an available function in the web service. I have a feeling you might be trying to connect to version 2?

I was assuming that there needs to be a " (quote) after the word tblProducts to close the query. I tried inserting a quote after the word tblProducts, and when I ran the script it did not give me an error, but it did not populate the tblProducts table at all. Nothing happened.

One last thing: the code runs without any errors, but the tbl_Products table in my database is empty. I cannot seem to figure out why. If it could not connect, or if there was an error in the code, wouldn't MySQL display some sort of error message?

Nothing in the database table is probably the result of 0 records returned from your API query. I would echo $count to verify that the loop is actually running the INSERTs. If the count is 0, then nothing would be inserted.

Hi Hanji!
I get XML files from several stores via FTP, and after that I put the data into MySQL. My question is: should I include the developer key in the PHP file? Everything happens on my server, so why should I include the DK?

First question: I get XML files from several stores via FTP, and after that I put the data into MySQL. Should I include the developer key in the PHP file? Everything happens on my server, so why include the DK?

My example was using NuSOAP to get the feed. Since you're pulling the files directly from FTP, you will NOT need the DK on the server; you already have the data and are just parsing it into your database.

I'm definitely confused by your second question: Hanji, there is a more important question, and if you can answer it that would be awesome.
I want to pass my own value in the buyUrl: http://www.kfgzyfj.com/hg116xdmjdl0484669502163A857&customer_id=235434654
So &customer_id=235434654 is the one. When I get a Commission Detail Service REST or SOAP report, do they send my values back? Do you understand what I'm talking about?

I'm unfamiliar with the Commission Detail Service REST API. Are you pulling reports from their API?

Thanks a lot for your attention, Hanji! As I understood from what I read there, I can get very detailed reports for all deals and commissions.
I wanted to get an answer from their forum, but it is very empty.
So...
I want to attach some values to their link and get those values back when they send a report to me via XML.
So is that possible or not? Hmmm.
