Attachments:

Please indicate how many lines you are uploading. I assume you are referring to the CSV files shown in your attachment. Also, what settings are you using under the Bulk Actions importer (i.e., skip geocoding, etc.)?

There are no warnings or any other messages. I upgraded to SLP 4.3.20 and SLP Pro Pack 4.3.02. Not sure how to find what version I upgraded from…

You would need to use the Debugger to see any JavaScript errors.
Also see Troubleshooting Ajax Blocked. For some reason the server is not allowing the load, so I am thinking there may have been some recent changes in your system configuration or an issue with memory.
I haven’t seen this issue of late, but Lance wrote an extensive article on what to check. As mentioned, we have customers loading very large lists.
See Lance’s blog post from some time ago about the number of locations he has loaded and any issues he ran into. You can search Docs and News for info on this subject.

If you are on a shared server, the issue is most likely with the configuration and limitations of your server. A Google API key will not change your server capacity; it only governs the limits on geocoding queries to Google.

Based on what you have described, as mentioned before, check with your server provider or your OS documentation for limitations. We cannot tell you which server to get or use; that is outside the scope of this topic.

Operating System Limits

There are various operating system limits that can cause issues, including general process limitations, directory security issues, and memory limitations. All system-level applications that run on a server are beholden to the over-arching system limits set by the server administrator. For WordPress sites, Apache (or whatever flavor of web server you use) is the primary communicator with the operating system and is the most likely process to be flagged for exceeding resource limits. PHP can also be flagged, depending on how you have it configured to work with Apache. Finding server-level limits means searching system-level log files.
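As a rough illustration of what these process-level caps look like, the Python sketch below (assuming you have shell access to the server; these are standard Unix limit names, not SLP settings) prints a few limits that commonly cap long-running imports and exports:

```python
import resource

# A sketch of inspecting per-process OS limits (Unix-only "resource" module).
# These caps are set by the server administrator and apply to Apache/PHP
# just like any other process; RLIM_INFINITY means "unlimited".
LIMITS = [
    ("address space (bytes)", resource.RLIMIT_AS),
    ("CPU time (seconds)", resource.RLIMIT_CPU),
    ("open files", resource.RLIMIT_NOFILE),
]

for label, which in LIMITS:
    soft, hard = resource.getrlimit(which)
    print(f"{label}: soft={soft} hard={hard}")
```

A low address-space or CPU-time cap here is the kind of limit that would abort a large import mid-run.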

[box]This problem is rare.[/box]

Network Limits

Some firewalls, especially in shared hosting environments, can throttle and time out high-bandwidth data streams.

There are several factors that work in tandem when dealing with long-running web processes, and all of them can limit exports (and imports).

An environment that is properly tuned to deal with large data sets will have no issues importing, exporting, or managing large lists of locations in Store Locator Plus. The core of the product is coded to work well with large data sets, with consideration for things like per-event execution time, per-event memory limitations, and reduced disk I/O operations.

The “Debugging WordPress” section on the main Troubleshooting page can help. Turning on the WordPress debug log can quickly identify and resolve export issues. The web server log files (/var/log/httpd/*log on Apache RHEL servers) are also a good place to find hints about limits.
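For reference, the usual way to turn on the WordPress debug log is via constants in wp-config.php (these are core WordPress settings, not specific to SLP):

```php
// In wp-config.php, above the "That's all, stop editing!" line.
define( 'WP_DEBUG', true );          // enable debug mode
define( 'WP_DEBUG_LOG', true );      // write notices to wp-content/debug.log
define( 'WP_DEBUG_DISPLAY', false ); // keep notices out of page output
```

With these set, failed imports and exports typically leave a trail in wp-content/debug.log.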

P.S. I just thought of something. You said you had a previous version, so a part of the puzzle is missing here: did you export a CSV file that you already had, and are you trying to re-import it, possibly on a new domain?

Under Manage Locations > Bulk Actions, what settings are you using for the Bulk Import actions? Are you trying to re-import a list that has already been partially geocoded?

Under the Import button:

Did you use the setting to Skip Geocoding, or just geocode missing entries, skip duplicates, etc.?

Skip First Line

First Line Has Field Name

Skip Geocoding

Load Data

Duplicates Handling

If you can, send us the CSV file you are trying to import so I can test it on my site.

If you do not want to do that, you can try using Janitor to clean up those locations and re-import.

The .csv file is not from a previous version. The list does contain some locations that are already uploaded. I am not sure how to send the CSV file because it is too big. I have attached an abridged one… please try that!

I’m not sure which checkboxes to use. I’ve tried all, none, and combos… what do you recommend?

Attachments:

Hi Erin,
I sent you an email as well that provides more details but wanted to get back to you in this thread in case anyone sees it for answers in the forum.

My response:

By the looks of the list you provided in the forum, there were errors that I believe point to an issue with some of the addresses. For instance, in the address line you have info that cannot be geocoded by Google; that is why you are exceeding your Google query limit even with an API key. Google can retry forever and still not be able to geocode it.
It is not in the proper format, i.e.:

Raisin Rack
2545 Schrock Rd. Westerville, OHæ

The “Westerville, OHæ” is confusing the system trying to geocode; it loops through Google, and that is why you are getting “over Google query” errors. You cannot have special characters in your CSV file format.
The commercial version of Google Maps could not locate that address either, and it will keep looping in the SLP system trying to geocode. Google Maps (not using SLP) returns this response:

We could not find 2545 Schrock Rd. Westerville, OHæ
Make sure your search is spelled correctly. Try adding a city, state, or zip code.
More Google search results for 2545 Schrock Rd. Westerville, OH

There are other errors in your CSV file format: you have a comma between some entries and not others where you added NW.

You should export your list in SLP to a CSV file, and then you might need to use Janitor if the locations didn’t delete completely. Inspect your CSV file using the best practices discussed in Troubleshooting and Documentation (i.e., if you want to skip a field, that column in the CSV file must be blank). You have some locations geocoded and others that are not; you can handle that by deleting that entire column, or by using the bulk update option to skip it.
Additional info on how to set retries is under Geocoding Errors.

Set the retries as suggested above.

If the developer finds anything else, he will address it in the next few days.