Location imports have been updated in the Power add-on to simplify and speed up the import process. CSV remains the preferred import format for locations.
CSV File Format
The CSV file should be UTF-8 encoded, with strings enclosed in quotes and fields separated by commas.
Special characters should be stripped or converted to proper UTF-8. In our experience, many applications such as Numbers and Excel do not strip special characters properly when exporting a UTF-8 CSV file. Google Sheets has been a reliable application for importing those CSV files and re-exporting them with special characters removed.
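If you prefer to clean a file yourself rather than round-trip it through Google Sheets, a small script can do the same job. This is a sketch of our own (the function names are not part of Store Locator Plus): it normalizes accented characters and drops anything outside plain printable ASCII, such as smart quotes.

```python
import csv
import unicodedata

def clean_field(value: str) -> str:
    # Decompose accented characters (é -> e + combining accent), then
    # drop anything that is not printable ASCII (smart quotes, etc.).
    normalized = unicodedata.normalize("NFKD", value)
    return "".join(ch for ch in normalized if ch.isascii() and ch.isprintable())

def clean_csv(src_path: str, dst_path: str) -> None:
    # Re-write the file as UTF-8 with every string properly quoted.
    with open(src_path, newline="", encoding="utf-8", errors="replace") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
        for row in csv.reader(src):
            writer.writerow([clean_field(field) for field in row])
```

Note that this approach discards special characters rather than preserving them; if your store names legitimately contain accented letters, export from Google Sheets instead.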
The first row of the CSV file must be a header row defining the field names. All other rows must contain proper CSV data. Rows cannot contain more columns (fields) than are defined by the header row; extra columns are often an indicator that special characters exist in your file, and the import processor uses this check as a safety stop.
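You can run this same safety check on your file before uploading it. A minimal sketch (the function name is ours, not part of the plugin):

```python
import csv
import io

def validate_csv(text: str) -> list[str]:
    # Flag any data row with more fields than the header row defines,
    # mirroring the import processor's safety stop.
    rows = list(csv.reader(io.StringIO(text)))
    header = rows[0]
    errors = []
    for lineno, row in enumerate(rows[1:], start=2):
        if len(row) > len(header):
            errors.append(
                f"line {lineno}: {len(row)} fields, header defines {len(header)}"
            )
    return errors
```

An empty result means every row fits within the header; any entries point you at the lines to inspect for stray commas or special characters.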
CSV File Headers
All location import files must start with a field name header. This is the first line of the CSV file and contains the name of each locator field at the top of its column. The option to not include a header row is no longer available in the Power add-on.
Header field names are not case sensitive. Field names are stripped of any character that is not a letter, number, hyphen, or underscore. If it makes your file more readable you can use a name such as “Address 2”, which will be changed to “address2” when the file is processed. Keep in mind that the “sanitized key” version of whatever you use as a header MUST match the list of field names below.
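The sanitization rule above can be sketched in a couple of lines (our illustration, not the plugin's actual code): lowercase the header, then remove every character that is not a letter, number, hyphen, or underscore.

```python
import re

def sanitize_key(header: str) -> str:
    # Lowercase, then strip anything that is not a-z, 0-9, hyphen,
    # or underscore — spaces and punctuation disappear.
    return re.sub(r"[^a-z0-9_-]", "", header.lower())
```

So “Address 2” becomes “address2”, and “SL_Store” becomes “sl_store”, both of which match entries in the field name lists below.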
Base Field Names
These are the fields provided by Store Locator Plus that can be imported via the CSV file. Header field names should be entered exactly as listed. Items with two options can use either one; the faster-processing option is listed first.
- sl_id, id
- sl_store, store
- sl_address, address
- sl_address2, address2
- sl_city, city
- sl_state, state
- sl_zip, zip
- sl_country, country
- sl_latitude, latitude
- sl_longitude, longitude
If the sl_id column is present, the import will look for a pre-existing location with the same ID and update it.
If the sl_latitude and sl_longitude columns are present, the geocoding process is skipped for that location.
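For reference, a minimal import file using the base field names might look like this (the store name, address, and coordinates are invented for illustration):

```
sl_store,sl_address,sl_city,sl_state,sl_zip,sl_country,sl_latitude,sl_longitude
"Main Street Shop","123 Main St","Springfield","IL","62701","USA","39.7817","-89.6501"
```

Because sl_latitude and sl_longitude are supplied, this row would skip geocoding entirely.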
The combination of sl_store, sl_address, sl_address2, sl_city, sl_state, sl_zip, and sl_country is used when the “update duplicate locations” setting is active. In this mode these fields are checked for an EXACT MATCH against the pre-existing list of locations. If all of them match exactly, the location is updated; otherwise the location is added.
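The duplicate-handling logic can be pictured as a composite key over those seven fields. This is an illustrative model, not the plugin's implementation; the in-memory dict stands in for the locations table:

```python
# The seven address fields that together identify a "duplicate".
MATCH_FIELDS = ("sl_store", "sl_address", "sl_address2", "sl_city",
                "sl_state", "sl_zip", "sl_country")

def match_key(location: dict) -> tuple:
    # Only a character-for-character match on every field counts.
    return tuple(location.get(f, "") for f in MATCH_FIELDS)

def upsert(existing: dict, incoming: dict) -> str:
    key = match_key(incoming)
    if key in existing:
        existing[key].update(incoming)   # exact match: update in place
        return "updated"
    existing[key] = incoming             # no match: add as a new location
    return "added"
```

Note the all-or-nothing behavior: changing even one character of the address produces a new location rather than an update, which is why stray special characters can cause unexpected duplicates.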
Power Field Names
If you have enabled extended contact fields under the General / Data tab, additional contact fields become available for import.
The identifier field can be used to associate a Store Locator Plus location ID with a remote data system identifier. Import records that have this field set will automatically search the pre-existing locations in the Store Locator Plus database for an exact match on the identifier field. If a match is found, the location is updated; if not, the location in the import file is added.
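In other words, the identifier acts as an upsert key between your remote system and the locator. A rough sketch under that assumption (the dict stands in for the Store Locator Plus database; the function name is ours):

```python
def import_record(by_identifier: dict, record: dict) -> str:
    # Match on the remote-system identifier first; fall back to adding
    # the record as a brand-new location.
    ident = record.get("identifier")
    if ident and ident in by_identifier:
        by_identifier[ident].update(record)  # exact identifier match: update
        return "updated"
    if ident:
        by_identifier[ident] = record        # no match: add new location
    return "added"
```

This lets you re-run exports from a CRM or ERP on a schedule: records keep their remote identifier, so repeated imports update rather than duplicate.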
Faster CSV Imports
- Add latitude & longitude to your locations.
- If ALL of your locations have a latitude/longitude, check skip geocoding.
- If you KNOW you do not have duplicate locations, set the Duplicates Handling setting to Add.
- Try importing with skip geocoding enabled, then use the Bulk Action “Geocode All Uncoded” from the Manage subtab. This performs the faster upload-and-parse of the CSV file first, getting all of your locations into the system; the often-slower geocoding process runs in a separate session.
If you are on a shared hosting plan, especially a lower-end GoDaddy hosting plan, consider upgrading your hosting. You will likely hit a timeout before the entire file import completes. Google limits how many requests to its services it will accept from a single server.
If there are 20,000 other companies on the same server, you are all sharing the same Google quota. As requests from that server approach the daily quota established by Google, your site's performance will be significantly impaired when issuing geocoding requests. Note: this can also affect visitors who type an address on your map page, since the same Google service is used to determine the latitude/longitude for that address.