CSV Import Format
Getting the proper CSV format can be a challenge when using spreadsheet apps like Excel or Numbers. Here are some tips for getting a valid CSV format out of a spreadsheet app:
- Make sure you always have a header row. That tells the import process what data is in each column. The column headers have meaning, so be precise.
- When a cell is a text field that STARTS WITH A LEADING ZERO, make sure you explicitly mark the cell as text. The easiest way to do this is to type a single apostrophe before the number when entering the cell data, as in '01886 for the zip code 01886.
- Always export with UTF-8 support enabled.
- Export to a CSV format: comma delimited, not tab delimited, with quoted strings if given the option.
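If you generate the file programmatically instead of through a spreadsheet app, Python's standard csv module produces this format directly. A minimal sketch (the file name and location data are made-up examples):

```python
import csv

# Example rows; the header names match the base fields listed later in this guide.
rows = [
    {"store": "Main Office", "address": "123 Oak St", "city": "Westford",
     "state": "MA", "zip": "01886", "country": "USA"},
]

# newline="" avoids blank lines on Windows; utf-8 is the required encoding.
with open("locations.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0]), quoting=csv.QUOTE_ALL)
    writer.writeheader()    # the header row is required by the importer
    writer.writerows(rows)  # QUOTE_ALL keeps leading zeros like 01886 intact
```

Because every field is quoted, a zip code such as 01886 survives the round trip without being turned into a number.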
Check out the Example CSV Imports post to get some usable import files you can play with.
WordPress (non-MySLP) IMPORTANT NOTE: As of SLP 4.9 and Power 4.9 the import process runs as a background cron job. Your site MUST keep WP Cron enabled. If you have Basic Authentication protecting your site, the import cannot kick off the cron process automatically. You'll know if you've set up Basic Authentication: any site that asks you to enter a username/password before viewing ANY web page, even the home page, has Basic Authentication. If you are on a staging/development site you may need to force WP-Cron to run by adding /wp-cron.php to the site address in your browser. Normal production sites should not have any problems with the new import process.
We like to check our files by uploading them to a web-based spreadsheet app like Google Sheets. This free service allows you to drag-and-drop your Excel or Numbers generated CSV file and make sure all the data lines up in the right columns.
The CSV file should be a UTF-8 encoded file with proper quotes around strings and commas between fields.
Special characters should be stripped or converted to proper UTF-8 format. In our experience many applications such as Numbers and Excel will not strip special characters properly during a UTF-8 CSV file export. Google Sheets has been a good application for importing those CSV files and re-exporting them with special characters removed.
The CSV file must have a header row defining field names as the first row of the file. All other rows must contain proper CSV data. Rows cannot contain more columns (fields) than defined by the header row. A row with extra columns is often an indicator that special characters exist in your file, and this check is used as a safety stop mechanism in the import processor.
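You can check this rule yourself before uploading. A quick sketch using Python's csv module (the helper name and sample data are illustrative, not part of the plugin):

```python
import csv
import io

def oversized_rows(csv_text):
    """Return (line_no, width) for data rows wider than the header row."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    return [(n, len(row)) for n, row in enumerate(reader, start=2)
            if len(row) > len(header)]

# The third line has an unquoted extra field -- the kind of row the
# importer's safety stop would reject.
sample = 'store,city\n"Main Office",Westford\n"Bad Row",Boston,extra\n'
print(oversized_rows(sample))  # [(3, 3)] -> line 3 has 3 fields, header has 2
```

An empty result means every row fits within the header's column count.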
With Power 4.9 and MySLP 17.12 the import process is managed in 3 steps that detach the file upload from the file processing and geocoding. This allows you to work on other things while the server process imports. While this provides a better user experience, there are some things you can do to make imports faster:
- Never include the sl_id column unless updating existing locations.
- Delete any columns that have no data. This is especially important if you exported from Store Locator Plus, as it includes a column header for every field.
- If you KNOW you do not have duplicate locations, such as when you cleared out all locations or are starting on a new site, set the Duplicates Handling option on the Import tab to Add.
- If ALL of your locations have a latitude/longitude, check Skip Geocoding.
- Add latitude & longitude to your locations. The more locations you can set with a valid latitude/longitude, the faster the process will be. Consider starting from a Store Locator Plus export if updating a list of locations.
For OLDER versions (before Power 4.9 / MySLP 17.12), try importing with Skip Geocoding enabled and then use the Bulk Action “Geocode All Uncoded” on the Manage subtab. This does the faster upload-and-parse of the CSV file first as a separate step, getting all your locations into the system. The often-slower geocoding process then runs in a separate session. The newer releases do this automatically for you in the background file processor.
The Header Row
All location import files must start with a field name header row. This is the first line of the CSV file that contains the name of each locator field at the top of the column.
- The header row contains the field names.
- The field names are NOT case sensitive.
- The legacy export field names that start with sl_ are valid. The processor strips off the leading “sl_” when it sees this in the header row.
The field names will be stripped of any character that is not a letter, number, hyphen, or underscore. If it makes your file more readable you can use a name such as “Address 2”, which will be changed to “address2” before the file is processed. Keep in mind the “sanitized key” version of whatever you use as the header MUST match the list of field names below.
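The exact sanitization code is internal to the plugin, but based on the rules above it behaves roughly like this sketch (the function name is ours, not the plugin's):

```python
import re

def sanitized_key(header_name):
    # Approximation of the importer's header handling as described above:
    # case-insensitive, legacy "sl_" prefix dropped, and any character that
    # is not a letter, number, hyphen, or underscore removed.
    key = header_name.strip().lower()
    if key.startswith("sl_"):
        key = key[3:]
    return re.sub(r"[^a-z0-9_-]", "", key)

print(sanitized_key("Address 2"))  # address2
print(sanitized_key("sl_Store"))   # store
```

Run your own header names through a check like this to confirm they land on one of the field names listed below.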
Base Field Names
These are the fields provided by Store Locator Plus that can be imported by the CSV file. The header field names should be entered exactly as listed. Items with two options can use either option, the faster processing option is listed first.
- sl_id , id ** Do NOT include this column unless updating existing locations that are already loaded
- sl_store , store
- sl_address , address
- sl_address2 , address2
- sl_city , city
- sl_state , state
- sl_zip , zip
- sl_country , country
- sl_latitude , latitude ** Including a valid latitude/longitude can speed up imports.
- sl_longitude , longitude
If the sl_id column is present the import will look for a pre-existing location with the same ID and update it.
If the sl_latitude and sl_longitude columns are present the geolocation process will be skipped for that location.
The combination of sl_store , sl_address, sl_address2, sl_city, sl_state, sl_zip, and sl_country are used when the “update duplicate locations” setting is active. In this mode these fields are checked for an EXACT MATCH in the pre-existing list of locations. If they match exactly the location is updated. If they don’t match the location is added.
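The exact-match test described above amounts to comparing that tuple of fields. A sketch of the idea (not the plugin's actual code; names are illustrative):

```python
# Fields compared when "update duplicate locations" is active.
MATCH_FIELDS = ("store", "address", "address2", "city", "state", "zip", "country")

def match_key(location):
    # Every field must match EXACTLY; any difference means the row is
    # treated as a new location rather than an update.
    return tuple(location.get(f, "") for f in MATCH_FIELDS)

existing = {match_key({"store": "Main Office", "city": "Westford"}): 42}
incoming = {"store": "Main Office", "city": "Westford"}
print(match_key(incoming) in existing)  # True -> update the existing location
```

Note that even a stray trailing space in one field would produce a different key, so the row would be added as a new location instead of updating the old one.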
Power Field Names
If you have enabled extended contact fields under the General / Data tab you will have the following fields available for import:
The identifier field can be used to associate a Store Locator Plus location id with a remote data system identifier. Import records that have this field set will automatically search the pre-existing locations in the Store Locator Plus database for an exact match of the identifier field. If found the location will be updated. If not found the location in the import file will be added.
If you are on a shared hosting plan, especially a lower-end GoDaddy hosting plan, consider upgrading your hosting. You will likely have trouble completing the entire file import without experiencing a timeout. Google also limits how heavily its services can be used from a single server.
If there are 20,000 other companies on the same server, you are all sharing the Google quota. As requests from that server approach the daily quota established by Google, your site's performance will be significantly impaired when sending geocoding requests. Note: This can also impact users when they type in their address on your map page, as it uses the same Google service to figure out the latitude/longitude for that address.