This script is not actively maintained. I wrote it quickly for a project back in 2012 and haven’t used it much since. I’ve done some bug fixes and improvements since then, but no longer have the time to maintain or support this script.

Potential issues and solutions can be found in the comments below or on the issues and pull requests pages on GitHub.

If someone wants to take over this project, I’ll help transfer it over and can provide guidance on fixing and/or re-writing the script (I have a new architecture sketched out). It’s a great project for someone who wants a small open source project on their resume, or someone who actively uses this script and needs it improved. Contact me or leave a comment below.

ImportJSON imports data from public JSON APIs into Google Spreadsheets. It aims to operate similarly to how the native Google Spreadsheet functions ImportData and ImportXML work.

Usage

ImportJSON

Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in the JSON feed. The remaining rows contain the data.

ImportJSON takes 3 parameters:

url
The URL to a JSON feed.

query
A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

parseOptions
A comma-separated list of options that alter processing of the data.

By default, data gets transformed so it looks more like a normal data import. Specifically:

Data from parent JSON elements gets inherited to their child elements, so rows representing child elements contain the values of the rows representing their parent elements.

Values longer than 256 characters get truncated.

Headers have slashes converted to spaces, common prefixes removed and the resulting text converted to title case.

To change this behavior, pass in one of these values in the parseOptions parameter:

noInherit
Don’t inherit values from parent elements

noTruncate
Don’t truncate values

rawHeaders
Don’t prettify headers

noHeaders
Don’t include headers, only the data

debugLocation
Prepend each value with the row & column it belongs in

For example, to return the number of shares and comments for the URL http://www.yahoo.com/ from the Facebook Graph API, you could use:
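A formula of this shape fits here; the query paths are assumptions about the Graph API response, so adjust them to match the actual feed:

```
=ImportJSON("http://graph.facebook.com/?id=http://www.yahoo.com", "/shares,/comments", "rawHeaders")
```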

The “rawHeaders” option allows us to see the full path to each column of data in the spreadsheet.

ImportJSONViaPost

Imports a JSON feed via a POST request and returns the results to be inserted into a Google Spreadsheet. This function works the same as ImportJSON, but allows you to specify a payload and fetch options to perform a POST request instead of a GET request.

To retrieve the JSON, a POST request is sent to the URL and the payload is passed as the content of the request using the content type “application/x-www-form-urlencoded”. If the fetchOptions define a value for “method”, “payload” or “contentType”, these values take precedence. For example, advanced users can use this to make the function pass XML as the payload using a GET request and a content type of “application/xml; charset=utf-8”.

ImportJSONViaPost takes 5 parameters:

url
The URL to a JSON feed.

payload
The content to pass with the POST request; usually a URL encoded list of name-value parameters separated by ampersands. Use the URLEncode function to URL encode parameters.

fetchOptions
A comma-separated list of options used to retrieve the JSON feed from the URL.

query
A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

parseOptions
A comma-separated list of options that alter processing of the data.

For more information on the available fetch options, see the documentation for UrlFetchApp. At this time the “headers” option is not supported.

For a list of the supported parseOptions and how to use queries, see ImportJSON.
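As a sketch of a typical call (the URL and form parameters here are hypothetical):

```
=ImportJSONViaPost("http://example.com/api/search", "q=" & URLEncode(A1) & "&limit=10", "", "/results", "noTruncate")
```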

ImportJSONAdvanced

An advanced version of ImportJSON designed to be easily extended by a script. This version cannot be called from within a spreadsheet.

Imports a JSON feed and returns the results to be inserted into a Google Spreadsheet. The JSON feed is flattened to create a two-dimensional array. The first row contains the headers, with each column header indicating the path to that data in the JSON feed. The remaining rows contain the data.

ImportJSONAdvanced takes 6 parameters:

url
The URL to a JSON feed.

fetchOptions
An Object whose properties are the options used to retrieve the JSON feed from the URL.

query
A comma-separated list of paths to import. Any path starting with one of these paths gets imported.

parseOptions
A comma-separated list of options that alter processing of the data.

includeFunc
A function with the signature func(query, path, options) that returns true if the data element at the given path should be included or false otherwise.

transformFunc
A function with the signature func(data, row, column, options) where data is a 2-dimensional array of the data and row & column are the current row and column being processed. Any return value is ignored. Note that row 0 contains the headers for the data, so test for row==0 to process headers only.

The function returns a two-dimensional array containing the data, with the first row containing headers.

The fetchOptions can be used to change how the JSON feed is retrieved. For instance, the “method” and “payload” options can be set to pass a POST request with post parameters. For more information on the available parameters, see the documentation for UrlFetchApp. The fetchOptions must be an Object.

Use the include and transformation functions to determine what to include in the import and how to transform the data after it is imported.
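For instance, a simple pair of include and transform functions might look like this (a sketch; the starts-with match and the 256-character limit mirror the defaults described earlier):

```javascript
// Include a data element only if its path starts with the query
// (a simple starts-with match; the real library also supports
// comma-separated lists of query paths).
function includeFunc(query, path, options) {
  return path.indexOf(query) === 0;
}

// Truncate any data value longer than 256 characters. Row 0 holds
// the headers, so it is left untouched.
function transformFunc(data, row, column, options) {
  var value = data[row][column];
  if (row > 0 && typeof value === "string" && value.length > 256) {
    data[row][column] = value.substring(0, 256);
  }
}
```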

In this example, the import function checks to see if the path to the data being imported starts with the query. The transform function takes the data and truncates it. For more robust versions of these functions, see the internal code of this library.

URLEncode

Encodes the given value to use within a URL.

URLEncode takes 1 parameter:

value
The value to be encoded.

The function returns the given value encoded using the URL encoding scheme.

For instance:

=URLEncode("Value with spaces")

Returns the value “Value%20with%20spaces”.

You can use the URLEncode function to create URL (GET) and POST parameters by combining it with the CONCATENATE function. For instance:
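A sketch, with a hypothetical URL and cell references:

```
=ImportJSON(CONCATENATE("http://example.com/api?name=", URLEncode(B1), "&city=", URLEncode(B2)))
```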

Source Code

Bugs & Feature Requests

If you have features you’d like to see added, or notice bugs in this library, please submit a new issue on GitHub. If you don’t have a GitHub account, you can always leave a comment on this page too.

The library isn’t under active development, so don’t expect issues to be fixed quickly. But I do occasionally revisit it and add new features or fix bugs. If you would like to contribute to the development of this library, let me know.

trevor says:

The ImportJSON library doesn’t use XPath syntax. Eventually it should use a JSONPath syntax, but for now it does a simple starts-with match.

If you want to use the function to import the JSON data you showed, use the query “/aaData”. Though the ImportJSON function would currently return the JSON you provided as a comma-separated list within a single cell, because it collapses all scalar arrays into single values.

If I have time I’ll work on adding an option to allow you to import scalar arrays as arrays in the final result. For my initial test cases it made sense to import them as a single cell, but now that I have your test case, I can add to it. Thanks.

trevor says:

Thanks. I just re-published the script with versions enabled. The current version is now v1.1.1 which translates to the Google script version 3. It may take a couple days for the script to be published in the main directory.

trevor says:

I did publish it, but didn’t check the Script Gallery beforehand. It looks like someone using the e-mail address “tlckreative@gmail.com” republished my library back in January and hijacked my listing.

It looks like I may not be able to get this republished until I can get Google to re-assign ownership back to me. I’m working on figuring out how to do this.

Unfortunately, it looks like Google Scripts is not well supported by Google. The Report Abuse link points to an old e-mail that bounces and I never wind up getting notifications when the scripts do get published.

The result of this call is put in only one line of my spreadsheet, although I get results for two pages. The results of the second page are put in the same line, after the results of the first page. I expected the data of the second page to appear under the first page, in a second line.

The JSON output looks good to me; I cannot find any error.

I think there could be a problem in your script with the use of a comma in the Facebook API call.

Cheers,
Besim

trevor says:

The problem is not with the script, but with the way Facebook is returning multiple objects in the API. Rather than returning the objects as a top-level array, which ImportJSON would correctly parse as multiple rows, Facebook is returning the objects as a key-value map where the key is the id of each object.

This makes each object appear to be attributes of a single top-level object, rather than multiple objects. Hence the single row.

It’s not a trivial fix, though, and I don’t know when I’ll have time to work on it. If I do add this functionality, I’ll definitely let you know. In the meantime, your best bet is to use separate ImportJSON calls, one per line.

Nathan says:

This is so great! I’ve been running this for a bit and pretty quickly hit the Google Docs rate limit. It returns #ERROR! and the following message:
Service invoked too many times in a short time: urlfetch. Try Utilities.sleep(1000) between calls. (line 167, file “Code”)

I don’t know how to add this myself. How could I add this pause so that I wouldn’t keep hitting the rate limit? Thanks!

trevor says:

1. Use the Tools > Script editor menu item to launch the script editor.
2. Add Utilities.sleep(1000); on line 167 so it becomes the first line of the ImportJSONAdvanced function. The code var jsondata = UrlFetchApp.fetch(url, fetchOptions); should now be on line 168 after you do this.
3. Use File > Save to save the script.
4. Go back to your spreadsheet, which should refresh in a couple seconds to use the new code.

Experiment with smaller values than 1000 milliseconds if this is taking too long between calls.

As long as I only fetch 50 at a time, I don’t hit the rate limit. If I push it up to over 100 at a time, I hit the #ERROR!, even after adding the sleep instruction. From watching the page load, it doesn’t appear that the sheet loads one JSON record, pauses for 1 second, then loads the next. It looks like they just snowball. Maybe that’s just how it looks, though.

Pulling down 10,000 records is a bit of a slog, 50 at a time, but it’s sure better than the process I used previously.

trevor says:

Now that I think about it, that makes sense. Each function on the spreadsheet is likely executed in a separate thread. So the sleep being executed on one instance doesn’t affect the other instances.

One question though: why do you need separate statements for each row?

If the API you’re querying was designed well, it should have a way to pull multiple records at the same time. ImportJSON places each record into a separate row if it receives multiple rows of data. This way you’d only use a single ImportJSON statement instead of multiple statements, and you shouldn’t hit the rate limit.

If the API doesn’t have a way to query multiple rows of data, consider contacting the creator of the API and asking them to add this capability.

The other thing you can try is to write your own JavaScript function that calls ImportJSONAdvanced multiple times with different parameters and concatenates the results. This would then run in a single thread and the sleep function would work as intended.
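A sketch of the concatenation step (the function name is mine; in the wrapper you would call ImportJSONAdvanced once per URL, with Utilities.sleep between calls, collect the returned arrays, and pass them in here):

```javascript
// Merge several ImportJSON-style results (2-D arrays whose first row
// is a header row) into one array: keep the header row of the first
// result, then append only the data rows of every later result.
function mergeImportResults(results) {
  var merged = [];
  for (var i = 0; i < results.length; i++) {
    // Skip the header row on every result after the first.
    var start = (i === 0) ? 0 : 1;
    for (var row = start; row < results[i].length; row++) {
      merged.push(results[i][row]);
    }
  }
  return merged;
}
```

Returning the merged array from your custom function places all the rows into the spreadsheet with a single formula.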

Cara says:

Can any of you guys help me with the usage of Utilities.sleep()? The error I’m getting is “Service invoked too many times in a short time: driveWriteVolume rateMax. Try Utilities.sleep(1000) between calls.”

trevor says:

The API you refer to requires you to substitute {PLATOON_ID} in the URL with an actual platoon ID. It looks like if the wrong platoon ID is entered, an HTML error page is returned rather than JSON. Which is probably what’s causing the error you’re seeing.

Replace {PLATOON_ID} with the ID of the platoon you want to query. If that’s still generating an error, try accessing the URL directly in the browser to confirm you’re getting JSON back (it should look like plain text with lots of { and } in the text).

If you are getting back JSON, but still getting the error, post the function exactly as you’re using it and I can look into it.

Michael Hirsch says:

I set up a time-based trigger to update my spreadsheet automatically every minute (I’m creating a visualization that tracks NYC’s citibike program). However, the spreadsheet isn’t updating based on the trigger, and instead only updates when I go into the Script Editor and press save. I don’t even have to make any changes.

trevor says:

I do know why it updates when you press Save in the Script Editor. By pressing Save, you’re telling Google Spreadsheet the script has been updated and needs to be re-run.

I’m not sure how to interpret your logs. It almost looks like the URL being used to fetch the results is the string “object Object”, which is the string representation of an object. This shouldn’t be occurring in the ImportJSON library (nor are there normally execution logs created from the library), so I don’t really know where to point you.

If you do figure out the cause, or how to get time-based triggers to work, can you leave a reply here so others can learn from it. Thanks.

Is it possible to switch this over so that COLUMNS representing child elements contain the values of the COLUMNS representing their parent elements?

I am running into the problem where there are 100+ child elements for one parent, and google docs is not wide enough to display all of them. It is however tall enough, but how do we get the function to display the data in rows instead of across columns?

trevor says:

You’d have to set up a time-based trigger. I don’t know how to do this, and Michael Hirsch commented earlier that he was having problems getting this to work with ImportJSON.

Be careful about setting up a time-based trigger in the Script Editor. I believe these only run the scripts, rather than asking the spreadsheet to update. The ImportJSON function doesn’t work standalone in the script editor; you have to create a parameterless wrapper function which passes parameters to ImportJSON to get it to work.

Or at least, I’ve never been able to figure out how to run a function with parameters from within the Script Editor. During debugging, I just create a wrapper function and test it.

trevor says:

You could use the ImportJSONAdvanced function within a loop so it retrieves the data, grabs the paging/next link at the end and then uses that to retrieve the next set of data. Then append all the results together into a single array to return them to the spreadsheet.

trevor says:

Depending on what you’re trying to do, you may be able to use the ImportJSONViaPost function and set the fetchOptions to include the header you want sent. See the list of options you can set in the documentation for UrlFetchApp.fetch, which is what ImportJSON uses to retrieve the JSON from the server.

If you need to send a custom header, you likely need to call ImportJSONViaPost from a script, or modify the library to convert text headers into a JavaScript object, since the ‘headers’ option requires an object, not a string.

Also, though the function is called ImportJSONViaPost, you can do GET requests by setting the ‘method’ fetch option to “GET”.

trevor says:

The problem is that JSON-P adds extra JavaScript code that the JSON parser can’t parse. You should be able to get around this by trimming the start and end of the JSON-P response.

I just created the issue Figure out how to import JSON-P data in the issue tracking system. Check this out for a way you can modify the code to get what you want. If you try it, please add a comment that describes whether it was successful or not.

This creates a column for prefix, suffix, width and height and a row for each photo then populates the cells with the relevant data. However, I wish to combine the prefix, suffix, width and height for each photo along with the letter ‘x’ to generate a full url for each photo and display the URLS in a single cell separated by commas, like so:

URL1,URL2,URL3,URL4 etc

Is there any way to do this?

Thanks for any help.

trevor says:

If you have a known number of rows, you should be able to do this by setting up your worksheet as follows:

1. Enter your ImportJSON function in B1

2. In A1, enter =CONCATENATE(B1, C1, D1, E1) to concatenate the prefix, width, height and suffix into a single column. Add whatever other text you need as parameters here. So you might use =CONCATENATE("http://someserver/images/", B1, "_image-", C1, "x", D1, ".", E1) which would produce something like “http://someserver/images/bu_image-200x420.jpg”.

3. Copy the formula in A1 down the rest of column A for as many rows as you expect to receive from the server

4. In F1 (or any other column which doesn’t have data), enter =JOIN(",", A:A). This will combine all the individual URLs into a single cell, separating them by commas.

The only issue here is that you have to manually copy the A1 formula and know how many rows you’re receiving.

To do this with an unknown number of rows, you’d have to write a script using the ImportJSONAdvanced function to do this.

Thank you for this script! And I hate to add a real noob question, but here goes (I tried googling for a few hours before I posted this, so if I missed something obvious please forgive my fried brain).

So I have an import source from a url that feeds me order data, but unfortunately it does so in order of most recent to oldest.

I’m trying to make an additional column where we can assign a value for the order (a room number), and the problem I’m running into is that the resultant structure looks something like this:

Room Order Person
105 3 Joe
461 2 Curly
107 1 Moe

Once the script refreshes after a new order, the first column, being static, does this:

Room Order Person
105 4 Shemp
461 3 Joe
107 2 Curly
1 Moe

(So on and so forth for each order thereafter)

Now, it would be easy if I could simply sort the data by the order column, but when I do that I get a partial sort that removes most of the entries, because the data is being referenced by formulas such as “=CONTINUE(A1, 2, 3)” instead of by values.

So I guess: is there any way to work around this, e.g. to have the JSON data sent to the cells as actual values instead of formulas, or something else I’m not aware of?

Any assistance you could offer would be a godsend. Thanks for the script though -for the advertised purpose, it worked BEAUTIFULLY!

trevor says:

If you only need to update once in a while, the simple solution is to click the upper-left corner of the worksheet to highlight the entire worksheet, press Copy, create a new worksheet, then use Paste Special > As Values into that sheet. Then you can sort normally.

You could probably also do this programmatically by calling ImportJSONAdvanced, then calling some Google Spreadsheet methods to insert the data directly into the spreadsheet. I’m pretty sure this can be done, but I don’t know how to do it. You could search for how to insert data programmatically into the spreadsheet.

The problem right now is that ImportJSON is designed to be used as a function, so it returns the results dynamically, which doesn’t allow for sorting, as you’ve noticed.

Joey says:

Hey Trevor,
Thanks for the great script. I’m trying to use it to pull data from Basecamp into a spreadsheet, but it seems to get tripped up by the HTTP basic authentication. Any way to get around this?

trevor says:

ImportJSON uses the UrlFetchApp class from Google, which I don’t think supports using the username & password in the URL itself.

It may be possible to do this with ImportJSONAdvanced by creating a custom function in a script. UrlFetchApp supports arbitrary headers, so you could pass the username & password using the Authorization HTTP header.

As you can see, the rows aren’t the same length (1st has 4 columns, 2nd has 3 columns, and so on). This data can’t be written to the spreadsheet using the setValues() function because all rows need to have the same length. If I write it row by row with the sheet.appendRow function, the time limit is exceeded.

Can you help me?

trevor says:

Since you’re doing this in code, try normalizing the array before you pass it into the setValues() function. That is, go through the array and increase the length of any rows that are too short before calling setValues(), populating the extra places with blank or empty data (e.g. '' or null).
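A sketch of that normalization (the helper name is mine):

```javascript
// Pad every row of a 2-D array with empty strings so all rows have
// the same length, since setValues() requires a rectangular range.
function normalizeRows(data) {
  var maxLength = 0;
  for (var i = 0; i < data.length; i++) {
    maxLength = Math.max(maxLength, data[i].length);
  }
  for (var j = 0; j < data.length; j++) {
    while (data[j].length < maxLength) {
      data[j].push("");
    }
  }
  return data;
}
```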

trevor says:

The response coming back from the Google Trends URL you posted is JSON-P, not pure JSON. So it can’t be parsed by ImportJSON.

See this issue in the issue tracking system for a potential workaround using a custom function that calls into ImportJSONAdvanced.

Basically, you need to trim off the beginning and end of the response to get at the pure JSON. Also, if Google ever decides to change the callback function or comments in the JSON-P, you’ll need to adjust where you do that trimming (in case you get it to work & it suddenly stops working at some point).
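A sketch of that trimming (the helper name is mine, and it assumes the common callback( … ) shape):

```javascript
// Strip a JSON-P wrapper such as `callbackName({...});` down to the
// bare JSON between the first "(" and the last ")". If no wrapper is
// found, the text is returned unchanged.
function stripJsonP(text) {
  var start = text.indexOf("(");
  var end = text.lastIndexOf(")");
  if (start === -1 || end === -1 || end < start) {
    return text; // Not JSON-P; assume it is already plain JSON.
  }
  return text.substring(start + 1, end);
}
```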

Steven says:

Thanks for the quick reply. Missed the equal sign, thanks. But I am still getting a parse error on all three of the options you suggested. I even tried calling a cell from the same sheet to see if that was the problem.

trevor says:

If you know the response is always a fixed number of rows, you can use the ImportJSON function on the row after the previous response ends. For instance, if the response is always 5 rows, you can insert your first ImportJSON function with the first URL at row 1, the second one at row 6, the third one at row 11, etc.

If you have a variable number of rows, you’ll need to get into editing the code to create a new function. This function can iterate through your list of potential URLs, call ImportJSON to retrieve the data for each one, and then merge the results together.

The key to know is that all ImportJSON is doing is returning a 2-dimensional array. So all you need to do is call it multiple times, merge the arrays and return the result.

The above code may need tweaking, though the concat() method does append the rows of two 2-dimensional arrays (a.concat(b) returns the rows of a followed by the rows of b). This answer on StackOverflow might also be a useful start to create a function that merges 2-dimensional arrays.

If you do work out a generic solution, please post it here so others can use it. Thanks and good luck!

trevor says:

ImportJSON doesn’t currently have a way to set the headers for a request. So you can’t set the user agent.

In theory, the UrlFetchApp class that ImportJSON uses can set the headers of the request and you could try explicitly setting the “User-Agent” header. You could do this modifying ImportJSON to pass in a ‘headers’ object in the fetchOptions:
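For instance (a sketch; the User-Agent string here is an arbitrary example):

```javascript
// A fetchOptions object that asks UrlFetchApp to send a custom
// User-Agent header; pass it as the fetchOptions parameter of
// ImportJSONAdvanced.
var fetchOptions = {
  "headers": {
    "User-Agent": "Mozilla/5.0 (compatible; ImportJSON)"
  }
};
```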

I wouldn’t be surprised if Google doesn’t let you override the user agent though. See here for full documentation on UrlFetchApp, which is what ImportJSON uses to do the retrieval. The ‘fetchOptions’ are what are passed through as the ‘params’.

However, it’s unclear from the error message you posted whether the problem is the user agent. A 403 Forbidden error can occur for many reasons. You may want to set muteHttpExceptions to see the full exception. You can do this by modifying the ImportJSON function above:

var fetchOptions = { 'muteHttpExceptions' : true };

Or by using ImportJSONViaPost, which allows you to pass in a limited set of the fetch options:
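A sketch (the URL is a placeholder, and the name=value syntax is an assumption about how the library parses its comma-separated fetchOptions):

```
=ImportJSONViaPost("http://example.com/api.json", "", "method=get,muteHttpExceptions=true", "/", "")
```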

Ethan says:

Again, can’t thank you enough for writing this script. I’ve tried many variations of the formula to make it work on this url: http://api.grepwords.com/lookup?apikey=carter&q=mesothelioma|seo. What would the parameters be to get all elements? I know it’s something probably really elementary that I’m missing.

Note that you cannot encode the entire URL, since the colon, slashes and question mark need to remain unencoded in the URL (i.e., they are special characters whose meaning you want to remain special). Just encode the parameters you need to.

Gabriela says:

Hi, this is a great script. I got it working in one of my spreadsheets but not in another.

In the spreadsheet where I got it working, I’ve set the JSON URL and it displays data. My question is, how do I set it to refresh data automatically? I have done everything: I changed spreadsheet configurations, set a trigger in the script and even added Minute(Now()) at the end of my JSON URL. I can’t get it to refresh the data every minute. Can you help me with this task?

The other thing is, when I load the script using ImportJSON button in the script editor it gives me an error at line 130. It says DNS error: http://undefined (line 130, file “importJSON”).

Thank you.

trevor says:

To get ImportJSON to refresh every minute, you need to use the function GOOGLECLOCK() instead of NOW(). This forces an update every minute, whereas NOW() only recalculates when the spreadsheet is opened or re-calculated.

An easy way to do this is to add GOOGLECLOCK() as a 4th parameter:

=ImportJSON("http://myurl.com", "/", "", GOOGLECLOCK())

Read the documentation linked above; if you are using the new Google Spreadsheets, you must use the Google NOW() function and change the recalculation setting to every minute under File > Spreadsheet settings instead.

The error you are getting in the script editor using the ImportJSON button is because you are calling the function with no parameters. ImportJSON needs at least a URL passed in. When you execute it with no parameters, the URL is undefined (hence the error you’re seeing).

trevor says:

You can only use the NOW() function if you are using the new Google Spreadsheets. If you are still using the old Google Spreadsheets, you have to use GOOGLECLOCK().

Re-read the comment above that you replied to for full details. Unfortunately, this is all I know about how to get ImportJSON to automatically refresh. I did a test using the old Google Spreadsheets and GOOGLECLOCK() worked for me. I haven’t tried using the new Google Spreadsheets and changing the recalculation settings.

My table will not refresh automatically, no matter what I do or how long I wait. Help!

trevor says:

June 24, 2014 at 4:07 pm (UTC -4)

Crypt,

The GoogleClock() function should still be active in the old Google Spreadsheets. It’s not active in the new version. As Greg has noted, there is no way to update your data automatically using custom functions in the new Google Spreadsheets. You have to use the old version.

The sheet you linked to is using the old version of Google Spreadsheets. To make ImportJSON execute every minute, simply add GoogleClock() as the fourth parameter:
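As in the example earlier on this page:

```
=ImportJSON("http://myurl.com", "/", "", GoogleClock())
```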

trevor says:

Can you point me to the URL that generates the JSON or copy-and-paste part of the JSON into a comment?

JSON is a key-value pair format. For ImportJSON, the column headers are always created using the keys and the data always comes from the values.

If the JSON you’re trying to retrieve uses the row IDs as the keys for the rows rather than the values, then you’ll see those in the column headers. Right now there’s no way to format this differently.

So I’ve set up a spreadsheet with your script.
=Importjson(“https://itunes.apple.com/search?term=”&A5&”&country=de&limit=1″)
works fine and extracts the entire search result into the spreadsheet. However, I’m only interested in the URL that’s located in

trevor says:

I suspect your problem is due to the format of the formula, rather than a problem with ImportJSON itself.

When I copy and paste the test you pasted above into Google Spreadsheets, I get a parsing error. But this is because the double quotes being used are smart quotes rather than regular double quotes. Google Spreadsheets doesn’t recognize smart quotes as string delimiters, so throws a Parsing Error trying to parse the formula. This prevents ImportJSON from ever being called.

If you convert your smart quotes to regular quotes, you should be able to use:
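That is, the same formula re-typed with straight double quotes:

```
=ImportJSON("https://itunes.apple.com/search?term="&A5&"&country=de&limit=1")
```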

Hi I am trying to install the script to a spreadsheet, but cannot find it in the add-ons library. How do I install? I cannot find script manager as it has been deprecated in light of add-ons. Any thoughts? Thanks!

trevor says:

Until I have time to figure out how to convert the script to an add-on, you’ll have to install the script manually.

To do this, open the script editor using the Tools > Script Editor menu item. Then copy-and-paste the ImportJSON code from here and use File > Save to save the script. You should then be able to use ImportJSON within your spreadsheet.

Brad says:

Awesome script. It works flawlessly! Thank you so much for your continued contribution to the community.

I was just wondering if there is an option (presumably in ImportJSONAdvanced()) that allows for the return of only the first value of an array? I have scoured the documentation and found a hint that it may require row==0 in transformFunc somewhere, but can’t make heads or tails of it.

My end-requirement is to return the first value of “/return/ANC/sellorders/price” from the API “http://pubapi.cryptsy.com/api.php?method=singleorderdata&marketid=66”.

Joanna M says:

I know this is now quite some time after you posted the message, but I think this might be useful info for others looking through this thread.

You can use google sheets function INDEX() to select a specific cell from the result returned by ImportJSON(). If your result is a single column with lots of rows, and you want the second value (for illustration purposes) you would use:
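A sketch (the URL is a placeholder):

```
=INDEX(ImportJSON("http://example.com/api.json"), 2, 1)
```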

telefrancisco says:

Regarding updating the data in Google Spreadsheets, just to mention that Google Drive time-based triggers do not work.
ImportJSON, as said earlier, is not a standalone function.

Regarding the new Google Spreadsheets, you can update ImportJSON every minute by configuring the spreadsheet options to update every minute and using NOW() instead of the deprecated GOOGLECLOCK() (which in fact updated the spreadsheet every minute, so there is no loss of functionality on this topic in the new spreadsheets).

trevor says:

I just did a test in Google Spreadsheets using a YouTube JSON feed and the results were coming back as numbers. Or, at least all of Google’s internal functions were treating them as numbers. So it may be a problem with executing queries against the spreadsheet.

You could try forcing the numbers by modifying the defaultTransform_ function. To force every value to a float, at the end you could put:
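A sketch of that change (the wrapper function name is mine; skipping the header row is an assumption):

```javascript
// At the end of defaultTransform_: coerce data values to floats,
// leaving the header row (row 0) and non-numeric text untouched.
function forceFloat(data, row, column) {
  if (row > 0) {
    var parsed = parseFloat(data[row][column]);
    if (!isNaN(parsed)) {
      data[row][column] = parsed;
    }
  }
}
```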

John says:

Hi Trevor,
Great work. Really nice. I was curious if I am compliant with the GNU license GPL-3.0 with the change I made below in my copy of your script in order to get this script to work with what I needed. I simply added a header object directly to the UrlFetchApp so I could authenticate with the JSON feed I needed to pull into the spreadsheet. It works great. No issues. My question is if I am technically allowed to make this modification to a copy of the script in my spreadsheet based on the GNU GPL 3.0 license?

trevor says:

ImportJSON is a custom function, not an add-on. It won’t appear on any menus within Google Spreadsheets. Nor does Google Spreadsheets show it as an auto-complete suggestion when you’re typing formulas in cells. You need to use the documentation at the top of this page to use it within your spreadsheet.

Trevor

Petar says:

I'm trying to import data from a game API into a Google Spreadsheet, but I constantly get the error: The coordinates or dimensions of the range are invalid. (line 19, file "Code")
I'm new to this and would very much appreciate any help.

trevor says:

I’m sorry, but I’m not an expert at Google Spreadsheets. I wrote ImportJSON for a specific project and open sourced it for others to use. My experience with Google Spreadsheets is limited to what I needed to learn for this project.

Trevor

Jumbo Panda says:

Question on slowing down the API calls when I use this in a Google Docs worksheet to pull in Flurry data. I have a worksheet with 10+ =importJSON(blah) cells in it. Seems they all fire at once and I exceed the 1 API call per second Flurry limit.

trevor says:

Apologies for the delay in responding. The problem looks to be two-fold.

First, there is a bug in the ImportJSON library so not all the rows from your JSON file are being returned. I’ve documented it as Issue 8, but don’t have time right now to look into it. Based on my current schedule, it may be several months before I’ll have time to do another round of bug fixes.

The other issue is that there is no way to filter based on the values of what is returned. So even if the bug was fixed, you would get all the rows of your data (including all of the name:LANGUAGE ones, not just the first one). The query parameter lets you filter columns, but there's no way currently to filter rows.

Sorry I couldn’t be more helpful.

Trevor

ratman720 says:

Hey Trevor, the script is phenomenal and works wonders for me. Thanks so much for the effort. I'm trying to work through a spreadsheet of cryptocoin data from Cryptsy, and the API get functions simply aren't kind. The trouble I am having is I only want the first entry in the orderbook data. Is there any way to get ImportJSON to do this? Advice would be greatly appreciated.

The second parameter is the number of rows to display and the third parameter is the number of columns to display. Set the number of columns to be equal to or greater than the number of columns you’re fetching.

trevor says:

Check the URL that you are passing into ImportJSON and confirm you are getting data back.

If the data is coming back when you request it directly in the browser, then post the parameters for the ImportJSON function you are using here, and I’ll see if I can figure it out. If you have a sample Google Spreadsheet showing the problem, that’s ideal.

Jay says:

Justin says:

Hi Trevor – Thanks for creating this great tool. I’ve been trying to use it with the Edmunds API (vehicle data) to populate a google spreadsheet with all makes/models and some specifications of each vehicle. However ImportJSON is only pulling in one model from each make (e.g. Acura is a make that has several different models, but only the first model is being pulled in to the spreadsheet). I think the issue has something to do with the way that recursion is set up because at the bottom of the google spreadsheet a few additional models are brought in but it seems to stop importing when it gets back up to a make that only has one model listed (e.g. Tesla). Basically, I want column A to contain all makes and column B to contain all models that exist. Do you have any idea why ImportJSON is not pulling the entire population of models in? Is it user error? Any help would be much appreciated.

trevor says:

There was a bug in the ImportJSON code in how it handled nested arrays. I just fixed it.

Since the Script Library is now deprecated, you’ll have to install the code manually. To do so, open the script editor using the Tools > Script editor… menu item. Then copy and paste the new code from here.

Let me know if this solves the problem. I only had time to do basic testing.

Trevor

Chris says:

This fix addressed one issue I was having, but introduced another. Now values from nested arrays are showing up, but the first item in each array is missing. It's like the index started at 1 rather than 0.

trevor says:

The current version of ImportJSON does not support password-protected JSON resources.

However, I describe code in this comment which can be used to add basic authentication support using a hardcoded username & password. ZenDesk appears to use basic authentication, so this should work for you too.
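A minimal sketch of that approach, assuming the standard UrlFetchApp API; the function name is illustrative, and the username and password would be hardcoded or supplied by your own wrapper:

```javascript
// Fetch a JSON resource protected by HTTP basic authentication.
// Runs only inside Google Apps Script, where UrlFetchApp and
// Utilities are available.
function fetchJsonWithBasicAuth(url, username, password) {
  var response = UrlFetchApp.fetch(url, {
    headers: {
      "Authorization": "Basic " +
          Utilities.base64Encode(username + ":" + password)
    }
  });
  return JSON.parse(response.getContentText());
}
```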

Hope this helps.

Trevor

Greg Wright says:

I added the code you cited but I still get the "error":"Couldn't authenticate you". I tried a couple of variants with and without the "/token" syntax that ZenDesk asks for, but was not able to get it to work.

By using the following syntax I am no longer getting the “error”:”Couldn’t authenticate you”:

trevor says:

Make sure your username is “{username}%2Ftoken” and your password is “{token}”, as this is what your curl arguments are indicating. If you’re not familiar with curl, check out the manpage, which explains the arguments and their formats.

Note that curl is a completely different program from ImportJSON and none of the curl options work with ImportJSON, nor with URLs in general. So passing in “-u {username}/token:{token} https://{domain}.zendesk.com/api/v2/ticket_metrics.json” is essentially meaningless. You’re passing an invalid URL, so you should be getting an error.

I haven’t used ZenDesk before with ImportJSON, but based on the Security and Authentication section of the Core API, it should work fine with the code for basic authentication.

If you’re still having problems, you may have to hire a contractor on Elance or oDesk to help connect. You may need to authenticate using OAuth2, which is also supported by ZenDesk, but that's outside the scope of what I can help you with. You can read about OAuth2 support in UrlFetchApp, the library used by ImportJSON. It requires custom scripting.

Jakke says:

Hey, I love the script, perfect to import what I want from the WoW Armory. However, since I am loading rather big amounts of info (650 rows, 5 columns, almost 3000 cells), I tend to get the famous “Use utilities.sleep(1000)” error often.

Is there any way that I can tell ImportJSON not to refresh the information it fetches every time, and instead add a button to my custom menu that allows me to update it? It would make it a lot easier for both the Google servers and for the use we have for the spreadsheet (very sluggish until you have had it open for some minutes).

Maximaal says:

Hi, I’ve got a problem with that import tool.
when I import this: http://www.cscpro.org/secura/employer/865.json
instead of 0 values it copies the value from the row before.
Could you let me know what to do?
Also, when I try to use an inner query, it only reads the second one and ignores the first.
Cheers

Frank Braker says:

Okay – I got it by surrounding the whole file with a leading [ and trailing ], and adding commas at the end of the surrounding {…} pair. Luckily this is easy because each record is newline terminated. Easy to do with vi using regular expressions.
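The same transformation can be done programmatically. This is a hedged sketch that assumes one complete JSON object per line (records containing embedded newlines would break it):

```javascript
// Convert newline-terminated JSON records into a single JSON array string.
function jsonLinesToArray(text) {
  var lines = text.split("\n").filter(function (line) {
    return line.trim().length > 0; // skip blank lines
  });
  return "[" + lines.join(",") + "]";
}
```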

sarath says:

Hi, is it possible to recreate the JSON from the spreadsheet? That way we could make it a DB sync app for low-security applications. After editing in the spreadsheet, generate JSON and make a PUT request to the URL.

It seems that I can’t delete rows of data that are imported via ImportJSON. If I delete the first row in a Google Spreadsheet (the one that contains the formula) it messes up all of the rows below it. If I try to delete any row besides the first, it simply won’t let me delete it – I presume it’s because it’s connected to the JSON URL that I specified.

Is there a way around this? Or, do you think this predicament will change after implementing a script to access the paginated JSON that I need as I mentioned above?

Hi!
Your script is great!!!
I've tried it with lots of public JSON and it works great.
Then I tried to use it with a POST request, and that's another matter.

the request I have to use is :
Query the API at : https://www.voilanorbert.com/api/v1/
– Make a POST request to my API with these three parameters:
name : The full name to search
domain : The domain to search against
token : My token

I don't know how to do it. It connects to the server, but does not send the name/domain/token.

The first 4 columns are populated from a single call to the Wikipedia API returning data on 500 landmarks. The 5th column runs a separate call per row to find the number of results the Europeana API will return for each landmark, so 501 calls in total.

Do you have any tips how I might get this to work more elegantly, for example by only calling the function once the cell above has completed. If it takes 10 minutes to complete loading that’s not an issue to me!

Thanks, James

trevor says:

I also proposed a different solution in this comment that didn’t seem to work as well.

But the other thing you can try is to write your own JavaScript function that calls ImportJSONAdvanced multiple times with different parameters and concatenates the results. This would then run in a single thread and the sleep function would work as intended.
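A rough sketch of that idea, using the documented three-parameter ImportJSON signature; the function names, the one-second sleep, and the URL list are illustrative, not part of the library:

```javascript
// Fetch several URLs sequentially and stack the results into one table,
// sleeping between calls so per-second API rate limits aren't exceeded.
// Runs inside Google Apps Script, where ImportJSON and Utilities exist.
function importJsonBatch(urls, query, parseOptions) {
  var combined = [];
  for (var i = 0; i < urls.length; i++) {
    var rows = ImportJSON(urls[i], query, parseOptions);
    combined = concatRows_(combined, rows, i > 0); // keep only the first header row
    Utilities.sleep(1000);
  }
  return combined;
}

// Pure helper: append rows, optionally dropping the first (header) row.
function concatRows_(accumulated, rows, dropHeader) {
  for (var j = dropHeader ? 1 : 0; j < rows.length; j++) {
    accumulated.push(rows[j]);
  }
  return accumulated;
}
```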

Alternatively, if you control the API, you could add a function that lets you pass in multiple parameters and get an array of results back. This would let you reduce the number of calls dramatically.

Thanks Trevor. I can see the logic of the first option but it’s still proving problematic. Some cells work fine, but many return “Error. Internal error executing the custom function”. I think it’s just the sheer number of calls. I wonder if having each cell call the next one when it has loaded would work? Or even ten at a time. Not quite sure how to do that though!

Zubair says:

Hi,
I am trying to import a JSON link, but the problem is that one of the trees it imports has about 5 more rows below it. Basically, I just want to import a certain number of rows and columns. How can I do that?
Thanks

trevor says:

Shiva says:

June 15, 2015 at 9:37 am (UTC -4)

Looks like the only option left is this: "If those don't work, your other choice is to upload the JSON you are receiving to a public site that Google can access. This is, of course, a manual process and not ideal."

I tried your script for authentication but it is authenticating JIRA not my private network (server). This is my problem, we are discussing.

The manual process looks fine for now, but my actual task is quite big, so writing manual code may not be the best approach. Please let me know if there is any other way, other than the manual one.

trevor says:

June 15, 2015 at 7:40 pm (UTC -4)

If you are running a server on a private network not connected to the Internet, then there is no way to get Google Spreadsheets to access your data. The Google servers need to be able to connect to the server containing your data to retrieve the data.

If you cannot make your server available on the Internet, then you’ll need to manually upload your data somewhere.

Also, note that the authentication code I linked to only works for Basic authentication. If Basic authentication is not turned on for the server you are accessing, then that code won’t work.

NightroadAX says:

I’m having a bit of trouble when I try to pull a list of items from a json file, the results always seem to be missing the first item. JSON View for this first item always contains a [0] element. (e.g. itemList.items[0].name). I’m guessing this is a pretty simple/common thing but just couldn’t find a solution. Any advice would be greatly appreciated!

Ben says:

I have now figured out how to display this in a table on a Google spreadsheet. Now I have issues with the True and False booleans, as they do not display correctly… actually they do not display at all, just blanks… The biggest issue, though, is that I cannot authenticate my Zendesk.

I tried using the code in a previous post but still cannot get this to work.

FastZ says:

Hello and thanks for the ImportJSON. I have a question:
I have a query: ImportJSON(URL, buy + "," + sell, "noHeaders");. As you can see, I am trying to get 2 values ("/pair/buy" and "/pair/sell"), but they are showing in 2 different cells, like this: [buy][sell]
How can I place the result in one cell, like this: [buy / sell]?

Jim T says:

Hi Trevor! I'm using your script a lot in my sheet, and love it. However, I did once ask for a menu button that lets me manually refresh the data.

However, since I have my own custom menu, is there any way I can add it there instead? I lose the ability to refresh since my menu overrides yours (which it must, so I don't lose tons of handy features I need for sorting my sheets).

Also, is there any way to make it not update/refresh the info unless I manually call for it to do so? I am hitting the Google limit when the Blizzard Armory API (for World of Warcraft) has issues; it seems like it is trying to pull the info many times, and another script I have updates every time ImportJSON updates its info. It causes havoc, so I keep a backup and a backup of the backup on other users' sheets to be sure I get the info in case something goes wrong.

Wouldn’t mind an email if you got the time, as I am travelling a bit these days and can’t check around on all these websites

Jim, Norway

Bernio says:

Like Jim, I’m using your script a lot. It’s incredibly useful. But now something’s wrong with google spreadsheet.

At first it reads the values, but if we change something in the URL it no longer recalculates automatically as it did before. The ImportJSON cell shows the message #ERROR! (internal error).

Michael Perdue says:

I have been trying for a couple of days to find a way to download JSON data into a database table (Stata, Access, Google, or anything similar).
Earlier today I installed your code for importing JSON with Google. I ran across one problem. When I perform a command to fetch data, I run into the following error:
SyntaxError: Unexpected token: < (line 170, file "")

line 170 reads:
var object = JSON.parse(jsondata.getContentText());
and is from the script you provided for installing the ImportJSON program.

If you want to import this data into a spreadsheet, all you need to do is download the CSV file, which most spreadsheet programs support. If you want to import into Google Spreadsheets, you can use the built-in ImportData function.

In general, though, if you get an error on line 170, it means the content returned by the JSON URL isn't valid JSON. This can happen because the URL doesn't point to a JSON source, the JSON is incorrectly formatted, or the JSON source is inaccessible.

trevor says:

Have you copied the ImportJSON code into Script Editor for your spreadsheet (via the Tools > Script Editor menu item)?

You won’t be able to use ImportJSON in the spreadsheet until you have the code copied over. If you’ve copied the code into the script editor and it’s still giving you an unknown function, I don’t know what the problem is.

In the ImportJSONBasicAuthentication function you define, you have username and password as parameters, but then you override them by declaring local variables with the same names. Remove these declarations to allow the values to come through from the function call.

3. Remove “SELECT …” as parameter

The ‘query’ parameter is not a SQL query. It is a comma-separated list of JSON paths to import. See the examples at the top of this page.
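The parameter-shadowing problem described in the first item looks roughly like this (the function bodies are illustrative, not the commenter's actual code):

```javascript
// Broken: redeclaring the parameters with var and assigning new values
// throws away whatever the caller passed in.
function importJSONBasicAuthBroken(url, username, password) {
  var username = "hardcoded-user";
  var password = "hardcoded-pass";
  return username + ":" + password;
}

// Fixed: with the local declarations removed, the caller's values are used.
function importJSONBasicAuthFixed(url, username, password) {
  return username + ":" + password;
}
```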

Even if you do all these things, this may not solve your problem. Trying to access the URL directly returns the result:

903 - Internal system error.

This means your basic authentication is not likely to work, since directly accessing a URL protected by basic authentication should result in a 401 Unauthorized (or, before that, a dialog should pop up in the browser).

The service appears to be written poorly, since there is no 903 HTTP response code (an internal system error should be a 500-level code).

I’d recommend contacting the developers of the API you’re trying to access to see how/if you can access it in the way you want.

I also haven’t tested the basic authentication code you’ve added, so can’t confirm this works. Good luck.

TonyM says:

August 18, 2015 at 6:09 am (UTC -4)

hmmm!

I ve sent u a mail with the link to the google sheet, and the right Api

trevor says:

The problem is that your query ("/fixture_history/all/0/1") is referencing array indexes, and ImportJSON flattens those into a single value. For this data, there is no easy fix. ImportJSON isn't designed to pick elements out of arrays within JSON data.

As far as using more than one parseOption, yes, that’s possible. Just use a comma to separate each option.

Hi, I keep on coming back to this script and use it across many sheets. But each time I want to use it I have to add it as a new script on that sheet. I had a quick look into this and it seems you could publish this as an add-on and then it would be simple for anyone to add. Maybe I’m wrong, but there’s more info at https://developers.google.com/apps-script/add-ons/publish

I know this is probably simple, but I don't understand JSON or paths and am trying to do a little hack. https://app.propertyware.com/pw/00a/326632291/JSON?0NYqfdP is the JSON URL. There are columns (which are the column labels) and records. I used =ImportJSON("https://app.propertyware.com/pw/00a/326632291/JSON?0NYqfdP", "", ""), which lists the column labels in separate rows instead of labels across columns over the records. I have a way to hack this in Excel, but how can I use this function to view everything properly?

O.k. I did find a way to get what I wanted. But I think this is a janky workaround, so I wouldn't mind knowing the answer to the question:
first paste this:
=transpose(ImportJSON(“https://app.propertyware.com/pw/00a/326632291/JSON?0NYqfdP”, “/columns/label”, “noHeaders”))
then below this:
=ImportJSON(“https://app.propertyware.com/pw/00a/326632291/JSON?0NYqfdP”, “/records”, “noHeaders”)

trevor says:

The URL you pointed to is basically trying to take record data and format it as JSON. The approach taken isn’t how you’d normally use JSON data, which is usually meant to represent hierarchical object trees rather than data rows.

In general, flat structured record data like this is better represented using CSV. CSV is a more compact format that better reflects the structure of the data. Additional header rows can be added to represent data types.

If the data was formatted as CSV, you’d use the built-in ImportData() function for your Google spreadsheet.

trevor says:

It may be possible to do this with ImportJSONAdvanced by creating a custom function in a script. UrlFetchApp supports arbitrary headers, so you could pass the token using your Authorization HTTP header.
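A hedged sketch of that suggestion; the header scheme ("Bearer") and the function name are assumptions that depend on the API you're calling:

```javascript
// Fetch a JSON resource, passing an API token in the Authorization header.
// Runs only inside Google Apps Script, where UrlFetchApp is available.
function fetchJsonWithToken(url, token) {
  var response = UrlFetchApp.fetch(url, {
    headers: { "Authorization": "Bearer " + token }
  });
  return JSON.parse(response.getContentText());
}
```

To use it from a spreadsheet, you would wrap this in your own custom function that feeds the parsed object through ImportJSONAdvanced's processing.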

Tavis Elliott says:

I’m using it to import data from the Guild Wars 2 API into a Google spreadsheet. Most of the API returns only IDs for items. I would like the names, but no problem: I then add another import that concatenates the IDs together to ask for them, like: https://api.guildwars2.com/v2/items?ids=37090,66924,43766

My problem: The GW2 team has decided that the order of the JSON returned is random (probably for performance on their end), and there’s no way I can make them return the JSON order the IDS the same as I requested.

I think I’m looking at needing to write my own function that uses ImportJSONAdvanced() to take the array and then sort the results based on that, but any bright ideas you might have would be most appreciated!

Alex says:

Maybe I missed this. But how often does this pull fresh data (refresh) from the API link? Will it do it at all when the sheet is closed, or only when the sheet is open and the tab is active? Will it only refresh if I edit the cell that the IMPORTJSON is in? Just trying to figure out how fresh my data is. I have been using this but can't easily tell.

I’m currently experiencing a problem, though, which I was hoping you might have some insight into.

I have a Google Sheet that makes four ImportJSON calls.

It pulls in data that is new each day. It’s worked fine for months, but as of October 20 has ceased to work – when I check the spreadsheet now it always displays October 20's data.

Here are the things I’ve tried, and the results:

1. Reloading the page without using the cache (Ctrl + F5) – no change
2. Loading the spreadsheet in Firefox (I mainly use Chrome) – the same as with Chrome; 20 October data
3. Reauthorising the ImportJSON code – I did this by trying to run the ImportJSON code in Script Editor, as I think I had to the first time I used it. Doing so obviously throws an error, as there’s no URL from which to carry out the import, but I was prompted to authorise the code (which surprised me, as, like I say, I thought I’d authorised it when I first started using it). In any case, this didn’t make any difference.
4. Using the exact same ImportJSON code in a new spreadsheet – in this case, I did get up-to-date data i.e. fresh 30 October data. So obviously I could just recreate the dashboard that I’ve made in Sheets, but I’d rather not do that if it’s not necessary (and given I’d be concerned the same issue could arise again).

If you had any thoughts on other things to try then that would be much appreciated – it’s a great tool, so I’m missing being able to use it!

Thank you.

Philip Nye

trevor says:

That sounds like a problem with Google Spreadsheets. Maybe they released a new version. Either way, it’s outside my expertise. If ImportJSON works in a new spreadsheet, then you’ve eliminated the code as the problem. Good luck on solving it.

Trevor

Herbert Huyo says:

I stumbled upon your blog today because I was searching for a way to get the auto-update feature working with your IMPORTJSON script… However, I discovered that I had a script that was based on IMPORTXML in Google Sheets…

I tried substituting IMPORTJSON for IMPORTXML, then BINGO… my sheet auto-updates from time to time…

Thank you both, Trevor and Dan, for the very speedy replies, and for what you suggest.

As it happens, I’d seen these replies earlier today and was thinking I’d give your suggestion a try, Dan. But now I’ve come to try it, my spreadsheet has updated for the first time in a fortnight! Will definitely keep your suggestion in my pocket for if and when it happens again.

Dan says:

No, I’m afraid it hasn’t. It seems that it’s trying to though, as now quite often when I reopen my spreadsheets there are errors where it has failed to pull the data.

I’m running up to 70 ImportJSON calls on a single sheet due to the way the API I’m pulling data from deals with individual records separately.

The errors disappear when I go back to my old trick of inserting a column. It’s a shame though because I’m sharing the sheet with other people and they’ll be regularly opening it to errors. I’m also automatically recording data at a set time each day. If there are errors at this time, it doesn’t record the data

With =ImportJSONBasicAuthentication(=concat(“https://xxxxxxx.myshopify.com/admin/orders.json?fields=email,name,total-price,shipping_address,tags,note,shipping_lines,fulfillment_status&page=”,A2)), it seems that some null cells are replaced by the previous non-null cell. How can I fix that? Regards, Vincent REHM

I use a custom function to change the item ID right before the .json in the URL, to change which item's price I want to import. My function only returns the last value from the “Daily” portion of the JSON data. Here is my custom function: http://pastebin.com/hAtbbaeS

The problem I get is sometimes I get an “Error Loading Data…” in the sheet; from what I've read this may be caused by having a large import range. What's strange is I can make the call from a different cell and it will work just fine… Any idea what might be causing this and any way to fix it?

I read your post about using ImportJSONAdvanced in a loop to handle this, but to be totally honest, that’s way beyond my abilities. At best I can tinker with other people’s code, but creating a loops to deal with paginated data is over my head.

Paul Griswold says:

ATM my solution is – since the last thing I get is the URL for the next page of data, I’ve created a new page that calls importJSON and points to the cell containing the URL of the next page. I end up with multiple pages instead of one, but it works for now.

Jeune says:

There appears to be an issue where a backslash “\” causes the script to fail to parse the data. Is there a fix for that or where in the script would this need to be accounted for? Even better question, why is it having an issue with that character. The API is using it as an escape character before an apostrophe.

Herbert Huyo says:

Thank you for this wonderful IMPORTJSON formula… You have helped me a lot… Now it's my time to help with the auto-update issues in the new Google Sheets. For those who want their IMPORTJSON auto-updated in Google Sheets, this is what I did to have it auto-updated every minute…. I just added the new script below in my script editor and set it to trigger every minute…. Hope this helps…

Having some weird issue where I can no longer see ImportJSON when I type an = sign in my sheet. I have v1.2.1 with the script named ImportJSON.gs and copied and pasted the raw code from GitHub. Any ideas?

Al says:

I’ve successfully used it to import data into my sheet from a JSON feed, but the values which should show as integers seem to have been imported as strings, and as such I can't aggregate them with a function like SUM().

Please help.

trevor says:

You could try forcing the numbers by modifying the defaultTransform_ function. To force every value to a float, at the end you could put:
if (!isNaN(parseFloat(data[row][column]))) {
  data[row][column] = parseFloat(data[row][column]);
}

This solution changes the exception from being the first element in the array to the last element. That may work for some people. Others have proposed:

if (data[state.rowIndex]) {

Which definitely works for some problems. I haven’t merged this solution into the main codebase because there was clearly some problem I was solving by putting i > 0, and while taking that out fixes a bug for some people, it may create another bug for others.

For each one, I am trying to pull /players/name but it never returns the first player (in this case, “Oh”). Some of the jsons I want to import from have 4 or 6 players and every time it returns all players except the first and I’m not sure why. I’d appreciate any help. Thanks.

trevor says:

Your problem is that “offsite_conversion.fb_pixel_add_to_cart” isn’t a field, it’s a value. You would need to use:
=ImportJSON(A1, "/data/actions/action_type")

To pull the array of action types, then use a VLOOKUP to find the row whose ‘action_type’ property has the value “offsite_conversion.fb_pixel_add_to_cart” and look up the ‘value’ property in the same row.
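Putting those two steps together might look like the formula below; the column order of the imported range is an assumption, so check which column actually contains 'action_type' in your sheet:

```
=VLOOKUP("offsite_conversion.fb_pixel_add_to_cart",
         ImportJSON(A1, "/data/actions/action_type,/data/actions/value", "noHeaders"),
         2, FALSE)
```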

trevor says:

You can test the character limit by progressively cutting down the URL and seeing if the error goes away (the request doesn’t have to return valid data, it only has to get through line 164).

You might also want to test chopping off bits of the URL and progressively adding them back in to see if there’s a special character that’s tripping it up. For instance, chop off 25% of the length until you stop getting the error on line 164, then add back in the length you just chopped off and chop off 25% of that. Keep going until you’re removing one character at a time and find the character causing the problem.

The only other thing I can think of off the top of my head looking at the URL is to consider encoding the colons, in case the Google fetch code is mistaking these as part of the protocol.

Good luck.

Adam says:

May 11, 2016 at 2:23 am (UTC -4)

Hey Trevor –

Just for your knowledge and anyone else that might encounter this problem –

The function seems not to like curly brackets, { and }. I encoded them in the URL as %7B and %7D respectively, and that seemed to do the trick.

However, since the ‘data’ is being parsed from JSON, any zeros will be in string form, so unless there's other code that's converting values to numbers, the test !data[row][column] will evaluate to false for a zero data value (since “0” is truthy, while 0 is falsy).
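The truthiness difference being described can be checked directly in JavaScript:

```javascript
// A numeric zero is falsy, but the string "0" is truthy,
// so a !value test treats the two very differently.
console.log(!0);   // true  - numeric zero is falsy
console.log(!"0"); // false - the string "0" is truthy
```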

Dídac says:

Good afternoon. I have a problem: I'm using this in Google Drive, but I need it to update only every 1 or 2 hours. It updates live, and Drive takes time to recalculate it. Can somebody help me? Thank you

After some hours trying to find a way to make an HTTP request with headers from a spreadsheet cell, I finally found one:

Add the following lines to the script (v1.2.1) at line 108:
(...)
if (postOptions["headers"] != null) {
postOptions["headers"] = JSON.parse(postOptions["headers"]);
}
(...)

In a cell, use the ImportJSONViaPost function like this:
=ImportJSONViaPost("https://bravenewcoin-v1.p.mashape.com/convert?from=btc&qty=1&to=usd";"";"headers={""X-Mashape-Key"":""1uPWfE36tomshdxFZSmlqut49XDop1LThHQjsnECklt9SFLSy0""}";"";"noInherit,noTruncate,rawHeaders,muteHttpExceptions=false")

yo says:

Dieter says:

I’m having quite some trouble trying to get the sheet to update automatically. I’ve followed many of the suggestions mentioned in the forum but nothing seems to work. Here are my observations:

1) Setting a trigger doesn't seem to fire.
2) When I try to use NOW() as an additional parameter, I get an error that says “This function is not allowed to reference a cell with NOW(), RAND(), or RANDBETWEEN()”.
3) Trying shahrul's suggestion of dynamically creating the call from another cell, I get the error “You do not have permission to call setValue”.

Furthermore, I would need to run the automated update regularly (Every 10 seconds) and it seems most solutions will only run every minute. Does anybody have any suggestions or a working copy that I could have a look at please?

Hey everyone, I figured out how to get it to automatically update every minute.

First, you need to pick a cell somewhere in your spreadsheet that says “http”.
Next, write a script that every minute clears that cell and sets the value back to “http”.
Next, set a trigger to run the script every minute.
Finally, go to the cell where you are using ImportJSON and, in your URL, replace http with “&Sheet1!O1&” (including the quotes), replacing Sheet1!O1 with the cell that says http.
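The script in the second step might look like this hedged Apps Script sketch; the sheet name and cell (Sheet1!O1) are the same examples used above, so adjust them to match your own sheet:

```javascript
// Clear the trigger cell and write "http" back, so every formula that
// splices this cell into its URL recalculates. Install via a time-driven
// trigger that runs each minute. Runs only inside Google Apps Script.
function refreshImportJsonCell() {
  var cell = SpreadsheetApp.getActiveSpreadsheet()
      .getSheetByName("Sheet1")
      .getRange("O1");
  cell.setValue("");       // blank the cell so the formula goes stale
  SpreadsheetApp.flush();  // commit the change before restoring
  cell.setValue("http");   // restoring "http" triggers a re-fetch
}
```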

Hope this helps!

If you found this helpful, consider sending me a small tip with bitcoin
16V9McqCFsvk7STWnYQhmodHBL5P1dphtZ

Pavel says:

Greetings. I really liked your script, but I have a question: can it be ported to Microsoft Excel? Excel uses VBA, and I did not find anything like your script for Microsoft Excel.

13 pings

[…] a custom script to do this. This is much easier than it sounds. I got all of this from here: http://blog.fastfedora.com/projects/import-json, which is a brilliant overview of it. If you don’t want to read it, do […]

[…] links to api calls to pull in boundary data as well as other stuff. It uses a script called importJSON which means you can query the data from the police api directly and have it appear nicely in a […]