Leveraging Google Analytics and Power Query

Question

We recently started leveraging Power Query as a "self-service" BI tool for our startup Navinum.

What we really love is the ability to connect different data sources together (fairly) easily.

We have started doing combined analysis across our Google Analytics data and our data from our PostgreSQL database. A typical example would be a view of fraudulent transactions broken down by marketing channel.
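The core of that kind of combined view is a join between the two sources on a shared key. Here is a minimal sketch in Python; the field names, sample values, and the idea of joining on a shared session id are all illustrative assumptions, not Navinum's actual schema:

```python
# Hypothetical Google Analytics rows, keyed by a session id that the
# application also stores in PostgreSQL (an assumed shared key).
ga_sessions = [
    {"session_id": "s1", "channel": "Paid Search"},
    {"session_id": "s2", "channel": "Email"},
    {"session_id": "s3", "channel": "Paid Search"},
]

# Hypothetical transactions exported from PostgreSQL, flagged by some
# internal fraud check.
transactions = [
    {"session_id": "s1", "amount": 120.0, "fraudulent": True},
    {"session_id": "s2", "amount": 35.0, "fraudulent": False},
    {"session_id": "s3", "amount": 80.0, "fraudulent": True},
]

def fraud_by_channel(sessions, txns):
    """Count fraudulent transactions per marketing channel."""
    channel_of = {s["session_id"]: s["channel"] for s in sessions}
    counts = {}
    for t in txns:
        if t["fraudulent"]:
            channel = channel_of.get(t["session_id"], "(unknown)")
            counts[channel] = counts.get(channel, 0) + 1
    return counts

print(fraud_by_channel(ga_sessions, transactions))  # {'Paid Search': 2}
```

In Power Query itself this corresponds to a Merge step between the two query outputs on the shared key column, followed by a Group By on channel.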

This can already be achieved by leveraging e.g. the Excel plugin NextAnalytics, which connects to the Google API. With that we populate worksheets that then serve as the data source for Power Query.

However, I think this kind of integration would be useful for almost all companies - and it would make a lot of sense to have Google Analytics connectivity as a standard data source for Power Query.

Can someone comment on whether this is planned - and maybe pass it on for the roadmap?

Just my 2 cents: it would be *very* interesting - we have a system running for that, but since we cannot schedule a direct refresh, we receive a weekly mail with the data we want. Still an incomplete system; it would be much better to integrate it in Power Query (but, at that point, the ability to write from Power Query into SQL Azure or SQL Server would be very welcome...)

toulou01: Yes, this is on our backlog, but there is no expected date of completion yet (nor has work started). We'll have to prioritize it with our other backlog items. Feedback like this thread helps to bump up the priority, so please keep the feedback coming!

Marco: Who knows how our direction will change in the future, but right now we are 100% focused on getting data INTO Excel, not writing it back out to other sources. That said, there are goofy things you can do with the native SQL box in the From SQL dialog, and I've also seen people create queries that output the text of a SQL query, which they then parse and run with sqlcmd. But this type of scenario isn't something we're focused on as a product.
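The "generate SQL text, then run it with sqlcmd" workaround mentioned above can be sketched as follows. This is only an illustration of the pattern, not anyone's actual pipeline; the table name, column names, and server name are all made up:

```python
def build_update_sql(rows):
    """Emit a batch of UPDATE statements as plain text.

    In the scenario described in the thread, a Power Query query would
    produce this text; here we build it directly from hypothetical rows.
    """
    stmts = []
    for r in rows:
        stmts.append(
            "UPDATE dbo.Transactions SET fraudulent = 1 "
            f"WHERE transaction_id = {int(r['transaction_id'])};"
        )
    return "\n".join(stmts)

sql_text = build_update_sql([{"transaction_id": 42}, {"transaction_id": 43}])
print(sql_text)

# Illustrative only - handing the script to sqlcmd requires SQL Server tooling:
# import subprocess
# subprocess.run(["sqlcmd", "-S", "myserver", "-Q", sql_text], check=True)
```

Note that interpolating values into SQL text like this is exactly why it stays a "goofy" workaround rather than a supported write-back path.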

It would be interesting to investigate possible development once Power Query runs in the cloud as part of an incremental update. In that case, writing a partition into a data model would be similar to writing into another "adapter" that can push data somewhere. Not something for tomorrow, I agree :)

(5) Then I publish the result sheet via the 'File / Publish to the web...' menu and copy the published URL to the clipboard.

From now on, I have an auto-updated report from GA, which I can easily access via Power Query by connecting to the URL I copied in the previous step. After simple cleaning, I can load the figures into Power Pivot and combine them with metrics from on-premises systems.
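Consuming such a published URL outside of Power Query follows the same shape: fetch the sheet as CSV, then parse it into rows. A minimal Python sketch, where the sample CSV content and column names are assumptions about what a GA report sheet might publish:

```python
import csv
import io
from urllib.request import urlopen

def fetch_published_csv(url):
    """Download a sheet published to the web as CSV text (not run here)."""
    with urlopen(url) as resp:
        return resp.read().decode("utf-8")

def parse_rows(csv_text):
    """Turn CSV text into a list of dicts, ready for further shaping."""
    return list(csv.DictReader(io.StringIO(csv_text)))

# Illustrative data in the shape a GA report sheet might publish:
sample = "channel,sessions\nEmail,120\nPaid Search,340\n"
rows = parse_rows(sample)
print(rows[0])  # {'channel': 'Email', 'sessions': '120'}
```

In Power Query the equivalent is simply pointing the From Web connector at the published URL; the sketch just shows what happens under the hood.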

Can I assume that you're using real values for the client_id, client_secret and refresh_token and not those hardcoded constants? :)

When I've used Google authentication to refresh an access token, I've always passed the parameters in the query string instead of the body. (That's not to say passing them in the body can't work.) What I'd definitely do is use Fiddler or another network tracing tool to see whether there is more information about the error in the body of the 400 response.
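For reference, building the refresh request with the parameters in the query string might look like the sketch below. The endpoint URL is the one Google documented for OAuth 2.0 token refresh at the time of this thread, and the credential values are obviously placeholders:

```python
from urllib.parse import urlencode

# Google's OAuth 2.0 token endpoint (as documented at the time).
TOKEN_ENDPOINT = "https://accounts.google.com/o/oauth2/token"

def build_refresh_url(client_id, client_secret, refresh_token):
    """Put the refresh parameters in the query string, not the POST body."""
    params = urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    })
    return f"{TOKEN_ENDPOINT}?{params}"

url = build_refresh_url("my-client-id", "my-secret", "my-refresh-token")
print(url)

# An actual call would still POST to this URL (not executed here; real
# credentials are required), e.g.:
# import urllib.request
# urllib.request.urlopen(urllib.request.Request(url, method="POST", data=b""))
```

If the 400 persists either way, the response body (visible in Fiddler) usually names the offending parameter, e.g. `invalid_grant`.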
