Dialogflow 12 – Sending data to BigQuery

Prompted by a question I just got, I'm writing about how to send data from our agent to BigQuery. For those of you who don't know BigQuery, it is a Data Warehouse service in Google Cloud. It is flexible, powerful and fully managed: you only need to create a dataset and a table and insert data … all the underlying infrastructure and the service itself are managed by Google. It is also a pay-per-use model with an almost negligible entry cost.

With all this in mind, it looks to me like the ideal place to store usage information from our agents, transactions happening over time, etc.

In this example I'm showing how to do a simple tracking of the number of bookings made through the agent. It only saves the date and the number of tickets, but we could add more information captured from the parameters or coming from Dialogflow.

This would be the code:
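The original embedded gist is not shown here, so below is a minimal sketch of what that fulfillment helper could look like. The dataset name (`agent_analytics`), table name (`bookings`) and column names are assumptions for illustration; adjust them to your own project.

```javascript
// Hypothetical sketch of the fulfillment helper: streams one booking row
// into a BigQuery table with columns date (TIMESTAMP) and tickets (INTEGER).

// Mirror of the call-site conversion shown below: keep only the date part
// of the ISO string and pin the time to noon.
function formatBookingDate(isoDate) {
  return isoDate.split('T')[0] + ' 12:00';
}

function addBookingToBigQuery(bookingDate, numTickets) {
  // Requiring lazily keeps the module loadable without credentials and can
  // help Cloud Functions cold starts.
  const BigQuery = require('@google-cloud/bigquery');
  const bigquery = BigQuery(); // in the 0.x client the module is a factory function

  const row = {
    date: bookingDate,   // must match the TIMESTAMP column of the table
    tickets: numTickets  // number of tickets booked
  };

  return bigquery.dataset('agent_analytics')
    .table('bookings')
    .insert(row)
    .then(() => console.log('Booking row inserted'))
    .catch(err => console.error('BigQuery insert failed:', err));
}
```

The row keys have to match the column names of the destination table, which is what the note about data formats below refers to.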

And we would call this function like this:

const rdate = resDate.split('T')[0] + ' 12:00';
const today = BigQuery.timestamp(new Date()).value;

You would need to add this dependency to your Node.js function's package.json:

"@google-cloud/bigquery": "^0.6.0"

Obviously you need to match the data format we are sending with the fields defined in the table that stores the data. Look at the date format conversion above: it produces a string matching the TIMESTAMP format that BigQuery expects.
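To make that matching concrete, here is a hedged sketch of how the table could be created with the same client library, plus a small sanity check on the timestamp strings we build. Again, the dataset, table and column names are assumptions, not something fixed by Dialogflow or BigQuery.

```javascript
// "YYYY-MM-DD HH:MM" (optionally with seconds) is the shape of the strings
// we build above for the TIMESTAMP column.
const TIMESTAMP_RE = /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}(:\d{2})?$/;

// Quick sanity check before streaming a value into a TIMESTAMP column.
function looksLikeTimestamp(value) {
  return TIMESTAMP_RE.test(value);
}

// Hypothetical one-off setup: create a table whose schema lines up with the
// row objects { date, tickets } that the fulfillment inserts.
function createBookingsTable() {
  const BigQuery = require('@google-cloud/bigquery');
  const bigquery = BigQuery();
  return bigquery.dataset('agent_analytics')
    .createTable('bookings', { schema: 'date:TIMESTAMP,tickets:INTEGER' });
}
```

You can equally create the dataset and table from the BigQuery web UI; the only thing that matters is that the column names and types match the rows you send.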

Another great point of BigQuery is that it is quite easy to visualize the information using dashboards built with the free tool Data Studio … with a few clicks you can generate a report similar to this one, showing the number of interactions per day on our agent, and also add filters, stats, etc.:
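Under the hood, a chart like that boils down to a simple aggregation query. As a sketch, this is the kind of query Data Studio would run for an interactions-per-day chart, here issued directly from Node.js; the table name is again an assumption.

```javascript
// Aggregation behind an "interactions per day" chart: count rows per calendar day.
const INTERACTIONS_PER_DAY =
  'SELECT DATE(date) AS day, COUNT(*) AS interactions ' +
  'FROM `agent_analytics.bookings` ' +
  'GROUP BY day ORDER BY day';

// Run the query with the same client library used for the inserts.
function interactionsPerDay() {
  const BigQuery = require('@google-cloud/bigquery');
  const bigquery = BigQuery();
  return bigquery.query({ query: INTERACTIONS_PER_DAY, useLegacySql: false });
}
```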




I work for Google Cloud, but this post contains my personal ideas and opinions.
