Digital Analytics Review — Week 7

Erwin Solis
Mar 28, 2021

Welcome to a new article on my blog! This week, together with the specialists of the CXL Institute, I learned more about BigQuery and Facebook Analytics: specifically, how to automate insightful reports with Google’s lightning-fast analytics data warehouse, and the basic skills needed to use Facebook Analytics, including how to create funnel reports, the pros and cons of the LTV report, measuring outcomes from Facebook posts, and more. Thanks to CXL Institute for giving me the opportunity to become a better professional and to learn how to help organizations make better decisions every day.

In the first video, we started exploring Google BigQuery using the Sandbox and public datasets, and we learned how to build a data schema to prepare our own BigQuery data structure. The topics we covered were: examining the free BigQuery trainings available through Google Cloud, using free BigQuery features to sharpen our SQL and data management skills, and building our own data structure for a future BigQuery data warehouse.
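As a rough illustration of what such a starting schema might look like, here is a hypothetical DDL statement in BigQuery Standard SQL (the project, dataset, table, and column names are my own placeholders, not from the course):

```sql
-- Hypothetical example: a simple events table for a future data warehouse.
-- `my_project.analytics` is a placeholder; adjust the names to your own setup.
CREATE TABLE IF NOT EXISTS `my_project.analytics.events` (
  event_date DATE NOT NULL,   -- day the event happened
  user_id    STRING,          -- anonymous visitor identifier
  event_name STRING,          -- e.g. 'page_view', 'purchase'
  page_path  STRING,          -- URL path where the event fired
  revenue    NUMERIC          -- monetary value, if any
);
```

In the Sandbox you can run DDL like this at no cost, within the Sandbox’s storage limits.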

In the second video, we learned about the different ways of getting our data into BigQuery, set up a manual or automatic import, and learned a little about data flow and data transfer tools. We can automatically pull data from Google Cloud Storage, Google Ads, YouTube, and other tools into BigQuery; this is configured directly in the BigQuery interface, as shown in the screencast. Another option is to use a Google Cloud Function to automatically transfer data from Cloud Storage to BigQuery; to get the data into Cloud Storage in the first place, we can ask our developers to send the files there automatically. There is also Google Dataflow.
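For the Cloud Storage route, one way to sketch the import today is BigQuery’s `LOAD DATA` statement, which pulls files from a bucket into a table with plain SQL (the bucket, file pattern, and table names below are hypothetical placeholders):

```sql
-- Hypothetical sketch: load daily CSV exports from a Cloud Storage bucket
-- into the placeholder table `my_project.analytics.events`.
LOAD DATA INTO `my_project.analytics.events`
FROM FILES (
  format = 'CSV',
  uris = ['gs://my-bucket/exports/events_*.csv'],  -- wildcard matches many files
  skip_leading_rows = 1                            -- skip the CSV header row
);
```

The same load can also be configured through the BigQuery UI or scheduled, which is closer to how the course demonstrates it.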

In the third video, we discovered how to work with our data in BigQuery. We learned to use SQL to create, save, and schedule our queries, views, and new tables. We started with basic SQL by creating and testing queries, and finally we prepared the data for our reports and analysis using queries and views.

In the fourth video, we learned how to analyze our Google Analytics data in BigQuery: how the Google Analytics export data is structured, and how to work with partitioned tables and nested fields. Then we practiced querying our Google Analytics data. We also saw how to create automated reports from BigQuery data with Google Sheets and Google Data Studio, and how to connect BigQuery to non-Google data visualisation tools.
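To make the nested-field idea concrete, here is a hedged sketch of a query over the standard Google Analytics 360 export tables, which are sharded by day as `ga_sessions_YYYYMMDD` and store hits as a nested, repeated field (the project and dataset names are placeholders):

```sql
-- Sessions and pageviews per day from the GA 360 export (hypothetical dataset).
SELECT
  date,
  -- a session is identified by fullVisitorId + visitId in the export schema
  COUNT(DISTINCT CONCAT(fullVisitorId, CAST(visitId AS STRING))) AS sessions,
  -- hits is a nested repeated field, so we UNNEST it to count pageviews
  SUM((SELECT COUNT(*) FROM UNNEST(hits) AS h WHERE h.type = 'PAGE')) AS pageviews
FROM `my_project.my_ga_dataset.ga_sessions_*`
WHERE _TABLE_SUFFIX BETWEEN '20210301' AND '20210307'  -- limits which day-shards are scanned
GROUP BY date
ORDER BY date;
```

The `_TABLE_SUFFIX` filter is what keeps the query from scanning every daily table, which matters for cost.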

To create our reports in Google Sheets, we have two possibilities. We can use the native data connector with BigQuery, which is available only for G Suite users. So, if we have a paid version of Google Drive, we will see this option and can connect to BigQuery directly from a spreadsheet. We click on Connect to BigQuery and select a project for which billing is activated. The query editor then opens, where we can edit a query; after clicking Insert results, the results appear in the sheet. Whenever we see the BigQuery icon, it means that the data comes from BigQuery. Now we have our data in Google Sheets and can reuse it to create monthly reports.

In the final video, with Christina Genko, we explored the possibilities that BigQuery opens for a company and discovered use cases and best practices. We got inspired by BigQuery use cases, learned BigQuery best practices, and improved our knowledge with useful resources.

We learned some BigQuery best practices. When we want to inspect a table, we don’t run SELECT * on it, because every table in BigQuery has a preview mode; we can check the data without scanning the whole table. We use the query validator before running a query, so we can make sure it contains no errors. When we are not sure how much a query, or storage in BigQuery, will cost, we can always use the pricing calculator; that is much safer than running the query or storing the data and then paying too much. When we work with tables, we should use partitioned tables wherever possible, because querying a partitioned table consumes less data than querying a normal table, so we pay less. We should also set a table expiration when needed: in BigQuery we pay both for storage and for querying, so if we only need a temporary table for a limited time, we can set an expiration date and the table will be deleted automatically after that date.
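As a sketch of the partitioning and expiration advice, the following hypothetical DDL creates a date-partitioned table whose partitions expire automatically (all names are my own placeholders):

```sql
-- Hypothetical working table, partitioned by day; partitions are deleted
-- automatically after 90 days, so we stop paying for stale storage.
CREATE TABLE `my_project.analytics.daily_events`
PARTITION BY event_date
OPTIONS (
  partition_expiration_days = 90,
  description = 'Temporary working table; partitions auto-expire after 90 days'
)
AS
SELECT event_date, user_id, event_name
FROM `my_project.analytics.events`;

-- Filtering on the partition column means only that day's partition is scanned:
SELECT COUNT(*) FROM `my_project.analytics.daily_events`
WHERE event_date = '2021-03-01';
```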

We use denormalized tables, meaning we try to avoid spreading data across several different tables and instead put more data into one table. Admittedly, we discovered that the BigQuery team has been optimizing for normalized tables as well, but denormalization is still the recommendation we would give our company. We should also know that we can control costs and budgets in BigQuery: we can set a cost or budget limit per person, per project, and so on. If we have a large team running plenty of queries, we can cap the amount of data consumed per person, so no one can run a query that consumes a large amount of money. We can also use BigQuery for machine-learning models, combining them with our existing data to make predictions based on the weather, for example. Harvard University streams data into BigQuery in real time to optimize the use of its online courses, and some companies use BigQuery for real-time fraud detection: they stream data in real time and can spot fraudulent activity on their platform. But the main use case for us marketers is consolidating all the data we need for reporting, so that we can automate our reports and spend less time reporting and more time analyzing.
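The machine-learning point refers to BigQuery ML, where a model can be trained with plain SQL. A minimal hedged sketch, assuming a hypothetical training table with a binary `purchased` label (the table, model, and column names are placeholders):

```sql
-- Hypothetical BigQuery ML example: logistic regression predicting purchases.
CREATE OR REPLACE MODEL `my_project.analytics.purchase_model`
OPTIONS (
  model_type = 'logistic_reg',
  input_label_cols = ['purchased']   -- the column the model learns to predict
) AS
SELECT
  device_category,   -- feature: desktop / mobile / tablet
  traffic_source,    -- feature: where the session came from
  pageviews,         -- feature: engagement in the session
  purchased          -- label: did this session convert?
FROM `my_project.analytics.training_data`;

-- Predictions are then just another query:
SELECT * FROM ML.PREDICT(
  MODEL `my_project.analytics.purchase_model`,
  TABLE `my_project.analytics.new_sessions`);
```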

This week’s BigQuery chapter has definitely opened my eyes to how to better utilize this platform. Everything I have learned these days will help me make better decisions in my organization and decipher what our users are telling us through our data. Each of the sessions with the teacher, and the resources she offered at the end of class, helped me round out the information and recognize the new things we can do with BigQuery. Thank you very much, CXL Institute, for the opportunity to keep growing and becoming a better professional. Never stop learning!
