Important: As of August 13, 2024, this page is no longer actively maintained. Please refer to the current version of this content.
Prerequisites
Firebase is a leading, Google-owned analytics platform. Businesses use Firebase to gain insight into the performance of their iOS, Android, and web applications. Analytics integrates seamlessly with Firebase through BigQuery.
Sharded tables (BigQuery tables whose names end with the _MMDDYYYY suffix) are not currently supported through this UI. To set up an integration using sharded tables, you will need to get in touch with a product specialist. Additionally, if you require enrichments such as joining with user property tables or deriving custom user_ids, please contact us.
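As a rough illustration of what counts as a sharded table, the sketch below checks a table name for an 8-digit date suffix. The `events_` sample names and the exact regex are assumptions for this sketch, not part of the product:

```python
import re

# Hypothetical check: a base table name followed by an 8-digit date shard
# suffix (the _MMDDYYYY pattern mentioned above).
SHARD_SUFFIX = re.compile(r"^(?P<base>.+)_(?P<date>\d{8})$")

def is_sharded(table_name: str) -> bool:
    """Return True if the table name ends with an 8-digit date shard suffix."""
    return SHARD_SUFFIX.match(table_name) is not None

print(is_sharded("events_12312023"))  # sharded: True
print(is_sharded("events"))           # not sharded: False
```

If your export produces one table per day in this shape, reach out before attempting the UI setup.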
Instructions
Adding a Data Source in Analytics
- In Analytics, click on the gear icon and select Project Settings.
- Select the Data Sources tab.
- Select New Data Source.
- Select Google BigQuery and Firebase. Click Connect.
- You should see the BigQuery + Firebase overview screen. Click Next.
Enabling BigQuery Export
- Sign into the Firebase Console
- Click on the Settings icon, then click Project Settings.
- On the Project Settings page, click the Integrations tab.
- On the BigQuery card, click Link.
Connection Information
- Open the BigQuery console on Google Cloud Platform and select a project.
- Enter the GCP Project ID containing your data.
- Enter the Dataset Name.
- Enter the Table Name and click Next in Analytics.
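To show how the three values above fit together, here is a minimal sketch composing BigQuery's standard `project.dataset.table` reference; the project, dataset, and table names used are hypothetical:

```python
def qualified_table(project_id: str, dataset: str, table: str) -> str:
    """Compose a fully-qualified BigQuery table reference: project.dataset.table."""
    for part in (project_id, dataset, table):
        if not part:
            raise ValueError("project, dataset, and table must all be non-empty")
    return f"{project_id}.{dataset}.{table}"

# Hypothetical names for illustration only:
print(qualified_table("my-gcp-project", "analytics_123456", "events"))
# my-gcp-project.analytics_123456.events
```

The GCP Project ID, Dataset Name, and Table Name you enter in Analytics are exactly these three components.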
Grant Permissions
This integration works by sharing the dataset with Analytics' service account and requires only read-only access to that dataset. Analytics takes on the cost of the query and caches this data in Analytics' proprietary analytics engine.
- Within the BigQuery Console, select your Project and your dataset from the previous section.
- Click on Share Dataset.
- In the Dataset Permissions panel, in the Add Members field, enter the account below.
integrations@indicative-988.iam.gserviceaccount.com
- In the Select a Role dropdown, select BigQuery Data Viewer and click Add.
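As a sketch of what this grant amounts to at the dataset level, the snippet below builds the kind of access-list entry a BigQuery dataset carries for a read-only member (the basic READER role, which corresponds to BigQuery Data Viewer). The helper function and the sample owner entry are illustrative, not part of the product:

```python
import json

SERVICE_ACCOUNT = "integrations@indicative-988.iam.gserviceaccount.com"

def add_reader(access_entries):
    """Append a read-only (READER) grant for the service account, in the
    shape used by a BigQuery dataset's access list. Idempotent."""
    entry = {"role": "READER", "userByEmail": SERVICE_ACCOUNT}
    if entry not in access_entries:
        access_entries = access_entries + [entry]
    return access_entries

# Hypothetical existing ACL with one owner entry:
acl = add_reader([{"role": "OWNER", "userByEmail": "owner@example.com"}])
print(json.dumps(acl, indent=2))
```

Read-only access is all the integration needs; no write or admin permissions are granted.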
Data Loading
- Start Date: Select the date from which Analytics should load your data. If your data history exceeds 1 billion events, a Solutions Engineer will contact you to assist with the integration.
- Schedule Interval: Select the frequency at which new data is made available in Analytics.
- Load Timestamp Field: Select the field used to identify new data. We recommend using a timestamp that denotes when the event was published, not the actual event timestamp, to allow late data to be collected. This will not impact your analyses, since queries reference the event timestamp. If you choose to load data every 3, 6, or 12 hours, make sure to select a load timestamp field with at least hour precision (not a date-only field).
For example, if an event with an event timestamp of 12/1 was published to the table on 12/3, it would not be collected unless the publishing timestamp is used, since the daily extract for 12/3 only looks for events that occurred on 12/3. Using the publishing timestamp allows all new data published to the table to be extracted on a nightly basis.
- Advanced - Processing Delay: Select when we should start extracting your data, in UTC. This time should be when all of the previous day's data is fully available in your table for extraction.
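The publishing-timestamp recommendation above can be sketched with a toy simulation. The dates and the extraction filter are illustrative only, not the actual extraction logic:

```python
from datetime import date

# Toy events as (event_timestamp, publish_timestamp). The 12/1 event below
# arrives late, landing in the table on 12/3.
events = [
    (date(2023, 12, 3), date(2023, 12, 3)),  # on-time event
    (date(2023, 12, 1), date(2023, 12, 3)),  # late-arriving event
]

def daily_extract(events, load_day, use_publish_ts):
    """Return the events a daily extract for `load_day` would pick up,
    filtering on either the publish or the event timestamp."""
    key = (lambda e: e[1]) if use_publish_ts else (lambda e: e[0])
    return [e for e in events if key(e) == load_day]

# Filtering on the event timestamp misses the late 12/1 event on 12/3:
print(len(daily_extract(events, date(2023, 12, 3), use_publish_ts=False)))  # 1
# Filtering on the publish timestamp collects both rows published on 12/3:
print(len(daily_extract(events, date(2023, 12, 3), use_publish_ts=True)))   # 2
```

This is why a publish timestamp makes the safer load timestamp field: late rows are still swept up by the extract for the day they land in the table.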
Assisted Modeling
You should see a summary of your data, based on the last 7 days, in three main blocks. You only need to be concerned if the margin of error is significant; if so, please reach out to a product specialist:
- Events Summary
You should see a daily breakdown of your Total Event Count and the number of Unique Event Names. If there are certain events to exclude, please click on the Exclude checkbox for those events.
If you would like to exclude any events by regex or property value, please contact a product specialist.
If this section looks good, click Next.
- Properties Summary
Here you will see the number of Unique Event Property Names. If there are certain properties to exclude, please click on the Exclude checkbox for those properties.
If you require more advanced configurations, such as parsing out JSON fields, creating derived properties, or excluding properties based on regex, please contact a product specialist.
If this section looks good, click Next.
- Users Summary
This section lists the number of Unique Users seen. If the numbers do not look correct, please go back to the User Modeling section to confirm that the correct ID was chosen. Please note that the counts may not reflect exactly what gets loaded into Analytics due to aliasing and other event modeling configurations.
If this section looks good, click Next.
Waiting for Data
Once you reach this screen, you're all done! You should see your data in Analytics within 48-72 hours, and you will be notified by email.