
Testing Analytics Events: A Guide for QA Engineers

by Nataliia Makeeva, February 27th, 2024

Too Long; Didn't Read

In today's digital landscape, understanding user behavior is paramount for optimizing application performance and user experience. Through analytics, we gain insights into which features of our application resonate with users and which ones fall short. This article will explore effective approaches to testing analytics events, ensuring accurate data collection and enhancing application functionality.

Welcome to this guide on testing analytics events!


In today's digital landscape, understanding user behavior is paramount for optimizing application performance and user experience. This article will explore effective approaches to testing analytics events, ensuring accurate data collection, and enhancing application functionality.


Popular analytics systems for tracking user behavior:


  • Google Analytics - for tracking website traffic and user behavior.
  • Amplitude - for analyzing user behavior in mobile and web applications.
  • Mixpanel - provides event analytics to track user interaction with applications.
  • Adobe Analytics - for traffic and user behavior analysis, offers advanced marketing research features.
  • Heap Analytics - offers user experience and behavior analysis tools for applications.
  • Hotjar - tracks heatmaps, user session recordings, and website surveys.
  • Segment - provides infrastructure for collecting and transferring data between analytics systems.
  • MyTracker - tracks user activity in mobile apps, offering event tracking and funnel analysis.
  • AppMetrica - a mobile app analytics platform by Yandex, tracks user activity and marketing campaign effectiveness.


The analytics system is an essential part of our application. It would be a mistake to assume that because users don't interact with it directly, it isn't vital and doesn't require testing. Through analytics, we gain insights into which features of our application resonate with users and which ones fall short. For instance, consider a scenario where we showcase a promo banner "Send a money transfer by phone number." Out of 100 users who viewed it, 60 clicked for more information, but only 5 completed the action. This data prompts us to consider potential improvements, such as fixing any bugs that prevent users from completing the action or refining the user flow.


However, today we'll look at testing analytics events from a QA specialist's perspective. How do we go about it? First, I recommend creating a table in Excel or Google Sheets listing events and their parameters, with two columns for Android and iOS (if we're also testing web or desktop, we'd add columns for those, but I'll focus on mobile applications).


Once we have our table set up, we review all events to ensure clarity on when they should be triggered. For instance, should an event be retriggered when users navigate away and return to the current screen? It's crucial to clarify these aspects with an analyst, manager, or designer—the individual responsible for analyzing our events.
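To make the structure concrete, here is a minimal sketch of such a spec expressed as Python data. The event names, triggers, and parameters are hypothetical and only illustrate what the spreadsheet columns might hold:

```python
# Hypothetical tracking spec mirroring the spreadsheet:
# event name, when it should fire, expected parameters, and per-platform check status.
EVENT_SPEC = [
    {
        "event": "promo_banner_shown",
        "trigger": "fires once when the banner becomes visible; not re-sent on returning to the screen",
        "params": {"banner_id": "string", "screen": "string"},
        "android": "not checked",
        "ios": "not checked",
    },
    {
        "event": "promo_banner_clicked",
        "trigger": "fires on every tap on the banner",
        "params": {"banner_id": "string"},
        "android": "not checked",
        "ios": "not checked",
    },
]
```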


An example Google Sheet with event descriptions


Now, let's move to the technical part: connecting a traffic sniffer. I prefer Charles. We filter by the name of the analytics system in use (for instance, in our project we use Amplitude). Then we execute the action that should trigger a specific event and wait for the POST request sent to our analytics service. Typically, analytics libraries aggregate events before sending them.


Therefore, you'll have to wait until a certain number of events accumulates or a timer expires. Alternatively, you can minimize the application (most likely, all accumulated events will be sent at that moment). Afterward, we inspect the request contents, making sure the desired event is sent and no extraneous data is transmitted. Once verified, we mark it off in our table and proceed to the next event.
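For orientation, a batched request body often looks roughly like the sketch below. The field names here are an assumption for illustration, not the exact schema of any particular SDK; check your own captured traffic for the real structure:

```python
# Rough shape of a batched analytics payload as it might appear in the sniffer.
# Field names are illustrative; the real schema depends on the SDK in use.
captured_body = {
    "events": [
        {
            "event_type": "promo_banner_shown",
            "event_properties": {"banner_id": "transfer_by_phone", "screen": "home"},
            "device_id": "test-device-01",
            "time": 1709035200000,
        },
        {
            "event_type": "promo_banner_clicked",
            "event_properties": {"banner_id": "transfer_by_phone"},
            "device_id": "test-device-01",
            "time": 1709035205000,
        },
    ]
}
```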


We search for the necessary requests in Charles



To make the transmitted information easier to analyze, it's convenient to run it through any JSON formatter, which you can find online. After formatting the JSON, search the page for the event that was supposed to be sent and examine the parameters it was sent with, as well as how many times it was sent.
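If you prefer scripting to an online formatter, a short check like the one below does the same job. It is a sketch that assumes the batched body has an "events" list with "event_type" and "event_properties" fields, as in the earlier example; the file name and event name are placeholders:

```python
import json

# Request body copied from the sniffer (e.g. saved from Charles as text).
with open("captured_request.json", encoding="utf-8") as f:
    body = json.load(f)

target = "promo_banner_clicked"  # the event we expect to have been sent

# Pretty-print the payload, like an online JSON formatter would.
print(json.dumps(body, indent=2, ensure_ascii=False))

# Count how many times the target event appears and show its parameters.
matches = [e for e in body.get("events", []) if e.get("event_type") == target]
print(f"{target}: sent {len(matches)} time(s)")
for event in matches:
    print(event.get("event_properties", {}))
```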



We analyze the information being sent to the analytics system



After verifying events through Charles, we access the analytics system to ensure that all events have been captured accurately and that the quantities correspond to our testing activities.


So, the main points we consider during testing are as follows:


  • The event is triggered at the correct moment.

  • The event is not duplicated without cause.

  • Unnecessary events are not sent arbitrarily.

  • We verify spelling consistency: it should match the documentation, and event names across different platforms should be consistent (including case sensitivity).

  • When sending data like country or city names, ensure uniformity across platforms: for instance, consistently using "UK" rather than "Great Britain" on one of the platforms. (A small automation sketch of these consistency checks follows this list.)
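As a rough sketch of how some of these checks could be automated (all names and values below are hypothetical, and the captured sets would in practice come from the parsed request bodies), the following compares captured event names against the documentation and flags case mismatches and non-uniform parameter values:

```python
# Events listed in the documentation (hypothetical names).
documented = {"promo_banner_shown", "promo_banner_clicked", "transfer_completed"}

# Event names actually captured from the sniffer on each platform (hypothetical).
captured = {
    "Android": {"promo_banner_shown", "Promo_Banner_Clicked"},
    "iOS": {"promo_banner_shown", "promo_banner_clicked"},
}

for platform, names in captured.items():
    for name in names - documented:
        # Point out names that differ from the docs only by letter case.
        case_match = next((d for d in documented if d.lower() == name.lower()), None)
        hint = f" (case mismatch with '{case_match}')" if case_match else ""
        print(f"{platform}: undocumented event '{name}'{hint}")

# Check that parameter values such as country names are uniform across platforms.
country = {"Android": "UK", "iOS": "Great Britain"}
if len(set(country.values())) > 1:
    print(f"country value differs across platforms: {country}")
```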


After the release, testers can also monitor the analytics system. They should examine the events related to new features or modified functionality. Have all the events been received? Are any of the old ones malfunctioning? If any irregularities are observed, the tester should first inspect the production build of the application; our main goal is to ensure that the application works in production. Does the functionality work? If it does, then check whether the events are being triggered. Analyzing these aspects helps identify latent application issues.


In conclusion, thorough testing of analytics events is essential for maintaining the integrity of data-driven decision-making in application development. By following the methodology outlined in this guide, testers can ensure the reliability and accuracy of analytics data, ultimately leading to improved user engagement and satisfaction.