Private Equity Analysis
Conducted comprehensive analysis to evaluate the digital maturity of a potential acquisition target in the fast-casual restaurant chain industry.
Redacted approached us to consult on a confidential potential acquisition of a fast-casual restaurant chain they were evaluating. MonstarLab (and Fuzz previously) have a reputation as leaders in this space, so they wanted our help assessing the target's digital maturity, and its potential investment areas, against a set of peers in the space.
Initially, to compare a competitive set of sites, I conducted an analysis driven by Google PageSpeed Insights. My goal was to assess the performance, accessibility, best practices, and search engine optimization (SEO) of these websites, with a specific focus on the metrics that most affect user experience: the overall index score, Largest Contentful Paint (LCP), and the Accessibility, Best Practices, and SEO indices. I evaluated these metrics separately for desktop and mobile to account for the differing user experiences on each platform.
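These checks can be scripted against the public PageSpeed Insights v5 API rather than run by hand in the browser. The sketch below builds a request URL and pulls the compared metrics out of a response; the field names follow the documented `lighthouseResult` shape, but the site and the scores in the fixture are hypothetical:

```javascript
// Build a PageSpeed Insights v5 request URL for one site and strategy.
// The API also accepts a `key` parameter for an API key (omitted here).
function psiUrl(site, strategy) {
  const params = new URLSearchParams({ url: site, strategy });
  ['performance', 'accessibility', 'best-practices', 'seo']
    .forEach((c) => params.append('category', c));
  return `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?${params}`;
}

// Pull the compared scores out of a PSI response.
// Category scores arrive as 0–1 floats; we scale them to 0–100.
function extractScores(psi) {
  const { categories, audits } = psi.lighthouseResult;
  return {
    performance: Math.round(categories.performance.score * 100),
    accessibility: Math.round(categories.accessibility.score * 100),
    bestPractices: Math.round(categories['best-practices'].score * 100),
    seo: Math.round(categories.seo.score * 100),
    lcpMs: audits['largest-contentful-paint'].numericValue,
  };
}

// Hypothetical response fragment in the documented shape:
const sample = {
  lighthouseResult: {
    categories: {
      performance: { score: 0.92 },
      accessibility: { score: 0.88 },
      'best-practices': { score: 1 },
      seo: { score: 0.95 },
    },
    audits: { 'largest-contentful-paint': { numericValue: 2300 } },
  },
};
```

Running one URL per site, per strategy (`mobile` and `desktop`), yields the per-platform comparison rows described above.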
For every second delay in mobile page load, conversions can fall by up to 20%.
Lighthouse Metrics Explained
Index Score
The index score served as an overarching indicator of a website's performance in PageSpeed Insights. Ranging from 0 to 100, a higher score signified better performance. This composite metric weighed several lab measurements, including LCP, Total Blocking Time (TBT, the lab proxy for First Input Delay), and Cumulative Layout Shift (CLS). Consequently, the index score gave us a concise summary of each website's performance.
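As a rough illustration of how that composite works: Lighthouse takes a weighted average of each metric's individual 0–1 score (each raw measurement is first mapped onto a scoring curve). The weights below are the ones documented for Lighthouse v10; they shift between versions, so treat them as illustrative rather than definitive:

```javascript
// Illustrative Lighthouse v10 performance weights — these change
// between Lighthouse versions, so treat them as a sketch.
const LIGHTHOUSE_V10_WEIGHTS = {
  'first-contentful-paint': 0.10,
  'speed-index': 0.10,
  'largest-contentful-paint': 0.25,
  'total-blocking-time': 0.30,
  'cumulative-layout-shift': 0.25,
};

// Combine per-metric scores (each already 0–1) into the 0–100
// composite index score via a weighted average.
function indexScore(metricScores, weights = LIGHTHOUSE_V10_WEIGHTS) {
  let total = 0;
  for (const [metric, weight] of Object.entries(weights)) {
    total += (metricScores[metric] ?? 0) * weight;
  }
  return Math.round(total * 100);
}
```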
Largest Contentful Paint (LCP)
Largest Contentful Paint (LCP) played a crucial role in determining how quickly the main content became visible to users. As a Core Web Vitals metric, it measured the time taken for the largest visible element on a page to render. We aimed for an LCP under 2.5 seconds, as recommended by Google, to ensure an optimal user experience.
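Google's published thresholds bucket LCP into three bands, which is a convenient way to color-code a competitive set. A minimal classifier:

```javascript
// Bucket an LCP measurement using Google's published thresholds:
// good ≤ 2.5 s, needs improvement ≤ 4 s, poor above that.
function lcpRating(lcpMs) {
  if (lcpMs <= 2500) return 'good';
  if (lcpMs <= 4000) return 'needs-improvement';
  return 'poor';
}
```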
Accessibility Index
The Accessibility index allowed us to gauge each website's adherence to web accessibility standards and guidelines. We evaluated factors such as proper semantic HTML markup, alternative text for images, keyboard accessibility, and compatibility with assistive technologies. A higher Accessibility index indicated better accessibility for users with disabilities, emphasizing the importance of inclusivity.
Best Practices Index
In assessing the Best Practices index, we considered various web development best practices endorsed by Google. This evaluation encompassed factors such as efficient resource loading, secure connections (HTTPS), proper cache usage, responsive design, and avoidance of deprecated or obsolete features. Websites with a higher Best Practices index demonstrated a commitment to industry standards and optimal development practices.
SEO Index
The SEO index measured each website's adherence to search engine optimization guidelines and techniques. We scrutinized factors including meta tags, structured data, page titles, headings, and mobile-friendliness. A higher SEO index indicated better optimization for search engines, improving the likelihood of favorable search rankings.
By analyzing these metrics individually for desktop and mobile platforms, I was able to gain insights into the relative performance of each website within the competitive set. This approach enabled us to identify areas for improvement and prioritize optimization efforts based on the specific needs and expectations of users across platforms. Ultimately, this site comparison using Google PageSpeed Insights empowered us to enhance user experiences, accessibility, adherence to best practices, and search engine visibility for the websites in question.
Continuing the analysis, I generated a backend competitive comparison by leveraging Cypress and the "@neuralegion/cypress-har-generator" package. These tools, along with Cypress Studio's record functionality, let us quickly automate each site's primary flow: a generalized path to order, sign up, or checkout that could be applied uniformly across all competitors. This meant simulating user interactions such as browsing products, adding items to the cart, proceeding to checkout, and completing a purchase. By automating these flows, we could systematically measure and compare the performance of the different applications, including response times, transaction completion rates, and overall user-journey smoothness.
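A sketch of what one such spec might look like, written for the Cypress runner. The routes, selectors, and confirmation text are hypothetical stand-ins for a competitor's actual flow; `cy.recordHar()` and `cy.saveHar()` are the commands the "@neuralegion/cypress-har-generator" package registers (its `install(on)` hook must also be wired into the Cypress config), so verify option names against the package's current README:

```javascript
// cypress/e2e/primary-flow.cy.js — sketch of one competitor's primary
// flow. All selectors and routes here are hypothetical placeholders.
describe('primary flow: order completion', () => {
  it('browses, adds to cart, and checks out while recording a HAR', () => {
    cy.recordHar();                      // start capturing network traffic

    cy.visit('/menu');                   // hypothetical route
    cy.get('[data-test=menu-item]').first().click();
    cy.get('[data-test=add-to-cart]').click();
    cy.get('[data-test=checkout]').click();
    cy.get('[data-test=place-order]').click();
    cy.contains('Order confirmed');      // hypothetical confirmation text

    cy.saveHar();                        // write the HAR for later analysis
  });
});
```

Because every competitor gets the same shape of spec (visit, select, cart, checkout), the resulting HAR files are directly comparable.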
To capture a comprehensive view of network traffic, we used the industry-standard HAR (HTTP Archive) file format. Cypress, in conjunction with the "@neuralegion/cypress-har-generator" package, let us record all network requests and responses into HAR files during the automated interactions with each site. To ensure we only tracked the network calls that mattered to the test, we filtered the traffic to include only first-party API URLs, excluding any analytics traffic that could skew the results.
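That filtering step reduces to a small predicate over the HAR's `log.entries` array. A sketch, with a hypothetical first-party hostname and fixture:

```javascript
// Filter a HAR down to first-party API traffic, dropping analytics and
// other third-party calls that would skew timings.
function firstPartyApiEntries(har, apiHosts) {
  return har.log.entries.filter((entry) => {
    const { hostname } = new URL(entry.request.url);
    return apiHosts.includes(hostname);
  });
}

// Minimal HAR-shaped fixture (hostnames are hypothetical):
const har = {
  log: {
    entries: [
      { request: { url: 'https://api.example-chain.com/v1/menu' }, time: 180 },
      { request: { url: 'https://www.google-analytics.com/collect' }, time: 40 },
      { request: { url: 'https://api.example-chain.com/v1/cart' }, time: 95 },
    ],
  },
};

const apiOnly = firstPartyApiEntries(har, ['api.example-chain.com']);
```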
To account for potential variations and ensure statistical significance, we executed multiple runs for each competitor and averaged the results, reducing the margin of error and yielding more reliable performance metrics. Recognizing the importance of performance during peak demand, we also spread our tests across a carefully selected cross-section of hours. This allowed us to evaluate each competitor's primary flow under varying levels of user activity and to gauge their ability to handle high-demand situations.
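Aggregating the repeated runs is then straightforward; reporting a standard deviation alongside the mean keeps outlier runs visible rather than silently buried. A minimal sketch:

```javascript
// Summarize a timing metric (e.g. total first-party API time per flow
// run, in ms) across repeated runs: mean plus sample standard deviation.
function summarize(runTimesMs) {
  const n = runTimesMs.length;
  const mean = runTimesMs.reduce((a, b) => a + b, 0) / n;
  const variance =
    runTimesMs.reduce((a, t) => a + (t - mean) ** 2, 0) / (n - 1);
  return { mean, stdDev: Math.sqrt(variance) };
}
```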
By using Cypress and the "@neuralegion/cypress-har-generator" package in this manner, we established an automated, standardized framework for capturing the network traffic and timing associated with each competitor's primary flow. The resulting HAR files gave us a detailed record of backend performance, allowing for comprehensive analysis and comparison across factors such as response times, API call efficiency, and overall backend behavior. This empowered us to identify strengths and weaknesses, surface areas for improvement, and make informed recommendations about investments that could enhance our client's backend performance relative to the competitive landscape.
Both analyses are integral to understanding where a digital business can improve to drive conversions and increase revenue. Studies have consistently shown that faster load times lead to higher conversion rates, with even small delays negatively impacting user engagement. The frontend analysis addressed the website's visual and interactive elements, measuring performance, accessibility, and adherence to best practices; the backend analysis focused on the behind-the-scenes operations, measuring the efficiency of network communication and response times.
By combining frontend and backend performance analysis with the automation of key user flows and API-level benchmarking, I was able to build a preliminary, fully external picture of the business's digital performance relative to industry standards. This information serves as a valuable reference point, enabling a business to identify areas for improvement, optimize its digital strategy, and align its performance with industry leaders. The project enabled Redacted to quantify the cost, and the potential lift in conversion, of optimizing the user experience with a focus on performance. The data-driven conclusions from these analyses provided valuable insight into the target's digital maturity and investment potential, positioning Redacted for success in the potential acquisition.