This notebook analyzes the website performance of dadgang.co. The analysis covers Lighthouse's audit scores for accessibility, best practices, performance, PWA readiness, and SEO. Additionally, we measure the site's load times and create visualizations to illustrate its performance.
The full report generated by Google's Lighthouse project can be found here.
Let's start by importing the necessary libraries and defining the metrics and opportunities data.
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
# Lighthouse metrics data (all values in milliseconds, except cumulativeLayoutShift, which is unitless)
metrics_data = {
    'firstContentfulPaint': 2753,
    'firstMeaningfulPaint': 3444,
    'largestContentfulPaint': 4186,
    'interactive': 19187,
    'speedIndex': 8119,
    'totalBlockingTime': 999,
    'maxPotentialFID': 202,
    'cumulativeLayoutShift': 0.0484375
}
# Lighthouse improvement opportunities (estimated savings in milliseconds)
opportunities_data = {
    'render-blocking-resources': 745,
    'unminified-css': 150,
    'unused-css-rules': 150,
    'unused-javascript': 3920,
    'modern-image-formats': 300,
    'efficient-animated-content': 1600,
    'legacy-javascript': 150
}
Let's start by visualizing the metrics data. We'll create a bar plot to show the values of each metric.
# Convert the metrics data to a DataFrame for easier plotting
metrics_df = pd.DataFrame(list(metrics_data.items()), columns=['Metric', 'Value'])
# Create a bar plot
plt.figure(figsize=(10, 6))
# Newer seaborn versions require an explicit hue when using a palette
sns.barplot(x='Value', y='Metric', hue='Metric', data=metrics_df, palette='viridis', legend=False)
plt.title('Website Performance Metrics')
plt.xlabel('Value')
plt.ylabel('Metric')
plt.show()
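Beyond eyeballing the bar plot, we can compare a few of these metrics against the "good" thresholds published in Google's Core Web Vitals and Lighthouse guidance. The thresholds below are taken from that guidance and should be treated as approximate:

```python
# Metric values copied from metrics_data above
metrics_subset = {
    'largestContentfulPaint': 4186,      # ms
    'totalBlockingTime': 999,            # ms
    'cumulativeLayoutShift': 0.0484375,  # unitless
}

# Approximate "good" thresholds from Google's Core Web Vitals / Lighthouse guidance
thresholds = {
    'largestContentfulPaint': 2500,  # ms
    'totalBlockingTime': 200,        # ms
    'cumulativeLayoutShift': 0.1,    # unitless
}

# True means the metric meets the "good" threshold
verdicts = {name: metrics_subset[name] <= limit for name, limit in thresholds.items()}
print(verdicts)
```

By these thresholds, cumulative layout shift is healthy, while largest contentful paint and total blocking time both exceed their "good" limits.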
Next, we'll analyze the opportunities data. These are areas where improvements can be made to enhance the performance of the website. We'll create a bar plot to show the potential savings for each opportunity.
# Convert the opportunities data to a DataFrame for easier plotting
opportunities_df = pd.DataFrame(list(opportunities_data.items()), columns=['Opportunity', 'Potential Savings'])
# Create a bar plot
plt.figure(figsize=(10, 6))
# Newer seaborn versions require an explicit hue when using a palette
sns.barplot(x='Potential Savings', y='Opportunity', hue='Opportunity', data=opportunities_df, palette='viridis', legend=False)
plt.title('Website Performance Improvement Opportunities')
plt.xlabel('Potential Savings (ms)')
plt.ylabel('Opportunity')
plt.show()
Next, we'll write some Python code to measure the load times of the website. We'll use the requests library to send a GET request to the website and measure the time it takes to receive a response. We'll perform this operation multiple times to get an average load time.
import requests
import time

# Measure the end-to-end response time for 10 consecutive GET requests
load_times = []
for _ in range(10):
    start_time = time.time()
    response = requests.get('https://dadgang.co')
    end_time = time.time()
    load_times.append(end_time - start_time)

average_load_time = sum(load_times) / len(load_times)
average_load_time
0.2190446138381958
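The average alone can hide variability between requests, so it is worth reporting the spread as well. The sketch below uses the standard-library statistics module; the sample values here are illustrative placeholders, since the ten raw measurements from the loop above are not shown:

```python
import statistics

# Illustrative load-time samples in seconds (placeholders; real values
# would come from the load_times list collected by the loop above)
load_times = [0.21, 0.19, 0.25, 0.22, 0.20, 0.23, 0.21, 0.24, 0.20, 0.22]

mean_lt = statistics.mean(load_times)
stdev_lt = statistics.stdev(load_times)  # sample standard deviation

print(round(mean_lt, 3), round(stdev_lt, 3))
```

A small standard deviation relative to the mean suggests the response time is stable across requests.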
Finally, let's create a histogram to visualize the distribution of load times.
# Create a histogram
plt.figure(figsize=(10, 6))
sns.histplot(load_times, bins=10, kde=True, color='skyblue')
plt.title('Distribution of Load Times')
plt.xlabel('Load Time (seconds)')
plt.ylabel('Frequency')
plt.show()