Not sure if this is a bug report or a feature request, but the calculation of the performance and best-practices scores seems utterly wrong in many cases. For example, apps built with Next.js often ship a 1 MB bundle plus 2 MB of additional content on first load, with around 30 network requests, more than half of them JavaScript files. Every site built with that stack is visibly slow and very heavy, yet it can score 100% on performance.
My own sites have a 0.1 MB bundle with 0.1 MB of extra content and 3 network requests on load, including a single minified JS file. This stack is much faster: pages open instantly on first load and on navigation. Yet they score the same, also 100%.
How is this possible? One site is slow and heavy, the other lean and fast, so why do they get the same score? Or rather, why aren't large bundles and a high number of network requests factored into the performance and best-practices scores? Shouldn't Lighthouse warn about this or penalize slow sites?
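For context, my understanding is that the performance score is a weighted blend of lab metrics, not of payload statistics, so bundle size only matters indirectly through its effect on those metrics. A rough sketch of that blending is below; the weights are what I've seen cited for recent Lighthouse versions, so treat them and the simplified math as assumptions (real Lighthouse first maps each raw metric onto a 0-1 score via a log-normal curve, which I've omitted):

```ts
// Sketch of how I understand the performance score to be blended.
// Assumption: weights roughly match recent Lighthouse versions; the
// per-metric log-normal scoring curves are omitted for brevity.
const WEIGHTS: Record<string, number> = {
  'total-blocking-time': 0.30,
  'largest-contentful-paint': 0.25,
  'cumulative-layout-shift': 0.25,
  'first-contentful-paint': 0.10,
  'speed-index': 0.10,
};

// metricScores: each metric already normalized to a 0-1 score.
function performanceScore(metricScores: Record<string, number>): number {
  let total = 0;
  for (const [id, weight] of Object.entries(WEIGHTS)) {
    total += weight * (metricScores[id] ?? 0);
  }
  return Math.round(total * 100);
}

// A heavy page that still posts perfect metric scores under throttling
// blends to 100: bundle size never enters the formula directly.
console.log(performanceScore({
  'total-blocking-time': 1,
  'largest-contentful-paint': 1,
  'cumulative-layout-shift': 1,
  'first-contentful-paint': 1,
  'speed-index': 1,
})); // 100
```

If that's right, it explains the identical scores, but it doesn't make them any less misleading.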
Add these to the list of issues in the Lighthouse report, please (a budget sketch that would enforce them follows below):
"Reduce bundle size - use plain HTML, CSS and JavaScript to avoid large bundle sizes"
"Reduce number of network requests - your site will load faster if you avoid a large number of network requests"
Shouldn't Lighthouse warn about excessive network requests and dissuade people from using inferior tech such as Next.js? If not, why not?
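In the meantime, something close to this already exists as opt-in performance budgets (LightWallet), which can flag a run on exactly these numbers. Below is a minimal sketch, assuming the settings.budgets schema and the programmatic API described in the Lighthouse docs; the 100 KB and 3-request limits are just my own numbers from above, and the exact field names should be double-checked against the version in use:

```ts
// run-with-budget.ts: sketch of a Lighthouse run with a resource budget.
// Assumptions: the settings.budgets schema of recent Lighthouse versions,
// sizes in kilobytes, and the 'performance-budget' audit id.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

const config = {
  extends: 'lighthouse:default',
  settings: {
    budgets: [
      {
        path: '/*',
        resourceSizes: [
          {resourceType: 'script', budget: 100}, // KB of JavaScript
          {resourceType: 'total', budget: 200},  // KB overall
        ],
        resourceCounts: [
          {resourceType: 'total', budget: 3},    // max network requests
        ],
      },
    ],
  },
};

async function main() {
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  const result = await lighthouse('https://example.com', {port: chrome.port}, config);
  // Overruns appear in the report's "Budgets" table; they do not lower
  // the performance score itself.
  console.log(result?.lhr.audits['performance-budget']?.displayValue);
  await chrome.kill();
}

main();
```

Even so, budgets are opt-in and don't touch the score, so the questions above still stand.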