The Liquid Purple Digital Marketing Agency can assist you with website creation, management, marketing, and SEO...
The following metrics are generated using performance data.
First Contentful Paint marks the time at which the first text or image is painted onto your page. A good user experience is 0.9s or less. Measured: 6.7 s (desktop: 2.1 s). Score: 0.26.
Time to Interactive is the amount of time it takes for the page to become fully interactive. A good user experience is 2.5s or less. Measured: 22.0 s (desktop: 4.6 s).
Speed Index shows how quickly the contents of a page are visibly populated. A good user experience is 1.3s or less. Measured: 12.8 s (desktop: 4.6 s). Score: 0.05.
Total Blocking Time is how much time is blocked by scripts during your page loading process, i.e. the sum of all time periods between FCP and Time to Interactive. A good user experience is 150ms or less.
Largest Contentful Paint marks how long it takes for the largest element of content (e.g. a hero image) to be painted on your page. A good user experience is 1.2s or less.
Cumulative Layout Shift measures the movement of visible elements within the viewport, i.e. how much your page's layout shifts as it loads. A good user experience is a score of 0.1 or less.
Total Page Size: 4.3 MB, made up of images, JavaScript, fonts, CSS stylesheets, HTML, media, and other resources.
Total Page Requests: 209
Here is more detailed information about the page.
NoIndex: The noindex directive is a meta tag value that tells search engines not to show your website in search results. Do not set 'noindex' as a meta tag value if you want your website to appear in search engine results.
By default, a webpage is set to “index.” You should add a <meta name="robots" content="noindex" /> directive to a webpage in the <head> section of the HTML if you do not want search engines to index a given page and include it in the SERPs (Search Engine Results Pages).
<meta name="robots" content="noindex" />
DoFollow & NoFollow : nofollow directive is a meta tag value. Nofollow directive is for not to follow any links of your website by search engine bots. You must not set ‘nofollow’ as value in meta tags if you want follow your link by search engine bots.
By default, links are set to “follow.” You would set a link to “nofollow” in this way: <a href="http://www.example.com/" rel="nofollow">Anchor Text</a> if you want to suggest to Google that the hyperlink should not pass any link equity/SEO value to the link target.
<a href="http://www.example.com/" rel="nofollow">Anchor Text</a>
Canonical: A canonical tag tells search engines which URL is the preferred version of a page when the same content is reachable at more than one address, for example with and without the www prefix:
<link rel="canonical" href="https://mywebsite.com/home" />
<link rel="canonical" href="https://www.mywebsite.com/home" />
(Raw audit values from the interactive report, shown in pairs that appear to be mobile and desktop: 209 requests • 4,412 / 4,177 KiB total, 46 / 45 chains found, 1,973 / 1,879 elements, 9 user timings, 20 / 6 long tasks found, 1 facade alternative available, root document took 1,520 / 720 ms, third-party code blocked the main thread for 2,960 / 60 ms, plus the individual potential time and byte savings discussed in the audits below.)
The site's robots.txt file:

# If the Joomla site is installed within a folder such as at
# e.g. www.example.com/joomla/ the robots.txt file MUST be
# moved to the site root at e.g. www.example.com/robots.txt
# AND the joomla folder name MUST be prefixed to the disallowed
# path, e.g. the Disallow rule for the /administrator/ folder
# MUST be changed to read Disallow: /joomla/administrator/
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/orig.html
#
# For syntax checking, see:
# http://tool.motoricerca.info/robots-checker.phtml

User-agent: *
Disallow: /administrator/
Disallow: /bin/
Disallow: /cache/
Disallow: /cli/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /layouts/
Disallow: /libraries/
Disallow: /logs/
Disallow: /modules/
Disallow: /plugins/
Disallow: /tmp/
Disallow: *.pdf$

User-agent: Slurp
Crawl-delay: 3

User-agent: dotbot
Disallow: /

User-agent: AhrefsBot
Disallow: /

# JSitemap entries
Sitemap: https://uptimeinstitute.com/index.php?option=com_jmap&view=sitemap&format=xml
Here are some additional factors that affect your SEO page score and rankings.
Some browser cache lifetimes could be set longer!
Serve static assets with an efficient cache policy. A long cache lifetime can speed up repeat visits to your page. Learn more
There are 97 resources found with short cache lifetimes.
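Long cache lifetimes are set on the server, but a common complement on the HTML side (file names below are hypothetical) is to fingerprint asset URLs: files can then be cached aggressively, and a changed hash busts the cache on the next deploy. A minimal sketch:

```html
<!-- Hypothetical content-hashed asset URLs. The server can serve these
     with a long cache lifetime (e.g. Cache-Control: max-age=31536000,
     immutable); renaming the file on deploy invalidates old copies. -->
<link rel="stylesheet" href="/assets/styles.8f3a91c2.css">
<script src="/assets/app.4b7de019.js" defer></script>
```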
Third-party code IS NOT significantly impacting load performance!
Third-party code can significantly impact load performance. Limit the number of redundant third-party providers and try to load third-party code after your page has primarily finished loading. [Learn more](https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/loading-third-party-javascript/).
Third-party code blocked the main thread for 60 ms.
Resources ARE blocking the first paint of your page!
Resources are blocking the first paint of your page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles. [Learn more](https://web.dev/render-blocking-resources/).
Potential savings of 2,730 ms.
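A minimal sketch of that advice, assuming hypothetical site.css and site.js files: inline the few rules needed above the fold, and let everything else load without blocking the first paint.

```html
<head>
  <!-- Critical, above-the-fold CSS is inlined, so no request blocks paint. -->
  <style>
    body { margin: 0; font-family: sans-serif; }
  </style>
  <!-- Non-critical stylesheet: media="print" loads it without blocking
       rendering, then onload switches it to all media. -->
  <link rel="stylesheet" href="/css/site.css" media="print" onload="this.media='all'">
  <!-- defer downloads the script in parallel and runs it after parsing. -->
  <script src="/js/site.js" defer></script>
</head>
```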
The network round trip time is good!
Network round trip times (RTT) have a large impact on performance. If the RTT to an origin is high, it's an indication that servers closer to the user could improve performance. [Learn more](https://hpbn.co/primer-on-latency-and-bandwidth/).
Main thread work could be minimized!
Consider reducing the time spent parsing, compiling and executing JS. You may find delivering smaller JS payloads helps with this. [Learn more](https://web.dev/mainthread-work-breakdown/)
Reduce unused CSS!
Reduce unused rules from stylesheets and defer CSS not used for above-the-fold content to decrease bytes consumed by network activity. [Learn more](https://web.dev/unused-css-rules/).
Potential savings of 72 KiB.
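One way to keep non-applicable CSS off the critical path (stylesheet names here are illustrative) is to split it by media query, so rules for print or wide screens never block rendering on a phone:

```html
<link rel="stylesheet" href="/css/base.css">
<!-- Fetched at low priority; never blocks rendering on screen. -->
<link rel="stylesheet" href="/css/print.css" media="print">
<!-- Only applies (and blocks) on viewports at least 64em wide. -->
<link rel="stylesheet" href="/css/wide.css" media="(min-width: 64em)">
```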
Could use improvement!
Time to interactive is the amount of time it takes for the page to become fully interactive. [Learn more](https://web.dev/interactive/).
Looks good!
The Critical Request Chains below show you what resources are loaded with a high priority. Consider reducing the length of chains, reducing the download size of resources, or deferring the download of unnecessary resources to improve page load. [Learn more](https://web.dev/critical-request-chains/).
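For example, a font referenced from inside a stylesheet is only discovered after the CSS downloads, creating an HTML → CSS → font chain. A sketch (the font path is hypothetical) that flattens the chain with a preload hint:

```html
<!-- Lets the browser fetch the font in parallel with the CSS instead of
     waiting to discover it inside the stylesheet. crossorigin is
     required for font preloads. -->
<link rel="preload" href="/fonts/brand.woff2" as="font" type="font/woff2" crossorigin>
```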
Could be better!
Consider lazy-loading offscreen and hidden images after all critical resources have finished loading to lower time to interactive. [Learn more](https://web.dev/offscreen-images/).
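Native lazy loading covers the common case with a single attribute; a sketch with a hypothetical gallery image:

```html
<!-- loading="lazy" defers the download until the user scrolls near it. -->
<img src="/img/gallery-01.jpg" loading="lazy" alt="Gallery photo">
```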
Largest Contentful Paint marks the time at which the largest text or image is painted. [Learn more](https://web.dev/lighthouse-largest-contentful-paint/)
Keep the server response time for the main document short because all other requests depend on it. [Learn more](https://web.dev/time-to-first-byte/).
Leverage the font-display CSS feature to ensure text is user-visible while webfonts are loading. [Learn more](https://web.dev/font-display/).
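A minimal sketch, assuming a hypothetical "Brand" webfont: font-display: swap renders text in a fallback font immediately and swaps the webfont in when it arrives.

```html
<style>
  @font-face {
    font-family: "Brand";
    src: url("/fonts/brand.woff2") format("woff2");
    /* Show fallback text immediately; swap in the webfont when loaded. */
    font-display: swap;
  }
</style>
```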
Consider reducing the time spent parsing, compiling, and executing JS. You may find delivering smaller JS payloads helps with this. [Learn more](https://web.dev/bootup-time/).
Large GIFs are inefficient for delivering animated content. Consider using MPEG4/WebM videos for animations and PNG/WebP for static images instead of GIF to save network bytes. [Learn more](https://web.dev/efficient-animated-content/)
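A sketch of the GIF-to-video swap (file names are placeholders); the attributes reproduce GIF behavior: autoplaying, looping, and silent.

```html
<video autoplay loop muted playsinline width="480" height="270">
  <!-- The browser picks the first format it supports. -->
  <source src="/media/animation.webm" type="video/webm">
  <source src="/media/animation.mp4" type="video/mp4">
</video>
```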
Sum of all time periods between FCP and Time to Interactive, when task length exceeded 50ms, expressed in milliseconds. [Learn more](https://web.dev/lighthouse-total-blocking-time/).
Text-based resources should be served with compression (gzip, deflate or brotli) to minimize total network bytes. [Learn more](https://web.dev/uses-text-compression/).
Large network payloads cost users real money and are highly correlated with long load times. [Learn more](https://web.dev/total-byte-weight/).
Reduce unused JavaScript and defer loading scripts until they are required to decrease bytes consumed by network activity. [Learn more](https://web.dev/unused-javascript/).
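One pattern for deferring scripts until required is a dynamic import() behind a user action; the widget module and its openChat export here are hypothetical:

```html
<button id="chat-open">Chat with us</button>
<script>
  // Load the heavy widget code only on first click, not at page load.
  document.getElementById("chat-open").addEventListener("click", () => {
    import("/js/chat-widget.js").then((mod) => mod.openChat());
  }, { once: true });
</script>
```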
For users on slow connections, external scripts dynamically injected via `document.write()` can delay page load by tens of seconds. [Learn more](https://web.dev/no-document-write/).
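The usual fix is to inject the script asynchronously instead; a sketch with a placeholder third-party URL:

```html
<!-- Appending a script element loads it without blocking the HTML
     parser, unlike document.write(). -->
<script>
  const s = document.createElement("script");
  s.src = "https://example.com/third-party.js";
  s.async = true;
  document.head.appendChild(s);
</script>
```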
Redirects introduce additional delays before the page can be loaded. [Learn more](https://web.dev/redirects/).
Serve images that are appropriately-sized to save cellular data and improve load time. [Learn more](https://web.dev/uses-responsive-images/).
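A sketch with hypothetical image files: srcset offers several widths, and sizes tells the browser how large the image will render, so it can pick the smallest adequate file.

```html
<img src="/img/hero-800.jpg"
     srcset="/img/hero-400.jpg 400w, /img/hero-800.jpg 800w, /img/hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Hero image">
```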
Polyfills and transforms enable legacy browsers to use new JavaScript features. However, many aren't necessary for modern browsers. For your bundled JavaScript, adopt a modern script deployment strategy using module/nomodule feature detection to reduce the amount of code shipped to modern browsers, while retaining support for legacy browsers. [Learn More](https://philipwalton.com/articles/deploying-es2015-code-in-production-today/)
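The module/nomodule pattern in a sketch (bundle names are illustrative): modern browsers load the small ES2015+ bundle and skip the nomodule script; legacy browsers do the reverse.

```html
<!-- Modern browsers: small, untranspiled bundle. -->
<script type="module" src="/js/app.modern.js"></script>
<!-- Legacy browsers ignore type="module" and fall back to this. -->
<script nomodule src="/js/app.legacy.js" defer></script>
```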
Set an explicit width and height on image elements to reduce layout shifts and improve CLS. [Learn more](https://web.dev/optimize-cls/#images-without-dimensions)
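For example, giving an image its intrinsic dimensions lets the browser reserve the box before a single byte arrives:

```html
<!-- The browser computes the aspect ratio from width/height and holds
     the space, so later content does not jump when the image loads. -->
<img src="/img/team.jpg" width="640" height="480" alt="Team photo">
```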
Minifying CSS files can reduce network payload sizes. [Learn more](https://web.dev/unminified-css/).
Image formats like WebP and AVIF often provide better compression than PNG or JPEG, which means faster downloads and less data consumption. [Learn more](https://web.dev/uses-webp-images/).
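A sketch using the picture element (file names are placeholders): the browser takes the first source it supports and falls back to the JPEG otherwise.

```html
<picture>
  <source srcset="/img/photo.avif" type="image/avif">
  <source srcset="/img/photo.webp" type="image/webp">
  <img src="/img/photo.jpg" alt="Photo">
</picture>
```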
Lists the longest tasks on the main thread, useful for identifying worst contributors to input delay. [Learn more](https://web.dev/long-tasks-devtools/)
Optimized images load faster and consume less cellular data. [Learn more](https://web.dev/uses-optimized-images/).
Collection of useful page vitals.
Unique words are uncommon words that reflect your site's features and information. Search engines do not use unique words as a ranking factor, but they are still useful for getting a proper picture of your site's content. Using positive unique words like "complete," "perfect," or "shiny" is good for user experience. Stop words are common words such as prepositions and generic terms like "download," "click me," "offer," or "win." Since the most-used keywords may be a slight factor for visitors, you are encouraged to use more unique words and fewer stop words.
Lists the network requests that were made during page load.
To set budgets for the quantity and size of page resources, add a budget.json file.
Used for the treemap app.