# suitecommerce
k
I mentioned that the Go2marine SCA website has been using Layer0 (now Edgio), a 3rd party caching service, to improve Google Core Web Vitals measurements. Has NetSuite made any progress on, or does it have plans for, improving the loading performance of SuiteCommerce websites so that they will pass Google Core Web Vitals measurements? I recall a conversation with the NetSuite SuiteCommerce development team in which they said they were working on improving the loading performance of SuiteCommerce, but I haven't seen any mention of this in the last two or three SuiteCommerce release notes. Has anyone heard from NetSuite about their plans to improve performance?
s
Third-party caching services are not supported with SuiteCommerce, so the support team is correct in pushing back against this. If you or your customer choose to use one, then you will have to shoulder the risk of it. We strongly recommend against using them, especially as NetSuite already provides CDN caching services through Akamai. Is NetSuite making improvements to caching and performance? Yes, of course. An example related to this is that we have introduced a feature to improve the caching and serving of images so that next-gen image formats are preferred if they result in a smaller file size.
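As an editorial aside, here is a minimal sketch of how that kind of preference is commonly implemented at a CDN or edge layer: the browser advertises supported formats via its Accept header, and a next-gen encoding is served only when it is actually smaller than the original. The function name, size figures, and JPEG fallback below are illustrative assumptions, not NetSuite's or Akamai's actual implementation.

```js
// Illustrative sketch only: prefer AVIF/WebP over the original JPEG, but only
// when the client accepts the format AND the encoded file is actually smaller.
// encodedSizes is a hypothetical lookup of byte sizes per pre-encoded variant.
function pickImageFormat(acceptHeader, encodedSizes) {
  const accepts = (format) => acceptHeader.includes(`image/${format}`);
  const candidates = ['avif', 'webp'].filter(
    (f) => accepts(f) && encodedSizes[f] !== undefined && encodedSizes[f] < encodedSizes.jpeg
  );
  // Smallest acceptable next-gen variant wins; otherwise fall back to the original.
  return candidates.sort((a, b) => encodedSizes[a] - encodedSizes[b])[0] || 'jpeg';
}

// A browser that advertises AVIF/WebP support gets the smallest variant:
console.log(pickImageFormat('image/avif,image/webp,image/*', { avif: 41000, webp: 52000, jpeg: 68000 })); // 'avif'
// A client that only accepts generic image types keeps the original:
console.log(pickImageFormat('image/*', { avif: 41000, webp: 52000, jpeg: 68000 })); // 'jpeg'
```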
k
@Steve Goldberg, thanks for responding to my questions. The main issue we're experiencing is that the Google Core Web Vitals measurements CLS and LCP are far too large to meet Google's web page performance goals. The problem is that shopping.js and shopping.css are very large files, and it takes so long for NetSuite's servers to deliver them that the client browser cannot render the page in time to meet Google's metrics. I have developed a way to display the SEO Page Generator output initially, which loads very fast, and then replace it once the client has rendered the client-side JavaScript Backbone views and inserted the CMS content. But I need the 3rd party caching service to make some changes to the SEO Page Generator output before sending it to the client. I recall someone (I think it was Flo) saying a couple of years ago that the SuiteCommerce development team was working on a way to deliver the JavaScript and CSS files much faster, possibly by breaking them up into smaller chunks, or on a way to speed up the initial loading time of a page. But I have seen nothing in the release notes. Do you know if the changes to the Akamai caching servers have been targeted at improving the initial page load of SCA web pages?
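A minimal sketch of the kind of swap described above, under some loud assumptions: a wrapper element around the SEO Page Generator output (`seo-prerendered`), a hidden SPA mount point (`main`), and a custom `appReady` event fired once the Backbone views and CMS content are in place. None of these IDs or events are part of SuiteCommerce itself.

```js
// Illustrative sketch only: show the server-rendered SEO shell immediately,
// then swap in the client-rendered application once it signals readiness.
(function () {
  var seoShell = document.getElementById('seo-prerendered'); // hypothetical wrapper around the SEO output
  var appRoot = document.getElementById('main');             // hypothetical SPA mount point, hidden at first

  function swapInApp() {
    if (!seoShell || !appRoot) return;
    appRoot.style.display = '';                 // reveal the Backbone-rendered views
    seoShell.parentNode.removeChild(seoShell);  // remove the static shell to avoid duplicate content
  }

  // Assumes the application dispatches a custom 'appReady' event when views
  // and CMS content have finished rendering.
  document.addEventListener('appReady', swapInApp, { once: true });
})();
```

The point of keeping the mount point hidden until the swap is that the static shell and the live application are never visible at the same time, which would itself cause a layout shift.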
s
She may have been referring to webpack, which is something we are looking at, but it's not on the immediate horizon. I am aware that Google's performance standards can seem quite daunting when applied to SuiteCommerce. I would advise that 'poor' scores are reasonably common across ecommerce sites, primarily because of the weight of code and data required to show, for example, a PDP or category page. We have been making improvements to the caching service (I mentioned the new image formats work) and we are rolling out a change to the SEO page generator itself in the next few weeks that should improve TTFB and FCP scores. There are other aspects of performance, particularly around caching, that we are exploring. I would remind you that performance scores from Google are a minor ranking factor and that they are relative to your competitors. Having a lightning-fast site does not necessarily equal a huge increase in ranking, as Google will always try to rank sites based on relevance, domain authority, backlinks, etc., first.
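For context on the webpack point, this is roughly what splitting one oversized bundle into smaller, independently cacheable chunks looks like. The entry name, paths, and size hint are placeholders for illustration, not anything SuiteCommerce ships or has committed to.

```js
// Illustrative sketch only: a webpack production config that breaks a single
// large bundle (think shopping.js) into smaller chunks with cache-friendly names.
const path = require('path');

module.exports = {
  mode: 'production',
  entry: { shopping: './src/shopping.js' },   // placeholder entry point
  output: {
    filename: '[name].[contenthash].js',      // content-hashed names allow long cache lifetimes
    path: path.resolve(__dirname, 'dist'),
  },
  optimization: {
    splitChunks: {
      chunks: 'all',        // pull shared/vendor code out of the main bundle
      maxSize: 244 * 1024,  // nudge webpack to keep individual chunks under ~244 KiB
    },
  },
};
```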
k
@Steve Goldberg, I truly appreciate the information about what NetSuite and the SuiteCommerce teams are doing to improve performance. I will share this with LFS because they are considering moving to a different web platform. 😞 The focus on the Google CWV measurements became significant because Google Merchant Center was penalizing URLs due to poor CLS and LCP measurements. Consumer retail ecommerce websites unfortunately have to meet Google's standards if they want to place well and be found in its search engine and shopping sections. I tried all kinds of things to improve these measurements, but I couldn't make much of a dent due to the slow loading of the initial page files.
s
> The focus on the Google CWV measurements became significant because Google Merchant Center was penalizing URLs due to poor CLS and LCP measurements. Consumer retail ecommerce websites unfortunately have to meet Google's standards if they want to place well and be found in its search engine and shopping sections.
I've not heard of this. Do you have more information you can share? Is it this? https://support.google.com/merchants/answer/11192630?hl=en
And if you could share a screenshot of what Google is showing you (by DM if you prefer), that would be helpful, as I don't have an idea of what this looks like for a real customer site.
k
@Steve Goldberg, I am using Google PageSpeed Insights (https://pagespeed.web.dev/) to measure Google's Core Web Vitals (CWV). I have attached screenshots of the CWV measurements for two SCA domains that point to the same SCA website: https://www.go2marine.com and https://sca.go2marine.com. The www.go2marine.com website is the production website published to the world; it uses the Edgio caching frontend service. The sca.go2marine.com website is the same SCA website served via the NetSuite CDN cache. Both domains point to the same SSP application on LFS' NetSuite production account. The CWV measurements that we care about are Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS).
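For anyone who wants to reproduce this comparison without screenshots, the lab data behind pagespeed.web.dev is also available from the public PageSpeed Insights API. A rough sketch follows; the audit IDs are taken from the Lighthouse report shape and may need adjusting if the API response changes.

```js
// Illustrative sketch only: fetch LCP and CLS for both domains from the
// PageSpeed Insights API and print them side by side. Requires Node 18+ or a
// browser for the built-in fetch; an API key is only needed for heavier use.
const PSI = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function coreWebVitals(url) {
  const res = await fetch(`${PSI}?url=${encodeURIComponent(url)}&strategy=mobile&category=performance`);
  const data = await res.json();
  const audits = data.lighthouseResult.audits;               // Lighthouse lab metrics
  return {
    url,
    lcpMs: audits['largest-contentful-paint'].numericValue,  // milliseconds
    cls: audits['cumulative-layout-shift'].numericValue,     // unitless score
  };
}

// Compare the Edgio-fronted domain against the NetSuite CDN domain.
Promise.all([
  coreWebVitals('https://www.go2marine.com'),
  coreWebVitals('https://sca.go2marine.com'),
]).then((results) => console.table(results));
```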
s
Thanks – I'm aware of CWV and the page speed testing tools. My point is specifically about what you said about Google Merchant Center. Those are the screens I can't access because they are private to your business. You say that Google says they are going to penalise you for poor scores, so I wanted to know how you know that. Where do they say that?
k
@Steve Goldberg sorry. I didn’t understand. I’ll request them from LFS and send them to you tomorrow.
s
OK thanks. My understanding of the situation is that there is no 'penalty' for poor performance scores; rather, there is a small boost available to companies who have good scores. While it would be disappointing not to get that boost, I would want to make sure that the scale and context of the problem are kept in mind. It seems to me (and I am obviously biased) that replatforming for such a small boost (which your new platform may not give you) is a potentially costly and hasty thing to do.