# suitecommerce
Hey! The recent release of ItemBadges 1.1.6 has an error that is being reported by the SEO engine. I'd wait until it gets fixed before updating from the current 1.1.5 version.
Have you raised a case for this?
I've asked the customer to do so; I can share the case number as soon as I have it.
Thanks, because I couldn't see an open issue for it
@Steve Goldberg Typically, 99% of SC/A customers will never debug the SEO page generator; it's not until a partner with SEO experience tests it that you'd get a support case. I think we should change the approach to tackling SEO issues, shouldn't we? This feature is for devs, not for admins or end users. If you hire a non-SCA SEO agency, they'd never figure out what is going on (and I'd expect them not to even have to worry about it). My proposal is for all of us to be more proactive about testing it. For that to happen, we "partners" need to be told what will change, just like with regular NetSuite releases, and get at least some time and awareness to proactively test and report issues before they make it into prod. Once an SEO issue makes it to production, customers lose money for months, and recovery isn't free (the fix to the issue doesn't bring the traffic back on its own!)
I agree in principle. We're going to be making a change to the SEO generator soon that should eliminate the majority of problems people encounter. Secondly, if an extension fails (like in this case), partial content is still served to the crawler bot. As extension JavaScript executes after the main JS, the page should still have the key details.
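(For illustration, one way an extension author can keep a faulty extension from blocking the rest of the render is to wrap its entry point in a try/catch. This is only a minimal sketch of the standard SuiteCommerce extension `mountToApp` pattern; the module names and error message are hypothetical.)

```javascript
// Minimal sketch of a defensively coded SuiteCommerce extension entry point.
// Assumes the standard AMD-style extension layout; "MyCompany.MyExtension.*"
// names below are hypothetical placeholders.
define('MyCompany.MyExtension.MyExtension', ['MyCompany.MyExtension.Module'], function (Module) {
    'use strict';

    return {
        mountToApp: function mountToApp(container) {
            try {
                // All of the extension's own wiring happens here, so a bug in it
                // cannot stop the rest of the application (or the SEO page
                // generator) from finishing its render.
                Module.mountToApp(container);
            } catch (e) {
                // Fail quietly for shoppers and crawlers; surface the error in
                // the console so developers can still see it.
                console.error('MyExtension failed to mount:', e);
            }
        }
    };
});
```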
Good morning @Steve Goldberg, here is the case number: #6469315
Thanks. I'll keep an eye out for the issue ticket.
šŸ™Œ 1
Hi @Steve Goldberg. What is the definition of "key details"? To me, all of the frontend HTML is key detail. For example, if we don't render the footer, we lose all the internal linking we put there. Another example: if we use a CMS area below the items on PLP pages for SEO purposes but the rendering stops before it, we lose all the value of that CMS area. And so on. So if I'm interpreting you correctly: no, rendering "some" content is not better than rendering nothing. Rendering nothing at least gives Google a signal that it should try to use JS to render the page (unreliable, undesirable), while if we serve content, Google will assume that IS the content of the page. Google is always trying to optimize its resources; the only strategy that benefits SuiteCommerce sites is to "pay the cost ourselves" rather than also trying to save resources ourselves. The more pre-render scenarios we cover, the better, if we think about SC/A's ability to grow and gain higher visibility.
> What is the definition of "key details"? To me, all of the frontend HTML is key detail. For example, if we don't render the footer, we lose all the internal linking we put there.
I'm going to disagree that the repeated, generic links in the footer are 'key details' of a PDP. If you consider everything on a page a top priority, then nothing is. In my personal opinion, the key details of a PDP are the unique product details such as name, description, price, and images. If someone has added an extension, for example, that renders a YouTube video demonstrating the product, then I would not consider that a 'key detail' for SEO purposes. Things like footer links repeat on every page, and I would expect the crawler to pick them up on at least one page.
> So if I'm interpreting you correctly: no, rendering "some" content is not better than rendering nothing.
This is phrased awkwardly, so I will be clear: rendering the majority of the page (with the key details) and serving partial content is better than serving nothing. I'm not sure I agree with your assessment that we should show nothing and then hope that the search engine crawler can, instead, try to execute the SPA.
> Google is always trying to optimize its resources; the only strategy that benefits SuiteCommerce sites is to "pay the cost ourselves" rather than also trying to save resources ourselves. The more pre-render scenarios we cover, the better, if we think about SC/A's ability to grow and gain higher visibility.
I'm not sure what you mean by this. While we do, obviously, try to optimise the SEO page generator, the primary concern is to generate SEO content. I can't think of a scenario where we have decided to 'save resources' at the expense of SC SEO. We want to render the page and serve the content to crawlers, and we are actively working on improving it so that about 95% of the current problems our customers have with it are eliminated.
Hi @Steve Goldberg. I strongly disagree, and (sorry) this is not an opinion but a fact. In SEO there is no such thing as the "key details of a page". The entire website must be interconnected, and everything you do on one page affects every page of the site. In SEO you don't primarily work page by page; you work on topic silos, and a topic silo is composed of hundreds of pages interlinked semantically. If you break the interlinking (menu, footer, or any other area), you break your SEO strategy.

1. You can boost the authority of a page by linking to it in the menu or footer. The more internal links a page has, the stronger it gets, so this is inaccurate: "Things like footer links repeat on every page, and I would expect the crawler to pick them up on at least one page." Yes, it will be CRAWLED, but how many links point to the URL is critical.
2. Modern SEO strategies involve optimizing the internal linking structure so that we can "force" Google to crawl some pages more than others.
3. Building topic silos is another modern strategy, also built on semantic internal linking.
4. In the scenario you are describing, we can't predict where the page will break, which will make things even harder to debug than our current "it works or it doesn't".

Let me repeat: it is a fact that a partially rendered page WILL have a negative impact on SEO for the entire website. The impact is even larger and harder to find than with a non-rendered page, because for someone who works on SEO daily it becomes almost impossible to tell whether a page dropped because its content is only partially rendering. It means I would have to review the entire HTML output instead of just confirming whether it rendered at all! I am happy to sign an NDA and meet with product managers to go over this. I'm really worried about what I'm reading, sorry. BTW: this approach was actually attempted years ago, it even used to be the behaviour of V8 (the former SEO engine), and it was a terrible experience.
"I'm not sure I agree with your assessment that we should show nothing and then hope that the search engine crawler can, instead, try to execute the SPA." In fact, when we don't serve content at all because of a broken seo page generator (aka seo page generator didn't run) we can still see Google rendering the pages in Google Search Console. The issues with rankings on these scenarios happens for a diverse combination of OTHER factors: • Where the metadata comes from, so Google could render the body html but not the metadata (head) • The crawl budget that's going to be assigned to the website if google has to render every page • The perception of slowness google will have, which leads to lower rankings Google DOES execute the SPA nowadays if seo page generator doesn't run and sees the "key" content, but the issues happens because SuiteCommerce is not prepared for that scenario at all (what is served, how it is served, what runs on each scenario IE tracking tools etc) hence we rely on the seo engine output. I think there's plenty of scenarios not being considered here, hence I insist: I volunteer to collaborate with getting a clearer picture. Unless there's a dashboard that tells us exactly which pages had errors and which ones didn't run completely (BEFORE releasing the V2, not months after), this will make things harder - more complex - and not resolve issues. Anyways, if I was you guys, I'd rather just replace Backbone with a non-SPA tech or at least with a modern one that doesn't rely on SSR to be crawler compatible. Just my 2 cents
OK, well, as I said, the majority of issues (such as this one) are very likely to be a thing of the past when we introduce the v2 version. The headless browser in v2 is going to be a modern version of Chrome, so the page returned by SSR and the one built in the user's browser should be exactly the same. Current thinking is that whenever content is returned from the SSR, it will be a 200; this includes scenarios where content is returned despite faulty JS in extensions. If there's no content (because of a catastrophic failure), then it will be a 500.
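(Until a dashboard like the one mentioned above exists, a rough pre-release check is to fetch key URLs with a crawler user agent and assert that a marker from low on the page, such as a footer link, made it into the response, so that a 200 with partially rendered content still gets flagged. A minimal Node sketch, assuming Node 18+ for the global fetch; the URLs and marker strings are placeholders.)

```javascript
// Rough pre-release smoke test for SEO page generator output.
// Requires Node 18+ (global fetch). URLs and markers below are placeholders.
const pages = [
    { url: 'https://www.example.com/', marker: 'footer' },
    { url: 'https://www.example.com/some-product', marker: 'Add to Cart' }
];

const GOOGLEBOT_UA =
    'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

async function checkPage({ url, marker }) {
    const res = await fetch(url, { headers: { 'User-Agent': GOOGLEBOT_UA } });
    const html = await res.text();

    // A 200 response with the marker missing is exactly the "partially
    // rendered page" case that is hard to spot from rankings alone.
    const ok = res.status === 200 && html.includes(marker);
    console.log(`${ok ? 'OK  ' : 'FAIL'} ${res.status} ${url}`);
    return ok;
}

(async () => {
    const results = await Promise.all(pages.map(checkPage));
    process.exitCode = results.every(Boolean) ? 0 : 1;
})();
```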
Are you open to discussing it at some point? I think there are a few scenarios not being covered and I'd love to share my thoughts; you can use them in whatever way you guys decide, @Steve Goldberg.
I think the first step is to put what you're thinking into an email, so that we have an opportunity to review your points ahead of time.
āœ… 1
Jumping on a call without an opportunity to prepare would likely not be a productive use of time
@Steve Goldberg Sounds totally reasonable! šŸ™‚ I will work on a document covering everything I believe we should discuss/consider and send it to you. However, my preference would have been to be presented with the requirements, the use cases being considered, how you are planning to resolve them, and what the approach is going to be, and then provide feedback. It's harder to come up with feedback or challenges on something I don't have details about, but I'll do my best. My only goal is for SuiteCommerce (and its customers) to be successful, because we all benefit from keeping the ecosystem healthy and capable of growing, so I will take the time to write everything down and send it your way.