Really, how then do they parse code if they don’t know what valid code is?
Poor code massively increases the likelihood that your site will break. Depending on its severity, poor code can damage your SEO because the indexer cannot read your site. It will also affect any other website or service that connects to yours, such as social media, because they cannot understand your code.
I appreciate that some errors are minor, but some aren’t, and a validator is a good way to find out which. Although browsers have their own rules, in general they have to follow some standard or they won’t know how to parse the code. W3C publishes that standard, and deviating from it may well lead you into problems.
Bottom line: if your code is bad, it will impact your site’s connectivity to the rest of the internet.
You are falling into the trap of assuming that anything flagged by a decade-old HTML checker equals bad code.
As Stuart commented, what is flagged as an Error in your example is incredibly minor and insignificant today, and very easy for someone to misinterpret. A blank ID is ignored by modern browsers. Otherwise Google PageSpeed Insights / Lighthouse would not be able to report 100% for all 4 areas for both mobile and desktop.
If you believe that a Blank ID causes a problem “with bad code” then please explain exactly why.
Can I suggest you start another thread about W3C Validation?
I have been using the W3C Validator recently and actually found it very useful for finding buttons, links, etc. that are badly formed or have missing content.
It also detects errors that are otherwise difficult to catch, such as simple typos in attributes. For example, I had written "loading="lazy" instead of loading="lazy" in an image that I was copying and pasting throughout a page. Those pesky little Stacks boxes can make this stuff difficult to see.
I would say it is well worth checking your site to find such issues.
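As a rough illustration of why that stray quote matters, here is a sketch using Python’s standard-library html.parser as a stand-in for a browser’s tolerant parsing (not the validator itself): the extra quote becomes part of the attribute name, so the loading attribute is silently lost rather than producing a visible error.

```python
# Sketch: how a tolerant parser handles the stray-quote typo.
# Uses Python's stdlib html.parser as a stand-in for a browser parser.
from html.parser import HTMLParser

class AttrCollector(HTMLParser):
    """Collects the attributes of every start tag it sees."""
    def __init__(self):
        super().__init__()
        self.attrs = {}

    def handle_starttag(self, tag, attrs):
        self.attrs.update(attrs)

def parse_attrs(html):
    parser = AttrCollector()
    parser.feed(html)
    return parser.attrs

good = parse_attrs('<img src="photo.jpg" loading="lazy">')
bad = parse_attrs('<img src="photo.jpg" "loading="lazy">')  # stray quote typo

print(good)  # 'loading' is recognised as an attribute
print(bad)   # the stray quote is swallowed into the attribute name
assert good.get("loading") == "lazy"
assert "loading" not in bad  # the typo silently drops the attribute
```

The page still renders, which is exactly why this class of typo is hard to spot by eye and why a validator is useful for catching it.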