I made some adjustments to the structured data I send to Google, and it got me wondering: do we spend too much time trying to understand Google's ranking algorithm and too little on the user experience?
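For context, "structured data" here means the machine-readable annotations Google parses for rich results, usually expressed as schema.org JSON-LD. A minimal sketch in Python, with an entirely hypothetical product, of how such a snippet can be generated:

```python
import json

# Hypothetical example: minimal schema.org JSON-LD for a product page,
# the kind of structured data Google reads for rich results.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",  # hypothetical product name
    "description": "A widget users actually want.",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
    },
}

# Embed it in the page as a JSON-LD <script> tag.
snippet = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(product, indent=2)
)
print(snippet)
```

The bot never renders this block; it exists purely for crawlers, which is exactly the tension the rest of this post is about.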

On one hand, by tailoring your website for crawlers you can increase your SEO traffic. More people will then benefit from what your product offers, the company will make more money, and it will be able to spend more time improving the user experience.

On the other hand, there is a difference between what a "good" page looks like to a bot and what it looks like to a user. By pleasing the crawlers too much, you might degrade your user experience: more users will come to your website, only to bounce somewhere else because they are unimpressed.

Is it possible to offer the best user experience while providing everything a bot wants? That's certainly a good challenge.