I think the issue here is that the various web stats tools seem to promise something they can't really fulfil: accurately measuring your web site's visits and other KPIs.
Even a seemingly simple KPI like "page view" can have an enormous spread depending on the measurement methodology (as your issue here shows). It gets even worse with calculated KPIs like "visits", "bounce rate" or "conversions". In most cases these aren't even clearly defined, and where they are, the definitions differ from one tool to another.
To be clear: the web stats can still be useful - if my tool shows that this week I had 50% more visits than last week, that is useful and important information. Or if I see that an article has got fewer visits since I changed the title, that is a good indication that I should undo the change...
But what you should never do is take the numbers at face value.
And as for your initial request: I honestly don't believe there is a "rogue robot" loose on your site. It probably really comes down to different kinds of ad-blocking and different ways of using the web.
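To make the point concrete, here is a toy sketch (all rates are made-up assumptions, not measurements) of why two common counting methods disagree on the exact same traffic: server logs count every request including bots, while a client-side JavaScript tracker misses visitors whose ad-blockers block the analytics script (and usually misses most bots too).

```python
def count_page_views(visits, adblock_rate=0.4, bot_rate=0.1):
    """Return (server_log_count, js_tracker_count) for the same traffic.

    adblock_rate and bot_rate are illustrative guesses, not real-world
    statistics - tweak them to see how far the two numbers can drift.
    """
    bots = int(visits * bot_rate)
    humans = visits - bots
    blocked_humans = int(humans * adblock_rate)

    server_log_count = humans + bots            # logs record every request
    js_tracker_count = humans - blocked_humans  # tracker sees only unblocked humans
    return server_log_count, js_tracker_count

server, js = count_page_views(10_000)
print(server, js)  # two very different "page view" numbers for identical traffic
```

With these (invented) rates, 10,000 requests become 10,000 "page views" in the logs but only 5,400 in the JS tracker - a gap of almost half, with no rogue robot involved.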
PS: I am actually more amazed that there are still people who don't use ad-blockers. I even have multiple layers of ad-blocking active (a DNS-level blocker via Pi-hole plus a browser blocker ... and some devices [smart TVs!] only ever connect through VPNs!). Without those, the Interwebz of today are almost unusable.