Probably only works for dumb bots and I’m guessing the big ones are resilient to this sort of thing.
Judging from recent stories, the big threat is bots scraping for AIs, and I wonder if there is a way to poison content so any AI ingesting it becomes dumber, e.g. text which is nonsensical or filled with counter-information, trap phrases that reveal which AIs have ingested it, garbage pictures that purport to show something they don't, etc.
The sane way of dealing with it is to use UTC everywhere internally and push local time and local formatting up to the user-facing bits. And if you move time around as a string (e.g. JSON), then use ISO 8601, since most languages have date/time APIs that can parse it. Often doesn't happen that way though…
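A minimal Python sketch of that flow, just to make it concrete (the field name and the Europe/Berlin zone are illustrative assumptions, not from any particular codebase):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9
import json

# Internally, keep timestamps as timezone-aware UTC values.
created_at = datetime.now(timezone.utc)

# When crossing a process boundary (e.g. JSON), serialize as ISO 8601.
payload = json.dumps({"created_at": created_at.isoformat()})

# On the consuming side, parse the ISO 8601 string back into an aware datetime.
parsed = datetime.fromisoformat(json.loads(payload)["created_at"])

# Only convert to local time at the user-facing edge, right before display.
local = parsed.astimezone(ZoneInfo("Europe/Berlin"))
print(local.strftime("%Y-%m-%d %H:%M %Z"))
```

The point is that the UTC-to-local conversion happens exactly once, at the presentation layer, and everything upstream of it stays timezone-unambiguous.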