I'm currently preparing for a major SEO initiative at work. Instead of just reading theory, I decided to take a "learn-by-doing" approach with a personal side project.
The results after one month? 632 organic visitors.
It has been an eye-opening sprint. I've learned that even tiny UX details matter for ranking—I was genuinely surprised to see a correlation between font size adjustments and performance metrics.
The Bot Anomaly
On October 15th, I hit a daily peak of 83 visitors. But when I dug into the data, something looked wrong.
A significant chunk of that traffic arrived as direct visits from China. Using Yandex Webvisor, I watched the session replays and realized these visitors were browsing the site with CSS disabled. I estimate that about 20 of those "users" were actually bots.
Question for the community: Why would bots hit a small site as direct traffic while specifically disabling CSS? Is it just to save bandwidth while scraping?
I’d love to hear your theories as I continue to analyze the logs.
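For anyone curious how I plan to cross-check this in the raw server logs (separately from Webvisor), here is a minimal sketch of the heuristic I have in mind: flag IPs that request pages but never fetch a stylesheet. The log path, the combined-log format, and the page-vs-asset rules are assumptions for illustration, not my finished tooling.

```python
import re
from collections import defaultdict

# Matches the common "combined" access log format:
# IP - - [timestamp] "METHOD /path HTTP/x.x" status size "referer" "user-agent"
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def flag_suspected_bots(log_path: str) -> list[str]:
    """Return IPs that requested HTML pages but never fetched any CSS."""
    page_hits = defaultdict(int)   # IP -> count of page requests
    css_hits = defaultdict(int)    # IP -> count of stylesheet requests

    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if not m:
                continue
            ip, path = m.group("ip"), m.group("path")
            if path.endswith(".css"):
                css_hits[ip] += 1
            # Treat extension-less paths and .html as "pages" (assumption).
            elif "." not in path.rsplit("/", 1)[-1] or path.endswith(".html"):
                page_hits[ip] += 1

    # A visitor that loads pages but never a stylesheet is a bot candidate.
    return [ip for ip, n in page_hits.items() if n > 0 and css_hits[ip] == 0]

if __name__ == "__main__":
    suspects = flag_suspected_bots("access.log")  # placeholder path
    print(f"{len(suspects)} suspected bot IPs")
    for ip in suspects:
        print(ip)
```

It's a rough filter (some privacy tools and text browsers also skip CSS), but it should at least tell me whether the ~20 suspicious sessions line up with specific IP ranges.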