Good morning, search marketers: let's make it official.
Yesterday, Google announced that it posted a Request for Comments to the Internet Engineering Task Force to formalize the Robots Exclusion Protocol (REP) specification. The REP has been around for a quarter of a century, and every major search engine uses robots.txt as a crawling directive, yet it has never become a formal standard. Is anything really changing, though? Our own Barry Schwartz posed that question to Google's Gary Illyes, to which he replied, "No, nothing at all."
Along with that announcement, Google also said that it is open-sourcing its robots.txt parser — you can access it on GitHub right now.
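Google's open-sourced parser is written in C++, but for a rough sense of how robots.txt directives are evaluated, here is a minimal sketch using Python's standard-library `urllib.robotparser`. Note this is an illustration only, not Google's parser, and Python's matching rules are simpler than those described in Google's draft.

```python
from urllib import robotparser

# A hypothetical robots.txt, for illustration only
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Paths under /private/ are disallowed; everything else is crawlable
print(rp.can_fetch("*", "/private/secret.html"))  # False
print(rp.can_fetch("*", "/index.html"))           # True
```

In practice a crawler would call `rp.set_url(...)` and `rp.read()` to fetch a live robots.txt rather than parsing an inline string.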
Managing campaigns, developing strategies, pleasing clients and adapting to the circumstances are all table stakes for successful marketers, but what really sets the best apart? Felicia Delveccio, director of digital media at DAC Group and a Search Engine Land Award search marketer of the year winner, says it's largely about relationships. Her long-view approach and emphasis on listening to her client enabled her to lead an 18-month realignment that checked off all the boxes in terms of success metrics (including a whopping 971% increase in store visits) and identified new opportunities.
I'm going to stop here, but there are more insights below. Keep on reading for your Pro Tip on how to track button clicks with Google Tag Manager as well as your daily Search Shorts and much more.
George Nguyen,
Associate Editor