“You wouldn’t build a house without having a strong foundation,” said Niki Mosier, head of SEO and content at AgentSync. “And you shouldn’t build a website without having a strong foundation either, and without constantly making sure that foundation is strong and that there are no cracks in it.”
Optimizing the architecture of your site can help search engine crawlers find and index your content, which enables them to show those pages to users in the search results. It can also help distribute link authority throughout your site and make it easier for visitors to find what they’re looking for.
In her session at SMX Create, Mosier shared the strategies she uses to ensure that a site’s foundation is solid and to identify opportunities for greater search visibility.
Crawl budget analysis
Crawl budget refers to the number of URLs per site that Googlebot (or any other search engine crawler) can and wants to crawl.
“Every website gets a crawl budget, which can vary depending on the size of the site and the frequency that new content is being published on the site, so having an idea of what a website’s crawl budget is can be really beneficial in making informed decisions on what to optimize,” Mosier said.
Conducting a crawl budget analysis enables you to get a more comprehensive view of:
- How your website is being crawled. “If you identify that Googlebot is the client, you can use log file analysis to find out how Googlebot is handling the URLs on your site [and] if it is crawling any pages with parameters,” she said.
- How fast your site is. While there are many tools that can tell you how fast your server reacts, a log file analysis shows you how long it’s taking for a bot to download a resource from your server.
- Indexing problems. “Getting into the log files can really show us whether bots are having trouble downloading a page fully,” Mosier said.
- How often a URL is being crawled. The crawl frequency can be used to figure out if there are URLs that a search engine crawler should be crawling but isn’t, or vice versa.
- Crawling problems. This tactic can also reveal when a crawler is encountering 404 errors or redirect chains, for example (see the log-parsing sketch after this list).
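For illustration, here is a minimal Python sketch of the kind of log-file pass described above. It assumes an Apache/Nginx combined-format `access.log` and matches Googlebot by user-agent string; a stricter check would verify the bot via reverse DNS, since the user-agent string can be spoofed.

```python
import re
from collections import Counter

# Combined log format:
# IP - - [time] "METHOD /path HTTP/1.x" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

status_counts = Counter()   # how often each status code is returned to the bot
crawl_freq = Counter()      # how often each URL is crawled
param_urls = Counter()      # parameterized URLs: possible wasted crawl budget

with open("access.log") as log:
    for line in log:
        m = LINE_RE.match(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # keep only Googlebot hits
        path = m.group("path")
        status_counts[m.group("status")] += 1
        crawl_freq[path] += 1
        if "?" in path:
            param_urls[path] += 1

print("Status codes:", dict(status_counts))  # surfaces 404s and redirects
print("Most-crawled URLs:", crawl_freq.most_common(10))
print("Parameterized URLs:", param_urls.most_common(10))
```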
“When it comes to actually doing crawl budget analysis, there’s a couple of tools that are helpful,” Mosier said, recommending Screaming Frog’s Log File Analyser, Microsoft Excel and Splunk.
Mosier outlined her steps to performing a crawl budget analysis:
- Obtain your log files; Mosier recommended working with at least a month of data.
- Look at URLs with errors.
- Assess which bots are crawling which areas of your site.
- Evaluate by day, week and month to establish patterns that may be useful for analysis.
- See if a crawler is crawling URLs with parameters, which may indicate wasted crawl budget.
- Cross-reference crawl data with sitemap data to check for missed content (see the sketch after this list).
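The last step, cross-referencing crawl data with the sitemap, can be sketched along the same lines. The example below assumes a plain `sitemap.xml` urlset (not a sitemap index) and a hypothetical set of crawled paths, such as the keys of `crawl_freq` from the earlier sketch.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Hypothetical paths Googlebot requested, e.g. taken from the log analysis.
crawled_paths = {"/blog/post-1", "/about", "/old-page"}

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse("sitemap.xml")  # assumes a plain urlset, not a sitemap index
sitemap_paths = {urlparse(loc.text.strip()).path
                 for loc in tree.findall(".//sm:loc", NS)}

# URLs you want crawled that the bot never visited, and vice versa.
print("In sitemap but never crawled:", sorted(sitemap_paths - crawled_paths))
print("Crawled but not in sitemap:", sorted(crawled_paths - sitemap_paths))
```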
“Once you’ve dived into the server logs and have a good sense for what your crawl budget looks like, you can use this data to prioritize your SEO tasks,” she said, adding that SEOs should “prioritize based on the impact that fixing different areas of your site will have, the dev resources needed to fix issues and the time to fix those issues.”
RELATED: How to optimize your website’s crawl budget
Driving traffic with technical SEO
Finding out how well your site is functioning can help you put the right strategies in place to drive more traffic to it.
“Doing regular site audits is a great way to keep a pulse on what’s happening with our websites,” Mosier recommended. In addition, Google Search Console should be used to check for Core Web Vitals or schema issues, for example. “Using monitoring tools, [such as] Rank Ranger, Semrush and Ahrefs, these are great ways to stay alerted to any issues that might pop up with your website,” she said.
Assessing the search engine results page (SERP) can give you a feel for the landscape of the keywords you’re targeting. In addition to showing what search features may be available, the SERP also shows you which sites are ranking higher than you. “See what those sites are doing; looking at their source code can tell you what schema they’re using,” Mosier said, adding that you should also view their pages to scope out what their headings and user experience look like.
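As a rough illustration of that source-code check, the following Python sketch pulls the JSON-LD blocks from a page and lists the schema types they declare. The URL is a placeholder, and pages that nest types under `@graph` would need extra handling.

```python
import json

import requests
from bs4 import BeautifulSoup

def schema_types(url: str) -> set:
    """Return the JSON-LD @type values declared on a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    types = set()
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(tag.string or "")
        except json.JSONDecodeError:
            continue  # skip malformed JSON-LD blocks
        for item in data if isinstance(data, list) else [data]:
            t = item.get("@type") if isinstance(item, dict) else None
            if isinstance(t, list):
                types.update(t)
            elif t:
                types.add(t)
    return types

# Placeholder URL for a competitor page that outranks you.
print(schema_types("https://example.com/competitor-page"))
```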
Updating your old content can also result in a rankings boost. Mosier recommends paying extra attention to your headings and above-the-fold content. Adding schema markup may also enable your content to appear as a rich result, which can further increase your visibility on the SERP.
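For example, FAQPage markup, one common rich-result candidate, can be generated as JSON-LD. This sketch uses hypothetical question-and-answer pairs and prints markup you would embed in a `<script type="application/ld+json">` tag on the page.

```python
import json

# Hypothetical Q&A pairs pulled from the page being updated.
faqs = [
    ("What is crawl budget?",
     "The number of URLs on a site that a search engine bot can and wants to crawl."),
]

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# Paste the output into a JSON-LD script tag on the page.
print(json.dumps(faq_schema, indent=2))
```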
“Using tools like Frase or Content Harmony can help you see what other sites that are ranking for the keywords that you want to be ranking for are using for headings, what kind of FAQ content they’re using and what content they have above the fold,” she added.
“Paying attention to page speed is definitely an important metric to think about, [but] I think it’s also important to pay attention to what the industry average is,” Mosier said, “So, go and look at where your competitors’ sites are ranking or are at as far as page speed and kind of set that as your benchmark.”
It’s also important to assess individual page speed versus overall site speed: “You want to see what each page on your site is loading for and make improvements on a page-by-page basis and not just look at the site speed as a whole because pages are what is ranking, not necessarily the whole site,” she said.
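One way to benchmark speed page by page is Google’s PageSpeed Insights API. Here is a minimal sketch, assuming placeholder URLs; at the time of writing the API works without a key for occasional use, but higher volumes require one.

```python
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

# Hypothetical list of pages to benchmark individually.
pages = ["https://example.com/", "https://example.com/pricing"]

for url in pages:
    data = requests.get(PSI, params={"url": url, "strategy": "mobile"},
                        timeout=60).json()
    lighthouse = data["lighthouseResult"]
    score = lighthouse["categories"]["performance"]["score"]
    lcp = lighthouse["audits"]["largest-contentful-paint"]["displayValue"]
    print(f"{url}: performance {score:.0%}, LCP {lcp}")
```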
Additionally, how your pages render can affect your user experience as well as what search engine crawlers “see.” “Is there a pop-up or a really big header on a particular page that’s taking up a lot of the above-the-fold space? That can be a problem,” Mosier said, noting that page speed can also impact how search engines render a page.