• The One-Hour Guide to SEO: Link Building - Whiteboard Friday

    Posted by randfish

    The final episode in our six-part One-Hour Guide to SEO series deals with a topic that's a perennial favorite among SEOs: link building. Today, learn why links are important to both SEO and to Google, how Google likely measures the value of links, and a few key ways to begin earning your own.



    Video Transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. We are back with our final part in the One-Hour Guide to SEO, and this week talking about why links matter to search engines, how you can earn links, and things to consider when doing link building.

    Why are links important to SEO?

    So we've discussed how search engines rank pages based on the value they provide to users. We've talked about how they consider keyword use and relevant topics and content on the page. But search engines also have another tool: they can look at all of the links across the web and see how pages point to one another.


    So it turns out that Google had this insight early on that what other people say about you is more important, at least to them, than what you say about yourself. So you may say, "I am the best resource on the web for learning about web marketing." But it turns out Google is not going to believe you unless many other sources, that they also trust, say the same thing. Google's big innovation, back in 1997 and 1998, when Sergey Brin and Larry Page came out with their search engine, Google, was PageRank, this idea that by looking at all the links that point to all the pages on the internet and then sort of doing this recursive process of seeing which are the most important and most linked to pages, they could give each page on the web a weight, an amount of PageRank.

    Then those pages that had a lot of PageRank, because many people linked to them or many powerful people linked to them, would then pass more weight on when they linked. That understanding of the web is still in place today. It's still a way that Google thinks about links. They've almost certainly moved on from the very simplistic PageRank formula that came out in the late '90s, but that thinking underlies everything they're doing.
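    The recursive idea behind PageRank can be sketched in a few lines of Python. This is a minimal, illustrative version of the simplistic late-'90s formula described above, not Google's current system, and the toy link graph below is invented for the example:

    ```python
    # Minimal sketch of the original PageRank idea: each page's weight is
    # distributed among the pages it links to, and the process repeats
    # until the weights settle.

    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping every page to the list of pages it links to.
        Every page (including link targets) must appear as a key."""
        pages = list(links)
        n = len(pages)
        rank = {page: 1.0 / n for page in pages}
        for _ in range(iterations):
            new_rank = {page: (1 - damping) / n for page in pages}
            for page, outlinks in links.items():
                if not outlinks:
                    continue
                # A page passes a share of its own weight to each page it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            rank = new_rank
        return rank

    # Toy web: a powerful site and a tiny site both link to yoursite.com.
    web = {
        "nytimes.com": ["yoursite.com"],
        "yoursite.com": ["nytimes.com"],
        "randstinysite.info": ["yoursite.com"],
    }
    ranks = pagerank(web)
    ```

    In this toy graph, yoursite.com ends up with the most weight because the well-linked-to nytimes.com points at it, while randstinysite.info, which nobody links to, keeps only the baseline weight.
    
    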

    How does Google measure the value of links?

    Today, Google measures the value of links in many very sophisticated ways, which I'm not going to try and get into, and they're not public about most of these anyway. But there is a lot of intelligence that we have about how they think about links, including things like more important, more authoritative, more well-linked-to pages are going to pass more weight when they link.

    A.) More important, authoritative, well-linked-to pages pass more weight when they link

    That's true of both individual URLs, an individual page, and websites, a whole website. So for example, if a page on The New York Times links to yoursite.com, that is almost certainly going to be vastly more powerful and influential in moving your rankings or moving your ability to rank in the future than if randstinysite.info — which I haven't yet registered, but I'll get on that — links to yoursite.com.

    This weighting, the understanding that some websites are more powerful, important, and authoritative than others, and that the more powerful ones tend to provide more ranking value, is why so many SEOs and marketers use metrics like Moz's Domain Authority, or similar metrics from Moz's competitors in the software space, to try and intuit how powerful and influential a link from a given domain will be.

    B.) Diversity of domains, rate of link growth, and editorial nature of links ALL matter

    So the diversity of linking domains, the rate of link growth, and the editorial nature of those links all matter. For example, say I earn many new links from many new websites that have never linked to me before, and those links are editorially given, meaning I haven't spammed or paid to place them; they were granted because of interesting things I did or because those sites wanted to editorially endorse my work or my resources. If that happens over time, in greater quantities and at a greater rate of acceleration than it does for my competitors, I am likely to outrank them for the words and phrases related to those topics, assuming all the other smart SEO things we've talked about in this One-Hour Guide have also been done.

    C.) HTML-readable links that don't have rel="nofollow" and contain relevant anchor text on indexable pages pass link benefit

    HTML-readable links are links that a simple text browser, or a simple bot like Googlebot (which can be much more complex, as we talked about in the technical SEO episode, but not necessarily all the time), can read as it browses the web. To pass value, those links also need to lack the rel="nofollow" attribute, which is something you can append to links to say, "I don't editorially endorse this," and many, many websites do.

    If you post a link to Twitter or to Facebook or to LinkedIn or to YouTube, it's going to carry this rel="nofollow", saying, "I, YouTube, don't editorially endorse this website that this random user has uploaded a video about." Okay. Well, it's hard to get a link from YouTube. Finally, a link that contains relevant anchor text, on an indexable page that Google can actually crawl and see, is going to provide the maximum link benefit.

    So a link like <a href="https://yoursite.com">great tool for audience intelligence</a> would be the ideal link for my new startup, for example, which is SparkToro, because we do audience intelligence, and someone saying we're a tool is perfect. This is a link that Google can read, and it provides this information about what we do.

    It says "great tool for audience intelligence." Awesome. That is powerful anchor text that will help us rank for those words and phrases. There are loads more factors. There are things like which pages are linked to and which pages are linked from. There are spam characteristics and the trustworthiness of the sources. Alt attributes, when they're used in image tags, serve as the anchor text for the link, if the image is a link.

    There's the topical relationship of the linking page and linking site. There's the text surrounding the link, which I think some tools out there offer you information about. There's the location of the link on the page. All of this stuff, and hundreds more factors, are used by Google to weight links. The important part for us, when we think about links, is this: generally speaking, if you cover your bases here (the link is indexable, carries good anchor text, comes from diverse domains at a good pace, is editorially given in nature, and is from important, authoritative, and well-linked-to sites), you're going to be golden 99% of the time.

    Are links still important to Google?

    Many folks, I think, wisely ask, "Are links still that important to Google? It seems like the search engine has grown in its understanding of the web and its capacities." Well, there is some pretty solid evidence that links are still very powerful. I think the two most compelling pieces of evidence are, one, the correlation of link metrics over time.

    So, like Google, Moz itself produces an index of the web; I think it's actually trillions of links across hundreds of billions of pages. Moz produces metrics like the number of linking root domains for any given domain or page on the web.

    Moz has a metric called Domain Authority or DA, which sort of tries to best replicate or best correlate to Google's own rankings. So metrics like these, over time, have been shockingly stable. If it were the case someday that Google demoted the value of links in their ranking systems, basically said links are not worth that much, you would expect to see a rapid drop.

    But from 2007 to 2019, we've never really seen that. It's fluctuated. Mostly it fluctuates based on the size of the link index. So for many years Ahrefs and Majestic were bigger link indices than Moz. They had better link data, and their metrics were better correlated.

    Now Moz, since 2018, is much bigger and has higher correlation than they do. So the various tools are sort of warring with each other, trying to get better and better for their customers. Those correlations with Google's rankings stay pretty high and pretty stable, especially for a system that supposedly contains hundreds, if not thousands, of elements.

    When you see a correlation of 0.25 or 0.3 with a single number, like linking root domains or Page Authority, that's pretty surprising. The second piece of evidence is something many SEOs observe, and I think it's why so many SEO firms and companies pitch their clients this way: the number of new, high-quality, editorially given linking root domains (The New York Times linked to me, and now The Washington Post linked to me, and now wired.com linked to me) correlates very nicely with ranking positions.

    So if you are ranking number 12 for a keyword phrase and suddenly that page generates many new links from high-quality sources, you can expect to see rapid movement up toward page one, position one, two, or three, and this is very frequent.
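    To make the correlation argument concrete, here is an illustrative Python sketch of the kind of study described above: rank-correlating a link metric (linking root domains) against ranking positions. The numbers are made up for illustration; real studies like Moz's use many thousands of search results.

    ```python
    # Spearman rank correlation, computed from scratch for illustration
    # (assumes no tied values, which keeps the formula simple).

    def spearman(xs, ys):
        """Spearman rank correlation between two equal-length lists."""
        def ranks(values):
            order = sorted(range(len(values)), key=lambda i: values[i])
            r = [0] * len(values)
            for rank, i in enumerate(order):
                r[i] = rank + 1
            return r
        rx, ry = ranks(xs), ranks(ys)
        n = len(xs)
        d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
        return 1 - 6 * d2 / (n * (n ** 2 - 1))

    # Invented data: five results on one SERP.
    linking_root_domains = [310, 120, 45, 200, 12]
    ranking_position = [1, 3, 4, 2, 5]  # position 1 is best

    # More linking domains should mean a better (lower) position, so the
    # correlation between the two comes out negative.
    rho = spearman(linking_root_domains, ranking_position)
    ```

    In a real study the magnitude would be far smaller than in this tidy toy example, which is exactly the point Rand makes: even a correlation of 0.25 or 0.3 for one metric is notable in a system with hundreds of ranking factors.
    
    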

    How do I get links?

    Obviously, links are not the only factor, but this pattern is very common. So I think the next reasonable question to ask is, "Okay, Rand, you've convinced me. Links are important. How do I get some?" Glad you asked. There are an infinite number of ways to earn new links, and I won't be able to cover them all here. But professional SEOs and professional web marketers often use tactics that fall under a few buckets. This is certainly not an exhaustive list, but it can give you some starting points.

    1. Content & outreach

    The first one is content and outreach. Essentially, the marketer finds a resource that they could produce, that is relevant to their business, what they provide for customers, data that they have, interesting insights that they have, and they produce that resource knowing that there are people and publications out there that are likely to want to link to it once it exists.

    Then they let those people and publications know. This is essentially how press and PR work. This is how a lot of content building and link outreach work. You produce the content itself, the resource, whatever it is, the tool, the dataset, the report, and then you message the people and publications who are likely to want to cover it or link to it or talk about it. That process is tried-and-true. It has worked very well for many, many marketers. 

    2. Link reclamation

    Second is link reclamation. This is essentially the process of saying, "Gosh, there are websites out there that used to link to me but stopped." Maybe the link broke, or the link points to a 404, a page that no longer loads on my website.

    Or the mention was supposed to be a link, but they didn't include the link. They said SparkToro, but they forgot to actually point to the SparkToro website. I should drop them a line. Maybe I'll tweet at the reporter who wrote about it and be like, "Hey, you forgot the link." Those types of link reclamation processes can be very effective as well.

    They're often some of the easiest, lowest hanging fruit in the link building world. 
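    The broken-link side of reclamation can be partially automated. This is a minimal sketch, not a tool the article describes: it assumes you've already exported a list of backlink target URLs from a link index, and the `fetch_status` parameter is injectable so the core logic can be tested without network access.

    ```python
    # Sketch: flag backlink targets that no longer load, as candidates
    # for link reclamation outreach.

    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    def http_status(url, timeout=10):
        """Fetch a URL with a HEAD request and return its HTTP status code.
        Returns 0 if the host could not be reached at all."""
        try:
            with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
                return resp.status
        except HTTPError as err:
            return err.code
        except URLError:
            return 0

    def find_reclamation_targets(urls, fetch_status=http_status):
        """Return (url, status) pairs for targets that error out (e.g. 404)."""
        results = [(url, fetch_status(url)) for url in urls]
        return [(url, status) for url, status in results
                if status == 0 or status >= 400]
    ```

    Each flagged URL is a page that other sites may still be linking to; redirecting it, restoring it, or asking the linker to update their link recovers that equity.
    
    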

    3. Directories, resource pages, groups, events, etc.

    Directories, resource pages, groups, and events, things that you can join and participate in, both online and offline, often link to your site so long as they have a website. The process is simply joining or submitting or sponsoring or what have you.

    Most of the time, for example, when I get invited to speak at an event, they will take my biography, a short, three-sentence blurb, that includes a link to my website and what I do, and they will put it on their site. So pitching to speak at events is a way to get included in these groups. I started Moz with my mom, Gillian Muessig, and Moz has forever been a woman-owned business, and so there are women-owned business directories.

    I don't think we actually did this, but we could easily go, "Hey, you should include Moz as a woman-owned business. We should be part of your directory here in Seattle." Great, that's a group we could absolutely join and get links from.

    4. Competitors' links

    So this is basically the practice of finding and replicating the links your competitors have earned. You will almost certainly need to use tools to do this well, though there are some free ways to do it.

    The simple, free way is to search Google for the combination of competitor 1's brand name and competitor 2's brand name, look for places that have written about and linked to both of them, and see if I can replicate the tactics that got them coverage. The slightly more sophisticated way is to use a tool. Moz's Link Explorer does this.

    So do tools from people like Majestic and Ahrefs. I'm not sure if SEMrush does. But basically you can plug in, "Here's me. Here's my competitors. Tell me who links to them and does not link to me." Moz's tool calls this the Link Intersect function. But you don't even need the link intersect function.

    You just plug in a competitor's domain, look at all the links that point to them, and then start to replicate their tactics. There are hundreds more tactics, and many, many resources on Moz's website and other great SEO websites that cover them, and you can certainly invest in those. Or you could conceivably hire someone who knows what they're doing to do this for you. Links are still powerful.
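    The "who links to them and does not link to me" idea boils down to simple set arithmetic. Here's a minimal sketch, assuming you've exported lists of linking domains from a tool like Link Explorer; the domain names below are invented for illustration.

    ```python
    # Sketch of the link-intersect idea: domains that link to both
    # competitors but not yet to you are prime outreach targets.

    def link_intersect(competitor_a, competitor_b, mine):
        """Given sets of linking domains, return domains that link to
        both competitors but don't yet link to you."""
        return (competitor_a & competitor_b) - mine

    competitor_a = {"nytimes.com", "wired.com", "searchengineland.com"}
    competitor_b = {"nytimes.com", "wired.com", "moz.com"}
    mine = {"wired.com"}

    targets = link_intersect(competitor_a, competitor_b, mine)
    # targets == {"nytimes.com"}
    ```

    The intuition: a domain that has already linked to two of your competitors has demonstrated it covers your topic, so it's far more likely to link to you than a cold prospect.
    
    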

    Okay. Thank you so much. I want to say a huge amount of appreciation to Moz and to Tyler, who's behind the camera — he's waving right now, you can't see it, but he looks adorable waving — and to everyone who has helped make this possible, including Cyrus Shepard and Britney Muller and many others.

    Hopefully, this one-hour segment on SEO can help you upgrade your skills dramatically. Hopefully, you'll send it to some other folks who might need to upgrade their understanding and their skills around the practice. And I'll see you again next week for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com

    In case you missed them:

    Check out the other episodes in the series so far:


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


    2019-04-19T00:02:00+00:00
  • 4 Unconventional Ways to Become a Better SEO

    Posted by meagar8

    Let’s get real for a moment: As much as we hear about positive team cultures and healthy work environments in the digital marketing space, many of us encounter workplace scenarios that are far from ideal. Some of us might even be part of a team where we feel discouraged from sharing new ideas or alternative solutions because we know they will be shot down without discussion. Even worse, some feel afraid to ask questions or seek help because their workplace culture doesn’t provide a safe place for learning.

    These types of situations, and many others like them, are present in far too many work environments. But what if I told you it doesn’t have to be this way?

    Over the last ten years as a team manager at various agencies, I’ve been working hard to foster a work environment where my employees feel empowered to share their thoughts and can safely learn from their mistakes. Through my experiences, I have found a few strategies to combat negative culture and replace it with a culture of vulnerability and creativity.

    Below, I offer four simple steps you can follow that will transform your work environment into one that encourages new ideas, allows for feedback and positive change, and ultimately makes you and your team better digital marketers.

    Vulnerability leads to creativity

    I first learned about the impact of vulnerability after watching a viral TED talk by Dr. Brene Brown. She defined vulnerability as “uncertainty, risk, and emotional exposure.” She also described vulnerability as “the birthplace of love, belonging, joy, courage, empathy, and creativity.” From this, I learned that to create a culture of vulnerability is to create a culture of creativity. And isn’t creativity at the heart of what we SEOs do?

    A culture of vulnerability encourages us to take risks, learn from mistakes, share insights, and deliver top results to our clients. In the fast-paced world of digital marketing, we simply cannot achieve top results with the tactics of yesterday. We also can’t sit around and wait for the next Moz Blog post or marketing conference, either. Our best course of action is to take risks, make mistakes, learn from those mistakes, and share insights with others. We have to learn from those with more experience than us and share what we know with those who have less. In other words, we have to be vulnerable.

    Below is a list of four ways you can help create a culture of vulnerability. Whether you are a manager or not, you can impact your team’s culture.

    1. Get a second pair of eyes on your next project

    Are you finishing up an exciting project for your client? Did you just spend hours of research and implementation to optimize the perfect page? Perfect! Now go ask someone to critique it!

    As simple as it sounds, this can make a huge difference in fostering a culture of creativity. It’s also extremely difficult to do.

    Large or small, every project or task you complete should be the best your team can provide. All too often, however, team members work in silos and complete these projects without asking for or receiving constructive feedback from their teammates before sending them to the client. This leaves our clients and projects receiving only the best one person can provide rather than the best of an entire team.

    We all work with diverse team members who bring varying levels of experience and responsibility. I bet someone on your team will have something to add to your project that you didn’t already think of. Receiving their feedback means every project you finish or task you complete is the best your team has to offer your clients.

    Keep in mind, though, that asking for constructive feedback is more than just having someone conduct a “standard QA.” In my experience, a “standard QA” means someone barely looked over what you sent and gave you the thumbs up. Having someone look over your work and provide feedback is only helpful when done correctly.

    Say you’ve just completed writing and editing content for a page and you’ve mustered up the courage to have someone QA your work. Rather than sending it over with a vague “hey, can you review this and make sure I did everything right?”, try sending detailed instructions like this:

    "Here is a <LINK> to a page I just edited. Can you take 15 minutes to review it? Specifically, can you review the Title Tag and Description? This is something the client said is important to them and I want to make sure I get it right."

    In many cases, you don’t need your manager to organize this for you. You can set this up yourself and it doesn’t have to be a big thing. Before you finish a project or task this week, work with a team member and ask them for help by simply asking them to QA your work. Worried about taking up too much of their time? Offer to swap tasks. Say you’ll QA some of their work if they QA yours.

    Insider tip

    You will have greater success and consistency if you make QA a mandatory part of your process for larger projects. Any large project like migrating a site to https or conducting a full SEO audit should have a QA process baked into it.

    Six months ago I was tasked to present one of our 200+ point site audits to a high profile client. The presentation was already created with over 100 slides of technical fixes and recommendations. I’m normally pretty comfortable presenting to clients, but I was nervous about presenting such technical details to THIS particular client.

    Lucky for me, my team already had a process in place for an in-depth QA on projects like this. My six team members got in a room and I presented to them as if they were the client. Yes, that’s right, I ROLE PLAYED! It was unbearably uncomfortable at first, knowing that each of my team members (whom I respect a whole lot) was sitting right in front of me, making notes on every little mistake I made.

    After an agonizing 60 minutes of presenting to my team, I finished and was ready for the feedback. I just knew the first thing out of their mouths would be something like “do you even know what SEO stands for?” But it wasn’t. Because my team had plenty of practice providing feedback like this in the past, they were respectful and, even more so, helpful. They gave me tips on how to better explain canonicalization, helped me alter some visualizations, and gave me positive feedback that ultimately left me confident in presenting to the client later that week.

    When teams consistently ask and receive feedback, they not only improve their quality of work, but they also create a culture where team members aren’t afraid to ask for help. A culture where someone is afraid to ask for help is a toxic one and can erode team spirit. This will ultimately decrease the overall quality of your team’s work. On the other hand, a culture where team members feel safe to ask for help will only increase the quality of service and make for a safe and fun team working experience.

    2. Hold a half-day all hands brainstorm meeting

    Building strategies for websites and solving issues can often be the most engaging work an SEO does. Yes, that’s right, solving issues is fun and I am not ashamed to admit it. As fun as it is to do this by yourself, it can be even more rewarding and infinitely more useful when a team does it together.

    Twice a year my team holds a half-day strategy brainstorm meeting. Each analyst brings a client or an issue they are struggling to resolve: website performance, client communication, strategy development, etc. During the meeting, each team member has an hour or more to talk about their client or issue and solicit help from the team. Together, the team dives deep into client specifics to help answer questions and solve issues.

    Getting the most out of this meeting requires a bit of prep both from the manager and the team.

    Here is a high-level overview of what I do.

    Before the Meeting

    Each Analyst is given a Client/Issue Brief to fill out describing the issue in detail. We have Analysts answer the following four questions:

    1. What is the core issue you are trying to solve?
    2. What have you already looked into or tried?
    3. What haven’t you tried that you think might help?
    4. What other context can you provide that will help in solving this issue?

    After all the client briefs are filled out, about 1–2 days prior to the half-day strategy meeting, I share the completed briefs with the team so they can familiarize themselves with the issues and come to the meeting prepared with ideas.

    Day of the Meeting

    Each Analyst has up to an hour to discuss their issue with the team, and then the team dives deep into solving it. During that 60-minute span, ideas are discussed, and Analysts put on their nerd hats and dig into analytics or code. All members of the team work toward a single goal: solving the issue.

    Once the issue is solved, the Analyst who first outlined it reads back the solutions and ideas. It may not take the full 60 minutes to reach a solution. Whether it takes the entire time or not, after one issue is solved, another team member announces their issue and the team goes at it again.

    Helpful tips

    • Depending on the size of your team, you may need to split up into smaller groups. I recommend groups of 3–5.
    • You may be tempted to take longer than an hour, but in my experience, this doesn’t work. The pressure of solving an issue in a limited amount of time can help spark creativity.

    This meeting is one of the most effective ways my team practices vulnerability, allowing creativity to flow freely. The structure gives each team member a way to provide and receive feedback. My experience has been that each analyst is open to new ideas and earnestly listens to understand the ways they can improve and grow as an analyst. And with this team effort, our clients benefit from the collective knowledge of the team rather than a single individual.

    3. Solicit characteristic feedback from your team

    This step is not for the faint of heart. If you had a hard time asking for someone to QA your work or presenting a site audit in front of your team, then you may find this one to be the toughest to carry out.

    Once a year I hold a special meeting with my team. The purpose of the meeting is to provide a safe place where my employees can provide feedback about me with their fellow teammates. In this meeting, the team meets without me and anonymously fills out a worksheet telling me what I should start doing, stop doing, and keep doing.

    Why would I subject myself to this, you ask?

    How could I not! Being a great SEO is more than just being great at SEO. Wait, what?!? Yes, you read that right. None of us work in silos. We are part of a team, interact with clients, have expectations from bosses, etc. In other words, the work we do isn’t only technical audits or site edits. It also involves how we communicate and interact with those around us.

    This special meeting is meant to focus on our characteristics and behaviors rather than our tactics and SEO chops, ensuring that we are well-rounded in our skills and open to all types of feedback to improve ourselves.

    How to run a keep/stop/start meeting in 4 steps:

    Step 1: Have the team meet together for an hour. After giving initial instructions, you will leave the room so that it is just your direct reports together for 45 minutes.

    Step 2: The team writes the behaviors they want you to start doing, stop doing, and keep doing. They do this together on a whiteboard or digitally with one person as a scribe.

    Step 3: When identifying the behaviors, the team doesn’t need to be unanimous, but they do need to mostly agree. However, the team should not simply list behaviors independently and paste them together into one long list.

    Step 4: After 45 minutes, you re-enter the room, and over the next 15 minutes the team tells you about what they have discussed.

    Here are some helpful tips to keep in mind:

    • When receiving feedback from the team, you can give only two responses: “thank you” or a clarifying question.
    • The feedback needs to be about you and not the business.
    • Do this more than once. The team will get better at giving feedback over time.

    Here is an example of what my team wrote during my first time running this exercise.

    Let’s break down why this meeting is so important.

    1. With me not in the room, the team can discuss openly without holding back.
    2. Having team members work together and come to a consensus before writing down a piece of feedback ensures feedback isn’t from a single team member but rather the whole team.
    3. By leaving the team to do it without me, I show as a manager I trust them and value their feedback.
    4. When I come back to the room, I listen and ask for clarification but don’t argue, which helps set an example of receiving feedback from others.
    5. The best part? I now have feedback that helps me be a better manager. By implementing some of the feedback, I reinforce the idea that I value my team’s feedback and I am willing to change and grow.

    This isn’t just for managers. Team members can do this themselves. You can ask your manager to go through this exercise with you, and if you are brave enough, you can have your teammates do it for you as well.

    4. Hold a team meeting to discuss what you have learned recently

    Up to this point, we have primarily focused on how you can ask for feedback to help grow a culture of creativity. In this final section, we’ll focus more on how you can share what you have learned to help maintain a culture of creativity.

    Tell me if this sounds familiar: I show up at work, catch up on industry news, review my client performance, plug away at my to-do list, check on tests I am running and make adjustments, and so on and so forth.

    What are we missing in our normal routines? Collaboration. A theme you may have noticed in this post is that we need to work together to produce our best work. What you read in industry news or what you see in client performance should all be shared with team members.

    To do this, my team put together a meeting where we can share our findings. Every 2 weeks, my team meets together for an hour and a half to discuss prepared answers to the following four questions.

    Question 1: What is something interesting you have read or discovered in the industry?

    This could be as simple as sharing a blog post or going more in depth on some research or a test you have done for a client. The purpose is to show that everyone on the team shapes how we do SEO and contributes knowledge to the team.

    Question 2: What are you excited about that you are working on right now?

    Who doesn’t love geeking out over a fun site audit, or that content analysis you have spent weeks building? This is the moment to share what you love about your job.

    Question 3: What are you working to resolve?

    Okay, okay, I know. This is the only section in this meeting that talks about issues you might be struggling to solve. But it is so critical!

    Question 4: What have you solved?

    Brag, brag, brag! Every analyst has an opportunity to share what they have solved: issues they overcame, how they out-thought Google and beat down the competition.

    In conclusion

    Creativity is at the heart of what SEOs do. In order to grow in our roles, we need to continue to expand our minds so we can provide stellar performance for our clients. Doing so requires us to both receive help and give it. Only then will we thrive in a culture that allows us to be safely vulnerable and actively creative.

    I would love to hear how your team creates a culture of creativity. Share your ideas in the comments below!





    2019-04-18T09:00:00+00:00
  • How Do I Improve My Domain Authority (DA)?

    Posted by Dr-Pete

    The Short Version: Don't obsess over Domain Authority (DA) for its own sake. Domain Authority shines at comparing your overall authority (your aggregate link equity, for the most part) to other sites and determining where you can compete. Attract real links that drive traffic, and you'll improve both your Domain Authority and your rankings.

    Unless you've been living under a rock, over a rock, or really anywhere rock-adjacent, you likely know that Moz has recently invested a lot of time, research, and money in a new-and-improved Domain Authority. People who use Domain Authority (DA) naturally want to improve their score, and this is a question that I admit we've avoided at times, because, like any metric, DA can be abused if taken out of context or viewed in isolation.

    I set out to write a how-to post, but what follows can only be described as a belligerent FAQ ...

    Why do you want to increase DA?

    This may sound like a strange question coming from an employee of the company that created Domain Authority, but it's the most important question I can ask you. What's your end-goal? Domain Authority is designed to be an indicator of success (more on that in a moment), but it doesn't drive success. DA is not used by Google and will have no direct impact on your rankings. Increasing your DA solely to increase your DA is pointless vanity.

    So, I don't want a high DA?

    I understand your confusion. If I had to over-simplify Domain Authority, I would say that DA is an indicator of your aggregate link equity. Yes, all else being equal, a high DA is better than a low DA, and it's ok to strive for a higher DA, but high DA itself should not be your end-goal.

    So, DA is useless, then?

    No, but like any metric, you can't use it recklessly or out of context. Our Domain Authority resource page dives into more detail, but the short answer is that DA is very good at helping you understand your relative competitiveness. Smart SEO isn't about throwing resources at vanity keywords, but about understanding where you realistically have a chance at competing. Knowing that your DA is 48 is useless in a vacuum. Knowing that your DA is 48 and the sites competing on a query you're targeting have DAs from 30-45 can be extremely useful. Likewise, knowing that your would-be competitors have DAs of 80+ could save you a lot of wasted time and money.
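To make the competitiveness idea concrete, here's a toy sketch in Python. The helper and the ten-point margin are entirely made up for illustration, not anything from Moz's tools, but they capture the go/no-go reading of the DA numbers above:

```python
# Toy illustration (not a Moz API): decide whether a keyword looks
# winnable by comparing your DA to the DAs of pages already ranking.
def looks_competitive(your_da, competitor_das, margin=10):
    """Return True if your DA is within `margin` points of the
    weakest ranking competitor -- a rough 'worth trying' heuristic."""
    return your_da + margin >= min(competitor_das)

# DA 48 vs. competitors at 30-45: worth competing.
print(looks_competitive(48, [30, 38, 45]))   # True
# DA 48 vs. competitors at 80+: probably save your budget.
print(looks_competitive(48, [80, 85, 92]))   # False
```

The exact threshold is a judgment call; the point is that DA is useful relative to competitors, never in a vacuum.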

    But Google says DA isn't real!

    This topic is a blog post (or eleven) in and of itself, but I'm going to reduce it to a couple points. First, Google's official statements tend to define terms very narrowly. What Google has said is that they don't use a domain-level authority metric for rankings. Ok, let's take that at face value. Do you believe that a new page on a low-authority domain (let's say DA = 25) has an equal chance of ranking as a high-authority domain (DA = 75)? Of course not, because every domain benefits from its aggregate internal link equity, which is driven by the links to individual pages. Whether you measure that aggregate effect in a single metric or not, it still exists.

    Let me ask another question. How do you measure the competitiveness of a new page that has no Page Authority (or PageRank or whatever metrics Google uses)? This question is a big part of why Domain Authority exists — to help you understand your ability to compete on terms you haven't targeted and for content you haven't even written yet.


    Seriously, give me some tips!

    I'll assume you've read all of my warnings and taken them seriously. You want to improve your Domain Authority because it's the best authority metric you have, and authority is generally a good thing. There are no magical secrets to improving the factors that drive DA, but here are the main points:

    1. Get more high-authority links

    Shocking, I know, but that's the long and short of it. Links from high-authority sites and pages still carry significant ranking power, and they drive both Domain Authority and Page Authority. Even if you choose to ignore DA, you know high-authority links are a good thing to have. Getting them is the topic of thousands of posts and more than a couple of full-length novels (well, ok, books — but there's probably a novel and feature film in the works).

    2. Get fewer spammy links

    Our new DA score does a much better job of discounting bad links, as Google clearly tries to do. Note that "bad" doesn't mean low-authority links. It's perfectly natural to have some links from low-authority domains and pages, and in many cases it's both relevant and useful to searchers. Moz's Spam Score is pretty complex, but as humans we intuitively know when we're chasing low-quality, low-relevance links. Stop doing that.

    3. Get more traffic-driving links

    Our new DA score also factors in whether links come from legitimate sites with real traffic, because that's a strong signal of usefulness. Whether or not you use DA regularly, you know that attracting links that drive traffic is a good thing that indicates relevance to searches and drives bottom-line results. It's also a good reason to stop chasing every link you can at all costs. What's the point of a link that no one will see, that drives no traffic, and that is likely discounted by both our authority metrics and Google?


    You can't fake real authority

    Like any metric based on signals outside of our control, it's theoretically possible to manipulate Domain Authority. The question is: why? If you're using DA to sell DA 10 links for $1, DA 20 links for $2, and DA 30 links for $3, please, for the love of all that is holy, stop (and yes, I've seen that almost verbatim in multiple email pitches). If you're buying those links, please spend that money on something more useful, like sandwiches.

    Do the work and build the kind of real authority that moves the needle both for Moz metrics and Google. It's harder in the short-term, but the dividends will pay off for years. Use Domain Authority to understand where you can compete today, cost-effectively, and maximize your investments. Don't let it become just another vanity metric.


    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


    2019-04-17T18:09:00+00:00
  • 12 Steps to Lightning Page Speed

    Posted by WallStreetOasis.com


    At Wall Street Oasis, we’ve noticed that every time we focus on improving our page speed, Google sends us more organic traffic. In 2018, over 80 percent of our website's traffic came from organic search. That’s 24.5 million visits. Needless to say, we are very tuned in to how we can continue to improve our user experience and keep Google happy.

    We thought this article would be a great way to highlight the specific steps we take to keep our page speed lightning fast and our organic traffic healthy. While this article is somewhat technical (page speed is an important and complex subject), we hope it provides website owners and developers with a framework for improving their page speed.

    Quick technical background: Our website is built on top of the Drupal CMS and we are running on a server with a LAMP stack (plus Varnish and memcache). If you are not using MySQL or Varnish, however, the steps and principles in this article are still relevant for other databases and reverse proxies.

    Ready? Let’s dig in.

    5 Steps to speed up the backend

    Before we jump into specific steps that can help you speed up your backend, it might help to review what we mean by “backend.” You can think of the backend as everything that goes into storing data, including the database itself and the servers -- basically anything that helps make the website function that you don’t visually interact with. For more information on the difference between the backend and the frontend, you can read this article.

    Step 1: Make sure you have a Reverse Proxy configured

    This is an important first step. For Wall Street Oasis (WSO), we use a reverse proxy called Varnish. It is by far the most critical and fastest layer of cache and serves the majority of the anonymous traffic (visitors logged out). Varnish caches the whole page in memory, so returning it to the visitor is lightning fast.

    For more background on how reverse proxies work, see: https://en.wikipedia.org/wiki/Reverse_proxy



    Step 2: Extend the TTL of that cache

    If you have a large database of content (specifically in the 10,000+ URL range) that doesn’t change very frequently, you can extend the time to live (TTL, which is basically how long before an object is flushed out of the cache) to drive the hit rate higher on the Varnish caching layer.

    For WSO, we went all the way up to two weeks (since we have over 300,000 discussions). At any given time, only a few thousand of those forum URLs are active, so it makes sense to heavily cache the other pages. The downside is that when you make any sitewide, template, or design changes, you have to wait up to two weeks for them to roll out across all URLs.
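If you're curious what TTL-based caching looks like conceptually, here is a minimal Python sketch of the idea. It's illustrative only: Varnish itself is configured in VCL (e.g. `set beresp.ttl = 2w;`), not Python, but the expiry logic is the same. The two-week figure mirrors the WSO setting described above:

```python
import time

# Minimal sketch of TTL-based caching, the same idea Varnish applies
# to whole pages: objects older than `ttl` seconds are treated as
# expired and regenerated on the next request.
class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (timestamp, body)

    def get(self, url, fetch):
        entry = self.store.get(url)
        if entry and time.time() - entry[0] < self.ttl:
            return entry[1]            # cache hit: no backend work
        body = fetch(url)              # cache miss: regenerate the page
        self.store[url] = (time.time(), body)
        return body

cache = TTLCache(ttl_seconds=14 * 24 * 3600)  # two weeks
```

The trade-off in the code is exactly the one described above: a longer `ttl` means fewer regenerations but staler content.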

    Step 3: Warm up the cache

    In order to keep our cache “warm,” we have a specific process that hits all the URLs in our sitemap. This increases the likelihood of a page being in the cache when a user or Googlebot visits those same pages (i.e., our hit rate improves). It also keeps Varnish full of more objects, ready to be accessed quickly.

    As you can see from the chart below, the ratio of “cache hits” (green) to total hits (blue+green) is over 93 percent.
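The warming process itself amounts to "fetch every sitemap URL on a schedule." Here is a hedged Python sketch of the sitemap-parsing half; the actual HTTP fetching and cron scheduling are left out, and the sample sitemap is made up:

```python
import xml.etree.ElementTree as ET

# Sketch of a cache-warming pass: pull every URL out of a sitemap so a
# scheduled job can request each one and keep it hot in the cache.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/forum/post-1</loc></url>
</urlset>"""

print(sitemap_urls(sample))
```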

    Step 4: Tune your database and focus on the slowest queries

    On WSO, we use a MySQL database. Make sure you enable the slow query log and check it at least every quarter. Analyze the slowest queries using EXPLAIN, then add indexes where needed and rewrite queries that can be optimized.

    To tune MySQL itself, you can use the following scripts: https://github.com/major/MySQLTuner-perl and https://github.com/mattiabasone/tuning-primer
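As a rough illustration of the triage step (the log entries below are placeholders, not real WSO queries, and real slow-query logs need proper parsing or a tool like pt-query-digest), you might rank entries by execution time so the worst offenders get EXPLAINed first:

```python
# Rank parsed slow-query-log entries by query time so the worst
# offenders get indexed or rewritten first.
slow_log = [
    {"query": "SELECT ... FROM comments WHERE ...", "seconds": 4.2},
    {"query": "SELECT ... FROM users WHERE ...",    "seconds": 0.9},
    {"query": "SELECT ... FROM forum WHERE ...",    "seconds": 12.7},
]

worst_first = sorted(slow_log, key=lambda q: q["seconds"], reverse=True)
for entry in worst_first:
    print(f'{entry["seconds"]:6.1f}s  {entry["query"]}')
```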

    Step 5: HTTP headers

    Use HTTP/2 server push to send resources to the page before they are requested. Just make sure you test which ones should be pushed first. JavaScript was a good option for us. You can read more about it here.

    Here is an example of server push from our Investment Banking Interview Questions URL:

    </files/advagg_js/js__rh8tGyQUC6fPazMoP4YI4X0Fze99Pspus1iL4Am3Nr4__k2v047sfief4SoufV5rlyaT9V0CevRW-VsgHZa2KUGc__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,</files/advagg_js/js__TLh0q7OGWS6tv88FccFskwgFrZI9p53uJYwc6wv-a3o__kueGth7dEBcGqUVEib_yvaCzx99rTtEVqb1UaLaylA4__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,</files/advagg_js/js__sMVR1us69-sSXhuhQWNXRyjueOEy4FQRK7nr6zzAswY__O9Dxl50YCBWD3WksvdK42k5GXABvKifJooNDTlCQgDw__TDoTqiqOgPXBrBhVJKZ4CapJRLlJ1LTahU_1ivB9XtQ.js>; rel=preload; as=script,

    Be sure you're using the correct format. If it is a script: <url>; rel=preload; as=script,

    If it is a CSS file: <url>; rel=preload; as=style,
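If you generate these headers programmatically, the assembly might look like this Python sketch (the file paths are illustrative, not WSO's real assets):

```python
# Assemble an HTTP/2 server-push `Link` header in the format shown
# above: each resource becomes `<url>; rel=preload; as=<type>`.
def link_header(resources):
    return ",".join(f"<{url}>; rel=preload; as={kind}"
                    for url, kind in resources)

header = link_header([
    ("/js/app.min.js", "script"),
    ("/css/critical.css", "style"),
])
print(header)
# </js/app.min.js>; rel=preload; as=script,</css/critical.css>; rel=preload; as=style
```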

    7 Steps to speed up the frontend

    The following steps are to help speed up your frontend application. The front-end is the part of a website or application that the user directly interacts with. For example, this includes fonts, drop-down menus, buttons, transitions, sliders, forms, etc.

    Step 1: Modify the placement of your JavaScript

    Modifying the placement of your JavaScript is probably one of the hardest changes because you will need to continually test to make sure it doesn’t break the functionality of your site. 

    I’ve noticed that every time I remove JavaScript, I see page speed improve. I suggest removing as much JavaScript as you can, and minifying the JavaScript you do need. You can also combine your JavaScript files, but use multiple bundles.

    Always try to move JavaScript to the bottom of the page or inline it. You can also use the defer or async attributes where possible to ensure your scripts are not render-blocking. You can read more about moving JavaScript here.
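As a small illustration (the paths and the helper are made up), a template helper that always emits non-blocking script tags might look like:

```python
# Emit script tags with `defer` (download in parallel, execute in
# order after parsing) or `async` (execute as soon as downloaded).
def script_tag(src, strategy="defer"):
    assert strategy in ("defer", "async")
    return f'<script src="{src}" {strategy}></script>'

print(script_tag("/js/app.min.js"))            # deferred: keeps order
print(script_tag("/js/analytics.js", "async")) # async: order-independent
```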

    Step 2: Optimize your images

    Use WebP for images when possible (Cloudflare, a CDN, does this for you automatically — I’ll touch more on Cloudflare below). WebP is an image format that supports both lossy and lossless compression.

      Always use images with the correct size. For example, if you have an image that is displayed in a 2” x 2” square on your site, don’t use a large 10” x 10” image. If you have an image that is bigger than needed, you are transferring more data through the network and the browser has to resize the image for you.

      Use lazy load to avoid/delay downloading images that are further down the page and not on the visible part of the screen.
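To see why right-sizing matters, here's the arithmetic behind the 2" vs. 10" example above as a quick Python sketch (assuming a nominal 96 DPI; transfer cost roughly tracks pixel area):

```python
# Transfer cost grows with pixel area, so serving a 10" x 10" image
# into a 2" x 2" slot ships roughly 25x the pixels that are displayed.
def pixel_ratio(served_inches, displayed_inches, dpi=96):
    served = (served_inches * dpi) ** 2
    displayed = (displayed_inches * dpi) ** 2
    return served / displayed

print(pixel_ratio(10, 2))  # 25.0
```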

      Step 3: Optimize your CSS

      You want to make sure your critical CSS is inlined. Online tools like this one can help you find the critical CSS to be inlined, which will resolve render blocking. Bonus: you'll keep the caching benefit of having separate files.

      Make sure to minify your CSS files (we use AdVagg since we are on the Drupal CMS, but there are many options for this depending on your site).  

      Try using less CSS. For instance, if you have certain CSS classes that are only used on your homepage, don't include them on other pages. 

      Always combine the CSS files but use multiple bundles. You can read more about this step here.

      Move your media queries to specific files so the browser doesn't have to load them before rendering the page. For example: <link href="frontpage-sm.css" rel="stylesheet" media="(min-width: 767px)">

      If you’d like more info on how to optimize your CSS, check out Patrick Sexton’s interesting post.

      Step 4: Lighten your web fonts (they can be HEAVY)

      This is where your developers may get in an argument with your designers if you’re not careful. Everyone wants to look at a beautifully designed website, but if you’re not careful about how you bring this design live, it can cause major unintended speed issues. Here are some tips on how to put your fonts on a diet:

      • Use inline SVG for icon fonts (like Font Awesome). This way you'll shorten the critical request chain and avoid empty content when the page is first loaded.
      • Use Fontello to generate the font files. This way, you can include only the glyphs you actually use, which leads to smaller files and faster page speed.
      • If you are going to use web fonts, check if you need all the glyphs defined in the font file. If you don’t need Japanese or Arabic characters, for example, see if there is a version with only the characters you need.
      • Use the unicode-range descriptor to select only the glyphs you need.
      • Use WOFF2 when possible, as it is already compressed.
      • This article is a great resource on web font optimization.

      Here is the difference we measured when using optimized fonts:

      After reducing our font files from 131kb to 41kb and removing one external resource (useproof), the fully loaded time on our test page dropped all the way from 5.1 to 2.8 seconds. That’s a 44 percent improvement and is sure to make Google smile (see below).

      Here’s the 44 percent improvement.

      Step 5: Move external resources

      When possible, move external resources to your server so you can control expire headers (this will instruct browsers to cache the resource for longer). For example, we moved our Facebook Pixel to our server and cached it for 14 days. This means you’ll be responsible for checking for updates from time to time, but it can improve your page speed score.

      For example, on our Private Equity Interview Questions page, it is possible to see how the fbevents.js file is being loaded from our server and the cache-control HTTP header is set to 14 days (1209600 seconds).

      cache-control: public, max-age=1209600
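The max-age arithmetic is simple enough to sketch in a couple of lines of Python (14 days in seconds is exactly the 1209600 shown above):

```python
# Build a Cache-Control header for a given number of days.
def cache_control(days):
    return f"public, max-age={days * 24 * 60 * 60}"

print(cache_control(14))  # public, max-age=1209600
```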

      Step 6: Use a content delivery network (CDN)

      What’s a CDN? Click here to learn more.

      I recommend using Cloudflare as it makes a lot of tasks much easier and faster than if you were to try and do them on your own server. Here is what we specifically did on Cloudflare's configuration:

      Speed

      • Auto-minify, check all
      • Under Polish
        • Choose Lossy
        • Check WebP
      • Enable Brotli
      • Enable Mirage

      Network

      • Enable HTTP/2 – You can read more about this topic here
      • No browsers currently support HTTP/2 over an unencrypted connection. For practical purposes, this means that your website must be served over HTTPS to take advantage of HTTP/2. Cloudflare has a free and easy way to enable HTTPS. Check it out here.

      Crypto

      • Under SSL
        • Choose Flexible
      • Under TLS 1.3
        • Choose Enable+0RTT – More about this topic here.

      Step 7: Use service workers

      Service workers give the site owner and developers some interesting options (like push notifications), but in terms of performance, we’re most excited about how these workers can help us build a smarter caching system.

      To learn how to get service workers up and running on your site, visit this page.

      With resources (images, CSS, JavaScript, fonts, etc.) being cached by a service worker, returning visitors will often be served much faster than if there were no worker at all.
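Real service workers are written in JavaScript; purely to illustrate the cache-first strategy they enable, here is a Python sketch of the lookup logic (all names here are invented for the example):

```python
# Cache-first strategy: answer from the local cache when possible,
# hit the network otherwise and store the result for next time.
def cache_first(url, cache, network_fetch):
    if url in cache:
        return cache[url], "cache"
    body = network_fetch(url)
    cache[url] = body          # store for the next visit
    return body, "network"

cache = {}
print(cache_first("/css/site.css", cache, lambda u: "css-bytes"))
# first visit: ('css-bytes', 'network')
print(cache_first("/css/site.css", cache, lambda u: "css-bytes"))
# return visit: ('css-bytes', 'cache')
```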

      Testing, tools, and takeaways

      For each change you make to try and improve speed, you can use speed-testing tools to monitor the impact of the change and make sure you are on the right path.

      We know there is a lot to digest and a lot of resources linked above, but if you are tight on time, you can just start with Step 1 from both the backend and frontend sections. These two steps alone can make a major difference.

      Good luck and let me know if you have any questions in the comments. I’ll make sure João Guilherme, my Head of Technology, is on to answer any questions for the community at least once a day for the first week this is published.

      Happy Tuning!



        Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


        2019-04-15T00:03:00+00:00
      • The One-Hour Guide to SEO: Technical SEO - Whiteboard Friday

        Posted by randfish

        We've arrived at one of the meatiest SEO topics in our series: technical SEO. In this fifth part of the One-Hour Guide to SEO, Rand covers essential technical topics from crawlability to internal link structure to subfolders and far more. Watch on for a firmer grasp of technical SEO fundamentals!


        Click on the whiteboard image above to open a high resolution version in a new tab!

        Video Transcription

        Howdy, Moz fans, and welcome back to our special One-Hour Guide to SEO Whiteboard Friday series. This is Part V - Technical SEO. I want to be totally upfront. Technical SEO is a vast and deep discipline like any of the things we've been talking about in this One-Hour Guide.

        There is no way in the next 10 minutes that I can give you everything that you'll ever need to know about technical SEO, but we can cover many of the big, important, structural fundamentals. So that's what we're going to tackle today. You will come out of this having at least a good idea of what you need to be thinking about, and then you can go explore more resources from Moz and many other wonderful websites in the SEO world that can help you along these paths.

        1. Every page on the website is unique & uniquely valuable

        First off, every page on a website should be two things — unique, unique from all the other pages on that website, and uniquely valuable, meaning it provides some value that a user, a searcher would actually desire and want. Sometimes the degree to which it's uniquely valuable may not be enough, and we'll need to do some intelligent things.

        So, for example, if we've got a page about X, Y, and Z versus a page that's sort of, "Oh, this is a little bit of a combination of X and Y that you can get through searching and then filtering this way. Oh, here's another copy of that XY, but it's a slightly different version. Here's one with YZ. This is a page that has almost nothing on it, but we sort of need it to exist for this weird reason that has nothing to do with search, and no one would ever want to find it through search engines."

        Okay, when you encounter these types of pages as opposed to these unique and uniquely valuable ones, you want to think about: Should I be canonicalizing those, meaning point this one back to this one for search engine purposes? Maybe YZ just isn't different enough from Z for it to be a separate page in Google's eyes and in searchers' eyes. So I'm going to use something called the rel=canonical tag to point this YZ page back to Z.
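As a concrete (hypothetical) example, the YZ page would carry a tag like this in its head, pointing search engines back at the Z page — the URLs here are made up:

```html
<!-- Hypothetical URLs: tells search engines "treat /z as the original" -->
<link rel="canonical" href="https://example.com/z" />
```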

        Maybe I want to remove these pages. Oh, this is totally non-valuable to anyone. 404 it. Get it out of here. Maybe I want to block bots from accessing this section of our site. Maybe these are search results that make sense if you've performed this query on our site, but they don't make any sense to be indexed in Google. I'll keep Google out of it using the robots.txt file or the meta robots or other things.
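For example, a robots.txt rule that keeps all crawlers out of a site's internal search results (the path here is illustrative) looks like:

```
# Illustrative path: block all crawlers from internal search results
User-agent: *
Disallow: /search/
```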

        2. Pages are accessible to crawlers, load fast, and can be fully parsed in a text-based browser

        Secondarily, pages should be accessible to crawlers, and they should load as fast as you possibly can make them. There's a ton of resources about optimizing images, optimizing server response times, optimizing first paint and first meaningful paint, and all the different things that go into speed.

        But speed is good not only because of technical SEO issues. Yes, Google can crawl fast pages more easily, and oftentimes when people speed up the load speed of their pages, they find that Google crawls more of their pages and crawls them more frequently, which is a wonderful thing. But speed also matters because pages that load fast make users happier. When you make users happier, you make it more likely that they will link, amplify, share, come back, keep loading pages, and not click the back button. All of those are positive things, and you avoid the negative ones.

        They should be able to be fully parsed in essentially a text browser, meaning that if you have a relatively unsophisticated browser that is not doing a great job of processing JavaScript or post-loading of script events or other types of content, Flash and stuff like that, it should be the case that a spider should be able to visit that page and still see all of the meaningful content in text form that you want to present.

        Google still is not processing every image at the "I'm going to analyze everything that's in this image and extract out the text from it" level, nor are they doing that with video, nor with many kinds of JavaScript and other scripts. So I would urge you, as would many other SEOs (notably Barry Adams, a famous SEO who says that JavaScript is evil, which may be taking it a little far, but we catch his meaning), to make sure everything on these pages loads in HTML, as text.

        3. Thin content, duplicate content, spider traps/infinite loops are eliminated


        Thin content and duplicate content — thin content meaning content that doesn't provide meaningfully useful, differentiated value, and duplicate content meaning it's exactly the same as something else — spider traps and infinite loops, like calendaring systems, these should generally speaking be eliminated. If you have those duplicate versions and they exist for some reason, for example maybe you have a printer-friendly version of an article and the regular version of the article and the mobile version of the article, okay, there should probably be some canonicalization going on there, the rel=canonical tag being used to say this is the original version and here's the mobile friendly version and those kinds of things.

        If you have internal search results pages showing up in Google's search results, Google generally prefers that you don't do that. If you have slight variations, Google would prefer that you canonicalize those, especially if the filters on them are not meaningfully and usefully different for searchers. 

        4. Pages with valuable content are accessible through a shallow, thorough internal links structure

        Number four, pages with valuable content on them should be accessible through just a few clicks, in a shallow but thorough internal link structure.

        Now this is an idealized version. You're probably rarely going to encounter exactly this. But let's say I'm on my homepage and my homepage has 100 links to unique pages on it. That gets me to 100 pages. One hundred more links per page gets me to 10,000 pages, and 100 more gets me to 1 million.

        So that's only three clicks from homepage to one million pages. You might say, "Well, Rand, that's a little bit of a perfect pyramid structure." I agree. Fair enough. Still, three to four clicks to any page on any website of nearly any size, unless we're talking about a site with hundreds of millions of pages or more, should be the general rule. I should be able to follow that through either a sitemap.

        If you have a complex structure and you need to use a sitemap, that's fine. Google is fine with you using an HTML page-level sitemap. Or alternatively, you can just have a good link structure internally that gets everyone easily, within a few clicks, to every page on your site. You don't want to have these holes that require, "Oh, yeah, if you wanted to reach that page, you could, but you'd have to go to our blog and then you'd have to click back to result 9, and then you'd have to click to result 18 and then to result 27, and then you can find it."

        No, that's not ideal. That's too many clicks to force people to make to get to a page that's just a little ways back in your structure. 
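The pyramid math Rand walks through is just exponential growth, which you can sketch in a few lines of Python:

```python
# With a fixed number of unique links per page, the number of pages
# reachable grows exponentially with click depth from the homepage.
def reachable_pages(links_per_page, clicks):
    return links_per_page ** clicks

for clicks in (1, 2, 3):
    print(clicks, reachable_pages(100, clicks))
# 1 100
# 2 10000
# 3 1000000
```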

        5. Pages should be optimized to display cleanly and clearly on any device, even at slow connection speeds

        Five, I think this is obvious, but for many reasons, including the fact that Google considers mobile friendliness in its ranking systems, you want to have a page that loads clearly and cleanly on any device, even at slow connection speeds, optimized for both mobile and desktop, optimized for 4G and also optimized for 2G and no G.

        6. Permanent redirects should use the 301 status code, dead pages the 404, temporarily unavailable the 503, and all okay should use the 200 status code

        Permanent redirects. So this page was here. Now it's over here. This old content, we've created a new version of it. Okay, old content, what do we do with you? Well, we might leave you there if we think you're valuable, but we may redirect you. If you're redirecting old stuff for any reason, it should generally use the 301 status code.

        If you have a dead page, it should use the 404 status code. You could maybe sometimes use 410, permanently removed, as well. Temporarily unavailable, like we're having some downtime this weekend while we do some maintenance, 503 is what you want. Everything is okay, everything is great, that's a 200. All of your pages that have meaningful content on them should have a 200 code.

        These status codes, anything else beyond these, and maybe the 410, generally speaking should be avoided. There are some very occasional, rare, edge use cases. But if you find status codes other than these, for example if you're using Moz, which crawls your website and reports all this data to you and does this technical audit every week, if you see status codes other than these, Moz or other software like it, Screaming Frog or Ryte or DeepCrawl or these other kinds, they'll say, "Hey, this looks problematic to us. You should probably do something about this."
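An audit rule like the one Rand describes is essentially a lookup table. Here's a minimal Python sketch (the flag wording is invented; real tools like Moz or Screaming Frog report far more detail):

```python
# Map each HTTP status code to what a crawler-based site audit
# would likely report, flagging anything outside the expected set.
EXPECTED = {200: "ok", 301: "permanent redirect",
            404: "dead page", 410: "permanently removed",
            503: "temporarily unavailable"}

def audit(status_code):
    return EXPECTED.get(status_code, "unexpected -- investigate")

print(audit(200))  # ok
print(audit(302))  # unexpected -- investigate
```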

        7. Use HTTPS (and make your site secure)

        When you are building a website that you want to rank in search engines, it is very wise to use a security certificate and to have HTTPS rather than HTTP, the non-secure version. Those should also be canonicalized. There should never be a time when HTTP is the one that is loading preferably. Google also gives a small reward — I'm not even sure it's that small anymore, it might be fairly significant at this point — to pages that use HTTPS or a penalty to those that don't. 

        8. One domain > several, subfolders > subdomains, relevant folders > long, hyphenated URLs

        In general, well, I don't even want to say in general. It is nearly universal, with a few edge cases — if you're a very advanced SEO, you might be able to ignore a little bit of this — but it is generally the case that you want one domain, not several. Allmystuff.com, not allmyseattlestuff.com, allmyportlandstuff.com, and allmylastuff.com.

        Allmystuff.com is preferable for many, many technical reasons and also because the challenge of ranking multiple websites is so significant compared to the challenge of ranking one. 

        You want subfolders, not subdomains, meaning I want allmystuff.com/seattle, /la, and /portland, not seattle.allmystuff.com.

        Why is this? Google's representatives have sometimes said that it doesn't really matter and I should do whatever is easy for me. I have so many cases over the years, case studies of folks who moved from a subdomain to a subfolder and saw their rankings increase overnight. Credit to Google's reps.

        I'm sure they're getting their information from somewhere. But very frankly, in the real world, it just works all the time to put it in a subfolder. I have never seen a problem being in the subfolder versus the subdomain, where there are so many problems and there are so many issues that I would strongly, strongly urge you against it. I think 95% of professional SEOs, who have ever had a case like this, would do likewise.

        Relevant folders should be used rather than long, hyphenated URLs. This is one where we agree with Google. Google generally says, hey, if you have allmystuff.com/seattle/storagefacilities/top10places, that is far better than /seattle-storage-facilities-top-10-places. It's just the case that Google is good at folder structure analysis and organization, and users like it as well and good breadcrumbs come from there.

        There's a bunch of benefits. Generally using this folder structure is preferred to very, very long URLs, especially if you have multiple pages in those folders. 

        9. Use breadcrumbs wisely on larger/deeper-structured sites

        Last, but not least (at least the last we'll cover in this technical SEO discussion), is using breadcrumbs wisely. Breadcrumbs are good for both technical SEO and on-page SEO.

        Google generally learns some things from the structure of your website from using breadcrumbs. They also give you this nice benefit in the search results, where they show your URL in this friendly way, especially on mobile, mobile more so than desktop. They'll show home > seattle > storage facilities. Great, looks beautiful. Works nicely for users. It helps Google as well.

        So there are plenty more in-depth resources that we can go into on many of these topics and others around technical SEO, but this is a good starting point. From here, we will take you to Part VI, our last one, on link building next week. Take care.

        Video transcription by Speechpad.com

        In case you missed them:

        Check out the other episodes in the series so far:


        Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


        2019-04-12T00:08:00+00:00
      • How to Convince Your Boss to Send You to MozCon 2019

        Posted by cheryldraper

        From networking with your peers to hearing from industry leaders, there are benefits a-plenty to attending conferences. You know that. Your peers know that. But how do you persuade the powers-that-be (aka your boss) that sending you is beneficial for your business? 

        To help convince your boss that you won't just be lounging pool-side, sipping cocktails on the company dime, we’ve gathered the goods to help you get that MozCon attendance greenlit.

        How to make the case

        Business competition is fiercer than ever. What used to make a splash now feels like it’s barely making ripples. Only those who are able to shift tactics with the changing tides of marketing will be able to come out on top.

        And that’s exactly what MozCon is going to help you do.

        Covering everything a growing marketer needs for a well-balanced marketing diet (SEO, content, strategy, growth), MozCon delivers top-notch talks from hand-selected speakers over three insightful days in July.

        There's so much in store for you this year. Here’s just a sampling of what you can expect at this year’s MozCon:

        Speakers and content

        Our speakers are real practitioners and industry leaders. We work with them to ensure they deliver the best content and insights to the stage to set you up for a year of success. No sales pitches or talking heads here!

        Networking

        You work hard taking notes, learning new insights, and digesting all of that knowledge — that’s why we think you deserve a little fun in the evenings. It's your chance to decompress with fellow attendees and make new friends in the industry. We host exciting evening networking events that add to the value you'll get from your day of education. Plus, our Birds of a Feather lunch tables allow you to connect with like-minded peers who share similar interests.

        High-quality videos to share with your team

        About a month or so after the conference, we’ll send you a link to professionally edited videos of every presentation at the conference. Your colleagues won’t get to partake in the morning Top Pot doughnuts or Starbucks coffee (the #FOMO is real), but they will get a chance to learn everything you did, for free.

        An ongoing supportive group 

        Our MozCon Facebook group is incredibly active, and it’s grown to have a life of its own — marketers ask one another SEO questions, post jobs, look for and offer advice and empathy, and more. It’s a great place to find TAGFEE support and camaraderie long after the conference itself has ended.

        Great food on site 

        We know that conference food isn’t typically worth mentioning, but MozCon is notorious for its snacks. You can expect two hot meals a day and loads of snacks from local Seattle vendors; in the past we’ve featured a smorgasbord from the likes of Trophy Cupcakes, KuKuRuZa popcorn, and Starbucks’ Seattle Reserve cold brew.

        Swag

        No duds here: we do our homework when it comes to selecting swag worthy of keeping. One-of-a-kind Roger Mozbots, a super-soft t-shirt, and more cool stuff you’ll want to take home and show off.

        Wear your heart on your sleeve

        Each year, MozCon and our attendees give back by donating Moz dollars to a charitable organization.

        Discounts for subscribers and groups 

        Moz Pro subscribers get a whopping $500 off their ticket cost and there are discounts for groups as well, so make sure to take advantage of savings where you can!

        Ticket cost

        At MozCon, our goal is to break even, which means we invest your ticket price right back into the conference experience. Check out the full breakdown of what your MozCon ticket gets you:

        But of course, don’t take our word for it! There are some incredible resources available at your fingertips that tout the benefits of attending conferences.

        I'm convinced, now grab my ticket!

        Need a little more to get your boss on board? Check out some videos from years past to get a taste for the caliber of our speakers. We’ve also got a call for community speaker pitches (closes at 5 pm PDT on April 15, 2019) so if you’ve been thinking about breaking into the speaking circuit, it could be an amazing opportunity.

        Buy ticket, save money, get competitive marketing insights. Everyone wins!

        MozCon is one unforgettable experience that lives and grows with you beyond just the three days you spend in Seattle. And there's no time like the present to pitch MozCon to your boss. If they're still stuck on the "why", let them know about our subscriber and group pricing tiers; you’ll save hundreds of dollars when you do. Just think of all the Keurigs you could get for that communal kitchen! 

        Grab your ticket to MozCon!




        2019-04-11T09:00:00+00:00
      • How Bad Was Google's Deindexing Bug?

        Posted by Dr-Pete

        On Friday, April 5, after many website owners and SEOs reported pages falling out of rankings, Google confirmed a bug that was causing pages to be deindexed:

        MozCast showed a multi-day increase in temperatures, including a 105° spike on April 6. While deindexing would naturally cause ranking flux, as pages temporarily fell out of rankings and then reappeared, SERP-monitoring tools aren't designed to separate the different causes of flux.

        Can we isolate deindexing flux?

        Google's own tools can help us check whether a page is indexed, but doing this at scale is difficult, and once an event has passed, we no longer have good access to historical data. What if we could isolate a set of URLs, though, that we could reasonably expect to be stable over time? Could we use that set to detect unusual patterns?

        Across the month of February, the MozCast 10K daily tracking set had 149,043 unique URLs ranking on page one. I reduced that to a subset of URLs with the following properties:

        1. They appeared on page one every day in February (28 total times)
        2. The query did not have sitelinks (i.e. no clear dominant intent)
        3. The URL ranked at position #5 or better

        Since MozCast only tracks page one, I wanted to reduce noise from a URL "falling off" from, say, position #9 to #11. Using these qualifiers, I was left with a set of 23,237 "stable" URLs. So, how did those URLs perform over time?
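As a rough illustration, the three qualifiers above amount to a simple filter over daily page-one records. This is a hypothetical reconstruction, not MozCast's actual pipeline; the record fields (url, date, rank, has_sitelinks) are invented for the sketch:

```python
# Hypothetical sketch of the "stable URL" filter described above.
# Each record is one page-one appearance, e.g.:
#   {"date": "2019-02-01", "url": "https://...", "rank": 3, "has_sitelinks": False}
from collections import defaultdict

FEB_DAYS = 28  # criterion 1: on page one every day in February

def stable_urls(records):
    """Return the set of URLs meeting all three stability criteria."""
    days_seen = defaultdict(set)   # url -> dates it appeared on page one
    disqualified = set()           # urls that ever failed criteria 2 or 3
    for r in records:
        # Criterion 2: drop queries with sitelinks (clear dominant intent);
        # criterion 3: the URL must rank at position #5 or better.
        if r["has_sitelinks"] or r["rank"] > 5:
            disqualified.add(r["url"])
            continue
        days_seen[r["url"]].add(r["date"])
    return {url for url, days in days_seen.items()
            if len(days) == FEB_DAYS and url not in disqualified}
```

Applied to the real February data, a filter along these lines is what cut the 149,043 page-one URLs down to the 23,237 "stable" URLs analyzed here.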

        Here's the historical data from February 28, 2019 through April 10. This graph is the percentage of the 23,237 stable URLs that appeared in MozCast SERPs:

        Since all of the URLs in the set were stable throughout February, we expect 100% of them to appear on February 28 (which the graph bears out). The change over time isn't dramatic, but what we see is a steady drop-off of URLs (a natural occurrence of changing SERPs over time), with a distinct drop on Friday, April 5th, a recovery, and then a similar drop on Sunday, April 7th.

        Could you zoom in for us old folks?

        Having just switched to multifocal contacts, I feel your pain. Let's zoom that Y-axis a bit (I wanted to show you the unvarnished truth first) and add a trendline. Here's that zoomed-in graph:


        The trendline is in purple. The departure from trend on April 5th and 7th is pretty easy to see in the zoomed-in version. The day-over-day drop on April 5th was 4.0%, followed by a recovery, and then a second, very similar 4.4% drop on April 7th.

        Note that this metric moved very little during March's algorithm flux, including the March "core" update. We can't prove definitively that the stable URL drop cleanly represents deindexing, but it appears to not be impacted much by typical Google algorithm updates.

        What about dominant intent?

        I purposely removed queries with expanded sitelinks from the analysis, since those are highly correlated with dominant intent. I hypothesized that dominant intent might mask some of the effects, as Google is highly invested in surfacing specific sites for those queries. Here's the same analysis just for the queries with expanded sitelinks (this yielded a smaller set of 5,064 stable URLs):

        Other than minor variations, the pattern for dominant-intent URLs appears to be very similar to the previous analysis. It appears that the impact of deindexing was widespread.

        Was it random or systematic?

        It's difficult to determine whether this bug was random, affecting all sites somewhat equally, or was systematic in some way. It's possible that restricting our analysis to "stable" URLs is skewing the results. On the other hand, trying to measure the instability of inherently-unstable URLs is a bit nonsensical. I should also note that the MozCast data set is skewed toward so-called "head" terms. It doesn't contain many queries in the very-long tail, including natural-language questions.

        One question we can answer is whether large sites were impacted by the bug. The graph below isolates our "Big 3" in MozCast: Wikipedia, Amazon, and Facebook. This reduced us to 2,454 stable URLs. Unfortunately, the deeper we dive, the smaller the data-set gets:


        At the same 90–100% zoomed-in scale, you can see that the impact was smaller than across all stable URLs, but there's still a clear pair of April 5th and April 7th dips. It doesn't appear that these mega-sites were immune.

        Looking at the day-over-day data from April 4th to 5th, it appears that the losses were widely distributed across many domains. Of domains that had 10-or-more stable URLs on April 4th, roughly half saw some loss of ranking URLs. The only domains that experienced 100% day-over-day loss were those that had 3-or-fewer stable URLs in our data set. It does not appear from our data that deindexing systematically targeted specific sites.
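The day-over-day check described above boils down to a set difference grouped by domain. A minimal sketch, with the caveat that using the URL's host as the "domain" is a simplification of whatever root-domain grouping MozCast actually applies:

```python
# Hypothetical sketch of the per-domain day-over-day loss check above.
# Treats the URL's host as the domain, which is a simplification.
from collections import Counter
from urllib.parse import urlparse

def per_domain_losses(urls_before, urls_after):
    """Map each domain to (stable URLs on day 1, how many vanished by day 2)."""
    lost = set(urls_before) - set(urls_after)
    before = Counter(urlparse(u).netloc for u in urls_before)
    lost_by_dom = Counter(urlparse(u).netloc for u in lost)
    return {dom: (n, lost_by_dom.get(dom, 0)) for dom, n in before.items()}
```

Scanning the output of something like this for domains where the loss count equals the day-one count is how you'd spot (or, here, rule out) sites that were wiped out wholesale.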

        Is this over, and what's next?

        As one of my favorite movie quotes says: "There are no happy endings because nothing ever ends." For now, indexing rates appear to have returned to normal, and I suspect that the worst is over, but I can't predict the future. If you suspect your URLs have been deindexed, it's worth manually requesting reindexing in Google Search Console. Note that this is a fairly tedious process, and there are daily limits in place, so focus on critical pages.

        The impact of the deindexing bug does appear to be measurable, although we can argue about how "big" 4% is. For something as consequential as sites falling out of Google rankings, 4% is quite a bit, but the long-term impact for most sites should be minimal. For now, there's not much we can do to adapt — Google is telling us that this was a true bug and not a deliberate change.




        2019-04-11T00:05:00+00:00