Onsite SEO in 2015: An Elegant Weapon for a More Civilized Marketer (54:39) with Rand Fishkin
SEO has come full circle as on-page SEO has returned to the forefront. Rand will share how and why on-site SEO is so important and show off uncommon tactics with powerful potential.
[MUSIC] 0:00 I love SEO. 0:07 I love the idea that there's an algorithm that is not transparent. 0:08 That Google hides information from us. 0:16 Information that would make us more successful, that in my opinion would make 0:18 the web a better place if it were transparent and open. 0:22 I realize that's my opinion and not their opinion. 0:26 Doesn't mean it's any less true to me. 0:29 And so I love the job that I have, and 0:33 that I sometimes get to do, of making Google more transparent. 0:37 That's what I wanna do. 0:41 And that's what I get to do today. 0:41 So, you can find these slides at bit.ly/onsiteseo2015. 0:45 I wanna urge you not to have to, you know, take mad crazy notes and 0:49 look up URLs on the fly, because you can get them all, click them all there. 0:55 I will ask you to do one thing at the end of the presentation, 1:00 but you can't use your laptop for it. 1:03 At least you can't use any device connected to the WiFi, 1:05 if you remember this little trick. 1:07 Got another fun one for you, all right? 1:10 So let's take a quick journey into the past. 1:14 Remember when we only had one job? 1:18 Hopefully we didn't mess it up like this. 1:22 And that job, right, was perfectly optimized pages. 1:25 Sweet, I can perfectly optimize a page. 1:31 It was the search quality team at Google. 1:36 I remember my first trip to building 43, right. 1:39 Going in there during SES 2005 in San Jose. 1:42 And they used to host a party, at the Googleplex, for SEOs. 1:46 That was awesome. 1:51 Very, very early Googly of them. 1:53 And those early search quality engineers, 1:56 they decided that links said more about you than content. 2:00 Because what others say about you is more important than what you say about 2:03 yourself. 2:08 That's pretty reasonable. 2:10 So we did what we always do. 2:13 We optimized the hell out of that and had this ubiquitous link spam. 2:17 I think by 2007, you know, you want to rank for 2:21 something, eh, you probably bought some links.
2:25 Even in 2012, Wil Reynolds got on this stage 2:28 and told us how Google was making liars out of the white hat SEO world. 2:35 He was right. 2:41 Totally, totally right. 2:43 It was hard being a white hat. 2:46 The last three years though, not so hard. 2:49 Really, really not hard. 2:52 I think the infrastructure improvements, hardware and 2:53 software infrastructure improvements that Google made, underlying their systems, 2:56 things like Caffeine, allowed them to make the progress that they have made. 3:00 I know, and Moz knows, a little something about technical debt. 3:06 I think that's exactly what Google was experiencing. 3:08 So they finally had these algorithms to fight manipulative links and 3:11 content, right? 3:15 They leveraged fear and uncertainty of penalization to keep the rest of 3:16 the sites in line; even if they couldn't catch you, you never knew if they might. 3:20 They figured out intent, right? 3:25 They knew, hey, if I'm looking for best beef in Seattle, I probably am not just looking 3:27 for pages that have lots of anchor text pointing to them with beef. 3:31 Right. 3:35 I want butcher shops, 3:35 maybe I want a carnivore's guide to restaurants in Seattle. 3:37 They looked at language, not just keywords. 3:41 All right. So, what's that movie where they make fun 3:44 of Star Trek? 3:45 Yeah, here. 3:47 Pretty good. 3:48 Solid. 3:49 Right. 3:51 They predicted when I wanted diverse results. 3:51 So, they're not just gonna give me a list of books, they're gonna give me a book. 3:53 They're going to give me a startup library, they're on it. 3:57 They figured out when we wanted freshness, and when we didn't. 4:02 Right? That's marketing conferences as well. 4:07 You know, he probably doesn't want stuff from before 2014, 4:08 that's not going to be useful. 4:11 I think one of the best things that they did, 4:16 one of the most impressive things they did, is they started closing the loophole 4:19 around navigational versus informational queries.
4:24 Right? So, it used to be that, 4:27 when I'd search for these things, Google was very confused. 4:28 They weren't sure which one I wanted. 4:32 Now they've kinda got that QDD, query deserves diversity, and 4:34 a lot of intent signals, and a lot of behavioral signals, and so 4:38 they can do a much, much better job around these kinds of queries. 4:42 They learned to ID entities of knowledge. 4:49 This is also very impressive and very, very cool. 4:53 Right? 4:57 And then they took those entities and 4:57 were able to connect an entity to a topic or a keyword. 4:59 Smart, very smart. 5:02 One of the things that became an entity was a brand. 5:08 Remember Bill and I had a big conversation about this, 5:11 and he's like, Google does not talk about brands as 5:17 being something that they recognize, that's not something that they use a lot. 5:22 And then a few days later he tweeted at me, he's like, 5:25 you know what, I went back and looked, Google does talk a lot about brands. 5:27 Their research papers, their patents, this brand thing, maybe I think you're right, 5:32 I think it's got some legs. 5:36 And a lot of these advancements bring Google back in line with 5:39 their public statements around how SEO should be done. 5:44 I like that a little bit because I think that it's good when 5:50 how SEO works matches how Google talks about SEO. 5:54 Because for all of us in here who have skepticism, rightfully so, 5:57 skepticism around Google or any other entity that tells us how we should behave, 6:01 how we should be doing marketing. 6:06 There are 100, maybe 1,000 people who do some form of marketing, 6:08 some form of working on the web, and don't have that skepticism. 6:14 They think they just are supposed to blindly trust Google. 6:18 I get really frustrated about that. 6:22 You know, I feel for those people, 6:24 cuz I think they rely on Google to tell them the truth. 6:26 And when they don't, it hurts them. 6:30 It hurts the small entities.
6:34 It doesn't hurt Google. 6:36 So during all these advances, 6:38 Google's search quality team underwent kind of a quiet revolution. 6:41 A revolution that we do not talk about. 6:45 And I'll take you there. 6:49 So this is Peter Norvig. 6:51 Originally the director of Google search quality way back in the day. 6:53 Peter Norvig was interviewed on a blog called Datawocky. 6:59 This is 2008. 7:03 We're going way back, right. 7:05 And he says Google still uses the manually crafted formula for search results. 7:06 They haven't cut over to a machine learned model yet. 7:11 Now machine learning was already something that was out, right, in 2008, 7:14 but not something Google was using yet. 7:18 Amit Singhal, right, next sort of person in this position. 7:21 Head of search quality there. 7:25 This is Edmond Lau, who worked on Amit's team, answering a question on Quora. 7:28 I think answering a question on Quora in 2011 or '12: from what I gathered while I 7:34 was there, Amit Singhal, who heads Google's core ranking team, has a philosophical 7:38 bias against using machine learning in the ranking algorithm. 7:43 All right. 7:49 But, and if you want, 7:51 you can read Edmond's answer in more depth and see why Amit was so against this. 7:53 But 2012 rolls around and Google publishes a paper 8:00 about how they're using machine learning to predict ad click-through rate. 8:04 So they're using it on the PPC side, not necessarily yet on the SEO side. 8:07 And here's Susan, I think her last name is pronounced Wojcicki, Wojcicki? 8:14 If you're Polish and you can correct me, feel free to yell it out. 8:19 >> [INAUDIBLE] >> What'd you say? 8:23 Sorry, Marta, what was that? 8:26 >> [INAUDIBLE] >> Wojcicki. 8:27 Wojcicki? 8:29 Okay, all right. 8:30 The person who introduced her in this video also mispronounced her name. 8:32 So you can, you know, all right. 8:36 So this is 2012, right? 8:38 And she talks about their SmartASS system.
8:40 'Cuz, of course, 8:42 Google engineers are gonna name something as cleverly as they can. 8:43 It learns whether users are interested in ads, and 8:48 whether users are gonna click on them. 8:50 Predictively learns. 8:52 So, 2013 rolls around, Matt Cutts is at Pubcon and he talks about 8:54 how they're using machine learning in search, not just on the ad side, 8:59 on the organic side. 9:04 This is a sea change. When machine learning takes over more of Google's algorithm, 9:09 the underpinnings of how rankings work and the intuition, 9:16 you know, that sixth sense that we build up as search marketers? 9:21 Of like, oh, I know why that ranks. 9:26 Like I know it, right, I can tell. It's like, yeah, it's an exact match domain, but 9:29 that's not what it's doing. 9:33 It was an exact match domain five years ago. 9:33 So it's getting a bunch of anchor text, and so that's why. 9:36 That kind of intuition that we build up over the years of looking at search 9:40 results and reverse engineering and all that, they change. 9:44 Google's actually public about exactly how they use machine learning for 9:48 image recognition and classification. 9:52 Remember they had a little problem with gorillas last week, two weeks ago? 9:55 Yeah. 9:59 Well, this is that process. 10:00 So Google takes potential identification factors for images, 10:04 like color, shapes, gradients, perspectives, 10:09 surroundings, interlacing, alt tags, whatever they can find, right? 10:13 And then they take training data: images where people have actually gone in and 10:17 said, this is what this image is about. 10:21 Here's all the tags for this image, whether editorially created or 10:23 created by communities, I'm not sure, but regardless. 10:28 And then they have a learning process so that they can take all those factors and 10:32 build the best possible automated algorithm to pump out 10:37 results that look like the training data, right, that match up to that.
10:41 And you get a Best Match Algo. 10:46 All right. 10:48 That's kinda classic machine learning, right? 10:50 Not too challenging. 10:53 So if you have not yet read it, in my opinion, 10:55 Jeff Dean's slides on deep learning at Google are a must-read. 10:58 I think that is. 11:04 You can't call yourself a modern, sophisticated SEO without 11:05 reading through that slide deck and understanding at least a little. 11:09 You guys know who Jeff Dean is? 11:12 He's one of these very, very 11:14 highly regarded individual contributor engineers at Google. 11:17 There's actually a wiki at Google 11:20 made up of jokes about how good at programming Jeff Dean is. 11:23 I'll tell you one of my favorite Jeff Dean jokes. 11:28 It's totally groan-worthy, it's like a dad joke. 11:30 So back in 2003, 11:36 the speed of light in a vacuum was 35 miles an hour. 11:40 Until Jeff Dean spent a weekend optimizing physics. 11:48 >> [LAUGH] >> Jeff Dean, alright. 11:52 Machine learning in search could work a lot like this. 11:58 A lot like this. 12:02 You take potential ranking factors, everything that you can think of. 12:02 TF-IDF, PageRank, anchor text, whatever you want. 12:07 QDF, clicks, etc., etc., etc. 12:11 And then you take training data. 12:12 Things where you know, hey, we, Google, have awesome search results for these queries. 12:15 Here's 50 thousand queries where we just feel awesome, 12:20 we love every single result in the top ten. 12:23 We love the ordering. 12:26 Boom, you plug those in, you have your learning process, 12:28 and you get a best fit algorithm. 12:34 Sweet. 12:37 Right? 12:38 Training data, that's Jeff Dean right there. 12:40 Of course he's sort of all-American, good-looking 12:43 engineer guy. 12:46 And he says, right, this is a good SERP: searchers rarely bounce. 12:48 They rarely short-click. 12:54 They rarely need to enter other queries or go to page 2. 12:55 Good search result.
12:58 All right, 13:00 we're gonna take that, we're gonna compare that against a bad search result. 13:01 Right? 13:06 Bad SERP: the searchers bounce, they click other results, they rarely long-click, 13:07 they are very unhappy with these search results. 13:12 And then the machines are going to try and emulate the ranking input 13:14 formula that gives them the most good results and the least bad results. 13:19 It's just all not that tremendously complicated. 13:26 Why was Amit Singhal against this? Why was Peter Norvig against this? 13:29 Because once you start digging into a machine learning system, it's actually very 13:32 hard to figure out why something ranks where it does. 13:35 Very, very hard. 13:40 You get all these weird calculations and derivatives and ya duh, duh, duh, duh. 13:42 Lot of stuff I don't even understand. 13:47 And then there's deep learning. 13:48 So deep learning is just a slightly more advanced version of machine learning. 13:52 It basically says, okay, yes. 13:56 It's fancy in that it's inspired, loosely inspired, 13:58 by what we know about how the biological brain works. 14:02 Which makes it sound like, oh, that must be really cool. 14:05 No, no, no. It's just more layers of abstraction. 14:08 So that you get a more sophisticated machine learning, right? 14:12 So, Jeff Dean says by using deep learning they don't 14:15 actually have to tell the system that this is a cat. 14:19 The system figures out what a cat is. 14:26 It looks at lots of pictures and it creates its own classification for cat. 14:28 That's an algorithm that builds an algorithm. 14:34 Right, with no human intervention. 14:38 Googlers don't feed in the ranking factors, 14:40 at least in deep learning frameworks, right? 14:44 The machines determine the ranking factors themselves. 14:46 Guess what we don't need? 14:53 Oh, no, no, you just give me training data, 14:55 you tell me what good is, and I, the machine, will learn everything else. 14:57 Ooh!
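The loop described here (ranking-factor vectors in, human "good SERP" judgments as training data, a best-fit scoring formula out) can be sketched in a few lines of code. Everything in this sketch is an assumption for illustration: the factor names, the numbers, and the simple gradient-descent learner are toy stand-ins, not Google's actual system.

```python
# Toy "learning to rank" loop: factor vectors, rater judgments, best-fit formula.
# Hypothetical ranking factors per result:
# [TF-IDF score, PageRank, anchor-text match, long-click rate]
training_features = [
    [0.8, 0.9, 1.0, 0.7],  # results human raters judged highly relevant
    [0.7, 0.8, 0.9, 0.8],
    [0.2, 0.1, 0.0, 0.1],  # results raters judged irrelevant
    [0.3, 0.2, 0.1, 0.2],
]
training_labels = [1.0, 0.9, 0.1, 0.2]  # rater relevance scores

# Learn factor weights by stochastic gradient descent on squared error.
weights = [0.0, 0.0, 0.0, 0.0]
for _ in range(2000):
    for x, y in zip(training_features, training_labels):
        error = sum(w * xi for w, xi in zip(weights, x)) - y
        weights = [w - 0.05 * error * xi for w, xi in zip(weights, x)]

def score(factors):
    """Apply the learned best-fit formula to a candidate result."""
    return sum(w * f for w, f in zip(weights, factors))

# Rank two new candidate pages by predicted relevance.
candidates = {"page-a": [0.6, 0.7, 0.8, 0.9], "page-b": [0.1, 0.2, 0.1, 0.0]}
ranked = sorted(candidates, key=lambda p: score(candidates[p]), reverse=True)
```

The point of the sketch is the shape of the system, not the math: no human writes the final formula; the training data and the learner produce it, which is exactly why it becomes hard to say why any one result ranks where it does.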
15:02 That's an interesting way to get a best fit algo. 15:03 No wonder that Elon Musk and 15:07 Bill Gates are like super freaked out that Google's gonna build the Terminator. 15:09 This is actually a real concern that these gentlemen have. 15:13 Like, they're smart guys and they're like, oh no, Google will kill us all. 15:18 And I think Larry Page is there. 15:23 He's like, you know, we're all gonna die anyway. 15:24 [LAUGH] >> One of Moz's investors, Brad Feld, 15:26 likes to say Google believes that humankind is just the boot loader for 15:32 the eventual machine intelligence that will conquer the universe. 15:38 Shit. 15:42 That's, that's just depressing, Brad. 15:44 Okay, what does deep learning mean for SEO? 15:46 'Cause you know, if we're gonna be killed by machines, 15:51 we should at least try and do the best marketing we can before they kill us all. 15:53 Well, it means, for one thing, Google is not gonna know why something ranks or 15:58 whether it's a variable in the algorithm. 16:01 Okay, you're right, Jeff, I'm sorry. 16:06 >> [LAUGH] >> He's Jeff Dean, he'll know. 16:07 Query success metrics. 16:15 Query success metrics will be all that matters to the machines. 16:17 They don't care how they got there. 16:22 They don't care what inputs they used. 16:23 They just care that they got the best SERP, right? So they might look at things like 16:24 long-to-short click ratio, and relative click-through rate versus other results. 16:28 And rate of searchers conducting additional related searches, 16:33 which suggests they weren't satisfied by the first one. 16:36 Things like sharing and amplification rate versus other results. 16:39 They might look at metrics of user engagement across the domain, 16:43 metrics of user engagement on the specific page, that stuff matters. 16:48 If lots of results on a search result do those well, and 16:53 higher results are outperforming lower results, 16:57 well, you know what, the deep learning algo's gonna be like, nailed it.
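Several of these output metrics are things you can compute from your own analytics logs today. A minimal sketch, with made-up log entries and an assumed 30-second dwell-time cutoff for what counts as a "long click":

```python
# Hypothetical click-log entries for one result: how long each searcher
# stayed on the page before (possibly) bouncing back to the SERP.
clicks = [
    {"query": "best steak recipe", "dwell_seconds": 95},
    {"query": "best steak recipe", "dwell_seconds": 4},    # quick bounce back
    {"query": "best steak recipe", "dwell_seconds": 180},
    {"query": "best steak recipe", "dwell_seconds": 40},
]

LONG_CLICK_SECONDS = 30  # assumed cutoff; tune against your own data

long_clicks = sum(1 for c in clicks if c["dwell_seconds"] >= LONG_CLICK_SECONDS)
short_clicks = len(clicks) - long_clicks

# Long-to-short click ratio: the higher, the happier the searchers.
long_to_short = long_clicks / max(short_clicks, 1)
print(f"long:short = {long_to_short:.1f}")  # 3.0 with the sample data above
```

The field names and threshold are illustrative; the idea is simply that "searcher happiness" metrics like this one are measurable on your side of the fence, not just Google's.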
16:59 We're gonna be optimizing less for 17:05 ranking inputs like more linking root domains, more keywords in titles, and 17:08 more anchor text and content uniqueness and duh, duh, duh, duh. 17:12 And we're gonna be optimizing more for 17:14 searcher outputs like, did we get a high click-through rate for this position? 17:21 Did we get good engagement? 17:22 Did we get a high amplification rate? 17:24 Did we get a low bounce rate? 17:25 Strong pages per visit after landing on this URL? 17:28 Do people return to this site right after making an initial search visit? 17:33 Optimizing for outputs rather than inputs? 17:37 That's, that's a new SEO? 17:41 That's weird, right? 17:42 And then it will try and figure out which ones we missed and work on those. 17:48 I think when we get to the future of on-site SEO, 17:51 these are going to be new criteria. 17:53 We're still going to have to worry about the inputs, 17:56 I'm not saying those are going away, the inputs still matter because they 17:59 predict and suggest successful output, but we'll have to do both. 18:05 Okay, you're thinking, so terminators and 18:09 deep learning, Rand, that's my future? 18:13 Forget about the terminators. 18:16 Right now, today, can I show you anything that suggests this is already happening? 18:23 Anything? 18:25 Well, of course I can. 18:28 There's no way I'd set up a question like that and 18:30 then be like nope, sorry, can't, presentation over. 18:33 So here we go. 18:34 [LAUGH] You know what, I just, I know this is wrong. 18:38 I know this is not a good way to feel psychologically good about one's self, 18:44 but I love it when I can mess with Google. 18:47 I love it when they tell me something and I can be like, yeah, but really, no. 18:52 All right, here we go. 18:53 So do you remember our queries and clicks test from last year? 18:57 Right, we all search for something, we click on a result, boom. 19:01 That result suddenly rises to the top. 19:05 Fascinating.
19:05 Well, turns out, I tried this test a few more times after MozCon last year. 19:05 Because it worked successfully a number of times in a row leading up to MozCon, 19:10 then we all did the test together. 19:15 And by an hour after the show had ended, 19:17 that result that we all tested had jumped to the top. 19:20 I think it was a result for Seattle wedding dresses. 19:22 Boom, jumped to the top. 19:24 Since then, it's been way harder to do this. 19:27 I've tried the experiment a number of other times. 19:29 Way, way harder to move the needle with raw queries and clicks. 19:32 And then this year at SMX Advanced, 19:37 a Google representative said, I, well, okay. 19:41 Danny, right. 19:46 Danny asked him, I thought this was fun. 19:47 We know you measure what clicks are going on, blah blah blah. 19:48 And then, you know Gary is like. 19:53 There are many people who are trying to induce noise in clicks. 19:56 One would be Rand Fishkin. 20:00 >> [LAUGH] >> Now wait, using those clicks directly 20:01 in ranking would be pretty, and then ellipses, cuz Danny says, 20:06 is Rand just clicking the stuff to mess it up, which he knows I'm not, but. 20:11 And Gary says, I think what he's doing is hiring people to click on stuff. 20:16 >> [LAUGH] >> Did I pay any of you to click on stuff? 20:20 Have you received a check from me? 20:28 Any cash? 20:30 No, it's fine. 20:31 Like I get what Gary's trying to say, 20:33 hiring might not have been the verb he was looking for. 20:34 But using clicks directly in ranking would not make too much sense with that noise. 20:37 No, it would not. 20:43 Fair enough, Gary, you're totally right, 20:45 case closed, Google says they don't use clicks in the rankings. 20:47 Maybe it's just like, who knows, 20:50 random that those five times I tried it last summer totally worked. 20:53 I must be, well, what if we tried something else? 20:56 What if we tried long clicks and short clicks? 
21:01 A long click is: I click on something in the search results, I stay on the page; 21:04 versus a short click: I click on something, go to the page, click back immediately. 21:08 So here's Serious Eats, which, by the way, 21:12 Bobby Flay, come on, not a quality chef. 21:16 Not a quality dude, but also not a quality chef. 21:21 I'm sorry to those of you who are big Bobby Flay fans. 21:24 But you should be making steak via this Serious Eats recipe. 21:27 So there we go, ranking number four there. 21:32 11:39 AM on June 21st, I sent this tweet. 21:35 If you have 20 seconds, I'd love some help doing this. 21:39 Click the query, 21:42 long click this, short click on Bobby Flay, long click on Serious Eats. 21:43 40 minutes and 400 interactions later, whew, 21:47 I guess Google is not using, I mean, is using clicks. 21:52 Weird. 70 minutes and 500 interactions, oh. 21:56 I can't even describe how good I was feeling, I was like [SOUND]. 22:04 >> [LAUGH]. 22:08 >> Just, sometimes proving people wrong, it just feels so good. 22:09 It stays 12 hours, then it fell the next morning. Look, I am not a conspiracy theorist, 22:15 but at 9 AM, Switzerland time, the next morning, when 22:22 the Google Switzerland webmaster tools team got into the office, 22:26 it fell down 13 ranking positions, almost exactly at 9:05 AM. 22:31 Then, it goes back to number four about an hour later. 22:37 Here, via Google Trends, by the way, 22:41 this is the sickest thing you can do with the new Google Trends. 22:43 You can specify a custom date range by hour. 22:46 So for example, this has nothing to do with my presentation, but like a bunch of 22:50 speakers have talked about this so, for example, if you run a branded ad on TV, 22:53 and you wanna see how much search volume you induced, you can go see. 22:59 You won't know exactly how many people clicked on you, 23:05 but you can get a pretty good sense. 23:07 That's awesome, I love the new Google Trends, that's great.
23:09 So we had about five to ten times normal volume over three to four hours. 23:12 That's fairly substantial, but not huge, right, not huge. 23:16 By the way, do not try and spam this. 23:20 This is crazy hard to replicate. 23:24 We're talking about 600 real searchers on devices that were differently geolocated, 23:27 different logged in versus logged out users, 23:33 real users who performed real searches, all day, every day. 23:35 We are not talking about bot farms. 23:39 If you go to clickmonkey.com and purchase like the $500 click package, 23:41 you will be sorely disappointed. 23:46 I think the future that we have is optimizing for two algorithms. 23:51 And the best SEOs have always optimized where we're going, right? 23:58 I think today, we know where we're going. 24:01 We're going to your new home, the user and usage data and 24:03 machine learning model cabin. 24:07 Oh my God, that's the least sexy name ever. 24:11 I'm gonna work on the branding, I promise. 24:13 So I think we get to choose how we're gonna balance our work. 24:15 There's the signals of old, right, the signals of old, the inputs, 24:21 the search inputs. 24:26 And then there's the new signals that are on the rise, the outputs. 24:27 So classic on-site SEO, things like keyword targeting, quality and 24:32 uniqueness, crawl and bot friendliness, snippet optimization, 24:37 user experience, multi-device. 24:41 That's not going anywhere, at least not yet. 24:44 But we got new ones, relative click-through rate. 24:48 If you can do what we kind of fake did for 24:51 Serious Eats, [SOUND] wow, wow! 24:56 Long clicks versus short clicks, 25:00 man, if I can get more people spending more time on my page than they are on my 25:02 competitor's after they visit from a search result, that's a win. 25:06 That's on-site success, short versus long click, 25:10 content gap fulfillment, amplification and loyalty, task completion success. 25:15 This is new on-site SEO. 
25:23 These are like the factors that Google has published research papers about and 25:26 patents about and suggested that they 25:29 might be using as their success metrics in what might be used for 25:32 that machine learning, deep learning model for rankings. 25:36 So I'm gonna talk through these a little bit, 25:41 spend some time with these five new on-site signals. 25:44 Cuz I think you know these ones really well. 25:47 First up. 25:56 How are we going to punch above our average click-through rate? 25:58 So if I look, and we've done this for a while, 26:05 because we wanted the traffic anyway. 26:09 But when Google's talked about this, 26:13 at least when they talked about this in research and patents and 26:14 that kind of stuff, it's never absolute click-through rate. 26:17 It's not that position number two needs to get more clicks than position number one. 26:19 It's that relative to an average search that looks like this, 26:25 are you over-performing? 26:30 So, if I'm ranking number three, but I have a higher-than-average click- 26:33 through rate for the kind of search result that I'm in, and I think this is where Dr. 26:37 Pete's keyword opportunity model and 26:41 the keyword opportunity score is gonna be big. 26:44 And then click-through rate stuff which I'm working on, and I will try and get to 26:46 you in the next few months an accurate click-through rate curve across devices. 26:51 In this case, every element is gonna count. 26:57 Everything here, so the title, URL, 27:01 if you get that brand drop down, a lot of commercial queries. 27:05 We're seeing that brand drop down be super correlated with whether you rank high or 27:10 don't, it's kinda ridiculous. 27:14 Searchers recognize and wanna click on your domain, all that brand stuff. 27:16 Is your result fresh? 27:22 Do searchers want something that's newer than that? 27:23 Does the description create curiosity, encourage that click?
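This relative-not-absolute point can be made concrete. The average-CTR-by-position curve below is invented for illustration; real curves vary by device and query type, which is exactly why an accurate measured curve is worth having.

```python
# Invented position-average CTR curve; substitute a real curve measured
# per device and query type.
average_ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def relative_ctr(position, impressions, clicks):
    """Observed CTR divided by the positional average; >1.0 means you are
    punching above your position's average."""
    return (clicks / impressions) / average_ctr_by_position[position]

# Ranking #3 with 140 clicks on 1,000 impressions: a 14% CTR against a
# 10% positional average.
print(round(relative_ctr(3, impressions=1000, clicks=140), 2))  # 1.4
```

So a #3 result at 1.4x its positional average is, by this lens, doing better than a #1 result limping along below its own average, even though the #1 result gets more raw clicks.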
27:27 Given that what happens in Google, like we know that this happens in Google. 27:31 That a lot of times, you publish new content, and they test it out. 27:35 They like give you this little test window. 27:38 You can see there is actually a result right now that is getting this 27:40 test window. 27:42 If you do a search for, it was just today so 27:44 I didn't have time to put it in the deck but it, how big is the solar system? 27:46 You search for how big is the solar system, 27:50 there's this like one pixel crazy visualization this guy did. 27:53 He did no SEO for it so like, it doesn't even have the key words in the title or 27:56 anything like that, but it ranks because Google's good at intent now, and 28:00 they're testing it. 28:04 They're testing it on page one. 28:05 If it gets lots of clicks, performs well, I think it's gonna stay there. 28:07 If it doesn't, it'll fall off. 28:10 So given that we know that, I think it probably is gonna be worthwhile 28:11 to think about repeated publication on the same topic, same keywords 28:16 until we sort of nail what our users want, nail what our searchers want. 28:22 And this isn't actually a bad thing from a topic modelling perspective either, 28:26 because remember that Google now cares a lot about whether we're publishing 28:29 regularly on a topic in the associations that Google makes between websites and 28:33 topics, topics being collections of keywords around a subject. 28:38 I think, actually, another one is driving up that click-through 28:44 rate through branding or branded searches. 28:47 Remember what Will talked about with Wayfair, where they ran those TV ads. 28:49 The TV ad said, Google Wayfair Sofa Beds. 28:55 Oh, that is so smart. 29:00 Oh, that is so smart. 29:02 You know what? 29:04 I almost guarantee that they read the Google site quality patent 29:05 before they bought those ads. 29:09 Cuz you know what this Google site quality patent says? 
29:11 It says that the percentage of people who do a branded search 29:13 influences whether the brand in that branded search is going to 29:22 rank higher for the non-branded search. 29:26 That's a quality signal. 29:31 Lots of people search for Trip Advisor New York hotels, you know what, 29:32 we should probably rank Trip Advisor higher for New York hotels. 29:35 Lots of people search for Wayfair sofa beds. 29:39 We should probably rank Wayfair higher for just sofa beds. 29:42 Crazy. 29:46 So, here's car insurance quotes. 29:48 Oh, the number two ad spender is numbers one and four. 29:51 Number one ad spender's two. 29:54 Number four ad spender's #4, number three ad spender, number five ad spender. 29:55 Do you think it could possibly be the case that these are also the five 30:03 companies that are absolutely the best at SEO? 30:07 I don't think so. 30:13 I don't buy that. 30:14 I think their TV spend, their offline spend, their brand spend, is directly 30:17 influencing the searches that are being performed, the click-through rate 30:21 they're getting, the engagement they're seeing, the trust users have in them. 30:26 And therefore, secondarily influencing their success metrics. 30:31 With that Google Trends system, you can see it, you can watch it. 30:38 Pretty sweet, next time you see an ad for 30:45 a company you don't know at all, go check it out. 30:47 Go see what that search volume's like and then go check. 30:51 Go check, are they ranking well for the non-branded query too? 30:54 Are they creeping up after the big ad spend? 30:57 I have a feeling, by the way, one big ad spender, Trivago, 31:01 we've seen a bunch of ads from them, I think they're gonna be creeping up. 31:03 #2, beating out your fellow search result residents on engagement. 31:08 So, pogo-sticking, the long click, short click stuff, 31:14 we know that this might determine a lot of where you rank. 31:17 We've done the experiments, we've seen it, Google's talked about it. 31:19 What influences those?
31:24 Well, the way that I think about this is, we have a checklist, right? 31:26 We've gotta fulfill two parts of the searcher's needs, 31:31 conscious and unconscious needs. 31:35 We have to fulfill speed because people are clicking that back button, 31:38 they're expecting more and more of us. 31:42 We've gotta deliver the best user experience on every browser. 31:45 We have to compel visitors to go deeper into our site. 31:47 And we've gotta avoid any features that annoy or dissuade visitors. 31:52 I love this because I hope it means that all those effing pop-ups 31:57 that work will eventually fall off of page one. 32:01 That'll be so nice. 32:04 For example, the New York Times has gotten crazy smart about this. 32:07 Not only are they showing you visuals, right, and a graph around this story: 32:13 how does family income predict your children's college success chances? 32:17 They make you draw the graph. 32:23 You draw the graph. 32:26 Guess how long people spend on that page? 32:28 A long time. 32:30 It's so engaging. 32:32 It's so cool to see how my guess matches up with the real data, insane. 32:34 Here's VoilaNorbert, which is one of my favorite email tools. 32:41 It's a great way to find anyone's email address for anything, I love it. 32:45 I actually paid for it, I totally use it. 32:48 And they've got nothing, they just do no SEO and oh man, there's no content. 32:52 They just don't do anything, but it doesn't matter. 32:59 Cuz they get visitors to return again and again and again, and engage, and 33:01 it's very simple and the UI makes it easy. 33:05 The payment process makes it super simple, great tool. 33:07 Or Nomadlist, which basically does the absolute best job in their field. 33:12 If you're looking for 33:16 the best cities to work remote, you couldn't find a better resource. 33:17 #3, filling in the gaps in your visitors' knowledge. 33:21 So here we go.
33:29 Hm, Google wants that content, right, that will fulfill all the searchers' needs so 33:32 that they don't need to come back and search again. 33:37 And of course Geoff has a few ways to figure that out. 33:40 So we've seen some discussion about this, right? 33:46 Where essentially Google is looking at topic modeling and 33:50 saying, hey, terms and phrases that are connected to this particular keyword. 33:53 When we see those present, it suggests to us that this content is more 33:58 comprehensive, more accurate, and predicts a better searcher experience. 34:02 Right, so for example, if you wanna try and rank for New York, but 34:07 you do not mention Bronx, Manhattan, New York City, 34:11 Brooklyn, how relevant to New York are you? 34:15 And Google totally can figure this out. 34:19 So if I'm trying to rank well for natural language processing, but 34:25 I don't have anything on topics like text classification or tokenization or 34:29 parsing, question answering, I might not get there. 34:34 Matt Brown was on the stage, he talked about the Moz Context API; 34:40 that is our data science team doing exactly this. 34:44 Right, we wanna find what those terms and phrases are, so 34:47 that you know what Google is using in the topic 34:51 modeling algorithm, and can successfully target those on your pages. 34:56 Until then, until this formally launches, which hopefully won't be that long, 35:03 you can check out two tools, Alchemy API and MonkeyLearn. 35:08 Which both do this, and I think Jean Luca mentioned Knipe, 35:11 K-N-I-P-E, which also does this. 35:16 #4, this is a weird one. 35:20 It's weird because it's like, gosh, 35:25 does Google really care about sharing and amplification? 35:26 Is that something that's important to them? 35:29 Well, you know what, I see this a lot, I know you guys see this a lot. 35:31 You see a lot of these pages that have not yet gotten links or 35:36 just don't have links at all.
35:38 But they've gotten a lot of social engagement and 35:40 they seem to rank ridiculously well, they just overperform. 35:43 I know Google says they don't use social signals directly and 35:48 honestly, I believe them. 35:52 It's just that examples like these make a lot of SEOs suspicious. 35:54 Right, like, if they don't use social signals, then 35:58 social seems to be the only thing that's going on here. 36:01 They're not doing great SEO from the keyword front or on-site. 36:04 They're not doing a great amount of link building. 36:08 Well, what is it? 36:10 Even for insanely competitive keywords, right? 36:12 Man, why is it that this thing that just did so 36:18 well on Facebook seems to rank for hyper competitive keywords? 36:20 I think Google is telling us the truth. 36:25 I don't think they're actually using the raw shares, 36:27 I don't think they care all that much about the numbers. 36:29 I think they look more at engagement, right? 36:33 They could use so many things to get the data that mimics social shares. 36:38 Clickstream data from Chrome and Android, engagement data, 36:41 branded queries from search, sure, navigational queries, why not? 36:45 Rate of link growth, which correlates very nicely with social. 36:49 Well, I don't care, kinda don't care. 36:54 I care personally, just because I have to know how Google works, 36:57 it's in the fiber of my DNA. 37:01 But if I'm a marketer, I just want to rank like them. 37:02 I think Google almost certainly, by the way, 37:08 classifies different search results differently. 37:10 So in your industry, in your field, 37:12 around your topics and keywords, they might not actually care. 37:14 When it comes to medical stuff, I bet they don't care at all. 37:17 I bet they totally disconnect these things. 37:20 And this is another interesting thing about a machine learning or 37:23 deep learning model.
37:25 A machine learning model is gonna learn, hey, not a successful search result 37:26 in health, yes a successful search result in politics. 37:31 So, for us, raw shares and links, those, eh, they might be okay metrics, 37:36 especially if you're doing social media marketing. 37:40 But, if the competition is naturally earning them faster, 37:43 you're kinda out of luck, right? 37:47 So if these guys are getting ten new shares a day and I'm only getting four, 37:49 I don't have long. 37:53 My life span's limited. 37:55 And Google probably doesn't just wanna see shares, they wanna see shares that 37:57 result in engagement, loyalty, returning visits, those kinda things. 38:00 So I think this is a metric that I would like to use, 38:05 which is unique visits divided by my shares and links. 38:11 The percentage of my visitors, of my visits, 38:14 that are doing some sort of amplification of any kind. 38:16 That is probably a very good way to measure 38:22 where I'm having great success from an amplification rate perspective. 38:25 And the second one is total visitor sessions divided by number of returning visitors, 38:30 cuz that tells me loyalty. 38:33 Right, am I doing a good job earning back the visitors who came to me one, 38:36 two, three, four, five times, whatever it is. 38:40 If we know what our audience and their influencers share and 38:42 we measure those things, we can improve them. 38:46 This poor bastard. 38:50 >> [LAUGH] >> My god, I mean, that's loyalty. 38:51 Tragic loyalty, at the end he just sort of puts his head down and cries, but. 38:58 We don't need better content for this, we need 10x content. 39:05 A number of presenters have talked about 10x content. 39:12 I'm very passionate about this, I think this is the wrong question to ask. 39:16 How do we make something as good as that? 39:21 Don't ask, wrong question! 39:24 How do we make something 10 times better than any of these? 39:25 Gonna need that 10x content.
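The two ratios described above can be jotted down as simple functions. Note that the spoken phrasing inverts the fractions a bit, so the definitions below are one plausible reading of the slide (amplification = shares plus links per unique visit, loyalty = returning sessions per total session), not an official formula from the talk or from Moz:

```python
# Sketch of the two engagement ratios described above. The exact
# numerator/denominator order is garbled in the spoken version, so
# treat these definitions as one plausible reading, not Moz's formula.

def amplification_rate(shares, links, unique_visits):
    """Fraction of visits that amplified the content (shared or linked)."""
    return (shares + links) / unique_visits

def loyalty_rate(returning_sessions, total_sessions):
    """Fraction of sessions that came from returning visitors."""
    return returning_sessions / total_sessions

# Hypothetical month: 5,000 unique visits, 40 shares, 10 links,
# and 1,200 of the 5,000 sessions came from returning visitors.
print(f"amplification: {amplification_rate(40, 10, 5000):.2%}")  # 1.00%
print(f"loyalty: {loyalty_rate(1200, 5000):.2%}")                # 24.00%
```

Tracked over time, these are output metrics in the sense Rand uses: you improve them by making content worth amplifying and returning to, not by tweaking tags.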
39:32 That's how we're gonna get that loyalty, that engagement, that higher sharing and 39:34 amplification rate, that higher click-through rate, build that brand, 39:39 get those output metrics that we need. 39:43 10x content, I think it's the only way that we're gonna stand out from this 39:45 increasingly noisy crowd. 39:50 Oops, because the top 10% of content gets all the shares. 39:52 That's the bottom 75% down there. 39:58 No shares, no links. 40:02 Why bother making it? 40:04 Why did I hit publish? 40:06 None of our old school tactics are gonna get this done, right? 40:08 None of our old school on-site, none of our old school off-site. 40:12 These are all inputs. 40:15 They're not gone, they're not dead. 40:16 I'm not even sure they're gonna go away entirely, 40:18 but they're not gonna get us to 10x. 40:20 Last one. 40:23 Fulfilling the searcher's task. 40:25 Not the searcher's query, the searcher's task, what they are trying to accomplish. 40:29 So Google is public about this, right? 40:35 They want to get searchers accomplishing their task faster. 40:37 And what do we do when we search? 40:40 Like planning a vacation, or I'm making a very big considered purchase. 40:41 I do a broad search, I do a website visit, a narrow search, an even narrower search. 40:47 This is what you do when you plan a vacation or buy a house or 40:51 a motorcycle or a new laptop, or decide which new phone you're gonna get. 40:54 Google wants to be a Star Trek computer. 41:01 They want to take you from, you did a broad search, I'm gonna show you 41:03 all the sites or answers that you probably would have visited along that path so 41:07 that you can complete your task directly. 41:11 So you don't have to do it again and again. 41:14 If Google sees that a lot of people who perform these types of queries, 41:18 best ramen noodles, instant noodle brands, tastiest packaged noodles, 41:22 eventually end their queries after they get to The Ramen Rater. 
41:25 Brilliant Seattle website, all he does is stay home all day and 41:31 eat packaged noodles. 41:34 He is a hero. 41:36 >> [LAUGH] >> Also I worry about his salt levels, 41:37 but hero. 41:43 They might use the clickstream data to help figure this out. 41:46 Even if it has no traditional ranking signals. 41:48 And I see so many search results where people are like, what is going on there? 41:50 I'm like, well, does that answer the task, not just the query, 41:54 but the task better than anyone else? 41:58 A lot of the time it does. 42:00 Ho oh, you better believe they're getting it and storing it. 42:02 You look at Chrome, they're basically like, look, when your finger touches 42:06 the keyboard we own the DNA of your ancestors, so it's ours now. 42:09 If you got a page that answers the searcher's initial query, 42:17 that might not be enough, right? 42:20 If I'm looking for Adventure Time comics, I'm probably looking to complete a task. 42:23 And there's only one website that really gets me to complete that entire task. 42:30 And that is these folks, Kaboom Studios, 42:35 who actually make all the Adventure Time comics. 42:38 Amazon doesn't carry a big enough selection, eBay doesn't have it. 42:40 These guys, no offense, but they suck at SEO, like classic SEO, 42:44 they're just terrible. 42:48 There's a duplicate version of every page times ten, and the navigation's a mess. 42:50 User experience is not great. 42:54 They're even ranked number one on mobile, and 42:56 they are the least mobile-friendly site I've seen since the 90s. 42:58 Gang, we're in a two-algorithm world. 43:03 The old algorithm is still powerful, still important. 43:09 You can still win with it, you can still rank with it. 43:13 That algo is Google's input. 43:17 But algo two, that Google is starting to care about, 43:20 is the subset of humanity that's gonna interact with your content and 43:26 interact with your queries and the tasks that they want to accomplish. 
43:29 Let me show you some advice that I hear all the time after a presentation 43:35 like this, and that I hate. 43:39 It is a totally wrong interpretation of what I mean here. 43:43 Well, you know what, I think Rand's just saying we should make pages for people, 43:48 not engines. 43:52 That is terrible advice. 43:53 Terrible, awful, miserable advice, that a content marketer who hates SEO and doesn't 43:55 want anything to do with our field, wants to believe in the unicorn rainbow world. 44:00 And maybe, no offense, I mean, I get it, but that's terrible advice. 44:05 That's engines, that's people. 44:11 Gotta do both. 44:15 Outputs and inputs. 44:17 These two things are on-site SEO. 44:21 That's the world we're living in today. 44:26 All right, I'm technically over time, but I'm gonna go into my bonus round anyway. 44:30 Bonus time! 44:35 >> [APPLAUSE] >> All right, bonus number one. 44:36 If you hear me and a bunch of other presenters talk about 10X content and 44:40 you're like, show me, show me the 10X content. 44:43 All right, I got something for you. 44:46 If you go to bit.ly/10X, that's a capital X, 10Xcontent, 44:49 you will find my personal list of 10X content, 44:54 what I talk about when I talk about 10X content. 44:57 And I'll keep this list updated. 45:01 I keep adding to it all the time. 45:02 I think I got 30 or 40 entries in there now. 45:04 If you have some content you think that I should put on the list, 45:07 please let me know. 45:10 Second, those MonkeyLearn guys, they heard I was gonna mention them, 45:12 they reached out. 45:16 And they said, you know, we can build a tool that might be helpful for you. 45:17 So they took their machine learning algo for topic modeling and if you plug in your 45:22 URL and a search query, and which engine you want, they'll go scrape the top ten 45:27 of Google, pull out all the keywords of the top ten and compare that to your page. 45:32 Pretty sweet.
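The comparison that tool performs — pull the terms that recur across the top-ranking pages, then see how many of them your own page covers — can be roughed out in a few lines. Everything below (the sample snippets, the crude word counting) is made up for illustration; this is not MonkeyLearn's or Moz's actual algorithm:

```python
# Toy version of the top-ten keyword comparison described above:
# find words that recur across top-ranking pages for a query,
# then measure how many of them your own page mentions.
from collections import Counter
import re

def top_terms(docs, n=5):
    """Most frequent words across the top-ranking pages (a crude
    stand-in for a real topic-modeling term list)."""
    counts = Counter()
    for doc in docs:
        counts.update(re.findall(r"[a-z]+", doc.lower()))
    return [word for word, _ in counts.most_common(n)]

def coverage(page, terms):
    """Fraction of the related terms that the page mentions."""
    words = set(re.findall(r"[a-z]+", page.lower()))
    return sum(t in words for t in terms) / len(terms)

# Hypothetical scraped snippets from the top results for "new york":
top_pages = [
    "new york guide manhattan brooklyn bronx",
    "visit manhattan plus brooklyn in new york",
    "bronx queens brooklyn boroughs of new york",
]
my_page = "our new york page talks about manhattan only"

terms = top_terms(top_pages)
print(terms)
print(f"coverage: {coverage(my_page, terms):.0%}")
```

A real tool would scrape live results, strip stopwords, and weight terms properly, but the shape of the check — related terms missing from your page are topic gaps — is the same idea as the New York/Bronx/Brooklyn example earlier in the talk.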
45:37 I'll be honest, the Moz Context API is actually better than what MonkeyLearn has 45:40 right now, but it's really cool that they built this tool for us. 45:44 And I think you can find some awesome stuff through there, and 45:48 it'll save you a ton of time. 45:51 If you went through the post that Gianluca talked about, 45:52 which walks you through the whole process of doing it manually, 45:55 this is the automatic way to do that. 45:58 And this is not on the slide deck online, so 46:01 I need, oops, go back, go back, all right. 46:03 So I need your promise, your solemn promise, that you will 46:05 not tweet it, that you will not use a Wi-Fi connected laptop or desktop. 46:13 Nobody brought their desktop, that's crazy. 46:20 >> [LAUGH] >> There's one person here who's like, oh, 46:22 I have it. 46:25 Or your mobile device, you have to use just your cell phone on your 3G, 46:26 4G wireless, whatever, LTE and 46:31 we can perform the long click, short click test if you guys want. 46:34 So if you're up for it, not tweeting, not sharing, just doing it. 46:40 We have enough people in this room, we should be able to repeat it. 46:44 You guys want to do that and we can check the results at the party tonight? 46:47 All right, let's do it. 46:50 Here you go. 46:52 The query is gonna be for smoking deer shirt. 46:55 As you can see here, this result, from Jon Wye, 47:01 which is pictured over there, clearly the vastly superior result to these top three. 47:05 And a lot of you asked what T-shirt I was wearing on day one, 47:10 the jackalope, that came from Jon Wye. 47:13 He doesn't really do SEO, but 47:15 he's this awesome little independent designer in Washington DC, 47:17 one man show, exactly the kind of small business guy you'd wanna support. 47:19 So Amazon, local company, I love them, they don't need to rank here. 47:24 Really, they just don't, doesn't matter. 47:29 Click them, click back.
47:33 As soon as the page loads in full just click on them, then click back. 47:34 Remember, not on a Wi-Fi device. 47:38 Then click on Jon Wye's first result there, men's smoking deer shirt. 47:40 Stay on his site. 47:48 Do not go back to Google. 47:49 Click around a little bit, stay on the site, browse, 47:50 close your browser or whatever. 47:53 Try and spend like 20, 30, 40 seconds on there. 47:55 If we do that, if my predictions are right, somewhere between 45 minutes and 47:58 an hour and a half from now, we should see a result that we've seen a few times 48:03 previously, which is that this guy will move up to the top. 48:07 Then someone from Google will get pissed about it and find out and shut it down. 48:10 Or maybe that's just built into the algorithm. 48:15 I don't know, I can't say for certain. 48:17 Should be fascinating to find out. 48:19 And with that, thank you very much, it's been great to have you all here. 48:21 >> [APPLAUSE] >> Okay, 48:26 we are over time, but we want to do a few questions, if that's okay with you. 48:34 >> Please, let's bring it on. 48:38 >> First of all, I want to ask where will we find out the results of this test? 48:39 >> So the Imac Labs Crew, Eric, Mark, Dave McCollough, 48:43 they're all watching this, they'll let us know. 48:46 And we'll be publishing data about this and the previous test as well, so. 48:49 >> Follow Rand on Twitter, he'll tell [CROSSTALK] 48:53 >> Yeah, we'll get it to you. 48:55 >> Okay, first question from Ryan Glass, who, if memory serves, works for 48:55 U-Haul and will help us all move to Seattle. 48:59 Ryan asked, how much will deep learning actually impact results for all users, 49:02 or will they have a much deeper learning set per user? 49:08 >> If I had to guess, and this is gonna be pure guess because I can't say for 49:14 certain, it will be a little from column A and a little from column B.
49:17 I expect it to influence the broader results because I think Google has talked 49:21 publicly about the fact that they're now using machine learning, 49:24 deep learning, in search results. 49:26 I mean, Penguin, 49:28 they talked about how Penguin was built on a machine learning infrastructure. 49:29 Panda, same story. 49:33 So we're gonna see it influence all results and personalized results. 49:34 >> Okay, we had a couple questions on this topic, so 49:40 I'm gonna pick one from Arman Sargansian. 49:42 Sorry, I slaughtered that, Arman, thank you. 49:45 How can Google track user engagement across the domain if the visitor isn't on 49:49 Chrome or if there isn't GA code on the website? 49:53 Are they using Google Analytics? 49:55 How is Google tracking [INAUDIBLE]? 49:56 >> Well, so if it's not Chrome, it's Android. 49:58 If it's not Chrome and Android, I don't actually think they're using GA. 50:03 I don't think they're using GA at all, I totally believe them when they say we 50:07 don't track GA data and then use that in rankings, I believe that. 50:10 However, they also have a ton of infrastructure and ISPs and 50:14 can get it that way, and can buy data from ISPs if they want. 50:18 They also can get a ton of it from the free Wi-Fi that they offer. 50:21 Like, they have enough coverage that they don't need everybody. 50:24 It's not like they have to also go and 50:27 get everything that happens on Apple devices or Microsoft devices. 50:29 They just need the percent that they have, which is over 60%. 50:32 >> Okay, you talked about click-through rates. 50:36 Matthew Barnett, Rand, what's your process for 50:38 evaluating and analyzing organic click-through rates? 50:41 Are there tools for that? 50:44 >> Yes. 50:46 >> Oh. 50:46 >> I didn't know about them until just now, but yes there are. 50:47 You can find out the click-through rate for 50:51 a particular SERP, if it's a popular enough search query, from SimilarWeb.
50:53 Go to SimilarWeb, sign up for their pro tool, those guys have 50:58 the ability to show you how much traffic is going to each of the sites and 51:02 pages that show up in a given result. 51:06 It blew my mind when I saw it on Monday. 51:09 Congrats to them, their panel is outstanding. 51:12 It's like 50 million desktop and laptop users. 51:14 50 million mobile. 51:17 So you can see click-through rates on both. 51:18 >> Yeah, we just partnered with them on a study that's going to be 51:20 coming out really soon. 51:23 >> [INAUDIBLE] Hey, sorry to bug you. 51:24 I think you can use Jon Wye's site to find that. 51:26 >> Oh, sorry Jon. 51:29 >> So yeah, we might have killed that site. 51:30 [APPLAUSE] >> Yeah. 51:33 You know what, I'll give him a call and send him some better web hosting. 51:38 [LAUGH] >> So 51:42 here's a really- >> Thanks, [CROSSTALK] 51:45 >> relevant question regarding that. 51:46 Jack Boland asks, isn't the real test to buy the smoking deer shirt and 51:48 complete a task? 51:53 >> Yes. Absolutely. 51:54 So I think Google is using what I would say today, right, 51:55 with the long click versus short click, that's an unsophisticated model. 51:59 And I think they're gonna get more and more sophisticated, 52:03 just as we saw the queries and clicks test stop working or be much harder to do. 52:05 It doesn't stop working but you need like so many people in such a big distributed 52:09 geography to make that work now, that it's really hard. 52:13 I got it to work in like, one conference in Europe that was huge. 52:16 That was it. 52:18 So, I think that the next step for them is the long click, short click, 52:19 and then eventually, it'll be that like, bigger loyalty task completion engagement, 52:23 and all that kind of stuff. 52:28 >> Okay, so here's a philosophical question. 52:31 Tyler Frosh, what's Google's goal with trying to get people's tasks done faster 52:33 if their revenue is based on ad impressions?
52:38 Seems like there's two things going on there. 52:40 >> Yeah, I totally agree with that. 52:42 So, that goes right to my intro talk, which is, 52:44 Google believes in disrupting itself. 52:47 They want to shut down their own operations. 52:50 Google wants to be the thing that kills Google. 52:52 And I think that goes exactly to that point, right? 52:54 They're essentially saying, hey, 52:57 you know what, we are willing to sacrifice huge amounts of ad revenues to directly 52:58 drive a visitor when they start typing into their mobile phone. 53:02 And suggest a URL, before we can show them your super valuable ads that could send 53:06 them to the same URL and give us $80 a click. 53:10 The fact that they were willing to do that says to me, 53:13 this is all about user experience, keeping searchers, 53:15 getting more searches per searcher, and more user happiness. 53:18 >> Okay, I got to read this one. 53:21 Aaron Ballard, I joined Twitter just to ask, Rand Fishkin, can we take a selfie? 53:23 My first tweet. 53:29 [LAUGHTER] [APPLAUSE] >> So 53:30 as Ruth said, yes you can take a selfie with me, absolutely, but you should just 53:38 make sure that you network with all of the amazing, awesome people here. 53:43 Cuz I'm just kind of like a Nebashi moustached weirdo who happens to be 53:46 crazy about making Google transparent, but yes, let's take a selfie for sure. 53:50 >> Okay so please stay on stage, that's gonna do it for Q&A. 53:54 I want to let everybody know, I know some of you have flights, for those gonna be in 53:58 town tonight, we're going to the Garage, I believe it starts around seven o'clock. 54:01 Yes, looking at Jen. 54:07 Be sure to bring an I.D. or a passport if you're international, 54:08 they check every one of those and you will not get in unless you do. 54:12 And before we go I want to thank everybody here, this has been an amazing experience, 54:16 can we give ourselves a round of applause for everybody in this room?
54:21 [APPLAUSE] This has been fantastic. 54:23 [APPLAUSE] >> And 54:26 we'll see you tonight at the Garage. 54:32 And please lift one more round of applause, Mr Rand Fishkin, whoo! 54:33 >> [APPLAUSE] 54:37