Is it okay to add many different sites?

Webmaster Guru Asked on December 21, 2016 in Search Console.
Add Comment
1 Answer(s)

JOHN MUELLER (Webmaster Trends Analyst from Google):

So first off, one thing that I see from smaller and larger sites all the time is that they have a ton of different subdomains for essentially the same content. I worked with a big European airline recently, and they didn't really know what was happening in Search Console. One thing I noticed is that they have all of these different subdomains serving very similar content, which means you have to add all of these subdomains to Search Console, combine all of that data, and then look at the combined view before you can really understand what's actually happening. So fewer sites is a lot easier. In your main account, verify all of the different site variations, but only delegate access to the preferred version. Instead of delegating access to www and all of these variations, just pick the one that you really want to have indexed and grant access to that. And make sure you're collecting all the email that comes in regardless; maybe set up some filters to make sure that the important emails you get through Search Console actually reach someone.

One thing that you can also do, if you have a bunch of different sites and you can't easily just clean them up (sometimes it's not as easy as it sounds), is use the property sets feature in Search Console. This is a great way to get one view of multiple sites at the same time. I use this, for example, when looking at our help forums, because we have HTTP and HTTPS kind of semi-randomly indexed (we never managed to clean that up properly on the forum side), and when you want to look at the overall picture, you need to combine all of these versions into a single entry and look at it like that. Not all reports support property sets yet, but more and more reports do. Some important things to keep in mind: property sets are per user and can't be shared; you need to be verified for all of the variations that you include; and they don't backfill the data. So if you decide in January that you'd like to see a combined view of the data from December, you can only do that if you've set up property sets already, so make sure you set those up early.

One thing that is sometimes confusing when you're looking at Search Analytics is that the data is aggregated for the set. If you're looking at impressions per query, pages per query, those kinds of things, keep in mind this is for the whole set. If you have multiple sites ranking independently for the same keywords in search, then we will pick one of those, count that as one impression, and we will pick the top one as the average top ranking for that query. So it's not the case that you can just add up the individual sites and see the combined view; this is essentially doing something pretty smart that helps you understand how much visibility you have overall, not per URL.

One thing that also comes up all the time is how aggregate reports work. The important thing to keep in mind here is that these are based on indexing: they're not based on your current site, they're based on what we found when we crawled and indexed your site. So if you make significant changes on your site, it's normal that it will take some time to settle down again. In this case there was a test run here, and you can see some of these dropped (this was, I think, the indexed pages report). Then they moved to HTTPS completely; they did the full move on this date, and you can see that a lot of the URLs dropped out fairly quickly because they're now on the HTTPS version, but for the rest it takes quite some time to actually settle down completely. This is not something that you need to force, not something that you need to manually tweak to make go away; this is essentially normal crawling and indexing. You'll see the same thing if you fix an issue with structured data or if you change your AMP implementation: you'll see a fairly sizable chunk of the URLs move fairly quickly, and the rest just take some time.
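The "verify every variation, delegate only the preferred one" advice can be sketched as a small helper that enumerates the four common protocol/host variations of a bare domain. The function name and the trailing-slash URL format are illustrative assumptions, not part of any Search Console API:

```python
# Hypothetical helper (illustrative, not a Search Console API call):
# enumerate the http/https and www/non-www variations of a domain that
# would each need to be verified as separate properties.
from itertools import product

def site_variations(domain: str) -> list[str]:
    """Return the four common Search Console property variations for a bare domain."""
    schemes = ("http://", "https://")
    hosts = ("", "www.")
    return [f"{scheme}{host}{domain}/" for scheme, host in product(schemes, hosts)]
```

For example, `site_variations("example.com")` yields the http and https versions of both `example.com` and `www.example.com`; you would verify all four but grant delegated access only to the one canonical version you want indexed.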
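The set-level aggregation described here (a shared impression counted once per query, with the best-ranking member site reported) can be approximated in a short sketch. The data shapes and the max/min combination rule are assumptions for illustration only; the real Search Console aggregation works on per-result-page data that isn't exposed this way:

```python
# Illustrative sketch of property-set aggregation (assumed data shapes,
# not the real Search Console computation).
# per_site maps site -> {query: (impressions, average_position)}.
def combine_query_stats(per_site):
    """Combine per-site query stats the way a property set roughly would:
    a shared impression is counted once (not summed per site), and the
    best (numerically lowest) position among member sites is reported."""
    combined = {}
    for stats in per_site.values():
        for query, (impressions, position) in stats.items():
            if query not in combined:
                combined[query] = (impressions, position)
            else:
                prev_imp, prev_pos = combined[query]
                # One impression per search result page for the whole set,
                # approximated here by taking the max rather than the sum.
                combined[query] = (max(prev_imp, impressions),
                                  min(prev_pos, position))
    return combined
```

The key point the sketch illustrates: if two member sites each had 100 impressions for the same query, the set reports roughly 100, not 200, and the position shown is the top-ranking site's position.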

Webmaster Guru Answered on December 21, 2016.
Add Comment


Your Answer

By posting your answer, you agree to the privacy policy and terms of service.