Affiliated Business News

This post is perhaps a bit overdue, but after speaking at the Content Marketing Show about working with bloggers, I figured that some of you might be interested in becoming one of them. I’m here to gently persuade you to give it a go, purely to see if it can help you within your own […]

The post Why Blogging Has Made Me A Better Marketer (http://white.net/blog/why-blogging-has-made-me-a-better-marketer/) appeared first on White.net.


Yesterday Google shared that they see greater mobile than desktop search volume in 10 countries, including Japan and the United States.


3 years ago RKG shared CTR data which highlighted how mobile search ads were getting over double the CTR of desktop search ads.


The basic formula: less screen real estate = higher proportion of user clicks on ads.
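To make that formula concrete, here is a toy calculation (the counts of visible links are assumptions for illustration, not measured data): if users spread clicks roughly evenly across whatever links fit above the fold, shrinking the organic portion of the screen raises the ads' share of clicks.

```python
# Illustrative only: assumed counts of links visible above the fold.
# If users click roughly uniformly among visible links, the expected
# share of clicks going to ads rises as the screen shrinks.

def ad_click_share(visible_ads, visible_organic):
    """Expected fraction of above-the-fold clicks landing on ads,
    assuming clicks spread evenly across all visible links."""
    total = visible_ads + visible_organic
    return visible_ads / total

desktop = ad_click_share(visible_ads=3, visible_organic=6)   # ~0.33
mobile = ad_click_share(visible_ads=3, visible_organic=1)    # ~0.75

print(f"Desktop ad share of visible links: {desktop:.0%}")
print(f"Mobile ad share of visible links:  {mobile:.0%}")
```

Same number of ads, far fewer visible organic results, and the expected ad share of clicks more than doubles.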


Google made a big deal of their "mobilepocalypse" update to scare webmasters into making their sites mobile friendly. Part of the goal of making sites "mobile friendly" is to ensure they aren't too ad dense (which in turn lowers accidental ad clicks & lowers monetization). Not only does Google have an "ad heavy" relevancy algorithm which demotes ad-heavy sites, but they also explicitly claim that even using a moderately sized ad unit above the fold on mobile devices is against their policy guidelines:


Is placing a 300x250 ad unit on top of a high-end mobile optimized page considered a policy violation?


Yes, this would be considered a policy violation as it falls under our ad placement policies for site layout that pushes content below the fold. This implementation would take up too much space on a mobile optimized site's first view screen with ads and provides a poor experience to users. Always try to think of the users experience on your site - this will help ensure that users continue to visit.
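The policy answer is easy to sanity-check with arithmetic. Assuming a common 360x640 CSS-pixel phone viewport (an assumed example value, not a figure from Google's policy), a single 300x250 unit covers roughly a third of the first screen before any header or navigation is drawn:

```python
# Back-of-the-envelope: how much of a mobile "first view" screen does
# a 300x250 ad unit occupy? Viewport size is an assumed example value.
viewport_w, viewport_h = 360, 640   # CSS pixels, common phone viewport
ad_w, ad_h = 300, 250               # standard medium rectangle ad unit

coverage = (ad_w * ad_h) / (viewport_w * viewport_h)
print(f"A 300x250 unit covers {coverage:.0%} of a {viewport_w}x{viewport_h} viewport")
# -> roughly 33%, before accounting for browser chrome, site header
#    and navigation that also sit above the fold.
```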


So if you make your site mobile friendly you can't run Google ads above the fold unless you are a large enough publisher that the guidelines don't actually matter.


If you spend the extra money to make your site mobile friendly, you then must also go out of your way to lower your income.


What is the goal of the above sort of scenario? Defunding content publishers to ensure most of the ad revenues flow to Google.


If you think otherwise, consider the layout of the auto ads & hotel ads Google announced yesterday. Top of the search results, larger than 300x250.



If you do X, you are a spammer. If Google does X, they are improving the user experience.


@aaronwall they will personally do everything they penalize others for doing; penalties are just another way to weaken the market.— Cygnus SEO (@CygnusSEO) May 5, 2015


The above sort of contrast is something non-SEOs notice too. The WSJ article about Google's new ad units included a reader comment stating:


With this strategy, Google has made the mistake of an egregious use of precious mobile screen space in search results. This entails much extra fingering/scrolling to acquire useful results and bypass often not-needed coincident advertising. Perhaps a moneymaker by brute force; not a good idea for utility’s sake.


That content displacement with ads is both against Google's guidelines and algorithmically targeted for demotion - unless you are Google.


How is that working for Google partners?



According to eMarketer, by 2019 mobile will account for 72% of US digital ad spend. Almost all of that growth in ad spend flows to the big ad networks, while other online publishers struggle to monetize their audiences:


Facebook and Google accounted for a majority of mobile ad market growth worldwide last year. Combined, the two companies saw net mobile ad revenues increase by $6.92 billion, claiming 75.2% of the additional $9.2 billion that went toward mobile in 2013.


Back to the data RKG shared. Mobile is where the growth is...



...and the smaller the screen size the more partners are squeezed out of the ecosystem...



The high-intent, high-value search traffic is siphoned off by ads.


What does that leave for the rest of the ecosystem?


It is hard to build a sustainable business when you have to rely almost exclusively on traffic with no commercial intent.


One of the few areas that still works well is evergreen content with little maintenance cost, but even many of those pockets of opportunity are disappearing due to the combination of the Panda algorithm and Google's scrape-n-displace knowledge graph.


.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f— dan barker (@danbarker) February 27, 2014


Even companies with direct ad sales teams struggle to monetize mobile:


At The New York Times, for instance, more than half its digital audience comes from mobile, yet just 10% of its digital-ad revenue is attributed to these devices.
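Those two figures imply a large per-visitor gap. A rough illustrative calculation, treating the quoted "more than half" as 50% and ignoring overlap between device categories:

```python
# Illustrative arithmetic based on the quoted NYT figures:
# ~50% of the digital audience on mobile, ~10% of digital ad revenue.
mobile_audience_share = 0.50
mobile_revenue_share = 0.10

mobile_rev_per_user = mobile_revenue_share / mobile_audience_share                # 0.20
desktop_rev_per_user = (1 - mobile_revenue_share) / (1 - mobile_audience_share)   # 1.80

ratio = mobile_rev_per_user / desktop_rev_per_user
print(f"A mobile visitor is worth ~{ratio:.0%} of a desktop visitor")  # ~11%
```

In other words, under these assumptions a mobile visitor monetizes at roughly a ninth of the rate of a desktop visitor.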


Other news websites also get the majority of their search traffic from mobile.


Why do news sites get so much mobile search traffic? A lot of it is navigational & beyond that most of it is on informational search queries which are hard to monetize (and thus have few search ads) and hard to structure into the knowledge graph (because they are about news items which only just recently happened).


If you run a site which isn't a news site and look at the organic search traffic breakdown in your analytics account, you will likely see a far lower share of search traffic from mobile. This goes back to Google dominating the mobile search interface with ads.
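If you want to check your own numbers, here is a minimal sketch, assuming you export an organic-traffic report as a CSV with hypothetical deviceCategory and sessions columns (your analytics export will likely use different column names):

```python
import csv
from collections import defaultdict

# Hypothetical export: one row per device category for organic search traffic,
# with columns "deviceCategory" and "sessions". Adjust names to your export.
sessions = defaultdict(int)
with open("organic_traffic.csv", newline="") as f:
    for row in csv.DictReader(f):
        sessions[row["deviceCategory"].lower()] += int(row["sessions"])

total = sum(sessions.values())
mobile_share = (sessions["mobile"] + sessions.get("tablet", 0)) / total
print(f"Mobile (+tablet) share of organic search sessions: {mobile_share:.1%}")
```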


Mobile search ecosystem breakdown


  • traffic with commercial intent = heavy ads
  • limited commercial intent but easy answer = knowledge graph
  • limited commercial intent & hard to answer = traffic flows to news sites
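The breakdown above can be restated as a simple decision rule; this is only a toy restatement of the list, not anything Google has published:

```python
def who_captures_the_click(commercial_intent: bool, easy_to_answer: bool) -> str:
    """Toy restatement of the mobile search ecosystem breakdown above."""
    if commercial_intent:
        return "heavy ads"           # high-value traffic monetized by Google
    if easy_to_answer:
        return "knowledge graph"     # structured answer, no click out
    return "news / publisher sites"  # hard-to-answer informational queries

print(who_captures_the_click(commercial_intent=True, easy_to_answer=False))
print(who_captures_the_click(commercial_intent=False, easy_to_answer=True))
print(who_captures_the_click(commercial_intent=False, easy_to_answer=False))
```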

Not only is Google monetizing a far higher share of mobile search traffic, but they are also aggressively increasing minimum bids.


As Google continues to gut the broader web publishing ecosystem, they can afford to throw a few hundred million in "innovation" bribery kickback slush funds. That will earn them some praise in the short term with some of the bigger publishers, but it will make those publishers more beholden to Google. And it is even worse for smaller publishers. It means the smaller publishers are not only competing against algorithmic brand bias, confirmation bias expressed in the remote rater documents, & wholesale result set displacement, but some of their bigger publishing competitors are also subsidized directly by Google.


Ignore the broader ecosystem shifts.


Ignore the hypocrisy.


Focus on the user.


Until you are eating cat food.



Categories: google
Website speed and page load times are important for both search engine rankings and user satisfaction, so optimizing your website's performance is key.
Search Console users will be able to analyze how their indexed app content is performing in Google’s search results.
A month or so ago I was watching a video on YouTube in which John Reese, Frank Kern, and Tony Robbins were talking about their frustration that people who buy their training programs rarely take action. The really interesting thing is when they talked about how the SAME people keep buying products for thousands of dollars and then email them saying they don't know how to get started. That video really got me thinking. After all, I have sold over 25,000 units of my 2 training products (over ten million dollars worth) focused on educating people on…

Online designer clothing retail behemoth Net-a-Porter, not content with 6 million unique visits a month to its desktop site, yesterday launched its new app, the Net Set. Having been impressed with their content strategy in the past year, which included the expansion of its online magazine and the creation of a dedicated print magazine, I […]


The post Net-a-Porter launch their own social platform, The NET SET, and we've reviewed it! appeared first on White.net.


The Truth About Subjective Truths


A few months ago there was an article in New Scientist about Google's research paper on potentially ranking sites based on how factual their content is. The idea is generally and genuinely absurd.
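For context, the core idea of that paper, heavily simplified here (this sketch is my own illustration, not the paper's actual model), is to score a page by the fraction of extracted factual claims that agree with a reference knowledge base:

```python
# A heavily simplified illustration of "rank by factual accuracy":
# extract (subject, predicate, object) claims from a page and score the
# page by agreement with a reference knowledge base. Data is made up.
knowledge_base = {
    ("earth", "orbits", "sun"),
    ("water", "boils_at_c", "100"),
}

def factual_score(extracted_claims):
    """Fraction of a page's extracted claims found in the knowledge base."""
    if not extracted_claims:
        return 0.0
    hits = sum(1 for claim in extracted_claims if claim in knowledge_base)
    return hits / len(extracted_claims)

page_claims = [("earth", "orbits", "sun"), ("water", "boils_at_c", "90")]
print(factual_score(page_claims))  # 0.5
```

The obvious problem is that anything outside the reference knowledge base's consensus scores as "less factual", regardless of whether it is actually true.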


For a search engine to be driven primarily by group think (see unity100's posts here) is the death of diversity.


Less Diversity, More Consolidation


The problem is rarely attributed to Google, but as ecosystem diversity has declined (and entire segments of the ecosystem are unprofitable to service), more people are writing things like: "The market for helping small businesses maintain a home online isn’t one with growing profits – or, for the most part, any profits. It’s one that’s heading for a bloody period of consolidation."


As companies grow in power, that power gets monetized. If you can manipulate people without appearing to do so, you can make a lot of money.


If you don't think Google wants to disrupt you out of a job, you've been asleep at the wheel for the past decade— Michael Gray (@graywolf) March 13, 2015


We Just Listen to the Data (Ish)


As Google sucks up more data, aggregates intent, and scrapes-n-displaces the ecosystem, they get air cover for some of their gray area behaviors by claiming things are driven by the data & putting the user first.


Those "data" and altruism claims from Google recently fell flat on their face when the Wall Street Journal published a number of articles about a leaked FTC document.


That PDF has all sorts of goodies in it about things like blocking competition, signing a low margin deal with AOL to keep monopoly marketshare (while also noting the general philosophy outside of a few key deals was to squeeze down on partners), scraping content and ratings from competing sites, Google force inserting itself in certain verticals anytime select competitors ranked in the organic result set, etc.


As damning as the above evidence is, more will soon be brought to light as the EU ramps up its formal statement of objections, as Google is less politically connected in Europe than in the United States:


"On Nov. 6, 2012, the night of Mr. Obama’s re-election, Mr. Schmidt was personally overseeing a voter-turnout software system for Mr. Obama. A few weeks later, Ms. Shelton and a senior antitrust lawyer at Google went to the White House to meet with one of Mr. Obama’s technology advisers. ... By the end of the month, the FTC had decided not to file an antitrust lawsuit against the company, according to the agency’s internal emails."


What is wild about the above leaked FTC document is it goes to great lengths to show an anti-competitive pattern of conduct toward the larger players in the ecosystem. Even if you ignore the distasteful political aspects of the FTC non-decision, the other potential out was:


"The distinction between harm to competitors and harm to competition is an important one: according to the modern interpretation of antitrust law, even if a business hurts individual competitors, it isn’t seen as breaking antitrust law unless it has also hurt the competitive process—that is, that it has taken actions that, for instance, raised prices or reduced choices, over all, for consumers." - Vauhini Vara


Part of the reason the data set was incomplete on that front was that, for the most part, only larger ecosystem players were consulted. Google engineers have gone on record stating they aim to break people's spirits in a game of psychological warfare. If that doesn't hinder consumer choice, what does?


@aaronwall rofl. Feed the dragon Honestly these G investigations need solid long term SEOs to testify as well as brands.— Rishi Lakhani (@rishil) April 2, 2015


When the EU published their statement of objections Google's response showed charts with the growth of Amazon and eBay as proof of a healthy ecosystem.



The market has been consolidated down into a few big winners which are still growing, but that in and of itself does not indicate a healthy or neutral overall ecosystem.


The long tail of smaller e-commerce sites which have been scrubbed from the search results is nowhere to be seen in such charts / graphs / metrics.


The other obvious "untruth" hidden in the above Google chart is that there is no way product searches on Google.com are included in Google's aggregate metrics. They are only counting the subset which clicks through a second vertical ad type, while ignoring Google's broader impact via the combination of PLAs, text-based AdWords ads and the knowledge graph, or even the recently rolled out rich product answer results.


Who could look at the following search result (during anti-trust competitive review no less) and say "yeah, that looks totally reasonable?"



Google has allegedly spent the last couple of years removing "visual clutter" from the search results & yet they manage to produce SERPs looking like that - so long as the eye candy leads to clicks monetized directly by Google or other Google-hosted pages.


The Search Results Become a Closed App Store


Search was an integral piece of the web which (in the past) put small companies on a level playing field with larger players.


It no longer is.


WOW. RT @aimclear: 89% of domains that ranked over the last 7 years are now invisible, #SEO extinction. SRSLY, @marcustober #SEJSummit— Jonah Stein (@Jonahstein) April 15, 2015


"What kind of a system do you have when existing, large players are given a head start and other advantages over insurgents? I don’t know. But I do know it’s not the Internet." - Dave Pell


The above quote was about app stores, but it certainly parallels a rater system which enforces the broken window fallacy against smaller players while looking the other way on larger players, unless they are in a specific vertical Google itself decides to enter.


"That actually proves my point that they use Raters to rate search results. aka: it *is* operated manually in many (how high?) cases. There is a growing body of consensus that a major portion of Googles current "algo" consists of thousands of raters that score results for ranking purposes. The "algorithm" by machine, on the majority of results seen by a high percentage of people, is almost non-existent." ... "what is being implied by the FTC is that Googles criteria was: GoogleBot +10 all Yelp content (strip mine all Yelp reviews to build their database). GoogleSerps -10 all yelp content (downgrade them in the rankings and claim they aren't showing serps in serps). That is anticompetitive criteria that was manually set." - Brett Tabke


The remote rater guides were even more explicitly anti-competitive than what was detailed in the FTC report. For instance, they required hotel affiliate sites to be rated as spam even if they were helpful, for no reason other than being affiliate sites.


Is Brand the Answer?


About 3 years ago I wrote a blog post about how branding plays into SEO & why it might peak. As much as I have been accused of having a cynical view, the biggest problem with my post was that it was naively optimistic. I presumed Google's consolidation of markets would end up leading Google to alter their ranking approach when they were unable to overcome the established consensus bias which was subsidizing their competitors. The problem with my presumption is that Google's reliance on "data" was a chimera. When convenient (and profitable), data is discarded on an as-needed basis.


Or, put another way, the visual layout of the search result page trumps the underlying ranking algorithms.


Google has still highly disintermediated brand value, but they did it via vertical search, larger AdWords ad units & allowing competitive bidding on trademark terms.


If Not Illegal, then Scraping is Certainly Morally Deplorable...


As Google scraped Yelp & TripAdvisor reviews & gave them an ultimatum, Google was also scraping Amazon sales rank data and using it to power Google Shopping product rankings.


Around this same time Google pushed through a black PR smear job of Bing for doing a similar, lesser offense to Google on rare, made-up longtail searches which were not used by the general public.


While Google was outright stealing third party content and putting it front & center on core keyword searches, they had to use "about 100 “synthetic queries”—queries that you would never expect a user to type" to smear Bing & even many of these queries did not show the alleged signal.


Here are some representative views of that incident:


  • "We look forward to competing with genuinely new search algorithms out there—algorithms built on core innovation, and not on recycled search results from a competitor. So to all the users out there looking for the most authentic, relevant search results, we encourage you to come directly to Google. And to those who have asked what we want out of all this, the answer is simple: we'd like for this practice to stop." - Google's Amit Singhal
  • “It’s cheating to me because we work incredibly hard and have done so for years but they just get there based on our hard work. I don’t know how else to call it but plain and simple cheating. Another analogy is that it’s like running a marathon and carrying someone else on your back, who jumps off just before the finish line.” Amit Singhal, more explicitly.
  • "One comment that I’ve heard is that “it’s whiny for Google to complain about this.” I agree that’s a risk, but at the same time I think it’s important to go on the record about this." - Matt Cutts
  • "I’ve got some sympathy for Google’s view that Bing is doing something it shouldn’t." - Danny Sullivan

What is so crazy about the above quotes is that Google engineers knew at the time what Google was doing with its own scraping. I mentioned that contrast shortly after the above PR fiasco happened:


when popular vertical websites (that have invested a decade and millions of Dollars into building a community) complain about Google disintermediating them by scraping their reviews, Google responds by telling those webmasters to go pound sand & that if they don't want Google scraping them then they should just block Googlebot & kill their search rankings


Learning the Rules of the Road


If you get a sense "the rules" are arbitrary, hypocritical & selectively enforced - you may be on to something:


  • "The bizrate/nextag/epinions pages are decently good results. They are usually well-format[t]ed, rarely broken, load quickly and usually on-topic. Raters tend to like them" ... which is why ... "Google repeatedly changed the instructions for raters until raters assessed Google's services favorably"
  • and while clamping down on those services ("business models to avoid") ... "Google elected to show its product search OneBox “regardless of the quality” of that result and despite “pretty terribly embarrassing failures” "
  • and since Google knew their offerings were vastly inferior, “most of us on geo [Google Local] think we won't win unless we can inject a lot more of local directly into google results” ... thus they added "a 'concurring sites' signal to bias ourselves toward triggering [display of a Google local service] when a local-oriented aggregator site (i.e. Citysearch) shows up in the web results”"
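The "concurring sites" signal quoted in the last bullet is essentially a trigger rule. A purely illustrative reconstruction of that logic (the domain list beyond Citysearch and the top-N threshold are my own placeholders, not taken from the FTC document):

```python
# Illustrative reconstruction of a "concurring sites" trigger: if a
# local-oriented aggregator ranks in the organic results, trigger the
# house local unit. Domains beyond Citysearch and the threshold are
# placeholder assumptions for the sketch.
LOCAL_AGGREGATORS = {"citysearch.com", "yellowpages.com"}

def should_trigger_local_onebox(organic_result_domains, top_n=10):
    """Trigger the local OneBox when any local aggregator 'concurs' by
    appearing in the top N organic results."""
    return any(d in LOCAL_AGGREGATORS for d in organic_result_domains[:top_n])

print(should_trigger_local_onebox(["example.com", "citysearch.com"]))  # True
```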

Google's justification for not being transparent is that "spammers" would take advantage of transparency to put inferior results front and center - the exact same thing Google does when it benefits the bottom line!


Around the same time Google hard-codes the self-promotion of their own vertical offerings, they may choose to ban competing business models through "quality" score updates and other similar changes:


The following types of websites are likely to merit low landing page quality scores and may be difficult to advertise affordably. In addition, it's important for advertisers of these types of websites to adhere to our landing page quality guidelines regarding unique content.


  • eBook sites that show frequent ads
  • 'Get rich quick' sites
  • Comparison shopping sites
  • Travel aggregators
  • Affiliates that don't comply with our affiliate guidelines

The anti-competitive conspiracy theory is no longer conspiracy, nor theory.


Key points highlighted by the European Commission:


  • Google systematically positions and prominently displays its comparison shopping service in its general search results pages, irrespective of its merits. This conduct started in 2008.
  • Google does not apply to its own comparison shopping service the system of penalties, which it applies to other comparison shopping services on the basis of defined parameters, and which can lead to the lowering of the rank in which they appear in Google's general search results pages.
  • Froogle, Google's first comparison shopping service, did not benefit from any favourable treatment, and performed poorly.
  • As a result of Google's systematic favouring of its subsequent comparison shopping services "Google Product Search" and "Google Shopping", both experienced higher rates of growth, to the detriment of rival comparison shopping services.
  • Google's conduct has a negative impact on consumers and innovation. It means that users do not necessarily see the most relevant comparison shopping results in response to their queries, and that incentives to innovate from rivals are lowered as they know that however good their product, they will not benefit from the same prominence as Google's product.

Overcoming Consensus Bias


Consensus bias is set to an absurdly high level to block out competition, slow innovation, and make the search ecosystem easier to police. This acts as a tax on newer and lesser-known players and a subsidy toward larger players.


Eventually that subsidy would be a problem for Google if the algorithm were the only thing that mattered; however, if the entire result set itself can be displaced, then that subsidy doesn't really matter, as it can be retracted overnight.


Whenever Google has a competing offering ready, they put it up top even if they are embarrassed by it and 100% certain it is a vastly inferior option to other options in the marketplace.


That is how Google reinforces, then manages to overcome consensus bias.


How do you overcome consensus bias?


Categories: google
After six years of rumors around the strength of the partnership, Yahoo and Microsoft have reached a new agreement, which splits ad sales and ends exclusivity.
Video content can be an important part of a rounded content strategy, but it can also cause serious SEO problems. Here are four ways your video content can cause engagement problems, along with how to keep that from happening.
I wrote a detailed review of the newest features AdStation's mailing platform rolled out earlier this year. You can read that here: http://www.shoemoney.com/2015/01/27/official-adstation-kicking-email-marketings-ass . But, since then, AdStation has continued to add features, functionality and product enhancements to their platform. I mean, most companies would just be happy to rest on their laurels and...


What is Affiliated Business?

Affiliated Business is a social network of bloggers, webmasters, and Internet entrepreneurs. It allows you to publish and share your news and to discover the best resources, tips, and ideas concerning affiliate marketing, blogging, homebusiness and making money online. You can submit your stories, vote for interesting news, take part in discussions, and network with other site users.
