What I've Learned After 20+ Years of SEO

in #seo · 8 years ago (edited)

[Image: Google logo]

Starting out as a programmer, first with Visual Basic and C, then C++, I found a bit of a niche working with ASP and HTML. Twenty-five years ago Tim Berners-Lee built the first web page using Hypertext Markup Language, and it was completely static; dynamic web pages were a concept to be implemented later on. As more pages became available on the internet, a search engine named Archie was created in 1990 by Alan Emtage, a student at McGill University in Montreal. It was basically an indexing engine for the file listings on public FTP servers, so users could search file names for what they were looking for; the early web search engines applied the same idea to the words in a page's title, sorting and displaying results based on what the user entered. We're not going to get into Google and algorithms and all that just yet; we're really going to look at the premise, because one of the reasons I'm so good at my job is that if you understand where it started, you can understand where it's going.

You see, the object of the exercise in developing a search engine is to match the most relevant websites with the user's input, regardless of what the user is searching for. At first it was just a keyword-matching service: very basic keywords were matched first against the title, and later, as search engines became more sophisticated, against body text. Then, as the number of web pages entered the millions, Inktomi, Infoseek, and MSN started developing algorithms that would crawl full web pages, not just what was planted in the meta tags. And so came an era of manipulating search engines to display a particular page ahead of others: SEO.

When Google launched, its algorithm applied a scoring method that would "read" the text on a page and rank it based on how much verbiage was there and how many times a keyword or search phrase appeared. Added to that was another scoring method, PageRank, which gauged how many links were pointing to a website, giving some websites and pages authority over others. (Note: the public PageRank score may have been removed recently, but it is still alive and well internally.) That was easily reverse-engineered at the time: one could pile links onto a website and plant lots of keywords on the page to artificially boost its rankings. It really wasn't until only a few years ago that Google released updates to its core algorithm that could combat that (Panda, Penguin, Hummingbird, etc.).

Almost ten years ago, Google released its own tool called Webmaster Tools, and the object of the exercise there was what anyone could imagine: if a page or site is spam, or the quality of its text or information is low, then one would expect it to be a "churn and burn" website, and it would never be added to the Webmaster Tools system, where an actual webmaster would care about the crawl rate and the other indicators present inside such a tool.

At first this seemed to do a pretty good job of sorting the spam from the real websites, and it was treated as a kind of intermediary. But, as you can imagine, the driving factor behind releases such as Hummingbird, Panda, and Penguin was simply that the Webmaster Tools system wasn't doing an adequate job, even after it was revamped and renamed the Google Search Console.

Now they count things like link velocity, potential virality, social signals, hang time, link diversity, anchor-text diversity, bounce rate... the list goes on and on. The AI that drives the bots has been updated from a simple text-crawling system to a machine-learning system. That being the case, the actual text on the page has to pass a minimum Flesch–Kincaid readability score. That's fine for them, you, and me, because most real websites can easily pass it. Keep in mind, however, that poor grammar and misplaced punctuation will drop the score significantly, which is exactly what they wanted to do in the first place.
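For reference, the classic Flesch Reading Ease score can be sketched in a few lines of Python. The syllable counter below is a rough vowel-group heuristic rather than a dictionary lookup, so treat the numbers as approximations; higher scores mean easier reading.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: each run of consecutive vowels counts as one syllable.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    # Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

print(flesch_reading_ease("The cat sat on the mat."))  # short words score high
```

Long sentences full of multi-syllable words drag the score down, which is the mechanism behind that "minimum" threshold.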

There are so many factors involved in ranking websites now. If, for example, we consider the implications of social-media virality and followership, we find sites out there ranking really well for difficult, highly competitive keywords that actually don't have that many links pointing to them. Those results are driven solely by social-media indicators and by the number of people who visit the website through links shared in places such as Facebook and G+. I would have to say that if I were going to rank a website for any keyword or keyword phrase these days, I would have to encompass many different metrics from many different places, if only to ensure the success of my campaign.

And of course, inside of the webmaster guidelines is a very clear statement that any attempts at manipulation or artificial rank boosts are strictly forbidden.

See, now SEO has become a completely different animal than it was when I first started. If I were to begin an SEO project for a new customer, the first thing I would do is go to a tool like SEMrush, or even the Webmaster Tools system, to derive some existing keywords and phrases to begin the job; find the competitors that are ranking; and independently re-create whatever they did to get there through reverse engineering and research.

I'm pretty sure most of the people reading this are looking for insight into what they can do now, because they may be buried on the fifth page or worse. Maybe you're looking for some tools that can boost your rankings ahead of everybody else as quickly as possible? So let's get to that.

The definitive list and instructions for how to SEO a website in 2016.

First, go into SEMrush, search for your domain, then use their tool to find competitors.
Second, enter your website into Webmaster Tools and find out where your keyword placements are and what the impression counts look like.
Use this information as a baseline, because Webmaster Tools is basically telling us what Google thinks the site is about and where you are positioned for topic relevance.

Begin searching inside Google for footprints. What that means is: search for a keyword, and immediately after it add "WordPress", "leave a reply", or "leave a comment". This sorts blogs out from business websites, because those footprints are generally associated with posts. And where there are posts, there are people.
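The step above can be sketched as a tiny query builder; the keyword and footprint strings here are just illustrative examples, not a fixed list.

```python
# Assemble Google "footprint" queries for a keyword (example values only).
keyword = "organic dog food"
footprints = ['"powered by WordPress"', '"leave a reply"', '"leave a comment"']

# Each query pairs the keyword with one footprint phrase in quotes,
# so Google returns pages containing both.
queries = [f"{keyword} {footprint}" for footprint in footprints]
for query in queries:
    print(query)
```

Paste each line into Google (or feed it to a scraper you're comfortable with) and collect the blog URLs that come back.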

Get a good list of email addresses together, combined with websites, and reach out to the webmasters and authors of those sites to see if you can attain a link somehow: a paid link (although that's not allowed as far as Google is concerned), a guest blog, or some other mutually beneficial business arrangement.

Next, take it to social, beginning with Facebook. Search inside Facebook for your keyword and gather a list of pages and places where you can get in touch with influencers. Establish some kind of dialogue with them so that they link to you and pass you some of their juice, perhaps by creating a guest post for them, or by simply asking them for a mention. It's as easy as asking, and the worst they can say is no.

Next, hire a writer, whether on contract, full-time, or by going somewhere like WriterAccess and purchasing a few articles and blog posts. As I said before, they're going to score what you put on the page, so it has to be legible, grammatical, and well written. Populate your website with this content only!

After you've populated your content, you can use some of the same articles rewritten, or buy new ones, as guest blogs on other websites for links back.

OK now to social.
You need an account on every social media network out there representing you or your business. This is paramount.
Every single one of your pages needs to be shared on these social networks. Each share is an easy, decent link to your website.
Don't do it all at once; do it over a period of about a month.
In the interim you should post things I call gray matter: shared posts, entertaining items, memes, aggregated content you found on other websites, and so on. The optimal amount of posting, I'd say, is at least once a day, up to seven or eight times if you can.
Don't promote these posts; you don't promote the gray matter. Instead, post one of your decent pieces of content every two or three days and use Facebook to boost it to target demographics you select inside the Ads Manager. Basically, this creates a single post with likes and shares pointing back to your website. It's not actual virality, but as far as Google and Bing are concerned it is a decent piece of content linking to your website, and that link will be increased in weight.

For quick and dirty monetary gratification, deciding how much you can spend on PPC is usually the best way to earn ROI quickly and effectively. I've told several clients that the best way to gauge how much to spend in a cost-per-click system is quite simple: take your profit from the sale in dollars and multiply it by 2%; that is your max CPC. If you find it's only a few cents, that's fine. Set your max CPC and don't think about it again; just make sure you get a really good Quality Score for that keyword. You can do this by creating a landing page per keyword set, and by optimizing your ad copy so that you get a nice high click-through rate.

Dealing with information products versus tangible goods

He who has the most information has the most power. Or knowledge is power. Or something like that, anyway.

Information products are amazing. I've created a few of my own and developed some software, and I think the best part of information products and software is that they're like a photocopier where you don't have to pay for ink or paper: you can resell the same thing over and over again for a decent price and never have to worry about overhead.
The problem with information and software is that they expire. They lose their relevance, and facts can change as research broadens knowledge.

Tangible goods, on the other hand, give you the opportunity for several landing pages and several keyword sets; products and services lead to more opportunities for ranking well. This remains a preference in the SEO world because you will have a page that, for the most part, won't change, yet everything on it stays relevant and evergreen. You can go through the products in the store several times and just change the text on a page a little, add some content, take some away, add some images... but the page itself remains, and the links to that page remain.

Remember SEO is a long game.
Quick and dirty .. that's PPC.
