After building multiple sites, here’s what I learned about organic traffic and SEO
I’ve built multiple apps and websites, and I know the pain of sitting in front of a screen for long hours just to get every detail right. But the real pain comes when, despite all the hard work and countless hours you put in, your product fails to get adequate traction online.
Running ads on social platforms costs a fortune, especially when you are just starting out. Moreover, as soon as you stop running the ads, your website traffic, that is, the number of people who visit your site, dips significantly.
There is only one way to achieve organic growth without putting a dent in your wallet: making search engines like Google index your site in their search results.
But before Google can index your site, you need to show Google what to index. This entire process of building a ‘search engine presence’ is called Search Engine Optimization (SEO).
I personally struggled with SEO a lot while building websites, so I have put all my knowledge and experience into this ultimate guide to SEO for beginners and JavaScript developers.
For easier readability, I will break this post into two parts.
Part I: What is SEO and how does it work?
Part II: SEO in websites that use JavaScript
Since this is a beginner’s guide to SEO, I expect you to be familiar with HTML and basic JavaScript.
Let’s get started.
Part I. What is SEO and how does it work?
SEO is the key to organic growth in the number of visitors to your website. The word “organic” is the keyword here. Organic growth means unpaid traffic, the visitors who visit your site purely because of the content you offer and not by the ads you run.
Search results
From the image above, you can see that when I Googled ‘Guide to Investing’, Google showed me the most relevant, well-optimized sites that matched my search query.
But how did Google find these sites?
There are various stages and factors that determine whether Google can index your site. Refer to the image below for a quick look at these factors; the lower a factor appears, the more important it is.
Image made by the author
Search engines like Google and Bing use something called “crawlers”, which is just a fancy term for a program that discovers and scans websites.
The program opens a website, follows every link present on that website, and then does the same on each linked page until it reaches a page without any further links.
For example, the program opens this article, then finds a link to my profile, and then goes to my profile where I have all my blogs listed. It will then go through each blog post and index them.
Google’s crawler is called Googlebot, and it goes through the HTML of the site for content and links. This is the first step in SEO.
Adding links is usually enough, but you can go a step further and control which pages the crawler should visit and which it shouldn’t, for reasons such as duplicate content or outdated URLs. To achieve this, you can include a robots.txt file in the root directory of your website. You can find a more precise guide here.
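As a minimal sketch, a robots.txt could look like the one below, which allows crawling of everything except a couple of sections (the paths and sitemap URL are placeholders, not from a real site):
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of pages with duplicate or private content
Disallow: /admin/
Disallow: /drafts/
# Point crawlers at the sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml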
From the image above, the next step is optimizing keywords. Keywords are crucial to rank your website high on Google search.
Ask yourself how many times you have clicked through to page 2 of a Google search.
Therefore, it is more crucial than ever to make sure you appear on the first page of the search results.
Keywords are nothing but words that resonate with the reader’s search query. For example, if you have a blog about innovation, the keywords would be along the lines of “machine learning”, “renewable energy”, and “flexible display”, depending on your blog’s focus.
These are the kinds of trending terms that, used naturally in your content, make for proper keyword optimization.
But how will you know which keywords are trending and relevant? There are various tools dedicated to providing you with excellent keywords.
While most of them are paid, Google offers free keyword research tools, and personally, I’ve had much more success with these free tools than with the paid ones.
For beginners, I highly recommend using Google Trends since it is fairly straightforward. Additionally, you can compare the popularity of keywords as well as get insights from particular countries and time periods.
Comparison between Football and Cricket search trends. Image by the author.
For a relatively advanced use case, you can try Google’s Keyword Planner. The main reason I suggest using Google’s tools is that Google’s search engine has over 92% market share, so optimizing for Google Search rather than Bing or Yahoo is the smart choice.
With the information I have given so far, you should already be comfortable with the basics of SEO. However, the next stages, meta tags and overall experience, are important too.
Most beginners miss out on these two final stages and it affects their traffic drastically.
Meta tags provide valuable information called “metadata” about an HTML document. They are normal HTML tags, but they have no impact on the visuals of the webpage; instead, they are read by the crawler.
If you have built a website, chances are you have used these tags knowingly or unknowingly. Below is a code snippet containing a good set of meta tags.
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <!-- The title and description are what typically appear in search results -->
  <title>Title Meta Tag</title>
  <meta name="description" content="A precise description of the webpage">
  <!-- Tells crawlers not to index this particular page -->
  <meta name="robots" content="noindex">
</head>
All these tags go between the head tags. Although title and description are the most important, other meta tags like ‘robots’ are useful as well.
The <meta name="robots" content="noindex"> tag tells the crawler not to index this page. You can find a full list of these tags here, but I’ve shared the most important ones to get you started.
Last but not least, we have the overall experience as the final factor affecting SEO. This one is pretty self-explanatory: it refers to the overall loading speed of your site and how the content is presented.
Most sites built with JavaScript frameworks like React and Vue are blazingly fast anyway, but if you load data from external databases, make sure it does not affect the loading time too much.
Additionally, be sure to use semantic tags, like heading tags (<h1></h1>) for headings, because this makes it easier for crawlers to understand the content.
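As a small illustrative sketch, a semantically structured page might look like this (the headings and text are placeholders):
<body>
  <!-- One h1 describing the page's main topic -->
  <h1>Guide to Investing</h1>
  <article>
    <!-- Subtopics use h2, h3, ... in order, so crawlers can infer the outline -->
    <h2>Why start early?</h2>
    <p>Compound interest rewards time in the market...</p>
  </article>
</body>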
In brief, Part I has covered the factors affecting how SEO works and how core fundamentals like tags and keywords can boost your rankings.
Part II. SEO in websites that use JavaScript
If you have read Part I carefully, you must have noticed that crawlers only refer to the HTML of the page.
But most websites these days are becoming increasingly complex and therefore incorporate JavaScript to add dynamic content loading and functionality that isn’t possible with plain HTML.
Luckily, Google has us covered. They have written an in-depth guide to understanding SEO with JavaScript.
Google’s process to crawl. Source: Google.
As shown in the image above, Googlebot (Google’s crawler) processes JavaScript web apps in three main phases:
Crawling
Rendering
Indexing
Since JavaScript injects content into the HTML at runtime, the initial HTML response does not contain the full page, and the crawler cannot index what it cannot see.
Therefore, Googlebot queues the page for rendering, executes the JavaScript, and waits for rendering to finish before indexing the fully rendered HTML.
So for sites using JavaScript, you don’t really need to do much. You can always follow the guide provided by Google. The one thing I do suggest is using the Rich Results Test to debug and understand how the crawler interprets your site’s content.
But what if you use a JavaScript framework such as React?
React and other such frameworks build the DOM virtually. The server sends an almost empty HTML container with the bare minimum of tags, and JavaScript then populates it in the client’s browser. This process is called client-side rendering (CSR).
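For instance, the initial HTML shipped by a typical client-side-rendered React app looks something like the sketch below; until the JavaScript bundle runs, this empty shell is all a crawler receives (the file names are placeholders):
<!DOCTYPE html>
<html>
  <head>
    <title>My App</title>
  </head>
  <body>
    <!-- React mounts the entire app into this empty div at runtime -->
    <div id="root"></div>
    <script src="/static/js/bundle.js"></script>
  </body>
</html>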
It is quite challenging to implement SEO on these single-page applications (websites made with Vue, React, Angular, etc.). However, you can always render these pages on the server and then send the final HTML to the user or the crawler.
This is called server-side rendering (SSR). The server sends a completely rendered page, so the crawler sees the finished product with all the links and content already injected on the server side.
To render React on the server, the popular choice is Next.js, which is built on top of React and greatly simplifies things like routing. Most importantly, it supports rendering content on the server, as well as pushing out static pages to the user, both of which are great for SEO.
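As a minimal sketch of how this looks in Next.js (using its pages router; the API URL is a placeholder, not a real endpoint):
// pages/posts.js
// getServerSideProps runs on the server for every request,
// so the crawler receives fully rendered HTML.
export async function getServerSideProps() {
  const res = await fetch('https://example.com/api/posts'); // placeholder endpoint
  const posts = await res.json();
  return { props: { posts } };
}

export default function Posts({ posts }) {
  return (
    <ul>
      {posts.map((post) => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  );
}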
Similarly, for Angular we have Angular Universal, and for Vue we have Nuxt.js. Using a server-side rendering framework gives you the best of both worlds: the flexibility of a framework and the SEO of a static site.
What about apps that are already built with client-side frameworks like Vue and React?
Moving a fully-fledged app to a new framework, such as moving your React app to the Next.js framework, can be quite challenging and time-consuming.
However, all hope is not lost.
There are many packages available that provide the ability to add meta tags to enhance SEO, the benefits of which I have discussed in Part I of this article.
To manage the dynamic addition of tags in Angular, you can use the @ngx-meta/core package. Similarly, for Vue we have vue-meta, and for React we have react-helmet.
All these packages are quite simple to use, and I have shared the ones for the three most widely used frameworks; most other frameworks have a similar package available.
Using them looks roughly the same regardless of which framework you are on, as you can see below.
Angular(left), React(center) and Vue(right). Source: author.
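To make the React case concrete, here is a minimal sketch using react-helmet (the component name, title, and description are placeholders):
import React from 'react';
import { Helmet } from 'react-helmet';

function BlogPost() {
  return (
    <article>
      {/* Helmet injects these tags into the document head at runtime */}
      <Helmet>
        <title>My Blog Post</title>
        <meta name="description" content="A short summary of this post." />
      </Helmet>
      <h1>My Blog Post</h1>
    </article>
  );
}

export default BlogPost;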
It is worth noting that if you are just starting your project, you should opt for a server-side rendering framework and escape all the trouble of retrofitting SEO later.
Using the JAMstack is also a viable option, since pages are pre-built and served from a CDN; it provides a better SEO score than most purely client-rendered front ends, without hampering performance.
Conclusion
With the growing demand for digital content, it is becoming more important than ever to structure your websites appropriately to get as much traffic as possible.
Although running ads can definitely boost your traffic, mastering SEO ensures long-term, stable, organic growth.
Therefore, I have tried to explain the art of SEO by demystifying the terms commonly associated with SEO and marketing, giving you and your content the best chance at high organic traffic.
Most developers think learning and implementing proper SEO has a steep learning curve, but in truth, they already know most of the technicalities involved and have probably implemented some of them unknowingly.