Enhanced SEO in Next.js

Searching online has become synonymous with using our phones and computers. Whether we want answers, reviews, or information about a product or service, the first thing we tend to do is search online. Search engines rank their results based on readability, SEO practices, domain authority and other factors. 

This is where optimizing a website becomes valuable: search engines sort through terabytes of data to bring relevant information to the user, and SEO makes it easier for them to read and understand our applications. 

In today's competitive landscape, an effective SEO strategy can improve a website's ranking, which in turn drives organic traffic and more business.

Here are some SEO practices you can adopt to improve your online presence when building with Next.js. 


Challenges faced in Single Page Applications

In single-page applications, a single route loads the entire website in the browser. In React, a single index.html file is mounted, and all user interactions and page navigations are handled from that file. Search engines can only see that single page's content and metadata, which makes it difficult for them to index the rest of the site.

This can be overcome with Next.js, which pre-renders a separate HTML document for each page. React then mounts the content of each page when it is visited, rather than relying on the initial page alone. Next.js supports several techniques for SEO, as discussed below:

  1. Meta tags

Meta tags tell search engines how to crawl and index a website. In Next.js, they can be added inside the built-in “Head” component (imported from “next/head”).

    For example:

    <meta name="robots" content="noindex,nofollow" />

    With this tag, search engines will neither index the page nor follow its links, so it will not appear in search results.

    In the following example, there are no restrictions for indexing or crawling.

    <meta name="robots" content="all" />
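
    For instance, as a minimal sketch, the permissive tag above can be placed inside the Head component of a page; the page name and content here are only illustrative.

    import Head from "next/head";

    // Hypothetical page used only to show where robots meta tags go.
    export default function Products() {
      return (
        <>
          <Head>
            {/* Allow search engines to index this page and follow its links */}
            <meta name="robots" content="all" />
          </Head>
          <main>Product listing</main>
        </>
      );
    }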

  2. Metadata

    Metadata defines the purpose of our website. It can be generated dynamically so that it always reflects the latest content on the website. 

    There are mainly two components:

    • Title - Describes what the page is about. It is recommended to include important keywords in the title while keeping it readable for humans.
    • Description - Complements the title with a short summary of the page's content.

    // Example dynamic route, e.g. pages/products/[product].tsx
    import Head from "next/head";

    export default function About({ product }: { product: string }) {
      return (
        <>
          <Head>
            <title>{`My ${product} Store`}</title>
            <meta
              name="description"
              content={`Store with different varieties of ${product}`}
            />
            <meta property="og:title" content={`My ${product} Store`} />
            <meta property="og:description" content={`Different variety of ${product}`} />
            <meta property="og:url" content={`https://mystore.com/products/${product}`} />
            <meta property="og:type" content="website" />
            <meta name="viewport" content="width=device-width, initial-scale=1" />
          </Head>
          <main>
            <h1>What is SEO?</h1>
          </main>
        </>
      );
    }

    export async function getStaticProps({ params }: { params: any }) {
      return {
        props: {
          product: params.product,
        },
      };
    }

    export async function getStaticPaths() {
      return {
        paths: [{ params: { product: 'shoes' } }],
        fallback: true,
      };
    }

  3. Open Graph Protocol

    The Open Graph protocol enables any web page to become a rich object in a social graph. To turn a web page into a graph object, some basic metadata has to be added to it. 

    The basic properties to be included are:

    • og:title - The title of the object as it should appear within the graph.
    • og:type - The type of the object.
    • og:image - An image URL that represents the object within the graph.
    • og:url - The canonical URL of the object that will be used as its permanent ID in the graph.

    Along with these, other optional properties can also be added. For more details, please refer to https://ogp.me/.
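
    As a minimal sketch, the four basic properties could be added inside the Head component as follows; the store name, URLs, and image path are placeholders rather than real assets.

      import Head from "next/head";

      // Hypothetical page showing only the four basic Open Graph properties.
      export default function ProductPage() {
        return (
          <Head>
            <meta property="og:title" content="My Shoe Store" />
            <meta property="og:type" content="website" />
            {/* Placeholder image URL used only for illustration */}
            <meta property="og:image" content="https://mystore.com/images/shoes.jpg" />
            <meta property="og:url" content="https://mystore.com/products/shoes" />
          </Head>
        );
      }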

  4. Robots.txt file

    There are situations where we might want to prevent certain areas of a website, such as a CMS or API routes, from being crawled or indexed by search engines. In that case, these paths can be listed in a robots.txt file placed in the public folder of the root directory.

    For example, you can visit Facebook's robots.txt file at https://www.facebook.com/robots.txt.
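
    As a minimal sketch, a public/robots.txt could look like the following; the /api/ and /admin/ paths and the sitemap URL are placeholders, not paths from a real project.

      # public/robots.txt (illustrative paths only)
      User-agent: *
      Disallow: /api/
      Disallow: /admin/

      Sitemap: https://mystore.com/sitemap.xml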

  5. XML Sitemaps

    XML sitemaps are the easiest way to tell Google (and other search engines) which pages exist on a site. They are extremely useful when a website is large or rich in content.

    A sitemap is a file that contains information about the pages, videos, and other files on the site, and the relationships between them. Search engines read this file to crawl the site more efficiently.

    It is recommended to generate the sitemap dynamically so that new content on the website is included automatically, which further enhances SEO.

    There are two options:

    • Manual

      A static sitemap.xml file can be placed in the public folder of the root directory:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://www.mystore.com/shoes</loc>
          <lastmod>2023-03-08</lastmod>
        </url>
      </urlset>

    • getServerSideProps

      getServerSideProps can generate XML sitemaps on demand.

      function generateDynamicSiteMap(product: string) {
        return `<?xml version="1.0" encoding="UTF-8"?>
          <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
            <url>
              <loc>https://mystore.com</loc>
            </url>
            <url>
              <loc>https://mystore.com/${product}</loc>
            </url>
          </urlset>
        `;
      }


    A sitemap page (for example pages/sitemap.xml.tsx) can be added to the pages folder; its getServerSideProps can hit our API to fetch the dynamic URLs and return the generated XML directly as the response, as sketched below.
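
    As a minimal sketch of that wiring (assuming a hypothetical pages/sitemap.xml.tsx and a hard-coded product in place of a real API call):

      import { GetServerSideProps } from "next";

      function generateDynamicSiteMap(product: string) {
        return `<?xml version="1.0" encoding="UTF-8"?>
          <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
            <url><loc>https://mystore.com</loc></url>
            <url><loc>https://mystore.com/${product}</loc></url>
          </urlset>`;
      }

      export const getServerSideProps: GetServerSideProps = async ({ res }) => {
        // In a real application this value would come from an API or database.
        const product = "shoes";

        // Write the XML directly to the response instead of rendering HTML.
        res.setHeader("Content-Type", "text/xml");
        res.write(generateDynamicSiteMap(product));
        res.end();

        return { props: {} };
      };

      // The page component renders nothing; the sitemap is served by getServerSideProps.
      export default function SiteMap() {
        return null;
      }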

  6. HTML Structure

    A good page structure is also very important for improved SEO. Here are some of the best practices to be followed while writing the code.

    • <h1> tag for headings - Use one <h1> on each page to describe what the page is about. <h2> tags can be used for the subheadings that follow.
    • Link tag - Next.js provides a Link component for client-side transitions. It can be imported from “next/link”.

      import Link from "next/link";

      const About = () => {
        return (
          <Link href="/contact">
            Link to the Contact Page!
          </Link>
        );
      };

      export default About;

    • Image tag - Next.js also provides an Image component that automatically resizes, optimizes, and serves images in modern formats.

      import Image from "next/image";

      const About = () => {
        return (
          <Image src="/images/flower.jpeg" alt="Flower Image" width={500} height={500} />
        );
      };

      export default About;

Conclusion

While plain React applications have never been the strongest at SEO, Next.js provides the tools to improve performance and enhance the customer experience. Online stores now need a deliberate strategy for improving overall traffic and SEO scores. There are several tools for measuring how well these practices are applied, and the exercise is worth doing carefully to further improve the reach of your website.  

 
Gowri Lakshmi D