Let’s face it: getting your website to rank high on Google isn’t an easy task. Juggling keyword research, image optimization, and user experience feels like a long to-do list. This is where Next.js comes in, not as another fancy React framework, but as a game changer that both users and search engines love.
Whether you are running a blog, an e-commerce store, or a portfolio site, Next.js just makes it easier to create fast and SEO-friendly websites without headaches.
First, Next.js supports Server-Side Rendering (SSR) out of the box, and it also supports Static Site Generation (SSG). What are those? In short, SSR pre-renders the page on the server before it is sent to the client, speeding up delivery of the initial page load, while SSG prepares pages at build time and serves them as static assets, ensuring fast page loads and a better user experience.
You can easily achieve this in Next.js by making use of the `generateStaticParams` function in a `[slug]` page.
Let’s assume we have the following structure, where `page.tsx` under `[slug]` represents a post:

```text
posts
  …
  [slug]
    page.tsx
```
```tsx
export async function generateStaticParams() {
  const res = await fetch('someApi');
  const posts = await res.json(); // posts = [{ id: 1, name: 'post 1' }, ...]
  // slug values must be strings
  return posts.map((post) => ({ slug: String(post.id) }));
}

export default async function Post({ params }) {
  const { slug } = params;
  const post = await getPost(slug);
  return (
    <div>
      <p>{post.name}</p>
    </div>
  );
}
```
Now if you run:
```bash
yarn build # or npm run build
```
you will notice that your page has been generated at the specific route:
`/posts/1`
This approach is ideal for SEO as search engines can access an early pre-rendered version of the page.
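For content that must be fresh on every request, the SSR counterpart is a page that opts out of caching. Here is a minimal sketch, assuming a hypothetical `app/news/page.tsx` route and API URL; `cache: 'no-store'` opts the fetch out of Next.js data caching, so the page is rendered on the server for each request instead of at build time:

```tsx
// Hypothetical app/news/page.tsx — rendered on the server for every request (SSR).
export default async function NewsPage() {
  // "no-store" disables fetch caching, forcing per-request rendering.
  const res = await fetch("https://api.example.com/news", { cache: "no-store" });
  const news: { id: number; title: string }[] = await res.json();
  return (
    <ul>
      {news.map((item) => (
        <li key={item.id}>{item.title}</li>
      ))}
    </ul>
  );
}
```

Search engines still receive fully rendered HTML, just generated per request rather than ahead of time.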
Metadata plays a key role in SEO, but let’s be honest: it can get tedious adding all those tags manually in the `<head>` section. The good news? Starting with Next.js 13 and the introduction of the App Router, we now have the `metadata` object! This handy feature makes defining page metadata, like titles, descriptions, and Open Graph tags, so much easier and more organized. 😊
You can add a `metadata` object in two places:

- The `layout.tsx` file: your metadata will be available for all the pages on your site.
- The `page.tsx` of each route: this will overwrite the default one from `layout.tsx`.

Configuring proper metadata will not only optimize your title and description but also how your site preview looks when sharing on social media. This can be achieved by configuring proper Open Graph and Twitter tags.
Here is an example of how you can do this in a `layout.tsx` file:
```tsx
import type { Metadata } from "next";

export const metadata: Metadata = {
  title: "Your Awesome Title",
  description: "Some pretty cool description",
  icons: ["/yourIcon.svg"],
  verification: {
    google: "your-google-key", // we will talk about this later
  },
  openGraph: {
    title: "My Awesome Site",
    siteName: "My Cool Stuff",
    description: "",
    url: "https://yourdomain.com",
    type: "website",
    images: ["/hero.webp"],
  },
  twitter: {
    title: "My Awesome Site",
    description: "My Cool Stuff",
    card: "summary",
    images: ["/hero.webp"],
  },
  keywords: ["some key" /* ... */],
  authors: [{ name: "" }],
  alternates: {
    canonical: new URL("https://yourdomain.com"),
  },
  robots: "index, follow",
};
```
You can always check your configured metadata at OpenGraph.xyz.
Open Graph is an internet protocol originally developed by Facebook. Its purpose is to standardize the way metadata is used to represent a webpage's content. In simpler terms, it helps define what information about your webpage is shared when someone links to it on social media or other platforms.
Twitter also has its own metadata configuration system called Twitter Cards. The purpose is the same as Open Graph: to define how your webpage's content is displayed when shared. The only difference is that you use Twitter-specific metadata tags to achieve this. Just add them to your metadata, and you're good to go!
The canonical tag tells search engines which URL is the official version of a page. It is important for avoiding duplicate content, which can confuse search engines and split ranking signals between multiple pages.
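Beyond the site-wide canonical shown in `layout.tsx`, each dynamic page can declare its own canonical URL via `generateMetadata`. A minimal sketch, assuming a hypothetical `app/posts/[slug]/page.tsx` route and `yourdomain.com` as a placeholder host (the real `Metadata` type comes from `"next"`; a minimal shape is inlined here so the snippet stands alone):

```typescript
// Minimal stand-in for the Metadata type exported by "next".
type Metadata = { alternates?: { canonical?: string } };
type Props = { params: { slug: string } };

// Each post declares its own canonical URL, so search engines treat
// /posts/<slug> (and not some query-string variant) as the official page.
export async function generateMetadata({ params }: Props): Promise<Metadata> {
  return {
    alternates: { canonical: `https://yourdomain.com/posts/${params.slug}` },
  };
}
```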
A `sitemap.xml` file helps search engines like Google understand which URLs on your website should be indexed. Without indexing, your pages won't appear in search results, so it's a crucial part of making your site visible online.

On the other hand, `robots.txt` is a set of instructions for search engines, telling them which parts of your website should or shouldn't be crawled. While the sitemap focuses on what should be indexed, `robots.txt` helps manage what should not be crawled, keeping certain pages or sections out of search results.
You can configure these in two ways:

- Add static `sitemap.xml` and `robots.txt` files to the `public` folder.
- Create two files, `sitemap.ts` and `robots.ts`, under the `/app` folder.
`sitemap.ts`:
```typescript
import { MetadataRoute } from "next";

const WEBSITE_HOST_URL = "https://yourdomain.com";
const defaultRoutes: string[] = []; // your routes

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const routes = defaultRoutes.map((route) => ({
    url: `${WEBSITE_HOST_URL}${route}`,
    lastModified: new Date(),
    changeFrequency: "weekly" as const,
    priority: 0.5,
  }));
  return [...routes];
}
```
`robots.ts`:
```typescript
import { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: ["*"],
      allow: ["/"],
      disallow: ["/private"],
    },
    sitemap: "https://yourdomain.com/sitemap.xml",
    host: "https://yourdomain.com",
  };
}
```
You can see that they are generated at build time. After deployment, you can view your sitemap and robots files at:
- `https://yourdomain.com/sitemap.xml`
- `https://yourdomain.com/robots.txt`
Now, all you need to do is head over to Google Search Console, add your website as a new property (e.g., `yourdomain.com`), and submit your `sitemap.xml`. This will help Google start indexing your pages and make them visible in search results.
`application/ld+json` in Next.js

JSON-LD is a format for structured data that allows you to describe your website's content in a way that search engines can easily understand. By embedding JSON-LD into your pages, you help search engines deliver richer and more accurate search results.
Here is an example of how to use JSON-LD in Next.js:
```tsx
export default async function Page({ params }) {
  const post = await getPost(params.id);
  const jsonLd = {
    '@context': 'https://schema.org',
    // schema.org has no "Post" type; BlogPosting is the closest fit here
    '@type': 'BlogPosting',
    name: post.name,
    image: post.image,
    description: post.description,
  };
  return (
    <section>
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
    </section>
  );
}
```
Thank you for reading all the way to the end! I put together a small GitHub repository where I demonstrate these concepts in action. If you have any questions or want to discuss more about Next.js and SEO, feel free to reach out to me via the contact section or drop me an email. I’d love to hear from you!