Technical On-Page SEO

Rankings start with technical fundamentals. I handle the on-page SEO that search engines actually care about. Site structure, metadata, structured data, performance, and crawlability. This is the foundation that everything else builds on.

Did You Know

68% of all trackable website traffic starts with a search engine, and organic search delivers over 5x more ROI than paid ads over time.

What I Do

Technical on-page SEO is about making sure search engines can find, understand, and prioritize your content. Here is what I work on:

Meta Tags & Title Optimization

Unique, keyword-targeted title tags and meta descriptions for every page. Proper Open Graph and Twitter Card tags so your pages look right when shared. No duplicate titles, no missing descriptions, no generic boilerplate.
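As a sketch, here is what a properly tagged page head can look like (the site name, description, and URLs are placeholders, not a specific client's tags):

```html
<head>
  <title>Technical SEO Services | Example Co</title>
  <meta name="description" content="Hands-on technical SEO: metadata, structured data, Core Web Vitals, and crawlability fixes.">
  <!-- Open Graph tags so the page previews correctly when shared -->
  <meta property="og:title" content="Technical SEO Services | Example Co">
  <meta property="og:description" content="Hands-on technical SEO: metadata, structured data, Core Web Vitals, and crawlability fixes.">
  <meta property="og:image" content="https://example.com/og-image.png">
  <!-- Twitter Card tag for link previews on X/Twitter -->
  <meta name="twitter:card" content="summary_large_image">
</head>
```

Every page gets its own version of this, written for that page's content.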

Heading Structure

Clean heading hierarchy with one H1 per page, logical H2/H3 nesting, and headings that actually describe the content beneath them. Search engines use heading structure to understand page topics and content relationships.
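A clean hierarchy looks like an outline: one H1 stating the page topic, H2s for major sections, H3s nested beneath them (the headings here are illustrative):

```html
<h1>Technical SEO Services</h1>
  <h2>Core Web Vitals</h2>
    <h3>Largest Contentful Paint</h3>
    <h3>Cumulative Layout Shift</h3>
  <h2>Structured Data</h2>
```

The indentation is just for readability; what matters is that heading levels never skip or repeat the H1.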

Structured Data & JSON-LD

Schema markup that tells search engines exactly what your pages represent. Business info, services, FAQs, articles, products, and reviews: all implemented as JSON-LD and validated against Google's requirements. This is how you get rich results in search.

Internal Linking

Strategic internal links that distribute page authority and help search engines discover your content. Proper anchor text, logical site hierarchy, no orphan pages.

Site Speed & Core Web Vitals

Page speed is a ranking factor. I optimize for the three Core Web Vitals that Google measures: Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift. That means optimized images, minimal render-blocking resources, efficient CSS and JavaScript, and proper asset loading.

Crawlability & Indexation

Robots.txt configuration, XML sitemaps, canonical tags, and proper use of noindex/nofollow directives. I make sure search engines can crawl what they should and ignore what they should not. No crawl budget wasted on duplicate content or dead ends.
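As a sketch, a robots.txt that keeps crawlers out of duplicate-generating URLs and points them at the sitemap might look like this (the paths and domain are hypothetical):

```
# Allow crawling of public pages, block URLs that only create duplicates,
# and tell crawlers where the sitemap lives.
User-agent: *
Disallow: /search
Disallow: /cart
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Which paths to block depends entirely on the site; the audit determines that.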

Core Web Vitals Explained

Google measures three specific metrics to judge your site's user experience. These scores directly affect your search rankings. Here is what each one means in plain terms.

Largest Contentful Paint (LCP)

LCP measures how long it takes for the biggest visible element on the page to finish loading. This is usually a hero image, a large heading, or a video thumbnail. Google wants this under 2.5 seconds. Poor LCP is usually caused by slow server response times, unoptimized images, render-blocking CSS or JavaScript, or slow font loading. I fix these by optimizing images, implementing lazy loading, deferring non-critical scripts, and improving server configuration.
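For example, the hero image (usually the LCP element) gets explicit dimensions and high fetch priority, while images below the fold load lazily. The filenames here are placeholders:

```html
<!-- Hero image: explicit dimensions and high fetch priority.
     Never lazy-load the LCP element; that makes LCP worse. -->
<img src="hero.webp" width="1200" height="600" fetchpriority="high" alt="...">

<!-- Below-the-fold images can load lazily -->
<img src="gallery-1.webp" width="600" height="400" loading="lazy" alt="...">
```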

Interaction to Next Paint (INP)

INP measures how quickly your page responds when a user clicks, taps, or types. Google wants this under 200 milliseconds. If your page freezes or lags after a click, your INP score suffers. Common causes include heavy JavaScript execution, long tasks blocking the main thread, and unoptimized event handlers. I fix this by breaking up long tasks, optimizing JavaScript, and ensuring the browser stays responsive during interactions.
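The "breaking up long tasks" fix can be sketched like this. It is a minimal pattern, not a specific library API: the work yields back to the browser between small batches so pending clicks and keystrokes get handled.

```javascript
// Sketch: split one long task into small chunks so the main thread
// can respond to input between batches (better INP).
function yieldToMain() {
  // A zero-delay timeout gives the browser a chance to process
  // any pending input events before the next chunk runs.
  return new Promise(resolve => setTimeout(resolve, 0));
}

async function processInChunks(items, processItem, chunkSize = 50) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(processItem(item));
    }
    await yieldToMain(); // hand control back between chunks
  }
  return results;
}
```

The same idea applies to any main-thread work: the total time is similar, but no single task blocks the page long enough to hurt responsiveness.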

Cumulative Layout Shift (CLS)

CLS measures how much the page content moves around while it loads. You have seen this happen. You start reading text and then an image loads above it and pushes everything down. Google wants a CLS score under 0.1. I prevent layout shift by setting explicit dimensions on images and videos, reserving space for ads and embeds, and loading fonts in a way that does not cause text to reflow.
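In practice that means reserving space before assets arrive. A sketch, with placeholder filenames and sizes:

```html
<!-- Explicit dimensions let the browser reserve the slot before the image loads -->
<img src="photo.webp" width="800" height="450" alt="...">

<style>
  /* Reserve a fixed slot for an ad or embed so it cannot push content down */
  .ad-slot { min-height: 250px; }

  /* font-display: optional avoids a late font swap that would reflow text */
  @font-face {
    font-family: "BodyFont";
    src: url("bodyfont.woff2") format("woff2");
    font-display: optional;
  }
</style>
```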

Structured Data and Rich Results

Structured data is code that tells search engines exactly what your content represents. When implemented correctly, it unlocks rich results. These are the enhanced search listings that stand out on the results page.

What Rich Results Look Like

You have seen them even if you did not know the name. Star ratings next to product listings. FAQ dropdowns directly in search results. Recipe cards with cooking time and calorie counts. Event listings with dates and venues. These are all powered by structured data markup on the page.

How Schema Markup Works

I implement structured data using JSON-LD, which is the format Google recommends. This is a block of code added to your page that describes the content in a way search engines can parse. For a service page, it might describe the business name, service type, area served, and pricing. For a product page, it might include price, availability, and review ratings.
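As a sketch, the JSON-LD for a service page might look like this. The business name, URL, and service area are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Service",
  "serviceType": "Search engine optimization",
  "name": "Technical SEO Audit",
  "provider": {
    "@type": "LocalBusiness",
    "name": "Example Co",
    "url": "https://example.com"
  },
  "areaServed": "United States"
}
</script>
```

Search engines parse this block directly; it does not change what visitors see on the page.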

Types I Implement

I work with all the major schema types that Google supports. This includes LocalBusiness, Service, Product, FAQ, Article, HowTo, Event, Review, and BreadcrumbList. Each one is validated against Google's structured data testing tools before deployment. I make sure every required field is present and every value is accurate.

The Impact

Pages with structured data earn more clicks even without ranking higher. Rich results take up more visual space in search results and draw the eye. A listing with star ratings and pricing information gets more clicks than a plain blue link. This is free real estate in search results, and most sites are not using it.

Common Technical SEO Problems

Most sites have technical SEO issues hiding in plain sight. Here are the ones I find and fix most often.

Duplicate Content

When the same content is accessible at multiple URLs, search engines do not know which version to rank. This happens with www vs non-www, HTTP vs HTTPS, trailing slashes, URL parameters, and paginated content. I consolidate duplicates with canonical tags, redirects, and proper URL structure.

Broken Canonical Tags

Canonical tags tell search engines which URL is the "official" version of a page. When they are missing, point to the wrong URL, or contradict each other across duplicate pages, search engines get confused about which pages to index. I audit every canonical tag and fix the ones that are wrong.
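A correct setup is one line per page, with every duplicate pointing at the same official URL (the URL below is a placeholder):

```html
<!-- On https://example.com/services/seo, and on any duplicate of it
     (such as a ?ref=footer variant), the same canonical: -->
<link rel="canonical" href="https://example.com/services/seo">
```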

Slow Rendering

Pages that rely heavily on JavaScript to render content can be invisible to search engines. If Googlebot has to execute JavaScript to see your page content, indexing gets delayed or skipped entirely. I ensure critical content is in the initial HTML response, not hidden behind client-side rendering.

Redirect Chains

When URL A redirects to URL B, which redirects to URL C, which finally reaches URL D, you have a redirect chain. Each hop wastes crawl budget and dilutes link equity. I flatten redirect chains so every redirect goes directly to the final destination in a single hop.
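Flattening is mechanical once you have the redirect map: follow each chain to its end, then rewrite every source to point there directly. A minimal sketch with hypothetical URLs:

```javascript
// Sketch: flatten a redirect map so every source URL reaches its
// final destination in a single hop. The cycle guard (seen) stops
// infinite loops if two URLs redirect to each other.
function flattenRedirects(redirects) {
  const flat = {};
  for (const src of Object.keys(redirects)) {
    const seen = new Set([src]);
    let target = redirects[src];
    // Follow the chain until a URL that redirects no further.
    while (target in redirects && !seen.has(target)) {
      seen.add(target);
      target = redirects[target];
    }
    flat[src] = target;
  }
  return flat;
}
```

Given `{"/a": "/b", "/b": "/c", "/c": "/d"}`, every entry ends up pointing straight at `/d`.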

Missing Alt Text

Images without alt text are invisible to search engines and inaccessible to screen readers. Every image on your site should have descriptive alt text that explains what the image shows. I audit all images and add meaningful alt attributes that serve both SEO and accessibility.
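Good alt text describes what the image shows; decorative images get an empty alt so assistive tech skips them. The filenames and descriptions below are illustrative:

```html
<!-- Descriptive: says what the image shows, not "image" or a filename -->
<img src="storefront.webp" alt="Brick storefront with a green awning at dusk">

<!-- Purely decorative: empty alt tells screen readers to skip it -->
<img src="divider.svg" alt="">
```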

Orphan Pages

Pages with no internal links pointing to them are effectively hidden from search engines. If Googlebot cannot discover a page by following links from other pages on your site, it probably will not index it. I identify orphan pages and connect them to your site's internal linking structure.

What I Don't Do

I want to be upfront about this. I do not do backlink building or off-page SEO.

Backlinks are important. They are still one of the strongest ranking signals. But earning quality backlinks requires years of relationship building, content marketing, PR outreach, and platform presence. Anyone who tells you they can get you authoritative backlinks quickly is either cutting corners or lying.

I focus on what I can control and deliver reliably: the technical foundation of your site. Get this right and you are giving your content the best possible chance to rank. Pair it with good content and the links will come.

Measuring SEO Results

Technical SEO is not guesswork. I track everything with real tools and share the data with you so you can see exactly what is improving.

Tools I Use

I work with Google Search Console for indexing and search performance data, Google PageSpeed Insights for Core Web Vitals, Screaming Frog for technical audits and crawl analysis, and Ahrefs or Semrush for keyword tracking and competitive research. Each tool gives a different piece of the picture, and I use them together to get the full view.

What to Track

The metrics that matter most are organic search impressions, click-through rates, keyword positions, Core Web Vitals scores, crawl errors, and index coverage. I set up tracking for all of these and provide regular reports that show the trend lines, not just snapshots.

Realistic Timeline

Technical SEO changes do not produce overnight results. Google needs to recrawl and reprocess your pages. Here is what a realistic timeline looks like. Technical fixes like canonical tags, redirects, and structured data typically show measurable impact within two to four weeks. Core Web Vitals improvements can show up in search rankings within one to three months. Overall organic traffic growth from a full technical SEO overhaul usually becomes clear within three to six months. Anyone promising page-one rankings in a week is not being honest with you.

Built-In, Not Bolted On

Every site I build starts with technical SEO as a baseline, not as an afterthought or an add-on package. The sites coming out of 137Foundry ship with proper meta tags, structured data, semantic HTML, optimized performance, and clean crawl paths from day one.

If you have an existing site that needs technical SEO work, I can audit and fix it. But the best results come from building it right from the start.

Get Your Technical SEO Right

Book a call and I will take a look at where your site stands. No jargon, no fluff. Just what needs fixing and what it will take.

Book a Call