This is a case study on how I built a website that receives over 100,000 visitors per month, in less than 1 year, without spending $1 on advertising.
This was done 100% through SEO and content strategy.
Before we dive in, allow me to clarify a few things:
- The website reached over 100,000 visitors in 9 months.
- This was a new domain, registered just a couple of months before launch.
- This was done in a language I neither read nor speak (Japanese).
- Japanese is a non-Roman-character language, making it nearly impossible to use most of the popular SEO tools.
The purpose of this post is to walk you through precisely how my team and I reached this milestone, the approach we took, and show how technical SEO combined with content strategy can deliver serious results.
Key Drivers of Traffic Growth
There were a few key elements that led to the widespread and sustained growth of the project. These ranged from common sense to technical, but came down to three main focus areas:
- Math – We took a mathematical approach to designing an evaluation model that would allow us to gauge opportunities based on their potential returns. Ultimately this led to the creation of what we now call our keyword opportunity evaluation, which is a financial model that measures the approximate output (traffic) based on a finite set of inputs, including elements like average DA, number of links / linking domains, age of site, content footprint, etc.
- Analysis – Using our newly built algorithm we got to testing, creating websites to test content patterns and architecture. We were quick to declare defeat within verticals without traction, and paid close attention to where the traffic was growing the most. The algorithm started to take shape and after roughly 3 months was able to identify within an order of magnitude the amount of traffic we could acquire for a given set of costs.
- Pumpkin Hacking – This is a term that I came across (thank you Peter Da Vanzo) that seems to describe exactly what we did to continue to grow our traffic by double and even triple digits, month after month. The core concept is simple: focus resources on building what works. What this meant for us was paying attention to the search verticals and content that received the most traffic, most comments, and most social shares, and being quick to cut the cord on what didn’t perform.
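To make the first two drivers concrete, here is a minimal sketch of what a keyword opportunity evaluation might look like. The inputs mirror the ones named above (average DA, linking domains, site age, content cost), but the weights, formula, and sample numbers are entirely hypothetical — an illustration, not the model the team actually built:

```python
# Illustrative keyword opportunity score. The inputs follow the post
# (average DA, linking domains, site age, content investment); the
# weights and formula below are hypothetical, not the authors' model.

def opportunity_score(search_volume, avg_da, avg_linking_domains,
                      avg_site_age_years, content_cost):
    """Estimate attainable traffic per dollar of content investment.

    Lower competitor authority (DA, links, age) means a keyword is
    easier to rank for, so competition discounts the raw volume.
    """
    # Competition index: 0 (wide open) .. 1 (entrenched competitors).
    competition = min(1.0, (0.5 * avg_da / 100
                            + 0.3 * min(avg_linking_domains, 1000) / 1000
                            + 0.2 * min(avg_site_age_years, 10) / 10))
    attainable_traffic = search_volume * (1 - competition)
    return attainable_traffic / content_cost  # traffic per dollar

keywords = [
    # (keyword, volume, avg DA, avg linking domains, avg age, content cost)
    ("blue widgets", 12000, 75, 800, 9, 400),
    ("widget repair tokyo", 900, 30, 40, 3, 150),
]
# Rank keywords by projected return on content spend, best first.
ranked = sorted(keywords, key=lambda k: -opportunity_score(*k[1:]))
```

A model like this is only as good as its calibration — the post describes roughly three months of testing before the real version could predict traffic within an order of magnitude.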
First Month After Launch
With zero promotion and no advertising, we had a decent first month, bringing in over 2,000 visitors. This was mostly due to our pre-launch strategy – which I’ll explain more later in this post.
Nine Months After Launch
After only 9 months we were 3 months ahead of schedule to pass 100,000 visitors with no signs of slowing down.
As you can see in the screenshot above, organic search drives the most significant portion of our traffic. Referral traffic is almost entirely from blogs and industry publications, and campaign traffic represents the ads we place (only on our own website) to test different language and calls to action to drive conversions.
Building a Keyword Database
This is a no-brainer for all SEOs; however, unlike most search campaigns, this was a big keyword database – to the tune of 50,000 keywords.
The main idea here was to leave no stone unturned. Since we were of the mind to test everything and let the performance metrics dictate where to allocate resources, we had to get creative with query combinations.
We first went through all of our target search verticals, as dictated by our chosen go-to-market categories, which I think was roughly 19 to start. The next step was to identify the top 100 highest search volume terms within those verticals and scrape the top 100 URLs that were currently ranking.
From here we began what started out as an exhaustive process of evaluating the opportunities for each keyword, and then aggregating opportunities to discern which categories we needed to focus on to grow traffic.
Essentially we targeted the low-hanging fruit; keywords identified by our model that could generate a minimum level of traffic in 3 months or less, with a minimum investment in content development.
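As a rough illustration of that filtering step, the sketch below keeps only keywords whose projected traffic clears a floor at an acceptable content cost, then totals the survivors by category to show where to invest. The thresholds and sample data are assumptions, not the project's real figures:

```python
# Hypothetical "low-hanging fruit" filter: keep keywords whose projected
# 3-month traffic clears a floor at an acceptable content cost, then
# aggregate by category. Thresholds and sample data are invented.
from collections import defaultdict

MIN_TRAFFIC_3MO = 1000   # assumed traffic floor
MAX_CONTENT_COST = 500   # assumed per-keyword content budget, in dollars

keywords = [
    # (keyword, category, projected 3-month traffic, content cost)
    ("widget reviews", "widgets", 2400, 300),
    ("gadget specs", "gadgets", 800, 200),
    ("widget pricing", "widgets", 1500, 450),
]

low_hanging = [k for k in keywords
               if k[2] >= MIN_TRAFFIC_3MO and k[3] <= MAX_CONTENT_COST]

# Aggregate surviving opportunities per category, largest total first.
by_category = defaultdict(int)
for kw, cat, traffic, cost in low_hanging:
    by_category[cat] += traffic
priorities = sorted(by_category.items(), key=lambda x: -x[1])
```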
I watched (obsessively) which phrases and topics generated the most traffic.
As soon as a topic began to grow legs, we would focus additional keyword research on finding concepts and phrases that were both complementary and contextually relevant.
Designing a Content Strategy
This is the single hardest part of any content-focused website or project.
The key to success on this particular project was taking a page out of Jeff Bezos’ book and becoming obsessed with our users.
We not only embarked on an aggressive A/B testing schedule, but we constantly reached out to our users for feedback.
We asked tough questions, ranging from what users liked and disliked (colors, fonts, and layouts) to the specific components of the website they found to be less than ideal or even ‘sub-par.’
We took the responses seriously, making changes as they came in, trying to take something constructive from every piece of feedback, and pushing as many as 10 deployments a week.
It started to work.
Once we saw the needle begin to move on our user engagement metrics (time on site, pages per visit, and direct or branded traffic), we moved on to the next phase of our strategy: analyzing our audience.
Targeting the right audience is so much harder than it sounds.
I can honestly say from the experience of working on this project that it is almost never as it seems. We began by targeting a very large segment of users (remember that keyword database of over 50,000 keywords?) but after a few months it turned out our largest (and most active) user groups were finding us through only a handful of targeted categories.
Information Architecture with SEO in Mind
Please allow me to preface this by saying that I am biased; in my opinion, the architecture of a website is critical to achieving SEO success.
My largest successful SEO projects have come due to a variety of factors, but tend to come down to 3 core components of architecture:
- It’s Scalable
- It’s Crawlable
- It’s Tiered
Scalable architecture is an obvious one; you need a system that can grow as large as you want/need it to.
Crawlable is nothing new to anyone in SEO; this simply means that the structure of our pages allowed for all of the most important content to quickly and easily be crawled and indexed by search engine robots. It actually sounds easier than it is… ensuring that the content is rendered (code wise) in the most ideal format for robots to parse takes more consideration than just laying out your divs to properly render your designs.
To do this properly you need to make sure all of your code is in the right place, and more so, check how each crawler sees your page.
Take every opportunity to DRY out your code as much as possible; remember, stylesheets are designed to cascade for a reason.
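One simple way to "check how each crawler sees your page" is to parse the raw HTML without executing any JavaScript, the way a basic crawler would, and list what is actually present in the source. The sketch below uses Python's standard-library HTML parser on an invented sample page; it assumes well-formed markup:

```python
# Minimal "crawler's-eye view" of a page: parse raw HTML (no JavaScript
# execution) and extract the title, headings, and links present in the
# source. Assumes well-formed markup; the sample page is invented.
from html.parser import HTMLParser

class CrawlerView(HTMLParser):
    def __init__(self):
        super().__init__()
        self.path = []       # stack of currently open tags
        self.title = ""
        self.headings = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        self.path.append(tag)
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if self.path and self.path[-1] == tag:
            self.path.pop()

    def handle_data(self, data):
        if not self.path:
            return
        if self.path[-1] == "title":
            self.title += data.strip()
        elif self.path[-1] in ("h1", "h2", "h3"):
            self.headings.append(data.strip())

html = """<html><head><title>Widgets</title></head>
<body><h1>All About Widgets</h1><a href="/widgets/blue">Blue</a></body></html>"""
view = CrawlerView()
view.feed(html)
```

If a heading or link you care about doesn't show up in a pass like this, a robot that doesn't render JavaScript won't see it either.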
Information tiering… is a concept I have long preached to anyone who has ever talked with me, at length, about SEO. It means that your URL architecture should be built in a way so that authority flows upwards through your directories.
For example, if I wanted to build authority around a core concept, I would focus my domain on that concept. If I then wanted to build relevance around specific locations for that concept, I would structure my URLs so that all relevant content for that location fed upwards to a location-specific directory.
So let’s say I had an SEO consulting firm with locations in several cities across the U.S., I would design an architecture that would allow for location-specific information to feed upwards through my directories.
So something like NicksSEOFirm.com/Philadelphia/Specific-Location-Content. The specific location content could be the team, any value-add competencies, anything geo-specific that was relevant to operations at that location, flowing relational authority upwards to the parent directory of /Philadelphia/.
Links in sub-directories can feed authority to parent directories.
A perfect example of this is local sitelinks for popular categories: tertiary directories with the most links and content pass authority up to their parent directories, translating into higher rankings and local sitelinks.
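The tiering idea can be sketched as a simple roll-up: every link earned by a deep page is also counted toward its ancestor directories. The URLs below reuse the hypothetical NicksSEOFirm.com example from above, and the counting scheme is an illustration of the concept, not how search engines actually compute authority:

```python
# Illustrative "information tiering" roll-up: links earned by deep pages
# are also credited to every ancestor directory, so /philadelphia/
# accumulates the authority of its children. URLs are hypothetical.
from collections import Counter
from urllib.parse import urlparse

def ancestor_dirs(url):
    """Return every ancestor directory of a URL's path, shallowest first."""
    parts = [p for p in urlparse(url).path.split("/") if p]
    return ["/" + "/".join(parts[:i]) + "/" for i in range(1, len(parts))]

inbound_links = [
    "https://nicksseofirm.com/philadelphia/team",
    "https://nicksseofirm.com/philadelphia/local-seo-audits",
    "https://nicksseofirm.com/boston/team",
]

authority = Counter()
for link in inbound_links:
    for directory in ancestor_dirs(link):
        authority[directory] += 1
```

Here the two Philadelphia pages both feed the /philadelphia/ directory, which is exactly the upward flow the architecture is meant to encourage.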
Launch Before The Launch
The easiest way to ensure a successful product or website launch is to launch before you actually launch.
What I mean is to build your prospect list well in advance of pulling the trigger to go live.
John Doherty wrote a great post on ProBlogger that talks about the power of leveraging list-building pre-launch pages. By building a list of users before publishing your full website you are essentially guaranteeing traffic immediately upon launch.
Our pre-launch is how we were able to generate over 2,000 visitors within the first 30 days of taking the website live.
Since our platform is not built on WordPress we didn’t get to use any of the fancy plugins available, and instead created a basic one-page site that allowed visitors to convert in the same way the full website would, just on a much smaller scale.
The most important part of our pre-launch page was that it not only supported social sharing, but also tracked and aggregated shares to give active users more points; gamification is cool.
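As a sketch of that share-tracking gamification, the snippet below aggregates tracked share events into points per user so the most active sharers rise to the top. The channels, point weights, and user names are all made up for illustration:

```python
# Hypothetical pre-launch gamification: turn tracked share events into
# per-user points. Channels, weights, and users are invented examples.
from collections import Counter

POINTS = {"twitter": 2, "facebook": 1, "email": 3}  # assumed weights

def score_shares(events):
    """events: iterable of (user, channel) share records."""
    scores = Counter()
    for user, channel in events:
        scores[user] += POINTS.get(channel, 1)  # default 1 point
    return scores

events = [("aya", "twitter"), ("aya", "email"), ("ken", "facebook")]
leaderboard = score_shares(events).most_common()  # most active users first
```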
Some of the major benefits of a well planned pre-launch are:
- Your website is already being crawled and indexed by major search engines.
- You begin building your user base and audience.
- You can gain invaluable feedback while it’s still easy to make changes.
Choosing a Platform
Let me start by saying not all platforms are created equal.
It’s also worth sharing that it is not always better to build versus buy, as there are a lot of smart people building a lot of slick content platforms.
However, we chose to build.
Once we had laid out all of the project requirements, including URL architecture, conversion funnels, user permissioning, design templating, and localization, it became clear that in order to get exactly what we needed – we were going to have to build it ourselves.
One major benefit of building is we were able to design a system that would support both our internal and external processes right out of the gate. This also meant it was going to take a lot more time and a shitload more money to bring our website to market.
Hosting & Evolution
This is a known but rarely talked about factor – hosting infrastructure is critical.
Once we were ready for public launch, we chose a reasonably affordable VPS provider with what seemed like more than enough memory, and it was at first.
By month 4 it was clear we were going to have to make some changes; load times began to bloat and large content pages were timing out. We beefed up the space and quadrupled the memory, which solved the problem temporarily until…
We got some press.
On June 5th we were featured by one of the largest news publications in the world. We were able to handle almost 40,000 visits before our VPS crashed, hard.
It was that week we made the move to localized cloud hosting from Amazon Web Services.
We haven’t crashed since.
The End Result
Not really the end result, since this project is still enjoying a healthy and fruitful life, but after 9 months of careful planning, remaining flexible to the marketplace, and nurturing our most valued asset: our users, we surpassed our milestone of 100,000 visitors.
Great, But Is It Repeatable?
In case you weren’t already thinking it, you are now.
The answer is Yes.
Taking what we learned and applying the concept of pumpkin hacking, we started a new blog at the end of July 2012 to test the transferability of our strategy, and here were the results:
In the first 12 days we had over 17,000 visitors. In the first full month, we had over 50,000 unique visitors coming to the website over 100,000 times (see below).
And it didn’t slow down…
By the end of the 3rd month we were receiving over 100,000 unique visitors, and over 200,000 visits.
Benchmark And Grow
One of the best ways to get started on your path to increasing your site’s organic traffic is to understand how much traffic potential is within your vertical, and to benchmark against it.
Start by looking at the top 5 sites that rank for the big head terms you’re targeting, and get a read on their traffic. To do this I like to use similarweb.com’s web traffic tool. It provides a good relative measure of a website’s traffic, sources, and some detailed data on where this traffic is coming from – and best of all, it’s free!