Answers to Frequently Asked Questions
We are hired by a variety of digital marketing, web design/development and advertising agencies to manage SEO for their clients. We offer all of our ongoing SEO and consultation services on a white-label basis (including branded reporting). In addition, agencies hire us for general and retainer-based consultation, audit/analysis, and website and domain migration project management.
Our clients are located throughout the nation and beyond. To date, we have provided service and consultation for businesses and agencies located in 24 states and 5 countries.
Please see our SEO hiring guide that includes considerations, research tips, expectations, standards and proposal information for this very purpose.
Stellar customer service, fast response times, easy-to-understand SEO reporting and, of course, trackable, visible results. We love what we do and have been doing it for years.
Being able to have professional relationships with the people behind all of the different companies we assist. It is empowering to work and connect with such a diverse group of passionate and skilled individuals.
The cost of SEO depends on a number of variables, such as scale and scope. We like to talk with you about your business objectives and conduct market research prior to providing a proposal. There is no charge for this, as it assists with the development of effective SEO campaigns.
We provide service on the basis of a month-to-month agreement for the majority of our clients. We can provide an MSA/SOW upon request and may elect to use contracts when providing consultation.
Google doesn’t reward websites that produce content more frequently with better search engine rankings. Instead, the algorithm looks for quality and consistency. Quality in that Google favors beefy posts that it can easily find and index.
Consistency in that Google can detect a specific time/date when new content is posted. This helps Google determine how frequently to visit your site to look for new content, which can affect how long it takes for new content to show up in Google search results.
If a given website does happen to be churning out quality blog posts on a consistent basis, then it is entirely possible for the website to increase search engine visibility by increasing content posting frequency.
With regard to Google’s position on most anything – quality over quantity reigns supreme.
What is the relative importance of traffic to a website in the Google algorithm for ranking search results?
The relative importance of incoming traffic to a website with regard to organic search engine visibility is tied to a number of qualitative metrics. It’s not the sheer volume of visitors that Google evaluates as ranking criteria. Instead, their algorithm focuses on the behavior (bounce rate, time on site) and the source of the originating traffic (domain/page authority). However, Google will take notice if an increasing number of authoritative websites are sending site visitors.
Once a new traffic source is discovered, it typically takes Google around 3 months before associated signal weight is applied to search engine rankings. Keep in mind that incredibly authoritative traffic sources tend to speed up the process.
- Make sure you are aware of historical keyphrase volume when reviewing which keyphrases to target. You can use Google’s Keyword Planner in addition to a number of other tools like SEMRush.
- Utilize the homepage and subsequent categorical landing pages for your broad/top of the funnel/most competitive phrases.
- Target longer-tail keyphrases with sub-pages relevant to the homepage or corresponding landing page.
- There is no need to load your pages up with keyphrases. Natural-language copy is the new norm. Key terms and related variants just need to be in their appropriate locations: page titles, headings, copy, images (filenames/alt attributes) and URLs.
- Your homepage and landing pages should be viewed as evergreen/cornerstone content pieces. Keep them updated on a set-in-stone schedule. This teaches Google how often to revisit the website.
- Here’s a big one – avoid all instances of internal content that competes. There’s a huge difference between content that competes and content that supports. Carefully construct the layout of the website using a spreadsheet that identifies the primary keyphrase target and content focus of each and every page. Supporting content could be a sub-page or blog post that links back to the homepage or relevant landing page. Competing content dilutes the website’s ability to rank for a given phrase because Google can’t easily tell which page means what. This is one of the core reasons why even websites with quality content fail to rank well.
Tools provided by Rank Ranger and SEMRush help identify which pages are cannibalizing each other, as does a Google search using the “site:” command. Example: site:yoursite.com keyphrase (site:widgets.com blue widgets).
- It is also important to pay close attention to page title character limits (60 max) and duplicated page titles, meta descriptions and headings. Landing page copy should be at least 350–500 words. Use short, easy-to-read page titles, headings, paragraphs and sentences. Grammar and spelling count. Use Grammarly and typosaur.us.
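Checks like the title limit and duplicate detection above are easy to automate once you have a crawl export. Here’s a minimal sketch, assuming a hypothetical list of (URL, title) pairs rather than real crawl data:

```python
# Flag page titles that exceed the ~60-character display limit or that
# duplicate one another across a site. The sample pages are illustrative.
from collections import defaultdict

MAX_TITLE_LENGTH = 60

def audit_titles(pages):
    """Return (too_long, duplicates) for a list of (url, title) pairs."""
    too_long = [url for url, title in pages if len(title) > MAX_TITLE_LENGTH]
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
    return too_long, duplicates

pages = [
    ("/", "Blue Widgets | Example Co"),
    ("/widgets/", "Blue Widgets | Example Co"),
    ("/blog/post/", "A Very Long Title That Keeps Going And Going Well Past Sixty Characters"),
]
too_long, dupes = audit_titles(pages)
print(too_long)  # pages whose titles exceed 60 characters
print(dupes)     # titles shared by more than one URL
```

The same grouping idea works for meta descriptions and H1s; duplicated titles are also a quick proxy for spotting competing pages.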
- Don’t forget to address technical optimization. Indexation, structured data, mobile friendliness, HTTPS and site speed all help form a solid SEO base that benefits keyphrase targeting.
The short answer: subdirectory. Aside from being easier for Google to index, the directory approach passes authority from the root. Here’s a more detailed explanation along with a couple examples of when using a subdomain may be appropriate.
Example Subdomain: https://services.jtree.net/
Example Subdirectory: https://jtree.net/services/
Headings are an important part of the referential integrity equation that Google favors with regard to on-page optimization. We recommend not using headings in sliders/carousels if at all possible. If headings cannot be avoided, use an H3 or lower-level heading (H3–H6). Slider/carousel content is subject to a high rate of change and duplication. Headings should be static and unique in order to be an effective part of the SEO matrix.
The amount of time it takes a given keyphrase or keyword to rank in Google varies based on the scale and scope of the SEO campaign, competitive factors and timeliness of search engine algorithm updates. Under most circumstances, substantial ranking increases occur within the 6-12 month timeframe.
Excellent question! Please view our best practices for blogging guide.
There are essentially two options when it comes to keeping search engines from displaying your content.
Option 1 – Robots.txt
The Robots.txt Disallow Directive, where URLs and/or directories that you don’t want crawled are listed in the site’s robots.txt file. It is important to note that disallowed pages or directories can still end up in search results.
The robots.txt file is typically stored in the website root: https://examplesite.com/robots.txt
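You can verify what a Disallow directive actually blocks with Python’s standard-library robots.txt parser. This is a sketch using hypothetical rules and URLs:

```python
# Check which URLs a robots.txt Disallow directive blocks, using the
# standard-library parser. The rules and URLs below are hypothetical.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /private/
Disallow: /drafts/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "https://examplesite.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://examplesite.com/blog/post.html"))     # True
```

Remember that a blocked URL can still appear in search results if other pages link to it; disallow only prevents crawling.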
Option 2 – Robots Meta Tag (Preferred)
The Robots Meta Tag, which is added to the head section of a given page or in the HTTP response header (via x-robots-tag). This tells search engines if they should index (include) the page in search results and follow (utilize) the links on the page in question. Keep in mind that search engines can’t access the robots meta tags of disallowed pages.
Example robots meta tag with noindex and nofollow parameters:
<meta name="robots" content="noindex,nofollow">
Option 2, the robots meta tag with noindex and nofollow parameters, is definitely the better approach. It works especially well when a low number of pages is involved and is directly focused on preventing said pages from showing up in search results. Option 1 is mentioned mainly as a caution: it is common practice to implement both options and accidentally override the robots meta tags, since crawlers can never read a noindex on a page they are disallowed from fetching.
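When rolling out noindex, it is worth verifying that the tag actually made it into the rendered HTML. A minimal sketch using only the standard library, with a hypothetical HTML snippet:

```python
# Extract the robots meta tag directives from a page's HTML to confirm
# that noindex/nofollow are present. The HTML below is hypothetical.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives = [d.strip().lower() for d in content.split(",")]

html = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
parser = RobotsMetaParser()
parser.feed(html)
print(parser.directives)  # ['noindex', 'nofollow']
```

Running this against pages you intend to keep indexed is an equally useful sanity check, since a stray noindex is one of the more painful mistakes to ship.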
Website CMSs have a habit of duplicating content to the extreme; think along the lines of category, tag and keyword archives along with sidebars, footers and widgets. This paves the way for one link to show up on many pages. If the originating link website is large, a single link can easily multiply and be counted hundreds, if not thousands of times depending on which link research tool is used.
Google customizes search results based on user search history, device usage and current location – especially if local intent is present.
In order to see past customization and view search results as they would appear from another location, a little tweaking is in order.
Adding the Near Parameter to a search query URL is currently the most efficient way to check Google search engine results for a city that you are not actively located in without the use of a rank tracker.
Example Coffee Shop Search Query URL:
https://www.google.com/search?q=coffee+shop
Example Portland Coffee Shop Search Query URL with Near Parameter:
https://www.google.com/search?q=coffee+shop&near=portland
Example Boise Ice Cream Search Query URL with Near Parameter:
https://www.google.com/search?q=ice+cream&near=boise
Simply replace the search term and city in the examples above and copy/paste into your web browser address bar to conduct your own city specific searches. While not a perfect solution, adding the Near Parameter to search query URLs gets the job done.
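If you run these checks often, building the URLs programmatically saves some copy/paste. A short sketch, assuming the near parameter simply takes a city name (an informal convention, not a documented Google API):

```python
# Build a city-scoped Google search URL using the "near" query parameter.
# The parameter format is an assumption based on common SEO practice.
from urllib.parse import urlencode

def near_search_url(query, city):
    """Return a Google search URL scoped to a city via the near parameter."""
    return "https://www.google.com/search?" + urlencode({"q": query, "near": city})

print(near_search_url("coffee shop", "portland"))
# https://www.google.com/search?q=coffee+shop&near=portland
```

Using urlencode also handles spaces and special characters in the search term, which hand-edited URLs often get wrong.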
Technically, social media efforts do not DIRECTLY affect the search engine rankings of your website. Google uses over 200 signals to assist with determining search engine rankings and social media is currently not one of them.
Jtree offers a wide range of single-use and ongoing consultation services that range from site health and migrations to intelligence and crawl testing. Our consulting services are designed to assist businesses and agencies with gaining increased website exposure in Google.
After you contact us, our team will conduct research and prepare a proposal for service or consultation. Research involves a review of your website’s search engine positioning, market conditions and top competitors.
If you find value in our proposal, we’ll move forward with launching service. If not, that’s OK too. At the very least, you will have received some free insight that can be readily applied.
No. Although website design and development skills go hand in hand with SEO, we focus on what we do best. However, we do work with a number of talented, SEO-friendly web designers and developers that we are happy to connect you with.