Phoenix SEO Services


What is SEO and why do I need it?

To understand where the term “Phoenix Search Engine Optimization Company” comes from, a little history helps. The first search engines appeared in the early 90s, and many more, including Yahoo, were created before Google arrived in the late 90s. That is when the boom in web pages began and people realized you could really make money with them. Website owners concluded that they needed to attract traffic, and what was the best way to attract traffic? Search engines. So they began to think about how they could reach the top positions… and SEO was born!

SEO focuses on organic search results, i.e., those that are not paid:

But let’s get to what matters and why (I think) you’re reading this chapter:

What is Phoenix SEO?

According to Wikipedia, SEO is:

Search engine optimization is the process of improving the visibility of a website in the organic results of different search engines. It is commonly known by its English acronym, SEO (Search Engine Optimization).

SEO is one of the “disciplines” that has changed most in recent years. We only have to look at the large number of Penguin and Panda updates and how they have produced a 180-degree turn in what was understood as SEO until recently. What SEO now pursues is what Matt Cutts himself calls “search experience optimization,” or in other words: everything for the user.

Although there are many factors a search engine uses to rank one page above another, we could say there are two basic ones: authority and relevance.

Authority is basically the popularity of a website: the more popular a page is, the more valuable the information it contains is presumed to be. This is the factor a search engine weighs most, since it is based on users’ own experience; the more a piece of content is shared, the more users have found it useful.
Relevance is the relationship a page has with a given search. It is not simply a matter of a page containing the search term many times (in the beginning it worked that way); a search engine relies on hundreds of on-site factors to determine it.

In turn, SEO can be divided into two groups: on-site SEO and off-site SEO.

On-site: On-site SEO is concerned with relevance, ensuring that the website is optimized so the search engine understands the main thing: its content. On-site SEO includes keyword optimization, loading time, user experience, code optimization and URL format.
Off-site: Off-site SEO is the part of SEO work that focuses on factors external to the website. The most important off-site factors are the number and quality of links, social media presence, mentions in local media, brand authority and performance in search results, i.e., the CTR our results get in a search engine. You are probably thinking that this is all very well and very interesting, but you are here because you need SEO on your website and you will benefit if you integrate it into your online strategy.

SEO can also be divided by whether or not we follow the search engine’s “recommendations”: black hat SEO or white hat SEO.

Black Hat SEO: Black hat refers to attempts to improve a website’s search rankings through unethical techniques or techniques that contradict the search engine’s guidelines. Examples of black hat SEO are cloaking, spinning, spam in forums and blog comments, and keyword stuffing. Black hat can provide benefits in the short term, but it is usually a risky strategy with no continuity in the long run, and it adds no value.
White Hat SEO: Consists of all the ethically correct actions that meet the search engines’ guidelines for ranking a web page in search results. Since search engines give more importance to the pages that best respond to a user’s search, white hat techniques seek to make a page more relevant to search engines by providing value to its users.

2. Why is SEO important?

The most important reason SEO is necessary is that it makes your website more useful for both users and search engines. Although search engines are more sophisticated every day, they still cannot see a website the way a human does; SEO is needed to help them understand what each page is about and whether it is useful to users.

Now let’s look at an example to make things clearer:

Suppose we have an e-commerce site dedicated to selling children’s books. For the term “coloring” there are 673,000 monthly searches; if the first position captures 22% of the visitors for that term, that would be about 148,000 visits per month.

But how much are those 148,000 visits worth? Well, if the average cost per click for that term is €0.20, we are talking about more than €29,000 per month. And that is just in Spain. If your business targets several countries, consider that 1.4 billion searches are performed worldwide every hour; of those searches, 70% of clicks go to organic results and 75% of users never reach the second page. Taking all this into account, that is a lot of clicks per month for the first result.

Hiring a Phoenix SEO company is the best way for users to find you in the searches where your page is relevant. These users are looking for exactly what you offer them, and the best way to reach all of them is through a search engine.
3. How do search engines work?

How a search engine works can be summarized in two steps: crawling and indexing.
Crawling

A search engine crawls the web with what are called bots. These bots traverse all the pages through links (hence the importance of a good link structure), just as any user browsing the web would, moving from one link to another and collecting data about those websites for their servers. The crawling process begins with a list of web addresses from previous crawls and from sitemaps provided by other websites. Once the bots access these websites, they look for links to other pages to visit. Bots are particularly attracted to new sites and to changes on existing websites.

The bots themselves decide which sites to visit, how often, and for how long they will crawl the website, so it is important to have an optimal load time and updated content.

It is very common for a website to need to restrict the crawling of certain pages or content to prevent them from appearing in search results. For this you can tell search engine bots not to crawl certain pages through the “robots.txt” file, as in the sketch below.
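As a minimal sketch (assuming a hypothetical /private/ directory you want to keep crawlers out of), a robots.txt placed at the root of the site could look like this:

# Ask all crawlers to stay out of the /private/ directory
User-agent: *
Disallow: /private/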
Indexing

Once a bot has crawled a website and collected the necessary information, the pages are included in an index, where they are sorted according to their content, their authority and their relevance. That way, when we make a query, it is much easier for the search engine to show us the results most related to it.

At first, search engines were based on the number of times a word was repeated on a page: when you made a search, they combed the index for those terms to find which pages contained them in their texts, ranking higher the page that repeated them the most times. From that point on, search engines became more sophisticated and now base their rankings on hundreds of different signals such as publication date, whether the page contains photos and videos, etc.; after all, if you search for the word “house” you surely do not want to see a page with the word “house” repeated thousands of times.

Once the pages are crawled and indexed, the search engine’s algorithm comes into play: algorithms are formulas and computer processes that turn questions into answers. The algorithm searches the index for the answers that best fit the question we have asked, based on hundreds of signals such as the terms on the websites, the freshness of the content, our region or PageRank.

How does Google rank a website?
We know that a search engine first collects a list of the pages on the web and then orders them according to its algorithm, but what determines which results Google shows first?

Mainly on two factors: the authority and relevance of your website; external factors (naturalness, policed by Google Penguin) and internal factors (quality, policed by Google Panda). These are the two halves of SEO, its yin and yang.
Authority and Relevance

To put it simply, Google’s algorithm is based on authority and relevance. Relevance depends largely on internal factors of the website: the content (text and images), how it is programmed, the loading time, etc. Authority, however, is based largely on external factors, above all the links other pages give to that page.
Relevance

The relevance of a website means how well it responds to the question a user asks a search engine.

So, if we search for tourist information about Phoenix, Google will consider the official Phoenix tourism website more relevant than a fly-fishing page. When multiple websites compete to be relevant for a search, Google takes into account the content of each one, which is older, which provides more information, and so on.

In the example above, if two sites have the same content, the older one is the original and therefore more relevant. Other ways to be more relevant would be to have many other pages within the website talking about Phoenix, or to provide more information in different formats and of higher quality.
Authority

Authority is how important and reliable a website is for the user, compared to the others.

For Google, the web functions as a democracy in which sites vote for each other; in this case, the votes are links. The more important a page is, the more its vote is worth. That is why pages with more links pointing to them have higher authority, and two websites with similar themes compete over the number of links pointing to each.

For example, if a search for locksmiths in Phoenix returns 3 pages with very similar content, Google will give a better ranking to the one with more links from other pages, so the 3 will fight to gain more links and more authority.

How much is a link worth?
The value of a link is determined by, among others, the following factors:

Authority of the linking website: A link from a website with great authority, such as a university or a government agency, will pass on great authority. A link from a page with very little authority, or one blacklisted by Google, can be harmful and diminish your own authority.
Number of links on the page: If the page that links to you has only one link, and it goes to your website, it is giving you 100% of the authority it can convey. In SEO this is called link juice, as if the linking site were a fruit whose juice (authority) is transmitted through straws (links). Conversely, if the page has 10 links to 10 different domains, each link only passes on 10% of the juice.
Anchor text: The anchor text is the clickable text of the link, and it gives Google information about your website. For example, if many pages link to you with the anchor text “tourism in Phoenix,” it is easier for Google to rank you for that term than if they linked to you with generic text (see the sketch below). Be careful: many SEOs have abused this technique, so you should not overdo it. Google has updated its algorithm to spot unnatural link patterns and penalize suspicious sites, and this includes anchor text.
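As a quick illustration of anchor text (the URL is hypothetical), the words between the opening and closing tags are what both users and Google read:

<a href="https://www.example.com/phoenix-guide">tourism in Phoenix</a>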

What does Google understand about my website?

When a Google bot visits a website, it is not seeing the same thing you see as a user: it looks almost exclusively at the HTML code of the page. Do you have a beautiful, useful website full of photos? Google will not know, unless you include HTML tags for the photos. If there is no text Google can recognize, it is very difficult for the page to be relevant, since Google will be unable to know what the website is about and whether it provides more or less value than those it competes with.

The main problems that prevent Google from understanding a website are:

Missing text in the HTML
Text written in image format rather than HTML
Sites built in Flash: Google is unable to read the information contained in a Flash file

What on-page factors does Google take into account?

When we speak of factors within a website, we refer to on-page or on-site SEO: the part of SEO that tries to make it as easy as possible for Google to identify what a website is about. It is not essential, but you should have a basic knowledge of HTML to understand the following points.

The main elements to analyze on a page are:

Page text: It is obvious that the page has to talk about the subject you want to rank for, using the keywords for which you want to be found.
URL of the page: It is beneficial for your page’s address to contain the keyword and a logical folder structure. For example, if you want to sell a Persian carpet, which option is best?
www.tutiendaonline.com/item_2178
www.tutiendaonline.com/alfombras/alfombra-persa

In this case, the second option gives Google much more information: the page sits inside a carpets category (and so is similar to the other products there), and it is related to the term “Persian carpet.”
Page title: This is the text that identifies everything on the page; it is also what we see in the browser window.
H1: The H1 tag in HTML marks the most important text on a page.
Meta description: This meta tag gives Google a brief description of what the page is about. It carries no direct SEO weight, but it defines the text that accompanies the search result, which is vital for the user experience (see the HTML sketch below).
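By way of a minimal sketch (the wording is hypothetical, following the Persian carpet example above), these elements sit together in the page’s HTML like this:

<head>
  <!-- Unique, descriptive, under 70 characters, contains the keyword -->
  <title>Persian Carpet - Handmade Rugs | TuTiendaOnline</title>
  <!-- No direct ranking weight, but shown as the snippet in results -->
  <meta name="description" content="Handmade Persian carpets in wool and silk, with free shipping.">
</head>
<body>
  <!-- The most important text on the page -->
  <h1>Persian Carpets</h1>
</body>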


In the picture:

1. Page title
2. URL of the page
3. Description of the page: Google takes it from the meta description.
4. Structure of the page: Google is able to identify the most important pages from your site’s structure and give them direct links (sitelinks).

Some results may contain microdata information: if we add microdata to our website, Google can read it and display it in search results. This information may include user reviews, the author’s photo and name, video, audio, etc. A minimal sketch follows.
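A minimal sketch of microdata using the schema.org vocabulary (the product and figures are hypothetical); Google can read attributes like these and enrich the search result:

<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Persian Carpet</span>
  <div itemprop="aggregateRating" itemscope itemtype="https://schema.org/AggregateRating">
    <!-- Average user rating and review count, shown as stars in results -->
    <span itemprop="ratingValue">4.5</span> stars from
    <span itemprop="reviewCount">120</span> reviews
  </div>
</div>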
Usability and User Experience in SEO

Another factor Google takes into account, beyond the on-page SEO of our website, is the user experience we offer, that is, how satisfied a user is after clicking on our result.
How does Google measure user experience?

CTR of the results: The percentage of users who click on each result is a figure the search engine knows. If a result below the first one is getting more clicks than its share (because it is a page with a good reputation, because its title or description answers the query well, or because the result includes additional information such as an author photo or video), Google may consider it more relevant and give it a higher ranking.
Bounce: If a user clicks on our result and then returns to Google and clicks on another, it means the first result did not satisfy their search, and Google may consider it less relevant.

What should we do to improve the user experience?
Empathy is our greatest ally in meeting users’ needs. When you create content, always think about who it is for: What information do they need? In what format? In what tone?

For example, suppose we build a page about national highways for drivers. We could fill it with content about the highways’ construction, layout and curiosities, but drivers will not be interested in that. They will, however, be interested in traffic conditions on those highways, even if there is less content. They will probably want it in a graphic, easy-to-understand format adapted to mobile, since they will check it from their phones. Maybe they would want alternative routes in case of a traffic jam? The best places for lunch and dinner?

If people come to your website, stay, and enjoy it, Google will have that information. Moreover, you may generate links passively: people who want to share the great content you have created, which will mean a big profit.

Another important factor is design: make sure the fonts, colors and resources are tailored to your audience while setting you apart.

Loading time is also essential: an extra 2 seconds of load time can mean a 25% increase in bounce rate. Note that load time not only worsens the user experience; it is also a factor that directly influences your website’s rankings and the ability of search engines to index all of its pages.
Social Media and SEO

Social networks have both a direct and an indirect influence on a website’s SEO.

A social network is usually a website with great authority, and a page can get thousands of links on them. On the other hand, these platforms typically require the user to be logged in, which greatly limits a search engine’s ability to detect those links.

The Bing search engine has an agreement to access Facebook data, which it uses to complement its searches. Google, for its part, has its own social network, Google+, whose profile links are among the ranking factors that most influence a website’s authority.
Is the relationship between SEO and social networks limited to these two cases?

No. These two cases, along with the links generated on social networks that Google can detect, are the only ones in which the influence is direct, but social networks have a greater effect indirectly.

First, social networks let people share a link to a page or piece of content with large audiences, which can also make it go viral. Here, you have to picture link acquisition as a funnel: of 100 users who visit you, X users will link to your site, generating an SEO benefit.
Social networks are also the best way to build branding for a young website and a community around it. This will generate traffic that will, in turn, result in links.
Future trends in SEO

To put it in context, SEO at its inception was a matter of repeating keywords and getting links with specific anchor texts; now it counts more than 200 factors, including social networks, user experience and mobile browsing.

The takeaway is that search engines are going to become increasingly difficult to cheat, and SEO should only serve to help them understand that our content is the best for the user, not to trick the search engine or the person visiting us. It is a tool that complements other online marketing disciplines.
Making an optimized website for SEO

Now that you know what SEO is and what main factors Google takes into account when ranking a website, what remains is knowing what you have to do so that your website has a real chance to rank.

In this chapter we will discuss how to optimize the main factors in SEO rankings, the problems that arise when optimizing a website, and their possible solutions.

The topics treated in this chapter are divided into 3 main sections:

Accessibility and indexability
Content
Technical factors


The first thing to do when optimizing a website so that it is accessible to Google is to put yourself in Google’s skin. Users do not see a web page the same way a search engine does, so to check that what Google sees is what we want it to see, we follow a series of steps:

Install this plug-in: Web Developer for Mozilla Firefox or Google Chrome
Disable JavaScript: we do this because if drop-down menus built with JavaScript are not accessible to Googlebot, it cannot crawl the links they contain.
Disable CSS: we do this because Google reads a page in the order of the HTML, and CSS sometimes changes where objects are placed on the page.
How does your website look without JavaScript or CSS?

Here is how Google sees 40deFiebre. Not as pretty as it looks to a user, is it? But as you can see, the drop-down menu links are clickable and nothing on the page is out of place.

Keep several things in mind when you check your website through Google’s eyes:

Can you see all the links in your menus?
Do all links appear as plain text?
Are all links clickable?
Is there any piece of text that was not visible before? If so, fix it as soon as possible; even if involuntary, it is a very serious offense to Google, because it means you are hiding text from your users.
Do your sidebar or widgets appear above your content at the top of the page? It is important that the most important links are in the upper part of the page.

Main accessibility problems

There are many factors that make a website more or less accessible to Google, but certain problems in particular can prevent the Google robot from reaching every page of a website, especially the deepest ones:
Website structure

If the structure of a website is too deep, Google will find it harder to reach all of its pages. It is recommended that the structure be no more than 3 levels deep (not counting the home page), since the Google robot has a limited time to crawl a website: the more levels it has to go through, the less time it will have to access the deepest pages. That is why it is always recommended to build a horizontal web structure rather than a vertical one:
Vertical structure: 6 levels deep
Horizontal structure: flattened to 3 levels deep
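As a hypothetical sketch in URLs, here is the same product buried six levels down in a vertical structure versus within three levels in a horizontal one:

Vertical:   www.navy-gold.com/books/children/stories/adventure/pirates/treasure-island
Horizontal: www.navy-gold.com/childrens-books/treasure-island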

As you can see, in the second diagram the information is much better organized by category and is much more accessible to the Google robot.
Internal linking

The Google robot uses links to move between the pages of your website so it can index them. So what happens if a particular page has no internal or external link pointing to it? The Google robot cannot access it.
It is very important to make sure every page of a website has at least one link pointing to it. To check this we can use Screaming Frog: just run the program, set the filter to show only pages with HTML content, and scroll to the right looking for the “Inlinks” column.

For this reason it is very important to have a horizontal structure, since it makes it much easier to create a good network of internal links.
Loading speed

We will look at this in detail later in this chapter, but keep in mind that besides affecting rankings in itself, a very high load time means the Google robot will use up the time it allots to your site without reaching all of its pages.
Indexability

Once the Google robot has accessed a page, the next step is indexing it: the page is included in an index where pages are sorted according to their content, their authority and their relevance, so that Google can then access them more easily and quickly.
How to check if Google has indexed my site correctly?

The first thing you have to do to know whether Google has indexed your website properly is to search with the “site:” operator (for example, site:www.navy-gold.com); Google will then tell you the approximate number of your pages it has indexed:

If you have linked your website to Google Webmaster Tools, you can also check the actual number of indexed pages under Google Index > Index Status:

If you know (more or less) the exact number of pages your website has, this information lets you compare the pages Google has indexed with the actual pages of your website. Three scenarios can occur:

The two numbers are very similar: everything is in order.
The number in the Google search is lower: Google is not indexing many of your pages.
The number in the Google search is higher: your website probably has a duplicate content problem.

So your website can have one of two problems: Google has fewer pages indexed than actually exist on your website, or Google has more pages indexed than actually exist. In either case, we will see what may be causing it and what action to take to fix it.
Google has fewer pages indexed

This is a more common problem than it seems, and it is usually due to two things: either the Google robot cannot reach all the pages of your website and therefore cannot index them (review the accessibility part of this chapter to fix that), or you are blocking the Google robot and preventing it from indexing certain pages.

Assuming the accessibility of your site is perfect, let’s find out how you might be blocking the Google robot and how to fix it:
The robots.txt file

The robots.txt file is used to prevent search engines from accessing and indexing certain parts of a website. Although it is very useful for keeping pages you do not want shown out of Google’s search results, it can also end up blocking access to your website without you realizing it.

In this extreme case, the robots.txt file blocks access to the entire website:

User-agent: *
Disallow: /

What you should do is manually check the robots.txt file (usually at www.navy-gold.com/robots.txt) and make sure it is not blocking any important part of your website. Once you have checked and corrected it following Google’s guidelines, verify that there are no errors under Google Webmaster Tools > Crawl > Blocked URLs.

If there is a problem with the file, an error icon will appear where you can see the faults that have been detected.
Robots meta tag

The robots meta tag tells the Google robot whether or not it can index the page and follow the links it contains. What we have to check is that this tag does not appear in the code of any page we want Google to index:

<meta name="robots" content="noindex, nofollow">

To quickly find out whether a page is using this tag, use Screaming Frog: click on the “Directives” tab and look at the “Meta Data 1” field:

Once you have found all the pages with these tags, you only have to remove them.
Very low caching frequency

Once Google indexes a page for the first time, it decides how often it will revisit that page looking for updates. This depends on the authority and relevance of the domain the page belongs to and on how often the page is updated.

To know your page’s caching frequency, you have to manually check the days that pass between one crawl and the next, using the “cache:www.navy-gold.com” command in Google:

Check it every day until Google caches it again; then you will know the frequency.

If you consider the frequency too low, it may be due to 4 reasons:

Your website is not relevant enough for Google to visit it more often.
You do not update your website’s content often.
Your web server returns too many errors and 404s, which makes Google believe your site has lost relevance. To check whether your page has this type of error, look in Google Webmaster Tools > Crawl > Crawl Errors.
You have lowered the crawl rate in Google Webmaster Tools. To edit it, see Google Webmaster Tools > Site Settings; ideally, let Google decide.

Google has more indexed pages

Most likely, the reason there are more pages indexed than actually exist on your website is that you have duplicate content, or Google is indexing pages you do not want indexed. You can check whether you have duplicate pages under Google Webmaster Tools > Search Appearance > HTML Improvements.
Duplicate content

Having duplicate content means that several URLs serve the same content. It is a very common problem, often involuntary, and it can also carry a Google penalty. These are the main causes of duplicate content:
Home page “canonicalization”

This is the most common cause of duplicate content: it occurs when the home page has more than one URL:

navy-gold.com
www.navy-gold.com
navy-gold.com/index.html
www.navy-gold.com/index.html

Each of the above points to the same page with the same content. If you do not tell Google which one is the right one, it will not know which to rank, and it may rank the one you do not want.

Solution

You have 3 options:

Set up a redirect on the server to make sure only one version of the page is shown to users.
Define which subdomain you want to be the main one (“www” or “non-www”) in Google Webmaster Tools.
Add a “rel=canonical” tag to each version, pointing to the one you consider correct, as in the sketch below.
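For example (a sketch using this guide’s sample domain), each alternative version would include this tag in its <head>, pointing at the version you want Google to treat as the original:

<link rel="canonical" href="http://www.navy-gold.com/" />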

Parameters in the URL

There are many types of parameters, especially in e-commerce: product filters (color, size, rating, etc.), sorting (lowest price, relevance, highest price, grid view, etc.) and user sessions. The problem is that many of these parameters do not change the content of the page, which generates many URLs for the same content.

www.navy-gold.com/boligrafos?color=negro&precio-desde=5&precio-hasta=10

In this example we can see three parameters: color, minimum price and maximum price.

Solution

The solution to any parameter problem is to add a “rel=canonical” tag pointing to the original page; this avoids any confusion on Google’s part about which page is the original.

Another possible solution is to tell Google, through Google Webmaster Tools > Crawl > URL Parameters, which parameters it should ignore when indexing the pages of your website.

Pagination

When an article, product list, or tag or category page spans more than one page, duplicate content issues may occur: even if the pages have different content, they are all focused on the same subject. This is a huge problem on e-commerce sites that have hundreds of items in the same category.

Solution

There are currently the rel="next" and rel="prev" tags, which let search engines know that all the pages belong to the same category or publication, so that Google does not index every page and instead concentrates the full ranking potential on the first page.

Using the NEXT and PREV tags
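A sketch for a hypothetical page 2 of a paginated category (the URLs are assumptions), telling search engines where the previous and next pages live:

<link rel="prev" href="http://www.navy-gold.com/alfombras?page=1" />
<link rel="next" href="http://www.navy-gold.com/alfombras?page=3" />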

Another solution is to find the page parameter and enter it in Google Webmaster Tools so that it is not indexed.
Pages you do not want indexed

The difference between these pages and the ones we classified as duplicate content is that these are unique pages that, for one reason or another, you do not want Google to index. To check for pages that are being indexed even though you do not want them to be, whether private pages or the checkout pages of an e-commerce site, just search “site:www.navy-gold.com” and check whether those pages appear in the results.

If these pages appear, you can do two things:

Add the <meta name="robots" content="noindex, nofollow"> tag to the pages you do not want indexed
Modify the “robots.txt” file so they are not crawled

For example, on a WordPress website you usually do not want the contents of “/wp-admin” or “/wp-includes” indexed in Google. With the first method, you would have to include the robots noindex meta tag on every such page; with the second, you upload a robots.txt file with the following content:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Content
Why is content important?

The first thing that comes to mind is “Content is king,” but to put it simply: a page will never rank for something it does not contain. That is, if you want to rank for “Halloween” and you do not talk about Halloween on your website, you never will.

Let’s divide the content of a website into two parts:

Meta tags: the content the user does not see, i.e., HTML tags
Texts: the visible content

On-Site SEO: Meta Tags

In this section we will refer to on-site SEO as meta tags (bearing in mind that on-site SEO also covers things like loading speed and the texts themselves). The main tags to consider are the following.

Title

The title tag is the most important meta element: it is the first thing that appears in Google’s results, and it is what is shown when people share the link on social networks (except Twitter).

There are a number of things to consider when optimizing the title:

The tag must be inside the <head> section of the code
Each page should have a unique title
It should not exceed 70 characters, or it will appear cut off
It must describe the content of the page
It must contain the keyword for which we are optimizing the page
We must never stuff the title with keywords: it makes users wary and makes Google think we are trying to deceive it

Another thing to consider is where to put the “brand,” i.e., the name of the website. It is usually placed at the end, to give more weight to the keywords, separated from them by a dash or a vertical bar, as in the sketch below.
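A sketch of that pattern, with a hypothetical store name as the brand after the separator:

<title>Persian Carpets | TuTiendaOnline</title>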

Meta description

Although it is not a ranking factor, it significantly affects the click-through rate (CTR) in the search results.

For the meta description we follow the same principles as for the title, except that its length should not exceed 155 characters. For both titles and descriptions we must avoid duplication; we can check this in Google Webmaster Tools > Search Appearance > HTML Improvements.

Meta keywords

At one time meta keywords were a very important ranking factor, but when Google saw how easy they were to manipulate, it eliminated them as a ranking factor. Since their only remaining use is to categorize news on Google News, the best you can do is delete them.

H1, H2, H3 tags

H1, H2, etc. tags are very important for a good information structure and a good user experience, since they define the hierarchy of the content, which will considerably improve SEO results. We must give special importance to the H1, because it is usually at the top of the content, and the higher a keyword is placed, the more importance Google will give it. A sketch follows.
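A sketch of a simple content hierarchy (the headings are hypothetical): one H1 for the main topic, H2s for its sections, H3s for subsections:

<h1>Guide to Persian Carpets</h1>
<h2>How they are made</h2>
<h2>How to choose one</h2>
<h3>Sizes</h3>
<h3>Materials</h3>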

Google Authorship

This tag has gained importance in the last year: it links content to a Google+ profile, producing an enriched search result and increasing the CTR (click-through rate). An enriched authorship result looks like this:

To implement this tag successfully, you have to follow a series of steps:

Have a Google+ profile
Make sure your face is recognizable in your profile picture
Your name must appear on the content page
The author’s name must be the same as on Google+
Link your Google+ page to your website by adding the link under “Contributor to”

You also have to link the content to your Google+ page in the page code, as in the sketch below:
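A sketch of the rel="author" markup that was used for this (the profile ID is a placeholder):

<a href="https://plus.google.com/your-profile-id" rel="author">Your Name</a>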


The “alt” attribute for images

The “alt” attribute is added directly in the code of the image itself:

<img src="http://www.navy-gold.com/navy-gold.jpg" alt="keyword molona" />

This attribute must describe the image and the content that surrounds it, since it is what Google reads when it encounters an image, and it is one of the factors used to rank it in Google Images.
Texts

Content is the most important part of a website. No matter how well optimized your website is at the SEO level, if its content is not relevant to the searches users make, it will never appear in the top positions.

To create good content that follows Google’s guidelines, you must keep a number of things in mind:

The page must have enough content; although there is no standard measure of how much is “enough,” it is recommended that it contain at least 300 words
The content must be relevant and useful to the reader; just ask yourself whether you would read it, and you have your answer
The text must contain the keywords for which we want to rank
It has to be well written: no typos, syntax errors or erroneous data
It must be easy to read; if reading it is tedious, it will not do well
It must be shareable; if you do not give users a way to share it, they probably will not. Put social sharing buttons in visible places that do not get in the way of the content, whether it is a video, a photo or text
It must be current; the more up to date your content is, the higher Google’s crawl rate on your website and the better the user experience

Problems with the content

In SEO, as in everything, any excess can become a problem for Google. It is very important that texts are well optimized, but without going too far: if Google detects over-optimization of a text, it can impose a penalty. So we must avoid the following errors:

Keyword stuffing

Keyword stuffing is a technique Google considers black hat SEO, in which a text is written with a very high keyword density in order to rank for that keyword.

Google penalizes this kind of over-optimization very often. To avoid any negative action by Google, texts must always be written to provide value to the user, in the way that best suits your audience: if the text is better, more original, better synthesized, or provides more value or information, that will be a better signal for Google than any variation in the number of keywords the text contains.

Cannibalization

Keyword cannibalization occurs when several pages of a website compete for the same keywords. This confuses the search engine, which does not know which page is the most relevant for that keyword, and it causes a loss in rankings.

This problem is very common on e-commerce sites, where multiple versions of the same product all target the same keywords. For example, if you sell a book in paperback, hardcover and digital versions, you have 3 pages with virtually the same content. To fix it, it is better to create a main product page, from which the pages for the different formats are accessed, and include on each of those a canonical tag pointing to the main product page, as Amazon does.
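A sketch of what each format page would carry in its <head> (the URLs are hypothetical), pointing Google at the main product page:

<link rel="canonical" href="http://www.navy-gold.com/books/treasure-island" />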

Ideally, always focus each keyword on a single page to avoid cannibalization problems.

Content hidden from Google

Google is increasingly “smart” when reading the content of a website: it is able to read text contained within JavaScript, and it may eventually manage to read text within images, but for now it cannot, so the best we can do is avoid publishing important text inside images.
Technical factors: loading speed

Loading speed, besides being an important SEO factor, will make your users fall in love with your website; after all, who likes waiting for a web page to load? In this section we will see how to detect what is slowing your website down and how to solve some of the most common problems.
Measure loading speed with Google Page Speed

Google Page Speed is a free tool from Google that measures a page’s loading speed and also reports what is slowing the load down and how to fix it.

The first thing to do is run the tool on your website’s home page:

The first thing the tool shows you is your website’s overall score. Once you know it, it is time to find out what is slowing your website down the most; there are three types of improvements, ordered by priority:

Naturally, the first thing to do is focus on the high-priority issues: simply click on an icon and the full explanation with a few examples is displayed:

The next thing to do is run the tool on the inner pages of your website, especially those that differ from the home page; for example, at 40defiebre our Inbound Marketing dictionary is completely different from our home page:

We cannot forget loading speed on mobile devices. Use the same process as with the desktop version:
Find the images that slow your website down

Images are one of the factors that most slow down a website, and it is also a very easy trap to fall into.

With Google Images

The first thing you have to do is use the “site:” operator in Google Images to find all the images on your website.
Filter the images by size, starting with an intermediate size, to rule out all the small images.
Look for photos you know are not displayed large on your website and check the actual size they take up.
Where possible, reduce the image size and run it through Yahoo! Smush.it; this tool optimizes your image as much as possible by removing unnecessary information from it.

With Web Developer

This is the fastest way to see the resized images on a particular page; the only drawback is that you have to go page by page:

Install Web Developer for Mozilla Firefox or Google Chrome
View the resized images with Google Chrome
View the resized images with Mozilla Firefox
Where possible, reduce the image size and run it through Yahoo! Smush.it; this tool optimizes your image as much as possible by removing unnecessary information from it.

With Screaming Frog

The best thing about finding images with Screaming Frog is that you can download a CSV with all the image URLs:

Download Screaming Frog
Enter your website into the program and click on the “Images” tab
Filter for images weighing more than 100 KB and export the list
As in the two previous methods, reduce the image size where possible and run it through Yahoo! Smush.it, which optimizes the image as much as possible by removing unnecessary information.

Tools to optimize the speed of your website

Google Closure: a Google tool that optimizes your JavaScript and combines it into a single file.
LESS: a program that generates optimized CSS automatically, besides making the files smaller. For Windows and iOS.
Yahoo! Smush.it: a Yahoo! tool that removes unnecessary information from images without reducing their quality.
YSlow by Yahoo!: a browser extension from Yahoo! for measuring a page’s speed. Its advantage is that you can use it at any time without opening a separate program.
Pingdom Speed Test: along with Google Page Speed, the most widely used tool for measuring loading speed. The best thing about it is that it details the order in which resources load, how long each takes, and the optimization level of each load time.