Introducing Alexa Pro
We are happy to announce that starting today we will begin offering "Alexa Pro" subscription services for site owners.
For years Alexa has been the leading provider of free analytics for the web, and we will continue to provide free web analytics so that webmasters worldwide can monitor their Alexa Traffic Rank and other stats. With the addition of Alexa Pro, website owners can now use Alexa services to improve and grow their online businesses as well.
"We want to give site owners the insights and tools they need to be successful on the web," says Dave Sherfesee, Alexa's General Manager, "These services are the first of Alexa's new professional offerings."
The Alexa Pro Basic subscription is only $9.99/month and is designed to be an affordable plan for small sites. Alexa Pro Basic includes a Premium Listing on Alexa.com and the ability to Certify your website. A Premium Listing includes branding, links back to your website, and greater control over how your website is presented on Alexa.com. If you also choose to Certify your website, you will have the option of publishing your directly measured visitor and pageview metrics. See the Alexa Premium Listing page for more details.
The $149/month Alexa Pro Advanced Plan offers bi-weekly Site Audit reports in addition to all the features in the Basic plan. With reports running automatically twice each month, the Alexa Site Audit monitors a website for performance, security and SEO best practices so site owners can concentrate on growing their businesses. All current Site Audit subscribers have been upgraded to the new Alexa Pro Advanced subscription for free. See the Alexa Site Audit page for more details.
For a comparison of all subscription services see the Alexa Pro subscription comparisons page.
Along with these new services, we have added a new Dashboard feature. The Dashboard is a custom view of Alexa that gathers all reports and sites in one place. From the Dashboard, subscribers can access a private stats dashboard (currently in beta) where they can keep tabs on their most important site metrics.
We are very excited about these new offerings, but this is just the start. We want to help small and medium online businesses not only monitor how they are doing, but also take control and grow their businesses.
Tuesday, April 12, 2011
Introducing the Alexa Toolbar Creator
Did you know that you can now create a toolbar for your website that includes all of your own content and branding, yet is powered by Alexa? You can, and it's easy.
The Alexa Toolbar Creator allows you to build a custom toolbar to help you connect with your visitors and increase traffic to your site. You can choose from a selection of features to add, including your logo, site search, custom menus, links to popular content on your site, and social networking. Also, by including dynamic headlines from blogs or other RSS/Atom feeds, you can message your customers in real time, wherever they are on the web. You can create toolbars for Internet Explorer or Firefox, and yes, Chrome is on the way.
How easy is it to create a toolbar? Easy, but don't just take my word for it. Watch this short video to see just how easy it can be.
Every toolbar built also includes access to Alexa data, so anyone who installs your toolbar will get the advantage of Alexa's website information. It also means anyone who installs your toolbar will have the option of joining the millions of people already in the Alexa Toolbar Panel, helping to make the web a better place for everyone by contributing to Alexa's free, public web metrics.
A custom-built Alexa toolbar is a great way to connect with your customers, build your community, and increase traffic to your website. Check it out!
Wednesday, March 02, 2011
Audit Your Google Analytics Coverage
The Alexa Site Audit now checks your site for Google Analytics tags and tells you which of your pages are missing them. Using a pixel-based analytics solution, such as Google Analytics, is a great way to get insights into how people are using your website. The problem is making sure that every page has the tracking code, or tag, that generates the pixel. For small sites it is easy to check every page manually, but for larger sites that is unrealistic. So now, in addition to great SEO and usability recommendations, the Alexa Site Audit will check your pages for Google Analytics tags for you.
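If you're curious what a check like this looks like under the hood, here is a minimal Python sketch in the same spirit: fetch a page and look for a 2011-era Google Analytics snippet. The URLs and tag patterns below are illustrative assumptions, not the audit's actual implementation.

```python
import re
import urllib.request

# Patterns for the GA snippets of this era: async ga.js and legacy urchin.js.
GA_PATTERNS = [
    re.compile(r"_gaq\.push\(\['_setAccount',\s*'UA-\d+-\d+'\]\)"),  # ga.js
    re.compile(r"_uacct\s*=\s*['\"]UA-\d+-\d+['\"]"),                # urchin.js
]

def has_ga_tag(url):
    """Return True if the page's HTML contains a recognizable GA snippet."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    return any(p.search(html) for p in GA_PATTERNS)

# Hypothetical page list; a real audit would use the full crawl.
pages = ["http://example.com/", "http://example.com/about.html"]
missing = [u for u in pages if not has_ga_tag(u)]
print("Pages missing tags:", missing)
```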
The report is fairly simple, but then it doesn't need to be complicated. We list the tag type, the coverage, how many pages were checked in the audit, and how many pages were missing tags. We also break the list down by subdomain, to make it easier to identify large portions of your site that might be missing tags. If we find tags on every page we crawl, there will be a green check mark indicating everything is fine.
The example report above is for my family blog, and it looks like I am missing Google Analytics tags on 4 pages. To find out which pages those are, I can either click the link or download the entire list of URLs as a CSV file. With this information I can go back and make sure that each and every page on my site has a tag on it, and with my free follow-up report I can verify that I have 100% coverage. No more wondering: with the Alexa Site Audit I know how well my site has been covered with Google Analytics tags.
Thursday, February 03, 2011
Audit Your Landing Pages
We have added what we feel is a cool new feature to the Alexa Site Audit: the Landing Page Auditor. The Landing Page Auditor is a simple yet powerful interactive tool designed to help you give your landing pages the best SEO boost you can.
You start by choosing a page you want to audit. It can be any page on your site, but you want to concentrate on the few pages you really want to drive organic search traffic to. The idea is to do everything you can to help search engines learn what these pages are about, and to emphasize that these are the pages to show when people search for your keywords.
When you click on the Landing Page Auditor, the first things you see are tables containing an overview and a histogram of all the anchor text on your site for links to the page you are analyzing. If you look at the screenshot to the right, which was taken from a report run on my family blog, you can see that my site has 3064 links to my home page. You can also see how the number of links breaks down by anchor text phrase.
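To make the idea concrete, here is a rough Python sketch of how a histogram like this could be assembled from crawl data. The (source, target, anchor text) link list is an assumed input for illustration, not the auditor's actual data model.

```python
from collections import Counter

# Assumed crawl output: one tuple per link found on the site.
links = [
    ("http://example.com/post-1", "http://example.com/", "Home"),
    ("http://example.com/post-2", "http://example.com/", "Home"),
    ("http://example.com/about",  "http://example.com/", "my family blog"),
]

target = "http://example.com/"  # the landing page being audited
histogram = Counter(anchor for _, dest, anchor in links if dest == target)

for phrase, count in histogram.most_common():
    print(f"{count:5d}  {phrase}")
```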
Looking at your anchor text is a great place to start. First, make sure that your text is descriptive and that you have avoided phrases like "click here" or "more". For my blog, over 1000 pages have "Home" as anchor text. These "Home" links, while fine for visitors to my blog, are of little value to a search engine. Also, while you are looking over your anchor text, make sure each phrase describes the page you are auditing. With the exception of "Home" and a few navigation links, the anchor text to my blog's home page seems to be reasonable so I'll move on.
Once you are comfortable with your anchor text, the next step is to enter the keyword or keyword phrase you want to audit. Landing pages should be fairly specific, so you should optimize each page for three to five keywords. There are a small number of crucial places you want your keyword to appear, and when you enter your keyword a new column will appear telling you which of these crucial places your keyword appears in. In the example to the right, you can see that my keyword phrase is missing in all but the page text. I guess I have some work to do.
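As a rough illustration, here is a small Python sketch of this kind of keyword check. The report doesn't enumerate the crucial places here, so the sketch assumes a typical on-page list: title, meta description, headings, and page text.

```python
# `page` stands in for fields already extracted from the crawled HTML;
# the values are made up for the example.
page = {
    "title": "Our Family Blog",
    "meta_description": "Stories and photos from the Smith family.",
    "headings": ["Welcome", "Latest Posts"],
    "body_text": "Stories about our family adventures around town...",
}

def keyword_coverage(page, keyword):
    kw = keyword.lower()
    return {
        "title": kw in page["title"].lower(),
        "meta description": kw in page["meta_description"].lower(),
        "headings": any(kw in h.lower() for h in page["headings"]),
        "page text": kw in page["body_text"].lower(),
    }

for place, found in keyword_coverage(page, "family adventures").items():
    print(f"{place:17s} {'found' if found else 'MISSING'}")
```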
By default we show you the results of the page crawled as part of the site audit, but you can also check your live site by pressing the "Update From Live Site" button. Since the anchor text data is built by looking at your entire site, the only way to update that data is to run another report.
When a keyword has been entered, the anchor text column on the left also changes slightly. The data is the same, but it is split between links whose anchor text contains the keyword and links whose anchor text does not. In the example, fewer than a third of the links on my site to my home page have anchor text containing my keyword phrase. I am fine with that, since I want my anchor text to be optimized for other keywords as well, and I want to avoid keyword stuffing.
Once I am happy with my page and have decided on the changes I want to make to my anchor text, I can move on to the next keyword or my next landing page. That's it. As I said, the Landing Page Auditor is a simple yet powerful tool to help you optimize your website for organic search.
Tuesday, February 01, 2011
Measuring the Internet Blackout in Egypt
It has been widely reported that the Egyptian government has essentially flipped a "kill switch" on the internet, making access in or out of the country largely unavailable starting just after midnight Friday, as large political protests took place. Limited numbers of sites have been routinely blocked by a number of countries, but Egypt has taken the unprecedented step of blocking access to all sites for most of its users. Media reports on the outage have quoted analyses based on the availability of Egyptian networks and servers from outside the country, yielding estimates that around 88% to 92% of connections had initially gone dark before a last working ISP (the Noor Group) also pulled the plug late on Monday.
Alexa's worldwide panel of internet users, on the other hand, allows us to estimate the fraction of users who have been affected within the country. The results paint an even more severe picture of the shutdown, with more than 99.3% of Egyptian users blocked starting on Jan 28.
In the graph below, we scale the number of Egyptian users per day so that the number of users we observed on Jan 21 is set to 100. Note that the vertical axis is plotted with a logarithmic scale. The size of our panel is sufficient to measure the outage with a very high degree of statistical significance; we have assumed here only that users of the Noor Group ISP are not significantly less likely to be included in our panel than users of other Egyptian ISPs.
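The scaling itself is simple; here is a tiny Python sketch of it. The raw counts below are made-up placeholders (chosen to reproduce the indexed values in the table that follows), not Alexa's actual panel numbers.

```python
# Index each day's Egyptian user count to the Jan 21 baseline (= 100).
raw_users = {"Jan 21": 14500, "Jan 28": 100, "Jan 29": 97,
             "Jan 30": 90, "Jan 31": 102}  # placeholder counts

baseline = raw_users["Jan 21"]
for day, count in raw_users.items():
    print(f"{day}: {100.0 * count / baseline:.2f}")
# On a log axis, the fall from 100 to roughly 0.7 is plainly visible.
```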
Date   | Users
Jan 28 | 0.69%
Jan 29 | 0.67%
Jan 30 | 0.62%
Jan 31 | 0.70%
We have also observed a huge increase of over 200% in traffic to aljazeera.net, as people try to learn about the events unfolding in Egypt. The next graph below shows the percentage of internet users who visited the site each day over the past week. More information is available on our site info page.
Thursday, October 28, 2010
Has traffic to your site dropped? You're not alone.
As many webmasters and website owners by now know, sometime around October 21st Google changed how they ranked search results. This change, or possibly changes, caused the traffic to some sites to drop by as much as 80%. The Google Webmaster Forums are alive with questions about what happened, and how webmasters should react (for examples see here, here, and here).
At Alexa we can confirm that this wasn't something isolated to a few websites, but rather a change in the search results shown by Google that is shifting traffic across the web. Also, while some sites are losing traffic, others are seeing gains of 30% or more. The exact nature of the change is still under investigation, but it is possible that Google made an algorithmic change in how they rank search results. This is very serious for many sites. If a website is trying to monetize its traffic, whether through selling products, lead generation, or simply showing ads, a sudden drop in the number of visitors can represent an unexpected and possibly significant change in the bottom line.
This shifting of web traffic is best illustrated by looking at the daily Reach graphs for different sites. A website's reach is determined by the number of unique visitors to a site, shown as a percentage of the total number of people on the Internet that day.
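The arithmetic behind a reach number is simple enough to show in a couple of lines of Python; the counts here are made up for illustration.

```python
unique_visitors = 250_000         # unique visitors to the site that day (made up)
internet_users = 1_900_000_000    # rough number of people online that day (made up)
print(f"Daily reach: {100.0 * unique_visitors / internet_users:.4f}%")  # 0.0132%
```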
First, here is a site that was showing nice growth over the past three months, and then took a sudden hit in traffic between the 21st and 22nd. I've only shown one example here, but this is happening across the web.
Search traffic isn't quite a zero sum game, but it's close and some sites have experienced a significant increase in visitors due to the Google change. Among the biggest winners, oddly enough, are file sharing and torrent sites. It's tempting to speculate why Google might be sending more visitors to file sharing sites, but it's still too early in our analysis to say anything definitive.
So far the changes in traffic to affected sites have been sustained, so this appears to be a deliberate change on the part of Google rather than a transient glitch in their system. There were some indexing issues prior to the change that caused some sites to report problems as early as the 19th, but according to our data the change to Google's search results went live on October 21st around 3 PM Mountain View time. The change may be related to the "Mayday" change, but given how quickly web-wide traffic shifted, it seems unlikely this was something that had been slowly percolating through Google's indexing system over the past six months.
Our analysis of the October 21st event is ongoing, and I will update this blog as we uncover more information.
Thursday, July 22, 2010
The Alexa Site Audit
Today I thought I would write about one of our newest offerings, the Alexa Site Audit. The Site Audit takes an in-depth look at your website, grades it, and recommends ways to make it easier for people to find and use. This is a project I've been personally involved in since the beginning, so I am especially excited to be blogging about it today.
Once a report is initiated, our Site Audit crawler crawls your website. Depending on how many pages your site has, and how quickly we can fetch them, this can take up to 12 hours. We then process the crawled pages, which can take several hours, and only when the report is done do we bill you. Because the process takes some time, we will send you an email when the report is complete. If you log in and see your report marked "payment pending," don't worry: it can sometimes take over an hour for the payment to complete, although most of the time it's much faster.
When you click on the finished report, the first page you see is an overview with your site's grade and our top recommendations for improving it. The report is broken into five sections: Crawl Coverage, Reputation, Page Optimization, Keywords, and Stats. The first three sections contribute to your grade, while the last one is for anyone interested in the details of the crawl.
I've included screenshots of a report run on my family blog. As you can see I received a grade of "C". Ouch. I guess I have some work to do. It looks like I need to work on getting links to my site from more popular sites, and I need to make sure I have relevant title and meta descriptions on each of my pages.
Crawl Coverage: This section of the report is about the structure of your website, and how easy it is for crawlers and visitors to find pages. In the example you can see that the reachability of pages on my site is quite poor: over two thirds of the pages were more than three clicks away from the home page. I obviously need to make content on my site easier for visitors to find. I can also see whether I've accidentally blocked any important search engines from crawling my site, how many temporary redirects were found, and so on.
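For readers who like to see the mechanics, here is a back-of-the-envelope Python sketch of the click-depth measurement: a breadth-first search from the home page over a link graph. The toy adjacency map is an assumption standing in for real crawl output, and this is an illustration of the idea, not our crawler's implementation.

```python
from collections import deque

# Assumed crawl output: each page mapped to the pages it links to.
site_graph = {
    "/": ["/archive", "/about"],
    "/archive": ["/2010/01", "/2010/02"],
    "/2010/01": ["/2010/01/post-1"],
    "/2010/01/post-1": ["/2010/01/post-1/photos"],
    "/2010/01/post-1/photos": [],
    "/2010/02": [],
    "/about": [],
}

def click_depths(graph, home="/"):
    """Breadth-first search: minimum number of clicks from the home page."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for link in graph.get(page, []):
            if link not in depth:
                depth[link] = depth[page] + 1
                queue.append(link)
    return depth

depths = click_depths(site_graph)
deep = [p for p, d in depths.items() if d > 3]
print(f"{len(deep)} of {len(depths)} pages are more than 3 clicks from home")
```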
Reputation: The reputation of a site, at least in terms of PageRank, is based on how many sites link to it and what their reputations are. My blog is in the 9th percentile for inbound links, which means that 91% of sites with a similar Alexa Traffic Rank have more inbound links than my site does. I guess I really do need to work on getting other sites to link to me. Note that by default the graph in the example screenshot is hidden, but if you click on "Learn More" you will see it for your site.
Page Optimization: This section gives recommendations on how to improve the pages of your site, as opposed to the Crawl Coverage section, which makes recommendations based on site structure. Here we identify things like duplicate content (when two or more URLs show the same "page"), pages with little text, missing image descriptions, broken links, and so on. The report found two broken links on my blog, which I probably never would have found otherwise.
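One common way to flag duplicate content, sketched in Python below, is to fingerprint each page's normalized text and group URLs that share a fingerprint. The page data is made up, and this is only an illustration of the idea, not necessarily how the Site Audit detects duplicates.

```python
import hashlib
from collections import defaultdict

# Assumed crawl output: URL mapped to the text extracted from the page.
crawled = {
    "http://example.com/post?id=7":      "Same story text here.",
    "http://example.com/2010/01/post-7": "Same story text here.",
    "http://example.com/about":          "About our family.",
}

groups = defaultdict(list)
for url, text in crawled.items():
    # Collapse whitespace before hashing so trivial formatting differences
    # don't hide duplicates; real auditors are more robust than this.
    fingerprint = hashlib.sha1(" ".join(text.split()).encode()).hexdigest()
    groups[fingerprint].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("Duplicate content:", urls)
```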
Keywords: This section recommends keywords to buy if you advertise your site on search engines. We also suggest words to use when linking to pages within your site. Using descriptive words in link text makes it easier for search engines to understand what the linked-to pages are about, and using popular yet relevant words will make the linked-to pages easier to find in search engines.
Stats: This section provides a short summary of the crawl of your site we performed. It includes information such as the number of pages we requested, the errors we encountered, and the unique hosts we crawled. Even if you're not that interested in stats, I do recommend looking over the Unique Hosts Crawled. If you see any sites on the list you don't recognize, it may mean you have some spam links somewhere on your site.
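If you want to eyeball the hosts yourself, extracting them from a list of crawled URLs takes only a few lines of Python; the URL list here is an assumed stand-in for the report's data.

```python
from urllib.parse import urlparse

crawled_urls = [
    "http://example.com/",
    "http://photos.example.com/album/1",
    "http://cheap-pills.example.net/landing",  # a host you might not recognize
]

for host in sorted({urlparse(u).netloc for u in crawled_urls}):
    print(host)
```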
I hope this inspires you to run an Alexa Site Audit, and that you find it useful. We welcome all questions and feedback. You can post to the Alexa help forums, in the comments section here, or send an email to siteaudit@alexa.com.