Archive for the ‘Web Marketing’ Category
Dorcas Alexander wrote on the Luna Metrics blog recently about an important and often-overlooked topic: Organizing the campaign information you can gather in Google Analytics. I’m following up here with a way to document your campaigns. This method also solves the problem of constructing the special URLs used to create those campaigns in the first place.
If that seems a little opaque to you, read on. I suggest you start with this excerpt of Dorcas’ post:
It’s so easy to tag your campaigns for Google Analytics that you can quickly fill your reports with a mishmash of labels and end up with campaign tag soup! But what’s the best way to get organized? Even if you know what medium and source mean, it’s not always obvious how you should fit campaign info into those slots. And what about the extra slots we get for campaign tags like campaign and content and term?
It goes on to list four simple steps to preventing confusion. The fourth discusses documenting your work. It recommends how — by setting up a Google Docs spreadsheet, which can be shared among all content or analytics team members. She goes on to say, “Another good thing about using a spreadsheet is that a formula can pull all your labels together into a campaign-tagged URL.”
That’s a great idea, but how exactly can this be done?
Here’s my how-to, an addendum to that Luna Metrics post.
Above is the Google Spreadsheet I created for a former client (I needed to stop working with them when I joined Accenture). I’ve replaced the live information they were using with some of my own, to protect confidentiality. I’ll assume you already know how to set up a free Google Docs account, which includes the use of their cloud-based Excel competitor, Google Spreadsheets.
- Create six columns: Output URL, Target URL, Formula, Campaign, Source and Medium. But wait, you say — where is that third column? It’s the Formula column, and it’s hidden here. I hid it because, a.) it looks identical to Output URL when you have live data in there, so it was redundant, and b.) each cell of that column contains the same formula — one that you definitely don’t want to accidentally change or delete. If I were setting up the system in Excel, I’d make those cells protected.
- Before “hiding” column C, place this formula in it:
=B2&IF(ISERROR(FIND(CHAR(63),B2,1)),"?","&")&"utm_campaign="&D2&"&utm_source="&E2&"&utm_medium="&F2
This formula checks whether the target URL (in cell B2) already contains a question mark. If it does, the campaign parameters are appended with an ampersand instead; if not, a question mark is added first. (Note that all the references — B2, D2, E2, F2 — point at the same row.) After that it builds the trailing URL string that will be familiar to those who roll their own URLs, or use Google’s URL Builder. Once you’re done you’re safe to highlight the column and hide it.
- In the Output URL column, place a far smaller formula:
=C2
Yes, that’s all. Just display the contents of the hidden cell C2 in the visible Output URL cell, A2.
- Populate the Target URL cell in that row with the web address of the landing page you want to tag with campaign information.
- Finally, fill in the Campaign field, along with the Source and Medium fields. These are the unique name of the campaign you wish to credit that visit to, along with the web site or social app it came from (e.g., Twitter, or Jason Falls’ Social Media Explorer blog), and the general medium (e.g., social, or web).
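If you ever need the same logic outside a spreadsheet, here is a minimal Python sketch of what the hidden formula does (the function name and sample values are my own, not from the spreadsheet):

```python
def tag_url(target, campaign, source, medium):
    """Append Google Analytics campaign parameters to a landing-page URL.

    Mirrors the spreadsheet formula: use "?" if the target has no query
    string yet, "&" if it already has one.
    """
    separator = "&" if "?" in target else "?"
    return (target + separator
            + "utm_campaign=" + campaign
            + "&utm_source=" + source
            + "&utm_medium=" + medium)

print(tag_url("http://example.com/landing", "spring-launch", "twitter", "social"))
```

In a real script you would also want to URL-encode the three values (for example with urllib.parse.quote_plus) in case they contain spaces or other special characters.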
That’s it! In the Output URL column you’ll find the finished link. Copy it, and paste it wherever you are setting up a hyperlink on another site or digital channel. For example, that top line shows the URL I used when I was Tweeting about my recent blog post extolling the new release of an Excellent Analytics upgrade.
In the columns to the right of those I’ve shown, you can make notes about when each link was used, why, and how you promoted it. All of this can be helpful when you pull the campaign, source and medium statistics for analysis.
I hope this helps. Let me know what improvements you might have experienced in how to catalog your campaign information.
Tags: GA, google analytics, Google Spreadsheets, luna metrics
Posted in Web Analytics, Web Marketing | No Comments »
If you’ve been following my web analytics work, you know that I’m a major fan of Excellent Analytics. I have news. There has been a major (and much-needed, considering the changes to the Google Analytics API) update of their terrific Excel add-on. Here are a few of the improvements and additions, as listed in the announcement of the Excellent Analytics update:
- All dimensions and metrics are up to date. I.e. everything made available to the API should be included in EA.
- A new tab, “Settings,” has been added to the menu bar. If you’re using a proxy and need to enter your settings, you can now access them at any time; before, the dialog only popped up when it seemed to be needed. In the settings dialog you can also find “Request timeout”; increase this figure if you have problems logging in. “Update metrics” and “Update dimensions” will make it easier for us to ensure you will always be able to access new metrics and dimensions as Google adds them to the API; before, we needed to make a new release of EA for every update.
- You can choose to save your password locally on your computer if you do not want to enter it every time you open Excel. It won’t be stored in clear text. We do not store your password anywhere for you. It’s only stored on your computer. If you don’t want your password stored at all, just don’t check “Remember password.”
- EA checks for updates every time you use it. If there is a newer version of EA you’ll be prompted to download it. You can also ask to be reminded again later.
- Improved user interface. Some things like sorting and profile selection have been moved.
- Make a query and run it for multiple profiles at once! Before, you could only create a query for one profile at a time. Note, however, that when you want to update data you still have to do one update per profile; it’s only the initial creation that can be run against multiple profiles. This makes creating your report templates easier.
- Multiple level sorting of data. I.e., you are able to sort by descending x and then by ascending y, etc.
This application is, for the moment, free and open-source. It’s a valuable web analytics resource!
Posted in Web Marketing | 1 Comment »
My work with Accenture has meant this blog has been silent since I joined. I’m loving my work there, by the way. But as for the central focus of this blog, I’ve been continuing to have fun in my off hours with web marketing analytics, especially using Google Analytics. If you use this app, you know they’ve launched a major upgrade of their reporting. It includes a way to create custom dashboards. Below you’ll find one small way I’ve used these new custom dashboards to save time and gain valuable insights.
Until I joined Accenture I was one of the contributors to Jason Fall’s exceptional social media marketing blog, Social Media Explorer. I miss being in such terrific company (they haven’t kicked me out of their Facebook group, something I’m very pleased about). I also miss those posts and the greater audience they had afforded me for my ideas on measuring social media.
But all was not well. I had always wondered how often people viewed my posts, the way I can with this blog. Yes, I could see which posts were the most likely to go viral. I could get that like anyone, from this summary of all of my posts there.
Then Jason shared with his contributors full reporting access to his Google Analytics metrics. Heaven!
Now I had a different problem: I could see aggregate information, but there was no easy way to view just the information about my pages. If the structure of the site had been, say, “domain.com/jefflarche/blogname,” I could view only the pages starting with /jefflarche/. That’s not the case, though. So I walked away, vowing to someday find a way to create a report that would give me a breakdown of my posts, at least for the KPI of Page Views. I got busy today by creating a new Dashboard for the profile. I then populated it with Widgets. Here you can see what the set up looks like for each widget I added (one per post):
Below are the steps taken in this form:
- I chose the widget called “Metric.” This shows one number only (along with a couple of others, for context), instead of a chart, a timeline or a table
- I chose the metric of Pageviews. But I needed to add a filter. For that, you can see I chose to only show the count for pages that contain a unique string. For this example, I chose the unique string social-media-awareness-measurement/ from this post’s URL
- I gave the widget the title of that post and linked to it so reviewing content for hints of popularity (or lack thereof!) would be easier
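The widget’s “containing” filter is essentially a substring match on the page path. Here is a rough Python sketch of that behavior (the paths and pageview counts are invented):

```python
# Hypothetical (page path, pageviews) rows, like a GA content report.
rows = [
    ("/social-media-awareness-measurement/", 1540),
    ("/some-other-post/", 900),
    ("/social-media-awareness-measurement/?utm_source=twitter", 60),
]

def widget_metric(rows, containing):
    """Sum pageviews for every page whose path contains the filter string."""
    return sum(views for path, views in rows if containing in path)

print(widget_metric(rows, "social-media-awareness-measurement/"))  # counts both URL variants
```

This is also why choosing a string unique to one post matters: a filter string shared by two posts would silently merge their counts.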
So what insights can I glean from this? First of all, it took a while to build an audience. I learned as I went along, from the first post (lower right corner) to the latest (upper left). I knew this from other measures, which made it particularly sad for me to walk away from the posts. I saw growth of 693 percent, comparing the views my first post got versus my last.
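That figure is plain percentage growth, computed as (last - first) / first * 100. A quick sketch with made-up view counts (not the blog’s real numbers):

```python
def percent_growth(first_views, last_views):
    """Percentage growth of the last value over the first."""
    return (last_views - first_views) / first_views * 100

# Hypothetical counts only, for illustration:
print(percent_growth(100, 793))  # 693.0
```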
Turning Information Into Insights
Here are other insights:
- People love “how to” content, and respond to headlines that contain those magical words. (I knew this from my direct response days, but it’s cool how thoroughly this has been carried to the online world.)
- People like to read reviews of relevant books. That’s what I did with the extremely popular post Lessons from the Twitter Love Guru
- Sparklines can give valuable hints to user habits
This last one isn’t readily apparent. I’m going to assume you know what a sparkline is and just say that each of them above shows a sharp rise and fall in readership. After the week in which a post goes live, you can see its views plateau very near zero. It’s to be expected. But there was an outlier, which you could only see if you viewed the full report. It’s shown above right.
Not only did this post not immediately “click” with readers (look at the leading tail), but once it did, its tail at the end is thicker, showing more ongoing popularity. If you’ve been a reader from the start, you’ve already read here and elsewhere about The Long Tail. Here it is in action!
This odd sparkline caused me to dig deeper, and I saw this report for all sources of visits to that page since it was posted (to the right).
It shows a significant number of links from referring sites and search engines. The referrers obviously liked the content enough to send their readers to it. And search engines? This is the ultimate long tail. I even got four visits from Google for the phrase “measure if people share your content on social media.” Believe it or not, this is hotly contested (I no longer show up for this phrase — at least in the top three pages).
By the way, “feed” stands for Feedburner, which means the fourth (or third, depending on how you look at it) source of visits is people who read Jason’s blog using an RSS reader.
As I said, it pays to be in cool company. By the way, here’s a shout-out to Argyle Social. They’re right near the top as a source for clicks to this page. Their latest post, Is Post Automation Effective?, is particularly fitting. I would certainly say yes!
A Link To All of My Social Media Explorer Posts
If the headlines of the above got you curious about my content, I encourage you to visit this summary page, with links to all of them. I’ll be watching this new dashboard to see just how many of you do!
Tags: Argyle Social, dashboards, google analytics, Jason Falls, measurement, Social Media Explorer
Posted in Social Networks, Web Analytics, Web Marketing | No Comments »
If you’ve been wondering why I’ve been so silent lately, it’s because I’ve been in the midst of a major career move. Those of you who know me well are used to my willingness to dive into something and master it. I’d like to think there has been a pattern to it all — that there’s a method to my madness.
To explain how my 20-plus year career has led me to find Accenture (and them to find me!) I’ve put together this video. Enjoy!
Tags: Accenture, google, Search Stories Video Creator, youtube
Posted in Web Marketing | 4 Comments »
Reading my headline, you may be thinking, “Why would I even want the press releases on my corporate site to be more blog-like? Aren’t blogs kind of, well, flaky?” The answer has to do with three dimensions of a sound digital communication strategy:
- Improved engagement
- Improved compliance
- Improved search engine presence
Here’s what I mean by improved engagement. It has to do with engaging (involving, informing) web visitors. If your site provides more information about a newsworthy product or service, the chances are better that a web visitor will want one. There’s an old salesmanship saying: “Telling is selling.”
Of course, engagement isn’t the same thing as boring someone to death. That’s why the web is such a great way to deliver content. When content is served up properly, there are ample spaces for long copy to breathe. This copy can be broken up between several pages, headings or hidden div layers. When a visitor wants more information, he or she simply has to click, or to scan headlines and sub-headings.
Some think you can’t pack much selling information into a stodgy old press release. They’re quite mistaken. Press releases that are loaded into a web site can describe features and benefits. Because of their journalistic style, these descriptions are usually short — or at least written in an easy-to-digest inverted pyramid style.
To keep these online press releases relatively short, yet packed with selling power, writers should link often to more detailed information elsewhere within the site. Just as I have with my link to a definition of the inverted pyramid style of journalism (earlier in this paragraph), a single link can speak volumes for those curious enough to click.
Just as you can (and should!) link off of the press release to other source information, you should consider linking to a given press release from blog posts on that topic. This allows you to speak more loosely in your blog posts without running afoul of the legal team in your company who want to ensure the company won’t lose credibility or get sued.
Below is an example, from the press release section and blog section of the Tripit.com site:
Look at the headline for the blog post: “Tripit Joins Concur to Become Bigger, Better and Stronger.” It’s a lot easier, from a compliance perspective, to say this in a blog post. Journalists and the financial industry — the two groups most likely to be interested in this story — would both regard a press release using this language with some skepticism.
However, by linking to the press release talking about the same event, this level of hyperbole is understandable … even expected. Blogs are about opinion. Press releases are about cold facts.
So how is the press release in this example behaving more like a blog post? You really can’t tell unless you look at the source code, but this press release was posted using the WordPress blogging system. Which leads to better search engine performance …
Improved Search Engine Presence
Here are four ways that serving up your press releases the way Tripit.com has is smart from a search engine marketing perspective:
1. More press releases = more search engine optimization
This technique gets your public relations or marketing teams out of the mindset that press releases should be rationed out carefully. Just as blog posts go up often, so should your press releases. Mind you, this does not mean more spending with services such as PRWeb or PRNewswire. That’s what got your PR team into the mindset in the first place!
Instead, continue to use those far-reaching press release distribution services for the big announcements. When you do, replicate the press releases on your site. Then post others that aren’t earth-shattering news, but can support other product or service launches or upgrades.
Most corporate site web visitors don’t seek out press releases. But if you link to specific releases from blog posts, the content will reinforce what’s in that post. The intra-site links will also slightly boost the rankings of both pages in the eyes of many search engines.
2. Double the odds of getting into top search engine results pages
With the one-two punch of a blog post and a press release, you double your chances of ranking high. That assumes that none of the content between the two items is shared. Search engines frown on identical copy on the pages they index. True, you can excerpt the press release in your blog, but don’t get carried away.
3. Provide search engines with a second RSS feed
One of the reasons that search engines love to rank blog posts high in results pages is that they are timely and easily accessible through RSS feeds. When a blog post goes live, a search engine that has access to its RSS feed has the inside scoop, so to speak. The same can be said for your press releases when you use a blogging platform such as WordPress to publish them.
4. Benefit from search-engine-friendly page URLs, categories and much more
Here is the full URL of the press release example shown above:
This is exactly the sort of URL that search engine spiders can sink their teeth into!
Finally, a press release can be given a “category,” such as Tripit’s “Company Announcements” (the full name is truncated; you can see its left edge in the upper right corner, below). This allows that category page to be indexed by search engines as well. Here’s an example from the Tripit.com Press Release section:
I could add one more reason to use this technique: It’s extremely easy to install and test. Try it with your corporate site. If you already have, let me know what you think of it.
Have you noticed, by the way, that I haven’t mentioned press releases in PDF format once? It’s only because I trust you, dear reader, to not even dream of using this unwise tactic with your company. But if you have no choice, please share this post with the powers that be and beg them to reconsider their folly.
The marketing power of a press release is a terrible thing to waste!
Tags: cms, concur, press release, public relations, seo, tripit.com, wordpress
Posted in Branding, Search Engine Marketing, Web Marketing | 2 Comments »
There was much excitement when Google Analytics unveiled its Events metric. It meant web analysts could store several levels of information on a specific action, and associate that information with a unique web visit and visitor. Before that, if you wanted to — let’s say — record a download, you’d need to create a Virtual Pageview.
So why did I recently blog on Jason Falls’ site about creating Virtual Pageviews when recording interest actions, such as “Send to a Printer,” or sharing actions, such as “Email a Friend” or “Share on Facebook”? Why don’t I just create Events?
Using AddThis To Talk To Google Analytics
The answer is simple: If you consider sharing to be a goal of your site, you may want to set it as a Google Analytics (GA) Goal. Events, for all of their power, can’t be set as Goals.
Another action that Events are commonly used for is downloading white papers. Events seem perfect for this because you can set and capture a number of variables, such as title. In other words, you can set the Event Label as the title of the paper. But if you want to measure this as a Goal in GA, you’re out of luck.
Events don’t even “talk” to Goals. Let’s say you want to generate a report showing how many people who downloaded a white paper remained on the site for three or more minutes. The time on site can be set as a GA Goal, but you can’t easily generate a report showing the percentage of downloaders who remained on the site for that time period.
You can do all of this with GA Virtual Pageviews.
My rule of thumb is this: If you need to identify more than one variable with an event (such as identifying various Actions and Labels), and you do not need to correlate these with GA Goals, use Events. For all else, stick with Virtual Pageviews.
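To make the distinction concrete, here is a toy Python sketch of what a URL-destination Goal “sees” (the hit log and paths are invented; this illustrates the logic, not GA’s internals):

```python
# Invented hit log. A virtual pageview shares the "page" dimension with real
# pageviews, so a URL-match Goal can count it. An Event lives in separate
# fields, which is why (in GA at the time) it could not be set as a Goal.
hits = [
    {"type": "pageview", "page": "/whitepapers/"},
    {"type": "pageview", "page": "/virtual/download/whitepaper"},
    {"type": "event", "category": "Download", "label": "Whitepaper"},
]

def goal_completions(hits, goal_prefix):
    """Count hits that a URL-destination Goal would match."""
    return sum(1 for h in hits
               if h["type"] == "pageview" and h["page"].startswith(goal_prefix))

print(goal_completions(hits, "/virtual/download/"))  # the Event hit is invisible to the Goal
```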
How To Track Content Interest Index In GA Using AddThis
Here is that how-to post I was referring to:
How To Measure Interest Using Google Analytics and AddThis, posted on Social Media Explorer by Jason Falls.
Tags: addthis, CII, content interest index, GA Events, google analytics, Jason Falls, Social Media Explorer, Virtual Pageviews
Posted in Web Analytics | No Comments »
If you’ve been following my (roughly) monthly posts on Jason Falls’ blog you know that I’ve taken this tack: On his blog I cover the key concepts of a particular web analytics approach, then provide additional support for that idea here.
A recent example is from two months ago. I posted about the use of Brownie Charts as a way to report Content Interest Index. I posted a parallel piece here on another use of the technique (Using Brownie Charts to Measure Bounce Rates). You could say this blog has become my laboratory: Results of preliminary experiments are described here, while the “real” story is broken on Jason’s blog. Tomorrow will be a little different.
Tomorrow, on Jason’s blog, I’ll be posting on someone else’s innovation. It is a review of an extraordinary book: Hashtag Analytics. I’m a huge fan of its author, Kevin Hillstrom, and over the years I’ve spent way too many hours creating Excel-driven models in order to replicate and fully understand his findings.
I’ll be doing that again, this time in support of Kevin’s approach to monitoring Twitter communities. Check back at this tag (hashtag-analytics) to read updates on my “lab work.” I’ll be reporting over the next several weeks.
When A Hashtag Community Member Is “Removed”
You may want to check Kevin’s blog as well — especially later this week, when Kevin reports on the future vitality of the hashtag community #measure. He posted about it last week. Now he plans to theoretically whack an active member. Here’s an excerpt from his post, where he invites readers to suggest whom to “remove”:
In every e-commerce company, somebody is responsible for forecasting sales for the next twelve months, by day. So it makes logical sense that any community manager would want to know what the future of his/her community is, right? This is something you don’t find in any of the popular Twitter-based analytics tools. This is my focus. This is what I love doing, it’s completely actionable, and it’s an area of analysis not being explored!
Next week [the week starting January 24 — that’s today!], we’ll do something neat — we’ll remove one important user from the community, and we’ll see if the absence of the individual harms or helps the future trajectory of the community. If you are an active participant in the #measure community, please send me a user_id that you’d like to see removed in the forecast … I’ll run an example for the individual who gets the most votes.
And in two weeks, we’ll compare the #measure community to the #analytics community … competing communities doing similar work … which community is forecast to have a stronger future?
It’s a fun stunt / modeling experiment that has real-world implications. It should serve as a proof of sorts of the predictive power of his Hashtag Digital Profiles and the statistical work behind them. More relevant to online community managers, it should illustrate the tremendous importance of showing your participants “love,” lest they never return.
What To Expect Here
I will be applying my own Hashtag Analytics to a different online group — one that has the advantage of weekly meetings. It’s a fairly new group, so the rules may not fully apply (Does an acorn sprout follow the same natural laws of growth as a full-grown tree?). To ensure I don’t jinx my test or influence the community — in a far more direct way than Heisenberg was referring to — its identity will remain unknown until I’ve gathered and analyzed a critical mass of data.
Do stop back.
January 25, 2011 Update:
Here are two related links I didn’t have yesterday. The first is Kevin’s post where he removes that member from the #measure hashtag community. The second is my review of his book today on Jason Falls’ blog:
Tags: hashtag analytics, kevin hillstrom, minethatdata
Posted in Social Networks, Web Analytics | No Comments »
Yesterday I was conducting a “Web Analytics Forensics” session with a new client. They posed a common question: The monthly reports on clicks that they were getting from suppliers of their ad buys were off by 10 percent — sometimes even more — compared to their own Google Analytics metrics. The number differences were veering all over the road. Sometimes these vendor reports seemed to overstate traffic, other times the clicks seemed to be understated. When I responded, I was reminded of the reassurances that a friend of mine gives. He’s a pediatrician.
My friend Paul had told me once that the bane of pediatricians everywhere are the late-night calls from parents who are worried about their child’s fever. He doesn’t mind being awakened (well, not much), but he has trouble fully reassuring parents of this fact: Fevers are normal, even healthy. If a child doesn’t run a fever every so often, it’s then that he’d be alarmed!
Don’t take Paul’s word for it. Here is a post in the New York Times from a couple of days ago on this very subject.
Client concerns about analytics discrepancies are my own profession’s “fever fears.” They can be a distraction from deeper problems. (The doctor in the post mentioned a mom whose child had a fever and abdominal pains. He said his primary concern was the abdominal symptom, but Mom kept steering things back to the fever!)
So the answer to the promise I gave you in the headline is this. For media buy discrepancies, don’t bother trying to resolve them!
Unless you think you are being overcharged for the traffic you’re buying from vendors, in the form of ad clicks or affiliate links, rest assured. I’d only be worried if their numbers were identical to yours. No two systems measure web traffic precisely, or the same way. They are all uniquely flawed.
So instead, focus on what you’re doing with this traffic after it arrives. Are visitors finding what they came for? Are they returning in healthy numbers? And most importantly …
Are they converting?
Usually when I’m called in to consult, the answer to all of these questions appears to be no. Focus your attention, and your boss’s, on these issues. They are the only path to online marketing success.
Tags: google analytics
Posted in Web Analytics, Web Marketing | 2 Comments »
Occasionally I share links to blog posts of others in my industry. Some things are too good to keep to myself. Here’s a perfect example, from Luna Metrics:
We have a customer who considers the SEO we do for her to be one of her “sales channels” and we get ranked along with her other channels. She sends us reports when a lead comes in and when a lead is closed. The other day, I saw that she closed one that was worth not quite half a million dollars. (!! that was my reaction, too.) So I wrote her and said, how awesome. To which she replied,
“Started with google analytics. Saw that they spent some time on the site… sicked Jane on a cold calling mission… after a bunch of calls she found the engineer at the company who was interested in the product. I flew out.. presented… sold and they put out a public bid. Our company is the low bidder and need to send a sample next week for review then release of contract.”
So in case you are wondering, she was talking about the [Google Analytics] Network Locations report, which she mines daily for sales leads.
It’s hard to believe all of this was accomplished by reviewing a typically-overlooked report in a totally free analytics package. Read the rest of their post, and check out this helpful post from the past on how to exploit the fact that many large organizations will self-identify in this report instead of resolving to their ISP’s name.
Tags: b2b, lead generation, luna metrics
Posted in Web Analytics, Web Marketing | No Comments »
Here are the most popular posts on this blog from the past year.
Enjoy, and have a great New Year!
Posted in Web Analytics, Web Marketing | No Comments »
Today on Jason Falls’ Social Media Explorer, I discuss my new favorite data visualization technique — one that I’m starting to move into production with web analytics reports I create for clients. Its official name is the Tree Map, but as I mention in that post, I prefer to call it The Brownie Chart.
That post has an example of how I use brownie charts to show a promising new web metric, the content interest index (CII). My example on the site uses a made-up business, Everything Brownies, with a web address of EB.com.
Note: Yes, I know. That web address resolves to a real encyclopedia site. The reason I didn’t just make up a domain name is you never know when one will go live with a site. I didn’t want to have someone inform me, two months from now, that my blog is now pointing to a porn or gambling site! Unless Encyclopedia Britannica takes a surprisingly sleazy turn, I think I’m safe.
Here is another example of how the tree map / brownie chart can make web analytics reporting easier to understand:
Charting Bounce Rates: “I came, I saw, I puked.”
I agree with Avinash Kaushik that bounce rates are a helpful way to measure how well you’re connecting with site visitors. Actually, he’s a little more enthusiastic than I am, with blog post titles such as this model of understatement: Bounce Rate: The Sexiest Metric Ever? Three years ago, on his own blog, Avinash described bounce rates this way:
So what is this mysterious metric? In a nutshell bounce rate measures the percentage of people who come to your website and leave “instantly.”
They’re the one-page visitors. Yes, they might be finding what they were looking for — but more often than not, these people just didn’t dig the neighborhood.
Avinash has refined his description over time. In his recent, truly outstanding book on measuring web traffic, Web Analytics 2.0, he characterizes bounce rates this way: “I came, I saw, I puked.”
Bounces can be reviewed for all traffic to a site, or only for certain important segments — traffic from search engines is a good example. Reporting of bounce rates can also be broken down by page.
The brownie chart becomes particularly handy for this per-page bounce rate reporting. It helps those responsible assess the severity of a site’s problem pages.
You see, you can’t easily be sure that a page with a high bounce rate really is a problem page. Think of it: If nearly everyone ups and leaves upon arriving at a particular page, but that page gets relatively little traffic, there’s no huge emergency. Content management resources are usually scarce, so it’s better to keep looking for pages that attract more page views and happen to have comparatively high bounce rates. It’s those more popular pages that require immediate first aid!
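That triage rule is easy to express in code: weight each page’s bounce rate by its traffic, so a high-bounce page only tops the list if it also draws real views. A sketch with invented numbers for the fictitious EB.com:

```python
# (page, pageviews, bounce rate) rows with made-up numbers for EB.com.
pages = [
    ("Holiday Brownie Baking Kit", 12000, 0.18),
    ("Deluxe Baking Pan",          15000, 0.72),
    ("Brownie Mix Sampler",         3000, 0.65),
    ("About Us",                     800, 0.40),
]

def first_aid_order(pages):
    """Rank pages by bounced views (pageviews * bounce rate), so a
    high-bounce page only ranks high if it also gets real traffic."""
    return sorted(pages, key=lambda p: p[1] * p[2], reverse=True)

print(first_aid_order(pages)[0][0])  # "Deluxe Baking Pan" needs help first
```

The brownie chart encodes the same two values visually, as area (pageviews) and color (bounce rate), instead of collapsing them into one score.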
To illustrate, take a look at Everything Brownies’ bounce rates on this brownie chart. The graphic shows all major pages of this fictitious site, and shows the pages as more red if they have the highest bounce rates relative to the others. You should know that size represents the relative numbers of page views. The bigger the “brownie piece,” the more views that page gets.
The Holiday Brownie Baking Kit, which I placed my mouse over in this screen capture, has an excellent (i.e., low) bounce rate. It also has a ton of page views.
That means this page is doing quite well in keeping visitors from leaving immediately. Well done! On the other hand, Deluxe Baking Pan is not nearly as successful. Its relative bounce rate is quite high, and because it has the most page views of the entire site, it’s clear this page is majorly dropping the ball!
There are plenty more insights, but you get the picture.
As I mentioned on Jason’s blog, what I like about this charting format is non-math types (such as myself!) can understand these statistics immediately, and know exactly what needs to be investigated further — and in what order of priority. As my friend Bob likes to say, “That’s good stuff!”
I hope you find the potential of this charting technique as exciting as I do.
A Round of Applause for BeGraphic and Sparklines for Excel
This example of a fake report for EB.com, as well as the one on Social Media Explorer, was produced using an “Add-on” for Excel called BeGraphic. The Add-on consists of a whole suite of graphic tools — all based on Excel data and rendered within that application. The particular functions I used were part of Sparklines for Excel within the BeGraphic suite. I urge you to support the folks behind these amazing visualization tools.
Tags: Avinash Kaushik, brownie charts, CII, content interest index, Jason Falls, Occam's Razor, Social Media Explorer, Web Analytics 2.0
Posted in Visualization, Web Analytics | No Comments »