Another look at our fascinating country:
Full disclosure: U.S. Census is a client.
Below is a new “embeddable” version of the U.S. Census Bureau’s Glossary widget:
Full disclosure: U.S. Census is a client!
Dorcas Alexander wrote on the Luna Metrics blog recently about an important and often-overlooked topic: Organizing the campaign information you can gather in Google Analytics. I’m following up here with a way to document your campaigns. This method also solves the problem of constructing the special URLs used to create those campaigns in the first place.
If that seems a little opaque to you, read on. I suggest you start with this excerpt of Dorcas’ post:
It’s so easy to tag your campaigns for Google Analytics that you can quickly fill your reports with a mishmash of labels and end up with campaign tag soup! But what’s the best way to get organized? Even if you know what medium and source mean, it’s not always obvious how you should fit campaign info into those slots. And what about the extra slots we get for campaign tags like campaign and content and term?
It goes on to list four simple steps to prevent confusion. The fourth discusses documenting your work. It recommends setting up a Google Docs spreadsheet, which can be shared among all content and analytics team members. She goes on to say, “Another good thing about using a spreadsheet is that a formula can pull all your labels together into a campaign-tagged URL.”
That’s a great idea, but how exactly can this be done?
Here’s my how-to, an addendum to that Luna Metrics post.
Above is the Google Spreadsheet I created for a former client (I had to stop working with them when I joined Accenture). I’ve replaced the live information they were using with some of my own, to protect confidentiality. I’ll assume you already know how to set up a free Google Docs account, which includes use of Google’s cloud-based Excel competitor, Google Spreadsheets.
=B2&IF(ISERROR(FIND(CHAR(63),B2,1)),"?","&")&"utm_campaign="&D2&"&utm_source="&E2&"&utm_medium="&F2

This formula checks whether the target URL (in cell B2) already contains a question mark. If it finds one, it appends an ampersand instead; if it finds no question mark, it adds one. After that it builds a trailing URL string that will be familiar to those who roll their own URLs, or use Google’s URL Builder. Once you’re done, you can safely highlight the column and hide it.
=C2

Yes, that’s all. Just display the contents of the hidden cell C2 in the visible Output URL cell.
That’s it! In the Output URL column you’ll find the finished link. Copy it, and paste it wherever you are setting up a hyperlink on another site or digital channel. For example, that top line shows the URL I used when I was Tweeting about my recent blog post extolling the new release of an Excellent Analytics upgrade.
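The formula’s logic is easy to sketch outside a spreadsheet, too. Here is a minimal JavaScript version; the function name and example values are my own, for illustration only:

```javascript
// Build a campaign-tagged URL, adding "?" only if the
// target URL doesn't already contain a query string.
function tagUrl(url, campaign, source, medium) {
  var separator = url.indexOf("?") === -1 ? "?" : "&";
  return url + separator +
    "utm_campaign=" + encodeURIComponent(campaign) +
    "&utm_source=" + encodeURIComponent(source) +
    "&utm_medium=" + encodeURIComponent(medium);
}

// Example: tagging a (hypothetical) blog post link shared on Twitter
var tagged = tagUrl("http://example.com/post", "ea-upgrade", "twitter", "social");
```

The separator check mirrors the ISERROR(FIND(...)) test in the spreadsheet formula: a URL that already carries a query string gets an ampersand rather than a second question mark.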
In the rows to the right of those I’ve shown, you can make notes about when it was used, why, and how you promoted the link. All of this can be helpful when you pull the campaign, source and media statistics for analysis.
I hope this helps. Let me know what improvements you might have experienced in how to catalog your campaign information.
If you’ve been following my web analytics work, you know that I’m a major fan of Excellent Analytics. I have news: there has been a major (and much-needed, considering the changes to the Google Analytics API) upgrade of their terrific Excel add-on. Here are a few of the improvements and additions, as listed in the announcement of the Excellent Analytics update:
This application is, for the moment, free and open-source. It’s a valuable web analytics resource!
Twenty years ago the consumer tech magazine to read was PC/Computing. This, in spite of its stupid name. (Spell it out: “Personal Computing Computing.”) I recall an ad from its back pages for some long-forgotten software. Buy the software and you’d get a bonus bumper sticker, reading: In the future everything will work. Those were the days of crappy dial-up modems and the crash-prone Windows 3.1, so this was high sarcasm.
In honor of the upcoming Rock the Green music festival, to be held on Milwaukee’s lakefront September 18, I submit this similarly absurd proclamation: Technology will fix our planet. What’s more, I assert that your guffaws (yes, I can hear them now) are in fact further evidence of its certitude. Here’s how:
Yes, a chain mail glove, produced by Within Technologies. It represents an amazing technology; one that uses one-tenth the metal of conventional manufacturing and arguably just as little fossil fuel. What’s more, there isn’t a single seam or joint in the whole works. It was manufactured by a printer.
In the future we’ll be printing things, not just pictures or descriptions of things. Things made of metal, fiber, just about any material. This printing will take place well beyond the factory floor. It will happen in the back of auto repair shops, clothing stores, hobby shops. Out of the nozzles of printers will issue the very supplies and inventory of future commerce. Someday we may even be printing our own “stuff,” right in our home.
This will save huge amounts of fuel. Needless to say, that will spare our planet from megatons of pollutants annually. And yes, I know I’m sounding all PC/Computing. But contrary to the bumper sticker, I know these printers won’t work perfectly. Nothing ever does. There will be problems. But it won’t matter. The technology will arrive, whether we invite it into our lives or not. And if this particular revolution doesn’t take place, I’m confident another will. It will be just as extraordinary. I blogged about one other tech revolution: printing not things in three dimensions but the very food we eat, using living cattle, pig or chicken cells instead of metals and plastics. Yes, printing T-bone steaks.
I base my confidence — my optimism — on history. The world is full of smart people. Always has been. But we have a massive blind spot for the reality of our lives in the future because we’re limited by our imaginations.
Take the telephone. The phone has been a tool of almost universal good, aiding both business and society. It has saved lives (think: Dial 9-1-1) and saved massive transportation costs (as when you phoned ahead before driving to the store and discovering they’re sold out anyway).
I’ve written before about the phone, and how no one thought that technology would be much beyond a less precise telegram. Here’s an excerpt from that 2007 Business Journal post:
Executives in the telegraph industry couldn’t imagine that a device with no written record of the communication could be a threat to their business. In fact, legend has it that William Orton, the president of [the once mighty telegram company] Western Union … was offered a chance to buy Alexander Bell’s phone patent for $100,000. The story goes that he replied, “What use could this company make of an electric toy?”
The emphasis is mine, but you get the point.
I do suggest you click on this link to a post on my personal blog, about, of all things, growing meat in the lab. It’s another new technology that I think will be as revolutionary as the telephone.
To be fair, it’s not so much the technology that makes it revolutionary — this one uses in vitro lab culturing, which isn’t new — but the good it can do. It promises to feed the world, and in doing so conserve a staggering amount of natural resources and carbon-emitting fuels. I’m not kidding.
When they hear about culturing meat in a lab, most people laugh or wince. I frankly don’t blame them. I may be a bit of a whack job to think this level of aversion can be overcome, but I do believe the writer of the New Yorker piece, whom I cite in that post, when he says he expects to see it arrive, in at least a limited way, within ten years.
Read it, and I dare you to disagree. It’s that much of a game-changer.
Which brings me back to 3-D printers. They’ve actually been around for about 20 years. I saw my first one at my brother’s company, ten years ago. Originally used to print manufacturing prototypes, now they’re frequently being upgraded to manufacture the things themselves. Here’s an excerpt from a recent Economist story about them:
Far-fetched as this may seem, … people are using three-dimensional printing technology to create … medical implants, jewellery [sic], football boots designed for individual feet, lampshades, racing-car parts, solid-state batteries and customised [sic again — hey, they’re British] mobile phones. Some are even making mechanical devices.
At the Massachusetts Institute of Technology (MIT), Peter Schmitt, a PhD student, has been printing something that resembles the workings of a grandfather clock. It took him a few attempts to get right, but eventually he removed the plastic clock from a 3D printer, hung it on the wall and pulled down the counterweight. It started ticking.
Although the article focuses on the savings in raw materials (it takes only ten percent of the metal to make something complicated in this way, compared to the wasteful machining and whittling away of metal blocks), there are other clear implications for a greener planet.
It will happen when these 3-D printers become more affordable. Think about it: There was a time when only well-financed businesses could afford a fax machine. Time improved the technology and drove costs down. Now you can buy a fax machine for about fifty bucks.
What if these printers were to follow the same trajectory? At first only the largest businesses could afford to produce their own machine parts. Then smaller businesses could afford it. But they wouldn’t be faxing each other digitized pictures of documents. They’d be transmitting the digitized plans to actually make stuff.
Jeff Jarvis famously said that the reason businesses like Amazon.com and Apple’s iTunes have prevailed over stores that sell real books and CDs is that moving atoms from place to place is costly. Digital transportation, by comparison, is frictionless. Atoms are a drag.
Imagine a world where the friction is taken out of moving real things around. Instead of the rotor to your car’s disk brakes arriving by truck at the repair shop, the part would arrive digitally, and perfectly configured to your vehicle. Or at least the plans for it would. Then a tiny pile of metal and composite powder, a small fraction of the size of what’s needed using today’s technology, would be fed into a 3-D printer. You’d be on the road more quickly, at a lower cost, and at a lower cost to the planet.
What do you think about a future where atoms are no longer a drag? Do we dare to dream of our grandkids inhabiting a world with more fresh water, less pollution and fewer pollution-borne illnesses?
I say we must.
My work with Accenture has meant this blog has been silent since I joined. I’m loving my work there, by the way. But as for the central focus of this blog, I’ve been continuing to have fun in my off hours with web marketing analytics, especially using Google Analytics. If you use this app, you know they’ve launched a major upgrade of their reporting. It includes a way to create custom dashboards. Below you’ll find one small way I’ve used these new custom dashboards to save time and gain valuable insights.
Until I joined Accenture I was one of the contributors to Jason Fall’s exceptional social media marketing blog, Social Media Explorer. I miss being in such terrific company (they haven’t kicked me out of their Facebook group, something I’m very pleased about). I also miss those posts and the greater audience they had afforded me for my ideas on measuring social media.
But all was not well. I had always wondered how often people viewed my posts, the way I can with this blog. Yes, I could see which posts were the most likely to go viral. I could get that like anyone, from this summary of all of my posts there.
Then Jason shared with his contributors full reporting access to his Google Analytics metrics. Heaven!
Now I had a different problem: I could see aggregate information, but there was no easy way to view just the information about my pages. If the structure of the site had been, say, “domain.com/jefflarche/blogname,” I could have viewed only the pages starting with /jefflarche/. That’s not the case, though. So I walked away, vowing to someday find a way to create a report that would give me a breakdown of my posts, at least for the KPI of Page Views. I got busy today by creating a new Dashboard for the profile. I then populated it with Widgets. Here you can see what the setup looks like for each widget I added (one per post):
Below are the steps taken in this form:
So what insights can I glean from this? First of all, it took a while to build an audience. I learned as I went along, from the first post (lower right corner) to the latest (upper left). I knew this from other measures, which made it particularly sad for me to walk away from the posts. I saw growth of 693 percent, comparing the views my first post got with my last.
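A growth figure like that is just a percentage change between the first and last counts. The view counts in this quick sketch are invented for illustration (the actual numbers aren’t shown here):

```javascript
// Percentage growth between two page-view counts.
// (793 - 100) / 100 * 100 = 693, i.e. 693 percent growth.
function percentGrowth(firstViews, lastViews) {
  return ((lastViews - firstViews) / firstViews) * 100;
}

var growth = percentGrowth(100, 793); // hypothetical first vs. last post
```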
Here are other insights:
This last one isn’t readily apparent. I’m going to assume you know what a sparkline is and just say that each of them above shows a sharp rise and fall in readership. After the week a post goes live, you can see views plateau very near zero. It’s to be expected. But there was an outlier, which you could only see if you viewed the full report. It’s shown above right.
Not only did this post not immediately “click” with readers (look at the leading tail), but once it did, its tail at the end is thicker, showing more ongoing popularity. If you’ve been a reader from the start, you’ve already read here and elsewhere about The Long Tail. Here it is in action!
This odd sparkline caused me to dig deeper, and I saw this report for all sources of visits to that page since it was posted (to the right).
It shows a significant number of links from referring sites and search engines. The referrers obviously liked the content enough to send their readers to it. And search engines? This is the ultimate long tail. I even got four visits from Google for the phrase “measure if people share your content on social media.” Believe it or not, this is hotly contested (I no longer show up for this phrase — at least in the top three pages).
By the way, “feed” stands for Feedburner, which means the fourth (or third, depending on how you look at it) source of visits is people who read Jason’s blog using an RSS reader.
As I said, it pays to be in cool company. By the way, here’s a shout-out to Argyle Social. They’re right near the top as a source for clicks to this page. Their latest post, Is Post Automation Effective?, is particularly fitting. I would certainly say yes!
If the headlines of the above got you curious about my content, I encourage you to visit this summary page, with links to all of them. I’ll be watching this new dashboard to see just how many of you do!
If you’ve been wondering why I’ve been so silent lately, it’s because I’ve been in the midst of a major career move. Those of you who know me well are used to my willingness to dive into something and master it. I’d like to think there has been a pattern to it all — that there’s a method to my madness.
To explain how my 20-plus year career has led me to find Accenture (and them to find me!) I’ve put together this video. Enjoy!
Reading my headline, you may be thinking, “Why would I even want the press releases on my corporate site to be more blog-like? Aren’t blogs kind of, well, flaky?” The answer has to do with three dimensions of a sound digital communication strategy:
Here’s what I mean by improved engagement. It has to do with engaging (involving, informing) web visitors. If your site provides more information about a newsworthy product or service, the chances are better that a web visitor will want one. There’s an old salesmanship saying: “Telling is selling.”
Of course, engagement isn’t the same thing as boring someone to death. That’s why the web is such a great way to deliver content. When content is served up properly, there are ample spaces for long copy to breathe. This copy can be broken up between several pages, headings or hidden div layers. When a visitor wants more information, he or she simply has to click, or to scan headlines and sub-headings.
Some think you can’t pack much selling information into a stodgy old press release. They’re quite mistaken. Press releases that are loaded into a web site can describe features and benefits. Because of the journalistic style of them, these descriptions are usually short — or at least written in an easy-to-digest inverted pyramid style.
To keep these online press releases relatively short, yet packed with selling power, writers should link often to more detailed information elsewhere within the site. Just as I have with my link to a definition of the inverted pyramid style of journalism (earlier in this paragraph), a single link can speak volumes for those curious enough to click.
Just as you can (and should!) link off of the press release to other source information, you should consider linking to a given press release from blog posts on that topic. This allows you to speak more loosely in your blog posts without running afoul of the legal team in your company who want to ensure the company won’t lose credibility or get sued.
Below is an example, from the press release section and blog section of the Tripit.com site:
Look at the headline for the blog post: “Tripit Joins Concur to Become Bigger, Better and Stronger.” It’s a lot easier, from a compliance perspective, to say this in a blog post. Journalists and the financial industry — the two groups most likely to be interested in this story — would both regard a press release using this language with some skepticism.
However, by linking to the press release talking about the same event, this level of hyperbole is understandable … even expected. Blogs are about opinion. Press releases are about cold facts.
So how is the press release in this example behaving more like a blog post? You really can’t tell unless you look at the source code, but this press release was posted using the WordPress blogging system. Which leads to better search engine performance …
Here are four ways that serving up your press releases the way Tripit.com has is smart from a search engine marketing perspective:
This technique gets your public relations or marketing teams out of the mindset that press releases should be rationed out carefully. Just as blog posts go up often, so should your press releases. Mind you, this does not mean more spending with services such as PRWeb or PRNewswire. That’s what got your PR team into the mindset in the first place!
Instead, continue to use those far-reaching press release distribution services for the big announcements. When you do, replicate the press releases on your site. Then post others that aren’t earth-shattering news, but can support other product or service launches or upgrades.
Most corporate site web visitors don’t seek out press releases. But if you link to specific releases from blog posts, the content will reinforce what’s in that post. The intra-site links will also slightly boost the rankings of both pages in the eyes of many search engines.
With the one-two punch of a blog post and a press release, you double your chances of ranking high. That assumes that none of the content between the two items is shared. Search engines frown on identical copy on the pages they index. True, you can excerpt the press release in your blog, but don’t get carried away.
One of the reasons that search engines love to rank blog posts high in results pages is that they are timely and easily accessible through RSS feeds. When a blog post goes live, a search engine that has access to its RSS feed has the inside scoop, so to speak. The same can be said for your press releases when you use a blogging platform such as WordPress to publish them.
Here is the full URL of the press release example shown above:
This is exactly the sort of URL that search engine spiders can sink their teeth into!
Finally, a press release can be given a “category,” such as Tripit’s “Company Announcements” (the full name is truncated; you can see the left edge in the upper right corner, below). This allows that category page to be indexed by search engines as well. Here’s an example from the Tripit.com Press Release section:
I could add one more reason to use this technique: It’s extremely easy to install and test. Try it with your corporate site. If you already have, let me know what you think of it.
Have you noticed, by the way, that I haven’t mentioned press releases in PDF format once? It’s only because I trust you, dear reader, to not even dream of using this unwise tactic with your company. But if you have no choice, please share this post with the powers that be and beg them to reconsider their folly.
The marketing power of a press release is a terrible thing to waste!
Surprisingly there are still skeptics to the power of a brand to add significant value to a business. This usually comes from the operations and finance folks. Accounting practices don’t have a tidy place for things that exist more between the ears of consumers than within a company’s warehouses and bank accounts.
When I’m teaching the concept of brand building (versus business building), I define the former efforts as bolstering its “intellectual property value.” This is done partially by the folks in marketing and communications, but far more commonly it’s done by such things as innovations in product design, improved distribution and creative pricing.
Apple has certainly done two out of three. Their pricing is hardly “creative.” As a market leader, Apple’s prices are mostly set to convert this elevated brand value into lovely lucre. Lots of it.
Below is a case in point, taken from an Economist article published earlier this month (a pay wall may block you from full content):
The story is about diminishing returns for most cell phone handset makers. Diminishing, that is, except for Apple. Comparing Apple’s handset share of market with the brand’s share of profits is a clear demonstration of how powerful a strong brand can be to maximize profits — even with relatively modest market share.
There was much excitement when Google Analytics unveiled its Events metric. This meant web analysts could store several levels of information about a specific action, and associate that information with a unique web visit and visitor. Before that, if you wanted to — let’s say — record a download, you’d need to create a Virtual Page View.
So why did I recently blog on Jason Falls’ site about creating Virtual Pageviews when recording interest actions, such as “Send to a Printer,” or sharing actions, such as “Email a Friend” and “Share on Facebook”? Why don’t I just create Events?
The answer is simple: If you consider sharing to be a goal of your site, you may want to set it as a Google Analytics (GA) Goal. Events, for all of their power, can’t be set as Goals.
Another action that Events are commonly used for is downloading white papers. Events seem perfect for this because you can set and capture a number of variables, such as title. In other words, you can set the Event Label as the title of the paper. But if you want to measure this as a Goal in GA, you’re out of luck.
Events don’t even “talk” to Goals. Let’s say you want to generate a report showing how many people who downloaded a white paper remained on the site for three or more minutes. The time on site can be set as a GA Goal, but you can’t easily generate a report showing the percentage of downloaders who remained on the site for that time period.
You can do all of this with GA Virtual Pageviews.
My rule of thumb is this: If you need to attach more than one variable to an event (such as identifying various Actions and Labels), and you do not need to correlate these with GA Goals, use Events. For everything else, stick with Virtual Pageviews.
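To make the distinction concrete, here is a minimal sketch using the classic asynchronous ga.js syntax of the era. The virtual path, category, action and label are invented examples, not taken from any real implementation:

```javascript
// Queue for the classic asynchronous Google Analytics (ga.js) tracker.
var _gaq = _gaq || [];

// Option 1: a Virtual Pageview. The path doesn't exist as a real page,
// but it can be used as the URL match of a GA Goal.
_gaq.push(['_trackPageview', '/virtual/downloads/whitepaper-title']);

// Option 2: an Event, with Category, Action and Label. It carries
// more variables, but it cannot be set as a GA Goal.
_gaq.push(['_trackEvent', 'Downloads', 'PDF', 'Whitepaper Title']);
```

Both calls simply queue an array onto _gaq; the ga.js script processes the queue when it loads, which is why the snippet works even before the tracker script arrives.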
Here is that how-to post I was referring to:
If you’ve been following my (roughly) monthly posts on Jason Falls’ blog you know that I’ve taken this tack: On his blog I cover the key concepts of a particular web analytics approach, then provide additional support for that idea here.
A recent example is from two months ago. I posted about the use of Brownie Charts as a way to report Content Interest Index. I posted a parallel piece here on another use of the technique (Using Brownie Charts to Measure Bounce Rates). You could say this blog has become my laboratory: Results of preliminary experiments are described here, while the “real” story is broken on Jason’s blog. Tomorrow will be a little different.
Tomorrow, on Jason’s blog, I’ll be posting on someone else’s innovation. It is a review of an extraordinary book: Hashtag Analytics. I’m a huge fan of its author, Kevin Hillstrom, and over the years I’ve spent way too many hours creating Excel-driven models in order to replicate and fully understand his findings.
I’ll be doing that again, this time in support of Kevin’s approach to monitoring Twitter communities. Check back at this tag (hashtag-analytics) to read updates on my “lab work.” I’ll be reporting over the next several weeks.
You may want to check Kevin’s blog as well — especially later this week, when Kevin reports on the future vitality of the hashtag community #measure. He posted about it last week. Now he plans to theoretically whack an active member. Here’s an excerpt from his post, where he invites readers to suggest whom to “remove”:
In every e-commerce company, somebody is responsible for forecasting sales for the next twelve months, by day. So it makes logical sense that any community manager would want to know what the future of his/her community is, right? This is something you don’t find in any of the popular Twitter-based analytics tools. This is my focus. This is what I love doing, it’s completely actionable, and it’s an area of analysis not being explored!
Next week [the week starting January 24 — that’s today!], we’ll do something neat — we’ll remove one important user from the community, and we’ll see if the absence of the individual harms or helps the future trajectory of the community. If you are an active participant in the #measure community, please send me a user_id that you’d like to see removed in the forecast … I’ll run an example for the individual who gets the most votes.
And in two weeks, we’ll compare the #measure community to the #analytics community … competing communities doing similar work … which community is forecast to have a stronger future?
It’s a fun stunt / modeling experiment that has real-world implications. It should serve as a proof of sorts of the predictive power of his Hashtag Digital Profiles and the statistical work behind them. More relevant to online community managers, it should illustrate the tremendous importance of showing your participants “love,” lest they never return.
I will be applying my own Hashtag Analytics to a different online group — one that has the advantage of weekly meetings. It’s a fairly new group, so the rules may not fully apply (Does an acorn sprout follow the same natural laws of growth as a full-grown tree?). To ensure I don’t jinx my test or influence the community — in a far more direct way than Heisenberg was referring to — its identity will remain unknown until I’ve gathered and analyzed a critical mass of data.
Do stop back.
Here are two related links I didn’t have yesterday. The first is Kevin’s post where he removes that member to the #measure Hashtag community. The second is my review of his book today on Jason Falls’ blog: