Reputation Management in a Social Media World and On Your Site
Hey, hey! Time to talk reputation management and social media with Katie Delahaye Paine (KDPaine & Partners) and Steve Bernstein (PayPal). Let’s do it.
[Or not. We’re having some technical difficulties so the session is starting a bit late. Amuse yourselves. I’m cruising Facebook.
Okay, it looks like we’re ready. Yay!]
For years the big thing was counting eyeballs. Then it was hits. Now it’s engagement. The biggest piece of this is to understand that measurement and engagement mean different things to different people. If you’re trying to sell something, engagement means ‘did you move someone down the conversion path?’ If you’re just trying to get some influence out there, then it’s comments and links.
How do you measure the impact of all those various communication efforts?
Six Steps to the Perfect Measurement System
- Define your goals
- Understand your audience and what motivates them
- Define the metrics
- Determine what your benchmark is
- Pick a tool and undertake the research
- Analyze results and glean insight, take action, measure again.
A proposed engagement index:
Output: The activity
Outtake: What do people believe about you
Outcome: What do you want them to accomplish
It’s the combination of those three things that define engagement. You can’t just do one without the others.
Katie says they’re also trying to measure the impact of social media networks. You can see your share of discussion on Technorati and YouTube. But what does this all mean? Is there a connection between YouTube content on the Presidential candidates and their share of votes? Does it impact the outcome?
Katie analyzed the traffic patterns for Obama and Ron Paul and found that they dominated YouTube. Then she analyzed the number of videos for each on YouTube and the extent to which they were commented on, viewed, etc. She found a direct correlation between the activity on these videos and the voting patterns. [Yes. I am so sure that YouTube is directly influencing people’s voting. What?]
Components of an Engagement Index
- Involvement: Web site visits, time spent, page views
- Interaction: Comments, reviews
- Intimacy: Sentiment, positioning
- Influence: Likelihood to recommend, brand affinity, forwards, links.
You also have to test relationships, taking into account control mutuality, trust, satisfaction, commitment, exchange, and command.
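Katie names the four components but not a formula, so here is a minimal sketch of how they might roll up into one number. The weights and the 0–1 component scales are invented for illustration, not anything from the talk.

```python
# Illustrative only: combine the four "I" components (each normalized to 0-1)
# into a single 0-100 engagement index. Weights are made-up assumptions.
def engagement_index(involvement, interaction, intimacy, influence,
                     weights=(0.2, 0.3, 0.2, 0.3)):
    """Weighted sum of normalized component scores, scaled to 0-100."""
    components = (involvement, interaction, intimacy, influence)
    return 100 * sum(w * c for w, c in zip(weights, components))

# Example: strong interaction and influence, middling involvement/intimacy.
score = engagement_index(0.4, 0.8, 0.5, 0.7)  # 63.0
```

The point of the weighted sum matches the prose above: no single component defines engagement on its own.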
Katie throws out some stats:
Engagement in external blogs = 13 comments
Highly engaged admissions blogs = 35 comments per post
Good momentum on social bookmarking sites = 1 submitted item every other day
Average positive sentiment = 50 percent; negative = 9 percent
Most of the content shared on Facebook is video. Traditional news media plays a much bigger role in sharing information than people think. Thirty-eight percent of people get information from sites like Flickr and YouTube.
If you think you’re going to put a video out there and control it, forget it. 86 percent of watched videos come from individuals, not corporations. If you’re a corporation trying to release video, hide that you’re a corporation. Otherwise it will be rejected. [It will also be rejected when people find out you were trying to hide your identity.]
- The reality is if you want to be popular, use video and don’t be corporate
- Traditional media is much more important than you think
- If you want to reach incoming freshmen, you have between March and August to get your message out
- In terms of tonality, neutral is the norm.
Engaging allows you to join in the conversation and correct bloggers who are saying bad stuff.
ROI: Tying it all back to the bottom line
- Define the “R” – what’s your mission
- Define how you contribute to that mission
- Define the “I” – what’s the investment
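The talk stops at the define-R / define-I framing, but the arithmetic behind it is the standard ROI ratio. A toy sketch, with entirely made-up numbers:

```python
# ROI = (return - investment) / investment. The dollar figures below are
# invented for illustration, not from the session.
def roi(return_value, investment):
    """Return on investment as a fraction (0.3 means 30 percent)."""
    return (return_value - investment) / investment

# e.g. a $50k effort credited with $65k of value toward the mission
result = roi(65_000, 50_000)  # 0.3, i.e. 30 percent
```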
Steve is up next. He’s going to talk about quantifying the qualitative. My fingers are crying sweet baby emo tears. Why all the long words?
Why did PayPal start getting site feedback?
They had the “what” but they needed the “why”. They could see in their data that there were people going down the same path a few times. Why were they doing that?
If you go to their site and click on “site feedback”, a comment card will pop up. The two most important parts of the card are “Would you recommend this site to a friend?” and the abbreviated net promoter score. Whatever that is. Basically they want to force people to make a decision. They’re looking for trends over time.
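Steve doesn’t spell out what the “abbreviated” version is, but for context, the standard Net Promoter calculation works like this (sample ratings invented):

```python
# Standard Net Promoter Score: on a 0-10 "would you recommend?" scale,
# 9-10 are promoters, 0-6 are detractors, and NPS is the percentage-point
# gap between them. Sample ratings below are made up for illustration.
def net_promoter_score(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

sample = [10, 9, 9, 8, 7, 4]
nps = net_promoter_score(sample)  # about 33.3
```

Forcing every response into promoter/passive/detractor buckets is exactly the “force people to make a decision” point above, and tracking the score over time gives the trend they’re after.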
Maximizing the Value
- Data is worse than pointless if you don’t use it
- In an ideal world, each comment would be read, and actionable comments would always be acted upon or at least considered for action.
- It’s more than just labor-intensive; it’s just plain hard.
- Quantifying the Qualitative: Qualitative research is thought of as focus groups and interviews. They’re free-floating and are about discovery. Comment boxes are kind of qualitative because people can write whatever they want, but you can quantify it because you’re presenting the exact same experience to everyone.
Comment Categorization: Categorize comments by themes and route them to the appropriate product management teams. There are many commercial categorization tools available to help you do this. Comments typically categorize well because they’re one-dimensional.
He takes every comment and normalizes it using Porter stemming. You want to clean it up by taking out stop words and putting everything in the same tense. Then they use trigrams – three-word phrases – to move through the content and count all the three-word phrases that appear. It gives you a histogram of each comment and lets you “cluster them into galaxies”. You can find commonalities.
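The pipeline Steve describes can be sketched in a few lines. This is a simplified stand-in, not PayPal’s implementation: the stop-word list and sample comments are invented, and real Porter stemming (e.g. NLTK’s `PorterStemmer`) is replaced with a crude suffix stripper so the example is self-contained.

```python
# Sketch of the normalize -> strip stop words -> count trigrams pipeline.
import re
from collections import Counter

# Tiny invented stop-word list; a real pipeline would use a fuller one.
STOP_WORDS = {"the", "a", "an", "to", "is", "it", "of", "and", "i", "my"}

def crude_stem(word):
    """Crude stand-in for Porter stemming: strip a few common suffixes."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def trigrams(comment):
    """Normalize a comment and return its three-word phrases."""
    words = [crude_stem(w) for w in re.findall(r"[a-z']+", comment.lower())
             if w not in STOP_WORDS]
    return [" ".join(words[i:i + 3]) for i in range(len(words) - 2)]

# Two invented comments that normalize to the same phrases.
comments = [
    "I keep failing to reset my password",
    "It keeps failing to reset the password",
]
histogram = Counter(t for c in comments for t in trigrams(c))
# Both comments reduce to "keep fail reset" / "fail reset password",
# so the histogram surfaces them as one cluster.
```

Counting shared trigrams across thousands of comments is what lets related complaints “cluster into galaxies” the way Steve describes.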
And that’s it. Wow, that was a somewhat confusing session that failed to deliver. Bummer.