
Quinnipiac Assignment 10 – ICM 527 – Program Evaluation

Let’s do a program evaluation. This week’s readings were about evaluating a strategic plan and program.

Key Concepts

As Smith says on page 331, “Program evaluation is the systematic measurement of the outcomes of a project, program or campaign based on the extent to which stated objectives are achieved.”

With a plan in place, and clear, measurable objectives included in it, the next question is whether anything is working. Answering that means figuring out how to measure results and deciding what counts as ‘good’, or at least adequate.

In Module 8, we studied Cans Get You Cooking, where the idea was to increase awareness of cans’ use in cooking via cooking shows and blogs. However, another objective was increased sales (after all, why bother with such a campaign if sales don’t increase?), and in that respect the plan was unsuccessful. According to Companies and Markets, purchases of canned goods decline as the economy improves.

When consumers have more discretionary income to spend on foodstuffs, they purchase fewer canned goods – no matter how well-crafted a campaign is. There was increased awareness, yes, and under that criterion, the campaign worked. But under the criterion of increased sales, it did not.

The Case of the Traveling Goalposts

It seemed a little as if the goalposts were moved in that campaign: once increased sales became a less attainable goal, awareness, a far more readily attainable one, was presented as the premise behind the campaign.

These moved goalposts illustrate the difference between what Smith calls awareness and action objectives (pages 332 – 335), with the third type of objective, acceptance, straddling the line between the other two. For the Cans Get You Cooking campaign, it seems the attainment of the awareness objective was the only cause for celebration.

Smith makes a compelling case on page 334 that creativity, effort, and cost don’t count as measures of effectiveness. All of those facets of a campaign sit on the organization’s side, whereas measures of awareness, acceptance, and action are effects felt (and acted upon) by publics. By that definition, creativity and its kin should have nothing to do with judging a campaign’s effectiveness.

The Eight-Step AMEC Social Media Measurement Process

Jeffrey (page 4) outlines the following process (a rough sketch of how the steps might fit together follows the list): “The Eight-Step Social Media Measurement Process

  1. Identify organizational and departmental goals.
  2. Research stakeholders for each and prioritize.
  3. Set specific objectives for each prioritized stakeholder group.
  4. Set social media Key Performance Indicators (KPIs) against each stakeholder objective.
  5. Choose tools and benchmark (using the AMEC Matrix).
    • Public Relations Activity
    • Intermediary Effects
    • Target Audience Effects
  6. Analyze the results and compare to costs.
  7. Present to management.
  8. Measure continuously and improve performance.”
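
As a rough sketch only (the class names, stakeholder, and figures below are my own invented examples, not Jeffrey’s), this is one way steps 2 through 8 might be kept together, with each KPI tied to a stakeholder objective, benchmarked against an AMEC matrix row, and compared to cost:

```python
# Illustrative sketch of Jeffrey's eight-step process as a simple data structure.
# Every name and number here is a made-up example, not from the AMEC paper.
from dataclasses import dataclass, field

@dataclass
class KPI:
    name: str          # e.g. "survey awareness score"
    matrix_row: str    # "Public Relations Activity", "Intermediary Effects", or "Target Audience Effects"
    baseline: float    # benchmark value before the campaign (step 5)
    target: float      # value the stakeholder objective calls for (step 4)
    actual: float = 0.0  # measured continuously (step 8)

    def met(self) -> bool:
        return self.actual >= self.target

@dataclass
class StakeholderObjective:
    stakeholder: str   # prioritized stakeholder group (step 2)
    objective: str     # specific objective for that group (step 3)
    kpis: list[KPI] = field(default_factory=list)
    cost: float = 0.0  # spend attributed to this objective (step 6)

    def summary(self) -> str:
        hit = sum(k.met() for k in self.kpis)
        return f"{self.stakeholder}: {hit}/{len(self.kpis)} KPIs met at ${self.cost:,.0f}"

# Hypothetical example, for reporting to management (step 7).
donors = StakeholderObjective(
    stakeholder="Current donors",
    objective="Raise recognition of the ILSC mission by 5% in Q2 2016",
    kpis=[KPI("survey awareness score", "Target Audience Effects",
              baseline=40, target=42, actual=43)],
    cost=1500,
)
print(donors.summary())
```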
[Image: Avinash Kaushik, author of Web Analytics 2.0]

What’s it Like?

This process compares favorably to methodologies learned in ICM 524 – Social Media Analytics, where we read Web Analytics 2.0 by Avinash Kaushik. On pages 29 – 32, Kaushik outlines his Step 3, “Identifying Your Web Analytics Soul Mate (How to Run an Effective Tool Pilot)” (average time: two years). He suggests evaluating the following (a small scoring sketch follows the two lists):

  • Usability
  • Functionality
  • Technical
  • Response
  • Total cost of ownership

Also –

  • Get enough time
  • Be fair
  • Ask about data sampling
  • Segment like crazy
  • Ask about search analytics
  • Test site content grouping
  • Bring on the interns (or the VPs!)
  • Test support quality
  • Reconcile the numbers (they won’t add up, but it’s fun!)
  • Check the daily/normal stuff
  • Sweat the TCO (total cost of ownership)
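
As a minimal sketch of my own (not Kaushik’s), the pilot findings could be rolled into a weighted score per candidate vendor. The criteria weights and the 0–10 scores below are purely illustrative assumptions:

```python
# Hypothetical weighted scoring of analytics vendors against Kaushik's five
# evaluation areas. Weights and scores are illustrative, not prescribed.
CRITERIA_WEIGHTS = {
    "usability": 0.25,
    "functionality": 0.25,
    "technical": 0.20,
    "response": 0.15,   # support quality, as tested during the pilot
    "tco": 0.15,        # total cost of ownership; score lower-cost tools higher
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine 0-10 scores per criterion into one comparable number."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

# Made-up pilot results for two candidate tools.
candidates = {
    "Vendor A": {"usability": 8, "functionality": 7, "technical": 6, "response": 9, "tco": 5},
    "Vendor B": {"usability": 6, "functionality": 9, "technical": 8, "response": 5, "tco": 7},
}

for name, scores in sorted(candidates.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```

The weights themselves are a judgment call; the point is that once they are agreed beforehand, comparing vendors becomes arithmetic rather than argument.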

What Kaushik says and what Jeffrey says are similar, because measurement is an objective activity. This is why objectives need to be clear and measurable: a five percent increase is measurable; ‘better exposure’ in general is not.

For both authors, the idea is to set specific objectives and then act on them, whether those objectives are to launch a strategic campaign or to select a web analytics vendor. Then, once the vendor is chosen, get the yardstick in place and use it.

Kaushik further reminds us that, while our intention may be to select a vendor and essentially ‘marry’ it, we still need to be evaluating the evaluator. If it’s not performing up to our reasonable specifications, then it’s time for vendor divorce court.

Key Performance Indicators (KPIs)

On page 7, Jeffrey quotes Shel Holtz, principal of Holtz Communication + Technology (www.holtz.com), who defined a KPI as a “quantifiable measurement, agreed to beforehand, that reflect the critical success factors of one’s effort.”

This puts KPIs on a par with what we have been calling objectives. Wanting to ‘get better’ is one thing. But it’s vague and subject to weaseling.

Wanting to improve recognition of the Institute for Life Sciences Collaboration (ILSC) and its mission by 5%, as measured by surveys taken during the second quarter of 2016, is a measurable key performance indicator. Anyone who can read numbers can then determine whether the KPI has been met.
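
As a small worked example (the survey figures below are invented, and I am reading the 5% as a relative lift over the baseline), the pass/fail test for that KPI is simple arithmetic rather than judgment:

```python
# Hypothetical survey results for the recognition KPI described above.
baseline_recognition = 0.32   # share of respondents recognizing the ILSC before the campaign
q2_2016_recognition = 0.35    # share measured in Q2 2016 surveys
target_lift = 0.05            # the 5% improvement the KPI calls for (read as relative lift)

lift = (q2_2016_recognition - baseline_recognition) / baseline_recognition
print(f"Recognition lift: {lift:.1%} (target {target_lift:.0%}) -> "
      f"{'KPI met' if lift >= target_lift else 'KPI not met'}")
```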

Program Evaluation Re: Applicability to the ILSC

Beyond recognition measurements, there are any number of KPIs that could be measured. These include the number of schools served by the Small World Initiative by a certain date, or increasing donations by a particular amount, subject to a clear deadline.

The ILSC

Currently, the ILSC website seems to be something that was just thrown together, without any sense of how to handle technological and design changes, or scalability. Keeping measurements out of the mix means the website can be launched and then forgotten about, and it seems a lot like that is exactly what happened.

However, a website cannot be a flash in the pan, as that can make publics feel the organization behind it is fly-by-night. Particularly when asking for money, an organization needs to project an impression of trustworthiness and solidity.

Adding key performance indicators and measurements means there needs to be a sea change in how the ILSC views the website. It isn’t something to throw together in an afternoon, hand off to a temp hired for a few weeks, and then never look at again. Instead, it needs to be an integral part of the organization.

Program Evaluation: Takeaways

While the organization’s work is (generally) offline, there still needs to be room for the website in the minds of the organization’s board members. One facet of their thinking has to include how best to use the website and social media to better communicate the ILSC’s mission and goals, and to communicate with its publics.

The website has got to have a place in those conversations, and it currently does not. That has to change.
