Social Media Marketing by Liana Evans was a book that I might have read a little too late in the semester. In all fairness, I read this book toward the end of my first social media class at Quinnipiac (ICM 522).
Hence it felt like I already knew a lot of what was being written, but that was likely more a function of timing than anything else.
Been There, Done That
The book is interesting. However, I had just read a stack of other works covering very similar strategies and ideas, so it ended up being perhaps one book too many (and it was an optional read, anyway). Other works seemed to have said it better. And these days, books simply do not get published fast enough to take proper advantage of trends and new insights; blogs, in general (although not always!), end up more current and relevant.
Possibly the best takeaway I got from the book was Evans’s discussion of online communities, particularly in Chapter 33 – You Get What You Give. On page 255, she writes –
You need to invest your resources
Time to research where the conversation is
Time and resources to develop a strategy
Time and staff resources to engage community members
Time to listen to what they are saying, in the communities
Time and resources to measure successes and failures
Giving valuable content
It is similar to a bank account
Don’t bribe the community
Rewards come in all fashions
Research who your audience is
Give your audience something valuable and/or exclusive
Don’t expect you’ll know everything
Listen to what your audience says
Admit when you are wrong
Thank your community
Finally, much like we’ve been telling people for years on Able2know – listen before you speak!
But no matter. Groundswell is still a terrific work by Josh Bernoff and Charlene Li, and it remains more than a little relevant.
And in fact, I think I understand it better than I ever have.
Changing the Way You Think about Online Marketing for Good
For Li and Bernoff, the online world is a rich and diversified community, and under that large umbrella sit several smaller communities. But unlike Matryoshka dolls (Russian nesting dolls), these communities overlap enormously.
Above all, they put forward the idea of a system called POST.
People – who are your potential buyers? Who are your readers? And who makes up your audience?
Objectives – what do you expect to get out of going online, continuing online, or going in a different direction online?
Strategies – how will you implement your ideas? What comes first, and what must wait?
Technologies – which platforms will you use? How will you use these differently as your strategy begins to click into place?
So the last time I read Groundswell, I suspect that I didn’t really understand POST.
And now I know never to start a social media campaign without it. So thanks to Charlene Li and Josh Bernoff! This work is a classic for a damned fine reason. It really is that good. You need this book in your social media library.
ICM 502 – Information Design Common to Biblical Texts
Biblical texts are fascinating, as are their designs.
What we today might consider problematic designs were far more typical centuries ago, particularly in holy texts. That’s not so surprising when you consider that bibles made up a far larger share of the books in circulation before the printing press (scribes knew their buyer persona) and that, even after it was invented, there was still a great deal of illiteracy.
These two factors created a condition whereby prayer books and the like, whether hand-lettered or printed, carried a great deal of illustration to help the illiterate faithful follow along.
Modern Design Issues
Design requirements are different now. For one thing, only maybe 14 percent of the US population is illiterate. While that figure is higher than most of us would like, it’s a lot better than it was after the Roman Empire fell, when literacy extended to no more than perhaps 30 – 40 percent of the populace.
Because we don’t need so many pictures to help us along, Rebecca Hagen and Kim Golombisky, in White Space is Not Your Enemy, offer a list of layout sins on pages 31 – 42:
13 Amateur Layout Errors
Things that blink. Incessantly.
Naked photos (i.e., they need a border).
Bulky borders and boxes (use hairlines).
Corners and clutter – don’t fill the corners! Instead, group similar visual information together.
Trapped negative space – keep the negative space at the edges.
Tacky type emphasis – reversing, stroking, using all caps, and underlining.
Reversing – white copy on a dark background
Stroking – outline characters
Bad bullets – use real bullets, not just asterisks or emoticons, and align them.
Widows and orphans.
Justified rivers – don’t use fully justified (both sides) blocks of type. Help eliminate this by increasing column width or reducing font point size, or both. Or just don’t do it!
Biblical Text Designs
When it comes to biblical texts, violations of sins 4, 7, 8, and 9 seem to be common. I chose images from outside the readings, as these designs seem to be ubiquitous in older religious texts.
In this image from a Gutenberg bible (Genesis 1:1) currently housed at the Bayerische Staatsbibliothek, a lot of these sins come to the fore.
The border, though lovely and possibly unique, is also bulky (#4) and busy (#9), and clutters the corner space (#7). Negative space is trapped within the border (#8).
Furthermore, the lack of paragraph or verse breaks makes the columns look uniform but uninviting to read. The text feels unapproachable and forbidding.
There’s another bulky border in this image from a Koran currently housed in the David Collection at the Park Museer NE in Copenhagen, Denmark. This one has a lot of white space, with no visible spacing between sentences except for the gold circles. The border feels imperfectly drawn, although not freehand; perhaps it was traced. At least there’s no corner clutter, and the border is relatively simple and not busy, but the design still commits the sin of bulky borders (#4). The large amount of margin space also appears strange. Because this is a holy text, it probably wasn’t meant for note-taking or annotation, so why is so much of the space being wasted? Contrast that with what’s inside the borders, where the text is cramped and takes up nearly all of the real estate.
What the Gutenberg bible seems to be saying with its design is that illiterate parishioners can follow along, as can young children. The Koran, in contrast, is telling its reader to focus on reading and not to view any graven images.
What are the differences between managing a media project (e.g., making a film) and managing another sort of endeavor? The defense of a legal case shows both differences and similarities.
At Vjeko.com, writer Vjekoslav Babic outlined the secrets of Hollywood project success in the context of a review of James Persse’s book on that subject. Babic was comparing media production to IT development, and found there were:
“8 important similarities between movies and IT:
Both deal in intangible product development.
Both are shaped to directly address a deadline-oriented business need.
Both require significant investments.
Both are built against a specification open to change.
Both rely on specialized production protocols and technologies.
Both require the integration and collaboration of specialized teams.
Both require careful analysis, design, execution, and integration.
Both must be thoughtfully delivered to their target audiences.”
Babic further noted, “…it distills down to a simple framework of five phases: development, preproduction, production, post-production and distribution.”
“…development roughly corresponds to project initiation, with pre-production mostly having to do with project planning. Other three phases could be mapped to project execution and monitoring.”
“In first three phases, most revolves around the script, the Hollywood’s equivalent of requirements document, and although Persse found out that the script also changes, it tends to be more of a holy scripture than requirements are in IT. For Hollywood, it is much easier to set Big Requirements Up Front….”
At Creative Skill Set, the definition of a Production Manager includes, “Production Managers are in charge of the ‘below-the-line’ budget. This covers costs relating to the crew and the practicalities of running a production.”
Things differ in the legal field. Let’s look at a moderately sized insurance case, an automobile accident. It’s a left-turn case, where the client (the defense) made a left turn in front of the plaintiff’s vehicle, causing a collision. Both parties are at fault, although the percentages differ. For the sake of argument, this will be a jurisdiction where comparative negligence is the law, i.e., there can still be some question of apportionment, and plaintiff fault isn’t a sure-fire loser as it is in a contributory negligence jurisdiction.
The project manager is the attorney handling the claim. The purpose of the project needs little clarification: it is to either win the case outright or limit the amount paid by the insurance carrier (and the client, too, if damages exceed the amount of the policy). Similarly, the purpose of a film project is to make a movie. It’s not to make an artistic statement or even to make a profit, although those are favorable collateral consequences. Similarly, the purpose of defending a lawsuit is generally not to make law although that can also be a result.
For filmmaking, the goals and objectives are more on the profit side. Movies are made in order to meet financial projections, often to satisfy shareholders. In the law, the goals and objectives are to save the insurance company money, but also to garner experience for more junior attorneys and to build favorable reputations for lawyers at any level of expertise.
Scope for films is to make the one picture, or the three pictures in a trilogy, or whatever the contract says. For legal cases, it’s just the one case, and it often does not include appeals if the matter gets that far, mainly because appellate work tends to be more specialized. Scope usually does include impleader (adding more defendants) for cross-claims or third-party claims, which may be other sources of settlement or verdict funds. In the automobile accident fact pattern, if the traffic lights were mistimed or not working at all, the defense might wish to implead the town or its maintenance company. This might be seen as scope creep; the prudent law firm will allow the impleader if it is likely to provide a substantial benefit. (Note that the ABA Model Rules of Professional Conduct, which have been adopted by many American jurisdictions, require a zealous defense; hence what in other projects would be seen as scope creep may very well be necessary under ethical considerations.)
Budgeting is strict for filmmakers. For attorneys, a balance must be struck between keeping costs low and mounting an effective and zealous defense. Costs can be lowered by having paralegals perform research and more junior lawyers attend preliminary hearings and motions. Budgeting can have a time dimension for lawyers, too, as a carrier might want an older case settled during the next quarter or might push for settlement if billed hours exceed a certain amount. Unlike in many other disciplines, budgeting in the law may be desired but not possible under ethical considerations.
The expected benefits for both disciplines center on reputation. For a film production company, a completed project can signal to future development partners (actors, screenwriters, etc.) that the company can make an award-winning piece of art, or churn out a cheap film quickly to take advantage of a hot news story. For a law firm, a favorably concluded case can demonstrate a thorough understanding of the law or an ability to settle for less.
Success for both disciplines looks a lot like completion. A finished film, under budget, is the desired end result for a production manager. For lawyers, success is a closed case, and not necessarily a win. And then, for both, it’s time to move on to another project.
One standard definition of information runs: “Data that is (1) accurate and timely, (2) specific and organized for a purpose, (3) presented within a context that gives it meaning and relevance, and (4) can lead to an increase in understanding and decrease in uncertainty.”
But is that all there is?
The first part of the definition – accuracy and timeliness – seems to be more of a definition of facts, which are almost the quanta of information. Facts are bits of information, like the capital of Uruguay (Montevideo) or the proverbial price of tea in China (varies).
The second piece – specificity and organization – takes information away from the realm of random trivia and gives it a reason for being. Uruguay’s capital makes sense as a piece of information if you’re flying there or doing business in the country. It could be a fact which is specific and organized for something like winning Jeopardy! It’s not organized for the purpose of passing an Anatomy examination, however.
The third aspect – meaningful and relevant context – further removes it from random factoids and pop culture references. If the air-speed velocity of an unladen swallow will get you over a necessary bridge, then it is utterly meaningful and relevant data. If it doesn’t, then you’re left with just so much cerebral flotsam and jetsam. As Frode Hegland says, “Information cannot exist without context.”
The fourth bit – understanding increases while uncertainty decreases – further emphasizes the idea of meaningfulness.
Information tells you something you don’t already know, and is, to crib from Facebook, relevant to your interests. It’s accurate and timely, too. It is useful for you.
What is the role of design in how we receive information?
On page 6 of White Space is Not Your Enemy, Kim Golombisky and Rebecca Hagen say, “Good graphic design does four things. It captures attention, controls the eye’s movement across the page or screen, conveys information and evokes emotion.”
Essentially, what Golombisky and Hagen are saying is that design is as much a part of information conveyance as body language is a part of verbal and visual communication.
A great example is the Twilight series book covers. Love or hate the stories (in the interest of full disclosure, I worked for the publisher, Hachette Book Group, when Breaking Dawn was first released), the covers are a festival of symbolism.
Twilight, the first in the series, sports a cover which easily symbolizes Biblical temptation, just as the title refers to an in-between time of day – analogous to the in-between stage of life the main character, Bella, occupies. Furthermore, since the hands don’t quite match, they might be intended to reference the couple at the center of the story.

The second, New Moon, with its bicolor flower and dropped petal (which looks like a fresh drop of blood), shows a conflict, just as the title refers to a time with no reflected sunlight, when the stars are at their brightest.

For the third book, Eclipse, the sun (or perhaps the moon) is utterly blotted out in the title, and the cover image references a fragile and nearly broken bond. It is the darkest of the covers.

And in the end, with Breaking Dawn (a book which may not have been part of the original plan for the series), the cover shows a white chess queen leaving a red pawn in its wake – chess being a classic game of strategy. The title is a hopeful one, symbolizing a new beginning. Yet the queen is not quite triumphant: the piece is safe from capture, but it is not shown in a position where it can take the pawn.
The covers also move from human contact – or at least a human at center stage – to another living (possibly dying) thing, and then to two images of nonliving possessions. Even if a curious book buyer were unaware of the plots in the series, they can readily see from the covers that the story progresses from temptation to conflict and possibly even violence, with the upshot a potential triumph which does not yet seem to be realized. The title font is mysterious, and the ‘l’ in Twilight looks almost like a stake in a vampire’s heart.
Through good design, six title words convey a wealth of meaning. Design can do this.
Quinnipiac Assignment 14 – ICM 527 – Real World versus Academics and RPIE
This week, the focus shifted to the practical application of the academic readings. The emphasis was placed squarely on the real world.
Key Concepts of Strategic Planning – Industry versus Academic Readings
Perhaps the most industry-related reading of the week was Ashkenas, Four Tips for Better Strategic Planning. Unlike the RPIE (research, planning, implementation, and evaluation) approach covered in the academic readings, Ashkenas leads with a kind of offshoot of implementation and evaluation, insisting on field testing first to evaluate assumptions. While the academic readings, such as Smith, seem to save the evaluative process for either the end of a campaign or the middle of one (say, after a major milestone or after an iteration), Ashkenas pushes for a form of evaluation to happen even before a campaign gets off the ground. As Smith notes on page 331, “Program evaluation is the systematic measurement of the outcomes of a project, program or campaign based on the extent to which stated objectives are achieved.” Outcomes, by definition, come at or near the end.
Smith goes on to say (page 331), “The key to creating any program evaluation is to establish appropriate criteria for judging what is effective. This research plan considers several issues: the criteria that should be used to gauge success, timing of the evaluation and specific ways to measure each of the levels of objectives (awareness, acceptance and action). It may prescribe the various evaluation tools, and it also should indicate how the evaluation would be used.” Yet the timing in Smith seems clear – the campaign has to be complete, near completion, or at least running on all cylinders before it’s time for an evaluation. Not so with Ashkenas, who gets evaluation out of the way early in order to prevent the creation of a campaign based on faulty premises.
Another Ashkenas tip is to ‘banish fuzzy language’, i.e., to torch weasel words like ‘leverage’ and ‘synergy.’ For Ashkenas, a campaign plan needs to be straightforward, laying out clear and unambiguous goals and relating them directly to an organization’s stated mission. Much like Anderson, Hadley, Rockland & Weiner’s objectives – e.g., “(a) clear and shared sense of purpose distills program tactics and focuses financial and human resources on those areas on which they have the greatest impact” (page 5) – Ashkenas seeks to narrowly focus an organization’s campaigns. The plan is not much of a plan if its language is impenetrable. HootSuite, on page 2 of their Guide for Social Media Strategy, voices a similar call for clarity: “(a)ll business planning should start with defining clear goals, and social media is no exception. One of the biggest reasons why social media strategies fail is because goals aren’t aligned with core business values. For long term success on social media, choose goals based on traffic, leads, and sales.”
Wilkinson’s Four Steps – Relating them to the Academic Readings
Turning to Wilkinson, he outlines four major steps in what he calls a Driver’s Model for strategic planning.
The first step is to perform a situational assessment, which is a lot like a SWOT (strengths, weaknesses, opportunities, and threats) analysis. As defined by Williams, “(a)t its most functional level a SWOT analysis will help you obtain information and assess a situation.” Wilkinson adds some specifics to this basic form of preliminary analysis, providing fairly universal questions for a strategic planner to use when assessing an organization’s customers, competitors, industry trends, performance trends, and more. Wilkinson provides more of a roadmap in this area than Williams does.
The second step is to investigate an organization’s strategic direction. Here, Wilkinson drops the specifics and turns to broader organizational data such as vision statements, mission statements, goals, and objectives. These are far more organization- and industry-specific, so a generalized statement about them would not be of much help to a strategic planner. Smith, on page 41, defines a vision statement as looking, “to the future. It is a brief strategic description of what the organization aspires to become.” And a values statement, according to Smith (page 42), “is a set of beliefs that drive the organization and provide a framework for its decisions.” Hence, again, Wilkinson’s methodology is congruent with the academic readings from this semester.
The third step is implementation planning, whereby Wilkinson performs a somewhat modified PEST analysis. While Wilkinson does not look at all Political, Economic, Social, and Technological factors, the implementation plan he touts is similar. He urges the strategic planner to investigate the barriers to achieving an organization’s vision. He also suggests developing key conditions (between two and seven) that must be met in order to consider a campaign successful. The roadmap combines the two, as the strategies “must drive achievement of the strategic direction by controlling the critical success factors and overcoming the barriers.” The implication, also, is that a conceived strategy which does not address either side of the implementation plan needs to be either changed or jettisoned from the plan.
The fourth step is the monitoring of progress, much as in the Smith and Anderson et al. readings, and in Dr. Place’s paper. Place notes that an ethical element must be included in evaluating campaign plans, stating on page 129, “(t)he role of ethics in public relations evaluation, according to participants, is to guide practitioners as they conduct truthful, effective evaluation and weigh an organization’s needs with its publics’ needs. Ethics’ role appears to be most salient during the reflection or reporting phases of program evaluation.” For Wilkinson and the other readings, evaluation is a crucial step. Otherwise, how is an organization to truly know a campaign’s effectiveness? And, more importantly, evaluation is the only real way for an organization to intelligently determine whether a campaign should be budgeted for another year, or whether it should get the axe.
Relating RPIE to Business, Social Media, and Communications Industries
RPIE (research, planning, implementation, and evaluation) relates directly to all industries, it seems.
In the business world, particularly in light of Sarbanes-Oxley, which requires financial accountability and transparency, clear and well-defined research and evaluation are paramount requirements. Corporations cannot simply throw money at a problem; they have to have a plan and that plan has to demonstrate a reasonable chance of success. While not everything works, a campaign plan with germane and well thought out research has a far better chance of success than one where the dice are rolled. Sarbanes-Oxley, it seems, would require at least the R, P, and E portions of RPIE, and probably implementation as well.
In the case of social media industries, research and planning can make the difference between staying in business and not. For a social media organization such as Facebook or Twitter, failing to understand its buyer personae or how its platform is used is a recipe for going under. In a way, this seems to be what happened to Myspace. Originally heavily social and popular, Myspace missed the boat: it did not realize that its public was growing out of usernames and becoming interested in real-name-style authenticity, and it did not plan beyond its own website. According to Jay Baer of Convince and Convert, Myspace also fell down because it never made itself particularly business-friendly. Planning and evaluation could have helped Myspace keep Facebook from eating its lunch.
In the communications field, implementation is perhaps the most vital of the four pieces. Communications is obviously all about messaging, as is strategic planning, and these organizations are often adjudged – fairly or unfairly – based upon image, message, and look and feel. In communications organizations, research and planning have to cross the implementation finish line. Understanding what a communications organization’s publics are interested in, and planning to give them what they want, is not enough if the organization (such as an online newspaper) falls short on implementation.
In the real world, RPIE does not just apply to strategic planning. It can apply to nearly every aspect of a business or nonprofit organization. So much of what we see online was tossed out there with little thought to how it would look in a year or in twenty. So little of it is altered or updated when new information or technology mandate that changes be made. An organization of any type can set itself apart by following RPIE principles in all aspects of its existence.
The CSPI mainly uses proactive communications strategies in an effort to present a positive image to its publics. As Smith writes on pages 130 – 140, one proactive strategy is newsworthy communications. And on page 113, Smith indicates that a proactive strategy enables an organization to launch a communication program under the conditions and according to the timelines that best fit the organization’s interests. This includes generating publicity, presenting newsworthy information, and developing a transparent communications process.
For the CSPI, one significant communications strategy is to showcase their past accomplishments. Making a case for future donations, the Center outlines how it has been fighting for consumers since 1971 (although the listed accomplishments only date back to 1973). Current communications are made on the organization’s blog, FoodDay.
Tactics include an online community, which visitors to the site are urged to join every time they refresh the page. However, it doesn’t seem to be a forum (which is what one would expect). Rather, the community might be the Facebook page or the Nutrition Action Health Letter, which is the organization’s newsletter. It’s hard to say; the site is unclear about what this ‘community’ is supposed to entail, and what sort of power its members might wield, if any. The tactics seem to encompass organizational media as outlined by Smith on page 229. A community should be able to engage in a degree of give and take, but the Facebook page doesn’t show much in the way of responses to posters, and the newsletter is an even more one-sided method of organizational communication.
If I were strategic counsel for the organization, the community would be a real community, with actual give and take. Tweets would be answered, as would Facebook posts. The public would not be ignored.
The organization’s main publics are people concerned about food safety for themselves and for their children. Hence parents are a component but not every communication is geared toward them. A side interest for the organization’s publics is an overall concern about health, as the front page of the website has links to articles about supplements, dieting (a huge online interest – just Google the word ‘diet’ and you get nearly half a billion hits), and rating coffee house foods. All three of these articles seem to be targeting nonparents, whereas articles on school lunches, candy at checkout counters, and children’s restaurant menus seem to be squarely aimed at parent publics.
Effectiveness seems to be mixed. As noted above, the urgings to join a community reveal that there really isn’t a community to speak of. The Facebook page and Twitter stream both spew content but don’t answer community queries or engage with the community (although, in all fairness, there was a November 26, 2015 tweet wishing everyone a Happy Thanksgiving).
The Pinterest profile is a bit better, in that a lot of the pins come from offsite or are repins. The Healthy Thanksgiving pinboard is colorful and attractive without being overly busy and precious, as Pinterest boards can sometimes be. It is not filled with impossible-to-re-create craft items; instead, it offers practical recipes.
The CSPI’s image does need some help, though. While its rating on Charity Navigator is a respectable 3/4 stars, a quick Google search reveals two significant criticisms of the organization. AlterNet reveals that the CSPI is against labeling GMO foods as such. A PDF on the CSPI site bolsters this statement; in it, the CSPI appears to endorse a statement that GMO foods are not harmful and thereby don’t need any form of special identification.
The other, and more significant, critique comes from AlcoholFacts.org. In a well-researched (albeit seemingly slanted) article, Alcohol Facts states, “Center for Science in the Public Interest distributes its reports without peer review, contrary to the way real science operates. … Without peer review, an advocacy report full of erroneous and misleading statistics can be passed off to the public as a scientific report. That’s exactly what Center for Science in the Public Interest does.”
The CSPI does not seem to have reacted to either criticism. Could this be the deliberate inaction strategy as outlined by Smith on page 145? Or did the Center just drop the ball?
Tying it back to the ILSC
For the Institute for Life Sciences Collaboration (ILSC), the Center’s website provides some lessons on how to proceed. For one, when mentioning a community (and mentioning it ad nauseam, as the prompt to join comes up with every single page refresh), the Center is writing communications checks that it is not cashing. Why call something a community when it so clearly is not? The ILSC needs to pay attention to its communications promises to its publics. Calling something a community does not make it so – communities are defined by shared bilateral communications. Weinberg & Pehlivan (2011), on page 277, define a community’s (and other social media) objectives as including “Conversation, sharing, collaboration, engagement, evangelism”. That does not happen when a public is talked at.
The ILSC can also take away the idea of developing a means of reacting to online criticism. The AlterNet and Alcohol Facts critiques of the Center are not going away. Not addressing these criticisms does the Center no favors. Instead, the Center looks as if it does not care what is said about it, and its ignoring of the posts by its own Facebook and Twitter followers bolsters that impression. In order to stand out and better serve its own publics, the ILSC must not only listen to its followers, its fans, and its critics – it must also answer them.
Added Applications of the RPIE Strategic Process
For the RPIE process (Research, Planning, Implementation, and Evaluation), the review of the Center’s website and its other online presences reveals that implementation cannot be overlooked. All the research and planning in the world does not amount to much if an organization does not appear to do anything with the available information.
The Center is shouting. Its community does not seem to be a community at all. Imperfect implementation is to blame, but at least that can be fixed.
Comparing InScope, Toms, and the Century Council Plans
InScope is a user-friendly search engine “dedicated to finding peer-reviewed, academic research among Belcher Rollins’ publications, back-dated to 1960. This equates to 27 million document entries.” (Page 4) Note: Belcher Rollins is a publisher.
The campaign for InScope focused on planning. The idea was to become and remain an online solution for librarians and academics. Enumerated goals (Page 9) were:
“To generate substantial interest in InScope from academic librarians, in order to support sales: positioning it as the best in the market and ‘a new generation in research’
To generate understanding and support for InScope among industry influencers and opinion leaders
To strengthen Belcher Rollins’ position as the brand leader in international academic publishing, underpinning its reputation for innovation and quality
To manage communications around contentious issues within scholarly publishing: (a) pressure on academic financial resources and criticisms of profiteering by publishers, (b) the lobby for open access publishing and (c) international censorship”
Events were planned with an eye toward attracting media coverage. A large number of internal communications were planned in order to keep the key stakeholders informed: librarians, academics, academic budget holders (the people with the money), and the media. The budget exceeded £2M, which currently converts to over $3.8M.
The Toms campaign, in contrast, had a markedly different look and feel. InScope was traditional and felt conservative, whereas Toms felt somewhat casual and even crunchy and hippie. Toms is a shoe creator and seller (they also sell other accessories such as purses and necklaces). The company’s mission is to donate a pair of shoes to charity for every pair purchased. The donated shoes are sent to children in countries such as Argentina. As the plan itself noted, on Page 30, “When a customer interacts with the TOMS brand, it is more than just buying shoes.”
The Toms objectives were to increase sales and repeat purchases, and to boost brand awareness. For Toms, the tactics included using the opening of new stores as events, although, in contrast to InScope, these events were not touted as a means of involving the media. Further, the Toms plan acknowledged the organization’s nontraditional stance in the media as an opportunity (Page 15). Other tactics were even more grassroots, involving an email list, a shoe drop, and even posters. The Toms budget clocked in at a far more modest $17,420, although several of the biggest-ticket items (such as the use of a plane for the shoe drop) were not enumerated among the planned expenses. Planning seemed looser, perhaps in keeping with the organization’s more relaxed overall philosophy.
For Century Council, the main goal was to kick off a new website to reach collegians in particular and help change attitudes about drinking. A further goal was to reduce underage drinking and drunk driving. The plan emphasized segmentation, whereby college students were divided by age and by activity (e.g., athletes, members of Greek letter organizations, etc.). The website would be specially tailored for each participating school. The tailoring was broken down further by age group, as the message differed: zero tolerance for those under 21 years of age, and responsible drinking for those over. The budget was in excess of $8.9M. In some ways, this campaign split the difference between InScope and Toms, presenting a strict and staid presence like InScope while relying less on traditional media, like Toms (one of the ways Century Council was looking to reach its publics was through a pizza box advertisement).
Each plan had a lot of the components of what we have been studying all along, from careful research into publics to clear-cut goals and objectives. Budgets were carefully laid out, although the one for Toms was incomplete. The timetables for all three campaigns seemed realistic.
Reading the Plans and Recognizing their Components
There were some language differences between the plans as presented and our readings. InScope in particular used a lot of synonyms; goals were enumerated as aims, for example. All three campaigns laid out their SWOT analyses clearly, using an easy-to-follow grid format. Evaluations for all of the campaign plans were clearly labeled.
Formatting and Stylistic Takeaways applied to the ILSC
The Toms plan in particular took advantage of a stylistic look and feel which mirrored the organization’s view of itself. The plan contained images of the founder, Blake Mycoskie, in Argentina, with children that the organization has helped. These images helped to add an emotional component to the campaign plan that was missing from the other two plans.
The Century Council plan was more generic-looking, which seemed to reflect almost a PTA-budget kind of communication. This tied in fairly well with the campaign being related to what happens on college campuses. By having the formatting look this way (and it may not even have been intentional), the campaign called to mind straightforward academics and straight talk.
The InScope campaign, in contrast, was poorly formatted. It was walls of text with little formatting or emphasis, and no imagery to speak of. Word allows for headings which make navigating a document a lot easier – the campaign didn’t have those. The spreadsheet denoting the timetable was a bit wide and threw off other formatting. Word also makes it easy to create a dynamic table of contents whereby a reader can click on a part of the table and be taken directly to the desired section of a document. The campaign did not take advantage of these simple yet powerful formatting tools – the campaign’s typist could use an intermediate course in Microsoft Word!
For the ILSC (Institute for Life Sciences Collaboration), the website is already rather bland. They are doing interesting things, such as teaching youth around the world, possibly finding cures for all kinds of fatal diseases, and potentially saving lives in Ghana. This is exciting stuff, yet the walls of text make the site look industrial, sterile, and unfeeling. There can be stylistic symmetry between the campaign plan and the look and feel of what the ILSC website should be all about. The ILSC needs to show its heart.
Visual impressions matter, particularly online. There is no reason why the ILSC cannot get started by making the campaign plan easy to read, well-indexed, and visually appealing.
Quinnipiac Assignment 11 – ICM 527 – Continuing Program Evaluation
This week, we continued studying the evaluation of public relations campaigns.
Ethical Issues Regarding Evaluation
As is true for any presentation of numbers, there are ways to spin findings which can lead a reader to believe one thing or another. Numbers can be used to make a case, and some numbers, if suppressed, deemphasized, or just plain omitted, could alter organizational decision-making. And that only covers telling the truth with numbers. All bets are off if a strategic planner or any sort of analyst outright alters the figures they have to present, or if they weren’t given accurate or truthful numbers to begin with.
But even if the analyst is completely honest about results and figures, there are still issues with emphasis and language. For the Cans Get You Cooking campaign, the initial purpose had to have been to increase the sale of canned goods. Instead, the campaign was labeled a success for leading to an increase in awareness of canned foods. While awareness is a perfectly legitimate (and objective) goal for a campaign, the goal of increased sales seems to have been swept under the rug in favor of the one demonstrable, favorable outcome – a boost in awareness.
On page 125, Place notes, “The role of ethics in public relations evaluation was described by participants as inherently associated with truth and fairness. For some professionals, this meant conveying evaluation data accurately and truthfully to organizational leadership or clients. For other professionals, this meant measuring whether the most accurate story or brand image reached an organization’s publics.”
Professionals, fortunately, realize that their words can be misinterpreted, even if they are reporting accurately on the numbers. If a campaign increases, say, signups for a class by five over an initial figure of five, then how is that reported? Is it a report of five new signups, or does the professional state that signups have doubled? Both are mathematically correct, but there is an exciting spin to the latter which may make it look more significant than it truly is.
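The arithmetic behind that spin is trivial, but seeing both framings side by side makes the point. A quick sketch using the hypothetical five-signup figures from above:

```python
# Hypothetical figures from the signup example above.
initial_signups = 5
new_signups = 5
total = initial_signups + new_signups

absolute_gain = new_signups                          # "five new signups"
percent_gain = new_signups / initial_signups * 100   # "signups doubled"

print(f"Absolute gain: {absolute_gain} signups")
print(f"Relative gain: {percent_gain:.0f}% (from {initial_signups} to {total})")
```

Both framings come from the same two numbers; which one lands in the report is an ethical choice, not a mathematical one.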
The Real Warriors and Okay 2 Talk Campaigns
A review of both campaigns revealed good attention to detail. Both campaigns seemed to be rather carefully planned.
The Real Warriors Campaign was designed to encourage active armed services personnel and veterans of recent American military campaigns (since 9/11) to seek psychological counseling and other help for post-traumatic stress disorder, i.e., ‘invisible wounds’. Primary research included focus groups and key informant interviews. All of the campaign’s goals were awareness-based; the aim was to decrease the stigma felt by veterans seeking mental health assistance.
The measurement of the effectiveness of the campaign included the distribution of campaign materials, website visitors, and social media interactions, plus news stories. This is good for an awareness campaign, but where are the actions? Where are the increased numbers of veterans seeking help? A far more germane measurement would be to show an increase in personnel hours for armed forces mental health professionals. Or perhaps there could be a measurement of the hiring of more counselors, or agreements with more civilian counselors. Without naming names or otherwise violating privacy, the number of patients being seen could be readily tallied, as could the number of appointments made, even if some of the appointments were never kept. Another objective measurement of success would be a decrease in suicides and fewer calls by veterans to suicide prevention hotlines. The campaign shows none of that.
As for the OK 2 Talk Campaign, its goals were to create awareness and also to launch a safe social media space. Tumblr was the chosen platform, as it allowed for anonymity. It seems to have also been chosen for a demographic match, although that is not spelled out.
The measurement of the effectiveness of that campaign was a lot more closely aligned with its initial goals than the Real Warriors report showed. For example, the OK 2 Talk report gave objective figures regarding engagement on OK2Talk.org. The page views are not necessarily indicative of much. It is the content submissions which seem to better reflect engagement. On the Tumblr blog, visitors are encouraged to anonymously post about how they are feeling. The blog makes it clear that not everyone’s writings will be posted. However, there are several well-written or illustrated posts showcasing various viewpoints. OK 2 Talk intelligently shows all kinds of posts, even those where the writers clearly need help or are just reblogging messages put together by creative professionals.
The campaign report shows the number of content submissions and the number of clickthroughs to a ‘get help’ screen. There is also a statement regarding ‘thousands’ of comments but no specifics; that could have been more clearly shown. But that does not truly matter. Showing the number of clickthroughs to the ‘get help’ screen was an objective and direct measurement of how the campaign is going. It answers the question, ‘did it work, or was it just a colorful and fancy waste of time?’ with ‘yes, it did’, and far more effectively than the distribution of materials ever could. As Smith notes on page 335, “Guesses aren’t good enough; Hard work and cost aren’t measures of effectiveness; Creativity isn’t, either; Dissemination doesn’t equal communication; Knowledge doesn’t always lead to acceptance; and Behavior is the ultimate measure.”
In particular, Real Warriors should have remembered that dissemination does not equal communication. After all, the distributed campaign materials could have gone right into the trash. Without some demonstrated actions (yes, the campaign’s stated goal was awareness, but it could only really be measured with some form of observable action), Real Warriors seems more like a lot of paper redistribution.
The two campaigns have similar goals, and both have the valiant ideal of helping the mentally ill. But it’s only OK 2 Talk which is showing objective and relevant results.
Relating it all back to the ILSC
For the Institute for Life Sciences Collaboration, deciding what to measure, and making sure it is being accurately measured, are important steps to take. While it is pretty easy to count website visitors using Google Analytics or the like, a better measurement is actual engagement: blog comments, Facebook comments and shares, and LinkedIn comments. These tie directly to awareness objectives.
For objectives regarding adding high schools to the Small World Initiative, good measurements include the number of times that educators click through to a ‘get information’ page which should be added to a revamped website. Such inquiries could also be expected in the comments and messaging sections of a possible future Facebook group devoted to the ILSC. A similar vehicle for obtaining such inquiries could be a possible future LinkedIn group for the ILSC, and its topics.
Measuring the campaign’s reach among donors could start with the number of visits to a donations page, along with the percentage of site visitors who went all the way through the online donations funnel. Knowing where visitors stop (when a visit does not lead to a donation) would be extremely helpful information to have.
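A funnel analysis of that sort can be sketched in a few lines. The step names and visit counts below are entirely made up, since the ILSC does not publish any such figures:

```python
# Hypothetical donation-funnel steps; each count is the number of
# visitors who reached that step.
funnel = [
    ("donations page", 1000),
    ("amount selected", 400),
    ("payment details", 150),
    ("donation completed", 90),
]

top = funnel[0][1]
for i, (step, count) in enumerate(funnel):
    pct_of_top = count / top * 100
    if i == 0:
        print(f"{step}: {count} (100% of funnel entrants)")
    else:
        prev = funnel[i - 1][1]
        drop = (prev - count) / prev * 100
        print(f"{step}: {count} ({pct_of_top:.0f}% of entrants, "
              f"{drop:.0f}% drop-off from previous step)")
```

The step with the largest drop-off is the one worth fixing first.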
For the website, Google Analytics should be used to tie traffic back to visitor acquisition. If Facebook turns out to be the most popular place for visitors to come from, then the ILSC should be concentrating its efforts there. A surprisingly small amount of money (e.g., $20 or so) can boost a post and reach even more people. This measurement is useful for all types of objectives, as it helps to define where the ILSC’s social media time is best concentrated. There is little use in devoting hours and hours of time to LinkedIn if the publics don’t come to the website and don’t donate any funds. Awareness needs to be related to action, for it is action that will get the SWI out of its funding gap and help keep the ILSC going for years to come.
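Even without Google Analytics open in front of you, the underlying tally is simple to picture. A sketch with invented session sources, standing in for what Analytics reports under Acquisition:

```python
from collections import Counter

# Invented referral sources for recent sessions; Google Analytics
# would supply the real equivalents in its Acquisition reports.
sessions = [
    "facebook", "facebook", "google", "facebook",
    "linkedin", "direct", "facebook", "google",
]

by_source = Counter(sessions)
top_source, top_count = by_source.most_common(1)[0]
print(f"Top source: {top_source} ({top_count} of {len(sessions)} sessions)")
```

If the real tally looked like this one, Facebook is where the ILSC’s social media time (and the occasional boosted post) should go.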
Quinnipiac Assignment 10 – ICM 527 – Program Evaluation
This week’s readings were about evaluating a strategic plan and program.
As Smith says on page 331, “Program evaluation is the systematic measurement of the outcomes of a project, program or campaign based on the extent to which stated objectives are achieved.”
With a plan in place and measurable, clear objectives included in it, the next question is whether anything is working. This comes from figuring out how to measure results and what counts as ‘good’, or at least adequate. In Module 8, we studied Cans Get You Cooking, where the idea was to increase awareness of cans’ use in cooking via cooking shows and blogs. However, another objective was increased sales (after all, why bother with such a campaign if sales don’t increase?), and in that respect the plan was unsuccessful. According to Companies and Markets, the purchase of canned goods declines as the economy improves: when consumers have more discretionary income to spend on foodstuffs, they purchase fewer canned goods – no matter how well-crafted a campaign is. There was increased awareness, yes, and under that criterion, the campaign worked. But under the criterion of increased sales, it did not. It seemed a little as if the goalposts had been moved: increased sales came to be seen as a less attainable goal, while awareness was far more readily attainable, and so awareness was presented as the premise behind the campaign.
These moved goalposts are the difference between what Smith refers to as awareness and action objectives, on pages 332–335, with the third type of objective, acceptance, straddling the line between the other two. For the Cans Get You Cooking campaign, it seems as if the attainment of the awareness objective was the only cause for celebration.
Smith makes a compelling case on page 334, that creativity, effort, and cost don’t count as measures of effectiveness. All of those facets of a campaign are on the side of the organization, but measures of awareness, acceptance, and action are all effects felt (and acted upon) by publics. By definition, creativity, etc. should not be seen as having anything to do with the effectiveness of a campaign.
The Eight-Step AMEC Social Media Measurement Process
Jeffrey (Page 4) outlines “The Eight-Step Social Media Measurement Process”:
Identify organizational and departmental goals.
Research stakeholders for each and prioritize.
Set specific objectives for each prioritized stakeholder group.
Set social media Key Performance Indicators (KPIs) against each stakeholder objective.
Choose tools and benchmark (using the AMEC Matrix).
Reconcile the numbers (they won’t add up, but it’s fun!).
Check the daily/normal stuff.
Sweat the TCO (total cost of ownership).
What Kaushik said, and what Jeffrey said, are similar. Measurement is an objective activity. This is why objectives need to be clear and measurable. Five percent is measurable; better exposure (in general) is not.
For both authors, the idea is to have specific objectives and then act on them, whether those objectives are to launch a strategic campaign or select a web analytics vendor. Then, once the vendor is chosen, get the yardstick in place, and use it. Kaushik further reminds us that, while our intention may be to select a vendor and essentially ‘marry’ it, we still need to be evaluating the evaluator. If it’s not performing up to our reasonable specifications, then it’s time for vendor divorce court.
Key Performance Indicators (KPIs)
On page 7, Jeffrey quotes Shel Holtz, principal of Holtz Communication + Technology (www.holtz.com), who defined a KPI as a “quantifiable measurement, agreed to beforehand, that reflect[s] the critical success factors of one’s effort.”
This puts KPIs on a par with what we have been referring to as objectives. Wanting to ‘get better’ is one thing, but it’s vague and subject to weaseling. Wanting to improve recognition of the Institute for Life Sciences Collaboration (ILSC) and its missions by 5%, as measured by surveys taken during the second quarter of 2016, is a measurable key performance indicator. Anyone who can read numbers will be able to determine whether the KPI has been met.
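Put another way, a KPI of that shape reduces to a single comparison. The survey numbers below are invented purely for illustration, and the 5% is read here as five percentage points:

```python
# Invented survey results: share of respondents recognizing the ILSC.
baseline_recognition = 0.22   # pre-campaign survey
q2_2016_recognition = 0.28    # survey taken during Q2 2016
target_gain = 0.05            # the KPI: a five-point improvement

gain = q2_2016_recognition - baseline_recognition
kpi_met = gain >= target_gain

print(f"Recognition gain: {gain:.1%} -> KPI {'met' if kpi_met else 'not met'}")
```

Anyone who can read those three numbers can settle whether the KPI was hit – which is exactly the point of agreeing on it beforehand.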
Applicability to the ILSC
Beyond just recognition measurements, there are any number of KPIs which can be measured, including the number of schools served by the Small World Initiative by a certain date, or an increase in donations by a particular amount, subject to a clear deadline.
Currently, the ILSC website seems to have been just sort of thrown together, without any sense of how to deal with technological and design changes, or scalability. Keeping measurements out of the mix means that the ILSC website can be tossed up and then forgotten about – and it seems a lot like that’s exactly what happened. However, a website cannot be a flash in the pan, as that can cause the publics to feel the organization behind it is also fly-by-night. Particularly when asking for money, an organization needs to project an impression of trustworthiness and solidity.
Adding Key Performance Indicators and measurements means there needs to be a sea change in how the ILSC views the website. It isn’t just something thrown together in an afternoon, to be handled by some temp hired for a few weeks and then never seen again. Instead, it needs to be an integral part of the organization. While the organization’s work is (generally) offline, there still needs to be room for the website in the minds of the organization’s board members. One facet of their thinking has to include how to best utilize the website and social media, in order to better communicate the ILSC’s mission and goals, and to communicate with its publics. The website has got to have a place in those conversations, and it currently does not. That has to change.