Mind The Gap: An Exploratory Case Study Analysis of Public Relations Student Intern and On-Site Supervisors’ Perceptions of Job Skills and Professional Characteristics


Thomasena Shaw, Bridgewater State University


Internships confer significant early career advantages on undergraduates, including a shorter search for a first position, increased monetary compensation, and greater overall job satisfaction. Considerable professional and scholarly evidence highlights the important role of undergraduate internships, as well as gaps that exist between students and supervisors regarding the relative importance of specific job skills and professional characteristics. While previous studies have explored the underlying feelings and expectations of the two groups in professional and academic contexts, this exploratory case study uses coorientation as the theoretical framework to examine the levels of agreement, congruency and accuracy that exist between them in relation to key job skills and professional characteristics linked with career success; it also provides insight into the extent to which respondents perceive that the internship improved students’ college-learning outcomes. The key findings indicate that the majority of respondents believed the experience improved performance in relation to college learning outcomes, and that students and supervisors are accurately cooriented with one another on job skills items, but less so on professional characteristics. This could be particularly problematic for student interns: misperceptions and misunderstanding can lead to missed opportunities for collaboration and integration, and/or a self-fulfilling prophecy in which supervisors’ lack of coorientation damages the possibility of a cooperative relationship with current and future student interns, and with the academic programs that bring them together.



The internship experience is broadly regarded by practitioners and educators as a critical event that often serves as a transition to an entry-level position (Gault, Redington, & Schlager, 2000; Gibson, 2001) and better employment opportunities for students (Knouse & Fontenot, 2008; Knouse, Tanner, & Harris, 1999; Redeker, 1992; Taylor, 1988). Internships improve college performance via experiential learning (Cantor, 1997; Ciofalo, 1989; McCarthy, 2006), improve personal habits such as time management and dependability (Sapp & Zhang, 2009; Taylor, 1988), have the potential to strengthen academic programs via service learning and citizenship (Fall, 2006; Mendel-Reyes, 1998), and help students make valuable connections with industry (Tovey, 2001) and community partners (Bringle, 2002; Soska & Butterfield, 2013). Internships provide students with a unique opportunity to gain valuable interpersonal, social, and contextual attitudes necessary for entry into non-academic settings (Anson & Forsberg, 1992), and crystallize personal interests and career ambitions (Coco, 2000).

Professional and scholarly evidence suggests a gap exists between students and supervisors regarding the relative importance of specific job skills and professional characteristics (CPRE, 1999; CPRE, 2006; Daugherty, 2011; Neff, Walker, Smith, & Creedon, 1999; Todd, 2014). While these and other studies have explored the underlying feelings and expectations of the two groups in professional and academic contexts, this study uses coorientation as the theoretical framework. Specifically, the researcher examines the levels of accuracy, congruency and agreement that exist between the two groups in relation to a number of job skills and professional characteristics considered necessary for a positive internship experience and future career success. The results are intended to extend existing understanding of the topic and suggest intentional changes to course design and dialogue regarding teaching practices that could improve student learning outcomes – ultimately laying the groundwork for the two groups to “coorient” toward one another accurately.

In the next section of this paper, a review of literature defines and examines the benefits of the internship experience, explores it in a public relations program context, and outlines the study’s theoretical framework: coorientation. Next, the researcher outlines the survey methodology employed, describes results, and discusses implications of the findings.


Benefits of the Internship Experience

Internships help students transition to entry-level positions (Gault, Redington, & Schlager, 2000; Gibson, 2001), improve interconnections between service learning and citizenship education (Fall, 2006; Mendel-Reyes, 1998), and have the potential to strengthen relationships between the academy and business and community partners (Tovey, 2001). An article in The Chronicle of Higher Education states that academic internships are valuable partnerships that allow students to collaborate closely with faculty, and strengthen ties between the academy and the community—whether students are paid or not (Westerberg & Wickersham, 2011). Regarding the benefits to the organization, internships provide direct business contact for students in an employment setting (Gupta, Burns, & Schiferl, 2010), prepare students with realistic expectations of their future careers, and an opportunity to gain on-the-job experience (Paulins, 2008). They provide additional well-educated, talented labor capacity (Brindley & Ritchie, 2000; Callanan & Benzing, 2004; Mihail, 2006), “compensation efficiencies” (Maertz, Stoeberl, & Marks, 2014), and an opportunity to see how much potential a student has in the field before hiring them (Coco, 2000). Indeed, Watson (1995) estimated that it is $15,000 per person less expensive to hire interns than to recruit and select candidates from an at-large pool. Maertz, Stoeberl, and Marks (2014) assert that interns are often more loyal toward the company and stay longer than the average non-intern hire.

College Internship Experiences Defined

The earliest recorded college-endorsed employment program was established in 1906 at the University of Cincinnati’s Cooperative Education Program (Thiel & Hartley, 1997). Typical contemporary internship programs have the following attributes: they offer a specific number of work hours, paid or unpaid employment, credit for college classes, supervision by a faculty coordinator or other university contact, and supervision by an organization mentor (DiLorenzo-Aiss & Mathisen, 1996; Gault, Redington, & Schlager, 2000; Roznowski & Wrigley, 2003). To maximize the internship experience, Coco (2000) asserts that students should be held accountable for projects and deadlines. Lubbers and Bourland-Davis (2012) suggest that on-site supervisors should provide incoming interns with some kind of orientation, where goals are clearly articulated, and with access to regular meaningful feedback. This type of internship experience resembles what Kuh (2008) describes as high-impact practices when they are effortful, help students build substantive relationships, help students engage across differences, provide students with rich feedback, help students apply and test what they are learning in new situations, and provide opportunities for students to reflect on the people they are becoming.

Divine, Linrud, Miller, and Wilson (2007) indicate that approximately 90% of U.S. colleges offer internships or similar experiential opportunities. In 2016, a US News and World Report survey of 324 ranked colleges and universities found that on average 40% of the undergraduate class of 2014 had internship experience. At the eight schools with the highest rates of participation, 100% of undergraduates completed an internship (Smith-Barrow, 2016). A National Association of Colleges and Employers (NACE, 2016) report found that more than 56% of students from the class of 2015 who participated in an internship had received at least one job offer by April of that year (compared to only 36.5% of undergrads who did not have an internship) and that the intern conversion rate was 51.7%.

The Internship Experience in a Public Relations Program Context

Internships are strongly encouraged and valued among both public relations educators and employers; the experience lends credibility to university public relations programs (Van Leuven, 1989a), and allows students to observe public relations practitioners in the roles of manager, strategist, planner, problem solver and counselor to management (Baxter, 1993). Lubbers, Bourland-Davis and Rawlins (2008) describe it as a process of socialization through which interns learn the values associated with the profession.

The industry’s largest organization of public relations professionals, the Public Relations Society of America (PRSA), encourages internships as a key way for students to enhance their education, résumé, portfolio, networking, and technical skills (Beebe, Blaylock, & Sweetser, 2009). A national study conducted by the Commission on Public Relations Education entitled “A Port of Entry” recommends a supervised work experience as one of the core courses for students majoring or pursuing an emphasis in public relations (CPRE, 1999); the Accrediting Council on Education in Journalism and Mass Communications (ACEJMC) also advocates and encourages opportunities for internship and other professional experiences outside the classroom (ACEJMC, 2013).

Research also supports the notion that a quality public relations internship increases job satisfaction after graduation (Horowitz, 1997), is a necessity for mass communication students making the transition from college to career (Beard & Morton, 1999), and is typically favored by students to seek mentoring and to make contacts (Basow & Byrne, 1993).

With regard to discipline-specific skills supervisors believed most necessary for public relations interns, Beard and Morton (1999) identify six predictors for internship success in a public relations context: (1) academic preparedness, (2) proactivity/aggressiveness, (3) positive attitude, (4) quality of worksite supervision, (5) organizational practices and policies, and (6) compensation. Brown and Fall (2005) identified writing, oral, and organizational skills, and note that the most valued professional characteristics were intangible: motivation and “healthy, upbeat attitudes” (p. 303). The aforementioned “Port of Entry” report (1999, p. 12) identified the following as core skills: mastery of language in written/oral communication; community relations, consumer relations, employee relations and other practice areas; research methods and analysis; problem solving and negotiation; and informative and persuasive writing.

Disparities Regarding Learning Outcomes

Despite the obvious benefits of the internship experience, research does indicate that disparities exist between how public relations practitioners, academic programs, and students perceive the importance of job skills and professional characteristics, which has the potential to lead to missed opportunities for all parties.

A study conducted on behalf of the Association of American Colleges and Universities (Hart, 2016) indicated that the college learning outcomes employers considered top priorities include demonstrated “cross-cutting skills” related to communication, teamwork, ethical decision-making, critical thinking, and applying knowledge in real-world settings (p. 1). Sixty percent of employers indicated that they would be much more likely to consider a candidate who had recently completed an internship. However, 44% felt that recent college graduates were not well prepared to apply their knowledge in real-world settings, and gave students low scores for preparedness across a range of college learning outcomes, including the ability to communicate orally, work effectively with others in teams, and think critically and analytically. There was alignment in the category referred to as staying current with new technologies; however, students were more than twice as likely as employers to consider themselves prepared in terms of oral communication, written communication, critical thinking, and creativity.

Two separate Commission on Public Relations Education reports (CPRE, 1999; CPRE, 2006) indicate that a number of key competencies and skills were weak or missing among entry-level public relations graduates, including: writing skills, understanding of business practices, and critical thinking and problem-solving skills. Neff, Walker, Smith, and Creedon (1999) assert that gaps exist between the outcomes educators and employers desire and those presently achieved in public relations education. They found that public relations graduates don’t always meet entry-level outcome competencies expected by employers, and recommended changes in curriculum, pedagogy and assessment.

It would appear that these disparities also spill over into the internship experience. Meng (2013) found differences between students and practitioners; practitioners ranked strategic decision-making capability, ability to solve problems and produce desired results, and communication knowledge and expertise highest. Meanwhile, public relations students rated ability to solve problems and produce desired results, being trustworthy and dependable, and relationship-building abilities highest. Sapp and Zhang (2009) found that industry supervisors rated students’ performance in the categories of attitude and interaction the highest, and skills related to the students’ writing skills, ability to take initiative, professional skills, spoken communication skills, and time management skills among the lowest. In Daugherty’s (2011) study, students indicated that they wanted more skill development and hands-on training, while on-site supervisors saw their role as more holistic. Todd (2014) found that public relations managers rated the job skills and professional characteristics of their entry-level millennial charges significantly lower than the latter group rated themselves.

Many of the research articles, studies and reports detailed above explore the public relations internship experience from a variety of perspectives, including that of student interns and their on-site supervisors, but none have explored the degree of coorientation—agreement, congruence and accuracy—each group perceives the other to have with his/her own evaluations in relation to recognized job skills and professional characteristics. Coorientation rests on the assumption that a person’s behavior is based on a combination of his/her personal construction of the world and the perception of orientations of those around them (Heider, 1946; Newcomb, 1953). As such, the theory suggests methods for measuring the degree of mutual orientation of individuals, groups or organizations toward an object, or the consensus among them about an object (Pearson, 1989). In this study, coorientation theory will be used to explore if perceptions regarding the job skills/professional characteristics necessary for a successful public relations internship experience are accurate or not. This will identify underlying disparities (if they exist), and facilitate discussion of implications for public relations educators, student interns, and on-site supervisors.

Theoretical Framework: Coorientation

Coorientation theory stems from the study of social psychology. Essentially, the term coorientation refers to simultaneous orientations, so if person A (on-site supervisor) feels negatively toward B (student intern) and positively about X (job skills and/or professional characteristics), and finds out that B feels positively about X as well, then the system can be said to be imbalanced, or asymmetrical. Ultimately, this imbalance can impede any moves toward balance or improvement of the relationship between the two parties. Therefore, coorientation can be seen as a relational term, and it is via communication that it is achieved. According to Johnson (1989), from this perspective, it is imperative that consensus is examined as an interaction between people rather than being the property of a single individual.

Perhaps the most recognizable names in this research stream are McLeod and Chaffee (1973), who developed a coorientation measurement model with three variables: agreement, congruency and accuracy. Perfect communication between the two groups (A and B), totally free of constraints, would not necessarily improve agreement, and it might even reduce congruency. If the two are motivated to coorient, however, communication should always improve accuracy, potentially to the point where each person knows exactly what the other is thinking; this would be perfect communication in a quite literal sense.

The model, outlined in Figure 1, provides a visual representation of coorientation in relation to this study, which explores the relationship between the two groups’ (on-site supervisors and student interns) self-reported attitudes toward an object (rating of job skills and professional characteristics) as well as their perceptions of each other’s self-report. This produces three coorientation variables: agreement, congruency and accuracy.

Figure 1. Coorientation model representation of supervisors’ and interns’ ranking of job skills and professional characteristics. Adapted from McLeod, J. M., & Chaffee, S. H. (1973). Interpersonal approaches to communication research. American Behavioral Scientist, 16, 484.

Coorientation Variables Defined: Agreement, Congruency and Accuracy

Agreement indicates the degree to which the two groups’ beliefs on the issue (rating of job skills and professional characteristics) are similar. Perceived disagreement/agreement on the issue by the two groups is described as congruency. Accuracy is the extent to which one group’s cognition (e.g., interns’ perception of supervisors’ ranking of job skills and professional characteristics) equals what the other group actually reported.

According to Kim (1986), of the three measurements, accuracy is considered to be the most important because it can provide a clear picture of the effects of communication. For example, in terms of this study, agreement on the focal point—what job skills and professional characteristics are most important—must take place before true understanding can occur. Although communication may often produce some increase in accuracy, it rarely produces total agreement because each person arrives at his/her beliefs through personal experiences. Communication can produce marked increases in accuracy between the two groups because the more two parties coorient by communicating private values to each other, the more accurate perceptions of those values have the potential to become (Chaffee & McLeod, 1968).

It is important to note at this point that the coorientation variables—agreement, congruency and accuracy—are not functionally independent of one another, since each is based on two measures. Thus, if agreement is low and congruency is high, accuracy is necessarily low; if agreement and congruency are both high (or low), accuracy is high. A change in one of these variables will affect change in another if the third is held constant (Chaffee & McLeod, 1968). For example, if a public relations program makes student interns more accurate in their perceptions of the rigors and demands of actual public relations practice, then congruency for that public will also change. The direction of the change, higher or lower congruency, depends on the degree to which the initial supervisors’ definition of the issue was similar to student interns’ views.
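The interdependence of the three variables can be made concrete with a small worked example. The sketch below is illustrative only: the rating vectors are invented, and the mean-absolute-difference similarity measure is a simplification chosen for clarity, not the statistic used in this study. It computes agreement, congruency (student side), and accuracy (student side) from the four self/other report vectors in the Figure 1 layout:

```python
# Illustrative sketch of McLeod & Chaffee's (1973) coorientation variables.
# All rating vectors below are hypothetical; lower scores = more similar.

def mean_abs_diff(a, b):
    """Average absolute gap between two rating vectors (0 = identical)."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Hypothetical ratings of four items (higher = more important)
student_self    = [6.0, 5.5, 3.0, 2.5]  # students' own ratings
student_other   = [6.0, 5.0, 3.5, 3.0]  # students' prediction of supervisors
supervisor_self = [6.5, 6.0, 3.0, 4.0]  # supervisors' own ratings

# Agreement: students' own ratings vs. supervisors' own ratings
agreement = mean_abs_diff(student_self, supervisor_self)
# Congruency (students): own ratings vs. own prediction of supervisors
student_congruency = mean_abs_diff(student_self, student_other)
# Accuracy (students): prediction of supervisors vs. supervisors' actual ratings
student_accuracy = mean_abs_diff(student_other, supervisor_self)
```

Fixing any two of these comparisons constrains the third, which is the dependency the paragraph above describes.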

Examples of the theory being used by public relations researchers include use in the exploration of public issues (Broom, 1977), media relations (Kopenhaver, Martinson, & Ryan, 1977), understanding between government organizations and interest groups (Grunig, 1972), non-profit organizations and donors (Waters, 2009), journalist and practitioner attitudes toward social media (Avery, Lariscy, & Sweetser, 2011), and international relations (Verčič & Verčič, 2007).

There can be no doubt that student interns are operating in more competitive and dynamic environments than ever before, and it is therefore imperative that both groups identify issues that may help or hinder their relationship. Expanding knowledge of the role and importance of the relationship that exists between them, as well as how each group reacts to similar stimuli/events (i.e., improving the level of coorientation), will potentially lead to improved student effectiveness and success, and more fruitful collaborations between academic programs and real-world industry/organizations.

Research Questions

This study will address the following research questions:

RQ1a: How do respondents rate/score specific job skills (JS) and professional characteristics (PC)?

RQ1b: Is there a significant difference in the levels of coorientation (agreement, congruency and accuracy) regarding JS and PC between the two groups?

RQ2: Do respondents perceive that the internship experience improves students’ learning outcomes?


Survey Instrument

The researcher secured IRB approval, and pre-tested the survey with a small sample of faculty and students to verify categorical representation, and assess validity and comprehension. A Qualtrics survey link was then distributed to all students listed as belonging to the Strategic Communication/Public Relations concentration in the final three weeks of a traditional 15-week fall (2015) semester (N = 135) at a mid-sized public Northeastern regional university. All of the students who participated had completed (or were currently taking) a public relations practicum class, which uses a 120-hour required field experience as a focal point (course prerequisites include Introduction to Public Relations and Strategic Writing). Students worked at the job site 6-8 hours per week with an on-site supervisor (who is employed in a public relations capacity at the job site) and engaged in similar types of activities—event planning and coordination, strategic writing, preparing strategic awareness/promotion materials, etc. The on-site supervisor survey was emailed to students’ supervisors (students provided contact information in their survey). An initial solicitation email with a web-link to the survey was distributed to both groups and followed up with one reminder email; this yielded 32 completed student surveys (n = 32; response rate = 22%) and 15 supervisor surveys (n = 15, response rate = 50%).

The survey comprised three sections. The first gathered relevant demographic data from respondents; the second asked respondents to rate/score eight job skills and 12 professional characteristics according to (1) their own perceptions and (2) how they predicted the other group would rank them (1 being most important, 12 least important).

This section has preliminary convergent validity, as it adapts criteria presented in a study conducted by Todd (2014) that also divided tasks and responsibilities into these two constructs. The third section of the survey explored the extent to which the internship experience improved students’ abilities related to a number of college learning outcomes (5-point scale; 1 = no improvement, 2 = slight improvement, 3 = moderate improvement, 4 = significant improvement, 5 = not applicable). This section has preliminary convergent validity because it uses several of the same constructs presented in a study conducted on behalf of the Association of American Colleges and Universities that identified the college learning outcomes employers considered top priorities. The Cronbach’s α score was 0.86, which demonstrates acceptable internal reliability. The final section of the survey asked respondents to answer open-ended questions related to the overall experience and to challenges/suggestions. The convenience sample and small sample size mean that external validity for both the quantitative and qualitative parts of the study is low; therefore, only face validity can be assumed.
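For readers unfamiliar with the reliability statistic reported above, a minimal sketch of the standard Cronbach’s α computation is given below. The item-response data are invented for illustration and are not the study’s data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).
# The paper reports alpha = 0.86 for its learning-outcomes section; the
# numbers below are made up purely to demonstrate the formula.
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per survey item, respondents in the same order."""
    k = len(items)
    item_vars = sum(pvariance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total score
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

# Three hypothetical 5-point items answered by four respondents
responses = [[1, 2, 3, 4], [2, 3, 4, 5], [1, 3, 3, 5]]
alpha = cronbach_alpha(responses)
```

Values above roughly 0.7-0.8 are conventionally read as acceptable internal consistency, which is why the study’s 0.86 is reported as adequate.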


Description of Respondents

Of the 47 respondents participating in the study, 68% (n = 32) were student interns and 32% (n = 15) were on-site supervisors. Sixty-eight percent (n = 24) of the interns were female and 32% (n = 8) were male; on-site supervisors were 53% female (n = 8) and 47% male (n = 7). Student respondents were mostly aged 21-25 (93% of students; n = 28); on-site supervisors’ ages ranged from 26-65, the median age being 39. The majority of both students and on-site supervisors identified as Caucasian (81%; students n = 26 and supervisors n = 13). The student respondents were mostly seniors (93%; n = 28); 19% (n = 6) were juniors. All on-site supervisors (n = 15) reported having a 4-year college degree; two of them (20%) held a master’s degree. In the on-site supervisor group, 67% (n = 10) worked in private not-for-profit (charitable) organizations, and the remainder worked in other non-profit settings (local government n = 2; state government n = 2). Just over half of the students (53%; n = 17) reported that this was their first internship; 22% (n = 7) had completed two internships; 19% (n = 6) had completed three. In terms of weekly hours worked at their internships, 66% (n = 21) of students worked under 10 hours and 19% (n = 6) worked over 15 hours. On-site supervisors indicated that 47% (n = 7) had hosted just one student intern, 33% (n = 5) more than three, and 20% (n = 3) two. The majority of supervisors indicated that interns worked fewer than 10 hours per week (80%; n = 12).

RQ1a: How do respondents rate/score the importance of related job skills and professional characteristics?

Job skills: student interns. With regard to the eight job skills (see Table 1), student interns reported their top four (in order of preference) as, quality of work (M = 6.28, SD = 1.37), overall performance (M = 5.72, SD = 2.55), writing skills (M = 5.56, SD = 1.62), and job task preparation (M = 5.06, SD = 2.15). Their bottom four were oral communication skills (M = 4.81, SD = 1.92), knowledge of social media (M = 3.19, SD = 2.07), computer skills (M = 3.0, SD = 1.66), and research skills (M = 2.4, SD = 1.38).

Table 1

Job skills – Students’ and Supervisors’ Self Mean

Job Skill Student self-mean Supervisor self-mean Difference in means
Research skills 2.4 3.84 -1.44
Computer skills 3.0 2.9 .1
Knowledge of social media 3.19 3.14 .05
Oral communication skills 4.81 5.48 -.67
Job task preparation 5.06 5.13 -.07
Writing skills 5.56 5.91 -.35
Overall performance 5.72 3.31 2.41
Quality of work 6.28 6.5 -.22

Job skills: on-site supervisors. On-site supervisors reported their top four job skills (see Table 1) in order of preference as, quality of work (M = 6.5, SD = 1.50), writing skills (M = 5.91, SD = 1.22), oral communication skills (M = 5.48, SD = 1.84), and job task preparation (M = 5.13, SD = 2.40). Their bottom four were research skills (M = 3.84, SD = 2.54), overall performance (M = 3.31, SD = 2.84), knowledge of social media (M = 3.14, SD = 1.33), and computer skills (M = 2.9, SD = 1.03).

Professional characteristics: students. As there are 12 professional characteristics (PC), the researcher divided them into two groups—top and bottom (see Table 2). Student interns reported the top PCs needed by interns as, willingness to learn (M = 9.75, SD = 2.47), time management (M = 9.12, SD = 1.69), attention to details (M = 9.03, SD = 2.54), accept responsibility (M = 7.87, SD = 2.54), follow instructions (M = 7.84, SD = 2.7), and punctuality (M = 6.34, SD = 3.17). The lower-ranked PCs were, take on new tasks (M = 6.12, SD = 2.98), cooperation (M = 5.96, SD = 2.23), accept criticism (M = 5.65, SD = 2.71), work independently (M = 5.25, SD = 3.3), aware of ethics (M = 2.65, SD = 2.85), and understand diversity (M = 2.37, SD = 1.94).

Table 2

Professional Characteristics – Students’ and Supervisors’ Self Mean

Professional Characteristics Student self-mean Supervisor self-mean Difference in means
Understand diversity 2.38 2.27 .11
Aware of ethics 2.66 3.07 -.41
Work independently 5.25 6.93 -1.68
Accept criticism 5.66 5.93 -.27
Cooperation 5.97 6.07 -.10
Take on new tasks 6.12 5.20 .92
Punctuality 6.34 3.93 2.41
Follow instructions 7.83 7.93 -.10
Accept responsibility 7.88 7.27 .61
Attention to details 9.03 10.13 -1.10
Time management 9.13 7.40 1.73
Willingness to learn 9.75 11.87 -2.12

Professional characteristics: on-site supervisors. On-site supervisors reported their top PC as (see Table 2), willingness to learn (M = 11.87, SD = 0.516), attention to details (M = 10.13, SD = 1.55), follow instructions (M = 7.93, SD = 2.54), time management (M = 7.4, SD = 2.13), accept responsibility (M = 7.27, SD = 1.94), and work independently (M = 6.93, SD = 3.47). The bottom ranked PCs were, cooperation (M = 6.07, SD = 1.86), accept criticism (M = 5.93, SD = 1.94), take on new tasks (M = 5.2, SD = 1.78), punctuality (M = 3.93, SD = 2.78), aware of ethics (M = 3.07, SD = 3.49), and understand diversity (M = 2.27, SD = .88).

RQ1b: Is there a significant difference in the levels of coorientation (agreement, congruency and accuracy) between the two groups?

Agreement. Comparing respondents’ self-reports with the self-reports of members of the other group provides coorientational insight into the level of agreement that exists between the two groups; this was assessed using a non-parametric statistical measure, the Mann-Whitney U test. The central question here is: Do students and supervisors agree on the rating/scoring of the items (student self vs. supervisor self)?

Mann-Whitney U-tests indicated that, for the most part, the two groups agreed with one another on the ratings/scores of the eight JS presented in the survey. The only exception relates to the item overall performance (z = -2.813, p = 0.005). Here, students’ mean scores were higher than supervisors’ self-reports (student mean = 5.7; supervisor mean = 3.30).

Regarding the 12 PCs, respondents’ scores were similar on the majority of items, with three exceptions: (1) willingness to learn (z = -3.474, p = 0.001)—supervisors rated it higher than students (supervisor mean = 11.80; student mean = 9.70); (2) time management (z = -2.601, p = 0.009)—students rated it higher than supervisors (student mean = 9.1; supervisor mean = 7.40); and (3) punctuality (z = -2.503, p = 0.012)—students rated it higher than their on-site counterparts (student mean = 6.3; supervisor mean = 3.9).
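The U statistic underlying these tests can be sketched in a few lines. The following stdlib-only Python illustration computes U via rank sums, using mid-ranks for ties; the scores passed to it are invented, and an actual analysis would rely on a statistics package rather than this hand-rolled version:

```python
# Minimal Mann-Whitney U computation: pool both samples, assign mid-ranks
# (ties share the average of their rank positions), then derive U from the
# rank sum of the first sample. For illustration only.

def mann_whitney_u(x, y):
    """Return the Mann-Whitney U statistic for two independent samples."""
    combined = sorted(x + y)
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1  # advance past the tie group
        ranks[combined[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    r1 = sum(ranks[v] for v in x)            # rank sum of first sample
    u1 = r1 - len(x) * (len(x) + 1) / 2
    u2 = len(x) * len(y) - u1                # U for the other sample
    return min(u1, u2)
```

The z scores and p values reported in this section are then obtained from U via a normal approximation, which standard statistical software handles automatically.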

Congruency. To achieve coorientational insight into the level of congruency, respondents’ self–reports are compared to their projections of “other group” responses. Mann-Whitney U-tests compared respondents’ selections. The central question here is: How similar are respondents’ ratings/scores of job skills and professional characteristics to how they perceive their counterparts will rate/score the items (student self vs. student other; supervisor self vs. supervisor other)?

Student interns. Student intern ratings/scores were congruent with their perceptions of how supervisors would rate/score the items: no significant differences occurred in the job skills category, and congruence also existed across all professional characteristics items.

Table 3

Professional Characteristics – Supervisor Congruency

Professional Characteristics z score p value
Willingness to learn -4.670 .000
Attention to details -2.585 .010
Follow instructions -1.996 .046
Time management -2.936 .003
Accept responsibility -3.330 .001
Punctuality -4.037 .000
Cooperation -4.231 .000
Accept criticism -3.639 .000
Take on new tasks -4.648 .000
Work independently -1.827 .068
Understand diversity -4.670 .000
Aware of ethics -3.656 .000


On-site supervisors. Supervisors’ ratings/scores of job skills were congruent with their perceptions of how students would rate/score all JS items except social media (z = -1.900, p = 0.050). In the PC category, however, there was a distinct lack of congruency across all items except work independently (z = -1.827, p = 0.068; see Table 3); supervisors’ ratings/scores were significantly different from their perceptions of how students would rate/score the items.

The central question was: How similar are respondents’ ratings/scores of job skills and professional characteristics to how they perceive their counterparts will rate/score the items (student self vs. student other; supervisor self vs. supervisor other)? Students displayed high levels of congruency: how they ranked all items in the job skills and professional characteristics categories matched how they perceived their supervisor counterparts would rank the items. On-site supervisors also displayed high levels of congruency in the job skills section; however, in the professional characteristics category, supervisors perceived that students’ selections would differ from their own.

Accuracy. Finally, when one group’s self-reports (student interns or on-site supervisors) are compared to the other group’s projections of how they would respond, a coorientational insight into the level of accuracy between the two groups is obtained. Mann-Whitney U tests calculated accuracy for the student intern and on-site supervisor comparisons respectively. The central question here is: How do respondents’ (self) ratings/scores compare with their counterparts’ perceptions (other) of how they will rate/score the items (student self vs. supervisor other; supervisor self vs. student other)?

Student interns. Regarding JS, the comparison of student interns’ ratings/scores with on-site supervisors’ perceptions of how they would respond was mostly accurate, except for the item overall performance (z = -2.447, p = 0.014). Regarding the PC items listed in the survey, the comparison was accurate for just three items: willingness to learn, attention to details, and time management. Inaccuracy existed in relation to the ratings/scores of nine items: following instructions (z = -2.338, p = 0.019), taking responsibility (z = -2.453, p = 0.014), punctuality (z = -3.320, p = 0.001), cooperation (z = -4.197, p = 0.000), accepting criticism (z = -4.197, p = 0.000), taking on new tasks (z = -3.680, p = 0.000), working independently (z = -3.982, p = 0.000), diversity (z = -5.362, p = 0.000), and ethics (z = -4.801, p = 0.000).

On-site supervisors. Regarding JS items, the comparison of supervisors’ ratings/scores with student interns’ perceptions of how they would respond was mostly accurate. The only exceptions were the items oral communication (z = -2.754, p = 0.006) and overall performance (z = -2.716, p = 0.007). In relation to the PC items, the comparison was accurate across most items; inaccuracy existed for three: willingness to learn (z = -3.103, p = 0.002), time management (z = -2.556, p = 0.011), and punctuality (z = -2.687, p = 0.007).

The central question here is: How do respondents’ (self) ratings/scores compare with their counterparts’ perceptions (other) of how they will rate/score the items? In this study, students provided stronger evidence of coorientational accuracy than their supervisor counterparts. When asked to project themselves as the opposite group, students were better at predicting on-site supervisors’ responses; inaccuracy occurred in only two job skills items (oral communication and overall performance) and three professional characteristics items (willingness to learn, time management and punctuality). Supervisors displayed evidence of accuracy in their predictions of students’ ratings of job skills (except for one item, overall performance); however, they were very poor at predicting their student counterparts’ responses in the majority (nine) of the professional characteristics categories, accurately predicting students’ ratings of only willingness to learn, attention to details and time management.
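Each of the three coorientation comparisons reported above reduces to a two-sided Mann-Whitney U test between two sets of item ratings (self vs. self for agreement, self vs. own projection for congruency, self vs. counterpart’s projection for accuracy). As an illustration only, using hypothetical ratings invented for this sketch (not the study’s data), one such comparison could be computed as follows:

```python
# Illustrative sketch of one coorientation comparison (hypothetical data).
# Agreement:  student self vs. supervisor self
# Congruency: e.g., supervisor self vs. supervisor "other" (projection of students)
# Accuracy:   e.g., student self vs. supervisor "other"
from scipy.stats import mannwhitneyu

# Hypothetical 1-7 ratings of one item ("overall performance") from each group
student_self = [7, 6, 6, 5, 7, 6, 5, 6]     # students' own ratings
supervisor_self = [4, 3, 3, 4, 2, 3, 4, 3]  # supervisors' own ratings

# Two-sided Mann-Whitney U test of agreement (student self vs. supervisor self)
u_stat, p_value = mannwhitneyu(student_self, supervisor_self, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")
```

A p value below .05 in a comparison like this would be read as a significant difference between the two sets of ratings, i.e., a lack of coorientation on that item.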

RQ3: Do respondents perceive that the internship experience improved students’ learning outcomes?

A Mann-Whitney U-test revealed that significant differences did not exist between the two groups regarding perceptions of whether the internship experience improved students’ learning outcomes; both groups reported that the experience resulted in moderate to significant improvement across all 12 recognized college learning outcomes (Cronbach’s α = 0.86).

Students. On a 5-point Likert scale (1 = no improvement, 2 = slight improvement, 3 = moderate improvement, 4 = significant improvement, 5 = not applicable), the majority of student respondents (N = 32) indicated that they improved across all college learning outcome categories while working as an intern (M = 3.43).

Supervisors. On a 5-point Likert scale (1 = no improvement, 2 = slight improvement, 3 = moderate improvement, 4 = significant improvement, 5 = not applicable), the majority of on-site supervisors indicated that students improved across all college learning outcome categories while working as interns (M = 3.49).
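The learning-outcomes results above rest on two simple computations: the mean improvement rating and Cronbach’s alpha for the 12-item scale. The following is a minimal sketch of the alpha calculation, using invented ratings for three hypothetical items (the study’s actual items and data are not reproduced here):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item)."""
    k = len(items)                                    # number of items
    item_vars = [statistics.variance(col) for col in items]
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings (1 = no improvement ... 4 = significant improvement)
# for three invented learning-outcome items, five respondents each
outcome_items = [
    [3, 4, 3, 2, 4],   # e.g., writing
    [3, 3, 4, 2, 4],   # e.g., critical thinking
    [4, 4, 3, 2, 3],   # e.g., teamwork
]
print(cronbach_alpha(outcome_items))
```

By convention, alpha values above roughly .70 are taken to indicate acceptable internal consistency, so the study’s reported α = 0.86 suggests the 12 outcome items hang together well as a scale.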

Responses to open-ended questions

Students and supervisors were asked several open-ended questions about challenges they experienced related to the internship, and suggestions related to curriculum/coursework to make the internship experience more successful.

Students. Student interns indicated that the most significant challenges they faced related to time and work-load management, the unpaid nature of internships, the strong emphasis on writing ability, and adapting to working in a “professional” environment:

[My challenges] were definitely being able to balance the work load [while] still being a full time [sic] student. Being involved on campus, having 3 internships in total, and still trying to make money [with] a part time job. It was tough balancing everything, as all the work from each of these things was incredibly important…at times it was really hard to make [priority] decisions.

[When] the internship is unpaid, it makes it very difficult to make ends meet. This is especially true when having to travel to the job site.

I think one of my biggest challenges was being able to write press releases since I never [wrote] them at a professional level before. I definitely had trouble with certain types of writing such as creating brochures and news releases.

Learning the expectations of my co-workers/supervisor and making sure I always met, and/or exceeded them. This was a challenge at times because I was new to the real world [sic] environment and didn’t know what to expect.

With regard to suggestions to the curriculum/coursework to make the internship experience more successful, most students did not respond to this question. Those who did were very satisfied with their preparation and experience: “I wouldn’t change a thing, it was a great experience. I loved the balance between the classroom and the field experience.” Another student stated: “I can’t imagine it being more successful. I learned so much.”

Some student suggestions included: “[Adding] a writing refresher workshop prior to beginning [the] internship would be beneficial,” and “Taking a business management class may have really helped too.” Additionally:

Possibly a class with reminders on basic guidelines on how to write press releases and other basic PR writing tools. I found myself looking at past assignments from previous years for help, my writing was not always as strong as I wanted it to be.

When asked what the internship taught them about their major/discipline, students indicated that they learned more about the scope of public relations: “It taught me valuable writing skills and how to tailor wording to meet the needs of specific audiences. I think I improved my listening skills as well.”

Two other students responded:

I learned that there are many different facets to public relations, and problems are always going to occur. Working for a non-profit was challenging, but there were also many benefits. I now know that it requires passion and a dedication not required in most regular office jobs.

I definitely learned how to communicate in a more professional setting, i.e. through emails, phone calls, person-to-person, etc. This experience opened my eyes to not only the inner workings of a real world business, but also to new workplace skills that I will definitely use in the future.

Supervisors. With regard to challenges, for supervisors it seems that the biggest issue related to the limited time interns worked on-site: “[The] only challenge was that she only worked two days a week and [I] felt bad trying to reach her to follow up on items during days when she wasn’t working.” Another supervisor stated:

I couldn’t be happier with the experience I’ve had with my intern. All of her work has been of the highest quality and she never hesitates to take on new tasks and responsibilities. She consistently surpasses expectations and brings great insight and value to my department. The only challenge I may have encountered was keeping her busy because she was so efficient!

Regarding suggestions, supervisors indicated that perhaps more interaction with academic advisors would be helpful:

More correspondence from the advisers is always helpful – I like having a weekly, bi-weekly, or monthly check-in with the college staff to ensure the student, adviser, and internship supervisor are all on the same page.

I felt like my intern had a very strong grasp of communication principles, specifically in regards to public relations and social media. Her coursework absolutely prepared her for work in those fields. Communications work can often come with broad job descriptions and require the communicator to wear many ‘hats’ [sic]. It seems to me that my intern had a strong academic foundation that would be an asset in adapting to this kind of situation.

Finally, additional comments offered by supervisors were complimentary of interns:

Our experience so far has been awesome. We currently have two different interns here for different reasons and they are both very motivated, intelligent and helpful. They are a great addition to our organization.

I’d just like to compliment the faculty on offering a generation of new communicators such a high level of preparation for an industry that changes daily with the advent of new technologies and vehicles for messaging. I’m excited to see what these future professionals will bring to the table!

Discussion
Discipline-specific skills that supervisors consider most necessary for public relations interns include strategic writing; oral and organizational skills; research skills; problem solving and negotiation; and informative and persuasive writing (Brown & Fall, 2005; CPRE, 1999). Meng (2013) and Sapp and Zhang (2009) found that practitioners rank strategic decision-making capability, problem solving, and communication knowledge and expertise highest, while public relations students rate the ability to solve problems and produce desired results, writing skills, oral communication skills, and time management skills among the lowest. The results of this study indicate that students mostly agreed with on-site supervisors (and vice versa) in their ratings of job skills and professional characteristics. Students placed high ratings on quality of work, overall performance, writing skills, and job task responsibility; oral communication, knowledge of social media, computer skills and research skills were rated lower. On-site supervisors’ top-rated job skills were quality of work, writing skills, oral communication and job task responsibility; lower-rated items were research skills, overall performance, knowledge of social media and computer skills.

While there are many benefits related to the internship experience, disparities do exist between how students and supervisors perceive the importance of job skills and professional characteristics, which can lead to missed opportunities for all (Meng, 2013; Sapp & Zhang, 2009; Todd, 2014). This survey indicates that, regarding job skills, student interns and on-site supervisors are cooriented to one another across all three coorientation variables (agreement, congruency and accuracy). Regarding professional characteristics items, both groups were also cooriented to one another on the agreement variable (student self vs. supervisor self); however, significant differences exist among on-site supervisors regarding the congruency variable (supervisor self vs. supervisor other), and for students regarding the accuracy variable (student self vs. supervisor other). This finding is potentially more problematic for student interns than for on-site supervisors because, according to Kim (1986), of the three measurements, accuracy is the most important; it must take place before true understanding can occur. Misperceptions and misunderstanding have the potential to result in missed opportunities for collaboration and integration, and/or a self-fulfilling prophecy in which a lack of coorientation between students and supervisors damages the possibility of a cooperative relationship with current and future student interns, and with the academic programs that provide access to students.

With regard to college learning outcomes, the literature indicates that employers believe that engaging students in internships improves college-learning outcomes, makes students better prepared for career success, and is potentially a high-impact learning experience that deepens learning (Hart, 2016; O’Neill, 2010). In this study, the majority of students perceived that improvement was “significant,” while supervisors perceived improvement as “moderate.” These findings differ from several reports indicating that public relations graduates are not meeting entry-level outcome competencies (CPRE, 1997; CPRE, 2006; Neff, Walker, Smith, & Creedon, 1999). The high-impact focus of the internship experience in which respondents of this study participated may have deepened perceptions of learning and successful outcomes for students.

In the open-ended portion of the survey, students stated that they valued the real-world nature of the experience and learned a lot about the scope of public relations; challenges mostly related to time and work-load management, the unpaid nature of the experience, and the strong emphasis on writing ability. Supervisors identified the limited time interns worked on site as a key challenge, but for the most part they reported being very satisfied with their interns.

The findings of this study suggest that both groups were cooriented to one another in relation to perceptions of the job skills associated with the internship experience; however, in the professional characteristics category, supervisors indicated lower levels of congruency (supervisor self vs. supervisor other), which means accuracy and overall coorientation between the two groups are low. Blindly assuming that all parties share a common understanding of goals, outcomes, tasks and responsibilities can lead to missed opportunities for collaboration and integration, and/or damage the possibility of a cooperative relationship with current and future student interns, and with the academic programs that provide access to students.

Suggestions to overcome discrepancies

  1. Faculty supervisors should clearly communicate to all parties (not just students) what practical expectations, roles, and responsibilities are associated with the experience. This can be achieved by encouraging collaboration between student and supervisor (prior to the start of the internship) in the learning goals and outcomes identification process.
  2. Details related to projects and deadlines, expectations regarding the degree of autonomy/independence versus teamwork/direction could also be established. This could be achieved by collaborating in the creation of a “contract” document in the opening days/weeks of the internship.

In addition to collaboration related to expectations, the provision of rich feedback to the student from both the faculty and on-site supervisors can benefit all parties and is a hallmark of high-impact internships. This feedback can address practical day-to-day tasks and responsibilities, but can also engage students and their supervisors in reflective conversations about the interns’ career goals and opportunities to reflect on the people they are becoming.

Scaffolding relevant prior learning (Introduction to Public Relations and Public Relations Writing classes as prerequisites) and encouraging reflection on challenges/opportunities can take the form of journals—shared with faculty and on-site supervisors—that hone writing skills and prompt students to engage in critical thinking related to the experience; it can also provide an opportunity to coorient more accurately with one another.

To conclude, the two groups in this study have far more in common than not. Perfect communication may not necessarily improve accuracy between these two groups, but if both are motivated to coorient, it can facilitate understanding. For the public relations educator and student intern, the goal of communication must be to improve accuracy, even if the parties agree to disagree or choose not to coorient to the same things to the same degree. As such, greater dialogue about the finding that students are more cooriented to supervisors regarding the importance of job skills and professional characteristics than supervisors suspected will ultimately lead to greater understanding and opportunities for all parties involved.

Limitations and Future Study

Although the survey was sent to over 135 strategic communication/public relations concentration students, the response rate and subsequent sample size were small. The convenience nature of the supervisor sample (determined by student interns providing their supervisors’ contact information) is also a limitation, and while the response rate was relatively high, the researcher acknowledges that external validity for the study is low; the results may not be generalizable to the larger population of student interns and on-site supervisors. Another limitation is that the majority of students who participated in the study worked at the internship site fewer than 10 hours; their experiences would likely differ from those of students whose internships require significantly more hours. Despite these limitations, the results provide valuable exploratory insight into how respondents rate job skills and professional characteristics, the level of coorientation that exists between them, and the extent to which they believe the internship experience improves a variety of college learning outcomes.

Future research could expand this study by incorporating qualitative elements and by increasing the representativeness and generalizability of the study through a larger sample size (including other universities). The researcher intends to incorporate a longitudinal approach, continuing to gather and analyze information from student interns and their supervisors and exploring the implications of their orientations on the quality of the experience for both parties.

References
Accrediting Council on Education in Journalism and Mass Communications (ACEJMC). (2013). ACEJMC accrediting standards. Retrieved from https://www2.ku.edu/~acejmc/PROGRAM/STANDARDS.SHTML

Anson, C. M., & Forsberg, L. L. (1990). Moving beyond the academic community: Transitional stages in professional writing. Written Communication, 7(2), 200-231.

Avery, E., Lariscy, R., & Sweetser, K. D. (2010). Social media and shared—or divergent—uses? A coorientation analysis of public relations practitioners and journalists. International Journal of Strategic Communication, 4(3), 189-205.

Basow, R. R., & Byrne, M. V. (1993). Internship expectations and learning goals. Journalism Educator, 47(4), 48-54.

Baxter, B. L. (1993). Public Relations Education: Challenges and Opportunities: Public Policy Committee of the Public Relations Society of America. New York.

Beard, F., & Morton, L. (1999). Effects of internship predictors on successful field experience. Journalism & Mass Communication Educator, 53(4), 42.

Beebe, A., Blaylock, A., & Sweetser, K. D. (2009). Job satisfaction in public relations internships. Public Relations Review, 35(2), 156-158.

Brindley, C., & Ritchie, B. (2000). Undergraduates and small and medium-sized enterprises: opportunities for a symbiotic partnership? Education+ Training, 42(9), 509-517.

Bringle, R. G., & Hatcher, J. A. (2002). Campus–community partnerships: The terms of engagement. Journal of Social Issues, 58(3), 503-516.

Broom, G.M. (1977). Coorientational measurement of public issues. Public Relations Review, 3(4), 110-118.

Brown, A., & Fall, L. T. (2005). Using the port of entry report as a benchmark: Survey results of on-the-job training among public relations internship site managers. Public Relations Review, 31(2), 301-304.

Callanan, G., & Benzing, C. (2004). Assessing the role of internships in the career-oriented employment of graduating college students. Education+ Training, 46(2), 82-89.

Cantor, J. A. (1997). Experiential learning in higher education: Linking classroom and community. ERIC Digest. Retrieved from http://files.eric.ed.gov/fulltext/ED404949.pdf

Chaffee, S.H., & McLeod, J.M. (1968). Sensitization in panel design: A coorientational experiment. Journalism Quarterly, 45, 661-669.

Ciofalo, A. (1989). Legitimacy of internships for academic credit remains controversial. Journalism Educator, 43(4), 25-31.

Coco, M. (2000). Internships: A try before you buy arrangement. SAM Advanced Management Journal, 65(2), 41-43.

CPRE (1999). Public relations education for the 21st century: A port of entry. Retrieved from http://www.prsa.org/_Resources/resources/pre21.asp?ident=rsrc6

CPRE (2006). Public relations education for the 21st century: The professional bond. Retrieved April 27 from http://www.commpred.org/theprofessionalbond/index.php

Daugherty, E. L. (2011). The public relations internship experience: A comparison of student and site supervisor perspectives. Public Relations Review, 37(5), 470-477.

Fall, L. (2006). Value of engagement: Factors influencing how students perceive their community contribution to public relations internships. Public Relations Review, 32(4), 407-415.

Gault, J., Redington, J., & Schlager, T. (2000). Undergraduate business internships and career success: Are they related? Journal of Marketing Education, 22(1), 45-53.

Gibson, D. C. (2001). Communication faculty internships. Public Relations Review, 27(1), 103-117.

Grunig, J.E. (1972). Communication in community decisions on the problems of the poor. Journal of Communication, 22, 5-25.

Gupta, P., Burns, D., & Schifer, J. (2010). An exploration of student satisfaction with internship experiences in marketing. Business Education & Administration, 2(1), 27-37.

Hart Research Associates. (2016). Falling short? College learning and career success. NACTA Journal, 60(1a).

Heider, F. (1958). The Psychology of Interpersonal Relations. New York: John Wiley and Sons.

Horowitz, E. M. (1997, August). Does money still buy happiness: Effects of journalism internships on job satisfaction. Paper presented at the meeting of the Association for Education in Journalism and Mass Communication, Chicago, Il.

Johnson, D. J. (1989). The Coorientation Model and Consultant Roles. In: Botan, C.H. and Hazleton, Jr. (Eds.). Public Relations Theory (pp. 243-263). New Jersey: Lawrence Erlbaum Associates.

Kim, H. S. (1986). Coorientation and communication. In B. Dervin and M. I. Voight (Eds.), Progress in Communication Sciences (pp. 31-54). Norwood, NJ: Ablex.

Knouse, S. B., Tanner, J. R., & Harris, E. W. (1999). The relation of college internships, college performance, and subsequent job opportunity. Journal of Employment Counseling, 36(1), 35-43.

Knouse, S. B., & Fontenot, G. (2008). Benefits of the business college internship: A research review. Journal of Employment Counseling, 45(2), 61-66. doi: 10.1002/j.2161-1920.2008.tb00045.x

Kopenhaver, L.L., Martinson, D.L., & Ryan, R. (1984). How public relations practitioners and editors in Florida view each other. Journalism Quarterly, 61(4), 860-865,884.

Kuh, G. D. (2008). Excerpt from High-Impact Educational Practices: What They Are, Who Has Access to Them, and Why They Matter. Assoc. of Am. Colleges and Univ., Washington, DC.

Lubbers, C., Bourland-Davis, P., & Rawlins, B. (2008). Public relations interns and ethical issues at work: Perceptions of student interns from three different universities. PRism 5(1&2). Retrieved from http://praxis.massey.ac.nz/prism_on-line_journ.html

Lubbers, C. A., Bourland-Davis, P. G., & DeSanto, B. (2012). An exploration of public relations internship site supervisors’ practices. In M. A. Goralksi and H. P. LeBlanc (Eds.), Business Research Yearbook, (511-518). International Academy of Business Disciplines and International Graphics, Beltsville, MD.

Maertz, C. P., Jr., Stoeberl, P. A., & Marks, J. (2014). Building successful internships: Lessons from the research for interns, schools, and employers. Career Development International, 19(1), 123-142.

McCarthy, P., & McCarthy, H. (2006). When case studies are not enough: Integrating experiential learning into business curricula. Journal of Education for Business, 81(4), 201-204.

McLeod, J., & Chaffee, S. (1973). Interpersonal approaches to communication research. American Behavioral Scientist, 16(4), 469-499.

Mendel‐Reyes, M. (1998). A pedagogy for citizenship: Service learning and democratic education. New Directions for Teaching and Learning, 1998(73), 31-38.

Meng, J. (2013). Learning by leading: Integrating leadership in public relations education for an enhanced value. Public Relations Review, 39(5), 609-611.

Mihail, D. (2006). Internships at Greek universities: An exploratory study. Journal of Workplace Learning, 18(1), 28-41.

National Association of Colleges and Employers (NACE). (2016). 2016 Internship & Co-op Survey. Retrieved from http://www.naceweb.org/intern-co-op-survey/

Neff, B., Walker, G., Smith, M., & Creedon, P. (1999). Outcomes desired by practitioners and academics. Public Relations Review, 25(1), 29-44.

Newcomb, T.M. (1953). The approach to the study of communication acts. Psychological Review, 60, 393-404.

O’Neill, N. (2010). Internships as a high-impact practice: Some reflections on quality. Peer Review, 12(4), 4-8.

Redeker, L. (1992). Internships provide invaluable job preparation. Public Relations Journal, 22(3), 20.

Sapp, D., & Zhang, Q. (2009). Trends in industry supervisors’ feedback on business communication internships. Business Communication Quarterly, 72(3).

Smith-Barrow, D. (2016). 10 colleges where almost everyone gets internships. US News and World Report. Retrieved from http://www.usnews.com/education/best-colleges/the-short-list-college/articles/2016-03-08/10-colleges-where-most-students-get-internships

Soska, T., Sullivan-Cosetti, M., & Pasupuleti, S. (2010). Service learning: Community engagement and partnership for integrating teaching, research, and service. Journal of Community Practice, 18(2-3), 139-147.

Taylor, S. (1988). Effects of college internships on individual participants. Journal of Applied Psychology, 73(3), 393.

Thiel, G., & Hartley, N. (1997). Cooperative education: A natural synergy between business and academia. SAM Advanced Management Journal, 62(3), 19.

Todd, V. (2014). Public relations supervisors and Millennial entry-level practitioners rate entry-level job skills and professional characteristics. Public Relations Review, 40(5), 789-797.

Tovey, J. (2001). Building connections between industry and university: Implementing an internship program at a regional university. Technical Communication Quarterly, 10(2), 225-239.

Verčič, D., & Verčič, A. T. (2007). A use of second-order co-orientation model in international public relations. Public Relations Review, 33(4), 407-414.

Waters, R. (2009). Comparing the two sides of the nonprofit organization–donor relationship: Applying coorientation methodology to relationship management. Public Relations Review, 35(2), 144-146.

Watson, B. (1995). The intern turnaround. Management Review, 84(6), 9-13.

Westerberg, C., & Wickersham, C. (2011). Internships have value, whether or not students are paid. The Chronicle of Higher Education. Retrieved from http://www.chronicle.com/article/Internships-Have-Value/127231/

© Copyright 2017 AEJMC Public Relations Division

Educating students for the social, digital and information world: Teaching public relations infographic design



Diana C. Sisson, Ph.D., Auburn University
Tara M. Mortensen, Ph.D., University of South Carolina


This study employs an exploratory content analysis of current public relations information graphics to examine variables within two concepts pertaining to public relations: transparency and clarity. These concepts were chosen because they apply to traditional public relations practice and are also widely taught among contemporary infographic design experts. The subjects of the study are nonprofit organizations’ online information graphics (N = 376) released on Twitter. Findings suggest that nonprofit organizations are not applying traditional public relations principles to the design of online information graphics, demonstrating difficulty in translating these principles to visual design, a skill that is becoming more important. While the study is not intended to generalize, this snapshot of current practice is used to offer improvements in preparing public relations students for communication with information visualizations. This exploration illuminates the need for public relations education geared toward the social, visual, and data-driven environment. To this end, the study uses these findings to develop an initial set of practices for infographic design that can be implemented in current public relations education.

Keywords: infographics, public relations, visual communication, nonprofit organizations, public relations education, visual literacy


Educating students for the social, digital and information world:  Teaching public relations infographic design

Social media have transformed public relations education, forcing students to apply traditional public relations principles, such as transparency and clarity, to new forms of communication such as infographics. Infographics are design pieces that may include “data visualizations, illustrations, text, and images together into a format that tells a complete story” (Krum, 2013, p. 6). In a contemporary mediascape that caters to short attention spans, infographics have become a hugely popular form of communication. Public relations firms are using the medium to build awareness of products and brands, provide information to shareholders, and increase the value of a brand or cause (Krum, 2013, p. 88). Effectively creating infographics requires an understanding of visual communication principles and, for niche industries such as public relations, requires translating legacy principles to new forms of communication. Data visualizations are compelling to audiences and “present the illusion of trustworthiness due to their visual nature and presentation of statistical information” (Toth, 2013, p. 449). Thus, understanding how to correctly present data in visual form is imperative.

The researchers found no prior research on how public relations professionals are applying traditional principles to the design of information graphics, nor on how students can better prepare to work in a modern media environment. Given the popularity of infographic use among nonprofit organizations in an online environment, these are significant gaps. This exploratory study examines public relations graphics released via Twitter to identify the manner in which the principles of transparency and clarity are being applied, and ultimately to offer an initial list of suggestions for public relations educators.

The following sections review the literature, from which opportunities for further study arise and research questions are proposed. Visuals as a form of communication in the contemporary visual-social mediascape are introduced first, concentrating on infographics, along with a discussion of the importance and communicative power of visuals. The variables within the concepts of transparency and clarity are then laid out as they pertain to public relations and visual communication, specifically infographics.


An Onion article jokes that people “shudder” at large blocks of uninterrupted text, requiring a colorful photo, an illustration, or a chart to comprehend the information (“Nation shudders,” 2010). Satire aside, contemporary news consumers are indeed skimmers, primarily reading exciting words and facts, as well as headlines and visuals (Nielsen, 2011; Rosenwald, 2014). This trend has contributed to a massive increase in the use of infographics to spread information, as well as a need for educators to teach new tools.

Between 2013 and 2015, Google searches for infographics increased 800% (Meacham, 2015). Infographics are intended to tell a story primarily in pictures, minimizing the number of words and maximizing visual impact (Meacham, 2015). The production of data and its graphic representation were once specialized trades but are now accessible to nearly everyone (Yaffa, 2011). Infographics harness the power of visuals to grab readers’ attention, reduce the amount of time it takes to understand data, provide context by showing comparisons, and make messages more emotional, memorable and accessible (Kimball & Hawkins, 2008; Kostelnick & Roberts, 2010; Schafer, 1995; Tufte, 2003).

In an age of “fake news” and audience mistrust of traditional media sources, understanding how to communicate truthfully in multiple forms is particularly important for students (Rutenberg, 2016). The 2016 presidential elections brought the term “fake news” into mainstream awareness, raising widespread knowledge of the viral spread of untruthful information via social-networking sites (Wingfield, Isaac, & Benner, 2016). Twitter and Facebook have been urged to take their part of the responsibility in this spread, and tomorrow’s communicators, too, must be prepared to understand, identify, and create truthful and clear visual-statistical messages. Members of the media, following Kellyanne Conway, have used the term “alternative facts” to describe a problematic trend of a growing perception of multiple truths, which affects the credibility of politicians, corporations and the media (Rutenberg, 2017, para. 7). Data design has special considerations in this regard (Kienzler, 1997; Rosenquist, 2012; Stallworth, 2008; Tufte, 2001). Visual content creators can easily, even accidentally, mislead their audience because visuals carry more importance and emotional impact than text (Kienzler, 1997). Infographics can also distort data or make it opaque, even unintentionally, in the effort to gain viewers (McArdle, 2011). As Toth (2013) noted, infographics represent an extension of fundamental issues, including “presenting information clearly and succinctly, targeting audiences, defining clear purposes, developing ethos, understanding document design principles, using persuasion techniques effectively, branding, and conducting and summarizing research” (p. 451).

Public Relations Education and Visual Communication

Educational materials for creating and disseminating infographics have only recently been developed and are not widely adopted within the various streams of communication education. Experts on infographics contend that there are thousands of poorly-constructed infographics online, but “the good designs rise to the top and are the designs that most often go viral in social networks” (Krum, 2013, p. 271). The challenge is melding the principles of various fields, including public relations, with the principles of infographic design and visual communication.

Researchers and professionals have noted the increased need for education in infographics in public relations due to employers’ demand for such skills and increased usage in the field (Gallicano, Ekachai, & Freberg, 2014). Advocates of visual literacy have long held that visual education, including knowledge of how to create visuals, is the missing piece of contemporary education (Metros, 2008; Sosa, 2009). Visuals have a powerful impact on audiences in ways that text does not. Visuals grab readers’ attention (Boerman, Smith, & van Meurs, 2011) and stick in the memory longer than other forms of communications (Graber, 1990). Krum (2013) referred to this as the “picture superiority effect” (p. 20). Further, images are subject to less scrutiny than other forms of communication (Messaris, 1994, p. x). In other words, viewers of images tend to believe what they see (Newton, 2013; Wheeler, 2001), and this is especially the case with visualized data (Cairo, 2012; Krum, 2013). While modern college students are consumers and producers of highly visual content on the web, they lack the skills to effectively communicate visually (Metros, 2008). Visual intelligence influences perceptions and interpretations of visual materials (Moriarty, 1996). Schools are encouraged to introduce concepts of visual literacy to understand, analyze, interpret and create effective visual information (Burns, 2006).

The Accrediting Council on Education in Journalism and Mass Communications (ACEJMC) suggests, broadly, that all programs should teach students to apply the appropriate tools and technologies for the communication professions in which they work. Teaching students visual communication skills has become more important given the digital landscape and shorter attention spans (Lester, 2015). This need is particularly pertinent to public relations students and infographics. The Commission on Public Relations Education met in 2015 to discuss undergraduate public relations education, noting a need for better verbal as well as graphic communications (p. 8). Kent (2013), in his suggestions for using social media in public relations, states that publics are better served by thoughtful, thorough, and relevant information including high-quality infographics that contain complete information, rather than “eye-candy” (p. 343). Richard Edelman (2012) told public relations educators that, “There is a huge place for deeper, more informative visuals . . . which infographics – visual representations of information, data or knowledge – provide” (p. 4).

The following sections of this paper review two principles of public relations, and within each principle, rules of effective infographic design are applied. Transparency and clarity were examined because: 1) organizational transparency is necessary to provide coherence, visibility, and clarity (Albu & Wehmeier, 2014); and, 2) clarity assures that information communicated is easily understood by various publics and does not contain jargon (Rawlins, 2009). Further, concepts of transparency and clarity each encompass variables that theoretically and practically derive from and can be applied to visual communications, specifically infographics. The researchers were interested in studying the junction of these two fields and extracting implications for students who will be working within this increasingly-popular, professional niche.


Transparency

Public relations students are taught to be transparent, but may not know how this principle applies to infographic design. According to Rawlins (2009):

Transparency is the deliberate attempt to make available all legally releasable information—whether positive or negative in nature—in a manner that is accurate, timely, balanced, and unequivocal, for the purpose of enhancing the reasoning ability of publics and holding organizations accountable for their actions, policies, and practices. (p. 75)

Plaisance (2007) argued while transparency “is not always a sufficient condition for more ethical behavior, its absence is a prerequisite for deception” (p. 193). Transparency has been studied from conceptual (Rawlins, 2006, 2009), journalistic (Plaisance, 2007), and social media campaign (Burns, 2008; DiStaso & Bortree, 2012) perspectives.

Rawlins (2006) argued transparency comprises three components: participation, substantial information, and accountability. Drawing on previous transparency literature, as well as on the Global Reporting Index (GRI) Guidelines and other guidelines promoting transparent communication, Rawlins (2006) found substantial information was the “strongest predictor among transparency components” (p. 433). From this perspective, Rawlins (2009) noted disclosure is about providing information, but can be used to distort perspectives, rather than provide clarity.

Transparency has been studied from a social media campaign perspective (Burns, 2008) and from a dialogic perspective with particular focus on mutual understanding (Albu & Wehmeier, 2014). Using content analysis, Burns (2008) examined the Wal-Mart and Edelman “Wal-Marting Across America” blog crisis to argue that a lack of transparency in blogging leads to harsh criticism despite classic crisis response strategies such as apology. DiStaso and Bortree (2012) echoed similar sentiments about transparency through their evaluation of award-winning campaigns. DiStaso and Bortree (2012) found that many of the campaigns reflected transparency in that they “provid[ed] information that is useful for others to make informed decisions” (p. 513). Transparency in social media tactics kept organizations accountable to their publics (DiStaso & Bortree, 2012). Albu and Wehmeier (2014) argued that transparency and dialogue were “interconnected,” which was often overlooked in the literature (p. 129). Echoing Rawlins (2009), they posited that disclosure alone was insufficient for publics’ understanding; rather, true understanding was based in the coherence, clarity, and visibility of information (Albu & Wehmeier, 2014). In communicating transparently to foster mutual understanding, Albu and Wehmeier (2014) argued accountability, credibility, and loyalty of stakeholders may be heightened.

Transparency and visual communications. While transparency is a vital principle for public relations professionals to abide by, contemporary public relations educational materials fall short of teaching the application of transparency to infographic design. By the same token, textbooks specific to visual communication explain the importance of transparency in infographic design, but do little to translate these principles to public relations (e.g., Knaflic, 2015; Krum, 2013; Smiciklas, 2012). Transparency with data is, in fact, of utmost importance in the creation of infographics. Viewers tend to see visualized data as both important and scientifically true, placing increased pressure on infographic designers to be transparent about the data. To be transparent, the infographic needs to “address the sources of the data included in the design in an open and honest manner” (Krum, 2013, p. 295). Sharing where the data came from, the age of the data, and the credibility of the data source can help establish the believability of the data. Further, copyright law requires that the designer of the infographic and any contributing illustrators and photographers be given credit (Lester, 2015; Walter & Gioglio, 2014).

Still, a massive portion of information graphics appearing online list no data source, provide only vague data sources, or cite questionable data sources, including personal blogs and websites. Krum (2013) suggested infographic designers should track down and cite the original source of data, list the source, and list a specific URL to the exact report or dataset that was used, as well as include the date of the data. Once an infographic is released online, its whereabouts become unpredictable; indeed, a purpose of infographic design is to “go viral.” Therefore, in addition to source information, the bottom of an infographic should include the name of the company that originally released it and a landing-page URL that sends the viewer to the original source of the infographic.

Transparency measures. The Global Reporting Index offers guidelines for promoting transparent communication (Rawlins, 2009). The GRI indicated clarity, relevance, timeliness, neutrality, sustainability context, and comparability were important components in transparent communication (Rawlins, 2009).

Transparent communication should aid with decision-making by providing relevant information to members of key publics (Global Reporting Index, as cited in Rawlins, 2009). Transparent communication should be timely. The Global Reporting Index defined timeliness as providing “information within a time frame that makes the information usable” (as cited in Rawlins, 2009, p. 82). Transparent communication should be neutral in order to avoid perceptions of deception. The GRI defined neutrality as “avoid[ing] bias and striv[ing] for a balanced account of the company’s performance” (as cited in Rawlins, 2009, p. 81). While transparent communication should be neutral and timely, it should also provide a sustainability context to information. The Global Reporting Index defined sustainability context as “identify[ing] how organizational behavior is contributing to effects on the environment, economy, and/or social welfare” (as cited in Rawlins, 2009, p. 80). Furthermore, transparent communication should be comparable. The GRI defined comparability as “easily compar[ing] to both earlier performance of the company and to other similar organizations” (as cited in Rawlins, 2009, p. 81).


Clarity

Public relations students are taught about presenting information clearly, but infographic design has special implications for this principle, which may be less understood. As delineated by the Global Reporting Index guidelines, information is clear, or has clarity, when the information communicated is easily understood by various publics and does not contain jargon (as cited in Rawlins, 2009). Furthermore, the GRI indicated that clarity enhances understanding of information (as cited in Rawlins, 2009). Jargon, or highly technical and industry-specific words or acronyms, hinders understanding of organizational communication by members of key publics. Marken (1996) contended that public relations professionals have a responsibility to communicate on behalf of their organizations in a clear and concise manner, and public relations students are taught to present information clearly.

Clarity and infographic design. When creating infographics, several principles of design promote clarity. A primary purpose of creating infographics is to provide clarity to disorganized and difficult-to-understand data or ideas (Cairo, 2012). A well-designed infographic should present information in a way that readers can see, read, and explore information which would be too difficult to digest in its raw data form (Cairo, 2012). As Krum (2013) said, “Nobody wants to read a text article that has been converted into a JPG image file and then called an infographic” (p. 291), and further stresses: “Using big fonts in an infographic to make the numbers stand out is not data visualization . . . . Displaying the number in a large font doesn’t make it any easier for the audience to understand” (p. 219). Therefore, the visualization of data in order to increase comprehension of information is essential.

Charts (pie, line, bar), graphs, illustrations, maps, and diagrams, when used correctly, help make complex information clearer and more understandable (Cairo, 2012). Additionally, Edward Tufte, considered by many the father of data visualization, is described by Yaffa (2011) as stating “the first grand principle of analytical design: above all else, always show comparisons” (para. 12). Doing so allows clear data presentation and interpretation for viewers. According to Yaffa (2011), Tufte believes, “there is no such thing as information overload . . . . Only bad design,” which impedes rather than enhances clarity (para. 36). In addition to choosing the proper visualization method for the given data, clarity is increased when viewers do not have to look back and forth to discern the meaning of the visualizations or colors. This is why pioneer infographic designer Scott Farrand advised designers to “avoid legends like the plague” (personal communication, March 23, 2016), and Randy Krum (2013) called legends “evil” (p. 293). Tufte (1983) coined the term “chart junk” (p. 67) to refer to anything that gets in the way of a viewer interpreting the data.

Research Questions

Given the popularity of infographics use by nonprofit organizations and the call from the Commission on Public Relations Education (2015) and other scholars, this area should be examined, and improvements should be offered for the next generation of public relations practitioners. For this reason, the following research questions are offered:

RQ1: To what degree are nonprofit organizations’ information graphics transparent?

RQ2: To what degree do nonprofit organizations present the information in graphics clearly?


Method

A content analysis was conducted to systematically and quantitatively evaluate transparency and clarity strategies in nonprofits’ online information graphics (Stempel, 2003). Content analysis allowed conclusions to be drawn from the observations that emerged from analysis of the data (Stempel, 2003).


Sample

Information graphics (N = 376) released by 18 nonprofit organizations on Twitter were analyzed. The researchers defined an infographic for this study as a graphic that contains information. This graphical information did not necessarily need to be quantitative, but could also be words, facts, or illustrations. None of the infographics were “clickable” or led to other pages. Note that the definition is broad. While Fernando (2012) defines an infographic as “a form of storytelling that people can use to visualize data in a way that illustrates knowledge, experiences, or events” (p. 2), a wider definition is adopted for the present study in order to accommodate those infographics that fall outside the expert definition. Infographics distributed through Twitter were selected for this study because 21% of American adults use the social media platform for their news consumption (Greenwood, Perrin, & Duggan, 2016). As this is an exploratory study, only one social-networking website was used. Future studies should examine the transparency and clarity of nonprofit organizations on other social networks, such as Facebook.

Nonprofit organizations were selected for analysis based on a sampling frame of Top Nonprofits.com’s Top 100 Nonprofits on the Web list. The sampling frame was selected for its reliance on “publicly available web, social, and fiscal responsibility metrics” (Top 100 Nonprofits on the Web, n.d., para. 2), as well as for its rankings methodology of nonprofits online. Each of the chosen nonprofit organizations’ Twitter feeds was accessed to gather infographics. Data collection occurred from November 1, 2015 to November 30, 2015. All non-animated, non-clickable infographics collected were released between May 2015 and November 2015, a window of up to six months prior to collection, in order to gather a substantive sample. This time frame allowed the researchers to examine a snapshot of nonprofit organizations’ infographic use and design practices prior to December and January, which are traditionally peak fundraising periods. Duplicates were excluded.

Nonprofit organizations found in the Top Nonprofits.com’s Top 100 Nonprofits on the Web list were divided into “more than 10” and “fewer than 10” infographics categories. The rationale for this categorization was to ensure that the researchers were not pulling infographics from nonprofit organizations that used this form of visual communication infrequently; the categorization was intended to ensure representativeness of infographic use and frequency. The researchers collected infographics from the Top 100 Nonprofits on the Web list using this categorization until an adequate sample size was met. The sample was not random, as generalizing to the broader social media sphere was not the purpose of the paper. Rather, the purpose of the examination was to gather a snapshot of contemporary public relations infographics and offer suggestions for improvement in education.

Nonprofit organizations analyzed in this study and listed in Top 100 Nonprofits on the Web include: Human Rights Campaign (15.2%, n = 57), UNICEF (15%, n = 55), Save the Children (8.2%, n = 31), ACLU (8%, n = 29), Conservation International (7%, n = 25), International Rescue Committee (7%, n = 25), Wounded Warrior Project (6.4%, n = 24), Amnesty International (6.1%, n = 23), Teach for America (5.1%, n = 19), Feeding America (5%, n = 18), Susan G. Komen (5%, n = 18), March of Dimes (5%, n = 17), Rotary International (4%, n = 13), ASPCA (3%, n = 10), Livestrong (1.1%, n = 4), Samaritan’s Purse (1%, n = 3), Ronald McDonald House (1%, n = 3), and Kiva (1%, n = 2).

 Coding and Variables

Variables for measuring transparency and clarity were gleaned from the academic and professional literature on public relations and infographic design, as described in the literature review. Each concept contained variables pertaining to the intersection of infographic design and public relations. Transparency was the larger of the two concepts, and was measured using the variables and variable definitions found in Table 1.

Table 1
Transparency variables and definitions
Data attribution: Whether or not the data was attributed at all.
Data availability: Whether the original data itself is available to viewers: a link on the infographic, a link on the landing page, a spreadsheet on the landing page, or the data source not available at all. Each was coded as yes or no.
Data quality: Whether the data source is vague, questionable, reliable, or not identified. Vague data sources are those that only contain the name of the host site that publishes the data without any additional information about a specific report or article. Questionable data sources are those that are Wikipedia, blogs, or personal sites, and unclear sites are those where the source is not clearly identified. Each was coded as yes or no.
Data date: Whether the date of the data was provided. Coded as yes or no.
Designer credit: Whether credit was given to the individual that designed the infographic. Coded as yes or no.
Photographer or graphic credit: Whether credit was given to the individual(s) who created any graphical elements or photographs used in the infographic. Coded as yes or no.
Landing page: Whether a URL was provided that directs the user back to the original web location of the infographic. Coded as yes or no.
Relevance: Whether the infographic contains information specific to the organization. Coded as yes or no.
Sustainability context: Whether the infographic identifies how organizational behavior is contributing to effects on the environment, economy, and/or social welfare. Coded as yes or no.
Neutrality: Whether the infographic contains information from organizations other than itself. Coded as yes or no.
Comparability: Whether the infographic compares its performance to itself or to other/similar organizations. Coded as yes or no.
Timeliness: Whether the infographic contains information in a timeframe usable to stakeholders. Coded as yes or no.

Clarity contained three variables: two derived from the infographic literature and one derived from public relations literature, which can be found in Table 2. They were infographic type, presence of a legend, and presence of industry jargon.

Table 2
Clarity variables and definitions
Infographic type: Timeline, pie chart, line graph, how-to diagram, bar graph, bubble chart, flow chart, list, numbers only, words/facts only, or other. Each type was coded as present or not present.
Legend: Whether or not the infographic contained a legend. Coded as yes or no.
Jargon: Whether or not the infographic contained jargon. Coded as yes or no.
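To make the coding scheme concrete, the variables in Tables 1 and 2 can be represented as one record per infographic. The sketch below is purely illustrative; the field names and values are hypothetical and are not drawn from the study's data:

```python
# Hypothetical coding record for a single infographic, following the
# transparency (Table 1) and clarity (Table 2) variables. All names and
# values here are illustrative, not taken from the study's dataset.
record = {
    "organization": "Example Nonprofit",
    # Transparency variables, each coded yes/no
    "data_attribution": "no",
    "data_date": "no",
    "designer_credit": "no",
    "photographer_credit": "no",
    "landing_page": "yes",
    "relevance": "yes",
    "sustainability_context": "no",
    "neutrality": "no",
    "comparability": "no",
    "timeliness": "no",
    # Clarity variables
    "infographic_type": ["numbers_only"],  # each type coded present/not present
    "legend": "no",
    "jargon": "no",
}
```

A collection of such records is all that is needed for the frequency and reliability analyses described in the method.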


In addition, the researchers coded the organization that released the infographic, as well as the topic, tone, and type of data visualization. Tone was coded as humorous/entertaining, informational, utility/how-to, serious/somber, other, and none; each category was coded as yes or no because the categories are not mutually exclusive. Humorous or entertaining infographics were light-hearted or comical; informational infographics were merely fact-based; utility-based infographics were those that taught a user how to do something; serious or somber infographics contained serious information aimed at persuading users. As this article is aimed toward education, the infographic types examined (e.g., pie charts, maps) were selected from two leading textbook authors, Cairo (2012) and Krum (2013). A detailed visual codebook was developed and refined through five separate practice sessions by two independent coders. Following refinement of the codebook, three more practice coding sessions of a subsample of infographics were undertaken, with intermittent discussions and clarifications, until an acceptable level of agreement was achieved. Coders reached a good to excellent level of intercoder reliability: Cohen’s kappa was κ > 0.9 for all variables, with three exceptions (type, neutrality, and attribution, each κ = 0.87). To examine the data from the visual and textual content analysis, frequencies and descriptive statistics were computed for each category.
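The reliability statistic reported above can be reproduced from two coders' parallel judgments. The following Python sketch computes Cohen's kappa for nominal (e.g., yes/no) codes; the example codes passed to it are hypothetical, not the study's:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' nominal codes on the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: proportion of items both coders coded identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement expected from each coder's marginal distribution.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in set(coder_a) | set(coder_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no codes from two coders on ten infographics:
a = ["yes", "yes", "yes", "yes", "no", "no", "no", "no", "no", "no"]
b = ["yes", "yes", "yes", "no", "no", "no", "no", "no", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # 0.583
```

By commonly used benchmarks, kappa values above roughly 0.8, like those reported for this codebook, indicate excellent agreement.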


Results

Findings from this study highlighted the nuances of how nonprofits approach transparency and clarity practices. The following sections address the results of each research question.

RQ1: To what degree are nonprofit organizations’ infographics transparent?

Frequencies were computed for the infographic transparency variables, for the quality of the data source among infographics that list a source, and for data availability. As Table 3 shows, only 18.6% of the infographics attributed the source of their data. For each variable, there were fewer positive instances of transparency than negative.

Table 3
Frequencies of infographic transparency variables
Variable Yes No Total
Data attribution 70 (18.6%) 306 (81.4%) 376 (100%)
Data date 53 (14.1%) 323 (85.9%) 376 (100%)
Designer credit 3 (0.8%) 373 (99.2%) 376 (100%)
Photographer credit 28 (7.4%) 348 (92.6%) 376 (100%)
Landing page 120 (31.9%) 255 (67.8%) 376 (100%)
Relevance 65 (17.3%) 311 (82.7%) 376 (100%)
Sustainability context 53 (14.1%) 322 (85.6%) 376 (100%)
Neutrality 30 (8.0%) 345 (91.8%) 376 (100%)
Comparability 9 (2.4%) 366 (97.3%) 376 (100%)
Timeliness 39 (10.4%) 336 (90%) 376 (100%)
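The row percentages in Table 3 follow directly from the yes/no codes. As a minimal sketch, with hypothetical codes constructed to match the data-attribution row:

```python
from collections import Counter

def frequency_row(codes):
    """Count and percentage of 'yes' and 'no' codes for one variable."""
    n = len(codes)
    counts = Counter(codes)
    return {c: (counts[c], round(100 * counts[c] / n, 1)) for c in ("yes", "no")}

# Hypothetical codes for the data-attribution variable across 376 infographics:
codes = ["yes"] * 70 + ["no"] * 306
print(frequency_row(codes))  # {'yes': (70, 18.6), 'no': (306, 81.4)}
```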

The inclusion of a landing page was the tool used most often by the nonprofits in this sample (31.9%). Very few infographics included a credit to the designer (0.8%) or to the photographer or image source (7.4%). Relevance, or whether the infographic contained information about an action taken by the organization, was present in 17.3% of infographics. Similarly, 14.1% of infographics contained information about how organizational behavior is contributing to effects on the community, environment, or social welfare of groups or individuals.

As Table 4 shows, of the infographics that list a data source (18.6%, n = 70), 64 of the sources were vague, or only listed the host site without additional information about the specific report or article; three were “questionable” (e.g., a blog, Wikipedia, or personal site); and three were not clearly identified.

Table 4

Infographic quality of data source of those that list a source

Data quality Number of infographics Total
Vague 64 (91.4%) 70 (100%)
Questionable 3 (4.3%) 70 (100%)
Unclear 3 (4.3%) 70 (100%)

As Table 5 shows, audiences wishing to clarify the source of data would be mostly unable to, as only 16 (4.2%) of the infographics in the total sample contained a way to find the source of the data.

Table 5
How nonprofit organizations make data available
Data availability Number of infographics Total
Link on infographic 5 (1.3%) 376 (100%)
Link on landing page 8 (2.1%) 376 (100%)
Spreadsheet on landing page 3 (.8%) 376 (100%)
Data source not readily available 360 (95.7%) 376 (100%)

Finally, image source credit was most associated with numbers-only infographics (4%) and with infographics containing only words and facts (4%). Designer credit (1%) was most associated with list infographics. Landing page URLs were most associated with infographics containing only words and facts (12%) and with list infographics (8%).

RQ2: To what degree do nonprofit organizations present the information in infographics clearly?

For the present paper, the construct of clarity was measured using three variables culled from the literature: type of infographic, the use of jargon, and the use of legends. Inclusion of jargon (Figure 1, from our sample) and inclusion of a legend (Figure 3, from our sample) inhibit clarity.

Figure 1. Infographic example of jargon and avoiding legend (ACLU, 2015, May 31)

Nonprofit organizations used and disseminated different types of infographics through Twitter. Infographic types examined included: numbers only (66%, n = 248), words and facts (27%, n = 103), lists (13%, n = 103), pie charts (9%, n = 32), bar graphs (4%, n = 16), how-to (3%, n = 12), maps (3%, n = 12), line graphs (2%, n = 6), timelines (1.1%, n = 4), and flowcharts (1%, n = 2).

Figure 2. “Big numbers” (ACLU, 2015, October 28)


Figure 3. Infographic example of unnecessary legend (ACLU, 2015, May 21)

Most of the infographics (89.6%, n = 337) examined did not contain a data visualization, thus precluding the need to consider whether a legend must be used. Of the 39 (10.4%) infographics in this analysis that did contain data visualization (e.g., a chart or graph), 14 (3.7%) used a legend unnecessarily, while 25 (6.6%) did not use a legend, thus clarifying data interpretation. Of the infographics examined, 35 (9.5%) contained instances of jargon, or highly technical, industry-specific words or acronyms that may not be understood by all members of the lay audience.


Discussion

The present study sits at the intersection of public relations, infographics, and education. By examining infographic design principles as applied to public relations practices, these exploratory findings contribute to the development of more effective education in the area of visual literacy, particularly public relations infographic design. The study suggests that while making heavy use of infographics on social media, the nonprofit organizations studied do not often translate the concepts of transparency and clarity into their infographic-based communications online. This finding amplifies educators’ and researchers’ calls for better visual literacy education among students and supports suggestions for such literacy in the area of infographic design.

The nonprofit organizations in this study did not often practice transparency in their infographics. Only 19% of the infographics examined included the data source at all, and even fewer provided details such as the date of the data (14.1%). Those that did include a source were most often vague about it, including the name of a company (e.g., “Humane Society”) instead of directing the user to an actual dataset or name of a study. In fact, very few (4.2%) infographics made the dataset available to viewers, inhibiting the viewer’s ability to explore, ask questions, and assess credibility (Cairo, 2012). Nonprofit organizations were most opaque in their sourcing of photographers (7.4%) and designers (0.8%), an ethical and legal blunder (e.g., Lester, 2015; Newton, 2013). Only 32% of the infographics examined included even a URL leading back to the landing page from which the infographic originated, leaving most viewers unable to trace the graphic’s origins or fill in any of the transparency gaps.

Further, the infographics in this sample did not reflect transparent communication practices as outlined by the Global Reporting Initiative guidelines (as cited in Rawlins, 2009). Only 17.3% of the infographics released by the nonprofit organizations in this study communicated their actions (i.e., relevance), while even fewer (14.1%) communicated, using a sustainability context, how their actions impact the community, environment, or social welfare of groups or individuals. Given this finding, nonprofit organizations are missing an opportunity to communicate what they do and how they impact society, communication that could provide a competitive advantage and enhance relationships with current and potential donors.

Few (8%) of the infographics communicated neutral, third-party information about the nonprofit organization’s actions, an omission that may create skewed perceptions. Third-party endorsements provide organizations an additional layer of credibility with members of key publics; therefore, not incorporating this information may impact perceptions of organizational credibility. Very few (2.4%) infographics provided comparable information about nonprofit organizations’ past and present performance, which would demonstrate effectiveness to donors and members of key publics. Although social media allows information to be provided quickly, few (10.4%) infographics provided timely information that would aid donors and key publics in decision-making. Timeliness here refers to the date of the information relative to the date the infographic was distributed via Twitter.

The infographics examined could also improve in clarity. Nonprofit organizations are not taking full advantage of the power of infographics to visualize otherwise difficult data or information, a primary purpose of infographics (Cairo, 2012). Most of the nonprofits studied are releasing big numbers, big words, or lists, a strategy experts on the topic recommend against (e.g., Cairo, 2012; Krum, 2013). Very few other types of data visualizations were used; pie charts were the most popular, appearing in 9% of the graphics. Other forms of visualization, while potentially more appropriate, were each used in less than 5% of the sample. Of the infographics using data visualizations, just under half used legends, adding unnecessary work for viewers trying to decipher the meaning of the visualization.

Practical Implications. Findings from this study inform public relations educators by identifying gaps in practice that can be addressed by teaching students about transparency and clarity in infographics. Students should keep in mind that once an infographic is released onto the Internet, its eventual whereabouts are unpredictable. Students should be prepared to conduct a communication audit of their infographic use to ensure that their communication is clear and demonstrates a dedication to transparency. In such audits, students should employ a thematic analysis of current messaging guided by the measures of transparency and clarity offered in this study.

In the same way that public relations professionals are trained in management, strategy, writing, and research, today’s visual landscape of information overload dictates a need for basic education in communicating ideas and data visually. Given the findings from this study, the following suggestions are offered as a first step toward guidelines for teaching infographic design:

  1. Data source, designer credit, and photographer credit must be included directly on all types of infographics to support transparency;
  2. Nonprofit organizations must enhance credibility with members of key publics through the use of neutral information or data to show unbiased impact on society through their organizational efforts;
  3. Nonprofit organizations must strive to communicate clearly by avoiding the use of legends and jargon, which may be confusing and add unnecessary work for members of key publics;
  4. Nonprofit organizations would improve their commitment to clarity and harness the power of visuals by incorporating more visualization of data and fewer graphics that merely display large numbers to make them seem important. Tufte suggests always showing comparisons in data visualizations so that viewers can better understand the data. Showing, not telling, is at the heart of infographic design;
  5. Finally, designers and public relations professionals must consider the apparent believability of data visualizations and be vigilant in their transparency efforts by including a data source, a link to the dataset, the date, and a landing page link on the infographic itself, lending credibility.

Limitations and Suggestions for Future Studies

Despite the relevant and important findings of this study, there are some limitations worth noting. This study is intended as a snapshot of the current state of nonprofit infographics online, with the purpose of opening a dialogue about improvement, prompting future, more thorough studies on the topic, and developing an initial set of suggestions for teaching public relations students about infographic design. The sample was purposeful rather than random; thus, these findings cannot be generalized to all infographics online, or even to those from public relations agencies. The goal of the paper was not to generalize, but to capture a snapshot of practices in order to offer best practices for students.

The shortcomings of this study and its unaddressed issues open the door for future studies. Infographics disseminated by nonprofit organizations on Twitter were the only type studied; other scholars could add to the literature by exploring other types of infographics and other social networks. Second, an important area in need of examination with regard to infographics is data deception. For example, bubble charts are infamous for misrepresenting the size and scale of area, rendering data comparisons misleading (Cairo, 2012; Tufte, 1983). No studies, to the authors’ knowledge, have taken on the task of carefully examining the accuracy of data visualizations. This second, larger step would add richness to the present understanding of infographics. It is an important area of study, and offering students instruction in this regard is relevant.

Further study of best practices for visual, social, and primarily nonlinear, web-based forms of communication will enhance current practices in the public relations industry and help bolster organizational credibility at a time when it is desperately needed. The buzz surrounding the proliferation of “fake news” and so-called “alternative facts” calls educators’ attention to the need to teach transparency and clarity as applied to all forms of communication. This study opens conversations and invites further study into best practices for performing public relations in the contemporary media landscape. Future studies can add to and move beyond the three concepts examined here, studying not only infographics but the myriad other forms of online communication, including memes, GIFs, animations, and snaps.


Albu, O. B., & Wehmeier, S. (2014). Organizational transparency and sense-making: The case of Northern Rock. Journal of Public Relations Research, 26(2), 117–133. http://doi.org/10.1080/1062726X.2013.795869

ACLU. (2015, October 28). #CriminalJustice system fails women survivors of domestic & sexual abuse. New ACLU report: https://www.aclu.org/feature/responses-field …. [Twitter post]. Retrieved from https://twitter.com/ACLU/status/659467248487133184

ACLU. (2015, May 31). #Minneapolis arrest rates are much higher for ppl of color #overcriminalization https://www.aclu.org/feature/picking-pieces#minneapolis …. [Twitter post]. Retrieved from https://twitter.com/ACLU/status/605037048928468993

ACLU. (2015, May 21). More than 50% of ppl executed in the US in ’14 were African Americans http://www.vox.com/2015/5/19/8625697/death-penalties-by-state … @voxdotcom @colorlines. [Twitter post]. Retrieved from https://twitter.com/ACLU/status/601395527830392835

Boerman, S. C., Smith, E., & van Meurs, L. (2011). Attention battle: The abilities of brand, visual, and text characteristics of the ad to draw attention versus the diverting power of the direct magazine context. In S. Okazaki, (Ed.), Advances in advertising research (Vol. 2): Breaking new ground in theory and practice (pp. 295–310). Wiesbaden: Gabler Verlag.

Burns, K. S. (2008). The misuse of social media: Reactions to and important lessons from a blog fiasco. Journal of New Communications Research, 3(1), 41–54.

Burns, M. (2006). A thousand words: Promoting teachers’ visual literacy skills. MultiMedia & Internet@Schools, 13(1), 16.

Commission on Public Relations Education. (2015). Educator Summit on Public Relations Education: Summary Report (pp. 1–38). Retrieved from http://www.commpred.org/_uploads/industry-educator-summit-summary-report.pdf

Cairo, A. (2015). Graphics lies, misleading visuals. In New Challenges for Data Design (pp. 103-116). London: Springer-Verlag.

Cairo, A. (2012). The Functional Art: An introduction to information graphics and visualization. San Francisco, CA: New Riders.

DiStaso, M. W., & Bortree, D. S. (2012). Multi-method analysis of transparency in social media practices: Survey, interviews and content analysis. Public Relations Review, 38(3), 511–514. doi:10.1016/j.pubrev.2012.01.003

Edelman, R. (2012, June). When all media is social: Navigating the future of communications. Speech presented at the 2012 Edelman Academic Summit, Palo Alto, CA. Available at http://www.newmediaacademicsummit.com/summit2012/agenda.asp

Fernando, A. (2012). Killer infographic! But does it solve TMI? Communication World, 29(2), 10-12.

Gallicano, T., Ekachai, D., & Freberg, K. (2014). The infographics assignment: A qualitative study of students’ and professionals’ perspectives. Public Relations Journal, 8(4).

Graber, D. A. (1990). Seeing is remembering: How visuals contribute to learning from television news. Journal of Communication, 40(3), 134-156.

Greenwood, S., Perrin, A., & Duggan, M. (2016, November 11). Social Media Update 2016. Retrieved from http://www.pewinternet.org/2016/11/11/social-media-update-2016/

Kent, M. L. (2013). Using social media dialogically: Public relations role in reviving democracy. Public Relations Review, 39(4), 337-345.

Kienzler, D. S. (1997). Visual ethics. Journal of Business Communication, 34, 171-187. doi:10.1177/002194369703400204

Kimball, M. A., & Hawkins, A. R. (2008). Document design: A guide for technical communicators. Boston, MA: Bedford/St. Martin’s.

Knaflic, C. N. (2015). Storytelling with Data: A Data Visualization Guide for Business Professionals. Hoboken, NJ: John Wiley & Sons.

Kostelnick, C., & Roberts, D. (2010). Designing visual language: Strategies for professional communicators (2nd ed.). Boston, MA: Allyn & Bacon.

Krum, R. (2013). Cool infographics: Effective communication with data visualization and design. Indianapolis, IN: John Wiley & Sons.

Lester, P. M. (2015). Photojournalism: An ethical approach. New York, NY: Routledge.

Marken, G. A. (1996). Public relations’ biggest challenge: Translation. Public Relations Quarterly, 41(3), 47–48.

Meacham, M. (2015, August). Use infographics to enhance training. TD Magazine. Retrieved from https://www.td.org/magazines/td-magazine/use-infographics-to-enhance-training

McArdle, M. (2011, December 23). Ending the infographic plague. The Atlantic. Retrieved from https://www.theatlantic.com/business/archive/2011/12/ending-the-infographic-plague/250474/

Messaris, P. (1994). Visual “literacy”: Image, mind, and reality. Boulder, CO: Westview Press.

Metros, S. E. (2008). The educator’s role in preparing visually literate learners. Theory into Practice, 47(2), 102-109.

Moriarty, S. E. (1996). Abduction: A theory of visual interpretation. Communication Theory, 6(2), 167-187.

“Nation shudders at large block of uninterrupted text.” (2010, March 9). The Onion. Retrieved from http://www.theonion.com/article/nation-shudders-at-large-block-of-uninterrupted-te-16932

Newton, J. (2013). The burden of visual truth: The role of photojournalism in mediating reality. New York, NY: Routledge.

Nielsen, J. (2011). How long do users stay on web pages? Retrieved from http://www.nngroup.com/articles/how-long-do-users-stay-on-web-pages/

Plaisance, P. L. (2007). Transparency: An Assessment of the Kantian Roots of a Key Element in Media Ethics Practice. Journal of Mass Media Ethics, 22(2-3), 187–207. doi:10.1080/08900520701315855

Rawlins, B. (2006). Measuring the Relationship Between Organizational Transparency and Trust. Presented at the 9th Annual International Public Relations Research Conference, Miami, FL. Retrieved from http://www.docunator.com/bigdata/1/1366449053_93ee43dea2/iprrc_10_proceedings.pdf#page=425

Rawlins, B. (2009). Give the Emperor a Mirror: Toward Developing a Stakeholder Measurement of Organizational Transparency. Journal of Public Relations Research, 21(1), 71–99. doi:10.1080/10627260802153421

Rosenquist, C. (2012). Visual form, ethics, and a typology of purpose: Teaching effective information design. Business Communication Quarterly, 75, 45-60. doi:10.1177/1080569911428670

Rosenwald, M. S. (2014, April 6). Serious reading takes a hit from online scanning and skimming, researchers say. The Washington Post. Retrieved from https://www.washingtonpost.com/local/serious-reading-takes-a-hit-from-online-scanning-and-skimming-researchers-say/2014/04/06/088028d2-b5d2-11e3-b899-20667de76985_story.html

Rutenberg, J. (2016, November 6). Media’s next challenge: Overcoming the threat of fake news. The New York Times. Retrieved from https://www.nytimes.com/2016/11/07/business/media/medias-next-challenge-overcoming-the-threat-of-fake-news.html?_r=0

Rutenberg, J. (2017, January 22). “Alternative facts” and the costs of Trump-branded reality. The New York Times. Retrieved from https://www.nytimes.com/2017/01/22/business/media/alternative-facts-trump-brand.html

Schafer, C. (1995). Understanding the brains helps writers. Intercom, 14(9), 18-19.

Smiciklas, M. (2012). The power of infographics: Using pictures to communicate and connect with your audiences. Indianapolis, IN: Que Publishing.

Sosa, T. (2009). Visual literacy: The missing piece of your technology integration course. TechTrends, 53(2), 55.

Stallworth, W. L. (2008). Strengthening the ethics and visual rhetoric of sales letters. Business Communication Quarterly, 71, 44-52. doi:10.1177/1080569907312860

Stempel, G. H. (2003). Content Analysis. In G. H. Stempel, D. H. Weaver, & G. C. Wilhoit (Eds.), Mass communication research and theory (pp. 209–219). Boston: Allyn & Bacon.

Top 100 Nonprofits on the Web. (n.d.). Retrieved May 24, 2017, from https://topnonprofits.com/lists/best-nonprofits-on-the-web/

Toth, C. (2013). Revisiting a genre teaching infographics in business and professional communication courses. Business Communication Quarterly, 76(4), 446-457.

Tufte, E. (2001). The visual display of quantitative information (2nd ed.). Cheshire, CT: Graphics Press.

Walter, E., & Gioglio, J. (2014). The power of visual storytelling: How to use visuals, videos, and social media to market your brand. New York, NY: McGraw-Hill Professional.

Wheeler, T. H. (2005). Phototruth or photofiction?: Ethics and media imagery in the digital age. New York, NY: Routledge.

Wingfield, N., Isaac, M., & Benner, K. (2016, November 14). Google and Facebook take aim at fake news sites. The New York Times. Retrieved from https://www.nytimes.com/2016/11/15/technology/google-will-ban-websites-that-host-fake-news-from-using-its-ad-service.html?_r=0

Yaffa, J. (2011, May/June). The information sage: Edward Tufte, the graphics guru to the power elite who is revolutionizing how we see data. Washington Monthly. Retrieved from https://washingtonmonthly.com/magazine/mayjune-2011/the-information-sage/

Yeh, H. T., & Cheng, Y. C. (2010). The influence of the instruction of visual design principles on improving pre-service teachers’ visual literacy. Computers & Education, 54(1), 244-252.

© Copyright 2017 AEJMC Public Relations Division

How Do Social Media Managers “Manage” Social Media? A Social Media Policy Assignment


Melissa Adams

Melissa Adams, North Carolina State University



As numerous public relations research studies have noted, social media communication by employees and other stakeholders often impacts public perceptions of their associated organizations, whether that communication is sanctioned by the enterprise or is a personal expression. Employees have been known to use social media purposefully to express anger or attempt to harm the reputation of organizations through “venting” or negative “flaming” messages meant to be seen by potential clients or hires, presenting new challenges for public relations (Jennings, Blount, & Weatherly, 2014; Krishna & Kim, 2015).

As the resident social media “experts,” commonly charged with monitoring and responding to such communication as well as with day-to-day management, public relations professionals are usually the primary resource for the development of social media policies (Lee, Sha, Dozier, & Sargent, 2015; Messner, 2014). Even though organizations may not have a policy in place when they become active on social media, they often realize the necessity of one after gaining some experience (Messner, 2014).

This assignment was developed to address the task of policy development with practical training that foregrounds professional ethical communication guidance, legal precedent, and collaboration with organizational stakeholders. Researching and crafting the policy also prepares students for the emergent public relations role of social media policy maker and manager (Neill & Moody, 2015).

Assignment Rationale

The social media policy assignment was designed to integrate knowledge gained from recent course material and discussion of ethical social media practice, a unit on the current legal environment (copyright, etc.), and a workshop on the basics of campaign planning. It challenges students to apply what they have learned to develop a comprehensive policy that addresses organizational needs and includes all the appropriate information (i.e., they must think it through just as they would in an agency or professional project). This unit begins with the question “How do social media managers really ‘manage’ social media?” As the class moves through the ethics and legal units, this question continues to prompt discussion of the challenges that digital public relations practitioners must take into account as resident technical experts, planners, and policy advisors managing social media and organization-public relationships (Lee, Sha, Dozier, & Sargent, 2015; Neill & Moody, 2015). Legal precedent and issues of copyright, fair use, and freedom of speech as expressed on social media (e.g., the NLRB’s Hispanics United of Buffalo decision) are the focus of class discussion leading up to the social media policy assignment (Lipschultz, 2014; Myers, 2014).

In addition, this assignment requires students to identify and work with a client organization, learn about the organization’s potential risks from inappropriate social media use, and then make analytical decisions to construct an ethical, comprehensive policy to address them. Finally, the completed social media policy provides students with a professional quality portfolio piece, and if the client chooses to adopt it, an impressive resume-builder.

Student Learning Goals

This assignment develops several communication practice competencies that public relations educators and practitioners identify as desired skills for young professionals. Through its blend of research and knowledge application, the social media policy assignment teaches students to think like practitioners following best practices and to appreciate the value of collaboratively developed policies (Freberg, Remund, & Keltner-Previs, 2013; Messner, 2014). Working through this assignment, students build practical research skills by conducting discovery interviews with organization practitioners or administrators, while simultaneously gaining experience working with a client and managing logistics and communication. The assignment also helps students develop analytic acumen by having them audit client social media assets with regard to organizational risk.

By conducting a working review of existing organizational social media and example documents, students learn and understand common objectives and components of social media policies. They are then challenged to apply their recently gained legal knowledge to the development of an ethical and compliant written social media policy document.

Finally, as advanced writing and presentation skills are core competencies for public relations practice, the social media policy assignment provides an opportunity to refine presentation skills and gain experience producing professional quality documents. For the last stage of the assignment, students are required to formally meet and present their final policies to their client organizations, who in turn complete a satisfaction form for assessment.

Connections to Public Relations Theory and Practice

This assignment comes from a course developed for seniors and advanced juniors enrolled in the public relations concentration. It connects to recent scholarship and research on the ethical practice of social media in public relations. As communications professionals, students will likely be required either to update existing social media policies or to develop new ones for clients or employer organizations. To do this, these young professionals will need to work across the organization, collaborating with stakeholders in human resources, legal, and marketing to develop, implement, promote, and police policies across the enterprise, as noted in recent research (Neill & Moody, 2015). Crucially, they must be able to craft policies that both recognize the free speech rights of employees and provide a comprehensive guidelines document addressing all areas of possible use (Lipschultz, 2014; Myers, 2014).

In preparation for the social media policy assignment, students read and discuss a textbook chapter on the legal issues of social media practice (Lipschultz, 2014) and review National Public Radio’s Ethics Handbook (n.d.), which addresses the general ethical journalism practice concepts of fairness, transparency, and accuracy. They also review the Public Relations Society of America’s Member Code of Ethics (n.d.), which reinforces the journalistic principles covered by NPR’s Ethics Handbook, yet extends them to the role of ethical digital public relations practice by addressing practitioner duties such as the preservation of accurate information flow and safeguarding privacy (PRSA, n.d.). In addition to professional ethical guidance, these resources offer a framework for the students to refer back to as they work through the assignment and interact with their clients about the specific needs of their organizations.

Assignment Introduction and Execution

 To introduce the assignment, two examples of actual (anonymized) social media policies of varying scope and audience (university and small business or student organization) are presented. Students form small groups to work through examples of the policies, comparing the components and noting differences. They make a list of all the similarities and differences of each policy element as a group. Afterward, the class discusses the elements of each policy to determine their primary function and necessity. Then the social media policy assignment is introduced with an in-depth handout (a brief version of the handout is provided in the Appendix) and a walk-through of the numerous questions students should ask to determine the needs and goals of their client organization, including resources required for implementation and adoption.

Students are then charged with identifying a client organization to work with on this assignment—a nonprofit organization, student organization, or a small business they are affiliated with that needs such a policy. If needed, students receive help connecting with a potential client organization for the project.

From this point, students use the assignment instructions to work on their individual policy documents on their own time. After completion and grading, the policies are returned to the students for finalization for their clients, and they email them to the instructor for a final proofread before the documents are delivered. This final step allows a review of presentation points and the assessment form with the students.

Evidence of Learning Outcomes

Several of the client organizations have implemented their student’s policy document following completion of this assignment. These included student organizations, two nonprofits, and two small businesses where students were employed or interning at the time. One small business, a massage studio and beauty spa, adopted the social media policy across its small chain of retail locations in the Southeastern US.

Additionally, students have noted in instructor feedback forms that this assignment was very useful as it gave them an opportunity to develop “real world” experience and a document they could use as both a portfolio piece and a professional writing sample.


Freberg, K., Remund, D., & Keltner-Previs, K. (2013). Integrating evidence based practices into public relations education. Public Relations Review, 39(3), 235-237. doi: 10.1016/j.pubrev.2013.03.005

Jennings, S. E., Blount, J. R., & Weatherly, M. G. (2014). Social media—A virtual Pandora’s box: Prevalence, possible legal liabilities, and policies. Business and Professional Communication Quarterly, 77(1), 96-113. doi: 10.1177/2329490613517132

Krishna, A., & Kim, S. (2015). Confessions of an angry employee: The dark side of de-identified “confessions” on Facebook. Public Relations Review, 41(3), 404-410. doi: 10.1016/j.pubrev.2015.03.001

Lee, N., Sha, B. L., Dozier, D., & Sargent, P. (2015). The role of new public relations practitioners as social media experts. Public Relations Review, 41(3), 411-413. doi: 10.1016/j.pubrev.2015.05.002

Lipschultz, J. H. (2014). Social media communication: Concepts, practices, data, law and ethics. New York, NY: Routledge.

Messner, M. (2014). To tweet or not: Analysis of ethical guidelines for social media engagement of nonprofit organizations. In DiStaso, M. W., & Bortree, D. S. (Eds.), Ethical practice of social media in public relations (pp. 82-95). New York, NY: Routledge.

Myers, C. (2014). The new water cooler: Implications for practitioners concerning the NLRB’s stance on social media and workers’ rights. Public Relations Review, 40(3), 547-555. doi: 10.1016/j.pubrev.2014.03.006

National Public Radio (n.d.). NPR Ethics Handbook. Retrieved from http://ethics.npr.org/

Neill, M. S., & Moody, M. (2015). Who is responsible for what? Examining strategic roles in social media management. Public Relations Review, 41(1), 109-118. doi: 10.1016/j.pubrev.2014.10.014

Public Relations Society of America (n.d.). PRSA Member Code of Ethics. Retrieved from http://apps.prsa.org/AboutPRSA/Ethics/CodeEnglish/index.html


 Assignment Worksheet

For this assignment you will create a formal, professional social media policy for an organization of your choice. If you need help identifying an organization, I will help you connect with a local nonprofit or student organization.

So how do you go about this? Just follow these steps.

First, research the social media footprint and assets of the organization: create a list of all its platforms and note any apparent campaigns, strategies, and tactics used.

  1. Identify, contact and talk to the person in charge of social media and brand administration for the organization (who will likely be in a communications function). If this individual can’t meet with you in person, you can connect with them via email or phone. Note that in smaller organizations, this contact might be someone in human resources or customer service.
    • Ask whether they have an existing social media policy. If so, does it fit their needs? If not, can you create one for them?
    • Then ask—what are the main concerns regarding social media for their organization? Also find out if there are any special regulations or legal issues you should be aware of when preparing your policy.
  2. Ask yourself (and your client organization when applicable) the following questions as you think through this assignment.
    • What is the “big picture” purpose of this policy? How will the policy meet certain organizational needs and align with business objectives?
    • What types of social media activities need to be addressed in the policy document? What platforms? What types of content?
    • Are there any special considerations (based on your organization) that you should consider and address in the policy?
    • Who is the audience for this policy?
    • What are the specific risks your organization hopes to mitigate with this policy and where might they come from? Employees? Other stakeholders?
    • Who will be in charge of policy administration? Who will monitor and report infractions? What will happen to violators? Who should be contacted with questions about the policy?
    • What resources might readers need to comply with this policy? (Example: A link to an organizational brand standards guide.)
    • How will your organization implement this policy? Who needs to review and approve it before dissemination?


Sections to include in your policy document:

Policy Overview – provide a rationale for the policy. Explain in clear terms why it is needed, how it will be implemented, etc. Explain its goal in positive terms (to maintain xxx, to promote xxx, etc.), and be sure to include a list of applicable social media assets. Explicitly state what is covered by the policy (and what isn’t).

Allowed Use – provide examples of approved use. This should include actual or example tweets/posts as well as brand elements. Use screenshots to illustrate as needed.

Disallowed Use – provide examples of what NOT to do! Use screenshots and descriptive language.

Legal – address any legal issues including copyright. (Example: the FERPA section in the university social media policy example.)

General Best Practices – create a short list based on the organization’s current social media assets. Follow the examples provided as well as those posted online by reputable and ethical organizations (such as the examples shared in class).

Resources – this section is for links or directions to internal resources such as legal documents or other policies, and for reference links to external sources.

Contact Information – for the administrator of the policy, legal, etc. as you see fit. Provide full information including email and phone number.


Assignment Rubric – 100 pts possible

  1. Research – 20 pts
  2. Planning/Organization – 25 pts
  3. Content (each section is addressed completely) – 35 pts
  4. Clarity (is it easy to follow?) – 10 pts
  5. Professional Presentation – 10 pts



Who Will Get Chopped?: Mystery Basket PR Challenge



• Mary E. Brooks, West Texas A&M University

• Emily S. Kinsky, West Texas A&M University



Based on Food Network’s Chopped challenge, the Mystery Basket PR Challenge is a competition that focuses on creativity, speed, and skill, in which students are given a box of mystery “ingredients” (e.g., brand, crisis, strategy, channel, speaker, audience) they must use to complete an assigned task (e.g., a tweet, an official statement, a headline). For example, a box might contain a brand name, a particular crisis, a group of people affected, and a celebrity, and the task would be to write a headline for a news release, keeping in mind which crisis response strategy from Benoit (1997) or Coombs (2007) might be most appropriate. Students open the box and have a limited time in their groups to complete the task, which they then pitch to the judges (faculty and local professionals). This requires teamwork and application of lessons learned in class as the student groups compete against each other.

The purpose of the Mystery Basket PR Challenge is for students to apply PR strategies to handle unexpected situations and solve problems collaboratively under a deadline. This challenge can also help prepare students to clearly and quickly articulate ideas.

Per Kolb’s (1984) experiential learning theory, learning through experience focuses on the process at hand and not necessarily the outcome of the project. By formatting the classroom into a simulated work environment, students will have greater success in their future careers when faced with similar challenges (Ambrose, Bridges, DiPietro, Lovett & Norman, 2010; Svinicki & McKeachie, 2014). The challenge covers the five elements that are crucial to an experiential learning activity: the use of real-world situations; complexity (more than one answer may suffice); industry-specific concepts; student-led activity; and finally, feedback and reflection (Svinicki & McKeachie, 2014). The benefits to students are numerous, especially in relation to the PR industry, where strategy, creativity, spontaneous thinking, collaboration, and articulate wording are all pivotal to being successful.

This pedagogical teaching tool is applicable to a variety of courses within the PR discipline (e.g., writing, campaigns, cases, ethics, social media) or other strategic communication classes.

During fall 2016, a version of this challenge was successfully implemented in an advertising writing class as a final project. Student feedback was positive. For example, one student said, “the ‘Chopped’ final was also very intriguing! Having an interactive final that brings in industry professionals to critique our work will greatly help” students continuing in the field.

Assignment Instructions

The Mystery Basket PR Challenge includes three rounds. Each round consists of four mystery public relations components that groups of students must incorporate to produce a public relations solution for a specific organization. Students will work in small groups to produce the solution in a short amount of time for a variety of situations, organizations, and media platforms. Student groups will compete against each other. Working in a collaborative environment is essential in PR. Learning to meet deadlines is also pertinent, especially in the public relations industry, where clients expect work at a pre-set time. Further, PR practitioners must learn to handle unexpected crises in a timely manner.


The rules for each round include using all of the mystery basket components, creating the designated assignment within the time allotted, and making a persuasive pitch to the judges. In addition, students will have a public relations pantry they can turn to for help. The pantry will consist of their textbooks, Internet access, cell phones, and laptops/tablets. This is similar to Chopped, where contestants have access to a modified grocery store in order to enhance a dish. Students are given one class period to practice prior to the real competition class period, using different ingredients than those that will appear in the competition.


Each group has a basket of mystery components during each round. The round assignments can change based on the class topic (see Appendix A for examples). For an introductory course, Round 1 could be the event planning round; Round 2 could be the social media round; and Round 3 could be the news release round. Just like Chopped, the time allotted will increase as the rounds increase in difficulty. During Round 1 for a social media class, the students will have 10 minutes to create a calendar-related promotion; during Round 2, students will have 20 minutes to create a hashtag campaign; and during Round 3, the students will have 30 minutes to write a blog post.

Professional Feedback

The student groups will be given live feedback on their work from industry professionals (see Appendix B for a sample judging rubric). The benefits of including public relations industry professionals in this challenge are many. Students have a chance to demonstrate their creative and innovative ideas, their presentation abilities, and their quick thinking skills to the professionals. In addition, students and professionals will begin to formulate relationships. This is important for potential future employment and/or mentorship.

When the time for each round expires, one person from each group must present the team’s final idea to the judges for one minute (or longer, depending on the challenge). The judges will deliberate and deliver their individual comments to each group. The judges will also choose a winner for every round. The class enrollment size and the division of groups will determine how many winning groups there are per round. The winners from each round will be named the Mystery Basket PR Challenge champions.

Appendix A

Assignment Examples

The Mystery Basket PR Challenge can be modified for different PR courses (e.g., crisis, campaigns, writing, social media). Like Chopped, each round allows students more time (e.g., 10, 20 and 30 minutes). Some “ingredients,” like the brands, will be assigned, while others can be selected strategically by the students (e.g., which channel makes the most sense in this situation?).

Crisis Communication 

  • Round 1: Official statement
  • Component #1: Brand/Organization (this would be assigned to the group)
  • Component #2: An image restoration strategy from Benoit or Coombs
  • Component #3: Crisis (a type of crisis would be assigned to the group)
  • Component #4: Speaker (choose the title of the person who would share the statement)
  • Round 2: Social media post
  • Component #1: Brand/Organization
  • Component #2: An image restoration strategy
  • Component #3: Crisis
  • Component #4: Channel (assign or let them choose)
  • Round 3: News release
  • Component #1: Brand/Organization
  • Component #2: Crisis
  • Component #3: Audience
  • Component #4: A quote to include

Social Media

  • Round 1: Calendar promotion
  • Component #1: National ____ Day (choose a day that fits the brand/org; for example, if the students were given Bayer Aspirin as the brand, they might choose July 9 Rock ‘n’ Roll Day as the specific national day for a tied-in promotional post)
  • Component #2: Brand (company/organization assigned to the group)
  • Component #3: Social media site (choose the most appropriate site)
  • Component #4: Post (write copy, decide when it would be posted, sketch image)
  • Round 2: Hashtag campaign
  • Component #1: Organization
  • Component #2: Event
  • Component #3: Goal
  • Component #4: Social media platform
  • Round 3: Blog post
  • Component #1: Organization
  • Component #2: Audience
  • Component #3: Keywords
  • Component #4: Links

Appendix B

Judging Rubric Example 

Division A Judge Name:

Round 2: Social Media Post


Please circle which group in Division A is being judged:

Group 1                                           Group 2                                           Group 3



Please rate from 1-10 (with 10 being the best) the creativity of the social media post based on the components provided in the basket.       1    2    3    4    5    6    7    8    9    10



Please rate from 1-10 (with 10 being the best) the overall idea of the social media post based on the components provided in the basket.      1    2    3    4    5    6    7    8    9    10



Please rate the quality of presentation from 1-10 (with 10 being the best).

1    2    3    4    5    6    7    8    9    10



Please provide comments concerning the overall social media post results, the presentation, and/or anything regarding how the challenge was managed (both positive feedback and suggestions for improvement).


Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.

Benoit, W. L. (1997). Image repair discourse and crisis communication. Public Relations Review, 23(2), 177-186.

Coombs, W. T. (2007). Protecting organization reputations during a crisis: The development and application of situational crisis communication theory. Corporate Reputation Review, 10(3), 163–176.

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall, Inc.

Svinicki, M., & McKeachie, W. J. (2014). McKeachie’s teaching tips: Strategies, research, and theory for college and university teachers. Belmont, CA: Wadsworth Cengage Learning.

I Love Tweeting in Class, But…. A Qualitative Study of Student Perceptions of the Impact of Twitter in Large Lecture Classes



  • Jenny Tatone, University of Oregon
  • Tiffany Derville Gallicano, University of North Carolina at Charlotte
  • Alec Tefertiller, University of Oregon


This is perhaps the first in-depth qualitative study that shares insights about the perceived role of Twitter on the learning experience and the sense of classroom community from students’ perspectives in a large lecture class. We conducted four focus groups with a cumulative total of 27 students from a class of 269 students. Based on our data, we propose ways that Twitter might contribute to the sense of classroom community, which could be tested through quantitative research. We also identify ways that Twitter helps and undermines students’ learning experience. In addition, we found a surprising theme about Twitter fostering a sense of competition in the class when projected on the wall. This study concludes with recommendations for integrating Twitter in the large lecture class.

Keywords: Public relations, Twitter, classroom exercises



Millennials are known as digital natives–they grew up using digital media and are accustomed to using it throughout the day (Porter Novelli, 2008; Válek & Sládek, 2012). According to a Pew study, 90% of Americans ages 18-29 use social media and 86% of them own a smartphone (Perrin, 2015). Smartphones and social media have become so essential to the everyday lives of today’s young adults that some of them believe that they would feel invisible without them (Boyd, 2014; Tatone, 2016). The publicly networked spaces that digital media afford play a central role in shaping the ways young adults perceive their life experiences––personally, socially, and culturally (Ito et al., 2009; Tatone, 2016). Educators in various disciplines are exploring the potential of social media to play a powerful role in another area of young adults’ experiences––their education (e.g., Cole, Hibbert, & Kehoe, 2013; D’Angelo & Woosley, 2007; Tyma, 2011).

All of the studies we found about Twitter in the context of large lecture classes used surveys, experiments, or content analysis as a method, with the exception of Tyma’s (2011) study, and his qualitative data resulted from one large class discussion, as opposed to in-depth focus groups or interviews. The studies using quantitative methods have provided insight into the potential of Twitter to contribute to learning (e.g., Cole, Hibbert, & Kehoe, 2013; Junco, Heiberger, & Loken, 2011; Kim et al., 2015) and to be a source of distraction (e.g., Varadarajan, 2011). Qualitative research can play a key role by helping educators understand students’ in-depth explanations of how Twitter can help with learning, interfere with learning, or do both, as well as discovering students’ recommendations for how to integrate it into the large lecture classroom based on their experiences. We thought our class would be an interesting context for this qualitative research because we tried out various implementation strategies in response to student feedback with regard to the timing of class tweets and projecting the Twitter feed on the wall. We also saw an opportunity to explore any ways that Twitter might influence perceptions of the sense of classroom community, particularly given the lack of research about it in a large lecture context.


Strategies for Integrating Twitter

Instructors are discovering strategies to improve the use of Twitter in large lecture classes. Despite the likelihood that most students have had some experience with Twitter, the literature suggests that a tutorial about how to use Twitter effectively is helpful to students (e.g., Junco et al., 2011; Tyma, 2011; Varadarajan, 2011). In addition, instructors have found that students need reminders on occasion to keep tweets relevant to the class lecture (e.g., Cole et al., 2013; Pollard, 2014). Some students want their instructors to send these reminders, so they do not have to see the distracting content or call out their classmates who are tweeting irrelevant content (Tyma, 2011). A teaching assistant can handle these reminders during the lecture when seeing off-topic tweets. Another issue is whether the live tweets with the class hashtag should be projected onto the classroom wall. Elavsky, Kumanyika, and Mislan (2011) noticed that participation on the class hashtag increased when the Twitter feed was projected onto the wall in their large lecture media and democracy class.

An additional consideration is whether Twitter can be used to sustain students’ attention during class. We found a study that recommended restricting Twitter use to designated Twitter intervals (Cole, Hibbert, & Kehoe, 2013) to help students focus on the lecture content. In another study, Kim et al. (2015) used a game approach to sustaining students’ attention by presenting surprise Twitter questions on lecture slides and awarding points to a limited number of students who correctly answered the questions on Twitter using the class hashtag. Through a survey, participant observation, and exam scores from a comparison of class sections in which Twitter was used and not used, the research team concluded that their approach to integrating Twitter in the large lecture classroom helped students stay focused during class and learn the material.

Junco, Heiberger, and Loken (2011) studied the related topic of class engagement and produced a significantly higher engagement score in their class section in which Twitter was used, as compared to their class section in which Twitter was not used. Thus, their strategies for integrating Twitter into the large lecture classroom have credibility. They applied the following principles for undergraduate education by Chickering and Gamson (1987):

  1. Student/faculty contact (by adding Twitter as a communication channel)
  2. Cooperation among students (by encouraging students to use Twitter to ask each other questions, collaborate on a project, and offer one another emotional support)
  3. Active learning (by asking students to use Twitter to connect the class material to their own experiences)
  4. Prompt feedback (by responding quickly to students’ tweets)
  5. Emphasizing time on task (by expanding class discussions past class meeting days through the Twitter channel)
  6. Communicating high expectations (by using Twitter to promote high quality work)
  7. Respecting diversity (by discussing diversity through the Twitter feed)

Junco applied Chickering and Gamson’s (1987) principles in a later study with his colleagues when investigating the difference of requiring Twitter in class, as opposed to making it optional (Junco, Elavsky, & Heiberger, 2013). His research team concluded that large lecture classes should require Twitter use because his optional Twitter class section had lower class engagement and learning scores than his required Twitter class section, as measured by comparing student surveys and scores from each section.

In a related study, Pollard (2014) did not require Twitter use and found that the majority of students in her history course of 370 students did not participate on the class hashtag. Nevertheless, the majority of her students found Twitter in the classroom to be somewhat valuable, with 18% reporting that it was incredibly useful. Her findings suggest that the student behavior of lurking on the Twitter channel by observing without tweeting to it could have at least some value, which might not be visible through a content analysis of participation.

The Sense of Classroom Community

Students who believe their class has a strong sense of classroom community have a sense of belonging to a class, believe that classmates care about one another, perceive that all of the students have a mutual responsibility to one another, and experience shared expectations about meeting common goals as students in the same class (Rovai & Lucking, 2000; Rovai, 2002). The sense of classroom community can make a difference to learning (Rovai, 2002; Wighting, 2006).

We did not see any studies about Twitter’s contribution to the sense of classroom community in the context of large lecture classes, so we thought this would be a particularly interesting area to explore. A study with some relevance to the role of Twitter in enhancing a sense of classroom community in a large lecture class was C. M. Elavsky et al.’s (2013) study. These researchers found that 81.1% of the 260 participants in their media and democracy class thought that in-class tweets made the class feel smaller and more interconnected. In a tangentially related study about online discussion boards, which could be similar to Twitter, 59% of 341 students believed that the required discussion boards contributed to their sense of social connection with their peers in their large lecture course (Stoerger & Kreiger, 2016).

Research Questions

As noted in the introduction, we could not find any in-depth qualitative studies that involved hearing students’ perspectives about Twitter in a large lecture class. To explore how Twitter might affect students’ learning experience and the sense of classroom community from their perspectives, we investigated the following research questions:

RQ 1: In what ways do students think the use of Twitter as a pedagogical tool in the large college classroom affected their learning experience (if it had any effect)?

RQ 2: How do students think the use of Twitter as a pedagogical tool in the large lecture classroom affected their sense of class community (if it had any impact)?


Class Context and Professor Interaction

This study reports data from an entry-level course with 269 students that introduced students to public relations, advertising, journalism, and communication studies. A public relations professor taught the class and discussed the public relations angles of most of the topics the class explored. A tweet was required during every class meeting that did not have an exam. Students were required to use their real names in either their Twitter handle or profile. An alternative option in this section for students choosing to not tweet was to write a handwritten comment each time a tweet was required and submit it to their assigned teaching assistant. The professor discussed the basics of Twitter and emphasized the professional advantages of Twitter, as well as recommendations for using it in a professional context. The course ended at 5:20 p.m., and in the evenings of the class meetings, the professor spent one to three hours reading, retweeting, and responding to tweets on the course hashtag.

Despite a study’s recommendation to stop class lectures for designated Twitter intervals (Cole et al., 2013), we initially chose to invite the class to tweet at any point during the class, based on several of our colleagues’ anecdotal experiences with this unrestrained approach. We received complaints from students about it, so after the first two weeks of tweeting throughout class, we switched to designated Twitter intervals. During these intervals, the lecture stopped, and students were instructed to take a moment to focus on writing a tweet based on a prompt delivered in class, and they were reminded of the alternative of writing a reflection of similar length. We encouraged students to take a moment to read each other’s tweets and consider favoriting any they liked. They were then asked to put their phones away, although the auditorium was so large that it was difficult to enforce this policy.

Sampling for Focus Groups and Participants

All students were invited to participate in a focus group in exchange for extra credit. Due to the class size, we had planned to give all of the students who signed up for a focus group spot extra credit, regardless of whether we ended up including them in the focus groups; however, only 20 students registered for the focus groups. We recruited another 10 students, three of whom did not show up. We believe that the low rate of volunteering might have been due to the timing of the focus groups on a Saturday morning, combined with a heavy homework time (with just two weeks remaining of class), and a major competing campus event that attracted hundreds of students. The four focus groups had a cumulative total of 27 students. We did not conduct additional focus groups because we reached saturation with the data (see Glaser & Strauss, 1967).

We wanted to group similar people together, in line with the homogenous sampling strategy for focus groups (Miles & Huberman, 1994; Patton, 1990). Consequently, we organized the focus groups by the grades students were earning at the time of the course (without revealing this information to the students). We used purposive sampling by sending individual solicitations to people who stood out through their substantive tweets and by identifying people who could fill in the spaces we had in the grade groups. Although we tried to have 10 students per group, ultimately, we had a group of nine A students, plus a C student who showed up to the wrong group; a group of eight B students; a group of seven C students; and a group of three students in a combined D/F group. The focus group participants had name cards in front of them to facilitate interaction, and cupcakes were served. Regarding demographics, there were 11 Caucasian students (including 5 females and 6 males); 10 Asian students (all females); 2 Hispanic students (both females); 2 Caucasian-Middle Eastern students (1 male and 1 female); and 1 Caucasian-Asian male student. Students ranged in age from 18-27. The median age was 20.

Focus Group Approach and Protocol

We used a semi-structured approach, which allowed for a naturally flowing conversation wherein students elaborated frequently on other students’ comments, which often helped to shape the conversation’s direction more than our focus group protocol (see Appendix). This semi-structured approach also gave us the opportunity to ask follow-up questions on what the conversation’s natural unfolding revealed, giving us greater insight (Krueger, 1988). By asking open-ended questions and allowing focus group conversations to follow their own course, we believe we reduced the power difference with our students (see Lindlof & Taylor, 2002; Madriz, 2000) and positioned participants as experts rather than as subjects of a research study (Lee, 1993). Additionally, the focus group setting enabled participants to further explore their initial reactions to questions by interacting with one another, thus enhancing the quality of the results (see Madriz, 2000). Each focus group lasted an average of 54 minutes.

A potential drawback of the focus groups was that students might have felt influenced to say what they thought other students and the focus group moderator wanted to hear. In each group, the focus group moderator was either the professor or one of the graduate teaching fellows who had guest lectured a few times and worked with students closely. In an attempt to offset these potential drawbacks, we reminded students that honest feedback was of the utmost importance because the purpose of the focus groups was to learn from them. We told students we wanted to learn about the educational value, or lack thereof, with regard to incorporating Twitter into future curricula. In this way, we followed Krueger’s (1988) guideline to tell focus groups what the researchers want to discover from them. Furthermore, we told students that feedback from our previous classes had helped to shape the present course, so this was a good opportunity to continue the goodwill toward future classes by being honest and constructive. We responded in a supportive manner to all opinions and welcomed all viewpoints throughout the discussions.

Data Analysis

We performed a thematic analysis on the transcripts by seeking common patterns while noting the wide variety of responses we received (see Miles & Huberman, 1994). We used our research questions as a lens for reducing the data; next, we coded the relevant content by phrase, sentence, or paragraph, depending on the length of the relevant chunk of text (see Miles & Huberman, 1994). We used emic codes (i.e., the participants’ phrases) when possible and otherwise used etic codes (i.e., our words) when participants’ phrases were too long or did not summarize the content (see Lindlof & Taylor, 2002).


RQ 1: In what ways do students think the use of Twitter as a pedagogical tool in the large college classroom affected their learning experience (if it had any effect)?

Students commented on various advantages and disadvantages of Twitter as a tool for their learning experience. Many students valued the ability to express various viewpoints and learn from one another, although for some students, this marketplace of ideas via Twitter was more idealistic than what had actually occurred. Furthermore, students noted a major drawback of the potential for Twitter use in the college classroom to lead them down a rabbit hole into the use of social media unrelated to class. In addition, some students brought up that they disliked having their speech limited to the 140-character tweet limit. Nevertheless, the same students recognized that having to do this developed their skills. Details are included below.

Many participants agreed that the hashtag provided a place to share and learn from multiple points of view:

This is why Twitter’s really cool—you can have your own opinion and at that same time you can share what you think is correct without degrading that other person’s opinion. It’s a very open way of making sure that everyone’s voice is heard and to make sure that no voice is completely stamped out… no voice is elevated to the highest pedestal. (Student from the A group)

Some students recognized Twitter’s potential for enabling a marketplace of ideas––fitting their expectation of what college was meant to offer––while noting that it did not reach this ideal:

The entire point of college … [is] to not be around like-minded people…. Twitter… in a class college setting, embodies that in that you can see other people’s opinions and, if you feel so inclined, you’re able to argue your point, and…arguing in an academic sense is where the greatest ideas come from. …In its most ideal sense, Twitter would lead to… an argument of conviction, but sometimes it’s not that … most of the time, it’s not that. (Student from the A group)

Nevertheless, some students shared evidence of intellectual debate on the course hashtag. For example, the class studied the circuit of culture in the context of the public relations battle between the producers of the movie Ridiculous Six and Native American protestors. A student who rarely talked in class noted, “A lot of people were saying, ‘It’s by Adam Sandler. You shouldn’t take it seriously,’ and I was just one by one knocking out why representation is really important and it feels good [to recall that experience].” When asked about student reactions to her tweets, she noted that she received some comments and a lot of favorites “from people spectating the little showdowns.” Twitter gave several students increased agency for expressing their views in class. A student from the D group commented, “I feel like what’s cool about Twitter is if you do talk about these topics, it’s a cool, more informal, more comfortable way of expressing my opinion.”

A downside of Twitter was the potential for distraction. The switch to Twitter intervals (in which Twitter was only projected on the wall during designated Twitter periods after the second week of class) helped some students with regard to the distracting aspects of Twitter. “When we first started, I thought it was a really big distraction to have it on the wall because people kept staring at that and not paying attention, but once you started doing the intervals, it was good” (Student from the A group). A student from the B group commented,

Twitter in the classroom…has its perks and its downfalls. I love seeing different perspectives from other students, because obviously I don’t know what everyone’s thinking, so seeing their thoughts is really interesting – some things I’d never really thought about…. I guess lately the downfall is I get distracted. I start to focus on the J201 hashtag, and I’m not really paying attention as much as I could on the lecture.

For other students, even the use of Twitter intervals continued to be problematic: “It’s distracting because when I look on the phone, there’s so many other things on it, so it’s like you just see that little edge… [of] another app; it’s like, ‘Ah, you want to touch it so bad’” (Student from the C group).

Students brought up the issue of the 140-character limit with regard to the educational value of Twitter: “I don’t understand why I would download something that limits what I can say … I just never really saw the point” (Student from the A group). A student from the B group noted, “I almost have to sacrifice what I think ‘cause it doesn’t fit in the 140 characters, so that’s problematic. But it’s almost like a skill…something that you learn how to do over time.” A student from the A group said, “Eventually, I realized tweets are an easy way for me to make concise comparisons that were easy to remember. So I began appreciating the tweets.” Thus, some students disliked the character limit while acknowledging that learning how to fit their thoughts into a tweet had value. The results of the qualitative study suggest that for many (but not all) students, Twitter helped students exchange views and be exposed to different viewpoints. On the downside, many students reported struggles with getting distracted on their phones after visiting the hashtag.

RQ 2: How do students think the use of Twitter as a pedagogical tool in the large lecture classroom affected their sense of class community (if it had any impact)?

Twitter impacted most of the participants’ perceptions of the classroom community; however, it did so in different ways. Although there were some students for whom Twitter had no impact on the sense of community, for many others, it tended to increase the sense of community while also infusing it with a spirit of competition. This spirit of competition seemed focused on entertaining each other, to the detriment of the educational value. Details are presented below.

For many students, Twitter increased the sense of community. One way that Twitter increased the sense of community was by helping students bond through seeing one another’s similar reactions. A student from the D group commented,

When we were talking about copyright issues and stuff recently, the whole time, when she was going over the rules for it, and I had no idea about the rules for copyright stuff before that, I was thinking like, ‘What?’, like, ‘copyright should last forever.’ And then I was just thinking that I was probably alone in that thought. But then I saw that people had tweeted, ‘No, it should last forever.’ And then I was like, ‘Yeah, like, that’s what I think’ (laughs). Hearing the different views, when it’s something that the teacher is supposed to be unbiased or chooses to be unbiased about when she is providing information, it’s interesting, helpful, I think.

Twitter also increased the sense of community by helping the class know additional student thought leaders who were reluctant to speak in a classroom auditorium setting. It also gave thought leaders an online opportunity to continue their conversations outside of the class lecture. For example, in the grade A focus group, there was a student who stood out for passionately asserting her opinions frequently on Twitter; she was also a compelling writer. Despite her large share of classroom voice on Twitter, she only spoke in the classroom once and this was after significant encouragement by her professor toward the end of the course: “Without Twitter, I wouldn’t feel welcome to participate. I wouldn’t feel comfortable speaking and having you repeat what I’m saying over the microphone.” Another thought leader in this woman’s focus group recognized her from Twitter: “[Lauren] and I had never met but we communicate on Twitter a lot.” [Lauren] concurred: “We talk so much on Twitter.” Finally, the sense of community was also enhanced by students responding to each other’s questions pertaining to matters such as where to find an assignment description.

For many participants, Twitter amplified the sense of competition in the classroom community by producing pressure to come up with tweets that would “one-up” other tweets or garner positive feedback through a favorited tweet. Students explained that these tweets were designed primarily to entertain each other rather than enrich the educational experience. Students explained that projecting the Twitter feed on the classroom wall contributed to the sense of competition: “Once you’ve broadcasted on the wall and people see a physical reaction to what they’re saying, it stops becoming about learning. It starts becoming about––how can I get the most laughs; how can I make sure I’m the coolest” (Student from the A group). As another student from the A group recalled, “I see a meme and I’m like, ‘Oh, I want to make a funnier one.’” Another student from the group added, “I’m like, ‘I’m going to post something that’s going to knock it out of the park.’” The tweets were related to the class content but arguably had more entertainment value than educational value.


The growing prominence of social media in the lives of many of today’s college students is challenging our values and norms surrounding education. Educators and scholars are seeking to understand how to best adapt to the pace with which digital technologies are advancing, blurring lines between education and entertainment, virtual and real, public and private, affecting the way students feel, think, and relate, both inside and outside of the classroom (Ito et al., 2009). Our study provides a needed contribution to the literature as perhaps the first in-depth qualitative study to reach saturation in exploring students’ stories and views on the integration of Twitter in a large lecture setting. As a qualitative study, the findings are not generalizable; however, they can still provide insight in the context of one university class involving the strategies we used.

The Sense of Classroom Community

Through our qualitative research, we found that a sense of community in the classroom through Twitter might be influenced by the following variables:

  • Helping students bond through seeing one another’s similar reactions;
  • Helping students feel like they belong when their tweets are favorited or retweeted;
  • Helping students develop relationships with one another by helping each other out with basic questions about the course, such as the location of assignment instructions;
  • Enabling the rise of additional class thought leaders who provided excellent content using the course hashtag but felt reluctant to speak in a classroom auditorium setting; and
  • Fostering additional discussion on the course hashtag, as compared to the amount of verbal discussion in the classroom.

These applications of Twitter to the sense of classroom community fit well with Rovai and Lucking’s (2000) conceptualization of the concept, particularly with regard to feeling a sense of belonging, feeling like members care for one another, perceptions of shared responsibilities to one another, and perceptions of shared learning goals.

Sense of Competition

A new theme we had not read about in the literature that surprised us was the theme of competition on the course hashtag. Our qualitative data suggested that projecting tweets on a classroom wall could increase a sense of competition among students, which can devolve into attempts to entertain one another rather than share knowledge. Research is needed to discover whether there are ways to productively harness this competition toward educational goals (and if so, what those ways are) and whether a sense of competition among students should even be promoted, particularly with regard to how a sense of competition might intrude on the sense of classroom community (as conceptualized by Rovai & Lucking, 2000). Thus, this study introduces a question with regard to Elavsky et al.’s (2011) finding that participation on the class hashtag jumped when the Twitter feed was projected on the wall. Does the overall quality of the tweets change when the tweets are projected, and if so, how? Initial insight from this study, based on students’ accounts, suggests that projecting tweets might detract from the tweets’ intellectual rigor. There is a temptation to send entertaining but educationally shallow tweets to create ripples of appreciation throughout an auditorium.

Guidance for Tweets

In addition, this study goes further than the recommendation in the literature about reminding students from time to time to keep their tweets relevant (e.g., Cole et al., 2013; Pollard, 2014). Based on our study, we suggest that instructors (who choose to use Twitter) provide significant guidance in helping students to understand the type of tweets that add to the educational value of the hashtagged discussion and the types of tweets that are not worthy of points.

The strategies of providing reminders to increase the intellectual quality of tweets and rigorously grading the quality of tweets could be steps in the right direction. Anecdotally speaking, we used these strategies in a subsequent version of the class, and the intellectual rigor of tweets from the students who tended to entertain rather than educate eventually increased when they noticed that they were not receiving points for their vacuous tweets and followed up with us to learn why. The quality of tweets also increased in a subsequent class in which we did not award points for vanity tweets that merely expressed enthusiasm for the professor or topic without adding value to the conversation.

We want to note that the rigorous grading strategy required much more time than the simplistic grading strategy did due to emails and direct message tweets from many individual students who asked questions about why they were not receiving points for their tweets and how their tweets could improve (even though we had already addressed these topics during the lecture). During the subsequent class, we also noticed that we had additional opportunities to correct students on their understanding of the class content or provide information to help students formulate better arguments, perhaps because there was more intellectual content for our responses than there appeared to be earlier. Formal research can explore these anecdotal insights with greater credibility than these casual observations can provide.

Frequency of Tweets

A clear recommendation from our research was that for most of our participants, the invitation to tweet throughout class was too much of a distraction to justify this approach. With some vocal exceptions, there was a consensus in the focus groups that following the hashtag, tweeting, and listening to a lecture was overwhelming and even stressful. Thus, this research provides strong endorsement for designated Twitter intervals, as recommended by Cole et al. (2013).

The Learning Experience

In addition, our qualitative data suggested ways in which Twitter both helps and undermines the learning experience, which can be tested through future research. Students who viewed Twitter as valuable for their education praised it as a platform for exposing themselves to different views they had not considered. Some students recognized that having to condense their thoughts into tweets was a good skill to develop, as exasperating as it was to confine their speech.

The major way focus group participants saw Twitter undermining their learning experience was its ability to distract them from class, particularly due to the temptation to open other apps on their phones before tuning back in to the lecture. As we noted in the literature review, several studies have concluded that Twitter has the potential to contribute to students’ learning experience (e.g., Cole, Hibbert, & Kehoe, 2013; Junco, Heibergert, & Loken, 2011; Kim et al., 2015); however, in another study, students emphasized the distracting nature of Twitter and did not think it should be used in large lecture classes (see Varadarajan, 2011). We believe that Varadarajan’s differing results might be due to the absence of designated Twitter intervals, given the difference the intervals made to our students’ experiences.


Understanding the adoption of Twitter in the classroom through an open-ended question format provided rich data from students’ perspectives. We believe part of the value of this study lies in recommendations about how Twitter should be integrated into the large lecture classroom with regard to frequency of tweets, guidelines for insisting on intellectual tweets (reinforced via scoring), and potential effects of projecting tweets onto the classroom wall––for those instructors choosing to integrate it. With these recommendations also comes caution about students’ temptation to continue using their phones for non-class activity following Twitter intervals, and about the significant investment of instructor and teaching assistant time, at least with the approach we took. As additional studies are conducted, we will continue to learn more about the pedagogical use of this resource.


Boyd, D. (2014). It’s complicated: The social lives of networked teens. New Haven, CT: Yale University Press.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 3, 7.

Cole, M. L., Hibbert, D. B., & Kehoe, E. J. (2013). Students’ perceptions of using Twitter to interact with the instructor during lectures for a large-enrollment chemistry course. Journal of Chemical Education, 90, 671–672. doi:10.1021/ed3005825

D’Angelo, J. M., & Woosley, S. A. (2007). Technology in the classroom: Friend or foe? Education, 127, 462–472.

Elavsky, C. M., Kumanyika, C., & Mislan, C. (2011). Disrupting or developing discourse? Twitter and the microprocesses of learning community in the media studies classroom. Paper presented at the meeting of the International Communication Association, Boston, MA. Retrieved from http://citation.allacademic.com/meta/p491771_index.html

Glaser, B., & Strauss, A. (1967). The discovery of grounded theory: Strategies for qualitative research. New York, NY: de Gruyter.

Ito, M., Antin, J., Finn, M., Law, A., Manion, A., Mitnick, S., … & Horst, H. A. (2009). Hanging out, messing around, and geeking out: Kids living and learning with new media. Cambridge, MA: MIT Press.

Junco, R., Elavsky, C. M., & Heibergert, G. (2013). Putting Twitter to the test: Assessing outcomes for student collaboration, engagement and success. British Journal of Educational Technology, 44, 273–287. doi:10.1111/j.1467-8535.2012.01284.x

Junco, R., Heibergert, G., & Loken, E. (2011). The effect of Twitter on college student engagement and grades. Journal of Computer Assisted Learning, 27(2), 119–132. doi:10.1111/j.1365-2729.2010.00387.x

Kim, Y., Jeong, S., Ji, Y., Lee, S., Kwon, K. H., & Jeon, J. W. (2015). Smartphone response system using Twitter to enable effective interaction and improve engagement in large classrooms. IEEE Transactions on Education, 58, 98–103. doi:10.1109/TE.2014.2329651

Krueger, R. A., & Casey, M. A. (2000). Focus groups: A practical guide for research (3rd ed.). Thousand Oaks, CA: Sage.

Lee, R. M. (1993). Doing research on sensitive topics. London, England: Sage.

Lindlof, T. R., & Taylor, B. C. (2002). Qualitative communication research methods (2nd ed.). Thousand Oaks, CA: Sage.

Madriz, E. (2000). Focus groups in feminist research. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 835–850). Thousand Oaks, CA: Sage.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.

Patton, M. Q. (1990). Qualitative evaluation and research methods (2nd ed.). Newbury Park, CA: Sage.

Perrin, A. (2015, October 8). Social media usage: 2005-2015. Retrieved from http://www.pewinternet.org/2015/10/08/social-networking-usage-2005-2015/

Pollard, E. A. (2014). Tweeting on the backchannel of the jumbo-sized lecture hall: Maximizing collective learning in a world history survey. The History Teacher, 47, 329–354.

Porter Novelli. (2008). Intelligent dialogue: Millennials [White paper]. Retrieved from http://www.porternovelli.com/intelligence/millennials

Rovai, A. P. (2002). Sense of community, perceived cognitive learning and persistence in asynchronous learning networks. The Internet and Higher Education, 5, 319–332.

Rovai, A. P., & Lucking, R. A. (2000). Measuring sense of classroom community. Paper presented at Learning 2000: Reassessing the Virtual University, Virginia Tech, Roanoke, VA.

Stoerger, S., & Kreiger, D. (2016). Transforming a large-lecture course into an active, engaging, and collaborative learning environment. Education for Information, 32, 11–26. doi:10.3233/EFI-150967

Tatone, J. (2016). Integrating contemplative learning into new media literacy: Heightening self-awareness and critical consciousness for enriched relationships with new media ecologies (Master’s thesis). University of Oregon.

Tyma, A. (2011). Connecting with what is out there! Using Twitter in the large lecture. Communication Teacher, 25(3), 175–181.

Válek, J., & Sládek, P. (2012). Immersed into digital world. Learning and students’ perception. Social and Behavioral Sciences, 69, 1866–1870.

Varadarajan, R. (2011). Use of Twitter to encourage interaction in a multi-campus pharmacy management course. American Journal of Pharmaceutical Education, 75(5), 1.

Wighting, M. J. (2006). Effects of computer use on high school students’ sense of community. The Journal of Educational Research, 99, 371–379.

Appendix: Focus Group Protocol

Materials: IRB forms, name tags, demographic forms, and snacks. Check the recorder.

Why we’re doing this study:

  • Help us in our teaching.
  • Help other university professors who are considering tech options in large lecture classes.
  • Part of our job is research.

Ground rules:
  • Try not to interrupt or talk over anyone.
  • Different opinions are welcome.
  • Please be completely honest with your feedback.
  • Concrete examples and stories are especially helpful.

Questions (Note: For space considerations, only the major questions are provided here. Probes are not included.)

  1. How long ago did you join Twitter and why did you join it?
  2. For those of you who used Twitter prior to J201, what were your experiences with using Twitter?
  3. What were your initial thoughts and feelings upon finding out that you would be asked to tweet to a class hashtag during our class?
  4. What was it like during the first couple of weeks when you were tweeting throughout class?
  5. How did you feel about having the live Twitter feed projected on the wall?
  6. Can you describe the experience you had when you posted your first tweets to the #UOJ201 hashtag?
  7. What are your thoughts about when [Tiffany/I] shifted from having you tweet throughout class to having designated intervals for tweeting during class?
  8. What are your thoughts about the tweets on our class hashtag?
  9. Can anyone talk about interacting with others on the hashtag and what that experience was like?
  10. How do you decide what to tweet?
  11. Can you describe the ways in which using Twitter as part of the large classroom experience engaged, distracted or, in some other way, affected you?
  12. Do you think cell phones should be used in large lecture classes? Why or why not?
  13. Can you talk about your thoughts on the ideal college classroom experience in a large lecture class – what student technology, if any, works best for you – including not just Twitter but any social media and any classroom response technology, such as Top Hat or the iClicker?
  14. Time pending: Have you talked to others about your use of Twitter in the classroom and, if so, in what ways did you describe the experience to them?
  15. Time pending: What are some of the general thoughts and feelings you have toward class use of Twitter and other social media, both in and out of the classroom?
  16. Is there anything you would like to add?


The Best of Both Worlds: Student Perspectives on Student-Run Advertising and Public Relations Agencies

  • Joyce Haley, Abilene Christian University
  • Margaret Ritsch, Texas Christian University
  • Jessica Smith, Abilene Christian University


Student-led advertising and/or public relations agencies have increasingly become an educational component of university ad/PR programs. Previous research has established the value that advisers see in the agencies, and this study reports student perceptions of agency involvement. The survey (N = 210) found that participants rated the opportunity to work with real clients, the importance of their universities having agencies, and the increase in their own job marketability as the most positive aspects of the agency experience. Participants said that the most highly rated skills that agency participation built were the ability to work with clients, working in a team structure, and interpersonal skills.

Keywords: Student-led agencies, public relations, advertising, skills

Haley, J., Ritsch, M., & Smith, J. (2016). The best of both worlds: Student perspectives on student-run advertising and public relations agencies. Journal of Public Relations Education, 2(1), 19–33.



Student-run public relations agencies have existed for nearly 40 years. Self-identifying as the nation’s oldest student-run public relations agency, PRLab at Boston University was founded in 1978. By 1989, eight student-run advertising agencies had been established in such places as the University of Oregon and the University of Illinois (Avery & Marra, 1992). By 2010, a study of student-run public relations agencies identified 119 such firms (Maben, 2010).

In the past decade, the student-led agency has increasingly become a component of university public relations and advertising programs. Bush and Miller (2011) found that nearly 60% of participating agencies had existed for fewer than 6 years and almost 15% of agencies had existed for less than a year. Further, Busch (2013) reported that 55% were established after 2007.

Likewise, research about student agencies is also a relatively unexplored frontier. The available research examines the pedagogical value of the agency from the adviser or educator perspective. Many of the studies are qualitative. This study adds the student perspective of the learning experience to the existing literature.

In this survey, students and alumni report that the student agency provides an experiential learning opportunity that gives students the chance to apply the knowledge gleaned from the classroom to client work, performed in a professional environment with faculty adviser guidance. Participants reported their perceptions of how agency experience helped them develop skills required for employment in strategic communication fields.

Survey results indicate that campus-based advertising and public relations agencies can offer a powerful learning environment in higher education. The experience enables the development of skills that are important to employers: teamwork, written and oral communication, and interpersonal skills, as well as reliability and problem-solving ability (Battle, Morimito & Reber, 2007; Commission Report, 2006; Paskin, 2013; Todd, 2009). This paper begins with a review of literature that has examined education in strategic communication, the value of experiential learning, and the growth and performance of student agencies. It continues by describing the survey methodology employed, sharing results, and discussing implications of the findings.

Literature Review

Strategic communication educators periodically receive input from industry professionals regarding skills requirements for entry-level practitioners. The Commission on Public Relations Education issued a report in 2006 (an update of an earlier report issued in 1999) titled The Professional Bond: Public Relations Education for the 21st Century. In both the 1999 and 2006 reports, the commission identified gaps between what public relations majors were able to do upon graduation and what PR professionals required of entry-level employees. Among the most desired attributes were writing, critical thinking, and problem-solving skills. Professionals deemed graduates lacking in all of these areas. The commission called on faculty to balance the teaching of writing skills with instruction in “higher order knowledge” like strategic thinking and management skills. Gaining practical experience was highly recommended and was named a key factor in students obtaining entry-level positions (Commission Report, 2006).

Several recent studies indicate that this gap continues to exist. Todd (2009) reported that PRSSA professional advisers thought skills taught in classes and skills needed in industry were mismatched. Todd said professional advisers placed higher value on “a curriculum that emphasizes practical experience in new media, internships, preparing students for their first job, and ‘hands-on’ experience” (Todd, 2009).

There is a wide divide between how recent graduates employed in entry-level public relations positions view their job skills and how their supervisors rate them, according to Todd (2014). Practitioners who had been working in the field for 2 years or less believed their performance to be average to above average on skills and professional characteristics. Supervisors rated them significantly poorer on all but two skills, social media and computers. The greatest disparity in technical skills was in the evaluation of writing ability, followed by oral and research skills. Of the 16 professional characteristics measured, the largest divide occurred in critical thinking, dependability, attention to detail, following instructions, time management and accepting responsibility.

Industry supervisors placed “real life industry experience in the classroom” as their top suggestion to improve professional performance (Todd, 2014). Entry-level personnel ranked that suggestion second after business etiquette courses. Obtaining multiple internships ranked as second for industry supervisors and third for entry-level personnel. Supervisors also suggested increasing opportunities for writing with constructive criticism and requiring students to gain more writing practice.

The professional expectations of integrated marketing communications practitioners mirror those required of dedicated public relations professionals. Students entering IMC fields should have strong communication skills, strategic and conceptual thinking, interpersonal skills and professionalism (Battle et al., 2007; Beachboard & Weidman, 2013).

Professional application of new media tools has become essential to the practice of advertising and public relations. But, when asked to compare the importance of new media skills to traditional skills, professionals said a foundation of basic skills like writing, communication and strategic thinking should take precedence. Teaching traditional skills within the context of new media applications was considered to be ideal (Paskin, 2013).

Experiential Learning in the Curriculum

Kolb (2014) focuses on experiential learning and suggests that most disciplines would be well served to go beyond imparting factual information to helping students place the information in a conceptual framework so they can use it in varied settings. Experiential learning is “the process whereby knowledge is created through the transformation of experience” (Kolb, 1984, p. 41). In this approach, learning is a process of relearning that requires learners to adapt as they interact with their environments. Experiential learning connects learning, thinking, and doing in a continuous loop.

Experiential learning activities have long been available for journalism majors through working on a school newspaper or yearbook. This applied learning experience increases the likelihood that students entering the journalism field truly understand the discipline and secure a job immediately upon graduation, according to Feldman (1995).

Within the public relations and advertising curriculum, experiential learning is typically facilitated through the capstone campaigns course, internships, and service learning. Students rated a campaigns course as highly effective in helping them develop the professional skills of writing and editing, strategic planning, teamwork, research, client relations and managerial skills. They also ranked service-learning high for its ability to deliver an opportunity to apply course knowledge to the real world and to build confidence and leadership (Werder & Strand, 2011). Yet, the campaigns course generally provides minimal client contact. According to Benigni, Cheng and Cameron (2004) more than half of professors report having client contact only one to three times per semester. The class also tends to have short-term technical tasks that must be repeated rather than focusing on “evolved management function” (p. 270).

Muturi, An, and Mwangi (2013) write that students report a high level of motivation from service learning projects, viewing them as an opportunity to “learn about the real world outside the classroom” (p. 401). A key motivating factor could be the “desire to move away from hypothetical classroom situations and into a real-world setting as the site for education” (p. 400), suggesting they would have a positive attitude toward any project that would meet these needs.

Professionals ranked having an internship/practicum or work-study program among the top five out of 88 areas of public relations content (DiStaso, Stacks & Botan, 2009). But internships may be more task- than process-oriented, thus not facilitating higher-level knowledge (Neff, 2002). The student agency delivers a form of experiential learning that facilitates a “cycle of learning” where the learner “touches all the bases – experiencing, reflecting, thinking, and acting” (Kolb & Kolb, 2009, p. 298).

Growth in the Number and Size of Student Firms

An analysis of undergraduate student-run public relations firms on U.S. college campuses in 2010 identified 119 agencies (Maben, 2010). Advisers representing 55 of these agencies responded to the online survey, reporting an average agency age of 9.36 years, with 22 having been in operation for 4 years or fewer. Thirteen had existed over 15 years, and the oldest was 37 years. Bush and Miller (2011) found that nearly 60% of advisers worked with agencies that had existed fewer than 6 years. Almost 15% of respondents advised firms that were less than a year old.

Busch (2013) analyzed the online presence of advertising and public relations agencies and found that only 19% of the analyzed agencies were established before 2000. More than half began after 2007. Busch found that agencies get larger over time. Seventy-five percent of small agencies (fewer than 25 members) were founded after 2007. All of the large agencies (more than 50 members) were founded before 2007. Taken together, these studies indicate that student-run agencies are a relatively recent trend.

Structure of Student-Run Agencies

Bush and Miller (2011) found that advisers of 51% of student-run firms described them as focused on integrated communications, followed by about a third primarily focused on public relations, and 9% focused on advertising. Agencies were evenly split between schools offering credit for participation and those that did not. Just over half operated out of journalism/mass communication programs, and 40% were student organizations, most commonly affiliated with a professional organization such as PRSSA. Just over a third had a dedicated workspace. The service most frequently provided for clients was social media (89.6%), followed by event planning (87.5%), and campus posters (85.4%). Full campaigns were implemented by 83% of firms.

Of business processes, the most common practices were weekly meetings (89.6%) and client contracts and staff orientation (at 77.1% each). Less than half used planning briefs or time sheets, and only 32.6% tracked billable hours. A majority of student agencies have implemented standard industry business practices such as job descriptions, approval and reporting hierarchies, an application and interview process, and client billing (Bush & Miller, 2011). These findings were supported by Maben (2010). Bush and Miller (2011) found that nearly 90% of agencies provide leadership opportunities for students. Fewer than half have creative teams or media directors. Nearly 60% invoice clients for their services (Bush & Miller, 2011). Maben (2010) indicated that nearly half of agencies charge clients for services.

Prior to 2009, publications about student agencies focused on case studies of individual firms (Swanson, 2011). Bush (2009) evaluated pedagogical benefits and suggested agency structures (or types) that were best prepared to deliver these benefits. Another focus of the study explored features that appeared to enhance agency sustainability. To provide a consistent platform for teaching and learning, student agencies need stability. More than 20% of the sample agencies in the Bush and Miller (2011) study had gone out of existence and revived, a few more than once. Bush (2009) suggested that agencies with the greatest likelihood of longevity had well-established structures with teams and job titles, and used business procedures including job applications and performance assessments. Additionally, clients were charged for services and the firm had a dedicated office space. Academic course credit and set meeting times provided accountability for student performance. Some students were paid. Faculty advisers were compensated, generally through a course release or overtime pay. Services provided to clients required both task and process-oriented skills.

Agencies that are operated through journalism/mass communication programs reported having more of the variables that contribute to sustainability than those run as student organizations (Bush, 2009). Those connected to JMC programs were significantly more likely to have an office that included technology and to charge clients for their services. Advisers of these programs report spending more time in their advising roles.

Among the significant challenges to stability and consistency were funding and university support. Bush and Miller (2011) found that nearly two-thirds of agencies received no university funding. Only 2% received funding at levels consistent with other student media. Seventy-five percent of agencies in Maben (2010) reported receiving no university funding.

Bush and Miller (2011) found that almost 40% of advisers described their advising as more time-consuming than teaching other courses, and about 20% reported spending the same amount of time. Eighty percent did not receive a course release or overload pay, and their advising did not count as service for tenure and promotion. Those who received compensation generally spent more time than advisers who took on the role as faculty service (Bush, 2009).

Agencies identified by Bush (2009) as having the greatest risk of dissolving had little student accountability, were volunteer-based with no application process, operated with few business protocols, and had no dedicated office space. These less stable student agencies functioned entirely as a student organization or club, and the quality of student leadership varied from year to year. Few of the 55 agencies Maben (2010) studied were this type of agency.

The importance of a dedicated office space to sustainability is unclear. Firms in existence the longest were less likely to have dedicated office space, according to Maben (2010). Only 38% of agencies in the Bush and Miller (2011) study were housed in a dedicated space.

One university's agency model illustrates an approach that may circumvent the sustainability and university-support issues: the university established a PR firm and integrated it with a required senior-level capstone course, so that faculty involvement is included in a regular course load (Swanson, 2011).

Adviser Perceptions of the Educational Value of Student-Run Firms

Previous research indicates that advisers of student-run agencies believe in the educational value of the agency model. Two-thirds said they believe student agencies are “extremely beneficial to student learning” (Bush & Miller, 2011, p. 488). They are viewed to be “highly beneficial to public relations pedagogy in the two areas that are most difficult to teach: Process-oriented experiential learning and professional skills,” according to Bush (2009, p. 35). The advisers articulated another benefit: the facilitation of career choice and opportunities.

Among the professional skills learned, the top benefit cited was the experience of working directly with clients (Bush & Miller, 2011). Learning to manage client relationships, anticipate issues, and deal with clients who change direction were commonly cited benefits (Maben, 2010). A majority of advisers said the agency experience benefited students by giving them the chance to apply their classroom learning to immediate client challenges and to practice business processes within the context of a professional environment (Bush & Miller, 2011). Advisers report that applied learning occurs with research, writing, strategic planning, event planning, media pitching and other client services (Maben, 2010).

Students participating in agencies were observed to grow in confidence, take on responsibility, solve problems, provide leadership that inspires others to follow, work effectively in teams, and manage deadlines (Bush, 2009; Maben, 2010). Maben (2010) said the agency provided a place where students “gain confidence in their ability to think independently and to take on new challenges” (p. 87). Students learned the skills of negotiating with others and of giving and accepting constructive feedback. Advisers said that the experience helped students believe they could succeed in professional agencies.

One important aspect of learning was leadership. Bush (2009) reported that “most questions for advisers are management questions – team membership, client relationships and how to deal with employees” (p. 32). The student agency structure typically gives students experience with more disciplined business practices than are offered in other experiential courses like the campaigns course, Bush found. Students have an opportunity to learn to apply a process approach and critical thinking within the professional environment (Bush, 2009).

Agency experience on a résumé can open doors to internships and employment. Bush and Miller (2011) found that half of advisers report that students often receive job or internship opportunities based upon having the agency experience. Another 42% report that students sometimes are afforded these opportunities. Maben (2010) reported that a student-run agency listed on applicants' résumés automatically earned interviews. Students with agency experience were able to obtain top internships and secure jobs more quickly, sometimes above entry level (Bush, 2009). Maben (2010) said that “the whole experience sets them apart from students who have no practical experience” (p. 89).

Adviser time commitment was positively correlated with advisers' perceptions of students' skill development (in areas such as writing press releases and graphic design) and of the agency's overall benefit to student learning (Bush, 2009; Bush & Miller, 2011). Having dedicated office space enhanced learning outcomes in skills application, understanding business processes, and developing professional skills (Bush & Miller, 2011).

Adviser responses were overwhelmingly positive regarding the agency experience, with a few reporting that their firms were too young to predict outcomes or that they believed the effects to be neutral (Maben, 2010). Challenges cited include keeping students motivated and managing client expectations. Advisers reported that “client expectations were often either too high or too low” (Bush, 2009, p. 33).

Research Questions

The literature provides a solid view of the value advisers see in student-run ad/PR agencies, so this study will focus on the perspective of student participants. Three research questions guide the paper.

RQ 1. What were students' experiences in agency participation?
RQ 2. How did agency experience affect student skills?
RQ 3. How did agency participation affect opinions about university structure for agencies?

Method


An online survey targeted people who had worked in a student ad/PR agency, either as current university students or as graduates within the previous 2 years. The survey had 28 items. Participants reported gender, major and year of graduation. Participants rated their level of agreement with nine items about their agency experience:

  • I feel better prepared for the professional expectations of the workplace.
  • The experience allowed me to learn at a deeper level the concepts covered in previous coursework.
  • I feel more confident in my abilities.
  • It is/was important for my learning to work directly for real clients.
  • The experience has enhanced my “marketability” as a job candidate.
  • Being in a responsible, dedicated job role is/was one of the most valuable things about the experience.
  • It is important for my college or university to have a student agency for ad/PR.
  • I have gained a greater sensitivity for people who are different from me (a difference such as racial or ethnic background, sexual orientation, disability).

Participants also rated the effect of agency work on their development of 10 types of skills:

  • Working within a team structure
  • Interpersonal skills
  • Problem-solving
  • Leadership
  • Writing
  • Working with clients
  • Understanding new media
  • Strategic planning
  • Production skills like graphic or web design
  • Business practices like budgeting, timekeeping, billing

Participants reported their level of agreement with the need for a student agency to have a dedicated space (instead of a classroom) and a faculty adviser who is readily available to students when they need guidance. They reported whether they received academic credit, a stipend and/or pay for participating in the student agency. They also reported how many hours per week on average they worked in the student agency.

A Qualtrics survey link was emailed to faculty advisers of student advertising and/or public relations agencies at 61 U.S. colleges and universities. The list included student agencies at both public and private colleges and universities, geographically dispersed across the U.S. To compile the list, the authors drew from PRSSA's roster of affiliated student agencies and the list of schools accredited by the Accrediting Council on Education in Journalism and Mass Communication. From the combined list, the authors searched for current adviser contacts. Three online searches were conducted to obtain faculty adviser names: first on student agency websites, then on the host university website, and finally through a general search-engine query using the agency name and the term “faculty adviser.”

The 61 faculty advisers were asked to distribute the survey link to the students who worked at the student agency either currently or in the previous 2 years.

Results
Table 1


Out of 227 responses, 210 people provided informed consent and are considered participants. Not all participants answered every question. Participants were primarily female (n = 164, 80%), and two-thirds of participants were advertising or public relations majors in college (see Table 1).

Nearly three-quarters of participants were currently working at a student-run agency (n = 153, 74.3%), and the rest had worked at an agency in the past. Participants reported whether they received academic credit, pay, neither, or both for their agency service. Academic credit alone was the most common response (n = 114, 55.1%), followed by neither academic credit nor pay (n = 65, 31.4%), both academic credit and pay (n = 19, 9.2%), and pay alone (n = 9, 4.3%). Slightly more than half of participants spent 6 hours or fewer per week in their agency roles (see Table 2).

Table 2

Research Question 1

Nine questions measured various aspects of students' agency experiences. Mean responses above 4.25 on all items except one indicated high levels of agreement with the statements (see Table 3).

Table 3

Agreement was particularly high among participants that it was important for colleges or universities to have a student-run agency and that it was important for their learning to work directly with real clients. The only item to receive moderate agreement was the statement that participants gained greater sensitivity for people who were different from them. There were no statistically significant differences by gender on any item.

Participants who were currently working for a student-run agency had a higher estimation of how the experience would enhance their marketability as a job candidate (M = 4.56, SD = .67) than participants who had worked for agencies in the past (M = 4.21, SD = 1.03). This difference was statistically significant, t(204) = -2.82, p < .01.
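For readers who want to check such comparisons, a t value like the one reported above can be reproduced from the summary statistics alone with a pooled two-sample t-test. This is an illustrative sketch, not the authors' analysis code; the group size of 53 past participants is an assumption inferred from the reported df of 204 (206 responses minus 2), since only the 153 current participants are reported directly.

```python
import math

def pooled_t(mean1, sd1, n1, mean2, sd2, n2):
    """Two-sample t statistic with pooled variance (equal-variance assumption)."""
    df = n1 + n2 - 2
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (mean1 - mean2) / se, df

# Marketability item: past-agency participants (M = 4.21, SD = 1.03) vs.
# current participants (M = 4.56, SD = 0.67). n1 = 53 is an assumption
# inferred from the reported df; n2 = 153 is reported in the sample description.
t, df = pooled_t(4.21, 1.03, 53, 4.56, 0.67, 153)
print(round(t, 2), df)  # recovers the reported t(204) = -2.82
```

Plugging in the reported means, standard deviations, and the assumed group sizes recovers the published statistic, which is a quick sanity check on the inferred sample split.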

The number of hours worked per week affected participants’ judgment of two experience variables. The item “The experience allowed me to learn at a deeper level the concepts covered in previous coursework” showed a significant difference, F(2, 205) = 4.24, p = .02. Participants who worked 15 or more hours per week rated their conceptual learning higher (M = 4.77, SD = .50) than participants who worked 1-6 hours per week (M = 4.28, SD = .83) and students who worked 7-14 hours per week (M = 4.39, SD = .89). These are statistically different means according to Games-Howell post-hoc tests. Students who worked 1-6 hours per week did not differ significantly from students who worked 7-14 hours per week.

The item “I feel more confident in my abilities” also showed a significant difference, F(2, 205) = 4.29, p = .02. A Games-Howell post-hoc test showed that participants who worked 15 hours or more per week rated their confidence higher (M = 4.60, SD = .62) than participants who worked 1-6 hours per week (M = 4.15, SD = .75). Participants who reported working 7-14 hours per week were not significantly different from either of the other groups.
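The omnibus tests reported in this section are one-way ANOVAs. As a hedged sketch of that computation (the ratings below are invented purely for illustration and are not the study's responses), the F statistic partitions variance between and within the hour-bin groups:

```python
def one_way_anova(groups):
    """F statistic for a one-way ANOVA across a list of groups."""
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: weighted squared deviations of group means
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: deviations from each group's own mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within), df_between, df_within

# Hypothetical ratings for three hour bins (1-6, 7-14, 15+ hours per week);
# these values are made up to demonstrate the arithmetic only.
f, df1, df2 = one_way_anova([[1, 2, 3], [2, 3, 4], [4, 5, 6]])
print(f, df1, df2)
```

A significant F, as in the results above, only says the group means differ somewhere; post-hoc tests such as Tukey HSD (equal variances) or Games-Howell (unequal variances), as used in this study, identify which pairs differ.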

Research Question 2

Students evaluated the effect that agency work had on 10 types of skills (see Table 4). Working with clients was the skill that had the highest mean rating (M = 4.45, SD = .76), and production skills was the item that was lowest (M = 3.34, SD = 1.04). Skill development did not vary by gender or by whether participants were currently engaged in agency work or had been in the past.

Table 4

Participants who had graduated were more likely to say that agency participation had helped their production skills (M = 3.52, SD = .92) than current students did (M = 3.18, SD = 1.19), t(198) = 2.23, p = .03.

The number of hours worked per week at the agency affected six of the skills variables. Students who reported working 15 or more hours per week rated the effect on their ability to work within a team structure higher (M = 4.63, SD = .49) than students who worked 1-6 hours per week (M = 4.25, SD = .71), a significant difference according to a Tukey post-hoc test on the omnibus F(2, 199) = 3.79, p = .02. The students who worked 7-14 hours per week (M = 4.44, SD = .77) did not differ significantly from the other groups.

A one-way ANOVA examining problem-solving skills by hours worked per week was significant, F(2, 197) = 4.02, p = .02. Post-hoc tests using Tukey HSD showed that students who reported working 15 or more hours per week rated the effect on their problem-solving skills higher (M = 4.59, SD = .50) than students who worked 1-6 hours per week (M = 4.10, SD = .82). The students who worked 7-14 hours per week (M = 4.20, SD = .91) did not differ significantly from the other groups.

A one-way ANOVA examining leadership skills by hours worked per week was significant, F(2, 197) = 5.54, p < .01. Post-hoc tests using Tukey HSD showed that students who reported working 15 or more hours per week rated the effect on their leadership skills higher (M = 4.62, SD = .56) than students who worked 1-6 hours per week (M = 4.07, SD = .84). The students who worked 7-14 hours per week (M = 4.33, SD = .91) did not differ significantly from the other groups.

A one-way ANOVA examining skills working with clients by hours worked per week was significant, F(2, 197) = 7.64, p = .001. Games-Howell post-hoc tests showed that students who reported working 15 or more hours per week rated the effect on their client skills higher (M = 4.83, SD = .38) than students who worked 1-6 hours per week (M = 4.27, SD = .79). Students who reported working 7-14 hours per week rated the effect on their client skills higher (M = 4.56, SD = .77) than students who worked 1-6 hours per week but did not differ significantly from students working 15 or more hours per week.

A one-way ANOVA examining new media skills by hours worked per week was significant, F(2, 196) = 6.39, p < .01. Tukey HSD post-hoc tests showed that students who reported working 15 or more hours per week rated the effect on their new media skills higher (M = 4.41, SD = .68) than students who worked 1-6 hours per week (M = 3.83, SD = .91) and students who reported working 7-14 hours per week (M = 3.69, SD = 1.04). There was no significant difference between students working 1-6 hours per week and students working 7-14 hours per week.

A one-way ANOVA examining production skills by hours worked per week was significant, F(2, 197) = 4.82, p < .01. Games-Howell post-hoc tests showed that students who reported working 15 or more hours per week rated the effect on their production skills higher (M = 3.90, SD = .86) than students who worked 1-6 hours per week (M = 3.21, SD = 1.07) and students who reported working 7-14 hours per week (M = 3.29, SD = 1.13). There was no significant difference between students working 1-6 hours per week and students working 7-14 hours per week.

Research Question 3

Participants indicated moderate support for the need for exclusive resources. The mean response indicated that it is moderately important for the student agency to have a dedicated space instead of a classroom (M = 3.51, SD = .65). The mean response for the need for a faculty adviser who is readily available to students was 3.74 (SD = .50). Neither item differed significantly by gender, and the adviser item showed no significant differences by affiliation or by hours worked. Agency alumni did think it was more important (M = 3.62, SD = .55) than current students did (M = 3.42, SD = .72) for the agency to have a dedicated space, t(202) = 2.08, p = .04.

The more hours participants spent working at the agency, the higher their average rating of the importance of a dedicated space, F(2, 201) = 11.79, p < .001. Students who worked 1-6 hours a week rated this 3.31 (SD = .71) and were significantly different from students who worked 7-14 hours a week (M = 3.66, SD = .56) and 15 or more hours a week (M = 3.86, SD = .35), according to Games-Howell post-hoc tests. Students working 7-14 hours per week and students working 15 or more hours per week were not significantly different.

Participant Comments

Participants responded to an open-ended question: “Why did you choose to participate in your college or university's student agency?”

A total of 214 responses revealed a range of reasons, most commonly stated as “experience.” The word “experience” appeared in 68% (n = 146) of the responses in a variety of contexts, ranging from “real-world,” “real life,” and “professional” experience to “the experience of being in charge and making decisions rather than [being] a powerless intern.”

One participant’s response reveals this common theme:

The experience would be more hands-on than learning about strategy from books, in lectures, case studies and projects – working with real clients to solve their marketing problems and see proposed strategies come to fruition was extremely motivating and more gratifying than an “A” on a test.

Some of the students defined their motivation for joining a campus agency in terms of what they believed the experience would not be. “It would not be another class, you get to work with real clients,” wrote one. “I wanted the guarantee of receiving real world work experience versus the possibility of making copies or getting someone coffee,” wrote another.

The quote above reflects a related theme: a desire for a “real” professional experience. Students used the word “real” in 19% (n = 42) of the open-ended responses, with “real world” the most frequent way the term was used, appearing 23 times. Students wrote that they wanted to work with “real clients” and to gain “real-life,” “real job,” “real agency,” and “real work” experience. Another student wrote: “The real-life experience can not be duplicated anywhere else. At the same time, it's a controlled environment. It's the best of both worlds.”

Discussion


Previous research shows that faculty advisers believe in the pedagogical benefits of student-run ad/PR agencies (Bush, 2009; Bush & Miller, 2011; Maben, 2010). Advisers who have championed this teaching tool, often giving their time to it without compensation (Bush & Miller, 2011), should be encouraged to know that this survey indicates students and alumni highly value the student agency experience. They join advisers in observing that agencies are able to facilitate process-oriented learning and develop professional skills (Bush, 2009; Bush & Miller, 2011; Maben, 2010). On most measures of skills and professional characteristics, participants rated the agencies' effects on their skills above 4 on a 5-point scale.

For decades, the public relations profession has charged academia with delivering instruction in “higher order knowledge” like strategic thinking and management skills (Commission Report, 2006; Neff, Walker, Smith, & Creedon, 1999). These skills are also considered important in the broader IMC field (Battle et al., 2007; Beachboard & Weidman, 2013). More recently, supervisors in the public relations field reported that their entry-level hires underperform in critical thinking, dependability, attention to detail, following instructions, time management and accepting responsibility (Todd, 2014). Student and alumni participants in this study gave the student agency experience high marks for developing their capacity for strategic thinking, problem solving, and leadership. Advisers report that the student firm facilitates “learning things you can't learn in a classroom” (Bush, 2009; Maben, 2010).

The capacity to collaborate successfully is a professional characteristic developed by the student firm experience (Bush, 2009; Maben, 2010). Students placed the ability to work with others at the top of the list of skills enhanced by agency participation. The three most highly rated skills were proficiency in working with clients, ability to work within a team structure, and growth in interpersonal skills.

Professionals recommend that students gain practical and hands-on experience. They further recommend that curriculum be designed to deliver some of these opportunities (Commission Report, 2006). One key area where the student firm delivers practical experience is in working directly with real clients. In this study, participants said this was the most important experience gained and the area in which they grew more than any other. Advisers also cited the client interface as the top benefit (Bush & Miller, 2011). Traditionally the campaigns course has encapsulated hands-on learning, but it generally provides minimal client contact (Benigni et al., 2004).

Advisers rate the application of classroom learning as the third-ranked benefit (cited by 85%), after client contact and portfolio building (Bush & Miller, 2011). This survey showed that students who invested more time in working at the firm (15 or more hours per week) rated the experience more highly for allowing them to learn at a deeper level the concepts covered in previous coursework than did students who spent 6 or fewer hours. This finding suggests that students should be encouraged to invest 15 or more hours per week to learn and benefit the most from the experience.

Professionals point to practical experience as a key factor in students obtaining entry-level positions (Commission Report, 2006; Todd, 2009). Participants reported that they are well prepared to meet the requirements of the profession and that their marketability as job candidates had increased. Advisers also believe that the agency experience on a résumé enhances job opportunities (Bush, 2009; Bush & Miller, 2011; Maben, 2010).

Results suggest that although student firms provide above-average experiential learning in writing, production, new media and business practices, these are the lowest-rated areas for skill development. Agency advisers would do well to explore ways to enhance the learning in these areas.

Participants agreed strongly that colleges and universities should have a student-run agency for advertising and public relations. They rated as moderately important the university-provided resources of a dedicated office space and an adviser who is readily available. According to Bush and Miller (2011), fewer than 40% of agencies have a dedicated space. This survey didn't ask whether participants worked in agencies with dedicated space, but the presence of a facility could have affected participants' judgments of this factor. Perhaps the role of the adviser is only moderately valued because students may be unaware of the behind-the-scenes support required to obtain clients, facilitate staffing and training, and ensure that equipment and supplies are available. Additionally, some agencies have both a staff director and a faculty adviser. In these cases, the staff director's day-to-day role may meet the “readily available” need.

Limitations


The number of questions included in this survey was limited. The one previous attempt on record (Maben, 2010) to gather the views of students who had worked at student firms resulted in only five responses. In an effort to greatly increase the number of respondents, this study launched a survey that could be completed quickly. The questions provide a first look at student experiences, and the area warrants further research. The study also relied on current faculty advisers to contact current and former agency participants. The authors could not confirm that all advisers sent the survey out, nor did the survey ask participants to identify the school they attended. Future research would do well to collect data from a group that could be confirmed to be representative.

The survey was designed to produce results that allowed comparison of participants' experiences with previous research examining advisers' views; therefore, some questions about the need for university support, faculty advisers, and office space may not have been salient for many student respondents.

Conclusions


Participation in a student-run advertising and public relations firm, if designed well, can allow students to learn at a deeper level the concepts covered in their coursework. The experience can enable students to feel better prepared for the professional expectations of the workplace. They are likely to graduate with more confidence in their abilities, including problem-solving, interpersonal, and teamwork skills and the ability to work directly with clients. This study suggests that students who participate in a student-run agency strongly believe in the value of the experience and believe every campus should offer it. In general, the more time students spend working in a campus agency, the higher they rate their learning.

Undergraduate advertising and public relations programs that do not offer a student agency can learn much from this study, combined with related literature, as they consider creating such an experiential learning opportunity for their students. Programs that already have student firms will find data that may guide efforts to refine and improve the educational impact of the student-run advertising and public relations agency.

Future research might seek the perspectives of employers who have hired students with campus agency experience. Employers may be able to shed light on whether the experience helped a student land the job, and whether they believe it provided a good foundation for recent graduates' current responsibilities and prospects within the organization.

References


Avery, J. R., & Marra, J. L. (1992). Student-run advertising agency: A showcase for student work. Paper presented at the annual meeting of the Association for Education in Journalism and Mass Communication, Montreal, Canada. Retrieved March 1, 2015, from http://eric.ed.gov/?id=ED351711.

Battle, T. A., Morimoto, M., & Reber, B. H. (2007). Considerations for integrated marketing communications education: The needs and expectations from the communications workplace. Journal of Advertising Education, 11(2), 32-48.

Beachboard, M. R., & Weidman, L. M. (2013). Client-centered skill sets: What small IMC agencies need from college graduates. Journal of Advertising Education, 17(2).

Benigni, V., Cheng, I., & Cameron, G. T. (2004). The role of clients in the public relations campaigns course. Journalism and Mass Communication Educator, 59(3), 259-277.

Busch, A. M. (2013). A professional project surveying student-run advertising and public relations agencies at institutions with ACEJMC accredited programs. Theses and Professional Projects from the College of Journalism and Mass Communications, Paper 35. Retrieved March 1, 2015 from http://digitalcommons.unl.edu/journalismdiss/35.

Bush, L. (2009). Student public relations agencies: A qualitative study of the pedagogical benefits, risks, and a framework for success. Journalism and Mass Communication Educator, 64, 27-38.

Bush, L., & Miller, B. M. (2011). U.S. student-run agencies: Organization, attributes and adviser perceptions of student learning outcomes. Public Relations Review, 37(5), 485-491.

Commission on Public Relations Education (2006). The professional bond: Public relations education for the 21st century. Richmond, VA: Judy VanSlyke Turk, Ed. Retrieved March 1, 2015, from http://www.commpred.org/theprofessionalbond/index.php.

DiStaso, M. W., Stacks, D. W., & Botan, C.H. (2009). State of public relations education in the United States: 2006 report on a national survey of executives and academics. Public Relations Review, 35, 254-269.

Ellis, B. G. (1992, August). A case study of a student-run advertising/public relations agency: The Oregon State University experience. Paper presented at the annual meeting of the Association for Education in Journalism and Mass Communication, Montreal, Canada.

Feldman, B. (1995). Journalism career paths and experiential learning. Journalism and Mass Communication Educator, 50(2), 23-29.

Kolb, D. A. (2014). Experiential learning: Experience as the source of learning and development. (2nd ed.). Upper Saddle River, N.J.: Pearson.

Kolb, A. Y., & Kolb, D. A. (2009). The learning way: Meta-cognitive aspects of experiential learning. Simulation & Gaming, 40(3), 297-327.

Kolb, A. Y., & Kolb, D. A. (2005). Learning styles and learning spaces: Enhancing experiential learning in higher education. Academy of Management Learning & Education, 4(2), 193-212.

Maben, S. K. (2010). A mixed method analysis of undergraduate student-run public relations firms on U.S. college campuses (Unpublished doctoral dissertation). University of North Texas, Denton, Texas. Retrieved March 1, 2015, from http://digital.library.unt.edu/ark:/67531/metadc30486/m2/1/high_res_d/dissertation.pdf.

Muturi, N., An, S., & Mwangi, S. (2013). Students' expectations and motivations for service-learning in public relations. Journalism and Mass Communication Educator, 68(4), 387-408.

Neff, B. (2002). Integrating leadership processes: Redefining the principles course. Public Relations Review, 28(2), 137-147.

Neff, B., Walker, G., Smith, M. F., & Creedon, P. J. (1999). Outcomes desired by practitioners and academics. Public Relations Review, 25(1), 29-44.

Paskin, D. (2013). Attitudes and perceptions of public relations professionals towards graduating students' skills. Public Relations Review, 39, 251-253.

Swanson, D. J. (2011). The student-run public relations firm in an undergraduate program: Reaching learning and professional development goals through “real world” experience. Public Relations Review, 37, 499-505.

Swanson, D. J. (2014, April). Assessing learning and performance in the student-run communications agency. Paper presented to the annual meeting of the Western Social Sciences Association, Albuquerque, N.M. Retrieved March 1, 2015, from http://works.bepress.com/dswanson/71.

Todd, V. (2009). PRSSA faculty and professional advisors' perceptions of public relations curriculum, assessment of students' learning and faculty performance. Journalism and Mass Communication Educator, 64(1), 71-90.

Todd, V. (2014). Public relations supervisors and millennial entry-level practitioners rate entry-level job skills and professional characteristics. Public Relations Review, 40(5), 789-797.

Werder, K. P., & Strand, K. (2011). Measuring student outcomes: An assessment of service-learning in the public relations campaigns course. Public Relations Review, 37(5), 478-484.


Considering Certification? An Analysis of Universities’ Communication Certificates and Feedback from Public Relations Professionals


  • Julie O’Neil, Texas Christian University
  • Jacqueline Lambiase, Texas Christian University



Working professionals may need post-baccalaureate education, but finding time and resources to do so may be difficult. An analysis of 75 university master’s programs in public relations found 22 related programs offering communication certificates. A web audit of these programs, plus a survey and in-depth interviews, indicated professionals are interested in earning certificates, particularly in social and digital media strategy and measurement. Professionals want to attend certificate programs that combine online and face-to-face instruction.

Keywords: certification, public relations, communication certificates

O’Neil, J., & Lambiase, J. (2016). Considering Certification? An Analysis of Universities’ Communication Certificates and Feedback from Public Relations Professionals, 2(1), 34-46.




Introduction

U.S. News & World Report (2014) named public relations one of the top 100 careers of 2014. The Bureau of Labor Statistics predicted that from 2010 to 2020, job growth for public relations specialists would increase by 12% (2012a) and that growth for public relations management positions would increase by 21% (2012b). This growing number of public relations practitioners must keep up with evolving technology and industry trends and demands.

There is no shortage of options for ongoing training and learning in public relations, as noted by the Commission on Public Relations Education 2012 report. Professional associations, commercial enterprises and trade publications offer a plethora of webinars, face-to-face seminars, and publications to educate and train public relations practitioners. Some public relations practitioners take advantage of these training and learning opportunities, while others invest the time and money to pursue a graduate degree in public relations or a tangential field such as business. However, in addition to the offerings provided by professional entities and the in-depth master’s degree, there is yet another way for public relations professionals to learn and grow: earn a certificate from a university. To date, scant research exists on communication certificates offered by universities. This study seeks to address this gap by (1) analyzing, through a website audit, the types and structures of communication certificates offered by U.S. colleges and universities that also offer master’s degrees in public relations, and (2) examining public relations professionals’ preferences for certificates through an online survey and in-depth interviews.

Literature Review

Public Relations Graduate Education

Graduate education is growing particularly fast in public relations, fueled by the explosion of social and digital media and the accompanying job opportunities. Another driver of the growth of graduate programs in public relations is the need for universities to offset declining enrollments in journalism programs and cuts in state funding (Commission on PR Education, 2012). Since 2000, the number of master’s degree programs in public relations has increased from 26 to 75 (Commission on PR Education, 2012; Shen & Toth, 2013).

In its website audit of 75 graduate programs in public relations, the Commission on PR Education (2012) noted a lack of uniformity among master’s degree programs in public relations in terms of program titles, admission standards, required credit hours, and curriculum. Follow-up research by public relations scholars Briones and Toth (2013) found a lack of conformity among the programs in terms of adhering to recommended content areas provided by the Commission on PR Education 2012 report. Briones and Toth attributed the lack of uniformity among programs in part to the widely different graduate models in existence at universities. While the majority of programs offer a professional graduate degree in public relations, some offer an academic degree designed to prepare students for a Ph.D., while others provide a more interdisciplinary graduate degree.

The Commission on PR Education (2012) also reviewed the delivery methods of public relations graduate programs. Roughly 82% of programs offer traditional courses, which rely on face-to-face meetings and instruction. Approximately 10% of public relations programs use online delivery, and about 8% use a hybrid/blended delivery that includes online and in-person instruction. Despite the small number of programs offering online programs, the Commission on PR Education predicts that the number of online and hybrid graduate programs in public relations will increase. According to journalism educator and researcher Casteneda (2011), some journalism programs are partnering with outside vendors to develop certificates and online degrees using a shared-revenue agreement.

Professional perceptions of education delivery methods are mixed. For example, research indicates that public relations educators and practitioners view traditional programs more positively than online programs (Commission on PR Education, 2012; Toth, Shen, & Briones, 2012). However, outside of public relations, evaluations of online programs tend to be more positive. One evaluation of an online master’s program revealed that students listed improving themselves, advancing in their careers, personal reasons and securing new job opportunities as their top reasons for choosing an online program (Tokmak, Baturay & Fadde, 2013). Online courses can better accommodate working professionals who may be balancing family commitments (Wyland, Lester, Mone, & Winkel, 2013), and they enable people to take courses from universities located around the world (Gold & Jose, 2012).

Graduate programs can integrate in-person meetings with online sessions to create a hybrid program that combines benefits of both methods while still remaining flexible for students. In an 18-month Internet-Based Master’s in Educational Technology (iMet) program, students learn collaborative problem solving through teamwork assignments in a classroom and online platform (Cowan, 2012). Students of the iMet program “meet 25% face-to-face and 75% online” (p. 13). At the University of Nevada at Reno, a hybrid master’s degree program allowed the journalism school to create variety in its curriculum that accommodated students’ preferences without needing additional staff or resources (Coulson & Linn, 1995).

Certificate Programs

For students who desire to earn new skills but require a more cost-effective and/or more accessible program than a master’s program affords, a certificate program is a viable option. Universities are increasingly offering students the ability to complete courses to earn a post-baccalaureate certificate. Bosworth (2010) of Complete College America defines a certificate as a technical diploma comprising “credentials issued by educational institutions that indicate completion of a discrete program of study or series of courses” (p. i). Certificate programs provide credentialing for new skill sets in a shorter amount of time than a typical master’s degree. In the 2010-2011 academic year, public and private universities in the United States conferred more than 1.2 million certificates (National Center for Education Statistics, 2013). The knowledge and skills gained from a certificate program allow students to easily transfer what they learn to the workplace (National Center for Education Statistics, 2012). By teaching students more practical knowledge, the coursework included in a certificate program is typically more applicable for career advancement.

The demand for certificates is steadily increasing. Since 2000, conferred certificates have increased 64% (U.S. Department of Education, 2013). The increase in certificates demonstrates both a desire from students and a need for universities to provide these types of learning and training programs. Certificate programs range significantly in structure in terms of the amount of time and coursework needed for completion. However, compared to certificates completed in shorter periods of time, certificate programs lasting more than one year are linked with higher salaries (Bosworth, 2010). Public institutions award more one-year certificates compared to private universities, which account for only 5% of all certificate programs (Bosworth).

Certificate course delivery ranges from traditional and online methods to hybrid approaches that combine the two. In a study of an online library media certificate, researchers found that a strictly online education provided working adults access to the specialized knowledge needed to maintain and grow within their current positions (Meyer, Bruwelheide & Poulin, 2009). Professors from Northern Illinois University recommended that certificate programs of all types target more interdisciplinary approaches to courses (McFadden, Chen, Munroe, Natfzger, & Selinger, 2011). By providing overlap, students can learn how to address core subject areas more dynamically, and universities that can provide course crossover are better positioned to equip their students to think critically about their subject areas.

In 2013, the Public Relations Society of America began offering the Certificate in Public Relations Principles for university students majoring in public relations in their final year of school (PR accreditation, 2013). This certificate program is currently offered at 13 participating colleges and universities. The Universal Accreditation Board administers the certification exam (PR accreditation).

In summary, in response to growth in the public relations field and professionals’ need for ongoing training and learning, master’s degree programs in public relations continue to increase in number, as do certificate programs in multiple disciplines. Other than the 2013 PRSA certificate program, however, little is known about the ways universities are offering certificate programs related to communication professions, how those programs meet the specialized training and learning needs of public relations professionals, and whether public relations professionals see value in pursuing a certificate. This mixed-methods study therefore seeks to answer:

RQ1: What types of communication-related certificates do universities offer?
RQ2: Do public relations professionals value post-baccalaureate certificates, and if yes, what certificate features are most attractive?


Method

In the first phase of the project, researchers conducted a website audit of communication-related certificates offered at universities in the United States. The sampling frame included the 75 universities identified by the Commission on Public Relations Education (2012) as offering a master’s degree in public relations. Researchers first visited the websites of those 75 universities to determine whether each offered a certificate program in a communication-related field. Only post-baccalaureate certificates, not undergraduate ones, were systematically analyzed. Twenty-two of the 75 universities offer post-baccalaureate certificate programs, with four universities offering two types of certificates, for a total of 26 certificate programs analyzed. Researchers then systematically analyzed the offerings for certificate title, program costs, delivery method, and the ratio of courses to credits earned.

The second phase of the research included an online survey and in-depth interviews with public relations professionals. The survey instrument was created in Qualtrics and included questions about communication professionals’ ongoing training and learning needs and their interest in and preference for certificates, including content and delivery method.

Researchers secured approval from their university’s Institutional Review Board prior to data collection.

The researchers used purposive and snowball sampling to recruit practitioners for their online survey. Researchers emailed 30 presidents of local chapters of professional communication associations such as the Public Relations Society of America, Social Media Club, and the International Association of Business Communicators to ask them to forward the survey invitation to their respective members. The researchers also recruited participants via social media and personal contacts. One hundred and twelve participants completed the survey in February 2014. Participants live and work across the United States.

As part of the survey, participants were asked if they would be willing to be interviewed by the researchers. In this third phase of research, more than 20 participants were contacted after volunteering, and the researchers conducted in-depth interviews with 13 communication professionals. Each participant responded to nine open-ended questions that were related to their career and educational goals and professional development needs. Their responses were transcribed and analyzed by the researchers through an iterative reading process used to discover common themes and to produce descriptive summaries of participants’ ideas and suggestions. In this qualitative part of the overall study, participants created meaning with the researchers.


Results

Audit of Existing Certificate Programs

Table 1 displays information about the communication certificates offered by the 22 programs. Eighteen of those universities offer a single certificate program, with four offering two certificate programs; the following descriptive findings, then, are based on these 26 separate certificate programs. Seven programs use the term “marketing” in their titles, as in “Strategic Marketing” or “Integrated Marketing.” Five programs include the word “digital” in their titles, with three of those titles using “Digital Marketing”; the other two were called “Digital Storytelling” and “Digital Media Skills.” Five programs include the term “public relations” in their titles, while four use the term “strategic,” as in “Strategic Communication” or “Strategic Marketing.” Nine programs positioned themselves through terms related to new/social media or technology. The single or double appearances of the following terms indicate more focused programs on this list: “nonprofit,” “global,” “storytelling,” “ethics,” and “diversity.”

As indicated by Table 1, the majority of the graduate programs included in this website audit offer certificate completion in one year or less. Of the 22 schools examined, 12 offer certificate completion within 1 year, the shortest being a 3-month program at the University of the Pacific, followed by 9-month programs at Seton Hall University and Lasell College. Of the remaining 10 schools, eight offer certificate completion between a 1-year minimum and a 2-year maximum. Completion within this timeframe depends on a student’s course load and the availability of requisite courses. For example, Northwestern’s certificate in Strategic Marketing requires completion of five courses, which may be taken in just 2 semesters or across 2 years. Similarly, the certificate programs at West Virginia University allow a 1-year minimum and an 8-year maximum for completion, on a kind of “as you go” basis. Two other schools, the universities of Denver and Oregon, require a 2-year minimum for program completion and certificate attainment.


The ratio of courses to credit hours varied, but most programs followed credit-hour protocols borrowed from graduate programs (i.e., one course equals three credit hours). Six programs offered 5 courses/15 credits for certification, and six offered 6 courses/18 credits. Six other programs were based on three courses, with varied credit hours. The outliers in this study were a 2-course program at Seton Hall for 12 credits and a 7-course program at the University of Oregon for 28 credits. These differences account for some of the variation in program costs. Two universities, California State University-Fullerton and Georgetown University, offer Continuing Education Units (CEUs) rather than credit hours: students can earn 12 CEUs at California State University-Fullerton for completing six courses, and 10.8 CEUs at Georgetown, also for six courses.

While the largest share of these graduate certificate programs cost between $10,000 and $15,000, five of the 22 programs cost less than $5,500. Of these five, the most affordable is the University of the Pacific’s certificate in Social Media Business, offered at $495 with a completion time of 3 months. Second is California State University-Fullerton, offering a Digital Marketing certificate for between $2,700 and $3,600, depending on in-state or out-of-state tuition, completed in 10 months. Six programs cost between $7,000 and $9,900, beginning with Rowan University at $7,019 for a certificate in Public Relations, and ending with Lasell College at $9,825 for a 9-month certificate. Programs at 10 schools cost between $10,000 and $15,000, depending on students’ in-state or out-of-state classification and other factors. In the $20,000 bracket, the University of Oregon’s non-resident certificate cost of $22,198 begins the category, closely followed by George Washington University’s $23,674 Public Relations certificate and Northwestern’s $24,330 Strategic Marketing certificate. The most expensive certificate programs are those offered by Fairleigh Dickinson University, at $33,507 for a six-course certificate in Public Relations Administration, and Auburn University, at $56,208 for a Communication certificate completed in just two semesters.

The delivery models of the 22 university certificate programs varied as well. Eleven programs were online only and seven programs were classroom only, with four programs using a blended model.


Survey Findings

One hundred and twelve professionals responded to the survey, although only 85 fully completed it. Seventy-four percent of respondents are female; 26% are male. Fifty-nine percent are 30 or younger, 24% are between the ages of 31 and 45, and 17% are older than 45. Eighty-six percent of participants have an undergraduate degree, mostly in the areas of public relations, strategic communication, and journalism. Fourteen percent of respondents have a master’s degree. In terms of current position, 25% of respondents indicated they are working in public relations, 18% in advertising, 14% in digital media, 4% in journalism, and 35% in “other,” which includes fields such as event planning, sustainability, fundraising, marketing, corporate communication, and human resources. Five percent of respondents said they were not currently working. Thirty-six percent of respondents work for an agency, 31% for a corporation, 13% for a nonprofit organization, 9% for government, 4% are self-employed, and 8% said “other.” Self-reported income levels included 6% earning less than $25,000, 16% between $25,000 and $35,000, 34% between $35,001 and $50,000, 24% between $50,001 and $75,000, 10% between $75,001 and $100,000, and 11% more than $100,000.

Survey participants were first asked to indicate their level of interest in taking leadership and training programs at a university on a variety of topics by indicating whether they were interested, uninterested, or unsure. As indicated by Table 2, the topic that received the greatest interest was effective storytelling across multiple platforms (traditional and online), followed by social and digital media strategy, and then measuring and evaluating communication effectiveness. The topic that received the least interest was design fundamentals and multimedia, although roughly 50% of respondents still indicated interest in that topic.


When asked to indicate their level of interest in earning a post-baccalaureate certificate in a communication-related area, 28% said they were “definitely interested,” 29% said “probably interested,” 29% indicated they “might be interested,” 10% said “probably not interested,” and 3% said “definitely not interested.” There were no statistically significant differences in interest in certificates among people of different ages (χ² = 25.54; df = 32; p = .78), between men and women (χ² = 4.19; df = 4; p = .38), or among people working at different types of organizations (χ² = 19.15; df = 20; p = .51).
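For readers who want to see how significance tests like those reported above are computed, a Pearson chi-square test of independence compares the observed counts in a contingency table (e.g., certificate-interest level by gender) against the counts expected if the two variables were unrelated. The sketch below computes the statistic and degrees of freedom in plain Python; the counts are hypothetical illustrations, not the study’s data.

```python
# Pearson chi-square test of independence, computed by hand.
# Hypothetical 5x2 table: certificate-interest level (rows) by gender (columns).
# These counts are illustrative only -- they are NOT the study's data.
observed = [
    [12, 10],  # definitely interested
    [14,  8],  # probably interested
    [13, 11],  # might be interested
    [ 5,  4],  # probably not interested
    [ 2,  1],  # definitely not interested
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Statistic: sum over all cells of (observed - expected)^2 / expected,
# where expected = (row total * column total) / grand total.
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (obs - expected) ** 2 / expected

df = (len(observed) - 1) * (len(observed[0]) - 1)  # (rows-1) x (cols-1) = 4
print(f"chi2 = {chi2:.2f}, df = {df}")
```

Comparing the resulting statistic against the chi-square distribution with the given degrees of freedom (via a statistics table or a library routine) yields the p-value; large p-values, as in the study, indicate no significant association between the two variables.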

Participants were next asked what type of learning environment they would prefer for certificate coursework. Sixty percent said they would prefer one-day weekend, in-person seminars coupled with 5- to 10-week online assignments and discussions; 22% said in-person weekend seminars only; and 18% said online learning delivery that consists solely of online assignments and discussions. Sixty-seven percent of respondents then said they typically prefer to learn and keep up with professional development with a community of people; 33% said they prefer to do so by themselves.

Fifteen percent of participants said they had already earned a professional certificate. Participants indicated they had earned certificates in new media, human resources, management, fundraising, teaching, training, graphic design, leadership and communication, including both APR (Accredited in Public Relations, offered through the Public Relations Society of America) and ABC (Accredited Business Communicator, offered through the International Association of Business Communicators).

The last survey question asked respondents for other suggestions related to their professional development and learning needs. In this open-ended survey question, some participants expressed the need for program flexibility and course options that accommodated professionals working full time. A few participants said programs need to have a strong focus on contemporary digital and social media tools and strategy. One participant lamented that “I’m pretty skeptical of any structured program’s ability to keep up with the pace of real-world trends in communication.” Another participant mentioned “connectivity to top-level executives for networking and job placement,” and another said the program must be “engaging.” Finally, one participant wrote that s/he “would love more info on how pay increases and how it can help with getting better job opportunities as well as figuring out your specific interests so that you can decide which program you want to go into.”

In-Depth Interviews

The in-depth interviews with 13 professionals included 10 women and 3 men ranging in age from mid-20s to early 60s. Nine of those interviewed are between the ages of 30 and 50, during a time of life when many professionals are seeking graduate-level educational opportunities (“Digest,” 2012; Mullen, Goyette & Soares, 2003); 4 participants had earned master’s degrees. These professionals work in varied environments. Four are employed by corporations, four work for nonprofits or universities, three work in sole-proprietor communication businesses, and two are employed by marketing or public relations agencies.

Participants were evenly split in response to whether they were actively thinking about pursuing a master’s degree, with five saying “yes” and five saying “no” (but three of the “no” answers were from professionals who already had master’s degrees). Three participants responded with “maybe,” and even the “yes” answers were filled with caveats. Cost and/or time was mentioned by six participants as reasons why pursuing a master’s degree would be difficult for them; two women and two men specifically mentioned family considerations as probable barriers. One of the male participants said “the lawn still has to be mowed,” another said he “was the primary breadwinner of my family,” and two female participants mentioned the ages of their children as factors for delaying consideration of or being hesitant to pursue graduate studies. Two participants specifically said shorter programs of 1 year or 18 months would be attractive to them.

These considerations of family and current career responsibilities also informed participant descriptions of a “dream” post-baccalaureate experience. Five professionals used the words “blended,” “mixture,” or “flexible” when describing a program that would be both in-person and online. This flexible program would help them manage learning while they worked and cared for family members. One participant who was interested in earning a master’s degree said, “Right now, we are all hands on deck, and I don’t leave here most nights before 7 p.m.” Nine of 13 participants expressed their preferences for a mixture of online and in-person course work for other reasons, too. One participant said he would want to meet in person “to connect and network,” while another wants to hear “what others are doing.” Another said she wants “a think-tank format (where discussion) is part of the learning.” One female professional said, “I never did want to do the APR thing (through PRSA), since it’s a lot of studying by yourself.” Finally, one participant said that while he realized some of the instruction would probably be online because many people required flexibility, he realized “I do better in person, not online – there’s a level of accountability in a group setting.”

An ideal curriculum for post-baccalaureate programs would include new social media communication strategies and new digital tools for analytics, according to almost all participants. “I need help with strategy,” said one participant, “to know what an orderly process is for doing the research or finding information, and then making a plan.” Blogging and digital influence were listed as components of needed new media tactics and strategies. Another participant asked for interdisciplinary curriculum that included “the spectrum of PR, marketing, business, advertising, social media.” Next on the list for many participants was curriculum that addressed business management or entrepreneurship. “If you want to move up, you must understand business,” said one, who mentioned a university which offered a business boot camp for non-business graduate students. Two respondents talked about their dreams of entrepreneurship and mentioned that disciplinary area as important to them. Two others mentioned diversity and cross-cultural communication as important to their professional needs.

Eight of 13 people said they would be interested in earning a professional certificate from a university, with one respondent saying “maybe.” Two of the “no” answers came from professionals who had already earned master’s degrees. This question prompted fewer caveats than those listed for master’s programs, with participants seeming to make underlying assumptions that certificate programs were both well focused and completed more quickly than graduate degrees. One participant “loved” the idea of a certificate program “without the commitment of a master’s degree.” Another one of the eight professionals who expressed interest in a certificate program said he “saw value in [it], when people see initials after a name, and know that there’s an expertise level associated with that.” This participant said the master’s/certificate conversation does not have to be an “either/or kind of thing,” but “maybe they could be done at the same time, or one could lead to the other.” Another said she would be interested in a certificate program as long as it was not “just studying to the test.” Curriculum preferences expressed for graduate-level certificate programs included business and marketing strategies, leadership and management, employee engagement, overcoming challenges and handling crisis, ethics, social media, channel and segmentation planning, and digital metrics and methodologies.


Discussion and Conclusions

Just as Briones and Toth (2013) discovered a lack of coherence in master’s education in public relations, this research found that the certificate programs offered by the 22 universities in this study lack uniformity; certainly, these certificate programs respond to different educational goals and marketing demands than traditional master’s education. However, many certificate programs had similar credit-hour requirements, with 12 of the programs offering certificates that included five or six courses, for 15-18 credit hours total. Delivery methods were split: 11 programs were online only, while the other 11 used classroom-only delivery (seven programs) or a blended model of classroom and online instruction (four programs).

In general, these certificate programs could be said to be following models B and C of the 2006 Commission on Public Relations Education report for master’s degrees in public relations, with B-type certificate programs providing public relations and business-related instruction to meet the advanced career goals of seasoned professionals and C-type certificate programs providing more focused instruction on a specific part of the communications field, such as ethics. Beyond curriculum, the largest differences among these certificate programs were the costs, which ranged from $495 to more than $50,000 for programs of widely ranging requirements.

The desires of practitioners in the in-depth interviews seem to support this certificate program variety, as do responses to an open-ended question in the survey. Flexibility was key to participants’ ability to earn either a master’s degree or certificate, especially in terms of the timing of classes and/or the delivery model. However, in the interviews many practitioners expressed a desire to build relationships with other students and to engage in classroom discussion, where they believed real learning took place. Several practitioners expressed reservations about studying alone or earning a certificate through a “study to the test” type of experience.

Certainly, universities designing certificate programs must balance all of these concerns and desires carefully, with perhaps the most challenging decision being whether to offer a broad program that serves the demands of seasoned professionals or a specific program designed to “catch up” professionals on new trends in the industry.

Findings from the survey and interviews indicate that public relations professionals are generally interested in earning a certificate. Nearly 60% of survey respondents indicated that they were either “definitely interested” or “probably interested” in earning a certificate, and another 29% said they “might be interested.” More people in the interviews said they would be interested in a university certificate program than in a master’s degree program. Perceived advantages of certificate programs included the shorter time commitment and more specific curriculum content.

Although public relations educators and some professionals generally do not view online programs as positively as traditional programs (Commission on PR Education, 2012), respondents in this research indicated a preference for a hybrid delivery that combines online and face-to-face learning. In the survey, only 1 in 5 respondents preferred a certificate program consisting of face-to-face instruction only. The interviews also indicated that participants would value master’s degree programs containing both online and in-person components, with flexible meeting times. Although the majority of professionals prefer a hybrid delivery, only 18% of the certificate programs analyzed used a hybrid approach, while half provided online-only course experiences.

Although public relations practitioners can certainly learn a great deal by reading blogs and trade publications, the majority of respondents in this research indicated that they prefer to learn with a community of people. One of the strengths of master’s and certificate programs is that they bring together a community of people to learn.

Participants want a graduate curriculum that includes the newest digital tools and strategies for communication careers; business management and marketing expertise were rated highly as well. Participants preferred multiple topics for a certificate, with the greatest numbers requesting classes on storytelling, digital and social media strategy, and measuring communication effectiveness.

One overall finding of this mixed-methods research is that universities and colleges have an opportunity to respond to professional demand for certificates and other ongoing training. Only 29% of universities that offer a graduate degree in public relations also provide a communication-centric certificate. Although certificates will never completely replace the relevance or need for an in-depth master's program in public relations, a certificate may be the best educational choice for professionals who want to learn a specialized skill or process without the time commitment and cost of a master's degree. Certificates may also become more desirable if more people strive to earn "badges" as an informal way to showcase expertise to prospective employers (Goligoski, 2012).

The findings of this research are limited by the use of a convenience and snowball sample. Moreover, the types of people who were motivated to complete the survey likely have a higher interest in graduate education and certificates compared to other professionals. Participation in the in-depth interviews was self-selected.

Future research could analyze the differences between university certificate and for-profit certification programs, as well as the differences among programs offered by universities, broad professional organizations, and specific trade groups. In-depth interviews could be conducted with certificate program coordinators to explore the challenges and opportunities in coordinating these programs. Finally, additional research could delve into the attitudes and likely behavior of potential students toward universities offering "laddered" approaches to graduate work, in which students begin with a certificate program that gives credit toward a future master's degree. As this research demonstrates, certificate programs offer opportunities for both communication professionals and universities interested in being part of their ongoing learning.


Bosworth, B. (2010). Certificates count: An analysis of sub-baccalaureate certificates. Complete College America. Retrieved from http://eric.ed.gov/?id=ED536837.

Briones, R. L., & Toth, E. L. (2013). The state of PR graduate curriculum as we know it: A longitudinal analysis. Journalism & Mass Communication Educator, 68(2), 119-133.

Bureau of Labor Statistics (2012a). Management occupations. Retrieved March 1, 2014, from http://www.bls.gov/opub/ooq/2012/spring/art02.pdf.

Bureau of Labor Statistics (2012b). Public relations specialists. Retrieved March 1, 2014, from http://www.bls.gov/ooh/media-and-communication/public-relations-specialists.htm#tab-6.

Castaneda, L. (2011, Winter). Disruption and innovation: Online learning and degrees at accredited journalism schools and programs. Journalism and Mass Communication Educator, 66(4), 361-373.

Commission on Public Relations Education (2006, November). The professional bond: Public relations education and the practice. Retrieved March 15, 2014, from http://www.commpred.org/_uploads/report2-full.pdf

Commission on Public Relations Education (2012, October). Standards for a master's degree in public relations: Educating for complexity. Retrieved Jan. 15, 2014, from www.commpred.org.

Coulson, D. C., & Linn, T. (1995). The hybrid master's degree: Combining research with practice. Journal of the Association for Communication Administration (JACA), 2, 127-131.

Cowen, J. (2012). Strategies for developing a community of practice: Nine years of lessons learned in a hybrid technology education master's program. TechTrends: Linking Research & Practice to Improve Learning, 56(1), 12-18.

Digest of Education Statistics. (2012). Number of persons age 18 and over, by highest level of educational attainment, sex, race/ethnicity, age: 2012. National Center for Education Statistics. Retrieved March 29, 2014, from http://nces.ed.gov/programs/digest/d12/tables/dt12_010.asp.

Gold, M., & Jose, S. (2012). An interdisciplinary online and master's program in agroforestry. Agroforestry Systems, 86(3), 379-385.

Goligoski, E. (2012). Motivating the learner: Mozilla's open badges program. Access To Knowledge: A Course Journal, 4(1). Retrieved from http://ojs.stanford.edu/ojs/index.php/a2k/article/view/381.

McFadden, K., Chen, S., Munroe, D., Natfzger, J., & Selinger, E. (2011). Creating an innovative interdisciplinary graduate certificate program. Innovative Higher Education, 36(3), 161-176.

Meyer, K., Bruwelheide, J. & Poulin, R. (2009). Why they stayed: Near-perfect retention in an online certification program in library media. Journal of Asynchronous Learning Networks, 13(3), 129-145.

Mullen, A. L., Goyette, K. A., & Soares, J. A. (2003). Who goes to graduate school? Social and academic correlates of educational continuation after college. Sociology of Education, 76(2), 143-169.

National Center for Education Statistics. (2012). Defining and reporting subbaccalaureate certificates in IPEDS. Retrieved from http://nces.ed.gov/pubs2012/2012835/index.asp.

National Center for Education Statistics. (2013). Degrees conferred by public and private institutions. Retrieved from https://nces.ed.gov/programs/coe/indicator_cvc.asp.

PR Accreditation (2013). Certificate in Principles of Public Relations. Retrieved July 21, 2014 from http://www.praccreditation.org/apply/certificate/

Shen, H., & Toth, E. (2013). Public relations master’s education deliverables: How practitioners and educators view strategic practice curriculum. Public Relations Review, 39(5), 618-620.

Tokmak, H., Baturay, H., & Fadde, P. (2013). Applying the context, input, process, product evaluation model for evaluation, research, and redesign of an online master's program. The International Review of Research in Open and Distance Learning, 14(3), 273-292.

Toth, E. L., Shen, H., & Briones, R. (2012, Jan. 1). Summary of research on the state of public relations/communication management: Master's degree education in the United States. Retrieved from www.prsafoundation.org/research.html.

U.S. Department of Education. (2013). Characteristics of certificate completers with their time to certificate and labor market outcomes. Retrieved from http://nces.ed.gov/pubs2013/2013157.pdf.

U.S. News & World Report. (2014). Money Careers. Retrieved March 29, 2014, from http://money.usnews.com/careers/best-jobs/public-relations-specialist.

Wyland, R. L., Lester, S. W., Mone, M. A., & Winkel, D. E. (2013). Work and school at the same time? A conflict perspective of the work-school interface. Journal of Leadership & Organization Studies, 20(3), 346-357.
