A Guide to Assessment of Learning Outcomes for ACEJMC Accreditation


This guide explains ACEJMC’s expectations of an assessment plan and offers suggestions for assessing student learning. The suggestions are not meant to be prescriptive, but to offer examples of tools that could be part of a comprehensive, multi-year assessment plan. This guide will introduce assessment as a principle of accreditation, describe the components of an assessment plan, provide examples of direct and indirect measures, and discuss the timeline for assessment as related to the unit’s self-study.

Assessment as a Principle of Accreditation

Assessment is a valuable process to help units develop curriculum, improve teaching, document what students have learned, and enhance student learning.

ACEJMC lists 12 professional values and competencies that must be part of the education of all journalism and mass communications students. Students should:

  • understand and apply the principles and laws of freedom of speech and press for the country in which the institution that invites ACEJMC is located, as well as receive instruction in and understand the range of systems of freedom of expression around the world, including the right to dissent, to monitor and criticize power, and to assemble and petition for redress of grievances;
  • demonstrate an understanding of the history and role of professionals and institutions in shaping communications;
  • demonstrate an understanding of gender, race, ethnicity, sexual orientation and, as appropriate, other forms of diversity in domestic society in relation to mass communications;
  • demonstrate an understanding of the diversity of peoples and cultures and of the significance and impact of mass communications in a global society;
  • understand concepts and apply theories in the use and presentation of images and information;
  • demonstrate an understanding of professional ethical principles and work ethically in pursuit of truth, accuracy, fairness and diversity;
  • think critically, creatively and independently;
  • conduct research and evaluate information by methods appropriate to the communications professions in which they work;
  • write correctly and clearly in forms and styles appropriate for the communications professions, audiences and purposes they serve;
  • critically evaluate their own work and that of others for accuracy and fairness, clarity, appropriate style and grammatical correctness;
  • apply basic numerical and statistical concepts;
  • apply tools and technologies appropriate for the communications professions in which they work.

Units requesting evaluation of a professional master’s degree program also must demonstrate how their graduates attain this additional core competency:

  • contribute to knowledge appropriate to the communications professions in which they work.

The Accrediting Council invites units to adopt, revise or expand this list as they choose. Many units adopt ACEJMC’s list of professional values and competencies with no change, and others have added one or more statements to the list or collapsed several items into larger concepts. Some may even subdivide the concepts into categories related to knowledge, values and competencies. Whatever a unit decides, it must ensure that its curriculum and instruction address all of ACEJMC’s expectations for all students.

The Accrediting Council’s curriculum and assessment standards are closely connected. Standard 2, Curriculum and Instruction, states:

The unit provides a curriculum and instruction that enable students to learn the knowledge, competencies and values the Council defines for preparing students to work in a diverse global and domestic society.

Standard 9, Assessment of Learning Outcomes, states:

The unit regularly assesses student learning and uses results to improve curriculum and instruction.

A unit’s curriculum and instruction may evolve for a number of reasons – for instance, changes in professional practice, changes in an audience’s information-seeking behavior, and changes in communications technology. Assessment of student learning is different from these evolutionary changes. Assessment is a focused and deliberate process to determine whether students are learning what a unit expects them to learn and to improve the quality of the program overall.

Assessment evaluates student learning at the course, sequence, department or unit level and is chiefly concerned with whether graduates have mastered the professional values and competencies. (In contrast, grades evaluate individual student performance in a course and may consider issues such as attendance, participation and improvement.) If assessment reveals weaknesses in the mastery of any of the values or competencies, the unit should decide how to address those weaknesses, take action to do so, and assess the result.

An Assessment Plan

Schools seeking accreditation are required to have a written assessment plan that reflects ACEJMC’s values and competencies. Because units also may be required to conduct assessment for institutional purposes or regional accrediting bodies, units may want to develop an assessment plan that serves multiple purposes.

Generally, program outcomes should be framed as student learning outcomes (what a student should know or be able to do) rather than as teaching objectives (what the course or teacher will seek to do).

An assessment plan typically has these components:

Goals of the Unit – describe what the unit intends to accomplish, how the unit’s goals relate to the institution’s mission, and purposes for assessment

Student Learning Outcomes – describe what the unit expects students to be able to know, do and value, reflecting ACEJMC values and competencies

Curriculum Map – indicate where the values and competencies are addressed in core and required courses

Direct and Indirect Measures – indicate what measures the unit will use to determine whether learning outcomes are being met

Timeline – indicate when assessment measures will be implemented and results reported

Oversight – indicate who has responsibility for seeing that the plan is implemented

Use of Information – describe provisions for sharing information with internal and external audiences and for making recommendations and decisions

To lay a foundation for assessment, a review of syllabi for core and required courses is a good starting point. A unit should ensure that these courses cover ACEJMC’s professional values and competencies, that their syllabi contain appropriate student learning outcomes, and that instruction is demanding and current.*

Each value and competency does not need its own course, and, typically, courses address multiple learning outcomes. Units should not rely on coverage of learning outcomes in elective courses because students can graduate without taking them. For example, although a unit may offer “Race, Gender and the Media” as an elective course, it must show that an understanding of diversity is one of the learning outcomes in appropriate core or required courses.

The specification of student learning outcomes in syllabi constitutes important evidence for ACEJMC site teams. In meetings with faculty members and students, teams often encounter assertions from some that a particular course addresses this or that value or competency and from others that it does not. Evidence that a course is intended to address a particular value or competency is clear when its syllabus addresses the topic in learning outcomes, the course outline, or coverage in specific readings and assignments.

*ACEJMC’s values and competencies were written in anticipation of change, and the curriculum standard requires keeping pace with it. It states that instruction must be “demanding and current.” The Accrediting Council, Committee and site teams have interpreted “current” to mean that all graduates, regardless of specialization, should be prepared to communicate to diverse audiences through multiple platforms of information delivery.


Measures of Assessment

Standard 9, Assessment of Learning Outcomes, requires that an assessment plan use “multiple direct and indirect measures to assess student learning.” This suggests at least two direct measures and at least two indirect measures.

Direct measures require students to demonstrate their learning. These measures examine actual student work to determine whether students demonstrate the knowledge, values and competencies required to achieve program goals. (Examples: examinations, capstone projects, student portfolios, aggregate internship evaluations, course-embedded assessment)

Indirect measures capture perceptions, attitudes and outcomes of the learning experience. These include self-reports of student learning or data and outcomes that indicate program goals have been achieved. (Examples: student surveys, alumni surveys, employer surveys, exit interviews, focus groups, student awards, graduation and employment data)

ACEJMC does not prescribe specific measures of assessment. It endorses the wisdom of experts that no single measure is sufficient. It encourages units to develop and apply measures that reflect the mission and goals of the unit as well as those of ACEJMC. In fact, no one measure may fit all departments or sequences; thus, each department or sequence may adopt its own measures of assessing student learning.

The selection of assessment measures may depend on whether the learning outcome involves mastery of information or mastery of a skill. For example, an examination better enables assessment of knowledge and values, whereas capstone projects and portfolios better enable assessment of student skills.

Direct Measures

Here are examples of direct measures where students demonstrate their learning:

Examinations. Some units administer pre- and post-tests as evidence of student learning. The pre-test might show modest knowledge of the five freedoms in the First Amendment at point of entry to the major and keen knowledge of the First Amendment at graduation. Exams similarly can explore other areas of knowledge, values and competencies expected of all students. Sometimes, units administer only a comprehensive exam at the end, with results benchmarked against expectations (e.g., the unit expects xx% of seniors to know y). By producing cumulative feedback on what students know and can do and what they don’t know and can’t do, exams provide the faculty with insight on areas to improve curriculum and instruction. Some units have entrance exams related to grammar and writing skills. Entrance exams alone are not a direct measure of student learning, but an exam at the senior level would be a comparative tool to measure student accomplishment.

Capstone Projects. Typically, a capstone course synthesizes and updates the knowledge, values and competencies acquired by graduating seniors of a sequence or department. The instructor could assign and grade a research or skills-based project as the culminating work of the curriculum, which the student might include in a portfolio for job applications. For purposes of program assessment (different from the course grade), a unit could collect an ungraded copy of each senior’s project and ask a panel of JMC faculty or professionals to evaluate the projects (or a sample of them) using a specified rubric to assess the collective level of student accomplishment.

Student Portfolios. A portfolio commonly includes examples of advanced in-class work and a student’s best work through campus media and professional internships. A portfolio might also include the student’s resume and some self-reflection on the incorporated work. A unit could define appropriate learning outcomes from ACEJMC’s list and have a panel of faculty, alumni or professionals evaluate the portfolios (or a sample of them) using a rubric similar to the one recommended for assessing capstone projects.

Aggregate Internship Evaluations. Units regularly gather feedback from professionals who supervise student interns. That feedback is useful in deciding whether to award credit and in determining a student’s grade. For assessment purposes, units could analyze internship evaluations in the aggregate, looking for trend lines that may suggest ways to improve curriculum and instruction. For this to be possible, the internship evaluation form needs to go beyond behavioral measures (dependability, punctuality, ability to work independently) and seek to evaluate the student’s demonstration of ACEJMC’s professional values and competencies (clear and accurate writing, good grasp of technology, can apply numerical concepts, can think critically and creatively, etc.).

Course-Embedded Assessment. Its purpose is to assess the learning outcomes of the course and not the grade of the student. For example, course-embedded assessment could involve common questions in exams across all sections of a course. This takes advantage of pre-existing student motivation to perform well, but requires faculty willingness to engage in a shared process. Course-embedded assessment also could involve reviewers taking a second look at materials generated by students in a course to see what evidence exists that students have met specified learning outcomes. Or the assessment instrument could be separate from graded work, for the explicit purpose of providing group-level information on the achievement of student learning outcomes.

The value of using multiple measures is that any single measure has limitations and, thus, should be combined with others to help complete the picture. In addition, the strength of assessment results can depend on the breadth or representativeness of the cohort. For example, aggregate internship evaluations carry more weight when all students are required to do an internship, compared to internship evaluations of a self-­‐selected portion of students. Similarly, the evaluation of a random sample of portfolios would be more representative of the student body than an evaluation of portfolios of only top students. Units should address the breadth and representativeness of direct measures.

Indirect Measures

Here are examples of indirect measures that capture perceptions, attitudes and outcomes of the learning experience:

Surveys and interviews. Many possibilities of indirect measures exist: student surveys, exit interviews of seniors, focus groups, alumni surveys and employer surveys. All involve self-report or perceptions of the quality of student learning. Survey questions ideally reflect the unit’s goals for student attainment of knowledge, values and competencies – such as asking respondents to evaluate their mastery of each of ACEJMC’s values and competencies.

Student awards. A unit’s record of student performance in competitions over time may provide insight into the effectiveness of curriculum, instruction and student learning. Awards received by individual students or student teams (such as student newspapers, broadcast news programs, and advertising or public relations campaigns) are outcomes that cumulatively point to the quality of the learning experience, at least for some students.

Graduation and employment data. Retention and graduation rates can be valuable indicators of the effectiveness of curriculum and instruction. So can the employment record of graduates over time, such as the proportion of students employed in JMC-related careers six months after graduation. Units may need to benchmark student employment against broader employment data, since the state of the economy is beyond a unit’s control.

As with direct measures, indirect measures such as surveys and student awards should be considered carefully for how representative the results may be.

Establishing Levels of Achievement

Just as ACEJMC does not specify assessment measures, it also does not prescribe levels of achievement. Rather, it is the responsibility of the unit to establish benchmarks, mechanisms for comparison, or other appropriate ways of interpreting the results of assessment measures. Nevertheless, the ACEJMC mission statement does reflect an expectation of high quality:

The Accrediting Council on Education in Journalism and Mass Communications is dedicated to fostering and encouraging excellence and high standards in professional education in journalism and mass communications.

The language of ACEJMC’s values and competencies suggests three levels of achievement – awareness, understanding and application – so pay attention to the verbs that guide the expectations.

Three Other Considerations

Alumni. Standard 9 requires units to maintain contact with alumni “to assess their experiences in the professions and to gain feedback for improving curriculum and instruction.” This requirement makes an alumni survey (or other mechanism for alumni feedback) an obvious candidate as an indirect measure.

Professionals. Standard 9 also specifies that the unit include members of JMC professions in its assessment process. This clearly could occur through the student internship program, if the evaluation instrument is linked to ACEJMC’s professional values and competencies. Professionals also could be involved in the evaluation of student portfolios or capstone projects, or through employer surveys about the readiness of recent graduates of the unit. Some units ask members of their professional advisory boards to conduct mock interviews of seniors, offering advice to students about their portfolios and interview style as well as using a rubric that incorporates some of the values and competencies to inform the unit about the students’ collective level of career preparation.

Graduate programs. For units that submit a professional master’s degree for review, ACEJMC requires that a graduate program have its own assessment plan. Many of the same direct and indirect measures can apply, but may need to be constructed differently for the graduate level. In addition, ACEJMC says graduate programs must distinguish themselves from undergraduate programs by offering advanced, rigorous courses. Standard 9 lists outcomes appropriate to a professional graduate program as “a professional project, thesis or comprehensive exam that demonstrates that graduate students have developed analytical and critical thinking abilities appropriate to the profession.”

Assessment Plan Timeline

A unit should develop an assessment program consistent with its resources and administer it with sufficient regularity to allow time to analyze the results and decide on actions to improve student learning.

While some universities require assessment of student learning at the course level every semester, experience suggests that this frequency and the accumulation of data can subvert meaningful analysis and sensible action. Units may choose to stagger the implementation of assessment measures – for instance, portfolio evaluations each fall, a review of capstone projects every other year, an alumni survey every third year.

Alternatively, a reasonable assessment cycle might involve a cohort analysis. The unit could administer a few direct and indirect measures as a cohort enters the major, administer most measures as that cohort prepares to graduate, devote an academic year to analyze the findings, and decide where and how to make improvements in curriculum and instruction. The unit then would introduce the changes for the cohort entering after the year of analysis and resume its assessment program.

Use of Information

A common phrase in assessment is “closing the loop.” It means studying assessment results to see what improvement might be suggested and taking steps to make it happen. There’s little value in going through assessment if the end result is a failure to analyze the findings and discuss how to improve curriculum and instruction to enhance student learning.

Once improvements are made, units then should monitor assessment findings over time to see if the desired outcomes have been attained.

The Self-Study and Site Visit

To achieve compliance with Standard 9, a unit must present in the self-study a written assessment plan and a report on the measures used, the findings and analysis, and the actions taken to improve student learning. ACEJMC’s self-study template for this standard frames questions and directions to guide the unit in providing the needed content.

Site teams expect to review the written plan and the process and results of its execution. This ranges from a description of the process, findings, analysis and actions to an account of how the data were collected (in the body of the report or in appendices). Assertion and generalization are not satisfactory. Careful attention should be paid to timely analysis and to improvements implemented with each assessment cycle.

Since no program is perfect, assessment is designed to do something meaningful – guide a unit and its faculty to a better understanding of how to improve the program to heighten student learning.


This guide to assessment of learning outcomes for ACEJMC accreditation was written by:

  • Trevor Brown,
    Professor Emeritus, School of Journalism, Indiana University
  • Marie Hardin, Accrediting Committee Vice Chair
    Professor of Journalism and Associate Dean, Undergraduate and Graduate Studies, College of Communications, Pennsylvania State University
  • Paul Parsons, Accrediting Council Vice President
    Dean, School of Communications, Elon University

Their time and expertise are appreciated.

– Susanne Shaw, ACEJMC Executive Director