Reconsidering General Education at Purdue Fort Wayne

“What is difficult is getting a group of faculty from many different perspectives and prior institutional and educational experiences to work together to design or change a curriculum to be cogent, coherent, and meaningful to students.” (Ratcliff, 1996, p. 6)

Part 1: Confessional

The General Education Assessment Plan Proposal presented on Thursday, January 5th at the Transition Meeting was more than a simple assessment plan proposal. As I reviewed the assessment reports for general education courses prepared over the last few years and examined how students matriculated through general education courses, it was clear that the 300 or so options for completing general education at IPFW provided little opportunity for faculty members to create a coherent curriculum or for students to have a meaningful general education experience. In effect, our general education program (like many others) is a menu of courses that students complete to check off spaces on their “Bingo Sheet”. Constructing a programmatic assessment of general education was challenging given the lack of common intellectual experiences for our students. Therefore, embedded in the Assessment Plan Proposal is a proposal to redesign the general education program.

The Proposal presented is intended as a starting point – nothing more, nothing less. Admittedly, it is a relatively strong statement of what our general education program could become. It reflects a personal design bias toward integrating a practical, applied liberal arts foundation across the broad spectrum of academic programs within a comprehensive university. It is grounded in the perspective that a comprehensive university can intentionally design a learning experience and leverage assessment findings to demonstrate the distinctiveness of its graduates to multiple constituents. Finally, the design is driven by a desire to give students a baccalaureate experience that helps them think critically and creatively, solve problems and create opportunities, and use this knowledge to advance a more inclusive vision of life, work, and service to their home, career, and community. Ultimately, however, our faculty will decide what our general education program should be. My role is to support and help realize that collective vision.

Part 2: Invitation

In a couple of weeks, our office will announce a series of town hall sessions to discuss the proposal, listen to the ideas of our faculty members, and provide information to the General Education Sub-committee. Regardless of the decision on structural changes to the general education program, we hope to secure consensus on a revised assessment plan for general education. Should we reach consensus this semester, we hope to roll out the assessment plan over a couple of academic years, much like we did a few years ago with Programmatic Assessment. I look forward to our discussions.

D. Kent Johnson, PhD., Director of Assessment


Assessing for Student Success Phase 2: Assessing Student Persistence and Retention

D. Kent Johnson, PhD., Director of Assessment

The Office of Assessment launched the current assessment strategy three years ago. Its continuing focus is to improve student learning through systemic assessment of learning at the academic program level. Critical to this programmatic emphasis is assessing learning in the “core courses” a department identifies as building blocks to the major. Assessing student learning as students matriculate through core courses examines the extent to which learning progresses relative to desired programmatic learning outcomes – the knowledge, skills, and values an academic program expects students to achieve by the end of their matriculation to a degree. Understanding how that learning is progressing is critical to improving the likelihood that students achieve the outcomes, and measuring at specific points of intervention (i.e., courses) provides the data departments need to implement curricular interventions aimed at improving student success.
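To make the idea of measuring learning at specific points concrete, the short Python sketch below summarizes rubric scores for one outcome across a sequence of core courses. The course names, the outcome, and the 1-4 rubric scale are assumptions made for illustration; they do not come from any department's actual plan.

    # Hypothetical sketch: mean rubric score per core course for one
    # programmatic outcome, to see whether learning progresses through
    # the sequence. All names and scores are invented.
    import pandas as pd

    scores = pd.DataFrame({
        "course":  ["CORE 101", "CORE 101", "CORE 201",
                    "CORE 201", "CORE 301", "CORE 301"],
        "outcome": ["written communication"] * 6,
        "score":   [2, 3, 3, 3, 4, 3],  # rubric: 1 (beginning) to 4 (accomplished)
    })

    # Mean attainment per course; these course names sort in sequence order.
    progression = scores.groupby("course")["score"].mean()
    print(progression)
    # A flat or declining segment flags a point in the curriculum where
    # a targeted intervention might be warranted.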

This year, the assessment strategy will build on this foundation to add the assessment of student retention at the institutional level. Understanding the factors that contribute to student retention and departure in the first year is vital to academic units: the pool of students available to pursue a major depends on the number of students matriculating to the second year. The institution’s success in admitting new students and retaining them through the first two full semesters is essential to sustaining the enrollments that support viable academic units.

The assessment of student retention and success assumes that the profile of admitted students is interconnected with retention and completion rates. It further assumes that, absent interventions, retention and graduation rates are largely a function of the institution and the student profile. Improving retention and, ultimately, graduation rates therefore depends on understanding the interactions of student and institution and using this information to design and implement interventions aimed at improving student success.
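One way to read this working assumption is as a predictive model: retention as a function of student-profile variables, with interventions then aimed at the factors the model surfaces. The Python sketch below is a minimal, hypothetical illustration; the predictors, the synthetic data, and the choice of logistic regression are assumptions, not the office's actual methodology.

    # Hypothetical sketch: retention modeled from student-profile variables.
    # Data are synthetic and the predictors are invented for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.normal(3.0, 0.5, n),   # high school GPA
        rng.normal(0.5, 0.2, n),   # first-semester credit-completion ratio
        rng.integers(0, 2, n),     # full-time (1) vs. part-time (0)
    ])
    # Synthetic retained/departed labels, generated for the example only
    y = (0.8 * X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 2]
         + rng.normal(0, 1, n)) > 3.5

    model = LogisticRegression().fit(X, y)
    print(model.coef_)  # relative weight of each profile factor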

This 2017 Fall Semester, the Office of Assessment, in coordination with the Office of Institutional Research, is conducting an assessment of first-time, full-time students. This assessment is grounded in research on organizational factors that influence student persistence (Braxton & Francis, 2017) and research on “grit” as perseverance and passion for long-term goals (Duckworth, Peterson, Matthews, & Kelly, 2007).

This assessment is the first activity in a larger formative assessment program aimed at improving student persistence to degree completion.  I look forward to sharing the findings with the university community at the beginning of the 2018 Spring Semester.

References:

Braxton & Francis (2017). Organizational assessment to improve college student persistence. Strategic Enrollment Management Quarterly, 5(2), 80-86.

Duckworth, A.L., Peterson, C., Matthews, M.D., & Kelly, D.R. (2007). Grit: Perseverance and passion for long-term goals. Journal of Personality and Social Psychology, 92(6), 1087-1101.

The Communicative Role of Assessment in Demonstrating the Utility of Liberal Arts in Comprehensive Regional Universities

D. Kent Johnson, PhD., Director of Assessment

“The often forgotten role of assessment as communication might be among the most important for the preservation of the comprehensive mission of IPFW and is one that we fully control.  It is also intentionally designed into our assessment strategy.”

Matthew Sigelman (CEO of Burning Glass Technologies) argued in an essay published by Inside Higher Ed that the debate over liberal arts versus vocationalism is lazy. He states, “…liberal arts majors are not as badly prepared as people fear – and graduates with other majors may be less prepared than they believe.” This statement is probably not very provocative for higher education faculty: we are accustomed to touting the American baccalaureate degree, with its unique blending of liberal, general, and specialized knowledge, as a primary strength of undergraduate education. However, communicating this story to external constituents – especially legislators, potential employers, and prospective students and their families – is not a strength of most higher education institutions.

Effectively communicating what students know and can do as a result of their education (especially for students graduating in majors that skew more to the liberal arts than to a vocation or profession) is especially important in the current political and social environment for public comprehensive universities. This new environment is, perhaps, best captured in a 2013 statement by North Carolina Governor Pat McCrory, who said in an interview with Bill Bennett:

“So I’m going to adjust my education curriculum to what business and commerce needs to get our kids jobs as opposed to moving back in with their parents after they graduate with debt,” McCrory said, adding, “What are we teaching these courses for if they’re not going to help get a job.” (http://www.huffingtonpost.com/2013/02/03/pat-mccrory-college_n_2600579.html)

McCrory’s statement positions utility, or vocationalism, as the determinant of public higher education funding. But even taken at face value – a curriculum aligned “…to what business and commerce needs to get our kids jobs…” – this perception of market utility is incomplete. Mr. Sigelman states (based on his company’s analysis of the skills employers value most and have the most difficulty finding) that “Across the full spectrum of jobs, what employers seem to call for, above all else, are foundational skills like writing, research, analysis, critical thinking, and creativity” (https://www.insidehighered.com/views/2016/02/08/debate-over-liberal-arts-vs-vocationalism-lazy-one-essay). These are the same skills liberal arts faculty tout as hallmarks of students completing their degrees. An opportunity therefore exists to demonstrate the value of liberal education from an employment perspective by communicating assessment findings to external constituents. For this reason, the IPFW Annual Assessment Report asks departments to describe how they are communicating what students know and can do to external constituents.

Helping students describe what they know and can do to prospective employers is something many faculty already do. For example, Andy Downs has discussed how he encourages students to list their skills so that prospective employers understand the value of hiring a graduate with a degree in political science. At the program, college, and university levels, however, data-driven communications grounded in high-quality, rigorous assessment have the potential to demonstrate that the knowledge and skills employers demand (e.g., writing, research, analysis, critical thinking, and creativity) are available in graduates across a range of majors. The often forgotten role of assessment as communication might be among the most important for the preservation of the comprehensive mission of IPFW and is one that we fully control. It is also intentionally designed into our assessment strategy.

Carefully constructed and executed, the summative aspects of programmatic assessment of student learning form the evidential foundation employers need to understand how graduates of a program are prepared to contribute to the success of their organizations. Formative programmatic assessment builds on this foundation to show departments and programs how student learning relative to the stated outcomes might be further enhanced through curricular interventions and innovations. When this type of assessment is shared with external constituents, it demonstrates institutional commitment to ensuring that current and future graduates are prepared to meet the increasingly challenging needs of future employers.


IPFW’s Assessment Model in the Context of a Culture of Learning

D. Kent Johnson, PhD., Director of Assessment

Calls to create a “culture of assessment” have been a centerpiece of higher education discussions for more than two decades. The evolution of the assessment culture is evidenced by a distinctive vocabulary (e.g., student learning outcomes, authentic assessment); common artifacts (curriculum maps, student products, portfolios, rubrics); and consistent methodologies (direct measures; program-, course-, and institution-level designs; quantitative, qualitative, and mixed methods). The resulting assessment activities focus on stating expected outcomes, measuring student learning relative to those outcomes, and making judgments on the extent to which students achieve the stated outcomes. Too often this focus on assessment mechanics is disconnected from the teaching and learning process, resulting in a failure to use assessment to improve student learning (Blaich & Wise, 2011; Fulcher et al., 2014).

This disconnect of assessment from teaching and learning was reinforced within institutions in the first decade of the assessment movement, when assessment was positioned as an institutional act aimed at compliance with external or internal demands (Ikenberry & Kuh, 2015). Consequently, as the assessment movement was institutionalized, it produced cultures of compliance or fear, driven by external requirements from state legislatures and accrediting bodies or by internal administrative mandates to assess programs. Within this space, conversations about how assessment might actually improve student learning and success were often secondary to the act of assessment itself. Assessment was done for the sake of assessment.

A more evolved perspective holds that assessment exists to increase the likelihood that institutional practices improve student learning and success. Based on their review of the literature, Fuller et al. (2016) defined a culture of assessment as “…institutional contexts supporting or hindering the integration of professional wisdom with the best available assessment data to support improved student outcomes or decision making” (p. 404). This perspective shifts the cultural target to one of assessment, as illustrated in Figure 1.

[Figure 1: Culture of Learning; Culture of Assessment; Culture of Fear or Compliance]

While the emphasis on professional wisdom implies that faculty will use assessment to improve student learning relative to stated student outcomes, it remains somewhat isolated from the central target – a culture of learning. Fulcher et al. (2014) identified this shortcoming and described a simple model for assessment comprising three steps:

[Figure 2: The IPFW Integrated Teaching, Assessment, and Learning Model]

1. Assessing student learning

2. Using assessment findings to plan interventions or innovations, and

3. Re-assessing to examine whether the intervention or innovation increased student learning (p. 5).
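As a hypothetical illustration of this assess-intervene-reassess cycle, the short Python sketch below compares rubric scores gathered before and after a curricular intervention. The scores, the 1-4 scale, and the use of a t-test are invented for illustration; they are not part of Fulcher et al.’s model or the IPFW plan.

    # Hypothetical illustration of the assess-intervene-reassess cycle.
    # Scores are invented rubric means (1-4 scale) for two cohorts.
    from scipy import stats

    baseline = [2.4, 2.8, 2.5, 3.0, 2.6, 2.7]  # step 1: assess
    # step 2: intervene (e.g., a redesigned assignment), then...
    post = [2.9, 3.1, 2.8, 3.3, 3.0, 3.2]      # step 3: re-assess

    gain = sum(post) / len(post) - sum(baseline) / len(baseline)
    t_stat, p_value = stats.ttest_ind(post, baseline)
    print(f"mean gain = {gain:.2f}, p = {p_value:.3f}")
    # An appreciable gain with a small p-value is evidence the intervention
    # helped; if not, the cycle repeats with a revised intervention.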

The IPFW Integrated Teaching, Assessment, and Learning Model (Figure 2) builds on Fulcher et al. (2014) to address the gap they noted between the act of assessing and the art and science of instructional design. The IPFW Model conceptualizes the “Culture of Assessment” as embedded within a larger “Culture of Teaching and Learning”. This embedded strategy places increased emphasis on intentionally connecting the assessment of student learning to the teaching and learning process. Figure 2 represents IPFW’s assessment culture as an inner triangle (assess-innovate-reassess) embedded in an outer triangle (teaching, learning, and student success) representing the learning culture.

Supporting Academic Units in Implementing the IPFW Integrated Teaching, Assessment, and Learning Model: An Invitation to the Fall 2016 IPFW Assessment Academy

The IPFW Assessment Plan represented in SD 15-6 expresses the IPFW Integrated Model through a specific artifact – the Assessment Report – which comprises six interrelated elements:

1. Stated Student Learning Outcomes (SLOs)

2. Curricular Maps

3. Assessment Plan

4. Assessment Methodology, and an ongoing cycle represented in elements 5 and 6:

5. Assessment Findings, and

6. Use of Assessment Findings to Improve Student Learning

The ongoing loop in elements five and six follows the assess-intervene-reassess cycle suggested by Fulcher et al. (2014). The emphasis on “5” and “6” above prioritizes using assessment findings to improve student learning, consistent with the recommendation of Banta and Blaich (2011) that assessment effort shift from gathering data to using data. To facilitate this shift, the IPFW Assessment Academy is offering three workshops in Fall Semester 2016:

  1. Creating Signature Assignments for Programmatic Assessment,
  2. Assessment Plan Tune-Up, and
  3. Assess-Intervene/Innovate-Reassess.

If you are interested in participating in the IPFW Assessment Academy, please contact Kent at x15411 or email the assessment office at assessment@ipfw.edu.

References:

Banta, T.W., & Blaich, C. (2011). Closing the assessment loop. Change: The Magazine of Higher Learning, 43(1), 22-27.

Blaich, C., & Wise, K. (2011). From gathering to using assessment results: Lessons from the Wabash national study. (Occasional Paper No. 8). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Fulcher, K. H., Good, M. R., Coleman, C. M., & Smith, K. L. (2014, December). A simple model for learning improvement: Weigh pig, feed pig, weigh pig. (Occasional Paper No. 23). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Fuller, M.B., Skidmore, S.T., Bustamante, R.M., & Holzweiss, P.C. (2016). Empirically exploring higher education cultures of assessment. The Review of Higher Education, 39(3), 395-429.

Ikenberry, S.O., & Kuh, G.D. (2015). From compliance to ownership: Why and how colleges and universities assess student learning. In G.D. Kuh et al., Using evidence of student learning to improve higher education (pp. 1-17). San Francisco, CA: Jossey-Bass.