The CPSA is an online asynchronous discussion activity designed to assess the professional skills. These learning outcomes, also known as non-technical, transferable, soft, 21st-century, or general education skills, relate to communication, teamwork, problem-solving, critical thinking, and the ability to consider ethical and social implications. They have been identified by employers, consultants, professional organizations, governments, and supranational organizations as essential to the development and growth of a knowledge economy and as a key to employability.

Though there is certainly potential for a modified version of the CPSA to be used across disciplines, it has been designed specifically for students in fields such as computing, information technology, computer engineering, and computer science. The international accreditor for these fields, the Accreditation Board for Engineering and Technology (ABET), began to emphasize the professional skills as early as 1997 with the release of its document Engineering Criteria 2000 (EC2000). This was done to address concerns that graduates of these disciplines, while strong technically, were weak in the professional skills.

Building on ABET’s professional skills, the CPSA professional skills are tightly aligned with the ABET Computing Accreditation Commission’s computing outcomes, as seen below.


History and Development:
The precursor to the CPSA is the Engineering Professional Skills Assessment (EPSA), a tool first used by Washington State University’s College of Engineering in 2007. The EPSA measures ABET’s six engineering professional skills learning outcomes at the course and program level, and it was the first method to assess these outcomes directly and simultaneously. The EPSA consists of a scenario, a discussion of the scenario built around discussion prompts, and a rubric for assessing student responses. The 600-800 word scenario briefly describes a realistic, multifaceted engineering problem similar to the authentic problems engineers face in the workplace. In groups of 5-7, students conduct a recorded 45-minute discussion based on the scenario. The recordings are transcribed and then assessed by faculty using the EPSA rubric. Since the first EPSA publication in 2008, the tool has built, and continues to build, a rich and diverse publication record that includes a National Science Foundation validity and reliability study. The EPSA has proven to be a useful resource for numerous engineering faculty across many institutions, and it has served as a trustworthy foundation for the development of the CPSA.

Ashley Ater Kranov, one of the initial co-creators of the EPSA, has also served as co-creator of the CPSA. The first CPSA study took place in 2013 and was published in 2016. It used a modified version of the EPSA: the scenarios, rubric, and method were altered to better fit computing rather than engineering, to suit the needs of a second-language environment, and to move to online discussions. This initial study marked the emergence of the CPSA as a standalone instrument and method.

CPSA research has been conducted and published consistently from 2014 through 2018, and three research grants have been awarded to the project: Zayed University’s Research Incentive Fund, Abu Dhabi’s Department of Education and Knowledge Award for Research Excellence, and Zayed University’s Research Clusters. The initial research team comprised Maurice Danaher, Ashley Ater Kranov, and Kevin Schoepp. Over time there have been additional contributors, and more recently Anthony Rhodes has joined the team as part of the Research Cluster grant. In addition, more than 10 faculty have used the CPSA in their courses, and more than 900 students have participated in the method.

The CPSA has undergone significant revisions and improvements over the years, to the point where further dissemination is warranted. Beyond a robust empirical foundation, there are now two versions of the CPSA rubric, numerous scenarios and prompts, a scenario development checklist, and a draft users’ manual explaining how to implement the CPSA and assess the student transcripts. Reliability and validity work has been completed and published, and other protocols have been implemented to ensure the trustworthiness of the CPSA.