The DELTA Planning and Assessment team, which is composed of experts in assessment, pedagogy, and blended and distance education (DE) at the university level, plays a key role in informing DELTA’s data-driven decision making for promoting student success at NC State. Together, we work diligently on the continued growth of a seamless culture of assessment across DELTA, with a concentration on DE programs, student and faculty support (DELTA Services), DELTA Grants projects, and research.
DELTA’s assessment trio may not share the same fame as The Dude, Walter, and Donny; or Ferris, Cameron, and Sloane; or even Harry, Ron, and Hermione, but we are busy working behind the scenes to ensure DELTA is supporting and expanding access to quality education. Together, Chris Willis, Dan Spencer, and I are helping faculty and our DELTA colleagues investigate the impact of new teaching methods, and evaluating the services we provide for supporting students in achieving their academic goals via blended, online and distance education.
As the director for DELTA Planning and Assessment, I am responsible for high-level assessment administrative tasks and leadership to support DELTA’s research activities. I work with university colleagues to ensure DELTA’s compliance with university, UNC System, and SACSCOC assessment and accreditation policies and requirements. Chris Willis, program coordinator for learning analytics and assessment in DELTA, leads all assessment work for DELTA Grants funded projects and courses, most notably our course redesign initiative. Dan Spencer, our educational specialist, supports us in these efforts and conducts empirical research investigating why students are more likely to succeed in face-to-face courses than in their DE counterparts. Below are some examples of our work.
Planning and Assessment supports DE programs and courses through consultations for faculty and DE Program Directors/Coordinators to assist in the development of assessment activities. Moreover, we help in the alignment of curriculum goals, outcomes and objectives. This allows the Directors/Coordinators to provide visible and tangible evidence that their DE program is meeting its teaching and learning goals. (For example, a Google document with links to NC State online and DE program curriculum assessment and accreditation resources is available for easier access to support materials.)
In an effort to provide high-quality support, DELTA Planning and Assessment administers surveys that ask students and faculty to evaluate DELTA’s DE Services; in turn, the data are used to change and improve teaching and learning experiences in the courses we support. We make a point of asking questions that elicit responses DELTA can act on. For instance, from participants’ responses in the annual Faculty Evaluation of DE Services surveys, DELTA identified a trend in instructors’ requests for expanding LearnTech Help Desk services and support beyond typical Monday through Friday business hours. Subsequently, Help Desk hours were extended during the week and Sunday time slots were added, giving NC State instructors easier and timelier access to support staff who can answer their instructional technology questions.
DELTA’s engagement in ongoing critical self-reflection, soliciting feedback from constituents (i.e., seeking input from students), and then acting on the results is vital to its success. DELTA Testing Services, for example, enables DE students to take exams in a professional, secure and monitored setting. So that DELTA may consistently provide DE students (and their faculty) with efficient service in secure, distraction-free testing environments, we began administering a Student Evaluation of DELTA Testing Services during the spring and fall final exam periods. The insights gained from the survey results were used to expand DELTA Testing Services to a new location on Centennial Campus, with easier access, streamlined registration, and more seating for taking exams. Beyond helping DELTA offer students the best opportunity to convey their mastery of course material, outcomes from the survey allowed DELTA to demonstrate NC State’s commitment to quality in the administration of testing services and programs. As a result, DELTA Testing Services sought and earned official recognition as a member of the National College Testing Association (NCTA)’s Consortium of College Testing Centers (CCTC). Implementing the Student Evaluation of DELTA Testing Services survey not only satisfied the rigorous requirements needed to join; the results are also helping NC State serve as an institutional exemplar by embodying and promoting industry best practices in testing services administration for online and distance education.
Chris and Dan are committed to measuring the outcomes of DELTA Grants projects and the course redesign initiative. These outcomes include measuring the impact teaching methods and models have on student success in critical path courses after redesign; determining whether enrollments in DE courses can increase without a detriment to student performance and retention; investigating the impact of virtual reality (VR), adaptive learning, and other educational technology tools on student learning and engagement; and determining whether kit labs help accommodate increased enrollments as physical, brick-and-mortar lab space becomes scarcer.
Our work doesn’t end here. DELTA supports NC State’s enterprise learning technologies and does so by making data-driven decisions. How do we know faculty and students want to use a specific technology, why, and what are the benefits or obstacles? Put another way, does the technology improve the efficiency and effectiveness of delivering course content while also engaging students in learning for improved outcomes? Using Top Hat as an example, Chris worked with DELTA’s Instructional Support Services team on the Top Hat Pilot Project. DELTA investigated the classroom engagement tool to determine whether the university should adopt Top Hat at the enterprise level. Consequently, Top Hat is now available for instructors who wish to build student engagement in their classrooms.
Looking to the future of DELTA Planning and Assessment, the team is hard at work on a number of projects that will have a positive impact on reporting, assessment, and data management efforts across DELTA. Chris has pulled together all of the internally created survey items used in DELTA Grants projects, and he and Dan are running validity and reliability tests on them; combined with resources on other validated instruments, this work will lead to a repository of assessment and evaluation tools that anyone can use. In addition, Planning and Assessment has begun conversations and meetings with a number of teams to further integrate assessment and evaluation into all project stages, from kickoff to implementation and beyond. By doing this, the team hopes to promote more interaction between Planning and Assessment and other teams. This will have a number of benefits, including increasing DELTA’s accountability and visibility; helping DELTA staff feel more comfortable with assessment and evaluation; standardizing project documentation and deliverable timelines; and simplifying the evaluation-related workload for IDs, New Media, etc., while increasing the number of projects in which Planning and Assessment can provide expertise and assistance.
Professionally, Planning and Assessment stays abreast of assessment theory, methods, and instruments by attending local and national conferences, including SACSCOC, the Association for the Assessment of Learning in Higher Education (AALHE), the Online Learning Consortium (OLC), and the TIDE (Technology Innovation in Digital Education) Symposium. In turn, we share NC State and DELTA’s successes via presentations and workshops at these professional meetings. Beyond the major projects mentioned, the Planning and Assessment team hopes to work on paper submissions to peer-reviewed journals and to conduct a broader analysis of available course redesign data.