Frequently Asked Questions

INTRODUCTION

Below are some frequently asked questions (FAQ) about the Quality Assurance Framework (QAF) and implementing quality assurance practices at universities in Ontario.

We hope that the FAQ will assist in answering your questions, but it is not intended to replace a conversation with the Quality Assurance Secretariat. If you do not find your particular query addressed in the FAQ, or if you require clarification or more detailed information, please contact the QA Secretariat directly.

QUESTIONS

NEW PROGRAMS

Q1: When is a “significant change” to a program considered a major modification and when is it considered a new program?

Q2: Does the Quality Council grant extensions on deadlines for Reports on New Programs?

Q3: Are universities required to submit CVs for the faculty involved in a proposed new program, including programs approved under the expedited approval protocol?

Q4: When developing a new program proposal, what information is reasonable and appropriate to meet QAF evaluation criterion 2.1.6 a), “Appropriateness of the proposed methods for the assessment of student achievement of the intended program learning outcomes and Degree Level Expectations”?

Q5: When developing a new program proposal, what information is reasonable and appropriate to meet QAF evaluation criterion 2.1.6 b), “Completeness of plans for documenting and demonstrating the level of performance of students, consistent with the institution’s statement of its Degree Level Expectations”?

Q6: How long does it take for a new program to receive approval by the Quality Council?

Q7: What is the difference between a new program approvals process and an expedited program approvals process?

Q8: How does Quality Council approval of a new program relate to funding approval by MTCU?

CYCLICAL PROGRAM REVIEWS

Q9: Should a university seek approval from the Quality Council to allow a review of an existing program to exceed the eight-year limit for a cyclical program review, if there are exceptional circumstances?

Q10: Are there any programs that are not required to be on the cyclical program review schedule?

AUDITS

Q11: Has the Quality Council allowed a university a deferral for its scheduled Quality Assurance Audit?

Q12: Is it possible to see examples of Audit Reports of other universities?

Q13: What is expected in the university’s review of the draft Audit Report and accompanying Summary?

Q14: What is expected in the Institutional One-Year Follow-Up Response and what does the Audit Committee do to review it?

MAJOR MODIFICATIONS

Q15: Who is responsible for deciding if a change to a program constitutes a major modification, a minor modification, or a new program?

Q16: When does a “significant change” to a program result in a major modification and when does it result in a new program?

ANSWERS

NEW PROGRAMS

Q1: When is a “significant change” to a program considered a major modification and when is it considered a new program?

A1: The QAF contains definitions of both “major modifications” and “new programs”. The Guide to the QAF includes several examples to assist universities in identifying the differences. Each university’s IQAP is also a source of additional guidance on its own institutional definitions, including defining “…what constitutes a ‘significant change’ in the requirements, intended learning outcomes or human and other resources associated with the program” (page 17 of the Quality Assurance Framework). Most universities have named in their IQAPs an internal arbiter (often the Provost) to help determine how a “significant” change will be handled under the university’s IQAP. The first step for a university is to consult its own arbiter, if it has named one. Because of the complexities evident in the definitions, this question is very often answered through a discussion between the university’s authoritative key contact for quality assurance and the Executive Director of the Quality Council.

When the Quality Council reviews universities’ Annual Reports on Major Modifications, questions sometimes emerge about whether a reported major modification should, in fact, have been handled as a new program. The Quality Council takes seriously its oversight responsibility to ensure that universities meet the requirements of the Quality Assurance Framework.

Return to Questions

Q2: Does the Quality Council grant extensions on deadlines for Reports on New Programs?

A2: The reports required by the Quality Council as part of a conditional approval address important program requirements that are underway but not completed at the time of program approval, such as faculty hiring or new space requirements. The timelines for submitting those reports reflect both the nature of the issue and a realistic estimate of the time the university needs to assemble the relevant information. It is important to submit those reports in a timely way; accordingly, the Quality Council monitors submission deadlines closely and expects to receive each submission on or before the required date.

Once the Quality Council is satisfied that all the reporting requirements have been met, it will grant the program “approval to continue without condition”.

Return to Questions

Q3: Are universities required to submit CVs for the faculty involved in a proposed new program, including programs approved under the expedited approval protocol?

A3: External Reviewers, the Appraisal Committee and the Quality Council require sufficient information about the faculty who will be involved in delivering a new program in order to assess several evaluation criteria in the QAF. Most universities submit faculty CVs along with a new program proposal or a proposal for expedited approval. Because faculty members typically maintain up-to-date CVs for research grants and other reporting purposes, the CV is a very convenient way to convey this information. Occasionally a university will instead submit a narrative description of each faculty member’s background; as long as the description can be evaluated against the QAF criteria, this is sufficient. In the experience of the Appraisal Committee, however, CVs offer the best way to convey this important information.

Return to Questions

Q4: When developing a new program proposal, what information is reasonable and appropriate to meet QAF evaluation criterion 2.1.6 a), “Appropriateness of the proposed methods for the assessment of student achievement of the intended program learning outcomes and Degree Level Expectations”?

A4: External Reviewers and the members of the Appraisal Committee and Quality Council need to be able to discern the relation between the method of assessment and the particular learning outcome, and how both reflect the appropriate Degree Level Expectations (UUDLEs or GDLEs). To give an obvious example, if a learning outcome is focused on the development of oral communication skills, then a written test as the method of assessment would be questionable. If an outcome indicates the importance of applying specific knowledge in order to develop a set of cognitive and conceptual problem-solving skills, then written tests and assignments certainly can be appropriate. If an outcome concerning such application involves achieving a designated proficiency in hands-on skills, then a practical assignment assessed through (but not limited to) observation would have a more immediate relation to the outcome. Simply put, “hands-on application” and “written conceptualization” do not bear a clear and immediate relation to one another.

Reviewers of a program proposal ask the same question that students and instructors ask: “Is the assignment or assessment method well suited for students to demonstrate their knowledge, skills and attributes, and for instructors to be able to assess and evaluate that demonstration?”

Some examples of how universities have provided information that helps reviewers assess this criterion are included below:

Example 1: Proposal for new undergraduate program

The proposal lists several types of formative and summative assessment methods. The program emphasizes hands-on assessment methods, such as lab and field work and independent research projects. The proposal provides a table showing the alignment between program learning outcomes and assessment methodologies.

Example 2: Proposal for a new Honours Bachelor’s program

In addition to traditional tests and exams that measure recall of content and application of knowledge, the program will assess comprehensive knowledge through projects that require application. Deep learning, critical thinking, and creative problem-solving include knowledge of alternative bases of wisdom and methodologies, considered in measured relation to those of the “Western” mainstream tradition. There will also be an emphasis on community engagement and experiential learning wherever possible. Learning is also demonstrated through ethical research practices, as well as written and oral presentations, key skills that need to be practiced and developed to prepare graduates for career success.

Examples of such an approach to assessment include:

  • Evaluating student ability to merge diverse knowledges, including non-Eurocentric and community-based knowledges, with Western theoretical concepts in essays, presentations, research projects and exams
  • Measuring student ability to identify and communicate the impacts of historical institutional and governmental policies on the health, wellbeing, self-determination and organization of a specific community
  • Gauging student efficacy in identifying, supporting and assisting with the research interests of self-defined communities
  • Evaluating student use of community-designed theoretical frameworks in the development of practical, meaningful and innovative applications that benefit community partners and the university.

The proposal indicated that the assessment practices in the program would include:

  • Tests & Examinations
  • Simulation
  • Research Essay
  • Critical Reflection
  • Seminar Presentation
  • Debates
  • Oral Examination/Assessment
  • Presentations
  • Research Project
  • Relevant Community Engagement

Example 3: Master’s Program with Capstone

Assessment of Learning: Student performance in the program will be assessed through a variety of methods, including reports, presentations, assignments and project portfolios, cumulative toward the calculation of course grades. A table clearly presented the techniques to be used to assess the skills and competencies associated with designated program learning outcomes. These included: assignments, projects, the capstone, the portfolio, and presentations.

Assessing Experiential Learning: Through its modular and flexible structure, the program has been designed to be responsive and accommodating to student interests and creativity. Responsiveness necessitates an infrastructure for assessing learning that ensures rigor, consistency and meaningful standards across the potentially diverse outcomes anticipated. Central to resolving this tension between flexibility and consistency is the capstone project, to be evaluated by the Program Committee.

Capstone requirements include a project proposal and the selection of a Project Advisory Committee (PAC), both approved by the Program Committee. The project proposal must include a rubric and metric for individual and group evaluation, negotiated between students and their PAC and approved by the Program Committee, to ensure rigor across projects and to provide the means for final capstone assessment and evaluation. This apparatus provides project-specific flexibility to define deliverables and outcomes, together with a program-wide system for assessing quality and student achievement. Students are required to provide project updates every three months to ensure ongoing feedback and review before the final deliverable(s) are evaluated. Finally, the PAC will be required to submit a grade and an assessment of the project, with clear demarcation of individual student contributions (in the case of group projects), to the Program Committee for review.

Responsibility for helping students to navigate the program and for addressing any flagged issues lies with the program Director and the departmental Graduate Coordinators. The Program Committee will also report to the Curriculum Committee and will forward any student issues or appeals that cannot be resolved at the program level to the departmental committees on standing. At the program level, the Director, Graduate Coordinators and Program Committee will be responsible for approving student admissions, capstone proposals, Project Advisory Committee composition and final project evaluations. Student progress through the program as a whole will be closely monitored by program administrators to ensure appropriate progress and that all programmatic requirements are met.

Example 4: Master’s Program

Pedagogical and Evaluation Strategies: Faculty members use a multi-modal method for course instruction, and this method is also applied to evaluation. Students are evaluated on the core knowledge gained in the course; however, application of these concepts is a key component of the student’s evaluation. For example, exams and midterms are designed with a combination of questions to ensure that students not only understand concepts but also know how to apply them, make appropriate decisions to determine potential solutions, and use their knowledge to develop unique or innovative solutions. Student performance is also measured in terms of written and oral communication skills in the majority of courses, which allows students to develop their communication skills continuously throughout the degree, producing graduates who have the required basic knowledge and skills and who can use that knowledge effectively and appropriately. Lastly, students will keep an e-portfolio throughout the program. The portfolio includes the deliverables from the core courses, the research methods course, the specialization courses, and the Major Research Paper (MRP). The portfolio will help students integrate knowledge acquired across the courses and the MRP, and will enable faculty to make an assessment against the overall program learning outcomes.

Example 5: Master’s Program

The program will utilize a variety of methods to assess student achievement of the program learning objectives during and after the twelve-month program (presented in a table depicting the relations between assessment methods and learning outcomes). Assessment includes community partner feedback through a survey provided at the end of the placement; written and oral class work, such as term papers, drafts of proposed methods, critical reflections, presentations, and final grades; the MRP proposal (including the Research Ethics Board application); and the final MRP. To provide information about classroom work, instructors will be asked to provide a breakdown of the class final grade mean by type of work completed. We will also track presentations at conferences, teaching feedback for TAs, participation in research grants, and scholarships and awards. An exit survey will be administered upon completion of the MRP to assess student perspectives on their own achievement, their level of satisfaction with the program, and their future plans. With their agreement, we will occasionally contact program alumni to find out how they have used their MA training.

Rationale: These methods of demonstrating and documenting student achievement are appropriate because they provide ongoing assessment of the student during the program and are easily recorded for longer-term uses. The community partner survey will assess satisfaction with 1) the work accomplished by the student, and 2) placement facilitation and oversight by university staff and faculty. Student reflections at the end of the placement will provide the student perspective on these same topics. The exit survey will then provide a broader overall assessment of the program from the student’s perspective. Following up with alumni over time provides information on the longer-term value of the program, both for students and for society, and informs ensuing cyclical program reviews.

Return to Questions

Q5: When developing a new program proposal, what information is reasonable and appropriate to meet QAF evaluation criterion 2.1.6 b), “Completeness of plans for documenting and demonstrating the level of performance of students, consistent with the institution’s statement of its Degree Level Expectations”?

A5: “Documenting” and “demonstrating” depend on the nature of the program. “Level of performance” implies a continuum of success. For instance, many programs, whether traditional academic programs or those with an immediate practical application, document for their own records the grade spread of a graduating cohort (undergraduate: cumulative major GPA in the context of cumulative degree GPA; graduate: cumulative in-program GPA). Some programs set a course grade or GPA threshold that students must achieve for graduation. Many programs calculate completion and placement rates, sometimes with the assistance of professional organizations for data collection, and devise plans for surveying alumni one year after graduation, then five years after. Again, the types of documentation, as well as the aim and need of demonstration, are program-specific; there is no one-size-fits-all approach. “Documentation” is a function of the different needs programs have for the information; in all cases, however, every program will undergo review every eight years.

Each proposal is assessed, in part, in terms of whether program design and delivery, and student performance of knowledge, skills and abilities, are achieved at the level of the degree (undergraduate Bachelor’s, Graduate Diploma, Master’s, Doctoral). In addition to these expectations, each proposal is also assessed, given the program design and delivery, in terms of whether students are actually achieving the outcomes specified as central to the program. Criterion 2.1.6 b) asks programs to devise ways of demonstrating and documenting whether such outcomes are being achieved, primarily as a means of ongoing self-assessment and to provide information for future cyclical program reviews.

Simply put: “How do you plan to assess (document and demonstrate) whether all the effort put into designing and, soon, delivering the program is working in the way you expected, and with the levels of success you expected? What sort of information do you need in order to answer that question?” Generally speaking, that information is drawn from students’ actual results during the program, upon graduation, and after graduation.

For some programs, isolating the most appropriate information (the what, but also the how) is a challenge, including one of human resources; however, meeting the challenge in a program-specific way is a necessity when program design, approval and delivery rest on learning outcomes.

Some examples of how universities are providing this information in their new program proposals are shown below:

Example 1: Proposal for new undergraduate program

Projects or assignments for the capstone courses will be created with not only the course learning outcomes but also the program learning outcomes in mind, so that both individual faculty and the program as a whole have a way of evaluating whether students have met the program learning outcomes and degree level expectations.

To document and demonstrate the level of performance of students with respect to program learning outcomes, the university is using the Desire2Learn Insights tool. With this tool, course-level assignments are linked to course and program outcomes and to degree level expectations to provide quantitative data on student achievement of program-level learning outcomes.

To complement these direct forms of program assessment with an indirect form, students are introduced to the program learning outcomes when they begin their degree; in an exit survey upon graduation, they are asked how well they feel the outcomes were reflected in the course curriculum and how well they feel the outcomes were achieved.

Example 2: Proposal for new Honours Bachelor’s Program

The program will be externally reviewed during cyclical reviews and assessed on an ongoing basis through indicators such as student grades, integrative assessment practices, and awards data. Classes and assessment practices will be closely monitored on an ongoing basis as we receive feedback from students, faculty, teaching assistants, community members and others. Ultimately, we will judge success by the career success and satisfaction of our graduates, and we will therefore make every effort to maintain contact with our graduates to this end. Efforts to improve the program, whether in content or delivery, in response to this data and feedback will be routine and ongoing, in order to better address contemporary issues that arise in relevant communities.

Appendices provide curriculum-mapping and assessment charts (program learning outcomes tied to specific courses, and a table indicating which assessment type was attributed to each course learning outcome).

Example 3: Proposal for Master’s program in a professional discipline

The proposal indicated that the program would use accreditation requirements to ensure that students are meeting the program learning outcomes.

Example 4: Proposal for new Master’s Program

The plans for documenting and demonstrating the level of student performance have been designed specifically to be consistent with the degree level expectations. The program-level learning outcomes are based on the DLEs and provide the backbone of the program. Onto these were mapped appropriate courses and methods of assessment. Since the MRP is the capstone experience and is associated with most of the learning outcomes and DLEs, upon its successful completion students will have achieved the program’s objectives. In addition, more global methods of assessment, such as the exit survey, provide a broader view of the program and student performance. Together, these assessment methods provide a complete picture of the program that is easily documented and can be used for formal cyclical reviews or other purposes.

Return to Questions

Q6: How long does it take for a new program to receive approval by the Quality Council?

A6: The Appraisal Committee and the Quality Council can approve a new program proposal within 45 days of its submission to the Quality Assurance Secretariat. Both bodies meet eleven times a year to ensure timely decision-making. Proposals can take longer to approve where a proposal contains insufficient or unclear information for an evaluation against the criteria in the QAF; in these cases, the university may be asked to submit further information. In many cases, the university is able to respond within a few days of the request and the 45-day timeframe is met. In most cases, universities provide the information in time for the next Appraisal Committee meeting and a decision can be made then. In a few cases, universities and programs have required several months to respond; these delays may be due to the time of year of the request or the extent of the information requested. Meeting dates and submission deadlines are posted on the Quality Council website.

Return to Questions

Q7: What is the difference between a new program approvals process and an expedited program approvals process?

A7: The key process difference is that all new program proposals must be subject to external arm’s-length review by disciplinary experts. This step is not required for proposals that fall under the expedited appraisal protocol, because those programs (e.g., graduate diplomas of Types 1, 2 and 3, graduate collaborative programs, etc.) are based on existing approved programs that have already been the subject of external review. The evaluation criteria are the same for both new and expedited approvals (excluding those that relate to external review). A further process difference is that the Appraisal Committee is the final decision-maker on approval in the expedited process; its decisions are reported, for information, to the Quality Council. In the case of new programs, the Quality Council makes the approval decision, based on a recommendation from the Appraisal Committee.

Return to Questions

Q8: How does Quality Council approval of a new program relate to funding approval by MTCU?

A8: The Quality Council has responsibility for assuring that the quality assurance evaluation criteria in the QAF have been met by each new program that it approves. The Quality Council reports its program approval decisions to the university by formal letter and by posting the decisions on the Quality Council website; MTCU has access to both of these communications. Universities may provide the formal letter of approval as part of their submission for funding approval to MTCU. Universities may submit new program proposals to the Quality Council and MTCU at the same time, but MTCU funding is not granted to new programs that have not been approved by the Quality Council. In cases where the university changes the proposal in response to appraisal, it should ensure that MTCU receives updated information, as appropriate.

Return to Questions

CYCLICAL PROGRAM REVIEWS

Q9: Should a university seek approval from the Quality Council to allow a review of an existing program to exceed the eight-year limit for a cyclical program review, if there are exceptional circumstances?

A9: No. It is the university’s responsibility to ensure that all programs are reviewed at least once every eight years, as specified in its IQAP and the QAF. The Quality Council is aware that there are situations in which universities may not be adhering to the eight-year time frame. In such a case, it is suggested that the university indicate how it is responding to this deviation from its policy and make every effort to get back on schedule. When universities are audited, once every eight years, the Quality Assurance auditors examine the conformity of program reviews to the eight-year cycle and will make recommendations or identify a cause for concern if there is a problem.

Return to Questions

Q10: Are there any programs that are not required to be on the cyclical program review schedule?

A10: No. All programs that result in a degree (or Graduate Diploma) are subject to cyclical program review, including programs that are also the subject of accreditation. There are ways in which accredited programs can be reviewed with efficiencies (see the Guide to the QAF). Occasionally a university may suspend admissions to a program for a year; such programs should remain on the schedule and be the subject of cyclical program review.

Return to Questions

AUDITS

Q11: Has the Quality Council allowed a university a deferral for its scheduled Quality Assurance Audit?

A11: No. Almost all universities audited to date have indicated that they would like more time to implement their IQAP before they come up for audit, but all universities have been audited according to the agreed schedule, which was based on the former audit system (UPRAC). The auditors take account of the newness of the processes in their audit visit and audit only those programs that have been the subject of IQAP processes.

Return to Questions

Q12: Is it possible to see examples of Audit Reports of other universities?

A12: The Summary Reports of the Principal Findings of the Quality Assurance Audit at each university are posted on the Quality Council website and are available for anyone to see following their approval by the Quality Council. The Quality Assurance Secretariat does not distribute the full Audit Reports except when requested to do so by those specified in the QAF (COU, OCAV and MTCU). Others seeking an Audit Report may approach the university directly; universities operate under their own policies with respect to sharing their full Audit Reports.

Return to Questions

Q13: What is expected in the university’s review of the draft Audit Report and accompanying Summary?

A13: The university’s key contact for quality assurance should prepare a brief statement that identifies any errors of fact or omission in the draft report or summary. This statement becomes part of the official audit documentation, but it is not intended to be a response to the recommendations or suggestions; those responses come in the university’s Institutional One-Year Follow-Up Response.

Return to Questions

Q14: What is expected in the Institutional One-Year Follow-Up Response and what does the Audit Committee do to review it?

A14: The university’s key contact should detail the steps and activities the university has taken to respond to the Recommendations in the Audit Report. Universities may also include their response to any Suggestions. If the IQAP has been modified in response to the Audit, it should also be submitted to the Quality Council for re-ratification.

The auditors responsible for the Audit Report review the university’s Institutional One-Year Follow-Up Response and prepare a Commentary on its Scope and Adequacy, as well as a Summary of their Commentary suitable for publication. All of these reports go to the Audit Committee, which makes a recommendation to the Quality Council on whether or not to accept the Response.

Once approved by the Quality Council, the Institutional One-Year Follow-Up Response and the auditors’ Summary of the Scope and Adequacy of the Response are posted on the websites of both the university and the Quality Council.

Return to Questions

MAJOR MODIFICATIONS

Q15: Who is responsible for deciding if a change to a program constitutes a major modification, a minor modification, or a new program?

A15: The university is responsible for following its IQAP and identifying which program changes are considered major modifications, minor modifications or new programs. The Quality Assurance Secretariat will respond to requests for assistance. Most universities have identified in their IQAPs whom to consult when an arbiter is needed. (See the Guide to the Quality Assurance Framework.)

Return to Questions

Q16: When does a “significant change” to a program result in a major modification and when does it result in a new program?

A16: See the answer to Q1.

Return to Questions