Summary
In my role supporting faculty effectiveness initiatives at Austin Community College, I designed and implemented a data-informed instructional improvement system aimed at strengthening course consistency, student engagement, and faculty development across multiple departments.
I translated student feedback, course experience data, and institutional quality standards into actionable instructional design improvements, helping faculty and instructional leaders align teaching practices with measurable learning outcomes. Using systems thinking, I connected student feedback loops, operational data, and instructional frameworks into a scalable improvement model that supported continuous improvement of online courses and teaching.
My work focused on embedding a structured, evidence-based approach to instructional design grounded in student experience, quality standards, and faculty development needs.
Challenge
Instructional design practices varied widely across departments, resulting in inconsistent course quality, uneven student experiences, and limited alignment with institutional quality standards. Student feedback existed but was fragmented across systems and was not consistently translated into instructional improvements. Faculty development was also reactive rather than structured, with limited integration between feedback data, course design standards, and instructional coaching.
Additionally, there was no unified framework connecting student experience data from Navigate360 with operational systems such as Canvas, or with the instructional quality benchmarks defined by the Quality Matters Rubric. This created a gap between what students were experiencing and how instruction was being designed and improved.
Audience
Primary stakeholders included faculty across multiple disciplines, academic support teams, department chairs, and instructional leadership. Secondary stakeholders included students and operational teams working within systems such as Canvas, Navigate360, Microsoft 365, and Google Workspace.
I also worked closely with faculty in structured feedback and development cycles delivered through Zoom, ensuring instructional improvements were grounded in real classroom experience and student data.
Process
I began with a needs analysis, combining student feedback and engagement data from Navigate360 with operational insights from Canvas to identify patterns in student experience, course navigation challenges, and engagement gaps. This surfaced instructional breakdowns that were not visible through anecdotal feedback alone.
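A needs analysis like this can be sketched as a small script that cross-references exported feedback records with exported engagement metrics to flag course modules needing attention. This is a minimal illustration only: the record shapes, field names, and thresholds below are hypothetical assumptions, not actual Navigate360 or Canvas data formats.

```python
# Hypothetical sketch: flagging modules where negative feedback and low
# engagement overlap. Field names and thresholds are illustrative assumptions.
from collections import defaultdict

feedback = [  # per-module feedback export (hypothetical shape)
    {"module": "Week 2", "sentiment": "negative", "theme": "navigation"},
    {"module": "Week 2", "sentiment": "negative", "theme": "navigation"},
    {"module": "Week 5", "sentiment": "positive", "theme": "content"},
]

activity = [  # per-module engagement export (hypothetical shape)
    {"module": "Week 2", "avg_page_views": 3.1},
    {"module": "Week 5", "avg_page_views": 12.4},
]

def flag_modules(feedback, activity, min_views=5.0, min_negative=2):
    """Flag modules with both low engagement and repeated negative feedback."""
    negatives = defaultdict(int)
    for row in feedback:
        if row["sentiment"] == "negative":
            negatives[row["module"]] += 1
    return sorted(
        row["module"] for row in activity
        if row["avg_page_views"] < min_views
        and negatives[row["module"]] >= min_negative
    )

print(flag_modules(feedback, activity))  # → ['Week 2']
```

The value of this kind of cross-referencing is that neither data source alone tells the story: low page views without complaints may be a light week, and complaints without low engagement may be a one-off, but the overlap points at a structural problem.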
I mapped instructional workflows and course design gaps using FigJam, which helped visualize where faculty support, course structure, and student feedback loops were misaligned. This systems-level mapping clarified opportunities for redesign and standardization across departments.
I then aligned instructional improvements to the Quality Matters Rubric to ensure all redesign efforts were grounded in nationally recognized standards for course quality, alignment, and accessibility. From there, I developed structured instructional resources and faculty-facing redesign guides that translated Navigate360 feedback into actionable instructional adjustments, supported by documentation and collaboration tools in Google Workspace and Microsoft 365.
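The translation of recurring feedback themes into rubric-aligned redesign guidance can be illustrated with a simple lookup: each observed theme is routed to the Quality Matters General Standard it most relates to, seeding a faculty-facing checklist. The theme taxonomy and the theme-to-standard mapping below are illustrative assumptions, not a Quality Matters product or my exact instrument.

```python
# Hypothetical sketch: mapping feedback themes to Quality Matters General
# Standards to seed a redesign checklist. The mapping is an assumption.
THEME_TO_QM_STANDARD = {
    "navigation": "QM 8: Accessibility and Usability",
    "unclear objectives": "QM 2: Learning Objectives",
    "grading confusion": "QM 3: Assessment and Measurement",
    "broken links": "QM 6: Course Technology",
}

def redesign_checklist(themes):
    """Turn observed feedback themes into a deduplicated list of QM focus areas."""
    seen = []
    for theme in themes:
        standard = THEME_TO_QM_STANDARD.get(theme, "Unmapped: review manually")
        if standard not in seen:
            seen.append(standard)
    return seen

print(redesign_checklist(["navigation", "navigation", "grading confusion"]))
```

Anchoring each theme to a named standard keeps faculty conversations focused on a shared, recognized benchmark rather than on individual comments.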
I facilitated faculty development and coaching sessions via Zoom, incorporating real student feedback, course design walkthroughs, and collaborative redesign activities. These sessions functioned as applied instructional design labs where faculty could directly connect data to course improvements.
Tools Used
Canvas (learning management system), Navigate360 (student feedback and CRM platform), FigJam (instructional workflow mapping), the Quality Matters Rubric (instructional quality framework), Google Workspace, Microsoft 365, and Zoom.
Solution
I developed a structured instructional improvement system that integrated student feedback from Navigate360, operational insights from Canvas, instructional design standards from the Quality Matters Rubric, and collaborative design workflows built in FigJam.
This system created a continuous instructional improvement cycle where student experience data directly informed course redesign, faculty development was aligned to real instructional gaps, and all improvements were anchored in recognized quality standards. Google Workspace and Microsoft 365 supported documentation, communication, and iterative updates across departments, ensuring consistency and scalability.
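The continuous cycle described above can be expressed as a repeating sequence of stages, which is how I framed it for stakeholders: each pass through the loop feeds the next. The stage names below are a simplified paraphrase of the process, not a formal specification.

```python
# Hypothetical sketch: the improvement cycle as a repeating stage sequence.
from itertools import cycle

STAGES = (
    "collect student feedback",
    "analyze engagement data",
    "map gaps to quality standards",
    "redesign course elements",
    "coach faculty and re-release",
)

loop = cycle(STAGES)
first_six = [next(loop) for _ in range(6)]
print(first_six[-1])  # the sixth step wraps back to the first stage
```

The point of modeling it as a cycle rather than a pipeline is that "re-release" is never the end state: the redesigned course immediately generates new feedback for the next iteration.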
Impact
The initiative led to more consistent alignment of courses with Quality Matters standards, improved integration of student feedback into instructional redesign cycles, and increased faculty engagement in structured course improvement processes. Instructional leaders gained clearer visibility into patterns in student experience and course quality, enabling more targeted support and decision-making.
Over time, the organization shifted toward a more structured, data-informed instructional design model grounded in continuous feedback, shared standards, and scalable improvement processes.
Reflection
This work strengthened my ability to design instructional systems that connect student feedback, operational data, and instructional standards into cohesive design frameworks. I developed deeper fluency in translating complex feedback into structured instructional improvements that are both scalable and aligned with institutional expectations.
Most importantly, I learned that effective instructional design is not only about course creation, but about building systems that continuously connect learner experience, faculty development, and quality standards into an ongoing improvement cycle.