This month, our guest editor is Rich Tape, programmer analyst at the Centre for Teaching, Learning and Technology. He presents the LLM Sandbox, a Large Language Model service designed to help UBC faculty experiment with GenAI in their courses.
The integration of large language models and generative AI into higher education continues to accelerate globally, yet Canadian institutions face distinct challenges in adoption. Privacy regulations, data sovereignty requirements, and budget sustainability, among other concerns, create barriers that commercial AI services often cannot address within existing institutional frameworks or legal requirements.
Recent surveys [1][2] indicate growing faculty and student interest in AI applications for teaching, learning, and research, yet implementation remains limited by practical considerations. Educators cite concerns about Privacy Impact Assessment (PIA) compliance, unpredictable pricing models, and potential vendor lock-in as obstacles to AI integration.
Enter the LLM Sandbox.
In response to these challenges, the LT Hub team at the CTLT has developed the LLM Sandbox, a locally hosted large language model (LLM) service designed specifically to meet the requirements of UBC faculty and, potentially, the broader Canadian higher education sector.
This infrastructure addresses several key institutional needs:
- The system operates entirely within Canadian data residency requirements and simplifies PIA processes by maintaining a stateless architecture where no user data is stored by the core service.
- Unlike commercial services billed per token, the LLM Sandbox uses a pricing model that allows for accurate budget forecasting and sustainable long-term planning.
- Built on open standards and OpenAI-compatible APIs, the service prevents vendor lock-in while supporting familiar development frameworks; see the sketch following this list.
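Because the service exposes an OpenAI-compatible API, existing client libraries can typically be pointed at it by changing only the base URL. The sketch below uses the official OpenAI Python SDK against a hypothetical endpoint; the base URL, API key, and model name are placeholders, not the actual LLM Sandbox values, which would be provided for a given project.

```python
# Minimal sketch: calling an OpenAI-compatible endpoint with the OpenAI Python SDK.
# The base URL, API key, and model name below are placeholders, not real LLM Sandbox values.
from openai import OpenAI

client = OpenAI(
    base_url="https://llm-sandbox.example.ubc.ca/v1",  # hypothetical endpoint
    api_key="YOUR_SANDBOX_API_KEY",                    # hypothetical per-project key
)

response = client.chat.completions.create(
    model="example-open-model",  # placeholder model identifier
    messages=[
        {"role": "system", "content": "You are a helpful course assistant."},
        {"role": "user", "content": "Summarize this week's reading in three bullet points."},
    ],
)

print(response.choices[0].message.content)
```

Because the request and response shapes follow the OpenAI specification, tools and frameworks that already speak that API can generally be reused with configuration changes only, which is what makes moving between the Sandbox and other providers straightforward.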
In addition to meeting these needs, the LLM Sandbox infrastructure is designed so that, beyond the funding envelope of the Teaching and Learning Enhancement Fund (TLEF), Faculties with a minimal amount of internal technical support will be able to deploy their own instances, ensuring that their funded projects remain available to their students.
Supporting Implementation Through the LT Incubator
Recognizing that infrastructure alone doesn’t address all implementation barriers, the LT Incubator provides support services for faculty developing AI-enhanced educational tools.
Services include:
- Technical development and system administration
- PIA shepherding and compliance guidance
- Project management and team coordination
- Integration with existing learning systems
This support model allows educators to focus on pedagogical innovation rather than technical infrastructure management.
Other Resources
The LT Incubator and LLM Sandbox are currently supporting multiple TLEF-funded projects, demonstrating practical applications across several disciplines. Faculty interested in exploring AI integration can learn more about the technical specifications at https://lthub.ubc.ca/llm-sandbox-a-locally-hosted-privacy-focused-language-model-service/.
The CTLT and LT Incubator also publish regularly on the AI CTLT website and host multiple weekly generative AI-related events, which are posted on the CTLT Events website.
[1] https://ctlt.ubc.ca/2025/02/21/edubytes-students-perception-of-generative-ai/
[2] https://jipe.ca/index.php/jipe/article/view/245
Enjoyed reading this edition of Edubytes? To view past issues, visit the Edubytes archive.
Are you interested in staying up to date on the latest trends in teaching and learning in higher education? Sign up for our newsletter and get this content delivered to your inbox once a month.