
Running Your Own Generative AI Language Model Locally (In-Person)

This workshop walks you through the process of running a generative AI language model locally using Python. Some prior Python experience is helpful but not required for attendance. We will use the llama-cpp-python library and the Microsoft Phi-4 language model to demonstrate how to set up and query language models locally. Topics covered include:

  • Steps to install, configure, and run a local language model
  • GPU development environment setup
  • Responsible AI use and guidelines
  • Considerations for selecting a local generative AI model
  • Basic Python code needed to prompt and retrieve responses from the model (see the sketch after this list)
  • Conversation formatting to improve response quality
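
As a preview of the prompting and conversation-formatting topics above, here is a minimal sketch using llama-cpp-python. It assumes a quantized GGUF conversion of Phi-4 has already been downloaded locally; the file path, parameter values, and prompts are placeholders, not the workshop's exact code.

  # Minimal sketch: prompt a local Phi-4 model with llama-cpp-python.
  # The model path below is a hypothetical local GGUF file.
  from llama_cpp import Llama

  llm = Llama(
      model_path="models/phi-4-Q4_K_M.gguf",  # placeholder path to a quantized Phi-4 file
      n_ctx=4096,                              # context window size
      n_gpu_layers=-1,                         # offload layers to the GPU if one is available
  )

  # Chat-style message formatting generally improves response quality
  # over sending a single raw text prompt.
  response = llm.create_chat_completion(
      messages=[
          {"role": "system", "content": "You are a concise, helpful assistant."},
          {"role": "user", "content": "Explain what a context window is in one sentence."},
      ],
      max_tokens=128,
  )

  print(response["choices"][0]["message"]["content"])
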

This is a Research Data and Computing Services student-led workshop. Contact: Vincent Scalfani, UA Libraries, vfscalfani@ua.edu

Before using AI tools, be sure to review UA OIT’s guidelines for use of AI and the UA Minimum Security Standard: https://oit.ua.edu/software/artificial-intelligence/ai-approved-list/

Date:
Tuesday, April 15, 2025
Time:
11:30am - 1:00pm
Time Zone:
Central Time - US & Canada
Location:
Scholars' Station - Rodgers Library
Event Type:
Workshop
Categories:
  Research Data Services (Open Session)  
Registration has closed.

Session Organizer

Vincent Scalfani
