- Had a good time collaborating with @mellymeldubs.bsky.social and the AI for Humanists team on this! It also features some data we've been cooking up for @print-and-prob.bsky.social. Worth thinking about whether you have a research task that a local LLM could help with...
- New NEH-supported tutorial on running LLMs locally with ollama! Your laptop is more powerful than you think. Save money and energy, and keep your data private. aiforhumanists.com/tutorials/
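
For a taste of what "local" means in practice, here's a minimal Python sketch (not from the tutorial) that queries a locally running Ollama server over its default REST endpoint; the model name `llama3.2` and the example prompt are just placeholders:

```python
# A minimal sketch, assuming Ollama is installed, `ollama serve` is running,
# and an example model (here "llama3.2") has already been pulled.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "llama3.2") -> str:
    """Send one prompt to the local Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON response, not chunks
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local port
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# A hypothetical humanities-flavored task: everything runs on your own
# machine, so there are no API fees and your data never leaves your laptop.
print(ask_local_llm("Modernize the spelling: 'Loue is not loue which alters.'"))
```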