Thread: Business GPT API question.
Old 04-02-2025, 10:25 AM  
2MuchMark
Videochat Solutions
 
Join Date: Aug 2004
Location: Canada
Posts: 48,746
It probably depends on your use case, but why not just run an LLM on your local machine instead? It's probably faster depending on your hardware, it's definitely private, and it's pretty f'in cool too.
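
For anyone curious what "run it locally" actually looks like, here's a rough Python sketch. It assumes you've got a local runner like Ollama serving its OpenAI-compatible endpoint on the default port, and that the model name is just whatever you've already pulled; treat the details as an example, not gospel.

# Rough sketch: point the standard OpenAI client at a local model server
# instead of a paid hosted GPT API.
# Assumes Ollama is running locally and exposing its OpenAI-compatible
# endpoint on the default port 11434; "llama3" is just a placeholder for
# whatever model you've pulled.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local server, nothing leaves your box
    api_key="ollama",                      # the client requires a key; Ollama ignores it
)

resp = client.chat.completions.create(
    model="llama3",  # swap in your local model name
    messages=[{"role": "user", "content": "Summarize this support ticket in two lines."}],
)

print(resp.choices[0].message.content)

Nice side effect: since it's the same client, flipping back to the hosted API later is basically just changing base_url and the key.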
__________________

Custom Software | Server Management | Integration and Technology Solutions
https://www.2much.net