  • Are you training your own models like Llama instead of using GPT?

    Anil Matcha
    8 replies

    Replies

    Richard Gao
    Don't see much of a purpose for taking that effort right now. But once we have some NSFW chatbot integrations, we might decide to train our own Llama model for evoke-app.com
    Hashnimo
    NoNext for YouTube
    Completely dependent on GPT, at least until they somehow decide to ban my API access.
    Maria Gonzalez
    Yes, I've been trying my hand at training my own models, some of which are very comparable to Llama. Although GPT is useful, there are times when a tailored model is preferable. I can tailor the model's design and settings to my specific context and data to improve its predictive efficacy, for instance. I get more granular control over the training process and can make any necessary adjustments to the model when I do it myself. GPT is a helpful tool, but I think you can get better, more specialized outcomes from investigating other methods of model training.
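    For illustration only, here is a minimal fine-tuning sketch along the lines Maria describes, using the Hugging Face Trainer. The base model, data file, and hyperparameters are placeholders and assumptions, not her actual setup.

```python
# Minimal sketch: fine-tune a small causal LM on your own domain data.
# "gpt2" and "my_domain_data.txt" are stand-ins; swap in your own checkpoint and corpus.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "gpt2"  # placeholder; use a Llama-family checkpoint you have access to

tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(base_model)

# Hypothetical plain-text file with one training example per line.
dataset = load_dataset("text", data_files={"train": "my_domain_data.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="finetuned-model",
        per_device_train_batch_size=2,
        num_train_epochs=3,
        learning_rate=5e-5,
    ),
    train_dataset=tokenized["train"],
    # mlm=False gives standard next-token (causal LM) objective
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("finetuned-model")
```

    This is the kind of granular control she mentions: the base checkpoint, data, and training settings are all yours to adjust.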
    Alexander Miroch
    Actually installed Alpaca on a Linux host with 8 GB RAM. I'd say I'm 80% satisfied with the responses. It takes around 8 seconds to generate a decent response. I'm using dalai, so maybe it adds some processing time. Anyway, ChatGPT is not the only option now :)
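    For reference, a minimal sketch of querying a locally run Alpaca/Llama-style model from Python. Alexander uses dalai; this sketch uses the llama-cpp-python binding as a stand-in, and the model path and prompt format are assumptions.

```python
# Minimal sketch: run an Alpaca/Llama-style model locally and get a completion.
# The model path is hypothetical; point it at a quantized model file you have downloaded.
from llama_cpp import Llama

llm = Llama(model_path="./models/alpaca-7b-q4.gguf", n_ctx=512)

response = llm(
    "### Instruction:\nExplain what Alpaca is in one sentence.\n\n### Response:\n",
    max_tokens=128,
    stop=["### Instruction:"],  # stop before the model starts a new instruction block
)
print(response["choices"][0]["text"].strip())
```

    On a machine with around 8 GB of RAM, a quantized 7B model is roughly the size class Alexander describes, which is consistent with multi-second response times on CPU.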