• @IrateAnteater@sh.itjust.works
    5 points · 23 hours ago

    Since VLC runs on just about everything, I’d imagine the cloud service is aimed at the many devices that simply don’t have the horsepower to run an LLM locally.

    • @GenderNeutralBro@lemmy.sdf.org
      2 points · 23 hours ago

      True. I guess they’ll require you to enter your own OpenAI/Anthropic/whatever API token, because there’s no way they can afford to run that centrally. Hopefully you can point it at whatever server you like (such as a self-hosted ollama or similar) - see the sketch below.
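
      Nothing about VLC’s integration is public yet, but ollama does expose an OpenAI-compatible endpoint, so “point it at your own server” could look something like this minimal Python sketch (the model name and prompt are placeholders):

      ```python
      # Minimal sketch: reusing the OpenAI client library against a
      # self-hosted ollama server instead of a paid cloud API.
      from openai import OpenAI

      client = OpenAI(
          base_url="http://localhost:11434/v1",  # ollama's OpenAI-compatible endpoint
          api_key="ollama",  # ollama ignores the key, but the client requires one
      )

      response = client.chat.completions.create(
          model="llama3.2",  # hypothetical: any model you've pulled locally
          messages=[{"role": "user", "content": "Summarize this subtitle track."}],
      )
      print(response.choices[0].message.content)
      ```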

    • @zurohki@aussie.zone
      1 point · 23 hours ago

      It’s not just computing power - you don’t always want your device burning through massive amounts of battery, either.