• 0 Posts
  • 11 Comments
Joined 2 months ago
Cake day: May 7th, 2024



  • Well, now that you know that this content is AI generated, and that this community doesn’t allow that, your next move should likely be whatever a responsible member of the community would do when made aware that they aren’t meeting the expectations of their peers. I suppose we’ll get to see what that looks like, if you’re not off flying a kite yourself.


  • I was an early Lemmy user, and joined back when .ml was basically the only choice.

    It was weird, but the thing that really got me considering a different instance was just how many words they added to their censor list.

    I messaged the admins and told them that it wasn’t a great system, because it replaces every forbidden word in a post with “removed” (roughly as in the sketch below), and because of that I can’t tell the difference between someone saying “bitch” and someone saying the n-word. How am I supposed to know whether or not to report such a comment if the offending part is obscured? “What a stupid removed” could be slightly sexist, or indefensibly racist.

    They told me “All slurs are bad, you should report those comments just the same”.

    Ay, fuck you guys, the words I gave you as examples are absolutely not equally bad, or at least, anyone with a functioning brain wouldn’t make that argument…
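
    For context, a blanket word filter of this kind behaves roughly like the sketch below. This is a minimal illustration in Python, not Lemmy’s actual implementation; the word list, pattern, and function name are placeholders.

    ```python
    import re

    # Hypothetical word list for illustration only; the real filter is an
    # admin-configured regex, and these stand-in words are not its contents.
    FILTER = re.compile(r"\b(badword1|badword2)\b", re.IGNORECASE)

    def censor(text: str) -> str:
        # Every match is replaced with the same token, so a reader of the
        # censored post can no longer tell which forbidden word was used.
        return FILTER.sub("removed", text)

    print(censor("What a stupid badword1"))
    # -> "What a stupid removed"
    ```

    Because every forbidden word collapses to the same “removed” token, anyone reading the filtered post has no way to judge how severe the original comment actually was.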



  • Ultimately [the Jolla Mind2] sounds… a lot less useful than the AI-in-a-device features that companies have been promising for products like the Rabbit R1 or Humane AI Pin.

    What the hell? Why would the device with a dedicated NPU and local models be less useful than the piece-of-shit marketing stunts that everyone hates?

    The Mind2 looks interesting. It solves the problem of your own hardware not meeting the requirements to run the model by providing dedicated hardware, and it lets you use your existing smartphone to access it remotely. I am curious how it actually performs.

    It might not be a long-term product concept though. All new phones are going to come stock with a lot more than 6 TOPS of AI compute onboard very soon.