• 0 Posts
  • 20 Comments
Joined 1 year ago
Cake day: June 19th, 2023

  • skip0110@lemm.ee to Ask Lemmy@lemmy.world · *Permanently Deleted*
    1 year ago

    I don’t hate Google. But some of their services/products are buggier than the competition’s (Gmail, Chat, Chrome) and some don’t offer me much utility (free-form search for products or recommendations, Maps), so I use the better competitor products where that benefits me, and I use the Google product when it offers me a benefit (searching for technical documentation or finding a specific URL, Chrome DevTools). In some cases I’m locked in (Gmail), and in that respect it’s frustrating (though that’s not unique to Google).

  • I think this model has billions of weights, so the model itself is quite large. Since the receiver needs to already have this model, I’d suggest that rather than compressing the data, we have instead pre-encoded it, embedded it in the model weights, and thus the “compression” is basically just passing a primary key that points to the data inside the model.

    It’s like, if you already have a copy of a book, I can “compress” any text in that book into two numbers: a page offset, and a word offset on that page. But that’s cheating because, at some point, we had to transfer the book too!
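
    To make the analogy concrete, here’s a minimal Python sketch of “book compression” (the book text, page size, and function names are all made up for illustration, not anything from the actual model):

    ```python
    # Toy "compression" against a shared book: both sides must already hold book_pages.
    # The tiny example text and all names here are hypothetical.

    book_pages = [
        "it was the best of times it was the worst of times".split(),
        "it was the age of wisdom it was the age of foolishness".split(),
    ]

    def compress(phrase: str) -> tuple[int, int, int]:
        """Return (page, word_offset, length) locating the phrase in the shared book."""
        words = phrase.split()
        for page_no, page in enumerate(book_pages):
            for start in range(len(page) - len(words) + 1):
                if page[start:start + len(words)] == words:
                    return (page_no, start, len(words))
        raise ValueError("phrase not in the shared book, so it can't be 'compressed' this way")

    def decompress(page_no: int, start: int, length: int) -> str:
        """Recover the phrase from the shared book using only the three numbers."""
        return " ".join(book_pages[page_no][start:start + length])

    ref = compress("the age of wisdom")   # e.g. (1, 2, 4): three small integers
    print(decompress(*ref))               # prints "the age of wisdom"
    ```

    The three integers are tiny compared to the phrase, but only because the whole book was shipped ahead of time; with an LLM “compressor”, the model weights play the role of the book.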