• 0 Posts
  • 223 Comments
Joined 1 year ago
Cake day: June 28th, 2023

  • It does?

    Pixel Camera (previously known as Google Camera) can take full advantage of the available cameras and image processing hardware as it can on the stock OS and does not require GSF or sandboxed Google Play on GrapheneOS. Direct TPU and GXP access by Google apps including Pixel Camera is controlled by a toggle added by GrapheneOS and doesn’t provide them with any additional access to data. The toggle exists for attack surface reduction. Every app can use the TPU and GXP via standard APIs including the Android Neural Networks API and Camera2 API regardless.

    The TPU and GXP are what enable apps to do on-device AI with whatever model they choose to bring.
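
    To make the "standard APIs" point concrete, here is a minimal Kotlin sketch (my own illustration, not GrapheneOS code; the asset name "model.tflite" is hypothetical) of an ordinary app requesting NNAPI acceleration through TensorFlow Lite, which Android can route to the TPU/GXP driver when one is available:

        import android.content.Context
        import org.tensorflow.lite.Interpreter
        import org.tensorflow.lite.nnapi.NnApiDelegate
        import org.tensorflow.lite.support.common.FileUtil

        // Load a bundled TFLite model and ask for NNAPI acceleration.
        // On devices with a TPU/GXP, the NNAPI driver can offload supported ops;
        // otherwise inference falls back to the CPU. No special privileges needed.
        fun buildInterpreter(context: Context): Interpreter {
            val model = FileUtil.loadMappedFile(context, "model.tflite")
            val options = Interpreter.Options().apply {
                addDelegate(NnApiDelegate())
            }
            return Interpreter(model, options)
        }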




  • I wouldn’t say it’s a theory, just my thoughts / speculation. I would speculate that people who are pushing out RT / Alex Jones level content would be more hesitant to do so if there were a big “this is not a reliable news source” sticker next to it.

    I would speculate that people who post Alex Jones / RT stuff just hope people read the article without thinking about where it comes from.


  • While I acknowledge that MBFC does have some right-wing bias, I think it serves its purpose: flagging literal propaganda “news” sites.

    The titles in the image are literally accurate. Israel is (unethically) launching preemptive strikes.

    If you look at the .ml news communities that don’t use MBFC, you will see that way too many news stories are from literally Russia Today, the South China Morning Post, and other publications extremely biased toward a very particular agenda.

    I think people are trying to tie MBFC to Zionism just so the bot gets dropped and it becomes easier for them to normalize outlets like Russia Today outside of .ml spaces.



  • It’s a waste of everyone’s time for sure. It’s just good business sense to make your customers happy though.

    As for typing speed, perhaps, ya lol. You could be faster. But I think the best approach here is using high-quality, locally run LLMs that don’t produce slop. For me, I can count on one hand how many times I’ve had to correct things in the past month. It’s a matter of understanding how LLMs work and fine-tuning. (Emphasis on the fine-tuning.)