- cross-posted to:
- technology@beehaw.org
- hackernews@derp.foo
Apple wants AI to run directly on its hardware instead of in the cloud::iPhone maker wants to catch up to its rivals when it comes to AI.
Google is doing this exact same thing with Gemini, the platform behind Bard / Assistant.
Gemini has large-scale models that live in data centers and handle complex queries. They also have a “Nano” version of the model that can live on a phone and handle simpler on-device tasks.
The smaller models are great for things like natural language UI and smart home controls. They’re also way faster and capable of working offline. A big use case for offline AI has been hiking with the Apple Watch in areas with no reception.
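To make the split concrete, here’s a minimal sketch of how an assistant could route between the two tiers. The types (`OnDeviceModel`, `CloudModel`, `AssistantRouter`) are hypothetical, not any real Apple or Google API; the idea is just to prefer the cloud for complex queries when a connection exists and fall back to the on-device model otherwise.

```swift
import Foundation

// Hypothetical sketch of the "big model in the cloud, small model on device" split.
// None of these names correspond to a real SDK.

protocol LanguageModel {
    func respond(to prompt: String) -> String
}

struct OnDeviceModel: LanguageModel {
    // Small model bundled with the device: fast, works offline,
    // suited to simple tasks like UI commands or smart-home controls.
    func respond(to prompt: String) -> String {
        return "on-device answer for: \(prompt)"
    }
}

struct CloudModel: LanguageModel {
    // Full-scale model in a data center: handles complex queries,
    // but needs a network connection and adds latency.
    func respond(to prompt: String) -> String {
        return "cloud answer for: \(prompt)"
    }
}

struct AssistantRouter {
    let local = OnDeviceModel()
    let remote = CloudModel()

    func answer(_ prompt: String, isOnline: Bool, isComplex: Bool) -> String {
        // Use the cloud only for complex queries with a connection;
        // otherwise stay on device (e.g. hiking with no reception).
        if isComplex && isOnline {
            return remote.respond(to: prompt)
        }
        return local.respond(to: prompt)
    }
}

let router = AssistantRouter()
print(router.answer("Turn off the living room lights", isOnline: false, isComplex: false))
```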
There’s also battery management, background task power distribution, and hardware energy efficiency. It would be great to have AI that adapted hardware energy-consumption settings to my use case. Yes, I know algorithms already exist to do that, but a much more flexible, AI-based energy manager that accommodates and adapts to my usage patterns would be great.
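As a rough sketch of what that could look like, here’s a hypothetical usage-aware power manager in Swift. The types, thresholds, and profiles are all made up; the point is that a learned model could replace the hand-written rules with something that adapts to each user’s habits over time.

```swift
import Foundation

// Hypothetical usage-aware power manager: pick a power profile from a
// simple usage sample. A learned, per-user model would replace these rules.

enum PowerProfile {
    case performance   // gaming, video editing
    case balanced      // everyday use
    case lowPower      // idle, overnight, low battery
}

struct UsageSample {
    let cpuLoad: Double        // 0.0 ... 1.0
    let screenOn: Bool
    let batteryLevel: Double   // 0.0 ... 1.0
}

struct AdaptivePowerManager {
    func profile(for sample: UsageSample) -> PowerProfile {
        // Hand-written thresholds stand in for what a model would learn.
        if sample.batteryLevel < 0.2 || !sample.screenOn {
            return .lowPower
        }
        if sample.cpuLoad > 0.7 {
            return .performance
        }
        return .balanced
    }
}

let manager = AdaptivePowerManager()
let sample = UsageSample(cpuLoad: 0.85, screenOn: true, batteryLevel: 0.6)
print(manager.profile(for: sample))   // prints "performance"
```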