by nreece on 6/1/25, 6:32 AM with 43 comments
by salamo on 6/1/25, 11:04 PM
That said, I probably wouldn't use this unless mine were one of the specific use cases supported [0]. I have no idea how hard it would be to add a new model supporting arbitrary inputs and outputs.
For running inference cross-device I have used ONNX, which is low-level enough to support whatever weights I need. For a good number of tasks you can also use transformers.js, which wraps ONNX and handles things like decoding (unless you really enjoy implementing beam search on your own). I believe an equivalent link to the above would be [1], which is just much more comprehensive.
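To make that contrast concrete, here's a minimal sketch of the low-level path with onnxruntime-web in the browser; the model file name and the 1x3x224x224 input shape are placeholders, and at this level preprocessing, tokenization and any decoding are entirely on you:

```typescript
// Minimal onnxruntime-web sketch: load an arbitrary ONNX export and run it.
// "model.onnx" and the input shape below are placeholders, not a real model.
import * as ort from "onnxruntime-web";

async function runRawOnnx(): Promise<void> {
  const session = await ort.InferenceSession.create("model.onnx");

  // You build the input tensor yourself; preprocessing and any decoding
  // (greedy/beam search) are your responsibility at this level.
  const input = new ort.Tensor(
    "float32",
    new Float32Array(1 * 3 * 224 * 224),
    [1, 3, 224, 224]
  );

  const outputs = await session.run({ [session.inputNames[0]]: input });
  console.log(outputs);
}

runRawOnnx();
```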
by arbayi on 6/1/25, 3:44 PM
A gallery that showcases on-device ML/GenAI use cases and allows people to try and use models locally.
by ricardobeat on 6/1/25, 1:35 PM
by pzo on 6/2/25, 4:29 AM
These days it's probably better to stick with onnxruntime via the Hugging Face transformers or transformers.js libraries, or to wait until ExecuTorch matures. I haven't seen a SOTA model release with an official port to TensorFlow Lite / LiteRT in a long time: SAM2, EfficientSAM, EdgeSAM, DFINE, DEIM, Whisper, Lite-Whisper, Kokoro, DepthAnythingV2 - everything is PyTorch by default, but with still-large communities around ONNX and MLX.
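For the transformers.js route mentioned above, a minimal sketch (assuming the @xenova/transformers package and the Xenova/whisper-tiny.en ONNX export on the Hub; both are illustrative choices, not something from this thread):

```typescript
// transformers.js wraps onnxruntime and handles tokenization and decoding,
// so running an ONNX export of Whisper locally is only a few lines.
import { pipeline } from "@xenova/transformers";

async function transcribe(): Promise<void> {
  const asr = await pipeline(
    "automatic-speech-recognition",
    "Xenova/whisper-tiny.en" // example ONNX export from the Hub
  );

  // Input can be a URL to an audio file or raw PCM samples.
  const result = await asr(
    "https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav"
  );
  console.log(result); // { text: "..." }
}

transcribe();
```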
by yeldarb on 6/1/25, 12:53 PM
Got really excited then realized I couldn’t figure out what “Google AI Edge” actually _is_.
Edit: I think it’s largely a rebrand of this from a couple years ago: https://developers.googleblog.com/en/introducing-mediapipe-s...
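For reference, this is roughly what that MediaPipe layer of "AI Edge" looks like from JavaScript; a hedged sketch based on the documented LLM Inference task API, where the WASM CDN path and the model asset path are placeholders and option names may drift between releases:

```typescript
// Sketch of MediaPipe's LLM Inference task on the web (the stack the
// AI Edge Gallery appears to build on). URLs and model path are placeholders.
import { FilesetResolver, LlmInference } from "@mediapipe/tasks-genai";

async function main(): Promise<void> {
  // Resolve the WASM files backing the GenAI tasks.
  const genai = await FilesetResolver.forGenAiTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-genai/wasm"
  );

  // Load a LiteRT-converted model (e.g. a Gemma variant) that you host.
  const llm = await LlmInference.createFromOptions(genai, {
    baseOptions: { modelAssetPath: "/models/gemma-2b-it-gpu-int4.bin" },
    maxTokens: 512,
    temperature: 0.8,
  });

  const answer = await llm.generateResponse(
    "Summarize what on-device inference means."
  );
  console.log(answer);
}

main();
```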
by 6gvONxR4sf7o on 6/2/25, 12:47 AM
by davedx on 6/1/25, 12:25 PM
(It seems to be open source: https://github.com/google-ai-edge/mediapipe)
I think this is a unified way of deploying AI models that actually run on-device ("edge"). I guess a sort of "JavaScript of AI stacks"? I wonder who the target audience is for this technology?
by hatmanstack on 6/1/25, 3:13 PM
by zb3 on 6/1/25, 1:24 PM
by danielb123 on 6/1/25, 1:22 PM
by rs186 on 6/3/25, 12:18 AM
You know how terrible the store and the publishing process are -- their own people don't even use it.
by dingody on 6/3/25, 2:30 AM
by stanleykm on 6/1/25, 3:40 PM
by roflcopter69 on 6/1/25, 10:52 PM
For context, I get to choose the tech stack for a greenfield project. I think that ExecuTorch, which belongs to the PyTorch ecosystem, will have a far more predictable future than anything Google does, so I'm currently leaning more toward ExecuTorch.
by init0 on 6/2/25, 4:06 AM
by synergy20 on 6/2/25, 1:45 AM
by suilk on 6/2/25, 1:36 AM
by rvnx on 6/1/25, 1:07 PM
Go to this page using your mobile phone.
I am apparently a doormat or a seatbelt.
It seems to be a rebranded failure. At Google you get promoted for product launches because of the OKR system, and only rarely for maintenance.