by alvis on 10/6/25, 6:27 PM with 382 comments
by sert_121 on 10/7/25, 1:11 AM
The biggest bottleneck for this for the past two years imo wasn't the models, but the engineering and infra around it, and the willingness of companies to work with OpenAI directly. Now that they've grown and have a decent userbase, companies are much more willing to pay or involve themselves in these efforts.
This has eventual implications outside user-heavy internet use (once we see more things built on the SDK), where we're gonna see a fork in web traffic: human-centric workflows through chat, and an SEO-filled, chat/agent-optimized web that is catered only to agents. (crossposted)
by fidotron on 10/6/25, 7:23 PM
by rushingcreek on 10/6/25, 6:46 PM
The problem with this approach is precisely that these apps/widgets have hard-coded input and output schema. They can work quite well when the user asks something within the widget's capabilities, but the brittleness of this approach starts showing quickly in real-world use. What if you want to use more advanced filters with Zillow? Or perhaps cross-reference with StreetEasy? If those features aren't supported by the widget's hard-coded schema, you're out of luck as a user.
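The brittleness argument can be sketched in a few lines. Everything here (the `search_listings` widget and its fields) is hypothetical, just to show what happens when a request falls outside a fixed schema:

```python
# A widget exposes a fixed input schema; any filter outside it is rejected.
WIDGET_SCHEMA = {
    "name": "search_listings",
    "parameters": {"city": str, "max_price": int, "min_bedrooms": int},
}

def call_widget(request: dict) -> str:
    unsupported = set(request) - set(WIDGET_SCHEMA["parameters"])
    if unsupported:
        # The model has nowhere to put these filters; the user is stuck.
        return f"unsupported filters: {sorted(unsupported)}"
    return "ok: query forwarded to the widget backend"

print(call_widget({"city": "NYC", "max_price": 3000}))
print(call_widget({"city": "NYC", "commute_minutes": 30}))  # outside the schema
```

No amount of model capability helps with the second call: the widget simply has no parameter for the user's intent.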
What I think is much more exciting is the ability to create completely generative UI answers on the fly. We'll have more to say on this soon from Phind (I'm the founder).
by mhl47 on 10/6/25, 6:49 PM
Personally, I hope that's not the future.
by emilsedgh on 10/6/25, 7:05 PM
They want to be the platform in which you tell what you want, and OAI does it for you. It's gonna connect to your inbox, calendar, payment methods, and you'll just ask it to do something and it will, using those apps.
This means OAI won't need ads. Just rev share.
by ed on 10/7/25, 12:54 AM
A lot of the fundamental issues with MCP are still present: MCP is pretty single-player, users must "pull" content from the service, and the model of "enabling connections" is fairly unintuitive compared to "opening an app."
Ideally apps would have a dedicated entry point, be able to push content to users, and have some persistence in the UI. And really the primary interface should be HTML, not chat.
As such I think this current iteration will turn out a lot like GPTs.
by hubraumhugo on 10/6/25, 7:45 PM
Why would I use a chat to do what could be done quicker with a simple and intuitive button/input UX (e.g. Booking or Zillow search/filter)? Chat also has really poor discoverability of what I can actually do with it.
by cefboud on 10/6/25, 6:52 PM
by fny on 10/6/25, 7:10 PM
by darajava on 10/6/25, 9:44 PM
Another commenter suggested a hotel search function:
> Find me hotels in Cape Town that have a pool by the beach. Should cost between $200 and $800 a night
ChatGPT can already do this. Similarly, their own pizza lookup example seems like it already exists, or nearly exists, with current functionality. I can't think of a single non-trivial app that could be built on this platform - and if there are any, I can't think of any that would be useful or not in immediate danger of being swallowed by advances to ChatGPT.
by bonoboTP on 10/6/25, 8:43 PM
Convenience-wise, this model is probably more viable, and things will get centralized into the AI apps. And the nested utilities will be walled gardens on steroids. Using custom software and general computing (in the manner of the now-discontinued sideloading on Android) will get even further away from the average person.
by wiradikusuma on 10/6/25, 7:16 PM
This time will be different?
by WillieCubed on 10/6/25, 7:46 PM
Custom GPTs (and Gemini gems) didn't really work because they didn't have any utility outside the chat window. They were really just bundled prompt workflows that relied on the inherent abilities of the model. But now with MCP, agent-based apps are way more useful.
I believe there's a fundamentally different shift going on here: in the endgame that OpenAI, Anthropic et al. are racing toward, there will be little need for developers for the kinds of consumer-facing apps that OpenAI appears to be targeting.
OpenAI hinted at this idea at the end of their Codex demo: the future will be built from software built on demand, tailored to each user's specific needs.
Even if one doesn't believe that AI will completely automate software development, it's not unreasonable to think that we can build deterministic tooling to wrap LLMs and provide functionality that's good enough for a wide range of consumer experiences. And when pumping out code and architecting software becomes easy to automate with little additional marginal cost, some of the only moats other companies have are user trust (e.g. knowing that Coursera's content is at least made by real humans grounded in reality), the ability to coordinate markets and transform capital (e.g. dealing with three-sided marketplaces on DoorDash), switching costs, or ability to handle regulatory burdens.
The cynic in me says that today's announcements are really just a stopgap measure to:
- Further increase the utility of ChatGPT for users, turning it into the de facto way of accessing the internet for younger users à la how Facebook was (is?) in developing countries
- Pave the way for commoditizing OpenAI's complements (traditional SaaS apps) as ChatGPT becomes more capable as a platform with first-party experiences
- Increase the value of the company to acquire more clout with enterprises and other business deals
But cynicism aside, this is pretty cool. I think there's a solid foundation here for the kind of intent-based, action-oriented computing that I think will benefit non-technical people immensely.
by Illniyar on 10/6/25, 10:44 PM
The docs mention returning resources, and the example is returning a rust file as a resource, which is nonsensical.
This seems similar to MCP UI in result but it's not clear how it works internally.
by ttoinou on 10/6/25, 6:47 PM
by LudwigNagasena on 10/6/25, 11:50 PM
I hope their GUI integration will be eventually superseded by native UI integration. I remember such well thought out concepts dating back to 2018 (https://uxdesign.cc/redesigning-siri-and-adding-multitasking...).
by spullara on 10/6/25, 6:51 PM
by MaxPock on 10/6/25, 7:15 PM
"Find me hotels in Cape Town that have a pool by the beach. Should cost between $200 and $800 a night"
by pu_pu on 10/7/25, 3:04 AM
Ideally, users should be able to describe a task, and the AI would figure out which tools to use, wire them together, and show the result as an editable workflow or inline canvas the user can tweak. Frameworks like LlamaIndex’s Workflow or LangGraph already let you define these directed graphs manually in Python where each node can do something specific, branch, or loop. But the AI should be able to generate those DAGs on the fly, since it’s just code underneath.
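The idea that a model could emit these graphs directly follows from workflows just being data plus small callables. A toy sketch (plain stdlib Python, not LlamaIndex Workflows or LangGraph; all node names invented):

```python
from graphlib import TopologicalSorter

# Each node is a small step that reads/writes a shared context.
def fetch(ctx):     ctx["data"] = [3, 1, 2]
def sort_step(ctx): ctx["data"] = sorted(ctx["data"])
def render(ctx):    ctx["out"] = f"items: {ctx['data']}"

# The DAG itself is just a dict mapping node -> dependencies.
# This is the part an LLM could plausibly generate on the fly.
graph = {"fetch": set(), "sort": {"fetch"}, "render": {"sort"}}
nodes = {"fetch": fetch, "sort": sort_step, "render": render}

ctx = {}
for name in TopologicalSorter(graph).static_order():
    nodes[name](ctx)

print(ctx["out"])  # items: [1, 2, 3]
```

Since the whole workflow is a serializable structure plus code, an editable canvas over it is mostly a rendering problem.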
And given that LLMs are already quite good at generating UI code and following a design system (see v0.app), there’s not much reason to hardcode screens at all. The model can just create and adapt them as needed.
Really hope Google doesn’t follow OpenAI down this path.
by MaxPock on 10/6/25, 7:04 PM
by whinvik on 10/6/25, 8:56 PM
by benatkin on 10/6/25, 6:49 PM
by ed on 10/7/25, 12:27 AM
Lots of folks (myself included) are reporting it doesn't: https://github.com/openai/openai-apps-sdk-examples/issues/1
by skeeter2020 on 10/6/25, 9:46 PM
by naiv on 10/6/25, 6:43 PM
by outlore on 10/6/25, 9:49 PM
by sailfast on 10/7/25, 12:48 AM
Sure, this helps app partners access their large user base and grows their functionality too - but the end game has to be lock-in with a 30% tax right?
by aryehof on 10/8/25, 5:56 AM
Can’t say I'm unhappy to see the authoritarian duopoly of the existing app stores challenged.
One question that comes to mind is how multiple providers of similar products and services will be recommended/discovered. Perhaps they won't be recommended, but just listed, as search engines currently do. Is AISO our future - AI Search Optimization?
by alganet on 10/7/25, 12:00 AM
by petecapecod on 10/7/25, 12:18 PM
While Apps do sound and look like the future, I feel like we're headed down the same road as the App Store and Google Play with this. Sooner or later OpenAI is going to use this to take a cut of the payments going through the system. Which they most likely need and deserve, but still, any time you close off part of the web it makes the web less open and free.
by irrationalfab on 10/6/25, 7:17 PM
by mercury24aug on 10/10/25, 7:52 PM
by itsnowandnever on 10/6/25, 7:34 PM
so, best of luck to OAI. we'll see how this plays out
by disiplus on 10/6/25, 6:58 PM
by chvid on 10/6/25, 6:48 PM
by mightymosquito on 10/7/25, 11:45 AM
To me it seems like a strategic shift from pure AI research and AGI snake oil to other, supposedly more tangible stuff.
In short, the AI revolution is mostly over, and we seem to be back in the realm of software.
by helloguillecl on 10/6/25, 8:37 PM
It has the potential to bridge the gap between pure conversation and the functionality of a full website.
by Dig1t on 10/6/25, 10:59 PM
by dawnerd on 10/7/25, 2:52 AM
by spullara on 10/6/25, 10:46 PM
by ttoinou on 10/6/25, 7:08 PM
by Handy-Man on 10/6/25, 6:51 PM
by doppelgunner on 10/7/25, 8:23 AM
by saberience on 10/6/25, 8:39 PM
by nextworddev on 10/6/25, 7:42 PM
by melodyogonna on 10/7/25, 8:20 AM
by todotask2 on 10/7/25, 2:21 PM
by danjl on 10/6/25, 7:06 PM
by defraudbah on 10/7/25, 7:51 AM
by nthypes on 10/6/25, 9:07 PM
by mirzap on 10/6/25, 7:58 PM
by hamonrye on 10/7/25, 2:38 AM
by compacct27 on 10/6/25, 6:42 PM
by tonysurfly on 10/7/25, 7:03 AM
by siva7 on 10/6/25, 7:27 PM
by OtherShrezzing on 10/6/25, 7:58 PM
by alvis on 10/6/25, 6:46 PM
by AlfredBarnes on 10/7/25, 12:38 PM
by klysm on 10/7/25, 1:31 AM
by jasonsb on 10/6/25, 6:45 PM
by testfrequency on 10/6/25, 6:58 PM
“CEO” Fidji Simo must really need something to do.
Maybe I’m cynical about all of this, but it feels like a whole lot of marketing spin for an MCP standard.
by throwacct on 10/6/25, 7:49 PM
I'mma call it now just for the fun of it: This will go the way of their "GPT" store.
by darkwater on 10/6/25, 7:33 PM
by markab21 on 10/6/25, 6:49 PM
MCP standardizes how LLM clients connect to external tools—defining wire formats, authentication flows, and metadata schemas. This means apps you build aren't inherently ChatGPT-specific; they're MCP servers that could work with any MCP-compatible client. The protocol is transport-agnostic and self-describing, with official Python and TypeScript SDKs already available.
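The wire shape being described is JSON-RPC 2.0 with self-describing tool metadata. A rough schematic (method names `tools/list`/`tools/call` and the `inputSchema` field follow the MCP spec; the tool itself is hypothetical, and a real server would use the official SDKs for transport and auth):

```python
import json

# One self-describing tool, advertised via tools/list.
TOOLS = [{
    "name": "get_weather",  # hypothetical example tool
    "description": "Weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

def handle(raw: str) -> str:
    """Dispatch a single JSON-RPC request to a tool result."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        result = {"tools": TOOLS}
    elif req["method"] == "tools/call":
        args = req["params"]["arguments"]
        result = {"content": [{"type": "text", "text": f"Sunny in {args['city']}"}]}
    else:
        result = {}
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

print(handle('{"jsonrpc":"2.0","id":1,"method":"tools/list"}'))
```

Because discovery and invocation are plain JSON-RPC, any MCP-capable client, not just ChatGPT, can drive the same server.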
That said, the "build our platform" criticism isn't entirely off base. While the protocol is open, practical adoption still depends heavily on ChatGPT's distribution and whether other LLM providers actually implement MCP clients. The real test will be whether this becomes a genuine cross-platform standard or just another way to contribute to OpenAI's ecosystem.
The technical primitives (tool discovery, structured content return, embedded UI resources) are solid and address real integration problems. Whether it succeeds likely depends more on ecosystem dynamics than technical merit.