[HN Gopher] Building MCP servers for ChatGPT and API integrations
       ___________________________________________________________________
        
       Building MCP servers for ChatGPT and API integrations
        
       Author : kevinslin
       Score  : 33 points
       Date   : 2025-07-24 20:59 UTC (2 hours ago)
        
 (HTM) web link (platform.openai.com)
 (TXT) w3m dump (platform.openai.com)
        
       | babyshake wrote:
       | Is it safe to say that MCP has "won" vs. A2A? Or is this a
       | misreading of the situation?
        
         | miguelxpn wrote:
         | I think they have different use cases. MCP is for tool calling,
         | A2A for agents communicating between themselves.
        
         | d_watt wrote:
         | They're not directly solving the same problem. MCP is for
         | exposing tools, such as reading files. a2a is for agents to
         | talk to other agents to collaborate.
         | 
         | MCP servers can expose tools that are agents, but don't have
         | to, and usually don't.
         | 
         | That being said, I can't say I've come across an actual
         | implementation of a2a outside of press releases...
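
        To make the distinction concrete, here is a minimal sketch of the
        "MCP exposes tools" side, assuming the official MCP Python SDK's
        FastMCP helper; the server name and the file-reading tool are
        illustrative, not taken from the article:

            # read_file_server.py - illustrative only
            from pathlib import Path

            from mcp.server.fastmcp import FastMCP

            mcp = FastMCP("file-reader")

            @mcp.tool()
            def read_file(path: str) -> str:
                """Return the contents of a text file under the current
                working directory."""
                target = Path(path).resolve()
                if not target.is_relative_to(Path.cwd()):
                    raise ValueError("path must stay inside the "
                                     "working directory")
                return target.read_text()

            if __name__ == "__main__":
                # stdio transport by default, i.e. a local tool server
                mcp.run()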
        
       | cube2222 wrote:
       | The support here is really weird.
       | 
       | If I understand correctly, it requires your MCP server to have
       | exactly two tools - search and fetch.
       | 
       | So this is not really support for MCP in general, as in all the
       | available MCP servers. It's support for their own custom higher-
       | level protocol built on top of MCP.
        
         | monadoid wrote:
         | Yeah I hope they open up support to all MCP tools - this is
         | lame as-is.
        
         | varunneal wrote:
          | This guide is just an example of how to build a single MCP
          | server (e.g. a vector store). ChatGPT Connectors implement
          | MCPs in general now.
        
           | cube2222 wrote:
           | I don't believe this is the case. Do you have a link for
           | that?
           | 
            | Because, from TFA: "To work with ChatGPT Connectors or deep
           | research (in ChatGPT or via API), your MCP server must
           | implement two tools - search and fetch."
           | 
           | Also, this page is actually the only docs site about MCP they
           | have, and their help articles link to it too.
        
         | asabla wrote:
          | For ChatGPT and deep research, yes; not when using the API. I
          | guess you could just return empty results if you want to offer
          | other tools as well (can't test it now, since custom connectors
          | only support Workspace or Pro accounts at the moment).
         | 
         | Quote we're talking about: > To work with ChatGPT Connectors or
         | deep research (in ChatGPT or via API), your MCP server must
         | implement two tools - search and fetch.
         | 
         | Reference links:
         | 
         | - Using remote MCP servers with the API:
         | https://platform.openai.com/docs/guides/tools-remote-mcp
         | 
          | - Which account types can set up custom connectors in ChatGPT:
         | https://help.openai.com/en/articles/11487775-connectors-in-c...
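
        For reference, a minimal sketch of a server satisfying that
        search/fetch requirement, assuming the official MCP Python SDK's
        FastMCP helper and an in-memory corpus; the result shapes and the
        HTTP transport name are assumptions, so check the linked docs for
        the exact schema ChatGPT expects:

            # search_fetch_server.py - illustrative only
            from mcp.server.fastmcp import FastMCP

            mcp = FastMCP("example-connector")

            # Toy corpus standing in for a real search index / vector store.
            DOCS = {
                "doc-1": {"title": "Quarterly report",
                          "text": "Revenue grew 12 percent...",
                          "url": "https://example.com/doc-1"},
                "doc-2": {"title": "Onboarding guide",
                          "text": "Step one: request access...",
                          "url": "https://example.com/doc-2"},
            }

            @mcp.tool()
            def search(query: str) -> dict:
                """Return id/title/url records for documents matching
                the query."""
                q = query.lower()
                results = [
                    {"id": doc_id, "title": d["title"], "url": d["url"]}
                    for doc_id, d in DOCS.items()
                    if q in d["title"].lower() or q in d["text"].lower()
                ]
                return {"results": results}

            @mcp.tool()
            def fetch(id: str) -> dict:
                """Return the full text of a single document by id."""
                d = DOCS[id]
                return {"id": id, "title": d["title"],
                        "text": d["text"], "url": d["url"]}

            if __name__ == "__main__":
                # ChatGPT connectors need a remotely reachable server, so
                # stdio won't do; the transport name is an assumption based
                # on the SDK's HTTP options.
                mcp.run(transport="streamable-http")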
        
       | ascorbic wrote:
       | Has something changed? That seems to be the same page they've had
       | for a couple of months. It's only for deep research mode, and is
       | restricted to Pro and Enterprise.
        
       | ipsum2 wrote:
        | Title is mildly misleading; this is only available via the API,
        | not the web/mobile interface.
        
         | dang wrote:
         | Not sure which title you mean, but the submitted title was
         | "ChatGPT Launches MCP Support" and we changed it to be that of
         | the article (per
         | https://news.ycombinator.com/newsguidelines.html)
        
           | ipsum2 wrote:
           | Thanks for fixing it.
        
       | maxwellg wrote:
       | It is very nice to get MCP support in ChatGPT. OpenAI really
       | fumbled the bag with the OpenAPI-based Custom Actions (or was it
       | Custom GPTs?). The web editor experience was always incredibly
        | buggy, even months after the initial release. MCP servers let us
        | move nearly all of the tool definition bits into the server
        | codebase itself, so we can change things on the fly, version-
        | control them, feature-flag tools, etc. much more easily.
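
        A minimal sketch of the "feature-flag tools in the server
        codebase" idea, again assuming the FastMCP helper; the
        environment-variable flag and tool names are purely illustrative:

            # flagged_tools_server.py - illustrative only
            import os

            from mcp.server.fastmcp import FastMCP

            mcp = FastMCP("internal-tools")

            @mcp.tool()
            def lookup_customer(customer_id: str) -> dict:
                """Stable tool, always exposed."""
                return {"id": customer_id, "status": "active"}

            # Because tool definitions live in ordinary server code, rolling
            # out a new tool behind a flag is a normal code change rather
            # than a web-editor edit.
            if os.getenv("ENABLE_REFUNDS_TOOL") == "1":
                @mcp.tool()
                def issue_refund(order_id: str, amount: float) -> dict:
                    """Experimental tool, registered only when the flag
                    is set."""
                    return {"order_id": order_id, "refunded": amount}

            if __name__ == "__main__":
                # Transport name is an assumption based on the SDK's
                # HTTP options.
                mcp.run(transport="streamable-http")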
        
       | DiabloD3 wrote:
       | I love how they don't actually explain why I (or anyone else)
       | would ever implement their API.
       | 
        | Given how disastrous the AI 'industry' has been - between
        | misappropriating customer data, performing actions on behalf of
        | customers that lead to data and/or financial loss, and then
        | seeking protection from the law in one or more of these cases -
        | doesn't providing an MCP service essentially commit you to
        | notifying customers of a GDPR-or-similar data compromise event at
        | some point in the future, when it suddenly but inevitably betrays
        | you?
       | 
       | Like, isn't OpenAI just leading people to a footgun and then
       | kindly asking them to use it, for the betterment of OpenAI's
       | bottom line, which was significantly in the red for FY24?
        
       ___________________________________________________________________
       (page generated 2025-07-24 23:00 UTC)