[HN Gopher] Show HN: Metorial (YC F25) - Vercel for MCP
       ___________________________________________________________________
        
       Show HN: Metorial (YC F25) - Vercel for MCP
        
        Hey HN! We're Wen and Tobias, and we're building Metorial
        (https://metorial.com), an integration platform that connects AI
        agents to external tools and data using MCP.

        The Problem: While MCP works great locally (e.g., Cursor or
        Claude Desktop), server-side deployments are painful. Running MCP
        servers means managing Docker configs, per-user OAuth flows,
        scaling concurrent sessions, and building observability from
        scratch. This infrastructure work turns simple integrations into
        weeks of setup.

        Metorial handles all of this automatically. We maintain an open
        catalog of ~600 MCP servers (GitHub, Slack, Google Drive,
        Salesforce, databases, etc.) that you can deploy in three clicks.
        You can also bring your own MCP server or fork existing ones.

        For OAuth, just provide your client ID and secret and we handle
        the entire flow, including token refresh. Each user then gets an
        isolated MCP server instance configured with their own OAuth
        credentials automatically.
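
        To make this concrete, here is the kind of OAuth 2.0 refresh
        exchange we handle behind the scenes for each connected user, so
        your agent never sees an expired token. This is a generic,
        illustrative TypeScript sketch; the token endpoint, client ID,
        and secret below are placeholders (each provider has its own):

          // Generic OAuth 2.0 refresh-token exchange (illustration only).
          // These constants are placeholders, not real Metorial values.
          const TOKEN_ENDPOINT = "https://provider.example/oauth/token";
          const CLIENT_ID = "your-oauth-client-id";
          const CLIENT_SECRET = "your-oauth-client-secret";

          async function refreshAccessToken(refreshToken: string) {
            const res = await fetch(TOKEN_ENDPOINT, {
              method: "POST",
              headers: {
                "Content-Type": "application/x-www-form-urlencoded",
              },
              body: new URLSearchParams({
                grant_type: "refresh_token",
                refresh_token: refreshToken,
                client_id: CLIENT_ID,
                client_secret: CLIENT_SECRET,
              }),
            });
            if (!res.ok) {
              throw new Error(`Token refresh failed: ${res.status}`);
            }
            const data = await res.json();
            // Store the new short-lived token for that user's server.
            return data.access_token as string;
          }

        Metorial runs this (and the initial authorization-code exchange)
        for you, per user, so you only supply the client ID and secret
        once.
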
        What makes us different is that our serverless runtime hibernates
        idle MCP servers and resumes them with sub-second cold starts
        while preserving their state and connections. Our custom MCP
        engine can manage thousands of concurrent connections, giving you
        a scalable service with per-user isolation. Other alternatives
        either run shared servers (security issues) or provision separate
        VMs per user (expensive and slow to scale).

        Our Python and TypeScript SDKs let you connect LLMs to MCP tools
        in a single function call, abstracting away the protocol
        complexity. But if you want to dig deeper, you can just use
        standard MCP and our REST API (https://metorial.com/api) to
        connect to our platform.
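
        For example, with the standard MCP TypeScript SDK you can point a
        client straight at a Metorial-hosted server. The deployment URL
        below is a placeholder (the real endpoint and auth details come
        from the dashboard and API docs):

          import { Client } from
            "@modelcontextprotocol/sdk/client/index.js";
          import { StreamableHTTPClientTransport } from
            "@modelcontextprotocol/sdk/client/streamableHttp.js";

          // Placeholder URL; use the endpoint for your own deployment.
          const transport = new StreamableHTTPClientTransport(
            new URL("https://example.metorial.example/mcp/your-deployment")
          );

          const client = new Client({ name: "my-agent", version: "1.0.0" });
          await client.connect(transport);

          // Discover the tools the deployed server exposes, then call one.
          const { tools } = await client.listTools();
          console.log(tools.map((t) => t.name));

          const result = await client.callTool({
            name: tools[0].name,
            arguments: {}, // tool-specific arguments go here
          });
          console.log(result);

        Our SDKs wrap this (plus session handling and auth) into a single
        function call.
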
        You can self-host (https://github.com/metorial/metorial) or use
        the managed version at https://metorial.com.

        So far, we see enterprise teams use Metorial as a central
        integration hub for tools like Salesforce, while startups use it
        to cut weeks of infra work when building AI agents with
        integrations.

        Demo video: https://www.youtube.com/watch?v=07StSRNmJZ8

        Our repos:
        Metorial: https://github.com/metorial/metorial
        MCP Containers: https://github.com/metorial/mcp-containers

        SDKs:
        Node/TypeScript: https://github.com/metorial/metorial-node
        Python: https://github.com/metorial/metorial-python

        We'd love to hear feedback, especially if you've dealt with
        deploying MCP at scale!
        
       Author : tobihrbr
       Score  : 42 points
       Date   : 2025-10-14 14:49 UTC (8 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | ushakov wrote:
       | congrats on the launch!
       | 
       | why do I need a specialized platform to deploy MCP instead of
       | just hosting on existing PaaS (Vercel, Railway, Render)?
       | 
       | also if you're not using VMs, how do you isolate per-user
       | servers?
        
         | tobihrbr wrote:
         | Great questions!
         | 
          | If you want to run your own remote servers (for your
          | product/company), Railway or Render work great (Vercel is a
          | bit more difficult since Lambdas get very expensive if you
          | run them over long periods of time). Metorial targets
          | developers who build their own AI agents and want to connect
          | them to integrations. Plainly, we do a lot more than running
          | MCP servers: we give you monitoring and observability, handle
          | consumer-facing OAuth, and provide super nice SDKs to
          | integrate MCP servers with your agent.
         | 
          | Regarding the second question, Metorial has three execution
          | modes, depending on what the server supports:
          | 
          | 1) Docker - the most basic one, which any MCP server should
          | support. We did some heavy optimization to get these to start
          | as fast as possible, and our hibernation system supports
          | stopping and resuming them while restoring their state.
          | 
          | 2) Remote MCP - we connect to remote MCP servers for you,
          | while still giving you the same features and ease of
          | integration you get with any Metorial server (I could go into
          | more detail on how our remote servers are better than
          | standard ones).
          | 
          | 3) Our own lambda-based runtime - not every MCP server
          | supports this execution mode, but it's what really sets us
          | apart. The Lambdas only run for short intervals, while the
          | connection is managed by our gateway. We already have about
          | 100 lambda-based servers and are working on moving more onto
          | that execution model (see the sketch below).
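          | 
          | To illustrate the idea (this is a conceptual sketch, not our
          | actual implementation), the gateway keeps the client's MCP
          | connection open and only wakes the backing worker for the
          | duration of each request, hibernating it again when traffic
          | stops:
          | 
          |   // Conceptual sketch only; the runtime hooks below are
          |   // hypothetical stand-ins for starting a Lambda/container
          |   // and restoring its snapshotted state.
          |   type WorkerHandle = {
          |     invoke(req: unknown): Promise<unknown>;
          |   };
          | 
          |   async function resumeWorker(id: string): Promise<WorkerHandle> {
          |     return { invoke: async (req) => ({ session: id, req }) };
          |   }
          |   async function hibernateWorker(id: string): Promise<void> {
          |     console.log(`hibernating ${id}`);
          |   }
          | 
          |   const idleTimers = new Map<string, NodeJS.Timeout>();
          |   const IDLE_MS = 30_000; // made-up idle threshold
          | 
          |   async function handleMcpRequest(id: string, req: unknown) {
          |     const worker = await resumeWorker(id); // wake on demand
          |     try {
          |       return await worker.invoke(req);
          |     } finally {
          |       const t = idleTimers.get(id);
          |       if (t) clearTimeout(t);
          |       idleTimers.set(
          |         id,
          |         setTimeout(() => void hibernateWorker(id), IDLE_MS)
          |       );
          |     }
          |   }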
         | 
          | There's a lot about our platform that I haven't included
          | here, like our stateful MCP proxy, our security model, our
          | scalable SOA, and how we turn OAuth into a single REST API
          | call for our users.
         | 
         | Let me know if you have any additional questions, always happy
         | to talk about MCP and software architecture.
        
           | ushakov wrote:
           | thanks for explaining, especially the runtimes part!
           | 
           | i am currently running Docker MCP Containers + MCP Gateway
           | mixed with Remote MCPs in microVMs (aka. Sandboxes).
           | 
            | seems to be the most portable setup, so you don't have to
            | worry about dealing with different executors like uvx,
            | poetry, bun, and npx, or the whole stdio/streamable HTTP
            | conversion.
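            | 
            | for context, this is the kind of per-executor wiring i
            | mean: with the standard MCP TypeScript SDK over stdio, the
            | spawn command differs for every server (npx vs uvx vs bun).
            | this example uses the standard SDK and a public reference
            | server, nothing Metorial-specific:
            | 
            |   import { Client } from
            |     "@modelcontextprotocol/sdk/client/index.js";
            |   import { StdioClientTransport } from
            |     "@modelcontextprotocol/sdk/client/stdio.js";
            | 
            |   // each integration ships its own launcher and args
            |   const transport = new StdioClientTransport({
            |     command: "npx",
            |     args: [
            |       "-y",
            |       "@modelcontextprotocol/server-filesystem",
            |       "/tmp",
            |     ],
            |   });
            | 
            |   const client = new Client({
            |     name: "local-agent",
            |     version: "1.0.0",
            |   });
            |   await client.connect(transport);
            |   console.log(await client.listTools());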
           | 
            | lambdas sound interesting, esp. if you have figured out a
            | way to make stateful work stateless, but it comes with the
            | downside that you have to maintain all the integrations
            | yourself + the environment itself might have compatibility
            | issues. i've seen someone also using cloudflare dynamic
            | workers for a similar use-case (disco.dev), but they're
            | maintaining all the integrations by hand (or with Claude
            | Code rather). a more extreme version of this would be
            | writing custom integrations specific to the user by
            | following a very strict prompt.
           | 
           | anyways, i'll look into Metorial as am curious about how the
           | portable runtimes work.
           | 
            | i am also maintaining a list of MCP gateways, just added
            | you there as well:
            | https://github.com/e2b-dev/awesome-mcp-gateways
           | 
           | thanks for building this, looking forward to checking it out!
        
             | tobihrbr wrote:
             | Thanks for sharing and adding us to your list. The point
             | about the lambdas is fair, though we do support other
             | execution modes to combat this. Please let me know if you
             | have any feedback or encounter hiccups :)
        
       | langitbiru wrote:
       | I wrote a book about MCP: https://leanpub.com/practical-mcp
       | 
       | I'm considering adding more chapters to the book: security, easy
       | deployment, etc. So, I may look into your solution. I believe
       | there are other players also, like Klavis AI, FastMCP and some
       | MCP startups that I cannot remember.
       | 
       | Congratz!
        
         | tobihrbr wrote:
         | Thanks so much! I'll definitely check out your book. Always
         | happy to talk MCP :)
        
       | samgutentag wrote:
       | mitochondria is the powerhouse of the cell
        
       | cgijoe wrote:
       | Oh my lord, your timing is perfect. I need this so badly right
       | now. Congrats on the launch, and wow, thank you for making your
       | MCP containers available separately!
        
         | tobihrbr wrote:
         | Haha, good thing we launched today. Thank you so much for the
         | encouraging words!
        
       | solumos wrote:
        | The distinction between "Vercel for MCP [integrations]" and
        | "Vercel for MCP [servers]" is meaningful -- maybe "Zapier for
        | MCP" is a more appropriate "X for Y"?
       | 
       | Congrats on the launch!
        
         | tobihrbr wrote:
          | That's a really interesting point. We've actually been
          | discussing this quite a bit. We felt like putting the
          | emphasis on the "dev tool" aspect (like Vercel) made more
          | sense, but the way you put it, we might want to reconsider
          | that. Thanks for your interest!
        
       | fsto wrote:
        | We've just begun implementing Composio. Would love to
        | reconsider if you could help clarify the main differences. From
        | my perspective it looks like you have more robustness features
        | for me as a developer and you're fully open source (not just
        | the client), whereas Composio has more integrations. But I'd
        | love your input to clarify. Congrats on the launch!
        
       | rancar2 wrote:
        | I like the license (FSL) chosen for the project, but it may
        | need some explaining for others. Can you comment on the
        | decision to select the Functional Source License (Version 1.1,
        | ALv2 Future License), and on the Metorial team's intent with
        | it, including any restrictions on potential commercial use of
        | the platform (i.e. free-to-paid without notice)?
       | 
       | For those who aren't aware of what FSL (https://fsl.software/)
       | is: "The Functional Source License (FSL) is a Fair Source license
       | that converts to Apache 2.0 or MIT after two years. It is
       | designed for SaaS companies that value both user freedom and
       | developer sustainability. FSL provides everything a developer
       | needs to use and learn from your software without harmful free-
       | riding."
        
         | tobihrbr wrote:
         | Thanks for pointing that out. Ultimately, we wanted to strike a
         | balance between being fair and open to the community, welcoming
         | contributions, and ensuring that people can self-host without
         | having to worry about licensing issues, while also ensuring
         | that Metorial, as a company, can exist and work on OSS
         | sustainably. This isn't easy and I don't think there's a right
          | answer. To us, FSL strikes a pretty good balance, allowing
          | the community to use and participate while ensuring that
          | Metorial makes sense as a business as well.
        
       ___________________________________________________________________
       (page generated 2025-10-14 23:00 UTC)