Connect desktop-touch-mcp to the client you already use
This page keeps the setup practical. Pick your client, copy one snippet, and you should be able to get moving without reverse-engineering transport names or config shapes.
Current scope: all examples on this page assume a local Windows 11 machine. This project is not yet a multi-OS setup story.
Choose the transport first.
Stdio is the simplest path when your client can launch local MCP servers itself.
npx -y @harusame64/desktop-touch-mcp
HTTP is the right path when your client wants a server URL.
npx -y @harusame64/desktop-touch-mcp --http
The default local MCP endpoint is:
http://127.0.0.1:23847/mcp
Optional health check:
http://127.0.0.1:23847/health
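A quick way to confirm the HTTP mode is actually up is to probe the health endpoint from Node (already installed if you can run npx). This is a minimal sketch: the /health route and default port come from this page, but the response body shape is an assumption, so the script only reports reachability.

```typescript
// health-check.ts — minimal reachability probe for the local HTTP mode.
// Assumes Node 18+ (built-in fetch); run after starting the server with --http.
const HEALTH_URL = "http://127.0.0.1:23847/health";

async function checkHealth(endpoint: string): Promise<string> {
  try {
    const res = await fetch(endpoint);
    // Only the status line is checked; the body shape is not documented here.
    return res.ok ? "reachable" : `unexpected status ${res.status}`;
  } catch {
    return "not reachable"; // connection refused: server not started with --http
  }
}

checkHealth(HEALTH_URL).then((status) => console.log(status));
```

"not reachable" almost always means the server was started in stdio mode, or on a different port.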
Custom port example:
npx -y @harusame64/desktop-touch-mcp --http --port 8080
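If you reach for --port because something else might already own 23847, a small Node check can tell you before the server fails to bind. A sketch only: 23847 is the documented default, and the helper name is ours, not part of desktop-touch-mcp.

```typescript
// port-check.ts — confirm a TCP port is free on 127.0.0.1 before using --port.
import { createServer } from "node:net";

function isPortFree(port: number): Promise<boolean> {
  return new Promise((resolve) => {
    const probe = createServer();
    probe.once("error", () => resolve(false)); // e.g. EADDRINUSE
    probe.once("listening", () => probe.close(() => resolve(true)));
    probe.listen(port, "127.0.0.1");
  });
}

// Check the default port, then decide whether you need --port at all.
isPortFree(23847).then((free) => console.log(free ? "free" : "in use"));
```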
If the client can launch, use stdio. If it asks for a URL, use HTTP.
Use stdio
Claude Code, Codex, Gemini CLI, and many local clients are comfortable launching a local command.
Use HTTP
VS Code, URL-based clients, and ChatGPT Developer mode fit better with a local endpoint.
Keep both
It is reasonable for public docs to show stdio first and HTTP second for the same project.
These setup patterns are based on current official client docs.
This project has also been manually checked against Claude, GitHub Copilot CLI, VS Code / Copilot Chat, OpenAI Codex, and Gemini CLI.
Claude Code: CLI registration or shared project JSON.
Fastest path:
claude mcp add --transport stdio desktop-touch -- npx -y @harusame64/desktop-touch-mcp
HTTP path:
claude mcp add --transport http desktop-touch http://127.0.0.1:23847/mcp
Shared project .mcp.json for stdio:
{
"mcpServers": {
"desktop-touch": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@harusame64/desktop-touch-mcp"]
}
}
}
HTTP version:
{
"mcpServers": {
"desktop-touch": {
"type": "http",
"url": "http://127.0.0.1:23847/mcp"
}
}
}
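If you prefer to generate the shared file rather than hand-edit it, the stdio shape above can be written and round-trip-checked from a short Node script. This is a sketch that reuses the exact shape of the snippet above; nothing in it is a Claude-specific API.

```typescript
// write-mcp-json.ts — generate the shared project .mcp.json (stdio shape above).
import { readFileSync, writeFileSync } from "node:fs";

const config = {
  mcpServers: {
    "desktop-touch": {
      type: "stdio",
      command: "npx",
      args: ["-y", "@harusame64/desktop-touch-mcp"],
    },
  },
};

writeFileSync(".mcp.json", JSON.stringify(config, null, 2) + "\n");

// Round-trip to catch typos before a client ever reads the file.
const parsed = JSON.parse(readFileSync(".mcp.json", "utf8"));
console.log(Object.keys(parsed.mcpServers).join(","));
```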
GitHub Copilot CLI: interactive form or ~/.copilot/mcp-config.json.
Interactive path:
/mcp add
Then enter:
- Server Name: desktop-touch
- Server Type: Local or STDIO
- Command: npx -y @harusame64/desktop-touch-mcp
For HTTP, choose HTTP and use http://127.0.0.1:23847/mcp.
JSON config for stdio:
{
"mcpServers": {
"desktop-touch": {
"type": "local",
"command": "npx",
"args": ["-y", "@harusame64/desktop-touch-mcp"],
"env": {},
"tools": ["*"]
}
}
}
HTTP version:
{
"mcpServers": {
"desktop-touch": {
"type": "http",
"url": "http://127.0.0.1:23847/mcp",
"tools": ["*"]
}
}
}
VS Code / Copilot Chat: use .vscode/mcp.json.
Stdio:
{
"servers": {
"desktop-touch": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@harusame64/desktop-touch-mcp"]
}
}
}
HTTP:
{
"servers": {
"desktop-touch": {
"type": "http",
"url": "http://127.0.0.1:23847/mcp"
}
}
}
Once it is registered, open Copilot Chat, switch to Agent mode, and enable the server from the tools picker.
OpenAI Codex: use ~/.codex/config.toml.
Stdio:
[mcp_servers.desktop_touch]
command = "npx"
args = ["-y", "@harusame64/desktop-touch-mcp"]
HTTP:
[mcp_servers.desktop_touch]
url = "http://127.0.0.1:23847/mcp"
Check what Codex sees:
codex mcp list
Treat ChatGPT as an HTTP-first client.
The current public OpenAI docs describe ChatGPT Developer mode with remote MCP over streaming HTTP or SSE. For desktop-touch-mcp, that means the local HTTP mode is the right public setup story.
npx -y @harusame64/desktop-touch-mcp --http
http://127.0.0.1:23847/mcp
- Open Settings -> Apps -> Advanced settings -> Developer mode
- Enable Developer mode
- Create an app for your MCP server
- Add the local MCP endpoint
Gemini CLI: use settings.json.
Gemini CLI supports both user and workspace config:
- User: ~/.gemini/settings.json
- Workspace: .gemini/settings.json
Stdio:
{
"mcpServers": {
"desktop-touch": {
"command": "npx",
"args": ["-y", "@harusame64/desktop-touch-mcp"]
}
}
}
HTTP:
{
"mcpServers": {
"desktop-touch": {
"httpUrl": "http://127.0.0.1:23847/mcp"
}
}
}
After registration, /mcp is the quickest way to check status.
Make the first run obvious.
Show stdio first
It is the least surprising local path for most technical readers.
Show HTTP second
It removes ambiguity for URL-based clients and web clients.
Keep one snippet per client
Readers should never have to translate one client’s config shape into another by hand.