This is a quickstart template to easily build and deploy a custom remote MCP server to the cloud using Azure Functions with Python. You can clone, restore, and run it on your local machine with debugging, then run azd up to have it in the cloud in a couple of minutes. The MCP server is secured by design using keys and HTTPS, and offers further options such as OAuth using built-in auth and/or API Management, as well as network isolation using VNet.
If you're looking for this sample in more languages check out the .NET/C# and Node.js/TypeScript versions.
Below is the architecture diagram for the Remote MCP Server using Azure Functions:
- Python version 3.11 or higher
- Azure Functions Core Tools >= 4.0.7030
- Azure Developer CLI
- To use Visual Studio Code to run and debug locally:
An Azure Storage emulator is needed for this particular sample because we save and retrieve snippets from blob storage.
- Start Azurite:
  docker run -p 10000:10000 -p 10001:10001 -p 10002:10002 mcr.microsoft.com/azure-storage/azurite
  Note: if you use Azurite via the VS Code extension, you need to run Azurite: Start now or you will see errors.
- Change to the src folder in a new terminal window:
  cd src
- Install Python dependencies:
  pip install -r requirements.txt
  Note: it is a best practice to create a virtual environment before running pip install to avoid dependency issues/collisions, especially if you are running in Codespaces. See Python Environments in VS Code for more information.
- Start the Functions host locally:
  func start
  Note: by default this uses the webhooks route /runtime/webhooks/mcp/sse. Later, in Azure, we will use this route with a system key on client/host calls: /runtime/webhooks/mcp/sse?code=<system_key>
- Add MCP Server from the command palette and add the URL to your running Function app's SSE endpoint:
  http://0.0.0.0:7071/runtime/webhooks/mcp/sse
- List MCP Servers from the command palette and start the server.
- In Copilot chat agent mode, enter a prompt to trigger the tool, e.g., select some code and enter one of these prompts:
  Say Hello
  Save this snippet as snippet1
  Retrieve snippet1 and apply to newFile.py
- When prompted to run the tool, consent by clicking Continue.
- When you're done, press Ctrl+C in the terminal window to stop the Functions host process.
- In a new terminal window, install and run MCP Inspector:
  npx @modelcontextprotocol/inspector
- Ctrl+click to load the MCP Inspector web app from the URL displayed by the app (e.g. http://0.0.0.0:5173/#resources)
- Set the transport type to SSE
- Set the URL to your running Function app's SSE endpoint and click Connect:
  http://0.0.0.0:7071/runtime/webhooks/mcp/sse
  Note: this step will not work in Codespaces. Please move on to Deploy to Remote MCP.
- List Tools. Click on a tool and Run Tool.
Run this azd command to provision the function app, with any required Azure resources, and deploy your code:
azd up
You can opt in to using a VNet in the sample. To do so, run this before azd up:
azd env set VNET_ENABLED true
Additionally, API Management can be used for improved security and policies over your MCP Server, and App Service built-in authentication can be used to set up your favorite OAuth provider including Entra.
Your client will need a key to invoke the new hosted SSE endpoint, which will be of the form https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse. The hosted function requires a system key by default, which can be obtained from the portal or the CLI (az functionapp keys list --resource-group <resource_group> --name <function_app_name>). Obtain the system key named mcp_extension.
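As a sketch, the mcp_extension key can be pulled out of the CLI's JSON output with a few lines of Python. The exact output shape shown here (a systemKeys object keyed by name) is an assumption about what az functionapp keys list returns.

```python
import json

# Example JSON as might be returned by `az functionapp keys list`;
# the exact shape (a "systemKeys" object keyed by name) is an
# assumption, and the key value below is a placeholder.
cli_output = """
{
  "functionKeys": {"default": "fkey"},
  "masterKey": "mkey",
  "systemKeys": {"mcp_extension": "my-system-key"}
}
"""

keys = json.loads(cli_output)
mcp_key = keys["systemKeys"]["mcp_extension"]
print(mcp_key)  # pass this as ?code=... or as the x-functions-key header
```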
For MCP Inspector, you can include the key in the URL:
https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse?code=<your-mcp-extension-system-key>
For GitHub Copilot within VS Code, instead set the key as the x-functions-key header in mcp.json, and use https://<funcappname>.azurewebsites.net/runtime/webhooks/mcp/sse for the URL. The following example uses an input and will prompt you to provide the key when you start the server from VS Code. Note that mcp.json has already been included in this repo and will be picked up by VS Code. Click Start on the server to be prompted for values, including functionapp-name (in your /.azure/*/.env file) and functions-mcp-extension-system-key, which can be obtained from the CLI command above or from API Keys in the portal for the Function App.
{
  "inputs": [
    {
      "type": "promptString",
      "id": "functions-mcp-extension-system-key",
      "description": "Azure Functions MCP Extension System Key",
      "password": true
    },
    {
      "type": "promptString",
      "id": "functionapp-name",
      "description": "Azure Functions App Name"
    }
  ],
  "servers": {
    "remote-mcp-function": {
      "type": "sse",
      "url": "https://${input:functionapp-name}.azurewebsites.net/runtime/webhooks/mcp/sse",
      "headers": {
        "x-functions-key": "${input:functions-mcp-extension-system-key}"
      }
    },
    "local-mcp-function": {
      "type": "sse",
      "url": "http://0.0.0.0:7071/runtime/webhooks/mcp/sse"
    }
  }
}
You can run the azd up
command as many times as you need to both provision your Azure resources and deploy code updates to your function app.
Note
Deployed code files are always overwritten by the latest deployment package.
When you're done working with your function app and related resources, you can use this command to delete the function app and its related resources from Azure and avoid incurring any further costs:
azd down
Once your application is deployed, you can use these commands to manage and monitor your application:
# Get your function app name from the environment file
FUNCTION_APP_NAME=$(cat .azure/$(cat .azure/config.json | jq -r '.defaultEnvironment')/env.json | jq -r '.FUNCTION_APP_NAME')
echo $FUNCTION_APP_NAME
# Get resource group
RESOURCE_GROUP=$(cat .azure/$(cat .azure/config.json | jq -r '.defaultEnvironment')/env.json | jq -r '.AZURE_RESOURCE_GROUP')
echo $RESOURCE_GROUP
# View function app logs
az webapp log tail --name $FUNCTION_APP_NAME --resource-group $RESOURCE_GROUP
# Redeploy the application without provisioning new resources
azd deploy
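The two jq lookups above can also be done in Python, assuming the same .azure/config.json and per-environment env.json layout that azd writes (the helper name is illustrative):

```python
import json
from pathlib import Path

def read_azd_env(root: str = ".") -> dict:
    """Read the default azd environment's env.json as a dict.

    Assumes the azd layout used above: .azure/config.json names the
    default environment, and .azure/<env>/env.json holds its values.
    """
    azure = Path(root) / ".azure"
    default_env = json.loads((azure / "config.json").read_text())["defaultEnvironment"]
    return json.loads((azure / default_env / "env.json").read_text())

# Usage (after `azd up`):
# env = read_azd_env()
# print(env["FUNCTION_APP_NAME"], env["AZURE_RESOURCE_GROUP"])
```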
The function code for the various endpoints is defined in the Python files in the src directory. The MCP tool trigger annotations expose these functions as MCP server tools.
Here's the actual code from the function_app.py file:
@app.generic_trigger(arg_name="context", type="mcpToolTrigger", toolName="hello_mcp",
                     description="Hello world.",
                     toolProperties="[]")
def hello_mcp(context) -> str:
    """
    A simple function that returns a greeting message.

    Args:
        context: The trigger context (not used in this function).

    Returns:
        str: A greeting message.
    """
    return "Hello I am MCPTool!"
@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="get_snippet",
    description="Retrieve a snippet by name.",
    toolProperties=tool_properties_get_snippets_json
)
@app.generic_input_binding(
    arg_name="file",
    type="blob",
    connection="AzureWebJobsStorage",
    path=_SNIPPET_BLOB_PATH
)
def get_snippet(file: func.InputStream, context) -> str:
    """
    Retrieves a snippet by name from Azure Blob Storage.

    Args:
        file (func.InputStream): The input binding to read the snippet from Azure Blob Storage.
        context: The trigger context containing the input arguments.

    Returns:
        str: The content of the snippet or an error message.
    """
    snippet_content = file.read().decode("utf-8")
    logging.info(f"Retrieved snippet: {snippet_content}")
    return snippet_content
@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="save_snippet",
    description="Save a snippet with a name.",
    toolProperties=tool_properties_save_snippets_json
)
@app.generic_output_binding(
    arg_name="file",
    type="blob",
    connection="AzureWebJobsStorage",
    path=_SNIPPET_BLOB_PATH
)
def save_snippet(file: func.Out[str], context) -> str:
    content = json.loads(context)
    snippet_name_from_args = content["arguments"][_SNIPPET_NAME_PROPERTY_NAME]
    snippet_content_from_args = content["arguments"][_SNIPPET_PROPERTY_NAME]

    if not snippet_name_from_args:
        return "No snippet name provided"

    if not snippet_content_from_args:
        return "No snippet content provided"

    file.set(snippet_content_from_args)
    logging.info(f"Saved snippet: {snippet_content_from_args}")
    return f"Snippet '{snippet_name_from_args}' saved successfully"
@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="save_simple_sensor_data",
    description="Save sensor data.",
    toolProperties=tool_properties_simple_sensor_data_json,
)
@app.generic_output_binding(arg_name="file", type="blob", connection="AzureWebJobsStorage", path=_SIMPLE_SENSOR_BLOB_PATH)
def save_simple_sensor_data(file: func.Out[str], context) -> str:
    """
    Save sensor data to Azure Blob Storage.

    Args:
        file (func.Out[str]): The output binding to write the sensor data to Azure Blob Storage.
        context: The trigger context containing the input arguments.

    Returns:
        str: A success message indicating that the sensor data was saved.
    """
    content = json.loads(context)
    logging.info(f"Received content: {content}")
    sensor_data = content["arguments"]

    if not sensor_data:
        return "No sensor data provided"

    file.set(json.dumps(sensor_data))
    logging.info(f"Saved sensor data: {sensor_data}")
    return "Sensor data saved successfully"
@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="save_complex_sensor_data",
    description="Save complex IoT device data with nested sensor, event, and configuration information.",
    toolProperties=tool_properties_complex_sensor_data_json,
)
@app.generic_output_binding(arg_name="file", type="blob", connection="AzureWebJobsStorage", path=_COMPLEX_SENSOR_BLOB_PATH)
def save_complex_sensor_data(file: func.Out[str], context) -> str:
    """
    Save complex IoT device data to Azure Blob Storage.

    Args:
        file (func.Out[str]): The output binding to write the device data to Azure Blob Storage.
        context: The trigger context containing the input arguments.

    Returns:
        str: A success message indicating that the device data was saved.
    """
    content = json.loads(context)
    logging.info(f"Received device data content: {content}")
    device_data = content["arguments"]

    if not device_data:
        return "No device data provided"

    if not device_data.get("device_id"):
        return "Device ID is required"

    # Save the complete device data structure
    file.set(json.dumps(device_data))
    logging.info(f"Saved device data for device: {device_data.get('device_id')}")
    return f"Device data for {device_data.get('device_id')} saved successfully"
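The toolProperties values passed to the triggers above (e.g. tool_properties_get_snippets_json) are JSON strings describing each tool's inputs. A minimal sketch of how one might be defined; the field names (propertyName, propertyType, description) follow the convention used by the Functions MCP extension, but treat this as an assumption rather than the repo's exact code:

```python
import json

# Sketch: build the toolProperties JSON string for the get_snippet tool.
# The property field names here are an assumption about the MCP
# extension's expected schema, not code copied from the sample.
_SNIPPET_NAME_PROPERTY_NAME = "snippetname"

tool_properties_get_snippets = [
    {
        "propertyName": _SNIPPET_NAME_PROPERTY_NAME,
        "propertyType": "string",
        "description": "The name of the snippet.",
    }
]
tool_properties_get_snippets_json = json.dumps(tool_properties_get_snippets)
```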
Note that the host.json
file also includes a reference to the experimental bundle, which is required for apps using this feature:
"extensionBundle": {
  "id": "Microsoft.Azure.Functions.ExtensionBundle.Experimental",
  "version": "[4.*, 5.0.0)"
}
This MCP server exposes several functions as tools, available to clients like GitHub Copilot or MCP Inspector:
- hello_mcp: A simple greeting function to verify connectivity and tool invocation.
  Example: Say Hello
- get_snippet: Retrieves a named snippet from Azure Blob Storage.
  Input: snippet name
  Example: Retrieve snippet1 and apply to my code
- save_snippet: Saves a snippet with a specified name to Azure Blob Storage.
  Input: snippet name, snippet content
  Example: Save this code as my_function_snippet
- save_simple_sensor_data: Saves basic sensor readings.
  Fields:
  - sensor_id: Unique sensor identifier
  - metric_name: Name of the metric (e.g., temperature)
  - value: Numeric value
  - unit: Measurement unit
  - timestamp: When the reading was taken
  - IsCalibrated: Whether calibration was manual or automatic
  Example: Save sensor data for sensor THM001 with temperature reading of 24.5°C taken now
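Based on the field list above, the payload this tool writes to blob storage would look roughly like the following (all values are illustrative; the tool stores the arguments object as JSON):

```python
import json

# Illustrative payload matching the save_simple_sensor_data fields;
# the values are made up for demonstration.
sensor_reading = {
    "sensor_id": "THM001",
    "metric_name": "temperature",
    "value": 24.5,
    "unit": "C",
    "timestamp": "2024-01-01T12:00:00Z",
    "IsCalibrated": "automatic",
}
blob_content = json.dumps(sensor_reading)
```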
- save_complex_sensor_data: Stores detailed IoT device data, including:
  - Device ID and timestamp
  - Location (latitude, longitude, altitude, description)
  - Array of sensors (with metrics, status, and errors)
  - Events (with triggers, thresholds, severity)
  - Configuration (sampling/transmit intervals, firmware, network info)
  Example: Save complex device data for IoT-Gateway-123 with current location, temperature and humidity sensors, and network configuration
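A hypothetical device payload matching the nested structure described above (all values are illustrative; only device_id is validated by the tool):

```python
import json

# Illustrative nested payload for save_complex_sensor_data; the field
# values and nesting details are made up to mirror the description above.
device_data = {
    "device_id": "IoT-Gateway-123",
    "timestamp": "2024-01-01T12:00:00Z",
    "location": {"latitude": 47.6, "longitude": -122.3,
                 "altitude": 56.0, "description": "rooftop"},
    "sensors": [
        {"sensor_id": "THM001", "metrics": {"temperature": 24.5},
         "status": "ok", "errors": []},
        {"sensor_id": "HUM001", "metrics": {"humidity": 40.2},
         "status": "ok", "errors": []},
    ],
    "events": [
        {"trigger": "threshold", "threshold": 30.0, "severity": "warning"},
    ],
    "configuration": {"sampling_interval_s": 60, "transmit_interval_s": 300,
                      "firmware": "1.2.3", "network": {"type": "wifi"}},
}
blob_content = json.dumps(device_data)
```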
All data is stored in Azure Blob Storage and can be accessed or processed by other Azure services as needed.