After getting back into heavier development work recently using JetBrains IDEs, I realised their AI integration had changed quite a bit since I last used it. You now run out of credits far quicker than before, which made me start looking for alternatives.
Since I already have OpenAI credits and access to some excellent high-context models, I wanted to connect JetBrains directly to my Azure OpenAI endpoint, which I’d read was supported.
That’s when I discovered JetBrains only allows connecting a custom Azure OpenAI endpoint with an Enterprise subscription, due to the secret key requirement. I don’t have Enterprise access (and definitely can’t justify the cost right now), so I built an OpenAI-compatible proxy instead.
This proxy lets JetBrains, and likely other OpenAI-compatible clients, use Azure OpenAI without modification.
Project overview
This project is a .NET application that acts as an OpenAI-compatible proxy for Azure OpenAI Service. Written in C# 13, it is fully containerised using Aspire with Docker Compose.
Key features:
– Exposes familiar endpoints such as /v1/chat/completions and /v1/models so existing OpenAI clients work without changes.
– Translates OpenAI-style requests into the correct Azure API format automatically.
– Uses Aspire for multi-service container deployments and health checks.
– Instrumented with OpenTelemetry for metrics and distributed traces.
– Includes a Scalar API Reference UI for interactive API documentation.
The result is a gateway that provides compatibility, observability and Docker deployment for teams that want to use Azure-hosted models like gpt-5-chat with OpenAI clients like JetBrains.
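To make the translation concrete, here is a minimal sketch of the kind of mapping the proxy performs. The endpoint base, api-version and deployment-per-model mapping are assumptions for illustration; the path shape and the `api-key` header are Azure's documented conventions, not details taken from the project's source.

```python
# Sketch: how an OpenAI-style request maps onto Azure OpenAI.
# AZURE_ENDPOINT and API_VERSION are assumed values for illustration.
AZURE_ENDPOINT = "https://my-resource.openai.azure.com"
API_VERSION = "2024-06-01"

def to_azure_url(openai_path: str, model: str) -> str:
    """Map an OpenAI-compatible path to its Azure OpenAI equivalent.

    OpenAI clients send /v1/chat/completions with the model in the JSON
    body; Azure expects the deployment name in the URL path plus an
    api-version query parameter. Here the model name is assumed to match
    the deployment name.
    """
    suffix = openai_path.removeprefix("/v1/")
    return (f"{AZURE_ENDPOINT}/openai/deployments/{model}"
            f"/{suffix}?api-version={API_VERSION}")

def to_azure_headers(headers: dict) -> dict:
    """Swap OpenAI's bearer auth for Azure's api-key header."""
    token = headers.get("Authorization", "").removeprefix("Bearer ").strip()
    return {"api-key": token, "Content-Type": "application/json"}

print(to_azure_url("/v1/chat/completions", "gpt-5-chat"))
# -> https://my-resource.openai.azure.com/openai/deployments/gpt-5-chat/chat/completions?api-version=2024-06-01
```

Because the client-facing surface stays `/v1/...`, tools like JetBrains need nothing more than a base URL pointing at the proxy.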
API code
You can access the code for the proxy here: OpenAI Proxy.
Simply configure the Azure section in your appsettings.json file with your endpoint details and API key, then run the application.
Configuration
Add an Azure section to your appsettings.json (or supply the equivalent environment variables) containing your endpoint details, deployment name and API key.
If you prefer to supply a full URL instead of a base endpoint and deployment name, set EndpointFull to a complete Azure URL and the proxy will use it directly.
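As a rough sketch, the section might look like the following. Apart from EndpointFull, the key names here (Endpoint, Deployment, ApiVersion, ApiKey) are illustrative assumptions, not taken from the project's source; check the repository's sample configuration for the exact names.

```json
{
  "Azure": {
    "Endpoint": "https://my-resource.openai.azure.com",
    "Deployment": "gpt-5-chat",
    "ApiVersion": "2024-06-01",
    "ApiKey": "<your-azure-api-key>",
    "EndpointFull": ""
  }
}
```

In production the ApiKey value should come from a secret mechanism rather than being committed in appsettings.json.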
Deployment notes
– The proxy is designed to run in Docker. Use your existing Aspire setup to publish Docker Compose artifacts, then run docker compose locally or in your environment.
– Ensure the Azure key is provided via a secure secret mechanism in production (environment variable, secret store or Docker secret).
– OpenTelemetry and Scalar integration require additional configuration; see their respective docs for collectors and UI wiring.
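One way to satisfy the secret-handling note above is an environment variable in the Compose file. This fragment is a hypothetical sketch — the service name and port are assumptions — but the double-underscore convention is how .NET configuration maps environment variables to nested keys (Azure__ApiKey becomes Azure:ApiKey):

```yaml
# Hypothetical docker-compose override; service name, image tag and port
# are assumptions for illustration.
services:
  openai-proxy:
    image: openai-proxy:latest
    environment:
      # Supplied from the host shell or a secret store, never committed.
      - Azure__ApiKey=${AZURE_OPENAI_KEY}
    ports:
      - "8080:8080"
```

A Docker secret mounted as a file works equally well if your orchestrator supports it.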
Closing thoughts
This proxy is a small, pragmatic workaround that saved me from having to buy an Enterprise subscription just to point JetBrains at my Azure OpenAI endpoint. It is designed to be compatible, observable and simple to run in containers.
The project is intended to be run locally in a secure environment, not exposed to the public internet. Running it publicly could lead to unintended or unauthorised resource usage, for which I accept no responsibility.