How Can AI Talk to My Database Part Two: MySQL and Gemini
In my previous blog post, I experimented with building an MCP server that gives AI access to a PostgreSQL database, using the FastMCP Python framework together with APIs from Anthropic and OpenAI. However, I soon realized that those two APIs can only reach an MCP server over HTTPS at a publicly accessible URL. This led me to explore alternatives, including Gemini's API, which can talk to a local MCP server.
Creating a MySQL MCP Server with FastMCP
Gemini's API is different from the ones I previously used. It allows direct access to a local MCP server, making it an attractive option for testing. To leverage this feature, I created a MySQL MCP server using the FastMCP framework.
In this implementation, I swapped the PostgreSQL driver, psycopg2, for a MySQL one; this required only minimal changes to the code.
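As a rough sketch of that swap, the connection helper ends up looking like the snippet below, assuming the mysql-connector-python driver. The read-only account name and credentials are placeholders, not the actual values from my setup:

```python
def get_connection():
    """Open a connection to the Sakila database.

    The driver import is deferred so the module loads even before
    mysql-connector-python has been installed
    (pip install mysql-connector-python).
    """
    import mysql.connector

    # Hypothetical credentials -- adjust for your own environment.
    return mysql.connector.connect(
        host="127.0.0.1",
        user="mcp_reader",
        password="change-me",
        database="sakila",
    )
```

The rest of the tool code stays essentially the same as in the PostgreSQL version; only the driver and the connection parameters change.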
I installed a test database on my Ubuntu machine and ran the MCP server from there. To simplify the setup, I used the Sakila sample database, which supplies plenty of realistic data for the AI to query.
Step-by-Step Guide
After creating a read-only MySQL user for the MCP server, I used uv to set up a virtual environment and installed the necessary Python modules. I then ran the MCP server using the sse transport method.
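Putting those pieces together, a minimal version of the server might look like the sketch below. The tool name query_data matches the one the server exposes; the read-only guard, the mcp_reader account, and the port are my assumptions, not the exact code from my setup:

```python
def is_read_only(sql: str) -> bool:
    """Crude guard: allow only read-only statements through the tool."""
    first = sql.lstrip().split(None, 1)[0].upper() if sql.strip() else ""
    return first in {"SELECT", "SHOW", "DESCRIBE", "EXPLAIN"}


def build_server():
    # Deferred import: pip install fastmcp
    from fastmcp import FastMCP

    mcp = FastMCP("mysql-mcp")

    @mcp.tool()
    def query_data(sql: str) -> str:
        """Run a read-only SQL query against the Sakila database."""
        if not is_read_only(sql):
            return "Error: only read-only statements are allowed."
        import mysql.connector  # pip install mysql-connector-python
        conn = mysql.connector.connect(
            host="127.0.0.1",
            user="mcp_reader",      # hypothetical read-only account
            password="change-me",
            database="sakila",
        )
        try:
            cur = conn.cursor()
            cur.execute(sql)
            return "\n".join(str(row) for row in cur.fetchall())
        finally:
            conn.close()

    return mcp


if __name__ == "__main__":
    # SSE transport; FastMCP typically serves it at http://127.0.0.1:8000/sse.
    build_server().run(transport="sse")
```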
I activated the virtual environment and started a FastMCP client to verify that the MCP server was working as expected.
Once inside the Python environment, I opened a connection to the MCP server and requested its list of tools. The query_data tool exposed by the server was returned along with its description.
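That validation step can be sketched as follows; the server URL assumes FastMCP's default SSE host and port:

```python
import asyncio


async def list_tools():
    # Deferred import: pip install fastmcp
    from fastmcp import Client

    # URL assumes the server from the previous step is running locally.
    async with Client("http://127.0.0.1:8000/sse") as client:
        for tool in await client.list_tools():
            print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(list_tools())
```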
Testing Gemini Integration
To integrate Gemini with FastMCP, I created a new virtual environment and installed the necessary Python modules. I then instantiated both a FastMCP client that connected to our local MCP server and a Gemini client.
I asked Gemini a question about films featuring Morgan McDormand in the Sakila database, but it initially returned an unhelpful response, so I lowered the temperature setting to improve the outcome.
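A sketch of that integration is below, based on the google-genai SDK's ability to accept an MCP client session as a tool. The model name, server URL, and temperature value are illustrative, and the SDK is assumed to read the API key from the GEMINI_API_KEY environment variable:

```python
import asyncio


async def ask(question: str) -> str:
    # Deferred imports: pip install fastmcp google-genai
    from fastmcp import Client
    from google import genai
    from google.genai import types

    gemini = genai.Client()  # picks up GEMINI_API_KEY from the environment
    async with Client("http://127.0.0.1:8000/sse") as mcp_client:
        response = await gemini.aio.models.generate_content(
            model="gemini-2.0-flash",
            contents=question,
            config=types.GenerateContentConfig(
                temperature=0,               # lowering this improved my results
                tools=[mcp_client.session],  # hand Gemini the MCP session
            ),
        )
    return response.text


if __name__ == "__main__":
    print(asyncio.run(ask(
        "Which films in the Sakila database feature Morgan McDormand?")))
```

With the session passed in as a tool, Gemini can call query_data itself when the question requires data from the database.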
Example: Gemini's Limitations
Gemini, like other AI models, has its own built-in tools and capabilities, but on its own it had no way to reach my test database and return a meaningful answer.
The limitation is that Gemini depends on external data sources to retrieve information it was not trained on. Without a working connection to one, it may struggle to provide accurate responses.
Using Different LLM Models
In my initial experiment with Gemini, I used the gemini-2.0-flash model. However, prior to publishing this article, my colleagues suggested that I might be using an older or less capable model than necessary.
I took their advice and switched to a more advanced model, gemini-2.5-pro. This produced a significantly better response, with Gemini correctly identifying the films featuring Morgan McDormand that are suitable for general audiences.
Conclusion
In conclusion, while building an MCP server to give AI access to my database with FastMCP and various APIs was a challenging task, I found that Gemini's API can reach a local MCP server directly. Choosing the right LLM model made the difference between an unhelpful answer and a correct one.