In this article, I’ll explain step by step how to use Flowise to build an AI agent that can answer questions based on a CSV dataset.
Flowise is a handy tool for those who want to build AI agents without getting tangled in complex coding. Its low-code approach lets us focus on the task at hand rather than the technical details.
Recently, I've seen a surge in tools designed to build AI agents. These tools have become more accessible and user-friendly, allowing everyone to create intelligent systems without needing extensive technical skills. If you're interested in exploring more about these tools, check out these tutorials as well:
- n8n: A Guide With Practical Examples
- Langflow: A Guide With Demo Project
- Dify AI: A Guide With Demo Project
What Is Flowise?
Flowise is a tool designed to help us create AI agents through a simple drag-and-drop interface. It operates by allowing us to connect different blocks, which represent various functions, to build a customized workflow. Each block can be configured to perform a specific action.
For instance, there are blocks known as LLM blocks that are used to send a message to a large language model. There are also function blocks that give us the power to execute custom JavaScript functions, enabling even more tailored operations.
By linking these blocks together, information flows from one block to the next, processing and transforming the data step by step. This modular approach makes it easy to construct AI agents without deep programming knowledge, as we can focus on designing the workflow and configuring the necessary actions.
Building a Data Analyst AI Agent With Flowise
Let's break down the process of building a data analyst AI agent using Flowise. This guide will cover signing up for Flowise, building a custom workflow, and utilizing various blocks to create an interactive agent that can analyze datasets.
Step 1: Create an account on Flowise
Begin by visiting this Flowise sign-up page and creating an account. Fill in the required details and follow the instructions sent to your email to verify and activate your account.
Step 2: Database setup
Now that we have our Flowise account ready, it's time to set up the database we will be using. We need a database to store the data we want to analyze.
We'll use SingleStore, a popular and user-friendly SQL database provider, to manage our data. Let's go through these steps:
- Start by heading to the SingleStore website.
- Sign up for a new account on SingleStore.
- With an active SingleStore account, the next step is to upload the dataset we will be working with. For this example, we're using a dataset focused on student social media habits, which can be found on Kaggle. Make sure to download the dataset file from Kaggle and upload it to SingleStore.
- Once the dataset is uploaded, SingleStore will automatically generate a database table from it. This table will form the basis of our data analysis, making the information ready for our Flowise AI agent to interact with.
Step 3: Navigate to the chatflows section
Once logged in, go to the dashboard and find the "Chatflows" section. Click on "Add New" to start creating a new workflow.
Step 4: Create a custom function to load the table information
This step involves writing a custom function to retrieve information about the table, such as its description and column names. This data is crucial for our AI agent to understand the structure of the dataset it will be querying.
- Create a custom function block.
- Paste the following code into the function block editor:
const mysql = require('mysql2/promise');

const tableName = $tableName;
const tableDescription = $tableDescription;
const connectionUrl = "<PASTE_YOUR_CONNECTION_STRING_HERE>";

async function main() {
  try {
    const pool = mysql.createPool(connectionUrl);
    const q = `DESCRIBE ${tableName};`;
    const [rows] = await pool.query(q);
    const fields = rows
      .map((row) => `${row.Field} of type ${row.Type}`)
      .join("\n");
    const tableInformation = `Table name: ${tableName}\nTable description:\n${tableDescription.trim()}\nColumns:\n${fields.trim()}`;
    return tableInformation;
  } catch (error) {
    return String(error);
  }
}

return main();
At the top of the function, we need to set the connectionUrl value. This can be found on SingleStore, in the Deployments tab. The connection string looks like this:

françois-19ecc:<françois-19ecc Password>@svc-3482219c-a389-4079-b18b-d50662524e8a-shared-dml.aws-virginia-6.svc.SingleStore.com:3333/db_franois_88ec0

The first part, françois-19ecc in my case, is the username. Right after, there's a placeholder for the password, <françois-19ecc Password>, which we need to replace.
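One thing to watch out for: mysql2 parses the connection string as a URL, so special characters in the password (such as / or @) need to be URL-encoded. Here's a minimal sketch of assembling the URL — the password, host, and database below are placeholders, not my actual deployment:

```javascript
// Build a mysql2-compatible connection URL from its parts.
// All values here are hypothetical placeholders — substitute
// your own SingleStore username, password, host, port, and database.
const user = "françois-19ecc";
const password = "s3cret/p@ss"; // hypothetical password
const host = "svc-example.aws-virginia-6.svc.singlestore.com";
const port = 3333;
const database = "db_example";

// encodeURIComponent escapes characters like "/" and "@"
// that would otherwise break URL parsing.
const connectionUrl =
  `mysql://${encodeURIComponent(user)}:${encodeURIComponent(password)}` +
  `@${host}:${port}/${database}`;

console.log(connectionUrl);
```

Once the real values are in place, this string can be pasted directly into the connectionUrl constant in the custom function.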
This function connects to the SingleStore database and gets the column information for the table. It has two inputs:
- tableName: The name of the table.
- tableDescription: The description of the table.

These must be configured by clicking the “Input Variables” button on the custom code node. Variables can be accessed in the code by prefixing their name with a $.

In my case, the table was named dataset when created on SingleStore. You should use whatever name was set when you uploaded the CSV file.
Step 5: Create a prompt template node
This node is where we craft the questions that guide our AI agent. Connect this node to your custom function.
- Create a prompt template node.
- Insert the following prompt:
Based on the SQL table information and the user's questions, return a SQL query that answers that question.
TABLE INFORMATION: {tableInformation}
QUESTION: {question}
The prompt has two placeholders: {tableInformation} and {question}. The question is the user-submitted prompt, while the tableInformation is the output of the function we defined in the previous step.
These placeholder values must be configured by clicking the “Format Prompt Values” button on the prompt template node.
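Flowise performs this substitution internally when the flow runs; the sketch below is only illustrative, showing what the final prompt looks like once the placeholders are filled in:

```javascript
// A minimal stand-in for prompt-template substitution.
// Flowise does this internally — this is only for illustration.
function fillTemplate(template, values) {
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    key in values ? values[key] : match
  );
}

const template =
  "Based on the SQL table information and the user's questions, " +
  "return a SQL query that answers that question.\n" +
  "TABLE INFORMATION: {tableInformation}\n" +
  "QUESTION: {question}";

// Hypothetical values standing in for the upstream node outputs.
const prompt = fillTemplate(template, {
  tableInformation: "Table name: dataset\nColumns:\nplatform of type text",
  question: "How many responses did the survey get?",
});

console.log(prompt);
```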
Step 6: Link to an LLM chain block
Now, we connect the prompt template node to an LLM chain block, which will interpret the prompt and generate a SQL query from it. These are the steps we need to follow:
- Add an LLM chain block.
- Include a model block configured to use OpenAI, and connect it to the LLM chain block on the “Language Model” input.
- To configure the OpenAI model, you’ll need to set up an OpenAI API key. If you don’t have one, you can create one here.
- Connect the prompt template block to the “Prompt” input.
Step 7: Use a set variable block
To store the generated SQL query for further use, create a set variable block and link it to the output from the LLM chain block.
Step 8: Execute the SQL query
Send the SQL query to a new custom code block to execute it on the database.
- Create another custom code block.
- Enter the following code:
const mysql = require('mysql2/promise');

const connectionUrl = "<PASTE_THE_SAME_URL_USED_BEFORE>";

function formatQuery() {
  // This function is used to clean the query provided by ChatGPT
  // by removing markdown code fences
  const q = $query;
  let lines = q.trim().split(/\r\n|\r|\n/);
  if (lines[0].startsWith("```")) {
    lines = lines.slice(1, lines.length - 1);
  }
  return lines.join("\n").trim();
}

async function main() {
  const q = formatQuery();
  try {
    const pool = mysql.createPool(connectionUrl);
    const [rows] = await pool.query(q);
    return rows;
  } catch (error) {
    return `Query: ${q}\nError: ${String(error)}`;
  }
}

return main();
Remember to set the connectionUrl value. It's the same one we used before.

This function has one input, which is the query we want to execute, and is accessed in the code using $query.
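LLMs often wrap generated SQL in markdown fences, so the cleanup step matters. Here's a standalone version of the helper run against a typical fenced reply (the sample query is made up for illustration):

````javascript
// Standalone version of the cleanup helper: strips a leading
// markdown fence line (```sql) and the closing ``` line.
function formatQuery(q) {
  let lines = q.trim().split(/\r\n|\r|\n/);
  if (lines[0].startsWith("```")) {
    lines = lines.slice(1, lines.length - 1);
  }
  return lines.join("\n").trim();
}

// A typical LLM reply that wraps the query in a fenced block.
const llmReply = "```sql\nSELECT COUNT(*) FROM dataset;\n```";

console.log(formatQuery(llmReply)); // SELECT COUNT(*) FROM dataset;
console.log(formatQuery("SELECT 1;")); // unfenced input passes through unchanged
````

Without this cleanup, the fence characters would be sent to the database as part of the query and cause a syntax error.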
Step 9: Create another prompt template
This template is used to format the answer that the AI agent will present to the user based on the query results.
- Create a new prompt template.
- Use this prompt:
Based on the question and the query result, provide an answer to the user's question. Always show the query to the user.
QUERY: {query}
QUERY RESULT: {queryResult}
QUESTION: {question}
This prompt has three placeholders:
- query: The SQL query that was generated by the LLM, which is obtained by connecting the variable node.
- queryResult: The result of the query, which is calculated by the previous custom function node.
- question: The initial user prompt.
Step 10: Final LLM chain block
Finally, connect this prompt template to a new LLM chain block, which will use the prompt to formulate the response. You can connect this to the same OpenAI model block used earlier.
Here’s the final agent flow:
With these steps completed, our AI agent is ready. It can accept questions, generate SQL queries to retrieve relevant data, and deliver clear, concise answers.
Chatting With the Data Analyst Agent
The agent is now ready to use. To chat with it, click the purple chat button in the top right corner:
Let’s test it by asking how many responses the survey got:
Note that the agent includes the query in the reply because of the way the last prompt was designed. Let's ask for some statistics about the data:
Here’s another example:
My Thoughts on Flowise
After experimenting with several AI agent-building tools, my experience with Flowise proved to be somewhat frustrating. One recurring issue I faced was accidentally deleting some variable configurations, which led to the agent behaving erratically without a clear explanation as to why. This often made troubleshooting a tedious task.
Moreover, Flowise lacks a clear starting point for its agent flows, which makes it challenging to follow and understand how the agent functions. This can be particularly daunting for those new to building AI systems. Additionally, Flowise falls short in terms of easily viewing intermediate results or testing nodes in isolation. This makes it difficult to pinpoint problems or refine the workflow effectively.
I also found it hard to see the node configurations at a glance. For example, the custom code nodes we used define input variables, but that isn't visible on the node itself; we need to click into that section of the node to see them. This makes it hard to understand the flow just by looking at it.
Conclusion
While Flowise offers a promising low-code interface for creating AI agents, it has a few usability issues that make it harder to use than similar tools, especially for those looking for a smoother and more intuitive experience. Despite these challenges, Flowise's concept holds potential, and with some improvements, it could become a great tool for building AI solutions.