# How to Automate Running SQL Queries on a Schedule with n8n


## Introduction

Data & Analytics teams often need to run SQL queries at regular intervals to generate reports, update dashboards, or gather insights for decision-making. Manually executing these queries can be tedious, error-prone, and inefficient. Automating SQL query execution on a schedule helps improve efficiency, ensures consistent data availability, and frees up analyst time for more strategic tasks.

In this article, we will walk through building an automation workflow using n8n — an open-source workflow automation tool — to run SQL queries on a schedule. We will integrate database connectivity (for example, MySQL or PostgreSQL), configure scheduled triggers, and handle query execution with error handling and notifications. This step-by-step guide is designed for startup data teams, automation engineers, and operations specialists who want to streamline their data workflows with minimal manual intervention.

## Tools and Services Used

- **n8n:** The automation platform used to build and orchestrate the workflow.
- **Database:** The examples use MySQL or PostgreSQL to run SQL queries.
- **Email (SMTP) or Slack:** Optional, for error notifications.

## Use Case

A Data & Analytics team wants to automatically run a predefined SQL query every morning at 6 AM to extract customer purchase summaries. The results will be saved or further processed downstream (e.g., stored in Google Sheets, sent as a CSV file via email).

## Step-by-Step Tutorial

### Prerequisites

- A running n8n instance (cloud or self-hosted).
- Access to a MySQL or PostgreSQL database, with credentials.
- (Optional) A Slack workspace or SMTP email configured for notifications.

### 1. Set Up a Scheduled Trigger in n8n

- Log in to your n8n instance.
- Create a new workflow.
- Add a **Cron** node (found under triggers) to specify when the workflow should run.
- Configure the schedule:
  - Set `Mode` to `Every Day`.
  - Set the time to `6:00 AM`.
- Save the node.

**Explanation:** The Cron node triggers the workflow at 6 AM every day, initiating the SQL query run.
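In recent n8n versions the Cron node has been renamed **Schedule Trigger**, but the configuration is the same. If you prefer cron syntax, the equivalent expression for this schedule is `0 6 * * *`.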

### 2. Add the Database Node

- Add a **MySQL** or **PostgreSQL** node to the workflow (depending on your database).
- Configure the credentials (hostname, port, username, password, database name).
- In the **Operation** dropdown, select `Execute Query`.
- In the **Query** field, enter the SQL query you want to run. For example:

```sql
SELECT customer_id, SUM(purchase_amount) AS total_spent
FROM purchases
WHERE purchase_date >= CURDATE() - INTERVAL 30 DAY
GROUP BY customer_id;
```
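Note that `CURDATE()` is MySQL syntax. On PostgreSQL, the equivalent date filter would be:

```sql
WHERE purchase_date >= CURRENT_DATE - INTERVAL '30 days'
```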

- Connect the Cron node output to this database node input.

### 3. Process Query Output (Optional)

The database node returns the query results as a JSON array. Depending on your use case, you can:

- Store results in a Google Sheet (add a **Google Sheets** node and map fields accordingly).
- Send results via email (attach as CSV using the **Email Send** node).
- Push to Slack (format results as a message or file).

Here is how to send results as a CSV via email:

- Add a **Code** node (called **Function** in older n8n versions) to convert the JSON results to CSV:

```javascript
// Gather all result rows from the database node
const items = $input.all();

if (items.length === 0) {
  return [{ json: { csv: "" } }];
}

// Quote each value so embedded commas or quotes don't break the CSV
const escape = (value) => `"${String(value ?? "").replace(/"/g, '""')}"`;

const headers = Object.keys(items[0].json).map(escape).join(",");
const csvRows = items.map((item) => Object.values(item.json).map(escape).join(","));

const csv = [headers, ...csvRows].join("\n");

return [{ json: { csv } }];
```

- Connect the database node output to this Code node.

- Add an **Email Send** node:
  - Configure SMTP credentials.
  - Set the recipient, subject, and body.
  - Attach the CSV content: either as a binary property with MIME type `text/csv` (see the sketch below) or inline in the message body.
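The Email Send node expects attachments as binary data, while the Code node above outputs the CSV as a plain JSON string. Here is a minimal sketch of a second Code node placed between the two; the `report.csv` filename and the `data` property name are illustrative, and you would reference that binary property name in the Email Send node's attachments option:

```javascript
// Turn the CSV string from the previous node into a binary attachment
const csv = $input.first().json.csv;

return [{
  json: {},
  binary: {
    data: {
      // n8n stores binary payloads as base64 strings
      data: Buffer.from(csv, "utf-8").toString("base64"),
      mimeType: "text/csv",
      fileName: "report.csv", // illustrative file name
    },
  },
}];
```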

### 4. Add Error Handling

To make the workflow robust:

- Create a separate workflow that starts with an **Error Trigger** node, and set it as the **Error Workflow** in this workflow's settings; n8n runs it automatically whenever an execution fails.
- Configure notifications for failures:
  - Add an Email or Slack node to the error workflow to receive the error details.
  - Connect the Error Trigger output to the notification node.
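The Error Trigger node outputs details about the failed execution that the notification node can reference with expressions. A sketch of a message body follows; the exact field names can vary between n8n versions, so inspect the trigger's output in a test run first:

```
Workflow "{{ $json.workflow.name }}" failed.
Error: {{ $json.execution.error.message }}
Execution: {{ $json.execution.url }}
```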

### 5. Test the Workflow

- Execute the workflow manually with **Execute Workflow** (no need to wait for the scheduled time).
- Verify that:
  - The query executes successfully.
  - The data arrives as expected (email received, sheet updated, etc.).
- Check the execution logs for errors.
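Remember that scheduled triggers only fire while the workflow is toggled **Active**, so activate it once testing passes. Past runs then appear in n8n’s **Executions** list, which makes it easy to confirm the 6 AM trigger actually fired.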

## Common Pitfalls and Tips

- Store database credentials securely using n8n’s credential management.
- For large query results, consider paginating queries or batching results to avoid timeouts.
- If the cron trigger doesn’t fire as expected, verify that the n8n scheduler service is running and the workflow is active.
- Use parameterized queries to avoid SQL injection when queries include dynamic inputs (see the Bonus Tip below).
- For email sending, set up SPF/DKIM records for better deliverability.

## Scaling and Adaptations

- Run different queries on dynamic schedules by adding decision nodes.
- Integrate with other cloud data warehouses such as BigQuery or Snowflake via their APIs.
- Store result outputs in cloud storage (e.g., AWS S3) for large datasets.
- Chain workflows: use this query execution as part of bigger ETL pipelines.
- Use environment variables in n8n to switch credentials between dev and prod.

## Summary

Automating scheduled SQL query execution with n8n enables Data & Analytics teams to improve efficiency and data freshness. By leveraging n8n’s Cron trigger and Database nodes, along with optional notification and data processing nodes, you can create a reliable, maintainable, and scalable workflow. Adding error handling and modular design prepares the system for production-grade use.

## Bonus Tip: Dynamic Query Parameters

Enhance your workflow by dynamically injecting parameters such as dates or user IDs into SQL queries using n8n’s expressions and input data. For example, you can define a query like:

```sql
SELECT * FROM purchases WHERE purchase_date = '{{ $json["date"] }}';
```

Then pass the required `date` value from a preceding node or environment variable, making your automation flexible and reusable.
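Keep in mind that interpolating values directly into the SQL string reintroduces the injection risk flagged in the pitfalls above, so prefer the database node's query-parameters option when values come from untrusted input. A sketch for the PostgreSQL node, assuming your n8n version exposes a **Query Parameters** option where `{{ $json["date"] }}` is supplied as the first parameter:

```sql
SELECT * FROM purchases WHERE purchase_date = $1;
```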