Build an Automated Data Dashboard with Make, Snowflake, Streamlit Integration: Gain Real-time Insights and Optimize Data-Driven Investment Decisions
Instead of struggling with manual data updates, connect Make, Snowflake, and Streamlit to build an automated data dashboard. This dashboard provides real-time insights and optimizes data-driven investment decisions, saving you time and money. No more relying on outdated information!
1. The Challenge / Context
Many companies and individual investors rely on Excel spreadsheets or inefficient data pipelines to make investment decisions. This leads to delays in data updates, increased potential for errors, and a lack of real-time insights. If data is not up-to-date, it's difficult to make the right decisions, and you can fall behind the competition. This is especially critical in a rapidly changing market environment. Additionally, the significant time required for data analysis makes it difficult to focus on core tasks.
2. Deep Dive: Make, Snowflake, Streamlit
The core of this solution is the effective integration of three powerful tools: Make, Snowflake, and Streamlit. Let's explore the role and functionality of each tool.
- Make (formerly Integromat): A low-code automation platform that allows you to build workflows by connecting various applications and services. In this project, it detects data changes in Snowflake and automatically updates the Streamlit dashboard.
- Snowflake: A cloud-based data warehouse optimized for storing, managing, and analyzing large volumes of data. Here, it stores investment-related data and provides data to Streamlit by fetching it from Make. Snowflake's excellent scalability and performance enable real-time data processing and analysis.
- Streamlit: An open-source, Python-based framework that helps data scientists and machine learning engineers easily visualize data and create interactive web applications. It visually represents data fetched from Snowflake, allowing investors to gain insights intuitively.
3. Step-by-Step Guide / Implementation
Now, let's look at the detailed process of building an automated data dashboard by integrating Make, Snowflake, and Streamlit.
Step 1: Set up Snowflake Database and Table
Create a database and table in Snowflake to store investment-related data. The table schema may vary depending on the type of investment data, but generally includes the following columns:
- Symbol (Stock Symbol)
- Date (Date)
- Open (Opening Price)
- High (High Price)
- Low (Low Price)
- Close (Closing Price)
- Volume (Trading Volume)
-- Create the database
CREATE DATABASE investment_data;

-- Switch to the new database
USE DATABASE investment_data;

-- Create the table of daily stock prices
CREATE TABLE stock_prices (
    symbol VARCHAR(10) NOT NULL,
    date DATE NOT NULL,
    open DECIMAL(10, 2),
    high DECIMAL(10, 2),
    low DECIMAL(10, 2),
    close DECIMAL(10, 2),
    volume BIGINT,
    PRIMARY KEY (symbol, date)
);
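To verify the schema before wiring anything else up, you can load a couple of sample rows (the prices below are illustrative values only, not real market data):

```sql
-- Insert sample rows to sanity-check the schema (illustrative values)
INSERT INTO stock_prices (symbol, date, open, high, low, close, volume) VALUES
    ('AAPL', '2024-01-02', 185.64, 186.95, 183.82, 185.14, 52000000),
    ('MSFT', '2024-01-02', 373.86, 375.90, 370.16, 372.52, 21000000);

SELECT * FROM stock_prices ORDER BY symbol, date;
```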
Step 2: Develop Streamlit Dashboard
Develop a dashboard using Streamlit to visualize data from Snowflake. Install the necessary libraries and write code to connect to Snowflake and fetch data.
import streamlit as st
import snowflake.connector
import pandas as pd

# Set up the Snowflake connection
# (cached with st.cache_resource: connections are resources, not serializable data)
@st.cache_resource
def get_connection():
    return snowflake.connector.connect(
        user=st.secrets["snowflake"]["user"],
        password=st.secrets["snowflake"]["password"],
        account=st.secrets["snowflake"]["account"],
        warehouse=st.secrets["snowflake"]["warehouse"],
        database="investment_data",
        schema="PUBLIC"
    )

# Fetch data (cached; ttl expires the cache so fresh rows eventually appear)
@st.cache_data(ttl=600)
def load_data(symbol):
    conn = get_connection()
    # Bind the symbol as a parameter instead of formatting it into the SQL string
    query = "SELECT * FROM stock_prices WHERE symbol = %s ORDER BY date"
    df = pd.read_sql(query, conn, params=(symbol,))
    # Snowflake returns unquoted identifiers in upper case
    df.columns = df.columns.str.lower()
    return df

st.title('Stock Price Dashboard')

# Select a symbol
symbol = st.selectbox('Select a stock symbol', ['AAPL', 'MSFT', 'GOOG'])

# Load the data
data = load_data(symbol)

# Visualize the data (e.g., a line chart of closing prices)
st.line_chart(data.set_index('date')['close'])
Important: Use `st.secrets` to securely manage your Snowflake connection information. You need to add Snowflake credentials to Streamlit secrets.
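For reference, the secrets file Streamlit reads (`.streamlit/secrets.toml` locally, or the Secrets panel in the app settings on Streamlit Community Cloud) would look roughly like this, with placeholder values:

```toml
# .streamlit/secrets.toml -- placeholder values; never commit this file
[snowflake]
user = "YOUR_USER"
password = "YOUR_PASSWORD"
account = "YOUR_ACCOUNT_IDENTIFIER"
warehouse = "YOUR_WAREHOUSE"
```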
Step 3: Build Make Scenario
Build a scenario in Make that detects Snowflake data changes and updates the Streamlit dashboard. Follow these steps:
- Log in to Make and create a new scenario.
- Add a Snowflake module and select the "Watch events" trigger.
- Set up the Snowflake connection and specify the table to monitor (e.g., `stock_prices`).
- Add an HTTP module ("Make a request") as the next step, pointing it at the endpoint that refreshes your Streamlit app. (In Make, the "Custom webhook" module *receives* requests and acts as a trigger; to call out to another service you use the HTTP module.) For a Streamlit Community Cloud app, the simplest refresh is a redeploy, which clears the cache; a more involved alternative is to refresh the app via an API call.
- If necessary, add filters to trigger the Webhook only under specific conditions (e.g., when data for a particular stock changes).
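Before wiring the outgoing HTTP call into the scenario, it can help to see exactly what Make sends by pointing it at a throwaway endpoint. The sketch below is a hypothetical local test harness built only on the standard library (in practice Make needs a publicly reachable URL, e.g. via a tunnel); it handles one request and records the JSON body:

```python
# Minimal local endpoint to inspect what an automation platform sends.
# Hypothetical test harness only -- not part of the production pipeline.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

received = []  # collected request bodies, for inspection

class RefreshHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        received.append(json.loads(body) if body else {})
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        # Silence the default per-request console logging
        pass

def serve_once(port=8765):
    # Bind locally, handle exactly one request, then shut down.
    with HTTPServer(("127.0.0.1", port), RefreshHandler) as server:
        server.handle_request()
```

Run `serve_once()` and send the Make HTTP module (or a quick `curl -X POST`) at the resulting URL to confirm the payload and filters behave as expected.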
Key: The simplest way to make a Streamlit dashboard pick up new data is to redeploy it; on Streamlit Community Cloud, press the "Reboot app" button in the app's menu. A lighter-weight alternative is to pass a `ttl` argument to `st.cache_data` so cached query results expire on their own. Fully automating a reboot from Make is more fragile: Streamlit Community Cloud does not document a stable public API for restarting apps, so if you script this, treat it as experimental and check the current Streamlit documentation before relying on it.
Step 4: Set up Webhook URL
When Snowflake data changes, the Make scenario should call an endpoint that refreshes the Streamlit dashboard. As mentioned earlier, the simplest method is to redeploy the app on Streamlit Community Cloud; a fully programmatic refresh is more complex and not officially documented.
# Conceptual outline -- not runnable code.
# Option A: Redeploy on Streamlit Community Cloud
# 1. When the Make scenario fires, trigger a redeploy of the app
#    (e.g., by pushing a commit, since Community Cloud redeploys on push).
# 2. The redeploy clears the cache, so the dashboard shows the latest data.
# Option B: Programmatic refresh (experimental)
# 1. When the Make scenario fires, call an endpoint that reboots the app.
# 2. No stable public API is documented for this, so consult the current
#    Streamlit documentation before building on it.
4. Real-world Use Case / Example
As an individual investor, I used to spend a lot of time analyzing the stock market. Every day, I collected stock price data using Excel spreadsheets, created charts, and made investment decisions. However, this process was very cumbersome and time-consuming. After building an automated data dashboard integrated with Make, Snowflake, and Streamlit, I was able to save over 5 hours per week on data updates. Furthermore, I could make better investment decisions based on real-time insights. In particular, the ability to respond quickly during periods of high market volatility was extremely useful.
5. Pros & Cons / Critical Analysis
- Pros:
- Automated data updates save time and effort
- Real-time insights optimize data-driven investment decisions
- Reduced potential for errors
- Excellent scalability and flexibility
- Easy to develop with low-code/no-code solutions
- Cons:
- Requires a basic understanding of Make, Snowflake, and Streamlit
- Initial setup and configuration can be time-consuming
- Potential instability when using Streamlit API (Experimental)
- Ongoing Snowflake usage costs
6. FAQ
- Q: How much does Snowflake cost?
A: Snowflake costs vary depending on data storage, computing resource usage, and query execution frequency. It's best to check the pricing information on the Snowflake website and estimate costs based on your expected usage.
- Q: Can I use other automation platforms instead of Make?
A: Yes, you can use other automation platforms like Zapier or Pipedream. However, Make offers powerful features for connecting various applications and services, along with a reasonable pricing policy, which is why it was deemed most suitable for this solution.
- Q: Can I make the Streamlit dashboard more complex?
A: Of course. Streamlit offers various chart libraries, widgets, and themes, allowing you to enrich your data visualizations and user interface. Refer to the Streamlit documentation to explore various features.
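As a small illustration of enriching the dashboard, a simple-moving-average overlay can be computed with the standard library alone and plotted next to the close column (the `sma` helper below is an illustrative name, not part of Streamlit or pandas):

```python
# Simple moving average over a sequence of closing prices.
# Entries before the window fills are None, so chart libraries leave a gap.
def sma(values, window):
    out = []
    running = 0.0
    for i, v in enumerate(values):
        running += v
        if i >= window:
            # Drop the value that just left the window
            running -= values[i - window]
        out.append(running / window if i >= window - 1 else None)
    return out

# In the dashboard, this could become an extra column on the DataFrame, e.g.:
#   data["sma_20"] = sma(list(data["close"]), 20)
#   st.line_chart(data.set_index("date")[["close", "sma_20"]])
```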
7. Conclusion
An automated data dashboard integrated with Make, Snowflake, and Streamlit is a powerful tool that can optimize investment decisions and save time. Follow the steps presented in this article to build your own dashboard, gain real-time insights, and make better investment decisions. Try the code now and experience the power of data-driven investing! Check the official Streamlit documentation for more information.


