@@ -0,0 +1,11 @@
CREATE TABLE public."dashboardMetricsTotalSnapshot" (
id TEXT PRIMARY KEY DEFAULT 'snapshot',

"activitiesTotal" BIGINT,
"activitiesLast30Days" BIGINT,
"organizationsTotal" BIGINT,
"organizationsLast30Days" BIGINT,
"membersTotal" BIGINT,
"membersLast30Days" BIGINT,
"updatedAt" TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
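
The fixed `id` default of `'snapshot'` suggests this is a single-row table that gets overwritten on each refresh. A hypothetical refresh query illustrating that pattern (assumption: some job or sink upserts the one row keyed by `'snapshot'`; the zero values are placeholders):

```sql
-- Hypothetical single-row refresh; real values would come from aggregate queries.
INSERT INTO public."dashboardMetricsTotalSnapshot"
    (id, "activitiesTotal", "activitiesLast30Days",
     "organizationsTotal", "organizationsLast30Days",
     "membersTotal", "membersLast30Days", "updatedAt")
VALUES ('snapshot', 0, 0, 0, 0, 0, 0, NOW())
ON CONFLICT (id) DO UPDATE SET
    "activitiesTotal"         = EXCLUDED."activitiesTotal",
    "activitiesLast30Days"    = EXCLUDED."activitiesLast30Days",
    "organizationsTotal"      = EXCLUDED."organizationsTotal",
    "organizationsLast30Days" = EXCLUDED."organizationsLast30Days",
    "membersTotal"            = EXCLUDED."membersTotal",
    "membersLast30Days"       = EXCLUDED."membersLast30Days",
    "updatedAt"               = NOW();
```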
@@ -0,0 +1,10 @@
CREATE TABLE public."dashboardMetricsPerSegmentSnapshot" (
"segmentId" TEXT PRIMARY KEY,
Bug: Inconsistent column type for segmentId (TEXT vs uuid)

The segmentId column is defined as TEXT, but throughout the codebase, segment IDs are consistently stored as uuid type. The segments table defines its primary key as uuid, and all other tables referencing segments (like memberSegments, organizationSegments, activities, etc.) use uuid for segmentId. This type mismatch could cause issues when joining with the segments table or when the JDBC sink attempts to write UUID data into a TEXT column, potentially requiring explicit casting or causing data inconsistencies.


"activitiesTotal" BIGINT,
"activitiesLast30Days" BIGINT,
"organizationsTotal" BIGINT,
"organizationsLast30Days" BIGINT,
"membersTotal" BIGINT,
"membersLast30Days" BIGINT,
"updatedAt" TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
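
If the reviewer's observation holds, a hedged sketch of the corrected DDL (assumptions: `segments.id` is `uuid` as the comment states, and the table name `public.segments` matches the codebase; the foreign key is optional and shown only for illustration):

```sql
-- Sketch of the type fix suggested by the review comment above.
CREATE TABLE public."dashboardMetricsPerSegmentSnapshot" (
    "segmentId" UUID PRIMARY KEY
        REFERENCES public.segments (id),  -- FK is an assumption; drop if the schema avoids them
    "activitiesTotal" BIGINT,
    "activitiesLast30Days" BIGINT,
    "organizationsTotal" BIGINT,
    "organizationsLast30Days" BIGINT,
    "membersTotal" BIGINT,
    "membersLast30Days" BIGINT,
    "updatedAt" TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
```

With `UUID` here, joins against `segments` and writes from the JDBC sink need no explicit casting.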
1 change: 1 addition & 0 deletions scripts/scaffold/kafka-connect/Dockerfile
@@ -6,6 +6,7 @@ RUN yum install -y jq findutils unzip

RUN confluent-hub install snowflakeinc/snowflake-kafka-connector:2.5.0 --no-prompt
COPY tmp/kafka-connect-http/ /usr/share/confluent-hub-components/kafka-connect-http/
RUN confluent-hub install confluentinc/kafka-connect-jdbc:10.8.4 --no-prompt


VOLUME /storage
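
The added `confluent-hub install` line makes the JDBC sink connector available inside the Kafka Connect image. A hedged sketch of a sink configuration that could write to the snapshot tables (assumptions: the topic name, connection URL, and credentials placeholders are hypothetical; property names are standard `kafka-connect-jdbc` sink settings):

```json
{
  "name": "dashboard-metrics-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "dashboard-metrics-total-snapshot",
    "connection.url": "jdbc:postgresql://db-host:5432/crowd",
    "connection.user": "<user>",
    "connection.password": "<password>",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "table.name.format": "dashboardMetricsTotalSnapshot",
    "auto.create": "false"
  }
}
```

`insert.mode=upsert` with a primary key matches the single-row snapshot design: each new record replaces the existing row rather than appending.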