This document traces how core database concepts such as commit logs, replication, and partitioning evolved into the modern streaming data architecture. Technologies like Kafka grew out of the commit log, originally a mechanism for high availability and scalability, and now power stream processing, data integration, and real-time analytics. The fundamental idea of storing a commit log of all changes has remained central to enabling new use cases with streaming data.
CREATE TABLE sales (
  prod_id       NUMBER(6),
  cust_id       NUMBER,
  quantity_sold NUMBER(3),
  amount_sold   NUMBER(10, 2)
);
-- Record each deleted sales row in a log table
CREATE TRIGGER sales_deletion
AFTER DELETE ON sales
REFERENCING OLD ROW AS Old
FOR EACH ROW
  INSERT INTO sales_deleted_log
  VALUES (Old.prod_id, Old.cust_id, Old.quantity_sold, Old.amount_sold);
-- Keep the per-customer running total in sync when a sale changes
CREATE TRIGGER sales_update
AFTER UPDATE ON sales
REFERENCING OLD ROW AS Old, NEW ROW AS New
FOR EACH ROW
BEGIN ATOMIC
  UPDATE customer SET
    amount = amount
      + (New.amount_sold * New.quantity_sold)
      - (Old.amount_sold * Old.quantity_sold)
  WHERE id = New.cust_id;
END;
-- Add each newly inserted sale to the per-customer running total
CREATE TRIGGER sales_creation
AFTER INSERT ON sales
REFERENCING NEW ROW AS New
FOR EACH ROW
BEGIN ATOMIC
  UPDATE customer SET
    amount = amount
      + (New.amount_sold * New.quantity_sold)
  WHERE id = New.cust_id;
END;