In reality, there are two kinds of ETL:
1. The kind that focuses on transformation, cleansing, and quality, because a business process requires standardization: it has to count things, do math, or disambiguate.
2. The kind that must be done merely to allow an RDBMS to function, because of its dependency on harmonized, normalized, and deorthogonalized data.
We eliminate #2 completely, and enable a more iterative approach to #1.
How does ML deliver this integration? Ingest data as is:
- No upfront data modeling required
- ETL tool not required
- Structured and unstructured data, including scalar, text, geospatial, binary, and semantic triples
- Data and metadata
- Schemas accepted, but not required
- Data formats (XML, JSON, binary, etc.) with efficient tokenized storage

Methods:
- Content Pump: high-speed data loading, serial writes
- REST APIs
- Java Client API
- Node.js Client API
- Java / .NET XCC
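As a minimal sketch of the "ingest as is" idea, the snippet below builds a request for MarkLogic's document endpoint (`PUT /v1/documents`), where a JSON document is stored without any upfront schema or data model. The host, port, document URI, and sample payload are assumptions for illustration; actually sending the request would require a running MarkLogic REST instance and credentials.

```python
import json
from urllib.parse import urlencode

def build_ingest_request(host, port, uri, doc):
    """Build the URL and body for a MarkLogic /v1/documents PUT.

    The document is sent as is: no schema registration or data
    modeling step precedes the load.
    """
    query = urlencode({"uri": uri})  # document URI passed as a query parameter
    url = f"http://{host}:{port}/v1/documents?{query}"
    body = json.dumps(doc)  # payload serialized exactly as provided
    return url, body

# Hypothetical example document; any JSON shape is accepted.
url, body = build_ingest_request(
    "localhost", 8000, "/incidents/1.json",
    {"type": "incident", "location": {"lat": 51.5, "lon": -0.1}},
)
print(url)
```

The same endpoint accepts XML, text, and binary payloads with the appropriate `Content-Type` header, which is what makes a separate ETL staging step optional.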
Competitive advantage: data modeling and transformation are not mandatory prerequisites to loading data.
The Nature of Policing Is Changing
“The increasing availability of information and new technologies offers us huge potential to improve how we protect the public. It sets new expectations about the services we provide.”
Police IT systems need to adapt to keep pace with those changes.