Figure: Architecture diagram of Normalization.
Data Standardization
Large Data Volume Processing

Normalization provides the early layers of data processing: it reduces redundancy and applies filters to standardize data arriving from different sources. The system is driven by JSON configurations that detect structured data types and models, enabling large volumes of data to be converted either from storage or in real time via APIs.
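
The product's actual configuration schema is not shown here, so the following Python sketch is only an illustration of how a JSON configuration could declare a model's fields and drive detection and normalization of incoming records; the keys (model, fields, type, format), the function names and the sample data are assumptions.

    import json

    # Hypothetical JSON configuration describing one structured data model.
    CONFIG = json.loads("""
    {
      "model": "sensor_reading",
      "fields": {
        "device_id": {"type": "string"},
        "timestamp": {"type": "datetime", "format": "ISO-8601"},
        "value":     {"type": "float"}
      }
    }
    """)

    def matches_model(record: dict, config: dict) -> bool:
        """Return True if the record exposes every field declared in the config."""
        return all(field in record for field in config["fields"])

    def normalize(record: dict, config: dict) -> dict:
        """Keep only the configured fields so every source converges on one shape."""
        return {field: record.get(field) for field in config["fields"]}

    raw = {"device_id": "A-17", "timestamp": "2024-05-01T12:00:00Z", "value": "21.5", "extra": "noise"}
    if matches_model(raw, CONFIG):
        print(normalize(raw, CONFIG))  # extra, unconfigured fields are dropped

In this reading, the same configuration file can be applied to data pulled from storage or to records arriving over an API, since both paths end in the same record-by-record conversion.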

Real-time data conversion to simplify data analysis

Normalization is software built on Node.js and Python, with a connection to Microsoft Azure Storage. It performs the conversion through automatically generated threads and balances the processing load across different machines.
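
The exact threading model is not described, so the sketch below only illustrates the general idea under the assumption that each stored object can be converted independently: a worker pool spawns threads automatically and spreads the conversions across them. The convert_one function and the object names are placeholders, not the product's API.

    from concurrent.futures import ThreadPoolExecutor

    def convert_one(blob_name: str) -> str:
        """Placeholder conversion step for a single stored object."""
        # A real worker would download the object (e.g. from Azure Storage),
        # apply the JSON-configured model and write the normalized output.
        return f"{blob_name}.normalized"

    blobs = ["source_a.csv", "source_b.json", "source_c.xml"]  # illustrative names

    # The executor sizes and spawns the threads automatically and balances the work.
    with ThreadPoolExecutor() as pool:
        for result in pool.map(convert_one, blobs):
            print(result)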

Management by groups, models and types
Common repository hierarchy

Normalization works on a repository of data models so that types can be reused, streamlining the normalization of different sources. This makes it possible to distinguish types and simplifies subsequent data analysis.
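
As a rough picture of such a repository, the sketch below defines shared types once and lets several models reference them; the repository layout, type names and model names are assumptions made for illustration only.

    # Hypothetical shared type definitions, declared once in the repository...
    TYPES = {
        "geo_point": {"lat": "float", "lon": "float"},
        "iso_date":  {"format": "ISO-8601"},
    }

    # ...and reused by several data models.
    MODELS = {
        "vehicle_position": {"id": "string", "position": "geo_point", "seen_at": "iso_date"},
        "weather_station":  {"station": "string", "location": "geo_point", "measured_at": "iso_date"},
    }

    def resolve(model_name: str) -> dict:
        """Expand a model's fields by substituting the shared type definitions."""
        return {field: TYPES.get(type_name, type_name)
                for field, type_name in MODELS[model_name].items()}

    print(resolve("vehicle_position"))

Reusing types this way means a fix or refinement to "geo_point" immediately applies to every model and source that references it.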

Reuse of different data types

Normalization's filtering can be driven by traditional queries or by temporal and geospatial criteria based on geodesic algorithms, reducing the size of the data and improving the speed of analysis.
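
The text does not name the geodesic algorithm used; a common choice for this kind of spatial filter is the haversine great-circle distance, sketched below as an assumption. The 50 km radius, the reference point and the sample records are illustrative.

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points on Earth, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    records = [
        {"id": 1, "lat": 40.42, "lon": -3.70},  # near the reference point
        {"id": 2, "lat": 41.39, "lon": 2.17},   # far from the reference point
    ]

    # Keep only records within 50 km of a reference point, shrinking the data set
    # before any further analysis runs on it.
    center = (40.40, -3.68)
    nearby = [r for r in records if haversine_km(r["lat"], r["lon"], *center) <= 50.0]
    print(nearby)  # only the first record survives the filter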