Developing a data warehouse means assembling many subsystems into a single, coherent data application.
Discover the 34 Kimball subsystems and drill down into each of them.
Data processing has changed a lot since 1940 :) and luckily, we are no longer using punch cards.
Stream processing is becoming the norm for data integration tasks, while batch processing remains the king of data analysis.
ENIAC, the first fully electronic digital computer, 1946
Dimensional data modeling lets you model a (hyper)cube and analyze a process from different perspectives.
You define dimensions, measures, and metrics.
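As a rough sketch of the idea, the snippet below models a tiny fact table with two dimensions (region, month) and one measure (amount), then slices the cube along one dimension; the sample data and names are purely illustrative.

```python
from collections import defaultdict

# Hypothetical sales facts: each row carries two dimensions
# (region, month) and one measure (amount).
facts = [
    {"region": "EU", "month": "2024-01", "amount": 100},
    {"region": "EU", "month": "2024-02", "amount": 150},
    {"region": "US", "month": "2024-01", "amount": 200},
]

# Slice the cube along the "region" dimension and sum the measure.
totals = defaultdict(float)
for row in facts:
    totals[row["region"]] += row["amount"]

print(dict(totals))  # {'EU': 250.0, 'US': 200.0}
```

Grouping along "month" instead would give another perspective on the same process, which is exactly what the (hyper)cube metaphor captures.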
Most programs process some input to produce some output; that’s pretty much the definition of computing.
A function is the basic building block of all reusable code. It is also becoming the central component of any serverless architecture.
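A minimal sketch of the reuse idea: one small function, called with different arguments in different contexts (the function name and numbers are illustrative, not taken from any particular platform).

```python
# A function as a reusable building block: the same logic can be
# called locally or wired up as a serverless handler.
def discount(price: float, rate: float = 0.1) -> float:
    """Return the price after applying a discount rate."""
    return price * (1 - rate)

# Reuse the same block in different contexts.
print(discount(100.0))        # 90.0
print(discount(100.0, 0.25))  # 75.0
```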
Slow improvement is still improvement. We have worked on the following pages recently.
Computing Division at the Department of the Treasury, mid-1920s
The table format (also known as a relation) is the most important data structure used in data analysis. Did you know that its name originates from the medieval counting table?
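As an illustration of the relation as a data structure, the sketch below represents a table as rows of tuples plus a separate header of column names; the sample data is invented for the example.

```python
# A relation (table) sketched as rows of tuples with named columns.
columns = ("name", "department", "birth_year")
employees = [
    ("Ada", "Engineering", 1815),
    ("Grace", "Engineering", 1906),
]

# The column names give each tuple position a meaning,
# just as a SQL table's schema does for its rows.
for row in employees:
    record = dict(zip(columns, row))
    print(record["name"], record["department"])
```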