Recent Articles Changed

Code Quality - Code analysis
Modified Mar 17 - 10:13
Code analysis is an application that scans code and reports problems or improvements. The application performing code analysis is a linter (a parser/lexer program). Example: bad variable name writing. Bad name writing is a problem that occurs when th "...

Language - Linter (Code Analysis, Validation)
Modified Mar 17 - 10:09
A linter is a tool that: statically analyzes code, finds problems in it, may enforce a coding style, and may power syntax highlighting. Example implementation: bad variable name writing. Bad name writing is a problem that occurs when the developer makes a mistake "...
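The static analysis described above can be sketched in a few lines. This is a minimal, assumed example (not a real linter): it uses Python's `ast` module to analyze source code without running it, and flags variable names that are too short to be meaningful. The whitelist of allowed short names is an invented convention.

```python
import ast

# Assumed whitelist: conventional loop/coordinate names we won't flag.
ALLOWED_SHORT_NAMES = {"i", "j", "k", "x", "y"}

def lint_names(source: str) -> list[str]:
    """Statically analyze source code and report suspicious variable names."""
    warnings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        # Only look at assignments (Store context), not reads of a name.
        if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
            if len(node.id) == 1 and node.id not in ALLOWED_SHORT_NAMES:
                warnings.append(f"line {node.lineno}: short name '{node.id}'")
    return warnings

print(lint_names("a = 1\ntotal = a + 2"))  # → ["line 1: short name 'a'"]
```

Real linters (pylint, flake8, ESLint) work on the same principle: parse the source into a tree, walk it, and emit warnings per rule.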
Video / Motion / Animation / Film
Modified Mar 13 - 14:18
A video is: a sequence of images called frames, shown at a frequency called the frame rate (frames per second), and compressed to meet a bitrate. Movies at first were called moving pictures. A good example to see is the
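A back-of-the-envelope sketch of the definitions above, with assumed numbers: frames at a frame rate produce a raw data rate, and compression is what brings that rate down to a target bitrate.

```python
def uncompressed_bits_per_second(width, height, bits_per_pixel, frame_rate):
    """Raw data rate of an uncompressed video stream (bits per second)."""
    return width * height * bits_per_pixel * frame_rate

# Assumed example: 1080p frames, 24-bit color, 30 frames per second.
raw = uncompressed_bits_per_second(1920, 1080, 24, 30)
target_bitrate = 8_000_000  # an assumed 8 Mbit/s streaming target
ratio = raw / target_bitrate
print(f"raw: {raw} bit/s, needs ~{ratio:.0f}:1 compression")
```

Running this shows why codecs are unavoidable: the raw stream is roughly 1.5 Gbit/s, so a typical streaming bitrate requires a compression ratio in the hundreds.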
What are images? (also known as pictures)
Modified Mar 13 - 12:53
An image (from Latin: imago) is an object that depicts visual perception, such as a photograph or other two-dimensional picture. Examples of images and their dimensions (Name, Dimension): Album Cover; Raster graphics (also called bitmap); Vector graphics; Mus "...
Recent Articles Created
Code Shipping - Definition File (Declaration File, Stubs)
Created Feb 24 - 16:33
Dynamic languages do not have any type system (by default): a variable may store any kind of value. To overcome this situation, developers create definition files that define a library in terms of class, function, and type signatures, enabling code completion, c...
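In Python, definition files take the concrete form of `.pyi` stub files: declarations of classes and signatures with no implementation (the `...` bodies). The sketch below uses an invented stub for a hypothetical `Greeter` class, and shows that a tool can parse the stub to recover the declared methods, which is exactly what editors do for code completion.

```python
import ast

# A hypothetical stub (.pyi content) for an assumed library: signatures
# only, no implementation.
STUB = """
class Greeter:
    name: str
    def greet(self, loud: bool = ...) -> str: ...
"""

# A tool (editor, type checker) parses the stub to recover the declared
# class members without importing or running any library code.
tree = ast.parse(STUB)
cls = tree.body[0]  # the Greeter ClassDef node
methods = [n.name for n in cls.body if isinstance(n, ast.FunctionDef)]
print(methods)  # → ['greet']
```

Type checkers such as mypy consume these stubs the same way; TypeScript's `.d.ts` declaration files play the identical role for JavaScript libraries.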
Data all the way
Always keep one hand firmly on data, Amos liked to say.
Data is what sets psychology apart from philosophy, and physics from metaphysics.

Data Warehouse Subsystems
Developing a data warehouse means assembling a lot of subsystems in order to create a whole and coherent data application.
Discover the 34 Kimball subsystems and drill down into each of them.

Data Processing
Data processing has changed a lot since 1940 :) and luckily, we are not using punch cards anymore.
Stream processing is becoming the norm for data integration tasks, while batch processing remains the king of data analysis.
ENIAC, 1946
The first fully electronic digital computer

Dimensional Data Modeling
Dimensional Data Modeling lets you model a (hyper)cube and analyze a process from different perspectives. You define dimensions, measures, and metrics.
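The cube idea can be sketched in plain Python with invented sample data: fact rows carry dimensions (here `country` and `year`) and a measure (`sales`), and slicing the cube along one dimension is just a group-by plus an aggregation.

```python
from collections import defaultdict

# Invented fact table: each row has two dimensions and one measure.
facts = [
    {"country": "FR", "year": 2023, "sales": 100},
    {"country": "FR", "year": 2024, "sales": 150},
    {"country": "US", "year": 2024, "sales": 200},
]

def total_by(dimension: str) -> dict:
    """Aggregate the 'sales' measure along one dimension of the cube."""
    totals = defaultdict(int)
    for row in facts:
        totals[row[dimension]] += row["sales"]
    return dict(totals)

print(total_by("country"))  # → {'FR': 250, 'US': 200}
print(total_by("year"))     # → {2023: 100, 2024: 350}
```

Swapping the dimension passed to `total_by` is the "different perspectives" part: the same facts answer both a per-country and a per-year question.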
Most programs process some input to produce some output; that’s pretty much the definition of computing.

Function
A function is the basic building block of all reusable code components. It is also becoming the central component of any serverless architecture.
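A minimal illustration of reuse (invented example): one function, written once, serves several call sites with different inputs.

```python
def add_tax(amount: float, rate: float = 0.2) -> float:
    """Return the amount with tax applied (assumed default rate of 20%)."""
    return round(amount * (1 + rate), 2)

# Reuse: the same block of code serves a whole cart and a single item alike.
cart = [10.0, 25.5]
print([add_tax(p) for p in cart])  # → [12.0, 30.6]
print(add_tax(100.0, rate=0.1))    # → 110.0
```

In a serverless setting this same unit, a function taking input and returning output, becomes the deployable artifact itself.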
Knowledge isn't free. You have to pay attention.

Table / Relation
The table format (also known as a relation) is the most important data structure used in data analysis. Did you know that its name originates from the medieval counting table?
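A relation can be sketched in plain Python with invented data: a table is a collection of rows sharing the same named columns, and a basic query is a filter (selection) plus a choice of columns (projection) over those rows.

```python
# Invented relation: every row shares the same column names.
people = [
    {"name": "Ada", "born": 1815},
    {"name": "Alan", "born": 1912},
]

def select(rows, predicate, columns):
    """Relational selection + projection: keep matching rows, keep some columns."""
    return [{c: r[c] for c in columns} for r in rows if predicate(r)]

print(select(people, lambda r: r["born"] > 1900, ["name"]))  # → [{'name': 'Alan'}]
```

SQL's `SELECT name FROM people WHERE born > 1900` expresses the same two operations over the same structure.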
Education isn't something you can finish.