“The complexity of a microcircuit, measured for example by the number of transistors per chip, doubles every 18 months (and therefore quadruples every 3 years).” (Moore’s first law)
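The doubling arithmetic above can be checked with a short sketch. The starting count and the function name are illustrative assumptions, not part of the law itself:

```python
# Sketch of Moore's law: transistor count doubling every 18 months.
# The starting count (2300, roughly the Intel 4004) is for illustration only.

def transistors(months, start=2300, doubling_period=18):
    """Projected transistor count after `months`, doubling every `doubling_period` months."""
    return start * 2 ** (months / doubling_period)

# Doubling every 18 months implies quadrupling every 36 months (3 years):
print(transistors(36) / transistors(0))  # 4.0
```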

We continue to produce I.o.T. devices at an ever-increasing pace, both as upgraded versions of existing ones and as entirely new ones, collecting quantities of data that we sometimes discard and sometimes store, but without any classification.

This attitude leaves the software that consumes this unclassified data in a state of constant misunderstanding, with no possibility of extending it or creating new heuristics.
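A hypothetical sketch of the point: the same readings are opaque as bare numbers, but support a heuristic once they carry a classification. All sensor names, units, and values here are invented for illustration, loosely inspired by the sensor-disagreement scenario in the crashes discussed below:

```python
# Unclassified data: bare numbers with no unit or source attached.
# No heuristic can be written against them.
unclassified = [2.1, 14.8, 1013.2]

# Classified data: each value labelled with its sensor and unit.
classified = [
    {"sensor": "AOA_left",  "unit": "deg", "value": 2.1},
    {"sensor": "AOA_right", "unit": "deg", "value": 14.8},
    {"sensor": "baro",      "unit": "hPa", "value": 1013.2},
]

def aoa_disagreement(readings, threshold=5.0):
    """Flag a fault when like-for-like angle-of-attack sensors disagree.
    This heuristic is only expressible because the data is classified."""
    values = [r["value"] for r in readings if r["sensor"].startswith("AOA")]
    return max(values) - min(values) > threshold

print(aoa_disagreement(classified))  # True: the two AOA sensors disagree
```

The heuristic itself is trivial; the point is that without the `sensor` and `unit` labels it could not be written at all.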

The recent plane crash in Ethiopia poses a paradox that should make us think further, considering that the same thing had already happened in Indonesia.

“Yesterday, Ethiopia chose to entrust the analysis to Germany, but the German federal agency for air accidents declined the request for ‘lack of the necessary software’ to analyze them.” (Source: Il Messaggero, Thursday 14 March 2019)

We are so unaware of the scenarios we are producing that we delegate to I.o.T. devices and authorize the software that interfaces with them to decide, or worse, to exclude humans and their heuristics from the reality of the context.

So I ask myself a question: have devices evolved too far, or is what is missing the classification of the data we collect, which would allow software to work on a heuristic basis?

Do we produce incomprehensible data?

Gaetano