Data science evolution 1960 to 2040

Isn’t it amusing to look back at how paradigms have come and gone over the computer era? Let me share some weekend amateur drawings.

1) 1960-1980. Monstrous servers and ultra-dumb clients.


2) 1980-2000. Monstrous clients and puny servers connecting to the outputs. (VB6 and Access databases loved to live close to the client.)

3) 2000-2020. Enormous servers and enormous JavaScript-heavy clients exchanging enormous amounts of data everywhere. What’s the paradigm here?

4) 2020-2040 (speculative). Connected clients become units, each the owner of its own data, stored locally. What moves between units and a kind of interbackbone is only descriptions, models, metadata, statuses, and point-to-point connectivity agreements for subscribers. The interbackbone is just there to transport data, broker services and agreements, and do the physical transfer between networks. More like IP datagrams in the lower layers. (A little toy sketch of this follows below.)
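Just to make the speculation a bit more concrete, here is a minimal Python sketch of that fourth paradigm under my own assumptions: the names `Unit`, `Backbone`, `Description` and `Agreement` are hypothetical illustrations, not any existing system. The idea is only that payloads stay on the owning unit, the backbone carries nothing but descriptions and agreements, and actual data moves point-to-point between subscribers.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Description:
    """What a unit publishes instead of its data: model name plus metadata."""
    unit_id: str            # who owns the data
    model: str              # which dataset/model the description refers to
    meta: Dict[str, str]    # size, freshness, status, ...


@dataclass
class Agreement:
    """A point-to-point connectivity agreement brokered via the backbone."""
    subscriber_id: str
    publisher_id: str
    topic: str


class Unit:
    """A connected client that owns its data and stores it locally."""

    def __init__(self, unit_id: str):
        self.unit_id = unit_id
        self._local_data: Dict[str, bytes] = {}  # never leaves via the backbone

    def store(self, topic: str, payload: bytes) -> Description:
        self._local_data[topic] = payload
        # Only the description (model + meta) is meant for publication.
        return Description(self.unit_id, model=topic,
                           meta={"bytes": str(len(payload)), "status": "fresh"})

    def transfer_to(self, other: "Unit", topic: str, agreement: Agreement) -> None:
        # Direct point-to-point transfer, allowed only under an agreement.
        if (agreement.publisher_id != self.unit_id
                or agreement.subscriber_id != other.unit_id
                or agreement.topic != topic):
            raise PermissionError("no connectivity agreement for this pair/topic")
        other._local_data[topic] = self._local_data[topic]


class Backbone:
    """Transports descriptions and brokers agreements; never sees payloads."""

    def __init__(self) -> None:
        self.descriptions: List[Description] = []
        self.agreements: List[Agreement] = []

    def publish(self, desc: Description) -> None:
        self.descriptions.append(desc)

    def broker(self, subscriber_id: str, publisher_id: str, topic: str) -> Agreement:
        agreement = Agreement(subscriber_id, publisher_id, topic)
        self.agreements.append(agreement)
        return agreement


# Usage: alice keeps her dataset locally, bob subscribes via the backbone,
# and the payload then moves point-to-point, never through the backbone.
alice, bob, backbone = Unit("alice"), Unit("bob"), Backbone()
backbone.publish(alice.store("sensor-readings", b"\x01\x02\x03"))
deal = backbone.broker("bob", "alice", "sensor-readings")
alice.transfer_to(bob, "sensor-readings", deal)
```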

Have a nice Friday and weekend! And remember to do your homework.