Category Archives: Microblogfeed


Driving Architectural Simplicity – The Value, Challenge, and Practice of Simple Solutions

Simple architectures are easier to communicate, build, deploy, operate, and evolve. Architectural simplicity is not easily captured by any single model or practice; several practices can be applied in combination to drive it. Agile practices stress simplicity. Architectural complexity can arise from many factors, such as design ability and focus, technology evolution, and organizational structure.

Typical resistance thresholds, shown in numbers

There seems to be a kind of natural law of resistance thresholds: once one level is passed, you move up to the next “level”.

Let’s start off…

First you start with 0
– You strive or struggle to reach your first: 1.

Once you have reached 1
– You strive or struggle to double up, reaching 2.

Once you have doubled up (2)
– You strive or struggle for a multiple of 10, reaching 20.

Once you have reached 20
– You again strive or struggle for a multiple of 10, reaching 200.

Once you have reached 200
– You are tempted to gain another multiple of 10, reaching 2,000.

Once you have reached 2,000
– You think: how hard can it be to gain yet another multiple of 10? You aim for five digits, being 20,000.

Once you reach 20,000
– Here it feels easy enough to do more of what you have already done. Why not multiply by 10 again, reaching 200,000?

Once you reach 200,000
– You are probably very hungry to hit the legendary million mark, and would certainly not stop here. You aim for another multiple of 10, reaching 2,000,000.

Once you hit the million, and reach the second
– You have already made it, but you might be obsessed with passing the tenth million. Or you stop caring about the race, here.
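The whole progression above can be generated by a minimal sketch (Python, purely illustrative): start at 0, reach 1, double to 2, then multiply by 10 at every new level.

```python
def milestones(levels=8):
    """Generate the resistance thresholds described above:
    0 -> 1 -> 2, then a tenfold jump at every further level."""
    yield 0
    yield 1
    current = 2
    for _ in range(levels):
        yield current
        current *= 10

print(list(milestones(7)))
# [0, 1, 2, 20, 200, 2000, 20000, 200000, 2000000]
```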


When can this be of relevance? Pretty often, right?

  • Money earned?
  • Number of sales?
  • Number of Dolce & Gabbana garments?
  • Website page visit counts?
  • Number of miles driven?
  • Items of a particular kind collected?
  • Number of reads of a LinkedIn post?
  • Number of tweets you have posted?
  • Number of comments you have made on Quora?
  • Citizens whose opinion you influence?
  • ..your bet!


7 signs that programmers are becoming the next mass workforce, open to many educational levels

It’s not hard to see that the programmer slash developer role is going to increase significantly in importance and in numbers. This trend will surely come shipped with changes in ways of working, scoping, and how one can specialize! Let’s go :-).

  1. User experience designers, UX. Maybe the most obvious current trend. A strong emerging work role that is more about visualizing, modeling, and describing requirements. Designing the frontend has long been part of a coder’s life, and strictly speaking it indirectly tied whole business areas to the momentary feeling of the individual coder. Now coders can focus on modularizing and producing competent, extensible code, and on implementing a design that users have already agreed on.
  2. Test strategies with automation that today build traceability and business acceptance into the test framework, including planning and driving delivery frameworks such as agile or waterfall. The test manager and testers work as a new role here, often never touching a line of code, instead defining acceptance criteria and measurements. They might also drive required changes to the testability of the solution. Coders are there to develop for those needs; they are no longer expected to define and develop comprehensive tests and test tools themselves.
  3. The failure rate of IT projects is still high, often due to a lack of human resources, or key individuals who play too broad a part in the delivery and cannot deliver with full quality in all areas. The number of developers in a delivery must increase dramatically. The cost on the developer side would be a decreased scope and a narrower use of each person’s knowledge. This opens for a new manufacturing-line way of thinking, driving a new level of skill and salary expectations for doing professional work.
  4. DevOps to automate and parameterize release and deployment. Abbreviations such as CI and CD disarm what is maybe the most challenging and specialized part of a coder’s role: understanding the often very complex and volatile relationships between dev, integration, test, and prod. All of a sudden, continuous integration and continuous deployment can be done across the organization by a new kind of role, without passing developers and operational resources all over the build environments. Coders can instead isolate their work to building easy-to-parameterize solutions.
  5. IT architect is on a strong path to becoming an agreed profession. The roles are clarified and contextualized, and several organizations are starting to agree on the big picture. All signs on this path make it clearer that architecture and architects form an interface (not a border!) to development and programming. Coders no longer need to “call out” to find out which integrations, components, or deployments need to be involved.
  6. Virtualized servers, data centers, the cloud, and Docker. A technology paradigm built from the ground up to bring infrastructure as a service. Capacity sizing and physical limitations are now in the hands of infrastructure specialists, who deliver a reliable and fail-safe OS level, perfectly patched and backed up, with disaster recovery included. Coders instead focus on providing and deploying software functionality onto the infrastructure components, and on increasing their awareness of how to build energy-efficient, low-utilization solutions capable of scaling out.
  7. Portable frameworks, microkernel and microservices patterns, and software design. The separation of software functionality into portable components, with increased asynchronous or loose coupling between them, enables the developer to be very, very specialized within a particular area. Framework-oriented development is not the future, it is here: a coder’s specialization is the ability to select the correct framework and implement instances of it. Simply put, there is no need for more than just a few developers with “full stack” competence, a role that exists largely because of monoliths.
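The loose coupling mentioned in point 7 can be illustrated with a minimal sketch (Python; the service names and messages are made up for illustration): two components that never call each other directly, only exchange messages through a queue, so either side can be replaced in isolation.

```python
from queue import Queue

# Two "services" coupled only through a message queue: the producer never
# calls the consumer directly and knows nothing about its implementation.
def order_service(out: Queue) -> None:
    for order_id in (1, 2, 3):
        out.put({"order_id": order_id, "status": "created"})
    out.put(None)  # sentinel: no more messages

def billing_service(inbox: Queue) -> list:
    invoices = []
    while (msg := inbox.get()) is not None:
        invoices.append(f"invoice-for-order-{msg['order_id']}")
    return invoices

q: Queue = Queue()
order_service(q)
print(billing_service(q))
# ['invoice-for-order-1', 'invoice-for-order-2', 'invoice-for-order-3']
```

In a real microservices setup the in-process queue would be a broker or an HTTP boundary, but the design property is the same: the contract is the message, not the call.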

Sum up

In many of the listed areas, there are clearly improvements in sight for the developer slash programmer slash coder role. A kind of purification of the role, which will most likely help you as a recruiter, and you who provide or evaluate the education and skills of a resource. In several ways this reminds me of the former big “industry floor”, where workers are divided into, and stay within, functional areas.

A production line roughly starts with

  • transport to warehouse
  • picking from warehouse
  • assembly and inspection <– many
  • testing what is assembled <– many
  • repairing what does not pass the test
  • finally sending to the delivery queue.

Separate mechanisms take over in the dispatch area. The resources may in practice switch between the functional areas, but can function in only one at a time and organizationally belong to only one.

When I convert this to IT, I find

  • requirement modeling
  • sprint planning
  • coding <– many (coders)
  • early stage testing <– many (coders)
  • testing <– many
  • repair, re-deploy in test, re-test
  • continuous delivery

A challenge to all of the above is the role that used to be defined as “senior developer”, sometimes borrowing from architecture titles as “senior developer slash architect”. May we see many of them convert to lead developers, in line with test managers? Or pure architects, or technical project leads. However, these are key positions rather than a dozen-a-penny role. This latter style (archetype) of developer is not where we will see the increased number of developers in the coming years.

New bounties waiting for you who hack Windows

Microsoft has announced two new bounties for those who break or hack functions in Windows.

Microsoft Windows Bounty Program Terms

Microsoft is pleased to announce the launch of new Windows security bounty programs beginning July 26, 2017. Through this program, individuals across the globe have the opportunity to submit vulnerabilities found in latest Windows 10 Insider Preview slow ring. Windows 10 Insider preview updates are delivered to testers in different rings.

Microsoft Windows Defender Application Guard Bounty Program Terms

Microsoft is pleased to announce the launch of the Windows Defender Application Guard (WDAG) bounty program beginning July 26, 2017. Through this program, individuals across the globe have the opportunity to submit vulnerabilities in WDAG found in latest Windows 10 Insider Preview slow ring. Windows 10 Insider preview updates are delivered to testers in different rings.



Here is the full program of current bounties, including some that were available earlier:


Microsoft Security :: Security Vulnerability | Report a Vulnerability | MSRC:

Microsoft has championed many initiatives to advance security and to help protect our customers, including the Security Development Lifecycle (SDL) process and Coordinated Vulnerability Disclosure (CVD). We formed industry collaboration programs such as the Microsoft Active Protections Program (MAPP) and Microsoft Vulnerability Research (MSVR), and created the BlueHat Prize to encourage research into defensive technologies.


What Albert Einstein knew that is useful for IT

New link. Read the post here:

Einstein knew what would be relevant for today’s IT




We are in mourning for MS Paint – but do you remember the destiny of File Manager 20 years ago?

Loud barks echo around the Internet regarding the discontinuation of MS Paint in Windows.

Does anybody here remember the voices when MS did the same with “File Manager” (winfile.exe) in the last versions of Windows 3.1 and the Windows NT 3.5 server era? The loss of WinFile was a catastrophe for productivity until Total Commander saved the world.

But that is also a part of history, and today only dinosaurs know about it. Now let’s mourn MS Paint and make some guesses about what will be discontinued in about 20 years, in the year 2037. What is your best guess?

Read more at >>

MS Paint is here to stay

MS Paint fans rejoice: The original art app isn’t going anywhere – except to the Windows Store for free!

On top of this, Microsoft comments that Paint 3D will be next. Despite the headlines, classic Paint moves out of Windows and needs to be fetched from the Windows Store.

A slightly better destiny than File Manager (which faced a large number of technical limitations in modern operating systems).

Analysing capacity: ±15 deviation compared to ±0 deviation

Would this model make sense as a visualization of how analytical depth and skill could differ between individuals, beyond education and experience?

When would the #2 archetype fit in professional work where #1 does not? And vice versa. Could (or even should) everyone try to become capable of fitting #2?

Readable full size picture: Full size image

Each individual has a natural degree of talent for versatile and parallel approaches to a problem (and to solving it, which is an ability in itself). Communities often talk about skills, but rarely about how intelligence, the inborn ability measured as IQ, affects how well a skill is improved or not.

Limiting the range to ±15 IQ points from the mean (one standard deviation, since IQ is normed with a standard deviation of 15) includes approximately 68% of the population: people that you meet daily and probably at work (which may be more or less diverse, depending on your site).
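Assuming IQ scores follow a normal distribution with mean 100 and standard deviation 15 (the usual norming), the population share inside a ±k·SD band can be computed directly with the error function; a minimal sketch:

```python
from math import erf, sqrt

def coverage(k_sd: float) -> float:
    """Fraction of a normal distribution within +-k_sd standard deviations."""
    return erf(k_sd / sqrt(2))

# IQ is normed to mean 100 and standard deviation 15, so +-15 points is one SD:
print(f"+-15 IQ points: {coverage(1):.0%}")  # about 68%
print(f"+-30 IQ points: {coverage(2):.0%}")  # about 95%
```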

How someone at the mid-right of this range, compared to someone at the mid-left, approaches a problem, daily tasks, and decision making is very different. But it may look the same from the outside, or in a questionnaire survey.


Jonas Nordin | Professional Profile | LinkedIn

View Jonas Nordin’s professional profile on LinkedIn. LinkedIn is the world’s largest business network, helping professionals like Jonas Nordin discover inside connections to recommended job candidates, industry experts, and business partners.


Why AI and IT automation should not be for IT, but for the business

With respect to the ongoing trend of AI, robotics, deep learning, machine learning, and various kinds of data analysis: the key enabler of this industry is data, information. An era of technologies set to be part of the future, already a multi-billion industry. In reference to this, one often hears that “information is the new fuel”. So let us stop there for a second.

What is fuel to you?

Think of it as a metaphor.

Then think of what fuel is, what function it has, from a generic perspective.

What can the cost be of not challenging the current association with fuel? If the relation to, and understanding of, information and data is not very clear, how can one ensure sustainable development of the new technologies? I want to challenge this from a higher viewpoint, and explain why I want you to care about it.

For instance: fuel is a composition of atoms that you put into combustion. After combustion we have a new composition of atoms, something else. A catalytic process, or so. Irreversibly blended with other atoms of similar kind, unable to be traced back to its source or original structure. The cost of fuel lies in producing it, not in using it. Once the fuel is there, it is ready to serve. Does this fit the description of data in IT? If yes, you might violate foundational architectural principles by neglecting traceability.

For completeness, I choose to start with IoT. The reason is that IoT matters as a newcomer and an enormous producer of data. But does IoT produce data? Well, yeah! At a cost? Maybe not obviously at first thought, but it is quite easy to associate a cost with IoT immediately. Hardware production is, to me, cost. The engineering of IoT hardware is in its infancy, not to mention the feasibility of the implementation phase. Of course I include the security aspect in that comment. IoT is meaningless and belongs in the kids’ corner until it ensures security and acknowledgement of information receipt. Where is the business value here?

Machine learning and deep learning are definitely about data. But do they produce new data at a cost, as IoT does? Not directly, one can say. They may be seen as the instrument used to transform fuel into power; as a technology. Still, building and maintaining them is connected directly to cost: the algorithms and the analysis of information or data to produce new data by creating scenario figures. Figures that come with human effort, training, and evaluation. On top of that, one needs data quality recognition and classification. Every time you end up changing an algorithm, the old data might be useless and need to be recreated. Should I even mention the cost of power during computing? Where is the business value here?

Taking persistence into account: DW and Big Data, plus dozens of “localized” technologies for structured data, in addition to the physical storage to persist the data. As with IoT, all might agree that this is engineering; obviously not about fuel! You might be tempted to associate the information persisted on it with fuel, like gasoline in a gas tank. But wait: this storage mechanism is there to keep the power until it is released, like a battery or a transmission/gearbox. As for the cost of this objective, think of the mandatory methods and styles needed to agree on transportation, data format, availability, and quality selection. This breathes red-colored dollar signs: costs just to produce availability and usefulness of data for decision making. Where is the business value here?

To this picture, add information. Now most of you might disagree, or say that of course there is business value behind all this. I bet you are right. Information is common to all these areas, but is it fuel in any of them? Or are they just supporting or consuming functions where information plays a part?

Is automation or AI about replacing humans with machines or services? Automating services without direct human actions; inventing services in areas where humans can’t or won’t, as of today. Oh yeah, here we will find increased income, increased defense, or an increased service level. Reduced cost. Increased ROI. A happy business user, and a CFO who loves to invest and see technology help create value. The key is automation. More about that soon. Now it is time for a mandatory parable.

Parable alert

More than a hundred years ago, automation could be putting a cogwheel between a winch and an arm to rotate, to transport spans of iron ore from a hill down to the ground. A person needed to rotate this. One may think it was a horse or two, but a horse must be managed by men, and there is a one-to-one relationship between a human work hour and efficiency. One day, someone attached a rotating axle from a completely different invention domain, the steam engine, to make the rope winch rotate. Suddenly iron ore could be transported 24 hours a day: never tired, always transporting, regardless of delivery times. It now took one person five minutes once an hour to put coal into the steam engine’s oven. All of a sudden, a little later, a good income could be doubled by installing another iron ore transport line with the same purpose, eventually a little faster, more reliable, and needing less coal to run. Invention, optimization, cost reduction.

Is this an obvious business case? Are there similarities to what we do with IT? Yeah, a little. And this is finally the point of the rest of the article. Robotics, machine learning, and deep learning are all about IT. IT is the mechanics, strategy, and methods to produce, use, and re-use the information or data. The purpose? To convert it to power, together with fuel. The fuel itself is not in any of the earlier metaphors. In the parable above, the very transport (movement = service) of iron ore could be seen as the product of the whole mechanical composition. Compare this to IT delivering a web-based rental service. Fuel would continue to be the business decisions and ideas that drive the innovation and invention of technology and the usage of information, converted/combusted into power that pushes energy into the transmission and gearbox.

The questions I will let you take with you are (from an IT perspective): Are we the ones who should define and invent AI or automation? At what cost, or increase in revenue, do we replace or incorporate automation that makes sense? That is a business question. IT now sits on an extremely powerful platform of technology. Maybe you can’t control the effects when applying it in a grey zone where business decisions are not present. Think intentionally: IT is here to deliver an efficient technology strategy for business decisions, while business is here to provide adequate services to customers, whatever their usage is. As far as I can see, the Master of Business Administration is there for business development and the interface to the customers. Right? It may be tempting for IT to take a share of the market to strengthen the “IT drives the business” perspective. The deeper technical knowledge of the tools can let IT advocate to the business which capabilities they should design services for.

It didn’t work well in the dotcom bubble around the year 2000. I assume that finance and IT around the world are more mature now and can’t repeat the dotcom bubble as it was, but I’m certain that we can (by greed or by accident) repeat troublesome and time-consuming decision patterns and styles where IT is used to meet business, possibly because we are new to reading today’s volatile consumer market. Remember the intention of the technology strategy. I trust business to make the decisions and requirements for automation and AI, but IT to provide the technology strategy. With this said, this is one interpretation of automation. One can say it’s wrong, one can agree, one can choose both or none. Reflect, analyse, and comment! Thanks.

Bottom note: If you would like more people to read this, just press like or leave a comment. For LinkedIn that is enough to spread the word, in contrast to many other networks that require sharing.




PS: An interesting analysis of emerging technologies, with respect to this article. The scope in the link goes far beyond the year 2017, but this is indeed the year to start watching out for it. Also, as you understand, I still see AI as services for business and end users, not a technology by itself. =) DS.

Monolith First – When MicroServices make sense

In the current microservices hype, I want to emphasize the point of being extra careful about where to implement the microservice architecture, so it doesn’t happen “by all means” just because many talk about it. Read this article, which shares good insights.


bliki: MonolithFirst

evolutionary design · microservices: As I hear stories about teams using a microservices architecture, I’ve noticed a common pattern. Almost all the successful microservice stories have started with a monolith that got too big and was broken up. Almost all the cases where I’ve heard of a system that was built as a microservice system from scratch, it has ended up in serious trouble.


Where do they go, once you let them go through your port (number)?

As you know, UDP is stateless (or so-called connectionless). So there is no guaranteed delivery to the destination (nor any promise about what is returned to you).

UDP has grown rapidly in popularity over the last years. Commonly known areas are online gaming, media streaming, and online collaboration, and even some data-intensive business applications rely on UDP. It is also worth mentioning that services like WiFi love UDP for distributing discovery information to the network.

However, with UDP come critical levels of security risk. Besides the fact that you, by protocol definition, don’t know for sure whether the packets are delivered (or not), you also send information that can be redirected, transformed, or dropped without further notice. When packets return, you think you know what came back, but you don’t know for sure.

So the rhetorical question: do you care, or know in detail, what data you provide in UDP packets? As long as the data is kept in an encrypted transfer it is fine, provided the encryption itself is secure.
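To make the fire-and-forget nature concrete, here is a minimal sketch (Python; the host and port are placeholders): a UDP `sendto()` succeeds even when nobody is listening, whereas a TCP connect to the same closed port fails immediately.

```python
import socket

# Fire-and-forget UDP: sendto() returns as soon as the OS accepts the
# packet. There is no acknowledgement that it ever arrived anywhere.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(b"hello", ("127.0.0.1", 9999))  # succeeds even with no listener
sock.close()

# Contrast with TCP, where merely connecting fails loudly if nobody listens:
try:
    tcp = socket.create_connection(("127.0.0.1", 9999), timeout=1)
    tcp.close()
except OSError:
    print("TCP connect refused -- the UDP send above gave no such signal")
```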

Happy weekend 🙂


This post is sent as a reminder not to use exceptions for flow control!

Why? Because the code will be hard to follow: one path for the exception and one without.

Why? Because the exception may not describe what really happens.

Why? Because the code might be re-written without knowledge of what happens in the methods with the exception-based control flow.

Why? Because you would probably not reverse-trace all instances of your exception.

Why? Because you are lazy.

Why? Because you know that you won’t re-implement the methods.

Why? Because you find out it’s not within the scope of your job.

Why? Because the boss will ask what you are doing instead of the described change task.

And by the way, the automated tests or other mechanisms might do the reverse tracing for you, because they may rely on the exception flow.

And then.. then you have to explain to the boss why the tests failed when you were working on something else.

– Just do the exception handling properly! Normalize the collections based on the agreed data structure (or on the lack of one) during the increment. Create specialist classes for anomalies and exceptions, as if they were new programs handling the code.
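As a minimal illustration of the point (Python, hypothetical example): the first function routes a perfectly normal outcome through an exception, while the second keeps the normal case in the normal return path and leaves exceptions for genuine anomalies.

```python
# Exception-driven flow control (avoid): "not found" is a perfectly normal
# outcome, yet it travels through an exception path.
def find_index_bad(items, target):
    try:
        return items.index(target)
    except ValueError:
        return -1

# Explicit flow control (prefer): the normal case stays a normal return
# value; exceptions remain reserved for real anomalies.
def find_index_good(items, target):
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

assert find_index_bad([1, 2, 3], 9) == find_index_good([1, 2, 3], 9) == -1
```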

Data science evolution 1960 to 2040

Isn’t it amusing to see how paradigms have passed in the computer era? Let me share some amateur weekend drawings.

1) 1960-1980. Monstrous servers and ultra-dumb clients.

2) 1980-2000. Monstrous clients and puny servers connecting to outputs. (VB6 and Access databases loved to be close to the client.)

3) 2000-2020. Enormous servers and enormous JavaScript-heavy clients exchanging enormous amounts of data everywhere. What’s the paradigm here?

4) 2020-2040 (speculative). Connected clients are units, owners of their own data, stored locally. Data transfer between units and a kind of inter-backbone consists only of descriptions, models, metadata, statuses, and point-to-point connectivity agreements for subscribers. The inter-backbone is just there to transport data, act on services and agreements, and do the physical transfer between networks. More like IP datagrams in the lower layers.

Have a nice Friday and weekend! And remember to do your homework.