Leadership roles in various business contexts require a solid understanding of the environment, a systematic approach to solving problems, and the ability to see the bigger picture within and around the organization. Richard Rumelt’s “Good Strategy / Bad Strategy” is an excellent book that demystifies the true nature of strategy. In this short blog post, we will delve into the core principles presented in the book and explore how they can be applied.
Understanding the Core of Good Strategy
Clarity of Thought and Vision
Rumelt emphasizes that at the heart of a good strategy is a clear diagnosis of the situation, an understanding of the challenge, and a coherent approach to overcome it. This clarity cuts through the noise and complexity, enabling leaders to focus on what is truly important. The book also gives examples of managers who build their strategies by either not understanding the real problems in an organization or ignoring them. We all remember the story of Don Quixote, right? In reality, this sweeps the problems under the carpet and adds new ones. Unsolved problems resurface in different shapes and cause companies to fail from the inside out. The proper first step is to acknowledge and prioritize what to address; only then does overcoming the challenge become possible.
Leveraging Strengths
A strategic approach is not just about addressing challenges but also about leveraging strengths. Rumelt argues that a good strategy plays to an organization’s strengths, turning what you have into a powerful tool to edge past the competition. It is a fundamental requirement to understand what your organization’s foundational and learned strengths are before putting pen to paper on a strategy. Remember, strengths can differ vastly depending on the environment, the industry, or even the life phases of the key people in your organization. In some cases, a property of the organization may manifest as a strength at a certain time, while that very same so-called strength can later turn into a fatal weakness.
Pragmatism Over Fantasy
Rumelt warns against what he terms ‘bad strategy,’ which often includes lofty goals without realistic plans. There are public examples of management in giant companies creating strategies from their ivory towers, where everyone except them struggles to see how the strategy could be achieved or how a plan could be derived from it. A good strategy is grounded in realistic assessments and actionable steps, steering clear of fanciful objectives disconnected from ground realities. If a strategic objective cannot be turned into a plan, then it probably does not belong in a strategy.
Key Takeaways from the Book
The Kernel of Strategy
At its core, strategy involves three elements: a diagnosis, a guiding policy, and coherent actions. This ‘kernel’ is the foundation on which effective strategies are built.
The Pitfalls of Bad Strategy
Bad strategies, according to Rumelt, often arise from unrealistic or misguided objectives, a lack of focus, or mere fluff that lacks substance. Recognizing these pitfalls is the first step in avoiding them.
Strategy is Dynamic
The book also highlights that strategy is not a one-time event; it is a compass for navigating rough waters. Whoever owns the strategy must keep it up to date and ensure it adapts to changing circumstances and opportunities.
Applying the Concepts
In Business Leadership
Leaders can utilize Rumelt’s principles by being open about their organization’s challenges, setting achievable objectives, and aligning actions with these goals. This approach fosters a culture of focused and effective strategy execution. Remember, a strategy is only effective if it can be understood and worked towards. With this kind of approach, it is easier to get commitment from the members of the organization towards the most important goals.
Personal Career
Individuals can apply these insights in their career planning. By clearly understanding personal strengths and the challenges of the job market, one can formulate a strategy for career advancement that is both realistic and ambitious. Creating such a strategy guides an individual in identifying the challenges ahead and enables long-term career planning.
In Entrepreneurship
Entrepreneurs can benefit from these principles by focusing on their unique value proposition, understanding their market deeply, and crafting strategies that leverage their strengths to outmaneuver potential or existing competition.
Conclusion
“Good Strategy / Bad Strategy” is not just a book; it’s a lens through which to view business and personal challenges. Rumelt’s wisdom encourages us to embrace clarity, leverage our strengths, and approach strategy with a balance of ambition and pragmatism. As we apply these timeless principles, we pave the way for success in our organizations and personal endeavors. I strongly recommend this book to anyone in need of guidance in a business or personal context.
It has been more than a month since my first post about the graphs showing the COVID19 city snapshots from the THL website. During that time, THL has improved its reporting and now shows region-by-region infection density based on the data published for the last 14 days. 149 new infections were diagnosed yesterday in Finland, and I wanted to give an update on the 3-week moving sum city graphs. Note that the graphs below are only snapshots taken from THL. The cities below were picked due to the size or growth rate of the infection; there are other cities with #COVID19 infections as well. For the most accurate information, please visit the official THL website and its daily update pages.
As a short summary, infection rates in Jyväskylä, Mikkeli and Lahti are rising, along with some smaller cities, which is different from what we saw in the first wave. The situation in Uusimaa is not getting better either. There are a small number of cities where the trend is downwards, including Turku and Kuhmo.
The second wave of the pandemic is at our door, and we do not know much about how bad it already is. We have data coming through various sources, but most of it says little about the trends, where we could see whether we are doing badly and, if so, how badly. We have been following the infection trends since March, and those graphs gave an accurate picture in March and April. Now, in mid-August, the numbers have started to make less sense, and things get fuzzy when we look at cumulative graphs that take all the data since March into account. An example is below:
There is not much information above. Actually, the only meaningful information there is the number of the deceased. Even the 146 infections in the last week is not correct: when you check the THL (Finnish Institute for Health and Welfare) web page regarding the infection data, as of today, 14.8.2020, and sum up the data for sampling days from “Päiväkohtaiset tautitapaukset (näytteenottopäivän mukaan)”, the total differs depending on where you take the “start date”, but none of the sums for 7 or 8 days make 146.
There are other news platforms with similar data, but let’s look at THL, the authority coordinating the fight against COVID19. THL publishes two kinds of regular updates:
On their website, updates are published three times a week.
THL posts daily updates to the detailed statistics page. There it is also possible to see how many infections have been recorded in their system for each city up to that day. Note that those daily updates do not tell how many people were diagnosed on any particular day in a given city; they just show snapshots of the situation. There can be positive results that are not yet in the system and will be recorded on a later day. Another interesting thing is that the sum over all cities does not match the total given on the page. For these reasons, it is not possible to extract the exact number of diagnoses for any particular city on any particular day. Considering all this, there is somewhat useful information in the THL data, but it is not easy to make sense of it, and even if you do, it is not possible to draw a complete picture.
So what am I after in this post? In an ideal world, the best thing would be to know how many people are still sick with COVID19 today, if possible city by city. At the beginning of the pandemic, YLE’s graph showed exactly that for the entire country: the number of people currently sick with the disease. If the curve points upwards, the virus is spreading faster among us, so we need to slow it down and “flatten” the curve. Now, after five months, many people in that graph are no longer sick, and the flat graph is meaningless due to the scale. Since there is no clear information about who is no longer sick, other than the deceased, we have no exact way of showing that in a graph. So what can we do?
I have been collecting daily snapshots from THL’s confirmed coronavirus cases page for my personal use for quite some time now. I previously did some work to extract graphs for each city, similar to YLE’s graph. At the beginning of the pandemic, the cumulative data for each city was useful for the reasons I mentioned above, especially if I was planning to visit those places. You can never be too careful, right? Then I took it one step further by calculating a moving sum for each date. We cannot know how many people are currently sick, but if we have, for each day, the total number of cases reported over the last few weeks, it can at least tell us in which direction the infection is going. For example, the graph below shows how we did in July. The numbers are not the daily reported infections for Helsinki; each number is the sum of cases reported during the 21 days up to that date.
Note that this is not a scientific work. I am just sharing my personal observation about the situation.
To explain with an example, the number on the 22nd of July is 22. That is the total number of cases reported on the THL website between the 2nd and the 22nd of July (21 days) for Helsinki. At the beginning of the month, the graph shows 51 and 53, which tells us that infection rates dropped during July.
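To make the calculation concrete, here is a minimal sketch of how such a 21-day moving sum could be computed from the daily cumulative snapshots. This is not the exact script behind my graphs; the file name and column names are made up for illustration.

```python
# Minimal sketch (not the exact script behind the graphs): compute a 21-day
# moving sum of reported cases from daily cumulative snapshots.
# "snapshots.csv" and its column names are hypothetical.
import pandas as pd

# One row per day and city: date, city, cumulative number of cases so far.
df = pd.read_csv("snapshots.csv", parse_dates=["date"])

helsinki = (
    df[df["city"] == "Helsinki"]
    .sort_values("date")
    .set_index("date")
)

# New cases per day = difference between consecutive cumulative snapshots.
daily_new = helsinki["cumulative_cases"].diff().fillna(0)

# Total cases reported during the preceding 21 days, for each date.
moving_sum_21d = daily_new.rolling(window=21, min_periods=1).sum()

print(moving_sum_21d.tail())
```

Plotting moving_sum_21d then gives the kind of trend line shown in the graphs.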
The graph above shows the situation from the 28th of July until today. On the 28th of July, 20 people had been diagnosed with #COVID19 in Helsinki during the preceding 3 weeks; by the 14th of August, that number is 89. This shows that confirmed COVID19 cases are rising fast, and we potentially have more infected people in Helsinki today than in late June or early July. Until July 13, gatherings of more than 50 people were prohibited and there were restrictions on restaurants and bars. There are no restrictions at the moment, despite there being more infections in the community.
I will add the graphs of other big cities below. Thanks for reading.
DISCLAIMER: The graphs above do not show the exact number of sick people. They are just different representations of the snapshots taken from the THL website on each date.
It has been a while since the last time I wrote a post, but here is a new one. It is about cycling. I have been cycling in my free time for the last few years and I enjoy it quite a lot. This is something I could not have done back in Istanbul for traffic safety reasons, but in Finland you can enjoy your ride in almost all parts of the country. I owned a road bike for around two years before it was stolen last November, and only now have I got a new one.
As I have been enjoying cycling in Helsinki, I decided to make a short video out of one of my first short spring trips in May 2020. It was only an hour-long trip; it started like a summer day and ended like a proper Finnish spring. It is also visible that people prefer to be outside despite the coronavirus pandemic. I have little video editing experience, so do not expect too much 🙂
I am thinking of starting a VLOG but cannot decide what the content should be. One option is to share my experiences and thoughts about cycling with fellow beginner cyclists; another is to share my experience as a professional living in Finland. I have had this idea for a few years already, but maybe the social isolation of the coronavirus period can have a positive outcome. Feel free to contact me if you think of something interesting.
TestCon was a three-day conference held in three different locations in the beautiful city of Vilnius, Lithuania. It started with a full-day workshop and continued with two session days. Workshops were held on the first day of the conference and you needed to sign up beforehand to reserve a seat. If you are late, as I was, you might end up missing the most interesting workshops, since they seem to get fully booked quite quickly. Workshops were organised in two different locations, and you had to check the schedule on the www.testcon.lt website to see where you needed to go. One thing to complain about is that the conference material was published quite late on their web page, after ten days. However, all sessions were recorded and uploaded to YouTube.
Session days were held at Multikino, a big movie theater in a mall in Vilnius. The workshop locations and Multikino were really far from each other, so you cannot book a hotel within walking distance of all three venues; you will have to use some means of transportation to reach them.
Each session day was really busy, with seven sessions of 45 to 50 minutes each. There were no real lunch breaks: there was a time slot in the programme called “Lunch Break”, but since there were sessions during that time, you had to skip those sessions to be able to eat. Speakers came from all around the world, and for most of them Quality Assurance was their main expertise. All of them were successful examples in their fields, and they shared their experience with the tools and methods they have been using. Towards the end of the day it was hard to digest all the information flowing in, so I took extensive notes and pictures and went back to them after the conference.
I will go through the main topics I was able to attend in this order: Test Driven Development, Test Automation, and Quality Process.
Test Driven Development (TDD)
Test Driven Development is a development method that almost every developer knows by heart but does not apply unless it is strictly required by the organisation they work in. TDD was the main topic of a few sessions, and all the speakers agreed that applying TDD makes a big impact on the quality of the produced software as well as on the culture of software development in an organisation.
Tests, in this context unit tests, are a kind of feedback about the code, similar to reviews. Unlike code reviews, however, tests are repeatable and instant. When TDD is used, the code emerges from the tests. Changes to the code are always verified by the tests, and the behaviour of the code is always checked against the expected behaviour. We all know that behaviour is testable with unit tests; Francesco Strazzullo, who talked about TDD in the first session of the conference, made a good additional point: with TDD, the produced code can also be tested against non-functional requirements (the “-ilities”).
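As a toy illustration of how code emerges from a test (my own example, not one from the talks): the test below is written first and drives out the small piece of production code under it.

```python
# Toy red-green example in pytest style: the test comes first, then the
# smallest implementation that makes it pass. Cart is a made-up example class.
import pytest


def test_discount_is_applied_to_cart_total():
    cart = Cart()
    cart.add(price=100.0)
    cart.add(price=50.0)
    assert cart.total(discount=0.1) == pytest.approx(135.0)


# Just enough production code to satisfy the test above.
class Cart:
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self, discount=0.0):
        return sum(self._prices) * (1.0 - discount)
```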
Unit testing is at the core of TDD. Dennis Doomen talked about how to practice TDD and unit testing without shooting yourself in the foot. His approach was that developers should not try to apply the DRY principle to unit tests: it is okay to copy-paste a test to create another test with minor modifications, as shown in the sketch below. His definition of “unit” in unit testing also seemed broader than an individual class; it targeted testing a group of classes that together enable a feature of the software, something close to what we call system testing, in this case performed completely by the developer.
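The sketch below illustrates that point: two deliberately repetitive tests (reusing the made-up Cart class from the previous sketch) that each read as a standalone specification, instead of one clever shared helper.

```python
import pytest

# Deliberately repetitive tests: the duplicated setup keeps each test readable
# and diagnosable on its own, at the cost of violating DRY. Illustration only.


def test_total_without_discount():
    cart = Cart()
    cart.add(price=100.0)
    cart.add(price=50.0)
    assert cart.total() == pytest.approx(150.0)


def test_total_with_ten_percent_discount():
    cart = Cart()
    cart.add(price=100.0)
    cart.add(price=50.0)
    assert cart.total(discount=0.1) == pytest.approx(135.0)
```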
Finding and fixing bugs early in development is the cheapest way to deal with them, and test driven development helps find bugs early in the development process. This was one of the key learnings of the conference: “Fail fast, fail early, fail often”.
Test Automation
Let’s start with a definition. What is test automation? Test automation is a testing concept where a set of tests exercises the produced software the way an end user would. The automation performs certain steps, and the behaviour of the tested software is then verified with different tools, such as comparing screenshots or evaluating logs where applicable (in some sessions “Gray Box Testing” was mentioned, where the test environment has some information about the state of the tested software to detect and deal with flakiness). Test automation is a crucial part of the modern software development process. Having automated tests dramatically reduces testing time and increases the reliability of the software: modifications to the software can be verified against the expected behaviour, which makes “continuous release” possible. Test automation is also important for achieving scalable software development when cross-functional teams produce their code and merge it to a common branch, i.e. staging or even master. With all these positive sides, it is still not easy to achieve automated tests that are fast, reliable and easy to maintain. Automated tests are also a good way of verifying the requirements of the product.
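As a rough sketch of what exercising the software the way an end user would can look like in code, here is a browser-driven test written with Selenium WebDriver; the URL and element locators are placeholders, not a real application.

```python
# Rough sketch of a UI-level automated test using Selenium WebDriver.
# The URL and element locators below are placeholders, not a real application.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.test/login")

    # Perform the steps an end user would perform.
    driver.find_element(By.ID, "username").send_keys("demo-user")
    driver.find_element(By.ID, "password").send_keys("demo-password")
    driver.find_element(By.ID, "login-button").click()

    # Verify the observable behaviour instead of internal state.
    assert "Dashboard" in driver.title
finally:
    driver.quit()
```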
Jani Haapala talked about how to perform scalable test automation. I have to mention that this was my favourite session of the whole conference, and it was really eye-opening. He first emphasised the importance of picking the right infrastructure. This heavily depends on what kind of software is produced as well as on the capabilities of the people involved. The complexity of the problem grows when you consider the possible infrastructure layers and tooling, so getting the infrastructure right is really hard and deserves a separate post. Considering how hard it is, infrastructure and tooling can also be provided by an external party while the tests themselves are developed in the software teams; this approach clearly defines the responsibilities and the contract between the parties. Jani also presented how they approach failing tests. Flakiness is a problem in their world too, and they have created a mechanism to deal with tests that fail for different reasons.
Automated tests can be developed with different methods; I will describe the three methods highlighted at the conference. The first approach is to develop a test framework for the product and maintain it. People develop their tests using the test framework, and the framework itself can be maintained by a separate team. This approach requires everyone to be a coder, since developing tests is the same as writing code. The second approach is to use a framework such as Robot Framework on top of your own test framework. Jani Haapala promoted Robot because it does not require much coding skill and the readability of the tests is high; readability matters when requirements are directly executed as tests. Almost no coding skills are needed to write automated tests with Robot Framework, although considerable coding effort is required to first build a test framework and then adapt it to Robot. It pays off later, when tests can be created with far less effort (see the sketch after this paragraph). The third approach is using the record/playback method to create tests. This requires a tool with which the user (the tester, in this case) can record a test: the performed actions are captured as code, and the generated code can be re-run as an automated test. It is fast to create automated tests this way, and a person with no coding skills can generate hundreds of tests quickly. However, this approach generates a lot of duplicate code, and once the product changes, a huge number of tests may need to be recreated. Louise Gibbs presented the record/playback approach and emphasised that refactoring the generated tests was required to re-use some of them. She explained that a non-coder tester might not have adequate coding skills to create automated tests from scratch, but can learn while refactoring the generated code. This approach can be beneficial if an organisation wants to boost its automated test coverage but does not have enough coding capacity or time to do it.
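For the second approach, the product-specific test framework is typically exposed to Robot Framework as a keyword library written in Python. The sketch below shows the general shape; the class, the keyword names and what they delegate to are invented for illustration.

```python
# Minimal sketch of a Python keyword library for Robot Framework.
# The keywords and what they delegate to are invented for illustration.
from robot.api.deco import keyword


class LoginKeywords:
    """Exposes a (hypothetical) product test framework as Robot keywords."""

    @keyword("Open Application")
    def open_application(self, url):
        # A real library would start the product or a browser session here.
        self._url = url

    @keyword("Log In As")
    def log_in_as(self, username, password):
        # A real library would drive the product's login flow here.
        self._session = (username, password)
```

A Robot test case can then read almost like a requirement, executing keywords such as Open Application and Log In As one after another, which is where the readability benefit comes from.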
Quality Process
Quality, by definition, tells how good or bad something is. Sounds simple. A quality process is the set of rules and methods, executed by various actors, whose goal is to have the product match the specified requirements. In a big software project it is not easy to manually verify whether the product matches the requirements, especially if the product is created incrementally. An independent structure or team is needed to perform the relevant actions to make sure the quality targets are achieved. It is also important to note that the verifications mentioned below should be done by an independent party that does not have a conflict of interest with the development teams.
To achieve quality, test automation is crucial in the modern software development world. If the produced software has test automation directly covering the functional requirements, incremental modifications are not as scary, and only a minimal amount of manual testing is needed. This requires the functional requirements to be defined in smaller sets, or “user stories”, and each story should have at least one automated test that tells whether a previously implemented requirement is still met after changes to the product. The independent quality organisation should verify the tests against the defined requirements.
Having automated tests directly exercising the functional requirements makes it possible to release at any point in the product timeline, alongside periodic release targets. This approach keeps things scalable for cross-functional teams and also makes life much easier when some parts of the product are outsourced.
Test driven development is also an integral part of the quality process, helping to find problems and deal with them in the early phases. It can also be utilised to check non-functional requirements.
One important aspect is the actors who manage the quality process. The conference carried the message that “everyone is responsible for quality” and that quality is a culture within software development organisations. I recommend watching Torstein Skarra’s session, From Waterfall to Agile: 7 Tips for Transforming a Test Organization.
I will write a separate blog post about software development and how to achieve quality, covering my further learnings from the conference.
Unlike my previous posts, this one has nothing to do with software. I will tell you about my favourite musician and my favourite city in Finland. I moved to Tampere for the first time in 2012 when I started my studies. After a break of two years, I moved back in autumn 2014 for a few months to finish my studies. It was a cloudy and cold autumn in Tampere. I had a full-time job that I did from home, in addition to the classes I had to attend. On top of such a busy schedule, it was an emotional time, combined with a lack of friends. That was when I first started to listen to Ólafur Arnalds. He is an Icelandic musician who plays instrumental music, mainly on the piano.
During those four months in Tampere, I listened to a lot of Ólafur. I left the city at the end of the year, but ever since, whenever I go there, his songs play in my head.
Ólafur Arnalds had four released projects at the time, but I have mainly been listening to the Living Room Songs album. Some songs remind me of jogging around Hervanta, others feel like driving along Hämeentie. The piece below is a great one; it reminds me of foggy morning runs by the lake behind Hervanta:
There are a bunch of other nice songs on that album, like “This Place Is a Shelter” and “Tomorrow’s Song”. I had probably listened to those songs hundreds of times before this talented guy finally came to Helsinki for a concert during his world tour. Despite the relatively expensive tickets, it was nice to see Kulttuuritalo full of people.
He told several stories, including how he started to play the piano. Apparently he was trying to be a rapper before his grandmother convinced him to pursue his current career. His show was not only about playing music; the light show also created a great atmosphere for the melodies. He also played songs from the new album re:member. The song below is my favourite among the new ones, and unlike the ones I like from Living Room Songs, it has a happier rhythm. The name of the song is “saman”, which means “together” in Icelandic.
Software is designed and implemented to solve problems. While some problems are more complex than others, every problem requires some amount of breaking down and simplification. This is called divide and conquer. The Latin version is “divide et impera”, and it has been one of the most successful political strategies in our short history of politics. Although the history of software is even shorter and not comparable to politics, the divide and conquer strategy is just as vital. A software engineer or architect needs to divide a problem into smaller pieces to understand it better and to come up with a comprehensive solution. In this post, you will read about ways to break a big problem down into more manageable pieces, and we will discuss which way is better.
Functional Decomposition
Functional decomposition is the traditional way to break down a complex problem into smaller pieces. Most of the time, it is the most evident and obvious solution to a given problem. Object-oriented programming (and sometimes other paradigms as well) nudges the problem solver towards constructing the smaller pieces this way. I would like to concentrate on the mental model of the programmer decomposing the problem, so let’s assume an object-oriented programming language is used. Object-oriented programming allows the programmer to abstract the bigger problem into smaller functional pieces. Functional decomposition is the first solution that comes to mind when designing software. I think this is because people who code in OOP languages are inclined to design their systems with functional decomposition due to the way OOP is taught to them, and because there is almost always an obvious break-down of the problem using this method.
Is that the right thing to do?
Absolutely not, from the software architecture point of view. Functional decomposition breaks the complex problem down into smaller functional problems that are isolated from each other in the wrong way. Solutions to the small functional problems force the coder into incorrect encapsulation, which prevents reuse between different sub-solutions and makes each small solution overcomplicated. Joe Armstrong criticised object-oriented programming along these lines: “The problem with object-oriented languages is they’ve got all this implicit environment that they carry around with them. You wanted a banana but what you got was a gorilla holding the banana and the entire jungle.”
Object Oriented Programming is a powerful tool, but relying only on it when decomposing software is not the right approach.
Volatility Based Decomposition
I first heard about this type of decomposition at a conference from Juval Löwy. Decomposing by volatility means that the system should be designed as separate modules with different volatility levels. Those modules should be isolated from each other, and changing one of them should not affect the other modules. It is in a way close to the composition design pattern. Breaking down a big problem into smaller pieces is not as easy here, because the solution is not as evident as in functional decomposition.
Rules:
– Identify change areas and isolate similar change areas from others.
– Do not use functional change as a change area.
– Integrate features onto the architecture (features are not implementation).
– Do not resonate with change.
Where does the object-oriented paradigm stand as a tool in this decomposition? Volatility-based decomposition takes advantage of its powerful encapsulation mechanisms. Interfaces and classes should be designed to isolate the implemented change area or module from other modules. Features emerge on top of the different volatility levels; functions or features usually sit at the highest volatility level, and changing features should not break the architecture.
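As a tiny sketch of what this can look like in code (my own example, not from Löwy’s material): the delivery mechanism for notifications is treated as a volatile area, so it is hidden behind a stable interface and the rest of the system depends only on that interface.

```python
# Sketch: a volatile concern (how notifications are delivered) is hidden behind
# a stable interface, so swapping implementations does not ripple outward.
# Names are illustrative only.
from abc import ABC, abstractmethod


class NotificationChannel(ABC):
    """Stable contract; the concrete delivery mechanism is the volatile part."""

    @abstractmethod
    def send(self, recipient: str, message: str) -> None: ...


class EmailChannel(NotificationChannel):
    def send(self, recipient: str, message: str) -> None:
        print(f"email to {recipient}: {message}")


class SmsChannel(NotificationChannel):
    def send(self, recipient: str, message: str) -> None:
        print(f"sms to {recipient}: {message}")


class OrderService:
    """Depends only on the stable interface, never on a concrete channel."""

    def __init__(self, channel: NotificationChannel):
        self._channel = channel

    def confirm(self, order_id: str, customer: str) -> None:
        self._channel.send(customer, f"Order {order_id} confirmed")
```

Swapping EmailChannel for SmsChannel changes nothing in OrderService, which is exactly the kind of isolation volatility-based decomposition aims for.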
Credits
This blog post was inspired by Juval Löwy’s “Zen of Architecture” session, presented at the Software Design and Development Conference 2019 in London.
This is my first blog post after many years. I am happy to write it on a topic that I believe is exciting for other software developers as well. A few colleagues and I attended our first software development conference last month. It was a five-day conference that started with a full-day workshop, followed by three days of sessions and a final full-day workshop. The main focus area was software engineering, and different sessions concentrated on different sub-areas such as software development methodology, architecture, project management and user experience.
Workshops
I attended two full-day workshops during the conference. The first workshop, given by Jeff Prosise, was about using Azure, and it showed us how strong a platform Microsoft has been building. During the workshop, we created a web app and used storage services to upload content for it; the web app then displayed a number of images. This is a really simple use of Azure, but using Cognitive Services to detect the content of the images was impressive. It is important to remember that those services are not free: there are different tariffs for almost all of the services in Azure, but the prices seemed reasonable.
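For a flavour of the kind of call we experimented with, here is a rough sketch of asking the Computer Vision service to tag an image over its REST API using plain requests; the endpoint, key, API version and response handling are my assumptions, not the exact workshop code.

```python
# Rough sketch (not the workshop code) of asking Azure Computer Vision for
# tags describing an image; endpoint, key and API version are assumptions.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"  # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/json",
    },
    json={"url": "https://example.test/some-image.jpg"},
    timeout=30,
)
response.raise_for_status()

for tag in response.json().get("tags", []):
    print(tag["name"], tag["confidence"])
```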
“Software Architecture: The Hard Parts” was the workshop on the last day of the conference. It was run by Neal Ford and Mark Richards, who are respected software architects. The point of the workshop was to design a system with the best possible architectural model. The three session days before it were critical for being able to make that design, since the workshop was built to digest the information given during the sessions.
Sessions
Conference sessions were held over three full days. Each session lasted about two hours; there were six parallel sessions in every slot and three slots every day. I would like to summarize the sessions under three sub-titles: Architecture, Methodology and Code Analysis.
Sessions covering software architecture demonstrated the essentials of designing software in the modern day. Monolithic and distributed architectures were explained along with “why and when to use” points, and the “-ilities” of each architecture type were gone through to decide when one architecture is better than another. One good point was that there is almost never a perfect architecture for a problem; there are always tradeoffs. Another aspect was analysing an existing architecture, for which macro and micro methods were examined. Architecture fitness functions are a particular type of objective function used to summarize how close a given design is to achieving its targets. Domain-driven design and design by volatility were other important terms that came up around architecture, if you have any interest in looking them up.
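To make the fitness function idea concrete, here is a small sketch of one expressed as an ordinary test: it fails the build if any module in a (hypothetical) ui package imports the persistence package directly.

```python
# Sketch of an architecture fitness function: fail the build if any module in
# the (hypothetical) "ui" package imports the "persistence" package directly.
import pathlib
import re

FORBIDDEN = re.compile(r"^\s*(from|import)\s+persistence\b", re.MULTILINE)


def test_ui_does_not_import_persistence():
    violations = [
        str(path)
        for path in pathlib.Path("ui").rglob("*.py")
        if FORBIDDEN.search(path.read_text(encoding="utf-8"))
    ]
    assert not violations, f"ui modules importing persistence: {violations}"
```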
Agile was widely covered at the conference. Many of the speakers concentrated on the negative sides of agile and on how it is implemented in different company setups. They focused on the Agile Manifesto and on how agile is nowadays perceived as Scrum or other processes. It is hard to use agile in a productive and cost-effective way: there is a sweet spot of conditions where agile works, and when those conditions are not met, agile is ineffective in terms of cost and time.
Refactoring and code analysis was another interesting part of the conference. We all refactor code, and most of us use static code analysis; however, we usually run code analysis only right before we make a commit or create a pull request. The interesting thing at the conference was the new tools for measuring the complexity of the code over time. In my opinion, using such tools to measure the cyclomatic complexity of the code and creating graph visualisations can give life-saving clues about the pain points of the code and help avoid catastrophe. Such techniques also help detect dead code in the system. If you are interested, search for CodeScene.
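The basic idea behind such tools can be approximated in a few lines: combine how often a file changes, taken from the version history, with a complexity measure to find hotspots. The sketch below uses plain git output and line count as a crude complexity proxy; real tools such as CodeScene use cyclomatic complexity and much richer history analysis.

```python
# Rough hotspot sketch: change frequency (from git history) multiplied by a
# crude complexity proxy (line count). Real tools use cyclomatic complexity
# and much richer history analysis.
import subprocess
from collections import Counter
from pathlib import Path

log = subprocess.run(
    ["git", "log", "--since=6 months ago", "--name-only", "--pretty=format:"],
    capture_output=True, text=True, check=True,
).stdout

# Count how many commits touched each Python file in the period.
changes = Counter(line for line in log.splitlines() if line.endswith(".py"))

# Score = change count * line count; sort the most suspicious files first.
hotspots = sorted(
    (
        (count * len(Path(name).read_text(encoding="utf-8").splitlines()), name)
        for name, count in changes.items()
        if Path(name).exists()
    ),
    reverse=True,
)

for score, name in hotspots[:10]:
    print(f"{score:8d}  {name}")
```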
The Software Design and Development conference was well worth the time and effort. The speakers were really good and conveyed their message well in almost every session. The conference was targeted at developers and architects who are open to change and want to improve towards today’s software design and development standards. It was also a pleasant surprise to realize that many things we already do in practice have also been developed by many of the speakers and adapted to actual work environments. Turks have a saying, “aklın yolu birdir”, which roughly means “correct minds think alike”.
Thank you for your time.
Book Advice
There were many book suggestions for the audience during the conference, but the two below are at the top of my list.