The customer is a leader in Fleet Management Solutions, which fleet operators use to track, monitor and optimize their vehicles, comply with regulations, and ensure fleet safety. The existing platform was built on a monolithic architecture, which made it impossible to distribute components across public cloud locations or to scale all components horizontally. Phi 21 was engaged to define the high-level solution architecture for the next-generation platform as a set of loosely coupled services that can be deployed and scaled independently and can handle exponentially higher data volumes.
It was decided to use the strategic modeling tools provided by Domain-Driven Design (DDD) to decompose the monolith into a set of subdomains, each implemented within its own bounded context. A bounded context provides the logical boundary within which the business concepts of a particular subdomain are modeled and implemented. The solution also addressed non-functional requirements such as security, API management, the data model and observability.
The system-level use cases were identified and used to derive the subdomains, which in turn were used to derive the bounded contexts within which the solution is implemented. A Context Map was used to depict the integration relationships between the different bounded contexts. Each bounded context was proposed to be implemented as a microservice application that can be developed and deployed independently, in isolation, using the appropriate technology stack.
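To illustrate why bounded contexts allow independent development, the sketch below shows a hypothetical decomposition in which the same real-world concept (a vehicle) is modeled differently inside two contexts. The context names, classes and fields are illustrative assumptions, not taken from the actual solution.

```python
from dataclasses import dataclass

# --- Tracking context: cares about position and movement ---
@dataclass
class TrackedVehicle:
    vehicle_id: str
    latitude: float
    longitude: float
    speed_kmph: float

# --- Compliance context: cares about regulatory status ---
@dataclass
class RegulatedVehicle:
    vehicle_id: str
    registration_number: str
    inspection_due: str  # ISO date of the next mandated inspection

# The shared identifier is the only overlap; each context evolves its
# model independently, which is what lets the corresponding
# microservices be developed and deployed in isolation.
tracked = TrackedVehicle("V-1", 12.97, 77.59, 42.0)
regulated = RegulatedVehicle("V-1", "KA-01-AB-1234", "2025-01-31")
```

Because neither model imports the other, a schema change in one context never forces a coordinated release of the other service.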
Microservices deployed in a public cloud environment introduce new security challenges because the multiple entry points expand the attack surface. It was recommended that OpenID Connect be used to authenticate users, and that authentication between microservices be implemented using certificates.
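Certificate-based service-to-service authentication is commonly realized as mutual TLS; assuming that interpretation, the sketch below shows how a receiving service could configure a TLS context that rejects any caller without a valid certificate. The function name and parameters are hypothetical, and the file paths would point at an internal CA bundle and the service's own key pair.

```python
import ssl
from typing import Optional

def make_mtls_server_context(ca_file: Optional[str] = None,
                             cert_file: Optional[str] = None,
                             key_file: Optional[str] = None) -> ssl.SSLContext:
    """Server side of service-to-service mutual TLS: require and verify
    the calling service's certificate before accepting the connection."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.verify_mode = ssl.CERT_REQUIRED  # reject callers without a valid cert
    if ca_file:
        # Trust only certificates issued by the internal CA.
        ctx.load_verify_locations(cafile=ca_file)
    if cert_file and key_file:
        # Present this service's own certificate to callers.
        ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return ctx
```

In practice a service mesh or API gateway often manages these certificates, but the verification requirement is the same either way.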
The next-generation platform will expose APIs that can be used by end customers. Although this capability exists in the current platform, it struggled under load and under the ways customers were using the system. After a detailed analysis of the existing solution, it was recommended to adopt an API management platform providing capabilities such as quota enforcement, traffic management, security, and collection of metrics for analysis.
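Quota enforcement of the kind an API management platform provides is typically built on a token-bucket algorithm; the minimal sketch below demonstrates the idea with a hard per-customer quota. The class and parameters are illustrative, not a real gateway API.

```python
import time

class TokenBucket:
    """Minimal quota sketch: each API key gets `capacity` tokens that
    refill at `rate` tokens per second; each request consumes one."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=3, rate=0.0)  # no refill: a hard quota of 3
results = [bucket.allow() for _ in range(5)]  # first 3 pass, rest rejected
```

A production gateway keeps one bucket per API key and a nonzero refill rate, turning the same mechanism into sustained-rate traffic shaping.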
To ensure that the application performs in production as per the SLAs and customer expectations, it is important that metrics are collected and analysed. It was recommended that an Application Performance Monitoring (APM) tool be used, along with a defined set of metrics to be collected across infrastructure, application, database, messaging and code quality.
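At the application level, the kind of instrumentation an APM agent injects can be approximated with a timing decorator; the sketch below records call counts and cumulative latency per operation. The metric names and the instrumented function are hypothetical stand-ins.

```python
import time
from collections import defaultdict
from functools import wraps

# In-process metrics store; a real APM agent would export these.
metrics = defaultdict(lambda: {"calls": 0, "total_ms": 0.0})

def timed(name):
    """Decorator that records call count and latency for an operation."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed = (time.perf_counter() - start) * 1000
                metrics[name]["calls"] += 1
                metrics[name]["total_ms"] += elapsed
        return wrapper
    return decorator

@timed("lookup_vehicle")
def lookup_vehicle(vehicle_id):
    return {"id": vehicle_id}  # stand-in for a real database call

lookup_vehicle("V-1")
lookup_vehicle("V-2")
```

Aggregating such per-operation latencies is what lets an APM tool flag SLA violations before customers report them.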
Based on the type of data that each microservice handles, it was decided that the following types of data models should be supported.
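Whatever mix of data models each service adopts (the specific list is not reproduced here), a common pattern is to isolate that choice behind a repository interface so the rest of the service is unaffected by it. The sketch below is illustrative, with an in-memory implementation standing in for whichever store a given bounded context selects.

```python
from abc import ABC, abstractmethod
from typing import Optional

class VehicleRepository(ABC):
    """Port through which a service reads and writes vehicle records."""

    @abstractmethod
    def save(self, vehicle: dict) -> None: ...

    @abstractmethod
    def find(self, vehicle_id: str) -> Optional[dict]: ...

class InMemoryVehicleRepository(VehicleRepository):
    """Stand-in for a relational, document, or time-series backend."""

    def __init__(self):
        self._rows = {}

    def save(self, vehicle):
        self._rows[vehicle["id"]] = vehicle

    def find(self, vehicle_id):
        return self._rows.get(vehicle_id)

repo = InMemoryVehicleRepository()
repo.save({"id": "V-1", "status": "active"})
```

Swapping the backing data model then means swapping the repository implementation, not rewriting the service.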
Note: Continuous Integration (CI) and Continuous Delivery (CD) were not within scope.
A Performance, Scalability & Reliability (PSR) test strategy was defined to ensure that the platform meets the speed, responsiveness, throughput, scalability and stability requirements under the workloads expected in production. This is important to ensure that the platform does not suffer from issues such as slowness, unresponsiveness, or non-availability due to system crashes. The recommendation included the steps, and the activities within each step, that need to be performed.
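The core measurement in a PSR run is recording per-request latencies and checking them against targets such as a mean and a 95th percentile. The sketch below uses simulated latencies and placeholder thresholds; real runs would feed in measurements from a load-generation tool.

```python
import math
import statistics

# Simulated per-request latencies from one workload run (milliseconds).
latencies_ms = [12, 15, 11, 40, 13, 14, 95, 12, 16, 13]

def p95(samples):
    """Nearest-rank 95th percentile of a list of samples."""
    ordered = sorted(samples)
    index = max(0, math.ceil(0.95 * len(ordered)) - 1)
    return ordered[index]

mean_ms = statistics.mean(latencies_ms)
p95_ms = p95(latencies_ms)

# Placeholder SLA targets for the check.
meets_sla = mean_ms <= 50 and p95_ms <= 100
```

Tracking tail percentiles alongside the mean is what exposes the occasional slow request that averages alone would hide.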
A detailed data migration strategy covering the planning, analysis, design, implementation and validation phases was recommended, along with the set of activities to be performed in each phase.
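The validation phase of such a migration typically compares the source and target datasets after transformation, checking record counts and key fields. The sketch below is illustrative; the datasets, column names, and transformation rule are placeholders.

```python
# Legacy rows as they might come out of the monolith's store.
legacy = [{"id": 1, "plate": "KA-01"}, {"id": 2, "plate": "KA-02"}]

def transform(row):
    # Example transformation: rename legacy columns for the new schema.
    return {"vehicle_id": row["id"], "registration": row["plate"]}

migrated = [transform(r) for r in legacy]

def validate(source, target):
    """Check record counts match and identifiers line up row by row."""
    if len(source) != len(target):
        return False
    return all(s["id"] == t["vehicle_id"] for s, t in zip(source, target))

ok = validate(legacy, migrated)
```

Real validation also spot-checks value distributions and referential integrity, but count-and-key reconciliation is the usual first gate.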
A Data Lifecycle Management (DLM) strategy was recommended to manage the flow of data within the platform from its creation until it becomes obsolete and is destroyed. As data becomes ever more important in decision making, it is essential to know what data exists, where it is located, and how accurate it is. It is also important to ensure that the storage and use of data comply with applicable laws and regulations. A well-defined and properly implemented DLM strategy helps achieve all of this while making data management more efficient and agile.
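A DLM policy can be thought of as a small state machine over the life of a record; the sketch below models an assumed created → active → archived → destroyed progression and rejects any other transition. The states and rules are illustrative, not taken from the actual strategy.

```python
# Legal lifecycle transitions; anything not listed is forbidden,
# which is how retention and deletion rules get enforced in code.
TRANSITIONS = {
    "created": {"active"},
    "active": {"archived"},
    "archived": {"destroyed"},
    "destroyed": set(),
}

def advance(state: str, target: str) -> str:
    """Move a record to the next lifecycle state, or raise if illegal."""
    if target not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {target}")
    return target

state = advance("created", "active")
state = advance(state, "archived")
```

Encoding the policy this way makes "can this data still be read, and when must it be destroyed" an auditable property of the system rather than tribal knowledge.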