
Empowering Healthcare Insights with a Data Lake: Streamlining Healthcare Data with S3 Data Lake & QuickSight


Project Overview

Executive Summary

KareXpert is a digital healthcare SaaS-based platform that serves more than 200 hospitals across 3 countries. KareXpert’s healthcare portfolio consists of a Hospital Information Management System (HIMS), Electronic Medical Records (EMR), a Laboratory Information Management Solution (LIMS), virtual care, and more.

The HIMS platform enables hospitals to manage information and collect data on all aspects of healthcare, such as patient records, inventory, billing, and pharmacy, which in turn ensures that processes are completed swiftly and effectively. The major role of HIMS is to record information on health events and monitor the quality of services at different levels of care. The collected data is stored in MongoDB on the backend as nested JSON documents.

Business Requirement

KareXpert is looking to provide their customers with a visualization dashboard on QuickSight to present insights and analysis of the data collected. They previously used Tableau for visualizations but now want to transition to QuickSight. This transition aligns with their strategy of leveraging AWS services for better integration and scalability, especially since the required data will reside in S3 for the long term. Storing data in S3 allows for seamless integration with QuickSight and other AWS analytics services, facilitating efficient data processing and analysis.

KareXpert aims to enhance customer experience by offering visualization dashboards through QuickSight. These dashboards will provide insights and analyses of data collected from MongoDB, facilitated by an ETL pipeline using AWS Glue for a more efficient and cost-effective process, benefiting customers in corporate and private hospital chains, clinics, and diagnostic labs.

Scope

The scope is to establish an S3 data lake of 102 MongoDB collections and create a Glue ETL pipeline to extract data from KareXpert’s MongoDB database for their two production environments and visualize it in QuickSight.

24 dashboards for each production environment cover the following high-level subject areas:
• Financial analysis
• Operational analysis
• Clinical analysis
• Administrative analysis
• Inventory analysis


Challenges

Connecting with MongoDB using Glue Connector:

Establishing a connection between AWS Glue and MongoDB using the Glue connector proved difficult due to configuration complexities and compatibility issues. The AWS Glue connector was ineffective because of issues with the Java truststore and the SSL MongoDB connection.

To address this challenge, we leveraged AWS Glue for Spark to establish connectivity between AWS Glue and MongoDB. This allowed us to bypass the complexities and compatibility issues associated with the Glue connector. Instead, we utilized the connection script provided in the AWS documentation within our ETL process.

This script configures the connection by specifying parameters such as the connection name, database, collection, partitioner, partition size, partition key, and connection URI. It enables Glue to interact seamlessly with MongoDB, overcoming the connection challenges effectively.
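
For illustration, a minimal sketch of such a read in a Glue for Spark job is shown below; the URI, names, credentials, and partitioner settings are placeholders rather than KareXpert's actual configuration:

```python
# A minimal sketch, not KareXpert's actual configuration: URI, names,
# credentials, and partitioner settings below are placeholders.
from awsglue.context import GlueContext
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

read_mongo_options = {
    "connection.uri": "mongodb://host:27017",  # hypothetical URI
    "database": "hims_db",                     # hypothetical database
    "collection": "patient_records",           # hypothetical collection
    "username": "glue_reader",                 # pass secrets securely in practice
    "password": "***",
    "partitioner": "ShardedPartitioner",       # partitioner choice assumed
    "partitionerOptions.partitionSizeMB": "10",
    "partitionerOptions.partitionKey": "_id",
}

# Read the collection as a DynamicFrame, then convert to a Spark DataFrame
# for the downstream transformations described later in this case study.
dynamic_frame = glue_context.create_dynamic_frame.from_options(
    connection_type="mongodb", connection_options=read_mongo_options
)
df = dynamic_frame.toDF()
```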


Configuring ETL Script with Manual Timestamp for Incremental Data Sync:

In the AWS Glue ETL script configuration, the Bookmark feature did not cater to our specific use case, causing limitations in achieving incremental data synchronization. Enabling the Bookmark feature resulted in pulling full data instead of solely incremental data. Furthermore, it was noted that Bookmarks are supported only in S3 and JDBC data sources, further restricting our options for incremental data synchronization.

To address this challenge, we opted to manually manage the timestamp for incremental data synchronization within the ETL script. By incorporating custom logic to track and update timestamps, we effectively ensured the synchronization of only the new or modified data since the last synchronization. This approach circumvented the limitations posed by the default Bookmark feature, enabling us to achieve precise and efficient incremental data synchronization as per our use case requirements.
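
A hedged sketch of this pattern, assuming the high-water mark lives in a small CSV on S3 (bucket, key, and field names are illustrative):

```python
# Hedged sketch: the high-water mark is assumed to live in a CSV on S3;
# bucket, key, and the MongoDB field name are illustrative.
import csv
import io
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "karexpert-etl-state"            # hypothetical bucket
KEY = "timestamps/patient_records.csv"    # hypothetical key

def get_last_processed_timestamp():
    """Return the last recorded timestamp, or None on the first run."""
    try:
        body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read().decode()
        rows = list(csv.reader(io.StringIO(body)))
        return rows[-1][0] if rows else None
    except s3.exceptions.NoSuchKey:
        return None  # no state yet: fall back to a full extract

mongo_options = {"database": "hims_db", "collection": "patient_records"}
last_ts = get_last_processed_timestamp()
if last_ts:
    # Pull only documents modified after the last run; the field name and
    # the pipeline option key depend on the connector version in use.
    mongo_options["aggregation.pipeline"] = (
        '[{"$match": {"modifiedTime": {"$gt": "%s"}}}]' % last_ts
    )

# ... run the extract with mongo_options, then record the new high-water mark:
new_ts = datetime.now(timezone.utc).isoformat()
s3.put_object(Bucket=BUCKET, Key=KEY, Body=f"{new_ts}\n".encode())
```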

 

Optimizing Partitioning Structure to Reduce File Overhead:

When utilizing the Parquet file format and partitioning data in AWS Glue with Spark, the default behavior results in the creation of multiple files per partition. Each unique combination of partition column values generates a separate directory (partition) in the storage system, and within these partition directories, numerous small files are created to store the corresponding data. However, to align with the customer's preference to minimize storage overhead and manage the high volume of data efficiently, we opted to store the data in a single file per partition.

To mitigate this challenge, we implemented the concept of coalescing within the Glue ETL script. By coalescing the DataFrame output before writing to the S3 destination, we ensured that data within each partition was consolidated into a single file. This optimization effectively reduced storage overhead and enhanced the efficiency of data retrieval and management. Consequently, storing a single file in unique partition paths streamlined the data storage process, improving resource utilization and facilitating smoother data operations within the AWS Glue environment.
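
A minimal sketch of this step, assuming a Spark DataFrame `df` and a placeholder S3 path and partition column:

```python
# Minimal sketch: consolidate each partition path into a single Parquet file.
def write_single_file(df, path):
    (
        df.coalesce(1)            # one output file per partition path
          .write.mode("append")
          .partitionBy("date")    # partition column assumed
          .parquet(path)
    )

# e.g. write_single_file(df, "s3://karexpert-datalake/patient_records/")
```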


Schema Harmonization for Mismatched Data Structures:

• Integrating incremental data, comprising both new and updated records, with existing datasets posed a significant challenge due to schema mismatches. These discrepancies primarily stemmed from variations in data structures across different files, such as differing array types (e.g., array of strings vs. array of structs) or discrepancies in basic data types (e.g., StringType vs. LongType).

• To address this challenge, a custom data processing function was developed to harmonize schemas and facilitate seamless data integration. The function analyzed the schema of incoming dataframes, identifying columns with mismatched array structures. Subsequently, it transformed the array columns to conform to a unified schema, ensuring consistency across datasets.

• The `find_mismatch_array` function identified columns with mismatched array structures, enabling targeted schema transformation.

• The `transform_array_columns` function harmonized array columns by converting data to match the desired schema, handling both structural discrepancies and basic data type mismatches.

• The `process_data` function leveraged these schema harmonization techniques to merge incremental data with existing datasets effectively. By dynamically adjusting array structures and data types, it ensured compatibility and consistency across datasets, facilitating smooth data integration processes.
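
Since the case study does not include the actual code, the following is an illustrative PySpark sketch of what these helpers might look like; the names follow the functions described above, but the logic is an assumption:

```python
# Illustrative sketch only: the real helpers are not shown in the case
# study, so the logic here is an assumption based on the description.
from pyspark.sql import functions as F

def find_mismatch_array(incoming_df, existing_df):
    """Return columns whose (array) types differ between the two frames."""
    existing_types = dict(existing_df.dtypes)
    return [
        name
        for name, dtype in incoming_df.dtypes
        if name in existing_types
        and existing_types[name] != dtype
        and dtype.startswith("array")
    ]

def transform_array_columns(df, columns, target_fields):
    """Cast mismatched array columns to the existing dataset's schema."""
    for name in columns:
        df = df.withColumn(name, F.col(name).cast(target_fields[name].dataType))
    return df

def process_data(incoming_df, existing_df):
    """Harmonize the incoming frame's schema, then merge the two frames."""
    target_fields = {f.name: f for f in existing_df.schema.fields}
    mismatched = find_mismatch_array(incoming_df, existing_df)
    incoming_df = transform_array_columns(incoming_df, mismatched, target_fields)
    return existing_df.unionByName(incoming_df, allowMissingColumns=True)
```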

 

Handling Duplicate Entries During Incremental Data Merge:

During the merging of incremental data with existing datasets, the possibility of duplicate entries arose, particularly when updated rows were present in the incremental data. This duplication occurred post-merge, leading to redundancy within the merged dataset.

To address this challenge and maintain the integrity of the merged dataset, a solution was devised to identify and remove duplicate entries based on the latest modified timestamp. By querying the dataset for the latest modified timestamp entry post-merge, redundant rows were pinpointed and eliminated, ensuring data consistency and eliminating duplication.

After merging the incremental data with the existing dataset, a query was executed to retrieve the entry with the latest modified timestamp. This was accomplished using the following SQL query:
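
A representative form of the query (the table and modified-time column names are assumed):

```sql
SELECT *
FROM (
    SELECT m.*,
           ROW_NUMBER() OVER (
               PARTITION BY _id
               ORDER BY modified_time DESC   -- column name assumed
           ) AS row_num
    FROM merged_dataset m                    -- table name assumed
) ranked
WHERE row_num = 1;
```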

In this query, the `ROW_NUMBER()` window function assigns a sequential row number to each record within a partition determined by the `_id` field. Records are ordered by the modified-time column in descending order, ensuring that the row with the latest modification timestamp receives a row number of 1.

By filtering the results to include only rows where `row_num` equals 1, the query effectively selects the most recent version of each record, thereby eliminating duplicates and maintaining the integrity of the merged dataset.

Note: This query is executed to create a dataset in Amazon QuickSight, not to modify data stored in Amazon S3. The purpose is to ensure that only the latest version of each record is included in the dataset used for visualization, thereby removing duplicates and enhancing data accuracy in the reporting process.

 

Resolving Data Ambiguity: Overcoming Inconsistencies in QuickSight Dashboard Filters:

In our QuickSight dashboard, inconsistency in data regarding the relationship between facility_id and facility_name resulted in ambiguity and incorrect data representation, especially when using facility_name as a filter. Despite the expectation of a unique facility_name for each facility_id, discrepancies arose where a single facility_id was associated with two slightly different facility_names.

To address this issue, we implemented a solution within QuickSight by creating a calculated field. This calculated field explicitly specified the facility_name for each unique facility_id, ensuring that multiple facility_names could not be associated with a single facility_id. By adopting this approach, we resolved the ambiguity and ensured accurate data visualization and analysis in our QuickSight dashboard, particularly when using facility_name as a filter.
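
As an illustration, a QuickSight calculated field along these lines pins a single canonical name to each ID (the IDs and names below are hypothetical):

```
ifelse(
    {facility_id} = 'FAC001', 'City Hospital',
    {facility_id} = 'FAC002', 'Green Valley Clinic',
    {facility_name}
)
```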

Solution

Solution Architecture

The solution architecture details the orchestration of AWS services that collectively manage the data workflow. The structure ensures efficient data processing from the initial extraction in MongoDB to the final visualization in Amazon QuickSight.

Description

1. Extract data from MongoDB for visualization and analysis:

A secure connection between MongoDB and AWS is facilitated through AWS Glue for Spark. In AWS Glue version 4.0 and later, you can utilize AWS Glue to read from and write to tables in MongoDB and MongoDB Atlas.

2. Transform the data for compatibility with visualization tools:

AWS Glue jobs are used to perform Extract, Transform, Load (ETL) processes on the extracted data, ensuring compatibility with visualization tools.
Data undergoes transformation to prepare it for analysis, enabling seamless integration with visualization tools such as Amazon QuickSight.

3. Establish a data lake using S3:

Post-transformation, the data is organized within an S3 bucket to create a data lake, structured to facilitate analytical needs, and enable in-depth data exploration.
The data lake architecture ensures scalability and flexibility, allowing for the storage of large volumes of data in a cost-effective manner.

4. Visualize data in QuickSight:

Amazon QuickSight utilizes AWS Glue Data Catalog tables as a metadata repository and leverages Amazon Athena for querying data stored in the S3 data lake. QuickSight connects to Athena as a data source, allowing users to create interactive dashboards and reports for business intelligence and data-driven decision-making.
QuickSight's intuitive interface enables users to explore and analyze data effortlessly, empowering stakeholders to derive actionable insights from the data.
The solution architecture orchestrates AWS services effectively to manage the data workflow, ensuring efficient data processing from MongoDB extraction to visualization in Amazon QuickSight.

Furthermore, the solution description provides detailed insights into the implementation of various functions within the ETL script:

The ETL script performs a range of critical functions: extracting collection and database information from a CSV file stored in an S3 bucket, configuring environment-based settings dynamically from the environment name passed as a job parameter, and handling incremental data extraction gracefully, even in scenarios involving errors such as schema mismatches. This comprehensive approach ensures efficient and reliable data processing across different environments.
It establishes connections to MongoDB and orchestrates data extraction, leveraging AWS Glue to convert the extracted data into DataFrame format for further processing. We opted not to use DynamicFrame for this purpose because some of the built-in functions available for DataFrames are not offered by DynamicFrame.
Timestamp formatting and partitioning functionalities adjust timestamps to reflect the UTC+5:30 timezone and optimize data storage in S3 using the Parquet format.
Additional functions, such as sending emails, updating Glue crawler targets, and managing timestamp CSV files on S3, contribute to the robustness and efficiency of the data processing pipeline.

In summary, the comprehensive solution architecture and ETL job showcase the seamless integration of AWS services and custom functionalities to address specific project requirements, enabling organizations to derive valuable insights from their data and drive informed decision-making processes.

ETL Job

1. Collection and Database Information:
Extracts the collection name, MongoDB database name, and timestamp column from the provided row.

2. Environment-Based Configuration:
Checks if the specified environment is one of the supported environments ('env1', 'env2', 'env3'). Exits the function if it's unsupported.

3. Environment-Specific Connection Settings:
Sets the connection parameters (connection string, username, password, S3 base path) based on the selected environment.

4. Checking for Incremental Extraction:
Checks if the timestamp file exists in S3 for the specified collection, determining whether to perform incremental or full data extraction.

5. Environment Validation:
Validates if the specified environment is part of the paths to crawl. Exits the function if not.

6. Incremental Extraction Logic:
Branches into either incremental or full data extraction based on the previous check.

7. MongoDB Connection Options:
Defines the MongoDB connection options, including connection URI, database, collection, username, password, and any specific aggregation pipelines for incremental extraction.

8. Connection to MongoDB and Data Extraction:
Establishes a connection to MongoDB using AWS Glue, applying the specified connection options, and converts the dynamic frame to a DataFrame.

9. Timestamp Formatting and Partitioning:
Adjusts timestamps to reflect the UTC+5:30 timezone before partitioning and storing data.
• Utilizes the `date_format` and `from_unixtime` functions to format timestamps into a human-readable format, considering the UTC offset.
• Defines the partition structure with potential columns such as 'date', 'id1', and 'id2'.
• Raises a ValueError if no recognizable partition columns are available in the DataFrame.
• Converts the DataFrame back to a DynamicFrame to maintain compatibility with AWS Glue.
• Optimizes performance by coalescing the DataFrame into a single partition before writing it to storage.
• Writes the transformed data to an S3 bucket in Parquet format, ensuring the specified partition keys are applied to the storage structure.
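
A hedged sketch of the timestamp handling in this step, assuming an epoch-milliseconds column named `modifiedTime` on the extracted DataFrame:

```python
# Hedged sketch: assumes an epoch-milliseconds column named `modifiedTime`.
from pyspark.sql import functions as F

IST_OFFSET_SECONDS = 19800  # UTC+5:30

df = df.withColumn(
    "date",
    F.date_format(
        F.from_unixtime(F.col("modifiedTime") / 1000 + IST_OFFSET_SECONDS),
        "yyyy-MM-dd",
    ),
)
```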

10. send_email Function:
This function is designed to send an email using Amazon Simple Email Service (SES).
It takes a subject and body as input and uses the `ses_client` to send an email to the specified recipient.
It prints the message ID if the email is sent successfully and handles any errors that occur during the sending process.
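
A minimal sketch of such a helper using the boto3 SES client; the region and addresses are placeholders:

```python
# Minimal sketch of the helper; region and addresses are placeholders.
import boto3
from botocore.exceptions import ClientError

ses_client = boto3.client("ses", region_name="ap-south-1")  # region assumed

def send_email(subject, body):
    try:
        response = ses_client.send_email(
            Source="etl-alerts@example.com",                 # hypothetical sender
            Destination={"ToAddresses": ["ops@example.com"]},
            Message={
                "Subject": {"Data": subject},
                "Body": {"Text": {"Data": body}},
            },
        )
        print("Message ID:", response["MessageId"])
    except ClientError as err:
        print("Failed to send email:", err)
```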

11. update_crawler_targets Function:
This function manages AWS Glue Crawlers, which discover and catalog data for loading into data stores.
It updates the targets of an existing Glue Crawler with new S3 paths. It expects the crawler's name, list of paths, and the database name.
The function converts the paths into the format expected by the Glue API and then calls `update_crawler` to apply these changes to the specified crawler.
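
A hedged sketch of this function using the boto3 Glue API (names are placeholders):

```python
# Hedged sketch using the boto3 Glue API; names are placeholders.
import boto3

glue_client = boto3.client("glue")

def update_crawler_targets(crawler_name, s3_paths, database_name):
    # Convert plain S3 paths into the target format the Glue API expects.
    targets = {"S3Targets": [{"Path": path} for path in s3_paths]}
    glue_client.update_crawler(
        Name=crawler_name,
        DatabaseName=database_name,
        Targets=targets,
    )
```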

12. get_last_processed_timestamp Function:
This function retrieves the last processed timestamp for a given database and collection.
It constructs the S3 file path based on the database and collection names and attempts to read a CSV file from S3.
If the file exists, it reads the CSV and returns the last timestamp from the DataFrame. If the file is not found, it returns `None`.

13. update_timestamp_csv Function:
This function is responsible for updating a timestamp CSV file stored on Amazon S3. It loads the existing CSV content (if any) from S3, appends a new timestamp to the DataFrame, and then saves the updated DataFrame back to S3.

14. extract_database_name_from_path Function:
This function extracts the database name from a given S3 path. It splits the path on the '/' character and retrieves the database name from a fixed position in the resulting list; the database name identifies which crawler needs to be updated.
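
Illustratively (the exact index depends on the actual path layout, which is not shown in the case study):

```python
def extract_database_name_from_path(s3_path):
    # e.g. "s3://bucket/<environment>/<database>/<collection>/..." (layout assumed)
    return s3_path.split("/")[4]
```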

15. Glue Crawler Update and Start:
This section of code updates and starts AWS Glue Crawlers based on the paths obtained from the `paths_to_crawl` dictionary. It filters out paths corresponding to failed collections and then groups the paths by database name.
For each database, it calls `update_crawler_targets` to update the crawler with the new paths and then starts the Glue Crawler using `glue_wrapper.start_crawler`.

16. Glue Data Catalog Creation:
AWS Glue is utilized to automatically discover and catalog datasets stored in the S3 data lake.
The Glue data catalog provides a unified metadata repository, facilitating easy data exploration and analysis.

17. Data Analysis with Athena:
SQL-based analysis is performed using Amazon Athena to derive insights from the healthcare datasets stored in the S3 data lake.
SQL queries are executed to generate datasets relevant to financial analysis, operational efficiency, clinical outcomes, administrative tasks, and inventory management.

18. QuickSight Visualizations:
After the data is stored in Amazon S3 and tables are created in the AWS Glue Data Catalog, the next stage involves leveraging these tables to perform analysis and visualization.
Using the tables registered in the Glue Data Catalog, AWS Athena is employed as a query service to interactively analyze the data. Athena enables querying data directly from S3 using standard SQL syntax, without the need for complex ETL processes.
SQL-generated datasets are seamlessly integrated into Amazon QuickSight for visualization and dashboard creation.
24 interactive dashboards are developed for each production environment, covering subject areas such as financial analysis, operational efficiency, clinical outcomes, administrative tasks, appointment analysis, laboratory, bed occupancy, and inventory management.

Outcome

The implemented solution enables KareXpert to leverage their healthcare data effectively, leading to better outcomes such as improved patient care, operational efficiency, strategic planning, and compliance adherence. By harnessing the power of AWS services, KareXpert can stay ahead in the rapidly evolving healthcare landscape and continue to provide high-quality care to their patients.

Efficient Data Management:

By automating the extraction, transformation, and loading of data from MongoDB collections to an S3 data lake, customers can efficiently manage vast amounts of healthcare data. This streamlines the data management process, reducing manual effort and ensuring data accuracy and completeness.

Data-Driven Decision Making:

With the ability to perform SQL-based analysis using Amazon Athena, KareXpert can now provide its customers valuable insights into various aspects of their healthcare operations. This enables data-driven decision-making across departments, leading to informed strategies and actions.

Improved Operational Efficiency:

Through the analysis of operational metrics such as bed occupancy, appointment analysis, and inventory management, customers can identify areas for improvement and optimize resource allocation. This leads to increased operational efficiency and cost savings.

Strategic Planning:

By understanding their healthcare operations at a granular level through data analysis, customers can develop long-term strategic plans aimed at achieving organizational goals. This might involve expansion strategies, service enhancements, or targeted interventions to address specific challenges.

Leveraging AWS services like Glue, S3, Athena, and QuickSight, KareXpert can gain valuable insights into their healthcare operations, enhancing patient care, optimizing resource allocation, and improving overall efficiency across their network of hospitals.

 

Benefits

Previously, the data extraction process involved manual efforts, with individuals manually pulling data from each collection in MongoDB. Subsequently, they manually uploaded the extracted data into S3, where further transformation for analytics took place.

This manual process was time-consuming and relied heavily on human intervention. The current workflow, however, has evolved into a more efficient and automated system. The entire process, from extracting data from around 150 collections in MongoDB to loading it into the S3 data lake and transforming it for analytics in QuickSight, now takes less than 4 hours.

Currently, 30 hospitals actively utilize QuickSight dashboards for data-driven decision-making.


Transforming Parcel Delivery: POS Malaysia's Path to AWS  


About the company

POS Malaysia, the national courier service provider, embarked on a journey to modernize its operations and enhance customer experiences. Partnering with Minfy, an AWS Premier Consulting Partner, POS Malaysia sought to leverage AWS services to provide accurate Estimated Arrival Time (ETA) predictions for parcel deliveries. This case study explores how the strategic use of Amazon DynamoDB, Amazon Redshift, and AWS Glue helped POS Malaysia achieve its goals. 

Customer Challenges

POS Malaysia faced several challenges, including the need to: 

1. Predict ETA for parcel deliveries accurately.
2. Process and analyze large volumes of parcel data for predictive analytics.
3. Modernize its data infrastructure for scalability and efficiency. 

AWS Service Selection Rationale

Minfy recommended a tailored AWS solution to address POS Malaysia's challenges:

1. Amazon DynamoDB: Chosen for its scalability and low-latency performance, DynamoDB became the primary database for storing parcel data.
2. Amazon Redshift: Utilized as the data warehouse to store and analyze vast volumes of structured data, facilitating predictive analytics.
3. AWS Glue: Employed for data integration, transformation, and preparation, enabling seamless analytics workflows. 

Solution Implementation

Minfy executed a comprehensive solution: 

1. Data Ingestion: Parcel data was ingested into Amazon DynamoDB from various sources, ensuring real-time availability.
2. Data Warehousing: Amazon Redshift stored and organized parcel data, making it readily accessible for predictive analytics.
3. Data Integration: AWS Glue was used for data integration, ensuring seamless workflows for analytics.
4. Predictive Analytics: Machine learning models were developed to predict ETA for parcel deliveries, leveraging the data stored in DynamoDB and Redshift. 

Results & Benefits

POS Malaysia's partnership with Minfy and the strategic implementation of AWS services yielded significant benefits: 

1. Accurate ETA Predictions: The predictive analytics solution, powered by AWS services, provides precise ETA predictions for parcel deliveries, enhancing customer satisfaction.
2. Scalability: The AWS infrastructure ensures that POS Malaysia can scale its operations to handle spikes in parcel data and delivery requests.
3. Efficiency: Streamlined data integration and analytics workflows minimize operational overhead and enhance efficiency. 

About Minfy

Minfy specializes in multi-cloud support, advanced AWS infrastructure monitoring, database (DB) support, and application modernization. Recognized for industry excellence and innovation, Minfy tailors comprehensive solutions for its customers, showcasing expertise in multi-cloud environments and AWS services. Minfy actively participates in cost optimization exercises, demonstrating a commitment to efficiency and fiscal responsibility. We conduct APN programs, such as WAR (Well-Architected Review) and various competencies (DevOps, Migration, Data Lake, and more), as part of Minfy's engagement as an AWS Services Partner.


Efficiency Unleashed: How Minfy's Automated Strategy Transformed UGRO's Cloud Resource Management


About the company

U GRO Capital Limited, a prominent Data Tech lending platform in India, specializes in addressing the small business credit gap. The company collaborates with major OEMs, offering versatile term loan programs tailored for MSMEs. UGRO's innovative supply chain program delivers holistic solutions, positioning them as a key player in fostering financial growth for businesses in the Indian market.

Customer Challenges

U GRO faced a multitude of technical challenges, encountering delays in applying critical updates due to irregular security measures. This exposed the system to vulnerabilities, necessitating meticulous patch management. A low AWS Security Hub score highlighted the need for a detailed security enhancement strategy, requiring in-depth analysis. In addition, the absence of a real-time monitoring dashboard complicated management, and cost oversight became a major challenge. Moreover, organizational inefficiencies, including a lack of account stability, made cost reductions and financial optimizations complex. A strategic and technical approach was therefore imperative to fortify security, streamline costs, and establish an organized operational framework. U GRO reached out to Minfy to enlist us as their MSP to solve these issues.

Partner Solution

When Minfy engaged in routine support operations with U GRO, a comprehensive analysis revealed an abundance of underutilized resources, including dormant volumes, idle IP addresses, and lingering floating snapshots. In response, a proactive strategy was initiated, incorporating automated scripting and alert mechanisms for routine sanity checks. This systematic process served the dual purpose of ongoing vigilance over idle resources and concurrent endeavors to optimize costs for U GRO. The automation of these checks ensured a preemptive approach, systematically identifying and addressing unused resources, fortifying the system's efficiency, and mitigating unnecessary costs.

It was also noted that the S3 storage size grew progressively, resulting in runaway costs. To address this, Minfy recommended a lifecycle policy to manage costs and restrict storage growth, moving data from standard to archival storage classes. On the data transfer front, September 2023 recorded a total outbound data volume of 63 TB; a review of log store data and lifecycle policies in CloudWatch was advised for optimal system management.

To address the lack of real-time monitoring, Minfy deployed Swayam™ (Minfy's in-house tool). It automated reports and played a pivotal role in enhancing resource management and cost optimization. By leveraging automation, integrity in the infrastructure was continually ensured, contributing to U GRO's fiscal prudence by identifying avenues for resource optimization and cost reduction proactively and systematically through the CXO dashboard and regular cost optimization.

Results

The implemented solution for U GRO yielded transformative outcomes, enhancing patch management to fortify system security and significantly improving the low Security Hub score. Proactively identifying and optimizing underutilized resources streamlined financial processes, while the introduction of real-time monitoring tools like Swayam and the CXO dashboard bolstered cost oversight for fiscal prudence. Addressing organizational challenges improved operational efficiency and reduced risk, successfully fortifying system security, optimizing costs, and establishing a structured operational framework aligned with industry best practices.

Cost savings initiatives achieved significant benefits: USD 8,904 in savings attributed to RDS and an additional USD 134 related to EC2, a cumulative monthly saving of USD 9,038 through successful expense optimization. Following AWS best practices in the U GRO environment also improved the Security Hub score from 42% to 70%, addressing the initial security gaps and ensuring a robust and secure environment.



Catalyzing Success: Cholamandalam's DevOps Transformation Journey


About the company

Cholamandalam Investment and Finance Company Limited (Chola), incorporated in 1978 as the financial services arm of the Murugappa Group, has emerged as a comprehensive financial services provider. Chola offers a range of services, including vehicle finance, home loans, SME loans, and more. With operations across 1,204 branches in India and assets under management exceeding INR 1,147.95 billion (approximately USD 15.5 billion), Chola is committed to enhancing the lives of its 2.74 million customers while upholding strong ethical values and a commitment to all stakeholders.

Customer Challenges

Chola was confronted with a critical need for seamless and controlled software deployments. Lacking an in-house development team, they heavily relied on external vendors, leading to a fragmented landscape with multiple vendors and disjointed deployment pipelines. This approach resulted in operational challenges, exacerbated by a continuous influx of change requests from end customers and insurance aggregators. These challenges led to significant downtimes, negatively affecting the end-user experience. To address this, Chola aimed to establish a centralized repository to streamline and govern the deployment process for improved efficiency and consistency.

Why Minfy

Chola selected Minfy as its DevOps partner due to their status as an AWS Premier Partner, extensive experience in complex DevOps projects, the required competencies to build the proposed solution, and their existing partnership with Chola as a managed services provider. This choice was underpinned by the confidence in Minfy's expertise and their ability to deliver a robust and tailored DevOps solution, aligning seamlessly with Chola's cloud-focused objectives.

Minfy’s Solution

Application Discovery:

Minfy initiated the solution by conducting a comprehensive application discovery process, which included identifying the number of applications in Chola's ecosystem, profiling the application stack, and identifying dependencies. This insight provided a clear understanding of Chola's application landscape and was pivotal for the subsequent DevOps activities.

DevOps Solution Setup Activities:

• AWS Cognito with SAML against ADFS + MFA: Minfy implemented AWS Cognito with SAML authentication against ADFS, enhancing security with Multi-Factor Authentication (MFA).
• AWS CodeCommit Repository Setup: Minfy established AWS CodeCommit repositories, offering a secure and efficient version control system for Chola's applications.
• Jenkins Master and Slave Setup: Minfy configured Jenkins master and slave (Windows) environments as per the proposed specifications. Additionally, they integrated AWS Cognito with OpenID Connect and set up an S3 bucket for enhanced functionality.
• SonarQube with AWS ADFS SAML Integration: Minfy integrated SonarQube with AWS ADFS for seamless Single Sign-On (SSO) access and efficient code quality analysis.
• CI/CD Pipeline Configuration: Minfy created CodeCommit repositories for the identified applications and established Continuous Integration and Continuous Deployment (CI/CD) pipeline scripts, ensuring streamlined code deployment.
• Source Code Commit: Chola's development team and vendors were given access to commit source code to AWS CodeCommit, facilitating collaborative development.
• Deployment, Validation, and UAT Deployment: The solution included deploying the code to the development environment, validating its accuracy and functionality with input from the development team, and then merging and deploying it to UAT servers for comprehensive testing and quality assurance.

 

Challenges Faced by Minfy During Implementation:

A notable challenge encountered by Minfy during the implementation was the integration of SAML with Jenkins while implementing Active Directory (AD) authentication. A hurdle in this process was the absence of native support for AD authentication in Jenkins. Chola's requirement to onboard multiple vendors onto the AD system further added complexity, especially with their strong preference for avoiding the creation of separate logins for Jenkins. The overarching goal was to ensure that vendors who were already authenticated by the AD could effortlessly access Jenkins without the need for additional login credentials.

To address this challenge, Minfy effectively resolved the issue by enabling AD authentication using AWS Cognito. This solution streamlined the login process for vendors authenticated by the AD, eliminating the need for separate Jenkins credentials. This strategic implementation not only tackled the SAML integration issue but also provided a seamless experience for Chola's users, simplifying access to Jenkins and improving overall efficiency and security within the DevOps ecosystem.

Business Outcomes

Minfy's comprehensive solution effectively addressed Chola's pressing challenges, resulting in transformative outcomes. By streamlining software deployments and fostering collaboration among multiple vendors, Minfy reduced time to deployment and market by 40%, facilitating faster and more agile development cycles. Additionally, the integration of Sonarqube significantly enhanced Chola's security posture, improving code quality and fortifying the codebase. This holistic approach not only tackled Chola's initial challenges related to fragmented deployments but also laid the foundation for heightened efficiency, security, and competitiveness within the evolving landscape of DevOps.

 

 

 


Minfy Unlocks New Possibilities for Ayur.AI with AI-Healthcare Solutions


Overview

In recent years, there has been a wave of technological transformation in businesses around the globe. We are in the midst of digital transformation, where every organisation is putting resources into fostering innovation and adapting to market and customer needs. Organizations are democratizing data through collaborative efforts. Ayur.AI is on a similar path, with Minfy helping Ayur.AI solve some of the most pressing healthcare issues of our time.

Ayur.AI has deep domain knowledge and an understanding of holistic healthcare practices, and aims to transform healthcare into a more proactive, preventive, and patient-centric experience by combining Ayurveda with modern science.

To support Ayur.AI in achieving their vision, Minfy identified three broad areas of collaboration across their service and product offerings:

1. Ayur.AI had patient data in unstructured formats such as images and PDFs, and therefore faced challenges in utilizing that data to advance their service and product offerings.
2. Ayur.AI wanted to track a patient’s progress over time and perform advanced analytics to achieve a more proactive and preventive approach.
3. Ayur.AI wanted to better manage their ayurvedic nutraceutical product listings by understanding the emotions around a particular product, so that they could make better, data-driven recommendations to customers.

Solution Offerings

Solution 1: Data Digitization

The process of converting unstructured data into structured/semi-structured data involves several steps, including data extraction from documents, data cleaning and pre-processing, data mapping, data standardization, and data normalization. After these steps are completed, the data is typically stored, analysed, and used to generate insights.

AWS offers an array of tools and services to assist healthcare stakeholders in unlocking the full potential of their data. Our solution utilizes AWS services such as Amazon Textract to process a small sample of documents, extracting relevant data and standardizing it for further processing.

Extraction: The foundation of utilizing unstructured healthcare data lies in the extraction of relevant information from various sources such as images, PDFs, and scanned documents. This process involves leveraging the power of Optical Character Recognition (OCR) algorithms and tools, such as the widely available services like Amazon Textract. These algorithms and tools enable us to recognize, extract, and define output from the unstructured data, making it machine-readable and ready for further analysis.

We used Amazon Textract to perform the OCR and digitize the data. A sample lab investigation report was run through Textract to obtain the extracted information in CSV or Excel format.
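
A minimal sketch of this Textract call via boto3, assuming the report image sits in S3 (bucket and key are placeholders):

```python
# Minimal sketch; bucket and object key are placeholders.
import boto3

textract = boto3.client("textract")

response = textract.analyze_document(
    Document={"S3Object": {"Bucket": "ayurai-reports", "Name": "lab-report.png"}},
    FeatureTypes=["TABLES", "FORMS"],  # extract tables and key-value pairs
)

# Each block is a detected page element; LINE blocks carry the raw text.
lines = [b["Text"] for b in response["Blocks"] if b["BlockType"] == "LINE"]
```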

 


 

Processing: Once the data was extracted, we standardized it to a uniform format for meaningful comparison and calculations. Natural Language Processing (NLP) techniques play a vital role in extracting valuable information from textual data. This allows a healthcare provider to look across the complete document and understand the mapping of diagnoses, symptoms, treatments, qualities, and so on.

Data Mapping, Normalization, & Standardization: The extracted data is diverse and needs to be transformed to a common scale to facilitate meaningful comparison and calculations. We transformed the data into standard forms to ensure homogeneity of medical concepts across different datasets.

Data ontology plays a crucial role here, establishing a common ground for understanding the data and enhancing the accuracy and relevance of analyses by defining the relationships between different medical concepts.

Solution 2: Advanced Analytics and Visualizations on Patient Data

We successfully extracted, cleaned, and pre-processed the data; as a result, it is now structured and ready for analytics and visualizations. We started by performing a descriptive analysis of the patient data to gain insights into patients' current health status, using Amazon QuickSight to create dashboards that visualize the medical data.

For instance, we had patient lab investigation reports, and we had extracted the information along with unique identifiers such as UHID. We then performed the necessary preliminary steps to ensure that the data is usable.

Our descriptive analytics involved creating a dashboard that displays vitals trends, health parameters abnormalities, a timeline view of medical events, the distribution of normality, and the classification of parameters based on abnormality. We also included a health parameter value change indicator to compare the changes during a specific period.

These powerful visualization techniques would enable healthcare professionals to determine patterns, trends, and correlations within the data - providing valuable insights into patient health, current stage of the disease and potential medical outcomes.

Solution 3: AI Powered Sentiment Analysis of Nutraceutical Product Reviews

To perform the sentiment analysis, Minfy used its in-house solution, Sentiment.ai. Sentiment.ai uses AI/ML to analyse the sentiment of text data to understand emotions and opinions.

With the help of Sentiment.ai, we enabled Ayur.AI to better understand the customer emotions about products, make data-driven and informed decisions, and provided real-time analytics, trend detection, and sentiment tracking to capture the pulse of customer sentiment.

 

The analysis helped Ayur.AI accurately understand customer sentiment, proactively address detected issues or concerns, and adopt a transparent, data-driven approach. Overall, the solution enhanced customer satisfaction, personalized marketing strategies, optimized the product assortment, and improved Ayur.AI’s reputation across retailers.

Outcomes

The following outcomes were achieved by Ayur.AI during their collaboration with Minfy:

Effective Data Utilization: Ayur.AI successfully transitioned from unstructured data formats like images and PDFs to structured data. The data digitization process enabled Ayur.AI to harness valuable insights from previously untapped data sources.

Proactive Healthcare Approach: With the ability to track patient progress over time, Ayur.AI adopted a proactive and preventive approach to healthcare. Advanced analytics provided a comprehensive view of patient health, leading to informed decision-making for doctors.

Data-Driven Decision-Making for Products: Ayur.AI gained a deeper understanding of customer emotions and opinions regarding their nutraceutical product listings. Real-time sentiment analysis empowered Ayur.AI to make data-driven decisions, updating product listings in real time.

Conclusion

Healthcare systems are at a critical juncture in the dynamic world of smart healthcare. While fundamental principles and systems have been in use for a long time, the introduction of new technologies opens a wide range of opportunities for growth and advancement.

The adoption of technological developments has moved from voluntary to mandatory in a fast-expanding healthcare market. It is the path that will allow us to provide efficient, patient-centric care, ultimately leading to better health outcomes for everyone.

As a result, it is critical that we work together, push the boundaries, and truly embrace the transformative force of innovation as we journey towards a future where healthcare’s potential is limitless.

Co-Author - Dr. Bala Pesala | Founder & CEO, Ayur.AI

Co-Author – Yedhu Krishnan B | Senior Full Stack Developer, Ayur.AI

Author – Gaurav Lohkna | Data Scientist, Minfy


Pos Malaysia's AWS Revolution


 

About Customer

Pos Malaysia is the national courier service provider and sole licensee for universal postal services in the country, delivering to more than 10 million addresses across the nation. With a track record of over 200 years, the Pos Malaysia Group has progressed from a traditional postal service into a dynamic mail and parcel services, financial services, and supply chain solutions provider with the largest delivery and touchpoint network in Malaysia. Pos Malaysia’s Integrated Parcel Centre (IPC) was upgraded to process over 300,000 items a day. With growing demand, Pos Malaysia incorporates various technologies to deliver services to its customers and to realize its vision, “We deliver. We connect. We improve lives.”

Challenges

Pos Malaysia’s existing business model uses on-premises data centers for storing and processing all the data generated across different services. The company stores a large amount of data in an on-premises database, which is used by different business units for data processing, transformation, and further analysis. As order volumes grow, the company is looking to leverage cloud technology to build a data warehouse for data analytics, and to build an end-to-end ML pipeline on top of it that incorporates advanced ML algorithms to predict ETA (Estimated Arrival Time). The company also faces challenges such as scaling during data spikes and vertical scaling for heavy workloads, and needs a secure solution to maintain and monitor the different APIs behind its integrated solutions. It is therefore looking for a cost-effective solution that can address these requirements and reduce operational overheads.

As an AWS Premier Partner, Minfy brings deep cloud expertise in migration and modernization. Minfy believes the client can modernize its database for scalability and security and leverage serverless architecture for its workflows. Minfy proposed various AWS data analytics services for building advanced data analysis and ML pipelines, enabling the company to be more agile in delivering its solution.

Current Scenario

• The data resided in different databases. The client wanted a data warehouse solution to help its business units to access the right data and perform analytics in real time removing the existing manual interventions to fetch the data.
• The client wanted to enhance the user experience of their customers by reducing the time to predict the estimated arrival time. During peak time, the APIs took a long time to respond.
• Owing to the pandemic, there was a negative impact on the delivery time for parcels which significantly affected the estimated arrival time. The client wanted to revise and run an ML model to predict the ETA with a low margin of error.
• The client required a pipeline that could process data in batches and persist it to a NoSQL database for fast retrieval. Monitoring the APIs was becoming a challenge as traffic across the overall application had increased to over a million requests daily.

Client Expectation

• A robust and scalable infrastructure solution with minimal spending
• An architecture that can allow provision-on-demand spikes.
• Efficient and fast data warehouse solution for quick insights
• Low latency storage with minimum operational overheads
• Fast response and deployment times for APIs
• Security of data at rest and in transit across services

Solution Offered

• Transit Gateway was configured to establish a secure connection between the Virtual Private Cloud (VPC) and the on-premises database/application.
• AWS API Gateway was configured to host and manage the private API service as a private API endpoint. It provides the client with functionality like API keys for partner validation and rate limiting to implement throttling and handle unwanted spikes in usage.
• Basic validation of API requests was configured using API Gateway, which also assists in maintaining multiple versions for next-phase releases.
• AWS Lambda was configured as a compute service triggered by API Gateway upon request. Lambda transforms the request into a valid query against DynamoDB and responds with the relevant information (a minimal handler along these lines is sketched after this list).
• DynamoDB was configured to store the connote IDs of parcels, with an internal trigger implementing a data lifecycle to purge legacy data from storage. DynamoDB serves as a key-value storage service: a fast, scalable, and flexible NoSQL database for low-latency data.
• Amazon Redshift was configured as the data warehouse system to store the data and create business reports.
• AWS Batch was used to create either event-based or time-based triggers to extract the delta data from Redshift on a periodic basis and process it into the input set for SageMaker.
• An Amazon SageMaker endpoint was configured for predicting ETA using ML algorithms.
• AWS Batch retrieves the output from SageMaker and pushes it into DynamoDB for storage. The table structure was decided based on the output structure from SageMaker.
• Amazon S3 is the data source where the data extracted from the OAL database is stored.
• AWS Glue was configured to preprocess the raw data into the relevant data structure for the data warehouse. It also helps validate the data set and correct any duplication-related errors.
• Key Management Service (KMS) was configured with the database services to ensure data encryption at rest. HTTPS APIs with certificates hosted in ACM were used to ensure data encryption in transit.
• CloudWatch was configured to monitor all the services used in the architecture. It helped set up alerts in case infrastructure usage went beyond expected parameters and assists in automating resolutions.
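
For illustration, referring to the Lambda step above, a minimal handler might look like this (table, key, and field names are assumptions, not Pos Malaysia's actual schema):

```python
# Minimal sketch; table, key, and field names are assumptions.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("parcel-eta")  # hypothetical table name

def lambda_handler(event, context):
    # API Gateway (proxy integration assumed) passes the connote ID in the path.
    connote_id = event["pathParameters"]["connoteId"]
    item = table.get_item(Key={"connote_id": connote_id}).get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps({"eta": item["predicted_eta"]})}
```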

Results and Benefits

• Using API Gateway, the client was able to handle a huge number of requests and further reduce cost through the service's tiered pricing model. Since API Gateway is a fully managed service, the client also reduced the operational effort of managing APIs and tracking requests.
• Since Lambda queried results from DynamoDB, customers received API responses with very low latency, even during peak traffic.
• Using AWS Glue, Redshift, Batch, and SageMaker, the client implemented the ML pipeline to deliver quick results. Using AWS Batch, the client ran batch computing workloads, reducing costs significantly.


Driving Agility and Cost Effectiveness for a Leading Indian Conglomerate


Objectives

Dalmia's business goal is to be in the top two in all its businesses through the strength of its people and the speed of its innovation. On the innovation front, they want to centralize their IT infrastructure, modernize their applications, and reduce costs, including license costs. They also want to transform their applications and make them future-ready in order to accelerate innovation.

Solution Highlights

Minfy migrated and ran their SAP & Databases on AWS Cloud:

• HANA (hosted on Oracle) to Sybase
• MySQL databases to RDS MySQL

As an AWS Premier Partner, Minfy brings a range of deep expertise in cloud migration and infrastructure management, adhering to AWS best practices, built upon its proven experience.

Minimised Cost | Flexibility | Cloud Native Agility

Key Benefits

Flexibility

The flexibility provided by AWS Cloud has allowed Dalmia to scale infrastructure according to business requirements.

Re-Platforming

By re-platforming critical legacy application databases to managed RDS, Dalmia has seen improved performance and reduced operational overhead for activities such as monthly patching and scheduling backups.

 

Cloud-Native Services

By helping Dalmia adopt cloud-native services, we enabled them to drive innovation and support their digital transformation initiatives.

Customer Background

Founded by Mr. Jaidayal Dalmia in 1939, Dalmia Cements is a pioneering cement company in India. Headquartered in New Delhi, the company operates as a 100% subsidiary of Dalmia Bharat Ltd. The company has a manufacturing capacity of 35.9 million tons per annum (MTPA) across 14 plants and grinding units located in 10 states. With over 33,000 dealers and sub-dealers, the company serves more than 22 states and is a leading player in each region. The company has grown through acquisitions and greenfield expansions to scale and venture into new territories. Dalmia Cement is unique in that it has at least one plant in each of the four major eastern states of West Bengal, Bihar, Jharkhand, and Odisha.

Challenges

The Dalmia Group had their workloads hosted on-premises and in a third-party data center. An assessment identified a total of 123 servers and 11 applications hosted in these locations. They were looking to move their workloads to AWS due to the poor performance of the previous data center. The goal was to centralize their infrastructure, modernize their applications, and reduce costs, including license costs. They also wanted to transform their applications and make them future-ready in order to accelerate innovation.

 

Business Drivers

Business Driver 1: A private cloud contract extension was on the cards, with critical business applications and ERPs hosted on the private cloud.
Business Driver 2: Limitations and constraints on the technology front made it hard to scale and innovate quickly on on-premises infrastructure.

Solutions

Minfy leveraged the Migration Acceleration Program to create the business case for migration to AWS. We designed and implemented a highly available infrastructure for their SAP workloads on AWS. Additionally, Minfy migrated Dalmia’s critical databases as well as their non-SAP applications. We also defined and implemented a Cloud Operating Model and transitioned Dalmia’s cloud-ops to Minfy’s Managed Services Support.

About Minfy

Minfy Technologies is a Cloud Native System Integrator helping enterprises, start-ups and fast-growing businesses navigate digital journeys leveraging AI & Cloud technologies. We assist our customers in accelerating Digital Transformation, Cloud Adoption and Innovation. Our transformative services include Cloud Consulting, Migration & Legacy Modernisations, Cloud-Native Application Development and DeepTech implementations while offering reliable and world class 24x7 managed services.


Database Modernization for Leading Indian Tech Company


Objectives

Cyient and Minfy collaborated to migrate the end customer's (a leading mobile network company) databases from the commercial Oracle database engine to open-source PostgreSQL for cost-effective scalability and cost management. The objectives were to:

• Transform Cyient's Oracle database to Aurora PostgreSQL for cost savings and reduced management overhead.
• Reduce heavy Oracle licensing costs.
• Achieve Oracle-grade scale, performance, and HA/DR with open-source alternatives.
• Based on the success of the above, move their other Oracle-based applications to Aurora PostgreSQL.

Solution Highlights

Minfy migrated Cyient's database to AWS cloud from a commercial database engine. This resulted in improved database performance, reduced licensing costs, and improved scalability and flexibility.

Minimised Cost

Improved Performance

Scalability

Key Benefits

Migrating from a commercial database

Migrating from a commercial database engine to open-source PostgreSQL reduced licensing costs and delivered licensing freedom.

Improved scalability & flexibility

Improved scalability & flexibility in workload management, which lowers costs and reduces the likelihood of overprovisioning.

Right-sizing

Right-sizing of the database resulted in improved database performance & efficiency.

 

Customer Background

Cyient is a leading Indian multinational technology company specializing in engineering, manufacturing, data analytics, and networks and operations. With over 15,000 associates in 19 countries, they provide services and solutions to a diverse customer base of over 300 clients, including 29 Fortune 500 companies. Cyient's approach combines global delivery with local relationships, innovation, and digital expertise.

Challenges

Business Challenges:

Overprovisioning of licenses and the costs associated with it.

Freedom from licensing and vendor lock-in challenges.

Specific Technical Challenges:

The end customer restricted Cyient's and Minfy's use of the DMS console (GUI) on AWS, so the AWS CLI was used instead (see the sketch after this list).

Converting complex Oracle SQL code objects to PostgreSQL for the COTS application required significant engineering effort.

Achieving HA/DR and scale-up/scale-out with Oracle posed cost and management challenges, which we resolved as part of our solution.
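With console access restricted, DMS can be driven entirely from the command line or SDKs. The sketch below is a minimal, hypothetical illustration of that workflow using boto3 (the Python counterpart of the AWS CLI calls); all ARNs, identifiers, the region, and the schema name are placeholders rather than values from this engagement.

    import json
    import boto3

    dms = boto3.client("dms", region_name="ap-south-1")  # region is an assumption

    # Include every table of a (hypothetical) Oracle schema in the migration.
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-app-schema",
            "object-locator": {"schema-name": "APP_SCHEMA", "table-name": "%"},
            "rule-action": "include",
        }]
    }

    # Create a full-load + CDC task; the ARNs stand in for endpoints and a
    # replication instance created beforehand.
    task = dms.create_replication_task(
        ReplicationTaskIdentifier="oracle-to-aurora-pg",
        SourceEndpointArn="arn:aws:dms:ap-south-1:123456789012:endpoint:SRC",
        TargetEndpointArn="arn:aws:dms:ap-south-1:123456789012:endpoint:TGT",
        ReplicationInstanceArn="arn:aws:dms:ap-south-1:123456789012:rep:RI",
        MigrationType="full-load-and-cdc",
        TableMappings=json.dumps(table_mappings),
    )

    # Start the task once it is created and reports "ready".
    dms.start_replication_task(
        ReplicationTaskArn=task["ReplicationTask"]["ReplicationTaskArn"],
        StartReplicationTaskType="start-replication",
    )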

Technical Solutions

Features of the Solution:

•  Used AWS SCT and the AWS CLI with DMS for migration to AWS Aurora PostgreSQL.
•  Customized Ora2Pg integration for SQL code object conversion.
•  Converted schema and SQL/DB objects with open-source accelerators.
•  Factored in RDS Multi-AZ and read replicas for HADR and read scaling (see the sketch after this list).
•  Applied open-source customizations for BLOB/CLOB tables and CDC using DMS.
•  Designed and executed open-source PostgreSQL APM (application performance monitoring) for performance insights.
•  Provided SQL query optimization recommendations based on Aurora PostgreSQL APM reports.
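On Aurora PostgreSQL, read scaling comes from adding reader instances to the cluster, and placing readers in other Availability Zones also covers the Multi-AZ HA requirement. A minimal, hypothetical boto3 sketch follows; the cluster and instance identifiers are placeholders, not values from this engagement.

    import boto3

    rds = boto3.client("rds", region_name="ap-south-1")  # region is an assumption

    # Add a reader to an existing (hypothetical) Aurora PostgreSQL cluster.
    # Aurora serves reads from it immediately and can promote it on failover,
    # providing both read scaling and high availability.
    rds.create_db_instance(
        DBInstanceIdentifier="aurora-pg-reader-1",
        DBClusterIdentifier="oracle-migration-aurora-cluster",
        Engine="aurora-postgresql",
        DBInstanceClass="db.r6g.large",
    )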


Mjunction's Database Modernization Success on AWS


August 31, 2023 | Resonances | Type: Case study

 

About the Company

Mjunction is an Indian online marketplace company that specializes in e-commerce, digital procurement, and supply chain solutions. Founded in 2001 as a joint venture between Tata Steel and SAIL, Mjunction has played a pivotal role in transforming various industries by providing innovative technology platforms for buying, selling, and managing commodities and services. The company's expertise lies in creating transparent, efficient, and secure digital ecosystems that optimize business processes and drive value for a wide range of stakeholders.

Customer Challenges

Mjunction faced the challenge of migrating 120 on-premises servers, a primarily Linux stack (63 servers) alongside Windows (57), spanning SAP (24) and non-SAP (96) workloads, to the AWS cloud. The objective was to centralize infrastructure for inventory consolidation, optimize costs, and enhance operational efficiency. The underlying issue lay in their on-premises infrastructure, which contributed to elevated costs.

Why Minfy

Mjunction selected Minfy as its migration partner on the strength of a well-established relationship as its reselling partner. Minfy's premier AWS partnership in India and its track record of successful large-scale migrations underscored its expertise and reliability.

 

Minfy’s Solution

Minfy successfully undertook a substantial migration initiative for Mjunction, encompassing the migration of their 120 on-premises servers. Notably, we seamlessly migrated their critical application, Auction1.metaljunction.com, from IBM infrastructure to AWS, including the associated databases. This paved the way for the following transformative elements within our comprehensive solution:

Landing Zone Setup

A meticulously crafted landing zone was established, providing Mjunction with a secure and compliant foundation for their seamless migration to AWS. This robust groundwork ensured data integrity and compliance with industry standards.
Autoscaling high availability for Application

The implementation of dynamic autoscaling mechanisms offered the application elasticity, enabling it to automatically adjust resources based on real-time demand. This approach enhanced operational efficiency and responsiveness, ensuring optimal performance even during peak usage periods.
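As an illustration of this pattern (not the exact policy used for Mjunction), a target-tracking policy on an Auto Scaling group keeps average utilization near a chosen target and absorbs peak load automatically. A minimal boto3 sketch with a hypothetical group name:

    import boto3

    autoscaling = boto3.client("autoscaling", region_name="ap-south-1")  # assumption

    # Track 60% average CPU across the (hypothetical) group: the group adds
    # instances when load pushes CPU above the target and removes them when
    # demand drops.
    autoscaling.put_scaling_policy(
        AutoScalingGroupName="auction-app-asg",
        PolicyName="cpu-target-tracking",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization"
            },
            "TargetValue": 60.0,
        },
    )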

Database Modernization

Mjunction's Oracle databases underwent a strategic transformation by transitioning to open-source managed RDS PostgreSQL and self-managed EC2 PostgreSQL instances. This modernization strategy not only elevated performance and scalability but also contributed to cost savings by leveraging open-source technology's benefits.
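As a rough sketch of the managed-RDS side of this move (identifiers, sizing, and credentials below are placeholders; the actual instance configuration from this migration is not public), provisioning a Multi-AZ PostgreSQL target via boto3 looks like:

    import boto3

    rds = boto3.client("rds", region_name="ap-south-1")  # region is an assumption

    # Provision a Multi-AZ managed PostgreSQL instance as a migration target;
    # MultiAZ=True keeps a synchronous standby in a second Availability Zone.
    rds.create_db_instance(
        DBInstanceIdentifier="mjunction-pg-1",
        Engine="postgres",
        DBInstanceClass="db.m6g.large",
        AllocatedStorage=200,            # GiB, placeholder sizing
        MultiAZ=True,
        MasterUsername="pgadmin",
        MasterUserPassword="CHANGE_ME",  # store real credentials in Secrets Manager
    )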

Tools used

•  AWS Schema Conversion Tool (AWS SCT)
•  AWS Database Migration Service (AWS DMS)

Specific Outcomes

Through Minfy's migration initiative, Mjunction achieved enhanced application performance and scalability with dynamic autoscaling, optimized costs by transitioning to open-source PostgreSQL databases, established a secure landing zone ensuring compliance, and seamlessly migrated their critical application to AWS, ensuring uninterrupted service. This modernization drive equips Mjunction for future growth while providing immediate benefits in efficiency, cost, and user experience.


POS Malaysia's AWS Excellence: Enhancing Security, Reliability, and Cost Optimization


type: Case study

 

Executive Summary

Pos Malaysia is the national courier service provider and sole licensee for universal postal services in the country, delivering to more than 10 million addresses across the nation. With a track record of over 200 years, the Pos Malaysia Group has progressed from a traditional postal service into a dynamic mail and parcel services, financial services and supply chain solutions provider with the largest delivery and touchpoint network in Malaysia. Pos Malaysia’s Integrated Parcel Centre (IPC) was upgraded to process over 300,000 items a day. With the growing demand, Pos Malaysia incorporates various technologies to deliver several services to its customers and to realize its vision, “We deliver. We connect. We improve lives.”

Challenges

•  IAM user best practices: The POS Digital account has IAM users and wanted to improve its security posture around IAM users, password policy, temporary access, secrets management, etc., by implementing best practices to protect against unauthorized access across the cloud environment.
•  Automation: The POS Digital account wanted a reliable platform with a modernized environment, leveraging AWS cloud-native services with higher fault isolation.
•  Reliability: The POS Digital account's objective was to enhance its testing process to validate the multi-deployment strategy along with the Auto Scaling group and to verify performance under peak load.
•  Higher availability for application recovery: The POS Digital account wanted to improve application resiliency with a multi-account strategy and a disaster recovery plan.

AWS Best Practice Recommendations Implemented

Security Pillar

•  Rotate IAM access keys that are more than 90 days old.
•  Configure the account password policy to match organizational standards.
•  Manage secrets with AWS Secrets Manager rather than plain SSM parameters.
•  Enforce IMDSv2 on EC2 instances where IAM roles are enabled (see the sketch after this list).
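Enforcing IMDSv2 makes the instance metadata service require session tokens, closing off the SSRF-style credential theft that IMDSv1 permits. A minimal boto3 sketch of enforcing it on a running instance (the instance ID and region are placeholders):

    import boto3

    ec2 = boto3.client("ec2", region_name="ap-southeast-1")  # region is an assumption

    # Require session tokens (IMDSv2 only) so the IAM role credentials on the
    # instance cannot be read with a plain IMDSv1 GET request.
    ec2.modify_instance_metadata_options(
        InstanceId="i-0123456789abcdef0",  # placeholder
        HttpTokens="required",
        HttpEndpoint="enabled",
    )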

Reliability Pillar

•  Adopt a multi-account strategy to plan for the DR setup.
•  Enable an auto-healing layer by placing business-critical instances in AWS Auto Scaling groups.
•  Use Route 53 for more control over the DNS data plane (see the sketch after this list).
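Route 53 health checks underpin DNS-level recovery: failover-routed record sets reference a health check and shift traffic to a standby endpoint when the primary fails. A minimal, hypothetical boto3 sketch of creating such a health check (the domain and path are placeholders):

    import uuid
    import boto3

    route53 = boto3.client("route53")  # Route 53 is a global service

    # Health-check a (hypothetical) application endpoint; a failover record
    # set can then reference the returned health check ID.
    resp = route53.create_health_check(
        CallerReference=str(uuid.uuid4()),  # idempotency token
        HealthCheckConfig={
            "Type": "HTTPS",
            "FullyQualifiedDomainName": "app.example.com",  # placeholder
            "Port": 443,
            "ResourcePath": "/health",                      # placeholder
            "RequestInterval": 30,
            "FailureThreshold": 3,
        },
    )
    print(resp["HealthCheck"]["Id"])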

Performance Pillar

•  Enable metrics for the infrastructure and application layers with CloudWatch Application Insights for fuller visibility into every layer.
•  Use AWS Config to maintain a proper audit trail of all resources.

Customer Outcomes

•  Improved Security: The POS Digital account strengthened its security posture across the environment with MFA and IAM best practices, and enabled IMDSv2 to protect all EC2 instances.
•  Enhanced Reliability: The POS Digital account enabled Auto Scaling groups for a highly reliable environment, and the customer performed distributed load testing using Apache JMeter.
•  Cost Optimization: AWS Compute Optimizer was enabled for better control over resource right-sizing, and Reserved Instances were addressed for further cost optimization.
•  Security Standards: End-to-end monitoring was enabled on both the infrastructure and application layers with application insights, and canary scripts were tested for synthetic monitoring on the web layer.

Conclusion

The POS Digital account has made significant enhancements across multiple fronts in its AWS environment. These include improved security through the implementation of Multi-Factor Authentication (MFA), adherence to IAM best practices, and the deployment of IMDSv2 to safeguard all EC2 instances. Additionally, the account has bolstered reliability by establishing an auto-scaling group for a highly dependable system, validated through distributed load testing using Apache JMeter. Cost optimization efforts have been prioritized by enabling AWS Compute Optimizer to fine-tune resource allocation and address Reserved Instances (RIs) for cost efficiency. Lastly, security standards have been elevated with end-to-end monitoring spanning both infrastructure and application layers, aided by application insights and the introduction of canary scripts for synthetic monitoring at the web layer to proactively identify and address potential issues.

About Minfy

Minfy Technologies is a Cloud Native System Integrator helping enterprises, start-ups and fast-growing businesses navigate digital journeys leveraging AI & Cloud technologies. We assist our customers in accelerating Digital Transformation, Cloud Adoption and Innovation. Our transformative services include Cloud Consulting, Migration & Legacy Modernisations, Cloud-Native Application Development and DeepTech implementations while offering reliable and world class 24x7 managed services.

Reach out to us for a better world

Minfy has a repository of learnings, competencies and an enviable track record of meeting customer needs. Advice and service, solutions and responsiveness work in tandem. Begin your cloud journey, accelerate it or optimise your cloud assets. Experience business impact.
