Executive Summary

About Client
The client, Yorker, is focused on leveraging technology to address the challenge of tracking and managing cricket bowlers’ net practice bowling loads. Recognizing the risk of overtraining and injuries from improper tracking, Yorker aims to provide a digital solution tailored for cricket players. The AWS custom application for Yorker empowers bowlers to automate session recordings, create personalized training plans, and monitor progress effectively. The app also fosters a sense of community by enabling interaction, knowledge sharing, and participation in skill-building challenges. The project is being executed in multiple phases, beginning with a Minimum Viable Product (MVP) to establish a strong foundation for future improvements. Yorker’s commitment to innovation and user-centric design reflects its dedication to transforming how athletes manage their training and optimize performance while minimizing injury risks.
Project Background - Enhancing Cricket Training through Digital Bowling Load Management
The Yorker mobile app project addresses a major challenge for cricket bowlers: accurately tracking and managing their bowling loads during net practice. Without proper tracking, bowlers risk improper training regimens, leading to overtraining and injuries. The Yorker app offers a digital solution that automates session recordings, capturing key metrics like delivery count, types of deliveries, and intensity levels. Additionally, the app allows bowlers to create personalized training plans, track progress, and receive real-time alerts to avoid overexertion. By leveraging technology, this initiative not only helps reduce injury risks but also fosters a sense of community. Bowlers can share experiences, learn from experts, and engage in skill-enhancing challenges. Ultimately, the app aims to optimize performance while ensuring bowlers train safely and efficiently, revolutionizing the way athletes manage their training.
Scope & Requirement for AWS Custom Application For Yorker
- User Authentication: Secure login and registration functionality for bowlers.
- Profile Management: Basic user profile setup, including personal details and preferences.
- Bowling Record Tracking: Automated entry for recording bowling sessions, including delivery count, types, and intensity.
- Basic Reporting: Simple reports summarizing bowling loads to help users monitor their progress.
- Mobile App Development: We will develop the front end using React Native to ensure cross-platform compatibility on iOS and Android.
- Backend Services: Built using .NET with RESTful APIs for data communication.
- Database: RDS Aurora PostgreSQL for structured data storage of user profiles and bowling records.
- CI/CD Pipeline: Set up Continuous Integration/Continuous Deployment processes for efficient development and release.
- User Interface Design: Intuitive and user-friendly UI aligned with branding, focusing on easy data entry and report viewing.
Implementation
Technology and Architecture for AWS Custom Application For Yorker
Technology
WAF, API Gateway, Lambda Functions, RDS, S3, CloudWatch, Secrets Manager
Integrations
The application leverages RESTful APIs for smooth data transfer between the front end and back end, facilitating user authentication, session tracking, and profile management. Future integrations may include cloud-based analytics and third-party push notifications to enhance user engagement.
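To make the integration concrete, here is a minimal sketch of what one such endpoint could look like: an API Gateway-backed Lambda (Python) that validates a session payload and writes it to the Aurora PostgreSQL database, fetching credentials from the Secrets Manager service listed in the stack. The secret name, table, and field names are illustrative assumptions, not the production schema.

```python
import json
import boto3
import psycopg2  # shipped as a Lambda layer in this sketch

SECRET_ID = "yorker/db-credentials"  # illustrative secret name


def get_connection():
    # Fetch DB credentials from Secrets Manager rather than hard-coding them.
    secret = boto3.client("secretsmanager").get_secret_value(SecretId=SECRET_ID)
    creds = json.loads(secret["SecretString"])
    return psycopg2.connect(
        host=creds["host"], dbname=creds["dbname"],
        user=creds["username"], password=creds["password"],
    )


def lambda_handler(event, context):
    """API Gateway -> Lambda handler that records one bowling session."""
    body = json.loads(event["body"])
    # Basic validation of the metrics described above (names are assumptions).
    for field in ("user_id", "delivery_count", "delivery_type", "intensity"):
        if field not in body:
            return {"statusCode": 400,
                    "body": json.dumps({"error": f"missing {field}"})}

    with get_connection() as conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO bowling_sessions "
            "(user_id, delivery_count, delivery_type, intensity) "
            "VALUES (%s, %s, %s, %s)",
            (body["user_id"], body["delivery_count"],
             body["delivery_type"], body["intensity"]),
        )
    return {"statusCode": 201, "body": json.dumps({"status": "recorded"})}
```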
Scalability
The app is designed to run on serverless services, allowing automatic scaling based on usage.
Cost Optimization
Serverless architecture, using AWS Lambda, reduces infrastructure costs.
Backup and Recovery
A robust backup strategy, using Amazon S3, prevents data loss, while automated recovery processes ensure quick restoration in case of failure.
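As an illustration of such a strategy, the sketch below (boto3) enables versioning and a lifecycle policy on a backup bucket; the bucket name, prefix, and retention windows are assumptions chosen for the example.

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "yorker-db-backups"  # illustrative bucket name

# Versioning keeps prior copies of every backup object, guarding against
# accidental overwrites or deletions.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# Age older backups into cheaper storage and expire them eventually.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [{
            "ID": "rotate-backups",
            "Status": "Enabled",
            "Filter": {"Prefix": "backups/"},
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }]
    },
)
```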
Features of AWS Custom Application For Yorker
Automated Bowling Session Tracking
Capture and record each bowling session, including the number of deliveries, delivery types, and intensity levels, providing players with a detailed log of their training activities.
Personalized Training Plans
Create and customize training plans tailored to individual fitness levels and goals. Players and coaches can adjust these plans based on real-time performance data to optimize training regimens.
Progress Monitoring & Alerts
Track progress against predefined plans, with visual dashboards and alerts to notify users of deviations that may lead to overexertion or injuries.
User Profile & Simple Reporting
Maintain a personalized profile to store training history, generate basic reports on bowling performance, and gain insights to improve overall training effectiveness.
Challenges with AWS Custom Application For Yorker
Accurate Data Capture & Tracking
Ensuring the app reliably records detailed bowling metrics like delivery type, count, and intensity without manual errors poses a challenge, especially in a real-time sports environment.
Scalability & Performance
As user adoption grows, maintaining app performance and scalability will be critical, particularly during peak usage times. Designing a backend that can handle large volumes of data efficiently is essential.
User Engagement & Retention
Encouraging consistent use of the app among bowlers can be challenging. Building features that foster community interaction, personalized plans, and gamified challenges will be crucial to retaining users.
Cross-Platform Compatibility
Delivering a seamless user experience across both iOS and Android devices requires rigorous testing to address device-specific issues, screen resolutions, and performance variations.
Project Completion of AWS Custom Application For Yorker
Duration
- Aug 2024 – Oct 2024 ~ Implementation and Support
- Oct 2024 – Present ~ Rolling out the changes to production
Deliverables
Requirements Specification & Architectural Design Documents
Comprehensive documentation outlining detailed project requirements, technical architecture, and system design.
Minimum Viable Product (MVP)
A fully functional MVP with core features, including user authentication, profile management, automated bowling session tracking, and basic reporting.
Mobile Application UI/UX Design
Intuitive and user-friendly interface designs for the app, ensuring a seamless experience on both iOS and Android devices.
Backend Services & APIs
Development of scalable backend services using .NET, along with RESTful APIs for data communication between the mobile app and server.
CI/CD Pipeline & Deployment
Implementation of Continuous Integration/Continuous Deployment pipelines to automate the build, testing, and deployment processes. The initial release is deployed on cloud platforms.
Support
As part of the project implementation, we provide 2 months of ongoing extended support. This includes 20 hrs a month of development for minor bug fixes and an SLA covering any system outages or high-priority issues.
Testimonial
Awaited
Next Phase
We are now looking at the next phase of the project which involves:
1. Ongoing Support and adding new features every Quarter with minor bug fixes
2. Social & Community Building Features
Looking For Similar Services? Please Get In Touch
Executive Summary

About Client
Enterprise Software Solutions (ESS) provides tailored software licensing and cloud services, specializing in optimizing Microsoft licensing solutions. Headquartered in Phoenix, Arizona, ESS helps clients reduce cloud spending and navigate complex licensing challenges. Their services include Microsoft 365 and Dynamics 365 solutions, cloud licensing optimization, and 24/7 support. ESS focuses on delivering cost-effective software solutions with a strong emphasis on customer service, including fast delivery, a 30-day warranty on most products, and a full refund policy.
Address: Tempe, AZ 85281, USA
https://www.enterprise-software-solutions.com/
Project Background
Peritos and Enterprise Software Solutions (ESS) have entered a 2-year contract to support multiple enterprise and Microsoft clients. This partnership focuses on providing solutions and services for clients utilizing Microsoft products. The collaboration covers high-level requirements across various projects, ensuring both companies can effectively serve their clients by leveraging Microsoft’s extensive suite of tools and services. These include Dynamics 365 F&O, CRM, Sales Hub, Customer Insights, HR, Project Operations, SCM, and Finance.
Requirement
- Supporting enterprise-level implementations of Microsoft tools.
- Addressing high-level needs such as financial operations, customer relationship management, HR processes, and supply chain efficiencies.
- Providing seamless integration across various departments to enhance productivity and business outcomes.
Scope
- Requirement Gathering: Understand client-specific needs and align them with Dynamics 365 capabilities.
- System Implementation: Set up the appropriate Dynamics product for the customer and the relevant modules (e.g., Finance, CRM, SCM) based on the business processes and scope identified.
- Customization: Tailor the system through custom fields, workflows, or integrations as required.
- Data Migration: Transfer data from legacy systems into Dynamics 365 while maintaining integrity.
- User Training and Go-Live: Train users, conduct tests, and ensure a smooth transition to the new system.
- Post-Go-Live Support: Provide ongoing support and system optimizations.
Technology and Architecture
- Cloud-Based Architecture: Dynamics 365 runs on Azure, leveraging its scalability, security, and integration capabilities.
- Microservices and APIs: Modular design using microservices allows seamless integration of Finance, SCM, HR, and CRM with third-party systems (an illustrative API call follows this list).
- Data Management: Utilizes Azure Data Factory for ETL processes, with Azure SQL for data storage and Power BI for embedded analytics.
- Security: Built-in role-based access control (RBAC) and encryption ensure data protection across the system.
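As one hedged illustration of this modular, API-driven design: Dynamics 365 data can be read through the Dataverse Web API (OData). The sketch below assumes an Azure AD bearer token has already been acquired; the org URL and the query itself are placeholders, not a specific client configuration.

```python
import requests  # pip install requests

# Placeholders -- the org URL and the OAuth token acquisition (via Azure AD)
# depend entirely on the client tenant.
ORG_URL = "https://yourorg.api.crm.dynamics.com"
ACCESS_TOKEN = "<Azure AD bearer token>"


def top_accounts(n=5):
    """Query account records through the Dataverse Web API (OData)."""
    resp = requests.get(
        f"{ORG_URL}/api/data/v9.2/accounts",
        params={"$select": "name,revenue", "$top": n},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0",
            "Accept": "application/json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["value"]
```

The same OData surface is what third-party systems typically integrate against, which is why the modular design above holds across Finance, SCM, HR, and CRM.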
Challenges
1. Data Migration Complexity: Transferring large volumes of data from legacy systems while ensuring accuracy and integrity can be time-consuming.
2. User Adoption: Resistance from users accustomed to old systems can slow down implementation.
3. Customization Needs: Legacy customizations may not fit into the Dynamics 365 architecture and require redevelopment.
4. Integration with Other Systems: Connecting Dynamics with existing software and third-party tools can be complex.
5. Downtime and Disruption: Migrating to a new system may cause temporary disruptions in business processes.
Overcoming challenges across multiple clients has led us to develop reusable assets, datasets, and tools that streamline the Dynamics 365 implementation process. These resources include pre-configured templates, tested integration frameworks, and proven techniques for data migration and user training. By utilizing these assets, we ensure each project runs smoothly and can achieve faster go-live times. This approach has helped us maintain consistency and efficiency, reducing the overall complexity and risks associated with deploying new systems across different client environments.
Project Completion
Duration: Long-term contract from Jan 2022 for 5 years
The deliverables for the Dynamics 365 implementation projects we have worked on are listed below:
1. System Configuration: Set up Dynamics 365 modules (Finance, SCM, HR, CRM) as per business requirements.
2. Data Migration Report: Detailed report on data migrated from legacy systems, ensuring data integrity.
3. Customization Documentation: Comprehensive documentation of customizations and configurations done on the system.
4. Integration Setup: Configured integration with third-party systems and existing platforms.
5. User Training Materials: Manuals, videos, and training sessions for end-users.
6. Go-Live Support Plan: Detailed plan for system launch and ongoing post-go-live support.
These deliverables ensure a successful and smooth deployment.
Support
Support ranges from 2 weeks of ongoing assistance to around 4 weeks of hypercare support. We have also been supporting 5 customers under an AMC contract.
Testimonial
Working with Peritos has been a highly productive and seamless experience. Their expertise in Microsoft solutions, particularly Dynamics 365, has helped us successfully implement complex projects across multiple clients. Their reusable assets, tools, and consistent approach to handling customizations have ensured smoother transitions and faster go-lives. The team’s commitment to delivering high-quality results and providing ongoing support has strengthened our partnership, and we look forward to future successes together.
– Abhi Ahuja, ESS Founder
Executive Summary

About Client
The customer, Tonkin + Taylor, is involved in environmental consulting and meteorological services, focusing on providing high-resolution meteorological data for various applications, including air quality analysis, weather forecasting, and climate risk assessment. Their offerings are centered around advanced data modeling using the Weather Research and Forecasting (WRF) model, which requires significant computational resources due to its ability to generate detailed meteorological datasets.
Project Background - AWS Custom Product for Weather Research Forecasting
Peritos was hired to address these challenges by developing a comprehensive system that could:
- Efficiently run the WRF model using an HPC cluster.
- Automatically create and manage HPC cluster jobs on receiving new data requests.
- Automatically manage data resolution adjustments.
- Provide a seamless experience for customers through an easy-to-use online platform.
- Enable the commercialization of the datasets, ensuring that the customer could capitalize on the broad applicability of their data across multiple disciplines.
Scope & Requirement
Implementation
Technology and Architecture
The architecture of this application efficiently handles the computational intensity of the WRF model, scales dynamically with demand, and provides a seamless experience for users. The integration of various AWS services ensures that the solution is robust, secure, and scalable.
Overall Workflow
1. User Request: Users input data parameters and request pricing. If satisfied, they proceed with the purchase.
2. Processing Trigger: Upon payment confirmation, the system triggers the data processing workflow.
3. WRF and WPS Processing: The ParallelCluster performs the necessary computations to generate the meteorological data.
4. Post-Processing: Any additional processing is done before the final data is stored.
5. Download and Notification: Users are notified and provided with a link to download their processed data (a sketch of this orchestration follows).
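A minimal sketch of how this workflow could be expressed as an AWS Step Functions state machine (Amazon States Language, registered via boto3) is below; the function names, ARNs, and role are placeholders, not the project's actual resources.

```python
import json
import boto3

# Placeholder ASL definition mirroring the five workflow steps above.
definition = {
    "StartAt": "TriggerProcessing",
    "States": {
        "TriggerProcessing": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:on-payment-confirmed",
            "Next": "RunWrfJob",
        },
        "RunWrfJob": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:submit-wrf-slurm-job",
            # Retries cover transient cluster/scheduler failures.
            "Retry": [{"ErrorEquals": ["States.TaskFailed"], "MaxAttempts": 2}],
            "Next": "PostProcess",
        },
        "PostProcess": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:post-process",
            "Next": "NotifyUser",
        },
        "NotifyUser": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:REGION:ACCOUNT:function:send-download-link",
            "End": True,
        },
    },
}

sfn = boto3.client("stepfunctions")
sfn.create_state_machine(
    name="wrf-data-pipeline",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::ACCOUNT:role/wrf-sfn-role",  # placeholder role
)
```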
Technology
The web app was deployed with the following technology components:
• Backend Code: .NET, C#, Python
• Web App code: Nextjs
• Database: PostgreSQL
• Cloud: AWS
Integrations
• Google APIs
• Stripe
• Auth0
• SendGrid
• Slurm APIs
High-Performance Computing (HPC) Environment
• AWS ParallelCluster: Provides the compute infrastructure needed to run the WRF model and WPS processes. This cluster is set up dynamically and scaled according to the computational demands of the task, ensuring efficient resource usage.
• Head Node and Compute Fleet: The head node manages the compute fleet, which executes the high-compute WRF and WPS processes.
• FSx for Lustre: High-performance file storage integrated with the ParallelCluster, used to store and access the large datasets generated during processing.
Processing and Orchestration
• AWS Lambda Functions: Used extensively to orchestrate the steps in the data processing workflow, including submitting jobs to the cluster’s Slurm scheduler (a sketch follows this list).
• AWS Step Functions: Orchestrates the entire workflow by coordinating Lambda functions, managing state transitions, and handling retries or errors.
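For the job-submission step, here is a hedged sketch of a Lambda handler that submits a job through the Slurm REST API (slurmrestd) on the cluster head node, matching the Slurm APIs listed under integrations. The endpoint, API version, token wiring, job script, and the `request_id` field are all assumptions; the exact payload shape depends on the slurmrestd version deployed.

```python
import json
import os
import urllib.request

# Assumptions: slurmrestd is reachable from the Lambda's VPC on the head
# node, and a Slurm JWT is injected via environment variables.
SLURM_URL = os.environ.get(
    "SLURM_URL", "http://head-node:6820/slurm/v0.0.38/job/submit")
SLURM_TOKEN = os.environ["SLURM_JWT"]


def lambda_handler(event, context):
    """Submit a WRF run to the ParallelCluster scheduler via the Slurm REST API."""
    payload = {
        # Illustrative job script; the real one stages WPS/WRF inputs on FSx.
        "script": "#!/bin/bash\nrun_wrf.sh",
        "job": {
            "name": f"wrf-{event['request_id']}",
            "partition": "compute",
            "environment": {"REQUEST_ID": event["request_id"]},
        },
    }
    req = urllib.request.Request(
        SLURM_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "X-SLURM-USER-TOKEN": SLURM_TOKEN,
            "X-SLURM-USER-NAME": "ec2-user",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read())
    # Step Functions uses the returned job id to poll for completion.
    return {"job_id": result.get("job_id")}
```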
Features of Application
The solution leverages AWS cloud services to generate, process, and distribute high-resolution meteorological data.
Users interact via an interface hosted on AWS Amplify, secured by AWS WAF and Shield, with APIs managed by Amazon API Gateway.
The system orchestrates data processing using AWS Lambda functions and AWS Step Functions, coordinating tasks such as WRF and WPS processing on an AWS ParallelCluster.
FSx for Lustre provides high-performance storage, while Amazon S3 and Aurora DB handle data storage and transaction management.
Post-processing is done on EC2 instances, with notifications sent via SNS. The solution efficiently manages the high computational demands of the WRF model, scales dynamically, and ensures secure, seamless data access for internal and external users.
Challenges
- Challenge 1: High Computational Demand: The WRF model’s capacity to produce highly detailed meteorological datasets necessitates extensive computational power, which made running it on the customer’s existing local infrastructure impractical. The challenge was to find a solution that could efficiently handle large-scale data generation with optimum costing.
- Solution: This challenge was met by implementing an AWS-based high-performance computing (HPC) cluster, specifically AWS ParallelCluster, which provided the necessary computational resources to run the WRF model efficiently. The jobs on ParallelCluster were created and managed dynamically using AWS Step Functions and AWS Lambda, utilizing the Slurm APIs.
- Challenge 2: User Experience and Commercialization: To monetize their meteorological data, the customer needed to create an accessible, user-friendly portal where external users could easily select regions, adjust data resolution, and purchase datasets. The portal needed to be intuitive, efficient, and fully capable of handling secure transactions, which was essential for the success of the customer’s business model.
- Solution: This challenge was addressed by developing a web-based portal using AWS Amplify, integrated with AWS WAF and Shield for security, and managed via Amazon API Gateway. This platform provided a seamless user experience, enabling external customers to effortlessly interact with the system, select their data parameters, and complete purchases, thereby facilitating the commercialization of the datasets and enhancing revenue streams.
Project Completion
Duration
- Jan 2024 – Aug 2024 ~ Implementation and Support
Deliverables
• Architecture review and sign-off of the AWS services setup by internal teams and the existing vendors of Landcheck, following the AWS Well-Architected Framework to ensure security, scalability, and performance are up to the mark.
• A custom web application developed by the Peritos team working closely with the client’s product owner, completing changes, bug fixes, and critical features prior to go-live to ensure a smooth release.
• We are still working on the handover documents and preparing for the final go-live.
Testimonial
Awaited
Next Phase
We are now looking at the next phase of the project which involves:
1. Ongoing Support and adding new features every Quarter with minor bug fixes
2. Adding support for more countries
Looking For Similar Services? Please Get In Touch
Executive Summary

About Client
Custom Web App Development
Bayleys is a family-owned and operated real estate agency serving New Zealand and Fiji. As the largest full-service real estate agency in the country, they offer a comprehensive array of property-related services and sector-specific expertise. Their business model is founded on trust, reliability, discretion, and exceeding clients’ expectations. Their in-depth experience, knowledge, and successful track record are proven across their full-service business lines. They are committed to delivering world-class service and results.
https://www.bayleys.co.nz/
Location: New Zealand
Project Background
Bayleys envisioned an upgrade, seeking to replace their dated desktop application with a cutting-edge cloud-based alternative. Embracing modern technology, this transition aimed to enhance efficiency, accessibility, and collaboration within the organization, aligning seamlessly with contemporary industry standards. The move to a cloud-based app promises to revolutionize their operations, providing a flexible and scalable platform that empowers their team and delivers an exceptional experience to their clientele.
Scope & Requirement For Custom Web App Development
During the 1st Phase of the web app development, we discussed the implementation as follows:
- Review existing AWS environment
- Create a new web app for searching and adding new property details
- Migrate data from existing database to AWS
Implementation
Technology And Architecture Of Custom Web App Development
Technology/ Services used
The web app was deployed with the following technology components:
- Backend Code: .NET 6, C#
- Web App code: Next.js
- Database: PostgreSQL
- Cloud: AWS
Integrations:
- Google APIs
Security:
- AWS WAF is used for the firewall
- All API endpoints are token-based
Scalability
- The application is designed to run on serverless services so that it can scale up and down automatically based on usage.
Cost Optimization
- Alerts and notifications are configured in AWS to notify if the budget is being exceeded (a minimal sketch of such an alert follows this list).
- Deployed on serverless infrastructure, the application does not incur additional costs when it is not in heavy use.
- Peritos, as a cloud partner, manages the environment for the client, keeping a close watch on cost and finding ways to optimize it.
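A minimal sketch of such a budget alert using the AWS Budgets API (boto3) is below; the account ID, budget amount, and recipient address are placeholders.

```python
import boto3

budgets = boto3.client("budgets")
ACCOUNT_ID = "123456789012"  # placeholder account

# Notify the team when actual spend crosses 80% of a monthly cap.
budgets.create_budget(
    AccountId=ACCOUNT_ID,
    Budget={
        "BudgetName": "monthly-cap",  # illustrative name
        "BudgetLimit": {"Amount": "500", "Unit": "USD"},
        "TimeUnit": "MONTHLY",
        "BudgetType": "COST",
    },
    NotificationsWithSubscribers=[{
        "Notification": {
            "NotificationType": "ACTUAL",
            "ComparisonOperator": "GREATER_THAN",
            "Threshold": 80.0,
            "ThresholdType": "PERCENTAGE",
        },
        "Subscribers": [{"SubscriptionType": "EMAIL",
                         "Address": "ops@example.com"}],
    }],
)
```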
Code Management, Deployment
- CI/CD is implemented to automatically build and deploy any code changes
Challenges In Implementing Custom Web App Development
- Reuse the existing code logic
- Map the complicated calculation logic from the existing app to the new app
- Retain all features yet give it a better user experience
Project Completion
Duration Of Web App Development Implementation
May 2024 to Aug 2024
Deliverables for Custom Web App Development
- A new modernized cloud-based app
- User Guide
- Unit testing document
Support for Web App Development
- 1 month extended support
- A template for a CloudFormation stack to create more AWS resources using the available stacks
- Screen-sharing sessions with a demo of how the services and new workloads can be deployed.
Testimonial
Awaited
Next Phase
Awaited

Looking For Similar Services? Please Get In Touch
Executive Summary

About Client
AWS Compute & High-performance Computing
Tonkin + Taylor is New Zealand’s leading environment and engineering consultancy with offices located globally. They shape the interfaces between people and the environment, including earth, water, and air. They have won awards such as the Beaton Client Choice Award for Best Provider to Government and Community (2022) and the IPWEA Award for Excellence in Water Projects for the Papakura Water Treatment Plant (2021).
https://www.tonkintaylor.co.nz/
Location: New Zealand
Project Background
Tonkin + Taylor was embarking on launching a full suite of digital products and zeroed in on AWS as their choice of cloud environment. They wanted to accelerate their digital transformation and add greater business value through AWS development environment best practices. To achieve this, we needed to configure AWS Compute & High-Performance Computing, following best practices and meeting compliance standards, so that it could serve as a foundation for implementing more applications. The AWS Lake House is a central data hub that consolidates data from various sources and caters to all applications and users. It can quickly identify and integrate any data source. The data goes through a meticulous 3-stage refining process: Landing, Raw, and Transformed. After refinement, it is added to the data catalog and is readily available for consumption through a relational database.
Scope & Requirement for AWS Compute & High Performance Computing
The 1st Phase of the AWS Environment Setup discussed implementation as follows:
- Implement Data Lakehouse on AWS
Implementation
Technology and Architecture of AWS Compute & High Performance Computing
Read more on the key components that defined the Implementation of Data Lakehouse on AWS for Tonkin + Taylor
Technology/ Services used
We used AWS services and helped them set up the following:
- Cloud: AWS
- Organization setup: Control tower
- AWS SSO for authentication using existing AzureAD credentials
- Policies setup: Created AWS service control policies
- Templates created for using common AWS services
Security & Compliance:
- Tagging Policies
- AWS config for compliance checks
- NIST compliance
- Guardrails
- Security Hub
Network Architecture
- Site-to-site VPN architecture using Transit Gateway
- Distributed AWS Network Firewall
- Monitoring with CloudWatch and VPC flow logs
Backup and Recovery
- Cloud systems and components followed AWS’s Well-Architected Framework, and the resources were all deployed with multi-zone availability and an uptime of 99.99% or more.
Cost Optimization
- Alerts and notifications are configured for AWS costs
Code Management, Deployment
- CloudFormation scripts for creating StackSets and scripts for generating AWS services were handed over to the client (a deployment sketch follows)
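As an illustration, deploying one of those handed-over templates could look like the boto3 sketch below; the template path, stack name, and parameters are assumptions.

```python
import boto3

cfn = boto3.client("cloudformation")

# Illustrative template path and stack name.
with open("templates/data-pipeline.yaml") as f:
    template_body = f.read()

cfn.create_stack(
    StackName="tt-data-pipeline-dev",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],
    Parameters=[{"ParameterKey": "Environment", "ParameterValue": "dev"}],
)

# Block until the stack finishes creating.
cfn.get_waiter("stack_create_complete").wait(StackName="tt-data-pipeline-dev")
```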
AWS Compute & High Performance Computing Challenges & Solutions
- Diverse data sources: data analytics, clean-up, and integration patterns to pull data from different data sources.
- On-premise data connection to data lake migration: a secure site-to-site AWS connection was implemented.
- Templatized format for creating pipelines: created scripts in a specific format, deployment scripts, and CI/CD scripts.
Project Completion
Duration of AWS Compute & High Performance Computing
Apr 2023 to July 2023 ~ 4 months
Deliverables for AWS Compute & High Performance Computing
- Create scripts to create and deploy pipelines
- Implement Data Lakehouse
Support
- Providing ongoing support as we are a dedicated development partner for the client
Testimonial
After we set up the environment and enabled the client to start using it, they were eager to roll out apps using cloud resources. It was exciting to see the client using the environment extensively. We also took feedback from stakeholders, as below:

Tonkin + Taylor has initiated its Digital Transformation journey, and AWS is one of the key partners in its effort to enable a digitally savvy organization that provides an excellent customer and employee experience. As part of this journey, we have received great support from Peritos as our AWS Partner. The team at Peritos is knowledgeable, brings previous experience in enabling other organizations on this journey, and delivers with great quality and timeliness. T+T has set up its AWS platform with the support of AWS and Peritos, which has enabled us to provide our engineers and clients an environment for data ingestion, transformation, and hosting of multiple applications, analytics, data science & visualization.
Santosh Dixit
Digitization Delivery Lead
Next Phase
We are now looking at the next phase of the project, which involves:
- API and file-based data sources to be added
- Process data to be used in different applications for ingesting in other applications

Looking For Similar Services? Please Get In Touch
Executive Summary

About Client
ABDM-Compliant Hospital Management Software for hospitals of all sizes.
Ekanshi Solutions Pvt Ltd offers expert management consultation services to healthcare organizations. They provide strategic guidance and support to help organizations achieve their goals. With their in-depth expertise and industry knowledge, they help organizations optimize their operations, make informed decisions, and achieve excellence in patient care.
https://ekanshisolutions.com/
Location: Lucknow, Uttar Pradesh, India
Project Background
Ekanshi Solutions reviews its clients’ hospitals and clinics to ensure they meet compliance requirements. To achieve this, we recommended developing a software solution that meets the basic compliance requirements and also eases the operational burden on hospitals.
- Registration and demographic data collection.
- Patient history and medical record management.
- Appointment scheduling and reminders.
- Patient check-in and waiting list management.
- ABDM compliance for the M1, M2, and M3 scenarios, to create and verify ABHA IDs and to manage patient records.
- Movement of this on-premise app to cloud-based infrastructure, aimed at improving performance, ensuring data security, and enabling seamless integrations with other digital health services.
- Automated AWS HIPAA compliance checks, aligned with best practices.
Scope & Requirement for ABDM-Compliant Hospital Management Software
In the 1st Phase of custom application development, we discussed the implementation as follows:
- A customized app that helps generate ABHA IDs and integrates ABDM-compliant APIs
- The client hospital team should be able to view patient records easily and send them to and receive them from the central server
- Able to book appointments and schedule reminders easily.
- We would create a web version of the app to manage the above functionality, replacing the paper-based and unorganized work the admin team was doing.
- Plan and execute the migration of application code, data, and databases from the on-premise system to the selected cloud platform.
- Ensure minimal downtime by utilizing cloud migration tools and strategies, such as database replication, to synchronize on-premise data with the cloud.
- Compliance with HIPAA, using Config rules for ongoing monitoring of compliance
Implementation
Technology and Architecture of Hospital Management Software
Technology/ Services used
The web app was deployed with the following technology components:
- Backend Code: .NET Core, C#
- Web App code: AngularJS
- Database: PostgreSQL
- Cloud: AWS
Integrations:
- Google APIs
- ABDM Integration
- Auth0
- SendGrid
Security:
- AWS WAF is used for the firewall
- All API endpoints are token-based (a token-validation sketch follows this list)
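As a hedged sketch of that token-based protection, the snippet below validates an Auth0-issued RS256 JWT with PyJWT before a request is served; the tenant domain and audience are placeholders.

```python
import jwt  # pip install PyJWT
from jwt import PyJWKClient

# Placeholders -- the real tenant domain and API audience are configured
# per environment.
JWKS_URL = "https://YOUR_TENANT.auth0.com/.well-known/jwks.json"
AUDIENCE = "https://hms-api.example.com"

jwks_client = PyJWKClient(JWKS_URL)


def verify_request_token(token: str) -> dict:
    """Reject any API call whose bearer token does not verify."""
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    # Raises jwt.InvalidTokenError on bad signature, audience, or expiry.
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=AUDIENCE,
    )
```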
Scalability
- The application is designed to run on serverless services so that it can scale up and down automatically based on usage.
Backup and Recovery
- Automated backups are configured to back up the database and store multiple copies of the backup.
Cost Optimization
- Alerts and notifications are configured in AWS to notify if the budget is being exceeded.
- Deployed on serverless infrastructure, the application does not incur additional costs when it is not in heavy use.
- Peritos, as a cloud partner, manages the environment for the client, keeping a close watch on cost and finding ways to optimize it.
Code Management, Deployment
- CI/CD is implemented to automatically build and deploy any code changes
Features of the Application
- Integrated Patient Profile with NDHM: This application seamlessly integrates with NDHM, enabling the swift creation of ABHA IDs and facilitating the exchange of patient health data. By interfacing with the National Digital Health Mission, the system ensures that patient data is standardized, up-to-date, and easily accessible, fostering more informed medical decisions.
- Multi-tenancy Architecture: The system’s ability to cater to multiple hospitals or health providers under a single unified platform is a significant advantage. Each hospital can manage its operations while benefiting from centralized updates and features, ensuring scalability and simplifying administrative tasks.
- Data Encryption at Rest and In Transit: Implemented encryption using AWS Key Management Service (KMS) for both data at rest (S3, EBS, RDS) and in transit (SSL/TLS) to ensure compliance with GDPR and HIPAA requirements for securing sensitive data (a minimal sketch follows this list).
- Identity and Access Management (IAM): Designed and enforced strict least-privilege access policies using AWS IAM. This included creating custom roles and policies with granular permissions for specific users and services, ensuring only authorized personnel had access to sensitive data.
- AWS Config and Compliance Rules: Set up AWS Config to track and audit configuration changes across the environment. Applied AWS Config Rules to continuously monitor compliance against GDPR and HIPAA requirements, such as encryption enabled on S3 buckets and logging for API Gateway and Lambda.
- Audit Logging and Monitoring: Configured AWS CloudTrail and Amazon CloudWatch for continuous logging and monitoring of API calls, changes, and actions within the AWS environment. This was crucial for meeting HIPAA requirements for audit trails and GDPR’s data access visibility.
- VPC Flow Logs and Security Groups: Deployed Virtual Private Cloud (VPC) with properly configured flow logs to monitor and log network traffic. Used AWS Security Groups and Network ACLs to ensure secure network segmentation and prevent unauthorized access to sensitive resources.
- Data Residency and Data Transfer Controls: Implemented controls to ensure data residency compliance by restricting data storage and processing to specific AWS regions as required by GDPR. Utilized VPC endpoints and AWS Direct Connect to secure data transfers and reduce the exposure to the public internet.
- Backup and Disaster Recovery: Designed an automated backup strategy using AWS Backup to meet GDPR’s requirement for data recoverability, ensuring regular snapshots of critical databases (e.g., RDS, DynamoDB) and storing them in encrypted S3 buckets across different regions for redundancy.
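A minimal sketch of the encryption-at-rest and Config-rule items above (boto3); the bucket name and key alias are illustrative, and the managed rule shown is only one of several that could back the HIPAA/GDPR checks.

```python
import boto3

s3 = boto3.client("s3")
config = boto3.client("config")

BUCKET = "hms-patient-records"   # illustrative bucket
KMS_KEY_ID = "alias/hms-data"    # illustrative CMK alias

# Default-encrypt everything written to the bucket with the KMS key.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": KMS_KEY_ID,
            },
            "BucketKeyEnabled": True,
        }]
    },
)

# Managed AWS Config rule that continuously flags unencrypted buckets.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "s3-sse-enabled",
        "Source": {
            "Owner": "AWS",
            "SourceIdentifier": "S3_BUCKET_SERVER_SIDE_ENCRYPTION_ENABLED",
        },
    }
)
```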
Challenges in Implementing ABDM-Compliant Hospital Management Software
- Integration with the ABDM APIs was needed to achieve compliance; however, the API documentation was not up to date, and the API versions kept changing. As the APIs moved from v1 to v3 during development, we had to revisit the integrations and refactor code to ensure the latest set of APIs was used.
- The PwC team provided help, explaining the API endpoints and the test scenarios to cover so that the app’s compliance checks could pass.
- Testing of the application with multiple end users who were experts in their domain was a challenge.
- We found the data quite complicated to understand and relied on the client’s team to test and inform us about the expected result in case of any issues. Additionally, we identified key users such as doctors, administrators, nurses, department heads, etc., to ensure coverage of all user scenarios.
- Given the sensitive nature of medical data, ensuring robust security measures against breaches and unauthorized access is paramount.
- The hospital management application ensured data security and privacy through end-to-end encryption for both data at rest and in transit. AWS’s suite of security tools, including IAM for access control, KMS for key management, and VPCs for network isolation, were leveraged. We fortified the APIs with security tokens and rate limiting and conducted regular training sessions for staff on security best practices.
Project Completion
Duration of Hospital Management Software Implementation
Jan 2023 – Dec 2023 (~1 year): 1st version
2nd version: Jan 2024 – Present. Currently working on reporting, enhancements, and billing, plus in-patient and out-patient feature additions along with M2 billing.
Deliverables for ABDM-Compliant Hospital Management Software
• Setting up the AWS environment for the client system
• Custom web application for two environments: production and UAT
• We delivered the features as agreed in the scope:
- Registration and demographic data collection.
- Patient history and medical record management.
- Appointment scheduling and reminders.
- Patient check-in and waiting list management.
- ABDM compliance for the M1, M2, and M3 scenarios, to create and verify ABHA IDs and to manage patient records
- HIPAA compliance report for managing workloads and following HIPAA best practices, along with an ongoing monitoring report.
- We developed the following set of core features:
User: Manages user registration, authentication, roles, and permissions.
Hospital: Multi-tenant application to handle hospital registration, department management, and related configurations.
Doctor: Manages doctor profiles, availability, specialties, and associated scheduling.
Patient: Interfaces with ABDM for patient data operations, ABHA ID creation, and retrieval of patient health history.
Support
- As part of the project implementation, we provide 2 months of ongoing extended support.
- This also includes 20 hrs a month of development for minor bug fixes and an SLA to cover any system outages or high-priority issues.
Testimonial
After working on the project for 6 months, we took feedback from the product owner, with whom we worked closely on project execution:

Peritos and AWS have been instrumental in transforming hospital operations for our clients. They empowered us to create a custom multi-tenant application that not only meets our current needs but also positions us for future growth and innovation, which we can showcase to our larger client base and prospects. With a solid system now in place, we have the confidence to continue our mission of providing exceptional healthcare services to our community, knowing that our technology backbone is secure, reliable, and ready to scale. We are happy with the services and look forward to completing more projects with the Peritos team.
Akanksha Niranjan
OWNER, EKANSHI SOLUTIONS
Next Phase
We are now looking at the next phase of the project which involves:
1. Ongoing support and adding new features every quarter with minor bug fixes
2. Electronic Medical Records (EMR) Integration: Incorporate a system that not only stores patient data but also tracks the entire medical history, including medications, allergies, etc.
3. AI-Powered Predictive Analysis: Use AI and machine learning to analyze patient data for potential health risks, helping doctors make informed decisions

Looking For Similar Services? Please Get In Touch
Executive Summary

About Client
CumulusPro offers a cloud-based digital image processing, border detection, and verification platform that delivers a quick and efficient onboarding experience for your customers. CumulusPro helps businesses rapidly transform into digital enterprises by linking people, processes, and applications. Their cloud-based Business Process Management (BPM) platform is designed to revolutionize how organizations and public institutions digitally communicate and collaborate with their customers, citizens, and partners. Digital businesses improve customer experience and increase customer satisfaction, business efficiency, and productivity while reducing cost and time to market.
https://cumuluspro.com/solutions/identify-plus/
Location: Singapore
Project Background - SAP Support Services - ID Card Detection
This project explains how we implemented border detection and processed the captured images. This involved taking pictures from the browser camera, capturing images across different browsers and devices, storing the images in blob storage, and storing the relevant information on SQL Server. Digital processing of these images matters because processes are now digitalised: people photograph different ID cards (driving licenses, passports, etc.) and upload them for various purposes, such as filling in online applications and forms. A first level of image processing is applied to these images before they are uploaded into the system, to handle issues such as white space around the card and skew in the captured image, enabling seamless application processing.
Scope & Requirement
The scope of work for image processing border detection (custom app development, ID card detection, and web app development) was as below:
- Ensure the captured image is correct and stored in a suitable format. If the image is not correct, it should either be retaken or corrected using a series of algorithms that we implemented
- Ensure image classification is done, as there were 2 IDs being scanned
- Change the orientation of the images, de-skewing them to 0 degrees
- Refine the resolution of the images and detect borders to crop the image
- If the image cannot be corrected, ask the user to retake it before uploading it to the system
- If taken successfully, store the image on the BLOB server and run OCR to store the important identification information in the corresponding server
- Pre-process to convert the image to greyscale, reduce noise, and improve image quality and size using PyrDown and PyrUp
Implementation
Technology and Architecture
Technology
The mobile app was deployed with the following technology components:
• Backend Code: .NET Core, C#, Node.js
• Mobile App code: React Native
• Web App code: ReactJS
• Database: SQL Server, MongoDB
• Cloud: Microsoft Azure
Integrations
• Integration to read all data from an existing Shopify backend.
• Single Sign-on using Auth0 to register using Google and follow same login procedure as on the Shopify web app.
• Sendgrid to send emails for order processing events
Security:
• All API endpoints are tokenized
• A payment gateway like PayPal would be used, which is secure and PCI DSS compliant.
Backup and Recovery
The cloud systems and components used are secure and carry a 99.99% SLA. We have added an HA/DR mechanism to create replicas of the services.
Scalability
The application is designed to scale up to 10X the average load received in the first 6 months of its usage, and all cloud resources are configured for auto-scaling based on load.
Cost Optimization
Alerts and notifications are configured in the Azure cloud to ensure that if the budget is exceeded, a notification is sent out.
Code Management, Deployment
Code for the app is handed over to the client through Microsoft AppCenter.
CI/CD is implemented to automatically add, build and deploy any code changes on the app side.
Features of Image Processing Border Detection-custom app development
- Automatic ID card border detection and the image cropping
- Removal of skewness (if present)
- Correction of orientation
Challenges
We encountered some issues, as below:
- Establishing pre-defined system criteria that need to be met before uploading the picture. This included deciding the threshold values for the image processing algorithms used to remove white space at the sides of the ID card and skew in the captured image
- The Canny algorithm used for border detection had some limitations and didn’t work in all scenarios, e.g., where the ID card had a more prominent rectangle drawn on it, or a white ID card was photographed against a white background. These cases didn’t give a correct collection of lines and could result in incorrect border detection. To overcome this problem, we applied FindContours to the binary image, which gave us a more reliable result
- We had to cater for devices of varying power and capability, since the solution had to run on various platforms. As we got into the testing phase, we had to avoid large variations in responsiveness depending on the device used
Project Completion
Duration
Deliverables
We used the OpenCV library for image processing along with the EmguCV wrapper, which is compatible with .NET languages (C#, VB, VC++, etc.). We implemented the solution through the following steps (a sketch follows this list):
Down-sample -> Noise reduction -> Up-sample -> Image enlargement
- Edge detection using the Canny algorithm (to convert the image into a collection of lines)
- Detecting the outermost lines for border detection through the probabilistic Hough transform, filtering lines by width and length
- Using FindContours to find objects at the outermost location (to overcome the shortcomings of the Hough transform)
- Post-processing, which includes cropping the image along the border, producing a processed image containing the complete ID card, and removing any skew present
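The project implemented this pipeline in C# with EmguCV; for illustration only, here is the same sequence sketched in Python with OpenCV’s bindings. The Canny thresholds and the simple crop at the end are assumptions, not the production tuning.

```python
import cv2
import numpy as np


def extract_id_card(path: str) -> np.ndarray:
    """Sketch of the pipeline above: denoise, edge-detect, find the card
    contour, then crop and de-skew. Thresholds are illustrative."""
    img = cv2.imread(path)

    # Down-sample then up-sample (pyrDown/pyrUp) to reduce noise.
    smooth = cv2.pyrUp(cv2.pyrDown(img))
    gray = cv2.cvtColor(smooth, cv2.COLOR_BGR2GRAY)

    # Canny turns the image into a collection of edges.
    edges = cv2.Canny(gray, 50, 150)

    # FindContours locates the outermost object, which is more robust than
    # Hough lines when the card itself contains printed rectangles.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    card = max(contours, key=cv2.contourArea)

    # minAreaRect gives the card's rotation; warp to remove the skew.
    (cx, cy), (w, h), angle = cv2.minAreaRect(card)
    m = cv2.getRotationMatrix2D((cx, cy), angle, 1.0)
    rotated = cv2.warpAffine(img, m, (img.shape[1], img.shape[0]))

    # Crop the now axis-aligned card region.
    x, y = int(cx - w / 2), int(cy - h / 2)
    return rotated[max(y, 0):y + int(h), max(x, 0):x + int(w)]
```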
Support
As part of the project implementation, we have a standard practice of providing 1 month of extended support. This includes any major or minor bug fixes.
Testimonial
We took feedback from stakeholders as below:

Syed Mohd. Atif
Co-Founder Enticed Retail LLP
Next Phase
We are now looking at the next phase of the project, which involves:
1. Images may not be of the same template, so machine learning can be added to identify the image template and process the image accordingly.
2. For a specific template, there should be a training model that helps the system know how a new image template would look and adapt to new template images faster.

Looking For Similar Services? Please Get In Touch
Executive Summary

About Client
DEEPIQ simplifies industrial analytics and offers the only self-service application for end-to-end (Data + AI) Ops for the industrial world. With this app, your SMEs can digitize their expertise, and your data teams can build and deploy robust analytic pipelines.
Location: Offices globally in the USA, India, and Canada
Project Background
We help clients move to the cloud from their existing system landscape. For this particular case study, we helped the client DEEPIQ move their website hosted on WordPress to Azure.
Scope & Requirement
Implementation
Technology and Architecture
Technology
The migration was deployed with the following technology components:
• For Azure Dataverse: the underlying technology used was Azure SQL Database
• For Azure Blob Storage: it supported the most popular development frameworks, including Java, .NET, Python & Node.js
Security & Compliance:
- Tagging Policies
- Azure config for compliance checks
- NIST compliance
- Guardrails
- Security Hub
Backup and Recovery
Azure Backup provided a simple, secure, cost-effective, cloud-based backup solution to protect business- or application-critical data stored in Azure Blob in two ways: continuous backups and periodic backups.
Network Architecture
- Site-to-site VPN architecture using Transit Gateway
- Distributed Azure Network Firewall
- Monitoring with CloudWatch and VPC flow logs
Cost Optimization
- Alerts and notifications are configured for Azure costs
Code Management, Deployment
- CloudFormation scripts for creating StackSets and scripts for generating Azure services were handed over to the client
Challenges
We encountered some issues as below:
- We faced some issues pertaining to system changes not working as per the new standards
- Creating a backup strategy without proper backup tools in place
- Ensuring the end-to-end migration did not lead to data loss
- Additional security enhancements had to be added to ensure unauthorized users could not modify reports
Project Completion
Duration
Deliverables
- Migration of data and applications securely without impacting the existing user base
- Using Traffic Manager to configure routing to 2 different websites
- Adding Auth0, with Traffic Manager routing traffic to different domains
Testimonial

Evania Fernandes
Building Manager
Ultimate Property Group
Next Phase

Looking For Similar Services? Please Get In Touch
Executive Summary

About Client
The world today is witnessing a growing trend in the use of technology in the health sector. This allowed us to assist our client, a pharmaceutical company, in tracking medical devices and the quality of medicines on the go, along with inventory and transit status, and we supported them with AWS IoT integration.
Project Background
In this case study, we achieved the following:
- Implemented a small AWS IoT integration application with a toolkit to assure product quality, elevate the efficiency of medical devices, and raise alerts in case manual intervention is required
- Set up AWS for the application to manage the devices seamlessly
- Interacted with the devices to fetch vital information
- Finally, created a mobile application and used AWS IoT to monitor the devices
Scope & Requirement
We used the below solution components to create a responsive web application that gives a holistic view of all the devices connected to the system and information on their vital parameters.
Implementation
Technology and Architecture
Technology/ Services used
We used AWS services and helped them set up the following:
- Cloud: AWS
- Organization setup: Control tower
- AWS SSO for authentication using existing AzureAD credentials
- Policies setup: Created AWS service control policies
- Templates created for using common AWS services
Security & Compliance:
- Tagging Policies
- AWS config for compliance checks
- NIST compliance
- Guardrails
- Security Hub
Network Architecture
- Site-to-site VPN architecture using Transit Gateway
- Distributed AWS Network Firewall
- Monitoring with CloudWatch and VPC flow logs
Backup and Recovery
- Cloud systems and components followed AWS’s Well-Architected Framework, and the resources were all deployed with multi-zone availability and an uptime of 99.99% or more.
Cost Optimization
- Alerts and notifications are configured for AWS costs
Code Management, Deployment
- CloudFormation scripts for creating StackSets and scripts for generating AWS services were handed over to the client
Challenges
We encountered some issues, as below:
- AWS setup and pricing were complicated to understand, as they are based on usage and consumption, which was difficult to assess at the start of the application
- Ensuring data privacy and security is of utmost importance in this case, since devices can be hacked without much effort when encryption is poor, allowing unauthorized access
- Impeccable quality assurance of the whole setup had to be achieved in this pharmaceutical-industry case, which involves dealing with medicines and surgical instruments, so there was a need for honest sharing of information if anything was not going as expected
- Understanding the client’s vision of how they needed the UI was challenging
Project Completion
Duration
Deliverables
- Responsive web application
- Login mechanism using 2-factor authentication
- Integration with AWS IoT to send and receive data
- Creating screen designs using SAP Build on Fiori guidelines
- Raspberry Pi 3
- Humidity and temperature sensor to be added
- Using AWS IoT integration core functionality
- Backend in Azure
- Coding in C#, JavaScript, and Angular 5 using Fiori guidelines
- Using MQTT to receive data (a publishing sketch follows this list)
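As a hedged sketch of that MQTT path, the snippet below publishes sensor readings from a device such as the Raspberry Pi 3 to AWS IoT Core using paho-mqtt over TLS; the endpoint, topic, certificate paths, and the stubbed sensor read are all placeholders.

```python
import json
import ssl
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt

# Placeholders: the device certificates are provisioned in AWS IoT Core,
# and the endpoint below stands in for the account's IoT data endpoint.
ENDPOINT = "xxxxxxxx-ats.iot.ap-southeast-2.amazonaws.com"
TOPIC = "devices/pi3/telemetry"

client = mqtt.Client()
client.tls_set(
    ca_certs="AmazonRootCA1.pem",
    certfile="device.pem.crt",
    keyfile="private.pem.key",
    tls_version=ssl.PROTOCOL_TLSv1_2,
)
client.connect(ENDPOINT, port=8883)
client.loop_start()

while True:
    # Stub values standing in for the humidity/temperature sensor driver.
    humidity, temperature = 55.0, 24.3
    client.publish(TOPIC, json.dumps({
        "humidity": humidity,
        "temperature": temperature,
        "ts": int(time.time()),
    }), qos=1)
    time.sleep(60)
```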
Support
- 1 month of extended support
- A template for Cloud formation stack to create more AWS resources using the available stacks
- Screen-sharing sessions with a demo of how the services and new workloads can be deployed.
Testimonial

Evania Fernandes
Building Manager
Ultimate Property Group
Next Phase

Looking For Similar Services? Please Get In Touch
Executive Summary

About Client
New Zealand’s most awarded mortgage & insurance advisor, Global Finance caters to 1,500+ customers for their mortgage or insurance needs every year so that they can meet their financial goals. Global Finance offers more preference & freedom, with loan approvals from numerous lenders if chosen by the customers. Dealing with a large number of clients & team members, Global Finance was facing issues managing their unstructured data. As Peritos had already been managing their Dynamics environment, we successfully guided and supported Global Finance’s move of saved data from Azure Dataverse to Azure Blob Storage, which saved them $1,500 a month.
Project Background
Global Finance has been offering smarter loans and insurance since 1999. Working as one of the best mortgage & insurance advisers in NZ, Global Finance helped clients save on their loans by avoiding unnecessary interest and getting mortgage-free faster. Since the beginning, they have helped some customers become mortgage-free in as little as 7 years rather than the standard 30-year term. Global Finance was already using Dynamics 365 and saving data in Azure Dataverse; the move to Azure Blob Storage, which is optimized for storing massive amounts of unstructured data, now serves them better.
Scope & Requirement
In the 1st Phase of the Windows Virtual Server Setup, implementation was discussed as follows:
- Setting up the saving of data from Azure Dataverse to Azure Blob Storage, which has sustained a lot of unstructured data for Global Finance (a minimal upload sketch follows this list)
- The demands for storing and analyzing large volumes of unstructured data have increased over the past decade, and Azure Blob Storage is one solution that fulfills these enterprise needs accurately
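A minimal sketch of writing exported Dataverse data into Blob Storage with the azure-storage-blob SDK; the connection string, container, and file names are placeholders.

```python
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

# Placeholder for the client's storage account connection string.
CONNECTION_STRING = "<storage account connection string>"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client("dataverse-exports")

# Upload one exported Dataverse record set as an unstructured blob.
with open("accounts-2022-07.json", "rb") as data:
    container.upload_blob(name="accounts/2022-07.json",
                          data=data, overwrite=True)
```

Containers like the one above are also what let Global Finance build its preferred categories, as the testimonial below notes.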
Implementation
Technology and Architecture
Technology
The migration was deployed with the following technology components:
• For Azure Dataverse: the underlying technology used was Azure SQL Database
• For Azure Blob Storage: it supported the most popular development frameworks, including Java, .NET, Python & Node.js
Security & Compliance:
- Tagging Policies
- Azure config for compliance checks
- NIST compliance
- Guardrails
- Security Hub
Backup and Recovery
Azure Backup provided a simple, secure, cost-effective, cloud-based backup solution to protect business- or application-critical data stored in Azure Blob in two ways: continuous backups and periodic backups.
Network Architecture
- Site-to-site VPN architecture using Transit Gateway
- Distributed Azure Network Firewall
- Monitoring with CloudWatch and VPC flow logs
Cost Optimization
- Alerts and notifications are configured for Azure costs
Code Management, Deployment
- CloudFormation scripts for creating StackSets and scripts for generating Azure services were handed over to the client
Challenges of Migrating from Azure Dataverse to Azure Blob Storage
- It was a bit of a challenge to ensure the new environment met all of the compliance criteria after migration while still remaining cost-effective.
Project Completion
Duration
July 2022 ~ 1 week
Deliverables
- Dynamics License
- Power App License
- Power App per use License
- Power App per app license
Support for Dynamics Discounted Licensing
- For all licenses we implement, we provide monthly billing with 20-day credit terms.
- We provide value-added services by sending the client reports on license usage and each user’s last activity date, helping them manage license costs and gain visibility.
Testimonial
- Azure Blob Storage has a lot of organizational features that have solved Global Finance’s storage problem at a lower cost. Despite being developed for unstructured data, containers permit businesses to construct their preferred categories by uploading specific blobs to specific containers.
- Shifting from Azure Dataverse to Azure Blob Storage has given Global Finance a free hand to access objects in Blob Storage via the Azure Storage REST API, Azure PowerShell, the Azure CLI, or an Azure Storage client library.

Global Finance is now securely connected to Blob Storage using SSH File Transfer Protocol (SFTP) and mounts Blob Storage containers using the Network File System 3.0 protocol. Peritos handled the Microsoft Dynamics 365 domain for Global Finance and provided discounted licensing, which proved very cost-effective.
Evania Fernandes
Building Manager
Ultimate Property Group
Next Phase
We are also in discussion with the client about other projects:
1. Dynamics CRM system Support
2. O365 License Management
