Executive Summary
About Client
New Zealand's most awarded mortgage and insurance advisory, Global Finance serves more than 1,500 customers for their mortgage or insurance needs every year, helping them meet their financial goals. Global Finance offers customers more choice and freedom, with loan approvals from numerous lenders if the customer so chooses. Dealing with a large number of clients and team members, Global Finance was facing issues managing their unstructured data. As Peritos had already been managing their Dynamics environment, we successfully guided and supported Global Finance's move of data from Microsoft Dataverse to Azure Blob Storage, which saved them about $1,500 a month.
https://www.globalfinance.co.nz/
Location: Auckland, New Zealand
Project Background
Global Finance has been offering smarter loans and insurance since 1999. As one of the best mortgage and insurance advisers in NZ, Global Finance has helped clients save on their loans by avoiding unnecessary interest and becoming mortgage-free faster. Since the beginning, they have helped some customers become mortgage-free in as little as 7 years rather than the standard 30-year term. Global Finance was already using Dynamics 365 with data stored in Dataverse; moving that data to Azure Blob Storage has given them storage optimized for massive amounts of unstructured data.
Scope & Requirement
In the first phase, the implementation was scoped as follows:
- Set up the flow for saving data from Dataverse to Azure Blob Storage so that Global Finance can retain large volumes of unstructured data
- Meet the demand for storing and analyzing large volumes of unstructured data, which has grown over the past decade; Azure Blob Storage is a solution that fulfills these enterprise needs well
Implementation
Technology and Architecture
Technology
The migration was delivered with the following technology components:
• For Dataverse: the underlying technology is Azure SQL Database
• For Azure Blob Storage: it supports the most popular development frameworks, including Java, .NET, Python, and Node.js (see the sketch below)
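As a simple illustration of working with Blob Storage from one of these frameworks, the sketch below uploads a Dataverse attachment export with the Python SDK and lists what has been migrated so far. The connection string, container name, and file names are placeholders, not the actual migration pipeline.

```python
# Minimal sketch (not the production pipeline): uploading a Dataverse
# attachment export to Azure Blob Storage with the Python SDK.
from azure.storage.blob import BlobServiceClient

connection_string = "<storage-account-connection-string>"  # assumed to come from configuration
service = BlobServiceClient.from_connection_string(connection_string)
container = service.get_container_client("dataverse-attachments")  # hypothetical container

with open("annotation_12345.pdf", "rb") as data:  # hypothetical exported attachment
    container.upload_blob(name="annotations/annotation_12345.pdf", data=data, overwrite=True)

# List what has been migrated so far
for blob in container.list_blobs(name_starts_with="annotations/"):
    print(blob.name, blob.size)
```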
Security & Compliance:
- Tagging Policies
- Azure config for compliance checks
- NIST compliance
- Guardrails
- Security Hub
Backup and Recovery
Azure Backup provides a simple, secure, cost-effective, cloud-based backup solution to protect business- and application-critical data stored in Azure Blob Storage in two ways: continuous backups and periodic backups.
Network Architecture
- Site to Site VPN Architecture using Transit Gateway
- Distributed Azure Network Firewall
- Monitoring with Cloud Watch and VPC flow logs.
Cost Optimization
- Alerts and notifications were configured in Azure Cost Management
Code Management, Deployment
- Infrastructure-as-code scripts for provisioning the Azure services were handed over to the client
Challenges of Migrating from Dataverse to Azure Blob Storage
- It was a challenge to ensure that the post-migration environment met all of the compliance criteria while remaining cost-effective.
Project Completion
Duration
July 2022 ~ 1 week
Deliverables
- Dynamics 365 licenses
- Power Apps licenses
- Power Apps per user licenses
- Power Apps per app licenses
Support for Dynamics Discounted Licensing
- For all licenses we implement, we provide monthly billing with 20-day credit terms.
- We provide value-added services by sending the client reports on license usage and each user's last activity date, helping them manage their license costs and gain visibility.
Testimonial
- Azure Blob Storage has a lot of organizational features that have solved Global Finance's storage problem at a lower cost. Although designed for unstructured data, containers let businesses create their preferred categories by uploading specific blobs to specific containers.
- Shifting from Dataverse to Azure Blob Storage has given Global Finance the freedom to access objects in Blob Storage via the Azure Storage REST API, Azure PowerShell, the Azure CLI, or an Azure Storage client library.
Global Finance can now connect securely to Blob Storage using the SSH File Transfer Protocol (SFTP) and mount Blob Storage containers using the Network File System (NFS) 3.0 protocol. Peritos handled the Microsoft Dynamics 365 domain for Global Finance and provided discounted licensing, which proved very cost-effective.
Building Manager
Global Finance Services
Next Phase
We are also in discussion with the client on other projects:
1. Dynamics CRM system Support
2. O365 License Management
Looking For Similar Services? Please Get In Touch
Executive Summary
About Client
The client, Yorker, is focused on leveraging technology to address the challenge of tracking and managing cricket bowlers' net-practice bowling loads. Recognizing the risk of overtraining and injuries from improper tracking, Yorker aims to provide a digital solution tailored for cricket players. The AWS custom application for Yorker empowers bowlers to automate session recording, create personalized training plans, and monitor progress effectively. The app also fosters a sense of community by enabling interaction, knowledge sharing, and participation in skill-building challenges. The project is being executed in multiple phases, beginning with a Minimum Viable Product (MVP) to establish a strong foundation for future improvements. Yorker's commitment to innovation and user-centric design reflects its dedication to transforming how athletes manage their training and optimize performance while minimizing injury risks.
Project Background - Enhancing Cricket Training through Digital Bowling Load Management
The Yorker mobile app project addresses a major challenge for cricket bowlers: accurately tracking and managing their bowling loads during net practice. Without proper tracking, bowlers risk improper training regimens, leading to overtraining and injuries. The Yorker app offers a digital solution that automates session recordings, capturing key metrics like delivery count, types of deliveries, and intensity levels. Additionally, the app allows bowlers to create personalized training plans, track progress, and receive real-time alerts to avoid overexertion. By leveraging technology, this initiative not only helps reduce injury risks but also fosters a sense of community. Bowlers can share experiences, learn from experts, and engage in skill-enhancing challenges. Ultimately, the app aims to optimize performance while ensuring bowlers train safely and efficiently, revolutionizing the way athletes manage their training.
Scope & Requirement for AWS Custom Application For Yorker
Scope: The first phase of the Yorker mobile application focuses on developing a Minimum Viable Product (MVP) to establish a strong foundation. Specifically, this phase will deliver core functionalities to allow cricket bowlers to start tracking their training sessions and managing their profiles. The scope includes:
- User Authentication: Secure login and registration functionality for bowlers.
- Profile Management: Basic user profile setup, including personal details and preferences.
- Bowling Record Tracking: Automated entry for recording bowling sessions, including delivery count, types, and intensity.
- Basic Reporting: Simple reports summarizing bowling loads to help users monitor their progress.
Requirements:
- Mobile App Development: We will develop the front end using React Native to ensure cross-platform compatibility on iOS and Android.
- Backend Services: Built using .NET with RESTful APIs for data communication.
- Database: RDS Aurora PostgreSQL for structured data storage of user profiles and bowling records.
- CI/CD Pipeline: Set up Continuous Integration/Continuous Deployment processes for efficient development and release.
- User Interface Design: Intuitive and user-friendly UI aligned with branding, focusing on easy data entry and report viewing.
Implementation
Technology and Architecture for AWS Custom Application For Yorker
Read more on the technology and Architecture we used for AWS Custom Application Development
Technology
WAF, API Gateway, Lambda Functions, RDS, S3, CloudWatch, Secrets Manager
Integrations
The application leverages RESTful APIs for smooth data transfer between the front end and back end, facilitating user authentication, session tracking, and profile management. Future integrations may include cloud-based analytics and third-party push notifications to enhance user engagement.
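To make this pattern concrete, here is a minimal sketch (not Yorker's production code) of a Lambda handler behind API Gateway that validates a "record bowling session" request and writes it to Aurora PostgreSQL, reading database credentials from Secrets Manager. The secret name, table name, and payload fields are illustrative assumptions.

```python
# Minimal sketch of the API-to-database path described above.
import json
import os

import boto3
import psycopg2  # packaged as a Lambda layer in this sketch

def _db_connection():
    # Database credentials are read from Secrets Manager, as in the stack above.
    secret_name = os.environ.get("DB_SECRET_NAME", "yorker/db-credentials")  # hypothetical
    secret = boto3.client("secretsmanager").get_secret_value(SecretId=secret_name)
    creds = json.loads(secret["SecretString"])
    return psycopg2.connect(
        host=creds["host"], dbname=creds["dbname"],
        user=creds["username"], password=creds["password"],
    )

def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")
    required = ("bowler_id", "delivery_count", "delivery_types", "intensity")
    missing = [f for f in required if f not in body]
    if missing:
        return {"statusCode": 400, "body": json.dumps({"error": f"missing: {missing}"})}

    # Persist the session record (table and columns are hypothetical)
    with _db_connection() as conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO bowling_sessions (bowler_id, delivery_count, delivery_types, intensity) "
            "VALUES (%s, %s, %s, %s) RETURNING id",
            (body["bowler_id"], body["delivery_count"],
             json.dumps(body["delivery_types"]), body["intensity"]),
        )
        session_id = cur.fetchone()[0]

    return {"statusCode": 201, "body": json.dumps({"session_id": session_id})}
```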
Scalability
The app is designed to run on serverless services, allowing automatic scaling based on usage.
Cost Optimization
Peritos helped optimize costs for Yorker by designing an efficient AWS architecture using auto-scaling, right-sized instances, and serverless technologies. With tools like AWS Cost Explorer and Trusted Advisor, we continuously monitored and reduced spending. Automation through CI/CD pipelines and code optimization further enhanced performance while lowering operational costs.
Backup and Recovery
A robust backup strategy, using Amazon S3, prevents data loss, while automated recovery processes ensure quick restoration in case of failure.
Features of AWS Custom Application For Yorker
Automated Bowling Session Tracking
Capture and record each bowling session, including the number of deliveries, delivery types, and intensity levels, providing players with a detailed log of their training activities.
Personalized Training Plans
Create and customize training plans tailored to individual fitness levels and goals. Players and coaches can adjust these plans based on real-time performance data to optimize training regimens.
Progress Monitoring & Alerts
Track progress against predefined plans, with visual dashboards and alerts to notify users of deviations that may lead to overexertion or injuries.
User Profile & Simple Reporting
Maintain a personalized profile to store training history, generate basic reports on bowling performance, and gain insights to improve overall training effectiveness.
Challenges with AWS Custom Application For Yorker
Accurate Data Capture & Tracking
Ensuring the app reliably records detailed bowling metrics like delivery type, count, and intensity without manual errors poses a challenge, especially in a real-time sports environment.
Scalability & Performance
As user adoption grows, maintaining app performance and scalability will be critical, particularly during peak usage times. Designing a backend that can handle large volumes of data efficiently is essential.
User Engagement & Retention
Encouraging consistent use of the app among bowlers can be challenging. Building features that foster community interaction, personalized plans, and gamified challenges will be crucial to retaining users.
Cross-Platform Compatibility
Delivering a seamless user experience across both iOS and Android devices requires rigorous testing to address device-specific issues, screen resolutions, and performance variations.
Project Completion of AWS Custom Application For Yorker
Duration
- Aug 2024 – Oct 2024: implementation and support
- Oct 2024 – present: rolling out the changes to production
Deliverables
Requirements Specification & Architectural Design Documents
Comprehensive documentation outlining detailed project requirements, technical architecture, and system design.
Minimum Viable Product (MVP)
A fully functional MVP with core features, including user authentication, profile management, automated bowling session tracking, and basic reporting.
Mobile Application UI/UX Design
Intuitive and user-friendly interface designs for the app, ensuring a seamless experience on both iOS and Android devices.
Backend Services & APIs
Development of scalable backend services using .NET, along with RESTful APIs for data communication between the mobile app and server.
CI/CD Pipeline & Deployment
Implementation of Continuous Integration/Continuous Deployment pipelines to automate the build, testing, and deployment processes. Additionally, the initial release is deployed on cloud platforms.
Support
As part of the project implementation, we provide two months of ongoing extended support. This also includes 20 hours a month of development for minor bug fixes and an SLA covering any system outages or high-priority issues.
Testimonial
Awaited
Next Phase
We are now looking at the next phase of the project which involves:
1. Ongoing Support and adding new features every Quarter with minor bug fixes
2. Social & Community Building Features
Looking For Similar Services? Please Get In Touch
Executive Summary
About Client
Enterprise Software Solutions (ESS) provides tailored software licensing and cloud services, specializing in optimizing Microsoft licensing solutions. Headquartered in Phoenix, Arizona, ESS helps clients reduce cloud spending and navigate complex licensing challenges. Their services include Microsoft 365 and Dynamics 365 solutions, cloud licensing optimization, and 24/7 support. ESS focuses on delivering cost-effective software solutions with a strong emphasis on customer service, including fast delivery, a 30-day warranty on most products, and a full refund policy.
Address: Tempe, AZ 85281 , USA
https://www.enterprise-software-solutions.com/
Project Background
Peritos and Enterprise Software Solutions (ESS) have entered a 2-year contract to support multiple enterprise and Microsoft clients. This partnership focuses on providing solutions and services for clients utilizing Microsoft products. The collaboration covers high-level requirements across various projects, ensuring both companies can effectively serve their clients by leveraging Microsoft's extensive suite of tools and services. These include Dynamics 365 F&O, CRM, Sales Hub, Customer Insights, HR, Project Operations, SCM, and Finance.
Requirement
- Supporting enterprise-level implementations of Microsoft tools.
- Addressing high-level needs such as financial operations, customer relationship management, HR processes, and supply chain efficiencies.
- Providing seamless integration across various departments to enhance productivity and business outcomes.
Scope
- Requirement Gathering: Understand client-specific needs and align them with Dynamics 365 capabilities.
- System Implementation: Set up the appropriate Dynamics product for the customer and the relevant modules (e.g., Finance, CRM, SCM) based on the business processes and scope identified.
- Customization: Tailor the system through custom fields, workflows, or integrations as required.
- Data Migration: Transfer data from legacy systems into Dynamics 365 while maintaining integrity.
- User Training and Go-Live: Train users, conduct tests, and ensure a smooth transition to the new system.
- Post-Go-Live Support: Provide ongoing support and system optimizations.
Technology and Architecture
- Cloud-Based Architecture: Dynamics 365 runs on Azure, leveraging its scalability, security, and integration capabilities.
- Microservices and APIs: Modular design using microservices allows seamless integration of Finance, SCM, HR, and CRM with third-party systems.
- Data Management: Utilizes Azure Data Factory for ETL processes, with Azure SQL for data storage and Power BI for embedded analytics (a minimal pipeline-trigger sketch follows below).
- Security: Built-in role-based access control (RBAC) and encryption ensure data protection across the system.
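As an illustration of the Data Management point above, the sketch below triggers an Azure Data Factory pipeline run from Python. The subscription, resource group, factory, pipeline names, and parameters are placeholders, not the delivered configuration.

```python
# Minimal sketch: start and poll an Azure Data Factory pipeline run.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

run = client.pipelines.create_run(
    resource_group_name="rg-d365-migration",        # hypothetical
    factory_name="adf-d365-etl",                     # hypothetical
    pipeline_name="CopyLegacyCustomersToDataverse",  # hypothetical
    parameters={"batchDate": "2024-01-31"},
)
print("Started ADF pipeline run:", run.run_id)

# Poll the run status until it completes
status = client.pipeline_runs.get("rg-d365-migration", "adf-d365-etl", run.run_id)
print("Current status:", status.status)
```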
Challenges
- Data Migration Complexity: Transferring large volumes of data from legacy systems while ensuring accuracy and integrity can be time-consuming.
- User Adoption: Resistance from users accustomed to old systems can slow down implementation.
- Customization Needs: Legacy customizations may not fit into the Dynamics 365 architecture and require redevelopment.
- Integration with Other Systems: Connecting Dynamics with existing software and third-party tools can be complex.
- Downtime and Disruption: Migrating to a new system may cause temporary disruptions in business processes.
Overcoming challenges across multiple clients has led us to develop reusable assets, datasets, and tools that streamline the Dynamics 365 implementation process. These resources include pre-configured templates, tested integration frameworks, and proven techniques for data migration and user training. By utilizing these assets, we ensure each project runs smoothly and can achieve faster go-live times. This approach has helped us maintain consistency and efficiency, reducing the overall complexity and risks associated with deploying new systems across different client environments.
Project Completion
- System Configuration: Set up Dynamics 365 modules (Finance, SCM, HR, CRM) as per business requirements.
- Data Migration Report: Detailed report on data migrated from legacy systems, ensuring data integrity.
- Customization Documentation: Comprehensive documentation of customizations and configurations done on the system.
- Integration Setup: Configured integration with third-party systems and existing platforms.
- User Training Materials: Manuals, videos, and training sessions for end-users.
- Go-Live Support Plan: Detailed plan for system launch and ongoing post-go-live support.
These deliverables ensure a successful and smooth deployment.
Support
Support ranges from 2 weeks of ongoing assistance to around 4 weeks of hypercare. We have also been supporting 5 customers under AMC contracts.
Testimonial
Working with Peritos has been a highly productive and seamless experience. Their expertise in Microsoft solutions, particularly Dynamics 365, has helped us successfully implement complex projects across multiple clients. Their reusable assets, tools, and consistent approach to handling customizations have ensured smoother transitions and faster go-lives. The team’s commitment to delivering high-quality results and providing ongoing support has strengthened our partnership, and we look forward to future successes together.
– Abhi Ahuja, ESS Founder
About Client
Pioneer Institute of Professional Studies is part of Pioneer Group, which was established in 1996. The group is a renowned name in quality education and is one of the oldest private educational institutes in Indore, M.P. Pioneer Group is run and managed by highly qualified and experienced professionals with domain experience in education and industry. It is an autonomous, NAAC-accredited institute.
- Institutional membership of CSI, IMA, ISTE, and AIMS
- Placements in top Fortune 500 companies
- The number of students and teachers has been ever-increasing
http://www.pioneerinstitute.net/
Location: Indore, Madhya Pradesh, India
Project Background
Peritos and Pioneer Institute of Professional Studies have a history of working together. As a Microsoft Partner, Peritos was chosen for the migration from Dynamics AX to Dynamics 365 Finance & Operations (F&O). The migration from Dynamics AX to F&O is often driven by a need for enhanced functionality, cloud scalability, and better integration with other Microsoft services. Dynamics AX, although a robust ERP solution, lacks many of the modern features and cloud capabilities provided by Dynamics 365 F&O. Migrating to F&O offers businesses opportunities for real-time analytics, streamlined workflows, advanced AI capabilities, and improved flexibility for business processes.
Requirement
- End of support for Dynamics AX: the customer's version was reaching end of life, prompting the upgrade.
- Desire to adopt cloud infrastructure: Dynamics 365 F&O offers cloud deployment and integration with other Microsoft Azure services, and the customer wanted to move to the cloud.
- Scalability and flexibility: Dynamics 365 offers modular solutions, meaning companies can scale specific modules to meet evolving business needs. Since they had a list of customizations, it was better to address them with the newer implementation.
- Seamless integration with Microsoft 365 and the Power Platform (Power BI, Power Automate, etc.) for more connected, insightful business management.
- Enhanced user experience: F&O provides a more intuitive, web-based interface that improves user productivity and accessibility, with better uptime and less maintenance.
Scope
The phases below were identified:
- Assessment and planning
- Data Migration planning
- Process mapping
- Configuration, Customization
- Integration
- Module-wise implementation: Sales, Finance, Project Management, Banking, and Vendor Management
Implementation
Technology and Architecture
Technology Stack:
Microsoft Azure: Primary cloud platform hosting Dynamics 365 F&O.
Dynamics 365 Finance & Operations: Core ERP platform, replacing Dynamics AX.
Azure Data Factory (ADF): Used for ETL processes during data migration.
Power Platform: Power BI, Power Automate for analytics and automation.
Common Data Service (CDS): Ensures data consistency across integrated systems.
Architecture:
Cloud-based Deployment: Leverages Azure for scalability, performance, and flexibility.
Modular Design: Dynamics 365 modules (Finance, Sales, Vendor, Customer, and Banking)
Microservices Architecture: F&O integrates with other services via APIs and Azure microservices.
Data Layer: Data stored in Azure SQL Database, with integrations for real-time analytics using Power BI.
Integration Layer: Utilizes Azure Logic Apps and Service Bus for third-party system integration (a minimal messaging sketch follows below).
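To illustrate the integration layer mentioned above, here is a minimal sketch of publishing an event from a third-party system to Azure Service Bus for Dynamics 365 F&O to consume. The connection string, queue name, and message shape are placeholder assumptions, not the delivered integration.

```python
# Minimal sketch: queue an event for the F&O integration layer via Azure Service Bus.
import json

from azure.servicebus import ServiceBusClient, ServiceBusMessage

connection_string = "<service-bus-connection-string>"  # assumed to come from configuration
queue_name = "fno-inbound-invoices"                     # hypothetical queue

payload = {"invoiceNumber": "INV-1001", "vendor": "ACME", "amount": 2500.00}

with ServiceBusClient.from_connection_string(connection_string) as client:
    with client.get_queue_sender(queue_name) as sender:
        sender.send_messages(ServiceBusMessage(json.dumps(payload)))
        print("Queued invoice event for F&O integration")
```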
Features & Benefits
- Dynamics 365 F&O is fully cloud-based, offering better scalability and reduced infrastructure costs
- Modern, web-based interface with improved usability, customizable dashboards, and better navigation
- Embedded Power BI for real-time insights and advanced reporting
- Built-in enterprise-grade security with role-based access, multi-factor authentication (MFA), and data encryption
- Native integration with Microsoft services (Azure, Power Platform, Microsoft 365) and third-party applications via APIs
Benefits
- Cloud deployment allows for easy scaling as business needs grow, without the need for additional hardware
- The modern UI and enhanced features streamline workflows and boost employee productivity
- The modular structure allows businesses to adopt only the functionalities they need, with room for future growth; 3 legacy systems were retired and moved to F&O
- Cost savings, as previous support contracts with third-party software were terminated in favour of a single platform
- Cloud connectivity facilitates better collaboration across teams, locations, and departments, improving overall efficiency
Challenges
- Challenge: The new web-based interface in Dynamics 365 F&O can be unfamiliar to users accustomed to the Dynamics AX interface, potentially leading to resistance and slower productivity during the initial phase.
- Challenge: Legacy customizations in Dynamics AX might not be compatible with the new architecture and UI of Dynamics 365 F&O, requiring redesign or redevelopment.
- Challenge: Migrating large volumes of historical data while ensuring its integrity and compatibility with the new system can be complex and time-consuming. Solution: We implemented a phased approach to data migration, beginning with high-priority data and using tools like Azure Data Factory to ensure seamless data transfer with minimal disruption.
- Challenge: The new features and interface require comprehensive training for end users to fully leverage the system, which can be time-consuming and resource-intensive. Solution: We tailored training sessions to the specific roles and responsibilities of different user groups to ensure effective learning and immediate application of the new UI.
- Challenge: Introducing a new system can create disruption, and without a solid change management strategy it may lead to confusion and inefficiencies. Solution: To drive user adoption, we offered detailed training sessions, workshops, and user guides so staff could become familiar with the new interface and features, and supported them in batches with targeted user groups.
Project Completion
Duration: Aug 2022 – May 2023; go-live April 2023
Deliverables
- Fully implemented Dynamics 365 F&O as per the agreed scope
- 4 weeks of hypercare support as part of go-live
- Migration Plan Document: detailed plan with timeline, resources, and risk management
- Current State Assessment Report: audit of AX modules, customizations, and integrations
- Solution Design Document: blueprint of F&O configuration, customizations, and integrations
- Data Mapping Sheets: field mapping from AX to the F&O system
- Data Cleansing Report: results of data cleaning activities pre-migration
- Data Validation Report: post-migration data validation results for integrity
- Functional Testing Report: functional tests with pass/fail results
- Performance Testing Report: performance tests on system load and transactions
- User Acceptance Testing Report: UAT feedback and user sign-off
- Training Materials: user manuals, videos, and session documentation
Support
- 4 weeks of Hypercare support as part of Go Live
Testimonial
The transition to Dynamics 365 F&O has streamlined our financial reporting significantly. The real-time analytics and embedded Power BI have been game-changers for making faster decisions. The user interface took some getting used to, but after proper training, it’s much easier to navigate compared to AX
— Prakash Chand Jain, Vice Chairman, Pioneer Education Group
Our transition to Dynamics 365 Finance & Operations from Dynamics AX has been transformative for our entire organization, especially across our multiple branches and subsidiaries. Managing operations across diverse regions and educational institutions has become much more streamlined.
CA Prashant Jain
Director, Pioneer Education Group
Next Phase
We are now looking at the next phase of the project which involves:
- Customizations to be identified 3 months after go-live, once hypercare ends.
- Business Process Analysis: Reevaluate key business processes and compare them against current system capabilities to identify areas requiring further customization.
- User Feedback: Collect feedback from department heads and end-users to pinpoint specific pain points or missing features.
- Data and Reporting Needs: Analyze reporting and data analytics usage to determine if additional custom reports or dashboards are needed.
Executive Summary
About Client
The customer, Tonkin + Taylor, works in environmental consulting and meteorological services, focusing on providing high-resolution meteorological data for applications including air quality analysis, weather forecasting, and climate risk assessment. Their offerings are centered on advanced data modelling using the Weather Research and Forecasting (WRF) model, which requires significant computational resources due to its ability to generate detailed meteorological datasets.
Project Background - AWS Custom product for Weather research forecasting
Peritos was engaged to develop a comprehensive system that could:
- Efficiently run the WRF model on an HPC cluster.
- Automatically create and manage HPC cluster jobs when new data requests arrive.
- Automatically manage data-resolution adjustments.
- Provide a seamless experience for customers through an easy-to-use online platform.
- Enable the commercialization of the datasets, ensuring that the customer could capitalize on the broad applicability of their data across multiple disciplines.
Scope & Requirement
Implementation
Technology and Architecture
The architecture of this application efficiently handles the computational intensity of the WRF model, scales dynamically with demand, and provides a seamless experience for users. The integration of various AWS services ensures that the solution is robust, secure, and scalable.
Overall Workflow
User Request: Users input data parameters and request pricing. If satisfied, they proceed with the purchase.
Processing Trigger: Upon payment confirmation, the system triggers the data processing workflow.
WRF and WPS Processing: The ParallelCluster performs the necessary computations to generate the meteorological data.
Post-Processing: Any additional processing is done before the final data is stored.
Download and Notification: Users are notified and provided with a link to download their processed data.
Technology
The web app was deployed with the following technology components:
• Backend code: .NET, C#, Python
• Web app code: Next.js
• Database: PostgreSQL
• Cloud: AWS
Integrations
• Google APIs
• Stripe
• Auth0
• SendGrid
• Slurm APIs
Cost Optimization
Peritos enhanced Tonkin + Taylor’s FinOps capabilities by designing a cost-efficient, scalable AWS architecture. We optimized compute resources using AWS ParallelCluster, implemented serverless automation with Lambda and Step Functions, and used Amazon S3 and FSx for Lustre for cost-effective data storage. The solution allowed Tonkin + Taylor to scale on demand, reduce infrastructure costs, and gain visibility into cloud spending. This enabled efficient monetization of meteorological data while maintaining control over operational expenses.
High-Performance Computing (HPC) Environment
• AWS ParallelCluster: Provides the compute infrastructure needed to run the WRF model and WPS processes. This cluster is set up dynamically and scaled according to the computational demands of the task, ensuring efficient resource usage.
• Head Node and Compute Fleet: The head node manages the compute fleet, which executes the high-compute WRF and WPS processes.
• FSx for Lustre: High-performance file storage integrated with the ParallelCluster, used to store and access the large datasets generated during processing.
Processing and Orchestration
• AWS Lambda Functions: Used extensively for orchestrating various steps in the data processing workflow.
• AWS Step Functions: Orchestrates the entire workflow by coordinating Lambda functions, managing state transitions, and handling retries or errors (a minimal job-submission sketch follows below).
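To make the orchestration concrete, here is a minimal, illustrative sketch of a Lambda step that submits a WRF job to the ParallelCluster head node through the Slurm REST API (slurmrestd). The endpoint version, payload shape, URL, token handling, and script names are assumptions and vary by Slurm version; this is not the production code.

```python
# Illustrative sketch: Lambda step that submits a WRF job to Slurm on ParallelCluster.
import json
import os

import requests  # bundled with the function as a dependency in this sketch

SLURMRESTD_URL = os.environ.get("SLURMRESTD_URL", "http://head-node.internal:6820")  # hypothetical
SLURM_USER = os.environ.get("SLURM_USER", "wrf-runner")                               # hypothetical

def lambda_handler(event, context):
    # `event` carries the purchased request: region, resolution, date range, etc.
    job_name = f"wrf-{event['request_id']}"
    batch_script = "\n".join([
        "#!/bin/bash",
        "module load wrf",                              # assumed module layout on the cluster
        f"run_wps_and_wrf.sh {event['request_id']}",    # hypothetical wrapper script
    ])

    response = requests.post(
        f"{SLURMRESTD_URL}/slurm/v0.0.38/job/submit",   # version-specific path, assumed
        headers={
            "X-SLURM-USER-NAME": SLURM_USER,
            "X-SLURM-USER-TOKEN": os.environ["SLURM_JWT"],  # short-lived token, assumed injected
        },
        json={
            "script": batch_script,
            "job": {
                "name": job_name,
                "partition": "compute",
                "current_working_directory": "/fsx/wrf-runs",  # FSx for Lustre mount
                "environment": {"PATH": "/usr/bin:/bin"},
            },
        },
        timeout=30,
    )
    response.raise_for_status()
    return {"job_id": response.json().get("job_id"), "request_id": event["request_id"]}
```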
Features of Application
The solution leverages AWS cloud services to generate, process, and distribute high-resolution meteorological data.
Users interact via an interface hosted on AWS Amplify, secured by AWS WAF and Shield, with APIs managed by Amazon API Gateway.
The system orchestrates data processing using AWS Lambda functions and AWS Step Functions, coordinating tasks such as WRF and WPS processing on an AWS ParallelCluster.
FSx for Lustre provides high-performance storage, while Amazon S3 and Aurora DB handle data storage and transaction management.
Post-processing is done on EC2 instances, with notifications sent via SNS. The solution efficiently manages the high computational demands of the WRF model, scales dynamically, and ensures secure, seamless data access for internal and external users.
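As an illustration of the download-and-notification step, the sketch below generates a time-limited S3 download link and publishes an SNS notification. The bucket, object key, and topic ARN are placeholders, not the delivered configuration.

```python
# Illustrative sketch: notify a user that their processed dataset is ready.
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

bucket = "tt-wrf-output"                    # hypothetical bucket
key = "requests/req-42/wrf_dataset.nc"      # hypothetical processed dataset

# Time-limited download link for the processed data
download_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": bucket, "Key": key},
    ExpiresIn=7 * 24 * 3600,  # link valid for one week
)

sns.publish(
    TopicArn="arn:aws:sns:ap-southeast-2:123456789012:wrf-data-ready",  # hypothetical topic
    Subject="Your meteorological dataset is ready",
    Message=f"Your WRF dataset has finished processing. Download it here: {download_url}",
)
```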
Challenges
- Challenge 1: High computational demand: The WRF model's capacity to produce highly detailed meteorological datasets requires extensive computational power, which made running it on the customer's existing local infrastructure impractical. The challenge was to find a solution that could efficiently handle large-scale data generation at optimal cost.
- Solution: This challenge was met by implementing an AWS-based high-performance computing (HPC) cluster, specifically AWS ParallelCluster, which provided the computational resources to run the WRF model efficiently. Jobs on the ParallelCluster were created and managed dynamically using AWS Step Functions and AWS Lambda, utilizing the Slurm APIs.
- Challenge 2: User experience and commercialization: To monetize their meteorological data, the customer needed an accessible, user-friendly portal where external users could easily select regions, adjust data resolution, and purchase datasets. The portal needed to be intuitive, efficient, and fully capable of handling secure transactions, which was essential to the success of the customer's business model.
- Solution: This challenge was addressed by developing a web-based portal using AWS Amplify, integrated with AWS WAF and Shield for security, and managed via Amazon API Gateway. The platform provides a seamless user experience, enabling external customers to effortlessly interact with the system, select their data parameters, and complete purchases, thereby facilitating the commercialization of the datasets and enhancing revenue streams.
Project Completion
Duration
- Jan 2024 – Aug 2024 ~ Implementation and Support
Deliverables
• Architecture review and sign-off of the AWS services setup by internal teams and existing vendors of Landcheck, ensuring all best practices were followed and the design aligned with the AWS Well-Architected Framework so that security, scalability, and performance are up to the mark.
• The custom web application was developed by the Peritos team, working closely with the client's product owner to complete changes, fix bugs, and add critical features prior to go-live to ensure a smooth release.
• We are still working on the handover documents and preparing for the final go-live.
Testimonial
Awaited
Next Phase
We are now looking at the next phase of the project which involves:
1. Ongoing Support and adding new features every Quarter with minor bug fixes
2. Adding support for more countries
Looking For Similar Services? Please Get In Touch
Executive Summary
About Client
AWS Control Tower Setup
Wine-Searcher is a web search engine that helps find the price and availability of any wine, whiskey, spirit, or beer worldwide. It has been in operation since 1999 and has offices in New Zealand and the UK. They provide easy-to-use search engines, price comparison tools, an extensive database of wines and spirits, an encyclopedia, and news pages that aim to cover all "wine-finding" needs.
https://www.wine-searcher.com/
Location: New Zealand & UK
Project Background
Peritos expertly directed an AWS Control Tower setup for Wine-Searcher, optimizing their cloud infrastructure. Leveraging AWS Control Tower, the Peritos team streamlined governance and compliance, ensuring seamless scaling and enhanced security. This was needed because the client had multiple accounts they wanted to consolidate, in addition to adopting AWS Organizations via Control Tower. Through meticulous configuration, we tailored the environment to Wine-Searcher's specific needs, facilitating efficient resource management and cost control. With AWS Control Tower's automation and governance features, Wine-Searcher gained a robust foundation for future growth, while Peritos provided invaluable expertise, empowering the company to focus on innovation and deliver an exceptional user experience in the dynamic wine market.
Scope & Requirement for AWS Control Tower Setup
Prerequisite: Automated pre-launch checks for your management account
Step 1. Create your shared account email addresses
Expectations for landing zone configuration
Step 2. Configure and launch your landing zone
Step 3. Review and set up the landing zone
Implementation
Technology and Architecture of AWS Control Tower Setup
Read on for the key components that defined the architecture of the AWS Control Tower setup for Wine-Searcher.
Technology/ Services used
We used AWS services and helped them set up the following:
- Cloud: AWS
- Organization setup: Control Tower
- AWS SSO for authentication using existing Azure AD credentials
- Policies setup: created AWS service control policies (a minimal sketch follows below)
- Templates created for common AWS services
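The service control policies mentioned above can be provisioned programmatically. Below is a minimal, illustrative sketch using boto3; the policy statements, policy name, and OU ID are placeholders, not the client's actual guardrails.

```python
# Illustrative sketch: create and attach a service control policy with boto3.
import json

import boto3

org = boto3.client("organizations")

# Deny member accounts from leaving the organization or disabling CloudTrail.
scp_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Deny", "Action": ["organizations:LeaveOrganization"], "Resource": "*"},
        {"Effect": "Deny", "Action": ["cloudtrail:StopLogging", "cloudtrail:DeleteTrail"], "Resource": "*"},
    ],
}

policy = org.create_policy(
    Content=json.dumps(scp_document),
    Description="Baseline guardrails for workload accounts",
    Name="baseline-guardrails",            # hypothetical
    Type="SERVICE_CONTROL_POLICY",
)

org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-xxxx-xxxxxxxx",            # hypothetical OU id
)
```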
Security & Compliance:
- Tagging Policies
- AWS Config for compliance checks
- NIST compliance
- Guardrails
- Security Hub
Network Architecture
- Site to Site VPN Architecture using Transit Gateway
- Distributed AWS Network Firewall
- Monitoring with CloudWatch and VPC Flow Logs
Backup and Recovery
- Cloud systems and components followed AWS's Well-Architected Framework, and all resources were deployed with multi-AZ availability and uptime of 99.99% or more.
Cost Optimization
- Alerts and notifications were configured for AWS cost monitoring
Code Management, Deployment
- CloudFormation scripts for creating StackSets and for provisioning AWS services were handed over to the client
Challenges in Implementing AWS Control Tower Setup
- Landing Zone Drift
- Role Drift
- Security Hub Control Drift
- Trusted Access disabled
Project Completion
Duration of AWS Control Tower Setup Implementation
Aug 2023 to Sep 2023 ~ 4 weeks
Deliverables for AWS Control Tower Setup
1. Control tower implemented
AWS Control Tower is a service built on a solid architecture that can assist your organization in meeting its compliance requirements by establishing controls and implementing best practices. Third-party auditors evaluate the security and compliance of several services available in your landing zone as part of various AWS compliance programs, including SOC, PCI, FedRAMP, HIPAA, and more.
2. Business Benefits
Ensuring compliance and implementing best practices is crucial for any organization. With our solution, you can set up a well-architected, multi-account environment in under 30 minutes. The creation of AWS accounts is automated with built-in governance, ensuring that set standards and regulatory requirements are met. You can also enforce preconfigured controls to adhere to best practices. Additionally, our solution enables the seamless integration of third-party software at scale to enhance your AWS environment.
Support
- One month of extended support
- A CloudFormation stack template to create more AWS resources using the available stacks
- Screen-sharing sessions with demos of how the services and new workloads can be deployed
Testimonial
Awaited
Next Phase
Looking For Similar Services? Please Get In Touch
Executive Summary
About Client
Managing AWS Environment
Wine-Searcher is a web search engine that helps find the price and availability of any wine, whiskey, spirit, or beer worldwide. It has been in operation since 1999 and has offices in New Zealand and the UK. They provide an easy-to-use search engine, price comparison tools, an extensive database of wines and spirits, an encyclopedia, and news pages that aim to provide all “wine-finding” needs.
https://www.wine-searcher.com/
Location: New Zealand & UK
Project Background
As part of their plan to launch a full suite of digital products, Wine-Searcher chose AWS as their cloud environment. Strategic resource allocation and cost optimization are critical to a cost-effective operation. As the client's reliable AWS partner, Peritos helped put AWS Cost Explorer and AWS Budgets to work as valuable tools alongside ongoing discounted billing (a minimal budget-alert sketch follows below). Leveraging reserved and spot instances and optimizing usage based on peak hours and demand patterns can also result in significant cost savings. Experts from the Peritos team regularly monitor and fine-tune the AWS environment based on Wine-Searcher's needs, allowing for continuous optimization while adhering to budgetary constraints and maintaining the required scalability and performance for their operations.
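As an example of the budgeting pattern described above, the sketch below creates a monthly AWS Budgets cost budget with an email alert at 80% of forecasted spend. The account ID, budget amount, and notification address are placeholders, not the client's actual configuration.

```python
# Illustrative sketch: monthly cost budget with a forecast-based email alert.
import boto3

budgets = boto3.client("budgets")

budgets.create_budget(
    AccountId="123456789012",  # hypothetical payer account
    Budget={
        "BudgetName": "monthly-aws-spend",
        "BudgetLimit": {"Amount": "10000", "Unit": "USD"},
        "TimeUnit": "MONTHLY",
        "BudgetType": "COST",
    },
    NotificationsWithSubscribers=[
        {
            "Notification": {
                "NotificationType": "FORECASTED",
                "ComparisonOperator": "GREATER_THAN",
                "Threshold": 80.0,
                "ThresholdType": "PERCENTAGE",
            },
            "Subscribers": [{"SubscriptionType": "EMAIL", "Address": "cloud-costs@example.com"}],
        }
    ],
)
```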
Scope & Requirement for Managing AWS Environment
In the first phase of the AWS environment setup, the implementation was scoped as follows:
- Manage Billing
- Value added services
- Handling Complex environments
- Multiple AWS invoices
- Cost Optimization
- Cloud security optimization
Implementation
Technology and Architecture of Managing AWS Environment
Read on for the key components that defined the architecture of the managed AWS environment for Wine-Searcher.
Technology/ Services used
We used AWS services and helped them set up the following:
- Cloud: AWS
- Organization setup: Control Tower
- AWS SSO for authentication using existing Azure AD credentials
- Policies setup: created AWS service control policies
- Templates created for common AWS services
Security & Compliance:
- Tagging Policies
- AWS Config for compliance checks
- NIST compliance
- Guardrails
- Security Hub
Network Architecture
- Site to Site VPN Architecture using Transit Gateway
- Distributed AWS Network Firewall
- Monitoring with CloudWatch and VPC Flow Logs
Backup and Recovery
- Cloud systems and components followed AWS's Well-Architected Framework, and all resources were deployed with multi-AZ availability and uptime of 99.99% or more.
Cost Optimization
- Alerts and notifications were configured for AWS cost monitoring
Code Management, Deployment
- CloudFormation scripts for creating StackSets and for provisioning AWS services were handed over to the client
Challenges in Implementing Managing AWS Environment
- Collate all accounts together
- Understand and agree on how the accounts would be managed under the distribution model
Project Completion
Duration of Managing AWS Environment Implementation
1 Sep 2021 to present
Deliverables for Managing AWS Environment
- Collate all accounts under the ECAM distribution model
- Manage billing
- Provide support services as needed
- Ongoing discounted licensing
Support
- One month of extended support
- A CloudFormation stack template to create more AWS resources using the available stacks
- Screen-sharing sessions with demos of how the services and new workloads can be deployed.
Testimonial
Awaited
Next Phase
We are now looking at the next phase of the project, which involves:
1. Implementing a control tower for the client.
Looking For Similar Services? Please Get In Touch
Executive Summary
About Client
AWS Compute & High-performance Computing
Tonkin + Taylor is New Zealand's leading environment and engineering consultancy, with offices located globally. They shape the interfaces between people and the environment, which includes earth, water, and air. They have won awards such as the Beaton Client Choice Award for Best Provider to Government and Community (2022) and the IPWEA Award for Excellence in Water Projects for the Papakura Water Treatment Plant (2021).
https://www.tonkintaylor.co.nz/
Location: New Zealand
Project Background
Tonkin + Taylor were embarking on launching a full suite of digital products and settled on AWS as their cloud environment. They wanted to accelerate their digital transformation and add greater business value through AWS development environment best practices. To achieve this, we needed to configure AWS compute and high-performance computing, following best practices and meeting compliance standards, so it could serve as a foundation for implementing more applications. The AWS Lake House is a central data hub that consolidates data from various sources and caters to all applications and users. It can quickly identify and integrate any data source. The data goes through a meticulous 3-stage refining process: Landing, Raw, and Transformed. After refinement, the data is added to the data catalog and is readily available for consumption through a relational database (a minimal catalog sketch follows below).
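To illustrate that final step, the sketch below registers the Transformed zone in the data catalog with an AWS Glue crawler so it can be queried relationally. The bucket, database, role, and crawler names are placeholder assumptions, not the delivered pipeline.

```python
# Illustrative sketch: catalog the Transformed zone of the lakehouse with Glue.
import boto3

glue = boto3.client("glue")

crawler_name = "transformed-zone-crawler"  # hypothetical

glue.create_crawler(
    Name=crawler_name,
    Role="arn:aws:iam::123456789012:role/glue-lakehouse-crawler",       # hypothetical role
    DatabaseName="lakehouse_transformed",                                # hypothetical catalog database
    Targets={"S3Targets": [{"Path": "s3://tt-lakehouse/transformed/"}]}, # hypothetical bucket
)

# Run the crawler after each refinement cycle so new tables and partitions
# show up in the catalog and can be consumed downstream.
glue.start_crawler(Name=crawler_name)
```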
Scope & Requirement for AWS Compute & High Performance Computing
The first phase of the AWS environment setup scoped the implementation as follows:
- Implement Data Lakehouse on AWS
Implementation
Technology and Architecture of AWS Compute & High Performance Computing
Read more on the key components that defined the implementation of the Data Lakehouse on AWS for Tonkin + Taylor.
Technology/ Services used
We used AWS services and helped them set up the following:
- Cloud: AWS
- Organization setup: Control Tower
- AWS SSO for authentication using existing Azure AD credentials
- Policies setup: created AWS service control policies
- Templates created for common AWS services
Security & Compliance:
- Tagging Policies
- AWS Config for compliance checks
- NIST compliance
- Guardrails
- Security Hub
Network Architecture
- Site to Site VPN Architecture using Transit Gateway
- Distributed AWS Network Firewall
- Monitoring with CloudWatch and VPC Flow Logs
Backup and Recovery
- Cloud systems and components followed AWS's Well-Architected Framework, and all resources were deployed with multi-AZ availability and uptime of 99.99% or more.
Cost Optimization
- Alerts and notifications were configured for AWS cost monitoring
Code Management, Deployment
- CloudFormation scripts for creating StackSets and for provisioning AWS services were handed over to the client
AWS Compute & High Performance Computing Challenges & Solutions
- Diverse data sources: data analytics, clean-up, and integration patterns to pull data from different data sources
- On-premises data connection to the data lake: a secure site-to-site AWS connection was implemented
- Templatized format for creating pipelines: created scripts in a specific format, deployment scripts, and CI/CD scripts
Project Completion
Duration of AWS Compute & High Performance Computing
Apr 2023 to July 2023 ~ 4 months
Deliverables for AWS Compute & High Performance Computing
- Create scripts to create and deploy pipelines
- Implement Data Lakehouse
Support
- Providing ongoing support as we are a dedicated development partner for the client
Testimonial
After we set up the environment and enabled the client to start using it, they were eager to roll out apps using cloud resources. It was exciting to see the client using the environment extensively. We also took feedback from stakeholders, as below:
Tonkin + Taylor has initiated its Digital Transformation journey, and AWS is one of the key partners in its effort to enable a digitally savvy organization that provides an excellent Customer and Employee Experience. As part of this journey, we have received great support from Peritos as our AWS Partner. The team at Peritos is knowledgeable, brings previous experience in enabling other organizations on this journey, and is great with quality and timeliness of delivery. T+T has set up its AWS platform with the support of AWS and Peritos, and this has enabled us to provide our engineers and clients an environment for data ingestion, transformation, and hosting of multiple applications, analytics, data science & visualization.
Santosh Dixit
Digitization delivery lead
Next Phase
We are now looking at the next phase of the project, which involves:
- API and file-based data sources to be added
- Processed data to be made available for ingestion by other applications
Looking For Similar Services? Please Get In Touch
Executive Summary
About Client
Afghanistan Holding Group has a team of 300 highly qualified Afghan professionals who provide turnkey business services to over 700 international clients. They provide services such as accounting, audit, insurance, tax, consultancy, and advisory.
https://ahg.com.af/
Location: Kabul, Afghanistan
Project Background- SAP Implementation for Enterprise
Afghanistan Holding Group was looking for an enterprise-wide SAP implementation.
The requirement was to optimize internal processes through the SAP core modules to be implemented and to help manage those processes more effectively. They were further looking to transform their business models and operations with the help of an SAP S/4HANA implementation, and to give stakeholders more visibility into day-to-day operations.
Scope & Requirement
The high-level project requirements for the enterprise-wide SAP implementation were as follows:
- Install, set up, and implement SAP S/4HANA for a small business of 300 employees that is currently using QuickBooks.
- On-premise licenses had already been purchased, and a module list with features was shared.
- They mostly required everything set up with standard forms, processes, and reports; no custom forms, processes, or reports would be needed.
- One SAP HANA-certified server had already been purchased, and 4 on-premise virtual machines were to be set up.
Implementation
Technology and Architecture
Technology
The ERP implementation required:
• SAP system: S/4HANA
• Database: HANA
• Web app: Fiori
• Hosting: on premise
• VM management: VMware, SUSE Linux
Integrations
• For the initial implementation it was a standalone system
Security:
- User role groups and custom authentication codes assigned to the different users
- Admin users identified and assigned the right roles
- Roles assigned specific to departments such as Sales, Purchasing, and Finance
Backup and Recovery
The system was tested with snapshots taken every weekend and at the end of each day. DR and backup testing was done as part of the handover.
Scalability
Relied on HANA native capability to sustain memory-intensive processing logic.
Cost Optimization
Not in scope, as the customer had an on-premise system and the license was procured directly from SAP.
Code Management, Deployment
N/A, as only standard functionality was used.
Features of SAP Implementation for enterprise
- The SAP system was implemented with comprehensive business processes mapped to the tool for Sales, Purchasing, Finance, core order-to-cash, and procurement, along with a platform for integration at a later date when the client is ready
- Improved and faster analytics and reporting, where the input data was processed into multiple reports, making it easier to analyze performance, determine pain points, and initiate improvements
- Included SAP Business Suite with embedded modules for ERP, CRM, SRM, SCM, and PLM co-deployed in a single instance
Challenges
- This was a straightforward implementation in which we were able to install the SAP system from the application file and database; however, during backup and DR testing we ran into space-related issues, as the logs and snapshots quickly filled up all available space.
- The client was informed, and the hard disk space had to be monitored closely.
- There was not enough disk space to install the application at the right path and to create separate clients for Dev, QA, and Prod, so we advised the client to host them on a separate server.
- It was decided to create a partition and use multipathing and logical volume grouping so the system could hold the different client versions.
- After the S/4HANA system was implemented, end users were not comfortable with the GUI-based windows and the transaction codes (t-codes) that had to be entered to navigate the system.
- Training plans were made to help them understand the SAP system. Moving from the t-code-based approach to the Fiori tile-based model was also considered; however, this would have delayed the project go-live and was parked for later. For now, the client continued to upskill existing resources, and Peritos assisted wherever help was needed.
Project Completion
Duration
Oct 2018 – Jan 2019 ~ 4 months
Deliverables
- A fully configured SAP S/4HANA system with all core modules implemented, ready for the client to start on the functional configuration
- Handover of the system setup, the architecture diagram, and the backend system setup using on-premise servers
Support
As part of the project implementation, we provided 1 month of extended support. This included any major or minor bug fixes, questions specific to the implementation, and any system outages faced.
Testimonial - SAP Implementation For Enterprise
Below is the feedback we received on different parameters:
• Skills 5/5 *****
• Availability 5/5 *****
• Communication 5/5 *****
• Quality 5/5 *****
• Deadline 5/5 *****
• Cooperation 5/5 *****
Overall: 5/5
We also took Feedback from stakeholders as below:
“Peritos Team was incredible in every way. We have now decided to hire them for additional SAP assignments as well – looking forward to continued success!”
Sanzar Kakkar
Chairman, Moore Afghanistan
Next Phase - SAP Implementation For Enterprise
We had the following projects planned with the client:
- Support to help end users understand the SAP system, adding documentation and mapping the sales, purchasing, and finance processes in the system
- Helping another customer of the client upgrade their SAP systems
- The client was keen to map their processes to the SAP system and evaluate options for integration, reporting, and mapping the current business processes to the new system. This was under discussion.