Executive Summary

About Client

New Zealand’s most awarded mortgage and insurance advisor, Global Finance caters to 1,500+ customers for their mortgage or insurance needs every year, helping them meet their financial goals. Global Finance offers customers more choice and freedom, with loan approvals from numerous lenders of the customer’s choosing. Dealing with a large number of clients and team members, Global Finance was facing issues managing their unstructured data. As Peritos had already been managing their Dynamics environment, we successfully guided and supported Global Finance’s move from storing data in Azure Dataverse to Azure Blob Storage, which saved them about $1,500 a month.

https://www.globalfinance.co.nz/

Location: Auckland, New Zealand

Project Background

Global Finance has been offering smarter loans and insurance since 1999. As one of the best mortgage and insurance advisers in NZ, Global Finance has helped clients save on their loans by avoiding unnecessary interest and becoming mortgage-free faster. Since the beginning, they have helped some customers become mortgage-free in as little as 7 years rather than the standard 30-year term. Global Finance was already using Dynamics 365 and storing data in Azure Dataverse; the move to Azure Blob Storage, which is optimized for storing massive amounts of unstructured data, has now been completed for them.

Scope & Requirement

In the first phase, the implementation was discussed as follows:

  • Set up the saving of data from Azure Dataverse to Azure Blob Storage, which has absorbed a large volume of unstructured data for Global Finance.
  • Meet the demand for storing and analyzing large volumes of unstructured data, which has grown over the past decade; Azure Blob Storage is one solution that fulfills this enterprise need accurately.

Implementation

Technology and Architecture

Technology 

The migration was deployed with the following technology components (a minimal transfer sketch follows the list):
• Azure Dataverse: the underlying technology used was Azure SQL Database.

• Azure Blob Storage: supports the most popular development frameworks, including Java, .NET, Python, and Node.js.
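
To make the migration concrete, here is a minimal sketch of the kind of transfer involved, written in Python against the azure-storage-blob library. The connection string, container name, and file paths are illustrative placeholders, not the client’s actual configuration.

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# Placeholder values -- the real connection string and container name
# would come from the client's Azure environment.
CONN_STR = "<storage-account-connection-string>"
CONTAINER = "dataverse-archive"  # hypothetical container name

def upload_export(local_path: str, blob_name: str) -> None:
    """Upload one file exported from Dataverse into Blob Storage."""
    service = BlobServiceClient.from_connection_string(CONN_STR)
    container = service.get_container_client(CONTAINER)
    with open(local_path, "rb") as data:
        # overwrite=True keeps re-runs of the migration idempotent
        container.upload_blob(name=blob_name, data=data, overwrite=True)

upload_export("exports/annotations.csv", "2022-07/annotations.csv")
```

Storing attachments and documents as blobs in this way, rather than inside Dataverse, is what drives the monthly storage saving described above.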

Security & Compliance:

  • Tagging Policies
  • Azure Config for compliance checks
  • NIST compliance 
  • Guardrails
  • Security Hub

Backup and Recovery

Azure Backup provided a simple, secure, cost-effective, cloud-based backup solution to protect business- and application-critical data stored in Azure Blob Storage in two ways: continuous backups and periodic backups.

Network Architecture 

  • Site-to-site VPN architecture using Transit Gateway
  • Distributed Azure Network Firewall
  • Monitoring with CloudWatch and VPC Flow Logs

Cost Optimization 

  • Alerts and notifications were configured in Azure Cost Management so that a notification is sent out if the budget is exceeded.

Code Management, Deployment

  • CloudFormation scripts for creating StackSets, along with scripts for generating the Azure services, were handed over to the client.

Challenges of Migrating from Azure Dataverse to Azure Blob Storage

  • Ensuring that the post-migration environment met all of the compliance criteria while remaining cost-effective was a challenge.

Project Completion

Duration

July 2022, ~1 week

Deliverables

  • Dynamics license
  • Power Apps license
  • Power Apps per-user license
  • Power Apps per-app license

Support for Dynamics Discounted Licensing

  • For all licenses we implement, we provide monthly billing with 20-day credit terms.
  • We provide value-added services by sending the client reports on license usage and each user’s last activity date, helping them manage their license costs and gain visibility.

Testimonial

  • Azure Blob Storage has many organizational features that have solved Global Finance’s storage problem at a lower cost. Although it is designed for unstructured data, containers let businesses construct their preferred categories by uploading specific blobs to specific containers.
  • Shifting from Azure Dataverse to Azure Blob Storage has given Global Finance the freedom to access objects in Blob Storage via the Azure Storage REST API, Azure PowerShell, the Azure CLI, or an Azure Storage client library.

Global Finance now connects securely to Blob Storage using the SSH File Transfer Protocol (SFTP) and can mount Blob Storage containers using the Network File System (NFS) 3.0 protocol. Peritos handled the Microsoft Dynamics 365 domain for Global Finance and provided discounted licensing, which proved very cost-effective.

Building Manager
Global Finance Services

Next Phase

We are also in discussion with the client on other projects:

1. Dynamics CRM system Support 

2. O365 License Management 

Looking For Similar Services? Please Get In Touch

Executive Summary

About Client

The client, Yorker, is focused on leveraging technology to address the challenge of tracking and managing cricket bowlers’ net practice bowling loads. Recognizing the risk of overtraining and injuries from improper tracking, Yorker aims to provide a digital solution tailored for cricket players. The AWS custom application for Yorker empowers bowlers to automate session recordings, create personalized training plans, and monitor progress effectively. The app also fosters a sense of community by enabling interaction, knowledge sharing, and participation in skill-building challenges. The project is being executed in multiple phases, beginning with a Minimum Viable Product (MVP) to establish a strong foundation for future improvements. Yorker’s commitment to innovation and user-centric design reflects its dedication to transforming how athletes manage their training and optimize performance while minimizing injury risks.

Project Background - Enhancing Cricket Training through Digital Bowling Load Management

The Yorker mobile app project addresses a major challenge for cricket bowlers: accurately tracking and managing their bowling loads during net practice. Without proper tracking, bowlers risk improper training regimens, leading to overtraining and injuries. The Yorker app offers a digital solution that automates session recordings, capturing key metrics like delivery count, types of deliveries, and intensity levels. Additionally, the app allows bowlers to create personalized training plans, track progress, and receive real-time alerts to avoid overexertion. By leveraging technology, this initiative not only helps reduce injury risks but also fosters a sense of community. Bowlers can share experiences, learn from experts, and engage in skill-enhancing challenges. Ultimately, the app aims to optimize performance while ensuring bowlers train safely and efficiently, revolutionizing the way athletes manage their training.

Scope & Requirement for AWS Custom Application For Yorker

Scope: The first phase of the Yorker mobile application focuses on developing a Minimum Viable Product (MVP) to establish a strong foundation. Specifically, this phase will deliver core functionalities to allow cricket bowlers to start tracking their training sessions and managing their profiles. The scope includes:

  • User Authentication: Secure login and registration functionality for bowlers.
  • Profile Management: Basic user profile setup, including personal details and preferences.
  • Bowling Record Tracking: Automated entry for recording bowling sessions, including delivery count, types, and intensity.
  • Basic Reporting: Simple reports summarizing bowling loads to help users monitor their progress.

Requirements:

  • Mobile App Development: We will develop the front end using React Native to ensure cross-platform compatibility on iOS and Android.
  • Backend Services: Built using .NET with RESTful APIs for data communication.
  • Database: RDS Aurora PostgreSQL for structured data storage of user profiles and bowling records.
  • CI/CD Pipeline: Set up Continuous Integration/Continuous Deployment processes for efficient development and release.
  • User Interface Design: Intuitive and user-friendly UI aligned with branding, focusing on easy data entry and report viewing.

Implementation

Technology and Architecture for AWS Custom Application For Yorker

Read more on the technology and Architecture we used for AWS Custom Application Development 

Technology
WAF, API Gateway, Lambda Functions, RDS, S3, CloudWatch, Secrets Manager
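
As a hedged illustration of how these services fit together, the sketch below shows a Python Lambda handler, fronted by API Gateway, that reads database credentials from Secrets Manager and records a bowling session in the relational database. The secret id, table, and field names are hypothetical, not Yorker’s production schema.

```python
import json
import boto3
import pg8000.native  # assumed to be bundled with the deployment package

secrets = boto3.client("secretsmanager")

def lambda_handler(event, context):
    body = json.loads(event["body"])
    # Fetch DB credentials at runtime instead of hard-coding them.
    creds = json.loads(
        secrets.get_secret_value(SecretId="yorker/db")["SecretString"]
    )
    con = pg8000.native.Connection(
        user=creds["username"], password=creds["password"],
        host=creds["host"], database=creds["dbname"],
    )
    # Record one bowling session (hypothetical table and columns).
    con.run(
        "INSERT INTO bowling_sessions (user_id, deliveries, delivery_type, intensity) "
        "VALUES (:u, :d, :t, :i)",
        u=body["userId"], d=body["deliveries"],
        t=body["deliveryType"], i=body["intensity"],
    )
    con.close()
    return {"statusCode": 201, "body": json.dumps({"status": "recorded"})}
```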

Integrations
The application leverages RESTful APIs for smooth data transfer between the front end and back end, facilitating user authentication, session tracking, and profile management. Future integrations may include cloud-based analytics and third-party push notifications to enhance user engagement.

Scalability
The app is designed to run on serverless services, allowing automatic scaling based on usage.

Cost Optimization
Peritos helped optimize costs for Yorker by designing an efficient AWS architecture using auto-scaling, right-sized instances, and serverless technologies. With tools like AWS Cost Explorer and Trusted Advisor, we continuously monitored and reduced spending. Automation through CI/CD pipelines and code optimization further enhanced performance while lowering operational costs.

Backup and Recovery
A robust backup strategy, using Amazon S3, prevents data loss, while automated recovery processes ensure quick restoration in case of failure.
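
One simple way to realize this with S3 is sketched below under assumed names: enable object versioning so overwrites and deletions are recoverable, and add a lifecycle rule so old versions do not accumulate cost indefinitely. The bucket name and retention period are illustrative.

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "yorker-backups"  # hypothetical bucket name

# Versioning keeps prior object versions, so an accidental overwrite
# or delete can be rolled back.
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# Expire noncurrent versions after 90 days to control storage cost.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [{
            "ID": "expire-old-versions",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},
            "NoncurrentVersionExpiration": {"NoncurrentDays": 90},
        }]
    },
)
```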

Features of AWS Custom Application For Yorker

  • Automated Bowling Session Tracking
    Capture and record each bowling session, including the number of deliveries, delivery types, and intensity levels, thus providing players with a detailed log of their training activities.

  • Personalized Training Plans
    Create and customize training plans tailored to individual fitness levels and goals. Furthermore, players and coaches can adjust these plans based on real-time performance data to optimize training regimens.

  • Progress Monitoring & Alerts
    Track progress against predefined plans, with visual dashboards and alerts to notify users of deviations that may lead to overexertion or injuries.

  • User Profile & Simple Reporting
    Maintain a personalized profile to store training history, generate basic reports on bowling performance, and gain insights to improve overall training effectiveness.

Challenges with AWS Custom Application For Yorker

  • Accurate Data Capture & Tracking
    Ensuring the app reliably records detailed bowling metrics like delivery type, count, and intensity without manual errors poses a challenge, especially in a real-time sports environment.

  • Scalability & Performance
    As user adoption grows, maintaining app performance and scalability will be critical, particularly during peak usage times. Designing a backend that can handle large volumes of data efficiently is essential.

  • User Engagement & Retention
    Encouraging consistent use of the app among bowlers can be challenging. Building features that foster community interaction, personalized plans, and gamified challenges will be crucial to retaining users.

  • Cross-Platform Compatibility
    Delivering a seamless user experience across both iOS and Android devices requires rigorous testing to address device-specific issues, screen resolutions, and performance variations.

Project Completion of AWS Custom Application For Yorker

Duration

  • Aug 2024 – Oct 2024: implementation and support
  • Oct 2024 – present: rolling out the changes to production

Deliverables

  • Requirements Specification & Architectural Design Documents
    Comprehensive documentation outlining detailed project requirements, technical architecture, and system design.

  • Minimum Viable Product (MVP)
    A fully functional MVP with core features, including user authentication, profile management, automated bowling session tracking, and basic reporting.

  • Mobile Application UI/UX Design
    Intuitive and user-friendly interface designs for the app, ensuring a seamless experience on both iOS and Android devices.

  • Backend Services & APIs
    Development of scalable backend services using .NET, along with RESTful APIs for data communication between the mobile app and server.

  • CI/CD Pipeline & Deployment
    Implementation of Continuous Integration/Continuous Deployment pipelines to automate the build, testing, and deployment processes. Additionally, the initial release is deployed on cloud platforms.

Support

As part of the project implementation, we provide 2 months of ongoing extended support. This also includes 20 hours a month of development for minor bug fixes and an SLA covering any system outages or high-priority issues.

Testimonial

Awaited

Next Phase

We are now looking at the next phase of the project which involves:

1. Ongoing Support and adding new features every Quarter with minor bug fixes

2. Social & Community Building Features

Looking For Similar Services? Please Get In Touch

Executive Summary

About Client 

The customer, Tonkin + Taylor, works in environmental consulting and meteorological services, focusing on providing high-resolution meteorological data for various applications, including air quality analysis, weather forecasting, and climate risk assessment. Their offerings center on advanced data modeling using the Weather Research and Forecasting (WRF) model, which requires significant computational resources due to its ability to generate detailed meteorological datasets.

Project Background - AWS Custom product for Weather research forecasting

Peritos was hired to address the customer’s challenges by developing a comprehensive system that could:

  • Efficiently run the WRF model using an HPC cluster.
  • Automatically create and manage HPC cluster jobs on receiving new data requests.
  • Automatically manage data resolution adjustments.
  • Provide a seamless experience for customers through an easy-to-use online platform.
  • Enable the commercialization of the datasets, ensuring that the customer could capitalize on the broad applicability of their data across multiple disciplines.

Scope & Requirement

Implementation

Technology and Architecture

The architecture of this application efficiently handles the computational intensity of the WRF model, scales dynamically with demand, and provides a seamless experience for users. The integration of various AWS services ensures that the solution is robust, secure, and scalable.

Overall Workflow

User Request: Users input data parameters and request pricing. If satisfied, they proceed with the purchase.

Processing Trigger: Upon payment confirmation, the system triggers the data processing workflow.

WRF and WPS Processing: The ParallelCluster performs the necessary computations to generate the meteorological data.

Post-Processing: Any additional processing is done before the final data is stored.

Download and Notification: Users are notified and provided with a link to download their processed data.
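
As a hedged sketch of the processing trigger step, the Python Lambda below starts a Step Functions execution when payment is confirmed. The state machine ARN, event shape, and parameter names are assumptions for illustration.

```python
import json
import os
import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]  # assumed configuration

def lambda_handler(event, context):
    order = json.loads(event["body"])  # hypothetical payment-confirmed payload
    # Each execution carries the parameters the WRF/WPS jobs need.
    sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        name=f"wrf-order-{order['orderId']}",
        input=json.dumps({
            "region": order["region"],
            "resolution": order["resolution"],
            "dateRange": order["dateRange"],
        }),
    )
    return {"statusCode": 202, "body": json.dumps({"status": "processing"})}
```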

Technology

The web app was deployed with the following technology components:
• Backend code: .NET, C#, Python
• Web app code: Next.js
• Database: PostgreSQL
• Cloud: AWS

Integrations
• Google APIs
• Stripe
• Auth0
• SendGrid
• Slurm APIs

Cost Optimization

Peritos enhanced Tonkin + Taylor’s FinOps capabilities by designing a cost-efficient, scalable AWS architecture. We optimized compute resources using AWS ParallelCluster, implemented serverless automation with Lambda and Step Functions, and used Amazon S3 and FSx for Lustre for cost-effective data storage. The solution allowed Tonkin + Taylor to scale on demand, reduce infrastructure costs, and gain visibility into cloud spending. This enabled efficient monetization of meteorological data while maintaining control over operational expenses.

High-Performance Computing (HPC) Environment

• AWS ParallelCluster: provides the compute infrastructure needed to run the WRF model and WPS processes. This cluster is set up dynamically and scaled according to the computational demands of the task, ensuring efficient resource usage.
• Head node and compute fleet: the head node manages the compute fleet, which executes the high-compute WRF and WPS processes.
• FSx for Lustre: high-performance file storage integrated with the ParallelCluster, used to store and access the large datasets generated during processing.

Processing and Orchestration

AWS Lambda Functions: Used extensively for orchestrating various steps in the data processing workflow.

AWS Step Functions: Orchestrates the entire workflow by coordinating Lambda functions, managing state transitions, and handling retries or errors.

Features of Application

  • The solution leverages AWS cloud services to generate, process, and distribute high-resolution meteorological data.

  • Users interact via an interface hosted on AWS Amplify, secured by AWS WAF and Shield, with APIs managed by Amazon API Gateway.

  • The system orchestrates data processing using AWS Lambda functions and AWS Step Functions, coordinating tasks such as WRF and WPS processing on an AWS ParallelCluster.

  • FSx for Lustre provides high-performance storage, while Amazon S3 and Aurora DB handle data storage and transaction management.

  • Post-processing is done on EC2 instances, with notifications sent via SNS. The solution efficiently manages the high computational demands of the WRF model, scales dynamically, and ensures secure, seamless data access for internal and external users.

Challenges

  • Challenge 1: High Computational Demand: The WRF model’s capacity to produce highly detailed meteorological datasets necessitates extensive computational power, which made running it on the customer’s existing local infrastructure impractical. The challenge was to find a solution that could efficiently handle large-scale data generation at optimal cost.
    • Solution: This challenge was met by implementing an AWS-based high-performance computing (HPC) cluster, specifically AWS ParallelCluster, which provided the computational resources needed to run the WRF model efficiently. Jobs on the ParallelCluster were created and managed dynamically using AWS Step Functions and AWS Lambda via the Slurm APIs (see the sketch after this list).
  • Challenge 2: User Experience and Commercialization: To monetize their meteorological data, the customer needed an accessible, user-friendly portal where external users could easily select regions, adjust data resolution, and purchase datasets. The portal needed to be intuitive, efficient, and fully capable of handling secure transactions, which was essential to the success of the customer’s business model.
    • Solution: We addressed this challenge by developing a web-based portal using AWS Amplify, integrated with AWS WAF and Shield for security, and managed via Amazon API Gateway. This platform provided a seamless user experience, enabling external customers to effortlessly interact with the system, select their data parameters, and complete purchases, thereby facilitating the commercialization of the datasets and enhancing revenue streams.
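
Below is a hedged sketch of that dynamic job creation: a Python function submitting a WRF run to the cluster head node through the Slurm REST API (slurmrestd). The endpoint URL, API version, partition, user name, token source, and batch script are illustrative placeholders.

```python
import requests

# Hypothetical slurmrestd endpoint on the ParallelCluster head node.
SLURM_URL = "http://head-node.internal:6820/slurm/v0.0.39/job/submit"

def submit_wrf_job(token: str, work_dir: str, nodes: int = 4) -> int:
    payload = {
        "job": {
            "name": "wrf-run",
            "partition": "compute",          # assumed partition name
            "nodes": nodes,
            "current_working_directory": work_dir,
            "environment": {"PATH": "/usr/bin:/bin"},
        },
        "script": "#!/bin/bash\nsrun wrf.exe",  # simplified batch script
    }
    resp = requests.post(
        SLURM_URL,
        json=payload,
        headers={"X-SLURM-USER-NAME": "ec2-user",
                 "X-SLURM-USER-TOKEN": token},
    )
    resp.raise_for_status()
    # Slurm returns the numeric job id, which Step Functions can poll.
    return resp.json()["job_id"]
```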

Project Completion

Duration

  • Jan 2024 – Aug 2024: implementation and support

Deliverables

• Setting up the AWS services: architecture review and sign-off by internal teams and existing vendors of Landcheck to ensure all best practices are followed, using the AWS Well-Architected Framework to confirm that security, scalability, and performance are up to the mark.

• The custom web application was developed by the Peritos team working closely with the client’s product owner, completing any changes, bug fixes, and critical features prior to go-live to ensure a smooth release.

• We are still working on the handover documents and preparing for the final go-live.

Testimonial

Awaited

Next Phase

We are now looking at the next phase of the project which involves:

1. Ongoing Support and adding new features every Quarter with minor bug fixes

2. Adding support for more countries 

Looking For Similar Services? Please Get In Touch

Executive Summary

About Client

AWS Compute & High-performance Computing

Tonkin + Taylor is New Zealand’s leading environment and engineering consultancy, with offices located globally. They shape the interfaces between people and the environment, including earth, water, and air. They have won awards such as the Beaton Client Choice Award for Best Provider to Government and Community (2022) and the IPWEA Award for Excellence in Water Projects for the Papakura Water Treatment Plant (2021).

https://www.tonkintaylor.co.nz/
Location: New Zealand

Project Background

Tonkin + Taylor was embarking on launching a full suite of digital products and zeroed in on AWS as its cloud environment of choice. They wanted to accelerate their digital transformation and add greater business value through AWS development environment best practices. To achieve this, we needed to configure AWS compute and high-performance computing following best practices and meeting compliance standards, so it could serve as a foundation for implementing more applications. The AWS lakehouse is a central data hub that consolidates data from various sources and caters to all applications and users. It can quickly identify and integrate any data source. The data goes through a meticulous 3-stage refining process: Landing, Raw, and Transformed. After refinement, data is added to the data catalog and is readily available for consumption through a relational database.
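
To make the catalog step concrete, here is a hedged sketch that registers the Transformed zone with a Glue crawler so refined data becomes queryable. AWS Glue is a common choice for a lakehouse catalog, though the source does not name the exact service; the bucket, role, database, and schedule below are placeholders.

```python
import boto3

glue = boto3.client("glue")

# Placeholder names -- the real role, database, and S3 paths would come
# from the client's environment.
glue.create_crawler(
    Name="transformed-zone-crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",
    DatabaseName="lakehouse_catalog",
    Targets={"S3Targets": [{"Path": "s3://datalake/transformed/"}]},
    Schedule="cron(0 2 * * ? *)",  # re-catalog newly refined data nightly
)
glue.start_crawler(Name="transformed-zone-crawler")
```

Once crawled, the tables appear in the catalog and can be served to applications through the relational layer.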

Scope & Requirement for AWS Compute & High Performance Computing

The first phase of the AWS environment setup covered the following implementation:

  • Implement Data Lakehouse on AWS

Implementation

Technology and Architecture of AWS Compute & High Performance Computing

Read more on the key components that defined the Implementation of Data Lakehouse on AWS for Tonkin + Taylor

Technology/ Services used

We used AWS services and helped them set up the following:

  • Cloud: AWS
  • Organization setup: AWS Control Tower
  • AWS SSO for authentication using existing Azure AD credentials
  • Policies setup: created AWS service control policies (a sketch follows below)
  • Templates created for using common AWS services
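
As a hedged illustration of a service control policy, the sketch below creates and attaches an SCP that restricts workloads to an approved region. The policy content, region, and target OU id are hypothetical, not the client’s actual guardrails.

```python
import json
import boto3

org = boto3.client("organizations")

# Hypothetical guardrail: deny actions outside the approved region.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": "*",
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {"aws:RequestedRegion": ["ap-southeast-2"]}
        },
    }],
}

resp = org.create_policy(
    Name="DenyNonApprovedRegions",
    Description="Restrict workloads to ap-southeast-2",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)

# Attach the policy to an organizational unit (placeholder id).
org.attach_policy(
    PolicyId=resp["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-xxxx-example",
)
```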

Security & Compliance:

  • Tagging Policies
  • AWS Config for compliance checks
  • NIST compliance 
  • Guardrails
  • Security Hub

Network Architecture 

  • Site-to-site VPN architecture using Transit Gateway
  • Distributed AWS Network Firewall
  • Monitoring with CloudWatch and VPC Flow Logs

Backup and Recovery

  • The cloud systems and components used followed AWS’s Well-Architected Framework, and the resources all had multi-zone availability with uptime of 99.99% or more.

Cost Optimization 

  • Alerts and notifications were configured in AWS Cost Management so that a notification is sent out if the budget is exceeded.

Code Management, Deployment

  • CloudFormation scripts for creating StackSets, along with scripts for generating the AWS services, were handed over to the client.

AWS Compute & High Performance Computing Challenges & Solutions

  • Diverse data sources: data analytics, clean-up, and integration patterns to pull data from different data sources.

  • On-premises data connection for data lake migration: a secure site-to-site AWS connection was implemented.

  • Templatized format for creating pipelines: created scripts in a specific format, deployment scripts, and CI/CD scripts.

Project Completion

Duration of AWS Compute & High Performance Computing

Apr 2023 – Jul 2023, ~4 months

Deliverables for AWS Compute & High Performance Computing

  • Create scripts to create and deploy pipelines 
  • Implement Data Lakehouse  

Support

  • Providing ongoing support as a dedicated development partner for the client.

Testimonial

After we set up the environment and enabled the client to start using it, they were eager to roll out apps using cloud resources. It was exciting to see the client using the environment extensively. We also took feedback from stakeholders, as below:

Santosh Dixit
Digitization Delivery Lead

Next Phase

We are now looking at the next phase of the project, which involves:

  1. API and file-based data sources to be added  
  2. Process data to be used in different applications for ingesting in other applications  

Looking For Similar Services? Please Get In Touch

Executive Summary

About Client

SYD Consulting (Services You Demand) is an SAP professional consultancy founded in April 2013 by Grant McPherson. Grant had been working in international SAP consultancy businesses in NZ and Europe when he and his wife Katy chose to return to New Zealand. Grant was frustrated with local SAP consultancy practices because they had a profit-first focus. SYD was explicitly formed with the core values of Customer first, Employee second, and Profit third.

 

https://www.syd.co.nz/
Location: Auckland, NZ

Project Background

SYD Consulting created a digital network on the SAP Business Technology Platform and integrated it using SAP Integration Suite to track and trace its products at every phase of the supply chain. SAP CPI Support enabled:

  • Optimal implementation of source-to-pay and design-to-operate processes, leveraging the cloud integration capabilities of the SAP Integration Suite
  • Higher security as a result of simplified integration among connected participants due to the cloud integration capabilities of the SAP Integration Suite
  • Increased trust in the SYD Consulting brand
  • Enhanced business agility and lower transaction costs

Scope & Requirement

Implementation

Technology and Architecture

Technology 

The web app was deployed with the following technology components:
• Application: SAP BOBJ
• File migration: LCM
• Database: SQL Server
• Cloud: N/A (on-premises servers)

Integrations
• Single sign-on using Active Directory
• No other integration requirements

Security:

• Built-in data encryption
• Role- and authorization-based access for each user group, with admin access for specific users

Backup and Recovery

The existing file-system backup functionality was reused.

Scalability

Not Required 

 

Cost Optimization 

Covered under the existing license agreement with SAP; no additional costs were needed.

Code Management, Deployment

NA

Challenges

Worked on a complex integration scenario involving multiple integration tools, such as MuleSoft.

Project Completion

Duration

Nov 2022 – ongoing

Deliverables

  • Support for existing production CPI flows

Support

  • For all licenses we implement, we provide monthly billing with 20-day credit terms.
  • We provide value-added services by sending the client reports on license usage and each user’s last activity date, helping them manage their license costs and gain visibility.

Testimonial

Awaited

Next Phase

We are also in discussion with the client on other projects:

  1. Ongoing CPI work identified 
  2. MuleSoft 3.9 to MuleSoft 4 migration

Looking For Similar Services? Please Get In Touch

Executive Summary

About Client 

AmityWA empowers participants and provides them with support and assistance that build their skills and independence. It is often difficult to manage all the data and keep track of each participant, so AmityWA needed a server providing a mechanism for programmatic control of the SAP Support Services – Azure Remote Server Setup and its associated hosted virtual machines.

 

https://amitywa.com.au/

Location: Perth, Australia 

Project Background

Peritos assisted AmityWA in streamlining all their servers and explained how to address their data challenges by leveraging the SAP Support Services – Azure Remote Server Setup. Peritos offered discounted licensing and guidance on reducing ongoing costs.

Scope & Requirement

Below are some of the scope items:

  • Implemented Azure Windows Server support, along with licensing support at a discounted price
  • Advised on best practices for Azure license management
  • Recommended a savings plan
  • Arranged a 1-month free trial, extended to a 2-month trial as a partner
  • Identified the best licensing for the company
  • Handover and questions

Implementation

Technology and Architecture

Technology 

  • SAP Support Services-Azure Remote Server Setup
  • Team Member License

Integrations

  • NA 

Security:

  • Comes with Microsoft standard support and security cover

Backup and Recovery

  • Microsoft production support
  • 1 live environment and 2 sandbox environments, with multiple companies that can be created

 

Scalability

As applicable with the standard Service

Cost Optimization 

Explained to the client how costs could be saved through monthly and annual commitment orders.

Mixed Team Member licensing and discounted licensing aided cost savings.

Code Management, Deployment

Deployment for the SAP Support Services – Azure Remote Server Setup was carried out by Peritos.

 

Challenges of SAP Support Services-Azure Remote Server Setup

AmityWA aims to be a provider of services that empower NDIS participants toward their goals. While catering to all participants’ needs, it is important that each participant’s data is secure and accessible to AmityWA. For that, they required a virtual server enabling faster and easier backup and recovery of key application workloads. Operating through a virtual server gave them the freedom to provision applications and resources faster, improving disaster recovery and business continuity, and minimizing or eliminating downtime.

Project Completion

Duration

Oct 2021, ~2 working days

Deliverables

  • Delivered a web app that smoothly automates and manages the leads for each of AttentionSeeker’s clients, ensuring the data is stored in the app and clients can see the progress made by the AttentionSeeker team, saving time on frequent progress-update calls.
  • A customized design of the app was discussed with the client.
  • Training and handover in using the app, explaining how users and other clients can easily be onboarded.

Support for Setting Up Azure Windows Virtual Server

  • For all licenses we implement, we provide month-end billing with 20-day credit terms.
  • We provide value-added services by sending the client reports on license usage and each user’s last activity date, helping them manage their license costs and gain visibility.

Testimonial

Awaited

Next Phase

We are also working with the client on other projects:

1. Support for Dynamics system

2. Helping with the integration of Business Central with e-commerce online stores (Amazon, WooCommerce, etc.)

Looking For Similar Services? Please Get In Touch

Executive Summary

About Client

CumulusPro provides a cloud-based digital image-processing, border-detection, and verification platform that delivers a quick and efficient onboarding experience for your customers. CumulusPro helps businesses rapidly transform into digital enterprises by linking people, processes, and applications. Their cloud-based Business Process Management (BPM) platform is designed to revolutionize how organizations and public institutions digitally communicate and collaborate with their customers, citizens, and partners. Digital businesses improve customer experience and increase customer satisfaction, business efficiency, and productivity while reducing cost and time to market.

https://cumuluspro.com/solutions/identify-plus/

Location: Singapore

Project Background - ID Card Detection

This project explains how we implemented border detection and image processing. It involved taking pictures from the browser camera, capturing images across different browsers and devices, storing the images in blob storage, and storing the relevant information in SQL Server. Digital image processing matters here because, as processes are digitalized, people photograph different ID cards (driving licenses, passports, etc.) and upload them for various purposes, for instance when filling in online applications and forms. A first level of image-processing algorithms is applied to these images before they are uploaded, enabling seamless application processing despite common issues in the captured images such as white space and skew.

Scope & Requirement

The scope of work for image-processing border detection (custom app development: ID card detection and web app development) was as follows:

The user captures an image from the camera and should be able to upload it, stored in a suitable format. If the image is not correct, it should either be retaken or corrected using a series of algorithms that we implemented:
  • Ensure image classification is done, as there were 2 IDs being scanned.
  • Change the orientation of the image to de-skew it to 0 degrees.
  • Refine the resolution of the image and detect borders to crop it.
  • If the image cannot be corrected, ask the user to retake it before uploading it to the system.
  • If taken successfully, store the image on the BLOB server and run OCR to store the important identification information in the corresponding server.
  • Pre-process to convert the image to greyscale, reduce noise, and improve image quality and size using PyrDown and PyrUp.

Implementation

Technology and Architecture

Technology 

The mobile app was deployed with the following technology components:
• Backend code: .NET Core, C#, Node.js
• Mobile app code: React Native
• Web app code: ReactJS
• Database: SQL Server, MongoDB
• Cloud: Microsoft Azure

Integrations
• Integration to read all data from an existing Shopify backend
• Single sign-on using Auth0 to register with Google, following the same login procedure as the Shopify web app
• SendGrid to send emails for order-processing events

Security:

• All API endpoints are tokenized.
• A secure, PCI DSS-compliant payment gateway such as PayPal would be used.

Backup and Recovery

The cloud systems and components used are secure and carry a 99.99% SLA. We added an HA/DR mechanism to create replicas of the services.

Scalability

The application is designed to scale up to 10x the average load received in the first 6 months of its usage, and all cloud resources are configured to auto-scale based on load.

Cost Optimization 

Alerts and notifications are configured in the Azure cloud to ensure that a notification is sent out if the budget is exceeded.

Code Management, Deployment

The code for the app was handed over to the client through Microsoft App Center.

CI/CD is implemented to automatically build and deploy any code changes on the app side.

Features of Image Processing Border Detection-custom app development

  • Automatic ID card border detection and image cropping
  • Removal of skew (if present)
  • Correction of orientation

Challenges

We encountered some issues, as below:

  • Establishing pre-defined system criteria that must be met before uploading the picture. This included deciding threshold values for the image-processing algorithms used to remove the white space around the ID card and the skew in the captured image.
  • The Canny algorithm used for border detection had some limitations and didn’t work in all scenarios, for example where the ID card had a more prominent rectangle drawn on it, or where a white ID card was photographed against a white background. These cases didn’t give a correct collection of lines and could result in incorrect border detection. To overcome this problem we applied FindContours to the binary image, which gave a more reliable result.
  • We had to cater for devices of varying power and capability, since the solution had to run on various platforms. As we got into the testing phase, we had to avoid large variations in responsiveness depending on the device used.

Project Completion

Duration

Deliverables

We used the OpenCV library for image processing along with the EmguCV wrapper, which is compatible with .NET languages (C#, VB, VC++, etc.). We implemented the solution through the following steps (a sketch of the pipeline appears below):

Down-sample -> noise reduction -> up-sample -> image enlargement

  • Edge detection using the Canny algorithm (to convert the image into a collection of lines)
  • Detecting the outermost lines for border detection through a probabilistic Hough transform, filtering lines by width and length
  • Using FindContours to find objects at the outermost location (to overcome the shortcomings of the Hough transform)
  • Post-processing, which includes cropping the image along the border to obtain a processed image containing the complete ID card, with any skew removed
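
The production code used EmguCV from .NET; the following is a minimal Python/OpenCV sketch of the same pipeline, with thresholds chosen for illustration rather than taken from the delivered system.

```python
import cv2

def extract_card(image_bgr):
    # Down-sample then up-sample (pyrDown/pyrUp) to suppress noise.
    smooth = cv2.pyrUp(cv2.pyrDown(image_bgr))
    gray = cv2.cvtColor(smooth, cv2.COLOR_BGR2GRAY)

    # Canny turns the image into a collection of edges.
    edges = cv2.Canny(gray, 50, 150)

    # Outermost contours are more reliable than Hough lines on
    # low-contrast cards (e.g. a white card on a white background).
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    card = max(contours, key=cv2.contourArea)

    # The minimum-area rectangle gives the border plus the skew angle.
    (cx, cy), (w, h), angle = cv2.minAreaRect(card)

    # Rotate the whole image so the card sits at 0 degrees, then crop.
    M = cv2.getRotationMatrix2D((cx, cy), angle, 1.0)
    h_img, w_img = image_bgr.shape[:2]
    rotated = cv2.warpAffine(image_bgr, M, (w_img, h_img))
    return cv2.getRectSubPix(rotated, (int(w), int(h)), (cx, cy))
```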

Support

As part of the project implementation we have standard practice of providing 1 month of extended support. This includes any Major / Minor bug fixes. 

Testimonial

We took feedback from stakeholders, as below:

Syed Mohd. Atif
Co-Founder, Enticed Retail LLP

Next Phase

We are now looking at the next phase of the project, which involves:

1. Images may not be of the same template, so machine learning can be added to identify the image template and process the image accordingly.
2. For a specific template, there should be a training model that helps the system learn what a new image template would look like, so it can adapt to new template images faster.

Looking For Similar Services? Please Get In Touch

Executive Summary

About Client

DEEPIQ simplifies industrial analytics and offers the only self-service application for end-to-end (Data + AI) Ops for the industrial world. With this app, your SMEs can digitize their expertise, and your data teams can build and deploy robust analytic pipelines.

https://deepiq.com/

Location: Offices globally in the USA, India, and Canada

Project Background

We help clients move to the cloud from their existing system landscape. In this particular case study, we helped the client DEEPIQ migrate their website, hosted in WordPress, to Azure.

Scope & Requirement

In this white paper, you will learn how we helped the client migrate to the Azure cloud via a cost-effective, flexible migration path. Using the tools and resources provided by Azure, you can move your website to Azure at minimal cost; and if you have a good standing reputation and substantial cloud consumption owing to a wide user base, you may also qualify for a migration incentive from Azure that can cover your costs.

Implementation

Technology and Architecture

Technology 

The migration was deployed with the following technology components:
• Azure Dataverse: the underlying technology used was Azure SQL Database.

• Azure Blob Storage: supports the most popular development frameworks, including Java, .NET, Python, and Node.js.

Security & Compliance:

  • Tagging Policies
  • Azure Config for compliance checks
  • NIST compliance 
  • Guardrails
  • Security Hub

Backup and Recovery

Azure Backup provided a simple, secure, cost-effective, cloud-based backup solution to protect business- and application-critical data stored in Azure Blob Storage in two ways: continuous backups and periodic backups.

Network Architecture 

  • Site-to-site VPN architecture using Transit Gateway
  • Distributed Azure Network Firewall
  • Monitoring with CloudWatch and VPC Flow Logs

Cost Optimization 

  • Alerts and notifications were configured in Azure Cost Management so that a notification is sent out if the budget is exceeded.

Code Management, Deployment

  • CloudFormation scripts for creating StackSets, along with scripts for generating the Azure services, were handed over to the client.

Challenges

We encountered some issues, as below:

  • Some system changes did not work per the new standards.
  • Creating a backup strategy without proper backup tools in place.
  • Ensuring the end-to-end migration did not lead to data loss.
  • Additional security enhancements had to be added to ensure unauthorized users cannot modify reports.

Project Completion

Duration

Deliverables

  • Migration of data and applications securely, without impacting the existing user base
  • Using Traffic Manager to configure routing to 2 different websites (see the sketch below)
  • Adding Auth0, with Traffic Manager routing traffic to different domains
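
As a hedged sketch of the Traffic Manager piece, the Python snippet below creates a profile that routes between two external sites by priority, using the azure-mgmt-trafficmanager SDK. The subscription id, resource group, DNS name, and target hosts are placeholders, not the client’s actual configuration.

```python
# pip install azure-identity azure-mgmt-trafficmanager
from azure.identity import DefaultAzureCredential
from azure.mgmt.trafficmanager import TrafficManagerManagementClient
from azure.mgmt.trafficmanager.models import (
    Profile, DnsConfig, MonitorConfig, Endpoint,
)

client = TrafficManagerManagementClient(
    DefaultAzureCredential(), "<subscription-id>"
)

profile = Profile(
    location="global",  # Traffic Manager profiles are always global
    traffic_routing_method="Priority",  # fail over from site 1 to site 2
    dns_config=DnsConfig(relative_name="deepiq-web", ttl=30),
    monitor_config=MonitorConfig(protocol="HTTPS", port=443, path="/"),
    endpoints=[
        Endpoint(
            name="primary-site",
            type="Microsoft.Network/trafficManagerProfiles/externalEndpoints",
            target="site1.example.com",
            priority=1,
        ),
        Endpoint(
            name="secondary-site",
            type="Microsoft.Network/trafficManagerProfiles/externalEndpoints",
            target="site2.example.com",
            priority=2,
        ),
    ],
)

client.profiles.create_or_update("<resource-group>", "deepiq-tm", profile)
```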

Testimonial

Evania Fernandes
Building Manager
Ultimate Property Group

Next Phase

Looking For Similar Services? Please Get In Touch

Executive Summary

About Client

The world today is witnessing a growing trend in the use of technology in the health sector. This allowed us to assist our client, a pharmaceutical company, in tracking medical devices and the quality of medicines on the go, along with inventory and transit status, and we supported their AWS IoT integration.

Project Background

In this case study, we achieved the following:

  • Implemented a small AWS IoT integration application with a toolkit to assure product quality, elevate the efficiency of medical devices, and raise alerts when manual intervention is required
  • Set up AWS for the application to manage the devices seamlessly
  • Interacted with the devices to fetch vital information
  • Finally, created a mobile application using AWS IoT to monitor the devices

Scope & Requirement

We used the solution components below to create a responsive web application that gives a holistic view of all the devices connected to the system and information on their vital parameters.

Implementation

Technology and Architecture

Technology/ Services used

We used AWS services and helped them set up the following:

  • Cloud: AWS
  • Organization setup: AWS Control Tower
  • AWS SSO for authentication using existing Azure AD credentials
  • Policies setup: created AWS service control policies
  • Templates created for using common AWS services

Security & Compliance:

  • Tagging Policies
  • AWS Config for compliance checks
  • NIST compliance
  • Guardrails
  • Security Hub

Network Architecture 

  • Site-to-site VPN architecture using Transit Gateway
  • Distributed AWS Network Firewall
  • Monitoring with CloudWatch and VPC Flow Logs

Backup and Recovery

  • The cloud systems and components used followed AWS’s Well-Architected Framework, and the resources all had multi-zone availability with uptime of 99.99% or more.

Cost Optimization 

  • Alerts and notifications were configured in AWS Cost Management so that a notification is sent out if the budget is exceeded.

Code Management, Deployment

  • CloudFormation scripts for creating StackSets, along with scripts for generating the AWS services, were handed over to the client.

Challenges

We encountered some issues, as below:

  • AWS setup and pricing were complicated to understand, as they are based on usage and consumption, which was difficult to assess at the start of the application.
  • Ensuring data privacy and security was of utmost importance, since poorly encrypted devices can be hacked without much effort, allowing unauthorized access.
  • Impeccable quality assurance of the whole setup had to be achieved, as the pharmaceutical industry deals with medicines and surgical instruments, so there was a need for honest sharing of information whenever anything was not going as expected.
  • Understanding the client’s vision of how they needed the UI was challenging.

Project Completion

Duration

Deliverables

  • Responsive web application
  • Login mechanism using 2-factor authentication
  • Integration with AWS IoT to send and receive data
  • Screen designs created using SAP Build, following Fiori guidelines
  • Raspberry Pi 3
  • Humidity and temperature sensors to be added
  • Use of core AWS IoT integration functionality
  • Backend in Azure
  • Coding in C#, JavaScript, and Angular 5 following Fiori guidelines
  • Using MQTT to receive data (see the sketch below)
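
As a hedged sketch of the MQTT piece, the Python snippet below subscribes to device telemetry the way an AWS IoT Core client would, using the paho-mqtt library with X.509 certificates over TLS. The endpoint, topic scheme, certificate paths, and alert threshold are placeholders.

```python
# pip install "paho-mqtt<2"  (1.x callback signatures used below)
import json
import paho.mqtt.client as mqtt

ENDPOINT = "<account-prefix>-ats.iot.ap-southeast-2.amazonaws.com"
TOPIC = "devices/+/telemetry"  # hypothetical topic naming scheme

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # e.g. {"deviceId": "rpi3-01", "tempC": 9.4, "humidity": 41}
    if reading.get("tempC", 0) > 8:
        # Raise an alert when a shipment leaves its safe temperature range.
        print(f"ALERT: {reading['deviceId']} out of range: {reading['tempC']} C")

client = mqtt.Client()
# AWS IoT Core authenticates devices with X.509 certificates over TLS (8883).
client.tls_set(ca_certs="AmazonRootCA1.pem",
               certfile="device.pem.crt",
               keyfile="private.pem.key")
client.on_connect = on_connect
client.on_message = on_message
client.connect(ENDPOINT, 8883)
client.loop_forever()
```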

Support

  • 1 month of extended support
  • A CloudFormation stack template to create more AWS resources using the available stacks
  • Screen-sharing sessions with a demo of how the services and new workloads can be deployed

Testimonial

Evania Fernandes
Building Manager
Ultimate Property Group

Next Phase

Looking For Similar Services? Please Get In Touch