AWS Compute & High-Performance Computing for Tonkin + Taylor

Executive Summary

About the Client


Tonkin + Taylor is New Zealand’s leading environmental and engineering consultancy, with offices located globally. They shape the interfaces between people and the environment, spanning earth, water, and air. Their work has earned awards including the Beaton Client Choice Award for Best Provider to Government and Community (2022) and the IPWEA Award for Excellence in Water Projects for the Papakura Water Treatment Plant (2021).

https://www.tonkintaylor.co.nz/
Location: New Zealand

Project Background

Tonkin + Taylor were launching a full suite of digital products and chose AWS as their cloud environment. They wanted to accelerate their digital transformation and deliver greater business value through AWS development environment best practices. To achieve this, we needed to configure AWS Compute & High-Performance Computing following best practices and meeting compliance standards, so that it could serve as a foundation for implementing further applications.

The AWS Lakehouse is a central data hub that consolidates data from various sources and serves all applications and users. It can quickly identify and integrate new data sources. Data passes through a three-stage refining process: Landing, Raw, and Transformed. After refinement, data is added to the data catalog and made available for consumption through a relational database.
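
To make the three-stage layout concrete, below is a minimal CloudFormation sketch of the storage zones and a catalog database. All bucket and database names are hypothetical; the case study does not publish the actual resource definitions.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: >-
  Illustrative lakehouse storage zones (Landing, Raw, Transformed)
  and a Glue Data Catalog database. All names are hypothetical.

Resources:
  LandingBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub 'tt-lakehouse-landing-${AWS::AccountId}'  # hypothetical name
      VersioningConfiguration:
        Status: Enabled

  RawBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub 'tt-lakehouse-raw-${AWS::AccountId}'  # hypothetical name

  TransformedBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub 'tt-lakehouse-transformed-${AWS::AccountId}'  # hypothetical name

  # Refined tables are registered in this catalog database for consumption.
  LakehouseDatabase:
    Type: AWS::Glue::Database
    Properties:
      CatalogId: !Ref 'AWS::AccountId'
      DatabaseInput:
        Name: tt_lakehouse  # hypothetical name
        Description: Catalog database for transformed lakehouse tables
```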

Scope & Requirements for AWS Compute & High-Performance Computing

The first phase of the AWS environment setup covered the following implementation:

  • Implement Data Lakehouse on AWS

Implementation

Technology and Architecture of AWS Compute & High-Performance Computing

Read more about the key components that defined the implementation of the Data Lakehouse on AWS for Tonkin + Taylor.

Technology/Services Used

We used AWS services and helped the client set up the following:

  • Cloud: AWS
  • Organization setup: AWS Control Tower
  • Authentication: AWS SSO using existing Azure AD credentials
  • Policies: AWS service control policies (SCPs) were created (see the sketch after this list)
  • Templates were created for provisioning common AWS services
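
The case study does not reproduce the actual service control policies, but a common baseline SCP restricts workloads to approved regions. The sketch below is illustrative only: it assumes ap-southeast-2 (Sydney) as the primary region, and the policy name is hypothetical.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative SCP denying activity outside an approved region.

Resources:
  # Must be deployed from the Organizations management (or delegated admin) account.
  RegionRestrictionScp:
    Type: AWS::Organizations::Policy
    Properties:
      Name: deny-unapproved-regions  # hypothetical policy name
      Type: SERVICE_CONTROL_POLICY
      Description: Deny actions outside the approved region.
      Content:
        Version: '2012-10-17'
        Statement:
          - Sid: DenyOutsideApprovedRegions
            Effect: Deny
            NotAction:
              # Global services that must remain reachable from any region
              - iam:*
              - organizations:*
              - sts:*
              - support:*
            Resource: '*'
            Condition:
              StringNotEquals:
                aws:RequestedRegion:
                  - ap-southeast-2  # assumed primary region
```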

Security & Compliance

  • Tagging policies
  • AWS Config for compliance checks (see the sketch after this list)
  • NIST compliance
  • Control Tower guardrails
  • AWS Security Hub
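
As one example of how the tagging policies and AWS Config can work together, the sketch below uses the AWS managed REQUIRED_TAGS rule. The tag keys and resource types are assumptions, not the client's actual policy, and it presumes an AWS Config recorder is already enabled (Control Tower provisions one).

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative AWS Config managed rule enforcing a tagging policy.

Resources:
  RequiredTagsRule:
    Type: AWS::Config::ConfigRule
    Properties:
      ConfigRuleName: required-tags-check  # hypothetical rule name
      Source:
        Owner: AWS
        SourceIdentifier: REQUIRED_TAGS  # AWS managed rule
      InputParameters:
        tag1Key: CostCentre   # assumed tag keys
        tag2Key: Environment
        tag3Key: Owner
      Scope:
        ComplianceResourceTypes:
          - AWS::EC2::Instance
          - AWS::S3::Bucket
```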

Network Architecture 

  • Site-to-site VPN architecture using Transit Gateway (see the sketch after this list)
  • Distributed AWS Network Firewall
  • Monitoring with CloudWatch and VPC Flow Logs
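
A minimal sketch of the site-to-site pattern: an IPsec VPN connection terminating on a Transit Gateway. The on-premises ASN and public IP are placeholders, and VPC attachments, route tables, and the Network Firewall are omitted for brevity. VPC Flow Logs would attach to each VPC via AWS::EC2::FlowLog resources in the same fashion.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative site-to-site VPN terminating on a Transit Gateway.

Resources:
  # Central hub for VPC and on-premises connectivity
  Tgw:
    Type: AWS::EC2::TransitGateway
    Properties:
      Description: Hub for VPC and on-premises routing

  # Represents the on-premises VPN endpoint
  OnPremGateway:
    Type: AWS::EC2::CustomerGateway
    Properties:
      Type: ipsec.1
      BgpAsn: 65000            # assumed on-premises ASN
      IpAddress: 203.0.113.10  # placeholder public IP

  VpnToOnPrem:
    Type: AWS::EC2::VPNConnection
    Properties:
      Type: ipsec.1
      TransitGatewayId: !Ref Tgw
      CustomerGatewayId: !Ref OnPremGateway
```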

Backup and Recovery

  • Cloud systems and components followed the AWS Well-Architected Framework, and resources were deployed across multiple Availability Zones with uptime of 99.99% or more (see the sketch below)
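
The case study highlights multi-AZ availability rather than a specific backup tool. As one hedged illustration of the recovery side, an AWS Backup plan with a daily schedule might look like this; the vault name, schedule, and retention period are all assumptions.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative daily AWS Backup plan with 35-day retention.

Resources:
  BackupVault:
    Type: AWS::Backup::BackupVault
    Properties:
      BackupVaultName: tt-default-vault  # hypothetical name

  DailyBackupPlan:
    Type: AWS::Backup::BackupPlan
    Properties:
      BackupPlan:
        BackupPlanName: daily-backups
        BackupPlanRule:
          - RuleName: daily-0300-utc
            TargetBackupVault: !Ref BackupVault
            ScheduleExpression: cron(0 3 * * ? *)  # daily at 03:00 UTC
            Lifecycle:
              DeleteAfterDays: 35  # assumed retention
```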

Cost Optimization 

  • Alerts and notifications were configured for AWS cost monitoring (see the sketch below)
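
A typical way to implement such alerts is an AWS Budgets budget with a notification threshold. The sketch below is illustrative only; the limit, threshold, and email address are placeholders.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative monthly cost budget with an email alert.

Resources:
  MonthlyCostBudget:
    Type: AWS::Budgets::Budget
    Properties:
      Budget:
        BudgetName: monthly-cost-budget  # hypothetical name
        BudgetType: COST
        TimeUnit: MONTHLY
        BudgetLimit:
          Amount: 1000  # assumed threshold (USD)
          Unit: USD
      NotificationsWithSubscribers:
        - Notification:
            NotificationType: ACTUAL
            ComparisonOperator: GREATER_THAN
            Threshold: 80  # alert at 80% of the budget
            ThresholdType: PERCENTAGE
          Subscribers:
            - SubscriptionType: EMAIL
              Address: cloud-team@example.com  # placeholder address
```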

Code Management & Deployment

  • CloudFormation scripts for creating StackSets, along with scripts for provisioning AWS services, were handed over to the client (see the sketch below)
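
A StackSet resource of the kind handed over might look like the following. The OU ID, template URL, and names are placeholders; the actual scripts were client deliverables and are not reproduced here.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative StackSet rolling a baseline template out to an OU.

Resources:
  BaselineStackSet:
    Type: AWS::CloudFormation::StackSet
    Properties:
      StackSetName: org-baseline        # hypothetical name
      PermissionModel: SERVICE_MANAGED  # delegated via AWS Organizations
      AutoDeployment:
        Enabled: true
        RetainStacksOnAccountRemoval: false
      StackInstancesGroup:
        - DeploymentTargets:
            OrganizationalUnitIds:
              - ou-xxxx-exampleid  # placeholder OU ID
          Regions:
            - ap-southeast-2
      TemplateURL: https://example-bucket.s3.amazonaws.com/baseline.yaml  # placeholder
```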

AWS Compute & High-Performance Computing: Challenges & Solutions

  • Diverse data sources: data analytics, clean-up, and integration patterns were needed to pull data from the different sources

  • Connecting on-premises data to the data lake: a secure site-to-site AWS connection was implemented

  • Templatized format for creating pipelines: scripts in a standard format, deployment scripts, and CI/CD scripts were created (see the sketch after this list)
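
The pipeline template format itself is not shown in the case study. Assuming the pipelines were AWS Glue ETL jobs (an assumption consistent with the lakehouse design), a minimal parameterized template for stamping out one pipeline job could look like this:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative parameterized template for one Glue pipeline job.

Parameters:
  PipelineName:
    Type: String
    Description: Logical pipeline name (assumed naming convention)
  ScriptLocation:
    Type: String
    Description: S3 path to this pipeline's ETL script
  GlueRoleArn:
    Type: String
    Description: IAM role ARN the Glue job runs as

Resources:
  PipelineJob:
    Type: AWS::Glue::Job
    Properties:
      Name: !Sub '${PipelineName}-etl'
      Role: !Ref GlueRoleArn
      GlueVersion: '4.0'
      Command:
        Name: glueetl
        ScriptLocation: !Ref ScriptLocation
```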

Project Completion

Duration of AWS Compute & High-Performance Computing

April 2023 to July 2023 (~4 months)

Deliverables for AWS Compute & High-Performance Computing

  • Scripts to create and deploy pipelines
  • Data Lakehouse implementation

Support

  • Providing ongoing support, as we are a dedicated development partner for the client

Testimonial

After we set up the environment and enabled the client to start using it, they were eager to roll out apps on the new cloud resources. It was exciting to see the client using the environment extensively. We also collected feedback from stakeholders, as below:

Santosh Dixit
Digitization Delivery Lead

Next Phase

We are now looking at the next phase of the project, which involves:

  1. Add API and file-based data sources
  2. Process data so it can be ingested and used in other applications

Looking for similar services? Please get in touch.