Data and Infrastructure Engineer - GCP
As the Data and Infrastructure Engineer - GCP, you will work collaboratively with other teams and promote best practices in cloud application architecture, security, SDLC, QA/QC, CI/CD, and DevOps.
Bachelor’s degree in engineering, computer science, or a related field, or equivalent work experience
Minimum of 3 years’ experience with data management and cloud back-end / API development
Demonstrated experience working in Agile and DevOps environments
Strong written and oral communication skills
Building, documenting, deploying, and maintaining secure RESTful APIs
Enterprise API security and identity management best practices and frameworks
Architecting, managing, and deploying relational and non-relational databases
Microservices design principles and frameworks
Strong understanding of performance optimization, especially in latency-sensitive (millisecond scale) environments
Working with fulfillment services (access control, authentication, messaging, etc.)
CI/CD pipeline integration
Cloud application and infrastructure architectures, including distributed storage, compute, serverless, and containerized Docker / Kubernetes applications
Tech Stack strengths:
Linux (various distributions), Google App Engine (GAE), Google Compute Engine (GCE), Cloud Run
Node.js
Tableau, Data Studio, Looker
Git (GitHub, Bitbucket)
JIRA and Confluence
Terraform
CI/CD (Jenkins, Docker Hub)
Docker / Kubernetes
API frameworks, API and Identity Management frameworks (e.g., WSO2 or equivalent)
Google Cloud Platform (preferred) or equivalent cloud experience:
Firebase, Firestore or an equivalent NoSQL document store, BigQuery
App Engine or equivalent cloud application deployment framework
Dialogflow (preferred) or an equivalent engine
Machine Learning APIs and frameworks (AutoML, Translation APIs, and others)
Tech Stack experience:
HTML5 / CSS3
JavaScript and front-end frameworks (Vue.js)
Python (Django / Flask)
Jenkins / Maven / Selenium
ISO/IEC 27001 standards