

Cloud Data Architect - Remote

Onix Networking Corp. - Rocky River, OH

Posted: 7/26/2021 - Expires: 10/7/2021

Job ID: 229933391


Job Description

Summary: In this role, the Cloud Data Architect works as a consultant for Onix clients, defining how they ingest, store, process, analyze, and explore/visualize data on cloud platforms (e.g., AWS, GCP, Azure). The Cloud Data Architect provides thought leadership for the client and then translates it into actionable steps for cloud data engineers to implement. This role combines architect and analyst responsibilities, specific to cloud data technology.


Scope/Level of Decision Making: This is a non-exempt position operating under limited decision-making and supervision. Position performs a variety of assigned activities, referring more complex issues to the manager.


Primary Responsibilities: 

  • Deliver data architecture projects for Onix clients.

  • Deliver high-level architecture design documents for customer engagements.

  • Develop detailed-level design requirements for data architecture implementations.

  • Work with clients to understand and develop their target data architecture needs.

  • Assess an organization’s readiness for using data on a cloud platform.

  • Work internally to develop delivery guides, assessments, and common data architecture patterns for cloud data projects.

  • Work with cloud architects, cloud data engineers, and developers on engagements to deliver end-to-end solutions for clients.


Preferred Skills and Experience: 

  • AWS Data Analytics, Machine Learning, Database Specialty certifications a plus

  • GCP Data Engineer, Machine Learning Engineer certifications a plus

  • 10 years' experience working with big data architectures

  • Experience working with AWS, GCP

  • Customer-facing soft skills such as communication, conversation, and business awareness

  • Three to five years’ experience in a Cloud Data Architect role or related position

  • Five to seven years’ experience implementing foundational data architectures

  • Experience with data processing software (such as Hadoop, Spark, Pig, Hive) and with data processing models (such as MapReduce)

  • Experience in writing software in one or more languages such as Java, C++, Python, Go, and/or R

  • Experience with Business Intelligence and analytics processes, workflows, and tools (Looker, Tableau, Qlik, Quicksight, etc.)

  • Experience working with data warehouses, including data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools and environments

  • Experience architecting and developing software or internet-scale, production-grade Big Data solutions in virtualized environments such as Amazon Web Services, Azure, and Google Cloud

  • Experience working with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, MongoDB, SparkML, TensorFlow)



Education: BA/BS degree in Computer Science, Mathematics or related technical field, or equivalent practical experience

Travel Expectation: 30-40% Domestic.

Job Summary

Employment Type:
Full Time Employee
Job type:
Federal Contractor
Skill Based Partner:
Education Level:
Bachelor's degree
Work Days:
Mon, Tue, Wed, Thu, Fri
Job Reference Code:
Depending on Experience
Licenses / Certifications:
Recommended WorkKeys®:
Applied Math: 4
Graphic Literacy: 4

Workplace Documentation: 5