
Cloud Data Engineer - Remote

Onix Networking Corp. - Lakewood, OH

Posted: 10/19/2021 - Expires: 1/17/2022

Job ID: 233029179


Job Description

Summary: In this role, the Cloud Data Engineer helps customers transform and evolve their business through the use of Google’s extensive cloud services. As part of an entrepreneurial team in this rapidly growing business, you will work with cutting-edge cloud technologies and help shape the future of how data is used in the enterprise. The Cloud Data Engineer is responsible for developing and maintaining efficient, highly available data pipelines that run on cloud data foundations and support analysis and reporting by the Data and Analytics departments.


Scope/Level of Decision Making: This is an exempt position operating under limited decision-making and supervision. The position performs a variety of assigned activities, referring more complex issues to the manager.


Location: Remote-Nationwide, Remote-Ontario, Remote-Quebec


Primary Responsibilities: 

  • Use Google Cloud Platform to build Enterprise-grade Big Data solutions.

  • Work in tandem with our architecture team to identify and implement optimal cloud-based solutions for the company.

  • Create and develop cloud-based data solutions.

  • Manage cloud environments in accordance with company security guidelines.

  • Deploy and debug cloud data initiatives in accordance with best practices throughout the development lifecycle.

  • Employ problem-solving skills to identify and resolve issues before they escalate.

  • Build new cloud-based data pipelines.

  • Bring together multiple data sources into a unified data warehouse.

  • Apply analytics and visualizations to customer data sets.

  • Assist in strategic direction and planning for the growth of the Cloud Data Team.

  • Act as a technical lead on projects.

  • Mentor junior engineers.

  • Review and provide feedback on code written by other engineers.


Preferred Skills and Experience: 

  • 4+ years of consulting experience.

  • Strong, demonstrable experience with the following programming languages:

    • Python

    • Scala

    • Java

  • Knowledge of the following visualization tools:

    • Data Studio

    • Looker

    • Tableau

    • Qlik

    • Quicksight

  • Experience with large data sets and Enterprise-grade databases (structured and unstructured)

  • Experience deploying data pipelines.

  • Deep understanding of the ETL (extract, transform, load) process.

  • Experience extracting data from multiple sources via APIs and scripting.

  • Experience transforming data through field mapping, programmatic rulesets, and data integrity checking.

  • Able to expertly convey ideas and concepts to others.

  • Excellent communication skills (verbal, written, and presentation)

  • Creative problem-solving skills and the ability to design solutions not immediately apparent. 

  • Ability to participate in multiple projects concurrently.

  • Customer-oriented and shows a bias for action.

  • Able to function in a highly dynamic team that moves rapidly from idea to planning to implementation.

  • Highly adaptable with the ability to learn new technologies quickly without direct oversight.

  • Extensive experience with open-source technology, software development, and system engineering.

  • Excellent communication and organizational skills, and the ability to stay focused on completing tasks and meeting goals within a busy workspace.

  • Interest in Cloud Engineering and its impact on greater business practices.

  • Skilled at working in tandem with a team of engineers, or alone as required.

  • 4+ years of GCP data engineering experience with the following:

    • Cloud Composer

    • DataFlow

    • Apache Beam

    • DataProc

    • Spark

    • Data Prep

    • BigQuery

    • Cloud SQL

  • 4+ years of ETL/ELT experience.

  • 4+ years of experience with Data Lake patterns.

  • 4+ years of experience with the Python, Scala, and Java programming languages.

  • Experience with database systems (SQL and NoSQL).

  • Experience with data warehousing solutions.

  • Experience working with data structures and models

  • Experience delivering and deploying data optimization and modernization initiatives.

  • Experience using and automating ETL tools for batch and streaming processes

  • Experience developing distributed processes.

  • Experience with one or more of the following data consumption tools and services:

    • Machine Learning

    • BigQuery ML

  • Familiarity and experience with the machine learning development life cycle.

  • Practical knowledge of machine learning models.

  • Experience architecting or developing internet-scale, production-grade Big Data solutions in virtualized environments such as Amazon Web Services, Azure, and Google Cloud.

  • Experience working with big data, information retrieval, data mining, or machine learning, as well as experience building multi-tier, high-availability applications with modern web technologies (such as NoSQL, MongoDB, SparkML, and TensorFlow).


Education: Bachelor's Degree preferred but not required.

Travel Expectation: 20% domestic, 10% international.

Job Summary

Employment Type:
Full Time Employee
Job type:
Federal Contractor
Skill Based Partner:
Education Level:
Bachelor's degree
Work Days:
Mon, Tue, Wed, Thu, Fri
Job Reference Code:
Depending on Experience
Licenses / Certifications:
Recommended WorkKeys®:
Applied Math: 5
Graphic Literacy: 5

Workplace Documentation: 7