About Arcticom, LLC
Arcticom, LLC offers a broad range of information technology solutions, including network and systems administration, enterprise architecture and resource planning, certification and accreditation, software design and programming, maintenance of telecommunications and land mobile radio equipment and systems, help desk support, and IT transformation services.
Arcticom's performance is routinely recognized with exceptional ratings and commendations tied to installation successes. Satisfied Bering Straits Native Corporation (BSNC) family customers include the U.S. Air Force, Army, Navy, Coast Guard, the Departments of State, Justice, Commerce, Agriculture, Interior, and Homeland Security, the General Services Administration, the Defense Logistics Agency, and the U.S. Census Bureau.
About this position: DMS MRO Data Engineer
Location: Dayton, OH
The Essential Duties and Responsibilities are intended to present a descriptive list of the range of duties performed for this position and are not intended to reflect all duties performed within the job. Other duties may be assigned. To perform this job successfully, an individual must be able to satisfactorily perform each essential duty. The requirements listed below are representative of the knowledge, skill and/or ability required. Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions of the position.
Wage/Salary Range: $125k - 150k
Applicants will be notified via phone or email within ten (10) business days of submittal.
Essential Duties & Responsibilities
* Build and optimize ETL data pipelines using Databricks and Apache Spark
* Develop scalable data models leveraging Spark DataFrames and Spark SQL for efficient querying, aggregation, and integration across numerous diverse data sources
* Utilize Python and Regular Expressions for advanced text parsing, pattern matching, and data cleansing to prepare structured datasets for downstream ingestion
* Use APIs for seamless data exchange and automation across systems
* Manage SharePoint data (lists, libraries, files, metadata) programmatically using Python and Microsoft Graph API
* Leverage Microsoft 365 tools for documentation and collaboration
* Develop and maintain Amazon Web Services (AWS) resources, including S3
* Fully utilize the Microsoft 365 environment for cross-functional data sharing and productivity
* Use Jira to track and manage issues, tasks, and project workflows
* Collaborate with cross-functional teams and technical stakeholders to define data transformations from legacy systems to modern schemas
* Leverage Subject Matter Expert (SME) collaboration to interpret business rules and build business logic into data extracts and transformation processes
* Build and maintain strong working relationships with SMEs, helping to identify requirements for data transformations based on iterative product feedback
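As a rough illustration of the text-parsing and cleansing duty above (Python plus regular expressions), a minimal sketch follows. The field names, patterns, and sample lines are hypothetical examples, not drawn from the actual MRO data:

```python
import re

# Hypothetical raw lines; real source formats would differ.
RAW_RECORDS = [
    "PN: ABC-123  qty=04  desc= Hydraulic Pump ",
    "PN: xyz-9    qty=12  desc=Seal Kit",
]

# Named groups make the extracted fields self-documenting.
PART_RE = re.compile(
    r"PN:\s*(?P<part>[A-Za-z]+-\d+)\s+"
    r"qty=(?P<qty>\d+)\s+"
    r"desc=\s*(?P<desc>.+?)\s*$"
)

def cleanse(line: str) -> dict:
    """Parse one raw line into a normalized, structured record."""
    m = PART_RE.match(line)
    if m is None:
        raise ValueError(f"unparseable line: {line!r}")
    return {
        "part_number": m.group("part").upper(),  # normalize case
        "quantity": int(m.group("qty")),         # drop leading zeros
        "description": m.group("desc").strip(),  # trim whitespace
    }

records = [cleanse(line) for line in RAW_RECORDS]
```

Structured records like these could then be loaded into Spark DataFrames for downstream aggregation and ingestion.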
Required (Minimum Necessary) Qualifications
Education Requirements: Bachelor's degree (U.S. accredited, required)
Level of Experience Requirements:
* Python: 4 years (Preferred)
* SQL: 4 years (Preferred)
* SharePoint Development: 3 years (Preferred)
* Spark: 3 years (Preferred)
* AWS: 3 years (Preferred)
* S3: 3 years (Preferred)
* AI: 2 years (Preferred)
* Databricks: 2 years (Required)
License/Certification
* U.S. federal government security clearance (Preferred)
Apply here: https://www.aplitrak.com/?adid=YmJnZW5lcmljLjAzNzEzLjEwNTA4QGJlcmluZ3N0cmFpdHNjb21wLmFwbGl0cmFrLmNvbQ