Responsible for planning and the roadmap for the Data Analytics Reporting Platform, cloud migration, and regular backlog enhancements;
plan resources, infrastructure requirements, and financial forecasts;
coordinate the Reporting Platform roadmap and release plan with internal and external clients;
facilitate the communication plan to users for migration activities and the migration of reports for performance testing;
apply SAFe Agile principles to facilitate seamless cross-functional collaboration;
responsible for requirements elicitation, user stories, daily scrums, sprint planning, sprint reviews, sprint retrospectives, and release planning with both product and development teams;
create project sprint reports, velocity reports, and release reports, and facilitate introspection meetings and project status and progress reports to the leadership teams;
support key stakeholders, vendors, procurement, and cross-functional teams by providing solutions to their issues and problems;
brainstorm with Business Product Management to identify innovations and problem statements, and help develop analytics workflows, predictive insights, user interaction technologies, and capabilities while co-creating and influencing product roadmaps leveraging AI/ML/data science;
utilize data analytics, cloud migration, data observability, Snowflake, AWS SageMaker, Power BI, Splunk, Docker, Kubernetes, DevOps, AI/NLP, and A/B testing to perform duties;
responsible for the Data Analytics Reporting Platform serving internal business units and external clients accessing reports on the SIFMU, risk delivery, and the clearing and settlement process of trades;
participate in prospective client meetings alongside the Sales & Marketing teams to support technical questions and provide information on the architecture, data flow, analytics, and visualizations;
drive the migration of Java WebSphere applications to a Tomcat-based microservices architecture, incorporating components of HIPAM, PING, and the CCW framework;
provide comprehensive consultation on optimizing data flow processes, including the design and enhancement of Snowflake tables and schemas;
implement strategic solutions to improve efficiency, scalability, and data integrity;
execute analytics models in the areas of NLP, Transformation, Deep Learning, Linear Regression, Logistic Regression, Random Forest, Support Vector Machine, Naive Bayes, and forecasting models;
drive industry best practices through data and metadata management, data quality, data governance, integrity, and data privacy;
and ensure GDPR, SOX, and HIPAA compliance with internal as well as external bodies.
Location: Streetsboro, OH, and various unanticipated worksites throughout the US;
Salary: $118,310 per year;
Education: Bachelor’s Degree in Computer Engineering, Computer Science, Electronic Engineering, Mechanical Engineering, or in a related field of study (will accept equivalent foreign degree);
Experience: Two (2) years in the position above, as a Technical Operations Manager, as a Technical Program & Project Manager, or in a related occupation;
Other Requirements: Experience must include one (1) year’s use of all of the following: data analytics, cloud migration, data observability, Snowflake, AWS SageMaker, Power BI, Splunk, Docker, Kubernetes, DevOps, AI/NLP, and A/B testing.
Will also accept any suitable combination of education, training, and/or experience.