Honeywell Senior Advanced Data Engineer in Tempe, Arizona
The future is what you make it!
When you join Honeywell, you become a member of our global team of thinkers, innovators, dreamers and doers who make the things that make the future.
That means changing the way we fly, fueling jets in an eco-friendly way, keeping buildings safe and even making it possible to breathe on Mars.
Working at Honeywell isn’t just about developing cool things. It’s also about growing your career: our employees enjoy access to dynamic career opportunities across different fields and industries.
Are you ready to help us make the future?
Join a company that is transforming from a traditional industrial company to a contemporary digital industrial business, harnessing the power of cloud, big data, analytics, Internet of Things, AI/ML (Machine Learning), Automation and design thinking. You will support change that brings value to our customers, partners, and shareholders through the creation of innovative software and data-driven products and services. You will help engineer contemporary applications and services, constructing solutions that remain scalable, adaptable, and replicable. You will be part of transforming Honeywell's IT organization through the delivery of technology products that will directly impact the company's growth.
You will be a key part of the Process Intelligence and Advanced Analytics team. In this role, you will work on the engineering, design, and implementation of advanced analytics solutions using best-in-class data engineering practices, data science, and AI-driven technologies supporting all Honeywell strategic businesses and functions. The role provides a unique opportunity to work directly with business stakeholders at different levels, data engineers, business analysts, business value architects, and technology delivery teams to solution and design key data and process analytics solutions. The role requires strong technology leadership and a service-owner attitude, attention to detail, and proven experience excelling in a fast-moving environment.
Design, architect, and work closely with business teams and analysts to build and maintain scalable, automated data pipelines, infrastructure, and API integrations that support the integration of business data from various sources as well as continuing increases in data volume and complexity
Become the expert and key SME for the overall design and architecture of data pipelines, transformations, and key data models for all internal and external data sources
Collaborate with other analytics and business teams to improve and consolidate the data transformations and models that feed the process mining and execution platform, increasing data accessibility and fostering data-driven decision making and automation
Implement processes and measures to monitor the data quality and ensure the availability/usability of production data that the AI-based solutions rely on
Work in a startup-type environment to help business and delivery teams design and build innovative applications using Celonis EMS, AI/machine learning, and potentially business process orchestration solutions
Process, cleanse, and verify the integrity of structured and unstructured business data used for analysis
Propose, develop, and deploy advanced machine learning, reinforcement learning, and deep learning models/algorithms to enhance existing business processes with predictive and intelligent decision making
Analyze large amounts of information to discover underlying trends and patterns relevant to solving business problems. Select features and apply appropriate data transformations to optimize datasets for building models
Evaluate, visualize, and communicate AI/ML model results and their correlation with business KPIs
Build start-up capabilities using action engines and predictive, AI-driven solutions on top of existing analytics and orchestration solutions
Research and actively experiment to stay abreast of emerging data engineering standards, data and business process management, low-code trends, and ML-driven advanced analytics
Work within project planning constraints, communicating any identified project risks and issues to the delivery/project manager and provide inputs to the change control process
Help develop standard operating procedures and support the operations teams through UAT and after go-live
Work closely with the operations team to continuously monitor for data drift and significant changes, optimize existing pipelines and processes, and help build an effective data pipeline monitoring, alerting, and notification framework to proactively identify problems before they impact the business
YOU MUST HAVE
Bachelor’s degree in Computer Science, Information Technology, Finance, or an Engineering field
3+ years of IT experience in data engineering, software development, or solution architecture for large corporations/organizations
2+ years of experience building and deploying data pipelines, ETL, and enterprise-scale architectures for machine learning solutions
Hands-on experience with Python, R, Scala, Java, SQL, and various data warehousing/streaming/engineering tools such as Hadoop, Hive, Apache Spark, NiFi, Airflow, Paxata, Kafka, etc.
Solid experience in building IT use cases / solutions especially using Cloud infrastructure and services such as AWS and/or Azure cloud platforms
Solid understanding and proven experience in building and deploying Machine Learning solutions using various supervised/unsupervised ML algorithms such as Linear/Logistic Regression, Support Vector Machines, (Deep) Neural Networks, Random Forest, etc
Excellent understanding of Machine Learning techniques and proficiency in feature analysis, algorithm selection and model hyperparameter tuning
WE VALUE
Master’s degree in Computer Science, Data Science, or an Engineering field
Work experience and/or education in data science, data engineering, and analytics
Working experience and/or certification with Celonis or another process/data mining platform
Good understanding of Machine Learning techniques and algorithms.
Proven experience with Azure Databricks, Informatica, Redshift.
1+ years of experience in web service/RESTful API integration and/or full-stack application development
Experience in ERP platform integration, preferably with SAP
Working experience in an Agile/Scrum/Scaled Agile and DevOps-based team environment
Certifications in AI/ML and cloud platforms
Great communication skills
Honeywell is an equal opportunity employer. Qualified applicants will be considered without regard to age, race, creed, color, national origin, ancestry, marital status, affectional or sexual orientation, gender identity or expression, disability, nationality, sex, religion, or veteran status.