The Global Data Insights and Analytics (GDI&A) department at Ford Motor Company is looking for qualified people who can develop scalable solutions to complex real-world problems using machine learning, big data, statistics, econometrics, and optimization. The goal of GDI&A is to drive evidence-based decision making by providing insights from data. Applications for GDI&A include, but are not limited to, connected vehicles, smart mobility, advanced operations, manufacturing, supply chain, logistics, and warranty analytics.
*Qualifications*:
Job qualifications:
- Master's or Ph.D. degree in Computer Science, Operations Research, Statistics, Applied Mathematics, or another engineering discipline preferred
- 3+ years of experience with software engineering best practices in a team of 3+ engineers
- 1+ years of experience working within public cloud ecosystems (AWS, GCP, or Azure)
Technical skills:
- Strong programming skills in Python
- Demonstrated expertise in designing and architecting cloud-based data pipelines and microservices
- Experience working with databases, including data modeling and querying relational databases (PostgreSQL, MySQL), NoSQL databases, and columnar databases such as BigQuery
- Experience with feature engineering, hyperparameter tuning, model evaluation, etc.
- Experience using pandas, NumPy, scikit-learn, PyTorch, TensorFlow, and Hugging Face
- Understanding of core concepts of modern NLP or computer vision modeling techniques
- Familiarity with best practices in deep learning
- Strong technical writing and spoken English skills
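To illustrate the hyperparameter tuning and model evaluation skills listed above, here is a minimal grid-search sketch in plain Python; the `evaluate` function and its parameters are hypothetical stand-ins for training a model and scoring it on held-out data:

```python
from itertools import product

def evaluate(learning_rate, depth):
    # Hypothetical validation score; in practice this would train a
    # model and compute a held-out metric (accuracy, F1, RMSE, ...).
    return 1.0 - abs(learning_rate - 0.1) - abs(depth - 4) * 0.05

def grid_search(grid):
    """Score every hyperparameter combination exhaustively; return the best."""
    best_score, best_params = float("-inf"), None
    for lr, d in product(grid["learning_rate"], grid["depth"]):
        score = evaluate(lr, d)
        if score > best_score:
            best_score, best_params = score, {"learning_rate": lr, "depth": d}
    return best_params, best_score

params, score = grid_search(
    {"learning_rate": [0.01, 0.1, 0.5], "depth": [2, 4, 8]}
)
```

In practice one would reach for library tooling (e.g. scikit-learn's `GridSearchCV` with cross-validation) rather than a hand-rolled loop, but the structure is the same: enumerate candidate settings, evaluate each on validation data, keep the best.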
*Preferred qualifications*:
- Experience with Git and GitHub best practices
- Test-driven and behavior-driven development
- OS: Linux shell scripting
- Experience with Google Cloud Platform
- Experience with front-end and back-end technologies (React, Angular, Node.js) for developing integrated data access and visualization layers
- High proficiency with infrastructure-as-code (IaC) tools, specifically Terraform
- Strong knowledge of CI/CD pipelines and automation frameworks to streamline development workflows
*Key roles and responsibilities of position*:
- Analyze source data and data flows, working with structured and unstructured data (text, audio, images, video, etc.)
- Manipulate high-volume, high-dimensionality data from varying sources to expose and highlight patterns, anomalies, relationships, and trends
- Analyze and visualize diverse sources of data, interpret results in a business context, and report results clearly and concisely
- Fulfill problem formulation and ML technique consulting requests in a timely manner
- Work collaboratively with different business partners and present results in a clear and concise manner