Carrix / SSA Marine is committed to making our employees feel welcome and respected. Our team is unique, and our approach successful, because we have fostered an environment that values varying backgrounds, perspectives, and experiences, and we take pride in how the collective delivers value to our customers and partners. A truly diverse workforce is the outcome of treating people right. Each team member is responsible for creating, maintaining, and enhancing our work culture through collaboration and empathy, because culture doesn't happen without conscious effort. At Carrix / SSA Marine, we expect all employees to treat one another with respect and kindness, no exceptions or excuses.
Summary/Objective:
As a Data Integration Engineer at Carrix / SSA Marine, you are responsible for designing, building, and operating reliable, scalable, and secure data integration pipelines that move and synchronize data across enterprise systems, cloud platforms, and external partners. You focus on data ingestion, transformation, orchestration, and operationalization, ensuring data is delivered accurately, on time, and in compliance with governance and security standards. Working closely with application teams, platform engineers, and data consumers, you enable trusted system-to-system and API-driven integrations and near-real-time data flows that support critical business operations and analytics initiatives.
Essential Responsibilities:

Data Integration Pipelines
· Design, build, and maintain enterprise data integration pipelines using approved ETL/ELT and integration platforms (primarily Boomi and/or Informatica).
· Implement batch, micro-batch, and near-real-time ingestion patterns, including change data capture (CDC) and event-driven integrations.
· Develop and support API-driven integrations, including RESTful services for system-to-system and SaaS application connectivity (a minimal sketch follows this list).
· Perform data extraction, transformation, validation, and loading between internal systems, cloud platforms, and third-party applications.
· Develop and maintain reusable integration components, mappings, and templates to improve consistency and delivery speed.
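For illustration only, here is a minimal Python sketch of the kind of API-driven extract flow described above, using an OAuth 2.0 client-credentials grant; the token URL, API endpoint, and response shape are hypothetical placeholders for this example, not actual Carrix / SSA Marine systems:

    import requests

    TOKEN_URL = "https://auth.example.com/oauth2/token"        # hypothetical identity provider
    API_URL = "https://partner.example.com/api/v1/shipments"   # hypothetical source API

    def get_token(client_id: str, client_secret: str) -> str:
        # OAuth 2.0 client-credentials grant: exchange app credentials for a bearer token.
        resp = requests.post(TOKEN_URL, data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
        }, timeout=30)
        resp.raise_for_status()
        return resp.json()["access_token"]

    def extract_shipments(token: str) -> list[dict]:
        # Page through the REST API until the source reports no more results.
        records, page = [], 1
        while True:
            resp = requests.get(API_URL, params={"page": page},
                                headers={"Authorization": f"Bearer {token}"}, timeout=30)
            resp.raise_for_status()
            batch = resp.json().get("items", [])   # assumed paging contract
            if not batch:
                return records
            records.extend(batch)
            page += 1

In practice, the extracted records would then be validated and loaded into the target system through the approved integration platform rather than hand-rolled scripts.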
Operational Excellence & Reliability
· Monitor, troubleshoot, and resolve production integration failures, performance issues, and data quality problems.
· Implement data reliability, validation, reconciliation, and error-handling patterns across integration pipelines and API integrations (see the sketch after this list).
· Support CI/CD deployment pipelines and promote DevOps best practices for integration development and release management.
· Create and maintain runbooks, operational documentation, and integration standards, including API usage and support guidelines.
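As a sketch of the error-handling and reconciliation patterns named above, here is a minimal Python example combining retry-with-backoff around a pipeline step with a post-load row-count check; the function names and tolerance are assumptions for illustration, not a prescribed standard:

    import time
    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("integration")

    def with_retries(step, attempts: int = 3, base_delay: float = 2.0):
        # Retry a transient-failure-prone step with exponential backoff;
        # re-raise on final failure so the orchestrator can alert and halt.
        for attempt in range(1, attempts + 1):
            try:
                return step()
            except Exception as exc:
                if attempt == attempts:
                    log.error("step failed after %d attempts: %s", attempts, exc)
                    raise
                delay = base_delay * 2 ** (attempt - 1)
                log.warning("attempt %d failed (%s); retrying in %.0fs", attempt, exc, delay)
                time.sleep(delay)

    def reconcile(source_count: int, target_count: int, tolerance: int = 0):
        # Simple post-load reconciliation: compare source and target row counts
        # and fail loudly if they diverge beyond the allowed tolerance.
        if abs(source_count - target_count) > tolerance:
            raise ValueError(f"reconciliation failed: source={source_count}, target={target_count}")
        log.info("reconciliation passed: %d rows", target_count)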
Requirements:

Key Knowledge, Skills & Abilities:
· Strong experience with data integration and ETL/ELT platforms (Boomi, Informatica, Azure Data Factory, AWS Glue, dbt).
· Hands-on experience designing and consuming RESTful APIs, including authentication methods (OAuth, API keys, tokens).
· Experience integrating with cloud-based SaaS platforms using APIs and connectors.
· Advanced SQL and relational database expertise, including Oracle and cloud data warehouses such as Snowflake or Azure.
· Experience with CDC, streaming, and real-time integration technologies (e.g., Kafka, Spark, Flink); a minimal consumer sketch follows this list.
· Strong scripting and programming skills (e.g., Python, Java, Scala).
· Ability to design resilient, scalable integration architectures across hybrid and cloud environments.
· Proven ability to communicate technical integration concepts clearly to both technical and non-technical stakeholders.
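For illustration, a minimal Python sketch of consuming CDC-style change events from Kafka using the kafka-python client; the topic name, broker address, and event envelope are assumptions for the example, not a description of the actual platform:

    import json
    from kafka import KafkaConsumer  # pip install kafka-python

    # Hypothetical CDC topic carrying row-level change events as JSON.
    consumer = KafkaConsumer(
        "erp.orders.cdc",                      # assumed topic name
        bootstrap_servers="localhost:9092",    # assumed broker
        group_id="orders-integration",
        auto_offset_reset="earliest",
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # A common CDC envelope: operation type plus before/after row images.
        op = event.get("op")            # e.g., "c" (create), "u" (update), "d" (delete)
        row = event.get("after") or event.get("before")
        print(f"{op}: {row}")           # a real pipeline would upsert/delete in the target here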
Qualifications:
· Bachelor's degree in computer science, information systems, data engineering, or a related field, or equivalent practical experience.
· 2 years of hands-on experience in data integration, ETL/ELT development, or systems integration.
· Experience working with large, heterogeneous datasets across multiple source systems.
· Experience working in agile delivery environments using tools such as Azure DevOps or Jira.
· Experience in transportation, logistics, or industrial operations environments is a plus.