Who is Laudex?
We are a leading Fintech in the educational credit industry in Mexico. Since 2009, we have financed over 20,000 students, enabling them to attend top-tier universities, and we manage over $1.5 billion MXN in credit lines. We are now scaling our data ecosystem to power the next generation of AI-driven financial products.
Role Overview
We are seeking a Data Cloud Engineer to architect, build, and optimize the scalable data platforms on Google Cloud Platform (GCP) that power our advanced analytics, credit-risk modeling, and fintech solutions.
In this role, you won't just move data; you will build the backbone for Data Science and Business Intelligence. You will work under the mentorship of senior leaders with extensive international experience in AI and Credit Risk, ensuring that our data is reliable, well-governed, and ready for innovation.
Detailed Key Responsibilities
1. Advanced Cloud Data Architecture
* End-to-End Pipelines: Design, build, and maintain automated, scalable, and resilient data pipelines using GCP-native tools.
* Orchestration & Workflow: Develop robust ingestion and transformation workflows (ETL/ELT) that support real-time and batch processing for ML and BI applications; a minimal pipeline sketch follows this list.
* Infrastructure as Code: Define and provision the platform's infrastructure as code, ensuring high levels of performance, reliability, and scalability across the entire data platform.
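To give a flavor of the pipeline work above, here is a minimal sketch of a daily batch ELT DAG on Cloud Composer (Airflow). The project, bucket, dataset, and table names are hypothetical placeholders, not our actual schema.

```python
# Minimal sketch of a daily ELT DAG on Cloud Composer (Airflow).
# All project, bucket, dataset, and table names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_payments_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",  # once a day at 06:00
    catchup=False,
) as dag:
    # 1. Land raw CSV exports from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_payments",
        bucket="example-raw-exports",
        source_objects=["payments/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.payments_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # 2. Transform the staging data into a curated, analytics-ready table.
    transform = BigQueryInsertJobOperator(
        task_id="transform_payments",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `example-project.curated.payments`
                    SELECT payment_id, loan_id, amount_mxn, paid_at
                    FROM `example-project.staging.payments_raw`
                    WHERE paid_at IS NOT NULL
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

In practice, a DAG like this would also carry retries, alerting, and data quality gates.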
2. Data Platform & Analytics Enablement
* Semantic Layer Design: Architect and maintain optimized data models and semantic layers that serve as the "single source of truth" for Power BI dashboards and enterprise reporting.
* DS/AI Support: Provision high-quality, feature-ready datasets to empower Data Science teams in building predictive credit-risk models.
* BigQuery Management: Manage warehouse performance, partitioning, and clustering strategies to ensure lightning-fast query results (see the sketch after this list).
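As one illustration of the partitioning and clustering strategies mentioned above, the sketch below defines a date-partitioned, clustered table with the google-cloud-bigquery Python client. The schema and every name in it are assumptions made for the example.

```python
# Illustrative only: a date-partitioned, clustered BigQuery table defined
# with the official Python client. All names and fields are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

schema = [
    bigquery.SchemaField("loan_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("university_id", "STRING"),
    bigquery.SchemaField("balance_mxn", "NUMERIC"),
    bigquery.SchemaField("snapshot_date", "DATE", mode="REQUIRED"),
]

table = bigquery.Table("example-project.curated.loan_balances", schema=schema)

# Partition by snapshot date so queries scan only the days they need...
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="snapshot_date",
)
# ...and cluster by the columns most often used in filters and joins.
table.clustering_fields = ["university_id", "loan_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created or found {table.full_table_id}")
```

Partition pruning plus clustering keeps scans, and therefore cost, proportional to the data a query actually needs.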
3. Data Integration, Security & Governance
* Source Integration: Connect and synchronize multiple internal (transactional DBs) and external (financial APIs, credit bureaus) data sources.
* Quality Frameworks: Implement automated data quality monitoring, validation rules, and observability to ensure trusted data assets (a small illustration follows this list).
* Security & Compliance: Apply GCP security best practices, including fine-grained IAM roles, data encryption at rest/transit, and compliance with financial data regulations.
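As a small taste of the quality frameworks referenced above, a validation step might assert basic invariants on a curated table before dashboards or models consume it. The table and checks below are hypothetical.

```python
# Hypothetical validation step: fail fast if a curated table violates
# basic invariants, before BI dashboards or models read from it.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

CHECKS = {
    "no_null_loan_ids": """
        SELECT COUNT(*) AS bad_rows
        FROM `example-project.curated.loan_balances`
        WHERE loan_id IS NULL
    """,
    "no_negative_balances": """
        SELECT COUNT(*) AS bad_rows
        FROM `example-project.curated.loan_balances`
        WHERE balance_mxn < 0
    """,
}

failures = []
for name, sql in CHECKS.items():
    bad_rows = next(iter(client.query(sql).result())).bad_rows
    if bad_rows > 0:
        failures.append(f"{name}: {bad_rows} offending rows")

if failures:
    # A hard failure keeps bad data out of reports and risk models.
    raise ValueError("Data quality checks failed: " + "; ".join(failures))
print("All data quality checks passed.")
```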
4. Optimization & Continuous Improvement
* Cost Management: Monitor and optimize cloud resource utilization to balance high performance with cost-efficiency within the GCP ecosystem; one common tactic is sketched after this list.
* Technical Leadership: Collaborate with cross-functional teams to translate complex business requirements into elegant technical solutions, providing guidance on engineering best practices.
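On the cost-management side, one common tactic is to review BigQuery's INFORMATION_SCHEMA job metadata for the most expensive recent queries. A minimal sketch, assuming the region-us location and on-demand pricing:

```python
# Sketch: surface the costliest BigQuery queries of the past week from
# INFORMATION_SCHEMA job metadata. Region and pricing are assumptions.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

SQL = """
    SELECT user_email, total_bytes_billed / POW(1024, 4) AS tib_billed
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
    WHERE job_type = 'QUERY'
      AND total_bytes_billed > 0
      AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    ORDER BY total_bytes_billed DESC
    LIMIT 10
"""

for row in client.query(SQL).result():
    # Roughly $6.25 per TiB scanned under on-demand pricing (verify current rates).
    print(f"{row.user_email}: {row.tib_billed:.2f} TiB billed")
```

Findings like these typically feed back into partitioning, clustering, and materialization decisions.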
Detailed Qualifications
* Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related quantitative field.
* Experience: 3–5+ years of hands-on experience in Data Engineering, specifically within cloud environments.
* GCP Expertise: Strong proficiency in the Google Cloud ecosystem (BigQuery, Cloud Storage, Dataflow, Cloud Composer/Airflow, etc.).
* Technical Stack:
    * Advanced SQL: Deep knowledge of query optimization, window functions, and performance tuning.
    * Python: Expert-level skills for data processing, scripting, and automation.
* Language: Professional working proficiency in English (required for international collaboration).
* Soft Skills: Resilience in fast-changing environments, strong problem-solving skills, and the ability to work independently in a remote-first setting.
* Preferred: GCP Professional Certifications (Data Engineer/Architect) and experience with large-scale financial or Fintech datasets.
What’s In It For You?
* Mentorship: Direct access to experts in AI and Risk Modeling with global experience.
* Innovation: Work on a modern cloud-based data ecosystem using cutting-edge technologies.
* Flexibility: 100% Remote-first culture with a focus on technical excellence over "seat time."
* Impact: Contribute directly to financial inclusion and the educational development of thousands.