Goal: Build the cloud infrastructure for a data extraction, processing, and machine learning training and inference backend.
Goal: Run and improve the MLOps services, and support AI scientists and application developers working on AI systems.
I increased the test coverage of the AI model inference pipeline from 60% to 85%, and I made it more efficient by adding a model-results cache backed by DynamoDB. I migrated the inference pipeline to a Step Function to improve the observability of the system.
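As an illustration, a minimal sketch of a DynamoDB-backed model-results cache of the kind mentioned above; the table name, key schema, and hashing scheme are assumptions for the example, not the production design:

```python
import hashlib
import json

import boto3

# Hypothetical table with partition key "input_hash" (string).
table = boto3.resource("dynamodb").Table("model-results-cache")

def cached_predict(payload: dict, predict) -> dict:
    """Return a cached inference result, computing and storing it on a miss."""
    key = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
    cached = table.get_item(Key={"input_hash": key}).get("Item")
    if cached is not None:
        return json.loads(cached["result"])
    result = predict(payload)  # the actual model call
    table.put_item(Item={"input_hash": key, "result": json.dumps(result)})
    return result
```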
Goal: Build a real-time data pipeline based on PySpark to process UK Railways network data and create a digital twin. I implemented the transformation of the raw, incomplete UK Railways data into a coherent time series and connected it with the rest of the pipeline. I increased the test coverage of the whole pipeline from 30% to 80%, and I supported other team members with PySpark.
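A sketch of the flavor of PySpark transformation this involved, assuming a hypothetical raw schema and a simple forward-fill rule for missing values; the real transformation logic was more involved:

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("railways-timeseries").getOrCreate()

# Hypothetical raw schema: train_id, event_time (string), delay_minutes.
raw = spark.read.json("s3://example-bucket/raw-railways/")

# Order events per train, parse timestamps, and forward-fill missing
# delay values so the output forms a usable time series.
w = Window.partitionBy("train_id").orderBy("event_time")
timeseries = (
    raw.withColumn("event_time", F.to_timestamp("event_time"))
    .withColumn("delay_minutes", F.last("delay_minutes", ignorenulls=True).over(w))
    .dropDuplicates(["train_id", "event_time"])
)
```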
Goal: Build a web application to help small and mid-size companies automate their accounts receivable and payable.
I am in charge of the cloud infrastructure and the backend. I implemented a fully automated CI/CD pipeline that deploys the application to Cloudflare, AWS, and GCP using Pulumi. I wrote automated end-to-end UI tests based on Cucumber, TypeScript, and Playwright, increasing the team's confidence in releases.
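A minimal sketch of the Pulumi approach, written here in Python for consistency with the other examples; the resource is illustrative, and the real program also provisions Cloudflare and GCP resources:

```python
import pulumi
import pulumi_aws as aws

# Illustrative resource only; the real stack spans Cloudflare, AWS,
# and GCP from the same Pulumi program, driven by the CI/CD pipeline.
assets = aws.s3.Bucket("app-assets")

pulumi.export("assets_bucket", assets.id)
```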
Goal: Build an alerting tool for failures in the customer integrations of the Klarna payment service. The system sends alerts to Slack based on a statistical model running on Rockset.
I implemented the asynchronous backend of the system, which ran different queries against Rockset to detect anomalies and send the appropriate alerts. In less than 4 months we had a working system, and the account management teams were already happy with the alerts. Two alerts saved one of their large customers an estimated sum of more than 1 million EUR in a single day.
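A sketch of the asynchronous check-and-alert loop; the Rockset endpoint is region-specific and shown as a placeholder, and the SQL stands in for the real statistical checks:

```python
import asyncio
import os

import aiohttp

# Placeholder endpoint and query; not the production statistical model.
ROCKSET_URL = "https://api.usw2a1.rockset.com/v1/orgs/self/queries"
QUERY = {"sql": {"query": (
    "SELECT merchant_id, error_rate FROM integration_stats WHERE error_rate > 0.2"
)}}

async def check_and_alert() -> None:
    headers = {"Authorization": f"ApiKey {os.environ['ROCKSET_API_KEY']}"}
    async with aiohttp.ClientSession() as session:
        async with session.post(ROCKSET_URL, json=QUERY, headers=headers) as resp:
            rows = (await resp.json()).get("results", [])
        for row in rows:  # one Slack message per anomalous merchant
            text = (f"Integration anomaly for merchant {row['merchant_id']}: "
                    f"error rate {row['error_rate']:.0%}")
            await session.post(os.environ["SLACK_WEBHOOK_URL"], json={"text": text})

if __name__ == "__main__":
    asyncio.run(check_and_alert())
```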
Goal: Build serverless data pipelines in the cloud to aggregate data from different third-party data providers for business reporting. I migrated many of the manually deployed Lambdas to Terraform and added automated tests.
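An example of the style of automated test added around the migrated Lambdas, with a hypothetical handler inlined so the snippet is self-contained:

```python
from collections import defaultdict

def handler(event, context):
    """Hypothetical Lambda: aggregate per-provider revenue from a batch of records."""
    totals = defaultdict(float)
    for record in event["records"]:
        totals[record["provider"]] += record["revenue"]
    return dict(totals)

def test_handler_aggregates_provider_records():
    event = {"records": [
        {"provider": "a", "revenue": 10},
        {"provider": "a", "revenue": 5},
        {"provider": "b", "revenue": 7},
    ]}
    assert handler(event, None) == {"a": 15, "b": 7}
```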
Goal: Develop a global solution for CARIAD to ingest, manage, and serve points of interest, especially car chargers, using AWS, Airflow, Python, and microservices, and improve the code quality of the data extraction pipelines while adding new features. I improved the reliability and testability of the data pipelines by refactoring large chunks of the system and increasing the test coverage by more than 20%.
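A sketch of what a daily POI ingestion flow could look like in Airflow; the DAG id, task names, and schedule are invented for illustration:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_chargers(**_):
    """Fetch charger points of interest from a provider API (stubbed)."""

def publish_pois(**_):
    """Write validated POIs to the serving layer (stubbed)."""

with DAG(
    dag_id="poi_charger_ingestion",
    start_date=datetime(2023, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_chargers", python_callable=ingest_chargers)
    publish = PythonOperator(task_id="publish_pois", python_callable=publish_pois)
    ingest >> publish
```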
Project staffing, team management, personal development processes, and knowledge sharing. Pre-sales and writing cloud solution project proposals. On the technical side: implemented a REST API in Go, built an Elasticsearch data ETL for reporting and data analytics, and defined cloud infrastructure with the AWS CDK.
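For the infrastructure part, a minimal AWS CDK sketch in Python; the stack and resource names are hypothetical:

```python
from aws_cdk import App, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct

class AnalyticsStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Illustrative resource; the real stacks covered the API and ETL infrastructure.
        s3.Bucket(self, "ReportsBucket", versioned=True)

app = App()
AnalyticsStack(app, "AnalyticsStack")
app.synth()
```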
Technical lead on a project for Continental. I built a data ingestion system for factory production data in Python, based on AWS Step Functions, that feeds a data warehouse in AWS Redshift and S3. Infrastructure is defined with Terraform, and the Serverless Framework manages the Lambda functions, Step Functions, triggers, and alerts. Automated monitoring and alerting with AWS CloudWatch and SNS. The pipeline also handles XML files and XSLT processing.
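A small sketch of the XSLT step using lxml; the helper and its inputs are illustrative, with the real pipeline reading factory XML files from S3 inside the Step Functions workflow:

```python
from lxml import etree

def apply_stylesheet(xml_bytes: bytes, xslt_bytes: bytes) -> str:
    """Apply an XSLT stylesheet to a raw factory XML document (illustrative helper)."""
    transform = etree.XSLT(etree.fromstring(xslt_bytes))
    return str(transform(etree.fromstring(xml_bytes)))
```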
Internal product to automate image hardening for EC2 and ECS.
Python and Terraform for an IoT project: test-driven development of a RESTful API, IoT tooling for production, and cloud infrastructure on AWS.
Increased the test coverage of the product's core API to 90%. Reduced the deployment time of the solution from 5 hours of manual work to a fully automated 45 minutes.
BMW Log-Analyzer: software engineering and DevOps for IT infrastructure with Ansible, Python, Kafka, and the ELK stack for log data ingestion and analysis.
BMW Augmented Reality Tracking: C++ application that measures and estimates IMU data in real time and drives a UR robotic arm to simulate human head movement for a car driver HUD.