Designed and implemented a scalable data lake on Google Cloud Platform (GCP) for the Anti-Financial Crime (AFC) division, supporting Anti-Money Laundering (AML) and Counter-Terrorism Financing (CTF) initiatives. Led a small engineering team and built robust data pipelines for ingesting and processing high-volume financial data, including over 30 million daily transactions from retail and corporate banking. Delivered analytics and regulatory reporting solutions for multiple international locations, including Germany, Singapore, Sri Lanka, Hong Kong, Taiwan, and Thailand. Collaborated closely with global teams to ensure data quality, compliance, and performance.
Freelance Data Engineer responsible for building a centralized data warehouse platform for internal dataset provisioning. Modernized legacy mainframe workflows by transitioning them to a private cloud infrastructure.
Developed an automated, highly available near-real-time system for monitoring and evaluating deliveries across Europe.
Freelance data engineer for internal teams in the global financial, customs, and chemical sectors. Modeled data warehouse systems and developed analysis software along with its automated deployment. Built scalable ETL pipelines for massive data volumes, and delivered automated data analysis with continuous delivery driven by business requirements.
Freelance Backend Developer at a state-owned bank, working on a platform for smart contract analytics. Developed components for automated contract analysis and document generation. Integrated prototype machine learning models into production workflows.
Designed, managed, and scaled Hadoop clusters for large-scale data processing. Developed a self-service platform for automated deployment of bank-specific applications. Implemented CI/CD workflows supporting multi-stage deployments across environments and clusters, aligned with each project's lifecycle.
Completed two freelance projects focused on financial market analysis and investment strategy development. Collected and integrated data from third-party providers and crawled financial data from various websites. Built distributed data storage infrastructure using PostgreSQL, MongoDB, and Neo4j, connected to an Apache Spark processing cluster. Developed an end-to-end data pipeline and implemented a software environment for backtesting and portfolio rebalancing. Created a scalable and user-friendly web application for strategy selection and simulation. Additionally served as Scrum Master, managing agile workflows and facilitating team coordination.
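The backtesting and rebalancing environment mentioned above can be sketched in miniature. This is an illustrative assumption, not the original implementation: the function names (`rebalance`, `backtest`), the fixed-target-weight strategy, and the dict-based price representation are all hypothetical, and a real system would additionally model transaction costs and trading calendars.

```python
# Minimal sketch of periodic rebalancing toward fixed target weights.
# All names and the strategy itself are illustrative assumptions,
# not the production system described in the profile entry.

def rebalance(holdings: dict, prices: dict, target_weights: dict) -> dict:
    """Return share counts matching target_weights at current prices."""
    total_value = sum(holdings[a] * prices[a] for a in holdings)
    return {a: target_weights[a] * total_value / prices[a] for a in target_weights}

def backtest(price_history: list, target_weights: dict, start_cash: float) -> float:
    """Deploy cash at the first period, rebalance every period after,
    and return the final portfolio value."""
    first = price_history[0]
    # Initial allocation: split starting cash by target weight.
    holdings = {a: target_weights[a] * start_cash / first[a] for a in target_weights}
    for prices in price_history[1:]:
        holdings = rebalance(holdings, prices, target_weights)
    last = price_history[-1]
    return sum(holdings[a] * last[a] for a in holdings)
```

A production version of this loop would run over Spark DataFrames rather than plain dicts, but the rebalancing logic per period is the same shape.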