I support companies not only in understanding AI, but in using it productively. My focus is on implementing concrete, business-oriented solutions with modern LLM and agent technologies. The following example projects show typical use cases in which I delivered efficiency gains, automation and new value creation:
Automated email classification and workflow triggers
Development of a system for intelligent detection and handling of incoming emails (e.g. applications, invoices, support cases). Tech: OpenAI GPT-4, LangChain, FastAPI, Supabase, webhooks
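To make this concrete, the sketch below shows one way an incoming email could be classified by an LLM behind a FastAPI webhook. The endpoint path, category labels and model name are illustrative assumptions, not the production configuration.

```python
# Minimal sketch (illustrative only): classify an incoming email via an LLM
# behind a FastAPI webhook. Categories, model name and endpoint path are
# assumptions for demonstration, not the production setup.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # expects OPENAI_API_KEY in the environment

CATEGORIES = ["application", "invoice", "support", "other"]  # hypothetical labels

class InboundEmail(BaseModel):
    subject: str
    body: str

@app.post("/webhook/email")
def classify_email(email: InboundEmail) -> dict:
    # Ask the model to map the email to exactly one known category.
    prompt = (
        f"Classify the following email into one of {CATEGORIES}. "
        f"Answer with the category name only.\n\n"
        f"Subject: {email.subject}\n\nBody: {email.body}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    label = response.choices[0].message.content.strip().lower()
    # Fall back to 'other' if the model returns an unexpected label.
    return {"category": label if label in CATEGORIES else "other"}
```

In practice the returned category would then trigger the downstream workflow (e.g. a Supabase insert or a webhook call to the relevant system).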
AI-powered contract assistant for a LegalTech startup
Implementation of an LLM-based assistant for automatic analysis, categorization and extraction of relevant data from legal documents. Tech: Llama 3, Pydantic, LangGraph, Pinecone (RAG), Docker
Internal knowledge search with RAG architecture
Building a retrieval-augmented generation solution for a mid-sized company to make internal documentation, FAQs and manuals accessible via an AI interface. Tech: vLLM, FAISS, LangChain, HuggingFace, OpenSearch
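A minimal sketch of the retrieval side of such a RAG setup is shown below, assuming sentence-transformers for embeddings and an in-memory FAISS index; the documents, model name and prompt format are placeholders, not the actual deployment.

```python
# Minimal retrieval sketch (illustrative assumptions: model name, documents,
# prompt format). Embeds internal documents, indexes them in FAISS and
# retrieves the most relevant passages to ground an LLM answer (RAG).
import numpy as np
import faiss
from sentence_transformers import SentenceTransformer

documents = [
    "VPN access is requested via the IT self-service portal.",
    "Travel expenses must be submitted within 30 days.",
    "The support hotline is reachable Mon-Fri from 8 to 18.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = encoder.encode(documents, normalize_embeddings=True)

# Inner product equals cosine similarity on normalized vectors.
index = faiss.IndexFlatIP(int(doc_vectors.shape[1]))
index.add(np.asarray(doc_vectors, dtype="float32"))

def retrieve(question: str, k: int = 2) -> list[str]:
    query = encoder.encode([question], normalize_embeddings=True)
    _, ids = index.search(np.asarray(query, dtype="float32"), k)
    return [documents[i] for i in ids[0]]

question = "How do I get VPN access?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# 'prompt' would then be sent to the serving layer (e.g. a vLLM endpoint).
print(prompt)
```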
Edge-capable anomaly detection in industry
Development of a locally deployable vision-language system for visual quality control on edge devices (e.g. Jetson, A100). Tech: Vision Transformer, LLaVA, Torch, Nvidia TensorRT, FastAPI
Agent system for automated report generation
Introduction of an agent framework that regularly aggregates data (e.g. from CRM, JIRA, calendar), prioritizes it and turns it into weekly management briefings. Tech: LangGraph, PandasAI, LlamaIndex, REST, Zapier API
LLM-powered matching system for a recruiting platform
Design of a system that matches applications with job profiles, calculates scores and automatically forwards suitable profiles to recruiters. Tech: OpenAI Embeddings, LangChain Agents, FastAPI, Postgres
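The sketch below illustrates the core matching idea: scoring applications against a job profile with OpenAI embeddings and cosine similarity. The model name, example texts and forwarding threshold are assumptions for demonstration purposes.

```python
# Minimal matching sketch (illustrative): score applications against a job
# profile with OpenAI embeddings and cosine similarity.
import numpy as np
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def embed(texts: list[str]) -> np.ndarray:
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in response.data])

job_profile = "Senior Python developer with FastAPI and Postgres experience."
applications = [
    "5 years of backend development with Python, FastAPI and PostgreSQL.",
    "Frontend specialist focused on React and TypeScript.",
]

job_vec = embed([job_profile])[0]
app_vecs = embed(applications)

# Cosine similarity as the matching score.
scores = app_vecs @ job_vec / (np.linalg.norm(app_vecs, axis=1) * np.linalg.norm(job_vec))

for text, score in sorted(zip(applications, scores), key=lambda t: -t[1]):
    decision = "forward to recruiter" if score > 0.5 else "hold"  # hypothetical threshold
    print(f"{score:.2f}  {decision}:  {text}")
```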
Sample projects from my full-time position, focusing on scalable AI solutions, automation and near-production data processing:
Generative AI platform with RAG & LLMs
Design and development of an enterprise-wide platform for generative AI applications – from retrieval-augmented generation to automated content creation. Tech stack: AWS, Python, Svelte, TypeScript. Result: central, reusable infrastructure for scalable AI workflows across the company.
Document automation with LLMs
Building intelligent pipelines for automated document evaluation and analysis using Azure, Python and LLMs. Result: significant efficiency gains in reporting processes through real-time analysis and higher accuracy.
Predictive analytics for machine data
Development of an analysis and early warning system for machine states using Azure, Microsoft Fabric, Python and SAP. Result: reduction of unexpected failures through data-driven maintenance and real-time insights into asset availability.
Delivery of data-driven solutions focusing on predictive analytics, cloud data platforms and business enablement in the Azure environment:
Predictive maintenance system
Development of a forecasting system for machine failures using Azure and Python. Building high-performance ETL workflows in Azure Synapse; result: 20% fewer unplanned outages and optimized maintenance planning through predictive models.
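As a simplified illustration of such a failure-forecasting model (not the actual system): scikit-learn, the feature names and the synthetic data below are assumptions made purely for demonstration.

```python
# Simplified predictive-maintenance sketch (illustrative assumptions:
# scikit-learn, synthetic sensor features and labels). Trains a classifier
# that estimates the probability of a machine failure from recent readings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)
# Hypothetical features: temperature, vibration, operating hours since service.
X = rng.normal(size=(2_000, 3)) * [5.0, 1.0, 100.0] + [70.0, 2.0, 400.0]
# Synthetic label: failures become more likely with heat and vibration.
risk = 0.02 * (X[:, 0] - 70) + 0.5 * (X[:, 1] - 2) + rng.normal(scale=0.5, size=2_000)
y = (risk > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Flag machines whose predicted failure probability exceeds a chosen threshold.
probabilities = model.predict_proba(X_test)[:, 1]
print(f"Machines flagged for maintenance: {(probabilities > 0.5).sum()} of {len(probabilities)}")
```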
Enterprise Analytics Platform (EAP)
Design and implementation of an enterprise-wide analytics platform on Azure, connecting various data sources. Automated pipelines and visualizations (Power BI, Tableau) enabled 30% faster operational decision-making.
Cloud data migration & reporting
Leading the migration of legacy data to the Azure cloud, including redesigning Power BI reports. Result: 40% faster data availability and better responsiveness in day-to-day operations.
Sentiment analysis for customer feedback
Development of an NLP-based analytics tool with Python and Azure Databricks to evaluate customer feedback. The insights were made available through Power BI; result: 25% increase in customer satisfaction through targeted actions.
Development and evaluation of data-driven architectures for the financial sector – focusing on risk assessment, simulation and forward-looking technologies:
Bank management platform
Development of a comprehensive platform for bank-wide data management and risk simulation using SQL (DB2), Java, React, Python, MATLAB and Power BI. Centralized data storage enabled dynamic analysis of metrics like VaR and CVaR. Result: real-time risk analytics, intuitive dashboards and significantly faster decision-making in finance.
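For context on the risk metrics named above, the snippet below computes historical-simulation Value at Risk (VaR) and Conditional VaR (CVaR) from a series of portfolio returns; the confidence level and the random sample data are illustrative, not taken from the platform.

```python
# Illustrative sketch: historical-simulation VaR and CVaR from a return series.
# The confidence level and the synthetic returns are assumptions.
import numpy as np

rng = np.random.default_rng(seed=42)
returns = rng.normal(loc=0.0005, scale=0.01, size=1_000)  # daily returns

confidence = 0.99
# VaR: loss threshold exceeded in only (1 - confidence) of historical scenarios.
var = -np.quantile(returns, 1 - confidence)
# CVaR: average loss in the scenarios beyond the VaR threshold.
cvar = -returns[returns <= -var].mean()

print(f"99% VaR:  {var:.4f}")
print(f"99% CVaR: {cvar:.4f}")
```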
Quantum computing in finance
Analysis of potential use cases for quantum computing in banking. Implementation of initial proofs of concept with Python and Qiskit to evaluate complex optimization and simulation problems. Result: technological foundation for long-term innovation strategies in finance.
Ad-hoc analysis for business units
Building flexible analysis tools with Python, Dash, SQL (DB2) and Power BI to quickly support operational decision-making. Result: 40% shorter response times for analysis requests and increased data availability for business teams.
Development of data-driven systems to meet regulatory requirements, market analysis and risk assessment in the financial sector:
Data warehouse customization for compliance
Modernization of the bank's internal data warehouse using Excel, VBA, SQL (DB2) and Java to meet new regulatory requirements. Optimization of ETL processes and reporting. Result: full compliance, consistent data flows and higher reporting quality across departments.
AI-based sentiment analysis
Development of a tool for real-time analysis of market sentiment from financial news using Python, TensorFlow and Dash. Result: faster reactions to market changes, improved risk assessment and automated action recommendations for investment controlling.
Stress test platform for risk simulations
Building a comprehensive platform for running stress scenarios and data quality analysis with Python, SQL, Power BI, Java and React. Result: 30% higher responsiveness in crises, improved forecasting accuracy and solid decision support in risk management.
I help companies put artificial intelligence (AI) to productive, business-focused use – concentrating on automation, intelligent workflows, data-driven decisions and scalable system architectures.
As a PhD-level AI expert with over ten years of experience, I design and develop solutions that simplify, speed up and strengthen real business processes. From intelligent analysis of unstructured data (e.g. emails, documents, text sources) to agent-based assistance systems and predictive analytics – I combine modern AI technologies with concrete business impact.
Key areas of expertise:
Integration of AI systems into existing IT landscapes (APIs, webhooks, Azure, AWS, on-premises)
Automation of document-based workflows (e.g. invoice processing, applications, email classification)
Development of intelligent agent systems & workflows (e.g. with LangChain, LangGraph)
Implementation of predictive analytics solutions for operational efficiency (e.g. maintenance, monitoring)
Building scalable AI prototypes up to production-ready applications
Technologies & tools: Python, FastAPI, OpenAI, Llama 3, TensorFlow, Azure, AWS, Supabase, Postgres, Docker, Redis, LangChain, PandasAI, SQL, Power BI, and more.
I work pragmatically, stay open on technology choices and collaborate closely with product, IT and business teams – with a clear goal: making AI not an experiment but a real productivity factor that is measurable, maintainable and future-proof.