Karl Estermann

Hiring Requirement

Zürich, Switzerland
Experience
Feb 2023 - Present
2 years 6 months
Zug, Switzerland

Hiring Requirement

AALS Software AG

When setting up a large-scale recruiting process, it’s important to focus on hiring experienced engineers and talent who can raise the company’s overall know-how or fill skill gaps. The competitive hiring environment makes it hard to find qualified candidates: only a few people may meet the criteria, and many companies compete for the same talent pool.

Treat the hiring pipeline like a sales funnel and continuously measure its “temperature” to keep hiring efficient and successful. Without constant monitoring, time gets wasted and inefficiencies creep into the recruitment team. Success in large-scale hiring depends on the team’s ability to fill key roles quickly and to spot and remove bottlenecks in the workflow. Experimenting with different job descriptions for the same role across various channels can reveal which approach brings more quality candidates into the pipeline, though this adds extra work for an already busy hiring team.

Setting firm deadlines for candidates to complete each phase of the hiring process and tracking hires per week keeps a steady pace, lets the team manage the process better, and sharpens forecasts of future hiring needs. To support a business plan and hire at scale, you also need a well-defined career development system against which candidates are assessed during hiring; this makes the evaluation of potential hires more objective and ensures you bring on the right people to meet the company’s needs. Ultimately, by continuously measuring and improving the speed and efficiency of the hiring pipeline, companies increase their chances of building a strong, talented team.

The main goal of the “Building a Data-Driven Hiring Machine” project is to establish a data-driven hiring process to recruit experienced engineers and talent, boost the company’s overall know-how, and fill skill gaps to ensure efficient and successful hiring efforts.

  • Introduce a data-driven hiring process for experienced engineers and talent.
  • Improve the company’s overall competency and fill skill gaps.
  • Continuously measure the hiring pipeline’s “temperature.”
  • Identify and remove bottlenecks in the hiring process to optimize workflow.
  • Experiment with different job descriptions for the same role across various channels.
  • Set deadlines for candidates to complete each hiring phase.
  • Track weekly hires to maintain a steady pace and clear forecasts for future hiring needs.
  • Implement a well-defined career development system to evaluate candidates during the hiring process.
  • Ensure balanced hiring efforts and maintain a consistent number of hires per week.
  • Continuously measure and improve the speed and efficiency of the hiring pipeline.

There’s a need for a founding engineer experienced in Ethereum and smart contracts for a startup. The company wants someone with solid computer science fundamentals and problem-solving skills, along with knowledge of Ethereum, smart contracts, Solidity, Golang, and other programming languages. But when a junior recruiter posts the role, they struggle to identify the specific requirements and find suitable candidates, which often requires an HR manager or senior developer to step in, refine the job post, and provide details so the junior recruiter can search the company’s resume database effectively.

To solve this, the company plans to introduce an AI-based hiring platform built on semantic search. Semantic search improves accuracy by understanding the searcher’s intent and the contextual meaning of terms in the searchable data space; here, the goal is to capture the job post’s intent and match it with relevant resumes from the company’s database.

Before any matching can happen, the company must capture the contextual meaning of all resumes and store it in a format suitable for queries. For this, the company plans to use the HuggingFace Transformers library, known for its NLP capabilities. The library generates embedding vectors—multi-dimensional float vectors representing the contextual meaning of sentences, words, or paragraphs. These vectors, which can have hundreds to thousands of dimensions, are what the job post is matched against when searching the resume database.
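
As a sketch of the matching step, the snippet below embeds a job post and a set of resumes and ranks them by cosine similarity. It assumes the sentence-transformers wrapper around HuggingFace Transformers; the model name and the example texts are illustrative placeholders, not the production setup.

```python
# Minimal semantic resume-matching sketch, assuming the
# sentence-transformers wrapper around HuggingFace Transformers.
# Model name and example texts are illustrative placeholders.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # 384-dimensional embeddings

job_post = "Founding engineer: Ethereum, smart contracts, Solidity, Golang"
resumes = [
    "Senior Solidity developer, 5 years of smart contract audits",
    "Frontend engineer focused on React and design systems",
    "Go backend engineer with Ethereum node operations experience",
]

# Encode the job post and all resumes into embedding vectors.
job_vec = model.encode(job_post, convert_to_tensor=True)
resume_vecs = model.encode(resumes, convert_to_tensor=True)

# Rank resumes by cosine similarity to the job post.
scores = util.cos_sim(job_vec, resume_vecs)[0]
for score, text in sorted(zip(scores.tolist(), resumes), reverse=True):
    print(f"{score:.3f}  {text}")
```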

Technologies:

  • Delphi and Microfocus Cobol
Feb 2023 - Feb 2024
1 year 1 month
Baar, Switzerland

incl. CI/CD, Automation

AALS Software AG

A hands-on course on real-time data processing with Flink and Hadoop, covering MapReduce, HDFS, Spark, Flink, Hive, HBase, MongoDB, Cassandra, and Kafka, combined with extensive engineering experience including DevOps (CI/CD, automation).
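
For a flavor of the course material, here is the classic MapReduce-style word count in PySpark, the kind of exercise such a course typically opens with; the HDFS input path is a placeholder.

```python
# Classic MapReduce-style word count in PySpark, a typical first
# exercise in a Hadoop/Spark course. The input path is a placeholder.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("WordCount").getOrCreate()

lines = spark.sparkContext.textFile("hdfs:///data/input.txt")
counts = (
    lines.flatMap(lambda line: line.split())   # map: line -> words
         .map(lambda word: (word, 1))          # map: word -> (word, 1)
         .reduceByKey(lambda a, b: a + b)      # reduce: sum counts per word
)
for word, count in counts.take(10):
    print(word, count)

spark.stop()
```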

  • Experience with Big Data technologies and Kafka (Confluent, Strimzi, Pinot).
  • Experience in insurance (agency).
  • Highly motivated and willing to work on-site (lives in CH).
  • Available at short notice (high on-site ratio possible if needed).
  • ETL/ELT pipeline with Apache tools and Pentaho.
  • Project lead for municipal software, financial services, Big Data Kafka.
  • Domain expertise in insurance, eCommerce, retail, banking, telecom, public administration, and local government.
  • Solution architecture in cloud data science, model development, AI, NLP, chatbots (RASA, Chatter, Dialogflow).
  • Data engineering (pipelines).
  • DevOps software development.
  • TypeDB knowledge base.
  • OpenStack basics, Kubernetes, Podman.

The main task of this AALS Software AG project was to deliver a practical real-time course on Flink and Hadoop that covers various Big Data technologies and DevOps practices, with a strong focus on automation and CI/CD processes.

Responsibilities:

  • Develop a hands-on Flink-Hadoop course covering MapReduce, HDFS, Spark, Flink, Hive, HBase, MongoDB, Cassandra, and Kafka.
  • Gain deep technical experience, including DevOps and CI/CD automation.
  • Work with Big Data technologies and Kafka.
  • On-site work in Switzerland.
  • Build ETL/ELT pipelines with Apache tools and Pentaho.
  • Lead projects in municipal software, financial services, and Big Data Kafka.
  • Develop AI models and chatbots with RASA, Chatter, and Dialogflow.
  • Build data pipelines.
  • Work on software development projects.
  • Manage a TypeDB knowledge base.
  • Work with OpenStack fundamentals and Kubernetes.

Technologies:

  • HDFS
  • ETL
  • DevOps
  • Cloud
  • Apache HBase
  • Pentaho
  • ELT
  • MongoDB
  • Kubernetes
  • Kafka
  • CI/CD
  • MapReduce
  • Hadoop
  • NLP
  • Apache Spark
  • Big Data
  • Cassandra
  • OpenStack
Nov 2022 - May 2023
7 months
Baar, Switzerland

Data IoT Engineer

AALS Software AG

  • Reviewed OPC UA, Redis, InfluxDB.
  • Created an as-is analysis of data storage, processing, streaming, and file format requirements for IQ control, service info, manufacturing data, and axis monitoring use cases.
  • Advised the company on suitable databases for each use case.
  • Improved efficiency by delivering actionable recommendations.
  • Implemented Kafka IoT streaming (see the sketch after this list).
  • Managed data in Azure Data Factory for analysis.
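
A minimal sketch of the Kafka IoT streaming step, assuming the kafka-python client; the broker address, topic name, and axis-monitoring payload are illustrative placeholders.

```python
# Minimal IoT-to-Kafka streaming sketch using the kafka-python client.
# Broker address, topic name, and sensor payload are illustrative.
import json
import time

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish one axis-monitoring reading per second (placeholder values).
for i in range(10):
    reading = {"machine": "axis-01", "ts": time.time(), "position_mm": 12.5 + i}
    producer.send("axis-monitoring", reading)
    time.sleep(1)

producer.flush()
producer.close()
```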

The main task of this project was to analyze and optimize data storage, processing, and streaming for various use cases, advise on fitting databases, and boost efficiency using IoT streaming and Azure Data Factory for analytics.

Responsibilities:

  • Analyzed and optimized data storage, processing, and streaming for various use cases.
  • Advised on fitting databases to improve efficiency.
  • Leveraged IoT streaming and Azure Data Factory for analytics.
  • Reviewed OPC UA, Redis, and InfluxDB.
  • Performed a requirements analysis on data storage, processing, streaming, and file formats for IQ control, service info, manufacturing data, and axis monitoring.
  • Provided database recommendations for each use case.
  • Enhanced efficiency with actionable recommendations.
  • Implemented Kafka IoT streaming.
  • Managed data in Azure Data Factory for analysis.

Technologies:

  • Kafka
  • Docker
  • Cassandra (DataStax)
  • Python scripting
  • Graph and Spark
  • Ansible
  • Grafana
  • Prometheus
  • Azure Data Factory
Feb 2020 - Present
5 years 6 months
Zug, Switzerland

Data Engineer

AALS Software AG

There are three goals:

  • Custom ChatGPT With Custom Knowledge Base: Using ChatGPT and LlamaIndex to build a custom chatbot that derives knowledge from its own document sources. While ChatGPT and other LLMs are powerful, extending the LLM with a custom knowledge base offers a richer experience and lets you build a conversational chatbot for real business use cases like customer support or spam classification (a minimal indexing sketch follows this list).
  • Document ChatBot with GPT-3 and LangChain: A chatbot that interprets human language using GPT-3, an OpenAI language generation model capable of NLP tasks like text completion, translation, and summarization.
  • Ask Questions About Your Documents (KnowledgeGPT): A Python-based web app built with Streamlit that lets you upload a document and get answers to questions about it.
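
As a rough illustration of the first project, the snippet below indexes a folder of documents and queries it with LlamaIndex. It assumes the current llama_index.core API and an OpenAI API key in the environment; the data directory and the question are placeholders.

```python
# Minimal custom-knowledge-base sketch with LlamaIndex, assuming the
# llama_index.core API and an OpenAI API key in the environment.
# The ./data directory of source documents is a placeholder.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./data").load_data()  # load own documents
index = VectorStoreIndex.from_documents(documents)       # embed and index them

query_engine = index.as_query_engine()
response = query_engine.query("What is our refund policy for enterprise customers?")
print(response)
```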

The main task of the “Custom ChatGPT With Custom Knowledge Base” project at AALS Software AG is to develop an advanced chatbot that derives knowledge from its own document sources, interprets human language with GPT-3, and answers questions about uploaded documents.

Tasks and Responsibilities:

  • Develop a custom ChatGPT chatbot using ChatGPT and LlamaIndex.
  • Extend the LLM model to enhance chatbot capabilities.
  • Use the chatbot for real business applications like customer support or spam classifiers.
  • Build a document chatbot with GPT-3 and LangChain to interpret human language.
  • Leverage GPT-3 for NLP tasks like text completion, translation, and summarization.
  • Create a Python-based web app, KnowledgeGPT, with Streamlit.
  • Enable document uploads and Q&A on KnowledgeGPT (see the sketch after this list).
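
A compact sketch of the KnowledgeGPT-style upload-and-ask flow in Streamlit; the answer_question helper is a hypothetical stand-in for the real retrieval-plus-LLM pipeline.

```python
# Compact KnowledgeGPT-style upload-and-ask flow in Streamlit.
# answer_question() is a hypothetical stand-in for the actual
# retrieval-plus-LLM pipeline (e.g. LangChain over the uploaded text).
import streamlit as st


def answer_question(document_text: str, question: str) -> str:
    # Placeholder: a real app would chunk, embed, retrieve, and
    # call an LLM here instead of this trivial echo.
    return f"(demo) You asked {question!r} about a {len(document_text)}-char document."


st.title("KnowledgeGPT demo")

uploaded = st.file_uploader("Upload a document", type=["txt", "md"])
question = st.text_input("Ask a question about the document")

if uploaded is not None and question:
    text = uploaded.read().decode("utf-8", errors="ignore")
    st.write(answer_question(text, question))
```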

Technologies for this project:

  • HuggingFace
  • Mistral 7B
  • Falcon 7B
  • TypeDB
  • VectorDB
  • KnowledgeGPT
  • Chatbot GPT-3X
  • OpenAI
  • Python
  • Rust
  • Streamlit
  • Delphi and Microfocus Cobol

Other projects and tasks include:

  • Conducting expectation analysis.
  • Analyzing and improving performance with statistical tests.
  • Designing a target architecture for business intelligence.
  • Developing a traceable process for business units.
  • Categorizing all published texts and making them available via chatbot.
  • Creating dynamic profiles of inbound leads.
  • DWH, data lake, and data lakehouse with Pentaho.
  • Teaching a chatbot to understand documents: chat, PDF summarization.
  • Security and ethics with LLMs.
  • Using Kafka as a communication system in schools and municipalities.
  • NLP chatbot with Streamlit and Pinot.
  • ELT instead of ETL for significant data stack improvements.

The main task of this project is to leverage natural language processing to develop a chatbot that categorizes published text documents and makes them accessible via dialogue to improve data processing and business intelligence.
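
One plausible sketch of the categorization step uses the HuggingFace zero-shot-classification pipeline, which assigns a text to candidate labels without task-specific training; the model name and label set here are illustrative, not the project's actual taxonomy.

```python
# Sketch of categorizing published texts with the HuggingFace
# zero-shot-classification pipeline. Model and labels are illustrative.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

text = "The municipality approved the new school building budget for 2024."
labels = ["education", "finance", "healthcare", "transport"]

result = classifier(text, candidate_labels=labels)
print(result["labels"][0], round(result["scores"][0], 3))  # top category
```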

Tasks and Technologies in this context:

  • Spark, Databricks, Kafka, web scraping, Python.
  • Container security review with Avr0.
  • MinIO and Ceph, Accumulo, nGraph.
  • GCP platform, Kedro, Pentaho.
  • One-click Kafka deployment for developers.
  • OpenAI, Bard, HuggingFace.
Feb 2020 - Feb 2022
2 years 1 month
Zug, Switzerland

Data Science

Winiker-Immobilien, Zug / Switzerland

  • Evaluate and predict tenant applicants before rental contract allocation.
  • Create statistical insights using classic clustering.
  • Prepare and use tenant data to build predictions with Random Forest (see the sketch after this list).
  • Reporting with Power BI.
  • Visualization with Bokeh and Dash.
  • Lead BI development with Power BI and Tableau.
  • Migrate the application to GCP platform.
  • Build a Delta Lakehouse.
  • Work with mostly unstructured data.
  • Maintain RDB data, scale and manage with Kubernetes.
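
A minimal sketch of the Random Forest step with scikit-learn; the CSV path, feature columns, and label are placeholders, not the actual tenant schema.

```python
# Minimal Random Forest sketch with scikit-learn. The CSV path and
# feature columns are placeholders, not the actual tenant schema.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("tenants.csv")  # placeholder file
X = df[["income", "household_size", "years_employed"]]  # placeholder features
y = df["contract_approved"]                             # placeholder label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```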

Other details:

  • Successfully reduced tenant turnover through improved clustering.
  • Used SAP HANA Community Edition as the database.
  • Used Kafka for communication.
  • Used Data Factory for further processing.
  • Added a Delta Lakehouse for persistent data storage.

Responsibilities:

  • Evaluate potential tenants before rental contracts.
  • Create statistical predictions using classic clustering.
  • Prepare and use tenant data to build Random Forest predictions.
  • Report using Power BI.
  • Visualize data with Bokeh and Dash.
  • Lead BI development with Power BI and Tableau.
  • Migrate the application to GCP platform.
  • Build a Delta Lakehouse.
  • Analyze mostly unstructured data.
  • Maintain RDB data.
  • Scale and manage with Kubernetes.

Technologies:

  • Kafka
  • Python
  • Databricks
  • DataStax
  • OpenStack basics
  • OpenCV and OpenAI
  • PyTorch
  • TensorFlow
  • Keras
  • Pandas
  • NumPy
  • Tableau
  • Microsoft Power BI
  • Data Factory
Feb 1999 - Feb 2005
6 years 1 month
Zug, Switzerland

Project Leader

AALS Software AG

  • Lead developer for a school administration system using Delphi, Microfocus Cobol, and HANA (formerly Sybase ASE).
  • Worked since 1985 with SQL Anywhere/ASE (formerly Watcom), now HANA on SUSE Linux.
  • Wrote an invoicing and payroll system using SQL syntax (stored procedures).
  • In 1990, loaded the entire database into memory; at the time it was the only database that could do this, if you knew how.
  • Built invoicing, payroll, and tax systems using T-SQL on ASE/HANA.

The main task of the “Public Administration School Administration” project was to develop a school management system using Delphi, Microfocus Cobol, and HANA (formerly Sybase ASE).

Responsibilities:

  • Develop a school management system with Delphi, Microfocus Cobol, and HANA (formerly Sybase ASE).
  • Write invoicing and payroll systems using SQL syntax (stored procedures).
  • Load the entire database into memory.
  • Use T-SQL (ASE/HANA) to write invoicing, payroll, and tax systems.
  • Lead the project as developer, architect, and project manager.
  • Use skills in Linux, HANA, ASE Sybase, Delphi, and Microfocus Cobol.

Technologies:

  • Linux
  • HANA
  • ASE Sybase
  • SUSE Linux OS
  • Delphi and Microfocus Cobol
Feb 1985 - Feb 1999
14 years 1 month
Zug, Switzerland

Project Leader

AALS Software AG

  • Developed municipal administration software using Microfocus Cobol, Sybase ASE, SUSE Linux.
  • Led the public administration migration project from z/OS to Azure in 2022.
  • A step-by-step migration was planned to move solutions from a mainframe to Azure.
  • We first moved some applications, while others stayed temporarily or permanently on the mainframe.
  • This approach usually requires systems to enable interoperability of applications and databases between the mainframe and Azure.
  • Fortunately, there are many solutions that integrate Azure with existing mainframe environments.

The main task of this project was to develop municipal administration software using Microfocus Cobol, Sybase ASE, and SUSE Linux, and to plan a phased migration of solutions from a mainframe to Azure, ensuring interoperability of applications and databases.

Responsibilities:

  • Develop municipal administration software with Microfocus Cobol, Sybase ASE, and SUSE Linux.
  • Lead the public administration migration project from z/OS to Azure in 2022.
  • Plan and implement a phased migration of solutions from a mainframe to Azure.
  • Ensure some applications remain temporarily or permanently on the mainframe.
  • Set up systems to guarantee interoperability of apps and databases between the mainframe and Azure.
  • Use a range of solutions to integrate Azure with existing mainframe environments.
  • Serve as developer, architect, and project manager.
  • Use skills in Delphi and Microfocus Cobol.

Technologies:

  • Delphi and Microfocus Cobol
Summary

Committed, goal-driven leader with determination and strong analytical skills.

Languages
German · Native
English · Advanced
Education

Swiss Institute for Business Economics

Bachelor · Business Administration · Zürich, Switzerland

HKG Juventus

Bachelor · Business Administration · Hong Kong

Certifications & licenses

Evening Technical College

University of Applied Sciences, Switzerland

Driving license: A, B - Motorcycle and Car
