Henry Hüske

I'm Henry Hüske

Senior Data Engineer / Database migrations

  • Age 41
  • Address Göhrener Weg 29, 01109 Dresden, Germany
  • E-mail (PGP)
  • Phone +49 351 4188 2401
  • Languages German, English, Swedish
  • Available from October 1, 2021


Hello! I'm Henry Hüske, a senior data engineer specializing in migrating databases to the cloud, with a strong background in project-management support for data-related projects.

My services to you

  • Development tasks in the areas of Data Warehouse (DWH), ETL, and Business Intelligence, including experience with geodata handling
  • Replacement/migration of relational databases (in particular Oracle and PostgreSQL)
  • Support with moving your database to cloud infrastructure (AWS, GCP)
  • Performance optimization of your database queries and processes

Your benefits

  • Optimized and accelerated data processes deliver your data to its destination faster and enable more timely analyses.
  • Database replacements/migrations proceed in a structured way and with minimal downtime.
  • Your employees' knowledge grows through my technical support and guidance.
  • Better, faster database queries and processes increase customer satisfaction.

Professional Skills

Focus topics
  • ETL/DWH development
  • Database migration (upgrades of existing databases; migration to other relational databases)
  • Database performance optimization

Tools and technologies
  • Bash
  • Data Vault
  • Docker
  • git
  • Linux
  • Python
  • Talend Studio
  • Terraform
  • Amazon Web Services (AWS)
  • Google Cloud Platform (GCP)
  • BigQuery
  • Oracle DB
  • PostgreSQL
  • MariaDB
  • MySQL
  • MongoDB

Project and Work Experience

01/2021 - 04/2021

Database Engineer


I worked for a large international group as part of a project team from a project service provider. My task was database support for Oracle DB 12.2: evaluating whether ERP data could be replicated via an Oracle database in AWS RDS (Amazon Web Services - Relational Database Service) to Snowflake. The replication volume to be verified was approximately one billion transactions per day across more than 50 ERP databases with an aggregated size of 42 TB.

Technologies used: Amazon Web Services, AWS RDS Oracle DB, AWS Data Migration Service, Snowflake

04/2020 - 03/2021

Technical Lead Business Intelligence

Ubitricity - Gesellschaft für verteilte Energiesysteme mbH, Remote

The company, based in Berlin, Germany, provides charging solutions and infrastructure for electric vehicles. My project task was to create a new data warehouse using cloud technologies, enabling an aggregated and uniform view of data from different source systems. The implementation used Google Cloud BigQuery as the data warehouse solution, Terraform for infrastructure as code, the ELT approach with Data Vault modelling for loading the warehouse, and Tableau as the visualization tool. The source data of approx. 10 GB was extracted primarily from MariaDB and MongoDB. As technical lead, it was my job to plan, implement, and operate the entire process for building the data warehouse infrastructure and its loading processes; on the content side, I was supported by the Product Owners and members of the Business Intelligence team.

Technologies used: Google Cloud, BigQuery, MariaDB, MongoDB, Terraform, Data Vault, Python

2015 - 2019

Senior Data Engineer

Avantgarde Labs GmbH, Germany (as employee)

My job included technical support for the ETL team of a large German e-commerce provider of electronic goods; the ETL infrastructure comprised 300 to 400 ETL jobs. On the one hand, I was responsible for creating, changing, and optimizing ETL jobs using Talend Studio DI; on the other, for planning ETL processes and prioritizing tasks for a team of up to 5 employees. Over the five years of the project, three different Talend versions (5.3.1, 6.2.1, 7.0.1) were used. When upgrading the Talend Studio versions and the associated infrastructure components (Talend Administration Center, Nexus, git/SVN), I took over the technical coordination of the necessary steps together with internal and external staff. A central database with a size of approximately 2 TB combined the data from various source systems; until 2017 this was an Oracle database in version 11g R2, after which it was migrated to PostgreSQL 9.6 and the ETL infrastructure was moved to the cloud (Google Cloud Platform, using Cloud SQL for the database and VMs for the infrastructure components).

Technologies used: Google Cloud, Oracle DB, PostgreSQL, Talend, Docker

2009 - 2014

Data Engineer

Tele-Kabel-Ingenieurgesellschaft mbH, Germany (as employee)

My task was the project coordination and implementation of approx. 80 geodata migrations for supply networks (waste water, electricity, gas) for a Swiss GIS provider that handles GIS tasks for about 50 Swiss municipalities. The migrations were from Oracle Database 9 to Oracle Database 11g R2. In addition, I carried out various customization programming for Autodesk Topobase in VB.NET for the customer.

Technologies used: Oracle DB, Autodesk Topobase, VB.NET


2003 - 2008

Diploma in Business Education

Technische Universität Dresden, Dresden, Germany

2005 - 2006

Study abroad

Mid Sweden University, Sundsvall, Sweden


Working conditions
  • Maximum of 25 working hours per week
  • Daily on-site work for customers in the Dresden area
  • Daily remote work for customers outside Dresden, with the following on-site days:
    • Up to 1 week at the customer's office at project start
    • 1 day every 2 weeks for customers in the Berlin, Leipzig, Erfurt, and Chemnitz areas
    • 3 days every 2 months for customers elsewhere in Europe
