
Main Tasks

  • Bridge Between Tech and Stakeholders: Act as an intermediary between engineering teams and stakeholders, ensuring clean, transformed datasets. Understand business requirements and translate them into technical specifications.
  • Data Transformation and Management: Manipulate raw data, making it more organised and easier to analyse. This involves creating and maintaining data pipelines and transforming data into structured formats suitable for analysis.
  • Data Quality Assurance: Develop data validation processes to ensure data integrity and accuracy.
  • Collaboration with Data Teams: Work closely with data engineers, and data architects to ensure seamless data integration across various platforms. This involves coordinating with different teams to implement data solutions that meet business needs.
  • Developing and Maintaining Data Models: Create and maintain data models that represent complex business processes and entities, which are crucial for generating insights and supporting decision-making.
  • Automation of Data Processes: Automate repetitive data tasks to improve the efficiency and scalability of data operations. May use scripting languages to streamline data workflows.
  • Create designs for complex projects (data products), iterating on existing components or designing new components as needed.
  • Influence teams within your area of responsibility to design and build components aligned with the overall roadmap and engineering principles.
  • Collaborate within teams to contribute to the execution of the organisation's technical strategy, focusing on the development and deployment of data solutions.
  • Lead outcomes, develop stakeholder relationships, and deliver high-quality insights through data storytelling.
  • Write code following coding standards and best practices, adhering to a test-driven and behaviour-driven development approach.
  • Assist in architecting systems, designing efficient data solutions, and facilitating technical decision-making.
  • Apply insightful domain knowledge to business problems, recommending and implementing data-led approaches.
  • Ensure high-quality, accurate, and professional outputs that drive real business decisions.
  • Leadership & Collaboration:
  • Serve as a technical leader and mentor data engineers.
  • Partner with data scientists to productionize research models on Snowflake and AWS.
  • Engage with product and business stakeholders to align AI solutions with enterprise strategy.
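The tasks above mention developing data validation processes to ensure integrity and accuracy. As a minimal sketch of that idea (the function, field names, and rules here are illustrative assumptions, not taken from the job description):

```python
def validate_rows(rows, required_fields):
    """Split rows into valid and rejected, recording a reason for each rejection."""
    valid, rejected = [], []
    for row in rows:
        # A simple non-null / non-empty check on the required fields.
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append((row, f"missing fields: {', '.join(missing)}"))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 9.99},
    {"id": 2, "amount": None},  # fails the non-null check
]
valid, rejected = validate_rows(rows, required_fields=["id", "amount"])
```

In practice such checks would typically run inside a pipeline (e.g. a DBT test or an Airflow task) rather than as a standalone function.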

Expectation

  • Knowledge of Data Visualization Tools (such as MicroStrategy / Power BI)
  • Experience with Terraform and Terragrunt for infrastructure as code
  • Experience with Generative AI technologies
  • Proficiency in SQL for data transformation, analysis and problem-solving.
  • Experience with Snowflake, DBT, data modelling, Data Vault, data warehousing, and GitHub
  • Understanding of version control systems, continuous integration pipelines, and service-oriented architecture.
  • Understanding the need for different and appropriate design techniques, such as data vault and data warehousing
  • Knowledge of Cloud technologies (AWS or Azure) and scripting languages like Python
  • Working knowledge of modern data architecture frameworks, understanding of Architecture & Engineering
  • Highly numerate background with the ability to drive business change through data.
  • Excellent communication skills, capable of explaining complex information in simple terms using available documentation tools.
  • Strong problem-solving skills and attention to detail, with a curiosity to explore opportunities and solve problems logically.
  • Delivery of solutions with longevity and maintainability following the latest Agile practices.
  • Understanding of Relational (and non-relational) databases and when to use them.
  • Knowledge of standards and principles, knowing when and how to implement frameworks and when to propose new standards.
  • Strong interpersonal skills with the ability to work cross-functionally with stakeholders, engineers, and analysts.
  • Languages: Python (primary), SQL, Bash
  • Cloud: Azure, AWS
  • Tools: Airflow, DBT, Docker
  • Data: Snowflake, Delta Lake, Redis, Azure Data Lake
  • Infra & Ops: Terraform, GitHub Actions, Azure DevOps, Azure Monitor
  • Confident English
  • Candidate needs to be currently eligible to work in Hungary.
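The expectations above call for SQL proficiency in data transformation. A self-contained sketch of a typical aggregation-style transformation, using SQLite so it runs anywhere (the table and column names are illustrative assumptions):

```python
import sqlite3

# In-memory database standing in for a warehouse such as Snowflake.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)],
)

# Transform raw rows into an aggregated, analysis-ready shape.
rows = conn.execute(
    """
    SELECT customer, SUM(amount) AS total
    FROM raw_orders
    GROUP BY customer
    ORDER BY total DESC
    """
).fetchall()
```

The same SELECT could live in a DBT model, where DBT would materialise it as a table or view in the warehouse.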

Advantages

  • Knowledge of data platforms (e.g. Snowflake, Azure Data Lake).
  • Experience deploying models as APIs (FastAPI, Azure Functions).
  • Understanding of monitoring, model performance tracking, and observability best practices.
  • Familiarity with orchestration tools like Airflow or Azure Data Factory.
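Orchestration tools such as Airflow model a pipeline as a DAG of dependent tasks. This stdlib-only sketch (the task names are illustrative assumptions) shows the same dependency-ordering idea using `graphlib`:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# static_order() yields tasks so that every dependency runs first.
order = list(TopologicalSorter(dag).static_order())
```

In Airflow the equivalent structure would be expressed with operators and `>>` dependencies, with the scheduler handling execution, retries, and monitoring.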

Company info

Tech People is a European provider of permanent recruitment and temporary contracting services for technical roles across multiple industries.

 

Currently we are looking for

ANALYTICS ENGINEER (JT)

 

Location: Budapest, XI. district, hybrid

Salary: competitive

Type of employment: contracting

Start: ASAP



Company name: Tech People Hungary Kft.