Data Platform Engineer

Full-time

Praha

Tereza Kůrková

Company description

A modern global IT hub of an insurance company.

Responsibilities

The team consists of seven Data (Platform) Engineers managing a couple of Data Platforms. Within the team, you will play a key role in building and automating the data platform, as well as developing functional data components. To lay the foundation for realizing our data ambitions, you must be self-directed and comfortable with the complexity of supporting the data needs of multiple teams, systems, and products. The infrastructure runs on the MS Azure cloud, and we use Terraform as IaC. We use Azure Data Factory for pipelines and orchestration and Databricks for processing and transforming data.

- Maintain and further enhance the Data Platforms
- Understand the needs of your various customers from different units and translate them into modular solutions usable by the vast majority of teams
- Enable support-function teams to run successful data and AI projects by providing technical guidance
- Brainstorm and design solutions to optimize our data platform solution architecture
- Share knowledge in our data engineering community, pair program, and collaborate on code with other data engineers within our company
- Participate in code reviews and ensure adherence to development standards
- Stay up to date with the latest data platform technologies and industry trends

Requirements

- You have 5+ years of relevant software, data, and/or platform engineering experience, building platforms that are modular, testable, scalable, and easily consumable
- 3+ years of hands-on experience with one or more cloud services (Azure/AWS/GCP), such as ADF, Data Lake, Delta Lake, Databricks, Key Vault, BigQuery, Cloud Dataflow, Data Pipeline, etc.
- Experience with Infrastructure as Code (Terraform, Bicep)
- Experience with Data as Code: version control, small and regular commits, unit tests, CI/CD, packaging, branching, etc.
- Demonstrated programming and debugging experience with Python/PySpark and SQL
- Preferably, experience with open-source projects run with a "build once, deploy often" mindset, and experience with or interest in Domain-Driven Design
- Self-directed
- Both pragmatic and methodical, with a strong software engineering mindset and a metadata-driven approach to data solution design
- Actively help the team achieve flow and meet the sprint goal
- Collaborative and proactive about working in our inner-source community
- Passionate about automation, looking to automate our data products at the highest scale

Our offer

- Bonuses
- Work mostly from home (1 day per week on-site, 4 days per week home office)
- Flexible start/end of working hours (core working time 10:00 am to 2:00 pm)
- 5 weeks of vacation and 5 annual Well-being days
- 3% employer supplemental pension monthly contribution
- Unlimited budget for your education (hard and soft skills, language courses)
- Meal contribution, Cafeteria program, monthly home office allowance
- Multisport card, partnership with various companies (Makro, Datart, Sony, Electrolux…)
- iPhone 11, personal Office 365 License, O2 Family discounts
- Volunteering days to support our community
- Employee referral bonuses to encourage the addition of great new people to the team
- Amazing place of work near the city centre (Prague 5)
