Data Engineer

Country

Croatia, Slovenia, Montenegro, North Macedonia

Work Model

Office, Remote, Hybrid

WHO WE ARE LOOKING FOR

We’re looking for an experienced Data Engineer to join our growing AI & Data Engineering team and help design data architecture and build ML/AI features on client projects.

The AI & Data Engineering team is one of the newest additions at Infinum, focused on delivering impactful solutions in data engineering and artificial intelligence. We work on a range of client projects, helping turn data into real business value.

Our work spans everything from building robust data pipelines to developing AI-powered systems that extract insights and enable smarter decision-making. Right now, we’re building machine learning algorithms that recommend relevant content to users of an IoT app, along with data pipelines that power analytics and ML models behind the scenes.

As a growing team, we’re planning to expand significantly in the near future. We often collaborate with the DevOps, Backend, QA, and System Architect teams and have delivered data-driven solutions for industries like smart devices, hospitality, and healthcare. If you’re excited by complex challenges, enjoy collaborating across disciplines, and want your work to make a real impact in industries that matter, come build with us. We’re growing fast, learning faster, and always up for a cup of coffee and a great idea.


01

Qualifications & Experience

  • at least 2 years of experience as a Data Engineer
  • experience with scalable and production-ready distributed data systems (SQL/NoSQL) and cloud-based data solutions
  • proficiency in Python (preferred) for designing robust data solutions and AI integrations
  • experience with ETL tools and efficient, secure data pipelines
  • strong grasp of data lakehouse concepts, including modeling and performance
  • at least one year of hands-on experience with Databricks or a similar platform for collaborative data and AI development
  • familiarity with Apache Spark, Snowflake, or Microsoft Fabric
  • hands-on experience with AWS, Azure, or GCP for cloud-native infrastructures
  • broad knowledge of design patterns for scalable, maintainable code
  • experience working independently and collaboratively in agile environments (Scrum, Kanban)
  • excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
  • fluency in English

Bonus points

  • solid understanding of LLMs, prompt engineering, RAG architectures, and similar AI concepts
  • familiarity with integrating pipelines with AI models and deploying AI features
  • knowledge of vector and graph databases, and experience with real-time streaming pipelines
  • exposure to Kafka and Kubernetes

02

Your responsibilities

  • designing and implementing high-quality data architecture and software solutions
  • building and maintaining robust, scalable data pipelines
  • identifying, prioritizing, and executing development tasks
  • automating tasks through appropriate tools and scripting
  • collaborating with other teams and vendors on enhancing products
  • documenting your work clearly and comprehensively
  • supporting technology adoption and integration into flagship products
  • working closely with the product team to research and develop ways to improve the user experience, utilizing data and machine learning concepts
  • recommending and developing solutions that will create new product features or update existing ones, increase scalability, and eliminate technical debt

Tools we use

Python, SQL, Pandas, NumPy & PySpark

Python is our primary language for data processing, complemented by Pandas and NumPy for efficient data wrangling and analysis. PySpark allows us to scale data processing tasks seamlessly. SQL remains essential for querying and transforming data across platforms.
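To give a flavor of the SQL-driven transformations described above, here is a minimal, hypothetical sketch using Python’s built-in sqlite3 module (the table and column names are illustrative, not from a real project):

```python
import sqlite3

# Hypothetical example: aggregate raw events into daily counts with plain SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, day TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-01-01"), (1, "2024-01-02")],
)

# SQL remains the common language for querying and transforming data,
# whether the engine is SQLite, BigQuery, Snowflake, or Spark SQL.
rows = conn.execute(
    "SELECT day, COUNT(*) AS n FROM events GROUP BY day ORDER BY day"
).fetchall()
print(rows)  # [('2024-01-01', 2), ('2024-01-02', 1)]
```

The same GROUP BY pattern scales from a local prototype like this one to warehouse-sized datasets on the platforms listed below.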

Apache Airflow

Our tool of choice for orchestrating data workflows. Airflow schedules, monitors, and manages complex pipelines, keeping data flowing reliably.

Azure Data Factory, AWS Glue, GCS & Dataflow

We use these managed services to build, deploy, and manage data pipelines across cloud platforms, ensuring smooth data integration and transformation.

BigQuery

Google Cloud’s serverless data warehouse powers our fast, scalable SQL queries on large datasets, without the need to manage infrastructure.

Databricks & Snowflake

Databricks offers a collaborative environment for big data and AI, while Snowflake provides a flexible and powerful data warehousing solution.

Delta Lake & Iceberg

We rely on these technologies to manage large-scale data lakes with ACID transactions and schema evolution, making our pipelines reliable and performant.

Docker & Git

Docker simplifies local development and containerized deployments. Git (with GitHub or Bitbucket) supports our pull request workflows and Continuous Integration practices.

What do we offer?

Feedback and feedforward

Honest communication fuels growth. In our 1-on-1 sessions, 360 reviews, and career progression meetings, we discuss what’s great and what could be improved.

Additional equipment budget

A little extra to supplement your standard work equipment. Pick a latest-model mobile phone, tablet, e-book reader, or a pair of earphones you’ve been dreaming about. Mix & match, why not.

Contributing to open source

Sharing is caring doesn’t only apply to chocolate.

Educational budget

We allocate yearly resources for employee education, including books, software, courses, or other learning materials. Big brain energy!

Paid language courses

We offer paid language courses to help our employees master English.

Doing a career switch

We don’t have a sorting hat to tell you where you belong, but we will support your career switch from one job position to another.

Traveling on business

Having clients all over the world means our employees sometimes have to travel to and work from beautiful locations.

Subsidized recreation

Stay in shape with a sponsored fitness membership of your choice.

Sponsored health checks

You know the old saying – the greatest wealth is health.

Working remotely

Office location? Anywhere. It’s all about flexibility.

Free power-ups

Snack on fruit, cookies, and nuts to keep your energy levels up.

Car and bike parking

Don’t let it get caught in the rain. We offer free parking for bicycles and subsidized car parking.

Flexible working hours

Tailor your working hours to fit your schedule.

Fun and games budget

Every team gets a monthly budget to hang out and do fun stuff.

Benefits

In addition to professional development opportunities, we provide a selection of benefits that help you thrive and grow.

Apply for this position

