Description
Over their 125-year history, our Client has grown into one of the world’s largest biotech companies and a global supplier of transformative, innovative solutions across major disease areas.
We are looking for an IT specialist to join one of their teams in Poland within the Informatics division. The division focuses on delivering technology that evolves the practice of medicine and helps patients live longer, better lives. Poland serves as a Technology Co-creation and Acceleration Hub, building capabilities that drive digital innovation. They are a diverse team of open and friendly people, enthusiastic about technological novelties and optimal IT solutions, who share knowledge and experience and appreciate different points of view.
Recruitment process:
- Short call with a Recruiter (~10 min)
- Technical interview (~45 min) 💻
- Interview with the Manager (~45 min)
- Handshake 💎
Responsibilities
- Day-to-day support of data pipelines. You will operate and deliver data pipelines end to end, including data analysis, transformation, CI/CD maintenance and improvements, and deployments to cloud data platforms built on AWS and Snowflake.
- You will deliver solutions grounded in sound architecture and data engineering best practices around operational efficiency, security, data interoperability, reliability, performance, and cost optimisation.
- As a Data Ops Engineer, you can demonstrate knowledge of data delivery, ETL/ELT, data management, solution design, and integration of data from various sources into the Snowflake platform. This could involve setting up ETL processes, working with APIs, or using Snowflake’s native data integration tools. You will build data warehouses on the Snowflake platform, including creating data models, setting up data pipelines, and ensuring data is stored in a way that is easy to access and analyze (a minimal illustrative sketch of such a load step follows this list). The AWS services you will work with include at least (but are not limited to): CloudFormation, EC2, ECS, RDS, Lambda, S3, SNS, SQS, SSM, Secrets Manager, and Step Functions.
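To give a flavour of the day-to-day work described above, here is a minimal sketch of one such load step: an AWS Lambda function reacting to a new file in S3 and loading it into Snowflake with a COPY INTO statement. Every identifier (the secret name, warehouse, database, schema, stage, and table) is hypothetical, and the snippet assumes credentials live in AWS Secrets Manager and that an external stage already points at the landing bucket; it is an illustration, not the Client’s actual code.

```python
# Illustrative sketch only: all names below are hypothetical, not the Client's.
import json

import boto3
import snowflake.connector


def _snowflake_connection():
    """Build a Snowflake connection from credentials kept in AWS Secrets Manager."""
    secret = boto3.client("secretsmanager").get_secret_value(
        SecretId="example/snowflake-credentials"  # hypothetical secret name
    )
    creds = json.loads(secret["SecretString"])
    return snowflake.connector.connect(
        account=creds["account"],
        user=creds["user"],
        password=creds["password"],
        warehouse="LOAD_WH",   # hypothetical warehouse
        database="ANALYTICS",  # hypothetical database
        schema="RAW",          # hypothetical schema
    )


def handler(event, context):
    """Lambda entry point for an S3 ObjectCreated event."""
    key = event["Records"][0]["s3"]["object"]["key"]
    conn = _snowflake_connection()
    try:
        # @RAW_STAGE is assumed to be an external stage on the landing bucket.
        # A production pipeline would validate/escape `key` rather than inline it.
        conn.cursor().execute(
            f"COPY INTO RAW.EVENTS FROM @RAW_STAGE/{key} "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()
    return {"loaded": key}
```

In practice, a step like this would be deployed through CI/CD (e.g. via CloudFormation) and orchestrated alongside other steps (e.g. with Step Functions), both of which the role covers.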
Requirements
- Minimum 1 year of experience with AWS Cloud Services, including Lambda and S3.
- Minimum 1 year of experience with Snowflake and Data Warehousing.
- Proficiency in Python for data engineering and strong SQL development skills.
- Proven track record of software development (at least 1 year).
- Hands-on experience in ETL processing, demonstrating data transformation capabilities.
- Ability to maintain CI/CD processes and drive continuous improvements.
- Familiarity with version control tools such as GitLab or Bitbucket.
- Demonstrated expertise in data integration from diverse sources.
- Knowledge of data modeling, DevOps concepts, and Agile practices.
- A problem-solving orientation that contributes to innovation in data operations.
- Excellent documentation skills for clear and comprehensive records.
- Strong communication skills, collaborating effectively with diverse stakeholders.
We offer
- Salary range 13 000 - 18 000 PLN gross based on the employment contract (Umowa o pracę)
- Annual bonus payment based on your performance
- Dedicated training budget (training, certifications, conferences, diversified career paths etc.)
- Highly flexible working arrangements - you shape your own daily schedule and agree with your manager on how often you visit the office
- Recharge Fridays (2 Fridays off per quarter)
- Take Time Program (up to 3 months of leave to use for any purpose)
- Flex Location (the option to work from different places in the world for a limited period of time)
- Take Time for Charity (up to 2 weeks of additional paid leave to engage in a charity initiative of your choice)
- Private healthcare (Medicover packages) and group life insurance (UNUM)
- Employer additions to stock share purchases
- Yearly sales of company laptops and cars, and many more benefits