Company description:
HRO Digital is a specialist traditional recruitment business. HRO Digital is a brand of Verita HR Polska. Verita HR Polska is a Human Resources service provider operating under registration number 5694.
We are working as a recruitment provider searching on our Client's behalf for a person in the following role:
DataOps Engineer
Responsibilities:
- Review, refine, interpret and implement business and technical requirements
- Track ongoing productivity and priorities using User Stories, Jira, backlogs, etc.
- Deliver requirements to scope, quality and time commitments, following Agile practices
- Onboard new data sources; design, build, test and deploy cloud data ingestion, pipelines, warehouses and data models/products
- Build and operate optimal data pipelines/models/products with SQL, stored procedures, indexes, clusters, partitions, triggers, etc.
- Create, own, enhance and operate CI/CD pipelines using Git, Jenkins, Groovy, etc.
- Deliver a data warehouse and pipelines that follow API, abstraction and ‘database refactoring’ best practices to support evolutionary development and continual change
- Develop procedures and scripts for data migration, back-population and feed-to-warehouse initialization
- Extend the solution with a Data Catalogue
- Protect the solution with Data Governance, Security, Sovereignty, Masking and Lineage capabilities
- Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable and cost-effective
- Ensure a consistent approach to logging, monitoring, error handling and automated recovery
- Fix defects and deliver enhancements
- Maintain a high-quality, up-to-date knowledge base, wiki and admin pages for the solution
- Peer-review colleagues’ changes
- Speak up and help shape how we do things better
Requirements:
Essential Experience:
- Expertise in the administration and development of traditional and cloud databases
- Excellent understanding of GCP core and data products, architecture and solution design
- At least 1 year of working experience with Google Cloud Platform development (or another cloud platform), especially on data/ETL-related projects
- Data preparation, wrangling and refactoring skills, for example as part of a data science pipeline
- IT methodology/practices knowledge and solid experience in Agile/Scrum
- Experience in building and operating CI/CD lifecycle management with Git, Jenkins, Groovy, Checkmarx, Nexus, Sonar IQ, etc.
- Experience with collaboration tools such as JIRA, Confluence and various board types
- BS/MS degree in Computer/Data Science, Engineering or a related subject
- Excellent communication and interpersonal skills in English.
- Enthusiastic willingness to rapidly and independently learn and develop technical and soft skills as needs require.
- Strong organisational and multi-tasking skills.
- Good team player who embraces teamwork and mutual support.
- Interested in working in a fast-paced environment
Additional Experience:
- Experience deploying and operating Data Fusion/CDAP-based solutions
- Experience with a DevOps model for GCP-based big data/ETL solutions
- Expertise in Java, Python and Dataflow
- Broad experience with IT development and collaboration tools.
- An understanding of IT Security and Application Development best practice.
- Understanding of and interest in various investment products, their life cycle and the nature of the investment banking business
- Experience working with infrastructure teams to deliver the best architecture for applications
The offer:
- Conference and training budget
- Partial reimbursement of language courses/studies
- Safari books
- Online trainings: LinkedIn, Coursera
- Internal trainings
- Transfer between projects
- Team events and networking events
- Tech communities and cultural communities
- Mentoring programs
- On-site medical consultations in the office
- Formal assistance while going on a maternity/paternity leave
- Nursery funding
- Nursery room
- Family days