We are looking for someone with an easy-to-work-with, mature, and no-nonsense mentality. Someone who is an open and honest communicator, who values working as part of a team, who is willing and able to coach or train other developers, and who is aware of developments and trends in the industry and corporate ecosystem.
Are you also passionate about a (not so distant) future in which most data processing is done in a streaming fashion, not scared off by complex data, and do you enjoy developing complex components in Java? Then please read on.
On the more technical side, you must have 9+ years of relevant experience in data engineering, and in particular you must have experience in the following fields:
- Agile / Scrum.
- A track record of building larger corporate systems.
- Kafka Streams API.
- Kafka, Schema Registry and Kafka Connect, using the Confluent framework.
- Java 8 or higher backend development.
- CI / CD tooling: Azure DevOps, Maven, CheckMarx, Git, Ansible.
- Running and managing a Kafka cluster and related components.
- Linux (bash) scripting capabilities.
- Data Integration techniques.
- Oracle SQL 12c or higher.
In addition to these must-haves, we appreciate knowledge of the following:
- Oracle RDBMS 12c or higher.
- Database Change Data Capture (CDC).
- Logging and monitoring with Grafana, Elastic, Kibana, Prometheus, or Logstash.
- Data modelling.
- Oracle Data Integrator 12c.
- Experience in a complex, corporate environment.
- Experience with lending and financial systems.
- Issue trackers such as JIRA and ServiceNow.
- Collaboration tooling such as Confluence.